WorldWideScience

Sample records for compute transfer maps

  1. Accurate computation of transfer maps from magnetic field data

    International Nuclear Information System (INIS)

    Venturini, Marco; Dragt, Alex J.

    1999-01-01

    Consider an arbitrary beamline magnet. Suppose one component (for example, the radial component) of the magnetic field is known on the surface of some imaginary cylinder coaxial to and contained within the magnet aperture. This information can be obtained either by direct measurement or by computation with the aid of some 3D electromagnetic code. Alternatively, suppose that the field harmonics have been measured by using a spinning coil. We describe how this information can be used to compute the exact transfer map for the beamline element. This transfer map takes into account all effects of real beamline elements including fringe-field, pseudo-multipole, and real multipole error effects. The method we describe automatically takes into account the smoothing properties of the Laplace-Green function. Consequently, it is robust against both measurement and electromagnetic code errors. As an illustration we apply the method to the field analysis of high-gradient interaction region quadrupoles in the Large Hadron Collider (LHC)
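
    As a rough illustration of the first step of such an analysis (a sketch, not the authors' code), the example below extracts multipole harmonics from the radial field sampled on a circle inside the aperture at a single z-slice, using a discrete Fourier transform; the sign and normalization conventions are assumptions made for the example.

      # Sketch: extract multipole harmonics from the radial field sampled on a
      # circle of radius R inside the aperture (one z-slice). Assumes the
      # convention B_r(R, phi) = sum_n [b_n sin(n phi) + a_n cos(n phi)],
      # up to normalization; names and conventions are illustrative.
      import numpy as np

      def multipole_harmonics(Br_samples, n_max=10):
          """Br_samples: radial field at equally spaced angles on the circle."""
          N = len(Br_samples)
          c = np.fft.rfft(Br_samples) / N        # one-sided Fourier coefficients
          normal = -2.0 * c[1:n_max + 1].imag    # sin components, n = 1..n_max
          skew = 2.0 * c[1:n_max + 1].real       # cos components, n = 1..n_max
          return normal, skew

      # Pure quadrupole test field B_r ~ sin(2 phi): the n = 2 harmonic dominates.
      phi = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
      normal, skew = multipole_harmonics(np.sin(2.0 * phi))
      print(np.round(normal, 3))                 # ~[0, 1, 0, ...]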

  2. Transfer maps and projection formulas

    OpenAIRE

    Tabuada, Goncalo

    2010-01-01

Transfer maps and projection formulas are undoubtedly among the key tools in the development and computation of (co)homology theories. In this note we develop a unified treatment of transfer maps and projection formulas in the non-commutative setting of dg categories. As an application, we obtain transfer maps and projection formulas in algebraic K-theory, cyclic homology, topological cyclic homology, and other scheme invariants.

  3. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

As users of computer display tools, our opinion is that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher, who is generally satisfied with abstract but accurate displays for analysis purposes, and the decision maker, who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease of transfer and to facilitate linking or merging with larger analysis systems. Maximum utility of computer mapping tools is achieved when they are linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results

  4. Rigorous bounds on survival times in circular accelerators and efficient computation of fringe-field transfer maps

    International Nuclear Information System (INIS)

    Hoffstaetter, G.H.

    1994-12-01

Analyzing stability of particle motion in storage rings contributes to the general field of stability analysis in weakly nonlinear motion. A method which we call pseudo invariant estimation (PIE) is used to compute lower bounds on the survival time in circular accelerators. The pseudo invariants needed for this approach are computed via nonlinear perturbative normal form theory, and the required global maxima of the highly complicated multivariate functions could only be rigorously bounded with an extension of interval arithmetic. The bounds on the survival times are large enough to be relevant; the same is true for the lower bounds on dynamical apertures, which can also be computed. The PIE method can lead to novel design criteria with the objective of maximizing the survival time. A major effort in the direction of rigorous predictions only makes sense if accurate models of accelerators are available. Fringe fields often have a significant influence on optical properties, but the computation of fringe-field maps by DA-based integration is slower by several orders of magnitude than DA evaluation of the propagator for main-field maps. A novel computation of fringe-field effects called symplectic scaling (SYSCA) is introduced. It exploits the advantages of Lie transformations, generating functions, and scaling properties and is extremely accurate. The computation of fringe-field maps is typically made nearly two orders of magnitude faster. (orig.)
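
    The interval-arithmetic extension mentioned above rests on a mechanism that a minimal sketch can convey: evaluating a function with interval operands yields a guaranteed enclosure of its range over a box. The Interval class below is a toy stand-in; a rigorous implementation would also control floating-point rounding.

      # Toy interval arithmetic: the enclosure is rigorous but pessimistic
      # (the dependency problem widens x*x below beyond the true range).
      from dataclasses import dataclass

      @dataclass
      class Interval:
          lo: float
          hi: float
          def __add__(self, o):
              o = _as_iv(o)
              return Interval(self.lo + o.lo, self.hi + o.hi)
          __radd__ = __add__
          def __mul__(self, o):
              o = _as_iv(o)
              p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
              return Interval(min(p), max(p))
          __rmul__ = __mul__

      def _as_iv(x):
          return x if isinstance(x, Interval) else Interval(x, x)

      # Bound f(x, p) = x*x + 0.1*x*p over the box [-1, 1] x [-1, 1].
      x, p = Interval(-1.0, 1.0), Interval(-1.0, 1.0)
      f = x * x + 0.1 * (x * p)
      print(f.lo, f.hi)   # [-1.1, 1.1] encloses the true range (about [-0.0025, 1.1])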

  5. Analytic transfer maps for Lie algebraic design codes

    International Nuclear Information System (INIS)

    van Zeijts, J.; Neri, F.; Dragt, A.J.

    1990-01-01

    Lie algebraic methods provide a powerful tool for modeling particle transport through Hamiltonian systems. Briefly summarized, Lie algebraic design codes work as follows: first the time t flow generated by a Hamiltonian system is represented by a Lie algebraic map acting on the initial conditions. Maps are generated for each element in the lattice or beamline under study. Next all these maps are concatenated into a one-turn or one-pass map that represents the complete dynamics of the system. Finally, the resulting map is analyzed and design decisions are made based on the linear and nonlinear entries in the map. The authors give a short description of how to find Lie algebraic transfer maps in analytic form, for inclusion in accelerator design codes. As an example they find the transfer map, through third order, for the combined-function quadrupole magnet, and use such magnets to correct detrimental third-order aberrations in a spot forming system
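
    At first (linear) order this generate-then-concatenate pipeline reduces to multiplying transfer matrices; the sketch below shows only that skeleton, with an assumed drift/thin-quadrupole beamline, while Lie algebraic codes carry the composition to third order and beyond.

      # Linear stand-in for map concatenation: each element acts on (x, x')
      # as a 2x2 matrix, and the one-pass map is the ordered product.
      import numpy as np

      def drift(L):
          return np.array([[1.0, L], [0.0, 1.0]])

      def thin_quad(f):                  # thin-lens quadrupole, focal length f
          return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

      # Beamline order: drift, quad, drift; later elements multiply on the left.
      elements = [drift(1.0), thin_quad(2.0), drift(0.5)]
      one_pass = np.linalg.multi_dot(elements[::-1])
      print(one_pass @ np.array([1e-3, 0.0]))    # track one initial condition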

  6. Computers in technical information transfer

    Energy Technology Data Exchange (ETDEWEB)

    Price, C.E.

    1978-01-01

The use of computers in transferring scientific and technical information from its creation to its use is surveyed. The traditional publication and distribution processes for S and T literature in past years have been the vehicle for transfer, but computers have altered the process in phenomenal ways. Computers are used in literature publication through text editing and photocomposition applications. Abstracting and indexing services use computers for preparing their products, but the machine-readable document descriptions created for this purpose are input to a rapidly growing computerized information retrieval service industry. Computer use is making many traditional processes passé, and may eventually lead to a largely "paperless" information utility.

  7. Automatic Computer Mapping of Terrain

    Science.gov (United States)

    Smedes, H. W.

    1971-01-01

Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations, using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, though they were greatly reduced by the preprocessing techniques.
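
    The record does not name the classifiers used; as a minimal, assumed stand-in for the supervised case, the sketch below assigns each pixel's multiband spectrum to the nearest class centroid estimated from labeled training pixels.

      # Nearest-centroid per-pixel classification over multispectral bands.
      import numpy as np

      def train_centroids(pixels, labels):
          """pixels: (N, bands) training spectra; labels: (N,) class ids."""
          return {c: pixels[labels == c].mean(axis=0) for c in np.unique(labels)}

      def classify(image, centroids):
          """image: (H, W, bands) -> (H, W) map of class ids."""
          classes = sorted(centroids)
          dist = np.stack([np.linalg.norm(image - centroids[c], axis=-1)
                           for c in classes])
          return np.array(classes)[dist.argmin(axis=0)]

      rng = np.random.default_rng(0)
      train = rng.normal(size=(100, 17))      # 17 spectral bands, as in the study
      labels = rng.integers(0, 8, size=100)   # 8 terrain classes
      scene = rng.normal(size=(64, 64, 17))
      print(classify(scene, train_centroids(train, labels)).shape)   # (64, 64)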

  8. Accurate atom-mapping computation for biochemical reactions.

    Science.gov (United States)

    Latendresse, Mario; Malerich, Jeremiah P; Travers, Mike; Karp, Peter D

    2012-11-26

The complete atom mapping of a chemical reaction is a bijection of the reactant atoms to the product atoms that specifies the terminus of each reactant atom. Atom mapping of biochemical reactions is useful for many applications of systems biology, in particular for metabolic engineering, where synthesizing new biochemical pathways has to account for the number of carbon atoms from a source compound that are conserved in the synthesis of a target compound. Rapid, accurate computation of the atom mapping(s) of a biochemical reaction remains elusive despite significant work on this topic. In particular, past researchers did not validate the accuracy of mapping algorithms. We introduce a new method for computing atom mappings called the minimum weighted edit-distance (MWED) metric. The metric is based on bond propensity to react and computes biochemically valid atom mappings for a large percentage of biochemical reactions. MWED models can be formulated efficiently as Mixed-Integer Linear Programs (MILPs). We have demonstrated this approach on 7501 reactions of the MetaCyc database, for which 87% of the models could be solved in less than 10 s. For 2.1% of the reactions, we found multiple optimal atom mappings. We show that the error rate is 0.9% (22 reactions) by comparing these atom mappings to 2446 atom mappings of the manually curated Kyoto Encyclopedia of Genes and Genomes (KEGG) RPAIR database. To our knowledge, our computational atom-mapping approach is the most accurate and among the fastest published to date. The atom-mapping data will be available in the MetaCyc database later in 2012; the atom-mapping software will be available within the Pathway Tools software later in 2012.
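
    MWED itself is a mixed-integer linear program; as a much-simplified sketch of the same optimization flavor, the example below casts atom mapping as minimum-cost bipartite assignment under a hand-made edit-cost matrix (all values illustrative), solved with SciPy's Hungarian-algorithm routine.

      # Minimum-cost assignment of reactant atoms to product atoms; large
      # entries forbid matches between different elements. This conveys the
      # objective only; the paper's MWED model is a richer MILP.
      import numpy as np
      from scipy.optimize import linear_sum_assignment

      elements = ["C", "C", "O", "H"]            # same elements on both sides
      cost = np.array([[0.0, 2.0, 9e6, 9e6],
                       [2.0, 1.0, 9e6, 9e6],
                       [9e6, 9e6, 0.0, 9e6],
                       [9e6, 9e6, 9e6, 0.5]])
      rows, cols = linear_sum_assignment(cost)
      mapping = dict(zip(rows, cols))            # reactant atom -> product atom
      print(mapping, "total edit cost:", cost[rows, cols].sum())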

  9. Pacing a data transfer operation between compute nodes on a parallel computer

    Science.gov (United States)

    Blocksome, Michael A [Rochester, MN

    2011-09-13

Methods, systems, and products are disclosed for pacing a data transfer between compute nodes on a parallel computer that include: transferring, by an origin compute node, a chunk of an application message to a target compute node; sending, by the origin compute node, a pacing request to a target direct memory access (DMA) engine on the target compute node using a remote get DMA operation; determining, by the origin compute node, whether a pacing response to the pacing request has been received from the target DMA engine; and transferring, by the origin compute node, a next chunk of the application message if the pacing response to the pacing request has been received from the target DMA engine.
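
    A protocol-level sketch of the pacing loop described above: the origin delivers one chunk, issues a pacing request, and withholds the next chunk until the response arrives. The classes and method names are invented for illustration and do not reflect the patent's DMA engine interface.

      import queue

      class TargetDMA:
          """Stand-in for the target node's DMA engine."""
          def __init__(self):
              self.received = []
          def remote_get(self, pacing_request, reply_q):
              reply_q.put(("pacing_response", pacing_request))   # echo promptly
          def deliver(self, chunk):
              self.received.append(chunk)

      def paced_send(message, chunk_size, target):
          replies = queue.Queue()
          for i in range(0, len(message), chunk_size):
              target.deliver(message[i:i + chunk_size])
              target.remote_get(i, replies)
              kind, _ = replies.get(timeout=5.0)   # block until pacing response
              assert kind == "pacing_response"

      target = TargetDMA()
      paced_send(b"x" * 1000, 256, target)
      print(len(target.received), "chunks delivered")   # 4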

  10. Transfer map approach to and optical effects of energy degraders in fragment separators

    Directory of Open Access Journals (Sweden)

    B. Erdelyi

    2009-01-01

A second-order analytical and an arbitrary-order numerical procedure are developed for the computation of transfer maps of energy degraders. The incorporation of the wedges into the optics of fragment separators for next-generation exotic beam facilities, their optical effects, and the optimization of their performance are studied in detail. It is shown how to place and shape the degraders in the system such that aberrations are minimized and resolving powers are maximized.

  11. Parallel algorithms for mapping pipelined and parallel computations

    Science.gov (United States)

    Nicol, David M.

    1988-01-01

Many computational problems in image processing, signal processing, and scientific computing are naturally structured for either pipelined or parallel computation. When mapping such problems onto a parallel architecture it is often necessary to aggregate an obvious problem decomposition. Even in this context the general mapping problem is known to be computationally intractable, but recent advances have been made in identifying classes of problems and architectures for which optimal solutions can be found in polynomial time. Among these, the mapping of pipelined or parallel computations onto linear array, shared memory, and host-satellite systems figures prominently. This paper extends that work first by showing how to improve existing serial mapping algorithms. These improvements have significantly lower time and space complexities: in one case a published O(nm³) time algorithm for mapping m modules onto n processors is reduced to an O(nm log m) time complexity, and its space requirements reduced from O(nm²) to O(m). Run time complexity is further reduced with parallel mapping algorithms based on these improvements, which run on the architecture for which they create the mappings.
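
    For flavor, the sketch below solves a closely related mapping problem: assigning m pipeline modules with given loads to n processors of a linear array as contiguous groups so that the heaviest group is as light as possible, via binary search over the bottleneck with a greedy feasibility test. This is a textbook method, not the paper's algorithm.

      # Minimize the bottleneck load of a contiguous partition into n groups.
      def min_bottleneck(loads, n):
          def feasible(cap):
              groups, acc = 1, 0.0
              for w in loads:
                  if w > cap:
                      return False
                  if acc + w > cap:
                      groups, acc = groups + 1, w
                  else:
                      acc += w
              return groups <= n

          lo, hi = max(loads), float(sum(loads))
          while hi - lo > 1e-9:                  # binary search on the answer
              mid = 0.5 * (lo + hi)
              lo, hi = (lo, mid) if feasible(mid) else (mid, hi)
          return hi

      print(min_bottleneck([4.0, 2.0, 7.0, 1.0, 5.0], n=3))   # 7.0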

  12. Controlling data transfers from an origin compute node to a target compute node

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-06-21

    Methods, apparatus, and products are disclosed for controlling data transfers from an origin compute node to a target compute node that include: receiving, by an application messaging module on the target compute node, an indication of a data transfer from an origin compute node to the target compute node; and administering, by the application messaging module on the target compute node, the data transfer using one or more messaging primitives of a system messaging module in dependence upon the indication.

  13. Automatic computation of transfer functions

    Science.gov (United States)

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
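
    As an assumed toy version of the idea (the patented system works from a netlist matrix), the sketch below derives the transfer function of a series-R, shunt-C low-pass network by solving its nodal equation symbolically.

      import sympy as sp

      s, R, C = sp.symbols("s R C", positive=True)
      Vin, Vout = sp.symbols("Vin Vout")

      # KCL at the output node: (Vout - Vin)/R + s*C*Vout = 0
      node_eq = sp.Eq((Vout - Vin) / R + s * C * Vout, 0)
      H = sp.simplify(sp.solve(node_eq, Vout)[0] / Vin)
      print(H)   # 1/(C*R*s + 1)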

  14. Three-dimensional multislice spiral computed tomographic angiography: a potentially useful tool for safer free tissue transfer to complicated regions

    DEFF Research Database (Denmark)

    Demirtas, Yener; Cifci, Mehmet; Kelahmetoglu, Osman

    2009-01-01

Three-dimensional multislice spiral computed tomographic angiography (3D-MSCTA) is a minimally invasive method of vascular mapping. The aim of this study was to evaluate the clinical usefulness of this imaging technique in delineating the recipient vessels for safer free tissue transfer to complicated regions. ... be kept in mind, especially in the patients with peripheral vascular disease. 3D-MSCTA has the potential to replace digital subtraction angiography for planning of microvascular reconstructions, and newer devices with higher resolutions will probably increase the reliability of this technique. (c) 2009...

  15. Software survey: VOSviewer, a computer program for bibliometric mapping

    NARCIS (Netherlands)

    N.J.P. van Eck (Nees Jan); L. Waltman (Ludo)

    2010-01-01

We present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps.

  16. Transfer map approach to and optical effects of energy degraders on the performance of fragment separators

    International Nuclear Information System (INIS)

    Erdelyi, B.; Bandura, L.; Nolen, J.

    2009-01-01

A second-order analytical and an arbitrary-order numerical procedure are developed for the computation of transfer maps of energy degraders. The incorporation of the wedges into the optics of fragment separators for next-generation exotic beam facilities, their optical effects, and the optimization of their performance are studied in detail. It is shown how to place and shape the degraders in the system such that aberrations are minimized and resolving powers are maximized

  17. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of the eight grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  18. Journal of Computer Science and Its Application: Site Map

    African Journals Online (AJOL)

  19. Realistic ion optical transfer maps for Super-FRS magnets from numerical field data

    Energy Technology Data Exchange (ETDEWEB)

    Kazantseva, Erika; Boine-Frankenheim, Oliver [Technische Universitaet Darmstadt (Germany)

    2016-07-01

In large-aperture accelerators such as the Super-FRS, the non-linearity of the magnetic field in bending elements leads to non-linear beam dynamics, which cannot be described by means of linear ion optics. The existing non-linear approach is based on the Fourier-harmonics formalism and does not work if the horizontal aperture is larger than the vertical one, or vice versa. In the Super-FRS dipole the horizontal aperture is much larger than the vertical. Hence, it is necessary to find a way to create a higher-order transfer map for this dipole to accurately predict the particle dynamics in the realistic magnetic fields across the whole aperture. The aim of this work is to generate an accurate high-order transfer map of magnetic elements from measured or simulated 3D magnetic field data. The differential algebraic formalism allows transfer maps to be generated automatically via numerical integration of the ODEs of motion in beam-physics coordinates along the reference path. To make the transfer map accurate for all particles in the beam, the magnetic field along the integration path should be represented by an analytical function matching the real field distribution in the volume of interest. Within this work, the steps of producing a high-order realistic transfer map, starting from the field values on a closed box covering the volume of interest, are analyzed in detail.

  20. WinSCP for Windows File Transfers | High-Performance Computing | NREL

    Science.gov (United States)

WinSCP can be used to securely transfer files between your local computer running Microsoft Windows and a remote computer running Linux.

  1. Functional magnetic resonance maps obtained by personal computer

    International Nuclear Information System (INIS)

    Gomez, F. j.; Manjon, J. V.; Robles, M.; Marti-Bonmati, L.; Dosda, R.; Molla, E.

    2001-01-01

Functional magnetic resonance (fMR) is of special relevance in the analysis of certain types of brain activation. The present report describes the development of a simple software program for use with personal computers (PCs) that analyzes these images and provides functional activation maps. Activation maps are based on the temporal differences in oxyhemoglobin in tomographic images. To detect these differences, intensities registered repeatedly during brain control and activation are compared. The experiments were performed with a 1.5-Tesla MR unit. To verify the reliability of the program, fMR studies were carried out in 4 healthy individuals (12 contiguous slices, 80 images per slice every 3.1 seconds, for a total of 960 images). All the images were transferred to a PC and were processed pixel by pixel within each sequence to obtain an intensity/time curve. The statistical study of the results (Student's t-test and cross-correlation analysis) made it possible to establish the activation of each pixel. The images were prepared using spatial filtering, temporal filtering, baseline correction, normalization and segmentation of the parenchyma. The postprocessing of the results involved the elimination of single pixels, superposition of an anatomical image of greater spatial resolution, and anti-aliasing. The application (Xfun 1.0, Valencia, Spain) was developed in Microsoft Visual C++ 5.0 Developer Studio for Windows NT Workstation. As a representative example, the program took 8.2 seconds to calculate and present the results of the entire study (12 functional maps). In the motor and visual activation experiments, activation was observed in regions proximal to the central sulcus of the hemisphere contralateral to the hand that moved and in the occipital cortex. While programs that calculate activation maps are available, the development of software for PCs running Microsoft Windows ensures several key features for its use on a daily basis: it is easy
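
    A minimal sketch of the per-pixel analysis described above: correlate each voxel's time series with a block (rest/task) regressor and threshold the correlation coefficient to form an activation map. Shapes and thresholds are illustrative, not the program's actual parameters.

      import numpy as np

      def activation_map(series, design, r_thresh=0.5):
          """series: (H, W, T) voxel time courses; design: (T,) 0/1 regressor."""
          s = series - series.mean(axis=-1, keepdims=True)
          d = design - design.mean()
          r = (s * d).sum(-1) / (np.sqrt((s ** 2).sum(-1) * (d ** 2).sum()) + 1e-12)
          return r > r_thresh

      T = 80                                 # 80 volumes per slice, as in the study
      design = (np.arange(T) // 10) % 2      # alternating 10-scan rest/task blocks
      rng = np.random.default_rng(1)
      data = rng.normal(size=(32, 32, T))
      data[10:15, 10:15] += 1.5 * design     # implant an "active" region
      print(activation_map(data, design).sum(), "voxels flagged")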

  2. MPEG-4 AVC saliency map computation

    Science.gov (United States)

    Ammar, M.; Mitrea, M.; Hasnaoui, M.

    2014-02-01

A saliency map provides information about the regions inside some visual content (image, video, ...) at which a human observer will spontaneously look. For saliency map computation, current research studies consider the uncompressed (pixel) representation of the visual content and extract various types of information (intensity, color, orientation, motion energy) which are then fused. This paper goes one step further and computes the saliency map directly from the MPEG-4 AVC stream syntax elements with minimal decoding operations. In this respect, an a priori in-depth study on the MPEG-4 AVC syntax elements is first carried out so as to identify the entities appealing to visual attention. Secondly, the MPEG-4 AVC reference software is extended with software tools allowing the parsing of these elements and their subsequent usage in objective benchmarking experiments. This way, it is demonstrated that an MPEG-4 saliency map can be given by a combination of static saliency and motion maps. This saliency map is experimentally validated under a robust watermarking framework. When included in an m-QIM (multiple symbols Quantization Index Modulation) insertion method, average PSNR gains of 2.43 dB, 2.15 dB, and 2.37 dB are obtained for data payloads of 10, 20 and 30 watermarked blocks per I frame, i.e. about 30, 60, and 90 bits/second, respectively. These quantitative results are obtained by processing 2 hours of heterogeneous video content.

  3. Introduction to computational mass transfer with applications to chemical engineering

    CERN Document Server

    Yu, Kuo-Tsung

    2017-01-01

This book offers an easy-to-understand introduction to the computational mass transfer (CMT) method. Building on the contents of the first edition, this new edition adds the following material. It describes the successful application of the method to the simulation of the mass transfer process in a fluidized bed, as well as recent investigations and computing methods for prediction of the multi-component mass transfer process. It also discusses general issues concerning computational methods for simulating the mass transfer of the rising bubble process. This new edition has been reorganized by moving the preparatory materials for Computational Fluid Dynamics (CFD) and Computational Heat Transfer into appendices, adding new chapters, and including three new appendices on, respectively, a generalized representation of the two-equation model for the CMT, derivation of the equilibrium distribution function in the lattice-Boltzmann method, and derivation of the Navier-S...

  4. Finite Precision Logistic Map between Computational Efficiency and Accuracy with Encryption Applications

    Directory of Open Access Journals (Sweden)

    Wafaa S. Sayed

    2017-01-01

Chaotic systems appear in many applications such as pseudo-random number generation, text encryption, and secure image transfer. Numerical solutions of these systems using digital software or hardware inevitably deviate from the expected analytical solutions. Chaotic orbits produced using finite precision systems do not exhibit the infinite period expected under the assumptions of infinite simulation time and precision. In this paper, a digital implementation of the generalized logistic map with signed parameter is considered. We present a fixed-point hardware realization of a Pseudo-Random Number Generator using the logistic map that involves a trade-off between computational efficiency and accuracy. Several factors, such as the precision used, the order of execution of the operations, and the parameter and initial point values, affect the properties of the finite precision map. For positive and negative parameter cases, the studied properties include bifurcation points, output range, maximum Lyapunov exponent, and period length. The performance of the finite precision logistic map is compared in the two cases. A basic stream cipher system is realized to evaluate the system performance for encryption applications for different bus sizes, with regard to encryption key size, hardware requirements, maximum clock frequency, and NIST, correlation, histogram, entropy, and Mean Absolute Error analyses of encrypted images.
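
    The core finite-precision effect is easy to reproduce: quantizing the logistic-map state to a fixed number of fractional bits collapses the orbit onto a short cycle. The sketch below emulates fixed point in software (parameter and precision values are illustrative) and measures the resulting period by cycle detection.

      def fixed_point_period(x0, lam=4.0, bits=16, max_iter=200000):
          scale = 1 << bits
          q = lambda v: round(v * scale) / scale    # quantize to `bits` fractional bits
          x, seen = q(x0), {}
          for i in range(max_iter):
              if x in seen:                         # orbit entered a cycle
                  return i - seen[x]                # cycle length
              seen[x] = i
              x = q(lam * x * (1.0 - x))
          return None

      for bits in (8, 12, 16, 20):
          print(bits, "bits -> period", fixed_point_period(0.123, bits=bits))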

  5. Failure detection in high-performance clusters and computers using chaotic map computations

    Science.gov (United States)

    Rao, Nageswara S.

    2015-09-01

A programmable media includes a processing unit capable of independent operation in a machine that is capable of executing 10¹⁸ floating point operations per second. The processing unit is in communication with a memory element and an interconnect that couples computing nodes. The programmable media includes a logical unit configured to execute arithmetic functions, comparative functions, and/or logical functions. The processing unit is configured to detect computing component failures, memory element failures and/or interconnect failures by executing programming threads that generate one or more chaotic map trajectories. The central processing unit or graphical processing unit is configured to detect a computing component failure, memory element failure and/or an interconnect failure through an automated comparison of signal trajectories generated by the chaotic maps.

  6. Behavior Life Style Analysis for Mobile Sensory Data in Cloud Computing through MapReduce

    Science.gov (United States)

    Hussain, Shujaat; Bang, Jae Hun; Han, Manhyung; Ahmed, Muhammad Idris; Amin, Muhammad Bilal; Lee, Sungyoung; Nugent, Chris; McClean, Sally; Scotney, Bryan; Parr, Gerard

    2014-01-01

Cloud computing has revolutionized healthcare in today's world as it can be seamlessly integrated into a mobile application and sensor devices. The sensory data is then transferred from these devices to the public and private clouds. In this paper, a hybrid and distributed environment is built which is capable of collecting data from the mobile phone application and storing it in the cloud. We developed an activity recognition application and transfer the data to the cloud for further processing. The big data technology Hadoop MapReduce is employed to analyze the data and create a timeline of the user's activities. These activities are visualized to find useful health analytics and trends. In this paper a big data solution is proposed to analyze the sensory data and give insights into user behavior and lifestyle trends. PMID:25420151

  7. Analyzing the Use of Concept Maps in Computer Science: A Systematic Mapping Study

    Science.gov (United States)

    dos Santos, Vinicius; de Souza, Érica F.; Felizardo, Katia R; Vijaykumar, Nandamudi L.

    2017-01-01

Context: Concept Maps (CMs) enable the creation of a schematic representation of domain knowledge. For this reason, CMs have been applied in different research areas, including Computer Science. Objective: the objective of this paper is to present the results of a systematic mapping study conducted to collect and evaluate existing research on…

  8. Computational methods for constructing protein structure models from 3D electron microscopy maps.

    Science.gov (United States)

    Esquivel-Rodríguez, Juan; Kihara, Daisuke

    2013-10-01

    Protein structure determination by cryo-electron microscopy (EM) has made significant progress in the past decades. Resolutions of EM maps have been improving as evidenced by recently reported structures that are solved at high resolutions close to 3Å. Computational methods play a key role in interpreting EM data. Among many computational procedures applied to an EM map to obtain protein structure information, in this article we focus on reviewing computational methods that model protein three-dimensional (3D) structures from a 3D EM density map that is constructed from two-dimensional (2D) maps. The computational methods we discuss range from de novo methods, which identify structural elements in an EM map, to structure fitting methods, where known high resolution structures are fit into a low-resolution EM map. A list of available computational tools is also provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. [The automatic iris map overlap technology in computer-aided iridiagnosis].

    Science.gov (United States)

    He, Jia-feng; Ye, Hu-nian; Ye, Miao-yuan

    2002-11-01

    In the paper, iridology and computer-aided iridiagnosis technologies are briefly introduced and the extraction method of the collarette contour is then investigated. The iris map can be overlapped on the original iris image based on collarette contour extraction. The research on collarette contour extraction and iris map overlap is of great importance to computer-aided iridiagnosis technologies.

  10. Behavior Life Style Analysis for Mobile Sensory Data in Cloud Computing through MapReduce

    Directory of Open Access Journals (Sweden)

    Shujaat Hussain

    2014-11-01

Cloud computing has revolutionized healthcare in today’s world as it can be seamlessly integrated into a mobile application and sensor devices. The sensory data is then transferred from these devices to the public and private clouds. In this paper, a hybrid and distributed environment is built which is capable of collecting data from the mobile phone application and storing it in the cloud. We developed an activity recognition application and transfer the data to the cloud for further processing. The big data technology Hadoop MapReduce is employed to analyze the data and create a timeline of the user’s activities. These activities are visualized to find useful health analytics and trends. In this paper a big data solution is proposed to analyze the sensory data and give insights into user behavior and lifestyle trends.

  11. The H₂⁺ + He proton transfer reaction: quantum reactive differential cross sections to be linked with future velocity mapping experiments

    Science.gov (United States)

    Hernández Vera, Mario; Wester, Roland; Gianturco, Francesco Antonio

    2018-01-01

We construct the velocity map images of the proton transfer reaction between helium and the molecular hydrogen ion H₂⁺. We perform simulations of imaging experiments at one representative total collision energy, taking into account the inherent aberrations of the velocity mapping, in order to explore the feasibility of direct comparisons between theory and future experiments planned in our laboratory. The asymptotic angular distributions of the fragments in 3D velocity space are determined from the quantum state-to-state differential reactive cross sections and reaction probabilities, which are computed by using the time-independent coupled-channel hyperspherical coordinate method. The calculations employ an earlier ab initio potential energy surface computed at the FCI/cc-pVQZ level of theory. The present simulations indicate that the planned experiments would be selective enough to differentiate between product distributions resulting from different initial internal states of the reactants.

  12. How Concept-Mapping Perception Navigates Student Knowledge Transfer Performance

    Science.gov (United States)

    Tseng, Kuo-Hung; Chang, Chi-Cheng; Lou, Shi-Jer; Tan, Yue; Chiu, Chien-Jung

    2012-01-01

    The purpose of this paper is to investigate students' perception of concept maps as a learning tool where knowledge transfer is the goal. This article includes an evaluation of the learning performance of 42 undergraduate students enrolled in a nanotech course at a university in Taiwan. Canonical correlation and MANOVA analyses were employed to…

  13. Dataflow-Based Mapping of Computer Vision Algorithms onto FPGAs

    Directory of Open Access Journals (Sweden)

    Ivan Corretjer

    2007-01-01

We develop a design methodology for mapping computer vision algorithms onto an FPGA through the use of coarse-grain reconfigurable dataflow graphs as a representation to guide the designer. We first describe a new dataflow modeling technique called homogeneous parameterized dataflow (HPDF), which effectively captures the structure of an important class of computer vision applications. This form of dynamic dataflow takes advantage of the property that in a large number of image processing applications, data production and consumption rates can vary, but are equal across dataflow graph edges for any particular application iteration. After motivating and defining the HPDF model of computation, we develop an HPDF-based design methodology that offers useful properties in terms of verifying correctness and exposing performance-enhancing transformations; we discuss and address various challenges in efficiently mapping an HPDF-based application representation into target-specific HDL code; and we present experimental results pertaining to the mapping of a gesture recognition application onto the Xilinx Virtex II FPGA.

  14. DIMACS Workshop on Interconnection Networks and Mapping, and Scheduling Parallel Computations

    CERN Document Server

Rosenberg, Arnold L; Sotteau, Dominique; NSF Science and Technology Center in Discrete Mathematics and Theoretical Computer Science

    1995-01-01

The interconnection network is one of the most basic components of a massively parallel computer system. Such systems consist of hundreds or thousands of processors interconnected to work cooperatively on computations. One of the central problems in parallel computing is the task of mapping a collection of processes onto the processors and routing network of a parallel machine. Once this mapping is done, it is critical to schedule computations within and communication among processors so that inputs for a process are available where and when the process is scheduled to be computed. This book contains the refereed proceedings of a workshop that brought together researchers from universities and laboratories, as well as practitioners involved in the design, implementation, and application of massively parallel systems. Focusing on interconnection networks of parallel architectures of today and of the near future, the book includes topics such as network topologies, network properties, message routing, network embeddings, network emulation, mappings, and efficient scheduling.

  15. A NON-PARAMETRIC APPROACH TO CONSTRAIN THE TRANSFER FUNCTION IN REVERBERATION MAPPING

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yan-Rong; Wang, Jian-Min [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Road, Beijing 100049 (China); Bai, Jin-Ming, E-mail: liyanrong@mail.ihep.ac.cn [Yunnan Observatories, Chinese Academy of Sciences, Kunming 650011 (China)

    2016-11-10

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.
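
    A sketch of the forward model underlying this approach: the emission-line light curve is the continuum convolved with a transfer function expressed as a sum of displaced Gaussians. The weights, centers and toy continuum below are illustrative, and the statistical machinery (damped random walk model, trend terms) is omitted.

      import numpy as np

      def transfer_function(tau, weights, centers, sigma):
          psi = sum(w * np.exp(-0.5 * ((tau - c) / sigma) ** 2)
                    for w, c in zip(weights, centers))
          return psi / (psi.sum() * (tau[1] - tau[0]))       # unit integral

      dt = 0.2                                   # days; shared grid spacing
      t = np.arange(0.0, 400.0, dt)
      tau = np.arange(0.0, 60.0, dt)
      psi = transfer_function(tau, weights=[1.0, 0.4], centers=[10.0, 30.0], sigma=4.0)

      continuum = np.sin(2.0 * np.pi * t / 90.0)             # toy continuum variation
      line = np.convolve(continuum, psi)[:t.size] * dt       # blurred, delayed echo
      print(line.shape)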

  16. A NON-PARAMETRIC APPROACH TO CONSTRAIN THE TRANSFER FUNCTION IN REVERBERATION MAPPING

    International Nuclear Information System (INIS)

    Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming

    2016-01-01

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.

  17. Improving CMS data transfers among its distributed computing facilities

    International Nuclear Information System (INIS)

    Flix, J; Magini, N; Sartirana, A

    2011-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on their usage, customizing the topologies and improving their setup in order to keep CMS transferring data at the desired levels in a reliable and robust way.

  18. Numerical computation of aeroacoustic transfer functions for realistic airfoils

    NARCIS (Netherlands)

    De Santana, Leandro Dantas; Miotto, Renato Fuzaro; Wolf, William Roberto

    2017-01-01

    Based on Amiet's theory formalism, we propose a numerical framework to compute the aeroacoustic transfer function of realistic airfoil geometries. The aeroacoustic transfer function relates the amplitude and phase of an incoming periodic gust to the respective unsteady lift response permitting,

  19. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, J; Sartirana, A

2011-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on thei...

  20. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, Jose

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on the...

  1. Automatic transfer function design for medical visualization using visibility distributions and projective color mapping.

    Science.gov (United States)

    Cai, Lile; Tay, Wei-Liang; Nguyen, Binh P; Chui, Chee-Kong; Ong, Sim-Heng

    2013-01-01

    Transfer functions play a key role in volume rendering of medical data, but transfer function manipulation is unintuitive and can be time-consuming; achieving an optimal visualization of patient anatomy or pathology is difficult. To overcome this problem, we present a system for automatic transfer function design based on visibility distribution and projective color mapping. Instead of assigning opacity directly based on voxel intensity and gradient magnitude, the opacity transfer function is automatically derived by matching the observed visibility distribution to a target visibility distribution. An automatic color assignment scheme based on projective mapping is proposed to assign colors that allow for the visual discrimination of different structures, while also reflecting the degree of similarity between them. When our method was tested on several medical volumetric datasets, the key structures within the volume were clearly visualized with minimal user intervention. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Direct dose mapping versus energy/mass transfer mapping for 4D dose accumulation: fundamental differences and dosimetric consequences.

    Science.gov (United States)

    Li, Haisen S; Zhong, Hualiang; Kim, Jinkoo; Glide-Hurst, Carri; Gulam, Misbah; Nurushev, Teamour S; Chetty, Indrin J

    2014-01-06

    The direct dose mapping (DDM) and energy/mass transfer (EMT) mapping are two essential algorithms for accumulating the dose from different anatomic phases to the reference phase when there is organ motion or tumor/tissue deformation during the delivery of radiation therapy. DDM is based on interpolation of the dose values from one dose grid to another and thus lacks rigor in defining the dose when there are multiple dose values mapped to one dose voxel in the reference phase due to tissue/tumor deformation. On the other hand, EMT counts the total energy and mass transferred to each voxel in the reference phase and calculates the dose by dividing the energy by mass. Therefore it is based on fundamentally sound physics principles. In this study, we implemented the two algorithms and integrated them within the Eclipse treatment planning system. We then compared the clinical dosimetric difference between the two algorithms for ten lung cancer patients receiving stereotactic radiosurgery treatment, by accumulating the delivered dose to the end-of-exhale (EE) phase. Specifically, the respiratory period was divided into ten phases and the dose to each phase was calculated and mapped to the EE phase and then accumulated. The displacement vector field generated by Demons-based registration of the source and reference images was used to transfer the dose and energy. The DDM and EMT algorithms produced noticeably different cumulative dose in the regions with sharp mass density variations and/or high dose gradients. For the planning target volume (PTV) and internal target volume (ITV) minimum dose, the difference was up to 11% and 4% respectively. This suggests that DDM might not be adequate for obtaining an accurate dose distribution of the cumulative plan, instead, EMT should be considered.
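
    The fundamental difference is visible in one dimension. In the sketch below, a compressive deformation maps three source voxels onto one reference voxel: a DDM-like rule averages the mapped dose values, while EMT accumulates energy (dose times mass) and mass separately and divides at the end, so the two disagree as soon as the masses differ. The numbers are illustrative, and real DDM interpolates rather than averages.

      import numpy as np

      dose = np.array([1.0, 1.0, 4.0, 4.0])    # Gy in four source voxels
      mass = np.array([1.0, 1.0, 2.0, 1.0])    # g; one source voxel is denser
      target = np.array([0, 0, 0, 1])          # compression: voxels 0-2 -> ref voxel 0

      # DDM-like accumulation: average the dose values landing in each ref voxel.
      ddm = np.array([dose[target == v].mean() for v in (0, 1)])

      # EMT: accumulate energy and mass separately, then divide.
      energy = np.bincount(target, weights=dose * mass, minlength=2)
      m_tot = np.bincount(target, weights=mass, minlength=2)
      print("DDM:", ddm)             # [2.0, 4.0]
      print("EMT:", energy / m_tot)  # [2.5, 4.0] (mass weighting matters)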

  3. Charge transfer complex between 2,3-diaminopyridine with chloranilic acid. Synthesis, characterization and DFT, TD-DFT computational studies

    Science.gov (United States)

    Al-Ahmary, Khairia M.; Habeeb, Moustafa M.; Al-Obidan, Areej H.

    2018-05-01

A new charge transfer complex (CTC) between the electron donor 2,3-diaminopyridine (DAP) and the electron acceptor chloranilic acid (CLA) has been synthesized and characterized experimentally and theoretically using a variety of physicochemical techniques. The experimental work included the use of elemental analysis and UV-vis, IR and 1H NMR studies to characterize the complex. Electronic spectra have been recorded in different hydrogen-bonded solvents: methanol (MeOH), acetonitrile (AN) and a 1:1 mixture of AN-MeOH. The molecular composition of the complex was identified to be 1:1 from Job's and molar ratio methods. The stability constant was determined using the minimum-maximum absorbances method, where it recorded high values confirming the high stability of the formed complex. The solid complex was prepared and characterized by elemental analysis, which confirmed its formation in a 1:1 stoichiometric ratio. Both IR and NMR studies asserted the existence of proton and charge transfer in the formed complex. To support the experimental results, DFT computations were carried out using the B3LYP/6-31G(d,p) method to compute the optimized structures of the reactants and complex, their geometrical parameters, reactivity parameters, molecular electrostatic potential map and frontier molecular orbitals. The analysis of the DFT results strongly confirmed the high stability of the formed complex, based on the existence of charge transfer alongside proton-transfer hydrogen bonding, in accord with the experimental results. The origin of the electronic spectra was analyzed using the TD-DFT method, where the observed λmax are strongly consistent with the computed ones. TD-DFT showed the contributing states for the various electronic transitions.

  4. GPU-accelerated computation of electron transfer.

    Science.gov (United States)

    Höfinger, Siegfried; Acocella, Angela; Pop, Sergiu C; Narumi, Tetsu; Yasuoka, Kenji; Beu, Titus; Zerbetto, Francesco

    2012-11-05

    Electron transfer is a fundamental process that can be studied with the help of computer simulation. The underlying quantum mechanical description renders the problem a computationally intensive application. In this study, we probe the graphics processing unit (GPU) for suitability to this type of problem. Time-critical components are identified via profiling of an existing implementation and several different variants are tested involving the GPU at increasing levels of abstraction. A publicly available library supporting basic linear algebra operations on the GPU turns out to accelerate the computation approximately 50-fold with minor dependence on actual problem size. The performance gain does not compromise numerical accuracy and is of significant value for practical purposes. Copyright © 2012 Wiley Periodicals, Inc.
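
    The acceleration pattern, routing the dominant dense linear algebra through the GPU while leaving the rest of the code unchanged, can be sketched as below. The CuPy fallback idiom is an assumption for illustration, not the library the authors used.

      import numpy as np
      try:
          import cupy as xp          # GPU arrays; requires a CUDA device
      except ImportError:
          xp = np                    # CPU fallback keeps the sketch runnable

      def couplings(H, C):
          """Dense matrix product standing in for the profiled hot spot."""
          return xp.asarray(H) @ xp.asarray(C)

      H = np.random.rand(1024, 1024)
      C = np.random.rand(1024, 1024)
      result = couplings(H, C)
      print(type(result).__module__)   # 'cupy' on GPU systems, 'numpy' otherwise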

  5. Bringing MapReduce Closer To Data With Active Drives

    Science.gov (United States)

    Golpayegani, N.; Prathapan, S.; Warmka, R.; Wyatt, B.; Halem, M.; Trantham, J. D.; Markey, C. A.

    2017-12-01

    Moving computation closer to the data location has been a much theorized improvement to computation for decades. The increase in processor performance, the decrease in processor size and power requirement combined with the increase in data intensive computing has created a push to move computation as close to data as possible. We will show the next logical step in this evolution in computing: moving computation directly to storage. Hypothetical systems, known as Active Drives, have been proposed as early as 1998. These Active Drives would have a general-purpose CPU on each disk allowing for computations to be performed on them without the need to transfer the data to the computer over the system bus or via a network. We will utilize Seagate's Active Drives to perform general purpose parallel computing using the MapReduce programming model directly on each drive. We will detail how the MapReduce programming model can be adapted to the Active Drive compute model to perform general purpose computing with comparable results to traditional MapReduce computations performed via Hadoop. We will show how an Active Drive based approach significantly reduces the amount of data leaving the drive when performing several common algorithms: subsetting and gridding. We will show that an Active Drive based design significantly improves data transfer speeds into and out of drives compared to Hadoop's HDFS while at the same time keeping comparable compute speeds as Hadoop.

  6. Computational fluid mechanics and heat transfer

    CERN Document Server

    Pletcher, Richard H; Anderson, Dale

    2012-01-01

    ""I have always considered this book the best gift from one generation to the next in computational fluid dynamics. I earnestly recommend this book to graduate students and practicing engineers for the pleasure of learning and a handy reference. The description of the basic concepts and fundamentals is thorough and is crystal clear for understanding. And since 1984, two newer editions have kept abreast to the new, relevant, and fully verified advancements in CFD.""-Joseph J.S. Shang, Wright State University""Computational Fluid Mechanics and Heat Transfer is very well written to be used as a t

  7. Monitoring data transfer latency in CMS computing operations

    CERN Document Server

    Bonacorsi, D; Magini, N; Sartirana, A; Taze, M; Wildish, T

    2015-01-01

During the first LHC run, the CMS experiment collected tens of petabytes of collision and simulated data, which need to be distributed among dozens of computing centres with low latency in order to make efficient use of the resources. While the desired level of throughput has been successfully achieved, it is still common to observe transfer workflows that cannot reach full completion in a timely manner due to a small fraction of stuck files which require operator intervention. For this reason, in 2012 the CMS transfer management system, PhEDEx, was instrumented with a monitoring system to measure file transfer latencies and to predict the completion time for the transfer of a data set. The operators can detect abnormal patterns in transfer latencies while the transfer is still in progress, and monitor the long-term performance of the transfer infrastructure to plan the data placement strategy. Based on the data collected for one year with the latency monitoring system, we present a study on the different fact...

  8. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    Science.gov (United States)

    2015-01-01

Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity, as real-time and streaming data, in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223
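
    A pure-Python sketch of the multi-key idea (a stand-in, not the MRPack implementation): tagging intermediate pairs with an algorithm identifier lets a single map/shuffle/reduce pass carry several related algorithms, since the shuffle keeps their key spaces separate.

      from collections import defaultdict

      algorithms = {
          "wordcount": lambda rec: [(w, 1) for w in rec.split()],
          "linecount": lambda rec: [("lines", 1)],
      }

      def map_phase(records):
          for rec in records:
              for algo, fn in algorithms.items():
                  for k, v in fn(rec):
                      yield (algo, k), v          # composite key: (algorithm, key)

      def reduce_phase(pairs):
          groups = defaultdict(list)
          for key, v in pairs:
              groups[key].append(v)
          return {key: sum(vs) for key, vs in groups.items()}

      print(reduce_phase(map_phase(["a b a", "b c"])))
      # {('wordcount', 'a'): 2, ('wordcount', 'b'): 2, ('wordcount', 'c'): 1,
      #  ('linecount', 'lines'): 2}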

  9. Filtering Non-Linear Transfer Functions on Surfaces.

    Science.gov (United States)

    Heitz, Eric; Nowrouzezahrai, Derek; Poulin, Pierre; Neyret, Fabrice

    2014-07-01

Applying non-linear transfer functions and look-up tables to procedural functions (such as noise), surface attributes, or even surface geometry is a common strategy used to enhance visual detail. Their simplicity and ability to mimic a wide range of realistic appearances have led to their adoption in many rendering problems. As with any textured or geometric detail, proper filtering is needed to reduce aliasing when viewed across a range of distances, but accurate and efficient transfer function filtering remains an open problem for several reasons: transfer functions are complex and non-linear, especially when mapped through procedural noise and/or geometry-dependent functions, and the effects of perspective and masking further complicate the filtering over a pixel's footprint. We accurately solve this problem by computing and sampling from specialized filtering distributions on the fly, yielding very fast performance. We investigate the case where the transfer function to filter is a color map applied to (macroscale) surface textures (like noise), as well as color maps applied according to (microscale) geometric details. We introduce a novel representation of a (potentially modulated) color map's distribution over pixel footprints using Gaussian statistics and, in the more complex case of high-resolution color-mapped microsurface details, our filtering is view- and light-dependent and capable of correctly handling masking and occlusion effects. Our approach can be generalized to filter other physically based rendering quantities. We propose an application to shading with irradiance environment maps over large terrains. Our framework is also compatible with the case of transfer functions used to warp surface geometry, as long as the transformations can be represented with Gaussian statistics, leading to proper view- and light-dependent filtering results. Our results match ground truth and our solution is well suited to real-time applications, requires only a few
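
    A one-dimensional sketch of filtering under Gaussian statistics: if the attribute feeding a non-linear color map is modeled as N(mu, sigma^2) over a pixel footprint, the filtered value is the expectation E[T(x)], computed here with Gauss-Hermite quadrature. The transfer function and footprint statistics are illustrative.

      import numpy as np

      def filtered_transfer(T, mu, sigma, order=16):
          # hermegauss: nodes/weights for the N(0, 1) weight exp(-x^2/2)
          x, w = np.polynomial.hermite_e.hermegauss(order)
          return (w * T(mu + sigma * x)).sum() / w.sum()

      T = lambda x: np.clip(np.sin(5.0 * x), 0.0, 1.0)   # sharp non-linear map
      print(T(0.3))                          # unfiltered value at footprint mean
      print(filtered_transfer(T, 0.3, 0.5))  # footprint-filtered value differs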

  10. NET: an inter-computer file transfer command

    International Nuclear Information System (INIS)

    Burris, R.D.

    1978-05-01

    The NET command was defined and supported in order to facilitate file transfer between computers. Among the goals of the implementation were greatest possible ease of use, maximum power (i.e., support of a diversity of equipment and operations), and protection of the operating system.

  11. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A Korn shell (ksh) UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  12. Development of a scanning proton microprobe - computer-control, elemental mapping and applications

    International Nuclear Information System (INIS)

    Loevestam, Goeran.

    1989-08-01

    A scanning proton microprobe set-up has been developed at the Pelletron accelerator in Lund. A magnetic beam scanning system and a computer-control system for beam scanning and data acquisition are described. The computer system consists of a VMEbus front-end computer and a μVAX-II host computer, interfaced by means of a high-speed data link. The VMEbus computer controls data acquisition, beam charge monitoring and beam scanning, while the more sophisticated work of elemental mapping and spectrum evaluation is left to the μVAX-II. The calibration of the set-up is described, as well as several applications. Elemental micro-patterns in tree rings and bark have been investigated by means of elemental mapping and quantitative analysis. Large variations in elemental concentrations have been found for several elements within a single tree ring. An external beam set-up has been developed in addition to the proton microprobe set-up. The external beam has been used for the analysis of antique papyrus documents. Using a scanning sample procedure and particle induced X-ray emission (PIXE) analysis, damaged and missing characters of the text could be made visible by means of multivariate statistical data evaluation and elemental mapping. Aspects of elemental mapping by means of scanning μPIXE analysis are also discussed. Spectrum background, target thickness variations and pile-up are shown to influence the structure of elemental maps considerably. In addition, a semi-quantification procedure has been developed. (author)

  13. Computation of turbulent flow and heat transfer in subassemblies

    International Nuclear Information System (INIS)

    Slagter, W.

    1979-01-01

    This research is carried out in order to provide information on the thermohydraulic behaviour of fast reactor subassemblies. The research work involves the development of versatile computation methods and the evaluation of combined theoretical and experimental work on fluid flow and heat transfer in fuel rod bundles. The computation method described here rests on the application of the distributed parameter approach. The conditions considered cover steady, turbulent flow and heat transfer of incompressible fluids in bundles of bare rods. Throughout 1978, the main effort was devoted to the development of the VITESSE program and to the validation of the hydrodynamic part of the code. In its present version the VITESSE program is applicable to predicting the fully developed turbulent flow and heat transfer in the subchannels of a bundle with bare rods. In this paper the main features of the code are described, as well as the present status of its development.

  14. A European map regarding the strictness of the transfer pricing regulations

    Directory of Open Access Journals (Sweden)

    Ioana Ignat

    2017-12-01

    Full Text Available In a context in which transfer pricing may serve as a mechanism through which multinationals can move funds internationally, and in order to prevent base erosion and profit shifting between multinationals, countries around the world have adopted various transfer pricing regulations. Furthermore, some countries have adopted stricter regulations than others. The objective of our research was to identify the level of strictness of the transfer pricing regulations in European countries. To achieve this objective, we analyzed the transfer pricing regulations of all European countries and built a transfer pricing strictness index, based on which we defined 4 categories of countries (where category 1 includes the countries with the least strict transfer pricing regulations and category 4 the countries with the strictest regulations). After that, we illustrated how these categories are distributed on the European map. In order to collect the information, we used the transfer pricing guides issued by the Big Four companies for the year 2015. The study's results show that the strictness of the transfer pricing regulations decreases from the west of Europe to the east. Moreover, most countries fell into category 2 or category 3, meaning that the transfer pricing regulations on the European continent are not especially flexible, but at the same time are not especially strict.

  15. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment.

    Science.gov (United States)

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-12-01

    Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. In this work, we accelerated the Feldkamp-Davis-Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function to aggregate those partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. Speedup of reconstruction time was found to be roughly linear with the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10^-7. Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed
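
    The Map/Reduce split described above has a simple skeleton. The sketch below is a toy 2D parallel-beam analogue (the paper's actual workload is 3D cone-beam FDK on Hadoop): each map task ramp-filters and backprojects a subset of projections into a partial image, and the reduce step sums the partial images.

      import numpy as np
      from functools import reduce

      N, n_angles = 64, 90
      angles = np.linspace(0.0, np.pi, n_angles, endpoint=False)
      xs = np.linspace(-1.0, 1.0, N)
      X, Y = np.meshgrid(xs, xs)

      def map_task(proj_subset):
          """Filter and backproject a subset of (angle, projection) pairs."""
          partial = np.zeros((N, N))
          ramp = np.abs(np.fft.fftfreq(N))              # ramp filter
          for theta, proj in proj_subset:
              filtered = np.fft.ifft(np.fft.fft(proj) * ramp).real
              t = X * np.cos(theta) + Y * np.sin(theta) # detector coordinate per pixel
              idx = np.clip(np.round((t + 1) / 2 * (N - 1)).astype(int), 0, N - 1)
              partial += filtered[idx]                  # backproject this view
          return partial

      def reduce_task(a, b):
          return a + b                                  # aggregate partial images

      r = 0.5                                           # disk phantom, radius 0.5
      proj = np.where(np.abs(xs) < r, 2 * np.sqrt(np.maximum(r**2 - xs**2, 0.0)), 0.0)
      sino = [(th, proj) for th in angles]              # a disk looks the same at every angle
      chunks = [sino[i::3] for i in range(3)]           # three "map" nodes
      image = reduce(reduce_task, (map_task(c) for c in chunks)) * np.pi / n_angles
      print(image[N // 2, N // 2], image[0, 0])         # interior value >> exterior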

  16. A computer simulation model to compute the radiation transfer of mountainous regions

    Science.gov (United States)

    Li, Yuguang; Zhao, Feng; Song, Rui

    2011-11-01

    In mountainous regions, the radiometric signal recorded at the sensor depends on a number of factors such as sun angle, atmospheric conditions, surface cover type, and topography. In this paper, a computer simulation model of radiation transfer is designed and evaluated. This model implements Monte Carlo ray-tracing techniques and is specifically dedicated to the study of light propagation in mountainous regions. The radiative processes between sunlight and the objects within the mountainous region are realized by using forward Monte Carlo ray-tracing methods. The performance of the model is evaluated through detailed comparisons with the well-established 3D computer simulation model RGM (Radiosity-Graphics combined Model), based on the same scenes and identical spectral parameters, which show good agreement between the two models' results. Using the newly developed computer model, a series of typical mountainous scenes is generated to analyze the physical mechanism of mountainous radiation transfer. The results show that the effects of adjacent slopes are important for deep valleys and particularly affect shadowed pixels, and that the topographic effect needs to be considered in mountainous terrain before accurate inferences from remotely sensed data can be made.
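
    One first-order ingredient of such a model, the shadowing that makes "shadowed pixels" special, can be sketched in a few lines: march a ray from each terrain cell toward the sun and test for occlusion by intervening ridges. The synthetic heightfield and all constants below are invented for illustration; the paper's model is a full Monte Carlo ray tracer with multiple scattering.

      import numpy as np

      n = 64
      x = np.linspace(0.0, 4.0 * np.pi, n)
      Z = 300.0 * np.outer(np.sin(x), np.cos(x))        # synthetic mountains [m]
      cell = 30.0                                       # grid spacing [m]
      sun = np.array([1.0, 0.0, 1.0]) / np.sqrt(2.0)    # sun at 45 deg elevation

      def shadowed(i, j):
          p = np.array([i * cell, j * cell, Z[i, j]])
          for t in np.arange(cell, n * cell, cell):     # march toward the sun
              q = p + t * sun
              qi, qj = int(q[0] / cell), int(q[1] / cell)
              if not (0 <= qi < n and 0 <= qj < n):
                  return False                          # left the grid: lit
              if Z[qi, qj] > q[2]:
                  return True                           # blocked by a ridge
          return False

      shade = np.array([[shadowed(i, j) for j in range(n)] for i in range(n)])
      print("shadowed fraction:", shade.mean())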

  17. Computer Texture Mapping for Laser Texturing of Injection Mold

    Directory of Open Access Journals (Sweden)

    Yongquan Zhou

    2014-04-01

    Full Text Available Laser texturing is a relatively new multiprocess technique that has been used for machining 3D curved surfaces; it is more flexible and efficient for creating decorative texture on the 3D curved surfaces of injection molds, so as to improve surface quality and achieve a cosmetic surface on molded plastic parts. In this paper, a novel method of laser texturing 3D curved surfaces based on a 3-axis galvanometer scanning unit is presented, which prevents the texturing of the injection mold surface from suffering the severe distortion often caused by traditional texturing processes. The novel method is based on computer texture mapping technology, which is developed and presented here. The developed texture mapping algorithm includes surface triangulation, notations, distortion measurement, control, and numerical methods. An interface for computer texture mapping has been built to implement the algorithm, controlling the distortion rate of the 3D texture model generated from the original 2D texture applied to the curved surface. A case study of laser texturing a high-curvature surface of an injection mold for a mouse top case shows that the novel method meets the quality standard for laser texturing of injection molds.

  18. Computer graphics in heat-transfer simulations

    International Nuclear Information System (INIS)

    Hamlin, G.A. Jr.

    1980-01-01

    Computer graphics can be very useful in the setup of heat transfer simulations and in the display of the results of such simulations. The potential use of recently available low-cost graphics devices in the setup of such simulations has not been fully exploited. Several types of graphics devices and their potential usefulness are discussed, and some configurations of graphics equipment are presented in the low-, medium-, and high-price ranges.

  19. Heat Transfer Computations of Internal Duct Flows With Combined Hydraulic and Thermal Developing Length

    Science.gov (United States)

    Wang, C. R.; Towne, C. E.; Hippensteele, S. A.; Poinsatte, P. E.

    1997-01-01

    This study investigated Navier-Stokes computations of the surface heat transfer coefficients of a transition duct flow. A transition duct from an axisymmetric cross section to a non-axisymmetric cross section is usually used to connect the turbine exit to the nozzle. As the gas turbine inlet temperature increases, the transition duct is subjected to the high temperature at the gas turbine exit. The transition duct flow has combined development of hydraulic and thermal entry length. The design of the transition duct requires accurate surface heat transfer coefficients, which the Navier-Stokes computational method can be used to predict. The Proteus three-dimensional Navier-Stokes numerical computational code was used in this study. The code was first applied to computations of the turbulent developing flow properties within a circular duct and a square duct, and then used to compute the turbulent flow properties of a transition duct flow. The computational results for the surface pressure, the skin friction factor, and the surface heat transfer coefficient were described and compared with values obtained from theoretical analyses or experiments. The comparison showed that the Navier-Stokes computation can approximately predict the surface heat transfer coefficients of a transition duct flow.

  20. Fencing data transfers in a parallel active messaging interface of a parallel computer

    Science.gov (United States)

    Blocksome, Michael A.; Mamidala, Amith R.

    2015-06-02

    Fencing data transfers in a parallel active messaging interface ('PAMI') of a parallel computer, the PAMI including data communications endpoints, each endpoint including a specification of data communications parameters for a thread of execution on a compute node, including specifications of a client, a context, and a task; the compute nodes coupled for data communications through the PAMI and through data communications resources including at least one segment of shared random access memory; including initiating execution through the PAMI of an ordered sequence of active SEND instructions for SEND data transfers between two endpoints, effecting deterministic SEND data transfers through a segment of shared memory; and executing through the PAMI, with no FENCE accounting for SEND data transfers, an active FENCE instruction, the FENCE instruction completing execution only after completion of all SEND instructions initiated prior to execution of the FENCE instruction for SEND data transfers between the two endpoints.

  1. NEW DEVELOPMENTS ON INVERSE POLYGON MAPPING TO CALCULATE GRAVITATIONAL LENSING MAGNIFICATION MAPS: OPTIMIZED COMPUTATIONS

    International Nuclear Information System (INIS)

    Mediavilla, E.; Lopez, P.; Mediavilla, T.; Ariza, O.; Muñoz, J. A.; Gonzalez-Morcillo, C.; Jimenez-Vicente, J.

    2011-01-01

    We derive an exact solution (in the form of a series expansion) to compute gravitational lensing magnification maps. It is based on the backward gravitational lens mapping of a partition of the image plane in polygonal cells (inverse polygon mapping, IPM), not including critical points (except perhaps at the cell boundaries). The zeroth-order term of the series expansion leads to the method described by Mediavilla et al. The first-order term is used to study the error induced by the truncation of the series at zeroth order, explaining the high accuracy of the IPM even at this low order of approximation. Interpreting the Inverse Ray Shooting (IRS) method in terms of IPM, we explain the previously reported N^(-3/4) dependence of the IRS error on the number of collected rays per pixel. Cells intersected by critical curves (critical cells) transform to non-simply connected regions with topological pathologies like auto-overlapping or non-preservation of the boundary under the transformation. To define a non-critical partition, we use a linear approximation of the critical curve to divide each critical cell into two non-critical subcells. The optimal choice of the cell size depends basically on the curvature of the critical curves. For typical applications in which the pixel of the magnification map is a small fraction of the Einstein radius, a one-to-one relationship between the cell and pixel sizes in the absence of lensing guarantees both the consistency of the method and a very high accuracy. This prescription is simple but very conservative. We show that substantially larger cells can be used to obtain magnification maps with huge savings in computation time.
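
    For orientation, the baseline that IPM improves upon is easy to sketch. A minimal inverse ray shooting (IRS) magnification map for unit-mass point lenses, with all positions in Einstein-radius units and the lens configuration invented for illustration:

      import numpy as np

      rng = np.random.default_rng(1)
      lenses = rng.uniform(-5.0, 5.0, size=(25, 2))      # point-mass lens positions

      def shoot(theta, lenses):
          """Lens equation beta = theta - sum_i (theta - x_i)/|theta - x_i|^2."""
          beta = theta.copy()
          for x in lenses:                               # unit masses
              d = theta - x
              beta -= d / (d**2).sum(axis=1, keepdims=True)
          return beta

      n = 1000                                           # rays per axis
      g = np.linspace(-4.0, 4.0, n)
      theta = np.stack(np.meshgrid(g, g), axis=-1).reshape(-1, 2)
      beta = shoot(theta, lenses)

      # Ray counts per source-plane pixel are proportional to magnification;
      # the N^(-3/4) error law quoted above refers to this collected-ray statistic.
      mag_map, _, _ = np.histogram2d(beta[:, 0], beta[:, 1], bins=200,
                                     range=[[-2.0, 2.0], [-2.0, 2.0]])
      print(mag_map.max(), mag_map.mean())               # caustics give high counts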

  2. Teaching Concept Mapping and University Level Study Strategies Using Computers.

    Science.gov (United States)

    Mikulecky, Larry; And Others

    1989-01-01

    Assesses the utility and effectiveness of three interactive computer programs and associated print materials in instructing and modeling for undergraduates how to comprehend and reconceptualize scientific textbook material. Finds that "how to" reading strategies can be taught via computer and transferred to new material. (RS)

  3. Application of computer-generated functional (parametric) maps in radionuclide renography

    International Nuclear Information System (INIS)

    Agress, H. Jr.; Levenson, S.M.; Gelfand, M.J.; Green, M.V.; Bailey, J.J.; Johnston, G.S.

    1975-01-01

    A functional (parametric) map is a single visual display of regional dynamic phenomena which facilitates interpretation of the nature of focal abnormalities in renal function. Methods for producing several kinds of functional maps based on computer calculations of radionuclide scan data are briefly described. Three abnormal cases are presented to illustrate the use of functional maps to separate focal lesions and to specify the dynamic nature of the abnormalities in a way which is difficult to achieve with conventional sequential renal scans and renograms alone

  4. An Overview of Plume Tracker: Mapping Volcanic Emissions with Interactive Radiative Transfer Modeling

    Science.gov (United States)

    Realmuto, V. J.; Berk, A.; Guiang, C.

    2014-12-01

    Infrared remote sensing is a vital tool for the study of volcanic plumes, and radiative transfer (RT) modeling is required to derive quantitative estimates of the sulfur dioxide (SO2), sulfate aerosol (SO4), and silicate ash (pulverized rock) content of these plumes. In the thermal infrared, we must account for the temperature, emissivity, and elevation of the surface beneath the plume, plume altitude and thickness, and local atmospheric temperature and humidity. Our knowledge of these parameters is never perfect, and interactive mapping allows us to evaluate the impact of these uncertainties on our estimates of plume composition. To enable interactive mapping, the Jet Propulsion Laboratory is collaborating with Spectral Sciences, Inc. (SSI) to develop the Plume Tracker toolkit. This project is funded by a NASA AIST Program Grant (AIST-11-0053) to SSI. Plume Tracker integrates (1) retrieval procedures for surface temperature and emissivity, SO2, NH3, or CH4 column abundance, and scaling factors for H2O vapor and O3 profiles, (2) an RT modeling engine based on MODTRAN, and (3) interactive visualization and analysis utilities under a single graphical user interface. The principal obstacle to interactive mapping is the computational overhead of the RT modeling engine. Under AIST-11-0053 we have achieved a 300-fold increase in the performance of the retrieval procedures through the use of indexed caches of model spectra, optimization of the minimization procedures, and scaling of the effects of surface temperature and emissivity on model radiance spectra. In the final year of AIST-11-0053 we will implement parallel processing to exploit multi-core CPUs and cluster computing, and optimize the RT engine to eliminate redundant calculations when iterating over a range of gas concentrations. These enhancements will result in an additional 8-12X increase in performance. In addition to the improvements in performance, we have improved the accuracy of the Plume Tracker

  5. Computing Radiative Transfer in a 3D Medium

    Science.gov (United States)

    Von Allmen, Paul; Lee, Seungwon

    2012-01-01

    A package of software computes the time-dependent propagation of a narrow laser beam in an arbitrary three-dimensional (3D) medium with absorption and scattering, using the transient-discrete-ordinates method and a direct integration method. Unlike prior software that utilizes a Monte Carlo method, this software enables simulation at very small signal-to-noise ratios. The ability to simulate propagation of a narrow laser beam in a 3D medium is an improvement over other discrete-ordinate software. Unlike other direct-integration software, this software is not limited to simulation of propagation of thermal radiation with broad angular spread in three dimensions or of a laser pulse with narrow angular spread in two dimensions. Uses for this software include (1) computing scattering of a pulsed laser beam on a material having given elastic scattering and absorption profiles, and (2) evaluating concepts for laser-based instruments for sensing oceanic turbulence and related measurements of oceanic mixed-layer depths. With suitable augmentation, this software could be used to compute radiative transfer in ultrasound imaging in biological tissues, radiative transfer in the upper Earth crust for oil exploration, and propagation of laser pulses in telecommunication applications.

  6. TME (Task Mapping Editor): tool for executing distributed parallel computing. TME user's manual

    International Nuclear Information System (INIS)

    Takemiya, Hiroshi; Yamagishi, Nobuhiro; Imamura, Toshiyuki

    2000-03-01

    At the Center for Promotion of Computational Science and Engineering, a software environment PPExe has been developed to support scientific computing on a parallel computer cluster (distributed parallel scientific computing). TME (Task Mapping Editor) is one of the components of PPExe and provides a visual programming environment for distributed parallel scientific computing. Users can specify data dependences among tasks (programs) visually as a data flow diagram and map these tasks onto computers interactively through the GUI of TME. The specified tasks are processed by other components of PPExe, such as the Meta-scheduler, RIM (Resource Information Monitor), and EMS (Execution Management System), according to the execution order of these tasks determined by TME. In this report, we describe the usage of TME. (author)

  7. Geometric optical transfer function and its computation method

    International Nuclear Information System (INIS)

    Wang Qi

    1992-01-01

    The geometric optical transfer function formula is derived after explaining some points that are easily overlooked, and its computation method is given using the zeroth-order Bessel function, numerical integration and spline interpolation. The method has the advantage of ensuring accuracy while saving computation.
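
    For a rotationally symmetric system, the pupil integral behind the geometric OTF reduces to a one-dimensional radial quadrature whose angular part is the zeroth-order Bessel function J0, which is presumably the structure the abstract refers to. A sketch with an invented, defocus-like transverse ray aberration eps(rho):

      import numpy as np
      from scipy.integrate import quad
      from scipy.special import j0

      def geometric_otf(nu, eps, R=1.0):
          """OTF(nu) = (2/R^2) * Int_0^R J0(2*pi*nu*eps(rho)) * rho drho
          for a rotationally symmetric transverse ray aberration eps(rho)."""
          val, _ = quad(lambda rho: j0(2.0 * np.pi * nu * eps(rho)) * rho, 0.0, R)
          return 2.0 * val / R**2

      eps = lambda rho: 0.05 * rho          # linear ray error: pure defocus
      for nu in (0.0, 2.0, 5.0, 10.0):      # spatial frequencies [cycles/unit]
          print(nu, round(geometric_otf(nu, eps), 4))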

  8. Implementation of a solution Cloud Computing with MapReduce model

    International Nuclear Information System (INIS)

    Baya, Chalabi

    2014-01-01

    In recent years, large-scale computer systems have emerged to meet the demands of high storage, supercomputing, and applications using very large data sets. The emergence of Cloud Computing offers the potential for analysis and processing of large data sets. MapReduce is the most popular programming model used to support the development of such applications. It was initially designed by Google, for its large datacenters, to provide Web search services with rapid response and high availability. In this paper we test the K-means clustering algorithm in a Cloud Computing environment. This algorithm is implemented on MapReduce and has been chosen for characteristics that are representative of many iterative data analysis algorithms. We then modify the CloudSim framework to simulate the MapReduce execution of K-means clustering on different Cloud Computing deployments, depending on their size and on the characteristics of the target platforms. The experiments show that the implementation of K-means clustering gives good results, especially for large data sets, and that the Cloud infrastructure has an influence on these results.
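
    The MapReduce decomposition of K-means that the paper relies on is compact enough to sketch in full: the map step assigns each point of a data split to its nearest centroid, and the reduce step averages each cluster's points. This is a generic in-memory sketch of that pattern, not the paper's Hadoop code:

      import numpy as np
      from collections import defaultdict

      def kmeans_map(points, centroids):
          for p in points:                               # one map task per data split
              j = int(np.argmin(((centroids - p)**2).sum(axis=1)))
              yield j, (p, 1)                            # key = cluster id

      def kmeans_reduce(pairs):
          acc = defaultdict(lambda: [0.0, 0])
          for j, (p, c) in pairs:
              acc[j][0] = acc[j][0] + p                  # running vector sum
              acc[j][1] += c
          # empty clusters are simply dropped in this sketch
          return np.array([acc[j][0] / acc[j][1] for j in sorted(acc)])

      rng = np.random.default_rng(0)
      data = rng.normal(size=(500, 2)) + rng.choice([-3.0, 3.0], size=(500, 1))
      centroids = data[:3].copy()
      for _ in range(10):                                # driver re-runs the MR job
          centroids = kmeans_reduce(kmeans_map(data, centroids))
      print(centroids)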

  9. Heat Transfer treatment in computer codes for safety analysis

    International Nuclear Information System (INIS)

    Jerele, A.; Gregoric, M.

    1984-01-01

    The increased number of operating nuclear power plants has stressed the importance of nuclear safety evaluation. For this reason, and in accordance with regulatory commission requirements, safety analyses are performed with computer codes. In this paper, the part of these codes' thermohydraulic models dealing with wall-to-fluid heat transfer correlations in the computer codes TRAC-PF1, RELAP4/MOD5, RELAP5/MOD1 and COBRA-IV is discussed. (author)

  10. Systematic approach for deriving feasible mappings of parallel algorithms to parallel computing platforms

    NARCIS (Netherlands)

    Arkin, Ethem; Tekinerdogan, Bedir; Imre, Kayhan M.

    2017-01-01

    The need for high-performance computing together with the increasing trend from single processor to parallel computer architectures has leveraged the adoption of parallel computing. To benefit from parallel computing power, parallel algorithms are usually defined that can be mapped and executed

  11. Mapping urban green open space in Bontang city using QGIS and cloud computing

    Science.gov (United States)

    Agus, F.; Ramadiani; Silalahi, W.; Armanda, A.; Kusnandar

    2018-04-01

    Digital mapping techniques are freely and openly available, so that map-based application development is easier, faster and cheaper. The rapid development of cloud computing Geographic Information Systems means that such systems can help meet the community's need for the provision of geospatial information online. The presence of urban Green Open Space (GOS) provides great benefits as an oxygen supplier and carbon-binding agent, and can contribute to the comfort and beauty of city life. This study aims to propose a GIS Cloud Computing (CC) platform application for mapping the GOS of Bontang City. The GIS-CC platform uses a free and open-source base map. The research used a survey method to collect GOS data obtained from the Bontang City Government, while the application was developed with Quantum GIS and cloud computing. The results section describes the existing GOS of Bontang City and the design of the GOS mapping application.

  12. Introduction to computational mass transfer with applications to chemical engineering

    CERN Document Server

    Yu, Kuo-Tsong

    2014-01-01

    This book presents a new computational methodology called Computational Mass Transfer (CMT). It offers an approach to rigorously simulating the mass, heat and momentum transfer under turbulent flow conditions with the help of two newly published models, namely the C'2-εC' model and the Reynolds mass flux model, especially with regard to predictions of concentration, temperature and velocity distributions in chemical and related processes. The book will also allow readers to understand the interfacial phenomena accompanying the mass transfer process and methods for modeling the interfacial effect, such as the influences of Marangoni convection and Rayleigh convection. The CMT methodology is demonstrated by means of its applications to typical separation and chemical reaction processes and equipment, including distillation, absorption, adsorption and chemical reactors. Professor Kuo-Tsong Yu is a Member of the Chinese Academy of Sciences. Dr. Xigang Yuan is a Professor at the School of Chemical Engine...

  13. Regularization and computational methods for precise solution of perturbed orbit transfer problems

    Science.gov (United States)

    Woollands, Robyn Michele

    individual algorithms. Following this discussion, the combined parallel algorithm, known as the unified Lambert tool, is presented, and an explanation is given as to how it automatically selects which of the three perturbed solvers to use to compute the perturbed solution for a particular orbit transfer. The unified Lambert tool may be used to determine a single orbit transfer or to generate an extremal field map. A case study is presented for a mission that is required to rendezvous with two pieces of orbit debris (spent rocket boosters). The unified Lambert tool software developed in this dissertation is already being utilized by several industrial partners, and we are confident that it will play a significant role in practical applications, including the solution of Lambert problems that arise in current applications focused on enhanced space situational awareness.

  14. A computational linguistics motivated mapping of ICPC-2 PLUS to SNOMED CT.

    Science.gov (United States)

    Wang, Yefeng; Patrick, Jon; Miller, Graeme; O'Hallaran, Julie

    2008-10-27

    A great challenge in sharing data across information systems in general practice is the lack of interoperability between the different terminologies or coding schemas used in those systems. Mapping of medical vocabularies to a standardised terminology is needed to solve data interoperability problems. We present a system to automatically map the interface terminology ICPC-2 PLUS to SNOMED CT. Three stages of mapping are proposed in this system. UMLS metathesaurus mapping utilises explicit relationships between ICPC-2 PLUS and SNOMED CT terms in the UMLS library to perform the first stage of the mapping. Computational linguistic mapping uses natural language processing techniques and lexical similarities for the second stage of mapping between terminologies. Finally, post-coordination mapping allows one ICPC-2 PLUS term to be mapped to an aggregation of two or more SNOMED CT terms. A total of 5,971 of the 7,410 ICPC-2 PLUS terms (80.58%) were mapped to SNOMED CT using the three stages, but with different levels of accuracy. UMLS mapping achieved the mapping of 53.0% of ICPC-2 PLUS terms to SNOMED CT with a precision of 96.46% and an overall recall of 44.89%. Lexical mapping increased the result to 60.31%, and post-coordination mapping gave an increase of 20.27% in mapped terms. A manual review of part of the mapping shows that the precision of the lexical mappings is around 90%. The accuracy of post-coordination has not been evaluated yet. Unmapped and mismatched terms are due to the structural differences between ICPC-2 PLUS and SNOMED CT. Terms contained in ICPC-2 PLUS but not in SNOMED CT caused a large proportion of the failures in the mappings. Mapping terminologies to a standard vocabulary is a way to facilitate consistent medical data exchange and achieve system interoperability and data standardisation. Broad-scale mapping cannot be achieved by any single method, and methods based on computational linguistics can be very useful for the task.
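
    The flavor of the second, lexical stage can be sketched with a simple token-set (Jaccard) matcher over normalized terms; the example terms are invented, not drawn from the licensed terminologies, and real systems add stemming, synonym expansion and richer scoring:

      def tokens(term):
          return set(term.lower().replace("-", " ").split())

      def best_match(source_term, targets, threshold=0.5):
          """Return the target term with the highest Jaccard token overlap."""
          scored = [(len(tokens(source_term) & tokens(t)) /
                     len(tokens(source_term) | tokens(t)), t) for t in targets]
          score, match = max(scored)
          return match if score >= threshold else None

      snomed = ["acute upper respiratory infection", "fracture of radius",
                "essential hypertension"]
      print(best_match("hypertension essential", snomed))  # -> essential hypertension
      print(best_match("sprained ankle", snomed))          # -> None (no lexical match)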

  15. Massive calculations of electrostatic potentials and structure maps of biopolymers in a distributed computing environment

    International Nuclear Information System (INIS)

    Akishina, T.P.; Ivanov, V.V.; Stepanenko, V.A.

    2013-01-01

    Among the key factors determining the processes of transcription and translation are the distributions of the electrostatic potentials of DNA, RNA and proteins. Calculations of electrostatic distributions and structure maps of biopolymers on computers are time-consuming and require large computational resources. We developed procedures for organizing massive calculations of electrostatic potentials and structure maps for biopolymers in a distributed computing environment (several thousand cores).

  16. The comparative effect of individually-generated vs. collaboratively-generated computer-based concept mapping on science concept learning

    Science.gov (United States)

    Kwon, So Young

    Using a quasi-experimental design, the researcher investigated the comparative effects of individually-generated and collaboratively-generated computer-based concept mapping on middle school science concept learning. Qualitative data were analyzed to explain the quantitative findings. One hundred sixty-one students (74 boys and 87 girls) in eight seventh-grade science classes at a middle school in Southeast Texas completed the entire study. Using prior science performance scores to assure equivalence of student achievement across groups, the researcher assigned the teacher's classes to one of the three experimental groups. The independent variable, group, consisted of three levels: 40 students in a control group, 59 students trained to individually generate concept maps on computers, and 62 students trained to collaboratively generate concept maps on computers. The dependent variables were science concept learning, as demonstrated by comprehension test scores, and the quality of the concept maps created by students in the experimental groups, as demonstrated by rubric scores. Students in the experimental groups received concept mapping training and used their newly acquired concept mapping skills to individually or collaboratively construct computer-based concept maps during study time. The control group, the individually-generated concept mapping group, and the collaboratively-generated concept mapping group had equivalent learning experiences for 50 minutes during five days, except that students in the control group worked independently without concept mapping activities, students in the individual group worked individually to construct concept maps, and students in the collaborative group worked collaboratively to construct concept maps during their study time. Both collaboratively and individually generated computer-based concept mapping had a positive effect on seventh grade middle school science concept learning, but neither strategy was more effective than the other. However

  17. Advanced Computational Methods for Thermal Radiative Heat Transfer

    Energy Technology Data Exchange (ETDEWEB)

    Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.; Hogan, Roy E.,

    2016-10-01

    Participating media radiation (PMR) in weapon safety calculations for abnormal thermal environments is too costly to compute routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.

  18. Heat transfer analyses using computational fluid dynamics in the air blast freezing of guava pulp in large containers

    Directory of Open Access Journals (Sweden)

    W. M. Okita

    2013-12-01

    Full Text Available Heat transfer during the freezing of guava pulp conditioned in large containers, such as stacked boxes (34 L) and buckets (20 L) and unstacked drums (200 L), is discussed. The air velocities across the cross-section of the tunnel were measured, and the values at the outlet of the evaporator were used as the initial conditions in computational fluid dynamics (CFD) simulations. The turbulence model tested was the standard k-ε. The CFD-generated convective heat transfer coefficients were mapped on the surfaces for each configuration and used in procedures for the calculation of freezing-time estimates. These estimates were compared with the experimental results for validation. The results showed that CFD determined representative coefficients and produced good correlations between the predicted and experimental values when applied to the freezing-time estimates for the box and drum configurations. The errors depended on the configuration and the adopted mesh (3-D grid construction).

  19. A computational theory of the hippocampal cognitive map.

    Science.gov (United States)

    O'Keefe, J

    1990-01-01

    Evidence from single unit and lesion studies suggests that the hippocampal formation acts as a spatial or cognitive map (O'Keefe and Nadel, 1978). In this chapter, I summarise some of the unit recording data and then outline the most recent computational version of the cognitive map theory. The novel aspects of the present version of the theory are that it identifies two allocentric parameters, the centroid and the eccentricity, which can be calculated from the array of cues in an environment and which can serve as the bases for an allocentric polar co-ordinate system. Computations within this framework enable the animal to identify its location within an environment, to predict the location which will be reached as a result of any specific movement from that location, and conversely, to calculate the spatial transformation necessary to go from the current location to a desired location. Aspects of the model are identified with the information provided by cells in the hippocampus and dorsal presubiculum. The hippocampal place cells are involved in the calculation of the centroid and the presubicular direction cells in the calculation of the eccentricity.
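
    Read plainly, the two allocentric parameters are simple geometric summaries of the cue array: the centroid is the mean cue position, and the eccentricity can be read as a measure of the array's spread about that centroid. A toy rendering of that reading (the formulas below are this plain-geometry interpretation of the abstract, not O'Keefe's exact definitions):

      import numpy as np

      cues = np.array([[0.0, 0.0], [4.0, 0.0], [4.0, 3.0],
                       [0.0, 3.0], [2.0, 5.0]])          # landmark positions
      centroid = cues.mean(axis=0)                       # allocentric origin
      eccentricity = np.linalg.norm(cues - centroid, axis=1).mean()

      def predicted_location(current, movement):
          """Location reached by a movement, expressed in the centroid frame."""
          return (current - centroid) + movement + centroid

      print(centroid, eccentricity)
      print(predicted_location(np.array([1.0, 1.0]), np.array([0.5, 0.0])))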

  20. The UK Human Genome Mapping Project online computing service.

    Science.gov (United States)

    Rysavy, F R; Bishop, M J; Gibbs, G P; Williams, G W

    1992-04-01

    This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of genetics and molecular biology research excellence, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a 'client/server' network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability could be obtained by a direct approach to the UK HGMP-RC.

  1. Computation of single- and two-phase heat transfer rates suitable for water-cooled tubes and subchannels

    International Nuclear Information System (INIS)

    Groeneveld, D.C.; Leung, L.K.H.; Cheng, S.C.; Nguyen, C.

    1989-01-01

    A computational method for predicting heat transfer, valid for a wide range of flow conditions (from pool boiling and laminar flow conditions to highly turbulent flow), has been developed. It correctly identifies the heat transfer modes and predicts the heat transfer rates as well as the transition points (such as the critical heat flux point) on the boiling curve. The computational heat transfer method consists of a combination of carefully chosen heat transfer equations for each heat transfer mode. Each of these equations has been selected for its accuracy, wide range of application, and correct asymptotic trends. Using a mechanistically-based heat transfer logic, these equations have been combined in a convenient software package suitable for PC or mainframe application. The computational method has been thoroughly tested against many sets of experimental data. The parametric and asymptotic trends of the prediction method have been examined in detail. Correction factors are proposed for extending the use of individual predictive techniques to various geometric configurations and upstream conditions. (orig.)
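
    The "chosen equation per heat transfer mode" structure is easy to picture. A deliberately tiny sketch restricted to single-phase tube flow, with textbook correlations (laminar constant-flux Nusselt number and Dittus-Boelter) standing in for the package's much larger correlation set and transition logic:

      def htc_single_phase(re, pr, k_fluid, d_h):
          """Heat transfer coefficient [W/(m2K)] for flow in a tube of hydraulic
          diameter d_h [m]; k_fluid is the fluid conductivity [W/(mK)]."""
          if re < 2300.0:
              nu = 4.36                          # laminar, constant heat flux
          else:
              nu = 0.023 * re**0.8 * pr**0.4     # Dittus-Boelter (heating)
          return nu * k_fluid / d_h

      # Water-like properties in a 10 mm tube: laminar vs turbulent branch.
      print(htc_single_phase(re=1.0e3, pr=6.0, k_fluid=0.6, d_h=0.01))
      print(htc_single_phase(re=5.0e4, pr=6.0, k_fluid=0.6, d_h=0.01))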

  2. Model-driven product line engineering for mapping parallel algorithms to parallel computing platforms

    NARCIS (Netherlands)

    Arkin, Ethem; Tekinerdogan, Bedir

    2016-01-01

    Mapping parallel algorithms to parallel computing platforms requires several activities such as the analysis of the parallel algorithm, the definition of the logical configuration of the platform, the mapping of the algorithm to the logical configuration platform and the implementation of the

  3. Cost-effective computational method for radiation heat transfer in semi-crystalline polymers

    Science.gov (United States)

    Boztepe, Sinan; Gilblas, Rémi; de Almeida, Olivier; Le Maoult, Yannick; Schmidt, Fabrice

    2018-05-01

    This paper introduces a cost-effective numerical model for the infrared (IR) heating of semi-crystalline polymers. For the numerical and experimental studies presented here, semi-crystalline polyethylene (PE) was used. The optical properties of PE were experimentally analyzed under varying temperature, and the obtained results were used as input in the numerical studies. The model was built on an optically homogeneous medium assumption, whereas the strong variation in the thermo-optical properties of semi-crystalline PE under heating was taken into account. Thus, the change in the amount of radiative energy absorbed by the PE medium, induced by its temperature-dependent thermo-optical properties, was introduced in the model. The computational study was carried out as an iterative closed-loop computation, where the absorbed radiation was computed using an in-house radiation heat transfer algorithm, RAYHEAT, and the computed results were transferred into the commercial software COMSOL Multiphysics to solve the transient heat transfer problem and predict the temperature field. The predicted temperature field was used to iterate the thermo-optical properties of PE, which vary under heating. In order to analyze the accuracy of the numerical model, experimental analyses were carried out performing IR-thermographic measurements during the heating of a PE plate. The applicability of the model in terms of computational cost, number of numerical inputs and accuracy was highlighted.
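
    The closed loop itself can be compressed into a few lines. A greatly simplified 1D sketch, assuming Beer-Lambert volumetric absorption, explicit conduction, and an invented temperature-dependent absorption law kappa(T) (the paper's RAYHEAT/COMSOL coupling is far more detailed):

      import numpy as np

      nz, dz, dt = 50, 1e-4, 0.01                 # 5 mm slab, 10 ms steps
      T = np.full(nz, 300.0)                      # temperature [K]
      alpha, rho_cp, I0 = 1e-7, 2e6, 5e4          # diffusivity, heat capacity, irradiance

      def kappa(T):
          """Hypothetical absorption coefficient [1/m] rising with temperature."""
          return 400.0 + 2.0 * (T - 300.0)

      for step in range(500):
          k = kappa(T)                            # re-evaluate optical properties
          tau = np.cumsum(k) * dz                 # optical depth from the heated face
          q_abs = I0 * k * np.exp(-tau)           # Beer-Lambert volumetric source [W/m^3]
          lap = np.zeros_like(T)
          lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dz**2
          T += dt * (alpha * lap + q_abs / rho_cp)    # conduction + radiative source
      print(T[:5].round(1))                       # hottest near the irradiated surface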

  4. Transfer of numeric ASCII data files between Apple and IBM personal computers.

    Science.gov (United States)

    Allan, R W; Bermejo, R; Houben, D

    1986-01-01

    Listings for programs designed to transfer numeric ASCII data files between Apple and IBM personal computers are provided with accompanying descriptions of how the software operates. Details of the hardware used are also given. The programs may be easily adapted for transferring data between other microcomputers.

  5. Non-linear heat transfer computer code by finite element method

    International Nuclear Information System (INIS)

    Nagato, Kotaro; Takikawa, Noboru

    1977-01-01

    The computer code THETA-2D, for the calculation of temperature distributions by the two-dimensional finite element method, was written for the analysis of heat transfer in high-temperature structures. Numerical experiments were performed on the time integration of the differential equation of heat conduction. In these experiments, the Runge-Kutta method produced an unstable solution; a stable solution was obtained by the β method with a β value of 0.35. In high-temperature structures, radiative heat transfer cannot be neglected. To introduce a radiative heat transfer term, a functional neglecting radiative heat transfer was first derived; the radiative term was then added after discretization by the variational method. Five model calculations were carried out with the computer code. A calculation of steady heat conduction was performed: with an estimated initial temperature of 1,000 degrees C, a reasonable heat balance was obtained. In the steady-unsteady temperature calculation, the time integration by THETA-2D turned out to underestimate the enthalpy change. With a one-dimensional model, the temperature distribution was calculated in a structure in which heat conductivity is dependent on temperature. A calculation with a model which has a void inside was performed. Finally, a model calculation for a complex system was carried out. (Kato, T.)
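
    The β (weighted) time-integration scheme mentioned above, T_{n+1} = T_n + Δt[(1-β)f(T_n) + βf(T_{n+1})], is worth seeing on a linear test problem f(T) = -λT, where the implicit update has a closed form; β = 0.35 is the value quoted in the abstract:

      lam, dt, beta = 2.0, 0.1, 0.35          # decay rate, step, weighting
      T = 1000.0                              # initial temperature
      for _ in range(50):
          # closed-form implicit update for f(T) = -lam*T
          T *= (1.0 - (1.0 - beta) * lam * dt) / (1.0 + beta * lam * dt)
      print(T)                                # smooth decay toward zero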

  6. A high-resolution computational localization method for transcranial magnetic stimulation mapping.

    Science.gov (United States)

    Aonuma, Shinta; Gomez-Tames, Jose; Laakso, Ilkka; Hirata, Akimasa; Takakura, Tomokazu; Tamura, Manabu; Muragaki, Yoshihiro

    2018-05-15

    Transcranial magnetic stimulation (TMS) is used for the mapping of brain motor functions. The complexity of the brain makes it difficult to determine the exact localization of the stimulation site using simplified methods (e.g., the region below the center of the TMS coil) or conventional computational approaches. This study aimed to present a high-precision localization method for a specific motor area by synthesizing computed non-uniform current distributions in the brain for multiple sessions of TMS. Peritumoral mapping by TMS was conducted on patients who had intra-axial brain neoplasms located within or close to the motor speech area. The electric field induced by TMS was computed using realistic head models constructed from magnetic resonance images of the patients. A post-processing method was implemented to determine a TMS hotspot by combining the computed electric fields for the coil orientations and positions that delivered high motor-evoked potentials during peritumoral mapping. The method was compared to the stimulation site localized via intraoperative direct brain stimulation and navigated TMS. Four main results were obtained: 1) the dependence of the computed hotspot area on the number of peritumoral measurements was evaluated; 2) the estimated localization of the hand motor area in eight non-affected hemispheres was in good agreement with the position of the so-called "hand-knob"; 3) the estimated hotspot areas were not sensitive to variations in tissue conductivity; and 4) the hand motor areas estimated by this proposal and by direct electric stimulation (DES) were in good agreement in the ipsilateral hemisphere of four glioma patients. The TMS localization method was validated by the well-known position of the "hand-knob" in the non-affected hemisphere, and by a hotspot localized via DES during awake craniotomy for the tumor-containing hemisphere. Copyright © 2018 Elsevier Inc. All rights reserved.
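
    Schematically, the post-processing step amounts to a response-weighted combination of the computed field maps followed by a peak search. The arrays below are random stand-ins for the patient-specific field solutions, so this only illustrates the combination logic, not the paper's method in detail:

      import numpy as np

      rng = np.random.default_rng(2)
      efields = rng.random((6, 32, 32))     # |E| maps for six coil placements (toy)
      meps = np.array([0.9, 0.8, 0.7, 0.1, 0.05, 0.1])   # evoked responses

      # Response-weighted combination of the field maps, then a peak search.
      combined = (meps[:, None, None] * efields).sum(axis=0) / meps.sum()
      hotspot = np.unravel_index(np.argmax(combined), combined.shape)
      print("hotspot voxel:", hotspot)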

  7. Effects of a Computer-Assisted Concept Mapping Learning Strategy on EFL College Students' English Reading Comprehension

    Science.gov (United States)

    Liu, Pei-Lin; Chen, Chiu-Jung; Chang, Yu-Ju

    2010-01-01

    The purpose of this research was to investigate the effects of a computer-assisted concept mapping learning strategy on EFL college learners' English reading comprehension. The research questions were: (1) what was the influence of the computer-assisted concept mapping learning strategy on different learners' English reading comprehension? (2) did…

  8. A Novel Transfer Learning Method Based on Common Space Mapping and Weighted Domain Matching

    KAUST Repository

    Liang, Ru-Ze; Xie, Wei; Li, Weizhi; Wang, Hongqi; Wang, Jim Jing-Yan; Taylor, Lisa

    2017-01-01

    In this paper, we propose a novel learning framework for the problem of domain transfer learning. We map the data of two domains to one single common space, and learn a classifier in this common space. Then we adapt the common classifier to the two domains by adding two adaptive functions to it respectively. In the common space, the source domain data points are weighted and matched to the target domain in terms of distributions. The weighting terms of source domain data points and the target domain classification responses are also regularized by the local reconstruction coefficients. The novel transfer learning framework is evaluated over some benchmark cross-domain data sets, and it outperforms the existing state-of-the-art transfer learning methods.

  10. The transfer from survey (map-like) to route representations into Virtual Reality Mazes: effect of age and cerebral lesion.

    Science.gov (United States)

    Carelli, Laura; Rusconi, Maria Luisa; Scarabelli, Chiara; Stampatori, Chiara; Mattioli, Flavia; Riva, Giuseppe

    2011-01-31

    To go from one place to another, we routinely generate internal representations of surrounding spaces, which can include egocentric (body-centred) and allocentric (world-centred) coordinates, combined into route and survey representations. Recent studies have shown how both egocentric and allocentric representations exist in parallel and are joined to support behaviour according to the task. Our study investigated the transfer from survey (map-like) to route representations in healthy and brain-damaged subjects. The aim was two-fold: first, to understand how this ability changes with age in a sample of healthy participants, aged from 40 to 71 years; second, to investigate how it is affected after a brain lesion in a sample of 8 patients, with reference to specific neuropsychological frames. Participants were first required to perform the paper-and-pencil version of eight mazes, then to translate the map-like paths into egocentric routes, in order to find the right way through equivalent Virtual Reality (VR) mazes. Patients also underwent a comprehensive neuropsychological evaluation, including a specific investigation of some topographical orientation components. As regards the healthy sample, we found age-related deterioration in VR task performance. While education level and gender were not found to be related to performance, global cognitive level (Mini Mental State Examination), previous experience with computers and fluidity of navigation in the VR appeared to influence VR task results. In the clinical sample, there was a difficulty in performing the VR maze task; deficits in executive functions and visuo-spatial abilities appeared to be more relevant for predicting patients' results. Our study suggests the importance of developing tools aimed at investigating the survey-to-route transfer ability in both healthy elderly and clinical samples, since this skill seems highly cognitively demanding and sensitive to cognitive decline.

  11. The transfer from survey (map-like) to route representations into Virtual Reality Mazes: effect of age and cerebral lesion

    Directory of Open Access Journals (Sweden)

    Stampatori Chiara

    2011-01-01

    Full Text Available Abstract Background To go from one place to another, we routinely generate internal representations of surrounding spaces, which can include egocentric (body-centred) and allocentric (world-centred) coordinates, combined into route and survey representations. Recent studies have shown how both egocentric and allocentric representations exist in parallel and are joined to support behaviour according to the task. Our study investigated the transfer from survey (map-like) to route representations in healthy and brain-damaged subjects. The aim was two-fold: first, to understand how this ability changes with age in a sample of healthy participants, aged from 40 to 71 years; second, to investigate how it is affected after a brain lesion in a sample of 8 patients, with reference to specific neuropsychological frames. Methods Participants were first required to perform the paper-and-pencil version of eight mazes, then to translate the map-like paths into egocentric routes, in order to find the right way through equivalent Virtual Reality (VR) mazes. Patients also underwent a comprehensive neuropsychological evaluation, including a specific investigation of some topographical orientation components. Results As regards the healthy sample, we found age-related deterioration in VR task performance. While education level and gender were not found to be related to performance, global cognitive level (Mini Mental State Examination), previous experience with computers and fluidity of navigation in the VR appeared to influence VR task results. In the clinical sample, there was a difficulty in performing the VR maze task; deficits in executive functions and visuo-spatial abilities appeared to be more relevant for predicting patients' results. Conclusions Our study suggests the importance of developing tools aimed at investigating the survey-to-route transfer ability in both healthy elderly and clinical samples, since this skill seems highly cognitive

  12. Large break LOCA analysis for retrofitted ECCS at MAPS using modified computer code ATMIKA

    International Nuclear Information System (INIS)

    Singhal, Mukesh; Khan, T.A.; Yadav, S.K.; Pramod, P.; Rammohan, H.P.; Bajaj, S.S.

    2002-01-01

    Full text: The computer code ATMIKA, which has been used for thermal hydraulic analysis, is based on an unequal velocity equal temperature (UVET) model. Thermal hydraulic transients were predicted using three conservation equations and a drift flux model. The modified drift flux model is now able to predict counter-current flow and the relative velocity in vertical channels more accurately. Apart from this, a stratification model is also introduced to predict fuel behaviour under stratified conditions. Many more improvements were carried out with respect to the solution of the conservation equations, the heat transfer package and the frictional pressure drop model. All these modifications have been well validated against published data on RD-12/RD-14 experiments. This paper describes the code modifications and also deals with the application of the code to the large break LOCA analysis for the retrofitted emergency core cooling system (ECCS) being implemented at Madras Atomic Power Station (MAPS). The paper also brings out the effect of the accumulator on stratification and fuel behaviour.

  13. Computational investigation of fluid flow and heat transfer of an economizer by porous medium approach

    Science.gov (United States)

    Babu, C. Rajesh; Kumar, P.; Rajamohan, G.

    2017-07-01

    Computation of fluid flow and heat transfer in an economizer is simulated by a porous medium approach, with plain tubes in a horizontal in-line arrangement and a cross-flow arrangement, in a coal-fired thermal power plant. The economizer is a thermo-mechanical device that captures waste heat from the thermal exhaust flue gases through heat transfer surfaces to preheat boiler feed water. In order to evaluate the fluid flow and heat transfer over the tubes, a numerical analysis of heat transfer performance is carried out on a 110 t/h MCR (maximum continuous rating) boiler unit. In this study, thermal performance is investigated through computational fluid dynamics (CFD) simulation using ANSYS FLUENT. The fouling factor ε and the overall heat transfer coefficient ψ are employed to evaluate the fluid flow and heat transfer. The model demands significant computational detail for geometric modeling, grid generation, and numerical calculations to evaluate the thermal performance of the economizer. The simulation results show that the overall heat transfer coefficient of 37.76 W/(m2K) and the economizer coil side pressure drop of 0.2 kg/cm2 are within tolerable limits when compared with existing industrial economizer data.

  14. A new computational method for the detection of horizontal gene transfer events.

    Science.gov (United States)

    Tsirigos, Aristotelis; Rigoutsos, Isidore

    2005-01-01

    In recent years, the increase in the amount of available genomic data has made it easier to appreciate the extent to which organisms increase their genetic diversity through horizontally transferred genetic material. Such transfers have the potential to give rise to extremely dynamic genomes in which a significant proportion of the coding DNA has been contributed by external sources. Because of the impact of these horizontal transfers on the ecological and pathogenic character of the recipient organisms, methods are continuously sought that are able to computationally determine which of the genes of a given genome are products of transfer events. In this paper, we introduce and discuss a novel computational method for identifying horizontal transfers that relies on a gene's nucleotide composition and obviates the need for knowledge of codon boundaries. In addition to being applicable to individual genes, the method can be easily extended to the case of clusters of horizontally transferred genes. With the help of an extensive and carefully designed set of experiments on 123 archaeal and bacterial genomes, we demonstrate that the new method exhibits significant improvement in sensitivity when compared to previously published approaches. In fact, it achieves an average relative improvement across genomes of between 11 and 41% compared to the Codon Adaptation Index method in distinguishing native from foreign genes. Our method's horizontal gene transfer predictions for 123 microbial genomes are available online at http://cbcsrv.watson.ibm.com/HGT/.
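
    The composition-based idea is easy to sketch (this is not the authors' exact score): flag genes whose k-mer usage diverges most from the genome-wide background, with k chosen so that no codon boundaries are needed, mirroring the codon-free design noted above. The toy sequences are invented:

      from collections import Counter
      from math import log

      def kmer_freqs(seq, k=4):
          counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
          total = sum(counts.values())
          return {w: c / total for w, c in counts.items()}

      def divergence(gene, background, k=4, eps=1e-9):
          """KL divergence of the gene's k-mer usage from the genome background."""
          return sum(p * log(p / background.get(w, eps))
                     for w, p in kmer_freqs(gene, k).items())

      genome = "ATGGCA" * 1500 + "GGCGCGCC" * 500          # toy "genome"
      bg = kmer_freqs(genome)
      native = "ATGGCA" * 20                               # matches background usage
      foreign = "TTATTAAATTTTAT" * 9                       # AT-rich, atypical
      print(divergence(native, bg), divergence(foreign, bg))  # foreign scores higher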

  15. PERKAM: Personalized Knowledge Awareness Map for Computer Supported Ubiquitous Learning

    Science.gov (United States)

    El-Bishouty, Moushir M.; Ogata, Hiroaki; Yano, Yoneo

    2007-01-01

    This paper introduces a ubiquitous computing environment to support learners while doing tasks; this environment is called PERKAM (PERsonalized Knowledge Awareness Map). PERKAM allows learners to share knowledge, interact, collaborate, and exchange individual experiences. It utilizes ubiquitous RFID technology to detect the…

  16. Teaching Computer-Aided Design of Fluid Flow and Heat Transfer Engineering Equipment.

    Science.gov (United States)

    Gosman, A. D.; And Others

    1979-01-01

    Describes a teaching program for fluid mechanics and heat transfer which contains both computer aided learning (CAL) and computer aided design (CAD) components and argues that the understanding of the physical and numerical modeling taught in the CAL course is essential to the proper implementation of CAD. (Author/CMV)

  17. Transport maps and dimension reduction for Bayesian computation

    KAUST Repository

    Marzouk, Youssef

    2015-01-01

    We introduce a new framework for efficient sampling from complex probability distributions, using a combination of optimal transport maps and the Metropolis-Hastings rule. The core idea is to use continuous transportation to transform typical Metropolis proposal mechanisms (e.g., random walks, Langevin methods) into non-Gaussian proposal distributions that can more effectively explore the target density. Our approach adaptively constructs a lower triangular transport map—an approximation of the Knothe-Rosenblatt rearrangement—using information from previous MCMC states, via the solution of an optimization problem. This optimization problem is convex regardless of the form of the target distribution. It is solved efficiently using a Newton method that requires no gradient information from the target probability distribution; the target distribution is instead represented via samples. Sequential updates enable efficient and parallelizable adaptation of the map even for large numbers of samples. We show that this approach uses inexact or truncated maps to produce an adaptive MCMC algorithm that is ergodic for the exact target distribution. Numerical demonstrations on a range of parameter inference problems show order-of-magnitude speedups over standard MCMC techniques, measured by the number of effectively independent samples produced per target density evaluation and per unit of wallclock time. We will also discuss adaptive methods for the construction of transport maps in high dimensions, where use of a non-adapted basis (e.g., a total order polynomial expansion) can become computationally prohibitive. If only samples of the target distribution, rather than density evaluations, are available, then we can construct high-dimensional transformations by composing sparsely parameterized transport maps with rotations of the parameter space. If evaluations of the target density and its gradients are available, then one can exploit the structure of the variational
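
    The lower triangular structure referred to above can be written explicitly. For a target density \(\pi\) on \(\mathbb{R}^n\) and a reference density \(\eta\) (e.g., a standard Gaussian), the Knothe-Rosenblatt rearrangement is a map

\[
T(x) = \bigl(T_1(x_1),\; T_2(x_1, x_2),\; \dots,\; T_n(x_1, \dots, x_n)\bigr),
\]

    whose triangularity reduces the Jacobian determinant to a product of one-dimensional partial derivatives,

\[
\det \nabla T(x) = \prod_{k=1}^{n} \frac{\partial T_k}{\partial x_k}(x_1, \dots, x_k),
\]

    so the pullback density \(\eta(x) = \pi(T(x))\,\lvert\det \nabla T(x)\rvert\) is cheap to evaluate inside a Metropolis-Hastings acceptance ratio.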

  19. Quantum computing based on space states without charge transfer

    International Nuclear Information System (INIS)

    Vyurkov, V.; Filippov, S.; Gorelik, L.

    2010-01-01

    An implementation of a quantum computer based on space states in double quantum dots is discussed. There is no charge transfer in the qubits during a calculation; therefore, uncontrolled entanglement between qubits due to the long-range Coulomb interaction is suppressed. Encoding and processing of quantum information are performed on the symmetric and antisymmetric states of an electron in a double quantum dot. Other plausible sources of decoherence, caused by interaction with phonons and gates, can also be substantially suppressed in this structure. We also demonstrate how all necessary quantum logic operations, initialization, writing, and read-out can be carried out in the computer.
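
    In standard notation, the logical states described above are the symmetric and antisymmetric superpositions of the electron localized in the left (\(|L\rangle\)) and right (\(|R\rangle\)) dots:

\[
|0\rangle = \tfrac{1}{\sqrt{2}}\bigl(|L\rangle + |R\rangle\bigr), \qquad
|1\rangle = \tfrac{1}{\sqrt{2}}\bigl(|L\rangle - |R\rangle\bigr),
\]

    so both logical states share the same charge distribution and no charge is transferred when switching between them.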

  20. A computational procedure for finding multiple solutions of convective heat transfer equations

    International Nuclear Information System (INIS)

    Mishra, S; DebRoy, T

    2005-01-01

    In recent years, numerical solutions of the convective heat transfer equations have provided significant insight into complex materials processing operations. However, these computational methods suffer from two major shortcomings. First, these procedures are designed to calculate temperature fields and cooling rates as output, and the unidirectional structure of these solutions precludes specifying those variables as input even when their desired values are known. Second, and more important, these procedures cannot determine multiple pathways, i.e. multiple sets of input variables, that achieve a particular output of the convective heat transfer equations. Here we propose a new method that overcomes these shortcomings. The procedure combines conventional numerical solution methods with a real-number-based genetic algorithm (GA) to achieve bi-directionality, i.e. the ability to calculate the input variables required to achieve a specific output such as a temperature field or cooling rate. More importantly, the GA's ability to find a population of solutions enables this procedure to search for and find multiple sets of input variables, all of which lead to the desired output. The proposed computational procedure has been applied to convective heat transfer in a liquid layer locally heated on its free surface by an electric arc, where various sets of input variables are computed to achieve a specific fusion zone geometry defined by an equilibrium temperature. Good agreement is achieved between the model predictions and independent experimental results, indicating significant promise for the application of this procedure in finding multiple solutions of the convective heat transfer equations
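
    A minimal sketch of the bi-directional idea: a real-coded GA searching for multiple input sets that reproduce a target output of a toy forward model. The model, bounds, and GA settings are illustrative stand-ins, not the paper's heat transfer solver.

```python
import random

def model(inputs):
    """Toy forward model standing in for a full convective heat
    transfer solution: maps input variables to a scalar output."""
    power, speed = inputs
    return 0.8 * power / (1.0 + 0.05 * speed)

def evolve(target, bounds, pop_size=50, gens=200, tol=1e-2):
    """Real-coded GA returning *many* input sets that hit the target."""
    pop = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: abs(model(ind) - target))
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            w = random.random()                       # blend crossover
            child = [w * x + (1.0 - w) * y for x, y in zip(a, b)]
            child = [min(max(g + random.gauss(0.0, 0.1), lo), hi)
                     for g, (lo, hi) in zip(child, bounds)]  # mutation
            children.append(child)
        pop = survivors + children
    # The whole final population, not a single optimum: distinct input
    # sets that all reproduce the desired output within tolerance.
    return [ind for ind in pop if abs(model(ind) - target) < tol]

solutions = evolve(target=40.0, bounds=[(10.0, 100.0), (1.0, 20.0)])
```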

  1. Optimal Selection Method of Process Patents for Technology Transfer Using Fuzzy Linguistic Computing

    Directory of Open Access Journals (Sweden)

    Gangfeng Wang

    2014-01-01

    Full Text Available Under the open innovation paradigm, technology transfer of process patents is one of the most important mechanisms for manufacturing companies to implement process innovation and enhance their competitive edge. To achieve promising technology transfers, the feasibility of process patents must be evaluated and the most appropriate patent selected according to the actual manufacturing situation. Hence, this paper proposes an optimal selection method for process patents using multiple-criteria decision-making and 2-tuple fuzzy linguistic computing, which avoids information loss during the aggregation of evaluations. An evaluation index system for the technology transfer feasibility of process patents is designed first. A fuzzy linguistic computing approach is then applied to aggregate the evaluations of the weights for each criterion and its corresponding subcriteria. Furthermore, performance ratings for subcriteria and fuzzy aggregated ratings of criteria are calculated, yielding the overall technology transfer feasibility of each patent alternative. Finally, a case study of aero-engine turbine manufacturing is presented to demonstrate the applicability of the proposed method.
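
    The 2-tuple linguistic representation underlying the method (in the Herrera-Martinez style) keeps a symbolic translation alongside each label so that aggregation loses no information. A minimal sketch with an illustrative label set and weights:

```python
def to_two_tuple(beta):
    """Herrera-Martinez 2-tuple: beta in [0, g] -> (label index, alpha),
    with the symbolic translation alpha kept to avoid rounding loss."""
    i = round(beta)
    return i, beta - i

def weighted_aggregate(two_tuples, weights):
    """Weighted mean of 2-tuples, returned again as a 2-tuple."""
    total = sum(weights)
    beta = sum((i + a) * w for (i, a), w in zip(two_tuples, weights)) / total
    return to_two_tuple(beta)

labels = ["very low", "low", "medium", "high", "very high"]  # s0..s4
ratings = [to_two_tuple(b) for b in (3.0, 2.4, 3.7)]   # expert scores
idx, alpha = weighted_aggregate(ratings, [0.5, 0.3, 0.2])
print(labels[idx], round(alpha, 2))   # -> high -0.04
```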

  2. Computer assisted surgical anatomy mapping: applications in surgical anatomy research, tailor-made surgery and personalized teaching

    NARCIS (Netherlands)

    A.L.A. Kerver (Anton)

    2017-01-01

    This thesis presents a novel anatomy mapping tool named Computer Assisted Surgical Anatomy Mapping (CASAM). It allows researchers to map complex anatomy of multiple specimens and compare their location and course. Renditions such as safe zones or danger zones can be visualized,

  3. Contour Detection for UAV-Based Cadastral Mapping

    Directory of Open Access Journals (Sweden)

    Sophie Crommelinck

    2017-02-01

    Full Text Available Unmanned aerial vehicles (UAVs) provide a flexible and low-cost solution for the acquisition of high-resolution data. The potential of high-resolution UAV imagery to create and update cadastral maps is being increasingly investigated. Existing procedures generally involve substantial fieldwork and many manual processes. Arguably, multiple parts of UAV-based cadastral mapping workflows could be automated. Specifically, as many cadastral boundaries coincide with visible boundaries, they could be extracted automatically using image analysis methods. This study investigates the transferability of gPb contour detection, a state-of-the-art computer vision method, to remotely sensed UAV images and UAV-based cadastral mapping. Results show that the approach is transferable to UAV data and automated cadastral mapping: object contours are comprehensively detected at completeness and correctness rates of up to 80%. The detection quality is optimal when the entire scene is covered with one orthoimage, due to the global optimization of gPb contour detection. However, a balance between high completeness and correctness is hard to achieve, so a combination with area-based segmentation and further object knowledge is proposed. The localization quality exhibits the usual dependency on ground resolution. The approach has the potential to accelerate the process of general boundary delineation during the creation and updating of cadastral maps.

  4. Multiple Concurrent Visual-Motor Mappings: Implications for Models of Adaptation

    Science.gov (United States)

    Cunningham, H. A.; Welch, Robert B.

    1994-01-01

    Previous research on adaptation to visual-motor rearrangement suggests that the central nervous system represents accurately only 1 visual-motor mapping at a time. This idea was examined in 3 experiments where subjects tracked a moving target under repeated alternations between 2 initially interfering mappings (the 'normal' mapping characteristic of computer input devices and a 108° rotation of the normal mapping). Alternation between the 2 mappings led to significant reduction in error under the rotated mapping and significant reduction in the adaptation aftereffect ordinarily caused by switching between mappings. Color as a discriminative cue, interference versus decay in adaptation aftereffect, and intermanual transfer were also examined. The results reveal a capacity for multiple concurrent visual-motor mappings, possibly controlled by a parametric process near the motor output stage of processing.

  5. Computational Fluid Dynamics Based Extraction of Heat Transfer Coefficient in Cryogenic Propellant Tanks

    Science.gov (United States)

    Yang, H. Q.; West, Jeff

    2015-01-01

    Current reduced-order thermal models for cryogenic propellant tanks are based on correlations built for flat plates collected in the 1950s. The use of these correlations suffers from inaccurate geometry representation, inaccurate gravity orientation, an ambiguous length scale, and a lack of detailed validation. The work presented under this task uses the first-principles-based Computational Fluid Dynamics (CFD) technique to compute heat transfer from the tank wall to the cryogenic fluids, and extracts and correlates the equivalent heat transfer coefficient to support a reduced-order thermal model. The CFD tool was first validated against available experimental data and commonly used correlations for natural convection along a vertically heated wall. Good agreement between the present predictions and experimental data has been found for flows in the laminar as well as turbulent regimes. The convective heat transfer between the tank wall and the cryogenic propellant, and that between the tank wall and the ullage gas, were then simulated. The results showed that commonly used heat transfer correlations for either a vertical or a horizontal plate overpredict the heat transfer rate for the cryogenic tank, in some cases by as much as one order of magnitude. A characteristic length scale has been defined that collapses the heat transfer coefficients for different fill levels onto a single curve. This curve can be used in reduced-order heat transfer model analysis.
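
    The flat-plate correlations benchmarked above are typically of the natural-convection power-law form (a generic sketch; the constants depend on geometry and flow regime, and the study's contribution is the characteristic length L that collapses the tank data):

\[
\mathrm{Nu}_L = \frac{hL}{k} = C\,\mathrm{Ra}_L^{\,n}, \qquad
\mathrm{Ra}_L = \frac{g\,\beta\,\Delta T\,L^3}{\nu\,\alpha},
\]

    with \(n \approx 1/4\) in the laminar regime and \(n \approx 1/3\) in the turbulent regime.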

  6. Functional magnetic resonance maps obtained by personal computer; Mapas de resonancia magnetica funcional obtenidos con PC

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, F. j.; Manjon, J. V.; Robles, M. [Universidad Politecnica de Valencia (Spain); Marti-Bonmati, L.; Dosda, R. [Cinica Quiron. Valencia (Spain); Molla, E. [Universidad de Valencia (Spain)

    2001-07-01

    Functional magnetic resonance (fMR) is of special relevance in the analysis of certain types of brain activation. The present report describes the development of a simple software program for use with personal computers (PCs) that analyzes these images and provides functional activation maps. Activation maps are based on the temporal differences in oxyhemoglobin in tomographic images. To detect these differences, intensities registered repeatedly during brain control and activation are compared. The experiments were performed with a 1.5-Tesla MR unit. To verify the reliability of the program, fMR studies were carried out in 4 healthy individuals (12 contiguous slices, 80 images per slice every 3.1 seconds, for a total of 960 images). All the images were transferred to a PC and processed pixel by pixel within each sequence to obtain an intensity/time curve. The statistical study of the results (Student's t-test and cross-correlation analysis) made it possible to establish the activation of each pixel. The images were prepared using spatial filtering, temporal filtering, baseline correction, normalization, and segmentation of the parenchyma. The postprocessing of the results involved the elimination of single pixels, superposition of an anatomical image of greater spatial resolution, and anti-aliasing. The application (Xfun 1.0, Valencia, Spain) was developed in Microsoft Visual C++ 5.0 Developer Studio for Windows NT Workstation. As a representative example, the program took 8.2 seconds to calculate and present the results of an entire study (12 functional maps). In the motor and visual activation experiments, activation was observed in regions proximal to the central sulcus of the hemisphere contralateral to the hand that moved and in the occipital cortex. While programs that calculate activation maps are available, the development of software for PCs running Microsoft Windows ensures several key features for its use on a daily basis: it is
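
    A per-pixel activation test of the kind described, correlating each pixel's intensity time course with the task block design and thresholding, can be sketched in a few lines. The array shapes, block design, and threshold are illustrative, not the program's exact statistics.

```python
import numpy as np

def activation_map(series, design, r_threshold=0.4):
    """series: (T, H, W) pixel time courses; design: (T,) 0/1 task vector.
    Returns a boolean activation map from pixelwise correlation."""
    t, h, w = series.shape
    pixels = series.reshape(t, -1).astype(float)
    pixels -= pixels.mean(axis=0)                 # baseline correction
    d = design - design.mean()
    num = pixels.T @ d
    den = np.linalg.norm(pixels, axis=0) * np.linalg.norm(d) + 1e-12
    r = num / den                                 # correlation per pixel
    return r.reshape(h, w) > r_threshold

design = np.tile(np.repeat([0, 1], 10), 4)        # 80 scans, 10-scan blocks
volumes = np.random.rand(80, 64, 64)              # stand-in for MR data
mask = activation_map(volumes, design)
```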

  7. Biomedical cloud computing with Amazon Web Services.

    Science.gov (United States)

    Fusaro, Vincent A; Patil, Prasad; Gafni, Erik; Wall, Dennis P; Tonellato, Peter J

    2011-08-01

    In this overview of biomedical computing in the cloud, we discussed two primary ways to use the cloud (a single instance or a cluster), provided a detailed example using NGS mapping, and highlighted the associated costs. While many users new to the cloud may assume that entry is as straightforward as uploading an application and selecting an instance type and storage options, we illustrated that there is substantial up-front effort required before an application can make full use of the cloud's vast resources. Our intention was to provide a set of best practices and to illustrate how those apply to a typical application pipeline for biomedical informatics, but also general enough for extrapolation to other types of computational problems. Our mapping example was intended to illustrate how to develop a scalable project and not to compare and contrast alignment algorithms for read mapping and genome assembly. Indeed, with a newer aligner such as Bowtie, it is possible to map the entire African genome using one m2.2xlarge instance in 48 hours for a total cost of approximately $48 in computation time. In our example, we were not concerned with data transfer rates, which are heavily influenced by the amount of available bandwidth, connection latency, and network availability. When transferring large amounts of data to the cloud, bandwidth limitations can be a major bottleneck, and in some cases it is more efficient to simply mail a storage device containing the data to AWS (http://aws.amazon.com/importexport/). More information about cloud computing, detailed cost analysis, and security can be found in the references.

  8. Cost Savings Associated with the Adoption of a Cloud Computing Data Transfer System for Trauma Patients.

    Science.gov (United States)

    Feeney, James M; Montgomery, Stephanie C; Wolf, Laura; Jayaraman, Vijay; Twohig, Michael

    2016-09-01

    Among transferred trauma patients, challenges with the transfer of radiographic studies include problems loading or viewing the studies at the receiving hospitals, and problems manipulating, reconstructing, or evaluating the transferred images. Cloud-based image transfer systems may address some of these problems. We reviewed the charts of patients transferred during the one year surrounding the adoption of a cloud computing data transfer system. We compared the rates of repeat imaging before (precloud) and after (postcloud) the adoption of the cloud-based data transfer system. During the precloud period, 28 out of 100 patients required 90 repeat studies. With the cloud computing transfer system in place, three out of 134 patients required seven repeat films. There was a statistically significant decrease in the proportion of patients requiring repeat films (28% to 2.2%, P < .0001). Based on an annualized volume of 200 trauma patient transfers, the cost savings estimated using three methods of cost analysis is between $30,272 and $192,453.

  9. Symplectic maps and chromatic optics in particle accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Cai, Yunhai

    2015-10-11

    We have applied the nonlinear map method to comprehensively characterize the chromatic optics in particle accelerators. Our approach is built on the foundation of symplectic transfer maps of magnetic elements. The chromatic lattice parameters can be transported from one element to another by the maps. We introduce a Jacobian operator that provides an intrinsic linkage between the maps and the matrix with parameter dependence. The link allows us to directly apply the formulation of the linear optics to compute the chromatic lattice parameters. As an illustration, we analyze an alternating-gradient cell with nonlinear sextupoles, octupoles, and decapoles and derive analytically their settings for local chromatic compensation. As a result, the cell becomes nearly perfect up to third order in the momentum deviation.
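
    A sketch of the parameter-dependent matrix idea (a generic expansion in the momentum deviation \(\delta = \Delta p/p\), not the paper's full Lie-algebraic formulation): the transfer matrix and the tune inherit power series in \(\delta\),

\[
R(\delta) = R_0 + R_1\,\delta + R_2\,\delta^2 + \cdots, \qquad
\nu(\delta) = \nu_0 + \xi_1\,\delta + \xi_2\,\delta^2 + \cdots,
\]

    where \(\xi_1\) and \(\xi_2\) are the first- and second-order chromaticities that the sextupole, octupole, and decapole settings are chosen to compensate locally.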

  10. oRGB: a practical opponent color space for computer graphics.

    Science.gov (United States)

    Bratkova, Margarita; Boulos, Solomon; Shirley, Peter

    2009-01-01

    Designed for computer graphics, oRGB is a new color model based on opponent color theory. It works well for both HSV-style color selection and computational applications such as color transfer. oRGB also enables new applications such as a quantitative cool-to-warm metric, intuitive color manipulation and variations, and simple gamut mapping.

  11. Secure information transfer based on computing reservoir

    Energy Technology Data Exchange (ETDEWEB)

    Szmoski, R.M.; Ferrari, F.A.S. [Department of Physics, Universidade Estadual de Ponta Grossa, 84030-900, Ponta Grossa (Brazil); Pinto, S.E. de S, E-mail: desouzapinto@pq.cnpq.br [Department of Physics, Universidade Estadual de Ponta Grossa, 84030-900, Ponta Grossa (Brazil); Baptista, M.S. [Institute for Complex Systems and Mathematical Biology, SUPA, University of Aberdeen, Aberdeen (United Kingdom); Viana, R.L. [Department of Physics, Universidade Federal do Parana, 81531-990, Curitiba, Parana (Brazil)

    2013-04-01

    There is a broad area of research devoted to ensuring that information is transmitted securely. Within this scope, chaos-based cryptography takes a prominent role due to its nonlinear properties. Using these properties, we propose a secure mechanism for transmitting data that relies on chaotic networks. We use a nonlinear on-off device to cipher the message, and the transfer entropy to retrieve it. We analyze the system's capability for sending messages and obtain expressions for the operating time. We demonstrate the system's efficiency for a wide range of parameters, and we find similarities between our method and reservoir computing.
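
    Transfer entropy, used above to retrieve the message, can be estimated from symbolized time series with a plug-in histogram estimator. A minimal sketch for binary symbols and history length 1 (the paper's actual estimator and symbolization are not specified in the abstract):

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE x->y in bits, for 0/1 series, history 1:
    TE = sum p(y', y, x) * log2[ p(y' | y, x) / p(y' | y) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y_next, y_prev, x_prev), c in triples.items():
        p_joint = c / n
        p_full = c / pairs_yx[(y_prev, x_prev)]
        p_self = pairs_yy[(y_next, y_prev)] / singles[y_prev]
        te += p_joint * np.log2(p_full / p_self)
    return te

x = np.random.randint(0, 2, 10000)
y = np.roll(x, 1)                  # y copies x with one step of delay
print(transfer_entropy(x, y))      # close to 1 bit: x drives y
```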

  12. Computational study of energy transfer in two-dimensional J-aggregates

    International Nuclear Information System (INIS)

    Gallos, Lazaros K.; Argyrakis, Panos; Lobanov, A.; Vitukhnovsky, A.

    2004-01-01

    We perform a computational analysis of the intra- and interband energy transfer in two-dimensional J-aggregates. Each aggregate is represented as a two-dimensional array (LB film or self-assembled film) of two kinds of cyanine dyes. We consider the J-aggregate whose J-band is located at a shorter wavelength to be a donor, and an aggregate or a small impurity with a longer wavelength to be an acceptor. Light absorption in the blue wing of the donor aggregate gives rise to the population of its excitonic states. The depopulation of these states is possible by (a) radiative transfer to the ground state, (b) intraband energy transfer, and (c) interband energy transfer to the acceptor. We study the dependence of the energy transfer on properties such as the energy gap, the diagonal disorder, and the exciton-phonon interaction strength. Experimentally observable parameters, such as the position and form of the luminescence spectrum, and the results of kinetic spectroscopy measurements depend strongly upon the density of states in the excitonic bands, the rates of energy exchange between states, and the oscillator strengths of the luminescent transitions originating from these states.

  13. Research Progress in Mathematical Analysis of Map Projection by Computer Algebra

    Directory of Open Access Journals (Sweden)

    BIAN Shaofeng

    2017-10-01

    Full Text Available Map projection is an important component of modern cartography and involves many intricate mathematical derivations, such as power series expansions of elliptical functions, differentiation of complex and implicit functions, elliptic integrals, and operations on complex numbers. Deriving these results by hand not only consumes much time and energy but is also error-prone, and is sometimes impossible because of the sheer complexity. The research achievements in the mathematical analysis of map projections by computer algebra are systematically reviewed in five areas: symbolic expressions for the forward and inverse solutions of ellipsoidal latitudes, direct transformations between map projections with different distortion properties, expressions of the Gauss projection by complex functions, mathematical analysis of the oblique Mercator projection, and the polar chart projection with its transformations. The main problems that remain to be solved in this research field are analyzed. This review should help promote the development of map projection.
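
    The kind of power-series derivation the paper automates is readily reproduced with a computer algebra system. As a simple stand-in for the heavier ellipsoidal expansions, the isometric (Mercator) latitude of the sphere can be expanded symbolically:

```python
import sympy as sp

phi = sp.symbols('phi')
# Isometric (Mercator) latitude on the sphere: psi = ln tan(pi/4 + phi/2)
psi = sp.log(sp.tan(sp.pi / 4 + phi / 2))
print(sp.series(psi, phi, 0, 8))
# -> phi + phi**3/6 + phi**5/24 + 61*phi**7/5040 + O(phi**8)
```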

  14. Simulation of seagrass bed mapping by satellite images based on the radiative transfer model

    Science.gov (United States)

    Sagawa, Tatsuyuki; Komatsu, Teruhisa

    2015-06-01

    Seagrass and seaweed beds play important roles in coastal marine ecosystems. They are food sources and habitats for many marine organisms, and influence the physical, chemical, and biological environment. They are sensitive to human impacts such as reclamation and pollution. Therefore, their management and preservation are necessary for a healthy coastal environment. Satellite remote sensing is a useful tool for mapping and monitoring seagrass beds. The efficiency of seagrass mapping, seagrass bed classification in particular, has been evaluated by mapping accuracy using an error matrix. However, mapping accuracies are influenced by coastal environments such as seawater transparency, bathymetry, and substrate type. Coastal management requires sufficient accuracy and an understanding of mapping limitations for monitoring coastal habitats including seagrass beds. Previous studies are mainly based on case studies in specific regions and seasons. Extensive data are required to generalise assessments of classification accuracy from case studies, which has proven difficult. This study aims to build a simulator based on a radiative transfer model to produce modelled satellite images and assess the visual detectability of seagrass beds under different transparencies and seagrass coverages, as well as to examine mapping limitations and classification accuracy. Our simulations led to the development of a model of water transparency and the mapping of depth limits and indicated the possibility for seagrass density mapping under certain ideal conditions. The results show that modelling satellite images is useful in evaluating the accuracy of classification and that establishing seagrass bed monitoring by remote sensing is a reliable tool.

  15. Computer mapping as an aid in air-pollution studies: Montreal region study

    Energy Technology Data Exchange (ETDEWEB)

    Granger, J M

    1972-01-01

    Through the use of computer-mapping programs, an operational technique has been designed which allows an almost-instant appraisal of the intensity of atmospheric pollution in an urban region on the basis of epiphytic sensitivity. The epiphytes considered are essentially lichens and mosses growing on trees. This study was applied to the Montreal region, with 349 sampling stations distributed nearly uniformly. Computer graphics of the findings are included in the appendix.

  16. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 2

    Science.gov (United States)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  17. Teaching Topographic Map Skills and Geomorphology Concepts with Google Earth in a One-Computer Classroom

    Science.gov (United States)

    Hsu, Hsiao-Ping; Tsai, Bor-Wen; Chen, Che-Ming

    2018-01-01

    Teaching high-school geomorphological concepts and topographic map reading entails many challenges. This research reports the applicability and effectiveness of Google Earth in teaching topographic map skills and geomorphological concepts, by a single teacher, in a one-computer classroom. Compared to learning via a conventional instructional…

  18. Computer determination of event maps with application to auxiliary supply systems

    International Nuclear Information System (INIS)

    Wredenberg, L.; Billinton, R.

    1975-01-01

    A method of evaluating the reliability of sequential operations in systems containing standby and alternate supply facilities is presented. The method is based upon the use of a digital computer for automatic development of event maps. The technique is illustrated by application to a nuclear power plant auxiliary supply system. (author)

  19. Leveraging anatomical information to improve transfer learning in brain-computer interfaces

    Science.gov (United States)

    Wronkiewicz, Mark; Larson, Eric; Lee, Adrian K. C.

    2015-08-01

    Objective. Brain-computer interfaces (BCIs) represent a technology with the potential to rehabilitate a range of traumatic and degenerative nervous system conditions but require a time-consuming training process to calibrate. An area of BCI research known as transfer learning is aimed at accelerating training by recycling previously recorded training data across sessions or subjects. Training data, however, is typically transferred from one electrode configuration to another without taking individual head anatomy or electrode positioning into account, which may underutilize the recycled data. Approach. We explore transfer learning with the use of source imaging, which estimates neural activity in the cortex. Transferring estimates of cortical activity, in contrast to scalp recordings, provides a way to compensate for variability in electrode positioning and head morphologies across subjects and sessions. Main results. Based on simulated and measured electroencephalography activity, we trained a classifier using data transferred exclusively from other subjects and achieved accuracies that were comparable to or surpassed a benchmark classifier (representative of a real-world BCI). Our results indicate that classification improvements depend on the number of trials transferred and the cortical region of interest. Significance. These findings suggest that cortical source-based transfer learning is a principled method to transfer data that improves BCI classification performance and provides a path to reduce BCI calibration time.

  20. Portable radiation detector and mapping system

    International Nuclear Information System (INIS)

    Hofstetter, K.J.; Hayes, D.W.; Eakle, R.F.

    1995-01-01

    A portable radiation detector and mapping system (RADMAPS) has been developed to detect, locate, and plot nuclear radiation intensities on commercially available digital maps and other images. The field unit records gamma-ray spectra or neutron signals together with positions from a Global Positioning System (GPS) on flash memory cards. The recorded information is then transferred to a laptop computer for spectral data analysis and then georegistered graphically on maps, photographs, etc. RADMAPS integrates several existing technologies to produce a preprogrammable field unit uniquely suited to each survey, as required. The system presently records spectra from a NaI(Tl) gamma-ray detector or an enriched Li-6 doped glass neutron scintillator. Standard Geographic Information System software installed on a laptop, complete with CD-ROM supporting digitally imaged maps, permits the characterization of nuclear material in the field when the presence of such material is not otherwise documented. This paper gives the results of a typical site survey of the Savannah River Site (SRS) using RADMAPS

  1. A Novel UDT-Based Transfer Speed-Up Protocol for Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhijie Han

    2018-01-01

    Full Text Available Fog computing is a distributed computing model that forms the middle layer between the cloud data center and IoT devices/sensors. It provides computing, network, and storage services so that cloud-based services can be closer to IoT devices and sensors. Cloud computing requires a lot of bandwidth, and the bandwidth of the wireless network is limited. In contrast, the amount of bandwidth required for fog computing is much less. In this paper, we propose an improved protocol, the Peer Assistant UDT-Based Data Transfer Protocol (PaUDT), applied to IoT-Cloud computing. Furthermore, we compare the efficiency of UDT's congestion control algorithm with that of Adobe's Secure Real-Time Media Flow Protocol (RTMFP), which is based entirely on UDP at the transport layer. Finally, we build an evaluation model of UDT performance in terms of RTT and bit error ratio. The theoretical analysis and experimental results have shown that UDT performs well in IoT-Cloud computing.

  2. Using Computer-Assisted Argumentation Mapping to develop effective argumentation skills in high school advanced placement physics

    Science.gov (United States)

    Heglund, Brian

    Educators recognize the importance of reasoning ability for the development of critical thinking skills, conceptual change, metacognition, and participation in 21st century society. There is a recognized need for students to improve their skills of argumentation; however, argumentation is not explicitly taught outside logic and philosophy, subjects that are not part of the K-12 curriculum. One potential way of supporting the development of argumentation skills in the K-12 context is through incorporating Computer-Assisted Argument Mapping to evaluate arguments. This quasi-experimental study tested the effects of such argument mapping software and was informed by the following two research questions: 1. To what extent does the collaborative use of Computer-Assisted Argumentation Mapping to evaluate competing theories influence the critical thinking skill of argument evaluation, metacognitive awareness, and conceptual knowledge acquisition in high school Advanced Placement physics, compared to the more traditional method of text tables that does not employ Computer-Assisted Argumentation Mapping? 2. What are the student perceptions of the pros and cons of argument evaluation in the high school Advanced Placement physics environment? This study examined changes in critical thinking skills, including argumentation evaluation skills, as well as metacognitive awareness and conceptual knowledge, in two groups: a treatment group using Computer-Assisted Argumentation Mapping to evaluate physics arguments, and a comparison group using text tables to evaluate physics arguments. Quantitative and qualitative methods for collecting and analyzing data were used to answer the research questions. Quantitative data indicated no significant difference between the experimental groups, and qualitative data suggested students perceived pros and cons of argument evaluation in the high school Advanced Placement physics environment, such as a self-reported sense of improvement in argument

  3. On the Transfer of a Number of Concepts of Statistical Radiophysics to the Theory of One-dimensional Point Mappings

    Directory of Open Access Journals (Sweden)

    Agalar M. Agalarov

    2018-01-01

    Full Text Available In this article, the possibility of using a bispectrum in the investigation of the regular and chaotic behaviour of one-dimensional point mappings is discussed. The effectiveness of transferring this concept to nonlinear dynamics is demonstrated with the Feigenbaum mapping as an example. The application of the Kullback-Leibler entropy to the theory of point mappings is also considered. It is shown that this information-like quantity can describe the behaviour of statistical ensembles of one-dimensional mappings, and some general properties of its behaviour are established within this framework. The constructive value of the Kullback-Leibler entropy in the theory of point mappings is shown by its direct calculation for the "saw tooth" mapping with a linear initial probability density. Moreover, for this mapping, the denumerable set of initial probability densities that reach the stationary probability density after a finite number of steps is pointed out.
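
    The direct calculation mentioned for the "saw tooth" (Bernoulli shift) map can be reproduced numerically: iterate an ensemble drawn from a linear initial density and track the Kullback-Leibler entropy relative to the uniform stationary density. The ensemble size and binning below are illustrative.

```python
import numpy as np

def kl_to_uniform(samples, bins=50):
    """KL entropy (nats) of the empirical density w.r.t. the uniform
    stationary density of the saw-tooth map on [0, 1)."""
    p, _ = np.histogram(samples, bins=bins, range=(0.0, 1.0), density=True)
    p = p[p > 0]
    return float(np.sum((p / bins) * np.log(p)))   # q = 1 on [0, 1)

rng = np.random.default_rng(0)
# Linear initial density rho(x) = 2x: sampled via inverse CDF sqrt(u)
x = np.sqrt(rng.random(200_000))
for step in range(8):
    print(step, kl_to_uniform(x))
    x = (2.0 * x) % 1.0            # saw-tooth (Bernoulli shift) map
```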

  4. Functional physiology of the human terminal antrum defined by high-resolution electrical mapping and computational modeling.

    Science.gov (United States)

    Berry, Rachel; Miyagawa, Taimei; Paskaranandavadivel, Niranchan; Du, Peng; Angeli, Timothy R; Trew, Mark L; Windsor, John A; Imai, Yohsuke; O'Grady, Gregory; Cheng, Leo K

    2016-11-01

    High-resolution (HR) mapping has been used to study gastric slow-wave activation; however, the specific characteristics of antral electrophysiology remain poorly defined. This study applied HR mapping and computational modeling to define functional human antral physiology. HR mapping was performed in 10 subjects using flexible electrode arrays (128-192 electrodes; 16-24 cm2) arranged from the pylorus to the mid-corpus. Anatomical registration was by photographs and anatomical landmarks. Slow-wave parameters were computed, and the resultant data were incorporated into a computational fluid dynamics (CFD) model of gastric flow to calculate the impact on gastric mixing. In all subjects, extracellular mapping demonstrated normal aboral slow-wave propagation and a region of increased amplitude and velocity in the prepyloric antrum. On average, the high-velocity region commenced 28 mm proximal to the pylorus, and activation ceased 6 mm from the pylorus. Within this region, velocity increased by 0.2 mm/s per mm of tissue, from a mean of 3.3 ± 0.1 mm/s to 7.5 ± 0.6 mm/s. These results indicate that human terminal antral contraction is controlled by a short region of rapid high-amplitude slow-wave activity. Distal antral wave acceleration plays a major role in antral flow and mixing, increasing particle strain and trituration. Copyright © 2016 the American Physiological Society.
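
    As a consistency check on the reported figures, the velocity gradient and the end-point velocities agree with the measured extent of the high-velocity region:

\[
\frac{(7.5 - 3.3)\ \mathrm{mm/s}}{0.2\ \mathrm{(mm/s)/mm}} = 21\ \mathrm{mm} \approx (28 - 6)\ \mathrm{mm} = 22\ \mathrm{mm}.
\]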

  5. Computer Games versus Maps before Reading Stories: Priming Readers' Spatial Situation Models

    Science.gov (United States)

    Smith, Glenn Gordon; Majchrzak, Dan; Hayes, Shelley; Drobisz, Jack

    2011-01-01

    The current study investigated how computer games and maps compare as preparation for readers to comprehend and retain spatial relations in text narratives. Readers create situation models of five dimensions: spatial, temporal, causal, goal, and protagonist (Zwaan, Langston, & Graesser 1995). Of these five, readers mentally model the spatial…

  6. Fuzzy cluster quantitative computations of component mass transfer in rocks or minerals

    International Nuclear Information System (INIS)

    Liu Dezheng

    2000-01-01

    The author advances a new quantitative method for computing component mass transfer, based on the closure property of component mass percentages in rocks and minerals. Using fuzzy dynamic cluster analysis, calculating the restored closure difference, determining the type of difference, and drawing on relevant diagnostic parameters, the method gradually screens out the truly constant components. The true mass percentages and the mass transfer quantities of the components of metasomatic rocks or minerals are then calculated by applying the constant-component fixed coefficient. This method is called the true constant component fixed method (TCF method)
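
    A sketch of the mass-balance arithmetic involved once a constant (immobile) component has been identified. This follows the standard immobile-component correction for closed (percentage) data and illustrates the idea only; it is not the TCF method's exact formulas.

```python
def mass_transfer(parent, altered, constant="Al2O3"):
    """Mass change of each component per 100 g of parent rock, assuming
    the chosen 'constant' component is immobile during alteration."""
    scale = parent[constant] / altered[constant]   # restores closure
    return {c: altered[c] * scale - parent[c] for c in parent}

parent  = {"SiO2": 60.0, "Al2O3": 15.0, "CaO": 8.0}   # mass %
altered = {"SiO2": 55.0, "Al2O3": 17.0, "CaO": 5.0}
print(mass_transfer(parent, altered))
# positive values = mass gained during alteration, negative = lost
```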

  7. A high speed, selective multi-ADC to computer data transfer interface, for nuclear physics experiments

    International Nuclear Information System (INIS)

    Arctaedius, T.; Ekstroem, R.E.

    1986-08-01

    A link connecting up to fifteen analog-to-digital converters (ADCs) with a computer through a direct memory access interface is described. The interface decides which of the connected ADCs participate in an event and transfers the output data from these to the computer, accompanied by a 2-byte word identifying the participating ADCs. This data format can be recorded on tape without further transformation and is easy to unfold in the off-line analysis. Data transfer is accomplished in less than a few microseconds, which is made possible by the use of high-speed TTL circuits. (authors)
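
    The 2-byte identification word lends itself to a simple bitmask decode on the analysis side. A sketch under the assumption that bit i flags participation of ADC i, followed by that ADC's data words in order (the actual bit layout is not specified in the abstract):

```python
def decode_event(words):
    """words: one event as 16-bit integers; first word = ADC bitmask,
    remaining words = outputs of the participating ADCs, in order."""
    mask, data = words[0], words[1:]
    adcs = [i for i in range(15) if mask & (1 << i)]
    assert len(adcs) == len(data), "mask/data length mismatch"
    return dict(zip(adcs, data))

# Example: ADCs 0, 3 and 7 fired (mask 0b000000010001001 = 0x0089)
print(decode_event([0x0089, 512, 1023, 88]))   # {0: 512, 3: 1023, 7: 88}
```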

  8. 41 CFR 102-36.475 - What is the authority for transfers under “Computers for Learning”?

    Science.gov (United States)

    2010-07-01

    ....475 What is the authority for transfers under “Computers for Learning”? (a) The Stevenson-Wydler... excess education-related federal equipment to educational institutions or nonprofit organizations for... education-related equipment may be transferred directly under established agency procedures, or reported to...

  9. Molecular Computational Investigation of Electron Transfer Kinetics across Cytochrome-Iron Oxide Interfaces

    International Nuclear Information System (INIS)

    Kerisit, Sebastien N.; Rosso, Kevin M.; Dupuis, Michel; Valiev, Marat

    2007-01-01

    The interface between electron transfer proteins such as cytochromes and solid-phase mineral oxides is central to the activity of dissimilatory metal-reducing bacteria. A combination of potential-based molecular dynamics simulations and ab initio electronic structure calculations is used in the framework of Marcus' electron transfer theory to compute elementary electron transfer rates from a well-defined cytochrome model, namely the small tetraheme cytochrome (STC) from Shewanella oneidensis, to surfaces of the iron oxide mineral hematite (α-Fe2O3). Room temperature molecular dynamics simulations show that an isolated STC molecule favors surface attachment via direct contact of hemes I and IV at the poles of the elongated axis, with electron transfer distances as small as 9 Å. The cytochrome remains attached to the mineral surface in the presence of water and shows limited surface diffusion at the interface. Ab initio electronic coupling matrix element (VAB) calculations of configurations excised from the molecular dynamics simulations reveal VAB values ranging from 1 to 20 cm-1, consistent with nonadiabaticity. Using these results, together with experimental data on the redox potential of hematite and hemes in relevant cytochromes and calculations of the reorganization energy from cluster models, we estimate the rate of electron transfer across this model interface to range from 1 to 1000 s-1 for the most exothermic driving force considered in this work, and from 0.01 to 20 s-1 for the most endothermic. This fairly large range of electron transfer rates highlights the sensitivity of the rate to the electronic coupling matrix element, which is in turn dependent on the fluctuations of the heme configuration at the interface. We characterize this dependence using an idealized bis-imidazole heme to compute from first principles the VAB variation due to porphyrin ring orientation, electron transfer distance, and mineral surface termination. The electronic
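
    The rate estimates quoted above follow from the standard nonadiabatic Marcus expression; a sketch with illustrative parameter values in the ranges discussed (not the paper's computed inputs):

```python
import math

HBAR = 1.0546e-34    # J*s
KB   = 1.3807e-23    # J/K
EV   = 1.6022e-19    # J per eV
CM1  = 1.9864e-23    # J per cm^-1

def marcus_rate(v_ab_cm1, lam_ev, dg_ev, temp=300.0):
    """Nonadiabatic Marcus electron transfer rate (s^-1):
    k = (2*pi/hbar) * Vab^2 * (4*pi*lam*kB*T)^(-1/2)
        * exp(-(dG + lam)^2 / (4*lam*kB*T))"""
    v, lam, dg = v_ab_cm1 * CM1, lam_ev * EV, dg_ev * EV
    pref = (2.0 * math.pi / HBAR) * v ** 2 \
           / math.sqrt(4.0 * math.pi * lam * KB * temp)
    return pref * math.exp(-(dg + lam) ** 2 / (4.0 * lam * KB * temp))

# Illustrative values: Vab = 1 cm^-1, lambda = 1.3 eV, dG = +0.1 eV
print(f"{marcus_rate(1.0, 1.3, 0.1):.1e} s^-1")   # on the order of 1e2
```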

  10. Computational simulation of heat transfer in laser melted material flow

    International Nuclear Information System (INIS)

    Shankar, V.; Gnanamuthu, D.

    1986-01-01

    A computational procedure has been developed to study the heat transfer process in laser-melted material flow associated with the surface heat treatment of metallic alloys to improve wear and corrosion resistance. The time-dependent incompressible Navier-Stokes equations are solved, accounting for both convective and conductive heat transfer processes. The convection, induced by surface tension and high surface temperature gradients, sets up a counterrotating vortex flow within the molten pool. This recirculating material flow is responsible for determining the molten pool shape and the associated cooling rates, which affect the solidifying material composition. The numerical method involves an implicit triple-approximate factorization scheme for the energy equation and an explicit treatment for the momentum and continuity equations. An experimental setup using a continuous-wave CO2 laser beam as a heat source was employed to generate data for validation of the computational model. Results in terms of the depth, width, and shape of the molten pool and the heat-affected zone, for various power settings and shapes of the laser and for various travel speeds of the workpiece, compare very well with experimental data. The presence of the surface-tension-induced vortex flow is demonstrated

  11. Cryogenic heat transfer

    CERN Document Server

    Barron, Randall F

    2016-01-01

    Cryogenic Heat Transfer, Second Edition continues to address specific heat transfer problems that occur in the cryogenic temperature range where there are distinct differences from conventional heat transfer problems. This updated version examines the use of computer-aided design in cryogenic engineering and emphasizes commonly used computer programs to address modern cryogenic heat transfer problems. It introduces additional topics in cryogenic heat transfer that include latent heat expressions; lumped-capacity transient heat transfer; thermal stresses; Laplace transform solutions; oscillating flow heat transfer, and computer-aided heat exchanger design. It also includes new examples and homework problems throughout the book, and provides ample references for further study.

  12. CRIME MAPS AND COMPUTER TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Erdal KARAKAŞ

    2004-04-01

    Full Text Available Crime maps show crime density values and the locations where crimes have occurred, which makes it easy to examine the spatial distribution of crime. Crime maps have therefore long been part of crime analysis. In this study, the crime of home burglary in the city of Elazig was mapped with respect to its general areal distribution using a GIS (Geographic Information System). The distribution of the crime was examined with respect to parameters such as month, day, and hour, and related to land use. As a result, it was determined that there are differences in the distribution and concentration of theft depending on land use within the city. The methods and findings of this study enable rapid and accurate analyses for studies of this kind. In addition, interrelating the type of crime with regions or areas will contribute to crime prevention and security in urban areas.

  13. Facial Performance Transfer via Deformable Models and Parametric Correspondence.

    Science.gov (United States)

    Asthana, Akshay; de la Hunty, Miles; Dhall, Abhinav; Goecke, Roland

    2012-09-01

    The issue of transferring facial performance from one person's face to another's has been an area of interest for the movie industry and the computer graphics community for quite some time. In recent years, deformable face models, such as the Active Appearance Model (AAM), have made it possible to track and synthesize faces in real time. Not surprisingly, deformable face model-based approaches for facial performance transfer have gained tremendous interest in the computer vision and graphics community. In this paper, we focus on the problem of real-time facial performance transfer using the AAM framework. We propose a novel approach of learning the mapping between the parameters of two completely independent AAMs, using them to facilitate the facial performance transfer in a more realistic manner than previous approaches. The main advantage of modeling this parametric correspondence is that it allows a "meaningful" transfer of both the nonrigid shape and texture across faces irrespective of the speakers' gender, shape, and size of the faces, and illumination conditions. We explore linear and nonlinear methods for modeling the parametric correspondence between the AAMs and show that the sparse linear regression method performs the best. Moreover, we show the utility of the proposed framework for a cross-language facial performance transfer that is an area of interest for the movie dubbing industry.
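
    The best-performing variant reported above, sparse linear regression between the two AAM parameter spaces, can be sketched as follows; the parameter dimensions, training data, and regularization weight are illustrative.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Paired training data: AAM parameters of the same performance tracked
# on the source actor (X) and on the target actor (Y).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 30))                 # source shape+texture params
Y = 0.1 * X @ rng.normal(size=(30, 30)) + 0.01 * rng.normal(size=(500, 30))

mapper = Lasso(alpha=1e-3).fit(X, Y)           # sparse linear correspondence

def transfer(source_params):
    """Map one tracked source frame's AAM parameters to the target's
    parameter space; the target AAM then synthesizes the face."""
    return mapper.predict(source_params.reshape(1, -1))[0]

frame = rng.normal(size=30)
print(transfer(frame)[:5])
```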

  14. System of Information in Conceptual Maps for students of Computer Engineering.

    Directory of Open Access Journals (Sweden)

    Lydia Rosa Ríos Rodríguez

    2014-06-01

    Full Text Available University students make daily decisions that contribute to their professional formation, yet often they lack sufficient support to make them in the best way. Information systems constitute an important alternative to consider in such cases. This work describes the experience of the department of Computer Engineering at the University of Sancti Spíritus José Martí Pérez (UNISS) in conceiving and building an information system, based on conceptual maps, for students of the Computer Engineering degree in the Artificial Intelligence discipline.

  15. A Novel Imaging Technique (X-Map) to Identify Acute Ischemic Lesions Using Noncontrast Dual-Energy Computed Tomography.

    Science.gov (United States)

    Noguchi, Kyo; Itoh, Toshihide; Naruto, Norihito; Takashima, Shutaro; Tanaka, Kortaro; Kuroda, Satoshi

    2017-01-01

    We evaluated whether X-map, a novel imaging technique, can visualize ischemic lesions within 20 hours after the onset in patients with acute ischemic stroke, using noncontrast dual-energy computed tomography (DECT). Six patients with acute ischemic stroke were included in this study. Noncontrast head DECT scans were acquired with 2 X-ray tubes operated at 80 kV and Sn150 kV between 32 minutes and 20 hours after the onset. Using these DECT scans, the X-map was reconstructed based on 3-material decomposition and compared with a simulated standard (120 kV) computed tomography (CT) and diffusion-weighted imaging (DWI). The X-map showed more sensitivity to identify the lesions as an area of lower attenuation value than a simulated standard CT in all 6 patients. The lesions on the X-map correlated well with those on DWI. In 3 of 6 patients, the X-map detected a transient decrease in the attenuation value in the peri-infarct area within 1 day after the onset. The X-map is a powerful tool to supplement a simulated standard CT and characterize acute ischemic lesions. However, the X-map cannot replace a simulated standard CT to diagnose acute cerebral infarction. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
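
    The abstract does not give the X-map's decomposition details, but three-material decomposition from two energy channels is conventionally closed with a volume-conservation constraint, giving a 3x3 linear system per voxel. A sketch with placeholder attenuation coefficients (not calibrated clinical values):

```python
import numpy as np

# Rows: measurements at 80 kV and Sn150 kV plus the volume constraint;
# columns: three basis materials. Coefficients are placeholders, not
# calibrated clinical values.
A = np.array([[0.225, 0.215, 0.190],    # mu at 80 kV
              [0.180, 0.175, 0.160],    # mu at Sn150 kV
              [1.000, 1.000, 1.000]])   # volume fractions sum to 1

def decompose(mu_80, mu_150):
    """Per-voxel volume fractions from the two energy measurements."""
    return np.linalg.solve(A, np.array([mu_80, mu_150, 1.0]))

print(decompose(0.215, 0.1745))   # -> approximately [0.5, 0.3, 0.2]
```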

  16. Spin-transfer torque magnetoresistive random-access memory technologies for normally off computing (invited)

    International Nuclear Information System (INIS)

    Ando, K.; Yuasa, S.; Fujita, S.; Ito, J.; Yoda, H.; Suzuki, Y.; Nakatani, Y.; Miyazaki, T.

    2014-01-01

    Most parts of present computer systems are made of volatile devices, and the power supplied to them to avoid information loss causes huge energy losses. We can eliminate this meaningless energy loss by utilizing the non-volatile function of advanced spin-transfer torque magnetoresistive random-access memory (STT-MRAM) technology and create a new type of computer: normally off computers. Critical tasks in achieving normally off computers are the implementation of STT-MRAM technologies in the main memory and low-level cache memories. STT-MRAM technology for main memory applications has been successfully developed using perpendicular STT-MRAMs, and faster STT-MRAM technologies for cache memory applications are now being developed. The present status of STT-MRAMs and the challenges that remain for normally off computers are discussed

  17. Evaluation of piping heat transfer, piping flow regimes, and steam generator heat transfer for the Semiscale Mod-1 isothermal tests

    International Nuclear Information System (INIS)

    French, R.T.

    1975-08-01

    Selected experimental data pertinent to piping heat transfer, transient fluid flow regimes, and steam generator heat transfer obtained during the Semiscale Mod-1 isothermal blowdown test series (Test Series 1) are analyzed. The tests in this first test series were designed to provide counterparts to the LOFT nonnuclear experiments. The data from the Semiscale Mod-1 intact and broken loop piping are evaluated to determine the surface heat flux and average heat transfer coefficients effective during the blowdown transient and compared with well known heat transfer correlations used in the RELAP4 computer program. Flow regimes in horizontal pipe sections are calculated and compared with data obtained from horizontal and vertical densitometers and with an existing steady state flow map. Effects of steam generator heat transfer are evaluated quantitatively and qualitatively. The Semiscale Mod-1 data and the analysis presented in this report are valuable for evaluating the adequacy and improving the predictive capability of analytical models developed to predict system response to piping heat transfer, piping flow regimes, and steam generator heat transfer during a postulated loss-of-coolant accident (LOCA) in a pressurized water reactor (PWR). 16 references. (auth)

  18. Computer-based learning in neuroanatomy: A longitudinal study of learning, transfer, and retention

    Science.gov (United States)

    Chariker, Julia H.

    A longitudinal experiment was conducted to explore computer-based learning of neuroanatomy. Using a realistic 3D graphical model of neuroanatomy, and sections derived from the model, exploratory graphical tools were integrated into interactive computer programs so as to allow adaptive exploration. 72 participants learned either sectional anatomy alone or learned whole anatomy followed by sectional anatomy. Sectional anatomy was explored either in perceptually continuous animation or discretely, as in the use of an anatomical atlas. Learning was measured longitudinally to a high performance criterion. After learning, transfer to biomedical images and long-term retention was tested. Learning whole anatomy prior to learning sectional anatomy led to a more efficient learning experience. Learners demonstrated high levels of transfer from whole anatomy to sectional anatomy and from sectional anatomy to complex biomedical images. All learning groups demonstrated high levels of retention at 2-3 weeks.

  19. Modeling of Heat Transfer and Ablation of Refractory Material Due to Rocket Plume Impingement

    Science.gov (United States)

    Harris, Michael F.; Vu, Bruce T.

    2012-01-01

    CR Tech's Thermal Desktop-SINDA/FLUINT software was used in the thermal analysis of a flame deflector design for Launch Complex 39B at Kennedy Space Center, Florida. The analysis of the flame deflector takes into account heat transfer due to plume impingement from the vehicles expected to be launched at KSC. The heat flux from the plume was computed using computational fluid dynamics provided by Ames Research Center in Moffett Field, California. The results from the CFD solutions were mapped onto a 3-D Thermal Desktop model of the flame deflector using the boundary condition mapping capabilities in Thermal Desktop. The ablation subroutine in SINDA/FLUINT was then used to model the ablation of the refractory material.
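
    The boundary-condition mapping step, interpolating CFD surface heat flux onto the thermal model's mesh, can be sketched with scattered-data interpolation; the coordinates and flux field below are stand-ins for the actual CFD solution.

```python
import numpy as np
from scipy.interpolate import griddata

# CFD solution: scattered surface points (x, y) with heat flux q (W/m^2)
rng = np.random.default_rng(1)
cfd_xy = rng.random((2000, 2)) * 10.0                 # surface coordinates
cfd_q = 1e6 * np.exp(-((cfd_xy - 5.0) ** 2).sum(axis=1))  # plume footprint

# Thermal-model face centroids where boundary conditions are needed
faces = rng.random((300, 2)) * 10.0

# Linear interpolation with a nearest-neighbor fallback near hull edges
q_faces = griddata(cfd_xy, cfd_q, faces, method="linear")
q_nn = griddata(cfd_xy, cfd_q, faces, method="nearest")
q_faces = np.where(np.isnan(q_faces), q_nn, q_faces)
```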

  20. Global Seabed Materials and Habitats Mapped: The Computational Methods

    Science.gov (United States)

    Jenkins, C. J.

    2016-02-01

    What the seabed is made of has proven difficult to map on the scale of whole ocean basins. Direct sampling and observation can be augmented with proxy-parameter methods such as acoustics; both avenues are essential to obtain enough detail and coverage, and also to validate the mapping methods. We focus on the direct observations, such as samples, photos and video, probes, diver and submersible reports, and surveyed features. These are often in word-descriptive form: over 85% of the records for site materials are in this form, whether as sample/view descriptions or classifications, or as described parameters such as consolidation, color, odor, structures, and components. Descriptions are absolutely necessary for unusual materials and for processes; in other words, for research. The dbSEABED project not only holds the largest collection of seafloor materials data worldwide, but also uses advanced computational methods to obtain the best possible coverage and detail. Among those techniques are linguistic text analysis (e.g., Natural Language Processing, NLP), fuzzy set theory (FST), and machine learning (ML, e.g., Random Forest). These techniques allow efficient and accurate import of huge datasets, thereby optimizing the data that exist. They merge quantitative and qualitative types of data for rich parameter sets and extrapolate where the data are sparse for the best map production. The dbSEABED data resources are now very widely used worldwide in oceanographic research, environmental management, the geosciences, engineering, and surveying.

  1. Application of the TEMPEST computer code to canister-filling heat transfer problems

    International Nuclear Information System (INIS)

    Farnsworth, R.K.; Faletti, D.W.; Budden, M.J.

    1988-03-01

    Pacific Northwest Laboratory (PNL) researchers used the TEMPEST computer code to simulate thermal cooldown behavior of nuclear waste glass after it was poured into steel canisters for long-term storage. The objective of this work was to determine the accuracy and applicability of the TEMPEST code when used to compute canister thermal histories. First, experimental data were obtained to provide the basis for comparing TEMPEST-generated predictions. Five canisters were instrumented with appropriately located radial and axial thermocouples. The canister were filled using the pilot-scale ceramic melter (PSCM) at PNL. Each canister was filled in either a continous or a batch filling mode. One of the canisters was also filled within a turntable simulant (a group of cylindrical shells with heat transfer resistances similar to those in an actual melter turntable). This was necessary to provide a basis for assessing the ability of the TEMPEST code to also model the transient cooling of canisters in a melter turntable. The continous-fill model, Version M, was found to predict temperatures with more accuracy. The turntable simulant experiment demonstrated that TEMPEST can adequately model the asymmetric temperature field caused by the turntable geometry. Further, TEMPEST can acceptably predict the canister cooling history within a turntable, despite code limitations in computing simultaneous radiation and convection heat transfer between shells, along with uncertainty in stainless-steel surface emissivities. Based on the successful performance of TEMPEST Version M, development was initiated to incorporate 1) full viscous glass convection, 2) a dynamically adaptive grid that automatically follows the glass/air interface throughout the transient, and 3) a full enclosure radiation model to allow radiation heat transfer to non-nearest neighbor cells. 5 refs., 47 figs., 17 tabs

  2. The Data Transfer Kit: A geometric rendezvous-based tool for multiphysics data transfer

    International Nuclear Information System (INIS)

    Slattery, S. R.; Wilson, P. P. H.; Pawlowski, R. P.

    2013-01-01

    The Data Transfer Kit (DTK) is a software library designed to provide parallel data transfer services for arbitrary physics components based on the concept of geometric rendezvous. The rendezvous algorithm provides a means to geometrically correlate two geometric domains that may be arbitrarily decomposed in a parallel simulation. By repartitioning both domains such that they have the same geometric domain on each parallel process, efficient and load balanced search operations and data transfer can be performed at a desirable algorithmic time complexity with low communication overhead relative to other types of mapping algorithms. With the increased development efforts in multiphysics simulation and other multiple mesh and geometry problems, generating parallel topology maps for transferring fields and other data between geometric domains is a common operation. The algorithms used to generate parallel topology maps based on the concept of geometric rendezvous as implemented in DTK are described with an example using a conjugate heat transfer calculation and thermal coupling with a neutronics code. In addition, we provide the results of initial scaling studies performed on the Jaguar Cray XK6 system at Oak Ridge National Laboratory for a worst-case-scenario problem in terms of algorithmic complexity that shows good scaling on O(1 x 10^4) cores for topology map generation and excellent scaling on O(1 x 10^5) cores for the data transfer operation with meshes of O(1 x 10^9) elements. (authors)
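
    A single-process sketch of the geometric-rendezvous idea: entities from two independently decomposed domains are re-binned by a shared geometric rule (here, a uniform grid), so that source/target correlation becomes a purely local search inside each bin. DTK itself performs this repartitioning in parallel over MPI; the binning rule and point data below are illustrative.

```python
# Rendezvous-style binning: correlate two point sets via shared spatial bins.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)
src = rng.uniform(0, 1, (1000, 2))   # e.g. heat-transfer mesh nodes
tgt = rng.uniform(0, 1, (800, 2))    # e.g. neutronics mesh nodes
NBINS = 8

def bin_of(p):
    i, j = np.minimum((p * NBINS).astype(int), NBINS - 1)
    return i * NBINS + j

buckets = defaultdict(lambda: ([], []))
for k, p in enumerate(src):
    buckets[bin_of(p)][0].append(k)
for k, p in enumerate(tgt):
    buckets[bin_of(p)][1].append(k)

# Inside each rendezvous bin, map every target point to its nearest source.
topology_map = {}
for s_ids, t_ids in buckets.values():
    if not s_ids or not t_ids:
        continue
    d = np.linalg.norm(tgt[t_ids, None, :] - src[None, s_ids, :], axis=2)
    for row, t in enumerate(t_ids):
        topology_map[t] = s_ids[int(np.argmin(d[row]))]

print(f"mapped {len(topology_map)} of {len(tgt)} target points")
```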

  3. The Data Transfer Kit: A geometric rendezvous-based tool for multiphysics data transfer

    Energy Technology Data Exchange (ETDEWEB)

    Slattery, S. R.; Wilson, P. P. H. [Department of Engineering Physics, University of Wisconsin - Madison, 1500 Engineering Dr., Madison, WI 53706 (United States); Pawlowski, R. P. [Sandia National Laboratories, P.O. Box 5800, Albuquerque, NM 87185 (United States)

    2013-07-01

    The Data Transfer Kit (DTK) is a software library designed to provide parallel data transfer services for arbitrary physics components based on the concept of geometric rendezvous. The rendezvous algorithm provides a means to geometrically correlate two geometric domains that may be arbitrarily decomposed in a parallel simulation. By repartitioning both domains such that they have the same geometric domain on each parallel process, efficient and load balanced search operations and data transfer can be performed at a desirable algorithmic time complexity with low communication overhead relative to other types of mapping algorithms. With the increased development efforts in multiphysics simulation and other multiple mesh and geometry problems, generating parallel topology maps for transferring fields and other data between geometric domains is a common operation. The algorithms used to generate parallel topology maps based on the concept of geometric rendezvous as implemented in DTK are described with an example using a conjugate heat transfer calculation and thermal coupling with a neutronics code. In addition, we provide the results of initial scaling studies performed on the Jaguar Cray XK6 system at Oak Ridge National Laboratory for a worst-case-scenario problem in terms of algorithmic complexity that shows good scaling on O(1 x 10^4) cores for topology map generation and excellent scaling on O(1 x 10^5) cores for the data transfer operation with meshes of O(1 x 10^9) elements. (authors)

  4. Development of optimized segmentation map in dual energy computed tomography

    Science.gov (United States)

    Yamakawa, Keisuke; Ueki, Hironori

    2012-03-01

    Dual energy computed tomography (DECT) has been widely used in clinical practice and has been particularly effective for tissue diagnosis. In DECT the difference between the two attenuation coefficients acquired at two X-ray energies enables tissue segmentation. One problem in conventional DECT is that the segmentation deteriorates in some cases, such as bone removal. This is due to two reasons. Firstly, the segmentation map is optimized without considering the X-ray condition (tube voltage and current). If we consider the tube voltage, it is possible to create an optimized map, but unfortunately we cannot consider the tube current. Secondly, the X-ray condition is not optimized. The condition can be set empirically, but an empirically chosen condition is not necessarily optimal. To solve these problems, we have developed methods for optimizing the map (Method-1) and the condition (Method-2). In Method-1, the map is optimized to minimize segmentation errors. The distribution of the attenuation coefficient is modeled by considering the tube current. In Method-2, the optimized condition is chosen to minimize segmentation errors over tube voltage-current combinations while keeping the total exposure constant. We evaluated the effectiveness of Method-1 by performing a phantom experiment under a fixed condition, and of Method-2 by performing a phantom experiment under different combinations calculated from the constant total exposure. When Method-1 was combined with Method-2, the segmentation error was reduced from 37.8% to 13.5%. These results demonstrate that our developed methods can achieve highly accurate segmentation while keeping the total exposure constant.
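
    A hedged sketch of the idea behind Method-1: model each material as a Gaussian in (low-kV, high-kV) attenuation space, with a noise level that grows as tube current drops, and classify by maximum likelihood. The class means and the noise scaling are illustrative numbers, not the paper's calibration.

```python
# Two-energy material classification with current-dependent noise (sketch).
import numpy as np

MEANS = {"bone":   np.array([1500.0, 1100.0]),   # HU at (80 kV, 140 kV)
         "iodine": np.array([ 400.0,  200.0])}

def sigma(tube_current_mas, k=300.0):
    return k / np.sqrt(tube_current_mas)          # quantum-noise scaling

def classify(pixel, mas):
    s = sigma(mas)
    # Isotropic Gaussian likelihood reduces to nearest mean; s matters once
    # priors or class-specific covariances enter, and for error prediction.
    return min(MEANS, key=lambda c: np.sum((pixel - MEANS[c])**2) / s**2)

rng = np.random.default_rng(2)
mas, errors = 100.0, 0
for _ in range(10000):
    true = rng.choice(list(MEANS))
    pixel = MEANS[true] + rng.normal(0, sigma(mas), size=2)
    errors += classify(pixel, mas) != true
print(f"empirical segmentation error at {mas:.0f} mAs: {errors / 10000:.3%}")
```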

  5. Transferability of molecular markers from major legumes to Lathyrus spp. for their application in mapping and diversity studies.

    Science.gov (United States)

    Almeida, Nuno Felipe; Trindade Leitão, Susana; Caminero, Constantino; Torres, Ana Maria; Rubiales, Diego; Vaz Patto, Maria Carlota

    2014-01-01

    Lathyrus cicera L. (chickling pea) and L. sativus L. (grass pea) have great potential among grain legumes due to their adaptability to inauspicious environments, high protein content and resistance to serious diseases. Nevertheless, due to their past underuse, further work is required to exploit this potential and to capitalise on the advances in molecular biology that enable improved Lathyrus spp. breeding programmes. In this study we evaluated the transferability to Lathyrus spp. of molecular markers developed for closely related legume species (Medicago truncatula, pea, lentil, faba bean and lupin) and tested the application of those new molecular tools in Lathyrus mapping and diversity studies. Genomic and expressed sequence tag microsatellite, intron-targeted amplified polymorphic, resistance gene analogue and defence-related gene markers were tested. In total 128 (27.7%) and 132 (28.6%) molecular markers were successfully cross-amplified, respectively in L. cicera and L. sativus. The efficiency of transferability was 5% for genomic microsatellites and 55% for gene-based markers. For L. cicera, three cleaved amplified polymorphic sequence markers and one derived cleaved amplified polymorphic sequence marker based on the cross-amplified markers were also developed. Nine of those molecular markers were suitable for mapping in a L. cicera recombinant inbred line population. Of the 17 molecular markers tested for diversity analysis, six (35%) in L. cicera and seven (41%) in L. sativus were polymorphic and discriminated well among all the L. sativus accessions. Additionally, L. cicera accessions were clearly distinguished from L. sativus accessions. This work revealed a high number of transferable molecular markers to be used in current genomic studies in Lathyrus spp. Although their usefulness was greater in diversity studies, they represent the first steps for future comparative mapping involving these species.

  6. Improving learning with science and social studies text using computer-based concept maps for students with disabilities.

    Science.gov (United States)

    Ciullo, Stephen; Falcomata, Terry S; Pfannenstiel, Kathleen; Billingsley, Glenna

    2015-01-01

    Concept maps have been used to help students with learning disabilities (LD) improve literacy skills and content learning, predominantly in secondary school. However, despite increased access to classroom technology, no previous studies have examined the efficacy of computer-based concept maps to improve learning from informational text for students with LD in elementary school. In this study, we used a concurrent delayed multiple probe design to evaluate the interactive use of computer-based concept maps on content acquisition with science and social studies texts for Hispanic students with LD in Grades 4 and 5. Findings from this study suggest that students improved content knowledge during intervention relative to a traditional instruction baseline condition. Learning outcomes and social validity information are considered to inform recommendations for future research and the feasibility of classroom implementation. © The Author(s) 2014.

  7. A computational model for the carbon transfer in stainless steel sodium systems

    International Nuclear Information System (INIS)

    Casadio, S.; Scibona, G.

    1980-01-01

    A method is proposed for computing the carbon transfer in type 316, 304 and 321 stainless steels in a sodium environment as a function of temperature, exposure time and carbon concentration in the sodium. The method is based on the criteria developed at ANL, introducing some simplifications, and also takes into account the correlations obtained at WARD. Calculated carbon profiles are compared both with experimental data and with results available from other computer methods. The limits of quantitative prediction of carburization or decarburization of stainless steel exposed to a specific environment are discussed. (author)

  8. Influences of buoyancy and thermal boundary conditions on heat transfer with naturally-induced flow

    International Nuclear Information System (INIS)

    Jackson, J.D.; Li, J.

    2002-01-01

    A fundamental study is reported of heat transfer from a vertical heated tube to air which is induced naturally upwards through it by the action of buoyancy. Measurements of local heat transfer coefficient were made using a specially designed computer-controlled power supply and measurement system for conditions of uniform wall temperature and uniform wall heat flux. The effectiveness of heat transfer proved to be much lower than for conditions of forced convection. It was found that the results could be correlated satisfactorily when presented in terms of dimensionless parameters similar to those used for free convection heat transfer from vertical surfaces, provided that the heat transfer coefficients were evaluated using the local fluid bulk temperature calculated utilising the measured values of flow rate induced through the system. Additional experiments were performed with pumped flow. These covered the entire mixed convection region. It was found that the data for naturally-induced flow mapped onto the pumped flow data when presented in terms of Nusselt number ratio (mixed to forced) and buoyancy parameter. Computational simulations of the experiments were performed using an advanced computer code which incorporated a buoyancy-influenced, variable property, developing wall shear flow formulation and a low Reynolds number k-ε turbulence model. These reproduced the observed behaviour quite well. (author)
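
    A sketch of presenting mixed-convection data as a Nusselt-number ratio against a buoyancy parameter, as described above. The parameter form Bo = Gr / (Re^3.425 Pr^0.8) is one commonly used (Jackson-type) choice and the Dittus-Boelter correlation stands in for the forced-convection baseline; treat both as illustrative, not the paper's exact correlation.

```python
# Buoyancy parameter and forced-convection Nusselt baseline (sketch).
import numpy as np

def buoyancy_parameter(Gr, Re, Pr):
    return Gr / (Re**3.425 * Pr**0.8)     # Jackson-type form (assumed here)

def nusselt_forced(Re, Pr):
    return 0.023 * Re**0.8 * Pr**0.4      # Dittus-Boelter, heating

# Example: air flowing upward in a heated vertical tube.
Re, Pr, Gr = 5000.0, 0.7, 2.0e8
Bo = buoyancy_parameter(Gr, Re, Pr)
Nu_f = nusselt_forced(Re, Pr)
print(f"Bo = {Bo:.3e}, forced-convection Nu = {Nu_f:.1f}")
# Measured mixed-convection Nu would then be plotted as Nu/Nu_f versus Bo.
```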

  9. An explorative study of the technology transfer coach as a preliminary for the design of a computer aid

    OpenAIRE

    Jönsson, Oscar

    2014-01-01

    The university technology transfer coach has an important role in supporting the commercialization of research results. This thesis has studied the technology transfer coach and their needs in the coaching process. The goal has been to investigate information needs of the technology transfer coach as a preliminary for the design of computer aids. Using a grounded theory approach, we interviewed 17 coaches working in the Swedish technology transfer environment. Extracted quotes from interviews ...

  10. Development of highly accurate approximate scheme for computing the charge transfer integral

    Energy Technology Data Exchange (ETDEWEB)

    Pershin, Anton; Szalay, Péter G. [Laboratory for Theoretical Chemistry, Institute of Chemistry, Eötvös Loránd University, P.O. Box 32, H-1518 Budapest (Hungary)

    2015-08-21

    The charge transfer integral is a key parameter required by various theoretical models to describe charge transport properties, e.g., in organic semiconductors. The accuracy of this important property depends on several factors, which include the level of electronic structure theory and internal simplifications of the applied formalism. The goal of this paper is to identify the performance of various approximate approaches of the latter category, while using the high-level equation-of-motion coupled cluster theory for the electronic structure. The calculations have been performed on the ethylene dimer as one of the simplest model systems. By studying different spatial perturbations, it was shown that while both the energy-split-in-dimer and fragment-charge-difference methods are equivalent to the exact formulation for symmetrical displacements, they are less efficient when describing the transfer integral along an asymmetric alteration coordinate. Since the “exact” scheme was found computationally expensive, we examine the possibility of obtaining the asymmetric fluctuation of the transfer integral by a Taylor expansion along the coordinate space. By exploring the efficiency of this novel approach, we show that the Taylor expansion scheme represents an attractive alternative to the “exact” calculations due to a substantial reduction of computational costs, when a considerably large region of the potential energy surface is of interest. Moreover, we show that the Taylor expansion scheme, irrespective of the dimer symmetry, is very accurate for the entire range of geometry fluctuations that cover the space the molecule accesses at room temperature.
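
    A minimal sketch of the Taylor-expansion idea: fit low-order derivatives of the transfer integral J(q) at a reference geometry from a few "exact" evaluations, then evaluate the polynomial cheaply over a wide range of the fluctuation coordinate q. The function J_exact below is an analytic stand-in, not an EOM-CC calculation.

```python
# Taylor expansion of a transfer integral along a fluctuation coordinate.
import numpy as np

def J_exact(q):                       # placeholder for the expensive scheme
    return 0.10 * np.exp(-0.5 * q) * np.cos(0.3 * q)

h, q0 = 0.05, 0.0
# Central finite differences for J', J'', J''' at q0.
J0 = J_exact(q0)
J1 = (J_exact(q0 + h) - J_exact(q0 - h)) / (2 * h)
J2 = (J_exact(q0 + h) - 2 * J0 + J_exact(q0 - h)) / h**2
J3 = (J_exact(q0 + 2*h) - 2*J_exact(q0 + h)
      + 2*J_exact(q0 - h) - J_exact(q0 - 2*h)) / (2 * h**3)

def J_taylor(q):
    d = q - q0
    return J0 + J1*d + J2*d**2/2 + J3*d**3/6

q = np.linspace(-1.0, 1.0, 5)
print(np.max(np.abs(J_taylor(q) - J_exact(q))))   # expansion error
```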

  11. South American regional ionospheric maps computed by GESA: A pilot service in the framework of SIRGAS

    Science.gov (United States)

    Brunini, C.; Meza, A.; Gende, M.; Azpilicueta, F.

    2008-08-01

    SIRGAS (Geocentric Reference Frame for the Americas) is an international enterprise of the geodetic community that aims to realize the Terrestrial Reference Frame in the countries of the Americas. In order to fulfill this commitment, SIRGAS manages a network of continuously operating GNSS receivers totalling around one hundred sites in the Caribbean, Central, and South American region. Although the network was not planned for ionospheric studies, its potential to be used for such a purpose was recently recognized and SIRGAS started a pilot experiment devoted to establishing a regular service for computing and releasing regional vertical TEC (vTEC) maps based on GNSS data. Since July 2005, the GESA (Geodesia Espacial y Aeronomía) laboratory belonging to the Facultad de Ciencias Astronómicas y Geofísicas of the Universidad Nacional de La Plata has computed hourly maps of vertical Total Electron Content (vTEC) in the framework of the SIRGAS pilot experiment. These maps exploit all the GNSS data available in the South American region and are computed with the LPIM (La Plata Ionospheric Model). LPIM implements a de-biasing procedure that improves data calibration in relation to other procedures commonly used for such purposes. After calibration, slant TEC measurements are converted to vertical and mapped using local time and modip latitude. The use of modip latitude smooths the spatial variability of vTEC, especially in the South American low-latitude region, and hence allows for better vTEC interpolation. This contribution summarizes the results obtained by GESA in the framework of the SIRGAS pilot experiment.
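
    A sketch of the standard single-layer conversion from calibrated slant TEC to vertical TEC that underlies map production of this kind. The thin-shell height and geometry are the usual approximation; LPIM's actual calibration and mapping are more involved.

```python
# Thin-shell slant-to-vertical TEC conversion (standard approximation).
import numpy as np

R_E = 6371.0      # Earth radius, km
H_SHELL = 450.0   # assumed ionospheric thin-shell height, km

def vtec_from_stec(stec, elevation_deg):
    """vTEC = sTEC * cos(z'), with z' the zenith angle at the ionospheric
    pierce point of the single-layer model."""
    z = np.radians(90.0 - elevation_deg)           # zenith angle at receiver
    sin_zp = R_E / (R_E + H_SHELL) * np.sin(z)     # shell geometry
    return stec * np.sqrt(1.0 - sin_zp**2)         # cos(z') factor

print(vtec_from_stec(stec=40.0, elevation_deg=30.0))  # in TEC units
```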

  12. Quantum maps from transfer operators

    International Nuclear Information System (INIS)

    Bogomolny, E.B.; Carioli, M.

    1992-09-01

    The Selberg zeta function ζ_S(s) yields an exact relationship between the periodic orbits of a fully chaotic Hamiltonian system (the geodesic flow on surfaces of constant negative curvature) and the corresponding quantum system (the spectrum of the Laplace-Beltrami operator on the same manifold). It was found that for certain manifolds, ζ_S(s) can be exactly rewritten as the Fredholm-Grothendieck determinant det(1 − T_s), where T_s is a generalization of the Ruelle-Perron-Frobenius transfer operator. An alternative derivation of this result is given, yielding a method to find not only the spectrum but also the eigenfunctions of the Laplace-Beltrami operator in terms of eigenfunctions of T_s. Various properties of the transfer operator are investigated both analytically and numerically for several systems. (author) 30 refs.; 16 figs.; 2 tabs
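
    A generic numerical illustration of a transfer (Perron-Frobenius) operator, using Ulam's method for the Gauss map G(x) = 1/x mod 1: the leading eigenvalue is 1 and its eigenvector approximates the invariant Gauss density 1/((1+x) ln 2). This is a standard textbook example, not the geodesic-flow operator T_s of the record above.

```python
# Ulam's method: discretize the transfer operator of the Gauss map.
import numpy as np

N = 400                                    # number of Ulam cells
edges = np.linspace(0.0, 1.0, N + 1)
P = np.zeros((N, N))
samples = 200                              # sample points per cell
for i in range(N):
    x = np.linspace(edges[i], edges[i + 1], samples + 2)[1:-1]
    gx = (1.0 / x) % 1.0                   # Gauss map
    j = np.minimum((gx * N).astype(int), N - 1)
    np.add.at(P[i], j, 1.0 / samples)      # row-stochastic transition matrix

evals, evecs = np.linalg.eig(P.T)          # left eigenvectors of P
k = np.argmax(evals.real)
density = np.abs(evecs[:, k].real)
density /= density.sum() / N               # normalize as a density
mid = 0.5 * (edges[:-1] + edges[1:])
exact = 1.0 / ((1.0 + mid) * np.log(2.0))  # Gauss measure density
print(f"leading eigenvalue = {evals[k].real:.4f}, "
      f"max density error = {np.max(np.abs(density - exact)):.3f}")
```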

  13. The feasibility of colorectal cancer detection using dual-energy computed tomography with iodine mapping

    International Nuclear Information System (INIS)

    Boellaard, T.N.; Henneman, O.D.F.; Streekstra, G.J.; Venema, H.W.; Nio, C.Y.; Dorth-Rombouts, M.C. van; Stoker, J.

    2013-01-01

    Aim: To assess the feasibility of colorectal cancer detection using dual-energy computed tomography with iodine mapping and without bowel preparation or bowel distension. Materials and methods: Consecutive patients scheduled for preoperative staging computed tomography (CT) because of diagnosed or high suspicion for colorectal cancer were prospectively included in the study. A single contrast-enhanced abdominal CT acquisition using dual-source mode (100 kV/140 kV) was performed without bowel preparation. Weighted average 120 kV images and iodine maps were created with post-processing. Two observers performed a blinded read for colorectal lesions after being trained on three colorectal cancer patients. One observer performed an unblinded read for lesion detectability and placed a region of interest (ROI) within each lesion. Results: In total 21 patients were included and 18 had a colorectal cancer at the time of the CT acquisition. Median cancer size was 43 mm [interquartile range (IQR) 27–60 mm] and all 18 colorectal cancers were visible on the 120 kV images and iodine map during the unblinded read. During the blinded read, observers found 90% (27/30) of the cancers with 120 kV images only and 96.7% (29/30) after viewing the iodine map in addition (p = 0.5). Median enhancement of colorectal cancers was 29.9 HU (IQR 23.1–34.6). The largest benign lesions (70 and 25 mm) were visible on the 120 kV images and iodine map, whereas four smaller benign lesions (7–15 mm) were not. Conclusion: Colorectal cancers are visible on the contrast-enhanced dual-energy CT without bowel preparation or insufflation. Because of the patient-friendly nature of this approach, further studies should explore its use for colorectal cancer detection in frail and elderly patients

  14. cudaMap: a GPU accelerated program for gene expression connectivity mapping.

    Science.gov (United States)

    McArt, Darragh G; Bankhead, Peter; Dunne, Philip D; Salto-Tellez, Manuel; Hamilton, Peter; Zhang, Shu-Dong

    2013-10-11

    Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2 h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU-assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy-duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.
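
    A vectorized sketch of the core connectivity-mapping computation: score a query gene signature against many reference expression profiles at once. The signed-rank connection score below follows the general sscMap idea, but the exact normalization in sscMap/cudaMap differs; all data are synthetic.

```python
# Signature-vs-reference connection scores via one matrix product.
import numpy as np

rng = np.random.default_rng(3)
n_genes, n_refs, n_sig = 2000, 1000, 50
ref = rng.normal(size=(n_refs, n_genes))          # reference profiles

# Query signature: gene indices with up (+1) / down (-1) status.
sig_genes = rng.choice(n_genes, size=n_sig, replace=False)
sig_sign = rng.choice([-1.0, 1.0], size=n_sig)

# Rank-transform each reference profile; centre ranks around zero.
ranks = ref.argsort(axis=1).argsort(axis=1) - (n_genes - 1) / 2.0

raw = ranks[:, sig_genes] @ sig_sign              # all references at once
max_score = np.sort(np.abs(ranks[0]))[::-1][:n_sig].sum()  # best possible
scores = raw / max_score                          # scores in [-1, 1]
print("strongest connections:", np.argsort(scores)[-3:])
```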

  15. A Novel Image Encryption Scheme Based on Clifford Attractor and Noisy Logistic Map for Secure Transferring Images in Navy

    Directory of Open Access Journals (Sweden)

    Mohadeseh Kanafchian

    2017-04-01

    In this paper, we first give a brief introduction to chaotic image encryption and then investigate some important properties and behaviour of the logistic map. For the logistic map, an aperiodic, random-like trajectory cannot be obtained for some choices of initial condition. Therefore, a noisy logistic map with an additive system noise is introduced. The proposed scheme is based on the extended map of the Clifford strange attractor, where each dimension has a specific role in the encryption process: two dimensions are used for pixel permutation and the third dimension is used for pixel diffusion. In order to strengthen the Clifford encryption system, we enlarge the key space by using the noisy logistic map, and a novel encryption scheme based on the Clifford attractor and the noisy logistic map for secure image transfer is proposed. The algorithm consists of two parts: the noisy logistic map shuffles the pixel positions and values several times, and new pixel positions and values are then generated by the Clifford system. To illustrate the efficiency of the proposed scheme, various types of security analysis are performed. It can be concluded that the proposed image encryption system is a suitable choice for practical applications.
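
    A compact sketch of the two-stage structure: a noisy logistic map drives pixel shuffling, and a Clifford attractor sequence drives diffusion. Parameter values and the XOR diffusion are illustrative choices; a real scheme would derive the noise deterministically from the key so the receiver can rebuild the same permutation and keystream.

```python
# Permutation (noisy logistic map) + diffusion (Clifford attractor) sketch.
import numpy as np

rng = np.random.default_rng(4)

def noisy_logistic(n, x=0.61, r=3.99, eps=1e-6):
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1 - x) + eps * rng.standard_normal()  # additive noise
        x = min(max(x, 1e-12), 1 - 1e-12)                  # keep in (0, 1)
        out[i] = x
    return out

def clifford(n, a=-1.4, b=1.6, c=1.0, d=0.7):
    xs = np.empty(n)
    x = y = 0.1
    for i in range(n):
        x, y = np.sin(a * y) + c * np.cos(a * x), np.sin(b * x) + d * np.cos(b * y)
        xs[i] = x
    return xs

img = rng.integers(0, 256, size=64 * 64, dtype=np.uint8)  # toy "image"
perm = np.argsort(noisy_logistic(img.size))               # position shuffle
key = (np.abs(clifford(img.size)) * 1e6 % 256).astype(np.uint8)
cipher = img[perm] ^ key                                   # diffusion

# Decryption: invert the XOR, then undo the permutation.
plain = np.empty_like(img)
plain[perm] = cipher ^ key
assert np.array_equal(plain, img)
```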

  16. Reaching for the cloud: on the lessons learned from grid computing technology transfer process to the biomedical community.

    Science.gov (United States)

    Mohammed, Yassene; Dickmann, Frank; Sax, Ulrich; von Voigt, Gabriele; Smith, Matthew; Rienhoff, Otto

    2010-01-01

    Natural scientists such as physicists pioneered the sharing of computing resources, which led to the creation of the Grid. The inter-domain transfer process of this technology has hitherto been an intuitive process without in-depth analysis. Some difficulties facing the life science community in this transfer can be understood using Bozeman's "Effectiveness Model of Technology Transfer". Bozeman's and classical technology transfer approaches deal with technologies that have achieved a certain stability. Grid and Cloud solutions are technologies which are still in flux. We show how Grid computing creates new difficulties in the transfer process that are not considered in Bozeman's model. We show why the success of healthgrids should be measured by the qualified scientific human capital and the opportunities created, and not primarily by the market impact. We conclude with recommendations that can help improve the adoption of Grid and Cloud solutions in the biomedical community. These results give a more concise explanation of the difficulties many life science IT projects are facing in the late funding periods, and show leveraging steps that can help overcome the "vale of tears".

  17. Dipole-magnet field models based on a conformal map

    Directory of Open Access Journals (Sweden)

    P. L. Walstrom

    2012-10-01

    In general, generation of charged-particle transfer maps for conventional iron-pole-piece dipole magnets to third and higher order requires a model for the midplane field profile and its transverse derivatives (a soft-edge model) to high order, and numerical integration of map coefficients. An exact treatment of the problem for a particular magnet requires use of measured magnetic data. However, in initial design of beam transport systems, users of charged-particle optics codes generally rely on magnet models built into the codes. Indeed, if maps to third order are adequate for the problem, an approximate analytic field model together with numerical map-coefficient integration can capture the important features of the transfer map. The model described in this paper is based on the fact that, except at very large distances from the magnet, the magnetic field for parallel pole-face magnets with constant pole gap height and wide pole faces is basically two dimensional (2D). The field for all space outside of the pole pieces is given by a single (complex) analytic expression and includes a parameter that controls the rate of falloff of the fringe field. Since the field function is analytic in the complex plane outside of the pole pieces, it satisfies two basic requirements of a field model for higher-order map codes: it is infinitely differentiable at the midplane and also a solution of the Laplace equation. It is apparently the only simple model available that combines an exponential approach to the central field with an inverse-cubic falloff of field at large distances from the magnet in a single expression. The model is not intended for detailed fitting of magnetic field data, but for use in numerical map-generating codes for studying the effect of extended fringe fields on higher-order transfer maps. It is based on conformally mapping the area between the pole pieces to the upper half plane, and placing current filaments on the pole faces.
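
    The following sketch illustrates why analyticity matters for higher-order map generation: with a field profile analytic in the complex plane, midplane derivatives to high order can be taken stably via Cauchy's integral formula rather than by noisy repeated differencing. The Enge-like profile 1/(1+e^z) is a stand-in, not the paper's conformal-map model.

```python
# High-order derivatives of an analytic fringe profile via Cauchy's formula.
import numpy as np
from math import factorial

def f(z):                       # analytic model; nearest pole at z = i*pi
    return 1.0 / (1.0 + np.exp(z))

def derivatives(func, z0, nmax, radius=1.0, npts=256):
    """n-th derivatives at z0 from an FFT of func on a circle around z0."""
    k = np.arange(npts)
    zs = z0 + radius * np.exp(2j * np.pi * k / npts)
    coeffs = np.fft.fft(func(zs)) / npts          # Taylor coefficients a_n
    n = np.arange(nmax + 1)
    fact = np.array([factorial(int(m)) for m in n], dtype=float)
    return coeffs[:nmax + 1] * fact / radius**n   # f^(n) = n! * a_n / r^n

d = derivatives(f, 0.0, 8)
print(np.real(d[:4]))   # expect approximately [0.5, -0.25, 0.0, 0.125]
```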

  18. Cloudgene: a graphical execution platform for MapReduce programs on private and public clouds.

    Science.gov (United States)

    Schönherr, Sebastian; Forer, Lukas; Weißensteiner, Hansi; Kronenberg, Florian; Specht, Günther; Kloss-Brandstätter, Anita

    2012-08-13

    The MapReduce framework enables a scalable processing and analyzing of large datasets by distributing the computational load on connected computer nodes, referred to as a cluster. In Bioinformatics, MapReduce has already been adopted to various case scenarios such as mapping next generation sequencing data to a reference genome, finding SNPs from short read data or matching strings in genotype files. Nevertheless, tasks like installing and maintaining MapReduce on a cluster system, importing data into its distributed file system or executing MapReduce programs require advanced knowledge in computer science and could thus prevent scientists from using currently available and useful software solutions. Here we present Cloudgene, a freely available platform to improve the usability of MapReduce programs in Bioinformatics by providing a graphical user interface for the execution, the import and export of data and the reproducibility of workflows on in-house (private clouds) and rented clusters (public clouds). The aim of Cloudgene is to build a standardized graphical execution environment for currently available and future MapReduce programs, which can all be integrated by using its plug-in interface. Since Cloudgene can be executed on private clusters, sensitive datasets can be kept in house at all times and data transfer times are therefore minimized. Our results show that MapReduce programs can be integrated into Cloudgene with little effort and without adding any computational overhead to existing programs. This platform gives developers the opportunity to focus on the actual implementation task and provides scientists a platform with the aim to hide the complexity of MapReduce. In addition to MapReduce programs, Cloudgene can also be used to launch predefined systems (e.g. Cloud BioLinux, RStudio) in public clouds. Currently, five different bioinformatic programs using MapReduce and two systems are integrated and have been successfully deployed. Cloudgene is
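
    For readers unfamiliar with the programming model that platforms like Cloudgene wrap with a GUI, the sketch below shows the MapReduce contract in-process: a map function emits (key, value) pairs, pairs are grouped by key, and a reduce function folds each group. A real Hadoop job distributes exactly these two functions across a cluster; the base-counting example is an invented stand-in for a bioinformatics task.

```python
# Minimal in-process MapReduce pattern: count bases in sequence reads.
from collections import defaultdict

def map_fn(line):
    for base in line.strip():
        yield base, 1

def reduce_fn(key, values):
    return key, sum(values)

def mapreduce(records, map_fn, reduce_fn):
    groups = defaultdict(list)
    for rec in records:                  # map + shuffle
        for k, v in map_fn(rec):
            groups[k].append(v)
    return dict(reduce_fn(k, vs) for k, vs in sorted(groups.items()))

reads = ["ACGTACGT", "GGTTAACC", "ACACAC"]
print(mapreduce(reads, map_fn, reduce_fn))   # {'A': 7, 'C': 7, 'G': 4, 'T': 4}
```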

  19. Cloudgene: A graphical execution platform for MapReduce programs on private and public clouds

    Directory of Open Access Journals (Sweden)

    Schönherr Sebastian

    2012-08-01

    Background: The MapReduce framework enables a scalable processing and analyzing of large datasets by distributing the computational load on connected computer nodes, referred to as a cluster. In Bioinformatics, MapReduce has already been adopted to various case scenarios such as mapping next generation sequencing data to a reference genome, finding SNPs from short read data or matching strings in genotype files. Nevertheless, tasks like installing and maintaining MapReduce on a cluster system, importing data into its distributed file system or executing MapReduce programs require advanced knowledge in computer science and could thus prevent scientists from using currently available and useful software solutions. Results: Here we present Cloudgene, a freely available platform to improve the usability of MapReduce programs in Bioinformatics by providing a graphical user interface for the execution, the import and export of data and the reproducibility of workflows on in-house (private clouds) and rented clusters (public clouds). The aim of Cloudgene is to build a standardized graphical execution environment for currently available and future MapReduce programs, which can all be integrated by using its plug-in interface. Since Cloudgene can be executed on private clusters, sensitive datasets can be kept in house at all times and data transfer times are therefore minimized. Conclusions: Our results show that MapReduce programs can be integrated into Cloudgene with little effort and without adding any computational overhead to existing programs. This platform gives developers the opportunity to focus on the actual implementation task and provides scientists a platform with the aim to hide the complexity of MapReduce. In addition to MapReduce programs, Cloudgene can also be used to launch predefined systems (e.g. Cloud BioLinux, RStudio) in public clouds. Currently, five different bioinformatic programs using MapReduce and two systems are

  20. Catalyst-Controlled and Tunable, Chemoselective Silver-Catalyzed Intermolecular Nitrene Transfer: Experimental and Computational Studies.

    Science.gov (United States)

    Dolan, Nicholas S; Scamp, Ryan J; Yang, Tzuhsiung; Berry, John F; Schomaker, Jennifer M

    2016-11-09

    The development of new catalysts for selective nitrene transfer is a continuing area of interest. In particular, the ability to control the chemoselectivity of intermolecular reactions in the presence of multiple reactive sites has been a long-standing challenge in the field. In this paper, we demonstrate examples of silver-catalyzed, nondirected, intermolecular nitrene transfer reactions that are both chemoselective and flexible for aziridination or C-H insertion, depending on the choice of ligand. Experimental probes present a puzzling picture of the mechanistic details of the pathways mediated by [(tBu3tpy)AgOTf]2 and (tpa)AgOTf. Computational studies elucidate these subtleties and provide guidance for the future development of new catalysts exhibiting improved tunability in group transfer reactions.

  1. PUMP DESIGN AND COMPUTATIONAL FLUID DYNAMIC ANALYSIS FOR HIGH TEMPERATURE SULFURIC ACID TRANSFER SYSTEM

    Directory of Open Access Journals (Sweden)

    JUNG-SIK CHOI

    2014-06-01

    In this study, we proposed a newly designed sulfuric acid transfer system for the sulfur-iodine (SI) thermochemical cycle. The proposed sulfuric acid transfer system was evaluated using a computational fluid dynamics (CFD) analysis for investigating thermodynamic/hydrodynamic characteristics and material properties. This analysis was conducted to obtain reliable continuous operation parameters; in particular, a thermal analysis was performed on the bellows box and bellows at various amplitudes and frequencies (0.1, 0.5, and 1.0 Hz). However, the high temperatures and strongly corrosive operating conditions of the current sulfuric acid system present challenges with respect to the structural materials of the transfer system. To resolve this issue, we designed a novel transfer system using polytetrafluoroethylene (PTFE, Teflon®) as a bellows material for the transfer of sulfuric acid. We also carried out a CFD analysis of the design. The CFD results indicated that the maximum applicable temperature of PTFE is about 533 K (260 °C), even though its melting point is around 600 K. This result implies that PTFE is a potential material for the sulfuric acid transfer system. The CFD simulations also confirmed that the sulfuric acid transfer system was designed properly for this particular investigation.

  2. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    Science.gov (United States)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions of using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With constrained budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis from computational data only. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied within a specified tolerance or bias error and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions are drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data is unavailable.
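
    A sketch of the proposed computational-only uncertainty estimate: perturb each CFD input within its tolerance, collect the resulting heat-transfer coefficients, and form a Student-t interval from the small sample. The h values below are invented stand-ins for CFD runs.

```python
# Student-t confidence interval from a small sample of CFD results.
import numpy as np
from scipy import stats

# Heat-transfer coefficient (W/m^2-K) from N CFD runs with perturbed inputs.
h = np.array([48.2, 51.0, 49.5, 50.3, 47.8, 50.9])
n = len(h)
mean, s = h.mean(), h.std(ddof=1)

conf = 0.95
t_crit = stats.t.ppf(0.5 + conf / 2, df=n - 1)
half_width = t_crit * s / np.sqrt(n)
print(f"h = {mean:.1f} +/- {half_width:.1f} W/m^2-K ({conf:.0%} CI)")
# Ranking: repeat with one input varied at a time; the largest half-width
# identifies the dominant contributor.
```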

  3. Computer Independent Data Transfer Device

    Directory of Open Access Journals (Sweden)

    Darshana Rarath

    2017-07-01

    In today's era, transferring data among distinct storage devices has become one of the most frequently performed tasks. In order to make data and information omnipresent, they need to be shared anywhere and anytime. However, relying on a PC or laptop for this purpose is not efficient. This paper presents an innovative way to overcome this restriction. It discusses the development of a portable device that uses wired and wireless communication applications to share data and information among distinct storage devices without relying on a PC or a laptop. The proposed device is compact, comprises a touch screen and a power source, and is capable of transferring all types of files. Hence, it eliminates the dependence on a PC or a laptop for transferring data.

  4. IHadoop: Asynchronous iterations for MapReduce

    KAUST Repository

    Elnikety, Eslam Mohamed Ibrahim

    2011-11-01

    MapReduce is a distributed programming framework designed to ease the development of scalable data-intensive applications for large clusters of commodity machines. Most machine learning and data mining applications involve iterative computations over large datasets, such as the Web hyperlink structures and social network graphs. Yet, the MapReduce model does not efficiently support this important class of applications. The architecture of MapReduce, most critically its dataflow techniques and task scheduling, is completely unaware of the nature of iterative applications; tasks are scheduled according to a policy that optimizes the execution for a single iteration which wastes bandwidth, I/O, and CPU cycles when compared with an optimal execution for a consecutive set of iterations. This work presents iHadoop, a modified MapReduce model, and an associated implementation, optimized for iterative computations. The iHadoop model schedules iterations asynchronously. It connects the output of one iteration to the next, allowing both to process their data concurrently. iHadoop's task scheduler exploits inter-iteration data locality by scheduling tasks that exhibit a producer/consumer relation on the same physical machine allowing a fast local data transfer. For those iterative applications that require satisfying certain criteria before termination, iHadoop runs the check concurrently during the execution of the subsequent iteration to further reduce the application's latency. This paper also describes our implementation of the iHadoop model, and evaluates its performance against Hadoop, the widely used open source implementation of MapReduce. Experiments using different data analysis applications over real-world and synthetic datasets show that iHadoop performs better than Hadoop for iterative algorithms, reducing execution time of iterative applications by 25% on average. Furthermore, integrating iHadoop with HaLoop, a variant Hadoop implementation that caches
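
    A toy illustration of iHadoop's central idea: instead of starting iteration i+1 only after iteration i fully finishes, stream each partition of iteration i's output straight into iteration i+1 (a producer/consumer pipeline). This is a threads-and-queues sketch, not the actual Hadoop scheduler; the update rule is an invented stand-in for a PageRank-like computation.

```python
# Asynchronous, pipelined iterations via producer/consumer queues.
import queue
import threading

def iteration(rank_in):                  # stand-in iterative update
    return [0.5 * r + 0.5 for r in rank_in]

def run_iteration(q_in, q_out):
    while True:
        part = q_in.get()
        if part is None:                 # end-of-iteration marker
            q_out.put(None)
            return
        q_out.put(iteration(part))       # forward partitions as they finish

q0, q1, q2 = queue.Queue(), queue.Queue(), queue.Queue()
threading.Thread(target=run_iteration, args=(q0, q1)).start()
threading.Thread(target=run_iteration, args=(q1, q2)).start()  # overlaps!

for part in ([1.0, 2.0], [3.0, 4.0], [5.0]):   # input partitions
    q0.put(part)
q0.put(None)

out = []
while (p := q2.get()) is not None:
    out.extend(p)
print(out)   # two iterations applied to every element
```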

  5. IHadoop: Asynchronous iterations for MapReduce

    KAUST Repository

    Elnikety, Eslam Mohamed Ibrahim; El Sayed, Tamer S.; Ramadan, Hany E.

    2011-01-01

    MapReduce is a distributed programming framework designed to ease the development of scalable data-intensive applications for large clusters of commodity machines. Most machine learning and data mining applications involve iterative computations over large datasets, such as the Web hyperlink structures and social network graphs. Yet, the MapReduce model does not efficiently support this important class of applications. The architecture of MapReduce, most critically its dataflow techniques and task scheduling, is completely unaware of the nature of iterative applications; tasks are scheduled according to a policy that optimizes the execution for a single iteration which wastes bandwidth, I/O, and CPU cycles when compared with an optimal execution for a consecutive set of iterations. This work presents iHadoop, a modified MapReduce model, and an associated implementation, optimized for iterative computations. The iHadoop model schedules iterations asynchronously. It connects the output of one iteration to the next, allowing both to process their data concurrently. iHadoop's task scheduler exploits inter-iteration data locality by scheduling tasks that exhibit a producer/consumer relation on the same physical machine allowing a fast local data transfer. For those iterative applications that require satisfying certain criteria before termination, iHadoop runs the check concurrently during the execution of the subsequent iteration to further reduce the application's latency. This paper also describes our implementation of the iHadoop model, and evaluates its performance against Hadoop, the widely used open source implementation of MapReduce. Experiments using different data analysis applications over real-world and synthetic datasets show that iHadoop performs better than Hadoop for iterative algorithms, reducing execution time of iterative applications by 25% on average. Furthermore, integrating iHadoop with HaLoop, a variant Hadoop implementation that caches

  6. Chirality Transfer in Gold(I)-Catalysed Direct Allylic Etherifications of Unactivated Alcohols: Experimental and Computational Study.

    Science.gov (United States)

    Barker, Graeme; Johnson, David G; Young, Paul C; Macgregor, Stuart A; Lee, Ai-Lan

    2015-09-21

    Gold(I)-catalysed direct allylic etherifications have been successfully carried out with chirality transfer to yield enantioenriched, γ-substituted secondary allylic ethers. Our investigations include a full substrate-scope screen to ascertain substituent effects on the regioselectivity, stereoselectivity and efficiency of chirality transfer, as well as control experiments to elucidate the mechanistic subtleties of the chirality-transfer process. Crucially, addition of molecular sieves was found to be necessary to ensure efficient and general chirality transfer. Computational studies suggest that the efficiency of chirality transfer is linked to the aggregation of the alcohol nucleophile around the reactive π-bound Au-allylic ether complex. With a single alcohol nucleophile, a high degree of chirality transfer is predicted. However, if three alcohols are present, alternative proton transfer chain mechanisms that erode the efficiency of chirality transfer become competitive. © 2015 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.

  7. Effect of Software Designed by Computer Conceptual Map Method in Mobile Environment on Learning Level of Nursing Students

    Directory of Open Access Journals (Sweden)

    Salmani N

    2015-12-01

    Aims: In order to sustain its progress, nursing education has to utilize new training methods, such that the teaching methods used by nursing instructors enhance significant learning by preventing superficial learning in the students. The conceptual map method is one of the new training strategies playing an important role in this field. The aim of this study was to investigate the effectiveness of software designed around the computer conceptual map in a mobile-phone environment on the learning level of nursing students. Materials & Methods: In this semi-experimental study with a pretest-posttest design, 60 students in the 5th semester were studied during the first semester of 2015-16. The experimental group (n=30, from Meibod Nursing Faculty) and the control group (n=30, from Yazd Shahid Sadoughi Nursing Faculty) were trained during the first 4 weeks of the semester, using the computer conceptual map method and the computer conceptual map method in a mobile-phone environment. Data were collected using a researcher-made academic progress test covering "knowledge" and "significant learning". Data were analyzed in SPSS 21 software using independent t, paired t, and Fisher tests. Findings: There were significant increases in the mean scores of knowledge and significant learning in both groups before and after the intervention (p<0.05). Nevertheless, the change in the scores of significant learning between the groups was statistically significant (p<0.05). Conclusion: Presenting the course content as a conceptual map in a mobile-phone environment positively affects the significant learning of nursing students.

  8. MetaQTL: a package of new computational methods for the meta-analysis of QTL mapping experiments

    Directory of Open Access Journals (Sweden)

    Charcosset Alain

    2007-02-01

    Background: Integration of multiple results from Quantitative Trait Loci (QTL) studies is a key point to understand the genetic determinism of complex traits. Up to now many efforts have been made by public database developers to facilitate the storage, compilation and visualization of multiple QTL mapping experiment results. However, studying the congruency between these results still remains a complex task. Presently, the few computational and statistical frameworks to do so are mainly based on empirical methods (e.g. consensus genetic maps are generally built by iterative projection). Results: In this article, we present a new computational and statistical package, called MetaQTL, for carrying out whole-genome meta-analysis of QTL mapping experiments. Contrary to existing methods, MetaQTL offers a complete statistical process to establish a consensus model for both the marker and the QTL positions on the whole genome. First, MetaQTL implements a new statistical approach to merge multiple distinct genetic maps into a single consensus map which is optimal in terms of weighted least squares and can be used to investigate recombination rate heterogeneity between studies. Secondly, assuming that QTL can be projected on the consensus map, MetaQTL offers a new clustering approach based on a Gaussian mixture model to decide how many QTL underlie the distribution of the observed QTL. Conclusion: We demonstrate using simulations that the usual model choice criteria from the mixture model literature perform relatively well in this context. As expected, simulations also show that this new clustering algorithm leads to a reduction in the length of the confidence interval of QTL location provided that across studies there are enough observed QTL for each underlying true QTL location. The usefulness of our approach is illustrated on published QTL detection results of flowering time in maize. Finally, MetaQTL is freely available at http://bioinformatics.org/mqtl.
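
    A sketch of the second MetaQTL stage: given observed QTL positions projected on a consensus map, fit one-dimensional Gaussian mixtures with increasing numbers of components and let an information criterion choose how many "true" QTL underlie them. The positions below are synthetic (two true QTL at 20 and 55 cM), and BIC stands in for whichever criterion MetaQTL applies.

```python
# Model selection for QTL clustering with a 1D Gaussian mixture + BIC.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
pos = np.concatenate([rng.normal(20.0, 2.5, 12),   # cM, study-to-study scatter
                      rng.normal(55.0, 3.0, 9)]).reshape(-1, 1)

best = min(
    (GaussianMixture(n_components=k, random_state=0).fit(pos)
     for k in (1, 2, 3, 4)),
    key=lambda g: g.bic(pos),
)
print(f"chosen k = {best.n_components}, "
      f"QTL locations = {np.sort(best.means_.ravel()).round(1)} cM")
```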

  9. THE USE OF LAPTOP COMPUTERS, TABLETS AND GOOGLE EARTH/GOOGLE MAPS APPLICATIONS DURING GEOGRAPHY CLUB SEMINARS

    Directory of Open Access Journals (Sweden)

    FLORIN GALBIN

    2015-01-01

    In the current study, we aim to investigate the use of the Google Earth and Google Maps applications on tablet and laptop computers. The research was carried out during the Geography Club seminars organized at "Radu Petrescu" High School in the 2013-2014 school year. The research involved 13 students in various gymnasium and high school grades. The activities included: navigation with Google Earth/Maps, image capturing techniques, virtual tours, measuring distances or river lengths, identifying relief forms, and locating geographical components of the environment. In order to collect students' opinions regarding the use of tablets and laptop computers with these two applications, they were asked to respond to a questionnaire after the activities took place. The conclusions revealed that students enjoyed using these applications on laptops and tablets and that the learning process during Geography classes became more interesting.

  10. Computation of Charged-Particle Transfer Maps for General Fields and Geometries Using Electromagnetic Boundary-Value Data

    OpenAIRE

    Dragt, A. J.; Roberts, P.; Stasevich, T. J.; Bodoh-Creed, A.; Walstrom, P. L.

    2010-01-01

    Three-dimensional field distributions from realistic beamline elements can be obtained only by measurement or by numerical solution of a boundary-value problem. In numerical charged-particle map generation, fields along a reference trajectory are differentiated multiple times. Any attempt to differentiate directly such field data multiple times is soon dominated by "noise" due to finite meshing and/or measurement errors. This problem can be overcome by the use of field data on a surface outsi...

  11. Computational brain connectivity mapping: A core health and scientific challenge.

    Science.gov (United States)

    Deriche, Rachid

    2016-10-01

    One third of the burden of all diseases in Europe is due to disorders affecting the brain. Although exceptional progress has been made in exploring the brain during the past decades, it is still terra incognita and calls for specific research efforts to better understand its architecture and functioning. To take up this great challenge of modern science and to overcome the limited view of the brain provided by any single imaging modality, this article advocates the idea, developed in my research group, of a global approach involving new generations of models for brain connectivity mapping and strong interactions between structural and functional connectivities. Capitalizing on the strengths of integrated and complementary non-invasive imaging modalities such as diffusion Magnetic Resonance Imaging (dMRI) and Electro- & Magneto-Encephalography (EEG & MEG) will contribute to achieving new frontiers for identifying and characterizing structural and functional brain connectivities, and to providing a detailed mapping of brain connectivity, both in space and time, thus leading to added clinical value for high-impact diseases, with new perspectives in computational neuroimaging and cognitive neuroscience. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Computational Prediction of Atomic Structures of Helical Membrane Proteins Aided by EM Maps

    Science.gov (United States)

    Kovacs, Julio A.; Yeager, Mark; Abagyan, Ruben

    2007-01-01

    Integral membrane proteins pose a major challenge for protein-structure prediction because only ≈100 high-resolution structures are available currently, thereby impeding the development of rules or empirical potentials to predict the packing of transmembrane α-helices. However, when an intermediate-resolution electron microscopy (EM) map is available, it can be used to provide restraints which, in combination with a suitable computational protocol, make structure prediction feasible. In this work we present such a protocol, which proceeds in three stages: (1) generation of an ensemble of α-helices by flexible fitting into each of the density rods in the low-resolution EM map, spanning a range of rotational angles around the main helical axes and translational shifts along the density rods; (2) fast optimization of side chains and scoring of the resulting conformations; and (3) refinement of the lowest-scoring conformations with internal coordinate mechanics, by optimizing the van der Waals, electrostatics, hydrogen bonding, torsional, and solvation energy contributions. In addition, our method implements a penalty term through a so-called tethering map, derived from the EM map, which restrains the positions of the α-helices. The protocol was validated on three test cases: GpA, KcsA, and MscL. PMID:17496035

  13. Spectral Transfer Learning Using Information Geometry for a User-Independent Brain-Computer Interface.

    Science.gov (United States)

    Waytowich, Nicholas R; Lawhern, Vernon J; Bohannon, Addison W; Ball, Kenneth R; Lance, Brent J

    2016-01-01

    Recent advances in signal processing and machine learning techniques have enabled the application of Brain-Computer Interface (BCI) technologies to fields such as medicine, industry, and recreation; however, BCIs still suffer from the requirement of frequent calibration sessions due to the intra- and inter-individual variability of brain signals, which makes calibration suppression through transfer learning an area of increasing interest for the development of practical BCI systems. In this paper, we present an unsupervised transfer method (spectral transfer using information geometry, STIG), which ranks and combines unlabeled predictions from an ensemble of information geometry classifiers built on data from individual training subjects. The STIG method is validated in both off-line and real-time feedback analysis during a rapid serial visual presentation task (RSVP). For detection of single-trial, event-related potentials (ERPs), the proposed method can significantly outperform existing calibration-free techniques as well as traditional within-subject calibration techniques when limited data is available. This method demonstrates that unsupervised transfer learning for single-trial detection in ERP-based BCIs can be achieved without the requirement of costly training data, representing a step forward in the overall goal of achieving a practical user-independent BCI system.
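
    A minimal sketch of ranking and combining unlabeled ensemble predictions in the spirit of STIG: for approximately conditionally independent classifiers, the leading eigenvector of the prediction covariance is proportional to the classifiers' balanced accuracies, so it can weight a vote without any labels. This is the spectral meta-learner construction that STIG builds on, shown here on synthetic +/-1 predictions rather than EEG features.

```python
# Unsupervised weighting of an ensemble via the leading covariance eigenvector.
import numpy as np

rng = np.random.default_rng(6)
n_clf, n_trials = 10, 500
truth = rng.choice([-1.0, 1.0], size=n_trials)
acc = rng.uniform(0.55, 0.85, size=n_clf)          # hidden per-subject skill
preds = np.where(rng.random((n_clf, n_trials)) < acc[:, None],
                 truth, -truth)                     # +/-1 predictions

cov = np.cov(preds)
w, v = np.linalg.eigh(cov)
lead = v[:, -1]
lead *= np.sign(lead.sum())                         # fix overall sign
combined = np.sign(lead @ preds)                    # weighted vote, no labels

print(f"best single classifier: {max((preds == truth).mean(1)):.3f}, "
      f"spectral ensemble: {(combined == truth).mean():.3f}")
```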

  14. A synthetic axiomatization of Map Theory

    DEFF Research Database (Denmark)

    Berline, Chantal; Grue, Klaus Ebbe

    2016-01-01

    … of ZFC set theory including the axiom of foundation are provable in Map Theory, and if one omits Hilbert's epsilon operator from Map Theory then one is left with a computer programming language. Map Theory fulfills Church's original aim of lambda calculus. Map Theory is suited for reasoning about … classical mathematics as well as computer programs. Furthermore, Map Theory is suited for eliminating the barrier between classical mathematics and computer science rather than just supporting the two fields side by side. Map Theory axiomatizes a universe of “maps”, some of which are “wellfounded” … The class of wellfounded maps in Map Theory corresponds to the universe of sets in ZFC. The first axiomatization MT0 of Map Theory had axioms which populated the class of wellfounded maps, much like the power set axiom along with others populates the universe of ZFC. The new axiomatization MT of Map Theory …

  15. Computer-aided diagnosis of early knee osteoarthritis based on MRI T2 mapping.

    Science.gov (United States)

    Wu, Yixiao; Yang, Ran; Jia, Sen; Li, Zhanjun; Zhou, Zhiyang; Lou, Ting

    2014-01-01

    This work aimed at studying a method for computer-aided diagnosis of early knee OA (OA: osteoarthritis). Based on the technique of MRI (MRI: Magnetic Resonance Imaging) T2 mapping, through computer image processing, feature extraction, calculation and analysis via constructing a classifier, an effective computer-aided diagnosis method for knee OA was created to assist doctors in their accurate, timely and convenient detection of potential risk of OA. In order to evaluate this method, a total of 1380 data points from the MRI images of 46 samples of knee joints were collected. These data were then modeled through linear regression on an offline general platform by the use of the ImageJ software, and a map of the physical parameter T2 was reconstructed. After the image processing, the T2 values of ten regions in the WORMS (WORMS: Whole-organ Magnetic Resonance Imaging Score) areas of the articular cartilage were extracted to be used as the eigenvalues in data mining. Then, an RBF (RBF: Radial Basis Function) network classifier was built to classify and identify the collected data. The classifier exhibited a final identification accuracy of 75%, indicating a good result for assisting diagnosis. Since the knee OA classifier constituted by a weights-directly-determined RBF neural network does not require any iteration, our results demonstrated that the optimal weights, appropriate centers and variances could be obtained through simple procedures. Furthermore, the accuracy for both the training samples and the testing samples from the normal group could reach 100%. Finally, the classifier was superior both in time efficiency and classification performance to the frequently used classifiers based on iterative learning. Thus it is suitable to be used as an aid to computer-aided diagnosis of early knee OA.
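
    A sketch of a "weights-directly-determined" RBF classifier of the kind the study describes: with fixed centres and widths, the output weights come from one linear least-squares solve, so no iterative training is needed. The synthetic feature vectors below stand in for the ten T2 eigenvalues; the class means, width, and number of centres are assumptions.

```python
# RBF network with output weights from a single least-squares solve.
import numpy as np

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(40, 3, (30, 10)),    # "healthy" T2 features (ms)
               rng.normal(50, 4, (30, 10))])   # "early OA" (elevated T2)
y = np.array([0.0] * 30 + [1.0] * 30)

centers = X[rng.choice(len(X), 12, replace=False)]   # fixed RBF centres
width = 15.0

def phi(X):
    d2 = ((X[:, None, :] - centers[None, :, :])**2).sum(-1)
    return np.exp(-d2 / (2 * width**2))

W, *_ = np.linalg.lstsq(phi(X), y, rcond=None)       # direct weights
pred = (phi(X) @ W > 0.5).astype(float)
print(f"training accuracy: {(pred == y).mean():.2%}")
```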

  16. Computation of the Lyapunov exponents in the compass-gait model under OGY control via a hybrid Poincaré map

    International Nuclear Information System (INIS)

    Gritli, Hassène; Belghith, Safya

    2015-01-01

    Highlights: • A numerical calculation method of the Lyapunov exponents in the compass-gait model under OGY control is proposed. • A new linearization method of the impulsive hybrid dynamics around a one-periodic hybrid limit cycle is achieved. • We develop a simple analytical expression of a controlled hybrid Poincaré map. • A dimension reduction of the hybrid Poincaré map is realized. • We describe the numerical computation procedure of the Lyapunov exponents via the designed hybrid Poincaré map. - Abstract: This paper aims at providing a numerical calculation method of the spectrum of Lyapunov exponents in a four-dimensional impulsive hybrid nonlinear dynamics of a passive compass-gait model under the OGY control approach by means of a controlled hybrid Poincaré map. We present a four-dimensional simplified analytical expression of such hybrid map obtained by linearizing the uncontrolled impulsive hybrid nonlinear dynamics around a desired one-periodic passive hybrid limit cycle. In order to compute the spectrum of Lyapunov exponents, a dimension reduction of the controlled hybrid Poincaré map is realized. The numerical calculation of the spectrum of Lyapunov exponents using the reduced-dimension controlled hybrid Poincaré map is given in detail. In order to show the effectiveness of the developed method, the spectrum of Lyapunov exponents is calculated as the slope (bifurcation) parameter varies and hence used to predict the walking dynamics behavior of the compass-gait model under the OGY control.
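
    The abstract does not reproduce the analytical Poincaré map itself, so the sketch below shows only the generic numerical procedure it feeds: iterate a discrete map, accumulate Jacobian products, and reorthonormalize with QR to extract the Lyapunov spectrum. The Hénon map stands in for the reduced-dimension controlled hybrid Poincaré map.

    ```python
    import numpy as np

    def henon(x, a=1.4, b=0.3):
        return np.array([1.0 - a * x[0] ** 2 + x[1], b * x[0]])

    def henon_jac(x, a=1.4, b=0.3):
        return np.array([[-2.0 * a * x[0], 1.0],
                         [b, 0.0]])

    def lyapunov_spectrum(f, jac, x0, n_iter=10000):
        # QR reorthonormalization of the Jacobian products along the orbit
        x = np.asarray(x0, dtype=float)
        Q = np.eye(len(x))
        sums = np.zeros(len(x))
        for _ in range(n_iter):
            Q, R = np.linalg.qr(jac(x) @ Q)
            sums += np.log(np.abs(np.diag(R)))
            x = f(x)
        return sums / n_iter

    print(lyapunov_spectrum(henon, henon_jac, [0.1, 0.1]))
    # Roughly (0.42, -1.62) for the classical Hénon parameters
    ```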

  17. Human Mind Maps

    Science.gov (United States)

    Glass, Tom

    2016-01-01

    When students generate mind maps, or concept maps, the maps are usually on paper, computer screens, or a blackboard. Human Mind Maps require few resources and little preparation. The main requirements are space where students can move around and a little creativity and imagination. Mind maps can be used for a variety of purposes, and Human Mind…

  18. Fencing direct memory access data transfers in a parallel active messaging interface of a parallel computer

    Science.gov (United States)

    Blocksome, Michael A.; Mamidala, Amith R.

    2013-09-03

    Fencing direct memory access (`DMA`) data transfers in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI including data communications endpoints, each endpoint including specifications of a client, a context, and a task, the endpoints coupled for data communications through the PAMI and through DMA controllers operatively coupled to segments of shared random access memory through which the DMA controllers deliver data communications deterministically, including initiating execution through the PAMI of an ordered sequence of active DMA instructions for DMA data transfers between two endpoints, effecting deterministic DMA data transfers through a DMA controller and a segment of shared memory; and executing through the PAMI, with no FENCE accounting for DMA data transfers, an active FENCE instruction, the FENCE instruction completing execution only after completion of all DMA instructions initiated prior to execution of the FENCE instruction for DMA data transfers between the two endpoints.
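
    A toy model (plain Python, not the PAMI API) of why deterministic, ordered delivery allows a FENCE "with no FENCE accounting": the fence rides the same ordered channel as the DMA instructions and completes exactly when it reaches the head of the queue, i.e. after every previously initiated transfer.

    ```python
    from collections import deque

    class OrderedChannel:
        """Models a channel that delivers strictly in initiation order."""

        def __init__(self):
            self.queue = deque()

        def put_dma(self, payload, on_done):
            self.queue.append(("DMA", payload, on_done))

        def fence(self, on_done):
            # No counters or per-transfer accounting: ordering alone
            # guarantees all prior DMAs are delivered first.
            self.queue.append(("FENCE", None, on_done))

        def progress(self):
            while self.queue:
                kind, payload, on_done = self.queue.popleft()
                on_done(kind, payload)

    ch = OrderedChannel()
    for i in range(3):
        ch.put_dma(i, lambda kind, p: print(f"{kind} {p} delivered"))
    ch.fence(lambda kind, p: print("FENCE complete: all prior DMAs done"))
    ch.progress()
    ```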

  19. Computation studies into architecture and energy transfer properties of photosynthetic units from filamentous anoxygenic phototrophs

    Energy Technology Data Exchange (ETDEWEB)

    Linnanto, Juha Matti [Institute of Physics, University of Tartu, Riia 142, 51014 Tartu (Estonia)]; Freiberg, Arvi [Institute of Physics, University of Tartu, Riia 142, 51014 Tartu, Estonia and Institute of Molecular and Cell Biology, University of Tartu, Riia 23, 51010 Tartu (Estonia)]

    2014-10-06

    We have used different computational methods to study structural architecture, and light-harvesting and energy transfer properties of the photosynthetic unit of filamentous anoxygenic phototrophs. Due to the huge number of atoms in the photosynthetic unit, a combination of atomistic and coarse methods was used for electronic structure calculations. The calculations reveal that the light energy absorbed by the peripheral chlorosome antenna complex transfers efficiently via the baseplate and the core B808–866 antenna complexes to the reaction center complex, in general agreement with the present understanding of this complex system.

  20. Open Land-Use Map: A Regional Land-Use Mapping Strategy for Incorporating OpenStreetMap with Earth Observations

    Science.gov (United States)

    Yang, D.; Fu, C. S.; Binford, M. W.

    2017-12-01

    The southeastern United States has high landscape heterogeneity, with heavily managed forestlands, highly developed agricultural lands, and multiple metropolitan areas. Human activities are transforming and altering land patterns and structures in both negative and positive manners. A land-use map at the regional scale is a heavy computation task but is critical to most landowners, researchers, and decision makers, enabling them to make informed decisions for varying objectives. There are two major difficulties in generating classification maps at the regional scale: the necessity of large training point sets and the expensive computation cost, in terms of both money and time, of classifier modeling. Volunteered Geographic Information (VGI) opens a new era in mapping and visualizing our world, where the platform is open for collecting valuable georeferenced information by volunteer citizens, and the data is freely available to the public. As one of the most well-known VGI initiatives, OpenStreetMap (OSM) contributes not only road network distribution, but also the potential for using these data to justify land cover and land use classifications. Google Earth Engine (GEE) is a platform designed for cloud-based mapping with robust and fast computing power. Most large-scale and national mapping approaches confuse "land cover" and "land-use", or build up the land-use database based on modeled land cover datasets. Unlike most other large-scale approaches, we distinguish and differentiate land-use from land cover. By focusing on our prime objective of mapping land-use and management practices, a robust regional land-use mapping approach is developed by incorporating the OpenStreetMap dataset into Earth observation remote sensing imagery instead of the often-used land cover base maps.
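
    A hedged sketch of the general GEE workflow the abstract outlines: OSM-derived land-use labels train a classifier over a Landsat composite. The asset path for the OSM labels, the dates, the area of interest, and the classifier choice are illustrative assumptions, not the authors' setup.

    ```python
    import ee
    ee.Initialize()

    # Median Landsat 8 surface-reflectance composite over a hypothetical AOI
    landsat = (ee.ImageCollection("LANDSAT/LC08/C01/T1_SR")
               .filterDate("2016-01-01", "2016-12-31")
               .filterBounds(ee.Geometry.Point(-82.3, 29.6))
               .median())

    bands = ["B2", "B3", "B4", "B5", "B6", "B7"]
    # Hypothetical table of OSM-derived polygons with an integer 'landuse' label
    osm_labels = ee.FeatureCollection("users/example/osm_landuse_labels")

    training = landsat.select(bands).sampleRegions(
        collection=osm_labels, properties=["landuse"], scale=30)

    classifier = ee.Classifier.smileRandomForest(100).train(
        features=training, classProperty="landuse", inputProperties=bands)

    landuse_map = landsat.select(bands).classify(classifier)
    ```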

  1. DistMap: a toolkit for distributed short read mapping on a Hadoop cluster.

    Directory of Open Access Journals (Sweden)

    Ram Vinay Pandey

    Full Text Available With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/

  2. Computational solution to automatically map metabolite libraries in the context of genome scale metabolic networks

    Directory of Open Access Journals (Sweden)

    Benjamin eMerlet

    2016-02-01

    Full Text Available This article describes a generic programmatic method for mapping chemical compound libraries on organism-specific metabolic networks from various databases (KEGG, BioCyc) and flat file formats (SBML and Matlab files). We show how this pipeline was successfully applied to decipher the coverage of chemical libraries set up by two metabolomics facilities, MetaboHub (French National infrastructure for metabolomics and fluxomics) and Glasgow Polyomics, on the metabolic networks available in the MetExplore web server. The present generic protocol is designed to formalize and reduce the volume of information transfer between the library and the network database. Matching of metabolites between libraries and metabolic networks is based on InChIs or InChIKeys and therefore requires that these identifiers are specified in both libraries and networks. In addition to providing covering statistics, this pipeline also allows the visualization of mapping results in the context of metabolic networks. In order to achieve this goal we tackled issues on programmatic interaction between two servers, improvement of metabolite annotation in metabolic networks and automatic loading of a mapping in the genome scale metabolic network analysis tool MetExplore. It is important to note that this mapping can also be performed on a single organism or a selection of organisms of interest and is thus not limited to large facilities.
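
    Since matching hinges on InChIKeys, the core of the mapping step can be pictured as a dictionary intersection; the identifiers below are real InChIKeys, but the field layout is an illustrative simplification of what such networks store.

    ```python
    # Library: metabolite name -> InChIKey
    library = {
        "ATP": "ZKHQWZAMYRWXGA-KQYNXXCUSA-N",
        "D-glucose": "WQZGKKKJIJFFOK-GASJEMHNSA-N",
    }
    # Network: InChIKey -> network metabolite identifier (hypothetical ids)
    network = {
        "WQZGKKKJIJFFOK-GASJEMHNSA-N": "M_glc__D_c",
        "XLYOFNOQVPJJNP-UHFFFAOYSA-N": "M_h2o_c",
    }

    mapping = {name: network[key]
               for name, key in library.items() if key in network}
    coverage = len(mapping) / len(library)
    print(mapping, f"coverage = {coverage:.0%}")
    ```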

  3. Spectral Transfer Learning using Information Geometry for a User-Independent Brain-Computer Interface

    OpenAIRE

    Nicholas Roy Waytowich; Nicholas Roy Waytowich; Vernon Lawhern; Vernon Lawhern; Addison Bohannon; Addison Bohannon; Kenneth Ball; Brent Lance

    2016-01-01

    Recent advances in signal processing and machine learning techniques have enabled the application of Brain-Computer Interface (BCI) technologies to fields such as medicine, industry and recreation. However, BCIs still suffer from the requirement of frequent calibration sessions due to the intra- and inter-individual variability of brain signals, which makes calibration suppression through transfer learning an area of increasing interest for the development of practical BCI systems. In this p...

  4. Spectral Transfer Learning Using Information Geometry for a User-Independent Brain-Computer Interface

    OpenAIRE

    Waytowich, Nicholas R.; Lawhern, Vernon J.; Bohannon, Addison W.; Ball, Kenneth R.; Lance, Brent J.

    2016-01-01

    Recent advances in signal processing and machine learning techniques have enabled the application of Brain-Computer Interface (BCI) technologies to fields such as medicine, industry, and recreation; however, BCIs still suffer from the requirement of frequent calibration sessions due to the intra- and inter-individual variability of brain-signals, which makes calibration suppression through transfer learning an area of increasing interest for the development of practical BCI systems. In this p...

  5. Spectral Transfer Learning using Information Geometry for a User-Independent Brain-Computer Interface

    Directory of Open Access Journals (Sweden)

    Nicholas Roy Waytowich

    2016-09-01

    Full Text Available Recent advances in signal processing and machine learning techniques have enabled the application of Brain-Computer Interface (BCI) technologies to fields such as medicine, industry and recreation. However, BCIs still suffer from the requirement of frequent calibration sessions due to the intra- and inter-individual variability of brain signals, which makes calibration suppression through transfer learning an area of increasing interest for the development of practical BCI systems. In this paper, we present an unsupervised transfer method (spectral transfer using information geometry, STIG), which ranks and combines unlabeled predictions from an ensemble of information geometry classifiers built on data from individual training subjects. The STIG method is validated in both offline and real-time feedback analysis during a rapid serial visual presentation task (RSVP). For detection of single-trial, event-related potentials (ERPs), the proposed method can significantly outperform existing calibration-free techniques as well as outperform traditional within-subject calibration techniques when limited data is available. This method demonstrates that unsupervised transfer learning for single-trial detection in ERP-based BCIs can be achieved without the requirement of costly training data, representing a step forward in the overall goal of achieving a practical user-independent BCI system.
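
    The record gives no code, but one standard way to "rank and combine unlabeled predictions" from an ensemble, in the spirit of the abstract's description, is a spectral meta-learner: for roughly conditionally independent binary classifiers, the leading eigenvector of the prediction covariance estimates each classifier's reliability. A minimal numpy sketch under those assumptions:

    ```python
    import numpy as np

    def spectral_weights(P):
        # P: (n_classifiers, n_trials) matrix of +/-1 unlabeled predictions
        Q = np.cov(P)
        # Off-diagonal of Q is approximately rank one; its leading
        # eigenvector ranks the classifiers by balanced accuracy.
        vals, vecs = np.linalg.eigh(Q)
        v = vecs[:, -1]
        return v if v.sum() >= 0 else -v      # resolve sign ambiguity

    def combine(P):
        return np.sign(spectral_weights(P) @ P)  # weighted ensemble vote

    # Toy usage: 5 classifiers of varying quality, 200 unlabeled trials
    rng = np.random.default_rng(1)
    truth = rng.choice([-1, 1], size=200)
    acc = np.array([0.55, 0.60, 0.70, 0.80, 0.90])
    P = np.where(rng.random((5, 200)) < acc[:, None], truth, -truth)
    print("ensemble accuracy:", (combine(P) == truth).mean())
    ```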

  6. High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers

    Science.gov (United States)

    Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas

    2017-04-01

    The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise mapping application is composed of four principal modules: (1) pre-processing of raw data, (2) massive cross-correlation, (3) post-processing of correlation data based on computation of logarithmic energy ratio and (4) generation of source maps from post-processed data. Implementation of the solution posed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.

  7. GATE Monte Carlo simulation of dose distribution using MapReduce in a cloud computing environment.

    Science.gov (United States)

    Liu, Yangchuan; Tang, Yuguo; Gao, Xin

    2017-12-01

    The GATE Monte Carlo simulation platform has good application prospects in treatment planning and quality assurance. However, accurate dose calculation using GATE is time consuming. The purpose of this study is to implement a novel cloud computing method for accurate GATE Monte Carlo simulation of dose distribution using MapReduce. An Amazon Machine Image installed with Hadoop and GATE is created to set up Hadoop clusters on Amazon Elastic Compute Cloud (EC2). Macros, the input files for GATE, are split into a number of self-contained sub-macros. Through Hadoop Streaming, the sub-macros are executed by GATE in Map tasks and the sub-results are aggregated into final outputs in Reduce tasks. As an evaluation, GATE simulations were performed in a cubical water phantom for X-ray photons of 6 and 18 MeV. The parallel simulation on the cloud computing platform is as accurate as the single-threaded simulation on a local server. The cloud-based simulation time is approximately inversely proportional to the number of worker nodes. For the simulation of 10 million photons on a cluster with 64 worker nodes, time decreases of 41× and 32× were achieved compared to the single worker node case and the single-threaded case, respectively. The test of Hadoop's fault tolerance showed that the simulation correctness was not affected by the failure of some worker nodes. The results verify that the proposed method provides a feasible cloud computing solution for GATE.
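
    The Map side of such a scheme can be as small as a streaming script: each input line names a self-contained sub-macro, the mapper runs GATE on it, and emits a key/value pair for the Reduce step to aggregate. The paths and invocation below are assumptions for illustration, not the paper's actual pipeline.

    ```python
    #!/usr/bin/env python
    import subprocess
    import sys

    # Hadoop Streaming mapper: one independent GATE run per sub-macro.
    for line in sys.stdin:
        macro = line.strip()
        if not macro:
            continue
        # Execute the self-contained sub-macro (hypothetical file layout)
        subprocess.run(["Gate", macro], check=True)
        # Emit (constant key, partial-result path); the reducer aggregates
        # the partial dose distributions into the final output
        print(f"dose\t{macro}.out")
    ```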

  8. 3-D heat transfer computer calculations of the performance of the IAEA's air-bath calorimeters

    International Nuclear Information System (INIS)

    Elias, E.; Kaizermann, S.; Perry, R.B.; Fiarman, S.

    1989-01-01

    A three dimensional (3-D) heat transfer computer code was developed to study and optimize the design parameters and to better understand the performance characteristics of the IAEA's air-bath calorimeters. The computer model accounts for heat conduction and radiation in the complex materials of the calorimeter and for heat convection and radiation at its outer surface. The temperature servo controller is modelled as an integral part of the heat balance equations in the system. The model predictions will be validated against test data using the ANL bulk calorimeter. 11 refs., 6 figs

  9. Fencing network direct memory access data transfers in a parallel active messaging interface of a parallel computer

    Science.gov (United States)

    Blocksome, Michael A.; Mamidala, Amith R.

    2015-07-07

    Fencing direct memory access (`DMA`) data transfers in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI including data communications endpoints, each endpoint including specifications of a client, a context, and a task, the endpoints coupled for data communications through the PAMI and through DMA controllers operatively coupled to a deterministic data communications network through which the DMA controllers deliver data communications deterministically, including initiating execution through the PAMI of an ordered sequence of active DMA instructions for DMA data transfers between two endpoints, effecting deterministic DMA data transfers through a DMA controller and the deterministic data communications network; and executing through the PAMI, with no FENCE accounting for DMA data transfers, an active FENCE instruction, the FENCE instruction completing execution only after completion of all DMA instructions initiated prior to execution of the FENCE instruction for DMA data transfers between the two endpoints.

  10. Temperature mapping and thermal dose calculation in combined radiation therapy and 13.56 MHz radiofrequency hyperthermia for tumor treatment

    Science.gov (United States)

    Kim, Jung Kyung; Prasad, Bibin; Kim, Suzy

    2017-02-01

    To evaluate the synergistic effect of radiotherapy and radiofrequency hyperthermia therapy in the treatment of lung and liver cancers, we studied the mechanism of heat absorption and transfer in the tumor using electro-thermal simulation and high-resolution temperature mapping techniques. A realistic tumor-induced mouse anatomy, which was reconstructed and segmented from computed tomography images, was used to determine the thermal distribution in tumors during radiofrequency (RF) heating at 13.56 MHz. An RF electrode was used as a heat source, and computations were performed with the aid of the multiphysics simulation platform Sim4Life. Experiments were carried out on a tumor-mimicking agar phantom and a mouse tumor model to obtain a spatiotemporal temperature map and thermal dose distribution. A high temperature increase was achieved in the tumor from both the computation and measurement, which elucidated that there was selective high-energy absorption in tumor tissue compared to the normal surrounding tissues. The study allows for effective treatment planning for combined radiation and hyperthermia therapy based on the high-resolution temperature mapping and high-precision thermal dose calculation.
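
    For intuition about the electro-thermal computation, here is a rough 1-D finite-difference Pennes bioheat sketch with a localized RF deposition term; all tissue and source parameters are generic literature-style assumptions, not values from the study.

    ```python
    import numpy as np

    L, n, dt, t_end = 0.02, 101, 0.01, 60.0   # 2 cm domain, 60 s of heating
    dx = L / (n - 1)
    rho, c, k = 1000.0, 3600.0, 0.5           # density, heat capacity, conductivity
    w_b, rho_b, c_b = 0.005, 1000.0, 3600.0   # perfusion rate, blood properties
    T_a = 37.0                                # arterial temperature (deg C)

    x = np.linspace(0.0, L, n)
    # Gaussian RF power deposition centered on the "tumor" (W/m^3)
    Q = 5e5 * np.exp(-((x - L / 2) ** 2) / (2 * 0.002 ** 2))
    T = np.full(n, 37.0)

    for _ in range(int(t_end / dt)):
        lap = np.zeros(n)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx ** 2
        # Pennes: rho*c*dT/dt = k*Laplacian + perfusion sink + RF source
        T += dt / (rho * c) * (k * lap + w_b * rho_b * c_b * (T_a - T) + Q)
        T[0] = T[-1] = 37.0                   # body-temperature boundaries

    print(f"peak temperature: {T.max():.1f} deg C")
    ```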

  11. a Study on Mental Representations for Realistic Visualization the Particular Case of Ski Trail Mapping

    Science.gov (United States)

    Balzarini, R.; Dalmasso, A.; Murat, M.

    2015-08-01

    This article presents preliminary results from a research project in progress that brings together geographers, cognitive scientists, historians and computer scientists. The project investigates the evolution of a particular territorial model: ski trail maps. Ski resorts, tourist and sporting innovations for mountain economies since the 1930s, have needed cartographic representations corresponding to new practices of space. Painter artists have been involved in producing ski maps with painting techniques and panoramic views, which are by far the most common type of map, because they allow the resorts to look impressive to potential visitors. These techniques have evolved throughout the mutations of the ski resorts. Paper ski maps no longer meet the needs of a large part of the customers; the question now arises of their adaptation to digital media. In a computerized-process perspective, the early stage of the project aims to identify the artist representations, based on conceptual and technical rules, which are handled by user-skiers to perform a task (location, wayfinding, decision-making) and can be transferred to a computer system. This article presents the experimental phase that analyzes artist and user mental representations that are at stake during the making and the reading of a paper ski map. It particularly focuses on how the invention of the artist influences map reading.

  12. A STUDY ON MENTAL REPRESENTATIONS FOR REALISTIC VISUALIZATION THE PARTICULAR CASE OF SKI TRAIL MAPPING

    Directory of Open Access Journals (Sweden)

    R. Balzarini

    2015-08-01

    Full Text Available This article presents preliminary results from a research project in progress that brings together geographers, cognitive scientists, historians and computer scientists. The project investigates the evolution of a particular territorial model: ski trail maps. Ski resorts, tourist and sporting innovations for mountain economies since the 1930s, have needed cartographic representations corresponding to new practices of space. Painter artists have been involved in producing ski maps with painting techniques and panoramic views, which are by far the most common type of map, because they allow the resorts to look impressive to potential visitors. These techniques have evolved throughout the mutations of the ski resorts. Paper ski maps no longer meet the needs of a large part of the customers; the question now arises of their adaptation to digital media. In a computerized-process perspective, the early stage of the project aims to identify the artist representations, based on conceptual and technical rules, which are handled by user-skiers to perform a task (location, wayfinding, decision-making) and can be transferred to a computer system. This article presents the experimental phase that analyzes artist and user mental representations that are at stake during the making and the reading of a paper ski map. It particularly focuses on how the invention of the artist influences map reading.

  13. A method for accurate computation of elastic and discrete inelastic scattering transfer matrix

    International Nuclear Information System (INIS)

    Garcia, R.D.M.; Santina, M.D.

    1986-05-01

    A method for accurate computation of elastic and discrete inelastic scattering transfer matrices is discussed. In particular, a partition scheme for the source energy range that avoids integration over intervals containing points where the integrand has a discontinuous derivative is developed. Five-figure accurate numerical results are obtained for several test problems with the TRAMA program, which incorporates the proposed method. A comparison with numerical results from existing processing codes is also presented. (author) [pt
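
    The partition idea generalizes beyond the paper's TRAMA formulation: when the integrand's derivative breaks at known points, split the quadrature there rather than integrating across the kink. A toy stand-in integrand:

    ```python
    from scipy.integrate import quad

    f = lambda E: abs(E - 1.0) ** 0.5       # derivative discontinuity at E = 1

    # Naive: one interval spanning the kink (slower convergence, warnings)
    naive, _ = quad(f, 0.0, 2.0)
    # Partitioned: tell the integrator where the derivative is discontinuous
    split, _ = quad(f, 0.0, 2.0, points=[1.0])

    print(naive, split)                     # both near 4/3; the split is cleaner
    ```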

  14. iHadoop: Asynchronous Iterations Support for MapReduce

    KAUST Repository

    Elnikety, Eslam

    2011-08-01

    MapReduce is a distributed programming framework designed to ease the development of scalable data-intensive applications for large clusters of commodity machines. Most machine learning and data mining applications involve iterative computations over large datasets, such as the Web hyperlink structures and social network graphs. Yet, the MapReduce model does not efficiently support this important class of applications. The architecture of MapReduce, most critically its dataflow techniques and task scheduling, is completely unaware of the nature of iterative applications; tasks are scheduled according to a policy that optimizes the execution for a single iteration which wastes bandwidth, I/O, and CPU cycles when compared with an optimal execution for a consecutive set of iterations. This work presents iHadoop, a modified MapReduce model, and an associated implementation, optimized for iterative computations. The iHadoop model schedules iterations asynchronously. It connects the output of one iteration to the next, allowing both to process their data concurrently. iHadoop's task scheduler exploits inter-iteration data locality by scheduling tasks that exhibit a producer/consumer relation on the same physical machine allowing a fast local data transfer. For those iterative applications that require satisfying certain criteria before termination, iHadoop runs the check concurrently during the execution of the subsequent iteration to further reduce the application's latency. This thesis also describes our implementation of the iHadoop model, and evaluates its performance against Hadoop, the widely used open source implementation of MapReduce. Experiments using different data analysis applications over real-world and synthetic datasets show that iHadoop performs better than Hadoop for iterative algorithms, reducing execution time of iterative applications by 25% on average. Furthermore, integrating iHadoop with HaLoop, a variant Hadoop implementation that caches...

  15. Transfer Kernel Common Spatial Patterns for Motor Imagery Brain-Computer Interface Classification

    Science.gov (United States)

    Dai, Mengxi; Liu, Shucong; Zhang, Pengju

    2018-01-01

    Motor-imagery-based brain-computer interfaces (BCIs) commonly use the common spatial pattern (CSP) as a preprocessing step before classification. The CSP method is a supervised algorithm, so a large amount of time-consuming training data is needed to build the model. To address this issue, one promising approach is transfer learning, which generalizes a learning model so that it can extract discriminative information from other subjects for the target classification task. To this end, we propose a transfer kernel CSP (TKCSP) approach to learn a domain-invariant kernel by directly matching distributions of source subjects and target subjects. The dataset IVa of BCI Competition III is used to demonstrate the validity of our proposed methods. In the experiment, we compare the classification performance of the TKCSP against CSP, CSP for subject-to-subject transfer (CSP SJ-to-SJ), regularizing CSP (RCSP), stationary subspace CSP (ssCSP), multitask CSP (mtCSP), and the combined mtCSP and ssCSP (ss + mtCSP) method. The results indicate that TKCSP achieves a superior mean classification performance of 81.14%, especially in the case of source subjects with a small number of training samples. Comprehensive experimental evidence on the dataset verifies the effectiveness and efficiency of the proposed TKCSP approach over several state-of-the-art methods. PMID:29743934
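
    For readers unfamiliar with CSP itself, the spatial filters come from a generalized eigendecomposition of the two class covariance matrices; TKCSP's domain-invariant kernel learning is beyond this sketch, which uses random data purely for shape.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    def csp_filters(X1, X2, n_filters=2):
        # X1, X2: (trials, channels, samples) EEG epochs for the two classes
        C1 = np.mean([x @ x.T / np.trace(x @ x.T) for x in X1], axis=0)
        C2 = np.mean([x @ x.T / np.trace(x @ x.T) for x in X2], axis=0)
        vals, W = eigh(C1, C1 + C2)          # generalized eigenproblem
        idx = np.argsort(vals)
        # Filters from both ends: maximal variance ratio for either class
        pick = np.r_[idx[:n_filters // 2], idx[-(n_filters - n_filters // 2):]]
        return W[:, pick]

    rng = np.random.default_rng(0)
    X1 = rng.normal(size=(20, 8, 250))       # toy class-1 epochs
    X2 = rng.normal(size=(20, 8, 250))       # toy class-2 epochs
    W = csp_filters(X1, X2)
    features = np.log(np.var(W.T @ X1[0], axis=1))  # log-variance features
    ```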

  16. Thermoelectricity analogy method for computing the periodic heat transfer in external building envelopes

    International Nuclear Information System (INIS)

    Peng Changhai; Wu Zhishen

    2008-01-01

    Simple and effective computation methods are needed to calculate energy efficiency in buildings for building thermal comfort and HVAC system simulations. This paper, which is based upon the theory of thermoelectricity analogy, develops a new harmonic method, the thermoelectricity analogy method (TEAM), to compute the periodic heat transfer in external building envelopes (EBE). It presents, in detail, the principles and specific techniques of TEAM to calculate both the decay rates and time lags of EBE. First, a set of linear equations is established using the theory of thermoelectricity analogy. Second, the temperature of each node is calculated by solving the linear equations set. Finally, decay rates and time lags are found by solving simple mathematical expressions. Comparisons show that this method is highly accurate and efficient. Moreover, relative to the existing harmonic methods, which are based on the classical control theory and the method of separation of variables, TEAM does not require complicated derivation and is amenable to hand computation and programming
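
    One textbook way to realize the electrical analogy for sinusoidal (periodic) conduction is to assign each homogeneous layer a complex two-port matrix, the thermal counterpart of a transmission-line section; a decay rate and time lag then fall out of the overall matrix. This is a generic sketch with illustrative layer data, not the paper's TEAM formulation.

    ```python
    import numpy as np

    w = 2 * np.pi / 86400.0                      # daily cycle (rad/s)

    def layer_matrix(d, k, rho, c):
        alpha = k / (rho * c)                    # thermal diffusivity
        g = np.sqrt(1j * w / alpha)              # complex wavenumber
        return np.array([[np.cosh(g * d), np.sinh(g * d) / (k * g)],
                         [k * g * np.sinh(g * d), np.cosh(g * d)]])

    # Outside-to-inside wall: brick then insulation (d, k, rho, c)
    M = layer_matrix(0.24, 0.8, 1800, 880) @ layer_matrix(0.05, 0.04, 30, 1400)

    decay_rate = abs(M[0, 0])                    # amplitude attenuation
    time_lag_h = np.angle(M[0, 0]) / w / 3600.0  # phase delay in hours
    # (phase unwrapping may be needed for very thick walls)
    print(f"decay rate ~ {decay_rate:.1f}, time lag ~ {time_lag_h:.1f} h")
    ```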

  17. Physical mapping in highly heterozygous genomes: a physical contig map of the Pinot Noir grapevine cultivar

    Directory of Open Access Journals (Sweden)

    Jurman Irena

    2010-03-01

    Full Text Available Abstract Background Most of the grapevine (Vitis vinifera L.) cultivars grown today are those selected centuries ago, even though grapevine is one of the most important fruit crops in the world. Grapevine has therefore not benefited from the advances in modern plant breeding nor more recently from those in molecular genetics and genomics: genes controlling important agronomic traits are practically unknown. A physical map is essential to positionally clone such genes and instrumental in a genome sequencing project. Results We report on the first whole genome physical map of grapevine built using high information content fingerprinting of 49,104 BAC clones from the cultivar Pinot Noir. Pinot Noir, as most grape varieties, is highly heterozygous at the sequence level. This resulted in the two allelic haplotypes sometimes assembling into separate contigs that had to be accommodated in the map framework or in local expansions of contig maps. We performed computer simulations to assess the effects of increasing levels of sequence heterozygosity on BAC fingerprint assembly and showed that the experimental assembly results are in full agreement with the theoretical expectations, given the heterozygosity levels reported for grape. The map is anchored to a dense linkage map consisting of 994 markers. 436 contigs are anchored to the genetic map, covering 342 of the 475 Mb that make up the grape haploid genome. Conclusions We have developed a resource that makes it possible to access the grapevine genome, opening the way to a new era both in grape genetics and breeding and in wine making. The effects of heterozygosity on the assembly have been analyzed and characterized by using several complementary approaches which could be easily transferred to the study of other genomes which present the same features.

  18. Transfer Learning for SSVEP Electroencephalography Based Brain–Computer Interfaces Using Learn++.NSE and Mutual Information

    Directory of Open Access Journals (Sweden)

    Matthew Sybeldon

    2017-01-01

    Full Text Available Brain–Computer Interfaces (BCI) using Steady-State Visual Evoked Potentials (SSVEP) are sometimes used by injured patients seeking to use a computer. Canonical Correlation Analysis (CCA) is seen as state-of-the-art for SSVEP BCI systems. However, this assumes that the user has full control over their covert attention, which may not be the case. This introduces high calibration requirements when using other machine learning techniques. These may be circumvented by using transfer learning to utilize data from other participants. This paper proposes a combination of ensemble learning via Learn++ for Nonstationary Environments (Learn++.NSE) and similarity measures such as mutual information to identify ensembles of pre-existing data that result in higher classification. Results show that this approach performed worse than CCA in participants with typical SSVEP responses, but outperformed CCA in participants whose SSVEP responses violated CCA assumptions. This indicates that similarity measures and Learn++.NSE can introduce a transfer learning mechanism to bring SSVEP system accessibility to users unable to control their covert attention.

  19. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Vigil, Benny Manuel [Los Alamos National Laboratory]; Ballance, Robert [SNL]; Haskell, Karen [SNL]

    2012-08-09

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  20. Affective Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    In particular, mapping environmental damage, endangered species, and human-made disasters has become one of the focal points of affective knowledge production. These ‘more-than-human geographies’ practices include notions of species, space and territory, and movement towards a new political ecology. This type of digital cartographies has been highlighted as the ‘processual turn’ in critical cartography, whereas in related computational journalism it can be seen as an interactive and iterative process of mapping complex and fragile ecological developments. This paper looks at computer-assisted cartography as part...

  1. Participatory Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    2016-01-01

    … practice. In particular, mapping environmental damage, endangered species, and human-made disasters has become one focal point for environmental knowledge production. This type of digital map has been highlighted as a processual turn in critical cartography, whereas in related computational journalism … of a geo-visualization within information mapping that enhances embodiment in the experience of the information. InfoAmazonia is defined as a digitally created map-space within which journalistic practice can be seen as dynamic, performative interactions between journalists, ecosystems, space, and species...

  2. Constructing linkage maps in the genomics era with MapDisto 2.0.

    Science.gov (United States)

    Heffelfinger, Christopher; Fragoso, Christopher A; Lorieux, Mathias

    2017-07-15

    Genotyping by sequencing (GBS) generates datasets that are challenging for current genetic mapping software with a graphical interface. Geneticists need new user-friendly computer programs that can analyze GBS data on desktop computers. This requires improvements in computation efficiency, both in terms of speed and use of random-access memory (RAM). MapDisto v.2.0 is a user-friendly computer program for the construction of genetic linkage maps. It includes several new major features: (i) handling of very large genotyping datasets like the ones generated by GBS; (ii) direct importation and conversion of Variant Call Format (VCF) files; (iii) detection of linkage, i.e. construction of linkage groups, in case of segregation distortion; (iv) data imputation on VCF files using a new approach, called LB-Impute. Features i to iv operate through inclusion of new Java modules that are used transparently by MapDisto; (v) QTL detection operates via a new R/qtl graphical interface. The program is available free of charge at mapdisto.free.fr. Contact: mapdisto@gmail.com. Supplementary data are available at Bioinformatics online.

  3. Mapping and quantifying sediment transfer between the front of rapidly moving rock glaciers and torrential gullies

    Science.gov (United States)

    Kummert, Mario; Delaloye, Reynald

    2018-05-01

    The sedimentary connection which may occur between the front of active rock glaciers and torrential channels is not well understood, despite its potential impact on the torrential activity characterizing the concerned catchments. In this study, DEMs of difference (DoDs) covering various time intervals between 2013 and 2016 were obtained from LiDAR-derived multitemporal DEMs for three rapidly moving rock glaciers located in the western Swiss Alps. The DoDs were used to map and quantify sediment transfer activity between the front of these rock glaciers and the corresponding underlying torrential gullies. Sediment transfer rates ranging between 1500 m3/y and 7800 m3/y have been calculated, depending on the sites. Sediment eroded from the fronts generally accumulated in the upper sectors of the torrential gullies where they were occasionally mobilized within small to medium sized debris flow events. A clear relation between the motion rates of the rock glaciers and the sediment transfer rates calculated at their fronts could be highlighted. Along with the size of the frontal areas, rock glacier creep rates influence thus directly sediment availability in the headwaters of the studied torrents. The frequency-magnitude of debris flow events varied between sites and was mainly related to the concordance of local factors such as topography, water availability, sediment availability or sediment type.

  4. Computational study of the heat transfer of an avian egg in a tray.

    Science.gov (United States)

    Eren Ozcan, S; Andriessens, S; Berckmans, D

    2010-04-01

    The development of an embryo in an avian egg depends largely on its temperature. The embryo temperature is affected by its environment and the heat produced by the egg. In this paper, eggshell temperature and the heat transfer characteristics from one egg in a tray toward its environment are studied by means of computational fluid dynamics (CFD). Computational fluid dynamics simulations have the advantage of providing extensive 3-dimensional information on velocity and eggshell temperature distribution around an egg that otherwise is not possible to obtain by experiments. However, CFD results need to be validated against experimental data. The objectives were (1) to find out whether CFD can successfully simulate eggshell temperature from one egg in a tray by comparing to previously conducted experiments, (2) to visualize air flow and air temperature distribution around the egg in a detailed way, and (3) to perform sensitivity analysis on several variables affecting heat transfer. To this end, a CFD model was validated using 2 sets of temperature measurements yielding an effective model. From these simulations, it can be concluded that CFD can effectively be used to analyze heat transfer characteristics and eggshell temperature distribution around an egg. In addition, air flow and temperature distribution around the egg are visualized. It has been observed that temperature differences up to 2.6 degrees C are possible at high heat production (285 mW) and horizontal low flow rates (0.5 m/s). Sensitivity analysis indicates that average eggshell temperature is mainly affected by the inlet air velocity and temperature, flow direction, and the metabolic heat of the embryo and less by the thermal conductivity and emissivity of the egg and thermal emissivity of the tray.

  5. Mapping land cover change over continental Africa using Landsat and Google Earth Engine cloud computing.

    Science.gov (United States)

    Midekisa, Alemayehu; Holl, Felix; Savory, David J; Andrade-Pacheco, Ricardo; Gething, Peter W; Bennett, Adam; Sturrock, Hugh J W

    2017-01-01

    Quantifying and monitoring the spatial and temporal dynamics of the global land cover is critical for better understanding many of the Earth's land surface processes. However, the lack of regularly updated, continental-scale, and high spatial resolution (30 m) land cover data limit our ability to better understand the spatial extent and the temporal dynamics of land surface changes. Despite the free availability of high spatial resolution Landsat satellite data, continental-scale land cover mapping using high resolution Landsat satellite data was not feasible until now due to the need for high-performance computing to store, process, and analyze this large volume of high resolution satellite data. In this study, we present an approach to quantify continental land cover and impervious surface changes over a long period of time (15 years) using high resolution Landsat satellite observations and Google Earth Engine cloud computing platform. The approach applied here to overcome the computational challenges of handling big earth observation data by using cloud computing can help scientists and practitioners who lack high-performance computational resources.

  6. High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers

    Science.gov (United States)

    Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.

    2017-12-01

    The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution.

  7. A Computational Solution to Automatically Map Metabolite Libraries in the Context of Genome Scale Metabolic Networks.

    Science.gov (United States)

    Merlet, Benjamin; Paulhe, Nils; Vinson, Florence; Frainay, Clément; Chazalviel, Maxime; Poupin, Nathalie; Gloaguen, Yoann; Giacomoni, Franck; Jourdan, Fabien

    2016-01-01

    This article describes a generic programmatic method for mapping chemical compound libraries on organism-specific metabolic networks from various databases (KEGG, BioCyc) and flat file formats (SBML and Matlab files). We show how this pipeline was successfully applied to decipher the coverage of chemical libraries set up by two metabolomics facilities MetaboHub (French National infrastructure for metabolomics and fluxomics) and Glasgow Polyomics (GP) on the metabolic networks available in the MetExplore web server. The present generic protocol is designed to formalize and reduce the volume of information transfer between the library and the network database. Matching of metabolites between libraries and metabolic networks is based on InChIs or InChIKeys and therefore requires that these identifiers are specified in both libraries and networks. In addition to providing covering statistics, this pipeline also allows the visualization of mapping results in the context of metabolic networks. In order to achieve this goal, we tackled issues on programmatic interaction between two servers, improvement of metabolite annotation in metabolic networks and automatic loading of a mapping in genome scale metabolic network analysis tool MetExplore. It is important to note that this mapping can also be performed on a single or a selection of organisms of interest and is thus not limited to large facilities.

  8. On Data Transfers Over Wide-Area Dedicated Connections

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S. [ORNL]; Liu, Qiang [ORNL]

    2018-01-01

    Dedicated wide-area network connections are employed in big data and high-performance computing scenarios, since the absence of cross-traffic promises to make it easier to analyze and optimize data transfers over them. However, nonlinear transport dynamics and end-system complexity due to multi-core hosts and distributed file systems make these tasks surprisingly challenging. We present an overview of methods to analyze memory and disk file transfers using extensive measurements over 10 Gbps physical and emulated connections with 0–366 ms round trip times (RTTs). For memory transfers, we derive performance profiles of TCP and UDT throughput as a function of RTT, which show concave regions in contrast to entirely convex regions predicted by previous models. These highly desirable concave regions can be expanded by utilizing large buffers and more parallel flows. We also present Poincaré maps and Lyapunov exponents of TCP and UDT throughput traces that indicate complex throughput dynamics. For disk file transfers, we show that throughput can be optimized using a combination of parallel I/O and network threads under direct I/O mode. Our initial throughput measurements of Lustre filesystems mounted over long-haul connections using LNet routers show convex profiles indicative of I/O limits.

  9. GAM-HEAT -- a computer code to compute heat transfer in complex enclosures

    International Nuclear Information System (INIS)

    Cooper, R.E.; Taylor, J.R.; Kielpinski, A.L.; Steimke, J.L.

    1991-02-01

    The GAM-HEAT code was developed for heat transfer analyses associated with postulated Double Ended Guillotine Break Loss Of Coolant Accidents (DEGB LOCA) resulting in a drained reactor vessel. In these analyses the gamma radiation resulting from fission product decay constitutes the primary source of energy as a function of time. This energy is deposited into the various reactor components and is re-radiated as thermal energy. The code accounts for all radiant heat exchanges within and leaving the reactor enclosure. The SRS reactors constitute complex radiant exchange enclosures since there are many assemblies of various types within the primary enclosure and most of the assemblies themselves constitute enclosures. GAM-HEAT accounts for this complexity by processing externally generated view factors and connectivity matrices, and also accounts for convective, conductive, and advective heat exchanges. The code is applicable for many situations involving heat exchange between surfaces within a radiatively passive medium. The GAM-HEAT code has been exercised extensively for computing transient temperatures in SRS reactors with specific charges and control components. Results from these computations have been used to establish the need for and to evaluate hardware modifications designed to mitigate results of postulated accident scenarios, and to assist in the specification of safe reactor operating power limits. The code accounts for the temperature dependence of material properties. The efficiency of the code has been enhanced by the use of an iterative equation solver. Verification of the code to date consists of comparisons with parallel efforts at Los Alamos National Laboratory and with similar efforts at Westinghouse Science and Technology Center in Pittsburgh, PA, and benchmarking using problems with known analytical or iterated solutions. All comparisons and tests yield results that indicate the GAM-HEAT code performs as intended.
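
    The radiant-exchange core of such a code is a radiosity balance over the enclosure's view factors. A minimal gray-body sketch with made-up surfaces follows (GAM-HEAT additionally handles conduction, convection, advection, and transients):

    ```python
    import numpy as np

    sigma = 5.670e-8                             # Stefan-Boltzmann constant
    eps = np.array([0.8, 0.6, 0.9])              # surface emissivities
    T = np.array([900.0, 600.0, 400.0])          # surface temperatures (K)
    F = np.array([[0.0, 0.5, 0.5],               # view factors (rows sum to 1)
                  [0.5, 0.0, 0.5],
                  [0.5, 0.5, 0.0]])

    E_b = sigma * T ** 4                         # blackbody emissive power
    # Radiosity system: J = eps*E_b + (1 - eps) * F @ J
    A = np.eye(3) - (1.0 - eps)[:, None] * F
    J = np.linalg.solve(A, eps * E_b)            # radiosities (W/m^2)
    q_net = eps / (1.0 - eps) * (E_b - J)        # net flux leaving each surface
    print(q_net)
    ```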

  10. Mapping radiation transfer through sea ice using a remotely operated vehicle (ROV)

    Directory of Open Access Journals (Sweden)

    M. Nicolaus

    2013-05-01

    Full Text Available Transmission of sunlight into and through sea ice is of critical importance for sea-ice associated organisms and photosynthesis because light is their primary energy source. The amount of visible light transferred through sea ice contributes to the energy budget of the sea ice and the uppermost ocean. However, our current knowledge on the amount and distribution of light under sea ice is still restricted to a few local observations, and our understanding of light-driven processes and interdisciplinary interactions is still sparse. The main reasons are that the under-ice environment is difficult to access and that measurements require large logistical and instrumental efforts. Hence, it has not been possible to map light conditions under sea ice over larger areas and to quantify spatial variability on different scales. Here we present a detailed methodological description for operating spectral radiometers on a remotely operated vehicle (ROV) under sea ice. Recent advances in ROV and radiation-sensor technology have allowed us to map under-ice spectral radiance and irradiance on floe scales within a few hours of station time. The ROV was operated directly from the sea ice, allowing for direct relations of optical properties to other sea-ice and surface features. The ROV was flown close to the sea ice in order to capture small-scale variability. Results from the presented data set and similar future studies will allow for better quantification of light conditions under sea ice. The presented experiences will support further developments in order to gather large data sets of under-ice radiation for different ice conditions and during different seasons.

  11. Frequency-selective near-field radiative heat transfer between photonic crystal slabs: a computational approach for arbitrary geometries and materials.

    Science.gov (United States)

    Rodriguez, Alejandro W; Ilic, Ognjen; Bermel, Peter; Celanovic, Ivan; Joannopoulos, John D; Soljačić, Marin; Johnson, Steven G

    2011-09-09

    We demonstrate the possibility of achieving enhanced frequency-selective near-field radiative heat transfer between patterned (photonic-crystal) slabs at designable frequencies and separations, exploiting a general numerical approach for computing heat transfer in arbitrary geometries and materials based on the finite-difference time-domain method. Our simulations reveal a tradeoff between selectivity and near-field enhancement as the slab-slab separation decreases, with the patterned heat transfer eventually reducing to the unpatterned result multiplied by a fill factor (described by a standard proximity approximation). We also find that heat transfer can be further enhanced at selective frequencies when the slabs are brought into a glide-symmetric configuration, a consequence of the degeneracies associated with the nonsymmorphic symmetry group.

  12. Fostering clinical reasoning in physiotherapy: comparing the effects of concept map study and concept map completion after example study in novice and advanced learners.

    Science.gov (United States)

    Montpetit-Tourangeau, Katherine; Dyer, Joseph-Omer; Hudon, Anne; Windsor, Monica; Charlin, Bernard; Mamede, Sílvia; van Gog, Tamara

    2017-12-01

    Health profession learners can foster clinical reasoning by studying worked examples presenting fully worked-out solutions to a clinical problem. It is possible to improve the learning effect of these worked examples by combining them with other learning activities based on concept maps. This study investigated which combination of activities, worked example study with concept map completion or worked example study with concept map study, fosters more meaningful learning of intervention knowledge in physiotherapy students. Moreover, this study compared the learning effects of these learning activity combinations between novice and advanced learners. Sixty-one second-year physiotherapy students participated in the study, which included a pre-test phase, a 130-min guided-learning phase and a four-week self-study phase. During the guided and self-study learning sessions, participants had to study three written worked examples presenting the clinical reasoning for selecting electrotherapeutic currents to treat patients with motor deficits. After each example, participants engaged in either concept map completion or concept map study depending on which learning condition they were randomly allocated to. Students participated in an immediate post-test at the end of the guided-learning phase and a delayed post-test at the end of the self-study phase. Post-tests assessed the understanding of principles governing the domain of knowledge to be learned (conceptual knowledge) and the ability to solve new problems that have similar (i.e., near transfer) or different (i.e., far transfer) solution rationales as problems previously studied in the examples. Learners engaged in concept map completion outperformed those engaged in concept map study on near transfer (p = .010) and far transfer. Worked example study combined with concept map completion led to greater transfer performance than worked example study combined with concept map study for both novice and advanced learners. Concept map completion...

  13. Enhancing programming logic thinking using analogy mapping

    Science.gov (United States)

    Sukamto, R. A.; Megasari, R.

    2018-05-01

    Programming logic thinking is the most important competence for computer science students. However, programming is one of the difficult subjects in computer science programs. This paper reports our work on enhancing students' programming logic thinking using Analogy Mapping for a basic programming subject. Analogy Mapping is a computer application which converts source code into analogy images. This research used a time series evaluation, and the results showed that Analogy Mapping can enhance students' programming logic thinking.

  14. A Numerical-Analytical Approach Based on Canonical Transformations for Computing Optimal Low-Thrust Transfers

    Science.gov (United States)

    da Silva Fernandes, S.; das Chagas Carvalho, F.; Bateli Romão, J. V.

    2018-04-01

    A numerical-analytical procedure based on infinitesimal canonical transformations is developed for computing optimal time-fixed low-thrust limited power transfers (no rendezvous) between coplanar orbits with small eccentricities in an inverse-square force field. The optimization problem is formulated as a Mayer problem with a set of non-singular orbital elements as state variables. Second order terms in eccentricity are considered in the development of the maximum Hamiltonian describing the optimal trajectories. The two-point boundary value problem of going from an initial orbit to a final orbit is solved by means of a two-stage Newton-Raphson algorithm which uses an infinitesimal canonical transformation. Numerical results are presented for some transfers between circular orbits with moderate radius ratio, including a preliminary analysis of Earth-Mars and Earth-Venus missions.
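
    A toy illustration of the two-point boundary value structure behind such transfers (not the paper's averaged low-thrust equations): guess the unknown initial condition, integrate, and let a Newton-type root finder drive the terminal boundary error to zero.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import fsolve

    def shoot(s0):
        # Integrate y'' = -y with unknown initial slope s0; target y(pi/2) = 2
        sol = solve_ivp(lambda t, y: [y[1], -y[0]], (0.0, np.pi / 2),
                        [0.0, s0[0]], rtol=1e-10)
        return [sol.y[0, -1] - 2.0]              # terminal boundary error

    s_star = fsolve(shoot, x0=[1.0])
    print("required initial slope:", s_star[0])  # analytic answer: 2.0
    ```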

  15. 2015 MICCAI Workshop on Computational Diffusion MRI

    CERN Document Server

    Ghosh, Aurobrata; Kaden, Enrico; Rathi, Yogesh; Reisert, Marco

    2016-01-01

    These Proceedings of the 2015 MICCAI Workshop “Computational Diffusion MRI” offer a snapshot of the current state of the art on a broad range of topics within the highly active and growing field of diffusion MRI. The topics vary from fundamental theoretical work on mathematical modeling, to the development and evaluation of robust algorithms, new computational methods applied to diffusion magnetic resonance imaging data, and applications in neuroscientific studies and clinical practice. Over the last decade interest in diffusion MRI has exploded. The technique provides unique insights into the microstructure of living tissue and enables in-vivo connectivity mapping of the brain. Computational techniques are key to the continued success and development of diffusion MRI and to its widespread transfer into clinical practice. New processing methods are essential for addressing issues at each stage of the diffusion MRI pipeline: acquisition, reconstruction, modeling and model fitting, image processing, fiber t...

  16. Canadian pollutant releases and transfers : NPRI data 1998

    International Nuclear Information System (INIS)

    2001-03-01

    A large, two-sided fold-out poster showing pollution hotspots throughout Canada has been released by the Canadian Institute for Environmental Law and Policy (CIELAP). The poster is based on the maps and tables of the National Pollutant Release Inventory (NPRI) 1998. It presents the top on-site releases and off-site transfers of pollutants by facilities across Canada, and provides a summary of releases, transfers and recycling of pollutants by province. The map depicts the facilities with the five largest quantities of individual pollutants released to each medium, i.e., water, air, land, underground. The crude oil and gas, other utilities, primary metal, paper and chemical industrial sectors are responsible for the largest on-site releases, while the business services, fabricated metal, primary metal and chemical industrial sectors account for the largest off-site transfers. Twenty-five of the 176 substances whose releases and transfers are reported under the NPRI are classified as 'toxic' or 'carcinogenic'. tabs., maps

  17. Using a Cloud Computing System to Reduce Door-to-Balloon Time in Acute ST-Elevation Myocardial Infarction Patients Transferred for Percutaneous Coronary Intervention.

    Science.gov (United States)

    Ho, Chi-Kung; Chen, Fu-Cheng; Chen, Yung-Lung; Wang, Hui-Ting; Lee, Chien-Ho; Chung, Wen-Jung; Lin, Cheng-Jui; Hsueh, Shu-Kai; Hung, Shin-Chiang; Wu, Kuan-Han; Liu, Chu-Feng; Kung, Chia-Te; Cheng, Cheng-I

    2017-01-01

    This study evaluated the impact on clinical outcomes of using a cloud computing system to reduce percutaneous coronary intervention hospital door-to-balloon (DTB) time for ST segment elevation myocardial infarction (STEMI). A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through the protocol while the other 107 patients were transferred through the traditional referral process. There were no significant differences in DTB time, pain to door of STEMI receiving center arrival time, and pain to balloon time between the two groups. Pain to electrocardiography time in patients with Killip I/II and catheterization laboratory to balloon time in patients with Killip III/IV were significantly reduced in the group transferred through the protocol compared to the traditional referral process group. However, the cloud computing system in our present protocol did not reduce DTB time.

  18. Experimental study on flow pattern and heat transfer of inverted annular flow

    International Nuclear Information System (INIS)

    Takenaka, Nobuyuki; Akagawa, Koji; Fujii, Terushige; Nishida, Koji

    1990-01-01

    Experimental results are presented on the flow patterns and heat transfer in the regions from inverted annular flow to dispersed flow in a vertical tube, using Freon R-113 as the working fluid at atmospheric pressure, to discuss the correspondence between the two. Axial distributions of the heat transfer coefficient are measured and flow patterns are observed. The heat transfer characteristics are divided into three regions and a heat transfer characteristics map is proposed. The flow pattern changes from inverted annular flow (IAF) to dispersed flow (DF) through inverted slug flow (ISF) for lower inlet velocities and through agitated inverted annular flow (AIAF) for higher inlet velocities. A flow pattern map is obtained which corresponds well with the heat transfer characteristics map. (orig.)

  19. Terrain Correction on the moving equal area cylindrical map projection of the surface of a reference ellipsoid

    Science.gov (United States)

    Ardalan, A.; Safari, A.; Grafarend, E.

    2003-04-01

    An operational algorithm has been developed for computing the ellipsoidal terrain correction, based on the closed-form solution of the Newton integral in terms of Cartesian coordinates on the cylindrical equal-area map projection of the surface of a reference ellipsoid. As the first step, the mapping of points on the surface of a reference ellipsoid onto the cylindrical equal-area projection of a cylinder tangent to a point on the ellipsoid surface was closely studied and the map projection formulas were derived. Ellipsoidal mass elements of various sizes on the surface of the reference ellipsoid were considered, and the gravitational potential and the vector of gravitational intensity of these mass elements were computed via the solution of the Newton integral in terms of ellipsoidal coordinates. The geographical cross-section areas of the selected ellipsoidal mass elements were transferred into the cylindrical equal-area map projection, and based on the transformed area elements, Cartesian mass elements with the same height as the ellipsoidal mass elements were constructed. Using the closed-form solution of the Newton integral in terms of Cartesian coordinates, the potentials of the Cartesian mass elements were computed and compared with the corresponding results based on the application of the ellipsoidal Newton integral over the ellipsoidal mass elements. The numerical computations show that the difference between the computed gravitational potential of the ellipsoidal mass elements and that of the Cartesian mass elements in the cylindrical equal-area map projection is of the order of 1.6 × 10^{-8} m²/s² for a mass element with a cross-section size of 10 km × 10 km and a height of 1000 m. For a 1 km × 1 km mass element of the same height, this difference is less than 1.5 × 10^{-4} m²/s². The numerical computations indicate that a new method for computing the terrain correction based on the closed-form solution of the Newton integral in ...
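
    For orientation, the spherical special case of a cylindrical equal-area projection has the simple closed form below; the paper derives the analogous formulas for the reference ellipsoid, so this is only a sketch of the underlying idea, with standard notation assumed.

```latex
% Spherical special case of a cylindrical equal-area projection
% (the paper derives the ellipsoidal analogue); R is the sphere radius,
% (\lambda, \phi) longitude and latitude, \lambda_0 the central meridian:
x = R\,(\lambda - \lambda_0), \qquad y = R \sin\phi ,
% which preserves area elements exactly:
\mathrm{d}x\,\mathrm{d}y = R^{2} \cos\phi\,\mathrm{d}\lambda\,\mathrm{d}\phi = \mathrm{d}A .
```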

  20. Shielding computations for solution transfer lines from Analytical Lab to process cells of Demonstration Fast Reactor Plant (DFRP)

    International Nuclear Information System (INIS)

    Baskar, S.; Jose, M.T.; Baskaran, R.; Venkatraman, B.

    2018-01-01

    The diluted virgin solutions (both aqueous and organic) and the aqueous analytical waste generated from experimental analysis of process solutions, pertaining to the Fast Breeder Test Reactor (FBTR) and the Prototype Fast Breeder Reactor (PFBR), in glove boxes of the Active Analytical Laboratory (AAL), are pumped back to the process cells through a pipe-in-pipe arrangement. There are 6 transfer lines (length 15-32 m), 2 for each type of transfer. The transfer lines pass through the area inside the AAL and also the operating area. Hence it is required to compute the necessary radial shielding around the lines to limit the dose rates in both areas to the permissible values as per the regulatory requirements

  1. Spectroscopic investigation and computational analysis of charge transfer hydrogen bonded reaction between 3-aminoquinoline with chloranilic acid in 1:1 stoichiometric ratio

    Science.gov (United States)

    Al-Ahmary, Khairia M.; Alenezi, Maha S.; Habeeb, Moustafa M.

    2015-10-01

    Charge transfer hydrogen bonded reaction between the electron donor (proton acceptor) 3-aminoquinoline and the electron acceptor (proton donor) chloranilic acid (H2CA) has been investigated experimentally and theoretically. The experimental work included the application of UV-vis spectroscopy to identify the charge transfer band of the formed complex and its molecular composition, as well as to estimate its formation constants in different solvents, including acetonitrile (AN), methanol (MeOH), ethanol (EtOH) and chloroform (CHL). New absorption bands in the range 500-550 nm, attributed to the formed complex, were recorded. The molecular composition of the HBCT complex was found to be 1:1 (donor:acceptor) in all studied solvents based on the continuous variation and photometric titration methods. In addition, the formation constants calculated from the Benesi-Hildebrand equation showed high values, especially in chloroform, referring to the formation of a stable HBCT complex. Infrared spectroscopy was applied to the solid complex, where the formation of charge and proton transfer was proven. Moreover, 1H and 13C NMR spectroscopies were used to characterize the formed complex, where charge and proton transfers were reconfirmed. The computational analysis used GAMESS computations, as packaged in the ChemBio3D Ultra 12 program, for energy minimization and estimation of the stabilization energy of the produced complex. Also, geometrical parameters (bond lengths and bond angles) of the formed HBCT complex were computed and analyzed. Furthermore, Mulliken atomic charges, the molecular potential energy surface, the HOMO and LUMO molecular orbitals, as well as the assignment of the electronic spectra of the formed complex, were presented. A full agreement between experimental and computational analysis was found, especially in the existence of the charge and proton transfers and the assignment of the HOMO and LUMO molecular orbitals in the formed complex ...
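
    The Benesi-Hildebrand analysis mentioned above is, in its common 1:1 form (donor in large excess, unit path length assumed), a linearization of the absorbance A of the CT band:

```latex
% Benesi-Hildebrand relation for a 1:1 complex; [A]_0 is the total acceptor
% concentration, [D]_0 the donor concentration, \varepsilon the molar
% absorptivity of the CT band, K the formation constant:
\frac{[A]_0}{A} \;=\; \frac{1}{\varepsilon} \;+\; \frac{1}{K\,\varepsilon\,[D]_0} .
```

    A plot of [A]_0/A against 1/[D]_0 is then a straight line whose intercept and slope yield the molar absorptivity and the formation constant K.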

  2. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Science.gov (United States)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing brings parallel computing into people's lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes and studies the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
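
    As a concrete reference point for the MapReduce model compared above, here is a minimal map-then-reduce word count; the chunked input and pool size are illustrative, and this is a sketch of the programming model, not the paper's code.

```python
# Minimal map-reduce sketch: map a word-count function over document
# chunks in parallel, then reduce the partial counts into one result.
from collections import Counter
from functools import reduce
from multiprocessing import Pool

def map_count(chunk: str) -> Counter:
    # Map phase: count words in one chunk
    return Counter(chunk.split())

def reduce_counts(a: Counter, b: Counter) -> Counter:
    # Reduce phase: merge partial results
    return a + b

if __name__ == "__main__":
    chunks = ["cloud computing develops parallel computing",
              "map reduce splits work into map and reduce phases",
              "mpi and openmp target tighter coupling"]
    with Pool(3) as pool:
        partials = pool.map(map_count, chunks)   # parallel map phase
    total = reduce(reduce_counts, partials)      # sequential reduce phase
    print(total.most_common(3))
```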

  3. Modelling flow and heat transfer around a seated human body by computational fluid dynamics

    DEFF Research Database (Denmark)

    Sørensen, Dan Nørtoft; Voigt, Lars Peter Kølgaard

    2003-01-01

    A database (http://www.ie.dtu.dk/manikin) containing a detailed representation of the surface geometry of a seated female human body was created from a surface scan of a thermal manikin (minus clothing and hair). The radiative heat transfer coefficient and the natural convection flow around ... (1) the geometry of the computational manikin has all surface features of a human being; (2) the geometry is an exact copy of an experimental thermal manikin, enabling detailed comparisons between calculations and experiments.

  4. Development of eSSR-Markers in Setaria italica and Their Applicability in Studying Genetic Diversity, Cross-Transferability and Comparative Mapping in Millet and Non-Millet Species.

    Science.gov (United States)

    Kumari, Kajal; Muthamilarasan, Mehanathan; Misra, Gopal; Gupta, Sarika; Subramanian, Alagesan; Parida, Swarup Kumar; Chattopadhyay, Debasis; Prasad, Manoj

    2013-01-01

    Foxtail millet (Setaria italica L.) is a tractable experimental model crop for studying functional genomics of millets and bioenergy grasses, but the limited availability of genomic resources, particularly expressed sequence-based genic markers, is significantly impeding its genetic improvement. Considering this, we attempted to develop EST-derived SSR (eSSR) markers and utilize them in germplasm characterization, cross-genera transferability and in silico comparative mapping. From 66,027 foxtail millet EST sequences, 24,828 non-redundant ESTs were deduced, representing ~16 Mb, which revealed 534 (~2%) eSSRs in 495 SSR-containing ESTs at a frequency of 1/30 kb. A total of 447 primer pairs were successfully designed, of which 327 were mapped physically onto the nine chromosomes. About 106 selected primer pairs representing the foxtail millet genome showed a high level of cross-genera amplification, at an average of ~88%, in eight millet and four non-millet species. The broad range of genetic diversity (0.02-0.65) obtained in a phylogenetic tree constructed using 40 eSSR markers demonstrated their utility in germplasm characterization and phylogenetics. Comparative mapping of the physically mapped eSSR markers showed a considerable proportion of sequence-based orthology and syntenic relationships between foxtail millet chromosomes and sorghum (~68%), maize (~61%) and rice (~42%) chromosomes. Synteny analysis of the eSSRs of foxtail millet, rice, maize and sorghum suggested the nested chromosome fusions frequently observed in grass genomes. Thus, for the first time, we have generated large-scale eSSR markers in foxtail millet and demonstrated their utility in germplasm characterization, transferability, phylogenetics and comparative mapping studies in millets and bioenergy grass species.
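
    The eSSR mining step described above amounts to scanning EST sequences for short tandem repeats. A minimal sketch with a back-referencing regular expression follows; the motif length (2-6 bp) and minimum copy number (5) are illustrative thresholds, not necessarily the study's exact criteria.

```python
# Hedged sketch: locate SSR motifs (2-6 bp unit repeated >= 5 times)
# in an EST sequence with a back-referencing regex.
import re

SSR_RE = re.compile(r"(([ACGT]{2,6})\2{4,})")  # group 2 = motif, group 1 = full tract

def find_ssrs(seq: str):
    hits = []
    for m in SSR_RE.finditer(seq.upper()):
        tract, motif = m.group(1), m.group(2)
        hits.append((m.start(), motif, len(tract) // len(motif)))
    return hits  # (position, repeat unit, copy number)

print(find_ssrs("ttgacacacacacacaggtagctagctagctagctagcaa"))
```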

  5. Computed tomography lung iodine contrast mapping by image registration and subtraction

    Science.gov (United States)

    Goatman, Keith; Plakas, Costas; Schuijf, Joanne; Beveridge, Erin; Prokop, Mathias

    2014-03-01

    Pulmonary embolism (PE) is a relatively common and potentially life threatening disease, affecting around 600,000 people annually in the United States alone. Prompt treatment using anticoagulants is effective and saves lives, but unnecessary treatment risks life threatening haemorrhage. The specificity of any diagnostic test for PE is therefore as important as its sensitivity. Computed tomography (CT) angiography is routinely used to diagnose PE. However, there are concerns it may over-report the condition. Additional information about the severity of an occlusion can be obtained from an iodine contrast map that represents tissue perfusion. Such maps tend to be derived from dual-energy CT acquisitions. However, they may also be calculated by subtracting pre- and post-contrast CT scans. Indeed, there are technical advantages to such a subtraction approach, including better contrast-to-noise ratio for the same radiation dose, and bone suppression. However, subtraction relies on accurate image registration. This paper presents a framework for the automatic alignment of pre- and post-contrast lung volumes prior to subtraction. The registration accuracy is evaluated for seven subjects for whom pre- and post-contrast helical CT scans were acquired using a Toshiba Aquilion ONE scanner. One hundred corresponding points were annotated on the pre- and post-contrast scans, distributed throughout the lung volume. Surface-to-surface error distances were also calculated from lung segmentations. Prior to registration the mean Euclidean landmark alignment error was 2.57 mm (range 1.43-4.34 mm), and following registration the mean error was 0.54 mm (range 0.44-0.64 mm). The mean surface error distance was 1.89 mm before registration and 0.47 mm after registration. There was a commensurate reduction in visual artefacts following registration. In conclusion, a framework for pre- and post-contrast lung registration has been developed that is sufficiently accurate for lung subtraction.

  6. High resolution global flood hazard map from physically-based hydrologic and hydraulic models.

    Science.gov (United States)

    Begnudelli, L.; Kaheil, Y.; McCollum, J.

    2017-12-01

    The global flood map published online at http://www.fmglobal.com/research-and-resources/global-flood-map at 90 m resolution is being used worldwide to understand flood risk exposure, exercise certain measures of mitigation, and/or transfer the residual risk financially through flood insurance programs. The modeling system is based on a physically-based hydrologic model to simulate river discharges, and a 2D shallow-water hydrodynamic model to simulate inundation. The model can be applied to large-scale flood hazard mapping thanks to several solutions that maximize its efficiency and the use of parallel computing. The hydrologic component of the modeling system is the Hillslope River Routing (HRR) hydrologic model. HRR simulates hydrological processes using a Green-Ampt parameterization, and is calibrated against observed discharge data from several publicly-available datasets. For inundation mapping, we use a 2D finite-volume shallow-water model with wetting/drying. We introduce here a grid Up-Scaling Technique (UST) for hydraulic modeling to perform simulations at higher resolution at global scale with relatively short computational times. The 30 m SRTM is now available worldwide, along with higher accuracy and/or resolution local Digital Elevation Models (DEMs) in many countries and regions. UST consists of aggregating computational cells, thus forming a coarser grid, while retaining the topographic information from the original full-resolution mesh. The full-resolution topography is used for building relationships between volume and free surface elevation inside cells and for computing inter-cell fluxes. This approach almost achieves the computational speed typical of coarse grids while preserving, to a significant extent, the accuracy offered by the much higher resolution of the available DEM. The simulations are carried out along each river of the network by forcing the hydraulic model with the streamflow hydrographs generated by HRR. Hydrographs are scaled so that the peak ...
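
    The heart of UST as described above is that each coarse cell keeps a volume/free-surface-elevation relationship built from the fine DEM pixels it aggregates. A minimal single-cell sketch follows; sizes and elevations are illustrative, not the production code.

```python
# Sketch of the up-scaling idea (UST): a coarse cell aggregates fine DEM
# pixels but retains a volume <-> free-surface-elevation curve built from
# the full-resolution topography (simplified to a single coarse cell).
import numpy as np

def stage_volume_curve(fine_z, pixel_area, stages):
    # Water volume stored in the coarse cell when the free surface is at h:
    # sum of (h - z) over submerged fine pixels, times the pixel area.
    fine_z = np.asarray(fine_z, dtype=float)
    return np.array([np.clip(h - fine_z, 0.0, None).sum() * pixel_area
                     for h in stages])

fine_z = np.array([[1.0, 1.2], [2.0, 3.5]])   # fine elevations in one coarse cell (m)
stages = np.linspace(1.0, 4.0, 7)             # candidate surface elevations (m)
vols = stage_volume_curve(fine_z, pixel_area=900.0, stages=stages)  # 30 m pixels
for h, v in zip(stages, vols):
    print(f"h = {h:4.1f} m -> V = {v:8.1f} m^3")
# Inverting this curve gives the free-surface elevation for a stored volume.
```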

  7. Taylor series maps and their domain of convergence

    International Nuclear Information System (INIS)

    Abell, D.T.; Dragt, A.J.

    1992-01-01

    This paper tries to make clear what limits the validity of a Taylor series map, and how. We describe the concept of a transfer map and quote some theorems that justify not only their existence but also their advantages. Then, we describe the Taylor series representation for transfer maps. Following that, we attempt to elucidate some of the basic theorems from the theory of functions of one and several complex variables. This material forms the core of our understanding of what limits the domain of convergence of Taylor series maps. Lastly, we use the concrete example of a simple anharmonic oscillator to illustrate how the theorems from several complex variable theory affect the domain of convergence of Taylor series maps. There we describe the singularities of the anharmonic oscillator in the complex planes of the initial conditions, show how they constrain our use of a Taylor series map, and then discuss our findings
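
    Concretely, a Taylor series map truncated at order N expresses the final phase-space coordinates as polynomials in the initial ones; in a standard multi-index notation (our choice, not necessarily the paper's):

```latex
% Truncated Taylor representation of a transfer map \mathcal{M}:
% \zeta_i are the initial and \zeta_f the final phase-space coordinates,
% k a multi-index, c^a_k the (machine-dependent) map coefficients:
\zeta^{a}_{f} \;=\; \sum_{0 \le |k| \le N} c^{a}_{k}\,\zeta_{i}^{\,k},
\qquad
\zeta^{\,k} \equiv \prod_{b}\bigl(\zeta^{b}\bigr)^{k_b} .
```

    The series converges only within a domain bounded by the singularity of the exact map nearest the origin in the complexified initial conditions, which is exactly what the anharmonic-oscillator example probes.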

  8. Data Assimilation with Optimal Maps

    Science.gov (United States)

    El Moselhy, T.; Marzouk, Y.

    2012-12-01

    We present a new approach to Bayesian inference that entirely avoids Markov chain simulation and sequential importance resampling, by constructing a map that pushes forward the prior measure to the posterior measure. Existence and uniqueness of a suitable measure-preserving map is established by formulating the problem in the context of optimal transport theory. The map is written as a multivariate polynomial expansion and computed efficiently through the solution of a stochastic optimization problem. While our previous work [1] focused on static Bayesian inference problems, we now extend the map-based approach to sequential data assimilation, i.e., nonlinear filtering and smoothing. One scheme involves pushing forward a fixed reference measure to each filtered state distribution, while an alternative scheme computes maps that push forward the filtering distribution from one stage to the other. We compare the performance of these schemes and extend the former to problems of smoothing, using a map implementation of the forward-backward smoothing formula. Advantages of a map-based representation of the filtering and smoothing distributions include analytical expressions for posterior moments and the ability to generate arbitrary numbers of independent uniformly-weighted posterior samples without additional evaluations of the dynamical model. Perhaps the main advantage, however, is that the map approach inherently avoids issues of sample impoverishment, since it explicitly represents the posterior as the pushforward of a reference measure, rather than with a particular set of samples. The computational complexity of our algorithm is comparable to state-of-the-art particle filters. Moreover, the accuracy of the approach is controlled via the convergence criterion of the underlying optimization problem. We demonstrate the efficiency and accuracy of the map approach via data assimilation in ...

  9. Computational models of an inductive power transfer system for electric vehicle battery charge

    Science.gov (United States)

    Anele, A. O.; Hamam, Y.; Chassagne, L.; Linares, J.; Alayli, Y.; Djouani, K.

    2015-09-01

    One of the issues to be solved for electric vehicles (EVs) to become a success is the technical solution of its charging system. In this paper, computational models of an inductive power transfer (IPT) system for EV battery charge are presented. Based on the fundamental principles behind IPT systems, 3 kW single phase and 22 kW three phase IPT systems for Renault ZOE are designed in MATLAB/Simulink. The results obtained based on the technical specifications of the lithium-ion battery and charger type of Renault ZOE show that the models are able to provide the total voltage required by the battery. Also, considering the charging time for each IPT model, they are capable of delivering the electricity needed to power the ZOE. In conclusion, this study shows that the designed computational IPT models may be employed as a support structure needed to effectively power any viable EV.
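
    As a rough illustration of the physics such models encode, the standard reflected-impedance description of a series-series compensated link operated at resonance is sketched below; the component values are placeholders, not Renault ZOE or paper parameters.

```python
# Hedged sketch: power delivered by a series-series compensated inductive
# power transfer link at resonance, via the standard reflected-impedance
# model (all values illustrative).
import math

def ipt_load_power(V1, f, M, R1, R2, RL):
    w = 2 * math.pi * f                  # operating angular frequency
    Zr = (w * M) ** 2 / (R2 + RL)        # secondary impedance reflected to primary
    I1 = V1 / (R1 + Zr)                  # primary coil current at resonance
    I2 = w * M * I1 / (R2 + RL)          # induced secondary current
    return I2 ** 2 * RL                  # power dissipated in the load

# Illustrative parameters: 300 V drive, 85 kHz, 20 uH mutual inductance
P = ipt_load_power(V1=300.0, f=85e3, M=20e-6, R1=0.1, R2=0.1, RL=10.0)
print(f"load power ~ {P / 1e3:.2f} kW")
```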

  10. Computational models of an inductive power transfer system for electric vehicle battery charge

    International Nuclear Information System (INIS)

    Anele, A O; Hamam, Y; Djouani, K; Chassagne, L; Alayli, Y; Linares, J

    2015-01-01

    One of the issues to be solved for electric vehicles (EVs) to become a success is the technical solution of its charging system. In this paper, computational models of an inductive power transfer (IPT) system for EV battery charge are presented. Based on the fundamental principles behind IPT systems, 3 kW single phase and 22 kW three phase IPT systems for Renault ZOE are designed in MATLAB/Simulink. The results obtained based on the technical specifications of the lithium-ion battery and charger type of Renault ZOE show that the models are able to provide the total voltage required by the battery. Also, considering the charging time for each IPT model, they are capable of delivering the electricity needed to power the ZOE. In conclusion, this study shows that the designed computational IPT models may be employed as a support structure needed to effectively power any viable EV. (paper)

  11. Mapping tropical biodiversity using spectroscopic imagery : characterization of structural and chemical diversity with 3-D radiative transfer modeling

    Science.gov (United States)

    Feret, J. B.; Gastellu-Etchegorry, J. P.; Lefèvre-Fonollosa, M. J.; Proisy, C.; Asner, G. P.

    2014-12-01

    The accelerating loss of biodiversity is a major environmental trend. Tropical ecosystems are particularly threatened due to climate change, invasive species, farming and natural resources exploitation. Recent advances in remote sensing of biodiversity confirmed the potential of high spatial resolution spectroscopic imagery for species identification and biodiversity mapping. Such information bridges the scale-gap between small-scale, highly detailed field studies and large-scale, low-resolution satellite observations. In order to produce fine-scale resolution maps of canopy alpha-diversity and beta-diversity of the Peruvian Amazonian forest, we designed, applied and validated a method based on the spectral variation hypothesis to CAO AToMS (Carnegie Airborne Observatory Airborne Taxonomic Mapping System) images, acquired from 2011 to 2013. There is a need to understand on a quantitative basis the physical processes leading to this spectral variability. This spectral variability mainly depends on canopy chemistry, structure, and sensor characteristics. 3D radiative transfer modeling provides a powerful framework for the study of the relative influence of each of these factors in dense and complex canopies. We simulated series of spectroscopic images with the 3D radiative model DART, with variability gradients in terms of leaf chemistry, individual tree structure, and spatial and spectral resolution, and applied methods for biodiversity mapping. This sensitivity study allowed us to determine the relative influence of these factors on the radiometric signal acquired by different types of sensors. Such a study is particularly important to define the domain of validity of our approach, to refine requirements for the instrumental specifications, and to help prepare the hyperspectral spatial missions to be launched at the horizon 2015-2025 (EnMAP, PRISMA, HISUI, SHALOM, HYSPIRI, HYPXIM). Simulations in preparation include topographic variations in order to estimate the robustness ...

  12. Mapping in the cloud

    CERN Document Server

    Peterson, Michael P

    2014-01-01

    This engaging text provides a solid introduction to mapmaking in the era of cloud computing. It takes students through both the concepts and technology of modern cartography, geographic information systems (GIS), and Web-based mapping. Conceptual chapters delve into the meaning of maps and how they are developed, covering such topics as map layers, GIS tools, mobile mapping, and map animation. Methods chapters take a learn-by-doing approach to help students master application programming interfaces and build other technical skills for creating maps and making them available on the Internet. ...

  13. Mapping ocean tides with satellites - A computer simulation

    Science.gov (United States)

    Won, I. J.; Kuo, J. T.; Jachens, R. C.

    1978-01-01

    As a preliminary study for the future worldwide direct mapping of the open-ocean tide with satellites equipped with precision altimeters, we conducted a simulated study using sets of artificially generated altimeter data constructed from a realistic geoid and four pairs of major tides in the northeastern Pacific Ocean. Recovery of the original geoid and eight tidal maps is accomplished by a space-time, least-squares harmonic analysis scheme. The resultant maps appear fairly satisfactory even when random noises up to ±100 cm are added to altimeter data of sufficient space-time density. The method also produces a refined geoid which is rigorously corrected for the dynamic tides.
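
    At a single location, the space-time least-squares harmonic analysis reduces to fitting cosine/sine pairs at known tidal frequencies plus a constant. A minimal sketch with synthetic data follows; the periods are the standard approximate values for M2, S2, K1 and O1, and everything else is illustrative.

```python
# Sketch: least-squares harmonic analysis of noisy sea-surface heights at
# one point, fitting a constant (geoid offset) plus cos/sin pairs at the
# four major tidal frequencies (synthetic data).
import numpy as np

periods_h = {"M2": 12.4206, "S2": 12.0000, "K1": 23.9345, "O1": 25.8193}
rng = np.random.default_rng(0)

t = rng.uniform(0.0, 24.0 * 60.0, size=2000)           # irregular times (hours)
truth = 1.5 * np.cos(2 * np.pi * t / periods_h["M2"] - 0.8) + 10.0
obs = truth + rng.normal(0.0, 1.0, size=t.size)        # ~ +/- 1 m altimeter noise

cols = [np.ones_like(t)]                               # constant (geoid) term
for T in periods_h.values():
    w = 2 * np.pi / T
    cols += [np.cos(w * t), np.sin(w * t)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, obs, rcond=None)         # least-squares fit

amp_M2 = np.hypot(coef[1], coef[2])                    # recovered M2 amplitude
print(f"recovered M2 amplitude: {amp_M2:.2f} m (true 1.50 m)")
```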

  14. An XML transfer schema for exchange of genomic and genetic mapping data: implementation as a web service in a Taverna workflow.

    Science.gov (United States)

    Paterson, Trevor; Law, Andy

    2009-08-14

    The exchange standard we present here provides a useful generic format for the transfer and integration of genomic and genetic mapping data. The extensibility of our schema allows for the inclusion of additional data and provides a mechanism for typing mapping objects via third-party standards. Web services retrieving GMD-compliant mapping data demonstrate that use of this exchange standard provides a practical mechanism for achieving data integration, by facilitating syntactically and semantically controlled access to the data.

  15. Edge maps: Representing flow with bounded error

    KAUST Repository

    Bhatia, Harsh

    2011-03-01

    Robust analysis of vector fields has been established as an important tool for deriving insights from the complex systems these fields model. Many analysis techniques rely on computing streamlines, a task often hampered by numerical instabilities. Approaches that ignore the resulting errors can lead to inconsistencies that may produce unreliable visualizations and ultimately prevent in-depth analysis. We propose a new representation for vector fields on surfaces that replaces numerical integration through triangles with linear maps defined on its boundary. This representation, called edge maps, is equivalent to computing all possible streamlines at a user defined error threshold. In spite of this error, all the streamlines computed using edge maps will be pairwise disjoint. Furthermore, our representation stores the error explicitly, and thus can be used to produce more informative visualizations. Given a piecewise-linear interpolated vector field, a recent result [15] shows that there are only 23 possible map classes for a triangle, permitting a concise description of flow behaviors. This work describes the details of computing edge maps, provides techniques to quantify and refine edge map error, and gives qualitative and visual comparisons to more traditional techniques. © 2011 IEEE.

  16. Mind the gap! Automated concept map feedback supports students in writing cohesive explanations.

    Science.gov (United States)

    Lachner, Andreas; Burkhart, Christian; Nückles, Matthias

    2017-03-01

    Many students are challenged with the demand of writing cohesive explanations. To support students in writing cohesive explanations, we developed a computer-based feedback tool that visualizes cohesion deficits of students' explanations in a concept map. We conducted three studies to investigate the effectiveness of such feedback as well as the underlying cognitive processes. In Study 1, we found that the concept map helped students identify potential cohesion gaps in their drafts and plan remedial revisions. In Study 2, students with concept map feedback conducted revisions that resulted in more locally and globally cohesive, and also more comprehensible, explanations than the explanations of students who revised without concept map feedback. In Study 3, we largely replicated the findings of Study 2. More importantly, students who had received concept map feedback on a training explanation 1 week later wrote a transfer explanation without feedback that was more cohesive than the explanation of students who had received no feedback on their training explanation. The automated concept map feedback appears to particularly support the evaluation phase of the revision process. Furthermore, the feedback enabled novice writers to acquire sustainable skills in writing cohesive explanations. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. The Effectiveness of Interactive Computer Assisted Modeling in Teaching Study Strategies and Concept Mapping of College Textbook Material.

    Science.gov (United States)

    Mikulecky, Larry

    A study evaluated the effectiveness of a series of print materials and interactive computer-guided study programs designed to lead undergraduate students to apply basic textbook reading and concept mapping strategies to the study of science and social science textbooks. Following field testing with 25 learning skills students, 50 freshman biology…

  18. Microgravity and Charge Transfer in the Neuronal Membrane: Implications for Computational Neurobiology

    Science.gov (United States)

    Wallace, Ron

    1995-01-01

    Evidence from natural and artificial membranes indicates that the neural membrane is a liquid crystal. A liquid-to-gel phase transition caused by the application of superposed electromagnetic fields to the outer membrane surface releases spin-correlated electron pairs which propagate through a charge transfer complex. The propagation generates Rydberg atoms in the lipid bilayer lattice. In the present model, charge density configurations in promoted orbitals interact as cellular automata and perform computations in Hilbert space. Due to the small binding energies of promoted orbitals, their automata are highly sensitive to microgravitational perturbations. It is proposed that spacetime is classical on the Rydberg scale, but formed of contiguous moving segments, each of which displays topological equivalence. This stochasticity is reflected in randomized Riemannian tensor values. Spacetime segments interact with charge automata as components of a computational process. At the termination of the algorithm, an orbital of high probability density is embedded in a more stabilized microscopic spacetime. This state permits the opening of an ion channel and the conversion of a quantum algorithm into a macroscopic frequency code.

  19. Application research of computational mass-transfer differential equation in MBR concentration field simulation.

    Science.gov (United States)

    Li, Chunqing; Tie, Xiaobo; Liang, Kai; Ji, Chanjuan

    2016-01-01

    After conducting intensive research on the distribution of the fluid's velocity and the biochemical reactions in the membrane bioreactor (MBR), this paper introduces the use of the mass-transfer differential equation to simulate the distribution of the chemical oxygen demand (COD) concentration in the MBR membrane pool. The solution proceeds as follows: first, use computational fluid dynamics to establish a flow control equation model of the fluid in the MBR membrane pool; second, calculate this model by direct numerical simulation to obtain the velocity field of the fluid in the membrane pool; third, combine the velocity field data to establish a mass-transfer differential equation model for the concentration field in the MBR membrane pool, and use the Seidel iteration method to solve the equation model; last but not least, substitute the real factory data into the velocity and concentration field models to calculate simulation results, and use the visualization software Tecplot to display the results. Finally, analysis of the nephogram of the COD concentration distribution shows that the simulation result conforms to the distribution rule of the COD concentration in a real membrane pool, and that the mass-transfer phenomenon is affected by the velocity field of the fluid in the membrane pool. The simulation results of this paper have certain reference value for the design optimization of real MBR systems.
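
    A minimal sketch of the Seidel-iteration step described above, applied to a steady one-dimensional advection-diffusion equation for concentration (upwind convection, central diffusion); the velocity and coefficients are illustrative, and the real model is multi-dimensional and driven by the computed velocity field.

```python
# Gauss-Seidel sketch for steady 1-D advection-diffusion:
#   u * dC/dx = D * d2C/dx2, with fixed inlet/outlet concentrations.
import numpy as np

n, L = 51, 1.0
dx = L / (n - 1)
u, D = 0.5, 0.01             # advection velocity, diffusivity (illustrative)
C = np.zeros(n)
C[0], C[-1] = 100.0, 0.0     # inlet / outlet COD concentrations (mg/L)

for it in range(20000):
    max_delta = 0.0
    for i in range(1, n - 1):
        # Upwind convection (u > 0) + central diffusion, solved for C[i]
        new = (D / dx**2 * (C[i + 1] + C[i - 1]) + u / dx * C[i - 1]) \
              / (2 * D / dx**2 + u / dx)
        max_delta = max(max_delta, abs(new - C[i]))
        C[i] = new           # Seidel: use updated values immediately
    if max_delta < 1e-8:
        break

print(f"converged in {it} sweeps; mid-point COD = {C[n // 2]:.2f} mg/L")
```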

  20. [MapDraw: a Microsoft Excel macro for drawing genetic linkage maps based on given genetic linkage data].

    Science.gov (United States)

    Liu, Ren-Hu; Meng, Jin-Ling

    2003-05-01

    MAPMAKER is one of the most widely used software packages for constructing genetic linkage maps. However, the PC version, MAPMAKER 3.0 for PC, cannot draw the genetic linkage maps that its Macintosh version, MAPMAKER 3.0 for Macintosh, can. Especially in recent years, the Macintosh computer has become much less popular than the PC, and most geneticists use PCs to analyze their genetic linkage data, so a program that draws on the PC the same genetic linkage maps that MAPMAKER for Macintosh draws on the Macintosh has long been needed. Microsoft Excel, one component of the Microsoft Office package, is one of the most popular programs for laboratory data processing, and Microsoft Visual Basic for Applications (VBA) is one of its most powerful features. Using this programming language, we can take creative control of Excel, including genetic linkage map construction, automatic data processing and more. In this paper, a Microsoft Excel macro called MapDraw is presented for drawing genetic linkage maps on a PC based on given genetic linkage data. With this software, you can freely construct genetic linkage maps in Excel and freely edit and copy them to Word or other applications. The software is simply an Excel-format file. It can be copied from ftp://211.69.140.177 or ftp://brassica.hzau.edu.cn, and the source code can be found in Excel's Visual Basic Editor.

  1. Heat transfer, velocity-temperature correlation, and turbulent shear stress from Navier-Stokes computations of shock wave/turbulent boundary layer interaction flows

    Science.gov (United States)

    Wang, C. R.; Hingst, W. R.; Porro, A. R.

    1991-01-01

    The properties of 2-D shock wave/turbulent boundary layer interaction flows were calculated by using a compressible turbulent Navier-Stokes numerical computational code. Interaction flows caused by oblique shock wave impingement on the turbulent boundary layer flow were considered. The oblique shock waves were induced with shock generators at angles of attack less than 10 degs in supersonic flows. The surface temperatures were kept at near-adiabatic (ratio of wall static temperature to free stream total temperature) and cold wall (ratio of wall static temperature to free stream total temperature) conditions. The computational results were studied for the surface heat transfer, velocity temperature correlation, and turbulent shear stress in the interaction flow fields. Comparisons of the computational results with existing measurements indicated that (1) the surface heat transfer rates and surface pressures could be correlated with Holden's relationship, (2) the mean flow streamwise velocity components and static temperatures could be correlated with Crocco's relationship if flow separation did not occur, and (3) the Baldwin-Lomax turbulence model should be modified for turbulent shear stress computations in the interaction flows.
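
    One common statement of the Crocco relationship used for such velocity-temperature correlations is the quadratic Crocco-Busemann form (attached flow, Prandtl number near unity; subscripts w, aw and e denote wall, adiabatic-wall and boundary-layer-edge values):

```latex
% Quadratic Crocco-Busemann velocity-temperature correlation:
T \;=\; T_w \;+\; \bigl(T_{aw} - T_w\bigr)\,\frac{u}{u_e}
      \;+\; \bigl(T_e - T_{aw}\bigr)\left(\frac{u}{u_e}\right)^{2},
% which recovers T = T_w at u = 0 and T = T_e at u = u_e.
```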

  2. A Flexible Computational Framework Using R and Map-Reduce for Permutation Tests of Massive Genetic Analysis of Complex Traits.

    Science.gov (United States)

    Mahjani, Behrang; Toor, Salman; Nettelblad, Carl; Holmgren, Sverker

    2017-01-01

    In quantitative trait locus (QTL) mapping, the significance of putative QTL is often determined using permutation testing. The computational needs to calculate the significance level are immense; 10^4 up to 10^8 or even more permutations can be needed. We have previously introduced the PruneDIRECT algorithm for multiple QTL scans with epistatic interactions. This algorithm has specific strengths for permutation testing. Here, we present a flexible, parallel computing framework for identifying multiple interacting QTL using the PruneDIRECT algorithm, which uses the map-reduce model as implemented in Hadoop. The framework is implemented in R, a widely used software tool among geneticists. This enables users to rearrange algorithmic steps to adapt genetic models, search algorithms, and parallelization steps to their needs in a flexible way. Our work underlines the maturity of accessing distributed parallel computing for computationally demanding bioinformatics applications through building workflows within existing scientific environments. We investigate the PruneDIRECT algorithm, comparing its performance to exhaustive search and the DIRECT algorithm, using our framework on a public cloud resource. We find that PruneDIRECT is vastly superior for permutation testing, and performed 2 × 10^5 permutations for a 2D QTL problem in 15 hours, using 100 cloud processes. We show that our framework scales out almost linearly for a 3D QTL search.
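
    The permutation-testing workload itself is easy to sketch: shuffle the phenotype, rescan all markers, and keep the maximum statistic to build the null distribution for a genome-wide threshold. The toy scan below uses a maximum squared correlation in place of PruneDIRECT's search; sizes and data are illustrative.

```python
# Hedged sketch of permutation testing for a QTL scan (not PruneDIRECT):
# each permutation breaks the genotype-phenotype link, and the maximum
# statistic over all markers builds the genome-wide null distribution.
import numpy as np

rng = np.random.default_rng(1)
n_ind, n_markers = 200, 500
genotypes = rng.integers(0, 2, size=(n_ind, n_markers)).astype(float)
phenotype = rng.normal(size=n_ind)

def max_statistic(y, G):
    # Maximum squared correlation between phenotype and any marker
    yc = (y - y.mean()) / y.std()
    Gc = (G - G.mean(axis=0)) / G.std(axis=0)
    r = yc @ Gc / len(y)
    return float(np.max(r**2))

null = np.array([max_statistic(rng.permutation(phenotype), genotypes)
                 for _ in range(1000)])      # 10^4-10^8 in real studies
threshold = np.quantile(null, 0.95)          # 5% genome-wide threshold
print(f"observed: {max_statistic(phenotype, genotypes):.4f}, "
      f"95% null threshold: {threshold:.4f}")
```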

  3. Development of eSSR-Markers in Setaria italica and Their Applicability in Studying Genetic Diversity, Cross-Transferability and Comparative Mapping in Millet and Non-Millet Species.

    Directory of Open Access Journals (Sweden)

    Kajal Kumari

    Full Text Available Foxtail millet (Setaria italica L.) is a tractable experimental model crop for studying functional genomics of millets and bioenergy grasses. But the limited availability of genomic resources, particularly expressed sequence-based genic markers, is significantly impeding its genetic improvement. Considering this, we attempted to develop EST-derived SSR (eSSR) markers and utilize them in germplasm characterization, cross-genera transferability and in silico comparative mapping. From 66,027 foxtail millet EST sequences, 24,828 non-redundant ESTs were deduced, representing ~16 Mb, which revealed 534 (~2%) eSSRs in 495 SSR-containing ESTs at a frequency of 1/30 kb. A total of 447 primer pairs were successfully designed, of which 327 were mapped physically onto the nine chromosomes. About 106 selected primer pairs representing the foxtail millet genome showed a high level of cross-genera amplification, at an average of ~88%, in eight millet and four non-millet species. The broad range of genetic diversity (0.02-0.65) obtained in a phylogenetic tree constructed using 40 eSSR markers demonstrated their utility in germplasm characterization and phylogenetics. Comparative mapping of the physically mapped eSSR markers showed a considerable proportion of sequence-based orthology and syntenic relationships between foxtail millet chromosomes and sorghum (~68%), maize (~61%) and rice (~42%) chromosomes. Synteny analysis of the eSSRs of foxtail millet, rice, maize and sorghum suggested the nested chromosome fusions frequently observed in grass genomes. Thus, for the first time, we have generated large-scale eSSR markers in foxtail millet and demonstrated their utility in germplasm characterization, transferability, phylogenetics and comparative mapping studies in millets and bioenergy grass species.

  4. DATA TRANSFER FROM A DEC PDP-11 BASED MASS-SPECTROMETRY DATA STATION TO AN MS-DOS PERSONAL-COMPUTER

    NARCIS (Netherlands)

    RAFFAELLI, A; BRUINS, AP

    This paper describes a simple procedure for obtaining better-quality graphic output for mass spectrometry data from data systems equipped with poor-quality printing devices. The procedure uses KERMIT, a low-cost public domain program, to transfer ASCII tables to an MS-DOS personal computer, where ...

  5. Prediction of the 21-cm signal from reionization: comparison between 3D and 1D radiative transfer schemes

    Science.gov (United States)

    Ghara, Raghunath; Mellema, Garrelt; Giri, Sambit K.; Choudhury, T. Roy; Datta, Kanan K.; Majumdar, Suman

    2018-05-01

    Three-dimensional radiative transfer simulations of the epoch of reionization can produce realistic results, but are computationally expensive. On the other hand, simulations relying on one-dimensional radiative transfer solutions are faster but limited in accuracy due to their more approximate nature. Here, we compare the performance of the reionization simulation codes GRIZZLY and C2-RAY, which use 1D and 3D radiative transfer schemes, respectively. The comparison is performed using the same cosmological density fields, halo catalogues, and source properties. We find that the ionization maps, as well as the 21-cm signal maps, from these two simulations are very similar even for complex scenarios which include thermal feedback on low-mass haloes. In statistical quantities such as the power spectrum of the brightness temperature fluctuations, the two schemes agree to within 10 per cent throughout the entire reionization history. GRIZZLY seems to perform slightly better than the seminumerical approaches considered in Majumdar et al., which are based on the excursion set principle. We argue that GRIZZLY can be efficiently used for exploring parameter space, establishing observation strategies, and estimating parameters from 21-cm observations.
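
    For reference, the quantity mapped in such simulations is the differential 21-cm brightness temperature, which in the usual high-spin-temperature limit (peculiar velocities neglected) takes the standard form below; this expression is textbook material rather than anything specific to GRIZZLY or C2-RAY:

```latex
% Differential 21-cm brightness temperature against the CMB; x_HI is the
% neutral hydrogen fraction, \delta the baryon overdensity, z the redshift:
\delta T_b \;\approx\; 27\, x_{\rm HI}\,(1+\delta)
\left(\frac{\Omega_b h^2}{0.023}\right)
\left(\frac{0.15}{\Omega_m h^2}\,\frac{1+z}{10}\right)^{1/2} \mathrm{mK} .
```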

  6. Efficient and Adaptive Methods for Computing Accurate Potential Surfaces for Quantum Nuclear Effects: Applications to Hydrogen-Transfer Reactions.

    Science.gov (United States)

    DeGregorio, Nicole; Iyengar, Srinivasan S

    2018-01-09

    We present two sampling measures to gauge critical regions of potential energy surfaces. These sampling measures employ (a) the instantaneous quantum wavepacket density, (b) an approximation to the potential surface, (c) its gradients, and (d) a Shannon information theory based expression that estimates the local entropy associated with the quantum wavepacket. These four criteria together enable a directed sampling of potential surfaces that appears to correctly describe the local oscillation frequencies, or the local Nyquist frequency, of a potential surface. The sampling functions are then utilized to derive a tessellation scheme that discretizes the multidimensional space to enable efficient sampling of potential surfaces. The sampled potential surface is then combined with four different interpolation procedures, namely, (a) local Hermite curve interpolation, (b) low-pass filtered Lagrange interpolation, (c) the monomial symmetrization approximation (MSA) developed by Bowman and co-workers, and (d) a modified Shepard algorithm. The sampling procedure and the fitting schemes are used to (a) compute potential surfaces in highly anharmonic hydrogen-bonded systems and (b) study hydrogen-transfer reactions in biogenic volatile organic compounds (isoprene), where the transferring hydrogen atom is found to demonstrate critical quantum nuclear effects. In the case of isoprene, the algorithm discussed here is used to derive multidimensional potential surfaces along a hydrogen-transfer reaction path to gauge the effect of quantum-nuclear degrees of freedom on the hydrogen-transfer process. Based on the decreased computational effort, facilitated by the optimal sampling of the potential surfaces through the use of sampling functions discussed here, and the accuracy of the associated potential surfaces, we believe the method will find great utility in the study of quantum nuclear dynamics problems, of which application to hydrogen-transfer reactions and hydrogen ...

  7. Map projections cartographic information systems

    CERN Document Server

    Grafarend, Erik W; Syffus, Rainer

    2014-01-01

    This book offers a timely review of map projections, including the sphere, ellipsoid, rotational surfaces, and geodetic datum transformations. Coverage includes computer vision and remote sensing space projective mappings in photogrammetry.

  8. Numerical Computation of Underground Inundation in Multiple Layers Using the Adaptive Transfer Method

    Directory of Open Access Journals (Sweden)

    Hyung-Jun Kim

    2018-01-01

    Full Text Available Extreme rainfall causes surface runoff to flow towards lowlands and subterranean facilities, such as subway stations and buildings with underground spaces in densely packed urban areas. These facilities and areas are therefore vulnerable to catastrophic submergence. However, flood modeling of underground space has not yet been adequately studied because there are difficulties in reproducing the associated multiple horizontal layers connected with staircases or elevators. This study proposes a convenient approach to simulate underground inundation when two layers are connected. The main facet of this approach is to compute the flow flux passing through staircases in an upper layer and to transfer the equivalent quantity to a lower layer. This is defined as the ‘adaptive transfer method’. This method overcomes the limitations of 2D modeling by introducing layers connecting concepts to prevent large variations in mesh sizes caused by complicated underlying obstacles or local details. Consequently, this study aims to contribute to the numerical analysis of flow in inundated underground spaces with multiple floors.
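
    A minimal sketch of the transfer idea follows, assuming a weir-type closure for the staircase flux; the discharge coefficient and geometry are illustrative assumptions, not the paper's exact formulation.

```python
# Hedged sketch of the 'adaptive transfer method' idea: estimate the flux
# through a staircase opening on the upper layer with a broad-crested
# weir-type formula, and move the equivalent volume to the lower layer
# each time step (coefficients and geometry are illustrative).
import math

G, CD = 9.81, 0.6                              # gravity, discharge coefficient

def staircase_flux(h_upper, width):
    # Discharge (m^3/s) through an opening of given width at depth h_upper
    if h_upper <= 0.0:
        return 0.0
    return (2.0 / 3.0) * CD * width * math.sqrt(2.0 * G) * h_upper ** 1.5

def transfer(vol_upper, vol_lower, area_upper, width, dt):
    h = vol_upper / area_upper                 # water depth on the upper floor
    dv = min(staircase_flux(h, width) * dt, vol_upper)
    return vol_upper - dv, vol_lower + dv      # total volume is conserved

vu, vl = 50.0, 0.0                             # m^3 on each layer
for _ in range(60):                            # one minute with dt = 1 s
    vu, vl = transfer(vu, vl, area_upper=400.0, width=2.0, dt=1.0)
print(f"upper: {vu:.1f} m^3, lower: {vl:.1f} m^3")
```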

  9. Networking for large-scale science: infrastructure, provisioning, transport and application mapping

    International Nuclear Information System (INIS)

    Rao, Nageswara S; Carter, Steven M; Wu Qishi; Wing, William R; Zhu Mengxia; Mezzacappa, Anthony; Veeraraghavan, Malathi; Blondin, John M

    2005-01-01

    Large-scale science computations and experiments require unprecedented network capabilities in the form of large bandwidth and dynamically stable connections to support data transfers, interactive visualizations, and monitoring and steering operations. A number of component technologies dealing with the infrastructure, provisioning, transport and application mappings must be developed and/or optimized to achieve these capabilities. We present a brief account of the following technologies that contribute toward achieving these network capabilities: (a) DOE UltraScienceNet and NSF CHEETAH network testbeds that provide on-demand and scheduled dedicated network connections; (b) experimental results on transport protocols that achieve close to 100% utilization on dedicated 1Gbps wide-area channels; (c) a scheme for optimally mapping a visualization pipeline onto a network to minimize the end-to-end delays; and (d) interconnect configuration and protocols that provides multiple Gbps flows from Cray X1 to external hosts

  10. Networking for large-scale science: infrastructure, provisioning, transport and application mapping

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S [Computer Science and Mathematics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Carter, Steven M [Computer Science and Mathematics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Wu Qishi [Computer Science and Mathematics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Wing, William R [Computer Science and Mathematics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Zhu Mengxia [Department of Computer Science, Louisiana State University, Baton Rouge, LA 70803 (United States); Mezzacappa, Anthony [Physics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Veeraraghavan, Malathi [Department of Computer Science, University of Virginia, Charlottesville, VA 22904 (United States); Blondin, John M [Department of Physics, North Carolina State University, Raleigh, NC 27695 (United States)

    2005-01-01

    Large-scale science computations and experiments require unprecedented network capabilities in the form of large bandwidth and dynamically stable connections to support data transfers, interactive visualizations, and monitoring and steering operations. A number of component technologies dealing with the infrastructure, provisioning, transport and application mappings must be developed and/or optimized to achieve these capabilities. We present a brief account of the following technologies that contribute toward achieving these network capabilities: (a) DOE UltraScienceNet and NSF CHEETAH network testbeds that provide on-demand and scheduled dedicated network connections; (b) experimental results on transport protocols that achieve close to 100% utilization on dedicated 1Gbps wide-area channels; (c) a scheme for optimally mapping a visualization pipeline onto a network to minimize the end-to-end delays; and (d) interconnect configuration and protocols that provides multiple Gbps flows from Cray X1 to external hosts.

  11. The visual attention saliency map for movie retrospection

    Science.gov (United States)

    Rogalska, Anna; Napieralski, Piotr

    2018-04-01

    The visual saliency map is becoming important and challenging for many scientific disciplines (robotic systems, psychophysics, cognitive neuroscience and computer science). The map created by the model indicates possible salient regions by taking into consideration face presence and motion, which are essential in motion pictures. By combining these cues we can obtain a credible saliency map at a low computational cost.
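
    A minimal sketch of the cue fusion implied above: normalize per-pixel face and motion maps and blend them with fixed weights; the weights and the random stand-in maps are assumptions for illustration.

```python
# Sketch: fuse a motion cue map and a face cue map into one saliency map.
import numpy as np

def normalize(m):
    # Rescale a cue map to [0, 1] (zero map stays zero)
    m = m.astype(float)
    span = m.max() - m.min()
    return (m - m.min()) / span if span > 0 else np.zeros_like(m)

def combine(motion_map, face_map, w_motion=0.6, w_face=0.4):
    # Weighted per-pixel fusion of the two normalized cue maps
    return w_motion * normalize(motion_map) + w_face * normalize(face_map)

motion = np.random.rand(4, 4)            # stand-in for a real motion map
face = np.zeros((4, 4)); face[1, 2] = 1.0  # stand-in face-detection map
print(combine(motion, face).round(2))
```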

  12. Experimental and computational investigations of heat and mass transfer of intensifier grids

    International Nuclear Information System (INIS)

    Kobzar, Leonid; Oleksyuk, Dmitry; Semchenkov, Yuriy

    2015-01-01

    The paper discusses experimental and numerical investigations on the intensification of thermal and mass exchange performed by the National Research Centre "Kurchatov Institute" over the past years. Recently, many designs of heat and mass transfer intensifier grids have been proposed. NRC "Kurchatov Institute" has accomplished a large scope of experimental investigations to study the efficiency of intensifier grids of various types. The outcomes of the experimental investigations can be used in the verification of computational models and codes. On the basis of the experimental data, we derived correlations to calculate coolant mixing and critical heat flux in rod bundles equipped with intensifier grids. The acquired correlations were integrated in the subchannel code SC-INT.

  13. Membrane Transfer from Mononuclear Cells to Polymorphonuclear Neutrophils Transduces Cell Survival and Activation Signals in the Recipient Cells via Anti-Extrinsic Apoptotic and MAP Kinase Signaling Pathways.

    Science.gov (United States)

    Li, Ko-Jen; Wu, Cheng-Han; Shen, Chieh-Yu; Kuo, Yu-Min; Yu, Chia-Li; Hsieh, Song-Chou

    2016-01-01

    The biological significance of membrane transfer (trogocytosis) between polymorphonuclear neutrophils (PMNs) and mononuclear cells (MNCs) remains unclear. We investigated the biological/immunological effects and molecular basis of trogocytosis among various immune cells in healthy individuals and patients with active systemic lupus erythematosus (SLE). By flow cytometry, we determined that molecules in the immunological synapse, including HLA class-I and -II, CD11b and LFA-1, along with CXCR1, are exchanged among autologous PMNs, CD4+ T cells, and U937 cells (monocytes) after cell-cell contact. Small interfering RNA knockdown of the integrin adhesion molecule CD11a in U937 unexpectedly enhanced the level of total membrane transfer from U937 to PMN cells. Functionally, phagocytosis and IL-8 production by PMNs were enhanced after co-culture with T cells. Total membrane transfer from CD4+ T cells to PMNs delayed PMN apoptosis by suppressing the extrinsic apoptotic molecules, BAX, MYC and caspase 8. This enhancement of the activities of PMNs by T cells was found to be mediated via p38- and P44/42-Akt-MAP kinase pathways and inhibited by the actin-polymerization inhibitor, latrunculin B, the clathrin inhibitor, Pitstop-2, and human immunoglobulin G, but not by the caveolin inhibitor, methyl-β-cyclodextrin. In addition, membrane transfer from PMNs enhanced IL-2 production by recipient anti-CD3/anti-CD28 activated MNCs, and this was suppressed by inhibitors of mitogen-activated protein kinase (PD98059) and protein kinase C (Rottlerin). Of clinical significance, decreased total membrane transfer from PMNs to MNCs in patients with active SLE suppressed mononuclear IL-2 production. In conclusion, membrane transfer from MNCs to PMNs, mainly at the immunological synapse, transduces survival and activation signals to enhance PMN functions and is dependent on actin polymerization, clathrin activation, and Fcγ receptors, while membrane transfer from PMNs to MNCs depends on MAP kinase and ...

  14. Membrane Transfer from Mononuclear Cells to Polymorphonuclear Neutrophils Transduces Cell Survival and Activation Signals in the Recipient Cells via Anti-Extrinsic Apoptotic and MAP Kinase Signaling Pathways.

    Directory of Open Access Journals (Sweden)

    Ko-Jen Li

    Full Text Available The biological significance of membrane transfer (trogocytosis) between polymorphonuclear neutrophils (PMNs) and mononuclear cells (MNCs) remains unclear. We investigated the biological/immunological effects and molecular basis of trogocytosis among various immune cells in healthy individuals and patients with active systemic lupus erythematosus (SLE). By flow cytometry, we determined that molecules in the immunological synapse, including HLA class-I and -II, CD11b and LFA-1, along with CXCR1, are exchanged among autologous PMNs, CD4+ T cells, and U937 cells (monocytes) after cell-cell contact. Small interfering RNA knockdown of the integrin adhesion molecule CD11a in U937 unexpectedly enhanced the level of total membrane transfer from U937 to PMN cells. Functionally, phagocytosis and IL-8 production by PMNs were enhanced after co-culture with T cells. Total membrane transfer from CD4+ T cells to PMNs delayed PMN apoptosis by suppressing the extrinsic apoptotic molecules, BAX, MYC and caspase 8. This enhancement of the activities of PMNs by T cells was found to be mediated via p38- and P44/42-Akt-MAP kinase pathways and inhibited by the actin-polymerization inhibitor, latrunculin B, the clathrin inhibitor, Pitstop-2, and human immunoglobulin G, but not by the caveolin inhibitor, methyl-β-cyclodextrin. In addition, membrane transfer from PMNs enhanced IL-2 production by recipient anti-CD3/anti-CD28 activated MNCs, and this was suppressed by inhibitors of mitogen-activated protein kinase (PD98059) and protein kinase C (Rottlerin). Of clinical significance, decreased total membrane transfer from PMNs to MNCs in patients with active SLE suppressed mononuclear IL-2 production. In conclusion, membrane transfer from MNCs to PMNs, mainly at the immunological synapse, transduces survival and activation signals to enhance PMN functions and is dependent on actin polymerization, clathrin activation, and Fcγ receptors, while membrane transfer from PMNs to MNCs depends on ...

  15. Processing computed tomography images by using personal computer

    International Nuclear Information System (INIS)

    Seto, Kazuhiko; Fujishiro, Kazuo; Seki, Hirofumi; Yamamoto, Tetsuo.

    1994-01-01

    Processing of CT images was attempted using a popular personal computer. The image-processing program was written with a C compiler. The original images, acquired with a CT scanner (TCT-60A, Toshiba), were transferred to the computer on 8-inch flexible diskettes. Many fundamental image-processing operations were implemented, such as displaying an image on the monitor, calculating CT values and drawing profile curves. The results showed that a popular personal computer has the ability to process CT images. It seemed that the 8-inch flexible diskette was still a useful medium for transferring image data. (author)
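
    Two of the operations mentioned, computing CT values and drawing a profile curve, reduce to simple array manipulations; a sketch using the usual linear rescale convention (slope and intercept assumed) follows.

```python
# Sketch: convert stored pixel values to CT numbers (Hounsfield units)
# with a linear rescale, then extract a profile curve along one row.
import numpy as np

raw = np.random.randint(0, 4096, size=(256, 256))   # stand-in for a CT slice
slope, intercept = 1.0, -1024.0                     # assumed rescale parameters

hu = slope * raw + intercept                        # CT value (HU) per pixel
profile = hu[128, :]                                # profile curve along row 128

print("CT value at centre:", hu[128, 128])
print("profile min/max:", profile.min(), profile.max())
```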

  16. Development and adaptation of conduction and radiation heat-transfer computer codes for the CFTL

    International Nuclear Information System (INIS)

    Conklin, J.C.

    1981-08-01

    RODCON and HOTTEL are two computational methods used to calculate thermal and radiation heat transfer for the Core Flow Test Loop (CFTL) analysis efforts. RODCON was developed at ORNL to calculate the internal temperature distribution of the fuel rod simulator (FRS) for the CFTL. RODCON solves the time-dependent heat transfer equation in two-dimensional (r, θ) cylindrical coordinates at an axial plane with user-specified radial material zones and time- and position-variant surface conditions at the FRS periphery. Symmetry of the FRS periphery boundary conditions is not necessary. The governing elliptic, partial differential heat equation is cast into a fully implicit, finite-difference form by approximating the derivatives with a forward-differencing scheme with variable mesh spacing. The heat conduction path is circumferentially complete, and the potential mathematical problem at the rod center can be effectively ignored. HOTTEL is a revision of an algorithm developed by C.B. Baxi at the General Atomic Company (GAC) to be used in calculating radiation heat transfer in a rod bundle enclosed in a hexagonal duct. HOTTEL uses geometric view factors, surface emissivities, and surface areas to calculate the gray-body or composite view factors in an enclosure having multiple reflections in a nonparticipating medium.
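
    The abstract above describes a fully implicit finite-difference treatment of conduction. As a rough illustration of that class of method (a minimal sketch, not RODCON itself; the geometry, properties and boundary conditions below are invented), the following solves one implicit time step of 1-D radial conduction with a tridiagonal banded solver:

    import numpy as np
    from scipy.linalg import solve_banded

    # Hypothetical illustration, not the RODCON source: one implicit time
    # step of dT/dt = alpha*(1/r)*d/dr(r*dT/dr) + q, central differences,
    # solved as a tridiagonal system.
    def implicit_radial_step(T, r, dr, dt, alpha, q=0.0):
        n = len(T)
        lam = alpha * dt / dr**2
        ab = np.zeros((3, n))              # banded (tridiagonal) matrix
        rhs = T + dt * q                   # q: scaled volumetric heating
        for i in range(1, n - 1):
            w = lam * (1.0 - dr / (2.0 * r[i]))   # inner-neighbor coupling
            e = lam * (1.0 + dr / (2.0 * r[i]))   # outer-neighbor coupling
            ab[0, i + 1] = -e              # superdiagonal entry a[i, i+1]
            ab[1, i] = 1.0 + w + e         # main diagonal entry a[i, i]
            ab[2, i - 1] = -w              # subdiagonal entry a[i, i-1]
        ab[1, 0] = ab[1, n - 1] = 1.0      # fixed-temperature boundary rows
        rhs[0], rhs[-1] = T[0], T[-1]
        return solve_banded((1, 1), ab, rhs)

    r = np.linspace(1e-3, 1e-2, 50)        # radial mesh of the rod [m]
    T = np.full(50, 600.0); T[-1] = 550.0  # initial field, cooled surface [K]
    T = implicit_radial_step(T, r, r[1] - r[0], dt=0.01, alpha=1e-5, q=5.0)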

  17. Computational and experimental study of the effect of mass transfer on liquid jet break-up

    Science.gov (United States)

    Schetz, J. A.; Situ, M.

    1983-06-01

    A computational method has been developed to predict the effect of mass transfer on liquid jet break-up in coaxial, low velocity gas streams. Two conditions, both with and without the effect of mass transfer on the jet break-up, are calculated, and compared with experimental results and the classical linear theory. Methanol and water were used as the injectants. The numerical solution can predict the instantaneous shape of the jet surface and the break-up time, and it is very close to the experimental results. The numerical solutions and the experimental results both indicate that the wave number of the maximum instability is about 6.9, higher than 4.51 which was predicted by Rayleigh's linear theory. The experimental results and numerical solution show that the growth of the amplitude of the trough is faster than the growth of the amplitude of the crest, especially for a rapidly vaporizing jet. The numerical solutions show that for the small rates of evaporation, the effect of the mass transfer on the interface has a stabilizing effect near the wave number for maximum instability. Inversely, it has a destabilizing effect far from the wave number for maximum instability. For rapid evaporation, the effect of the mass transfer always has a destabilizing effect and decreases the break-up time of the jet.
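
    For context, the classical reference point quoted above can be recomputed from Rayleigh's linear dispersion relation for a capillary jet. The minimal sketch below (standard textbook relation, not the paper's code) recovers the most-unstable wavelength-to-diameter ratio of about 4.51:

    import numpy as np
    from scipy.special import iv  # modified Bessel functions I_n

    # Rayleigh's dispersion relation for a capillary jet: the growth rate
    # squared is proportional to x*I1(x)/I0(x)*(1 - x^2), with x = k*a
    # (a = jet radius, k = wave number).
    x = np.linspace(1e-4, 0.9999, 2000)
    growth2 = x * iv(1, x) / iv(0, x) * (1.0 - x**2)   # scaled omega^2
    x_max = x[np.argmax(growth2)]
    print(f"most unstable k*a      = {x_max:.3f}")        # ~0.697
    print(f"wavelength / diameter  = {np.pi / x_max:.2f}")  # ~4.51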

  18. A computationally efficient method for full-core conjugate heat transfer modeling of sodium fast reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Rui, E-mail: rhu@anl.gov; Yu, Yiqi

    2016-11-15

    Highlights: • Developed a computationally efficient method for full-core conjugate heat transfer modeling of sodium fast reactors. • Applied fully-coupled JFNK solution scheme to avoid the operator-splitting errors. • The accuracy and efficiency of the method is confirmed with a 7-assembly test problem. • The effects of different spatial discretization schemes are investigated and compared to the RANS-based CFD simulations. - Abstract: For efficient and accurate temperature predictions of sodium fast reactor structures, a 3-D full-core conjugate heat transfer modeling capability is developed for an advanced system analysis tool, SAM. The hexagon lattice core is modeled with 1-D parallel channels representing the subassembly flow, and 2-D duct walls and inter-assembly gaps. The six sides of the hexagon duct wall and near-wall coolant region are modeled separately to account for different temperatures and heat transfer between coolant flow and each side of the duct wall. The Jacobian Free Newton Krylov (JFNK) solution method is applied to solve the fluid and solid field simultaneously in a fully coupled fashion. The 3-D full-core conjugate heat transfer modeling capability in SAM has been demonstrated by a verification test problem with 7 fuel assemblies in a hexagon lattice layout. Additionally, the SAM simulation results are compared with RANS-based CFD simulations. Very good agreements have been achieved between the results of the two approaches.
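
    A minimal sketch of the JFNK pattern named above, applied to an invented toy fluid/wall energy balance (SAM's actual equations and solver settings are not reproduced): scipy's newton_krylov only ever evaluates the residual, so no Jacobian matrix is formed explicitly.

    import numpy as np
    from scipy.optimize import newton_krylov

    # Illustrative only: a tiny coupled fluid/solid steady-state system
    # solved fully coupled, avoiding operator-splitting errors.
    def residual(u):
        Tf, Tw = u[:3], u[3:]            # coolant and wall temperatures
        q, h = 1.0, 0.5                  # assumed heating and wall coupling
        r_f = q - h * (Tf - Tw) - 0.1 * Tf          # fluid energy balance
        r_w = h * (Tf - Tw) - 0.05 * (Tw - 300.0)   # wall conduction to sink
        return np.concatenate([r_f, r_w])

    u0 = np.full(6, 350.0)               # initial guess for all unknowns
    sol = newton_krylov(residual, u0, f_tol=1e-10)
    print(sol[:3], sol[3:])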

  19. Primary Issues of Mixed Convection Heat Transfer Phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Myeong-Seon; Chung, Bum-Jin [Kyung Hee University, Yongin (Korea, Republic of)

    2015-10-15

    The computer codes analyzing system operating and transient behavior must distinguish the flow conditions corresponding to the different convective heat transfer regimes, and proper correlations must be supplied for those regimes. However, the existing safety analysis codes are focused on Light Water Reactors, and their applicability to GCRs (Gas Cooled Reactors) is questionable. One of the technical issues raised by the development of the VHTR is mixed convection, which occurs when the driving forces of both forced and natural convection are of comparable magnitudes. It can be encountered, for example, in the channels of the stacked fuel elements and in the decay heat removal system of a VHTR. Mixed convection is not an intermediate state between natural and forced convection but an independent, complicated phenomenon. Therefore, many researchers have studied it and propounded some primary issues concerning mixed convection phenomena. This paper discusses some problems identified through reviewing the literature on mixed convection phenomena, and primary issues of mixed convection heat transfer are proposed with respect to thermal-hydraulic problems of the VHTR. The VHTR thermal-hydraulic study requires an in-depth study of mixed convection phenomena. In this study we reviewed the classical flow regime map of Metais and Eckert and derived further issues to be considered. The following issues were raised: (1) Buoyancy-aided and -opposed flows were not differentiated and were plotted in one map. (2) Experimental results for UWT and UHF conditions were also plotted in the same map without differentiation. (3) The buoyancy coefficient was not generalized for correlating the data. (4) The analysis of laminarization and re-turbulization as buoyancy effects in turbulent mixed convection was not established. (5) Defining the transition in the mixed convection regime was difficult.
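
    As a back-of-the-envelope complement to such regime maps, the simplest screen for mixed convection compares buoyancy and inertia through the Richardson number Ri = Gr/Re². The sketch below uses illustrative cutoffs (0.1 and 10), which are assumptions for demonstration, not the Metais-Eckert boundaries:

    # Rough first-order regime screen; cutoff values are illustrative.
    def convection_regime(Gr, Re, low=0.1, high=10.0):
        Ri = Gr / Re**2                 # Richardson number
        if Ri < low:
            return "forced convection dominant"
        if Ri > high:
            return "natural convection dominant"
        return "mixed convection"

    print(convection_regime(Gr=1e8, Re=1e4))   # Ri = 1 -> mixed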

  20. A Computational Fluid Dynamic and Heat Transfer Model for Gaseous Core and Gas Cooled Space Power and Propulsion Reactors

    Science.gov (United States)

    Anghaie, S.; Chen, G.

    1996-01-01

    A computational model based on the axisymmetric, thin-layer Navier-Stokes equations is developed to predict the convective, radiation and conductive heat transfer in high temperature space nuclear reactors. An implicit-explicit, finite volume, MacCormack method in conjunction with the Gauss-Seidel line iteration procedure is utilized to solve the thermal and fluid governing equations. Simulation of coolant and propellant flows in these reactors involves the subsonic and supersonic flows of hydrogen, helium and uranium tetrafluoride under variable boundary conditions. An enthalpy-rebalancing scheme is developed and implemented to enhance and accelerate the rate of convergence when a wall heat flux boundary condition is used. The model also incorporates the Baldwin and Lomax two-layer algebraic turbulence scheme for the calculation of the turbulent kinetic energy and eddy diffusivity of energy. The Rosseland diffusion approximation is used to simulate the radiative energy transfer in the optically thick environment of gas core reactors. The computational model is benchmarked with experimental data on flow separation angle and drag force acting on a suspended sphere in a cylindrical tube. The heat transfer is validated by comparing the computed results with the standard heat transfer correlations' predictions. The model is used to simulate flow and heat transfer under a variety of design conditions. The effect of internal heat generation on the heat transfer in the gas core reactors is examined for a variety of power densities, 100 W/cc, 500 W/cc and 1000 W/cc. The maximum temperatures corresponding to these heat generation rates are 2150 K, 2750 K and 3550 K, respectively. This analysis shows that the maximum temperature is strongly dependent on the value of the heat generation rate. It also indicates that a heat generation rate higher than 1000 W/cc is necessary to maintain the gas temperature at about 3500 K, which is the typical design temperature required to achieve high
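
    The Rosseland diffusion approximation mentioned above treats radiation in an optically thick medium as an added conduction term with effective conductivity k_r = 16σT³/(3β_R). A minimal sketch (the numbers are illustrative assumptions, not values from the study):

    # Rosseland diffusion approximation: fold radiation into an effective
    # conductivity k_r = 16*sigma*T^3 / (3*beta_R), added to the molecular one.
    SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant [W m^-2 K^-4]

    def rosseland_conductivity(T, beta_R):
        """T: temperature [K]; beta_R: Rosseland mean extinction coeff [1/m]."""
        return 16.0 * SIGMA * T**3 / (3.0 * beta_R)

    # Illustrative numbers only: a 3500 K gas with beta_R = 100 1/m.
    print(rosseland_conductivity(3500.0, 100.0), "W/m-K")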

  1. An XML transfer schema for exchange of genomic and genetic mapping data: implementation as a web service in a Taverna workflow

    Directory of Open Access Journals (Sweden)

    Law Andy

    2009-08-01

    data retrieval into Taverna workflows. Conclusion The data exchange standard we present here provides a useful generic format for transfer and integration of genomic and genetic mapping data. The extensibility of our schema allows for inclusion of additional data and provides a mechanism for typing mapping objects via third party standards. Web services retrieving GMD-compliant mapping data demonstrate that use of this exchange standard provides a practical mechanism for achieving data integration, by facilitating syntactically and semantically-controlled access to the data.
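
    A sketch of what consuming such an exchange document can look like in practice. The element and attribute names below are invented stand-ins for illustration, not the published GMD schema:

    import xml.etree.ElementTree as ET

    # Hypothetical fragment loosely in the spirit of a mapping-data
    # exchange format; tag and attribute names are assumptions.
    doc = """
    <mapset species="Gallus gallus">
      <map name="chr1" type="genetic" units="cM">
        <locus id="EST0001" position="12.5"/>
        <locus id="EST0002" position="47.3"/>
      </map>
    </mapset>
    """
    root = ET.fromstring(doc.strip())
    for m in root.findall("map"):
        for locus in m.findall("locus"):
            print(m.get("name"), locus.get("id"), locus.get("position"))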

  2. The visual attention saliency map for movie retrospection

    Directory of Open Access Journals (Sweden)

    Rogalska Anna

    2018-04-01

    Full Text Available The visual saliency map is becoming important and challenging for many scientific disciplines (robotic systems, psychophysics, cognitive neuroscience and computer science). The map created by the model indicates possible salient regions by taking into consideration face presence and motion, which are essential in motion pictures. By combining these cues we can obtain a credible saliency map at a low computational cost.
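
    A minimal sketch of the combination step implied above (the cue weights are invented for illustration; the paper's actual model is not reproduced here):

    import numpy as np

    # Normalize per-cue saliency maps (static, face, motion) and blend
    # them with fixed weights; weights below are assumptions.
    def combine_saliency(maps, weights):
        total = np.zeros_like(next(iter(maps.values())), dtype=float)
        for name, w in weights.items():
            m = maps[name].astype(float)
            m = (m - m.min()) / (np.ptp(m) + 1e-9)   # normalize to [0, 1]
            total += w * m
        return total / sum(weights.values())

    h, w = 72, 128
    maps = {"static": np.random.rand(h, w),
            "face":   np.random.rand(h, w),
            "motion": np.random.rand(h, w)}
    sal = combine_saliency(maps, {"static": 1.0, "face": 2.0, "motion": 2.0})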

  3. Problems of mixed convection flow regime map in a vertical cylinder

    International Nuclear Information System (INIS)

    Kang, Gyeong Uk; Chung, Bum Jin

    2012-01-01

    One of the technical issues raised by the development of the VHTR is mixed convection, which is the regime of heat transfer that occurs when the driving forces of both forced and natural convection are of comparable orders of magnitude. In vertical internal flows, the buoyancy force acts upward only, but forced flows can move either upward or downward. Thus, there are two types of mixed convection flows, depending on the direction of the forced flow. When the directions of the forced flow and buoyancy are the same, the flow is a buoyancy-aided flow; when they are opposite, the flow is a buoyancy-opposed flow. In laminar flows, buoyancy-aided flow shows enhanced heat transfer compared to pure forced convection and buoyancy-opposed flow shows impaired heat transfer, due to the flow velocity being affected by the buoyancy forces. In turbulent flows, however, buoyancy-opposed flow shows enhanced heat transfer due to increased turbulence production, while buoyancy-aided flow shows impaired heat transfer at low buoyancy forces; as the buoyancy increases, the heat transfer recovers, and at further increases of the buoyancy forces, the heat transfer is enhanced. It is of primary interest to classify which convection regime is dominant. The method most used to distinguish between forced, mixed and natural convection has been to refer to the classical flow regime map suggested by Metais and Eckert. During the course of fundamental literature studies on this topic, it was found that there are some problems with the flow regime map for a vertical cylinder. This paper discusses problems identified through reviewing the papers underlying the classical flow regime map. We have tried to reproduce the flow regime map independently using the data obtained from the literature, compared it with the classical flow regime map and, finally, discussed the problems on this topic

  4. Computational Characterization of Exogenous MicroRNAs that Can Be Transferred into Human Circulation.

    Directory of Open Access Journals (Sweden)

    Jiang Shu

    Full Text Available MicroRNAs have long been considered to be synthesized endogenously, until very recent discoveries showing that humans can absorb dietary microRNAs from animal and plant origins while the mechanism remains unknown. Compelling evidence of microRNAs from rice, milk, and honeysuckle transported to human blood and tissues has created a high volume of interest in the fundamental questions of which and how exogenous microRNAs can be transferred into human circulation and possibly exert functions in humans. Here we present an integrated genomics and computational analysis to study the potential deciding features of transportable microRNAs. Specifically, we analyzed all publicly available microRNAs, a total of 34,612 from 194 species, with 1,102 features derived from the microRNA sequence and structure. Through in-depth bioinformatics analysis, 8 groups of discriminative features have been used to characterize human circulating microRNAs and infer the likelihood that a microRNA will get transferred into human circulation. For example, 345 dietary microRNAs have been predicted as highly transportable candidates, where 117 of them have identical sequences with their homologs in human and 73 are known to be associated with exosomes. Through a milk feeding experiment, we have validated 9 cow-milk microRNAs in human plasma using microRNA-sequencing analysis, including the top ranked microRNAs such as bta-miR-487b, miR-181b, and miR-421. The implications in health-related processes have been illustrated in the functional analysis. This work demonstrates that data-driven computational analysis is highly promising for studying novel molecular characteristics of transportable microRNAs while bypassing the complex mechanistic details.

  5. Computational Characterization of Exogenous MicroRNAs that Can Be Transferred into Human Circulation

    Science.gov (United States)

    Shu, Jiang; Chiang, Kevin; Zempleni, Janos; Cui, Juan

    2015-01-01

    MicroRNAs have long been considered to be synthesized endogenously, until very recent discoveries showing that humans can absorb dietary microRNAs from animal and plant origins while the mechanism remains unknown. Compelling evidence of microRNAs from rice, milk, and honeysuckle transported to human blood and tissues has created a high volume of interest in the fundamental questions of which and how exogenous microRNAs can be transferred into human circulation and possibly exert functions in humans. Here we present an integrated genomics and computational analysis to study the potential deciding features of transportable microRNAs. Specifically, we analyzed all publicly available microRNAs, a total of 34,612 from 194 species, with 1,102 features derived from the microRNA sequence and structure. Through in-depth bioinformatics analysis, 8 groups of discriminative features have been used to characterize human circulating microRNAs and infer the likelihood that a microRNA will get transferred into human circulation. For example, 345 dietary microRNAs have been predicted as highly transportable candidates, where 117 of them have identical sequences with their homologs in human and 73 are known to be associated with exosomes. Through a milk feeding experiment, we have validated 9 cow-milk microRNAs in human plasma using microRNA-sequencing analysis, including the top ranked microRNAs such as bta-miR-487b, miR-181b, and miR-421. The implications in health-related processes have been illustrated in the functional analysis. This work demonstrates that data-driven computational analysis is highly promising for studying novel molecular characteristics of transportable microRNAs while bypassing the complex mechanistic details. PMID:26528912
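
    A conceptual sketch of the classification idea (the features, labels and model choice here are placeholders; the study's 1,102 curated features and its exact learner are not reproduced):

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Score microRNAs for "transportability" from sequence-derived
    # features; the matrix and labels below are random stand-ins.
    rng = np.random.default_rng(0)
    X = rng.random((500, 20))                 # placeholder feature matrix
    y = rng.integers(0, 2, 500)               # placeholder circulating labels
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    print(clf.predict_proba(X[:5])[:, 1])     # likelihood of transfer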

  6. Globus File Transfer Services | High-Performance Computing | NREL

    Science.gov (United States)

    installed on the systems at both ends of the data transfer. The NREL endpoint is nrel#globus. Click Login on the Globus web site. On the login page select "Globus ID" as the login method and click Login to the Globus website. From the Manage Data drop down menu, select Transfer Files. Then click Get

  7. BUSH: A computer code for calculating steady state heat transfer in LWR rod bundles under accident conditions

    International Nuclear Information System (INIS)

    Shepherd, I.M.

    1982-01-01

    The computer code BUSH has been developed for the calculation of steady state heat transfer in a rod bundle. For a given power, flow and geometry it can calculate the temperatures in the rods, coolant and shroud assuming that at any axial level each rod can be described by one temperature and the coolant fluid is also radially uniform at this level. Heat transfer by convection and radiation are handled and the geometry is flexible enough to model nearly all types of envisaged shroud design for the SUPERSARA test series. The modular way in which BUSH has been written makes it suitable for future development, either within the present BUSH framework or as part of a more advanced code

  8. Mapping Sub-Saharan African Agriculture in High-Resolution Satellite Imagery with Computer Vision & Machine Learning

    Science.gov (United States)

    Debats, Stephanie Renee

    Smallholder farms dominate in many parts of the world, including Sub-Saharan Africa. These systems are characterized by small, heterogeneous, and often indistinct field patterns, requiring a specialized methodology to map agricultural landcover. In this thesis, we developed a benchmark labeled data set of high-resolution satellite imagery of agricultural fields in South Africa. We presented a new approach to mapping agricultural fields, based on efficient extraction of a vast set of simple, highly correlated, and interdependent features, followed by a random forest classifier. The algorithm achieved similar high performance across agricultural types, including spectrally indistinct smallholder fields, and demonstrated the ability to generalize across large geographic areas. In sensitivity analyses, we determined multi-temporal images provided greater performance gains than the addition of multi-spectral bands. We also demonstrated how active learning can be incorporated in the algorithm to create smaller, more efficient training data sets, which reduced computational resources, minimized the need for humans to hand-label data, and boosted performance. We designed a patch-based uncertainty metric to drive the active learning framework, based on the regular grid of a crowdsourcing platform, and demonstrated how subject matter experts can be replaced with fleets of crowdsourcing workers. Our active learning algorithm achieved similar performance as an algorithm trained with randomly selected data, but with 62% less data samples. This thesis furthers the goal of providing accurate agricultural landcover maps, at a scale that is relevant for the dominant smallholder class. Accurate maps are crucial for monitoring and promoting agricultural production. Furthermore, improved agricultural landcover maps will aid a host of other applications, including landcover change assessments, cadastral surveys to strengthen smallholder land rights, and constraints for crop modeling
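
    A sketch of the uncertainty-driven active-learning loop described above (random placeholder data and an assumed least-confidence criterion; not the thesis implementation):

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Repeatedly "crowdsource" labels for the patches the current random
    # forest is least certain about (probability closest to 0.5).
    rng = np.random.default_rng(1)
    X_pool = rng.random((2000, 16)); y_pool = rng.integers(0, 2, 2000)
    labeled = list(range(20))                      # small seed set

    for _ in range(5):                             # five acquisition rounds
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_pool[labeled], y_pool[labeled])
        proba = clf.predict_proba(X_pool)[:, 1]
        uncertainty = -np.abs(proba - 0.5)         # closest to 0.5 = most uncertain
        ranked = np.argsort(uncertainty)[::-1]
        have = set(labeled)
        labeled += [i for i in ranked if i not in have][:20]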

  9. Basic heat transfer

    CERN Document Server

    Bacon, D H

    2013-01-01

    Basic Heat Transfer aims to help readers use a computer to solve heat transfer problems and to promote greater understanding by changing data values and observing the effects, which are necessary in design and optimization calculations. The book is concerned with applications including insulation and heating in buildings and pipes, temperature distributions in solids for steady state and transient conditions, the determination of surface heat transfer coefficients for convection in various situations, radiation heat transfer in grey body problems, the use of finned surfaces, and simple heat exchangers.

  10. [Impact to Z-score Mapping of Hyperacute Stroke Images by Computed Tomography in Adaptive Statistical Iterative Reconstruction].

    Science.gov (United States)

    Watanabe, Shota; Sakaguchi, Kenta; Hosono, Makoto; Ishii, Kazunari; Murakami, Takamichi; Ichikawa, Katsuhiro

    The purpose of this study was to evaluate the effect of a hybrid-type iterative reconstruction method on Z-score mapping of hyperacute stroke in unenhanced computed tomography (CT) images. We used a hybrid-type iterative reconstruction [adaptive statistical iterative reconstruction (ASiR)] implemented in a CT system (Optima CT660 Pro advance, GE Healthcare). With 15 normal brain cases, we reconstructed CT images with filtered back projection (FBP) and ASiR with a blending factor of 100% (ASiR100%). Two standardized normal brain data sets were created from normal databases of FBP images (FBP-NDB) and ASiR100% images (ASiR-NDB), and standard deviation (SD) values in the basal ganglia were measured. Z-score mapping was performed for 12 hyperacute stroke cases by using FBP-NDB and ASiR-NDB, and the Z-score values in the hyperacute stroke area and normal area were compared between FBP-NDB and ASiR-NDB. By using ASiR-NDB, the SD value of the standardized brain was decreased by 16%. The Z-score value of ASiR-NDB in the hyperacute stroke area was significantly higher than that of FBP-NDB (p<0.05). Therefore, ASiR100% for Z-score mapping had potential to improve the accuracy of Z-score mapping.

  11. COMPUTER-BASED REASONING SYSTEMS: AN OVERVIEW

    Directory of Open Access Journals (Sweden)

    CIPRIAN CUCU

    2012-12-01

    Full Text Available Argumentation is nowadays seen both as a skill that people use in various aspects of their lives, and as an educational technique that can support the transfer or creation of knowledge, thus aiding in the development of other skills (e.g. communication, critical thinking or attitudes). However, teaching argumentation and teaching with argumentation is still a rare practice, mostly due to the lack of available resources such as time or expert human tutors that are specialized in argumentation. Intelligent computer systems (i.e. systems that implement an inner representation of particular knowledge and try to emulate the behavior of humans) could allow more people to understand the purpose, techniques and benefits of argumentation. The proposed paper investigates the state-of-the-art concepts of computer-based argumentation used in education and tries to develop a conceptual map, showing benefits, limitations and relations between various concepts, focusing on the duality “learning to argue – arguing to learn”.

  12. Computationally efficient dynamic modeling of robot manipulators with multiple flexible-links using acceleration-based discrete time transfer matrix method

    DEFF Research Database (Denmark)

    Zhang, Xuping; Sørensen, Rasmus; RahbekIversen, Mathias

    2018-01-01

    This paper presents a novel and computationally efficient modeling method for the dynamics of flexible-link robot manipulators. In this method, a robot manipulator is decomposed into components/elements. The component/element dynamics is established using Newton–Euler equations, and then is linearized based on the acceleration-based state vector. The transfer matrices for each type of component/element are developed and used to establish the system equations of a flexible robot manipulator by concatenating the state vector from the base to the end-effector. With this strategy, the size of the system equations does not grow with the number of links of the manipulator, and the method only involves calculating and transferring component/element dynamic equations that have small size. Numerical simulations and experimental testing of flexible-link manipulators are conducted to validate the proposed methodologies.
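
    A toy illustration of the concatenation idea (invented 4×4 matrices and state; the paper's acceleration-based state vector and element transfer matrices are not reproduced):

    import numpy as np

    # Each element maps a state vector from one end to the other, so
    # chaining elements is a product of small matrices instead of
    # assembling one large system.
    def chain_transfer(matrices, state_base):
        state = state_base
        for U in matrices:            # base -> end-effector
            state = U @ state
        return state

    # Three hypothetical 4x4 element transfer matrices and a base state.
    rng = np.random.default_rng(0)
    elements = [np.eye(4) + 0.01 * rng.random((4, 4)) for _ in range(3)]
    print(chain_transfer(elements, np.array([0.0, 0.0, 1.0, 0.0])))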

  13. Experiments and Analyses of Data Transfers Over Wide-Area Dedicated Connections

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S. [ORNL; Liu, Qiang [ORNL; Sen, Satyabrata [ORNL; Hanley, Jesse A. [ORNL; Foster, Ian [University of Chicago; Kettimuthu, R. [Argonne National Laboratory (ANL); Wu, Qishi [University of Memphis; Yun, Daqing [Harrisburg University; Towsley, Don [University of Massachusetts, Amherst; Vardoyan, Gayane [University of Massachusetts, Amherst

    2017-08-01

    Dedicated wide-area network connections are increasingly employed in high-performance computing and big data scenarios. One might expect the performance and dynamics of data transfers over such connections to be easy to analyze due to the lack of competing traffic. However, non-linear transport dynamics and end-system complexities (e.g., multi-core hosts and distributed filesystems) can in fact make analysis surprisingly challenging. We present extensive measurements of memory-to-memory and disk-to-disk file transfers over 10 Gbps physical and emulated connections with 0–366 ms round trip times (RTTs). For memory-to-memory transfers, profiles of both TCP and UDT throughput as a function of RTT show concave and convex regions; large buffer sizes and more parallel flows lead to wider concave regions, which are highly desirable. TCP and UDT both also display complex throughput dynamics, as indicated by their Poincare maps and Lyapunov exponents. For disk-to-disk transfers, we determine that high throughput can be achieved via a combination of parallel I/O threads, parallel network threads, and direct I/O mode. Our measurements also show that Lustre filesystems can be mounted over long-haul connections using LNet routers, although challenges remain in jointly optimizing file I/O and transport method parameters to achieve peak throughput.
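
    As a small illustration of the Poincare-map view used above, a discrete return map pairs successive throughput samples; structure in the (x_t, x_{t+1}) cloud is what signals nonlinear transport dynamics (toy data below, not the testbed measurements):

    import numpy as np

    # Build (x_t, x_{t+1}) pairs from a throughput time series.
    def poincare_pairs(series):
        x = np.asarray(series, dtype=float)
        return np.column_stack([x[:-1], x[1:]])

    throughput = 9.0 + np.random.default_rng(2).normal(0, 0.5, 300)  # Gbps, toy
    pairs = poincare_pairs(throughput)
    print(pairs[:3])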

  14. Accelerating Multiagent Reinforcement Learning by Equilibrium Transfer.

    Science.gov (United States)

    Hu, Yujing; Gao, Yang; An, Bo

    2015-07-01

    An important approach in multiagent reinforcement learning (MARL) is equilibrium-based MARL, which adopts equilibrium solution concepts in game theory and requires agents to play equilibrium strategies at each state. However, most existing equilibrium-based MARL algorithms cannot scale due to a large number of computationally expensive equilibrium computations (e.g., computing Nash equilibria is PPAD-hard) during learning. For the first time, this paper finds that during the learning process of equilibrium-based MARL, the one-shot games corresponding to each state's successive visits often have the same or similar equilibria (for some states more than 90% of games corresponding to successive visits have similar equilibria). Inspired by this observation, this paper proposes to use equilibrium transfer to accelerate equilibrium-based MARL. The key idea of equilibrium transfer is to reuse previously computed equilibria when each agent has a small incentive to deviate. By introducing transfer loss and transfer condition, a novel framework called equilibrium transfer-based MARL is proposed. We prove that although equilibrium transfer brings transfer loss, equilibrium-based MARL algorithms can still converge to an equilibrium policy under certain assumptions. Experimental results in widely used benchmarks (e.g., grid world game, soccer game, and wall game) show that the proposed framework: 1) not only significantly accelerates equilibrium-based MARL (up to 96.7% reduction in learning time), but also achieves higher average rewards than algorithms without equilibrium transfer and 2) scales significantly better than algorithms without equilibrium transfer when the state/action space grows and the number of agents increases.
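
    A minimal sketch of the transfer condition as we read it (a toy 2x2 game; the paper's transfer-loss machinery is not reproduced): reuse a stored equilibrium when no agent can gain more than a small epsilon by deviating.

    import numpy as np

    # Largest one-shot incentive to deviate from a stored strategy profile.
    def max_deviation_incentive(A, B, x, y):
        """A, B: payoff matrices; x, y: stored mixed strategies."""
        vx, vy = x @ A @ y, x @ B @ y            # current expected payoffs
        gain1 = np.max(A @ y) - vx               # best pure deviation, agent 1
        gain2 = np.max(x @ B) - vy               # best pure deviation, agent 2
        return max(gain1, gain2)

    A = np.array([[3.0, 0.0], [5.0, 1.0]]); B = A.T   # a toy 2x2 game
    x = y = np.array([0.5, 0.5])
    reuse = max_deviation_incentive(A, B, x, y) < 0.1  # transfer condition
    print(reuse)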

  15. Coordinating knowledge transfer within manufacturing networks

    DEFF Research Database (Denmark)

    Yang, Cheng; Johansen, John; Boer, Harry

    2008-01-01

    Along with increasing globalization, the management of international manufacturing networks is becoming increasingly important for industrial companies. This paper mainly focuses on the coordination of knowledge transfer within manufacturing networks. In this context, we propose the time-place matrix as a tool for mapping the distribution of knowledge within manufacturing networks. Using this tool, four important questions about the coordination of knowledge transfer within a manufacturing network are identified: know-where, know-what, know-when, know-how to transfer. The relationships among

  16. Quantum vertex model for reversible classical computing.

    Science.gov (United States)

    Chamon, C; Mucciolo, E R; Ruckenstein, A E; Yang, Z-C

    2017-05-12

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without 'learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.

  17. Map projections cartographic information systems

    CERN Document Server

    Grafarend, Erik W

    2006-01-01

    In the context of Geographical Information Systems (GIS) the book offers a timely review of map projections (sphere, ellipsoid, rotational surfaces) and geodetic datum transformations. For the needs of photogrammetry, computer vision, and remote sensing, space projective mappings are reviewed.

  18. Automated cropland mapping of continental Africa using Google Earth Engine cloud computing

    Science.gov (United States)

    Xiong, Jun N.; Thenkabail, Prasad S.; Gumma, Murali Krishna; Teluguntla, Pardhasaradhi G.; Poehnelt, Justin; Congalton, Russell G.; Yadav, Kamini; Thau, David

    2017-01-01

    The automation of agricultural mapping using satellite-derived remotely sensed data remains a challenge in Africa because of the heterogeneous and fragmental landscape, complex crop cycles, and limited access to local knowledge. Currently, consistent, continent-wide routine cropland mapping of Africa does not exist, with most studies focused either on certain portions of the continent or at most a one-time effort at mapping the continent at coarse resolution remote sensing. In this research, we addressed these limitations by applying an automated cropland mapping algorithm (ACMA) that captures extensive knowledge on the croplands of Africa available through: (a) ground-based training samples, (b) very high (sub-meter to five-meter) resolution imagery (VHRI), and (c) local knowledge captured during field visits and/or sourced from country reports and literature. The study used 16-day time-series of Moderate Resolution Imaging Spectroradiometer (MODIS) normalized difference vegetation index (NDVI) composited data at 250-m resolution for the entire African continent. Based on these data, the study first produced accurate reference cropland layers or RCLs (cropland extent/areas, irrigation versus rainfed, cropping intensities, crop dominance, and croplands versus cropland fallows) for the year 2014 that provided an overall accuracy of around 90% for crop extent in different agro-ecological zones (AEZs). The RCLs for the year 2014 (RCL2014) were then used in the development of the ACMA algorithm to create ACMA-derived cropland layers for 2014 (ACL2014). ACL2014 when compared pixel-by-pixel with the RCL2014 had an overall similarity greater than 95%. Based on the ACL2014, the African continent had 296 Mha of net cropland areas (260 Mha cultivated plus 36 Mha fallows) and 330 Mha of gross cropland areas. Of the 260 Mha of net cropland areas cultivated during 2014, 90.6% (236 Mha) was rainfed and just 9.4% (24 Mha) was irrigated. Africa has about 15% of the

  19. ActionMap: A web-based software that automates loci assignments to framework maps.

    Science.gov (United States)

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-07-01

    Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  20. Review of Mixed Convection Flow Regime Map of a Vertical pipe

    International Nuclear Information System (INIS)

    Chae, Myeong-Seon; Chung, Bum-Jin; Kang, Gyeong-Uk

    2015-01-01

    In a vertical pipe, the natural convective force due to buoyancy acts upward only, but the forced convective force can be either upward or downward. This defines buoyancy-aided and buoyancy-opposed flows, depending on the direction of the forced flow with respect to the buoyancy forces. Furthermore, depending on the exchange mechanism, the flow condition is classified into laminar and turbulent. In laminar mixed convection, buoyancy-aided flow presents enhanced heat transfer compared to pure forced convection and buoyancy-opposed flow shows impaired heat transfer, as the flow velocity is affected by the buoyancy forces. In turbulent mixed convection, however, buoyancy-aided flow shows an impairment of the heat transfer rate for small buoyancy and a gradual enhancement for large buoyancy. In this study, the existing flow regime map for mixed convection in a vertical pipe was reviewed through an analysis of the literature. Using the investigated data and heat transfer correlations, the flow regime map was reconstructed independently and compared with the existing one. This study reviewed the limitations of the classical mixed convection flow regime map. Using the existing data and the heat transfer correlations by Martinelli and Boelter and by Watzinger and Johnson, the flow regime map was reconstructed independently. The results revealed that the existing map used the data selectively among the experimental and theoretical results, and detailed descriptions of the lines delimiting the mixed convection and transition regimes were not given. Information about uncertainty analysis and the supporting data was also insufficient. Moreover, the flow regime map and later investigators commonly used the diameter as the characteristic length for both Re and Gr in place of the height of the heated wall, though the buoyancy forces are proportional to the third power of the heated wall height.

  1. Q-S synchronization in 3D Henon-like map and generalized Henon map via a scalar controller

    International Nuclear Information System (INIS)

    Yan Zhenya

    2005-01-01

    In this Letter, a generalized, systematic and automatic backstepping scheme is developed to investigate the Q-S synchronization of two identical 3D discrete-time dynamical systems and two different 3D discrete-time dynamical systems. With the aid of symbolic-numeric computation, we use the proposed scheme to illustrate chaos synchronization between two identical 3D generalized Henon maps and Q-S synchronization between the 3D generalized Henon map and a Henon-like map via a scalar controller, respectively. Moreover, numerical simulations are used to verify the effectiveness of the proposed scheme. In addition, the scheme can also be applied to investigate the tracking problem in discrete-time systems and to generate the scalar controller automatically with the aid of symbolic-numeric computation
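
    For orientation, the classical 2-D Henon map below is a stand-in sketch for the 3-D generalized Henon and Henon-like maps studied in the Letter (whose exact forms, and the backstepping controller, are not reproduced here):

    import numpy as np

    # Iterate the classical Henon map x' = 1 - a*x^2 + y, y' = b*x.
    def henon(x, y, a=1.4, b=0.3, steps=1000):
        traj = np.empty((steps, 2))
        for n in range(steps):
            x, y = 1.0 - a * x * x + y, b * x
            traj[n] = x, y
        return traj

    orbit = henon(0.1, 0.1)
    print(orbit[-3:])          # points on the Henon attractor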

  2. Visualization of Heat Transfer and Core Damage With RGUI 1.5

    International Nuclear Information System (INIS)

    Mesina, George L.

    2002-01-01

    Graphical User Interfaces (GUI) have become an integral and essential part of computer software. In the ever-changing world of computing, they provide the user with a valuable means to learn, understand, and use the application software while also helping applications adapt to and span different computing paradigms, such as different operating systems. For these reasons, GUI development for nuclear plant analysis programs has been ongoing for a decade and a half and much progress has been made. With the development of codes such as RELAP5-3D [1] and SCDAP/RELAP5 that have multi-dimensional modeling capability, it has become necessary to represent three-dimensional, calculated data. The RELAP5-3D Graphical User Interface (RGUI) [4] was designed specifically for this purpose. It reduces the difficulty of analyzing complex three-dimensional models and enhances the analysts' ability to recognize plant behavior visually. Previous versions of RGUI [5] focused on visualizing reactor coolant behavior during a simulated transient or accident. Recent work has extended RGUI to display two other phenomena, heat transfer and core damage. Heat transfer is depicted through the visualization of RELAP5-3D heat structures. Core damage is visualized by displaying fuel rods and other core structures in a reactor vessel screen. Conditions within the core are displayed via numerical results and color maps. These new features of RGUI 1.5 are described and illustrated. (authors)

  3. Preparation of functions of computer code GENGTC and improvement for two-dimensional heat transfer calculations for irradiation capsules

    International Nuclear Information System (INIS)

    Nomura, Yasushi; Someya, Hiroyuki; Ito, Haruhiko.

    1992-11-01

    Capsules for irradiation tests in the JMTR (Japan Materials Testing Reactor) consist of irradiation specimens surrounded by a cladding tube, holders, an inner tube and a container tube (from 30 mm to 65 mm in diameter), and the annular gaps between these structural materials in the capsule are filled with liquids or gases. Cooling of the capsule is done by reactor primary coolant flowing down outside the capsule. Most of the heat generated by fission in fuel specimens and gamma absorption in structural materials is directed radially to the capsule container outer surface. In thermal performance calculations for capsule design, a one-dimensional (r) heat transfer computer code entitled GENGTC (Generalized Gap Temperature Calculation), originally developed at Oak Ridge National Laboratory, U.S.A., has been frequently used. In designing a capsule, many parametric calculations are needed with respect to changes in materials and gap sizes, and in some cases two-dimensional (r,z) heat transfer calculations are needed for irradiation test capsules with short fuel rods. Recently the authors improved the original one-dimensional code GENGTC, (1) to simplify the preparation of input data, (2) to perform automatic calculations for parametric surveys based on design temperatures, etc. Moreover, the computer code has been improved to perform r-z two-dimensional heat transfer calculations. This report describes the preparation of the one-dimensional code GENGTC and the improvement for the two-dimensional code GENGTC-2, together with their code manuals. (author)

  4. Evaluation of the optimum region for mammographic system using computer simulation to study modulation transfer functions

    International Nuclear Information System (INIS)

    Oliveira, Isaura N. Sombra; Schiable, Homero; Porcel, Naider T.; Frere, Annie F.; Marques, Paulo M.A.

    1996-01-01

    The 'optimum region' of the radiation field for mammographic systems is investigated. Such a region was defined in previous works as the field range where the system has its best performance and sharpest images. This study is based on a correlation of two methods for evaluating radiologic imaging systems, both using computer simulation to determine modulation transfer functions (MTFs) due to the X-ray tube focal spot at several field orientations and locations
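
    As background, the MTF in such simulations is the normalized magnitude of the Fourier transform of the system's line spread function (LSF). A minimal sketch with an assumed Gaussian LSF (the profile and numbers are illustrative, not the paper's focal-spot data):

    import numpy as np

    # MTF = |FFT(LSF)|, normalized to unity at zero frequency.
    dx = 0.01                                   # sample spacing [mm]
    x = np.arange(-5, 5, dx)
    lsf = np.exp(-x**2 / (2 * 0.15**2))         # hypothetical LSF, sigma = 0.15 mm
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                               # normalize at f = 0
    freqs = np.fft.rfftfreq(len(lsf), d=dx)     # cycles/mm
    print(freqs[mtf > 0.1][-1], "cycles/mm at 10% MTF")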

  5. Convective heat transfer

    CERN Document Server

    Kakac, Sadik; Pramuanjaroenkij, Anchasa

    2014-01-01

    Intended for readers who have taken a basic heat transfer course and have a basic knowledge of thermodynamics, heat transfer, fluid mechanics, and differential equations, Convective Heat Transfer, Third Edition provides an overview of phenomenological convective heat transfer. This book combines applications of engineering with the basic concepts of convection. It offers a clear and balanced presentation of essential topics using both traditional and numerical methods. The text addresses emerging science and technology matters, and highlights biomedical applications and energy technologies. What’s New in the Third Edition: Includes updated chapters and two new chapters on heat transfer in microchannels and heat transfer with nanofluids Expands problem sets and introduces new correlations and solved examples Provides more coverage of numerical/computer methods The third edition details the new research areas of heat transfer in microchannels and the enhancement of convective heat transfer with nanofluids....

  6. Signal processing of eddy current three-dimensional maps

    International Nuclear Information System (INIS)

    Birac, C.; David, D.; Lamant, D.

    1987-01-01

    Digital processing of eddy current three-dimensional maps improves the accuracy of detection: flattening, filtering, deconvolution, mapping of new variables, etc., give new possibilities for difficult test problems. With simulation of defects, probes and probe travels, it is now possible to evaluate new eddy current processes by computation, without machining defects or building probes

  7. Noise tolerant spatiotemporal chaos computing.

    Science.gov (United States)

    Kia, Behnam; Kia, Sarvenaz; Lindner, John F; Sinha, Sudeshna; Ditto, William L

    2014-12-01

    We introduce and design a noise tolerant chaos computing system based on a coupled map lattice (CML) and the noise reduction capabilities inherent in coupled dynamical systems. The resulting spatiotemporal chaos computing system is more robust to noise than a single-map chaos computing system. In this CML based approach to computing, under the coupled dynamics, the local noise from different nodes of the lattice diffuses across the lattice, and the noise contributions attenuate one another, resulting in a system with less noise content and a more robust chaos computing architecture.
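
    A minimal sketch of a diffusively coupled map lattice in this spirit (logistic local map; the coupling strength, noise level and lattice size are illustrative assumptions, not the paper's parameters):

    import numpy as np

    # One CML step: each node iterates a logistic map, then mixes with
    # its neighbors' outputs; the coupling smears out local noise.
    def cml_step(x, eps=0.3, r=4.0, noise=0.01, rng=np.random.default_rng(3)):
        f = r * x * (1.0 - x)                        # local chaotic map
        neighbors = 0.5 * (np.roll(f, 1) + np.roll(f, -1))
        x_next = (1.0 - eps) * f + eps * neighbors   # diffusive coupling
        return np.clip(x_next + rng.normal(0, noise, x.size), 0.0, 1.0)

    x = np.random.default_rng(4).random(64)          # 64-node lattice
    for _ in range(100):
        x = cml_step(x)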

  8. Successful experiences in the application of Concept Maps in Engineering in Computing, Mexico

    Directory of Open Access Journals (Sweden)

    Beatriz Guardian Soto

    2013-02-01

    Full Text Available Today there is an enormous amount of work related to new models and styles of learning and instruction in the field of engineering. In the case of the engineering degree in computing that is taught at the Mexico National Polytechnic Institute (IPN), there is a working group led by an internationally recognized expert, whose successes and working processes are reflected in this text through experiences gained in the last 8 years with students and teachers, thus generating the requirements and tools for the globalised world and the knowledge society in which we find ourselves. Lessons learned come from subjects such as theory of automata (TA), compilers (Cs), analysis of algorithms (AA), networks (R), artificial intelligence (AI), computer programming (P), degree project (PT) and strategic planning (PE), among others, to facilitate the understanding of concepts and applications by the student. We believe that, through the teaching strategy using concept maps developed by J. Novak, results have been favorable in dynamism, understanding and the generation of meaningful long-term learning, providing solid elements for professional practice. Proposals obtained by teachers and exercises developed by teachers and students are listed.

  9. New computer-aided diagnosis of dementia using positron emission tomography: brain regional sensitivity-mapping method.

    Directory of Open Access Journals (Sweden)

    Akihiro Kakimoto

    Full Text Available PURPOSE: We devised a new computer-aided diagnosis method to segregate dementia using one estimated index (Total Z score) derived from the Brodmann area (BA) sensitivity map on the stereotaxic brain atlas. The purpose of this study is to investigate its accuracy to differentiate patients with Alzheimer's disease (AD) or mild cognitive impairment (MCI) from normal adults (NL). METHODS: We studied 101 adults (NL: 40, AD: 37, MCI: 24) who underwent 18FDG positron emission tomography (PET) measurement. We divided NL and AD groups into two categories: a training group with (Category A) and a test group without (Category B) clinical information. In Category A, we estimated sensitivity by comparing the standard uptake value per BA (SUVR) between NL and AD groups. Then, we calculated a summated index (Total Z score) by utilizing the sensitivity-distribution maps and each BA z-score to segregate AD patterns. To confirm the validity of this method, we examined the accuracy in Category B. Finally, we applied this method to MCI patients. RESULTS: In Category A, we found that the sensitivity and specificity of differentiation between NL and AD were all 100%. In Category B, those were 100% and 95%, respectively. Furthermore, we found this method attained 88% to differentiate AD-converters from non-converters in the MCI group. CONCLUSIONS: The present automated computer-aided evaluation method based on a single estimated index provided good accuracy for differential diagnosis of AD and MCI. This good differentiation power suggests its usefulness not only for dementia diagnosis but also in a longitudinal study.

  10. New Computer-Aided Diagnosis of Dementia Using Positron Emission Tomography: Brain Regional Sensitivity-Mapping Method

    Science.gov (United States)

    Kakimoto, Akihiro; Kamekawa, Yuichi; Ito, Shigeru; Yoshikawa, Etsuji; Okada, Hiroyuki; Nishizawa, Sadahiko; Minoshima, Satoshi; Ouchi, Yasuomi

    2011-01-01

    Purpose We devised a new computer-aided diagnosis method to segregate dementia using one estimated index (Total Z score) derived from the Brodmann area (BA) sensitivity map on the stereotaxic brain atlas. The purpose of this study is to investigate its accuracy to differentiate patients with Alzheimer's disease (AD) or mild cognitive impairment (MCI) from normal adults (NL). Methods We studied 101 adults (NL: 40, AD: 37, MCI: 24) who underwent 18FDG positron emission tomography (PET) measurement. We divided NL and AD groups into two categories: a training group with (Category A) and a test group without (Category B) clinical information. In Category A, we estimated sensitivity by comparing the standard uptake value per BA (SUVR) between NL and AD groups. Then, we calculated a summated index (Total Z score) by utilizing the sensitivity-distribution maps and each BA z-score to segregate AD patterns. To confirm the validity of this method, we examined the accuracy in Category B. Finally, we applied this method to MCI patients. Results In Category A, we found that the sensitivity and specificity of differentiation between NL and AD were all 100%. In Category B, those were 100% and 95%, respectively. Furthermore, we found this method attained 88% to differentiate AD-converters from non-converters in MCI group. Conclusions The present automated computer-aided evaluation method based on a single estimated index provided good accuracy for differential diagnosis of AD and MCI. This good differentiation power suggests its usefulness not only for dementia diagnosis but also in a longitudinal study. PMID:21966405
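
    Our reading of the scoring step, sketched below: per-BA z-scores of a patient's SUVR against the normal database are weighted by each area's discriminating sensitivity and summed into one Total Z score (all numbers are placeholders, not values from the study):

    import numpy as np

    # Sensitivity-weighted sum of per-Brodmann-area z-scores.
    def total_z(suvr, nl_mean, nl_sd, sensitivity):
        z = (nl_mean - suvr) / nl_sd          # hypometabolism -> positive z
        return float(np.sum(sensitivity * z))

    n_ba = 47                                  # Brodmann areas, one hemisphere
    rng = np.random.default_rng(5)
    nl_mean, nl_sd = np.ones(n_ba), np.full(n_ba, 0.08)
    sensitivity = rng.random(n_ba); sensitivity /= sensitivity.sum()
    patient = nl_mean - rng.normal(0.05, 0.05, n_ba)   # mildly hypometabolic
    print(total_z(patient, nl_mean, nl_sd, sensitivity))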

  11. Using a Cloud Computing System to Reduce Door-to-Balloon Time in Acute ST-Elevation Myocardial Infarction Patients Transferred for Percutaneous Coronary Intervention

    Directory of Open Access Journals (Sweden)

    Chi-Kung Ho

    2017-01-01

    Full Text Available Background. This study evaluated the impact on clinical outcomes of using a cloud computing system to reduce the percutaneous coronary intervention hospital door-to-balloon (DTB) time for ST segment elevation myocardial infarction (STEMI). Methods. A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through the protocol while the other 107 patients were transferred through the traditional referral process. Results. There were no significant differences in DTB time, pain to door of STEMI receiving center arrival time, and pain to balloon time between the two groups. Pain to electrocardiography time in patients with Killip I/II and catheterization laboratory to balloon time in patients with Killip III/IV were significantly reduced in the group transferred through the protocol compared to the group transferred through the traditional referral process (both p<0.05). There were also no remarkable differences in the complication rate and 30-day mortality between the two groups. The multivariate analysis revealed that the independent predictors of 30-day mortality were elderly patients, advanced Killip score, and higher level of troponin-I. Conclusions. This study showed that patients transferred through our present protocol had reduced pain to electrocardiography time and catheterization laboratory to balloon time in Killip I/II and Killip III/IV patients, respectively. However, this study showed that using a cloud computing system in our present protocol did not reduce DTB time.

  12. Debugging Data Transfers in CMS

    CERN Document Server

    Bagliesi, G; Bloom, K; Bockelman, B; Bonacorsi, D; Fisk, I; Flix, J; Hernandez, J; D'Hondt, J; Kadastik, M; Klem, J; Kodolova, O; Kuo, C M; Letts, J; Maes, J; Magini, N; Metson, S; Piedra, J; Pukhaeva, N; Tuura, L; Sonajalg, S; Wu, Y; Van Mulders, P; Villella, I; Wurthwein, F

    2010-01-01

    The CMS experiment at CERN is preparing for LHC data taking through several computing preparation activities. In early 2007 a traffic load generator infrastructure for distributed data transfer tests, called the LoadTest, was designed and deployed to equip the WLCG sites that support CMS with a means for debugging, load-testing and commissioning data transfer routes among CMS computing centres. The LoadTest is based upon PhEDEx as a reliable, scalable data set replication system. The Debugging Data Transfers (DDT) task force was created to coordinate the debugging of the data transfer links. The task force aimed to commission the most crucial transfer routes among CMS sites by designing and enforcing a clear procedure to debug problematic links. This procedure aimed to move a link from a debugging phase, in a separate and independent environment, to a production environment once a set of agreed conditions was achieved for that link. The goal was to deliver working transfer routes one by one to the CMS data operations team...

  13. When cloud computing meets bioinformatics: a review.

    Science.gov (United States)

    Zhou, Shuigeng; Liao, Ruiqi; Guan, Jihong

    2013-10-01

    In the past decades, with the rapid development of high-throughput technologies, biology research has generated an unprecedented amount of data. In order to store and process such a great amount of data, cloud computing and MapReduce were applied to many fields of bioinformatics. In this paper, we first introduce the basic concepts of cloud computing and MapReduce, and their applications in bioinformatics. We then highlight some problems challenging the applications of cloud computing and MapReduce to bioinformatics. Finally, we give a brief guideline for using cloud computing in biology research.
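
    For readers new to the model, the canonical MapReduce shape in plain Python (a toy k-mer count; real frameworks distribute the same map and reduce phases across many nodes):

    from collections import Counter
    from functools import reduce

    records = ["ACGTACGT", "ACGTTTTT", "GGGGACGT"]          # toy sequence reads

    def mapper(read):
        # Map phase: emit per-record 3-mer counts.
        return Counter(read[i:i + 3] for i in range(len(read) - 2))

    def reducer(c1, c2):
        # Reduce phase: merge partial counts.
        return c1 + c2

    kmer_counts = reduce(reducer, map(mapper, records), Counter())
    print(kmer_counts.most_common(3))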

  14. Concept for high speed computer printer

    Science.gov (United States)

    Stephens, J. W.

    1970-01-01

    Printer uses Kerr cell as light shutter for controlling the print on photosensitive paper. Applied to output data transfer, the information transfer rate of graphic computer printers could be increased to speeds approaching the data transfer rate of computer central processors /5000 to 10,000 lines per minute/.

  15. A Systematic Approach to Modified BCJR MAP Algorithms for Convolutional Codes

    Directory of Open Access Journals (Sweden)

    Patenaude François

    2006-01-01

    Full Text Available Since Berrou, Glavieux and Thitimajshima published their landmark paper in 1993, different modified BCJR MAP algorithms have appeared in the literature. The existence of a relatively large number of similar but different modified BCJR MAP algorithms, derived using the Markov chain properties of convolutional codes, naturally leads to the following questions. What is the relationship among the different modified BCJR MAP algorithms? What are their relative performance, computational complexities, and memory requirements? In this paper, we answer these questions. We derive systematically four major modified BCJR MAP algorithms from the BCJR MAP algorithm using simple mathematical transformations. The connections between the original and the four modified BCJR MAP algorithms are established. A detailed analysis of the different modified BCJR MAP algorithms shows that they have identical computational complexities and memory requirements. Computer simulations demonstrate that the four modified BCJR MAP algorithms all have identical performance to the BCJR MAP algorithm.
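
    For reference, the recursions that all these variants transform are the standard BCJR forward/backward passes over the code trellis and the resulting a posteriori log-likelihood ratio (textbook form, not any one of the four modified algorithms):

    \alpha_k(s) = \sum_{s'} \alpha_{k-1}(s') \, \gamma_k(s', s), \qquad \beta_{k-1}(s') = \sum_{s} \beta_k(s) \, \gamma_k(s', s),

    L(u_k) = \ln \frac{\sum_{(s',s):\,u_k=+1} \alpha_{k-1}(s') \, \gamma_k(s',s) \, \beta_k(s)}{\sum_{(s',s):\,u_k=-1} \alpha_{k-1}(s') \, \gamma_k(s',s) \, \beta_k(s)}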

  16. Computing algebraic transfer entropy and coupling directions via transcripts

    Science.gov (United States)

    Amigó, José M.; Monetti, Roberto; Graff, Beata; Graff, Grzegorz

    2016-11-01

    Most random processes studied in nonlinear time series analysis take values on sets endowed with a group structure, e.g., the real and rational numbers, and the integers. This fact makes it possible to associate with each pair of group elements a third element, called their transcript, which is defined as the product of the second element in the pair times the inverse of the first one. The transfer entropy of two such processes is called algebraic transfer entropy. It measures the information transferred between two coupled processes whose values belong to a group. In this paper, we show that, subject to one constraint, the algebraic transfer entropy matches the (in general, conditional) mutual information of certain transcripts with one variable less. This property has interesting practical applications, especially to the analysis of short time series. We also derive weak conditions for the 3-dimensional algebraic transfer entropy to yield the same coupling direction as the corresponding mutual information of transcripts. A related issue concerns the use of mutual information of transcripts to determine coupling directions in cases where the conditions just mentioned are not fulfilled. We checked the latter possibility in the lowest-dimensional case with numerical simulations and cardiovascular data, and obtained positive results.
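
    In symbols (a schematic rendering of the definitions above; the superscripts denote the usual embedding history lengths): for group elements g_1, g_2 the transcript is

    \tau(g_1, g_2) = g_2 \, g_1^{-1},

    and the transfer entropy from X to Y is the conditional mutual information

    T_{X \to Y} = I\big( Y_{t+1} \,;\, X_t^{(k)} \,\big|\, Y_t^{(l)} \big),

    which, per the paper's main result, can be matched by a (conditional) mutual information of transcript processes with one variable less.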

  17. Collaborative filtering for brain-computer interaction using transfer learning and active class selection.

    Science.gov (United States)

    Wu, Dongrui; Lance, Brent J; Parsons, Thomas D

    2013-01-01

    Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.
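
    One plausible reading of the weighting idea is sketched below; the similarity heuristic is an invented stand-in for the paper's mean squared difference measure, and the classifier is an off-the-shelf SVM rather than the authors' exact pipeline:

        import numpy as np
        from sklearn.svm import SVC

        def similarity(user_X, aux_X):
            # smaller gap between mean feature vectors -> larger auxiliary weight
            d = np.mean((user_X.mean(axis=0) - aux_X.mean(axis=0)) ** 2)
            return 1.0 / (1.0 + d)

        def tl_train(user_X, user_y, aux_sets):
            """Fit one classifier on the user's few samples plus
            similarity-weighted auxiliary samples from other subjects."""
            Xs, ys, ws = [user_X], [user_y], [np.ones(len(user_y))]
            for aux_X, aux_y in aux_sets:
                w = similarity(user_X, aux_X)
                Xs.append(aux_X); ys.append(aux_y); ws.append(np.full(len(aux_y), w))
            X, y, w = np.vstack(Xs), np.concatenate(ys), np.concatenate(ws)
            return SVC(kernel='rbf').fit(X, y, sample_weight=w)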

  18. Collaborative filtering for brain-computer interaction using transfer learning and active class selection.

    Directory of Open Access Journals (Sweden)

    Dongrui Wu

    Full Text Available Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.

  19. Cloud/Fog Computing System Architecture and Key Technologies for South-North Water Transfer Project Safety

    Directory of Open Access Journals (Sweden)

    Yaoling Fan

    2018-01-01

    Full Text Available In view of the real-time and distributed features of Internet of Things (IoT) safety systems in water conservancy engineering, this study proposed a new safety system architecture for water conservancy engineering based on cloud/fog computing and put forward a method of data reliability detection for false alarms caused by false abnormal data from the bottom sensors. Designed for the South-North Water Transfer Project (SNWTP), the architecture integrated project safety, water quality safety, and human safety. Using IoT devices, a fog computing layer was constructed between the cloud server and the safety detection devices in water conservancy projects. Technologies such as real-time sensing, intelligent processing, and information interconnection were developed. Therefore, accurate forecasting, accurate positioning, and efficient management were implemented as required by safety prevention of the SNWTP, safety protection of water conservancy projects was effectively improved, and intelligent water conservancy engineering was developed.

  20. Proceedings of the 33rd national heat transfer conference NHTC'99

    International Nuclear Information System (INIS)

    Jensen, M.K.; Di Marzo, M.

    1999-01-01

    The papers in this conference were divided into the following sections: Radiation Heat Transfer in Fires; Computational Fluid Dynamics Methods in Two-Phase Flow; Heat Transfer in Microchannels; Thin Film Heat Transfer; Thermal Design of Electronics; Enhanced Heat Transfer I; Porous Media Convection; Contact Resistance Heat Transfer; Materials Processing in Solidification and Crystal Growth; Fundamentals of Combustion; Challenging Modeling Aspects of Radiative Transfer; Fundamentals of Microscale Transport; Laser Processing and Diagnostics for Manufacturing and Materials Processing; Experimental Studies of Multiphase Flow; Enhanced Heat Transfer II; Heat and Mass Transfer in Porous Media; Heat Transfer in Turbomachinery and Gas Turbine Systems; Conduction Heat Transfer; General Papers; Open Forum on Combustion; Combustion and Instrumentation and Diagnostics I; Radiative Heat Transfer and Interactions in Participating and Nonparticipating Media; Applications of Computational Heat Transfer; Heat Transfer and Fluid Aspects of Heat Exchangers; Two-Phase Flow and Heat Transfer Phenomena; Fundamentals of Natural and Mixed Convection Heat Transfer I; Fundamentals of Natural and Mixed Convection Heat Transfer II; Combustion and Instrumentation and Diagnostics II; Computational Methods for Multidimensional Radiative Transfer; Process Heat Transfer; Advances in Computational Heat and Mass Transfer; Numerical Methods for Porous Media; Transport Phenomena in Manufacturing and Materials Processing; Practical Combustion; Melting and Solidification Heat Transfer; Transients in Dynamics of Two-Phase Flow; Basic Aspects of Two-Phase Flow; Turbulent Heat Transfer; Convective Heat Transfer in Electronics; Thermal Problems in Radioactive and Mixed Waste Management; and Transport Phenomena in Oscillatory Flows. Separate abstracts were prepared for most papers in this conference.

  1. Contrast computation methods for interferometric measurement of sensor modulation transfer function

    Science.gov (United States)

    Battula, Tharun; Georgiev, Todor; Gille, Jennifer; Goma, Sergio

    2018-01-01

    Accurate measurement of image-sensor frequency response over a wide range of spatial frequencies is very important for analyzing pixel array characteristics, such as modulation transfer function (MTF), crosstalk, and active pixel shape. Such analysis is especially significant in computational photography for the purposes of deconvolution, multi-image superresolution, and improved light-field capture. We use a lensless interferometric setup that produces high-quality fringes for measuring MTF over a wide range of frequencies (here, 37 to 434 line pairs per mm). We discuss the theoretical framework, involving Michelson and Fourier contrast measurement of the MTF, addressing phase alignment problems using a moiré pattern. We solidify the definition of Fourier contrast mathematically and compare it to Michelson contrast. Our interferometric measurement method shows high detail in the MTF, especially at high frequencies (above Nyquist frequency). We are able to estimate active pixel size and pixel pitch from measurements. We compare both simulation and experimental MTF results to a lens-free slanted-edge implementation using commercial software.
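
    The two contrast definitions can be compared on a synthetic fringe. The Fourier-contrast formula below (fringe harmonic over the DC term) is a common convention and may differ in detail from the paper's definition:

        import numpy as np

        def michelson(signal):
            return (signal.max() - signal.min()) / (signal.max() + signal.min())

        def fourier_contrast(signal, cycles):
            # ratio of the fringe harmonic amplitude to the DC term
            F = np.fft.rfft(signal)
            return 2 * np.abs(F[cycles]) / np.abs(F[0])

        x = np.linspace(0, 1, 1024, endpoint=False)
        fringe = 1 + 0.6 * np.cos(2 * np.pi * 8 * x)   # 8 cycles, true contrast 0.6
        print(michelson(fringe), fourier_contrast(fringe, 8))   # both give ~0.6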

  2. Mapping method for generating three-dimensional meshes: past and present

    International Nuclear Information System (INIS)

    Cook, W.A.; Oakes, W.R.

    1982-01-01

    Two transformations are derived in this paper. One is a mapping of a unit square onto a surface and the other is a mapping of a unit cube onto a three-dimensional region. Two meshing computer programs are then discussed that use these mappings. The first is INGEN, which has been used to calculate three-dimensional meshes for approximately 15 years. This meshing program uses an index scheme to number boundaries, surfaces, and regions. With such an index scheme, it is possible to control nodal points, elements, and boundary conditions. The second is ESCHER, a meshing program now being developed. Two primary considerations governing development of ESCHER are that meshes graded using quadrilaterals are required and that edge-line geometry defined by Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM) systems will be a major source of geometry definition. This program separates the processes of nodal-point connectivity generation, computation of nodal-point mapping space coordinates, and mapping of nodal points into model space
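
    The square-to-region mapping is in the spirit of transfinite (Coons-patch) interpolation, a standard way to map the unit square onto a region bounded by four curves. The sketch below is the generic textbook formula, not INGEN's or ESCHER's actual formulation:

        import numpy as np

        def coons(u, v, cb, ct, cl, cr):
            """Map the unit square onto the region bounded by four curves
            cb (bottom), ct (top), cl (left), cr (right), each [0,1] -> R^2,
            with matching corners, e.g. cb(0) == cl(0)."""
            return ((1 - v) * cb(u) + v * ct(u) + (1 - u) * cl(v) + u * cr(v)
                    - ((1 - u) * (1 - v) * cb(0) + u * (1 - v) * cb(1)
                       + (1 - u) * v * ct(0) + u * v * ct(1)))

        # map a 5x5 grid of nodal points onto a quarter annulus (radii 1 to 2)
        cb = lambda t: np.array([1 + t, 0.0])                               # radial, x-axis
        ct = lambda t: (1 + t) * np.array([0.0, 1.0])                       # radial, y-axis
        cl = lambda t: np.array([np.cos(t * np.pi / 2), np.sin(t * np.pi / 2)])      # inner arc
        cr = lambda t: 2 * np.array([np.cos(t * np.pi / 2), np.sin(t * np.pi / 2)])  # outer arc
        nodes = [[coons(i / 4, j / 4, cb, ct, cl, cr) for i in range(5)] for j in range(5)]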

  3. A computational framework for ultrastructural mapping of neural circuitry.

    Directory of Open Access Journals (Sweden)

    James R Anderson

    2009-03-01

    Full Text Available Circuitry mapping of metazoan neural systems is difficult because canonical neural regions (regions containing one or more copies of all components) are large, regional borders are uncertain, neuronal diversity is high, and potential network topologies are so numerous that only anatomical ground truth can resolve them. Complete mapping of a specific network requires synaptic resolution, canonical region coverage, and robust neuronal classification. Though transmission electron microscopy (TEM) remains the optimal tool for network mapping, the process of building large serial section TEM (ssTEM) image volumes is rendered difficult by the need to precisely mosaic distorted image tiles and register distorted mosaics. Moreover, most molecular neuronal class markers are poorly compatible with optimal TEM imaging. Our objective was to build a complete framework for ultrastructural circuitry mapping. This framework combines strong TEM-compliant small molecule profiling with automated image tile mosaicking, automated slice-to-slice image registration, and gigabyte-scale image browsing for volume annotation. Specifically we show how ultrathin molecular profiling datasets and their resultant classification maps can be embedded into ssTEM datasets and how scripted acquisition tools (SerialEM), mosaicking and registration (ir-tools), and large slice viewers (MosaicBuilder, Viking) can be used to manage terabyte-scale volumes. These methods enable large-scale connectivity analyses of new and legacy data. In well-posed tasks (e.g., complete network mapping in retina), terabyte-scale image volumes that previously would require decades of assembly can now be completed in months. Perhaps more importantly, the fusion of molecular profiling, image acquisition by SerialEM, ir-tools volume assembly, and data viewers/annotators also allow ssTEM to be used as a prospective tool for discovery in nonneural systems and a practical screening methodology for neurogenetics. Finally

  4. Assessment of vertical transfer in problem solving: Mapping the problem design space

    Science.gov (United States)

    Von Korff, Joshua; Hu, Dehui; Rebello, N. Sanjay

    2012-02-01

    In schema-based theories of cognition, vertical transfer occurs when a learner constructs a new schema to solve a transfer task or chooses between several possible schemas. Vertical transfer is interesting to study, but difficult to measure. Did the student solve the problem using the desired schema or by an alternative method? Perhaps the problem cued the student to use certain resources without the student knowing why? In this paper, we consider some of the threats to validity in problem design. We provide a theoretical framework to explain the challenges faced in designing vertical transfer problems, and we contrast these challenges with horizontal transfer problem design. We have developed this framework from a set of problems that we tested on introductory mechanics students, and we illustrate the framework using one of the problems.

  5. Approach of simultaneous localization and mapping based on local maps for robot

    Institute of Scientific and Technical Information of China (English)

    CHEN Bai-fan; CAI Zi-xing; HU De-wen

    2006-01-01

    An extended Kalman filter approach to simultaneous localization and mapping (SLAM) was proposed based on local maps. A local frame of reference is established periodically at the position of the robot, and the observations of the robot and landmarks are then fused into the global frame of reference. Because of the independence of the local map, the approach does not accumulate the estimation and calculation errors that are produced when SLAM uses the Kalman filter directly. At the same time, it reduces the computational complexity. The method is shown to be correct and feasible in simulation experiments.
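
    The key operation when a local map is closed is transforming its estimates into the global frame with first-order covariance propagation. A minimal sketch of that single step (the full filter bookkeeping and cross-covariances are omitted):

        import numpy as np

        def to_global(frame_pose, m_local, P_local, P_pose):
            """Transform a landmark estimate (mean m_local, covariance P_local)
            from a local frame into the global frame; frame_pose = (x, y, theta)
            is the local frame origin in global coordinates, P_pose its 3x3 covariance."""
            x, y, th = frame_pose
            c, s = np.cos(th), np.sin(th)
            R = np.array([[c, -s], [s, c]])
            m_global = np.array([x, y]) + R @ m_local
            # Jacobian w.r.t. (x, y, theta) for first-order covariance propagation
            J_pose = np.array([[1, 0, -s * m_local[0] - c * m_local[1]],
                               [0, 1,  c * m_local[0] - s * m_local[1]]])
            P_global = R @ P_local @ R.T + J_pose @ P_pose @ J_pose.T
            return m_global, P_global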

  6. Computational Approach to Electron Charge Transfer Reactions

    DEFF Research Database (Denmark)

    Jónsson, Elvar Örn

    -molecular mechanics scheme, and tools to analyse statistical data and generate relative free energies and free energy surfaces. The methodology is applied to several charge transfer species and reactions in chemical environments - chemical in the sense that solvent, counter ions and substrate surfaces are taken into account - which directly influence the reactants and the resulting reaction through both physical and chemical interactions. All methods are, however, general and can be applied to different types of chemistry. First, the basis of the various theoretical tools is presented and applied to several test systems … and asymmetric charge transfer reactions between several first-row transition metals in water. The results are compared to experiments and rationalised with classical analytic expressions. Shortcomings of the methods are accounted for with clear steps towards improved accuracy. Later the analysis is extended

  7. An integrated high-performance beam optics-nuclear processes framework with hybrid transfer map-Monte Carlo particle transport and optimization

    Energy Technology Data Exchange (ETDEWEB)

    Bandura, L., E-mail: bandura@msu.ed [Argonne National Laboratory, Argonne, IL 60439 (United States); Erdelyi, B. [Argonne National Laboratory, Argonne, IL 60439 (United States); Northern Illinois University, DeKalb, IL 60115 (United States); Nolen, J. [Argonne National Laboratory, Argonne, IL 60439 (United States)

    2010-12-01

    An integrated beam optics-nuclear processes framework is essential for accurate simulation of fragment separator beam dynamics. The code COSY INFINITY provides powerful differential algebraic methods for modeling and beam dynamics simulations in the absence of beam-material interactions. However, these interactions are key for accurately simulating the dynamics of heavy ion fragmentation and fission. We have developed an extended version of the code that includes these interactions, and a set of new tools that allow efficient and accurate particle transport: by transfer map in vacuum and by Monte Carlo methods in materials. The new framework is presented, along with several examples from a preliminary layout of a fragment separator for a facility for rare isotope beams.
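
    A cartoon of the hybrid transport idea, with an invented beamline: deterministic linear transfer maps in vacuum, and a stochastic Monte Carlo kick where the beam crosses material (the real framework uses high-order maps and nuclear-process models rather than this toy Gaussian scattering):

        import numpy as np

        def drift(L):
            # linear transfer map of a drift of length L in (x, x') phase space
            return np.array([[1.0, L], [0.0, 1.0]])

        def material_kick(X, theta0, rng):
            # Monte Carlo multiple-scattering kick: Gaussian angular deflection
            X = X.copy()
            X[:, 1] += rng.normal(0.0, theta0, size=len(X))
            return X

        rng = np.random.default_rng(0)
        X = rng.normal(0.0, [1e-3, 1e-4], size=(10000, 2))   # beam: x [m], x' [rad]
        X = X @ drift(2.0).T              # 2 m of vacuum: deterministic transfer map
        X = material_kick(X, 5e-4, rng)   # degrader: stochastic scattering in material
        X = X @ drift(1.0).T              # further vacuum transport
        print(X[:, 0].std())              # beam size growth from the scattering kick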

  8. An integrated high-performance beam optics-nuclear processes framework with hybrid transfer map-Monte Carlo particle transport and optimization

    International Nuclear Information System (INIS)

    Bandura, L.; Erdelyi, B.; Nolen, J.

    2010-01-01

    An integrated beam optics-nuclear processes framework is essential for accurate simulation of fragment separator beam dynamics. The code COSY INFINITY provides powerful differential algebraic methods for modeling and beam dynamics simulations in the absence of beam-material interactions. However, these interactions are key for accurately simulating the dynamics of heavy ion fragmentation and fission. We have developed an extended version of the code that includes these interactions, and a set of new tools that allow efficient and accurate particle transport: by transfer map in vacuum and by Monte Carlo methods in materials. The new framework is presented, along with several examples from a preliminary layout of a fragment separator for a facility for rare isotope beams.

  9. Fluctuating hydrodynamics for multiscale modeling and simulation: energy and heat transfer in molecular fluids.

    Science.gov (United States)

    Shang, Barry Z; Voulgarakis, Nikolaos K; Chu, Jhih-Wei

    2012-07-28

    This work illustrates that fluctuating hydrodynamics (FHD) simulations can be used to capture the thermodynamic and hydrodynamic responses of molecular fluids at the nanoscale, including those associated with energy and heat transfer. Using all-atom molecular dynamics (MD) trajectories as the reference data, the atomistic coordinates of each snapshot are mapped onto mass, momentum, and energy density fields on Eulerian grids to generate a corresponding field trajectory. The molecular length-scale associated with finite molecule size is explicitly imposed during this coarse-graining by requiring that the variances of density fields scale inversely with the grid volume. From the fluctuations of field variables, the response functions and transport coefficients encoded in the all-atom MD trajectory are computed. By using the extracted fluid properties in FHD simulations, we show that the fluctuations and relaxation of hydrodynamic fields quantitatively match with those observed in the reference all-atom MD trajectory, hence establishing compatibility between the atomistic and field representations. We also show that inclusion of energy transfer in the FHD equations can more accurately capture the thermodynamic and hydrodynamic responses of molecular fluids. The results indicate that the proposed MD-to-FHD mapping with explicit consideration of finite molecule size provides a robust framework for coarse-graining the solution phase of complex molecular systems.
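
    The MD-to-field mapping step amounts to depositing particle quantities onto an Eulerian grid. A nearest-grid-point sketch for mass and momentum densities (the paper's scheme, including its treatment of finite molecule size and the energy density field, is more elaborate):

        import numpy as np

        def deposit(positions, masses, velocities, L, n):
            """Bin particle mass and momentum onto an n^3 Eulerian grid of box size L
            (periodic), returning mass and momentum density fields."""
            idx = np.floor(positions / L * n).astype(int) % n
            flat = np.ravel_multi_index((idx[:, 0], idx[:, 1], idx[:, 2]), (n, n, n))
            rho = np.bincount(flat, weights=masses, minlength=n**3).reshape(n, n, n)
            mom = np.stack([np.bincount(flat, weights=masses * velocities[:, d],
                                        minlength=n**3).reshape(n, n, n)
                            for d in range(3)], axis=-1)
            cell_vol = (L / n) ** 3
            return rho / cell_vol, mom / cell_vol

        rng = np.random.default_rng(0)
        pos = rng.uniform(0, 10.0, (100000, 3)); vel = rng.normal(0, 1, (100000, 3))
        rho, mom = deposit(pos, np.ones(100000), vel, L=10.0, n=16)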

  10. A literature survey on numerical heat transfer

    Science.gov (United States)

    Shih, T. M.

    1982-12-01

    Technical papers in the area of numerical heat transfer published from 1977 through 1981 are reviewed. The journals surveyed include: (1) ASME Journal of Heat Transfer, (2) International Journal of Heat and Mass Transfer, (3) AIAA Journal, (4) Numerical Heat Transfer, (5) Computers and Fluids, (6) International Journal for Numerical Methods in Engineering, (7) SIAM Journal of Numerical Analysis, and (8) Journal of Computational Physics. This survey excludes experimental work in heat transfer and numerical schemes that are not applied to equations governing heat transfer phenomena. The research work is categorized into the following areas: (A) conduction, (B) boundary-layer flows, (C) momentum and heat transfer in cavities, (D) turbulent flows, (E) convection around cylinders and spheres or within annuli, (F) numerical convective instability, (G) radiation, (H) combustion, (I) plumes, jets, and wakes, (J) heat transfer in porous media, (K) boiling, condensation, and two-phase flows, (L) developing and fully developed channel flows, (M) combined heat and mass transfer, (N) applications, (O) comparison and properties of numerical schemes, and (P) body-fitted coordinates and nonuniform grids.

  11. Fluence map segmentation

    International Nuclear Information System (INIS)

    Rosenwald, J.-C.

    2008-01-01

    The lecture addressed the following topics: 'Interpreting' the fluence map; The sequencer; Reasons for difference between desired and actual fluence map; Principle of 'Step and Shoot' segmentation; Large number of solutions for given fluence map; Optimizing 'step and shoot' segmentation; The interdigitation constraint; Main algorithms; Conclusions on segmentation algorithms (static mode); Optimizing intensity levels and monitor units; Sliding window sequencing; Synchronization to avoid the tongue-and-groove effect; Accounting for physical characteristics of MLC; Importance of corrections for leaf transmission and offset; Accounting for MLC mechanical constraints; The 'complexity' factor; Incorporating the sequencing into optimization algorithm; Data transfer to the treatment machine; Interface between R&V and accelerator; and Conclusions on fluence map segmentation (Segmentation is part of the overall inverse planning procedure; 'Step and Shoot' and 'Dynamic' options are available for most TPS (depending on accelerator model); The segmentation phase tends to come into the optimization loop; The physical characteristics of the MLC have a large influence on final dose distribution; The IMRT plans (MU and relative dose distribution) must be carefully validated). (P.A.)
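
    To illustrate what a 'step and shoot' sequencer must do, here is a naive greedy decomposition of an integer fluence map into unit-weight segments whose rows are single leaf-pair apertures. It ignores interdigitation, transmission, and MU optimization, and is not any specific clinical algorithm:

        import numpy as np

        def step_and_shoot(fluence):
            """Decompose an integer fluence map into unit-weight MLC segments.
            Each segment opens, per row, the first maximal run of positive bixels,
            so every row is a single contiguous aperture."""
            F = fluence.copy()
            segments = []
            while F.any():
                seg = np.zeros_like(F)
                for i, row in enumerate(F):
                    j = np.argmax(row > 0)           # left edge of first positive run
                    if row[j] > 0:
                        k = j
                        while k < len(row) and row[k] > 0:
                            k += 1
                        seg[i, j:k] = 1
                F -= seg
                segments.append(seg)
            return segments

        F = np.array([[0, 2, 1, 0],
                      [1, 3, 0, 2]])
        segs = step_and_shoot(F)
        print(len(segs), np.array_equal(sum(segs), F))   # segments sum back to F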

  12. DirtyGrid I: 3D Dust Radiative Transfer Modeling of Spectral Energy Distributions of Dusty Stellar Populations

    Science.gov (United States)

    Law, Ka-Hei; Gordon, Karl D.; Misselt, Karl A.

    2018-06-01

    Understanding the properties of stellar populations and interstellar dust has important implications for galaxy evolution. In normal star-forming galaxies, stars and the interstellar medium dominate the radiation from ultraviolet (UV) to infrared (IR). In particular, interstellar dust absorbs and scatters UV and optical light, re-emitting the absorbed energy in the IR. This is a strongly nonlinear process that makes independent studies of the UV-optical and IR susceptible to large uncertainties and degeneracies. Over the years, UV to IR spectral energy distribution (SED) fitting utilizing varying approximations has revealed important results on the stellar and dust properties of galaxies. Yet the approximations limit the fidelity of the derived properties. Sufficient computer power is now available to remove these approximations and map out the landscape of galaxy SEDs using full dust radiative transfer. This improves upon previous work by directly connecting the UV, optical, and IR through dust grain physics. We present the DIRTYGrid, a grid of radiative transfer models of SEDs of dusty stellar populations in galactic environments designed to span the full range of physical parameters of galaxies. Using the stellar and gas radiation input from the stellar population synthesis model PEGASE, our radiative transfer model DIRTY self-consistently computes the UV to far-IR/sub-mm SEDs for each set of parameters in our grid. DIRTY computes the dust absorption, scattering, and emission from the local radiation field and a dust grain model, thereby physically connecting the UV-optical to the IR. We describe the computational method and explain the choices of parameters in DIRTYGrid. The computation took millions of CPU hours on supercomputers, and the SEDs produced are an invaluable tool for fitting multi-wavelength data sets. We provide the complete set of SEDs in an online table.

  13. The development and mapping of functional markers in Fragaria and their transferability and potential for mapping in other genera.

    Science.gov (United States)

    Sargent, D J; Rys, A; Nier, S; Simpson, D W; Tobutt, K R

    2007-01-01

    We have developed 46 primer pairs from exon sequences flanking polymorphic introns of 23 Fragaria gene sequences and one Malus sequence deposited in the EMBL database. Sequencing of a set of the PCR products amplified with the novel primer pairs in diploid Fragaria showed the products to be homologous to the sequences from which the primers were originally designed. By scoring the segregation of the 24 genes in two diploid Fragaria progenies, FV x FN (F. vesca x F. nubicola F2) and 815 x 903BC (F. vesca x F. viridis BC1), 29 genetic loci at discrete positions on the seven previously characterised linkage groups could be mapped, bringing to 35 the total number of known-function genes mapped in Fragaria. Twenty primer pairs, representing 14 genes, amplified a product of the expected size in both Malus and Prunus. To demonstrate the applicability of these gene-specific loci to comparative mapping in Rosaceae, five markers that displayed clear polymorphism between the parents of a Malus and a Prunus mapping population were selected. The markers were then scored and mapped in at least one of the two additional progenies.

  14. Participatory maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    towards a new political ecology. This type of digital cartography has been highlighted as the ‘processual turn’ in critical cartography, whereas in related computational journalism it can be seen as an interactive and iterative process of mapping complex and fragile ecological developments. This paper

  15. REST-MapReduce: An Integrated Interface but Differentiated Service

    Directory of Open Access Journals (Sweden)

    Jong-Hyuk Park

    2014-01-01

    Full Text Available With the fast deployment of cloud computing, MapReduce architectures are becoming the major technologies for mobile cloud computing. The concept of MapReduce was first introduced as a novel programming model and implementation for a large set of computing devices. In this research, we propose a novel concept of REST-MapReduce, enabling users to use only the REST interface without using the MapReduce architecture. This approach provides a higher level of abstraction by integration of the two types of access interface, REST API and MapReduce. The motivation of this research stems from the slower response time for accessing a simple RDBMS on Hadoop than for direct access to the RDBMS. This is because of the overhead from job scheduling, initiating, starting, tracking, and management during MapReduce-based parallel execution. Therefore, we provide good performance for both the REST Open API service and MapReduce. This is very useful for constructing REST Open API services on Hadoop hosting services, for example, Amazon AWS (Macdonald, 2005) or IBM Smart Cloud. To evaluate the performance of our REST-MapReduce framework, we conducted experiments with the Jersey REST web server and Hadoop. Experimental results show that our approach outperforms conventional approaches.

  16. Debugging data transfers in CMS

    International Nuclear Information System (INIS)

    Bagliesi, G; Belforte, S; Bloom, K; Bockelman, B; Bonacorsi, D; Fisk, I; Flix, J; Hernandez, J; D'Hondt, J; Maes, J; Kadastik, M; Klem, J; Kodolova, O; Kuo, C-M; Letts, J; Magini, N; Metson, S; Piedra, J; Pukhaeva, N; Tuura, L

    2010-01-01

    The CMS experiment at CERN is preparing for LHC data taking in several computing preparation activities. In early 2007 a traffic load generator infrastructure for distributed data transfer tests was designed and deployed to equip the WLCG tiers which support the CMS virtual organization with a means for debugging, load-testing and commissioning data transfer routes among CMS computing centres. The LoadTest is based upon PhEDEx as a reliable, scalable data set replication system. The Debugging Data Transfers (DDT) task force was created to coordinate the debugging of the data transfer links. The task force aimed to commission most crucial transfer routes among CMS tiers by designing and enforcing a clear procedure to debug problematic links. Such procedure aimed to move a link from a debugging phase in a separate and independent environment to a production environment when a set of agreed conditions are achieved for that link. The goal was to deliver one by one working transfer routes to the CMS data operations team. The preparation, activities and experience of the DDT task force within the CMS experiment are discussed. Common technical problems and challenges encountered during the lifetime of the taskforce in debugging data transfer links in CMS are explained and summarized.

  17. WLCG transfers dashboard: a unified monitoring tool for heterogeneous data transfers

    International Nuclear Information System (INIS)

    Andreeva, J; Beche, A; Saiz, P; Tuckett, D; Belov, S; Kadochnikov, I

    2014-01-01

    The Worldwide LHC Computing Grid provides resources for the four main virtual organizations. Along with data processing, data distribution is the key computing activity on the WLCG infrastructure. The scale of this activity is very large, the ATLAS virtual organization (VO) alone generates and distributes more than 40 PB of data in 100 million files per year. Another challenge is the heterogeneity of data transfer technologies. Currently there are two main alternatives for data transfers on the WLCG: File Transfer Service and XRootD protocol. Each LHC VO has its own monitoring system which is limited to the scope of that particular VO. There is a need for a global system which would provide a complete cross-VO and cross-technology picture of all WLCG data transfers. We present a unified monitoring tool – WLCG Transfers Dashboard – where all the VOs and technologies coexist and are monitored together. The scale of the activity and the heterogeneity of the system raise a number of technical challenges. Each technology comes with its own monitoring specificities and some of the VOs use several of these technologies. This paper describes the implementation of the system with particular focus on the design principles applied to ensure the necessary scalability and performance, and to easily integrate any new technology providing additional functionality which might be specific to that technology.

  18. WLCG Transfers Dashboard: a Unified Monitoring Tool for Heterogeneous Data Transfers

    Science.gov (United States)

    Andreeva, J.; Beche, A.; Belov, S.; Kadochnikov, I.; Saiz, P.; Tuckett, D.

    2014-06-01

    The Worldwide LHC Computing Grid provides resources for the four main virtual organizations. Along with data processing, data distribution is the key computing activity on the WLCG infrastructure. The scale of this activity is very large, the ATLAS virtual organization (VO) alone generates and distributes more than 40 PB of data in 100 million files per year. Another challenge is the heterogeneity of data transfer technologies. Currently there are two main alternatives for data transfers on the WLCG: File Transfer Service and XRootD protocol. Each LHC VO has its own monitoring system which is limited to the scope of that particular VO. There is a need for a global system which would provide a complete cross-VO and cross-technology picture of all WLCG data transfers. We present a unified monitoring tool - WLCG Transfers Dashboard - where all the VOs and technologies coexist and are monitored together. The scale of the activity and the heterogeneity of the system raise a number of technical challenges. Each technology comes with its own monitoring specificities and some of the VOs use several of these technologies. This paper describes the implementation of the system with particular focus on the design principles applied to ensure the necessary scalability and performance, and to easily integrate any new technology providing additional functionality which might be specific to that technology.

  19. User's guide to HEATRAN: a computer program for three-dimensional transient fluid-flow and heat-transfer analysis

    International Nuclear Information System (INIS)

    Wong, C.N.C.; Cheng, S.K.; Todreas, N.E.

    1982-01-01

    This report provides the HEATRAN user with programming and input information. HEATRAN is a computer program written to analyze transient, three-dimensional, single-phase incompressible fluid flow and heat transfer problems. In this report, the programming information is given first. This information includes details concerning the code and its structure. The description of the required input variables is presented next. Following the input description, the sample problems are described and HEATRAN's results are presented

  20. Severity mapping of the proximal femur: a new method for assessing hip osteoarthritis with computed tomography.

    Science.gov (United States)

    Turmezei, T D; Lomas, D J; Hopper, M A; Poole, K E S

    2014-10-01

    Plain radiography has been the mainstay of imaging assessment in osteoarthritis for over 50 years, but it does have limitations. Here we present the methodology and results of a new technique for identifying, grading, and mapping the severity and spatial distribution of osteoarthritic disease features at the hip in 3D with clinical computed tomography (CT). CT imaging of 456 hips from 230 adult female volunteers (mean age 66 ± 17 years) was reviewed using 3D multiplanar reformatting to identify bone-related radiological features of osteoarthritis, namely osteophytes, subchondral cysts and joint space narrowing. Scoresheets dividing up the femoral head, head-neck region and the joint space were used to register the location and severity of each feature (scored from 0 to 3). Novel 3D cumulative feature severity maps were then created to display where the most severe disease features from each individual were anatomically located across the cohort. Feature severity maps showed a propensity for osteophytes at the inferoposterior and superolateral femoral head-neck junction. Subchondral cysts were a less common and less localised phenomenon. Joint space narrowing […] osteophytes and joint space narrowing for the first time. We believe this technique can perform several important roles in future osteoarthritis research, including phenotyping and sensitive disease assessment in 3D.

  1. Computational Fluid Dynamics-Discrete Element Method (CFD-DEM) Study of Mass-Transfer Mechanisms in Riser Flow.

    Science.gov (United States)

    Carlos Varas, Álvaro E; Peters, E A J F; Kuipers, J A M

    2017-05-17

    We report a computational fluid dynamics-discrete element method (CFD-DEM) simulation study on the interplay between mass transfer and a heterogeneous catalyzed chemical reaction in cocurrent gas-particle flows as encountered in risers. Slip velocity, axial gas dispersion, gas bypassing, and particle mixing phenomena have been evaluated under riser flow conditions to study the complex system behavior in detail. The most important factors are found to be directly related to particle cluster formation. Low air-to-solids flux ratios lead to more heterogeneous systems, where the cluster formation is more pronounced and mass transfer more influenced. Falling clusters can be partially circumvented by the gas phase, which therefore does not fully interact with the cluster particles, leading to poor gas-solid contact efficiencies. Cluster gas-solid contact efficiencies are quantified at several gas superficial velocities, reaction rates, and dilution factors in order to gain more insight regarding the influence of clustering phenomena on the performance of riser reactors.

  2. GIS-aided low flow mapping

    Science.gov (United States)

    Saghafian, B.; Mohammadi, A.

    2003-04-01

    Most studies involving water resources allocation, water quality, hydropower generation, and allowable water withdrawal and transfer require estimation of low flows. Normally, frequency analysis on at-station D-day low flow data is performed to derive various T-yr return period values. However, this analysis is restricted to the locations of hydrometric stations where the flow discharge is measured. Regional analysis is therefore conducted to relate the at-station low flow quantiles to watershed characteristics. This enables the transposition of low flow quantiles to ungauged sites. Nevertheless, a procedure to map the regional regression relations for the entire stream network, within the bounds of the relations, is particularly helpful when one studies and weighs alternative sites for a water resources project. In this study, we used a GIS-aided procedure for low flow mapping in Gilan province, part of the northern region of Iran. Gilan enjoys a humid climate with an average of 1100 mm annual precipitation. Although rich in water resources, the highly populated area is quite dependent on minimum flows to sustain the vast rice farming and to maintain the flow discharge required for water quality purposes. To carry out the low flow analysis, a total of 36 hydrometric stations with sufficient and reliable discharge data were identified in the region. The average area of the watersheds was 250 sq. km. Log Pearson type 3 was found to be the best distribution for flow durations over 60 days, while the lognormal fitted the shorter-duration series well. Low flows with return periods of 2, 5, 10, 25, 50, and 100 years were then computed. Cluster analysis identified two homogeneous areas. Although various watershed parameters were examined in factor analysis, the results showed that watershed area, length of the main stream, and annual precipitation were the most effective low flow parameters. The regression equations were then mapped with the aid of GIS based on flow accumulation maps.
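
    The at-station step can be sketched with SciPy: fit a distribution to annual D-day minima and read off T-yr quantiles at non-exceedance probability 1/T. Synthetic data and a lognormal fit are used here for brevity; the study used Log Pearson type 3 for the longer durations:

        import numpy as np
        from scipy import stats

        def low_flow_quantiles(annual_minima, return_periods=(2, 5, 10, 25, 50, 100)):
            """Fit a lognormal to annual D-day low-flow minima and return T-yr
            quantiles; for low flows the T-yr event has non-exceedance probability 1/T."""
            shape, loc, scale = stats.lognorm.fit(annual_minima, floc=0)
            return {T: stats.lognorm.ppf(1.0 / T, shape, loc=loc, scale=scale)
                    for T in return_periods}

        flows = np.random.lognormal(mean=1.0, sigma=0.4, size=40)   # synthetic 40-yr record
        print(low_flow_quantiles(flows))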

  3. Response variance in functional maps: neural darwinism revisited.

    Directory of Open Access Journals (Sweden)

    Hirokazu Takahashi

    Full Text Available The mechanisms by which functional maps and map plasticity contribute to cortical computation remain controversial. Recent studies have revisited the theory of neural Darwinism to interpret the learning-induced map plasticity and neuronal heterogeneity observed in the cortex. Here, we hypothesize that the Darwinian principle provides a substrate to explain the relationship between neuron heterogeneity and cortical functional maps. We demonstrate in the rat auditory cortex that the degree of response variance is closely correlated with the size of its representational area. Further, we show that the response variance within a given population is altered through training. These results suggest that larger representational areas may help to accommodate heterogeneous populations of neurons. Thus, functional maps and map plasticity are likely to play essential roles in Darwinian computation, serving as effective, but not absolutely necessary, structures to generate diverse response properties within a neural population.

  4. Response variance in functional maps: neural darwinism revisited.

    Science.gov (United States)

    Takahashi, Hirokazu; Yokota, Ryo; Kanzaki, Ryohei

    2013-01-01

    The mechanisms by which functional maps and map plasticity contribute to cortical computation remain controversial. Recent studies have revisited the theory of neural Darwinism to interpret the learning-induced map plasticity and neuronal heterogeneity observed in the cortex. Here, we hypothesize that the Darwinian principle provides a substrate to explain the relationship between neuron heterogeneity and cortical functional maps. We demonstrate in the rat auditory cortex that the degree of response variance is closely correlated with the size of its representational area. Further, we show that the response variance within a given population is altered through training. These results suggest that larger representational areas may help to accommodate heterogeneous populations of neurons. Thus, functional maps and map plasticity are likely to play essential roles in Darwinian computation, serving as effective, but not absolutely necessary, structures to generate diverse response properties within a neural population.

  5. Cosmic string induced CMB maps

    International Nuclear Information System (INIS)

    Landriau, M.; Shellard, E. P. S.

    2011-01-01

    We compute maps of CMB temperature fluctuations seeded by cosmic strings using high resolution simulations of cosmic strings in a Friedmann-Robertson-Walker universe. We create full-sky, 18 deg. and 3 deg. CMB maps, including the relevant string contribution at each resolution from before recombination to today. We extract the angular power spectrum from these maps, demonstrating the importance of recombination effects. We briefly discuss the probability density function of the pixel temperatures, their skewness, and kurtosis.

  6. Information technology road map 2015

    International Nuclear Information System (INIS)

    2009-09-01

    This book introduces the information technology road map 2015, covering its presentation, process, plan, and conclusions. It also introduces the road map by field: next-generation semiconductors, displays, light emitting diodes and the light industry, home networks and home electronic appliances, digital TV and broadcasting, radio technology, satellite communications, next-generation mobile communication, the BcN field, software, next-generation computers, and knowledge information security.

  7. PORFLO - a continuum model for fluid flow, heat transfer, and mass transport in porous media. Model theory, numerical methods, and computational tests

    International Nuclear Information System (INIS)

    Runchal, A.K.; Sagar, B.; Baca, R.G.; Kline, N.W.

    1985-09-01

    Postclosure performance assessment of the proposed high-level nuclear waste repository in flood basalts at Hanford requires that the processes of fluid flow, heat transfer, and mass transport be numerically modeled at appropriate space and time scales. A suite of computer models has been developed to meet this objective. The theory of one of these models, named PORFLO, is described in this report. Also presented are a discussion of the numerical techniques in the PORFLO computer code and a few computational test cases. Three two-dimensional equations, one each for fluid flow, heat transfer, and mass transport, are numerically solved in PORFLO. The governing equations are derived from the principle of conservation of mass, momentum, and energy in a stationary control volume that is assumed to contain a heterogeneous, anisotropic porous medium. Broad discrete features can be accommodated by specifying zones with distinct properties, or these can be included by defining an equivalent porous medium. The governing equations are parabolic differential equations that are coupled through time-varying parameters. Computational tests of the model are done by comparisons of simulation results with analytic solutions, with results from other independently developed numerical models, and with available laboratory and/or field data. In this report, in addition to the theory of the model, results from three test cases are discussed. A users' manual for the computer code resulting from this model has been prepared and is available as a separate document. 37 refs., 20 figs., 15 tabs

  8. Quantum Programs as Kleisli Maps

    Directory of Open Access Journals (Sweden)

    Abraham Westerbaan

    2017-01-01

    Full Text Available Furber and Jacobs have shown in their study of quantum computation that the category of commutative C*-algebras and PU-maps (positive linear maps which preserve the unit) is isomorphic to the Kleisli category of a comonad on the category of commutative C*-algebras with MIU-maps (linear maps which preserve multiplication, involution and unit). [Furber and Jacobs, 2013] In this paper, we prove a non-commutative variant of this result: the category of C*-algebras and PU-maps is isomorphic to the Kleisli category of a comonad on the subcategory of MIU-maps. A variation on this result has been used to construct a model of Selinger and Valiron's quantum lambda calculus using von Neumann algebras. [Cho and Westerbaan, 2016]

  9. Fine art of computing nulling interferometer maps

    Science.gov (United States)

    Hénault, F.

    2008-07-01

    Spaceborne nulling interferometers are often characterized by means of their nulling ratio, which is defined as the deepest possible extinction of a target star supposed to harbor an extra-solar system. Here we show that another parameter, the transmitting efficiency of the nearby bright fringes, is also of prime importance. More generally, "nulling maps", formed by the whole destructive and constructive fringe pattern projected on-sky, are found to be very sensitive to the design of some subsystems constituting the interferometer. In particular, we consider Spatial Filtering (SF) and Achromatic Phase Shifter (APS) devices, both required to achieve planet detection and characterization. Consequences of the SF choice (pinhole or single-mode optical fiber) and APS properties (with or without induced pupil-flip) are discussed, for both monochromatic and polychromatic cases. Examples of numerical simulations are provided for single Bracewell interferometer, Angel cross and X-array configurations, demonstrating noticeable differences in the aspect of the resulting nulling maps. It is concluded that both SF and APS designs exhibit variable capacities for serendipitous planet discovery.

  10. THE ELECTRONIC COURSE OF HEAT AND MASS TRANSFER

    Directory of Open Access Journals (Sweden)

    Alexander P. Solodov

    2013-01-01

    Full Text Available The electronic course of heat and mass transfer in power engineering is presented. It contains the full electronic book as a structured hypertext document, a complete set of Mathcad documents with educational computer models of heat and mass transfer, computer labs, and selected educational presentations.

  11. A guide for the selection of computer assisted mapping (CAM) and facilities information systems

    Energy Technology Data Exchange (ETDEWEB)

    Haslin, S.; Baxter, P.; Jarvis, L.

    1980-12-01

    Many distribution engineers are now aware that computer assisted mapping (CAM) and facilities information systems are probably the most significant breakthrough to date in computer applications for distribution engineering. The Canadian Electrical Association (CEA) recognized this and requested that engineers of B.C. Hydro study the state of the art in Canadian utilities and the progress of CAM systems internationally. The purpose was to provide a guide to assist Canadian utility distribution engineers faced with the problem of studying the application of CAM systems as an alternative to present methods, consideration being given to the long-term and other benefits that were perhaps not apparent to those approaching this field for the first time. It soon became apparent that technology was developing at a high rate and competition in the market was very strong. Also, a number of publications produced by other sources adequately covered the scope of this study. This report is thus a collection of references to reports, manuals, and other documents with a few considerations provided for those companies interested in exploring further the use of interactive graphics. 24 refs.

  12. Assessing Changes in High School Students' Conceptual Understanding through Concept Maps before and after the Computer-Based Predict-Observe-Explain (CB-POE) Tasks on Acid-Base Chemistry at the Secondary Level

    Science.gov (United States)

    Yaman, Fatma; Ayas, Alipasa

    2015-01-01

    Although concept maps have been used as alternative assessment methods in education, there has been an ongoing debate on how to evaluate students' concept maps. This study discusses how to evaluate students' concept maps as an assessment tool before and after 15 computer-based Predict-Observe-Explain (CB-POE) tasks related to acid-base chemistry.…

  13. Matching by Monotonic Tone Mapping.

    Science.gov (United States)

    Kovacs, Gyorgy

    2018-06-01

    In this paper, a novel dissimilarity measure called Matching by Monotonic Tone Mapping (MMTM) is proposed. The MMTM technique allows matching under non-linear monotonic tone mappings and can be computed efficiently when the tone mappings are approximated by piecewise constant or piecewise linear functions. The proposed method is evaluated in various template matching scenarios involving simulated and real images, and compared to other measures developed to be invariant to monotonic intensity transformations. The results show that the MMTM technique is a highly competitive alternative of conventional measures in problems where possible tone mappings are close to monotonic.
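
    For the piecewise-constant case, the best monotonically increasing tone mapping in the least-squares sense is exactly an isotonic regression, so an MMTM-style dissimilarity can be sketched with scikit-learn (increasing mappings only; a full implementation would also test decreasing mappings and follow the paper's matching details):

        import numpy as np
        from sklearn.isotonic import IsotonicRegression

        def mmtm(template, window):
            """Residual after the best monotonic (increasing) tone mapping of the
            template onto the window: an isotonic least-squares fit of window
            values against template intensities."""
            t, w = template.ravel(), window.ravel()
            mapped = IsotonicRegression(increasing=True).fit_transform(t, w)
            return np.sum((mapped - w) ** 2)

        rng = np.random.default_rng(0)
        template = rng.uniform(0, 1, (16, 16))
        window = np.sqrt(template)          # a nonlinear but monotonic tone mapping
        print(mmtm(template, window))       # ~0: a match up to a monotonic mapping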

  14. Optimized Data Transfers Based on the OpenCL Event Management Mechanism

    Directory of Open Access Journals (Sweden)

    Hiroyuki Takizawa

    2015-01-01

    Full Text Available In standard OpenCL programming, hosts are supposed to control their compute devices. Since compute devices are dedicated to kernel computation, only hosts can execute several kinds of data transfers such as internode communication and file access. These data transfers require one host to simultaneously play two or more roles due to the need for collaboration between the host and devices. The codes for such data transfers are likely to be system-specific, resulting in low portability. This paper proposes an OpenCL extension that incorporates such data transfers into the OpenCL event management mechanism. Unlike the current OpenCL standard, the main thread running on the host is not blocked to serialize dependent operations. Hence, an application can easily use the opportunities to overlap parallel activities of hosts and compute devices. In addition, the implementation details of data transfers are hidden behind the extension, and application programmers can use the optimized data transfers without any tricky programming techniques. The evaluation results show that the proposed extension can use the optimized data transfer implementation and thereby increase the sustained data transfer performance by about 18% for a real application accessing a big data file.
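
    The standard event mechanism that the proposed extension builds on looks like this in pyopencl: dependent operations are serialized through events rather than by blocking the host thread. (The extension then lets file and network transfers join such event chains; that part is not shown here because it is not in the standard API.)

        import numpy as np
        import pyopencl as cl

        src = "__kernel void scale(__global float *a) { a[get_global_id(0)] *= 2.0f; }"

        ctx = cl.create_some_context()
        queue = cl.CommandQueue(ctx)
        prg = cl.Program(ctx, src).build()

        host = np.arange(1024, dtype=np.float32)
        buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                        hostbuf=host)

        evt_kernel = prg.scale(queue, host.shape, None, buf)
        out = np.empty_like(host)
        # the copy is ordered after the kernel via its event, not by blocking the host
        evt_copy = cl.enqueue_copy(queue, out, buf, wait_for=[evt_kernel],
                                   is_blocking=False)
        # ... the host thread is free here for file I/O or internode communication ...
        evt_copy.wait()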

  15. Map Archive Mining: Visual-Analytical Approaches to Explore Large Historical Map Collections

    Directory of Open Access Journals (Sweden)

    Johannes H. Uhl

    2018-04-01

    Full Text Available Historical maps are unique sources of retrospective geographical information. Recently, several map archives containing map series covering large spatial and temporal extents have been systematically scanned and made available to the public. The geographical information contained in such data archives makes it possible to extend geospatial analysis retrospectively beyond the era of digital cartography. However, given the large data volumes of such archives (e.g., more than 200,000 map sheets in the United States Geological Survey topographic map archive) and the low graphical quality of older, manually-produced map sheets, the process to extract geographical information from these map archives needs to be automated to the highest degree possible. To understand the potential challenges (e.g., salient map characteristics and data quality variations) in automating large-scale information extraction tasks for map archives, it is useful to efficiently assess spatio-temporal coverage, approximate map content, and spatial accuracy of georeferenced map sheets at different map scales. Such preliminary analytical steps are often neglected or ignored in the map processing literature but represent critical phases that lay the foundation for any subsequent computational processes including recognition. Exemplified for the United States Geological Survey topographic map and the Sanborn fire insurance map archives, we demonstrate how such preliminary analyses can be systematically conducted using traditional analytical and cartographic techniques, as well as visual-analytical data mining tools originating from machine learning and data science.

  16. De novo assembly of a cotyledon-enriched transcriptome map of Vicia faba (L.) for transfer cell research

    Directory of Open Access Journals (Sweden)

    Kiruba Shankari Arun Chinnappa

    2015-04-01

    Full Text Available Vicia faba (L.) is an important cool-season grain legume species used widely in agriculture but also in plant physiology research, particularly as an experimental model to study transfer cell (TC) development. Adaxial epidermal cells of isolated cotyledons can be induced to form functional TCs, thus providing a valuable experimental system to investigate genetic regulation of TC development. The genome of V. faba is exceedingly large (ca. 13 Gb), however, and limited genomic information is available for this species. To provide a resource for transcript profiling of epidermal TC development, we have undertaken de novo assembly of a cotyledon-enriched transcriptome map for V. faba. Illumina paired-end sequencing of total RNA pooled from different tissues and different stages, including isolated cotyledons induced to form TCs, generated 69.5M reads, of which 65.8M were used for assembly following trimming and quality control. Assembly using a de Bruijn graph-based approach within CLC Genomics Workbench v6.1 generated 21,297 contigs, of which 80.6% were successfully annotated against GO terms. The assembly was validated against known V. faba cDNAs held in GenBank, including transcripts previously identified as being specifically expressed in epidermal cells across TC trans-differentiation. This cotyledon-enriched transcriptome map therefore provides a valuable tool for future transcript profiling of epidermal TC development, and also enriches the genetic resources available for this important legume crop species.

  17. Common data buffer system. [communication with computational equipment utilized in spacecraft operations

    Science.gov (United States)

    Byrne, F. (Inventor)

    1981-01-01

    A high speed common data buffer system is described for providing an interface and communications medium between a plurality of computers utilized in a distributed computer complex forming part of a checkout, command and control system for space vehicles and associated ground support equipment. The system includes the capability for temporarily storing data to be transferred between computers, for transferring a plurality of interrupts between computers, for monitoring and recording these transfers, and for correcting errors incurred in these transfers. Validity checks are made on each transfer and appropriate error notification is given to the computer associated with that transfer.

  18. Challenges of model transferability to data-scarce regions (Invited)

    Science.gov (United States)

    Samaniego, L. E.

    2013-12-01

    Developing the ability to globally predict the movement of water on the land surface at spatial scales from 1 to 5 km constitutes one of the grand challenges in land surface modelling. Coping with this grand challenge implies that land surface models (LSM) should be able to make reliable predictions across locations and/or scales other than those used for parameter estimation. In addition, data scarcity and quality impose further difficulties in attaining reliable predictions of water and energy fluxes at the scales of interest. Current computational limitations also impose serious limitations on exhaustively investigating the parameter space of LSM over large domains (e.g. greater than half a million square kilometers). Addressing these challenges requires holistic approaches that integrate the best techniques available for parameter estimation, field measurements and remotely sensed data at their native resolutions. An attempt to systematically address these issues is the multiscale parameter regionalisation (MPR) technique, which links high resolution land surface characteristics with effective model parameters. This technique requires a number of pedo-transfer functions and far fewer global parameters (i.e. coefficients) to be inferred by calibration in gauged basins. The key advantage of this technique is the quasi-scale independence of the global parameters, which enables them to be estimated at coarser spatial resolutions and then transferred to (ungauged) areas and scales of interest. In this study we show the ability of this technique to reproduce the observed water fluxes and states over a wide range of climate and land surface conditions ranging from humid to semiarid and from sparse to dense forested regions. Results of transferability of global model parameters in space (from humid to semi-arid basins) and across scales (from coarser to finer) clearly indicate the robustness of this technique. Simulations with coarse data sets (e.g. EOBS

  19. A BASIC program for an IBM PC compatible computer for drawing the weak phase object contrast transfer function

    International Nuclear Information System (INIS)

    Olsen, A.; Skjerpe, P.

    1989-01-01

    This report describes a computer program which is useful in high resolution microscopy. The program is written in BASIC and calculates the weak phase object contrast transfer function as a function of instrumental and imaging parameters. The function is plotted on the PC graphics screen, and by a Print Screen command it can be copied to the printer. The program runs on both the Hercules graphics card and the IBM CGA card. 2 figs
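
    A minimal re-creation of what such a program computes, under one common sign convention: the aberration phase is chi(k) = pi*lambda*df*k^2 + (pi/2)*Cs*lambda^3*k^4 and the weak phase object contrast transfer function is sin(chi). The values below (300 kV, Cs = 1 mm, near-Scherzer defocus) are illustrative, not taken from the report:

```python
import numpy as np

wavelength = 1.97e-12                    # electron wavelength at ~300 kV [m]
Cs = 1.0e-3                              # spherical aberration coefficient [m]
df = -1.2 * np.sqrt(Cs * wavelength)     # near-Scherzer defocus [m]

k = np.linspace(0, 8e9, 500)             # spatial frequency [1/m]
chi = np.pi * wavelength * df * k**2 + 0.5 * np.pi * Cs * wavelength**3 * k**4
ctf = np.sin(chi)                        # weak phase object contrast transfer function
print(ctf[:5])
```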

  20. Computing fixed points of nonexpansive mappings by $\alpha$-dense curves

    Directory of Open Access Journals (Sweden)

    G. García

    2017-08-01

    Full Text Available Given a multivalued nonexpansive mapping defined on a convex and compact set of a Banach space, with values in the class of convex and compact subsets of its domain, we present an iteration scheme which (under suitable conditions) converges to a fixed point of such mapping. This new iteration provides us another method to approximate the fixed points of a singlevalued nonexpansive mapping, defined on a compact and convex set into itself. Moreover, the conditions for the singlevalued case are less restrictive than for the multivalued case. Our main tool will be the so-called $\alpha$-dense curves, which will allow us to construct such iterations. Some numerical examples are provided to illustrate our results.
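
    For contrast with the $\alpha$-dense-curve scheme of the paper (which is not reproduced here), the classical Krasnoselskii-Mann iteration x_{n+1} = (1-t) x_n + t T(x_n) approximates a fixed point of a singlevalued nonexpansive self-map of a compact convex set; below T(x) = cos(x) on [0, 1]:

```python
import math

def mann_iteration(T, x0, t=0.5, n_steps=100):
    """Krasnoselskii-Mann iteration for a nonexpansive self-map T."""
    x = x0
    for _ in range(n_steps):
        x = (1 - t) * x + t * T(x)
    return x

print(mann_iteration(math.cos, 0.0))   # ~0.739085, the fixed point of cos
```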

  1. Mapping the Most Significant Computer Hacking Events to a Temporal Computer Attack Model

    OpenAIRE

    Heerden, Renier; Pieterse, Heloise; Irwin, Barry

    2012-01-01

    Part 4: Section 3: ICT for Peace and War; International audience; This paper presents eight of the most significant computer hacking events (also known as computer attacks). These events were selected because of their unique impact, methodology, or other properties. A temporal computer attack model is presented that can be used to model computer based attacks. This model consists of the following stages: Target Identification, Reconnaissance, Attack, and Post-Attack Reconnaissance stages. The...

  2. Computer aided site management. Site use management by digital mapping

    International Nuclear Information System (INIS)

    Chupin, J.C.

    1990-01-01

    The logistics program developed for assisting the Hague site management is presented. A digital site mapping representation and geographical data bases are used. The digital site map and its integration into a data base are described. The program can be applied to urban and rural land management aid. Technical administrative and economic evaluations of the program are summarized [fr

  3. Computer-generated maps of lunar composition from gamma ray data

    International Nuclear Information System (INIS)

    Arnold, J.R.; Metzger, A.E.; Reedy, R.C.

    1977-01-01

    The system of Eliason and Soderblom (1977) has been used to process some of the gamma ray mapping data obtained on Apollo 15 and 16. Old results are confirmed and new information obtained, especially with respect to the distribution of Fe over the area mapped, and the correlation of Fe with the radioactive elements. The results are displayed

  4. A computational fluid dynamics analysis on stratified scavenging system of medium capacity two-stroke internal combustion engines

    Directory of Open Access Journals (Sweden)

    Pitta Srinivasa Rao

    2008-01-01

    Full Text Available The main objective of the present work is to make a computational study of the stratified scavenging system in two-stroke medium capacity engines to reduce or curb the emissions from two-stroke engines. The 3-D flows within the cylinder are simulated using computational fluid dynamics and the code Fluent 6. Flow structures in the transfer ports and the exhaust port are predicted both without and with stratification. The total pressure and velocity maps from the computation provide comprehensive information on the scavenging and stratification phenomena. Analysis is carried out for the transfer ports flow and the extra port in the transfer port, along with the exhaust port, as the piston moves from the top dead center to the bottom dead center and the ports are closed, half open, three-fourths open, and fully open. An unstructured mesh is adopted for the geometry created in CATIA software. Flow is simulated by solving the governing equations, namely conservation of mass, momentum and energy, using the SIMPLE algorithm. Turbulence is modeled by the high Reynolds number version of the k-epsilon model. Experimental measurements are made for validating the numerical predictions. Good agreement is observed between predicted results and experimental data; the stratification significantly reduced the emissions while improving fuel economy.
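
    The high-Reynolds-number k-epsilon closure used in such simulations models turbulence through an eddy viscosity nu_t = C_mu * k^2 / epsilon; a one-line sketch with the standard model constant C_mu = 0.09 (the sample values are illustrative):

```python
def eddy_viscosity(k, eps, c_mu=0.09):
    """Turbulent (eddy) viscosity from turbulent kinetic energy k [m^2/s^2]
    and dissipation rate eps [m^2/s^3]."""
    return c_mu * k**2 / eps

print(eddy_viscosity(k=1.5, eps=30.0))   # ~0.00675 m^2/s
```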

  5. The security energy encryption in wireless power transfer

    Science.gov (United States)

    Sadzali, M. N.; Ali, A.; Azizan, M. M.; Albreem, M. A. M.

    2017-09-01

    This paper presents a concept of security in wireless power transfer (WPT) by applying chaos theory. Chaos theory is applied as a security system in order to safeguard the transfer of energy from a transmitter to the intended receiver. The energy encryption of the wireless power transfer utilizes a logistic map from chaos theory to generate the chaotic security key. The simulation for the energy encryption wireless power transfer system was conducted by using MATLAB and Simulink. By employing chaos theory, the chaotic key ensures the transmission of energy from the transmitter to its intended receiver.
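
    A minimal sketch of deriving a chaotic key stream from the logistic map x_{n+1} = r*x_n*(1 - x_n), where the seed (x0, r) plays the role of the shared secret between transmitter and receiver; the parameter values and thresholding rule are illustrative, not taken from the paper:

```python
def logistic_keystream(x0=0.613, r=3.99, n_bits=32):
    """Generate a chaotic bit stream from the logistic map."""
    x, bits = x0, []
    for _ in range(n_bits):
        x = r * x * (1 - x)                # chaotic iteration
        bits.append(1 if x > 0.5 else 0)   # threshold to a key bit
    return bits

print(logistic_keystream())
```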

  6. Resonant transfer of excitons and quantum computation

    International Nuclear Information System (INIS)

    Lovett, Brendon W.; Reina, John H.; Nazir, Ahsan; Kothari, Beeneet; Briggs, G. Andrew D.

    2003-01-01

    Resonant energy transfer mechanisms have been observed in the sensitized luminescence of solids, and in quantum dots, molecular nanostructures, and photosynthetic organisms. We demonstrate that such mechanisms, together with the exciton-exciton binding energy shift typical of these nanostructures, can be used to perform universal quantum logic and generate quantum entanglement

  7. Virtual 3D tumor marking-exact intraoperative coordinate mapping improve post-operative radiotherapy

    International Nuclear Information System (INIS)

    Essig, Harald; Gellrich, Nils-Claudius; Rana, Majeed; Meyer, Andreas; Eckardt, André M; Kokemueller, Horst; See, Constantin von; Lindhorst, Daniel; Tavassol, Frank; Ruecker, Martin

    2011-01-01

    The quality of the interdisciplinary interface in oncological treatment between surgery, pathology and radiotherapy is mainly dependent on reliable anatomical three-dimensional (3D) allocation of specimen and their context sensitive interpretation which defines further treatment protocols. Computer-assisted preoperative planning (CAPP) allows for outlining macroscopical tumor size and margins. A new technique facilitates the 3D virtual marking and mapping of frozen sections and resection margins or important surgical intraoperative information. These data could be stored in DICOM format (Digital Imaging and Communication in Medicine) in terms of augmented reality and transferred to communicate patient's specific tumor information (invasion to vessels and nerves, non-resectable tumor) to oncologists, radiotherapists and pathologists

  8. Virtual 3D tumor marking-exact intraoperative coordinate mapping improve post-operative radiotherapy

    Directory of Open Access Journals (Sweden)

    Essig Harald

    2011-11-01

    Full Text Available Abstract The quality of the interdisciplinary interface in oncological treatment between surgery, pathology and radiotherapy is mainly dependent on reliable anatomical three-dimensional (3D) allocation of specimen and their context sensitive interpretation which defines further treatment protocols. Computer-assisted preoperative planning (CAPP) allows for outlining macroscopical tumor size and margins. A new technique facilitates the 3D virtual marking and mapping of frozen sections and resection margins or important surgical intraoperative information. These data could be stored in DICOM format (Digital Imaging and Communication in Medicine) in terms of augmented reality and transferred to communicate patient's specific tumor information (invasion to vessels and nerves, non-resectable tumor) to oncologists, radiotherapists and pathologists.

  9. Noise mapping inside a car cabin

    DEFF Research Database (Denmark)

    Knudsen, Kim; Sjøj, Sidsel Marie Nørholm; Jacobsen, Finn

    The mapping of noise is of considerable interest in the car industry, where a good noise mapping can make it much easier to identify the sources that generate the noise and eventually reduce the individual contributions to the noise. The methods used for this purpose include delay-and-sum beamforming and spherical harmonics beamforming. These methods have a poor spatial resolution at low frequencies, and since much noise generated in cars is dominated by low frequencies the methods are not optimal. In the present paper the mapping is done by solving an inverse problem with a transfer matrix
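
    A minimal sketch of a transfer-matrix formulation: with measured pressures p and a transfer matrix H relating source strengths q to a microphone array (both invented here), a noise map follows from a regularized inverse problem, e.g. Tikhonov, q = argmin ||H q - p||^2 + lam*||q||^2. This is a generic regularization, not necessarily the one used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
H = rng.standard_normal((64, 200)) + 1j * rng.standard_normal((64, 200))
q_true = np.zeros(200, complex); q_true[[17, 93]] = 1.0   # two synthetic sources
p = H @ q_true + 0.01 * rng.standard_normal(64)           # noisy measurement

lam = 1e-2   # regularization weight
q_hat = np.linalg.solve(H.conj().T @ H + lam * np.eye(200), H.conj().T @ p)
print(np.argsort(np.abs(q_hat))[-2:])    # indices of the strongest reconstructed sources
```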

  10. Privacy Preserving Mapping Schemes Supporting Comparison

    NARCIS (Netherlands)

    Tang, Qiang

    2010-01-01

    To cater to the privacy requirements in cloud computing, we introduce a new primitive, namely Privacy Preserving Mapping (PPM) schemes supporting comparison. A PPM scheme enables a user to map data items into images in such a way that, with a set of images, any entity can determine the <, =, >
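
    A toy order-preserving mapping (emphatically not the paper's construction) illustrates the goal: each plaintext integer is mapped through a secret, strictly increasing random table, so that anyone holding only the images can still evaluate <, =, > on them:

```python
import random

random.seed(42)                      # stands in for the user's secret key
DOMAIN = 1000
# strictly increasing images: cumulative sums of positive random gaps
images, acc = [], 0
for _ in range(DOMAIN):
    acc += random.randint(1, 100)
    images.append(acc)

def ppm_map(x: int) -> int:
    return images[x]

a, b = ppm_map(120), ppm_map(480)
print(a < b)    # True: comparisons on images match comparisons on the data
```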

  11. TRANSFER OF TECHNOLOGY FOR CADASTRAL MAPPING IN TAJIKISTAN USING HIGH RESOLUTION SATELLITE DATA

    Directory of Open Access Journals (Sweden)

    R. Kaczynski

    2012-07-01

    Full Text Available The European Commission funded project entitled "Support to the mapping and certification capacity of the Agency of Land Management, Geodesy and Cartography" in Tajikistan was run by FINNMAP FM-International and Human Dynamics from Nov. 2006 to June 2011. The Agency of Land Management, Geodesy and Cartography is the state agency responsible for development, implementation, monitoring and evaluation of state policies on land tenure and land management, including the on-going land reform and registration of land use rights. The specific objective was to support and strengthen the professional capacity of the "Fazo" Institute in the field of satellite geodesy, digital photogrammetry, advanced digital satellite image processing of high resolution satellite data and digital cartography. Lectures and on-the-job trainings for the personnel of "Fazo" and the Agency in satellite geodesy, digital photogrammetry, cartography and the use of high resolution satellite data for cadastral mapping have been organized. Standards and a quality control system for all data and products have been elaborated and implemented in the production line. Technical expertise and trainings in geodesy, photogrammetry and satellite image processing for the World Bank project "Land Registration and Cadastre System for Sustainable Agriculture" have also been provided in Tajikistan. A new map projection was chosen and a new unclassified geodetic network has been established for the whole country, in which all agricultural parcel boundaries are being mapped. IKONOS, QuickBird and WorldView-1 panchromatic data have been used for orthophoto generation. Average accuracy of space triangulation of non-standard (long, up to 90 km) satellite images of QuickBird Pan and IKONOS Pan on ICPs of RMSEx = 0.5m and RMSEy = 0.5m has been achieved. Accuracy of the digital orthophoto map is RMSExy = 1.0m. More than two and a half thousand digital orthophoto map sheets at the scale of 1:5000 with pixel size 0.5m

  12. Coordinate systems and map projections

    CERN Document Server

    Maling, DH

    1992-01-01

    A revised and expanded new edition of the definitive English work on map projections. The revisions take into account the huge advances in geometrical geodesy which have occurred since the early years of satellite geodesy. The detailed configuration of the geoid resulting from the GEOS and SEASAT altimetry measurements are now taken into consideration. Additionally, the chapter on computation of map projections is updated bearing in mind the availability of pocket calculators and microcomputers. Analytical derivation of some map projections including examples of pseudocylindrical and polyconic

  13. Mapping of moveout in a TTI medium

    KAUST Repository

    Stovas, A.; Alkhalifah, Tariq Ali

    2012-01-01

    To compute moveout in a transversely isotropic medium with tilted symmetry axis is a very complicated problem. We propose to split this problem into two parts. First, to compute the moveout in a corresponding VTI medium. Second, to map the computed moveout to a TTI medium.

  14. World Gravity Map: a set of global complete spherical Bouguer and isostatic anomaly maps and grids

    Science.gov (United States)

    Bonvalot, S.; Balmino, G.; Briais, A.; Kuhn, M.; Peyrefitte, A.; Vales, N.; Biancale, R.; Gabalda, G.; Reinquin, F.

    2012-04-01

    We present here a set of digital maps of the Earth's gravity anomalies (surface free air, Bouguer and isostatic), computed at Bureau Gravimetric International (BGI) as a contribution to the Global Geodetic Observing Systems (GGOS) and to the global geophysical maps published by the Commission for the Geological Map of the World (CGMW) with support of UNESCO and other institutions. The Bouguer anomaly concept is extensively used in geophysical interpretation to investigate the density distributions in the Earth's interior. Complete Bouguer anomalies (including terrain effects) are usually computed at regional scales by integrating the gravity attraction of topography elements over and beyond a given area (under planar or spherical approximations). Here, we developed and applied a worldwide spherical approach aimed to provide a set of homogeneous and high resolution gravity anomaly maps and grids computed at the Earth's surface, taking into account a realistic Earth model and reconciling geophysical and geodetic definitions of gravity anomalies. This first version (1.0) has been computed by spherical harmonics analysis / synthesis of the Earth's topography-bathymetry up to degree 10800. The detailed theory of the spherical harmonics approach is given in Balmino et al., (Journal of Geodesy, 2011). The Bouguer and terrain corrections have thus been computed in spherical geometry at 1'x1' resolution using the ETOPO1 topography/bathymetry, ice surface and bedrock models from the NOAA (National Oceanic and Atmospheric Administration) and taking into account precise characteristics (boundaries and densities) of major lakes, inner seas, polar caps and of land areas below sea level. Isostatic corrections have been computed according to the Airy-Heiskanen model in spherical geometry for a constant depth of compensation of 30km. The gravity information given here is provided by the Earth Geopotential Model (EGM2008), developed at degree 2160 by the National Geospatial
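
    The simple (infinite-slab) part of the Bouguer correction described above is g = 2*pi*G*rho*h; a toy calculation for a 1000 m topographic column at the standard crustal reduction density, expressed in mGal (1 mGal = 1e-5 m/s^2):

```python
import math

G = 6.674e-11            # gravitational constant [m^3 kg^-1 s^-2]
rho = 2670.0             # standard crustal reduction density [kg/m^3]
h = 1000.0               # topographic height [m]

g_slab = 2 * math.pi * G * rho * h      # slab attraction [m/s^2]
print(g_slab / 1e-5)                    # ~112 mGal
```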

  15. Automated Computer-Based Facility for Measurement of Near-Field Structure of Microwave Radiators and Scatterers

    DEFF Research Database (Denmark)

    Mishra, Shantnu R.; Pavlasek, Tomas J. F.; Muresan, Letitia V.

    1980-01-01

    An automatic facility for measuring the three-dimensional structure of the near fields of microwave radiators and scatterers is described. The amplitude and phase for different polarization components can be recorded in analog and digital form using a microprocessor-based system. The stored data are transferred to a large high-speed computer for bulk processing and for the production of isophot and equiphase contour maps or profiles. The performance of the system is demonstrated through results for a single conical horn, for interacting rectangular horns, for multiple cylindrical scatterers...

  16. Maintaining Mappings Valid between Dynamic KOS

    OpenAIRE

    Dos Reis, Julio Cesar

    2013-01-01

    International audience; Knowledge Organization Systems (KOS) and the existing mappings between them have become extremely relevant in semantic-enabled systems, especially for interoperability reasons. KOS may have a dynamic nature since knowledge in a lot of domains evolves fast, and thus KOS evolution can potentially impact mappings, turning them unreliable. A still open research problem is how to adapt mappings in the course of KOS evolution without re-computing semantic correspondences bet...

  17. First international workshop on fundamental aspects of post-dryout heat transfer: proceedings

    International Nuclear Information System (INIS)

    Lee, R.

    1984-12-01

    The purpose of the First International Workshop on Fundamental Aspects of Post-Dryout Heat Transfer was to review recent developments and the state of the art in the field of post-dryout heat transfer. The workshop centered on interchanging ideas, reviewing current research results, and defining future research needs. The following five sessions dealing with the fundamental aspects of post-dryout heat transfer were held. A Computer Code Modeling and Flow Phenomena session was held dealing with flow regimes, drop size, drop formation and behavior, interfacial area, interfacial drag, and computer modeling. A Quenching Phenomena session was held dealing with the nature of rewetting, maximum wetting temperature, the Leidenfrost phenomenon and heat transfer in the vicinity of the quench front. A Low-Void Heat Transfer session was held dealing with inverted annular-flow heat transfer, inverted slug-flow heat transfer, thermal non-equilibrium and computer modeling. A Dispersed-Flow Heat Transfer session was held dealing with drop interfacial heat transfer, vapor convection, thermal non-equilibrium and correlations and models

  18. A Semantic Map for Evaluating Creativity

    NARCIS (Netherlands)

    van der Velde, Frank; Wolf, Roger A.; Schmettow, Martin; Nazareth, Deniece; Toivonen, Hannu; Colton, Simon; Cook, Michael; Ventura, Dan

    2015-01-01

    We present a semantic map of words related to creativity. The aim is to empirically derive terms which can be used to rate processes or products of computational creativity. The words in the map are based on association studies performed by human subjects and augmented with words derived from the

  19. High Throughput WAN Data Transfer with Hadoop-based Storage

    Science.gov (United States)

    Amin, A.; Bockelman, B.; Letts, J.; Levshina, T.; Martin, T.; Pi, H.; Sfiligoi, I.; Thomas, M.; Würthwein, F.

    2011-12-01

    Hadoop distributed file system (HDFS) is becoming more popular in recent years as a key building block of integrated grid storage solution in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high energy physics experiments to manage, share and process datasets of PetaBytes scale in a highly distributed grid computing environment. In this paper, we present the experience of high throughput WAN data transfer with HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  20. High Throughput WAN Data Transfer with Hadoop-based Storage

    International Nuclear Information System (INIS)

    Amin, A; Thomas, M; Bockelman, B; Letts, J; Martin, T; Pi, H; Sfiligoi, I; Würthwein, F; Levshina, T

    2011-01-01

    Hadoop distributed file system (HDFS) is becoming more popular in recent years as a key building block of integrated grid storage solution in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high energy physics experiments to manage, share and process datasets of PetaBytes scale in a highly distributed grid computing environment. In this paper, we present the experience of high throughput WAN data transfer with HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  1. Calculation of local bed to wall heat transfer in a fluidized-bed

    International Nuclear Information System (INIS)

    Kilkis, B.I.

    1987-01-01

    Surface to bed heat transfer in a fluidized-bed largely depends upon its local and global hydrodynamical behavior, including particle velocity, particle trajectory, gas velocity, and void fraction. In this study, a computer program was developed in order to calculate the local bed to wall heat transfer by accounting for the local and global instantaneous hydrodynamics of the bed. This is accomplished by utilizing the CHEMFLUB computer program. This information at a given location is interpreted so that the most appropriate heat transfer model is utilized for each time increment. These instantaneous values are then averaged to obtain the heat transfer coefficient for the given location. Repeating the procedure for different locations, a space average heat transfer coefficient is also calculated. This report briefly summarizes the various heat transfer models employed and gives sample computer results for the case study of Mickley-Trilling's experimental set-up. Comparisons with available experimental data and correlations are also provided in order to compare and evaluate the computer results

  2. Smartphones Based Mobile Mapping Systems

    Science.gov (United States)

    Al-Hamad, A.; El-Sheimy, N.

    2014-06-01

    The past 20 years have witnessed an explosive growth in the demand for geo-spatial data. This demand has numerous sources and takes many forms; however, the net effect is an ever-increasing thirst for data that is more accurate, has higher density, is produced more rapidly, and is acquired less expensively. For mapping and Geographic Information Systems (GIS) projects, this has been achieved through the major development of Mobile Mapping Systems (MMS). MMS integrate various navigation and remote sensing technologies which allow mapping from moving platforms (e.g. cars, airplanes, boats, etc.) to obtain the 3D coordinates of the points of interest. Such systems obtain accuracies that are suitable for all but the most demanding mapping and engineering applications. However, this accuracy doesn't come cheaply. As a consequence of the platform and navigation and mapping technologies used, even an "inexpensive" system costs well over 200 000 USD. Today's mobile phones are getting ever more sophisticated. Phone makers are determined to reduce the gap between computers and mobile phones. Smartphones, in addition to becoming status symbols, are increasingly being equipped with extended Global Positioning System (GPS) capabilities, Micro Electro Mechanical System (MEMS) inertial sensors, extremely powerful computing power and very high resolution cameras. Using all of these components, smartphones have the potential to replace the traditional land MMS and portable GPS/GIS equipment. This paper introduces an innovative application of smartphones as a very low cost portable MMS for mapping and GIS applications.

  3. Genomics With Cloud Computing

    Directory of Open Access Journals (Sweden)

    Sukhamrit Kaur

    2015-04-01

    Full Text Available Abstract Genomics is the study of the genome, which produces large amounts of data requiring large storage and computation power. These issues are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms provide many services to the user, such as easy access to data, easy sharing and transfer, storage of hundreds of terabytes, and more computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Features that cloud computing offers genomics include easy access and sharing of data, security of data, and lower cost to pay for resources, but there are still some demerits, such as the long time needed to transfer data and limited network bandwidth.

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  5. Computation, measurement and analysis of the reactivity-to-power-transfer-function for the sodium cooled nuclear power plant KNK I

    International Nuclear Information System (INIS)

    Hoppe, P.; Mitzel, F.

    1977-02-01

    The Reactivity-to-Power-Transfer-Function for the sodium cooled nuclear power plant KNK I (Kompakte Natriumgekuehlte Kernenergieanlage) has been measured and compared with theoretical results. The measurements have been performed with the help of pseudostochastic reactivity perturbations. The transfer function has been determined by computing the auto- and cross-power-spectral-densities for the reactivity and neutron flux signals. The agreement between the experimental and theoretical transfer function could be improved by adjusting the reactivity coefficients. The applications of these measurements with respect to reactor diagnosis and malfunction detection are discussed. For this purpose the accuracy of the measured transfer function is of great importance; therefore an extensive error analysis has been performed. It turned out that the inherent instability of the reactor without the control system and the feedback by the primary coolant system were the causes of comparatively large systematic errors. The conditions have been derived under which these types of errors can be considerably reduced. The conclusions can also be applied to analogous measurements at fast sodium cooled reactors; because of their inherent stability the systematic errors will be reduced. (orig.) [de
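
    A sketch of the spectral-density estimate used here: the transfer function between a reactivity-like input x and a power-like output y is H(f) = S_xy(f) / S_xx(f). The signals below are synthetic stand-ins (a first-order lag plus noise), not plant data:

```python
import numpy as np
from scipy.signal import welch, csd

fs = 100.0
t = np.arange(0, 200, 1 / fs)
rng = np.random.default_rng(1)
x = rng.standard_normal(t.size)              # pseudostochastic perturbation
# toy "plant": first-order lag of x plus measurement noise
y = np.convolve(x, np.exp(-np.arange(100) / 20), mode="full")[: t.size]
y += 0.1 * rng.standard_normal(t.size)

f, Sxx = welch(x, fs=fs, nperseg=1024)       # auto-power spectral density
f, Sxy = csd(x, y, fs=fs, nperseg=1024)      # cross-power spectral density
H = Sxy / Sxx                                # transfer function estimate
print(abs(H[:5]))
```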

  6. The projective heat map

    CERN Document Server

    Schwartz, Richard Evan

    2017-01-01

    This book introduces a simple dynamical model for a planar heat map that is invariant under projective transformations. The map is defined by iterating a polygon map, where one starts with a finite planar N-gon and produces a new N-gon by a prescribed geometric construction. One of the appeals of the topic of this book is the simplicity of the construction that yet leads to deep and far reaching mathematics. To construct the projective heat map, the author modifies the classical affine invariant midpoint map, which takes a polygon to a new polygon whose vertices are the midpoints of the original. The author provides useful background which makes this book accessible to a beginning graduate student or advanced undergraduate as well as researchers approaching this subject from other fields of specialty. The book includes many illustrations, and there is also a companion computer program.

  7. COMPARING IMAGE-BASED METHODS FOR ASSESSING VISUAL CLUTTER IN GENERALIZED MAPS

    Directory of Open Access Journals (Sweden)

    G. Touya

    2015-08-01

    Full Text Available Map generalization abstracts and simplifies geographic information to derive maps at smaller scales. The automation of map generalization requires techniques to evaluate the global quality of a generalized map. The quality and legibility of a generalized map are related to the complexity of the map, or the amount of clutter in the map, i.e. the excessive amount of information and its disorganization. Computer vision research has a strong interest in measuring clutter in images, and this paper proposes to compare some of the existing techniques from computer vision, applied to generalized map evaluation. Four techniques from the literature are described and tested on a large set of maps generalized at different scales: edge density, subband entropy, quad tree complexity, and segmentation clutter. The results are analyzed against several criteria related to generalized maps: the identification of cluttered areas, the preservation of the global amount of information, the handling of occlusions and overlaps, foreground vs background, and blank space reduction.
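
    One simple reading of the edge-density measure: the fraction of map pixels whose gradient magnitude exceeds a threshold. The threshold and the test image below are invented for illustration:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
img = rng.random((256, 256))                 # stand-in for a rendered map

gx = ndimage.sobel(img, axis=0)              # gradients via Sobel filters
gy = ndimage.sobel(img, axis=1)
grad = np.hypot(gx, gy)

edge_density = np.mean(grad > 0.5 * grad.max())
print(f"edge density: {edge_density:.3f}")   # higher = more cluttered
```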

  8. Adige river in Trento flooding map, 1892: private or public risk transfer?

    Science.gov (United States)

    Ranzi, Roberto

    2016-04-01

    For the determination of flood risk, hydrologists and hydraulic engineers focus their attention mainly on the estimation of the physical factors determining the flood hazard, while economists and social scientists deal mainly with the estimation of vulnerability and exposure. The fact that flood zoning involves both hydrological and socio-economic aspects, however, was already clear in the XIX century, when the impact of floods on inundated areas started to appear in flood maps, for instance in the UK and in Italy. A pioneering 'flood risk' map for the Adige river in Trento, Italy, was published as early as 1892, taking into account in detail both hazard intensity in terms of velocity and depth, frequency of occurrence, vulnerability and the economic costs of flood protection with river embankments. This map can certainly be reinterpreted as a pioneering flood risk map, and possibly as the first for an Italian river and worldwide. Risk levels were divided into three categories and seven sub-categories, depending on flood water depth, velocity, frequency and damage costs. It is interesting to note that at that time the map was used to share the cost of the levees' repair and enhancement after the severe September 1882 flood as a function of the estimated level of protection of the respective areas against flood risk. The sharing of costs between public bodies, the railway company and private owners was debated for about 20 years, and in the end the public sustained the major costs. This shows that already at that time the economic assessment of structural flood protections was based on objective and rational cost-benefit criteria, that hydraulic risk mapping was perceived by society as fundamental for the design of flood protection systems, and that a balanced cost sharing between public and private parties was an accepted approach, although some protests arose at that time.

  9. Using the Hadoop/MapReduce approach for monitoring the CERN storage system and improving the ATLAS computing model

    CERN Document Server

    Russo, Stefano Alberto; Lamanna, M

    The processing of huge amounts of data, an already fundamental task for research in the elementary particle physics field, is becoming more and more important also for companies operating in the Information Technology (IT) industry. In this context, if conventional approaches are adopted several problems arise, starting from the congestion of the communication channels. In the IT sector, one of the approaches designed to minimize this congestion is to exploit data locality, or in other words, to bring the computation as close as possible to where the data resides. The most common implementation of this concept is the Hadoop/MapReduce framework. In this thesis work I evaluate the usage of Hadoop/MapReduce in two areas: a standard one similar to typical IT analyses, and an innovative one related to high energy physics analyses. The first consists in monitoring the history of the storage cluster which stores the data generated by the LHC experiments, the second in the physics analysis of the latter, ...
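
    A conceptual MapReduce word count in plain Python: map emits (key, 1) pairs, a shuffle groups them by key, and reduce sums each group. Hadoop runs the same pattern at scale, scheduling mappers on the nodes that hold the data (data locality); this toy keeps everything in one process:

```python
from collections import defaultdict

def map_phase(document):
    for word in document.split():
        yield word, 1

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

docs = ["storage monitoring logs", "storage logs logs"]
pairs = [p for d in docs for p in map_phase(d)]
print(reduce_phase(shuffle(pairs)))   # {'storage': 2, 'monitoring': 1, 'logs': 3}
```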

  10. Quadratic rational rotations of the torus and dual lattice maps

    CERN Document Server

    Kouptsov, K L; Vivaldi, F

    2002-01-01

    We develop a general formalism for computer-assisted proofs concerning the orbit structure of certain non-ergodic piecewise affine maps of the torus, whose eigenvalues are roots of unity. For a specific class of maps, we prove that if the trace is a quadratic irrational (the simplest nontrivial case, comprising 8 maps), then the periodic orbits are organized into finitely many renormalizable families, with exponentially increasing period, plus a finite number of exceptional families. The proof is based on exact computations with algebraic numbers, where units play the role of scaling parameters. Exploiting a duality existing between these maps and lattice maps representing rounded-off planar rotations, we establish the global periodicity of the latter systems, for a set of orbits of full density.
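
    An experiment in the spirit of these dual lattice maps (the exact maps studied in the paper may differ): a rounded-off planar rotation on Z^2 of the form (x, y) -> (floor(lam*x) - y, x), with lam = 2*cos(2*pi/5) a quadratic irrational. The code simply searches for the period of a lattice orbit:

```python
import math

lam = 2 * math.cos(2 * math.pi / 5)      # quadratic irrational trace

def step(p):
    x, y = p
    return (math.floor(lam * x) - y, x)

def period(p0, cap=10**6):
    """Return the orbit period of p0, or None if not found within cap steps."""
    p, n = step(p0), 1
    while p != p0 and n < cap:
        p, n = step(p), n + 1
    return n if p == p0 else None

print(period((1, 0)))    # finite if the orbit is periodic, as the theory predicts
```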

  11. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    Science.gov (United States)

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

    Spatial information on physical soil properties is in strong demand in order to support environment-related and land use management decisions. One of the most widely used properties to characterize soils physically is particle size distribution (PSD), which determines soil water management and cultivability. According to their size, different particles can be categorized as clay, silt, or sand. The size intervals are defined by national or international textural classification systems. The relative percentage of sand, silt, and clay in the soil constitutes textural classes, which are also specified miscellaneously in various national and/or specialty systems. The most commonly used is the classification system of the United States Department of Agriculture (USDA). Soil texture information is essential input data in meteorological, hydrological and agricultural prediction modelling. Although Hungary has a great deal of legacy soil maps and other relevant soil information, it often occurs that maps do not exist on a certain characteristic with the required thematic and/or spatial representation. The recent developments in digital soil mapping (DSM), however, provide wide opportunities for the elaboration of object specific soil maps (OSSM) with predefined parameters (resolution, accuracy, reliability etc.). Due to the simultaneous richness of available Hungarian legacy soil data, spatial inference methods and auxiliary environmental information, there is a high versatility of possible approaches for the compilation of a given soil map. This suggests the opportunity of optimization. For the creation of an OSSM one might intend to identify the optimum set of soil data, method and auxiliary co-variables optimized for the resources (data costs, computation requirements etc.). We started a comprehensive analysis of the effects of the various DSM components on the accuracy of the output maps on pilot areas. The aim of this study is to compare and evaluate different

  12. Transfer induced compressive strain in graphene

    DEFF Research Database (Denmark)

    Larsen, Martin Benjamin Barbour Spanget; Mackenzie, David; Caridad, Jose

    2014-01-01

    We have used spatially resolved micro Raman spectroscopy to map the full width at half maximum (FWHM) of the graphene G-band and the 2D and G peak positions, for as-grown graphene on copper catalyst layers, for transferred CVD graphene and for micromechanically exfoliated graphene, in order...... to characterize the effects of a transfer process on graphene properties. Here we use the FWHM(G) as an indicator of the doping level of graphene, and the ratio of the shifts in the 2D and G bands as an indicator of strain. We find that the transfer process introduces an isotropic, spatially uniform, compressive...... strain in graphene, and increases the carrier concentration....

  13. Turbulence Modeling and Computation of Turbine Aerodynamics and Heat Transfer

    Science.gov (United States)

    Lakshminarayana, B.; Luo, J.

    1996-01-01

    The objective of the present research is to develop improved turbulence models for the computation of complex flows through turbomachinery passages, including the effects of streamline curvature, heat transfer and secondary flows. Advanced turbulence models are crucial for accurate prediction of rocket engine flows, due to the existence of very large extra strain rates, such as strong streamline curvature. Numerical simulation of the turbulent flows in strongly curved ducts, including two 180-deg ducts, one 90-deg duct and a strongly concave curved turbulent boundary layer, has been carried out with Reynolds stress models (RSM) and algebraic Reynolds stress models (ARSM). An improved near-wall pressure-strain correlation has been developed for capturing the anisotropy of turbulence in the concave region. A comparative study of two modes of transition in gas turbines, the by-pass transition and the separation-induced transition, has been carried out with several representative low-Reynolds number (LRN) k-epsilon models. Effects of blade surface pressure gradient, freestream turbulence and Reynolds number on the blade boundary layer development, and particularly the inception of transition, are examined in detail. The present study indicates that the turbine blade transition, in the presence of high freestream turbulence, is predicted well with the LRN k-epsilon models employed. The three-dimensional Navier-Stokes procedure developed by the present authors has been used to compute the three-dimensional viscous flow through the turbine nozzle passage of a single stage turbine. A low Reynolds number k-epsilon model and a zonal k-epsilon/ARSM (algebraic Reynolds stress model) are utilized for turbulence closure. An assessment of the performance of the turbulence models has been carried out. The two models are found to provide similar predictions for the mean flow parameters, although slight improvement in the prediction of some secondary flow quantities has been obtained by the

  14. The Added Value of a Single-photon Emission Computed Tomography-Computed Tomography in Sentinel Lymph Node Mapping in Patients with Breast Cancer and Malignant Melanoma

    International Nuclear Information System (INIS)

    Bennie, George; Vorster, Mariza; Buscombe, John; Sathekge, Mike

    2015-01-01

    Single-photon emission computed tomography-computed tomography (SPECT-CT) allows for physiological and anatomical co-registration in sentinel lymph node (SLN) mapping and offers additional benefits over conventional planar imaging. However, the clinical relevance of these reported benefits, when considering the added costs and radiation burden, remains somewhat uncertain. This study aimed to evaluate the possible added value of SPECT-CT and intra-operative gamma-probe use over planar imaging alone in the South African setting. 80 patients with breast cancer or malignant melanoma underwent both planar and SPECT-CT imaging for SLN mapping. We assessed and compared the number of nodes detected on each study, false positive and negative findings, and changes in surgical approach and/or patient management. In all cases where a sentinel node was identified, SPECT-CT was more accurate anatomically. There was a significant change in surgical approach in 30 cases - breast cancer (n = 13; P < 0.001) and malignant melanoma (n = 17; P < 0.0002). In 4 cases a node not identified on planar imaging was seen on SPECT-CT. In 16 cases additional echelon nodes were identified. False positives were excluded by SPECT-CT in 12 cases. The addition of SPECT-CT and use of an intra-operative gamma-probe with planar imaging offers important benefits in patients who present with breast cancer and melanoma. These benefits include increased nodal detection, elimination of false positives and negatives, and improved anatomical localization that ultimately aids and expedites surgical management. This has been demonstrated previously in the context of industrialized countries and has now also been confirmed in the setting of an emerging-market nation

  15. The Added Value of a Single-photon Emission Computed Tomography-Computed Tomography in Sentinel Lymph Node Mapping in Patients with Breast Cancer and Malignant Melanoma.

    Science.gov (United States)

    Bennie, George; Vorster, Mariza; Buscombe, John; Sathekge, Mike

    2015-01-01

    Single-photon emission computed tomography-computed tomography (SPECT-CT) allows for physiological and anatomical co-registration in sentinel lymph node (SLN) mapping and offers additional benefits over conventional planar imaging. However, the clinical relevance of these reported benefits, when considering the added costs and radiation burden, remains somewhat uncertain. This study aimed to evaluate the possible added value of SPECT-CT and intra-operative gamma-probe use over planar imaging alone in the South African setting. 80 patients with breast cancer or malignant melanoma underwent both planar and SPECT-CT imaging for SLN mapping. We assessed and compared the number of nodes detected on each study, false positive and negative findings, and changes in surgical approach and/or patient management. In all cases where a sentinel node was identified, SPECT-CT was more accurate anatomically. There was a significant change in surgical approach in 30 cases - breast cancer (n = 13; P < 0.001) and malignant melanoma (n = 17; P < 0.0002). In 4 cases a node not identified on planar imaging was seen on SPECT-CT. In 16 cases additional echelon nodes were identified. False positives were excluded by SPECT-CT in 12 cases. The addition of SPECT-CT and use of an intra-operative gamma-probe with planar imaging offers important benefits in patients who present with breast cancer and melanoma. These benefits include increased nodal detection, elimination of false positives and negatives, and improved anatomical localization that ultimately aids and expedites surgical management. This has been demonstrated previously in the context of industrialized countries and has now also been confirmed in the setting of an emerging-market nation.

  16. Noniterative MAP reconstruction using sparse matrix representations.

    Science.gov (United States)

    Cao, Guangzhi; Bouman, Charles A; Webb, Kevin J

    2009-09-01

    We present a method for noniterative maximum a posteriori (MAP) tomographic reconstruction which is based on the use of sparse matrix representations. Our approach is to precompute and store the inverse matrix required for MAP reconstruction. This approach has generally not been used in the past because the inverse matrix is typically large and fully populated (i.e., not sparse). In order to overcome this problem, we introduce two new ideas. The first idea is a novel theory for the lossy source coding of matrix transformations which we refer to as matrix source coding. This theory is based on a distortion metric that reflects the distortions produced in the final matrix-vector product, rather than the distortions in the coded matrix itself. The resulting algorithms are shown to require orthonormal transformations of both the measurement data and the matrix rows and columns before quantization and coding. The second idea is a method for efficiently storing and computing the required orthonormal transformations, which we call a sparse-matrix transform (SMT). The SMT is a generalization of the classical FFT in that it uses butterflies to compute an orthonormal transform; but unlike an FFT, the SMT uses the butterflies in an irregular pattern, and is numerically designed to best approximate the desired transforms. We demonstrate the potential of the noniterative MAP reconstruction with examples from optical tomography. The method requires offline computation to encode the inverse transform. However, once these offline computations are completed, the noniterative MAP algorithm is shown to reduce both storage and computation by well over two orders of magnitude, as compared to a linear iterative reconstruction methods.
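
    A schematic of the precompute-and-apply idea (deliberately ignoring the matrix source coding and SMT compression that make it practical at scale): for a linear forward model y = A x + noise with a quadratic prior, the MAP estimate is a fixed linear map of the data, so the inverse matrix H can be computed offline once and each reconstruction becomes a single matrix-vector product. All sizes and values below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((128, 64))        # forward model (invented)
s2, lam = 0.01, 1.0                       # noise variance, prior weight

# offline: precompute and store the MAP reconstruction matrix
H = np.linalg.inv(A.T @ A / s2 + lam * np.eye(64)) @ (A.T / s2)

# online: each reconstruction is noniterative
x_true = rng.standard_normal(64)
y = A @ x_true + np.sqrt(s2) * rng.standard_normal(128)
x_map = H @ y
print(np.linalg.norm(x_map - x_true) / np.linalg.norm(x_true))
```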

  17. Conjugate Compressible Fluid Flow and Heat Transfer in Ducts

    Science.gov (United States)

    Cross, M. F.

    2011-01-01

    A computational approach to modeling transient, compressible fluid flow with heat transfer in long, narrow ducts is presented. The primary application of the model is for analyzing fluid flow and heat transfer in solid propellant rocket motor nozzle joints during motor start-up, but the approach is relevant to a wide range of analyses involving rapid pressurization and filling of ducts. Fluid flow is modeled through solution of the spatially one-dimensional, transient Euler equations. Source terms are included in the governing equations to account for the effects of wall friction and heat transfer. The equation solver is fully-implicit, thus providing greater flexibility than an explicit solver. This approach allows for resolution of pressure wave effects on the flow as well as for fast calculation of the steady-state solution when a quasi-steady approach is sufficient. Solution of the one-dimensional Euler equations with source terms significantly reduces computational run times compared to general purpose computational fluid dynamics packages solving the Navier-Stokes equations with resolved boundary layers. In addition, conjugate heat transfer is more readily implemented using the approach described in this paper than with most general purpose computational fluid dynamics packages. The compressible flow code has been integrated with a transient heat transfer solver to analyze heat transfer between the fluid and surrounding structure. Conjugate fluid flow and heat transfer solutions are presented. The author is unaware of any previous work available in the open literature which uses the same approach described in this paper.
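
    In one standard arrangement (the report's exact friction and heat-flux source terms may differ), the quasi-one-dimensional governing equations take the form

```latex
\frac{\partial U}{\partial t} + \frac{\partial F(U)}{\partial x} = S,
\qquad
U = \begin{pmatrix} \rho \\ \rho u \\ \rho E \end{pmatrix},
\quad
F(U) = \begin{pmatrix} \rho u \\ \rho u^2 + p \\ u\,(\rho E + p) \end{pmatrix},
\quad
S = \begin{pmatrix} 0 \\ -f_w \\ \dot{q}_w \end{pmatrix},
```

    where f_w models wall friction and q_w wall heat transfer.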

  18. Computer aided heat transfer analysis in a laboratory scaled heat exchanger unit

    International Nuclear Information System (INIS)

    Gunes, M.

    1998-01-01

    In this study, an explanation of a laboratory scaled heat exchanger unit and software developed to analyze heat transfer, especially for use in heat transfer courses, are presented. Analyses carried out in the software through sample values measured in the heat exchanger are: (1) determination of heat transfer rate, logarithmic mean temperature difference and overall heat transfer coefficient; (2) determination of the convection heat transfer coefficient inside and outside the tube and the effect of fluid velocity on these; (3) investigation of the relationship between Nusselt number, Reynolds number and Prandtl number by using multiple non-linear regression analysis. Results are displayed on the screen graphically
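
    A minimal sketch of the first analysis listed above (sample values invented): heat transfer rate Q, logarithmic mean temperature difference (LMTD), and overall heat transfer coefficient U = Q / (A * LMTD):

```python
import math

m_dot, cp = 0.12, 4180.0                 # hot stream: kg/s, J/(kg K)
Th_in, Th_out = 80.0, 60.0               # hot side temperatures [C]
Tc_in, Tc_out = 20.0, 35.0               # cold side temperatures [C]
A = 0.5                                  # heat transfer area [m^2]

Q = m_dot * cp * (Th_in - Th_out)        # heat transfer rate [W]

dT1 = Th_in - Tc_out                     # counter-flow end temperature differences
dT2 = Th_out - Tc_in
lmtd = (dT1 - dT2) / math.log(dT1 / dT2)

U = Q / (A * lmtd)                       # overall coefficient [W/(m^2 K)]
print(f"Q = {Q:.0f} W, LMTD = {lmtd:.1f} K, U = {U:.0f} W/m2K")
```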

  19. Surface mineral maps of Afghanistan derived from HyMap imaging spectrometer data, version 2

    Science.gov (United States)

    Kokaly, Raymond F.; King, Trude V.V.; Hoefen, Todd M.

    2013-01-01

    This report presents a new version of surface mineral maps derived from HyMap imaging spectrometer data collected over Afghanistan in the fall of 2007. This report also describes the processing steps applied to the imaging spectrometer data. The 218 individual flight lines composing the Afghanistan dataset, covering more than 438,000 square kilometers, were georeferenced to a mosaic of orthorectified Landsat images. The HyMap data were converted from radiance to reflectance using a radiative transfer program in combination with ground-calibration sites and a network of cross-cutting calibration flight lines. The U.S. Geological Survey Material Identification and Characterization Algorithm (MICA) was used to generate two thematic maps of surface minerals: a map of iron-bearing minerals and other materials, which have their primary absorption features at the shorter wavelengths of the reflected solar wavelength range, and a map of carbonates, phyllosilicates, sulfates, altered minerals, and other materials, which have their primary absorption features at the longer wavelengths of the reflected solar wavelength range. In contrast to the original version, version 2 of these maps is provided at full resolution of 23-meter pixel size. The thematic maps, MICA summary images, and the material fit and depth images are distributed in digital files linked to this report, in a format readable by remote sensing software and Geographic Information Systems (GIS). The digital files can be downloaded from http://pubs.usgs.gov/ds/787/downloads/.

  20. Heat transfer from humans wearing clothing

    NARCIS (Netherlands)

    Lotens, W.A.

    1993-01-01

    In this monograph the effects of clothing on human heat transfer are described. The description is based on the physics of heat and mass transfer, depending on the design of the clothing, the climate, and the activity of the wearer. The resulting model has been stepwise implemented in computer

  1. Motor transfer from map ocular exploration to locomotion during spatial navigation from memory.

    Science.gov (United States)

    Demichelis, Alixia; Olivier, Gérard; Berthoz, Alain

    2013-02-01

    Spatial navigation from memory can rely on two different strategies: a mental simulation of a kinesthetic spatial navigation (egocentric route strategy) or visual-spatial memory using a mental map (allocentric survey strategy). We hypothesized that a previously performed "oculomotor navigation" on a map could be used by the brain to perform a locomotor memory task. Participants were instructed to (1) learn a path on a map through a sequence of vertical and horizontal eye movements and (2) walk on the slabs of a "magic carpet" to recall this path. The main results showed that the anisotropy of ocular movements (horizontal ones being more efficient than vertical ones) influenced the performance of participants when they changed direction on the central slab of the magic carpet. These data suggest that, to find their way through locomotor space, subjects mentally repeated their past ocular exploration of the map, and this visuo-motor memory was used as a template for the locomotor performance.

  2. Smartphones Based Mobile Mapping Systems

    Directory of Open Access Journals (Sweden)

    A. Al-Hamad

    2014-06-01

    Full Text Available The past 20 years have witnessed an explosive growth in the demand for geo-spatial data. This demand has numerous sources and takes many forms; however, the net effect is an ever-increasing thirst for data that is more accurate, has higher density, is produced more rapidly, and is acquired less expensively. For mapping and Geographic Information Systems (GIS) projects, this has been achieved through the major development of Mobile Mapping Systems (MMS). MMS integrate various navigation and remote sensing technologies which allow mapping from moving platforms (e.g. cars, airplanes, boats, etc.) to obtain the 3D coordinates of the points of interest. Such systems obtain accuracies that are suitable for all but the most demanding mapping and engineering applications. However, this accuracy doesn't come cheaply. As a consequence of the platform and navigation and mapping technologies used, even an "inexpensive" system costs well over 200 000 USD. Today's mobile phones are getting ever more sophisticated. Phone makers are determined to reduce the gap between computers and mobile phones. Smartphones, in addition to becoming status symbols, are increasingly being equipped with extended Global Positioning System (GPS) capabilities, Micro Electro Mechanical System (MEMS) inertial sensors, extremely powerful computing power and very high resolution cameras. Using all of these components, smartphones have the potential to replace the traditional land MMS and portable GPS/GIS equipment. This paper introduces an innovative application of smartphones as a very low cost portable MMS for mapping and GIS applications.

  3. Calculation and mapping of critical loads in Europe: Status report 1993

    International Nuclear Information System (INIS)

    Downing, R.J.; Hettelingh, J.P.; De Smet, P.A.M.

    1993-01-01

    The work of the RIVM Coordination Center for Effects (CCE) and National Focal Centers (NFCs) for Mapping over the past two years is summarized. The primary task of the critical loads mapping program during this period was to compute and map critical loads of sulphur in Europe. Efforts were undertaken to enhance the scientific foundations and policy relevance of the critical load program, and to foster consensus among producers and users of this information by means of three workshops. The applied calculation methods are described, as well as the resulting critical loads maps, based upon the outcomes of the workshops. Chapter 2 contains the most recent maps (May 1993) of the critical load of acidity as well as the critical load of sulphur and critical sulphur deposition, which are derived from the critical load of acidity. The chapter also contains maps of the sulphur deposition in Europe in 1980 and 1990, and the resulting exceedances. In chapter 3 the methods and equations used to derive the maps of critical loads and exceedances of acidity and sulphur are described, with emphasis on the advances in the calculation methods used since the first European critical loads maps were produced in 1991. In chapter 4 the methods to be used to compute and map critical loads in the future are presented. In chapter 5 an overview of the data inputs is given, and of the methods of data handling performed by the CCE to produce the current European maps of critical loads. In chapter 6 the results are described of an uncertainty analysis which was performed on the critical loads computation methodology to assess the reliability of the computation results and the importance of the various input variables. Chapter 7 provides some conclusions and recommendations resulting from the critical load mapping activities. In Appendix 1 the reports of the workshops can be found, with additional maps of critical loads and background variables in Appendix 2. 15 figs., 11 tabs., 156 refs

  4. Large Scale Landform Mapping Using Lidar DEM

    Directory of Open Access Journals (Sweden)

    Türkay Gökgöz

    2015-08-01

    Full Text Available In this study, LIDAR DEM data was used to obtain a primary landform map in accordance with a well-known methodology. This primary landform map was generalized using the Focal Statistics tool (Majority), considering the minimum area condition in cartographic generalization, in order to obtain landform maps at 1:1000 and 1:5000 scales. Both the primary and the generalized landform maps were verified visually with a hillshaded DEM and an orthophoto. As a result, these maps provide satisfactory visuals of the landforms. In order to show the effect of generalization, the area of each landform in both the primary and the generalized maps was computed. Consequently, landform maps at large scales could be obtained with the proposed methodology, including generalization using LIDAR DEM.
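
    A minimal sketch of the Majority (focal statistics) generalization step: each cell of the categorical landform raster is replaced by the most frequent class in its neighbourhood. The window size and toy raster are invented:

```python
import numpy as np
from scipy import ndimage

def majority(values):
    """Most frequent class in the window (ties go to the smallest label)."""
    return np.bincount(values.astype(int)).argmax()

rng = np.random.default_rng(0)
landforms = rng.integers(0, 5, size=(60, 60))        # toy landform class raster

generalized = ndimage.generic_filter(landforms, majority, size=5)
print(generalized.shape)
```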

  5. Advanced Map For Real-Time Process Control

    Science.gov (United States)

    Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto

    1987-10-01

    MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of data exchanged with various devices for distributed control are important in the case of real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicasting communication voluntarily and periodically in the priority order of the data to be exchanged.

  6. What is needed to implement a computer-assisted health risk assessment tool? An exploratory concept mapping study

    Directory of Open Access Journals (Sweden)

    Ahmad Farah

    2012-12-01

    Background: Emerging eHealth tools could facilitate the delivery of comprehensive care in time-constrained clinical settings. One such tool is the interactive computer-assisted health-risk assessment (HRA), which may improve provider-patient communication at the point of care, particularly for psychosocial health concerns, which remain under-detected in clinical encounters. The research team explored the perspectives of healthcare providers representing a variety of disciplines (physicians, nurses, social workers, allied staff) regarding the factors required for implementation of an interactive HRA on psychosocial health. Methods: The research team employed a semi-qualitative participatory method known as Concept Mapping, which involved three distinct phases. First, in face-to-face and online brainstorming sessions, participants responded to an open-ended central question: “What factors should be in place within your clinical setting to support an effective computer-assisted screening tool for psychosocial risks?” The brainstormed items were consolidated by the research team. Then, in face-to-face and online sorting sessions, participants grouped the items thematically as ‘it made sense to them’. Participants also rated each item on a 5-point scale for its ‘importance’ and ‘action feasibility’ over the ensuing six-month period. The sorted and rated data were analyzed using multidimensional scaling and hierarchical cluster analyses, which produced visual maps. In the third and final phase, the face-to-face interpretation sessions, the concept maps were discussed and illuminated by participants collectively. Results: Overall, 54 providers participated (emergency care 48%; primary care 52%). Participants brainstormed 196 items thought to be necessary for the implementation of an interactive HRA emphasizing psychosocial health. These were consolidated by the research team into 85 items. After sorting and rating, cluster analysis …
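
    The analysis phase of Concept Mapping can be reproduced in outline: participants' sorts become a co-occurrence (similarity) matrix, which is embedded in 2-D with multidimensional scaling and then clustered hierarchically. A minimal sketch of that pipeline (the random sort matrix stands in for real participant data):

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

n_items, n_participants = 85, 54
rng = np.random.default_rng(0)
# sorts[p, i] = group label that participant p assigned to item i (toy data).
sorts = rng.integers(0, 8, size=(n_participants, n_items))

# Similarity: fraction of participants who sorted items i and j together.
sim = np.mean(sorts[:, :, None] == sorts[:, None, :], axis=0)
dist = 1.0 - sim

# 2-D point map via MDS on the precomputed distances, then hierarchical clustering.
xy = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(dist)
clusters = fcluster(linkage(xy, method="ward"), t=6, criterion="maxclust")
```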

  7. Mapping the HISS Dipole

    International Nuclear Information System (INIS)

    McParland, C.; Bieser, F.

    1984-01-01

    The principal component of the Bevalac HISS facility is a large superconducting 3 Tesla dipole. The facility's need for a large magnetic-volume spectrometer resulted in a large-gap geometry - a 2 meter pole-tip diameter and a 1 meter pole gap. Obviously, the field required detailed mapping for effective use as a spectrometer. The mapping device was designed with several major features in mind. The device would measure field values on a grid which described a closed rectangular solid. The grid would be regular, with the exact measurement intervals adjustable by software. The device would function unattended over the long period of time required to complete a field map. During this time, the progress of the map could be monitored by anyone with access to the HISS VAX computer. Details of the mechanical, electrical, and control design follow.

  8. Chaotic maps-based password-authenticated key agreement using smart cards

    Science.gov (United States)

    Guo, Cheng; Chang, Chin-Chen

    2013-06-01

    Password-based authenticated key agreement using smart cards has been widely and intensively researched. Inspired by the semi-group property of Chebyshev maps and by key agreement protocols based on chaotic maps, we propose a novel chaotic maps-based password-authenticated key agreement protocol with smart cards. In our protocol, we avoid the modular exponentiation and elliptic-curve scalar multiplication used in traditional authenticated key agreement protocols with smart cards. Our analysis shows that our protocol has comprehensive characteristics and can withstand attacks, including the insider attack, the replay attack, and others, satisfying essential security requirements. Performance analysis shows that our protocol avoids the cost of modular exponentiation and elliptic-curve scalar multiplication; its computational cost compared with related protocols is acceptable.
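
    The key-agreement step rests on the semi-group property of Chebyshev polynomials, T_a(T_b(x)) = T_ab(x) = T_b(T_a(x)): each party reaches the same value from the other party's public value. A numerical sketch of just that property (toy degrees for illustration; real schemes use large parameters and an extended domain for security):

```python
from math import cos, acos

def chebyshev(n, x):
    """Evaluate the degree-n Chebyshev polynomial T_n on [-1, 1]."""
    return cos(n * acos(x))

x = 0.31          # public seed
a, b = 37, 53     # Alice's and Bob's secret degrees (toy sizes)

alice_public = chebyshev(a, x)
bob_public = chebyshev(b, x)

shared_a = chebyshev(a, bob_public)    # Alice computes T_a(T_b(x))
shared_b = chebyshev(b, alice_public)  # Bob computes   T_b(T_a(x))
assert abs(shared_a - shared_b) < 1e-9  # both equal T_{ab}(x)
```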

  9. Heat and mass transfer during the cryopreservation of a bioartificial liver device: a computational model.

    Science.gov (United States)

    Balasubramanian, Saravana K; Coger, Robin N

    2005-01-01

    Bioartificial liver devices (BALs) have proven to be an effective bridge to transplantation for cases of acute liver failure. Enabling the long-term storage of these devices using a method such as cryopreservation will ensure their easy off-the-shelf availability. To date, cryopreservation of liver cells has been attempted for both single cells and sandwich cultures. This study presents the potential of using computational modeling to help develop a cryopreservation protocol for storing the three-dimensional BAL HepatAssist. The focus is upon determining the thermal and concentration profiles as the BAL is cooled from 37 °C to -100 °C; the cooling is completed in two steps: a cryoprotectant loading step and a phase-change step. The results indicate that, for the loading step, mass transfer controls the duration of the protocol, whereas for the phase-change step, when mass transfer is assumed negligible, the latent heat released during freezing is the controlling factor. The cryoprotocol that is ultimately proposed considers time, cooling rate, and the temperature gradients that the cellular space is exposed to during cooling. To our knowledge, this study is the first reported effort toward designing an effective protocol for the cryopreservation of a three-dimensional BAL device.
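
    For the phase-change step, where latent heat controls the cooling, a common modelling device is an apparent heat capacity that absorbs the latent heat over a narrow freezing interval. A one-dimensional explicit finite-difference sketch of that idea (all material properties and geometry are illustrative placeholders, not the study's BAL model):

```python
import numpy as np

# Illustrative properties (not tissue/device values).
k, rho, cp = 0.5, 1000.0, 3500.0   # W/m/K, kg/m^3, J/kg/K
L_f = 3.0e5                        # latent heat of fusion, J/kg
T_f, dT = -5.0, 1.0                # freezing point and half-width of freezing range, deg C

nx, dx, dt = 50, 1e-4, 1e-3        # grid cells, cell size (m), time step (s)
T = np.full(nx, 37.0)              # whole domain starts at 37 deg C

def c_apparent(T):
    """Apparent heat capacity: latent heat released over [T_f - dT, T_f + dT]."""
    return cp + np.where(np.abs(T - T_f) < dT, L_f / (2 * dT), 0.0)

for step in range(20000):
    T[0] = T[-1] = -100.0          # boundaries held at the final storage temperature
    alpha = k / (rho * c_apparent(T[1:-1]))
    assert np.all(alpha * dt / dx**2 <= 0.5)   # explicit-scheme stability check
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
```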

  10. Geologic Maps as the Foundation of Mineral-Hazards Maps in California

    Science.gov (United States)

    Higgins, C. T.; Churchill, R. K.; Downey, C. I.; Clinkenbeard, J. P.; Fonseca, M. C.

    2010-12-01

    … that show potential for mineral hazards. Depending on the type of mineral hazard investigated, qualitative and/or quantitative methods are used in this process. The final information is given to CGS clients in various formats that range from traditional paper maps to attributed digital layers, which can be viewed on background digital imagery in 2D or 3D with image viewers or GIS software. This variety of formats assures that users with different levels of computer experience or available computer resources can access the information. Besides the applications presented here, mineral-hazards mapping can also be used in many other settings and situations as a tool to evaluate potential effects on human health and the environment. Examples include fighting forest fires, harvesting of timber, post-fire debris flows during storms, disposal or import of earth materials for non-highway construction projects, and rural areas used for recreation (hiking, motorcycling, etc.). In the future, the CGS expects to investigate and possibly employ more-sophisticated digital algorithms to rate and display the potential for specific mineral hazards on its maps. The geologist’s knowledge and experience will still be needed, however, to review these digital results to decide if they are reasonable.

  11. Clustered deep shadow maps for integrated polyhedral and volume rendering

    KAUST Repository

    Bornik, Alexander

    2012-01-01

    This paper presents a hardware-accelerated approach for shadow computation in scenes containing both complex volumetric objects and polyhedral models. Our system is the first hardware-accelerated complete implementation of deep shadow maps, which unifies the computation of volumetric and geometric shadows. Up to now, such unified computation was limited to software-only rendering. Previous hardware-accelerated techniques can handle only geometric or only volumetric scenes - both resulting in the loss of important properties of the original concept. Our approach supports interactive rendering of polyhedrally bounded volumetric objects on the GPU based on ray casting. The ray casting can be conveniently used for both the shadow map computation and the rendering. We show how anti-aliased high-quality shadows are feasible in scenes composed of multiple overlapping translucent objects, and how sparse scenes can be handled efficiently using clustered deep shadow maps. © 2012 Springer-Verlag.

  12. Collaborative and Multilingual Approach to Learn Database Topics Using Concept Maps

    Science.gov (United States)

    Calvo, Iñaki

    2014-01-01

    The authors report on a study using the concept mapping technique in computer engineering education for learning theoretical introductory database topics. In addition, the learning of multilingual technical terminology by means of the collaborative drawing of a concept map is also pursued in this experiment. The main characteristics of a study carried out in the database subject at the University of the Basque Country during the 2011/2012 course are described. This study contributes to the field of concept mapping, as these kinds of cognitive tools have proved to be valid supports for learning in computer engineering education. It also contributes to the field of computer engineering education by providing a technique that can be incorporated with several educational purposes within the discipline. Results reveal the potential that a collaborative concept map editor offers to fulfil the above-mentioned objectives. PMID:25538957

  13. Analysis of natural convection heat transfer with crust formation in the molten metal pool using CONV-2 and 3D computer codes

    International Nuclear Information System (INIS)

    Park, R. J.; Kang, K. H.; Kim, S. B.; Kim, H. D.; Choi, S. M.

    1998-01-01

    Analytical studies have been performed on natural convection heat transfer with crust formation in a molten metal pool, to validate and evaluate experimental data, using the CONV-2D and CONV-3D computer codes. Two types of steady-state tests, low and high geometric-aspect-ratio cases in the molten metal pool, were performed to investigate crust thickness as a function of boundary conditions. The CONV-2D and CONV-3D computer codes were developed under the OECD/NEA RASPLAV project to simulate two- and three-dimensional natural convection heat transfer with crust formation, respectively. The Rayleigh-Benard flow patterns in the molten metal pool contribute to the temperature distribution, which affects non-uniform crust formation. The CONV-2D results on crust thickness are a little higher than the experimental data because of heat loss during the test. Comparing the CONV-3D results with the CONV-2D results on crust thickness, the three-dimensional results are higher than the two-dimensional results because of the three-dimensional natural convection flow and the wall effect.

  14. A users manual for a computer program which calculates time optimal geocentric transfers using solar or nuclear electric and high thrust propulsion

    Science.gov (United States)

    Sackett, L. L.; Edelbaum, T. N.; Malchow, H. L.

    1974-01-01

    This manual is a guide for using a computer program which calculates time-optimal trajectories for high- and low-thrust geocentric transfers. Either SEP or NEP may be assumed, and a one- or two-impulse, fixed total delta-V, initial high-thrust phase may be included. A single impulse of specified delta-V may also be included after the low-thrust phase. The low-thrust phase utilizes equinoctial orbital elements to avoid the classical singularities, and Kryloff-Boguliuboff averaging to help ensure more rapid computation. The program is written in FORTRAN 4 in double precision for use on an IBM 360 computer. The manual includes a description of the problem treated, input/output information, examples of runs, and source code listings.
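
    The equinoctial elements mentioned above replace the classical elements so that the singularities at zero eccentricity and zero inclination disappear. A short sketch of the standard conversion (my own illustration of the textbook formulas, not code from the manual):

```python
from math import sin, cos, tan, radians

def classical_to_equinoctial(a, e, i, raan, argp, M):
    """Classical orbital elements -> equinoctial elements (a, h, k, p, q, lam).

    Angles in radians; the resulting set is singularity-free for e = 0 and i = 0.
    """
    h = e * sin(argp + raan)
    k = e * cos(argp + raan)
    p = tan(i / 2) * sin(raan)
    q = tan(i / 2) * cos(raan)
    lam = M + argp + raan          # mean longitude
    return a, h, k, p, q, lam

# Example: an illustrative elliptical transfer orbit (invented numbers).
print(classical_to_equinoctial(24000e3, 0.7, radians(28.5),
                               radians(45.0), radians(30.0), 0.0))
```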

  15. Improving the bulk data transfer experience

    Energy Technology Data Exchange (ETDEWEB)

    Guok, Chin; Guok, Chin; Lee, Jason R.; Berket, Karlo

    2008-05-07

    Scientific computations and collaborations increasingly rely on the network to provide high-speed data transfer, dissemination of results, access to instruments, support for computational steering, etc. The Energy Sciences Network is establishing a science data network to provide user-driven bandwidth allocation. In a shared network environment, some reservations may not be granted due to the lack of available bandwidth on any single path. In many cases, however, the available bandwidth across multiple paths would be sufficient to grant the reservation. In this paper we investigate how to utilize the available bandwidth across multiple paths in the case of bulk data transfer.
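
    The core idea is that a reservation rejected on every single path may still be satisfiable by striping it over several paths. A toy greedy allocation illustrating this (path names and capacities are invented):

```python
def allocate_multipath(request_gbps, paths):
    """Greedily split a bandwidth request across paths with spare capacity.

    paths: dict of path name -> available bandwidth (Gbps).
    Returns an allocation dict, or None if all paths together still fall short.
    """
    if sum(paths.values()) < request_gbps:
        return None
    allocation, remaining = {}, request_gbps
    for name, avail in sorted(paths.items(), key=lambda kv: -kv[1]):
        take = min(avail, remaining)
        if take > 0:
            allocation[name] = take
            remaining -= take
        if remaining == 0:
            break
    return allocation

# No single path can carry 12 Gbps, but three paths together can.
print(allocate_multipath(12, {"path-A": 6, "path-B": 4, "path-C": 5}))
```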

  16. Deciphering the Arginine-binding preferences at the substrate-binding groove of Ser/Thr kinases by computational surface mapping.

    Directory of Open Access Journals (Sweden)

    Avraham Ben-Shimon

    2011-11-01

    Protein kinases are key signaling enzymes that catalyze the transfer of the γ-phosphate from an ATP molecule to a phospho-accepting residue in the substrate. Unraveling the molecular features that govern the preference of kinases for particular residues flanking the phosphoacceptor is important for understanding kinase specificities toward their substrates and for designing substrate-like peptidic inhibitors. We applied ANCHORSmap, a new fragment-based computational approach for mapping amino acid side chains on protein surfaces, to predict and characterize the preference of kinases toward arginine binding. We focus on positions P-2 and P-5, commonly occupied by arginine (Arg) in substrates of basophilic Ser/Thr kinases. The method accurately identified all the P-2/P-5 Arg binding sites previously determined by X-ray crystallography and produced Arg preferences that corresponded to those experimentally found by peptide arrays. The predicted Arg-binding positions and their associated pockets were analyzed in terms of shape, physicochemical properties, amino acid composition, and in-silico mutagenesis, providing structural rationalization for previously unexplained trends in kinase preferences toward Arg moieties. This methodology sheds light on several kinases that were described in the literature as having non-trivial preferences for Arg, and provides some surprising departures from the prevailing views regarding residues that determine kinase specificity toward Arg. In particular, we found that the preference for a P-5 Arg is not necessarily governed by the 170/230 acidic pair, as was previously assumed, but by several different pairs of acidic residues, selected from positions 133, 169, and 230 (PKA numbering). The acidic residue at position 230 serves as a pivotal element in recognizing Arg from both the P-2 and P-5 positions.

  18. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Science.gov (United States)

    Zheng, Qi; Grice, Elizabeth A

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.
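
    The Bayesian idea behind such a framework can be stated compactly: turn each candidate alignment's score into a likelihood, normalize over all candidates to get a posterior, and report the best hit's posterior as a Phred-scaled mapping quality. A schematic sketch of that computation (the score-to-likelihood model here is a generic stand-in, not AlignerBoost's exact formulation):

```python
import numpy as np

def mapping_quality(alignment_scores):
    """Posterior probability of the best hit among all candidate alignments.

    alignment_scores: higher = better alignment (e.g., from the aligner).
    Returns (index of best hit, Phred-scaled mapping quality).
    """
    s = np.asarray(alignment_scores, dtype=float)
    likelihood = np.exp(s - s.max())           # stabilized exponential score model
    posterior = likelihood / likelihood.sum()  # Bayes' rule with a uniform prior
    best = int(np.argmax(posterior))
    perr = max(1.0 - posterior[best], 1e-10)   # probability the best hit is wrong
    return best, -10.0 * np.log10(perr)        # Phred scale

# A read with one strong hit versus two repetitive near-ties elsewhere.
print(mapping_quality([60, 31, 30]))
```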

  19. Mechanistic Investigation of Molybdate-Catalysed Transfer Hydrodeoxygenation

    DEFF Research Database (Denmark)

    Larsen, Daniel Bo; Petersen, Allan Robertson; Dethlefsen, Johannes Rytter

    2016-01-01

    The molybdate-catalysed transfer hydrodeoxygenation (HDO) of benzyl alcohol to toluene, driven by oxidation of the solvent isopropyl alcohol to acetone, has been investigated by using a combination of experimental and computational methods. A Hammett study compared the relative rates for the …

  20. Utilizing HDF4 File Content Maps for the Cloud

    Science.gov (United States)

    Lee, Hyokyung Joe

    2016-01-01

    We demonstrate in a prototype study that the HDF4 file content map can be used for efficiently organizing data in a cloud object storage system to facilitate cloud computing. This approach can be extended to any binary data format and to any existing big-data analytics solution powered by cloud computing, because the HDF4 file content map project started as long-term preservation of NASA data that does not require the HDF4 APIs to access the data.
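
    A file content map lists, for each dataset, where its bytes live inside the file, which is exactly what an HTTP range request against object storage needs. A minimal sketch of reading one mapped chunk without any HDF4 library (the URL, offsets, dtype, and shape are invented for illustration):

```python
import numpy as np
import requests

# One entry of a hypothetical content map: byte extent of an uncompressed chunk.
chunk = {"offset": 4096, "length": 8000, "dtype": "<f4", "shape": (40, 50)}
url = "https://example-bucket.s3.amazonaws.com/legacy/granule.hdf"  # placeholder

# Fetch only the mapped bytes from object storage via an HTTP Range request.
byte_range = f"bytes={chunk['offset']}-{chunk['offset'] + chunk['length'] - 1}"
resp = requests.get(url, headers={"Range": byte_range})
resp.raise_for_status()

# Interpret the raw bytes using the type/shape recorded in the content map.
data = np.frombuffer(resp.content, dtype=chunk["dtype"]).reshape(chunk["shape"])
```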

  1. Well location and land-use mapping in the Columbia Plateau area

    International Nuclear Information System (INIS)

    Stephan, J.; Foote, H.; Coburn, V.

    1979-10-01

    Irrigation wells in a 41,000-square-mile area located in Washington and northern Oregon were the subject of this study. Approximately 30,000 square miles of the area were mapped within the boundary of the Columbia Plateau, which covers some 48,200 square miles in the states of Washington, Oregon, and Idaho. Advanced state-of-the-art computer analysis techniques for processing Landsat digital multispectral data were used for mapping the area into ten land-use classes. Specially designed computer programs were used for mapping the locations of 1476 irrigation wells located in 13 counties. Six thematic color-encoded maps were prepared which show additional land-use types and relative areal distribution. Three maps depict the locations of irrigation wells.

  2. Computational heat transfer analysis and combined ANN–GA ...

    Indian Academy of Sciences (India)

    The analysis using the numerical simulation and neural network … Optimization is the process of finding the most plausible and desirable solution to a problem. … increased heat transfer and compared the results of the regular non-fuzzy model and the fuzzy model. … network is designed using the MATLAB Neural Network toolbox.

  3. An Optimization Approach to Improving Collections of Shape Maps

    DEFF Research Database (Denmark)

    Nguyen, Andy; Ben‐Chen, Mirela; Welnicka, Katarzyna

    2011-01-01

    … pairwise map independently does not take full advantage of all existing information. For example, a notorious problem with computing shape maps is the ambiguity introduced by the symmetry problem: for two similar shapes which have reflectional symmetry there exist two maps which are equally favorable … shape maps connecting our collection, we propose to add the constraint of global map consistency, requiring that any composition of maps between two shapes should be independent of the path chosen in the network. This requirement can help us choose among the equally good symmetric alternatives, or help …
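
    Global map consistency is easy to state computationally: composing the maps around any cycle of shapes should return (approximately) the identity. A toy check with point-to-point maps encoded as permutation matrices (the three maps are invented data, not the paper's optimization):

```python
import numpy as np

def cycle_inconsistency(maps):
    """Compose maps around a closed cycle; return the deviation from identity.

    maps: list of (n, n) correspondence matrices mapping shape i -> shape i+1,
    with the last one mapping back to shape 0.
    """
    n = maps[0].shape[0]
    composed = np.eye(n)
    for m in maps:
        composed = m @ composed
    return np.linalg.norm(composed - np.eye(n))

n = 5
A = np.eye(n)[np.random.permutation(n)]       # shape0 -> shape1
B = np.eye(n)[np.random.permutation(n)]       # shape1 -> shape2
C_good = np.linalg.inv(B @ A)                 # shape2 -> shape0, closes the cycle
C_bad = np.eye(n)[np.random.permutation(n)]   # an arbitrary, inconsistent choice

print(cycle_inconsistency([A, B, C_good]))  # ~0: path-independent, consistent
print(cycle_inconsistency([A, B, C_bad]))   # > 0: inconsistent collection
```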

  4. Quantum baker maps with controlled-not coupling

    International Nuclear Information System (INIS)

    Vallejos, Raul O; Santoro, Pedro R del; Almeida, Alfredo M Ozorio de

    2006-01-01

    The characteristic stretching and squeezing of chaotic motion is linearized within the finite number of phase-space domains which subdivide a classical baker map. Tensor products of such maps are also chaotic, but a more interesting generalized baker map arises if the stacking orders for the factor maps are allowed to interact. These maps are readily quantized, in such a way that the stacking interaction is entirely attributed to primary qubits in each map, if each jth subsystem has Hilbert space dimension D_j = 2^{n_j}. We here study the particular example of two baker maps that interact via a controlled-not interaction, which is a universal gate for quantum computation. Numerical evidence indicates that the control subspace becomes an ideal Markovian environment for the target map in the limit of large Hilbert space dimension.

  5. Flow Visualization with Quantified Spatial and Temporal Errors Using Edge Maps

    KAUST Repository

    Bhatia, H.; Jadhav, S.; Bremer, P.; Guoning Chen,; Levine, J. A.; Nonato, L. G.; Pascucci, V.

    2012-01-01

    Robust analysis of vector fields has been established as an important tool for deriving insights from the complex systems these fields model. Traditional analysis and visualization techniques rely primarily on computing streamlines through numerical integration. The inherent numerical errors of such approaches are usually ignored, leading to inconsistencies that cause unreliable visualizations and can ultimately prevent in-depth analysis. We propose a new representation for vector fields on surfaces that replaces numerical integration through triangles with maps from the triangle boundaries to themselves. This representation, called edge maps, permits a concise description of flow behaviors and is equivalent to computing all possible streamlines at a user-defined error threshold. Independent of this error, streamlines computed using edge maps are guaranteed to be consistent up to floating-point precision, enabling the stable extraction of features such as the topological skeleton. Furthermore, our representation explicitly stores spatial and temporal errors which we use to produce more informative visualizations. This work describes the construction of edge maps, the error quantification, and a refinement procedure to adhere to a user-defined error bound. Finally, we introduce new visualizations using the additional information provided by edge maps to indicate the uncertainty involved in computing streamlines and topological structures. © 2012 IEEE.

  7. A pilot trial on pulmonary emphysema quantification and perfusion mapping in a single-step using contrast-enhanced dual-energy computed tomography.

    Science.gov (United States)

    Lee, Choong Wook; Seo, Joon Beom; Lee, Youngjoo; Chae, Eun Jin; Kim, Namkug; Lee, Hyun Joo; Hwang, Hye Jeon; Lim, Chae-Hun

    2012-01-01

    To determine whether contrast-enhanced dual-energy computed tomography angiography (DECTA) can be used for simultaneous assessment of emphysema quantification and regional perfusion evaluation, we assessed 27 patients who had pulmonary emphysema and no pulmonary embolism on visual assessment of CT images, among 584 consecutive patients who underwent DECTA for the evaluation of pulmonary embolism. Virtual noncontrast (VNC) images were generated by modifying the "Liver VNC" application in a dedicated workstation. Using in-house software, the low-attenuation area below -950 HU (LAA950), the 15th-percentile attenuation (15pctlVNC) and the mean lung attenuation (MeanVNC) were calculated. The "Lung PBV" application was used to assess perfusion, and the low-iodine area below 5 HU (LIA5), the 15th-percentile iodine (15pctlIodine), and the mean iodine value (MeanIodine) were calculated from iodine map images. The correlation between VNC parameters and pulmonary function test data (available in 22 patients) and the correlation between VNC and iodine map parameters (all 27 included patients) were assessed. Color-coded maps of the VNC images were compared with iodine map images for the evaluation of regional heterogeneity. We observed moderate correlations between LAA950 and predicted %FEV1 (rs = -0.47, P < …) … VNC images. We observed moderate correlations between quantitative parameters on VNC images and pulmonary function test data, and also observed moderate correlations between the severity of parenchymal destruction, as determined from VNC images, and perfusion status, as determined from iodine maps. Therefore, contrast-enhanced DECTA can be used for emphysema quantification and regional perfusion evaluation by using the VNC images and iodine map simultaneously.

  8. REMap: Operon Map of M. tuberculosis

    Science.gov (United States)

    Xia, Fang Fang; Stevens, Rick L.; Bishai, William R.; Lamichhane, Gyanu

    2016-01-01

    A map of the transcriptional organization of genes of an organism is a basic tool that is necessary to understand and facilitate a more accurate genetic manipulation of the organism. Operon maps are largely generated by computational prediction programs that rely on gene conservation and genome architecture and may not be physiologically relevant. With the widespread use of RNA sequencing (RNAseq), the prediction of operons based on actual transcriptome sequencing rather than computational genomics alone is much needed. Here, we report a validated operon map of Mycobacterium tuberculosis, developed using RNAseq data from both the exponential and stationary phases of growth. At least 58.4% of M. tuberculosis genes are organized into 749 operons. Our prediction algorithm, REMap (RNA Expression Mapping of operons), considers the many cases of transcription coverage of intergenic regions, and avoids dependencies on functional annotation and arbitrary assumptions about gene structure. As a result, we demonstrate that REMap is able to more accurately predict operons, especially those that contain long intergenic regions or functionally unrelated genes, than previous operon prediction programs. The REMap algorithm is publicly available as a user-friendly tool that can be readily modified to predict operons in other bacteria. PMID:27450008
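
    The essence of RNAseq-based operon calling is to join adjacent same-strand genes whenever the intergenic region between them is transcribed. A simplified sketch of such a rule (the thresholds and gene tuples are invented; REMap's published algorithm handles many more cases, such as long intergenic regions):

```python
def call_operons(genes, coverage, min_intergenic_cov=5.0):
    """Group adjacent same-strand genes into operons using RNAseq coverage.

    genes: list of (name, start, end, strand), sorted by start position.
    coverage: per-base read depth along the genome (list or array).
    """
    operons, current = [], [genes[0]]
    for prev, nxt in zip(genes, genes[1:]):
        intergenic = coverage[prev[2]:nxt[1]]
        expressed = (min(intergenic) if len(intergenic) else min_intergenic_cov) \
            >= min_intergenic_cov
        if nxt[3] == prev[3] and expressed:
            current.append(nxt)       # transcription continues: same operon
        else:
            operons.append(current)   # coverage gap or strand switch: new operon
            current = [nxt]
    operons.append(current)
    return operons

genes = [("g1", 0, 900, "+"), ("g2", 950, 2000, "+"), ("g3", 2500, 3300, "-")]
cov = [10.0] * 3400
print([[g[0] for g in op] for op in call_operons(genes, cov)])  # [[g1, g2], [g3]]
```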

  9. Mapping the space of genomic signatures.

    Directory of Open Access Journals (Sweden)

    Lila Kari

    We propose a computational method to measure and visualize interrelationships among any number of DNA sequences, allowing, for example, the examination of hundreds or thousands of complete mitochondrial genomes. An "image distance" is computed for each pair of graphical representations of DNA sequences, and the distances are visualized as a Molecular Distance Map: each point on the map represents a DNA sequence, and the spatial proximity between any two points reflects the degree of structural similarity between the corresponding sequences. The graphical representation of DNA sequences utilized, Chaos Game Representation (CGR), is genome- and species-specific and can thus act as a genomic signature. Consequently, Molecular Distance Maps could inform species identification, taxonomic classifications and, to a certain extent, evolutionary history. The image distance employed, the Structural Dissimilarity Index (DSSIM), implicitly compares the occurrences of oligomers of length up to k (herein k = 9) in DNA sequences. We computed DSSIM distances for more than 5 million pairs of complete mitochondrial genomes, and used Multi-Dimensional Scaling (MDS) to obtain Molecular Distance Maps that visually display the sequence relatedness in various subsets, at different taxonomic levels. This general-purpose method does not require DNA sequence alignment and can thus be used to compare similar or vastly different DNA sequences, genomic or computer-generated, of the same or different lengths. We illustrate potential uses of this approach by applying it to several taxonomic subsets: phylum Vertebrata, (super)kingdom Protista, classes Amphibia-Insecta-Mammalia, class Amphibia, and order Primates. This analysis of an extensive dataset confirms that the oligomer composition of full mtDNA sequences can be a source of taxonomic information. This method also correctly finds the mtDNA sequences most closely related to that of the anatomically modern human (the Neanderthal …
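
    Chaos Game Representation itself takes only a few lines of code: each base pulls the current point halfway toward its assigned unit-square corner, so k-mer frequencies become pixel densities. A minimal sketch (corner assignment follows a common A/C/G/T convention; the toy sequence is illustrative):

```python
import numpy as np

CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_image(seq, bits=9):
    """Chaos Game Representation: histogram of visits on a 2^bits square grid."""
    size = 2 ** bits                        # a 2^k grid resolves k-mer counts
    img = np.zeros((size, size))
    x, y = 0.5, 0.5                         # start at the center of the unit square
    for base in seq:
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0   # move halfway toward the corner
        img[min(int(y * size), size - 1), min(int(x * size), size - 1)] += 1
    return img

img = cgr_image("ACGTACGGTTAGCCATTAGGCAT" * 100)
print(img.sum())   # one count per base in the sequence
```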

  10. Automatic Mapping of NES Games with Mappy

    OpenAIRE

    Osborn, Joseph C.; Summerville, Adam; Mateas, Michael

    2017-01-01

    Game maps are useful for human players, general-game-playing agents, and data-driven procedural content generation. These maps are generally made by hand-assembling manually-created screenshots of game levels. Besides being tedious and error-prone, this approach requires additional effort for each new game and level to be mapped. The results can still be hard for humans or computational systems to make use of, privileging visual appearance over semantic information. We describe a software sys...

  11. Behavior of Poisson Bracket Mapping Equation in Studying Excitation Energy Transfer Dynamics of Cryptophyte Phycocyanin 645 Complex

    International Nuclear Information System (INIS)

    Lee, Weon Gyu; Kelly, Aaron; Rhee, Young Min

    2012-01-01

    Recently, it has been shown that quantum coherence appears in the energy transfer of various photosynthetic light-harvesting complexes at temperatures from cryogenic up to room temperature. Because photosynthetic systems are inherently complex, these findings have subsequently interested many researchers in the field, both experimental and theoretical. On the theoretical side, simplified dynamics or semiclassical approaches have been widely used. In these approaches, the quantum-classical Liouville equation (QCLE) is the fundamental starting point. Toward a semiclassical scheme, approximations are needed to simplify the equations of motion of the various degrees of freedom. Here, we have adopted the Poisson bracket mapping equation (PBME) as an approximate form of the QCLE and applied it to find the time evolution of the excitation in a photosynthetic complex from marine algae. The benefit of using the PBME is its similarity to conventional Hamiltonian dynamics. Through this, we confirmed the coherent population transfer behavior in the short-time domain previously reported with a more accurate but more time-consuming iterative linearized density matrix approach. However, we find that the site populations do not behave according to the Boltzmann law in the long-time limit. We also test the effect of adding spurious high-frequency vibrations to the spectral density of the bath, and find that their existence does not alter the dynamics to any significant extent as long as the associated reorganization energy is not changed too drastically. This suggests that adopting classical-trajectory-based ensembles in semiclassical simulations should not influence the coherence dynamics in any practical manner, even though the classical trajectories often yield spurious high-frequency vibrational features in the spectral density.

  12. Computed Tomography (CT) -- Head

    Medline Plus

    … images. These images can be viewed on a computer monitor, printed on film or transferred to a … other in a ring, called a gantry. The computer workstation that processes the imaging information is located …

  13. Computed Tomography (CT) -- Sinuses

    Medline Plus

    … images. These images can be viewed on a computer monitor, printed on film or transferred to a … other in a ring, called a gantry. The computer workstation that processes the imaging information is located …

  14. Electron transfer in organic glass. Distance and energy dependence

    International Nuclear Information System (INIS)

    Krongauz, V.V.

    1992-01-01

    The authors have investigated the distance and energy dependence of electron transfer in rigid organic glasses containing randomly dispersed electron-donor and electron-acceptor molecules. Pulsed radiolysis by an electron beam from a linear accelerator was used for ionization, resulting in charge deposition on donor molecules. The disappearance kinetics of donor radical anions due to electron transfer to the acceptor was monitored spectroscopically by the change in optical density at the wavelength corresponding to that of the donor radical anion absorbance. It was found that the rate of electron transfer observed experimentally was higher than that computed using the Marcus-Levich theory assuming that the electron-transfer activation barrier is equal to the binding energy of the electron on the donor molecule. This discrepancy between the experimental and computed results suggests that the "inert" media in which the electron-transfer reaction takes place may be participating in the process, resulting in the experimentally observed higher electron-transfer rates. 32 refs., 3 figs., 2 tabs.

  15. Seafloor mapping of large areas using multibeam system - Indian experience

    Digital Repository Service at National Institute of Oceanography (India)

    Kodagali, V.N.; KameshRaju, K.A; Ramprasad, T.

    … averaged and merged to produce large-area maps. Maps were generated at scales of 1:1 million and 1:1.5 million, covering an area of about 2 million sq. km in a single map. Depth contours at a regular interval were also generated. A computer program was developed to convert the depth data …

  16. Dynamical zeta functions for piecewise monotone maps of the interval

    CERN Document Server

    Ruelle, David

    2004-01-01

    Consider a space M, a map \(f: M \to M\), and a function \(g: M \to \mathbb{C}\). The formal power series \(\zeta(z) = \exp \sum_{m=1}^{\infty} \frac{z^m}{m} \sum_{x \in \mathrm{Fix}\, f^m} \prod_{k=0}^{m-1} g(f^k x)\) yields an example of a dynamical zeta function. Such functions have unexpected analytic properties and interesting relations to the theory of dynamical systems, statistical mechanics, and the spectral theory of certain operators (transfer operators). The first part of this monograph presents a general introduction to this subject. The second part is a detailed study of the zeta functions associated with piecewise monotone maps of the interval [0,1]. In particular, Ruelle gives a proof of a generalized form of the Baladi-Keller theorem relating the poles of \(\zeta(z)\) and the eigenvalues of the transfer operator. He also proves a theorem expressing the largest eigenvalue of the transfer operator in terms of the ergodic properties of (M, f, g).

  17. An integrated system for land resources supervision based on the IoT and cloud computing

    Science.gov (United States)

    Fang, Shifeng; Zhu, Yunqiang; Xu, Lida; Zhang, Jinqu; Zhou, Peiji; Luo, Kan; Yang, Jie

    2017-01-01

    Integrated information systems are important safeguards for the utilisation and development of land resources. Information technologies, including the Internet of Things (IoT) and cloud computing, are inevitable requirements for the quality and efficiency of land resources supervision tasks. In this study, an economical and highly efficient supervision system for land resources has been established based on IoT and cloud computing technologies; a novel online and offline integrated system with synchronised internal and field data that includes the entire process of 'discovering breaches, analysing problems, verifying fieldwork and investigating cases' was constructed. The system integrates key technologies, such as the automatic extraction of high-precision information based on remote sensing, semantic ontology-based technology to excavate and discriminate public sentiment on the Internet that is related to illegal incidents, high-performance parallel computing based on MapReduce, uniform storing and compressing (bitwise) technology, global positioning system data communication and data synchronisation mode, intelligent recognition and four-level ('device, transfer, system and data') safety control technology. The integrated system based on a 'One Map' platform has been officially implemented by the Department of Land and Resources of Guizhou Province, China, and was found to significantly increase the efficiency and level of land resources supervision. The system promoted the overall development of informatisation in fields related to land resource management.

  18. Heat transfer study on open heat exchangers used in jaggery production modules – Computational Fluid Dynamics simulation and field data assessment

    International Nuclear Information System (INIS)

    La Madrid, Raul; Marcelo, Daniel; Orbegoso, Elder Mendoza; Saavedra, Rafael

    2016-01-01

    Highlights: • Heat transfer modeling and simulation between flue gases and sugar cane juice. • Use of Computational Fluid Dynamics to get thermal parameters of a jaggery furnace. • Data acquisition system installed in the jaggery production module. • Parametric analysis changing the flue-gas velocity to represent temperature drops. - Abstract: Jaggery (also called organic sugar) is a concentrated product of sugarcane juice that is produced in rural communities in the highlands and jungle of Peru. In the last few years there has been an increase in exports of jaggery, and higher volumes of production are required, driving this activity from a rural process with small production to an industry seeking greater productivity. In this framework, optimization of the use of energy becomes essential for the proper development of the production process and the correct performance of the equipment involved. Open heat exchangers made of stainless steel are used in the production of jaggery. These heat exchangers containing sugarcane juice are placed over a flue-gas duct. The thermal energy contained in the gas is used to evaporate the water contained in the sugarcane juice, thickening the juice; after almost all the water has evaporated, a pasty crystalline yellow substance is left in the boiling pan, which becomes solid after cooling - this is the jaggery. The modeling and simulation of heat transfer between the combustion gases and the juice is very important in order to improve the thermal efficiency of the process. It makes it possible to know, with a high level of detail, the physical phenomena of heat transfer occurring from bagasse combustion flue gases to sugarcane juice. This paper presents the results of the numerical simulation of heat transfer phenomena in the open heat exchangers, and those results are compared to field-measured data. Numerical results for the temperature drop of flue gases at several locations of the jaggery furnace are in good accordance with …

  19. Worldwide complete spherical Bouguer and isostatic anomaly maps

    Science.gov (United States)

    Bonvalot, S.; Balmino, G.; Briais, A.; Peyrefitte, A.; Vales, N.; Biancale, R.; Gabalda, G.; Reinquin, F.

    2011-12-01

    We present here a set of digital maps of the Earth's gravity anomalies (surface "free air", Bouguer and isostatic), computed at the Bureau Gravimétrique International (BGI) as a contribution to the Global Geodetic Observing System (GGOS) and to the global geophysical maps published by the Commission for the Geological Map of the World (CGMW). The free-air and Bouguer anomaly concept is extensively used in geophysical interpretation to investigate the density distributions in the Earth's interior. Complete Bouguer anomalies (including terrain effects) are usually computed at regional scales by integrating the gravity attraction of topography elements over and beyond a given area (under planar or spherical approximations). Here, we developed and applied a worldwide spherical approach aimed at providing a set of homogeneous and high-resolution gravity anomaly maps and grids computed at the Earth's surface, taking into account a realistic Earth model and reconciling geophysical and geodetic definitions of gravity anomalies. This first version (1.0) has been computed by spherical harmonic analysis/synthesis of the Earth's topography-bathymetry up to degree 10800. The detailed theory of the spherical harmonics approach is given in Balmino et al. (Journal of Geodesy, submitted). The Bouguer and terrain corrections have thus been computed in spherical geometry at 1'x1' resolution using the ETOPO1 topography/bathymetry, ice surface and bedrock models from the NOAA (National Oceanic and Atmospheric Administration), and taking into account precise characteristics (boundaries and densities) of major lakes, inner seas, polar caps and land areas below sea level. Isostatic corrections have been computed according to the Airy-Heiskanen model in spherical geometry for a constant depth of compensation of 30 km. The gravity information given here is provided by the Earth Geopotential Model (EGM2008), developed to degree 2160 by the National Geospatial-Intelligence Agency (NGA) (Pavlis …
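
    As a reminder of the quantities involved, the simple (infinite-slab) Bouguer correction that the complete spherical treatment refines is 2πGρh. A two-line worked example (standard constants; planar approximation only, unlike the spherical computation described above):

```python
from math import pi

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
rho = 2670.0         # standard crustal reduction density, kg m^-3
h = 1500.0           # station height above the reference level, m

# Simple Bouguer slab correction, converted from m/s^2 to mGal (1 mGal = 1e-5 m/s^2).
bouguer_mgal = 2 * pi * G * rho * h / 1e-5
print(f"{bouguer_mgal:.1f} mGal")   # about 0.112 mGal per metre at this density
```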

  20. Intelligent process mapping through systematic improvement of heuristics

    Science.gov (United States)

    Ieumwananonthachai, Arthur; Aizawa, Akiko N.; Schwartz, Steven R.; Wah, Benjamin W.; Yan, Jerry C.

    1992-01-01

    The present system for the automatic learning/evaluation of novel heuristic methods applicable to the mapping of communication-process sets on a computer network has its basis in the testing of a population of competing heuristic methods within a fixed time constraint. The TEACHER 4.1 prototype learning system, implemented for learning new postgame analysis heuristic methods, iteratively generates and refines the mappings of a set of communicating processes on a computer network. A systematic exploration of the space of possible heuristic methods is shown to promise significant improvement.

  1. Construction of microsatellite-based linkage map and mapping of nectarilessness and hairiness genes in Gossypium tomentosum.

    Science.gov (United States)

    Hou, Meiying; Cai, Caiping; Zhang, Shuwen; Guo, Wangzhen; Zhang, Tianzhen; Zhou, Baoliang

    2013-12-01

    Gossypium tomentosum, a wild tetraploid cotton species with AD genomes, possesses genes conferring strong fibers and high heat tolerance. To effectively transfer these genes into Gossypium hirsutum, an entire microsatellite (simple sequence repeat, SSR)-based genetic map was constructed using the interspecific cross of G. hirsutum x G. tomentosum (HT). We detected 1800 loci from 1347 pairs of polymorphic primers. Of these, 1204 loci were grouped into 35 linkage groups at LOD ≥ 4. The map covers 3320.8 cM, with a mean density of 2.76 cM per locus. We detected 420 common loci (186 in the At subgenome and 234 in Dt) between the HT map and the map of TM-1 (G. hirsutum) and Hai 7124 (G. barbadense; HB map). The linkage groups were assigned chromosome numbers based on the location of common loci, with the HB map as reference. A comparison of common markers revealed that no significant chromosomal rearrangements exist between G. tomentosum and G. barbadense. Interestingly, however, we detected numerous (33.7%) segregating loci deviating from the expected 3:1 ratio (P < …). The map constructed in this study will be useful for further genetic studies on cotton breeding, including mapping loci controlling quantitative traits associated with fiber quality and stress tolerance, and developing chromosome-segment-specific introgression lines from G. tomentosum into G. hirsutum using marker-assisted selection.

  2. SU-G-BRC-15: The Potential Clinical Significance of Dose Mapping Error for Intra- Fraction Dose Mapping for Lung Cancer Patients

    Energy Technology Data Exchange (ETDEWEB)

    Sayah, N [Thomas Cancer Center, Richmond, VA (United States); Weiss, E [Virginia Commonwealth University, Richmond, Virginia (United States); Watkins, W [University of Virginia, Charlottesville, VA (United States); Siebers, J [University of Virginia Health System, Charlottesville, VA (United States)

    2016-06-15

    Purpose: To evaluate the dose-mapping error (DME) inherent to conventional dose-mapping algorithms as a function of dose-matrix resolution. Methods: As DME has been reported to be greatest where dose gradients overlap tissue-density gradients, non-clinical 66 Gy IMRT plans were generated for 11 lung patients with the target edge defined as the maximum 3D density gradient on the 0% (end of inhale) breathing phase. Post-optimization, beams were copied to 9 breathing phases. Monte Carlo dose computed (at 2×2×2 mm³ resolution) on all 10 breathing phases was deformably mapped to phase 0% using the Monte Carlo energy-transfer method with congruent mass-mapping (EMCM); an externally implemented tri-linear interpolation method with voxel subdivision; Pinnacle's internal (tri-linear) method; and a post-processing energy-mass voxel-warping method (dTransform). All methods used the same base displacement vector field (or its pseudo-inverse as appropriate) for the dose mapping. Mapping was also performed at 4×4×4 mm³ by merging adjacent dose voxels. Results: Using EMCM as the reference standard, no clinically significant (>1 Gy) DMEs were found for the mean lung dose (MLD), lung V20Gy, or esophagus dose-volume indices, although MLD and V20Gy were statistically different (2×2×2 mm³). Pinnacle-to-EMCM target D98% DMEs of 4.4 and 1.2 Gy were observed (2×2×2 mm³). However, dTransform, which like EMCM conserves integral dose, had DME >1 Gy for one case. The root mean square (RMS) of the DME for the tri-linear-to-EMCM methods was lower at the smaller voxel volume for the tumor 4D-D98%, lung V20Gy, and cord D1%. Conclusion: When tissue gradients overlap with dose gradients, organs-at-risk DME was statistically significant but not clinically significant. Target D98% DME was deemed clinically significant for 2/11 patients (2×2×2 mm³). Since the tri-linear-to-EMCM RMS DME was reduced at 2×2×2 mm³, use of this resolution is …

  3. IMPACT OF SCHEMATIC DESIGNS ON THE COGNITION OF UNDERGROUND TUBE MAPS

    Directory of Open Access Journals (Sweden)

    Z. Liu

    2016-06-01

    Schematic maps have been popularly employed to represent transport networks, particularly underground tube (or metro) lines, since their adoption by the official London Underground in the early 1930s. Such maps employ straightened lines along horizontal, vertical and diagonal directions. Recently, some researchers have begun to argue that this kind of schematization may cause significant distortion, and new designs have been proposed. This project makes a comparative analysis of such a schematic design with a new design proposed by Mark Noad in 2011, which makes use of lines along 30º and 60º directions instead of the 45º direction. Tasks were designed to evaluate the effect of schematic designs on route planning by travellers. Each participant was asked to choose the route he or she would take among two or three possible route options and then read the name of the selected transfer station. Eye-tracking techniques were employed to track the map-recognition process. Total travel time is used as the criterion for effectiveness; completion time and mental work cost are used for efficiency evaluation. It was found that (1) the design of map style has a significant impact on users' travel decision making, especially the map-distance and transfer-station symbol designs, and (2) the design style of a schematic map has a great impact on the effectiveness and efficiency of map recognition.

  4. Regionalization: A Story Map Lesson on Regions

    Science.gov (United States)

    Edmondson, Deborah

    2018-01-01

    This lesson introduces the concept of regionalization and types of regions. After a brief introductory activity, students explore a story map to learn the material. The teacher can project the story map on a screen for all students to follow or students may work individually on computers. Working individually will allow students to set their own…

  5. Non-destructive electrochemical graphene transfer from reusable thin-film catalysts

    DEFF Research Database (Denmark)

    Pizzocchero, Filippo; Jessen, Bjarke Sørensen; Whelan, Patrick Rebsdorf

    2015-01-01

    We demonstrate an electrochemical method - which we term oxidative decoupling transfer (ODT) - for transferring chemical vapor deposited graphene from physically deposited copper catalyst layers. This copper oxidation-based transfer technique is generally applicable to copper surfaces … - up to 100 mm diameter films are demonstrated here - and the films exhibit a low Raman D:G peak ratio and a homogeneous and continuous distribution of sheet conductance mapped by THz time-domain spectroscopy. By applying a fixed potential of -0.4 V vs. an Ag/AgCl reference electrode - significantly below …

  6. High-Frequency Mapping of the IPv6 Internet Using YARRP

    Science.gov (United States)

    2017-03-01

    … is connected (i.e., the interconnection of the routers that make up the network). Topology mapping can be conducted through either passive or active means. In passive topology mapping, inferences are made about network connections based on data-plane traffic observed at specific points, such as web …

  7. Pseudo random number generator based on quantum chaotic map

    Science.gov (United States)

    Akhshani, A.; Akhavan, A.; Mobaraki, A.; Lim, S.-C.; Hassan, Z.

    2014-01-01

    For many years dissipative quantum maps were widely used as informative models of quantum chaos. In this paper, a new scheme for generating good pseudo-random numbers, based on the quantum logistic map, is proposed. Note that the pseudo-random number generator (PRNG) relies merely on the equations used in the quantum chaotic map. The algorithm is not complex, so it does not impose high requirements on computer hardware, and computation speed is fast. In order to face the challenge of using the proposed PRNG in quantum cryptography and other practical applications, the proposed PRNG is subjected to statistical tests using well-known test suites such as NIST, DIEHARD, ENT and TestU01. The results of the statistical tests were promising, as the proposed PRNG successfully passed all these tests. Moreover, the degree of non-periodicity of the chaotic sequences of the quantum map is investigated through the scale-index technique. The obtained result shows that the sequence is largely non-periodic. From these results it can be concluded that the new scheme can generate a high percentage of usable pseudo-random numbers for simulation and other applications in scientific computing.
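
    The general recipe (iterate a chaotic map, discard the transient, then harvest bits from the trajectory) can be illustrated with the classical logistic map as a stand-in; the paper's quantum logistic map adds dissipative correction terms, so this simplified classical sketch is ours, not the proposed generator:

```python
def logistic_prng_bytes(n_bytes, x0=0.615, r=3.99, burn_in=1000):
    """Toy PRNG: harvest bytes from logistic-map iterates (illustration only)."""
    x = x0
    for _ in range(burn_in):               # discard the transient
        x = r * x * (1 - x)
    out = bytearray()
    for _ in range(n_bytes):
        x = r * x * (1 - x)                # one chaotic iteration per output byte
        out.append(int(x * 2**32) & 0xFF)  # low byte of the scaled state
    return bytes(out)

sample = logistic_prng_bytes(16)
print(sample.hex())  # not cryptographically secure; for illustration only
```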

  8. Plume Tracker: Interactive mapping of volcanic sulfur dioxide emissions with high-performance radiative transfer modeling

    Science.gov (United States)

    Realmuto, Vincent J.; Berk, Alexander

    2016-11-01

    We describe the development of Plume Tracker, an interactive toolkit for the analysis of multispectral thermal infrared observations of volcanic plumes and clouds. Plume Tracker is the successor to MAP_SO2, and together these flexible and comprehensive tools have enabled investigators to map sulfur dioxide (SO2) emissions from a number of volcanoes with TIR data from a variety of airborne and satellite instruments. Our objective for the development of Plume Tracker was to improve the computational performance of the retrieval procedures while retaining the accuracy of the retrievals. We have achieved a 300× improvement in the benchmark performance of the retrieval procedures through the introduction of innovative data binning and signal reconstruction strategies, and improved the accuracy of the retrievals with a new method for evaluating the misfit between modeled and observed radiance spectra. We evaluated the accuracy of Plume Tracker retrievals with case studies based on MODIS and AIRS data acquired over Sarychev Peak Volcano, and ASTER data acquired over Kilauea and Turrialba Volcanoes. In the Sarychev Peak study, the AIRS-based estimate of total SO2 mass was 40% lower than the MODIS-based estimate. This result was consistent with a 45% reduction in the AIRS-based estimate of plume area relative to the corresponding MODIS-based estimate. In addition, we found that our AIRS-based estimate agreed with an independent estimate, based on a competing retrieval technique, within a margin of ±20%. In the Kilauea study, the ASTER-based concentration estimates from 21 May 2012 were within ±50% of concurrent ground-level concentration measurements. In the Turrialba study, the ASTER-based concentration estimates on 21 January 2012 were in exact agreement with SO2 concentrations measured at plume altitude on 1 February 2012.

  9. Attenuation maps for SPECT determined using cone beam transmission computed tomography

    International Nuclear Information System (INIS)

    Manglos, S.H.; Bassano, D.A.; Duxbury, C.E.; Capone, R.B.

    1990-01-01

    This paper presents a new method for measuring non-uniform attenuation maps, using a cone beam geometry CT scan acquired on a standard rotating gamma camera normally used for SPECT imaging. The resulting map is intended for use in non-uniform attenuation compensation of SPECT images. The method was implemented using a light-weight point source holder attached to the camera. A cone beam collimator may be used on the gamma camera, but the cone beam CT scans may also be acquired without a collimator. In either implementation, the advantages include very high efficiency and a resolution limited not by the collimator but by the intrinsic camera resolution (about 4 mm). Several phantoms were used to test the spatial uniformity, noise, linearity as a function of attenuation coefficient, and spatial resolution. Good quality attenuation maps were obtained, at least for the central slices where no truncation was present.

  10. Transfer and characterization of large-area CVD graphene for transparent electrode applications

    DEFF Research Database (Denmark)

    Whelan, Patrick Rebsdorf

    addresses key issues for industrial integration of large area graphene for optoelectronic devices. This is done through optimization of existing characterization methods and development of new transfer techniques. A method for accurately measuring the decoupling of graphene from copper catalysts...... and the electrical properties of graphene after transfer are superior compared to the standard etching transfer method. Spatial mapping of the electrical properties of transferred graphene is performed using terahertz time-domain spectroscopy (THz-TDS). The non-contact nature of THz-TDS and the fact...

  11. Ramachandran and his Map

    Indian Academy of Sciences (India)

    Ramachandran map, detailed in this article. His current interests are peptide, cyclic peptide and protein conforma- tions, energetics, data analysis, computer modeling as well as development of new algorithms useful for the conformational studies. C Ramakrishnan. Introduction. Professor G N Ramachandran was one of the ...

  12. Computational heat transfer analysis and combined ANN–GA

    Indian Academy of Sciences (India)

    The heat transfer augmentation is studied for different parameters such as inner radius, outer radius, height of the fins and number of pin fins. The base plate is supplied with a constant heat flux in the range of 20–500W. The base plate dimensions are kept constant. The base plate temperature is predicted using Artificial ...

  13. Statistical properties of the gyro-averaged standard map

    Science.gov (United States)

    da Fonseca, Julio D.; Sokolov, Igor M.; Del-Castillo-Negrete, Diego; Caldas, Ibere L.

    2015-11-01

    A statistical study of the gyro-averaged standard map (GSM) is presented. The GSM is an area-preserving map model proposed as a simplified description of finite Larmor radius (FLR) effects on E×B chaotic transport in magnetized plasmas with zonal flows perturbed by drift waves. The GSM's effective perturbation parameter, gamma, is proportional to the zero-order Bessel function of the particle's Larmor radius. In the limit of zero Larmor radius, the GSM reduces to the standard Chirikov-Taylor map. We consider plasmas in thermal equilibrium and assume a Larmor radius probability density function (pdf) resulting from a Maxwell-Boltzmann distribution. Since the particles in general have different Larmor radii, each orbit is computed using a different perturbation parameter, gamma. We present analytical and numerical computations of the pdf of gamma for a Maxwellian distribution. We also compute the pdf of global chaos, which gives the probability that a particle with a given Larmor radius exhibits global chaos, i.e. the probability that Kolmogorov-Arnold-Moser (KAM) transport barriers do not exist.
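
    A Python sketch of the idea follows, assuming the standard-map form I' = I + gamma*sin(theta), theta' = theta + I' with gamma = K*J0(rho), and a Maxwellian perpendicular velocity giving a Rayleigh-distributed Larmor radius; the values of K and the thermal radius are illustrative, and the 0.9716 threshold is the classic standard-map global-chaos criterion applied per particle.

        import numpy as np
        from scipy.special import j0

        rng = np.random.default_rng(0)
        K, rho_th, n_particles, n_steps = 3.0, 1.0, 2000, 500

        # A 2D Maxwell-Boltzmann velocity distribution makes the Larmor radius
        # Rayleigh-distributed; rho_th sets the thermal scale.
        rho = rng.rayleigh(scale=rho_th, size=n_particles)
        gamma = K * j0(rho)                 # per-particle effective perturbation

        theta = rng.uniform(0, 2 * np.pi, n_particles)
        I = rng.uniform(0, 2 * np.pi, n_particles)
        for _ in range(n_steps):            # iterate the gyro-averaged standard map
            I = I + gamma * np.sin(theta)
            theta = (theta + I) % (2 * np.pi)

        # Empirical pdf of gamma over the sampled particles.
        hist, edges = np.histogram(gamma, bins=50, density=True)
        print("fraction with |gamma| above the standard-map chaos threshold:",
              np.mean(np.abs(gamma) > 0.9716))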

  14. A computer-oriented system for assembling and displaying land management information

    Science.gov (United States)

    Elliot L. Amidon

    1964-01-01

    Maps contain information basic to land management planning. By transforming conventional map symbols into numbers which are punched into cards, the land manager can have a computer assemble and display information required for a specific job. He can let a computer select information from several maps, combine it with such nonmap data as treatment cost or benefit per...

  15. Physicists set new record for network data transfer

    CERN Multimedia

    2007-01-01

    "An international team of physicists, computer scientists, and network engineers joined forces to set new records for sustained data transfer between storage systems durint the SuperComputing 2006 (SC06) Bandwidth Challenge (BWC). (3 pages)

  16. Image quality transfer and applications in diffusion MRI

    DEFF Research Database (Denmark)

    Alexander, Daniel C.; Zikic, Darko; Ghosh, Aurobrata

    2017-01-01

    This paper introduces a new computational imaging technique called image quality transfer (IQT). IQT uses machine learning to transfer the rich information available from one-off experimental medical imaging devices to the abundant but lower-quality data from routine acquisitions. The procedure u...

  17. Quasipolynomial generalization of Lotka-Volterra mappings

    Science.gov (United States)

    Hernández-Bermejo, Benito; Brenig, Léon

    2002-07-01

    In recent years, it has been shown that Lotka-Volterra mappings constitute a valuable tool from both the theoretical and the applied points of view, with developments in fields as diverse as physics, population dynamics, chemistry and economics. The purpose of this work is to demonstrate that many of the most important ideas and algebraic methods that constitute the basis of the quasipolynomial formalism (originally conceived for the analysis of ordinary differential equations) can be extended into the mapping domain. The extension of the formalism into the discrete-time context is remarkable insofar as the quasipolynomial methodology had never been shown to be applicable beyond the differential case. It is demonstrated that Lotka-Volterra mappings play a central role in the quasipolynomial formalism for the discrete-time case. Moreover, the extension of the formalism into the discrete-time domain allows a significant generalization of Lotka-Volterra mappings as well as a whole transfer of algebraic methods into the discrete-time context. The result is a novel and more general conceptual framework for the understanding of Lotka-Volterra mappings, which opens a new range of possibilities not only for the theoretical analysis of Lotka-Volterra mappings and their generalizations, but also for the development of new applications.
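
    For concreteness, a short Python sketch of one commonly studied discrete-time Lotka-Volterra form, x_i(n+1) = x_i(n)*exp(lambda_i + sum_j A_ij x_j(n)), follows; the two-species parameter values are illustrative, not taken from the paper.

        import numpy as np

        def lv_map_step(x, lam, A):
            # One iteration of a discrete-time Lotka-Volterra mapping.
            return x * np.exp(lam + A @ x)

        # Illustrative two-species, predator-prey-like parameters.
        lam = np.array([0.8, -0.4])
        A = np.array([[-1.0, -0.5],
                      [ 0.5,  0.0]])

        x = np.array([0.5, 0.3])
        orbit = [x]
        for _ in range(200):
            x = lv_map_step(x, lam, A)
            orbit.append(x)
        print(np.array(orbit[-5:]))   # late-time behavior of the two populations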

  18. Quasipolynomial generalization of Lotka-Volterra mappings

    International Nuclear Information System (INIS)

    Hernandez-Bermejo, Benito; Brenig, Leon

    2002-01-01

    In recent years, it has been shown that Lotka-Volterra mappings constitute a valuable tool from both the theoretical and the applied points of view, with developments in fields as diverse as physics, population dynamics, chemistry and economics. The purpose of this work is to demonstrate that many of the most important ideas and algebraic methods that constitute the basis of the quasipolynomial formalism (originally conceived for the analysis of ordinary differential equations) can be extended into the mapping domain. The extension of the formalism into the discrete-time context is remarkable insofar as the quasipolynomial methodology had never been shown to be applicable beyond the differential case. It is demonstrated that Lotka-Volterra mappings play a central role in the quasipolynomial formalism for the discrete-time case. Moreover, the extension of the formalism into the discrete-time domain allows a significant generalization of Lotka-Volterra mappings as well as a whole transfer of algebraic methods into the discrete-time context. The result is a novel and more general conceptual framework for the understanding of Lotka-Volterra mappings, which opens a new range of possibilities not only for the theoretical analysis of Lotka-Volterra mappings and their generalizations, but also for the development of new applications. (author)

  19. Negotiating transfer pricing using the Nash bargaining solution

    Directory of Open Access Journals (Sweden)

    Clempner Julio B.

    2017-12-01

    Full Text Available This paper analyzes and proposes a solution to the transfer pricing problem from the point of view of the Nash bargaining game theory approach. We consider a firm consisting of several divisions with sequential transfers, in which central management provides a transfer price decision that enables maximization of operating profits. The transfer price between divisions is negotiated through the bargaining approach. Initially, we consider a disagreement point (status quo) between the divisions of the firm, which plays the role of a deterrent. We propose a framework and a method based on the Nash equilibrium approach for computing the disagreement point. Then, we introduce a bargaining solution, which is a single-valued function that selects an outcome from the feasible pay-offs for each bargaining problem arising from the cooperation of the divisions of the firm involved in the transfer pricing problem. The agreement reached by the divisions in the game is the most preferred alternative within the set of feasible outcomes, which produces a profit-maximizing allocation of the transfer price between divisions. For computing the bargaining solution, we propose an optimization method. An example illustrating the usefulness of the method is presented.
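
    The following Python sketch illustrates the general idea of a Nash bargaining solution for a transfer price, not the paper's method: two hypothetical division profit functions of the transfer price t and an assumed disagreement point are maximized through the Nash product over the feasible price range.

        from scipy.optimize import minimize_scalar

        # Hypothetical division profits as functions of the transfer price t:
        # the upstream division sells an intermediate good to the downstream one.
        def profit_upstream(t):
            q = max(10.0 - t, 0.0)        # internal demand falls with price
            return (t - 2.0) * q          # unit production cost of 2

        def profit_downstream(t):
            q = max(10.0 - t, 0.0)
            return (8.0 - t) * q          # resale margin shrinks with t

        d_up, d_down = 5.0, 5.0           # assumed disagreement point (status quo)

        def neg_nash_product(t):
            u1 = profit_upstream(t) - d_up
            u2 = profit_downstream(t) - d_down
            if u1 <= 0 or u2 <= 0:        # outside the feasible bargaining set
                return 0.0
            return -u1 * u2               # minimize the negative Nash product

        res = minimize_scalar(neg_nash_product, bounds=(2.0, 8.0), method="bounded")
        print(f"negotiated transfer price: {res.x:.3f}")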

  20. Hierarchy of rational order families of chaotic maps with an invariant ...

    Indian Academy of Sciences (India)

    We introduce an interesting hierarchy of rational order chaotic maps that possess an invariant measure. In contrast to the previously introduced hierarchy of chaotic maps [1–5], with merely entropy production, the rational order chaotic maps can simultaneously produce and consume entropy. We compute the ...

  1. Cattle transfers between herds under paratuberculosis surveillance in The Netherlands are not random

    NARCIS (Netherlands)

    Weber, M.F.; Roermund, van H.J.W.; Vernooij, J.C.M.; Kalis, C.H.J.; Stegeman, J.A.

    2006-01-01

    The rate and structure of cattle transfers between 206 Dutch cattle herds with a 'Mycobacterium avium subsp. paratuberculosis (Map)-free' status by November 2002, were analyzed over a 3-year period (November 1999-November 2002). Of the 206 'Map-free' herds, 184 were closed herds during the period

  2. A CAMAC display module for fast bit-mapped graphics

    International Nuclear Information System (INIS)

    Abdel-Aal, R.E.

    1992-01-01

    In many data acquisition and analysis facilities for nuclear physics research, utilities for the display of two-dimensional (2D) images and spectra on graphics terminals suffer from low speed, poor resolution, and limited accuracy. Development of CAMAC bit-mapped graphics modules for this purpose has been discouraged in the past by the large device count needed and the long times required to load the image data from the host computer into the CAMAC hardware, particularly since many such facilities have been designed to support fast DMA block transfers only for data acquisition into the host. This paper describes the design and implementation of a prototype CAMAC graphics display module with a resolution of 256x256 pixels at eight colours, for which all components can be easily accommodated in a single-width package. A hardware technique is employed which reduces the number of programmed CAMAC data transfer operations needed for writing 2D images into the display memory by approximately an order of magnitude, with attendant improvements in display speed and CPU time consumption. Hardware and software details are given together with sample results. Information on the performance of the module in a typical VAX/MBD data acquisition environment is presented, including data on the mutual effects of simultaneous data acquisition traffic. Suggestions are made for further improvements in performance. (orig.)

  3. Bulk Data Movement for Climate Dataset: Efficient Data Transfer Management with Dynamic Transfer Adjustment

    International Nuclear Information System (INIS)

    Sim, Alexander; Balman, Mehmet; Williams, Dean; Shoshani, Arie; Natarajan, Vijaya

    2010-01-01

    Many scientific applications and experiments, such as high energy and nuclear physics, astrophysics, climate observation and modeling, combustion, nano-scale material sciences, and computational biology, generate extreme volumes of data in large numbers of files. These data sources are distributed among national and international data repositories, and are shared by large numbers of geographically distributed scientists. A large portion of the data is frequently accessed, and large volumes are moved from one place to another for analysis and storage. One challenging issue in such efforts is the limited network capacity available for moving large datasets. The Bulk Data Mover (BDM), a data transfer management tool in the Earth System Grid (ESG) community, has managed massive dataset transfers efficiently using pre-configured transfer properties in environments where network bandwidth is limited. Dynamic transfer adjustment was studied to enhance the BDM to handle significant end-to-end performance changes in dynamic network environments, as well as to steer data transfers toward a desired transfer performance. We describe the results of BDM transfer management for climate datasets. We also describe the transfer estimation model and results from the dynamic transfer adjustment.
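
    To make the notion of dynamic transfer adjustment concrete, here is a minimal Python sketch of a feedback loop that tunes the number of concurrent transfer streams from measured throughput. It is purely illustrative and not the BDM implementation; the throughput model is a hypothetical stand-in for a real end-to-end measurement.

        import random

        def measure_throughput(streams):
            """Hypothetical model: throughput grows with concurrency,
            saturates, then degrades under contention."""
            base = min(streams * 12.0, 100.0) - max(streams - 10, 0) * 3.0
            return max(base + random.uniform(-5.0, 5.0), 1.0)

        streams, last, direction = 4, 0.0, +2
        for epoch in range(15):
            current = measure_throughput(streams)
            if current < last:
                direction = -direction      # throughput dropped: reverse course
            streams = min(max(streams + direction, 1), 32)
            last = current
            print(f"epoch {epoch:2d}: {streams:2d} streams, {current:6.1f} MB/s")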

  4. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software

  5. Simulation of quantum computers

    NARCIS (Netherlands)

    Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.

    2000-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
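
    A minimal Python sketch of the state-vector approach to simulating such hardware follows: the collection of spin-1/2 objects is an n-qubit amplitude vector, and gates are unitaries applied to selected tensor factors. The gate set and the two-qubit circuit are illustrative, not the authors' software.

        import numpy as np

        def apply_gate(state, gate, target, n):
            """Apply a single-qubit gate to `target` in an n-qubit state vector."""
            psi = state.reshape([2] * n)
            psi = np.moveaxis(psi, target, 0)
            psi = np.tensordot(gate, psi, axes=([1], [0]))
            return np.moveaxis(psi, 0, target).reshape(-1)

        def apply_cnot(state, control, target, n):
            psi = state.reshape([2] * n).copy()
            # Flip the target qubit on the block where the control qubit is |1>.
            idx1 = [slice(None)] * n
            idx1[control] = 1
            sub = psi[tuple(idx1)]
            psi[tuple(idx1)] = np.flip(sub, axis=target - (target > control))
            return psi.reshape(-1)

        n = 2
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        state = np.zeros(2 ** n)
        state[0] = 1.0                          # |00>
        state = apply_gate(state, H, 0, n)      # Hadamard on qubit 0
        state = apply_cnot(state, 0, 1, n)      # entangle into a Bell state
        print(np.round(state, 3))               # expect [0.707 0 0 0.707]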

  6. Simultaneous epicardial and noncontact endocardial mapping of the canine right atrium: simulation and experiment.

    Science.gov (United States)

    Sabouri, Sepideh; Matene, Elhacene; Vinet, Alain; Richer, Louis-Philippe; Cardinal, René; Armour, J Andrew; Pagé, Pierre; Kus, Teresa; Jacquemet, Vincent

    2014-01-01

    Epicardial high-density electrical mapping is a well-established experimental instrument for monitoring in vivo the activity of the atria in response to modulations of the autonomic nervous system in sinus rhythm. In regions that are not accessible by epicardial mapping, noncontact endocardial mapping performed through a balloon catheter may provide a more comprehensive description of atrial activity. We developed a computer model of the canine right atrium to compare epicardial and noncontact endocardial mapping. The model was derived from an experiment in which electroanatomical reconstruction, epicardial mapping (103 electrodes), noncontact endocardial mapping (2048 virtual electrodes computed from a 64-channel balloon catheter), and direct-contact endocardial catheter recordings were simultaneously performed in a dog. The recording system was simulated in the computer model. For simulations and experiments (after atrio-ventricular node suppression), activation maps were computed during sinus rhythm. Repolarization was assessed by measuring the area under the atrial T wave (ATa), a marker of repolarization gradients. Results showed epicardial-endocardial correlation coefficients of 0.80 and 0.63 (two dog experiments) and 0.96 (simulation) between activation times, and correlation coefficients of 0.57 and 0.46 (two dog experiments) and 0.92 (simulation) between ATa values. Despite the distance (balloon-atrial wall) and dimension reduction (64 electrodes), some information about atrial repolarization remained present in noncontact signals.

  7. Simultaneous epicardial and noncontact endocardial mapping of the canine right atrium: simulation and experiment.

    Directory of Open Access Journals (Sweden)

    Sepideh Sabouri

    Full Text Available Epicardial high-density electrical mapping is a well-established experimental instrument for monitoring in vivo the activity of the atria in response to modulations of the autonomic nervous system in sinus rhythm. In regions that are not accessible by epicardial mapping, noncontact endocardial mapping performed through a balloon catheter may provide a more comprehensive description of atrial activity. We developed a computer model of the canine right atrium to compare epicardial and noncontact endocardial mapping. The model was derived from an experiment in which electroanatomical reconstruction, epicardial mapping (103 electrodes), noncontact endocardial mapping (2048 virtual electrodes computed from a 64-channel balloon catheter), and direct-contact endocardial catheter recordings were simultaneously performed in a dog. The recording system was simulated in the computer model. For simulations and experiments (after atrio-ventricular node suppression), activation maps were computed during sinus rhythm. Repolarization was assessed by measuring the area under the atrial T wave (ATa), a marker of repolarization gradients. Results showed epicardial-endocardial correlation coefficients of 0.80 and 0.63 (two dog experiments) and 0.96 (simulation) between activation times, and correlation coefficients of 0.57 and 0.46 (two dog experiments) and 0.92 (simulation) between ATa values. Despite the distance (balloon-atrial wall) and dimension reduction (64 electrodes), some information about atrial repolarization remained present in noncontact signals.

  8. Simultaneous Epicardial and Noncontact Endocardial Mapping of the Canine Right Atrium: Simulation and Experiment

    Science.gov (United States)

    Sabouri, Sepideh; Matene, Elhacene; Vinet, Alain; Richer, Louis-Philippe; Cardinal, René; Armour, J. Andrew; Pagé, Pierre; Kus, Teresa; Jacquemet, Vincent

    2014-01-01

    Epicardial high-density electrical mapping is a well-established experimental instrument for monitoring in vivo the activity of the atria in response to modulations of the autonomic nervous system in sinus rhythm. In regions that are not accessible by epicardial mapping, noncontact endocardial mapping performed through a balloon catheter may provide a more comprehensive description of atrial activity. We developed a computer model of the canine right atrium to compare epicardial and noncontact endocardial mapping. The model was derived from an experiment in which electroanatomical reconstruction, epicardial mapping (103 electrodes), noncontact endocardial mapping (2048 virtual electrodes computed from a 64-channel balloon catheter), and direct-contact endocardial catheter recordings were simultaneously performed in a dog. The recording system was simulated in the computer model. For simulations and experiments (after atrio-ventricular node suppression), activation maps were computed during sinus rhythm. Repolarization was assessed by measuring the area under the atrial T wave (ATa), a marker of repolarization gradients. Results showed epicardial-endocardial correlation coefficients of 0.80 and 0.63 (two dog experiments) and 0.96 (simulation) between activation times, and correlation coefficients of 0.57 and 0.46 (two dog experiments) and 0.92 (simulation) between ATa values. Despite the distance (balloon-atrial wall) and dimension reduction (64 electrodes), some information about atrial repolarization remained present in noncontact signals. PMID:24598778

  9. Surgical planning of total hip arthroplasty: accuracy of computer-assisted EndoMap software in predicting component size

    International Nuclear Information System (INIS)

    Davila, Jesse A.; Kransdorf, Mark J.; Duffy, Gavan P.

    2006-01-01

    The purpose of our study was to assess the accuracy of computer-assisted templating in the surgical planning of patients undergoing total hip arthroplasty utilizing EndoMap software (Siemens AG, Medical Solutions, Erlangen, Germany). EndoMap software is an electronic program that uses DICOM images to analyze standard anteroposterior radiographs for determination of optimal prosthesis component size. We retrospectively reviewed the preoperative radiographs of 36 patients undergoing uncomplicated primary total hip arthroplasty, utilizing EndoMap software, Version VA20. DICOM anteroposterior radiographs were analyzed using standard manufacturer-supplied electronic templates to determine acetabular and femoral component sizes. No additional clinical information was reviewed. Acetabular and femoral component sizes were assessed by an orthopedic surgeon and two radiologists. The mean estimated component size was compared with the component size documented in operative reports. The mean estimated acetabular component size was 53 mm (range 48-60 mm), 1 mm larger than the mean implanted size of 52 mm (range 48-62 mm). Thirty-one of 36 acetabular component sizes (86%) were accurate within one size. The mean calculated femoral component size was 4 (range 2-7), one size smaller than the actual mean component size of 5 (range 2-9). Twenty-six of 36 femoral component sizes (72%) were accurate within one size, and all but four cases (94%) were accurate within two sizes. EndoMap software predicted femoral component size well, with 72% within one component size of that used, and 94% within two sizes. Acetabular component size was predicted slightly better, with 86% within one component size and 94% within two component sizes. (orig.)

  10. Sudden Cardiac Risk Stratification with Electrocardiographic Indices - A Review on Computational Processing, Technology Transfer, and Scientific Evidence

    Directory of Open Access Journals (Sweden)

    Francisco Javier Gimeno-Blanes

    2016-03-01

    Full Text Available Great effort has been devoted in recent years to the development of sudden cardiac risk predictors as a function of electric cardiac signals, mainly obtained from electrocardiogram (ECG) analysis. But these prediction techniques are still seldom used in clinical practice, partly due to their limited diagnostic accuracy and to the lack of consensus about the appropriate computational signal processing implementation. This paper takes a three-fold approach, based on ECG indices, to structure this review on sudden cardiac risk stratification: first, through the computational techniques that have been widely proposed for obtaining these indices in the technical literature; second, through the scientific evidence, which, although supported by observational clinical studies, is not always sufficiently representative; and third, through the limited technology transfer of academy-accepted algorithms, which requires further consideration for future systems. We focus on three families of ECG-derived indices which are tackled from the aforementioned viewpoints, namely, heart rate turbulence, heart rate variability, and T-wave alternans. In terms of computational algorithms, we still need clearer scientific evidence, standardization, and benchmarking, resting on advanced algorithms applied over large and representative datasets. New scenarios like electronic health records, big data, long-term monitoring, and cloud databases will eventually open new frameworks to foresee suitable new paradigms in the near future.

  11. A computer simulation of the turbocharged turbo compounded diesel engine system: A description of the thermodynamic and heat transfer models

    Science.gov (United States)

    Assanis, D. N.; Ekchian, J. E.; Frank, R. M.; Heywood, J. B.

    1985-01-01

    A computer simulation of the turbocharged turbocompounded direct-injection diesel engine system was developed in order to study the performance characteristics of the total system as major design parameters and materials are varied. Quasi-steady flow models of the compressor, turbines, manifolds, intercooler, and ducting are coupled with a multicylinder reciprocator diesel model, where each cylinder undergoes the same thermodynamic cycle. The master cylinder model describes the reciprocator intake, compression, combustion and exhaust processes in sufficient detail to define the mass and energy transfers in each subsystem of the total engine system. Appropriate thermal loading models relate the heat flow through critical system components to material properties and design details. From this information, the simulation predicts the performance gains, and assesses the system design trade-offs which would result from the introduction of selected heat transfer reduction materials in key system components, over a range of operating conditions.

  12. Multidisciplinary Knowledge Transfer in Training Multimedia Projects

    Science.gov (United States)

    Freyens, Benoit; Martin, Marguerite

    2007-01-01

    Purpose--Training multimedia projects often face identical knowledge-transfer obstacles that partly originate in the multidisciplinarity of the project team. The purpose of this paper is to describe these difficulties and the tools used to overcome them. In particular, the aim is to show how elements of cognitive psychology theory (concept maps,…

  13. Children's (Pediatric) CT (Computed Tomography)

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... a gantry, which rotates around the patient. The computer that processes the imaging information and monitor are ...

  14. Computer control of fuel handling activities at FFTF

    International Nuclear Information System (INIS)

    Romrell, D.M.

    1985-03-01

    The Fast Flux Test Facility near Richland, Washington, utilizes computer control for reactor refueling and other related core component handling and processing tasks. The computer controlled tasks described in this paper include core component transfers within the reactor vessel, core component transfers into and out of the reactor vessel, remote duct measurements of irradiated core components, remote duct cutting, and finally, transferring irradiated components out of the reactor containment building for off-site shipments or to long term storage. 3 refs., 16 figs

  15. HTCC - a heat transfer model for gas-steam mixtures

    International Nuclear Information System (INIS)

    Papadimitriou, P.

    1983-01-01

    The mathematical model HTCC (Heat Transfer Coefficient in Containment) has been developed for the RALOC code in order to determine, after a loss-of-coolant accident, the local heat transfer coefficients for transfer between the containment atmosphere and the walls of the reactor building. The model considers the current values of room and wall temperature, the concentrations of steam and non-condensible gases, geometry data, and fluid-dynamic and thermodynamic parameters, and from these determines the heat transfer due to convection, radiation and condensation. HTCC is implemented in the RALOC program. Comparative analyses of computed temperature profiles for the HEDL standard problems A and B on hydrogen distribution, and of temperature profiles computed for the heat-up phase of the CSE-A5 experiment, show good agreement with experimental data. (orig.)

  16. Development and Application of a Numerical Framework for Improving Building Foundation Heat Transfer Calculations

    Science.gov (United States)

    Kruis, Nathanael J. F.

    Heat transfer from building foundations varies significantly in all three spatial dimensions and has important dynamic effects at all timescales, from one hour to several years. With the additional consideration of moisture transport, ground freezing, evapotranspiration, and other physical phenomena, the estimation of foundation heat transfer becomes increasingly sophisticated and computationally intensive to the point where accuracy must be compromised for reasonable computation time. The tools currently available to calculate foundation heat transfer are often either too limited in their capabilities to draw meaningful conclusions or too sophisticated to use in common practices. This work presents Kiva, a new foundation heat transfer computational framework. Kiva provides a flexible environment for testing different numerical schemes, initialization methods, spatial and temporal discretizations, and geometric approximations. Comparisons within this framework provide insight into the balance of computation speed and accuracy relative to highly detailed reference solutions. The accuracy and computational performance of six finite difference numerical schemes are verified against established IEA BESTEST test cases for slab-on-grade heat conduction. Of the schemes tested, the Alternating Direction Implicit (ADI) scheme demonstrates the best balance between accuracy, performance, and numerical stability. Kiva features four approaches of initializing soil temperatures for an annual simulation. A new accelerated initialization approach is shown to significantly reduce the required years of presimulation. Methods of approximating three-dimensional heat transfer within a representative two-dimensional context further improve computational performance. A new approximation called the boundary layer adjustment method is shown to improve accuracy over other established methods with a negligible increase in computation time. This method accounts for the reduced heat transfer
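
    Since the abstract singles out the Alternating Direction Implicit (ADI) scheme, here is a minimal Python sketch of one Peaceman-Rachford ADI step for 2D conduction, u_t = alpha*(u_xx + u_yy), with fixed Dirichlet boundaries. The grid size, time step, and the hot-boundary setup are illustrative; this is not the Kiva implementation.

        import numpy as np
        from scipy.linalg import solve_banded

        def adi_step(u, r):
            """One Peaceman-Rachford ADI step on a square grid with fixed
            Dirichlet boundary values; r = alpha * dt / h**2."""
            n = u.shape[0]
            m = n - 2                              # interior points per line
            # Tridiagonal operator (I - r/2 * delta^2) in banded storage.
            ab = np.zeros((3, m))
            ab[0, 1:] = -r / 2                     # upper diagonal
            ab[1, :] = 1 + r                       # main diagonal
            ab[2, :-1] = -r / 2                    # lower diagonal
            v = u.copy()
            # Sweep 1: implicit in x (axis 0), explicit in y (axis 1).
            for j in range(1, n - 1):
                rhs = u[1:-1, j] + r / 2 * (u[1:-1, j + 1] - 2 * u[1:-1, j] + u[1:-1, j - 1])
                rhs[0] += r / 2 * u[0, j]          # boundary contributions
                rhs[-1] += r / 2 * u[-1, j]
                v[1:-1, j] = solve_banded((1, 1), ab, rhs)
            w = v.copy()
            # Sweep 2: implicit in y, explicit in x.
            for i in range(1, n - 1):
                rhs = v[i, 1:-1] + r / 2 * (v[i + 1, 1:-1] - 2 * v[i, 1:-1] + v[i - 1, 1:-1])
                rhs[0] += r / 2 * v[i, 0]
                rhs[-1] += r / 2 * v[i, -1]
                w[i, 1:-1] = solve_banded((1, 1), ab, rhs)
            return w

        n = 41
        u = np.zeros((n, n))
        u[0, :] = 1.0                              # hot boundary (e.g. slab surface)
        for _ in range(200):
            u = adi_step(u, r=0.5)
        print(f"interior mean temperature: {u[1:-1, 1:-1].mean():.4f}")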

  17. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  18. Learning transfer of geospatial technologies in secondary science and mathematics core areas

    Science.gov (United States)

    Nielsen, Curtis P.

    The purpose of this study was to investigate the transfer of geospatial technology knowledge and skill presented in a social sciences course context to other core areas of the curriculum. Specifically, this study explored the transfer of geospatial technology knowledge and skill to the STEM-related core areas of science and mathematics among ninth-grade students. Haskell's (2001) research on "levels of transfer" provided the theoretical framework for this study, which sought to demonstrate the experimental group's higher ability to transfer geospatial skills, higher mean assignment scores, higher post-test scores, higher geospatial skill application and deeper levels of transfer application than the control group. The participants of the study consisted of thirty ninth-graders enrolled in U.S. History, Earth Science and Integrated Mathematics 1 courses. The primary investigator of this study had no previous classroom experiences with this group of students. The participants who were enrolled in the school's existing two-section class configuration were assigned to experimental and control groups. The experimental group had ready access to Macintosh MacBook laptop computers, and the control group had ready access to Macintosh iPads. All participants in U.S. History received instruction with and were required to use ArcGIS Explorer Online during a Westward Expansion project. All participants were given the ArcGIS Explorer Online content assessment following the completion of the U.S. History project. Once the project in U.S. History was completed, Earth Science and Integrated Mathematics 1 began units of instruction beginning with a multiple-choice content pre-test created by the classroom teachers. Experimental participants received the same unit of instruction without the use or influence of ArcGIS Explorer Online. At the end of the Earth Science and Integrated Math 1 units, the same multiple-choice test was administered as the content post-test. Following the

  19. Designing Knowledge Map for Knowledge Management projects Using Network Analysis

    Directory of Open Access Journals (Sweden)

    Heidar Najafi

    2017-09-01

    Full Text Available In this research, knowledge management has been studied as an interdisciplinary area. We aim to answer the question: what are the scientific structure and knowledge map of knowledge management projects with respect to subject areas and keywords? For this purpose, nearly 40000 scientific documents listing knowledge management among their keywords were selected from the Scopus database and studied across various subject areas. Bar charts were drawn for each index of subject areas and keywords. Besides, using a co-occurrence matrix, adjacency graphs were drawn and then clustered using the average-link algorithm. Bar charts and graphs were drawn using R and Excel software. The results of this research show that, among the researches on knowledge management worldwide, the scientific fields most relevant to knowledge management are Computer Sciences with 32.5%, Business, Management and Accounting with 14.5%, Engineering with 13.7%, Decision Sciences with 12.6%, Mathematics with 7.07%, and Social Sciences with 6.63%, respectively. The keywords most frequently collocated with knowledge management worldwide are Human-Computer Interaction, Information Management, Systems Management, Information Technology, Manufacturing, Acquisition of Knowledge, Semantics, Knowledge Transfer, Ontology and Information Retrieval.
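
    A small Python sketch of the pipeline described (keyword co-occurrence matrix followed by average-link clustering) is shown below; the keyword-by-document incidence data and the co-occurrence-to-distance conversion are illustrative assumptions.

        import numpy as np
        from scipy.cluster.hierarchy import average, fcluster
        from scipy.spatial.distance import squareform

        # Hypothetical keyword-by-document incidence (1 = keyword appears).
        keywords = ["knowledge transfer", "ontology", "semantics",
                    "information retrieval", "patient safety", "nutrition"]
        D = np.array([[1, 1, 1, 0, 0, 0],
                      [1, 1, 0, 1, 0, 0],
                      [0, 1, 1, 1, 0, 0],
                      [0, 0, 0, 0, 1, 1],
                      [0, 0, 0, 1, 1, 1]])      # 5 documents x 6 keywords

        C = D.T @ D                              # keyword co-occurrence matrix
        np.fill_diagonal(C, 0)
        dist = 1.0 / (1.0 + C)                   # co-occurrence -> distance
        np.fill_diagonal(dist, 0.0)
        Z = average(squareform(dist, checks=False))   # average-link clustering
        labels = fcluster(Z, t=2, criterion="maxclust")
        for k, lab in zip(keywords, labels):
            print(f"cluster {lab}: {k}")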

  20. Quantitative estimation of viable myocardium in the infarcted zone by infarct-redistribution map from images of exercise thallium-201 emission computed tomography

    International Nuclear Information System (INIS)

    Sekiai, Yasuhiro

    1988-01-01

    To evaluate, quantitatively, the viable myocardium in the infarcted zone, we devised the infarct-redistribution map, produced from images of exercise thallium-201 emission computed tomography performed on 10 healthy subjects and 20 patients with myocardial infarction. The map displays a left ventricle in which the infarcted areas with and without redistribution, the redistribution area without infarction, and the normally perfused area are shown separately on the same screen. In this display, the non-redistribution infarct lesion appears surrounded by the redistribution area. Indices of infarct and redistribution extent (defect score, % defect, redistribution ratio (RR) and redistribution index (RI)) were derived from the map and used for quantitative analysis of the redistribution area, and as the basis for a comparative discussion of regional wall motion of the left ventricle. The quantitative indices of defect score, % defect, RR and RI were consistent with the visual assessment of planar images in detecting the extent of redistribution. Furthermore, defect score and % defect showed an inverse linear relationship with % shortening (r = -0.573; p < 0.05 and r = -0.536; p < 0.05, respectively), and RI showed a good linear relationship with % shortening (r = 0.669; p < 0.01). We conclude that the infarct-redistribution map accurately reflects myocardial viability and therefore may be useful for quantitative estimation of viable myocardium in the infarcted zone. (author)

  1. The Education Value of Cloud Computing

    Science.gov (United States)

    Katzan, Harry, Jr.

    2010-01-01

    Cloud computing is a technique for supplying computer facilities and providing access to software via the Internet. Cloud computing represents a contextual shift in how computers are provisioned and accessed. One of the defining characteristics of cloud software service is the transfer of control from the client domain to the service provider.…

  2. Michael Levitt and Computational Biology

    Science.gov (United States)

    Synopsis: Michael Levitt, PhD, professor of structural biology at the Stanford University School of Medicine. ... Levitt's early work pioneered computational structural biology, which helped to predict

  3. Computer-Based Training in Eating and Nutrition Facilitates Person-Centered Hospital Care: A Group Concept Mapping Study.

    Science.gov (United States)

    Westergren, Albert; Edfors, Ellinor; Norberg, Erika; Stubbendorff, Anna; Hedin, Gita; Wetterstrand, Martin; Rosas, Scott R; Hagell, Peter

    2018-04-01

    Studies have shown that computer-based training in eating and nutrition for hospital nursing staff increased the likelihood that patients at risk of undernutrition would receive nutritional interventions. This article seeks to provide understanding from the perspective of nursing staff of conceptually important areas for computer-based nutritional training, and their relative importance to nutritional care, following completion of the training. Group concept mapping, an integrated qualitative and quantitative methodology, was used to conceptualize important factors relating to the training experiences through four focus groups (n = 43), statement sorting (n = 38), and importance rating (n = 32), followed by multidimensional scaling and cluster analysis. Sorting of 38 statements yielded four clusters. These clusters (number of statements) were as follows: personal competence and development (10), practice close care development (10), patient safety (9), and awareness about the nutrition care process (9). First and second clusters represented "the learning organization," and third and fourth represented "quality improvement." These findings provide a conceptual basis for understanding the importance of training in eating and nutrition, which contributes to a learning organization and quality improvement, and can be linked to and facilitates person-centered nutritional care and patient safety.

  4. Neural network representation and learning of mappings and their derivatives

    Science.gov (United States)

    White, Halbert; Hornik, Kurt; Stinchcombe, Maxwell; Gallant, A. Ronald

    1991-01-01

    Discussed here are recent theorems proving that artificial neural networks are capable of approximating an arbitrary mapping and its derivatives as accurately as desired. This fact forms the basis for further results establishing the learnability of the desired approximations, using results from non-parametric statistics. These results have potential applications in robotics, chaotic dynamics, control, and sensitivity analysis. An example involving learning the transfer function and its derivatives for a chaotic map is discussed.
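
    A minimal numpy sketch of the chaotic-map example follows: a one-hidden-layer tanh network is fit to the logistic map f(x) = 4x(1-x), and because the network is differentiable in closed form, its derivative can be compared with the true f'(x) = 4 - 8x. The architecture and hyperparameters are illustrative, not the paper's construction.

        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.uniform(0, 1, (256, 1))
        Y = 4 * X * (1 - X)                       # target: chaotic logistic map

        # One-hidden-layer tanh network trained by full-batch gradient descent.
        W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
        W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)
        lr = 0.05
        for _ in range(20000):
            H = np.tanh(X @ W1 + b1)
            err = (H @ W2 + b2) - Y               # gradient of mean squared error
            gW2 = H.T @ err / len(X); gb2 = err.mean(0)
            dH = (err @ W2.T) * (1 - H ** 2)
            gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
            W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

        # The network derivative is available in closed form via the chain rule.
        x = np.array([[0.3]])
        h = np.tanh(x @ W1 + b1)
        df = ((1 - h ** 2) * W1) @ W2             # d(prediction)/dx
        print("f(0.3) :", (h @ W2 + b2).item(), "vs exact", 4 * 0.3 * 0.7)
        print("f'(0.3):", df.item(), "vs exact", 4 - 8 * 0.3)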

  5. Transfer metrics analytics project

    CERN Document Server

    Matonis, Zygimantas

    2016-01-01

    This report represents work done towards predicting transfer rates/latencies on Worldwide LHC Computing Grid (WLCG) sites using Machine Learning techniques. Topics covered are the technologies used for the project, preparation of the data into an ML-suitable format, attribute selection, and a comparison of different ML algorithms.

  6. Bilaterally Weighted Patches for Disparity Map Computation

    Directory of Open Access Journals (Sweden)

    Laura Fernández Julià

    2015-03-01

    Full Text Available Visual correspondence is the key to 3D reconstruction in binocular stereovision. Local methods perform block-matching to compute the disparity, or apparent motion, of pixels between images. The simplest approach computes the distance between patches, usually square windows, and assumes that all pixels in the patch have the same disparity. A prominent artifact of the method is the "foreground fattening effect" near depth discontinuities. In order to find a more appropriate support, Yoon and Kweon introduced the use of weights based on color similarity and spatial distance, analogous to those used in the bilateral filter. This paper presents the theory of this method and the implementation we have developed. Moreover, some variants are discussed and improvements are used in the final implementation. Several examples and tests are presented, and the parameters and performance of the method are analyzed.
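
    A compact Python sketch of such bilaterally weighted patch matching follows; the weight parameters, patch size, and the synthetic shifted image pair are illustrative assumptions, not the paper's implementation.

        import numpy as np

        def support_weights(patch, gamma_c=7.0, gamma_s=9.0):
            """Bilateral-style weights combining color similarity to the
            center pixel and spatial distance (adaptive support weights)."""
            h, w, _ = patch.shape
            cy, cx = h // 2, w // 2
            dc = np.linalg.norm(patch - patch[cy, cx], axis=2)   # color term
            yy, xx = np.mgrid[0:h, 0:w]
            ds = np.hypot(yy - cy, xx - cx)                      # spatial term
            return np.exp(-dc / gamma_c - ds / gamma_s)

        def weighted_cost(pl, pr):
            """Patch dissimilarity aggregated with weights from both images."""
            wl, wr = support_weights(pl), support_weights(pr)
            e = np.abs(pl - pr).sum(axis=2)                      # absolute differences
            return (wl * wr * e).sum() / (wl * wr).sum()

        # Toy usage: recover the disparity of one pixel in a shifted image pair.
        rng = np.random.default_rng(0)
        left = rng.uniform(0, 255, (30, 60, 3))
        right = np.roll(left, -5, axis=1)        # true disparity = 5
        y, x, rad = 15, 30, 4
        costs = [weighted_cost(left[y-rad:y+rad+1, x-rad:x+rad+1],
                               right[y-rad:y+rad+1, x-d-rad:x-d+rad+1])
                 for d in range(10)]
        print("estimated disparity:", int(np.argmin(costs)))    # expect 5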

  7. Physical Webbing: Collaborative Kinesthetic Three-Dimensional Mind Maps[R

    Science.gov (United States)

    Williams, Marian H.

    2012-01-01

    Mind Mapping has predominantly been used by individuals or collaboratively in groups as a paper-based or computer-generated learning strategy. In an effort to make Mind Mapping kinesthetic, collaborative, and three-dimensional, an innovative pedagogical strategy, termed Physical Webbing, was devised. In the Physical Web activity, groups…

  8. How pinning and contact angle hysteresis govern quasi-static liquid drop transfer.

    Science.gov (United States)

    Chen, H; Tang, T; Zhao, H; Law, K-Y; Amirfazli, A

    2016-02-21

    This paper presents both experiments and numerical simulations of liquid transfer between two solid surfaces with contact angle hysteresis (CAH). Systematic studies of the role of the advancing contact angle (θa), receding contact angle (θr) and CAH in determining the transfer ratio (volume of the liquid transferred onto the acceptor surface over the total liquid volume) and the maximum adhesion force (Fmax) were performed. The transfer ratio was found to be governed by contact line pinning at the end of the transfer process, caused by the CAH of the surfaces. A map based on θr of the two surfaces was generated to identify three regimes for liquid transfer: (I) contact line pinning occurs only on the donor surface, (II) contact line pinning occurs on both surfaces, and (III) contact line pinning occurs only on the acceptor surface. With this map, an empirical equation is provided which can estimate the transfer ratio knowing only θr of the two surfaces. The value of Fmax is found to be strongly influenced by contact line pinning in the early stretching stage. For symmetric liquid bridges between two identical surfaces, Fmax may be determined only by θa, only by θr, or by both θa and θr, depending on the magnitude of the contact angles. For asymmetric bridges, Fmax is found to be affected by the period during which contact lines are pinned on both surfaces.

  9. Polytene chromosomal maps of 11 Drosophila species: the order of genomic scaffolds inferred from genetic and physical maps.

    Science.gov (United States)

    Schaeffer, Stephen W; Bhutkar, Arjun; McAllister, Bryant F; Matsuda, Muneo; Matzkin, Luciano M; O'Grady, Patrick M; Rohde, Claudia; Valente, Vera L S; Aguadé, Montserrat; Anderson, Wyatt W; Edwards, Kevin; Garcia, Ana C L; Goodman, Josh; Hartigan, James; Kataoka, Eiko; Lapoint, Richard T; Lozovsky, Elena R; Machado, Carlos A; Noor, Mohamed A F; Papaceit, Montserrat; Reed, Laura K; Richards, Stephen; Rieger, Tania T; Russo, Susan M; Sato, Hajime; Segarra, Carmen; Smith, Douglas R; Smith, Temple F; Strelets, Victor; Tobari, Yoshiko N; Tomimura, Yoshihiko; Wasserman, Marvin; Watts, Thomas; Wilson, Robert; Yoshida, Kiyohito; Markow, Therese A; Gelbart, William M; Kaufman, Thomas C

    2008-07-01

    The sequencing of the 12 genomes of members of the genus Drosophila was taken as an opportunity to reevaluate the genetic and physical maps for 11 of the species, in part to aid in the mapping of assembled scaffolds. Here, we present an overview of the importance of cytogenetic maps to Drosophila biology and to the concepts of chromosomal evolution. Physical and genetic markers were used to anchor the genome assembly scaffolds to the polytene chromosomal maps for each species. In addition, a computational approach was used to anchor smaller scaffolds on the basis of the analysis of syntenic blocks. We present the chromosomal map data from each of the 11 sequenced non-Drosophila melanogaster species as a series of sections. Each section reviews the history of the polytene chromosome maps for each species, presents the new polytene chromosome maps, and anchors the genomic scaffolds to the cytological maps using genetic and physical markers. The mapping data agree with Muller's idea that the majority of Drosophila genes are syntenic. Despite the conservation of genes within homologous chromosome arms across species, the karyotypes of these species have changed through the fusion of chromosomal arms followed by subsequent rearrangement events.

  10. From hydro-geomorphological mapping to sediment transfer evaluation in the Upper Guil Catchment (Queyras, French Alps)

    Science.gov (United States)

    Lissak, Candide; Fort, Monique; Arnaud-Fassetta, Gilles; Mathieu, Alexandre; Malet, Jean-Philippe; Carlier, Benoit; Betard, François; Cossart, Etienne; Madelin, Malika; Viel, Vincent; Charney, Bérengère; Bletterie, Xavier

    2014-05-01

    The Guil River catchment (Queyras, Southern French Alps) is prone to hydro-geomorphic hazards related to catastrophic floods, with an amplification of their impacts due to strong hillslope-channel connectivity, as in 1957 (R.I. > 100 yr) and more recently in 2000 (R.I. 30 yr). In both cases, the rainfall intensity, aggravated by pre-existing saturated soils, explained the immediate response of the fluvial system and the subsequent destabilisation of slopes. This resulted in serious damage to infrastructure and buildings in the valley bottom, mostly along specific reaches and confluences with debris-flow-prone tributaries. After each event, new protective structures are built. One purpose of this study, undertaken in the frame of the SAMCO (ANR) project, was to understand the hydro-geomorphological functioning of this upper Alpine catchment in a context of hazard mitigation and sustainable management of sediment yield, transfer and deposition. To determine the main sediment storages that could be mobilised during the next major hydro-meteorological events, the first step of our study consists of identifying and characterising the areas that play a role in sediment transfer. From environmental characteristics (channel geometry, vegetation cover…) and anthropogenic factors (hydraulic infrastructures, urban development…), a semi-automatic method provides a typology of contributing areas with sediment storages sensitive to erosion, or areas prone to sediment deposition during the next flooding event. The second step of the study focuses on the sediment storages, their characterisation and their connectivity to the trunk channel. Taking into account the entire catchment, including the torrential system, this phase analyses sediment transfers, from the identification and classification of sediment storages to the evaluation of the degree of connectivity with the main or secondary channels. The

  11. Dual-energy computed tomography to assess tumor response to hepatic radiofrequency ablation: potential diagnostic value of virtual noncontrast images and iodine maps.

    Science.gov (United States)

    Lee, Su Hyun; Lee, Jeong Min; Kim, Kyung Won; Klotz, Ernst; Kim, Se Hyung; Lee, Jae Young; Han, Joon Koo; Choi, Byung Ihn

    2011-02-01

    To determine the value of dual-energy (DE) scanning with virtual noncontrast (VNC) images and iodine maps in the evaluation of therapeutic response to radiofrequency ablation (RFA) for hepatic tumors, a total of 75 patients with hepatic tumors who underwent DE computed tomography (CT) after RFA were enrolled in this study. Our DE CT protocol included precontrast, arterial, and portal phase scans. VNC images and iodine maps were created from 80 and 140 kVp images during the arterial and portal phases. VNC images were then compared with true noncontrast (TNC) images, and iodine maps were compared with linearly blended images, both qualitatively and quantitatively. For the former comparison, the image quality and the acceptability of the VNC images as a replacement for TNC images were both rated. The CT numbers of the hepatic parenchyma, the ablation zone, and the image noise were measured. For the latter comparison, the lesion conspicuity of the ablation zone and the additional benefit of integrating the iodine map into the routine protocol were assessed. Contrast-to-noise ratios (CNR) of the ablation zone-to-liver and aorta-to-liver, as well as the CT number differences between the center and the periphery of the ablation zone, were calculated. The image quality of the VNC images was rated as good (mean grading score, 1.88) and the level of acceptance was 90% (68/75). The mean CT numbers of the hepatic parenchyma and ablation zone did not differ significantly between the TNC and the VNC images (P > 0.05). The lesion conspicuity of the ablation zone was rated as excellent or good in 97% of the iodine maps (73/75), and the additional benefit of the iodine maps was positively rated (mean 1.5). The CNR of the aorta-to-liver parenchyma was significantly higher on the iodine map (P = 0.002), and the CT number differences between the center and the periphery of the ablation zone were significantly lower on the iodine map. VNC images can be an alternative to TNC

  12. Uncertainty of Monetary Valued Ecosystem Services - Value Transfer Functions for Global Mapping.

    Directory of Open Access Journals (Sweden)

    Stefan Schmidt

    Full Text Available Growing demand for resources increases pressure on ecosystem services (ES) and biodiversity. Monetary valuation of ES is frequently seen as a decision-support tool that provides explicit values for otherwise unconsidered, non-market goods and services. Here we present global value transfer functions using a meta-analytic framework for the synthesis of 194 case studies capturing 839 monetary values of ES. For 12 ES, the variance of monetary values could be explained with a subset of 93 study- and site-specific variables by utilizing boosted regression trees. This provides the first global quantification of the uncertainties and transferability of monetary valuations. The models explain from 18% (water provision) to 44% (food provision) of variance and provide statistically reliable extrapolations for 70% (water provision) to 91% (food provision) of the terrestrial earth surface. Although the application of different valuation methods is a source of uncertainty, we found evidence that assuming homogeneity of ecosystems is a major error in value transfer function models. Food provision is positively correlated with better life domains and variables indicating positive conditions for human well-being. Water provision and recreation services show that weak ownership affects the valuation of other common goods negatively (e.g. non-privately owned forests). Furthermore, we found support for the shifting baseline hypothesis in valuing climate regulation. Ecological conditions and societal vulnerability determine the valuation of extreme event prevention. Valuation of habitat services is negatively correlated with indicators characterizing less favorable areas. Our analysis represents a stepping stone toward a standardized integration of, and reporting on, uncertainties for reliable and valid benefit transfer as an important component of decision support.
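
    As a schematic illustration of the modeling step, the Python sketch below fits a boosted-tree value transfer function on synthetic data; sklearn's gradient boosting stands in for the boosted regression trees of the paper, and the covariates and "monetary values" are fabricated for illustration only.

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n = 400
        # Hypothetical study- and site-level covariates for value transfer.
        X = np.column_stack([
            rng.lognormal(9.5, 0.8, n),      # GDP per capita
            rng.lognormal(4.0, 1.2, n),      # population density
            rng.lognormal(5.0, 1.5, n),      # site area (ha)
            rng.uniform(0, 1, n),            # forest share
        ])
        # Synthetic monetary value with noise, standing in for observed values.
        y = (0.4 * np.log(X[:, 0]) + 0.2 * np.log(X[:, 1])
             - 0.1 * np.log(X[:, 2]) + 1.5 * X[:, 3] + rng.normal(0, 0.5, n))

        model = GradientBoostingRegressor(n_estimators=300, max_depth=3,
                                          learning_rate=0.05)
        r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
        print(f"cross-validated R^2: {r2.mean():.2f} +/- {r2.std():.2f}")
        model.fit(X, y)
        print("relative influence of covariates:",
              np.round(model.feature_importances_, 2))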

  13. Effect of Inlet Velocity on Heat Transfer Process in a Novel Photo-Fermentation Biohydrogen Production Bioreactor using Computational Fluid Dynamics Simulation

    Directory of Open Access Journals (Sweden)

    Zhiping Zhang

    2014-11-01

    Full Text Available Temperature is one of the most important parameters in biohydrogen production by way of photo-fermentation. Enzymatic hydrolysate of corncob powder was utilized as a substrate. Computational fluid dynamics (CFD) modeling was conducted to simulate the temperature distribution in an up-flow baffle photo-bioreactor (UBPB). The commercial software GAMBIT was utilized to mesh the photobioreactor geometry, while the software FLUENT was adopted to simulate the heat transfer in the photo-fermentation process. The inlet velocity had a marked impact on heat transfer; the optimum velocity was 0.0036 m·s-1 because it gave the smallest temperature fluctuation and the most uniform temperature distribution. When the velocity decreased from 0.0036 m·s-1 to 0.0009 m·s-1, more heat was accumulated. Comparison of simulated and experimental values showed that the established model was consistent with the actual situation. The hydrogen production simulation verified that the novel UBPB is suitable for biohydrogen production by photosynthetic bacteria because of its uniform temperature and lighting distribution, with the serpentine flow pattern also providing mixing without additional energy input, thus enhancing mass transfer and biohydrogen yield.

  14. Reducing Communication Overhead by Scheduling TCP Transfers on Mobile Devices using Wireless Network Performance Maps

    DEFF Research Database (Denmark)

    Højgaard-Hansen, Kim; Madsen, Tatiana Kozlova; Schwefel, Hans-Peter

    2012-01-01

    The performance of wireless communication networks has been shown to have a strong location dependence. Measuring the performance while having accurate location information available makes it possible to generate performance maps. In this paper we propose a framework for the generation and use...... of such performance maps. We demonstrate how the framework can be used to reduce retransmissions and to better utilise network resources when performing TCP-based file downloads in vehicular M2M communication scenarios. The approach works on top of a standard TCP stack, hence it has to map identified transmission

  15. Generalized double-humped logistic map-based medical image encryption

    Directory of Open Access Journals (Sweden)

    Samar M. Ismail

    2018-03-01

    Full Text Available This paper presents the design of the generalized Double Humped (DH) logistic map, used for pseudo-random number key generation (PRNG). The generalization parameter added to the map provides more control over the map's chaotic range. A new special map with a zooming effect on the bifurcation diagram is obtained by manipulating the generalization parameter value. The dynamic behavior of the generalized map is analyzed, including the study of fixed points and stability ranges, the Lyapunov exponent, and the complete bifurcation diagram. The option of designing any specific map is made possible by changing the generalization parameter, increasing the randomness and controllability of the map. An image encryption algorithm is introduced based on pseudo-random sequence generation using the proposed generalized DH map, offering secure communication transfer of medical MRI and X-ray images. Security analyses are carried out to consolidate system efficiency, including key sensitivity and key-space analyses, histogram analysis, correlation coefficients, MAE, NPCR and UACI calculations. System robustness against noise attacks has been proved, along with the NIST test ensuring the system efficiency. A comparison between the proposed system and previous works is presented.
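
    A minimal Python sketch of the general scheme (chaotic keystream XORed with the image bytes) follows. Since the abstract does not give the closed form of the generalized DH map, the classic logistic map is swapped in as the keystream source; the key, parameters and extraction rule are illustrative.

        import numpy as np

        def chaotic_keystream(n, x0=0.612, r=3.999):
            """Byte keystream from a chaotic map; the logistic map stands in
            for the paper's generalized double-humped map."""
            x = x0
            out = np.empty(n, dtype=np.uint8)
            for _ in range(100):                 # discard the transient
                x = r * x * (1 - x)
            for i in range(n):
                x = r * x * (1 - x)
                out[i] = int(x * 1e14) % 256
            return out

        def encrypt(image, key_x0):
            """XOR the flattened image with the keystream; decryption is identical."""
            ks = chaotic_keystream(image.size, x0=key_x0)
            return (image.reshape(-1) ^ ks).reshape(image.shape)

        img = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
        enc = encrypt(img, key_x0=0.612)
        dec = encrypt(enc, key_x0=0.612)
        assert np.array_equal(dec, img)
        print("plaintext/ciphertext correlation:",
              round(float(np.corrcoef(img.reshape(-1), enc.reshape(-1))[0, 1]), 4))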

  16. Accelerating artificial intelligence with reconfigurable computing

    Science.gov (United States)

    Cieszewski, Radoslaw

    Reconfigurable computing is emerging as an important area of research in computer architectures and software systems. Many algorithms can be greatly accelerated by placing the computationally intense portions of an algorithm into reconfigurable hardware. Reconfigurable computing combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible, and can be changed over the lifetime of the system. Similar to an ASIC, reconfigurable systems provide a method to map circuits into hardware. Reconfigurable systems therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Artificial intelligence is one such field, with many different algorithms that can be accelerated. This paper presents example hardware implementations of Artificial Neural Networks, Genetic Algorithms and Expert Systems.

  17. From symplectic integrator to Poincare map: Spline expansion of a map generator in Cartesian coordinates

    International Nuclear Information System (INIS)

    Warnock, R.L.; Ellison, J.A.; Univ. of New Mexico, Albuquerque, NM

    1997-08-01

    Data from orbits of a symplectic integrator can be interpolated so as to construct an approximation to the generating function of a Poincare map. The time required to compute an orbit of the symplectic map induced by the generator can be much less than the time to follow the same orbit by symplectic integration. The construction has been carried out previously for full-turn maps of large particle accelerators, and a big saving in time (for instance a factor of 60) has been demonstrated. A shortcoming of the work to date arose from the use of canonical polar coordinates, which precluded map construction in small regions of phase space near coordinate singularities. This paper shows that Cartesian coordinates can also be used, thus avoiding singularities. The generator is represented in a basis of tensor product B-splines. Under weak conditions the spline expansion converges uniformly as the mesh is refined, approaching the exact generator of the Poincare map as defined by the symplectic integrator, in some parallelepiped of phase space centered at the origin.
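
    The spline ingredient can be illustrated in isolation: fit a tensor-product B-spline to grid samples of a smooth two-variable function (standing in for map-generator data gathered from integrator orbits) and watch the maximum error fall as the mesh is refined. A sketch with SciPy, on a synthetic function rather than real tracking data:

        import numpy as np
        from scipy.interpolate import RectBivariateSpline

        def F(q, Q):                       # smooth synthetic "generator"
            return q * Q + 0.1 * np.sin(q) * np.cos(Q)

        for n in (11, 21, 41):             # mesh refinement
            knots = np.linspace(-1, 1, n)
            grid = F(*np.meshgrid(knots, knots, indexing="ij"))
            spline = RectBivariateSpline(knots, knots, grid)   # cubic tensor B-spline
            fine = np.linspace(-1, 1, 201)
            exact = F(*np.meshgrid(fine, fine, indexing="ij"))
            err = np.abs(spline(fine, fine) - exact).max()
            print(f"{n:3d} x {n:3d} mesh: max interpolation error {err:.2e}")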

  18. API, Cloud computing, WebGIS and cartography

    Directory of Open Access Journals (Sweden)

    Andrea Favretto

    2013-05-01

    Full Text Available This paper explores some of the digital mapping processes available on the Internet in order to analyse their cartographic congruence. It focuses on WebGIS-based cartography in relation to what is produced using mash-up site maps. These websites often use Google-based maps in order to produce their own cartography. Thus, we can identify two main typologies of Internet mapping sites, characterized by the ownership or non-ownership of their cartographic bases. This paper critically assesses the cartography employed in the two different instances. A concise introduction to the Internet-propagated phenomenon of Cloud Computing is also included in order to provide the reader with an accurate frame of reference. Cloud Computing has encouraged significant Internet participation via Application Programming Interface (API) software, leading to mash-up cartographic websites.

  19. Temperature control system with computer mapping for engine cooling circuits; Kennfeldgesteuertes Temperaturregelsystem fuer Motorkuehlkreislaeufe

    Energy Technology Data Exchange (ETDEWEB)

    Saur, R.; Leu, P.; Lemberger, H.; Heumer, G.

    1996-07-01

    Thermomanagement of vehicles powered by internal combustion engines is one of the prerequisites for fulfilling the German automobile industry's commitment to reduce fuel consumption by 25% before the year 2005, relative to 1990. Thermomanagement improves comfort, reduces fuel consumption and lowers pollutant emissions. BMW and Behr Thermot-Tronik have jointly developed the first component of such a thermomanagement system: an engine cooling system with computer mapping. BMW is the first manufacturer worldwide to install this system as standard equipment, which it has done since January 1996 in its refined eight-cylinder engine series (M62). (orig.)

  20. Mapping soil deformation around plant roots using in vivo 4D X-ray Computed Tomography and Digital Volume Correlation.

    Science.gov (United States)

    Keyes, S D; Gillard, F; Soper, N; Mavrogordato, M N; Sinclair, I; Roose, T

    2016-06-14

    The mechanical impedance of soils inhibits the growth of plant roots, often being the most significant physical limitation to root system development. Non-invasive imaging techniques have recently been used to investigate the development of root system architecture over time, but the relationship with soil deformation is usually neglected. Correlative mapping approaches parameterised using 2D and 3D image data have recently gained prominence for quantifying physical deformation in composite materials including fibre-reinforced polymers and trabecular bone. Digital Image Correlation (DIC) and Digital Volume Correlation (DVC) are computational techniques which use the inherent material texture of surfaces and volumes, captured using imaging techniques, to map full-field deformation components in samples during physical loading. Here we develop an experimental assay and methodology for four-dimensional, in vivo X-ray Computed Tomography (XCT) and apply DVC to the data to quantify deformation. The method is validated for a field-derived soil under conditions of uniaxial compression, and a calibration study is used to quantify thresholds of displacement and strain measurement. The validated and calibrated approach is then demonstrated for an in vivo test case in which an extending maize root in field-derived soil was imaged hourly using XCT over a growth period of 19 h. This allowed full-field soil deformation data and 3D root tip dynamics to be quantified in parallel for the first time. This fusion of methods paves the way for comparative studies of contrasting soils and plant genotypes, improving our understanding of the fundamental mechanical processes which influence root system development. Copyright © 2016 Elsevier Ltd. All rights reserved.
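
    The correlation step at the heart of DVC can be shown in miniature: recover the displacement of a textured subvolume between two scans by 3D phase correlation. A real DVC pipeline does this per subvolume, with sub-voxel refinement and non-rigid fields; the volume here is synthetic noise "texture":

        import numpy as np

        rng = np.random.default_rng(0)
        vol = rng.random((32, 32, 32))                     # reference "soil texture"
        true_shift = (3, -2, 5)
        moved = np.roll(vol, true_shift, axis=(0, 1, 2))   # deformed state

        # 3D phase correlation: normalized cross-power spectrum peaks at the shift
        X = np.fft.fftn(vol) * np.conj(np.fft.fftn(moved))
        corr = np.fft.ifftn(X / np.abs(X)).real
        peak = np.unravel_index(corr.argmax(), corr.shape)
        est = tuple(-(p if p < s // 2 else p - s) for p, s in zip(peak, corr.shape))
        print(est)   # -> (3, -2, 5)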

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  2. A scheme for distributed quantum search through simultaneous state transfer mechanism

    International Nuclear Information System (INIS)

    Gupta, M.; Pathak, A.

    2007-01-01

    Using a quantum network model, we present a scheme for distributed implementation of Grover's algorithm. The proposed scheme can implement a quantum search over databases stored in different computers. Entanglement is used to carry out different non-local operations over the spatially distributed quantum computers. A method to transfer the combined state of many qubits over the entanglement, and subsequently refresh the entangled pair, is presented. This method of simultaneous state transfer from one computer to the other is shown to result in constant communication complexity. (Abstract Copyright [2007], Wiley Periodicals, Inc.)
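
    For reference, the algorithm being distributed: a classical state-vector simulation of the standard (single-machine) Grover iteration, an oracle phase-flip followed by inversion about the mean:

        import numpy as np

        n_qubits, marked = 4, 11                   # 16-item search space, target index
        N = 2 ** n_qubits
        state = np.full(N, 1 / np.sqrt(N))         # uniform superposition
        n_iter = int(np.pi / 4 * np.sqrt(N))       # near-optimal iteration count
        for _ in range(n_iter):
            state[marked] *= -1                    # oracle: phase-flip the target
            state = 2 * state.mean() - state       # diffusion: inversion about the mean
        print(f"P(marked) after {n_iter} iterations: {state[marked] ** 2:.3f}")

    The distributed scheme performs these same global operations with the register split across machines, which is where the entanglement-mediated state transfer comes in.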

  3. Experiments to Distribute Map Generalization Processes

    Science.gov (United States)

    Berli, Justin; Touya, Guillaume; Lokhat, Imran; Regnauld, Nicolas

    2018-05-01

    Automatic map generalization requires the use of computationally intensive processes often unable to deal with large datasets. Distributing the generalization process is the only way to make them scalable and usable in practice. But map generalization is a highly contextual process: the surroundings of a map feature need to be known to generalize the feature, which is a problem as distribution partitions the dataset and parallelizes the processing of each part. This paper proposes experiments to evaluate past proposals to distribute map generalization, and to identify the main remaining issues. The past proposals to distribute map generalization are first discussed, and then the experiment hypotheses and apparatus are described. The experiments confirmed that regular partitioning was the quickest strategy, but also the least effective in taking context into account. The geographical partitioning, though less effective for now, is quite promising regarding the quality of the results as it better integrates the geographical context.
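
    A minimal sketch of the regular-partitioning strategy with a context buffer, the usual workaround for the contextuality problem (grid size, buffer width and point data are made up):

        import numpy as np

        rng = np.random.default_rng(1)
        pts = rng.random((1000, 2)) * 100          # toy feature coordinates
        cell, buf = 25.0, 5.0                      # cell size and context buffer

        def in_box(p, x0, y0, x1, y1):
            return (p[:, 0] >= x0) & (p[:, 0] < x1) & (p[:, 1] >= y0) & (p[:, 1] < y1)

        jobs = {}
        for i in range(4):
            for j in range(4):
                x0, y0 = i * cell, j * cell
                core = in_box(pts, x0, y0, x0 + cell, y0 + cell)
                ctx = in_box(pts, x0 - buf, y0 - buf, x0 + cell + buf, y0 + cell + buf)
                # each worker generalizes its core features, reading (but not
                # modifying) the buffered context features around its cell
                jobs[i, j] = (pts[core], pts[ctx & ~core])

        core, context = jobs[0, 0]
        print(len(core), "features to generalize,", len(context), "context-only")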

  4. Regularity of optimal transport maps on multiple products of spheres

    OpenAIRE

    Figalli, Alessio; Kim, Young-Heon; McCann, Robert J.

    2010-01-01

    This article addresses regularity of optimal transport maps for cost="squared distance" on Riemannian manifolds that are products of arbitrarily many round spheres with arbitrary sizes and dimensions. Such manifolds are known to be non-negatively cross-curved [KM2]. Under boundedness and non-vanishing assumptions on the transferred source and target densities we show that optimal maps stay away from the cut-locus (where the cost exhibits singularity), and obtain injectivity and continuity of o...

  5. Standard Test Method for Calculation of Stagnation Enthalpy from Heat Transfer Theory and Experimental Measurements of Stagnation-Point Heat Transfer and Pressure

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 This test method covers the calculation from heat transfer theory of the stagnation enthalpy from experimental measurements of the stagnation-point heat transfer and stagnation pressure. 1.2 Advantages: 1.2.1 A value of stagnation enthalpy can be obtained at the location in the stream where the model is tested. This value gives a consistent set of data, along with heat transfer and stagnation pressure, for ablation computations. 1.2.2 This computation of stagnation enthalpy does not require the measurement of any arc heater parameters. 1.3 Limitations and Considerations: There are many factors that may contribute to an error using this type of approach to calculate stagnation enthalpy, including: 1.3.1 Turbulence: The turbulence generated by adding energy to the stream may cause deviation from the laminar equilibrium heat transfer theory. 1.3.2 Equilibrium, Nonequilibrium, or Frozen State of Gas: The reaction rates and expansions may be such that the gas is far from thermodynamic equilibrium. 1.3.3 Noncat...
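
    The inversion at the core of the method has this shape: a laminar stagnation-point correlation q = K * sqrt(p_t / R_eff) * (h0 - h_w), solved for h0. The sketch below shows the arithmetic only; the constant K is gas-dependent and the value used here is a placeholder, not the one specified by the standard:

        import math

        def stagnation_enthalpy(q, p_t, R_eff, h_w, K=3.9e-4):
            """q: heat flux (W/m^2), p_t: stagnation pressure (Pa),
            R_eff: effective nose radius (m), h_w: wall enthalpy (J/kg).
            K is a placeholder gas constant, not the standard's value."""
            return h_w + q * math.sqrt(R_eff / p_t) / K

        h0 = stagnation_enthalpy(q=2.0e6, p_t=1.0e5, R_eff=0.025, h_w=3.0e5)
        print(f"stagnation enthalpy h0 = {h0 / 1e6:.1f} MJ/kg")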

  6. Nonlinear Algorithms for Channel Equalization and Map Symbol Detection.

    Science.gov (United States)

    Giridhar, K.

    The transfer of information through a communication medium invariably results in various kinds of distortion to the transmitted signal. In this dissertation, a feed-forward neural network-based equalizer and a family of maximum a posteriori (MAP) symbol detectors are proposed for signal recovery in the presence of intersymbol interference (ISI) and additive white Gaussian noise. The proposed neural network-based equalizer employs a novel bit-mapping strategy to handle multilevel data signals in an equivalent bipolar representation. It uses a training procedure to learn the channel characteristics, and at the end of training, the multilevel symbols are recovered from the corresponding inverse bit-mapping. When the channel characteristics are unknown and no training sequences are available, blind estimation of the channel (or its inverse) and simultaneous data recovery are required. Convergence properties of several existing Bussgang-type blind equalization algorithms are studied through computer simulations, and a unique gain-independent approach is used to obtain a fair comparison of their rates of convergence. Although simple to implement, the slow convergence of these Bussgang-type blind equalizers makes them unsuitable for many high data-rate applications. Rapidly converging blind algorithms based on the principle of MAP symbol-by-symbol detection are proposed, which adaptively estimate the channel impulse response (CIR) and simultaneously decode the received data sequence. Assuming a linear and Gaussian measurement model, the near-optimal blind MAP symbol detector (MAPSD) consists of a parallel bank of conditional Kalman channel estimators, where the conditioning is done on each possible data subsequence that can convolve with the CIR. This algorithm is also extended to the recovery of convolutionally encoded waveforms in the presence of ISI. Since the complexity of the MAPSD algorithm increases exponentially with the length of the assumed CIR, a suboptimal
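
    To make the detection objective concrete, here is brute-force MAP sequence detection for BPSK over a known two-tap ISI channel with Gaussian noise (with equal priors this reduces to maximum likelihood); the blind MAPSD algorithm pursues the same objective without knowing the channel taps in advance:

        import numpy as np
        from itertools import product

        h = np.array([1.0, 0.5])                       # known channel impulse response
        rng = np.random.default_rng(7)
        s = rng.choice([-1.0, 1.0], size=8)            # transmitted BPSK symbols
        r = np.convolve(s, h)[: len(s)] + 0.3 * rng.standard_normal(len(s))

        def map_detect(r, h, n):
            best, best_metric = None, np.inf
            for cand in product([-1.0, 1.0], repeat=n):    # enumerate all sequences
                y = np.convolve(cand, h)[:n]
                metric = np.sum((r - y) ** 2)              # -log likelihood, equal priors
                if metric < best_metric:
                    best, best_metric = np.array(cand), metric
            return best

        print("symbol errors:", int(np.sum(map_detect(r, h, len(s)) != s)))

    The exponential cost of this enumeration in the channel memory is exactly the complexity problem the dissertation's suboptimal variants address.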

  7. Distributed Monocular SLAM for Indoor Map Building

    Directory of Open Access Journals (Sweden)

    Ruwan Egodagamage

    2017-01-01

    Full Text Available Utilization and generation of indoor maps are critical elements in accurate indoor tracking. Simultaneous Localization and Mapping (SLAM) is one of the main techniques for such map generation. In SLAM an agent generates a map of an unknown environment while estimating its location in it. Ubiquitous cameras lead to monocular visual SLAM, where a camera is the only sensing device for the SLAM process. In modern applications, multiple mobile agents may be involved in the generation of such maps, thus requiring a distributed computational framework. Each agent can generate its own local map, which can then be combined into a map covering a larger area. By doing so, they can cover a given environment faster than a single agent. Furthermore, they can interact with each other in the same environment, making this framework more practical, especially for collaborative applications such as augmented reality. One of the main challenges of distributed SLAM is identifying overlapping maps, especially when the relative starting positions of agents are unknown. In this paper, we propose a system having multiple monocular agents, with unknown relative starting positions, which generates a semidense global map of the environment.
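
    Once an overlap between two local maps has been identified, merging reduces to estimating the rigid transform between the agents' frames from corresponding landmarks; a least-squares (Procrustes/Kabsch) sketch in 2D with synthetic landmarks:

        import numpy as np

        def align(A, B):
            """Rotation R and translation t with B ~ A @ R.T + t (least squares)."""
            ca, cb = A.mean(0), B.mean(0)
            U, _, Vt = np.linalg.svd((A - ca).T @ (B - cb))
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:          # guard against reflections
                Vt[-1] *= -1
                R = Vt.T @ U.T
            return R, cb - R @ ca

        rng = np.random.default_rng(3)
        local = rng.random((20, 2))           # landmarks in agent 1's frame
        th = 0.7
        R_true = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
        other = local @ R_true.T + np.array([4.0, -1.0])   # same landmarks, agent 2

        R, t = align(local, other)
        print(np.allclose(local @ R.T + t, other))         # -> True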

  8. Computing Wigner distributions and time correlation functions using the quantum thermal bath method: application to proton transfer spectroscopy.

    Science.gov (United States)

    Basire, Marie; Borgis, Daniel; Vuilleumier, Rodolphe

    2013-08-14

    Langevin dynamics coupled to a quantum thermal bath (QTB) allows for the inclusion of vibrational quantum effects in molecular dynamics simulations at virtually no additional computer cost. We investigate here the ability of the QTB method to reproduce the quantum Wigner distribution of a variety of model potentials, designed to assess the performances and limits of the method. We further compute the infrared spectrum of a multidimensional model of proton transfer in the gas phase and in solution, using classical trajectories sampled initially from the Wigner distribution. It is shown that for this type of system involving large anharmonicities and strong nonlinear coupling to the environment, the quantum thermal bath is able to sample the Wigner distribution satisfactorily and to account for both zero point energy and tunneling effects. It leads to quantum time correlation functions having the correct short-time behavior, and the correct associated spectral frequencies, but that are slightly too overdamped. This is attributed to the classical propagation approximation rather than the generation of the quantized initial conditions themselves.
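
    The distinguishing ingredient of the QTB is its random force: instead of white noise at k_B*T, each frequency component is weighted by the quantum energy theta(w,T) = (hbar*w/2) * coth(hbar*w/(2*k_B*T)), so high-frequency modes retain zero-point energy. A sketch of generating such a force by spectral shaping (hbar = k_B = 1; the overall normalization and the coupling to the Langevin friction are glossed over):

        import numpy as np

        def qtb_force(n, dt, T, rng):
            w = 2 * np.pi * np.fft.rfftfreq(n, dt)
            theta = np.empty_like(w)
            theta[0] = T                             # classical limit as w -> 0
            x = np.minimum(0.5 * w[1:] / T, 50.0)    # clip to avoid overflow
            theta[1:] = T * x / np.tanh(x)           # (hbar w / 2) coth(hbar w / 2 k T)
            white = np.fft.rfft(rng.standard_normal(n))
            return np.fft.irfft(white * np.sqrt(theta / T), n)

        rng = np.random.default_rng(0)
        f = qtb_force(2 ** 14, dt=0.01, T=0.1, rng=rng)
        print(f"colored-force variance {f.var():.1f} vs classical white noise 1.0")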

  9. Unraveling African plate structure from elevation, geoid and geology data: implications for the impact of mantle flow and sediment transfers on lithospheric deformation

    Science.gov (United States)

    Bajolet, Flora; Robert, Alexandra; Chardon, Dominique; Rouby, Delphine

    2017-04-01

    The aim of our project is to simulate the long-wavelength, flexural isostatic response of the African plate to sediment transfers due to Meso-Cenozoic erosion - deposition processes in order to extract the residual topography driven by mantle dynamics. The first step of our project consists of computing crustal and lithospheric thickness maps of the African plate considering its main geological components (cratons, mobile belts, basins, rifts and passive margins of various ages and strengths). In order to consider these heterogeneities, we compute a 2D distribution of crustal densities and thermal parameters from geological data and use it as an input to our modeling. We combine elevation and geoid anomaly data using a thermal analysis, following the method of Fullea et al. (2007), in order to map crustal and lithospheric thicknesses. In this approach, we assume local isostasy and consider a four-layer model made of crust and lithospheric mantle plus seawater and asthenosphere. In addition, we compare our results with crustal and lithospheric thickness datasets compiled from bibliography and existing global models. The obtained crustal thicknesses range from 28 to 42 km, with the thickest crust confined to the northern part of the West African Craton, the Kaapvaal craton, and the Congo cuvette. The crust in the East African Rift appears unrealistically thick (40-45 km) as it is not isostatically compensated, highlighting the dynamic effect of the African superswell. The thinnest crust (28-34 km) follows a central East-West trend coinciding with Cretaceous rifts and the Cameroon volcanic line. The lithosphere reaches 220 km beneath the Congo craton, but remains globally thin (ca. 120-180 km) compared to tomographic models and considering the age of most geological provinces. As for the crust, the thinnest lithosphere is located in areas of Cretaceous-Jurassic rifting, suggesting that the lithosphere did not thermally recover from Mesozoic rifting. A new elastic
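
    A first-order cross-check behind such maps is Airy isostasy: a column of elevation h floats on a crustal root r = h * rho_c / (rho_m - rho_c). Illustrative densities and reference thickness below; the paper's actual inversion combines elevation with geoid anomalies and a thermal analysis:

        rho_c, rho_m = 2800.0, 3300.0      # crust and mantle densities (kg/m^3)
        t0 = 35.0                          # reference crustal thickness (km)

        def airy_thickness(h_km):
            """Total crustal thickness of a locally compensated column."""
            return t0 + h_km + h_km * rho_c / (rho_m - rho_c)

        for h in (0.0, 0.5, 1.0, 2.0):
            print(f"elevation {h:.1f} km -> crust {airy_thickness(h):.1f} km")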

  10. Spike-timing-based computation in sound localization.

    Directory of Open Access Journals (Sweden)

    Dan F M Goodman

    2010-11-01

    Full Text Available Spike timing is precise in the auditory system and it has been argued that it conveys information about auditory stimuli, in particular about the location of a sound source. However, beyond simple time differences, the way in which neurons might extract this information is unclear and the potential computational advantages are unknown. The computational difficulty of this task for an animal is to locate the source of an unexpected sound from two monaural signals that are highly dependent on the unknown source signal. In neuron models consisting of spectro-temporal filtering and spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. The model was able to accurately estimate the location of previously unknown sounds in both azimuth and elevation (including front/back discrimination) in a known acoustic environment. We found that multiple representations of different acoustic environments could coexist as sets of overlapping neural assemblies which could be associated with spatial locations by Hebbian learning. The model demonstrates the computational relevance of relative spike timing to extract spatial information about sources independently of the source signal.
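
    As a conventional baseline for the same task, the interaural time difference of a broadband source can be estimated by cross-correlating the two ear signals; the paper's contribution is to replace this with location-specific synchrony patterns in spiking neurons. A toy estimate with synthetic signals:

        import numpy as np

        fs = 44_100
        rng = np.random.default_rng(0)
        src = rng.standard_normal(fs // 10)          # 100 ms of broadband noise
        itd = 13                                     # true delay in samples (~295 us)
        left = src
        right = np.concatenate([np.zeros(itd), src[:-itd]])

        corr = np.correlate(right, left, mode="full")
        lag = corr.argmax() - (len(left) - 1)
        print(f"estimated ITD: {lag} samples ({lag / fs * 1e6:.0f} us)")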

  11. Selections from 2017: Computers Help Us Map Our Home

    Science.gov (United States)

    Kohler, Susanna

    2017-12-01

    Editor's note: In these last two weeks of 2017, we'll be looking at a few selections that we haven't yet discussed on AAS Nova from among the most-downloaded papers published in AAS journals this year. The usual posting schedule will resume in January. Machine-Learned Identification of RR Lyrae Stars from Sparse, Multi-Band Data: The PS1 Sample. Published April 2017. Main takeaway: A sample of RR Lyrae variable stars was built from the Pan-STARRS1 (PS1) survey by a team led by Branimir Sesar (Max Planck Institute for Astronomy, Germany). The sample of 45,000 stars represents the widest (three-fourths of the sky) and deepest (reaching 120 kpc) sample of RR Lyrae stars to date. Why it's interesting: It's challenging to understand the overall shape and behavior of our galaxy because we're stuck on the inside of it. RR Lyrae stars are a useful tool for this purpose: they can be used as tracers to map out the Milky Way's halo. The authors' large sample of RR Lyrae stars from PS1, combined with proper-motion measurements from Gaia and radial-velocity measurements from multi-object spectroscopic surveys, could become the premier source for studying the structure, kinematics, and gravitational potential of our galaxy's outskirts. How they were found: The black dots show the distribution of the 45,000 probable RR Lyrae stars in the authors' sample. [Sesar et al. 2017] The 45,000 stars in this sample were selected not by humans, but by computer. The authors used machine-learning algorithms to examine the light curves in the Pan-STARRS1 sample and identify the characteristic brightness variations of RR Lyrae stars lying in the galactic halo. These techniques resulted in a very pure and complete sample, and the authors suggest that this approach may translate well to other sparse, multi-band data sets such as that from the upcoming Large Synoptic Survey Telescope (LSST) galactic plane sub-survey. Citation: Branimir Sesar et al 2017 AJ 153 204. doi:10.3847/1538-3881/aa661b

  12. High performance multiple stream data transfer

    International Nuclear Information System (INIS)

    Rademakers, F.; Saiz, P.

    2001-01-01

    The ALICE detector at LHC (CERN) will record raw data at a rate of 1.2 gigabytes per second. Analysing all this data at CERN will not be feasible. As originally proposed by the MONARC project, data collected at CERN will be transferred to remote centres to use their computing infrastructure. The remote centres will reconstruct and analyse the events, and make the results available. Therefore high-rate data transfer between computing centres (Tiers) will become of paramount importance. The authors will present several tests that have been made between CERN and remote centres in Padova (Italy), Torino (Italy), Catania (Italy), Lyon (France), Ohio (United States), Warsaw (Poland) and Calcutta (India). These tests consisted, in a first stage, of sending raw data from CERN to the remote centres and back, using an FTP method that allows connections of several streams at the same time. Thanks to these multiple streams, it is possible to increase the rate at which the data is transferred. While several 'multiple stream FTP solutions' already exist, the authors' method is based on a parallel socket implementation which allows, besides files, also objects (or any large message) to be sent in parallel. A prototype able to manage different transfers will be presented. This is the first step of a system to be implemented that will take care of the connections with the remote centres to exchange data and monitor the status of the transfers.
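
    A loopback toy version of the multiple-stream idea: split a payload into chunks and push each chunk over its own TCP connection in parallel, reassembling on the receiving side (real deployments span WAN links, where several streams raise aggregate throughput):

        import socket
        import threading

        PAYLOAD = bytes(range(256)) * 4096             # ~1 MiB of test data
        N = 4
        chunks = [PAYLOAD[i::N] for i in range(N)]     # strided split, equal sizes

        def server(listener, store):
            for _ in range(N):                         # one connection per stream
                conn, _ = listener.accept()
                with conn:
                    idx = conn.recv(1)[0]              # first byte tags the chunk
                    buf = bytearray()
                    while data := conn.recv(65536):
                        buf += data
                    store[idx] = bytes(buf)

        listener = socket.create_server(("127.0.0.1", 0))
        port = listener.getsockname()[1]
        received = [b""] * N
        srv = threading.Thread(target=server, args=(listener, received))
        srv.start()

        def send(i):
            with socket.create_connection(("127.0.0.1", port)) as c:
                c.sendall(bytes([i]) + chunks[i])

        workers = [threading.Thread(target=send, args=(i,)) for i in range(N)]
        for w in workers: w.start()
        for w in workers: w.join()
        srv.join()

        out = bytearray(len(PAYLOAD))
        for i in range(N):
            out[i::N] = received[i]
        print(bytes(out) == PAYLOAD)                   # -> True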

  13. Experimental and computational studies on creatininium 4-nitrobenzoate - An organic proton transfer complex

    Science.gov (United States)

    Thirumurugan, R.; Anitha, K.

    2017-10-01

    A new organic proton transfer complex of creatininium 4-nitrobenzoate (C4NB) has been synthesized and its single crystals were grown successfully by the slow evaporation technique. The grown single crystal was subjected to various characterization techniques such as single crystal X-ray diffraction (SCXRD), FTIR, FT-Raman and Kurtz-Perry powder second harmonic generation (SHG). The SCXRD analysis revealed that C4NB crystallized in the orthorhombic crystal system, with the noncentrosymmetric (NCS) space group P212121. The creatininium cation and 4-nitrobenzoate anion are connected through a pair of N-H⋯O hydrogen bonds (N(3)-H(6)⋯O(3) (x+1, y, z) and N(2)-H(5)⋯O(2) (x-1/2, -y-1/2, -z+2)), forming an R22(8) ring motif. The crystal structure is stabilized by strong N-H⋯O and weak C-H⋯O intermolecular interactions, which were quantitatively analysed by Hirshfeld surface and fingerprint (FP) analysis. FTIR and FT-Raman studies confirmed the vibrational modes of the functional groups present in the C4NB compound. The SHG efficiency of the grown crystal is 4.6 times greater than that of the standard potassium dihydrogen phosphate (KDP) material. Moreover, density functional theory (DFT) studies such as Mulliken charge distribution, frontier molecular orbitals (FMOs), the molecular electrostatic potential (MEP) map, natural bond orbital (NBO) analysis and the first-order hyperpolarizability (β0) were carried out to explore the structure-property relationship.

  14. Development and Validation of Computational Fluid Dynamics Models for Prediction of Heat Transfer and Thermal Microenvironments of Corals

    Science.gov (United States)

    Ong, Robert H.; King, Andrew J. C.; Mullins, Benjamin J.; Cooper, Timothy F.; Caley, M. Julian

    2012-01-01

    We present Computational Fluid Dynamics (CFD) models of the coupled dynamics of water flow, heat transfer and irradiance in and around corals to predict the temperatures experienced by corals. These models were validated against controlled laboratory experiments, under constant and transient irradiance, for hemispherical and branching corals. Our CFD models agree very well with the experimental studies. A linear relationship between irradiance and coral surface warming was evident in both the simulation and experimental results, agreeing with heat transfer theory. However, the CFD models for the steady-state simulation produced a better fit to the linear relationship than the experimental data, likely due to experimental error in the empirical measurements. The consistency of our modelling results with experimental observations demonstrates the applicability of CFD simulations, such as the models developed here, to coral bleaching studies. A study of the influence of coral skeletal porosity and skeletal bulk density on surface warming was also undertaken, demonstrating boundary layer behaviour, and interstitial flow magnitude and temperature profiles in coral cross sections. Our models complement recent studies showing systematic changes in these parameters in some coral colonies and have utility in the prediction of coral bleaching. PMID:22701582

  15. Mapping the 2017 Eclipse: Education, Navigation, Inspiration

    Science.gov (United States)

    Zeiler, M.

    2015-12-01

    Eclipse maps are a unique vessel of knowledge. At a glance, they communicate the essential knowledge of where and when to successfully view a total eclipse of the sun. An eclipse map also provides detailed knowledge of eclipse circumstances superimposed on the highway system for optimal navigation, especially in the event that weather forces relocation. Eclipse maps are also a vital planning tool for solar physicists and astrophotographers capturing high-resolution imagery of the solar corona. Michael Zeiler will speak to the role of eclipse maps in educating the American public and inspiring people to make the effort to reach the path of totality for the sight of a lifetime. Michael will review the role of eclipse maps in astronomical research and discuss a project under development, the 2017 Eclipse Atlas for smartphones, tablets, and desktop computers.

  16. Developing an ionospheric map for South Africa

    Directory of Open Access Journals (Sweden)

    D. I. Okoh

    2010-07-01

    Full Text Available The development of a map of the ionosphere over South Africa is presented in this paper. The International Reference Ionosphere (IRI) model, the South African Bottomside Ionospheric Model (SABIM), and measurements from ionosondes in the South African Ionosonde Network were combined, within their own limitations, to develop an accurate representation of the South African ionosphere. The map is essentially in the form of a computer program that shows spatial and temporal representations of the South African ionosphere for a given set of geophysical parameters. A validation of the map is attempted using a comparison of Total Electron Content (TEC) values derived from the map, from the IRI model, and from Global Positioning System (GPS) measurements. It is foreseen that the final South African ionospheric map will be implemented as a Space Weather product of the African Space Weather Regional Warning Centre.

  17. Mapping sequences by parts

    Directory of Open Access Journals (Sweden)

    Guziolowski Carito

    2007-09-01

    Full Text Available Background: We present the N-map method, a pairwise and asymmetrical approach which allows us to compare sequences by taking into account evolutionary events that produce shuffled, reversed or repeated elements. Basically, the optimal N-map of a sequence s over a sequence t is the best way of partitioning the first sequence into N parts and placing them, possibly complementary reversed, over the second sequence in order to maximize the sum of their gapless alignment scores. Results: We introduce an algorithm computing an optimal N-map with time complexity O(|s| × |t| × N) using O(|s| × |t| × N) memory space. Among all the numbers of parts taken in a reasonable range, we select the value N for which the optimal N-map has the most significant score. To evaluate this significance, we study the empirical distributions of the scores of optimal N-maps and show that they can be approximated by normal distributions with reasonable accuracy. We test the functionality of the approach over random sequences to which we apply artificial evolutionary events. Practical Application: The method is illustrated with four case studies of pairs of sequences involving non-standard evolutionary events.
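
    A brute-force illustration of the objective (not the paper's O(|s| × |t| × N) algorithm): split s into N contiguous parts and give each part its best gapless placement on t, in forward or reverse-complement orientation, maximizing the summed scores:

        COMP = {"A": "T", "T": "A", "C": "G", "G": "C"}

        def best_gapless(part, t, match=1, mismatch=-1):
            """Best gapless alignment score of part (or its reverse
            complement) at any offset of t."""
            rc = "".join(COMP[c] for c in reversed(part))
            return max(
                sum(match if a == b else mismatch
                    for a, b in zip(cand, t[off:off + len(cand)]))
                for cand in (part, rc)
                for off in range(len(t) - len(cand) + 1)
            )

        def n_map_score(s, t, N):
            NEG = float("-inf")
            D = {(0, 0): 0}                # D[i, n]: best score for s[:i] in n parts
            for n in range(1, N + 1):
                for i in range(1, len(s) + 1):
                    D[i, n] = max(D.get((j, n - 1), NEG) + best_gapless(s[j:i], t)
                                  for j in range(i))
            return D[len(s), N]

        print(n_map_score("ACGTTTGA", "TTACGAATCAAACGT", N=2))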

  18. Concept Mapping Using Cmap Tools to Enhance Meaningful Learning

    Science.gov (United States)

    Cañas, Alberto J.; Novak, Joseph D.

    Concept maps are graphical tools that have been used in all facets of education and training for organizing and representing knowledge. When learners build concept maps, meaningful learning is facilitated. Computer-based concept mapping software such as CmapTools have further extended the use of concept mapping and greatly enhanced the potential of the tool, facilitating the implementation of a concept map-centered learning environment. In this chapter, we briefly present concept mapping and its theoretical foundation, and illustrate how it can lead to an improved learning environment when it is combined with CmapTools and the Internet. We present the nationwide “Proyecto Conéctate al Conocimiento” in Panama as an example of how concept mapping, together with technology, can be adopted by hundreds of schools as a means to enhance meaningful learning.

  19. Subdifferential-based implicit return-mapping operators in computational plasticity

    Czech Academy of Sciences Publication Activity Database

    Sysala, Stanislav; Čermák, Martin; Koudelka, T.; Kruis, J.; Zeman, J.; Blaheta, Radim

    2016-01-01

    Roč. 96, č. 11 (2016), s. 1318-1338 ISSN 1521-4001 R&D Projects: GA MŠk LQ1602; GA ČR GA13-18652S Institutional support: RVO:68145535 Keywords : elastoplasticity * nonsmooth yield surface * multivalued flow direction * implicit return-mapping scheme * semismooth Newton method * limit analysis Subject RIV: BA - General Mathematics http://onlinelibrary.wiley.com/doi/10.1002/zamm.201500305/full

  20. A simple nonlinear dynamical computing device

    International Nuclear Information System (INIS)

    Miliotis, Abraham; Murali, K.; Sinha, Sudeshna; Ditto, William L.; Spano, Mark L.

    2009-01-01

    We propose and characterize an iterated map whose nonlinearity has a simple (i.e., minimal) electronic implementation. We then demonstrate explicitly how all the different fundamental logic gates can be implemented and morphed using this nonlinearity. These gates provide the full set of gates necessary to construct a general-purpose, reconfigurable computing device. As an example of how such chaotic computing devices can be exploited, we use an array of these maps to encode data and to process information. Each map can store one of M items, where M is variable and can be large. This nonlinear hardware stores data naturally in different bases or alphabets. We also show how this method of storing information can serve as a preprocessing tool for exact or inexact pattern-matching searches.
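
    A sketch of the threshold scheme with the plain logistic map f(x) = 4x(1 - x) standing in for the paper's electronic nonlinearity: logic inputs shift the initial condition, the map is iterated once, and the output is read against a threshold. Changing the triple (x0, delta, E) "morphs" the same element into different gates; these particular parameter values are illustrative, chosen so the truth tables come out right:

        def f(x):
            return 4.0 * x * (1.0 - x)

        def gate(x0, delta, E):
            """Output 1 when one map iterate exceeds the threshold E."""
            return lambda *bits: int(f(x0 + delta * sum(bits)) > E)

        AND = gate(x0=0.00, delta=0.25, E=0.9)
        OR  = gate(x0=0.00, delta=0.25, E=0.6)
        XOR = gate(x0=0.25, delta=0.25, E=0.9)
        NOT = gate(x0=0.50, delta=0.25, E=0.9)

        for a in (0, 1):
            for b in (0, 1):
                print(a, b, "->", AND(a, b), OR(a, b), XOR(a, b))
        print("NOT:", NOT(0), NOT(1))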