WorldWideScience

Sample records for local computer environment

  1. Weighted Local Active Pixel Pattern (WLAPP) for Face Recognition in a Parallel Computation Environment

    Directory of Open Access Journals (Sweden)

    Gundavarapu Mallikarjuna Rao

    2013-10-01

    Full Text Available Abstract - The availability of multi-core technology has ushered in a totally new computational era. Researchers are keen to explore the potential of state-of-the-art machines for breaking the barrier imposed by serial computation. Face Recognition is a challenging application in any computational environment. The main difficulty of traditional Face Recognition algorithms is their lack of scalability. In this paper Weighted Local Active Pixel Pattern (WLAPP), a new scalable Face Recognition algorithm suitable for parallel environments, is proposed. Local Active Pixel Pattern (LAPP) is found to be simple and computationally inexpensive compared to Local Binary Patterns (LBP). WLAPP is developed based on the concept of LAPP. The experimentation is performed on the FG-Net Aging Database with deliberately introduced 20% distortion, and the results are encouraging. Keywords — Active pixels, Face Recognition, Local Binary Pattern (LBP), Local Active Pixel Pattern (LAPP), pattern computing, parallel workers, template, weight computation.
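    For reference, the baseline Local Binary Pattern operator that LAPP is compared against can be sketched as follows; the WLAPP/LAPP computation itself is not specified in the abstract, so only the standard LBP idea is shown:

```python
# Sketch of the classic 3x3 Local Binary Pattern (LBP) operator.
# Each of the 8 neighbors contributes one bit of the code: 1 if its
# intensity is >= the center pixel, else 0.

def lbp_code(image, r, c):
    """Return the 8-bit LBP code of pixel (r, c) for a 2D list `image`."""
    center = image[r][c]
    # Neighbors in clockwise order starting at the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if image[r + dr][c + dc] >= center:
            code |= 1 << bit
    return code

img = [[10, 20, 30],
       [40, 50, 60],
       [70, 80, 90]]
print(lbp_code(img, 1, 1))  # -> 120
```

    A texture descriptor is then typically the histogram of such codes over an image region; a parallel scheme can split the image into tiles and compute per-tile histograms independently.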

  2. Green Computing in Local Governments and Information Technology Companies

    Directory of Open Access Journals (Sweden)

    Badar Agung Nugroho

    2013-06-01

    Full Text Available Green computing is the study and practice of designing, manufacturing, using, and disposing of information and communication devices efficiently and effectively with minimum impact on the environment. If the green computing concept is implemented, it will help agencies and companies reduce the energy and capital costs of their IT infrastructure. The goal of this research is to explore the current efforts of local governments and IT companies in West Java to implement the green computing concept in their working environments. The primary data were collected through focus group discussions with representatives of the local governments and IT companies who are responsible for managing their IT infrastructure. The secondary data were then collected through brief observations of the actual green computing efforts at each institution. The result shows that perspectives on, and efforts toward, green computing implementation differ considerably between local governments and IT companies.

  3. Cluster-based localization and tracking in ubiquitous computing systems

    CERN Document Server

    Martínez-de Dios, José Ramiro; Torres-González, Arturo; Ollero, Anibal

    2017-01-01

    Localization and tracking are key functionalities in ubiquitous computing systems and techniques. In recent years a wide variety of approaches, sensors and techniques for indoor and GPS-denied environments has been developed. This book briefly summarizes the current state of the art in localization and tracking in ubiquitous computing systems, focusing on cluster-based schemes. Additionally, existing techniques for measurement integration, node inclusion/exclusion and cluster head selection are also described.

  4. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

    IHEP's computing environment consists of several different computing environments established on IHEP computer networks. Among these, the BES environment, which supports HEP computing, is the main part of the IHEP computing environment. In connection with the improvement and extension of the BES environment, the authors outline the development of these computing environments from the viewpoint of establishing a high energy physics (HEP) environment. Based on present trends in distributed computing, the direction of development of the IHEP computing environment toward distributed computing is presented.

  5. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
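    A hypothetical sketch of this idea, with all class and method names invented for illustration: events are logged together with a way to reverse them, the history can be searched, and a selected past event can be undone:

```python
# Toy event logbook with search and undo. Each logged event carries an
# undo callback; undoing an event runs the callback and removes the
# event from the history. Names here are illustrative, not the patent's API.

class Logbook:
    def __init__(self):
        self.history = []  # list of (description, undo_callback) pairs

    def log(self, description, undo_callback):
        self.history.append((description, undo_callback))

    def search(self, keyword):
        """Return indices of past events whose description contains keyword."""
        return [i for i, (desc, _) in enumerate(self.history)
                if keyword in desc]

    def undo(self, index):
        """Undo one selected past event and drop it from the history."""
        desc, undo_cb = self.history.pop(index)
        undo_cb()
        return desc

# Usage: log file creations in a toy environment, then undo one of them.
state = {"a.txt", "b.txt"}
book = Logbook()
book.log("created a.txt", lambda: state.discard("a.txt"))
book.log("created b.txt", lambda: state.discard("b.txt"))
book.undo(book.search("a.txt")[0])
print(sorted(state))  # -> ['b.txt']
```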

  6. HeNCE: A Heterogeneous Network Computing Environment

    Directory of Open Access Journals (Sweden)

    Adam Beguelin

    1994-01-01

    Full Text Available Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower level package called parallel virtual machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.
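    The graph model HeNCE is built around can be illustrated with a toy scheduler that runs a node's subroutine once all of its data dependencies are satisfied; the real system additionally maps nodes onto networked machines via PVM, which this sketch omits:

```python
# Minimal dependency-driven execution of a directed task graph.
# nodes: name -> function taking a dict of upstream results
# edges: list of (src, dst) arcs carrying data dependencies

def run_graph(nodes, edges):
    deps = {n: set() for n in nodes}
    for src, dst in edges:
        deps[dst].add(src)
    results, done = {}, set()
    while len(done) < len(nodes):
        for name in nodes:
            # A node is runnable once all of its predecessors are done.
            if name not in done and deps[name] <= done:
                inputs = {d: results[d] for d in deps[name]}
                results[name] = nodes[name](inputs)
                done.add(name)
    return results

# Toy application: two independent producers feeding one consumer.
# In HeNCE the producers could run in parallel on different machines.
nodes = {
    "a": lambda ins: 2,
    "b": lambda ins: 3,
    "sum": lambda ins: ins["a"] + ins["b"],
}
print(run_graph(nodes, [("a", "sum"), ("b", "sum")])["sum"])  # -> 5
```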

  7. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and Ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under the coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning, plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  8. Computational modeling of local hemodynamics phenomena: methods, tools and clinical applications

    International Nuclear Information System (INIS)

    Ponzini, R.; Rizzo, G.; Vergara, C.; Veneziani, A.; Morbiducci, U.; Montevecchi, F.M.; Redaelli, A.

    2009-01-01

    Local hemodynamics plays a key role in the onset of vessel wall pathophysiology, with peculiar blood flow structures (i.e. spatial velocity profiles, vortices, re-circulating zones, helical patterns and so on) characterizing the behavior of specific vascular districts. Thanks to evolving technologies in computer science, mathematical modeling and hardware performance, the study of local hemodynamics can today also make use of a virtual environment to perform hypothesis testing, product development, protocol design and methods validation that just a couple of decades ago would not have been thinkable. Computational fluid dynamics (CFD) appears to be more than a complementary partner to in vitro modeling and a possible substitute for animal models, furnishing a privileged environment for cheap, fast and reproducible data generation.

  9. Distributed multiscale computing with MUSCLE 2, the Multiscale Coupling Library and Environment

    NARCIS (Netherlands)

    Borgdorff, J.; Mamonski, M.; Bosak, B.; Kurowski, K.; Ben Belgacem, M.; Chopard, B.; Groen, D.; Coveney, P.V.; Hoekstra, A.G.

    2014-01-01

    We present the Multiscale Coupling Library and Environment: MUSCLE 2. This multiscale component-based execution environment has a simple to use Java, C++, C, Python and Fortran API, compatible with MPI, OpenMP and threading codes. We demonstrate its local and distributed computing capabilities and

  10. A PC/workstation cluster computing environment for reservoir engineering simulation applications

    International Nuclear Information System (INIS)

    Hermes, C.E.; Koo, J.

    1995-01-01

    Like the rest of the petroleum industry, Texaco has been transferring its applications and databases from mainframes to PC's and workstations. This transition has been very positive because it provides an environment for integrating applications, increases end-user productivity, and in general reduces overall computing costs. On the down side, the transition typically results in a dramatic increase in workstation purchases and raises concerns regarding the cost and effective management of computing resources in this new environment. The workstation transition also places the user in a Unix computing environment which, to say the least, can be quite frustrating to learn and to use. This paper describes the approach, philosophy, architecture, and current status of the new reservoir engineering/simulation computing environment developed at Texaco's E and P Technology Dept. (EPTD) in Houston. The environment is representative of those under development at several other large oil companies and is based on a cluster of IBM and Silicon Graphics Intl. (SGI) workstations connected by a fiber-optics communications network and engineering PC's connected to local area networks, or Ethernets. Because computing resources and software licenses are shared among a group of users, the new environment enables the company to get more out of its investments in workstation hardware and software

  11. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Vigil, Benny Manuel [Los Alamos National Laboratory]; Ballance, Robert [SNL]; Haskell, Karen [SNL]

    2012-08-09

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of the ACE requirements that are met, and of those that are not, is included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  12. Event detection and localization for small mobile robots using reservoir computing.

    Science.gov (United States)

    Antonelo, E A; Schrauwen, B; Stroobandt, D

    2008-08-01

    Reservoir Computing (RC) techniques use a fixed (usually randomly created) recurrent neural network, or more generally any dynamic system, which operates at the edge of stability, where only a linear static readout output layer is trained by standard linear regression methods. In this work, RC is used for detecting complex events in autonomous robot navigation. This can be extended to robot localization tasks which are solely based on a few low-range, high-noise sensory data. The robot thus builds an implicit map of the environment (after learning) that is used for efficient localization by simply processing the input stream of distance sensors. These techniques are demonstrated in both a simple simulation environment and in the physically realistic Webots simulation of the commercially available e-puck robot, using several complex and even dynamic environments.
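    The RC recipe the abstract describes (a fixed random recurrent network, only a linear readout trained by least squares) can be sketched with a toy echo state network; the reservoir size, input scaling, and the toy "event detection" task below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# Fixed random reservoir; only the linear readout is trained.
rng = np.random.default_rng(0)
n_res = 50
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))       # input weights (fixed)
W = rng.uniform(-0.5, 0.5, (n_res, n_res))      # recurrent weights (fixed)
W *= 0.9 / max(abs(np.linalg.eigvals(W)))       # spectral radius < 1

def run_reservoir(inputs):
    """Drive the reservoir with a scalar sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in[:, 0] * u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: flag "events" where the input exceeds 0.5. The readout is
# fitted by ordinary least squares on the collected reservoir states.
u = rng.uniform(0, 1, 300)
target = (u > 0.5).astype(float)
X = run_reservoir(u)
w_out, *_ = np.linalg.lstsq(X, target, rcond=None)
accuracy = np.mean(((X @ w_out) > 0.5) == (target > 0.5))
print(accuracy > 0.7)
```

    The same structure extends to the paper's setting by feeding in distance-sensor streams and training the readout on event or location labels.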

  13. Local environment effects in disordered alloys

    International Nuclear Information System (INIS)

    Cable, J.W.

    1978-01-01

    The magnetic moment of an atom in a ferromagnetic disordered alloy depends on the local environment of that atom. This is particularly true for Ni and Pd based alloys for which neutron diffuse scattering measurements of the range and magnitude of the moment disturbances indicate that both magnetic and chemical environment are important in determining the moment distribution. In this paper we review recent neutron studies of local environment effects in Ni based alloys. These are discussed in terms of a phenomenological model that allows a separation of the total moment disturbance at a Ni site into its chemical and magnetic components

  14. Multiple Signal Classification Algorithm Based Electric Dipole Source Localization Method in an Underwater Environment

    Directory of Open Access Journals (Sweden)

    Yidong Xu

    2017-10-01

    Full Text Available A novel localization method based on the multiple signal classification (MUSIC) algorithm is proposed for positioning an electric dipole source in a confined underwater environment using an electric-dipole receiving antenna array. In this method, the boundary element method (BEM) is introduced to analyze the boundary of the confined region by means of a matrix equation. The voltage of each dipole pair is used as the spatial-temporal localization data and, unlike the conventional field-based localization method, the field component in each direction need not be obtained, so the method can be easily implemented in practical engineering applications. Then, a global-multiple region-conjugate gradient (CG) hybrid search method is used to reduce the computational burden and to improve the operation speed. Two localization simulation models and a physical experiment are conducted. Both the simulation results and the physical experiment provide accurate positioning performance, helping to verify the effectiveness of the proposed localization method in underwater environments.
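    The MUSIC idea at the core of the method can be illustrated with a standard narrowband direction-finding toy; the paper uses dipole-pair voltages with BEM-derived steering vectors in a confined region, whereas this sketch assumes a free-space plane-wave steering vector for a 4-element uniform linear array, with all values illustrative:

```python
import numpy as np

def steering(theta_deg, n_sensors=4):
    """Array response of a half-wavelength-spaced ULA for angle theta."""
    k = np.pi * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(n_sensors))

rng = np.random.default_rng(1)
true_angle = 20.0
# 200 snapshots of one source plus weak sensor noise.
snapshots = steering(true_angle)[:, None] * rng.standard_normal(200)
snapshots = snapshots + 0.01 * (rng.standard_normal((4, 200))
                                + 1j * rng.standard_normal((4, 200)))

R = snapshots @ snapshots.conj().T / 200     # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(R)         # eigenvalues ascending
En = eigvecs[:, :-1]                         # noise subspace (1 source)

# MUSIC pseudospectrum: peaks where the steering vector is (nearly)
# orthogonal to the noise subspace.
grid = np.arange(-90, 91)
spectrum = [1.0 / np.linalg.norm(En.conj().T @ steering(t)) ** 2
            for t in grid]
est = int(grid[int(np.argmax(spectrum))])
print(est)
```

    The paper's hybrid search plays the role of the brute-force grid scan here, reducing the cost of locating the pseudospectrum peak.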

  15. Local fishing associations and environment authorities visit CERN

    CERN Document Server

    AUTHOR|(CDS)2099575

    2016-01-01

    Local fishing associations and Host States' environment authorities visited CERN on Thursday 21st April 2016. They discovered the efforts made by CERN and its Health, Safety and Environment (HSE) unit to control and limit the impact of the Laboratory's activities on the natural environment, and more specifically on local rivers.

  16. Precise RFID localization in impaired environment through sparse signal recovery

    Science.gov (United States)

    Subedi, Saurav; Zhang, Yimin D.; Amin, Moeness G.

    2013-05-01

    Radio frequency identification (RFID) is a rapidly developing wireless communication technology for electronically identifying, locating, and tracking products, assets, and personnel. RFID has become one of the most important means to construct real-time locating systems (RTLS) that track and identify the location of objects in real time using simple, inexpensive tags and readers. The applicability and usefulness of RTLS techniques depend on their achievable accuracy. In particular, when multilateration-based localization techniques are exploited, the achievable accuracy primarily relies on the precision of the range estimates between a reader and the tags. Such range information can be obtained by using the received signal strength indicator (RSSI) and/or the phase difference of arrival (PDOA). In both cases, however, the accuracy is significantly compromised when the operation environment is impaired. In particular, multipath propagation significantly affects the measurement accuracy of both RSSI and phase information. In addition, because RFID systems are typically operated in short distances, RSSI and phase measurements are also coupled with the reader and tag antenna patterns, making accurate RFID localization very complicated and challenging. In this paper, we develop new methods to localize RFID tags or readers by exploiting sparse signal recovery techniques. The proposed method allows the channel environment and antenna patterns to be taken into account and be properly compensated at a low computational cost. As such, the proposed technique yields superior performance in challenging operation environments with the above-mentioned impairments.
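    As a concrete example of the range estimates the paper's sparse-recovery method refines, RSSI can be inverted to distance through the standard log-distance path-loss model; the reference power and path-loss exponent below are illustrative values, and impaired multipath environments are exactly where this simple inversion degrades:

```python
# Log-distance path-loss model: RSSI(d) = P0 - 10 * n * log10(d),
# where P0 is the received power at 1 m and n the path-loss exponent.
# Inverting it gives a range estimate from a single RSSI reading.

def rssi_to_range(rssi_dbm, p0_dbm=-40.0, n=2.0):
    """Estimate distance in metres from an RSSI reading in dBm."""
    return 10 ** ((p0_dbm - rssi_dbm) / (10 * n))

print(rssi_to_range(-60.0))  # -> 10.0 m with these parameters
```

    Multilateration then combines several such reader-to-tag ranges into a position fix; multipath and antenna-pattern effects bias each range, which motivates the compensation the paper proposes.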

  17. ISS Local Environment Spectrometers (ISLES)

    Science.gov (United States)

    Krause, Linda Habash; Gilchrist, Brian E.

    2014-01-01

    In order to study the complex interactions between the space environment surrounding the ISS and the ISS surface materials, we propose to use low-cost, high-TRL plasma sensors on the ISS robotic arm to probe the ISS space environment. During many years of ISS operation, we have been able to conduct effective (but not perfect) extravehicular activities (both human and robotic) within the perturbed local ISS space environment. Because of the complexity of the interaction between the ISS and the LEO space environment, there remain important questions, such as differential charging at solar panel junctions (the so-called "triple point" between conductor, dielectric, and space plasma), increased chemical contamination due to ISS surface charging and/or thruster activation, water dumps, etc., and "bootstrap" charging of insulating surfaces. Some compelling questions could synergistically draw upon a common sensor suite, which also leverages previous and current MSFC investments. Specific questions address ISS surface charging, plasma contactor plume expansion in a magnetized drifting plasma, and possible localized contamination effects across the ISS.

  18. Performative Environments

    DEFF Research Database (Denmark)

    Thomsen, Bo Stjerne

    2008-01-01

    The paper explores how performative architecture can act as a collective environment, localizing urban flows and establishing public domains through the integration of pervasive computing and animation techniques. The NoRA project introduces the concept of 'performative environments,' focusing on local interactions and network behaviour: the building becomes social infrastructure and prompts an understanding of architectural structures as quasi-objects, which can retain both variation and recognisability in changing social constellations.

  19. CSNS computing environment Based on OpenStack

    Science.gov (United States)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows more flexible configuration of IT resources and optimized hardware utilization, and it can provide computing services according to real need. We are applying this computing model to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated, covering the cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.

  20. Reach and get capability in a computing environment

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2012-06-05

    A reach and get technique includes invoking a reach command from a reach location within a computing environment. A user can then navigate to an object within the computing environment and invoke a get command on the object. In response to invoking the get command, the computing environment is automatically navigated back to the reach location and the object copied into the reach location.
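    A minimal sketch of the interaction the patent describes, with all class and method names invented for illustration: the environment remembers where the reach command was invoked, lets the user navigate to an object, and on get copies the object back and returns to the reach location:

```python
# Toy "reach and get": reach() records the current location; get() copies
# the chosen object into that recorded location and navigates back to it.

class Environment:
    def __init__(self):
        self.locations = {"desktop": [], "archive": ["report.txt"]}
        self.current = "desktop"
        self._reach_from = None

    def reach(self):
        self._reach_from = self.current  # remember where reach was invoked

    def navigate(self, location):
        self.current = location

    def get(self, obj):
        # Copy the object into the reach location, then jump back there.
        self.locations[self._reach_from].append(obj)
        self.current = self._reach_from

env = Environment()
env.reach()                 # invoked from "desktop"
env.navigate("archive")     # user browses elsewhere
env.get("report.txt")       # object copied back, view returns to "desktop"
print(env.current, env.locations["desktop"])
```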

  1. Applications of the pipeline environment for visual informatics and genomics computations

    Directory of Open Access Journals (Sweden)

    Genco Alex

    2011-07-01

    Full Text Available Abstract Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment (http://pipeline.loni.ucla.edu) provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources.

  2. Efficient Topological Localization Using Global and Local Feature Matching

    Directory of Open Access Journals (Sweden)

    Junqiu Wang

    2013-03-01

    Full Text Available We present an efficient vision-based global topological localization approach in which different image features are used in a coarse-to-fine matching framework. The Orientation Adjacency Coherence Histogram (OACH), a novel image feature, is proposed to improve the coarse localization. The coarse localization results are taken as inputs for the fine localization, which is carried out by matching Harris-Laplace interest points characterized by the SIFT descriptor. The computation of OACHs and interest points is efficient because these features are computed in an integrated process. The matching of local features is improved by using an approximate nearest neighbor search technique. We have implemented and tested the localization system in real environments. The experimental results demonstrate that our approach is efficient and reliable in both indoor and outdoor environments. This work has also been compared with previous works. The comparison results show that our approach has better performance, with a higher correct ratio and lower computational complexity.
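    The coarse stage can be illustrated with plain histogram intersection standing in for OACH matching (the OACH feature itself is not specified in the abstract); the surviving candidates would then be passed to the fine interest-point matching stage:

```python
# Coarse topological localization sketch: score a query histogram against
# stored place histograms by histogram intersection and keep the top-k
# candidates for the fine (local-feature) stage.

def intersection(h1, h2):
    """Histogram intersection: sum of bin-wise minima."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def coarse_candidates(query, database, top_k=2):
    scored = sorted(database, key=lambda item: -intersection(query, item[1]))
    return [name for name, _ in scored[:top_k]]

# Toy place database of (name, histogram) pairs.
db = [("corridor", [5, 3, 2]),
      ("lab",      [1, 1, 8]),
      ("lobby",    [4, 4, 2])]
print(coarse_candidates([5, 4, 1], db))
```

    Pruning the database this way is what keeps the subsequent interest-point matching cheap, since SIFT descriptor comparison is far more expensive than a histogram intersection.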

  3. Large-scale parallel genome assembler over cloud computing environment.

    Science.gov (United States)

    Das, Arghya Kusum; Koppa, Praveen Kumar; Goswami, Sayan; Platania, Richard; Park, Seung-Jong

    2017-06-01

    The size of high throughput DNA sequencing data has already reached the terabyte scale. To manage this huge volume of data, many downstream sequencing applications started using locality-based computing over different cloud infrastructures to take advantage of elastic (pay as you go) resources at a lower cost. However, the locality-based programming model (e.g. MapReduce) is relatively new. Consequently, developing scalable data-intensive bioinformatics applications using this model and understanding the hardware environment that these applications require for good performance, both require further research. In this paper, we present a de Bruijn graph oriented Parallel Giraph-based Genome Assembler (GiGA), as well as the hardware platform required for its optimal performance. GiGA uses the power of Hadoop (MapReduce) and Giraph (large-scale graph analysis) to achieve high scalability over hundreds of compute nodes by collocating the computation and data. GiGA achieves significantly higher scalability with competitive assembly quality compared to contemporary parallel assemblers (e.g. ABySS and Contrail) over traditional HPC cluster. Moreover, we show that the performance of GiGA is significantly improved by using an SSD-based private cloud infrastructure over traditional HPC cluster. We observe that the performance of GiGA on 256 cores of this SSD-based cloud infrastructure closely matches that of 512 cores of traditional HPC cluster.
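    The de Bruijn graph at the core of assemblers like GiGA can be sketched in a few lines; this toy builds the graph in memory for tiny reads, whereas GiGA distributes the same structure over Hadoop and Giraph by collocating computation and data:

```python
# Minimal de Bruijn graph construction: each k-mer in each read adds one
# edge from its (k-1)-mer prefix to its (k-1)-mer suffix. Assembly then
# amounts to finding paths through this graph.

def de_bruijn(reads, k):
    """Map each (k-1)-mer prefix to the list of (k-1)-mer suffixes it
    connects to, one edge per k-mer occurrence."""
    graph = {}
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph.setdefault(kmer[:-1], []).append(kmer[1:])
    return graph

g = de_bruijn(["ACGT", "CGTA"], 3)
print(g["CG"])  # edges out of node "CG" -> ['GT', 'GT']
```

    Terabyte-scale inputs make even this simple dictionary infeasible on one node, which is why GiGA partitions the graph across compute nodes and iterates on it with Giraph's vertex-centric model.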

  4. The effect of brain lesions on sound localization in complex acoustic environments.

    Science.gov (United States)

    Zündorf, Ida C; Karnath, Hans-Otto; Lewald, Jörg

    2014-05-01

    Localizing sound sources of interest in cluttered acoustic environments--as in the 'cocktail-party' situation--is one of the most demanding challenges to the human auditory system in everyday life. In this study, stroke patients' ability to localize acoustic targets in a single-source and in a multi-source setup in the free sound field were directly compared. Subsequent voxel-based lesion-behaviour mapping analyses were computed to uncover the brain areas associated with a deficit in localization in the presence of multiple distracter sound sources rather than localization of individually presented sound sources. Analyses revealed a fundamental role of the right planum temporale in this task. The results from the left hemisphere were less straightforward, but suggested an involvement of inferior frontal and pre- and postcentral areas. These areas appear to be particularly involved in the spectrotemporal analyses crucial for effective segregation of multiple sound streams from various locations, beyond the currently known network for localization of isolated sound sources in otherwise silent surroundings.

  5. Multi-VO support in IHEP's distributed computing environment

    International Nuclear Information System (INIS)

    Yan, T; Suo, B; Zhao, X H; Zhang, X M; Ma, Z T; Yan, X F; Lin, T; Deng, Z Y; Li, W D; Belov, S; Pelevanyuk, I; Zhemchugov, A; Cai, H

    2015-01-01

    Inspired by the success of BESDIRAC, the distributed computing environment based on DIRAC for the BESIII experiment, several other experiments operated by the Institute of High Energy Physics (IHEP), such as the Circular Electron Positron Collider (CEPC), the Jiangmen Underground Neutrino Observatory (JUNO), the Large High Altitude Air Shower Observatory (LHAASO) and the Hard X-ray Modulation Telescope (HXMT), are willing to use DIRAC to integrate the geographically distributed computing resources made available by their collaborations. In order to minimize manpower and hardware cost, we extended the BESDIRAC platform to support a multi-VO scenario, instead of setting up a self-contained distributed computing environment for each VO. This makes DIRAC a service for the community of those experiments. To support multiple VOs, the system architecture of BESDIRAC has been adjusted for scalability. The VOMS and DIRAC servers are reconfigured to manage users and groups belonging to several VOs. A lightweight storage resource manager, StoRM, is employed as the central SE to integrate local and grid data. A frontend system is designed for users' massive job splitting, submission and management, with plugins to support new VOs. A monitoring and accounting system is also considered, to ease system administration and VO-related resource usage accounting. (paper)

  6. Experience of BESIII data production with local cluster and distributed computing model

    International Nuclear Information System (INIS)

    Deng, Z Y; Li, W D; Liu, H M; Sun, Y Z; Zhang, X M; Lin, L; Nicholson, C; Zhemchugov, A

    2012-01-01

    The BES III detector is a new spectrometer which works on the upgraded high-luminosity collider, BEPCII. The BES III experiment studies physics in the tau-charm energy region from 2 GeV to 4.6 GeV. From 2009 to 2011, BEPCII has produced 106M ψ(2S) events, 225M J/ψ events, 2.8 fb⁻¹ of ψ(3770) data, and 500 pb⁻¹ of data at 4.01 GeV. All the data samples were processed successfully and many important physics results have been achieved based on these samples. Doing data production correctly and efficiently with limited CPU and storage resources is a big challenge. This paper will describe the implementation of the experiment-specific data production for BESIII in detail, including data calibration with an event-level parallel computing model, data reconstruction, inclusive Monte Carlo generation, random trigger background mixing and multi-stream data skimming. Now, with the data sample increasing rapidly, there is a growing demand to move from solely using a local cluster to a more distributed computing model. A distributed computing environment is being set up and expected to go into production use in 2012. The experience of BESIII data production, both with a local cluster and with a distributed computing model, is presented here.

  7. Local rollback for fault-tolerance in parallel computing systems

    Science.gov (United States)

    Blumrich, Matthias A [Yorktown Heights, NY; Chen, Dong [Yorktown Heights, NY; Gara, Alan [Yorktown Heights, NY; Giampapa, Mark E [Yorktown Heights, NY; Heidelberger, Philip [Yorktown Heights, NY; Ohmacht, Martin [Yorktown Heights, NY; Steinmacher-Burow, Burkhard [Boeblingen, DE; Sugavanam, Krishnan [Yorktown Heights, NY

    2012-01-24

    A control logic device performs a local rollback in a parallel supercomputing system. The supercomputing system includes at least one cache memory device. The control logic device determines a local rollback interval. The control logic device runs at least one instruction in the local rollback interval. The control logic device evaluates whether an unrecoverable condition occurs while running the at least one instruction during the local rollback interval. The control logic device checks whether an error occurs during the local rollback. The control logic device restarts the local rollback interval if the error occurs and the unrecoverable condition does not occur during the local rollback interval.

  8. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The ABC95 array computer is a multi-function-network computer based on FPGA technology. The multi-function network supports conflict-free access by processors to data in memory, and supports processor-to-processor data exchange over an enhanced MESH network. The ABC95 instruction set includes control instructions, scalar instructions and vector instructions; the network instructions in particular are introduced. A programming environment for ABC95 array computer assembly language is designed, and a VC++-based programming environment for the ABC95 array computer is presented. It includes functions to load ABC95 array computer programs and data, to store results, to run programs and so on. In particular, the data type for conflict-free access on the ABC95 array computer is defined. The results show that these technologies support effective program development for the ABC95 array computer.

  9. Security Management Model in Cloud Computing Environment

    OpenAIRE

    Ahmadpanah, Seyed Hossein

    2016-01-01

    In the cloud computing environment, the growing number of cloud virtual machines (VMs) poses ever greater security and management challenges. In order to address security issues in the cloud computing virtualization environment, this paper presents an efficient and dynamic VM security management model based on state migration and scheduling, studying the virtual machine security architecture and, based on AHP (Analytic Hierarchy Process), virtual machine de...

  10. Computing, Environment and Life Sciences | Argonne National Laboratory

    Science.gov (United States)

    Computing, Environment and Life Sciences at Argonne National Laboratory. Research divisions: BIO (Biosciences), CPS (Computational Science), DSL (Data...), Environmental Science Division, Mathematics and Computer Science Division. Facilities and institutes: Argonne Leadership Computing Facility.

  11. Exploiting Locality in Quantum Computation for Quantum Chemistry.

    Science.gov (United States)

    McClean, Jarrod R; Babbush, Ryan; Love, Peter J; Aspuru-Guzik, Alán

    2014-12-18

    Accurate prediction of chemical and material properties from first-principles quantum chemistry is a challenging task on traditional computers. Recent developments in quantum computation offer a route toward highly accurate solutions with polynomial cost; however, this solution still carries a large overhead. In this Perspective, we aim to bring together known results about the locality of physical interactions from quantum chemistry with ideas from quantum computation. We show that the utilization of spatial locality combined with the Bravyi-Kitaev transformation offers an improvement in the scaling of known quantum algorithms for quantum chemistry and provides numerical examples to help illustrate this point. We combine these developments to improve the outlook for the future of quantum chemistry on quantum computers.

  12. Intelligent computing for sustainable energy and environment

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang [Queen' s Univ. Belfast (United Kingdom). School of Electronics, Electrical Engineering and Computer Science; Li, Shaoyuan; Li, Dewei [Shanghai Jiao Tong Univ., Shanghai (China). Dept. of Automation; Niu, Qun (eds.) [Shanghai Univ. (China). School of Mechatronic Engineering and Automation

    2013-07-01

    Fast track conference proceedings. State of the art research. Up to date results. This book constitutes the refereed proceedings of the Second International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2012, held in Shanghai, China, in September 2012. The 60 full papers presented were carefully reviewed and selected from numerous submissions and present theories and methodologies as well as the emerging applications of intelligent computing in sustainable energy and environment.

  13. CHPS IN CLOUD COMPUTING ENVIRONMENT

    OpenAIRE

    K.L.Giridas; A.Shajin Nargunam

    2012-01-01

    Workflows have been utilized to characterize various forms of applications with high processing and storage space demands. So, to make the cloud computing environment more eco-friendly, our research project aims at reducing the e-waste accumulated by computers. In a hybrid cloud, the user has the flexibility offered by public cloud resources, which can be combined with the private resource pool as required. Our previous work described the process of combining the low range and mid range proce...

  14. Local environment can enhance fidelity of quantum teleportation

    Science.gov (United States)

    Badziąg, Piotr; Horodecki, Michał; Horodecki, Paweł; Horodecki, Ryszard

    2000-07-01

    We show how an interaction with the environment can enhance fidelity of quantum teleportation. To this end, we present examples of states which cannot be made useful for teleportation by any local unitary transformations; nevertheless, after being subjected to a dissipative interaction with the local environment, the states allow for teleportation with genuinely quantum fidelity. The surprising fact here is that the necessary interaction does not require any intelligent action from the parties sharing the states. In passing, we produce some general results regarding optimization of teleportation fidelity by local action. We show that bistochastic processes cannot improve fidelity of two-qubit states. We also show that in order to have their fidelity improvable by a local process, the bipartite states must violate the so-called reduction criterion of separability.

  15. A Distributed Snapshot Protocol for Efficient Artificial Intelligence Computation in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2018-01-01

    Full Text Available Many artificial intelligence applications often require a huge amount of computing resources. As a result, cloud computing adoption rates are increasing in the artificial intelligence field. To support the demand for artificial intelligence applications and guarantee the service level agreement, cloud computing should provide not only computing resources but also fundamental mechanisms for efficient computing. In this regard, a snapshot protocol has been used to create a consistent snapshot of the global state in cloud computing environments. However, the existing snapshot protocols are not optimized in the context of artificial intelligence applications, where large-scale iterative computation is the norm. In this paper, we present a distributed snapshot protocol for efficient artificial intelligence computation in cloud computing environments. The proposed snapshot protocol is based on a distributed algorithm to run interconnected multiple nodes in a scalable fashion. Our snapshot protocol is able to deal with artificial intelligence applications, in which a large number of computing nodes are running. We reveal that our distributed snapshot protocol guarantees the correctness, safety, and liveness conditions.
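
    The record above builds on the idea of a consistent global snapshot of interconnected nodes. The paper's own protocol is not reproduced here; the following is a minimal Chandy-Lamport-style marker sketch in Python (all class and function names are illustrative assumptions), showing how each node records its state on the first marker it sees and relays markers to its neighbours.

```python
from collections import defaultdict

class Node:
    """One compute node; records its state exactly once per snapshot.
    Recording of in-flight channel messages is omitted for brevity."""
    def __init__(self, name, state):
        self.name = name
        self.state = state
        self.recorded_state = None

    def start_snapshot(self, network):
        self.recorded_state = self.state      # record own state first
        network.broadcast_marker(self.name)   # then send markers out

    def on_marker(self, network):
        if self.recorded_state is None:       # first marker seen
            self.recorded_state = self.state
            network.broadcast_marker(self.name)

class Network:
    def __init__(self, nodes):
        self.nodes = {n.name: n for n in nodes}
        self.links = defaultdict(list)

    def connect(self, a, b):
        self.links[a].append(b)
        self.links[b].append(a)

    def broadcast_marker(self, sender):
        for peer in self.links[sender]:
            self.nodes[peer].on_marker(self)

def global_snapshot(network, initiator):
    network.nodes[initiator].start_snapshot(network)
    return {name: n.recorded_state for name, n in network.nodes.items()}

nodes = [Node("a", 1), Node("b", 2), Node("c", 3)]
net = Network(nodes)
net.connect("a", "b")
net.connect("b", "c")
print(global_snapshot(net, "a"))  # {'a': 1, 'b': 2, 'c': 3}
```

Because a node ignores markers after its first, the snapshot terminates after each link carries at most one marker per direction, which is what makes the approach scale to many interconnected nodes.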

  16. Instantaneous Non-Local Computation of Low T-Depth Quantum Circuits

    DEFF Research Database (Denmark)

    Speelman, Florian

    2016-01-01

    Instantaneous non-local quantum computation requires multiple parties to jointly perform a quantum operation, using pre-shared entanglement and a single round of simultaneous communication. We study this task for its close connection to position-based quantum cryptography, but it also has natural applications in the context of foundations of quantum physics and in distributed computing. The best known general construction for instantaneous non-local quantum computation requires a pre-shared state which is exponentially large in the number of qubits involved in the operation. We present an efficient protocol, parameterized by the T-depth of a quantum circuit, able to perform non-local computation of quantum circuits with a (poly-)logarithmic number of layers of T gates with quasi-polynomial entanglement. Our proofs combine ideas from blind and delegated quantum computation with the garden-hose model, a combinatorial model of communication...

  17. Printing in Ubiquitous Computing Environments

    NARCIS (Netherlands)

    Karapantelakis, Athanasios; Delvic, Alisa; Zarifi Eslami, Mohammed; Khamit, Saltanat

    Document printing has long been considered an indispensable part of the workspace. While this process is considered trivial and simple for environments where resources are ample (e.g. desktop computers connected to printers within a corporate network), it becomes complicated when applied in a mobile

  18. CRLBs for WSNs localization in NLOS environment

    Directory of Open Access Journals (Sweden)

    Wang Peng

    2011-01-01

    Full Text Available Determination of the Cramer-Rao lower bound (CRLB) as an optimality criterion for the problem of localization in wireless sensor networks (WSNs) is a very important issue. Currently, CRLBs have been derived for the line-of-sight (LOS) situation in WSNs. However, one of the major problems for accurate localization in WSNs is non-line-of-sight (NLOS) propagation. This article proposes two CRLBs for WSN localization in the NLOS environment. The proposed CRLBs consider both the cases where the positions of reference devices (RDs) are perfectly and imperfectly known. Since a non-parametric kernel method is used to build the probability density function of NLOS errors, the proposed CRLBs are suitable for various distributions of NLOS errors. Moreover, the proposed CRLBs provide a unified presentation for both LOS and NLOS environments. Theoretical analysis also proves that the proposed CRLB for the NLOS situation becomes the CRLB for the LOS situation when NLOS errors go to 0, which gives a robust check for the proposed CRLB.
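
    For the LOS baseline that the bounds above reduce to, the position-error CRLB from range (time-of-arrival) measurements has a compact closed form. Below is a small illustrative sketch; the function name and geometry are assumptions, and the article's NLOS extension, which requires the kernel-estimated error density, is not implemented.

```python
import numpy as np

def toa_crlb(anchors, target, sigma):
    """LOS CRLB (trace of the inverse Fisher information) for a 2-D
    position estimated from range measurements with i.i.d. Gaussian
    noise of std sigma. FIM = sum_i u_i u_i^T / sigma^2, where u_i is
    the unit vector from anchor i to the target."""
    fim = np.zeros((2, 2))
    for a in np.asarray(anchors, dtype=float):
        d = np.asarray(target, dtype=float) - a
        u = d / np.linalg.norm(d)            # direction anchor -> target
        fim += np.outer(u, u) / sigma**2
    return float(np.trace(np.linalg.inv(fim)))  # lower bound on position MSE

anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
print(toa_crlb(anchors, (5.0, 5.0), sigma=0.5))  # ≈ 0.25 for this symmetric layout
```

As expected, the bound tightens as anchors surround the target and as the ranging noise decreases.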

  19. DEFACTO: A Design Environment for Adaptive Computing Technology

    National Research Council Canada - National Science Library

    Hall, Mary

    2003-01-01

    This report describes the activities of the DEFACTO project, a Design Environment for Adaptive Computing Technology funded under the DARPA Adaptive Computing Systems and Just-In-Time-Hardware programs...

  20. A non-local computational boundary condition for duct acoustics

    Science.gov (United States)

    Zorumski, William E.; Watson, Willie R.; Hodge, Steve L.

    1994-01-01

    A non-local boundary condition is formulated for acoustic waves in ducts without flow. The ducts are two dimensional with constant area, but with variable impedance wall lining. Extension of the formulation to three dimensional and variable area ducts is straightforward in principle, but requires significantly more computation. The boundary condition simulates a nonreflecting wave field in an infinite duct. It is implemented by a constant matrix operator which is applied at the boundary of the computational domain. An efficient computational solution scheme is developed which allows calculations for high frequencies and long duct lengths. This computational solution utilizes the boundary condition to limit the computational space while preserving the radiation boundary condition. The boundary condition is tested for several sources. It is demonstrated that the boundary condition can be applied close to the sound sources, rendering the computational domain small. Computational solutions with the new non-local boundary condition are shown to be consistent with the known solutions for nonreflecting wavefields in an infinite uniform duct.

  1. 5 CFR 531.245 - Computing locality rates and special rates for GM employees.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Computing locality rates and special... Gm Employees § 531.245 Computing locality rates and special rates for GM employees. Locality rates and special rates are computed for GM employees in the same manner as locality rates and special rates...

  2. The DIII-D Computing Environment: Characteristics and Recent Changes

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1999-01-01

    The DIII-D tokamak national fusion research facility along with its predecessor Doublet III has been operating for over 21 years. The DIII-D computing environment consists of real-time systems controlling the tokamak, heating systems, and diagnostics, and systems acquiring experimental data from instrumentation; major data analysis server nodes performing short term and long term data access and data analysis; and systems providing mechanisms for remote collaboration and the dissemination of information over the world wide web. Computer systems for the facility have undergone incredible changes over the course of time as the computer industry has changed dramatically. Yet there are certain valuable characteristics of the DIII-D computing environment that have been developed over time and have been maintained to this day. Some of these characteristics include: continuous computer infrastructure improvements, distributed data and data access, computing platform integration, and remote collaborations. These characteristics are being carried forward as well as new characteristics resulting from recent changes which have included: a dedicated storage system and a hierarchical storage management system for raw shot data, various further infrastructure improvements including deployment of Fast Ethernet, the introduction of MDSplus, LSF and common IDL based tools, and improvements to remote collaboration capabilities. This paper will describe this computing environment, important characteristics that over the years have contributed to the success of DIII-D computing systems, and recent changes to computer systems.

  3. 5 CFR 531.607 - Computing hourly, daily, weekly, and biweekly locality rates.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Computing hourly, daily, weekly, and... Computing hourly, daily, weekly, and biweekly locality rates. (a) Apply the following methods to convert an... firefighter whose pay is computed under 5 U.S.C. 5545b, a firefighter hourly locality rate is computed using a...
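
    The general conversion rule excerpted above (divide the annual rate by 2,087 to obtain an hourly locality rate, then multiply out to longer pay periods) can be sketched as follows. The rounding behaviour and the exclusion of the firefighter special case are simplifying assumptions here, not a restatement of the regulation.

```python
def hourly_locality_rate(annual_rate: float) -> float:
    """Annual locality rate -> hourly rate: divide by 2,087 and round
    to the nearest cent (standard full-time employees; the firefighter
    case under 5 U.S.C. 5545b uses a different divisor and is not
    handled in this sketch)."""
    return round(annual_rate / 2087, 2)

def biweekly_locality_rate(annual_rate: float, hours: float = 80.0) -> float:
    """Biweekly rate = hourly rate x hours in the biweekly pay period
    (80 for a standard full-time schedule)."""
    return round(hourly_locality_rate(annual_rate) * hours, 2)

print(hourly_locality_rate(62087.0))    # 29.75
print(biweekly_locality_rate(62087.0))  # 2380.0
```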

  4. Micro-computer cards for hard industrial environment

    Energy Technology Data Exchange (ETDEWEB)

    Breton, J M

    1984-03-15

    Approximately 60% of present or future distributed systems have, or will have, operational units installed in hard environments. These applications, which include piping and industrial motor control, robotics and process control, require systems that can easily be deployed in environments not designed for electronic use. The development of card systems for the hard industrial environments found in the petrochemical industry and in mines is described. The CMOS technology of the National Semiconductor CIM card system allows real-time microcomputer applications to be efficient and functional in hard industrial environments.

  5. Scheduling multimedia services in cloud computing environment

    Science.gov (United States)

    Liu, Yunchang; Li, Chunlin; Luo, Youlong; Shao, Yanling; Zhang, Jing

    2018-02-01

    Currently, security is a critical factor for multimedia services running in the cloud computing environment. As an effective mechanism, trust can improve the security level and mitigate attacks within cloud computing environments. Unfortunately, existing scheduling strategies for multimedia services in the cloud computing environment do not integrate a trust mechanism when making scheduling decisions. In this paper, we propose a scheduling scheme for multimedia services in multi-clouds. First, a novel scheduling architecture is presented. Then, we build a trust model including both subjective trust and objective trust to evaluate the trust degree of multimedia service providers. By employing Bayesian theory, the subjective trust degree between multimedia service providers and users is obtained. According to the attributes of QoS, the objective trust degree of multimedia service providers is calculated. Finally, a scheduling algorithm integrating the trust of entities is proposed by considering the deadline, cost and trust requirements of multimedia services. The scheduling algorithm heuristically hunts for reasonable resource allocations that satisfy the trust requirements and meet the deadlines of the multimedia services. Detailed simulation experiments demonstrate the effectiveness and feasibility of the proposed trust scheduling scheme.
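
    The trust model described above combines a Bayesian subjective component with a QoS-based objective component. The paper's exact formulas are not given in the abstract, so the sketch below uses a common Beta-posterior estimate and a weighted QoS sum, blended with an assumed parameter alpha; all names, weights, and the blending rule are illustrative assumptions.

```python
def subjective_trust(positive: int, negative: int) -> float:
    """Bayesian estimate from past interactions: the mean of a
    Beta(positive + 1, negative + 1) posterior."""
    return (positive + 1) / (positive + negative + 2)

def objective_trust(qos: dict, weights: dict) -> float:
    """Weighted sum of normalized QoS attributes (each in [0, 1])."""
    return sum(weights[k] * qos[k] for k in weights)

def trust_degree(positive, negative, qos, weights, alpha=0.5):
    """Overall trust of a provider: alpha blends the subjective and
    objective parts (alpha=0.5 is an arbitrary choice here)."""
    return alpha * subjective_trust(positive, negative) + \
           (1 - alpha) * objective_trust(qos, weights)

qos = {"availability": 0.9, "bandwidth": 0.8}
w = {"availability": 0.6, "bandwidth": 0.4}
t = trust_degree(positive=8, negative=2, qos=qos, weights=w)
print(round(t, 3))  # 0.805
```

A scheduler would then rank candidate providers by this trust degree alongside deadline and cost constraints.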

  6. Embedding Moodle into Ubiquitous Computing Environments

    NARCIS (Netherlands)

    Glahn, Christian; Specht, Marcus

    2010-01-01

    Glahn, C., & Specht, M. (2010). Embedding Moodle into Ubiquitous Computing Environments. In M. Montebello, et al. (Eds.), 9th World Conference on Mobile and Contextual Learning (MLearn2010) (pp. 100-107). October, 19-22, 2010, Valletta, Malta.

  7. Computers and the Environment: Minimizing the Carbon Footprint

    Science.gov (United States)

    Kaestner, Rich

    2009-01-01

    Computers can be good and bad for the environment; one can maximize the good and minimize the bad. When dealing with environmental issues, it's difficult to ignore the computing infrastructure. With an operations carbon footprint equal to the airline industry's, computer energy use is only part of the problem; everyone is also dealing with the use…

  8. The sociability of computer-supported collaborative learning environments

    NARCIS (Netherlands)

    Kreijns, C.J.; Kirschner, P.A.; Jochems, W.M.G.

    2002-01-01

    There is much positive research on computer-supported collaborative learning (CSCL) environments in asynchronous distributed learning groups (DLGs). There is also research that shows that contemporary CSCL environments do not completely fulfil expectations on supporting interactive group learning,

  9. The efficacy of control environment as fraud deterrence in local government

    Directory of Open Access Journals (Sweden)

    Nuswantara Dian Anita

    2017-12-01

    Full Text Available In a globalised scenario, the enormous increase of malfeasance in local governments, posing catastrophic threats that stem from a vicious bureaucratic apparatus, has become a global phenomenon. This study uses case study material on the risk management control system, especially the control environment, in Indonesian local governments to extend existing theory by developing a contingency theory for the public sector. Within local government, contingency theory has emerged as a lens for exploring the links between public sector initiatives to improve risk mitigation and the structure of the control system. The case illustrates that the control environment of a local government can serve as a springboard for fraud deterrence, while its weaknesses can become loopholes in government control systems.

  10. Spike-timing-based computation in sound localization.

    Directory of Open Access Journals (Sweden)

    Dan F M Goodman

    2010-11-01

    Full Text Available Spike timing is precise in the auditory system and it has been argued that it conveys information about auditory stimuli, in particular about the location of a sound source. However, beyond simple time differences, the way in which neurons might extract this information is unclear and the potential computational advantages are unknown. The computational difficulty of this task for an animal is to locate the source of an unexpected sound from two monaural signals that are highly dependent on the unknown source signal. In neuron models consisting of spectro-temporal filtering and spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. The model was able to accurately estimate the location of previously unknown sounds in both azimuth and elevation (including front/back discrimination) in a known acoustic environment. We found that multiple representations of different acoustic environments could coexist as sets of overlapping neural assemblies which could be associated with spatial locations by Hebbian learning. The model demonstrates the computational relevance of relative spike timing to extract spatial information about sources independently of the source signal.
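
    The spiking-synchrony model above goes beyond simple time differences; as a point of comparison, the classical way to extract an interaural time difference (ITD) from two monaural signals is cross-correlation. A minimal sketch of that baseline follows (this is not the paper's model, and the signal parameters are arbitrary).

```python
import numpy as np

def itd_by_crosscorrelation(left, right, fs):
    """Estimate the interaural time difference: the lag (in seconds) by
    which the right-ear signal trails the left-ear signal, found as the
    peak of the cross-correlation of the two signals."""
    corr = np.correlate(right, left, mode="full")
    lag = int(np.argmax(corr)) - (len(left) - 1)
    return lag / fs

fs = 44100
t = np.arange(0, 0.01, 1 / fs)          # 10 ms of signal
sig = np.sin(2 * np.pi * 500 * t)       # 500 Hz tone
delay = 10                              # right ear hears it 10 samples later
left = np.concatenate([sig, np.zeros(delay)])
right = np.concatenate([np.zeros(delay), sig])
print(itd_by_crosscorrelation(left, right, fs) * 1e6, "microseconds")
```

For this toy input the estimate recovers the imposed delay of 10 samples, about 227 microseconds at 44.1 kHz, roughly the scale of real ITDs.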

  11. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment.

    Science.gov (United States)

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-12-01

    Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. In this work, we accelerated the Feldkamp-Davis-Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function to aggregate the partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. Speedup of the reconstruction time was found to be roughly linear in the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10⁻⁷. Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed
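
    The Map/Reduce split described above (Map filters and backprojects subsets of projections, Reduce sums the partial volumes) can be illustrated with a toy serial sketch. The actual FDK filtering and cone-beam geometry are replaced by a placeholder, and all names are assumptions; on Hadoop the map calls below would run in parallel across the cluster.

```python
import numpy as np
from functools import reduce

def map_backproject(projection_subset, volume_shape):
    """Map step (sketch): filter and backproject a subset of projections
    into a partial volume. A uniform smear of each projection's mean
    stands in for the real filtered backprojection."""
    partial = np.zeros(volume_shape)
    for proj in projection_subset:
        partial += proj.mean()   # placeholder for FDK filter + backprojection
    return partial

def reduce_volumes(a, b):
    """Reduce step: aggregate partial backprojections into one volume."""
    return a + b

# Toy data: 8 "projections", split into 4 map tasks of 2 projections each.
projections = [np.full((4, 4), i, dtype=float) for i in range(8)]
subsets = [projections[i:i + 2] for i in range(0, 8, 2)]
partials = [map_backproject(s, (4, 4)) for s in subsets]   # parallelizable
volume = reduce(reduce_volumes, partials)
print(volume[0, 0])   # 0 + 1 + ... + 7 = 28.0
```

Because addition is associative, the Reduce step can combine partial volumes in any order, which is what lets the framework schedule the aggregation freely across nodes.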

  12. Barcode based localization system in indoor environment

    Directory of Open Access Journals (Sweden)

    Ľubica Ilkovičová

    2014-12-01

    Full Text Available Nowadays, in the era of intelligent buildings, there is a need to create indoor navigation systems, which remains a challenge. QR (Quick Response) codes provide accurate localization also in indoor environments, where other navigation techniques (e.g. GPS) are not available. The paper deals with the issues of positioning using QR codes, solved at the Department of Surveying, Faculty of Civil Engineering, SUT in Bratislava. The operating principle of QR codes and a description of the application for positioning in indoor environments, based on OS Android for smartphones, are presented.
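
    A QR-based positioning application of the kind described reduces, at its core, to decoding a position payload from the scanned code. The payload format below (floor;x;y in metres) and the function names are assumptions for illustration, not the format used by the authors.

```python
# Sketch: each QR code placed in the building encodes its own position.
def parse_qr_payload(payload: str) -> dict:
    """Decode an assumed 'floor;x;y' payload into a position record."""
    floor, x, y = payload.split(";")
    return {"floor": int(floor), "x": float(x), "y": float(y)}

def locate(payload: str, scan_offset=(0.0, 0.0)) -> dict:
    """User position = QR code position plus a known offset of the
    scanning spot from the code (e.g. standing distance from a wall)."""
    pos = parse_qr_payload(payload)
    pos["x"] += scan_offset[0]
    pos["y"] += scan_offset[1]
    return pos

print(locate("2;12.5;3.0", scan_offset=(0.0, 1.2)))
```

The accuracy of such a scheme is bounded by how precisely the codes are surveyed in place, which is why the paper approaches the problem from a surveying perspective.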

  13. Intraspecific Colour Variation among Lizards in Distinct Island Environments Enhances Local Camouflage.

    Science.gov (United States)

    Marshall, Kate L A; Philpot, Kate E; Damas-Moreira, Isabel; Stevens, Martin

    2015-01-01

    Within-species colour variation is widespread among animals. Understanding how this arises can elucidate evolutionary mechanisms, such as those underlying reproductive isolation and speciation. Here, we investigated whether five island populations of Aegean wall lizards (Podarcis erhardii) have more effective camouflage against their own (local) island substrates than against other (non-local) island substrates to avian predators, and whether this was linked to island differences in substrate appearance. We also investigated whether degree of local substrate matching varied among island populations and between sexes. In most populations, both sexes were better matched against local backgrounds than against non-local backgrounds, particularly in terms of luminance (perceived lightness), which usually occurred when local and non-local backgrounds were different in appearance. This was found even between island populations that historically had a land connection and in populations that have been isolated relatively recently, suggesting that isolation in these distinct island environments has been sufficient to cause enhanced local background matching, sometimes on a rapid evolutionary time-scale. However, heightened local matching was poorer in populations inhabiting more variable and unstable environments with a prolonged history of volcanic activity. Overall, these results show that lizard coloration is tuned to provide camouflage in local environments, either due to genetic adaptation or changes during development. Yet, the occurrence and extent of selection for local matching may depend on specific conditions associated with local ecology and biogeographic history. These results emphasize how anti-predator adaptations to different environments can drive divergence within a species, which may contribute to reproductive isolation among populations and lead to ecological speciation.

  14. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    OpenAIRE

    Guohua Fang; Ting Wang; Xinyi Si; Xin Wen; Yu Liu

    2016-01-01

    To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and out...

  15. An Introduction to Computer Forensics: Gathering Evidence in a Computing Environment

    Directory of Open Access Journals (Sweden)

    Henry B. Wolfe

    2001-01-01

    Full Text Available Business has become increasingly dependent on the Internet and computing to operate. It has become apparent that there are issues of evidence gathering in a computing environment, which by their nature are technical and different to other forms of evidence gathering, that must be addressed. This paper offers an introduction to some of the technical issues surrounding this new and specialized field of Computer Forensics. It attempts to identify and describe sources of evidence that can be found on disk data storage devices in the course of an investigation. It also considers sources of copies of email, which can be used in evidence, as well as case building.

  16. Environments for online maritime simulators with cloud computing capabilities

    Science.gov (United States)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities for distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation using advanced technologies and distributed applications, with remote ship scenarios and automation of ship operations.

  17. Intraspecific Colour Variation among Lizards in Distinct Island Environments Enhances Local Camouflage

    Science.gov (United States)

    Marshall, Kate L. A.; Philpot, Kate E.; Damas-Moreira, Isabel; Stevens, Martin

    2015-01-01

    Within-species colour variation is widespread among animals. Understanding how this arises can elucidate evolutionary mechanisms, such as those underlying reproductive isolation and speciation. Here, we investigated whether five island populations of Aegean wall lizards (Podarcis erhardii) have more effective camouflage against their own (local) island substrates than against other (non-local) island substrates to avian predators, and whether this was linked to island differences in substrate appearance. We also investigated whether degree of local substrate matching varied among island populations and between sexes. In most populations, both sexes were better matched against local backgrounds than against non-local backgrounds, particularly in terms of luminance (perceived lightness), which usually occurred when local and non-local backgrounds were different in appearance. This was found even between island populations that historically had a land connection and in populations that have been isolated relatively recently, suggesting that isolation in these distinct island environments has been sufficient to cause enhanced local background matching, sometimes on a rapid evolutionary time-scale. However, heightened local matching was poorer in populations inhabiting more variable and unstable environments with a prolonged history of volcanic activity. Overall, these results show that lizard coloration is tuned to provide camouflage in local environments, either due to genetic adaptation or changes during development. Yet, the occurrence and extent of selection for local matching may depend on specific conditions associated with local ecology and biogeographic history. These results emphasize how anti-predator adaptations to different environments can drive divergence within a species, which may contribute to reproductive isolation among populations and lead to ecological speciation. PMID:26372454

  18. Intraspecific Colour Variation among Lizards in Distinct Island Environments Enhances Local Camouflage.

    Directory of Open Access Journals (Sweden)

    Kate L A Marshall

    Full Text Available Within-species colour variation is widespread among animals. Understanding how this arises can elucidate evolutionary mechanisms, such as those underlying reproductive isolation and speciation. Here, we investigated whether five island populations of Aegean wall lizards (Podarcis erhardii) have more effective camouflage against their own (local) island substrates than against other (non-local) island substrates to avian predators, and whether this was linked to island differences in substrate appearance. We also investigated whether degree of local substrate matching varied among island populations and between sexes. In most populations, both sexes were better matched against local backgrounds than against non-local backgrounds, particularly in terms of luminance (perceived lightness), which usually occurred when local and non-local backgrounds were different in appearance. This was found even between island populations that historically had a land connection and in populations that have been isolated relatively recently, suggesting that isolation in these distinct island environments has been sufficient to cause enhanced local background matching, sometimes on a rapid evolutionary time-scale. However, heightened local matching was poorer in populations inhabiting more variable and unstable environments with a prolonged history of volcanic activity. Overall, these results show that lizard coloration is tuned to provide camouflage in local environments, either due to genetic adaptation or changes during development. Yet, the occurrence and extent of selection for local matching may depend on specific conditions associated with local ecology and biogeographic history. These results emphasize how anti-predator adaptations to different environments can drive divergence within a species, which may contribute to reproductive isolation among populations and lead to ecological speciation.

  19. A high performance scientific cloud computing environment for materials simulations

    Science.gov (United States)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  20. The Computer Revolution in Science: Steps towards the realization of computer-supported discovery environments

    NARCIS (Netherlands)

    de Jong, Hidde; Rip, Arie

    1997-01-01

    The tools that scientists use in their search processes together form so-called discovery environments. The promise of artificial intelligence and other branches of computer science is to radically transform conventional discovery environments by equipping scientists with a range of powerful

  1. Local computations in Dempster-Shafer theory of evidence

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim

    2012-01-01

    Roč. 53, č. 8 (2012), s. 1155-1167 ISSN 0888-613X Grant - others:GA ČR(CZ) GAP403/12/2175 Program:GA Institutional support: RVO:67985556 Keywords: Discrete belief functions * Dempster-Shafer theory * conditional independence * decomposable model Subject RIV: IN - Informatics, Computer Science Impact factor: 1.729, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/jirousek-local computations in dempster–shafer theory of evidence.pdf

  2. Opisthorchiasis in Northeastern Thailand: Effect of local environment and culture

    Directory of Open Access Journals (Sweden)

    Beuy Joob

    2015-06-01

    Full Text Available Opisthorchiasis is a trematode infection. This parasitic infestation causes a chronic hepatobiliary tract infection and chronic irritation that can finally lead to cholangiocarcinoma. It is highly endemic in the northeastern region of Thailand and contributes to many cholangiocarcinoma cases annually. The attempt to control the disease has become a national policy. However, poor sanitation remains a major underlying factor leading to infection, while the poverty and low education of the local people are an important concern. In this opinion piece, the authors discuss the effect of local environment and culture on opisthorchiasis in northeastern Thailand. Owing to changing local environments, global warming and globalization, the epidemiology of the disease remains dynamic.

  3. High performance computing network for cloud environment using simulators

    OpenAIRE

    Singh, N. Ajith; Hemalatha, M.

    2012-01-01

    Cloud computing is the next generation of computing. Adopting cloud computing is like signing up for a new kind of website: the GUI that controls the cloud directly manages the hardware resources and your applications. The difficult part of cloud computing is deploying it in a real environment. It is difficult to know the exact cost and resource requirements until the service is actually purchased, and likewise whether it will support the existing applications that run on traditional...

  4. Enhancing Security by System-Level Virtualization in Cloud Computing Environments

    Science.gov (United States)

    Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei

    Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run the variety of operating systems needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and the service and deployment models are introduced. Security issues and challenges in the implementation of cloud computing are then analyzed. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.

  5. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  6. The Contribution of Local Environments to Competence Creation in Multinational Enterprises

    DEFF Research Database (Denmark)

    Andersson, Ulf; Dellestrand, Henrik; Pedersen, Torben

    2014-01-01

    This paper examines the competence development of subsidiaries in multinational enterprises. We analyze how local subsidiary environments affect the development of technological and business competencies among other units in the multinational enterprise. We test our predictions using data from 2,107 foreign-owned subsidiaries located in seven European countries, by means of structural equation modeling — namely, LISREL. By bringing the local environment to the fore, we contribute to the literature on the emergence and determinants of firm-specific advantages. We link local subsidiary environments ... throughout the organization. Thus, we contribute to an enhanced understanding of location as a determinant of the creation of units of competence and centers of excellence within multinational enterprises. In other words, we demonstrate that country-specific advantages are beneficial for competence creation...

  7. CERR: A computational environment for radiotherapy research

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Blanco, Angel I.; Clark, Vanessa H.

    2003-01-01

    A software environment is described, called the computational environment for radiotherapy research (CERR, pronounced 'sir'). CERR partially addresses four broad needs in treatment planning research: (a) it provides a convenient and powerful software environment to develop and prototype treatment planning concepts, (b) it serves as a software integration environment to combine treatment planning software written in multiple languages (MATLAB, FORTRAN, C/C++, JAVA, etc.), together with treatment plan information (computed tomography scans, outlined structures, dose distributions, digital films, etc.), (c) it provides the ability to extract treatment plans from disparate planning systems using the widely available AAPM/RTOG archiving mechanism, and (d) it provides a convenient and powerful tool for sharing and reproducing treatment planning research results. The functional components currently being distributed, including source code, include: (1) an import program which converts the widely available AAPM/RTOG treatment planning format into a MATLAB cell-array data object, facilitating manipulation; (2) viewers which display axial, coronal, and sagittal computed tomography images, structure contours, digital films, and isodose lines or dose colorwash, (3) a suite of contouring tools to edit and/or create anatomical structures, (4) dose-volume and dose-surface histogram calculation and display tools, and (5) various predefined commands. CERR allows the user to retrieve any AAPM/RTOG key word information about the treatment plan archive. The code is relatively self-describing, because it relies on MATLAB structure field name definitions based on the AAPM/RTOG standard. New structure field names can be added dynamically or permanently. New components of arbitrary data type can be stored and accessed without disturbing system operation. CERR has been applied to aid research in dose-volume-outcome modeling, Monte Carlo dose calculation, and treatment planning optimization

  8. A high-resolution computational localization method for transcranial magnetic stimulation mapping.

    Science.gov (United States)

    Aonuma, Shinta; Gomez-Tames, Jose; Laakso, Ilkka; Hirata, Akimasa; Takakura, Tomokazu; Tamura, Manabu; Muragaki, Yoshihiro

    2018-05-15

    Transcranial magnetic stimulation (TMS) is used for the mapping of brain motor functions. The complexity of the brain makes it difficult to determine the exact localization of the stimulation site using simplified methods (e.g., the region below the center of the TMS coil) or conventional computational approaches. This study aimed to present a high-precision localization method for a specific motor area by synthesizing computed non-uniform current distributions in the brain for multiple sessions of TMS. Peritumoral mapping by TMS was conducted on patients who had intra-axial brain neoplasms located within or close to the motor speech area. The electric field induced by TMS was computed using realistic head models constructed from magnetic resonance images of the patients. A post-processing method was implemented to determine a TMS hotspot by combining the computed electric fields for the coil orientations and positions that delivered high motor-evoked potentials during peritumoral mapping. The method was compared to the stimulation site localized via intraoperative direct brain stimulation and navigated TMS. Four main results were obtained: 1) the dependence of the computed hotspot area on the number of peritumoral measurements was evaluated; 2) the estimated localization of the hand motor area in eight non-affected hemispheres was in good agreement with the position of the so-called "hand-knob"; 3) the estimated hotspot areas were not sensitive to variations in tissue conductivity; and 4) the hand motor areas estimated by the proposed method and by direct electric stimulation (DES) were in good agreement in the ipsilateral hemisphere of four glioma patients. The TMS localization method was validated by the well-known position of the "hand-knob" in the non-affected hemisphere, and by a hotspot localized via DES during awake craniotomy for the tumor-containing hemisphere. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Local computer network of the JINR Neutron Physics Laboratory

    International Nuclear Information System (INIS)

    Alfimenkov, A.V.; Vagov, V.A.; Vajdkhadze, F.

    1988-01-01

    A new high-speed local computer network, with an intelligent network adapter (NA) as its hardware base, has been developed in the JINR Neutron Physics Laboratory to increase operating efficiency and data transfer rate. The NA consists of a computer bus interface, a cable former, and a microcomputer segment designed both for program realization of the channel-level protocol and for organizing bidirectional information transfer through a direct-access channel between the monochannel and computer memory, with or without buffering in the NA's operating memory

  10. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs.

    Directory of Open Access Journals (Sweden)

    Graham Cormode

    Full Text Available Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items in a large database of complex objects. Due to the very large scale of the data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance and are suitable for deployment at very large scale. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
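
    The banding trick that such LSH variants build on can be illustrated with a minimal MinHash-LSH sketch. This is a generic illustration, not the authors' Hadoop implementation; the character shingling, the seed count, and the band/row split chosen below are arbitrary assumptions.

    ```python
    import random
    from collections import defaultdict

    def shingles(text, k=3):
        """Character k-shingles of a query string."""
        return {text[i:i + k] for i in range(max(1, len(text) - k + 1))}

    def minhash_signature(shingle_set, hash_seeds, prime=2_147_483_647):
        """MinHash signature: for each seeded hash function, keep the minimum value."""
        return [min((a * hash(s) + b) % prime for s in shingle_set)
                for a, b in hash_seeds]

    def lsh_buckets(signatures, bands=4, rows=2):
        """Band each signature; items sharing any band bucket become candidate pairs."""
        buckets = defaultdict(set)
        for item, sig in signatures.items():
            for band in range(bands):
                chunk = tuple(sig[band * rows:(band + 1) * rows])
                buckets[(band, chunk)].add(item)
        return buckets

    random.seed(0)
    seeds = [(random.randrange(1, 1 << 30), random.randrange(0, 1 << 30)) for _ in range(8)]
    queries = ["cheap flights to rome", "cheap flight to rome", "weather in oslo"]
    sigs = {q: minhash_signature(shingles(q), seeds) for q in queries}
    # Any bucket holding more than one query is a candidate near-duplicate set.
    candidates = {frozenset(v) for v in lsh_buckets(sigs).values() if len(v) > 1}
    ```

    Only candidate pairs that collide in at least one band need an exact similarity check, which is what keeps the cost sublinear in the number of items.
    
    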

  11. Distributed Computations Environment Protection Using Artificial Immune Systems

    Directory of Open Access Journals (Sweden)

    A. V. Moiseev

    2011-12-01

    Full Text Available In this article the authors describe the possibility of applying artificial immune systems to protect distributed computing environments from certain types of malicious impacts.

  12. THE VALUE OF CLOUD COMPUTING IN THE BUSINESS ENVIRONMENT

    OpenAIRE

    Mircea GEORGESCU; Marian MATEI

    2013-01-01

    Without any doubt, cloud computing has become one of the most significant trends in any enterprise, not only for IT businesses. Besides the fact that the cloud can offer access to low-cost, considerably flexible computing resources, cloud computing also provides the capacity to create a new relationship between business entities and corporate IT departments. The value added to the business environment is given by the balanced use of resources offered by cloud computing. The cloud mentality i...

  13. Smile (System/Machine-Independent Local Environment)

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, J.G.

    1988-04-01

    This document defines the characteristics of Smile, a System/machine-independent local environment. This environment consists primarily of a number of primitives (types, macros, procedure calls, and variables) that a program may use; these primitives provide facilities, such as memory allocation, timing, tasking and synchronization, beyond those typically provided by a programming language. The intent is that a program will be portable from system to system and from machine to machine if it relies only on the portable aspects of its programming language and on the Smile primitives. For this to be so, Smile itself must be implemented on each system and machine, most likely using non-portable constructions; that is, while the environment provided by Smile is intended to be portable, the implementation of Smile is not necessarily so. In order to make the implementation of Smile as easy as possible and thereby expedite the porting of programs to a new system or a new machine, Smile has been defined to provide a minimal portable environment; that is, simple primitives are defined, out of which more complex facilities may be constructed using portable procedures. The implementation of Smile can be any of the following: the underlying software environment for the operating system of an otherwise "bare" machine, a "guest" system environment built upon a preexisting operating system, an environment within a "user" process run by an operating system, or a single environment for an entire machine, encompassing both system and "user" processes. In the first three of these cases the tasks provided by Smile are "lightweight processes" multiplexed within preexisting processes or the system, while in the last case they also include the system processes themselves.

  14. Student Advising and Retention Application in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Gurdeep S Hura

    2016-11-01

    Full Text Available This paper proposes a new user-friendly application that enhances and expands the current advising services of Gradesfirst, the tool currently used for advising and retention by the Athletic Department of UMES, with a view to implementing new performance activities such as mentoring, tutoring, scheduling, and study hall hours within the existing tool. This application includes various measurements that can be used to monitor and improve the performance of students in the Athletic Department of UMES by monitoring students' weekly study hall hours and tutoring schedules. It also supervises tutors' login and logout activities in order to monitor their effectiveness, supervises tutor-tutee interaction, and stores and analyzes the overall academic progress of each student. A dedicated server for providing these services will be developed at the local site. The work has been implemented in three steps. The first step involves the creation of an independent cloud computing environment that provides resources such as database creation, query-based statistical data, performance measurement activities, and automated support of performance measures such as advising, mentoring, monitoring and tutoring. The second step involves the creation of an application known as the Student Advising and Retention (SAR) application in a cloud computing environment. This application has been designed as a comprehensive database management system containing relevant data on student academic development, supporting strategic advising and monitoring of students. The third step involves the creation of systematic advising charts and frameworks which help advisors. The paper shows ways of creating the most appropriate advising technique based on the student's academic needs. The proposed application runs on a Windows-based system. As stated above, the proposed application is expected to enhance and expand the current advising service of the Gradesfirst tool. A brief

  15. Airborne Cloud Computing Environment (ACCE)

    Science.gov (United States)

    Hardman, Sean; Freeborn, Dana; Crichton, Dan; Law, Emily; Kay-Im, Liz

    2011-01-01

    Airborne Cloud Computing Environment (ACCE) is JPL's internal investment to improve the return on airborne missions, both by improving the development performance of the data system and by improving the return on the captured science data. The investment is to develop a common science data system capability for airborne instruments that encompasses the end-to-end lifecycle, covering planning, provisioning of data system capabilities, and support for scientific analysis, in order to improve the quality, cost effectiveness, and capabilities that enable new scientific discovery and research in earth observation.

  16. A high performance scientific cloud computing environment for materials simulations

    OpenAIRE

    Jorissen, Kevin; Vila, Fernando D.; Rehr, John J.

    2011-01-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including...

  17. Dynamic provisioning of local and remote compute resources with OpenStack

    Science.gov (United States)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events and for Monte Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT participates in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution for virtualizing hardware and offering additional services like storage and virtual machine management. This contribution reports on the incorporation of the institute's desktop machines into a private OpenStack cloud. The additional compute resources provisioned via the virtual machines have been used for Monte Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows is presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point of entry for the user. Evaluations of the performance and stability of this setup and operational experiences are discussed.

  18. Design requirements for ubiquitous computing environments for healthcare professionals.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Eriksson, Henrik

    2004-01-01

    Ubiquitous computing environments can support clinical administrative routines in new ways. The aim of such computing approaches is to enhance routine physical work, thus it is important to identify specific design requirements. We studied healthcare professionals in an emergency room and developed the computer-augmented environment NOSTOS to support teamwork in that setting. NOSTOS uses digital pens and paper-based media as the primary input interface for data capture and as a means of controlling the system. NOSTOS also includes a digital desk, walk-up displays, and sensor technology that allow the system to track documents and activities in the workplace. We propose a set of requirements and discuss the value of tangible user interfaces for healthcare personnel. Our results suggest that the key requirements are flexibility in terms of system usage and seamless integration between digital and physical components. We also discuss how ubiquitous computing approaches like NOSTOS can be beneficial in the medical workplace.

  19. System administration of ATLAS TDAQ computing environment

    Science.gov (United States)

    Adeel-Ur-Rehman, A.; Bujor, F.; Benes, J.; Caramarcu, C.; Dobson, M.; Dumitrescu, A.; Dumitru, I.; Leahu, M.; Valsan, L.; Oreshkin, A.; Popov, D.; Unel, G.; Zaytsev, A.

    2010-04-01

    This contribution gives a thorough overview of the activities of the ATLAS TDAQ SysAdmin group, which administers the TDAQ computing environment supporting the High Level Trigger, Event Filter and other subsystems of the ATLAS detector operating at the LHC collider at CERN. The current installation consists of approximately 1500 netbooted nodes managed by more than 60 dedicated servers, about 40 multi-screen user interface machines installed in the control rooms, and various hardware and service monitoring machines as well. In the final configuration, the online computer farm will be capable of hosting tens of thousands of applications running simultaneously. The software distribution requirements are matched by a two-level NFS-based solution. Hardware and network monitoring systems of ATLAS TDAQ are based on NAGIOS, with a MySQL cluster behind it for accounting and storing the collected monitoring data, IPMI tools, CERN LANDB and dedicated tools developed by the group, e.g. ConfdbUI. The user management schema deployed in the TDAQ environment is founded on an LDAP-based authentication and role management system. External access to the ATLAS online computing facilities is provided by means of gateways supplied with an accounting system as well. Current activities of the group include deployment of a centralized storage system, testing and validating hardware solutions for future use within the ATLAS TDAQ environment including new multi-core blade servers, developing GUI tools for user authentication and role management, testing and validating 64-bit OSes, and upgrading the existing TDAQ hardware components, authentication servers and gateways.

  20. Personal computer local networks report

    CERN Document Server

    1991-01-01

    Please note this is a Short Discount publication. Since the first microcomputer local networks of the late 1970s and early 1980s, personal computer LANs have expanded in popularity, especially since the introduction of IBM's first PC in 1981. The late 1980s saw a maturing of the industry, with only a few vendors maintaining a large share of the market. This report is intended to give the reader a thorough understanding of the technology used to build these systems ... from cable to chips ... to ... protocols to servers. The report also fully defines PC LANs and the marketplace, with in-

  1. Local Authority Empowerment towards Quality Living Environment for Coastal Reclamation Area

    Directory of Open Access Journals (Sweden)

    Yusup Mohammad

    2016-01-01

    Full Text Available A good urban governance system is the key to successful physical planning development. A local authority concentrates on planning administration and executes federal, state, and local policies and strategies. As the lowest level of government, it is the best authority to regulate and monitor the development process within its territory. The significance of a local authority in providing a quality living environment invites academics and professionals to ponder the best urban governance system at the local level. However, there are issues regarding the financial and technical capacity of local authorities, their legal limitations, and the development instruments adopted in providing urban services for coastal reclamation areas in Malaysia. The aim of this paper is to investigate the capability of local authorities in Malaysia to implement their functions as drawn by legislation. Hence, this paper examines the roles and functions of a local authority, as the lowest level of government administration, in providing urban services, collecting revenue, and safeguarding the physical environment in Malaysia, particularly when dealing with development in a coastal reclamation area. Primary data were gathered through face-to-face interview sessions involving government agencies and stakeholders. Legal documents, policies, and development plans were then analysed to support the primary data for further understanding of the issues concerning the capacity of a local authority, especially when providing urban services within its area. The study is expected to provide a new approach for local authorities in Malaysia in providing a quality living environment in terms of development procedure, roles and functions, legal empowerment, and decentralisation of functions, particularly in enhancing current practices at the local level.

  2. Computer simulation of spacecraft/environment interaction

    International Nuclear Information System (INIS)

    Krupnikov, K.K.; Makletsov, A.A.; Mileev, V.N.; Novikov, L.S.; Sinolits, V.V.

    1999-01-01

    This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for the estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language

  3. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

    This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for the estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language.

  4. Printing in heterogeneous computer environment at DESY

    International Nuclear Information System (INIS)

    Jakubowski, Z.

    1996-01-01

    The number of registered hosts at DESY reaches 3500, while the number of print queues approaches 150. The spectrum of computing environments used is very wide: from Macs and PCs, through SUN, DEC and SGI machines, to the IBM mainframe. In 1994 we used 18 tons of paper. We present a solution for providing print services in such an environment for more than 3500 registered users. The availability of the print service is a serious issue. Centralized printing has many advantages for software administration but creates a single point of failure. We solved this problem partially without using expensive software and hardware. The talk provides information about the DESY central print spooler concept. None of the systems available on the market provides a ready-to-use, reliable solution for all platforms used at DESY. We discuss concepts for the installation, administration and monitoring of a large number of printers. We found a solution for printing both on central computing facilities and for the support of stand-alone workstations. (author)

  5. A Multi-Sensorial Simultaneous Localization and Mapping (SLAM) System for Low-Cost Micro Aerial Vehicles in GPS-Denied Environments.

    Science.gov (United States)

    López, Elena; García, Sergio; Barea, Rafael; Bergasa, Luis M; Molinos, Eduardo J; Arroyo, Roberto; Romera, Eduardo; Pardo, Samuel

    2017-04-08

    One of the main challenges of aerial robot navigation in indoor or GPS-denied environments is position estimation using only the available onboard sensors. This paper presents a Simultaneous Localization and Mapping (SLAM) system that remotely calculates the pose and environment map of different low-cost commercial aerial platforms, whose onboard computing capacity is usually limited. The proposed system adapts to the sensory configuration of the aerial robot by integrating different state-of-the-art SLAM methods based on vision, laser and/or inertial measurements using an Extended Kalman Filter (EKF). To do this, a minimum onboard sensory configuration is assumed, consisting of a monocular camera, an Inertial Measurement Unit (IMU) and an altimeter. This improves the results of well-known monocular visual SLAM methods (LSD-SLAM and ORB-SLAM are tested and compared in this work) by solving scale ambiguity and providing additional information to the EKF. When payload and computational capabilities permit, a 2D laser sensor can easily be incorporated into the SLAM system, obtaining a local 2.5D map and a footprint estimation of the robot position that improves the 6D pose estimation through the EKF. We present experimental results with two different commercial platforms, and validate the system by applying it to their position control.
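
    The Kalman-filter fusion at the heart of such a system can be sketched in scalar form. This toy example fuses noisy altimeter readings under a constant-altitude model; it is a generic illustration of the predict/update cycle, not the authors' full 6D EKF, and the noise values are invented.

    ```python
    def kalman_predict(x, P, Q):
        """Prediction step for a constant-altitude model: state unchanged, uncertainty grows by process noise Q."""
        return x, P + Q

    def kalman_update(x, P, z, R):
        """Scalar Kalman measurement update: fuse estimate (x, P) with measurement z of variance R."""
        K = P / (P + R)          # Kalman gain: how much to trust the measurement
        x_new = x + K * (z - x)  # corrected state
        P_new = (1 - K) * P      # reduced uncertainty
        return x_new, P_new

    # Fuse noisy altimeter readings around a true altitude of 10 m, starting from a vague prior.
    x, P = 0.0, 100.0
    for z in [10.4, 9.7, 10.1, 9.9]:
        x, P = kalman_predict(x, P, Q=0.01)
        x, P = kalman_update(x, P, z, R=0.25)
    ```

    A full EKF replaces the scalars with a state vector and covariance matrix and linearizes the measurement models (camera, laser, IMU), but the gain computation follows the same pattern.
    
    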

  6. CAFE: A Computer Tool for Accurate Simulation of the Regulatory Pool Fire Environment for Type B Packages

    International Nuclear Information System (INIS)

    Gritzo, L.A.; Koski, J.A.; Suo-Anttila, A.J.

    1999-01-01

    The Container Analysis Fire Environment computer code (CAFE) is intended to provide Type B package designers with an enhanced engulfing-fire boundary condition when combined with the PATRAN/P-Thermal commercial code. Historically, an engulfing-fire boundary condition has been modeled as σT⁴, where σ is the Stefan-Boltzmann constant and T is the fire temperature. The CAFE code includes the necessary chemistry, thermal radiation, and fluid mechanics to model an engulfing fire. Effects included are the local cooling of gases that form a protective boundary layer, which reduces the incoming radiant heat flux to values lower than expected from a simple σT⁴ model. In addition, the effect of object shape on mixing, which may increase the local fire temperature, is included. Both high- and low-temperature regions that depend upon the local availability of oxygen are also calculated. Thus the competing effects that can both increase and decrease the local values of radiant heat flux are included in a manner that is not predictable a priori. The CAFE package consists of a group of computer subroutines that can be linked to workstation-based thermal analysis codes in order to predict package performance during regulatory and other accident fire scenarios.
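    The simple boundary condition that CAFE refines can be written down directly. The sketch below evaluates the historical σT⁴ flux for the regulatory pool fire temperature of 800 °C; the emissivity value is an illustrative assumption:

```python
# Illustrative only: the simple radiative boundary condition that CAFE
# improves upon. Flux from a fire at temperature T (in kelvin) follows
# the Stefan-Boltzmann law q = eps * sigma * T^4.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiant_flux(t_fire_k, emissivity=0.9):
    """Incident radiant heat flux in W/m^2 from a blackbody-like fire."""
    return emissivity * SIGMA * t_fire_k ** 4

# Regulatory pool fire is specified at 800 degrees C (1073.15 K)
q = radiant_flux(1073.15)   # roughly 68 kW/m^2
```

    CAFE's boundary-layer and mixing effects push the actual local flux above or below this value, which is why the simple formula alone is insufficient.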

  7. Creation Greenhouse Environment Map Using Localization of Edge of Cultivation Platforms Based on Stereo Vision

    Directory of Open Access Journals (Sweden)

    A Nasiri

    2017-10-01

    Full Text Available Introduction Stereo vision means the capability of extracting depth based on the analysis of two images taken from different angles of one scene. The result of stereo vision is a collection of three-dimensional points which describes the details of the scene proportional to the resolution of the obtained images. Vehicle automatic steering and crop growth monitoring are two important operations in precision agriculture. The essential aspects of automated steering are the position and orientation of the agricultural equipment in relation to the crop row, detection of obstacles, and design of path planning between the crop rows. The developed map can provide this information in real time. Machine vision has the capability to perform these tasks in order to execute operations such as cultivation, spraying and harvesting. In a greenhouse environment, it is possible to develop a map and perform automatic control by detecting and localizing the cultivation platforms as the main moving obstacle. The current work was performed to develop a method based on stereo vision for detecting and localizing platforms, and then providing a two-dimensional map of the cultivation platforms in the greenhouse environment. Materials and Methods In this research, two webcams, made by Microsoft Corporation with a resolution of 960×544, were connected to the computer via USB 2.0 in order to produce a parallel stereo camera. Due to the structure of the cultivation platforms, the number of points in the point cloud is decreased by extracting only the upper and lower edges of the platform. The proposed method aims at extracting the edges based on depth-discontinuity features in the region of the platform edge. By obtaining the disparity image of the platform edges from the rectified stereo images and translating its data to 3D space, the point cloud model of the environment is constructed. Then, by projecting the points to the XZ plane and putting local maps together
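    The triangulation step this record relies on, turning a disparity from a rectified parallel stereo pair into a 3D point, can be sketched as follows. The focal length, baseline and principal point are illustrative assumptions, not the paper's calibration:

```python
# A minimal sketch of rectified-stereo triangulation (not the authors'
# code). Depth follows Z = f * B / d for a parallel stereo rig; the
# camera parameters below are assumed values.

def disparity_to_point(u, v, d, f=700.0, baseline=0.12, cx=480.0, cy=272.0):
    """Back-project pixel (u, v) with disparity d (pixels) to the camera frame."""
    if d <= 0:
        raise ValueError("disparity must be positive")
    z = f * baseline / d          # depth along the optical axis (m)
    x = (u - cx) * z / f          # lateral offset
    y = (v - cy) * z / f          # vertical offset
    return x, y, z

# A pixel at the principal point with 42 px disparity lies 2 m away
x, y, z = disparity_to_point(480.0, 272.0, 42.0)
```

    Applying this to every edge pixel of the disparity image yields the point cloud that is then projected onto the XZ plane to build the 2D platform map.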

  8. Perspectives on Emerging/Novel Computing Paradigms and Future Aerospace Workforce Environments

    Science.gov (United States)

    Noor, Ahmed K.

    2003-01-01

    The accelerating pace of computing technology development shows no signs of abating. Computing power of 100 Tflop/s is likely to be reached by 2004, and Pflop/s (10^15 Flop/s) by 2007. The fundamental physical limits of computation, including information storage limits, communication limits and computation rate limits, will likely be reached by the middle of the present millennium. To overcome these limits, novel technologies and new computing paradigms will be developed. An attempt is made in this overview to put the diverse activities related to new computing paradigms in perspective and to set the stage for the succeeding presentations. The presentation is divided into five parts. In the first part, a brief historical account is given of the development of computer and networking technologies. The second part provides brief overviews of three emerging computing paradigms: grid, ubiquitous and autonomic computing. The third part lists future computing alternatives and the characteristics of the future computing environment. The fourth part describes future aerospace workforce research, learning and design environments. The fifth part lists the objectives of the workshop and some sources of information on future computing paradigms.

  9. An expert system for a local planning environment

    NARCIS (Netherlands)

    Meester, G.J.; Meester, G.J.

    1993-01-01

    In this paper, we discuss the design of an Expert System (ES) that supports decision making in a Local Planning System (LPS) environment. The LPS provides the link between a high level factory planning system (rough cut capacity planning and material coordination) and the actual execution of jobs on

  10. VECTR: Virtual Environment Computational Training Resource

    Science.gov (United States)

    Little, William L.

    2018-01-01

    The Westridge Middle School Curriculum and Community Night is an annual event designed to introduce students and parents to potential employers in the Central Florida area. NASA participated in the event in 2017, and has been asked to come back for the 2018 event on January 25. We will be demonstrating our Microsoft Hololens Virtual Rovers project, and the Virtual Environment Computational Training Resource (VECTR) virtual reality tool.

  11. Supporting Student Learning in Computer Science Education via the Adaptive Learning Environment ALMA

    Directory of Open Access Journals (Sweden)

    Alexandra Gasparinatou

    2015-10-01

    Full Text Available This study presents the ALMA environment (Adaptive Learning Models from texts and Activities). ALMA supports the processes of learning and assessment via: (1) texts differing in local and global cohesion for students with low, medium, and high background knowledge; (2) activities corresponding to different levels of comprehension which prompt the student to practically implement different text-reading strategies, with the recommended activity sequence adapted to the student’s learning style; (3) an overall framework for informing, guiding, and supporting students in performing the activities; and (4) individualized support and guidance according to student-specific characteristics. ALMA also supports students in distance learning or in blended learning, in which face-to-face learning is supported by computer technology. The adaptive techniques provided via ALMA are: (a) adaptive presentation and (b) adaptive navigation. Digital learning material, in accordance with the text comprehension model described by Kintsch, was introduced into the ALMA environment. This material can be exploited in either distance or blended learning.

  12. A heterogeneous computing environment to solve the 768-bit RSA challenge

    OpenAIRE

    Kleinjung, Thorsten; Bos, Joppe Willem; Lenstra, Arjen K.; Osvik, Dag Arne; Aoki, Kazumaro; Contini, Scott; Franke, Jens; Thomé, Emmanuel; Jermini, Pascal; Thiémard, Michela; Leyland, Paul; Montgomery, Peter L.; Timofeev, Andrey; Stockinger, Heinz

    2010-01-01

    In December 2009 the 768-bit, 232-digit number RSA-768 was factored using the number field sieve. Overall, the computational challenge would take more than 1700 years on a single, standard core. In this article we present the heterogeneous computing approach, involving different compute clusters and Grid computing environments, used to solve this problem.

  13. A Secure Authenticate Framework for Cloud Computing Environment

    OpenAIRE

    Nitin Nagar; Pradeep k. Jatav

    2014-01-01

    Cloud computing is an important option for companies building and deploying their infrastructure and applications. Data storage in the cloud is easy compared with other data storage services. At the same time, security in the cloud environment is a challenging task. Security issues range from missing system configuration and lack of proper updates to unwise user actions with remote data storage. These can expose users’ private data and information to unwanted access. i...

  14. Bridging context management systems for different types of pervasive computing environments

    NARCIS (Netherlands)

    Hesselman, C.E.W.; Benz, Hartmut; Benz, H.P.; Pawar, P.; Liu, F.; Wegdam, M.; Wibbels, Martin; Broens, T.H.F.; Brok, Jacco

    2008-01-01

    A context management system is a distributed system that enables applications to obtain context information about (mobile) users and forms a key component of any pervasive computing environment. Context management systems are however very environment-specific (e.g., specific for home environments)

  15. High performance computation of landscape genomic models including local indicators of spatial association.

    Science.gov (United States)

    Stucki, S; Orozco-terWengel, P; Forester, B R; Duruz, S; Colli, L; Masembe, C; Negrini, R; Landguth, E; Jones, M R; Bruford, M W; Taberlet, P; Joost, S

    2017-09-01

    With the increasing availability of both molecular and topo-climatic data, the main challenges facing landscape genomics - that is, the combination of landscape ecology with population genomics - include processing large numbers of models and distinguishing between selection and demographic processes (e.g. population structure). Several methods address the latter, either by estimating a null model of population history or by simultaneously inferring environmental and demographic effects. Here we present samβada, an approach designed to study signatures of local adaptation, with special emphasis on high performance computing of large-scale genetic and environmental data sets. samβada identifies candidate loci using genotype-environment associations while also incorporating multivariate analyses to assess the effect of many environmental predictor variables. This enables the inclusion of explanatory variables representing population structure into the models to lower the occurrence of spurious genotype-environment associations. In addition, samβada calculates local indicators of spatial association for candidate loci to provide information on whether similar genotypes tend to cluster in space, which constitutes a useful indication of the possible kinship between individuals. To test the usefulness of this approach, we carried out a simulation study and analysed a data set from Ugandan cattle to detect signatures of local adaptation with samβada, bayenv, lfmm and an FST outlier method (FDIST approach in Arlequin) and compared their results. samβada - open-source software for Windows, Linux and Mac OS X available at http://lasig.epfl.ch/sambada - outperforms other approaches and better suits whole-genome sequence data processing. © 2016 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.
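    The local indicators of spatial association that samβada reports are commonly computed as Anselin's local Moran's I. A small sketch under assumed toy data (binary neighbour weights and a clustered genotype pattern, not from the paper):

```python
# A sketch of a Local Indicator of Spatial Association (local Moran's I),
# the kind of statistic samβada computes per candidate locus to show
# whether similar genotypes cluster in space. Data and weights are
# illustrative assumptions.
import numpy as np

def local_morans_i(values, weights):
    """values: (n,) attribute per individual; weights: (n, n) spatial weights."""
    z = values - values.mean()            # deviations from the mean
    m2 = (z ** 2).sum() / len(z)          # second moment (variance term)
    return z * (weights @ z) / m2          # one I_i per location

# Four individuals on a line; neighbours share an edge
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
genotype = np.array([1.0, 1.0, 0.0, 0.0])  # spatially clustered alleles
ii = local_morans_i(genotype, w)
```

    Positive I_i values at the ends of the cluster indicate that each individual resembles its neighbours, which is the clustering signal described above.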

  16. A semi-local quasi-harmonic model to compute the thermodynamic and mechanical properties of silicon nanostructures

    International Nuclear Information System (INIS)

    Zhao, H; Aluru, N R

    2007-01-01

    This paper presents a semi-local quasi-harmonic model with local phonon density of states (LPDOS) to compute the thermodynamic and mechanical properties of silicon nanostructures at finite temperature. In contrast to an earlier approach (Tang and Aluru 2006 Phys. Rev. B 74 235441), where a quasi-harmonic model with LPDOS computed by a Green's function technique (QHMG) was developed considering many layers of atoms, the semi-local approach considers only two layers of atoms to compute the LPDOS. We show that the semi-local approach combines the accuracy of the QHMG approach and the computational efficiency of the local quasi-harmonic model. We present results for several silicon nanostructures to address the accuracy and efficiency of the semi-local approach

  17. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Science.gov (United States)

    Abdullahi, Mohammed; Ngadi, Md Asri

    2016-01-01

    Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed on a pay-as-you-go basis. Task scheduling is one of the significant research challenges in the cloud computing environment. The current formulation of task scheduling problems has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic nature of cloud resources makes optimal task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimal resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and solution quality of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploration ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Simulation results showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.
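    The simulated-annealing component that SASOS grafts onto SOS boils down to the Metropolis acceptance rule: improving schedules are always accepted, while worse ones are accepted with a probability that shrinks as the temperature cools. A minimal sketch with illustrative makespan values and cooling factor (assumptions, not the paper's parameters):

```python
# A sketch of the simulated-annealing acceptance rule used for local
# refinement in SA-based hybrids such as SASOS. Cooling schedule and
# makespan numbers are illustrative assumptions.
import math
import random

def accept(current_makespan, candidate_makespan, temperature):
    """Metropolis criterion: always accept improvements; sometimes accept worse."""
    if candidate_makespan <= current_makespan:
        return True
    return random.random() < math.exp(
        (current_makespan - candidate_makespan) / temperature)

def anneal_step(makespan, neighbour_makespan, temperature, cooling=0.95):
    """One annealing step: maybe move to the neighbour, then cool down."""
    if accept(makespan, neighbour_makespan, temperature):
        makespan = neighbour_makespan
    return makespan, temperature * cooling
```

    At high temperature the search explores freely; as the temperature decays geometrically, the rule degenerates into pure hill-climbing on makespan.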

  19. Computer Vision Using Local Binary Patterns

    CERN Document Server

    Pietikainen, Matti; Zhao, Guoying; Ahonen, Timo

    2011-01-01

    The recent emergence of Local Binary Patterns (LBP) has led to significant progress in applying texture methods to various computer vision problems and applications. The focus of this research has broadened from 2D textures to 3D textures and spatiotemporal (dynamic) textures. Also, where texture was once utilized for applications such as remote sensing, industrial inspection and biomedical image analysis, the introduction of LBP-based approaches has provided outstanding results in problems relating to face and activity analysis, with future scope for face and facial expression recognition, b
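    The basic operator behind this record is easy to state: each of the eight neighbours of a pixel contributes one bit, set when the neighbour is at least as bright as the centre. A minimal sketch (illustrative, not taken from the book):

```python
# A sketch of the basic 3x3 LBP operator: thresholding the 8 neighbours
# against the centre pixel yields an 8-bit texture code (0..255).
# The clockwise bit ordering below is one common convention.

def lbp_code(patch):
    """patch: 3x3 nested lists of grey values; returns the LBP code."""
    c = patch[1][1]
    # neighbours in clockwise order starting at the top-left corner
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (r, col) in enumerate(order):
        if patch[r][col] >= c:
            code |= 1 << bit
    return code

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
code = lbp_code(patch)
```

    Histograms of these codes over image regions form the texture descriptors used for the face and activity analysis applications mentioned above.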

  20. A Comparative Study of Load Balancing Algorithms in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    Cloud computing is a new trend emerging in the IT environment with huge requirements of infrastructure and resources. Load balancing is an important aspect of the cloud computing environment. An efficient load balancing scheme ensures efficient resource utilization by provisioning resources to cloud users on demand in a pay-as-you-go manner. Load balancing may even support prioritizing users by applying appropriate scheduling criteria. This paper presents various load balancing schemes in differ...

  1. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

    Full Text Available Today, cloud computing has become a key technology for the online allotment of computing resources and online storage of user data at lower cost, where computing resources are available all the time over the Internet on a pay-per-use basis. Recently there has been a growing need for resource management strategies in a cloud computing environment that encompass both end-user satisfaction and a high job submission throughput with appropriate scheduling. One of the major and essential issues in resource management is allocating incoming tasks to suitable virtual machines (matchmaking). The main objective of this paper is to propose a matchmaking strategy between the incoming requests and the various resources in the cloud environment, to satisfy the requirements of users and to balance the workload on resources. Load balancing is an important aspect of resource management in a cloud computing environment. This paper therefore proposes a Dynamic Weight Active Monitor (DWAM) load balancing algorithm, which allocates incoming requests on the fly to all available virtual machines in an efficient manner in order to achieve better performance parameters such as response time, processing time and resource utilization. The feasibility of the proposed algorithm is analyzed using the CloudSim simulator, which proves the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results demonstrate that the proposed algorithm dramatically improves response time and data processing time and makes better use of resources compared with the Active Monitor and VM-assign algorithms.
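    The kind of weighted matchmaking the abstract describes can be sketched as assigning each VM a dynamic weight from its spare capacity and routing each request to the highest-weight VM. The weight formula and VM names below are illustrative assumptions, not the paper's DWAM definition:

```python
# A sketch of dynamic-weight matchmaking: each VM's weight is its free
# fraction of capacity, and an incoming request goes to the VM with the
# highest weight. Names and the weight formula are assumptions.

def pick_vm(vms):
    """vms: dict name -> (capacity, current_load); returns the chosen VM name."""
    def weight(name):
        capacity, load = vms[name]
        return (capacity - load) / capacity   # fraction of capacity free
    return max(vms, key=weight)

vms = {"vm1": (100, 90),   # 10% free
       "vm2": (50, 10),    # 80% free
       "vm3": (200, 150)}  # 25% free
best = pick_vm(vms)
```

    Recomputing the weights after every assignment is what makes the scheme "on the fly": the load figures change as requests are placed, so the ranking of VMs changes with them.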

  2. Human-Computer Interaction in Smart Environments

    Science.gov (United States)

    Paravati, Gianluca; Gatteschi, Valentina

    2015-01-01

    Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  3. EDF: contribution to local development and the protection of the environment

    International Nuclear Information System (INIS)

    Parot, F.; Veyret, G.

    1995-01-01

    As a consequence of the 1982-1983 French Decentralization laws, local elected officials were entrusted with new responsibilities concerning environmental protection and local development. EDF, the French public electricity utility therefore had to respond to new demands. New forms of cooperation with the various local actors were imagined: assistance in diagnostics, working out local strategies, subcontracting and working for the establishment of new industrial plants, multi-purpose water management (dams for example), environment protection (discreet lines...), urban waste treatment, transportation, etc

  4. Running climate model on a commercial cloud computing environment: A case study using Community Earth System Model (CESM) on Amazon AWS

    Science.gov (United States)

    Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock

    2017-01-01

    The suites of numerical models used for simulating the climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e. carrying out climate model simulations in a commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Services (AWS) EC2, the cloud computing environment by Amazon.com, Inc. StarCluster is used to create a virtual computing cluster on AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup becomes nearly unchanged.
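    The scaling claim quoted above can be made concrete with the standard speedup and parallel-efficiency arithmetic. The timings below are made-up placeholder numbers, not the paper's measurements:

```python
# Illustrative scaling arithmetic for the 16 -> 64 core comparison.
# t16 and t64 are assumed wall-clock hours per simulated year, not
# the study's actual timings.

def speedup(t_base, t_scaled):
    """How many times faster the scaled run is."""
    return t_base / t_scaled

def efficiency(t_base, cores_base, t_scaled, cores_scaled):
    """Achieved speedup as a fraction of ideal linear scaling."""
    return speedup(t_base, t_scaled) / (cores_scaled / cores_base)

t16, t64 = 10.0, 4.0                 # assumed hours per simulated year
s = speedup(t16, t64)                 # 2.5x faster on 4x the cores
e = efficiency(t16, 16, t64, 64)      # 0.625 of ideal linear scaling
```

    A wall-clock reduction of "more than 50%" on 4x the cores corresponds to a speedup above 2, i.e. an efficiency above 0.5; beyond 64 cores the efficiency collapses as communication latency dominates.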

  5. An u-Service Model Based on a Smart Phone for Urban Computing Environments

    Science.gov (United States)

    Cho, Yongyun; Yoe, Hyun

    In urban computing environments, all services are based on the interaction between humans and the environments around them, which occurs frequently and ordinarily at home and in the office. This paper proposes a u-service model based on a smart phone for urban computing environments. The suggested service model includes a context-aware and personalized service scenario development environment that can instantly describe a user's u-service demand or situation information with smart devices. To do this, the architecture of the suggested service model consists of a graphical service editing environment for smart devices, a u-service platform, and an infrastructure with sensors and WSN/USN. The graphic editor expresses contexts as execution conditions of a new service through a context model based on ontology. The service platform handles the service scenario according to contexts. With the suggested service model, a user in urban computing environments can quickly and easily create a u-service or new service using smart devices.

  6. Research on Digital Forensic Readiness Design in a Cloud Computing-Based Smart Work Environment

    Directory of Open Access Journals (Sweden)

    Sangho Park

    2018-04-01

    Full Text Available Recently, the work environments of organizations have been in the process of transitioning into smart work environments by applying cloud computing technology in the existing work environment. The smart work environment has the characteristic of being able to access information assets inside the company from outside the company through cloud computing technology, share information without restrictions on location by using mobile terminals, and provide a work environment where work can be conducted effectively in various locations and mobile environments. Thus, in the cloud computing-based smart work environment, changes are occurring in terms of security risks, such as an increase in the leakage risk of an organization’s information assets through mobile terminals which have a high risk of loss and theft and increase the hacking risk of wireless networks in mobile environments. According to these changes in security risk, the reactive digital forensic method, which investigates digital evidence after the occurrence of security incidents, appears to have a limit which has led to a rise in the necessity of proactive digital forensic approaches wherein security incidents can be addressed preemptively. Accordingly, in this research, we design a digital forensic readiness model at the level of preemptive prevention by considering changes in the cloud computing-based smart work environment. Firstly, we investigate previous research related to the cloud computing-based smart work environment and digital forensic readiness and analyze a total of 50 components of digital forensic readiness. In addition, through the analysis of the corresponding preceding research, we design seven detailed areas, namely, outside the organization environment, within the organization guideline, system information, terminal information, user information, usage information, and additional function. 
Then, we design a draft of the digital forensic readiness model in the cloud

  8. Applications integration in a hybrid cloud computing environment: modelling and platform

    Science.gov (United States)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  9. A FUNCTIONAL MODEL OF COMPUTER-ORIENTED LEARNING ENVIRONMENT OF A POST-DEGREE PEDAGOGICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    Kateryna R. Kolos

    2014-06-01

    Full Text Available The study substantiates the need for a systematic study of the functioning of a computer-oriented learning environment of post-degree pedagogical education; defines the term “functional model of a computer-oriented learning environment of post-degree pedagogical education”; and builds a functional model of a computer-oriented learning environment of post-degree pedagogical education in accordance with the functions of business, information and communication technology, academic and administrative staff, and the peculiarities of teacher training courses.

  10. Urbancontext: A Management Model For Pervasive Environments In User-Oriented Urban Computing

    Directory of Open Access Journals (Sweden)

    Claudia L. Zuniga-Canon

    2014-01-01

    Full Text Available Nowadays, urban computing has gained a lot of interest for guiding the evolution of cities into intelligent environments. These environments are appropriated for individuals’ interactions, changing their behaviors. These changes require new approaches that allow an understanding of how urban computing systems should be modeled. In this work we present UrbanContext, a new model for the design of urban computing platforms that applies the theory of roles to manage the individual’s context in urban environments. The theory of roles helps to understand the individual’s behavior within a social environment, allowing the modeling of urban computing systems able to adapt to individuals’ states and their needs. UrbanContext collects data in urban atmospheres and classifies individuals’ behaviors according to their changes of role, in order to optimize social interaction and offer secure services. Likewise, UrbanContext serves as a generic model to provide interoperability and to facilitate the design, implementation and expansion of urban computing systems.

  11. Fostering computational thinking skills with a tangible blocks programming environment

    OpenAIRE

    Turchi, T; Malizia, A

    2016-01-01

    Computational Thinking has recently returned into the limelight as an essential skill to have for both the general public and disciplines outside Computer Science. It encapsulates those thinking skills integral to solving complex problems using a computer, thus widely applicable in our technological society. Several public initiatives such as the Hour of Code successfully introduced it to millions of people of different ages and backgrounds, mostly using Blocks Programming Environments like S...

  12. Distributed computing testbed for a remote experimental environment

    International Nuclear Information System (INIS)

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.

    1995-01-01

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a ''Collaboratory.'' The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility

  13. Evolution of the local environment of lanthanum during simplified SON68 glass leaching

    International Nuclear Information System (INIS)

    Jollivet, P.; Delaye, J.M.; Den Auwer, C.; Simoni, E.

    2007-01-01

    The evolution of the short- and medium-range local environment of lanthanum was determined by L-III-edge X-ray absorption spectroscopy (XAS) during leaching of simplified SON68-type glasses. In glass without phosphorus, lanthanum is found in a silicate environment, and its first coordination sphere comprises eight oxygen atoms at a mean distance of 2.51 angstrom. When this glass was leached at a high renewal rate, the lanthanum local environment was significantly modified: it was present at hydroxy-carbonate and silicate sites with a mean La-O distance of 2.56 angstrom, and the second neighbors consisted of La atoms instead of Si for the glass. Conversely, in the gel formed at low renewal rates, lanthanum was found in a silicate environment similar to that of the glass. In phosphorus-doped glass, lanthanum is found in a phosphate environment, although the Si/P atomic ratio is 20:1. Lanthanum is surrounded by seven oxygen atoms at a mean distance of 2.37 angstrom. When phosphorus-doped glass is leached, regardless of the leaching solution flow rate, the short- and medium-range lanthanum local environment remains almost constant; the most significant change is a 0.05 angstrom increase in the La-O distance. (authors)

  14. Collaborative virtual reality environments for computational science and design

    International Nuclear Information System (INIS)

    Papka, M. E.

    1998-01-01

    The authors are developing a networked, multi-user, virtual-reality-based collaborative environment coupled to one or more petaFLOPs computers, enabling the interactive simulation of 10^9-atom systems. The purpose of this work is to explore the requirements for this coupling. Through the design, development, and testing of such systems, they hope to gain knowledge that allows computational scientists to discover and analyze their results more quickly and in a more intuitive manner.

  15. LINER galaxy properties and the local environment

    Science.gov (United States)

    Coldwell, Georgina V.; Alonso, Sol; Duplancic, Fernanda; Mesa, Valeria

    2018-05-01

    We analyse the properties of a sample of 5560 low-ionization nuclear emission-line region (LINER) galaxies selected from SDSS-DR12 at low redshift, for a complete range of local density environments. The host LINER galaxies were studied and compared with a well-defined control sample of 5553 non-LINER galaxies matched in redshift, luminosity, morphology and local density. By studying the distributions of galaxy colours and the stellar age population, we find that LINERs are redder and older than the control sample over a wide range of densities. In addition, LINERs are older than the control sample, at a given galaxy colour, indicating that some external process could have accelerated the evolution of the stellar population. The analysis of the host properties shows that the control sample exhibits a strong relation between colours, ages and the local density, while more than 90 per cent of the LINERs are redder and older than the mean values, independently of the neighbourhood density. Furthermore, a detailed study in three local density ranges shows that, while control sample galaxies are redder and older as a function of stellar mass and density, LINER galaxies do not follow the known morphology-density relation of galaxies without low-ionization features. The results support the contribution of hot and old stars to the low-ionization emission, although the contribution of nuclear activity is not discarded.

  16. Students' experiences with collaborative learning in asynchronous computer-supported collaborative learning environments.

    NARCIS (Netherlands)

    Dewiyanti, Silvia; Brand-Gruwel, Saskia; Jochems, Wim; Broers, Nick

    2008-01-01

    Dewiyanti, S., Brand-Gruwel, S., Jochems, W., & Broers, N. (2007). Students' experiences with collaborative learning in asynchronous computer-supported collaborative learning environments. Computers in Human Behavior, 23, 496-514.

  17. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

    In order to verify data integrity in a mobile multicloud computing environment, a mobile multicloud data integrity verification (MMCDIV) scheme is proposed. First, the computability and nondegeneracy of verification are obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced based on HVR (Homomorphic Verifiable Response) with random masking and the sMHT (sequence-enforced Merkle hash tree) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified with lightweight computing and low data transmission. The scheme mitigates the limited communication and computing power of mobile devices, supports dynamic data operations in a mobile multicloud environment, and verifies data integrity without direct access to the source file blocks. Experimental results also demonstrate that the scheme achieves a lower cost of computing and communications.
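The sMHT component above builds on standard Merkle-tree authentication: a verifier holding only the root hash can check a single block against a logarithmic-size proof. A minimal sketch of that underlying mechanism (generic helper names, not the paper's actual sequence-enforced construction):

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    """Compute the Merkle root of a list of data blocks."""
    level = [_h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(blocks, index):
    """Collect the sibling hashes needed to verify blocks[index] against the root."""
    level = [_h(b) for b in blocks]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1                     # sibling position at this level
        proof.append((level[sib], sib < index))  # (hash, sibling-is-on-the-left)
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(block, proof, root):
    """Recompute the root from one block and its proof; compare with the trusted root."""
    node = _h(block)
    for sib, is_left in proof:
        node = _h(sib + node) if is_left else _h(node + sib)
    return node == root
```

A client storing only `root` can thus audit any block with a proof whose size grows logarithmically in the number of blocks, which is what makes such schemes attractive on resource-constrained mobile devices.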

  18. Enrichment of Human-Computer Interaction in Brain-Computer Interfaces via Virtual Environments

    Directory of Open Access Journals (Sweden)

    Alonso-Valerdi Luz María

    2017-01-01

    Tridimensional representations stimulate cognitive processes that are the core and foundation of human-computer interaction (HCI). Those cognitive processes take place while a user navigates and explores a virtual environment (VE) and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCIs). A BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his/her environment. Although BCI research started in the sixties, this technology is not yet efficient or reliable for everyone at any time. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate the user's mental state to increase the differentiation between control and noncontrol modalities.

  19. Local-order metric for condensed-phase environments

    Science.gov (United States)

    Martelli, Fausto; Ko, Hsin-Yu; Oǧuz, Erdal C.; Car, Roberto

    2018-02-01

    We introduce a local order metric (LOM) that measures the degree of order in the neighborhood of an atomic or molecular site in a condensed medium. The LOM maximizes the overlap between the spatial distribution of sites belonging to that neighborhood and the corresponding distribution in a suitable reference system. The LOM takes a value tending to zero for completely disordered environments and tending to one for environments that perfectly match the reference. The site-averaged LOM and its standard deviation define two scalar order parameters, S and δS, that characterize with excellent resolution crystals, liquids, and amorphous materials. We show with molecular dynamics simulations that S, δS, and the LOM provide very insightful information in the study of structural transformations, such as those occurring when ice spontaneously nucleates from supercooled water or when a supercooled water sample becomes amorphous upon progressive cooling.
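The core idea, scoring how well a site's neighborhood overlaps a reference pattern, can be illustrated with a toy version that pairs neighbors greedily and omits the rotation optimization of the published LOM (the function names and the Gaussian width are illustrative choices, not the paper's definitions):

```python
import math

def dist2(a, b):
    """Squared Euclidean distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def lom(neighbors, reference, sigma=0.2):
    """Toy local-order metric: Gaussian overlap between a site's neighbor
    positions and a reference pattern, greedily paired by distance.
    Returns ~1 for a perfect match and ~0 for a disordered neighborhood.
    (The published LOM also maximizes the overlap over rotations; omitted here.)"""
    ref = list(reference)
    score = 0.0
    for p in neighbors:
        # pair p with the closest remaining reference site
        j = min(range(len(ref)), key=lambda k: dist2(p, ref[k]))
        score += math.exp(-dist2(p, ref.pop(j)) / (2 * sigma ** 2))
    return score / len(neighbors)
```

Averaging `lom` over all sites and taking the standard deviation would give toy analogues of the scalar order parameters S and δS mentioned in the abstract.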

  20. Local Control of Audio Environment: A Review of Methods and Applications

    Directory of Open Access Journals (Sweden)

    Jussi Kuutti

    2014-02-01

    The concept of a local audio environment is to have sound playback locally restricted such that, ideally, adjacent regions of an indoor or outdoor space could exhibit their own individual audio content without interfering with each other. This would enable people to listen to their content of choice without disturbing others next to them, yet, without any headphones to block conversation. In practice, perfect sound containment in free air cannot be attained, but a local audio environment can still be satisfactorily approximated using directional speakers. Directional speakers may be based on regular audible frequencies or they may employ modulated ultrasound. Planar, parabolic, and array form factors are commonly used. The directivity of a speaker improves as its surface area and sound frequency increases, making these the main design factors for directional audio systems. Even directional speakers radiate some sound outside the main beam, and sound can also reflect from objects. Therefore, directional speaker systems perform best when there is enough ambient noise to mask the leaking sound. Possible areas of application for local audio include information and advertisement audio feed in commercial facilities, guiding and narration in museums and exhibitions, office space personalization, control room messaging, rehabilitation environments, and entertainment audio systems.
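The stated dependence of directivity on surface area and frequency can be made concrete with the textbook circular-piston model, whose normalized far-field pattern is 2·J1(ka·sinθ)/(ka·sinθ) for wavenumber k and piston radius a; larger ka (bigger speaker or higher frequency) means a narrower beam. A sketch with a series-based Bessel evaluation (not from the reviewed article, which cites no specific model):

```python
import math

def j1(x, terms=30):
    """Bessel function of the first kind, order 1, via its power series."""
    s = 0.0
    for k in range(terms):
        s += (-1) ** k / (math.factorial(k) * math.factorial(k + 1)) * (x / 2) ** (2 * k + 1)
    return s

def piston_pattern(ka, theta):
    """Normalized far-field pressure of a rigid circular piston of radius a."""
    x = ka * math.sin(theta)
    if abs(x) < 1e-12:
        return 1.0                       # on-axis response is 1 by definition
    return abs(2 * j1(x) / x)

def beamwidth_deg(ka):
    """Full angular width (degrees) where the response first drops to half pressure."""
    theta = 0.0
    while theta < math.pi / 2 and piston_pattern(ka, theta) > 0.5:
        theta += 1e-3
    return 2 * math.degrees(theta)
```

Doubling ka roughly halves the beamwidth, which is the quantitative form of the "surface area and frequency" design rule in the abstract.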

  1. Cloud Computing and Virtual Desktop Infrastructures in Afloat Environments

    OpenAIRE

    Gillette, Stefan E.

    2012-01-01

    The phenomenon of “cloud computing” has become ubiquitous among users of the Internet and many commercial applications. Yet, the U.S. Navy has conducted limited research in this nascent technology. This thesis explores the application and integration of cloud computing both at the shipboard level and in a multi-ship environment. A virtual desktop infrastructure, mirroring a shipboard environment, was built and analyzed in the Cloud Lab at the Naval Postgraduate School, which offers a potentia...

  2. Xcache in the ATLAS Distributed Computing Environment

    CERN Document Server

    Hanushevsky, Andrew; The ATLAS collaboration

    2018-01-01

    Built upon the Xrootd Proxy Cache (Xcache), we developed additional features to adapt Xcache to the ATLAS distributed computing and data environment, especially its data management system RUCIO, in order to improve the cache hit rate, as well as features that make Xcache easy to use, similar to the way the Squid cache is used by the HTTP protocol. We are optimizing Xcache for HPC environments, and adapting the HL-LHC Data Lakes design as its component for data delivery. We packaged the software in CVMFS and in Docker and Singularity containers in order to standardize the deployment and reduce the cost of resolving issues at remote sites. We are also integrating it into RUCIO as a volatile storage system, and into various ATLAS workflows such as user analysis,

  3. Bridging Theory and Practice: Developing Guidelines to Facilitate the Design of Computer-based Learning Environments

    Directory of Open Access Journals (Sweden)

    Lisa D. Young

    2003-10-01

    The design of computer-based learning environments has undergone a paradigm shift, moving students away from instruction that was considered to promote technical rationality grounded in objectivism, toward the application of computers to create cognitive tools utilized in constructivist environments. The goal of the resulting computer-based learning environment design principles is to have students learn with technology, rather than from technology. This paper reviews the general constructivist theory that has guided the development of these environments, and offers suggestions for the adaptation of modest, generic guidelines, not mandated principles, that can be flexibly applied and allow for the expression of true constructivist ideals in online learning environments.

  4. A visualization environment for supercomputing-based applications in computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Pavlakos, C.J.; Schoof, L.A.; Mareda, J.F.

    1993-06-01

    In this paper, we characterize a visualization environment that has been designed and prototyped for a large community of scientists and engineers, with an emphasis on supercomputing-based computational mechanics. The proposed environment makes use of a visualization server concept to provide effective, interactive visualization to the user's desktop. Benefits of using the visualization server approach are discussed. Some thoughts regarding desirable features for visualization server hardware architectures are also addressed. A brief discussion of the software environment is included. The paper concludes by summarizing certain observations we have made regarding the implementation of such visualization environments.

  5. Protect Heterogeneous Environment Distributed Computing from Malicious Code Assignment

    Directory of Open Access Journals (Sweden)

    V. S. Gorbatov

    2011-09-01

    The paper describes a practical implementation of a system that protects distributed computing in heterogeneous environments from malicious code in submitted assignments. The choice of technologies, the design of the data structures, and a performance evaluation of the implemented security system are presented.

  6. Data Summarization in the Node by Parameters (DSNP): Local Data Fusion in an IoT Environment

    Directory of Open Access Journals (Sweden)

    Luis F. C. Maschi

    2018-03-01

    With the advent of the Internet of Things, billions of objects or devices are inserted into the global computer network, generating and processing data at a volume never imagined before. This paper proposes a way to collect and process local data through a data fusion technology called summarization. The main feature of the proposal is the local data fusion, through parameters provided by the application, ensuring the quality of data collected by the sensor node. In the evaluation, the sensor node was compared when performing the data summary with another that performed a continuous recording of the collected data. Two sets of nodes were created, one with a sensor node that analyzed the luminosity of the room, which in this case obtained a reduction of 97% in the volume of data generated, and another set that analyzed the temperature of the room, obtaining a reduction of 80% in the data volume. Through these tests, it has been proven that the local data fusion at the node can be used to reduce the volume of data generated, consequently decreasing the volume of messages generated by IoT environments.

  7. Data Summarization in the Node by Parameters (DSNP): Local Data Fusion in an IoT Environment.

    Science.gov (United States)

    Maschi, Luis F C; Pinto, Alex S R; Meneguette, Rodolfo I; Baldassin, Alexandro

    2018-03-07

    With the advent of the Internet of Things, billions of objects or devices are inserted into the global computer network, generating and processing data at a volume never imagined before. This paper proposes a way to collect and process local data through a data fusion technology called summarization. The main feature of the proposal is the local data fusion, through parameters provided by the application, ensuring the quality of data collected by the sensor node. In the evaluation, the sensor node was compared when performing the data summary with another that performed a continuous recording of the collected data. Two sets of nodes were created, one with a sensor node that analyzed the luminosity of the room, which in this case obtained a reduction of 97% in the volume of data generated, and another set that analyzed the temperature of the room, obtaining a reduction of 80% in the data volume. Through these tests, it has been proven that the local data fusion at the node can be used to reduce the volume of data generated, consequently decreasing the volume of messages generated by IoT environments.
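The parameter-driven summarization described in these two records (the journal and PubMed versions of the same paper) can be approximated by a simple send-on-change filter: the node transmits a reading only when it differs meaningfully from the last transmitted value. The helper below is an illustrative stand-in, not the authors' DSNP algorithm:

```python
def summarize_stream(readings, threshold):
    """Node-local summarization: forward a (timestamp, value) reading only when
    it differs from the last transmitted value by more than `threshold`."""
    sent = []
    last = None
    for t, value in readings:
        if last is None or abs(value - last) > threshold:
            sent.append((t, value))
            last = value
    return sent

def reduction(readings, sent):
    """Fraction of messages saved relative to continuous recording."""
    return 1 - len(sent) / len(readings)
```

On a slowly varying signal such as room temperature, a filter of this kind easily reaches the 80-97% message reductions reported in the abstract.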

  8. Tacit knowledge in action: basic notions of knowledge sharing in computer supported work environments

    OpenAIRE

    Mackenzie Owen, John

    2001-01-01

    An important characteristic of most computer supported work environments is the distribution of work over individuals or teams in different locations. This leads to what we nowadays call 'virtual' environments. In these environments, communication between actors is to a large degree mediated, i.e. established through communications media (telephone, fax, computer networks) rather than face-to-face. Unfortunately, mediated communication limits the effectiveness of knowledge exchange in virt...

  9. Computational Tool for Aerothermal Environment Around Transatmospheric Vehicles, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this Project is to develop a high-fidelity computational tool for accurate prediction of aerothermal environment on transatmospheric vehicles. This...

  10. Transnational (Dis)connection in localizing personal computing in the Netherlands, 1975-1990

    NARCIS (Netherlands)

    Veraart, F.C.A.; Alberts, G.; Oldenziel, R.

    2014-01-01

    Examining the diffusion and domestication of computer technologies in Dutch households and schools during the 1980s and 1990s, this chapter shows that the process was not a simple story of adoption of American models. Instead, many Dutch actors adapted computer technologies to their own local needs,

  11. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class, 3000-core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  12. A synthetic computational environment: To control the spread of respiratory infections in a virtual university

    Science.gov (United States)

    Ge, Yuanzheng; Chen, Bin; Liu, Liang; Qiu, Xiaogang; Song, Hongbin; Wang, Yong

    2018-02-01

    Individual-based computational environments provide an effective solution for studying complex social events by reconstructing scenarios. Challenges remain in reconstructing the virtual scenarios and reproducing the complex evolution. In this paper, we propose a framework to reconstruct a synthetic computational environment, reproduce an epidemic outbreak, and evaluate management interventions in a virtual university. The reconstructed computational environment includes four fundamental components: the synthetic population, behavior algorithms, multiple social networks, and the geographic campus environment. In the virtual university, influenza H1N1 transmission experiments are conducted, and gradually enhanced interventions are evaluated and compared quantitatively. The experimental results indicate that the reconstructed virtual environment provides a solution to reproduce complex emergencies and evaluate policies to be executed in the real world.
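The transmission experiments described above follow the usual individual-based pattern: iterate infection along a contact network and recover individuals after a fixed infectious period. A minimal discrete-time SIR sketch in that spirit (the parameters, contact structure, and function names are illustrative, not the paper's model):

```python
import random

def simulate_sir(contacts, n, p_infect=0.05, t_recover=3, seed_case=0, rng=None):
    """Discrete-time SIR epidemic over a static contact list [(i, j), ...].
    Returns the final states and the per-step infected counts."""
    rng = rng or random.Random(42)
    state = ["S"] * n                 # S = susceptible, I = infected, R = recovered
    days_left = [0] * n
    state[seed_case], days_left[seed_case] = "I", t_recover
    history = []
    while "I" in state:
        newly = set()
        for i, j in contacts:         # infection can pass in either direction
            for a, b in ((i, j), (j, i)):
                if state[a] == "I" and state[b] == "S" and rng.random() < p_infect:
                    newly.add(b)
        for i in range(n):            # progress existing infections
            if state[i] == "I":
                days_left[i] -= 1
                if days_left[i] == 0:
                    state[i] = "R"
        for b in newly:               # apply new infections after recovery step
            state[b], days_left[b] = "I", t_recover
        history.append(state.count("I"))
    return state, history
```

An intervention such as closing a venue would be modeled by removing its edges from `contacts`, which is how "gradually enhanced interventions" can be compared quantitatively on the infected-count curve.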

  13. Operational computer graphics in the flight dynamics environment

    Science.gov (United States)

    Jeletic, James F.

    1989-01-01

    Over the past five years, the Flight Dynamics Division of the National Aeronautics and Space Administration's (NASA's) Goddard Space Flight Center has incorporated computer graphics technology into its operational environment. In an attempt to increase the effectiveness and productivity of the Division, computer graphics software systems have been developed that display spacecraft tracking and telemetry data in 2-d and 3-d graphic formats that are more comprehensible than the alphanumeric tables of the past. These systems vary in functionality from real-time mission monitoring systems, to mission planning utilities, to system development tools. Here, the capabilities and architecture of these systems are discussed.

  14. Sub-sampling-based 2D localization of an impulsive acoustic source in reverberant environments

    KAUST Repository

    Omer, Muhammad

    2014-07-01

    This paper presents a robust method for two-dimensional (2D) impulsive acoustic source localization in a room environment using low sampling rates. The proposed method finds the time delay from the room impulse response (RIR), which makes it robust against room reverberations. We consider the RIR as a sparse phenomenon and apply a recently proposed sparse signal reconstruction technique called orthogonal clustering (OC) for its estimation from the sub-sampled received signal. The arrival time of the direct path signal at a pair of microphones is identified from the estimated RIR, and their difference yields the desired time delay estimate (TDE). Low sampling rates reduce the hardware and computational complexity and decrease the communication between the microphones and the centralized location. Simulation and experimental results of an actual hardware setup are presented to demonstrate the performance of the proposed technique.

  15. Sub-sampling-based 2D localization of an impulsive acoustic source in reverberant environments

    KAUST Repository

    Omer, Muhammad; Quadeer, Ahmed A; Sharawi, Mohammad S; Al-Naffouri, Tareq Y.

    2014-01-01

    This paper presents a robust method for two-dimensional (2D) impulsive acoustic source localization in a room environment using low sampling rates. The proposed method finds the time delay from the room impulse response (RIR), which makes it robust against room reverberations. We consider the RIR as a sparse phenomenon and apply a recently proposed sparse signal reconstruction technique called orthogonal clustering (OC) for its estimation from the sub-sampled received signal. The arrival time of the direct path signal at a pair of microphones is identified from the estimated RIR, and their difference yields the desired time delay estimate (TDE). Low sampling rates reduce the hardware and computational complexity and decrease the communication between the microphones and the centralized location. Simulation and experimental results of an actual hardware setup are presented to demonstrate the performance of the proposed technique.
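For contrast with the RIR-based estimator in these two records, the classical way to obtain a time-delay estimate between two microphones is to locate the peak of their cross-correlation. A brute-force, integer-lag sketch (not the paper's method, which estimates the RIR instead precisely because plain cross-correlation degrades under reverberation):

```python
def estimate_delay(x, y):
    """Estimate the integer-sample delay of y relative to x by locating the
    peak of their cross-correlation."""
    n = len(x)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-n + 1, n):
        v = sum(x[i] * y[i + lag] for i in range(n) if 0 <= i + lag < len(y))
        if v > best_val:
            best_lag, best_val = lag, v
    return best_lag
```

Multiplying the returned lag by the sampling period and the speed of sound (~343 m/s) gives the range difference used for 2D localization from a microphone pair.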

  16. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer assisted virtual environment - or CAVE TM — that allows scientists and engineers to literally walk into their data...

  17. Computation of 3D form factors in complex environments

    International Nuclear Information System (INIS)

    Coulon, N.

    1989-01-01

    The calculation of radiant interchange among opaque surfaces in a complex environment poses the general problem of determining the visible and hidden parts of the environment. In many thermal engineering applications, surfaces are separated by radiatively non-participating media and may be idealized as diffuse emitters and reflectors. Consequently, the net radiant energy fluxes are intimately related to purely geometrical quantities called form factors that take into account hidden parts: the problem is reduced to the form factor evaluation. This paper presents the method developed for the computation of 3D form factors in the finite-element module of the system TRIO, which is a general computer code for thermal and fluid flow analysis. The method is derived from an algorithm devised for synthetic image generation. A comparison is performed with the standard contour integration method, also implemented and suited to convex geometries. Several illustrative examples of finite-element thermal calculations in radiating enclosures are given.
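For diffuse surfaces, the form factor is F = (1/A1) ∫∫ cosθ1 cosθ2 / (π r²) dA1 dA2, and in simple unoccluded geometries it can be checked by Monte Carlo integration. A sketch for two directly opposed parallel unit squares separated by distance d, where cosθ1 = cosθ2 = d/r (a generic check, not the TRIO algorithm described above):

```python
import math
import random

def form_factor_parallel_squares(d, samples=100_000, rng=None):
    """Monte Carlo estimate of the form factor between two unit squares,
    directly opposed and separated by distance d (no occlusion)."""
    rng = rng or random.Random(0)
    acc = 0.0
    for _ in range(samples):
        x1, y1 = rng.random(), rng.random()      # point on the emitting square
        x2, y2 = rng.random(), rng.random()      # point on the receiving square
        r2 = (x2 - x1) ** 2 + (y2 - y1) ** 2 + d * d
        # cos(theta1) * cos(theta2) / (pi * r^2) = d^2 / (pi * r^4)
        acc += d * d / (math.pi * r2 * r2)
    return acc / samples                          # A1 = A2 = 1
```

With d = 1 the estimate converges to the tabulated value of about 0.1998; handling occlusion, as the finite-element method in the record does, is the genuinely hard part that this sketch omits.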

  18. Cloud computing for comparative genomics.

    Science.gov (United States)

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
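The reported figures (300,000 processes, 100 nodes, just under 70 hours, $6,302) imply a blended rate of roughly $0.90 per node-hour. An idealized makespan-and-cost estimator of this back-of-envelope kind, assuming identical tasks and perfect load balance (which the actual mixed-genome run did not have):

```python
import math

def estimate_run(n_tasks, n_nodes, avg_task_hours, price_per_node_hour):
    """Idealized makespan and cost for farming n_tasks identical jobs
    over n_nodes cloud workers, assuming perfect load balance."""
    waves = math.ceil(n_tasks / n_nodes)      # sequential batches per node
    hours = waves * avg_task_hours
    return hours, n_nodes * hours * price_per_node_hour
```

Plugging in the paper's totals (average task time ~70/3000 h, rate ~$6302/7000 per node-hour) reproduces the reported 70 hours and $6,302, which is a useful sanity check before scaling such a run up or down.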

  19. About Security Solutions in Fog Computing

    Directory of Open Access Journals (Sweden)

    Eugen Petac

    2016-01-01

    The key to improving a system's performance, security and reliability is to have the data processed locally in remote data centers. Fog computing extends cloud computing through its services to devices and users at the edge of the network. This paper explores the fog computing environment. Security issues in this area are also described. Fog computing provides improved quality of service to the user by compensating for the shortcomings of the cloud in IoT (Internet of Things) environments. Our proposal, named Adaptive Fog Computing Node Security Profile (AFCNSP), which is based on Linux security solutions, will improve the security of fog nodes with rich feature sets.

  20. Hybrid Cloud Computing Environment for EarthCube and Geoscience Community

    Science.gov (United States)

    Yang, C. P.; Qin, H.

    2016-12-01

    The NSF EarthCube Integration and Test Environment (ECITE) has built a hybrid cloud computing environment that provides cloud resources from private cloud environments using the cloud system software OpenStack and Eucalyptus, and also manages a public cloud, Amazon Web Services, allowing resource synchronization and bursting between the private and public clouds. On the ECITE hybrid cloud platform, the EarthCube and geoscience communities can deploy and manage applications using base or customized virtual machine images, analyze big datasets using virtual clusters, and monitor virtual resource usage on the cloud in real time. Currently, a number of EarthCube projects have deployed or started migrating their projects to this platform, such as CHORDS, BCube, CINERGI, OntoSoft, and some other EarthCube building blocks. To accomplish the deployment or migration, the administrator of the ECITE hybrid cloud platform prepares the specific needs (e.g. images, port numbers, usable cloud capacity, etc.) of each project in advance, based on communications between ECITE and the participating projects; the scientists or IT technicians in those projects then launch one or more virtual machines, access the virtual machine(s) to set up the computing environment if need be, and migrate their code, documents or data without worrying about the heterogeneity in structure and operations among different cloud platforms.

  1. Fault-tolerant quantum computation for local non-Markovian noise

    International Nuclear Information System (INIS)

    Terhal, Barbara M.; Burkard, Guido

    2005-01-01

    We derive a threshold result for fault-tolerant quantum computation for local non-Markovian noise models. The role of error amplitude in our analysis is played by the product of the elementary gate time t_0 and the spectral width of the interaction Hamiltonian between system and bath. We discuss extensions of our model and the applicability of our analysis.

  2. An extended Intelligent Water Drops algorithm for workflow scheduling in cloud computing environment

    Directory of Open Access Journals (Sweden)

    Shaymaa Elsherbiny

    2018-03-01

    Cloud computing is emerging as a high performance computing environment with a large-scale, heterogeneous collection of autonomous systems and a flexible computational architecture. Many resource management methods may enhance the efficiency of the whole cloud computing system, and the key part of cloud resource management is resource scheduling. Optimized scheduling of tasks on cloud virtual machines is an NP-hard problem, and many algorithms have been presented to solve it. These schedulers vary because their scheduling strategies are adapted to the changing environment and the types of tasks. The focus of this paper is on workflow scheduling in cloud computing, which is gaining a lot of attention recently because workflows have emerged as a paradigm for representing complex computing problems. We propose a novel algorithm extending the nature-inspired Intelligent Water Drops (IWD) algorithm that optimizes the scheduling of workflows on the cloud. The proposed algorithm is implemented and embedded within a workflow simulation toolkit and tested in different simulated cloud environments with different cost models. Our algorithm showed noticeable enhancements over the classical workflow scheduling algorithms. We compared the proposed IWD-based algorithm with other well-known scheduling algorithms, including MIN-MIN, MAX-MIN, Round Robin, FCFS, MCT, PSO and C-PSO, and the proposed algorithm presented noticeable enhancements in performance and cost in most situations.
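Of the baseline schedulers the authors compare against, MIN-MIN is the easiest to state: repeatedly assign the (task, machine) pair with the smallest completion time. A sketch of that baseline (not the IWD algorithm itself, whose water-drop dynamics are considerably more involved):

```python
def min_min_schedule(exec_time):
    """MIN-MIN heuristic. exec_time[t][m] is the runtime of task t on machine m.
    Repeatedly schedule the (task, machine) pair with the earliest completion
    time; returns the per-task machine assignment and the makespan."""
    n_tasks, n_machines = len(exec_time), len(exec_time[0])
    ready = [0.0] * n_machines            # machine availability times
    assignment = [None] * n_tasks
    unscheduled = set(range(n_tasks))
    while unscheduled:
        t, m, finish = min(
            ((t, m, ready[m] + exec_time[t][m])
             for t in sorted(unscheduled) for m in range(n_machines)),
            key=lambda cand: cand[2],
        )
        assignment[t] = m
        ready[m] = finish
        unscheduled.remove(t)
    return assignment, max(ready)
```

Metaheuristics such as IWD or PSO search over many candidate assignments and so can beat greedy baselines like this one on cost or makespan, which is the comparison reported in the abstract.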

  3. Human face recognition using eigenface in cloud computing environment

    Science.gov (United States)

    Siregar, S. T. M.; Syahputra, M. F.; Rahmat, R. F.

    2018-02-01

Recognizing a single face does not take long to process, but an attendance or security system at a company with many faces to recognize will take considerable time. Cloud computing is a computing service performed not on a local device but on data center infrastructure reached over the Internet. Cloud computing also provides a scalability solution: it can increase the resources needed when processing larger amounts of data. This research applies the eigenface method; training data is collected through REST endpoints that provide the resources, which the server then processes through the recognition stages. After the research and development of this application, we conclude that face recognition can be implemented with eigenface, using the REST concept as the endpoint for sending and receiving the information used as a resource in building the model for face recognition.
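The recognition core of the eigenface method is compact enough to sketch. Below is a minimal illustration (PCA via SVD plus nearest-neighbour matching) on hypothetical random data; the paper's cloud deployment and REST endpoints are omitted.

```python
import numpy as np

# Eigenface sketch: flatten training faces, centre them, take the top
# principal components (the "eigenfaces") via SVD, and recognize a probe
# face by nearest neighbour in the reduced eigenface space.

rng = np.random.default_rng(0)
train = rng.random((10, 64 * 64))  # 10 hypothetical 64x64 face images

mean_face = train.mean(axis=0)
centered = train - mean_face
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = Vt[:5]                # rows are the top 5 eigenfaces

def project(img):
    """Coordinates of a face image in eigenface space."""
    return eigenfaces @ (img - mean_face)

gallery = np.array([project(img) for img in train])

def recognize(probe):
    """Index of the closest training face to the probe image."""
    distances = np.linalg.norm(gallery - project(probe), axis=1)
    return int(np.argmin(distances))
```

In a real system the gallery would hold labelled faces and `recognize` would run server-side behind the REST endpoint.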

  4. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    Science.gov (United States)

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.

  5. A SOUND SOURCE LOCALIZATION TECHNIQUE TO SUPPORT SEARCH AND RESCUE IN LOUD NOISE ENVIRONMENTS

    Science.gov (United States)

    Yoshinaga, Hiroshi; Mizutani, Koichi; Wakatsuki, Naoto

At some sites of earthquakes and other disasters, rescuers search for people buried under rubble by listening for the sounds they make. Developing a technique to localize sound sources amid loud noise will therefore support such search and rescue operations. In this paper, we discuss an experiment performed to test an array signal processing technique that searches for imperceptible sounds in loud noise environments. Two speakers simultaneously played a generator noise and a voice attenuated by 20 dB (= 1/100 of the power) relative to the generator noise, in an outdoor space where cicadas were making noise. The sound signals were received by a horizontally mounted linear microphone array, 1.05 m in length and consisting of 15 microphones. The direction and distance of the voice were computed, and the sound of the voice was extracted and played back as an audible sound by array signal processing.
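The core operation behind such arrays, estimating a time difference of arrival (TDOA) and converting it to a bearing, can be sketched for a two-microphone case. The sample rate, microphone spacing and simulated delay below are illustrative assumptions, not the parameters of the 15-microphone array used in the experiment.

```python
import numpy as np

# TDOA sketch: cross-correlate the two microphone signals; the lag of the
# correlation peak gives the arrival-time difference, which maps to a
# direction of arrival for a far-field source.

fs = 48000.0  # sample rate [Hz] (assumed)
c = 340.0     # speed of sound [m/s]
d = 0.075     # microphone spacing [m] (assumed)

rng = np.random.default_rng(1)
sig = rng.standard_normal(4096)
delay = 5  # true inter-microphone delay in samples
mic1 = sig
mic2 = np.concatenate([np.zeros(delay), sig[:-delay]])  # delayed copy

# full cross-correlation; the peak offset from the centre is the TDOA
corr = np.correlate(mic2, mic1, mode="full")
lag = int(np.argmax(corr)) - (len(sig) - 1)

tau = lag / fs  # TDOA in seconds
theta = np.degrees(np.arcsin(np.clip(tau * c / d, -1.0, 1.0)))  # bearing [deg]
```

With 15 microphones the same idea generalizes to delay-and-sum beamforming, which is what permits estimating both direction and distance and extracting the voice itself.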

  6. Novel Ethernet Based Optical Local Area Networks for Computer Interconnection

    NARCIS (Netherlands)

    Radovanovic, Igor; van Etten, Wim; Taniman, R.O.; Kleinkiskamp, Ronny

    2003-01-01

In this paper we present new optical local area networks for fiber-to-the-desk applications. The presented networks are expected to provide a solution for bringing optical fibers all the way to computers. To keep the overall implementation costs down we have based our networks on short-wavelength optical

  7. Dynamic Scaffolding of Socially Regulated Learning in a Computer-Based Learning Environment

    NARCIS (Netherlands)

    Molenaar, I.; Roda, Claudia; van Boxtel, Carla A.M.; Sleegers, P.J.C.

    2012-01-01

    The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N = 56) are supported with computer-generated scaffolds and students in the control condition (N =

  8. Self-Localization at Street Intersections.

    Science.gov (United States)

    Fusco, Giovanni; Shen, Huiying; Coughlan, James M

    2014-05-01

There is growing interest among smartphone users in the ability to determine their precise location in their environment for a variety of applications related to wayfinding, travel and shopping. While GPS provides valuable self-localization estimates, its accuracy is limited to approximately 10 meters in most urban locations. This paper focuses on the self-localization needs of blind or visually impaired travelers, who are faced with the challenge of negotiating street intersections. These travelers need more precise self-localization to help them align themselves properly to crosswalks, signal lights and other features such as walk light pushbuttons. We demonstrate a novel computer vision-based localization approach that is tailored to the street intersection domain. Unlike most work on computer vision-based localization techniques, which typically assume the presence of detailed, high-quality 3D models of urban environments, our technique harnesses the availability of simple, ubiquitous satellite imagery (e.g., Google Maps) to create simple maps of each intersection. Not only does this technique scale naturally to the great majority of street intersections in urban areas, but it has the added advantage of incorporating the specific metric information that blind or visually impaired travelers need, namely, the locations of intersection features such as crosswalks. Key to our approach is the integration of IMU (inertial measurement unit) information with geometric information obtained from image panorama stitching. Finally, we evaluate the localization performance of our algorithm on a dataset of intersection panoramas, demonstrating the feasibility of our approach.

  9. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    Science.gov (United States)

    2017-08-08

communicate their subjective opinions. Keywords: Usability Analysis; CAVE™ (Cave Automatic Virtual Environments); Human Computer Interface (HCI)... the differences in interaction when compared with traditional human computer interfaces. This paper provides analysis via usability study methods

  10. Localization of Outdoor Mobile Robots Using Curb Features in Urban Road Environments

    Directory of Open Access Journals (Sweden)

    Hyunsuk Lee

    2014-01-01

Full Text Available Urban road environments that have pavement and curbs are characterized as semistructured road environments. In semistructured road environments, the curb provides useful information for robot navigation. In this paper, we present a practical localization method for outdoor mobile robots using curb features in semistructured road environments. The curb features are especially useful in urban environments, where GPS failures occur frequently. Curb extraction is performed on the basis of Kernel Fisher Discriminant Analysis (KFDA) to minimize false detections. We adopt the Extended Kalman Filter (EKF) to combine the curb information with odometry and the Differential Global Positioning System (DGPS). The uncertainty models for the sensors are quantitatively analyzed to provide a practical solution.
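The fusion step can be illustrated with a deliberately simplified scalar (and hence linear) Kalman filter: odometry predicts the robot's lateral offset and a curb-derived measurement corrects it. All noise values and measurements here are hypothetical; the paper's EKF tracks the full planar pose.

```python
# One-dimensional Kalman predict/update sketch for curb-aided localization:
# x is the lateral offset [m] from the lane reference, u the odometry-based
# lateral motion, z the curb-detector measurement of the offset.

def kf_step(x, P, u, z, Q=0.04, R=0.01):
    """One predict/update cycle; Q, R are process/measurement variances."""
    x_pred = x + u          # predict with odometry
    P_pred = P + Q          # uncertainty grows with motion
    K = P_pred / (P_pred + R)          # Kalman gain
    x_new = x_pred + K * (z - x_pred)  # correct with curb measurement
    P_new = (1.0 - K) * P_pred         # uncertainty shrinks after update
    return x_new, P_new

x, P = 0.0, 1.0  # start with large uncertainty
for u, z in [(0.10, 0.12), (0.05, 0.18), (0.00, 0.17)]:
    x, P = kf_step(x, P, u, z)
# after three curb updates the variance P has collapsed well below Q + R
```

The quantitative sensor uncertainty models mentioned in the abstract correspond to choosing Q and R from measured error statistics rather than the placeholder values above.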

  11. Beekeeping, environment and modernity in localities in Yucatan, Mexico

    Directory of Open Access Journals (Sweden)

    Enrique Rodríguez Balam

    2015-09-01

    Full Text Available In this paper, we reflect on the local knowledge about the European honey bee Apis mellifera scutellata, namely its biology, behavior, social structure, communication, and the relationships that these organisms maintain with the environment and their natural enemies. We also discuss the impacts that land use has on this economic activity. The empirical knowledge of beekeepers converges quite well with the scientific knowledge concerning this group of organisms.

  12. Application of local computer networks in nuclear-physical experiments and technology

    International Nuclear Information System (INIS)

    Foteev, V.A.

    1986-01-01

The principles of construction, comparative performance and potential of local computer networks are considered with respect to their application in physics experiments. The principle of operation of local networks is illustrated using the Ethernet network, and the results of an analysis of their operating performance are given. Examples of operating local networks in the areas of nuclear physics research and nuclear technology are presented, including the networks of the Japan Atomic Energy Research Institute, the University of California and Los Alamos National Laboratory; network implementations based on the DECnet and Fastbus programs; and domestic network configurations of the USSR Academy of Sciences, the JINR Neutron Physics Laboratory, and others. It is shown that local networks allow a significant rise in productivity in the sphere of data processing.

  13. An approach for heterogeneous and loosely coupled geospatial data distributed computing

    Science.gov (United States)

    Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui

    2010-07-01

Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic under a distributed computing environment. In order to make use of these local resources together to solve larger geospatial information processing problems that are related to an overall situation, in this paper, with the support of peer-to-peer computing technologies, we propose a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a term named Equivalent Distributed Program for global geospatial queries to solve geospatial distributed computing problems under heterogeneous GIS environments. First, a geospatial query process schema for distributed computing, as well as a method for equivalent transformation from a global geospatial query to distributed local queries at the SQL (Structured Query Language) level to solve the coordination problem among heterogeneous resources, is presented. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment that consists of autonomous geospatial information resources, to achieve decentralized and consistent synchronization among global geospatial resource directories, and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries are presented to illustrate the procedure of global geospatial information processing.

  14. Preserving access to ALEPH computing environment via virtual machines

    International Nuclear Information System (INIS)

    Coscetti, Simone; Boccali, Tommaso; Arezzini, Silvia; Maggi, Marcello

    2014-01-01

The ALEPH Collaboration [1] took data at the LEP (CERN) electron-positron collider in the period 1989-2000, producing more than 300 scientific papers. While most of the Collaboration's activities stopped in recent years, the data collected still have physics potential: new theoretical models are emerging that call for checks against data at the Z and WW production energies. An attempt to revive and preserve the ALEPH computing environment is presented; the aim is the preservation not only of the data files (usually called bit preservation), but of the full environment a physicist would need to perform brand-new analyses. Technically, a Virtual Machine approach has been chosen, using the VirtualBox platform. Concerning simulated events, the full chain from event generators to physics plots is possible, and reprocessing of data events is also functioning. Interactive tools like the DALI event display can be used on both data and simulated events. The Virtual Machine approach is suited both for interactive usage and for massive computing using Cloud-like approaches.

  15. G-LoSA: An efficient computational tool for local structure-centric biological studies and drug design.

    Science.gov (United States)

    Lee, Hui Sun; Im, Wonpil

    2016-04-01

Molecular recognition by proteins mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and aid drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence-order-independent way and provides a GA-score, a chemical-feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA on local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and for facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. © 2016 The Protein Society.

  16. Cloud computing for comparative genomics

    Directory of Open Access Journals (Sweden)

    Pivovarov Rimma

    2010-05-01

Full Text Available Abstract Background Large comparative genomics studies and tools are becoming increasingly compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Compute Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high-capacity compute nodes using Amazon Web Services Elastic MapReduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions The effort to migrate existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provide a substantial boost at manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
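The notion at the heart of RSD, reciprocal best matching between two genomes, can be sketched as below. The real algorithm additionally uses sequence alignments and maximum-likelihood evolutionary distances rather than raw similarity scores, and the gene names and scores here are made up.

```python
# Reciprocal-best-hit sketch: a pair (a, b) is called orthologous iff b is
# a's best-scoring hit in genome B and a is b's best-scoring hit in genome A.

def reciprocal_best_hits(scores_ab, scores_ba):
    """scores_ab[a][b] = similarity of gene a (genome A) vs gene b (genome B)."""
    best_ab = {a: max(hits, key=hits.get) for a, hits in scores_ab.items()}
    best_ba = {b: max(hits, key=hits.get) for b, hits in scores_ba.items()}
    return {(a, b) for a, b in best_ab.items() if best_ba.get(b) == a}

scores_ab = {"a1": {"b1": 90, "b2": 10}, "a2": {"b1": 30, "b2": 80}}
scores_ba = {"b1": {"a1": 85, "a2": 20}, "b2": {"a1": 15, "a2": 75}}
orthologs = reciprocal_best_hits(scores_ab, scores_ba)
# orthologs == {("a1", "b1"), ("a2", "b2")}
```

Each genome-pair comparison is independent of the others, which is why the workload farms out so naturally to hundreds of cloud nodes via MapReduce.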

  17. A computational environment for creating and testing reduced chemical kinetic mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Montgomery, C.J.; Swensen, D.A.; Harding, T.V.; Cremer, M.A.; Bockelie, M.J. [Reaction Engineering International, Salt Lake City, UT (USA)

    2002-02-01

This paper describes software called the computer assisted reduced mechanism problem solving environment (CARM-PSE) that gives the engineer the ability to rapidly set up, run and examine large numbers of problems comparing detailed and reduced (approximate) chemistry. CARM-PSE integrates the automatic chemical mechanism reduction code CARM and the codes that simulate perfectly stirred reactors and plug flow reactors into a user-friendly computational environment. CARM-PSE gives the combustion engineer the ability to easily test chemical approximations over many hundreds of combinations of inputs in a multidimensional parameter space. The demonstration problems compare detailed and reduced chemical kinetic calculations for methane-air combustion, including nitrogen oxide formation, in a stirred reactor and for selective non-catalytic reduction of NOx in coal combustion flue gas.

  18. Mathematical Language Development and Talk Types in Computer Supported Collaborative Learning Environments

    Science.gov (United States)

    Symons, Duncan; Pierce, Robyn

    2015-01-01

    In this study we examine the use of cumulative and exploratory talk types in a year 5 computer supported collaborative learning environment. The focus for students in this environment was to participate in mathematical problem solving, with the intention of developing the proficiencies of problem solving and reasoning. Findings suggest that…

  19. Advanced quadrature sets and acceleration and preconditioning techniques for the discrete ordinates method in parallel computing environments

    Science.gov (United States)

    Longoni, Gianluca

In the nuclear science and engineering field, radiation transport calculations play a key role in the design and optimization of nuclear devices. The linear Boltzmann equation describes the angular, energy and spatial variations of the particle or radiation distribution. The discrete ordinates method (SN) is the most widely used technique for solving the linear Boltzmann equation. However, for realistic problems, the memory and computing-time requirements demand the use of supercomputers. This research is devoted to the development of new formulations for the SN method, especially for highly angle-dependent problems, in parallel environments. The present research work addresses two main issues affecting the accuracy and performance of SN transport theory methods: quadrature sets and acceleration techniques. New advanced quadrature techniques which allow for large numbers of angles with a capability for local angular refinement have been developed. These techniques have been integrated into the 3-D SN PENTRAN (Parallel Environment Neutral-particle TRANsport) code and applied to highly angle-dependent problems, such as CT-scan devices, which are widely used to obtain detailed 3-D images for industrial/medical applications. In addition, the accurate simulation of core physics and shielding problems with strong heterogeneities and transport effects requires the numerical solution of the transport equation. In general, the convergence rate of the solution methods for the transport equation is reduced for large problems with optically thick regions and scattering ratios approaching unity. To remedy this situation, new acceleration algorithms based on the Even-Parity Simplified SN (EP-SSN) method have been developed. A new stand-alone code system, PENSSn (Parallel Environment Neutral-particle Simplified SN), has been developed based on the EP-SSN method. The code is designed for parallel computing environments with spatial, angular and hybrid (spatial/angular) domain

  20. Exploring the influence of local food environments on food behaviours: a systematic review of qualitative literature.

    Science.gov (United States)

    Pitt, Erin; Gallegos, Danielle; Comans, Tracy; Cameron, Cate; Thornton, Lukar

    2017-09-01

Systematic reviews investigating associations between objective measures of the food environment and dietary behaviours or health outcomes have not established a consistent evidence base. The present paper aims to synthesise qualitative evidence regarding the influence of local food environments on food and purchasing behaviours. Design: a systematic review in the form of a qualitative thematic synthesis. Setting: urban localities. Subjects: adults. Four analytic themes were identified from the review, including community and consumer nutrition environments, other environmental factors and individual coping strategies for shopping and purchasing decisions. Availability, accessibility and affordability were consistently identified as key determinants of store choice and purchasing behaviours that often result in less healthy food choices within community nutrition environments. Food availability, quality and food store characteristics within consumer nutrition environments also greatly influenced in-store purchases. Individuals used a range of coping strategies in both the community and consumer nutrition environments to make optimal purchasing decisions, often within the context of financial constraints. Findings from the current review add depth and scope to quantitative literature and can guide ongoing theory, interventions and policy development in food environment research. There is a need to investigate contextual influences within food environments as well as individual and household socio-economic characteristics that contribute to the differing use of and views towards local food environments. Greater emphasis on how individual and environmental factors interact in the food environment field will be key to developing stronger understanding of how environments can support and promote healthier food choices.

  1. Ubiquitous computing in shared-care environments.

    Science.gov (United States)

    Koch, S

    2006-07-01

    In light of future challenges, such as growing numbers of elderly, increase in chronic diseases, insufficient health care budgets and problems with staff recruitment for the health-care sector, information and communication technology (ICT) becomes a possible means to meet these challenges. Organizational changes such as the decentralization of the health-care system lead to a shift from in-hospital to both advanced and basic home health care. Advanced medical technologies provide solutions for distant home care in form of specialist consultations and home monitoring. Furthermore, the shift towards home health care will increase mobile work and the establishment of shared care teams which require ICT-based solutions that support ubiquitous information access and cooperative work. Clinical documentation and decision support systems are the main ICT-based solutions of interest in the context of ubiquitous computing for shared care environments. This paper therefore describes the prerequisites for clinical documentation and decision support at the point of care, the impact of mobility on the documentation process, and how the introduction of ICT-based solutions will influence organizations and people. Furthermore, the role of dentistry in shared-care environments is discussed and illustrated in the form of a future scenario.

  2. Power Consumption Evaluation of Distributed Computing Network Considering Traffic Locality

    Science.gov (United States)

    Ogawa, Yukio; Hasegawa, Go; Murata, Masayuki

    When computing resources are consolidated in a few huge data centers, a massive amount of data is transferred to each data center over a wide area network (WAN). This results in increased power consumption in the WAN. A distributed computing network (DCN), such as a content delivery network, can reduce the traffic from/to the data center, thereby decreasing the power consumed in the WAN. In this paper, we focus on the energy-saving aspect of the DCN and evaluate its effectiveness, especially considering traffic locality, i.e., the amount of traffic related to the geographical vicinity. We first formulate the problem of optimizing the DCN power consumption and describe the DCN in detail. Then, numerical evaluations show that, when there is strong traffic locality and the router has ideal energy proportionality, the system's power consumption is reduced to about 50% of the power consumed in the case where a DCN is not used; moreover, this advantage becomes even larger (up to about 30%) when the data center is located farthest from the center of the network topology.

  3. A local computer network for the experimental data acquisition at BESSY

    International Nuclear Information System (INIS)

    Buchholz, W.

    1984-01-01

For the users of the Berlin dedicated electron storage ring for synchrotron radiation (BESSY) a local computer network has been installed. The system is designed primarily for data acquisition and offers the users a generous hardware provision combined with maximum software flexibility.

  4. Initialization and Restart in Stochastic Local Search: Computing a Most Probable Explanation in Bayesian Networks

    Science.gov (United States)

    Mengshoel, Ole J.; Wilkins, David C.; Roth, Dan

    2010-01-01

    For hard computational problems, stochastic local search has proven to be a competitive approach to finding optimal or approximately optimal problem solutions. Two key research questions for stochastic local search algorithms are: Which algorithms are effective for initialization? When should the search process be restarted? In the present work we investigate these research questions in the context of approximate computation of most probable explanations (MPEs) in Bayesian networks (BNs). We introduce a novel approach, based on the Viterbi algorithm, to explanation initialization in BNs. While the Viterbi algorithm works on sequences and trees, our approach works on BNs with arbitrary topologies. We also give a novel formalization of stochastic local search, with focus on initialization and restart, using probability theory and mixture models. Experimentally, we apply our methods to the problem of MPE computation, using a stochastic local search algorithm known as Stochastic Greedy Search. By carefully optimizing both initialization and restart, we reduce the MPE search time for application BNs by several orders of magnitude compared to using uniform at random initialization without restart. On several BNs from applications, the performance of Stochastic Greedy Search is competitive with clique tree clustering, a state-of-the-art exact algorithm used for MPE computation in BNs.
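The interplay of initialization and restart can be illustrated with a generic sketch (this is not the paper's Stochastic Greedy Search or its Viterbi-based initialization): greedy one-bit-flip hill climbing on a deceptive objective whose global optimum is reachable from only a few initial states, so restarts are essential.

```python
import random

# Objective with two local optima over n-bit strings: all-ones scores n
# (the global optimum), while any other state scores n-1-sum(bits), so
# greedy search from most starts descends to all-zeros (score n-1).

def score(bits):
    n = len(bits)
    return n if all(bits) else n - 1 - sum(bits)

def hill_climb(n, rng):
    """Best-improvement hill climbing from a uniform random start."""
    state = [rng.random() < 0.5 for _ in range(n)]
    while True:
        best_i, best_s = None, score(state)
        for i in range(n):
            state[i] = not state[i]   # tentatively flip bit i
            s = score(state)
            state[i] = not state[i]   # undo the flip
            if s > best_s:
                best_i, best_s = i, s
        if best_i is None:            # no flip improves: local optimum
            return best_s
        state[best_i] = not state[best_i]

def search_with_restarts(n, restarts, seed=0):
    """Return the best local-optimum score found over many restarts."""
    rng = random.Random(seed)
    return max(hill_climb(n, rng) for _ in range(restarts))
```

A single run almost always gets stuck at the deceptive optimum; with enough restarts some start lands in the narrow basin of the global optimum, mirroring the orders-of-magnitude effect of initialization and restart reported in the abstract.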

  5. Learning styles: individualizing computer-based learning environments

    Directory of Open Access Journals (Sweden)

    Tim Musson

    1995-12-01

Full Text Available While the need to adapt teaching to the needs of a student is generally acknowledged (see Corno and Snow, 1986, for a wide review of the literature), little is known about the impact of individual learner differences on the quality of learning attained within computer-based learning environments (CBLEs). What evidence there is appears to support the notion that individual differences have implications for the degree of success or failure experienced by students (Ford and Ford, 1992) and by trainee end-users of software packages (Bostrom et al., 1990). The problem is to identify the way in which specific individual characteristics of a student interact with particular features of a CBLE, and how the interaction affects the quality of the resultant learning. Teaching in a CBLE is likely to require a subset of teaching strategies different from that subset appropriate to more traditional environments, and the use of a machine may elicit different behaviours from those normally arising in a classroom context.

  6. Dynamic Scaffolding of Socially Regulated Learning in a Computer-Based Learning Environment

    Science.gov (United States)

    Molenaar, Inge; Roda, Claudia; van Boxtel, Carla; Sleegers, Peter

    2012-01-01

    The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N=56) are supported with computer-generated scaffolds and students in the control condition (N=54) do not receive scaffolds. The scaffolds are…

  7. Application of Selective Algorithm for Effective Resource Provisioning in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

Modern-day continued demand for resource-hungry services and applications in the IT sector has led to the development of cloud computing. The cloud computing environment involves high-cost infrastructure on the one hand and needs large-scale computational resources on the other. These resources need to be provisioned (allocated and scheduled) to the end users in the most efficient manner so that the tremendous capabilities of the cloud are utilized effectively and efficiently. In this paper we discuss a selecti...

  8. Computed tomography findings after radiofrequency ablation in locally advanced pancreatic cancer

    NARCIS (Netherlands)

    Rombouts, Steffi J. E.; Derksen, Tyche C.; Nio, Chung Y.; van Hillegersberg, Richard; van Santvoort, Hjalmar C.; Walma, Marieke S.; Molenaar, Izaak Q.; van Leeuwen, Maarten S.

    2018-01-01

The purpose of the study was to provide a systematic evaluation of the computed tomography (CT) findings after radiofrequency ablation (RFA) in locally advanced pancreatic cancer (LAPC). Eighteen patients with intra-operative RFA-treated LAPC were included in a prospective case series. All CT-scans

  9. Halo assembly bias and the tidal anisotropy of the local halo environment

    Science.gov (United States)

    Paranjape, Aseem; Hahn, Oliver; Sheth, Ravi K.

    2018-05-01

We study the role of the local tidal environment in determining the assembly bias of dark matter haloes. Previous results suggest that the anisotropy of a halo's environment (i.e. whether it lies in a filament or in a more isotropic region) can play a significant role in determining the eventual mass and age of the halo. We statistically isolate this effect, using correlations between the large-scale and small-scale environments of simulated haloes at z = 0 with masses in the range 10^11.6 ≲ m/(h⁻¹ M⊙) ≲ 10^14.9. We probe the large-scale environment, using a novel halo-by-halo estimator of linear bias. For the small-scale environment, we identify a variable αR that captures the tidal anisotropy in a region of radius R = 4R_200b around the halo and correlates strongly with halo bias at fixed mass. Segregating haloes by αR reveals two distinct populations. Haloes in highly isotropic local environments (αR ≲ 0.2) behave as expected from the simplest, spherically averaged analytical models of structure formation, showing a negative correlation between their concentration and large-scale bias at all masses. In contrast, haloes in anisotropic, filament-like environments (αR ≳ 0.5) tend to show a positive correlation between bias and concentration at any mass. Our multiscale analysis cleanly demonstrates how the overall assembly bias trend across halo mass emerges as an average over these different halo populations, and provides valuable insights towards building analytical models that correctly incorporate assembly bias. We also discuss potential implications for the nature and detectability of galaxy assembly bias.

  10. The Needs of Virtual Machines Implementation in Private Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Edy Kristianto

    2015-12-01

    Full Text Available The Internet of Things (IoT) has become the goal of the development of information and communication technology. Cloud computing has a very important role in supporting the IoT, because cloud computing allows services to be provided in the form of infrastructure (IaaS), platform (PaaS), and software (SaaS) for its users. One of the fundamental services is infrastructure as a service (IaaS). This study analyzed the requirements, based on the NIST framework, for realizing infrastructure as a service in the form of virtual machines built in a cloud computing environment.

  11. Blockchain-based database to ensure data integrity in cloud computing environments

    OpenAIRE

    Gaetani, Edoardo; Aniello, Leonardo; Baldoni, Roberto; Lombardi, Federico; Margheri, Andrea; Sassone, Vladimiro

    2017-01-01

    Data is nowadays an invaluable resource, indeed it guides all business decisions in most of the computer-aided human activities. Threats to data integrity are thus of paramount relevance, as tampering with data may maliciously affect crucial business decisions. This issue is especially true in cloud computing environments, where data owners cannot control fundamental data aspects, like the physical storage of data and the control of its accesses. Blockchain has recently emerged as a fascinati...

  12. Computed tomography of localized dilatation of the intrahepatic bile ducts

    International Nuclear Information System (INIS)

    Araki, T.; Itai, Y.; Tasaka, A.

    1981-01-01

    Twenty-nine patients showed localized dilatation of the intrahepatic bile ducts on computed tomography, usually unaccompanied by jaundice. Congenital dilatation was diagnosed when associated with a choledochal cyst, while cholangiographic contrast material was helpful in differentiating such dilatation from a simple cyst by showing its communication with the biliary tract when no choledochal cyst was present. Obstructive dilatation was associated with intrahepatic calculi in 4 cases, hepatoma in 9, cholangioma in 5, metastatic tumor in 5, and polycystic disease in 2. Cholangioma and intrahepatic calculi had a greater tendency to accompany such localized dilatation; in 2 cases, the dilatation was the only clue to the underlying disorder

  13. NOSTOS: a paper-based ubiquitous computing healthcare environment to support data capture and collaboration.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Eriksson, Henrik

    2003-01-01

    In this paper, we present a new approach to clinical workplace computerization that departs from the window-based user interface paradigm. NOSTOS is an experimental computer-augmented work environment designed to support data capture and teamwork in an emergency room. NOSTOS combines multiple technologies, such as digital pens, walk-up displays, headsets, a smart desk, and sensors to enhance an existing paper-based practice with computer power. The physical interfaces allow clinicians to retain mobile paper-based collaborative routines and still benefit from computer technology. The requirements for the system were elicited from situated workplace studies. We discuss the advantages and disadvantages of augmenting a paper-based clinical work environment.

  14. Imaging local brain function with emission computed tomography

    International Nuclear Information System (INIS)

    Kuhl, D.E.

    1984-01-01

    Positron emission tomography (PET) using 18F-fluorodeoxyglucose (FDG) was used to map local cerebral glucose utilization in the study of local cerebral function. This information differs fundamentally from structural assessment by means of computed tomography (CT). In normal human volunteers, the FDG scan was used to determine the cerebral metabolic response to controlled sensory stimulation and the effects of aging. Cerebral metabolic patterns are distinctive among depressed and demented elderly patients. The FDG scan appears normal in the depressed patient and studded with multiple metabolic defects in patients with multiple infarct dementia; in patients with Alzheimer disease, metabolism is particularly reduced in the parietal cortex, but only slightly reduced in the caudate and thalamus. The interictal FDG scan effectively detects hypometabolic brain zones that are sites of onset for seizures in patients with partial epilepsy, even though these zones usually appear normal on CT scans. The future prospects of PET are discussed.

  15. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them towards running large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications, and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to contribute to running advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational sizes. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large-scale hydrological simulations and model runs in an open and integrated environment.
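    As a rough illustration of the task distribution this record describes (a hypothetical sketch, not the platform's own code), a coordinator can split a simulation domain into small independent tasks, queue them, and merge the partial results returned by volunteer nodes:

```python
# Hypothetical sketch of queue-based volunteer computing: split the work into
# small chunks, hand each chunk to a volunteer, and merge the partial results.
from queue import Queue

def make_tasks(domain_size, chunk_size):
    """Split an index range into small, independently computable tasks."""
    return [(start, min(start + chunk_size, domain_size))
            for start in range(0, domain_size, chunk_size)]

def volunteer_worker(task):
    """Stand-in for a browser-side model run: sums f(i) = i*i over the task."""
    start, end = task
    return sum(i * i for i in range(start, end))

def run_distributed(domain_size, chunk_size):
    """Coordinator loop: enqueue tasks, collect each volunteer's result."""
    tasks = Queue()
    for task in make_tasks(domain_size, chunk_size):
        tasks.put(task)
    total = 0
    while not tasks.empty():
        total += volunteer_worker(tasks.get())
    return total
```

    Because the tasks are independent, the merged result matches a serial run regardless of chunk size; a real deployment would replace the in-process queue with the database-backed queue the abstract mentions.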

  16. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of micro-computer software for nuclear safety systems. More specifically, it describes experimental work in the field of software development methodologies to be used for the implementation of micro-computer based safety systems. An investigation of the technological improvements provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work aimed to assess a suitable automated tools environment for the whole software life-cycle. The main safety functions of a nuclear power reactor, such as DNBR and KW/FT, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to the new technological trends of ''Software Factories''. (author)

  17. Bronchobiliary Fistula Localized by Cholescintigraphy with Single-Photon Emission Computed Tomography

    International Nuclear Information System (INIS)

    Artunduaga, Maddy; Patel, Niraj R.; Wendt, Julie A.; Guy, Elizabeth S.; Nachiappan, Arun C.

    2015-01-01

    Biliptysis is an important clinical feature to recognize as it is associated with bronchobiliary fistula, a rare entity. Bronchobiliary fistulas have been diagnosed with planar cholescintigraphy. However, cholescintigraphy with single-photon emission computed tomography (SPECT) can better spatially localize a bronchobiliary fistula as compared to planar cholescintigraphy alone, and is useful for preoperative planning if surgical treatment is required. Here, we present the case of a 23-year-old male who developed a bronchobiliary fistula in the setting of posttraumatic and postsurgical infection, which was diagnosed and localized by cholescintigraphy with SPECT

  18. Bound on quantum computation time: Quantum error correction in a critical environment

    International Nuclear Information System (INIS)

    Novais, E.; Mucciolo, Eduardo R.; Baranger, Harold U.

    2010-01-01

    We obtain an upper bound on the time available for quantum computation for a given quantum computer and decohering environment with quantum error correction implemented. First, we derive an explicit quantum evolution operator for the logical qubits and show that it has the same form as that for the physical qubits but with a reduced coupling strength to the environment. Using this evolution operator, we find the trace distance between the real and ideal states of the logical qubits in two cases. For a super-Ohmic bath, the trace distance saturates, while for Ohmic or sub-Ohmic baths, there is a finite time before the trace distance exceeds a value set by the user.
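    For a flavor of the quantity the record compares, the trace distance D(ρ, σ) = ½ Tr|ρ − σ| between two single-qubit states has a closed form, since the eigenvalues of a 2×2 Hermitian matrix are known analytically (an illustrative helper, not code from the paper):

```python
# Trace distance between two 2x2 density matrices, using the closed-form
# eigenvalues of the Hermitian difference rho - sigma. Pure Python; the
# matrices are nested lists with real or complex entries.
import math

def trace_distance_2x2(rho, sigma):
    """Return (1/2) * Tr|rho - sigma| for 2x2 density matrices."""
    a = rho[0][0] - sigma[0][0]          # top-left of the difference
    b = rho[0][1] - sigma[0][1]          # off-diagonal of the difference
    d = rho[1][1] - sigma[1][1]          # bottom-right of the difference
    mean = (a + d).real / 2.0
    rad = math.sqrt(((a - d).real / 2.0) ** 2 + abs(b) ** 2)
    lam1, lam2 = mean + rad, mean - rad  # eigenvalues of rho - sigma
    return 0.5 * (abs(lam1) + abs(lam2))
```

    Orthogonal pure states give the maximal distance 1, and identical states give 0, which matches the role of the trace distance as a distinguishability measure between the real and ideal logical-qubit states.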

  19. Service ORiented Computing EnviRonment (SORCER) for Deterministic Global and Stochastic Optimization

    OpenAIRE

    Raghunath, Chaitra

    2015-01-01

    With rapid growth in the complexity of large scale engineering systems, the application of multidisciplinary analysis and design optimization (MDO) in the engineering design process has garnered much attention. MDO addresses the challenge of integrating several different disciplines into the design process. Primary challenges of MDO include computational expense and poor scalability. The introduction of a distributed, collaborative computational environment results in better...

  20. Providing a computing environment for a high energy physics workshop

    International Nuclear Information System (INIS)

    Nicholls, J.

    1991-03-01

    Although computing facilities have been provided at conferences and workshops remote from the host institution for some years, the equipment provided has rarely been capable of providing for much more than simple editing and electronic mail over leased lines. This presentation describes the pioneering effort by the Computing Department/Division at Fermilab in providing a local computing facility with world-wide networking capability for the Physics at Fermilab in the 1990's workshop held in Breckenridge, Colorado, in August 1989, as well as the enhanced facilities provided for the 1990 Summer Study on High Energy Physics at Snowmass, Colorado, in June/July 1990. Issues discussed include the type and sizing of the facilities, advance preparations, shipping, and on-site support, as well as an evaluation of the value of the facility to the workshop participants

  1. The Development and Evaluation of a Computer-Simulated Science Inquiry Environment Using Gamified Elements

    Science.gov (United States)

    Tsai, Fu-Hsing

    2018-01-01

    This study developed a computer-simulated science inquiry environment, called the Science Detective Squad, to engage students in investigating an electricity problem that may happen in daily life. The environment combined the simulation of scientific instruments and a virtual environment, including gamified elements, such as points and a story for…

  2. Computer Graphics Orientation and Training in a Corporate/Production Environment.

    Science.gov (United States)

    McDevitt, Marsha Jean

    This master's thesis provides an overview of a computer graphics production environment and proposes a realistic approach to orientation and on-going training for employees working within a fast-paced production schedule. Problems involved in meeting the training needs of employees are briefly discussed in the first chapter, while the second…

  3. Enhanced Survey and Proposal to secure the data in Cloud Computing Environment

    OpenAIRE

    MR.S.SUBBIAH; DR.S.SELVA MUTHUKUMARAN; DR.T.RAMKUMAR

    2013-01-01

    Cloud computing has the power to eliminate the cost of setting up high-end computing infrastructure. It is a promising design that gives a very flexible architecture, accessible through the Internet. In the cloud computing environment the data may reside at any of the data centers. Because of that, a data center may leak the data stored there, beyond the reach and control of the users. For this kind of misbehaving data center, the service providers should take care of the security and...

  4. Real Time Vision System for Obstacle Detection and Localization on FPGA

    OpenAIRE

    Alhamwi , Ali; Vandeportaele , Bertrand; Piat , Jonathan

    2015-01-01

    International audience; Obstacle detection is a mandatory function for a robot navigating in an indoor environment especially when interaction with humans is done in a cluttered environment. Commonly used vision-based solutions like SLAM (Simultaneous Localization and Mapping) or optical flow tend to be computation intensive and require powerful computation resources to meet low speed real-time constraints. Solutions using LIDAR (Light Detection And Ranging) sensors are more robust but not co...

  5. Deception Detection in a Computer-Mediated Environment: Gender, Trust, and Training Issues

    National Research Council Canada - National Science Library

    Dziubinski, Monica

    2003-01-01

    .... This research draws on communication and deception literature to develop a conceptual model proposing relationships between deception detection abilities in a computer-mediated environment, gender, trust, and training...

  6. Local bureaucrats as bricoleurs. The everyday implementation practices of county environment officers in rural Kenya

    Directory of Open Access Journals (Sweden)

    Mikkel Funder

    2015-03-01

    Full Text Available Bricolage in natural resource governance takes place through the interplay of a variety of actors. This article explores the practices of a group whose agency as bricoleurs has received little attention, namely the government officers who represent the state in the everyday management of water, land, forests and other resources across rural Africa. Specifically we examine how local Environment Officers in Taita Taveta County in Kenya go about implementing the national environmental law on the ground, and how they interact with communities in this process. As representatives of “the local state”, the Environment Officers occupy an ambiguous position in which they are expected to implement lofty laws and policies with limited means and in a complex local reality. In response to this they employ three key practices, namely (i) working through personal networks, (ii) tailoring informal agreements, and (iii) delegating public functions and authority to civil society. As a result, the environmental law is to a large extent implemented through a blend of formal and informal rules and governance arrangements, produced through the interplay of the Environment Officers, communities and other local actors.

  7. BEAM: A computational workflow system for managing and modeling material characterization data in HPC environments

    Energy Technology Data Exchange (ETDEWEB)

    Lingerfelt, Eric J [ORNL; Endeve, Eirik [ORNL; Ovchinnikov, Oleg S [ORNL; Borreguero Calvo, Jose M [ORNL; Park, Byung H [ORNL; Archibald, Richard K [ORNL; Symons, Christopher T [ORNL; Kalinin, Sergei V [ORNL; Messer, Bronson [ORNL; Shankar, Mallikarjun [ORNL; Jesse, Stephen [ORNL

    2016-01-01

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales, many spectroscopic modes, and now with the rise of multimodal acquisition systems and the associated processing capability the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides material scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation via an intuitive, cross-platform client user interface. This framework delivers authenticated, push-button execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing the converged compute-and-data infrastructure at Oak Ridge National Laboratory's (ORNL) Compute and Data Environment for Science (CADES) and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF). In this work we address the underlying HPC needs for characterization in the material science community, elaborate how BEAM's design and infrastructure tackle those needs, and present a small subset of use cases where scientists utilized BEAM across a broad range of analytical techniques and analysis modes.

  8. A Wireless Sensor Network with Soft Computing Localization Techniques for Track Cycling Applications.

    Science.gov (United States)

    Gharghan, Sadik Kamel; Nordin, Rosdiadee; Ismail, Mahamod

    2016-08-06

    In this paper, we propose two soft computing localization techniques for wireless sensor networks (WSNs). The two techniques, the Adaptive Neural Fuzzy Inference System (ANFIS) and the Artificial Neural Network (ANN), focus on a range-based localization method which relies on the measurement of the received signal strength indicator (RSSI) from the three ZigBee anchor nodes distributed throughout the track cycling field. The soft computing techniques aim to estimate the distance between bicycles moving on the cycle track for outdoor and indoor velodromes. In the first approach the ANFIS was considered, whereas in the second approach the ANN was hybridized individually with three optimization algorithms, namely Particle Swarm Optimization (PSO), the Gravitational Search Algorithm (GSA), and the Backtracking Search Algorithm (BSA). The results revealed that the hybrid GSA-ANN outperforms the other methods adopted in this paper in terms of localization accuracy and distance estimation accuracy. The hybrid GSA-ANN achieves a mean absolute distance estimation error of 0.02 m and 0.2 m for outdoor and indoor velodromes, respectively.
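    The range-based step in RSSI localization schemes of this kind is typically an inversion of the log-distance path-loss model; the sketch below is a generic illustration, and the reference power P0 and path-loss exponent n are assumed demonstration values, not parameters from the paper:

```python
# Log-distance path-loss inversion: estimate distance from a measured RSSI.
# Model: RSSI = P0 - 10 * n * log10(d / d0), solved for d.
# P0 (RSSI at reference distance d0) and n are illustrative assumptions.
def rssi_to_distance(rssi_dbm, p0_dbm=-40.0, n=2.0, d0=1.0):
    """Return the estimated distance in metres for a measured RSSI in dBm."""
    return d0 * 10.0 ** ((p0_dbm - rssi_dbm) / (10.0 * n))
```

    With three anchor nodes, three such distance estimates feed a trilateration or, as in the record above, a trained soft-computing model that maps RSSI directly to distance.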

  9. Error Analysis for RADAR Neighbor Matching Localization in Linear Logarithmic Strength Varying Wi-Fi Environment

    Directory of Open Access Journals (Sweden)

    Mu Zhou

    2014-01-01

    Full Text Available This paper studies the statistical errors for the fingerprint-based RADAR neighbor matching localization with the linearly calibrated reference points (RPs) in logarithmic received signal strength (RSS) varying Wi-Fi environment. To the best of our knowledge, little comprehensive analysis work has appeared on the error performance of neighbor matching localization with respect to the deployment of RPs. However, in order to achieve the efficient and reliable location-based services (LBSs) as well as the ubiquitous context-awareness in Wi-Fi environment, much attention has to be paid to the highly accurate and cost-efficient localization systems. To this end, the statistical errors by the widely used neighbor matching localization are significantly discussed in this paper to examine the inherent mathematical relations between the localization errors and the locations of RPs by using a basic linear logarithmic strength varying model. Furthermore, based on the mathematical demonstrations and some testing results, the closed-form solutions to the statistical errors by RADAR neighbor matching localization can be an effective tool to explore alternative deployment of fingerprint-based neighbor matching localization systems in the future.

  10. Error Analysis for RADAR Neighbor Matching Localization in Linear Logarithmic Strength Varying Wi-Fi Environment

    Science.gov (United States)

    Tian, Zengshan; Xu, Kunjie; Yu, Xiang

    2014-01-01

    This paper studies the statistical errors for the fingerprint-based RADAR neighbor matching localization with the linearly calibrated reference points (RPs) in logarithmic received signal strength (RSS) varying Wi-Fi environment. To the best of our knowledge, little comprehensive analysis work has appeared on the error performance of neighbor matching localization with respect to the deployment of RPs. However, in order to achieve the efficient and reliable location-based services (LBSs) as well as the ubiquitous context-awareness in Wi-Fi environment, much attention has to be paid to the highly accurate and cost-efficient localization systems. To this end, the statistical errors by the widely used neighbor matching localization are significantly discussed in this paper to examine the inherent mathematical relations between the localization errors and the locations of RPs by using a basic linear logarithmic strength varying model. Furthermore, based on the mathematical demonstrations and some testing results, the closed-form solutions to the statistical errors by RADAR neighbor matching localization can be an effective tool to explore alternative deployment of fingerprint-based neighbor matching localization systems in the future. PMID:24683349
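    The neighbor matching analyzed in these two records can be sketched as a nearest-fingerprint search in signal space (a minimal illustration with made-up data; RADAR proper averages the positions of the k nearest RPs rather than returning a single one):

```python
# Fingerprint-based neighbor matching: compare an observed RSS vector against
# the calibrated reference points (RPs) and return the position of the RP
# whose stored fingerprint is closest in signal space.
import math

def nearest_rp(observed_rss, fingerprints):
    """fingerprints: {position: rss_vector}; returns the best-matching position."""
    def rss_dist(stored):
        # Euclidean distance in RSS space between observation and fingerprint.
        return math.sqrt(sum((o - r) ** 2 for o, r in zip(observed_rss, stored)))
    return min(fingerprints, key=lambda pos: rss_dist(fingerprints[pos]))
```

    The statistical errors studied in the record arise precisely from how these RP positions are deployed relative to the true location.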

  11. Structural analysis of magnetic fusion energy systems in a combined interactive/batch computer environment

    International Nuclear Information System (INIS)

    Johnson, N.E.; Singhal, M.K.; Walls, J.C.; Gray, W.H.

    1979-01-01

    A system of computer programs has been developed to aid in the preparation of input data for and the evaluation of output data from finite element structural analyses of magnetic fusion energy devices. The system utilizes the NASTRAN structural analysis computer program and a special set of interactive pre- and post-processor computer programs, and has been designed for use in an environment wherein a time-share computer system is linked to a batch computer system. In such an environment, the analyst must only enter, review and/or manipulate data through interactive terminals linked to the time-share computer system. The primary pre-processor programs include NASDAT, NASERR and TORMAC. NASDAT and TORMAC are used to generate NASTRAN input data. NASERR performs routine error checks on this data. The NASTRAN program is run on a batch computer system using data generated by NASDAT and TORMAC. The primary post-processing programs include NASCMP and NASPOP. NASCMP is used to compress the data initially stored on magnetic tape by NASTRAN so as to facilitate interactive use of the data. NASPOP reads the data stored by NASCMP and reproduces NASTRAN output for selected grid points, elements and/or data types

  12. Virtualization of the ATLAS software environment on a shared HPC system

    CERN Document Server

    Schnoor, Ulrike; The ATLAS collaboration

    2017-01-01

    High-Performance Computing (HPC) and other research cluster computing resources provided by universities can be useful supplements to the collaboration’s own WLCG computing resources for data analysis and production of simulated event samples. The shared HPC cluster "NEMO" at the University of Freiburg has been made available to local ATLAS users through the provisioning of virtual machines incorporating the ATLAS software environment analogously to a WLCG center. The talk describes the concept and implementation of virtualizing the ATLAS software environment to run both data analysis and production on the HPC host system which is connected to the existing Tier-3 infrastructure. Main challenges include the integration into the NEMO and Tier-3 schedulers in a dynamic, on-demand way, the scalability of the OpenStack infrastructure, as well as the automatic generation of a fully functional virtual machine image providing access to the local user environment, the dCache storage element and the parallel file sys...

  13. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    Science.gov (United States)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH---a framework for reducing the complexity of programming heterogeneous computer systems, 2) geophysical inversion routines which can be used to characterize physical systems, and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes.
Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that
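    The run-time autotuning component can be sketched as a search over candidate configurations with a user-supplied cost measure (a generic illustration of the idea, not CUSH's actual interface):

```python
# Hypothetical run-time autotuner: evaluate each candidate configuration with
# a cost function (e.g. the measured runtime of a short benchmark on the
# target accelerator) and keep the cheapest one.
def autotune(configs, measure):
    """Return (best_config, best_cost) over the candidate configurations."""
    costs = {cfg: measure(cfg) for cfg in configs}
    best = min(costs, key=costs.get)
    return best, costs[best]
```

    In practice `measure` would time a trial run of the kernel for each block size or thread layout; here a deterministic stand-in cost is enough to show the selection logic.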

  14. The Effects of a Robot Game Environment on Computer Programming Education for Elementary School Students

    Science.gov (United States)

    Shim, Jaekwoun; Kwon, Daiyoung; Lee, Wongyu

    2017-01-01

    In the past, computer programming was perceived as a task only carried out by computer scientists; in the 21st century, however, computer programming is viewed as a critical and necessary skill that everyone should learn. In order to improve teaching of problem-solving abilities in a computing environment, extensive research is being done on…

  15. MDA-image: an environment of networked desktop computers for teleradiology/pathology.

    Science.gov (United States)

    Moffitt, M E; Richli, W R; Carrasco, C H; Wallace, S; Zimmerman, S O; Ayala, A G; Benjamin, R S; Chee, S; Wood, P; Daniels, P

    1991-04-01

    MDA-Image, a project of The University of Texas M. D. Anderson Cancer Center, is an environment of networked desktop computers for teleradiology/pathology. Radiographic film is digitized with a film scanner and histopathologic slides are digitized using a red, green, and blue (RGB) video camera connected to a microscope. Digitized images are stored on a data server connected to the institution's computer communication network (Ethernet) and can be displayed from authorized desktop computers connected to Ethernet. Images are digitized for cases presented at the Bone Tumor Management Conference, a multidisciplinary conference in which treatment options are discussed among clinicians, surgeons, radiologists, pathologists, radiotherapists, and medical oncologists. These radiographic and histologic images are shown on a large screen computer monitor during the conference. They are available for later review for follow-up or representation.

  16. Method and system for rendering and interacting with an adaptable computing environment

    Science.gov (United States)

    Osbourn, Gordon Cecil [Albuquerque, NM; Bouchard, Ann Marie [Albuquerque, NM

    2012-06-12

    An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.

  17. Study on User Authority Management for Safe Data Protection in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Su-Hyun Kim

    2015-03-01

    Full Text Available In cloud computing environments, user data are encrypted using numerous distributed servers before storing such data. Global Internet service companies, such as Google and Yahoo, recognized the importance of Internet service platforms and conducted self-research and development to create and utilize large cluster-based cloud computing platform technology based on low-priced commercial nodes. As diverse data services become possible in distributed computing environments, high-capacity distributed management is emerging as a major issue. Meanwhile, because of the diverse forms of using high-capacity data, security vulnerability and privacy invasion by malicious attackers or internal users can occur. As such, when various sensitive data are stored in cloud servers and used from there, the problem of data spill might occur because of external attackers or the poor management of internal users. Data can be managed through encryption to prevent such problems. However, existing simple encryption methods involve problems associated with the management of access to data stored in cloud environments. Therefore, in the present paper, a technique for data access management by user authority, based on Attribute-Based Encryption (ABE) and secret distribution techniques, is proposed.
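    The "secret distribution" ingredient can be illustrated with Shamir's (k, n) threshold scheme, in which any k of n shares reconstruct the secret but fewer reveal nothing (a self-contained sketch for demonstration; the paper's actual construction combines secret distribution with ABE and is not reproduced here):

```python
# Shamir's (k, n) threshold secret sharing over a prime field. The field size
# and parameters are demonstration choices, not values from the paper.
import random

PRIME = 2 ** 127 - 1  # Mersenne prime, large enough for a demo secret

def split_secret(secret, k, n):
    """Create n shares of `secret`; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n + 1)]

def recover_secret(shares):
    """Lagrange interpolation of the sharing polynomial at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # Multiply by the modular inverse of den (Fermat's little theorem).
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

    Distributing shares across servers means no single compromised data center can recover the protected key, which is the property the record relies on.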

  18. A Single LiDAR-Based Feature Fusion Indoor Localization Algorithm.

    Science.gov (United States)

    Wang, Yun-Ting; Peng, Chao-Chung; Ravankar, Ankit A; Ravankar, Abhijeet

    2018-04-23

    In past years, there has been significant progress in the field of indoor robot localization. To precisely recover position, robots usually rely on multiple on-board sensors. Nevertheless, this affects the overall system cost and increases computation. In this research work, we considered a light detection and ranging (LiDAR) device as the only sensor for detecting surroundings and propose an efficient indoor localization algorithm. To attenuate the computation effort and preserve localization robustness, a weighted parallel iterative closest point (WP-ICP) method with interpolation is presented. As compared to traditional ICP, the point cloud is first processed to extract corner and line features before applying point registration. Later, points labeled as corners are only matched with the corner candidates. Similarly, points labeled as lines are only matched with the line candidates. Moreover, their ICP confidence levels are also fused in the algorithm, which makes the pose estimation less sensitive to environment uncertainties. The proposed WP-ICP architecture reduces the probability of mismatch and thereby reduces the ICP iterations. Finally, based on given well-constructed indoor layouts, experiment comparisons are carried out under both clean and perturbed environments. It is shown that the proposed method is effective in significantly reducing computation effort and is simultaneously able to preserve localization precision.

  19. A Single LiDAR-Based Feature Fusion Indoor Localization Algorithm

    Directory of Open Access Journals (Sweden)

    Yun-Ting Wang

    2018-04-01

    Full Text Available In past years, there has been significant progress in the field of indoor robot localization. To precisely recover the position, robots usually rely on multiple on-board sensors. Nevertheless, this affects the overall system cost and increases computation. In this research work, we considered a light detection and ranging (LiDAR) device as the only sensor for detecting surroundings and propose an efficient indoor localization algorithm. To attenuate the computation effort and preserve localization robustness, a weighted parallel iterative closest point (WP-ICP) with interpolation is presented. As compared to the traditional ICP, the point cloud is first processed to extract corners and line features before applying point registration. Later, points labeled as corners are only matched with the corner candidates. Similarly, points labeled as lines are only matched with the line candidates. Moreover, their ICP confidence levels are also fused in the algorithm, which makes the pose estimation less sensitive to environment uncertainties. The proposed WP-ICP architecture reduces the probability of mismatch and thereby reduces the ICP iterations. Finally, based on given well-constructed indoor layouts, experiment comparisons are carried out under both clean and perturbed environments. It is shown that the proposed method is effective in significantly reducing computation effort and is simultaneously able to preserve localization precision.

  20. The Use of Computer Simulation to Compare Student performance in Traditional versus Distance Learning Environments

    Directory of Open Access Journals (Sweden)

    Retta Guy

    2015-06-01

    Full Text Available Simulations have been shown to be an effective tool in traditional learning environments; however, as distance learning grows in popularity, the need to examine simulation effectiveness in this environment has become paramount. A causal-comparative design was chosen for this study to determine whether students using a computer-based instructional simulation in hybrid and fully online environments learned better than traditional classroom learners. The study spans a period of 6 years beginning fall 2008 through spring 2014. The population studied was 281 undergraduate business students self-enrolled in a 200-level microcomputer application course. The overall results support previous studies in that computer simulations are most effective when used as a supplement to face-to-face lectures and in hybrid environments.

  1. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment

    Science.gov (United States)

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing. PMID:28467505
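
    To make the comparison concrete, two of the six heuristics can be sketched on an ETC (expected time to compute) matrix, where etc[t][m] is the runtime of task t on machine m. This is an illustrative reimplementation, not the authors' code; ties are broken by index order:

```python
def fcfs(etc):
    """First Come First Serve: tasks in arrival order go to the machine
    that becomes free earliest, regardless of their runtime there."""
    ready = [0.0] * len(etc[0])
    for t in range(len(etc)):
        m = ready.index(min(ready))
        ready[m] += etc[t][m]
    return max(ready)                     # makespan

def min_min(etc):
    """Min-min: repeatedly pick the (task, machine) pair with the smallest
    completion time, schedule it, and update that machine's ready time."""
    ready = [0.0] * len(etc[0])
    unscheduled = set(range(len(etc)))
    while unscheduled:
        best_t, best_m, best_ct = None, None, float("inf")
        for t in sorted(unscheduled):
            for m in range(len(ready)):
                ct = ready[m] + etc[t][m]
                if ct < best_ct:
                    best_t, best_m, best_ct = t, m, ct
        unscheduled.remove(best_t)
        ready[best_m] = best_ct
    return max(ready)                     # makespan
```

    On a heterogeneous ETC matrix the two heuristics generally produce different makespans, which is exactly the kind of difference the study quantifies across cost, imbalance, makespan, and throughput.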

  2. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment.

    Science.gov (United States)

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.

  3. Visual Reasoning in Computational Environment: A Case of Graph Sketching

    Science.gov (United States)

    Leung, Allen; Chan, King Wah

    2004-01-01

    This paper reports the case of a form six (grade 12) Hong Kong student's exploration of graph sketching in a computational environment. In particular, the student summarized his discovery in the form of two empirical laws. The student was interviewed and the interview data were used to map out a possible path of his visual reasoning. Critical…

  4. Beekeeping, environment and modernity in localities in Yucatan, Mexico

    Directory of Open Access Journals (Sweden)

    Enrique Rodríguez Balam

    2015-05-01

    Full Text Available http://dx.doi.org/10.5007/2175-7925.2015v28n3p143 In this paper, we reflect on the local knowledge about the European honey bee Apis mellifera scutellata, namely its biology, behavior, social structure, communication, and the relationships that these organisms maintain with the environment and their natural enemies. We also discuss the impacts that land use has on this economic activity. The empirical knowledge of beekeepers converges quite well with the scientific knowledge concerning this group of organisms.

  5. Integrating the environment in local strategic planning : Guidelines (Case of Morocco)

    Science.gov (United States)

    Benbrahim, Hafsa

    2018-05-01

    Since 2010, an advanced regionalization project has been initiated by Morocco, which plans to consolidate the processes of decentralization and deconcentration by extending the powers of the regions and other local authorities. This project, institutionalized in the 2011 Constitution, defines the territorial organization of the Kingdom and reinforces decentralization according to a model of advanced regionalization. Through advanced regionalization, Morocco aims at integrated and sustainable development in economic, social, cultural and environmental terms, through the development of the potential and resources of each region. However, in order to honor this commitment of advanced regionalization, local authorities must be assisted in adopting a local strategic planning approach, allowing them to develop territorial plans for sustainable development in accordance with the national legal framework, specifically the Framework law 99-12, and international commitments in terms of environmental protection. This research deals with the issue of environmental governance in relation to the role and duties of local authorities. Thus, the main goal of our study is to present the guidelines to be followed by the local authorities to improve the quality of the environment integration process in the local strategic planning with the aim of putting it in a perspective of sustainable development.

  6. Quality control of computational fluid dynamics in indoor environments

    DEFF Research Database (Denmark)

    Sørensen, Dan Nørtoft; Nielsen, P. V.

    2003-01-01

    Computational fluid dynamics (CFD) is used routinely to predict air movement and distributions of temperature and concentrations in indoor environments. Modelling and numerical errors are inherent in such studies and must be considered when the results are presented. Here, we discuss modelling as...... the quality of CFD calculations, as well as guidelines for the minimum information that should accompany all CFD-related publications to enable a scientific judgment of the quality of the study....

  7. Performing a local reduction operation on a parallel computer

    Science.gov (United States)

    Blocksome, Michael A.; Faraj, Daniel A.

    2012-12-11

    A parallel computer including compute nodes, each including two reduction processing cores, a network write processing core, and a network read processing core, each processing core assigned an input buffer. Copying, in interleaved chunks by the reduction processing cores, contents of the reduction processing cores' input buffers to an interleaved buffer in shared memory; copying, by one of the reduction processing cores, contents of the network write processing core's input buffer to shared memory; copying, by another of the reduction processing cores, contents of the network read processing core's input buffer to shared memory; and locally reducing in parallel by the reduction processing cores: the contents of the reduction processing core's input buffer; every other interleaved chunk of the interleaved buffer; the copied contents of the network write processing core's input buffer; and the copied contents of the network read processing core's input buffer.
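
    The interleaved copy-and-reduce pattern in this record can be modelled in miniature: several equally sized input buffers are summed elementwise, with each "reduction core" owning alternating chunks of the output. The chunk size and worker count below are illustrative, and the sequential Python loop merely models what the patent performs in parallel over shared memory:

```python
def local_reduce(buffers, chunk=4, n_cores=2):
    """Elementwise sum of equally sized input buffers; core k processes
    chunks k, k + n_cores, k + 2*n_cores, ... (interleaved ownership, so
    no two cores ever write the same output element)."""
    n = len(buffers[0])
    out = [0] * n
    for core in range(n_cores):
        for start in range(core * chunk, n, n_cores * chunk):
            for i in range(start, min(start + chunk, n)):
                out[i] = sum(buf[i] for buf in buffers)
    return out
```

    The interleaving is what lets the two reduction cores of the record work on the shared buffer concurrently without synchronizing per element.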

  8. Tablet computers and eBooks. Unlocking the potential for personal learning environments?

    NARCIS (Netherlands)

    Kalz, Marco

    2012-01-01

    Kalz, M. (2012, 9 May). Tablet computers and eBooks. Unlocking the potential for personal learning environments? Invited presentation during the annual conference of the European Association for Distance Learning (EADL), Noordwijkerhout, The Netherlands.

  9. Environment mapping and localization with an uncontrolled swarm of ultrasound sensor motes

    NARCIS (Netherlands)

    Duisterwinkel, E.; Demi, L.; Dubbelman, G.; Talnishnikh, E.; Wörtche, H.J.; Bergmans, J.W.M.

    2014-01-01

    A method is presented in which a (large) swarm of sensor motes perform simple ultrasonic ranging measurements. The method allows the motes to be localized within the swarm and, at the same time, maps the environment which the swarm has traversed. The motes float passively uncontrolled through the

  10. Application of local area network technology in an engineering environment

    International Nuclear Information System (INIS)

    Powell, A.D.; Sokolowski, M.A.

    1990-01-01

    This paper reports on the application of local area network technology in an engineering environment. Mobil Research and Development Corporation Engineering, Dallas, Texas has installed a local area network (LAN) linking over 85 microcomputers. This network, which has been in existence for more than three years, provides common access by all engineers to quality output devices such as laser printers and multi-color pen plotters; IBM mainframe connections; electronic mail and file transfer; and common engineering programs. The network has been expanded via a wide area Ethernet network to link the Dallas location with a functionally equivalent LAN of over 400 microcomputers in Princeton, N.J. Additionally, engineers on assignment in remote areas in Europe, the U.S., and Africa, as well as project task forces, have dial-in access to the network via telephone lines.

  11. Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure

    International Nuclear Information System (INIS)

    Yokohama, Noriya

    2013-01-01

    This report was aimed at structuring the design of architectures and studying performance measurement of a parallel computing environment for Monte Carlo simulation for particle therapy planning, using a high performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed a speed approximately 28 times faster than that of a single-thread architecture, combined with improved stability. A study of methods of optimizing the system operations also indicated lower cost. (author)

  12. Individually controlled localized chilled beam in conjunction with chilled ceiling: Part 1 – Physical environment

    DEFF Research Database (Denmark)

    Arghand, Taha; Bolashikov, Zhecho Dimitrov; Kosonen, Risto

    2016-01-01

    This study investigates the indoor environment generated by localized chilled beam coupled with chilled ceiling (LCBCC) and compares it with the environment generated by mixing ventilation coupled with chilled ceiling (CCMV). The experiments were performed in a mock-up of single office (4.1 m × 4...

  13. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Sanggoo Kang

    2016-08-01

    Full Text Available Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-based image processing algorithms by comparing the performance of a single virtual server and multiple auto-scaled virtual servers under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments. The auto-scaling results from all experimental performance tests demonstrate a practically significant benefit in response time and cloud utilization. Auto-scaling contributes to the development of web-based satellite image application services using cloud-based technologies.
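
    The scale-out/scale-in decision at the heart of auto-scaling reduces to a threshold rule evaluated once per monitoring cycle. The sketch below is a generic illustration of that rule; the thresholds, limits, and function name are assumptions made here, not OpenStack's actual policy engine:

```python
def autoscale(cpu_utilization, n_servers, scale_out=0.75, scale_in=0.25,
              min_servers=1, max_servers=8):
    """One auto-scaling evaluation: return the new number of virtual
    servers given the pool's average CPU utilization (0..1).
    All thresholds and limits here are illustrative defaults."""
    if cpu_utilization > scale_out and n_servers < max_servers:
        return n_servers + 1          # scale out: add one server
    if cpu_utilization < scale_in and n_servers > min_servers:
        return n_servers - 1          # scale in: remove one server
    return n_servers                  # within the band: no change
```

    In practice the cloud's telemetry service feeds such a rule, and hysteresis between the two thresholds prevents the pool from oscillating.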

  14. Integrated multi-sensor fusion for mapping and localization in outdoor environments for mobile robots

    Science.gov (United States)

    Emter, Thomas; Petereit, Janko

    2014-05-01

    An integrated multi-sensor fusion framework for localization and mapping for autonomous navigation in unstructured outdoor environments based on extended Kalman filters (EKF) is presented. The sensors for localization include an inertial measurement unit, a GPS, a fiber optic gyroscope, and wheel odometry. Additionally a 3D LIDAR is used for simultaneous localization and mapping (SLAM). A 3D map is built while concurrently a localization in a so far established 2D map is estimated with the current scan of the LIDAR. Despite the longer run-time of the SLAM algorithm compared to the EKF update, a high update rate is still guaranteed by carefully joining and synchronizing two parallel localization estimators.
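
    The fusion logic of an EKF can be shown at its smallest scale: a one-dimensional Kalman filter that predicts with an odometry increment and corrects with an absolute measurement such as a GPS fix. This toy scalar version (all names and noise values are illustrative, not the framework's actual state model) captures the predict/update cycle run per sensor:

```python
def kf_predict(x, p, u, q):
    """Prediction: apply a control/odometry increment u to the state
    estimate x; q is the process noise added to the variance p."""
    return x + u, p + q

def kf_update(x, p, z, r):
    """Correction: fuse the prediction (x, p) with a measurement z of
    variance r via the Kalman gain."""
    k = p / (p + r)                    # gain: how much to trust z over x
    return x + k * (z - x), (1.0 - k) * p
```

    One cycle might predict with a wheel-odometry increment and then correct with a GPS position; the full framework does the same with vector states and Jacobians.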

  15. Investigation of local environment around rare earths (La and Eu) by fluorescence line narrowing during borosilicate glass alteration

    Energy Technology Data Exchange (ETDEWEB)

    Molières, Estelle [CEA – DEN-DTCD-LCV-SECM Laboratoire d' études du Comportement à Long Terme, 30207 Bagnols-sur-Cèze (France); Panczer, Gérard; Guyot, Yannick [Institut Lumière Matière, UMR5306 Université Lyon 1-CNRS, Université de Lyon, 69622 Villeurbanne cedex (France); Jollivet, Patrick [CEA – DEN-DTCD-LCV-SECM Laboratoire d' études du Comportement à Long Terme, 30207 Bagnols-sur-Cèze (France); Majérus, Odile; Aschehoug, Patrick; Barboux, Philippe [Laboratoire de Chimie de la Matière Condensée de Paris, UMR-CNRS 7574, École Nationale Supérieure de Chimie de Paris (ENSCP Chimie-ParisTech), 11 rue Pierre et Marie Curie, 75231 Paris (France); Gin, Stéphane [CEA – DEN-DTCD-LCV-SECM Laboratoire d' études du Comportement à Long Terme, 30207 Bagnols-sur-Cèze (France); Angeli, Frédéric, E-mail: frederic.angeli@cea.fr [CEA – DEN-DTCD-LCV-SECM Laboratoire d' études du Comportement à Long Terme, 30207 Bagnols-sur-Cèze (France)

    2014-01-15

    The local environment of europium in soda-lime borosilicate glasses with a range of La{sub 2}O{sub 3} content was probed by continuous luminescence and Fluorescence Line Narrowing (FLN) to investigate the local environment of rare earth elements in pristine and leached glass. After aqueous leaching at 90 °C at pH 7 and 9.5, rare earths were fully retained and homogeneously distributed in the amorphous alteration layer (commonly called gel). Two separate silicate environments were observed in pristine and leached glasses regardless of the lanthanum content and the leaching conditions. A borate environment surrounding europium was not observed in pristine and leached glasses. During glass alteration, OH groups were located around the europium environment, which became more organized (higher symmetry) in the first coordination shell. -- Highlights: • No borate environment surrounding europium was detected in pristine borosilicate glasses. • Up to 12 mol% of REE2O3 in glass, local environment of europium does not significantly change. • Europium environment becomes more ordered and symmetric in gels than in pristine glasses. • Two distinct silicate sites were observed, as well in pristine glass as in gels (leached glasses). • In altered glasses, OH groups were located around europium.

  16. Investigation of local environment around rare earths (La and Eu) by fluorescence line narrowing during borosilicate glass alteration

    International Nuclear Information System (INIS)

    Molières, Estelle; Panczer, Gérard; Guyot, Yannick; Jollivet, Patrick; Majérus, Odile; Aschehoug, Patrick; Barboux, Philippe; Gin, Stéphane; Angeli, Frédéric

    2014-01-01

    The local environment of europium in soda-lime borosilicate glasses with a range of La2O3 content was probed by continuous luminescence and Fluorescence Line Narrowing (FLN) to investigate the local environment of rare earth elements in pristine and leached glass. After aqueous leaching at 90 °C at pH 7 and 9.5, rare earths were fully retained and homogeneously distributed in the amorphous alteration layer (commonly called gel). Two separate silicate environments were observed in pristine and leached glasses regardless of the lanthanum content and the leaching conditions. A borate environment surrounding europium was not observed in pristine and leached glasses. During glass alteration, OH groups were located around the europium environment, which became more organized (higher symmetry) in the first coordination shell. -- Highlights: • No borate environment surrounding europium was detected in pristine borosilicate glasses. • Up to 12 mol% of REE2O3 in glass, local environment of europium does not significantly change. • Europium environment becomes more ordered and symmetric in gels than in pristine glasses. • Two distinct silicate sites were observed, as well in pristine glass as in gels (leached glasses). • In altered glasses, OH groups were located around europium.

  17. Automated linear regression tools improve RSSI WSN localization in multipath indoor environment

    Directory of Open Access Journals (Sweden)

    Laermans Eric

    2011-01-01

    Full Text Available Received signal strength indication (RSSI)-based localization is emerging in wireless sensor networks (WSNs). Localization algorithms need to include the physical and hardware limitations of RSSI measurements in order to give more accurate results in dynamic real-life indoor environments. In this study, we use the Interdisciplinary Institute for Broadband Technology real-life test bed and present an automated method to optimize and calibrate the experimental data before offering them to a positioning engine. In a preprocessing localization step, we introduce a new method to provide bounds for the range, thereby further improving the accuracy of our simple and fast 2D localization algorithm based on corrected distance circles. A maximum likelihood algorithm with a mean square error cost function has a higher position error median than our algorithm. Our experiments further show that the complete proposed algorithm eliminates outliers and avoids any manual calibration procedure.
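
    RSSI ranging typically rests on the log-distance path-loss model, RSSI(d) = A - 10·n·log10(d), whose two parameters are exactly what such automated linear-regression calibration estimates. A minimal sketch of that fit and its inversion to a range estimate (synthetic model with no shadowing term; all names are illustrative):

```python
import math

def fit_path_loss(distances, rssi):
    """Least-squares fit of RSSI = A - 10*n*log10(d); returns (A, n).
    Ordinary linear regression on x = log10(d)."""
    xs = [math.log10(d) for d in distances]
    m = len(xs)
    mx = sum(xs) / m
    my = sum(rssi) / m
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, rssi))
    slope = sxy / sxx
    a = my - slope * mx          # intercept: RSSI at 1 m
    return a, -slope / 10.0      # path-loss exponent n

def rssi_to_distance(rssi_val, a, n_exp):
    """Invert the fitted model to estimate range from a new RSSI sample."""
    return 10 ** ((a - rssi_val) / (10 * n_exp))
```

    The fitted ranges would then feed a positioning engine such as the corrected-distance-circles algorithm the abstract describes.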

  18. Using the CAVE virtual-reality environment as an aid to 3-D electromagnetic field computation

    International Nuclear Information System (INIS)

    Turner, L.R.; Levine, D.; Huang, M.; Papka, M.

    1995-01-01

    One of the major problems in three-dimensional (3-D) field computation is visualizing the resulting 3-D field distributions. A virtual-reality environment, such as the CAVE (CAVE Automatic Virtual Environment), is helping to overcome this problem, thus making the results of computation more usable for designers and users of magnets and other electromagnetic devices. As a demonstration of the capabilities of the CAVE, the elliptical multipole wiggler (EMW), an insertion device being designed for the Advanced Photon Source (APS) now being commissioned at Argonne National Laboratory (ANL), was made visible, along with its fields and beam orbits. Other uses of the CAVE in preprocessing and postprocessing computation for electromagnetic applications are also discussed.

  19. Characteristics of Israeli School Teachers in Computer-based Learning Environments

    Directory of Open Access Journals (Sweden)

    Noga Magen-Nagar

    2013-01-01

    Full Text Available The purpose of this research is to investigate whether there are differences in the level of computer literacy, the amount of implementation of ICT in teaching and learning-assessment processes and the attitudes of teachers from computerized schools in comparison to teachers in non-computerized schools. In addition, the research investigates the characteristics of Israeli school teachers in a 21st century computer-based learning environment. A quantitative research methodology was used. The research sample included 811 elementary school teachers from the Jewish sector of whom 402 teachers were from the computerized school sample and 409 were teachers from the non-computerized school sample. The research findings show that teachers from the computerized school sample are more familiar with ICT, tend to use ICT more and have a more positive attitude towards ICT than teachers in the non-computerized school sample. The main conclusion which can be drawn from this research is that positive attitudes of teachers towards ICT are not sufficient for the integration of technology to occur. Future emphasis on new teaching skills of collective Technological Pedagogical Content Knowledge is necessary to promote the implementation of optimal pedagogy in innovative environments.

  20. Using a Cloud-Based Computing Environment to Support Teacher Training on Common Core Implementation

    Science.gov (United States)

    Robertson, Cory

    2013-01-01

    A cloud-based computing environment, Google Apps for Education (GAFE), has provided the Anaheim City School District (ACSD) a comprehensive and collaborative avenue for creating, sharing, and editing documents, calendars, and social networking communities. With this environment, teachers and district staff at ACSD are able to utilize the deep…

  1. Quantum computation via local control theory: Direct sum vs. direct product Hilbert spaces

    International Nuclear Information System (INIS)

    Sklarz, Shlomo E.; Tannor, David J.

    2006-01-01

    The central objective in any quantum computation is the creation of a desired unitary transformation; the mapping that this unitary transformation produces between the input and output states is identified with the computation. In [S.E. Sklarz, D.J. Tannor, arXiv:quant-ph/0404081 (submitted to PRA) (2004)] it was shown that local control theory can be used to calculate fields that will produce such a desired unitary transformation. In contrast with previous strategies for quantum computing based on optimal control theory, the local control scheme maintains the system within the computational subspace at intermediate times, thereby avoiding unwanted decay processes. In [S.E. Sklarz et al.], the structure of the Hilbert space had a direct sum structure with respect to the computational register and the mediating states. In this paper, we extend the formalism to the important case of a direct product Hilbert space. The final equations for the control algorithm for the two cases are remarkably similar in structure, despite the fact that the derivations are completely different and that in one case the dynamics is in a Hilbert space and in the other case the dynamics is in a Liouville space. As shown in [S.E. Sklarz et al.], the direct sum implementation leads to a computational mechanism based on virtual transitions, and can be viewed as an extension of the principles of Stimulated Raman Adiabatic Passage from state manipulation to evolution operator manipulation. The direct product implementation developed here leads to the intriguing concept of virtual entanglement - computation that exploits second-order transitions that pass through entangled states but that leaves the subsystems nearly separable at all intermediate times. Finally, we speculate on a connection between the algorithm developed here and the concept of decoherence free subspaces

  2. An environment for parallel structuring of Fortran programs

    International Nuclear Information System (INIS)

    Sridharan, K.; McShea, M.; Denton, C.; Eventoff, B.; Browne, J.C.; Newton, P.; Ellis, M.; Grossbard, D.; Wise, T.; Clemmer, D.

    1990-01-01

    The paper describes and illustrates an environment for interactive support of the detection and implementation of macro-level parallelism in Fortran programs. The approach couples algorithms for dependence analysis with both innovative techniques for complexity management and capabilities for the measurement and analysis of the parallel computation structures generated through use of the environment. The resulting environment is complementary to the more common approach of seeking local parallelism by loop unrolling, either by an automatic compiler or manually. (orig.)

  3. History Matching in Parallel Computational Environments

    Energy Technology Data Exchange (ETDEWEB)

    Steven Bryant; Sanjay Srinivasan; Alvaro Barrera; Sharad Yadav

    2005-10-01

    A novel methodology for delineating multiple reservoir domains for the purpose of history matching in a distributed computing environment has been proposed. A fully probabilistic approach to perturb permeability within the delineated zones is implemented. The combination of robust schemes for identifying reservoir zones and distributed computing significantly increase the accuracy and efficiency of the probabilistic approach. The information pertaining to the permeability variations in the reservoir that is contained in dynamic data is calibrated in terms of a deformation parameter rD. This information is merged with the prior geologic information in order to generate permeability models consistent with the observed dynamic data as well as the prior geology. The relationship between dynamic response data and reservoir attributes may vary in different regions of the reservoir due to spatial variations in reservoir attributes, well configuration, flow constrains etc. The probabilistic approach then has to account for multiple r{sub D} values in different regions of the reservoir. In order to delineate reservoir domains that can be characterized with different rD parameters, principal component analysis (PCA) of the Hessian matrix has been done. The Hessian matrix summarizes the sensitivity of the objective function at a given step of the history matching to model parameters. It also measures the interaction of the parameters in affecting the objective function. The basic premise of PC analysis is to isolate the most sensitive and least correlated regions. The eigenvectors obtained during the PCA are suitably scaled and appropriate grid block volume cut-offs are defined such that the resultant domains are neither too large (which increases interactions between domains) nor too small (implying ineffective history matching). 
The delineation of domains requires calculation of the Hessian, which can be computationally costly and also restricts the current approach to
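
    The principal-component step described above can be illustrated with the simplest possible machinery: power iteration for the dominant eigenpair of a symmetric matrix, standing in for the (scaled) Hessian whose eigenvectors delineate the reservoir zones. This is a generic numerical sketch, not the authors' implementation:

```python
def principal_eigvec(h, iters=200):
    """Power iteration: dominant eigenvalue and eigenvector of a symmetric
    matrix h (here a stand-in for the sensitivity Hessian)."""
    n = len(h)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(h[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]                 # renormalize each step
    # Rayleigh quotient gives the corresponding eigenvalue
    lam = sum(v[i] * sum(h[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam, v
```

    The leading eigenvectors, suitably scaled and thresholded as the abstract describes, identify the most sensitive and least correlated regions.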

  4. Research on elastic resource management for multi-queue under cloud computing environment

    Science.gov (United States)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is increasingly widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments. However, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system for a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the HTCondor job queues, using dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. Practical runs show that virtual computing resources dynamically expand or shrink as computing requirements change. Additionally, the CPU utilization ratio of computing resources was significantly increased compared with traditional resource management. The system also performs well when there are multiple Condor schedulers and multiple job queues.
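
    The dual-threshold policy can be sketched as a per-queue decision function: expand when idle jobs exceed an upper threshold (up to the queue's quota), shrink when idle virtual machines exceed a lower threshold. The thresholds and the sizing rule below are illustrative guesses, not IHEPCloud's actual parameters:

```python
def adjust_pool(idle_jobs, idle_vms, pool_size, quota,
                expand_thr=10, shrink_thr=5):
    """One evaluation of a dual-threshold elastic pool for a job queue.
    idle_jobs: queued jobs waiting for a slot; idle_vms: VMs with no job.
    Thresholds and the quota cap are illustrative."""
    if idle_jobs > expand_thr and pool_size < quota:
        # grow toward demand, never past the experiment's quota
        return min(quota, pool_size + (idle_jobs - expand_thr))
    if idle_vms > shrink_thr:
        # release surplus VMs back to the cloud
        return max(0, pool_size - (idle_vms - shrink_thr))
    return pool_size
```

    Running such a rule per HTCondor queue, with quotas arbitrating between experiments, yields the elastic behaviour the abstract reports.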

  5. Community-driven computational biology with Debian Linux.

    Science.gov (United States)

    Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles

    2010-12-21

    The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.

  6. G‐LoSA: An efficient computational tool for local structure‐centric biological studies and drug design

    Science.gov (United States)

    2016-01-01

    Abstract Molecular recognition by proteins mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and drug design. We present a novel local structure alignment tool, G‐LoSA. G‐LoSA aligns protein local structures in a sequence-order-independent way and provides a GA-score, a chemical-feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G‐LoSA on local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G‐LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G‐LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G‐LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and for facilitating drug discovery research and development. G‐LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. PMID:26813336

  7. Parallelized Local Volatility Estimation Using GP-GPU Hardware Acceleration

    KAUST Repository

    Douglas, Craig C.

    2010-01-01

    We introduce an inverse problem for the local volatility model in option pricing. We solve the problem using the Levenberg-Marquardt algorithm and use the notion of the Fréchet derivative when calculating the Jacobian matrix. We analyze the existence of the Fréchet derivative and its numerical computation. To reduce the computational time of the inverse problem, a GP-GPU environment is considered for parallel computation. Numerical results confirm the validity and efficiency of the proposed method. ©2010 IEEE.
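
As a rough illustration of the optimization machinery mentioned above, here is a minimal serial, scalar Levenberg-Marquardt loop on a toy least-squares fit. The paper's Fréchet-derivative Jacobian and GP-GPU parallelization are not reproduced; all names and the toy problem are hypothetical:

```python
def levenberg_marquardt(residual, jacobian, s0, lam=1e-3, iters=50):
    """Minimal scalar Levenberg-Marquardt: damped Gauss-Newton steps,
    with the damping factor adapted after each trial step."""
    s = s0
    for _ in range(iters):
        r = residual(s)
        J = jacobian(s)
        jtj = sum(j * j for j in J)
        jtr = sum(j * ri for j, ri in zip(J, r))
        step = -jtr / (jtj + lam)
        if sum(ri * ri for ri in residual(s + step)) < sum(ri * ri for ri in r):
            s += step
            lam *= 0.5   # good step: trust the Gauss-Newton direction more
        else:
            lam *= 2.0   # bad step: fall back toward gradient descent
    return s

# Toy calibration: recover the slope of y = 3x.
xs = [1.0, 2.0, 3.0]
ys = [3.0, 6.0, 9.0]
residual = lambda s: [s * x - y for x, y in zip(xs, ys)]
jacobian = lambda s: xs  # d(residual_i)/ds = x_i
print(round(levenberg_marquardt(residual, jacobian, s0=0.0), 6))  # 3.0
```

In the local volatility setting the single parameter would be replaced by a discretized volatility surface, and each residual evaluation would itself be an expensive forward pricing computation, which is what motivates the GPU parallelization.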

  8. GLOBAL-LOCAL ENVIRONMENT CERTIFICATION AT FIVE STAR HOTELS IN TOURISM AREA OF NUSA DUA, BALI

    Directory of Open Access Journals (Sweden)

    Ni Gst Nym Suci Murni

    2014-06-01

    Full Text Available The research aims to examine the various forms of environment certification, the ideology behind the practice of green tourism (a global award) and Tri Hita Karana (a local award), and the implications of environment practice at five star hotels in the Nusa Dua tourism area. The data of the research were assessed using postmodern critical theory (the theory of discourse regarding power/knowledge), hegemony theory, practice theory, and the theory of deep/shallow ecology. The method used in this cultural study is qualitative, where the data were obtained through direct observation, in-depth interviews, and related documentation. The sample comprised 6 five star hotels which practise the green award, of 14 established five star hotels (some hotels are not in full operation). The results showed that (1) there are some variations of environment practice in five star hotels, (2) the ideology working behind these practices can be seen in the global ideology, in the form of sustainable development deriving green tourism, and the local ideology, in the form of Tri Hita Karana (THK) used in the THK award, and (3) there are implications of global-local environment practice in the tourism area and its surroundings.

  9. Localized Corrosion Behavior of Type 304SS with a Silica Layer Under Atmospheric Corrosion Environments

    International Nuclear Information System (INIS)

    E. Tada; G.S. Frankel

    2006-01-01

    The U.S. Department of Energy (DOE) has proposed a potential repository for spent nuclear fuel and high-level radioactive waste at the Yucca Mountain site in Nevada [1]. The temperature could be high on the waste packages, and it is possible that dripping water or humidity could interact with rock dust particulate to form a thin electrolyte layer with concentrated ionic species. Under these conditions, highly corrosion-resistant alloys (CRAs) used as packages to dispose of the nuclear waste could suffer localized corrosion. Therefore, to better understand the long-term corrosion performance of CRAs in the repository, it is important to investigate localized corrosion under a simulated repository environment. We measured the open circuit potential (OCP) and galvanic current (i_g) of silica-coated Type 304SS during drying of salt solutions under controlled relative humidity (RH) environments to clarify the effect of a silica layer, as a dust layer simulant, on localized corrosion under atmospheric environments. Type 304SS was used as a relatively susceptible model CRA instead of the much more corrosion-resistant alloys, such as Alloy 22, that are being considered as waste package materials.

  10. Preoperative localization of endocrine pancreatic tumours by intra-arterial dynamic computed tomography

    International Nuclear Information System (INIS)

    Ahlstroem, H.; Magnusson, A.; Grama, D.; Eriksson, B.; Oeberg, K.; Loerelius, L.E.; Akademiska Sjukhuset, Uppsala; Akademiska Sjukhuset, Uppsala

    1990-01-01

    Eleven patients with biochemically confirmed endocrine pancreatic tumours were examined with intra-arterial (i.a.) dynamic computed tomography (CT) and angiography preoperatively. Seven of the patients suffered from the multiple endocrine neoplasia type 1 (MEN-1) syndrome. All patients were operated upon and surgical palpation and ultrasound were the peroperative localization methods. Of the 33 tumours which were found at histopathologic analysis of the resected specimens in the 11 patients, 7 tumours in 7 patients were correctly localized by both i.a. dynamic CT and angiography. Six patients with MEN-1 syndrome had multiple tumours and this group of patients together had 28 tumours, of which 5 (18%) were localized preoperatively by both CT and angiography. I.a. dynamic CT, with the technique used by us, does not seem to improve the localization of endocrine pancreatic tumours, especially in the rare group of MEN-1 patients, as compared with angiography. (orig.)

  11. A large-scale RF-based Indoor Localization System Using Low-complexity Gaussian filter and improved Bayesian inference

    Directory of Open Access Journals (Sweden)

    L. Xiao

    2013-04-01

    Full Text Available The growing convergence of mobile computing devices and smart sensors boosts the development of ubiquitous computing and smart spaces, where localization is an essential part of realizing this vision. General localization methods based on GPS and cellular techniques are not suitable for tracking numerous small-size, power-limited objects indoors. In this paper, we propose and demonstrate a new localization method: an easy-setup and cost-effective indoor localization system based on off-the-shelf active RFID technology. Our system is not only compatible with future smart spaces and ubiquitous computing systems, but also suitable for large-scale indoor localization. The use of a low-complexity Gaussian Filter (GF), a Wheel Graph Model (WGM) and a Probabilistic Localization Algorithm (PLA) makes the proposed algorithm robust to uncertainty, self-adaptive to varying indoor environments, and suitable for large-scale indoor positioning. Using MATLAB simulation, we study the system performance, especially its dependence on a number of system and environment parameters, and their statistical properties. The simulation results show that our proposed system is an accurate and cost-effective candidate for indoor localization.
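
A low-complexity Gaussian filter of the kind named above can be sketched as rejecting RSSI outliers that fall outside k standard deviations of the sample mean before averaging; the cutoff and sample values are illustrative assumptions:

```python
def gaussian_filter(rssi_samples, k=1.0):
    """Keep only samples within k standard deviations of the mean,
    then return the mean of the survivors as the smoothed RSSI."""
    n = len(rssi_samples)
    mean = sum(rssi_samples) / n
    var = sum((x - mean) ** 2 for x in rssi_samples) / n
    std = var ** 0.5
    # Fall back to the raw samples if the cut would reject everything.
    kept = [x for x in rssi_samples if abs(x - mean) <= k * std] or rssi_samples
    return sum(kept) / len(kept)

# An outlier of -90 dBm is rejected before averaging.
print(gaussian_filter([-60, -61, -59, -62, -90]))  # -60.5
```

Smoothed values like this would then feed the probabilistic localization stage rather than being used directly.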

  12. Local environment but not genetic differentiation influences biparental care in ten plover populations.

    Directory of Open Access Journals (Sweden)

    Orsolya Vincze

    Full Text Available Social behaviours are highly variable between species, populations and individuals. However, it is contentious whether behavioural variations are primarily moulded by the environment, caused by genetic differences, or a combination of both. Here we establish that biparental care, a complex social behaviour that involves rearing of young by both parents, differs between closely related populations, and then test two potential sources of variation in parental behaviour between populations: ambient environment and genetic differentiation. We use 2904 hours of behavioural data from 10 geographically distinct Kentish plover (Charadrius alexandrinus) and snowy plover (C. nivosus) populations in America, Europe, the Middle East and North Africa to test these two sources of behavioural variation. We show that local ambient temperature has a significant influence on parental care: with extreme heat (above 40 °C), total incubation (i.e. the % of time the male or female incubated the nest) increased, and female share (the % of incubation performed by the female) decreased. By contrast, neither genetic differences between populations nor geographic distances predicted total incubation or the female's share of incubation. These results suggest that the local environment has a stronger influence on a social behaviour than genetic differentiation, at least between populations of closely related species.

  13. A mixed-methods exploration of an environment for learning computer programming

    Directory of Open Access Journals (Sweden)

    Richard Mather

    2015-08-01

    Full Text Available A mixed-methods approach is evaluated for exploring collaborative behaviour, acceptance and progress surrounding an interactive technology for learning computer programming. A review of literature reveals a compelling case for using mixed-methods approaches when evaluating technology-enhanced-learning environments. Here, ethnographic approaches used for the requirements engineering of computing systems are combined with questionnaire-based feedback and skill tests. These are applied to the ‘Ceebot’ animated 3D learning environment. Video analysis with workplace observation allowed detailed inspection of problem solving and tacit behaviours. Questionnaires and knowledge tests provided broad sample coverage with insights into subject understanding and overall response to the learning environment. Although relatively low scores in programming tests seemingly contradicted the perception that Ceebot had enhanced understanding of programming, this perception was nevertheless found to be correlated with greater test performance. Video analysis corroborated findings that the learning environment and Ceebot animations were engaging and encouraged constructive collaborative behaviours. Ethnographic observations clearly captured Ceebot's value in providing visual cues for problem-solving discussions and for progress through sharing discoveries. Notably, performance in tests was most highly correlated with greater programming practice (p ≤ 0.01). It was apparent that although students had appropriated technology for collaborative working and benefitted from visual and tacit cues provided by Ceebot, they had not necessarily deeply learned the lessons intended. The key value of the ‘mixed-methods’ approach was that ethnographic observations captured the authenticity of learning behaviours, and thereby strengthened confidence in the interpretation of questionnaire and test findings.

  14. The Virtual Cell: a software environment for computational cell biology.

    Science.gov (United States)

    Loew, L M; Schaff, J C

    2001-10-01

    The newly emerging field of computational cell biology requires software tools that address the needs of a broad community of scientists. Cell biological processes are controlled by an interacting set of biochemical and electrophysiological events that are distributed within complex cellular structures. Computational modeling is familiar to researchers in fields such as molecular structure, neurobiology and metabolic pathway engineering, and is rapidly emerging in the area of gene expression. Although some of these established modeling approaches can be adapted to address problems of interest to cell biologists, relatively few software development efforts have been directed at the field as a whole. The Virtual Cell is a computational environment designed for cell biologists as well as for mathematical biologists and bioengineers. It serves to aid the construction of cell biological models and the generation of simulations from them. The system enables the formulation of both compartmental and spatial models, the latter with either idealized or experimentally derived geometries of one, two or three dimensions.

  15. Proposed Network Intrusion Detection System Based on Fuzzy c Mean Algorithm in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Shawq Malik Mehibs

    2017-12-01

    Full Text Available Nowadays cloud computing has become an integral part of the IT industry. Cloud computing provides a working environment that allows users to share data and resources over the internet. Because cloud computing is a virtual grouping of resources offered over the internet, it raises various matters related to security and privacy. It is therefore very important to create intrusion detection that can detect outsider and insider intruders of cloud computing with a high detection rate and a low false positive alarm rate in the cloud environment. This work proposes a network intrusion detection module using the fuzzy c-means algorithm. The KDD99 dataset was used for the experiments. The proposed system is characterized by a high detection rate with a low false positive alarm rate.
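
The fuzzy c-means algorithm at the core of the proposed module alternates membership updates and centre updates. A minimal 1-D sketch of standard textbook FCM (not the paper's full intrusion-detection pipeline; data, initialisation, and fuzzifier are illustrative assumptions):

```python
def fuzzy_c_means(data, c=2, m=2.0, iters=100):
    """Minimal 1-D fuzzy c-means: alternate membership and centre updates."""
    # Evenly spread initial centres across the data range.
    lo, hi = min(data), max(data)
    centres = [lo + i * (hi - lo) / (c - 1) for i in range(c)]
    for _ in range(iters):
        # Membership u[k][i] of point k in cluster i, with fuzzifier m.
        u = []
        for x in data:
            row = []
            for ci in centres:
                di = abs(x - ci) or 1e-12  # guard against zero distance
                row.append(1.0 / sum((di / (abs(x - cj) or 1e-12)) ** (2.0 / (m - 1.0))
                                     for cj in centres))
            u.append(row)
        # Centres become membership-weighted means of the data.
        centres = [sum(u[k][i] ** m * data[k] for k in range(len(data))) /
                   sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(c)]
    return sorted(centres)

# Two well-separated 1-D groups: the centres settle near 1.0 and 10.0.
print(fuzzy_c_means([0.9, 1.0, 1.1, 9.9, 10.0, 10.1]))
```

In an IDS setting each data point would be a multi-dimensional feature vector from network traffic (e.g. KDD99 records), and cluster membership would be used to separate normal from anomalous connections.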

  16. Touch in Computer-Mediated Environments: An Analysis of Online Shoppers' Touch-Interface User Experiences

    Science.gov (United States)

    Chung, Sorim

    2016-01-01

    Over the past few years, one of the most fundamental changes in current computer-mediated environments has been input devices, moving from mouse devices to touch interfaces. However, most studies of online retailing have not considered device environments as retail cues that could influence users' shopping behavior. In this research, I examine the…

  17. Reinforcement learning in computer vision

    Science.gov (United States)

    Bernstein, A. V.; Burnaev, E. V.

    2018-04-01

    Nowadays, machine learning has become one of the basic technologies used in solving various computer vision tasks such as feature detection, image segmentation, object recognition and tracking. In many applications, complex systems such as robots are equipped with visual sensors from which they learn the state of the surrounding environment by solving the corresponding computer vision tasks. Solutions of these tasks are used for making decisions about possible future actions. It is not surprising that when solving computer vision tasks we should take into account special aspects of their subsequent application in model-based predictive control. Reinforcement learning is a modern machine learning technology in which learning is carried out through interaction with the environment. In recent years, reinforcement learning has been used both for solving applied tasks such as processing and analysis of visual information, and for solving specific computer vision problems such as filtering, extracting image features, localizing objects in scenes, and many others. The paper briefly describes the reinforcement learning technology and its use for solving computer vision problems.

  18. Computation of quantum electron transport with local current conservation using quantum trajectories

    International Nuclear Information System (INIS)

    Alarcón, A; Oriols, X

    2009-01-01

    A recent proposal for modeling time-dependent quantum electron transport with Coulomb and exchange correlations using quantum (Bohm) trajectories (Oriols 2007 Phys. Rev. Lett. 98 066803) is extended towards the computation of the total (particle plus displacement) current in mesoscopic devices. In particular, two different methods for the practical computation of the total current are compared. The first method computes the particle and the displacement currents from the rate of Bohm particles crossing a particular surface and the time-dependent variations of the electric field there. The second method uses the Ramo–Shockley theorem to compute the total current on that surface from the knowledge of the Bohm particle dynamics in a 3D volume and the time-dependent variations of the electric field on the boundaries of that volume. From a computational point of view, it is shown that both methods achieve local current conservation, but the second is preferred because it is free from 'spurious' peaks. A numerical example, a Bohm trajectory crossing a double-barrier tunneling structure, is presented, supporting the conclusions.

  19. Robust localisation of automated guided vehicles for computer-integrated manufacturing environments

    Directory of Open Access Journals (Sweden)

    Dixon, R. C.

    2013-05-01

    Full Text Available As industry moves toward an era of complete automation and mass customisation, automated guided vehicles (AGVs are used as material handling systems. However, the current techniques that provide navigation, control, and manoeuvrability of automated guided vehicles threaten to create bottlenecks and inefficiencies in manufacturing environments that strive towards the optimisation of part production. This paper proposes a decentralised localisation technique for an automated guided vehicle without any non-holonomic constraints. Incorporation of these vehicles into the material handling system of a computer-integrated manufacturing environment would increase the characteristics of robustness, efficiency, flexibility, and advanced manoeuvrability.

  20. Touch in Computer-Mediated Environments: An Analysis of Online Shoppers’ Touch-Interface User Experiences

    OpenAIRE

    Chung, Sorim

    2016-01-01

    Over the past few years, one of the most fundamental changes in current computer-mediated environments has been input devices, moving from mouse devices to touch interfaces. However, most studies of online retailing have not considered device environments as retail cues that could influence users’ shopping behavior. In this research, I examine the underlying mechanisms between input device environments and shoppers’ decision-making processes. In particular, I investigate the impact of input d...

  1. KeyWare: an open wireless distributed computing environment

    Science.gov (United States)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

    Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist for LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple client, multiple server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to the supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  2. Computer Applications and Virtual Environments (CAVE)

    Science.gov (United States)

    1993-01-01

    Virtual Reality (VR) can provide cost-effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama began to utilize VR for design analysis of the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMDs) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine the functionality of space and maintenance requirements for the virtual X-34. The primary function of the virtual X-34 mockup was to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.

  3. Distributed, signal strength-based indoor localization algorithm for use in healthcare environments.

    Science.gov (United States)

    Wyffels, Jeroen; De Brabanter, Jos; Crombez, Pieter; Verhoeve, Piet; Nauwelaers, Bart; De Strycker, Lieven

    2014-11-01

    In current healthcare environments, a trend toward mobile and personalized interactions between people and nurse call systems is strongly noticeable. Therefore, it should be possible to locate patients at all times and in all places throughout the care facility. This paper aims at describing a method by which a mobile node can locate itself indoors, based on signal strength measurements and a minimal number of yes/no decisions. The algorithm has been developed specifically for use in a healthcare environment. With extensive testing and statistical support, we prove that our algorithm can be used in a healthcare setting with an envisioned level of localization accuracy up to room level (or region level in a corridor), while avoiding heavy investments, since the hardware of an existing nurse call network can be reused. The approach opted for leads to very high scalability, since thousands of mobile nodes can locate themselves. Network timing issues and localization update delays are avoided, which ensures that a patient can receive the needed care in a time- and resource-efficient way.
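
The room-level, yes/no-decision flavour of such an algorithm can be illustrated loosely (the paper's actual decision procedure is more elaborate) by a tournament of pairwise signal-strength comparisons against the fixed nurse-call nodes; the room names and RSS values below are assumptions:

```python
def locate_room(rss_by_room):
    """Room-level estimate: keep the room whose fixed node is heard
    strongest. A tournament over n candidate rooms needs only n - 1
    binary (yes/no) comparisons."""
    best_room, best_rss = None, float("-inf")
    for room, rss in rss_by_room.items():
        if rss > best_rss:          # one yes/no decision per candidate room
            best_room, best_rss = room, rss
    return best_room

# Received signal strengths (dBm) from fixed nodes in three locations.
readings = {"room 101": -72, "room 102": -58, "corridor B": -80}
print(locate_room(readings))  # room 102
```

Because each mobile node decides locally from what it overhears, the scheme scales to thousands of nodes without central coordination, which matches the scalability claim in the abstract.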

  4. Wireless local area network in a prehospital environment

    Directory of Open Access Journals (Sweden)

    Grimes Gary J

    2004-08-01

    Full Text Available Abstract Background Wireless local area networks (WLANs) are considered the next generation of clinical data networks. They open the possibility of capturing clinical data in a prehospital setting (e.g., a patient's home) using various devices, such as personal digital assistants, laptops, digital electrocardiogram (EKG) machines, and even cellular phones, and transmitting the captured data to a physician or hospital. The transmission rate is crucial to the applicability of the technology in the prehospital setting. Methods We created two separate WLANs to simulate a virtual local area network environment such as in a patient's home or an emergency room (ER). The effects of different methods of data transmission, number of clients, and roaming among different access points on the file transfer rate were determined. Results The present results suggest that it is feasible to transfer small files such as patient demographics and EKG data from the patient's home to the ER at a reasonable speed. Encryption, user control, and access control were implemented and the results discussed. Conclusions Implementing a WLAN with a centrally managed and multiple-layer-controlled access control server is the key to ensuring its security and accessibility. Future studies should focus on product capacity, speed, compatibility, interoperability, and security management.

  5. Acoustic radiosity for computation of sound fields in diffuse environments

    Science.gov (United States)

    Muehleisen, Ralph T.; Beamer, C. Walter

    2002-05-01

    The use of image and ray tracing methods (and variations thereof) for the computation of sound fields in rooms is relatively well developed. In their regime of validity, both methods work well for prediction in rooms with small amounts of diffraction and mostly specular reflection at the walls. While extensions to these methods to include diffuse reflections and diffraction have been made, they are limited at best. In the fields of illumination and computer graphics, the ray tracing and image methods are joined by another method called luminous radiative transfer, or radiosity. In radiosity, an energy balance between surfaces is computed assuming diffuse reflection at the reflective surfaces. Because the interaction between surfaces is constant, much of the computation required for sound field prediction with multiple or moving source and receiver positions can be reduced. In acoustics the radiosity method has had little attention because of the problems of diffraction and specular reflection. The utility of radiosity in acoustics and an approach to a useful development of the method for acoustics will be presented. The method looks especially useful for sound level prediction in industrial and office environments. [Work supported by NSF.]
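
The radiosity energy balance described above is the linear system B = E + diag(rho) F B, where E is emitted energy, rho the diffuse reflectance, and F the form factors between surfaces. A minimal fixed-point solver, with a two-surface toy scene as an assumed example:

```python
def radiosity(E, rho, F, iters=200):
    """Solve B = E + diag(rho) @ F @ B by fixed-point (Jacobi) iteration.
    E: emitted energy per surface, rho: diffuse reflectance, F: form factors."""
    n = len(E)
    B = E[:]
    for _ in range(iters):
        B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B

# Two facing surfaces that fully see each other (F = [[0,1],[1,0]]):
# surface 0 emits 1 unit of energy, both reflect half of what they receive.
E = [1.0, 0.0]
rho = [0.5, 0.5]
F = [[0.0, 1.0], [1.0, 0.0]]
print([round(b, 4) for b in radiosity(E, rho, F)])  # [1.3333, 0.6667]
```

Because F and rho depend only on the geometry and materials, the (implicit) system matrix can be reused across source and receiver positions, which is the computational saving the abstract points to.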

  6. Human response to individually controlled micro environment generated with localized chilled beam

    DEFF Research Database (Denmark)

    Uth, Simon C.; Nygaard, Linette; Bolashikov, Zhecho Dimitrov

    2014-01-01

    Indoor environment in a single-office room created by a localised chilled beam with individual control of the primary air flow was studied. Responses of 24 human subjects when exposed to the environment generated by the chilled beam were collected via questionnaires under a 2-hour exposure including different work tasks at three locations in the room. Responses of the subjects to the environment generated with a chilled ceiling combined with mixing air distribution were used for comparison. The air temperature in the room was kept at 26 or 28 °C. Results show no significant difference in the overall and local thermal sensation reported by the subjects with the two systems. Both systems were equally acceptable. At 26 °C the individual control of the localised chilled beam led to higher acceptability of the work environment. At 28 °C the acceptability decreased with the two systems. It was not acceptable…

  7. MOO: Using a Computer Gaming Environment to Teach about Community Arts

    Science.gov (United States)

    Garber, Elizabeth

    2004-01-01

    In this paper, the author discusses the use of an interactive computer technology, "MOO" (Multi-user domain, Object-Oriented), in her art education classes for preservice teachers. A MOO is a text-based environment wherein interactivity is centered on text exchanges made between users based on problems or other materials created by teachers. The…

  8. Computers in nuclear medicine - current trends and future directions

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    Previously, a decision to purchase computing equipment for nuclear medicine usually required evaluation of only the 'local' needs. With the advent of PACS and state-of-the-art computer techniques for image acquisition and manipulation, purchase and subsequent application will become much more complex. Some of the current trends and future possibilities which may influence the choice and operation of computers within and outside the nuclear medicine environment are discussed. (author)

  9. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    OpenAIRE

    Sanggoo Kang; Kiwon Lee

    2016-01-01

    Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-bas...

  10. 2: Local area networks as a multiprocessor treatment planning system

    International Nuclear Information System (INIS)

    Neblett, D.L.; Hogan, S.E.

    1987-01-01

    The creation of a local area network (LAN) of interconnected computers provides an environment of multiple computer processors that adds a new dimension to treatment planning. A LAN system provides the opportunity to have two or more computers working on a plan in parallel. With high-speed interprocessor transfer, time-consuming tasks such as correcting several individual beams for contours and inhomogeneities can be performed simultaneously, effectively creating a parallel multiprocessor treatment planning system.

  11. Towards the Automatic Detection of Efficient Computing Assets in a Heterogeneous Cloud Environment

    OpenAIRE

    Iglesias, Jesus Omana; Stokes, Nicola; Ventresque, Anthony; Murphy, Liam, B.E.; Thorburn, James

    2013-01-01

    In a heterogeneous cloud environment, the manual grading of computing assets is the first step in the process of configuring IT infrastructures to ensure optimal utilization of resources. Grading the efficiency of computing assets is, however, a difficult, subjective and time-consuming manual task. Thus, an automatic efficiency grading algorithm is highly desirable. In this paper, we compare the effectiveness of the different criteria used in the manual gr...

  12. The status of computing and means of local and external networking at JINR

    Energy Technology Data Exchange (ETDEWEB)

    Dorokhin, A T; Shirikov, V P

    1996-12-31

    The goal of this report is to present a view of the current state of computer support for the different areas of physics research at JINR. The JINR network and its applications are considered. Trends in local networks and connectivity with global networks are discussed. 3 refs.

  13. Thermal comfort assessment of a surgical room through computational fluid dynamics using local PMV index.

    Science.gov (United States)

    Rodrigues, Nelson J O; Oliveira, Ricardo F; Teixeira, Senhorinha F C F; Miguel, Alberto Sérgio; Teixeira, José Carlos; Baptista, João S

    2015-01-01

    Studies concerning indoor thermal conditions are very important in defining the satisfactory comfort range in health care facilities. This study focuses on the evaluation of the thermal comfort sensation felt by surgeons and nurses in an orthopaedic surgical room of a Portuguese hospital. Two cases are assessed, with and without the presence of an additional person. Computational fluid dynamics (CFD) tools were applied to evaluate the predicted mean vote (PMV) index locally. Using average ventilation values to calculate the PMV index does not provide a correct and sufficiently descriptive evaluation of the surgical room's thermal environment. In both cases studied, surgeons feel the environment slightly hotter than nurses do. The nurses feel a slightly cold sensation under the air supply diffuser and their neutral comfort zone is located in the air stagnation zones close to the walls, while the surgeons feel the opposite. It was observed that the presence of an additional person in the room leads to an increase of the PMV index for surgeons and nurses. This is in line with the empirical knowledge that more persons in a room lead to an increased heat sensation. The clothing used by both groups, as well as the ventilation conditions, should be revised according to the number of persons in the room and the type of activity performed.

  14. Inequality measures perform differently in global and local assessments: An exploratory computational experiment

    Science.gov (United States)

    Chiang, Yen-Sheng

    2015-11-01

    Inequality measures are widely used in both academia and the public media to help us understand how incomes and wealth are distributed. They can be used to assess the distribution of a whole society (global inequality) as well as inequality within actors' referent networks (local inequality). How different is local inequality from global inequality? Formalizing the structure of reference groups as a network, the paper conducted a computational experiment to see how the structure of complex networks influences the difference between global and local inequality as assessed by a selection of inequality measures. It was found that local inequality tends to be higher than global inequality when the population is large, the network is dense and heterophilously assorted, and the income distribution is less dispersed. The implications of the simulation findings are discussed.
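
The global-versus-local distinction described in this abstract can be made concrete with a toy sketch. The Gini coefficient stands in for the paper's "selection of inequality measures", and the incomes and network below are invented for illustration:

```python
# Global inequality: one Gini over the whole population.
# Local inequality: a Gini per actor, computed over that actor's
# reference group (the actor plus its network neighbours).

def gini(incomes):
    """Gini coefficient via the mean-absolute-difference formula."""
    n = len(incomes)
    total = sum(incomes)
    if n == 0 or total == 0:
        return 0.0
    mad = sum(abs(x - y) for x in incomes for y in incomes)
    return mad / (2 * n * n * (total / n))

def local_ginis(incomes, neighbours):
    """One Gini per actor, over the actor's own referent network."""
    return [gini([incomes[i]] + [incomes[j] for j in neighbours[i]])
            for i in range(len(incomes))]

incomes = [1, 1, 2, 8]                   # toy income distribution
neighbours = [[1], [0, 2], [1, 3], [2]]  # a simple line network
g_global = gini(incomes)
g_local = local_ginis(incomes, neighbours)
```

Actor 0 only ever compares itself with its equally poor neighbour, so its local Gini is zero even though global inequality is substantial; varying the network (density, assortment) shifts this gap, which is the experiment the paper runs at scale.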

  15. A Computing Environment to Support Repeatable Scientific Big Data Experimentation of World-Wide Scientific Literature

    Energy Technology Data Exchange (ETDEWEB)

    Schlicher, Bob G [ORNL; Kulesz, James J [ORNL; Abercrombie, Robert K [ORNL; Kruse, Kara L [ORNL

    2015-01-01

    A principal tenet of the scientific method is that experiments must be repeatable, relying on ceteris paribus (i.e., all other things being equal). As a scientific community involved in the data sciences, we must investigate ways to establish an environment where experiments can be repeated. We can no longer merely allude to where the data come from; we must add rigor to the data collection and management process from which our analysis is conducted. This paper describes a computing environment to support repeatable scientific big data experimentation on world-wide scientific literature, and recommends a system that is housed at the Oak Ridge National Laboratory in order to provide value to investigators from government agencies, academic institutions, and industry entities. The described computing environment also adheres to the recently instituted digital data management plan mandated by multiple US government agencies, which involves all stages of the digital data life cycle, including capture, analysis, sharing, and preservation. It particularly focuses on the sharing and preservation of digital research data. The details of this computing environment are explained within the context of cloud services by the three-layer classification of Software as a Service, Platform as a Service, and Infrastructure as a Service.

  16. Evaluating Students' Perceptions and Attitudes toward Computer-Mediated Project-Based Learning Environment: A Case Study

    Science.gov (United States)

    Seet, Ling Ying Britta; Quek, Choon Lang

    2010-01-01

    This research investigated 68 secondary school students' perceptions of their computer-mediated project-based learning environment and their attitudes towards Project Work (PW) using two instruments--Project Work Classroom Learning Environment Questionnaire (PWCLEQ) and Project Work Related Attitudes Instrument (PWRAI). In this project-based…

  17. Nuclides.net: An integrated environment for computations on radionuclides and their radiation

    International Nuclear Information System (INIS)

    Galy, J.; Magill, J.

    2002-01-01

    Full text: The Nuclides.net computational package is of direct interest in the fields of environment monitoring and nuclear forensics. The 'integrated environment' is a suite of computer programs ranging from a powerful user-friendly interface, which allows the user to navigate the nuclide chart and explore the properties of nuclides, to various computational modules for decay calculations, dosimetry and shielding calculations, etc. The main emphasis in Nuclides.net is on nuclear science applications, such as health physics, radioprotection and radiochemistry, rather than nuclear data for which excellent sources already exist. In contrast to the CD-based Nuclides 2000 predecessor, Nuclides.net applications run over the internet on a web server. The user interface to these applications is via a web browser. Information submitted by the user is sent to the appropriate applications resident on the web server. The results of the calculations are returned to the user, again via the browser. The product is aimed at both students and professionals for reference data on radionuclides and computations based on this data using the latest internet technology. It is particularly suitable for educational purposes in the nuclear industry, health physics and radiation protection, nuclear and radiochemistry, nuclear physics, astrophysics, etc. The Nuclides.net software suite contains the following modules/features: a) A new user interface to view the nuclide charts (with zoom features). Additional nuclide charts are based on spin, parity, binding energy etc. b) There are five main applications: (1) 'Decay Engine' for decay calculations of numbers, masses, activities, dose rates, etc. of parent and daughters. (2) 'Dosimetry and Shielding' module allows the calculation of dose rates from both unshielded and shielded point sources. A choice of 10 shield materials is available. 
(3) 'Virtual Nuclides' allows the user to do decay and dosimetry and shielding calculations on mixtures of

  18. Development of a locally mass flux conservative computer code for calculating 3-D viscous flow in turbomachines

    Science.gov (United States)

    Walitt, L.

    1982-01-01

    The VANS successive approximation numerical method was extended to the computation of three-dimensional, viscous, transonic flows in turbomachines. A cross-sectional computer code, which conserves mass flux at each point of the cross-sectional surface of computation, was developed. In the VANS numerical method, the cross-sectional computation follows a blade-to-blade calculation. Numerical calculations were made for an axial annular turbine cascade and a transonic centrifugal impeller with splitter vanes. The subsonic turbine cascade computation was performed on a blade-to-blade surface to evaluate the accuracy of the blade-to-blade mode of marching. Calculated blade pressures at the hub, mid, and tip radii of the cascade agreed with corresponding measurements. The transonic impeller computation was conducted to test the newly developed, locally mass flux conservative, cross-sectional computer code. Both blade-to-blade and cross-sectional modes of calculation were implemented for this problem. A triple point shock structure was computed in the inducer region of the impeller. In addition, time-averaged shroud static pressures generally agreed with measured shroud pressures. It is concluded that the blade-to-blade computation produces a useful engineering flow field in regions of subsonic relative flow, and that a cross-sectional computation with a locally mass flux conservative continuity equation is required to compute the shock waves in regions of supersonic relative flow.

  19. Local Path Planning of Driverless Car Navigation Based on Jump Point Search Method Under Urban Environment

    Directory of Open Access Journals (Sweden)

    Kaijun Zhou

    2017-09-01

    Full Text Available The Jump Point Search (JPS) algorithm, a fast search method applied in path planning, is adopted for local path planning of a driverless car in an urban environment. Firstly, a vector Geographic Information System (GIS) map, including Global Positioning System (GPS) position, direction, and lane information, is built for global path planning. Secondly, the GIS map database is utilized in global path planning for the driverless car. Then, the JPS algorithm is adopted to avoid the obstacle ahead and to find an optimal local path for the driverless car in the urban environment. Finally, 125 different simulation experiments in the urban environment demonstrate that JPS can successfully find the optimal and safe path, and meanwhile, it has a lower time complexity compared with the Vector Field Histogram (VFH), the Rapidly Exploring Random Tree (RRT), A*, and the Probabilistic Roadmaps (PRM) algorithms. Furthermore, JPS is shown to be useful in the structured urban environment.
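
For context on the comparison in this abstract, here is a minimal sketch of the A* baseline that JPS accelerates (the grid, costs, and names are invented for illustration). JPS uses the same cost model and heuristic but prunes symmetric successors and expands only "jump points" reached by scanning along straight lines, which is where its speed-up comes from:

```python
import heapq

def astar(grid, start, goal):
    """Minimal 4-connected A* on a 0/1 occupancy grid (1 = obstacle).

    Returns the length of a shortest path, or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan-distance heuristic, admissible on a 4-connected grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # (f = g + h, g, cell)
    g_cost = {start: 0}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            return g
        if g > g_cost.get(node, float("inf")):
            continue  # stale heap entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None  # unreachable

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path_len = astar(grid, (0, 0), (2, 0))  # must route around the obstacle wall
```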

  20. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  1. COMPUTATIONAL MODELS USED FOR MINIMIZING THE NEGATIVE IMPACT OF ENERGY ON THE ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Oprea D.

    2012-04-01

    Full Text Available Optimizing an energy system is a problem that has been studied extensively for many years by scientists. The problem can be approached from different perspectives and with different computer programs. This work characterizes one of the calculation methods used in Europe for modelling and optimizing power systems. The method is based on reducing the impact of the energy system on the environment. The computer program used and characterized in this article is GEMIS.

  2. Visualization system for grid environment in the nuclear field

    International Nuclear Information System (INIS)

    Suzuki, Yoshio; Matsumoto, Nobuko; Idomura, Yasuhiro; Tani, Masayuki

    2006-01-01

    An innovative scientific visualization system is needed to visualize, in an integrated manner, the large amounts of data generated in distributed remote locations as a result of large-scale numerical simulations in a grid environment. One of the important functions in such a visualization system is parallel visualization, which makes it possible to visualize data using multiple CPUs of a supercomputer. The other is distributed visualization, which makes it possible to execute visualization processes using a local client computer and remote computers. We have developed a toolkit including these functions in cooperation with the commercial visualization software AVS/Express, called the Parallel Support Toolkit (PST). PST can execute visualization processes with three kinds of parallelism (data parallelism, task parallelism, and pipeline parallelism) using local and remote computers. We have evaluated PST on a large amount of data generated by a nuclear fusion simulation. Here, two supercomputers installed at JAEA, an Altix3700Bx2 and a Prism, are used. From the evaluation, it can be seen that PST has the potential to efficiently visualize large amounts of data in a grid environment. (author)

  3. A Novel Biometric Approach for Authentication In Pervasive Computing Environments

    OpenAIRE

    Rachappa,; Divyajyothi M G; D H Rao

    2016-01-01

    The paradigm of embedding computing devices in our surrounding environment has gained more interest in recent years. Along with contemporary technology come challenges, the most important being the security and privacy aspects. Keeping the compactness and memory constraints of pervasive devices in mind, the biometric techniques proposed for identification should be robust and dynamic. In this work, we propose an emerging scheme that is based on a few exclusive human traits and characte...

  4. Taiwanese Consumers’ Perceptions of Local and Global Brands: An Investigation in Taiwan Computer Industry

    OpenAIRE

    Hsieh, Ya-Yun

    2010-01-01

    This study aims to investigate how consumers in a newly developed country, Taiwan, perceive local brands and global brands in the computer industry. To access an in-depth understanding and evaluate factors that influence consumers’ assessment of local and global brands, the country-of-origin effect and the association of brand origin are investigated; the effect of consumer ethnocentrism is addressed; and the cultural aspects on collectivism and face concept are examined. The study adopts...

  5. DiFX: A software correlator for very long baseline interferometry using multi-processor computing environments

    OpenAIRE

    Deller, A. T.; Tingay, S. J.; Bailes, M.; West, C.

    2007-01-01

    We describe the development of an FX style correlator for Very Long Baseline Interferometry (VLBI), implemented in software and intended to run in multi-processor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high performance computing, such as multi-processor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of softw...

  6. Massive calculations of electrostatic potentials and structure maps of biopolymers in a distributed computing environment

    International Nuclear Information System (INIS)

    Akishina, T.P.; Ivanov, V.V.; Stepanenko, V.A.

    2013-01-01

    Among the key factors determining the processes of transcription and translation are the distributions of the electrostatic potentials of DNA, RNA and proteins. Calculations of electrostatic distributions and structure maps of biopolymers on computers are time consuming and require large computational resources. We developed the procedures for organization of massive calculations of electrostatic potentials and structure maps for biopolymers in a distributed computing environment (several thousands of cores).
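
The kernel underlying such massive calculations can be sketched in miniature: the Coulomb potential of a set of point charges evaluated over a grid, with the grid split into chunks as a stand-in for the paper's distribution over thousands of cores. Units are arbitrary and the charges below are a toy dipole, not data from the paper:

```python
import math

def potential_at(point, charges):
    """Coulomb potential V = sum q_i / r_i at one grid point (arbitrary units)."""
    v = 0.0
    for q, centre in charges:
        v += q / math.dist(point, centre)
    return v

def potential_map(points, charges, n_chunks=4):
    """Evaluate the potential over all points, chunk by chunk.

    Each chunk mimics the unit of work farmed out to one core in a
    distributed run; here the chunks are simply processed in sequence.
    """
    out = []
    step = max(1, math.ceil(len(points) / n_chunks))
    for i in range(0, len(points), step):
        out.extend(potential_at(p, charges) for p in points[i:i + step])
    return out

charges = [(+1.0, (0.0, 0.0, 0.0)), (-1.0, (2.0, 0.0, 0.0))]  # toy dipole
points = [(1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]  # both on the dipole midplane
vmap = potential_map(points, charges)
```

Because the potential at each point is independent of every other point, the decomposition is embarrassingly parallel; the real cost driver is the number of grid points times the number of charged atoms in the biopolymer.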

  7. Vanderbilt University: Campus Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    Despite the decentralized nature of computing at Vanderbilt, there is significant evidence of cooperation and use of each other's resources by the various computing entities. Planning for computing occurs in every school and department. Caravan, a campus-wide network, is described. (MLW)

  8. Local and Long Distance Computer Networking for Science Classrooms. Technical Report No. 43.

    Science.gov (United States)

    Newman, Denis

    This report describes Earth Lab, a project which is demonstrating new ways of using computers for upper-elementary and middle-school science instruction, and finding ways to integrate local-area and telecommunications networks. The discussion covers software, classroom activities, formative research on communications networks, and integration of…

  9. Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment

    Science.gov (United States)

    Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.

    2013-12-01

    Dust storms have serious negative impacts on the environment, human health, and assets. The continuing global climate change has increased the frequency and intensity of dust storms in the past decades. To better understand and predict the distribution, intensity, and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust), and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The development and application of these models have contributed significantly to both scientific research and our daily life. However, dust storm simulation is a data- and computing-intensive process. Normally, a simulation of a single dust storm event may take several hours or days to run. This seriously impacts the timeliness of prediction and potential applications. To speed up the process, high performance computing is widely adopted. By partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in a parallel fashion, the computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with other geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communications among computing nodes. Therefore, the task allocation method is the key factor that may determine the feasibility of the parallelization. The allocation algorithm needs to carefully balance the computing cost and communication cost for each computing node to minimize total execution time and reduce the overall communication cost for the entire system. This presentation introduces two algorithms for such allocation and compares them with an evenly distributed allocation method.
Specifically, 1) In order to get optimized solutions, a
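
The core allocation problem described above (assign subdomains with differing computing costs to nodes so that loads balance) can be sketched with the classic greedy longest-processing-time heuristic. This is a generic stand-in, not one of the presentation's two algorithms, and it ignores the communication cost between adjacent subdomains that those algorithms also weigh:

```python
import heapq

def allocate(subdomain_costs, n_nodes):
    """Greedy LPT allocation: hand the most expensive remaining subdomain
    to the currently least-loaded node. Balances computing cost only."""
    loads = [(0.0, node, []) for node in range(n_nodes)]  # (load, id, members)
    heapq.heapify(loads)
    for i, cost in sorted(enumerate(subdomain_costs), key=lambda kv: -kv[1]):
        load, node, members = heapq.heappop(loads)  # least-loaded node
        members.append(i)
        heapq.heappush(loads, (load + cost, node, members))
    return sorted(loads)  # (total load, node id, assigned subdomain indices)

# Toy example: six subdomains with unequal costs on two nodes.
result = allocate([5, 3, 2, 7, 1, 4], 2)
```

On this toy input the heuristic happens to find a perfect split (load 11 on each node); in general LPT only guarantees a near-balanced allocation, which is why communication-aware methods such as those in the presentation can still win.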

  10. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm.

    Science.gov (United States)

    Abdulhamid, Shafi'i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on demand via a front-end interface. Scientific application scheduling in the cloud computing environment is identified as an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of application scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, a scientific application scheduling technique using the Global League Championship Algorithm (GBLCA) optimization technique is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produced a remarkable performance improvement in makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, as parametrically measured in terms of the response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution that is suitable for scientific application task execution in the cloud computing environment than the MinMin, MaxMin, Genetic Algorithm (GA), and Ant Colony Optimization (ACO) scheduling techniques.
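
One of the baselines the abstract benchmarks against, MinMin, is simple enough to sketch: repeatedly pick the (task, node) pair with the smallest completion time. The task sizes and node speeds below are invented for illustration, and makespan is the metric on which GBLCA's reported 14.44% to 46.41% improvement is measured:

```python
def minmin(task_sizes, node_speeds):
    """MinMin heuristic: schedule the (task, node) pair that finishes earliest.

    Returns the task-to-node assignment and the makespan (time at which
    the last node becomes idle).
    """
    ready = [0.0] * len(node_speeds)  # time at which each node becomes free
    schedule = {}
    unscheduled = set(range(len(task_sizes)))
    while unscheduled:
        finish, t, n = min(
            (ready[n] + task_sizes[t] / node_speeds[n], t, n)
            for t in unscheduled
            for n in range(len(node_speeds))
        )
        ready[n] = finish
        schedule[t] = n
        unscheduled.remove(t)
    return schedule, max(ready)

schedule, makespan = minmin([4, 2, 8], [1, 2])
```

Note the failure mode visible even in this toy run: MinMin greedily piles every task onto the fast node, leaving the slow node idle, which is exactly the kind of myopia that population-based metaheuristics such as GBLCA are meant to avoid.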

  12. Computing wave functions in multichannel collisions with non-local potentials using the R-matrix method

    Science.gov (United States)

    Bonitati, Joey; Slimmer, Ben; Li, Weichuan; Potel, Gregory; Nunes, Filomena

    2017-09-01

    The calculable form of the R-matrix method has been previously shown to be a useful tool in approximately solving the Schrodinger equation in nuclear scattering problems. We use this technique combined with the Gauss quadrature for the Lagrange-mesh method to efficiently solve for the wave functions of projectile nuclei in low energy collisions (1-100 MeV) involving an arbitrary number of channels. We include the local Woods-Saxon potential, the non-local potential of Perey and Buck, a Coulomb potential, and a coupling potential to computationally solve for the wave function of two nuclei at short distances. Object oriented programming is used to increase modularity, and parallel programming techniques are introduced to reduce computation time. We conclude that the R-matrix method is an effective method to predict the wave functions of nuclei in scattering problems involving both multiple channels and non-local potentials. Michigan State University iCER ACRES REU.
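
The Gauss quadrature underpinning the Lagrange-mesh method mentioned above can be sketched with a hardcoded 3-point Gauss-Legendre rule (the Lagrange-mesh basis functions are built on exactly such a quadrature mesh; the node and weight values below are the standard ones, and the function names are illustrative):

```python
import math

# 3-point Gauss-Legendre nodes and weights on [-1, 1];
# exact for polynomials up to degree 2n - 1 = 5.
NODES = [-math.sqrt(3 / 5), 0.0, math.sqrt(3 / 5)]
WEIGHTS = [5 / 9, 8 / 9, 5 / 9]

def gauss_legendre_3(f, a, b):
    """Integrate f over [a, b] by mapping the reference nodes from [-1, 1]."""
    mid, half = (a + b) / 2, (b - a) / 2
    return half * sum(w * f(mid + half * x) for x, w in zip(NODES, WEIGHTS))

# x^4 has degree 4 <= 5, so the 3-point rule reproduces the exact value 2/5.
val = gauss_legendre_3(lambda x: x ** 4, -1.0, 1.0)
```

In an R-matrix calculation the same idea is applied channel by channel over the internal region, with many more mesh points; the payoff of the Lagrange-mesh choice is that potential matrix elements become diagonal evaluations at the quadrature nodes.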

  13. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume is aimed at a wide range of readers and researchers in the area of Big Data, presenting the recent advances in the field of Big Data Analysis as well as the techniques and tools used to analyze it. The book includes 10 distinct chapters providing a concise introduction to Big Data Analysis and recent techniques and environments for Big Data Analysis. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in big data analysis by adopting parallel, grid, and cloud computing environments.

  14. The capacity of local governments to improve business environment: Evidence from Serbia

    Directory of Open Access Journals (Sweden)

    Vesna Janković Milić

    2014-12-01

    Full Text Available The aim of this paper is to draw attention to the need to strengthen institutional cooperation between local self-governments and the business community. The paper analyses the ability of socio-economic councils in Serbia, as a part of local governments, to improve the business environment and indicators of social status at the local level. In addition to socio-economic councils, this analysis includes the departments, divisions, and offices for local economic development and their responsibilities. The results in the paper have been generated using descriptive statistics, the Chi-Square test, the t-test, and regression analysis, based on primary data collected in empirical research on 55 municipalities in Serbia. The fundamental result obtained from these analyses is that socio-economic councils have a positive impact on social and economic development in the surveyed municipalities. Finally, the basic conclusion from the research is that the size of a municipality is not a limiting factor for the establishment of socio-economic councils and their functionality.

  15. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE Model of Water Resources and Water Environments

    Directory of Open Access Journals (Sweden)

    Guohua Fang

    2016-09-01

    Full Text Available To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and output sources of the National Economic Production Department. Secondly, an extended Social Accounting Matrix (SAM) of Jiangsu province is developed to simulate various scenarios. By changing the values of the discharge fees (increased by 50%, 100%, and 150%), three scenarios are simulated to examine their influence on the overall economy and on each industry. The simulation results show that an increased fee will have a negative impact on Gross Domestic Product (GDP). However, wastewater may be effectively controlled. This study also demonstrates that, along with the economic costs, the increase of the discharge fee will lead to the upgrading of industrial structures from a situation of heavy pollution to one of light pollution, which is beneficial to the sustainable development of the economy and the protection of the environment.

  16. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-07-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow that consists of preparation of an application’s input, running of a simulation, and visualization of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time. It allows modification of application-defined control parameters at run-time using various user-steering applications. In this project, we propose a computational steering framework for HPC environments that provides an innovative solution and easy-to-use platform, which allows users to connect and interact with running application(s) in real-time. This framework uses RealityGrid as the underlying steering library and adds several enhancements to the library to enable steering support for Blue Gene systems. Included in the scope of this project is the development of a scalable and efficient steering relay server that supports many-to-many connectivity between multiple steered applications and multiple steering clients. Steered applications can range from intermediate simulation and physical modeling applications to complex computational fluid dynamics (CFD) applications or advanced visualization applications. The Blue Gene supercomputer presents special challenges for remote access because the compute nodes reside on private networks. This thesis presents an implemented solution and demonstrates it on representative applications. Thorough implementation details and application enablement steps are also presented in this thesis to encourage direct usage of this framework.

  17. Simulation-based computation of dose to humans in radiological environments

    International Nuclear Information System (INIS)

    Breazeal, N.L.; Davis, K.R.; Watson, R.A.; Vickers, D.S.; Ford, M.S.

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface

  18. Simulation-based computation of dose to humans in radiological environments

    Energy Technology Data Exchange (ETDEWEB)

    Breazeal, N.L. [Sandia National Labs., Livermore, CA (United States); Davis, K.R.; Watson, R.A. [Sandia National Labs., Albuquerque, NM (United States); Vickers, D.S. [Brigham Young Univ., Provo, UT (United States). Dept. of Electrical and Computer Engineering; Ford, M.S. [Battelle Pantex, Amarillo, TX (United States). Dept. of Radiation Safety

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface.

  19. Toward a Computer Vision-based Wayfinding Aid for Blind Persons to Access Unfamiliar Indoor Environments.

    Science.gov (United States)

    Tian, Yingli; Yang, Xiaodong; Yi, Chucai; Arditi, Aries

    2013-04-01

    Independent travel is a well-known challenge for blind and visually impaired persons. In this paper, we propose a proof-of-concept computer vision-based wayfinding aid for blind people to independently access unfamiliar indoor environments. In order to find different rooms (e.g., an office, a lab, or a bathroom) and other building amenities (e.g., an exit or an elevator), we incorporate object detection with text recognition. First, we develop a robust and efficient algorithm to detect doors, elevators, and cabinets based on their general geometric shape, by combining edges and corners. The algorithm is general enough to handle large intra-class variations of objects with different appearances among different indoor environments, as well as small inter-class differences between different objects such as doors and door-like cabinets. Next, in order to distinguish intra-class objects (e.g., an office door from a bathroom door), we extract and recognize text information associated with the detected objects. For text recognition, we first extract text regions from signs with multiple colors and possibly complex backgrounds, and then apply character localization and topological analysis to filter out background interference. The extracted text is recognized using off-the-shelf optical character recognition (OCR) software products. The object type, orientation, location, and text information are presented to the blind traveler as speech.

  20. The effects of the local environment on active galactic nuclei

    International Nuclear Information System (INIS)

    Manzer, L. H.; De Robertis, M. M.

    2014-01-01

    There continues to be significant controversy regarding the mechanism(s) responsible for the initiation and maintenance of activity in galactic nuclei. In this paper we will investigate possible environmental triggers of nuclear activity through a statistical analysis of a large sample of galaxy groups. The focus of this paper is to identify active galactic nuclei (AGNs) and other emission-line galaxies in these groups and to compare their frequency with a sample of over 260,000 isolated galaxies from the same catalog. The galaxy groups are taken from the catalog of Yang et al., in which over 20,000 virialized groups of galaxies (2 ≤ N ≤ 20) with redshifts between 0.01 and 0.20 are from the Sloan Digital Sky Survey. We first investigate the completeness of our data set and find, though biases are a concern particularly at higher redshift, that our data provide a fair representation of the local universe. After correcting emission-line equivalent widths for extinction and underlying Balmer stellar absorption, we classify galaxies in the sample using traditional emission-line ratios, while incorporating measurement uncertainties. We find a significantly higher fraction of AGNs in groups compared with the isolated sample. Likewise, a significantly higher fraction of absorption-line galaxies are found in groups, while a higher fraction of star-forming galaxies prefer isolated environments. Within grouped environments, AGNs and star-forming galaxies are found more frequently in small- to medium-richness groups, while absorption-line galaxies prefer groups with larger richnesses. Groups containing only emission-line galaxies have smaller virial radii, velocity dispersions, and masses compared with those containing only absorption-line galaxies. Furthermore, the AGN fraction increases with decreasing distance to the group centroid, independent of galaxy morphology. Using properties obtained from Galaxy Zoo, there is an increased fraction of AGNs within merging systems

  1. The effects of the local environment on active galactic nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Manzer, L. H.; De Robertis, M. M., E-mail: liannemanzer@gmail.com, E-mail: mmdr@yorku.ca [Department of Physics and Astronomy, York University, Toronto, ON M3J 1P3 (Canada)

    2014-06-20

    There continues to be significant controversy regarding the mechanism(s) responsible for the initiation and maintenance of activity in galactic nuclei. In this paper we will investigate possible environmental triggers of nuclear activity through a statistical analysis of a large sample of galaxy groups. The focus of this paper is to identify active galactic nuclei (AGNs) and other emission-line galaxies in these groups and to compare their frequency with a sample of over 260,000 isolated galaxies from the same catalog. The galaxy groups are taken from the catalog of Yang et al., in which over 20,000 virialized groups of galaxies (2 ≤ N ≤ 20) with redshifts between 0.01 and 0.20 are from the Sloan Digital Sky Survey. We first investigate the completeness of our data set and find, though biases are a concern particularly at higher redshift, that our data provide a fair representation of the local universe. After correcting emission-line equivalent widths for extinction and underlying Balmer stellar absorption, we classify galaxies in the sample using traditional emission-line ratios, while incorporating measurement uncertainties. We find a significantly higher fraction of AGNs in groups compared with the isolated sample. Likewise, a significantly higher fraction of absorption-line galaxies are found in groups, while a higher fraction of star-forming galaxies prefer isolated environments. Within grouped environments, AGNs and star-forming galaxies are found more frequently in small- to medium-richness groups, while absorption-line galaxies prefer groups with larger richnesses. Groups containing only emission-line galaxies have smaller virial radii, velocity dispersions, and masses compared with those containing only absorption-line galaxies. Furthermore, the AGN fraction increases with decreasing distance to the group centroid, independent of galaxy morphology. Using properties obtained from Galaxy Zoo, there is an increased fraction of AGNs within merging systems

  2. The Plant-Window System: A framework for an integrated computing environment at advanced nuclear power plants

    International Nuclear Information System (INIS)

    Wood, R.T.; Mullens, J.A.; Naser, J.A.

    1997-01-01

Power plant data, and the information that can be derived from them, provide the link to the plant through which the operations, maintenance, and engineering staff understand and manage plant performance. The extensive use of computer technology in advanced reactor designs provides the opportunity to greatly expand the capability to obtain, analyze, and present data about the plant to station personnel. However, to support highly efficient and increasingly safe operation of nuclear power plants, it is necessary to transform the vast quantity of available data into clear, concise, and coherent information that can be readily accessed and used throughout the plant. This need can be met by an integrated computer workstation environment that provides the necessary information and software applications, in a manner that can be easily understood and used, to the proper users throughout the plant. As part of a Cooperative Research and Development Agreement with the Electric Power Research Institute, the Oak Ridge National Laboratory has developed functional requirements for a Plant-Wide Integrated Environment Distributed On Workstations (Plant-Window) System. The Plant-Window System (PWS) can serve the needs of operations, engineering, and maintenance personnel at nuclear power stations by providing integrated data and software applications within a common computing environment. The PWS requirements identify functional capabilities and provide guidelines for standardized hardware, software, and display interfaces so as to define a flexible computing environment for both current-generation nuclear power plants and advanced reactor designs

  3. A Drawing and Multi-Representational Computer Environment for Beginners' Learning of Programming Using C: Design and Pilot Formative Evaluation

    Science.gov (United States)

    Kordaki, Maria

    2010-01-01

    This paper presents both the design and the pilot formative evaluation study of a computer-based problem-solving environment (named LECGO: Learning Environment for programming using C using Geometrical Objects) for the learning of computer programming using C by beginners. In its design, constructivist and social learning theories were taken into…

  4. Reaction time for processing visual stimulus in a computer-assisted rehabilitation environment.

    Science.gov (United States)

    Sanchez, Yerly; Pinzon, David; Zheng, Bin

    2017-10-01

To examine reaction time when human subjects process information presented in the visual channel, under both direct vision and a virtual rehabilitation environment, while walking. The visual stimulus comprised eight math problems displayed in the peripheral vision of seven healthy human subjects in a virtual rehabilitation training environment (computer-assisted rehabilitation environment, CAREN) and a direct vision environment. Subjects were required to verbally report the results of these math calculations within a short period of time. Reaction time, measured by a Tobii eye tracker, and calculation accuracy were recorded and compared between the direct vision and virtual rehabilitation environments. Performance outcomes measured for both groups included reaction time, reading time, answering time, and the verbal answer score. A significant difference between the groups was found only for reaction time (p = .004). Participants had more difficulty recognizing the first equation in the virtual environment, and participants' reaction time was faster in the direct vision environment. This reaction time delay should be kept in mind when designing skill-training scenarios in virtual environments. This was a pilot project for a series of studies assessing the cognitive ability of stroke patients undertaking a rehabilitation program with a virtual training environment. Implications for rehabilitation: eye tracking is a reliable tool that can be employed in rehabilitation virtual environments, and reaction time changes between direct vision and virtual environments.

  5. Cluster implementation for parallel computation within MATLAB software environment

    International Nuclear Information System (INIS)

Santana, Antonio O. de; Dantas, Carlos C.; Charamba, Luiz G. da R.; Souza Neto, Wilson F. de; Melo, Silvio B.; Lima, Emerson A. de O.

    2013-01-01

A cluster for parallel computation with MATLAB software, the COCGT (Cluster for Optimizing Computing in Gamma ray Transmission methods), is implemented. The implementation corresponds to the creation of a local network of computers, the installation and configuration of software, and the execution of cluster tests to determine and optimize data-processing performance. The COCGT implementation was required for computing data from gamma transmission measurements applied to fluid dynamics and tomographic reconstruction in an FCC (Fluid Catalytic Cracking) cold pilot unit, as well as simulation data. As an initial test, the determination of the SVD (Singular Value Decomposition) of a random matrix of dimension (n, n), n=1000, using a modified Girko's law, revealed that COCGT was faster than the cluster in the literature [1], which is similar and operates under the same conditions. Solution of a system of linear equations provided a further test of COCGT performance: processing a square matrix with n=10000 took 27 s, and a square matrix with n=12000 took 45 s. To determine the cluster's behaviour with respect to 'parfor' (parallel for-loop) and 'spmd' (single program, multiple data), two codes containing these two commands were applied to the same problem: determination of the SVD of a square matrix with n=1000. Execution of the codes on COCGT showed that: 1) for the code with 'parfor', performance improved as the number of labs increased from 1 to 8; 2) for the code with 'spmd', a single lab (core) was enough to process the problem and return results in less than 1 s. The tests were repeated in a similar situation, with the difference that the SVD was now determined from a square matrix with n=1500 for the 'parfor' code and n=7000 for the 'spmd' code, leading to the conclusions: 1) for the code with 'parfor', the behaviour was the same as described above; 2) for the code with 'spmd', besides producing higher performance, it supports a
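The parfor-style workload described above, SVDs of independent random matrices spread over parallel workers, can be mimicked in Python. This is a hedged sketch, not the COCGT code: thread workers stand in for MATLAB labs, and NumPy's SVD releases the GIL, so the threads genuinely run in parallel.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def largest_singular_value(seed, n=200):
    """One parfor-style unit of work: SVD of an (n, n) Gaussian random matrix."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, n))
    s = np.linalg.svd(a, compute_uv=False)  # singular values only, descending
    return s[0]

def parallel_svd(num_tasks, workers=4):
    """Distribute the independent SVD tasks over a pool of worker threads."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(largest_singular_value, range(num_tasks)))

results = parallel_svd(8)
```

For an (n, n) Gaussian matrix, the largest singular value concentrates near 2√n, which gives a quick sanity check on each worker's output.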

  6. Using high performance interconnects in a distributed computing and mass storage environment

    International Nuclear Information System (INIS)

    Ernst, M.

    1994-01-01

Detector collaborations of the HERA experiments typically involve more than 500 physicists from a few dozen institutes. These physicists require access to large amounts of data in a fully transparent manner. Important issues include distributed mass storage management systems in a distributed and heterogeneous computing environment. At the very center of a distributed system, including tens of CPUs and network-attached mass storage peripherals, are the communication links. Today scientists are witnessing an integration of computing and communication technology, with the 'network' becoming the computer. This contribution reports on a centrally operated computing facility for the HERA experiments at DESY, including symmetric multiprocessor machines (84 processors), presently more than 400 GByte of magnetic disk and 40 TB of automated tape storage, tied together by a HIPPI 'network'. Focusing on the high-performance interconnect technology, details are provided about the HIPPI-based 'backplane' configured around a 20 Gigabit/s Multi Media Router, and the performance and efficiency of the related computer interfaces

  7. Computationally efficient near-field source localization using third-order moments

    Science.gov (United States)

    Chen, Jian; Liu, Guohong; Sun, Xiaoying

    2014-12-01

    In this paper, a third-order moment-based estimation of signal parameters via rotational invariance techniques (ESPRIT) algorithm is proposed for passive localization of near-field sources. By properly choosing sensor outputs of the symmetric uniform linear array, two special third-order moment matrices are constructed, in which the steering matrix is the function of electric angle γ, while the rotational factor is the function of electric angles γ and ϕ. With the singular value decomposition (SVD) operation, all direction-of-arrivals (DOAs) are estimated from a polynomial rooting version. After substituting the DOA information into the steering matrix, the rotational factor is determined via the total least squares (TLS) version, and the related range estimations are performed. Compared with the high-order ESPRIT method, the proposed algorithm requires a lower computational burden, and it avoids the parameter-match procedure. Computer simulations are carried out to demonstrate the performance of the proposed algorithm.
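The record above describes a third-order-moment, near-field ESPRIT variant. The rotational-invariance idea at its core can be illustrated with a standard second-order, far-field LS-ESPRIT sketch on a uniform linear array; the array geometry, source angles, and noise level below are invented for the demonstration and are not taken from the paper.

```python
import numpy as np

def esprit_doa(x, num_sources, d_over_lambda=0.5):
    """Estimate DOAs (radians) from ULA snapshots x of shape (sensors, snapshots)."""
    m, n = x.shape
    r = x @ x.conj().T / n                      # sample covariance matrix
    _, vecs = np.linalg.eigh(r)                 # eigenvalues in ascending order
    es = vecs[:, -num_sources:]                 # signal subspace (largest eigenvalues)
    phi = np.linalg.pinv(es[:-1]) @ es[1:]      # rotational invariance between subarrays
    eig = np.linalg.eigvals(phi)                # eigenvalues encode exp(j*2*pi*d/lambda*sin(theta))
    return np.arcsin(np.angle(eig) / (2 * np.pi * d_over_lambda))

# Two narrowband sources at -20 and +30 degrees, 8-sensor half-wavelength ULA
rng = np.random.default_rng(0)
m, n = 8, 2000
angles = np.deg2rad([-20.0, 30.0])
a = np.exp(1j * 2 * np.pi * 0.5 * np.outer(np.arange(m), np.sin(angles)))
s = rng.standard_normal((2, n)) + 1j * rng.standard_normal((2, n))
x = a @ s + 0.05 * (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n)))
est = np.sort(np.rad2deg(esprit_doa(x, 2)))
```

The third-order-moment version replaces the sample covariance with specially constructed moment matrices, which is what removes the parameter-match step, but the subspace and rotational-factor machinery is the same shape as above.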

  8. Dietary quality in children and the role of the local food environment

    Directory of Open Access Journals (Sweden)

    Eimear Keane

    2016-12-01

Full Text Available Diet is a modifiable contributor to many chronic diseases including childhood obesity. The local food environment may influence children's diet but this area of research is understudied. This study explores if distance to and the number of supermarkets and convenience stores in the local area around households are associated with dietary quality in nine year olds whilst controlling for household level socio-economic factors. This is a secondary analysis of Wave 1 (2007/2008) of the Growing Up in Ireland (GUI) Child Cohort Study, a sample of 8568 nine year olds from the Republic of Ireland. Dietary intake was assessed using a short, 20-item parent reported food frequency questionnaire and was used to create a dietary quality score (DQS) whereby a higher score indicated a higher diet quality. Socio-economic status was measured using household class, household income, and maternal education. Food availability was measured as road network distance to and the number of supermarkets and convenience stores around households. Separate fixed effects regression models assessed the association between local area food availability and dietary quality, stratified by sex. The DQS ranged from −5 to 25 (mean 9.4, SD 4.2). Mean DQS was higher in those who lived furthest (distance in quintiles) from their nearest supermarket (p<0.001), and in those who lived furthest from their nearest convenience store (p<0.001). After controlling for socio-economic characteristics of the household, there was insufficient evidence to suggest that distance to the nearest supermarket or convenience store was associated with dietary quality in girls or boys. The number of supermarkets or convenience stores within 1000 m of the household was not associated with dietary quality. Food availability had a limited effect on dietary quality in this study. Issues associated with conceptualising and measuring the food environment may explain the findings of the current study. Keywords: Diet

  9. A web-based, collaborative modeling, simulation, and parallel computing environment for electromechanical systems

    Directory of Open Access Journals (Sweden)

    Xiaoliang Yin

    2015-03-01

Full Text Available A complex electromechanical system is usually composed of multiple components from different domains, including mechanical, electronic, hydraulic, control, and so on. Modeling and simulation of electromechanical systems on a unified platform is one of the current research hotspots in systems engineering, and is also the development trend for the design of complex electromechanical systems. Unified modeling techniques and tools based on the Modelica language provide a satisfactory solution. To meet the requirements of collaborative modeling, simulation, and parallel computing for complex electromechanical systems based on Modelica, a general web-based modeling and simulation prototype environment, namely WebMWorks, is designed and implemented. Based on rich Internet application technologies, an interactive graphical user interface for modeling and post-processing in the web browser was implemented; with the collaborative design module, the environment supports top-down, concurrent modeling and team cooperation; additionally, a service-oriented architecture was applied to supply compiling and solving services that run on cloud-like servers, so the environment can manage and dispatch large-scale simulation tasks in parallel on multiple computing servers simultaneously. An engineering application involving a pure electric vehicle is tested on WebMWorks. The results of the simulation and a parametric experiment demonstrate that the tested web-based environment can effectively shorten the design cycle of complex electromechanical systems.

  10. Secondary School Students' Attitudes towards Mathematics Computer--Assisted Instruction Environment in Kenya

    Science.gov (United States)

    Mwei, Philip K.; Wando, Dave; Too, Jackson K.

    2012-01-01

    This paper reports the results of research conducted in six classes (Form IV) with 205 students with a sample of 94 respondents. Data represent students' statements that describe (a) the role of Mathematics teachers in a computer-assisted instruction (CAI) environment and (b) effectiveness of CAI in Mathematics instruction. The results indicated…

  11. The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment

    Science.gov (United States)

Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne

    2013-01-01

    The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text…

  12. Nuclides.net: A computational environment for nuclear data and applications in radioprotection and radioecology

    International Nuclear Information System (INIS)

    Berthou, V.; Galy, J.; Leutzenkirchen, K.

    2004-01-01

    An interactive multimedia tool, Nuclides.net, has been developed at the Institute for Transuranium Elements. The Nuclides.net 'integrated environment' is a suite of computer programs ranging from a powerful user-friendly interface, which allows the user to navigate the nuclides chart and explore the properties of nuclides, to various computational modules for decay calculations, dosimetry and shielding calculations, etc. The product is particularly suitable for environmental radioprotection and radioecology. (authors)
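A decay-calculation module of the kind described can be illustrated with the two-member Bateman equation for a parent-daughter chain. This is a minimal sketch, not code from Nuclides.net; the Mo-99/Tc-99m chain and the initial population are chosen purely for illustration.

```python
import math

def half_life_to_lambda(t_half):
    """Convert a half-life into a decay constant (same time units)."""
    return math.log(2) / t_half

def bateman_daughter(n0, lam_p, lam_d, t):
    """Daughter population at time t for a pure-parent initial state.

    Bateman solution for a two-member chain:
        N_d(t) = N0 * lam_p / (lam_d - lam_p) * (exp(-lam_p*t) - exp(-lam_d*t))
    """
    return n0 * lam_p / (lam_d - lam_p) * (math.exp(-lam_p * t) - math.exp(-lam_d * t))

# Mo-99 (T1/2 ~ 65.9 h) decaying to Tc-99m (T1/2 ~ 6.0 h), times in hours
lam_p = half_life_to_lambda(65.9)
lam_d = half_life_to_lambda(6.0)
n_d = bateman_daughter(1.0e6, lam_p, lam_d, t=24.0)
```

The daughter population starts at zero, grows toward transient equilibrium, and then decays with the parent, which is the qualitative behaviour any decay module for radioprotection work must reproduce.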

  13. Whiskers and Localized Corrosion on Copper in Repository Environment

    International Nuclear Information System (INIS)

    Hermansson, Hans-Peter; Gillen, Peter

    2004-03-01

Previous studies have demonstrated that whiskers (thread- or hair-shaped structures) can form on copper in a sulphide-containing environment. A remaining important question is whether the attack on the copper metal surface beneath a whisker is of a localized or of a general nature. This issue has not been clarified because whiskers are very fragile and have always detached and fallen off the surface at some stage of handling; it has therefore been very difficult to link the growth root of a whisker to underlying structures in the metal surface. A study was therefore initiated to settle the important issue of the relation between whisker position and the type of underlying metal attack. The use of a porous medium was originally planned to support the whiskers, in order to keep them in place and, by post-examination, characterize the nature of the whisker roots and thus the type of attack on the metal. However, the early stages of the present experimental work clearly indicated that other methods of study were necessary. A photographic method for the registration and positioning of whisker growth was therefore developed; it proved to be a successful means to coordinate whisker position and to link it with the attack on the underlying metal. A shortage of sulphide in previous experiments had caused a retarded whisker growth rate. In the present experiments the sulphide concentration was therefore kept at a more constant level throughout an experiment, so that hindered whisker growth did not limit the attack on the underlying metal. Whiskers and substrates were observed with a video camera throughout an experiment, and the phase composition was examined with Laser Raman Spectroscopy (LRS) and the Raman video microscope. Post-examinations were also performed using light optical microscopy. By combining the results from the optical methods it has been possible to distinguish two kinds of whisker roots (small/large diameter) and to link them with the underlying metal surface. It has also been demonstrated

  14. CLINICAL SURFACES - Activity-Based Computing for Distributed Multi-Display Environments in Hospitals

    Science.gov (United States)

    Bardram, Jakob E.; Bunde-Pedersen, Jonathan; Doryab, Afsaneh; Sørensen, Steffen

A multi-display environment (MDE) is made up of co-located and networked personal and public devices that form an integrated workspace enabling co-located group work. Traditionally, however, MDEs have mainly been designed to support a single “smart room”, and have had little sense of the tasks and activities that the MDE is being used for. This paper presents a novel approach to support activity-based computing in distributed MDEs, where displays are physically distributed across a large building. CLINICAL SURFACES was designed for clinical work in hospitals, and enables context-sensitive retrieval and browsing of patient data on public displays. We present the design and implementation of CLINICAL SURFACES, and report from an evaluation of the system at a large hospital. The evaluation shows that using distributed public displays to support activity-based computing inside a hospital is very useful for clinical work, and that the apparent contradiction between maintaining the privacy of medical data and presenting it on public displays can be mitigated by the use of CLINICAL SURFACES.

  15. The Integrated Computational Environment for Airbreathing Hypersonic Flight Vehicle Modeling and Design Evaluation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — An integrated computational environment for multidisciplinary, physics-based simulation and analyses of airbreathing hypersonic flight vehicles will be developed....

  16. Constrained Local UniversE Simulations: a Local Group factory

    Science.gov (United States)

    Carlesi, Edoardo; Sorce, Jenny G.; Hoffman, Yehuda; Gottlöber, Stefan; Yepes, Gustavo; Libeskind, Noam I.; Pilipenko, Sergey V.; Knebe, Alexander; Courtois, Hélène; Tully, R. Brent; Steinmetz, Matthias

    2016-05-01

Near-field cosmology is practised by studying the Local Group (LG) and its neighbourhood. This paper describes a framework for simulating the `near field' on the computer. Assuming the Λ cold dark matter (ΛCDM) model as a prior and applying the Bayesian tools of the Wiener filter and constrained realizations of Gaussian fields to the Cosmicflows-2 (CF2) survey of peculiar velocities, constrained simulations of our cosmic environment are performed. The aim of these simulations is to reproduce the LG and its local environment. Our main result is that the LG is likely a robust outcome of the ΛCDM scenario when subjected to the constraint derived from CF2 data, emerging in an environment akin to the observed one. Three levels of criteria are used to define the simulated LGs. At the base level, pairs of haloes must obey specific isolation, mass and separation criteria. At the second level, the orbital angular momentum and energy are constrained, and on the third one the phase of the orbit is constrained. Out of the 300 constrained simulations, 146 LGs obey the first set of criteria, 51 the second and 6 the third. The robustness of our LG `factory' enables the construction of a large ensemble of simulated LGs. Suitable candidates for high-resolution hydrodynamical simulations of the LG can be drawn from this ensemble, which can be used to perform comprehensive studies of the formation of the LG.

  17. Human resources of local governments as motivators of participation of businesses and citizens in protecting of environment

    OpenAIRE

    NIKOLIĆ N.; GAJOVIĆ A.; PAUNOVIĆ V.

    2015-01-01

This paper discusses the importance of the human resources of local governments in motivating businesses and citizens to protect the environment. The inability to absorb current problems, caused by the inadequate and incomplete arrangement of the utilization of human resources in the local government of Lučani, led to a redefinition of the strategic priorities of environmental protection. The motivational power of the human resources of local governments, expressed through interaction with the population ...

  18. A Solid-State NMR Experiment: Analysis of Local Structural Environments in Phosphate Glasses

    Science.gov (United States)

    Anderson, Stanley E.; Saiki, David; Eckert, Hellmut; Meise-Gresch, Karin

    2004-01-01

    An experiment that can be used to directly study the local chemical environments of phosphorus in solid amorphous materials is demonstrated. The experiment aims at familiarizing the students of chemistry with the principles of solid-state NMR, by having them synthesize a simple phosphate glass, and making them observe the (super 31)P NMR spectrum,…

  19. Emerging and Future Computing Paradigms and Their Impact on the Research, Training, and Design Environments of the Aerospace Workforce

    Science.gov (United States)

    Noor, Ahmed K. (Compiler)

    2003-01-01

    The document contains the proceedings of the training workshop on Emerging and Future Computing Paradigms and their impact on the Research, Training and Design Environments of the Aerospace Workforce. The workshop was held at NASA Langley Research Center, Hampton, Virginia, March 18 and 19, 2003. The workshop was jointly sponsored by Old Dominion University and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to a) provide broad overviews of the diverse activities related to new computing paradigms, including grid computing, pervasive computing, high-productivity computing, and the IBM-led autonomic computing; and b) identify future directions for research that have high potential for future aerospace workforce environments. The format of the workshop included twenty-one, half-hour overview-type presentations and three exhibits by vendors.

  20. Specialized Computer Systems for Environment Visualization

    Science.gov (United States)

    Al-Oraiqat, Anas M.; Bashkov, Evgeniy A.; Zori, Sergii A.

    2018-06-01

The need for real-time image generation of landscapes arises in various fields, as part of tasks solved by virtual and augmented reality systems as well as geographic information systems. Such systems provide opportunities for collecting, storing, analyzing and graphically visualizing geographic data. Algorithmic and hardware/software tools for increasing the realism and efficiency of environment visualization in 3D visualization systems are proposed. This paper discusses a modified path-tracing algorithm with a two-level hierarchy of bounding volumes that finds intersections with Axis-Aligned Bounding Boxes. The proposed algorithm eliminates branching and is hence more suitable for implementation on multi-threaded CPUs and GPUs. A modified ROAM algorithm is used to solve the problem of qualitative visualization of reliefs and landscapes. The algorithm is implemented on parallel systems: clusters and Compute Unified Device Architecture (CUDA) networks. Results show that the implementation on MPI clusters is more efficient than on GPUs/GPU clusters and allows real-time synthesis. The organization and algorithms of a parallel GPU system for 3D pseudo-stereo image/video synthesis are proposed. After analyzing the possibility of realizing each stage on a parallel GPU architecture, 3D pseudo-stereo synthesis is performed. An experimental prototype of a specialized hardware-software system for 3D pseudo-stereo imaging and video was developed on the CPU/GPU. The experimental results show that the proposed adaptation of 3D pseudo-stereo imaging to the architecture of GPU systems is efficient. It also accelerates the computational procedures of 3D pseudo-stereo synthesis for the anaglyph and anamorphic formats of the 3D stereo frame, without performing optimization procedures; the acceleration is on average 11 and 54 times for the test GPUs.
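The ray/AABB intersection test mentioned in the abstract is commonly implemented with the branch-poor "slab" method; a minimal sketch (not the authors' code, and in Python rather than CUDA) is:

```python
def ray_aabb_intersect(origin, inv_dir, box_min, box_max):
    """Slab test: does the ray origin + t*dir (t >= 0) hit the axis-aligned box?

    inv_dir is the per-component reciprocal of the ray direction; using it
    turns divisions into multiplications, and the min/max pattern avoids
    per-axis branching (the property the paper exploits on GPUs).
    """
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv
        t2 = (hi - o) * inv
        t_near = max(t_near, min(t1, t2))  # latest entry across the slabs
        t_far = min(t_far, max(t1, t2))    # earliest exit across the slabs
    return t_near <= t_far

# Unit cube at the origin, ray along +x from (-2, 0.5, 0.5);
# infinite reciprocals encode the zero y and z direction components
hit = ray_aabb_intersect((-2.0, 0.5, 0.5), (1.0, float("inf"), float("inf")),
                         (0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
```

In a two-level bounding-volume hierarchy, this test is run first against the coarse top-level boxes and then against the boxes of the surviving subtrees, so a cheap, branch-free formulation pays off on both CPU threads and GPU warps.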

  1. Local Measurement of Fuel Energy Deposition and Heat Transfer Environment During Fuel Lifetime Using Controlled Calorimetry

    International Nuclear Information System (INIS)

    Don W. Miller; Andrew Kauffmann; Eric Kreidler; Dongxu Li; Hanying Liu; Daniel Mills; Thomas D. Radcliff; Joseph Talnagi

    2001-01-01

This report provides a comprehensive description of the accomplishments of the DOE grant titled ''Local Measurement of Fuel Energy Deposition and Heat Transfer Environment During Fuel Lifetime Using Controlled Calorimetry''.

  2. Parallel sort with a ranged, partitioned key-value store in a high performance computing environment

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron; Poole, Stephen W.

    2016-01-26

    Improved sorting techniques are provided that perform a parallel sort using a ranged, partitioned key-value store in a high performance computing (HPC) environment. A plurality of input data files comprising unsorted key-value data in a partitioned key-value store are sorted. The partitioned key-value store comprises a range server for each of a plurality of ranges. Each input data file has an associated reader thread. Each reader thread reads the unsorted key-value data in the corresponding input data file and performs a local sort of the unsorted key-value data to generate sorted key-value data. A plurality of sorted, ranged subsets of each of the sorted key-value data are generated based on the plurality of ranges. Each sorted, ranged subset corresponds to a given one of the ranges and is provided to one of the range servers corresponding to the range of the sorted, ranged subset. Each range server sorts the received sorted, ranged subsets and provides a sorted range. A plurality of the sorted ranges are concatenated to obtain a globally sorted result.
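The two-phase scheme described above, local sort by reader threads, range partitioning, and per-range merge, can be sketched in miniature. In this hedged sketch, threads and in-memory lists stand in for the HPC reader threads and range servers, and all names are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor
import bisect
import random

def parallel_range_sort(input_files, splitters, workers=4):
    """Two-phase parallel sort over range-partitioned buckets.

    input_files -- list of lists of unsorted keys (one per 'input file')
    splitters   -- ascending keys defining len(splitters)+1 ranges
    """
    num_ranges = len(splitters) + 1
    buckets = [[] for _ in range(num_ranges)]  # one bucket per 'range server'

    def read_and_partition(data):
        local = sorted(data)                   # reader thread's local sort
        parts = [[] for _ in range(num_ranges)]
        for key in local:                      # ship each key to its range
            parts[bisect.bisect_right(splitters, key)].append(key)
        return parts

    with ThreadPoolExecutor(max_workers=workers) as pool:
        for parts in pool.map(read_and_partition, input_files):
            for i, p in enumerate(parts):
                buckets[i].extend(p)

    result = []
    for b in buckets:                          # each range server sorts its range
        result.extend(sorted(b))
    return result                              # concatenated ranges: global order

files = [[random.randrange(1000) for _ in range(50)] for _ in range(4)]
out = parallel_range_sort(files, splitters=[250, 500, 750])
```

Because the ranges are disjoint and ordered, simply concatenating each server's sorted range yields the globally sorted result, which is the key property the patent-style description relies on.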

  3. Role of Multislice Computed Tomography and Local Contrast in the Diagnosis and Characterization of Choanal Atresia

    Directory of Open Access Journals (Sweden)

    Khaled Al-Noury

    2011-01-01

Full Text Available Objective. To illustrate the role of multislice computed tomography and local contrast instillation in the diagnosis and characterization of choanal atresia, and to review the common associated radiological findings. Methods. We analyzed 9 pediatric patients (5 males and 4 females) with suspected choanal atresia by multislice computed tomography. We recorded the type of atresia plate and other congenital malformations of the skull. Results. Multislice computed tomography with local contrast instilled delineated the posterior choanae. Three patients had unilateral mixed membranous and bony atresia. Three patients had unilateral pure bony atresia. Only 1 of the 7 patients had bilateral bony atresia. The examination also showed other congenital anomalies in the head region: one patient had an ear abnormality, and one had congenital nasal pyriform aperture stenosis. One of these patients had several congenital abnormalities, including cardiac and renal deformities and a hypoplastic lateral semicircular canal. Of the 6 patients diagnosed with choanal atresia, 1 patient had esophageal atresia and a tracheoesophageal fistula. The remaining patients had no other CHARGE syndrome lesions. Conclusions. Local contrast medium, applied with the low-dose technique, helps to delineate the cause of nasal obstruction while avoiding a high radiation dose to the child.

  4. Successful Implementation of a Computer-Supported Collaborative Learning System in Teaching E-Commerce

    Science.gov (United States)

    Ngai, E. W. T.; Lam, S. S.; Poon, J. K. L.

    2013-01-01

    This paper describes the successful application of a computer-supported collaborative learning system in teaching e-commerce. The authors created a teaching and learning environment for 39 local secondary schools to introduce e-commerce using a computer-supported collaborative learning system. This system is designed to equip students with…

  5. Computing the Local Field Potential (LFP) from Integrate-and-Fire Network Models

    DEFF Research Database (Denmark)

    Mazzoni, Alberto; Linden, Henrik; Cuntz, Hermann

    2015-01-01

    Leaky integrate-and-fire (LIF) network models are commonly used to study how the spiking dynamics of neural networks changes with stimuli, tasks or dynamic network states. However, neurophysiological studies in vivo often rather measure the mass activity of neuronal microcircuits with the local f...... in cases where a single pyramidal population dominates the LFP generation, and thereby facilitate quantitative comparison between computational models and experimental LFP recordings in vivo....

  6. Analysis of experimental data sets for local scour depth around ...

    African Journals Online (AJOL)

    The performance of soft computing techniques to analyse and interpret the experimental data of local scour depth around bridge abutment, measured at different laboratory conditions and environment, is presented. The scour around bridge piers and abutments is, in the majority of cases, the main reason for bridge failures.

  7. Implementation of Computer Assisted Test Selection System in Local Governments

    Directory of Open Access Journals (Sweden)

    Abdul Azis Basri

    2016-05-01

    Full Text Available As an evaluative selection system for civil servants in all areas of government, the Computer Assisted Test (CAT) selection system was first applied in 2013. During its first nationwide implementation in 2014, the selection system ran into trouble in several areas, for example with the registration procedure and the passing grade. The main objective of this essay is to describe the implementation of the new selection system for civil servants in local governments and to assess the effectiveness of this selection system. The essay combines a literature study with a field survey; data were collected through interviews, observations, and documentation from various sources, and analyzed through data reduction, data display, and verification to draw conclusions. The results show that, although a few parts of the system were problematic, such as the registration phase, almost all phases of the implementation of the CAT selection system in local government areas, namely preparation, implementation, and result processing, worked well. The system also fulfilled two of the three effectiveness criteria for a selection system, namely accuracy and trustworthiness. Therefore, this selection system can be considered an effective way to select new civil servants. As a suggestion, local governments should prepare thoroughly for all phases of the test, establish a good feedback mechanism for evaluation, and work together with the central government to identify, fix, and improve supporting infrastructure and the competency of local residents.

  8. Attitudes and gender differences of high school seniors within one-to-one computing environments in South Dakota

    Science.gov (United States)

    Nelson, Mathew

    In today's age of exponential change and technological advancement, awareness of any gender gap in technology and computer-science-related fields is crucial, but further research must be done to better understand the complex interacting factors contributing to the gender gap. This study used a survey to investigate specific gender differences relating to computing self-efficacy, computer usage, and the environmental factors of exposure, personal interests, and parental influence that affect gender differences among high school students within a one-to-one computing environment in South Dakota. The population who completed the One-to-One High School Computing Survey for this study consisted of South Dakota high school seniors who had been involved in a one-to-one computing environment for two or more years. The survey data were analyzed using descriptive and inferential statistics for the determined variables. From the review of literature and the data analysis, several conclusions were drawn. Among them are that, overall, there was very little difference in perceived computing self-efficacy and computing anxiety between male and female students within the one-to-one computing initiative. The study supported the current research that males and females used computers similarly, although males spent more time using their computers to play online games. Early exposure to computers, or the age at which a student was first exposed to a computer, and the number of computers present in the home (computer ownership) affected computing self-efficacy. The results also indicated that parental encouragement to work with computers contributed positively to both male and female students' computing self-efficacy. Finally, the study found that both mothers and fathers encouraged their male children more than their female children to work with computers and to pursue careers in computing science fields.

  9. Observation Likelihood Model Design and Failure Recovery Scheme toward Reliable Localization of Mobile Robots

    Directory of Open Access Journals (Sweden)

    Chang-bae Moon

    2011-01-01

    Full Text Available Although there has been much research on mobile robot localization, it is still difficult to obtain reliable localization performance in a real environment co-inhabited by humans. The reliability of localization is highly dependent on the developer's experience, because uncertainty arises for a variety of reasons. We have developed a range-sensor-based integrated localization scheme for various indoor service robots. Through this experience, we identified several significant practical issues. In this paper, we provide useful solutions to the following questions, which frequently arise in practical applications: (1) How should an observation likelihood model be designed? (2) How can localization failure be detected? (3) How can the system recover from localization failure? We present design guidelines for the observation likelihood model. Localization failure detection and recovery schemes are presented with a focus on abrupt wheel slippage. Experiments were carried out in a typical office building environment. The proposed scheme for identifying the localizer status is useful in practical environments. Moreover, semi-global localization is a computationally efficient scheme for recovering from localization failure. The results of the experiments and analysis clearly demonstrate the usefulness of the proposed solutions.
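As an illustration of the first question above (how to design an observation likelihood model), a common starting point for range-sensor Monte Carlo localization is a Gaussian "hit" term around the expected range, mixed with a uniform term that absorbs unmodeled objects such as passers-by. The parameter values below are hypothetical defaults for illustration, not the model from the paper:

```python
import math

def beam_likelihood(measured, expected, sigma=0.05, z_hit=0.9, z_rand=0.1, z_max=10.0):
    """p(z | x) for one beam: Gaussian around the expected range plus a
    uniform floor over [0, z_max] for unmodeled obstacles."""
    gauss = math.exp(-0.5 * ((measured - expected) / sigma) ** 2) \
            / (sigma * math.sqrt(2 * math.pi))
    return z_hit * gauss + z_rand / z_max

def scan_likelihood(scan, expected_scan):
    """Whole-scan likelihood under the usual per-beam independence assumption."""
    p = 1.0
    for z, z_exp in zip(scan, expected_scan):
        p *= beam_likelihood(z, z_exp)
    return p
```

The uniform floor is what keeps a single stray reading (a person walking through the beam) from driving a particle's weight to zero, which is one reason such mixtures behave better than a pure Gaussian in human co-existing environments.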

  10. Observation Likelihood Model Design and Failure Recovery Scheme Toward Reliable Localization of Mobile Robots

    Directory of Open Access Journals (Sweden)

    Chang-bae Moon

    2010-12-01

    Full Text Available Although there has been much research on mobile robot localization, it is still difficult to obtain reliable localization performance in a real environment co-inhabited by humans. The reliability of localization is highly dependent on the developer's experience, because uncertainty arises for a variety of reasons. We have developed a range-sensor-based integrated localization scheme for various indoor service robots. Through this experience, we identified several significant practical issues. In this paper, we provide useful solutions to the following questions, which frequently arise in practical applications: (1) How should an observation likelihood model be designed? (2) How can localization failure be detected? (3) How can the system recover from localization failure? We present design guidelines for the observation likelihood model. Localization failure detection and recovery schemes are presented with a focus on abrupt wheel slippage. Experiments were carried out in a typical office building environment. The proposed scheme for identifying the localizer status is useful in practical environments. Moreover, semi-global localization is a computationally efficient scheme for recovering from localization failure. The results of the experiments and analysis clearly demonstrate the usefulness of the proposed solutions.

  11. Local extinction and reignition of the flame; Liekin paikallinen sammuminen ja uudelleen syttyminen

    Energy Technology Data Exchange (ETDEWEB)

    Kjaeldman, L. [VTT Energia, Espoo (Finland); Brink, A. [Aabo Akademi, Turku (Finland)

    1996-12-01

    A model of local flame extinction and reignition, suitable for use in computational fluid dynamics analysis of primarily multi-burner furnaces, is developed. The model is implemented in Ardemus, the computational environment of VTT and Imatran Voima Oy, and tested against well-defined experiments. The model makes the simulation of the near-burner processes, in particular, more realistic. (author)

  12. Offloading Method for Efficient Use of Local Computational Resources in Mobile Location-Based Services Using Clouds

    Directory of Open Access Journals (Sweden)

    Yunsik Son

    2017-01-01

    Full Text Available With the development of mobile computing, location-based services (LBSs) have been developed to provide services based on location information obtained through communication networks or the global positioning system. In recent years, LBSs have evolved into smart LBSs, which provide many services using only location information, including basic services such as traffic, logistics, and entertainment services. However, a smart LBS may require relatively complicated operations, which may not be performed effectively by a mobile computing system. To overcome this problem, a computation offloading technique can be used to move certain tasks from mobile devices to cloud and fog environments. Furthermore, mobile platforms exist that provide smart LBSs. The smart cross-platform is a solution based on a virtual machine (VM) that enables compatibility of content across various mobile and smart device environments. However, owing to the nature of the VM-based execution method, execution performance is degraded compared to that of native execution. In this paper, we introduce a computation offloading technique that utilizes fog computing to improve the performance of VMs running on mobile devices. We applied the proposed method to smart devices with a smart VM (SVM) and an HTML5 SVM to compare their performances.
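The offloading trade-off described here is often captured by a simple inequality: offload a task when the remote computation time plus the transfer time is smaller than the local computation time. A minimal sketch, with all parameter names and magnitudes purely illustrative:

```python
def should_offload(cycles, local_speed, remote_speed, data_bytes, bandwidth, rtt=0.05):
    """Offload when remote execution plus transfer beats local execution.
    Illustrative units: cycles in CPU cycles, speeds in cycles/s,
    bandwidth in bytes/s, rtt (round-trip latency) in seconds."""
    t_local = cycles / local_speed
    t_remote = rtt + data_bytes / bandwidth + cycles / remote_speed
    return t_remote < t_local
```

The rule of thumb this encodes is that compute-heavy, data-light tasks benefit from offloading, while data-heavy tasks with little computation are better executed locally; a fog node mainly improves the `rtt` and `bandwidth` terms relative to a distant cloud.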

  13. Primary assembly of soil communities: disentangling the effect of dispersal and local environment.

    Science.gov (United States)

    Ingimarsdóttir, María; Caruso, Tancredi; Ripa, Jörgen; Magnúsdóttir, Olöf Birna; Migliorini, Massimo; Hedlund, Katarina

    2012-11-01

    It has long been recognised that dispersal abilities and environmental factors are important in shaping invertebrate communities, but their relative importance for primary soil community assembly has not yet been disentangled. By studying soil communities along chronosequences on four recently emerged nunataks (ice-free land in glacial areas) in Iceland, we replicated environmental conditions spatially at various geographical distances. This allowed us to determine the underlying factors of primary community assembly with the help of metacommunity theories that predict different levels of dispersal constraints and effects of the local environment. Comparing community assembly of the nunataks with that of non-isolated deglaciated areas indicated that isolation of a few kilometres did not affect the colonisation of the soil invertebrates. When accounting for effects of geographical distances, soil age and plant richness explained a significant part of the variance observed in the distribution of the oribatid mites and collembola communities, respectively. Furthermore, null model analyses revealed less co-occurrence than expected by chance and also convergence in the body size ratio of co-occurring oribatids, which is consistent with species sorting. Geographical distances influenced species composition, indicating that the community is also assembled by dispersal, e.g. mass effect. When all the results are linked together, they demonstrate that local environmental factors are important in structuring the soil community assembly, but are accompanied with effects of dispersal that may "override" the visible effect of the local environment.

  14. Daily Megavoltage Computed Tomography in Lung Cancer Radiotherapy: Correlation Between Volumetric Changes and Local Outcome

    International Nuclear Information System (INIS)

    Bral, Samuel; De Ridder, Mark; Duchateau, Michael; Gevaert, Thierry; Engels, Benedikt; Schallier, Denis; Storme, Guy

    2011-01-01

    Purpose: To assess the predictive and comparative value of volumetric changes measured on daily megavoltage computed tomography during radiotherapy for lung cancer. Patients and Methods: We included 80 patients with locally advanced non-small-cell lung cancer treated with image-guided intensity-modulated radiotherapy. The radiotherapy was combined with concurrent chemotherapy, combined with induction chemotherapy, or given as primary treatment. Patients entered two parallel studies with moderately hypofractionated radiotherapy. Tumor volume contouring was done on the daily acquired images. A regression coefficient was derived from the volumetric changes on megavoltage computed tomography, and its predictive value was validated. Logarithmic or polynomial fits were applied to the intratreatment changes to compare the different treatment schedules radiobiologically. Results: Regardless of the treatment type, a high regression coefficient during radiotherapy predicted a significantly prolonged cause-specific local progression-free survival (p = 0.05). Significant differences were found in the response during radiotherapy. The significant difference in volumetric treatment response between radiotherapy with concurrent chemotherapy and radiotherapy plus induction chemotherapy translated into superior long-term local progression-free survival for concurrent chemotherapy (p = 0.03). An enhancement ratio of 1.3 was measured for the platinum/taxane doublet used, in comparison with radiotherapy alone. Conclusion: Contouring on daily megavoltage computed tomography images during radiotherapy enabled us to predict the efficacy of a given treatment. The significant differences in volumetric response between treatment strategies make it a possible tool for future schedule comparison.

  15. Effects of feedback in a computer-based learning environment on students’ learning outcomes: a meta-analysis

    NARCIS (Netherlands)

    van der Kleij, Fabienne; Feskens, Remco C.W.; Eggen, Theodorus Johannes Hendrikus Maria

    2015-01-01

    In this meta-analysis, we investigated the effects of methods for providing item-based feedback in a computer-based environment on students’ learning outcomes. From 40 studies, 70 effect sizes were computed, which ranged from −0.78 to 2.29. A mixed model was used for the data analysis. The results

  16. ReputationPro: The Efficient Approaches to Contextual Transaction Trust Computation in E-Commerce Environments

    OpenAIRE

    Zhang, Haibin; Wang, Yan; Zhang, Xiuzhen; Lim, Ee-Peng

    2013-01-01

    In e-commerce environments, the trustworthiness of a seller is utterly important to potential buyers, especially when the seller is unknown to them. Most existing trust evaluation models compute a single value to reflect the general trust level of a seller without taking any transaction context information into account. In this paper, we first present a trust vector consisting of three values for Contextual Transaction Trust (CTT). In the computation of three CTT values, the identified three ...

  17. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    Science.gov (United States)

    Lee, Young-Jin

    2017-01-01

    Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…

  18. Glimpsing the imprint of local environment on the galaxy stellar mass function

    Science.gov (United States)

    Tomczak, Adam R.; Lemaux, Brian C.; Lubin, Lori M.; Gal, Roy R.; Wu, Po-Feng; Holden, Bradford; Kocevski, Dale D.; Mei, Simona; Pelliccia, Debora; Rumbaugh, Nicholas; Shen, Lu

    2017-12-01

    We investigate the impact of local environment on the galaxy stellar mass function (SMF) spanning a wide range of galaxy densities from the field up to dense cores of massive galaxy clusters. Data are drawn from a sample of eight fields from the Observations of Redshift Evolution in Large-Scale Environments (ORELSE) survey. Deep photometry allow us to select mass-complete samples of galaxies down to 109 M⊙. Taking advantage of >4000 secure spectroscopic redshifts from ORELSE and precise photometric redshifts, we construct three-dimensional density maps between 0.55 environmental dependence in the SMFs of star-forming and quiescent galaxies, although not quite as strongly for the quiescent subsample. To characterize the connection between the SMF of field galaxies and that of denser environments, we devise a simple semi-empirical model. The model begins with a sample of ≈106 galaxies at zstart = 5 with stellar masses distributed according to the field. Simulated galaxies then evolve down to zfinal = 0.8 following empirical prescriptions for star-formation, quenching and galaxy-galaxy merging. We run the simulation multiple times, testing a variety of scenarios with differing overall amounts of merging. Our model suggests that a large number of mergers are required to reproduce the SMF in dense environments. Additionally, a large majority of these mergers would have to occur in intermediate density environments (e.g. galaxy groups).

  19. Analysis of computational complexity for HT-based fingerprint alignment algorithms on java card environment

    CSIR Research Space (South Africa)

    Mlambo, CS

    2015-01-01

    Full Text Available In this paper, implementations of three Hough Transform based fingerprint alignment algorithms are analyzed with respect to time complexity on Java Card environment. Three algorithms are: Local Match Based Approach (LMBA), Discretized Rotation Based...

  20. Localized Ambient Solidity Separation Algorithm Based Computer User Segmentation

    Science.gov (United States)

    Sun, Xiao; Zhang, Tongda; Chai, Yueting; Liu, Yi

    2015-01-01

    Most popular clustering methods make strong assumptions about the dataset. For example, k-means implicitly assumes that all clusters come from spherical Gaussian distributions that have different means but the same covariance. However, when dealing with datasets that have diverse distribution shapes or high dimensionality, these assumptions may no longer be valid. To overcome this weakness, we proposed a new clustering algorithm named localized ambient solidity separation (LASS), which uses a new isolation criterion called centroid distance. Compared with other density-based isolation criteria, the proposed centroid distance isolation criterion addresses the problems caused by high dimensionality and varying density. An experiment on a purpose-built two-dimensional benchmark dataset shows that the proposed LASS algorithm not only inherits the advantage of the original dissimilarity-increments clustering method in separating naturally isolated clusters but can also identify clusters that are adjacent, overlapping, or under background noise. Finally, we compared our LASS algorithm with the dissimilarity-increments clustering method on a massive computer-user dataset of over two million records containing demographic and behavioral information. The results show that the LASS algorithm works extremely well on this dataset and can extract more knowledge from it. PMID:26221133

  1. Scheduling Method of Data-Intensive Applications in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Xiong Fu

    2015-01-01

    Full Text Available The virtualization underlying cloud computing improves the utilization of resources and energy, and a cloud user can deploy his/her own applications and related data on a pay-as-you-go basis. The communications between an application and a data storage node, as well as within the application, have a great impact on the execution efficiency of the application. The locations of the subtasks of an application, and of the data transferred between those subtasks, are the main sources of communication delay, which can affect the completion time of the application. In this paper, we take into account the data transmission time and the communications between subtasks and propose a heuristic optimal virtual machine (VM) placement algorithm. Simulations demonstrate that this algorithm can reduce the completion time of user tasks and ensure the feasibility and effectiveness of the overall network performance of applications running in a cloud computing environment.
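The paper's heuristic is not spelled out in this abstract, but the idea of placing VMs to reduce communication between subtasks can be illustrated with a simple greedy sketch; all the data structures, names, and cost figures below are hypothetical:

```python
def greedy_place(subtasks, hosts, traffic, cost_per_byte):
    """Greedily place each subtask's VM on the host that minimizes the
    communication cost to subtasks already placed.
    traffic[a][b]       -> bytes exchanged between subtasks a and b
    cost_per_byte[h][g] -> cost of moving one byte from host h to host g
                           (0 when h == g)."""
    placement = {}
    for task in subtasks:
        def comm_cost(host):
            return sum(traffic.get(task, {}).get(other, 0) * cost_per_byte[host][oh]
                       for other, oh in placement.items())
        placement[task] = min(hosts, key=comm_cost)  # ties go to the first host
    return placement
```

A heuristic like this naturally co-locates chatty subtasks on the same host, driving the dominant transfer terms in the completion time toward zero; a production placer would additionally respect host capacity and data-node locations.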

  2. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    Science.gov (United States)

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To avoid the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt an automated reverse-engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, two important issues must be addressed: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and to speed up the computation, the mechanism of cloud computing is a promising solution; the most popular approach is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms that infer large gene networks. This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results analyzed. They show that our parallel approach can successfully be used to infer networks with the desired behaviors, and that the computation time can be greatly reduced. Parallel population-based algorithms can effectively determine network parameters, and they perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel-model population-based optimization method with the parallel computational framework, high…
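The abstract does not give the details of the hybrid GA-PSO method, but the general pattern, a PSO velocity update combined with GA-style crossover and mutation applied to the worst individuals, can be sketched as follows. Function names and parameters are illustrative, and the toy objective is a sphere function rather than a gene-network fitting error:

```python
import random

def sphere(x):
    """Toy objective standing in for a network-fitting error; minimum at 0."""
    return sum(v * v for v in x)

def hybrid_ga_pso(dim=3, n=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=sphere)
    for _ in range(iters):
        # PSO step: move every particle toward its personal and global best.
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if sphere(pos[i]) < sphere(pbest[i]):
                pbest[i] = pos[i][:]
        # GA step: rebuild the worst half from mutated crossovers of the best half.
        order = sorted(range(n), key=lambda i: sphere(pbest[i]))
        for bad in order[n // 2:]:
            a, b = rng.sample(order[:n // 2], 2)
            child = [pbest[a][d] if rng.random() < 0.5 else pbest[b][d]
                     for d in range(dim)]
            child = [v + rng.gauss(0, 0.1) for v in child]  # mutation
            if sphere(child) < sphere(pbest[bad]):
                pbest[bad] = child
                pos[bad] = child[:]
        gbest = min(pbest, key=sphere)
    return gbest
```

In a MapReduce parallelization along the lines the paper describes, the per-particle evaluation and update would run in the map phase, with a reduce phase collecting the global best before the next generation.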

  3. Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic.

    Science.gov (United States)

    Sanduja, S; Jewell, P; Aron, E; Pharai, N

    2015-09-01

    Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources to expedite analysis and reporting. Cloud-based computing environments can be set up with a fraction of the time and effort required by traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic.

  4. Modeling and performance analysis for composite network–compute service provisioning in software-defined cloud environments

    Directory of Open Access Journals (Sweden)

    Qiang Duan

    2015-08-01

    Full Text Available The crucial role of networking in Cloud computing calls for a holistic vision of both networking and computing systems that leads to composite network–compute service provisioning. Software-Defined Network (SDN is a fundamental advancement in networking that enables network programmability. SDN and software-defined compute/storage systems form a Software-Defined Cloud Environment (SDCE that may greatly facilitate composite network–compute service provisioning to Cloud users. Therefore, networking and computing systems need to be modeled and analyzed as composite service provisioning systems in order to obtain thorough understanding about service performance in SDCEs. In this paper, a novel approach for modeling composite network–compute service capabilities and a technique for evaluating composite network–compute service performance are developed. The analytic method proposed in this paper is general and agnostic to service implementation technologies; thus is applicable to a wide variety of network–compute services in SDCEs. The results obtained in this paper provide useful guidelines for federated control and management of networking and computing resources to achieve Cloud service performance guarantees.
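One standard way to reason about the performance of concatenated network and compute services, in the spirit of this paper, is the latency-rate server model from network calculus: composing servers with rates R_i and latencies T_i yields a latency-rate server with rate min(R_i) and latency sum(T_i), and a (sigma, rho) token-bucket flow then sees a worst-case delay of T + sigma/R. A minimal sketch (the use of this specific model here is my illustration, not necessarily the paper's exact technique):

```python
def composite_service(servers):
    """Concatenate latency-rate servers given as (rate, latency) pairs:
    the end-to-end service is again latency-rate, with the minimum rate
    and the summed latency."""
    rate = min(r for r, _ in servers)
    latency = sum(t for _, t in servers)
    return rate, latency

def delay_bound(sigma, rho, servers):
    """Worst-case delay of a (sigma, rho) token-bucket flow across the
    composite service; valid only when rho does not exceed the rate."""
    rate, latency = composite_service(servers)
    assert rho <= rate, "flow rate exceeds composite service capacity"
    return latency + sigma / rate
```

Because the bound depends only on the (rate, latency) pairs, it is agnostic to whether a stage is an SDN network segment or a compute node, which matches the paper's goal of technology-independent analysis.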

  5. Local environment of zirconium in nuclear gels studied by XAS

    International Nuclear Information System (INIS)

    Pelegrin, E.; Ildefonse, Ph.; Calas, G.; Ricol, St.; Flank, A.M.

    1997-01-01

    During lixiviation experiments, nuclear gels are formed and heavy metals are retained. In order to understand this retardation mechanism, we analyzed the local environment of Zr in parent glasses and in the derived alteration gels, at both the Zr L(II,III) and Zr K edges. Calibration of the method was performed through the analysis of model compounds with known coordination numbers (CN): catapleiite Na2ZrSi3O9·2H2O (CN = 6), baddeleyite ZrO2 (CN = 7) and zircon ZrSiO4 (CN = 8). Nuclear glasses (R7T7 and a simplified nuclear glass V1) and the gels obtained at 90 deg C, with leaching times from 7 to 12 months and with solution renewal (GR7T7R and GV1), were also investigated. Zr L(II,III) XANES spectra showed that zirconium is 6-fold coordinated in the R7T7 and V1 nuclear glasses. For the GR7T7R and GV1 gels, the Zr local environment is significantly changed, and a mixture of coordination numbers (6 and 7) is evidenced. Quantitative structural results were derived from EXAFS analysis at the Zr K edge. In the parent glasses, the derived Zr-O distance is 2.10 ± 0.01 x 10^-10 m, which is in the range of Zr-O distances for octahedral coordination in the model compounds. In both gels studied, the Zr-O distance increases significantly, up to 2.15 ± 0.01 x 10^-10 m. This distance is close to that known in baddeleyite (2.158 x 10^-10 m). A better understanding of the Zr retention mechanism will require studying the second-neighbor contributions. (authors)

  6. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved. PMID:24078906

  7. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Weizhe Zhang

    2013-01-01

    Full Text Available Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved.

  8. Secure encapsulation and publication of biological services in the cloud computing environment.

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved.

  9. Hypercompetitive Environments: An Agent-based model approach

    Science.gov (United States)

    Dias, Manuel; Araújo, Tanya

    Information technology (IT) environments are characterized by complex changes and rapid evolution. Globalization and the spread of technological innovation have increased the need for new strategic information resources, both for individual firms and for management environments. Improvements in multidisciplinary methods and, particularly, the availability of powerful computational tools are giving researchers an increasing opportunity to investigate management environments in their true complex nature. The adoption of a complex systems approach allows for modeling business strategies from a bottom-up perspective — understood as resulting from repeated and local interaction of economic agents — without disregarding the consequences of the business strategies themselves for the individual behavior of enterprises and the emergence of interaction patterns between firms and management environments. Agent-based models are at the leading edge of this approach.
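A minimal agent-based sketch of such bottom-up local interaction might look as follows; the ring topology, imitation rule, and payoff function are all hypothetical illustrations, not the authors' model:

```python
import random

def simulate(n=30, steps=50, seed=2):
    """Firms on a ring; each step, every firm imitates (with noise) the
    strategy of its best-performing neighbor. The payoff is a toy
    single-peaked function whose optimum (0.6) no firm knows."""
    rng = random.Random(seed)
    payoff = lambda s: -(s - 0.6) ** 2
    strat = [rng.random() for _ in range(n)]
    for _ in range(steps):
        nxt = strat[:]
        for i in range(n):
            neigh = [i, (i - 1) % n, (i + 1) % n]
            best = max(neigh, key=lambda j: payoff(strat[j]))
            nxt[i] = strat[best] + rng.gauss(0, 0.01)  # imitation plus noise
        strat = nxt
    return strat
```

Even this toy exhibits the emergent behavior the paragraph describes: good strategies spread through purely local imitation, and the population converges near the unknown optimum without any firm having a global view.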

  10. 1990 CERN School of Computing

    International Nuclear Information System (INIS)

    1991-01-01

    These Proceedings contain written versions of lectures delivered at the 1990 CERN School of Computing, covering a variety of topics. Computer networks are treated in three papers: standards in computer networking; evolution of local and metropolitan area networks; asynchronous transfer mode, the solution for broadband ISDN. Data acquisition and analysis are the topic of papers on: data acquisition using MODEL software; graphical event analysis. Two papers in the field of signal processing treat digital image processing and the use of digital signal processors in HEP. Another paper reviews the present state of digital optical computing. Operating systems and programming discipline are covered in two papers: UNIX, evolution towards distributed systems; new developments in program verification. Three papers treat miscellaneous topics: computer security within the CERN environment; numerical simulation in fluid mechanics; fractals. An introduction to transputers and Occam gives an account of the tutorial lectures given at the School. (orig.)

  11. A vector-product information retrieval system adapted to heterogeneous, distributed computing environments

    Science.gov (United States)

    Rorvig, Mark E.

    1991-01-01

    Vector-product information retrieval (IR) systems produce retrieval results superior to all other searching methods but presently have no commercial implementations beyond the personal computer environment. The NASA Electronic Library System (NELS) provides a ranked list of the most likely relevant objects in collections in response to a natural language query. Additionally, the system is constructed using standards and tools (Unix, X Windows, Motif, and TCP/IP) that permit its operation in organizations that possess many different hosts, workstations, and platforms. There are no known commercial equivalents to this product at this time. The product has applications in all corporate management environments, particularly those that are information intensive, such as finance, manufacturing, biotechnology, and research and development.

  12. A Secure Framework for Location Verification in Pervasive Computing

    Science.gov (United States)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The way people use computing devices has been changed in some ways by the relatively new pervasive computing paradigm. For example, a person can use a mobile device to obtain location information at any time and anywhere. Several security issues concern whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, or may impersonate other users by eavesdropping on their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework, VerPer, for secure location verification in a pervasive computing environment. Real-world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.

  13. The Plant-Window system: A flexible, expandable computing environment for the integration of power plant activities

    International Nuclear Information System (INIS)

    Wood, R.T.; Mullens, J.A.; Naser, J.A.

    1994-01-01

    Power plant data, and the information that can be derived from it, provide the link to the plant through which the operations, maintenance and engineering staff understand and manage plant performance. The increasing use of computer technology in the US nuclear power industry has greatly expanded the capability to obtain, analyze, and present data about the plant to station personnel. However, it is necessary to transform the vast quantity of available data into clear, concise, and coherent information that can be readily accessed and used throughout the plant. This need can be met by an integrated computer workstation environment that provides the necessary information and software applications, in a manner that can be easily understood and used, to the proper users throughout the plant. As part of a Cooperative Research and Development Agreement with the Electric Power Research Institute, the Oak Ridge National Laboratory has developed functional requirements for a Plant-Wide Integrated Environment Distributed on Workstations (Plant-Window) System. The Plant-Window System (PWS) can serve the needs of operations, engineering, and maintenance personnel at nuclear power stations by providing integrated data and software applications (e.g., monitoring, analysis, diagnosis, and control applications) within a common environment. The PWS requirements identify functional capabilities and provide guidelines for standardized hardware, software, and display interfaces to define a flexible computer environment that permits a tailored implementation of workstation capabilities and facilitates future upgrades.

  14. Improving science and mathematics education with computational modelling in interactive engagement environments

    Science.gov (United States)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  15. Development of computational infrastructure to support hyper-resolution large-ensemble hydrology simulations from local-to-continental scales

    Data.gov (United States)

    National Aeronautics and Space Administration — Development of computational infrastructure to support hyper-resolution large-ensemble hydrology simulations from local-to-continental scales A move is currently...

  16. An analytically based numerical method for computing view factors in real urban environments

    Science.gov (United States)

    Lee, Doo-Il; Woo, Ju-Wan; Lee, Sang-Hyun

    2018-01-01

    A view factor is an important morphological parameter used in parameterizing the in-canyon radiative energy exchange process as well as in characterizing local climate over urban environments. For a realistic representation of the in-canyon radiative processes, a complete set of view factors at the horizontal and vertical surfaces of urban facets is required. Various analytical and numerical methods have been suggested to determine the view factors for urban environments, but most of the methods provide only the sky-view factor at the ground level of a specific location or assume a simplified morphology of complex urban environments. In this study, a numerical method that can determine the sky-view factors (ψ_ga and ψ_wa) and wall-view factors (ψ_gw and ψ_ww) at the horizontal and vertical surfaces is presented for application to real urban morphology; it is derived from an analytical formulation of the view factor between two blackbody surfaces of arbitrary geometry. The established numerical method is validated against the analytical sky-view factor estimation for ideal street canyon geometries, showing confidence in its accuracy with errors of less than 0.2%. Using a three-dimensional building database, the numerical method is also demonstrated to be applicable in determining the sky-view factors at the horizontal (roofs and roads) and vertical (walls) surfaces in real urban environments. The results suggest that the analytically based numerical method can be used for the radiative process parameterization of urban numerical models as well as for the characterization of local urban climate.
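As a hedged illustration of the kind of validation described above (the paper's own analytical formulation is not reproduced here), the sketch below checks the textbook analytical sky-view factor at the floor centre of an infinitely long street canyon, SVF = 1/sqrt(1 + (2H/W)^2), against a Monte Carlo estimate using cosine-weighted hemisphere sampling. The function name and the idealized canyon model are assumptions made for this example only.

```python
import numpy as np

def svf_canyon_mc(h_over_w, n=200_000, seed=1):
    """Monte Carlo sky-view factor at the floor centre of an infinite street canyon."""
    rng = np.random.default_rng(seed)
    u1, u2 = rng.random(n), rng.random(n)
    r, phi = np.sqrt(u1), 2.0 * np.pi * u2        # cosine-weighted hemisphere sampling
    x = r * np.cos(phi)                           # cross-canyon direction component
    z = np.sqrt(1.0 - u1)                         # vertical direction component
    # A ray escapes the canyon if it clears the wall tops: z / |x| > 2H/W.
    return np.mean(z > 2.0 * h_over_w * np.abs(x))

h_over_w = 1.0
analytic = 1.0 / np.sqrt(1.0 + (2.0 * h_over_w) ** 2)   # cos(arctan(2H/W)), about 0.447
mc = svf_canyon_mc(h_over_w)
```

For an aspect ratio H/W = 1 the analytical value is roughly 0.447, and the Monte Carlo estimate agrees to within its sampling error, mirroring the sub-0.2% validation errors the study reports for ideal canyons.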

  17. Computed tomography-guided cryoablation of local recurrence after primary resection of pancreatic adenocarcinoma

    Directory of Open Access Journals (Sweden)

    Claudio Pusceddu

    2015-06-01

    Full Text Available The optimal management of local recurrences after primary resection of pancreatic cancer still remains to be clarified. A 58-year-old woman developed an isolated recurrence of pancreatic cancer six years after distal pancreatectomy. Re-resection was attempted, but the lesion was deemed unresectable at surgery. Chemotherapy was then administered without obtaining a reduction of the tumor size or an improvement of the patient’s symptoms. The patient therefore underwent percutaneous cryoablation under computed tomography (CT) guidance, obtaining tumor necrosis and a significant improvement in the quality of life. A CT scan one month later showed a stable lesion with no contrast enhancement. While the use of percutaneous cryoablation has widened its applications in patients with unresectable pancreatic cancer, it has never been described for the treatment of local pancreatic cancer recurrence after primary resection. Percutaneous cryoablation deserves further study in the multimodality treatment of local recurrence after primary pancreatic surgery.

  18. Encountering the Expertise Reversal Effect with a Computer-Based Environment on Electrical Circuit Analysis

    Science.gov (United States)

    Reisslein, Jana; Atkinson, Robert K.; Seeling, Patrick; Reisslein, Martin

    2006-01-01

    This study examined the effectiveness of a computer-based environment employing three example-based instructional procedures (example-problem, problem-example, and fading) to teach series and parallel electrical circuit analysis to learners classified by two levels of prior knowledge (low and high). Although no differences between the…

  19. Landmark based localization in urban environment

    Science.gov (United States)

    Qu, Xiaozhi; Soheilian, Bahman; Paparoditis, Nicolas

    2018-06-01

    A landmark based localization with uncertainty analysis based on cameras and geo-referenced landmarks is presented in this paper. The system is developed to adapt different camera configurations for six degree-of-freedom pose estimation. Local bundle adjustment is applied for optimization and the geo-referenced landmarks are integrated to reduce the drift. In particular, the uncertainty analysis is taken into account. On the one hand, we estimate the uncertainties of poses to predict the precision of localization. On the other hand, uncertainty propagation is considered for matching, tracking and landmark registering. The proposed method is evaluated on both KITTI benchmark and the data acquired by a mobile mapping system. In our experiments, decimeter level accuracy can be reached.

  20. Local pursuit strategy-inspired cooperative trajectory planning algorithm for a class of nonlinear constrained dynamical systems

    Science.gov (United States)

    Xu, Yunjun; Remeikas, Charles; Pham, Khanh

    2014-03-01

    Cooperative trajectory planning is crucial for networked vehicles to respond rapidly in cluttered environments and has a significant impact on many applications such as air traffic or border security monitoring and assessment. One of the challenges in cooperative planning is to find a computationally efficient algorithm that can accommodate both the complexity of the environment and real hardware and configuration constraints of vehicles in the formation. Inspired by a local pursuit strategy observed in foraging ants, feasible and optimal trajectory planning algorithms are proposed in this paper for a class of nonlinear constrained cooperative vehicles in environments with densely populated obstacles. In an iterative hierarchical approach, the local behaviours, such as the formation stability, obstacle avoidance, and individual vehicle's constraints, are considered in each vehicle's (i.e. follower's) decentralised optimisation. The cooperative-level behaviours, such as the inter-vehicle collision avoidance, are considered in the virtual leader's centralised optimisation. Early termination conditions are derived to reduce the computational cost by not wasting time in the local-level optimisation if the virtual leader trajectory does not satisfy those conditions. The expected advantages of the proposed algorithms are (1) the formation can be globally asymptotically maintained in a decentralised manner; (2) each vehicle decides its local trajectory using only the virtual leader and its own information; (3) the formation convergence speed is controlled by one single parameter, which makes it attractive for many practical applications; (4) nonlinear dynamics and many realistic constraints, such as the speed limitation and obstacle avoidance, can be easily considered; (5) inter-vehicle collision avoidance can be guaranteed in both the formation transient stage and the formation steady stage; and (6) the computational cost in finding both the feasible and optimal

  1. Review of computational thermal-hydraulic modeling

    International Nuclear Information System (INIS)

    Keefer, R.H.; Keeton, L.W.

    1995-01-01

    Corrosion of heat transfer tubing in nuclear steam generators has been a persistent problem in the power generation industry, assuming many different forms over the years depending on chemistry and operating conditions. Whatever the corrosion mechanism, a fundamental understanding of the process is essential to establish effective management strategies. To gain this fundamental understanding requires an integrated investigative approach that merges technology from many diverse scientific disciplines. An important aspect of an integrated approach is characterization of the corrosive environment at high temperature. This begins with a thorough understanding of local thermal-hydraulic conditions, since they affect deposit formation, chemical concentration, and ultimately corrosion. Computational Fluid Dynamics (CFD) can and should play an important role in characterizing the thermal-hydraulic environment and in predicting the consequences of that environment. The evolution of CFD technology now allows accurate calculation of steam generator thermal-hydraulic conditions and the resulting sludge deposit profiles. Similar calculations are also possible for model boilers, so that tests can be designed to be prototypic of the heat exchanger environment they are supposed to simulate. This paper illustrates the utility of CFD technology by way of examples in each of these two areas. This technology can be further extended to produce more detailed local calculations of the chemical environment in support plate crevices, beneath thick deposits on tubes, and deep in tubesheet sludge piles. Knowledge of this local chemical environment will provide the foundation for development of mechanistic corrosion models, which can be used to optimize inspection and cleaning schedules and focus the search for a viable fix.

  2. Ubiquitous Green Computing Techniques for High Demand Applications in Smart Environments

    Directory of Open Access Journals (Sweden)

    Jose M. Moya

    2012-08-01

    Full Text Available Ubiquitous sensor network deployments, such as the ones found in Smart cities and Ambient intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the usage of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures. However, supercomputing facilities are the ones presenting a higher economic and environmental impact due to their very high power consumption. The latter problem, however, has been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce the energy consumed by the whole infrastructure and the total execution time.

  3. Ubiquitous green computing techniques for high demand applications in Smart environments.

    Science.gov (United States)

    Zapater, Marina; Sanchez, Cesar; Ayala, Jose L; Moya, Jose M; Risco-Martín, José L

    2012-01-01

    Ubiquitous sensor network deployments, such as the ones found in Smart cities and Ambient intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the usage of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures. However, supercomputing facilities are the ones presenting a higher economic and environmental impact due to their very high power consumption. The latter problem, however, has been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce the energy consumed by the whole infrastructure and the total execution time.
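The abstract does not spell out the assignment technique, so the following is only a hypothetical greedy sketch of the general idea: place each task on the cheapest node (lowest energy per compute unit) that still has spare capacity, so that low-demand tasks drain to idle WSN nodes before the high-power facility is used. The `Node` model, energy costs, and task names are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    capacity: float          # compute units the node can host
    joules_per_unit: float   # energy cost per compute unit on this node
    load: float = 0.0

def assign(tasks, nodes):
    """Greedily place each task on the cheapest node with spare capacity."""
    placement, energy = {}, 0.0
    for task, demand in sorted(tasks.items(), key=lambda kv: kv[1]):
        for node in sorted(nodes, key=lambda n: n.joules_per_unit):
            if node.load + demand <= node.capacity:
                node.load += demand
                placement[task] = node.name
                energy += demand * node.joules_per_unit
                break
        else:
            raise RuntimeError(f"no node has capacity for {task}")
    return placement, energy

# A small low-power idle node versus a large, power-hungry facility.
nodes = [Node("wsn-idle", capacity=4.0, joules_per_unit=1.0),
         Node("datacenter", capacity=100.0, joules_per_unit=5.0)]
placement, energy = assign({"t1": 1.0, "t2": 2.0, "t3": 10.0}, nodes)
```

Here the two low-demand tasks land on the idle node and only the heavy task falls back to the data center, reducing total energy relative to running everything centrally.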

  4. The link between poverty, environment and development. The political challenge of localizing Agenda 21.

    Science.gov (United States)

    Wichmann, R

    1995-11-01

    This article discusses the links between poverty, development, the environment, and implementing Agenda 21. The poor in large cities experience greater health risks and threats from environmental hazards. The poor also face inadequate housing, poor sanitation, polluted drinking water, and lack of other basic services. Many poor live in marginalized areas more susceptible to environmental degradation. During 1990-2030, population size may reach 9.7 billion, or 3.7 billion more than today. 90% may be urban residents. Already a large proportion of urban population live in a decaying urban environment with health and life threatening conditions. At least 250 million do not have easy access to safe piped water. 400 million lack proper sanitation. The liberalization of the global economy is fueling urbanization. The cycle of poverty and environmental decline requires rapid economic growth and closing of the infrastructure gaps. Policy initiatives of Agenda 21 occur at the local urban level. At this level, policies directly affect people. The future success of Agenda 21 will depend on local initiatives. Management approaches may need to change in order to achieve sustainable development. The poor will be more vocal and heard from in the future. Critical areas of management include waste management, pollution control, traffic, transportation, energy, economic development, and job creation. Society must be able to participate in setting priorities. About 1500 local authorities are involved in Agenda 21 planning initiatives. Curitiba, Brazil, is an example of how cities can solve community problems.

  5. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    Full Text Available A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed computing environments connected by a local network to form a grid-based cluster-to-cluster distributed computing environment. To perform the simulation successfully, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters, connected together in a multi-level hierarchy, and then coordinated over the internet. The software framework for supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to demonstrate the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing the simulation of multi-scale structural analysis.

  6. Secure Enclaves: An Isolation-centric Approach for Creating Secure High Performance Computing Environments

    Energy Technology Data Exchange (ETDEWEB)

    Aderholdt, Ferrol [Tennessee Technological Univ., Cookeville, TN (United States); Caldwell, Blake A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hicks, Susan Elaine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Koch, Scott M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Naughton, III, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pelfrey, Daniel S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pogge, James R [Tennessee Technological Univ., Cookeville, TN (United States); Scott, Stephen L [Tennessee Technological Univ., Cookeville, TN (United States); Shipman, Galen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sorrillo, Lawrence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

    High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows to name just a few. These systems may process data at various security levels but in so doing are often enclaved at the highest security posture. This approach places significant restrictions on the users of the system even when processing data at a lower security level and exposes data at higher levels of confidentiality to a much broader population than otherwise necessary. The traditional approach of isolation, while effective in establishing security enclaves poses significant challenges for the use of shared infrastructure in HPC environments. This report details current state-of-the-art in virtualization, reconfigurable network enclaving via Software Defined Networking (SDN), and storage architectures and bridging techniques for creating secure enclaves in HPC environments.

  7. Aeroflex Single Board Computers and Instrument Circuit Cards for Nuclear Environments Measuring and Monitoring

    International Nuclear Information System (INIS)

    Stratton, Sam; Stevenson, Dave; Magnifico, Mateo

    2013-06-01

    A Single Board Computer (SBC) is an entire computer, including all of the required components and I/O interfaces, built on a single circuit board. SBCs are used across numerous industrial, military, and space flight applications. In the case of military and space implementations, SBCs employ advanced high-reliability processors designed for rugged thermal, mechanical, and even radiation environments. These processors, in turn, rely on equally advanced support components such as memory, interface, and digital logic. When all of these components are put together on a printed circuit card, the result is a highly reliable Single Board Computer that can perform a wide variety of tasks in very harsh environments. In the area of instrumentation, peripheral circuit cards can be developed that directly interface to the SBC and to various radiation measuring devices and systems. Designers use signal conditioning and high-reliability Analog-to-Digital Converters (ADCs) to convert the measuring device signals to digital data suitable for a microprocessor. The data can then be sent to the SBC via high-speed communication protocols such as Ethernet or a similar type of serial bus. Data received by the SBC can then be manipulated and processed into a form readily available to users. Recent events are causing some in the NPP industry to consider devices and systems with better radiation and temperature performance capability. Systems designed for space application are built for the harsh environment of space, which under certain conditions would be similar to what the electronics will see during a severe nuclear reactor event. The NPP industry should be considering higher reliability electronics for certain critical applications. (authors)

  8. Development of a computational environment for the General Curvilinear Ocean Model

    International Nuclear Information System (INIS)

    Thomas, Mary P; Castillo, Jose E

    2009-01-01

    The General Curvilinear Ocean Model (GCOM) differs significantly from the traditional approach, where the use of Cartesian coordinates forces the model to simulate terrain as a series of steps. GCOM utilizes a full three-dimensional curvilinear transformation, which has been shown to have greater accuracy than similar models and to achieve results more efficiently. The GCOM model has been validated for several types of water bodies, different coastlines and bottom shapes, including the Alarcon Seamount, Southern California Coastal Region, the Valencia Lake in Venezuela, and more recently the Monterey Bay. In this paper, enhancements to the GCOM model and an overview of the computational environment (GCOM-CE) are presented. Model improvements include migration from F77 to F90; approach to a component design; and initial steps towards parallelization of the model. Through the use of the component design, new models are being incorporated including biogeochemical, pollution, and sediment transport. The computational environment is designed to allow various client interactions via secure Web applications (portal, Web services, and Web 2.0 gadgets). Features include building jobs, managing and interacting with long running jobs; managing input and output files; quick visualization of results; publishing of Web services to be used by other systems such as larger climate models. The CE is based mainly on Python tools including a grid-enabled Pylons Web application Framework for Web services, pyWSRF (python-Web Services-Resource Framework), pyGlobus based web services, SciPy, and Google code tools.

  9. An Algorithm Computing the Local $b$ Function by an Approximate Division Algorithm in $\hat{\mathcal{D}}$

    OpenAIRE

    Nakayama, Hiromasa

    2006-01-01

    We give an algorithm to compute the local $b$ function. In this algorithm, we use the Mora division algorithm in the ring of differential operators and an approximate division algorithm in the ring of differential operators with power series coefficients.
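For background (this is the standard definition, not taken from the abstract): the Bernstein–Sato, or $b$, function of a polynomial $f$ is the monic polynomial $b_f(s)$ of least degree for which some differential operator $P(s)$ satisfies the functional equation below; the local $b$ function at a point is defined analogously with germs of $f$ and of differential operators at that point.

```latex
% Functional equation defining the b function b_f(s),
% with \mathcal{D} the ring of differential operators:
P(s)\, f^{\,s+1} = b_f(s)\, f^{\,s}, \qquad P(s) \in \mathcal{D}[s].
```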

  10. TMX-U computer system in evolution

    International Nuclear Information System (INIS)

    Casper, T.A.; Bell, H.; Brown, M.; Gorvad, M.; Jenkins, S.; Meyer, W.; Moller, J.; Perkins, D.

    1986-01-01

    Over the past three years, the total TMX-U diagnostic data base has grown to exceed 10 megabytes from over 1300 channels, roughly triple the originally designed size. This acquisition and processing load has resulted in an experiment repetition rate exceeding 10 minutes per shot using the five original Hewlett-Packard HP-1000 computers with their shared disks. Our new diagnostics tend to be multichannel instruments, which, in our environment, can be more easily managed using local computers. For this purpose, we are using HP series 9000 computers for instrument control, data acquisition, and analysis. Fourteen such systems are operational, with processed format output exchanged via a shared resource manager. We are presently implementing the necessary hardware and software changes to create a local area network allowing us to combine the data from these systems with our main data archive. The expansion of our diagnostic system using the parallel acquisition and processing concept allows us to increase our data base with a minimum of impact on the experimental repetition rate.

  11. TSaT-MUSIC: a novel algorithm for rapid and accurate ultrasonic 3D localization

    Science.gov (United States)

    Mizutani, Kyohei; Ito, Toshio; Sugimoto, Masanori; Hashizume, Hiromichi

    2011-12-01

    We describe a fast and accurate indoor localization technique using the multiple signal classification (MUSIC) algorithm. The MUSIC algorithm is known as a high-resolution method for estimating directions of arrival (DOAs) or propagation delays. A critical problem in using the MUSIC algorithm for localization is its computational complexity. Therefore, we devised a novel algorithm called Time Space additional Temporal-MUSIC, which can rapidly and simultaneously identify DOAs and delays of multicarrier ultrasonic waves from transmitters. Computer simulations have proved that the computation time of the proposed algorithm is almost constant in spite of increasing numbers of incoming waves and is faster than that of existing methods based on the MUSIC algorithm. The robustness of the proposed algorithm is discussed through simulations. Experiments in real environments showed that the standard deviation of position estimations in 3D space is less than 10 mm, which is satisfactory for indoor localization.
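TSaT-MUSIC itself is not specified in this abstract; as background for readers, here is a minimal sketch of the classic narrowband MUSIC DOA estimator it builds on, applied to a uniform linear array (half-wavelength element spacing and the synthetic scenario are assumptions of this example, not the paper's ultrasonic setup).

```python
import numpy as np

def music_doa(X, n_sources, d=0.5):
    """Estimate source DOAs (degrees) from array snapshots X (sensors x snapshots)."""
    m = X.shape[0]
    grid = np.linspace(-90.0, 90.0, 361)            # 0.5-degree search grid
    R = X @ X.conj().T / X.shape[1]                 # sample covariance matrix
    _, vecs = np.linalg.eigh(R)                     # eigenvalues in ascending order
    En = vecs[:, : m - n_sources]                   # noise-subspace eigenvectors
    p = np.empty(len(grid))
    for i, theta in enumerate(grid):
        a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(np.radians(theta)))
        p[i] = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2   # MUSIC pseudospectrum
    # Keep local maxima only, then take the n_sources largest peaks.
    peaks = [i for i in range(1, len(grid) - 1) if p[i] > p[i - 1] and p[i] > p[i + 1]]
    peaks = sorted(peaks, key=lambda i: p[i])[-n_sources:]
    return np.sort(grid[peaks])

# Synthetic check: two uncorrelated sources at -20 and 30 degrees, 8 sensors.
rng = np.random.default_rng(0)
m, n_snap = 8, 200
true_doas = np.array([-20.0, 30.0])
A = np.exp(-2j * np.pi * 0.5 * np.outer(np.arange(m), np.sin(np.radians(true_doas))))
S = rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))
noise = 0.1 * (rng.standard_normal((m, n_snap)) + 1j * rng.standard_normal((m, n_snap)))
est = music_doa(A @ S + noise, 2)
```

The grid scan over 361 candidate angles is exactly the computational cost the abstract refers to: it grows with grid resolution and array size, which motivates faster variants such as the proposed algorithm.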

  12. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Science.gov (United States)

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  13. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Directory of Open Access Journals (Sweden)

    Bruno Guazzelli Batista

    Full Text Available Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  14. Study of Propagation Mechanisms in Dynamical Railway Environment to Reduce Computation Time of 3D Ray Tracing Simulator

    Directory of Open Access Journals (Sweden)

    Siham Hairoud

    2013-01-01

    Full Text Available In order to better assess the behaviour of the propagation channel in a confined environment such as a railway tunnel for subway applications, we present an optimization method for a deterministic channel simulator based on 3D ray tracing associated with the laws of geometrical optics and the uniform theory of diffraction. This tool requires a detailed description of the environment. Thus, the complexity of this model is directly tied to the complexity of the environment and specifically to the number of facets that compose it. In this paper, we propose an algorithm to identify facets that have no significant impact on the wave propagation. This allows us to simplify the description of the geometry of the modelled environment by removing them and, in this way, to reduce the complexity of our model and therefore its computation time. A comparative study between the full and simplified environments is conducted and shows the impact of the proposed method on the characteristic parameters of the propagation channel. The computation time obtained with the simplified environment is 6 times lower than that of the full model, without significant degradation of simulation accuracy.

  15. CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment

    Science.gov (United States)

    Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.

    2017-12-01

    Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters, or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances, or advanced AWS services such as Amazon Lambda and ECS (EC2 Container Services).

  16. Computer modeling of the dynamics of surface tension on rotating fluids in low and microgravity environments

    Science.gov (United States)

    Hung, R. J.; Tsao, Y. D.; Hong, B. B.; Leslie, Fred W.

    1989-01-01

    Time-dependent evolutions of the profile of the free surface (bubble shapes) for a cylindrical container partially filled with a Newtonian fluid of constant density, rotating about its axis of symmetry, have been studied. Numerical computations have been carried out with the following situations: (1) linear functions of spin-up and spin-down in low- and microgravity environments, (2) linear functions of increasing and decreasing gravity environments at high- and low-rotating cylinder speeds, and (3) step functions of spin-up and spin-down in a low-gravity environment.
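
For reference, the classical equilibrium shape of the free surface of a rigidly rotating liquid under axial gravity, with surface tension neglected, is a standard textbook result and is not taken from this study:

```latex
% Paraboloidal free surface of a liquid in rigid rotation at rate \Omega:
z(r) = z_0 + \frac{\Omega^2 r^2}{2g}
% Relative importance of gravity versus capillarity for container radius R:
\mathrm{Bo} = \frac{\rho g R^2}{\sigma}
% As \mathrm{Bo} \to 0 (microgravity), surface tension dominates and the
% static free surface approaches a spherical cap set by the contact angle.
```

The time-dependent computations described in the abstract track how the surface evolves between such limiting shapes as gravity level and spin rate are varied.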

  17. Improving Quality of Service and Reducing Power Consumption with WAN accelerator in Cloud Computing Environments

    OpenAIRE

    Shin-ichi Kuribayashi

    2013-01-01

    The widespread use of cloud computing services is expected to degrade the Quality of Service and to increase the power consumption of ICT devices, since the distance to a server becomes longer than before. Migration of virtual machines over a wide area can solve many problems such as load balancing and power saving in cloud computing environments. This paper proposes to dynamically apply WAN accelerator within the network when a virtual machine is moved to a distant center, in order to preve...

  18. Local body cooling to improve sleep quality and thermal comfort in a hot environment.

    Science.gov (United States)

    Lan, L; Qian, X L; Lian, Z W; Lin, Y B

    2018-01-01

    The effects of local body cooling on thermal comfort and sleep quality in a hot environment were investigated in an experiment with 16 male subjects. Sleep quality was evaluated subjectively, using questionnaires completed in the morning, and objectively, by analysis of electroencephalogram (EEG) signals that were continuously monitored during the sleeping period. Compared with no cooling, the largest improvement in thermal comfort and sleep quality was observed when the back and head (neck) were both cooled at a room temperature of 32°C. Back cooling alone also improved thermal comfort and sleep quality, although the effects were less than when cooling both back and head (neck). Mean sleep efficiency was improved from 84.6% in the no cooling condition to 95.3% and 92.8%, respectively, in these conditions, indicating good sleep quality. Head (neck) cooling alone slightly improved thermal comfort and subjective sleep quality and increased Stage N3 sleep, but did not otherwise improve sleep quality. The results show that local cooling applied to large body sections (back and head) could effectively maintain good sleep and improve thermal comfort in a hot environment. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. HEPLIB '91: International users meeting on the support and environments of high energy physics computing

    International Nuclear Information System (INIS)

    Johnstad, H.

    1991-01-01

    The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards

  20. Relationship between x-ray emission and absorption spectroscopy and the local H-bond environment in water

    Science.gov (United States)

    Zhovtobriukh, Iurii; Besley, Nicholas A.; Fransson, Thomas; Nilsson, Anders; Pettersson, Lars G. M.

    2018-04-01

    The connection between specific features in the water X-ray absorption spectrum and X-ray emission spectrum (XES) and the local H-bond coordination is studied based on structures obtained from path-integral molecular dynamics simulations using either the opt-PBE-vdW density functional or the MB-pol force field. Computing the XES spectrum using all molecules in a snapshot results in only one peak in the lone-pair (1b1) region, while the experiment shows two peaks separated by 0.8-0.9 eV. Different H-bond configurations were classified based on the local structure index (LSI) and a geometrical H-bond cone criterion. We find that tetrahedrally coordinated molecules characterized by high LSI values and two strong donated and two strong accepted H-bonds contribute to the low energy 1b1 emission peak and to the post-edge region in absorption. Molecules with the asymmetric H-bond environment with one strong accepted H-bond and one strong donated H-bond and low LSI values give rise to the high energy 1b1 peak in the emission spectrum and mainly contribute to the pre-edge and main-edge in the absorption spectrum. The 1b1 peak splitting can be increased to 0.62 eV by imposing constraints on the H-bond length, i.e., for very tetrahedral structures short H-bonds (less than 2.68 Å) and for very asymmetric structures elongated H-bonds (longer than 2.8 Å). Such structures are present, but underrepresented, in the simulations which give more of an average of the two extremes.
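
The local structure index used here to classify H-bond configurations can be sketched as follows, assuming the Shiratani-Sasai definition with the conventional 3.7 Å cutoff; the distance sets at the bottom are hypothetical values for illustration:

```python
import numpy as np

def local_structure_index(neighbor_dists, cutoff=3.7):
    """LSI of one molecule from its O-O neighbor distances (Angstrom).

    Shiratani-Sasai definition: the variance of the gaps between
    successive sorted neighbor distances, taken up to the first
    neighbor beyond the cutoff. Large LSI means a clear separation
    between first and second coordination shells (tetrahedral order).
    """
    r = np.sort(np.asarray(neighbor_dists, float))
    n = int(np.searchsorted(r, cutoff))   # neighbors inside the cutoff
    gaps = np.diff(r[: n + 1])            # gaps r_{j+1} - r_j, j = 1..n
    return float(np.mean((gaps - gaps.mean()) ** 2))

# Hypothetical distance sets:
tetrahedral = [2.70, 2.72, 2.74, 2.76, 4.00, 4.05]  # clear inter-shell gap
disordered = [2.70, 2.95, 3.20, 3.45, 3.80, 4.05]   # evenly filled region
```

A molecule with a well-separated first shell yields a much larger LSI than one whose neighbor distances fill the region up to the cutoff uniformly, which is exactly the high-LSI/low-LSI split the abstract uses.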

  1. Relationship between x-ray emission and absorption spectroscopy and the local H-bond environment in water.

    Science.gov (United States)

    Zhovtobriukh, Iurii; Besley, Nicholas A; Fransson, Thomas; Nilsson, Anders; Pettersson, Lars G M

    2018-04-14

    The connection between specific features in the water X-ray absorption spectrum and X-ray emission spectrum (XES) and the local H-bond coordination is studied based on structures obtained from path-integral molecular dynamics simulations using either the opt-PBE-vdW density functional or the MB-pol force field. Computing the XES spectrum using all molecules in a snapshot results in only one peak in the lone-pair (1b1) region, while the experiment shows two peaks separated by 0.8-0.9 eV. Different H-bond configurations were classified based on the local structure index (LSI) and a geometrical H-bond cone criterion. We find that tetrahedrally coordinated molecules characterized by high LSI values and two strong donated and two strong accepted H-bonds contribute to the low energy 1b1 emission peak and to the post-edge region in absorption. Molecules with the asymmetric H-bond environment with one strong accepted H-bond and one strong donated H-bond and low LSI values give rise to the high energy 1b1 peak in the emission spectrum and mainly contribute to the pre-edge and main-edge in the absorption spectrum. The 1b1 peak splitting can be increased to 0.62 eV by imposing constraints on the H-bond length, i.e., for very tetrahedral structures short H-bonds (less than 2.68 Å) and for very asymmetric structures elongated H-bonds (longer than 2.8 Å). Such structures are present, but underrepresented, in the simulations which give more of an average of the two extremes.

  2. The New Learning Ecology of One-to-One Computing Environments: Preparing Teachers for Shifting Dynamics and Relationships

    Science.gov (United States)

    Spires, Hiller A.; Oliver, Kevin; Corn, Jenifer

    2012-01-01

    Despite growing research and evaluation results on one-to-one computing environments, how these environments affect learning in schools remains underexamined. The purpose of this article is twofold: (a) to use a theoretical lens, namely a new learning ecology, to frame the dynamic changes as well as challenges that are introduced by a one-to-one…

  3. Accurate and Integrated Localization System for Indoor Environments Based on IEEE 802.11 Round-Trip Time Measurements

    Directory of Open Access Journals (Sweden)

    Alfonso Bahillo

    2010-01-01

    Full Text Available The presence of Non-Line-of-Sight (NLOS) propagation paths has been considered the main drawback for localization schemes estimating the position of a Mobile User (MU) in an indoor environment. This paper presents a comprehensive wireless localization system based on Round-Trip Time (RTT) measurements in an unmodified IEEE 802.11 wireless network. It overcomes the NLOS impairment by implementing the Prior NLOS Measurements Correction (PNMC) technique. First, the RTT measurements are performed with a novel electronic circuit, avoiding the need for time synchronization between wireless nodes. Second, the distance between the MU and each reference device is estimated using a simple linear regression function that best relates RTT to distance in Line of Sight (LOS); since assuming LOS in an indoor environment is a simplification of reality, the PNMC technique is applied to correct the NLOS effect. Third, with the positions of the reference devices known, a multilateration technique is implemented to obtain the MU position. Finally, the localization system coupled with measurements demonstrates that the system outperforms conventional time-based indoor localization schemes without using any tracking technique such as Kalman filters or Bayesian methods.
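
The distance-then-multilaterate pipeline described above can be sketched as follows. The linear RTT-to-distance mapping and the linearized least-squares multilateration are generic textbook steps, not the paper's exact PNMC implementation, and the regression coefficients and anchor coordinates are made-up values:

```python
import numpy as np

def rtt_to_distance(rtt, slope, intercept):
    """Map a round-trip time to a distance via a LOS-calibrated linear fit.

    slope and intercept would come from regressing measured RTT against
    known LOS distances; the values used by a caller are hypothetical.
    """
    return slope * rtt + intercept

def multilaterate(anchors, dists):
    """Least-squares position from anchor positions and distance estimates.

    Linearizes the circle equations |p - a_i|^2 = d_i^2 by subtracting
    the last anchor's equation from the others.
    """
    anchors = np.asarray(anchors, float)
    dists = np.asarray(dists, float)
    ref = anchors[-1]
    A = 2.0 * (anchors[:-1] - ref)
    b = (dists[-1] ** 2 - dists[:-1] ** 2
         + np.sum(anchors[:-1] ** 2, axis=1) - np.sum(ref ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Four reference devices at the corners of a 10 m x 10 m room (made up)
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
true_pos = np.array([3.0, 4.0])
dists = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
est = multilaterate(anchors, dists)
```

With exact distances the linear system is consistent and the estimate recovers the true position; in practice the NLOS-corrected distance estimates carry noise, which the least-squares step averages out.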

  4. Computer local construction of a general solution for the Chew-Low equations

    International Nuclear Information System (INIS)

    Gerdt, V.P.

    1980-01-01

    The general solution of the dynamic form of the Chew-Low equations in the vicinity of the rest point is considered. A method for calculating the coefficients of the series that constitute such a solution is suggested. The results of calculations, coefficients of power series and expansions, carried out by means of the SCHOONSCHIP and SYMBAL systems are given. It is noted that the suggested procedure for solving the Chew-Low equations, based on using an electronic computer as an instrument for analytical calculations, permits detailed information on the local structure of the general solution to be obtained.

  5. General rigid motion correction for computed tomography imaging based on locally linear embedding

    Science.gov (United States)

    Chen, Mianyi; He, Peng; Feng, Peng; Liu, Baodong; Yang, Qingsong; Wei, Biao; Wang, Ge

    2018-02-01

    Patient motion can degrade the quality of computed tomography images, which are typically acquired in cone-beam geometry. Rigid patient motion is characterized by six geometric parameters and is more challenging to correct in cone-beam than in fan-beam geometry. We extend our previous rigid patient motion correction method based on the principle of locally linear embedding (LLE) from fan-beam to cone-beam geometry and accelerate the computational procedure with the graphics processing unit (GPU)-based all scale tomographic reconstruction Antwerp toolbox. The major merit of our method is that we need neither fiducial markers nor motion-tracking devices. The numerical and experimental studies show that the LLE-based patient motion correction is capable of calibrating the six parameters of the patient motion simultaneously, reducing patient motion artifacts significantly.
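
The reconstruction-weight step at the core of LLE, sketched here for a single sample, is the generic algorithm; the neighbor coordinates and the regularization constant are illustrative assumptions, not the authors' calibration code:

```python
import numpy as np

def lle_weights(x, neighbors, reg=1e-3):
    """Reconstruction weights of sample x from its neighbors (LLE step).

    Solves min ||x - sum_j w_j n_j||^2 subject to sum_j w_j = 1,
    using the standard trick of solving G w = 1 and normalizing.
    """
    Z = neighbors - x                         # shift x to the origin
    G = Z @ Z.T                               # local Gram matrix
    G = G + reg * np.trace(G) * np.eye(len(G))  # regularize (G may be singular)
    w = np.linalg.solve(G, np.ones(len(G)))
    return w / w.sum()

# x lies in the affine span of three neighbors (made-up coordinates)
neighbors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
x = np.array([0.25, 0.25])
w = lle_weights(x, neighbors)
```

The weights sum to one and reconstruct x from its neighbors; LLE's premise, exploited by the correction method, is that such local linear relations are preserved under the low-dimensional (here, motion-parameter) embedding.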

  6. The relationship of the local food environment with obesity: A systematic review of methods, study quality, and results.

    Science.gov (United States)

    Cobb, Laura K; Appel, Lawrence J; Franco, Manuel; Jones-Smith, Jessica C; Nur, Alana; Anderson, Cheryl A M

    2015-07-01

    To examine the relationship between local food environments and obesity and assess the quality of studies reviewed. Systematic keyword searches identified studies from US and Canada that assessed the relationship of obesity to local food environments. We applied a quality metric based on design, exposure and outcome measurement, and analysis. We identified 71 studies representing 65 cohorts. Overall, study quality was low; 60 studies were cross-sectional. Associations between food outlet availability and obesity were predominantly null. Among non-null associations, we saw a trend toward inverse associations between supermarket availability and obesity (22 negative, 4 positive, 67 null) and direct associations between fast food and obesity (29 positive, 6 negative, 71 null) in adults. We saw direct associations between fast food availability and obesity in lower income children (12 positive, 7 null). Indices including multiple food outlets were most consistently associated with obesity in adults (18 expected, 1 not expected, 17 null). Limiting to higher quality studies did not affect results. Despite the large number of studies, we found limited evidence for associations between local food environments and obesity. The predominantly null associations should be interpreted cautiously due to the low quality of available studies. © 2015 The Obesity Society.

  7. The relationship of the local food environment with obesity: A systematic review of methods, study quality and results

    Science.gov (United States)

    Cobb, Laura K; Appel, Lawrence J; Franco, Manuel; Jones-Smith, Jessica C; Nur, Alana; Anderson, Cheryl AM

    2015-01-01

    Objective To examine the relationship between local food environments and obesity and assess the quality of studies reviewed. Methods Systematic keyword searches identified studies from US and Canada that assessed the relationship of obesity to local food environments. We applied a quality metric based on design, exposure and outcome measurement, and analysis. Results We identified 71 studies representing 65 cohorts. Overall, study quality was low; 60 studies were cross-sectional. Associations between food outlet availability and obesity were predominantly null. Among non-null associations, we saw a trend toward inverse associations between supermarket availability and obesity (22 negative, 4 positive, 67 null) and direct associations between fast food and obesity (29 positive, 6 negative, 71 null) in adults. We saw direct associations between fast food availability and obesity in lower income children (12 positive, 7 null). Indices including multiple food outlets were most consistently associated with obesity in adults (18 expected, 1 not expected, 17 null). Limiting to higher quality studies did not affect results. Conclusions Despite the large number of studies, we found limited evidence for associations between local food environments and obesity. The predominantly null associations should be interpreted cautiously due to the low quality of available studies. PMID:26096983

  8. Epidemic spreading in localized environments with recurrent mobility patterns

    Science.gov (United States)

    Granell, Clara; Mucha, Peter J.

    2018-05-01

    The spreading of epidemics is very much determined by the structure of the contact network, which may be impacted by the mobility dynamics of the individuals themselves. In confined scenarios where a small, closed population spends most of its time in localized environments and has easily identifiable mobility patterns—such as workplaces, university campuses, or schools—it is of critical importance to identify the factors controlling the rate of disease spread. Here, we present a discrete-time, metapopulation-based model to describe the transmission of susceptible-infected-susceptible-like diseases that take place in confined scenarios where the mobilities of the individuals are not random but, rather, follow clear recurrent travel patterns. This model allows analytical determination of the onset of epidemics, as well as the ability to discern which contact structures are most suited to prevent the infection to spread. It thereby determines whether common prevention mechanisms, as isolation, are worth implementing in such a scenario and their expected impact.
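
A minimal discrete-time SIS metapopulation step of the kind described can be sketched as follows; the contact matrix and rates are illustrative parameters, not values from the paper:

```python
import numpy as np

def sis_step(p_inf, contact, beta, mu):
    """One step of a discrete-time SIS metapopulation model.

    p_inf[i]     : probability that a member of group i is infected
    contact[i,j] : fraction of group i's contacts made with group j
                   (rows sum to 1, encoding recurrent travel patterns)
    beta, mu     : per-contact infection and recovery probabilities
    """
    # probability that a susceptible in group i escapes infection this step
    q = np.prod(1.0 - beta * contact * p_inf, axis=1)
    return (1.0 - mu) * p_inf + (1.0 - p_inf) * (1.0 - q)

# Two groups sharing a workplace (hypothetical mixing pattern)
C = np.array([[0.7, 0.3],
              [0.3, 0.7]])
p = np.full(2, 0.1)
for _ in range(500):
    p = sis_step(p, C, beta=0.6, mu=0.2)   # above-threshold regime
```

Iterating the map to a fixed point yields the endemic prevalence; lowering beta below the epidemic threshold instead drives the infection probabilities to zero, which is the onset condition such models determine analytically.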

  9. Computation Offloading Algorithm for Arbitrarily Divisible Applications in Mobile Edge Computing Environments: An OCR Case

    Directory of Open Access Journals (Sweden)

    Bo Li

    2018-05-01

    Full Text Available Divisible applications are a class of tasks whose loads can be partitioned into some smaller fractions, and each part can be executed independently by a processor. A wide variety of divisible applications have been found in the area of parallel and distributed processing. This paper addresses the problem of how to partition and allocate divisible applications to available resources in mobile edge computing environments with the aim of minimizing the completion time of the applications. A theoretical model was proposed for partitioning an entire divisible application according to the load of the application and the capabilities of available resources, and the solutions were derived in closed form. Both simulations and real experiments were carried out to justify this model.
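
When communication costs are negligible, the classical divisible-load condition that all processors finish simultaneously reduces to allocating load in proportion to processing speed; a minimal sketch (the total load and speeds are illustrative, not the paper's closed-form model, which also accounts for edge-resource constraints):

```python
def partition_load(total_load, speeds):
    """Split a divisible load so all processors finish at the same time.

    With negligible communication cost, equal finish times require
    load_i / speed_i to be constant, i.e. load_i proportional to speed_i.
    """
    total_speed = sum(speeds)
    return [total_load * s / total_speed for s in speeds]

# 900 work units across three processors with relative speeds 1:2:3
parts = partition_load(900.0, [1.0, 2.0, 3.0])
```

Each processor then takes the same time (here 150 time units at unit speed), which is the optimality condition minimizing overall completion time.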

  10. Investigations of the local environment and macroscopic alignment behavior of novel polymerizeable lyotropic liquid crystals using nuclear magnetic resonance

    Science.gov (United States)

    Juang, Elizabeth

    In this dissertation, a variety of NMR techniques were used to explore the local environment of novel polymerizeable lyotropic liquid crystals (LLC). The LLC monomers examined in this study self-assemble in the presence of a small amount of water to form uniform, nanometer-scale tubes with aqueous interiors. The phase architecture is retained upon photopolymerization to yield the resulting nanoporous material. By dissolving reactive precursors into the aqueous phase, well-structured nanocomposite materials have also been formed. Proposed uses for these novel polymerizeable LLCs are as porous water filtration membranes, as heterogeneous organic catalysts, and as nanocomposite materials for load-bearing and optical applications. In order to better exploit these polymerizeable LLCs for materials development, the local environment must be examined. In addition, the macroscopic orientation of these materials remains an important step in their advancement. Various NMR studies were conducted on these novel LLCs. NMR T1 relaxation measurements were conducted to elucidate the local environment and dynamics of the 23Na counterions located inside the aqueous channels. 2H NMR line shape analyses were used to characterize the local structure and dynamics near the hydrophilic headgroup. 29Si NMR studies were performed on silica nanocomposites formed with these LLC structures. Finally, the macroscopic alignment behavior of these novel LLCs using shear and magnetic fields was examined.

  11. Discovering local patterns of co - evolution: computational aspects and biological examples

    Directory of Open Access Journals (Sweden)

    Tuller Tamir

    2010-01-01

    Full Text Available Abstract. Background: Co-evolution is the process in which two (or more) sets of orthologs exhibit a similar or correlative pattern of evolution. Co-evolution is a powerful way to learn about the functional interdependencies between sets of genes and cellular functions and to predict physical interactions. More generally, it can be used for answering fundamental questions about the evolution of biological systems. Orthologs that exhibit a strong signal of co-evolution in a certain part of the evolutionary tree may show a mild signal of co-evolution in other branches of the tree. The major reasons for this phenomenon are noise in the biological input, genes that gain or lose functions, and the fact that some measures of co-evolution relate to rare events such as positive selection. Previous publications in the field dealt with the problem of finding sets of genes that co-evolved along an entire underlying phylogenetic tree, without considering the fact that often co-evolution is local. Results: In this work, we describe a new set of biological problems that are related to finding patterns of local co-evolution. We discuss their computational complexity and design algorithms for solving them. These algorithms outperform other bi-clustering methods as they are designed specifically for solving the set of problems mentioned above. We use our approach to trace the co-evolution of fungal, eukaryotic, and mammalian genes at high resolution across the different parts of the corresponding phylogenetic trees. Specifically, we discover regions in the fungi tree that are enriched with positive evolution. We show that metabolic genes exhibit a remarkable level of co-evolution and different patterns of co-evolution in various biological datasets. In addition, we find that protein complexes that are related to gene expression exhibit non-homogenous levels of co-evolution across different parts of the fungi evolutionary line. In the case of mammalian evolution

  12. Signal and image processing algorithm performance in a virtual and elastic computing environment

    Science.gov (United States)

    Bennett, Kelly W.; Robertson, James

    2013-05-01

    The U.S. Army Research Laboratory (ARL) supports the development of classification, detection, tracking, and localization algorithms using multiple sensing modalities including acoustic, seismic, E-field, magnetic field, PIR, and visual and IR imaging. Multimodal sensors collect large amounts of data in support of algorithm development. The resulting large amount of data, and the associated high-performance computing needs, increasingly challenges existing computing infrastructures. Purchasing computer power as a commodity using a Cloud service offers low-cost, pay-as-you-go pricing models, scalability, and elasticity that may provide solutions for developing and optimizing algorithms without having to procure additional hardware and resources. This paper provides a detailed look at using a commercial cloud service provider, such as Amazon Web Services (AWS), to develop and deploy simple signal and image processing algorithms in a cloud and run the algorithms on a large set of data archived in the ARL Multimodal Signatures Database (MMSDB). Analytical results will provide performance comparisons with existing infrastructure. A discussion on using cloud computing with government data will cover best security practices that exist within cloud services, such as AWS.

  13. Improving Communicative Competence through Synchronous Communication in Computer-Supported Collaborative Learning Environments: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Xi Huang

    2018-01-01

    Full Text Available Computer-supported collaborative learning facilitates the extension of second language acquisition into social practice. Studies on its achievement effects speak directly to the pedagogical notion of treating communicative practice in synchronous computer-mediated communication (SCMC, real-time communication that takes place between human beings via the instrumentality of computers in forms of text, audio and video communication, such as live chat and chatrooms) as socially-oriented meaning construction. This review begins by considering the adoption of social interactionist views to identify key paradigms and supportive principles of computer-supported collaborative learning. A special focus on two components of communicative competence is then presented to explore interactional variables in synchronous computer-mediated communication, along with a review of research. There follows a discussion of a synthesis of interactional variables in negotiated interaction and co-construction of knowledge from psycholinguistic and social cohesion perspectives. This review reveals both possibilities and disparities of language socialization in promoting intersubjective learning and diversifying the salient use of interactively creative language in computer-supported collaborative learning environments in service of communicative competence.

  14. Sex-specific effects of the local social environment on juvenile post-fledging dispersal in great tits

    NARCIS (Netherlands)

    Michler, Stephanie P. M.; Nicolaus, Marion; Ubels, Richard; van der Velde, Marco; Komdeur, Jan; Both, Christiaan; Tinbergen, Joost M.; Gibson, R.

    2011-01-01

    An individual's decision to disperse from the natal habitat can affect its future fitness prospects. Especially in species with sex-biased dispersal, we expect the cost benefit balance for dispersal to vary according to the social environment (e.g., local sex ratio and density). However, little is

  15. Incorporating Informal Learning Environments and Local Fossil Specimens in Earth Science Classrooms: A Recipe for Success

    Science.gov (United States)

    Clary, Renee M.; Wandersee, James H.

    2009-01-01

    In an online graduate paleontology course taken by practicing Earth Science teachers, we designed an investigation using teachers' local informal educational environments. Teachers (N = 28) were responsible for photographing, describing, and integrating fossil specimens from two informal sites into a paleoenvironmental analysis of the landscape in…

  16. The amount of natural radionuclides in the individual parts of environment in the locality Jahodna

    International Nuclear Information System (INIS)

    Cipakova, A.; Vrabel, V.

    2008-01-01

    In this study we have investigated and evaluated the amounts of K-40, Ra-226, Th-232 and U-238, as well as total alpha and beta activity, in individual parts of the environment, i.e. soil, plants, water and sediment. The studied locality was Jahodna, a prospective source of uranium ore in the Slovak Republic. (authors)

  17. Multislice Computed Tomography Coronary Angiography at a Local Hospital: Pitfalls and Potential

    Energy Technology Data Exchange (ETDEWEB)

    Kolnes, K.; Velle, Ose H.; Hareide, S.; Hegbom, K.; Wiseth, R. [Volda Hospital (Norway). Depts. of Radiology and Internal Medicine

    2006-09-15

    Purpose: To evaluate whether the favorable results achieved with multislice computed tomography (MSCT) of coronary arteries at larger centers could be paralleled at a local hospital. Material and Methods: Fifty consecutive patients with suspected coronary artery disease scheduled for invasive investigation with quantitative coronary angiography (QCA) at a university hospital underwent MSCT with a 16-slice scanner at a local hospital. Diagnostic accuracy of MSCT for coronary artery disease was assessed using a 16-segment coronary artery model with QCA as the gold standard. Results: Segments with diameters below a minimum threshold were excluded from analysis; sensitivity, specificity, and positive and negative predictive values for detecting more than 50% stenosis in the 416 assessable segments were 92%, 82%, 53%, and 98%, respectively. Conclusion: Our beginners' experience demonstrated favorable results regarding sensitivity and negative predictive value. The positive predictive value, however, was unsatisfactory. Calcifications were identified as the most important factor for false-positive results with MSCT. With widespread use of MSCT coronary angiography, there is a risk of recruiting patients without significant coronary artery disease to unnecessary and potentially harmful invasive procedures.
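
Given a 2x2 table of MSCT calls versus QCA ground truth, the four quoted per-segment metrics are computed as below; the counts are hypothetical, chosen only to roughly reproduce the reported percentages, and are not the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among diseased segments
        "specificity": tn / (tn + fp),  # true negatives among healthy segments
        "ppv": tp / (tp + fp),          # how trustworthy a positive call is
        "npv": tn / (tn + fn),          # how trustworthy a negative call is
    }

# Hypothetical per-segment counts for illustration only
m = diagnostic_metrics(tp=46, fp=41, fn=4, tn=187)
```

The pattern in the abstract follows directly from such a table: a low-prevalence disease with many calcification-driven false positives yields a high NPV but a poor PPV, even when sensitivity is high.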

  18. Integrating CAD modules in a PACS environment using a wide computing infrastructure.

    Science.gov (United States)

    Suárez-Cuenca, Jorge J; Tilve, Amara; López, Ricardo; Ferro, Gonzalo; Quiles, Javier; Souto, Miguel

    2017-04-01

    The aim of this paper is to describe a project designed to achieve a total integration of different CAD algorithms into the PACS environment by using a wide computing infrastructure. The goal is to build a system for the entire region of Galicia, Spain, to make CAD accessible to multiple hospitals by employing different PACSs and clinical workstations. The new CAD model seeks to connect different devices (CAD systems, acquisition modalities, workstations and PACS) by means of networking based on a platform that will offer different CAD services. This paper describes some aspects related to the health services of the region where the project was developed, CAD algorithms that were either employed or selected for inclusion in the project, and several technical aspects and results. We have built a standard-based platform with which users can request a CAD service and receive the results in their local PACS. The process runs through a web interface that allows sending data to the different CAD services. A DICOM SR object is received with the results of the algorithms stored inside the original study in the proper folder with the original images. As a result, a homogeneous service to the different hospitals of the region will be offered. End users will benefit from a homogeneous workflow and a standardised integration model to request and obtain results from CAD systems in any modality, not dependent on commercial integration models. This new solution will foster the deployment of these technologies in the entire region of Galicia.

  19. Imaging local cerebral blood flow by xenon-enhanced computed tomography - technical optimization procedures

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, J.S.; Shinohara, T.; Imai, A.; Kobari, M.; Sakai, F.; Hata, T.; Oravez, W.T.; Timpe, G.M.; Deville, T.; Solomon, E.

    1988-08-01

    Methods are described for non-invasive, computer-assisted serial scanning throughout the human brain during eight minutes of inhalation of 27%-30% xenon gas in order to measure local cerebral blood flow (LCBF). Optimized xenon-enhanced computed tomography (XeCT) was achieved by 5-second scanning at one-minute intervals utilizing a state-of-the-art CT scanner and rapid delivery of xenon gas via a face mask. Values for local brain-blood partition coefficients (Lλ) measured in vivo were utilized to calculate LCBF values. Previous methods assumed Lλ values to be normal, introducing the risk of systematic errors, because Lλ values differ throughout normal brain and may be altered by disease. Color-coded maps of Lλ and LCBF values were formatted directly onto CT images for exact correlation of function with anatomic and pathologic observations (spatial resolution: 26.5 cubic mm). Results were compared among eight normal volunteers, aged between 50 and 88 years. Mean cortical gray matter blood flow was 46.3 ± 7.7, for subcortical gray matter it was 50.3 ± 13.2, and for white matter it was 18.8 ± 3.2. Modern CT scanners provide stability, improved signal-to-noise ratio and minimal radiation scatter. Combining these advantages with rapid xenon saturation of the blood provides correlations of Lλ and LCBF with images of normal and abnormal brain in a safe, useful and non-invasive manner.
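The flow computation described above rests on a Kety-Schmidt style uptake model. As an illustrative sketch (not the authors' implementation), the following Python snippet fits both the local flow f and the partition coefficient λ by a brute-force least-squares grid search on a simulated enhancement curve; the constant-arterial-concentration assumption, the units, and all numbers are simplifications:

```python
import math

def tissue_curve(f, lam, ca, times):
    """Kety-Schmidt uptake with constant arterial xenon concentration ca."""
    return [lam * ca * (1.0 - math.exp(-f * t / lam)) for t in times]

def fit_flow(times, measured, ca, flows, lambdas):
    """Brute-force least-squares fit of flow f and partition coefficient lam."""
    best = None
    for f in flows:
        for lam in lambdas:
            model = tissue_curve(f, lam, ca, times)
            err = sum((m - c) ** 2 for m, c in zip(measured, model))
            if best is None or err < best[0]:
                best = (err, f, lam)
    return best[1], best[2]

# One sample per minute over 8 minutes of inhalation (values illustrative).
times = [1, 2, 3, 4, 5, 6, 7, 8]
true_f, true_lam, ca = 0.5, 1.1, 1.0
measured = tissue_curve(true_f, true_lam, ca, times)
f_hat, lam_hat = fit_flow(times, measured, ca,
                          [i / 100 for i in range(10, 101)],
                          [i / 100 for i in range(80, 141)])
```

Fitting λ jointly with f, rather than assuming a normal λ, is exactly the point the abstract makes about avoiding systematic errors; a real implementation would use a proper nonlinear least-squares routine per voxel.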

  20. Social interaction in type 2 diabetes computer-mediated environments: How inherent features of the channels influence peer-to-peer interaction.

    Science.gov (United States)

    Lewinski, Allison A; Fisher, Edwin B

    2016-06-01

    Interventions via the internet provide support to individuals managing chronic illness. The purpose of this integrative review was to determine how the features of a computer-mediated environment influence social interactions among individuals with type 2 diabetes. A combination of MeSH and keyword terms, based on the cognates of three broad groupings: social interaction, computer-mediated environments, and chronic illness, was used to search the PubMed, PsychInfo, Sociology Research Database, and Cumulative Index to Nursing and Allied Health Literature databases. Eleven articles met the inclusion criteria. Computer-mediated environments enhance an individual's ability to interact with peers while increasing the convenience of obtaining personalized support. A matrix, focused on social interaction among peers, identified themes across all articles, and five characteristics emerged: (1) the presence of synchronous and asynchronous communication, (2) the ability to connect with similar peers, (3) the presence or absence of a moderator, (4) personalization of feedback regarding individual progress and self-management, and (5) the ability of individuals to maintain choice during participation. Individuals interact with peers to obtain relevant, situation-specific information and knowledge about managing their own care. Computer-mediated environments facilitate the ability of individuals to exchange this information despite temporal or geographical barriers that may be present, thus improving T2D self-management. © The Author(s) 2015.

  1. Use of X-Ray Absorption Spectra as a ``Fingerprint'' of the Local Environment in Complex Chalcogenides

    Science.gov (United States)

    Branci, C.; Womes, M.; Lippens, P. E.; Olivier-Fourcade, J.; Jumas, J. C.

    2000-03-01

    The local environment of tin, titanium, iron, and sulfur in spinel compounds Cu2FeSn3S8 and Cu2FeTi3S8 was studied by X-ray absorption spectroscopy (XAS) at the titanium, iron, sulfur K edges, and the tin LI-edge. As detailed calculations of the electronic structure of these compounds are difficult to carry out due to the large number of atoms contained in the unit cell, the XAS spectra of the spinels are compared to those of relatively simple binary sulfides like SnS2, TiS2, and FeS. Indeed, the metal environments in these binary compounds are very similar to those in the spinels, and they can be considered good model compounds allowing the interpretation of electronic transitions observed in the spectra of quaternary phases. In the latter, the bottom of the conduction band is mainly formed by Sn 5s-S 3p, Sn 5p-S 3p antibonding states for the tin-based compounds and by Ti 3dt2g-S 3p, Ti 3deg-S 3p antibonding states for the titanium-based compounds. It is shown that the local environment of iron atoms remains unchanged when substituting tin with titanium atoms, according to a topotactic substitution.

  2. McMaster University: College and University Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    The computing and information services (CIS) organization includes administrative computing, academic computing, and networking and has three divisions: computing services, development services, and information services. Other computing activities include Health Sciences, Humanities Computing Center, and Department of Computer Science and Systems.…

  3. Effects Of Social Networking Sites (SNSs) On Hyper Media Computer Mediated Environments (HCMEs)

    OpenAIRE

    Yoon C. Cho

    2011-01-01

    Social Networking Sites (SNSs) are known as tools to interact and build relationships between users/customers in Hyper Media Computer Mediated Environments (HCMEs). This study explored how social networking sites play a significant role in communication between users. While numerous researchers examined the effectiveness of social networking websites, few studies investigated which factors affected customers' attitudes and behavior toward social networking sites. In this paper, the authors inv...

  4. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks

    OpenAIRE

    Devi, D. Chitra; Uthariaraj, V. Rhymend

    2016-01-01

    Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of the nonpreemptive tasks in the cloud computing environment is an irrecoverable restraint and hence it has to be assigned to the most appropriate VMs at the initial placement itself. Practically, the arrived jobs consist of multiple interdependent tasks and they may execute the independent tasks in multiple VMs or in the same VM’s mul...
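The core weighted round robin idea behind such schedulers can be sketched briefly. This is not the paper's improved algorithm (which also handles task dependencies and nonpreemptive placement); it is a minimal Python illustration in which VM weights, hypothetically derived from capacity, determine how many consecutive dispatch slots each VM receives:

```python
from itertools import cycle

def weighted_round_robin(tasks, vm_weights):
    """Dispatch tasks over VMs in proportion to their integer weights."""
    # Expand each VM into as many dispatch slots as its weight, then cycle.
    slots = [vm for vm, weight in vm_weights.items() for _ in range(weight)]
    dispatcher = cycle(slots)
    return {task: next(dispatcher) for task in tasks}

# Weights could reflect VM capacity (e.g., relative MIPS); numbers invented.
vm_weights = {"vm1": 3, "vm2": 1}
placement = weighted_round_robin(["t1", "t2", "t3", "t4"], vm_weights)
```

With weights 3:1, vm1 receives three of every four tasks, which is the proportional-sharing behavior weighted round robin is meant to guarantee at initial placement.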

  5. Coevolution Based Adaptive Monte Carlo Localization (CEAMCL)

    Directory of Open Access Journals (Sweden)

    Luo Ronghua

    2008-11-01

    Full Text Available An adaptive Monte Carlo localization algorithm based on coevolution mechanism of ecological species is proposed. Samples are clustered into species, each of which represents a hypothesis of the robot's pose. Since the coevolution between the species ensures that the multiple distinct hypotheses can be tracked stably, the problem of premature convergence when using MCL in highly symmetric environments can be solved. And the sample size can be adjusted adaptively over time according to the uncertainty of the robot's pose by using the population growth model. In addition, by using the crossover and mutation operators in evolutionary computation, intra-species evolution can drive the samples move towards the regions where the desired posterior density is large. So a small size of samples can represent the desired density well enough to make precise localization. The new algorithm is termed coevolution based adaptive Monte Carlo localization (CEAMCL). Experiments have been carried out to prove the efficiency of the new localization algorithm.
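The predict-weight-resample cycle that CEAMCL builds on can be sketched in a few lines. The Python below is a deliberately minimal 1-D MCL step with an adaptively sized sample set (via the effective sample size); the coevolving species clusters, population growth model, and genetic operators of CEAMCL are omitted, and the range-only measurement model is an assumption for illustration:

```python
import math
import random

def mcl_step(particles, motion, measurement, noise=0.5, min_samples=50):
    """One predict-weight-resample cycle of 1-D Monte Carlo localization."""
    # Predict: propagate each sample through a noisy motion model.
    moved = [p + motion + random.gauss(0, noise) for p in particles]
    # Weight: Gaussian likelihood of the position measurement.
    w = [math.exp(-((measurement - p) ** 2) / (2 * noise ** 2)) for p in moved]
    total = sum(w)
    w = [x / total for x in w]
    # Adapt the sample size to pose uncertainty via the effective sample size.
    n_eff = 1.0 / sum(x * x for x in w)
    n_new = max(min_samples, int(2 * n_eff))
    # Resample proportionally to weight.
    return random.choices(moved, weights=w, k=n_new)

random.seed(0)
particles = [random.uniform(0.0, 10.0) for _ in range(200)]
for _ in range(10):
    particles = mcl_step(particles, motion=0.2, measurement=5.0)
estimate = sum(particles) / len(particles)
```

In a symmetric environment a single cloud like this collapses onto one mode prematurely; maintaining separate species-level clouds, as CEAMCL does, is what keeps multiple pose hypotheses alive.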

  6. Students' Perceptions of Computer-Based Learning Environments, Their Attitude towards Business Statistics, and Their Academic Achievement: Implications from a UK University

    Science.gov (United States)

    Nguyen, ThuyUyen H.; Charity, Ian; Robson, Andrew

    2016-01-01

    This study investigates students' perceptions of computer-based learning environments, their attitude towards business statistics, and their academic achievement in higher education. Guided by learning environments concepts and attitudinal theory, a theoretical model was proposed with two instruments, one for measuring the learning environment and…

  7. General approach to the computation of local transport coefficients with finite Larmor effects in the collision contribution

    International Nuclear Information System (INIS)

    Ghendrih, P.

    1986-10-01

    We expand the distribution functions on a basis of Hermite functions and obtain a general scheme to compute the local transport coefficients. The magnetic field dependence due to finite Larmor radius effects during the collision process is taken into account.
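The projection step of such an expansion can be illustrated numerically. The sketch below (an illustration, not the paper's scheme) builds orthonormal Hermite functions in the physicists' convention and computes expansion coefficients by plain trapezoid quadrature; expanding ψ0 itself should recover c0 = 1 and cn = 0 otherwise:

```python
import math

def hermite_poly(n, x):
    """Physicists' Hermite polynomial H_n(x) via the standard recurrence."""
    h0, h1 = 1.0, 2.0 * x
    if n == 0:
        return h0
    for k in range(1, n):
        h0, h1 = h1, 2.0 * x * h1 - 2.0 * k * h0
    return h1

def hermite_fn(n, x):
    """Orthonormal Hermite function psi_n(x) = H_n(x) e^{-x^2/2} / norm."""
    norm = math.sqrt(2.0 ** n * math.factorial(n) * math.sqrt(math.pi))
    return hermite_poly(n, x) * math.exp(-x * x / 2.0) / norm

def coefficient(f, n, lo=-10.0, hi=10.0, steps=2000):
    """Expansion coefficient c_n = integral of f(x) psi_n(x) dx (trapezoid)."""
    dx = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        x = lo + i * dx
        w = 0.5 if i in (0, steps) else 1.0
        total += w * f(x) * hermite_fn(n, x) * dx
    return total

# Sanity check: expanding psi_0 gives c_0 = 1 and c_n = 0 for n > 0.
coeffs = [coefficient(lambda x: hermite_fn(0, x), n) for n in range(3)]
```

In the transport computation the expanded object is a velocity distribution function and the coefficients feed linear systems for the transport coefficients, but the projection machinery is the same.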

  8. The Effects of the Local Environment on Active Galactic Nuclei

    Science.gov (United States)

    Manzer, L. H.; De Robertis, M. M.

    2014-06-01

    There continues to be significant controversy regarding the mechanism(s) responsible for the initiation and maintenance of activity in galactic nuclei. In this paper we will investigate possible environmental triggers of nuclear activity through a statistical analysis of a large sample of galaxy groups. The focus of this paper is to identify active galactic nuclei (AGNs) and other emission-line galaxies in these groups and to compare their frequency with a sample of over 260,000 isolated galaxies from the same catalog. The galaxy groups are taken from the catalog of Yang et al., which contains over 20,000 virialized groups of galaxies. After correcting emission-line equivalent widths for extinction and underlying Balmer stellar absorption, we classify galaxies in the sample using traditional emission-line ratios, while incorporating measurement uncertainties. We find a significantly higher fraction of AGNs in groups compared with the isolated sample. Likewise, a significantly higher fraction of absorption-line galaxies are found in groups, while a higher fraction of star-forming galaxies prefer isolated environments. Within grouped environments, AGNs and star-forming galaxies are found more frequently in small- to medium-richness groups, while absorption-line galaxies prefer groups with larger richnesses. Groups containing only emission-line galaxies have smaller virial radii, velocity dispersions, and masses compared with those containing only absorption-line galaxies. Furthermore, the AGN fraction increases with decreasing distance to the group centroid, independent of galaxy morphology. Using properties obtained from Galaxy Zoo, there is an increased fraction of AGNs within merging systems, unlike star-forming galaxies. These results provide some indication that the local environment does play a role in initiating activity in galactic nuclei, but it is by no means simple or straightforward.
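Classification by "traditional emission-line ratios" is commonly done on the BPT diagram. As one hedged example (the paper's exact criteria are not given in the abstract), the Python sketch below applies the Kewley et al. (2001) maximum-starburst curve to separate AGNs from star-forming galaxies:

```python
import math

def classify_bpt(nii_ha, oiii_hb):
    """Classify a narrow-line galaxy on the [NII]/Ha vs [OIII]/Hb diagram
    using the Kewley et al. (2001) maximum-starburst demarcation."""
    x = math.log10(nii_ha)
    y = math.log10(oiii_hb)
    if x >= 0.47:                      # beyond the curve's asymptote: AGN
        return "AGN"
    # Points above the curve cannot be powered by star formation alone.
    return "AGN" if y > 0.61 / (x - 0.47) + 1.19 else "star-forming"

high_excitation = classify_bpt(nii_ha=1.2, oiii_hb=3.0)
low_excitation = classify_bpt(nii_ha=0.1, oiii_hb=1.0)
```

A study incorporating measurement uncertainties, as this one does, would typically propagate line-flux errors through these log ratios rather than classify on point estimates alone.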

  9. Upscaling ecotourism in Kisumu city and its environs: Local community perspective

    Directory of Open Access Journals (Sweden)

    Patrick Odhiambo HAYOMBE

    2013-07-01

    Full Text Available Kenya’s quest to be among the top ten long-haul tourist destinations globally requires strategic focus, as envisaged in Kenya’s Vision 2030. Ecotourism is emerging as an alternative development path that can enhance environmental conservation, promote preservation of cultural heritage, and provide an alternative source of sustainable livelihood. Alternative livelihoods in ecotourism provide a sustainable development path for Kisumu City and its environs. However, sustainability in ecotourism transformation is a concern: how can the local community be motivated to participate in this venture? This study discerns the significant sustainability factors as perceived by the local community. The objective of the study was to discern the local community’s perception of significant sustainability factors for ecotourism transformation, guided by the research question: What is the local community’s perception of significant sustainability factors for ecotourism transformation? The research design used both qualitative and quantitative research. The qualitative design focused on site-specific analysis of the ecotourism sites of Dunga (Kisumu), Miyandhe (Bondo) and Seka (Kendu Bay). The quantitative research entailed data collection administered through a questionnaire in ecotourism outlets represented by 10 Beach Management Units (BMU) selected through purposive sampling. Principal Component Analysis (PCA) was used to discern the significant sustainability factors for ecotourism transformation. A total of 28 items converted into variables were subjected against 326 respondents in the PCA. The results indicated a total of seven significant sustainability factors: first, willingness to participate in ecotourism ventures; second, upscaling ecotourism initiatives in the neighborhood; third, women and youth empowerment; fourth, youth and women employment in the neighborhood; fifth factor: Natural Artifact

  10. Chemical, Mechanical, and Durability Properties of Concrete with Local Mineral Admixtures under Sulfate Environment in Northwest China.

    Science.gov (United States)

    Nie, Qingke; Zhou, Changjun; Shu, Xiang; He, Qiang; Huang, Baoshan

    2014-05-13

    Across vast Northwest China, the arid desert contains high concentrations of sulfate, chloride, and other chemicals in the ground water, which poses serious challenges to infrastructure construction that routinely utilizes portland cement concrete. Rapid industrialization in the region has been generating huge amounts of mineral admixtures, such as fly ash and slags from energy and metallurgical industries. These industrial by-products would turn into waste materials if not utilized in time. The present study evaluated the suitability of utilizing local mineral admixtures in significant quantities for producing quality concrete mixtures that can withstand the harsh chemical environment without compromising the essential mechanical properties. Comprehensive chemical, mechanical, and durability tests were conducted in the laboratory to characterize the properties of the local cementitious mineral admixtures, cement mortar and portland cement concrete mixtures containing these admixtures. The results from this study indicated that the sulfate resistance of concrete was effectively improved by adding local class F fly ash and slag, or by applying sulfate resistance cement to the mixtures. It is noteworthy that concrete containing local mineral admixtures exhibited much lower permeability (in terms of chloride ion penetration) than ordinary portland cement concrete while retaining the same mechanical properties; whereas concrete mixtures made with sulfate resistance cement had significantly reduced strength and much increased chloride penetration compared with the other mixtures. Hence, the use of local mineral admixtures in Northwest China in concrete mixtures would be beneficial to the performance of concrete, as well as to the protection of environment.

  11. Preferred Air Velocity and Local Cooling Effect of desk fans in warm environments

    DEFF Research Database (Denmark)

    Simone, Angela; Olesen, Bjarne W.

    2013-01-01

    Common experiences, standards, and laboratory studies show that increased air velocity helps to offset warm sensation due to high environmental temperatures. In warm climate regions the opening of windows and the use of desk or ceiling fans are the most common systems to generate increased airflows to compensate for higher environmental temperatures at the expense of no or relatively low energy consumption. When using desk fans, local air movement is generated around the occupant and a certain cooling effect is perceived. The impact of the local air movement generated by different air flow patterns, and the possibility to keep comfortable conditions for the occupants in warm environments, were evaluated in studies with human subjects. In an office-like climatic chamber, the effect of higher air velocity was investigated at room temperatures between 26°C and 34°C and at constant absolute humidity of 12.2 g...

  12. Vision-based map building and trajectory planning to enable autonomous flight through urban environments

    Science.gov (United States)

    Watkins, Adam S.

    ... The local path is converted to a trajectory by incorporating vehicle dynamics with an optimal control scheme which minimizes deviation from the path and final time. Simulation results are presented for the mapping and trajectory planning solutions. The SLAM solutions are investigated in terms of estimation performance, filter consistency, and computational efficiency. The trajectory planning method is shown to produce computationally efficient solutions that maximize environment coverage.

  13. Development of an international matrix-solver prediction system on a French-Japanese international grid computing environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio; Kushida, Noriyuki; Tatekawa, Takayuki; Teshima, Naoya; Caniou, Yves; Guivarch, Ronan; Dayde, Michel; Ramet, Pierre

    2010-01-01

    The 'Research and Development of International Matrix-Solver Prediction System (REDIMPS)' project aimed at improving the TLSE sparse linear algebra expert website by establishing an international grid computing environment between Japan and France. To help users in identifying the best solver or sparse linear algebra tool for their problems, we have developed an interoperable environment between French and Japanese grid infrastructures (respectively managed by DIET and AEGIS). Two main issues were considered. The first issue is how to submit a job from DIET to AEGIS. The second issue is how to bridge the difference of security between DIET and AEGIS. To overcome these issues, we developed APIs to communicate between different grid infrastructures by improving the client API of AEGIS. By developing a server daemon program (SeD) of DIET which behaves like an AEGIS user, DIET can call functions in AEGIS: authentication, file transfer, job submission, and so on. To intensify the security, we also developed functionalities to authenticate DIET sites and DIET users in order to access AEGIS computing resources. By this study, the set of software and computers available within TLSE to find an appropriate solver is enlarged over France (DIET) and Japan (AEGIS). (author)

  14. A general-purpose development environment for intelligent computer-aided training systems

    Science.gov (United States)

    Savely, Robert T.

    1990-01-01

    Space station training will be a major task, requiring the creation of large numbers of simulation-based training systems for crew, flight controllers, and ground-based support personnel. Given the long duration of space station missions and the large number of activities supported by the space station, the extension of space shuttle training methods to space station training may prove to be impractical. The application of artificial intelligence technology to simulation training can provide the ability to deliver individualized training to large numbers of personnel in a distributed workstation environment. The principal objective of this project is the creation of a software development environment which can be used to build intelligent training systems for procedural tasks associated with the operation of the space station. Current NASA Johnson Space Center projects and joint projects with other NASA operational centers will result in specific training systems for existing space shuttle crew, ground support personnel, and flight controller tasks. Concurrently with the creation of these systems, a general-purpose development environment for intelligent computer-aided training systems will be built. Such an environment would permit the rapid production, delivery, and evolution of training systems for space station crew, flight controllers, and other support personnel. The widespread use of such systems will serve to preserve task and training expertise, support the training of many personnel in a distributed manner, and ensure the uniformity and verifiability of training experiences. As a result, significant reductions in training costs can be realized while safety and the probability of mission success can be enhanced.

  15. Understanding Student Retention in Computer Science Education: The Role of Environment, Gains, Barriers and Usefulness

    Science.gov (United States)

    Giannakos, Michail N.; Pappas, Ilias O.; Jaccheri, Letizia; Sampson, Demetrios G.

    2017-01-01

    Researchers have been working to understand the high dropout rates in computer science (CS) education. Despite the great demand for CS professionals, little is known about what influences individuals to complete their CS studies. We identify gains of studying CS, the (learning) environment, degree's usefulness, and barriers as important predictors…

  16. The Effect of a Graph-Oriented Computer-Assisted Project-Based Learning Environment on Argumentation Skills

    Science.gov (United States)

    Hsu, P. -S.; Van Dyke, M.; Chen, Y.; Smith, T. J.

    2015-01-01

    The purpose of this quasi-experimental study was to explore how seventh graders in a suburban school in the United States developed argumentation skills and science knowledge in a project-based learning environment that incorporated a graph-oriented, computer-assisted application. A total of 54 students (three classes) comprised this treatment…

  17. Proceedings of the 1993 Conference on Intelligent Computer-Aided Training and Virtual Environment Technology, Volume 1

    Science.gov (United States)

    Hyde, Patricia R.; Loftin, R. Bowen

    1993-01-01

    These proceedings are organized in the same manner as the conference's contributed sessions, with the papers grouped by topic area. These areas are as follows: VE (virtual environment) training for Space Flight, Virtual Environment Hardware, Knowledge Acquisition for ICAT (Intelligent Computer-Aided Training) & VE, Multimedia in ICAT Systems, VE in Training & Education (1 & 2), Virtual Environment Software (1 & 2), Models in ICAT systems, ICAT Commercial Applications, ICAT Architectures & Authoring Systems, ICAT Education & Medical Applications, Assessing VE for Training, VE & Human Systems (1 & 2), ICAT Theory & Natural Language, ICAT Applications in the Military, VE Applications in Engineering, Knowledge Acquisition for ICAT, and ICAT Applications in Aerospace.

  18. Localization in Multiple Source Environments: Localizing the Missing Source

    Science.gov (United States)

    2007-02-01

    Six volunteer listeners (3 males and 3 females, 19-24 years of age) participated in the experiment. All had normal hearing (audiometric thresholds < 15...were routed from a control computer to a Mark of the Unicorn digital-to-analog converter (MOTU 24 I/O), then through a bank of amplifiers (Crown Model

  19. Wireless Adaptive Therapeutic TeleGaming in a Pervasive Computing Environment

    Science.gov (United States)

    Peters, James F.; Szturm, Tony; Borkowski, Maciej; Lockery, Dan; Ramanna, Sheela; Shay, Barbara

    This chapter introduces a wireless, pervasive computing approach to adaptive therapeutic telegaming considered in the context of near set theory. Near set theory provides a formal basis for observation, comparison and classification of perceptual granules. A perceptual granule is defined by a collection of objects that are graspable by the senses or by the mind. In the proposed pervasive computing approach to telegaming, a handicapped person (e.g., stroke patient with limited hand, finger, arm function) plays a video game by interacting with familiar instrumented objects such as cups, cutlery, soccer balls, nozzles, screw top-lids, spoons, so that the technology that makes therapeutic exercise game-playing possible is largely invisible (Archives of Physical Medicine and Rehabilitation 89:2213-2217, 2008). The basic approach to adaptive learning (AL) in the proposed telegaming environment is ethology-inspired and is quite different from the traditional approach to reinforcement learning. In biologically-inspired learning, organisms learn to achieve some goal by durable modification of behaviours in response to signals from the environment resulting from specific experiences (Animal Behavior, 1995). The term adaptive is used here in an ethological sense, where learning by an organism results from modifying behaviour in response to perceived changes in the environment. To instill adaptivity in a video game, it is assumed that learning by a video game is episodic. During an episode, the behaviour of a player is measured indirectly by tracking the occurrence of gaming events such as a hit or a miss of a target (e.g., hitting a moving ball with a game paddle). An ethogram provides a record of behaviour feature values that serves as a functional registry for gaming adaptivity for handicapped players. An important practical application of adaptive gaming is therapeutic rehabilitation exercise carried out in parallel with playing action video games. Enjoyable and

  20. SU-E-P-10: Establishment of Local Diagnostic Reference Levels of Routine Exam in Computed Tomography

    International Nuclear Information System (INIS)

    Yeh, M; Wang, Y; Weng, H

    2015-01-01

    Introduction: National diagnostic reference levels (NDRLs) provide reference doses for radiological examinations and a basis for optimizing patient dose. Local diagnostic reference levels (LDRLs), maintained by periodically reviewing and checking doses, are a more efficient way to improve examination practice, so the essential first step is to establish a diagnostic reference level. Radiation dose limit values for computed tomography have already been established in Taiwan, and many studies report that CT scans contribute most of the radiation dose in medical imaging. This study therefore reviews the international status of DRLs and establishes diagnostic reference levels for computed tomography in our hospital. Methods and Materials: Two clinical CT scanners (a Toshiba Aquilion and a Siemens Sensation) were included in this study. For CT examinations the basic recommended dosimetric quantity is the Computed Tomography Dose Index (CTDI). For each examination and each body part, we collected at least 10 patients. The routine examinations were carried out, all exposure parameters were collected, and the corresponding CTDIvol and DLP values were determined. Results: The majority of patients (75%) weighed between 60 and 70 kg. There were 25 examinations in this study. Table 1 shows the LDRL of each routine CT examination. Conclusions: This study presents the international status of DRLs and establishes local reference levels for computed tomography at our hospital, providing a radiation reference as a basis for optimizing patient dose.
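Although the abstract does not state the statistic used, diagnostic reference levels are conventionally set at the 75th percentile of the observed dose distribution for each examination. A minimal Python sketch under that assumption (the survey numbers are invented; `statistics.quantiles` with n=4 yields quartiles by its default 'exclusive' method):

```python
from statistics import quantiles

def local_drl(dose_surveys):
    """Local DRL per examination: 75th percentile of the observed doses."""
    return {exam: quantiles(doses, n=4)[2]       # third quartile
            for exam, doses in dose_surveys.items()}

# Ten illustrative CTDIvol readings (mGy) per routine exam; numbers invented.
surveys = {
    "head":  [55, 60, 58, 62, 57, 59, 61, 56, 63, 60],
    "chest": [10, 12, 11, 9, 13, 12, 10, 11, 14, 12],
}
drls = local_drl(surveys)
```

Scanners or protocols whose typical doses sit above the local DRL are then the first candidates for protocol optimization, which is the workflow the abstract describes.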

  1. SU-E-P-10: Establishment of Local Diagnostic Reference Levels of Routine Exam in Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, M; Wang, Y; Weng, H [Chiayi Chang Gung Memorial Hospital of The C.G.M.F, Puzi City, Chiayi County, Taiwan (China)

    2015-06-15

    Introduction: National diagnostic reference levels (NDRLs) provide reference doses for radiological examinations and a basis for optimizing patient dose. Local diagnostic reference levels (LDRLs), maintained by periodically reviewing and checking doses, are a more efficient way to improve examination practice, so the essential first step is to establish a diagnostic reference level. Radiation dose limit values for computed tomography have already been established in Taiwan, and many studies report that CT scans contribute most of the radiation dose in medical imaging. This study therefore reviews the international status of DRLs and establishes diagnostic reference levels for computed tomography in our hospital. Methods and Materials: Two clinical CT scanners (a Toshiba Aquilion and a Siemens Sensation) were included in this study. For CT examinations the basic recommended dosimetric quantity is the Computed Tomography Dose Index (CTDI). For each examination and each body part, we collected at least 10 patients. The routine examinations were carried out, all exposure parameters were collected, and the corresponding CTDIvol and DLP values were determined. Results: The majority of patients (75%) weighed between 60 and 70 kg. There were 25 examinations in this study. Table 1 shows the LDRL of each routine CT examination. Conclusions: This study presents the international status of DRLs and establishes local reference levels for computed tomography at our hospital, providing a radiation reference as a basis for optimizing patient dose.

  2. Design of UAV-Embedded Microphone Array System for Sound Source Localization in Outdoor Environments

    Directory of Open Access Journals (Sweden)

    Kotaro Hoshiba

    2017-11-01

    Full Text Available In search and rescue activities, unmanned aerial vehicles (UAVs) should exploit sound information to compensate for poor visual information. This paper describes the design and implementation of a UAV-embedded microphone array system for sound source localization in outdoor environments. Four critical development problems included water-resistance of the microphone array, efficiency in assembling, reliability of wireless communication, and sufficiency of visualization tools for operators. To solve these problems, we developed a spherical microphone array system (SMAS) consisting of a microphone array, a stable wireless network communication system, and intuitive visualization tools. The performance of SMAS was evaluated with simulated data and a demonstration in the field. Results confirmed that the SMAS provides highly accurate localization, water resistance, prompt assembly, stable wireless communication, and intuitive information for observers and operators.
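Microphone-array localization of this kind typically starts from time differences of arrival between microphone pairs. The stdlib Python sketch below estimates the inter-microphone lag by maximizing a plain cross-correlation over a synthetic pulse; this is a simplified stand-in for the more robust multi-channel methods (e.g., MUSIC-style processing) such systems usually employ:

```python
def tdoa_samples(sig_a, sig_b, max_lag):
    """Estimate the delay (in samples) of sig_b relative to sig_a by
    maximizing the cross-correlation over a bounded lag range."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(a * sig_b[i + lag]
                    for i, a in enumerate(sig_a)
                    if 0 <= i + lag < len(sig_b))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# A short pulse arriving 3 samples later at microphone B (synthetic data).
pulse = [0, 1, 2, 1, 0]
mic_a = pulse + [0] * 10
mic_b = [0] * 3 + pulse + [0] * 7
lag = tdoa_samples(mic_a, mic_b, max_lag=5)
```

Given the lag in seconds and the microphone spacing, the arrival direction for a distant source follows from simple geometry (asin of the path difference over the spacing); a spherical array like the SMAS combines many such pairs for full 3-D bearings.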

  3. Design of UAV-Embedded Microphone Array System for Sound Source Localization in Outdoor Environments.

    Science.gov (United States)

    Hoshiba, Kotaro; Washizaki, Kai; Wakabayashi, Mizuho; Ishiki, Takahiro; Kumon, Makoto; Bando, Yoshiaki; Gabriel, Daniel; Nakadai, Kazuhiro; Okuno, Hiroshi G

    2017-11-03

    In search and rescue activities, unmanned aerial vehicles (UAV) should exploit sound information to compensate for poor visual information. This paper describes the design and implementation of a UAV-embedded microphone array system for sound source localization in outdoor environments. Four critical development problems included water-resistance of the microphone array, efficiency in assembling, reliability of wireless communication, and sufficiency of visualization tools for operators. To solve these problems, we developed a spherical microphone array system (SMAS) consisting of a microphone array, a stable wireless network communication system, and intuitive visualization tools. The performance of SMAS was evaluated with simulated data and a demonstration in the field. Results confirmed that the SMAS provides highly accurate localization, water resistance, prompt assembly, stable wireless communication, and intuitive information for observers and operators.

  4. FAST - A multiprocessed environment for visualization of computational fluid dynamics

    International Nuclear Information System (INIS)

    Bancroft, G.V.; Merritt, F.J.; Plessel, T.C.; Kelaita, P.G.; Mccabe, R.K.

    1991-01-01

    The paper presents the Flow Analysis Software Toolset (FAST) to be used for fluid-mechanics analysis. The design criteria for FAST including the minimization of the data path in the computational fluid-dynamics (CFD) process, consistent user interface, extensible software architecture, modularization, and the isolation of three-dimensional tasks from the application programmer are outlined. Each separate process communicates through the FAST Hub, while other modules such as FAST Central, NAS file input, CFD calculator, surface extractor and renderer, titler, tracer, and isolev might work together to generate the scene. An interprocess communication package making it possible for FAST to operate as a modular environment where resources could be shared among different machines as well as a single host is discussed. 20 refs

  5. Moessbauer study of the local environment of the iron implanted in glassy AgAsS2

    International Nuclear Information System (INIS)

    Bychkov, E.A.; Vlasov, Yu.G.; Dravin, V.A.; Semenov, V.G.

    1987-01-01

    The local environment of iron implanted into glassy AgAsS2, or introduced into the glass during synthesis, is investigated. It is shown that the chemical forms in which iron is stabilized are similar in both cases; however, the concentration ratios of the various forms differ significantly. The main component of the doped-glass spectrum (85-88% of the total area) is a quadrupole doublet of divalent iron in a tetrahedral sulfide environment in the glass. In the spectra of the implanted samples, the contributions from divalent iron in the glass and from amorphous iron disulfide are comparable. The concentration differences are probably linked to the high quenching rate of the implanted region of the glass

  6. Materials and Life Science Experimental Facility at the Japan Proton Accelerator Research Complex III: Neutron Devices and Computational and Sample Environments

    Directory of Open Access Journals (Sweden)

    Kaoru Sakasai

    2017-08-01

    Full Text Available Neutron devices such as neutron detectors, optical devices including supermirror devices and 3He neutron spin filters, and choppers have been successfully developed and installed at the Materials and Life Science Experimental Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC), Tokai, Japan. Four software components of the MLF computational environment (instrument control, data acquisition, data analysis, and a database) have been developed and deployed at MLF. MLF also provides a wide variety of sample environment options, including high and low temperatures, high magnetic fields, and high pressures. This paper describes the current status of the neutron devices and the computational and sample environments at MLF.

  7. Abnormalities by pulmonary regions studied with computer tomography following local or local-regional radiotherapy for breast cancer

    International Nuclear Information System (INIS)

    Lind, Pehr; Svane, Gunilla; Gagliardi, Giovanna; Svensson, Christer

    1999-01-01

    Purpose: To study pulmonary radiological abnormalities with computer tomography (CT) following different radiotherapy (RT) techniques for breast cancer, with respect to regions and density, and their correlation with pulmonary complications and reduction in vital capacity (VC). Methods and Materials: CT scans of the lungs were performed prior to and 4 months following RT in 105 breast cancer patients treated with local or local-regional RT. The radiological abnormalities were analyzed with a CT-adapted modification of a classification system originally proposed by Arriagada, and scored according to increasing density (0-3) and affected lung regions (apical-lateral, central-parahilar, basal-lateral). The highest density grades in each region were added together to form scores ranging from 0-9. The patients were monitored for RT-induced pulmonary complications. VC was measured prior to and 5 months following RT. Results: Increasing CT scores were correlated with both local-regional RT and pulmonary complications (p < 0.001). The mean reduction of VC for patients scoring 4-9 (-202 ml) was larger than for patients scoring 0-3 (-2 ml) (p = 0.035). The effect of confounding factors on the radiological scoring was tested in the local-regional RT group. Scores of 4-9 were less frequently seen in patients who had received adjuvant chemotherapy prior to RT. The importance of the respective lung regions for the outcome of pulmonary complications was tested. Only radiological abnormalities in the central-parahilar and apical-lateral regions were significantly correlated with pulmonary complications. Discussion: Radiological abnormalities detected on CT images and scored with a modification of Arriagada's classification system can be used as an objective endpoint for pulmonary side effects in breast cancer. The described model should, however, be expanded with information about the volume of lung affected in each region before definite conclusions can be drawn concerning each

  8. Optimizing the Use of Storage Systems Provided by Cloud Computing Environments

    Science.gov (United States)

    Gallagher, J. H.; Potter, N.; Byrne, D. A.; Ogata, J.; Relph, J.

    2013-12-01

    Cloud computing systems present a set of features that include familiar computing resources (albeit augmented to support dynamic scaling of processing power) bundled with a mix of conventional and unconventional storage systems. The Linux base on which many cloud environments (e.g., Amazon) are built makes it tempting to assume that any Unix software will run efficiently in this environment without change. OPeNDAP and NODC collaborated on a short project to explore how the S3 and Glacier storage systems provided by the Amazon cloud computing infrastructure could be used with a data server developed primarily to access data stored in a traditional Unix file system. Our work used the Amazon cloud system, but we strove for designs that could be adapted easily to other systems such as OpenStack. Lastly, we evaluated different architectures from a computer security perspective. We found considerable issues with treating S3 as if it were a traditional file system, even though doing so is conceptually simple. These include performance penalties: a software tool that emulates a traditional file system to store data in S3 performs poorly compared to storing data directly in S3. We also found important benefits, beyond performance, in ensuring that data written to S3 can be accessed directly without relying on a specific software tool. To provide a hierarchical organization to the data stored in S3, we wrote 'catalog' files in XML. These catalog files map discrete files to S3 access keys. Like a traditional file system's directories, the catalogs can also contain references to other catalogs, providing a simple but effective hierarchy overlaid on top of S3's flat storage space. An added benefit is that the catalogs can be viewed in a web browser; our storage scheme thus provides both efficient access for the data server and access via a web browser. We also looked at the Glacier storage system and
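The catalog idea described above can be illustrated with a small sketch; the element and attribute names here are hypothetical, not the project's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical catalog format: maps dataset files to S3 access keys, and
# references sub-catalogs to overlay a hierarchy on S3's flat key space.
catalog = ET.Element("catalog", name="sst")
ET.SubElement(catalog, "file", name="sst_2013_01.nc", key="data/sst/2013/01.nc")
ET.SubElement(catalog, "catalogRef", href="sst_2012_catalog.xml")

xml_text = ET.tostring(catalog, encoding="unicode")
print(xml_text)
```

Because the catalog is plain XML, both the data server and a web browser (with a small stylesheet) can walk the same hierarchy.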

  9. Imaging local cerebral blood flow by xenon-enhanced computed tomography - technical optimization procedures

    International Nuclear Information System (INIS)

    Meyer, J.S.; Shinohara, T.; Imai, A.; Kobari, M.; Solomon, E.

    1988-01-01

    Methods are described for non-invasive, computer-assisted serial scanning throughout the human brain during eight minutes of inhalation of 27%-30% xenon gas in order to measure local cerebral blood flow (LCBF). Optimized xenon-enhanced computed tomography (XeCT) was achieved by 5-second scanning at one-minute intervals utilizing a state-of-the-art CT scanner and rapid delivery of xenon gas via a face mask. Values for local brain-blood partition coefficients (Lλ) measured in vivo were utilized to calculate LCBF values. Previous methods assumed Lλ values to be normal, introducing the risk of systematic errors, because Lλ values differ throughout normal brain and may be altered by disease. Color-coded maps of Lλ and LCBF values were formatted directly onto CT images for exact correlation of function with anatomic and pathologic observations (spatial resolution: 26.5 cubic mm). Results were compared among eight normal volunteers, aged between 50 and 88 years. Mean cortical gray matter blood flow was 46.3 ± 7.7, for subcortical gray matter it was 50.3 ± 13.2 and for white matter it was 18.8 ± 3.2. Modern CT scanners provide stability, improved signal to noise ratio and minimal radiation scatter. Combining these advantages with rapid xenon saturation of the blood provides correlations of Lλ and LCBF with images of normal and abnormal brain in a safe, useful and non-invasive manner. (orig.)
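The underlying computation can be sketched with a simplified wash-in model based on the Fick principle; the constants, units and single-exponential form are illustrative assumptions, not the paper's exact protocol:

```python
import numpy as np

# Illustrative xenon wash-in model: with constant arterial concentration Ca,
# tissue enhancement follows Ct(t) = lam * Ca * (1 - exp(-k t)), where lam is
# the local brain-blood partition coefficient and local flow is F = lam * k.
lam, Ca, k = 0.9, 1.0, 0.5          # assumed partition coeff., a.u., 1/min
t = np.arange(1.0, 9.0)             # one scan per minute over 8 minutes
Ct = lam * Ca * (1 - np.exp(-k * t))

# Recover k from a log-linear fit of the remaining saturation deficit,
# then convert the rate constant back to flow via the partition coefficient.
deficit = 1 - Ct / (lam * Ca)
k_fit = -np.polyfit(t, np.log(deficit), 1)[0]
flow = lam * k_fit
print(round(flow, 3))               # 0.45
```

Measuring lam in vivo per voxel, rather than assuming a normal value, is what avoids the systematic errors the abstract mentions.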

  10. A simple interface to computational fluid dynamics programs for building environment simulations

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, III, C R; Chen, Q [Massachusetts Institute of Technology, Cambridge, MA (United States)

    2000-07-01

    It is becoming a popular practice for architects and HVAC engineers to simulate airflow in and around buildings by computational fluid dynamics (CFD) methods in order to predict the indoor and outdoor environment. However, many CFD programs are hampered by historically poor and inefficient user interfaces, particularly for users with little training in numerical simulation. This investigation endeavors to create a simplified CFD interface (SCI) that allows architects and building engineers to use CFD without excessive training. The SCI can be easily integrated into new CFD programs. (author)

  11. Development of a unique product: Perception of guests in Tourism in vineyard cottages on the local environment

    Directory of Open Access Journals (Sweden)

    Le- Marija Colarič-Jakše

    2017-11-01

    Full Text Available Purpose and Originality: Tourism in vineyard cottages is a new, authentic and unique tourism product covering the wine-growing region of Posavje, with the districts of Dolenjska (Lower Carniola), Bela Krajina and Obsotelje-Kozjansko, areas of vineyards and vineyard cottages. Tourists staying in vineyard cottages bring economic benefits to the local community. Method: Through the research we gathered information on where local residents see positive and where negative impacts of the guests who come to the vineyard cottages. As part of the descriptive approach we used a method of describing the opinions of local residents on the impact of tourists arriving at vineyard cottages on the local environment, and a method of compiling findings, observations and results. In the analytical approach we proceeded from the results of interviews in individual cases and drew conclusions about the opinions of local inhabitants regarding the marketing of the product Tourism in vineyard cottages. Results: Considering the results of the research, individual interviews with guests of the product Tourism in vineyard cottages, and responses in the local environment, we assess that the product has a unique, authentic, original and attractive approach, with every possibility of becoming one of the most recognizable, sought-after and showcased integral products of Slovenian tourism. Society: The new, innovative, attractive, unique and authentic product Tourism in vineyard cottages, developed in the wine-growing region of Posavje, has extremely great potential: it is one of the most recognizable forms of tourism in countryside areas, with additional prospects in creating local stories and connecting into integral tourism products.
Limitations / further research: It is necessary that other owners of vineyard cottages, who are not included into

  12. Retention Capability of Local Backfill Materials 1-Simulated Disposal Environment

    International Nuclear Information System (INIS)

    Ghattas, N.K.; Eskander, S.B.; El-Adham, K.A.; Mahmoud, N.S.

    2001-01-01

    In Egypt, a shallow ground disposal facility was the chosen option for the disposal of low- and intermediate-level radioactive wastes. The impact of the waste disposal facility on the environment depends on the nature of the barriers intended to limit and control contaminant migration. Owing to their physical, chemical and mechanical characteristics, local soil materials were studied to illustrate the role of the backfill as part of an optimized multi-barrier safety system, which can provide the required level of protection of the environment and meet economic and regulatory requirements. A theoretical model was proposed to calculate the transport phenomena through the backfill materials. The credibility and validity of the proposed model were checked against the experimental results obtained from a three-arm arrangement system. The data obtained for the distribution coefficient (Kd) and the apparent diffusion coefficient (Da) were in good agreement with those previously reported in the literature. Taking into consideration the prevailing initial conditions, the data calculated with the theoretical model show reasonable agreement with the results obtained from the experimental work. Prediction of radioactive cesium migration through the backfill materials using the proposed model was performed as a function of distance. The results obtained show that after 100 years, a fraction not exceeding 1E-9 of the original activity could be detected at a distance of 1 m away from the waste material

  13. Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Xipeng [North Carolina State Univ., Raleigh, NC (United States)

    2016-04-27

    The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues of exascale computing. In the first three years, the PI and his group achieved significant progress towards this goal, producing a set of novel techniques for improving memory performance and data locality in manycore systems, yielding 18 conference and workshop papers and 4 journal papers, and graduating 6 Ph.D. students. This report summarizes the research results of the project through that period.

  14. Exploiting Deep Neural Networks and Head Movements for Robust Binaural Localization of Multiple Sources in Reverberant Environments

    DEFF Research Database (Denmark)

    Ma, Ning; May, Tobias; Brown, Guy J.

    2017-01-01

    This paper presents a novel machine-hearing system that exploits deep neural networks (DNNs) and head movements for robust binaural localization of multiple sources in reverberant environments. DNNs are used to learn the relationship between the source azimuth and binaural cues, consisting of the complete cross-correlation function (CCF) and interaural level differences (ILDs). In contrast to many previous binaural hearing systems, the proposed approach is not restricted to localization of sound sources in the frontal hemifield. Due to the similarity of binaural cues in the frontal and rear...
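As a minimal illustration of one of these binaural cues, the peak of the cross-correlation function between the ear signals yields the interaural time difference; the signal and delay below are synthetic:

```python
import numpy as np

# Sketch of one binaural cue such systems feed to the DNN: the peak of the
# cross-correlation function (CCF) between left and right ear signals gives
# the interaural time difference (ITD). Signal and delay are synthetic.
fs = 16000                                 # sample rate (Hz), assumed
rng = np.random.default_rng(0)
sig = rng.standard_normal(1024)
delay = 5                                  # true lag of the right ear, samples
left, right = sig[delay:], sig[:-delay]    # right[n] == left[n - delay]

ccf = np.correlate(right, left, mode="full")
lag = int(np.argmax(ccf)) - (len(left) - 1)
itd_us = 1e6 * lag / fs
print(lag, itd_us)                         # 5 312.5
```

A real system would evaluate such cues per frequency band and let the network map them, together with ILDs, to azimuth.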

  15. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    Directory of Open Access Journals (Sweden)

    Almeida Jonas S

    2006-03-01

    Full Text Available Abstract Background Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web

  16. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    Science.gov (United States)

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. 
Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over
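The packing of run-time variables together with the code to execute can be sketched in Python (mGrid itself uses Matlab and PHP; the payload format below is purely illustrative, not mGrid's protocol):

```python
import pickle

# Toy illustration of the packing step such systems perform: the user-defined
# function travels in one payload with its run-time arguments, so the worker
# receives everything the call needs in a single message.
def user_code(x, scale):
    return [scale * v for v in x]

payload = pickle.dumps({"func": user_code, "args": ([1, 2, 3], 10)})

# On the "remote" side: unpack and execute.
job = pickle.loads(payload)
print(job["func"](*job["args"]))  # [10, 20, 30]
```

A real system would additionally ship the function's source (pickle serializes module-level functions by reference) and choose the worker by current load.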

  17. Recent Advances in Wireless Indoor Localization Techniques and System

    Directory of Open Access Journals (Sweden)

    Zahid Farid

    2013-01-01

    Full Text Available The advances in localization-based technologies and the increasing importance of ubiquitous computing and context-dependent information have led to growing business interest in location-based applications and services. Today, many applications require accurate locating or real-time tracking of physical belongings inside buildings; thus, the demand for indoor localization services has become a key prerequisite in some markets. Moreover, indoor localization technologies address the inadequacy of the global positioning system inside closed environments such as buildings. This paper therefore aims to provide the reader with a review of recent advances in wireless indoor localization techniques and systems, to deliver a better understanding of state-of-the-art technologies and to motivate new research efforts in this promising field. For this purpose, existing wireless localization positioning systems and location estimation schemes are reviewed, and the related techniques and systems are compared, along with a conclusion and future trends.
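One of the classic techniques such reviews cover is RSSI-based trilateration with a log-distance path-loss model. A noise-free toy sketch (anchor positions, path-loss exponent and reference power are all illustrative):

```python
import numpy as np

# Convert RSSI to distance with a log-distance path-loss model, then
# trilaterate by linearizing the circle equations. All numbers are toy values.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
p0, n = -40.0, 2.0                      # RSSI at 1 m (dBm), path-loss exponent
true = np.array([4.0, 3.0])
d = np.linalg.norm(anchors - true, axis=1)
rssi = p0 - 10 * n * np.log10(d)        # noise-free "measurements"

d_est = 10 ** ((p0 - rssi) / (10 * n))  # invert the path-loss model
# Subtract the first circle equation from the others to get a linear system.
A = 2 * (anchors[1:] - anchors[0])
b = (d_est[0]**2 - d_est[1:]**2
     + np.sum(anchors[1:]**2, axis=1) - np.sum(anchors[0]**2))
pos = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.round(pos, 2))                 # [4. 3.]
```

With real, noisy RSSI the same least-squares structure is kept but more anchors and filtering (or fingerprinting) are needed.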

  18. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    OpenAIRE

    Dang Hung; Dinh Tien Tuan Anh; Chang Ee-Chien; Ooi Beng Chin

    2017-01-01

    We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for the data movement observed during computation could leak information. While it is possible to thwart such leakage using generic solutions such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation effi...

  19. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    Science.gov (United States)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provides a myriad of challenges when running in a high-performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large-scale simulation and analysis work are commonplace and provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.

  20. Replicated Data Management for Mobile Computing

    CERN Document Server

    Douglas, Terry

    2008-01-01

    Managing data in a mobile computing environment invariably involves caching or replication. In many cases, a mobile device has access only to data that is stored locally, and much of that data arrives via replication from other devices, PCs, and services. Given portable devices with limited resources, weak or intermittent connectivity, and security vulnerabilities, data replication serves to increase availability, reduce communication costs, foster sharing, and enhance survivability of critical information. Mobile systems have employed a variety of distributed architectures from client-server

  1. Visual Perspectives within Educational Computer Games: Effects on Presence and Flow within Virtual Immersive Learning Environments

    Science.gov (United States)

    Scoresby, Jon; Shelton, Brett E.

    2011-01-01

    The mis-categorizing of cognitive states involved in learning within virtual environments has complicated instructional technology research. Further, most educational computer game research does not account for how learning activity is influenced by factors of game content and differences in viewing perspectives. This study is a qualitative…

  2. A 250-Mbit/s ring local computer network using 1.3-microns single-mode optical fibers

    Science.gov (United States)

    Eng, S. T.; Tell, R.; Andersson, T.; Eng, B.

    1985-01-01

    A 250-Mbit/s three-station fiber-optic ring local computer network was built and successfully demonstrated. A conventional token protocol was employed for bus arbitration to maximize bus efficiency under high loading conditions, and a non-return-to-zero (NRZ) data encoding format was selected for simplicity and maximum utilization of the ECL-circuit bandwidth.

  3. Fast mapping of the local environment of an autonomous mobile robot

    International Nuclear Information System (INIS)

    Fanton, Herve

    1989-01-01

    The construction of a map of the local world for the navigation of an autonomous mobile robot raises the following problem: how to extract, from the sensor data, information accurate and reliable enough to plan a path, in a way that enables a reasonable displacement speed. The choice was made not to tele-operate the vehicle nor to design any custom architecture, so the only way to meet the computational cost is to look for the most efficient sensor-algorithms-architecture combination. A good solution is described in this study, using a laser range-finder, a grid model of the world, and both SIMD and MIMD parallel processors. A short review of some possible approaches is made first; the mapping algorithms are then described, as are the parallel implementations with the corresponding speedup and efficiency factors. (author) [fr
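The grid model mentioned above is typically an occupancy grid updated in log-odds from each range reading. A deliberately simplified, axis-aligned sketch (update weights and geometry are illustrative, not the thesis's values):

```python
import numpy as np

# Minimal occupancy-grid idea: a laser range reading lowers the occupancy
# log-odds of cells the beam passes through and raises it at the hit cell.
grid = np.zeros((20, 20))                 # log-odds, 0 = unknown
L_FREE, L_OCC = -0.4, 0.85                # illustrative update weights

def integrate_beam(grid, x0, y0, x1, y1):
    """Axis-aligned beam (y0 == y1 assumed) from (x0, y0) to hit cell (x1, y1)."""
    for x in range(x0, x1):               # traversed cells become more "free"
        grid[y0, x] += L_FREE
    grid[y1, x1] += L_OCC                 # the cell that returned the echo

integrate_beam(grid, 0, 5, 8, 5)
print(grid[5, 3], grid[5, 8])             # -0.4 0.85
```

A full implementation traces arbitrary beam angles (e.g. with Bresenham's line algorithm) and fuses many scans; the per-cell update stays this cheap, which is what makes SIMD parallelization attractive.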

  4. An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments.

    Science.gov (United States)

    Mora, Higinio; Gil, David; Terol, Rafael Muñoz; Azorín, Jorge; Szymanski, Julian

    2017-10-10

    The new Internet of Things paradigm allows for small devices with sensing, processing and communication capabilities to be designed, enabling the development of sensors, embedded devices and other 'things' ready to understand the environment. In this paper, a distributed framework based on the Internet of Things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantage and novelty of the proposed system is the flexibility in computing the health application by using resources from available devices inside the body area network of the user. The proposed framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, we present a case study to validate our proposal, which consists of monitoring footballers' heart rates during a football match. The real-time data acquired by these devices present a clear social objective of being able to predict not only situations of sudden death but also possible injuries.

  5. Modern design of a fast front-end computer

    Science.gov (United States)

    Šoštarić, Z.; Aničić, D.; Sekolec, L.; Su, J.

    1994-12-01

    Front-end computers (FECs) at the Paul Scherrer Institut provide access to accelerator CAMAC-based sensors and actuators by way of a local area network. In the scope of the new-generation FEC project, a front-end is regarded as a collection of services. The functionality of one such service is described in terms of Yourdon's environment, behaviour, processor and task models. The computational model (the software representation of the environment) of the service is defined separately, using the information model of the Shlaer-Mellor method and the Sather OO language. In parallel with the analysis, and later with the design, a suite of test programmes was developed to evaluate the feasibility of different computing platforms for the project, and a set of rapid prototypes was produced to resolve different implementation issues. The past and future aspects of the project and its driving forces are presented. Justification of the choice of methodology, platform and requirements is given. We conclude with a description of the present state, priorities and limitations of our project.

  6. Gestão local e meio ambiente (Local management and environment)

    Directory of Open Access Journals (Sweden)

    Paulo Gonzaga M. de Carvalho

    2005-01-01

    Full Text Available Based on the information made available by the Municipal Basic Information Survey (Pesquisa de Informações Básicas Municipais) of IBGE, this article analyzes three variables: the existence of Municipal Councils for the Environment, of Special Funds for the Environment, and of legislation on Areas of Special Interest. Among other aspects, it examines the incidence of Municipal Councils for the Environment with respect to the hydrographic basin and the mayor's political party.

  7. Tri-P-LETS: Changing the Face of High School Computer Science

    Science.gov (United States)

    Sherrell, Linda; Malasri, Kriangsiri; Mills, David; Thomas, Allen; Greer, James

    2012-01-01

    From 2004 to 2007, the University of Memphis carried out the NSF-funded Tri-P-LETS (Three P Learning Environment for Teachers and Students) project to improve local high-school computer science curricula. The project reached a total of 58 classrooms in eleven high schools, emphasizing problem-solving skills, programming concepts as opposed to syntax,…

  8. Surface and interfacial interactions of multilayer graphitic structures with local environment

    International Nuclear Information System (INIS)

    Mazzocco, R.; Robinson, B.J.; Rabot, C.; Delamoreanu, A.; Zenasni, A.; Dickinson, J.W.; Boxall, C.; Kolosov, O.V.

    2015-01-01

    In order to exploit the potential of graphene in next-generation devices, such as supercapacitors, rechargeable batteries, displays and ultrathin sensors, it is crucial to understand the solvent interactions with the graphene surface and interlayers, especially where the latter may be in competition with the former, in the medium of application deployment. In this report, we combine quartz crystal microbalance (QCM) and ultrasonic force microscopy methods to investigate the changes in the film–substrate and film–environment interfaces of graphene and graphene oxide films, produced by diverse scalable routes, in both polar (deionised water) and non-polar (dodecane) liquid and vapour environments. In polar liquid environments, we observe nanobubble adsorption/desorption on the graphene film corresponding to a surface coverage of up to 20%. As no comparable behaviour is observed for the non-polar environment, we conclude that nanobubble formation is directly due to the hydrophobic nature of graphene, with direct consequences for electrode structures immersed in electrolyte solutions. The amount of water adsorbed by the graphene films was found to vary considerably, from 0.012 monolayers of water per monolayer of reduced graphene oxide to 0.231 monolayers of water per monolayer of carbon diffusion growth graphene. This is supported by direct nanomechanical mapping of the films immersed in water, where an increased variation of local stiffness suggests water propagation within the film and/or between the film and substrate. Transferred film thickness calculations performed for QCM, atomic force microscopy topography and optical transmission measurements return results an order of magnitude larger (46 ± 1 layers) than Raman spectroscopy (1-2 graphene layers) on pristine pre-transferred films, due to contamination during transfer and possible turbostratic structures of large areas. - Highlights: • Exploring interaction of graphene films with polar and nonpolar liquids

  9. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  10. PABLM: a computer program to calculate accumulated radiation doses from radionuclides in the environment

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Kennedy, W.E. Jr.; Soldat, J.K.

    1980-03-01

    A computer program, PABLM, was written to facilitate the calculation of internal radiation doses to man from radionuclides in food products and external radiation doses from radionuclides in the environment. This report contains details of mathematical models used and calculational procedures required to run the computer program. Radiation doses from radionuclides in the environment may be calculated from deposition on the soil or plants during an atmospheric or liquid release, or from exposure to residual radionuclides in the environment after the releases have ended. Radioactive decay is considered during the release of radionuclides, after they are deposited on the plants or ground, and during holdup of food after harvest. The radiation dose models consider several exposure pathways. Doses may be calculated for either a maximum-exposed individual or for a population group. The doses calculated are accumulated doses from continuous chronic exposure. A first-year committed dose is calculated as well as an integrated dose for a selected number of years. The equations for calculating internal radiation doses are derived from those given by the International Commission on Radiological Protection (ICRP) for body burdens and MPC's of each radionuclide. The radiation doses from external exposure to contaminated water and soil are calculated using the basic assumption that the contaminated medium is large enough to be considered an infinite volume or plane relative to the range of the emitted radiations. The equations for calculations of the radiation dose from external exposure to shoreline sediments include a correction for the finite width of the contaminated beach.
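
    The accumulated-dose bookkeeping described above (a decay-corrected dose accumulated over a chosen number of years) can be sketched with a single closed form. This is an illustrative one-nuclide model with made-up numbers, not PABLM's actual dose equations:

```python
import math

def accumulated_dose(initial_dose_rate, half_life_years, years):
    """Accumulated dose from continuous chronic exposure to a single
    decaying radionuclide: D(T) = (R0 / lam) * (1 - exp(-lam * T)).
    Illustrative closed form only, not PABLM's actual dose equations."""
    lam = math.log(2.0) / half_life_years        # decay constant, 1/yr
    return initial_dose_rate / lam * (1.0 - math.exp(-lam * years))

# First-year committed dose and a 50-year integrated dose for a
# hypothetical radionuclide: R0 = 1 mSv/yr, half-life 30 yr.
first_year = accumulated_dose(1.0e-3, 30.0, 1.0)    # Sv
fifty_year = accumulated_dose(1.0e-3, 30.0, 50.0)   # Sv
```

    The two calls mirror the two quantities the report says the program outputs: a first-year committed dose and an integrated dose for a selected number of years.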

  11. PABLM: a computer program to calculate accumulated radiation doses from radionuclides in the environment

    International Nuclear Information System (INIS)

    Napier, B.A.; Kennedy, W.E. Jr.; Soldat, J.K.

    1980-03-01

    A computer program, PABLM, was written to facilitate the calculation of internal radiation doses to man from radionuclides in food products and external radiation doses from radionuclides in the environment. This report contains details of mathematical models used and calculational procedures required to run the computer program. Radiation doses from radionuclides in the environment may be calculated from deposition on the soil or plants during an atmospheric or liquid release, or from exposure to residual radionuclides in the environment after the releases have ended. Radioactive decay is considered during the release of radionuclides, after they are deposited on the plants or ground, and during holdup of food after harvest. The radiation dose models consider several exposure pathways. Doses may be calculated for either a maximum-exposed individual or for a population group. The doses calculated are accumulated doses from continuous chronic exposure. A first-year committed dose is calculated as well as an integrated dose for a selected number of years. The equations for calculating internal radiation doses are derived from those given by the International Commission on Radiological Protection (ICRP) for body burdens and MPC's of each radionuclide. The radiation doses from external exposure to contaminated water and soil are calculated using the basic assumption that the contaminated medium is large enough to be considered an infinite volume or plane relative to the range of the emitted radiations. The equations for calculations of the radiation dose from external exposure to shoreline sediments include a correction for the finite width of the contaminated beach.

  12. The Influence of Trainee Gaming Experience and Computer Self-Efficacy on Learner Outcomes of Videogame-Based Learning Environments

    National Research Council Canada - National Science Library

    Orvis, Karin A; Orvis, Kara L; Belanich, James; Mullin, Laura N

    2005-01-01

    .... The purpose of the current research was to investigate the influence of two trainee characteristics, prior videogame experience and computer self-efficacy, on learner outcomes of a videogame-based training environment...

  13. Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition.

    Directory of Open Access Journals (Sweden)

    Johannes Bill

    Full Text Available During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning and learning of mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connectivity structure of cortical microcircuits supports these calculations, is rudimentary at best. In this study, we investigate statistical inference and self-organized learning in a spatially extended spiking network model, that accommodates both local competitive and large-scale associative aspects of neural information processing, under a unified Bayesian account. Specifically, we show how the spiking dynamics of a recurrent network with lateral excitation and local inhibition in response to distributed spiking input, can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model. This interpretation further permits a rigorous analytical treatment of experience-dependent plasticity on the network level. Using machine learning theory, we derive update rules for neuron and synapse parameters which equate with Hebbian synaptic and homeostatic intrinsic plasticity rules in a neural implementation. In computer simulations, we demonstrate that the interplay of these plasticity rules leads to the emergence of probabilistic local experts that form distributed assemblies of similarly tuned cells communicating through lateral excitatory connections. The resulting sparse distributed spike code of a well-adapted network carries compressed information on salient input features combined with prior experience on correlations among them. Our theory predicts that the emergence of such efficient representations benefits from network architectures in which the range of local inhibition matches the spatial extent of pyramidal cells that share common afferent input.

  14. Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition

    Science.gov (United States)

    Bill, Johannes; Buesing, Lars; Habenschuss, Stefan; Nessler, Bernhard; Maass, Wolfgang; Legenstein, Robert

    2015-01-01

    During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning and learning of mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connectivity structure of cortical microcircuits supports these calculations, is rudimentary at best. In this study, we investigate statistical inference and self-organized learning in a spatially extended spiking network model, that accommodates both local competitive and large-scale associative aspects of neural information processing, under a unified Bayesian account. Specifically, we show how the spiking dynamics of a recurrent network with lateral excitation and local inhibition in response to distributed spiking input, can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model. This interpretation further permits a rigorous analytical treatment of experience-dependent plasticity on the network level. Using machine learning theory, we derive update rules for neuron and synapse parameters which equate with Hebbian synaptic and homeostatic intrinsic plasticity rules in a neural implementation. In computer simulations, we demonstrate that the interplay of these plasticity rules leads to the emergence of probabilistic local experts that form distributed assemblies of similarly tuned cells communicating through lateral excitatory connections. The resulting sparse distributed spike code of a well-adapted network carries compressed information on salient input features combined with prior experience on correlations among them. Our theory predicts that the emergence of such efficient representations benefits from network architectures in which the range of local inhibition matches the spatial extent of pyramidal cells that share common afferent input. PMID:26284370
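
    A common abstraction of such locally inhibited circuits is a winner-take-all unit whose spike probabilities follow a softmax over membrane potentials. The sketch below illustrates only this sampling view, not the paper's spatially extended network or its plasticity rules; the potentials are made up:

```python
import math
import random

def wta_sample(membrane_potentials, rng):
    """Sample the index of the next spiking neuron in a local
    winner-take-all circuit: lateral inhibition normalises the
    instantaneous firing rates, so neuron k fires with probability
    exp(u_k) / sum_j exp(u_j)."""
    m = max(membrane_potentials)                  # subtract max for stability
    weights = [math.exp(u - m) for u in membrane_potentials]
    r = rng.random() * sum(weights)
    for k, w in enumerate(weights):
        r -= w
        if r <= 0.0:
            return k
    return len(weights) - 1

rng = random.Random(0)
counts = [0, 0, 0]
for _ in range(10000):
    counts[wta_sample([2.0, 1.0, 0.0], rng)] += 1
# Empirical firing frequencies approach softmax([2, 1, 0]),
# i.e. roughly (0.665, 0.245, 0.090).
```

    Repeated sampling from such units is the basic mechanism by which spike responses can be read as draws from a posterior distribution.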

  15. Aerosol transport simulations in indoor and outdoor environments using computational fluid dynamics (CFD)

    Science.gov (United States)

    Landazuri, Andrea C.

    This dissertation focuses on aerosol transport modeling in occupational environments and mining sites in Arizona using computational fluid dynamics (CFD). The impacts of human exposure in both environments are explored with emphasis on turbulence, wind speed, wind direction and particle sizes. Final emissions simulations involved the digitalization of available elevation contour plots of one of the mining sites to account for realistic topographical features. The digital elevation map (DEM) of one of the sites was imported to COMSOL MULTIPHYSICS (RTM) for subsequent turbulence and particle simulations. Simulation results that include realistic topography show considerable deviations of wind direction. Inter-element correlation results using metal and metalloid size-resolved concentration data from a Micro-Orifice Uniform Deposit Impactor (MOUDI) under given wind speeds and directions provided guidance on groups of metals that coexist throughout mining activities. Groups between Fe-Mg, Cr-Fe, Al-Sc, Sc-Fe, and Mg-Al are strongly correlated for unrestricted wind directions and speeds, suggesting that the source may be of soil origin (e.g. ore and tailings); also, groups of elements where Cu is present, in the coarse fraction range, may come from mechanical action mining activities and the saltation phenomenon. Besides, MOUDI data under low wind speeds (Computational Fluid Dynamics can be used as a source apportionment tool to identify areas that have an effect over specific sampling points and susceptible regions under certain meteorological conditions, and these conclusions can be supported with inter-element correlation matrices and lead isotope analysis, especially since there is limited access to the mining sites.
    Additional results concluded that grid adaptation is a powerful tool that allows refining specific regions that require a high level of detail, and therefore better resolves flow detail; it also provides a higher number of locations with monotonic convergence than the

  16. Multi-Language Programming Environments for High Performance Java Computing

    Directory of Open Access Journals (Sweden)

    Vladimir Getov

    1999-01-01

    Full Text Available Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community’s reliance on established scientific packages. As a consequence, programmers of high‐performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java‐to‐C Interface (JCI) tool, which provides application programmers wishing to use Java with immediate accessibility to existing scientific packages. The JCI tool also facilitates rapid development and reuse of existing code. These benefits are provided at minimal cost to the programmer. While beneficial to the programmer, the additional advantages of mixed‐language programming in terms of application performance and portability are addressed in detail within the context of this paper. In addition, we discuss how the JCI tool is complementing other ongoing projects such as IBM’s High‐Performance Compiler for Java (HPCJ) and IceT’s metacomputing environment.

  17. Combining local and global optimisation for virtual camera control

    OpenAIRE

    Burelli, Paolo; Yannakakis, Georgios N.; 2010 IEEE Symposium on Computational Intelligence and Games

    2010-01-01

    Controlling a virtual camera in 3D computer games is a complex task. The camera is required to react to dynamically changing environments and produce high quality visual results and smooth animations. This paper proposes an approach that combines local and global search to solve the virtual camera control problem. The automatic camera control problem is described and it is decomposed into sub-problems; then a hierarchical architecture that solves each sub-problem using the most appropriate op...

  18. A novel combined SLAM based on RBPF-SLAM and EIF-SLAM for mobile system sensing in a large scale environment.

    Science.gov (United States)

    He, Bo; Zhang, Shujing; Yan, Tianhong; Zhang, Tao; Liang, Yan; Zhang, Hongjin

    2011-01-01

    Mobile autonomous systems are very important for marine scientific investigation and military applications. Many algorithms have been studied to deal with the computational efficiency problem required for large scale simultaneous localization and mapping (SLAM) and its related accuracy and consistency. Among these methods, submap-based SLAM is a more effective one. By combining the strength of two popular mapping algorithms, the Rao-Blackwellised particle filter (RBPF) and extended information filter (EIF), this paper presents a combined SLAM, an efficient submap-based solution to the SLAM problem in a large scale environment. RBPF-SLAM is used to produce local maps, which are periodically fused into an EIF-SLAM algorithm. RBPF-SLAM can avoid linearization of the robot model during operation and provide robust data association, while EIF-SLAM can improve the whole computational speed, and avoid the tendency of RBPF-SLAM to be over-confident. In order to further improve the computational speed in a real time environment, a binary-tree-based decision-making strategy is introduced. Simulation experiments show that the proposed combined SLAM algorithm significantly outperforms currently existing algorithms in terms of accuracy and consistency, as well as the computing efficiency. Finally, the combined SLAM algorithm is experimentally validated in a real environment by using the Victoria Park dataset.

  19. A Novel Combined SLAM Based on RBPF-SLAM and EIF-SLAM for Mobile System Sensing in a Large Scale Environment

    Directory of Open Access Journals (Sweden)

    Hongjin Zhang

    2011-10-01

    Full Text Available Mobile autonomous systems are very important for marine scientific investigation and military applications. Many algorithms have been studied to deal with the computational efficiency problem required for large scale Simultaneous Localization and Mapping (SLAM) and its related accuracy and consistency. Among these methods, submap-based SLAM is a more effective one. By combining the strength of two popular mapping algorithms, the Rao-Blackwellised particle filter (RBPF) and extended information filter (EIF), this paper presents a Combined SLAM—an efficient submap-based solution to the SLAM problem in a large scale environment. RBPF-SLAM is used to produce local maps, which are periodically fused into an EIF-SLAM algorithm. RBPF-SLAM can avoid linearization of the robot model during operation and provide robust data association, while EIF-SLAM can improve the whole computational speed, and avoid the tendency of RBPF-SLAM to be over-confident. In order to further improve the computational speed in a real time environment, a binary-tree-based decision-making strategy is introduced. Simulation experiments show that the proposed Combined SLAM algorithm significantly outperforms currently existing algorithms in terms of accuracy and consistency, as well as the computing efficiency. Finally, the Combined SLAM algorithm is experimentally validated in a real environment by using the Victoria Park dataset.
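
    The key property that makes the submap hand-off cheap is that, in information form, independent estimates fuse by simple addition. The numpy sketch below illustrates that mechanism for a single 2-D landmark seen by two local maps; it is a minimal illustration under that simplification, not the authors' full RBPF-to-EIF pipeline:

```python
import numpy as np

def fuse_submap(omega, xi, mean, cov):
    """Fold one submap's landmark estimate (mean, covariance) into a
    global information-filter state (information matrix omega,
    information vector xi). Independent estimates fuse by addition
    in information form."""
    info = np.linalg.inv(cov)          # submap's information matrix
    return omega + info, xi + info @ mean

# Two submaps observing the same 2-D landmark with equal uncertainty.
omega, xi = np.zeros((2, 2)), np.zeros(2)
omega, xi = fuse_submap(omega, xi, np.array([1.0, 2.0]), 0.5 * np.eye(2))
omega, xi = fuse_submap(omega, xi, np.array([1.2, 1.8]), 0.5 * np.eye(2))
fused_mean = np.linalg.solve(omega, xi)   # midpoint [1.1, 1.9]
```

    With equal covariances the fused mean is the midpoint of the two local estimates; unequal covariances would weight the more certain submap more heavily.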

  20. A survey of simultaneous localization and mapping on unstructured lunar complex environment

    Science.gov (United States)

    Wang, Yiqiao; Zhang, Wei; An, Pei

    2017-10-01

    Simultaneous localization and mapping (SLAM) technology is the key to realizing a lunar rover's intelligent perception and autonomous navigation. It embodies the autonomous ability of a mobile robot and has attracted considerable attention from researchers over the past thirty years. Visual sensors are valuable to SLAM research because they provide a wealth of information. Visual SLAM uses only images as external information to estimate the location of the robot and construct the environment map. Nowadays, SLAM technology still has problems when applied in large-scale, unstructured and complex environments. Based on the latest technology in the field of visual SLAM, this paper surveys and summarizes SLAM technology for use in the unstructured, complex environment of the lunar surface. In particular, we focus on summarizing and comparing feature detection and matching with SIFT, SURF and ORB, while discussing their advantages and disadvantages. We have analyzed the three main methods: SLAM based on the extended Kalman filter, SLAM based on the particle filter and SLAM based on graph optimization (EKF-SLAM, PF-SLAM and graph-based SLAM). Finally, this article summarizes and discusses the key scientific and technical difficulties that visual SLAM faces in the lunar context. At the same time, we explore frontier issues such as multi-sensor fusion SLAM and multi-robot cooperative SLAM technology. We also forecast the development trend of lunar rover SLAM technology and put forward some ideas for further research.
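
    As a concrete reference point for the EKF-SLAM family compared in the survey, a textbook prediction step for a unicycle rover pose looks like the sketch below; the motion model and noise values are generic illustrations, not taken from any surveyed system:

```python
import numpy as np

def ekf_predict(mu, sigma, v, w, dt, motion_noise):
    """One EKF-SLAM prediction step for a unicycle pose (x, y, theta);
    any landmark entries in the state are left untouched by motion."""
    x, y, theta = mu[0], mu[1], mu[2]
    mu = mu.copy()
    mu[0] = x + v * dt * np.cos(theta)
    mu[1] = y + v * dt * np.sin(theta)
    mu[2] = theta + w * dt
    G = np.eye(len(mu))                    # Jacobian of the motion model
    G[0, 2] = -v * dt * np.sin(theta)
    G[1, 2] = v * dt * np.cos(theta)
    sigma = G @ sigma @ G.T
    sigma[:3, :3] += motion_noise          # uncertainty grows for the pose only
    return mu, sigma

mu, sigma = np.zeros(3), 0.01 * np.eye(3)
mu, sigma = ekf_predict(mu, sigma, v=1.0, w=0.0, dt=1.0,
                        motion_noise=0.05 * np.eye(3))
# The rover advances 1 m along x; its pose covariance inflates.
```

    The linearization through the Jacobian G is exactly the step that PF-SLAM avoids and that graph-based SLAM defers to a global optimization.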

  1. New computer simulation technology of WSPEEDI for local and regional environmental assessment during nuclear emergency

    International Nuclear Information System (INIS)

    Chino, Masamichi; Furuno, Akiko; Terada, Hiroaki; Kitabata, Hideyuki

    2002-01-01

    The increase of nuclear power plants in the Asian region necessitates the capability to predict long-range atmospheric dispersion of radionuclides and the radiological impacts of a nuclear accident. For this purpose, we have developed a computer-based emergency response system, WSPEEDI. This paper aims at expanding the capability of WSPEEDI so that it can be applied to simultaneous multi-scale predictions on local and regional scales in the Asian region.

  2. The cellular environment in computer simulations of radiation-induced damage to DNA

    International Nuclear Information System (INIS)

    Moiseenko, V.V.; Waker, A.J.; Prestwich, W.V.

    1998-01-01

    Radiation-induced DNA single- and double-strand breaks were modeled for 660 keV photon radiation and scavenger capacity mimicking the cellular environment. Atomistic representation of DNA in B form with a first hydration shell was utilized to model direct and indirect damage. Monte Carlo generated electron tracks were used to model energy deposition in matter and to derive initial spatial distributions of species which appear in the medium following radiolysis. Diffusion of species was followed with time, and their reactions with DNA and each other were modeled in an encounter-controlled manner. Three methods to account for hydroxyl radical diffusion in a cellular environment were tested: assumed exponential survival, time-limited modeling and modeling of reactions between hydroxyl radicals and scavengers in an encounter-controlled manner. Although the method based on modeling scavenging in an encounter-controlled manner is more precise, it requires substantially more computer resources than either the exponential or time-limiting method. Scavenger concentrations of 0.5 and 0.15 M were considered using exponential and encounter-controlled methods with reaction rate set at 3 × 10^9 dm^3 mol^-1 s^-1. Diffusion length and strand break yields, predicted by these two methods for the same scavenger molarity, were different by 20%-30%. The method based on limiting time of chemistry follow-up to 10^-9 s leads to DNA damage and radical diffusion estimates similar to 0.5 M scavenger concentration in the other two methods. The difference observed in predictions made by the methods considered could be tolerated in computer simulations of DNA damage. (orig.)
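
    The simplest of the three scavenging treatments, assumed exponential survival, reduces to a one-liner. The rate constant below is the one quoted in the abstract; the rest is generic first-order kinetics, not the study's simulation code:

```python
import math

K = 3.0e9   # reaction rate from the study, dm^3 mol^-1 s^-1

def oh_survival(scavenger_molarity, t_seconds, k=K):
    """Survival probability of a hydroxyl radical under the 'assumed
    exponential survival' method: P(t) = exp(-k * [S] * t)."""
    return math.exp(-k * scavenger_molarity * t_seconds)

# Mean radical lifetime 1 / (k [S]) for the two concentrations considered:
lifetime_05 = 1.0 / (K * 0.5)     # about 0.67 ns at 0.5 M
lifetime_015 = 1.0 / (K * 0.15)   # about 2.2 ns at 0.15 M
```

    The sub-nanosecond lifetimes explain why cutting off the chemistry follow-up at 10^-9 s gives results comparable to the 0.5 M scavenger case.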

  3. The cellular environment in computer simulations of radiation-induced damage to DNA

    International Nuclear Information System (INIS)

    Moiseenko, V.V.; Hamm, R.N.; Waker, A.J.; Prestwich, W.V.

    1988-01-01

    Radiation-induced DNA single- and double-strand breaks were modeled for 660 keV photon radiation and scavenger capacity mimicking the cellular environment. Atomistic representation of DNA in B form with a first hydration shell was utilized to model direct and indirect damage. Monte Carlo generated electron tracks were used to model energy deposition in matter and to derive initial spatial distributions of species which appear in the medium following radiolysis. Diffusion of species was followed with time, and their reactions with DNA and each other were modeled in an encounter-controlled manner. Three methods to account for hydroxyl radical diffusion in a cellular environment were tested: assumed exponential survival, time-limited modeling and modeling of reactions between hydroxyl radicals and scavengers in an encounter-controlled manner. Although the method based on modeling scavenging in an encounter-controlled manner is more precise, it requires substantially more computer resources than either the exponential or time-limiting method. Scavenger concentrations of 0.5 and 0.15 M were considered using exponential and encounter-controlled methods with reaction rate set at 3 × 10^9 dm^3 mol^-1 s^-1. Diffusion length and strand break yields, predicted by these two methods for the same scavenger molarity, were different by 20%-30%. The method based on limiting time of chemistry follow-up to 10^-9 s leads to DNA damage and radical diffusion estimates similar to 0.5 M scavenger concentration in the other two methods. The difference observed in predictions made by the methods considered could be tolerated in computer simulations of DNA damage. (author)

  4. Detecting and Understanding the Impact of Cognitive and Interpersonal Conflict in Computer Supported Collaborative Learning Environments

    Science.gov (United States)

    Prata, David Nadler; Baker, Ryan S. J. d.; Costa, Evandro d. B.; Rose, Carolyn P.; Cui, Yue; de Carvalho, Adriana M. J. B.

    2009-01-01

    This paper presents a model which can automatically detect a variety of student speech acts as students collaborate within a computer supported collaborative learning environment. In addition, an analysis is presented which gives substantial insight as to how students' learning is associated with students' speech acts, knowledge that will…

  5. COSMIC REIONIZATION ON COMPUTERS. III. THE CLUMPING FACTOR

    Energy Technology Data Exchange (ETDEWEB)

    Kaurov, Alexander A.; Gnedin, Nickolay Y., E-mail: kaurov@uchicago.edu, E-mail: gnedin@fnal.gov [Department of Astronomy and Astrophysics, The University of Chicago, Chicago, IL 60637 (United States)

    2015-09-10

    We use fully self-consistent numerical simulations of cosmic reionization, completed under the Cosmic Reionization On Computers project, to explore how well the recombinations in the ionized intergalactic medium (IGM) can be quantified by the effective “clumping factor.” The density distribution in the simulations (and, presumably, in a real universe) is highly inhomogeneous and more-or-less smoothly varying in space. However, even in highly complex and dynamic environments, the concept of the IGM remains reasonably well-defined; the largest ambiguity comes from the unvirialized regions around galaxies that are over-ionized by the local enhancement in the radiation field (“proximity zones”). That ambiguity precludes computing the IGM clumping factor to better than about 20%. We also discuss a “local clumping factor,” defined over a particular spatial scale, and quantify its scatter on a given scale and its variation as a function of scale.

  6. Cosmic Reionization on Computers. III. The Clumping Factor

    Science.gov (United States)

    Kaurov, Alexander A.; Gnedin, Nickolay Y.

    2015-09-01

    We use fully self-consistent numerical simulations of cosmic reionization, completed under the Cosmic Reionization On Computers project, to explore how well the recombinations in the ionized intergalactic medium (IGM) can be quantified by the effective “clumping factor.” The density distribution in the simulations (and, presumably, in a real universe) is highly inhomogeneous and more-or-less smoothly varying in space. However, even in highly complex and dynamic environments, the concept of the IGM remains reasonably well-defined; the largest ambiguity comes from the unvirialized regions around galaxies that are over-ionized by the local enhancement in the radiation field (“proximity zones”). That ambiguity precludes computing the IGM clumping factor to better than about 20%. We also discuss a “local clumping factor,” defined over a particular spatial scale, and quantify its scatter on a given scale and its variation as a function of scale.
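
    The quantity being discussed is just the second moment of the IGM density field normalized by the squared mean, C = <n^2> / <n>^2. A toy numpy sketch (a synthetic lognormal field, not the project's simulation data) makes the definition and the role of the IGM selection explicit:

```python
import numpy as np

def clumping_factor(density, igm_mask=None):
    """Effective clumping factor C = <n^2> / <n>^2 over the cells
    flagged as IGM. The ambiguity discussed above (e.g. excluding
    over-ionized proximity zones) enters entirely through the mask."""
    n = density if igm_mask is None else density[igm_mask]
    return np.mean(n ** 2) / np.mean(n) ** 2

# A lognormal toy density field: clumping grows with the variance.
rng = np.random.default_rng(1)
field = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)
c = clumping_factor(field)   # about exp(sigma^2) = 1.28 for this field
```

    A perfectly uniform field gives C = 1; any inhomogeneity pushes C above 1, which is why recombinations in a clumpy IGM outpace the mean-density estimate.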

  7. Development of an Acoustic Localization Method for Cavitation Experiments in Reverberant Environments

    Science.gov (United States)

    Ranjeva, Minna; Thompson, Lee; Perlitz, Daniel; Bonness, William; Capone, Dean; Elbing, Brian

    2011-11-01

    Cavitation is a major concern for the US Navy since it can cause ship damage and produce unwanted noise. The ability to precisely locate cavitation onset in laboratory-scale experiments is essential for proper design that will minimize this undesired phenomenon. Cavitation onset is determined more accurately acoustically than visually. However, if other parts of the model begin to cavitate prior to the component of interest, the acoustic data is contaminated with spurious noise. Consequently, cavitation onset is widely determined by optically locating the event of interest. The current research effort aims at developing an acoustic localization scheme for reverberant environments such as water tunnels. Currently, cavitation bubbles are being induced in a static water tank with a laser, allowing the localization techniques to be refined with the bubble at a known location. The source is located with the use of acoustic data collected with hydrophones and analyzed using signal processing techniques. To verify the accuracy of the acoustic scheme, the events are simultaneously monitored visually with the use of a high-speed camera. Once refined, testing will be conducted in a water tunnel. This research was sponsored by the Naval Engineering Education Center (NEEC).
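
    The core of hydrophone-based source localization is estimating arrival-time differences between sensor pairs. The sketch below shows the standard cross-correlation estimate on a synthetic click; it is a generic signal-processing illustration, not the project's actual scheme, which must additionally cope with reverberation:

```python
import numpy as np

def time_delay(sig_a, sig_b, fs):
    """Estimate the difference in arrival time of one event at two
    hydrophones by peak-picking the cross-correlation; positive when
    sig_a is a delayed copy of sig_b. Real reverberant tanks need more
    care (multipath suppression, windowing), so this is only a sketch."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = int(np.argmax(corr)) - (len(sig_b) - 1)   # lag in samples
    return lag / fs

# Synthetic check: the same click reaches hydrophone B 25 samples late.
fs = 100_000
click = np.zeros(1000)
click[100:110] = 1.0
delayed = np.roll(click, 25)
tau = time_delay(click, delayed, fs)   # -25 samples -> -250 microseconds
```

    With delays from three or more hydrophone pairs, the source position follows from multilateration against the known sensor geometry.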

  8. Computer simulation of local atomic displacements in alloys. Application to Guinier-Preston zones in Al-Cu

    International Nuclear Information System (INIS)

    Kyobu, J.; Murata, Y.; Morinaga, M.

    1994-01-01

    A new computer program has been developed for the simulation of local atomic displacements in alloys with face-centered-cubic and body-centered-cubic lattices. The combined use of this program with the Gehlen-Cohen program for the simulation of chemical short-range order completely describes atomic fluctuations in alloys. The method has been applied to the structural simulation of Guinier-Preston (GP) zones in an Al-Cu alloy, using the experimental data of Matsubara and Cohen. Characteristic displacements of atoms have been observed around the GP zones and new structural models including local displacements have been proposed for a single-layer zone and several multilayer zones. (orig.)

  9. Transient Localization in Shallow Water Environments

    National Research Council Canada - National Science Library

    Brune, Joachim

    1998-01-01

    .... Measures of robustness to be examined include the size of the localization footprint on the ambiguity surface and the peak-to-sidelobe levels in the presence of environmental mismatch and noise...

  10. Local Competition-Based Superpixel Segmentation Algorithm in Remote Sensing.

    Science.gov (United States)

    Liu, Jiayin; Tang, Zhenmin; Cui, Ying; Wu, Guoxing

    2017-06-12

    Remote sensing technologies have been widely applied in the monitoring, synthesis and modeling of urban environments. By incorporating spatial information in perceptually coherent regions, superpixel-based approaches can effectively eliminate the "salt and pepper" phenomenon which is common in pixel-wise approaches. Compared with fixed-size windows, superpixels have adaptive sizes and shapes for different spatial structures. Moreover, superpixel-based algorithms can significantly improve computational efficiency owing to the greatly reduced number of image primitives. Hence, the superpixel algorithm, as a preprocessing technique, is increasingly used in remote sensing and many other fields. In this paper, we propose a superpixel segmentation algorithm called Superpixel Segmentation with Local Competition (SSLC), which utilizes a local competition mechanism to construct energy terms and label pixels. The local competition mechanism gives the energy terms locality and relativity, and thus the proposed algorithm is less sensitive to the diversity of image content and scene layout. Consequently, SSLC can achieve consistent performance in different image regions. In addition, the Probability Density Function (PDF), which is estimated by Kernel Density Estimation (KDE) with a Gaussian kernel, is introduced to describe the color distribution of superpixels as a more sophisticated and accurate measure. To reduce computational complexity, a boundary optimization framework is introduced to handle only boundary pixels instead of the whole image. We conduct experiments to benchmark the proposed algorithm against other state-of-the-art ones on the Berkeley Segmentation Dataset (BSD) and remote sensing images. Results demonstrate that the SSLC algorithm yields the best overall performance, while its computation time remains competitive.
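
    The KDE ingredient used to describe a superpixel's color distribution can be sketched in a few lines of numpy. This is a generic 1-D Gaussian-kernel estimate on hypothetical intensity samples, not the SSLC implementation:

```python
import numpy as np

def kde_pdf(samples, x, bandwidth):
    """1-D kernel density estimate with a Gaussian kernel:
    p(x) = (1 / (N h)) * sum_i K((x - x_i) / h)."""
    z = (x[:, None] - samples[None, :]) / bandwidth
    kernels = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return kernels.sum(axis=1) / (len(samples) * bandwidth)

# Intensity samples from a hypothetical superpixel, peaked near 120.
rng = np.random.default_rng(0)
samples = rng.normal(120.0, 5.0, size=500)
grid = np.linspace(80.0, 160.0, 161)
pdf = kde_pdf(samples, grid, bandwidth=3.0)
# The estimate integrates to about 1 and peaks near the true mode.
```

    Compared with a single mean color, such a density captures multi-modal color content within a region, which is what makes it a more accurate measure for the competition between neighboring superpixels.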

  11. A New Localization System for Indoor Service Robots in Low Luminance and Slippery Indoor Environment Using Afocal Optical Flow Sensor Based Sensor Fusion

    Directory of Open Access Journals (Sweden)

    Dong-Hoon Yi

    2018-01-01

    Full Text Available In this paper, a new localization system utilizing afocal optical flow sensor (AOFS) based sensor fusion for indoor service robots in low luminance and slippery environments is proposed, where conventional localization systems do not perform well. To accurately estimate the moving distance of a robot in a slippery environment, the robot was equipped with an AOFS along with two conventional wheel encoders. To estimate the orientation of the robot, we adopted a forward-viewing mono-camera and a gyroscope. In a very low luminance environment, it is hard to conduct conventional feature extraction and matching for localization; instead, the interior space structure and robot orientation were assessed from an image. To enhance the appearance of image boundaries, a rolling guidance filter was applied after histogram equalization. The proposed system was developed to be operable on a low-cost processor and implemented on a consumer robot. Experiments were conducted in a low illumination condition of 0.1 lx and in a carpeted environment. The robot moved 20 times along a 1.5 × 2.0 m square trajectory. When only wheel encoders and a gyroscope were used for robot localization, the maximum position error was 10.3 m and the maximum orientation error was 15.4°. Using the proposed system, the maximum position error was 0.8 m and the maximum orientation error was within 1.0°.

  12. A New Localization System for Indoor Service Robots in Low Luminance and Slippery Indoor Environment Using Afocal Optical Flow Sensor Based Sensor Fusion.

    Science.gov (United States)

    Yi, Dong-Hoon; Lee, Tae-Jae; Cho, Dong-Il Dan

    2018-01-10

    In this paper, a new localization system utilizing afocal optical flow sensor (AOFS) based sensor fusion for indoor service robots in low luminance and slippery environments is proposed, where conventional localization systems do not perform well. To accurately estimate the moving distance of a robot in a slippery environment, the robot was equipped with an AOFS along with two conventional wheel encoders. To estimate the orientation of the robot, we adopted a forward-viewing mono-camera and a gyroscope. In a very low luminance environment, it is hard to conduct conventional feature extraction and matching for localization; instead, the interior space structure and robot orientation were assessed from an image. To enhance the appearance of image boundaries, a rolling guidance filter was applied after histogram equalization. The proposed system was developed to be operable on a low-cost processor and implemented on a consumer robot. Experiments were conducted in a low illumination condition of 0.1 lx and in a carpeted environment. The robot moved 20 times along a 1.5 × 2.0 m square trajectory. When only wheel encoders and a gyroscope were used for robot localization, the maximum position error was 10.3 m and the maximum orientation error was 15.4°. Using the proposed system, the maximum position error was 0.8 m and the maximum orientation error was within 1.0°.

  13. ATLAS off-Grid sites (Tier 3) monitoring. From local fabric monitoring to global overview of the VO computing activities

    CERN Document Server

    PETROSYAN, A; The ATLAS collaboration; BELOV, S; ANDREEVA, J; KADOCHNIKOV, I

    2012-01-01

    The ATLAS Distributed Computing activities have so far concentrated on the "central" part of the experiment computing system, namely the first 3 tiers (the CERN Tier0, 10 Tier1 centers and over 60 Tier2 sites). Many ATLAS Institutes and National Communities have deployed (or intend to deploy) Tier-3 facilities. Tier-3 centers consist of non-pledged resources, which are usually dedicated to data analysis tasks by geographically close or local scientific groups, and which usually comprise a range of architectures without Grid middleware. Therefore a substantial part of the ATLAS monitoring tools, which make use of Grid middleware, cannot be used for a large fraction of Tier3 sites. The presentation will describe the T3mon project, which aims to develop a software suite for monitoring the Tier3 sites, both from the perspective of the local site administrator and that of the ATLAS VO, thereby enabling a global view of the contribution from Tier3 sites to the ATLAS computing activities. Special attention in p...

  14. Coarse-grained models using local-density potentials optimized with the relative entropy: Application to implicit solvation

    International Nuclear Information System (INIS)

    Sanyal, Tanmoy; Shell, M. Scott

    2016-01-01

    Bottom-up multiscale techniques are frequently used to develop coarse-grained (CG) models for simulations at extended length and time scales but are often limited by a compromise between computational efficiency and accuracy. The conventional approach to CG nonbonded interactions uses pair potentials which, while computationally efficient, can neglect the inherently multibody contributions of the local environment of a site to its energy, due to degrees of freedom that were coarse-grained out. This effect often causes the CG potential to depend strongly on the overall system density, composition, or other properties, which limits its transferability to states other than the one at which it was parameterized. Here, we propose to incorporate multibody effects into CG potentials through additional nonbonded terms, beyond pair interactions, that depend in a mean-field manner on local densities of different atomic species. This approach is analogous to embedded atom and bond-order models that seek to capture multibody electronic effects in metallic systems. We show that the relative entropy coarse-graining framework offers a systematic route to parameterizing such local density potentials. We then characterize this approach in the development of implicit solvation strategies for interactions between model hydrophobes in an aqueous environment.

  15. Implementing interactive computing in an object-oriented environment

    Directory of Open Access Journals (Sweden)

    Frederic Udina

    2000-04-01

    Full Text Available Statistical computing when input/output is driven by a Graphical User Interface is considered. A proposal is made for automatic control of computational flow to ensure that only strictly required computations are actually carried out. The computational flow is modeled by a directed graph for implementation in any object-oriented programming language with symbolic manipulation capabilities. A complete implementation example is presented to compute and display frequency-based piecewise linear density estimators such as histograms or frequency polygons.

  16. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to assign computation to a great number of distributed computers rather than local computers ...

  17. Adaptive particle filter for localization problem in service robotics

    Directory of Open Access Journals (Sweden)

    Heilig Alexander

    2018-01-01

    Full Text Available In this paper we present a statistical approach to likelihood computation and an adaptive resampling algorithm for particle filters using low-cost ultrasonic sensors in the context of service robotics. This increases the efficiency of the particle filter in the Monte Carlo Localization problem by preventing sample impoverishment and ensuring that it converges towards the most likely particle while systematic resampling keeps less likely ones. The proposed algorithms were developed in the ROS framework, and simulation was done in the Gazebo environment. Experiments using a differential drive mobile platform with 4 ultrasonic sensors in an office environment show that our approach provides a strong improvement over particle filters with fixed sample sizes.
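    The systematic resampling step mentioned in this abstract can be sketched as follows; this is the standard textbook scheme, with the weights and random seed chosen only for illustration, not taken from the paper:

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling for a particle filter: a single random offset
    followed by N evenly spaced pointers into the cumulative weights.
    This is the standard textbook scheme, not code from the paper."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n   # one draw in [0, 1/n)
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                            # guard against round-off
    return np.searchsorted(cumulative, positions)   # ancestor indices

weights = np.array([0.1, 0.2, 0.6, 0.1])            # normalized importance weights
idx = systematic_resample(weights, np.random.default_rng(0))
# The high-weight particle (index 2) survives in duplicate, while low-weight
# particles are thinned out, which limits sample impoverishment.
```

    Because the pointers are evenly spaced, a particle's offspring count can differ from its expectation by less than one, which is why this scheme preserves diversity better than independent multinomial draws.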

  18. Design of UAV-Embedded Microphone Array System for Sound Source Localization in Outdoor Environments

    Science.gov (United States)

    Hoshiba, Kotaro; Washizaki, Kai; Wakabayashi, Mizuho; Ishiki, Takahiro; Bando, Yoshiaki; Gabriel, Daniel; Nakadai, Kazuhiro; Okuno, Hiroshi G.

    2017-01-01

    In search and rescue activities, unmanned aerial vehicles (UAV) should exploit sound information to compensate for poor visual information. This paper describes the design and implementation of a UAV-embedded microphone array system for sound source localization in outdoor environments. Four critical development problems included water-resistance of the microphone array, efficiency in assembling, reliability of wireless communication, and sufficiency of visualization tools for operators. To solve these problems, we developed a spherical microphone array system (SMAS) consisting of a microphone array, a stable wireless network communication system, and intuitive visualization tools. The performance of SMAS was evaluated with simulated data and a demonstration in the field. Results confirmed that the SMAS provides highly accurate localization, water resistance, prompt assembly, stable wireless communication, and intuitive information for observers and operators. PMID:29099790

  19. Sound Localization in Multisource Environments

    Science.gov (United States)

    2009-03-01

    A total of 7 paid volunteer listeners (3 males and 4 females, 20-25 years of age) participated in the experiment. All had normal hearing (i.e... effects of the loudspeaker frequency responses, and were then sent from an experimental control computer to a Mark of the Unicorn (MOTU 24 I/O) digital-to... after the overall multisource stimulus has been presented (the 'post-cue' condition). 3.2 Methods 3.2.1 Listeners Eight listeners, ranging in age from

  20. Local pulmonary structure classification for computer-aided nodule detection

    Science.gov (United States)

    Bahlmann, Claus; Li, Xianlin; Okada, Kazunori

    2006-03-01

    We propose a new method of classifying the local structure types, such as nodules, vessels, and junctions, in thoracic CT scans. This classification is important in the context of computer-aided detection (CAD) of lung nodules. The proposed method can be used as a post-processing component of any lung CAD system. In such a scenario, the classification results provide an effective means of removing false positives caused by vessels and junctions, thus improving overall performance. As its main advantage, the proposed solution transforms the complex problem of classifying various 3D topological structures into a much simpler 2D data clustering problem, to which more generic and flexible solutions are available in the literature, and which is better suited for visualization. Given a nodule candidate, our solution first robustly fits an anisotropic Gaussian to the data. The resulting Gaussian center and spread parameters are used to affine-normalize the data domain so as to warp the fitted anisotropic ellipsoid into a fixed-size isotropic sphere. We propose an automatic method to extract a 3D spherical manifold containing the appropriate bounding surface of the target structure. Scale selection is performed by a data-driven entropy minimization approach. The manifold is analyzed for high-intensity clusters, corresponding to protruding structures. Techniques involve EM clustering with automatic mode number estimation, directional statistics, and hierarchical clustering with a modified Bhattacharyya distance. The estimated number of high-intensity clusters explicitly determines the type of pulmonary structure: nodule (0), attached nodule (1), vessel (2), junction (≥3). We show accurate classification results for selected examples in thoracic CT scans. This local procedure is more flexible and efficient than the current state of the art and will help to improve the accuracy of general lung CAD systems.
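    The Bhattacharyya distance underlying the clustering step in this abstract has a simple closed form for 1-D Gaussians; the sketch below shows the textbook version, not the paper's modified variant:

```python
import math

def bhattacharyya_gauss(mu1, var1, mu2, var2):
    """Standard Bhattacharyya distance between two 1-D Gaussians.
    (The paper uses a modified variant; this is the textbook form.)"""
    mean_term = 0.25 * (mu1 - mu2) ** 2 / (var1 + var2)
    var_term = 0.5 * math.log((var1 + var2) / (2.0 * math.sqrt(var1 * var2)))
    return mean_term + var_term

# Identical distributions have zero distance; it grows as the means separate.
d0 = bhattacharyya_gauss(0.0, 1.0, 0.0, 1.0)   # → 0.0
d1 = bhattacharyya_gauss(0.0, 1.0, 2.0, 1.0)   # → 0.5
```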

  1. Transient Localization in Shallow Water Environments

    National Research Council Canada - National Science Library

    Brune, Joachim

    1998-01-01

    .... A full-wave PE model is used to produce broadband replicas. Both model-generated synthetic signals, which provide baseline results, and measured pulses in a shallow water environment are analyzed...

  2. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  3. The Use of Engineering Design Concept for Computer Programming Course: A Model of Blended Learning Environment

    Science.gov (United States)

    Tritrakan, Kasame; Kidrakarn, Pachoen; Asanok, Manit

    2016-01-01

    The aim of this research is to develop a learning model which blends factors from learning environment and engineering design concept for learning in computer programming course. The usage of the model was also analyzed. This study presents the design, implementation, and evaluation of the model. The research methodology is divided into three…

  4. Sound localization and word discrimination in reverberant environment in children with developmental dyslexia

    Directory of Open Access Journals (Sweden)

    Wendy Castro-Camacho

    2015-04-01

    Full Text Available Objective Compare whether localization of sounds and word discrimination in a reverberant environment differ between children with dyslexia and controls. Method We studied 30 children with dyslexia and 30 controls. Sound and word localization and discrimination were studied at five angles from the left to the right auditory field (-90°, -45°, 0°, +45°, +90°), under reverberant and non-reverberant conditions; correct answers were compared. Results Spatial localization of words in the non-reverberant test was deficient in children with dyslexia at 0° and +90°. Spatial localization in the reverberant test was altered in children with dyslexia at all angles except -90°. Word discrimination in the non-reverberant test showed poor performance in children with dyslexia at left angles. In the reverberant test, children with dyslexia exhibited deficiencies at the -45°, -90°, and +45° angles. Conclusion Children with dyslexia may have problems locating sounds and discriminating words at extreme locations of the horizontal plane in classrooms with reverberation.

  5. Elastic Scheduling of Scientific Workflows under Deadline Constraints in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Nazia Anwar

    2018-01-01

    Full Text Available Scientific workflow applications are collections of several structured activities and fine-grained computational tasks. Scientific workflow scheduling in cloud computing is a challenging research topic due to its distinctive features. In cloud environments, it has become critical to perform efficient task scheduling resulting in reduced scheduling overhead, minimized cost and maximized resource utilization while still meeting the user-specified overall deadline. This paper proposes a strategy, Dynamic Scheduling of Bag of Tasks based workflows (DSB), for scheduling scientific workflows with the aim of minimizing the financial cost of leasing Virtual Machines (VMs) under a user-defined deadline constraint. The proposed model groups the workflow into Bags of Tasks (BoTs) based on data dependency and priority constraints and thereafter optimizes the allocation and scheduling of BoTs on elastic, heterogeneous and dynamically provisioned cloud resources called VMs in order to attain the proposed method's objectives. The proposed approach considers pay-as-you-go Infrastructure as a Service (IaaS) clouds having inherent features such as elasticity, abundance, heterogeneity and VM provisioning delays. A trace-based simulation using benchmark scientific workflows representing real-world applications demonstrates a significant reduction in workflow computation cost while the workflow deadline is met. The results validate that the proposed model produces better success rates in meeting deadlines and better cost efficiencies in comparison to adapted state-of-the-art algorithms for similar problems.

  6. Distributed computing environments for future space control systems

    Science.gov (United States)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  7. Information-Fusion Methods Based Simultaneous Localization and Mapping for Robot Adapting to Search and Rescue Postdisaster Environments

    Directory of Open Access Journals (Sweden)

    Hongling Wang

    2018-01-01

    Full Text Available The first application of unique information-fusion SLAM (IF-SLAM) methods is developed in this paper for mobile robots performing simultaneous localization and mapping (SLAM) adapted to search and rescue (SAR) environments. Several fusion approaches are proposed: parallel measurement filtering, exploration trajectory fusing, and combining sensors' measurements with mobile robots' trajectories. Novel integration particle filter (IPF) and optimal improved EKF (IEKF) algorithms are derived for information-fusion systems to perform the SLAM task in SAR scenarios. The information-fusion architecture consists of multiple robots and multiple sensors (MAM); the robots mount on-board laser range finder (LRF) sensors, localization sonars, gyro odometry, a Kinect sensor, an RGB-D camera, and other proprioceptive sensors. This information-fusion SLAM (IF-SLAM) is compared with conventional methods, which indicates that the fusion trajectory is more consistent with the estimated trajectories and the real observation trajectories. Simulations and experiments of the SLAM process were conducted in both a cluttered indoor environment and an outdoor collapsed unstructured scenario, and the experimental results validate the effectiveness of the proposed information-fusion methods in improving SLAM performance in SAR scenarios.

  8. Detecting Sybil Attacks in Cloud Computing Environments Based on Fail-Stop Signature

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2017-03-01

    Full Text Available Due to the loosely coupled property of cloud computing environments, no node has complete knowledge of the system. For this reason, detecting a Sybil attack in cloud computing environments is a non-trivial task. In such a dynamic system, the use of algorithms based on tree or ring structures for collecting the global state of the system has unfortunate downsides, that is, the structure should be re-constructed in the presence of node joining and leaving. In this paper, we propose an unstructured Sybil attack detection algorithm in cloud computing environments. Our proposed algorithm uses one-to-one communication primitives rather than broadcast primitives and, therefore, the message complexity can be reduced. In our algorithmic design, attacker nodes forging multiple identities are effectively detected by normal nodes with the fail-stop signature scheme. We show that, regardless of the number of attacker nodes, our Sybil attack detection algorithm is able to reach consensus.

  9. Virtual reality exposure treatment of agoraphobia: a comparison of computer automatic virtual environment and head-mounted display

    NARCIS (Netherlands)

    Meyerbröker, K.; Morina, N.; Kerkhof, G.; Emmelkamp, P.M.G.; Wiederhold, B.K.; Bouchard, S.; Riva, G.

    2011-01-01

    In this study the effects of virtual reality exposure therapy (VRET) were investigated in patients with panic disorder and agoraphobia. The level of presence in VRET was compared between using either a head-mounted display (HMD) or a computer automatic virtual environment (CAVE). Results indicate

  10. ARCHER, a new Monte Carlo software tool for emerging heterogeneous computing environments

    International Nuclear Information System (INIS)

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2015-01-01

    Highlights: • A fast Monte Carlo based radiation transport code ARCHER was developed. • ARCHER supports different hardware including CPU, GPU and Intel Xeon Phi coprocessor. • Code is benchmarked against MCNP for medical applications. • A typical CT scan dose simulation only takes 6.8 s on an NVIDIA M2090 GPU. • GPU and coprocessor-based codes are 5–8 times faster than the CPU-based codes. - Abstract: The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs and Xeon Phi coprocessors. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software, called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented

  11. Probability-Based Determination Methods for Service Waiting in Service-Oriented Computing Environments

    Science.gov (United States)

    Zeng, Sen; Huang, Shuangxi; Liu, Yang

    Cooperative business processes (CBP)-based service-oriented enterprise networks (SOEN) are emerging with the significant advances of enterprise integration and service-oriented architecture. Performance prediction and optimization for CBP-based SOEN is very complex. To meet these challenges, one of the key points is to reduce an abstract service's waiting number of physical services. This paper introduces a probability-based determination method (PBDM) for an abstract service's waiting number, M i , and time span, τ i , for its physical services. The determination of M i and τ i is according to the physical services' arriving rule and the distribution functions of their overall performance. In PBDM, the arriving probability of the physical services with the best overall performance value is a pre-defined reliability. PBDM makes thorough use of the information in the physical services' arriving rule and performance distribution functions, which improves the computational efficiency of scheme design and performance optimization for collaborative business processes in service-oriented computing environments.

  12. Long-term changes of information environments and computer anxiety of nurse administrators in Japan.

    Science.gov (United States)

    Majima, Yukie; Izumi, Takako

    2013-01-01

    In Japan, medical information systems, including electronic medical records, are increasingly being introduced in medical and nursing fields. Nurse administrators, who are involved in the introduction of medical information systems and who must make proper judgments, are particularly required to have at least minimal knowledge of computers and networks and the ability to think about easy-to-use medical information systems. However, few of the current generation of nurse administrators studied information science subjects in their basic education curriculum. It can be said that information education for nurse administrators has become a pressing issue. Consequently, in this study, we conducted a survey of participants taking the first-level program of the education course for Japanese certified nurse administrators to ascertain actual conditions such as the information environments nurse administrators work in and their anxiety toward computers. Comparisons over the seven years since 2004 revealed that although the introduction of electronic medical records in hospitals was progressing, little change was observed in attributes of participants taking the course, such as computer anxiety.

  13. RSSI-Based Distance Estimation Framework Using a Kalman Filter for Sustainable Indoor Computing Environments

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2016-11-01

    Full Text Available Given that location information is the key to providing a variety of services in sustainable indoor computing environments, accurate locations must be obtained. A location can be estimated from three distances to three fixed points. Therefore, if the distance between two points can be measured or estimated accurately, locations in indoor environments can be estimated. To increase the accuracy of the measured distance, noise filtering, signal revision, and distance estimation processes are generally performed. This paper proposes a novel framework for estimating the distance between a beacon and an access point (AP) in a sustainable indoor computing environment. Diverse types of received signal strength indications (RSSIs) are used for WiFi, Bluetooth, and radio signals, and the proposed distance estimation framework is unique in that it is independent of the specific wireless signal involved, being based on the Bluetooth signal of the beacon. Generally, RSSI measurement, noise filtering, and revision are required for distance estimation using RSSIs. The employed RSSIs are first measured from an AP, with multiple APs sometimes used to increase the accuracy of the distance estimation. Owing to the inevitable presence of noise in the measured RSSIs, the application of noise filtering is essential, and further revision is used to address the inaccuracy and instability that characterize RSSIs measured in an indoor environment. The revised RSSIs are then used to estimate the distance. The proposed distance estimation framework uses one AP to measure the RSSIs, a Kalman filter to eliminate noise, and a log-distance path loss model to revise the measured RSSIs. In the experimental implementation of the framework, both an RSSI filter and a Kalman filter were respectively used for noise elimination to comparatively evaluate the performance of the latter for the specific application. The Kalman filter was found to reduce the accumulated errors by 8
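    The two revision steps this abstract names, Kalman filtering of raw RSSIs followed by a log-distance path loss conversion, can be sketched minimally as follows; all constants (noise variances, the -59 dBm reference at 1 m, path loss exponent 2) are assumptions for illustration, not the paper's tuned values:

```python
def smooth_rssi(measurements, process_var=1e-3, meas_var=4.0):
    """Scalar Kalman filter over a stream of RSSI readings (dBm).
    The noise variances here are illustrative assumptions."""
    x, p = measurements[0], 1.0        # state estimate and its variance
    for z in measurements[1:]:
        p += process_var               # predict: RSSI assumed near-constant
        k = p / (p + meas_var)         # Kalman gain
        x += k * (z - x)               # correct with the new measurement
        p *= 1.0 - k
    return x

def rssi_to_distance(rssi, rssi_at_1m=-59.0, path_loss_exp=2.0):
    """Log-distance path loss model: d = 10 ** ((RSSI_1m - RSSI) / (10 n))."""
    return 10.0 ** ((rssi_at_1m - rssi) / (10.0 * path_loss_exp))

readings = [-72.1, -70.4, -73.0, -71.2, -70.8]   # noisy Bluetooth RSSIs
d = rssi_to_distance(smooth_rssi(readings))      # a few meters with this calibration
```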

  14. Do Social Computing Make You Happy? A Case Study of Nomadic Children in Mixed Environments

    DEFF Research Database (Denmark)

    Christensen, Bent Guldbjerg

    2005-01-01

    In this paper I describe a perspective on ambient, ubiquitous, and pervasive computing called the happiness perspective. By using the happiness perspective, the application domain and how the technology is used and experienced, becomes a central and integral part of perceiving ambient technology....... will use the perspective in a case study on field test experiments with nomadic children in mixed environments using the eBag system....

  15. A Robust Localization, Slip Estimation, and Compensation System for WMR in the Indoor Environments

    Directory of Open Access Journals (Sweden)

    Zakir Ullah

    2018-05-01

    Full Text Available A novel approach is proposed for path tracking of a Wheeled Mobile Robot (WMR) in the presence of unknown lateral slip. Much of the existing work has assumed pure rolling conditions between the wheel and the ground. Under pure rolling conditions, the wheels of a WMR are supposed to roll without slipping. Complex wheel-ground interactions, acceleration, and steering system noise are the factors that cause WMR wheel slip. A basic research problem in this context is localization and slip estimation of a WMR from a stream of noisy sensor data when the robot is moving on a slippery surface or moving at high speed. A DecaWave-based ranging system and a Particle Filter (PF) are good candidates for estimating the location of a WMR indoors and outdoors. Unfortunately, wheel slip limits the ultimate performance that can be achieved by a real-world implementation of the PF, because location estimation systems typically rely in part on the robot heading. A small error in the WMR heading leads to a large error in the PF location estimate because of its cumulative nature. In order to enhance the tracking and localization performance of the PF in environments where the main source of error in the PF location estimate is angular noise, two methods were used for heading estimation of the WMR: (1) Reinforcement Learning (RL) and (2) Location-based Heading Estimation (LHE). Trilateration is applied to the DecaWave-based ranging system to calculate the probable location of the WMR; this noisy location, along with the current PF mean, is used to estimate the WMR heading with the above two methods. Besides the WMR location calculation, the DecaWave-based ranging system is also used to update the PF weights. The localization and tracking performance of the PF is significantly improved by incorporating heading error in localization using RL and LHE. Desired trajectory information is then used to develop an algorithm for extracting the lateral slip along
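    The trilateration step described in this abstract, computing a 2-D position from distances to three fixed anchors, can be sketched with the standard linearization; the anchor coordinates and ranges below are illustrative, not from the paper:

```python
import numpy as np

def trilaterate(anchors, distances):
    """Estimate a 2-D position from distances to three fixed anchors.

    Subtracting the first range equation from the other two cancels the
    quadratic terms, leaving a 2x2 linear system (a standard linearization,
    not code from the paper)."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    A = np.array([[2 * (x2 - x1), 2 * (y2 - y1)],
                  [2 * (x3 - x1), 2 * (y3 - y1)]])
    b = np.array([r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2,
                  r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2])
    return np.linalg.solve(A, b)

anchors = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
pos = trilaterate(anchors, [2.5, 2.5, 2.5])
# → approximately (2.0, 1.5): the point 2.5 m from each anchor.
```

    With noisy ranges or more than three anchors, a least-squares solve (`np.linalg.lstsq`) over the same linearized system is the usual generalization.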

  16. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    Science.gov (United States)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are "archivable", transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess the feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.

  17. Educational Game Design. Bridging the gap between computer-based learning and experimental learning environments

    DEFF Research Database (Denmark)

    Andersen, Kristine

    2007-01-01

    Considering the rapidly growing amount of digital educational materials, only a few of them bridge the gap between experimental learning environments and computer-based learning environments (Gardner, 1991). Observations from two cases in primary school and lower secondary school in the subject...... with a prototype of a MOO storyline. The aim of the MOO storyline is to challenge the potential of dialogue, user involvement, and learning responsibility and to use the children's natural curiosity and motivation for game playing, especially when digital games involve other children. The paper proposes a model......, based on the narrative approach for experimental learning subjects, relying on ideas from Csikszentmihalyi's notion of flow (Csikszentmihalyi, 1991), storyline-pedagogy (Meldgaard, 1994) and ideas from Howard Gardner (Gardner, 1991). The model forms the basis for educational games to be used in home...

  18. Young children reorient by computing layout geometry, not by matching images of the environment.

    Science.gov (United States)

    Lee, Sang Ah; Spelke, Elizabeth S

    2011-02-01

    Disoriented animals from ants to humans reorient in accord with the shape of the surrounding surface layout: a behavioral pattern long taken as evidence for sensitivity to layout geometry. Recent computational models suggest, however, that the reorientation process may not depend on geometrical analyses but instead on the matching of brightness contours in 2D images of the environment. Here we test this suggestion by investigating young children's reorientation in enclosed environments. Children reoriented by extremely subtle geometric properties of the 3D layout: bumps and ridges that protruded only slightly off the floor, producing edges with low contrast. Moreover, children failed to reorient by prominent brightness contours in continuous layouts with no distinctive 3D structure. The findings provide evidence that geometric layout representations support children's reorientation.

  19. The hELENa project - I. Stellar populations of early-type galaxies linked with local environment and galaxy mass

    OpenAIRE

    Sybilska, A.; Lisker, T.; Kuntschner, H.; Vazdekis, A.; van de Ven, G.; Peletier, R.; Falcón-Barroso, J.; Vijayaraghavan, R.; Janz, J.

    2017-01-01

    We present the first in a series of papers in T$h$e role of $E$nvironment in shaping $L$ow-mass $E$arly-type $N$earby g$a$laxies (hELENa) project. In this paper we combine our sample of 20 low-mass early types (dEs) with 258 massive early types (ETGs) from the ATLAS$^{\\mathrm{3D}}$ survey - all observed with the SAURON integral field unit (IFU) - to investigate early-type galaxies' stellar population scaling relations and the dependence of the population properties on local environment, exten...

  20. Does the local food environment around schools affect diet? Longitudinal associations in adolescents attending secondary schools in East London

    Directory of Open Access Journals (Sweden)

    Smith Dianna

    2013-01-01

    Full Text Available Abstract Background The local retail food environment around schools may act as a potential risk factor for adolescent diet. However, international research utilising cross-sectional designs to investigate associations between retail food outlet proximity to schools and diet provides equivocal support for an effect. In this study we employ longitudinal perspectives in order to answer the following two questions. First, how has the local retail food environment around secondary schools changed over time, and second, is this change associated with change in the diet of students at these schools? Methods The locations of retail food outlets and schools in 2001 and 2005 were geo-coded in three London boroughs. Network analysis in a Geographic Information System (GIS) ascertained the number, minimum and median distances to food outlets within 400 m and 800 m of the school location. Outcome measures were ‘healthy’ and ‘unhealthy’ diet scores derived from adolescent self-reported data in the Research with East London Adolescents: Community Health Survey (RELACHS). Adjusted associations between distance from school to food retail outlets, counts of outlets near schools and diet scores were assessed using longitudinal (2001–2005, n=757) approaches. Results Between 2001 and 2005 the number of takeaways and grocers/convenience stores within 400 m of schools increased, with many more grocers reported within 800 m of schools in 2005 (p Conclusions The results provide some evidence that the local food environment around secondary schools may influence adolescent diet, though effects were small. Further research on adolescents’ food purchasing habits with larger samples in varied geographic regions is required to identify robust relationships between proximity and diet, as small numbers may, because of confounding, dilute food environment effects. Data on individual foods purchased in all shop formats may clarify the frequent, overly simple

  1. An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments

    Directory of Open Access Journals (Sweden)

    Higinio Mora

    2017-10-01

    Full Text Available The new Internet of Things paradigm allows small devices with sensing, processing and communication capabilities to be designed, enabling the development of sensors, embedded devices and other ‘things’ ready to understand the environment. In this paper, a distributed framework based on the Internet of Things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantages and novelties of the proposed system are the flexibility in computing the health application by using resources from available devices inside the body area network of the user. This proposed framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, we present a case study in order to validate our proposal that consists of monitoring footballers’ heart rates during a football match. The real-time data acquired by these devices serves a clear social objective: predicting not only situations of sudden death but also possible injuries.
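The kind of on-body risk detection this record describes can be sketched as a sliding-window check over a heart-rate stream. This is an illustrative toy, not the framework's implementation; the class name, window size, and thresholds are assumptions, not clinical values.

```python
from collections import deque

class HeartRateMonitor:
    """Sliding-window monitor for heart-rate samples (bpm).

    push() returns True while the windowed average stays inside a
    safe band, and False once it leaves it -- a stand-in for the
    alerting the distributed framework would perform.
    """
    def __init__(self, low=40, high=190, window=5):
        self.low, self.high = low, high
        self.samples = deque(maxlen=window)  # keeps only the last `window` values

    def push(self, bpm):
        self.samples.append(bpm)
        avg = sum(self.samples) / len(self.samples)
        return self.low <= avg <= self.high

monitor = HeartRateMonitor()
readings = [72, 75, 180, 200, 210, 215, 220]
status = [monitor.push(r) for r in readings]
# the sustained climb eventually pushes the windowed average out of band
```

Windowed averaging is a deliberate choice here: a single spike (sensor glitch) does not trigger an alert, only a sustained excursion does.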

  2. Parallel Computing Characteristics of CUPID code under MPI and Hybrid environment

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Ryong; Yoon, Han Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeon, Byoung Jin; Choi, Hyoung Gwon [Seoul National Univ. of Science and Technology, Seoul (Korea, Republic of)

    2014-05-15

    simulation with diagonal preconditioning shows the better speedup. The MPI library was used for node-to-node communication among partitioned subdomains, and OpenMP threads were activated within each node to exploit the multi-core computing environment. The results of hybrid computing show good performance compared with pure MPI parallel computing.

  3. Local behavior and lymph node metastases of Wilms' tumor: accuracy of computed tomography; Comportamento local e metastases linfonodais do tumor de Wilms: acuracia da tomografia computadorizada

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Eduardo Just da Costa e, E-mail: eduardojust@oi.com.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil); Instituto Materno Infantil de Pernambuco (IMIP), Recife, PE (Brazil); Silva, Giselia Alves Pontes da [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. Maternal Infantil

    2014-01-15

    Objective: to evaluate the accuracy of computed tomography for local and lymph node staging of Wilms' tumor. Materials and methods: each case of Wilms' tumor was evaluated for the presence of abdominal lymph nodes by a radiologist. Signs of capsule and adjacent organ invasion were analyzed. Surgical and histopathological results were taken as the gold standard. Results: sensitivity was 100% for both mesenteric and retroperitoneal lymph node detection, and specificity was, respectively, 12% and 33%, with positive predictive value of 8% and 11% and negative predictive value of 100%. Signs of capsular invasion presented sensitivity of 87%, specificity of 77%, positive predictive value of 63% and negative predictive value of 93%. Signs of adjacent organ invasion presented sensitivity of 100%, specificity of 78%, positive predictive value of 37% and negative predictive value of 100%. Conclusion: computed tomography showed low specificity and low positive predictive value in the detection of lymph node dissemination. The absence of detectable lymph nodes makes their presence unlikely, and the same applies to the evaluation of the local behavior of tumors. (author)
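The four accuracy measures reported in this record all come from a 2x2 confusion table. A minimal sketch (the counts below are invented for illustration, chosen so that the resulting percentages approximate the reported mesenteric lymph node figures of 100% sensitivity, 12% specificity, 8% PPV and 100% NPV):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table.

    tp/fp/fn/tn: true-positive, false-positive, false-negative and
    true-negative counts against the gold standard.
    """
    return {
        "sensitivity": tp / (tp + fn),   # detected among truly positive
        "specificity": tn / (tn + fp),   # cleared among truly negative
        "ppv": tp / (tp + fp),           # positives that are real
        "npv": tn / (tn + fn),           # negatives that are real
    }

# Illustrative counts: few true positives, many false positives.
metrics = diagnostic_metrics(tp=2, fp=22, fn=0, tn=3)
```

With such counts sensitivity and NPV are perfect while PPV collapses, which is exactly the pattern the abstract describes: an absent finding is reassuring, a present one is not.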

  4. A computational environment for long-term multi-feature and multi-algorithm seizure prediction.

    Science.gov (United States)

    Teixeira, C A; Direito, B; Costa, R P; Valderrama, M; Feldwisch-Drentrup, H; Nikolopoulos, S; Le Van Quyen, M; Schelter, B; Dourado, A

    2010-01-01

    The daily life of epilepsy patients is constrained by the possibility of occurrence of seizures. Until now, seizures cannot be predicted with sufficient sensitivity and specificity. Most seizure prediction studies have focused on a small number of patients, frequently assuming unrealistic hypotheses. This paper adopts the view that for the appropriate development of reliable predictors one should consider long-term recordings and several features and algorithms integrated in one software tool. A computational environment, based on Matlab®, is presented, aiming to be an innovative tool for seizure prediction. It results from the need for a powerful and flexible tool for long-term EEG/ECG analysis by multiple features and algorithms. After being extracted, features can be subjected to several reduction and selection methods, and then used for prediction. The predictions can be conducted based on optimized thresholds or by applying computational intelligence methods. One important aspect is the integrated evaluation of the seizure prediction characteristic of the developed predictors.
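The optimized-threshold prediction mode mentioned in this record can be sketched in a few lines: scan an extracted feature series and raise an alarm on each threshold crossing, then stay silent for a refractory period. The function name, threshold and refractory length are illustrative assumptions, not the environment's actual API.

```python
def alarm_times(feature, threshold, refractory=3):
    """Indices at which a threshold-based predictor raises an alarm.

    feature:    sequence of feature values, one per time step
    threshold:  alarm fires when the feature reaches this value
    refractory: samples to stay silent after each alarm, so a single
                event is not reported repeatedly
    """
    alarms, silent_until = [], -1
    for t, value in enumerate(feature):
        if t > silent_until and value >= threshold:
            alarms.append(t)
            silent_until = t + refractory
    return alarms

# e.g. a synthetic feature trace with two excursions above 0.9:
events = alarm_times([0.1, 0.9, 0.95, 0.2, 0.8, 0.97], threshold=0.9)
```

Evaluating such predictors then amounts to comparing alarm times against actual seizure onsets, which is the "integrated evaluation" the abstract refers to.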

  5. Sensing the environment: regulation of local and global homeostasis by the skin's neuroendocrine system.

    Science.gov (United States)

    Slominski, Andrzej T; Zmijewski, Michal A; Skobowiat, Cezary; Zbytek, Blazej; Slominski, Radomir M; Steketee, Jeffery D

    2012-01-01

    Skin, the body's largest organ, is strategically located at the interface with the external environment where it detects, integrates, and responds to a diverse range of stressors including solar radiation. It has already been established that the skin is an important peripheral neuro-endocrine-immune organ that is tightly networked to central regulatory systems. These capabilities contribute to the maintenance of peripheral homeostasis. Specifically, epidermal and dermal cells produce and respond to classical stress neurotransmitters, neuropeptides, and hormones. Such production is stimulated by ultraviolet radiation (UVR), biological factors (infectious and noninfectious), and other physical and chemical agents. Examples of local biologically active products are cytokines, biogenic amines (catecholamines, histamine, serotonin, and N-acetyl-serotonin), melatonin, acetylcholine, neuropeptides including pituitary (proopiomelanocortin-derived ACTH, beta-endorphin or MSH peptides, thyroid-stimulating hormone) and hypothalamic (corticotropin-releasing factor and related urocortins, thyroid-releasing hormone) hormones as well as enkephalins and dynorphins, thyroid hormones, steroids (glucocorticoids, mineralocorticoids, sex hormones, 7-delta steroids), secosteroids, opioids, and endocannabinoids. The production of these molecules is hierarchical, organized along the algorithms of classical neuroendocrine axes such as the hypothalamic-pituitary-adrenal axis (HPA), hypothalamic-thyroid axis (HPT), serotoninergic, melatoninergic, catecholaminergic, cholinergic, steroid/secosteroidogenic, opioid, and endocannabinoid systems. Dysregulation of these axes or of communication between them may lead to skin and/or systemic diseases. These local neuroendocrine networks also serve to maximally restrict the effect of noxious environmental agents in order to preserve local and, consequently, global homeostasis.
Moreover, the skin-derived factors/systems can also activate cutaneous nerve

  6. InSAR Scientific Computing Environment

    Science.gov (United States)

    Rosen, Paul A.; Sacco, Gian Franco; Gurrola, Eric M.; Zabker, Howard A.

    2011-01-01

    This computing environment is the next generation of geodetic image processing technology for repeat-pass Interferometric Synthetic Aperture Radar (InSAR) sensors, identified by the community as a needed capability to provide flexibility and extensibility in reducing measurements from radar satellites and aircraft to new geophysical products. This software allows users of interferometric radar data the flexibility to process from Level 0 to Level 4 products using a variety of algorithms and for a range of available sensors. There are many radar satellites in orbit today delivering to the science community data of unprecedented quantity and quality, making possible large-scale studies in climate research, natural hazards, and the Earth's ecosystem. The proposed DESDynI mission, now under consideration by NASA for launch later in this decade, would provide time series and multi-image measurements that permit 4D models of Earth surface processes so that, for example, climate-induced changes over time would become apparent and quantifiable. This advanced data processing technology, applied to a global data set such as from the proposed DESDynI mission, enables a new class of analyses at time and spatial scales unavailable using current approaches. This software implements an accurate, extensible, and modular processing system designed to realize the full potential of InSAR data from future missions such as the proposed DESDynI, existing radar satellite data, as well as data from the NASA UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar), and other airborne platforms. The processing approach has been re-thought in order to enable multi-scene analysis by adding new algorithms and data interfaces, to permit user-reconfigurable operation and extensibility, and to capitalize on codes already developed by NASA and the science community.
The framework incorporates modern programming methods based on recent research, including object-oriented scripts controlling legacy and

  7. BISEN: Biochemical simulation environment

    NARCIS (Netherlands)

    Vanlier, J.; Wu, F.; Qi, F.; Vinnakota, K.C.; Han, Y.; Dash, R.K.; Yang, F.; Beard, D.A.

    2009-01-01

    The Biochemical Simulation Environment (BISEN) is a suite of tools for generating equations and associated computer programs for simulating biochemical systems in the MATLAB® computing environment. This is the first package that can generate appropriate systems of differential equations for

  8. Local Fitness Landscapes Predict Yeast Evolutionary Dynamics in Directionally Changing Environments.

    Science.gov (United States)

    Gorter, Florien A; Aarts, Mark G M; Zwaan, Bas J; de Visser, J Arjan G M

    2018-01-01

    The fitness landscape is a concept that is widely used for understanding and predicting evolutionary adaptation. The topography of the fitness landscape depends critically on the environment, with potentially far-reaching consequences for evolution under changing conditions. However, few studies have assessed directly how empirical fitness landscapes change across conditions, or validated the predicted consequences of such change. We previously evolved replicate yeast populations in the presence of either gradually increasing, or constant high, concentrations of the heavy metals cadmium (Cd), nickel (Ni), and zinc (Zn), and analyzed their phenotypic and genomic changes. Here, we reconstructed the local fitness landscapes underlying adaptation to each metal by deleting all repeatedly mutated genes both by themselves and in combination. Fitness assays revealed that the height, and/or shape, of each local fitness landscape changed considerably across metal concentrations, with distinct qualitative differences between unconditionally (Cd) and conditionally toxic metals (Ni and Zn). This change in topography had particularly crucial consequences in the case of Ni, where a substantial part of the individual mutational fitness effects changed in sign across concentrations. Based on the Ni landscape analyses, we made several predictions about which mutations had been selected when during the evolution experiment. Deep sequencing of population samples from different time points generally confirmed these predictions, demonstrating the power of landscape reconstruction analyses for understanding and ultimately predicting evolutionary dynamics, even under complex scenarios of environmental change. Copyright © 2018 by the Genetics Society of America.
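The key observation in this record, that some mutational fitness effects change sign across metal concentrations, can be illustrated with a toy check. The mutation labels and selection coefficients below are invented for illustration, not the paper's data:

```python
def sign_changes(effects_low, effects_high):
    """Mutations whose fitness effect flips sign between conditions.

    effects_low / effects_high: dicts mapping mutation -> selection
    coefficient at low and high metal concentration, respectively.
    A negative product means the effect was beneficial in one
    condition and deleterious in the other.
    """
    return [m for m in effects_low
            if effects_low[m] * effects_high.get(m, 0.0) < 0]

# Hypothetical coefficients: mutA is beneficial only at low concentration.
low = {"mutA": 0.04, "mutB": -0.02}
high = {"mutA": -0.03, "mutB": -0.05}
flipped = sign_changes(low, high)
```

A mutation flagged this way is expected to be selected early (while its effect is positive) and selected against later, which is the kind of prediction the deep-sequencing time series confirmed.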

  9. Neural Computation Scheme of Compound Control: Tacit Learning for Bipedal Locomotion

    Science.gov (United States)

    Shimoda, Shingo; Kimura, Hidenori

    The growing need for controlling complex behaviors of versatile robots working in unpredictable environments has revealed the fundamental limitation of the model-based control strategy, which requires precise models of robots and environments before their operation. This difficulty is fundamental and has the same root as the well-known frame problem in artificial intelligence. It has been a central, long-standing issue in advanced robotics, as well as in machine intelligence, to find a prospective clue for attacking this fundamental difficulty. The general consensus shared by many leading researchers in the related field is that the body plays an important role in acquiring intelligence that can conquer unknowns. In particular, purposeful behaviors emerge during body-environment interactions with the help of an appropriately organized neural computational scheme that can exploit what the environment can afford. Along this line, we propose a new scheme of neural computation based on compound control, which represents a typical feature of biological controls. This scheme is based on classical neuron models with local rules that can create macroscopic purposeful behaviors. The scheme is applied to a bipedal robot and generates the rhythm of walking without any model of robot dynamics or the environment.

  10. Computational Science in Armenia (Invited Talk)

    Science.gov (United States)

    Marandjian, H.; Shoukourian, Yu.

    This survey is devoted to the development of informatics and computer science in Armenia. The results in theoretical computer science (algebraic models, solutions to systems of general-form recursive equations, methods of coding theory, pattern recognition and image processing) constitute the theoretical basis for developing problem-solving-oriented environments. Examples include: a synthesizer of optimized distributed recursive programs, software tools for cluster-oriented implementations of two-dimensional cellular automata, and a grid-aware web interface with advanced service trading for linear algebra calculations. In the direction of solving scientific problems that require high-performance computing resources, examples of completed projects include the fields of physics (parallel computing of complex quantum systems), astrophysics (Armenian virtual laboratory), biology (molecular dynamics study of the human red blood cell membrane), and meteorology (implementing and evaluating the Weather Research and Forecast Model for the territory of Armenia). The overview also notes that the Institute for Informatics and Automation Problems of the National Academy of Sciences of Armenia has established a scientific and educational infrastructure uniting the computing clusters of scientific and educational institutions of the country and provides the scientific community with access to local and international computational resources, which is strong support for computational science in Armenia.
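The cluster-oriented two-dimensional cellular automata mentioned in this survey can be illustrated with a single-node sketch of one synchronous update; the Life-like rule and toroidal wrap-around are chosen for illustration and are not the Armenian tools' actual specification.

```python
def life_step(grid):
    """One synchronous update of a 2D Life-like cellular automaton.

    grid: list of lists of 0/1, treated as a toroidal (wrap-around)
    lattice. A cell is alive next step if it has exactly three live
    neighbours, or two while already alive (Conway's rule B3/S23).
    """
    h, w = len(grid), len(grid[0])

    def neighbours(r, c):
        return sum(grid[(r + dr) % h][(c + dc) % w]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))

    return [[1 if (n := neighbours(r, c)) == 3 or (n == 2 and grid[r][c])
             else 0
             for c in range(w)]
            for r in range(h)]
```

Because every cell updates from the previous generation only, the grid can be partitioned into subdomains that exchange one-cell-wide halo rows, which is what makes cluster implementations of such automata natural.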

  11. Energy Consumption and Indoor Environment Predicted by a Combination of Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter Vilhelm

    2003-01-01

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution is introduced for improvement of the predictions of both the energy consumption and the indoor environment. The article describes a calculation...

  12. Security Architecture of Cloud Computing

    OpenAIRE

    V.KRISHNA REDDY; Dr. L.S.S.REDDY

    2011-01-01

    Cloud Computing offers services over the internet with dynamically scalable resources. Cloud Computing services provide benefits to users in terms of cost and ease of use. Cloud Computing services need to address security during the transmission of sensitive data and critical applications to shared and public cloud environments. Cloud environments are scaling large for data processing and storage needs. Cloud computing environments have various advantages as well as disadvantages o...

  13. The Development of Biology Teaching Material Based on the Local Wisdom of Timorese to Improve Students Knowledge and Attitude of Environment in Caring the Preservation of Environment

    Science.gov (United States)

    Ardan, Andam S.

    2016-01-01

    The purposes of this study were (1) to describe the biology learning such as lesson plans, teaching materials, media and worksheets for the tenth grade of High School on the topic of Biodiversity and Basic Classification, Ecosystems and Environment Issues based on local wisdom of Timorese; (2) to analyze the improvement of the environmental…

  14. The impact of new computer technology on accelerator control

    International Nuclear Information System (INIS)

    Theil, E.; Jacobson, V.; Paxson, V.

    1987-01-01

    This paper describes some recent developments in computing and stresses their application to accelerator control systems. Among the advances that promise to have a significant impact are: i) low cost scientific workstations; ii) the use of ''windows'', pointing devices and menus in a multitasking operating system; iii) high resolution large-screen graphics monitors; iv) new kinds of high bandwidth local area networks. The relevant features are related to a general accelerator control system. For example, the authors examine the implications of a computing environment which permits and encourages graphical manipulation of system components, rather than traditional access through the writing of programs or ''canned'' access via touch panels

  15. The impact of new computer technology on accelerator control

    International Nuclear Information System (INIS)

    Theil, E.; Jacobson, V.; Paxson, V.

    1987-04-01

    This paper describes some recent developments in computing and stresses their application in accelerator control systems. Among the advances that promise to have a significant impact are (1) low cost scientific workstations; (2) the use of ''windows'', pointing devices and menus in a multi-tasking operating system; (3) high resolution large-screen graphics monitors; (4) new kinds of high bandwidth local area networks. The relevant features are related to a general accelerator control system. For example, this paper examines the implications of a computing environment which permits and encourages graphical manipulation of system components, rather than traditional access through the writing of programs or ''canned'' access via touch panels

  16. Modification of GNPS environment radiation monitoring network system

    International Nuclear Information System (INIS)

    Jiang Lili; Cao Chunsheng

    1999-01-01

    GNPS Environment Radiation Continuous Monitoring System (KRS), the only real-time on-line system for site radiation monitoring, was put into service in 1993, prior to the first fuel loading of the plant. Several years of operation revealed that the system had some deficiencies, such as inadequate real-time monitoring means, no figure and diagram display function on the central computer, a high failure rate, and frequent spurious warning signals, which kept the availability of the system at a low level. In recent years, with the rapid development of computer network technology and increasingly strict requirements on NPP environment protection raised by the government and the public, KRS modification had become necessary and urgent. In 1996, GNPS carried out modification work on the measuring geometry condition of the γ radiation monitoring sub-station and on lightning protection. To enhance the functions of real-time monitoring and data auto-processing, further modification of the system was made in 1998, including the update of the software and hardware of the KRS central processor and the set-up of a system computer local network and database. In this way, the system availability and monitoring quality are greatly improved, and effective monitoring and analysis means are provided for gaseous releases during normal operation and under accident conditions

  17. A computational framework for the optimal design of morphing processes in locally activated smart material structures

    International Nuclear Information System (INIS)

    Wang, Shuang; Brigham, John C

    2012-01-01

    A proof-of-concept study is presented for a strategy to obtain maximally efficient and accurate morphing structures composed of active materials such as shape memory polymers (SMP) through synchronization of adaptable and localized activation and actuation. The work focuses on structures or structural components entirely composed of thermo-responsive SMP, and particularly utilizes the ability of such materials to display controllable variable stiffness. The study presents and employs a computational inverse mechanics approach that combines a computational representation of the SMP thermo-mechanical behavior with a nonlinear optimization algorithm to determine the location, magnitude and sequencing of the activation and actuation needed to obtain a desired shape change subject to design objectives such as prevention of damage. Two numerical examples are presented in which the synchronization of the activation and actuation and the location of activation excitation were optimized with respect to the combined thermal and mechanical energy for design concepts in morphing skeletal structural components. In all cases, the concept of localized activation, along with the optimal design strategy, was able to produce far more energy-efficient morphing structures and more accurately reach the desired shape change in comparison to traditional methods that require complete structural activation prior to actuation. (paper)

  18. Multi-Locality Based Local and Symbiotic Computing for Interactively fast On-Demand Weather Forecasting for Small Regions, Short Durations, and Very High-Resolutions

    OpenAIRE

    Fjukstad, Bård

    2014-01-01

    Papers 1, 3 and 4 are not available in Munin: 1: Bård Fjukstad, Tor-Magne Stien Hagen, Daniel Stødle, Phuong Hoai Ha, John Markus Bjørndalen, and Otto Anshus: ‘Interactive Weather Simulation and Visualization on a Display Wall with Many-Core Compute Nodes’, in K. Jónasson (ed.): PARA 2010, Part I, LNCS 7133, pp. 142–151, 2012, © Springer-Verlag Berlin Heidelberg 3: Bård Fjukstad, John Markus Bjørndalen and Otto Anshus: ‘Accurate Weather Forecasting Through Locality Based Collaborative Computi...

  19. A room acoustical computer model for industrial environments - the model and its verification

    DEFF Research Database (Denmark)

    Christensen, Claus Lynge; Foged, Hans Torben

    1998-01-01

    This paper presents an extension to the traditional room acoustic modelling methods, allowing computer modelling of huge machinery in industrial spaces. The programs in question are Odeon 3.0 Industrial and Odeon 3.0 Combined, which allow the modelling of point sources, surface sources and line...... of an omnidirectional sound source and a microphone. This allows the comparison of simulated results with those measured in real rooms. However, when simulating the acoustic environment in industrial rooms, the sound sources are often far from being point-like, as they can be distributed over a large space...

  20. Computer-aided design and computer science technology

    Science.gov (United States)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.