WorldWideScience

Sample records for local computer environment

  1. Weighted Local Active Pixel Pattern (WLAPP) for Face Recognition in Parallel Computation Environment

    Directory of Open Access Journals (Sweden)

    Gundavarapu Mallikarjuna Rao

    2013-10-01

    Full Text Available Abstract - The availability of multi-core technology has resulted in a totally new computational era. Researchers are keen to explore the potential of state-of-the-art machines for breaking the barrier imposed by serial computation. Face recognition is a challenging application in any computational environment. The main difficulty of traditional face recognition algorithms is their lack of scalability. In this paper Weighted Local Active Pixel Pattern (WLAPP), a new scalable face recognition algorithm suitable for parallel environments, is proposed. Local Active Pixel Pattern (LAPP) is found to be simple and computationally inexpensive compared to Local Binary Patterns (LBP). WLAPP is developed based on the concept of LAPP. The experimentation is performed on the FG-Net Aging Database with a deliberately introduced 20% distortion, and the results are encouraging. Keywords — active pixels, face recognition, Local Binary Pattern (LBP), Local Active Pixel Pattern (LAPP), pattern computing, parallel workers, template, weight computation.
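For readers unfamiliar with the patterns being compared, the standard 3x3 Local Binary Pattern operator that LAPP is measured against can be sketched in a few lines (this is generic LBP, not the authors' WLAPP code):

```python
# Standard 3x3 Local Binary Pattern: threshold each of the 8 neighbours
# against the centre pixel and pack the results into an 8-bit code.
# Generic textbook LBP, for illustration only.

def lbp_code(patch):
    """patch: 3x3 list of lists of grey levels; returns the LBP code (0-255)."""
    c = patch[1][1]
    # neighbours in clockwise order starting at the top-left corner
    neighbours = [patch[0][0], patch[0][1], patch[0][2],
                  patch[1][2], patch[2][2], patch[2][1],
                  patch[2][0], patch[1][0]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= c:          # neighbour at least as bright as centre -> bit set
            code |= 1 << bit
    return code

patch = [[10, 20, 30],
         [40, 25, 60],
         [70, 80, 90]]
```

Because each pixel's code depends only on its own 3x3 neighbourhood, the operator is embarrassingly parallel across pixels, which is the property scalable variants such as WLAPP exploit.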

  2. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.

  3. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

    IHEP's computing environment consists of several different computing environments established on IHEP computer networks, among which the BES environment supporting HEP computing is the main part. In connection with the improvement and extension of the BES environment, the authors outline the development of these computing environments from the viewpoint of establishing a high energy physics (HEP) computing environment. The direction of development towards distributed computing for the IHEP computing environment, based on present trends in distributed computing, is presented

  4. ISS Local Environment Spectrometers (ISLES)

    Science.gov (United States)

    Krause, Linda Habash; Gilchrist, Brian E.

    2014-01-01

    In order to study the complex interactions between the space environment surrounding the ISS and the ISS surface materials, we propose to use low-cost, high-TRL plasma sensors on the ISS robotic arm to probe the ISS space environment. During many years of ISS operation, we have been able to conduct effective (but not perfect) extravehicular activities (both human and robotic) within the perturbed local ISS space environment. Because of the complexity of the interaction between the ISS and the LEO space environment, there remain important questions, such as differential charging at solar panel junctions (the so-called "triple point" between conductor, dielectric, and space plasma), increased chemical contamination due to ISS surface charging and/or thruster activation, water dumps, etc., and "bootstrap" charging of insulating surfaces. Some compelling questions could synergistically draw upon a common sensor suite, which also leverages previous and current MSFC investments. Specific questions address ISS surface charging, plasma contactor plume expansion in a magnetized drifting plasma, and possible localized contamination effects across the ISS.

  5. CHPS IN CLOUD COMPUTING ENVIRONMENT

    OpenAIRE

    K.L.Giridas; A.Shajin Nargunam

    2012-01-01

    Workflows have been utilized to characterize various forms of applications with high processing and storage-space demands. To make the cloud computing environment more eco-friendly, our research project aims at reducing the e-waste accumulated by computers. In a hybrid cloud, the user has the flexibility offered by public cloud resources, which can be combined with the private resource pool as required. Our previous work described the process of combining the low-range and mid-range proce...

  6. Personal computer local networks report

    CERN Document Server

    1991-01-01

    Please note this is a Short Discount publication. Since the first microcomputer local networks of the late 1970s and early 1980s, personal computer LANs have expanded in popularity, especially since the introduction of IBM's first PC in 1981. The late 1980s have seen a maturing of the industry, with only a few vendors maintaining a large share of the market. This report is intended to give the reader a thorough understanding of the technology used to build these systems ... from cable to chips ... to ... protocols to servers. The report also fully defines PC LANs and the marketplace, with in-

  7. Printing in Ubiquitous Computing Environments

    NARCIS (Netherlands)

    Karapantelakis, Athanasios; Delvic, Alisa; Zarifi Eslami, Mohammed; Khamit, Saltanat

    Document printing has long been considered an indispensable part of the workspace. While this process is considered trivial and simple for environments where resources are ample (e.g. desktop computers connected to printers within a corporate network), it becomes complicated when applied in a mobile

  8. Airborne Cloud Computing Environment (ACCE)

    Science.gov (United States)

    Hardman, Sean; Freeborn, Dana; Crichton, Dan; Law, Emily; Kay-Im, Liz

    2011-01-01

    Airborne Cloud Computing Environment (ACCE) is JPL's internal investment to improve the return on airborne missions by improving the development performance of the data system and the return on the captured science data. The investment is to develop a common science data system capability for airborne instruments that encompasses the end-to-end lifecycle, covering planning, provisioning of data system capabilities, and support for scientific analysis, in order to improve the quality, cost effectiveness, and capabilities to enable new scientific discovery and research in earth observation.

  9. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  10. Sound Localization in Multisource Environments

    Science.gov (United States)

    2009-03-01

    A total of 7 paid volunteer listeners (3 males and 4 females, 20-25 years of age) participated in the experiment. All had normal hearing (i.e. ... effects of the loudspeaker frequency responses, and were then sent from an experimental control computer to a Mark of the Unicorn (MOTU 24 I/O) digital-to ... after the overall multisource stimulus has been presented (the 'post-cue' condition). 3.2 Methods 3.2.1 Listeners Eight listeners, ranging in age from

  11. Green Computing in Local Governments and Information Technology Companies

    Directory of Open Access Journals (Sweden)

    Badar Agung Nugroho

    2013-06-01

    Full Text Available Green computing is the study and practice of designing, manufacturing, using, and disposing of information and communication devices efficiently and effectively, with minimum impact on the environment. If the green computing concept is implemented, it will help agencies or companies reduce the energy and capital costs of their IT infrastructure. The goal of this research is to explore the current state of efforts by local governments and IT companies in West Java to implement the green computing concept in their working environments. The primary data were collected through focus group discussions, inviting representatives of local governments and IT companies who are responsible for managing their IT infrastructure. The secondary data were then collected through brief observation, in order to see the real green computing implementation efforts at each institution. The results show that there are many different perspectives on, and efforts toward, green computing implementation between local governments and IT companies.

  12. Cluster-based localization and tracking in ubiquitous computing systems

    CERN Document Server

    Martínez-de Dios, José Ramiro; Torres-González, Arturo; Ollero, Anibal

    2017-01-01

    Localization and tracking are key functionalities in ubiquitous computing systems and techniques. In recent years a wide variety of approaches, sensors and techniques for indoor and GPS-denied environments have been developed. This book briefly summarizes the current state of the art in localization and tracking in ubiquitous computing systems, focusing on cluster-based schemes. Additionally, existing techniques for measurement integration, node inclusion/exclusion and cluster head selection are also described.

  13. Local environment effects in disordered alloys

    International Nuclear Information System (INIS)

    Cable, J.W.

    1978-01-01

    The magnetic moment of an atom in a ferromagnetic disordered alloy depends on the local environment of that atom. This is particularly true for Ni and Pd based alloys for which neutron diffuse scattering measurements of the range and magnitude of the moment disturbances indicate that both magnetic and chemical environment are important in determining the moment distribution. In this paper we review recent neutron studies of local environment effects in Ni based alloys. These are discussed in terms of a phenomenological model that allows a separation of the total moment disturbance at a Ni site into its chemical and magnetic components

  14. Vanderbilt University: Campus Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    Despite the decentralized nature of computing at Vanderbilt, there is significant evidence of cooperation and use of each other's resources by the various computing entities. Planning for computing occurs in every school and department. Caravan, a campus-wide network, is described. (MLW)

  15. Barcode based localization system in indoor environment

    Directory of Open Access Journals (Sweden)

    Ľubica Ilkovičová

    2014-12-01

    Full Text Available Nowadays, in the era of intelligent buildings, there is a need to create indoor navigation systems, which remains a challenge. QR (Quick Response) codes provide accurate localization even in indoor environments, where other navigation techniques (e.g. GPS) are not available. The paper deals with the issues of positioning using QR codes, solved at the Department of Surveying, Faculty of Civil Engineering, SUT in Bratislava. The operating principle of QR codes and a description of the Android-based smartphone application for positioning in indoor environments are presented.
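At its simplest, the positioning principle described above reduces to decoding a code's payload and looking up coordinates surveyed in advance; the identifiers and coordinates in this sketch are hypothetical.

```python
# Minimal sketch of QR-code-based indoor positioning: each deployed code
# carries an identifier whose coordinates were surveyed in advance.
# All identifiers and coordinates below are made up for illustration.

QR_POSITIONS = {
    "ROOM-101-DOOR": (12.40, 3.15, 0),   # x, y in metres, floor number
    "STAIRS-A":      (0.80, 7.60, 0),
    "ROOM-205-DOOR": (12.40, 3.15, 1),
}

def locate(decoded_payload):
    """Return (x, y, floor) for a decoded QR payload, or None if unknown."""
    return QR_POSITIONS.get(decoded_payload)
```

A smartphone app would pass the string returned by its QR decoder to locate() and display the stored position; the achievable accuracy is then set by how precisely the codes' placements were surveyed.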

  16. Embedding Moodle into Ubiquitous Computing Environments

    NARCIS (Netherlands)

    Glahn, Christian; Specht, Marcus

    2010-01-01

    Glahn, C., & Specht, M. (2010). Embedding Moodle into Ubiquitous Computing Environments. In M. Montebello, et al. (Eds.), 9th World Conference on Mobile and Contextual Learning (MLearn2010) (pp. 100-107). October, 19-22, 2010, Valletta, Malta.

  17. Smile (System/Machine-Independent Local Environment)

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, J.G.

    1988-04-01

    This document defines the characteristics of Smile, a System/machine-independent local environment. This environment consists primarily of a number of primitives (types, macros, procedure calls, and variables) that a program may use; these primitives provide facilities, such as memory allocation, timing, tasking and synchronization, beyond those typically provided by a programming language. The intent is that a program will be portable from system to system and from machine to machine if it relies only on the portable aspects of its programming language and on the Smile primitives. For this to be so, Smile itself must be implemented on each system and machine, most likely using non-portable constructions; that is, while the environment provided by Smile is intended to be portable, the implementation of Smile is not necessarily so. In order to make the implementation of Smile as easy as possible and thereby expedite the porting of programs to a new system or a new machine, Smile has been defined to provide a minimal portable environment; that is, simple primitives are defined, out of which more complex facilities may be constructed using portable procedures. The implementation of Smile can be any of the following: the underlying software environment for the operating system of an otherwise "bare" machine, a "guest" system environment built upon a preexisting operating system, an environment within a "user" process run by an operating system, or a single environment for an entire machine, encompassing both system and "user" processes. In the first three of these cases the tasks provided by Smile are "lightweight processes" multiplexed within preexisting processes or the system, while in the last case they also include the system processes themselves.

  18. Security Management Model in Cloud Computing Environment

    OpenAIRE

    Ahmadpanah, Seyed Hossein

    2016-01-01

    In the cloud computing environment, the number of cloud virtual machines (VMs) keeps growing, and virtual machine security and management face giant challenges. In order to address the security issues of the cloud computing virtualization environment, this paper presents an efficient and dynamic VM security management model based on state migration and scheduling, which studies the virtual machine security architecture and, based on AHP (Analytic Hierarchy Process), virtual machine de...

  19. Intelligent computing for sustainable energy and environment

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang [Queen' s Univ. Belfast (United Kingdom). School of Electronics, Electrical Engineering and Computer Science; Li, Shaoyuan; Li, Dewei [Shanghai Jiao Tong Univ., Shanghai (China). Dept. of Automation; Niu, Qun (eds.) [Shanghai Univ. (China). School of Mechatronic Engineering and Automation

    2013-07-01

    Fast-track conference proceedings. State-of-the-art research. Up-to-date results. This book constitutes the refereed proceedings of the Second International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2012, held in Shanghai, China, in September 2012. The 60 full papers presented were carefully reviewed and selected from numerous submissions and present theories and methodologies as well as the emerging applications of intelligent computing in sustainable energy and environment.

  20. CRLBs for WSNs localization in NLOS environment

    Directory of Open Access Journals (Sweden)

    Wang Peng

    2011-01-01

    Full Text Available Abstract Determination of the Cramer-Rao lower bound (CRLB) as an optimality criterion for the problem of localization in wireless sensor networks (WSNs) is a very important issue. Currently, CRLBs have been derived for the line-of-sight (LOS) situation in WSNs. However, one of the major problems for accurate localization in WSNs is non-line-of-sight (NLOS) propagation. This article proposes two CRLBs for WSN localization in NLOS environments. The proposed CRLBs consider both the case that the positions of reference devices (RDs) are perfectly known and the case that they are imperfectly known. Since a non-parametric kernel method is used to build the probability density function of NLOS errors, the proposed CRLBs are suitable for various distributions of NLOS errors. Moreover, the proposed CRLBs provide a unified presentation for both LOS and NLOS environments. Theoretical analysis also proves that the proposed CRLB for the NLOS situation becomes the CRLB for the LOS situation when NLOS errors go to 0, which gives a robust check for the proposed CRLB.
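For the LOS baseline that these NLOS bounds extend, the position-error CRLB is simply the inverse of the Fisher information matrix, which is easy to evaluate numerically. The sketch below assumes i.i.d. Gaussian range noise and an illustrative four-anchor layout; it is not taken from the article.

```python
import numpy as np

# Numeric CRLB for range-based (LOS) localization with i.i.d. Gaussian
# range noise of std sigma: the Fisher information is
#   J = sum_i u_i u_i^T / sigma^2,
# where u_i is the unit vector from the target to anchor i, and
# trace(J^{-1}) lower-bounds the mean-squared position error.
# Anchor layout and sigma are illustrative assumptions.

def crlb_position(anchors, target, sigma):
    target = np.asarray(target, float)
    J = np.zeros((2, 2))
    for a in np.asarray(anchors, float):
        d = target - a
        u = d / np.linalg.norm(d)     # unit direction to the anchor
        J += np.outer(u, u) / sigma**2
    return np.trace(np.linalg.inv(J)) # bound on MSE of the (x, y) estimate

anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
bound = crlb_position(anchors, (5, 5), sigma=0.5)
```

With the target at the centre of a square of anchors and sigma = 0.5 m, the bound evaluates to 0.25 m^2; moving the target off-centre or removing anchors raises it, which matches the intuition that geometry drives achievable accuracy.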

  1. LINER galaxy properties and the local environment

    Science.gov (United States)

    Coldwell, Georgina V.; Alonso, Sol; Duplancic, Fernanda; Mesa, Valeria

    2018-05-01

    We analyse the properties of a sample of 5560 low-ionization nuclear emission-line region (LINER) galaxies selected from SDSS-DR12 at low redshift, for a complete range of local density environments. The host LINER galaxies were studied and compared with a well-defined control sample of 5553 non-LINER galaxies matched in redshift, luminosity, morphology and local density. By studying the distributions of galaxy colours and the stellar age population, we find that LINERs are redder and older than the control sample over a wide range of densities. In addition, LINERs are older than the control sample at a given galaxy colour, indicating that some external process could have accelerated the evolution of the stellar population. The analysis of the host properties shows that the control sample exhibits a strong relation between colours, ages and the local density, while more than 90 per cent of the LINERs are redder and older than the mean values, independently of the neighbourhood density. Furthermore, a detailed study in three local density ranges shows that, while control sample galaxies are redder and older as a function of stellar mass and density, LINER galaxies mismatch the known morphology-density relation of galaxies without low-ionization features. The results support the contribution of hot and old stars to the low-ionization emission, although the contribution of nuclear activity is not discarded.

  2. Computer Vision Using Local Binary Patterns

    CERN Document Server

    Pietikainen, Matti; Zhao, Guoying; Ahonen, Timo

    2011-01-01

    The recent emergence of Local Binary Patterns (LBP) has led to significant progress in applying texture methods to various computer vision problems and applications. The focus of this research has broadened from 2D textures to 3D textures and spatiotemporal (dynamic) textures. Also, where texture was once utilized for applications such as remote sensing, industrial inspection and biomedical image analysis, the introduction of LBP-based approaches has provided outstanding results in problems relating to face and activity analysis, with future scope for face and facial expression recognition, b

  3. Precise RFID localization in impaired environment through sparse signal recovery

    Science.gov (United States)

    Subedi, Saurav; Zhang, Yimin D.; Amin, Moeness G.

    2013-05-01

    Radio frequency identification (RFID) is a rapidly developing wireless communication technology for electronically identifying, locating, and tracking products, assets, and personnel. RFID has become one of the most important means to construct real-time locating systems (RTLS) that track and identify the location of objects in real time using simple, inexpensive tags and readers. The applicability and usefulness of RTLS techniques depend on their achievable accuracy. In particular, when multilateration-based localization techniques are exploited, the achievable accuracy primarily relies on the precision of the range estimates between a reader and the tags. Such range information can be obtained by using the received signal strength indicator (RSSI) and/or the phase difference of arrival (PDOA). In both cases, however, the accuracy is significantly compromised when the operation environment is impaired. In particular, multipath propagation significantly affects the measurement accuracy of both RSSI and phase information. In addition, because RFID systems are typically operated in short distances, RSSI and phase measurements are also coupled with the reader and tag antenna patterns, making accurate RFID localization very complicated and challenging. In this paper, we develop new methods to localize RFID tags or readers by exploiting sparse signal recovery techniques. The proposed method allows the channel environment and antenna patterns to be taken into account and be properly compensated at a low computational cost. As such, the proposed technique yields superior performance in challenging operation environments with the above-mentioned impairments.
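The dictionary-based view underlying sparse recovery can be illustrated with a single matching-pursuit step over a coarse position grid: each grid point contributes one dictionary atom (its vector of reader-to-point ranges), and the observation is matched to the best-correlated atom. The reader layout, grid, and noiseless ranging model below are illustrative assumptions, not the paper's measurement model.

```python
import numpy as np

# One matching-pursuit step over a position grid, in the spirit of
# sparse-recovery localization: the tag is placed at the grid point
# whose normalized range signature best correlates with the observation.
# Geometry and the noiseless model are assumptions for illustration.

readers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])   # assumed reader positions
grid = np.array([[x, y] for x in range(5) for y in range(5)], float)

def signature(p):
    """Vector of distances from point p to every reader."""
    return np.linalg.norm(readers - p, axis=1)

A = np.stack([signature(p) for p in grid])                  # dictionary: one atom per grid point
A_norm = A / np.linalg.norm(A, axis=1, keepdims=True)

true_pos = np.array([3.0, 2.0])
y = signature(true_pos)                                     # noiseless observation
y_norm = y / np.linalg.norm(y)

best = int(np.argmax(A_norm @ y_norm))                      # atom most correlated with y
estimate = grid[best]
```

A full sparse-recovery method would iterate this selection on the residual and model multipath and antenna patterns in the dictionary; this single step only conveys the grid-matching idea.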

  4. HeNCE: A Heterogeneous Network Computing Environment

    Directory of Open Access Journals (Sweden)

    Adam Beguelin

    1994-01-01

    Full Text Available Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower level package called parallel virtual machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.
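The execution model the abstract describes (nodes are conventional subroutines, arcs are data dependencies, independent nodes run in parallel) can be sketched as a small dependency-graph runner; this illustrates the idea only and is not the HeNCE graphical language or API.

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of graph-driven execution in the spirit of HeNCE: nodes are
# ordinary functions, arcs are data dependencies, and all nodes whose
# prerequisites are satisfied run concurrently. Illustration only.

def run_dag(tasks, deps):
    """tasks: {name: fn(results_dict)}; deps: {name: [prerequisite names]}."""
    results, done = {}, set()
    with ThreadPoolExecutor() as pool:
        while len(done) < len(tasks):
            # every node whose prerequisites are all finished is ready
            ready = [n for n in tasks
                     if n not in done and all(d in done for d in deps.get(n, []))]
            futures = {n: pool.submit(tasks[n], dict(results)) for n in ready}
            for n, f in futures.items():
                results[n] = f.result()
                done.add(n)
    return results

tasks = {
    "load":   lambda r: [1, 2, 3],
    "square": lambda r: [x * x for x in r["load"]],
    "total":  lambda r: sum(r["square"]),
}
deps = {"square": ["load"], "total": ["square"]}
```

Here `run_dag(tasks, deps)["total"]` walks the graph level by level; in HeNCE the analogous nodes would be Fortran or C subroutines dispatched to heterogeneous machines via PVM rather than Python callables in threads.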

  5. Human-Computer Interaction in Smart Environments

    Science.gov (United States)

    Paravati, Gianluca; Gatteschi, Valentina

    2015-01-01

    Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  6. VECTR: Virtual Environment Computational Training Resource

    Science.gov (United States)

    Little, William L.

    2018-01-01

    The Westridge Middle School Curriculum and Community Night is an annual event designed to introduce students and parents to potential employers in the Central Florida area. NASA participated in the event in 2017, and has been asked to come back for the 2018 event on January 25. We will be demonstrating our Microsoft Hololens Virtual Rovers project, and the Virtual Environment Computational Training Resource (VECTR) virtual reality tool.

  7. Scheduling multimedia services in cloud computing environment

    Science.gov (United States)

    Liu, Yunchang; Li, Chunlin; Luo, Youlong; Shao, Yanling; Zhang, Jing

    2018-02-01

    Currently, security is a critical factor for multimedia services running in the cloud computing environment. As an effective mechanism, trust can improve the security level and mitigate attacks within cloud computing environments. Unfortunately, existing scheduling strategies for multimedia services in the cloud computing environment do not integrate a trust mechanism when making scheduling decisions. In this paper, we propose a scheduling scheme for multimedia services in multiple clouds. First, a novel scheduling architecture is presented. Then, we build a trust model including both subjective trust and objective trust to evaluate the trust degree of multimedia service providers. By employing Bayesian theory, the subjective trust degree between multimedia service providers and users is obtained. According to the attributes of QoS, the objective trust degree of multimedia service providers is calculated. Finally, a scheduling algorithm integrating the trust of entities is proposed by considering the deadline, cost and trust requirements of multimedia services. The scheduling algorithm heuristically hunts for reasonable resource allocations that satisfy the trust requirements and meet the deadlines of the multimedia services. Detailed simulation experiments demonstrate the effectiveness and feasibility of the proposed trust scheduling scheme.
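The abstract does not spell out its Bayesian subjective-trust computation; a common formulation, shown here as an assumed sketch, models trust after s successful and f failed interactions with a Beta(s+1, f+1) posterior and blends it with a QoS-derived objective score using a weight that is also an assumption.

```python
# Assumed sketch of a Bayesian trust model in the spirit of the paper's
# description: subjective trust from interaction outcomes via a Beta
# posterior, blended with an objective (QoS-based) score. The blending
# weight and all values are illustrative, not the paper's parameters.

def subjective_trust(successes, failures):
    """Posterior mean of Beta(successes + 1, failures + 1)."""
    return (successes + 1) / (successes + failures + 2)

def combined_trust(subjective, objective, weight=0.5):
    """Blend subjective and objective trust; the weight is an assumption."""
    return weight * subjective + (1 - weight) * objective
```

With no history, `subjective_trust(0, 0)` is 0.5 (a neutral prior), and each observed outcome shifts the estimate; a scheduler in the spirit of the paper would then rank providers by combined trust subject to deadline and cost constraints.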

  8. Local fishing associations and environment authorities visit CERN

    CERN Document Server

    AUTHOR|(CDS)2099575

    2016-01-01

    Local fishing associations and Host States' environment authorities visited CERN on Thursday 21 April 2016. They discovered the efforts made by CERN and its Health, Safety and Environment (HSE) unit to control and limit the impact of the Laboratory's activities on the natural environment, and more specifically on local rivers.

  9. Computing, Environment and Life Sciences | Argonne National Laboratory

    Science.gov (United States)


  10. CERR: A computational environment for radiotherapy research

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Blanco, Angel I.; Clark, Vanessa H.

    2003-01-01

    A software environment is described, called the computational environment for radiotherapy research (CERR, pronounced 'sir'). CERR partially addresses four broad needs in treatment planning research: (a) it provides a convenient and powerful software environment to develop and prototype treatment planning concepts, (b) it serves as a software integration environment to combine treatment planning software written in multiple languages (MATLAB, FORTRAN, C/C++, JAVA, etc.), together with treatment plan information (computed tomography scans, outlined structures, dose distributions, digital films, etc.), (c) it provides the ability to extract treatment plans from disparate planning systems using the widely available AAPM/RTOG archiving mechanism, and (d) it provides a convenient and powerful tool for sharing and reproducing treatment planning research results. The functional components currently being distributed, including source code, include: (1) an import program which converts the widely available AAPM/RTOG treatment planning format into a MATLAB cell-array data object, facilitating manipulation; (2) viewers which display axial, coronal, and sagittal computed tomography images, structure contours, digital films, and isodose lines or dose colorwash, (3) a suite of contouring tools to edit and/or create anatomical structures, (4) dose-volume and dose-surface histogram calculation and display tools, and (5) various predefined commands. CERR allows the user to retrieve any AAPM/RTOG key word information about the treatment plan archive. The code is relatively self-describing, because it relies on MATLAB structure field name definitions based on the AAPM/RTOG standard. New structure field names can be added dynamically or permanently. New components of arbitrary data type can be stored and accessed without disturbing system operation. CERR has been applied to aid research in dose-volume-outcome modeling, Monte Carlo dose calculation, and treatment planning optimization.

  11. Localization in Multiple Source Environments: Localizing the Missing Source

    Science.gov (United States)

    2007-02-01

    volunteer listeners (3 males and 3 females, 19-24 years of age) participated in the experiment. All had normal hearing (audiometric thresholds < 15 ... were routed from a control computer to a Mark of the Unicorn digital-to-analog converter (MOTU 24 I/O), then through a bank of amplifiers (Crown Model

  12. Computer simulation of spacecraft/environment interaction

    International Nuclear Information System (INIS)

    Krupnikov, K.K.; Makletsov, A.A.; Mileev, V.N.; Novikov, L.S.; Sinolits, V.V.

    1999-01-01

    This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for the estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language

  13. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

    This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for the estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language.

  14. System administration of ATLAS TDAQ computing environment

    Science.gov (United States)

    Adeel-Ur-Rehman, A.; Bujor, F.; Benes, J.; Caramarcu, C.; Dobson, M.; Dumitrescu, A.; Dumitru, I.; Leahu, M.; Valsan, L.; Oreshkin, A.; Popov, D.; Unel, G.; Zaytsev, A.

    2010-04-01

    This contribution gives a thorough overview of the activities of the ATLAS TDAQ SysAdmin group, which administers the TDAQ computing environment supporting the High Level Trigger, Event Filter and other subsystems of the ATLAS detector operating at the LHC collider at CERN. The current installation consists of approximately 1500 netbooted nodes managed by more than 60 dedicated servers, about 40 multi-screen user interface machines installed in the control rooms, and various hardware and service monitoring machines as well. In the final configuration, the online computer farm will be capable of hosting tens of thousands of applications running simultaneously. The software distribution requirements are met by a two-level NFS-based solution. The hardware and network monitoring systems of ATLAS TDAQ are based on NAGIOS, with a MySQL cluster behind it for accounting and storing the collected monitoring data, IPMI tools, CERN LANDB and dedicated tools developed by the group, e.g. ConfdbUI. The user management schema deployed in the TDAQ environment is founded on an authentication and role management system based on LDAP. External access to the ATLAS online computing facilities is provided by means of gateways supplied with an accounting system as well. Current activities of the group include deployment of a centralized storage system, testing and validating hardware solutions for future use within the ATLAS TDAQ environment including new multi-core blade servers, developing GUI tools for user authentication and role management, testing and validating 64-bit OSes, and upgrading the existing TDAQ hardware components, authentication servers and gateways.

  15. Xcache in the ATLAS Distributed Computing Environment

    CERN Document Server

    Hanushevsky, Andrew; The ATLAS collaboration

    2018-01-01

    Built upon the Xrootd Proxy Cache (Xcache), we developed additional features to adapt it to the ATLAS distributed computing and data environment, especially its data management system RUCIO, to help improve the cache hit rate, as well as features that make Xcache easy to use, similar to the way the Squid cache is used by the HTTP protocol. We are optimizing Xcache for HPC environments, and adapting it to the HL-LHC Data Lakes design as a component for data delivery. We packaged the software in CVMFS and in Docker and Singularity containers in order to standardize the deployment and reduce the cost of resolving issues at remote sites. We are also integrating it into RUCIO as a volatile storage system, and into various ATLAS workflows such as user analysis,

  16. Human-Computer Interaction in Smart Environments

    Directory of Open Access Journals (Sweden)

    Gianluca Paravati

    2015-08-01

    Full Text Available Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  17. Transient Localization in Shallow Water Environments

    National Research Council Canada - National Science Library

    Brune, Joachim

    1998-01-01

    .... Measures of robustness to be examined include the size of the localization footprint on the ambiguity surface and the peak-to-sidelobe levels in the presence of environmental mismatch and noise...

  18. Transient Localization in Shallow Water Environments

    National Research Council Canada - National Science Library

    Brune, Joachim

    1998-01-01

    .... A full-wave PE model is used to produce broadband replicas. Both model-generated synthetic signals, which provide baseline results, and measured pulses in a shallow water environment are analyzed...

  19. Printing in heterogeneous computer environment at DESY

    International Nuclear Information System (INIS)

    Jakubowski, Z.

    1996-01-01

    The number of registered hosts at DESY reaches 3500, while the number of print queues approaches 150. The spectrum of computing environments in use is very wide: from Macs and PCs, through SUN, DEC and SGI machines, to the IBM mainframe. In 1994 we used 18 tons of paper. We present a solution for providing print services in such an environment for more than 3500 registered users. The availability of the print service is a serious issue. Centralized printing has many advantages for software administration but creates a single point of failure. We solved this problem partially without using expensive software and hardware. The talk provides information about the DESY central print spooler concept. None of the systems available on the market provides a ready-to-use, reliable solution for all platforms used at DESY. We discuss concepts for the installation, administration and monitoring of a large number of printers. We found a solution for printing both on central computing facilities and in support of stand-alone workstations. (author)

  20. Landmark based localization in urban environment

    Science.gov (United States)

    Qu, Xiaozhi; Soheilian, Bahman; Paparoditis, Nicolas

    2018-06-01

    A landmark-based localization method with uncertainty analysis based on cameras and geo-referenced landmarks is presented in this paper. The system is developed to accommodate different camera configurations for six degree-of-freedom pose estimation. Local bundle adjustment is applied for optimization, and the geo-referenced landmarks are integrated to reduce drift. In particular, uncertainty analysis is taken into account. On the one hand, we estimate the uncertainties of poses to predict the precision of localization. On the other hand, uncertainty propagation is considered for matching, tracking and landmark registration. The proposed method is evaluated on both the KITTI benchmark and data acquired by a mobile mapping system. In our experiments, decimeter-level accuracy can be reached.

  1. Reach and get capability in a computing environment

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2012-06-05

    A reach and get technique includes invoking a reach command from a reach location within a computing environment. A user can then navigate to an object within the computing environment and invoke a get command on the object. In response to invoking the get command, the computing environment is automatically navigated back to the reach location and the object copied into the reach location.
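The interaction sequence the abstract describes (invoke reach, navigate away, invoke get, snap back) can be sketched in a few lines. This is only an illustrative model of the described behavior; the class and method names are hypothetical, not from the patent:

```python
class ReachEnvironment:
    """Illustrative sketch of the reach-and-get technique: 'reach' marks
    the current location, 'get' copies an object into it and snaps back."""

    def __init__(self):
        self.locations = {}       # location name -> objects stored there
        self.current = None       # where the user currently is
        self._reach_from = None   # location remembered by the reach command

    def navigate(self, location):
        self.locations.setdefault(location, [])
        self.current = location

    def reach(self):
        # invoked at the reach location before navigating away
        self._reach_from = self.current

    def get(self, obj):
        # copy the object into the reach location and navigate back
        self.locations[self._reach_from].append(obj)
        self.current = self._reach_from


env = ReachEnvironment()
env.navigate("report.doc")    # the reach location
env.reach()
env.navigate("image folder")  # user navigates to the object
env.get("figure1.png")        # object copied back; focus returns automatically
```

The point of the design is that the user never has to navigate back manually: the get command carries the remembered reach location with it.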

  2. Local rollback for fault-tolerance in parallel computing systems

    Science.gov (United States)

    Blumrich, Matthias A [Yorktown Heights, NY; Chen, Dong [Yorktown Heights, NY; Gara, Alan [Yorktown Heights, NY; Giampapa, Mark E [Yorktown Heights, NY; Heidelberger, Philip [Yorktown Heights, NY; Ohmacht, Martin [Yorktown Heights, NY; Steinmacher-Burow, Burkhard [Boeblingen, DE; Sugavanam, Krishnan [Yorktown Heights, NY

    2012-01-24

    A control logic device performs a local rollback in a parallel supercomputing system. The supercomputing system includes at least one cache memory device. The control logic device determines a local rollback interval and runs at least one instruction in that interval. It evaluates whether an unrecoverable condition occurs while running the at least one instruction during the local rollback interval, and checks whether an error occurs during the local rollback. The control logic device restarts the local rollback interval if the error occurs and the unrecoverable condition does not occur during the local rollback interval.
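The retry logic described above (snapshot, run an interval, roll back on a recoverable error, escalate on an unrecoverable one) can be illustrated in software. This is a hedged sketch of the control flow only, not the patented hardware mechanism; all names are hypothetical:

```python
import copy

def run_with_local_rollback(state, instructions, is_recoverable, max_retries=3):
    """Run an interval of instructions; on a recoverable error, restore the
    snapshot and retry locally instead of restarting from a global checkpoint."""
    for _ in range(max_retries):
        snapshot = copy.deepcopy(state)   # begin local rollback interval
        try:
            for instr in instructions:
                instr(state)
            return state                  # interval completed: commit
        except Exception as err:
            if not is_recoverable(err):
                raise                     # unrecoverable: escalate to global recovery
            state = snapshot              # recoverable: roll back and retry
    raise RuntimeError("local rollback retries exhausted")


# A transient fault on the first attempt is absorbed by the local rollback.
calls = {"n": 0}

def flaky_increment(s):
    calls["n"] += 1
    if calls["n"] == 1:
        raise ValueError("transient soft error")
    s["x"] = s.get("x", 0) + 1

final = run_with_local_rollback({}, [flaky_increment], lambda e: isinstance(e, ValueError))
```

The benefit is locality: a transient error costs one interval re-execution rather than a restart from the last global checkpoint.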

  3. Computer Applications and Virtual Environments (CAVE)

    Science.gov (United States)

    1993-01-01

    Virtual Reality (VR) can provide cost-effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama began to utilize VR for design analysis of the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup were to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.

  4. History Matching in Parallel Computational Environments

    Energy Technology Data Exchange (ETDEWEB)

    Steven Bryant; Sanjay Srinivasan; Alvaro Barrera; Sharad Yadav

    2005-10-01

    A novel methodology for delineating multiple reservoir domains for the purpose of history matching in a distributed computing environment has been proposed. A fully probabilistic approach to perturb permeability within the delineated zones is implemented. The combination of robust schemes for identifying reservoir zones and distributed computing significantly increases the accuracy and efficiency of the probabilistic approach. The information pertaining to the permeability variations in the reservoir that is contained in dynamic data is calibrated in terms of a deformation parameter rD. This information is merged with the prior geologic information in order to generate permeability models consistent with the observed dynamic data as well as the prior geology. The relationship between dynamic response data and reservoir attributes may vary in different regions of the reservoir due to spatial variations in reservoir attributes, well configuration, flow constraints, etc. The probabilistic approach then has to account for multiple rD values in different regions of the reservoir. In order to delineate reservoir domains that can be characterized with different rD parameters, principal component analysis (PCA) of the Hessian matrix has been performed. The Hessian matrix summarizes the sensitivity of the objective function at a given step of the history matching to the model parameters. It also measures the interaction of the parameters in affecting the objective function. The basic premise of PCA is to isolate the most sensitive and least correlated regions. The eigenvectors obtained during the PCA are suitably scaled and appropriate grid block volume cut-offs are defined such that the resultant domains are neither too large (which increases interactions between domains) nor too small (implying ineffective history matching).
    The delineation of domains requires calculation of the Hessian, which can be computationally costly and also restricts the current approach to

  5. Ubiquitous computing in shared-care environments.

    Science.gov (United States)

    Koch, S

    2006-07-01

    In light of future challenges, such as growing numbers of elderly, an increase in chronic diseases, insufficient health care budgets and problems with staff recruitment for the health-care sector, information and communication technology (ICT) becomes a possible means to meet these challenges. Organizational changes such as the decentralization of the health-care system lead to a shift from in-hospital to both advanced and basic home health care. Advanced medical technologies provide solutions for distant home care in the form of specialist consultations and home monitoring. Furthermore, the shift towards home health care will increase mobile work and the establishment of shared care teams, which require ICT-based solutions that support ubiquitous information access and cooperative work. Clinical documentation and decision support systems are the main ICT-based solutions of interest in the context of ubiquitous computing for shared care environments. This paper therefore describes the prerequisites for clinical documentation and decision support at the point of care, the impact of mobility on the documentation process, and how the introduction of ICT-based solutions will influence organizations and people. Furthermore, the role of dentistry in shared-care environments is discussed and illustrated in the form of a future scenario.

  6. CSNS computing environment Based on OpenStack

    Science.gov (United States)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows for more flexible configuration of IT resources and optimized hardware utilization, and it can also provide computing services according to actual need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated from the aspects of the cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.

  7. Specialized Computer Systems for Environment Visualization

    Science.gov (United States)

    Al-Oraiqat, Anas M.; Bashkov, Evgeniy A.; Zori, Sergii A.

    2018-06-01

    The need for real-time image generation of landscapes arises in various fields as part of tasks solved by virtual and augmented reality systems, as well as geographic information systems. Such systems provide opportunities for collecting, storing, analyzing and graphically visualizing geographic data. Algorithmic and hardware-software tools for increasing the realism and efficiency of environment visualization in 3D visualization systems are proposed. This paper discusses a modified path tracing algorithm with a two-level hierarchy of bounding volumes that finds intersections with Axis-Aligned Bounding Boxes. The proposed algorithm eliminates branching and hence is more suitable for implementation on multi-threaded CPUs and GPUs. A modified ROAM algorithm is used to solve the problem of high-quality visualization of reliefs and landscapes. The algorithm is implemented on parallel systems: clusters and Compute Unified Device Architecture networks. Results show that the implementation on MPI clusters is more efficient than on Graphics Processing Units/Graphics Processing Clusters and allows real-time synthesis. The organization and algorithms of a parallel GPU system for 3D pseudo-stereo image/video synthesis are proposed. After analyzing the possibility of realizing each stage on a parallel GPU architecture, 3D pseudo-stereo synthesis is performed. An experimental prototype of a specialized hardware-software system for 3D pseudo-stereo imaging and video was developed on the CPU/GPU. The experimental results show that the proposed adaptation of 3D pseudo-stereo imaging to the architecture of GPU systems is efficient. It also accelerates the computational procedures of 3D pseudo-stereo synthesis for the anaglyph and anamorphic formats of the 3D stereo frame without performing optimization procedures. The acceleration is on average 11 and 54 times for the test GPUs.
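The branch elimination mentioned for the bounding-volume traversal is typically achieved with the "slab" ray/AABB test, which replaces per-axis branching with min/max operations. A minimal sketch of that standard test (not the authors' implementation):

```python
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """Branch-free slab test: a ray hits an axis-aligned bounding box iff
    the per-axis entry/exit intervals overlap. inv_dir holds precomputed
    1/d components (assumed non-zero here for simplicity)."""
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv
        t2 = (hi - o) * inv
        t_near = max(t_near, min(t1, t2))  # latest entry across the slabs
        t_far = min(t_far, max(t1, t2))    # earliest exit across the slabs
    return t_near <= t_far
```

On a GPU the min/max pairs map directly to hardware instructions, so every thread in a warp executes the same instruction stream whether it hits or misses, which is the point of removing the branches.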

  8. Large-scale parallel genome assembler over cloud computing environment.

    Science.gov (United States)

    Das, Arghya Kusum; Koppa, Praveen Kumar; Goswami, Sayan; Platania, Richard; Park, Seung-Jong

    2017-06-01

    The size of high throughput DNA sequencing data has already reached the terabyte scale. To manage this huge volume of data, many downstream sequencing applications started using locality-based computing over different cloud infrastructures to take advantage of elastic (pay as you go) resources at a lower cost. However, the locality-based programming model (e.g. MapReduce) is relatively new. Consequently, developing scalable data-intensive bioinformatics applications using this model, and understanding the hardware environment that these applications require for good performance, both require further research. In this paper, we present a de Bruijn graph oriented Parallel Giraph-based Genome Assembler (GiGA), as well as the hardware platform required for its optimal performance. GiGA uses the power of Hadoop (MapReduce) and Giraph (large-scale graph analysis) to achieve high scalability over hundreds of compute nodes by collocating the computation and data. GiGA achieves significantly higher scalability with competitive assembly quality compared to contemporary parallel assemblers (e.g. ABySS and Contrail) over a traditional HPC cluster. Moreover, we show that the performance of GiGA is significantly improved by using an SSD-based private cloud infrastructure rather than a traditional HPC cluster. We observe that the performance of GiGA on 256 cores of this SSD-based cloud infrastructure closely matches that of 512 cores of the traditional HPC cluster.
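The de Bruijn graph formulation that GiGA (like ABySS and Contrail) builds on is simple to state: nodes are (k-1)-mers, and each k-mer occurring in a read contributes an edge from its prefix to its suffix. A toy sketch of the graph construction, independent of the Giraph implementation:

```python
from collections import defaultdict

def de_bruijn_graph(reads, k):
    """Map each (k-1)-mer prefix to the (k-1)-mer suffixes that follow it.
    In a distributed assembler these edges become messages between vertices."""
    graph = defaultdict(list)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].append(kmer[1:])   # edge: prefix -> suffix
    return graph


# Two overlapping reads reconstruct the walk AC -> CG -> GT -> TA.
g = de_bruijn_graph(["ACGT", "CGTA"], k=3)
```

Assembly then amounts to finding walks through this graph (Eulerian-path style), which is why a vertex-centric framework like Giraph, co-locating each vertex with its edge data, fits the problem.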

  9. Computed tomographic localization of pelvic hydatid disease

    International Nuclear Information System (INIS)

    Kotoulas, G.; Gouliamos, A.; Kalovidouris, A.; Vlahos, L.; Papavasiliou, C.

    1990-01-01

    Nine patients with a history of hydatid disease were examined by CT. Localization of the hydatid cysts in the pelvis was established by anatomical criteria. Occasionally, the transverse plane can confuse the precise localization of a lesion. A central location, close to the boundaries of the bladder and rectum, can define a peritoneal location; a further posterolateral retrovesical location can be considered retroperitoneal. Using these criteria, 8 cysts were situated within the peritoneum and 1 within the retroperitoneum. (author). 16 refs.; 5 figs.; 1 tab

  10. Computed tomographic localization of pelvic hydatid disease

    Energy Technology Data Exchange (ETDEWEB)

    Kotoulas, G.; Gouliamos, A.; Kalovidouris, A.; Vlahos, L.; Papavasiliou, C. (Athens University (Greece). Areteion Hospital, Department of Radiology)

    Nine patients with a history of hydatid disease were examined by CT. Localization of the hydatid cysts in the pelvis was established by anatomical criteria. Occasionally, the transverse plane can confuse the precise localization of a lesion. A central location, close to the boundaries of the bladder and rectum, can define a peritoneal location; a further posterolateral retrovesical location can be considered retroperitoneal. Using these criteria, 8 cysts were situated within the peritoneum and 1 within the retroperitoneum. (author). 16 refs.; 5 figs.; 1 tab.

  11. DEFACTO: A Design Environment for Adaptive Computing Technology

    National Research Council Canada - National Science Library

    Hall, Mary

    2003-01-01

    This report describes the activities of the DEFACTO project, a Design Environment for Adaptive Computing Technology funded under the DARPA Adaptive Computing Systems and Just-In-Time-Hardware programs...

  12. The sociability of computer-supported collaborative learning environments

    NARCIS (Netherlands)

    Kreijns, C.J.; Kirschner, P.A.; Jochems, W.M.G.

    2002-01-01

    There is much positive research on computer-supported collaborative learning (CSCL) environments in asynchronous distributed learning groups (DLGs). There is also research that shows that contemporary CSCL environments do not completely fulfil expectations on supporting interactive group learning,

  13. Distributed Computations Environment Protection Using Artificial Immune Systems

    Directory of Open Access Journals (Sweden)

    A. V. Moiseev

    2011-12-01

    Full Text Available In this article the authors describe the possibility of applying artificial immune systems to protect distributed computing environments from certain types of malicious impacts.

  14. Distributed multiscale computing with MUSCLE 2, the Multiscale Coupling Library and Environment

    NARCIS (Netherlands)

    Borgdorff, J.; Mamonski, M.; Bosak, B.; Kurowski, K.; Ben Belgacem, M.; Chopard, B.; Groen, D.; Coveney, P.V.; Hoekstra, A.G.

    2014-01-01

    We present the Multiscale Coupling Library and Environment: MUSCLE 2. This multiscale component-based execution environment has a simple to use Java, C++, C, Python and Fortran API, compatible with MPI, OpenMP and threading codes. We demonstrate its local and distributed computing capabilities and

  15. InSAR Scientific Computing Environment

    Science.gov (United States)

    Rosen, Paul A.; Sacco, Gian Franco; Gurrola, Eric M.; Zebker, Howard A.

    2011-01-01

    This computing environment is the next generation of geodetic image processing technology for repeat-pass Interferometric Synthetic Aperture Radar (InSAR) sensors, identified by the community as a needed capability to provide flexibility and extensibility in reducing measurements from radar satellites and aircraft to new geophysical products. This software allows users of interferometric radar data the flexibility to process from Level 0 to Level 4 products using a variety of algorithms and for a range of available sensors. There are many radar satellites in orbit today delivering to the science community data of unprecedented quantity and quality, making possible large-scale studies in climate research, natural hazards, and the Earth's ecosystem. The proposed DESDynI mission, now under consideration by NASA for launch later in this decade, would provide time series and multi-image measurements that permit 4D models of Earth surface processes so that, for example, climate-induced changes over time would become apparent and quantifiable. This advanced data processing technology, applied to a global data set such as from the proposed DESDynI mission, enables a new class of analyses at time and spatial scales unavailable using current approaches. This software implements an accurate, extensible, and modular processing system designed to realize the full potential of InSAR data from future missions such as the proposed DESDynI, existing radar satellite data, as well as data from the NASA UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar), and other airborne platforms. The processing approach has been re-thought in order to enable multi-scene analysis by adding new algorithms and data interfaces, to permit user-reconfigurable operation and extensibility, and to capitalize on codes already developed by NASA and the science community. 
The framework incorporates modern programming methods based on recent research, including object-oriented scripts controlling legacy and

  16. High performance computing network for cloud environment using simulators

    OpenAIRE

    Singh, N. Ajith; Hemalatha, M.

    2012-01-01

    Cloud computing is the next generation of computing. Adopting cloud computing is like signing up for a new kind of website: the GUI that controls the cloud deployment directly controls the hardware resources and your application. The difficult part of cloud computing is deploying it in a real environment. It is difficult to know the exact cost and the resource requirements until the service is actually bought, and likewise whether it will support the existing application which is available on traditional...

  17. Local environment can enhance fidelity of quantum teleportation

    Science.gov (United States)

    Badziąg, Piotr; Horodecki, Michał; Horodecki, Paweł; Horodecki, Ryszard

    2000-07-01

    We show how an interaction with the environment can enhance fidelity of quantum teleportation. To this end, we present examples of states which cannot be made useful for teleportation by any local unitary transformations; nevertheless, after being subjected to a dissipative interaction with the local environment, the states allow for teleportation with genuinely quantum fidelity. The surprising fact here is that the necessary interaction does not require any intelligent action from the parties sharing the states. In passing, we produce some general results regarding optimization of teleportation fidelity by local action. We show that bistochastic processes cannot improve fidelity of two-qubit states. We also show that in order to have their fidelity improvable by a local process, the bipartite states must violate the so-called reduction criterion of separability.
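For context, the quantitative link behind these statements is standard in this literature: for a two-qubit state $\rho$, the fidelity of standard teleportation is governed by the maximal singlet fraction,

```latex
f(\rho) = \max_{\Psi} \langle \Psi | \rho | \Psi \rangle ,
\qquad
F(\rho) = \frac{2 f(\rho) + 1}{3},
```

where the maximum runs over maximally entangled states $\Psi$. A local dissipative interaction that raises $f$ above $1/2$ therefore pushes $F$ above the classical limit of $2/3$, which is the "genuinely quantum fidelity" referred to in the abstract.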

  18. Computed tomographic localization of Rt. Juxtadiaphragmatic lesions

    International Nuclear Information System (INIS)

    Lee, Jong Doo; Choe, Kyu Ok; Kim, Ki Whang; Hong, In Soo

    1989-01-01

    Several reports have been published on the CT differentiation of peridiaphragmatic fluid collections using four useful signs: the diaphragm, displaced crus, bare area and interface signs. Transverse CT scans of 20 patients with abnormal diaphragmatic position due to large intrathoracic or intraabdominal lesions were analysed on the basis of those signs. Difficulties were encountered with differentiation when laterally located lesions did not extend as far medially as the crus, and when the diaphragmatic stripe could not be distinguished from thickened pleura or the adjacent wall of lesions. As a result, only limited cases can be adequately assessed by the diaphragm or displaced crus sign. Furthermore, the bare area and interface signs seemed not to be useful at all. However, the relationship between the caudal tip of lesions and the thoracoabdominal wall was always constant in each thoracic or abdominal lesion. All intrathoracic masses or empyemas were attached to the thoracic wall, displacing properitoneal and perirenal fat medially or inferiorly. By contrast, all intraabdominal masses were separated from the abdominal wall, displacing properitoneal fat or peritoneum laterally. The key to accurate localization seemed to be identification of this relationship

  19. Local computer network of the JINR Neutron Physics Laboratory

    International Nuclear Information System (INIS)

    Alfimenkov, A.V.; Vagov, V.A.; Vajdkhadze, F.

    1988-01-01

    A new high-speed local computer network, with an intelligent network adapter (NA) as its hardware base, was developed in the JINR Neutron Physics Laboratory to increase operation efficiency and data transfer rate. The NA consists of a computer bus interface, a cable former, and a microcomputer segment designed both for program realization of the channel-level protocol and for organization of bidirectional transfer of information through a direct access channel between the monochannel and computer memory, with or without buffering in the NA's operating memory device

  20. Computers and the Environment: Minimizing the Carbon Footprint

    Science.gov (United States)

    Kaestner, Rich

    2009-01-01

    Computers can be good and bad for the environment; one can maximize the good and minimize the bad. When dealing with environmental issues, it's difficult to ignore the computing infrastructure. With an operations carbon footprint equal to the airline industry's, computer energy use is only part of the problem; everyone is also dealing with the use…

  1. An expert system for a local planning environment

    NARCIS (Netherlands)

    Meester, G.J.; Meester, G.J.

    1993-01-01

    In this paper, we discuss the design of an Expert System (ES) that supports decision making in a Local Planning System (LPS) environment. The LPS provides the link between a high level factory planning system (rough cut capacity planning and material coordination) and the actual execution of jobs on

  2. A high performance scientific cloud computing environment for materials simulations

    OpenAIRE

    Jorissen, Kevin; Vila, Fernando D.; Rehr, John J.

    2011-01-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including...

  3. Ubiquitous Computing in Physico-Spatial Environments

    DEFF Research Database (Denmark)

    Dalsgård, Peter; Eriksson, Eva

    2007-01-01

    Interaction design of pervasive and ubiquitous computing (UC) systems must take into account physico-spatial issues as technology is implemented into our physical surroundings. In this paper we discuss how one conceptual framework for understanding interaction in context, Activity Theory (AT...

  4. Opisthorchiasis in Northeastern Thailand: Effect of local environment and culture

    Directory of Open Access Journals (Sweden)

    Beuy Joob

    2015-06-01

    Full Text Available Opisthorchiasis is a trematode infection. This parasitic infestation causes a chronic hepatobiliary tract infection and chronic irritation that can finally lead to cholangiocarcinoma. It is highly endemic in the northeastern region of Thailand and contributes to many cholangiocarcinoma cases annually. The attempt to control the disease has become a national policy. However, sanitation remains a major underlying factor leading to infection, and meanwhile the poverty and low education of the local people are an important concern. In this opinion piece, the authors discuss the effect of local environment and culture on opisthorchiasis in northeastern Thailand. Due to changing patterns in the local environment, global warming and globalization, shifts in the dynamics of the disease can be observed.

  5. THE VALUE OF CLOUD COMPUTING IN THE BUSINESS ENVIRONMENT

    OpenAIRE

    Mircea GEORGESCU; Marian MATEI

    2013-01-01

    Without any doubt, cloud computing has become one of the most significant trends in any enterprise, not only for IT businesses. Besides the fact that the cloud can offer access to low-cost, considerably flexible computing resources, cloud computing also provides the capacity to create a new relationship between business entities and corporate IT departments. The value added to the business environment is given by the balanced use of resources offered by cloud computing. The cloud mentality i...

  6. Research computing in a distributed cloud environment

    International Nuclear Information System (INIS)

    Fransham, K; Agarwal, A; Armstrong, P; Bishop, A; Charbonneau, A; Desmarais, R; Hill, N; Gable, I; Gaudet, S; Goliath, S; Impey, R; Leavett-Brown, C; Ouellete, J; Paterson, M; Pritchet, C; Penfold-Brown, D; Podaima, W; Schade, D; Sobie, R J

    2010-01-01

    The recent increase in availability of Infrastructure-as-a-Service (IaaS) computing clouds provides a new way for researchers to run complex scientific applications. However, using cloud resources for a large number of research jobs requires significant effort and expertise. Furthermore, running jobs on many different clouds presents even more difficulty. In order to make it easy for researchers to deploy scientific applications across many cloud resources, we have developed a virtual machine resource manager (Cloud Scheduler) for distributed compute clouds. In response to a user's job submission to a batch system, the Cloud Scheduler manages the distribution and deployment of user-customized virtual machines across multiple clouds. We describe the motivation for and implementation of a distributed cloud using the Cloud Scheduler that is spread across both commercial and dedicated private sites, and present some early results of scientific data analysis using the system.
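The Cloud Scheduler pattern described here (watch the batch queue, boot user-customized VMs wherever capacity exists across multiple clouds) can be reduced to a toy loop. The structures below are illustrative only, not the actual Cloud Scheduler API:

```python
def schedule(pending_jobs, clouds, booted):
    """Toy scheduling pass: for each queued job, boot the user's VM image on
    the first cloud with free capacity, so the batch system can dispatch the
    job to the resulting worker node."""
    for job in pending_jobs:
        for cloud in clouds:
            if cloud["free_slots"] > 0:
                cloud["free_slots"] -= 1
                booted.append((job["image"], cloud["name"]))
                break  # job placed; move on to the next one
    return booted


# Two jobs spread across a private site and a commercial cloud.
clouds = [{"name": "private", "free_slots": 1},
          {"name": "commercial", "free_slots": 2}]
jobs = [{"image": "atlas-vm"}, {"image": "atlas-vm"}]
placements = schedule(jobs, clouds, [])
```

The real system adds the parts this sketch omits: contextualizing each VM for the user's job, retiring idle VMs, and tolerating clouds that fail to boot.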

  7. Security in cloud computing and virtual environments

    OpenAIRE

    Aarseth, Raymond

    2015-01-01

    Cloud computing is a big buzzwords today. Just watch the commercials on TV and I can promise that you will hear the word cloud service at least once. With the growth of cloud technology steadily rising, and everything from cellphones to cars connected to the cloud, how secure is cloud technology? What are the caveats of using cloud technology? And how does it all work? This thesis will discuss cloud security and the underlying technology called Virtualization to ...

  8. Distributed computing environment for Mine Warfare Command

    OpenAIRE

    Pritchard, Lane L.

    1993-01-01

    Approved for public release; distribution is unlimited. The Mine Warfare Command in Charleston, South Carolina has been converting its information systems architecture from a centralized mainframe based system to a decentralized network of personal computers over the past several years. This thesis analyzes the progress Of the evolution as of May of 1992. The building blocks of a distributed architecture are discussed in relation to the choices the Mine Warfare Command has made to date. Ar...

  9. A high performance scientific cloud computing environment for materials simulations

    Science.gov (United States)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  10. Distributed computing environment monitoring and user expectations

    International Nuclear Information System (INIS)

    Cottrell, R.L.A.; Logg, C.A.

    1996-01-01

    This paper discusses the growing needs for distributed system monitoring and compares it to current practices. It then goes on to identify the components of distributed system monitoring and shows how they are implemented and successfully used at one site today to address the Local Area Network (LAN), the Wide Area Network (WAN), and host monitoring. It shows how this monitoring can be used to develop realistic service level expectations and also identifies the costs. Finally, the paper briefly discusses the future challenges in network monitoring. (author)

  11. Distributed computing environment monitoring and user expectations

    International Nuclear Information System (INIS)

    Cottrell, R.L.A.; Logg, C.A.

    1995-11-01

    This paper discusses the growing needs for distributed system monitoring and compares it to current practices. It then goes on to identify the components of distributed system monitoring and shows how they are implemented and successfully used at one site today to address the Local Area Network (LAN), network services and applications, the Wide Area Network (WAN), and host monitoring. It shows how this monitoring can be used to develop realistic service level expectations and also identifies the costs. Finally, the paper briefly discusses the future challenges in network monitoring
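    The step from raw monitoring data to "realistic service level expectations" mentioned in both records can be illustrated with a minimal sketch (the statistics chosen here, median baseline and 95th-percentile alert threshold, are my assumptions, not the paper's method):

    ```python
    # Hedged sketch: derive a service level expectation from round-trip-time
    # samples collected by host monitoring. The baseline/threshold choice
    # (median and 95th percentile) is illustrative.
    def service_level(samples_ms):
        s = sorted(samples_ms)
        median = s[len(s) // 2]
        p95 = s[min(len(s) - 1, int(0.95 * len(s)))]
        return {"baseline_ms": median, "alert_above_ms": p95}

    sle = service_level([12, 14, 13, 15, 12, 90, 13, 14, 12, 13])
    ```

    A single 90 ms outlier raises the alert threshold without disturbing the baseline, which is the kind of robustness a realistic expectation needs.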

  12. Exploiting Locality in Quantum Computation for Quantum Chemistry.

    Science.gov (United States)

    McClean, Jarrod R; Babbush, Ryan; Love, Peter J; Aspuru-Guzik, Alán

    2014-12-18

    Accurate prediction of chemical and material properties from first-principles quantum chemistry is a challenging task on traditional computers. Recent developments in quantum computation offer a route toward highly accurate solutions with polynomial cost; however, this solution still carries a large overhead. In this Perspective, we aim to bring together known results about the locality of physical interactions from quantum chemistry with ideas from quantum computation. We show that the utilization of spatial locality combined with the Bravyi-Kitaev transformation offers an improvement in the scaling of known quantum algorithms for quantum chemistry and provides numerical examples to help illustrate this point. We combine these developments to improve the outlook for the future of quantum chemistry on quantum computers.

  13. Student Advising and Retention Application in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Gurdeep S Hura

    2016-11-01

    Full Text Available  This paper proposes a new user-friendly application enhancing and expanding the advising services of Gradesfirst, currently being used for advising and retention by the Athletic Department of UMES, with a view to implementing new performance activities like mentoring, tutoring, scheduling, and study hall hours into existing tools. This application includes various measurements that can be used to monitor and improve the performance of the students in the Athletic Department of UMES by monitoring students’ weekly study hall hours and tutoring schedules. It also supervises tutors’ login and logout activities in order to monitor their effectiveness, supervises tutor-tutee interaction, and stores and analyzes the overall academic progress of each student. A dedicated server for providing services will be developed at the local site. The paper has been implemented in three steps. The first step involves the creation of an independent cloud computing environment that provides resources such as database creation, query-based statistical data, performance measure activities, and automated support of performance measures such as advising, mentoring, monitoring and tutoring. The second step involves the creation of an application known as Student Advising and Retention (SAR application in a cloud computing environment. This application has been designed to be a comprehensive database management system which contains relevant data regarding student academic development that supports various strategic advising and monitoring of students. The third step involves the creation of a systematic advising chart and frameworks which help advisors. The paper shows ways of creating the most appropriate advising technique based on the student’s academic needs. The proposed application runs on a Windows-based system. As stated above, the proposed application is expected to enhance and expand the current advising service of the Gradesfirst tool. A brief

  14. Local computations in Dempster-Shafer theory of evidence

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim

    2012-01-01

    Roč. 53, č. 8 (2012), s. 1155-1167 ISSN 0888-613X Grant - others:GA ČR(CZ) GAP403/12/2175 Program:GA Institutional support: RVO:67985556 Keywords: Discrete belief functions * Dempster-Shafer theory * conditional independence * decomposable model Subject RIV: IN - Informatics, Computer Science Impact factor: 1.729, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/jirousek-local computations in dempster–shafer theory of evidence.pdf

  15. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer assisted virtual environment - or CAVE TM — that allows scientists and engineers to literally walk into their data...

  16. Computational Tool for Aerothermal Environment Around Transatmospheric Vehicles, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this Project is to develop a high-fidelity computational tool for accurate prediction of aerothermal environment on transatmospheric vehicles. This...

  17. Beekeeping, environment and modernity in localities in Yucatan, Mexico

    Directory of Open Access Journals (Sweden)

    Enrique Rodríguez Balam

    2015-09-01

    Full Text Available In this paper, we reflect on the local knowledge about the European honey bee Apis mellifera scutellata, namely its biology, behavior, social structure, communication, and the relationships that these organisms maintain with the environment and their natural enemies. We also discuss the impacts that land use has on this economic activity. The empirical knowledge of beekeepers converges quite well with the scientific knowledge concerning this group of organisms.

  18. A non-local computational boundary condition for duct acoustics

    Science.gov (United States)

    Zorumski, William E.; Watson, Willie R.; Hodge, Steve L.

    1994-01-01

    A non-local boundary condition is formulated for acoustic waves in ducts without flow. The ducts are two dimensional with constant area, but with variable impedance wall lining. Extension of the formulation to three dimensional and variable area ducts is straightforward in principle, but requires significantly more computation. The boundary condition simulates a nonreflecting wave field in an infinite duct. It is implemented by a constant matrix operator which is applied at the boundary of the computational domain. An efficient computational solution scheme is developed which allows calculations for high frequencies and long duct lengths. This computational solution utilizes the boundary condition to limit the computational space while preserving the radiation boundary condition. The boundary condition is tested for several sources. It is demonstrated that the boundary condition can be applied close to the sound sources, rendering the computational domain small. Computational solutions with the new non-local boundary condition are shown to be consistent with the known solutions for nonreflecting wavefields in an infinite uniform duct.

  19. Fostering computational thinking skills with a tangible blocks programming environment

    OpenAIRE

    Turchi, T; Malizia, A

    2016-01-01

    Computational Thinking has recently returned into the limelight as an essential skill to have for both the general public and disciplines outside Computer Science. It encapsulates those thinking skills integral to solving complex problems using a computer, thus widely applicable in our technological society. Several public initiatives such as the Hour of Code successfully introduced it to millions of people of different ages and backgrounds, mostly using Blocks Programming Environments like S...

  20. The DIII-D Computing Environment: Characteristics and Recent Changes

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1999-01-01

    The DIII-D tokamak national fusion research facility along with its predecessor Doublet III has been operating for over 21 years. The DIII-D computing environment consists of real-time systems controlling the tokamak, heating systems, and diagnostics, and systems acquiring experimental data from instrumentation; major data analysis server nodes performing short term and long term data access and data analysis; and systems providing mechanisms for remote collaboration and the dissemination of information over the world wide web. Computer systems for the facility have undergone incredible changes over the course of time as the computer industry has changed dramatically. Yet there are certain valuable characteristics of the DIII-D computing environment that have been developed over time and have been maintained to this day. Some of these characteristics include: continuous computer infrastructure improvements, distributed data and data access, computing platform integration, and remote collaborations. These characteristics are being carried forward as well as new characteristics resulting from recent changes which have included: a dedicated storage system and a hierarchical storage management system for raw shot data, various further infrastructure improvements including deployment of Fast Ethernet, the introduction of MDSplus, LSF and common IDL based tools, and improvements to remote collaboration capabilities. This paper will describe this computing environment, important characteristics that over the years have contributed to the success of DIII-D computing systems, and recent changes to computer systems

  1. Novel Ethernet Based Optical Local Area Networks for Computer Interconnection

    NARCIS (Netherlands)

    Radovanovic, Igor; van Etten, Wim; Taniman, R.O.; Kleinkiskamp, Ronny

    2003-01-01

    In this paper we present new optical local area networks for fiber-to-the-desk application. Presented networks are expected to bring a solution for having optical fibers all the way to computers. To bring the overall implementation costs down we have based our networks on short-wavelength optical

  2. Environments for online maritime simulators with cloud computing capabilities

    Science.gov (United States)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents the cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve a good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the last achievements in virtual and augmented reality will enhance the overall experience leading to new developments and innovations. We have to deal with a multiprocessing situation using advanced technologies and distributed applications using remote ship scenario and automation of ship operations.

  3. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Vigil,Benny Manuel [Los Alamos National Laboratory; Ballance, Robert [SNL; Haskell, Karen [SNL

    2012-08-09

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  4. Micro-computer cards for hard industrial environment

    Energy Technology Data Exchange (ETDEWEB)

    Breton, J M

    1984-03-15

    Approximately 60% of present or future distributed systems have, or will have, operational units installed in hard environments. In these applications, which include canalization and industrial motor control, robotics and process control, systems must be easily applied in environments not made for electronic use. The development of card systems for this hard industrial environment, which is found in the petrochemical industry and in mines, is described. The CMOS technology of the National Semiconductor CIM card system allows real-time microcomputer applications to be efficient and functional in hard industrial environments.

  5. Application of local area network technology in an engineering environment

    International Nuclear Information System (INIS)

    Powell, A.D.; Sokolowski, M.A.

    1990-01-01

    This paper reports on the application of local area network technology in an engineering environment. Mobil Research and Development Corporation Engineering, Dallas, Texas, has installed a local area network (LAN) linking over 85 microcomputers. This network, which has been in existence for more than three years, provides common access by all engineers to quality output devices such as laser printers and multi-color pen plotters; IBM mainframe connections; electronic mail and file transfer; and common engineering programs. The network has been expanded via a wide area Ethernet network to link the Dallas location with a functionally equivalent LAN of over 400 microcomputers in Princeton, N.J. Additionally, engineers on assignment at remote areas in Europe, the U.S., and Africa, and on project task forces, have dial-in access to the network via telephone lines

  6. Local-order metric for condensed-phase environments

    Science.gov (United States)

    Martelli, Fausto; Ko, Hsin-Yu; Oǧuz, Erdal C.; Car, Roberto

    2018-02-01

    We introduce a local order metric (LOM) that measures the degree of order in the neighborhood of an atomic or molecular site in a condensed medium. The LOM maximizes the overlap between the spatial distribution of sites belonging to that neighborhood and the corresponding distribution in a suitable reference system. The LOM takes a value tending to zero for completely disordered environments and tending to one for environments that perfectly match the reference. The site-averaged LOM and its standard deviation define two scalar order parameters, S and δS, that characterize with excellent resolution crystals, liquids, and amorphous materials. We show with molecular dynamics simulations that S, δS, and the LOM provide very insightful information in the study of structural transformations, such as those occurring when ice spontaneously nucleates from supercooled water or when a supercooled water sample becomes amorphous upon progressive cooling.
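    The overlap idea behind the LOM can be sketched numerically (a simplification: the published metric also maximizes over rotations and site permutations, which this toy version omits, and the Gaussian scoring kernel here is an assumption):

    ```python
    import math

    # Simplified local-order-metric sketch: score the overlap between a
    # site's neighbour positions and a reference pattern with a Gaussian
    # kernel. Tends to 1 for a perfect match, toward 0 for disorder.
    # The real LOM additionally optimizes over rotations/permutations.
    def lom(neighbours, reference, sigma=0.5):
        n = len(reference)
        score = 0.0
        for p, r in zip(neighbours, reference):
            d2 = sum((a - b) ** 2 for a, b in zip(p, r))
            score += math.exp(-d2 / (2 * sigma ** 2))
        return score / n

    # 2D "square" reference pattern and a thermally distorted neighbourhood
    ref = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]
    perfect = lom(ref, ref)
    noisy = lom([(1.2, 0.1), (0.1, 0.9), (-0.8, 0.2), (0.3, -1.1)], ref)
    ```

    Averaging such per-site scores over all sites would give the scalar order parameter S described in the abstract.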

  7. Beekeeping, environment and modernity in localities in Yucatan, Mexico

    Directory of Open Access Journals (Sweden)

    Enrique Rodríguez Balam

    2015-05-01

    Full Text Available http://dx.doi.org/10.5007/2175-7925.2015v28n3p143 In this paper, we reflect on the local knowledge about the European honey bee  Apis mellifera scutellata, namely its biology, behavior, social structure, communication, and the relationships that these organisms maintain with the environment and their natural enemies. We also discuss the impacts that land use has on this economic activity. The empirical knowledge of beekeepers converges quite well with the scientific knowledge concerning this group of organisms.

  8. Computed tomography of localized dilatation of the intrahepatic bile ducts

    International Nuclear Information System (INIS)

    Araki, T.; Itai, Y.; Tasaka, A.

    1981-01-01

    Twenty-nine patients showed localized dilatation of the intrahepatic bile ducts on computed tomography, usually unaccompanied by jaundice. Congenital dilatation was diagnosed when associated with a choledochal cyst, while cholangiographic contrast material was helpful in differentiating such dilatation from a simple cyst by showing its communication with the biliary tract when no choledochal cyst was present. Obstructive dilatation was associated with intrahepatic calculi in 4 cases, hepatoma in 9, cholangioma in 5, metastatic tumor in 5, and polycystic disease in 2. Cholangioma and intrahepatic calculi had a greater tendency to accompany such localized dilatation; in 2 cases, the dilatation was the only clue to the underlying disorder

  9. Design requirements for ubiquitous computing environments for healthcare professionals.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Eriksson, Henrik

    2004-01-01

    Ubiquitous computing environments can support clinical administrative routines in new ways. The aim of such computing approaches is to enhance routine physical work, thus it is important to identify specific design requirements. We studied healthcare professionals in an emergency room and developed the computer-augmented environment NOSTOS to support teamwork in that setting. NOSTOS uses digital pens and paper-based media as the primary input interface for data capture and as a means of controlling the system. NOSTOS also includes a digital desk, walk-up displays, and sensor technology that allow the system to track documents and activities in the workplace. We propose a set of requirements and discuss the value of tangible user interfaces for healthcare personnel. Our results suggest that the key requirements are flexibility in terms of system usage and seamless integration between digital and physical components. We also discuss how ubiquitous computing approaches like NOSTOS can be beneficial in the medical workplace.

  10. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    ABC95 is an array computer with a multi-function network based on FPGA technology. The multi-function network supports conflict-free processor access to data in memory, and supports processor-to-processor data exchange over an enhanced MESH network. The ABC95 instruction set includes control instructions, scalar instructions, and vector instructions; the network instructions in particular are introduced. A programming environment for ABC95 assembly language is designed, and a programming environment for ABC95 under VC++ is put forward, including functions to load ABC95 programs and data, store results, run programs, and so on. In particular, the data type for ABC95 conflict-free access is defined. The results show that these technologies allow programs for the ABC95 array computer to be developed effectively.

  11. Collaborative virtual reality environments for computational science and design

    International Nuclear Information System (INIS)

    Papka, M. E.

    1998-01-01

    The authors are developing a networked, multi-user, virtual-reality-based collaborative environment coupled to one or more petaFLOPs computers, enabling the interactive simulation of 10^9-atom systems. The purpose of this work is to explore the requirements for this coupling. Through the design, development, and testing of such systems, they hope to gain knowledge that allows computational scientists to discover and analyze their results more quickly and in a more intuitive manner

  12. A Secure Authenticate Framework for Cloud Computing Environment

    OpenAIRE

    Nitin Nagar; Pradeep k. Jatav

    2014-01-01

    Cloud computing has an important aspect for the companies to build and deploy their infrastructure and application. Data Storage service in the cloud computing is easy as compare to the other data storage services. At the same time, cloud security in the cloud environment is challenging task. Security issues ranging from missing system configuration, lack of proper updates, or unwise user actions from remote data storage. It can expose user’s private data and information to unwanted access. i...

  13. A PC/workstation cluster computing environment for reservoir engineering simulation applications

    International Nuclear Information System (INIS)

    Hermes, C.E.; Koo, J.

    1995-01-01

    Like the rest of the petroleum industry, Texaco has been transferring its applications and databases from mainframes to PC's and workstations. This transition has been very positive because it provides an environment for integrating applications, increases end-user productivity, and in general reduces overall computing costs. On the down side, the transition typically results in a dramatic increase in workstation purchases and raises concerns regarding the cost and effective management of computing resources in this new environment. The workstation transition also places the user in a Unix computing environment which, to say the least, can be quite frustrating to learn and to use. This paper describes the approach, philosophy, architecture, and current status of the new reservoir engineering/simulation computing environment developed at Texaco's E and P Technology Dept. (EPTD) in Houston. The environment is representative of those under development at several other large oil companies and is based on a cluster of IBM and Silicon Graphics Intl. (SGI) workstations connected by a fiber-optics communications network and engineering PC's connected to local area networks, or Ethernets. Because computing resources and software licenses are shared among a group of users, the new environment enables the company to get more out of its investments in workstation hardware and software

  14. Computational modeling of local hemodynamics phenomena: methods, tools and clinical applications

    International Nuclear Information System (INIS)

    Ponzini, R.; Rizzo, G.; Vergara, C.; Veneziani, A.; Morbiducci, U.; Montevecchi, F.M.; Redaelli, A.

    2009-01-01

    Local hemodynamics plays a key role in the onset of vessel wall pathophysiology, with peculiar blood flow structures (i.e. spatial velocity profiles, vortices, re-circulating zones, helical patterns and so on) characterizing the behavior of specific vascular districts. Thanks to evolving technologies in computer science, mathematical modeling and hardware performance, the study of local hemodynamics can today also make use of a virtual environment to perform hypothesis testing, product development, protocol design and methods validation that just a couple of decades ago would not have been thinkable. Computational fluid dynamics (Cfd) appears to be more than a complementary partner to in vitro modeling and a possible substitute for animal models, furnishing a privileged environment for cheap, fast and reproducible data generation.

  15. Cloud Computing and Virtual Desktop Infrastructures in Afloat Environments

    OpenAIRE

    Gillette, Stefan E.

    2012-01-01

    The phenomenon of “cloud computing” has become ubiquitous among users of the Internet and many commercial applications. Yet, the U.S. Navy has conducted limited research in this nascent technology. This thesis explores the application and integration of cloud computing both at the shipboard level and in a multi-ship environment. A virtual desktop infrastructure, mirroring a shipboard environment, was built and analyzed in the Cloud Lab at the Naval Postgraduate School, which offers a potentia...

  16. Spike-timing-based computation in sound localization.

    Directory of Open Access Journals (Sweden)

    Dan F M Goodman

    2010-11-01

    Full Text Available Spike timing is precise in the auditory system and it has been argued that it conveys information about auditory stimuli, in particular about the location of a sound source. However, beyond simple time differences, the way in which neurons might extract this information is unclear and the potential computational advantages are unknown. The computational difficulty of this task for an animal is to locate the source of an unexpected sound from two monaural signals that are highly dependent on the unknown source signal. In neuron models consisting of spectro-temporal filtering and spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. The model was able to accurately estimate the location of previously unknown sounds in both azimuth and elevation (including front/back discrimination) in a known acoustic environment. We found that multiple representations of different acoustic environments could coexist as sets of overlapping neural assemblies which could be associated with spatial locations by Hebbian learning. The model demonstrates the computational relevance of relative spike timing to extract spatial information about sources independently of the source signal.
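    The core binaural cue the model exploits, the interaural time difference, can be illustrated without spiking neurons at all (this cross-correlation sketch is a didactic stand-in, not the paper's synchrony-pattern mechanism):

    ```python
    # Hedged sketch: estimate the interaural time difference (ITD) between
    # two "ear" signals by brute-force cross-correlation over candidate lags.
    # The paper instead maps this cue onto spike-synchrony patterns.
    def estimate_itd(left, right, max_lag):
        """Return the lag (in samples) of `right` relative to `left`
        that maximises their correlation."""
        best_lag, best_corr = 0, float("-inf")
        for lag in range(-max_lag, max_lag + 1):
            corr = sum(
                left[i] * right[i + lag]
                for i in range(len(left))
                if 0 <= i + lag < len(right)
            )
            if corr > best_corr:
                best_lag, best_corr = lag, corr
        return best_lag

    sig = [0, 0, 0, 1, 2, 1, 0, 0, 0, 0]
    delayed = [0, 0, 0, 0, 0, 1, 2, 1, 0, 0]  # same source, 2 samples later
    itd = estimate_itd(sig, delayed, max_lag=4)
    ```

    The estimated lag maps to an azimuth once the head geometry (microphone spacing, speed of sound) is known; the paper's head-related transfer functions additionally encode elevation and front/back cues.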

  17. Event detection and localization for small mobile robots using reservoir computing.

    Science.gov (United States)

    Antonelo, E A; Schrauwen, B; Stroobandt, D

    2008-08-01

    Reservoir Computing (RC) techniques use a fixed (usually randomly created) recurrent neural network, or more generally any dynamic system, which operates at the edge of stability, where only a linear static readout output layer is trained by standard linear regression methods. In this work, RC is used for detecting complex events in autonomous robot navigation. This can be extended to robot localization tasks which are solely based on a few low-range, high-noise sensory data. The robot thus builds an implicit map of the environment (after learning) that is used for efficient localization by simply processing the input stream of distance sensors. These techniques are demonstrated in both a simple simulation environment and in the physically realistic Webots simulation of the commercially available e-puck robot, using several complex and even dynamic environments.
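    The generic RC recipe described here, a fixed random recurrent network with only a linear readout trained by regression, can be sketched on a simple memory task (the sizes, scaling, and task are illustrative choices, not the paper's robot setup):

    ```python
    import numpy as np

    # Minimal echo state network sketch: fixed random reservoir near the
    # edge of stability, linear readout trained by least squares to recall
    # the input from one step earlier. Illustrative, not the paper's model.
    rng = np.random.default_rng(0)
    n_res, n_steps = 50, 500

    W = rng.standard_normal((n_res, n_res))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # spectral radius 0.9
    w_in = rng.standard_normal(n_res)

    u = rng.uniform(-1, 1, n_steps)             # input stream
    x = np.zeros(n_res)
    states = np.empty((n_steps, n_res))
    for t in range(n_steps):
        x = np.tanh(W @ x + w_in * u[t])        # reservoir update (fixed weights)
        states[t] = x

    # Train only the readout: recover u[t-1] from the reservoir state x[t].
    target = np.roll(u, 1)
    w_out, *_ = np.linalg.lstsq(states[1:], target[1:], rcond=None)
    pred = states[1:] @ w_out
    err = np.sqrt(np.mean((pred - target[1:]) ** 2))
    ```

    Only `w_out` is learned; the recurrent weights stay fixed, which is what makes training cheap enough for small robots. In the paper the readout instead classifies events or locations from low-range distance-sensor streams.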

  18. Protect Heterogeneous Environment Distributed Computing from Malicious Code Assignment

    Directory of Open Access Journals (Sweden)

    V. S. Gorbatov

    2011-09-01

    Full Text Available The paper describes the practical implementation of a system protecting distributed computing in a heterogeneous environment from malicious code in assigned tasks. The choice of technologies, the development of data structures, and a performance evaluation of the implemented security system are presented.

  19. Visual Reasoning in Computational Environment: A Case of Graph Sketching

    Science.gov (United States)

    Leung, Allen; Chan, King Wah

    2004-01-01

    This paper reports the case of a form six (grade 12) Hong Kong student's exploration of graph sketching in a computational environment. In particular, the student summarized his discovery in the form of two empirical laws. The student was interviewed and the interviewed data were used to map out a possible path of his visual reasoning. Critical…

  20. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  1. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

Full Text Available Today cloud computing has become a key technology for the online allotment of computing resources and the online storage of user data at lower cost, with computing resources available at all times over the Internet on a pay-per-use basis. Recently there is a growing need for resource management strategies in a cloud computing environment that encompass both end-user satisfaction and a high job-submission throughput with appropriate scheduling. One of the major and essential issues in resource management is allocating incoming tasks to suitable virtual machines (matchmaking). The main objective of this paper is to propose a matchmaking strategy between incoming requests and the various resources in the cloud environment that satisfies the requirements of users and balances the workload on resources. Load balancing is an important aspect of resource management in a cloud computing environment, so this paper proposes a dynamic weight active monitor (DWAM) load-balancing algorithm, which allocates incoming requests on the fly to all available virtual machines in an efficient manner in order to achieve better performance parameters such as response time, processing time, and resource utilization. The feasibility of the proposed algorithm is analyzed using the CloudSim simulator, which demonstrates the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results show that the proposed algorithm dramatically improves response time and data processing time, and utilizes resources more effectively than the Active Monitor and VM-assign algorithms.
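The abstract does not specify how DWAM computes its weights. The sketch below assumes a simple reading of "dynamic weight active monitor": each virtual machine's weight is its current load relative to capacity, recomputed after every assignment, and each request goes to the VM with the lowest relative load. The function name and the tie-breaking rule are illustrative, not from the paper.

```python
def dwam_assign(requests, vm_capacities):
    """Assign each request (its load value) to the VM whose current
    load-to-capacity ratio is lowest; the weights are recomputed on
    the fly after every assignment, as an active monitor would."""
    loads = [0.0] * len(vm_capacities)
    assignment = []
    for req in requests:
        # the VM with the smallest relative load wins this request
        vm = min(range(len(vm_capacities)),
                 key=lambda i: loads[i] / vm_capacities[i])
        loads[vm] += req
        assignment.append(vm)
    return assignment, loads
```

With capacities (2.0, 1.0) and four equal requests, the larger VM receives three of them, so the final loads stay roughly proportional to capacity.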

  2. The Computer Revolution in Science: Steps towards the realization of computer-supported discovery environments

    NARCIS (Netherlands)

    de Jong, Hidde; Rip, Arie

    1997-01-01

    The tools that scientists use in their search processes together form so-called discovery environments. The promise of artificial intelligence and other branches of computer science is to radically transform conventional discovery environments by equipping scientists with a range of powerful

  3. Local environment of zirconium in nuclear gels studied by XAS

    International Nuclear Information System (INIS)

    Pelegrin, E.; Ildefonse, Ph.; Calas, G.; Ricol, St.; Flank, A.M.

    1997-01-01

During lixiviation experiments, nuclear gels are formed and heavy metals are retained. In order to understand these retention mechanisms, we performed an analysis of the local environment of Zr in parent glasses and derived alteration gels both at the Zr-L II,III and Zr-K edges. Calibration of the method was conducted through the analysis of model compounds with known coordination number (CN): catapleite Na 2 ZrSi 3 O 9 ,2H 2 O (CN=6), baddeleyite ZrO 2 (CN=7) and zircon SiZrO 4 (CN=8). Nuclear glasses (R7T7, and a simplified nuclear glass V1) and gels obtained at 90 deg C, with leaching times from 7 to 12 months and with solution renewal, were also investigated (GR7T7R and GV1). Zr-L II,III XANES spectra evidenced that zirconium is 6-fold coordinated in R7T7 and V1 nuclear glasses. For GR7T7R and GV1 gels, the Zr local environment is significantly changed, and a mixture of CN (6 and 7) has been evidenced. Quantitative structural results were derived from EXAFS analysis at the Zr-K edge. In parent glasses, the derived Zr-O distance is 2.10±0.01 10 -10 m, in the range of Zr-O distances for octahedral coordination in model compounds. In both gels studied, Zr-O distances increase significantly up to 2.15±0.01 10 -10 m. This distance is close to that known in baddeleyite (2.158 10 -10 m). A better understanding of the Zr retention mechanism requires studying the second-neighbor contributions. (authors)

  4. 5 CFR 531.245 - Computing locality rates and special rates for GM employees.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Computing locality rates and special... Gm Employees § 531.245 Computing locality rates and special rates for GM employees. Locality rates and special rates are computed for GM employees in the same manner as locality rates and special rates...

  5. Human face recognition using eigenface in cloud computing environment

    Science.gov (United States)

    Siregar, S. T. M.; Syahputra, M. F.; Rahmat, R. F.

    2018-02-01

Face recognition for a single face does not take long to process, but an attendance or security system for a company with many faces to recognize can take a long time. Cloud computing is a computing service performed not on a local device but on data center infrastructure reached over the Internet. Cloud computing also provides a scalability solution: resources can be increased when larger data processing is required. This research applies the eigenface method, collecting training data through a REST interface that provides the resources, so that the server can process the data through the stages of the method. After researching and developing this application, it can be concluded that by implementing eigenfaces and applying the REST concept as an endpoint for exchanging the related information used as a resource, a model can be formed to perform face recognition.
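The abstract names the eigenface method without detailing it. Below is a minimal sketch of the standard PCA-based eigenface pipeline (train on flattened face images, project probes into the eigenface subspace, classify by nearest neighbour); the function names and the tiny synthetic gallery in the usage are illustrative, not from the paper, and the REST transport layer is omitted.

```python
import numpy as np

def train_eigenfaces(faces, k):
    """faces: (n_samples, n_pixels) matrix of flattened training images.
    Returns the mean face and the top-k eigenfaces (principal axes)."""
    mean = faces.mean(axis=0)
    centered = faces - mean
    # SVD of the centered data yields the eigenvectors of the covariance
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:k]

def recognize(face, mean, eigenfaces, gallery, labels):
    """Project the probe face and each gallery face onto the eigenface
    subspace and return the label of the nearest neighbour."""
    w = (face - mean) @ eigenfaces.T
    gw = (gallery - mean) @ eigenfaces.T
    dists = np.linalg.norm(gw - w, axis=1)
    return labels[int(np.argmin(dists))]
```

In a cloud deployment, `train_eigenfaces` would run once server-side and `recognize` would be invoked per request against the stored gallery projections.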

  6. Multiple Signal Classification Algorithm Based Electric Dipole Source Localization Method in an Underwater Environment

    Directory of Open Access Journals (Sweden)

    Yidong Xu

    2017-10-01

Full Text Available A novel localization method based on the multiple signal classification (MUSIC) algorithm is proposed for positioning an electric dipole source in a confined underwater environment using an electric dipole-receiving antenna array. In this method, the boundary element method (BEM) is introduced to analyze the boundary of the confined region by means of a matrix equation. The voltage of each dipole pair is used as spatial-temporal localization data, so there is no need to obtain the field component in each direction, unlike the conventional field-based localization method; this can be easily implemented in practical engineering applications. Then, a global-multiple region-conjugate gradient (CG) hybrid search method is used to reduce the computational burden and to improve the operation speed. Two localization simulation models and a physical experiment are conducted. Both the simulation results and the physical experiment provide accurate positioning performance, verifying the effectiveness of the proposed localization method in underwater environments.
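The core of any MUSIC-based localizer is the pseudospectrum search: eigendecompose the array covariance, take the noise subspace, and score each candidate position by how nearly orthogonal its steering vector is to that subspace. The sketch below shows only this generic step; the paper's BEM-derived steering vectors and its CG hybrid search are not reproduced (a plain dictionary of candidate vectors stands in for both).

```python
import numpy as np

def music_locate(R, steering, n_sources):
    """R: (m, m) sample covariance of the m-element receiving array.
    steering: dict mapping candidate source position -> (m,) steering vector.
    Returns the candidate with the largest MUSIC pseudospectrum value."""
    eigvals, eigvecs = np.linalg.eigh(R)
    # eigh sorts eigenvalues ascending, so the first m - n_sources
    # eigenvectors span the noise subspace
    En = eigvecs[:, : R.shape[0] - n_sources]
    best, best_p = None, -np.inf
    for pos, a in steering.items():
        denom = np.linalg.norm(En.conj().T @ a) ** 2
        p = 1.0 / max(denom, 1e-12)   # pseudospectrum peaks at the source
        if p > best_p:
            best, best_p = pos, p
    return best
```

In practice the candidate set would be a fine spatial grid, which is exactly where the paper's hybrid search saves computation over exhaustive evaluation.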

  7. Wireless local area network in a prehospital environment

    Directory of Open Access Journals (Sweden)

    Grimes Gary J

    2004-08-01

Full Text Available Abstract Background Wireless local area networks (WLANs) are considered the next generation of clinical data network. They open the possibility of capturing clinical data in a prehospital setting (e.g., a patient's home) using various devices, such as personal digital assistants, laptops, digital electrocardiogram (EKG) machines, and even cellular phones, and transmitting the captured data to a physician or hospital. The transmission rate is crucial to the applicability of the technology in the prehospital setting. Methods We created two separate WLANs to simulate a virtual local area network environment such as a patient's home or an emergency room (ER). The effects of different methods of data transmission, number of clients, and roaming among different access points on the file transfer rate were determined. Results The present results suggest that it is feasible to transfer small files such as patient demographics and EKG data from the patient's home to the ER at a reasonable speed. Encryption, user control, and access control were implemented and the results discussed. Conclusions Implementing a WLAN with a centrally managed and multiple-layer-controlled access control server is the key to ensuring its security and accessibility. Future studies should focus on product capacity, speed, compatibility, interoperability, and security management.

  8. Retention Capability of Local Backfill Materials 1-Simulated Disposal Environment

    International Nuclear Information System (INIS)

    Ghattas, N.K.; Eskander, S.B.; El-Adham, K.A.; Mahmoud, N.S.

    2001-01-01

In Egypt, a shallow ground disposal facility was the chosen option for the disposal of low- and intermediate-level radioactive wastes. The impact of the waste disposal facility on the environment depends on the nature of the barriers, which are intended to limit and control contaminant migration. Owing to their physical, chemical and mechanical characteristics, local soil materials were studied to illustrate the role of the backfill as part of an optimized multi-barrier safety system, which can provide the required level of protection of the environment and meet economic and regulatory requirements. A theoretical model was proposed to calculate the transport phenomena through the backfill materials. The credibility and validity of the proposed model were checked against the experimental results obtained from a three-arm arrangement system. The data obtained for the distribution coefficient (K d ) and the apparent diffusion coefficient (D a ) were in good agreement with those previously reported in the literature. Taking into consideration the prevailing initial conditions, the data calculated by the applied theoretical model show reasonable agreement with the results obtained from the experimental work. Prediction of radioactive cesium migration through the backfill materials using the proposed model was performed as a function of distance. The results obtained show that after 100 years, a fraction not exceeding 1E-9 of the original activity could be detected at a distance of 1 m from the waste material.

  9. The Effects of the Local Environment on Active Galactic Nuclei

    Science.gov (United States)

    Manzer, L. H.; De Robertis, M. M.

    2014-06-01

There continues to be significant controversy regarding the mechanism(s) responsible for the initiation and maintenance of activity in galactic nuclei. In this paper we will investigate possible environmental triggers of nuclear activity through a statistical analysis of a large sample of galaxy groups. The focus of this paper is to identify active galactic nuclei (AGNs) and other emission-line galaxies in these groups and to compare their frequency with a sample of over 260,000 isolated galaxies from the same catalog. The galaxy groups are taken from the catalog of Yang et al., in which over 20,000 virialized groups of galaxies (2 ≤ N ≤ 20) with redshifts between 0.01 and 0.20 are from the Sloan Digital Sky Survey. We first investigate the completeness of our data set and find, though biases are a concern particularly at higher redshift, that our data provide a fair representation of the local universe. After correcting emission-line equivalent widths for extinction and underlying Balmer stellar absorption, we classify galaxies in the sample using traditional emission-line ratios, while incorporating measurement uncertainties. We find a significantly higher fraction of AGNs in groups compared with the isolated sample. Likewise, a significantly higher fraction of absorption-line galaxies are found in groups, while a higher fraction of star-forming galaxies prefer isolated environments. Within grouped environments, AGNs and star-forming galaxies are found more frequently in small- to medium-richness groups, while absorption-line galaxies prefer groups with larger richnesses. Groups containing only emission-line galaxies have smaller virial radii, velocity dispersions, and masses compared with those containing only absorption-line galaxies. Furthermore, the AGN fraction increases with decreasing distance to the group centroid, independent of galaxy morphology. Using properties obtained from Galaxy Zoo, there is an increased fraction of AGNs within merging systems, unlike star-forming galaxies. These results provide some indication that the local environment does play a role in initiating activity in galactic nuclei, but it is by no means simple or straightforward.

  10. Performing a local reduction operation on a parallel computer

    Science.gov (United States)

    Blocksome, Michael A.; Faraj, Daniel A.

    2012-12-11

    A parallel computer including compute nodes, each including two reduction processing cores, a network write processing core, and a network read processing core, each processing core assigned an input buffer. Copying, in interleaved chunks by the reduction processing cores, contents of the reduction processing cores' input buffers to an interleaved buffer in shared memory; copying, by one of the reduction processing cores, contents of the network write processing core's input buffer to shared memory; copying, by another of the reduction processing cores, contents of the network read processing core's input buffer to shared memory; and locally reducing in parallel by the reduction processing cores: the contents of the reduction processing core's input buffer; every other interleaved chunk of the interleaved buffer; the copied contents of the network write processing core's input buffer; and the copied contents of the network read processing core's input buffer.
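The claim describes the data movement abstractly. Below is a single-threaded sketch of that movement with plain arrays: the two reduction "cores" interleave their input buffers chunk by chunk into shared memory, one core then reduces the even chunks and the other the odd chunks, and the network cores' copied buffers are folded in. In the patent each copy and partial reduction runs on a separate core in parallel; the chunk size and the final combination step here are illustrative.

```python
import numpy as np

def interleaved_local_reduce(red0, red1, net_write, net_read, chunk=2):
    """Sum-reduce four per-core input buffers via an interleaved
    shared buffer, mimicking the claimed copy/reduce split."""
    n = len(red0)
    inter = np.empty(2 * n, dtype=red0.dtype)
    for i in range(0, n, chunk):                     # interleaved copy
        inter[2*i:2*i + chunk] = red0[i:i + chunk]
        inter[2*i + chunk:2*i + 2*chunk] = red1[i:i + chunk]
    # "core 0" reduces the even chunks, "core 1" the odd chunks
    part0 = sum(inter[j:j + chunk].sum() for j in range(0, 2*n, 2*chunk))
    part1 = sum(inter[j + chunk:j + 2*chunk].sum() for j in range(0, 2*n, 2*chunk))
    # fold in the copied network write/read buffers and combine
    return part0 + part1 + net_write.sum() + net_read.sum()
```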

  11. A Distributed Snapshot Protocol for Efficient Artificial Intelligence Computation in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2018-01-01

    Full Text Available Many artificial intelligence applications often require a huge amount of computing resources. As a result, cloud computing adoption rates are increasing in the artificial intelligence field. To support the demand for artificial intelligence applications and guarantee the service level agreement, cloud computing should provide not only computing resources but also fundamental mechanisms for efficient computing. In this regard, a snapshot protocol has been used to create a consistent snapshot of the global state in cloud computing environments. However, the existing snapshot protocols are not optimized in the context of artificial intelligence applications, where large-scale iterative computation is the norm. In this paper, we present a distributed snapshot protocol for efficient artificial intelligence computation in cloud computing environments. The proposed snapshot protocol is based on a distributed algorithm to run interconnected multiple nodes in a scalable fashion. Our snapshot protocol is able to deal with artificial intelligence applications, in which a large number of computing nodes are running. We reveal that our distributed snapshot protocol guarantees the correctness, safety, and liveness conditions.
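The abstract does not give the protocol's rules. As a point of reference, the sketch below shows the classic Chandy-Lamport marker scheme that distributed snapshot protocols of this kind build on: a process records its state on the first marker it sees, forwards the marker, and logs messages on each incoming channel until that channel's marker arrives. Class and method names are illustrative, and the channels are simulated rather than networked.

```python
class Process:
    """Minimal Chandy-Lamport-style snapshot participant over FIFO channels."""
    def __init__(self, pid, state):
        self.pid, self.state = pid, state
        self.snapshot = None          # recorded local state
        self.logging = {}             # channels still being logged
        self.channel_state = {}       # finalized in-transit messages per channel

    def on_marker(self, from_pid, peers):
        """Handle a marker; returns True when it must be forwarded to all peers."""
        if self.snapshot is None:
            self.snapshot = self.state                    # record on first marker
            self.logging = {p: [] for p in peers if p != from_pid}
            if from_pid is not None:                      # marker arrived first:
                self.channel_state[from_pid] = []         # that channel is empty
            return True
        # second marker on a channel: freeze its log as the channel state
        self.channel_state[from_pid] = self.logging.pop(from_pid, [])
        return False

    def on_message(self, from_pid, msg):
        self.state += msg
        if from_pid in self.logging:                      # in-transit at snapshot time
            self.logging[from_pid].append(msg)
```

The recorded local states plus the per-channel logs together form a consistent global state, even though the live states have moved on by the time the markers finish circulating.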

  12. Imaging local brain function with emission computed tomography

    International Nuclear Information System (INIS)

    Kuhl, D.E.

    1984-01-01

Positron emission tomography (PET) using 18 F-fluorodeoxyglucose (FDG) was used to map local cerebral glucose utilization in the study of local cerebral function. This information differs fundamentally from structural assessment by means of computed tomography (CT). In normal human volunteers, the FDG scan was used to determine the cerebral metabolic response to controlled sensory stimulation and the effects of aging. Cerebral metabolic patterns are distinctive among depressed and demented elderly patients. The FDG scan appears normal in the depressed patient and studded with multiple metabolic defects in patients with multiple-infarct dementia; in patients with Alzheimer disease, metabolism is particularly reduced in the parietal cortex, but only slightly reduced in the caudate and thalamus. The interictal FDG scan effectively detects hypometabolic brain zones that are sites of onset for seizures in patients with partial epilepsy, even though these zones usually appear normal on CT scans. The future prospects of PET are discussed

  13. The effects of the local environment on active galactic nuclei

    International Nuclear Information System (INIS)

    Manzer, L. H.; De Robertis, M. M.

    2014-01-01

    There continues to be significant controversy regarding the mechanism(s) responsible for the initiation and maintenance of activity in galactic nuclei. In this paper we will investigate possible environmental triggers of nuclear activity through a statistical analysis of a large sample of galaxy groups. The focus of this paper is to identify active galactic nuclei (AGNs) and other emission-line galaxies in these groups and to compare their frequency with a sample of over 260,000 isolated galaxies from the same catalog. The galaxy groups are taken from the catalog of Yang et al., in which over 20,000 virialized groups of galaxies (2 ≤ N ≤ 20) with redshifts between 0.01 and 0.20 are from the Sloan Digital Sky Survey. We first investigate the completeness of our data set and find, though biases are a concern particularly at higher redshift, that our data provide a fair representation of the local universe. After correcting emission-line equivalent widths for extinction and underlying Balmer stellar absorption, we classify galaxies in the sample using traditional emission-line ratios, while incorporating measurement uncertainties. We find a significantly higher fraction of AGNs in groups compared with the isolated sample. Likewise, a significantly higher fraction of absorption-line galaxies are found in groups, while a higher fraction of star-forming galaxies prefer isolated environments. Within grouped environments, AGNs and star-forming galaxies are found more frequently in small- to medium-richness groups, while absorption-line galaxies prefer groups with larger richnesses. Groups containing only emission-line galaxies have smaller virial radii, velocity dispersions, and masses compared with those containing only absorption-line galaxies. Furthermore, the AGN fraction increases with decreasing distance to the group centroid, independent of galaxy morphology. Using properties obtained from Galaxy Zoo, there is an increased fraction of AGNs within merging systems

  14. Whiskers and Localized Corrosion on Copper in Repository Environment

    International Nuclear Information System (INIS)

    Hermansson, Hans-Peter; Gillen, Peter

    2004-03-01

Previous studies have demonstrated that whiskers (thread/hair shaped structures) can form on copper in a sulphide-containing environment. A remaining important question is whether the attack on the copper metal surface beneath a whisker is of a localized or of a general nature. This issue has not been clarified because whiskers are very fragile and have always detached and fallen off from the surface at some stage of handling. It has therefore been very difficult to link the growth root of the whisker to underlying structures in the metal surface. A study was therefore initiated to settle the important issue of the relation between whisker position and the type of underlying metal attack. The use of a porous medium was originally planned to support the whiskers, keeping them in place so that post-examination could characterize the nature of the whisker roots and thus the type of attack on the metal. However, the early stages of the present experimental work clearly indicated that other means of study were necessary. A photographic method for the registration and positioning of whisker growth was therefore developed. It proved to be a successful means of coordinating whisker position and linking it with the attack on the underlying metal. A shortage of sulphide in previous experiments caused a retarded growth rate of whiskers. Therefore, in the present experiments the sulphide concentration was kept at a more constant level throughout each experiment, so that hindered whisker growth did not limit the attack on the underlying metal. Whiskers and substrates were observed with a video camera throughout an experiment, and the phase composition was examined with Laser Raman Spectroscopy (LRS) and the Raman video microscope. Post-examinations were also performed using light optical microscopy. By combining the results from the optical methods it has been possible to distinguish two kinds of whisker roots (small/large diameter) and their relation to the underlying metal surface.
It has also been demonstrated

  15. The effects of the local environment on active galactic nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Manzer, L. H.; De Robertis, M. M., E-mail: liannemanzer@gmail.com, E-mail: mmdr@yorku.ca [Department of Physics and Astronomy, York University, Toronto, ON M3J 1P3 (Canada)

    2014-06-20

    There continues to be significant controversy regarding the mechanism(s) responsible for the initiation and maintenance of activity in galactic nuclei. In this paper we will investigate possible environmental triggers of nuclear activity through a statistical analysis of a large sample of galaxy groups. The focus of this paper is to identify active galactic nuclei (AGNs) and other emission-line galaxies in these groups and to compare their frequency with a sample of over 260,000 isolated galaxies from the same catalog. The galaxy groups are taken from the catalog of Yang et al., in which over 20,000 virialized groups of galaxies (2 ≤ N ≤ 20) with redshifts between 0.01 and 0.20 are from the Sloan Digital Sky Survey. We first investigate the completeness of our data set and find, though biases are a concern particularly at higher redshift, that our data provide a fair representation of the local universe. After correcting emission-line equivalent widths for extinction and underlying Balmer stellar absorption, we classify galaxies in the sample using traditional emission-line ratios, while incorporating measurement uncertainties. We find a significantly higher fraction of AGNs in groups compared with the isolated sample. Likewise, a significantly higher fraction of absorption-line galaxies are found in groups, while a higher fraction of star-forming galaxies prefer isolated environments. Within grouped environments, AGNs and star-forming galaxies are found more frequently in small- to medium-richness groups, while absorption-line galaxies prefer groups with larger richnesses. Groups containing only emission-line galaxies have smaller virial radii, velocity dispersions, and masses compared with those containing only absorption-line galaxies. Furthermore, the AGN fraction increases with decreasing distance to the group centroid, independent of galaxy morphology. Using properties obtained from Galaxy Zoo, there is an increased fraction of AGNs within merging systems

  16. Implementation of Computer Assisted Test Selection System in Local Governments

    Directory of Open Access Journals (Sweden)

    Abdul Azis Basri

    2016-05-01

Full Text Available As an evaluative way of selecting civil servants across all government areas, the Computer Assisted Test (CAT) selection system was first applied in 2013. In its first nationwide implementation in 2014, the selection system had trouble in several areas, such as the registration procedure and the passing grade. The main objective of this essay is to describe the implementation of the new selection system for civil servants in local governments and to assess the system's effectiveness. The essay combines a literature study with a field survey; data were collected through interviews, observations, and documentation from various sources, and analyzed through reduction, data display, and verification to reach conclusions. The results show that, despite problems in a few parts of the system, such as the registration phase, almost all phases of the implementation of the CAT selection system in local government areas worked well, including the preparation, implementation, and result-processing phases. The system also fulfilled two of the three criteria of an effective selection system: accuracy and trustworthiness. Therefore, this selection system can be considered an effective way to select new civil servants. As a suggestion, local governments should prepare thoroughly for all phases of the test, establish good feedback as an evaluation mechanism, and work together with the central government to identify, fix, and improve infrastructure as a supporting tool, as well as the competency of local residents.

  17. Operational computer graphics in the flight dynamics environment

    Science.gov (United States)

    Jeletic, James F.

    1989-01-01

Over the past five years, the Flight Dynamics Division of the National Aeronautics and Space Administration's (NASA's) Goddard Space Flight Center has incorporated computer graphics technology into its operational environment. In an attempt to increase the effectiveness and productivity of the Division, computer graphics software systems have been developed that display spacecraft tracking and telemetry data in 2-d and 3-d graphic formats that are more comprehensible than the alphanumeric tables of the past. These systems vary in functionality from real-time mission monitoring systems, to mission planning utilities, to system development tools. Here, the capabilities and architecture of these systems are discussed.

  18. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    OpenAIRE

    Guohua Fang; Ting Wang; Xinyi Si; Xin Wen; Yu Liu

    2016-01-01

    To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and out...

  19. Power Consumption Evaluation of Distributed Computing Network Considering Traffic Locality

    Science.gov (United States)

    Ogawa, Yukio; Hasegawa, Go; Murata, Masayuki

    When computing resources are consolidated in a few huge data centers, a massive amount of data is transferred to each data center over a wide area network (WAN). This results in increased power consumption in the WAN. A distributed computing network (DCN), such as a content delivery network, can reduce the traffic from/to the data center, thereby decreasing the power consumed in the WAN. In this paper, we focus on the energy-saving aspect of the DCN and evaluate its effectiveness, especially considering traffic locality, i.e., the amount of traffic related to the geographical vicinity. We first formulate the problem of optimizing the DCN power consumption and describe the DCN in detail. Then, numerical evaluations show that, when there is strong traffic locality and the router has ideal energy proportionality, the system's power consumption is reduced to about 50% of the power consumed in the case where a DCN is not used; moreover, this advantage becomes even larger (up to about 30%) when the data center is located farthest from the center of the network topology.

  20. Computation of 3D form factors in complex environments

    International Nuclear Information System (INIS)

    Coulon, N.

    1989-01-01

The calculation of radiant interchange among opaque surfaces in a complex environment poses the general problem of determining the visible and hidden parts of the environment. In many thermal engineering applications, surfaces are separated by radiatively non-participating media and may be idealized as diffuse emitters and reflectors. Consequently, the net radiant energy fluxes are intimately related to purely geometrical quantities called form factors, which take hidden parts into account: the problem is reduced to form factor evaluation. This paper presents the method developed for the computation of 3D form factors in the finite-element module of the system TRIO, which is a general computer code for thermal and fluid flow analysis. The method is derived from an algorithm devised for synthetic image generation. A comparison is performed with the standard contour integration method, also implemented and suited to convex geometries. Several illustrative examples of finite-element thermal calculations in radiating enclosures are given
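The form factor between two diffuse surfaces is the double area integral F12 = (1/A1) ∬∬ cosθ1 cosθ2 / (π r²) dA2 dA1. The sketch below evaluates it by midpoint quadrature for two unobstructed planar quads; the hidden-part determination that is the paper's actual contribution, and the TRIO code itself, are not reproduced, and the sampling density `n` is an arbitrary choice.

```python
import numpy as np

def patch_points(origin, eu, ev, n):
    """Midpoints of an n-by-n subdivision of the quad origin + s*eu + t*ev."""
    t = (np.arange(n) + 0.5) / n
    s, u = np.meshgrid(t, t, indexing="ij")
    return (origin + s[..., None] * eu + u[..., None] * ev).reshape(-1, 3)

def form_factor(patch1, n1, patch2, n2, n=30):
    """Midpoint-rule estimate of the diffuse form factor between two
    unobstructed planar quads (no hidden-part test). Each patch is
    (origin, edge_u, edge_v); n1, n2 are the unit normals."""
    o1, u1, v1 = patch1
    o2, u2, v2 = patch2
    A1 = np.linalg.norm(np.cross(u1, v1))          # patch areas
    A2 = np.linalg.norm(np.cross(u2, v2))
    x1 = patch_points(o1, u1, v1, n)
    x2 = patch_points(o2, u2, v2, n)
    r = x2[None, :, :] - x1[:, None, :]            # all point-to-point vectors
    d2 = np.einsum("ijk,ijk->ij", r, r)            # squared distances
    cos1 = np.clip(r @ n1, 0.0, None)              # |r| * cos(theta1)
    cos2 = np.clip(-(r @ n2), 0.0, None)           # |r| * cos(theta2)
    dA = (A1 / n**2) * (A2 / n**2)
    return float((cos1 * cos2 / (np.pi * d2 * d2)).sum() * dA / A1)
```

For two directly opposed unit squares at unit separation this converges to the tabulated value of roughly 0.2, and the reciprocity relation A1·F12 = A2·F21 holds by construction.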

  1. A Novel Biometric Approach for Authentication In Pervasive Computing Environments

    OpenAIRE

    Rachappa,; Divyajyothi M G; D H Rao

    2016-01-01

    The paradigm of embedding computing devices in our surrounding environment has gained more interest in recent days. Along with contemporary technology comes challenges, the most important being the security and privacy aspect. Keeping the aspect of compactness and memory constraints of pervasive devices in mind, the biometric techniques proposed for identification should be robust and dynamic. In this work, we propose an emerging scheme that is based on few exclusive human traits and characte...

  2. Quality control of computational fluid dynamics in indoor environments

    DEFF Research Database (Denmark)

    Sørensen, Dan Nørtoft; Nielsen, P. V.

    2003-01-01

    Computational fluid dynamics (CFD) is used routinely to predict air movement and distributions of temperature and concentrations in indoor environments. Modelling and numerical errors are inherent in such studies and must be considered when the results are presented. Here, we discuss modelling as...... the quality of CFD calculations, as well as guidelines for the minimum information that should accompany all CFD-related publications to enable a scientific judgment of the quality of the study....

  3. Providing a computing environment for a high energy physics workshop

    International Nuclear Information System (INIS)

    Nicholls, J.

    1991-03-01

Although computing facilities have been provided at conferences and workshops remote from the host institution for some years, the equipment provided has rarely been capable of supporting much more than simple editing and electronic mail over leased lines. This presentation describes the pioneering effort by the Computing Department/Division at Fermilab in providing a local computing facility with world-wide networking capability for the Physics at Fermilab in the 1990's workshop held in Breckenridge, Colorado, in August 1989, as well as the enhanced facilities provided for the 1990 Summer Study on High Energy Physics at Snowmass, Colorado, in June/July 1990. Issues discussed include the type and sizing of the facilities, advance preparations, shipping, and on-site support, as well as an evaluation of the value of the facility to the workshop participants

  4. The Virtual Cell: a software environment for computational cell biology.

    Science.gov (United States)

    Loew, L M; Schaff, J C

    2001-10-01

    The newly emerging field of computational cell biology requires software tools that address the needs of a broad community of scientists. Cell biological processes are controlled by an interacting set of biochemical and electrophysiological events that are distributed within complex cellular structures. Computational modeling is familiar to researchers in fields such as molecular structure, neurobiology and metabolic pathway engineering, and is rapidly emerging in the area of gene expression. Although some of these established modeling approaches can be adapted to address problems of interest to cell biologists, relatively few software development efforts have been directed at the field as a whole. The Virtual Cell is a computational environment designed for cell biologists as well as for mathematical biologists and bioengineers. It serves to aid the construction of cell biological models and the generation of simulations from them. The system enables the formulation of both compartmental and spatial models, the latter with either idealized or experimentally derived geometries of one, two or three dimensions.

  5. Epidemic spreading in localized environments with recurrent mobility patterns

    Science.gov (United States)

    Granell, Clara; Mucha, Peter J.

    2018-05-01

    The spreading of epidemics is very much determined by the structure of the contact network, which may be impacted by the mobility dynamics of the individuals themselves. In confined scenarios where a small, closed population spends most of its time in localized environments and has easily identifiable mobility patterns—such as workplaces, university campuses, or schools—it is of critical importance to identify the factors controlling the rate of disease spread. Here, we present a discrete-time, metapopulation-based model to describe the transmission of susceptible-infected-susceptible-like diseases that take place in confined scenarios where the mobilities of the individuals are not random but, rather, follow clear recurrent travel patterns. This model allows analytical determination of the onset of epidemics, as well as the ability to discern which contact structures are most suited to prevent the infection from spreading. It thereby determines whether common prevention mechanisms, such as isolation, are worth implementing in such a scenario, and what their expected impact would be.
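    A minimal deterministic sketch of this kind of model (two patches coupled by recurrent commuting, with SIS dynamics inside each daytime patch) can be written in a few lines of Python; the patch sizes, rates, and commuting fraction below are illustrative assumptions, not parameters from the paper:

```python
def sis_step(S, I, beta, mu, f):
    """One day of a deterministic two-patch SIS model with recurrent
    commuting: a fraction f of patch-A residents spends the day in B,
    then returns home.  Counts are expected (fractional) head counts."""
    # Daytime infected and total head counts in each patch.
    day_I = {"A": I["A"] * (1 - f), "B": I["B"] + I["A"] * f}
    day_N = {"A": (S["A"] + I["A"]) * (1 - f),
             "B": S["B"] + I["B"] + (S["A"] + I["A"]) * f}
    prev = {p: day_I[p] / day_N[p] if day_N[p] else 0.0 for p in day_N}
    # New infections among A residents accrue in both patches;
    # B residents are only exposed at home.
    new_A = beta * S["A"] * ((1 - f) * prev["A"] + f * prev["B"])
    new_B = beta * S["B"] * prev["B"]
    S_next = {"A": S["A"] - new_A + mu * I["A"],
              "B": S["B"] - new_B + mu * I["B"]}
    I_next = {"A": I["A"] + new_A - mu * I["A"],
              "B": I["B"] + new_B - mu * I["B"]}
    return S_next, I_next

# Seed one infection in patch B and iterate: commuting couples the
# patches, so patch A eventually sees cases too.
S = {"A": 999.0, "B": 1000.0}
I = {"A": 0.0, "B": 1.0}
for _ in range(200):
    S, I = sis_step(S, I, beta=0.3, mu=0.1, f=0.2)
```

Because the mobility pattern is fixed rather than random, the coupling term (the `f`-weighted exposure) is what carries infection between patches, which is exactly the structure the analytical onset condition in the paper exploits.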

  6. An Introduction to Computer Forensics: Gathering Evidence in a Computing Environment

    Directory of Open Access Journals (Sweden)

    Henry B. Wolfe

    2001-01-01

    Full Text Available Business has become increasingly dependent on the Internet and computing to operate. It has become apparent that there are issues of evidence gathering in a computing environment, which by their nature are technical and different from other forms of evidence gathering, that must be addressed. This paper offers an introduction to some of the technical issues surrounding this new and specialized field of Computer Forensics. It attempts to identify and describe sources of evidence that can be found on disk data storage devices in the course of an investigation. It also considers sources of copies of email, which can be used in evidence, as well as case building.

  7. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  8. 5 CFR 531.607 - Computing hourly, daily, weekly, and biweekly locality rates.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Computing hourly, daily, weekly, and... Computing hourly, daily, weekly, and biweekly locality rates. (a) Apply the following methods to convert an... firefighter whose pay is computed under 5 U.S.C. 5545b, a firefighter hourly locality rate is computed using a...
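    For context, the conversion this section describes follows the standard federal pay computation: divide the annual locality rate by 2,087 hours (2,756 hours for firefighters whose pay is computed under 5 U.S.C. 5545b), round to the nearest cent, and multiply by the hours in the larger period. A sketch, with the exact rounding step and the 8/40/80-hour multipliers treated as assumptions to be checked against the regulation text:

```python
def locality_rates(annual_rate, firefighter=False):
    """Convert an annual locality rate of pay into hourly, daily,
    weekly, and biweekly rates.  Assumptions: the hourly rate is
    annual / 2,087 (or / 2,756 for 5 U.S.C. 5545b firefighters),
    rounded to the nearest cent, and the larger periods use 8, 40,
    and 80 hours respectively."""
    divisor = 2756 if firefighter else 2087
    hourly = round(annual_rate / divisor, 2)
    return {"hourly": hourly,
            "daily": round(hourly * 8, 2),
            "weekly": round(hourly * 40, 2),
            "biweekly": round(hourly * 80, 2)}

# Example: a $100,000 annual locality rate.
rates = locality_rates(100_000)
```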

  9. The effect of brain lesions on sound localization in complex acoustic environments.

    Science.gov (United States)

    Zündorf, Ida C; Karnath, Hans-Otto; Lewald, Jörg

    2014-05-01

    Localizing sound sources of interest in cluttered acoustic environments--as in the 'cocktail-party' situation--is one of the most demanding challenges to the human auditory system in everyday life. In this study, stroke patients' ability to localize acoustic targets was directly compared between a single-source and a multi-source setup in the free sound field. Subsequent voxel-based lesion-behaviour mapping analyses were computed to uncover the brain areas associated with a deficit in localization in the presence of multiple distracter sound sources rather than localization of individually presented sound sources. Analyses revealed a fundamental role of the right planum temporale in this task. The results from the left hemisphere were less straightforward, but suggested an involvement of inferior frontal and pre- and postcentral areas. These areas appear to be particularly involved in the spectrotemporal analyses crucial for effective segregation of multiple sound streams from various locations, beyond the currently known network for localization of isolated sound sources in otherwise silent surroundings.

  10. Localized Ambient Solidity Separation Algorithm Based Computer User Segmentation

    Science.gov (United States)

    Sun, Xiao; Zhang, Tongda; Chai, Yueting; Liu, Yi

    2015-01-01

    Most popular clustering methods typically make some strong assumptions about the dataset. For example, k-means implicitly assumes that all clusters come from spherical Gaussian distributions which have different means but the same covariance. However, when dealing with datasets that have diverse distribution shapes or high dimensionality, these assumptions might not be valid anymore. In order to overcome this weakness, we propose a new clustering algorithm named the localized ambient solidity separation (LASS) algorithm, using a new isolation criterion called centroid distance. Compared with other density based isolation criteria, our proposed centroid distance isolation criterion addresses the problems caused by high dimensionality and varying density. The experiment on a designed two-dimensional benchmark dataset shows that our proposed LASS algorithm not only inherits the advantage of the original dissimilarity increments clustering method of separating naturally isolated clusters but also can identify clusters which are adjacent, overlapping, and under background noise. Finally, we compared our LASS algorithm with the dissimilarity increments clustering method on a massive computer user dataset with over two million records that contains demographic and behavior information. The results show that the LASS algorithm works extremely well on this computer user dataset and can gain more knowledge from it. PMID:26221133

  11. Electrostatic influence of local cysteine environments on disulfide exchange kinetics.

    Science.gov (United States)

    Snyder, G H; Cennerazzo, M J; Karalis, A J; Field, D

    1981-11-10

    The ionic strength dependence of the bimolecular rate constant for reaction of the negative disulfide 5,5'-dithiobis(2-nitrobenzoic acid) with cysteines in fragments of naturally occurring proteins was determined by stopped-flow spectroscopy. The Debye-Hückel relationship was applied to determine the effective charge at the cysteine and thereby determine the extent to which nearby neighbors in the primary sequence influence the kinetics. Corrections for the secondary salt effect on cysteine pKs were determined by direct spectrometric pH titration of sulfhydryl groups or by observation of the ionic strength dependence of the kinetics of cysteine reaction with the neutral disulfide 2,2'-dithiodipyridine. The quantitative expressions were verified by model studies with N-acetylcysteine. At ionic strengths equal to or greater than 20 mM, the net charge at the polypeptide cysteine site is the sum of the single negative charge of the thiolate anion and the charges of the amino acids immediately preceding and following the cysteine in the primary sequence. At lower ionic strengths, more distant residues influence the kinetics. At pH 7.0, 23 °C, and an ionic strength of 20 mM, rate constants for reaction of the negative disulfide with a cysteine having two positive neighbors, one positive and one neutral neighbor, or two neutral neighbors are 132000, 3350, and 367 s-1 M-1, respectively. This corresponds to a contribution to the activation energy of 0.65-1.1 kcal/mol per ion pair involved in collision between the cysteine and disulfide regions. The results permit the estimation that cysteine local environments may provide a means of achieving a 10^6-fold range in rate constants in disulfide exchange reactions in random-coil proteins. This range may prove useful in developing strategies for directing disulfide pairing in synthetic proteins.
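    The primary salt effect underlying this kind of analysis is the Brønsted-Bjerrum relation built on the extended Debye-Hückel law: log10 k = log10 k0 + 2*A*zA*zB*sqrt(I)/(1 + sqrt(I)), with A ≈ 0.509 M^-1/2 near room temperature, so the sign of the charge product zA*zB decides whether added salt slows or speeds the reaction. A small sketch of that relation (the site charges and baseline rate constant below are illustrative, not the paper's fitted values):

```python
import math

A_DH = 0.509  # Debye-Hückel constant near 25 °C, in M**-0.5 (assumed)

def log10_k(log10_k0, zA, zB, ionic_strength):
    """Brønsted-Bjerrum primary salt effect on a bimolecular rate
    constant, using the extended Debye-Hückel form (an assumption):
    log10 k = log10 k0 + 2*A*zA*zB*sqrt(I)/(1 + sqrt(I))."""
    s = math.sqrt(ionic_strength)
    return log10_k0 + 2 * A_DH * zA * zB * s / (1 + s)

# Illustration: a cysteine site with net charge +1 (thiolate -1 plus
# two positive neighbors) reacting with a disulfide taken here as -2.
# Opposite charges attract, so raising ionic strength screens the
# attraction and the observed rate constant falls.
k_low  = 10 ** log10_k(5.0, zA=+1, zB=-2, ionic_strength=0.02)
k_high = 10 ** log10_k(5.0, zA=+1, zB=-2, ionic_strength=0.20)
```

Fitting the slope of log10 k against sqrt(I)/(1 + sqrt(I)) yields the effective charge product, which is how an effective site charge can be read off from stopped-flow data of this kind.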

  12. Distributed computing testbed for a remote experimental environment

    International Nuclear Information System (INIS)

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.

    1995-01-01

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a ''Collaboratory.'' The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility

  13. Cluster implementation for parallel computation within MATLAB software environment

    International Nuclear Information System (INIS)

    Santana, Antonio O. de; Dantas, Carlos C.; Charamba, Luiz G. da R.; Souza Neto, Wilson F. de; Melo, Silvio B. Melo; Lima, Emerson A. de O.

    2013-01-01

    A cluster for parallel computation with the MATLAB software environment, the COCGT (Cluster for Optimizing Computing in Gamma ray Transmission methods), is implemented. The implementation corresponds to the creation of a local network of computers, the installation and configuration of software, and cluster tests to determine and optimize data-processing performance. The COCGT implementation was required for computing data from gamma transmission measurements applied to fluid dynamics and tomographic reconstruction in an FCC (Fluid Catalytic Cracking) cold pilot unit, as well as simulated data. As an initial test, the determination of the SVD (Singular Value Decomposition) of a random matrix of dimension (n, n), n=1000, using a modified Girko's law, showed that COCGT was faster than the similar cluster reported in the literature [1] operating under the same conditions. Solving a system of linear equations provided a further test of COCGT performance: processing a square matrix with n=10000 took 27 s, and a square matrix with n=12000 took 45 s. To determine the cluster's behaviour with respect to 'parfor' (parallel for-loop) and 'spmd' (single program, multiple data), two codes containing those two commands were applied to the same problem, the SVD of a square matrix with n=1000. Execution of the codes on COCGT showed that: 1) for the 'parfor' code, performance improved as the number of labs grew from 1 to 8; 2) for the 'spmd' code, a single lab (core) was enough to process and return results in less than 1 s. The tests were repeated in a similar situation, with the difference that the SVD was now determined from a square matrix with n=1500 for the 'parfor' code and n=7000 for the 'spmd' code. These results lead to the conclusions that: 1) the 'parfor' code behaved as already described above; 2) the 'spmd' code behaved the same and, besides having produced a larger performance, it supports a
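    Outside MATLAB, the 'parfor' pattern benchmarked above (independent loop iterations distributed over a pool of workers) has a rough standard-library analogue in Python; this is an illustration of the pattern, not the COCGT code:

```python
from concurrent.futures import ThreadPoolExecutor

def work(n):
    """Stand-in for one independent loop body (e.g. one matrix
    decomposition in the SVD benchmark); here just a cheap sum."""
    return sum(i * i for i in range(n))

tasks = [10_000] * 8  # eight independent iterations

# Serial 'for' loop.
serial = [work(n) for n in tasks]

# 'parfor'-style: the same iterations farmed out to worker threads.
# Correctness requires that iterations not depend on one another,
# exactly as MATLAB requires of a parfor body.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(work, tasks))
```

For CPU-bound pure-Python work a `ProcessPoolExecutor` is the closer analogue of MATLAB labs, since threads in CPython share one interpreter lock; the thread pool is used here only to keep the sketch self-contained.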

  14. Acoustic radiosity for computation of sound fields in diffuse environments

    Science.gov (United States)

    Muehleisen, Ralph T.; Beamer, C. Walter

    2002-05-01

    The use of image and ray tracing methods (and variations thereof) for the computation of sound fields in rooms is relatively well developed. In their regime of validity, both methods work well for prediction in rooms with small amounts of diffraction and mostly specular reflection at the walls. While extensions to the method to include diffuse reflections and diffraction have been made, they are limited at best. In the fields of illumination and computer graphics, the ray tracing and image methods are joined by another method called luminous radiative transfer or radiosity. In radiosity, an energy balance between surfaces is computed assuming diffuse reflection at the reflective surfaces. Because the interaction between surfaces is constant, much of the computation required for sound field prediction with multiple or moving source and receiver positions can be reduced. In acoustics the radiosity method has had little attention because of the problems of diffraction and specular reflection. The utility of radiosity in acoustics and an approach to a useful development of the method for acoustics will be presented. The method looks especially useful for sound level prediction in industrial and office environments. [Work supported by NSF.]
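    The energy balance that radiosity computes is linear: each patch's radiosity equals its emission plus its reflectance times the form-factor-weighted radiosities of all patches, B_i = E_i + rho_i * sum_j F_ij * B_j. A fixed-point iteration on a toy two-patch geometry (all numbers invented for illustration):

```python
def solve_radiosity(E, rho, F, iters=200):
    """Iteratively solve B_i = E_i + rho_i * sum_j F[i][j] * B_j,
    the diffuse energy balance used by radiosity methods.
    E: emitted energy per patch; rho: reflectance per patch;
    F: form factors, F[i][j] = fraction of energy leaving patch i
    that reaches patch j.  The iteration converges when reflectances
    and form-factor row sums are below 1."""
    n = len(E)
    B = list(E)
    for _ in range(iters):
        B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B

# Toy case: one emitting patch facing one passive patch.
E = [1.0, 0.0]
rho = [0.5, 0.5]
F = [[0.0, 0.8],
     [0.8, 0.0]]
B = solve_radiosity(E, rho, F)
```

The abstract's point about moving sources and receivers shows up here: `F` depends only on geometry, so it is computed once and reused while `E` changes, which is what makes repeated predictions cheap.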

  15. Experience of BESIII data production with local cluster and distributed computing model

    International Nuclear Information System (INIS)

    Deng, Z Y; Li, W D; Liu, H M; Sun, Y Z; Zhang, X M; Lin, L; Nicholson, C; Zhemchugov, A

    2012-01-01

    The BES III detector is a new spectrometer which works on the upgraded high-luminosity collider, BEPCII. The BES III experiment studies physics in the tau-charm energy region from 2 GeV to 4.6 GeV. From 2009 to 2011, BEPCII has produced 106M ψ(2S) events, 225M J/ψ events, 2.8 fb−1 of ψ(3770) data, and 500 pb−1 of data at 4.01 GeV. All the data samples were processed successfully and many important physics results have been achieved based on these samples. Doing data production correctly and efficiently with limited CPU and storage resources is a big challenge. This paper will describe the implementation of the experiment-specific data production for BESIII in detail, including data calibration with an event-level parallel computing model, data reconstruction, inclusive Monte Carlo generation, random trigger background mixing and multi-stream data skimming. Now, with the data sample increasing rapidly, there is a growing demand to move from solely using a local cluster to a more distributed computing model. A distributed computing environment is being set up and expected to go into production use in 2012. The experience of BESIII data production, both with a local cluster and with a distributed computing model, is presented here.

  16. Local pulmonary structure classification for computer-aided nodule detection

    Science.gov (United States)

    Bahlmann, Claus; Li, Xianlin; Okada, Kazunori

    2006-03-01

    We propose a new method of classifying the local structure types, such as nodules, vessels, and junctions, in thoracic CT scans. This classification is important in the context of computer aided detection (CAD) of lung nodules. The proposed method can be used as a post-process component of any lung CAD system. In such a scenario, the classification results provide an effective means of removing false positives caused by vessels and junctions, thus improving overall performance. As its main advantage, the proposed solution transforms the complex problem of classifying various 3D topological structures into a much simpler 2D data clustering problem, to which more generic and flexible solutions are available in the literature, and which is better suited for visualization. Given a nodule candidate, first, our solution robustly fits an anisotropic Gaussian to the data. The resulting Gaussian center and spread parameters are used to affine-normalize the data domain so as to warp the fitted anisotropic ellipsoid into a fixed-size isotropic sphere. We propose an automatic method to extract a 3D spherical manifold containing the appropriate bounding surface of the target structure. Scale selection is performed by a data-driven entropy minimization approach. The manifold is analyzed for high intensity clusters, corresponding to protruding structures. Techniques involve EM clustering with automatic mode number estimation, directional statistics, and hierarchical clustering with a modified Bhattacharyya distance. The estimated number of high intensity clusters explicitly determines the type of pulmonary structure: nodule (0), attached nodule (1), vessel (2), junction (≥3). We show accurate classification results for selected examples in thoracic CT scans. This local procedure is more flexible and efficient than the current state of the art and will help to improve the accuracy of general lung CAD systems.

  17. Preserving access to ALEPH computing environment via virtual machines

    International Nuclear Information System (INIS)

    Coscetti, Simone; Boccali, Tommaso; Arezzini, Silvia; Maggi, Marcello

    2014-01-01

    The ALEPH Collaboration [1] took data at the LEP (CERN) electron-positron collider in the period 1989-2000, producing more than 300 scientific papers. While most of the Collaboration's activities stopped in recent years, the data collected still has physics potential, with new theoretical models emerging that call for checks against data at the Z and WW production energies. An attempt to revive and preserve the ALEPH Computing Environment is presented; the aim is not only the preservation of the data files (usually called bit preservation), but of the full environment a physicist would need to perform brand new analyses. Technically, a Virtual Machine approach has been chosen, using the VirtualBox platform. Concerning simulated events, the full chain from event generators to physics plots is possible, and reprocessing of data events is also functioning. Interactive tools like the DALI event display can be used on both data and simulated events. The Virtual Machine approach is suited both for interactive usage and for massive computing using Cloud-like approaches.

  18. Enabling Computational Dynamics in Distributed Computing Environments Using a Heterogeneous Computing Template

    Science.gov (United States)

    2011-08-09

    The report describes a heterogeneous computing concept advertised recently as the paradigm capable of delivering exascale flop rates by the end of the decade.

  19. Local Measurement of Fuel Energy Deposition and Heat Transfer Environment During Fuel Lifetime Using Controlled Calorimetry

    International Nuclear Information System (INIS)

    Don W. Miller; Andrew Kauffmann; Eric Kreidler; Dongxu Li; Hanying Liu; Daniel Mills; Thomas D. Radcliff; Joseph Talnagi

    2001-01-01

    A comprehensive description of the accomplishments of the DOE grant titled ''Local Measurement of Fuel Energy Deposition and Heat Transfer Environment During Fuel Lifetime Using Controlled Calorimetry'' is presented.

  20. McMaster University: College and University Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    The computing and information services (CIS) organization includes administrative computing, academic computing, and networking and has three divisions: computing services, development services, and information services. Other computing activities include Health Sciences, Humanities Computing Center, and Department of Computer Science and Systems.…

  1. When can a green entrepreneur manage the local environment?

    Science.gov (United States)

    Brandt, Urs Steiner; Svendsen, Gert Tinggaard

    2016-12-01

    How do we deal with environmental management issues at the local level? Traditionally, the approach proposed from an environmental management perspective has involved various kinds of "top-down" regulatory measures, such as defining a standard that must be satisfied or a tax on pollution. Conversely, there has been less focus on the analysis of local, bottom-up approaches, such as the effectiveness of various ways of organizing a local environmental transition process. Our focus is on analyzing under what conditions it is possible for a "green entrepreneur" (GE) to manage a transition from brown to green energy. Theoretically, we consider four entrepreneurial skills, at least two of which must be present for the GE to succeed. In the case of the Danish island of Samsø and its rapid introduction of renewable energy, three of these skills are found to be present: profits, communication, and trustworthiness. The GE, however, failed to activate the fourth skill concerning the ability to persuade local non-green actors regarding the value of the green component. Thus, a main result is that it is crucial to convince non-green locals about the profitability of local environmental management rather than its potentially green components. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. FAST - A multiprocessed environment for visualization of computational fluid dynamics

    International Nuclear Information System (INIS)

    Bancroft, G.V.; Merritt, F.J.; Plessel, T.C.; Kelaita, P.G.; Mccabe, R.K.

    1991-01-01

    The paper presents the Flow Analysis Software Toolset (FAST) to be used for fluid-mechanics analysis. The design criteria for FAST, including the minimization of the data path in the computational fluid-dynamics (CFD) process, a consistent user interface, an extensible software architecture, modularization, and the isolation of three-dimensional tasks from the application programmer, are outlined. Each separate process communicates through the FAST Hub, while other modules such as FAST Central, NAS file input, CFD calculator, surface extractor and renderer, titler, tracer, and isolev might work together to generate the scene. An interprocess communication package making it possible for FAST to operate as a modular environment, where resources can be shared among different machines as well as a single host, is discussed. 20 refs

  3. A SOUND SOURCE LOCALIZATION TECHNIQUE TO SUPPORT SEARCH AND RESCUE IN LOUD NOISE ENVIRONMENTS

    Science.gov (United States)

    Yoshinaga, Hiroshi; Mizutani, Koichi; Wakatsuki, Naoto

    At some sites of earthquakes and other disasters, rescuers search for people buried under rubble by listening for the sounds which they make. Thus developing a technique to localize sound sources amidst loud noise will support such search and rescue operations. In this paper, we discuss an experiment performed to test an array signal processing technique which searches for unperceivable sound in loud noise environments. Two speakers simultaneously played a noise of a generator and a voice decreased by 20 dB (= 1/100 of power) from the generator noise at an outdoor space where cicadas were making noise. The sound signal was received by a horizontally set linear microphone array 1.05 m in length and consisting of 15 microphones. The direction and the distance of the voice were computed and the sound of the voice was extracted and played back as an audible sound by array signal processing.

  4. Sub-sampling-based 2D localization of an impulsive acoustic source in reverberant environments

    KAUST Repository

    Omer, Muhammad

    2014-07-01

    This paper presents a robust method for two-dimensional (2D) impulsive acoustic source localization in a room environment using low sampling rates. The proposed method finds the time delay from the room impulse response (RIR), which makes it robust against room reverberations. We consider the RIR as a sparse phenomenon and apply a recently proposed sparse signal reconstruction technique called orthogonal clustering (OC) for its estimation from the sub-sampled received signal. The arrival time of the direct path signal at a pair of microphones is identified from the estimated RIR, and their difference yields the desired time delay estimate (TDE). Low sampling rates reduce the hardware and computational complexity and decrease the communication between the microphones and the centralized location. Simulation and experimental results of an actual hardware setup are presented to demonstrate the performance of the proposed technique.
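    The last step described above, turning the two arrival times into a time-delay estimate, reduces in the simplest full-rate baseline to finding the lag that maximizes the cross-correlation of the two microphone signals. A plain-Python sketch of that baseline (not the OC-based sub-sampled method itself):

```python
def tde_xcorr(x, y, max_lag):
    """Baseline time-delay estimate: the lag (in samples) at which
    the cross-correlation of signals x and y peaks.  A positive lag
    means y is a delayed copy of x."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        val = sum(x[i] * y[i + lag]
                  for i in range(len(x))
                  if 0 <= i + lag < len(y))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

# Synthetic impulsive source: y is x delayed by 5 samples.
x = [0.0] * 20 + [1.0, 0.5, -0.3] + [0.0] * 20
y = [0.0] * 5 + x[:-5]
delay = tde_xcorr(x, y, max_lag=10)
```

Dividing the lag by the sampling rate gives the TDE in seconds; multiplying by the speed of sound gives the range difference that a pair of microphones contributes to the 2D localization.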

  5. Sub-sampling-based 2D localization of an impulsive acoustic source in reverberant environments

    KAUST Repository

    Omer, Muhammad; Quadeer, Ahmed A; Sharawi, Mohammad S; Al-Naffouri, Tareq Y.

    2014-01-01

    This paper presents a robust method for two-dimensional (2D) impulsive acoustic source localization in a room environment using low sampling rates. The proposed method finds the time delay from the room impulse response (RIR), which makes it robust against room reverberations. We consider the RIR as a sparse phenomenon and apply a recently proposed sparse signal reconstruction technique called orthogonal clustering (OC) for its estimation from the sub-sampled received signal. The arrival time of the direct path signal at a pair of microphones is identified from the estimated RIR, and their difference yields the desired time delay estimate (TDE). Low sampling rates reduce the hardware and computational complexity and decrease the communication between the microphones and the centralized location. Simulation and experimental results of an actual hardware setup are presented to demonstrate the performance of the proposed technique.

  6. Localization system for use in GPS denied environments

    Energy Technology Data Exchange (ETDEWEB)

    Trueblood, J. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The military uses autonomous platforms to complete missions and provide standoff for the warfighters. However, autonomous platforms rely on GPS to provide their global position. In many mission spaces the autonomous platforms may encounter GPS denied environments, which limits where the platform can operate and requires the warfighters to take its place. GPS denied environments can occur due to tall buildings, trees, or canyon walls blocking the GPS satellite signals, or due to a lack of coverage. An Inertial Navigation System (INS) uses sensors to detect the vehicle's movement and direction of travel in order to calculate the vehicle's position. One of the biggest challenges with an INS is the accuracy of the sensors and the accumulation of their errors over time. If these challenges can be overcome, the INS would provide accurate positioning information to the autonomous vehicle in GPS denied environments and allow it to provide the desired standoff for the warfighters.

  7. KeyWare: an open wireless distributed computing environment

    Science.gov (United States)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

    Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist in LAN based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple client multiple server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extensions of LAN-based management tools to include wireless nodes. A set of object oriented tools and methodologies enables direct asynchronous invocation of agent-based services supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  8. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Science.gov (United States)

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  9. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Directory of Open Access Journals (Sweden)

    Bruno Guazzelli Batista

    Full Text Available Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  10. Learning styles: individualizing computer-based learning environments

    Directory of Open Access Journals (Sweden)

    Tim Musson

    1995-12-01

Full Text Available While the need to adapt teaching to the needs of a student is generally acknowledged (see Corno and Snow, 1986, for a wide review of the literature), little is known about the impact of individual learner-differences on the quality of learning attained within computer-based learning environments (CBLEs). What evidence there is appears to support the notion that individual differences have implications for the degree of success or failure experienced by students (Ford and Ford, 1992) and by trainee end-users of software packages (Bostrom et al, 1990). The problem is to identify the way in which specific individual characteristics of a student interact with particular features of a CBLE, and how the interaction affects the quality of the resultant learning. Teaching in a CBLE is likely to require a subset of teaching strategies different from that subset appropriate to more traditional environments, and the use of a machine may elicit different behaviours from those normally arising in a classroom context.

  11. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    Science.gov (United States)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks presents a myriad of challenges when running on a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project, we highlight a stack of tools our team utilizes and has developed to make large-scale simulation and analysis work commonplace. These tools provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth System Grid Federation (ESGF), while executing everything in between in a scalable, task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases to which they have been applied.

  12. The Impact of Military Exercises and Operations on Local Environment

    African Journals Online (AJOL)

Among the non-conventional security matters, the environment has emerged as a new sphere in which the military has been actively involved, as both a benevolent and a malevolent agent, through its exercises and operations. Despite the notable positive contributions, the negative impact of military exercises and operations in the ...

  13. Designing safer living environments support for local government

    CSIR Research Space (South Africa)

    Landman, K

    1999-06-01

    Full Text Available This paper addresses the built environment, the opportunities it presents for crime and the role city planners and urban designers have to play in the design of safer cities and towns. City planners and urban designers can play a role...

  14. Environment Modules on the Peregrine System

    Science.gov (United States)

    NREL's Peregrine system uses environment modules to easily manage software environments. Module commands set up a basic environment for the default compilers, tools and libraries.

  15. Association between fast food purchasing and the local food environment.

    Science.gov (United States)

    Thornton, Lukar E; Kavanagh, A M

    2012-12-03

    In this study, an instrument was created to measure the healthy and unhealthy characteristics of food environments and investigate associations between the whole of the food environment and fast food consumption. In consultation with other academic researchers in this field, food stores were categorised as either healthy or unhealthy and weighted (between +10 and -10) by their likely contribution to healthy/unhealthy eating practices. A healthy and an unhealthy food environment score (FES) were created using these weightings. Using a cross-sectional study design, multilevel multinomial regression was used to estimate the effects of the whole food environment on the fast food purchasing habits of 2547 individuals. Respondents in areas in the highest tertile of the healthy FES had a lower likelihood of purchasing fast food both infrequently and frequently compared with respondents who never purchased; however, only infrequent purchasing remained significant when simultaneously modelled with the unhealthy FES (odds ratio (OR) 0.52; 95% confidence interval (CI) 0.32-0.83). Although a lower likelihood of frequent fast food purchasing was also associated with living in the highest tertile of the unhealthy FES, no association remained once the healthy FES was included in the models. In our binary models, respondents living in areas with a higher unhealthy FES than healthy FES were more likely to purchase fast food infrequently (OR 1.35; 95% CI 1.00-1.82); however, no association was found for frequent purchasing. Our study provides some evidence to suggest that healthier food environments may discourage fast food purchasing.
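The weighting scheme described above can be illustrated with a short sketch: each store type carries a weight between +10 and -10, and separate healthy and unhealthy scores are summed per area. The weights below are invented for illustration only; they are not the study instrument's actual values.

```python
# Hypothetical store weights in the instrument's +10 to -10 range.
STORE_WEIGHTS = {
    "supermarket": 8,
    "greengrocer": 10,
    "fast_food": -9,
    "convenience": -6,
}

def food_environment_scores(store_counts):
    """Return (healthy_FES, unhealthy_FES) for an area's store counts."""
    healthy = sum(count * STORE_WEIGHTS[store]
                  for store, count in store_counts.items()
                  if STORE_WEIGHTS[store] > 0)
    unhealthy = sum(count * STORE_WEIGHTS[store]
                    for store, count in store_counts.items()
                    if STORE_WEIGHTS[store] < 0)
    return healthy, unhealthy
```

Keeping the two scores separate, rather than collapsing them into one number, is what allows the healthy and unhealthy environments to be modelled simultaneously as in the study.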

  16. Implementing a Business Intelligence Environment in Local Market of Pakistan

    Directory of Open Access Journals (Sweden)

    FARIA JAMEEL

    2017-04-01

    Full Text Available BI (Business Intelligence) has gained great success during the last decade throughout the world as an aid to decision support, making the necessary knowledge available to reduce costs, increase revenues and minimize risks. The local market of Pakistan is still not very aware of its benefits, except for some multinationals which have been using these tools for almost 7-8 years, earning more revenue and improving performance, and a few others which are in the process of implementation. This work focuses on the implementation of BI in the small and medium-sized businesses of the local market. The pros and cons are identified by analysing the BI tools being used by large companies in Pakistan, and the feasibility of these tools at small and medium enterprises is discussed, so that they too may focus on their KPIs (Key Performance Indicators) to increase their performance level.

  17. Use of local convective and radiant cooling at warm environment

    DEFF Research Database (Denmark)

    Melikov, Arsen Krikor; Krejcirikova, Barbora; Kaczmarczyk, Jan

    2012-01-01

    The effect of four local cooling devices (convective, radiant and combined) on SBS symptoms reported by 24 subjects at 28 ˚C and 50% RH was studied. The devices studied were: (1) desk cooling fan, (2) personalized ventilation providing clean air, (3) two radiant panels and (4) two radiant panels...... and with the radiant panel with attached fans, which also helped people feel less fatigued. The SBS symptoms increased the most when the cooling fan, generating movement of polluted room air, was used.

  18. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of micro-computer software for Nuclear Safety Systems. More specifically, it describes experimental work in the field of software development methodologies to be used for the implementation of micro-computer based safety systems. An investigation of the technological improvements provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work aimed to assess a suitable automated tools environment for the whole software life-cycle. The main safety functions of a nuclear power reactor, such as DNBR and KW/FT, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to the new technological trends of ''Software Factories''. (author)

  19. Dynamic provisioning of local and remote compute resources with OpenStack

    Science.gov (United States)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events and for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution for virtualizing hardware and offering additional services like storage and virtual machine management. This contribution reports on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows is presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point of entry for the user. Evaluations of the performance and stability of this setup and operational experiences are also discussed.
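The demand-driven part of such a setup ultimately reduces to a scheduling decision: compare queued jobs against running virtual machines and decide how many to boot or shut down. The sketch below is a deliberately simplified assumption of that decision step (a real system would call the OpenStack API and respect site policies); `slots_per_vm` and `max_vms` are illustrative parameters, not values from the contribution.

```python
def vm_delta(queued_jobs, running_vms, slots_per_vm=4, max_vms=20):
    """Return the number of VMs to start (positive) or stop (negative)."""
    needed = -(-queued_jobs // slots_per_vm)   # ceiling division: VMs to cover the queue
    target = min(needed, max_vms)              # respect the resource cap
    return target - running_vms
```

For example, 10 queued jobs with 4 job slots per VM require 3 VMs; with 1 already running, the manager would boot 2 more.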

  20. Local area networking in a radio quiet environment

    Science.gov (United States)

    Childers, Edwin L.; Hunt, Gareth; Brandt, Joseph J.

    2002-11-01

    The Green Bank facility of the National Radio Astronomy Observatory is spread out over 2,700 acres in the Allegheny Mountains of West Virginia. Good communication has always been needed between the radio telescopes and the control buildings. The National Radio Quiet Zone helps protect the Green Bank site from radio transmissions that interfere with the astronomical signals. Due to stringent Radio Frequency Interference (RFI) requirements, a fiber optic communication system was used for Ethernet transmissions on the site and coaxial cable within the buildings. With the need for higher speed communications, the entire network has been upgraded to use optical fiber with modern Ethernet switches. As with most modern equipment, the implementation of the control of the newly deployed Green Bank Telescope (GBT) depends heavily on TCP/IP. In order to protect the GBT from the commodity Internet, the GBT uses a non-routable network. Communication between the control building Local Area Network (LAN) and the GBT is implemented using a Virtual LAN (VLAN). This configuration will be extended to achieve isolation between trusted local user systems, the GBT, and other Internet users. Legitimate access to the site, for example by remote observers, is likely to be implemented using a virtual private network (VPN).

  1. Self-Replication of Localized Vegetation Patches in Scarce Environments

    Science.gov (United States)

    Bordeu, Ignacio; Clerc, Marcel G.; Couteron, Pierre; Lefever, René; Tlidi, Mustapha

    2016-09-01

    Desertification due to climate change and increasing drought periods is a worldwide problem for both ecology and economy. Our ability to understand how vegetation manages to survive and propagate through arid and semiarid ecosystems may be useful in the development of future strategies to prevent desertification, preserve flora (and the fauna within), or even make use of scarce-resource soils. In this paper, we study a robust phenomenon observed in semi-arid ecosystems, by which localized vegetation patches split in a process called self-replication. Localized patches of vegetation are visible in nature at various spatial scales. Even though they have been described in the literature, their growth mechanisms remain largely unexplored. Here, we develop an innovative statistical analysis based on real field observations to show that patches may exhibit deformation and splitting. This growth mechanism is the opposite of desertification, since it allows the repopulation of territories devoid of vegetation. We investigate these aspects by characterizing quantitatively, with a simple mathematical model, a new class of instabilities that lead to the observed self-replication phenomenon.

  2. The local food environment and diet: a systematic review.

    Science.gov (United States)

    Caspi, Caitlin E; Sorensen, Glorian; Subramanian, S V; Kawachi, Ichiro

    2012-09-01

    Despite growing attention to the problem of obesogenic environments, there has not been a comprehensive review evaluating the food environment-diet relationship. This study aims to evaluate this relationship in the current literature, focusing specifically on the method of exposure assessment (GIS, survey, or store audit). This study also explores 5 dimensions of "food access" (availability, accessibility, affordability, accommodation, acceptability) using a conceptual definition proposed by Penchansky and Thomas (1981). Articles were retrieved through a systematic keyword search in Web of Science and supplemented by the reference lists of included studies. Thirty-eight studies were reviewed and categorized by the exposure assessment method and the conceptual dimensions of access it captured. GIS-based measures were the most common measures, but were less consistently associated with diet than other measures. Few studies examined dimensions of affordability, accommodation, and acceptability. Because GIS-based measures on their own may not capture important non-geographic dimensions of access, a set of recommendations for future researchers is outlined. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Applications of the pipeline environment for visual informatics and genomics computations

    Directory of Open Access Journals (Sweden)

    Genco Alex

    2011-07-01

    Full Text Available Abstract Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources.

  4. EDF: contribution to local development and the protection of the environment

    International Nuclear Information System (INIS)

    Parot, F.; Veyret, G.

    1995-01-01

    As a consequence of the 1982-1983 French Decentralization laws, local elected officials were entrusted with new responsibilities concerning environmental protection and local development. EDF, the French public electricity utility, therefore had to respond to new demands. New forms of cooperation with the various local actors were devised: assistance in diagnostics, working out local strategies, subcontracting and working for the establishment of new industrial plants, multi-purpose water management (dams, for example), environment protection (discreet lines...), urban waste treatment, transportation, etc.

  5. Relation between local food environments and obesity among adults

    Directory of Open Access Journals (Sweden)

    Raine Kim D

    2009-06-01

    Full Text Available Abstract Background Outside of the United States, evidence for associations between exposure to fast-food establishments and risk for obesity among adults is limited and equivocal. The purposes of this study were to investigate whether the relative availability of different types of food retailers around people's homes was associated with obesity among adults in Edmonton, Canada, and if this association varied as a function of distance between food locations and people's homes. Methods Data from a population health survey of 2900 adults (18 years or older) conducted in 2002 was linked with geographic measures of access to food retailers. Based upon a ratio of the number of fast-food restaurants and convenience stores to supermarkets and specialty food stores, a Retail Food Environment Index (RFEI) was calculated for 800 m and 1600 m buffers around people's homes. In a series of logistic regressions, associations between the RFEI and the level of obesity among adults were examined. Results The median RFEI for adults in Edmonton was 4.00 within an 800 m buffer around their residence and 6.46 within a 1600 m buffer around their residence. Approximately 14% of the respondents were classified as being obese. The odds of a resident being obese were significantly lower (OR = 0.75, 95% CI 0.59-0.95) if they lived in an area with the lowest RFEI (below 3.0) in comparison to the highest RFEI (5.0 and above). These associations existed regardless of the covariates included in the model. No significant associations were observed between RFEI within a 1600 m buffer of the home and obesity. Conclusion The lower the ratio of fast-food restaurants and convenience stores to grocery stores and produce vendors near people's homes, the lower the odds of being obese. Thus the proximity of the obesogenic environment to individuals appears to be an important factor in their risk for obesity.
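As a concrete illustration, the RFEI defined above is simply the count of fast-food restaurants plus convenience stores divided by the count of supermarkets plus specialty food stores within a buffer. A minimal sketch (the function and parameter names are our own, and the undefined-ratio handling is an assumption):

```python
def rfei(fast_food, convenience, supermarkets, specialty):
    """Retail Food Environment Index for one buffer around a home.

    Returns None when the buffer contains no supermarkets or specialty
    food stores, since the ratio is undefined in that case.
    """
    healthy = supermarkets + specialty
    if healthy == 0:
        return None
    return (fast_food + convenience) / healthy
```

Under this definition, an area with 6 fast-food restaurants, 2 convenience stores, 1 supermarket and 1 specialty store scores an RFEI of 4.0, matching the study's 800 m median.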

  6. Security issues of cloud computing environment in possible military applications

    OpenAIRE

    Samčović, Andreja B.

    2013-01-01

    The evolution of cloud computing over the past few years is potentially one of major advances in the history of computing and telecommunications. Although there are many benefits of adopting cloud computing, there are also some significant barriers to adoption, security issues being the most important of them. This paper introduces the concept of cloud computing; looks at relevant technologies in cloud computing; takes into account cloud deployment models and some military applications. Addit...

  7. Implementing interactive computing in an object-oriented environment

    Directory of Open Access Journals (Sweden)

    Frederic Udina

    2000-04-01

    Full Text Available Statistical computing when input/output is driven by a Graphical User Interface is considered. A proposal is made for automatic control of computational flow to ensure that only strictly required computations are actually carried out. The computational flow is modeled by a directed graph for implementation in any object-oriented programming language with symbolic manipulation capabilities. A complete implementation example is presented to compute and display frequency-based piecewise linear density estimators such as histograms or frequency polygons.
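The directed-graph idea above can be sketched in a few lines: each node caches its value and is recomputed only when it, or something upstream, has been invalidated. This is a minimal toy version of the general technique, not the paper's implementation:

```python
class Node:
    """A computation node in a directed dependency graph."""

    def __init__(self, func, *inputs):
        self.func, self.inputs = func, inputs
        self.dependents = []            # nodes that consume this value
        for node in inputs:
            node.dependents.append(self)
        self.value, self.dirty = None, True

    def invalidate(self):
        """Mark this node and everything downstream as needing recomputation."""
        if not self.dirty:
            self.dirty = True
            for node in self.dependents:
                node.invalidate()

    def get(self):
        """Return the cached value, recomputing only if marked dirty."""
        if self.dirty:
            self.value = self.func(*(n.get() for n in self.inputs))
            self.dirty = False
        return self.value
```

Invalidating a source node dirties only its downstream chain, so unrelated parts of the graph keep their cached values, which is exactly the "only strictly required computations" property.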

  8. Multi-Language Programming Environments for High Performance Java Computing

    Directory of Open Access Journals (Sweden)

    Vladimir Getov

    1999-01-01

    Full Text Available Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community's reliance on established scientific packages. As a consequence, programmers of high-performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java-to-C Interface (JCI) tool, which provides application programmers wishing to use Java with immediate accessibility to existing scientific packages. The JCI tool also facilitates rapid development and reuse of existing code. These benefits are provided at minimal cost to the programmer. While beneficial to the programmer, the additional advantages of mixed-language programming in terms of application performance and portability are addressed in detail within the context of this paper. In addition, we discuss how the JCI tool complements other ongoing projects such as IBM's High-Performance Compiler for Java (HPCJ) and IceT's metacomputing environment.

  9. Enrichment of Human-Computer Interaction in Brain-Computer Interfaces via Virtual Environments

    Directory of Open Access Journals (Sweden)

    Alonso-Valerdi Luz María

    2017-01-01

    Full Text Available Tridimensional representations stimulate cognitive processes that are the core and foundation of human-computer interaction (HCI). Those cognitive processes take place while a user navigates and explores a virtual environment (VE), and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCI). A BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his/her environment. Although BCI research started in the sixties, this technology is not yet efficient or reliable for everyone at any time. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate the user's mental state to increase the differentiation between control and noncontrol modalities.

  10. Scalable quantum computation via local control of only two qubits

    International Nuclear Information System (INIS)

    Burgarth, Daniel; Maruyama, Koji; Murphy, Michael; Montangero, Simone; Calarco, Tommaso; Nori, Franco; Plenio, Martin B.

    2010-01-01

    We apply quantum control techniques to a long spin chain by acting only on two qubits at one of its ends, thereby implementing universal quantum computation by a combination of quantum gates on these qubits and indirect swap operations across the chain. It is shown that the control sequences can be computed and implemented efficiently. We discuss the application of these ideas to physical systems such as superconducting qubits in which full control of long chains is challenging.

  11. Multi-VO support in IHEP's distributed computing environment

    International Nuclear Information System (INIS)

    Yan, T; Suo, B; Zhao, X H; Zhang, X M; Ma, Z T; Yan, X F; Lin, T; Deng, Z Y; Li, W D; Belov, S; Pelevanyuk, I; Zhemchugov, A; Cai, H

    2015-01-01

    Inspired by the success of BESDIRAC, the distributed computing environment based on DIRAC for the BESIII experiment, several other experiments operated by the Institute of High Energy Physics (IHEP), such as the Circular Electron Positron Collider (CEPC), the Jiangmen Underground Neutrino Observatory (JUNO), the Large High Altitude Air Shower Observatory (LHAASO) and the Hard X-ray Modulation Telescope (HXMT), are willing to use DIRAC to integrate the geographically distributed computing resources made available by their collaborations. In order to minimize manpower and hardware cost, we extended the BESDIRAC platform to support a multi-VO scenario, instead of setting up a self-contained distributed computing environment for each VO. This makes DIRAC a service for the community of those experiments. To support multi-VO, the system architecture of BESDIRAC has been adjusted for scalability. The VOMS and DIRAC servers are reconfigured to manage users and groups belonging to several VOs. A lightweight storage resource manager, StoRM, is employed as the central SE to integrate local and grid data. A frontend system is designed for users' massive job splitting, submission and management, with plugins to support new VOs. A monitoring and accounting system is also considered to ease system administration and VO-related resource usage accounting. (paper)

  12. Adaptations to local environments in modern human populations.

    Science.gov (United States)

    Jeong, Choongwon; Di Rienzo, Anna

    2014-12-01

    After leaving sub-Saharan Africa around 50000-100000 years ago, anatomically modern humans have quickly occupied extremely diverse environments. Human populations were exposed to further environmental changes resulting from cultural innovations, such as the spread of farming, which gave rise to new selective pressures related to pathogen exposures and dietary shifts. In addition to changing the frequency of individual adaptive alleles, natural selection may also shape the overall genetic architecture of adaptive traits. Here, we review recent advances in understanding the genetic architecture of adaptive human phenotypes based on insights from the studies of lactase persistence, skin pigmentation and high-altitude adaptation. These adaptations evolved in parallel in multiple human populations, providing a chance to investigate independent realizations of the evolutionary process. We suggest that the outcome of adaptive evolution is often highly variable even under similar selective pressures. Finally, we highlight a growing need for detecting adaptations that did not follow the classical sweep model and for incorporating new sources of genetic evidence such as information from ancient DNA. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Creation Greenhouse Environment Map Using Localization of Edge of Cultivation Platforms Based on Stereo Vision

    Directory of Open Access Journals (Sweden)

    A Nasiri

    2017-10-01

    Full Text Available Introduction Stereo vision means the capability of extracting depth based on the analysis of two images taken from different angles of one scene. The result of stereo vision is a collection of three-dimensional points which describes the details of the scene in proportion to the resolution of the obtained images. Automatic vehicle steering and crop growth monitoring are two important operations in precision agriculture. The essential aspects of automated steering are the position and orientation of the agricultural equipment in relation to the crop row, the detection of obstacles, and path planning between the crop rows. The developed map can provide this information in real time. Machine vision has the capabilities to perform these tasks in order to execute operations such as cultivation, spraying and harvesting. In a greenhouse environment, it is possible to develop a map and perform automatic control by detecting and localizing the cultivation platforms as the main moving obstacle. The current work presents a method based on stereo vision for detecting and localizing platforms, and then providing a two-dimensional map of the cultivation platforms in the greenhouse environment. Materials and Methods In this research, two webcams made by Microsoft Corporation, with a resolution of 960×544, are connected to the computer via USB2 in order to produce a parallel stereo camera. Due to the structure of the cultivation platforms, the number of points in the point cloud is decreased by extracting only the upper and lower edges of the platform. The proposed method aims at extracting the edges based on depth-discontinuity features in the region of the platform edge. By obtaining the disparity image of the platform edges from the rectified stereo images and translating its data to 3D space, the point cloud model of the environment is constructed. Then, by projecting the points onto the XZ plane and putting local maps together
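The back-projection step at the heart of this pipeline can be sketched directly: for a rectified parallel stereo rig, depth follows Z = fB/d, and the 2D map is the projection of the resulting points onto the XZ (ground) plane. The focal length and baseline below are made-up values for illustration; only the principal point, chosen as the centre of a 960×544 image, ties back to the setup described above.

```python
def disparity_to_point(u, v, d, f=700.0, B=0.12, cx=480.0, cy=272.0):
    """Back-project pixel (u, v) with disparity d (pixels) to (X, Y, Z) in metres.

    f      -- focal length in pixels (assumed value)
    B      -- stereo baseline in metres (assumed value)
    cx, cy -- principal point (centre of a 960x544 image)
    """
    Z = f * B / d            # triangulation for a parallel stereo rig
    X = (u - cx) * Z / f
    Y = (v - cy) * Z / f
    return X, Y, Z

def to_xz_map(points):
    """Project 3D points onto the ground (XZ) plane for the 2D map."""
    return [(X, Z) for X, Y, Z in points]
```

Applying `to_xz_map` to the back-projected edge points yields the flat, top-down map of platform edges that the method assembles from local maps.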

  14. Dynamic integration of remote cloud resources into local computing clusters

    Energy Technology Data Exchange (ETDEWEB)

    Fleig, Georg; Erli, Guenther; Giffels, Manuel; Hauth, Thomas; Quast, Guenter; Schnepf, Matthias [Institut fuer Experimentelle Kernphysik, Karlsruher Institut fuer Technologie (Germany)

    2016-07-01

    In modern high-energy physics (HEP) experiments, enormous amounts of data are analyzed and simulated. Traditionally, dedicated HEP computing centers are built or extended to meet this steadily increasing demand for computing resources. Nowadays it is more reasonable and more flexible to utilize computing power at remote data centers that provide regular cloud services to users, as they can be operated in a more efficient manner. This approach uses virtualization and allows the HEP community to run virtual machines containing a dedicated operating system and transparent access to the required software stack on almost any cloud site. The dynamic management of virtual machines depending on the demand for computing power is essential for cost-efficient operation and sharing of resources with other communities. For this purpose the EKP developed the on-demand cloud manager ROCED for dynamic instantiation and integration of virtualized worker nodes into the institute's computing cluster. This contribution reports on the concept of our cloud manager and an implementation utilizing a remote OpenStack cloud site and a shared HPC center (the bwForCluster located in Freiburg).

  15. Earth observation scientific workflows in a distributed computing environment

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2011-09-01

    Full Text Available capabilities has focused on the web services approach as exemplified by the OGC's Web Processing Service and by GRID computing. The approach to leveraging distributed computing resources described in this paper uses instead remote objects via RPy...

  16. Exploring the influence of local food environments on food behaviours: a systematic review of qualitative literature.

    Science.gov (United States)

    Pitt, Erin; Gallegos, Danielle; Comans, Tracy; Cameron, Cate; Thornton, Lukar

    2017-09-01

    Systematic reviews investigating associations between objective measures of the food environment and dietary behaviours or health outcomes have not established a consistent evidence base. The present paper aims to synthesise qualitative evidence regarding the influence of local food environments on food and purchasing behaviours. Design: a systematic review in the form of a qualitative thematic synthesis. Setting: urban localities. Subjects: adults. Four analytic themes were identified from the review, including community and consumer nutrition environments, other environmental factors and individual coping strategies for shopping and purchasing decisions. Availability, accessibility and affordability were consistently identified as key determinants of store choice and purchasing behaviours that often result in less healthy food choices within community nutrition environments. Food availability, quality and food store characteristics within consumer nutrition environments also greatly influenced in-store purchases. Individuals used a range of coping strategies in both the community and consumer nutrition environments to make optimal purchasing decisions, often within the context of financial constraints. Findings from the current review add depth and scope to the quantitative literature and can guide ongoing theory, intervention and policy development in food environment research. There is a need to investigate contextual influences within food environments as well as individual and household socio-economic characteristics that contribute to the differing use of and views towards local food environments. Greater emphasis on how individual and environmental factors interact in the food environment field will be key to developing a stronger understanding of how environments can support and promote healthier food choices.

  17. Gestão local e meio ambiente Local management and environment

    Directory of Open Access Journals (Sweden)

    Paulo Gonzaga M. de Carvalho

    2005-01-01

    Full Text Available Based on the information made available by IBGE's Municipal Basic Information Survey, this article analyzes three variables: the existence of Municipal Councils for the Environment, of Special Funds for the Environment, and of legislation on Areas of Special Interest. Among other aspects, it examines the incidence of Municipal Councils for the Environment with regard to the hydrographic basin and the mayor's political party.

  18. Electrical imaging for localizing historical tunnels at an urban environment

    Science.gov (United States)

    Osella, Ana; Martinelli, Patricia; Grunhut, Vivian; de la Vega, Matías; Bonomo, Néstor; Weissel, Marcelo

    2015-08-01

    We performed a geophysical study at a historical site in Buenos Aires, Argentina, corresponding to the location of a Jesuit Mission established during the 17th century, remaining there until the 18th century. The site consisted of a church, cloisters, a school, orchards and a procurator’s office; also several tunnels were built, connecting the mission with different public buildings in the town. In the 19th century the Faculty of Sciences of the University of Buenos Aires was built in a sector of the site originally occupied by an orchard, functioning until its demolition in 1973. At present, this area is a cobbled square. With the aim of preserving and restoring the buried structures, work was carried out in this square looking for tunnels and remains of the basement of the old building. Considering the conductive features of the subsoil, mainly formed by clays and silt, the complex characteristics of the buried structures, and the urban localization of the study area with its consequent high level of environmental electromagnetic noise, we performed pre-feasibility studies to determine the usefulness of different geophysical methods. The best results were achieved from the geoelectrical method. Dipole-dipole profiles with electrode spacings of 1.5 and 3 m provided enough lateral and vertical resolution and the required penetration depth. Reliable data were obtained as long as the electrodes were buried at least 15 cm among the cobble stones. Nine 2D electrical resistivity tomographies were obtained by using a robust inversion procedure to reduce the effect of possible data outliers in the resulting models. The effect on these models of different error estimations was also analyzed. Then, we built up a pseudo-3D model by laterally interpolating the 2D inversion results. Finally, by correlating the resulting model with the original plans, the remains of the expected main structures embedded in the site were characterized. In addition, an anomaly was

  19. Electrical imaging for localizing historical tunnels at an urban environment

    International Nuclear Information System (INIS)

    Osella, Ana; Martinelli, Patricia; De la Vega, Matías; Bonomo, Néstor; Grunhut, Vivian; Weissel, Marcelo

    2015-01-01

    We performed a geophysical study at a historical site in Buenos Aires, Argentina, corresponding to the location of a Jesuit Mission established during the 17th century, remaining there until the 18th century. The site consisted of a church, cloisters, a school, orchards and a procurator’s office; also several tunnels were built, connecting the mission with different public buildings in the town. In the 19th century the Faculty of Sciences of the University of Buenos Aires was built in a sector of the site originally occupied by an orchard, functioning until its demolition in 1973. At present, this area is a cobbled square. With the aim of preserving and restoring the buried structures, work was carried out in this square looking for tunnels and remains of the basement of the old building. Considering the conductive features of the subsoil, mainly formed by clays and silt, the complex characteristics of the buried structures, and the urban localization of the study area with its consequent high level of environmental electromagnetic noise, we performed pre-feasibility studies to determine the usefulness of different geophysical methods. The best results were achieved from the geoelectrical method. Dipole–dipole profiles with electrode spacings of 1.5 and 3 m provided enough lateral and vertical resolution and the required penetration depth. Reliable data were obtained as long as the electrodes were buried at least 15 cm among the cobble stones. Nine 2D electrical resistivity tomographies were obtained by using a robust inversion procedure to reduce the effect of possible data outliers in the resulting models. The effect on these models of different error estimations was also analyzed. Then, we built up a pseudo-3D model by laterally interpolating the 2D inversion results. Finally, by correlating the resulting model with the original plans, the remains of the expected main structures embedded in the site were characterized. In addition, an anomaly was

  20. Diffusion Monte Carlo approach versus adiabatic computation for local Hamiltonians

    Science.gov (United States)

    Bringewatt, Jacob; Dorland, William; Jordan, Stephen P.; Mink, Alan

    2018-02-01

    Most research regarding quantum adiabatic optimization has focused on stoquastic Hamiltonians, whose ground states can be expressed with only real non-negative amplitudes and for which destructive interference is therefore not manifest. This raises the question of whether classical Monte Carlo algorithms can efficiently simulate quantum adiabatic optimization with stoquastic Hamiltonians. Recent results have given counterexamples in which path-integral and diffusion Monte Carlo fail to do so. However, most adiabatic optimization algorithms, such as for solving MAX-k-SAT problems, use k-local Hamiltonians, whereas our previous counterexample for diffusion Monte Carlo involved n-body interactions. Here we present a 6-local counterexample which demonstrates that even for these local Hamiltonians there are cases where diffusion Monte Carlo cannot efficiently simulate quantum adiabatic optimization. Furthermore, we perform empirical testing of diffusion Monte Carlo on a standard well-studied class of permutation-symmetric tunneling problems and similarly find large advantages for quantum optimization over diffusion Monte Carlo.
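    As background for the comparison above, plain diffusion Monte Carlo can be illustrated on a toy stoquastic problem: the 1D harmonic oscillator, whose ground-state energy is 0.5 in units where hbar = m = omega = 1. This is a minimal textbook-style sketch, not the algorithm studied in the paper:

```python
import numpy as np

def dmc_harmonic(n_walkers=2000, n_steps=2000, dt=0.01, seed=0):
    """Toy diffusion Monte Carlo for H = -0.5 d^2/dx^2 + 0.5 x^2.

    Walkers diffuse freely; a branching weight exp(-(V - E_ref)*dt)
    replicates or kills them, and E_ref is steered to keep the
    population near its target size. At stationarity, the averaged
    E_ref estimates the ground-state energy E_0.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_walkers)
    e_ref, estimates = 0.5, []
    for _ in range(n_steps):
        x = x + rng.normal(scale=np.sqrt(dt), size=x.size)   # diffusion step
        v = 0.5 * x**2
        weights = np.exp(-(v - e_ref) * dt)
        copies = (weights + rng.uniform(size=x.size)).astype(int)
        x = np.repeat(x, copies)                             # birth/death
        # population-control feedback on the reference energy
        e_ref = np.mean(0.5 * x**2) + (1.0 - x.size / n_walkers)
        estimates.append(e_ref)
    return float(np.mean(estimates[n_steps // 2:]))
```

    On such a stoquastic, sign-problem-free Hamiltonian the estimate converges near the exact 0.5; the paper's contribution is exhibiting local Hamiltonians where this style of simulation fails to track the adiabatic path efficiently.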

  1. Securing the Data Storage and Processing in Cloud Computing Environment

    Science.gov (United States)

    Owens, Rodney

    2013-01-01

    Organizations increasingly utilize cloud computing architectures to reduce costs and energy consumption both in the data warehouse and on mobile devices by better utilizing the computing resources available. However, the security and privacy issues with publicly available cloud computing infrastructures have not been studied to a sufficient depth…

  2. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Adel Sarofim; Connie Senior

    2004-12-22

    , immersive environment. The Virtual Engineering Framework (VEF), in effect a prototype framework, was developed through close collaboration with NETL supported research teams from Iowa State University Virtual Reality Applications Center (ISU-VRAC) and Carnegie Mellon University (CMU). The VEF is open source, compatible across systems ranging from inexpensive desktop PCs to large-scale, immersive facilities and provides support for heterogeneous distributed computing of plant simulations. The ability to compute plant economics through an interface that coupled the CMU IECM tool to the VEF was demonstrated, and the ability to couple the VEF to Aspen Plus, a commercial flowsheet modeling tool, was demonstrated. Models were interfaced to the framework using VES-Open. Tests were performed for interfacing CAPE-Open-compliant models to the framework. Where available, the developed models and plant simulations have been benchmarked against data from the open literature. The VEF has been installed at NETL. The VEF provides simulation capabilities not available in commercial simulation tools. It provides DOE engineers, scientists, and decision makers with a flexible and extensible simulation system that can be used to reduce the time, technical risk, and cost to develop the next generation of advanced, coal-fired power systems that will have low emissions and high efficiency. Furthermore, the VEF provides a common simulation system that NETL can use to help manage Advanced Power Systems Research projects, including both combustion- and gasification-based technologies.

  3. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    Science.gov (United States)

    2017-08-08

    communicate their subjective opinions. Keywords: Usability Analysis; CAVE™ (Cave Automatic Virtual Environments); Human Computer Interface (HCI...the differences in interaction when compared with traditional human computer interfaces. This paper provides analysis via usability study methods

  4. Students experiences with collaborative learning in asynchronous computer-supported collaborative learning environments.

    NARCIS (Netherlands)

    Dewiyanti, Silvia; Brand-Gruwel, Saskia; Jochems, Wim; Broers, Nick

    2008-01-01

    Dewiyanti, S., Brand-Gruwel, S., Jochems, W., & Broers, N. (2007). Students experiences with collaborative learning in asynchronous computer-supported collaborative learning environments. Computers in Human Behavior, 23, 496-514.

  5. A heterogeneous computing environment to solve the 768-bit RSA challenge

    OpenAIRE

    Kleinjung, Thorsten; Bos, Joppe Willem; Lenstra, Arjen K.; Osvik, Dag Arne; Aoki, Kazumaro; Contini, Scott; Franke, Jens; Thomé, Emmanuel; Jermini, Pascal; Thiémard, Michela; Leyland, Paul; Montgomery, Peter L.; Timofeev, Andrey; Stockinger, Heinz

    2010-01-01

    In December 2009 the 768-bit, 232-digit number RSA-768 was factored using the number field sieve. Overall, the computational challenge would take more than 1700 years on a single, standard core. In the article we present the heterogeneous computing approach, involving different compute clusters and Grid computing environments, used to solve this problem.

  6. The Effects of a Robot Game Environment on Computer Programming Education for Elementary School Students

    Science.gov (United States)

    Shim, Jaekwoun; Kwon, Daiyoung; Lee, Wongyu

    2017-01-01

    In the past, computer programming was perceived as a task only carried out by computer scientists; in the 21st century, however, computer programming is viewed as a critical and necessary skill that everyone should learn. In order to improve teaching of problem-solving abilities in a computing environment, extensive research is being done on…

  7. Thinking Globally, Acting Locally: Using the Local Environment to Explore Global Issues.

    Science.gov (United States)

    Simmons, Deborah

    1994-01-01

    Asserts that water pollution is a global problem and presents statistics indicating how much of the world's water is threatened. Presents three elementary school classroom activities on water quality and local water resources. Includes a figure describing the work of the Global Rivers Environmental Education Network. (CFR)

  8. Distributed computing environments for future space control systems

    Science.gov (United States)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  9. The Integrated Computational Environment for Airbreathing Hypersonic Flight Vehicle Modeling and Design Evaluation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — An integrated computational environment for multidisciplinary, physics-based simulation and analyses of airbreathing hypersonic flight vehicles will be developed....

  10. High performance computation of landscape genomic models including local indicators of spatial association.

    Science.gov (United States)

    Stucki, S; Orozco-terWengel, P; Forester, B R; Duruz, S; Colli, L; Masembe, C; Negrini, R; Landguth, E; Jones, M R; Bruford, M W; Taberlet, P; Joost, S

    2017-09-01

    With the increasing availability of both molecular and topo-climatic data, the main challenges facing landscape genomics - that is the combination of landscape ecology with population genomics - include processing large numbers of models and distinguishing between selection and demographic processes (e.g. population structure). Several methods address the latter, either by estimating a null model of population history or by simultaneously inferring environmental and demographic effects. Here we present samβada, an approach designed to study signatures of local adaptation, with special emphasis on high performance computing of large-scale genetic and environmental data sets. samβada identifies candidate loci using genotype-environment associations while also incorporating multivariate analyses to assess the effect of many environmental predictor variables. This enables the inclusion of explanatory variables representing population structure into the models to lower the occurrences of spurious genotype-environment associations. In addition, samβada calculates local indicators of spatial association for candidate loci to provide information on whether similar genotypes tend to cluster in space, which constitutes a useful indication of the possible kinship between individuals. To test the usefulness of this approach, we carried out a simulation study and analysed a data set from Ugandan cattle to detect signatures of local adaptation with samβada, bayenv, lfmm and an FST outlier method (FDIST approach in Arlequin) and compare their results. samβada - an open source software for Windows, Linux and Mac OS X available at http://lasig.epfl.ch/sambada - outperforms other approaches and better suits whole-genome sequence data processing. © 2016 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.
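    The "local indicators of spatial association" that samβada reports for candidate loci are typically variants of local Moran's I. A minimal sketch of that statistic follows; this is a simplified formulation of our own, not samβada's implementation:

```python
import numpy as np

def local_morans_i(values, weights):
    """Local Moran's I at each site: I_i = z_i * sum_j w_ij * z_j / m2,
    where z is the mean-centred variable and m2 its variance. Positive
    values flag sites whose neighbours carry similar values (spatial
    clustering of genotypes); negative values flag spatial outliers.
    """
    z = np.asarray(values, dtype=float) - np.mean(values)
    m2 = np.mean(z ** 2)
    return z * (np.asarray(weights, dtype=float) @ z) / m2

# Four sampling sites on a line with binary adjacency; allele presence 1,1,0,0
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
lisa = local_morans_i([1.0, 1.0, 0.0, 0.0], w)
```

    The two end sites, each matching their only neighbour, receive positive I values, indicating that like genotypes cluster in space.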

  11. Computation Offloading Algorithm for Arbitrarily Divisible Applications in Mobile Edge Computing Environments: An OCR Case

    Directory of Open Access Journals (Sweden)

    Bo Li

    2018-05-01

    Full Text Available Divisible applications are a class of tasks whose loads can be partitioned into some smaller fractions, and each part can be executed independently by a processor. A wide variety of divisible applications have been found in the area of parallel and distributed processing. This paper addresses the problem of how to partition and allocate divisible applications to available resources in mobile edge computing environments with the aim of minimizing the completion time of the applications. A theoretical model was proposed for partitioning an entire divisible application according to the load of the application and the capabilities of available resources, and the solutions were derived in closed form. Both simulations and real experiments were carried out to justify this model.
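    The closed-form character of such solutions can be illustrated with the simplest case, in which communication cost is negligible: minimizing completion time requires all processors to finish simultaneously, which fixes each load fraction in proportion to processor speed. This toy sketch uses our own notation, not the paper's full edge-computing model:

```python
def partition_divisible_load(total_load, speeds):
    """Split a divisible workload so that all processors finish together.

    With negligible transfer cost, equal finish times require
    alpha_i * W / s_i = T for all i, giving alpha_i = s_i / sum(s)
    and completion time T = W / sum(s).
    """
    total_speed = sum(speeds)
    fractions = [s / total_speed for s in speeds]
    loads = [f * total_load for f in fractions]
    completion_time = total_load / total_speed
    return loads, completion_time

# 100 units of work across one slow and two fast processors
loads, t = partition_divisible_load(100.0, [1.0, 2.0, 2.0])
```

    Accounting for transfer delays, as the mobile-edge setting requires, changes the coefficients but keeps the same equal-finish-time structure.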

  12. Individually controlled localized chilled beam in conjunction with chilled ceiling: Part 1 – Physical environment

    DEFF Research Database (Denmark)

    Arghand, Taha; Bolashikov, Zhecho Dimitrov; Kosonen, Risto

    2016-01-01

    This study investigates the indoor environment generated by a localized chilled beam coupled with a chilled ceiling (LCBCC) and compares it with the environment generated by mixing ventilation coupled with a chilled ceiling (CCMV). The experiments were performed in a mock-up of a single office (4.1 m × 4

  13. Intraspecific Colour Variation among Lizards in Distinct Island Environments Enhances Local Camouflage

    Science.gov (United States)

    Marshall, Kate L. A.; Philpot, Kate E.; Damas-Moreira, Isabel; Stevens, Martin

    2015-01-01

    Within-species colour variation is widespread among animals. Understanding how this arises can elucidate evolutionary mechanisms, such as those underlying reproductive isolation and speciation. Here, we investigated whether five island populations of Aegean wall lizards (Podarcis erhardii) have more effective camouflage against their own (local) island substrates than against other (non-local) island substrates to avian predators, and whether this was linked to island differences in substrate appearance. We also investigated whether degree of local substrate matching varied among island populations and between sexes. In most populations, both sexes were better matched against local backgrounds than against non-local backgrounds, particularly in terms of luminance (perceived lightness), which usually occurred when local and non-local backgrounds were different in appearance. This was found even between island populations that historically had a land connection and in populations that have been isolated relatively recently, suggesting that isolation in these distinct island environments has been sufficient to cause enhanced local background matching, sometimes on a rapid evolutionary time-scale. However, heightened local matching was poorer in populations inhabiting more variable and unstable environments with a prolonged history of volcanic activity. Overall, these results show that lizard coloration is tuned to provide camouflage in local environments, either due to genetic adaptation or changes during development. Yet, the occurrence and extent of selection for local matching may depend on specific conditions associated with local ecology and biogeographic history. These results emphasize how anti-predator adaptations to different environments can drive divergence within a species, which may contribute to reproductive isolation among populations and lead to ecological speciation. PMID:26372454

  14. Intraspecific Colour Variation among Lizards in Distinct Island Environments Enhances Local Camouflage.

    Directory of Open Access Journals (Sweden)

    Kate L A Marshall

    Full Text Available Within-species colour variation is widespread among animals. Understanding how this arises can elucidate evolutionary mechanisms, such as those underlying reproductive isolation and speciation. Here, we investigated whether five island populations of Aegean wall lizards (Podarcis erhardii) have more effective camouflage against their own (local) island substrates than against other (non-local) island substrates to avian predators, and whether this was linked to island differences in substrate appearance. We also investigated whether degree of local substrate matching varied among island populations and between sexes. In most populations, both sexes were better matched against local backgrounds than against non-local backgrounds, particularly in terms of luminance (perceived lightness), which usually occurred when local and non-local backgrounds were different in appearance. This was found even between island populations that historically had a land connection and in populations that have been isolated relatively recently, suggesting that isolation in these distinct island environments has been sufficient to cause enhanced local background matching, sometimes on a rapid evolutionary time-scale. However, heightened local matching was poorer in populations inhabiting more variable and unstable environments with a prolonged history of volcanic activity. Overall, these results show that lizard coloration is tuned to provide camouflage in local environments, either due to genetic adaptation or changes during development. Yet, the occurrence and extent of selection for local matching may depend on specific conditions associated with local ecology and biogeographic history. These results emphasize how anti-predator adaptations to different environments can drive divergence within a species, which may contribute to reproductive isolation among populations and lead to ecological speciation.

  15. Intraspecific Colour Variation among Lizards in Distinct Island Environments Enhances Local Camouflage.

    Science.gov (United States)

    Marshall, Kate L A; Philpot, Kate E; Damas-Moreira, Isabel; Stevens, Martin

    2015-01-01

    Within-species colour variation is widespread among animals. Understanding how this arises can elucidate evolutionary mechanisms, such as those underlying reproductive isolation and speciation. Here, we investigated whether five island populations of Aegean wall lizards (Podarcis erhardii) have more effective camouflage against their own (local) island substrates than against other (non-local) island substrates to avian predators, and whether this was linked to island differences in substrate appearance. We also investigated whether degree of local substrate matching varied among island populations and between sexes. In most populations, both sexes were better matched against local backgrounds than against non-local backgrounds, particularly in terms of luminance (perceived lightness), which usually occurred when local and non-local backgrounds were different in appearance. This was found even between island populations that historically had a land connection and in populations that have been isolated relatively recently, suggesting that isolation in these distinct island environments has been sufficient to cause enhanced local background matching, sometimes on a rapid evolutionary time-scale. However, heightened local matching was poorer in populations inhabiting more variable and unstable environments with a prolonged history of volcanic activity. Overall, these results show that lizard coloration is tuned to provide camouflage in local environments, either due to genetic adaptation or changes during development. Yet, the occurrence and extent of selection for local matching may depend on specific conditions associated with local ecology and biogeographic history. These results emphasize how anti-predator adaptations to different environments can drive divergence within a species, which may contribute to reproductive isolation among populations and lead to ecological speciation.

  16. Cloud Computing as Network Environment in Students Work

    OpenAIRE

    Piotrowski, Dominik Mirosław

    2013-01-01

    The purpose of the article was to show the need for educating students about the variety of services available in cloud computing, as a specialist field of information activity. University teaching in the field of cloud computing, related to the management of information, could provide tangible benefits in the form of useful learning outcomes. This allows students and future information professionals to begin enjoying the benefits of the cloud computing SaaS model at work, thereby freeing up of...

  17. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Science.gov (United States)

    Abdullahi, Mohammed; Ngadi, Md Asri

    2016-01-01

    Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed on a pay-as-you-go basis. Task scheduling is one of the significant research challenges in cloud computing environments. The current formulation of task scheduling problems has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic features of cloud resources make optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and quality of solution of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions on local solution regions, hence adding exploration ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduced makespan and degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Simulation results showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.
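    The two quantities such fitness functions target, makespan and degree of imbalance, can be computed for any candidate task-to-VM assignment. A minimal sketch under the common definitions (makespan = latest VM completion time; DI = (Tmax - Tmin) / Tavg); the names here are ours, not the paper's:

```python
def schedule_metrics(task_lengths, assignment, vm_speeds):
    """Evaluate a task-to-VM assignment.

    Each VM's completion time is the sum of its assigned task lengths
    divided by its speed. Makespan is the maximum completion time;
    degree of imbalance (DI) = (T_max - T_min) / T_avg measures how
    unevenly the VMs are loaded.
    """
    completion = [0.0] * len(vm_speeds)
    for length, vm in zip(task_lengths, assignment):
        completion[vm] += length / vm_speeds[vm]
    t_max, t_min = max(completion), min(completion)
    t_avg = sum(completion) / len(completion)
    return t_max, (t_max - t_min) / t_avg

# Three tasks on two VMs: tasks 0 and 1 on the slow VM, task 2 on the fast one
makespan, di = schedule_metrics([4.0, 6.0, 10.0], [0, 0, 1], [1.0, 2.0])
```

    A metaheuristic such as SASOS searches the assignment space to drive both numbers down, trading them off through the fitness function.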

  18. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Directory of Open Access Journals (Sweden)

    Mohammed Abdullahi

    Full Text Available Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed on a pay-as-you-go basis. Task scheduling is one of the significant research challenges in cloud computing environments. The current formulation of task scheduling problems has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic features of cloud resources make optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and quality of solution of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions on local solution regions, hence adding exploration ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduced makespan and degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Simulation results showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.

  19. The efficacy of control environment as fraud deterrence in local government

    Directory of Open Access Journals (Sweden)

    Nuswantara Dian Anita

    2017-12-01

    Full Text Available In a globalised scenario, the enormous increase of malfeasance in local governments, posing catastrophic threats that stem from a vicious bureaucratic apparatus, has become a global phenomenon. This study uses case study material on the risk management control system, especially the control environment, in Indonesian local governments to extend existing theory by developing a contingency theory for the public sector. Within local government, contingency theory has emerged as a lens for exploring the links between public sector initiatives to improve risk mitigation and the structure of the control system. The case illustrates that the discretion of the control environment - the encouragement of a local government's control environment - can serve as a springboard for fraud deterrence, while weaknesses in it may become loopholes in the government's control systems.

  20. A Semantic Based Policy Management Framework for Cloud Computing Environments

    Science.gov (United States)

    Takabi, Hassan

    2013-01-01

    Cloud computing paradigm has gained tremendous momentum and generated intensive interest. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force and we need to provide security mechanisms to ensure its secure adoption. In this dissertation, we mainly focus on issues related to policy management and access…

  1. Newmark local time stepping on high-performance computing architectures

    KAUST Repository

    Rietmann, Max

    2016-11-25

    In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100×). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.

  2. Newmark local time stepping on high-performance computing architectures

    Energy Technology Data Exchange (ETDEWEB)

    Rietmann, Max, E-mail: max.rietmann@erdw.ethz.ch [Institute for Computational Science, Università della Svizzera italiana, Lugano (Switzerland); Institute of Geophysics, ETH Zurich (Switzerland); Grote, Marcus, E-mail: marcus.grote@unibas.ch [Department of Mathematics and Computer Science, University of Basel (Switzerland); Peter, Daniel, E-mail: daniel.peter@kaust.edu.sa [Institute for Computational Science, Università della Svizzera italiana, Lugano (Switzerland); Institute of Geophysics, ETH Zurich (Switzerland); Schenk, Olaf, E-mail: olaf.schenk@usi.ch [Institute for Computational Science, Università della Svizzera italiana, Lugano (Switzerland)

    2017-04-01

    In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100x). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.

  3. Newmark local time stepping on high-performance computing architectures

    KAUST Repository

    Rietmann, Max; Grote, Marcus; Peter, Daniel; Schenk, Olaf

    2016-01-01

    In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100×). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.
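The premise shared by the three records above is that element size sets the stable explicit step through the CFL condition, so LTS lets coarse elements advance with proportionally larger steps. A simplified illustration of the power-of-two level assignment is sketched below (hypothetical helper names, assuming a uniform scalar wave speed; the actual multilevel LTS-Newmark scheme additionally handles the coupling between levels at their interfaces):

```python
import math

def cfl_timestep(h, c, courant=0.5):
    """Largest stable explicit step for an element of size h and wave speed c."""
    return courant * h / c

def lts_levels(sizes, c, courant=0.5):
    """Group elements into power-of-two time-step levels (multilevel LTS).
    Level k elements advance with dt = dt_min * 2**k, capped by each
    element's own CFL limit, so locally refined regions no longer
    throttle the time-step of the whole mesh."""
    dts = [cfl_timestep(h, c, courant) for h in sizes]
    dt_min = min(dts)
    levels = [int(math.floor(math.log2(dt / dt_min))) for dt in dts]
    return levels, dt_min
```

With a 100x element-size contrast, as in the abstracts above, the coarsest elements land on level 6 and may take steps up to 64 times larger than the global CFL step would allow.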

  4. Construction of a Digital Learning Environment Based on Cloud Computing

    Science.gov (United States)

    Ding, Jihong; Xiong, Caiping; Liu, Huazhong

    2015-01-01

    Constructing the digital learning environment for ubiquitous learning and asynchronous distributed learning has opened up immense amounts of concrete research. However, current digital learning environments do not fully fulfill the expectations on supporting interactive group learning, shared understanding and social construction of knowledge.…

  5. Thermal comfort assessment of a surgical room through computational fluid dynamics using local PMV index.

    Science.gov (United States)

    Rodrigues, Nelson J O; Oliveira, Ricardo F; Teixeira, Senhorinha F C F; Miguel, Alberto Sérgio; Teixeira, José Carlos; Baptista, João S

    2015-01-01

    Studies concerning indoor thermal conditions are very important in defining the satisfactory comfort range in health care facilities. This study focuses on the evaluation of the thermal comfort sensation felt by surgeons and nurses in an orthopaedic surgical room of a Portuguese hospital. Two cases are assessed, with and without the presence of a person. Computational fluid dynamics (CFD) tools were applied to evaluate the predicted mean vote (PMV) index locally. Using average ventilation values to calculate the PMV index does not provide a sufficiently descriptive evaluation of the surgical room's thermal environment. In both cases, surgeons feel the environment slightly hotter than nurses do. The nurses feel a slightly cold sensation under the air supply diffuser, and their neutral comfort zone is located in the air stagnation zones close to the walls, while the surgeons feel the opposite. It was observed that the presence of an additional person in the room increases the PMV index for surgeons and nurses, in line with the empirical knowledge that more persons in a room lead to an increased heat sensation. The clothing used by both groups, as well as the ventilation conditions, should be revised according to the number of persons in the room and the type of activity performed.
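The PMV index used above comes from Fanger's heat-balance model as standardized in ISO 7730. A self-contained transcription of the widely reproduced reference calculation is sketched below; the study's contribution is precisely to evaluate it locally, feeding CFD-resolved air temperature and velocity at each point instead of room-averaged values:

```python
import math

def pmv(ta, tr, vel, rh, met=1.2, clo=0.5, wme=0.0):
    """Fanger predicted mean vote (ISO 7730). ta/tr in deg C, vel in m/s,
    rh in %, met in met units, clo in clo units, wme = external work."""
    pa = rh * 10.0 * math.exp(16.6536 - 4030.183 / (ta + 235.0))  # vapour pressure, Pa
    icl = 0.155 * clo            # clothing insulation, m2.K/W
    m = met * 58.15              # metabolic rate, W/m2
    w = wme * 58.15
    mw = m - w
    fcl = 1.05 + 0.645 * icl if icl > 0.078 else 1.0 + 1.29 * icl
    hcf = 12.1 * math.sqrt(vel)  # forced-convection coefficient
    taa, tra = ta + 273.0, tr + 273.0
    # iterate the clothing surface temperature to heat balance
    tcla = taa + (35.5 - ta) / (3.5 * icl + 0.1)
    p1 = icl * fcl
    p2 = p1 * 3.96
    p3 = p1 * 100.0
    p4 = p1 * taa
    p5 = 308.7 - 0.028 * mw + p2 * (tra / 100.0) ** 4
    xn = tcla / 100.0
    xf = tcla / 50.0
    for _ in range(150):
        xf = (xf + xn) / 2.0
        hcn = 2.38 * abs(100.0 * xf - taa) ** 0.25  # natural convection
        hc = max(hcf, hcn)
        xn = (p5 + p4 * hc - p2 * xf ** 4) / (100.0 + p3 * hc)
        if abs(xn - xf) < 1e-5:
            break
    tcl = 100.0 * xn - 273.0
    # heat loss components
    hl1 = 3.05e-3 * (5733.0 - 6.99 * mw - pa)          # skin diffusion
    hl2 = 0.42 * (mw - 58.15) if mw > 58.15 else 0.0   # sweating
    hl3 = 1.7e-5 * m * (5867.0 - pa)                   # latent respiration
    hl4 = 0.0014 * m * (34.0 - ta)                     # dry respiration
    hl5 = 3.96 * fcl * (xn ** 4 - (tra / 100.0) ** 4)  # radiation
    hl6 = fcl * hc * (tcl - ta)                        # convection
    ts = 0.303 * math.exp(-0.036 * m) + 0.028          # sensation scale factor
    return ts * (mw - hl1 - hl2 - hl3 - hl4 - hl5 - hl6)
```

For example, at 22 °C air and radiant temperature, 0.1 m/s air speed, and 60 % relative humidity, the model predicts a slightly cool sensation (PMV near -0.75). Evaluating it per cell on CFD fields of air temperature and velocity yields the local PMV maps the study relies on.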

  6. Monte Carlo in radiotherapy: experience in a distributed computational environment

    Science.gov (United States)

    Caccia, B.; Mattia, M.; Amati, G.; Andenna, C.; Benassi, M.; D'Angelo, A.; Frustagli, G.; Iaccarino, G.; Occhigrossi, A.; Valentini, S.

    2007-06-01

    New technologies in cancer radiotherapy need a more accurate computation of the dose delivered in the radiotherapy treatment plan, and it is important to integrate sophisticated mathematical models and advanced computing knowledge into the treatment planning (TP) process. We present some results on using Monte Carlo (MC) codes for dose calculation in treatment planning. A distributed computing resource located in the Technologies and Health Department of the Italian National Institute of Health (ISS), along with other computer facilities (CASPUR - Inter-University Consortium for the Application of Super-Computing for Universities and Research), was used to perform a complete MC simulation to compute the dose distribution on phantoms irradiated with a radiotherapy accelerator. Using the BEAMnrc and GEANT4 MC codes we calculated dose distributions on a plain water phantom and an air/water phantom. Experimental and calculated dose values agreed to within ±2% (for depths between 5 mm and 130 mm), both in PDD (percentage depth dose) and in transversal sections of the phantom. We consider these results a first step towards a system suitable for medical physics departments to simulate a complete treatment plan using remote computing facilities for MC simulations.

  7. Cortical basis of communication: local computation, coordination, attention.

    Science.gov (United States)

    Alexandre, Frederic

    2009-03-01

    Human communication emerges from cortical processing, known to be implemented on a regular repetitive neuronal substratum. The supposed genericity of cortical processing has elicited a series of modeling works in computational neuroscience that underline the information flows driven by the cortical circuitry. In the minimalist framework underlying the current theories for the embodiment of cognition, such a generic cortical processing is exploited for the coordination of poles of representation, as is reported in this paper for the case of visual attention. Interestingly, this case emphasizes how abstract internal referents are built to conform to memory requirements. This paper proposes that these referents are the basis for communication in humans, which is firstly a coordination and an attentional procedure with regard to their congeners.

  8. Instantaneous Non-Local Computation of Low T-Depth Quantum Circuits

    DEFF Research Database (Denmark)

    Speelman, Florian

    2016-01-01

    Instantaneous non-local quantum computation requires multiple parties to jointly perform a quantum operation, using pre-shared entanglement and a single round of simultaneous communication. We study this task for its close connection to position-based quantum cryptography, but it also has natural applications in the context of foundations of quantum physics and in distributed computing. The best known general construction for instantaneous non-local quantum computation requires a pre-shared state which is exponentially large in the number of qubits involved in the operation, while efficient… …-depth of a quantum circuit, able to perform non-local computation of quantum circuits with a (poly-)logarithmic number of layers of T gates with quasi-polynomial entanglement. Our proofs combine ideas from blind and delegated quantum computation with the garden-hose model, a combinatorial model of communication…

  9. Computer Aided Design Tools for Extreme Environment Electronics, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project aims to provide Computer Aided Design (CAD) tools for radiation-tolerant, wide-temperature-range digital, analog, mixed-signal, and radio-frequency...

  10. Distributed metadata in a high performance computing environment

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Zhang, Zhenhua; Liu, Xuezhao; Tang, Haiying

    2017-07-11

    A computer-executable method, system, and computer program product for managing metadata in a distributed storage system, wherein the distributed storage system includes one or more burst buffers enabled to operate with a distributed key-value store. The computer-executable method, system, and computer program product comprise: receiving a request for metadata associated with a block of data stored in a first burst buffer of the one or more burst buffers in the distributed storage system, wherein the metadata is associated with a key-value; determining which of the one or more burst buffers stores the requested metadata; and, upon determining that a first burst buffer of the one or more burst buffers stores the requested metadata, locating the key-value in a portion of the distributed key-value store accessible from the first burst buffer.
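The routing step in the claim above, determining which burst buffer holds a requested key before reading it from that buffer's local portion of the key-value store, can be illustrated with a toy hash-partitioned sketch. All class and function names here are hypothetical, not the patented system's API:

```python
import hashlib

class BurstBuffer:
    """Toy stand-in for a burst buffer holding one shard of the key-value store."""
    def __init__(self, name):
        self.name = name
        self.kv = {}   # local portion of the distributed key-value store

def owner(buffers, key):
    """Deterministically map a metadata key to the buffer that stores it."""
    digest = hashlib.sha256(key.encode()).digest()
    return buffers[int.from_bytes(digest[:8], "big") % len(buffers)]

def put_meta(buffers, key, value):
    """Store metadata in the owning buffer's local key-value shard."""
    owner(buffers, key).kv[key] = value

def get_meta(buffers, key):
    """Locate the buffer for the key, then read from its local store."""
    return owner(buffers, key).kv.get(key)
```

Because `owner` is deterministic, any node can answer "which buffer stores this metadata?" without a central directory; a production system would add replication and rebalancing on membership change.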

  11. Ubiquitous fuzzy computing in open ambient intelligence environments

    NARCIS (Netherlands)

    Acampora, G.; Loia, V.

    2006-01-01

    Ambient intelligence (AmI) is considered as the composition of three emergent technologies: ubiquitous computing, ubiquitous communication and intelligent user interfaces. The aim of integration of aforesaid technologies is to make wider the interaction between human beings and information

  12. Computer-based Role Playing Game Environment for Analogue Electronics

    Directory of Open Access Journals (Sweden)

    Lachlan M MacKinnon

    2009-02-01

    Full Text Available An implementation of a design for a game-based virtual learning environment is described. The game was developed for a course in analogue electronics, and the topic is the design of a power supply. This task can be solved in a number of different ways, within certain constraints, giving the students a certain amount of freedom, although the game is designed not to facilitate a trial-and-error approach. The use of storytelling and a virtual gaming environment provides the student with the learning material in an MMORPG environment.

  13. Deriving motion from megavoltage localization cone beam computed tomography scans

    International Nuclear Information System (INIS)

    Alfredo C Siochi, R

    2009-01-01

    Cone beam computed tomography (CBCT) projection data consist of views of a moving point (e.g. diaphragm apex). The point is selected in identification views of extreme motion (two inhale, two exhale). The room coordinates of the extreme points are determined by source-to-view ray tracing intersections. Projected to other views, these points become opposite corners of a motion-bounding box. The view coordinates of the point, relative to the box, are used to interpolate between extreme room coordinates. Along with the views' time stamps, this provides the point's room coordinates as a function of time. CBCT-derived trajectories of a tungsten pin, moving 3 cm cranio-caudally and 1 cm elsewhere, deviate from expected ones by at most 1.06 mm. When deviations from the ideal imaging geometry are considered, mean errors are less than 0.2 mm. While CBCT-derived cranio-caudal positions are insensitive to the choice of identification views, the bounding box determination requires view separations between 15 and 163 deg. Inhale views with the two largest amplitudes should be used, though corrections can account for different amplitudes. The information could be used to calibrate motion surrogates, adaptively define phase triggers immediately before gated radiotherapy and provide phase and amplitude sorting for 4D CBCT.
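The interpolation step described above, mapping the tracked point's view coordinate taken relative to the motion-bounding box onto the room coordinates of the two motion extremes, reduces per axis to a linear map. A sketch with hypothetical helper names:

```python
def interp_room(view_u, box_u_min, box_u_max, room_min, room_max):
    """Linearly map the tracked point's detector coordinate, expressed
    relative to the motion-bounding box in this view, onto the room
    coordinates of the two motion extremes."""
    frac = (view_u - box_u_min) / (box_u_max - box_u_min)
    return room_min + frac * (room_max - room_min)

def trajectory(samples, box, extremes):
    """samples: list of (time_stamp, view_u) from successive projections;
    box = (u_min, u_max) bounding-box edges in the view;
    extremes = (room_min, room_max) room coordinates at the extremes.
    Returns the point's room coordinate as a function of time."""
    u0, u1 = box
    r0, r1 = extremes
    return [(t, interp_room(u, u0, u1, r0, r1)) for t, u in samples]
```

Pairing the interpolated positions with the projections' time stamps yields the diaphragm-apex trajectory used to calibrate motion surrogates or to sort projections for 4D CBCT.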

  14. Context-aware Cloud Computing for Personal Learning Environment

    OpenAIRE

    Chen, Feng; Al-Bayatti, Ali Hilal; Siewe, Francois

    2016-01-01

    Virtual learning means to learn from social interactions in a virtual platform that enables people to study anywhere and at any time. Current Virtual Learning Environments (VLEs) are a range of integrated web based applications to support and enhance the education. Normally, VLEs are institution centric; are owned by the institutions and are designed to support formal learning, which do not support lifelong learning. These limitations led to the research of Personal Learning Environments (PLE...

  15. Computer tomographic localization and lesion size in aphasia

    International Nuclear Information System (INIS)

    Hojo, Kei

    1985-01-01

    Using a microcomputer, the locus and extent of the lesions demonstrated on CT were superimposed on standardized matrices in 127 cases with various types of aphasia, to investigate the relationship between the location of the lesions and the type of aphasia. The main results were as follows. 1. Broca aphasics: The lesions involved rather large areas in the deep structures of the lower part of the precentral gyrus, the insula and the lenticular nucleus. The finding was therefore regarded as being of little localizing value. 2. Wernicke aphasics: At least 70 % of the patients had superior temporal lesions involving Wernicke's area and the subcortical lesions of the superior and middle temporal gyri. The site of the lesion corresponded roughly with that in previous clinico-pathological reports, but lay somewhat deeper. 3. Amnestic aphasics: The size of the lesion was smaller than in any other type, but the lesions were distributed throughout the left hemisphere. Amnestic aphasia was thought to be the least localizable. 4. Conduction aphasics: Most patients had lesions in the posterior speech area involving part of Wernicke's area. In particular, in more than 80 % of the conduction aphasics the lesions were revealed in the supramarginal gyrus and its adjacent deep structures. 5. Global aphasics: In general, the size of the lesion was very large, and 70 % of the global aphasics had extensive lesions involving both Broca's and Wernicke's areas. However, some patients showed small and confined lesions. (author)

  16. Tacit knowledge in action: basic notions of knowledge sharing in computer supported work environments

    OpenAIRE

    Mackenzie Owen, John

    2001-01-01

    An important characteristic of most computer supported work environments is the distribution of work over individuals or teams in different locations. This leads to what we nowadays call 'virtual' environments. In these environments communication between actors is to a large degree mediated, i.e. established through communications media (telephone, fax, computer networks) rather than face-to-face. Unfortunately, mediated communication limits the effectiveness of knowledge exchange in virt...

  17. Transnational (Dis)connection in localizing personal computing in the Netherlands, 1975-1990

    NARCIS (Netherlands)

    Veraart, F.C.A.; Alberts, G.; Oldenziel, R.

    2014-01-01

    Examining the diffusion and domestication of computer technologies in Dutch households and schools during the 1980s and 1990s, this chapter shows that the process was not a simple story of adoption of American models. Instead, many Dutch actors adapted computer technologies to their own local needs,

  18. Bridging context management systems for different types of pervasive computing environments

    NARCIS (Netherlands)

    Hesselman, C.E.W.; Benz, Hartmut; Benz, H.P.; Pawar, P.; Liu, F.; Wegdam, M.; Wibbels, Martin; Broens, T.H.F.; Brok, Jacco

    2008-01-01

    A context management system is a distributed system that enables applications to obtain context information about (mobile) users and forms a key component of any pervasive computing environment. Context management systems are however very environment-specific (e.g., specific for home environments)

  19. The Development and Evaluation of a Computer-Simulated Science Inquiry Environment Using Gamified Elements

    Science.gov (United States)

    Tsai, Fu-Hsing

    2018-01-01

    This study developed a computer-simulated science inquiry environment, called the Science Detective Squad, to engage students in investigating an electricity problem that may happen in daily life. The environment combined the simulation of scientific instruments and a virtual environment, including gamified elements, such as points and a story for…

  20. The Contribution of Local Environments to Competence Creation in Multinational Enterprises

    DEFF Research Database (Denmark)

    Andersson, Ulf; Dellestrand, Henrik; Pedersen, Torben

    2014-01-01

    This paper examines the competence development of subsidiaries in multinational enterprises. We analyze how local subsidiary environments affect the development of technological and business competencies among other units in the multinational enterprise. We test our predictions using data from 2,107 foreign-owned subsidiaries located in seven European countries, by means of structural equation modeling, namely LISREL. By bringing the local environment to the fore, we contribute to the literature on the emergence and determinants of firm-specific advantages. We link local subsidiary environments… …throughout the organization. Thus, we contribute to an enhanced understanding of location as a determinant of the creation of units of competence and centers of excellence within multinational enterprises. In other words, we demonstrate that country-specific advantages are beneficial for competence creation…

  1. Performance Measurements in a High Throughput Computing Environment

    CERN Document Server

    AUTHOR|(CDS)2145966; Gribaudo, Marco

    The IT infrastructures of companies and research centres are implementing new technologies to satisfy the increasing need for computing resources for big data analysis. In this context, resource profiling plays a crucial role in identifying areas where an improvement in utilisation efficiency is needed. In order to deal with the profiling and optimisation of computing resources, two complementary approaches can be adopted: the measurement-based approach and the model-based approach. The measurement-based approach gathers and analyses performance metrics by executing benchmark applications on computing resources. Instead, the model-based approach implies the design and implementation of a model as an abstraction of the real system, selecting only those aspects relevant to the study. This Thesis originates from a project carried out by the author within the CERN IT department. CERN is an international scientific laboratory that conducts fundamental research in the domain of elementary particle physics. The p...

  2. Local Authority Empowerment towards Quality Living Environment for Coastal Reclamation Area

    Directory of Open Access Journals (Sweden)

    Yusup Mohammad

    2016-01-01

    Full Text Available A good urban governance administration system is the key to successful physical planning development. A local authority concentrates on planning administration and executes federal, state, and local policies and strategies. As the lowest level of government, it is the best authority to regulate and monitor the development process within its territory. The significance of a local authority in providing a quality living environment invites academics and professionals to ponder the best urban governance system at the local level. However, there are issues with regard to the financial and technical capacity of local authorities, their legal limitations, and the development instruments adopted in providing urban services for coastal reclamation areas in Malaysia. The aim of this paper is to investigate the capability of local authorities in Malaysia in implementing their functions as drawn by legislation. Hence, this paper examines the roles and functions of a local authority, as the lowest level of government administration agency, in providing urban services, collecting revenue, and safeguarding the physical environment in Malaysia, particularly when dealing with development in a coastal reclamation area. Primary data were gathered through face-to-face interview sessions involving government agencies and stakeholders. Legal documents, policies and development plans were then analysed to support the primary data for further understanding of the issues concerning the capacity of a local authority, especially when providing urban services within its area. The study is expected to provide a new approach for local authorities in Malaysia in providing a quality living environment in terms of development procedure, roles and functions, legal empowerment, and decentralisation of functions, particularly in enhancing current practices at the local level.

  3. On-line computing in a classified environment

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.

    1982-01-01

    Westinghouse Hanford Company (WHC) recently developed a Department of Energy (DOE) approved real-time, on-line computer system to control nuclear material. The system simultaneously processes both classified and unclassified information. Implementation of this system required application of many security techniques. The system has a secure, but user friendly interface. Many software applications protect the integrity of the data base from malevolent or accidental errors. Programming practices ensure the integrity of the computer system software. The audit trail and the reports generation capability record user actions and status of the nuclear material inventory

  4. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs.

    Directory of Open Access Journals (Sweden)

    Graham Cormode

    Full Text Available Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items in a large database of complex objects. Due to the very large scale of the data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance and are suitable for deployment at very large scale. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
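As a minimal illustration of the LSH family evaluated above, the sketch below builds a single SimHash table for cosine similarity (one bit per random hyperplane). A multi-probe variant would additionally visit buckets whose signatures lie within a small Hamming distance of the query's, trading extra probes for fewer tables; all names here are illustrative:

```python
import random

def make_planes(dim, n_bits, seed=0):
    """Random Gaussian hyperplanes defining the hash bits."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(n_bits)]

def signature(vec, planes):
    """SimHash: one bit per hyperplane, the sign of the dot product."""
    return tuple(1 if sum(p * x for p, x in zip(plane, vec)) >= 0 else 0
                 for plane in planes)

def build_index(vectors, planes):
    """Bucket each vector by its signature."""
    index = {}
    for i, v in enumerate(vectors):
        index.setdefault(signature(v, planes), []).append(i)
    return index

def query(index, planes, vec):
    """Exact-bucket probe: return candidate ids sharing the signature.
    Multi-probe LSH would also check signatures one or two bit-flips away."""
    return index.get(signature(vec, planes), [])
```

Similar vectors agree on most hyperplane signs, so they tend to collide in the same bucket, reducing the neighbor search from a full scan to a handful of candidates.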

  5. Local Path Planning of Driverless Car Navigation Based on Jump Point Search Method Under Urban Environment

    Directory of Open Access Journals (Sweden)

    Kaijun Zhou

    2017-09-01

    Full Text Available The Jump Point Search (JPS) algorithm, a fast search method for path planning, is adopted for local path planning of a driverless car in an urban environment. Firstly, a vector Geographic Information System (GIS) map, including Global Positioning System (GPS) position, direction, and lane information, is built for global path planning. Secondly, the GIS map database is utilized in global path planning for the driverless car. Then, the JPS algorithm is adopted to avoid obstacles ahead and to find an optimal local path for the driverless car in the urban environment. Finally, 125 different simulation experiments in the urban environment demonstrate that JPS can successfully find the optimal and safe path, and meanwhile it has lower time complexity compared with the Vector Field Histogram (VFH), Rapidly Exploring Random Tree (RRT), A*, and Probabilistic Roadmaps (PRM) algorithms. Furthermore, JPS is validated as useful in the structured urban environment.
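The speed of JPS comes from its jump primitive: instead of expanding every grid cell as A* does, it races along a straight line until it reaches the goal, a wall, or a node with a forced neighbour. The sketch below covers only the cardinal (horizontal/vertical) case on a boolean grid; full JPS adds an analogous diagonal rule that recursively invokes the cardinal jumps, and helper names here are illustrative:

```python
def jump_cardinal(grid, x, y, dx, dy, goal):
    """Advance in a straight cardinal direction (dx, dy), skipping the nodes
    plain A* would expand, until reaching the goal, a node with a forced
    neighbour (a jump point), or a wall. grid[y][x] is truthy for walls;
    exactly one of dx, dy is nonzero."""
    rows, cols = len(grid), len(grid[0])
    while True:
        x, y = x + dx, y + dy
        if not (0 <= x < cols and 0 <= y < rows) or grid[y][x]:
            return None                        # off the map or into a wall
        if (x, y) == goal:
            return (x, y)
        if dx != 0:                            # horizontal: scan above/below
            for side in (-1, 1):
                ny = y + side
                if 0 <= ny < rows and grid[ny][x] and \
                        0 <= x + dx < cols and not grid[ny][x + dx]:
                    return (x, y)              # forced neighbour found
        else:                                  # vertical: scan left/right
            for side in (-1, 1):
                nx = x + side
                if 0 <= nx < cols and grid[y][nx] and \
                        0 <= y + dy < rows and not grid[y][nx]:
                    pass  # placeholder kept symmetric below
                if 0 <= nx < cols and grid[y][nx] and \
                        0 <= y + dy < rows and not grid[y + dy][nx]:
                    return (x, y)              # forced neighbour found
```

Running along an open corridor, the function returns only at a wall corner (where a forced neighbour lets the path turn), which is why JPS expands far fewer nodes than A* on uniform-cost grids.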

  6. Cloud Computing E-Communication Services in the University Environment

    Science.gov (United States)

    Babin, Ron; Halilovic, Branka

    2017-01-01

    The use of cloud computing services has grown dramatically in post-secondary institutions in the last decade. In particular, universities have been attracted to the low-cost and flexibility of acquiring cloud software services from Google, Microsoft and others, to implement e-mail, calendar and document management and other basic office software.…

  7. Music Teachers' Experiences in One-to-One Computing Environments

    Science.gov (United States)

    Dorfman, Jay

    2016-01-01

    Ubiquitous computing scenarios such as the one-to-one model, in which every student is issued a device that is to be used across all subjects, have increased in popularity and have shown both positive and negative influences on education. Music teachers in schools that adopt one-to-one models may be inadequately equipped to integrate this kind of…

  8. Formal and Informal Learning in a Computer Clubhouse Environment

    Science.gov (United States)

    McDougall, Anne; Lowe, Jenny; Hopkins, Josie

    2004-01-01

    This paper outlines the establishment and running of an after-school Computer Clubhouse, describing aspects of the leadership, mentoring and learning activities undertaken there. Research data has been collected from examination of documents associated with the Clubhouse, interviews with its founders, Director, session leaders and mentors, and…

  9. Computational Photophysics in the Presence of an Environment

    Science.gov (United States)

    Nogueira, Juan J.; González, Leticia

    2018-04-01

    Most processes triggered by ultraviolet (UV) or visible (vis) light in nature take place in complex biological environments. The first step in these photophysical events is the excitation of the absorbing system or chromophore to an electronically excited state. Such an excitation can be monitored by the UV-vis absorption spectrum. A precise calculation of the UV-vis spectrum of a chromophore embedded in an environment is a challenging task that requires the consideration of several ingredients, besides an accurate electronic-structure method for the excited states. Two of the most important are an appropriate description of the interactions between the chromophore and the environment and accounting for the vibrational motion of the whole system. In this contribution, we review the most common theoretical methodologies to describe the environment (including quantum mechanics/continuum and quantum mechanics/molecular mechanics models) and to account for vibrational sampling (including Wigner sampling and molecular dynamics). Further, we illustrate in a series of examples how the lack of these ingredients can lead to a wrong interpretation of the electronic features behind the UV-vis absorption spectrum.
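Of the two vibrational-sampling strategies mentioned above, Wigner sampling is the simpler to sketch: for a harmonic mode, the ground-state Wigner distribution is a product of independent Gaussians in position and momentum, from which the ensemble of geometries used to build the UV-vis band shape is drawn. A one-dimensional illustration in reduced units follows (function names are hypothetical):

```python
import math
import random

HBAR = 1.0  # reduced (atomic-like) units for the sketch

def wigner_sample(mass, omega, n, seed=0):
    """Draw (x, p) pairs from the ground-state Wigner distribution of a 1D
    harmonic oscillator: independent Gaussians in position and momentum."""
    rng = random.Random(seed)
    sx = math.sqrt(HBAR / (2.0 * mass * omega))   # position width
    sp = math.sqrt(mass * HBAR * omega / 2.0)     # momentum width
    return [(rng.gauss(0.0, sx), rng.gauss(0.0, sp)) for _ in range(n)]

def mean_energy(samples, mass, omega):
    """Classical energy averaged over the ensemble; for the exact ground-state
    Wigner distribution it tends to the zero-point energy hbar*omega/2."""
    energies = [p * p / (2.0 * mass) + 0.5 * mass * omega ** 2 * x * x
                for x, p in samples]
    return sum(energies) / len(energies)
```

In practice each normal mode of the chromophore (plus its environment, in a QM/MM setup) is sampled this way, a vertical excitation energy is computed per geometry, and the stick spectra are accumulated into the broadened UV-vis band.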

  10. Multiscale Computing with the Multiscale Modeling Library and Runtime Environment

    NARCIS (Netherlands)

    Borgdorff, J.; Mamonski, M.; Bosak, B.; Groen, D.; Ben Belgacem, M.; Kurowski, K.; Hoekstra, A.G.

    2013-01-01

    We introduce a software tool to simulate multiscale models: the Multiscale Coupling Library and Environment 2 (MUSCLE 2). MUSCLE 2 is a component-based modeling tool inspired by the multiscale modeling and simulation framework, with an easy-to-use API which supports Java, C++, C, and Fortran. We

  11. Evolution of the local environment of lanthanum during simplified SON68 glass leaching

    International Nuclear Information System (INIS)

    Jollivet, P.; Delaye, J.M.; Den Auwer, C.; Simoni, E.

    2007-01-01

    The evolution of the short- and medium-range local environment of lanthanum was determined by L-III-edge X-ray absorption spectroscopy (XAS) during leaching of simplified SON68-type glasses. In glass without phosphorus, lanthanum is found in a silicate environment, and its first coordination sphere comprises eight oxygen atoms at a mean distance of 2.51 angstrom. When this glass was leached at a high renewal rate, the lanthanum local environment was significantly modified: it was present at hydroxy-carbonate and silicate sites with a mean La-O distance of 2.56 angstrom, and the second neighbors consisted of La atoms instead of Si for the glass. Conversely, in the gel formed at low renewal rates, lanthanum was found in a silicate environment similar to that of the glass. In phosphorus-doped glass, lanthanum is found in a phosphate environment, although the Si/P atomic ratio is 20:1. Lanthanum is surrounded by seven oxygen atoms at a mean distance of 2.37 angstrom. When phosphorus-doped glass is leached, regardless of the leaching solution flow rate, the short- and medium-range lanthanum local environment remains almost constant; the most significant change is a 0.05 angstrom increase in the La-O distance. (authors)

  12. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  13. An u-Service Model Based on a Smart Phone for Urban Computing Environments

    Science.gov (United States)

    Cho, Yongyun; Yoe, Hyun

    In urban computing environments, all services should be based on the interaction between humans and the environments around them, which occurs frequently and ordinarily in homes and offices. This paper proposes a u-service model based on a smart phone for urban computing environments. The suggested service model includes a context-aware and personalized service scenario development environment that can instantly describe a user's u-service demand or situation information with smart devices. To this end, the architecture of the suggested service model consists of a graphical service editing environment for smart devices, a u-service platform, and an infrastructure with sensors and WSN/USN. The graphical editor expresses contexts as execution conditions of a new service through an ontology-based context model. The service platform executes the service scenario according to contexts. With the suggested service model, a user in an urban computing environment can quickly and easily create a u-service or new service using smart devices.

  14. Multi-Language Programming Environments for High Performance Java Computing

    OpenAIRE

    Vladimir Getov; Paul Gray; Sava Mintchev; Vaidy Sunderam

    1999-01-01

    Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community’s reliance on established scientific packages. As a consequence, programmers of high‐performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java‐to‐C Interface (JCI) tool which provides ...

  15. A FUNCTIONAL MODEL OF COMPUTER-ORIENTED LEARNING ENVIRONMENT OF A POST-DEGREE PEDAGOGICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    Kateryna R. Kolos

    2014-06-01

    Full Text Available The study substantiates the need for a systematic study of the functioning of a computer-oriented learning environment for post-degree pedagogical education; defines the term "functional model of a computer-oriented learning environment of a post-degree pedagogical education"; and constructs such a functional model in accordance with the functions of management, information and communication technology, academic and administrative staff, and the peculiarities of teachers' training courses.

  16. Error Analysis for RADAR Neighbor Matching Localization in Linear Logarithmic Strength Varying Wi-Fi Environment

    Directory of Open Access Journals (Sweden)

    Mu Zhou

    2014-01-01

    Full Text Available This paper studies the statistical errors for the fingerprint-based RADAR neighbor matching localization with the linearly calibrated reference points (RPs) in logarithmic received signal strength (RSS) varying Wi-Fi environment. To the best of our knowledge, little comprehensive analysis work has appeared on the error performance of neighbor matching localization with respect to the deployment of RPs. However, in order to achieve the efficient and reliable location-based services (LBSs) as well as the ubiquitous context-awareness in Wi-Fi environment, much attention has to be paid to the highly accurate and cost-efficient localization systems. To this end, the statistical errors by the widely used neighbor matching localization are significantly discussed in this paper to examine the inherent mathematical relations between the localization errors and the locations of RPs by using a basic linear logarithmic strength varying model. Furthermore, based on the mathematical demonstrations and some testing results, the closed-form solutions to the statistical errors by RADAR neighbor matching localization can be an effective tool to explore alternative deployment of fingerprint-based neighbor matching localization systems in the future.

  17. Error Analysis for RADAR Neighbor Matching Localization in Linear Logarithmic Strength Varying Wi-Fi Environment

    Science.gov (United States)

    Tian, Zengshan; Xu, Kunjie; Yu, Xiang

    2014-01-01

    This paper studies the statistical errors for the fingerprint-based RADAR neighbor matching localization with the linearly calibrated reference points (RPs) in logarithmic received signal strength (RSS) varying Wi-Fi environment. To the best of our knowledge, little comprehensive analysis work has appeared on the error performance of neighbor matching localization with respect to the deployment of RPs. However, in order to achieve the efficient and reliable location-based services (LBSs) as well as the ubiquitous context-awareness in Wi-Fi environment, much attention has to be paid to the highly accurate and cost-efficient localization systems. To this end, the statistical errors by the widely used neighbor matching localization are significantly discussed in this paper to examine the inherent mathematical relations between the localization errors and the locations of RPs by using a basic linear logarithmic strength varying model. Furthermore, based on the mathematical demonstrations and some testing results, the closed-form solutions to the statistical errors by RADAR neighbor matching localization can be an effective tool to explore alternative deployment of fingerprint-based neighbor matching localization systems in the future. PMID:24683349
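The two records above concern fingerprint-based RADAR-style localization under a log-distance RSS model. The following sketch illustrates the basic mechanism they analyze: an offline radio map is built at linearly placed reference points, and the online phase picks the reference point whose fingerprint is nearest in signal space. The path-loss parameters, access-point layout, and reference-point spacing below are illustrative assumptions, not values from the papers.

```python
import math

# Hypothetical log-distance path-loss parameters (illustrative only)
P0 = -40.0   # RSS in dBm at reference distance d0 = 1 m
N_EXP = 3.0  # path-loss exponent

APS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # assumed access-point positions

def rss(ap, p):
    """Log-distance RSS model: RSS(d) = P0 - 10*n*log10(d)."""
    d = max(math.dist(ap, p), 0.1)  # avoid log of ~0 right at the AP
    return P0 - 10.0 * N_EXP * math.log10(d)

def fingerprint(p):
    return [rss(ap, p) for ap in APS]

# Offline phase: linearly placed reference points (RPs), as in the papers' setup
rps = [(x, 5.0) for x in range(0, 11, 2)]
radio_map = [(rp, fingerprint(rp)) for rp in rps]

def locate(measured):
    """Online phase: nearest neighbor in signal space (RADAR-style matching)."""
    return min(radio_map,
               key=lambda e: sum((a - b) ** 2 for a, b in zip(e[1], measured)))[0]

true_pos = (4.6, 5.0)
est = locate(fingerprint(true_pos))
err = math.dist(true_pos, est)  # localization error the papers' analysis bounds
```

With noise-free measurements the matching snaps to the nearest reference point, so the residual error is governed by the RP spacing, which is exactly the deployment question the error analysis addresses.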

  18. Integrated multi-sensor fusion for mapping and localization in outdoor environments for mobile robots

    Science.gov (United States)

    Emter, Thomas; Petereit, Janko

    2014-05-01

    An integrated multi-sensor fusion framework for localization and mapping for autonomous navigation in unstructured outdoor environments based on extended Kalman filters (EKF) is presented. The sensors for localization include an inertial measurement unit, a GPS, a fiber optic gyroscope, and wheel odometry. Additionally, a 3D LIDAR is used for simultaneous localization and mapping (SLAM). A 3D map is built while concurrently a localization in a so far established 2D map is estimated with the current scan of the LIDAR. Despite the longer run-time of the SLAM algorithm compared to the EKF update, a high update rate is still guaranteed by carefully joining and synchronizing two parallel localization estimators.
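The record above fuses high-rate proprioceptive sensors with low-rate absolute fixes through an EKF. A minimal sketch of that pattern, reduced to the linear 1-D special case with made-up noise values (the paper's actual state vector and noise models are richer), is:

```python
# Minimal 1-D Kalman filter fusing high-rate wheel odometry (prediction)
# with a low-rate GPS fix (correction). This is the linear special case of
# the EKF fusion described above; all noise values are illustrative.
class Kalman1D:
    def __init__(self, x=0.0, p=1.0, q=0.04, r=4.0):
        self.x = x  # position estimate (m)
        self.p = p  # estimate variance
        self.q = q  # odometry (process) noise per step
        self.r = r  # GPS (measurement) noise

    def predict(self, dx):
        """Dead-reckoning step: integrate an odometry increment."""
        self.x += dx
        self.p += self.q  # uncertainty grows while dead reckoning

    def update(self, z):
        """Absolute fix: standard Kalman gain correction."""
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= 1.0 - k  # uncertainty shrinks after the fix

kf = Kalman1D()
for _ in range(10):
    kf.predict(1.05)  # drifting odometry overestimates each ~1 m step
kf.update(10.0)       # GPS reports 10 m; estimate is pulled back toward it
```

The same predict/update split is what lets the paper run the slow SLAM estimator in parallel: fast predictions keep the output rate high between expensive corrections.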

  19. Toward a Computer Vision-based Wayfinding Aid for Blind Persons to Access Unfamiliar Indoor Environments.

    Science.gov (United States)

    Tian, Yingli; Yang, Xiaodong; Yi, Chucai; Arditi, Aries

    2013-04-01

    Independent travel is a well known challenge for blind and visually impaired persons. In this paper, we propose a proof-of-concept computer vision-based wayfinding aid for blind people to independently access unfamiliar indoor environments. In order to find different rooms (e.g. an office, a lab, or a bathroom) and other building amenities (e.g. an exit or an elevator), we incorporate object detection with text recognition. First we develop a robust and efficient algorithm to detect doors, elevators, and cabinets based on their general geometric shape, by combining edges and corners. The algorithm is general enough to handle large intra-class variations of objects with different appearances among different indoor environments, as well as small inter-class differences between different objects such as doors and door-like cabinets. Next, in order to distinguish intra-class objects (e.g. an office door from a bathroom door), we extract and recognize text information associated with the detected objects. For text recognition, we first extract text regions from signs with multiple colors and possibly complex backgrounds, and then apply character localization and topological analysis to filter out background interference. The extracted text is recognized using off-the-shelf optical character recognition (OCR) software products. The object type, orientation, location, and text information are presented to the blind traveler as speech.

  20. Supporting Student Learning in Computer Science Education via the Adaptive Learning Environment ALMA

    Directory of Open Access Journals (Sweden)

    Alexandra Gasparinatou

    2015-10-01

    Full Text Available This study presents the ALMA environment (Adaptive Learning Models from texts and Activities). ALMA supports the processes of learning and assessment via: (1) texts differing in local and global cohesion for students with low, medium, and high background knowledge; (2) activities corresponding to different levels of comprehension which prompt the student to practically implement different text-reading strategies, with the recommended activity sequence adapted to the student’s learning style; (3) an overall framework for informing, guiding, and supporting students in performing the activities; and (4) individualized support and guidance according to student-specific characteristics. ALMA also supports students in distance learning or in blended learning in which students are submitted to face-to-face learning supported by computer technology. The adaptive techniques provided via ALMA are: (a) adaptive presentation and (b) adaptive navigation. Digital learning material, in accordance with the text comprehension model described by Kintsch, was introduced into the ALMA environment. This material can be exploited in either distance or blended learning.

  1. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE Model of Water Resources and Water Environments

    Directory of Open Access Journals (Sweden)

    Guohua Fang

    2016-09-01

    Full Text Available To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and output sources of the National Economic Production Department. Secondly, an extended Social Accounting Matrix (SAM) of Jiangsu province is developed to simulate various scenarios. By changing values of the discharge fees (increased by 50%, 100% and 150%), three scenarios are simulated to examine their influence on the overall economy and each industry. The simulation results show that an increased fee will have a negative impact on Gross Domestic Product (GDP). However, waste water may be effectively controlled. Also, this study demonstrates that along with the economic costs, the increase of the discharge fee will lead to the upgrading of industrial structures from a situation of heavy pollution to one of light pollution which is beneficial to the sustainable development of the economy and the protection of the environment.

  2. Localization of Outdoor Mobile Robots Using Curb Features in Urban Road Environments

    Directory of Open Access Journals (Sweden)

    Hyunsuk Lee

    2014-01-01

    Full Text Available Urban road environments that have pavement and curb are characterized as semistructured road environments. In semistructured road environments, the curb provides useful information for robot navigation. In this paper, we present a practical localization method for outdoor mobile robots using the curb features in semistructured road environments. The curb features are especially useful in urban environment, where the GPS failures take place frequently. A curb extraction is conducted on the basis of the Kernel Fisher Discriminant Analysis (KFDA) to minimize false detection. We adopt the Extended Kalman Filter (EKF) to combine the curb information with odometry and Differential Global Positioning System (DGPS). The uncertainty models for the sensors are quantitatively analyzed to provide a practical solution.

  3. A Solid-State NMR Experiment: Analysis of Local Structural Environments in Phosphate Glasses

    Science.gov (United States)

    Anderson, Stanley E.; Saiki, David; Eckert, Hellmut; Meise-Gresch, Karin

    2004-01-01

    An experiment that can be used to directly study the local chemical environments of phosphorus in solid amorphous materials is demonstrated. The experiment aims at familiarizing the students of chemistry with the principles of solid-state NMR, by having them synthesize a simple phosphate glass, and making them observe the (super 31)P NMR spectrum,…

  4. The amount of natural radionuclides in the individual parts of environment in the locality Jahodna

    International Nuclear Information System (INIS)

    Cipakova, A.; Vrabel, V.

    2008-01-01

    In this study we have investigated and evaluated the amount of K-40, Ra-226, Th-232, U-238 as well as total alpha and beta activity in individual parts of environment, i.e. soil, plant, water and sediment. The locality Jahodna was a studied one. This is a perspective source of uranium ore in the Slovak Republic. (authors)

  5. Incorporating Informal Learning Environments and Local Fossil Specimens in Earth Science Classrooms: A Recipe for Success

    Science.gov (United States)

    Clary, Renee M.; Wandersee, James H.

    2009-01-01

    In an online graduate paleontology course taken by practicing Earth Science teachers, we designed an investigation using teachers' local informal educational environments. Teachers (N = 28) were responsible for photographing, describing, and integrating fossil specimens from two informal sites into a paleoenvironmental analysis of the landscape in…

  6. Environment mapping and localization with an uncontrolled swarm of ultrasound sensor motes

    NARCIS (Netherlands)

    Duisterwinkel, E.; Demi, L.; Dubbelman, G.; Talnishnikh, E.; Wörtche, H.J.; Bergmans, J.W.M.

    2014-01-01

    A method is presented in which a (large) swarm of sensor motes performs simple ultrasonic ranging measurements. The method makes it possible to localize the motes within the swarm and, at the same time, map the environment which the swarm has traversed. The motes float passively uncontrolled through the

  7. New computer and communications environments for light armored vehicles

    Science.gov (United States)

    Rapanotti, John L.; Palmarini, Marc; Dumont, Marc

    2002-08-01

    Light Armoured Vehicles (LAVs) are being developed to meet the modern requirements of rapid deployment and operations other than war. To achieve these requirements, passive armour is minimized and survivability depends more on sensors, computers and countermeasures to detect and avoid threats. The performance, reliability and, ultimately, the cost of these components will be determined by the trends in computing and communications. These trends and the potential impact on DAS (Defensive Aids Suite) development were investigated and are reported in this paper. Vehicle performance is affected by communication with other vehicles and other ISTAR (Intelligence, Surveillance, Target Acquisition and Reconnaissance) battlefield assets. This investigation includes the networking technology Jini, developed by Sun Microsystems, which can be used to interface the vehicle to the ISTAR network. VxWorks, by Wind River Systems, is a real-time operating system designed for military systems and compatible with Jini. Other technologies affecting computer hardware development include dynamic reconfiguration, hot swap, alternate pathing, CompactPCI, and Fibre Channel serial communication. To achieve the necessary performance at reasonable cost, and over the long service life of the vehicle, a DAS should have two essential features. A "fitted for, but not fitted with" approach will provide the necessary rapid deployment without a need to equip the entire fleet. With an expected vehicle service life of 50 years, 5-year technology upgrades can be used to maintain vehicle performance over the entire service life. A federation of modules, instead of integrated fused sensors, will provide the capability for incremental upgrades and mission configurability. A plug-and-play capability can be used for both hardware and expendables.

  8. Understanding the Offender/Environment Dynamic for Computer Crimes

    DEFF Research Database (Denmark)

    Willison, Robert Andrew

    2005-01-01

    practices by possibly highlighting new areas for safeguard implementation. To help facilitate a greater understanding of the offender/environment dynamic, this paper assesses the feasibility of applying criminological theory to the IS security context. More specifically, three theories are advanced, which focus on the offender's behaviour in a criminal setting. Drawing on an account of the Barings Bank collapse, events highlighted in the case study are used to assess whether concepts central to the theories are supported by the data. It is noted that while one of the theories is to be found wanting in terms of conceptual...

  9. Shell stability analysis in a computer aided engineering (CAE) environment

    Science.gov (United States)

    Arbocz, J.; Hol, J. M. A. M.

    1993-01-01

    The development of 'DISDECO', the Delft Interactive Shell DEsign COde is described. The purpose of this project is to make the accumulated theoretical, numerical and practical knowledge of the last 25 years or so readily accessible to users interested in the analysis of buckling sensitive structures. With this open ended, hierarchical, interactive computer code the user can access from his workstation successively programs of increasing complexity. The computational modules currently operational in DISDECO provide the prospective user with facilities to calculate the critical buckling loads of stiffened anisotropic shells under combined loading, to investigate the effects the various types of boundary conditions will have on the critical load, and to get a complete picture of the degrading effects the different shapes of possible initial imperfections might cause, all in one interactive session. Once a design is finalized, its collapse load can be verified by running a large refined model remotely from behind the workstation with one of the current generation 2-dimensional codes, with advanced capabilities to handle both geometric and material nonlinearities.

  10. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment.

    Science.gov (United States)

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-12-01

    Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. In this work, we accelerated the Feldkamp-Davis-Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function to aggregate those partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. Speedup of reconstruction time is found to be roughly linear with the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10(-7). Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed
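The record above decomposes FDK reconstruction into Map (filter and backproject a subset of projections) and Reduce (sum partial volumes). The data flow works because backprojection is linear in the projections. A toy skeleton of that decomposition, with the actual filtering/backprojection replaced by placeholder arithmetic on a 1-D "volume", is:

```python
from functools import reduce

N_VOX = 8  # voxels in a toy 1-D "volume"

def map_subset(projections):
    """Map: backproject one subset of projections into a partial volume."""
    partial = [0.0] * N_VOX
    for p in projections:
        for i in range(N_VOX):
            partial[i] += p / N_VOX  # stand-in for filtered backprojection
    return partial

def reduce_volumes(a, b):
    """Reduce: aggregate partial backprojections voxel-by-voxel."""
    return [x + y for x, y in zip(a, b)]

projections = list(range(1, 13))                 # 12 toy projection values
subsets = [projections[i::4] for i in range(4)]  # 4 "mapper" shards
volume = reduce(reduce_volumes, map(map_subset, subsets))
```

Because the per-subset work is independent, a MapReduce runtime like Hadoop can run the `map_subset` calls on separate nodes, which is where the roughly linear speedup reported above comes from.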

  11. Tablet computers and eBooks. Unlocking the potential for personal learning environments?

    NARCIS (Netherlands)

    Kalz, Marco

    2012-01-01

    Kalz, M. (2012, 9 May). Tablet computers and eBooks. Unlocking the potential for personal learning environments? Invited presentation during the annual conference of the European Association for Distance Learning (EADL), Noordwijkerhout, The Netherlands.

  12. Deception Detection in a Computer-Mediated Environment: Gender, Trust, and Training Issues

    National Research Council Canada - National Science Library

    Dziubinski, Monica

    2003-01-01

    .... This research draws on communication and deception literature to develop a conceptual model proposing relationships between deception detection abilities in a computer-mediated environment, gender, trust, and training...

  13. A Comparative Study of Load Balancing Algorithms in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    Cloud Computing is a new trend emerging in the IT environment with huge requirements of infrastructure and resources. Load Balancing is an important aspect of the cloud computing environment. An efficient load balancing scheme ensures efficient resource utilization by provisioning resources to cloud users on an on-demand basis in a pay-as-you-go manner. Load Balancing may even support prioritizing users by applying appropriate scheduling criteria. This paper presents various load balancing schemes in differ...
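Surveys like the one above typically contrast static schemes (e.g. round-robin) with dynamic ones that consult current load. The sketch below compares the two on a fixed set of task costs; the three-VM setup and the costs are illustrative assumptions, not data from the paper:

```python
import heapq
import itertools

N_VMS = 3
tasks = [8, 5, 1, 1, 1, 1]  # hypothetical task costs

def round_robin(costs):
    """Static scheme: assign tasks cyclically, ignoring current load."""
    loads = [0] * N_VMS
    for vm, c in zip(itertools.cycle(range(N_VMS)), costs):
        loads[vm] += c
    return loads

def least_loaded(costs):
    """Dynamic scheme: always give the next task to the lightest VM."""
    heap = [(0, vm) for vm in range(N_VMS)]  # (current load, vm id)
    for c in costs:
        load, vm = heapq.heappop(heap)
        heapq.heappush(heap, (load + c, vm))
    return sorted(load for load, _ in heap)

rr = round_robin(tasks)   # makespan = max load across VMs
ll = least_loaded(tasks)
```

On this workload the dynamic scheme yields a lower makespan (8 vs. 9) because it keeps the heavy task from sharing a VM with later arrivals, which is the kind of trade-off such comparative studies quantify.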

  14. Analysis of computational complexity for HT-based fingerprint alignment algorithms on java card environment

    CSIR Research Space (South Africa)

    Mlambo, CS

    2015-01-01

    Full Text Available In this paper, implementations of three Hough Transform based fingerprint alignment algorithms are analyzed with respect to time complexity on Java Card environment. Three algorithms are: Local Match Based Approach (LMBA), Discretized Rotation Based...

  15. A synthetic computational environment: To control the spread of respiratory infections in a virtual university

    Science.gov (United States)

    Ge, Yuanzheng; Chen, Bin; Liu, Liang; Qiu, Xiaogang; Song, Hongbin; Wang, Yong

    2018-02-01

    Individual-based computational environment provides an effective solution to study complex social events by reconstructing scenarios. Challenges remain in reconstructing the virtual scenarios and reproducing the complex evolution. In this paper, we propose a framework to reconstruct a synthetic computational environment, reproduce the epidemic outbreak, and evaluate management interventions in a virtual university. The reconstructed computational environment includes 4 fundamental components: the synthetic population, behavior algorithms, multiple social networks, and geographic campus environment. In the virtual university, influenza H1N1 transmission experiments are conducted, and gradually enhanced interventions are evaluated and compared quantitatively. The experiment results indicate that the reconstructed virtual environment provides a solution to reproduce complex emergencies and evaluate policies to be executed in the real world.

  16. Bridging Theory and Practice: Developing Guidelines to Facilitate the Design of Computer-based Learning Environments

    Directory of Open Access Journals (Sweden)

    Lisa D. Young

    2003-10-01

    Full Text Available Abstract. The design of computer-based learning environments has undergone a paradigm shift; moving students away from instruction that was considered to promote technical rationality grounded in objectivism, to the application of computers to create cognitive tools utilized in constructivist environments. The goal of the resulting computer-based learning environment design principles is to have students learn with technology, rather than from technology. This paper reviews the general constructivist theory that has guided the development of these environments, and offers suggestions for the adaptation of modest, generic guidelines, not mandated principles, that can be flexibly applied and allow for the expression of true constructivist ideals in online learning environments.

  17. Perspectives on Emerging/Novel Computing Paradigms and Future Aerospace Workforce Environments

    Science.gov (United States)

    Noor, Ahmed K.

    2003-01-01

    The accelerating pace of the computing technology development shows no signs of abating. Computing power reaching 100 Tflop/s is likely to be reached by 2004 and Pflop/s (10(exp 15) Flop/s) by 2007. The fundamental physical limits of computation, including information storage limits, communication limits and computation rate limits will likely be reached by the middle of the present millennium. To overcome these limits, novel technologies and new computing paradigms will be developed. An attempt is made in this overview to put the diverse activities related to new computing-paradigms in perspective and to set the stage for the succeeding presentations. The presentation is divided into five parts. In the first part, a brief historical account is given of development of computer and networking technologies. The second part provides brief overviews of the three emerging computing paradigms grid, ubiquitous and autonomic computing. The third part lists future computing alternatives and the characteristics of future computing environment. The fourth part describes future aerospace workforce research, learning and design environments. The fifth part lists the objectives of the workshop and some of the sources of information on future computing paradigms.

  18. Computer simulations of polymers in a confined environment

    International Nuclear Information System (INIS)

    Sikorski, Andrzej; Romiszowski, Piotr

    2007-01-01

    A coarse-grained model of star-branched polymers confined in a slit formed by two parallel impenetrable surfaces, which were attractive for polymer segments, was developed and studied. The model chains were regular stars consisting of f = 3 branches of equal length. The flexible chains were constructed of united atoms (segments) and were restricted to vertices of a simple cubic lattice. Good solvent conditions were modelled and, thus, the macromolecules interacted only with the excluded volume. The properties of the model chains were determined by means of Monte Carlo simulations with a sampling algorithm based on the local changes of conformation of the chains. It appeared that the strongly adsorbed chains located in slits of appropriate width could swap between both confining surfaces. The influence of the chain length, width of the slit and the temperature on the frequency of such jumps was studied. The mechanism of the chain motion is also discussed

  19. Automated linear regression tools improve RSSI WSN localization in multipath indoor environment

    Directory of Open Access Journals (Sweden)

    Laermans Eric

    2011-01-01

    Full Text Available Abstract Received signal strength indication (RSSI)-based localization is emerging in wireless sensor networks (WSNs). Localization algorithms need to include the physical and hardware limitations of RSSI measurements in order to give more accurate results in dynamic real-life indoor environments. In this study, we use the Interdisciplinary Institute for Broadband Technology real-life test bed and present an automated method to optimize and calibrate the experimental data before offering them to a positioning engine. In a preprocessing localization step, we introduce a new method to provide bounds for the range, thereby further improving the accuracy of our simple and fast 2D localization algorithm based on corrected distance circles. A maximum likelihood algorithm with a mean square error cost function has a higher position error median than our algorithm. Our experiments further show that the complete proposed algorithm eliminates outliers and avoids any manual calibration procedure.
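The automated calibration the record above describes amounts to fitting the log-distance path-loss model RSSI(d) = A - 10·n·log10(d) to measured (distance, RSSI) pairs and then inverting it for ranging. A least-squares sketch with synthetic, hand-noised training data (the paper calibrates per test-bed node from real measurements) is:

```python
import math

# Synthetic (distance in m, RSSI in dBm) calibration pairs with a little
# hand-added noise; illustrative only.
train = [(1, -40.2), (2, -49.5), (4, -58.1), (8, -67.8), (16, -76.9)]

# Ordinary least squares of RSSI against log10(distance)
xs = [math.log10(d) for d, _ in train]
ys = [r for _, r in train]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
A = my - slope * mx   # intercept: RSSI at d0 = 1 m
n = -slope / 10.0     # estimated path-loss exponent

def estimate_range(rssi):
    """Invert the calibrated model to get a distance estimate."""
    return 10 ** ((A - rssi) / (10.0 * n))
```

Range estimates from this model feed the "corrected distance circles" step; the bounds the paper introduces would then clip each estimate to a plausible interval before positioning.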

  20. Service ORiented Computing EnviRonment (SORCER) for Deterministic Global and Stochastic Optimization

    OpenAIRE

    Raghunath, Chaitra

    2015-01-01

    With rapid growth in the complexity of large scale engineering systems, the application of multidisciplinary analysis and design optimization (MDO) in the engineering design process has garnered much attention. MDO addresses the challenge of integrating several different disciplines into the design process. Primary challenges of MDO include computational expense and poor scalability. The introduction of a distributed, collaborative computational environment results in better...

  1. Dynamic Scaffolding of Socially Regulated Learning in a Computer-Based Learning Environment

    Science.gov (United States)

    Molenaar, Inge; Roda, Claudia; van Boxtel, Carla; Sleegers, Peter

    2012-01-01

    The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N=56) are supported with computer-generated scaffolds and students in the control condition (N=54) do not receive scaffolds. The scaffolds are…

  2. Dynamic Scaffolding of Socially Regulated Learning in a Computer-Based Learning Environment

    NARCIS (Netherlands)

    Molenaar, I.; Roda, Claudia; van Boxtel, Carla A.M.; Sleegers, P.J.C.

    2012-01-01

    The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N = 56) are supported with computer-generated scaffolds and students in the control condition (N =

  3. Taiwanese Consumers’ Perceptions of Local and Global Brands: An Investigation in Taiwan Computer Industry

    OpenAIRE

    Hsieh, Ya-Yun

    2010-01-01

    This study aims to investigate how consumers in a newly developed country, Taiwan, perceive local brands and global brands in the computer industry. To access an in-depth understanding and evaluate factors that influence consumers’ assessment of local and global brands, the country-of-origin effect and the association of brand origin are investigated; the effect of consumer ethnocentrism is addressed; and the cultural aspects on collectivism and face concept are examined. The study adopts...

  4. Maintaining Traceability in an Evolving Distributed Computing Environment

    Science.gov (United States)

    Collier, I.; Wartel, R.

    2015-12-01

    The management of risk is fundamental to the operation of any distributed computing infrastructure. Identifying the cause of incidents is essential to prevent them from re-occurring. In addition, it is a goal to contain the impact of an incident while keeping services operational. For response to incidents to be acceptable this needs to be commensurate with the scale of the problem. The minimum level of traceability for distributed computing infrastructure usage is to be able to identify the source of all actions (executables, file transfers, pilot jobs, portal jobs, etc.) and the individual who initiated them. In addition, sufficiently fine-grained controls, such as blocking the originating user and monitoring to detect abnormal behaviour, are necessary for keeping services operational. It is essential to be able to understand the cause and to fix any problems before re-enabling access for the user. The aim is to be able to answer the basic questions who, what, where, and when concerning any incident. This requires retaining all relevant information, including timestamps and the digital identity of the user, sufficient to identify, for each service instance, and for every security event including at least the following: connect, authenticate, authorize (including identity changes) and disconnect. In traditional grid infrastructures (WLCG, EGI, OSG etc.), best practices and procedures for gathering and maintaining the information required to maintain traceability are well established. In particular, sites collect and store information required to ensure traceability of events at their sites. With the increased use of virtualisation and private and public clouds for HEP workloads, established procedures, which are unable to see 'inside' running virtual machines, no longer capture all the information required. Maintaining traceability will at least involve a shift of responsibility from sites to Virtual Organisations (VOs) bringing with it new requirements for their
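The record above lists the minimum fields a traceability record must retain (timestamp, digital identity, service instance, event type) and the minimum event set (connect, authenticate, authorize, disconnect). A minimal sketch of such a record follows; the field names and JSON encoding are illustrative assumptions, not a WLCG/EGI/OSG schema:

```python
import json
import time

# The minimum security-event vocabulary named in the abstract
EVENT_TYPES = {"connect", "authenticate", "authorize", "disconnect"}

def trace_event(service, identity, event, detail=""):
    """Build one traceability record answering who/what/where/when."""
    if event not in EVENT_TYPES:
        raise ValueError(f"unknown security event: {event}")
    return json.dumps({
        "ts": time.time(),     # when
        "identity": identity,  # who (e.g. a certificate DN)
        "service": service,    # where (service instance)
        "event": event,        # what
        "detail": detail,      # e.g. identity changes during authorize
    })

rec = trace_event("ce01.example.org", "/DC=org/CN=jdoe", "authenticate")
```

Retaining such records per service instance is what lets an incident responder reconstruct who did what, where, and when after the fact.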

  5. Local bureaucrats as bricoleurs. The everyday implementation practices of county environment officers in rural Kenya

    Directory of Open Access Journals (Sweden)

    Mikkel Funder

    2015-03-01

    Full Text Available Bricolage in natural resource governance takes place through the interplay of a variety of actors. This article explores the practices of a group whose agency as bricoleurs has received little attention, namely the government officers who represent the state in the everyday management of water, land, forests and other resources across rural Africa. Specifically we examine how local Environment Officers in Taita Taveta County in Kenya go about implementing the national environmental law on the ground, and how they interact with communities in this process. As representatives of “the local state”, the Environment Officers occupy an ambiguous position in which they are expected to implement lofty laws and policies with limited means and in a complex local reality. In response to this they employ three key practices, namely (i working through personal networks, (ii tailoring informal agreements, and (iii delegating public functions and authority to civil society. As a result, the environmental law is to a large extent implemented through a blend of formal and informal rules and governance arrangements, produced through the interplay of the Environment Officers, communities and other local actors.

  6. Local Control of Audio Environment: A Review of Methods and Applications

    Directory of Open Access Journals (Sweden)

    Jussi Kuutti

    2014-02-01

    Full Text Available The concept of a local audio environment is to have sound playback locally restricted such that, ideally, adjacent regions of an indoor or outdoor space could exhibit their own individual audio content without interfering with each other. This would enable people to listen to their content of choice without disturbing others next to them, yet, without any headphones to block conversation. In practice, perfect sound containment in free air cannot be attained, but a local audio environment can still be satisfactorily approximated using directional speakers. Directional speakers may be based on regular audible frequencies or they may employ modulated ultrasound. Planar, parabolic, and array form factors are commonly used. The directivity of a speaker improves as its surface area and sound frequency increase, making these the main design factors for directional audio systems. Even directional speakers radiate some sound outside the main beam, and sound can also reflect from objects. Therefore, directional speaker systems perform best when there is enough ambient noise to mask the leaking sound. Possible areas of application for local audio include information and advertisement audio feed in commercial facilities, guiding and narration in museums and exhibitions, office space personalization, control room messaging, rehabilitation environments, and entertainment audio systems.

  7. Human response to individually controlled micro environment generated with localized chilled beam

    DEFF Research Database (Denmark)

    Uth, Simon C.; Nygaard, Linette; Bolashikov, Zhecho Dimitrov

    2014-01-01

    Indoor environment in a single-office room created by a localised chilled beam with individual control of the primary air flow was studied. Response of 24 human subjects when exposed to the environment generated by the chilled beam was collected via questionnaires under a 2-hour exposure including...... and local thermal sensation reported by the subjects with the two systems. Both systems were equally acceptable. At 26°C the individual control of the localised chilled beam led to higher acceptability of the work environment. At 28°C the acceptability decreased with the two systems. It was not acceptable...... different work tasks at three locations in the room. Response of the subjects to the environment generated with a chilled ceiling combined with mixing air distribution was used for comparison. The air temperature in the room was kept at 26 or 28 °C. Results show no significant difference in the overall...

  8. Massive calculations of electrostatic potentials and structure maps of biopolymers in a distributed computing environment

    International Nuclear Information System (INIS)

    Akishina, T.P.; Ivanov, V.V.; Stepanenko, V.A.

    2013-01-01

    Among the key factors determining the processes of transcription and translation are the distributions of the electrostatic potentials of DNA, RNA and proteins. Calculations of electrostatic distributions and structure maps of biopolymers on computers are time consuming and require large computational resources. We developed the procedures for organization of massive calculations of electrostatic potentials and structure maps for biopolymers in a distributed computing environment (several thousands of cores).

  9. Application of Selective Algorithm for Effective Resource Provisioning in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    The modern-day continued demand for resource-hungry services and applications in the IT sector has led to the development of Cloud computing. The Cloud computing environment involves high-cost infrastructure on the one hand and the need for large-scale computational resources on the other. These resources need to be provisioned (allocation and scheduling) to the end users in the most efficient manner so that the tremendous capabilities of the cloud are utilized effectively and efficiently. In this paper we discuss a selecti...

  10. High performance computing environment for multidimensional image analysis.

    Science.gov (United States)

    Rao, A Ravishankar; Cecchi, Guillermo A; Magnasco, Marcelo

    2007-07-10

    The processing of images acquired through microscopy is a challenging task due to the large size of datasets (several gigabytes) and the fast turnaround time required. If the throughput of the image processing stage is significantly increased, it can have a major impact in microscopy applications. We present a high performance computing (HPC) solution to this problem. This involves decomposing the spatial 3D image into segments that are assigned to unique processors, and matched to the 3D torus architecture of the IBM Blue Gene/L machine. Communication between segments is restricted to the nearest neighbors. When running on a 2 GHz Intel CPU, the task of 3D median filtering on a typical 256 megabyte dataset takes two and a half hours, whereas by using 1024 nodes of Blue Gene, this task can be performed in 18.8 seconds, a 478x speedup. Our parallel solution dramatically improves the performance of image processing, feature extraction and 3D reconstruction tasks. This increased throughput permits biologists to conduct unprecedented large scale experiments with massive datasets.
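    The decomposition the abstract describes, segments assigned to processors with communication restricted to nearest neighbours, can be illustrated with a toy one-dimensional median filter. This is a hedged sketch in plain Python, not the Blue Gene/L implementation; `split_with_halo` and all parameters are invented for illustration:

```python
import statistics

def split_with_halo(data, n_segments, halo):
    """Split a 1D sequence into contiguous segments, each padded with
    `halo` elements borrowed from its neighbours (a 1D stand-in for
    the 3D spatial decomposition in the abstract)."""
    size = len(data) // n_segments
    segments = []
    for i in range(n_segments):
        lo = max(0, i * size - halo)
        hi = min(len(data), (i + 1) * size + halo)
        # (offset of the owned region inside the padded slice, padded slice)
        segments.append((i * size - lo, data[lo:hi]))
    return segments

def median_filter_segment(offset, padded, size, radius):
    """Median-filter only the owned region, using the halo for context."""
    out = []
    for j in range(offset, offset + size):
        lo, hi = max(0, j - radius), min(len(padded), j + radius + 1)
        out.append(statistics.median(padded[lo:hi]))
    return out

data = [5, 1, 9, 2, 8, 3, 7, 4]
result = []
for offset, padded in split_with_halo(data, n_segments=2, halo=1):
    result.extend(median_filter_segment(offset, padded, size=4, radius=1))
# Each segment could run on its own processor; because the halo is as
# wide as the filter radius, the per-segment results match a
# sequential pass over the full array.
```

    Because each segment carries a halo as wide as the filter radius, filtering the segments independently reproduces the sequential result exactly, which is what makes a nearest-neighbour-only communication pattern sufficient.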

  11. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    International Nuclear Information System (INIS)

    Mike Bockelie; Dave Swensen; Martin Denison

    2002-01-01

    This is the fifth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, our efforts have become focused on developing an improved workbench for simulating a gasifier based Vision 21 energyplex. To provide for interoperability of models developed under Vision 21 and other DOE programs, discussions have been held with DOE and other organizations developing plant simulator tools to review the possibility of establishing a common software interface or protocol to use when developing component models. A component model that employs the CCA protocol has successfully been interfaced to our CCA enabled workbench. To investigate the software protocol issue, DOE has selected a gasifier based Vision 21 energyplex configuration for use in testing and evaluating the impacts of different software interface methods. A Memorandum of Understanding with the Cooperative Research Centre for Coal in Sustainable Development (CCSD) in Australia has been completed that will enable collaborative research efforts on gasification issues. Preliminary results have been obtained for a CFD model of a pilot scale, entrained flow gasifier. A paper was presented at the Vision 21 Program Review Meeting at NETL (Morgantown) that summarized our accomplishments for Year One and plans for Year Two and Year Three.

  12. Mathematical Language Development and Talk Types in Computer Supported Collaborative Learning Environments

    Science.gov (United States)

    Symons, Duncan; Pierce, Robyn

    2015-01-01

    In this study we examine the use of cumulative and exploratory talk types in a year 5 computer supported collaborative learning environment. The focus for students in this environment was to participate in mathematical problem solving, with the intention of developing the proficiencies of problem solving and reasoning. Findings suggest that…

  13. Using a Cloud-Based Computing Environment to Support Teacher Training on Common Core Implementation

    Science.gov (United States)

    Robertson, Cory

    2013-01-01

    A cloud-based computing environment, Google Apps for Education (GAFE), has provided the Anaheim City School District (ACSD) a comprehensive and collaborative avenue for creating, sharing, and editing documents, calendars, and social networking communities. With this environment, teachers and district staff at ACSD are able to utilize the deep…

  14. Touch in Computer-Mediated Environments: An Analysis of Online Shoppers' Touch-Interface User Experiences

    Science.gov (United States)

    Chung, Sorim

    2016-01-01

    Over the past few years, one of the most fundamental changes in current computer-mediated environments has been input devices, moving from mouse devices to touch interfaces. However, most studies of online retailing have not considered device environments as retail cues that could influence users' shopping behavior. In this research, I examine the…

  15. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), a HPC class 3000 core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  16. Touch in Computer-Mediated Environments: An Analysis of Online Shoppers’ Touch-Interface User Experiences

    OpenAIRE

    Chung, Sorim

    2016-01-01

    Over the past few years, one of the most fundamental changes in current computer-mediated environments has been input devices, moving from mouse devices to touch interfaces. However, most studies of online retailing have not considered device environments as retail cues that could influence users’ shopping behavior. In this research, I examine the underlying mechanisms between input device environments and shoppers’ decision-making processes. In particular, I investigate the impact of input d...

  17. A high-resolution computational localization method for transcranial magnetic stimulation mapping.

    Science.gov (United States)

    Aonuma, Shinta; Gomez-Tames, Jose; Laakso, Ilkka; Hirata, Akimasa; Takakura, Tomokazu; Tamura, Manabu; Muragaki, Yoshihiro

    2018-05-15

    Transcranial magnetic stimulation (TMS) is used for the mapping of brain motor functions. The complexity of the brain hinders determining the exact localization of the stimulation site using simplified methods (e.g., the region below the center of the TMS coil) or conventional computational approaches. This study aimed to present a high-precision localization method for a specific motor area by synthesizing computed non-uniform current distributions in the brain for multiple sessions of TMS. Peritumoral mapping by TMS was conducted on patients who had intra-axial brain neoplasms located within or close to the motor speech area. The electric field induced by TMS was computed using realistic head models constructed from magnetic resonance images of the patients. A post-processing method was implemented to determine a TMS hotspot by combining the computed electric fields for the coil orientations and positions that delivered high motor-evoked potentials during peritumoral mapping. The method was compared to the stimulation site localized via intraoperative direct brain stimulation and navigated TMS. Four main results were obtained: 1) the dependence of the computed hotspot area on the number of peritumoral measurements was evaluated; 2) the estimated localization of the hand motor area in eight non-affected hemispheres was in good agreement with the position of the so-called "hand-knob"; 3) the estimated hotspot areas were not sensitive to variations in tissue conductivity; and 4) the hand motor areas estimated by this proposal and direct electric stimulation (DES) were in good agreement in the ipsilateral hemisphere of four glioma patients. The TMS localization method was validated by the well-known position of the "hand-knob" in brains for the non-affected hemisphere, and by a hotspot localized via DES during awake craniotomy for the tumor-containing hemisphere.

  18. Image selection as a service for cloud computing environments

    KAUST Repository

    Filepp, Robert

    2010-12-01

    Customers of Cloud Services are expected to choose specific machine images to instantiate in order to host their workloads. Unfortunately very little information is provided to the users to enable them to make intelligent choices. We believe that as the number of images proliferates it will become increasingly difficult for users to decide effectively. Cloud service providers often allow their customers to instantiate standard system images, to modify their instances, and to store images of these customized instances for public or private future use. Storing modified instances as images enables customers to avoid re-provisioning and re-configuration of required resources thereby reducing their future costs. However Cloud service providers generally do not expose details regarding the configurations of the images in a rigorous canonical fashion nor offer services that assist clients in the best target image selection to support client transformation objectives. Rather, they allow customers to enter a free-form description of an image based on the client's best effort. This means in order to find a "best fit" image to instantiate, a human user must review potentially thousands of image descriptions, reading each description to evaluate its suitability as a platform to host their source application. Furthermore, the actual content of the selected image may differ greatly from its description. Finally, even images that have been customized and retained for future use may need additional provisioning and customization to accommodate specific needs. In this paper we propose a service that accumulates image configuration details in a canonical fashion and a further service that employs an algorithm to order images per best fit /least cost in conformance to user-specified policies. These services collectively facilitate workload transformation into enterprise cloud environments.
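    A best-fit/least-cost ordering of candidate images against user-specified policies, as proposed here, might look like the following sketch. The catalogue fields, capability sets, and the two-level sort key are illustrative assumptions, not the paper's actual algorithm:

```python
def rank_images(images, required, preferred):
    """Order candidate images by best fit / least cost: images missing a
    required capability are dropped; the rest are sorted first by how
    many preferred capabilities they lack, then by cost. Every field
    name here is illustrative, not a real cloud provider's schema."""
    eligible = [img for img in images if required <= img["features"]]
    return sorted(eligible,
                  key=lambda img: (len(preferred - img["features"]), img["cost"]))

catalog = [
    {"name": "base-os",   "features": {"os"},               "cost": 1},
    {"name": "web-stack", "features": {"os", "http", "db"}, "cost": 5},
    {"name": "app-ready", "features": {"os", "http"},       "cost": 3},
]
ranked = rank_images(catalog, required={"os", "http"}, preferred={"db"})
print([img["name"] for img in ranked])  # prints ['web-stack', 'app-ready']
```

    The point of the two-level key is that fit dominates cost: a cheaper image that would need extra provisioning to close a capability gap ranks below a costlier one that already matches.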

  19. Urbancontext: A Management Model For Pervasive Environments In User-Oriented Urban Computing

    Directory of Open Access Journals (Sweden)

    Claudia L. Zuniga-Canon

    2014-01-01

    Full Text Available Nowadays, urban computing has gained a lot of interest for guiding the evolution of cities into intelligent environments. These environments are appropriate for individuals' interactions and the changes in their behaviors. These changes require new approaches that allow an understanding of how urban computing systems should be modeled. In this work we present UrbanContext, a new model for the design of urban computing platforms that applies the theory of roles to manage the individual's context in urban environments. The theory of roles helps to understand the individual's behavior within a social environment, allowing urban computing systems to be modeled so that they adapt to individuals' states and their needs. UrbanContext collects data in urban atmospheres and classifies individuals' behaviors according to their changes of role, to optimize social interaction and offer secure services. Likewise, UrbanContext serves as a generic model to provide interoperability, and to facilitate the design, implementation and expansion of urban computing systems.

  20. Halo assembly bias and the tidal anisotropy of the local halo environment

    Science.gov (United States)

    Paranjape, Aseem; Hahn, Oliver; Sheth, Ravi K.

    2018-05-01

    We study the role of the local tidal environment in determining the assembly bias of dark matter haloes. Previous results suggest that the anisotropy of a halo's environment (i.e. whether it lies in a filament or in a more isotropic region) can play a significant role in determining the eventual mass and age of the halo. We statistically isolate this effect, using correlations between the large-scale and small-scale environments of simulated haloes at z = 0 with masses between 10^11.6 ≲ (m / h^-1 M⊙) ≲ 10^14.9. We probe the large-scale environment, using a novel halo-by-halo estimator of linear bias. For the small-scale environment, we identify a variable α_R that captures the tidal anisotropy in a region of radius R = 4R_200b around the halo and correlates strongly with halo bias at fixed mass. Segregating haloes by α_R reveals two distinct populations. Haloes in highly isotropic local environments (α_R ≲ 0.2) behave as expected from the simplest, spherically averaged analytical models of structure formation, showing a negative correlation between their concentration and large-scale bias at all masses. In contrast, haloes in anisotropic, filament-like environments (α_R ≳ 0.5) tend to show a positive correlation between bias and concentration at any mass. Our multiscale analysis cleanly demonstrates how the overall assembly bias trend across halo mass emerges as an average over these different halo populations, and provides valuable insights towards building analytical models that correctly incorporate assembly bias. We also discuss potential implications for the nature and detectability of galaxy assembly bias.

  1. Initialization and Restart in Stochastic Local Search: Computing a Most Probable Explanation in Bayesian Networks

    Science.gov (United States)

    Mengshoel, Ole J.; Wilkins, David C.; Roth, Dan

    2010-01-01

    For hard computational problems, stochastic local search has proven to be a competitive approach to finding optimal or approximately optimal problem solutions. Two key research questions for stochastic local search algorithms are: Which algorithms are effective for initialization? When should the search process be restarted? In the present work we investigate these research questions in the context of approximate computation of most probable explanations (MPEs) in Bayesian networks (BNs). We introduce a novel approach, based on the Viterbi algorithm, to explanation initialization in BNs. While the Viterbi algorithm works on sequences and trees, our approach works on BNs with arbitrary topologies. We also give a novel formalization of stochastic local search, with focus on initialization and restart, using probability theory and mixture models. Experimentally, we apply our methods to the problem of MPE computation, using a stochastic local search algorithm known as Stochastic Greedy Search. By carefully optimizing both initialization and restart, we reduce the MPE search time for application BNs by several orders of magnitude compared to using uniform at random initialization without restart. On several BNs from applications, the performance of Stochastic Greedy Search is competitive with clique tree clustering, a state-of-the-art exact algorithm used for MPE computation in BNs.
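    The initialization-and-restart structure discussed in this abstract can be sketched generically. This is a toy bit-vector maximizer; the objective, the uniform-at-random initializer, and the greedy acceptance rule stand in for the paper's Viterbi-based initialization and Stochastic Greedy Search, which are not reproduced here:

```python
import random

def stochastic_local_search(score, n_vars, init, max_flips, n_restarts, seed=0):
    """Generic bit-vector stochastic local search with restarts.
    `score` maps a tuple of bits to a number to maximize; `init`
    produces a fresh starting assignment for each restart."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(n_restarts):                 # restart loop
        state = init(rng)
        for _ in range(max_flips):              # local search loop
            i = rng.randrange(n_vars)           # pick a variable to flip
            cand = state[:i] + (1 - state[i],) + state[i + 1:]
            if score(cand) >= score(state):     # greedy uphill move
                state = cand
        s = score(state)
        if s > best_score:                      # keep the best restart
            best, best_score = state, s
    return best, best_score

# Toy objective with a unique optimum at the all-ones assignment.
target = (1, 1, 1, 1, 1, 1)
score = lambda x: -sum(a != b for a, b in zip(x, target))
uniform_init = lambda rng: tuple(rng.randrange(2) for _ in target)

best, s = stochastic_local_search(score, len(target), uniform_init,
                                  max_flips=200, n_restarts=5)
print(best, s)
```

    The two research questions from the abstract map directly onto the two knobs here: swapping `uniform_init` for a smarter initializer, and tuning `n_restarts` against `max_flips`, are exactly the initialization and restart choices the paper optimizes.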

  2. The status of computing and means of local and external networking at JINR

    Energy Technology Data Exchange (ETDEWEB)

    Dorokhin, A T; Shirikov, V P

    1996-12-31

    The goal of this report is to present a view of the current state of computer support for the different physics research activities at JINR. The JINR network and its applications are considered. Trends in local networks and their connectivity with global networks are discussed. 3 refs.

  3. Fault-tolerant quantum computation for local non-Markovian noise

    International Nuclear Information System (INIS)

    Terhal, Barbara M.; Burkard, Guido

    2005-01-01

    We derive a threshold result for fault-tolerant quantum computation for local non-Markovian noise models. The role of error amplitude in our analysis is played by the product of the elementary gate time t_0 and the spectral width of the interaction Hamiltonian between system and bath. We discuss extensions of our model and the applicability of our analysis.

  4. Computed tomography findings after radiofrequency ablation in locally advanced pancreatic cancer

    NARCIS (Netherlands)

    Rombouts, Steffi J. E.; Derksen, Tyche C.; Nio, Chung Y.; van Hillegersberg, Richard; van Santvoort, Hjalmar C.; Walma, Marieke S.; Molenaar, Izaak Q.; van Leeuwen, Maarten S.

    2018-01-01

    The purpose of the study was to provide a systematic evaluation of the computed tomography (CT) findings after radiofrequency ablation (RFA) in locally advanced pancreatic cancer (LAPC). Eighteen patients with intra-operative RFA-treated LAPC were included in a prospective case series. All CT-scans

  5. Local and Long Distance Computer Networking for Science Classrooms. Technical Report No. 43.

    Science.gov (United States)

    Newman, Denis

    This report describes Earth Lab, a project which is demonstrating new ways of using computers for upper-elementary and middle-school science instruction, and finding ways to integrate local-area and telecommunications networks. The discussion covers software, classroom activities, formative research on communications networks, and integration of…

  6. A local computer network for the experimental data acquisition at BESSY

    International Nuclear Information System (INIS)

    Buchholz, W.

    1984-01-01

    For the users of the Berlin dedicated electron storage ring for synchrotron radiation (BESSY) a local computer network has been installed. The system is designed primarily for data acquisition and offers the users a generous hardware provision combined with maximum software flexibility.

  7. Enhancing Security by System-Level Virtualization in Cloud Computing Environments

    Science.gov (United States)

    Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei

    Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run the variety of operating systems needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, a definition of cloud computing is given, and then the service and deployment models are introduced. An analysis of security issues and challenges in the implementation of cloud computing is presented. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.

  8. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

    Full Text Available In order to verify data integrity in a mobile multicloud computing environment, a MMCDIV (mobile multicloud data integrity verification) scheme is proposed. First, the computability and nondegeneracy of verification are obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced based on HVR (Homomorphic Verifiable Response) with random masking and the sMHT (sequence-enforced Merkle hash tree) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified by lightweight computing and low data transmission. The scheme addresses the limited communication and computing power of mobile devices, supports dynamic data operations in a mobile multicloud environment, and allows data integrity to be verified without using the direct source file block. Experimental results also demonstrate that this scheme can achieve a lower cost of computing and communications.
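    The sMHT component builds on the standard Merkle hash tree, whose core property (any change to any block changes the root) can be shown in a few lines. This is a generic SHA-256 sketch; the paper's sequence-enforced variant adds ordering metadata, and the BLS/HVR machinery is not modelled here:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    """Compute the Merkle root of a list of data blocks: hash the
    leaves, then repeatedly hash adjacent pairs (duplicating the last
    node on odd-sized levels) until one root remains. A verifier
    holding only the root can detect tampering with any block."""
    level = [h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                 # odd level: duplicate last node
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

blocks = [b"block-0", b"block-1", b"block-2", b"block-3"]
root = merkle_root(blocks)
tampered = merkle_root([b"block-0", b"block-X", b"block-2", b"block-3"])
print(root != tampered)  # prints True
```

    This is what lets a resource-constrained mobile device check integrity cheaply: it stores one fixed-size root rather than the file, and a prover who alters any block cannot reproduce it.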

  9. Applications integration in a hybrid cloud computing environment: modelling and platform

    Science.gov (United States)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application services providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed and a cross-computing environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  10. Integrating the environment in local strategic planning : Guidelines (Case of Morocco)

    Science.gov (United States)

    Benbrahim, Hafsa

    2018-05-01

    Since 2010, an advanced regionalization project has been initiated by Morocco, which plans to consolidate the processes of decentralization and deconcentration by extending the powers of the regions and other local authorities. This project, institutionalized in the 2011 Constitution, defines the territorial organization of the Kingdom and reinforces decentralization according to a model of advanced regionalization. Through advanced regionalization, Morocco aims at integrated and sustainable development in economic, social, cultural and environmental terms, through the development of the potential and resources of each region. However, in order to honor this commitment of advanced regionalization, local authorities must be assisted in adopting a local strategic planning approach, allowing them to develop territorial plans for sustainable development in accordance with the national legal framework, specifically the Framework law 99-12, and international commitments in terms of environmental protection. This research deals with the issue of environmental governance in relation to the role and duties of local authorities. Thus, the main goal of our study is to present the guidelines to be followed by the local authorities to improve the quality of the environment integration process in the local strategic planning with the aim of putting it in a perspective of sustainable development.

  11. GLOBAL-LOCAL ENVIRONMENT CERTIFICATION AT FIVE STAR HOTELS IN TOURISM AREA OF NUSA DUA, BALI

    Directory of Open Access Journals (Sweden)

    Ni Gst Nym Suci Murni

    2014-06-01

    Full Text Available The research aims to examine the various forms of environment certification, the ideology behind the practice of green tourism (a global award) and Tri Hita Karana (a local award), and the implications of environmental practice at five star hotels in the Nusa Dua tourism area. The data of the research were assessed using postmodern critical theory (the theory of discourse regarding power/knowledge), hegemony theory, practice theory, and the theory of deep/shallow ecology. The method used in this cultural study is qualitative, with data collected through direct observation, in-depth interviews, and related documentation. The sample comprised 6 five star hotels which practise the green award, out of 14 established five star hotels (some hotels are not in full operation). The results showed that (1) there are some variations of environment practice at five star hotels, (2) the ideology working behind these practices can be seen in the global ideology, in the form of sustainable development deriving green tourism, and the local ideology, in the form of Tri Hita Karana (THK) used in the THK award, and (3) there are implications of global-local environment practice in the tourism area and its surroundings.

  12. Localized Corrosion Behavior of Type 304SS with a Silica Layer Under Atmospheric Corrosion Environments

    International Nuclear Information System (INIS)

    E. Tada; G.S. Frankel

    2006-01-01

    The U.S. Department of Energy (DOE) has proposed a potential repository for spent nuclear fuel and high-level radioactive waste at the Yucca Mountain site in Nevada. [1] The temperature could be high on the waste packages, and it is possible that dripping water or humidity could interact with rock dust particulate to form a thin electrolyte layer with concentrated ionic species. Under these conditions, it is possible that highly corrosion-resistant alloys (CRAs) used as packages to dispose of the nuclear waste could suffer localized corrosion. Therefore, to better understand the long-term corrosion performance of CRAs in the repository, it is important to investigate localized corrosion under a simulated repository environment. We measured the open circuit potential (OCP) and galvanic current (i_g) for silica-coated Type 304SS during drying of salt solutions under controlled RH environments to clarify the effect of a silica layer, as a dust-layer simulant, on localized corrosion under atmospheric environments. Type 304SS was used as a relatively susceptible model CRA instead of the much more corrosion-resistant alloys, such as Alloy 22, that are being considered as waste package materials

  13. Parallel sort with a ranged, partitioned key-value store in a high performance computing environment

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron; Poole, Stephen W.

    2016-01-26

    Improved sorting techniques are provided that perform a parallel sort using a ranged, partitioned key-value store in a high performance computing (HPC) environment. A plurality of input data files comprising unsorted key-value data in a partitioned key-value store are sorted. The partitioned key-value store comprises a range server for each of a plurality of ranges. Each input data file has an associated reader thread. Each reader thread reads the unsorted key-value data in the corresponding input data file and performs a local sort of the unsorted key-value data to generate sorted key-value data. A plurality of sorted, ranged subsets of each of the sorted key-value data are generated based on the plurality of ranges. Each sorted, ranged subset corresponds to a given one of the ranges and is provided to one of the range servers corresponding to the range of the sorted, ranged subset. Each range server sorts the received sorted, ranged subsets and provides a sorted range. A plurality of the sorted ranges are concatenated to obtain a globally sorted result.
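The pipeline this record describes (local sort per reader, range-based routing, per-range sort at the range servers, final concatenation) can be sketched in a few lines. The code below is a simplified single-process illustration of the idea, not the patented HPC implementation; `parallel_range_sort` and its arguments are hypothetical names.

```python
from bisect import bisect_right

def parallel_range_sort(input_files, boundaries):
    """Sketch of a ranged, partitioned sort.

    input_files: list of lists of (key, value) pairs (stand-ins for files).
    boundaries: sorted upper bounds of the key ranges, one per range server.
    """
    num_ranges = len(boundaries)
    # Each "range server" accumulates the sorted, ranged subsets routed to it.
    range_servers = [[] for _ in range(num_ranges)]

    # Reader threads: locally sort each input file, then split it by range.
    for records in input_files:
        local = sorted(records)                   # local sort of one file
        for kv in local:
            r = bisect_right(boundaries, kv[0])   # route key to its range
            range_servers[min(r, num_ranges - 1)].append(kv)

    # Each range server sorts what it received; ranges are concatenated
    # in order to obtain the globally sorted result.
    return [kv for server in range_servers for kv in sorted(server)]
```

Because the ranges themselves are ordered, no final merge pass is needed: concatenating the per-range outputs already yields a globally sorted sequence.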

  14. Integrating CAD modules in a PACS environment using a wide computing infrastructure.

    Science.gov (United States)

    Suárez-Cuenca, Jorge J; Tilve, Amara; López, Ricardo; Ferro, Gonzalo; Quiles, Javier; Souto, Miguel

    2017-04-01

    The aim of this paper is to describe a project designed to achieve a total integration of different CAD algorithms into the PACS environment by using a wide computing infrastructure. The goal is to build a system for the entire region of Galicia, Spain, that makes CAD accessible to multiple hospitals employing different PACSs and clinical workstations. The new CAD model seeks to connect different devices (CAD systems, acquisition modalities, workstations and PACS) by means of networking based on a platform that will offer different CAD services. This paper describes some aspects related to the health services of the region where the project was developed, the CAD algorithms that were either employed or selected for inclusion in the project, and several technical aspects and results. We have built a standards-based platform with which users can request a CAD service and receive the results in their local PACS. The process runs through a web interface that allows sending data to the different CAD services. A DICOM SR object with the results of the algorithms is received and stored inside the original study, in the proper folder with the original images. As a result, a homogeneous service will be offered to the different hospitals of the region. End users will benefit from a homogeneous workflow and a standardised integration model to request and obtain results from CAD systems in any modality, not dependent on commercial integration models. This new solution will foster the deployment of these technologies in the entire region of Galicia.

  15. CAFE: A Computer Tool for Accurate Simulation of the Regulatory Pool Fire Environment for Type B Packages

    International Nuclear Information System (INIS)

    Gritzo, L.A.; Koski, J.A.; Suo-Anttila, A.J.

    1999-01-01

    The Container Analysis Fire Environment computer code (CAFE) is intended to provide Type B package designers with an enhanced engulfing-fire boundary condition when combined with the PATRAN/P-Thermal commercial code. Historically, an engulfing fire boundary condition has been modeled as σT⁴, where σ is the Stefan-Boltzmann constant and T is the fire temperature. The CAFE code includes the necessary chemistry, thermal radiation, and fluid mechanics to model an engulfing fire. Effects included are the local cooling of gases that form a protective boundary layer, which reduces the incoming radiant heat flux to values lower than expected from a simple σT⁴ model. In addition, the effect of object shape on mixing, which may increase the local fire temperature, is included. Both high- and low-temperature regions that depend upon the local availability of oxygen are also calculated. Thus the competing effects that can both increase and decrease the local values of radiant heat flux are included in a manner that is not predictable a priori. The CAFE package consists of a group of computer subroutines that can be linked to workstation-based thermal analysis codes in order to predict package performance during regulatory and other accident fire scenarios.
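The simple σT⁴ boundary condition mentioned in this record is easy to evaluate directly, which gives a sense of the flux scale that CAFE's boundary-layer and mixing corrections act on. A quick sketch (the 800 °C figure is used as an illustrative regulatory pool-fire temperature; the function name is hypothetical):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(t_kelvin):
    """Radiant heat flux q = sigma * T^4 of the simple engulfing-fire model."""
    return SIGMA * t_kelvin ** 4

# Illustrative pool-fire temperature of 800 C = 1073.15 K:
q = blackbody_flux(1073.15)   # roughly 75 kW/m^2
```

CAFE's local cooling effects reduce the incident flux below this value, while shape-induced mixing can raise the local fire temperature and hence the flux above it.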

  16. Data Summarization in the Node by Parameters (DSNP): Local Data Fusion in an IoT Environment.

    Science.gov (United States)

    Maschi, Luis F C; Pinto, Alex S R; Meneguette, Rodolfo I; Baldassin, Alexandro

    2018-03-07

    With the advent of the Internet of Things, billions of objects or devices are inserted into the global computer network, generating and processing data at a volume never imagined before. This paper proposes a way to collect and process local data through a data fusion technology called summarization. The main feature of the proposal is the local data fusion, through parameters provided by the application, ensuring the quality of data collected by the sensor node. In the evaluation, the sensor node was compared when performing the data summary with another that performed a continuous recording of the collected data. Two sets of nodes were created, one with a sensor node that analyzed the luminosity of the room, which in this case obtained a reduction of 97% in the volume of data generated, and another set that analyzed the temperature of the room, obtaining a reduction of 80% in the data volume. Through these tests, it has been proven that the local data fusion at the node can be used to reduce the volume of data generated, consequently decreasing the volume of messages generated by IoT environments.

  17. Data Summarization in the Node by Parameters (DSNP): Local Data Fusion in an IoT Environment

    Directory of Open Access Journals (Sweden)

    Luis F. C. Maschi

    2018-03-01

    Full Text Available With the advent of the Internet of Things, billions of objects or devices are inserted into the global computer network, generating and processing data at a volume never imagined before. This paper proposes a way to collect and process local data through a data fusion technology called summarization. The main feature of the proposal is the local data fusion, through parameters provided by the application, ensuring the quality of data collected by the sensor node. In the evaluation, the sensor node was compared when performing the data summary with another that performed a continuous recording of the collected data. Two sets of nodes were created, one with a sensor node that analyzed the luminosity of the room, which in this case obtained a reduction of 97% in the volume of data generated, and another set that analyzed the temperature of the room, obtaining a reduction of 80% in the data volume. Through these tests, it has been proven that the local data fusion at the node can be used to reduce the volume of data generated, consequently decreasing the volume of messages generated by IoT environments.
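The two records above do not spell out the exact summarization rule, so the sketch below assumes one plausible parameter-driven policy: the node records a sample only when it deviates from the last recorded value by more than an application-supplied tolerance. The function name and the threshold semantics are illustrative, not DSNP's actual algorithm.

```python
def summarize(samples, tolerance):
    """Keep a sample only when it differs from the last kept value
    by more than `tolerance` (hypothetical summarization rule)."""
    kept = []
    for s in samples:
        if not kept or abs(s - kept[-1]) > tolerance:
            kept.append(s)
    return kept

# Stable temperature readings collapse to a handful of records:
readings = [20.0, 20.1, 20.05, 22.3, 22.4, 25.0]
summary = summarize(readings, tolerance=1.0)  # [20.0, 22.3, 25.0]
```

With slowly varying signals such as room temperature or luminosity, a rule of this kind easily discards the large fraction of redundant messages reported in the evaluation.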

  18. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    Science.gov (United States)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special-purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large-scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general-purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH---a framework for reducing the complexity of programming heterogeneous computer systems, 2) geophysical inversion routines which can be used to characterize physical systems, and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes. Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that

  19. A visualization environment for supercomputing-based applications in computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Pavlakos, C.J.; Schoof, L.A.; Mareda, J.F.

    1993-06-01

    In this paper, we characterize a visualization environment that has been designed and prototyped for a large community of scientists and engineers, with an emphasis on supercomputing-based computational mechanics. The proposed environment makes use of a visualization server concept to provide effective, interactive visualization to the user's desktop. Benefits of using the visualization server approach are discussed. Some thoughts regarding desirable features for visualization server hardware architectures are also addressed. A brief discussion of the software environment is included. The paper concludes by summarizing certain observations which we have made regarding the implementation of such visualization environments.

  20. An extended Intelligent Water Drops algorithm for workflow scheduling in cloud computing environment

    Directory of Open Access Journals (Sweden)

    Shaymaa Elsherbiny

    2018-03-01

    Full Text Available Cloud computing is emerging as a high-performance computing environment with a large-scale, heterogeneous collection of autonomous systems and a flexible computational architecture. Many resource management methods may enhance the efficiency of the whole cloud computing system. The key part of cloud computing resource management is resource scheduling. Optimized scheduling of tasks on the cloud virtual machines is an NP-hard problem, and many algorithms have been presented to solve it. The variations among these schedulers are due to the fact that their scheduling strategies are adapted to the changing environment and the types of tasks. The focus of this paper is on workflow scheduling in cloud computing, which is gaining a lot of attention recently because workflows have emerged as a paradigm to represent complex computing problems. We propose a novel algorithm extending the nature-based Intelligent Water Drops (IWD) algorithm that optimizes the scheduling of workflows on the cloud. The proposed algorithm is implemented and embedded within the workflow simulation toolkit and tested in different simulated cloud environments with different cost models. Our algorithm showed noticeable enhancements over the classical workflow scheduling algorithms. We made a comparison between the proposed IWD-based algorithm and other well-known scheduling algorithms, including MIN-MIN, MAX-MIN, Round Robin, FCFS, MCT, PSO, and C-PSO, where the proposed algorithm presented noticeable enhancements in performance and cost in most situations.

  1. BEAM: A computational workflow system for managing and modeling material characterization data in HPC environments

    Energy Technology Data Exchange (ETDEWEB)

    Lingerfelt, Eric J [ORNL; Endeve, Eirik [ORNL; Ovchinnikov, Oleg S [ORNL; Borreguero Calvo, Jose M [ORNL; Park, Byung H [ORNL; Archibald, Richard K [ORNL; Symons, Christopher T [ORNL; Kalinin, Sergei V [ORNL; Messer, Bronson [ORNL; Shankar, Mallikarjun [ORNL; Jesse, Stephen [ORNL

    2016-01-01

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales and in many spectroscopic modes, and now, with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides material scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation via an intuitive, cross-platform client user interface. This framework delivers authenticated, push-button execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing the converged compute-and-data infrastructure at Oak Ridge National Laboratory's (ORNL) Compute and Data Environment for Science (CADES) and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF). In this work we address the underlying HPC needs for characterization in the material science community, elaborate on how BEAM's design and infrastructure tackle those needs, and present a small subset of use cases where scientists utilized BEAM across a broad range of analytical techniques and analysis modes.

  2. Nuclides.net: An integrated environment for computations on radionuclides and their radiation

    International Nuclear Information System (INIS)

    Galy, J.; Magill, J.

    2002-01-01

    Full text: The Nuclides.net computational package is of direct interest in the fields of environment monitoring and nuclear forensics. The 'integrated environment' is a suite of computer programs ranging from a powerful user-friendly interface, which allows the user to navigate the nuclide chart and explore the properties of nuclides, to various computational modules for decay calculations, dosimetry and shielding calculations, etc. The main emphasis in Nuclides.net is on nuclear science applications, such as health physics, radioprotection and radiochemistry, rather than nuclear data, for which excellent sources already exist. In contrast to the CD-based Nuclides 2000 predecessor, Nuclides.net applications run over the internet on a web server. The user interface to these applications is via a web browser. Information submitted by the user is sent to the appropriate applications resident on the web server. The results of the calculations are returned to the user, again via the browser. The product is aimed at both students and professionals for reference data on radionuclides and computations based on these data using the latest internet technology. It is particularly suitable for educational purposes in the nuclear industry, health physics and radiation protection, nuclear and radiochemistry, nuclear physics, astrophysics, etc. The Nuclides.net software suite contains the following modules/features: a) A new user interface to view the nuclide charts (with zoom features). Additional nuclide charts are based on spin, parity, binding energy, etc. b) There are five main applications: (1) 'Decay Engine' for decay calculations of numbers, masses, activities, dose rates, etc. of parents and daughters. (2) 'Dosimetry and Shielding' module allows the calculation of dose rates from both unshielded and shielded point sources. A choice of 10 shield materials is available. (3) 'Virtual Nuclides' allows the user to do decay and dosimetry and shielding calculations on mixtures of
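The kind of computation a 'Decay Engine' performs can be illustrated with the basic single-nuclide decay law A(t) = A₀·e^(−λt), with λ = ln 2 / T½. This is the generic textbook formula, not Nuclides.net's actual implementation, and the Cs-137 half-life below is an approximate illustrative value.

```python
import math

def activity(a0, half_life, t):
    """Remaining activity after time t for a single nuclide.
    a0: initial activity; half_life and t in the same time units."""
    lam = math.log(2) / half_life       # decay constant, lambda = ln 2 / T_half
    return a0 * math.exp(-lam * t)

# Cs-137 (half-life ~30.1 years): activity remaining after two half-lives
a = activity(1000.0, 30.1, 60.2)        # ~250, i.e. one quarter of the start
```

A full decay engine chains this law through parent-daughter relationships (the Bateman equations), which is why the record mentions daughters as well as parents.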

  3. Comparison of computed tomography scout based reference point localization to conventional film and axial computed tomography.

    Science.gov (United States)

    Jiang, Lan; Templeton, Alistair; Turian, Julius; Kirk, Michael; Zusag, Thomas; Chu, James C H

    2011-01-01

    Identification of source positions after implantation is an important step in brachytherapy planning. Reconstruction is traditionally performed from films taken by conventional simulators, but these are gradually being replaced in the clinic by computed tomography (CT) simulators. The present study explored the use of a scout image-based reconstruction algorithm that replaces the use of traditional film while exhibiting low sensitivity to the metal-induced artifacts that can appear in 3D CT methods. In addition, the accuracy of an in-house graphical software implementation of scout-based reconstruction was compared with seed location reconstructions for 2 phantoms by conventional simulator and CT measurements. One phantom was constructed using a planar fixed grid of 1.5-mm diameter ball bearings (BBs) with 40-mm spacing. The second was a Fletcher-Suit applicator embedded in Styrofoam (Dow Chemical Co., Midland, MI) with one 3.2-mm-diameter BB inserted into each of 6 surrounding holes. Conventional simulator, kilovoltage CT (kVCT), megavoltage CT, and scout-based methods were evaluated by their ability to calculate the distance between seeds (40 mm for the fixed grid, 30-120 mm in Fletcher-Suit). All methods were able to reconstruct the fixed grid distances with an average deviation of <1%. The worst single deviations (approximately 6%) were exhibited by the 2 volumetric CT methods. In the Fletcher-Suit phantom, the intermodality agreement was within approximately 3%, with the conventional simulator measuring marginally larger distances and kVCT the smallest. All of the established reconstruction methods exhibited similar abilities to detect the distances between BBs. The 3D CT-based methods, with lower axial resolution, showed more variation, particularly with the smaller BBs. With a software implementation, scout-based reconstruction is an appealing approach because it simplifies data acquisition over film-based reconstruction without requiring any specialized equipment.

  4. Preoperative localization of endocrine pancreatic tumours by intra-arterial dynamic computed tomography

    International Nuclear Information System (INIS)

    Ahlstroem, H.; Magnusson, A.; Grama, D.; Eriksson, B.; Oeberg, K.; Loerelius, L.E.; Akademiska Sjukhuset, Uppsala; Akademiska Sjukhuset, Uppsala

    1990-01-01

    Eleven patients with biochemically confirmed endocrine pancreatic tumours were examined with intra-arterial (i.a.) dynamic computed tomography (CT) and angiography preoperatively. Seven of the patients suffered from the multiple endocrine neoplasia type 1 (MEN-1) syndrome. All patients were operated upon and surgical palpation and ultrasound were the peroperative localization methods. Of the 33 tumours which were found at histopathologic analysis of the resected specimens in the 11 patients, 7 tumours in 7 patients were correctly localized by both i.a. dynamic CT and angiography. Six patients with MEN-1 syndrome had multiple tumours and this group of patients together had 28 tumours, of which 5 (18%) were localized preoperatively by both CT and angiography. I.a. dynamic CT, with the technique used by us, does not seem to improve the localization of endocrine pancreatic tumours, especially in the rare group of MEN-1 patients, as compared with angiography. (orig.)

  5. Inequality measures perform differently in global and local assessments: An exploratory computational experiment

    Science.gov (United States)

    Chiang, Yen-Sheng

    2015-11-01

    Inequality measures are widely used in both academia and the public media to help us understand how incomes and wealth are distributed. They can be used to assess the distribution of a whole society (global inequality) as well as the inequality of actors' referent networks (local inequality). How different is local inequality from global inequality? Formalizing the structure of reference groups as a network, the paper conducted a computational experiment to see how the structure of complex networks influences the difference between global and local inequality as assessed by a selection of inequality measures. It was found that local inequality tends to be higher than global inequality when the population size is large, the network is dense and heterophilously assorted, and the income distribution is less dispersed. The implications of the simulation findings are discussed.
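The global/local distinction can be made concrete with a single measure such as the Gini coefficient: compute it once over the whole population, and once per actor over that actor's referent group. A minimal sketch (the paper's experiment is far richer; the function names and the toy ring network are illustrative):

```python
def gini(values):
    """Gini coefficient of a list of non-negative incomes."""
    xs = sorted(values)
    n = len(xs)
    # Standard closed form: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n, i = 1..n
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * sum(xs)) - (n + 1) / n

def local_ginis(incomes, neighbors):
    """Gini of each actor's referent group: the actor plus its neighbours."""
    return [gini([incomes[i]] + [incomes[j] for j in neighbors[i]])
            for i in range(len(incomes))]

incomes = [1.0, 2.0, 3.0, 10.0]
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}  # small ring network
g_global = gini(incomes)            # one global assessment
g_local = local_ginis(incomes, ring)  # one local assessment per actor
```

Varying the network (density, assortment) while holding the income vector fixed is exactly the kind of manipulation that drives the global-local differences the paper reports.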

  6. Application of local computer networks in nuclear-physical experiments and technology

    International Nuclear Information System (INIS)

    Foteev, V.A.

    1986-01-01

    The bases of construction, comparative performance and potentialities of local computer networks with respect to their application in physical experiments are considered. The principle of operation of local networks is shown using the Ethernet network as an example, and the results of an analysis of their operating performance are given. Examples of operating local networks in the area of nuclear-physics research and nuclear technology are presented, as follows: the networks of the Japan Atomic Energy Research Institute, California University and Los Alamos National Laboratory; network realization according to the DECnet and Fast-bus programs; home network configurations of the USSR Academy of Sciences and the JINR Neutron Physical Laboratory, etc. It is shown that local networks allow productivity in the sphere of data processing to be raised significantly.

  7. Glimpsing the imprint of local environment on the galaxy stellar mass function

    Science.gov (United States)

    Tomczak, Adam R.; Lemaux, Brian C.; Lubin, Lori M.; Gal, Roy R.; Wu, Po-Feng; Holden, Bradford; Kocevski, Dale D.; Mei, Simona; Pelliccia, Debora; Rumbaugh, Nicholas; Shen, Lu

    2017-12-01

    We investigate the impact of local environment on the galaxy stellar mass function (SMF) spanning a wide range of galaxy densities, from the field up to dense cores of massive galaxy clusters. Data are drawn from a sample of eight fields from the Observations of Redshift Evolution in Large-Scale Environments (ORELSE) survey. Deep photometry allows us to select mass-complete samples of galaxies down to 10⁹ M⊙. Taking advantage of >4000 secure spectroscopic redshifts from ORELSE and precise photometric redshifts, we construct three-dimensional density maps between 0.55 environmental dependence in the SMFs of star-forming and quiescent galaxies, although not quite as strongly for the quiescent subsample. To characterize the connection between the SMF of field galaxies and that of denser environments, we devise a simple semi-empirical model. The model begins with a sample of ≈10⁶ galaxies at zstart = 5 with stellar masses distributed according to the field. Simulated galaxies then evolve down to zfinal = 0.8 following empirical prescriptions for star formation, quenching and galaxy-galaxy merging. We run the simulation multiple times, testing a variety of scenarios with differing overall amounts of merging. Our model suggests that a large number of mergers are required to reproduce the SMF in dense environments. Additionally, a large majority of these mergers would have to occur in intermediate density environments (e.g. galaxy groups).

  8. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment

    Science.gov (United States)

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing. PMID:28467505

  9. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment.

    Science.gov (United States)

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
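Of the six heuristics compared in the two records above, Min-min is representative: repeatedly pick the unscheduled task whose minimum completion time across all machines is smallest, and assign it to that machine. Below is a sketch under an assumed expected-time-to-compute (ETC) matrix; the function name and data layout are hypothetical, not taken from the paper.

```python
def min_min(etc):
    """Min-min task scheduling heuristic.
    etc[t][m] = expected execution time of task t on machine m.
    Returns (assignment[t] = chosen machine, makespan)."""
    num_tasks, num_machines = len(etc), len(etc[0])
    ready = [0.0] * num_machines            # machine ready times
    assignment = [None] * num_tasks
    unscheduled = set(range(num_tasks))
    while unscheduled:
        # Pick the (task, machine) pair with the smallest completion time.
        ct, t, m = min((ready[m] + etc[t][m], t, m)
                       for t in unscheduled for m in range(num_machines))
        assignment[t] = m
        ready[m] = ct                       # machine busy until ct
        unscheduled.remove(t)
    return assignment, max(ready)
```

Max-min differs only in the outer selection (take the task whose *minimum* completion time is *largest*), which is why the two are usually implemented and benchmarked together.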

  10. Primary assembly of soil communities: disentangling the effect of dispersal and local environment.

    Science.gov (United States)

    Ingimarsdóttir, María; Caruso, Tancredi; Ripa, Jörgen; Magnúsdóttir, Olöf Birna; Migliorini, Massimo; Hedlund, Katarina

    2012-11-01

    It has long been recognised that dispersal abilities and environmental factors are important in shaping invertebrate communities, but their relative importance for primary soil community assembly has not yet been disentangled. By studying soil communities along chronosequences on four recently emerged nunataks (ice-free land in glacial areas) in Iceland, we replicated environmental conditions spatially at various geographical distances. This allowed us to determine the underlying factors of primary community assembly with the help of metacommunity theories that predict different levels of dispersal constraints and effects of the local environment. Comparing community assembly of the nunataks with that of non-isolated deglaciated areas indicated that isolation of a few kilometres did not affect the colonisation of the soil invertebrates. When accounting for effects of geographical distances, soil age and plant richness explained a significant part of the variance observed in the distribution of the oribatid mites and collembola communities, respectively. Furthermore, null model analyses revealed less co-occurrence than expected by chance and also convergence in the body size ratio of co-occurring oribatids, which is consistent with species sorting. Geographical distances influenced species composition, indicating that the community is also assembled by dispersal, e.g. mass effect. When all the results are linked together, they demonstrate that local environmental factors are important in structuring the soil community assembly, but are accompanied with effects of dispersal that may "override" the visible effect of the local environment.

  11. The link between poverty, environment and development. The political challenge of localizing Agenda 21.

    Science.gov (United States)

    Wichmann, R

    1995-11-01

    This article discusses the links between poverty, development, the environment, and implementing Agenda 21. The poor in large cities experience greater health risks and threats from environmental hazards. The poor also face inadequate housing, poor sanitation, polluted drinking water, and lack of other basic services. Many poor live in marginalized areas more susceptible to environmental degradation. During 1990-2030, population size may reach 9.7 billion, or 3.7 billion more than today. 90% may be urban residents. Already a large proportion of urban population live in a decaying urban environment with health and life threatening conditions. At least 250 million do not have easy access to safe piped water. 400 million lack proper sanitation. The liberalization of the global economy is fueling urbanization. The cycle of poverty and environmental decline requires rapid economic growth and closing of the infrastructure gaps. Policy initiatives of Agenda 21 occur at the local urban level. At this level, policies directly affect people. The future success of Agenda 21 will depend on local initiatives. Management approaches may need to change in order to achieve sustainable development. The poor will be more vocal and heard from in the future. Critical areas of management include waste management, pollution control, traffic, transportation, energy, economic development, and job creation. Society must be able to participate in setting priorities. About 1500 local authorities are involved in Agenda 21 planning initiatives. Curitiba, Brazil, is an example of how cities can solve community problems.

  12. The Use of Computer Simulation to Compare Student performance in Traditional versus Distance Learning Environments

    Directory of Open Access Journals (Sweden)

    Retta Guy

    2015-06-01

    Full Text Available Simulations have been shown to be an effective tool in traditional learning environments; however, as distance learning grows in popularity, the need to examine simulation effectiveness in this environment has become paramount. A causal-comparative design was chosen for this study to determine whether students using a computer-based instructional simulation in hybrid and fully online environments learned better than traditional classroom learners. The study spans a period of 6 years, beginning fall 2008 through spring 2014. The population studied was 281 undergraduate business students self-enrolled in a 200-level microcomputer application course. The overall results support previous studies in that computer simulations are most effective when used as a supplement to face-to-face lectures and in hybrid environments.

  13. Research on Digital Forensic Readiness Design in a Cloud Computing-Based Smart Work Environment

    Directory of Open Access Journals (Sweden)

    Sangho Park

    2018-04-01

    Full Text Available Recently, the work environments of organizations have been transitioning into smart work environments by applying cloud computing technology to the existing work environment. A smart work environment is characterized by the ability to access information assets inside the company from outside it through cloud computing technology, to share information without restrictions on location by using mobile terminals, and to provide a work environment where work can be conducted effectively in various locations and mobile environments. Thus, in the cloud computing-based smart work environment, security risks are changing: the risk of leaking an organization's information assets increases through mobile terminals, which have a high risk of loss and theft, and the hacking risk of wireless networks increases in mobile environments. Given these changes in security risk, the reactive digital forensic method, which investigates digital evidence after the occurrence of a security incident, appears to have limits, which has led to a rise in the necessity of proactive digital forensic approaches wherein security incidents can be addressed preemptively. Accordingly, in this research, we design a digital forensic readiness model at the level of preemptive prevention by considering changes in the cloud computing-based smart work environment. First, we investigate previous research related to the cloud computing-based smart work environment and digital forensic readiness and analyze a total of 50 components of digital forensic readiness. In addition, through the analysis of the corresponding preceding research, we design seven detailed areas, namely, outside-the-organization environment, within-the-organization guidelines, system information, terminal information, user information, usage information, and additional functions. Then, we design a draft of the digital forensic readiness model in the cloud

  14. COMPUTATIONAL MODELS USED FOR MINIMIZING THE NEGATIVE IMPACT OF ENERGY ON THE ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Oprea D.

    2012-04-01

    Full Text Available Optimizing energy systems is a problem that has been studied extensively for many years. It can be approached from different perspectives and with different computer programs. This work characterizes one of the calculation methods used in Europe for modelling and optimizing power systems, a method based on reducing the impact of the energy system on the environment. The computer program used and characterized in this article is GEMIS.

  15. ReputationPro: The Efficient Approaches to Contextual Transaction Trust Computation in E-Commerce Environments

    OpenAIRE

    Zhang, Haibin; Wang, Yan; Zhang, Xiuzhen; Lim, Ee-Peng

    2013-01-01

    In e-commerce environments, the trustworthiness of a seller is utterly important to potential buyers, especially when the seller is unknown to them. Most existing trust evaluation models compute a single value to reflect the general trust level of a seller without taking any transaction context information into account. In this paper, we first present a trust vector consisting of three values for Contextual Transaction Trust (CTT). In the computation of three CTT values, the identified three ...

  16. Improving Quality of Service and Reducing Power Consumption with WAN accelerator in Cloud Computing Environments

    OpenAIRE

    Shin-ichi Kuribayashi

    2013-01-01

    The widespread use of cloud computing services is expected to degrade Quality of Service and to increase the power consumption of ICT devices, since the distance to a server becomes longer than before. Migration of virtual machines over a wide area can solve many problems such as load balancing and power saving in cloud computing environments. This paper proposes to dynamically apply a WAN accelerator within the network when a virtual machine is moved to a distant center, in order to preve...

  17. Nuclides.net: A computational environment for nuclear data and applications in radioprotection and radioecology

    International Nuclear Information System (INIS)

    Berthou, V.; Galy, J.; Leutzenkirchen, K.

    2004-01-01

    An interactive multimedia tool, Nuclides.net, has been developed at the Institute for Transuranium Elements. The Nuclides.net 'integrated environment' is a suite of computer programs ranging from a powerful user-friendly interface, which allows the user to navigate the nuclides chart and explore the properties of nuclides, to various computational modules for decay calculations, dosimetry and shielding calculations, etc. The product is particularly suitable for environmental radioprotection and radioecology. (authors)

  18. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks

    OpenAIRE

    Devi, D. Chitra; Uthariaraj, V. Rhymend

    2016-01-01

    Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of the nonpreemptive tasks in the cloud computing environment is an irrecoverable restraint and hence it has to be assigned to the most appropriate VMs at the initial placement itself. Practically, the arrived jobs consist of multiple interdependent tasks and they may execute the independent tasks in multiple VMs or in the same VM’s mul...
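The record above is truncated, but the baseline scheduling policy it improves on, classic weighted round robin, can be sketched in a few lines. The VM names and weights below are invented for illustration and are not taken from the paper:

```python
# Classic weighted round robin: each VM is picked in proportion to its weight.
# This is a generic sketch of the baseline policy, not the paper's improved variant.
from itertools import cycle

def weighted_round_robin(vms):
    """vms: list of (name, weight) pairs; returns an iterator that yields
    each VM name `weight` times per round."""
    schedule = []
    for name, weight in vms:
        schedule.extend([name] * weight)
    return cycle(schedule)

picker = weighted_round_robin([("vm-a", 3), ("vm-b", 1)])
first_four = [next(picker) for _ in range(4)]
# first_four == ["vm-a", "vm-a", "vm-a", "vm-b"]
```

An improved variant such as the one the paper describes would additionally account for task dependencies and current VM load before committing a task to the selected VM.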

  19. Blockchain-based database to ensure data integrity in cloud computing environments

    OpenAIRE

    Gaetani, Edoardo; Aniello, Leonardo; Baldoni, Roberto; Lombardi, Federico; Margheri, Andrea; Sassone, Vladimiro

    2017-01-01

    Data is nowadays an invaluable resource, indeed it guides all business decisions in most of the computer-aided human activities. Threats to data integrity are thus of paramount relevance, as tampering with data may maliciously affect crucial business decisions. This issue is especially true in cloud computing environments, where data owners cannot control fundamental data aspects, like the physical storage of data and the control of its accesses. Blockchain has recently emerged as a fascinati...

  20. Towards the Automatic Detection of Efficient Computing Assets in a Heterogeneous Cloud Environment

    OpenAIRE

    Iglesias, Jesus Omana; Stokes, Nicola; Ventresque, Anthony; Murphy, Liam, B.E.; Thorburn, James

    2013-01-01

    In a heterogeneous cloud environment, the manual grading of computing assets is the first step in the process of configuring IT infrastructures to ensure optimal utilization of resources. Grading the efficiency of computing assets is, however, a difficult, subjective and time-consuming manual task. Thus, an automatic efficiency grading algorithm is highly desirable. In this paper, we compare the effectiveness of the different criteria used in the manual gr...

  1. Local environment but not genetic differentiation influences biparental care in ten plover populations.

    Directory of Open Access Journals (Sweden)

    Orsolya Vincze

    Full Text Available Social behaviours are highly variable between species, populations and individuals. However, it is contentious whether behavioural variations are primarily moulded by the environment, caused by genetic differences, or a combination of both. Here we establish that biparental care, a complex social behaviour that involves rearing of young by both parents, differs between closely related populations, and then test two potential sources of variation in parental behaviour between populations: ambient environment and genetic differentiation. We use 2904 hours of behavioural data from 10 geographically distinct Kentish plover (Charadrius alexandrinus) and snowy plover (C. nivosus) populations in America, Europe, the Middle East and North Africa to test these two sources of behavioural variation. We show that local ambient temperature has a significant influence on parental care: with extreme heat (above 40 °C), total incubation (i.e. the % of time the male or female incubated the nest) increased, and female share (the % of incubation performed by the female) decreased. By contrast, neither genetic differences between populations nor geographic distances predicted total incubation or female's share of incubation. These results suggest that the local environment has a stronger influence on a social behaviour than genetic differentiation, at least between populations of closely related species.

  2. Bronchobiliary Fistula Localized by Cholescintigraphy with Single-Photon Emission Computed Tomography

    International Nuclear Information System (INIS)

    Artunduaga, Maddy; Patel, Niraj R.; Wendt, Julie A.; Guy, Elizabeth S.; Nachiappan, Arun C.

    2015-01-01

    Biliptysis is an important clinical feature to recognize as it is associated with bronchobiliary fistula, a rare entity. Bronchobiliary fistulas have been diagnosed with planar cholescintigraphy. However, cholescintigraphy with single-photon emission computed tomography (SPECT) can better spatially localize a bronchobiliary fistula as compared to planar cholescintigraphy alone, and is useful for preoperative planning if surgical treatment is required. Here, we present the case of a 23-year-old male who developed a bronchobiliary fistula in the setting of posttraumatic and postsurgical infection, which was diagnosed and localized by cholescintigraphy with SPECT

  3. Quantum computation via local control theory: Direct sum vs. direct product Hilbert spaces

    International Nuclear Information System (INIS)

    Sklarz, Shlomo E.; Tannor, David J.

    2006-01-01

    The central objective in any quantum computation is the creation of a desired unitary transformation; the mapping that this unitary transformation produces between the input and output states is identified with the computation. In [S.E. Sklarz, D.J. Tannor, arXiv:quant-ph/0404081 (submitted to PRA) (2004)] it was shown that local control theory can be used to calculate fields that will produce such a desired unitary transformation. In contrast with previous strategies for quantum computing based on optimal control theory, the local control scheme maintains the system within the computational subspace at intermediate times, thereby avoiding unwanted decay processes. In [S.E. Sklarz et al.], the structure of the Hilbert space had a direct sum structure with respect to the computational register and the mediating states. In this paper, we extend the formalism to the important case of a direct product Hilbert space. The final equations for the control algorithm for the two cases are remarkably similar in structure, despite the fact that the derivations are completely different and that in one case the dynamics is in a Hilbert space and in the other case the dynamics is in a Liouville space. As shown in [S.E. Sklarz et al.], the direct sum implementation leads to a computational mechanism based on virtual transitions, and can be viewed as an extension of the principles of Stimulated Raman Adiabatic Passage from state manipulation to evolution operator manipulation. The direct product implementation developed here leads to the intriguing concept of virtual entanglement - computation that exploits second-order transitions that pass through entangled states but that leaves the subsystems nearly separable at all intermediate times. Finally, we speculate on a connection between the algorithm developed here and the concept of decoherence free subspaces

  4. Dietary quality in children and the role of the local food environment

    Directory of Open Access Journals (Sweden)

    Eimear Keane

    2016-12-01

    Full Text Available Diet is a modifiable contributor to many chronic diseases including childhood obesity. The local food environment may influence children's diet but this area of research is understudied. This study explores if distance to and the number of supermarkets and convenience stores in the local area around households are associated with dietary quality in nine year olds whilst controlling for household level socio-economic factors. This is a secondary analysis of Wave 1 (2007/2008) of the Growing Up in Ireland (GUI) Child Cohort Study, a sample of 8568 nine year olds from the Republic of Ireland. Dietary intake was assessed using a short, 20-item parent-reported food frequency questionnaire and was used to create a dietary quality score (DQS) whereby a higher score indicated a higher diet quality. Socio-economic status was measured using household class, household income, and maternal education. Food availability was measured as road network distance to and the number of supermarkets and convenience stores around households. Separate fixed effects regression models assessed the association between local area food availability and dietary quality, stratified by sex. The DQS ranged from −5 to 25 (mean 9.4, SD 4.2). Mean DQS was higher in those who lived furthest (distance in quintiles) from their nearest supermarket (p<0.001), and in those who lived furthest from their nearest convenience store (p<0.001). After controlling for socio-economic characteristics of the household, there was insufficient evidence to suggest that distance to the nearest supermarket or convenience store was associated with dietary quality in girls or boys. The number of supermarkets or convenience stores within 1000 m of the household was not associated with dietary quality. Food availability had a limited effect on dietary quality in this study. Issues associated with conceptualising and measuring the food environment may explain the findings of the current study. Keywords: Diet

  5. Design of UAV-Embedded Microphone Array System for Sound Source Localization in Outdoor Environments

    Directory of Open Access Journals (Sweden)

    Kotaro Hoshiba

    2017-11-01

    Full Text Available In search and rescue activities, unmanned aerial vehicles (UAV) should exploit sound information to compensate for poor visual information. This paper describes the design and implementation of a UAV-embedded microphone array system for sound source localization in outdoor environments. Four critical development problems included water-resistance of the microphone array, efficiency in assembling, reliability of wireless communication, and sufficiency of visualization tools for operators. To solve these problems, we developed a spherical microphone array system (SMAS) consisting of a microphone array, a stable wireless network communication system, and intuitive visualization tools. The performance of SMAS was evaluated with simulated data and a demonstration in the field. Results confirmed that the SMAS provides highly accurate localization, water resistance, prompt assembly, stable wireless communication, and intuitive information for observers and operators.

  6. Design of UAV-Embedded Microphone Array System for Sound Source Localization in Outdoor Environments

    Science.gov (United States)

    Hoshiba, Kotaro; Washizaki, Kai; Wakabayashi, Mizuho; Ishiki, Takahiro; Bando, Yoshiaki; Gabriel, Daniel; Nakadai, Kazuhiro; Okuno, Hiroshi G.

    2017-01-01

    In search and rescue activities, unmanned aerial vehicles (UAV) should exploit sound information to compensate for poor visual information. This paper describes the design and implementation of a UAV-embedded microphone array system for sound source localization in outdoor environments. Four critical development problems included water-resistance of the microphone array, efficiency in assembling, reliability of wireless communication, and sufficiency of visualization tools for operators. To solve these problems, we developed a spherical microphone array system (SMAS) consisting of a microphone array, a stable wireless network communication system, and intuitive visualization tools. The performance of SMAS was evaluated with simulated data and a demonstration in the field. Results confirmed that the SMAS provides highly accurate localization, water resistance, prompt assembly, stable wireless communication, and intuitive information for observers and operators. PMID:29099790

  7. Design of UAV-Embedded Microphone Array System for Sound Source Localization in Outdoor Environments.

    Science.gov (United States)

    Hoshiba, Kotaro; Washizaki, Kai; Wakabayashi, Mizuho; Ishiki, Takahiro; Kumon, Makoto; Bando, Yoshiaki; Gabriel, Daniel; Nakadai, Kazuhiro; Okuno, Hiroshi G

    2017-11-03

    In search and rescue activities, unmanned aerial vehicles (UAV) should exploit sound information to compensate for poor visual information. This paper describes the design and implementation of a UAV-embedded microphone array system for sound source localization in outdoor environments. Four critical development problems included water-resistance of the microphone array, efficiency in assembling, reliability of wireless communication, and sufficiency of visualization tools for operators. To solve these problems, we developed a spherical microphone array system (SMAS) consisting of a microphone array, a stable wireless network communication system, and intuitive visualization tools. The performance of SMAS was evaluated with simulated data and a demonstration in the field. Results confirmed that the SMAS provides highly accurate localization, water resistance, prompt assembly, stable wireless communication, and intuitive information for observers and operators.

  8. Preferred Air Velocity and Local Cooling Effect of desk fans in warm environments

    DEFF Research Database (Denmark)

    Simone, Angela; Olesen, Bjarne W.

    2013-01-01

Common experiences, standards, and laboratory studies show that increased air velocity helps to offset warm sensation due to high environmental temperatures. In warm climate regions the opening of windows and the use of desk or ceiling fans are the most common systems to generate increased airflows to compensate for higher environmental temperatures at the expense of no or relatively low energy consumption. When using desk fans, local air movement is generated around the occupant and a certain cooling effect is perceived. The impact of the local air movement generated by different air flow patterns, and the possibility to keep comfortable conditions for the occupants in warm environments, were evaluated in studies with human subjects. In an office-like climatic chamber, the effect of higher air velocity was investigated at room temperatures between 26°C and 34°C and at constant absolute humidity of 12.2 g...

  9. Enhanced Survey and Proposal to secure the data in Cloud Computing Environment

    OpenAIRE

    MR.S.SUBBIAH; DR.S.SELVA MUTHUKUMARAN; DR.T.RAMKUMAR

    2013-01-01

    Cloud computing has the power to eliminate the cost of setting up high-end computing infrastructure. It is a promising approach that offers a very flexible architecture, accessible through the internet. In the cloud computing environment the data may reside at any of the data centers. As a result, some data centers may leak the data stored there, beyond the reach and control of the users. For this kind of misbehaving data center, the service providers should take care of the security and...

  10. The Needs of Virtual Machines Implementation in Private Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Edy Kristianto

    2015-12-01

    Full Text Available The Internet of Things (IoT) has become the focus of the development of information and communication technology. Cloud computing has a very important role in supporting the IoT, because cloud computing makes it possible to provide services in the form of infrastructure (IaaS), platform (PaaS), and software (SaaS) for its users. One of the fundamental services is infrastructure as a service (IaaS). This study analyzed the requirements, based on the NIST framework, for realizing infrastructure as a service in the form of virtual machines to be built in a private cloud computing environment.

  11. The capacity of local governments to improve business environment: Evidence from Serbia

    Directory of Open Access Journals (Sweden)

    Vesna Janković Milić

    2014-12-01

    Full Text Available The aim of this paper is to draw attention to the need to strengthen institutional cooperation between local self-governments and the business community. The paper analyses the ability of socio-economic councils in Serbia, as a part of local government, to improve the business environment and indicators of social status at the local level. In addition to socio-economic councils, this analysis includes the departments, divisions and offices for local economic development and their responsibilities. The results in the paper have been generated using descriptive statistics, the Chi-square test, the t-test and regression analysis, based on primary data collected in empirical research on 55 municipalities in Serbia. The fundamental result of this analysis is that socio-economic councils have a positive impact on social and economic development in the surveyed municipalities. Finally, the basic conclusion of the research is that the size of a municipality is not a limiting factor for the establishment of a socio-economic council and its functionality.

  12. Computing the Local Field Potential (LFP) from Integrate-and-Fire Network Models

    DEFF Research Database (Denmark)

    Mazzoni, Alberto; Linden, Henrik; Cuntz, Hermann

    2015-01-01

    Leaky integrate-and-fire (LIF) network models are commonly used to study how the spiking dynamics of neural networks changes with stimuli, tasks or dynamic network states. However, neurophysiological studies in vivo often rather measure the mass activity of neuronal microcircuits with the local f...... in cases where a single pyramidal population dominates the LFP generation, and thereby facilitate quantitative comparison between computational models and experimental LFP recordings in vivo....
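As background to the record above, the dynamics of a single leaky integrate-and-fire neuron can be sketched with simple Euler integration. This is a generic textbook sketch, not the authors' network model, and all parameter values are illustrative:

```python
# Minimal leaky integrate-and-fire (LIF) neuron; parameter values are
# illustrative, not taken from the paper.
def simulate_lif(input_current, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_reset=-65.0, v_thresh=-50.0, r_m=10.0):
    """Euler integration of tau * dV/dt = -(V - v_rest) + R_m * I.
    input_current: external current per time step; returns spike times in ms."""
    v = v_rest
    spikes = []
    for step, i_ext in enumerate(input_current):
        v += dt * (-(v - v_rest) + r_m * i_ext) / tau
        if v >= v_thresh:           # threshold crossing: emit spike, reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

# Constant suprathreshold input for 100 ms produces periodic firing.
spikes = simulate_lif([2.0] * 1000)
```

Network models like those in the record couple many such units with synaptic input currents; the LFP proxy is then computed from the summed synaptic activity rather than from the spikes themselves.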

  13. New computer simulation technology of WSPEEDI for local and regional environmental assessment during nuclear emergency

    International Nuclear Information System (INIS)

    Chino, Masamichi; Furuno, Akiko; Terada, Hiroaki; Kitabata, Hideyuki

    2002-01-01

    The increase of nuclear power plants in the Asian region necessitates the capability to predict long-range atmospheric dispersion of radionuclides and radiological impacts due to a nuclear accident. For this purpose, we have developed a computer-based emergency response system, WSPEEDI. This paper aims to expand the capability of WSPEEDI so that it can be applied to simultaneous multi-scale predictions at local and regional scales in the Asian region.

  14. Reaction time for processing visual stimulus in a computer-assisted rehabilitation environment.

    Science.gov (United States)

    Sanchez, Yerly; Pinzon, David; Zheng, Bin

    2017-10-01

    The aim was to examine the reaction time when human subjects process information presented in the visual channel in both a direct vision and a virtual rehabilitation environment while walking. Visual stimuli included eight math problems displayed in the peripheral vision of seven healthy human subjects in a virtual rehabilitation training (computer-assisted rehabilitation environment (CAREN)) setting and a direct vision environment. Subjects were required to verbally report the results of these math calculations in a short period of time. Reaction time, measured by a Tobii eye tracker, and calculation accuracy were recorded and compared between the direct vision and virtual rehabilitation environments. Performance outcomes measured for both groups included reaction time, reading time, answering time and the verbal answer score. A significant difference between the groups was only found for the reaction time (p = .004). Participants had more difficulty recognizing the first equation in the virtual environment. Participants' reaction time was faster in the direct vision environment. This reaction time delay should be kept in mind when designing skill training scenarios in virtual environments. This was a pilot project for a series of studies assessing the cognitive ability of stroke patients undertaking a rehabilitation program with a virtual training environment. Implications for rehabilitation: Eye tracking is a reliable tool that can be employed in rehabilitation virtual environments. Reaction time changes between direct vision and virtual environments.

  15. Structural analysis of magnetic fusion energy systems in a combined interactive/batch computer environment

    International Nuclear Information System (INIS)

    Johnson, N.E.; Singhal, M.K.; Walls, J.C.; Gray, W.H.

    1979-01-01

    A system of computer programs has been developed to aid in the preparation of input data for and the evaluation of output data from finite element structural analyses of magnetic fusion energy devices. The system utilizes the NASTRAN structural analysis computer program and a special set of interactive pre- and post-processor computer programs, and has been designed for use in an environment wherein a time-share computer system is linked to a batch computer system. In such an environment, the analyst must only enter, review and/or manipulate data through interactive terminals linked to the time-share computer system. The primary pre-processor programs include NASDAT, NASERR and TORMAC. NASDAT and TORMAC are used to generate NASTRAN input data. NASERR performs routine error checks on this data. The NASTRAN program is run on a batch computer system using data generated by NASDAT and TORMAC. The primary post-processing programs include NASCMP and NASPOP. NASCMP is used to compress the data initially stored on magnetic tape by NASTRAN so as to facilitate interactive use of the data. NASPOP reads the data stored by NASCMP and reproduces NASTRAN output for selected grid points, elements and/or data types

  16. Daily Megavoltage Computed Tomography in Lung Cancer Radiotherapy: Correlation Between Volumetric Changes and Local Outcome

    International Nuclear Information System (INIS)

    Bral, Samuel; De Ridder, Mark; Duchateau, Michael; Gevaert, Thierry; Engels, Benedikt; Schallier, Denis; Storme, Guy

    2011-01-01

    Purpose: To assess the predictive or comparative value of volumetric changes, measured on daily megavoltage computed tomography during radiotherapy for lung cancer. Patients and Methods: We included 80 patients with locally advanced non-small-cell lung cancer treated with image-guided intensity-modulated radiotherapy. The radiotherapy was combined with concurrent chemotherapy, combined with induction chemotherapy, or given as primary treatment. Patients entered two parallel studies with moderately hypofractionated radiotherapy. Tumor volume contouring was done on the daily acquired images. A regression coefficient was derived from the volumetric changes on megavoltage computed tomography, and its predictive value was validated. Logarithmic or polynomial fits were applied to the intratreatment changes to compare the different treatment schedules radiobiologically. Results: Regardless of the treatment type, a high regression coefficient during radiotherapy predicted for a significantly prolonged cause-specific local progression free-survival (p = 0.05). Significant differences were found in the response during radiotherapy. The significant difference in volumetric treatment response between radiotherapy with concurrent chemotherapy and radiotherapy plus induction chemotherapy translated to a superior long-term local progression-free survival for concurrent chemotherapy (p = 0.03). An enhancement ratio of 1.3 was measured for the used platinum/taxane doublet in comparison with radiotherapy alone. Conclusion: Contouring on daily megavoltage computed tomography images during radiotherapy enabled us to predict the efficacy of a given treatment. The significant differences in volumetric response between treatment strategies makes it a possible tool for future schedule comparison.

  17. Role of Multislice Computed Tomography and Local Contrast in the Diagnosis and Characterization of Choanal Atresia

    Directory of Open Access Journals (Sweden)

    Khaled Al-Noury

    2011-01-01

    Full Text Available Objective. To illustrate the role of multislice computed tomography and local contrast instillation in the diagnosis and characterization of choanal atresia, and to review the common associated radiological findings. Methods. We analyzed 9 pediatric patients (5 males and 4 females) with suspected choanal atresia by multislice computed tomography. We recorded the type of atresia plate and other congenital malformations of the skull. Results. Multislice computed tomography with local contrast instilled delineated the posterior choanae. Three patients had unilateral mixed membranous and bony atresia. Three patients had unilateral pure bony atresia. Only 1 of 7 patients had bilateral bony atresia. It also showed other congenital anomalies in the head region. One patient had an ear abnormality. One patient had congenital nasal pyriform aperture stenosis. One of these patients had several congenital abnormalities, including cardiac and renal deformities and a hypoplastic lateral semicircular canal. Of the 6 patients diagnosed with choanal atresia, 1 patient had esophageal atresia and a tracheoesophageal fistula. The remaining patients had no other CHARGE syndrome lesions. Conclusions. Local contrast medium with the application of the low-dose technique helps to delineate the cause of the nasal obstruction while avoiding a high radiation dose to the child.

  18. Computer modeling of the dynamics of surface tension on rotating fluids in low and microgravity environments

    Science.gov (United States)

    Hung, R. J.; Tsao, Y. D.; Hong, B. B.; Leslie, Fred W.

    1989-01-01

    Time-dependent evolutions of the profile of the free surface (bubble shapes) for a cylindrical container partially filled with a Newtonian fluid of constant density, rotating about its axis of symmetry, have been studied. Numerical computations have been carried out with the following situations: (1) linear functions of spin-up and spin-down in low- and microgravity environments, (2) linear functions of increasing and decreasing gravity environments at high- and low-rotating cylinder speeds, and (3) step functions of spin-up and spin-down in a low-gravity environment.

  19. Research on elastic resource management for multi-queue under cloud computing environment

    Science.gov (United States)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments. However, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system under a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the job queues in HTCondor, using dual resource thresholds as well as the quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. In practical runs, virtual computing resources dynamically expanded or shrank as computing requirements changed. Additionally, the CPU utilization ratio of the computing resources was significantly increased when compared with traditional resource management. The system also performs well when there are multiple Condor schedulers and multiple job queues.
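The dual-threshold idea described above can be illustrated as a simple scaling decision. This is a hypothetical sketch under assumed names and threshold values, not the IHEPCloud implementation:

```python
# Hypothetical dual-threshold elastic scaling decision: grow the pool when the
# queue backs up, shrink it when workers sit idle. Thresholds are illustrative.
def scaling_decision(idle_jobs, idle_nodes, quota, current_nodes,
                     expand_threshold=10, shrink_threshold=2):
    """Return the number of virtual nodes to add (positive) or remove (negative)."""
    if idle_jobs > expand_threshold and current_nodes < quota:
        # Queue is backed up: expand, but never past the experiment's quota.
        return min(idle_jobs - expand_threshold, quota - current_nodes)
    if idle_nodes > shrink_threshold:
        # Too many idle workers: shrink back down to the shrink threshold.
        return -(idle_nodes - shrink_threshold)
    return 0

delta = scaling_decision(idle_jobs=25, idle_nodes=0, quota=20, current_nodes=12)
# delta == 8: expand by 8 nodes, capped by the quota of 20
```

A real system would read `idle_jobs` and `idle_nodes` per queue from the HTCondor scheduler and act on the returned delta through the cloud API, which is where the two-stage pool mentioned above would speed up expansion.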

  20. Abnormalities by pulmonary regions studied with computer tomography following local or local-regional radiotherapy for breast cancer

    International Nuclear Information System (INIS)

    Lind, Pehr; Svane, Gunilla; Gagliardi, Giovanna; Svensson, Christer

    1999-01-01

    Purpose: To study pulmonary radiological abnormalities with computer tomography (CT) following different radiotherapy (RT) techniques for breast cancer with respect to regions and density, and their correlation to pulmonary complications and reduction in vital capacity (VC). Methods and Materials: CT scans of the lungs were performed prior to and 4 months following RT in 105 breast cancer patients treated with local or local-regional RT. The radiological abnormalities were analyzed with a CT-adapted modification of a classification system originally proposed by Arriagada, and scored according to increasing density (0-3) and affected lung regions (apical-lateral, central-parahilar, basal-lateral). The highest density grade in each region were added together to form scores ranging from 0-9. The patients were monitored for RT-induced pulmonary complications. VC was measured prior to and 5 months following RT. Results: Increasing CT scores were correlated with both local-regional RT and pulmonary complications (p < 0.001). The mean reduction of VC for patients scoring 4-9 (-202 ml) was larger than for patients scoring 0-3 (-2 ml) (p = 0.035). The effect of confounding factors on the radiological scoring was tested in the local-regional RT group. Scores of 4-9 were less frequently seen in the patients who had received adjuvant chemotherapy prior to RT. The importance of the respective lung regions on the outcome of pulmonary complications was tested. Only radiological abnormalities in the central-parahilar and apical-lateral regions were significantly correlated to pulmonary complications. Discussion: Radiological abnormalities detected on CT images and scored with a modification of Arriagada's classification system can be used as an objective endpoint for pulmonary side effects in breast cancer. The described model should, however, be expanded with information about the volume of lung affected in each region before definite conclusions can be drawn concerning each

  1. Computed tomography-guided cryoablation of local recurrence after primary resection of pancreatic adenocarcinoma

    Directory of Open Access Journals (Sweden)

    Claudio Pusceddu

    2015-06-01

    Full Text Available The optimal management of local recurrences after primary resection of pancreatic cancer still remains to be clarified. A 58-year-old woman developed an isolated recurrence of pancreatic cancer six years after distal pancreatectomy. Re-resection was attempted but the lesion was deemed unresectable at surgery. Chemotherapy was then administered without obtaining a reduction of the tumor size or an improvement of the patient's symptoms. The patient therefore underwent percutaneous cryoablation under computed tomography (CT) guidance, obtaining tumor necrosis and a significant improvement in the quality of life. A CT scan one month later showed a stable lesion with no contrast enhancement. While percutaneous cryoablation has found widening application in patients with unresectable pancreatic cancer, it has never been described for the treatment of local pancreatic cancer recurrence after primary resection. Percutaneous cryoablation deserves further studies in the multimodality treatment of local recurrence after primary pancreatic surgery.

  2. Secondary School Students' Attitudes towards Mathematics Computer--Assisted Instruction Environment in Kenya

    Science.gov (United States)

    Mwei, Philip K.; Wando, Dave; Too, Jackson K.

    2012-01-01

    This paper reports the results of research conducted in six classes (Form IV) with 205 students with a sample of 94 respondents. Data represent students' statements that describe (a) the role of Mathematics teachers in a computer-assisted instruction (CAI) environment and (b) effectiveness of CAI in Mathematics instruction. The results indicated…

  3. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    Science.gov (United States)

    Lee, Young-Jin

    2017-01-01

    Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…
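
    As an illustration of the modeling approach named in the record (regularized logistic regression predicting whether a student solves a problem), the following is a minimal pure-Python sketch on invented toy features (hints used, attempts made); the feature names, data, and hyperparameters are assumptions, not taken from the paper.

```python
import math

def train_logreg(X, y, lam=0.1, lr=0.5, epochs=500):
    """L2-regularized logistic regression fitted by batch gradient descent."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw = [lam * wj for wj in w]  # gradient of the L2 penalty term
        gb = 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # predicted solve probability
            err = p - yi
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1.0 / (1.0 + math.exp(-z))

# invented data: [hints_used, attempts]; label 1 = problem solved
X = [[0, 1], [1, 2], [0, 2], [3, 5], [4, 6], [5, 5]]
y = [1, 1, 1, 0, 0, 0]
w, b = train_logreg(X, y)
print(predict(w, b, [0, 1]))  # high probability: few hints, few attempts
```

    The L2 penalty (`lam`) plays the regularization role mentioned in the abstract, shrinking weights to avoid overfitting small log data.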

  4. The Use of Engineering Design Concept for Computer Programming Course: A Model of Blended Learning Environment

    Science.gov (United States)

    Tritrakan, Kasame; Kidrakarn, Pachoen; Asanok, Manit

    2016-01-01

    The aim of this research is to develop a learning model which blends factors from learning environment and engineering design concept for learning in computer programming course. The usage of the model was also analyzed. This study presents the design, implementation, and evaluation of the model. The research methodology is divided into three…

  5. Computer Graphics Orientation and Training in a Corporate/Production Environment.

    Science.gov (United States)

    McDevitt, Marsha Jean

    This master's thesis provides an overview of a computer graphics production environment and proposes a realistic approach to orientation and on-going training for employees working within a fast-paced production schedule. Problems involved in meeting the training needs of employees are briefly discussed in the first chapter, while the second…

  6. Visual Perspectives within Educational Computer Games: Effects on Presence and Flow within Virtual Immersive Learning Environments

    Science.gov (United States)

    Scoresby, Jon; Shelton, Brett E.

    2011-01-01

    The mis-categorizing of cognitive states involved in learning within virtual environments has complicated instructional technology research. Further, most educational computer game research does not account for how learning activity is influenced by factors of game content and differences in viewing perspectives. This study is a qualitative…

  7. The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment

    Science.gov (United States)

    Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne

    2013-01-01

    The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text…

  8. Understanding Student Retention in Computer Science Education: The Role of Environment, Gains, Barriers and Usefulness

    Science.gov (United States)

    Giannakos, Michail N.; Pappas, Ilias O.; Jaccheri, Letizia; Sampson, Demetrios G.

    2017-01-01

    Researchers have been working to understand the high dropout rates in computer science (CS) education. Despite the great demand for CS professionals, little is known about what influences individuals to complete their CS studies. We identify gains of studying CS, the (learning) environment, degree's usefulness, and barriers as important predictors…

  9. Encountering the Expertise Reversal Effect with a Computer-Based Environment on Electrical Circuit Analysis

    Science.gov (United States)

    Reisslein, Jana; Atkinson, Robert K.; Seeling, Patrick; Reisslein, Martin

    2006-01-01

    This study examined the effectiveness of a computer-based environment employing three example-based instructional procedures (example-problem, problem-example, and fading) to teach series and parallel electrical circuit analysis to learners classified by two levels of prior knowledge (low and high). Although no differences between the…

  10. Detecting and Understanding the Impact of Cognitive and Interpersonal Conflict in Computer Supported Collaborative Learning Environments

    Science.gov (United States)

    Prata, David Nadler; Baker, Ryan S. J. d.; Costa, Evandro d. B.; Rose, Carolyn P.; Cui, Yue; de Carvalho, Adriana M. J. B.

    2009-01-01

    This paper presents a model which can automatically detect a variety of student speech acts as students collaborate within a computer supported collaborative learning environment. In addition, an analysis is presented which gives substantial insight as to how students' learning is associated with students' speech acts, knowledge that will…

  11. MOO: Using a Computer Gaming Environment to Teach about Community Arts

    Science.gov (United States)

    Garber, Elizabeth

    2004-01-01

    In this paper, the author discusses the use of an interactive computer technology, "MOO" (Multi-user domain, Object-Oriented), in her art education classes for preservice teachers. A MOO is a text-based environment wherein interactivity is centered on text exchanges made between users based on problems or other materials created by teachers. The…

  12. A semi-local quasi-harmonic model to compute the thermodynamic and mechanical properties of silicon nanostructures

    International Nuclear Information System (INIS)

    Zhao, H; Aluru, N R

    2007-01-01

    This paper presents a semi-local quasi-harmonic model with local phonon density of states (LPDOS) to compute the thermodynamic and mechanical properties of silicon nanostructures at finite temperature. In contrast to an earlier approach (Tang and Aluru 2006 Phys. Rev. B 74 235441), where a quasi-harmonic model with LPDOS computed by a Green's function technique (QHMG) was developed considering many layers of atoms, the semi-local approach considers only two layers of atoms to compute the LPDOS. We show that the semi-local approach combines the accuracy of the QHMG approach and the computational efficiency of the local quasi-harmonic model. We present results for several silicon nanostructures to address the accuracy and efficiency of the semi-local approach
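
    The record does not reproduce the model's equations. For orientation only, the standard quasi-harmonic Helmholtz free energy that such models evaluate from a (local) phonon density of states g(ω) can be sketched as follows (conventional symbols, not the paper's own notation):

```latex
F(T) = U_0 + \int_0^{\infty} g(\omega)
       \left[ \frac{\hbar\omega}{2}
       + k_B T \,\ln\!\left(1 - e^{-\hbar\omega/k_B T}\right) \right]
       \mathrm{d}\omega
```

    Here U_0 is the static lattice energy, the first bracketed term is the zero-point energy, and the second is the thermal phonon contribution; the semi-local approach differs only in how g(ω) is computed (from two layers of atoms rather than many).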

  13. Using the CAVE virtual-reality environment as an aid to 3-D electromagnetic field computation

    International Nuclear Information System (INIS)

    Turner, L.R.; Levine, D.; Huang, M.; Papka, M.

    1995-01-01

    One of the major problems in three-dimensional (3-D) field computation is visualizing the resulting 3-D field distributions. A virtual-reality environment, such as the CAVE (CAVE Automatic Virtual Environment), is helping to overcome this problem, thus making the results of computation more usable for designers and users of magnets and other electromagnetic devices. As a demonstration of the capabilities of the CAVE, the elliptical multipole wiggler (EMW), an insertion device being designed for the Advanced Photon Source (APS) now being commissioned at Argonne National Laboratory (ANL), was made visible, along with its fields and beam orbits. Other uses of the CAVE in preprocessing and postprocessing computation for electromagnetic applications are also discussed

  14. NOSTOS: a paper-based ubiquitous computing healthcare environment to support data capture and collaboration.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Eriksson, Henrik

    2003-01-01

    In this paper, we present a new approach to clinical workplace computerization that departs from the window-based user interface paradigm. NOSTOS is an experimental computer-augmented work environment designed to support data capture and teamwork in an emergency room. NOSTOS combines multiple technologies, such as digital pens, walk-up displays, headsets, a smart desk, and sensors to enhance an existing paper-based practice with computer power. The physical interfaces allow clinicians to retain mobile paper-based collaborative routines and still benefit from computer technology. The requirements for the system were elicited from situated workplace studies. We discuss the advantages and disadvantages of augmenting a paper-based clinical work environment.

  15. Local body cooling to improve sleep quality and thermal comfort in a hot environment.

    Science.gov (United States)

    Lan, L; Qian, X L; Lian, Z W; Lin, Y B

    2018-01-01

    The effects of local body cooling on thermal comfort and sleep quality in a hot environment were investigated in an experiment with 16 male subjects. Sleep quality was evaluated subjectively, using questionnaires completed in the morning, and objectively, by analysis of electroencephalogram (EEG) signals that were continuously monitored during the sleeping period. Compared with no cooling, the largest improvement in thermal comfort and sleep quality was observed when the back and head (neck) were both cooled at a room temperature of 32°C. Back cooling alone also improved thermal comfort and sleep quality, although the effects were less than when cooling both back and head (neck). Mean sleep efficiency was improved from 84.6% in the no cooling condition to 95.3% and 92.8%, respectively, in these conditions, indicating good sleep quality. Head (neck) cooling alone slightly improved thermal comfort and subjective sleep quality and increased Stage N3 sleep, but did not otherwise improve sleep quality. The results show that local cooling applied to large body sections (back and head) could effectively maintain good sleep and improve thermal comfort in a hot environment. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Distributed, signal strength-based indoor localization algorithm for use in healthcare environments.

    Science.gov (United States)

    Wyffels, Jeroen; De Brabanter, Jos; Crombez, Pieter; Verhoeve, Piet; Nauwelaers, Bart; De Strycker, Lieven

    2014-11-01

    In current healthcare environments, a trend toward mobile and personalized interactions between people and nurse call systems is strongly noticeable. Therefore, it should be possible to locate patients at all times and in all places throughout the care facility. This paper aims at describing a method by which a mobile node can locate itself indoors, based on signal strength measurements and a minimal amount of yes/no decisions. The algorithm has been developed specifically for use in a healthcare environment. With extensive testing and statistical support, we prove that our algorithm can be used in a healthcare setting with an envisioned level of localization accuracy up to room level (or region level in a corridor), while avoiding heavy investments since the hardware of an existing nurse call network can be reused. The approach opted for leads to very high scalability, since thousands of mobile nodes can locate themselves. Network timing issues and localization update delays are avoided, which ensures that a patient can receive the needed care in a time- and resource-efficient way.

  17. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-07-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow that consists of preparation of an application’s input, running of a simulation, and visualization of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time. It allows modification of application-defined control parameters at run-time using various user-steering applications. In this project, we propose a computational steering framework for HPC environments that provides an innovative solution and easy-to-use platform, which allows users to connect and interact with running application(s) in real-time. This framework uses RealityGrid as the underlying steering library and adds several enhancements to the library to enable steering support for Blue Gene systems. Included in the scope of this project is the development of a scalable and efficient steering relay server that supports many-to-many connectivity between multiple steered applications and multiple steering clients. Steered applications can range from intermediate simulation and physical modeling applications to complex computational fluid dynamics (CFD) applications or advanced visualization applications. The Blue Gene supercomputer presents special challenges for remote access because the compute nodes reside on private networks. This thesis presents an implemented solution and demonstrates it on representative applications. Thorough implementation details and application enablement steps are also presented in this thesis to encourage direct usage of this framework.

  18. Sensing the environment: regulation of local and global homeostasis by the skin's neuroendocrine system.

    Science.gov (United States)

    Slominski, Andrzej T; Zmijewski, Michal A; Skobowiat, Cezary; Zbytek, Blazej; Slominski, Radomir M; Steketee, Jeffery D

    2012-01-01

    Skin, the body's largest organ, is strategically located at the interface with the external environment where it detects, integrates, and responds to a diverse range of stressors including solar radiation. It has already been established that the skin is an important peripheral neuro-endocrine-immune organ that is tightly networked to central regulatory systems. These capabilities contribute to the maintenance of peripheral homeostasis. Specifically, epidermal and dermal cells produce and respond to classical stress neurotransmitters, neuropeptides, and hormones. Such production is stimulated by ultraviolet radiation (UVR), biological factors (infectious and noninfectious), and other physical and chemical agents. Examples of local biologically active products are cytokines, biogenic amines (catecholamines, histamine, serotonin, and N-acetyl-serotonin), melatonin, acetylcholine, neuropeptides including pituitary (proopiomelanocortin-derived ACTH, beta-endorphin or MSH peptides, thyroid-stimulating hormone) and hypothalamic (corticotropin-releasing factor and related urocortins, thyroid-releasing hormone) hormones as well as enkephalins and dynorphins, thyroid hormones, steroids (glucocorticoids, mineralocorticoids, sex hormones, 7-delta steroids), secosteroids, opioids, and endocannabinoids. The production of these molecules is hierarchical, organized along the algorithms of classical neuroendocrine axes such as the hypothalamic-pituitary-adrenal axis (HPA), the hypothalamic-pituitary-thyroid axis (HPT), and the serotoninergic, melatoninergic, catecholaminergic, cholinergic, steroid/secosteroidogenic, opioid, and endocannabinoid systems. Dysregulation of these axes or of communication between them may lead to skin and/or systemic diseases. These local neuroendocrine networks are also aimed at maximally restricting the effect of noxious environmental agents to preserve local and, consequently, global homeostasis. 
Moreover, the skin-derived factors/systems can also activate cutaneous nerve

  19. Upscaling ecotourism in Kisumu city and its environs: Local community perspective

    Directory of Open Access Journals (Sweden)

    Patrick Odhiambo HAYOMBE

    2013-07-01

    Full Text Available Kenya’s quest to be among the top ten long-haul tourist destinations globally requires a strategic focus, as envisaged in Kenya’s Vision 2030. Ecotourism is emerging as an alternative development path that can enhance environmental conservation, promote preservation of cultural heritage and provide an alternative source of sustainable livelihood. Alternative livelihood in ecotourism provides a sustainable development path for Kisumu City and its environs. However, sustainability in ecotourism transformation is a concern: how can the local community be motivated to participate in this venture? This study discerns the significant sustainability factors as perceived by the local community. The objective of the study was to discern the local community’s perception of significant sustainability factors for ecotourism transformation, and the research question was: What is the local community’s perception of significant sustainability factors for ecotourism transformation? The study used both qualitative and quantitative research designs. The qualitative design focused on site-specific analysis of the ecotourism sites of Dunga (Kisumu), Miyandhe (Bondo) and Seka (Kendu Bay). The quantitative research entailed data collection administered through a questionnaire in ecotourism outlets represented by 10 Beach Management Units (BMU) selected through purposive sampling. Principal Component Analysis (PCA) was used to discern the significant sustainability factors for ecotourism transformation. A total of 28 items converted into variables were subjected against 326 respondents in the PCA analysis. The results indicated a total of seven (7) significant sustainability factors: the first factor was willingness to participate in ecotourism ventures; the second factor was upscaling ecotourism initiatives in the neighborhood; the third factor was women and youth empowerment; the fourth factor was youth and women employment in the neighborhood; the fifth factor: Natural Artifact
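
    The extraction step described in the record (PCA over survey variables) can be sketched in miniature. The two-variable example below computes the leading principal component and its explained-variance ratio in closed form for a 2x2 covariance matrix; the data are invented, not the study's survey responses.

```python
import math

def pca_2d(data):
    """Leading principal component of two-variable data (closed form for 2x2)."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    # eigenvalues of the covariance matrix [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    l1 = tr / 2 + math.sqrt(tr ** 2 / 4 - det)  # largest eigenvalue
    l2 = tr - l1
    v = (sxy, l1 - sxx)                          # eigenvector for l1 (needs sxy != 0)
    norm = math.hypot(*v)
    return l1 / (l1 + l2), (v[0] / norm, v[1] / norm)

# invented responses on two strongly correlated survey items
data = [(1, 2), (2, 3), (3, 5), (4, 6), (5, 8)]
ratio, pc1 = pca_2d(data)
print(round(ratio, 3))  # most of the variance lies on the first component
```

    Factors are retained when, as here, a component explains a large share of the total variance; the study applied the same idea to 28 variables rather than 2.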

  20. Computation of quantum electron transport with local current conservation using quantum trajectories

    International Nuclear Information System (INIS)

    Alarcón, A; Oriols, X

    2009-01-01

    A recent proposal for modeling time-dependent quantum electron transport with Coulomb and exchange correlations using quantum (Bohm) trajectories (Oriols 2007 Phys. Rev. Lett. 98 066803) is extended towards the computation of the total (particle plus displacement) current in mesoscopic devices. In particular, two different methods for the practical computation of the total current are compared. The first method computes the particle and the displacement currents from the rate of Bohm particles crossing a particular surface and the time-dependent variations of the electric field there. The second method uses the Ramo–Shockley theorem to compute the total current on that surface from the knowledge of the Bohm particle dynamics in a 3D volume and the time-dependent variations of the electric field on the boundaries of that volume. From a computational point of view, it is shown that both methods achieve local current conservation, but the second is preferred because it is free from 'spurious' peaks. A numerical example, a Bohm trajectory crossing a double-barrier tunneling structure, is presented, supporting the conclusions
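
    The two methods compared in the record correspond to two standard expressions for the total (particle plus displacement) current. In conventional notation (a sketch only, not the paper's exact formulation, with E_w the weighting field of the Ramo–Shockley construction):

```latex
I(t) = \int_S \left( \vec{J}_p(\vec{r},t)
       + \epsilon\,\frac{\partial \vec{E}(\vec{r},t)}{\partial t} \right) \cdot d\vec{s},
\qquad
I(t) = \sum_k q_k\, \vec{v}_k(t) \cdot \vec{E}_w\!\left(\vec{r}_k(t)\right)
       + \text{boundary terms}.
```

    The first (surface) form evaluates particle crossings and field variations on one surface; the second (volume) form sums over the Bohm-particle velocities in the 3-D volume, which is why it avoids the spurious peaks associated with discrete surface crossings.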

  1. A Robust Localization, Slip Estimation, and Compensation System for WMR in the Indoor Environments

    Directory of Open Access Journals (Sweden)

    Zakir Ullah

    2018-05-01

    Full Text Available A novel approach is proposed for the path tracking of a Wheeled Mobile Robot (WMR) in the presence of an unknown lateral slip. Much of the existing work has assumed pure rolling conditions between the wheel and ground. Under pure rolling conditions, the wheels of a WMR are supposed to roll without slipping. Complex wheel-ground interactions, acceleration and steering system noise are the factors which cause WMR wheel slip. A basic research problem in this context is localization and slip estimation of a WMR from a stream of noisy sensor data when the robot is moving on a slippery surface or moving at a high speed. A DecaWave-based ranging system and a Particle Filter (PF) are good candidates for estimating the location of a WMR indoors and outdoors. Unfortunately, wheel slip of the WMR limits the ultimate performance that can be achieved by a real-world implementation of the PF, because location estimation systems typically partially rely on the robot heading. A small error in the WMR heading leads to a large error in the location estimate of the PF because of its cumulative nature. In order to enhance the tracking and localization performance of the PF in environments where the main reason for an error in the PF location estimate is angular noise, two methods were used for heading estimation of the WMR: (1) Reinforcement Learning (RL) and (2) Location-based Heading Estimation (LHE). Trilateration is applied to the DecaWave-based ranging system for calculating the probable location of the WMR; this noisy location, along with the PF current mean, is used to estimate the WMR heading by the above two methods. Besides the WMR location calculation, the DecaWave-based ranging system is also used to update the PF weights. The localization and tracking performance of the PF is significantly improved by incorporating heading error in localization through RL and LHE. 
Desired trajectory information is then used to develop an algorithm for extracting the lateral slip along
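
    The trilateration step mentioned in the record admits a compact closed form in 2-D: subtracting the first range equation from the other two linearizes the system, which a 2x2 solve then inverts. The sketch below uses invented anchor coordinates and ranges; it is not the paper's implementation.

```python
import math

def trilaterate(anchors, ranges):
    """2-D position from three anchors and measured ranges.

    Linearizes the three circle equations by subtracting the first,
    then solves the resulting 2x2 system with Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero if the anchors are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (3.0, 4.0)
ranges = [math.dist(a, true_pos) for a in anchors]
est = trilaterate(anchors, ranges)
print(est)  # recovers approximately (3.0, 4.0)
```

    With noisy ranges (as from the DecaWave radios) the same solve yields the "probable location" that the paper then fuses with the particle filter.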

  2. A survey of simultaneous localization and mapping on unstructured lunar complex environment

    Science.gov (United States)

    Wang, Yiqiao; Zhang, Wei; An, Pei

    2017-10-01

    Simultaneous localization and mapping (SLAM) technology is the key to realizing a lunar rover's intelligent perception and autonomous navigation. It embodies the autonomous ability of a mobile robot and has attracted considerable attention from researchers over the past thirty years. Visual sensors are valuable to SLAM research because they can provide a wealth of information. Visual SLAM uses only images as external information to estimate the location of the robot and construct the environment map. SLAM technology still has problems when applied in large-scale, unstructured and complex environments. Based on the latest technology in the field of visual SLAM, this paper investigates and summarizes SLAM technology for use in the unstructured, complex environment of the lunar surface. In particular, we focus on summarizing and comparing feature detection and matching with SIFT, SURF and ORB, while discussing their advantages and disadvantages. We have analyzed the three main methods: SLAM based on the Extended Kalman Filter, SLAM based on the Particle Filter and SLAM based on Graph Optimization (EKF-SLAM, PF-SLAM and Graph-based SLAM). Finally, this article summarizes and discusses the key scientific and technical difficulties that visual SLAM faces in the lunar context. At the same time, we explore frontier issues such as multi-sensor fusion SLAM and multi-robot cooperative SLAM technology. We also predict the development trend of lunar rover SLAM technology and put forward some ideas for further research.
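
    The ORB descriptors compared in the survey are binary strings matched by Hamming distance. Below is an illustrative brute-force matcher with a nearest/second-nearest ratio test on invented 16-bit toy descriptors; it is not the survey's code, and real ORB descriptors are 256-bit.

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary descriptors (as ints)."""
    return bin(a ^ b).count("1")

def match(query, train, ratio=0.8):
    """Brute-force matching with a nearest/second-nearest ratio test,
    commonly used to discard ambiguous feature correspondences."""
    matches = []
    for qi, q in enumerate(query):
        dists = sorted((hamming(q, t), ti) for ti, t in enumerate(train))
        best, second = dists[0], dists[1]
        if best[0] < ratio * second[0]:  # keep only unambiguous matches
            matches.append((qi, best[1], best[0]))
    return matches

# invented 16-bit "descriptors"; query[0] is one bit away from train[1]
query = [0b1010101010101010, 0b1111000011110000]
train = [0b0101010101010101, 0b1010101010101011, 0b1111000011111111]
print(match(query, train))  # -> [(0, 1, 1), (1, 2, 4)]
```

    SIFT and SURF use the same matching scheme but with Euclidean distance on float descriptors, which is the main computational trade-off the survey discusses.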

  3. HEPLIB '91: International users meeting on the support and environments of high energy physics computing

    International Nuclear Information System (INIS)

    Johnstad, H.

    1991-01-01

    The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards

  4. MDA-image: an environment of networked desktop computers for teleradiology/pathology.

    Science.gov (United States)

    Moffitt, M E; Richli, W R; Carrasco, C H; Wallace, S; Zimmerman, S O; Ayala, A G; Benjamin, R S; Chee, S; Wood, P; Daniels, P

    1991-04-01

    MDA-Image, a project of The University of Texas M. D. Anderson Cancer Center, is an environment of networked desktop computers for teleradiology/pathology. Radiographic film is digitized with a film scanner and histopathologic slides are digitized using a red, green, and blue (RGB) video camera connected to a microscope. Digitized images are stored on a data server connected to the institution's computer communication network (Ethernet) and can be displayed from authorized desktop computers connected to Ethernet. Images are digitized for cases presented at the Bone Tumor Management Conference, a multidisciplinary conference in which treatment options are discussed among clinicians, surgeons, radiologists, pathologists, radiotherapists, and medical oncologists. These radiographic and histologic images are shown on a large screen computer monitor during the conference. They are available for later review for follow-up or representation.

  5. A Wireless Sensor Network with Soft Computing Localization Techniques for Track Cycling Applications.

    Science.gov (United States)

    Gharghan, Sadik Kamel; Nordin, Rosdiadee; Ismail, Mahamod

    2016-08-06

    In this paper, we propose two soft computing localization techniques for wireless sensor networks (WSNs). The two techniques, the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the Artificial Neural Network (ANN), focus on a range-based localization method which relies on the measurement of the received signal strength indicator (RSSI) from the three ZigBee anchor nodes distributed throughout the track cycling field. The soft computing techniques aim to estimate the distance between bicycles moving on the cycle track for outdoor and indoor velodromes. In the first approach the ANFIS was considered, whereas in the second approach the ANN was hybridized individually with three optimization algorithms, namely Particle Swarm Optimization (PSO), the Gravitational Search Algorithm (GSA), and the Backtracking Search Algorithm (BSA). The results revealed that the hybrid GSA-ANN outperforms the other methods adopted in this paper in terms of localization and distance estimation accuracy. The hybrid GSA-ANN achieves a mean absolute distance estimation error of 0.02 m and 0.2 m for outdoor and indoor velodromes, respectively.
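
    The record does not give the RSSI-to-distance mapping that feeds the ANFIS/ANN models; a commonly used empirical baseline is the log-distance path-loss model, inverted below. The reference RSSI (`rssi_d0`) and path-loss exponent (`n`) are assumed values, not the paper's calibration.

```python
def rssi_to_distance(rssi, rssi_d0=-40.0, d0=1.0, n=2.0):
    """Invert the log-distance path-loss model:
    RSSI(d) = RSSI(d0) - 10 * n * log10(d / d0)."""
    return d0 * 10 ** ((rssi_d0 - rssi) / (10.0 * n))

print(rssi_to_distance(-40.0))  # at the reference distance -> 1.0 m
print(rssi_to_distance(-60.0))  # 20 dB weaker with n=2 -> 10.0 m
```

    Soft-computing models such as ANFIS or a GSA-tuned ANN replace this fixed formula with a learned mapping, which is how the paper absorbs multipath effects that make `n` vary across a velodrome.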

  6. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    Science.gov (United States)

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To avoid the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt an automated reverse-engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, it is necessary to address two important issues: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and to speed up the computation, cloud computing is a promising solution: the most popular approach is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results analyzed. They show that our parallel approach can be successfully used to infer networks with desired behaviors and that the computation time can be largely reduced. Parallel population-based algorithms can effectively determine network parameters and perform better than the widely-used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. 
By coupling the parallel model population-based optimization method and the parallel computational framework, high
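
    For orientation, the PSO half of such a GA-PSO hybrid reduces to the standard velocity/position update, sketched below for a single parameter on an invented quadratic fitness. The real framework parallelizes the population across Hadoop MapReduce workers, which is not shown; all parameter values here are assumptions.

```python
import random

def pso(fitness, n_particles=20, iters=100, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer for one parameter.
    Each iteration applies the standard velocity/position update."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                    # personal best positions
    gbest = min(xs, key=fitness)     # global best position
    for _ in range(iters):
        for i in range(n_particles):
            vs[i] = (w * vs[i]
                     + c1 * rng.random() * (pbest[i] - xs[i])
                     + c2 * rng.random() * (gbest - xs[i]))
            xs[i] += vs[i]
            if fitness(xs[i]) < fitness(pbest[i]):
                pbest[i] = xs[i]
                if fitness(xs[i]) < fitness(gbest):
                    gbest = xs[i]
    return gbest

# toy "network parameter" fit: minimize squared error against a target value
best = pso(lambda k: (k - 1.234) ** 2)
print(best)  # converges near 1.234
```

    In the paper's setting, the fitness would instead measure how well simulated gene expression profiles match the GeneNetWeaver-generated data, and the GA operators would periodically recombine particles to delay premature convergence.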

  7. A Computing Environment to Support Repeatable Scientific Big Data Experimentation of World-Wide Scientific Literature

    Energy Technology Data Exchange (ETDEWEB)

    Schlicher, Bob G [ORNL; Kulesz, James J [ORNL; Abercrombie, Robert K [ORNL; Kruse, Kara L [ORNL

    2015-01-01

    A principal tenet of the scientific method is that experiments must be repeatable; it relies on ceteris paribus (i.e., all other things being equal). As a scientific community involved in data science, we must investigate ways to establish an environment where experiments can be repeated. We can no longer merely allude to where the data comes from; we must add rigor to the data collection and management process from which our analysis is conducted. This paper describes a computing environment to support repeatable scientific big data experimentation on world-wide scientific literature, and recommends a system that is housed at the Oak Ridge National Laboratory in order to provide value to investigators from government agencies, academic institutions, and industry entities. The described computing environment also adheres to the recently instituted digital data management plan mandated by multiple US government agencies, which involves all stages of the digital data life cycle including capture, analysis, sharing, and preservation. It particularly focuses on the sharing and preservation of digital research data. The details of this computing environment are explained within the context of cloud services by the three-layer classification of Software as a Service, Platform as a Service, and Infrastructure as a Service.

  8. Surface and interfacial interactions of multilayer graphitic structures with local environment

    International Nuclear Information System (INIS)

    Mazzocco, R.; Robinson, B.J.; Rabot, C.; Delamoreanu, A.; Zenasni, A.; Dickinson, J.W.; Boxall, C.; Kolosov, O.V.

    2015-01-01

    In order to exploit the potential of graphene in next-generation devices, such as supercapacitors, rechargeable batteries, displays and ultrathin sensors, it is crucial to understand the solvent interactions with the graphene surface and interlayers, especially where the latter may be in competition with the former, in the medium of application deployment. In this report, we combine quartz crystal microbalance (QCM) and ultrasonic force microscopy methods to investigate the changes in the film–substrate and film–environment interfaces of graphene and graphene oxide films, produced by diverse scalable routes, in both polar (deionised water) and non-polar (dodecane) liquid and vapour environments. In polar liquid environments, we observe nanobubble adsorption/desorption on the graphene film corresponding to a surface coverage of up to 20%. As no comparable behaviour is observed in the non-polar environment, we conclude that nanobubble formation is directly due to the hydrophobic nature of graphene, with direct consequences for electrode structures immersed in electrolyte solutions. The amount of water adsorbed by the graphene films was found to vary considerably, from 0.012 monolayers of water per monolayer of reduced graphene oxide to 0.231 monolayers of water per monolayer of carbon-diffusion-growth graphene. This is supported by direct nanomechanical mapping of the films immersed in water, where an increased variation of local stiffness suggests water propagation within the film and/or between the film and substrate. Transferred-film thickness calculations performed for QCM, atomic force microscopy topography and optical transmission measurements return results an order of magnitude larger (46 ± 1 layers) than Raman spectroscopy (1–2 graphene layers) on pristine pre-transferred films, due to contamination during transfer and possible turbostratic structures over large areas. - Highlights: • Exploring interaction of graphene films with polar and nonpolar liquids

  9. Fast mapping of the local environment of an autonomous mobile robot

    International Nuclear Information System (INIS)

    Fanton, Herve

    1989-01-01

    The construction of a map of the local world for the navigation of an autonomous mobile robot leads to the following problem: how to extract from the sensor data information accurate and reliable enough to plan a path, in a way that enables a reasonable displacement speed. The choice was made neither to tele-operate the vehicle nor to design any custom architecture, so the only way to meet the computational cost is to look for the most efficient sensor-algorithms-architecture combination. A good solution is described in this study, using a laser range-finder, a grid model of the world, and both SIMD and MIMD parallel processors. A short review of some possible approaches is made first; the mapping algorithms are then described, as are the parallel implementations with the corresponding speedup and efficiency factors. (author) [fr

  10. Secure encapsulation and publication of biological services in the cloud computing environment.

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved.

  11. The KCLBOT: Exploiting RGB-D Sensor Inputs for Navigation Environment Building and Mobile Robot Localization

    Directory of Open Access Journals (Sweden)

    Evangelos Georgiou

    2011-09-01

    Full Text Available This paper presents an alternative approach to implementing a stereo camera configuration for SLAM. The suggested approach implements a simplified method using a single RGB-D camera sensor, mounted on a maneuverable non-holonomic mobile robot, the KCLBOT, for extracting image feature depth information while maneuvering. Using a defined quadratic equation, based on the calibration of the camera, a depth computation model is derived based on the HSV color space map. Using this methodology it is possible to build navigation environment maps and carry out autonomous mobile robot path following and obstacle avoidance. This paper presents a calculation model which enables distance estimation using the Microsoft RGB-D sensor from a .NET Micro Framework device. Experimental results are presented to validate the distance estimation methodology.
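The calibrated quadratic depth equation itself is not reproduced in the abstract. A hypothetical version, in which depth is fitted as a quadratic function of a raw hue-mapped sensor value by least squares, might look like this (all calibration numbers are invented for illustration):

```python
import numpy as np

# Hypothetical calibration pairs: raw hue-mapped sensor values vs.
# measured distances in metres (invented, not the paper's data).
raw = np.array([10.0, 20.0, 40.0, 80.0, 120.0, 160.0])
dist = np.array([0.5, 0.8, 1.3, 2.6, 4.1, 6.0])

# Fit depth = a*raw^2 + b*raw + c, i.e. the "defined quadratic equation".
a, b, c = np.polyfit(raw, dist, deg=2)

def depth(raw_value):
    """Estimate distance from a raw RGB-D reading via the fitted quadratic."""
    return a * raw_value**2 + b * raw_value + c

print(round(float(depth(100.0)), 2))
```

Once fitted offline, the three coefficients are cheap enough to evaluate per pixel even on a resource-constrained microcontroller-class device.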

  12. Computationally efficient near-field source localization using third-order moments

    Science.gov (United States)

    Chen, Jian; Liu, Guohong; Sun, Xiaoying

    2014-12-01

    In this paper, a third-order moment-based estimation of signal parameters via rotational invariance techniques (ESPRIT) algorithm is proposed for passive localization of near-field sources. By properly choosing sensor outputs of the symmetric uniform linear array, two special third-order moment matrices are constructed, in which the steering matrix is a function of the electric angle γ, while the rotational factor is a function of the electric angles γ and ϕ. With the singular value decomposition (SVD) operation, all directions of arrival (DOAs) are estimated via polynomial rooting. After substituting the DOA information into the steering matrix, the rotational factor is determined via total least squares (TLS), and the related range estimations are performed. Compared with the high-order ESPRIT method, the proposed algorithm requires a lower computational burden, and it avoids the parameter-match procedure. Computer simulations are carried out to demonstrate the performance of the proposed algorithm.
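The TLS step used above to recover the rotational factor can be illustrated in its generic, textbook form via the SVD of the augmented matrix [A | b]; the matrices below are random stand-ins, not the paper's third-order moment matrices:

```python
import numpy as np

def tls(A, b):
    """Total least squares solution of A x ~= b, allowing errors in both A and b.
    Uses the right singular vector of [A | b] for the smallest singular value."""
    Ab = np.column_stack([A, b])
    _, _, vt = np.linalg.svd(Ab)
    v = vt[-1]
    return -v[:-1] / v[-1]

rng = np.random.default_rng(1)
x_true = np.array([2.0, -1.0])
A = rng.standard_normal((50, 2))
b = A @ x_true
# Perturb both sides slightly, the situation TLS is designed for.
A_noisy = A + 1e-3 * rng.standard_normal(A.shape)
b_noisy = b + 1e-3 * rng.standard_normal(b.shape)
print(tls(A_noisy, b_noisy))  # close to [2, -1]
```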

  13. General rigid motion correction for computed tomography imaging based on locally linear embedding

    Science.gov (United States)

    Chen, Mianyi; He, Peng; Feng, Peng; Liu, Baodong; Yang, Qingsong; Wei, Biao; Wang, Ge

    2018-02-01

    Patient motion can degrade the quality of computed tomography images, which are typically acquired in cone-beam geometry. Rigid patient motion is characterized by six geometric parameters and is more challenging to correct in cone-beam than in fan-beam geometry. We extend our previous rigid patient motion correction method, based on the principle of locally linear embedding (LLE), from fan-beam to cone-beam geometry and accelerate the computational procedure with the graphics processing unit (GPU)-based All Scale Tomographic Reconstruction Antwerp toolbox. The major merit of our method is that we need neither fiducial markers nor motion-tracking devices. The numerical and experimental studies show that the LLE-based patient motion correction is capable of calibrating the six parameters of the patient motion simultaneously, reducing patient motion artifacts significantly.
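At the heart of LLE is solving, for each sample, a small constrained least-squares problem for the weights that reconstruct it from its neighbours. A generic sketch of that step follows (this is textbook LLE, not the authors' GPU-accelerated implementation, and the regularization constant is an assumption):

```python
import numpy as np

def lle_weights(x, neighbors, reg=1e-3):
    """Solve for weights w (summing to 1) that best reconstruct x
    as a linear combination of its neighbours: the core LLE step."""
    z = neighbors - x                  # shift neighbours to be centred on x
    g = z @ z.T                        # local Gram matrix
    g = g + reg * np.trace(g) * np.eye(len(neighbors))  # regularise
    w = np.linalg.solve(g, np.ones(len(neighbors)))
    return w / w.sum()                 # enforce the sum-to-one constraint

x = np.array([0.5, 0.5])
nbrs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
w = lle_weights(x, nbrs)
print(w, nbrs.T @ w)  # weights reconstruct x from its neighbours
```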

  14. Probing the local environment of a single OPE3 molecule using inelastic tunneling electron spectroscopy.

    Science.gov (United States)

    Frisenda, Riccardo; Perrin, Mickael L; van der Zant, Herre S J

    2015-01-01

    We study single-molecule oligo(phenylene ethynylene)dithiol junctions by means of inelastic electron tunneling spectroscopy (IETS). The molecule is contacted with gold nano-electrodes formed with the mechanically controllable break junction technique. We record the IETS spectrum of the molecule from direct current measurements, both as a function of time and electrode separation. We find that for fixed electrode separation the molecule switches between various configurations, which are characterized by different IETS spectra. Similar variations in the IETS signal are observed during atomic rearrangements upon stretching of the molecular junction. Using quantum chemistry calculations, we identify some of the vibrational modes which constitute a chemical fingerprint of the molecule. In addition, changes can be attributed to rearrangements of the local molecular environment, in particular at the molecule-electrode interface. This study shows the importance of taking into account the interaction with the electrodes when describing inelastic contributions to transport through single-molecule junctions.

  15. A Stable Metal-Organic Framework Featuring a Local Buffer Environment for Carbon Dioxide Fixation.

    Science.gov (United States)

    He, Hongming; Sun, Qi; Gao, Wenyang; Perman, Jason A; Sun, Fuxing; Zhu, Guangshan; Aguila, Briana; Forrest, Katherine; Space, Brian; Ma, Shengqian

    2018-04-16

    A majority of metal-organic frameworks (MOFs) fail to preserve their physical and chemical properties after exposure to acidic, neutral, or alkaline aqueous solutions, therefore limiting their practical applications in many areas. The strategy demonstrated herein is the design and synthesis of an organic ligand that behaves as a buffer to drastically boost the aqueous stability of a porous MOF (JUC-1000), which maintains its structural integrity at low and high pH values. The local buffer environment resulting from the weak acid-base pairs of the custom-designed organic ligand also greatly facilitates the performance of JUC-1000 in the chemical fixation of carbon dioxide under ambient conditions, outperforming a series of benchmark catalysts. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Xipeng [North Carolina State Univ., Raleigh, NC (United States)

    2016-04-27

    The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues for exascale computing. In the first three years, the PI and his group have achieved significant progress towards this goal, producing a set of novel techniques for improving the memory performance and data locality in manycore systems, yielding 18 conference and workshop papers and 4 journal papers, and graduating 6 Ph.D. students. This report summarizes the research results of this project through that period.

  17. Computer local construction of a general solution for the Chew-Low equations

    International Nuclear Information System (INIS)

    Gerdt, V.P.

    1980-01-01

    The general solution of the dynamic form of the Chew-Low equations in the vicinity of the rest point is considered. A method for calculating the coefficients of the series that make up this solution is suggested. The results of calculations of the power-series coefficients and expansions, carried out by means of the SCHOONSCHIP and SYMBAL systems, are given. It is noted that the suggested procedure for solving the Chew-Low equations, using an electronic computer as an instrument for analytical calculations, permits obtaining detailed information on the local structure of the general solution.
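Computing series coefficients around a rest point by recurrence is the kind of analytical calculation that systems such as SCHOONSCHIP and SYMBAL automate. As a toy stand-in (the ODE below is illustrative, not the Chew-Low system), the exact rational coefficients of the series solution of y' = x + y with y(0) = 0 can be generated like this:

```python
from fractions import Fraction

def series_coeffs(n_terms):
    """Coefficients a_n of y = sum a_n x^n solving y' = x + y, y(0) = 0,
    via the recurrence (n+1) a_{n+1} = a_n + [n == 1], computed exactly."""
    a = [Fraction(0)]
    for n in range(n_terms - 1):
        rhs = a[n] + (1 if n == 1 else 0)
        a.append(Fraction(rhs, n + 1))
    return a

print([str(c) for c in series_coeffs(6)])  # -> ['0', '0', '1/2', '1/6', '1/24', '1/120']
```

The exact solution is y = e^x − x − 1, whose Taylor coefficients match the recurrence output, which is the kind of local-structure information the abstract refers to.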

  18. Robust localisation of automated guided vehicles for computer-integrated manufacturing environments

    Directory of Open Access Journals (Sweden)

    Dixon, R. C.

    2013-05-01

    Full Text Available As industry moves toward an era of complete automation and mass customisation, automated guided vehicles (AGVs) are used as material handling systems. However, the current techniques that provide navigation, control, and manoeuvrability of automated guided vehicles threaten to create bottlenecks and inefficiencies in manufacturing environments that strive towards the optimisation of part production. This paper proposes a decentralised localisation technique for an automated guided vehicle without any non-holonomic constraints. Incorporation of these vehicles into the material handling system of a computer-integrated manufacturing environment would increase robustness, efficiency, flexibility, and manoeuvrability.

  19. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume aims at a wide range of readers and researchers in the area of Big Data, presenting the recent advances in the field of Big Data Analysis, as well as the techniques and tools used to analyze it. The book includes 10 distinct chapters providing a concise introduction to Big Data Analysis and recent techniques and environments for Big Data Analysis. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in big data analysis by adopting parallel, grid, and cloud computing environments.

  20. Human resources of local governments as motivators of participation of businesses and citizens in protecting of environment

    OpenAIRE

    NIKOLIĆ N.; GAJOVIĆ A.; PAUNOVIĆ V.

    2015-01-01

    This paper discusses the importance of the human resources of local governments in motivating businesses and citizens to protect the environment. The failure to absorb current problems, caused by the inadequate and incomplete organization of the human resources of the local government of Lučani, prompted a redefinition of the strategic priorities of environmental protection. The motivational power of human resources of local governments expressed through interaction with the population ...

  1. Acoustic sources of opportunity in the marine environment - Applied to source localization and ocean sensing

    Science.gov (United States)

    Verlinden, Christopher M.

    Controlled acoustic sources have typically been used for imaging the ocean. These sources can either be used to locate objects or characterize the ocean environment. The processing involves signal extraction in the presence of ambient noise, with shipping being a major component of the latter. With the advent of the Automatic Identification System (AIS) which provides accurate locations of all large commercial vessels, these major noise sources can be converted from nuisance to beacons or sources of opportunity for the purpose of studying the ocean. The source localization method presented here is similar to traditional matched field processing, but differs in that libraries of data-derived measured replicas are used in place of modeled replicas. In order to account for differing source spectra between library and target vessels, cross-correlation functions are compared instead of comparing acoustic signals directly. The library of measured cross-correlation function replicas is extrapolated using waveguide invariant theory to fill gaps between ship tracks, fully populating the search grid with estimated replicas allowing for continuous tracking. In addition to source localization, two ocean sensing techniques are discussed in this dissertation. The feasibility of estimating ocean sound speed and temperature structure, using ship noise across a drifting volumetric array of hydrophones suspended beneath buoys, in a shallow water marine environment is investigated. Using the attenuation of acoustic energy along eigenray paths to invert for ocean properties such as temperature, salinity, and pH is also explored. In each of these cases, the theory is developed, tested using numerical simulations, and validated with data from acoustic field experiments.
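The idea of comparing cross-correlation functions rather than raw signals can be shown with a toy matched-library search. The two-hydrophone geometry, the sample-delay model, and the dot-product match score below are invented for illustration; they are not the dissertation's actual processing chain:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512

def two_channel(source, delay):
    """Toy model: two hydrophones, channel 2 lags channel 1 by `delay` samples."""
    return source, np.roll(source, delay)

def xcorr(a, b):
    """Full cross-correlation between two hydrophone channels."""
    return np.correlate(a, b, mode="full")

# Library of measured replica cross-correlations from "library ships"
# at known candidate geometries (here parameterised by the delay).
lib_src = rng.standard_normal(n)
library = {d: xcorr(*two_channel(lib_src, d)) for d in range(-5, 6)}

# A target ship with a *different* source signal but the same geometry
# (delay 3): comparing cross-correlations rather than raw signals makes
# the match insensitive to the differing source waveform.
tgt_src = rng.standard_normal(n)
target = xcorr(*two_channel(tgt_src, 3))

best = max(library, key=lambda d: float(np.dot(library[d], target)))
print(best)  # recovers the true delay of 3
```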

  2. Performative Environments

    DEFF Research Database (Denmark)

    Thomsen, Bo Stjerne

    2008-01-01

    The paper explores how performative architecture can act as a collective environment localizing urban flows and establishing public domains through the integration of pervasive computing and animation techniques. The NoRA project introduces the concept of ‘performative environments,' focusing on ... of local interactions and network behaviour; the building becomes social infrastructure and prompts an understanding of architectural structures as quasi-objects, which can retain both variation and recognisability in changing social constellations.

  3. A computational environment for creating and testing reduced chemical kinetic mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Montgomery, C.J.; Swensen, D.A.; Harding, T.V.; Cremer, M.A.; Bockelie, M.J. [Reaction Engineering International, Salt Lake City, UT (USA)

    2002-02-01

    This paper describes software called the computer-assisted reduced mechanism problem solving environment (CARM-PSE), which gives the engineer the ability to rapidly set up, run and examine large numbers of problems comparing detailed and reduced (approximate) chemistry. CARM-PSE integrates the automatic chemical mechanism reduction code CARM and the codes that simulate perfectly stirred reactors and plug flow reactors into a user-friendly computational environment. CARM-PSE gives the combustion engineer the ability to easily test chemical approximations over many hundreds of combinations of inputs in a multidimensional parameter space. The demonstration problems compare detailed and reduced chemical kinetic calculations for methane-air combustion, including nitrogen oxide formation, in a stirred reactor, and for selective non-catalytic reduction of NOx in coal combustion flue gas.

  4. Method and system for rendering and interacting with an adaptable computing environment

    Science.gov (United States)

    Osbourn, Gordon Cecil [Albuquerque, NM; Bouchard, Ann Marie [Albuquerque, NM

    2012-06-12

    An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.

  5. Bound on quantum computation time: Quantum error correction in a critical environment

    International Nuclear Information System (INIS)

    Novais, E.; Mucciolo, Eduardo R.; Baranger, Harold U.

    2010-01-01

    We obtain an upper bound on the time available for quantum computation for a given quantum computer and decohering environment with quantum error correction implemented. First, we derive an explicit quantum evolution operator for the logical qubits and show that it has the same form as that for the physical qubits but with a reduced coupling strength to the environment. Using this evolution operator, we find the trace distance between the real and ideal states of the logical qubits in two cases. For a super-Ohmic bath, the trace distance saturates, while for Ohmic or sub-Ohmic baths, there is a finite time before the trace distance exceeds a value set by the user.
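The trace distance used as the closeness criterion above has a direct numerical form, D(ρ, σ) = ½‖ρ − σ‖₁. A minimal sketch for density matrices follows; the example states are generic and not tied to the paper's bath model:

```python
import numpy as np

def trace_distance(rho, sigma):
    """D(rho, sigma) = 0.5 * ||rho - sigma||_1: for a Hermitian difference,
    the trace norm is the sum of the absolute values of its eigenvalues."""
    diff = rho - sigma
    return 0.5 * np.sum(np.abs(np.linalg.eigvalsh(diff)))

ket0 = np.array([[1.0], [0.0]])
ket1 = np.array([[0.0], [1.0]])
rho_ideal = ket0 @ ket0.T                           # |0><0|
rho_real = 0.9 * rho_ideal + 0.1 * (ket1 @ ket1.T)  # slightly decohered state

print(trace_distance(rho_ideal, rho_real))  # ~0.1
```

Tracking this quantity over time for the real versus ideal logical state is what yields a computation-time bound: once it exceeds the user's threshold, the result can no longer be trusted.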

  6. Exciton center-of-mass localization and dielectric environment effect in monolayer WS2

    Science.gov (United States)

    Hichri, Aïda; Ben Amara, Imen; Ayari, Sabrine; Jaziri, Sihem

    2017-06-01

    Ultrathin transition metal dichalcogenides (TMDs) have emerged as promising materials for various applications using two-dimensional semiconductors. They have attracted increasing attention due to their unique optical properties, which originate from neutral and charged excitons. In this paper, we study the strong localization of exciton center-of-mass motion within random potential fluctuations caused by monolayer defects. We report negatively charged exciton formation in monolayer TMDs, notably tungsten disulfide WS2. Our theory is based on an effective mass model of neutral and charged excitons, parameterized by ab initio calculations. Taking into account the strong correlation between the monolayer WS2 and the surrounding dielectric environment, our theoretical results are in good agreement with one-photon photoluminescence (PL) and reflectivity measurements. We also show that the exciton state with p-symmetry, experimentally observed by two-photon PL emission, is energetically below the 2s state. We use the equilibrium mass action law to quantify the relative weight of exciton and trion PL, and show that exciton and trion emission can be tuned and controlled by external parameters such as temperature, pumping, and electron injection. Finally, in comparison with experimental measurements, we show that exciton emission in monolayer tungsten dichalcogenides is substantially reduced. This feature suggests that free excitons can be trapped in disordered potential wells to form localized excitons, and therefore offers a route toward novel optical properties.

  7. Local Fitness Landscapes Predict Yeast Evolutionary Dynamics in Directionally Changing Environments.

    Science.gov (United States)

    Gorter, Florien A; Aarts, Mark G M; Zwaan, Bas J; de Visser, J Arjan G M

    2018-01-01

    The fitness landscape is a concept that is widely used for understanding and predicting evolutionary adaptation. The topography of the fitness landscape depends critically on the environment, with potentially far-reaching consequences for evolution under changing conditions. However, few studies have assessed directly how empirical fitness landscapes change across conditions, or validated the predicted consequences of such change. We previously evolved replicate yeast populations in the presence of either gradually increasing, or constant high, concentrations of the heavy metals cadmium (Cd), nickel (Ni), and zinc (Zn), and analyzed their phenotypic and genomic changes. Here, we reconstructed the local fitness landscapes underlying adaptation to each metal by deleting all repeatedly mutated genes both by themselves and in combination. Fitness assays revealed that the height, and/or shape, of each local fitness landscape changed considerably across metal concentrations, with distinct qualitative differences between unconditionally (Cd) and conditionally toxic metals (Ni and Zn). This change in topography had particularly crucial consequences in the case of Ni, where a substantial part of the individual mutational fitness effects changed in sign across concentrations. Based on the Ni landscape analyses, we made several predictions about which mutations had been selected when during the evolution experiment. Deep sequencing of population samples from different time points generally confirmed these predictions, demonstrating the power of landscape reconstruction analyses for understanding and ultimately predicting evolutionary dynamics, even under complex scenarios of environmental change. Copyright © 2018 by the Genetics Society of America.

  8. Sound localization and word discrimination in reverberant environment in children with developmental dyslexia

    Directory of Open Access Journals (Sweden)

    Wendy Castro-Camacho

    2015-04-01

    Full Text Available Objective: To compare whether localization of sounds and word discrimination in a reverberant environment differ between children with dyslexia and controls. Method: We studied 30 children with dyslexia and 30 controls. Sound and word localization and discrimination were studied at five angles from the left to the right auditory field (−90°, −45°, 0°, +45°, +90°), under reverberant and non-reverberant conditions; correct answers were compared. Results: Spatial localization of words in the non-reverberant test was deficient in children with dyslexia at 0° and +90°. Spatial localization in the reverberant test was altered in children with dyslexia at all angles except −90°. Word discrimination in the non-reverberant test showed poor performance at left angles in children with dyslexia. In the reverberant test, children with dyslexia exhibited deficiencies at the −45°, −90°, and +45° angles. Conclusion: Children with dyslexia may have problems locating sounds and discriminating words at extreme locations of the horizontal plane in classrooms with reverberation.

  9. Dust Evolution in Low-Metallicity Environments: Bridging the Gap Between Local Universe and Primordial Galaxies

    Science.gov (United States)

    Galliano, Frederic; Barlow, Mike; Bendo, George; Boselli, Alessandro; Buat, Veronique; Chanial, Pierre; Clements, David; Davies, Jon; Eales, Steve; Gomez, Haley; Isaak, Kate; Madden, Suzanne; Page, Mathew; Perez Fournon, Ismael; Sauvage, Marc; Spinoglio, Luigi; Vaccari, Mattia; Wilson, Christine

    2008-03-01

    The local galaxy Science Advisory Group (SAG 2) in the Herschel/SPIRE consortium has constructed a Guaranteed Time Key Program using the PACS and SPIRE instruments to obtain 60 to 550 micron photometry of a statistically significant sample of 51 dwarf galaxies in our local universe, chosen to cover an impressively broad range of physical conditions. Here we propose the necessary complementary IRAC, MIPS and IRS Spitzer observations which, together with the Herschel GT database, will provide a rich database to the community to perform dust and gas analyses in unprecedented detail in low-metallicity galaxies ranging between 1/50 and 1 solar metallicity. Due to their chemical youth, and to the extreme conditions they experience, low-metallicity environments constitute a keystone to understanding dust evolution. The primary goal of this combined Herschel and Spitzer project is to study in detail the physical processes at play within the ISM of these galaxies. We will take advantage of the powerful combination of Spitzer, Herschel and ancillary data to decompose the SED into the emission coming from the main phases of the ISM. Such a decomposition will provide reliable estimates of the abundances of the principal dust species, as a function of metallicity and physical conditions. These results will be exploited to compare the various evolutionary processes affecting the dust content of galaxies. All these outstanding scientific advances will be the true legacy value that this project brings to the community.

  10. Development of an Acoustic Localization Method for Cavitation Experiments in Reverberant Environments

    Science.gov (United States)

    Ranjeva, Minna; Thompson, Lee; Perlitz, Daniel; Bonness, William; Capone, Dean; Elbing, Brian

    2011-11-01

    Cavitation is a major concern for the US Navy, since it can cause ship damage and produce unwanted noise. The ability to precisely locate cavitation onset in laboratory-scale experiments is essential for proper design that will minimize this undesired phenomenon. Cavitation onset is more accurately determined acoustically than visually. However, if other parts of the model begin to cavitate prior to the component of interest, the acoustic data is contaminated with spurious noise; consequently, cavitation onset is widely determined by optically locating the event of interest. The current research effort aims at developing an acoustic localization scheme for reverberant environments such as water tunnels. Currently, cavitation bubbles are induced in a static water tank with a laser, allowing the localization techniques to be refined with the bubble at a known location. The source is located with the use of acoustic data collected with hydrophones and analyzed using signal processing techniques. To verify the accuracy of the acoustic scheme, the events are simultaneously monitored visually with a high-speed camera. Once the technique is refined, testing will be conducted in a water tunnel. This research was sponsored by the Naval Engineering Education Center (NEEC).
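Locating an acoustic event from hydrophone arrival-time differences can be sketched as a least-squares search over candidate positions. The square hydrophone geometry, tank size, and grid-search solver below are illustrative assumptions, not the experiment's actual array or processing:

```python
import numpy as np

c = 1482.0  # nominal speed of sound in water, m/s
hydrophones = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_src = np.array([0.62, 0.35])   # "unknown" bubble position (for the demo)

# Simulated measured time differences of arrival relative to hydrophone 0
dist = np.linalg.norm(hydrophones - true_src, axis=1)
tdoa = (dist - dist[0]) / c

def mismatch(p):
    """Sum of squared TDOA residuals for a candidate source position p."""
    d = np.linalg.norm(hydrophones - p, axis=1)
    return float(np.sum(((d - d[0]) / c - tdoa) ** 2))

# Coarse grid search over the tank cross-section (1 cm resolution)
xs = np.linspace(0.0, 1.0, 101)
grid = [(x, y) for x in xs for y in xs]
est = min(grid, key=lambda p: mismatch(np.array(p)))
print(est)  # close to (0.62, 0.35)
```

In a reverberant tunnel the measured delays would first have to be extracted from cross-correlations with multipath rejection, which is the hard part the research above addresses.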

  11. Learning environment simulator: a tool for local decision makers and first responders

    Energy Technology Data Exchange (ETDEWEB)

    Leclaire, Rene J [Los Alamos National Laboratory; Hirsch, Gary B [CLE, INCORPORATED

    2009-01-01

    The National Infrastructure Simulation and Analysis Center (NISAC) has developed a prototype learning environment simulator (LES) based on the Critical Infrastructure Protection Decision Support System (CIPDSS) infrastructure and scenario models. The LES is designed to engage decision makers at the grass-roots level (local/city/state) to deepen their understanding of an evolving crisis, enhance their intuition, and allow them to test their own strategies for events before they occur. An initial version is being developed, centered on a pandemic influenza outbreak, and has been successfully tested with a group of hospital administrators and first responders. LES is not a predictive tool but rather a simulated environment allowing the user to experience the complexities of a crisis before it happens. Users can contrast various approaches to the crisis, competing with alternative strategies of their own or of other participants. LES is designed to assist decision makers in making informed choices by functionally representing relevant scenarios before they occur, including impacts to critical infrastructures with their interdependencies, and estimating human health and safety and economic impacts. In this paper a brief overview of the underlying models is given, followed by a description of the LES, its interface and usage, and an overview of the experience of testing LES with a group of hospital administrators and first responders. The paper concludes with a brief discussion of the work remaining to make LES operational.

  12. Mid-late Holocene environments of Agua Buena locality (34°50'S; 69°56'W), Mendoza, Argentina

    International Nuclear Information System (INIS)

    Navarro, Diego; Paez, M M; Mehl, A; Zarate, M A

    2010-01-01

    In southern South America, the acquisition of high-quality Holocene paleoclimate data is a priority, due to the paucity of complete, continuous and well-dated records. Here we report preliminary results from a combined sedimentological and palynological study of an alluvial fan sequence and the laterally connected sedimentary deposits of the Vega de la Cueva profile at Agua Buena, east of the Andes in central Argentina. The main geomorphological units of the area were identified and mapped based on satellite image analysis and multiple field surveys. The sedimentological and pollen results allowed us to reconstruct the development of the local environments. The Agua Buena record corresponds to the distal facies of the Arroyo Bayo alluvial fan, whose aggradation began prior to ca. 4100 cal yr BP. The organic-rich levels found were formed during the development of wetlands (vegas) dominated by Cyperaceae, Juncaceae and Poaceae. These highly productive environments, with almost permanent water saturation, were important between 4100 and 2800 cal yr BP, indicating more stable conditions. After 2800 cal yr BP, the organic content was comparatively lower, with increasing sedimentation rates that are indicative of higher fluvial discharges. This information is fundamental to interpreting both the pollen and charcoal records of the area and to evaluating their representativeness and potential to reconstruct past local and/or regional vegetation.

  13. Running climate model on a commercial cloud computing environment: A case study using Community Earth System Model (CESM) on Amazon AWS

    Science.gov (United States)

    Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock

    2017-01-01

    The suites of numerical models used for simulating the climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e. carrying out climate model simulations in a commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Services (AWS) EC2, the cloud computing environment of Amazon.com, Inc. StarCluster is used to create a virtual computing cluster on AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent on the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup plateaus.
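
    The scaling behavior reported above can be summarized with the standard speedup and parallel-efficiency metrics. The timings below are illustrative placeholders, not measurements from the study:

```python
def speedup(t_serial: float, t_parallel: float) -> float:
    """Classic parallel speedup S(p) = T(1) / T(p)."""
    return t_serial / t_parallel

def efficiency(t_serial: float, t_parallel: float, cores: int) -> float:
    """Parallel efficiency E(p) = S(p) / p; near 1.0 means near-linear scaling."""
    return speedup(t_serial, t_parallel) / cores

# Hypothetical wall-clock hours per simulated year; NOT numbers from the paper.
timings = {16: 8.0, 32: 4.2, 64: 2.3, 128: 2.2}
t1 = timings[16] * 16  # crude serial-time estimate from the smallest run

for cores, t in sorted(timings.items()):
    print(f"{cores:4d} cores: speedup {speedup(t1, t):6.1f}, "
          f"efficiency {efficiency(t1, t, cores):5.2f}")
```

    With numbers like these, efficiency stays high through 64 cores and collapses at 128, mirroring the plateau the abstract describes.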

  14. A Drawing and Multi-Representational Computer Environment for Beginners' Learning of Programming Using C: Design and Pilot Formative Evaluation

    Science.gov (United States)

    Kordaki, Maria

    2010-01-01

    This paper presents both the design and the pilot formative evaluation study of a computer-based problem-solving environment (named LECGO: Learning Environment for programming using C using Geometrical Objects) for the learning of computer programming using C by beginners. In its design, constructivist and social learning theories were taken into…

  15. Effects Of Social Networking Sites (SNSs) On Hyper Media Computer Mediated Environments (HCMEs)

    OpenAIRE

    Yoon C. Cho

    2011-01-01

    Social Networking Sites (SNSs) are known as tools to interact and build relationships between users/customers in Hyper Media Computer Mediated Environments (HCMEs). This study explored how social networking sites play a significant role in communication between users. While numerous researchers examined the effectiveness of social networking websites, few studies investigated which factors affected customers' attitudes and behavior toward social networking sites. In this paper, the authors inv...

  16. Do Social Computing Make You Happy? A Case Study of Nomadic Children in Mixed Environments

    DEFF Research Database (Denmark)

    Christensen, Bent Guldbjerg

    2005-01-01

    In this paper I describe a perspective on ambient, ubiquitous, and pervasive computing called the happiness perspective. By using the happiness perspective, the application domain and how the technology is used and experienced, becomes a central and integral part of perceiving ambient technology....... will use the perspective in a case study on field test experiments with nomadic children in mixed environments using the eBag system....

  17. Improving Communicative Competence through Synchronous Communication in Computer-Supported Collaborative Learning Environments: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Xi Huang

    2018-01-01

    Full Text Available Computer-supported collaborative learning facilitates the extension of second language acquisition into social practice. Studies on its achievement effects speak directly to the pedagogical notion of treating communicative practice in synchronous computer-mediated communication (SCMC: real-time communication that takes place between human beings via the instrumentality of computers in forms of text, audio and video communication, such as live chat and chatrooms as socially-oriented meaning construction. This review begins by considering the adoption of social interactionist views to identify key paradigms and supportive principles of computer-supported collaborative learning. A special focus on two components of communicative competence is then presented to explore interactional variables in synchronous computer-mediated communication along with a review of research. There follows a discussion on a synthesis of interactional variables in negotiated interaction and co-construction of knowledge from psycholinguistic and social cohesion perspectives. This review reveals both possibilities and disparities of language socialization in promoting intersubjective learning and diversifying the salient use of interactively creative language in computer-supported collaborative learning environments in service of communicative competence.

  18. Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment

    Science.gov (United States)

    Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.

    2013-12-01

    Dust storms have serious negative impacts on the environment, human health, and assets. Continuing global climate change has increased the frequency and intensity of dust storms in recent decades. To better understand and predict the distribution, intensity and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust) and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The development and application of these models have contributed significantly to both scientific research and our daily life. However, dust storm simulation is a data- and computing-intensive process. Normally, a simulation of a single dust storm event may take several hours or even days to run, which seriously impacts the timeliness of prediction and potential applications. To speed up the process, high performance computing is widely adopted. By partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in a parallel fashion, the computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with other geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communications among computing nodes. Therefore, the task allocation method is the key factor affecting the feasibility of parallelization. The allocation algorithm needs to carefully balance the computing cost and communication cost for each computing node to minimize total execution time and reduce overall communication cost for the entire system. This presentation introduces two algorithms for such allocation and compares them with an evenly distributed allocation method.
Specifically, 1) In order to get optimized solutions, a
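
    The abstract does not specify the two allocation algorithms it introduces. As a point of comparison, a minimal sketch of the classic longest-processing-time (LPT) greedy heuristic, assuming per-subdomain cost estimates are available and ignoring communication cost entirely (names and numbers are hypothetical):

```python
import heapq

def lpt_allocate(subdomain_costs, n_nodes):
    """Longest-Processing-Time-first: assign each subdomain, heaviest
    first, to the currently least-loaded node. Returns (assignment,
    per-node loads sorted ascending)."""
    loads = [(0.0, node) for node in range(n_nodes)]
    heapq.heapify(loads)
    assignment = {}
    for sub, cost in sorted(subdomain_costs.items(), key=lambda kv: -kv[1]):
        load, node = heapq.heappop(loads)   # least-loaded node so far
        assignment[sub] = node
        heapq.heappush(loads, (load + cost, node))
    return assignment, sorted(load for load, _ in loads)

# Six subdomains with estimated compute costs, allocated to three nodes.
costs = {"d0": 9.0, "d1": 7.0, "d2": 4.0, "d3": 4.0, "d4": 3.0, "d5": 3.0}
assignment, loads = lpt_allocate(costs, 3)
```

    A production allocator for this problem would additionally weigh the communication cost between geographically adjacent subdomains, which LPT ignores.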

  19. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm

    Science.gov (United States)

    Abdulhamid, Shafi’i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on demand via a front-end interface. Scientific application scheduling in the cloud computing environment is identified as an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of application scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, a scientific application scheduling technique using the Global League Championship Algorithm (GBLCA) optimization technique is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produced a remarkable improvement in makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, as parametrically measured in terms of response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution, suitable for scientific application task execution in the cloud computing environment, than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques. PMID:27384239

  20. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm.

    Science.gov (United States)

    Abdulhamid, Shafi'i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on demand via a front-end interface. Scientific application scheduling in the cloud computing environment is identified as an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of application scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, a scientific application scheduling technique using the Global League Championship Algorithm (GBLCA) optimization technique is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produced a remarkable improvement in makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, as parametrically measured in terms of response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution, suitable for scientific application task execution in the cloud computing environment, than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques.
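
    Min-Min is one of the baselines the abstract compares against, and its mechanics are standard. A minimal sketch, assuming an ETC (expected time to compute) matrix as input (the matrix values here are hypothetical):

```python
def min_min(etc):
    """Min-Min heuristic. etc[t][v] = execution time of task t on VM v.
    Repeatedly schedule the task with the smallest achievable completion
    time, on the VM that achieves it. Returns (schedule, makespan)."""
    n_vms = len(etc[0])
    ready = [0.0] * n_vms            # time at which each VM becomes free
    pending = set(range(len(etc)))
    schedule = []
    while pending:
        finish, t, v = min((ready[v] + etc[t][v], t, v)
                           for t in pending for v in range(n_vms))
        ready[v] = finish
        pending.remove(t)
        schedule.append((t, v))
    return schedule, max(ready)

etc = [[4.0, 6.0],   # task 0 on VM 0 / VM 1
       [3.0, 5.0],   # task 1
       [5.0, 2.0]]   # task 2
schedule, makespan = min_min(etc)
```

    Metaheuristics such as GBLCA search over many candidate schedules instead of committing greedily, which is how they can beat Min-Min on makespan.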

  1. Study on User Authority Management for Safe Data Protection in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Su-Hyun Kim

    2015-03-01

    Full Text Available In cloud computing environments, user data are encrypted using numerous distributed servers before storing such data. Global Internet service companies, such as Google and Yahoo, recognized the importance of Internet service platforms and conducted self-research and development to create and utilize large cluster-based cloud computing platform technology based on low-priced commercial nodes. As diverse data services become possible in distributed computing environments, high-capacity distributed management is emerging as a major issue. Meanwhile, because of the diverse forms of using high-capacity data, security vulnerability and privacy invasion by malicious attackers or internal users can occur. As such, when various sensitive data are stored in cloud servers and used from there, the problem of data spill might occur because of external attackers or the poor management of internal users. Data can be managed through encryption to prevent such problems. However, existing simple encryption methods involve problems associated with the management of access to data stored in cloud environments. Therefore, in the present paper, a technique for data access management by user authority, based on Attribute-Based Encryption (ABE) and secret distribution techniques, is proposed.
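
    A full ABE construction is beyond a short sketch, but the secret-distribution half of such schemes is classically Shamir's (k, n) threshold sharing, where any k of n shares reconstruct the secret and fewer reveal nothing. A toy version over a prime field follows; the choice of Shamir's scheme here is an assumption, not stated in the abstract (requires Python 3.8+ for the modular inverse via `pow`):

```python
import random

P = 2**61 - 1  # a Mersenne prime; the field for this toy example

def make_shares(secret, k, n, rng=random):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):   # Horner evaluation of the polynomial mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, k=3, n=5)
assert reconstruct(shares[:3]) == 123456789
```

    In a cloud setting the n shares would live on distinct storage servers, so no single compromised server exposes the key material.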

  2. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    Science.gov (United States)

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus, an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.
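
    The start/pause/roll-back interaction pattern described above can be sketched as a stepper that checkpoints its state at every step. The SIR dynamics and the `ReplayableSim` class below are hypothetical stand-ins, not the system's actual API:

```python
import copy

class ReplayableSim:
    """Toy discrete-time SIR stepper with checkpointing, illustrating
    interactive roll-back to any earlier time step."""
    def __init__(self, s, i, r, beta=0.3, gamma=0.1):
        self.state = {"t": 0, "s": s, "i": i, "r": r}
        self.beta, self.gamma = beta, gamma
        self._history = [copy.deepcopy(self.state)]  # checkpoint per step

    def step(self):
        st = self.state
        n = st["s"] + st["i"] + st["r"]
        new_inf = self.beta * st["s"] * st["i"] / n
        new_rec = self.gamma * st["i"]
        st["s"] -= new_inf
        st["i"] += new_inf - new_rec
        st["r"] += new_rec
        st["t"] += 1
        self._history.append(copy.deepcopy(st))

    def rollback(self, t):
        """Restore the system to a previously assessed time step."""
        self.state = copy.deepcopy(self._history[t])
        del self._history[t + 1:]

sim = ReplayableSim(s=990, i=10, r=0)
for _ in range(5):
    sim.step()
sim.rollback(2)     # analyst inspects the state at t=2 ...
sim.beta = 0.15     # ... then applies a hypothetical intervention and re-runs
```

    A real deployment would checkpoint to durable storage behind a web service rather than an in-memory list, but the control flow is the same.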

  3. Improving science and mathematics education with computational modelling in interactive engagement environments

    Science.gov (United States)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  4. A computational framework for the optimal design of morphing processes in locally activated smart material structures

    International Nuclear Information System (INIS)

    Wang, Shuang; Brigham, John C

    2012-01-01

    A proof-of-concept study is presented for a strategy to obtain maximally efficient and accurate morphing structures composed of active materials such as shape memory polymers (SMP) through synchronization of adaptable and localized activation and actuation. The work focuses on structures or structural components entirely composed of thermo-responsive SMP, and particularly utilizes the ability of such materials to display controllable variable stiffness. The study presents and employs a computational inverse mechanics approach that combines a computational representation of the SMP thermo-mechanical behavior with a nonlinear optimization algorithm to determine location, magnitude and sequencing of the activation and actuation to obtain a desired shape change subject to design objectives such as prevention of damage. Two numerical examples are presented in which the synchronization of the activation and actuation and the location of activation excitation were optimized with respect to the combined thermal and mechanical energy for design concepts in morphing skeletal structural components. In all cases the concept of localized activation along with the optimal design strategy were able to produce far more energy efficient morphing structures and more accurately reach the desired shape change in comparison to traditional methods that require complete structural activation prior to actuation. (paper)
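
    A toy illustration of the inverse-design idea above, assuming a series-spring surrogate for the structure and exhaustive search in place of the paper's nonlinear optimizer; every name and number here is hypothetical:

```python
def design_activation(stiffness, heat_cost, target, soften=0.1):
    """Toy inverse design: choose ONE segment to thermally activate (its
    stiffness drops by the `soften` factor) and the actuation force needed
    to reach the `target` deflection of springs in series, minimizing the
    combined thermal + mechanical energy. Purely illustrative."""
    best = None
    for j in range(len(stiffness)):
        k = list(stiffness)
        k[j] *= soften                       # the activated segment softens
        compliance = sum(1.0 / ki for ki in k)
        force = target / compliance          # F = delta / C for series springs
        energy = heat_cost[j] + 0.5 * force * target   # heating + elastic work
        if best is None or energy < best[0]:
            best = (energy, j, force)
    return best  # (energy, segment_to_activate, required_force)

energy, seg, force = design_activation(
    stiffness=[100.0, 80.0, 120.0], heat_cost=[5.0, 3.0, 6.0], target=0.5)
```

    The point, matching the paper's finding, is that activating only the best-chosen region costs far less energy than heating the whole structure before actuation.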

  5. Imaging local cerebral blood flow by xenon-enhanced computed tomography - technical optimization procedures

    International Nuclear Information System (INIS)

    Meyer, J.S.; Shinohara, T.; Imai, A.; Kobari, M.; Solomon, E.

    1988-01-01

    Methods are described for non-invasive, computer-assisted serial scanning throughout the human brain during eight minutes of inhalation of 27%-30% xenon gas in order to measure local cerebral blood flow (LCBF). Optimized xenon-enhanced computed tomography (XeCT) was achieved by 5-second scanning at one-minute intervals utilizing a state-of-the-art CT scanner and rapid delivery of xenon gas via a face mask. Values for local brain-blood partition coefficients (Lλ) measured in vivo were utilized to calculate LCBF values. Previous methods assumed Lλ values to be normal, introducing the risk of systematic errors, because Lλ values differ throughout normal brain and may be altered by disease. Color-coded maps of Lλ and LCBF values were formatted directly onto CT images for exact correlation of function with anatomic and pathologic observations (spatial resolution: 26.5 cubic mm). Results were compared among eight normal volunteers, aged between 50 and 88 years. Mean cortical gray matter blood flow was 46.3 ± 7.7, for subcortical gray matter it was 50.3 ± 13.2 and for white matter it was 18.8 ± 3.2. Modern CT scanners provide stability, improved signal to noise ratio and minimal radiation scatter. Combining these advantages with rapid xenon saturation of the blood provides correlations of Lλ and LCBF with images of normal and abnormal brain in a safe, useful and non-invasive manner. (orig.)

  6. Imaging local cerebral blood flow by xenon-enhanced computed tomography - technical optimization procedures

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, J.S.; Shinohara, T.; Imai, A.; Kobari, M.; Sakai, F.; Hata, T.; Oravez, W.T.; Timpe, G.M.; Deville, T.; Solomon, E.

    1988-08-01

    Methods are described for non-invasive, computer-assisted serial scanning throughout the human brain during eight minutes of inhalation of 27%-30% xenon gas in order to measure local cerebral blood flow (LCBF). Optimized xenon-enhanced computed tomography (XeCT) was achieved by 5-second scanning at one-minute intervals utilizing a state-of-the-art CT scanner and rapid delivery of xenon gas via a face mask. Values for local brain-blood partition coefficients (Lλ) measured in vivo were utilized to calculate LCBF values. Previous methods assumed Lλ values to be normal, introducing the risk of systematic errors, because Lλ values differ throughout normal brain and may be altered by disease. Color-coded maps of Lλ and LCBF values were formatted directly onto CT images for exact correlation of function with anatomic and pathologic observations (spatial resolution: 26.5 cubic mm). Results were compared among eight normal volunteers, aged between 50 and 88 years. Mean cortical gray matter blood flow was 46.3 ± 7.7, for subcortical gray matter it was 50.3 ± 13.2 and for white matter it was 18.8 ± 3.2. Modern CT scanners provide stability, improved signal to noise ratio and minimal radiation scatter. Combining these advantages with rapid xenon saturation of the blood provides correlations of Lλ and LCBF with images of normal and abnormal brain in a safe, useful and non-invasive manner.

  7. Advanced quadrature sets and acceleration and preconditioning techniques for the discrete ordinates method in parallel computing environments

    Science.gov (United States)

    Longoni, Gianluca

    In the nuclear science and engineering field, radiation transport calculations play a key role in the design and optimization of nuclear devices. The linear Boltzmann equation describes the angular, energy and spatial variations of the particle or radiation distribution. The discrete ordinates method (SN) is the most widely used technique for solving the linear Boltzmann equation. However, for realistic problems, the memory and computing-time requirements demand the use of supercomputers. This research is devoted to the development of new formulations for the SN method, especially for highly angular dependent problems, in parallel environments. The present research work addresses two main issues affecting the accuracy and performance of SN transport theory methods: quadrature sets and acceleration techniques. New advanced quadrature techniques which allow for large numbers of angles with a capability for local angular refinement have been developed. These techniques have been integrated into the 3-D SN PENTRAN (Parallel Environment Neutral-particle TRANsport) code and applied to highly angular dependent problems, such as CT-Scan devices, that are widely used to obtain detailed 3-D images for industrial/medical applications. In addition, the accurate simulation of core physics and shielding problems with strong heterogeneities and transport effects requires the numerical solution of the transport equation. In general, the convergence rate of the solution methods for the transport equation is reduced for large problems with optically thick regions and scattering ratios approaching unity. To remedy this situation, new acceleration algorithms based on the Even-Parity Simplified SN (EP-SSN) method have been developed. A new stand-alone code system, PENSSn (Parallel Environment Neutral-particle Simplified SN), has been developed based on the EP-SSN method. The code is designed for parallel computing environments with spatial, angular and hybrid (spatial/angular) domain
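
    For context on quadrature sets: in 1-D slab geometry the SN ordinates are simply the Gauss-Legendre nodes and weights on [-1, 1]. The standard Newton-on-recurrence construction is sketched below; the advanced 3-D sets with local angular refinement developed in the thesis are considerably more involved:

```python
import math

def gauss_legendre(n, tol=1e-14):
    """Gauss-Legendre ordinates and weights on [-1, 1]: the discrete
    ordinates (S_N) angular set for 1-D slab geometry."""
    mus, wts = [], []
    for i in range(1, n + 1):
        mu = math.cos(math.pi * (i - 0.25) / (n + 0.5))  # Chebyshev-type guess
        for _ in range(100):
            # Evaluate P_n(mu) and P_{n-1}(mu) by the three-term recurrence.
            p0, p1 = 1.0, mu
            for k in range(2, n + 1):
                p0, p1 = p1, ((2 * k - 1) * mu * p1 - (k - 1) * p0) / k
            dp = n * (mu * p1 - p0) / (mu * mu - 1.0)    # derivative P_n'(mu)
            step = p1 / dp
            mu -= step                                    # Newton update
            if abs(step) < tol:
                break
        mus.append(mu)
        wts.append(2.0 / ((1.0 - mu * mu) * dp * dp))
    return mus, wts

mus, wts = gauss_legendre(8)   # S8 ordinates; weights sum to 2
```

    The moment conditions (weights summing to 2, odd moments vanishing, second moment 2/3) are the standard correctness checks for any candidate quadrature set.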

  8. Development of computational infrastructure to support hyper-resolution large-ensemble hydrology simulations from local-to-continental scales

    Data.gov (United States)

    National Aeronautics and Space Administration — Development of computational infrastructure to support hyper-resolution large-ensemble hydrology simulations from local-to-continental scales A move is currently...

  9. An Algorithm Computing the Local $b$ Function by an Approximate Division Algorithm in $\hat{\mathcal{D}}$

    OpenAIRE

    Nakayama, Hiromasa

    2006-01-01

    We give an algorithm to compute the local $b$ function. In this algorithm, we use the Mora division algorithm in the ring of differential operators and an approximate division algorithm in the ring of differential operators with power series coefficients.

  10. General approach to the computation of local transport coefficients with finite Larmor effects in the collision contribution

    International Nuclear Information System (INIS)

    Ghendrih, P.

    1986-10-01

    We expand the distribution functions on a basis of Hermite functions and obtain a general scheme to compute the local transport coefficients. The magnetic field dependence due to finite Larmor radius effects during the collision process is taken into account
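
    The Hermite-function basis used in such expansions can be evaluated with the standard stable three-term recurrence. This is a generic numerical sketch, not code from the paper:

```python
import math

def hermite_functions(n_max, x):
    """Orthonormal Hermite functions psi_0..psi_{n_max} at x, via the
    stable recurrence
        psi_{n+1} = sqrt(2/(n+1)) * x * psi_n - sqrt(n/(n+1)) * psi_{n-1},
    with psi_0(x) = pi^(-1/4) * exp(-x^2/2)."""
    psi = [math.pi ** -0.25 * math.exp(-0.5 * x * x)]
    if n_max >= 1:
        psi.append(math.sqrt(2.0) * x * psi[0])
    for n in range(1, n_max):
        psi.append(math.sqrt(2.0 / (n + 1)) * x * psi[n]
                   - math.sqrt(n / (n + 1)) * psi[n - 1])
    return psi
```

    Expanding a distribution function then reduces to projecting it onto these basis functions, with the collision term acting on the expansion coefficients.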

  11. Hybrid Cloud Computing Environment for EarthCube and Geoscience Community

    Science.gov (United States)

    Yang, C. P.; Qin, H.

    2016-12-01

    The NSF EarthCube Integration and Test Environment (ECITE) has built a hybrid cloud computing environment that provides cloud resources from private cloud environments using the cloud system software OpenStack and Eucalyptus, and also manages a public cloud, Amazon Web Services (AWS), allowing resource synchronization and bursting between the private and public clouds. On the ECITE hybrid cloud platform, the EarthCube and geoscience community can deploy and manage applications using base or customized virtual machine images, analyze big datasets using virtual clusters, and monitor virtual resource usage on the cloud in real time. Currently, a number of EarthCube projects have deployed or started migrating their projects to this platform, such as CHORDS, BCube, CINERGI, OntoSoft, and some other EarthCube building blocks. To accomplish the deployment or migration, the administrator of the ECITE hybrid cloud platform prepares the specific needs (e.g. images, port numbers, usable cloud capacity, etc.) of each project in advance, based on communications between ECITE and the participant projects; the scientists or IT technicians in those projects then launch one or more virtual machines, access the virtual machine(s) to set up the computing environment if need be, and migrate their code, documents or data without worrying about the heterogeneity in structure and operations among the different cloud platforms.

  12. A web-based, collaborative modeling, simulation, and parallel computing environment for electromechanical systems

    Directory of Open Access Journals (Sweden)

    Xiaoliang Yin

    2015-03-01

    Full Text Available A complex electromechanical system is usually composed of multiple components from different domains, including mechanical, electronic, hydraulic, control, and so on. Modeling and simulation of electromechanical systems on a unified platform is one of the research hotspots in systems engineering at present, and is also the development trend in the design of complex electromechanical systems. The unified modeling techniques and tools based on the Modelica language provide a satisfactory solution. To meet the requirements of collaborative modeling, simulation, and parallel computing for complex electromechanical systems based on Modelica, a general web-based modeling and simulation prototype environment, namely WebMWorks, is designed and implemented. Based on rich Internet application technologies, an interactive graphical user interface for modeling and post-processing in the web browser was implemented; with the collaborative design module, the environment supports top-down, concurrent modeling and team cooperation; additionally, a service-oriented architecture was applied to supply compiling and solving services which run on cloud-like servers, so the environment can manage and dispatch large-scale simulation tasks in parallel on multiple computing servers simultaneously. An engineering application involving a pure electric vehicle is tested on WebMWorks. The results of the simulation and parametric experiment demonstrate that the tested web-based environment can effectively shorten the design cycle of a complex electromechanical system.
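
    The parallel dispatch of simulation tasks described above can be sketched with a worker pool that gathers results as they complete. Here a thread pool stands in for the remote compile-and-solve servers, and `solve_model` is a hypothetical placeholder for the environment's solving service:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def solve_model(task):
    """Stand-in for a remote Modelica compile-and-solve service call."""
    name, steps = task
    x = 1.0
    for _ in range(steps):        # pretend numerical integration:
        x += 0.5 * (2.0 - x)      # state relaxes toward 2.0
    return name, x

tasks = [("motor", 50), ("inverter", 80), ("battery", 30)]

# Dispatch all simulation tasks concurrently; collect results as they finish.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(solve_model, t): t[0] for t in tasks}
    results = dict(f.result() for f in as_completed(futures))
```

    In the real environment the futures would wrap HTTP calls to cloud-hosted solver services rather than local threads, but the dispatch-and-gather pattern is the same.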

  13. Secure Enclaves: An Isolation-centric Approach for Creating Secure High Performance Computing Environments

    Energy Technology Data Exchange (ETDEWEB)

    Aderholdt, Ferrol [Tennessee Technological Univ., Cookeville, TN (United States); Caldwell, Blake A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hicks, Susan Elaine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Koch, Scott M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Naughton, III, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pelfrey, Daniel S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pogge, James R [Tennessee Technological Univ., Cookeville, TN (United States); Scott, Stephen L [Tennessee Technological Univ., Cookeville, TN (United States); Shipman, Galen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sorrillo, Lawrence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

    High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows to name just a few. These systems may process data at various security levels but in so doing are often enclaved at the highest security posture. This approach places significant restrictions on the users of the system even when processing data at a lower security level and exposes data at higher levels of confidentiality to a much broader population than otherwise necessary. The traditional approach of isolation, while effective in establishing security enclaves poses significant challenges for the use of shared infrastructure in HPC environments. This report details current state-of-the-art in virtualization, reconfigurable network enclaving via Software Defined Networking (SDN), and storage architectures and bridging techniques for creating secure enclaves in HPC environments.

  14. Relationship between x-ray emission and absorption spectroscopy and the local H-bond environment in water.

    Science.gov (United States)

    Zhovtobriukh, Iurii; Besley, Nicholas A; Fransson, Thomas; Nilsson, Anders; Pettersson, Lars G M

    2018-04-14

    The connection between specific features in the water X-ray absorption spectrum and X-ray emission spectrum (XES) and the local H-bond coordination is studied based on structures obtained from path-integral molecular dynamics simulations using either the opt-PBE-vdW density functional or the MB-pol force field. Computing the XES spectrum using all molecules in a snapshot results in only one peak in the lone-pair (1b1) region, while the experiment shows two peaks separated by 0.8-0.9 eV. Different H-bond configurations were classified based on the local structure index (LSI) and a geometrical H-bond cone criterion. We find that tetrahedrally coordinated molecules characterized by high LSI values and two strong donated and two strong accepted H-bonds contribute to the low energy 1b1 emission peak and to the post-edge region in absorption. Molecules with the asymmetric H-bond environment with one strong accepted H-bond and one strong donated H-bond and low LSI values give rise to the high energy 1b1 peak in the emission spectrum and mainly contribute to the pre-edge and main-edge in the absorption spectrum. The 1b1 peak splitting can be increased to 0.62 eV by imposing constraints on the H-bond length, i.e., for very tetrahedral structures short H-bonds (less than 2.68 Å) and for very asymmetric structures elongated H-bonds (longer than 2.8 Å). Such structures are present, but underrepresented, in the simulations which give more of an average of the two extremes.

  15. Relationship between x-ray emission and absorption spectroscopy and the local H-bond environment in water

    Science.gov (United States)

    Zhovtobriukh, Iurii; Besley, Nicholas A.; Fransson, Thomas; Nilsson, Anders; Pettersson, Lars G. M.

    2018-04-01

    The connection between specific features in the water X-ray absorption spectrum and X-ray emission spectrum (XES) and the local H-bond coordination is studied based on structures obtained from path-integral molecular dynamics simulations using either the opt-PBE-vdW density functional or the MB-pol force field. Computing the XES spectrum using all molecules in a snapshot results in only one peak in the lone-pair (1b1) region, while the experiment shows two peaks separated by 0.8-0.9 eV. Different H-bond configurations were classified based on the local structure index (LSI) and a geometrical H-bond cone criterion. We find that tetrahedrally coordinated molecules characterized by high LSI values and two strong donated and two strong accepted H-bonds contribute to the low energy 1b1 emission peak and to the post-edge region in absorption. Molecules with the asymmetric H-bond environment with one strong accepted H-bond and one strong donated H-bond and low LSI values give rise to the high energy 1b1 peak in the emission spectrum and mainly contribute to the pre-edge and main-edge in the absorption spectrum. The 1b1 peak splitting can be increased to 0.62 eV by imposing constraints on the H-bond length, i.e., for very tetrahedral structures short H-bonds (less than 2.68 Å) and for very asymmetric structures elongated H-bonds (longer than 2.8 Å). Such structures are present, but underrepresented, in the simulations which give more of an average of the two extremes.

  16. Investigation of local environment around rare earths (La and Eu) by fluorescence line narrowing during borosilicate glass alteration

    International Nuclear Information System (INIS)

    Molières, Estelle; Panczer, Gérard; Guyot, Yannick; Jollivet, Patrick; Majérus, Odile; Aschehoug, Patrick; Barboux, Philippe; Gin, Stéphane; Angeli, Frédéric

    2014-01-01

    The local environment of europium in soda-lime borosilicate glasses with a range of La2O3 content was probed by continuous luminescence and Fluorescence Line Narrowing (FLN) to investigate the local environment of rare earth elements in pristine and leached glass. After aqueous leaching at 90 °C at pH 7 and 9.5, rare earths were fully retained and homogeneously distributed in the amorphous alteration layer (commonly called gel). Two separate silicate environments were observed in pristine and leached glasses regardless of the lanthanum content and the leaching conditions. A borate environment surrounding europium was not observed in pristine and leached glasses. During glass alteration, OH groups were located around the europium environment, which became more organized (higher symmetry) in the first coordination shell. -- Highlights: • No borate environment surrounding europium was detected in pristine borosilicate glasses. • Up to 12 mol% of REE2O3 in glass, local environment of europium does not significantly change. • Europium environment becomes more ordered and symmetric in gels than in pristine glasses. • Two distinct silicate sites were observed, as well in pristine glass as in gels (leached glasses). • In altered glasses, OH groups were located around europium.

  17. Investigation of local environment around rare earths (La and Eu) by fluorescence line narrowing during borosilicate glass alteration

    Energy Technology Data Exchange (ETDEWEB)

    Molières, Estelle [CEA – DEN-DTCD-LCV-SECM Laboratoire d' études du Comportement à Long Terme, 30207 Bagnols-sur-Cèze (France); Panczer, Gérard; Guyot, Yannick [Institut Lumière Matière, UMR5306 Université Lyon 1-CNRS, Université de Lyon, 69622 Villeurbanne cedex (France); Jollivet, Patrick [CEA – DEN-DTCD-LCV-SECM Laboratoire d' études du Comportement à Long Terme, 30207 Bagnols-sur-Cèze (France); Majérus, Odile; Aschehoug, Patrick; Barboux, Philippe [Laboratoire de Chimie de la Matière Condensée de Paris, UMR-CNRS 7574, École Nationale Supérieure de Chimie de Paris (ENSCP Chimie-ParisTech), 11 rue Pierre et Marie Curie, 75231 Paris (France); Gin, Stéphane [CEA – DEN-DTCD-LCV-SECM Laboratoire d' études du Comportement à Long Terme, 30207 Bagnols-sur-Cèze (France); Angeli, Frédéric, E-mail: frederic.angeli@cea.fr [CEA – DEN-DTCD-LCV-SECM Laboratoire d' études du Comportement à Long Terme, 30207 Bagnols-sur-Cèze (France)

    2014-01-15

    The local environment of europium in soda-lime borosilicate glasses with a range of La2O3 content was probed by continuous luminescence and Fluorescence Line Narrowing (FLN) to investigate the local environment of rare earth elements in pristine and leached glass. After aqueous leaching at 90 °C at pH 7 and 9.5, rare earths were fully retained and homogeneously distributed in the amorphous alteration layer (commonly called gel). Two separate silicate environments were observed in pristine and leached glasses regardless of the lanthanum content and the leaching conditions. A borate environment surrounding europium was not observed in pristine and leached glasses. During glass alteration, OH groups were located around the europium environment, which became more organized (higher symmetry) in the first coordination shell. -- Highlights: • No borate environment surrounding europium was detected in pristine borosilicate glasses. • Up to 12 mol% of REE2O3 in glass, local environment of europium does not significantly change. • Europium environment becomes more ordered and symmetric in gels than in pristine glasses. • Two distinct silicate sites were observed, as well in pristine glass as in gels (leached glasses). • In altered glasses, OH groups were located around europium.

  18. Using high performance interconnects in a distributed computing and mass storage environment

    International Nuclear Information System (INIS)

    Ernst, M.

    1994-01-01

    Detector Collaborations of the HERA Experiments typically involve more than 500 physicists from a few dozen institutes. These physicists require access to large amounts of data in a fully transparent manner. Important issues include Distributed Mass Storage Management Systems in a Distributed and Heterogeneous Computing Environment. At the very center of a distributed system, including tens of CPUs and network attached mass storage peripherals, are the communication links. Today scientists are witnessing an integration of computing and communication technology with the 'network' becoming the computer. This contribution reports on a centrally operated computing facility for the HERA Experiments at DESY, including Symmetric Multiprocessor Machines (84 Processors), presently more than 400 GByte of magnetic disk and 40 TB of automated tape storage, tied together by a HIPPI 'network'. Focusing on the High Performance Interconnect technology, details will be provided about the HIPPI based 'Backplane' configured around a 20 Gigabit/s Multi Media Router and the performance and efficiency of the related computer interfaces.

  19. The Development of Biology Teaching Material Based on the Local Wisdom of Timorese to Improve Students Knowledge and Attitude of Environment in Caring the Preservation of Environment

    Science.gov (United States)

    Ardan, Andam S.

    2016-01-01

    The purposes of this study were (1) to describe the biology learning such as lesson plans, teaching materials, media and worksheets for the tenth grade of High School on the topic of Biodiversity and Basic Classification, Ecosystems and Environment Issues based on local wisdom of Timorese; (2) to analyze the improvement of the environmental…

  20. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle

    Science.gov (United States)

    Cerezo, Rebeca; Esteban, María; Sánchez-Santillán, Miguel; Núñez, José C.

    2017-01-01

    Introduction: Research about student performance has traditionally considered academic procrastination as a behavior that has negative effects on academic achievement. Although there is much evidence for this in class-based environments, there is a lack of research on Computer-Based Learning Environments (CBLEs). Therefore, the purpose of this study is to evaluate student behavior in a blended learning program and specifically procrastination behavior in relation to performance through Data Mining techniques. Materials and Methods: A sample of 140 undergraduate students participated in a blended learning experience implemented in a Moodle (Modular Object Oriented Developmental Learning Environment) Management System. Relevant interaction variables were selected for the study, taking into account student achievement and analyzing data by means of association rules, a mining technique. The association rules were arrived at and filtered through two selection criteria: 1, rules must have an accuracy over 0.8 and 2, they must be present in both sub-samples. Results: The findings of our study highlight the influence of time management in online learning environments, particularly on academic achievement, as there is an association between procrastination variables and student performance. Conclusion: Negative impact of procrastination in learning outcomes has been observed again but in virtual learning environments where practical implications, prevention of, and intervention in, are different from class-based learning. These aspects are discussed to help resolve student difficulties at various ages. PMID:28883801

  1. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle.

    Science.gov (United States)

    Cerezo, Rebeca; Esteban, María; Sánchez-Santillán, Miguel; Núñez, José C

    2017-01-01

    Introduction: Research about student performance has traditionally considered academic procrastination as a behavior that has negative effects on academic achievement. Although there is much evidence for this in class-based environments, there is a lack of research on Computer-Based Learning Environments (CBLEs). Therefore, the purpose of this study is to evaluate student behavior in a blended learning program and specifically procrastination behavior in relation to performance through Data Mining techniques. Materials and Methods: A sample of 140 undergraduate students participated in a blended learning experience implemented in a Moodle (Modular Object Oriented Developmental Learning Environment) Management System. Relevant interaction variables were selected for the study, taking into account student achievement and analyzing data by means of association rules, a mining technique. The association rules were arrived at and filtered through two selection criteria: 1, rules must have an accuracy over 0.8 and 2, they must be present in both sub-samples. Results: The findings of our study highlight the influence of time management in online learning environments, particularly on academic achievement, as there is an association between procrastination variables and student performance. Conclusion: Negative impact of procrastination in learning outcomes has been observed again but in virtual learning environments where practical implications, prevention of, and intervention in, are different from class-based learning. These aspects are discussed to help resolve student difficulties at various ages.

  2. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle

    Directory of Open Access Journals (Sweden)

    Rebeca Cerezo

    2017-08-01

    Full Text Available Introduction: Research about student performance has traditionally considered academic procrastination as a behavior that has negative effects on academic achievement. Although there is much evidence for this in class-based environments, there is a lack of research on Computer-Based Learning Environments (CBLEs). Therefore, the purpose of this study is to evaluate student behavior in a blended learning program and specifically procrastination behavior in relation to performance through Data Mining techniques. Materials and Methods: A sample of 140 undergraduate students participated in a blended learning experience implemented in a Moodle (Modular Object Oriented Developmental Learning Environment) Management System. Relevant interaction variables were selected for the study, taking into account student achievement and analyzing data by means of association rules, a mining technique. The association rules were arrived at and filtered through two selection criteria: 1, rules must have an accuracy over 0.8 and 2, they must be present in both sub-samples. Results: The findings of our study highlight the influence of time management in online learning environments, particularly on academic achievement, as there is an association between procrastination variables and student performance. Conclusion: Negative impact of procrastination in learning outcomes has been observed again but in virtual learning environments where practical implications, prevention of, and intervention in, are different from class-based learning. These aspects are discussed to help resolve student difficulties at various ages.
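The rule-filtering step these abstracts describe (keep only association rules with confidence over 0.8 that appear in both sub-samples) can be sketched directly. The transaction encoding and item names below are hypothetical, and only single-antecedent, single-consequent rules are enumerated for brevity:

```python
from itertools import combinations

def rule_confidence(transactions, antecedent, consequent):
    """confidence(A -> B) = support(A ∪ B) / support(A)."""
    a = frozenset(antecedent)
    ab = a | frozenset(consequent)
    n_a = sum(1 for t in transactions if a <= t)
    n_ab = sum(1 for t in transactions if ab <= t)
    return n_ab / n_a if n_a else 0.0

def stable_rules(sub1, sub2, items, min_conf=0.8):
    """Keep rules meeting min_conf in BOTH sub-samples (both directions tried)."""
    kept = []
    for a, b in combinations(items, 2):
        for ant, con in ((a, b), (b, a)):
            if (rule_confidence(sub1, [ant], [con]) >= min_conf and
                    rule_confidence(sub2, [ant], [con]) >= min_conf):
                kept.append((ant, con))
    return kept

# Hypothetical log-derived transactions: each set is one student's behaviors.
sub1 = [{"late", "fail"}] * 4 + [{"late", "pass"}, {"early", "pass"}]
sub2 = [{"late", "fail"}] * 5 + [{"late", "pass"}]
rules = stable_rules(sub1, sub2, ["late", "fail", "early", "pass"])
```

Requiring the rule to hold in both sub-samples acts as a crude stability check, analogous to the study's second selection criterion.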

  3. Risk Analysis of Coastal hazard Considering Sea-level Rise and Local Environment in Coastal Area

    Science.gov (United States)

    Sangjin, P.; Lee, D. K.; KIM, H.; Ryu, J. E.; Yoo, S.; Ryoo, H.

    2014-12-01

    Recently, natural hazards have become more unpredictable, with increasing frequency and strength due to climate change. Coastal areas in particular will be more vulnerable in the future because of sea-level rise (SLR). Korea is surrounded by oceans and has many big cities in coastal areas, so a hazard prevention plan for the coastal zone is absolutely necessary. However, prior to making the plan, finding areas at risk is the first step. In order to find vulnerable areas, local characteristics of coastal areas should also be considered along with SLR. Therefore, the objective of the research is to find vulnerable areas which could be damaged by coastal hazards, considering the local environment and SLR of coastal areas. The spatial scope of the research was set as 1 km from the coastline according to the 'coastal management law' in Korea. The assessment was done up to the year 2050, and the highest sea-level rise scenario was used. For risk analysis, biophysical and socioeconomic characteristics were considered so as to represent local characteristics of the coastal area. Risk analysis was carried out through the combination of the 'possibility of hazard' and the 'level of damage', both of which reflect the above-mentioned regional characteristics. Since the range of inundation was narrowed down to inundation from typhoons in this research, the possibility of inundation caused by a typhoon was estimated by using a numerical model, which calculated the height of the storm surge considering wave, tide, sea-level pressure and SLR. The level of damage was estimated by categorizing the socioeconomic character into four factors: human, infrastructure, ecology and socioeconomic. Variables that represent each factor were selected and used in damage estimation with their classification and weighting value. The result shows that the urban coastal areas are more vulnerable and hazardous than other areas because of socioeconomic factors. The east and the south coast are

  4. Development of a locally mass flux conservative computer code for calculating 3-D viscous flow in turbomachines

    Science.gov (United States)

    Walitt, L.

    1982-01-01

    The VANS successive approximation numerical method was extended to the computation of three dimensional, viscous, transonic flows in turbomachines. A cross-sectional computer code, which conserves mass flux at each point of the cross-sectional surface of computation, was developed. In the VANS numerical method, the cross-sectional computation follows a blade-to-blade calculation. Numerical calculations were made for an axial annular turbine cascade and a transonic, centrifugal impeller with splitter vanes. The subsonic turbine cascade computation was performed in the blade-to-blade surface to evaluate the accuracy of the blade-to-blade mode of marching. Calculated blade pressures at the hub, mid, and tip radii of the cascade agreed with corresponding measurements. The transonic impeller computation was conducted to test the newly developed locally mass flux conservative cross-sectional computer code. Both blade-to-blade and cross-sectional modes of calculation were implemented for this problem. A triplet point shock structure was computed in the inducer region of the impeller. In addition, time-averaged shroud static pressures generally agreed with measured shroud pressures. It is concluded that the blade-to-blade computation produces a useful engineering flow field in regions of subsonic relative flow; and cross-sectional computation, with a locally mass flux conservative continuity equation, is required to compute the shock waves in regions of supersonic relative flow.

  5. Daily rhythmicity of the thermoregulatory responses of locally adapted Brazilian sheep in a semiarid environment

    Science.gov (United States)

    da Silva, Wilma Emanuela; Leite, Jacinara Hody Gurgel Morais; de Sousa, José Ernandes Rufino; Costa, Wirton Peixoto; da Silva, Wallace Sostene Tavares; Guilhermino, Magda Maria; Asensio, Luis Alberto Bermejo; Façanha, Débora Andréa Evangelista

    2017-07-01

    approximately 5:00 p.m.; however, these findings confirm the importance of providing environmental protection during critical periods of the day, even for locally adapted breeds. These responses suggest that the use of thermal storage allowed the animals to achieve equilibrium with the environment and maintain a stable body temperature.

  6. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    Science.gov (United States)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are “archiveable”, transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess the feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.

  7. PABLM: a computer program to calculate accumulated radiation doses from radionuclides in the environment

    International Nuclear Information System (INIS)

    Napier, B.A.; Kennedy, W.E. Jr.; Soldat, J.K.

    1980-03-01

    A computer program, PABLM, was written to facilitate the calculation of internal radiation doses to man from radionuclides in food products and external radiation doses from radionuclides in the environment. This report contains details of mathematical models used and calculational procedures required to run the computer program. Radiation doses from radionuclides in the environment may be calculated from deposition on the soil or plants during an atmospheric or liquid release, or from exposure to residual radionuclides in the environment after the releases have ended. Radioactive decay is considered during the release of radionuclides, after they are deposited on the plants or ground, and during holdup of food after harvest. The radiation dose models consider several exposure pathways. Doses may be calculated for either a maximum-exposed individual or for a population group. The doses calculated are accumulated doses from continuous chronic exposure. A first-year committed dose is calculated as well as an integrated dose for a selected number of years. The equations for calculating internal radiation doses are derived from those given by the International Commission on Radiological Protection (ICRP) for body burdens and MPC's of each radionuclide. The radiation doses from external exposure to contaminated water and soil are calculated using the basic assumption that the contaminated medium is large enough to be considered an infinite volume or plane relative to the range of the emitted radiations. The equations for calculations of the radiation dose from external exposure to shoreline sediments include a correction for the finite width of the contaminated beach
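For a single radionuclide and a single external pathway, the accumulated-dose calculation the abstract describes (chronic exposure to a deposited radionuclide, with radioactive decay during holdup) reduces to integrating an exponentially decaying dose rate over the exposure period. The sketch below illustrates only that reduction under assumed units; it is not PABLM's actual implementation, and the conversion factor is a placeholder:

```python
import math

def accumulated_external_dose(c0, dose_rate_factor, half_life_y, years):
    """Dose accumulated over `years` from an initial deposit c0.

    The deposit decays with the given half-life; `dose_rate_factor` is a
    placeholder concentration-to-dose-rate conversion. Closed form of
    D(T) = ∫ c0 * DF * exp(-lambda * t) dt from 0 to T:
    D(T) = c0 * DF * (1 - exp(-lambda * T)) / lambda.
    """
    lam = math.log(2) / half_life_y
    return c0 * dose_rate_factor * (1.0 - math.exp(-lam * years)) / lam

# First-year committed dose vs. a long integration period, as in the report:
first_year = accumulated_external_dose(1.0, 1.0, 1.0, 1.0)
fifty_year = accumulated_external_dose(1.0, 1.0, 1.0, 50.0)
```

As the integration period grows, the dose saturates at c0 * DF / lambda, which is why an integrated dose over a selected number of years is reported alongside the first-year dose.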

  8. The ionic conductivity and local environment of cations in Bi9ReO17

    International Nuclear Information System (INIS)

    Thompson, M.; Herranz, T.; Santos, B.; Marco, J.F.; Berry, F.J.; Greaves, C.

    2010-01-01

    The influence of temperature on the structure of Bi9ReO17 has been investigated using differential thermal analysis, variable temperature X-ray diffraction and neutron powder diffraction. The material undergoes an order-disorder transition at ∼1000 K on heating, to form a fluorite-related phase. The local environments of the cations in fully ordered Bi9ReO17 have been investigated by Bi LIII- and Re LIII-edge extended X-ray absorption fine structure (EXAFS) measurements to complement the neutron powder diffraction information. Whereas rhenium displays regular tetrahedral coordination, all bismuth sites show coordination geometries which reflect the importance of a stereochemically active lone pair of electrons. Because of the wide range of Bi-O distances, EXAFS data are similar to those observed for disordered structures, and are dominated by the shorter Bi-O bonds. Ionic conductivity measurements indicate that ordered Bi9ReO17 exhibits reasonably high oxide ion conductivity, corresponding to 2.9×10^-5 Ω^-1 cm^-1 at 673 K, whereas the disordered form shows higher oxide ion conductivity (9.1×10^-4 Ω^-1 cm^-1 at 673 K). - Graphical abstract: The structure of Bi9ReO17 is discussed and related to the ionic conductivity of the ordered and disordered forms.

  9. The global obesity pandemic: shaped by global drivers and local environments.

    Science.gov (United States)

    Swinburn, Boyd A; Sacks, Gary; Hall, Kevin D; McPherson, Klim; Finegood, Diane T; Moodie, Marjory L; Gortmaker, Steven L

    2011-08-27

    The simultaneous increases in obesity in almost all countries seem to be driven mainly by changes in the global food system, which is producing more processed, affordable, and effectively marketed food than ever before. This passive overconsumption of energy leading to obesity is a predictable outcome of market economies predicated on consumption-based growth. The global food system drivers interact with local environmental factors to create a wide variation in obesity prevalence between populations. Within populations, the interactions between environmental and individual factors, including genetic makeup, explain variability in body size between individuals. However, even with this individual variation, the epidemic has predictable patterns in subpopulations. In low-income countries, obesity mostly affects middle-aged adults (especially women) from wealthy, urban environments; whereas in high-income countries it affects both sexes and all ages, but is disproportionately greater in disadvantaged groups. Unlike other major causes of preventable death and disability, such as tobacco use, injuries, and infectious diseases, there are no exemplar populations in which the obesity epidemic has been reversed by public health measures. This absence increases the urgency for evidence-creating policy action, with a priority on reduction of the supply-side drivers. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure

    International Nuclear Information System (INIS)

    Yokohama, Noriya

    2013-01-01

    This report describes the design of the architecture and a performance study of a parallel computing environment for Monte Carlo simulation for particle therapy, using a high performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed approximately 28 times the speed of a single-threaded architecture, combined with improved stability. A study of methods of optimizing the system operations also indicated lower cost. (author)
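Speedups of the kind reported here come from the fact that Monte Carlo histories are independent, so they can be split across workers with essentially no coordination. A minimal illustration using Python's multiprocessing (estimating π rather than a therapy dose kernel; the worker count, sample sizes, and seeds are arbitrary, and fork-capable platforms are assumed):

```python
import random
from multiprocessing import Pool

def count_hits(args):
    """One worker: count random points falling inside the unit quarter-circle."""
    n, seed = args
    rng = random.Random(seed)  # per-worker RNG so streams are independent
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))

def parallel_pi(total_samples, workers=4):
    """Split the histories evenly across `workers` processes and combine."""
    per = total_samples // workers
    with Pool(workers) as pool:
        hits = pool.map(count_hits, [(per, seed) for seed in range(workers)])
    return 4.0 * sum(hits) / (per * workers)
```

The same split-run-combine structure applies to particle transport: each process simulates its share of the histories, and only the tallies are merged at the end, which is why near-linear scaling is achievable.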

  11. Deoxyglucose method for the estimation of local myocardial glucose metabolism with positron computed tomography

    International Nuclear Information System (INIS)

    Ratib, O.; Phelps, M.E.; Huang, S.C.; Henze, E.; Selin, C.E.; Schelbert, H.R.

    1981-01-01

    The deoxyglucose method originally developed for measurements of the local cerebral metabolic rate for glucose has been investigated in terms of its application to studies of the heart with positron computed tomography (PCT) and FDG. Studies were performed in dogs to measure the tissue kinetics of FDG with PCT and by direct arterial-venous sampling. The operational equation developed in our laboratory as an extension of the Sokoloff model was used to analyze the data. The FDG method accurately predicted the true MMRGlc even when the glucose metabolic rate was normal but myocardial blood flow (MBF) was elevated 5 times the control value or when metabolism was reduced to 10% of normal and MBF increased 5 times normal. Improvements in PCT resolution are required to improve the accuracy of the estimates of the rate constants and the MMRGlc

  12. Scintigraphic diagnosis and computed tomographic localization of an accessory spleen following relapse of chronic immune thrombocytopaenia

    International Nuclear Information System (INIS)

    Cardaci, G.T.; Blake, M.P.

    1992-01-01

    Chronic immune thrombocytopaenia is an immunologically mediated disorder resulting in disordered platelet kinetics and potentially life-threatening disease. Failure of medical therapy is an indication for splenectomy, and responses are seen in 80% of patients following this procedure. An important cause of relapse following splenectomy is the presence of an accessory spleen. A patient with Hodgkin's Disease developed chronic immune thrombocytopaenia despite previous splenectomy. A remission was induced with immunosuppressive therapy, but he later relapsed. An accessory spleen was detected using 99mTc-denatured red blood cells and localized using computed tomography. Resection of the accessory spleen resulted in clinical remission. As accessory spleens are often small in size, combined modality imaging is recommended in the evaluation of this disorder. 15 refs., 2 figs

  13. Attitudes and gender differences of high school seniors within one-to-one computing environments in South Dakota

    Science.gov (United States)

    Nelson, Mathew

    In today's age of exponential change and technological advancement, awareness of any gender gap in technology and computer science-related fields is crucial, but further research must be done in an effort to better understand the complex interacting factors contributing to the gender gap. This study utilized a survey to investigate specific gender differences relating to computing self-efficacy, computer usage, and environmental factors of exposure, personal interests, and parental influence that impact gender differences of high school students within a one-to-one computing environment in South Dakota. The population who completed the One-to-One High School Computing Survey for this study consisted of South Dakota high school seniors who had been involved in a one-to-one computing environment for two or more years. The data from the survey were analyzed using descriptive and inferential statistics for the determined variables. From the review of literature and data analysis several conclusions were drawn from the findings. Among them are that overall, there was very little difference in perceived computing self-efficacy and computing anxiety between male and female students within the one-to-one computing initiative. The study supported the current research that males and females utilized computers similarly, but males spent more time using their computers to play online games. Early exposure to computers, or the age at which the student was first exposed to a computer, and the number of computers present in the home (computer ownership) impacted computing self-efficacy. The results also indicated parental encouragement to work with computers also contributed positively to both male and female students' computing self-efficacy. Finally the study also found that both mothers and fathers encouraged their male children more than their female children to work with computing and pursue careers in computing science fields.

  14. Signal and image processing algorithm performance in a virtual and elastic computing environment

    Science.gov (United States)

    Bennett, Kelly W.; Robertson, James

    2013-05-01

    The U.S. Army Research Laboratory (ARL) supports the development of classification, detection, tracking, and localization algorithms using multiple sensing modalities including acoustic, seismic, E-field, magnetic field, PIR, and visual and IR imaging. Multimodal sensors collect large amounts of data in support of algorithm development. The resulting large amount of data, and the associated high-performance computing needs, increasingly challenges existing computing infrastructures. Purchasing computer power as a commodity using a Cloud service offers low-cost, pay-as-you-go pricing models, scalability, and elasticity that may provide solutions to develop and optimize algorithms without having to procure additional hardware and resources. This paper provides a detailed look at using a commercial cloud service provider, such as Amazon Web Services (AWS), to develop and deploy simple signal and image processing algorithms in a cloud and run the algorithms on a large set of data archived in the ARL Multimodal Signatures Database (MMSDB). Analytical results will provide performance comparisons with existing infrastructure. A discussion on using cloud computing with government data covers best security practices that exist within cloud services, such as AWS.

  15. Scheduling Method of Data-Intensive Applications in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Xiong Fu

    2015-01-01

    Full Text Available The virtualization of cloud computing improves the utilization of resources and energy. And a cloud user can deploy his/her own applications and related data on a pay-as-you-go basis. The communications between an application and a data storage node, as well as within the application, have a great impact on the execution efficiency of the application. The locations of subtasks of an application and the data that transferred between the subtasks are the main reason why communication delay exists. The communication delay can affect the completion time of the application. In this paper, we take into account the data transmission time and communications between subtasks and propose a heuristic optimal virtual machine (VM placement algorithm. Related simulations demonstrate that this algorithm can reduce the completion time of user tasks and ensure the feasibility and effectiveness of the overall network performance of applications when running in a cloud computing environment.
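The placement problem this abstract describes, i.e. locating subtasks so that transfer time between communicating subtasks and to the data storage node is minimized, can be illustrated with a simple greedy sketch. This is not the paper's actual heuristic, and all names, the latency table, and the data volumes are hypothetical:

```python
def place_subtasks(tasks, hosts, volume, data_in, latency, data_host):
    """Greedy, communication-aware placement sketch.

    volume[(a, b)]: data transferred from subtask a to subtask b
    data_in[t]:     input data subtask t pulls from the storage node
    latency[(h1, h2)]: per-unit transfer cost between hosts (0 if equal)

    Each subtask is assigned, in order, to the host minimizing its transfer
    cost to already-placed subtasks plus the cost of pulling its input data.
    """
    placement = {}
    for t in tasks:
        def cost(h):
            c = data_in.get(t, 0) * latency[(h, data_host)]
            for u, p in placement.items():
                c += volume.get((t, u), 0) * latency[(h, p)]
                c += volume.get((u, t), 0) * latency[(h, p)]
            return c
        placement[t] = min(hosts, key=cost)
    return placement

# Two hosts; h1 also holds the data. t1 pulls 10 units of input data and
# exchanges 5 units with t2, so both end up co-located with the data.
hosts = ["h1", "h2"]
latency = {(a, b): (0 if a == b else 1) for a in hosts for b in hosts}
placement = place_subtasks(
    ["t1", "t2"], hosts,
    volume={("t1", "t2"): 5}, data_in={"t1": 10},
    latency=latency, data_host="h1",
)
```

Co-locating communicating subtasks with each other and with their data is exactly the effect the abstract attributes to its VM placement algorithm: shorter completion time through reduced communication delay.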

  16. The development of a distributed computing environment for the design and modeling of plasma spectroscopy experiments

    International Nuclear Information System (INIS)

    Nash, J.K.; Eme, W.G.; Lee, R.W.; Salter, J.M.

    1994-10-01

    The design and analysis of plasma spectroscopy experiments can be significantly complicated by relatively routine computational tasks arising from the massive amount of data encountered in the experimental design and analysis stages of the work. Difficulties in obtaining, computing, manipulating and visualizing the information represent not simply an issue of convenience -- they have a very real limiting effect on the final quality of the data and on the potential for arriving at meaningful conclusions regarding an experiment. We describe ongoing work in developing a portable UNIX environment shell with the goal of simplifying and enabling these activities for the plasma-modeling community. Applications to the construction of atomic kinetics models and to the analysis of x-ray transmission spectroscopy will be shown

  17. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved. PMID:24078906

  18. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Weizhe Zhang

    2013-01-01

    Full Text Available Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved.

  19. Ubiquitous Green Computing Techniques for High Demand Applications in Smart Environments

    Directory of Open Access Journals (Sweden)

    Jose M. Moya

    2012-08-01

    Full Text Available Ubiquitous sensor network deployments, such as those found in Smart City and Ambient Intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the usage of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures; however, it is the supercomputing facilities that present the higher economic and environmental impact, due to their very high power consumption. This latter problem has so far been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce the energy consumed by the whole infrastructure and the total execution time.
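As an illustration of the heterogeneity-aware assignment idea described above (a minimal sketch, not the authors' actual technique), a greedy scheduler might route each task to the cheapest node, in energy terms, that still has spare capacity; every name and number below is a hypothetical stand-in:

```python
# Hypothetical sketch: route low-demand tasks to idle low-power WSN nodes,
# keeping only the tasks that do not fit elsewhere on the HPC facility.

def assign(tasks, nodes):
    """tasks: list of (name, demand). nodes: list of dicts with keys
    'name', 'capacity', 'watts_per_unit'. Assumes every task fits somewhere."""
    for n in nodes:
        n.setdefault('used', 0)
    plan = {}
    for name, demand in sorted(tasks, key=lambda t: t[1]):  # lightest first
        # Cheapest node (energy per unit of work) with enough spare capacity.
        feasible = [n for n in nodes if n['capacity'] - n['used'] >= demand]
        best = min(feasible, key=lambda n: n['watts_per_unit'])
        best['used'] += demand
        plan[name] = best['name']
    return plan
```

Under this policy a small task lands on an idle low-power node while a large one stays on the high-performance facility, mirroring the redistribution the abstract describes.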

  20. Ubiquitous green computing techniques for high demand applications in Smart environments.

    Science.gov (United States)

    Zapater, Marina; Sanchez, Cesar; Ayala, Jose L; Moya, Jose M; Risco-Martín, José L

    2012-01-01

    Ubiquitous sensor network deployments, such as those found in Smart City and Ambient Intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the usage of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures; however, it is the supercomputing facilities that present the higher economic and environmental impact, due to their very high power consumption. This latter problem has so far been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce the energy consumed by the whole infrastructure and the total execution time.

  1. Integration of a browser based operator manual in the system environment of a process computer system

    International Nuclear Information System (INIS)

    Weber, Andreas; Erfle, Robert; Feinkohl, Dirk

    2012-01-01

    The integration of a browser based operator manual in the system environment of a process computer system is an optimization of the operating procedure in the control room and a safety enhancement due to faster and error-free access to the manual contents. Several requirements by the authorities have to be fulfilled: the operating manual has to be available as hard copy, the format has to be true to the original, protection against manipulation has to be provided, the manual content of the browser-based version and the hard copy have to be identical, and the display presentation has to be consistent with ergonomic principles. The integration of the on-line manual in the surveillance process computer system provides the operator with the relevant comments on the surveillance signal. The described integration of the on-line manual is an optimization of the operator's everyday job with respect to ergonomics and safety (human performance).

  2. Aeroflex Single Board Computers and Instrument Circuit Cards for Nuclear Environments Measuring and Monitoring

    International Nuclear Information System (INIS)

    Stratton, Sam; Stevenson, Dave; Magnifico, Mateo

    2013-06-01

    A Single Board Computer (SBC) is an entire computer including all of the required components and I/O interfaces built on a single circuit board. SBC's are used across numerous industrial, military and space flight applications. In the case of military and space implementations, SBC's employ advanced high reliability processors designed for rugged thermal, mechanical and even radiation environments. These processors, in turn, rely on equally advanced support components such as memory, interface, and digital logic. When all of these components are put together on a printed circuit card, the result is a highly reliable Single Board Computer that can perform a wide variety of tasks in very harsh environments. In the area of instrumentation, peripheral circuit cards can be developed that directly interface to the SBC and various radiation measuring devices and systems. Designers use signal conditioning and high reliability Analog to Digital Converters (ADC's) to convert the measuring device signals to digital data suitable for a microprocessor. The data can then be sent to the SBC via high speed communication protocols such as Ethernet or similar type of serial bus. Data received by the SBC can then be manipulated and processed into a form readily available to users. Recent events are causing some in the NPP industry to consider devices and systems with better radiation and temperature performance capability. Systems designed for space application are designed for the harsh environment of space which under certain conditions would be similar to what the electronics will see during a severe nuclear reactor event. The NPP industry should be considering higher reliability electronics for certain critical applications. (authors)
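To make the signal-conditioning step concrete, here is a toy model of the ideal N-bit ADC conversion the article describes: a sensor voltage is quantized to a digital code, transferred to the SBC, and decoded back to an engineering value. The 12-bit depth and 5 V reference are arbitrary illustrative choices, not values from any particular instrument card:

```python
# Toy ideal N-bit ADC: quantize a sensor voltage to a code and recover the
# reading on the SBC side. Reference voltage and bit depth are arbitrary.

def adc_encode(volts, vref=5.0, bits=12):
    full_scale = (1 << bits) - 1                 # e.g. 4095 for 12 bits
    code = round(volts / vref * full_scale)
    return max(0, min(full_scale, code))         # clamp to the ADC range

def adc_decode(code, vref=5.0, bits=12):
    return code / ((1 << bits) - 1) * vref
```

The round-trip error is bounded by one least-significant bit (vref / 4095 here), which is the resolution limit the signal-conditioning designer works against.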

  3. Behavioral response and pain perception to computer controlled local anesthetic delivery system and cartridge syringe

    Directory of Open Access Journals (Sweden)

    T D Yogesh Kumar

    2015-01-01

    Full Text Available Aim: The present study evaluated and compared the pain perception, behavioral response, physiological parameters, and the role of topical anesthetic administration during local anesthetic administration with cartridge syringe and computer controlled local anesthetic delivery system (CCLAD). Design: A randomized controlled crossover study was carried out with 120 children aged 7-11 years. They were randomly divided into Group A: receiving injection with CCLAD during the first visit; Group B: receiving injection with cartridge syringe during the first visit. They were further subdivided into three subgroups based on the topical application used: (a) 20% benzocaine; (b) pressure with a cotton applicator; (c) no topical application. Pulse rate and blood pressure were recorded before and during the injection procedure. Objective evaluation of disruptive behavior and subjective evaluation of pain were done using the Face, Legs, Activity, Cry, Consolability (FLACC) scale and the modified facial image scale, respectively. The washout period between the two visits was 1 week. Results: Injections with CCLAD produced significantly lesser pain response and disruptive behavior (P < 0.001), and lower pulse rate (P < 0.05), when compared to cartridge syringe injections. Application of benzocaine produced lesser pain response and disruptive behavior when compared to the other two subgroups, although the result was not significant. Conclusion: Usage of techniques which enhance behavioral response in children, like injections with CCLAD, can be considered a possible step toward achieving a pain-free pediatric dental practice.

  4. Three-dimensional localization of impacted canines and root resorption assessment using cone beam computed tomography.

    Science.gov (United States)

    Almuhtaseb, Eyad; Mao, Jing; Mahony, Derek; Bader, Rawan; Zhang, Zhi-xing

    2014-06-01

    The purpose of this study was to develop a new way to localize impacted canines in three dimensions and to investigate root resorption of the adjacent teeth by using cone beam computed tomography (CBCT). Forty-six patients undergoing orthodontic treatment and having impacted canines in Tongji Hospital were examined. The images of CBCT scans were obtained from KaVo 3D exam vision. Angular and linear measurements of the cusp tip and root apex according to the three planes (mid-sagittal, occlusal and frontal) were taken using the cephalometric tool of the InVivo Dental Anatomage Version 5.1.10. The measurements of the angular and linear coordinates of the maxillary and mandibular canines were obtained. Using this technique the operators could envision the location of the impacted canine according to the three clinical planes. Root resorption of adjacent teeth was found in 28.26% of upper lateral incisors and 17.39% of upper central incisors, but no root resorption was found in the lower arch in our sample. Accurate and reliable localization of the impacted canines could be obtained with this novel analysis system, which offers better surgical and orthodontic treatment for patients with impacted canines.

  5. Multislice Computed Tomography Coronary Angiography at a Local Hospital: Pitfalls and Potential

    Energy Technology Data Exchange (ETDEWEB)

    Kolnes, K.; Velle, Ose H.; Hareide, S.; Hegbom, K.; Wiseth, R. [Volda Hospital (Norway). Depts. of Radiology and Internal Medicine

    2006-09-15

    Purpose: To evaluate whether the favorable results achieved with multislice computed tomography (MSCT) of coronary arteries at larger centers could be paralleled at a local hospital. Material and Methods: Fifty consecutive patients with suspected coronary artery disease scheduled for invasive investigation with quantitative coronary angiography (QCA) at a university hospital underwent MSCT with a 16-slice scanner at a local hospital. Diagnostic accuracy of MSCT for coronary artery disease was assessed using a 16-segment coronary artery model with QCA as the gold standard. Results: Sensitivity, specificity, positive predictive value, and negative predictive value for the detection of >50% stenosis in the 416 assessable segments were 92%, 82%, 53%, and 98%, respectively. Conclusion: Our beginners' experience demonstrated favorable results regarding sensitivity and negative predictive value. The positive predictive value, however, was unsatisfactory. Calcifications were identified as the most important factor for false-positive results with MSCT. With widespread use of MSCT coronary angiography, there is a risk of recruiting patients without significant coronary artery disease to unnecessary and potentially harmful invasive procedures.

  6. Young children reorient by computing layout geometry, not by matching images of the environment.

    Science.gov (United States)

    Lee, Sang Ah; Spelke, Elizabeth S

    2011-02-01

    Disoriented animals from ants to humans reorient in accord with the shape of the surrounding surface layout: a behavioral pattern long taken as evidence for sensitivity to layout geometry. Recent computational models suggest, however, that the reorientation process may not depend on geometrical analyses but instead on the matching of brightness contours in 2D images of the environment. Here we test this suggestion by investigating young children's reorientation in enclosed environments. Children reoriented by extremely subtle geometric properties of the 3D layout: bumps and ridges that protruded only slightly off the floor, producing edges with low contrast. Moreover, children failed to reorient by prominent brightness contours in continuous layouts with no distinctive 3D structure. The findings provide evidence that geometric layout representations support children's reorientation.

  7. Educational Game Design. Bridging the gab between computer based learning and experimental learning environments

    DEFF Research Database (Denmark)

    Andersen, Kristine

    2007-01-01

    Considering the rapidly growing amount of digital educational materials only few of them bridge the gab between experimental learning environments and computer based learning environments (Gardner, 1991). Observations from two cases in primary school and lower secondary school in the subject...... with a prototype of a MOO storyline. The aim of the MOO storyline is to challenge the potential of dialogue, user involvement, and learning responsibility and to use the children?s natural curiosity and motivation for game playing, especially when digital games involves other children. The paper proposes a model......, based on the narrative approach for experimental learning subjects, relying on ideas from Csikszentmihalyis notion of flow (Csikszentmihalyi, 1991), storyline-pedagogy (Meldgaard, 1994) and ideas from Howard Gardner (Gardner, 1991). The model forms the basis for educational games to be used in home...

  8. Simulation-based computation of dose to humans in radiological environments

    Energy Technology Data Exchange (ETDEWEB)

    Breazeal, N.L. [Sandia National Labs., Livermore, CA (United States); Davis, K.R.; Watson, R.A. [Sandia National Labs., Albuquerque, NM (United States); Vickers, D.S. [Brigham Young Univ., Provo, UT (United States). Dept. of Electrical and Computer Engineering; Ford, M.S. [Battelle Pantex, Amarillo, TX (United States). Dept. of Radiation Safety

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface.
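The REMS internals are not described beyond this abstract, but the dose-accumulation idea it names (an exposure rate sampled along a simulated worker path and integrated over time, with distance and shielding taken into account) can be sketched with a toy point-source model. The inverse-square falloff, the attenuation coefficient, and all numbers below are illustrative assumptions, not REMS code:

```python
import math

# Toy dose accumulation along a simulated worker path: a point source with
# inverse-square falloff and exponential shielding attenuation.

def dose_rate(source_strength, distance_m, mu=0.0, shield_cm=0.0):
    """Dose rate (arbitrary units/h) at distance_m from a point source,
    optionally behind shield_cm of material with attenuation coefficient mu."""
    return source_strength / distance_m ** 2 * math.exp(-mu * shield_cm)

def accumulate(path, source_strength, dt_h):
    """path: sequence of (distance_m, shield_cm) samples, one per time step
    of length dt_h hours; returns the integrated dose."""
    return sum(dose_rate(source_strength, d, mu=0.5, shield_cm=s) * dt_h
               for d, s in path)
```

In a system like REMS, the `dose_rate` lookup would come from a radiation transport code or measured data rather than an analytic formula, but the timing/distance/shielding bookkeeping is the same.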

  9. A vector-product information retrieval system adapted to heterogeneous, distributed computing environments

    Science.gov (United States)

    Rorvig, Mark E.

    1991-01-01

    Vector-product information retrieval (IR) systems produce retrieval results superior to all other searching methods but presently have no commercial implementations beyond the personal computer environment. The NASA Electronic Library Systems (NELS) provides a ranked list of the most likely relevant objects in collections in response to a natural language query. Additionally, the system is constructed using standards and tools (Unix, X-Windows, Motif, and TCP/IP) that permit its operation in organizations that possess many different hosts, workstations, and platforms. There are no known commercial equivalents to this product at this time. The product has applications in all corporate management environments, particularly those that are information intensive, such as finance, manufacturing, biotechnology, and research and development.
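The vector-product ranking principle behind a system like NELS can be illustrated with term-frequency vectors and cosine similarity. This is a minimal stand-in for the idea, not the NELS weighting scheme, and all names in it are hypothetical:

```python
import math
from collections import Counter

# Minimal vector-space ranking: score each document by the cosine of the
# angle between its term-frequency vector and the query's vector.

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank(query, docs):
    """docs: {name: text}. Returns document names, best match first."""
    q = Counter(query.lower().split())
    vecs = {name: Counter(text.lower().split()) for name, text in docs.items()}
    return sorted(vecs, key=lambda n: cosine(q, vecs[n]), reverse=True)
```

Production systems replace raw term frequencies with tf-idf or similar weights, but the ranked-list behavior the abstract describes follows from exactly this kind of vector product.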

  10. Simulation-based computation of dose to humans in radiological environments

    International Nuclear Information System (INIS)

    Breazeal, N.L.; Davis, K.R.; Watson, R.A.; Vickers, D.S.; Ford, M.S.

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface

  11. A simple interface to computational fluid dynamics programs for building environment simulations

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, III, C R; Chen, Q [Massachusetts Institute of Technology, Cambridge, MA (United States)

    2000-07-01

    It is becoming a popular practice for architects and HVAC engineers to simulate airflow in and around buildings by computational fluid dynamics (CFD) methods in order to predict indoor and outdoor environment. However, many CFD programs are crippled by a historically poor and inefficient user interface system, particularly for users with little training in numerical simulation. This investigation endeavors to create a simplified CFD interface (SCI) that allows architects and buildings engineers to use CFD without excessive training. The SCI can be easily integrated into new CFD programs. (author)

  12. A room acoustical computer model for industrial environments - the model and its verification

    DEFF Research Database (Denmark)

    Christensen, Claus Lynge; Foged, Hans Torben

    1998-01-01

    This paper presents an extension to the traditional room acoustic modelling methods allowing computer modelling of huge machinery in industrial spaces. The programs in question are Odeon 3.0 Industrial and Odeon 3.0 Combined, which allow the modelling of point sources, surface sources and line...... of an omnidirectional sound source and a microphone. This allows the comparison of simulated results with those measured in real rooms. However, when simulating the acoustic environment in industrial rooms, the sound sources are often far from being point-like, as they can be distributed over a large space...

  13. A mixed-methods exploration of an environment for learning computer programming

    Directory of Open Access Journals (Sweden)

    Richard Mather

    2015-08-01

    Full Text Available A mixed-methods approach is evaluated for exploring collaborative behaviour, acceptance and progress surrounding an interactive technology for learning computer programming. A review of literature reveals a compelling case for using mixed-methods approaches when evaluating technology-enhanced-learning environments. Here, ethnographic approaches used for the requirements engineering of computing systems are combined with questionnaire-based feedback and skill tests. These are applied to the ‘Ceebot’ animated 3D learning environment. Video analysis with workplace observation allowed detailed inspection of problem solving and tacit behaviours. Questionnaires and knowledge tests provided broad sample coverage with insights into subject understanding and overall response to the learning environment. Although relatively low scores in programming tests seemingly contradicted the perception that Ceebot had enhanced understanding of programming, this perception was nevertheless found to be correlated with greater test performance. Video analysis corroborated findings that the learning environment and Ceebot animations were engaging and encouraged constructive collaborative behaviours. Ethnographic observations clearly captured Ceebot’s value in providing visual cues for problem-solving discussions and for progress through sharing discoveries. Notably, performance in tests was most highly correlated with greater programming practice (p≤0.01). It was apparent that although students had appropriated technology for collaborative working and benefitted from visual and tacit cues provided by Ceebot, they had not necessarily deeply learned the lessons intended. The key value of the ‘mixed-methods’ approach was that ethnographic observations captured the authenticity of learning behaviours, and thereby strengthened confidence in the interpretation of questionnaire and test findings.

  14. RSSI-Based Distance Estimation Framework Using a Kalman Filter for Sustainable Indoor Computing Environments

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2016-11-01

    Full Text Available Given that location information is the key to providing a variety of services in sustainable indoor computing environments, it is required to obtain accurate locations. A location can be estimated from three distances to three fixed points. Therefore, if the distance between two points can be measured or estimated accurately, the location in indoor environments can be estimated. To increase the accuracy of the measured distance, noise filtering, signal revision, and distance estimation processes are generally performed. This paper proposes a novel framework for estimating the distance between a beacon and an access point (AP) in a sustainable indoor computing environment. Diverse types of received signal strength indications (RSSIs) are used for WiFi, Bluetooth, and radio signals, and the proposed distance estimation framework is unique in that it is independent of the specific wireless signal involved, being based on the Bluetooth signal of the beacon. Generally, RSSI measurement, noise filtering, and revision are required for distance estimation using RSSIs. The employed RSSIs are first measured from an AP, with multiple APs sometimes used to increase the accuracy of the distance estimation. Owing to the inevitable presence of noise in the measured RSSIs, the application of noise filtering is essential, and further revision is used to address the inaccuracy and instability that characterize RSSIs measured in an indoor environment. The revised RSSIs are then used to estimate the distance. The proposed distance estimation framework uses one AP to measure the RSSIs, a Kalman filter to eliminate noise, and a log-distance path loss model to revise the measured RSSIs. In the experimental implementation of the framework, both an RSSI filter and a Kalman filter were used for noise elimination, to comparatively evaluate the performance of the latter for the specific application. The Kalman filter was found to reduce the accumulated errors by 8
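The pipeline the abstract describes, a scalar Kalman filter smoothing the raw RSSIs followed by inversion of a log-distance path-loss model, can be sketched as follows. The noise parameters, the 1 m reference RSSI, and the path-loss exponent are illustrative assumptions, not values from the paper:

```python
# Sketch of RSSI-based distance estimation: Kalman-smooth the RSSI stream,
# then invert a log-distance path-loss model. Parameters are illustrative.

def kalman_smooth(rssis, q=0.05, r=4.0):
    """Scalar Kalman filter with a constant-state model.
    q: process noise variance, r: measurement noise variance."""
    x, p = rssis[0], 1.0
    out = []
    for z in rssis:
        p += q                      # predict: state assumed constant
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # correct with measurement z
        p *= (1 - k)
        out.append(x)
    return out

def rssi_to_distance(rssi, rssi_at_1m=-59.0, path_loss_exp=2.0):
    """Invert RSSI = rssi_at_1m - 10 * n * log10(d) for the distance d (m)."""
    return 10 ** ((rssi_at_1m - rssi) / (10 * path_loss_exp))
```

With these parameters, a 20 dB drop below the 1 m reference corresponds to a tenfold distance; the filter's q/r ratio trades responsiveness against smoothing, which is the tuning question the paper's evaluation addresses.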

  15. G-LoSA: An efficient computational tool for local structure-centric biological studies and drug design.

    Science.gov (United States)

    Lee, Hui Sun; Im, Wonpil

    2016-04-01

    Molecular recognition by protein mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence order independent way and provides a GA-score, a chemical feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA to the local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. © 2016 The Protein Society.

  16. Discovering local patterns of co - evolution: computational aspects and biological examples

    Directory of Open Access Journals (Sweden)

    Tuller Tamir

    2010-01-01

    Full Text Available Abstract Background Co-evolution is the process in which two (or more) sets of orthologs exhibit a similar or correlative pattern of evolution. Co-evolution is a powerful way to learn about the functional interdependencies between sets of genes and cellular functions and to predict physical interactions. More generally, it can be used for answering fundamental questions about the evolution of biological systems. Orthologs that exhibit a strong signal of co-evolution in a certain part of the evolutionary tree may show a mild signal of co-evolution in other branches of the tree. The major reasons for this phenomenon are noise in the biological input, genes that gain or lose functions, and the fact that some measures of co-evolution relate to rare events such as positive selection. Previous publications in the field dealt with the problem of finding sets of genes that co-evolved along an entire underlying phylogenetic tree, without considering the fact that often co-evolution is local. Results In this work, we describe a new set of biological problems that are related to finding patterns of local co-evolution. We discuss their computational complexity and design algorithms for solving them. These algorithms outperform other bi-clustering methods as they are designed specifically for solving the set of problems mentioned above. We use our approach to trace the co-evolution of fungal, eukaryotic, and mammalian genes at high resolution across the different parts of the corresponding phylogenetic trees. Specifically, we discover regions in the fungi tree that are enriched with positive evolution. We show that metabolic genes exhibit a remarkable level of co-evolution and different patterns of co-evolution in various biological datasets. In addition, we find that protein complexes that are related to gene expression exhibit non-homogeneous levels of co-evolution across different parts of the fungi evolutionary line. In the case of mammalian evolution

  17. An analytically based numerical method for computing view factors in real urban environments

    Science.gov (United States)

    Lee, Doo-Il; Woo, Ju-Wan; Lee, Sang-Hyun

    2018-01-01

    A view factor is an important morphological parameter used in parameterizing the in-canyon radiative energy exchange process as well as in characterizing local climate over urban environments. For realistic representation of the in-canyon radiative processes, a complete set of view factors at the horizontal and vertical surfaces of urban facets is required. Various analytical and numerical methods have been suggested to determine the view factors for urban environments, but most of the methods provide only the sky-view factor at the ground level of a specific location or assume a simplified morphology of complex urban environments. In this study, a numerical method that can determine the sky-view factors (ψ_ga and ψ_wa) and wall-view factors (ψ_gw and ψ_ww) at the horizontal and vertical surfaces is presented for application to real urban morphology, which is derived from an analytical formulation of the view factor between two blackbody surfaces of arbitrary geometry. The established numerical method is validated against the analytical sky-view factor estimation for ideal street canyon geometries, showing a consolidated confidence in accuracy with errors of less than 0.2%. Using a three-dimensional building database, the numerical method is also demonstrated to be applicable in determining the sky-view factors at the horizontal (roofs and roads) and vertical (walls) surfaces in real urban environments. The results suggest that the analytically based numerical method can be used for the radiative process parameterization of urban numerical models as well as for the characterization of local urban climate.
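For the ideal-canyon validation case the abstract mentions, the sky-view factor at the mid-floor of an infinitely long street canyon of height H and width W has the closed form ψ = 1/sqrt(1 + (2H/W)^2), and a cosine-weighted Monte Carlo ray cast (a simple stand-in for the paper's numerical method, not its actual algorithm) should converge to it:

```python
import math
import random

def svf_analytic(h, w):
    """Sky-view factor at the mid-floor of an infinitely long canyon."""
    return 1.0 / math.sqrt(1.0 + (2.0 * h / w) ** 2)

def svf_monte_carlo(h, w, n=200_000, seed=1):
    """Cast cosine-weighted rays from the floor center; count sky escapes."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # Cosine-weighted zenith angle in 2D: sin(theta) uniform on [-1, 1).
        theta = math.asin(2.0 * rng.random() - 1.0)
        if abs(math.tan(theta)) <= w / (2.0 * h):
            hits += 1          # ray clears the wall tops and reaches sky
    return hits / n
```

For an H/W = 1 canyon both give about 0.447, and the same ray-casting idea generalizes to arbitrary 3D building geometry where no closed form exists.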

  18. A Multi-Sensorial Simultaneous Localization and Mapping (SLAM) System for Low-Cost Micro Aerial Vehicles in GPS-Denied Environments.

    Science.gov (United States)

    López, Elena; García, Sergio; Barea, Rafael; Bergasa, Luis M; Molinos, Eduardo J; Arroyo, Roberto; Romera, Eduardo; Pardo, Samuel

    2017-04-08

    One of the main challenges of aerial robot navigation in indoor or GPS-denied environments is position estimation using only the available onboard sensors. This paper presents a Simultaneous Localization and Mapping (SLAM) system that remotely calculates the pose and environment map of different low-cost commercial aerial platforms, whose onboard computing capacity is usually limited. The proposed system adapts to the sensory configuration of the aerial robot by integrating different state-of-the-art SLAM methods based on vision, laser and/or inertial measurements using an Extended Kalman Filter (EKF). To do this, a minimum onboard sensory configuration is assumed, consisting of a monocular camera, an Inertial Measurement Unit (IMU) and an altimeter. This improves the results of well-known monocular visual SLAM methods (LSD-SLAM and ORB-SLAM are tested and compared in this work) by solving scale ambiguity and providing additional information to the EKF. When payload and computational capabilities permit, a 2D laser sensor can be easily incorporated into the SLAM system, obtaining a local 2.5D map and a footprint estimation of the robot position that improves the 6D pose estimation through the EKF. We present some experimental results with two different commercial platforms, and validate the system by applying it to their position control.
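The fusion idea at the core of such a system, predicting from inertial measurements and correcting with an absolute sensor through a Kalman filter, can be reduced to a one-dimensional sketch. This is an illustrative analogue with made-up noise parameters, not the paper's EKF formulation:

```python
# 1D analogue of IMU/altimeter fusion: a Kalman filter tracking altitude,
# predicting with vertical velocity and correcting with altimeter readings.
# q: process noise variance, r: altimeter noise variance (illustrative).

def kf_altitude(v_meas, z_meas, dt=0.1, q=0.01, r=0.25):
    """v_meas: vertical-velocity samples; z_meas: altimeter samples."""
    z, p = z_meas[0], 1.0
    track = []
    for v, alt in zip(v_meas, z_meas):
        z += v * dt          # predict: integrate velocity over the step
        p += q
        k = p / (p + r)      # correct: blend in the altimeter reading
        z += k * (alt - z)
        p *= 1 - k
        track.append(z)
    return track
```

The full system does the same blend in 6D, with the altimeter additionally anchoring the metric scale that a monocular camera alone cannot observe.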

  19. Assessment of modern smartphone sensors performance on vehicle localization in urban environments

    Science.gov (United States)

    Lazarou, Theodoros; Danezis, Chris

    2017-09-01

    The advent of Global Navigation Satellite Systems (GNSS) initiated a revolution in Positioning, Navigation and Timing (PNT) applications. Besides the enormous impact on geospatial data acquisition and reality capture, satellite navigation has penetrated everyday life, a fact proved by the increasing degree of human reliance on GNSS-enabled smart devices to perform casual activities. Nevertheless, GNSS does not perform well in all cases. Specifically, in GNSS-challenging environments, such as urban canyons or forested areas, navigation performance may be significantly degraded or even nullified. Consequently, positioning is achieved by combining GNSS with additional heterogeneous information or sensors, such as inertial sensors. To date, most smartphones are equipped with at least accelerometers and gyroscopes, besides GNSS chipsets. In the frame of this research, difficult localization scenarios were investigated to assess the performance of these low-cost inertial sensors with respect to higher-grade GNSS and IMU systems. Four state-of-the-art smartphones were mounted on a specifically designed, purpose-built platform along with the reference equipment. The platform was installed on top of a vehicle, which was driven along a predefined trajectory that included several GNSS-challenging parts. Positioning and inertial readings were then acquired by the smartphones and compared to the information collected by the reference equipment. The results indicated that although the smartphone GNSS receivers have increased sensitivity, they were unable to produce an acceptable solution for more than 30% of the driven course. However, all smartphones managed to identify, up to a satisfactory degree, distinct driving features, such as curves or bumps.

  20. VAT: a computational framework to functionally annotate variants in personal genomes within a cloud-computing environment.

    Science.gov (United States)

    Habegger, Lukas; Balasubramanian, Suganthi; Chen, David Z; Khurana, Ekta; Sboner, Andrea; Harmanci, Arif; Rozowsky, Joel; Clarke, Declan; Snyder, Michael; Gerstein, Mark

    2012-09-01

    The functional annotation of variants obtained through sequencing projects is generally assumed to be a simple intersection of genomic coordinates with genomic features. However, complexities arise for several reasons, including the differential effects of a variant on alternatively spliced transcripts, as well as the difficulty in assessing the impact of small insertions/deletions and large structural variants. Taking these factors into consideration, we developed the Variant Annotation Tool (VAT) to functionally annotate variants from multiple personal genomes at the transcript level as well as obtain summary statistics across genes and individuals. VAT also allows visualization of the effects of different variants, integrates allele frequencies and genotype data from the underlying individuals and facilitates comparative analysis between different groups of individuals. VAT can either be run through a command-line interface or as a web application. Finally, in order to enable on-demand access and to minimize unnecessary transfers of large data files, VAT can be run as a virtual machine in a cloud-computing environment. VAT is implemented in C and PHP. The VAT web service, Amazon Machine Image, source code and detailed documentation are available at vat.gersteinlab.org.
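The core annotation step described above, intersecting a variant's coordinate with the exon intervals of each alternatively spliced transcript, can be sketched as follows. The gene and transcript names and the interval layout are invented for illustration and do not reflect VAT's internal data model.

```python
# Hypothetical minimal annotator: a variant is checked against every
# transcript of its gene, since the same position can be exonic in one
# isoform and intronic in another (names and coordinates are made up).
transcripts = {
    "GENE1-T1": [(100, 200), (300, 400)],   # exon intervals (1-based, inclusive)
    "GENE1-T2": [(100, 200)],               # shorter isoform lacks the 2nd exon
}

def annotate(pos):
    """Return a per-transcript annotation for a single-nucleotide variant."""
    return {tx: ("exonic" if any(s <= pos <= e for s, e in exons)
                 else "intronic/other")
            for tx, exons in transcripts.items()}

print(annotate(350))
```

The differential effect on isoforms is exactly why a simple gene-level intersection is insufficient: position 350 here is exonic in one transcript but not in the other.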

  1. The Influence of Trainee Gaming Experience and Computer Self-Efficacy on Learner Outcomes of Videogame-Based Learning Environments

    National Research Council Canada - National Science Library

    Orvis, Karin A; Orvis, Kara L; Belanich, James; Mullin, Laura N

    2005-01-01

    .... The purpose of the current research was to investigate the influence of two trainee characteristics, prior videogame experience and computer self-efficacy, on learner outcomes of a videogame-based training environment...

  2. A Hybrid Scheme for Fine-Grained Search and Access Authorization in Fog Computing Environment

    Science.gov (United States)

    Xiao, Min; Zhou, Jing; Liu, Xuejiao; Jiang, Mingda

    2017-01-01

    In the fog computing environment, encrypted sensitive data may be transferred to multiple fog nodes on the edge of a network for low latency; thus, fog nodes need to implement search over encrypted data as a cloud server does. Since fog nodes tend to serve IoT applications often running on resource-constrained end devices, it is necessary to design lightweight solutions. At present, there is little research on this issue. In this paper, we propose a fine-grained owner-enforced data search and access authorization scheme spanning user-fog-cloud for resource-constrained end users. Compared to existing schemes that support either index encryption with search ability or data encryption with fine-grained access control, but not both, the proposed hybrid scheme supports both abilities simultaneously; index ciphertext and data ciphertext are constructed from a single ciphertext-policy attribute-based encryption (CP-ABE) primitive and share the same key pair, so data access efficiency is significantly improved and the cost of key management is greatly reduced. Moreover, in the proposed scheme, resource-constrained end devices are allowed to rapidly assemble ciphertexts online and to securely outsource most of the decryption task to fog nodes, and a mediated encryption mechanism is adopted to achieve instantaneous user revocation instead of re-encrypting the many copies of ciphertexts held on many fog nodes. The security and performance analyses show that our scheme is suitable for a fog computing environment. PMID:28629131

  3. CLINICAL SURFACES - Activity-Based Computing for Distributed Multi-Display Environments in Hospitals

    Science.gov (United States)

    Bardram, Jakob E.; Bunde-Pedersen, Jonathan; Doryab, Afsaneh; Sørensen, Steffen

    A multi-display environment (MDE) is made up of co-located and networked personal and public devices that form an integrated workspace enabling co-located group work. Traditionally, however, MDEs have mainly been designed to support a single “smart room”, and have had little sense of the tasks and activities that the MDE is being used for. This paper presents a novel approach to supporting activity-based computing in distributed MDEs, where displays are physically distributed across a large building. CLINICAL SURFACES was designed for clinical work in hospitals, and enables context-sensitive retrieval and browsing of patient data on public displays. We present the design and implementation of CLINICAL SURFACES, and report from an evaluation of the system at a large hospital. The evaluation shows that using distributed public displays to support activity-based computing inside a hospital is very useful for clinical work, and that the apparent conflict between the publicness of the displays and the privacy of medical data can be mitigated by the use of CLINICAL SURFACES.

  4. Characteristics of Israeli School Teachers in Computer-based Learning Environments

    Directory of Open Access Journals (Sweden)

    Noga Magen-Nagar

    2013-01-01

    Full Text Available The purpose of this research is to investigate whether there are differences in the level of computer literacy, the extent of ICT implementation in teaching and learning-assessment processes, and attitudes between teachers from computerized schools and teachers from non-computerized schools. In addition, the research investigates the characteristics of Israeli school teachers in a 21st-century computer-based learning environment. A quantitative research methodology was used. The research sample included 811 elementary school teachers from the Jewish sector, of whom 402 were from the computerized school sample and 409 from the non-computerized school sample. The research findings show that teachers from the computerized school sample are more familiar with ICT, tend to use ICT more, and have a more positive attitude towards ICT than teachers in the non-computerized school sample. The main conclusion which can be drawn from this research is that positive teacher attitudes towards ICT are not sufficient for the integration of technology to occur. Future emphasis on new teaching skills of collective Technological Pedagogical Content Knowledge is necessary to promote the implementation of optimal pedagogy in innovative environments.

  5. Development of a computational environment for the General Curvilinear Ocean Model

    International Nuclear Information System (INIS)

    Thomas, Mary P; Castillo, Jose E

    2009-01-01

    The General Curvilinear Ocean Model (GCOM) differs significantly from the traditional approach, in which the use of Cartesian coordinates forces the model to simulate terrain as a series of steps. GCOM utilizes a full three-dimensional curvilinear transformation, which has been shown to have greater accuracy than similar models and to achieve results more efficiently. The GCOM model has been validated for several types of water bodies and for different coastlines and bottom shapes, including the Alarcon Seamount, the Southern California Coastal Region, Lake Valencia in Venezuela, and more recently Monterey Bay. In this paper, enhancements to the GCOM model and an overview of the computational environment (GCOM-CE) are presented. Model improvements include migration from F77 to F90, a move toward a component-based design, and initial steps towards parallelization of the model. Through the component design, new models are being incorporated, including biogeochemical, pollution, and sediment transport models. The computational environment is designed to allow various client interactions via secure Web applications (portal, Web services, and Web 2.0 gadgets). Features include building jobs; managing and interacting with long-running jobs; managing input and output files; quick visualization of results; and publishing of Web services to be used by other systems such as larger climate models. The CE is based mainly on Python tools, including a grid-enabled Pylons Web application framework for Web services, pyWSRF (python-Web Services-Resource Framework), pyGlobus-based web services, SciPy, and Google code tools.

  6. A Hybrid Scheme for Fine-Grained Search and Access Authorization in Fog Computing Environment.

    Science.gov (United States)

    Xiao, Min; Zhou, Jing; Liu, Xuejiao; Jiang, Mingda

    2017-06-17

    In the fog computing environment, encrypted sensitive data may be transferred to multiple fog nodes on the edge of a network for low latency; thus, fog nodes need to implement search over encrypted data as a cloud server does. Since fog nodes tend to serve IoT applications often running on resource-constrained end devices, it is necessary to design lightweight solutions. At present, there is little research on this issue. In this paper, we propose a fine-grained owner-enforced data search and access authorization scheme spanning user-fog-cloud for resource-constrained end users. Compared to existing schemes that support either index encryption with search ability or data encryption with fine-grained access control, but not both, the proposed hybrid scheme supports both abilities simultaneously; index ciphertext and data ciphertext are constructed from a single ciphertext-policy attribute-based encryption (CP-ABE) primitive and share the same key pair, so data access efficiency is significantly improved and the cost of key management is greatly reduced. Moreover, in the proposed scheme, resource-constrained end devices are allowed to rapidly assemble ciphertexts online and to securely outsource most of the decryption task to fog nodes, and a mediated encryption mechanism is adopted to achieve instantaneous user revocation instead of re-encrypting the many copies of ciphertexts held on many fog nodes. The security and performance analyses show that our scheme is suitable for a fog computing environment.

  7. local

    Directory of Open Access Journals (Sweden)

    Abílio Amiguinho

    2005-01-01

    Full Text Available This text addresses the process of socio-educational territorialisation in rural contexts, a theme that can be framed either around the problem of social exclusion or around that of local development. The first part of the text discusses the reasons for situating the discussion in the latter field of analysis. Theoretical and political reasons are articulated there, because these are projects whose intentions and practices call for the political, both in the theoretical debate and in the choices that precede intervention. Drawing on research conducted over several years, I use contributions that aim to discuss and illuminate how the school can be a potential locus of local development. Its identification and recognition as a local institution (both by those who work and live in it and by those who act in the surrounding context) are crucial steps in progressively constituting the school as a partner for development. The promotion of local values and roots, the reconstruction of socio-personal and local identities, the production of sociabilities, and the framing and solving of shared problems were the dimensions of a markedly globalising socio-educative intervention. This scenario, it is argued, was also, intentionally, one of transformation and deliberate change of the school and of the administration of educational territories.

  8. Sex-specific effects of the local social environment on juvenile post-fledging dispersal in great tits

    NARCIS (Netherlands)

    Michler, Stephanie P. M.; Nicolaus, Marion; Ubels, Richard; van der Velde, Marco; Komdeur, Jan; Both, Christiaan; Tinbergen, Joost M.; Gibson, R.

    2011-01-01

    An individual's decision to disperse from the natal habitat can affect its future fitness prospects. Especially in species with sex-biased dispersal, we expect the cost-benefit balance for dispersal to vary according to the social environment (e.g., local sex ratio and density). However, little is

  9. Aerosol transport simulations in indoor and outdoor environments using computational fluid dynamics (CFD)

    Science.gov (United States)

    Landazuri, Andrea C.

    This dissertation focuses on aerosol transport modeling in occupational environments and mining sites in Arizona using computational fluid dynamics (CFD). The impacts of human exposure in both environments are explored, with emphasis on turbulence, wind speed, wind direction and particle sizes. The final emissions simulations involved digitizing available elevation contour plots of one of the mining sites to account for realistic topographical features. The digital elevation map (DEM) of one of the sites was imported into COMSOL MULTIPHYSICS for subsequent turbulence and particle simulations. Simulation results that include realistic topography show considerable deviations of wind direction. Inter-element correlation results using metal and metalloid size-resolved concentration data from a Micro-Orifice Uniform Deposit Impactor (MOUDI) under given wind speeds and directions provided guidance on groups of metals that coexist throughout mining activities. The groups Fe-Mg, Cr-Fe, Al-Sc, Sc-Fe, and Mg-Al are strongly correlated for unrestricted wind directions and speeds, suggesting that the source may be of soil origin (e.g. ore and tailings); also, groups of elements where Cu is present, in the coarse fraction range, may come from mechanical-action mining activities and the saltation phenomenon. In addition, MOUDI data under low wind speeds (Computational Fluid Dynamics can be used as a source apportionment tool to identify areas that have an effect on specific sampling points and regions susceptible under certain meteorological conditions, and these conclusions can be supported with inter-element correlation matrices and lead isotope analysis, especially since there is limited access to the mining sites. 
Additional results concluded that grid adaptation is a powerful tool that allows specific regions requiring great detail to be refined, and therefore better resolves flow detail, and provides a higher number of locations with monotonic convergence than the
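An inter-element correlation matrix of the kind used above for source apportionment can be computed directly from size-resolved concentration samples. The element list and the synthetic "soil signal" below are made-up stand-ins for real MOUDI data.

```python
import numpy as np

# Illustrative size-resolved concentrations (rows = samples, columns =
# elements); the element names and values are invented. Two elements
# share a common soil-dust signal, one varies independently.
elements = ["Fe", "Mg", "Cr"]
rng = np.random.default_rng(2)
soil = rng.random(30)                          # shared soil-dust signal
data = np.column_stack([
    soil + 0.05 * rng.standard_normal(30),     # Fe tracks the soil signal
    soil + 0.05 * rng.standard_normal(30),     # Mg tracks the soil signal
    rng.random(30),                            # Cr independent in this toy case
])
corr = np.corrcoef(data, rowvar=False)         # 3x3 Pearson correlation matrix
for name, row in zip(elements, np.round(corr, 2)):
    print(name, row)
```

Strongly correlated pairs (here Fe-Mg) point at a common source, which is how groups such as Fe-Mg or Cr-Fe above suggest a soil origin.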

  10. Long-term changes of information environments and computer anxiety of nurse administrators in Japan.

    Science.gov (United States)

    Majima, Yukie; Izumi, Takako

    2013-01-01

    In Japan, medical information systems, including electronic medical records, are increasingly being introduced in medical and nursing fields. Nurse administrators, who are involved in the introduction of medical information systems and must make sound judgments about them, are particularly required to have at least minimal knowledge of computers and networks and the ability to think about easy-to-use medical information systems. However, few of the current generation of nurse administrators studied information science subjects in their basic education curriculum. Information education for nurse administrators has therefore become a pressing issue. Consequently, in this study, we surveyed participants taking the first-level program of the education course for Japanese certified nurse administrators to ascertain their actual conditions, such as the information environments nurse administrators work in and their anxiety toward computers. Comparisons over the seven years since 2004 revealed that although the introduction of electronic medical records in hospitals was progressing, little change was observed in the attributes of participants taking the course, such as computer anxiety.

  11. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks.

    Science.gov (United States)

    Devi, D Chitra; Uthariaraj, V Rhymend

    2016-01-01

    Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs so that resources are shared effectively. The scheduling of nonpreemptive tasks in a cloud computing environment is irreversible, and hence tasks have to be assigned to the most appropriate VMs at initial placement. In practice, arriving jobs consist of multiple interdependent tasks, and their independent tasks may execute on multiple VMs or on multiple cores of the same VM. Moreover, jobs arrive during the run time of the server at varying random intervals and under various load conditions. The participating heterogeneous resources are managed by allocating tasks to appropriate resources through static or dynamic scheduling, making cloud computing more efficient and thus improving user satisfaction. The objective of this work is to introduce and evaluate a scheduling and load balancing algorithm that considers the capabilities of each virtual machine (VM), the task length of each requested job, and the interdependency of multiple tasks. The performance of the proposed algorithm is studied by comparison with existing methods.
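A minimal sketch of capacity-weighted task placement in the spirit described above. The greedy rule below (assign each task to the VM with the lowest resulting load-to-capacity ratio) is an illustrative stand-in for the paper's improved weighted round-robin algorithm, and the VM and task names and MIPS figures are invented.

```python
def weighted_round_robin(tasks, vms):
    """Place nonpreemptive tasks on VMs in proportion to VM capacity.

    tasks: {task_name: task_length}, vms: {vm_name: capacity (e.g. MIPS)}.
    Longer tasks are placed first; each goes to the VM whose relative
    load (accumulated length / capacity) would stay lowest.
    """
    load = {vm: 0.0 for vm in vms}
    plan = {}
    for task, length in sorted(tasks.items(), key=lambda kv: -kv[1]):
        vm = min(vms, key=lambda v: (load[v] + length) / vms[v])
        load[vm] += length
        plan[task] = vm
    return plan

vms = {"vm1": 1000, "vm2": 2000}                 # hypothetical MIPS capacities
tasks = {"t1": 400, "t2": 300, "t3": 300, "t4": 200}
print(weighted_round_robin(tasks, vms))
```

Because placement is irreversible for nonpreemptive tasks, weighting by capacity at the initial assignment (rather than rebalancing later) is exactly the point the abstract makes.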

  12. The Case for Higher Computational Density in the Memory-Bound FDTD Method within Multicore Environments

    Directory of Open Access Journals (Sweden)

    Mohammed F. Hadi

    2012-01-01

    Full Text Available It is argued here that more accurate, though more compute-intensive, alternatives to certain computational methods that are deemed too inefficient and wasteful when implemented in serial codes can be more efficient and cost-effective when implemented in parallel codes designed to run on today's multicore and many-core environments. This argument is most germane to methods that involve large data sets with relatively limited computational density, in other words, algorithms with small ratios of floating-point operations to memory accesses. The examples chosen here to support this argument represent a variety of high-order finite-difference time-domain algorithms. It will be demonstrated that a three- to eightfold increase in floating-point operations due to higher-order finite differences translates to only a two- to threefold increase in actual run times on either the graphical or central processing units of today. It is hoped that this argument will convince researchers to revisit certain numerical techniques that have long been shelved and reevaluate them for multicore usability.
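The trade-off the article describes, more floating-point work per memory access, is easy to see with 1D central-difference stencils: the fourth-order formula below costs roughly three times the flops of the second-order one while touching nearly the same memory (the extra neighbours usually sit in cache), and repays that cost with far lower truncation error.

```python
import numpy as np

def d_dx_2nd(f, dx):
    """Second-order central difference: ~1 subtract + 1 multiply per point."""
    return (f[2:] - f[:-2]) / (2 * dx)

def d_dx_4th(f, dx):
    """Fourth-order central difference: ~3x the flops, nearly the same
    memory traffic, much smaller truncation error (O(dx^4) vs O(dx^2))."""
    return (-f[4:] + 8 * f[3:-1] - 8 * f[1:-3] + f[:-4]) / (12 * dx)

x = np.linspace(0, 2 * np.pi, 1001)
f = np.sin(x)
dx = x[1] - x[0]
err2 = np.max(np.abs(d_dx_2nd(f, dx) - np.cos(x[1:-1])))
err4 = np.max(np.abs(d_dx_4th(f, dx) - np.cos(x[2:-2])))
print(err2, err4)
```

On memory-bandwidth-bound hardware the threefold flop increase largely hides behind the memory accesses, which is the article's core argument for higher computational density.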

  13. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks

    Directory of Open Access Journals (Sweden)

    D. Chitra Devi

    2016-01-01

    Full Text Available Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs so that resources are shared effectively. The scheduling of nonpreemptive tasks in a cloud computing environment is irreversible, and hence tasks have to be assigned to the most appropriate VMs at initial placement. In practice, arriving jobs consist of multiple interdependent tasks, and their independent tasks may execute on multiple VMs or on multiple cores of the same VM. Moreover, jobs arrive during the run time of the server at varying random intervals and under various load conditions. The participating heterogeneous resources are managed by allocating tasks to appropriate resources through static or dynamic scheduling, making cloud computing more efficient and thus improving user satisfaction. The objective of this work is to introduce and evaluate a scheduling and load balancing algorithm that considers the capabilities of each virtual machine (VM), the task length of each requested job, and the interdependency of multiple tasks. The performance of the proposed algorithm is studied by comparison with existing methods.

  14. Elastic Scheduling of Scientific Workflows under Deadline Constraints in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Nazia Anwar

    2018-01-01

    Full Text Available Scientific workflow applications are collections of several structured activities and fine-grained computational tasks. Scientific workflow scheduling in cloud computing is a challenging research topic due to its distinctive features. In cloud environments, it has become critical to perform efficient task scheduling that reduces scheduling overhead, minimizes cost and maximizes resource utilization while still meeting the user-specified overall deadline. This paper proposes a strategy, Dynamic Scheduling of Bag of Tasks based workflows (DSB), for scheduling scientific workflows with the aim of minimizing the financial cost of leasing Virtual Machines (VMs) under a user-defined deadline constraint. The proposed model groups the workflow into Bags of Tasks (BoTs) based on data dependency and priority constraints and thereafter optimizes the allocation and scheduling of BoTs on elastic, heterogeneous and dynamically provisioned cloud resources (VMs) in order to attain the proposed method's objectives. The proposed approach considers pay-as-you-go Infrastructure as a Service (IaaS) clouds having inherent features such as elasticity, abundance, heterogeneity and VM provisioning delays. A trace-based simulation using benchmark scientific workflows representing real-world applications demonstrates a significant reduction in workflow computation cost while the workflow deadline is met. The results validate that the proposed model produces better success rates in meeting deadlines and better cost efficiency in comparison to state-of-the-art algorithms adapted to similar problems.
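The grouping step, partitioning the workflow DAG into Bags of Tasks whose members have no dependencies among themselves, can be sketched by levelling the DAG by dependency depth. The task names are illustrative, and the priority handling and VM provisioning of the actual DSB algorithm are omitted.

```python
from collections import defaultdict

def bags_of_tasks(deps):
    """Group DAG tasks into Bags of Tasks by dependency level: every task
    in bag k depends only on tasks in bags < k, so each bag's tasks can
    run in parallel once the previous bags finish. Grouping step only."""
    level = {}

    def depth(t):
        if t not in level:
            level[t] = 1 + max((depth(p) for p in deps[t]), default=0)
        return level[t]

    bags = defaultdict(list)
    for t in deps:
        bags[depth(t)].append(t)
    return dict(bags)

# toy workflow: t3 needs t1 and t2, t4 needs t3
deps = {"t1": [], "t2": [], "t3": ["t1", "t2"], "t4": ["t3"]}
print(bags_of_tasks(deps))
```

Scheduling then proceeds bag by bag, which is what lets the scheduler lease just enough VMs per level to meet the deadline at minimum cost.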

  15. PABLM: a computer program to calculate accumulated radiation doses from radionuclides in the environment

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Kennedy, W.E. Jr.; Soldat, J.K.

    1980-03-01

    A computer program, PABLM, was written to facilitate the calculation of internal radiation doses to man from radionuclides in food products and external radiation doses from radionuclides in the environment. This report contains details of the mathematical models used and the calculational procedures required to run the computer program. Radiation doses from radionuclides in the environment may be calculated from deposition on the soil or plants during an atmospheric or liquid release, or from exposure to residual radionuclides in the environment after the releases have ended. Radioactive decay is considered during the release of radionuclides, after they are deposited on the plants or ground, and during holdup of food after harvest. The radiation dose models consider several exposure pathways. Doses may be calculated for either a maximum-exposed individual or for a population group. The doses calculated are accumulated doses from continuous chronic exposure. A first-year committed dose is calculated as well as an integrated dose for a selected number of years. The equations for calculating internal radiation doses are derived from those given by the International Commission on Radiological Protection (ICRP) for body burdens and MPCs of each radionuclide. The radiation doses from external exposure to contaminated water and soil are calculated using the basic assumption that the contaminated medium is large enough to be considered an infinite volume or plane relative to the range of the emitted radiations. The equations for calculation of the radiation dose from external exposure to shoreline sediments include a correction for the finite width of the contaminated beach.
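The accumulated-dose bookkeeping described above, integrating an exponentially decaying dose rate over an exposure period, reduces to a simple closed form for a single radionuclide with no further deposition. The half-life and dose-rate numbers below are illustrative, not PABLM's actual dose factors.

```python
import math

def accumulated_dose(r0, half_life_y, years):
    """Dose accumulated over `years` from an initial dose rate r0
    (e.g. rem/y) that decays with the given half-life: the closed form
    of D = integral of r0 * exp(-lambda * t) dt from 0 to `years`.
    Illustrative single-nuclide case, not the PABLM code itself."""
    lam = math.log(2) / half_life_y          # decay constant (1/y)
    return r0 / lam * (1 - math.exp(-lam * years))

# a first-year dose vs a 50-year integrated dose, for a ~30-year half-life
print(accumulated_dose(1.0, 30.0, 1), accumulated_dose(1.0, 30.0, 50))
```

The same integral evaluated for 1 year and for N years is exactly the pairing the report describes: a first-year committed dose alongside an integrated dose for a selected number of years.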

  16. Emerging and Future Computing Paradigms and Their Impact on the Research, Training, and Design Environments of the Aerospace Workforce

    Science.gov (United States)

    Noor, Ahmed K. (Compiler)

    2003-01-01

    The document contains the proceedings of the training workshop on Emerging and Future Computing Paradigms and their impact on the Research, Training and Design Environments of the Aerospace Workforce. The workshop was held at NASA Langley Research Center, Hampton, Virginia, March 18 and 19, 2003. The workshop was jointly sponsored by Old Dominion University and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to a) provide broad overviews of the diverse activities related to new computing paradigms, including grid computing, pervasive computing, high-productivity computing, and the IBM-led autonomic computing; and b) identify future directions for research that have high potential for future aerospace workforce environments. The format of the workshop included twenty-one, half-hour overview-type presentations and three exhibits by vendors.

  17. Computational local stiffness analysis of biological cell: High aspect ratio single wall carbon nanotube tip

    Energy Technology Data Exchange (ETDEWEB)

    TermehYousefi, Amin, E-mail: at.tyousefi@gmail.com [Department of Human Intelligence Systems, Graduate School of Life Science and Systems Engineering, Kyushu Institute of Technology (Kyutech) (Japan); Bagheri, Samira; Shahnazar, Sheida [Nanotechnology & Catalysis Research Centre (NANOCAT), IPS Building, University Malaya, 50603 Kuala Lumpur (Malaysia); Rahman, Md. Habibur [Department of Computer Science and Engineering, University of Asia Pacific, Green Road, Dhaka-1215 (Bangladesh); Kadri, Nahrizul Adib [Department of Biomedical Engineering, Faculty of Engineering, University Malaya, 50603 Kuala Lumpur (Malaysia)

    2016-02-01

    Carbon nanotubes (CNTs) are potentially ideal tips for atomic force microscopy (AFM) due to their robust mechanical properties, nanoscale diameter and ability to be functionalized with chemical and biological components at the tip ends. This contribution develops the idea of using CNTs as an AFM tip in computational analysis of biological cells. The software used was ABAQUS 6.13 CAE/CEL, provided by Dassault Systems, a powerful finite element (FE) tool to perform the numerical analysis and visualize the interactions between the proposed tip and the cell membrane. Finite element analysis was employed for each section, and the displacement of the nodes located in the contact area was monitored using an output database (ODB). A Mooney–Rivlin hyperelastic model of the cell allows the simulation to yield a new method for estimating the stiffness and spring constant of the cell. The stress–strain curve indicates the yield stress point, defined in terms of vertical stress and plane stress. The spring constant and local stiffness of the cell were measured, as well as the force applied by the CNT-AFM tip on the contact area of the cell. This reliable integration of the CNT-AFM tip process provides a new class of high-performance nanoprobes for single-cell biological analysis. - Graphical abstract: This contribution develops the idea of using CNTs as an AFM tip in computational analysis of biological cells.
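The spring-constant estimate described above reduces, in the linear regime, to the slope of the simulated force-displacement curve at the contact. The force and displacement values below are synthetic stand-ins for the FE output, not data from the study.

```python
import numpy as np

def spring_constant(force_nN, disp_nm):
    """Least-squares slope of the force-displacement curve from a
    contact simulation; in the linear regime the slope is the
    effective spring constant (nN and nm units are illustrative)."""
    slope, _intercept = np.polyfit(disp_nm, force_nN, 1)
    return slope

disp = np.array([0.0, 1.0, 2.0, 3.0, 4.0])            # tip indentation (nm)
force = np.array([0.0, 0.021, 0.039, 0.062, 0.080])   # synthetic reaction force (nN)
print(spring_constant(force, disp))
```

Monitoring node displacements in the contact area via the ODB, as the abstract describes, is what supplies the displacement axis of such a curve.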

  18. Computing the Local Field Potential (LFP) from Integrate-and-Fire Network Models

    Science.gov (United States)

    Cuntz, Hermann; Lansner, Anders; Panzeri, Stefano; Einevoll, Gaute T.

    2015-01-01

    Leaky integrate-and-fire (LIF) network models are commonly used to study how the spiking dynamics of neural networks changes with stimuli, tasks or dynamic network states. However, neurophysiological studies in vivo often rather measure the mass activity of neuronal microcircuits with the local field potential (LFP). Given that LFPs are generated by spatially separated currents across the neuronal membrane, they cannot be computed directly from quantities defined in models of point-like LIF neurons. Here, we explore the best approximation for predicting the LFP based on standard output from point-neuron LIF networks. To search for this best “LFP proxy”, we compared LFP predictions from candidate proxies based on LIF network output (e.g., firing rates, membrane potentials, synaptic currents) with “ground-truth” LFP obtained when the LIF network synaptic input currents were injected into an analogous three-dimensional (3D) network model of multi-compartmental neurons with realistic morphology, spatial distributions of somata and synapses. We found that a specific fixed linear combination of the LIF synaptic currents provided an accurate LFP proxy, accounting for most of the variance of the LFP time course observed in the 3D network for all recording locations. This proxy performed well over a broad set of conditions, including substantial variations of the neuronal morphologies. Our results provide a simple formula for estimating the time course of the LFP from LIF network simulations in cases where a single pyramidal population dominates the LFP generation, and thereby facilitate quantitative comparison between computational models and experimental LFP recordings in vivo. PMID:26657024
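A sketch of the kind of fixed linear combination of population synaptic currents that the study identifies as an LFP proxy: a delayed excitatory current minus a weighted inhibitory current. The delay and weighting used here are placeholders, not the coefficients fitted in the paper.

```python
import numpy as np

def lfp_proxy(i_ampa, i_gaba, dt_ms, delay_ms=6.0, alpha=1.65):
    """LFP proxy as a fixed linear combination of population synaptic
    currents: the AMPA current, delayed, minus alpha times the GABA
    current. delay_ms and alpha are illustrative placeholder values."""
    shift = int(round(delay_ms / dt_ms))
    ampa = np.roll(i_ampa, shift)
    ampa[:shift] = ampa[shift]         # pad the start instead of wrapping
    return ampa - alpha * i_gaba

t = np.arange(0.0, 100.0, 0.1)         # 100 ms of simulation at 0.1 ms steps
i_ampa = np.sin(2 * np.pi * t / 25)    # toy population AMPA current
i_gaba = 0.5 * np.sin(2 * np.pi * t / 25 + 0.5)   # toy population GABA current
lfp = lfp_proxy(i_ampa, i_gaba, dt_ms=0.1)
print(lfp.shape)
```

In practice the two current traces would be summed over the pyramidal population of the LIF simulation, which is the "standard output" the abstract refers to.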

  19. Computing the Local Field Potential (LFP from Integrate-and-Fire Network Models.

    Directory of Open Access Journals (Sweden)

    Alberto Mazzoni

    2015-12-01

    Full Text Available Leaky integrate-and-fire (LIF) network models are commonly used to study how the spiking dynamics of neural networks changes with stimuli, tasks or dynamic network states. However, neurophysiological studies in vivo often rather measure the mass activity of neuronal microcircuits with the local field potential (LFP). Given that LFPs are generated by spatially separated currents across the neuronal membrane, they cannot be computed directly from quantities defined in models of point-like LIF neurons. Here, we explore the best approximation for predicting the LFP based on standard output from point-neuron LIF networks. To search for this best "LFP proxy", we compared LFP predictions from candidate proxies based on LIF network output (e.g., firing rates, membrane potentials, synaptic currents) with "ground-truth" LFP obtained when the LIF network synaptic input currents were injected into an analogous three-dimensional (3D) network model of multi-compartmental neurons with realistic morphology, spatial distributions of somata and synapses. We found that a specific fixed linear combination of the LIF synaptic currents provided an accurate LFP proxy, accounting for most of the variance of the LFP time course observed in the 3D network for all recording locations. This proxy performed well over a broad set of conditions, including substantial variations of the neuronal morphologies. Our results provide a simple formula for estimating the time course of the LFP from LIF network simulations in cases where a single pyramidal population dominates the LFP generation, and thereby facilitate quantitative comparison between computational models and experimental LFP recordings in vivo.

  20. Effects of feedback in a computer-based learning environment on students’ learning outcomes: a meta-analysis

    NARCIS (Netherlands)

    van der Kleij, Fabienne; Feskens, Remco C.W.; Eggen, Theodorus Johannes Hendrikus Maria

    2015-01-01

    In this meta-analysis, we investigated the effects of methods for providing item-based feedback in a computer-based environment on students’ learning outcomes. From 40 studies, 70 effect sizes were computed, which ranged from −0.78 to 2.29. A mixed model was used for the data analysis. The results

  1. Calculation of local skin doses with ICRP adult mesh-type reference computational phantoms

    Science.gov (United States)

    Yeom, Yeon Soo; Han, Haegin; Choi, Chansoo; Nguyen, Thang Tat; Lee, Hanjin; Shin, Bangho; Kim, Chan Hyeong; Han, Min Cheol

    2018-01-01

    Recently, Task Group 103 of the International Commission on Radiological Protection (ICRP) developed new mesh-type reference computational phantoms (MRCPs) for adult males and females in order to address the limitations of the current voxel-type reference phantoms described in ICRP Publication 110 due to their limited voxel resolutions and the nature of the voxel geometry. One of the substantial advantages of the MRCPs over the ICRP-110 reference phantoms is the inclusion of a 50-μm-thick radiosensitive skin basal-cell layer; however, a methodology for calculating the local skin dose (LSD), i.e., the maximum dose to the basal layer averaged over a 1-cm² area, has yet to be developed. In the present study, a dedicated program for the LSD calculation with the MRCPs was developed based on the mean shift algorithm and the Geant4 Monte Carlo code. The developed program was used to calculate local skin dose coefficients (LSDCs) for electrons and alpha particles, which were then compared with the values given in ICRP Publication 116 that were produced with a simple tissue-equivalent cube model. The results of the present study show that the LSDCs of the MRCPs are generally in good agreement with the ICRP-116 values for alpha particles, but for electrons, significant differences are found at energies higher than 0.15 MeV. The LSDCs of the MRCPs are greater than the ICRP-116 values by as much as 2.7 times at 10 MeV, which is due mainly to the different curvature between the realistic MRCPs (i.e., curved) and the simple cube model (i.e., flat).
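    The mean-shift step used to locate the highest-dose 1-cm² region can be illustrated with a one-dimensional toy: a window repeatedly moves to the dose-weighted mean of the points it currently covers until it settles on a local maximum. The function name and parameters below are hypothetical, not taken from the paper's program.

    ```python
    def mean_shift_1d(points, weights, x0, bandwidth, iters=50):
        """Shift a window of width 2*bandwidth toward the weighted mean of
        the covered points until convergence (a 1-D mean-shift sketch)."""
        x = x0
        for _ in range(iters):
            num = den = 0.0
            for p, w in zip(points, weights):
                if abs(p - x) <= bandwidth:  # point inside the window
                    num += w * p
                    den += w
            if den == 0.0:
                break                        # empty window: stop
            new_x = num / den
            if abs(new_x - x) < 1e-9:
                break                        # converged
            x = new_x
        return x
    ```

    In the dose-averaging application, the weights would be per-element doses on the skin basal layer and the window the 1-cm² averaging area.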

  2. Computer Aided Theragnosis Using Quantitative Ultrasound Spectroscopy and Maximum Mean Discrepancy in Locally Advanced Breast Cancer.

    Science.gov (United States)

    Gangeh, Mehrdad J; Tadayyon, Hadi; Sannachi, Lakshmanan; Sadeghi-Naini, Ali; Tran, William T; Czarnota, Gregory J

    2016-03-01

    A noninvasive computer-aided-theragnosis (CAT) system was developed for the early therapeutic cancer response assessment in patients with locally advanced breast cancer (LABC) treated with neoadjuvant chemotherapy. The proposed CAT system was based on multi-parametric quantitative ultrasound (QUS) spectroscopic methods in conjunction with advanced machine learning techniques. Specifically, a kernel-based metric named maximum mean discrepancy (MMD), a technique for learning from imbalanced data based on random undersampling, and supervised learning were investigated with response-monitoring data from LABC patients. The CAT system was tested on 56 patients using statistical significance tests and leave-one-subject-out classification techniques. Textural features using state-of-the-art local binary patterns (LBP) and gray-scale intensity features were extracted from the spectral parametric maps in the proposed CAT system. The system indicated significant differences in changes between the responding and non-responding patient populations as well as high accuracy, sensitivity, and specificity in discriminating between the two patient groups early after the start of treatment, i.e., on weeks 1 and 4 of several months of treatment. The proposed CAT system achieved an accuracy of 85%, 87%, and 90% on weeks 1, 4 and 8, respectively. The sensitivity and specificity of the developed CAT system at the same time points were 85%, 95%, and 90%, and 85%, 85%, and 91%, respectively. The proposed CAT system thus establishes a noninvasive framework for monitoring cancer treatment response in tumors using clinical ultrasound imaging in conjunction with machine learning techniques. Such a framework can potentially facilitate the detection of refractory responses to treatment early in a course of therapy, enabling a switch to more efficacious treatments.
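    The MMD metric mentioned above compares two sample sets in a kernel-induced feature space. A minimal sketch follows, using the biased squared-MMD estimator with a Gaussian kernel on scalar features; the actual system operates on multi-parametric QUS feature vectors, so this is illustrative only.

    ```python
    import math

    def gaussian_k(x, y, sigma=1.0):
        """Gaussian (RBF) kernel between two scalar features."""
        return math.exp(-((x - y) ** 2) / (2.0 * sigma ** 2))

    def mmd2(xs, ys, sigma=1.0):
        """Biased estimate of squared maximum mean discrepancy:
        mean k(x,x') + mean k(y,y') - 2 * mean k(x,y)."""
        kxx = sum(gaussian_k(a, b, sigma) for a in xs for b in xs) / len(xs) ** 2
        kyy = sum(gaussian_k(a, b, sigma) for a in ys for b in ys) / len(ys) ** 2
        kxy = sum(gaussian_k(a, b, sigma) for a in xs for b in ys) / (len(xs) * len(ys))
        return kxx + kyy - 2.0 * kxy
    ```

    Identical samples give an MMD of zero; well-separated samples (e.g., responders vs. non-responders in feature space) give a large value.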

  3. GATE Monte Carlo simulation of dose distribution using MapReduce in a cloud computing environment.

    Science.gov (United States)

    Liu, Yangchuan; Tang, Yuguo; Gao, Xin

    2017-12-01

    The GATE Monte Carlo simulation platform has good application prospects for treatment planning and quality assurance. However, accurate dose calculation using GATE is time consuming. The purpose of this study is to implement a novel cloud computing method for accurate GATE Monte Carlo simulation of dose distribution using MapReduce. An Amazon Machine Image installed with Hadoop and GATE is created to set up Hadoop clusters on Amazon Elastic Compute Cloud (EC2). Macros, the input files for GATE, are split into a number of self-contained sub-macros. Through Hadoop Streaming, the sub-macros are executed by GATE in Map tasks and the sub-results are aggregated into final outputs in Reduce tasks. As an evaluation, GATE simulations were performed in a cubical water phantom for X-ray photons of 6 and 18 MeV. The parallel simulation on the cloud computing platform is as accurate as the single-threaded simulation on a local server. The cloud-based simulation time is approximately inversely proportional to the number of worker nodes. For the simulation of 10 million photons on a cluster with 64 worker nodes, time decreases of 41× and 32× were achieved compared to the single worker node case and the single-threaded case, respectively. The test of Hadoop's fault tolerance showed that the simulation correctness was not affected by the failure of some worker nodes. The results verify that the proposed method provides a feasible cloud computing solution for GATE.
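    The split/aggregate pattern can be mimicked without Hadoop or GATE: each Map task runs an independently seeded sub-simulation of its share of the photons, and the Reduce step sums the partial doses. All names below are hypothetical stand-ins, not GATE or Hadoop APIs.

    ```python
    import random

    def map_task(seed, n_photons):
        """Toy stand-in for one GATE sub-macro: deposit a pseudo-random
        dose per photon using an independently seeded RNG."""
        rng = random.Random(seed)
        return sum(rng.random() for _ in range(n_photons))  # dose (a.u.)

    def run_mapreduce(total_photons, n_workers):
        """Split the workload into self-contained chunks (Map), then
        aggregate the partial results (Reduce)."""
        chunk = total_photons // n_workers
        partial = [map_task(seed, chunk) for seed in range(n_workers)]  # Map
        return sum(partial)                                             # Reduce
    ```

    Because each chunk is self-contained and deterministically seeded, a failed chunk can simply be re-run, which is the essence of the fault tolerance noted above.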

  4. DiFX: A software correlator for very long baseline interferometry using multi-processor computing environments

    OpenAIRE

    Deller, A. T.; Tingay, S. J.; Bailes, M.; West, C.

    2007-01-01

    We describe the development of an FX style correlator for Very Long Baseline Interferometry (VLBI), implemented in software and intended to run in multi-processor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high performance computing, such as multi-processor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of software…

  5. Evaluating Students' Perceptions and Attitudes toward Computer-Mediated Project-Based Learning Environment: A Case Study

    Science.gov (United States)

    Seet, Ling Ying Britta; Quek, Choon Lang

    2010-01-01

    This research investigated 68 secondary school students' perceptions of their computer-mediated project-based learning environment and their attitudes towards Project Work (PW) using two instruments--Project Work Classroom Learning Environment Questionnaire (PWCLEQ) and Project Work Related Attitudes Instrument (PWRAI). In this project-based…

  6. The New Learning Ecology of One-to-One Computing Environments: Preparing Teachers for Shifting Dynamics and Relationships

    Science.gov (United States)

    Spires, Hiller A.; Oliver, Kevin; Corn, Jenifer

    2012-01-01

    Despite growing research and evaluation results on one-to-one computing environments, how these environments affect learning in schools remains underexamined. The purpose of this article is twofold: (a) to use a theoretical lens, namely a new learning ecology, to frame the dynamic changes as well as challenges that are introduced by a one-to-one…

  7. CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment

    Science.gov (United States)

    Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.

    2017-12-01

    Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters, or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances, or advanced AWS services such as Amazon Lambda and ECS (EC2 Container Services).

  8. G‐LoSA: An efficient computational tool for local structure‐centric biological studies and drug design

    Science.gov (United States)

    2016-01-01

    Abstract Molecular recognition by proteins mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and to support drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence-order-independent way and provides a GA-score, a chemical-feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA on local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. PMID:26813336

  9. A general-purpose development environment for intelligent computer-aided training systems

    Science.gov (United States)

    Savely, Robert T.

    1990-01-01

    Space station training will be a major task, requiring the creation of large numbers of simulation-based training systems for crew, flight controllers, and ground-based support personnel. Given the long duration of space station missions and the large number of activities supported by the space station, the extension of space shuttle training methods to space station training may prove to be impractical. The application of artificial intelligence technology to simulation training can provide the ability to deliver individualized training to large numbers of personnel in a distributed workstation environment. The principal objective of this project is the creation of a software development environment which can be used to build intelligent training systems for procedural tasks associated with the operation of the space station. Current NASA Johnson Space Center projects and joint projects with other NASA operational centers will result in specific training systems for existing space shuttle crew, ground support personnel, and flight controller tasks. Concurrently with the creation of these systems, a general-purpose development environment for intelligent computer-aided training systems will be built. Such an environment would permit the rapid production, delivery, and evolution of training systems for space station crew, flight controllers, and other support personnel. The widespread use of such systems will serve to preserve task and training expertise, support the training of many personnel in a distributed manner, and ensure the uniformity and verifiability of training experiences. As a result, significant reductions in training costs can be realized while safety and the probability of mission success can be enhanced.

  10. Local Ray-Based Traveltime Computation Using the Linearized Eikonal Equation

    KAUST Repository

    Almubarak, Mohammed S.

    2013-05-01

    The computation of traveltimes plays a critical role in the conventional implementations of Kirchhoff migration. Finite-difference-based methods are considered one of the most effective approaches for traveltime calculations and are therefore widely used. However, these eikonal solvers are mainly used to obtain early-arrival traveltime. Ray tracing can be used to pick later traveltime branches, besides the early arrivals, which may lead to an improvement in velocity estimation or in seismic imaging. In this thesis, I improved the accuracy of the solution of the linearized eikonal equation by constructing a linear system of equations (LSE) based on finite-difference approximation, which is of second-order accuracy. The ill-conditioned LSE is initially regularized and subsequently solved to calculate the traveltime update. Numerical tests proved that this method is as accurate as the second-order eikonal solver. Later arrivals are picked using ray tracing. These traveltimes are binned to the nearest node on a regular grid and empty nodes are estimated by interpolating the known values. The resulting traveltime field is used as an input to the linearized eikonal algorithm, which improves the accuracy of the interpolated nodes and yields a local ray-based traveltime. This is a preliminary study and further investigation is required to test the efficiency and the convergence of the solutions.
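    The regularize-then-solve step described above can be illustrated on a toy 2×2 normal system: Tikhonov regularization adds `lam` to the diagonal of AᵀA before inversion, stabilizing an ill-conditioned linear system of equations. This is a generic sketch, not the thesis's actual finite-difference discretization.

    ```python
    def solve_tikhonov_2x2(A, b, lam=1e-3):
        """Solve (A^T A + lam*I) x = A^T b for a 2x2 system via the
        closed-form inverse; lam damps an ill-conditioned normal matrix."""
        # Normal matrix M = A^T A + lam * I (symmetric, so m21 == m12)
        m11 = A[0][0] ** 2 + A[1][0] ** 2 + lam
        m12 = A[0][0] * A[0][1] + A[1][0] * A[1][1]
        m22 = A[0][1] ** 2 + A[1][1] ** 2 + lam
        # Right-hand side r = A^T b
        r1 = A[0][0] * b[0] + A[1][0] * b[1]
        r2 = A[0][1] * b[0] + A[1][1] * b[1]
        det = m11 * m22 - m12 * m12
        return ((m22 * r1 - m12 * r2) / det, (m11 * r2 - m12 * r1) / det)
    ```

    With a well-conditioned A and a tiny `lam`, the regularized solution is indistinguishable from the exact one; as A approaches singularity, `lam` keeps the solve stable at the cost of a small bias.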

  11. Human response to local convective and radiant cooling in a warm environment

    DEFF Research Database (Denmark)

    Melikov, Arsen Krikor; Krejcirikova, Barbora; Kaczmarczyk, Jan

    2013-01-01

    The response of 24 human subjects to local convective cooling, radiant cooling, and combined radiant and convective cooling was studied at 28°C and 50% relative humidity. The local cooling devices used were (1) a tabletop cooling fan, (2) personalized ventilation providing a stream of clean air, (3...

  12. Environment

    International Nuclear Information System (INIS)

    McIntyre, A.D.; Turnbull, R.G.H.

    1992-01-01

    The development of the hydrocarbon resources of the North Sea has resulted in both offshore and onshore environmental repercussions, involving the existing physical attributes of the sea and seabed, the coastline and adjoining land. The social and economic repercussions of the industry were equally widespread. The dramatic and speedy impact of the exploration and exploitation of the northern North Sea resources in the early 1970s, on the physical resources of Scotland was quickly realised together with the concern that any environmental and social damage to the physical and social fabric should be kept to a minimum. To this end, a wide range of research and other activities by central and local government, and other interested agencies was undertaken to extend existing knowledge on the marine and terrestrial environments that might be affected by the oil and gas industry. The outcome of these activities is summarized in this paper. The topics covered include a survey of the marine ecosystems of the North Sea, the fishing industry, the impact of oil pollution on seabirds and fish stocks, the ecology of the Scottish coastline and the impact of the petroleum industry on a selection of particular sites. (author)

  13. Investigations of the local environment and macroscopic alignment behavior of novel polymerizeable lyotropic liquid crystals using nuclear magnetic resonance

    Science.gov (United States)

    Juang, Elizabeth

    In this dissertation, a variety of NMR techniques were used to explore the local environment of novel polymerizeable lyotropic liquid crystals (LLC). The LLC monomers examined in this study self-assemble in the presence of a small amount of water to form uniform, nanometer-scale tubes with aqueous interiors. The phase architecture is retained upon photopolymerization to yield the resulting nanoporous material. By dissolving reactive precursors into the aqueous phase, well-structured nanocomposite materials have also been formed. Proposed uses for these novel polymerizeable LLCs are as porous water filtration membranes, as heterogeneous organic catalysts, and as nanocomposite materials for load-bearing and optical applications. In order to better exploit these polymerizeable LLCs for materials development, the local environment must be examined. In addition, the macroscopic orientation of these materials remains an important step in their advancement. Various NMR studies were conducted on these novel LLCs. NMR T1 relaxation measurements were conducted to elucidate the local environment and dynamics of the 23Na counterions located inside the aqueous channels. 2H NMR line shape analyses were used to characterize the local structure and dynamics near the hydrophilic headgroup. 29Si NMR studies were performed on silica nanocomposites formed with these LLC structures. Finally, the macroscopic alignment behavior of these novel LLCs using shear and magnetic fields was examined.

  14. Population growth and the environment in Africa : local informal institutions, the missing link

    NARCIS (Netherlands)

    Mazzucato, V.; Niemeijer, D.

    2002-01-01

    Population and environment debates regarding Africa, whether Malthusian or Boserupian in nature, focus on population levels as the driving force behind the relationship between environment and society. This article argues, instead, that how people adjust to their rise in numbers is more important

  15. Multimodal Interaction in Ambient Intelligence Environments Using Speech, Localization and Robotics

    Science.gov (United States)

    Galatas, Georgios

    2013-01-01

    An Ambient Intelligence Environment is meant to sense and respond to the presence of people, using its embedded technology. In order to effectively sense the activities and intentions of its inhabitants, such an environment needs to utilize information captured from multiple sensors and modalities. By doing so, the interaction becomes more natural…

  16. Environment

    DEFF Research Database (Denmark)

    Valentini, Chiara

    2017-01-01

    The term environment refers to the internal and external context in which organizations operate. For some scholars, environment is defined as an arrangement of political, economic, social and cultural factors existing in a given context that have an impact on organizational processes and structures. For others, environment is a generic term describing a large variety of stakeholders and how these interact and act upon organizations. Organizations and their environment are mutually interdependent and organizational communications are highly affected by the environment. This entry examines the origin and development of organization-environment interdependence, the nature of the concept of environment and its relevance for communication scholarship and activities.

  17. Patient identity management for secondary use of biomedical research data in a distributed computing environment.

    Science.gov (United States)

    Nitzlnader, Michael; Schreier, Günter

    2014-01-01

    Dealing with data from different source domains is of increasing importance in today's large scale biomedical research endeavours. Within the European Network for Cancer research in Children and Adolescents (ENCCA) a solution to share such data for secondary use will be established. In this paper the solution arising from the aims of the ENCCA project and regulatory requirements concerning data protection and privacy is presented. Since the details of secondary biomedical dataset utilisation are often not known in advance, data protection regulations are met with an identity management concept that facilitates context-specific pseudonymisation and a way of data aggregation using a hidden reference table later on. Phonetic hashing is proposed to prevent duplicated patient registration and re-identification of patients is possible via a trusted third party only. Finally, the solution architecture allows for implementation in a distributed computing environment, including cloud-based elements.
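    Phonetic hashing maps similar-sounding names to the same code, so duplicate registrations collide without storing the clear-text name. Soundex is one classic phonetic hash, shown here purely as an illustration; the paper does not specify which phonetic scheme the ENCCA solution uses.

    ```python
    def soundex(name):
        """Classic Soundex: keep the first letter, encode the rest as
        digits, collapse repeats (H/W do not break a repeat), drop
        vowels, and pad/truncate to four characters."""
        name = name.upper()
        codes = {**dict.fromkeys("BFPV", "1"), **dict.fromkeys("CGJKQSXZ", "2"),
                 **dict.fromkeys("DT", "3"), "L": "4",
                 **dict.fromkeys("MN", "5"), "R": "6"}
        first = name[0]
        digits = []
        prev = codes.get(first, "")
        for ch in name[1:]:
            if ch in "HW":
                continue                  # H/W do not reset the previous code
            code = codes.get(ch, "")      # vowels map to "" and reset prev
            if code and code != prev:
                digits.append(code)
            prev = code
        return (first + "".join(digits) + "000")[:4]
    ```

    Two registrations of "Robert" and "Rupert" would hash to the same code and be flagged as a potential duplicate before pseudonymisation.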

  18. ENVIRONMENT: a computational platform to stochastically simulate reacting and self-reproducing lipid compartments

    Science.gov (United States)

    Mavelli, Fabio; Ruiz-Mirazo, Kepa

    2010-09-01

    'ENVIRONMENT' is a computational platform that has been developed in the last few years with the aim to simulate stochastically the dynamics and stability of chemically reacting protocellular systems. Here we present and describe some of its main features, showing how the stochastic kinetics approach can be applied to study the time evolution of reaction networks in heterogeneous conditions, particularly when supramolecular lipid structures (micelles, vesicles, etc.) coexist with aqueous domains. These conditions are of special relevance to understand the origins of cellular, self-reproducing compartments, in the context of prebiotic chemistry and evolution. We contrast our simulation results with real lab experiments, with the aim to bring together theoretical and experimental research on protocell and minimal artificial cell systems.
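    The stochastic kinetics approach referred to above is typically realized with Gillespie's stochastic simulation algorithm (SSA). A minimal sketch for a toy two-reaction network A → B → ∅ follows; the platform's actual reaction networks and lipid compartments are far richer, so this shows only the core propensity/sampling loop.

    ```python
    import random

    def gillespie(n_a=20, n_b=0, k1=1.0, k2=0.5, seed=42):
        """Gillespie SSA for A -> B (rate k1*nA) and B -> 0 (rate k2*nB).
        Returns the trajectory as a list of (time, nA, nB) tuples."""
        rng = random.Random(seed)
        t = 0.0
        trajectory = [(t, n_a, n_b)]
        while n_a + n_b > 0:
            a1, a2 = k1 * n_a, k2 * n_b      # reaction propensities
            a_tot = a1 + a2
            t += rng.expovariate(a_tot)      # exponential waiting time
            if rng.random() * a_tot < a1:    # pick reaction by propensity
                n_a -= 1; n_b += 1           # fire A -> B
            else:
                n_b -= 1                     # fire B -> 0
            trajectory.append((t, n_a, n_b))
        return trajectory
    ```

    Each iteration samples when the next reaction fires and which one it is, so the trajectory is one exact realization of the chemical master equation for this toy network.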

  19. Event heap: a coordination infrastructure for dynamic heterogeneous application interactions in ubiquitous computing environments

    Science.gov (United States)

    Johanson, Bradley E.; Fox, Armando; Winograd, Terry A.; Hanrahan, Patrick M.

    2010-04-20

    An efficient and adaptive middleware infrastructure called the Event Heap system dynamically coordinates application interactions and communications in a ubiquitous computing environment, e.g., an interactive workspace, having heterogeneous software applications running on various machines and devices across different platforms. Applications exchange events via the Event Heap. Each event is characterized by a set of unordered, named fields. Events are routed by matching certain attributes in the fields. The source and target versions of each field are automatically set when an event is posted or used as a template. The Event Heap system implements a unique combination of features, both intrinsic to tuplespaces and specific to the Event Heap, including content based addressing, support for routing patterns, standard routing fields, limited data persistence, query persistence/registration, transparent communication, self-description, flexible typing, logical/physical centralization, portable client API, at most once per source first-in-first-out ordering, and modular restartability.
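    The core posting/matching behavior described above can be sketched as a minimal in-memory tuplespace: events are dicts of named fields, and a template matches an event when every template field is equal. This is an illustrative sketch, not the actual Event Heap API.

    ```python
    class EventHeap:
        """Minimal tuplespace-style event board. Events are dicts of
        unordered named fields; routing is by matching field values."""

        def __init__(self):
            self._events = []  # kept in posting order

        def post(self, **fields):
            """Post an event described by its named fields."""
            self._events.append(fields)

        def take(self, **template):
            """Return and remove the oldest event whose fields match
            every field of the template, or None if nothing matches."""
            for i, ev in enumerate(self._events):
                if all(ev.get(k) == v for k, v in template.items()):
                    return self._events.pop(i)
            return None
    ```

    Returning the oldest match approximates the at-most-once, first-in-first-out per-source ordering mentioned in the abstract.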

  20. Virtual environment and computer-aided technologies used for system prototyping and requirements development

    Science.gov (United States)

    Logan, Cory; Maida, James; Goldsby, Michael; Clark, Jim; Wu, Liew; Prenger, Henk

    1993-01-01

    The Space Station Freedom (SSF) Data Management System (DMS) consists of distributed hardware and software which monitor and control the many onboard systems. Virtual environment and off-the-shelf computer technologies can be used at critical points in project development to aid in objectives and requirements development. Geometric models (images) coupled with off-the-shelf hardware and software technologies were used in the Space Station Mockup and Trainer Facility (SSMTF) Crew Operational Assessment Project. Rapid prototyping is shown to be a valuable tool for developing operational procedures and system hardware and software requirements. The project objectives, hardware and software technologies used, data gained, current activities, and future development and training objectives are discussed. The importance of defining prototyping objectives and staying focused while maintaining schedules is discussed, along with project pitfalls.