WorldWideScience

Sample records for self-adjusting computation enables

  1. Micro computed tomography evaluation of the Self-adjusting file and ProTaper Universal system on curved mandibular molars.

    Science.gov (United States)

    Serefoglu, Burcu; Piskin, Beyser

    2017-09-26

    The aim of this investigation was to compare the cleaning and shaping efficiency of the Self-Adjusting File (SAF) and ProTaper, and to assess the correlation between root canal curvature and working time in mandibular molars using micro-computed tomography. Twenty extracted mandibular molars were instrumented with ProTaper and the Self-Adjusting File, and the total working time was measured in the mesial canals. The changes in canal volume, surface area and structure model index, transportation, uninstrumented area, and the correlation between working time and curvature were analyzed. Although no statistically significant difference was observed between the two systems in distal canals (p>0.05), a significantly greater volume of removed dentin and a lower uninstrumented area were achieved by ProTaper in mesial canals (p<0.0001). A correlation between working time and canal curvature was also observed in the mesial canals for both groups (SAF r² = 0.792, p < 0.0004; PTU r² = 0.9098, p < 0.0001).
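
    A minimal sketch of the correlation statistic reported above: given per-canal working times and curvature values (the numbers below are hypothetical placeholders, not the study's data), the r-squared value can be reproduced with NumPy.

      import numpy as np

      curvature_deg = np.array([12.0, 18.5, 22.0, 27.5, 31.0, 35.5])   # hypothetical canal curvatures
      working_time_s = np.array([210, 240, 265, 300, 330, 365])        # hypothetical SAF working times

      r = np.corrcoef(curvature_deg, working_time_s)[0, 1]
      print(f"Pearson r = {r:.3f}, r^2 = {r**2:.3f}")   # the record reports r^2 = 0.792 for the SAF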

  2. Computer Security Systems Enable Access.

    Science.gov (United States)

    Riggen, Gary

    1989-01-01

    A good security system enables access and protects information from damage or tampering, but the most important aspects of a security system aren't technical. A security procedures manual addresses the human element of computer security. (MLW)

  3. Canal transportation and centering ability of protaper and self-adjusting file system in long oval canals: An ex-vivo cone-beam computed tomography analysis.

    Science.gov (United States)

    Shah, Dipali Yogesh; Wadekar, Swati Ishwara; Dadpe, Ashwini Manish; Jadhav, Ganesh Ranganath; Choudhary, Lalit Jayant; Kalra, Dheeraj Deepak

    2017-01-01

    The purpose of this study was to compare and evaluate the shaping ability of the ProTaper (PT) and Self-Adjusting File (SAF) systems using cone-beam computed tomography (CBCT) to assess their performance in oval-shaped root canals. Sixty-two mandibular premolars with single oval canals were divided into two experimental groups (n = 31) according to the system used: Group I - PT and Group II - SAF. Canals were evaluated before and after instrumentation using CBCT to assess centering ratio and canal transportation at three levels. Data were statistically analyzed using one-way analysis of variance, post hoc Tukey's test, and t-test. The SAF showed better centering ability and less canal transportation than the PT only in the buccolingual plane at the 6 and 9 mm levels. The shaping ability of the PT was best in the apical third in both planes. The SAF had statistically significantly better centering and less canal transportation in the buccolingual plane as compared to the mesiodistal plane at the middle and coronal levels. The SAF produced significantly less transportation and remained more centered than the PT at the middle and coronal levels in the buccolingual plane of oval canals. In the mesiodistal plane, the performance of the two systems was comparable.
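
    The centering ratio and canal transportation measures used above can be computed from paired pre-/post-instrumentation CBCT distance measurements. The sketch below follows one commonly used (Gambill-style) formulation; the variable names and example values are illustrative assumptions, not the study's protocol.

      def transportation(a1, a2, b1, b2):
          """a1/b1: pre-op distance from canal wall to the mesial/distal root surface;
          a2/b2: the same distances after instrumentation (all in mm)."""
          return (a1 - a2) - (b1 - b2)

      def centering_ratio(a1, a2, b1, b2):
          """Ratio of the smaller to the larger wall change; 1.0 = perfectly centred preparation."""
          da, db = a1 - a2, b1 - b2
          if max(da, db) == 0:
              return 1.0  # no detectable movement in either direction
          return min(da, db) / max(da, db)

      # Example at the 6 mm level (hypothetical numbers, in mm)
      print(transportation(1.20, 1.05, 0.95, 0.88))   # 0.08 mm of transportation toward the mesial wall
      print(centering_ratio(1.20, 1.05, 0.95, 0.88))  # ~0.47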

  4. Evaluation of the Self-Adjusting File system (SAF) for the instrumentation of primary molar root canals: a micro-computed tomographic study.

    Science.gov (United States)

    Kaya, E; Elbay, M; Yiğit, D

    2017-06-01

    The Self-Adjusting File (SAF) system has been recommended for use in permanent teeth since it offers more conservative and effective root-canal preparation when compared to traditional rotary systems. However, no study had evaluated the use of the SAF in primary teeth. The aim of this study was to evaluate and compare the use of the SAF, K-file (manual instrumentation) and Profile (traditional rotary instrumentation) systems for primary-tooth root-canal preparation in terms of instrumentation time and amount of dentin removed, using micro-computed tomography (μCT) technology. Study Design: The study was conducted on 60 human primary mandibular second molar teeth divided into 3 groups according to instrumentation technique: Group I: SAF (n=20); Group II: K-file (n=20); Group III: Profile (n=20). Teeth were embedded in acrylic blocks and scanned with a μCT scanner prior to instrumentation. All distal root canals were prepared up to size 30 for the K-file, .04/30 for Profile, and 2 mm thickness, size 25 for the SAF; instrumentation time was recorded for each tooth, and a second μCT scan was performed after instrumentation was complete. The amount of dentin removed was measured on the three-dimensional images by calculating the difference in root-canal volume before and after preparation. Data were statistically analysed using the Kolmogorov-Smirnov and Kruskal-Wallis tests. Manual instrumentation (K-file) resulted in significantly more dentin removal than rotary instrumentation (Profile and SAF), while the SAF system removed significantly less dentin than both manual instrumentation (K-file) and traditional rotary instrumentation (Profile) (p < 0.05). Within the experimental conditions of the present study, the SAF seems to be a useful system for root-canal instrumentation in primary molars because it removed less dentin than the other systems, which is especially important for the relatively thin-walled canals of primary teeth, and because it involves less…
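
    The dentin-removal measure described above reduces to a volume difference between the segmented canal in the pre- and post-instrumentation μCT scans. A minimal sketch, assuming co-registered binary canal masks and an arbitrary voxel size (the synthetic masks below are stand-ins, not real scan data):

      import numpy as np

      voxel_mm = 0.012                                   # e.g. 12 um isotropic voxels (assumed)
      rng = np.random.default_rng(0)
      pre  = rng.random((100, 100, 150)) < 0.010         # stand-in for the pre-op canal mask
      post = pre | (rng.random(pre.shape) < 0.004)       # post-op canal: original lumen plus removed dentin

      removed_mm3 = (post.sum() - pre.sum()) * voxel_mm ** 3
      print(f"dentin removed: {removed_mm3:.4f} mm^3")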

  5. Religiosity, Spirituality and Adolescents' Self-Adjustment

    Science.gov (United States)

    Japar, Muhammad; Purwati

    2014-01-01

    Religiosity, spirituality, and adolescents' self-adjustment. The objective of this study is to test the correlation among religiosity, spirituality and adolescents' self-adjustment. A quantitative approach was employed in this study. Data were collected from 476 junior high school students from 13 State Junior High Schools and one Junior High…

  6. Enabling opportunistic resources for CMS Computing Operations

    Energy Technology Data Exchange (ETDEWEB)

    Hufnagel, Dick [Fermilab

    2015-11-19

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources — resources not owned by, or a priori configured for CMS — to meet peak demands. In addition to our dedicated resources we look to add computing resources from non CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and parrot wrappers are used to enable access and bring the CMS environment into these non CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  7. Evolution of Cloud Computing and Enabling Technologies

    OpenAIRE

    Rabi Prasad Padhy; Manas Ranjan Patra

    2012-01-01

    We present an overview of the history of forecasting software over the past 25 years, concentrating especially on the interaction between computing and enabling technologies, from mainframe computing to cloud computing, the most recent of these models. To convey the vision behind the various computing models, this paper briefly explains the architecture, characteristics, advantages, applications and issues of computing models such as PC computing and internet computing, and the related technologie...

  8. Enabling Earth Science Through Cloud Computing

    Science.gov (United States)

    Hardman, Sean; Riofrio, Andres; Shams, Khawaja; Freeborn, Dana; Springer, Paul; Chafin, Brian

    2012-01-01

    Cloud Computing holds tremendous potential for missions across the National Aeronautics and Space Administration. Several flight missions are already benefiting from an investment in cloud computing for mission critical pipelines and services through faster processing time, higher availability, and drastically lower costs available on cloud systems. However, these processes do not currently extend to general scientific algorithms relevant to earth science missions. The members of the Airborne Cloud Computing Environment task at the Jet Propulsion Laboratory have worked closely with the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to integrate cloud computing into their science data processing pipeline. This paper details the efforts involved in deploying a science data system for the CARVE mission, evaluating and integrating cloud computing solutions with the system and porting their science algorithms for execution in a cloud environment.

  9. Autonomic computing enabled cooperative networked design

    CERN Document Server

    Wodczak, Michal

    2014-01-01

    This book introduces the concept of autonomic computing driven cooperative networked system design from an architectural perspective. As such it leverages and capitalises on the relevant advancements in both the realms of autonomic computing and networking by welding them closely together. In particular, a multi-faceted Autonomic Cooperative System Architectural Model is defined which incorporates the notion of Autonomic Cooperative Behaviour being orchestrated by the Autonomic Cooperative Networking Protocol of a cross-layer nature. The overall proposed solution not only advocates for the inc

  10. Speech-enabled Computer-aided Translation

    DEFF Research Database (Denmark)

    Mesa-Lao, Bartolomé

    2014-01-01

    The present study has surveyed post-editor trainees’ views and attitudes before and after the introduction of speech technology as a front end to a computer-aided translation workbench. The aim of the survey was (i) to identify attitudes and perceptions among post-editor trainees before performing...... a post-editing task using automatic speech recognition (ASR); and (ii) to assess the degree to which post-editors’ attitudes and expectations to the use of speech technology changed after actually using it. The survey was based on two questionnaires: the first one administered before the participants...

  11. Cusps enable line attractors for neural computation

    International Nuclear Information System (INIS)

    Xiao, Zhuocheng; Zhang, Jiwei; Sornborger, Andrew T.; Tao, Louis

    2017-01-01

    Here, line attractors in neuronal networks have been suggested to be the basis of many brain functions, such as working memory, oculomotor control, head movement, locomotion, and sensory processing. In this paper, we make the connection between line attractors and pulse gating in feed-forward neuronal networks. In this context, because of their neutral stability along a one-dimensional manifold, line attractors are associated with a time-translational invariance that allows graded information to be propagated from one neuronal population to the next. To understand how pulse-gating manifests itself in a high-dimensional, nonlinear, feedforward integrate-and-fire network, we use a Fokker-Planck approach to analyze system dynamics. We make a connection between pulse-gated propagation in the Fokker-Planck and population-averaged mean-field (firing rate) models, and then identify an approximate line attractor in state space as the essential structure underlying graded information propagation. An analysis of the line attractor shows that it consists of three fixed points: a central saddle with an unstable manifold along the line and stable manifolds orthogonal to the line, which is surrounded on either side by stable fixed points. Along the manifold defined by the fixed points, slow dynamics give rise to a ghost. We show that this line attractor arises at a cusp catastrophe, where a fold bifurcation develops as a function of synaptic noise; and that the ghost dynamics near the fold of the cusp underlie the robustness of the line attractor. Understanding the dynamical aspects of this cusp catastrophe allows us to show how line attractors can persist in biologically realistic neuronal networks and how the interplay of pulse gating, synaptic coupling, and neuronal stochasticity can be used to enable attracting one-dimensional manifolds and, thus, dynamically control the processing of graded information.
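
    The "ghost" slow-down near the fold of the cusp can be illustrated with the saddle-node normal form dx/dt = mu + x^2: just past the fold (mu slightly above zero) the fixed points have vanished, yet trajectories linger near x = 0 for a time that scales like pi/sqrt(mu). The toy sketch below shows only that scaling; it is not the paper's Fokker-Planck network model.

      import numpy as np

      def passage_time(mu, x0=-2.0, x_end=2.0, dt=1e-4):
          """Time for dx/dt = mu + x**2 to run from x0 to x_end (forward Euler)."""
          x, t = x0, 0.0
          while x < x_end:
              x += (mu + x * x) * dt
              t += dt
          return t

      for mu in (0.1, 0.01, 0.001):
          # theory: the bottleneck time grows like pi / sqrt(mu) as the fold is approached
          print(f"mu={mu:5}: simulated {passage_time(mu):7.2f}, predicted ~{np.pi/np.sqrt(mu):7.2f}")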

  12. Cusps enable line attractors for neural computation

    Science.gov (United States)

    Xiao, Zhuocheng; Zhang, Jiwei; Sornborger, Andrew T.; Tao, Louis

    2017-11-01

    Line attractors in neuronal networks have been suggested to be the basis of many brain functions, such as working memory, oculomotor control, head movement, locomotion, and sensory processing. In this paper, we make the connection between line attractors and pulse gating in feed-forward neuronal networks. In this context, because of their neutral stability along a one-dimensional manifold, line attractors are associated with a time-translational invariance that allows graded information to be propagated from one neuronal population to the next. To understand how pulse-gating manifests itself in a high-dimensional, nonlinear, feedforward integrate-and-fire network, we use a Fokker-Planck approach to analyze system dynamics. We make a connection between pulse-gated propagation in the Fokker-Planck and population-averaged mean-field (firing rate) models, and then identify an approximate line attractor in state space as the essential structure underlying graded information propagation. An analysis of the line attractor shows that it consists of three fixed points: a central saddle with an unstable manifold along the line and stable manifolds orthogonal to the line, which is surrounded on either side by stable fixed points. Along the manifold defined by the fixed points, slow dynamics give rise to a ghost. We show that this line attractor arises at a cusp catastrophe, where a fold bifurcation develops as a function of synaptic noise; and that the ghost dynamics near the fold of the cusp underlie the robustness of the line attractor. Understanding the dynamical aspects of this cusp catastrophe allows us to show how line attractors can persist in biologically realistic neuronal networks and how the interplay of pulse gating, synaptic coupling, and neuronal stochasticity can be used to enable attracting one-dimensional manifolds and, thus, dynamically control the processing of graded information.

  13. Workflow Support for Advanced Grid-Enabled Computing

    OpenAIRE

    Xu, Fenglian; Eres, M.H.; Tao, Feng; Cox, Simon J.

    2004-01-01

    The Geodise project brings computer scientists' and engineers' skills together to build a service-oriented computing environment for engineers to perform complicated computations in a distributed system. The workflow tool is a front-end GUI that provides a full life cycle of workflow functions for Grid-enabled computing. The full life cycle of workflow functions has been enhanced based on our initial research and development. The life cycle starts with the composition of a workflow, followed by an ins...

  14. DNA-Enabled Integrated Molecular Systems for Computation and Sensing

    Science.gov (United States)

    2014-05-21

    Computational devices can be chemically conjugated to different strands of DNA that are then self-assembled according to strict Watson–Crick binding rules... The guided folding of DNA, inspired by nature, allows designs to manipulate molecular-scale processes unlike any other material system. Thus, DNA can be...

  15. Enabling high performance computational science through combinatorial algorithms

    International Nuclear Information System (INIS)

    Boman, Erik G; Bozdag, Doruk; Catalyurek, Umit V; Devine, Karen D; Gebremedhin, Assefaw H; Hovland, Paul D; Pothen, Alex; Strout, Michelle Mills

    2007-01-01

    The Combinatorial Scientific Computing and Petascale Simulations (CSCAPES) Institute is developing algorithms and software for combinatorial problems that play an enabling role in scientific and engineering computations. Discrete algorithms will be increasingly critical for achieving high performance for irregular problems on petascale architectures. This paper describes recent contributions by researchers at the CSCAPES Institute in the areas of load balancing, parallel graph coloring, performance improvement, and parallel automatic differentiation
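
    As one concrete example of the combinatorial kernels mentioned above, a greedy (first-fit) graph colouring assigns "colours" so that no two adjacent vertices share one, which is how independent sets of tasks or matrix entries can be identified for concurrent processing. This is an illustrative sketch, not CSCAPES code.

      def greedy_coloring(adj):
          """adj: dict mapping node -> iterable of neighbours. Returns node -> colour."""
          colour = {}
          for node in adj:                                # visit order affects quality, not correctness
              used = {colour[n] for n in adj[node] if n in colour}
              c = 0
              while c in used:
                  c += 1
              colour[node] = c
          return colour

      graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
      print(greedy_coloring(graph))   # {0: 0, 1: 1, 2: 2, 3: 0}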

  16. Enabling high performance computational science through combinatorial algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Boman, Erik G [Discrete Algorithms and Math Department, Sandia National Laboratories (United States); Bozdag, Doruk [Biomedical Informatics, and Electrical and Computer Engineering, Ohio State University (United States); Catalyurek, Umit V [Biomedical Informatics, and Electrical and Computer Engineering, Ohio State University (United States); Devine, Karen D [Discrete Algorithms and Math Department, Sandia National Laboratories (United States); Gebremedhin, Assefaw H [Computer Science and Center for Computational Science, Old Dominion University (United States); Hovland, Paul D [Mathematics and Computer Science Division, Argonne National Laboratory (United States); Pothen, Alex [Computer Science and Center for Computational Science, Old Dominion University (United States); Strout, Michelle Mills [Computer Science, Colorado State University (United States)

    2007-07-15

    The Combinatorial Scientific Computing and Petascale Simulations (CSCAPES) Institute is developing algorithms and software for combinatorial problems that play an enabling role in scientific and engineering computations. Discrete algorithms will be increasingly critical for achieving high performance for irregular problems on petascale architectures. This paper describes recent contributions by researchers at the CSCAPES Institute in the areas of load balancing, parallel graph coloring, performance improvement, and parallel automatic differentiation.

  17. Combinatorial algorithms enabling computational science: tales from the front

    International Nuclear Information System (INIS)

    Bhowmick, Sanjukta; Boman, Erik G; Devine, Karen; Gebremedhin, Assefaw; Hendrickson, Bruce; Hovland, Paul; Munson, Todd; Pothen, Alex

    2006-01-01

    Combinatorial algorithms have long played a crucial enabling role in scientific and engineering computations. The importance of discrete algorithms continues to grow with the demands of new applications and advanced architectures. This paper surveys some recent developments in this rapidly changing and highly interdisciplinary field

  18. Combinatorial algorithms enabling computational science: tales from the front

    Energy Technology Data Exchange (ETDEWEB)

    Bhowmick, Sanjukta [Mathematics and Computer Science Division, Argonne National Laboratory (United States); Boman, Erik G [Discrete Algorithms and Math Department, Sandia National Laboratories (United States); Devine, Karen [Discrete Algorithms and Math Department, Sandia National Laboratories (United States); Gebremedhin, Assefaw [Computer Science Department, Old Dominion University (United States); Hendrickson, Bruce [Discrete Algorithms and Math Department, Sandia National Laboratories (United States); Hovland, Paul [Mathematics and Computer Science Division, Argonne National Laboratory (United States); Munson, Todd [Mathematics and Computer Science Division, Argonne National Laboratory (United States); Pothen, Alex [Computer Science Department, Old Dominion University (United States)

    2006-09-15

    Combinatorial algorithms have long played a crucial enabling role in scientific and engineering computations. The importance of discrete algorithms continues to grow with the demands of new applications and advanced architectures. This paper surveys some recent developments in this rapidly changing and highly interdisciplinary field.

  19. Working Towards New Transformative Geoscience Analytics Enabled by Petascale Computing

    Science.gov (United States)

    Woodcock, R.; Wyborn, L.

    2012-04-01

    Currently the top 10 supercomputers in the world are petascale, and exascale computers are already being planned. Cloud computing facilities are becoming mainstream either as private or commercial investments. These computational developments will provide abundant opportunities for the earth science community to tackle the data deluge which has resulted from new instrumentation enabling data to be gathered at a greater rate and at higher resolution. Combined, the new computational environments should enable the earth sciences to be transformed. However, experience in Australia and elsewhere has shown that it is not easy to scale existing earth science methods, software and analytics to take advantage of the increased computational capacity that is now available. It is not simply a matter of 'transferring' current work practices to the new facilities: they have to be extensively 'transformed'. In particular, new geoscientific methods will need to be developed using advanced data mining, assimilation, machine learning and integration algorithms. Software will have to be capable of operating in highly parallelised environments, and will also need to be able to scale as the compute systems grow. Data access will have to improve, and the earth science community needs to move from the paradigm of file discovery, display and local download to self-describing data cubes and data arrays that are available as online resources from either major data repositories or in the cloud. In the new transformed world, rather than analysing satellite data scene by scene, sensor-agnostic data cubes of calibrated earth observation data will enable researchers to move across data from multiple sensors at varying spatial data resolutions. In using geophysics to characterise basement and cover, rather than analysing individual gridded airborne geophysical data sets and then combining the results, petascale computing will enable analysis of multiple data types, collected at varying

  20. Grid computing : enabling a vision for collaborative research

    International Nuclear Information System (INIS)

    von Laszewski, G.

    2002-01-01

    In this paper the authors provide a motivation for Grid computing based on a vision to enable a collaborative research environment. The authors' vision goes beyond the connection of hardware resources. They argue that with an infrastructure such as the Grid, new modalities for collaborative research are enabled. They provide an overview showing why Grid research is difficult, and they present a number of management-related issues that must be addressed to make Grids a reality. They list projects that provide solutions to subsets of these issues.

  1. A data management system to enable urgent natural disaster computing

    Science.gov (United States)

    Leong, Siew Hoon; Kranzlmüller, Dieter; Frank, Anton

    2014-05-01

    Civil protection, in particular natural disaster management, is very important to most nations and civilians in the world. When disasters like flash floods, earthquakes and tsunamis are expected or have taken place, it is of utmost importance to make timely decisions for managing the affected areas and reducing casualties. Computer simulations can generate information and provide predictions to facilitate this decision making process. Getting the data to the required resources is a critical requirement to enable the timely computation of the predictions. An urgent data management system to support natural disaster computing is thus necessary to effectively carry out data activities within a stipulated deadline. Since the trigger of a natural disaster is usually unpredictable, it is not always possible to prepare required resources well in advance. As such, an urgent data management system for natural disaster computing has to be able to work with any type of resources. Additional requirements include the need to manage deadlines and huge volumes of data, fault tolerance, reliability, flexibility to change, ease of use, etc. The proposed data management platform includes a service manager to provide a uniform and extensible interface for the supported data protocols, a configuration manager to check and retrieve configurations of available resources, a scheduler manager to ensure that the deadlines can be met, a fault tolerance manager to increase the reliability of the platform and a data manager to initiate and perform the data activities. These managers will enable the selection of the most appropriate resource, transfer protocol, etc. such that the hard deadline of an urgent computation can be met for a particular urgent activity, e.g. data staging or computation. We associate two types of deadlines [2] with an urgent computing system. Soft-hard deadline: Missing a soft-firm deadline will render the computation less useful resulting in a cost that can have severe
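
    A hedged sketch of the deadline-driven selection idea described above: given the size of an urgent data-staging task and its deadline, the scheduler picks a resource whose pessimistic transfer estimate still fits. The resource names, bandwidths and safety margin are invented for illustration.

      def pick_resource(data_gb, deadline_s, resources, safety=1.2):
          """resources: list of (name, effective_bandwidth_GB_per_s, setup_s)."""
          feasible = []
          for name, bw, setup in resources:
              est = setup + safety * data_gb / bw      # pessimistic transfer-time estimate
              if est <= deadline_s:
                  feasible.append((est, name))
          return min(feasible)[1] if feasible else None  # fastest feasible resource, or give up

      resources = [("cluster_A", 0.8, 30), ("cloud_B", 0.2, 5), ("archive_C", 0.05, 60)]
      print(pick_resource(data_gb=500, deadline_s=1200, resources=resources))   # -> 'cluster_A'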

  2. Global tree network for computing structures enabling global processing operations

    Science.gov (United States)

    Blumrich, Matthias A.; Chen, Dong; Coteus, Paul W.; Gara, Alan G.; Giampapa, Mark E.; Heidelberger, Philip; Hoenicke, Dirk; Steinmacher-Burow, Burkhard D.; Takken, Todd E.; Vranas, Pavlos M.

    2010-01-19

    A system and method for enabling high-speed, low-latency global tree network communications among processing nodes interconnected according to a tree network structure. The global tree network enables collective reduction operations to be performed during parallel algorithm operations executing in a computer structure having a plurality of the interconnected processing nodes. Router devices are included that interconnect the nodes of the tree via links to facilitate performance of low-latency global processing operations at nodes of the virtual tree and sub-tree structures. The global operations performed include one or more of: broadcast operations downstream from a root node to leaf nodes of a virtual tree, reduction operations upstream from leaf nodes to the root node in the virtual tree, and point-to-point message passing from any node to the root node. The global tree network is configurable to provide global barrier and interrupt functionality in asynchronous or synchronized manner, and, is physically and logically partitionable.
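
    A toy software analogue of the collective operations described above: values held at the nodes of a tree are reduced upward to the root and the result is broadcast back down. The patented hardware performs this in the network fabric; this sketch only illustrates the data flow.

      import operator
      from functools import reduce

      class Node:
          def __init__(self, value=0, children=()):
              self.value, self.children = value, list(children)

          def reduce_up(self, op=operator.add):
              """Combine this node's value with the reduced values of its subtree."""
              return reduce(op, (c.reduce_up(op) for c in self.children), self.value)

          def broadcast_down(self, result):
              self.value = result
              for c in self.children:
                  c.broadcast_down(result)

      leaves = [Node(v) for v in (3, 1, 4, 1, 5, 9)]
      root = Node(0, [Node(0, leaves[:3]), Node(0, leaves[3:])])
      total = root.reduce_up()          # global sum gathered at the root
      root.broadcast_down(total)        # result pushed back to every node
      print(total, leaves[0].value)     # 23 23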

  3. Self-adjustable glasses in the developing world

    Directory of Open Access Journals (Sweden)

    Murthy Gudlavalleti VS

    2014-02-01

    Full Text Available Venkata S Murthy Gudlavalleti,1 Komal Preet Allagh,1 Aashrai SV Gudlavalleti2 1Indian Institute of Public Health, Public Health Foundation of India, Hyderabad, 2Centre for Chronic Disease Control, Public Health Foundation of India, New Delhi, India Abstract: Uncorrected refractive errors are the single largest cause of visual impairment globally. Refractive errors are an avoidable cause of visual impairment that is easily correctable. Provision of spectacles is a cost-effective measure. Unfortunately, this simple solution becomes a public health challenge in low- and middle-income countries because of the paucity of human resources for refraction and optical services, lack of access to refraction services in rural areas, and the cost of spectacles. Low-cost approaches to provide affordable glasses in developing countries are critical. A number of approaches have been tried to surmount the challenge, including ready-made spectacles and the use of focometers and self-adjustable glasses, among other modalities. Recently, self-adjustable spectacles have been validated in studies in both children and adults in developed and developing countries. A high degree of agreement between self-adjustable spectacles and cycloplegic subjective refraction has been reported. Self-refraction has also been found to be less prone to accommodative inaccuracy compared with non-cycloplegic autorefraction. The benefits of self-adjusted spectacles include: the potential for correction of both distance and near vision, applicability for all ages, the empowerment of lay workers, the increased participation of clients, augmented awareness of the mechanism of refraction, reduced costs of optical and refraction units in low-resource settings, and a relative reduction in costs for refraction services. Concerns requiring attention include a need for the improved cosmetic appearance of the currently available self-adjustable spectacles, an increased range of correction (currently

  4. Multidirectional flexible force sensors based on confined, self-adjusting carbon nanotube arrays

    Science.gov (United States)

    Lee, J.-I.; Pyo, Soonjae; Kim, Min-Ook; Kim, Jongbaeg

    2018-02-01

    We demonstrate a highly sensitive force sensor based on self-adjusting carbon nanotube (CNT) arrays. Aligned CNT arrays are directly synthesized on silicon microstructures by a space-confined growth technique which enables a facile self-adjusting contact. To afford flexibility and softness, the patterned microstructures with the integrated CNTs are embedded in polydimethylsiloxane structures. The sensing mechanism is based on variations in the contact resistance between the facing CNT arrays under the applied force. By finite element analysis, proper dimensions and positions for each component are determined. Further, high sensitivities up to 15.05%/mN of the proposed sensors were confirmed experimentally. Multidirectional sensing capability could also be achieved by designing multiple sets of sensing elements in a single sensor. The sensors show long-term operational stability, owing to the unique properties of the constituent CNTs, such as outstanding mechanical durability and elasticity.
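
    Using the sensitivity figure quoted above (15.05%/mN), a relative change in contact resistance maps back to an applied force under a simple linear model. The baseline resistance and the linearity assumption below are illustrative, not the authors' calibration.

      SENSITIVITY = 15.05 / 100      # fractional resistance change per mN, as quoted in the abstract

      def force_mN(r_baseline_ohm, r_measured_ohm):
          """Estimate applied force from the relative change in contact resistance (assumed linear)."""
          delta = abs(r_measured_ohm - r_baseline_ohm) / r_baseline_ohm
          return delta / SENSITIVITY

      print(f"{force_mN(1200.0, 1000.0):.2f} mN")   # ~1.11 mN for a 16.7% resistance change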

  5. A self-adjusting expandable GPS collar for male elk

    Science.gov (United States)

    Brian L. Dick; Scott L. Findholt; Bruce K. Johnson

    2013-01-01

    It is a challenge to use collars on male cervids because their neck size can increase substantially during the rut and also because of growth as the animal matures. We describe how to build a self-adjusting expandable collar for yearling or adult male Rocky Mountain elk (Cervus elaphus) to which very high frequency transmitters and global...

  6. Review of Enabling Technologies to Facilitate Secure Compute Customization

    Energy Technology Data Exchange (ETDEWEB)

    Aderholdt, Ferrol [Tennessee Technological University; Caldwell, Blake A [ORNL; Hicks, Susan Elaine [ORNL; Koch, Scott M [ORNL; Naughton, III, Thomas J [ORNL; Pelfrey, Daniel S [ORNL; Pogge, James R [Tennessee Technological University; Scott, Stephen L [Tennessee Technological University; Shipman, Galen M [ORNL; Sorrillo, Lawrence [ORNL

    2014-12-01

    High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows to name just a few. These systems may process data for a variety of users, often requiring strong separation between job allocations. There are many challenges to establishing these secure enclaves within the shared infrastructure of high-performance computing (HPC) environments. The isolation mechanisms in the system software are the basic building blocks for enabling secure compute enclaves. There are a variety of approaches and the focus of this report is to review the different virtualization technologies that facilitate the creation of secure compute enclaves. The report reviews current operating system (OS) protection mechanisms and modern virtualization technologies to better understand the performance/isolation properties. We also examine the feasibility of running "virtualized" computing resources as non-privileged users, and providing controlled administrative permissions for standard users running within a virtualized context. Our examination includes technologies such as Linux containers (LXC [32], Docker [15]) and full virtualization (KVM [26], Xen [5]). We categorize these different approaches to virtualization into two broad groups: OS-level virtualization and system-level virtualization. The OS-level virtualization uses containers to allow a single OS kernel to be partitioned to create Virtual Environments (VE), e.g., LXC. The resources within the host's kernel are only virtualized in the sense of separate namespaces. In contrast, system-level virtualization uses hypervisors to manage multiple OS kernels and virtualize the physical resources (hardware) to create Virtual Machines (VM), e.g., Xen, KVM. This terminology of VE and VM, detailed in Section 2, is used throughout the report to distinguish between the two different approaches to providing virtualized execution

  7. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    Full Text Available KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. Most of these experiments adopt computing models consisting of different Tiers with several computing centres, providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and usually span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements, we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method by which KM3NeT users can utilize the EGI computing resources in a simulation-driven use case.

  8. Enabling High-Performance Computing as a Service

    KAUST Repository

    AbdelBaky, Moustafa; Parashar, Manish; Kim, Hyunjoo; Jordan, Kirk E.; Sachdeva, Vipin; Sexton, James; Jamjoom, Hani; Shae, Zon-Yin; Pencheva, Gergina; Tavakoli, Reza; Wheeler, Mary F.

    2012-01-01

    With the right software infrastructure, clouds can provide scientists with as a service access to high-performance computing resources. An award-winning prototype framework transforms the Blue Gene/P system into an elastic cloud to run a

  9. Enabling Computational Dynamics in Distributed Computing Environments Using a Heterogeneous Computing Template

    Science.gov (United States)

    2011-08-09

    heterogeneous computing concept advertised recently as the paradigm capable of delivering exascale flop rates by the end of the decade. In this framework...

  10. Enabling High-Performance Computing as a Service

    KAUST Repository

    AbdelBaky, Moustafa

    2012-10-01

    With the right software infrastructure, clouds can provide scientists with as a service access to high-performance computing resources. An award-winning prototype framework transforms the Blue Gene/P system into an elastic cloud to run a representative HPC application. © 2012 IEEE.

  11. The gputools package enables GPU computing in R.

    Science.gov (United States)

    Buckner, Joshua; Wilson, Justin; Seligman, Mark; Athey, Brian; Watson, Stanley; Meng, Fan

    2010-01-01

    By default, the R statistical environment does not make use of parallelism. Researchers may resort to expensive solutions such as cluster hardware for large analysis tasks. Graphics processing units (GPUs) provide an inexpensive and computationally powerful alternative. Using R and the CUDA toolkit from Nvidia, we have implemented several functions commonly used in microarray gene expression analysis for GPU-equipped computers. R users can take advantage of the better performance provided by an Nvidia GPU. The package is available from CRAN, the R project's repository of packages, at http://cran.r-project.org/web/packages/gputools. More information about our gputools R package is available at http://brainarray.mbni.med.umich.edu/brainarray/Rgpgpu

  12. Scidac-Data: Enabling Data Driven Modeling of Exascale Computing

    Science.gov (United States)

    Mubarak, Misbah; Ding, Pengfei; Aliaga, Leo; Tsaris, Aristeidis; Norman, Andrew; Lyon, Adam; Ross, Robert

    2017-10-01

    The SciDAC-Data project is a DOE-funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab data center on the organization, movement, and consumption of high energy physics (HEP) data. The project analyzes the analysis patterns and data organization that have been used by NOvA, MicroBooNE, MINERvA, CDF, D0, and other experiments to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization, and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership-class exascale computing facilities. We present the use of a subset of the SciDAC-Data distributions, acquired from analysis of approximately 71,000 HEP workflows run on the Fermilab data center and corresponding to over 9 million individual analysis jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in high performance computing (HPC) and high throughput computing (HTC) environments. In particular we describe how the Sequential Access via Metadata (SAM) data-handling system in combination with the dCache/Enstore-based data archive facilities has been used to develop radically different models for analyzing the HEP data. We also show how the simulations may be used to assess the impact of design choices in archive facilities.

  13. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    Science.gov (United States)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provide a myriad of challenges when running on a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large scale simulation and analysis work are commonplace and provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
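
    A minimal sketch of the task-parallel (MPI) pattern mentioned above, using mpi4py: one rank scatters a list of analysis tasks, every rank processes its share, and the results are gathered back. The task list and the analysis function are placeholders, not the CASCADE tooling itself.

      from mpi4py import MPI

      def analyze(task):
          # stand-in for an extreme-event statistic computed on one data chunk
          return {"task": task, "result": sum(range(task))}

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      if rank == 0:
          tasks = list(range(1, 101))
          chunks = [tasks[i::size] for i in range(size)]   # round-robin split of the task list
      else:
          chunks = None

      my_tasks = comm.scatter(chunks, root=0)
      my_results = [analyze(t) for t in my_tasks]
      all_results = comm.gather(my_results, root=0)

      if rank == 0:
          flat = [r for part in all_results for r in part]
          print(f"collected {len(flat)} results")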

  14. Implementing Computer Algebra Enabled Questions for the Assessment and Learning of Mathematics

    Science.gov (United States)

    Sangwin, Christopher J.; Naismith, Laura

    2008-01-01

    We present principles for the design of an online system to support computer algebra enabled questions for use within the teaching and learning of mathematics in higher education. The introduction of a computer algebra system (CAS) into a computer aided assessment (CAA) system affords sophisticated response processing of student provided answers.…
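
    The "sophisticated response processing" referred to above typically means checking a student's answer for algebraic equivalence rather than string equality. A minimal sketch with SymPy; the question and the answers are made up.

      import sympy as sp

      x = sp.Symbol("x")
      teacher_answer = (x + 1) ** 2
      student_answer = sp.sympify("x**2 + 2*x + 1")   # parsed from the student's typed input

      equivalent = sp.simplify(teacher_answer - student_answer) == 0
      print("correct" if equivalent else "try again")   # -> correct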

  15. Connections that Count: Brain-Computer Interface Enables the Profoundly Paralyzed to Communicate

    Science.gov (United States)

    [Figure: A brain-computer interface (BCI) system]

  16. An intelligent FFR with a self-adjustable ventilation fan.

    Science.gov (United States)

    Zhou, Song; Li, Hui; Shen, Shengnan; Li, Siyu; Wang, Wei; Zhang, Xiaotie; Yang, James

    2017-11-01

    This article presents an intelligent Filtering Facepiece Respirator (FFR) with a self-adjustable ventilation fan for improved comfort. The ventilation fan with intelligent control aims to reduce the temperature, relative humidity, and CO2 concentration inside the facepiece. Compared with a previous version of the FFR, the advantage of this new FFR is the intelligent control of the fan's rotation speed based on the change in temperature and relative humidity in the FFR dead space. The design of the control system utilizes an 8-bit, ultra-low-power STC15W404AS microcontroller (HongJin Technology, Shenzhen, China), and adopts a high-precision AM2320 device (AoSong Electronic, Guangzhou, China) as the temperature and relative humidity sensor, so that control of temperature and relative humidity is realized in real time within the FFR dead space. The ventilation fan is intelligently driven and runs on a rechargeable lithium battery with a power-save mode that provides a correspondingly longer operational time. Meanwhile, the design remains simple. Two experiments were performed to determine the best location to place the fan.
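
    A hedged sketch of the control idea described above: the fan duty cycle is raised as the temperature and relative humidity in the dead space climb. The thresholds, the sensor-read stub and the polling period below are illustrative assumptions; the authors' firmware runs in C on the 8-bit STC15W404AS microcontroller.

      import time

      def read_am2320():
          """Stand-in for the AM2320 temperature/relative-humidity read (assumed interface)."""
          return 31.5, 78.0   # degrees C, % RH

      def duty_cycle(temp_c, rh):
          """Map the climate in the FFR dead space to a fan duty cycle (0-100%); thresholds assumed."""
          if temp_c < 29 and rh < 60:
              return 0          # comfortable: save battery
          if temp_c < 33 and rh < 80:
              return 50         # moderate load
          return 100            # hot or humid: full ventilation

      for _ in range(3):        # a few polling cycles of the (assumed) 2 s control loop
          t, h = read_am2320()
          print(f"T={t:.1f} C, RH={h:.0f}% -> fan duty {duty_cycle(t, h)}%")
          time.sleep(2)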

  17. From E-commerce to Social Commerce: A Framework to Guide Enabling Cloud Computing

    OpenAIRE

    Baghdadi, Youcef

    2013-01-01

    Social commerce is doing commerce in a collaborative and participative way, by using social media, through an enterprise interactive interface that enables social interactions. Technologies such as Web 2.0, Cloud Computing and Service Oriented Architecture (SOA) enable social commerce. Yet, a framework for social commerce, putting Enterprise Social Interactions as central entities, would provide a strong business justification for social commerce design and adoption with these enabling techno...

  18. Clinical antibacterial effectiveness of the self-adjusting file system.

    Science.gov (United States)

    Neves, M A S; Rôças, I N; Siqueira, J F

    2014-04-01

    To evaluate in vivo the antibacterial effectiveness of the self-adjusting file (SAF) using molecular methods. Root canals from single-rooted teeth with apical periodontitis were instrumented using the SAF system under continuous irrigation with 2.5% NaOCl. DNA extracts from samples taken before and after instrumentation were subjected to quantitative analysis of total bacterial counts and levels of streptococci by quantitative real-time polymerase chain reaction (qPCR). The reverse-capture checkerboard assay was also used to identify 28 bacterial taxa before (S1) and after (S2) SAF instrumentation. SAF was also compared with a conventional hand nickel-titanium instrumentation technique for total bacterial reduction. Data from qPCR were analysed statistically within groups using the Wilcoxon matched pairs test and between groups using the Mann-Whitney U-test and Fisher's exact test, with the significance level set at P < 0.05. The SAF significantly reduced the total bacterial counts from a mean of 1.96 × 10(7) cells to 1.34 × 10(4) cells (P < 0.05). The reduction achieved by the SAF system was significantly superior to the 95.1% reduction obtained by hand instrumentation (P < 0.05). The SAF system succeeded in significantly reducing the streptococcal levels, but four cases still harboured these bacteria in S2. Checkerboard analysis revealed that not only streptococci but also some anaerobic and even as-yet-uncultivated bacteria may resist the effects of chemomechanical procedures. The SAF instrumentation system was highly effective in reducing bacterial populations from infected root canals and performed significantly better than hand instrumentation. However, because half of the samples still had detectable bacteria after preparation with the SAF, supplementary disinfection is still required to maximize bacterial elimination. © 2013 International Endodontic Journal. Published by John Wiley & Sons Ltd.
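
    The quantitative comparison above amounts to percentage reductions of pre-instrumentation (S1) versus post-instrumentation (S2) counts plus the named non-parametric tests. A sketch with SciPy, using invented counts rather than the study's data:

      import numpy as np
      from scipy import stats

      s1 = np.array([2.1e7, 1.5e7, 3.0e7, 9.8e6, 2.4e7])   # hypothetical pre-op bacterial counts
      s2 = np.array([1.2e4, 3.4e4, 8.0e3, 2.2e4, 1.0e4])   # hypothetical post-op bacterial counts

      reduction_pct = 100 * (1 - s2.sum() / s1.sum())
      print(f"overall reduction: {reduction_pct:.2f}%")

      print(stats.wilcoxon(s1, s2))        # within-group, paired comparison (S1 vs S2)
      print(stats.mannwhitneyu(s1, s2))    # between-group style comparison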

  19. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    Science.gov (United States)

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  20. A portable grid-enabled computing system for a nuclear material study

    International Nuclear Information System (INIS)

    Tsujita, Yuichi; Arima, Tatsumi; Takekawa, Takayuki; Suzuki, Yoshio

    2010-01-01

    We have built a portable grid-enabled computing system specialized for our molecular dynamics (MD) simulation program to study Pu materials easily. An experimental approach to revealing the properties of Pu materials is often accompanied by difficulties such as the radiotoxicity of actinides. Since a computational approach can reveal new aspects to researchers without such radioactive facilities, we address an MD computation. In order to obtain more realistic results about, e.g., the melting point or thermal conductivity, we need large-scale parallel computations. Most application users who do not have supercomputers at their own institutes must use a remote supercomputer. For such users, we have developed the portable and secured grid-enabled computing system to utilize a grid computing infrastructure provided by the Information Technology Based Laboratory (ITBL). This system enables us to access remote supercomputers in the ITBL system seamlessly from a client PC through its graphical user interface (GUI). Typically, it enables seamless file access through the GUI. Furthermore, monitoring of standard output and standard error is available to follow the progress of an executing program. Since the system provides rich functionality useful for parallel computing on a remote supercomputer, application users can concentrate on their research. (author)

  1. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems.

    Science.gov (United States)

    Wu, Jun; Su, Zhou; Wang, Shen; Li, Jianhua

    2017-07-30

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low-latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, the efficient social organization can enable more flexible, secure, and collaborative networking. The aforementioned advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on "friend" relationships. To the best of our knowledge, this is the first attempt at an organized fog computing system-based social model. Meanwhile, social networking enhances the complexity and security risks of fog computing services, creating difficulties for security service recommendation in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems.

  2. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems

    Directory of Open Access Journals (Sweden)

    Jun Wu

    2017-07-01

    Full Text Available Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low-latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, the efficient social organization can enable more flexible, secure, and collaborative networking. The aforementioned advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on “friend” relationships. To the best of our knowledge, this is the first attempt at an organized fog computing system-based social model. Meanwhile, social networking enhances the complexity and security risks of fog computing services, creating difficulties for security service recommendation in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems.

  3. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems

    Science.gov (United States)

    Wu, Jun; Su, Zhou; Li, Jianhua

    2017-01-01

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low-latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, the efficient social organization can enable more flexible, secure, and collaborative networking. The aforementioned advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on “friend” relationships. To the best of our knowledge, this is the first attempt at an organized fog computing system-based social model. Meanwhile, social networking enhances the complexity and security risks of fog computing services, creating difficulties for security service recommendation in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems. PMID:28758943

  4. Productization and Commercialization of IT-Enabled Higher Education in Computer Science: A Systematic Literature Review

    Science.gov (United States)

    Kankaanpää, Irja; Isomäki, Hannakaisa

    2013-01-01

    This paper reviews research literature on the production and commercialization of IT-enabled higher education in computer science. Systematic literature review (SLR) was carried out in order to find out to what extent this area has been studied, more specifically how much it has been studied and to what detail. The results of this paper make a…

  5. Enabling analytics on sensitive medical data with secure multi-party computation

    NARCIS (Netherlands)

    M. Veeningen (Meilof); S. Chatterjea (Supriyo); A.Z. Horváth (Anna Zsófia); G. Spindler (Gerald); E. Boersma (Eric); P. van der Spek (Peter); O. van der Galiën (Onno); J. Gutteling (Job); W. Kraaij (Wessel); P.J.M. Veugen (Thijs)

    2018-01-01

    textabstractWhile there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multiparty computation can enable such data

  6. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    Science.gov (United States)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
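
    A hedged illustration of the kind of static check such an assessment framework performs, written against a simplified, made-up project representation (this is not the actual Hairball API or the Scratch file format): flag scripts that can never run because they do not start with an event ("hat") block.

      def find_unreachable_scripts(project):
          """Flag scripts whose first block is not an event ('hat') block."""
          hat_blocks = {"whenGreenFlag", "whenKeyPressed", "whenIReceive"}
          issues = []
          for sprite, scripts in project.items():
              for i, script in enumerate(scripts):
                  if script and script[0] not in hat_blocks:
                      issues.append(f"{sprite}: script {i} never runs (starts with {script[0]})")
          return issues

      toy_project = {
          "Cat": [["whenGreenFlag", "moveSteps", "sayHello"],
                  ["moveSteps", "turnRight"]],          # no hat block -> unreachable
      }
      print(find_unreachable_scripts(toy_project))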

  7. Parallel Monte Carlo simulations on an ARC-enabled computing grid

    International Nuclear Information System (INIS)

    Nilsen, Jon K; Samset, Bjørn H

    2011-01-01

    Grid computing opens new possibilities for running heavy Monte Carlo simulations of physical systems in parallel. The presentation gives an overview of GaMPI, a system for running an MPI-based random walker simulation on grid resources. Integrating the ARC middleware and the new storage system Chelonia with the Ganga grid job submission and control system, we show that MPI jobs can be run on a world-wide computing grid with good performance and promising scaling properties. Results for relatively communication-heavy Monte Carlo simulations run on multiple heterogeneous, ARC-enabled computing clusters in several countries are presented.
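
    As a loose illustration of the kind of MPI-based random-walker workload described above (not the GaMPI code itself, and with the ARC/Ganga grid submission layer omitted), a minimal mpi4py sketch might look as follows; all parameters are made up.

        # Hedged sketch: each MPI rank simulates independent 1-D random walkers
        # and the per-rank statistic is reduced to rank 0.
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        rng = np.random.default_rng(seed=rank)        # independent stream per rank
        n_walkers, n_steps = 10_000, 1_000
        steps = rng.choice([-1, 1], size=(n_walkers, n_steps))
        final = steps.sum(axis=1)                      # end positions of the walkers
        local_msd = np.mean(final.astype(float) ** 2)  # mean-squared displacement

        global_msd = comm.reduce(local_msd, op=MPI.SUM, root=0)
        if rank == 0:
            print("mean-squared displacement ~", global_msd / size)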

  8. Eight months of clinical experience with the Self-Adjusting File system.

    Science.gov (United States)

    Solomonov, Michael

    2011-06-01

    The Self-Adjusting File (SAF) system (ReDent-Nova, Ra'anana, Israel) has been recently introduced for the simultaneous instrumentation and irrigation of root canals. The SAF is claimed to adapt itself three dimensionally to the root canal, including its cross-section. It is operated with a continuous flow of sodium hypochlorite that is delivered into the root canal through the hollow file and claimed to be activated by sonic agitation of the irrigant. Our aim was to present for the first time clinical cases prepared with the SAF system and to describe a clinical classification of canals, according to their difficulty, with recommendations for endodontic treatment sequences for each category. This report is based on the experience of a single endodontist, who used the system to treat more than 50 consecutive primary endodontic cases over the prior 8 months. A clinical classification was developed which enabled the operator to select a treatment protocol for easy and optimal glide path preparation to be effectively used with the SAF file in the various root canals encountered in the clinical environment. Clinical classification of canal difficulty makes root canal treatment sequences with the SAF simple and predictable. Many types of cases can be treated with the SAF system although a novice user is advised to advance slowly along the learning curve from simpler to more complicated canals. Copyright © 2011 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  9. Neuromorphic computing enabled by physics of electron spins: Prospects and perspectives

    Science.gov (United States)

    Sengupta, Abhronil; Roy, Kaushik

    2018-03-01

    “Spintronics” refers to the understanding of the physics of electron spin-related phenomena. While most of the significant advancements in this field have been driven primarily by memory, recent research has demonstrated that various facets of the underlying physics of spin transport and manipulation can directly mimic the functionalities of the computational primitives in neuromorphic computation, i.e., the neurons and synapses. Given the potential of these spintronic devices to implement bio-mimetic computations at very low terminal voltages, several spin-device structures have been proposed as the core building blocks of neuromorphic circuits and systems to implement brain-inspired computing. Such an approach is expected to play a key role in circumventing the problems of ever-increasing power dissipation and hardware requirements for implementing neuro-inspired algorithms in conventional digital CMOS technology. This review article outlines perspectives on spin-enabled neuromorphic computing, its status, challenges, and future prospects.

  10. Self-adjusting house-heating control system

    Energy Technology Data Exchange (ETDEWEB)

    Hacker, O; Ott, M

    1983-01-01

    Only a small expenditure in hardware and software is needed for the heating-control system described here to keep the room temperature precisely at the desired value in both day and night (reduced-temperature) operation. No control adjustment is needed, as the computer - in this case an EMUF model - adapts itself to changing conditions such as the type of house, weather conditions, etc. Accurate control and good control dynamics lead to considerable energy savings.

  11. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  12. Reconfigurable Computing As an Enabling Technology for Single-Photon-Counting Laser Altimetry

    Science.gov (United States)

    Powell, Wesley; Hicks, Edward; Pinchinat, Maxime; Dabney, Philip; McGarry, Jan; Murray, Paul

    2003-01-01

    Single-photon-counting laser altimetry is a new measurement technique offering significant advantages in vertical resolution, reducing instrument size, mass, and power, and reducing laser complexity as compared to analog or threshold detection laser altimetry techniques. However, these improvements come at the cost of a dramatically increased requirement for onboard real-time data processing. Reconfigurable computing has been shown to offer considerable performance advantages in performing this processing. These advantages have been demonstrated on the Multi-KiloHertz Micro-Laser Altimeter (MMLA), an aircraft based single-photon-counting laser altimeter developed by NASA Goddard Space Flight Center with several potential spaceflight applications. This paper describes how reconfigurable computing technology was employed to perform MMLA data processing in real-time under realistic operating constraints, along with the results observed. This paper also expands on these prior results to identify concepts for using reconfigurable computing to enable spaceflight single-photon-counting laser altimeter instruments.

  13. Soft Electronics Enabled Ergonomic Human-Computer Interaction for Swallowing Training

    Science.gov (United States)

    Lee, Yongkuk; Nicholls, Benjamin; Sup Lee, Dong; Chen, Yanfei; Chun, Youngjae; Siang Ang, Chee; Yeo, Woon-Hong

    2017-04-01

    We introduce a skin-friendly electronic system that enables human-computer interaction (HCI) for swallowing training in dysphagia rehabilitation. For an ergonomic HCI, we utilize a soft, highly compliant (“skin-like”) electrode, which addresses critical issues of an existing rigid and planar electrode combined with a problematic conductive electrolyte and adhesive pad. The skin-like electrode offers a highly conformal, user-comfortable interaction with the skin for long-term wearable, high-fidelity recording of swallowing electromyograms on the chin. Mechanics modeling and experimental quantification capture the ultra-elastic mechanical characteristics of an open-mesh microstructured sensor, conjugated with an elastomeric membrane. Systematic in vivo studies investigate the functionality of the soft electronics for HCI-enabled swallowing training, which includes the application of a biofeedback system to detect swallowing behavior. The collection of results demonstrates clinical feasibility of the ergonomic electronics in HCI-driven rehabilitation for patients with swallowing disorders.

  14. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    Science.gov (United States)

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
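
    The abstract does not spell out the cryptographic machinery, so the following sketch only illustrates the simplest building block behind secure multi-party computation, additive secret sharing over a prime field; it is not the protocol used in the pilots, and the modulus and example values are made up.

        # Hedged sketch: additive secret sharing lets parties compute a sum
        # without any single party seeing the underlying values.
        import secrets

        P = 2_147_483_647  # a prime modulus (illustrative)

        def share(value, n_parties):
            """Split value into n additive shares that sum to value mod P."""
            shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
            shares.append((value - sum(shares)) % P)
            return shares

        def reconstruct(shares):
            return sum(shares) % P

        # Two hospitals each share a patient count; the parties add shares
        # locally, so only the aggregate is ever revealed.
        a_shares, b_shares = share(120, 3), share(85, 3)
        sum_shares = [(x + y) % P for x, y in zip(a_shares, b_shares)]
        print(reconstruct(sum_shares))  # 205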

  15. Combining Archetypes, Ontologies and Formalization Enables Automated Computation of Quality Indicators.

    Science.gov (United States)

    Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald

    2017-01-01

    ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
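
    As a rough illustration of turning a formalized indicator into an executable query (not the CLIF tool's actual output, and not ArchMS vocabulary), a toy generator might emit SPARQL like the following; the class and property IRIs are invented.

        # Illustrative sketch: build a SPARQL query counting how many members of
        # an indicator population also satisfy the indicator condition.
        def indicator_to_sparql(population_class, condition_triple):
            return f"""SELECT (COUNT(DISTINCT ?patient) AS ?numerator)
        WHERE {{
          ?patient a <{population_class}> .
          ?patient {condition_triple} .
        }}"""

        print(indicator_to_sparql(
            "http://example.org/DiabeticPatient",
            "<http://example.org/hasAnnualHbA1cTest> true"))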

  16. CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment

    Science.gov (United States)

    Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.

    2017-12-01

    Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters, or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, using either simple EC2 instances or advanced AWS services such as AWS Lambda and ECS (EC2 Container Service).

  17. A self-adjusting delay circuit for pixel read-out chips

    International Nuclear Information System (INIS)

    Raith, B.

    1997-01-01

    A simple concept for automatic adjustment of important VLSI-circuit properties was proposed in (Fischer and Joens, Nucl. Instr. and Meth.). As an application, a self-adjusting monoflop is reviewed, and detailed measurements are discussed regarding a possible implementation in the LHC 1 read-out chip for the ATLAS experiment (ATLAS Internal Note, 1995). (orig.)

  18. Effects of Self-Adjusting File, Mtwo, and ProTaper on the root canal wall

    NARCIS (Netherlands)

    Hin, E.S.; Wu, M.K.; Wesselink, P.R.; Shemesh, H.

    2013-01-01

    Introduction The purpose of this ex vivo study was to observe the incidence of cracks in root dentin after root canal preparation with hand files, self-adjusting file (SAF), ProTaper, and Mtwo. Methods One hundred extracted mandibular premolars with single canals were randomly selected. Two

  19. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    Science.gov (United States)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit of tactile feedback for GECO glove use in data entry. This first-ever investigation of employment of a pressurized EVA glove for human-computer interface opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  20. Smart Items, Fog and Cloud Computing as Enablers of Servitization in Healthcare

    Directory of Open Access Journals (Sweden)

    Vladimir STANTCHEV

    2015-02-01

    In this article we argue that smart items and cloud computing can be powerful enablers of servitization as a business trend. This is exemplified by an application scenario in healthcare that was developed in the context of the OpSIT-Project in Germany. We present a three-level architecture for a smart healthcare infrastructure. The approach is based on a service-oriented architecture and extends established architectural approaches previously developed by our group. More specifically, it integrates a role model, a layered cloud computing architecture, and a fog-computing-informed paradigm in order to provide a viable architecture for healthcare and elderly-care applications. The integration of established paradigms is beneficial with respect to providing adequate quality of service and governance (e.g., data privacy and compliance). It has been verified by expert interviews with healthcare specialists and IT professionals. To further demonstrate the validity of this architectural model, we provide an example use-case as a template for any kind of smart sensor-based healthcare infrastructure.

  1. Reversing the Trend of Large Scale and Centralization in Manufacturing: The Case of Distributed Manufacturing of Customizable 3-D-Printable Self-Adjustable Glasses

    Directory of Open Access Journals (Sweden)

    Jephias Gwamuri

    2014-04-01

    Although the trend in manufacturing has been towards centralization to leverage economies of scale, the recent rapid technical development of open-source 3-D printers enables low-cost distributed bespoke production. This paper explores the potential advantages of a distributed manufacturing model of high-value products by investigating the application of 3-D printing to self-refraction eyeglasses. A series of parametric 3-D printable designs is developed, fabricated and tested to overcome limitations identified with mass-manufactured self-correcting eyeglasses designed for the developing world's poor. By utilizing 3-D printable self-adjustable glasses, communities not only gain access to far more diversity in product design, as the glasses can be customized for the individual, but 3-D printing also offers the potential for significant cost reductions. The results show that distributed manufacturing with open-source 3-D printing can empower developing world communities through the ability to print less expensive and customized self-adjusting eyeglasses. This offers the potential to displace both centrally manufactured conventional and self-adjusting glasses while completely eliminating the costs of the conventional optics correction experience, including those of highly-trained optometrists and ophthalmologists and their associated equipment. Although this study only analyzed a single product, it is clear that other products would benefit from the same approach in isolated regions of the developing world.

  2. PEAC: A Power-Efficient Adaptive Computing Technology for Enabling Swarm of Small Spacecraft and Deployable Mini-Payloads

    Data.gov (United States)

    National Aeronautics and Space Administration — This task is to develop and demonstrate a path-to-flight and power-adaptive avionics technology PEAC (Power Efficient Adaptive Computing). PEAC will enable emerging...

  3. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Allcock, William; Beggio, Chris; Campbell, Stuart; Cherry, Andrew; Cholia, Shreyas; Dart, Eli; England, Clay; Fahey, Tim; Foertter, Fernanda; Goldstone, Robin; Hick, Jason; Karelitz, David; Kelly, Kaki; Monroe, Laura; Prabhat,; Skinner, David; White, Julia

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA, at the DOE High Performance Computing Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  4. The COPD Knowledge Base: enabling data analysis and computational simulation in translational COPD research.

    Science.gov (United States)

    Cano, Isaac; Tényi, Ákos; Schueller, Christine; Wolff, Martin; Huertas Migueláñez, M Mercedes; Gomez-Cabrero, David; Antczak, Philipp; Roca, Josep; Cascante, Marta; Falciani, Francesco; Maier, Dieter

    2014-11-28

    Previously we generated a chronic obstructive pulmonary disease (COPD) specific knowledge base (http://www.copdknowledgebase.eu) from clinical and experimental data, text-mining results and public databases. This knowledge base allowed the retrieval of specific molecular networks together with integrated clinical and experimental data. The COPDKB has now been extended to integrate over 40 public data sources on functional interaction (e.g. signal transduction, transcriptional regulation, protein-protein interaction, gene-disease association). In addition we integrated COPD-specific expression and co-morbidity networks connecting over 6,000 genes/proteins with physiological parameters and disease states. Three mathematical models describing different aspects of systemic effects of COPD were connected to clinical and experimental data. We have completely redesigned the technical architecture of the user interface and now provide html and web browser-based access and form-based searches. A network search enables the use of interconnecting information and the generation of disease-specific sub-networks from general knowledge. Integration with the Synergy-COPD Simulation Environment enables multi-scale integrated simulation of individual computational models while integration with a Clinical Decision Support System allows delivery into clinical practice. The COPD Knowledge Base is the only publicly available knowledge resource dedicated to COPD and combining genetic information with molecular, physiological and clinical data as well as mathematical modelling. Its integrated analysis functions provide overviews about clinical trends and connections while its semantically mapped content enables complex analysis approaches. We plan to further extend the COPDKB by offering it as a repository to publish and semantically integrate data from relevant clinical trials. The COPDKB is freely available after registration at http://www.copdknowledgebase.eu.

  5. The Centre of High-Performance Scientific Computing, Geoverbund, ABC/J - Geosciences enabled by HPSC

    Science.gov (United States)

    Kollet, Stefan; Görgen, Klaus; Vereecken, Harry; Gasper, Fabian; Hendricks-Franssen, Harrie-Jan; Keune, Jessica; Kulkarni, Ketan; Kurtz, Wolfgang; Sharples, Wendy; Shrestha, Prabhakar; Simmer, Clemens; Sulis, Mauro; Vanderborght, Jan

    2016-04-01

    The Centre of High-Performance Scientific Computing (HPSC TerrSys) was founded in 2011 to establish a centre of competence in high-performance scientific computing in terrestrial systems and the geosciences, enabling fundamental and applied geoscientific research in the Geoverbund ABC/J (the geoscientific research alliance of the Universities of Aachen, Cologne, Bonn and the Research Centre Jülich, Germany). The specific goals of HPSC TerrSys are to achieve relevance at the national and international level in (i) the development and application of HPSC technologies in the geoscientific community; (ii) student education; (iii) HPSC services and support also to the wider geoscientific community; and (iv) the industry and public sectors via, e.g., useful applications and data products. A key feature of HPSC TerrSys is the Simulation Laboratory Terrestrial Systems, which is located at the Jülich Supercomputing Centre (JSC) and provides extensive capabilities with respect to porting, profiling, tuning and performance monitoring of geoscientific software in JSC's supercomputing environment. We will present a summary of success stories of HPSC applications including integrated terrestrial model development, parallel profiling and its application from watersheds to the continent; massively parallel data assimilation using physics-based models and ensemble methods; quasi-operational terrestrial water and energy monitoring; and convection permitting climate simulations over Europe. The success stories stress the need for a formalized education of students in the application of HPSC technologies in the future.

  6. Rapid, computer vision-enabled murine screening system identifies neuropharmacological potential of two new mechanisms

    Directory of Open Access Journals (Sweden)

    Steven L Roberds

    2011-09-01

    The lack of predictive in vitro models for behavioral phenotypes impedes rapid advancement in neuropharmacology and psychopharmacology. In vivo behavioral assays are more predictive of activity in human disorders, but such assays are often highly resource-intensive. Here we describe the successful application of a computer vision-enabled system to identify potential neuropharmacological activity of two new mechanisms. The analytical system was trained using multiple drugs that are used clinically to treat depression, schizophrenia, anxiety, and other psychiatric or behavioral disorders. During blinded testing the PDE10 inhibitor TP-10 produced a signature of activity suggesting potential antipsychotic activity. This finding is consistent with TP-10’s activity in multiple rodent models that is similar to that of clinically used antipsychotic drugs. The CK1ε inhibitor PF-670462 produced a signature consistent with anxiolytic activity and, at the highest dose tested, behavioral effects similar to that of opiate analgesics. Neither TP-10 nor PF-670462 was included in the training set. Thus, computer vision-based behavioral analysis can facilitate drug discovery by identifying neuropharmacological effects of compounds acting through new mechanisms.

  7. Time synchronization algorithm of distributed system based on server time-revise and workstation self-adjust

    International Nuclear Information System (INIS)

    Zhou Shumin; Sun Yamin; Tang Bin

    2007-01-01

    In order to enhance the time synchronization quality of the distributed system, a time synchronization algorithm for distributed systems based on server time-revise and workstation self-adjust is proposed. The time-revise cycle and self-adjust process are introduced in the paper. The algorithm reduces network flow effectively and enhances the quality of clock synchronization. (authors)
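
    A minimal sketch of the described idea is given below, with the server's "time-revise" step and the workstation's "self-adjust" step separated; the drift model and all names are assumptions, not the authors' algorithm.

        # Hedged sketch: the server periodically revises the workstation clock,
        # and between revisions the workstation self-adjusts using its estimated
        # drift rate.
        import time

        class SelfAdjustingClock:
            def __init__(self):
                self.offset = 0.0        # last measured server-minus-local offset (s)
                self.drift_rate = 0.0    # estimated drift in seconds per second
                self.last_revise = time.monotonic()

            def revise(self, server_time):
                """Server time-revise step: measure the offset and update drift."""
                now = time.monotonic()
                new_offset = server_time - now
                elapsed = now - self.last_revise
                if elapsed > 0:
                    self.drift_rate = (new_offset - self.offset) / elapsed
                self.offset, self.last_revise = new_offset, now

            def now(self):
                """Workstation self-adjust step: extrapolate between revisions."""
                local = time.monotonic()
                return local + self.offset + self.drift_rate * (local - self.last_revise)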

  8. Design procedure for a wind-wheel with self-adjusting blade mechanism

    Directory of Open Access Journals (Sweden)

    Gennady A. Oborsky

    2014-12-01

    A wind-wheel design equipped with a self-adjusting blade is developed. The blade is positioned eccentrically to the balance wheel and can freely rotate around its axis. A method of calculating the energy characteristics of a wind-wheel with a self-adjusting blade is elaborated, considering not only the wind force but also the resistance of the counter air flow to the blade's rotation. Initially, with the blade located at an angle α = 45° to the wheel rotation plane, the air flow rotates the wheel with the maximum force. The speed of rotation then increases, which increases the counter-flow resistance and turns the blade, reducing the angle α and, consequently, the torque. When the torsional force and the resistance reach equilibrium, the blade settles at a certain angle α and the wheel speed becomes constant. This wind-wheel design with a self-adjusting blade allows the air-flow load ratio to be increased compared to a wind-wheel equipped with a jammed blade.

  9. The (1+λ) evolutionary algorithm with self-adjusting mutation rate

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Witt, Carsten; Gießen, Christian

    2017-01-01

    We propose a new way to self-adjust the mutation rate in population-based evolutionary algorithms. Roughly speaking, it consists of creating half the offspring with a mutation rate that is twice the current mutation rate and the other half with half the current rate. The mutation rate is then updated to the rate used in the subpopulation which contains the best offspring. We analyze how the (1 + λ) evolutionary algorithm with this self-adjusting mutation rate optimizes the OneMax test function. We prove that this dynamic version of the (1 + λ) EA finds the optimum in an expected optimization time (number of fitness evaluations) of O(nλ/log λ + n log n). This time is asymptotically smaller than the optimization time of the classic (1 + λ) EA. Previous work shows that this performance is best-possible among all λ-parallel mutation-based unbiased black-box algorithms. This result shows
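
    The two-rate mechanism is simple enough to sketch directly. The following Python fragment follows the description above on OneMax; the rate clamping, tie-breaking and parameter choices are illustrative, not the exact rule analyzed in the paper.

        # Minimal sketch of a two-rate self-adjusting (1+λ) EA on OneMax.
        import random

        def onemax(x):
            return sum(x)

        def self_adjusting_one_plus_lambda(n=100, lam=8, seed=0):
            rng = random.Random(seed)
            parent = [rng.randint(0, 1) for _ in range(n)]
            rate = 2 / n                                   # current mutation rate
            evaluations = 0
            while onemax(parent) < n:
                best, best_fit, best_rate = None, -1, rate
                for i in range(lam):
                    # half the offspring use rate/2, the other half use 2*rate
                    r = rate / 2 if i < lam // 2 else rate * 2
                    child = [1 - b if rng.random() < r else b for b in parent]
                    evaluations += 1
                    f = onemax(child)
                    if f > best_fit:
                        best, best_fit, best_rate = child, f, r
                if best_fit >= onemax(parent):             # (1+λ): keep best if not worse
                    parent = best
                rate = min(max(best_rate, 1 / n), 0.5)     # adopt winning rate, clamped
            return evaluations

        print("evaluations:", self_adjusting_one_plus_lambda())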

  10. Enabling systematic, harmonised and large-scale biofilms data computation: the Biofilms Experiment Workbench.

    Science.gov (United States)

    Pérez-Rodríguez, Gael; Glez-Peña, Daniel; Azevedo, Nuno F; Pereira, Maria Olívia; Fdez-Riverola, Florentino; Lourenço, Anália

    2015-03-01

    Biofilms are receiving increasing attention from the biomedical community. Biofilm-like growth within human body is considered one of the key microbial strategies to augment resistance and persistence during infectious processes. The Biofilms Experiment Workbench is a novel software workbench for the operation and analysis of biofilms experimental data. The goal is to promote the interchange and comparison of data among laboratories, providing systematic, harmonised and large-scale data computation. The workbench was developed with AIBench, an open-source Java desktop application framework for scientific software development in the domain of translational biomedicine. Implementation favours free and open-source third-parties, such as the R statistical package, and reaches for the Web services of the BiofOmics database to enable public experiment deposition. First, we summarise the novel, free, open, XML-based interchange format for encoding biofilms experimental data. Then, we describe the execution of common scenarios of operation with the new workbench, such as the creation of new experiments, the importation of data from Excel spreadsheets, the computation of analytical results, the on-demand and highly customised construction of Web publishable reports, and the comparison of results between laboratories. A considerable and varied amount of biofilms data is being generated, and there is a critical need to develop bioinformatics tools that expedite the interchange and comparison of microbiological and clinical results among laboratories. We propose a simple, open-source software infrastructure which is effective, extensible and easy to understand. The workbench is freely available for non-commercial use at http://sing.ei.uvigo.es/bew under LGPL license. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009); and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current generation Petascale capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and

  12. Spoken language identification based on the enhanced self-adjusting extreme learning machine approach

    Science.gov (United States)

    Tiun, Sabrina; AL-Dhief, Fahad Taha; Sammour, Mahmoud A. M.

    2018-01-01

    Spoken Language Identification (LID) is the process of determining and classifying natural language from a given content and dataset. Typically, data must be processed to extract useful features to perform LID. Extracting features for LID is, based on the literature, a mature process where the standard features for LID have already been developed using Mel-Frequency Cepstral Coefficients (MFCC), Shifted Delta Cepstral (SDC), the Gaussian Mixture Model (GMM) and ending with the i-vector based framework. However, the process of learning based on extracted features remains to be improved (i.e. optimised) to capture all embedded knowledge in the extracted features. The Extreme Learning Machine (ELM) is an effective learning model used to perform classification and regression analysis and is extremely useful to train a single hidden layer neural network. Nevertheless, the learning process of this model is not entirely effective (i.e. optimised) due to the random selection of weights within the input hidden layer. In this study, the ELM is selected as a learning model for LID based on standard feature extraction. One of the optimisation approaches of ELM, the Self-Adjusting Extreme Learning Machine (SA-ELM), is selected as the benchmark and improved by altering the selection phase of the optimisation process. The selection process is performed incorporating both the Split-Ratio and K-Tournament methods; the improved SA-ELM is named Enhanced Self-Adjusting Extreme Learning Machine (ESA-ELM). The results are generated based on LID with datasets created from eight different languages. The results of the study showed the superior performance of the Enhanced Self-Adjusting Extreme Learning Machine LID (ESA-ELM LID) compared with the SA-ELM LID, with ESA-ELM LID achieving an accuracy of 96.25%, as compared to the accuracy of SA-ELM LID of only 95.00%. PMID:29672546

  13. Spoken language identification based on the enhanced self-adjusting extreme learning machine approach.

    Science.gov (United States)

    Albadr, Musatafa Abbas Abbood; Tiun, Sabrina; Al-Dhief, Fahad Taha; Sammour, Mahmoud A M

    2018-01-01

    Spoken Language Identification (LID) is the process of determining and classifying natural language from a given content and dataset. Typically, data must be processed to extract useful features to perform LID. Extracting features for LID is, based on the literature, a mature process where the standard features for LID have already been developed using Mel-Frequency Cepstral Coefficients (MFCC), Shifted Delta Cepstral (SDC), the Gaussian Mixture Model (GMM) and ending with the i-vector based framework. However, the process of learning based on extracted features remains to be improved (i.e. optimised) to capture all embedded knowledge in the extracted features. The Extreme Learning Machine (ELM) is an effective learning model used to perform classification and regression analysis and is extremely useful to train a single hidden layer neural network. Nevertheless, the learning process of this model is not entirely effective (i.e. optimised) due to the random selection of weights within the input hidden layer. In this study, the ELM is selected as a learning model for LID based on standard feature extraction. One of the optimisation approaches of ELM, the Self-Adjusting Extreme Learning Machine (SA-ELM), is selected as the benchmark and improved by altering the selection phase of the optimisation process. The selection process is performed incorporating both the Split-Ratio and K-Tournament methods; the improved SA-ELM is named Enhanced Self-Adjusting Extreme Learning Machine (ESA-ELM). The results are generated based on LID with datasets created from eight different languages. The results of the study showed the superior performance of the Enhanced Self-Adjusting Extreme Learning Machine LID (ESA-ELM LID) compared with the SA-ELM LID, with ESA-ELM LID achieving an accuracy of 96.25%, as compared to the accuracy of SA-ELM LID of only 95.00%.
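
    For readers unfamiliar with the base learner, the sketch below shows a plain Extreme Learning Machine (random hidden layer plus least-squares output weights). The ESA-ELM selection scheme (Split-Ratio / K-Tournament) is not reproduced here, and the toy data stand in for real language features.

        # Hedged sketch of a basic ELM classifier.
        import numpy as np

        class ELM:
            def __init__(self, n_hidden=64, seed=0):
                self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

            def fit(self, X, y_onehot):
                n_features = X.shape[1]
                self.W = self.rng.normal(size=(n_features, self.n_hidden))  # random input weights
                self.b = self.rng.normal(size=self.n_hidden)                # random biases
                H = np.tanh(X @ self.W + self.b)                            # hidden activations
                self.beta = np.linalg.pinv(H) @ y_onehot                    # least-squares output weights
                return self

            def predict(self, X):
                H = np.tanh(X @ self.W + self.b)
                return np.argmax(H @ self.beta, axis=1)

        # Toy usage with random "language feature" vectors for 3 classes
        rng = np.random.default_rng(1)
        X = rng.normal(size=(300, 20)) + np.repeat(np.arange(3), 100)[:, None]
        y = np.repeat(np.arange(3), 100)
        Y = np.eye(3)[y]
        print("train accuracy:", (ELM().fit(X, Y).predict(X) == y).mean())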

  14. Self-adjusting entropy-stable scheme for compressible Euler equations

    Institute of Scientific and Technical Information of China (English)

    程晓晗; 聂玉峰; 封建湖; LuoXiao-Yu; 蔡力

    2015-01-01

    In this work, a self-adjusting entropy-stable scheme is proposed for solving compressible Euler equations. The entropy-stable scheme is constructed by combining the entropy conservative flux with a suitable diffusion operator. The entropy has to be preserved in smooth solutions and be dissipated at shocks. To achieve this, a switch function, based on entropy variables, is employed so that the numerical diffusion term is added automatically around discontinuities. The resulting scheme is still entropy-stable. A number of numerical experiments illustrating the robustness and accuracy of the scheme are presented. From these numerical results, we observe a remarkable gain in accuracy.

  15. Self-adjusting entropy-stable scheme for compressible Euler equations

    International Nuclear Information System (INIS)

    Cheng Xiao-Han; Nie Yu-Feng; Cai Li; Feng Jian-Hu; Luo Xiao-Yu

    2015-01-01

    In this work, a self-adjusting entropy-stable scheme is proposed for solving compressible Euler equations. The entropy-stable scheme is constructed by combining the entropy conservative flux with a suitable diffusion operator. The entropy has to be preserved in smooth solutions and be dissipated at shocks. To achieve this, a switch function, which is based on entropy variables, is employed to make the numerical diffusion term be automatically added around discontinuities. The resulting scheme is still entropy-stable. A number of numerical experiments illustrating the robustness and accuracy of the scheme are presented. From these numerical results, we observe a remarkable gain in accuracy. (paper)
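
    The generic structure of such a scheme can be written compactly; the notation below is a hedged sketch of the usual entropy-conservative-plus-switched-dissipation form, not quoted from the paper.

        % Generic form of a self-adjusting entropy-stable interface flux (notation assumed):
        \[
          \mathbf{F}_{i+1/2}
          = \mathbf{F}^{EC}_{i+1/2}
          - \tfrac{1}{2}\,\sigma_{i+1/2}\,\mathbf{D}_{i+1/2}\,
            \bigl(\mathbf{v}_{i+1} - \mathbf{v}_{i}\bigr),
          \qquad 0 \le \sigma_{i+1/2} \le 1,
        \]
        % where F^EC is the entropy-conservative flux, D is a symmetric positive
        % semi-definite dissipation matrix, and the switch sigma, computed from the
        % jump in the entropy variables v, tends to 1 near discontinuities and to 0
        % in smooth regions, so dissipation is added only where needed while
        % entropy stability is preserved.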

  16. The Security Challenges in the IoT Enabled Cyber-Physical Systems and Opportunities for Evolutionary Computing & Other Computational Intelligence

    OpenAIRE

    He, H.; Maple, C.; Watson, T.; Tiwari, A.; Mehnen, J.; Jin, Y.; Gabrys, Bogdan

    2016-01-01

    Internet of Things (IoT) has given rise to the fourth industrial revolution (Industrie 4.0), and it brings great benefits by connecting people, processes and data. However, cybersecurity has become a critical challenge in IoT-enabled cyber-physical systems, from connected supply chains and the Big Data produced by huge numbers of IoT devices to industrial control systems. Evolutionary computation, combined with other computational intelligence techniques, will play an important role in cybersecurity, such as ...

  17. Enabling Lean Design Through Computer Aided Synthesis: The Injection Moulding Cooling Case

    NARCIS (Netherlands)

    Jauregui Becker, Juan Manuel; Wits, Wessel Willems

    2015-01-01

    This paper explores the application of Computer Aided Synthesis (CAS) to support the implementation of Set-Based Concurrent Engineering (SBCE) and Just In Time Decision Making (JIT-DM), which are considered as two of the cornerstones of the Lean Design method. Computer Aided Synthesis refers to a

  18. Enabling Customization through Web Development: An Iterative Study of the Dell Computer Corporation Website

    Science.gov (United States)

    Liu, Chang; Mackie, Brian G.

    2008-01-01

    Throughout the last decade, companies have increased their investment in electronic commerce (EC) by developing and implementing Web-based applications on the Internet. This paper describes a class project to develop a customized computer website which is similar to Dell Computer Corporation's (Dell) website. The objective of this project is to…

  19. SaaS enabled admission control for MCMC simulation in cloud computing infrastructures

    Science.gov (United States)

    Vázquez-Poletti, J. L.; Moreno-Vozmediano, R.; Han, R.; Wang, W.; Llorente, I. M.

    2017-02-01

    Markov Chain Monte Carlo (MCMC) methods are widely used in the field of simulation and modelling of materials, producing applications that require a great amount of computational resources. Cloud computing represents a seamless source for these resources in the form of HPC. However, resource over-consumption can be an important drawback, especially if the cloud provision process is not appropriately optimized. In the present contribution we propose a two-level solution that, on the one hand, takes advantage of approximate computing to reduce resource demand and, on the other, uses admission control policies to guarantee optimal provisioning to running applications.
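
    A toy version of the two-level idea (approximate computing to shrink the request, then a capacity-based admission check) might look as follows; the thresholds and the accuracy/cost trade-off are assumptions, not the authors' policy.

        # Illustrative sketch only: shrink the resource request via an
        # approximate-computing factor, then admit against remaining capacity.
        def admit_mcmc_job(requested_core_hours, capacity_core_hours,
                           approx_factor=0.7, reserve_fraction=0.1):
            """Return (admitted, granted_core_hours)."""
            granted = requested_core_hours * approx_factor          # approximate computing
            usable = capacity_core_hours * (1.0 - reserve_fraction) # keep a safety reserve
            if granted <= usable:
                return True, granted
            return False, 0.0

        print(admit_mcmc_job(requested_core_hours=1200, capacity_core_hours=1500))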

  20. Porting Erasmus Computing Grid (Condor enabled Applications for EDGeS)

    NARCIS (Netherlands)

    L.V. de Zeeuw (Luc); T.A. Knoch (Tobias)

    2008-01-01

    textabstractToday advances in scientific research as well as clinical diagnostics and treatment are inevitably connected with information solutions concerning computation power and information storage. The needs for information technology are enormous and are in many cases the limiting

  1. Enabling Students to Construct Theories of Collaborative Inquiry and Reflective Learning: Computer Support for Metacognitive Development

    OpenAIRE

    White, Barbara Y.; Shimoda, Todd A.; Frederiksen, John R.

    1999-01-01

    Part II of the Special Issue on Authoring Systems for Intelligent Tutoring Systems (editors: Tom Murray and Stephen Blessing); To develop lifelong learning skills, we argue that students need to learn how to learn via inquiry and understand the sociocognitive and metacognitive processes that are involved. We illustrate how software could play a central role in enabling students to develop such expertise. Our hypothesis is that sociocognitive systems, such as those needed for collaborative inq...

  2. The boat hull model : adapting the roofline model to enable performance prediction for parallel computing

    NARCIS (Netherlands)

    Nugteren, C.; Corporaal, H.

    2012-01-01

    Multi-core and many-core were already major trends for the past six years, and are expected to continue for the next decades. With these trends of parallel computing, it becomes increasingly difficult to decide on which architecture to run a given application. In this work, we use an algorithm

  3. The boat hull model : enabling performance prediction for parallel computing prior to code development

    NARCIS (Netherlands)

    Nugteren, C.; Corporaal, H.

    2012-01-01

    Multi-core and many-core were already major trends for the past six years and are expected to continue for the next decade. With these trends of parallel computing, it becomes increasingly difficult to decide on which processor to run a given application, mainly because the programming of these

  4. Enabling the ATLAS Experiment at the LHC for High Performance Computing

    CERN Document Server

    AUTHOR|(CDS)2091107; Ereditato, Antonio

    In this thesis, I studied the feasibility of running computer data analysis programs from the Worldwide LHC Computing Grid, in particular large-scale simulations of the ATLAS experiment at the CERN LHC, on current general purpose High Performance Computing (HPC) systems. An approach for integrating HPC systems into the Grid is proposed, which has been implemented and tested on the "Todi" HPC machine at the Swiss National Supercomputing Centre (CSCS). Over the course of the test, more than 500,000 CPU-hours of processing time have been provided to ATLAS, which is roughly equivalent to the combined computing power of the two ATLAS clusters at the University of Bern. This showed that current HPC systems can be used to efficiently run large-scale simulations of the ATLAS detector and of the detected physics processes. As a first conclusion of my work, one can argue that, in perspective, running large-scale tasks on a few large machines might be more cost-effective than running on relatively small dedicated com...

  5. The impact of computer science in molecular medicine: enabling high-throughput research.

    Science.gov (United States)

    de la Iglesia, Diana; García-Remesal, Miguel; de la Calle, Guillermo; Kulikowski, Casimir; Sanz, Ferran; Maojo, Víctor

    2013-01-01

    The Human Genome Project and the explosion of high-throughput data have transformed the areas of molecular and personalized medicine, which are producing a wide range of studies and experimental results and providing new insights for developing medical applications. Research in many interdisciplinary fields is resulting in data repositories and computational tools that support a wide diversity of tasks: genome sequencing, genome-wide association studies, analysis of genotype-phenotype interactions, drug toxicity and side effects assessment, prediction of protein interactions and diseases, development of computational models, biomarker discovery, and many others. The authors of the present paper have developed several inventories covering tools, initiatives and studies in different computational fields related to molecular medicine: medical informatics, bioinformatics, clinical informatics and nanoinformatics. With these inventories, created by mining the scientific literature, we have carried out several reviews of these fields, providing researchers with a useful framework to locate, discover, search and integrate resources. In this paper we present an analysis of the state-of-the-art as it relates to computational resources for molecular medicine, based on results compiled in our inventories, as well as results extracted from a systematic review of the literature and other scientific media. The present review is based on the impact of their related publications and the available data and software resources for molecular medicine. It aims to provide information that can be useful to support ongoing research and work to improve diagnostics and therapeutics based on molecular-level insights.

  6. A Cloud Computing-Enabled Spatio-Temporal Cyber-Physical Information Infrastructure for Efficient Soil Moisture Monitoring

    Directory of Open Access Journals (Sweden)

    Lianjie Zhou

    2016-06-01

    Comprehensive surface soil moisture (SM) monitoring is a vital task in precision agriculture applications. SM monitoring includes remote sensing imagery monitoring and in situ sensor-based observational monitoring. Cloud computing can increase computational efficiency enormously. A geographical web service was developed to assist in agronomic decision making, and this tool can be scaled to any location and crop. By integrating cloud computing and the web service-enabled information infrastructure, this study uses the cloud computing-enabled spatio-temporal cyber-physical infrastructure (CESCI) to provide an efficient solution for soil moisture monitoring in precision agriculture. On the server side of CESCI, diverse Open Geospatial Consortium web services work closely with each other. Hubei Province, located on the Jianghan Plain in central China, is selected as the remote sensing study area in the experiment. The Baoxie scientific experimental field in Wuhan City is selected as the in situ sensor study area. The results show that the proposed method enhances the efficiency of remote sensing imagery mapping and in situ soil moisture interpolation. In addition, the proposed method is compared to other existing precision agriculture infrastructures. In this comparison, the proposed infrastructure performs soil moisture mapping in Hubei Province in 1.4 min and near real-time in situ soil moisture interpolation in an efficient manner. Moreover, an enhanced performance monitoring method can help to reduce costs in precision agriculture monitoring, as well as increasing agricultural productivity and farmers’ net-income.
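
    The in situ interpolation step mentioned above can be illustrated with a simple inverse-distance-weighted (IDW) gridding sketch; the method choice, station coordinates and readings below are assumptions, not the paper's configuration.

        # Hedged sketch: IDW interpolation of in situ soil-moisture readings.
        import numpy as np

        def idw(stations, values, query_points, power=2.0, eps=1e-12):
            """stations: (n,2) lon/lat, values: (n,), query_points: (m,2) -> (m,)"""
            d = np.linalg.norm(query_points[:, None, :] - stations[None, :, :], axis=2)
            w = 1.0 / (d ** power + eps)                 # closer stations weigh more
            return (w @ values) / w.sum(axis=1)

        stations = np.array([[114.30, 30.50], [114.35, 30.52], [114.28, 30.55]])
        sm = np.array([0.21, 0.27, 0.24])                # volumetric soil moisture
        grid = np.array([[114.31, 30.51], [114.33, 30.53]])
        print(idw(stations, sm, grid))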

  7. Maintaining Privacy in Pervasive Computing - Enabling Acceptance of Sensor-based Services

    Science.gov (United States)

    Soppera, A.; Burbridge, T.

    During the 1980s, Mark Weiser [1] predicted a world in which computing was so pervasive that devices embedded in the environment could sense their relationship to us and to each other. These tiny ubiquitous devices would continually feed information from the physical world into the information world. Twenty years ago, this vision was the exclusive territory of academic computer scientists and science fiction writers. Today this subject has become of interest to business, government, and society. Governmental authorities exercise their power through the networked environment. Credit card databases maintain our credit history and decide whether we are allowed to rent a house or obtain a loan. Mobile telephones can locate us in real time so that we do not miss calls. Within another 10 years, all sorts of devices will be connected through the network. Our fridge, our food, together with our health information, may all be networked for the purpose of maintaining diet and well-being. The Internet will move from being an infrastructure to connect computers, to being an infrastructure to connect everything [2, 3].

  8. WorkStream: A Design Pattern for Multicore-Enabled Finite Element Computations

    KAUST Repository

    Turcksin, Bruno

    2016-08-31

    Many operations that need to be performed in modern finite element codes can be described as an operation that needs to be done independently on every cell, followed by a reduction of these local results into a global data structure. For example, matrix assembly, estimating discretization errors, or converting nodal values into data structures that can be output in visualization file formats all fall into this class of operations. Using this realization, we identify a software design pattern that we call WorkStream and that can be used to model such operations and enables the use of multicore shared-memory parallel processing. We also describe in detail how this design pattern can be efficiently implemented, and we provide numerical scalability results from its use in the DEAL.II software library.
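
    The pattern itself (parallel per-cell work followed by a sequential copy into a global object) is easy to illustrate outside of C++/deal.II; the Python sketch below captures only the structure, not the library's API, and the "cells" and "matrix" are toy stand-ins.

        # Hedged sketch of the WorkStream-style split between a parallel worker
        # and a sequential copier.
        from concurrent.futures import ThreadPoolExecutor

        def local_assemble(cell):
            # parallel "worker": compute this cell's local contribution independently
            return {(cell, cell): float(cell + 1)}

        def copy_local_to_global(local, global_matrix):
            # sequential "copier": merge one local result into the shared structure
            for key, value in local.items():
                global_matrix[key] = global_matrix.get(key, 0.0) + value

        global_matrix = {}
        with ThreadPoolExecutor() as pool:
            # workers run concurrently; copying happens one result at a time in the
            # calling thread, so no locking of the global matrix is needed
            for local in pool.map(local_assemble, range(8)):
                copy_local_to_global(local, global_matrix)
        print(len(global_matrix), "entries assembled")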

  9. Direct numerical simulation of reactor two-phase flows enabled by high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Jun; Cambareri, Joseph J.; Brown, Cameron S.; Feng, Jinyong; Gouws, Andre; Li, Mengnan; Bolotnov, Igor A.

    2018-04-01

    Nuclear reactor two-phase flows remain a great engineering challenge, where the high-resolution two-phase flow database which can inform practical model development is still sparse due to the extreme reactor operation conditions and measurement difficulties. Owing to the rapid growth of computing power, direct numerical simulation (DNS) is enjoying a renewed interest in investigating the related flow problems. A combination of DNS and an interface tracking method can provide a unique opportunity to study two-phase flows based on first-principles calculations. More importantly, state-of-the-art high-performance computing (HPC) facilities are helping unlock this great potential. This paper reviews the recent research progress of two-phase flow DNS related to reactor applications. The progress in large-scale bubbly flow DNS has been focused not only on the sheer size of those simulations in terms of resolved Reynolds number, but also on the associated advanced modeling and analysis techniques. Specifically, the current areas of active research include modeling of sub-cooled boiling, bubble coalescence, as well as the advanced post-processing toolkit for bubbly flow simulations in reactor geometries. A novel bubble tracking method has been developed to track the evolution of bubbles in two-phase bubbly flow. Also, spectral analysis of the DNS database in different geometries has been performed to investigate the modulation of the energy spectrum slope due to bubble-induced turbulence. In addition, single- and two-phase analysis results are presented for turbulent flows within pressurized water reactor (PWR) core geometries. Such simulations can be carried out only on world-leading HPC platforms. These simulations enable more complex turbulence model development and validation for use in 3D multiphase computational fluid dynamics (M-CFD) codes.

  10. Phase transitions enable computational universality in neuristor-based cellular automata

    International Nuclear Information System (INIS)

    Pickett, Matthew D; Stanley Williams, R

    2013-01-01

    We recently demonstrated that Mott memristors, two-terminal devices that exhibit threshold switching via an insulator to conductor phase transition, can serve as the active components necessary to build a neuristor, a biomimetic threshold spiking device. Here we extend those results to demonstrate, in simulation, neuristor-based circuits capable of performing general Boolean logic operations. We additionally show that these components can be used to construct a one-dimensional cellular automaton, rule 137, previously proven to be universal. This proof-of-principle shows that localized phase transitions can perform spiking computation, which is of particular interest for neuromorphic hardware. (paper)
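
    Rule 137 is an elementary cellular automaton, so a plain software reference is easy to write; the sketch below models only the abstract rule, not the neuristor circuit, and the lattice size and seed are arbitrary.

        # Simple software reference for elementary cellular automaton rule 137.
        RULE = 137
        TABLE = [(RULE >> i) & 1 for i in range(8)]   # TABLE[neighborhood] -> next state

        def step(cells):
            n = len(cells)
            return [TABLE[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
                    for i in range(n)]

        # Evolve a single seed cell on a ring and print a few generations
        cells = [0] * 31
        cells[15] = 1
        for _ in range(8):
            print("".join(".#"[c] for c in cells))
            cells = step(cells)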

  11. Selection Finder (SelFi): A computational metabolic engineering tool to enable directed evolution of enzymes

    Directory of Open Access Journals (Sweden)

    Neda Hassanpour

    2017-06-01

    Directed evolution of enzymes consists of an iterative process of creating mutant libraries and choosing desired phenotypes through screening or selection until the enzymatic activity reaches a desired goal. The biggest challenge in directed enzyme evolution is identifying high-throughput screens or selections to isolate the variant(s) with the desired property. We present in this paper a computational metabolic engineering framework, Selection Finder (SelFi), to construct a selection pathway from a desired enzymatic product to a cellular host and to couple the pathway with cell survival. We applied SelFi to construct selection pathways for four enzymes and their desired enzymatic products xylitol, D-ribulose-1,5-bisphosphate, methanol, and aniline. Two of the selection pathways identified by SelFi were previously experimentally validated for engineering Xylose Reductase and RuBisCO. Importantly, SelFi advances directed evolution of enzymes, as there are currently no known generalized strategies or computational techniques for identifying high-throughput selections for engineering enzymes.
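
    The core idea of coupling an enzymatic product to cell survival can be illustrated with a toy pathway search (not the SelFi tool itself); the network below is a highly simplified, partly invented example around xylitol.

        # Hedged sketch: breadth-first search for a selection pathway linking the
        # desired enzymatic product to a metabolite required for growth.
        from collections import deque

        def find_selection_pathway(reactions, product, growth_metabolite):
            """reactions: dict substrate -> list of (reaction_name, product_metabolite).
            Returns the list of reaction names coupling product to growth, or None."""
            queue, seen = deque([(product, [])]), {product}
            while queue:
                metabolite, path = queue.popleft()
                if metabolite == growth_metabolite:
                    return path
                for rxn, nxt in reactions.get(metabolite, []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, path + [rxn]))
            return None

        toy_network = {
            "xylitol": [("xylitol_dehydrogenase", "xylulose")],
            "xylulose": [("xylulokinase", "xylulose_5_phosphate")],
            "xylulose_5_phosphate": [("pentose_phosphate_pathway", "biomass_precursor")],
        }
        print(find_selection_pathway(toy_network, "xylitol", "biomass_precursor"))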

  12. The self-adjusting file (SAF) system: An evidence-based update

    Science.gov (United States)

    Metzger, Zvi

    2014-01-01

    Current rotary file systems are effective tools. Nevertheless, they have two main shortcomings: (1) they are unable to effectively clean and shape oval canals and depend too much on the irrigant to do the cleaning, which is an unrealistic illusion; and (2) they may jeopardize the long-term survival of the tooth via unnecessary, excessive removal of sound dentin and creation of micro-cracks in the remaining root dentin. The new Self-adjusting File (SAF) technology uses a hollow, compressible NiTi file, with no central metal core, through which a continuous flow of irrigant is provided throughout the procedure. The SAF technology allows for effective cleaning of all root canals including oval canals, thus allowing for the effective disinfection and obturation of all canal morphologies. This technology uses a new concept of cleaning and shaping in which a uniform layer of dentin is removed from around the entire perimeter of the root canal, thus avoiding unnecessary excessive removal of sound dentin. Furthermore, the mode of action used by this file system does not apply the machining of all root canals to a circular bore, as do all other rotary file systems, and does not cause micro-cracks in the remaining root dentin. The new SAF technology allows for a new concept in cleaning and shaping root canals: Minimally Invasive 3D Endodontics. PMID:25298639

  13. Effects of self-adjusting file, Mtwo, and ProTaper on the root canal wall.

    Science.gov (United States)

    Hin, Ellemieke S; Wu, Min-Kai; Wesselink, Paul R; Shemesh, Hagay

    2013-02-01

    The purpose of this ex vivo study was to observe the incidence of cracks in root dentin after root canal preparation with hand files, the self-adjusting file (SAF), ProTaper, and Mtwo. One hundred extracted mandibular premolars with single canals were randomly selected. Two angulated radiographs were taken for each tooth, and the width of the canal was measured at 9 mm from the apex. Five groups of 20 teeth each were comparable in canal width. The control group was left unprepared. Four experimental groups were instrumented with hand files, ProTaper, Mtwo, and SAF. Roots were then sectioned horizontally and observed under a microscope. The presence of dentinal cracks and their location were noted. The difference between the experimental groups was analyzed with a χ² test. No cracks were observed in the control group. In the experimental groups, ProTaper, Mtwo, and SAF caused cracks in 35%, 25%, and 10% of teeth, respectively. The hand-file group did not show any dentinal cracks; ProTaper and Mtwo caused significantly more cracks than hand files (P < .05). Instrumentation of root canals with SAF, Mtwo, and ProTaper could cause damage to root canal dentin. SAF has a tendency to cause fewer dentinal cracks as compared with ProTaper or Mtwo. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  14. The self-adjusting file (SAF) system: An evidence-based update.

    Science.gov (United States)

    Metzger, Zvi

    2014-09-01

    Current rotary file systems are effective tools. Nevertheless, they have two main shortcomings: they are unable to effectively clean and shape oval canals and depend too much on the irrigant to do the cleaning, which is an unrealistic illusion; and they may jeopardize the long-term survival of the tooth via unnecessary, excessive removal of sound dentin and creation of micro-cracks in the remaining root dentin. The new Self-adjusting File (SAF) technology uses a hollow, compressible NiTi file, with no central metal core, through which a continuous flow of irrigant is provided throughout the procedure. The SAF technology allows for effective cleaning of all root canals including oval canals, thus allowing for the effective disinfection and obturation of all canal morphologies. This technology uses a new concept of cleaning and shaping in which a uniform layer of dentin is removed from around the entire perimeter of the root canal, thus avoiding unnecessary excessive removal of sound dentin. Furthermore, the mode of action used by this file system does not machine all root canals to a circular bore, as all other rotary file systems do, and does not cause micro-cracks in the remaining root dentin. The new SAF technology allows for a new concept in cleaning and shaping root canals: Minimally Invasive 3D Endodontics.

  15. Assessment of apically extruded debris produced by the self-adjusting file system.

    Science.gov (United States)

    De-Deus, Gustavo André; Nogueira Leal Silva, Emmanuel João; Moreira, Edson Jorge; de Almeida Neves, Aline; Belladonna, Felipe Gonçalves; Tameirão, Michele

    2014-04-01

    This study was designed to quantitatively evaluate the amount of apically extruded debris produced by the Self-Adjusting-File system (SAF; ReDent-Nova, Ra'anana, Israel). Hand and rotary instruments were used as references for comparison. Sixty mesial roots of mandibular molars were randomly assigned to 3 groups (n = 20). The root canals were instrumented with hand files using a crown-down technique. The ProTaper (Dentsply Maillefer, Ballaigues, Switzerland) and SAF systems were used according to the manufacturers' instructions. Sodium hypochlorite was used as an irrigant, and the apically extruded debris was collected in preweighed glass vials and dried afterward. The mean weight of debris was assessed with a microbalance and statistically analyzed using 1-way analysis of variance and the post hoc Tukey multiple comparison test. Hand file instrumentation produced significantly more debris compared with the ProTaper and SAF systems (P < .05), and the ProTaper system produced significantly more debris compared with the SAF system (P < .05). All systems caused apical debris extrusion. SAF instrumentation was associated with less debris extrusion compared with the use of hand and rotary files. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  16. Punishment Mechanism with Self-Adjusting Rules in Spatial Voluntary Public Goods Games

    Science.gov (United States)

    Wu, Zhong-Wei; Xu, Zhao-Jin; Zhang, Lian-Zhong

    2014-11-01

    The phenomena of cooperation in animal and human society are ubiquitous, but the selfish outcome in which no player contributes to the public good leads to the “tragedy of the commons”. Recent research shows that strong punishment can improve cooperation in the population. In this paper, we introduce a punishment mechanism into spatial voluntary public goods games in which every individual knows only his own payoff in each round. Using the self-adjusting rules, we find that different costs of punishment can have different effects on the voluntary public goods games. In particular, when the cost of punishment is decreased, a region of higher contribution appears at low values of the enhancement factor r. This means that even at low r, individuals can form contributing groups in large numbers and produce a more efficient outcome than at moderate r. In addition, we also find that the players' memory can affect the average outcome of the population.
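    A minimal agent-based sketch of a spatial public goods game with punishment is given below. The enhancement factor r, fine, cost and imitation update are illustrative assumptions; the voluntary (loner) option and the paper's exact self-adjusting rule are omitted.

        # Toy spatial public goods game with punishing cooperators on a periodic lattice.
        import random

        SIZE, R, FINE, COST = 20, 3.5, 1.0, 0.3
        # strategies: 0 = defector, 1 = cooperator, 2 = punishing cooperator
        grid = [[random.randrange(3) for _ in range(SIZE)] for _ in range(SIZE)]

        def neighbours(i, j):
            return [((i - 1) % SIZE, j), ((i + 1) % SIZE, j),
                    (i, (j - 1) % SIZE), (i, (j + 1) % SIZE)]

        def payoff(i, j):
            group = [(i, j)] + neighbours(i, j)
            coops = sum(1 for x, y in group if grid[x][y] != 0)
            p = R * coops / len(group)                    # equal share of the public good
            if grid[i][j] != 0:
                p -= 1.0                                  # cooperators pay the contribution
            punishers = sum(1 for x, y in group if grid[x][y] == 2)
            defectors = sum(1 for x, y in group if grid[x][y] == 0)
            if grid[i][j] == 0:
                p -= FINE * punishers                     # defectors are fined
            elif grid[i][j] == 2:
                p -= COST * defectors                     # punishers pay the punishment cost
            return p

        for _ in range(5000):                             # simple imitation dynamics
            i, j = random.randrange(SIZE), random.randrange(SIZE)
            x, y = random.choice(neighbours(i, j))
            if payoff(x, y) > payoff(i, j):
                grid[i][j] = grid[x][y]

        frac = sum(grid[i][j] != 0 for i in range(SIZE) for j in range(SIZE)) / SIZE ** 2
        print("cooperator fraction:", frac)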

  17. Design and kinetic analysis of piezoelectric energy harvesters with self-adjusting resonant frequency

    Science.gov (United States)

    Yu-Jen, Wang; Tsung-Yi, Chuang; Jui-Hsin, Yu

    2017-09-01

    Vibration-based energy harvesters have been developed as power sources for wireless sensor networks. Because the ambient vibration frequency varies with the surrounding conditions, designing an adaptive energy harvester is a practical challenge. This paper proposes a design for a piezoelectric energy harvester that can self-adjust its resonant frequency in rotational environments. The effective length of a trapezoidal cantilever is extended by the centrifugal force from a rotating wheel, varying its area moment of inertia. The analytical solution for the natural frequency of the piezoelectric energy harvester was derived as part of the parameter design process, which can specify a structure that approaches resonance at any wheel rotation frequency. The kinetic equation and the electrical damping induced by power generation were derived from a Lagrange method and a mechanical-electrical coupling model, respectively. An energy harvester with adequate parameters can generate power over a wide range of car speeds. The output power of an experimental prototype composed of piezoelectric thin films and connected to a 3.3 MΩ external resistor was approximately 70-140 μW at wheel speeds ranging from 200 to 700 RPM. These results demonstrate that the proposed piezoelectric energy harvester can be applied as a power source for a wireless tire pressure monitoring sensor.

  18. Computational reduction of specimen noise to enable improved thermography characterization of flaws in graphite polymer composites

    Science.gov (United States)

    Winfree, William P.; Howell, Patricia A.; Zalameda, Joseph N.

    2014-05-01

    Flaw detection and characterization with thermographic techniques in graphite polymer composites are often limited by localized variations in the thermographic response. Variations in properties such as acceptable porosity, fiber volume content and surface polymer thickness cause significant variations in the initial thermal response. These result in a "noise" floor that increases the difficulty of detecting and characterizing deeper flaws. A method is presented for computationally removing a significant amount of the "noise" from near-surface porosity by diffusing the early-time response, then subtracting it from subsequent responses. Simulations of the thermal response of a composite are utilized in defining the limitations of the technique. This method for reducing the data is shown to give considerable improvement in characterizing both the size and depth of damage. Examples are shown for data acquired on specimens with fabricated delaminations and impact damage.
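    The described reduction step (diffuse an early-time frame and subtract it from later frames) can be sketched as follows. The Gaussian blur width and the 1/sqrt(t) amplitude scaling are assumptions made for illustration, not the authors' exact procedure.

        # Sketch: suppress near-surface "noise" in a thermography sequence by blurring
        # an early frame and subtracting a scaled copy of it from every later frame.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def remove_surface_noise(frames, times, early_index=2, sigma=3.0):
            """frames: array of shape (time, y, x); times: acquisition times (> 0)."""
            reference = gaussian_filter(frames[early_index], sigma)  # diffused early response
            cleaned = np.empty_like(frames)
            for k, t in enumerate(times):
                scale = np.sqrt(times[early_index] / t)  # 1-D diffusion decays roughly as 1/sqrt(t)
                cleaned[k] = frames[k] - scale * reference
            return cleaned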

  19. Computational Reduction of Specimen Noise to Enable Improved Thermography Characterization of Flaws in Graphite Polymer Composites

    Science.gov (United States)

    Winfree, William P.; Howell, Patricia A.; Zalameda, Joseph N.

    2014-01-01

    Flaw detection and characterization with thermographic techniques in graphite polymer composites are often limited by localized variations in the thermographic response. Variations in properties such as acceptable porosity, fiber volume content and surface polymer thickness cause significant variations in the initial thermal response. These result in a "noise" floor that increases the difficulty of detecting and characterizing deeper flaws. A method is presented for computationally removing a significant amount of the "noise" from near-surface porosity by diffusing the early-time response, then subtracting it from subsequent responses. Simulations of the thermal response of a composite are utilized in defining the limitations of the technique. This method for reducing the data is shown to give considerable improvement in characterizing both the size and depth of damage. Examples are shown for data acquired on specimens with fabricated delaminations and impact damage.

  20. A FPGA-based Network Interface Card with GPUDirect enabling realtime GPU computing in HEP experiments

    CERN Document Server

    Lonardo, Alessandro; Ammendola, Roberto; Biagioni, Andrea; Cotta Ramusino, Angelo; Fiorini, Massimiliano; Frezza, Ottorino; Lamanna, Gianluca; Lo Cicero, Francesca; Martinelli, Michele; Neri, Ilaria; Paolucci, Pier Stanislao; Pastorelli, Elena; Pontisso, Luca; Rossetti, Davide; Simeone, Francesco; Simula, Francesco; Sozzi, Marco; Tosoratto, Laura; Vicini, Piero

    2015-01-01

    The capability of processing high-bandwidth data streams in real time is a computational requirement common to many High Energy Physics experiments. Keeping the latency of the data transport tasks under control is essential in order to meet this requirement. We present NaNet, an FPGA-based PCIe Network Interface Card design featuring Remote Direct Memory Access towards CPU and GPU memories plus a transport protocol offload module characterized by cycle-accurate upper-bound handling. The combination of these two features allows the OS and the application to be relieved almost entirely of data transfer management, minimizing the unavoidable jitter effects associated with OS process scheduling. The design currently supports one GbE (1000Base-T) and three custom 34 Gbps APElink I/O channels, but four-channel 10GbE (10GBase-R) and 2.5 Gbps deterministic-latency KM3link versions are being implemented. Two use cases of NaNet will be discussed: the GPU-based low level trigger for the RICH detector in the NA62 experiment an...

  1. SciDAC-Data, A Project to Enable Data Driven Modeling of Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mubarak, M.; Ding, P.; Aliaga, L.; Tsaris, A.; Norman, A.; Lyon, A.; Ross, R.

    2016-10-10

    The SciDAC-Data project is a DOE funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab Data Center on the organization, movement, and consumption of High Energy Physics data. The project will analyze the analysis patterns and data organization that have been used by the NOvA, MicroBooNE, MINERvA and other experiments to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership-class exascale computing facilities. We will address the use of the SciDAC-Data distributions, acquired from the Fermilab Data Center's analysis workflows and corresponding to around 71,000 HEP jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in HPC environments. In particular, we describe in detail how the Sequential Access via Metadata (SAM) data handling system, in combination with the dCache/Enstore-based data archive facilities, has been analyzed to develop radically different models of HEP data analysis. We present how the simulation may be used to analyze the impact of design choices in archive facilities.

  2. Late enhanced computed tomography in Hypertrophic Cardiomyopathy enables accurate left-ventricular volumetry

    Energy Technology Data Exchange (ETDEWEB)

    Langer, Christoph; Lutz, M.; Kuehl, C.; Frey, N. [Christian-Albrechts-Universitaet Kiel, Department of Cardiology, Angiology and Critical Care Medicine, University Medical Center Schleswig-Holstein (Germany); Partner Site Hamburg/Kiel/Luebeck, DZHK (German Centre for Cardiovascular Research), Kiel (Germany); Both, M.; Sattler, B.; Jansen, O; Schaefer, P. [Christian-Albrechts-Universitaet Kiel, Department of Diagnostic Radiology, University Medical Center Schleswig-Holstein (Germany); Harders, H.; Eden, M. [Christian-Albrechts-Universitaet Kiel, Department of Cardiology, Angiology and Critical Care Medicine, University Medical Center Schleswig-Holstein (Germany)

    2014-10-15

    Late enhancement (LE) multi-slice computed tomography (leMDCT) was introduced for the visualization of (intra-) myocardial fibrosis in Hypertrophic Cardiomyopathy (HCM). LE is associated with adverse cardiac events. This analysis focuses on leMDCT-derived LV muscle mass (LV-MM), which may be related to LE and thus yield an LE proportion for potential risk stratification in HCM. N=26 HCM patients underwent leMDCT (64-slice CT) and cardiovascular magnetic resonance (CMR). In leMDCT, iodine contrast (Iopromid, 350 mg/mL; 150 mL) was injected 7 minutes before imaging. Reconstructed short cardiac axis views served for planimetry. The study group was divided into three groups of varying LV contrast. LeMDCT was correlated with CMR. The mean age was 64.2 ± 14 years. The groups of varying contrast differed in weight and body mass index (p < 0.05). In the group with good LV contrast, assessment of LV-MM resulted in 147.4 ± 64.8 g in leMDCT vs. 147.1 ± 65.9 g in CMR (p > 0.05). In the group with sufficient contrast, LV-MM appeared as 172 ± 30.8 g in leMDCT vs. 165.9 ± 37.8 g in CMR (p > 0.05). Overall intra-/inter-observer variability of the semiautomatic assessment of LV-MM showed an accuracy of 0.9 ± 8.6 g and 0.8 ± 9.2 g in leMDCT. All leMDCT measures correlated well with CMR (r > 0.9). LeMDCT, primarily performed for LE visualization in HCM, allows for accurate LV volumetry including LV-MM in > 90 % of the cases. (orig.)

  3. Numerical Nuclear Second Derivatives on a Computing Grid: Enabling and Accelerating Frequency Calculations on Complex Molecular Systems.

    Science.gov (United States)

    Yang, Tzuhsiung; Berry, John F

    2018-06-04

    The computation of nuclear second derivatives of the energy, or the nuclear Hessian, is an essential routine in quantum chemical investigations of ground and transition states, thermodynamic calculations, and molecular vibrations. Analytic nuclear Hessian computations require the resolution of costly coupled-perturbed self-consistent field (CP-SCF) equations, while numerical differentiation of analytic first derivatives has an unfavorable 6N (N = number of atoms) prefactor. Herein, we present a new method in which grid computing is used to accelerate and/or enable the evaluation of the nuclear Hessian via numerical differentiation: NUMFREQ@Grid. Nuclear Hessians were successfully evaluated by NUMFREQ@Grid at the DFT level as well as using RIJCOSX-ZORA-MP2 or RIJCOSX-ZORA-B2PLYP for a set of linear polyacenes with systematically increasing size. For the larger members of this group, NUMFREQ@Grid was found to outperform the wall clock time of analytic Hessian evaluation; at the MP2 or B2PLYP levels, these Hessians cannot even be evaluated analytically. We also evaluated a 156-atom catalytically relevant open-shell transition metal complex and found that NUMFREQ@Grid is faster (7.7 times shorter wall clock time) and less demanding (4.4 times less memory requirement) than an analytic Hessian. Capitalizing on the capabilities of parallel grid computing, NUMFREQ@Grid can outperform analytic methods in terms of wall time, memory requirements, and treatable system size. The NUMFREQ@Grid method presented herein demonstrates how grid computing can be used to facilitate embarrassingly parallel computational procedures and is a pioneer for future implementations.
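    The numerical scheme the abstract describes, central differences of analytic gradients, is easy to sketch; each displaced-gradient evaluation is independent, which is what makes the procedure embarrassingly parallel. The sketch below is illustrative and is not the NUMFREQ@Grid implementation.

        # Numerical Hessian by central differences of analytic gradients. Each of the
        # 2*(3N) displaced-gradient evaluations is independent and could be farmed out
        # to separate grid workers.
        import numpy as np

        def numerical_hessian(gradient, coords, step=1.0e-3):
            """gradient: callable returning dE/dx for flattened coordinates (length 3N)."""
            n = coords.size
            hessian = np.zeros((n, n))
            for i in range(n):
                plus = coords.copy()
                plus[i] += step
                minus = coords.copy()
                minus[i] -= step
                hessian[i] = (gradient(plus) - gradient(minus)) / (2.0 * step)
            return 0.5 * (hessian + hessian.T)      # symmetrize to reduce numerical noise

        # toy check on E = x^2 + 2*y^2, whose gradient is (2x, 4y) and Hessian diag(2, 4)
        grad = lambda r: np.array([2.0 * r[0], 4.0 * r[1]])
        print(numerical_hessian(grad, np.array([0.3, -0.1])))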

  4. Malavefes: A computational voice-enabled malaria fuzzy informatics software for correct dosage prescription of anti-malarial drugs

    Directory of Open Access Journals (Sweden)

    Olugbenga O. Oluwagbemi

    2018-04-01

    Full Text Available Malaria is one of the infectious diseases consistently endemic in many Sub-Saharan African countries. Among the issues of concern are the consequences of wrong diagnosis and dosage administration of anti-malarial drugs to sick patients; these have resulted in various degrees of complications ranging from severe headaches, stomach and body discomfort, blurred vision, dizziness and hallucinations to, in extreme cases, death. Many expert systems have been developed to support the diagnosis of different infectious diseases, but we are not aware of any that has been specifically designed as a voice-based application to diagnose and translate malaria patients' symptomatic data for pre-laboratory screening and correct prescription of the proper dosage of the appropriate medication. We developed Malavefes (a malaria voice-enabled computational fuzzy expert system for correct dosage prescription of anti-malarial drugs) using the Visual Basic.NET and Java programming languages. Data collation for this research was conducted by surveying existing literature and interviewing public health experts. The database for this malaria drug informatics system was implemented using Microsoft Access. The Root Sum Square (RSS) was implemented as the inference engine of Malavefes to make inferences from rules, while Centre of Gravity (CoG) was implemented as the defuzzification engine. The drug recommendation module was voice-enabled. Additional anti-malaria drug expiration validation software was developed using the Java programming language. We conducted a user evaluation of the performance and user experience of the Malavefes software. Keywords: Informatics, Bioinformatics, Fuzzy, Anti-malaria, Voice computing, Dosage prescription
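    The two inference steps named in the abstract, Root Sum Square (RSS) aggregation of fired rules followed by Centre of Gravity (CoG) defuzzification, are sketched below. The membership functions, firing strengths and dosage axis are hypothetical; this is not the Malavefes code.

        # RSS aggregation of rule strengths, then centre-of-gravity defuzzification
        # over a hypothetical dosage scale.
        import numpy as np

        dose = np.linspace(0.0, 1000.0, 201)              # hypothetical dose axis (mg)

        def tri(x, a, b, c):                              # triangular membership function
            return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

        output_sets = {"low": tri(dose, 0, 200, 400),
                       "medium": tri(dose, 300, 500, 700),
                       "high": tri(dose, 600, 800, 1000)}

        # firing strengths of the rules that concluded each output set (hypothetical)
        fired = {"low": [0.2, 0.1], "medium": [0.7, 0.4], "high": [0.3]}

        # Root Sum Square combines several rules that share the same conclusion
        strength = {k: min(1.0, np.sqrt(sum(s * s for s in v))) for k, v in fired.items()}
        aggregate = np.maximum.reduce([np.minimum(strength[k], output_sets[k])
                                       for k in output_sets])

        cog = np.sum(dose * aggregate) / np.sum(aggregate)   # centre of gravity
        print(f"recommended dose ~ {cog:.0f} mg")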

  5. A high PSRR Class-D audio amplifier IC based on a self-adjusting voltage reference

    OpenAIRE

    Huffenus , Alexandre; Pillonnet , Gaël; Abouchi , Nacer; Goutti , Frédéric; Rabary , Vincent; Cittadini , Robert

    2010-01-01

    In a wide range of applications, audio amplifiers require a larger Power Supply Rejection Ratio (PSRR) than the current Class-D architecture can reach. This paper proposes a self-adjusting internal voltage reference scheme that sets the bias voltages of the amplifier without losing output dynamics. This solution relaxes the constraints on gain and feedback resistor matching, which was previously the limiting factor for the PSRR. Theory of operation, design and IC ...

  6. FLUKA-LIVE-an embedded framework, for enabling a computer to execute FLUKA under the control of a Linux OS

    International Nuclear Information System (INIS)

    Cohen, A.; Battistoni, G.; Mark, S.

    2008-01-01

    This paper describes a Linux-based OS framework for integrating the FLUKA Monte Carlo software (currently distributed only for Linux) into a CD-ROM, resulting in a complete environment in which a scientist can edit, link and run FLUKA routines without the need to install a UNIX/Linux operating system. The building process includes generating from scratch a complete operating system distribution which will, when operative, build all components necessary for successful operation of the FLUKA software and libraries. Various source packages, as well as the latest kernel sources, are freely available from the Internet. These sources are used to create a functioning Linux system that integrates several core utilities in line with the main idea: enabling FLUKA to act as if it were running under a popular Linux distribution or even a proprietary UNIX workstation. On boot-up a file system is created and the contents of the CD are uncompressed and completely loaded into RAM, after which the presence of the CD is no longer necessary and it can be removed for use on a second computer. The system can operate on any i386 PC as long as it can boot from a CD

  7. A self-adjustable four-point probing system using polymeric three dimensional coils and non-toxic liquid metal

    Energy Technology Data Exchange (ETDEWEB)

    Oyunbaatar, Nomin-Erdene; Choi, Young Soo; Lee, Dong-Weon, E-mail: mems@jnu.ac.kr [MEMS and Nanotechnology Laboratory, School of Mechanical Engineering, Chonnam National University, Gwangju 500757 (Korea, Republic of)

    2015-12-15

    This paper describes a self-adjustable four-point probe (S4PP) system with a square configuration. The S4PP system consists of 3D polymer coil springs for the independent operation of each tungsten (W) probe, microfluidic channels filled with a non-toxic liquid metal, and a LabView-based control system. The 3D coil springs, made of PMMA, are fabricated with a 3D printer and are positioned in a small container filled with the non-toxic liquid metal. This unique configuration allows independent self-adjustment of the probe heights for precise measurements of the electrical properties of both flexible and large-step-height microsamples. The feasibility of the fabricated S4PP system is evaluated by measuring the specific resistance of Cr and Au thin films deposited on silicon wafers. The system is then employed to evaluate the electrical properties of a Au thin film deposited onto a flexible and easily breakable silicon diaphragm (spring constant: ∼3.6 × 10⁻⁵ N/m). The resistance of Cr thin films (thickness: 450 nm) with step heights of 60 and 90 μm is also successfully characterized. These experimental results indicate that the proposed S4PP system can be applied to common metals and semiconductors as well as flexible and large-step-height samples.
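    As a rough illustration of how a specific (sheet) resistance follows from such a measurement: for an ideal square probe array on a laterally infinite thin film, with current driven through two adjacent probes and voltage sensed across the other two, the textbook geometric factor is 2π/ln 2. The numbers and the idealized factor below are assumptions for illustration, not the calibration used by the authors.

        # Thin-film sheet resistance and resistivity from a square four-point measurement,
        # assuming the ideal-geometry factor 2*pi/ln(2) for adjacent-probe excitation.
        import math

        def sheet_resistance(voltage_v, current_a):
            return (2.0 * math.pi / math.log(2.0)) * voltage_v / current_a   # ohms per square

        def resistivity(voltage_v, current_a, thickness_m):
            return sheet_resistance(voltage_v, current_a) * thickness_m      # ohm·metres

        # hypothetical reading on a 450 nm film
        print(resistivity(voltage_v=1.2e-3, current_a=1.0e-3, thickness_m=450e-9))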

  8. A self-adjustable four-point probing system using polymeric three dimensional coils and non-toxic liquid metal

    International Nuclear Information System (INIS)

    Oyunbaatar, Nomin-Erdene; Choi, Young Soo; Lee, Dong-Weon

    2015-01-01

    This paper describes a self-adjustable four-point probe (S4PP) system with a square configuration. The S4PP system consists of 3D polymer coil springs for the independent operation of each tungsten (W) probe, microfluidic channels filled with a non-toxic liquid metal, and a LabView-based control system. The 3D coil springs, made of PMMA, are fabricated with a 3D printer and are positioned in a small container filled with the non-toxic liquid metal. This unique configuration allows independent self-adjustment of the probe heights for precise measurements of the electrical properties of both flexible and large-step-height microsamples. The feasibility of the fabricated S4PP system is evaluated by measuring the specific resistance of Cr and Au thin films deposited on silicon wafers. The system is then employed to evaluate the electrical properties of a Au thin film deposited onto a flexible and easily breakable silicon diaphragm (spring constant: ∼3.6 × 10⁻⁵ N/m). The resistance of Cr thin films (thickness: 450 nm) with step heights of 60 and 90 μm is also successfully characterized. These experimental results indicate that the proposed S4PP system can be applied to common metals and semiconductors as well as flexible and large-step-height samples.

  9. Developing Long-Term Computing Skills among Low-Achieving Students via Web-Enabled Problem-Based Learning and Self-Regulated Learning

    Science.gov (United States)

    Tsai, Chia-Wen; Lee, Tsang-Hsiung; Shen, Pei-Di

    2013-01-01

    Many private vocational schools in Taiwan have taken to enrolling students with lower levels of academic achievement. The authors re-designed a course and conducted a series of quasi-experiments to develop students' long-term computing skills, and examined the longitudinal effects of web-enabled, problem-based learning (PBL) and self-regulated…

  10. Computer Enabled Neuroplasticity Treatment: a Clinical Trial of a Novel Design for Neurofeedback Therapy in Adult ADHD

    Directory of Open Access Journals (Sweden)

    Benjamin eCowley

    2016-05-01

    analysis will be reported elsewhere. Trial Registration: Computer Enabled Neuroplasticity Treatment (CENT), ISRCTN13915109. Partly funded by Finnish science agency TEKES, project #440078.

  11. Identifying the contents of a type 1 diabetes outpatient care program based on the self-adjustment of insulin using the Delphi method.

    Science.gov (United States)

    Kubota, Mutsuko; Shindo, Yukari; Kawaharada, Mariko

    2014-10-01

    The objective of this study is to identify the items necessary for an outpatient care program based on the self-adjustment of insulin for type 1 diabetes patients. Two surveys based on the Delphi method were conducted. The survey participants were 41 certified diabetes nurses in Japan. An outpatient care program based on the self-adjustment of insulin was developed based on pertinent published work and expert opinions. The questionnaire, developed from the care program mentioned earlier, contained a total of 87 survey items covering matters such as the establishment of prerequisites and a cooperative relationship, the basics of blood glucose pattern management, learning and practice sessions for the self-adjustment of insulin, the implementation of the self-adjustment of insulin, and feedback. Approval of an item was defined as agreement by at least 70% of participants. Participants agreed on all of the items in the first survey. Four new parameters were added to make a total of 91 items for the second survey, and participants agreed on the inclusion of 84 of them. Items necessary for a type 1 diabetes outpatient care program based on the self-adjustment of insulin were subsequently selected. It is believed that this care program received fairly strong approval from certified diabetes nurses; however, it will be necessary to evaluate the program further in conjunction with intervention studies in the future. © 2014 The Authors. Japan Journal of Nursing Science © 2014 Japan Academy of Nursing Science.

  12. The cloud services innovation platform- enabling service-based environmental modelling using infrastructure-as-a-service cloud computing

    Science.gov (United States)

    Service-oriented architectures allow modelling engines to be hosted over the Internet, abstracting physical hardware configuration and software deployments away from model users. Many existing environmental models are deployed as desktop applications running on users' personal computers (PCs). Migration ...

  13. Evaluation of the incidence of microcracks caused by Mtwo and ProTaper Next rotary file systems versus the self-adjusting file: A scanning electron microscopic study.

    Science.gov (United States)

    Saha, Suparna Ganguly; Vijaywargiya, Neelam; Saxena, Divya; Saha, Mainak Kanti; Bharadwaj, Anuj; Dubey, Sandeep

    2017-01-01

    To evaluate the incidence of microcrack formation during canal preparation with two rotary nickel-titanium systems, Mtwo and ProTaper Next, along with the self-adjusting file system. One hundred and twenty mandibular premolar teeth were selected. Standardized access cavities were prepared and the canals were manually prepared up to size 20 after coronal preflaring. The teeth were divided into three experimental groups and one control group (n = 30). Group 1: The canals were prepared using Mtwo rotary files. Group 2: The canals were prepared with ProTaper Next files. Group 3: The canals were prepared with self-adjusting files. Group 4: The canals were unprepared and used as a control. The roots were sectioned horizontally 3, 6, and 9 mm from the apex and examined under a scanning electron microscope to check for the presence of microcracks. Pearson's Chi-square test was applied. The highest incidence of microcracks was associated with the ProTaper Next group, 80% (P = 0.00), followed by the Mtwo group, 70% (P = 0.000), and the fewest microcracks were noted in the self-adjusting file group, 10% (P = 0.068). No significant difference was found between the ProTaper Next and Mtwo groups (P = 0.368), while a significant difference was observed between the ProTaper Next and self-adjusting file groups (P = 0.000) as well as the Mtwo and self-adjusting file groups (P = 0.000). All nickel-titanium rotary instrument systems were associated with microcracks. However, the self-adjusting file system had significantly fewer microcracks when compared with the Mtwo and ProTaper Next.

  14. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    Energy Technology Data Exchange (ETDEWEB)

    Jablonowski, Christiane [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway to modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed to serve as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically adaptive grids that can capture flow fields of interest, such as tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project

  15. Efficiency of the Self Adjusting File, WaveOne, Reciproc, ProTaper and hand files in root canal debridement.

    Science.gov (United States)

    Topcu, K Meltem; Karatas, Ertugrul; Ozsu, Damla; Ersoy, Ibrahim

    2014-07-01

    The aim of this study was to compare the canal debridement capabilities of three single-file systems, ProTaper, and K-files in oval-shaped canals. Seventy-five extracted human mandibular central incisors with oval-shaped root canals were selected. A radiopaque contrast medium (Metapex; Meta Biomed Co. Ltd., Chungcheongbuk-do, Korea) was introduced into the canal systems and the self-adjusting file (SAF), WaveOne, Reciproc, ProTaper, and K-files were used for the instrumentation of the canals. The percentage of removed contrast medium was calculated using pre- and post-operative radiographs. An overall comparison between the groups revealed that the hand file (HF) and SAF groups presented the lowest percentage of removed contrast medium, whereas the WaveOne group showed the highest percentage (P < 0.05). The ProTaper group removed more contrast medium than the SAF and HF groups (P < 0.05). None of the instruments was able to remove the contrast medium completely. WaveOne performed significantly better than the other groups.

  16. Influence of apical enlargement and complementary canal preparation with the Self-Adjusting File on endotoxin reduction in retreatment cases.

    Science.gov (United States)

    Silva, E J N L; Ferreira, V M; Silva, C C; Herrera, D R; De-Deus, G; Gomes, B P

    2017-07-01

    To compare the effectiveness of large apical preparations and complementary canal preparation with the Self-Adjusting File (SAF) in removing endotoxins from the root canals of teeth with apical periodontitis. Ten single-rooted and single-canaled teeth with post-treatment apical periodontitis were selected. Endotoxin samples were taken after removal of the root filling (S1), after chemomechanical preparation (CMP) using 2.5% NaOCl and an R25 file (S2), after CMP using 2.5% NaOCl and an R40 file (S3) and after complementary CMP using the SAF system (S4). Limulus amebocyte lysate (LAL) was used to measure endotoxin levels. The Friedman and Wilcoxon tests were used to compare endotoxin levels at each clinical intervention (P < 0.05). Preparation with the R25 file was able to significantly reduce endotoxin levels, as was further enlargement with the R40 file (P < 0.05); complementary preparation with the SAF did not produce an additional significant reduction (P > 0.05) following the use of the R40 instrument. Apical enlargement protocols were effective in significantly reducing endotoxin levels. Complementary preparation with the SAF system failed to eliminate residual endotoxin contents beyond those obtained with the R40 instrument. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  17. Efficacy of Two Irrigants Used with Self-Adjusting File System on Smear Layer: A Scanning Electron Microscopy Study.

    Science.gov (United States)

    Genç Şen, Özgür; Kaya, Sadullah; Er, Özgür; Alaçam, Tayfun

    2014-01-01

    Mechanical instrumentation of root canals produces a smear layer that adversely affects the root canal seal. The aim of this study was to evaluate the efficacy of MTAD and citric acid solutions used with the self-adjusting file (SAF) system on the smear layer. Twenty-three single-rooted human teeth were used for the study. Canals were instrumented manually up to a number 20 K file size. The SAF was used to prepare the root canals. The following groups were studied: Group 1: MTAD + 5.25% NaOCl, Group 2: 20% citric acid + 5.25% NaOCl, and Group 3: Control (5.25% NaOCl). All roots were split longitudinally and subjected to scanning electron microscopy. The presence of smear layer in the coronal, middle, and apical thirds was evaluated using a five-score evaluation system. Kruskal-Wallis and Mann-Whitney U tests were used for statistical analysis. In the coronal third, Group 2 exhibited the best results and was statistically different from the other groups (P < 0.05). The solutions used in Groups 1 and 2 could effectively remove the smear layer in most of the specimens. However, citric acid was more effective than MTAD in all three thirds of the canal.

  18. Neuron splitting in compute-bound parallel network simulations enables runtime scaling with twice as many processors.

    Science.gov (United States)

    Hines, Michael L; Eichner, Hubert; Schürmann, Felix

    2008-08-01

    Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing.
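    The load-balance benefit of splitting large cells can be illustrated with a simple longest-processing-time assignment; the per-cell workloads below are hypothetical and the sketch does not model NEURON's actual tree splitting or communication.

        # Greedy (longest-processing-time) assignment of cell workloads to processors,
        # comparing whole cells against cells split into two equal halves.
        import heapq

        def assign(pieces, n_procs):
            """Return the per-processor load after greedy LPT assignment."""
            loads = [0.0] * n_procs
            heap = [(0.0, p) for p in range(n_procs)]
            heapq.heapify(heap)
            for w in sorted(pieces, reverse=True):
                load, p = heapq.heappop(heap)
                loads[p] = load + w
                heapq.heappush(heap, (loads[p], p))
            return loads

        cells = [10.0, 9.0, 1.0, 1.0, 1.0, 1.0]          # hypothetical per-cell costs
        print("max load, whole cells:", max(assign(cells, 4)))
        print("max load, split cells:", max(assign([w / 2 for w in cells for _ in (0, 1)], 4)))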

  19. High performance computing enabling exhaustive analysis of higher order single nucleotide polymorphism interaction in Genome Wide Association Studies.

    Science.gov (United States)

    Goudey, Benjamin; Abedini, Mani; Hopper, John L; Inouye, Michael; Makalic, Enes; Schmidt, Daniel F; Wagner, John; Zhou, Zeyu; Zobel, Justin; Reumann, Matthias

    2015-01-01

    Genome-wide association studies (GWAS) are a common approach for systematic discovery of single nucleotide polymorphisms (SNPs) which are associated with a given disease. Univariate analysis approaches commonly employed may miss important SNP associations that only appear through multivariate analysis in complex diseases. However, multivariate SNP analysis is currently limited by its inherent computational complexity. In this work, we present a computational framework that harnesses supercomputers. Based on our results, we estimate that a three-way interaction analysis of 1.1 million SNP GWAS data would require over 5.8 years on the full "Avoca" IBM Blue Gene/Q installation at the Victorian Life Sciences Computation Initiative. This is hundreds of times faster than estimates for other CPU-based methods and four times faster than runtimes estimated for GPU methods, indicating how the improvement in the level of hardware applied to interaction analysis may alter the types of analysis that can be performed. Furthermore, the same analysis would take under 3 months on the currently largest IBM Blue Gene/Q supercomputer "Sequoia" at the Lawrence Livermore National Laboratory, assuming linear scaling is maintained as our results suggest. Given that the implementation used in this study can be further optimised, this runtime means it is becoming feasible to carry out exhaustive analysis of higher order interaction studies on large modern GWAS.
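    The combinatorial load behind an exhaustive three-way analysis, and the natural way to deal SNP triples out to workers, can be sketched as follows; the association test is a placeholder and the round-robin partitioning is an illustrative assumption, not the framework's actual kernel.

        # Exhaustive three-way SNP enumeration: count the tests and partition them.
        from itertools import combinations, islice
        from math import comb

        n_snps = 1_100_000
        print(f"triples to test: {comb(n_snps, 3):.3e}")   # roughly 2.2e17 for 1.1M SNPs

        def my_share(snp_ids, rank, n_workers):
            """Deal SNP triples round-robin so each worker gets a disjoint, equal share."""
            return islice(combinations(snp_ids, 3), rank, None, n_workers)

        def association_test(triple, genotypes, phenotype):
            return 0.0                                      # placeholder statistic

        for triple in islice(my_share(range(100), rank=0, n_workers=8), 5):
            print(triple)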

  20. Using computers to enable self-management of aphasia therapy exercises for word finding: the patient and carer perspective.

    Science.gov (United States)

    Palmer, Rebecca; Enderby, Pam; Paterson, Gail

    2013-01-01

    Speech and language therapy (SLT) for aphasia can be difficult to access in the later stages of stroke recovery, despite evidence of continued improvement with sufficient therapeutic intensity. Computerized aphasia therapy has been reported to be useful for independent language practice, providing new opportunities for continued rehabilitation. The success of this option depends on its acceptability to patients and carers. To investigate factors that affect the acceptability of independent home computerized aphasia therapy practice. An acceptability study of computerized therapy was carried out alongside a pilot randomized controlled trial of computer aphasia therapy versus usual care for people more than 6 months post-stroke. Following language assessment and computer exercise prescription by a speech and language therapist, participants practised three times a week for 5 months at home with monthly volunteer support. Semi-structured interviews were conducted with 14 participants who received the intervention and ten carers (n = 24). Questions from a topic guide were presented and answered using picture, gesture and written support. Interviews were audio recorded, transcribed verbatim and analysed thematically. Three research SLTs identified and cross-checked themes and subthemes emerging from the data. The key themes that emerged were benefits and disadvantages of computerized aphasia therapy, need for help and support, and comparisons with face-to-face therapy. The independence, flexibility and repetition afforded by the computer were viewed as beneficial and the personalized exercises motivated participants to practise. Participants and carers perceived improvements in word-finding and confidence in talking. Computer practice could cause fatigue and interference with other commitments. Support from carers or volunteers for motivation and technical assistance was seen as important. Although some participants preferred face-to-face therapy, using a computer for

  1. Reduced-order modeling (ROM) for simulation and optimization: powerful algorithms as key enablers for scientific computing

    CERN Document Server

    Milde, Anja; Volkwein, Stefan

    2018-01-01

    This edited monograph collects research contributions and addresses the advancement of efficient numerical procedures in the area of model order reduction (MOR) for simulation, optimization and control. The topical scope includes, but is not limited to, new out-of-the-box algorithmic solutions for scientific computing, e.g. reduced basis methods for industrial problems and MOR approaches for electrochemical processes. The target audience comprises research experts and practitioners in the field of simulation, optimization and control, but the book may also be beneficial for graduate students.

  2. Final report and documentation for the security enabled programmable switch for protection of distributed internetworked computers LDRD.

    Energy Technology Data Exchange (ETDEWEB)

    Van Randwyk, Jamie A.; Robertson, Perry J.; Durgin, Nancy Ann; Toole, Timothy J.; Kucera, Brent D.; Campbell, Philip LaRoche; Pierson, Lyndon George

    2010-02-01

    An increasing number of corporate security policies make it desirable to push security closer to the desktop. It is not practical or feasible to place security and monitoring software on all computing devices (e.g. printers, personal digital assistants, copy machines, legacy hardware). We have begun to prototype a hardware and software architecture that will enforce security policies by pushing security functions closer to the end user, whether in the office or home, without interfering with users' desktop environments. We are developing a specialized programmable Ethernet network switch to achieve this. Embodied in this device is the ability to detect and mitigate network attacks that would otherwise disable or compromise the end user's computing nodes. We call this device a 'Secure Programmable Switch' (SPS). The SPS is designed with the ability to be securely reprogrammed in real time to counter rapidly evolving threats such as fast moving worms, etc. This ability to remotely update the functionality of the SPS protection device is cryptographically protected from subversion. With this concept, the user cannot turn off or fail to update virus scanning and personal firewall filtering in the SPS device as he/she could if implemented on the end host. The SPS concept also provides protection to simple/dumb devices such as printers, scanners, legacy hardware, etc. This report also describes the development of a cryptographically protected processor and its internal architecture in which the SPS device is implemented. This processor executes code correctly even if an adversary holds the processor. The processor guarantees both the integrity and the confidentiality of the code: the adversary cannot determine the sequence of instructions, nor can the adversary change the instruction sequence in a goal-oriented way.

  3. Pseudo-real-time low-pass filter in ECG, self-adjustable to the frequency spectra of the waves.

    Science.gov (United States)

    Christov, Ivaylo; Neycheva, Tatyana; Schmid, Ramun; Stoyanov, Todor; Abächerli, Roger

    2017-09-01

    The electrocardiogram (ECG) acquisition is often accompanied by high-frequency electromyographic (EMG) noise. The noise is difficult to filter because its frequency spectrum overlaps considerably with the frequency spectrum of the ECG. Today, filters must conform to the new guidelines (2007) for low-pass filtering in ECG, with cutoffs of 150 Hz for adolescents and adults and 250 Hz for children. We suggest a pseudo-real-time low-pass filter that is self-adjustable to the frequency spectra of the ECG waves. The filter is based on the approximation procedure of Savitzky-Golay with a dynamic change in the cutoff frequency. The filter is implemented in pseudo-real time (real time with a certain delay). An additional option is automatic on/off triggering, depending on the presence or absence of EMG noise. The analysis of the proposed filter shows that the low-frequency components of the ECG (low-power P- and T-waves, PQ-, ST- and TP-segments) are filtered with a cutoff of 14 Hz, the high-power P- and T-waves are filtered with a cutoff frequency in the range of 20-30 Hz, and the high-frequency QRS complexes are filtered with a cutoff frequency higher than 100 Hz. The suggested dynamic filter satisfies the conflicting requirements for strong suppression of EMG noise and, at the same time, maximal preservation of the high-frequency components of the ECG.
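    The adaptive idea, a Savitzky-Golay smoother whose window length (and hence effective cutoff) follows the local signal content, can be sketched as below. The window lengths and the slope threshold used to detect QRS-like activity are assumptions for illustration, not the published parameters.

        # Savitzky-Golay smoothing with a window that adapts to local signal activity:
        # wide window (low cutoff) on slow segments, narrow window (high cutoff) on QRS.
        import numpy as np
        from scipy.signal import savgol_filter

        def adaptive_savgol(ecg, fs=500.0, slope_threshold=5.0):
            """ecg in mV sampled at fs Hz; returns the adaptively filtered signal."""
            smooth = savgol_filter(ecg, window_length=15, polyorder=2)   # for slope estimation only
            slope = np.abs(np.gradient(smooth)) * fs                    # mV/s, crude activity measure
            wide = savgol_filter(ecg, window_length=35, polyorder=2)    # low cutoff: P/T waves, segments
            narrow = savgol_filter(ecg, window_length=7, polyorder=2)   # high cutoff: QRS complexes
            return np.where(slope > slope_threshold, narrow, wide)

        # toy signal: a slow wave plus a narrow spike standing in for a QRS complex
        t = np.arange(0, 2.0, 1.0 / 500)
        ecg = 0.2 * np.sin(2 * np.pi * t)
        ecg[490:510] += np.hanning(20)
        cleaned = adaptive_savgol(ecg + 0.02 * np.random.randn(t.size))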

  4. Effectiveness of various irrigation activation protocols and the self-adjusting file system on smear layer and debris removal.

    Science.gov (United States)

    Çapar, İsmail Davut; Aydinbelge, Hale Ari

    2014-01-01

    The purpose of the present study is to evaluate smear layer generation and residual debris after using self-adjusting file (SAF) or rotary instrumentation and to compare the debris and smear layer removal efficacy of the SAF cleaning/shaping irrigation system against final agitation techniques. One hundred and eight maxillary lateral incisor teeth were randomly divided into nine experimental groups (n = 12), and root canals were prepared using ProTaper Universal rotary files, with the exception of the SAF instrumentation group. During instrumentation, root canals were irrigated with a total of 16 mL of 5% NaOCl. For final irrigation, rotary-instrumented groups were irrigated with 10 mL of 17% EDTA and 10 mL of 5% NaOCl using different irrigation agitation regimens (syringe irrigation with needles, NaviTip FX, manual dynamic irrigation, CanalBrush, EndoActivator, EndoVac, passive ultrasonic irrigation (PUI), and SAF irrigation). In the SAF instrumentation group, root canals were instrumented for 4 min at a rate of 4 mL/min with 5% NaOCl and received a final flush in the same manner as the syringe irrigation with needles group. The surface of the root dentin was observed using a scanning electron microscope. The SAF instrumentation group generated less smear layer and yielded cleaner canals compared to rotary instrumentation. The EndoActivator, EndoVac, PUI, and SAF irrigation groups increased the efficacy of the irrigating solutions on smear layer and debris removal. SAF instrumentation yielded cleaner canal walls when compared to rotary instrumentation. None of the techniques completely removed the smear layer from the root canal walls. © 2014 Wiley Periodicals, Inc.

  5. Computational Laboratory Astrophysics to Enable Transport Modeling of Protons and Hydrogen in Stellar Winds, the ISM, and other Astrophysical Environments

    Science.gov (United States)

    Schultz, David

    As recognized prominently by the APRA program, interpretation of NASA astrophysical mission observations requires significant products of laboratory astrophysics, for example, spectral lines and transition probabilities, and electron-, proton-, or heavy-particle collision data. Availability of these data underpins robust and validated models of astrophysical emissions and absorptions, energy, momentum, and particle transport, dynamics, and reactions. Therefore, measured or computationally derived, analyzed, and readily available laboratory astrophysics data significantly enhance the scientific return on NASA missions such as HST, Spitzer, and JWST. In the present work a comprehensive set of data will be developed for the ubiquitous proton-hydrogen and hydrogen-hydrogen collisions in astrophysical environments including ISM shocks, supernova remnants and bubbles, HI clouds, young stellar objects, and winds within stellar spheres, covering the necessary wide range of energy- and charge-changing channels, collision energies, and most relevant scattering parameters. In addition, building on preliminary work, a transport and reaction simulation will be developed incorporating the elastic and inelastic collision data collected and produced. The work will build upon significant previous efforts of the principal investigators and collaborators, will result in a comprehensive data set required for modeling these environments and interpreting NASA astrophysical mission observations, and will benefit from feedback from collaborators who are active users of the work proposed.

  6. An AmI-Based Software Architecture Enabling Evolutionary Computation in Blended Commerce: The Shopping Plan Application

    Directory of Open Access Journals (Sweden)

    Giuseppe D’Aniello

    2015-01-01

    Full Text Available This work describes an approach to synergistically exploit ambient intelligence technologies, mobile devices, and evolutionary computation in order to support blended-commerce or ubiquitous-commerce scenarios. The work proposes a software architecture consisting of three main components: linked data for e-commerce, cloud-based services, and mobile apps. The three components implement a scenario where a shopping mall is presented as an intelligent environment in which customers use the NFC capabilities of their smartphones to handle e-coupons produced, suggested, and consumed by that environment. The main function of the intelligent environment is to help customers define shopping plans, which minimize the overall shopping cost by looking for the best prices, discounts, and coupons. The paper proposes a genetic algorithm to find suboptimal solutions for the shopping plan problem in a highly dynamic context, where the final cost of a product for an individual customer depends on his or her previous purchases. In particular, the work provides details on the Shopping Plan software prototype and some experimental results showing the overall performance of the genetic algorithm.
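    A minimal genetic-algorithm sketch for a shopping-plan-style problem is shown below: each gene selects the shop that supplies one item, and fitness is the total cost. The price table, encoding and GA parameters are hypothetical, and the coupon and purchase-history dynamics described in the paper are omitted.

        # Toy GA: choose which shop supplies each item so that the total cost is minimal.
        import random

        prices = [                       # prices[item][shop], hypothetical values
            [3.0, 2.5, 3.2],
            [8.0, 9.5, 7.0],
            [1.5, 1.2, 1.4],
            [5.0, 4.0, 6.5],
        ]
        n_items, n_shops = len(prices), len(prices[0])

        def cost(plan):                  # plan[i] = index of the shop chosen for item i
            return sum(prices[i][s] for i, s in enumerate(plan))

        def evolve(pop_size=30, generations=60, mutation=0.1):
            population = [[random.randrange(n_shops) for _ in range(n_items)]
                          for _ in range(pop_size)]
            for _ in range(generations):
                population.sort(key=cost)
                parents = population[: pop_size // 2]          # truncation selection
                children = []
                while len(parents) + len(children) < pop_size:
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, n_items)
                    child = a[:cut] + b[cut:]                  # one-point crossover
                    if random.random() < mutation:
                        child[random.randrange(n_items)] = random.randrange(n_shops)
                    children.append(child)
                population = parents + children
            return min(population, key=cost)

        best = evolve()
        print(best, cost(best))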

  7. Technology-enabled academic detailing: computer-mediated education between pharmacists and physicians for evidence-based prescribing.

    Science.gov (United States)

    Ho, Kendall; Nguyen, Anne; Jarvis-Selinger, Sandra; Novak Lauscher, Helen; Cressman, Céline; Zibrik, Lindsay

    2013-09-01

    Academic detailing (AD) is the practice of specially trained pharmacists with detailed medication knowledge meeting with physicians to share best practices of prescribing. AD has demonstrated efficacy in positively influencing physicians' prescribing behavior. Nevertheless, a key challenge has been that physicians in rural and remote locations, or physicians who are time challenged, have limited ability to participate in face-to-face meetings with academic detailers, as these specially trained academic detailers are primarily urban-based and limited in numbers. To determine the feasibility of using information technologies to facilitate communication between academic detailers and physicians (known as Technology-Enabled Academic Detailing or TEAD) through a comparison to traditional face-to-face academic detailing (AD). Specifically, TEAD is compared to AD in terms of the ability to aid physicians in acquiring evidence-informed prescribing information on diabetes-related medications, measured in terms of time efficiency, satisfaction of both physicians and pharmacists, and quality of knowledge exchange. General Practitioner Physicians (n=105) and pharmacists (n=12) were recruited from across British Columbia. Pharmacists were trained to be academic detailers on diabetes medication usage. Physicians were assigned to one of four intervention groups to receive four academic detailing sessions from trained pharmacists. Intervention groups included: (1) AD only, (2) TEAD only, (3) TEAD crossed over to AD at midpoint, and (4) AD crossed over to TEAD at midpoint. Evaluation included physician-completed surveys before and after each session, pharmacist logs after each detailing session, interviews and focus groups with physicians and pharmacists at study completion, as well as a technical support log to record all phone calls and emails from physicians and pharmacists regarding any technical challenges during the TEAD sessions, or usage of the web portal. Because

  8. X-Ray Micro-Computed Tomography of Apollo Samples as a Curation Technique Enabling Better Research

    Science.gov (United States)

    Ziegler, R. A.; Almeida, N. V.; Sykes, D.; Smith, C. L.

    2014-01-01

    X-ray micro-computed tomography (micro-CT) is a technique that has been used to research meteorites for some time, and recently it has become a more common tool for the curation of meteorites and Apollo samples. Micro-CT is ideally suited to the characterization of astromaterials in the curation process as it can provide textural and compositional information at a small spatial resolution rapidly, nondestructively, and without compromising the cleanliness of the samples (e.g., samples can be scanned sealed in Teflon bags). These data can then inform scientists and curators when making and processing future sample requests for meteorites and Apollo samples. Here we present some preliminary results on micro-CT scans of four Apollo regolith breccias. Methods: Portions of four Apollo samples were used in this study: 14321, 15205, 15405, and 60639. All samples were 8-10 cm in their longest dimension and approximately equant. These samples were micro-CT scanned on the Nikon HMXST 225 System at the Natural History Museum in London. Scans were made at 205-220 kV, 135-160 microamps beam current, with an effective voxel size of 21-44 microns. Results: Initial examination of the data identified a variety of mineral clasts (including sub-voxel FeNi metal grains) and lithic clasts within the regolith breccias. Textural information within some of the lithic clasts was also discernable. Of particular interest was a large basalt clast (approx. 1.3 cc) found within sample 60639, which appears to have a sub-ophitic texture. Additionally, internal void space, e.g., fractures and voids, is readily identifiable. Discussion: It is clear from the preliminary data that micro-CT analyses are able to identify important "new" clasts within the Apollo breccias, and better characterize previously described clasts or igneous samples. For example, the 60639 basalt clast was previously believed to be quite small based on its approx. 0.5 sq cm exposure on the surface of the main mass

  9. Final Report for the project titled "Enabling Supernova Computations by Integrated Transport and Provisioning Methods Optimized for Dedicated Channels"

    Energy Technology Data Exchange (ETDEWEB)

    Malathi Veeraraghavan

    2007-10-31

    A high-speed optical circuit network is one that offers users rate-guaranteed connectivity between two endpoints, unlike today’s IP-routed Internet in which the rate available to a pair of users fluctuates based on the volume of competing traffic. This particular research project advanced our understanding of circuit networks in two ways. First, transport protocols were developed for circuit networks. In a circuit network, since bandwidth resources are reserved for each circuit on an end-to-end basis (much like how a person reserves a seat on every leg of a multi-segment flight), and the sender is limited to send at the rate of the circuit, there is no possibility of congestion during data transfer. Therefore, no congestion control functions are necessary in a transport protocol designed for circuits. However, error control and flow control are still required because bits can become errored due to noise and interference even on highly reliable optical links, and receivers can, due to multitasking or other reasons, not deplete the receive buffer fast enough to keep up with the sending rate (e.g., if the receiving host is multitasking between receiving a file transfer and some other computation). In this work, we developed two transport protocols for circuits, both of which are described below. Second, this project developed techniques for internetworking different types of connection-oriented networks, which are of two types: circuit-switched or packet-switched. In circuit-switched networks, multiplexing on links is “position based,” where “position” refers to the frequency, time slot, and port (fiber), while connection-oriented packet-switched networks use packet header information to demultiplex packets and switch them from node to node. The latter are commonly referred to as virtual circuit networks. Examples of circuit networks are time-division multiplexed Synchronous Optical Network/Synchronous Digital Hierarchy (SONET/SDH) and Wavelength Division

  10. Final Report on Institutional Computing Project s15_hilaserion, “Kinetic Modeling of Next-Generation High-Energy, High-Intensity Laser-Ion Accelerators as an Enabling Capability”

    Energy Technology Data Exchange (ETDEWEB)

    Albright, Brian James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Yin, Lin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stark, David James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-06

    This proposal sought of order 1M core-hours of Institutional Computing time intended to enable computing by a new LANL Postdoc (David Stark) working under LDRD ER project 20160472ER (PI: Lin Yin) on laser-ion acceleration. The project was “off-cycle,” initiating in June of 2016 with a postdoc hire.

  11. On performing of interference technique based on self-adjusting Zernike filters (SA-AVT method) to investigate flows and validate 3D flow numerical simulations

    Science.gov (United States)

    Pavlov, Al. A.; Shevchenko, A. M.; Khotyanovsky, D. V.; Pavlov, A. A.; Shmakov, A. S.; Golubev, M. P.

    2017-10-01

    We present a method for, and results of, determining the field of integral density in the flow structure corresponding to the Mach interaction of shock waves at Mach number M = 3. The optical diagnostics of the flow were performed using an interference technique based on self-adjusting Zernike filters (SA-AVT method). Numerical simulations were carried out using the CFS3D program package for solving the Euler and Navier-Stokes equations. Quantitative data on the distribution of integral density along the path of the probing radiation, for one direction of 3D flow transillumination in the region of Mach interaction of shock waves, were obtained for the first time.
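
    The quantity being compared here is a line-of-sight integral: the interferometer sees the integral of density along the probing beam, so validating a 3D simulation means reducing the computed density field to the same projection. A minimal sketch of that reduction is shown below; the grid size, cell spacing, beam direction, and stand-in density field are assumptions for illustration, not CFS3D data.

      # Hedged sketch: reduce a 3D CFD density field to the integral density seen
      # by a probing beam (line integral of rho along the transillumination axis).
      import numpy as np

      nx, ny, nz = 64, 64, 64
      dz = 1.0e-3                              # assumed cell size along the beam, in metres
      rho = np.full((nx, ny, nz), 0.4)         # stand-in density field, kg/m^3
      rho[20:40, 20:40, :] *= 2.5              # crude high-density region mimicking a shock structure

      # Beam travels along z: integral density is sum(rho) * dz for each (x, y) ray.
      integral_density = rho.sum(axis=2) * dz  # shape (nx, ny), units kg/m^2
      print(integral_density.min(), integral_density.max())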

  12. Meant to make a difference, the clinical experience of minimally invasive endodontics with the self-adjusting file system in India.

    Science.gov (United States)

    Pawar, Ajinkya M; Pawar, Mansing G; Kokate, Sharad R

    2014-01-01

    The vital steps in any endodontic treatment are thorough mechanical shaping and chemical cleaning, followed by obtaining a fluid-tight, impervious seal with an inert obturating material. For the past two decades, the introduction and use of rotary nickel-titanium (Ni-Ti) files have changed our concepts of endodontic treatment from conventional to contemporary. They have shown good success rates, but still have many drawbacks. The Self-Adjusting File (SAF) introduces a new era in endodontics by performing the vital steps of shaping and cleaning simultaneously. The SAF is a hollow file that adapts itself three-dimensionally to the root canal and is a single-file system made of a Ni-Ti lattice. The case series presented in this paper reports the clinical experience of treating primary endodontic cases with the SAF system in India.

  13. Meant to make a difference, the clinical experience of minimally invasive endodontics with the self-adjusting file system in India

    Directory of Open Access Journals (Sweden)

    Ajinkya M Pawar

    2014-01-01

    Full Text Available The vital steps in any endodontic treatment are thorough mechanical shaping and chemical cleaning, followed by obtaining a fluid-tight, impervious seal with an inert obturating material. For the past two decades, the introduction and use of rotary nickel-titanium (Ni-Ti) files have changed our concepts of endodontic treatment from conventional to contemporary. They have shown good success rates, but still have many drawbacks. The Self-Adjusting File (SAF) introduces a new era in endodontics by performing the vital steps of shaping and cleaning simultaneously. The SAF is a hollow file that adapts itself three-dimensionally to the root canal and is a single-file system made of a Ni-Ti lattice. The case series presented in this paper reports the clinical experience of treating primary endodontic cases with the SAF system in India.

  14. Facet-dependent photocatalytic mechanisms of anatase TiO2: A new sight on the self-adjusted surface heterojunction

    International Nuclear Information System (INIS)

    Gao, Shujun; Wang, Wei; Ni, Yaru; Lu, Chunhua; Xu, Zhongzi

    2015-01-01

    Efficient separation of photo-generated electrons and holes is crucial for improving the photocatalytic activity of semiconductor photocatalysts. In the present study, we show that a surface heterojunction exists on anatase TiO2 with exposed {101}, {010}, {001}, and {110} facets. With the help of selective Pt deposition, we find that the Schottky junction, together with a proper surface heterojunction, helps to separate the photo-generated electrons and holes. Moreover, the photo-reduction and photo-oxidation activities of the facets depend on the reaction systems, resulting in a self-adjusted surface heterojunction. The as-prepared photocatalyst gives the highest phenol degradation efficiency when Pt nanoparticles are deposited only on the {101} and {010} facets. In contrast, more Pt deposited on the {001} and {110} facets decreases the photocatalytic activity. The average phenol degradation rate of TiO2 (20 mg), which gradually decreases as the phenol concentration increases, is ca. 1.59 mg/min when the concentration is lower than 8 mg/L. However, similar results were not observed in P25-based reaction systems, evidencing the strong influence of the self-adjusted surface heterojunction. This study may help in understanding the photocatalytic mechanisms of semiconductor photocatalysts with different exposed facets, so that more efficient practical application of the photocatalysts for environmental protection can be reached. - Highlights: • Surface heterojunction is systematically discussed on TiO2 with various facets. • The surface heterojunction is found to be closely related to the reaction systems. • A proper surface heterojunction together with a Schottky junction is positive for photocatalysis. • A new sight is given to sufficiently unleash the fascinating properties of TiO2. • More efficient practical application can be reached

  15. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  16. Advanced Physical Models and Numerical Algorithms to Enable High-Fidelity Aerothermodynamic Simulations of Planetary Entry Vehicles on Emerging Distributed Heterogeneous Computing Architectures

    Data.gov (United States)

    National Aeronautics and Space Administration — The design and qualification of entry systems for planetary exploration largely rely on computational simulations. However, state-of-the-art modeling capabilities...

  17. Effectiveness of the self-adjusting file versus ProTaper systems to remove the smear layer in artificially induced internal root resorption cavities

    Directory of Open Access Journals (Sweden)

    Senem Yigit Özer

    2013-01-01

    Full Text Available Aim: Smear layer removal from artificially prepared internal root resorption (IRR) cavities using the self-adjusting file (SAF) system with activated continuous irrigation or the ProTaper system (Dentsply Maillefer, Ballaigues, Switzerland) with conventional syringe/needle irrigation was compared. Materials and methods: Twenty-eight maxillary central incisors were selected and decoronated, and 20 of them were randomly split along the coronal plane into labial and lingual sections, and artificial IRR cavities were prepared in both walls. Tooth segments were rejoined and the teeth were divided into two groups. Each group (n = 10) was prepared using the SAF or ProTaper system with 12 mL 5.25% NaOCl and 12 mL 17% EDTA. Root canals were prepared in six intact positive control teeth using the SAF or ProTaper system with 5.25% NaOCl and 17% EDTA. As negative controls, two intact teeth were prepared using NaOCl only. Roots were then split longitudinally at the rejoined segments and samples were evaluated under scanning electron microscopy using a five-point scoring system. Results: Most SAF (87%) and ProTaper (83%) samples (P > 0.05) had scores of 1 and 2, indicating clean canal walls in the IRR cavities. Conclusions: SAF with activated continuous irrigation and ProTaper with conventional syringe/needle irrigation both successfully removed the smear layer from artificially prepared IRR cavities

  18. Endodontic management of C-shaped root canal system of mandibular first molar by using a modified technique of self-adjusting file system.

    Science.gov (United States)

    Helvacioglu-Yigit, Dilek

    2015-01-01

    The C-shaped canal system is a seldom-found root canal anatomy that presents a challenge in all stages of endodontic treatment. According to the literature, this type of canal morphology is not a common finding in mandibular first molar teeth. This case report presents the endodontic management of a mandibular first molar with a C-shaped canal system. The root canal system was cleaned and shaped with nickel-titanium (NiTi) rotary instruments combined with the self-adjusting file (SAF). Obturation was performed using warm vertical condensation combined with the injection of warm gutta-percha. Follow-up examination 12 months later showed that the tooth was asymptomatic. The radiological findings presented no signs of periapical pathology. The clinician must be aware of the occurrence and complexity of C-shaped canals in mandibular first molar teeth to perform a successful root canal treatment. The supplementary use of the SAF after application of rotary instruments in C-shaped root canals might be a promising approach in the endodontic treatment of this type of canal morphology.

  19. Comparative evaluation of apically extruded debris with V-Taper, ProTaper Next, and the Self-adjusting File systems.

    Science.gov (United States)

    Vyavahare, Nishant K; Raghavendra, Srinidhi Surya; Desai, Niranjan N

    2016-01-01

    Complete cleaning of the root canal is the goal for ensuring success in endodontics. Removal of debris plays an important role in achieving this goal. In spite of advancements in instrument design, apical extrusion of debris remains a source of inflammation in the periradicular region. The aim was to comparatively evaluate the amount of apically extruded debris with the V-Taper, ProTaper Next, and Self-Adjusting File (SAF) systems. Sixty-four extracted human mandibular teeth with straight root canals were taken. Access openings were done and working lengths determined. The samples were randomly divided into three groups: Group I - V-Taper files (n = 20), Group II - ProTaper Next (n = 20), Group III - SAF (n = 20). Biomechanical preparation was completed and the debris was collected in vials to be quantitatively determined. The data obtained were statistically analyzed using ANOVA and the post hoc Tukey's test. All the specimens showed apical debris extrusion. The SAF showed significantly less debris extrusion than the V-Taper and ProTaper Next instruments. This indicates that the incidence of inter-treatment flare-ups due to debris extrusion would be less with the SAF.

  20. 3D Analysis of D-RaCe and Self-Adjusting File in Removing Filling Materials from Curved Root Canals Instrumented and Filled with Different Techniques

    Directory of Open Access Journals (Sweden)

    Neslihan Simsek

    2014-01-01

    Full Text Available The aim of this study was to compare the efficacy of D-RaCe files and a self-adjusting file (SAF) system in removing filling material from curved root canals instrumented and filled with different techniques by using microcomputed tomography (micro-CT). The mesial roots of 20 extracted mandibular first molars were used. Root canals (mesiobuccal and mesiolingual) were instrumented with SAF or Revo-S. The canals were then filled with gutta-percha and AH Plus sealer using cold lateral compaction or thermoplasticized injectable techniques. The root fillings were first removed with D-RaCe (Step 1), followed by Step 2, in which a SAF system was used to remove the residual fillings in all groups. Micro-CT scans were used to measure the volume of residual filling after root canal filling, reinstrumentation with D-RaCe (Step 1), and reinstrumentation with SAF (Step 2). Data were analyzed using Wilcoxon and Kruskal-Wallis tests. There were no statistically significant differences between filling techniques in the canals instrumented with SAF (P=0.292) and Revo-S (P=0.306). The amount of remaining filling material was similar in all groups (P=0.363); all of the instrumentation techniques left filling residue inside the canals. However, the additional use of SAF was more effective than using D-RaCe alone.

  1. Principles and frequency of self-adjustment of insulin dose in people with diabetes mellitus type 1 and correlation with markers of metabolic control.

    Science.gov (United States)

    Kramer, Guido; Kuniss, Nadine; Kloos, Christof; Lehmann, Thomas; Müller, Nicolle; Wolf, Gunter; Lorkowski, Stefan; Müller, Ulrich A

    2016-06-01

    Insulin dose self-adjustment (ISA) to different blood glucose levels, carbohydrate intake, exercise or illness is a core element of structured education programmes for people with diabetes mellitus type 1 (DM1). The aim of this study was to register the patients' current principles and frequency of ISA and to check their ability to make correct adjustments. 117 people with DM1 (mean HbA1c 7.1%, diabetes duration 24 y) were interviewed in a tertiary care centre. The number of ISAs was drawn from the last 28 days of the patients' diaries. The ability to find the correct insulin dose was assessed using five different calculation examples. All patients had participated in a structured education programme. The mean frequency of ISA was 72.1±29.4 per 28 days. ISA by adjustment rules was used in 48% (56/117) and by personal experience or feeling in 44% (52/117). Patients adjusting by feeling were older, performed ISA less often and had lower social status. There were no differences in HbA1c (feeling 7.2±0.8 vs. rules 7.0±0.9, p=0.403), non-severe hypoglycaemia (feeling 1.7±1.8 vs. rules 1.9±1.9, p=0.132) or comprehensibility of ISA between the two groups. Overall, the participants answered on average 2.8±2.3 of the five calculation examples correctly. Although all patients had been trained to use a correction factor for ISA in case of high premeal blood glucose levels, only half of them adjusted their insulin dosage using the complex rules from the treatment and education programme. Patients who performed their ISA based upon feeling did not show worse metabolic control. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
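
    The adjustment rules themselves are not spelled out in the abstract; purely as an illustration of the kind of premeal correction arithmetic such programmes teach, the sketch below combines a carbohydrate-ratio meal dose with a correction-factor dose. The target value, correction factor and carbohydrate ratio are hypothetical example numbers, not the study's rules, and real dosing is always individualised.

      # Illustrative only -- not the study's adjustment rules. A generic premeal
      # dose calculation: meal component (carbohydrate ratio) plus correction
      # component (correction factor for glucose above target).
      def premeal_dose(blood_glucose_mgdl, carbs_g,
                       target_mgdl=120, correction_factor=40, carb_ratio=10):
          """All default parameters are hypothetical example values."""
          meal_units = carbs_g / carb_ratio
          correction_units = max(0.0, (blood_glucose_mgdl - target_mgdl) / correction_factor)
          return round(meal_units + correction_units, 1)

      # Example: premeal glucose 220 mg/dl before a 60 g carbohydrate meal
      print(premeal_dose(220, 60))   # 60/10 + (220-120)/40 = 6.0 + 2.5 = 8.5 units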

  2. Evaluation of apically extruded debris during root canal retreatment with two different rotary systems followed by a self-adjusting file.

    Science.gov (United States)

    Cakici, Fatih; Cakici, Elif B; Küçükekenci, Funda Fundaoglu

    2016-02-01

    The aim was to compare the amount of apically extruded debris during root canal retreatment using the ProTaper retreatment system (Dentsply Maillefer, Ballaigues, Switzerland), the ProTaper retreatment system with the Self-Adjusting File (SAF) system (ReDent-Nova, Ra'anana, Israel), the Mtwo retreatment system (VDW, Munich, Germany), and the Mtwo retreatment system with SAF instruments. In total, 72 extracted human mandibular incisor teeth were used. All root canals were prepared with ProTaper Universal (Dentsply Maillefer) up to the F2 file and filled with gutta-percha and AH Plus sealer using cold lateral condensation before being assembled randomly into 4 groups (n = 18 each). Root canal filling materials were removed using the ProTaper retreatment system, the ProTaper retreatment system followed by the SAF system, the Mtwo retreatment system, and the Mtwo retreatment system followed by the SAF system. Debris extruded apically during the removal of canal filling material was collected into preweighed Eppendorf tubes. The tubes were then stored in an incubator at 70°C for 5 days. The weight of the dry extruded debris was established by subtracting the preretreatment from the postretreatment weight of the Eppendorf tubes for each group. The data obtained were analyzed using the Kruskal-Wallis test. All retreatment techniques caused apical extrusion of debris. There was no statistically significant difference between the groups (p>0.05). The results of this study showed that using the SAF system after the Mtwo or ProTaper retreatment systems to improve retreatment has no significant effect on the amount of apically extruded debris.

  3. Toward genome-enabled mycology.

    Science.gov (United States)

    Hibbett, David S; Stajich, Jason E; Spatafora, Joseph W

    2013-01-01

    Genome-enabled mycology is a rapidly expanding field that is characterized by the pervasive use of genome-scale data and associated computational tools in all aspects of fungal biology. Genome-enabled mycology is integrative and often requires teams of researchers with diverse skills in organismal mycology, bioinformatics and molecular biology. This issue of Mycologia presents the first complete fungal genomes in the history of the journal, reflecting the ongoing transformation of mycology into a genome-enabled science. Here, we consider the prospects for genome-enabled mycology and the technical and social challenges that will need to be overcome to grow the database of complete fungal genomes and enable all fungal biologists to make use of the new data.

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  6. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created to support the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, which collects user experience and feedback during analysis activities and develops tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  11. Development of the Glenn-HT Computer Code to Enable Time-Filtered Navier-Stokes (TFNS) Simulations and Application to Film Cooling on a Flat Plate Through Long Cooling Tubes

    Science.gov (United States)

    Ameri, Ali; Shyam, Vikram; Rigby, David; Poinsatte, Philip; Thurman, Douglas; Steinthorsson, Erlendur

    2014-01-01

    Computational fluid dynamics (CFD) analysis using Reynolds-averaged Navier-Stokes (RANS) formulation for turbomachinery-related flows has enabled improved engine component designs. RANS methodology has limitations which are related to its inability to accurately describe the spectrum of flow phenomena encountered in engines. Examples of flows that are difficult to compute accurately with RANS include phenomena such as laminar/turbulent transition, turbulent mixing due to mixing of streams, and separated flows. Large eddy simulation (LES) can improve accuracy but at a considerably higher cost. In recent years, hybrid schemes which take advantage of both unsteady RANS and LES have been proposed. This study investigated an alternative scheme, the time-filtered Navier-Stokes (TFNS) method applied to compressible flows. The method developed by Shih and Liu was implemented in the Glenn-HT code and applied to film cooling flows. In this report the method and its implementation are briefly described. The film effectiveness results obtained for film cooling from a row of 30 holes with a pitch of 3.0 diameters emitting air at a nominal density ratio of unity and four blowing ratios of 0.5, 1.0, 1.5 and 2.0 are shown. Flow features under those conditions are also described.
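
    The defining ingredient of TFNS is the temporal filter applied to the flow variables, which separates resolved, slowly varying motion from faster fluctuations that must be modelled. The sketch below is not the Shih and Liu formulation implemented in Glenn-HT; it is only a generic, hedged illustration of a causal exponential time filter, with the signal, time step and filter width chosen arbitrarily.

      # Generic illustration of a causal exponential time filter (not the Shih-Liu
      # TFNS formulation): fluctuations faster than the filter width are damped,
      # leaving a resolved, slowly varying field.
      import math

      def time_filter(signal, dt, filter_width):
          """Discrete form of d(phi_bar)/dt = (phi - phi_bar) / filter_width."""
          alpha = dt / (filter_width + dt)
          filtered = [signal[0]]
          for value in signal[1:]:
              filtered.append(filtered[-1] + alpha * (value - filtered[-1]))
          return filtered

      # Toy signal: slow trend plus a fast, turbulence-like fluctuation.
      dt = 1.0e-3
      signal = [math.sin(2 * math.pi * 5 * n * dt)
                + 0.3 * math.sin(2 * math.pi * 400 * n * dt) for n in range(2000)]
      smoothed = time_filter(signal, dt, filter_width=5.0e-3)
      print(round(smoothed[-1], 3))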

  12. Development of the Glenn Heat-Transfer (Glenn-HT) Computer Code to Enable Time-Filtered Navier-Stokes (TFNS) Simulations and Application to Film Cooling on a Flat Plate Through Long Cooling Tubes

    Science.gov (United States)

    Ameri, Ali; Shyam, Vikram; Rigby, David; Poinsatte, Phillip; Thurman, Douglas; Steinthorsson, Erlendur

    2014-01-01

    Computational fluid dynamics (CFD) analysis using Reynolds-averaged Navier-Stokes (RANS) formulation for turbomachinery-related flows has enabled improved engine component designs. RANS methodology has limitations that are related to its inability to accurately describe the spectrum of flow phenomena encountered in engines. Examples of flows that are difficult to compute accurately with RANS include phenomena such as laminar/turbulent transition, turbulent mixing due to mixing of streams, and separated flows. Large eddy simulation (LES) can improve accuracy but at a considerably higher cost. In recent years, hybrid schemes that take advantage of both unsteady RANS and LES have been proposed. This study investigated an alternative scheme, the time-filtered Navier-Stokes (TFNS) method applied to compressible flows. The method developed by Shih and Liu was implemented in the Glenn-Heat-Transfer (Glenn-HT) code and applied to film-cooling flows. In this report the method and its implementation is briefly described. The film effectiveness results obtained for film cooling from a row of 30deg holes with a pitch of 3.0 diameters emitting air at a nominal density ratio of unity and two blowing ratios of 0.5 and 1.0 are shown. Flow features under those conditions are also described.

  13. Development of the Glenn-Heat-Transfer (Glenn-HT) Computer Code to Enable Time-Filtered Navier Stokes (TFNS) Simulations and Application to Film Cooling on a Flat Plate Through Long Cooling Tubes

    Science.gov (United States)

    Ameri, Ali A.; Shyam, Vikram; Rigby, David; Poinsatte, Phillip; Thurman, Douglas; Steinthorsson, Erlendur

    2014-01-01

    Computational fluid dynamics (CFD) analysis using Reynolds-averaged Navier-Stokes (RANS) formulation for turbomachinery-related flows has enabled improved engine component designs. RANS methodology has limitations that are related to its inability to accurately describe the spectrum of flow phenomena encountered in engines. Examples of flows that are difficult to compute accurately with RANS include phenomena such as laminar/turbulent transition, turbulent mixing due to mixing of streams, and separated flows. Large eddy simulation (LES) can improve accuracy but at a considerably higher cost. In recent years, hybrid schemes that take advantage of both unsteady RANS and LES have been proposed. This study investigated an alternative scheme, the time-filtered Navier-Stokes (TFNS) method applied to compressible flows. The method developed by Shih and Liu was implemented in the Glenn-Heat-Transfer (Glenn-HT) code and applied to film-cooling flows. In this report the method and its implementation is briefly described. The film effectiveness results obtained for film cooling from a row of 30deg holes with a pitch of 3.0 diameters emitting air at a nominal density ratio of unity and two blowing ratios of 0.5 and 1.0 are shown. Flow features under those conditions are also described.

  14. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  1. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  3. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and of regular computing shifts, monitoring the services and infrastructure as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  4. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  8. Organising to Enable Innovation

    DEFF Research Database (Denmark)

    Brink, Tove

    2016-01-01

    The purpose of this conceptual paper is to reveal how organising can enable innovation across organisational layers and organisational units. This approach calls for a cross-disciplinary literature review. The aim is to provide an integrated understanding of innovation in an organisational approach. The findings reveal a continuous organising process between individual/team creativity and organisational structures/control to enable innovation at firm level. Organising provides a dynamic approach and contains the integrated reconstruction of creativity, structures and boundaries for enhanced balance of explorative and exploitative learning in uncertain environments. Shedding light on the cross-disciplinary theories to organise innovation provides a contribution at the firm level to enable innovation.

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  10. Simulation enabled safeguards assessment methodology

    International Nuclear Information System (INIS)

    Bean, Robert; Bjornard, Trond; Larson, Tom

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime, will be needed. There is good reason to believe that virtual engineering applied to the facility design, as well as to the safeguards system design, will reduce total project cost and improve efficiency in the design cycle. The Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag-and-drop wire-frame construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed. (authors)

  11. Simulation Enabled Safeguards Assessment Methodology

    International Nuclear Information System (INIS)

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment Methodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed

  12. The Nordic Housing Enabler

    DEFF Research Database (Denmark)

    Helle, Tina; Slaug, Bjørn; Brandt, Åse

    2010-01-01

    This study addresses development of a content valid cross-Nordic version of the Housing Enabler and investigation of its inter-rater reliability when used in occupational therapy rating situations, involving occupational therapists, clients and their home environments. The instrument was translated from the original Swedish version of the Housing Enabler, and adapted according to accessibility norms and guidelines for housing design in Sweden, Denmark, Finland and Iceland. This iterative process involved occupational therapists, architects, building engineers and professional translators, resulting in the Nordic Housing Enabler. For reliability testing, the sampling strategy and data collection procedures used were the same in all countries. Twenty voluntary occupational therapists, pair-wise but independently from each other, collected data from 106 cases by means of the Nordic Housing...

  13. Prospective analysis of principles and frequency of self-adjustment of insulin dose in people with diabetes type 1 before and after participation in a diabetes treatment and teaching programme.

    Science.gov (United States)

    Kramer, Guido; Kuniss, Nadine; Jörgens, Viktor; Lehmann, Thomas; Müller, Nicolle; Lorkowski, Stefan; Wolf, Gunter; Müller, Ulrich A; Kloos, Christof

    2016-09-01

    Insulin dose self-adjustment is an essential part of intensified insulin therapy, nowadays the routine treatment of type 1 diabetes (DM1). The aim of this study was to evaluate the principles and frequency of insulin dose self-adjustments in people with DM1 before and one year after participating in a structured diabetes treatment and teaching programme (DTTP), and to determine to what extent the patients followed the way they had been trained. 72 people with DM1 were interviewed before participation in our inpatient (32/72) or outpatient (40/72) DTTP. Sixty-six participants (91.7%) were followed up after one year. The number of adaptations of the insulin dose by the patients was recorded from 28 days of the patients' diaries. The ability to find the correct dose was tested using five different examples. Metabolic control improved significantly after one year (7.9±1.0 to 7.5±0.8%, p=0.004). The participants performed 86.0±37.1 insulin dosage adaptations per 28 days before the DTTP. After one year the frequency increased significantly to 99.1±30.7 per 28 days (p=0.011). Before the DTTP, 42 of 72 patients (58.3%) adjusted their insulin dose to correct high blood glucose levels by adjustment rules (a factor for correction or a correction scheme) and 20 of 72 people (27.8%) by personal experience/feeling. One year after the DTTP, 73% (48/66) used adjustment rules. After participating in a structured education programme, patients adjusted their insulin dosage more frequently. Metabolic control improved despite the fact that many patients did not strictly apply the rules they had been trained in. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  14. Pilot project as enabler?

    DEFF Research Database (Denmark)

    Neisig, Margit; Glimø, Helle; Holm, Catrine Granzow

    This article deals with a systemic perspective on transition. The field of study addressed is a pilot project as an enabler of transition in a highly complex polycentric context. From a Luhmannian systemic approach, a framework is created to understand and address barriers to change that occur when using pilot projects as enablers of transition. Aspects of how to create trust and deal with distrust during a transition are addressed. The transition in focus is the concept of New Public Management and how it is applied in the management of the Employment Service in Denmark. The transition regards...

  15. Enabling distributed collaborative science

    DEFF Research Database (Denmark)

    Hudson, T.; Sonnenwald, Diane H.; Maglaughlin, K.

    2000-01-01

    To enable collaboration over distance, a collaborative environment that uses a specialized scientific instrument called a nanoManipulator is evaluated. The nanoManipulator incorporates visualization and force feedback technology to allow scientists to see, feel, and modify biological samples bein...

  16. The Nordic Housing Enabler

    DEFF Research Database (Denmark)

    Helle, T.; Nygren, C.; Slaug, B.

    2014-01-01

    This study addresses development of a content-valid cross-Nordic version of the Housing Enabler and investigation of its inter-rater reliability when used in occupational therapy rating situations, involving occupational therapists, clients, and their home environments. The instrument was translated from the original Swedish version of the Housing Enabler, and adapted according to accessibility norms and guidelines for housing design in Sweden, Denmark, Finland, and Iceland. This iterative process involved occupational therapists, architects, building engineers, and professional translators, resulting in the Nordic Housing Enabler. For reliability testing, the sampling strategy and data collection procedures used were the same in all countries. Twenty voluntary occupational therapists, pair-wise but independently of each other, collected data from 106 cases by means of the Nordic Housing...

  17. Spatially enabled land administration

    DEFF Research Database (Denmark)

    Enemark, Stig

    2006-01-01

    Spatial enabling of land administration systems managing tenure, valuation, planning, and development will allow the information generated by these activities to be much more useful. Also, the services available to private and public sectors and to community organisations should commensurably improve. Knowledge ... In other words: good governance and sustainable development are not attainable without sound land administration or, more broadly, sound land management. The paper presents a land management vision that incorporates the benefits of ICT-enabled land administration functions. The idea is that spatial enablement will improve the communication between administrative systems and also establish more reliable data due to the use of the original data instead of copies. In Denmark, such governmental guidelines for a service-oriented IT architecture in support of e-government have recently been adopted. Finally, the paper presents the role of FIG...

  18. Nordic Housing Enabler

    DEFF Research Database (Denmark)

    Helle, Tina; Brandt, Åse

    Development and reliability testing of the Nordic Housing Enabler – an instrument for accessibility assessment of the physical housing. Tina Helle & Åse Brandt, University of Lund, Health Sciences, Faculty of Medicine (SE) and University College Northern Jutland, Occupational Therapy department (DK). Danish Centre for Assistive Technology. Abstract. For decades, accessibility to the physical housing environment for people with functional limitations has been of interest politically, professionally and for the users. Guidelines and norms on accessible housing design have gradually been developed; however, the built environment shows serious deficits when it comes to accessibility. This study addresses development of a content-valid cross-Nordic version of the Housing Enabler and investigation of inter-rater reliability when used in occupational therapy practice. The instrument was translated from...

  19. Enabling Wind Power Nationwide

    Energy Technology Data Exchange (ETDEWEB)

    Zayas, Jose; Derby, Michael; Gilman, Patrick; Ananthan, Shreyas

    2015-05-01

    Leveraging this experience, the U.S. Department of Energy’s (DOE’s) Wind and Water Power Technologies Office has evaluated the potential for wind power to generate electricity in all 50 states. This report analyzes and quantifies the geographic expansion that could be enabled by accessing higher above ground heights for wind turbines and considers the means by which this new potential could be responsibly developed.

  20. Contextual Interaction Design Research: Enabling HCI

    OpenAIRE

    Murer, Martin; Meschtscherjakov, Alexander; Fuchsberger, Verena; Giuliani, Manuel; Neureiter, Katja; Moser, Christiane; Aslan, Ilhan; Tscheligi, Manfred

    2015-01-01

    Human-Computer Interaction (HCI) has always been about humans, their needs and desires. Contemporary HCI thinking investigates interactions in everyday life and puts an emphasis on the emotional and experiential qualities of interactions. At the Center for Human-Computer Interaction we seek to bridge meandering strands in the field by following a guiding metaphor that shifts focus to what has always been the core quality of our research field: Enabling HCI, as a leitmo...

  1. EnableATIS strategy assessment.

    Science.gov (United States)

    2014-02-01

    Enabling Advanced Traveler Information Systems (EnableATIS) is the traveler information component of the Dynamic Mobility Application (DMA) program. The objective of the EnableATIS effort is to foster transformative traveler information application...

  2. Enabling Digital Literacy

    DEFF Research Database (Denmark)

    Ryberg, Thomas; Georgsen, Marianne

    2010-01-01

    There are some tensions between high-level policy definitions of “digital literacy” and actual teaching practice. We need to find workable definitions of digital literacy; obtain a better understanding of what digital literacy might look like in practice; and identify pedagogical approaches which support teachers in designing digital literacy learning. We suggest that frameworks such as Problem Based Learning (PBL) are approaches that enable digital literacy learning because they provide good settings for engaging with digital literacy. We illustrate this through analysis of a case. Furthermore, these operate on a meso-level, mediating between high-level concepts of digital literacy and classroom practice....

  3. CtOS Enabler

    OpenAIRE

    Crespo Cepeda, Rodrigo; El Yamri El Khatibi, Meriem; Carrera García, Juan Manuel

    2015-01-01

    Smart Cities are undoubtedly the near future of technology, one we approach every day, as can be seen in the abundance of mobile devices among the population, which computerize everyday life through the use of geolocation and information. We aim to unite these two domains with CtOS Enabler in order to create a usage standard that encompasses all Smart City systems and makes it easier for developers of such software to create new tools. ...

  4. Smart Grid Enabled EVSE

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2015-01-12

    The combined team of GE Global Research, Federal Express, National Renewable Energy Laboratory, and Consolidated Edison has successfully achieved the established goals contained within the Department of Energy’s Smart Grid Capable Electric Vehicle Supply Equipment funding opportunity. The final program product, shown charging two vehicles in Figure 1, reduces by nearly 50% the total installed system cost of the electric vehicle supply equipment (EVSE) as well as enabling a host of new Smart Grid enabled features. These include bi-directional communications, load control, utility message exchange and transaction management information. Using the new charging system, Utilities or energy service providers will now be able to monitor transportation related electrical loads on their distribution networks, send load control commands or preferences to individual systems, and then see measured responses. Installation owners will be able to authorize usage of the stations, monitor operations, and optimally control their electricity consumption. These features and cost reductions have been developed through a total system design solution.

  5. Enabling graphene nanoelectronics.

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Wei; Ohta, Taisuke; Biedermann, Laura Butler; Gutierrez, Carlos; Nolen, C. M.; Howell, Stephen Wayne; Beechem Iii, Thomas Edwin; McCarty, Kevin F.; Ross, Anthony Joseph, III

    2011-09-01

    Recent work has shown that graphene, a 2D electronic material amenable to planar semiconductor fabrication processing, possesses tunable electronic material properties potentially far superior to metals and other standard semiconductors. Despite its phenomenal electronic properties, focused research is still required to develop techniques for depositing and synthesizing graphene over large areas, thereby enabling the reproducible mass-fabrication of graphene-based devices. To address these issues, we combined an array of growth approaches and characterization resources to investigate several innovative and synergistic approaches for the synthesis of high-quality graphene films on technologically relevant substrates (SiC and metals). Our work focused on developing the fundamental scientific understanding necessary to generate large-area graphene films that exhibit highly uniform electronic properties and record carrier mobility, as well as developing techniques to transfer graphene onto other substrates.

  6. Grid-Enabled Measures

    Science.gov (United States)

    Moser, Richard P.; Hesse, Bradford W.; Shaikh, Abdul R.; Courtney, Paul; Morgan, Glen; Augustson, Erik; Kobrin, Sarah; Levin, Kerry; Helba, Cynthia; Garner, David; Dunn, Marsha; Coa, Kisha

    2011-01-01

    Scientists are taking advantage of the Internet and collaborative web technology to accelerate discovery in a massively connected, participative environment, a phenomenon referred to by some as Science 2.0. As a new way of doing science, this phenomenon has the potential to push science forward in a more efficient manner than was previously possible. The Grid-Enabled Measures (GEM) database has been conceptualized as an instantiation of Science 2.0 principles by the National Cancer Institute with two overarching goals: (1) Promote the use of standardized measures, which are tied to theoretically based constructs; and (2) Facilitate the ability to share harmonized data resulting from the use of standardized measures. This is done by creating an online venue connected to the Cancer Biomedical Informatics Grid (caBIG®) where a virtual community of researchers can collaborate together and come to consensus on measures by rating, commenting and viewing meta-data about the measures and associated constructs. This paper will describe the web 2.0 principles on which the GEM database is based, describe its functionality, and discuss some of the important issues involved with creating the GEM database, such as the role of mutually agreed-on ontologies (i.e., knowledge categories and the relationships among these categories) for data sharing. PMID:21521586

  7. Enabling distributed petascale science

    International Nuclear Information System (INIS)

    Baranovski, Andrew; Bharathi, Shishir; Bresnahan, John

    2007-01-01

    Petascale science is an end-to-end endeavour, involving not only the creation of massive datasets at supercomputers or experimental facilities, but the subsequent analysis of that data by a user community that may be distributed across many laboratories and universities. The new SciDAC Center for Enabling Distributed Petascale Science (CEDPS) is developing tools to support this end-to-end process. These tools include data placement services for the reliable, high-performance, secure, and policy-driven placement of data within a distributed science environment; tools and techniques for the construction, operation, and provisioning of scalable science services; and tools for the detection and diagnosis of failures in end-to-end data placement and distributed application hosting configurations. In each area, we build on a strong base of existing technology and have made useful progress in the first year of the project. For example, we have recently achieved order-of-magnitude improvements in transfer times (for lots of small files) and implemented asynchronous data staging capabilities; demonstrated dynamic deployment of complex application stacks for the STAR experiment; and designed and deployed end-to-end troubleshooting services. We look forward to working with SciDAC application and technology projects to realize the promise of petascale science

  8. Enabling immersive simulation.

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Josh (University of California Santa Cruz, Santa Cruz, CA); Mateas, Michael (University of California Santa Cruz, Santa Cruz, CA); Hart, Derek H.; Whetzel, Jonathan; Basilico, Justin Derrick; Glickman, Matthew R.; Abbott, Robert G.

    2009-02-01

    The object of the 'Enabling Immersive Simulation for Complex Systems Analysis and Training' LDRD has been to research, design, and engineer a capability to develop simulations which (1) provide a rich, immersive interface for participation by real humans (exploiting existing high-performance game-engine technology wherever possible), and (2) can leverage Sandia's substantial investment in high-fidelity physical and cognitive models implemented in the Umbra simulation framework. We report here on these efforts. First, we describe the integration of Sandia's Umbra modular simulation framework with the open-source Delta3D game engine. Next, we report on Umbra's integration with Sandia's Cognitive Foundry, specifically to provide for learning behaviors for 'virtual teammates' directly from observed human behavior. Finally, we describe the integration of Delta3D with the ABL behavior engine, and report on research into establishing the theoretical framework that will be required to make use of tools like ABL to scale up to increasingly rich and realistic virtual characters.

  9. Displays enabling mobile multimedia

    Science.gov (United States)

    Kimmel, Jyrki

    2007-02-01

    With the rapid advances in telecommunications networks, mobile multimedia delivery to handsets is now a reality. While a truly immersive multimedia experience is still far ahead in the mobile world, significant advances have been made in the constituent audio-visual technologies to make this become possible. One of the critical components in multimedia delivery is the mobile handset display. While such alternatives as headset-style near-to-eye displays, autostereoscopic displays, mini-projectors, and roll-out flexible displays can deliver either a larger virtual screen size than the pocketable dimensions of the mobile device can offer, or an added degree of immersion by adding the illusion of the third dimension in the viewing experience, there are still challenges in the full deployment of such displays in real-life mobile communication terminals. Meanwhile, direct-view display technologies have developed steadily, and can provide a development platform for an even better viewing experience for multimedia in the near future. The paper presents an overview of the mobile display technology space with an emphasis on the advances and potential in developing direct-view displays further to meet the goal of enabling multimedia in the mobile domain.

  10. Enabling scientific teamwork

    International Nuclear Information System (INIS)

    Hereld, Mark; Uram, Thomas; Hudson, Randy; Norris, John; Papka, Michael E

    2009-01-01

    The Computer Supported Collaborative Work research community has identified that the technology used to support distributed teams of researchers, such as email, instant messaging, and conferencing environments, are not enough. Building from a list of areas where it is believed technology can help support distributed teams, we have divided our efforts into support of asynchronous and synchronous activities. This paper will describe two of our recent efforts to improve the productivity of distributed science teams. One effort focused on supporting the management and tracking of milestones and results, with the hope of helping manage information overload. The second effort focused on providing an environment that supports real-time analysis of data. Both of these efforts are seen as add-ons to the existing collaborative infrastructure, developed to enhance the experience of teams working at a distance by removing barriers to effective communication.

  11. Enabling scientific teamwork

    Science.gov (United States)

    Hereld, Mark; Hudson, Randy; Norris, John; Papka, Michael E.; Uram, Thomas

    2009-07-01

    The Computer Supported Collaborative Work research community has identified that the technology used to support distributed teams of researchers, such as email, instant messaging, and conferencing environments, are not enough. Building from a list of areas where it is believed technology can help support distributed teams, we have divided our efforts into support of asynchronous and synchronous activities. This paper will describe two of our recent efforts to improve the productivity of distributed science teams. One effort focused on supporting the management and tracking of milestones and results, with the hope of helping manage information overload. The second effort focused on providing an environment that supports real-time analysis of data. Both of these efforts are seen as add-ons to the existing collaborative infrastructure, developed to enhance the experience of teams working at a distance by removing barriers to effective communication.

  12. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  13. Enabling cleanup technology transfer

    International Nuclear Information System (INIS)

    Ditmars, J. D.

    2002-01-01

    Technology transfer in the environmental restoration, or cleanup, area has been challenging. While there is little doubt that innovative technologies are needed to reduce the times, risks, and costs associated with the cleanup of federal sites, particularly those of the Departments of Energy (DOE) and Defense, the use of such technologies in actual cleanups has been relatively limited. There are, of course, many reasons why technologies do not reach the implementation phase or do not get transferred from developing entities to the user community. For example, many past cleanup contracts provided few incentives for performance that would compel a contractor to seek improvement via technology applications. While performance-based contracts are becoming more common, they alone will not drive increased technology applications. This paper focuses on some applications of cleanup methodologies and technologies that have been successful and are illustrative of a more general principle. The principle is at once obvious and not widely practiced. It is that, with few exceptions, innovative cleanup technologies are rarely implemented successfully alone but rather are implemented in the context of enabling processes and methodologies. And, since cleanup is conducted in a regulatory environment, the stage is better set for technology transfer when the context includes substantive interactions with the relevant stakeholders. Examples of this principle are drawn from Argonne National Laboratory's experiences in Adaptive Sampling and Analysis Programs (ASAPs), Precise Excavation, and the DOE Technology Connection (TechCon) Program. The lessons learned may be applicable to the continuing challenges posed by the cleanup and long-term stewardship of radioactive contaminants and unexploded ordnance (UXO) at federal sites

  14. Preservation of root canal anatomy using self-adjusting file instrumentation with glide path prepared by 20/0.02 hand files versus 20/0.04 rotary files

    Science.gov (United States)

    Jain, Niharika; Pawar, Ajinkya M.; Ukey, Piyush D.; Jain, Prashant K.; Thakur, Bhagyashree; Gupta, Abhishek

    2017-01-01

    Objectives: To compare the relative axis modification and canal concentricity after glide path preparation with a 20/0.02 hand K-file (NITIFLEX®) and a 20/0.04 rotary file (HyFlex™ CM), with subsequent instrumentation with a 1.5 mm self-adjusting file (SAF). Materials and Methods: One hundred and twenty ISO 15, 0.02 taper, Endo Training Blocks (Dentsply Maillefer, Ballaigues, Switzerland) were acquired and randomly divided into the following two groups (n = 60): Group 1, establishing a glide path up to a 20/0.02 hand K-file (NITIFLEX®) followed by instrumentation with a 1.5 mm SAF; and Group 2, establishing a glide path up to a 20/0.04 rotary file (HyFlex™ CM) followed by instrumentation with a 1.5 mm SAF. Pre- and post-instrumentation digital images were processed with MATLAB R2013 software to identify the central axis, and then superimposed using digital imaging software (Picasa 3.0 software, Google Inc., California, USA) taking five landmarks as reference points. Student's t-test for pairwise comparisons was applied with the level of significance set at 0.05. Results: Training blocks instrumented with the 20/0.04 rotary file and SAF were associated with less deviation in canal axis (at all five marked points), representing better canal concentricity compared to those in which the glide path was established by 20/0.02 hand K-files followed by SAF instrumentation. Conclusion: Canal geometry is better maintained after SAF instrumentation with a prior glide path established with a 20/0.04 rotary file. PMID:28855752
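
    As a rough illustration of the statistical comparison described above, the following sketch runs a two-sample Student's t-test at the 0.05 level on simulated canal-axis deviations; the group means, spreads, and sample sizes are invented for illustration and are not the study's measurements.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Simulated canal-axis deviations (mm) at one landmark, one value per
        # training block; illustrative only, not the measured study data.
        hand_kfile_saf = rng.normal(0.35, 0.08, 60)  # glide path: 20/0.02 hand K-file, then SAF
        rotary_saf = rng.normal(0.22, 0.08, 60)      # glide path: 20/0.04 rotary file, then SAF

        t_stat, p_value = stats.ttest_ind(hand_kfile_saf, rotary_saf)
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}, significant at 0.05: {p_value < 0.05}")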

  15. FOILFEST :community enabled security.

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Judy Hennessey; Johnson, Curtis Martin; Whitley, John B.; Drayer, Darryl Donald; Cummings, John C., Jr.

    2005-09-01

    The Advanced Concepts Group of Sandia National Laboratories hosted a workshop, "FOILFest: Community Enabled Security", on July 18-21, 2005, in Albuquerque, NM. This was a far-reaching look into the future of physical protection consisting of a series of structured brainstorming sessions focused on preventing and foiling attacks on public places and soft targets such as airports, shopping malls, hotels, and public events. These facilities are difficult to protect using traditional security devices since they could easily be pushed out of business through the addition of arduous and expensive security measures. The idea behind this Fest was to explore how the public, which is vital to the function of these institutions, can be leveraged as part of a physical protection system. The workshop considered procedures, space design, and approaches for building community through technology. The workshop explored ways to make the "good guys" in public places feel safe and be vigilant while making potential perpetrators of harm feel exposed and convinced that they will not succeed. Participants in the Fest included operators of public places, social scientists, technology experts, representatives of government agencies including DHS and the intelligence community, writers and media experts. Many innovative ideas were explored during the fest with most of the time spent on airports, including consideration of the local airport, the Albuquerque Sunport. Some provocative ideas included: (1) sniffers installed in passage areas like revolving doors and escalators, (2) a "jumbotron" showing current camera shots in the public space, (3) transparent portal screeners allowing viewing of the screening, (4) a layered open/funnel/open/funnel design where open spaces are used to encourage a sense of "communitas" and take advantage of citizen "sensing" and funnels are technological

  16. New algorithm for tensor contractions on multi-core CPUs, GPUs, and accelerators enables CCSD and EOM-CCSD calculations with over 1000 basis functions on a single compute node.

    Science.gov (United States)

    Kaliman, Ilya A; Krylov, Anna I

    2017-04-30

    A new hardware-agnostic contraction algorithm for tensors of arbitrary symmetry and sparsity is presented. The algorithm is implemented as a stand-alone open-source code, libxm. This code is also integrated with the general tensor library libtensor and with the Q-Chem quantum-chemistry package. An overview of the algorithm, its implementation, and benchmarks are presented. Similarly to other tensor software, the algorithm exploits efficient matrix multiplication libraries and assumes that tensors are stored in a block-tensor form. The distinguishing features of the algorithm are: (i) efficient repackaging of the individual blocks into large matrices and back, which affords efficient graphics processing unit (GPU)-enabled calculations without modifications of higher-level codes; (ii) fully asynchronous data transfer between disk storage and fast memory. The algorithm enables canonical all-electron coupled-cluster and equation-of-motion coupled-cluster calculations with single and double substitutions (CCSD and EOM-CCSD) with over 1000 basis functions on a single quad-GPU machine. We show that the algorithm exhibits the predicted theoretical scaling for canonical CCSD calculations, O(N^6), irrespective of the data size on disk. © 2017 Wiley Periodicals, Inc.
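
    The repackaging idea highlighted in this record can be sketched in a few lines of NumPy: blocks of a multi-index tensor are flattened into large matrices so that a single optimized matrix multiplication (a GEMM, on CPU or GPU) performs the contraction. The shapes and names below are arbitrary; this is a conceptual sketch, not the libxm implementation.

        import numpy as np

        # Contract two 4-index tensors A[i,j,k,l] and B[k,l,m,n] over the shared
        # pair (k,l) by "repackaging" them into matrices and delegating the work
        # to one large matrix multiplication.
        i, j, k, l, m, n = 8, 8, 10, 10, 6, 6
        A = np.random.rand(i, j, k, l)
        B = np.random.rand(k, l, m, n)

        A_mat = A.reshape(i * j, k * l)   # rows: free indices, cols: contracted pair
        B_mat = B.reshape(k * l, m * n)   # rows: contracted pair, cols: free indices
        C_mat = A_mat @ B_mat             # one big GEMM does all the work
        C = C_mat.reshape(i, j, m, n)     # repackage the result back into tensor form

        # Check against a direct einsum contraction.
        assert np.allclose(C, np.einsum("ijkl,klmn->ijmn", A, B))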

  17. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  18. Enabling search services on outsourced private spatial data

    KAUST Repository

    Yiu, Man Lung; Ghinita, Gabriel; Jensen, Christian Søndergaard; Kalnis, Panos

    2009-01-01

    Cloud computing services enable organizations and individuals to outsource the management of their data to a service provider in order to save on hardware investments and reduce maintenance costs. Only authorized users are allowed to access the data

  19. Geo-Enabled, Mobile Services

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard

    2006-01-01

    We are witnessing the emergence of a global infrastructure that enables the widespread deployment of geo-enabled, mobile services in practice. At the same time, the research community has also paid increasing attention to data management aspects of mobile services. This paper offers me...

  20. 18F-fluorodeoxyglucose positron emission tomography/computed tomography enables the detection of recurrent same-site deep vein thrombosis by illuminating recently formed, neutrophil-rich thrombus.

    Science.gov (United States)

    Hara, Tetsuya; Truelove, Jessica; Tawakol, Ahmed; Wojtkiewicz, Gregory R; Hucker, William J; MacNabb, Megan H; Brownell, Anna-Liisa; Jokivarsi, Kimmo; Kessinger, Chase W; Jaff, Michael R; Henke, Peter K; Weissleder, Ralph; Jaffer, Farouc A

    2014-09-23

    Accurate detection of recurrent same-site deep vein thrombosis (DVT) is a challenging clinical problem. Because DVT formation and resolution are associated with a preponderance of inflammatory cells, we investigated whether noninvasive (18)F-fluorodeoxyglucose (FDG)-positron emission tomography (PET) imaging could identify inflamed, recently formed thrombi and thereby improve the diagnosis of recurrent DVT. We established a stasis-induced DVT model in murine jugular veins and also a novel model of recurrent stasis DVT in mice. C57BL/6 mice (n=35) underwent ligation of the jugular vein to induce stasis DVT. FDG-PET/computed tomography (CT) was performed at DVT time points of day 2, 4, 7, 14, or 2+16 (same-site recurrent DVT at day 2 overlying a primary DVT at day 16). Antibody-based neutrophil depletion was performed in a subset of mice before DVT formation and FDG-PET/CT. In a clinical study, 38 patients with lower extremity DVT or controls undergoing FDG-PET were analyzed. Stasis DVT demonstrated that the highest FDG signal occurred at day 2, followed by a time-dependent decrease, and thrombus FDG uptake paralleled thrombus neutrophil content and thrombus PET signal intensity. Neutrophil depletion decreased FDG signals in day 2 DVT in comparison with controls (P=0.03). Recurrent DVT demonstrated significantly higher FDG uptake than organized day 14 DVT (P=0.03). The FDG DVT signal in patients also exhibited a time-dependent decrease. FDG-PET/CT imaging thus identifies thrombus inflammation in murine DVT and demonstrates a time-dependent signal decrease in both murine and clinical DVT. FDG-PET/CT may offer a molecular imaging strategy to accurately diagnose recurrent DVT. © 2014 American Heart Association, Inc.

  1. NASP - Enabling new space launch options

    Science.gov (United States)

    Froning, David; Gaubatz, William; Mathews, George

    1990-10-01

    Successful NASP developments in the United States are bringing about the possibility of effective, fully reusable vehicles for transport of people and cargo between earth and space. These developments include: extension of airbreathing propulsion to a much higher speed; densification of propellants for greater energy per unit volume of mass; structures with much greater strength-to-weight at high temperatures; computational advancements that enable more optimal design and integration of airframes, engines and controls; and advances in avionics, robotics, artificial intelligence and automation that enable accomplishment of earth-to-orbit (ETO) operations with much less manpower support and cost. This paper describes the relative magnitude of improvement that these developments may provide.

  2. How GNSS Enables Precision Farming

    Science.gov (United States)

    2014-12-01

    Precision farming: Feeding a Growing Population Enables Those Who Feed the World. Immediate and Ongoing Needs - population growth (more to feed) - urbanization (decrease in arable land) Double food production by 2050 to meet world demand. To meet thi...

  3. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  4. Optical Computing

    Indian Academy of Sciences (India)

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  5. Enabling Rapid Naval Architecture Design Space Exploration

    Science.gov (United States)

    Mueller, Michael A.; Dufresne, Stephane; Balestrini-Robinson, Santiago; Mavris, Dimitri

    2011-01-01

    Well accepted conceptual ship design tools can be used to explore a design space, but more precise results can be found using detailed models in full-feature computer aided design programs. However, defining a detailed model can be a time intensive task and hence there is an incentive for time sensitive projects to use conceptual design tools to explore the design space. In this project, the combination of advanced aerospace systems design methods and an accepted conceptual design tool facilitates the creation of a tool that enables the user to not only visualize ship geometry but also determine design feasibility and estimate the performance of a design.

  6. Camera-enabled techniques for organic synthesis

    Directory of Open Access Journals (Sweden)

    Steven V. Ley

    2013-05-01

    A great deal of time is spent within synthetic chemistry laboratories on non-value-adding activities such as sample preparation and work-up operations, and labour intensive activities such as extended periods of continued data collection. Using digital cameras connected to computer vision algorithms, camera-enabled apparatus can perform some of these processes in an automated fashion, allowing skilled chemists to spend their time more productively. In this review we describe recent advances in this field of chemical synthesis and discuss how they will lead to advanced synthesis laboratories of the future.
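
    A minimal sketch of the camera-enabled monitoring idea, assuming OpenCV and a single attached webcam: the mean hue of a region of interest is compared against a reference frame and a colour change is flagged. The region of interest and tolerance are hypothetical values that would be tuned per experiment.

        import cv2

        HUE_TOLERANCE = 15                          # hypothetical threshold, tune per experiment
        ROI = (slice(100, 200), slice(200, 300))    # hypothetical region of interest (rows, cols)

        cap = cv2.VideoCapture(0)                   # first attached camera
        ret, frame = cap.read()
        if not ret:
            raise RuntimeError("camera not available")
        reference_hue = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)[ROI][..., 0].mean()

        # Poll frames until the ROI hue drifts away from the reference (or the feed ends).
        while True:
            ret, frame = cap.read()
            if not ret:
                break
            hue = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)[ROI][..., 0].mean()
            if abs(hue - reference_hue) > HUE_TOLERANCE:
                print("colour change detected in ROI - possible endpoint")
                break

        cap.release()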

  7. OGC® Sensor Web Enablement Standards

    Directory of Open Access Journals (Sweden)

    George Percivall

    2006-09-01

    This article provides a high-level overview of and architecture for the Open Geospatial Consortium (OGC) standards activities that focus on sensors, sensor networks, and a concept called the “Sensor Web”. This OGC work area is known as Sensor Web Enablement (SWE). This article has been condensed from "OGC® Sensor Web Enablement: Overview And High Level Architecture," an OGC White Paper by Mike Botts, PhD, George Percivall, Carl Reed, PhD, and John Davidson which can be downloaded from http://www.opengeospatial.org/pt/15540. Readers interested in greater technical and architecture detail can download and read the OGC SWE Architecture Discussion Paper titled “The OGC Sensor Web Enablement Architecture” (OGC document 06-021r1, http://www.opengeospatial.org/pt/14140).

  8. Directory Enabled Policy Based Networking

    International Nuclear Information System (INIS)

    KELIIAA, CURTIS M.

    2001-01-01

    This report presents a discussion of directory-enabled policy-based networking with an emphasis on its role as the foundation for securely scalable enterprise networks. A directory service provides the object-oriented logical environment for interactive cyber-policy implementation. Cyber-policy implementation includes security, network management, operational process and quality of service policies. The leading network-technology vendors have invested in these technologies for secure universal connectivity that transverses Internet, extranet and intranet boundaries. Industry standards are established that provide the fundamental guidelines for directory deployment scalable to global networks. The integration of policy-based networking with directory-service technologies provides for intelligent management of the enterprise network environment as an end-to-end system of related clients, services and resources. This architecture allows logical policies to protect data, manage security and provision critical network services permitting a proactive defense-in-depth cyber-security posture. Enterprise networking imposes the consideration of supporting multiple computing platforms, sites and business-operation models. An industry-standards based approach combined with principled systems engineering in the deployment of these technologies allows these issues to be successfully addressed. This discussion is focused on a directory-based policy architecture for the heterogeneous enterprise network-computing environment and does not propose specific vendor solutions. This document is written to present practical design methodology and provide an understanding of the risks, complexities and most important, the benefits of directory-enabled policy-based networking
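
    As a toy illustration of the directory-enabled policy concept, the sketch below stores policy attributes on objects in a hierarchical directory and resolves the most specific value that applies to a device. The class and attribute names are invented for illustration and do not correspond to an actual LDAP/X.500 schema.

        from dataclasses import dataclass, field

        @dataclass
        class PolicyNode:
            """One object in a hierarchical policy directory."""
            name: str
            policies: dict = field(default_factory=dict)   # e.g. {"qos": "gold"}
            children: dict = field(default_factory=dict)

            def add_child(self, child):
                self.children[child.name] = child
                return child

        def resolve_policy(root, path, key):
            """Walk the directory path; deeper nodes override inherited policy values."""
            value, node = root.policies.get(key), root
            for name in path:
                node = node.children[name]
                value = node.policies.get(key, value)
            return value

        root = PolicyNode("enterprise", {"qos": "best-effort"})
        site = root.add_child(PolicyNode("site-a", {"ipsec": "required"}))
        site.add_child(PolicyNode("lab-switch-07", {"qos": "gold"}))

        print(resolve_policy(root, ["site-a", "lab-switch-07"], "qos"))    # gold
        print(resolve_policy(root, ["site-a", "lab-switch-07"], "ipsec"))  # required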

  9. Enabling Large-Scale Biomedical Analysis in the Cloud

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lin

    2013-01-01

    Recent progress in high-throughput instrumentation has led to an astonishing growth in both volume and complexity of biomedical data collected from various sources. This planet-size data brings serious challenges to storage and computing technologies. Cloud computing is an alternative to crack the nut because it gives concurrent consideration to enabling storage and high-performance computing on large-scale data. This work briefly introduces the data-intensive computing system and summarizes existing cloud-based resources in bioinformatics. These developments and applications would facilitate biomedical research to make the vast amount of diverse data meaningful and usable.

  10. Organizational Enablers for Project Governance

    DEFF Research Database (Denmark)

    Müller, Ralf; Shao, Jingting; Pemsel, Sofia

    While corporate culture plays a significant role in the success of any corporation, governance and “governmentality” not only determine how business should be conducted, but also define the policies and procedures organizations follow to achieve business functions and goals. In their book, Organizational Enablers for Project Governance, Ralf Müller, Jingting Shao, and Sofia Pemsel examine the interaction of governance and governmentality in various types of companies and demonstrate how these factors drive business success and influence project work, efficiency, and profitability. Based on these results, the authors discovered that organizational enablers (including key factors such as leadership, governance, and influence of project managers) have a critical impact on how organizations operate, adapt to market fluctuations and forces, and make decisions, and they examine these enablers and their relationships to organizational success. The data...

  11. 'Ethos' Enabling Organisational Knowledge Creation

    Science.gov (United States)

    Matsudaira, Yoshito

    This paper examines knowledge creation in relation to improvements on the production line in the manufacturing department of Nissan Motor Company and aims to clarify the embodied knowledge observed in the actions of organisational members who enable knowledge creation. For that purpose, this study adopts an approach that adds first-, second-, and third-person viewpoints to the theory of knowledge creation. Embodied knowledge, observed in the actions of organisational members who enable knowledge creation, is the continued practice of 'ethos' (in Greek) founded in the Nissan Production Way as an ethical basis. Ethos is a knowledge (intangible) asset for knowledge-creating companies. Substantiated analysis classifies ethos into three categories: the individual, team and organisation. This indicates the precise actions of the organisational members in each category during the knowledge creation process. This research will be successful in its role of showing the indispensability of ethos - the new concept of knowledge assets, which enables knowledge creation - for future knowledge-based management in the knowledge society.

  12. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
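
    A minimal sketch, in plain Python, of the logbook behaviour described above: events are recorded with timestamps, the history can be searched, and selected past events can be undone. The class and method names are illustrative and are not taken from the patented system.

        from dataclasses import dataclass
        from datetime import datetime

        @dataclass
        class Event:
            when: datetime
            description: str
            undo_action: object = None   # optional callable that reverses the event

        class Logbook:
            def __init__(self):
                self.history = []        # chronological list of Event objects

            def log(self, description, undo_action=None):
                self.history.append(Event(datetime.now(), description, undo_action))

            def search(self, text):
                return [e for e in self.history if text.lower() in e.description.lower()]

            def undo(self, event):
                if event.undo_action is not None:
                    event.undo_action()
                self.log("undid: " + event.description)

        settings = {"colormap": "viridis"}
        book = Logbook()
        book.log("set colormap to plasma", undo_action=lambda: settings.update(colormap="viridis"))
        settings["colormap"] = "plasma"

        for event in book.search("colormap"):
            book.undo(event)
        print(settings)   # back to {'colormap': 'viridis'}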

  13. Enabling Indoor Location-Based Services

    DEFF Research Database (Denmark)

    Radaelli, Laura

    Indoor spaces have always attracted interest from different scientific disciplines. Relatively recent interest in indoor settings by computer scientists is driven in part by the increasing use of smartphones, which serve as a platform for service delivery and can generate extensive volumes of trajectory data that can be used to study how people actually use indoor spaces. In this dissertation, we contribute partial solutions that address challenges in indoor positioning and indoor trajectory management and analysis. The key enabler of indoor location-based services and indoor movement analysis is a well-functioning positioning system that can be easily deployed in most public places. Different technologies are able to provide indoor positioning with different accuracy and coverage, but it is difficult to find a technology that by itself can provide good positioning in the many different layouts...

  14. Smart Grid enabled heat pumps

    DEFF Research Database (Denmark)

    Carmo, Carolina; Detlefsen, Nina; Nielsen, Mads Pagh

    2014-01-01

    The transition towards a 100 % fossil-free energy system, while achieving extreme penetration levels of intermittent wind and solar power in electricity generation, requires demand-side technologies that are smart (intermittency-friendly) and efficient. The integration of Smart Grid enabling...... with an empirical study in order to achieve a number of recommendations with respect to technology concepts and control strategies that would allow residential vapor-compression heat pumps to support large-scale integration of intermittent renewables. The analysis is based on data gathered over a period of up to 3...

  15. Enabling department-scale supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, D.S.; Hart, W.E.; Phillips, C.A.

    1997-11-01

    The Department of Energy (DOE) national laboratories have one of the longest and most consistent histories of supercomputer use. The authors summarize the architecture of DOE's new supercomputers that are being built for the Accelerated Strategic Computing Initiative (ASCI). The authors then argue that in the near future scaled-down versions of these supercomputers with petaflop-per-weekend capabilities could become widely available to hundreds of research and engineering departments. The availability of such computational resources will allow simulation of physical phenomena to become a full-fledged third branch of scientific exploration, along with theory and experimentation. They describe the ASCI and other supercomputer applications at Sandia National Laboratories, and discuss which lessons learned from Sandia's long history of supercomputing can be applied in this new setting.

  16. Cyber-Enabled Scientific Discovery

    International Nuclear Information System (INIS)

    Chan, Tony; Jameson, Leland

    2007-01-01

    It is often said that numerical simulation is third in the group of three ways to explore modern science: theory, experiment and simulation. Carefully executed modern numerical simulations can, however, be considered at least as relevant as experiment and theory. In comparison to physical experimentation, with numerical simulation one has the numerically simulated values of every field variable at every grid point in space and time. In comparison to theory, with numerical simulation one can explore sets of very complex non-linear equations such as the Einstein equations that are very difficult to investigate theoretically. Cyber-enabled scientific discovery is not just about numerical simulation but about every possible issue related to scientific discovery by utilizing cyberinfrastructure such as the analysis and storage of large data sets, the creation of tools that can be used by broad classes of researchers and, above all, the education and training of a cyber-literate workforce

  17. Context-Enabled Business Intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Troy Hiltbrand

    2012-04-01

    To truly understand context and apply it in business intelligence, it is vital to understand what context is and how it can be applied in addressing organizational needs. Context describes the facets of the environment that impact the way that end users interact with the system. Context includes aspects of location, chronology, access method, demographics, social influence/relationships, end-user attitude/emotional state, behavior/past behavior, and presence. To be successful in making business intelligence context enabled, it is important to be able to capture the context of the user. With advances in technology, there are a number of ways in which this user-based information can be gathered and exposed to enhance the overall end-user experience.
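
    As a small illustration of acting on such captured context, the sketch below tailors a hypothetical report to a user's facets; the facet names, datasets, and rules are invented examples rather than a prescribed design.

        from dataclasses import dataclass

        @dataclass
        class UserContext:
            location: str        # e.g. "plant-3"
            access_method: str   # e.g. "mobile" or "desktop"
            role: str            # demographic/organizational facet
            local_hour: int      # chronology facet (0-23)

        def tailor_report(ctx):
            """Adapt a report request to the captured context facets."""
            report = {"dataset": "daily_production", "detail": "full"}
            if ctx.access_method == "mobile":
                report["detail"] = "summary"              # small screen, condensed view
            if ctx.role == "operator":
                report["dataset"] = ctx.location + "_line_status"
            if ctx.local_hour >= 18:
                report["refresh_minutes"] = 60            # off-hours, slower refresh
            return report

        print(tailor_report(UserContext("plant-3", "mobile", "operator", 20)))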

  18. Informatics enables public health surveillance

    Directory of Open Access Journals (Sweden)

    Scott J. N McNabb

    2017-01-01

    Over the past decade, the world has radically changed. New advances in information and communication technologies (ICT) connect the world in ways never imagined. Public health informatics (PHI), leveraged for public health surveillance (PHS), can enable, enhance, and empower essential PHS functions (i.e., detection, reporting, confirmation, analyses, feedback, response). However, the tail doesn't wag the dog; as such, ICT cannot (and should not) drive public health surveillance strengthening. Rather, ICT can serve PHS to more effectively empower core functions. In this review, we explore promising ICT trends for prevention, detection, and response, laboratory reporting, push notification, analytics, predictive surveillance, and using new data sources, while recognizing that it is the people, politics, and policies that most challenge progress for implementation of solutions.

  19. Uncertainty enabled Sensor Observation Services

    Science.gov (United States)

    Cornford, Dan; Williams, Matthew; Bastin, Lucy

    2010-05-01

    Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad-hoc manner, however if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality then knowledge of this uncertainty enables optimal exploitation of the observations in further processes, or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to enable the support of explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user contributed weather data where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on the observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards including SWE common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation and usage within existing standards and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.
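
    A compact sketch of why explicit observation uncertainty pays off downstream: observations carrying a Gaussian error model can be combined by precision weighting instead of being treated as equally trustworthy. The dictionary layout is illustrative only and is not the actual UncertML or Observations and Measurements encoding.

        import math

        # Each observation carries an explicit probabilistic error model (here Gaussian),
        # which downstream processing can exploit by weighting observations by precision.
        observations = [
            {"value": 14.2, "uncertainty": {"type": "GaussianDistribution", "variance": 0.25}},
            {"value": 15.1, "uncertainty": {"type": "GaussianDistribution", "variance": 4.0}},
            {"value": 13.8, "uncertainty": {"type": "GaussianDistribution", "variance": 0.5}},
        ]

        weights = [1.0 / obs["uncertainty"]["variance"] for obs in observations]
        estimate = sum(w * o["value"] for w, o in zip(weights, observations)) / sum(weights)
        estimate_sd = math.sqrt(1.0 / sum(weights))

        print(f"precision-weighted estimate: {estimate:.2f} +/- {estimate_sd:.2f}")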

  20. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    paradigms: few sensors/complex computations and many sensors/simple computation. Challenges with Nano-enabled Neuromorphic Chips: A wide variety of...

  1. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    Davenport, J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  2. Computing the functional proteome

    DEFF Research Database (Denmark)

    O'Brien, Edward J.; Palsson, Bernhard

    2015-01-01

    Constraint-based models enable the computation of feasible, optimal, and realized biological phenotypes from reaction network reconstructions and constraints on their operation. To date, stoichiometric reconstructions have largely focused on metabolism, resulting in genome-scale metabolic models (M...
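
    A minimal sketch of the constraint-based idea, assuming SciPy: maximize the flux through an objective reaction subject to steady-state mass balance (S v = 0) and flux bounds. The three-reaction network and its numbers are toy values, not a genome-scale reconstruction.

        import numpy as np
        from scipy.optimize import linprog

        # Reactions: R0 uptake of A, R1: A -> B, R2: B -> biomass
        # Metabolites (rows): A, B
        S = np.array([
            [1, -1,  0],   # A: produced by uptake, consumed by R1
            [0,  1, -1],   # B: produced by R1, consumed by R2
        ])

        c = [0, 0, -1]                             # linprog minimizes, so negate the biomass flux
        bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 units

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print("optimal biomass flux:", -res.fun)   # 10.0, limited by the uptake bound
        print("flux distribution:", res.x)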

  3. SciDAC Visualization and Analytics Center for Enabling Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Joy, Kenneth I. [Univ. of California, Davis, CA (United States)]

    2014-09-14

    This project focuses on leveraging scientific visualization and analytics software technology as an enabling technology for increasing scientific productivity and insight. Advances in computational technology have resulted in an "information big bang," which in turn has created a significant data understanding challenge. This challenge is widely acknowledged to be one of the primary bottlenecks in contemporary science. The vision for our Center is to respond directly to that challenge by adapting, extending, creating when necessary and deploying visualization and data understanding technologies for our science stakeholders. Using an organizational model as a Visualization and Analytics Center for Enabling Technologies (VACET), we are well positioned to be responsive to the needs of a diverse set of scientific stakeholders in a coordinated fashion using a range of visualization, mathematics, statistics, computer and computational science and data management technologies.

  4. Enabling Collaborative Human-Machine (H-M) Decision Making in Time-critical Activities

    Data.gov (United States)

    National Aeronautics and Space Administration — Self-adjusting autonomous systems (SAS) are spreading from well-defined control activities, such as manufacturing, to complex activities with multi-faceted human...

  5. Grid Enabled Geospatial Catalogue Web Service

    Science.gov (United States)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web Information Model, this paper proposes a new information model for the Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and NASA EOS Core System, and service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and in particular can query on-demand data in the virtual community and retrieve it through data-related services that provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatial-enabled. It also lets researchers focus on science, and not on issues with computing capacity, data location, processing, and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  6. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM]; Gokhale, Maya B [Los Alamos, NM]; McCabe, Kevin Peter [Los Alamos, NM]

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  7. Enabling individualized therapy through nanotechnology.

    Science.gov (United States)

    Sakamoto, Jason H; van de Ven, Anne L; Godin, Biana; Blanco, Elvin; Serda, Rita E; Grattoni, Alessandro; Ziemys, Arturas; Bouamrani, Ali; Hu, Tony; Ranganathan, Shivakumar I; De Rosa, Enrica; Martinez, Jonathan O; Smid, Christine A; Buchanan, Rachel M; Lee, Sei-Young; Srinivasan, Srimeenakshi; Landry, Matthew; Meyn, Anne; Tasciotti, Ennio; Liu, Xuewu; Decuzzi, Paolo; Ferrari, Mauro

    2010-08-01

    Individualized medicine is the healthcare strategy that rebukes the idiomatic dogma of 'losing sight of the forest for the trees'. We are entering a new era of healthcare where it is no longer acceptable to develop and market a drug that is effective for only 80% of the patient population. The emergence of "-omic" technologies (e.g. genomics, transcriptomics, proteomics, metabolomics) and advances in systems biology are magnifying the deficiencies of standardized therapy, which often provide little treatment latitude for accommodating patient physiologic idiosyncrasies. A personalized approach to medicine is not a novel concept. Ever since the scientific community began unraveling the mysteries of the genome, the promise of discarding generic treatment regimens in favor of patient-specific therapies became more feasible and realistic. One of the major scientific impediments of this movement towards personalized medicine has been the need for technological enablement. Nanotechnology is projected to play a critical role in patient-specific therapy; however, this transition will depend heavily upon the evolutionary development of a systems biology approach to clinical medicine based upon "-omic" technology analysis and integration. This manuscript provides a forward looking assessment of the promise of nanomedicine as it pertains to individualized medicine and establishes a technology "snapshot" of the current state of nano-based products over a vast array of clinical indications and range of patient specificity. Other issues such as market driven hurdles and regulatory compliance reform are anticipated to "self-correct" in accordance to scientific advancement and healthcare demand. These peripheral, non-scientific concerns are not addressed at length in this manuscript; however they do exist, and their impact to the paradigm shifting healthcare transformation towards individualized medicine will be critical for its success. Copyright 2010 Elsevier Ltd. All rights

  8. Enabling individualized therapy through nanotechnology

    Science.gov (United States)

    Sakamoto, Jason H.; van de Ven, Anne L.; Godin, Biana; Blanco, Elvin; Serda, Rita E.; Grattoni, Alessandro; Ziemys, Arturas; Bouamrani, Ali; Hu, Tony; Ranganathan, Shivakumar I.; De Rosa, Enrica; Martinez, Jonathan O.; Smid, Christine A.; Buchanan, Rachel M.; Lee, Sei-Young; Srinivasan, Srimeenakshi; Landry, Matthew; Meyn, Anne; Tasciotti, Ennio; Liu, Xuewu; Decuzzi, Paolo; Ferrari, Mauro

    2010-01-01

    Individualized medicine is the healthcare strategy that rebukes the idiomatic dogma of ‘losing sight of the forest for the trees’. We are entering a new era of healthcare where it is no longer acceptable to develop and market a drug that is effective for only 80% of the patient population. The emergence of “-omic” technologies (e.g. genomics, transcriptomics, proteomics, metabolomics) and advances in systems biology are magnifying the deficiencies of standardized therapy, which often provide little treatment latitude for accommodating patient physiologic idiosyncrasies. A personalized approach to medicine is not a novel concept. Ever since the scientific community began unraveling the mysteries of the genome, the promise of discarding generic treatment regimens in favor of patient-specific therapies became more feasible and realistic. One of the major scientific impediments of this movement towards personalized medicine has been the need for technological enablement. Nanotechnology is projected to play a critical role in patient-specific therapy; however, this transition will depend heavily upon the evolutionary development of a systems biology approach to clinical medicine based upon “-omic” technology analysis and integration. This manuscript provides a forward looking assessment of the promise of nanomedicine as it pertains to individualized medicine and establishes a technology “snapshot” of the current state of nano-based products over a vast array of clinical indications and range of patient specificity. Other issues such as market driven hurdles and regulatory compliance reform are anticipated to “self-correct” in accordance to scientific advancement and healthcare demand. These peripheral, non-scientific concerns are not addressed at length in this manuscript; however they do exist, and their impact to the paradigm shifting healthcare transformation towards individualized medicine will be critical for its success. PMID:20045055

  9. Physician communication via Internet-enabled technology: A systematic review.

    Science.gov (United States)

    Barr, Neil G; Randall, Glen E; Archer, Norman P; Musson, David M

    2017-10-01

    The use of Internet-enabled technology (information and communication technology such as smartphone applications) may enrich information exchange among providers and, consequently, improve health care delivery. The purpose of this systematic review was to gain a greater understanding of the role that Internet-enabled technology plays in enhancing communication among physicians. Studies were identified through a search in three electronic platforms: the Association for Computing Machinery Digital Library, ProQuest, and Web of Science. The search identified 5140 articles; of these, 21 met all inclusion criteria. In general, physicians were satisfied with Internet-enabled technology, but consensus was lacking regarding whether Internet-enabled technology improved efficiency or made a difference to clinical decision-making. Internet-enabled technology can play an important role in enhancing communication among physicians, but the extent of that benefit is influenced by (1) the impact of Internet-enabled technology on existing work practices, (2) the availability of adequate resources, and (3) the nature of institutional elements, such as privacy legislation.

  10. Enabling model customization and integration

    Science.gov (United States)

    Park, Minho; Fishwick, Paul A.

    2003-09-01

    Until fairly recently, the ideas of dynamic model content and presentation were treated synonymously. For example, if one was to take a data flow network, which captures the dynamics of a target system in terms of the flow of data through nodal operators, then one would often standardize on rectangles and arrows for the model display. The increasing web emphasis on XML, however, suggests that the network model can have its content specified in an XML language, and then the model can be represented in a number of ways depending on the chosen style. We have developed a formal method, based on styles, that permits a model to be specified in XML and presented in 1D (text), 2D, and 3D. This method allows for customization and personalization to exert their benefits beyond e-commerce, to the area of model structures used in computer simulation. This customization leads naturally to solving the bigger problem of model integration - the act of taking models of a scene and integrating them with that scene so that there is only one unified modeling interface. This work focuses mostly on customization, but we address the integration issue in the future work section.
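
    As a rough illustration of the separation the record above describes, the sketch below keeps one XML description of a data-flow model and renders it under two presentation styles: a 1D text listing and a 2D layout. The <network>, <node> and <edge> element names are invented for the example and are not the authors' schema.

    # Minimal sketch: one XML model description, two presentation "styles".
    # The element names are illustrative assumptions, not the original schema.
    import xml.etree.ElementTree as ET

    MODEL_XML = """
    <network>
      <node id="source"/>
      <node id="filter"/>
      <node id="sink"/>
      <edge from="source" to="filter"/>
      <edge from="filter" to="sink"/>
    </network>
    """

    def render_text(root):
        """1D style: plain-text listing of the data-flow edges."""
        return "\n".join(f"{e.get('from')} -> {e.get('to')}" for e in root.iter("edge"))

    def render_2d(root):
        """2D style: assign simple grid coordinates to nodes for a diagram layout."""
        return {n.get("id"): (i * 2, 0) for i, n in enumerate(root.iter("node"))}

    root = ET.fromstring(MODEL_XML)
    print(render_text(root))   # same content, textual presentation
    print(render_2d(root))     # same content, geometric presentation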

  11. Enabling scientific workflows in virtual reality

    Science.gov (United States)

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

    To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data range. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.

  12. Federated and Cloud Enabled Resources for Data Management and Utilization

    Science.gov (United States)

    Rankin, R.; Gordon, M.; Potter, R. G.; Satchwill, B.

    2011-12-01

    The emergence of cloud computing over the past three years has led to a paradigm shift in how data can be managed, processed and made accessible. Building on the federated data management system offered through the Canadian Space Science Data Portal (www.cssdp.ca), we demonstrate how heterogeneous and geographically distributed data sets and modeling tools have been integrated to form a virtual data center and computational modeling platform that has services for data processing and visualization embedded within it. We also discuss positive and negative experiences in utilizing Eucalyptus and OpenStack cloud applications, and job scheduling facilitated by Condor and Star Cluster. We summarize our findings by demonstrating use of these technologies in the Cloud Enabled Space Weather Data Assimilation and Modeling Platform CESWP (www.ceswp.ca), which is funded through Canarie's (canarie.ca) Network Enabled Platforms program in Canada.

  13. Operator overloading as an enabling technology for automatic differentiation

    International Nuclear Information System (INIS)

    Corliss, G.F.; Griewank, A.

    1993-01-01

    We present an example of the science that is enabled by object-oriented programming techniques. Scientific computation often needs derivatives for solving nonlinear systems such as those arising in many PDE algorithms, optimization, parameter identification, stiff ordinary differential equations, or sensitivity analysis. Automatic differentiation computes derivatives accurately and efficiently by applying the chain rule to each arithmetic operation or elementary function. Operator overloading enables the techniques of either the forward or the reverse mode of automatic differentiation to be applied to real-world scientific problems. We illustrate automatic differentiation with an example drawn from a model of unsaturated flow in a porous medium. The problem arises from planning for the long-term storage of radioactive waste
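
    The record above hinges on the chain rule being applied inside overloaded arithmetic operators. A minimal forward-mode sketch using dual numbers is shown below; it is only an illustration of the general technique, not the authors' implementation or the porous-medium example.

    # Minimal forward-mode automatic differentiation via operator overloading.
    # A Dual carries a value and its derivative; the chain rule is applied
    # inside each overloaded operation.
    import math

    class Dual:
        def __init__(self, value, deriv=0.0):
            self.value, self.deriv = value, deriv

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value * other.value,
                        self.deriv * other.value + self.value * other.deriv)

        __rmul__ = __mul__

    def exp(x):
        return Dual(math.exp(x.value), math.exp(x.value) * x.deriv)

    # d/dx of f(x) = x * exp(x) + 3x at x = 2, seeded with derivative 1
    x = Dual(2.0, 1.0)
    f = x * exp(x) + 3 * x
    print(f.value, f.deriv)   # derivative equals (1 + x) * exp(x) + 3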

  14. BIM-enabled Conceptual Modelling and Representation of Building Circulation

    OpenAIRE

    Lee, Jin Kook; Kim, Mi Jeong

    2014-01-01

    This paper describes how a building information modelling (BIM)-based approach for building circulation enables us to change the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM has been designed for use by several BIM authoring tools, in particular with the widely known interoperable industry foundation classes (IFCs), which follow an object-oriented data modelli...

  15. Sparsity enabled cluster reduced-order models for control

    Science.gov (United States)

    Kaiser, Eurika; Morzyński, Marek; Daviller, Guillaume; Kutz, J. Nathan; Brunton, Bingni W.; Brunton, Steven L.

    2018-01-01

    Characterizing and controlling nonlinear, multi-scale phenomena are central goals in science and engineering. Cluster-based reduced-order modeling (CROM) was introduced to exploit the underlying low-dimensional dynamics of complex systems. CROM builds a data-driven discretization of the Perron-Frobenius operator, resulting in a probabilistic model for ensembles of trajectories. A key advantage of CROM is that it embeds nonlinear dynamics in a linear framework, which enables the application of standard linear techniques to the nonlinear system. CROM is typically computed on high-dimensional data; however, access to and computations on this full-state data limit the online implementation of CROM for prediction and control. Here, we address this key challenge by identifying a small subset of critical measurements to learn an efficient CROM, referred to as sparsity-enabled CROM. In particular, we leverage compressive measurements to faithfully embed the cluster geometry and preserve the probabilistic dynamics. Further, we show how to identify fewer optimized sensor locations tailored to a specific problem that outperform random measurements. Both of these sparsity-enabled sensing strategies significantly reduce the burden of data acquisition and processing for low-latency in-time estimation and control. We illustrate this unsupervised learning approach on three different high-dimensional nonlinear dynamical systems from fluids with increasing complexity, with one application in flow control. Sparsity-enabled CROM is a critical facilitator for real-time implementation on high-dimensional systems where full-state information may be inaccessible.
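
    To make the CROM construction above concrete, the following sketch clusters a toy set of snapshots and estimates a transition matrix between the clusters; the compressive sensing and sensor-placement steps that make the method "sparsity-enabled" are not reproduced. It assumes NumPy and scikit-learn, and the data are synthetic.

    # Sketch of the CROM idea: cluster full-state snapshots, then estimate a
    # Markov transition matrix between clusters from the sequence of labels.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    snapshots = rng.standard_normal((500, 64))   # 500 time steps, 64-dim state (toy data)

    k = 5
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(snapshots)

    # Empirical transition probabilities between clusters (Perron-Frobenius flavour)
    P = np.zeros((k, k))
    for a, b in zip(labels[:-1], labels[1:]):
        P[a, b] += 1
    P /= np.maximum(P.sum(axis=1, keepdims=True), 1)   # row-normalize

    print(np.round(P, 2))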

  16. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  17. Enabling Wireless Avionics Intra-Communications

    Science.gov (United States)

    Torres, Omar; Nguyen, Truong; Mackenzie, Anne

    2016-01-01

    The Electromagnetics and Sensors Branch of NASA Langley Research Center (LaRC) is investigating the potential of an all-wireless aircraft as part of the ECON (Efficient Reconfigurable Cockpit Design and Fleet Operations using Software Intensive, Networked and Wireless Enabled Architecture) seedling proposal, which is funded by the Convergent Aeronautics Solutions (CAS) project, Transformative Aeronautics Concepts (TAC) program, and NASA Aeronautics Research Institute (NARI). The project consists of a brief effort carried out by a small team in the Electromagnetic Environment Effects (E3) laboratory with the intention of exposing some of the challenges faced by a wireless communication system inside the reflective cavity of an aircraft and to explore potential solutions that take advantage of that environment for constructive gain. The research effort was named EWAIC for "Enabling Wireless Aircraft Intra-communications." The E3 laboratory is a research facility that includes three electromagnetic reverberation chambers and equipment that allow testing and generation of test data for the investigation of wireless systems in reflective environments. Using these chambers, the EWAIC team developed a set of tests and setups that allow the intentional variation of intensity of a multipath field to reproduce the environment of the various bays and cabins of large transport aircraft. This setup, in essence, simulates an aircraft environment that allows the investigation and testing of wireless communication protocols that can effectively be used as a tool to mitigate some of the risks inherent to an aircraft wireless system for critical functions. In addition, the EWAIC team initiated the development of a computational modeling tool to illustrate the propagation of EM waves inside the reflective cabins and bays of aircraft and to obtain quantifiable information regarding the degradation of signals in aircraft subassemblies. The nose landing gear of a UAV CAD model was used

  18. A Modular Swarm Optimization Framework Enabling Multi-Vehicle Coordinated Path Planning, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The advancement of Unmanned Aerial Systems (UAS) with computing power and communications hardware has enabled an increased capability set for multi-vehicle...

  19. Quantum Computing: Pro and Con

    OpenAIRE

    Preskill, John

    1997-01-01

    I assess the potential of quantum computation. Broad and important applications must be found to justify construction of a quantum computer; I review some of the known quantum algorithms and consider the prospects for finding new ones. Quantum computers are notoriously susceptible to making errors; I discuss recently developed fault-tolerant procedures that enable a quantum computer with noisy gates to perform reliably. Quantum computing hardware is still in its infancy; I comment on the spec...

  20. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.

  1. New Catalog of Resources Enables Paleogeosciences Research

    Science.gov (United States)

    Lingo, R. C.; Horlick, K. A.; Anderson, D. M.

    2014-12-01

    The 21st century promises a new era for scientists of all disciplines, the age where cyber infrastructure enables research and education and fuels discovery. EarthCube is a working community of over 2,500 scientists and students of many Earth Science disciplines who are looking to build bridges between disciplines. The EarthCube initiative will create a digital infrastructure that connects databases, software, and repositories. A catalog of resources (databases, software, repositories) has been produced by the Research Coordination Network for Paleogeosciences to improve the discoverability of resources. The Catalog is currently made available within the larger-scope CINERGI geosciences portal (http://hydro10.sdsc.edu/geoportal/catalog/main/home.page). Other distribution points and web services are planned, using linked data, content services for the web, and XML descriptions that can be harvested using metadata protocols. The databases provide searchable interfaces to find data sets that would otherwise remain dark data, hidden in drawers and on personal computers. The software will be described in catalog entries so just one click will lead users to methods and analytical tools that many geoscientists were unaware of. The repositories listed in the Paleogeosciences Catalog contain physical samples found all across the globe, from natural history museums to the basements of university buildings. EarthCube has over 250 databases, 300 software systems, and 200 repositories which will grow in the coming year. When completed, geoscientists across the world will be connected into a productive workflow for managing, sharing, and exploring geoscience data and information that expedites collaboration and innovation within the paleogeosciences, potentially bringing about new interdisciplinary discoveries.

  2. Integrated and Intelligent Manufacturing: Perspectives and Enablers

    Directory of Open Access Journals (Sweden)

    Yubao Chen

    2017-10-01

    Full Text Available With ever-increasing market competition and advances in technology, more and more countries are making advanced manufacturing technology a top priority for economic growth. Germany announced the Industry 4.0 strategy in 2013. The US government launched the Advanced Manufacturing Partnership (AMP) in 2011 and the National Network for Manufacturing Innovation (NNMI) in 2014. Most recently, the Manufacturing USA initiative was officially rolled out to further “leverage existing resources... to nurture manufacturing innovation and accelerate commercialization” by fostering close collaboration between industry, academia, and government partners. In 2015, the Chinese government officially published a 10-year plan and roadmap toward manufacturing: Made in China 2025. In all these national initiatives, the core technology development and implementation is in the area of advanced manufacturing systems. A new manufacturing paradigm is emerging, which can be characterized by two unique features: integrated manufacturing and intelligent manufacturing. This trend is in line with the progress of industrial revolutions, in which higher efficiency in production systems is being continuously pursued. To this end, 10 major technologies can be identified for the new manufacturing paradigm. This paper describes the rationales and needs for integrated and intelligent manufacturing (i2M) systems. Related technologies from different fields are also described. In particular, key technological enablers, such as the Internet of Things and Services (IoTS), cyber-physical systems (CPSs), and cloud computing, are discussed. Challenges are addressed with applications that are based on commercially available platforms such as General Electric’s (GE) Predix and PTC’s ThingWorx.

  3. An Internet enabled impact limiter material database

    Energy Technology Data Exchange (ETDEWEB)

    Wix, S.; Kanipe, F.; McMurtry, W.

    1998-09-01

    This paper presents a detailed explanation of the construction of an internet enabled database, also known as a database driven web site. The data contained in the internet enabled database are impact limiter material and seal properties. The techniques used in constructing the internet enabled database presented in this paper are applicable when information that is changing in content needs to be disseminated to a wide audience.
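
    The database-driven pattern described above, changing data stored centrally and served on demand over the web, can be sketched with a small HTTP endpoint backed by SQLite. Flask, the table layout, and the sample balsa-wood row are assumptions for illustration only, not the stack or data of the original paper.

    # Minimal database-driven web endpoint: properties live in SQLite and are
    # served on request, so updates to the database are immediately visible to
    # all readers.
    import sqlite3
    from flask import Flask, jsonify

    app = Flask(__name__)
    DB = "impact_limiter.db"

    def init_db():
        """Create the table and seed one illustrative row."""
        with sqlite3.connect(DB) as con:
            con.execute("CREATE TABLE IF NOT EXISTS materials "
                        "(name TEXT, property TEXT, value REAL, units TEXT)")
            con.execute("INSERT INTO materials VALUES "
                        "('balsa wood', 'crush strength', 6.9, 'MPa')")

    @app.route("/materials")
    def materials():
        with sqlite3.connect(DB) as con:
            rows = con.execute("SELECT name, property, value, units FROM materials").fetchall()
        return jsonify([dict(zip(("name", "property", "value", "units"), r)) for r in rows])

    if __name__ == "__main__":
        init_db()
        app.run()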

  4. An internet enabled impact limiter material database

    Energy Technology Data Exchange (ETDEWEB)

    Wix, S.; Kanipe, F.; McMurtry, W. [Sandia National Labs., Albuquerque, NM (United States)

    1998-07-01

    This paper presents a detailed explanation of the construction of an internet enabled database, also known as a database driven web site. The data contained in the internet enabled database are impact limiter material and seal properties. The techniques used in constructing the internet enabled database presented in this paper are applicable when information that is changing in content needs to be disseminated to a wide audience. (authors)

  5. An internet enabled impact limiter material database

    International Nuclear Information System (INIS)

    Wix, S.; Kanipe, F.; McMurtry, W.

    1998-01-01

    This paper presents a detailed explanation of the construction of an internet enabled database, also known as a database driven web site. The data contained in the internet enabled database are impact limiter material and seal properties. The techniques used in constructing the internet enabled database presented in this paper are applicable when information that is changing in content needs to be disseminated to a wide audience. (authors)

  6. An Internet enabled impact limiter material database

    International Nuclear Information System (INIS)

    Wix, S.; Kanipe, F.; McMurtry, W.

    1998-01-01

    This paper presents a detailed explanation of the construction of an internet enabled database, also known as a database driven web site. The data contained in the internet enabled database are impact limiter material and seal properties. The techniques used in constructing the internet enabled database presented in this paper are applicable when information that is changing in content needs to be disseminated to a wide audience

  7. ESIM_DSN Web-Enabled Distributed Simulation Network

    Science.gov (United States)

    Bedrossian, Nazareth; Novotny, John

    2002-01-01

    In this paper, the eSim DSN approach to achieve distributed simulation capability using the Internet is presented. With this approach a complete simulation can be assembled from component subsystems that run on different computers. The subsystems interact with each other via the Internet. The distributed simulation uses a hub-and-spoke type network topology. It provides the ability to dynamically link simulation subsystem models to different computers as well as the ability to assign a particular model to each computer. A proof-of-concept demonstrator is also presented. The eSim DSN demonstrator can be accessed at http://www.jsc.draper.com/esim which hosts various examples of Web enabled simulations.
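
    The hub-and-spoke coupling described above can be sketched in miniature: a hub collects each subsystem's outputs and forwards them to the others at every step. The Plant/Controller models and their variables are invented for the example; in eSim DSN the exchange happens over the Internet between separate computers rather than in one process.

    # Toy hub-and-spoke coupling of subsystem models.
    class Hub:
        def __init__(self):
            self.subsystems = {}

        def register(self, name, subsystem):
            self.subsystems[name] = subsystem

        def step(self):
            # Collect every subsystem's output, then forward to the others.
            outputs = {name: s.output() for name, s in self.subsystems.items()}
            for name, s in self.subsystems.items():
                s.update({k: v for k, v in outputs.items() if k != name})

    class Plant:
        def __init__(self): self.state = 0.0
        def output(self): return self.state
        def update(self, inputs): self.state += 0.1 * inputs.get("controller", 0.0)

    class Controller:
        def __init__(self): self.command = 1.0
        def output(self): return self.command
        def update(self, inputs): self.command = 1.0 - inputs.get("plant", 0.0)

    hub = Hub()
    hub.register("plant", Plant())
    hub.register("controller", Controller())
    for _ in range(20):
        hub.step()
    print(hub.subsystems["plant"].state)   # approaches 1.0 as the loop converges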

  8. Enabling Campus Grids with Open Science Grid Technology

    International Nuclear Information System (INIS)

    Weitzel, Derek; Fraser, Dan; Pordes, Ruth; Bockelman, Brian; Swanson, David

    2011-01-01

    The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.

  9. Enabling campus grids with open science grid technology

    Energy Technology Data Exchange (ETDEWEB)

    Weitzel, Derek [Nebraska U.; Bockelman, Brian [Nebraska U.; Swanson, David [Nebraska U.; Fraser, Dan [Argonne; Pordes, Ruth [Fermilab

    2011-01-01

    The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.

  10. Security of fixed and wireless computer networks

    NARCIS (Netherlands)

    Verschuren, J.; Degen, A.J.G.; Veugen, P.J.M.

    2003-01-01

    A few decades ago, most computers were stand-alone machines: they were able to process information using their own resources. Later, computer systems were connected to each other enabling a computer system to exchange data with another computer and to use resources of another computer. With the

  11. The enabling approach for housing supply

    Directory of Open Access Journals (Sweden)

    Ghada Farouk Hassan

    2011-12-01

    The paper attempts to highlight prerequisites needed to improve the success of the enabling approach in achieving adequate housing provision. Then the paper revisits the Egyptian experiences in the application of the enabling approach from 2005 to 2010. Finally, the paper highlights the main shortcomings and the lessons that must be considered for this promising approach after the revolution.

  12. Octopus: LLL's computing utility

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    The Laboratory's Octopus network constitutes one of the greatest concentrations of computing power in the world. This power derives from the network's organization as well as from the size and capability of its computers, storage media, input/output devices, and communication channels. Being in a network enables these facilities to work together to form a unified computing utility that is accessible on demand directly from the users' offices. This computing utility has made a major contribution to the pace of research and development at the Laboratory; an adequate rate of progress in research could not be achieved without it. 4 figures

  13. Optoelectronic Computer Architecture Development for Image Reconstruction

    National Research Council Canada - National Science Library

    Forber, Richard

    1996-01-01

    .... Specifically, we collaborated with UCSD and ERIM on the development of an optically augmented electronic computer for high speed inverse transform calculations to enable real time image reconstruction...

  14. BIM-Enabled Conceptual Modelling and Representation of Building Circulation

    Directory of Open Access Journals (Sweden)

    Jin Kook Lee

    2014-08-01

    Full Text Available This paper describes how a building information modelling (BIM)-based approach for building circulation enables us to change the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM has been designed for use by several BIM authoring tools, in particular with the widely known interoperable industry foundation classes (IFCs), which follow an object-oriented data modelling methodology. Advances in BIM authoring tools, using space objects and their relations defined in an IFC's schema, have made it possible to model, visualize and analyse circulation within buildings prior to their construction. Agent-based circulation has long been an interdisciplinary topic of research across several areas, including design computing, computer science, architectural morphology, human behaviour and environmental psychology. Such conventional approaches to building circulation are centred on navigational knowledge about built environments, and represent specific circulation paths and regulations. This paper, however, places emphasis on the use of ‘space objects’ in BIM-enabled design processes rather than on circulation agents, the latter of which are not defined in the IFCs' schemas. By introducing and reviewing some associated research and projects, this paper also surveys how such a circulation representation is applicable to the analysis of building circulation-related rules.
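
    One way to picture the 'space object' view of circulation described above is as a graph whose nodes are spaces and whose edges are door connections, over which a route can be searched. The room names and adjacencies below are invented; in practice they would be extracted from IfcSpace and IfcDoor objects by a BIM toolkit.

    # Circulation sketch over "space objects": spaces become graph nodes, door
    # connections become edges, and breadth-first search yields a route.
    from collections import deque

    adjacency = {
        "lobby":      ["corridor"],
        "corridor":   ["lobby", "office_101", "stair"],
        "office_101": ["corridor"],
        "stair":      ["corridor", "exit"],
        "exit":       ["stair"],
    }

    def circulation_path(start, goal):
        """Shortest sequence of spaces from start to goal (BFS)."""
        queue, seen = deque([[start]]), {start}
        while queue:
            path = queue.popleft()
            if path[-1] == goal:
                return path
            for nxt in adjacency[path[-1]]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])
        return None

    print(circulation_path("office_101", "exit"))  # ['office_101', 'corridor', 'stair', 'exit']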

  15. The anatomy of the grid : enabling scalable virtual organizations.

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.; Kesselman, C.; Tuecke, S.; Mathematics and Computer Science; Univ. of Chicago; Univ. of Southern California

    2001-10-01

    'Grid' computing has emerged as an important new field, distinguished from conventional distributed computing by its focus on large-scale resource sharing, innovative applications, and, in some cases, high performance orientation. In this article, the authors define this new field. First, they review the 'Grid problem,' which is defined as flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources -- what is referred to as virtual organizations. In such settings, unique authentication, authorization, resource access, resource discovery, and other challenges are encountered. It is this class of problem that is addressed by Grid technologies. Next, the authors present an extensible and open Grid architecture, in which protocols, services, application programming interfaces, and software development kits are categorized according to their roles in enabling resource sharing. The authors describe requirements that they believe any such mechanisms must satisfy and discuss the importance of defining a compact set of intergrid protocols to enable interoperability among different Grid systems. Finally, the authors discuss how Grid technologies relate to other contemporary technologies, including enterprise integration, application service provider, storage service provider, and peer-to-peer computing. They maintain that Grid concepts and technologies complement and have much to contribute to these other approaches.

  16. SciDAC visualization and analytics center for enabling technology

    International Nuclear Information System (INIS)

    Bethel, E Wes; Johnson, Chris; Joy, Ken; Ahern, Sean; Pascucci, Valerio; Childs, Hank; Cohen, Jonathan; Duchaineau, Mark; Hamann, Bernd; Hansen, Charles; Laney, Dan; Lindstrom, Peter; Meredith, Jeremy; Ostrouchov, George; Parker, Steven; Silva, Claudio; Sanderson, Allen; Tricoche, Xavier

    2007-01-01

    The Visualization and Analytics Center for Enabling Technologies (VACET) focuses on leveraging scientific visualization and analytics software technology as an enabling technology for increasing scientific productivity and insight. Advances in computational technology have resulted in an 'information big bang,' which in turn has created a significant data understanding challenge. This challenge is widely acknowledged to be one of the primary bottlenecks in contemporary science. The vision of VACET is to adapt, extend, create when necessary, and deploy visual data analysis solutions that are responsive to the needs of DOE's computational and experimental scientists. Our center is engineered to be directly responsive to those needs and to deliver solutions for use in DOE's large open computing facilities. The research and development directly target data understanding problems provided by our scientific application stakeholders. VACET draws from a diverse set of visualization technology ranging from production quality applications and application frameworks to state-of-the-art algorithms for visualization, analysis, analytics, data manipulation, and data management

  17. A Web-based Architecture Enabling Multichannel Telemedicine Applications

    Directory of Open Access Journals (Sweden)

    Fabrizio Lamberti

    2003-02-01

    Full Text Available Telemedicine scenarios today include in-hospital care management, remote teleconsulting, collaborative diagnosis and emergency situation handling. Different types of information need to be accessed by means of heterogeneous client devices in different communication environments in order to enable high-quality, continuous healthcare delivery wherever and whenever needed. In this paper, a Web-based telemedicine architecture based on Java, XML and XSL technologies is presented. By providing dynamic content delivery services and Java-based client applications for medical data consultation and modification, the system enables effective access to a standards-based Electronic Patient Record database by means of any device equipped with a Web browser, such as traditional Personal Computers and workstations as well as modern Personal Digital Assistants. The effectiveness of the proposed architecture has been evaluated in different scenarios, experiencing fixed and mobile clinical data transmissions over Local Area Networks, wireless LANs and wide coverage telecommunication networks including GSM and GPRS.

  18. Optical Coherent Receiver Enables THz Wireless Bridge

    DEFF Research Database (Denmark)

    Yu, Xianbin; Liu, Kexin; Zhang, Hangkai

    2016-01-01

    We experimentally demonstrated a 45 Gbit/s 400 GHz photonic wireless communication system enabled by an optical coherent receiver, which has a high potential in fast recovery of high data rate connections, for example, in disaster....

  19. Web Enabled DROLS Verity TopicSets

    National Research Council Canada - National Science Library

    Tong, Richard

    1999-01-01

    The focus of this effort has been the design and development of automatically generated TopicSets and HTML pages that provide the basis of the required search and browsing capability for DTIC's Web Enabled DROLS System...

  20. Creating an Economically Enabling and Competitive Business ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Creating an Economically Enabling and Competitive Business Environment in the ... the scope of operations of private sector enterprises in the West Bank and Gaza. ... IWRA/IDRC webinar on climate change and adaptive water management.

  1. Utility Energy Services Contracts: Enabling Documents

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-05-01

    Utility Energy Services Contracts: Enabling Documents provides materials that clarify the authority for Federal agencies to enter into utility energy services contracts (UESCs), as well as sample documents and resources to ease utility partnership contracting.

  2. Utility Energy Services Contracts: Enabling Documents

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Karen; Vasquez, Deb

    2017-01-01

    The Federal Energy Management Program's 'Utility Energy Service Contracts: Enabling Documents' provide legislative information and materials that clarify the authority for federal agencies to enter into utility energy service contracts, or UESCs.

  3. 5G-Enabled Tactile Internet

    OpenAIRE

    Simsek, Meryem; Aijaz, Adnan; Dohler, Mischa; Sachs, Joachim; Fettweis, Gerhard

    2016-01-01

    The long-term ambition of the Tactile Internet is to enable a democratization of skill, and how it is being delivered globally. An integral part of this is to be able to transmit touch in perceived real-time, which is enabled by suitable robotics and haptics equipment at the edges, along with an unprecedented communications network. The fifth generation (5G) mobile communications systems will underpin this emerging Internet at the wireless edge. This paper presents the most important technolo...

  4. Integrated Photonics Enabled by Slow Light

    DEFF Research Database (Denmark)

    Mørk, Jesper; Chen, Yuntian; Ek, Sara

    2012-01-01

    In this talk we will discuss the physics of slow light in semiconductor materials and in particular the possibilities offered for integrated photonics. This includes ultra-compact slow light enabled optical amplifiers, lasers and pulse sources.

  5. Cloud Computing Governance Lifecycle

    OpenAIRE

    Soňa Karkošková; George Feuerlicht

    2016-01-01

    Externally provisioned cloud services enable flexible and on-demand sourcing of IT resources. Cloud computing introduces new challenges such as the need for business process redefinition, establishment of specialized governance and management, organizational structures and relationships with external providers, and managing new types of risk arising from dependency on external providers. There is a general consensus that cloud computing, in addition to challenges, brings many benefits, but it is uncle...

  6. Demonstration of blind quantum computing.

    Science.gov (United States)

    Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joseph F; Zeilinger, Anton; Walther, Philip

    2012-01-20

    Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.

  7. Cloud Computing: Exploring the scope

    OpenAIRE

    Maurya, Brajesh Kumar

    2010-01-01

    Cloud computing refers to a paradigm shift in overall IT solutions, raising accessibility, scalability and effectiveness through its enabling technologies. However, the cost benefits and performance of migrated cloud platforms and services are neither clear nor well summarized. Globalization and the recessionary economic times have not only raised the bar for better IT delivery models but also have given access to technology-enabled services via the Internet. Cloud computing has va...

  8. Parallel quantum computing in a single ensemble quantum computer

    International Nuclear Information System (INIS)

    Long Guilu; Xiao, L.

    2004-01-01

    We propose a parallel quantum computing mode for an ensemble quantum computer. In this mode, some qubits are in pure states while other qubits are in mixed states. It enables a single ensemble quantum computer to perform 'single-instruction, multiple-data' parallel computation. Parallel quantum computing can provide additional speedup in Grover's algorithm and Shor's algorithm. In addition, it also makes fuller use of qubit resources in an ensemble quantum computer. As a result, some qubits discarded in the preparation of an effective pure state in the Schulman-Vazirani and the Cleve-DiVincenzo algorithms can be reutilized

  9. Computational Materials Repository

    DEFF Research Database (Denmark)

    Landis, David

    , different abstraction levels and enables users to analyze their own results, and allows them to share data with collaborators. The approach of the Computational Materials Repository (CMR) is to convert data to an internal format that maintains the original variable names without insisting on any semantics...

  10. Enabling Open Innovation: Lessons from Haier

    Institute of Scientific and Technical Information of China (English)

    Arie Y. Lewin; Liisa Välikangas; Jin Chen

    2017-01-01

    Open innovation has become a dominant innovation paradigm. However, the actual adoption of open innovation organizational designs and practices remains elusive, and ongoing examples of large companies practicing open innovation in mature industries or beyond R&D activities are rare. Despite the continuing interest in open innovation and the surging research on the topic, not much is documented about how, in particular, large companies interpret and implement open innovation or develop and sustain an innovation-enabling culture. This paper reports on a study of Haier's adoption of six radical innovations as it implements an open innovation organization over a period of seven years. The study is unique in that the cases reveal how open innovation is enabled by the socially enabling mechanisms developed under Chairman Ruimin Zhang's leadership. These varied enabling mechanisms open the organization to serendipity at every level, from the bottom up to suppliers. Most importantly, the mechanisms imprint and sustain an open innovation culture recognized as important, yet often left unarticulated in terms of how it is practiced, in the prior literature. The paper contributes to and highlights the centrality of socially enabling mechanisms underlying an organization's innovation absorptive capacity.

  11. Nanomaterial-Enabled Wearable Sensors for Healthcare.

    Science.gov (United States)

    Yao, Shanshan; Swetha, Puchakayala; Zhu, Yong

    2018-01-01

    Highly sensitive wearable sensors that can be conformably attached to human skin or integrated with textiles to monitor the physiological parameters of human body or the surrounding environment have garnered tremendous interest. Owing to the large surface area and outstanding material properties, nanomaterials are promising building blocks for wearable sensors. Recent advances in the nanomaterial-enabled wearable sensors including temperature, electrophysiological, strain, tactile, electrochemical, and environmental sensors are presented in this review. Integration of multiple sensors for multimodal sensing and integration with other components into wearable systems are summarized. Representative applications of nanomaterial-enabled wearable sensors for healthcare, including continuous health monitoring, daily and sports activity tracking, and multifunctional electronic skin are highlighted. Finally, challenges, opportunities, and future perspectives in the field of nanomaterial-enabled wearable sensors are discussed. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. The ENABLER - Based on proven NERVA technology

    International Nuclear Information System (INIS)

    Livingston, J.M.; Pierce, B.L.

    1991-01-01

    The ENABLER reactor for use in a nuclear thermal propulsion engine uses the technology developed in the NERVA/Rover program, updated to incorporate advances in the technology. Using composite fuel, higher power densities per fuel element, improved radiation resistant control components and the advancements in use of carbon-carbon materials; the ENABLER can provide a specific impulse of 925 seconds, an engine thrust to weight (excluding reactor shield) approaching five, an improved initial mass in low Earth orbit and a consequent reduction in launch costs and logistics problems. This paper describes the 75,000 lbs thrust ENABLER design which is a low cost, low risk approach to meeting tomorrow's space propulsion needs

  13. The ENABLER - Based on proven NERVA technology

    Science.gov (United States)

    Livingston, Julie M.; Pierce, Bill L.

    The ENABLER reactor for use in a nuclear thermal propulsion engine uses the technology developed in the NERVA/Rover program, updated to incorporate advances in the technology. Using composite fuel, higher power densities per fuel element, improved radiation resistant control components and the advancements in use of carbon-carbon materials; the ENABLER can provide a specific impulse of 925 seconds, an engine thrust to weight (excluding reactor shield) approaching five, an improved initial mass in low Earth orbit and a consequent reduction in launch costs and logistics problems. This paper describes the 75,000 lbs thrust ENABLER design which is a low cost, low risk approach to meeting tomorrow's space propulsion needs.

  14. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  15. Origami-enabled deformable silicon solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Tang, Rui; Huang, Hai; Liang, Hanshuang; Liang, Mengbing [School of Electrical, Computer and Energy Engineering, Arizona State University, Tempe, Arizona 85287 (United States); Tu, Hongen; Xu, Yong [Electrical and Computer Engineering, Wayne State University, 5050 Anthony Wayne Dr., Detroit, Michigan 48202 (United States); Song, Zeming; Jiang, Hanqing, E-mail: hanqing.jiang@asu.edu [School for Engineering of Matter, Transport and Energy, Arizona State University, Tempe, Arizona 85287 (United States); Yu, Hongyu, E-mail: hongyu.yu@asu.edu [School of Electrical, Computer and Energy Engineering, Arizona State University, Tempe, Arizona 85287 (United States); School of Earth and Space Exploration, Arizona State University, Tempe, Arizona 85287 (United States)

    2014-02-24

    Deformable electronics have found various applications and elastomeric materials have been widely used to reach flexibility and stretchability. In this Letter, we report an alternative approach to enable deformability through origami. In this approach, the deformability is achieved through folding and unfolding at the creases while the functional devices do not experience strain. We have demonstrated an example of origami-enabled silicon solar cells and showed that this solar cell can reach up to 644% areal compactness while maintaining reasonable good performance upon cyclic folding/unfolding. This approach opens an alternative direction of producing flexible, stretchable, and deformable electronics.

  16. Origami-enabled deformable silicon solar cells

    International Nuclear Information System (INIS)

    Tang, Rui; Huang, Hai; Liang, Hanshuang; Liang, Mengbing; Tu, Hongen; Xu, Yong; Song, Zeming; Jiang, Hanqing; Yu, Hongyu

    2014-01-01

    Deformable electronics have found various applications and elastomeric materials have been widely used to reach flexibility and stretchability. In this Letter, we report an alternative approach to enable deformability through origami. In this approach, the deformability is achieved through folding and unfolding at the creases while the functional devices do not experience strain. We have demonstrated an example of origami-enabled silicon solar cells and showed that this solar cell can reach up to 644% areal compactness while maintaining reasonable good performance upon cyclic folding/unfolding. This approach opens an alternative direction of producing flexible, stretchable, and deformable electronics

  17. Enabling Routes as Context in Mobile Services

    DEFF Research Database (Denmark)

    Brilingaite, Agne; Jensen, Christian Søndergaard; Zokaite, Nora

    2004-01-01

    With the continuing advances in wireless communications, geo-positioning, and portable electronics, an infrastructure is emerging that enables the delivery of on-line, location-enabled services to very large numbers of mobile users. A typical usage situation for mobile services is one characterized by a small screen and no keyboard, and by the service being only a secondary focus of the user. It is therefore particularly important to deliver the "right" information and service at the right time, with as little user interaction as possible. This may be achieved by making services context aware. Mobile...

  18. Nanotechnologv Enabled Biological and Chemical Sensors

    Science.gov (United States)

    Koehne, Jessica; Meyyappan, M.

    2011-01-01

    Nanotechnology is an enabling technology that will impact almost all economic sectors; one of the most important, and one with great potential, is the health/medical sector: nanomaterials for drug delivery, early warning sensors, implantable devices, and artificial parts with improved characteristics. Carbon nanotubes and nanofibers show promise for use in sensor development, electrodes and other biomedical applications.

  19. Action Learning: Avoiding Conflict or Enabling Action

    Science.gov (United States)

    Corley, Aileen; Thorne, Ann

    2006-01-01

    Action learning is based on the premise that action and learning are inextricably entwined and it is this potential, to enable action, which has contributed to the growth of action learning within education and management development programmes. However, has this growth in action learning led to an evolution or a dilution of Revans' classical…

  20. Creating an Economically Enabling and Competitive Business ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Creating an Economically Enabling and Competitive Business Environment in the West Bank and Gaza Strip. The prospect of indefinite Israeli occupation of the Palestinian territories, and their extreme dependence on foreign assistance and Israeli-controlled customs revenues, had led to the conclusion that the Palestinian ...

  1. Creating an Economically Enabling and Competitive Business ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Creating an Economically Enabling and Competitive Business Environment in the West Bank and Gaza Strip. The prospect of indefinite Israeli occupation of the ... Impact of implementing the Palestinian banking law on the performance of the private sector [Arabic language]. Documents. Impact of the commercial agents law ...

  2. Enabling DRM-preserving Digital content Redistribution

    NARCIS (Netherlands)

    Krishnan Nair, S.; Popescu, B.C.; Gamage, C.D.; Crispo, B.; Tanenbaum, A.S.

    2005-01-01

    Traditionally, the process of online digital content distribution has involved a limited number of centralised distributors selling protected contents and licenses authorising the use of these contents to consumers. In this paper, we extend this model by introducing a security scheme that enables

  3. Enablements and constraints to school leadership practice

    African Journals Online (AJOL)

    There are many schools in developing countries which, despite the challenges they face, defy the odds and continue to perform at exceptionally high levels. We cast our gaze on one of these resilient schools in South Africa, and sought to learn about the leadership practices prevalent in this school and the enablements and ...

  4. Sustainable Venture Capital Investments: An Enabler Investigation

    Directory of Open Access Journals (Sweden)

    Elena Antarciuc

    2018-04-01

    Full Text Available Investing in sustainable projects can help tackle the current sustainability challenges. Venture capital investments can contribute significantly to the growth of sustainable start-ups. Sustainable venture capital (SVC) research is just emerging. This paper identifies enablers for sustainable venture capital investments in Saudi Arabia, taking into account different stakeholders and firms' tangible and intangible resources. Using perspectives from venture capital experts in Saudi Arabia and the grey-based Decision-Making Trial and Evaluation Laboratory (DEMATEL) method, this study pinpoints the most critical enablers and investigates their causal and effect interconnections. The methodological process consists of reviewing the SVC literature and consulting the experts to identify the SVC enablers, creating a questionnaire, acquiring the answers from four experts, analyzing the data with grey-based DEMATEL and performing a sensitivity analysis. The government's use of international standards, policies and regulations for sustainable investments, the commitment of the venture capitalists to sustainability and their deep understanding of sustainable business models are the most influential enablers. The paper concludes with implications for different actors, limitations and prospective directions for sustainable venture capital research.
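
    For readers unfamiliar with DEMATEL, the sketch below runs the crisp core of the method on a toy direct-influence matrix: normalization, the total-relation matrix, and the prominence and relation scores that separate cause factors from effect factors. The grey-number extension used in the paper and the actual expert judgements are not reproduced.

    # Core DEMATEL arithmetic on a toy 4x4 direct-influence matrix (0-4 scale).
    import numpy as np

    A = np.array([[0, 3, 2, 1],
                  [1, 0, 3, 2],
                  [2, 1, 0, 3],
                  [1, 2, 1, 0]], dtype=float)   # toy expert judgements

    N = A / max(A.sum(axis=1).max(), A.sum(axis=0).max())   # normalize
    T = N @ np.linalg.inv(np.eye(len(A)) - N)               # total-relation matrix

    D = T.sum(axis=1)   # influence given
    R = T.sum(axis=0)   # influence received
    print("prominence (D+R):", np.round(D + R, 2))
    print("relation   (D-R):", np.round(D - R, 2))   # positive => cause factor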

  5. 75 FR 13235 - IP-Enabled Services

    Science.gov (United States)

    2010-03-19

    ... FEDERAL COMMUNICATIONS COMMISSION 47 CFR Part 63 [WC Docket No. 04-36; FCC 09-40] IP-Enabled Services AGENCY: Federal Communications Commission ACTION: Final rule; announcement of effective date... Internet Protocol (VoIP) service the discontinuance obligations that apply to domestic non-dominant...

  6. Extreme Networks' 10-Gigabit Ethernet enables

    CERN Multimedia

    2002-01-01

    " Extreme Networks, Inc.'s 10-Gigabit switching platform enabled researchers to transfer one Terabyte of information from Vancouver to Geneva across a single network hop, the world's first large-scale, end-to-end transfer of its kind" (1/2 page).

  7. Computing with impure numbers - Automatic consistency checking and units conversion using computer algebra

    Science.gov (United States)

    Stoutemyer, D. R.

    1977-01-01

    The computer algebra language MACSYMA enables the programmer to include symbolic physical units in computer calculations, and features automatic detection of dimensionally-inhomogeneous formulas and conversion of inconsistent units in a dimensionally homogeneous formula. Some examples illustrate these features.
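
    The same dimensional bookkeeping can be illustrated with the Python pint library (an assumption for illustration; MACSYMA itself is not shown): quantities carry their units, conversions are automatic, and a dimensionally inhomogeneous expression raises an error instead of passing silently.

    # Unit-aware arithmetic: automatic conversion and consistency checking.
    import pint

    ureg = pint.UnitRegistry()

    distance = 150 * ureg.kilometer
    time = 2 * ureg.hour
    speed = distance / time
    print(speed.to(ureg.meter / ureg.second))   # automatic unit conversion

    try:
        bad = distance + time                   # kilometres plus hours is meaningless
    except pint.DimensionalityError as err:
        print("inconsistent units detected:", err)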

  8. Geometric Constructions with the Computer.

    Science.gov (United States)

    Chuan, Jen-chung

    The computer can be used as a tool to represent and communicate geometric knowledge. With the appropriate software, a geometric diagram can be manipulated through a series of animation that offers more than one particular snapshot as shown in a traditional mathematical text. Geometric constructions with the computer enable the learner to see and…

  9. Computer tomography in otolaryngology

    International Nuclear Information System (INIS)

    Gradzki, J.

    1981-01-01

    The principles of design and operation of computer tomography, which has also been applied to the diagnosis of nose, ear and throat diseases, are discussed. Computer tomography makes possible visualization of the structures of the nose, nasal sinuses and facial skeleton in transverse and coronal planes. The method enables an accurate evaluation of the position and size of neoplasms in these regions and differentiation of inflammatory exudates against malignant masses. In otology computer tomography is used particularly in the diagnosis of pontocerebellar angle tumours and otogenic brain abscesses. Computer tomography of the larynx and pharynx provides new diagnostic data owing to the possibility of obtaining transverse sections and visualization of cartilage. Computer tomograms of some cases are presented. (author)

  10. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justifies the cost. For imaging in the abdomen, a scanner with a rapid speed is preferred because peristalsis, involuntary motion of the diaphram, and even cardiac motion are present and can significantly degrade image quality. When contrast media are used in imaging to demonstrate scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region of the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions) are useful tools
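
    The region-of-interest quantitation mentioned above amounts to summing pixels inside a mask. The sketch below computes the area and mean density of a synthetic lesion in a synthetic slice; the pixel spacing, array size and density values are invented for illustration.

    # Region-of-interest quantitation sketch: a CT slice as a 2-D array of
    # density values plus a boolean ROI mask (a circle standing in for an
    # outline drawn with the cursor).
    import numpy as np

    pixel_spacing_mm = 0.5
    slice_hu = np.full((256, 256), -50.0)                 # synthetic background tissue
    yy, xx = np.mgrid[:256, :256]
    lesion = (yy - 128) ** 2 + (xx - 128) ** 2 < 20 ** 2
    slice_hu[lesion] = 40.0                               # synthetic lesion density

    roi = (yy - 128) ** 2 + (xx - 128) ** 2 < 25 ** 2     # operator-drawn ROI

    area_mm2 = roi.sum() * pixel_spacing_mm ** 2
    mean_hu = slice_hu[roi].mean()
    print(f"ROI area: {area_mm2:.1f} mm^2, mean density: {mean_hu:.1f} HU")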

  11. Enabling the First Ever Measurement of Coherent Neutrino Scattering Through Background Neutron Measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Reyna, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Betty, Rita [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-03-01

    Using High Performance Computing to Examine the Processes of Neurogenesis Underlying Pattern Separation/Completion of Episodic Information - Sandia researchers developed novel methods and metrics for studying the computational function of neurogenesis, thus generating substantial impact to the neuroscience and neural computing communities. This work could benefit applications in machine learning and other analysis activities. The purpose of this project was to computationally model the impact of neural population dynamics within the neurobiological memory system in order to examine how subareas in the brain enable pattern separation and completion of information in memory across time as associated experiences.

  12. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  13. Enabling ICU patients to die at home.

    Science.gov (United States)

    Battle, Emma; Bates, Lucy; Liderth, Emma; Jones, Samantha; Sheen, Sheryl; Ginty, Andrew; Northmore, Melanie

    2014-10-07

    There is often an overlap between intensive care medicine and palliative medicine. When all curative treatment options have been explored, keeping the patient comfortable and free from pain is the main concern for healthcare practitioners. Patient autonomy in end of life decisions has not been encouraged in the intensive care unit (ICU), until now, because of its specialised and technical nature. Staff at the Royal Bolton Hospital have broken down the barriers to enabling ICU patients to die in their own homes, and have developed a system of collaborative working that can help to fulfil a patient's final wish to go home. This article describes how ICU staff developed a process that enabled two ventilated patients to be transferred home for end of life care.

  14. Femtosecond laser enabled keratoplasty for advanced keratoconus

    Directory of Open Access Journals (Sweden)

    Yathish Shivanna

    2013-01-01

    Full Text Available Purpose: To assess the efficacy and advantages of femtosecond laser enabled keratoplasty (FLEK) over conventional penetrating keratoplasty (PKP) in advanced keratoconus. Materials and Methods: Detailed review of literature of published randomized controlled trials of operative techniques in PKP and FLEK. Results: Fifteen studies were identified, analyzed, and compared with our outcome. FLEK was found to have a better outcome in view of better and earlier stabilization of uncorrected visual acuity (UCVA) and best corrected visual acuity (BCVA), and better refractive outcomes with low astigmatism as compared with conventional PKP. Wound healing was also noticed to be earlier, enabling early suture removal in FLEK. Conclusions: Studies relating to FLEK have shown better results than conventional PKP; however, further studies are needed to assess the safety and intraoperative complications of the procedure.

  15. Enablers & Barriers for Realizing Modularity Benefits

    DEFF Research Database (Denmark)

    Storbjerg, Simon Haahr; Brunø, Thomas Ditlev; Thyssen, Jesper

    2012-01-01

    Although modularization is becoming both a well-described domain in academia and a broadly applied concept in business, many of today's firms still struggle to realize the promised benefits of this approach. Managing modularization is a complex matter and, in spite of this, a topic that has received far less attention compared to the theories and methods concerning modularization of technical systems. Harvesting the full potential of modularization, particularly in relation to product development agility, depends on more than an optimal architecture. Key enablers in this context are the organizational and systems related aspects. Recognizing the need for guidance to realize the benefits of modularity, the purpose of this study is, through a literature study and a case study, to improve the insight into the organizational and systems related enablers and barriers with regard to obtaining the full...

  16. Enabling Sustainable Improvement in IT Entrepreneurship

    Directory of Open Access Journals (Sweden)

    Paul E. Renaud

    2013-06-01

    Full Text Available Firms must embrace processes that enable the information technology (IT function to become a strategic partner to the business functions it serves. Process ambidexterity is a way for processes to be augmented to improve alignment and adaptability to new markets and technologies. By applying the principles of process ambidexterity, the key elements required for sustainable change within the capabilities that comprise the IT function of the firm are identified. Furthermore, the scope and depth of the dysfunction that is widespread across large firms that depend upon IT are outlined to provide a contextual basis for presenting a solution framework to address sustainable change. This framework for sustainable change is of primary benefit to IT executives seeking to systematically transform the IT function and enable IT entrepreneurship.

  17. Enabling Routes as Context in Mobile Services

    DEFF Research Database (Denmark)

    Brilingaite, Agne; Jensen, Christian Søndergaard; Zokaite, Nora

    With the continuing advances in wireless communications, geo-positioning, and portable electronics, an infrastructure is emerging that enables the delivery of on-line, location-enabled services to very large numbers of mobile users. A typical usage situation for mobile services is one characterized by a small screen and no keyboard, and by the service being only a secondary focus of the user. Under such circumstances, it is particularly important to deliver the "right" information and service at the right time, with as little user interaction as possible. This may be achieved by making services context-aware. The paper describes a component that captures and accumulates the routes of a user along with their usage patterns and that makes the routes available to services. Experiences from using the component on logs of GPS positions acquired from vehicles traveling within a real road network are reported.

  18. Ethics case reflection sessions: Enablers and barriers.

    Science.gov (United States)

    Bartholdson, Cecilia; Molewijk, Bert; Lützén, Kim; Blomgren, Klas; Pergert, Pernilla

    2018-03-01

    In previous research on ethics case reflection (ECR) sessions about specific cases, healthcare professionals in childhood cancer care were clarifying their perspectives on the ethical issue to resolve their main concern of consolidating care. When perspectives were clarified, consequences in the team included 'increased understanding', 'group strengthening' and 'decision grounding'. Additional analysis of the data was needed on conditions that could contribute to the quality of ECR sessions. The aim of this study was to explore conditions for clarifying perspectives during ECR sessions. Data were collected from observations and interviews and the results emerged from an inductive analysis using grounded theory. Participants and research context: Six observations during ECR sessions and 10 interviews were performed with healthcare professionals working in childhood cancer care and advanced paediatric homecare. Ethical considerations: The study was approved by a regional ethical review board. Participants were informed about their voluntary involvement and that they could withdraw their participation without explaining why. Two categories emerged: organizational enablers and barriers and team-related enablers and barriers. Organizational enablers and barriers included the following sub-categories: the timing of the ECR session, the structure during the ECR session and the climate during the ECR session. Sub-categories to team-related enablers and barriers were identified as space for inter-professional perspectives, varying levels of ethical skills and space for the patient's and the family's perspectives. Space for inter-professional perspectives included the dominance of a particular perspective that can result from hierarchical positions. The medical perspective is relevant for understanding the child's situation but should not dominate the ethical reflection. Conditions for ECR sessions have been explored and the new knowledge can be used when training

  19. IT Enabled Agility in Organizational Ambidexterity

    OpenAIRE

    Röder, Nina; Schermann, Michael; Krcmar, Helmut

    2015-01-01

    The aim of ambidextrous organizations is to balance exploratory and exploitative learning concepts. They innovate through experiments and research, and capture the value of innovations through refinement and continuous improvement. In this paper, we study the relationship of organizational ambidexterity and IT enabled agility. Based on a case study with a German car manufacturer we find that (1) entrepreneurial agility impedes exploitative concepts, (2) adaptive agility impedes exploratory co...

  20. Naval Science & Technology: Enabling the Future Force

    Science.gov (United States)

    2013-04-01

    Briefing slide content (recovered from presentation graphics): historical examples of disruptive technologies include laser cooling, spintronics, the first U.S. intelligence satellite (GRAB), semiconductors (GaAs, GaN, SiC) and GPS. Innovative Naval Prototypes (5-10 year horizon) are characterized as innovative and game-changing, approved by the Corporate Board, and delivering prototypes; examples include the Free Electron Laser, Integrated Topside, EM Railgun, Sea Base Enablers, Tactical Satellite, Large Displacement UUV, AACUS, and directed energy.

  1. Enabling technologies for the prassi autonomous robot

    Energy Technology Data Exchange (ETDEWEB)

    Taraglio, S.; Nanni, V. [ENEA, Robotics and Information Technology Division, Rome (Italy)

    2001-07-01

    In this book are summarised some of the results of the PRASSI project as presented by the different partners of the effort. PRASSI is an acronym which stands for Autonomous Robotic Platform for the Security and Surveillance of plants; the Italian for it is 'Piattaforma Robotica per la Sorveglianza e Sicurezza d'Impianto'. This project has been funded by the Italian Ministry for the Education, the University and the Research (MIUR) in the framework of the project High Performance Computing Applied to Robotics (Calcolo Parallelo con Applicazioni alla Robotica) of the law 95/1995. The idea behind such an initiative is that of fostering the knowledge and possibly the use of high performance computing in the research and industrial community. In other words, robotic scientists are always simplifying their algorithms or using particular approaches (e.g. soft computing) in order to use standard processors for difficult sensorial data processing; well, what if an embedded parallel computer were available, with at least an order of magnitude more computing power?

  3. Web-enabling technologies for the factory floor: a web-enabling strategy for emanufacturing

    Science.gov (United States)

    Velez, Ricardo; Lastra, Jose L. M.; Tuokko, Reijo O.

    2001-10-01

    This paper is intended to address the different technologies available for Web-enabling of the factory floor. It gives an overview of the importance of Web-enabling the factory floor for applying the concepts of flexible and intelligent manufacturing in conjunction with e-commerce. As a last section, it attempts to define a Web-enabling strategy for application in eManufacturing. This is done within the scope of the electronics manufacturing industry, so every application, technology or related matter is presented within that scope.

  4. Architectural Strategies for Enabling Data-Driven Science at Scale

    Science.gov (United States)

    Crichton, D. J.; Law, E. S.; Doyle, R. J.; Little, M. M.

    2017-12-01

    The analysis of large data collections from NASA or other agencies is often executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Alternatively, data are hauled to large computational environments that provide centralized data analysis via traditional High Performance Computing (HPC). Scientific data archives, however, are not only growing massive, but are also becoming highly distributed. Neither traditional approach provides a good solution for optimizing analysis into the future. Assumptions across the NASA mission and science data lifecycle, which historically assume that all data can be collected, transmitted, processed, and archived, will not scale as more capable instruments stress legacy-based systems. New paradigms are needed to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural and analytical choices are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections, from point of collection (e.g., onboard) to analysis and decision support. The most effective approach to analyzing a distributed set of massive data may involve some exploration and iteration, putting a premium on the flexibility afforded by the architectural framework. The framework should enable scientist users to assemble workflows efficiently, manage the uncertainties related to data analysis and inference, and optimize deep-dive analytics to enhance scalability. In many cases, this "data ecosystem" needs to be able to integrate multiple observing assets, ground environments, archives, and analytics, evolving from stewardship of measurements of data to using computational methodologies to better derive insight from the data that may be fused with other sets of data. This presentation will discuss

  5. ARIES: Enabling Visual Exploration and Organization of Art Image Collections.

    Science.gov (United States)

    Crissaff, Lhaylla; Wood Ruby, Louisa; Deutch, Samantha; DuBois, R Luke; Fekete, Jean-Daniel; Freire, Juliana; Silva, Claudio

    2018-01-01

    Art historians have traditionally used physical light boxes to prepare exhibits or curate collections. On a light box, they can place slides or printed images, move the images around at will, group them as desired, and visually compare them. The transition to digital images has rendered this workflow obsolete. Now, art historians lack well-designed, unified interactive software tools that effectively support the operations they perform with physical light boxes. To address this problem, we designed ARIES (ARt Image Exploration Space), an interactive image manipulation system that enables the exploration and organization of fine digital art. The system allows images to be compared in multiple ways, offering dynamic overlays analogous to a physical light box, and supporting advanced image comparisons and feature-matching functions, available through computational image processing. We demonstrate the effectiveness of our system to support art historians' tasks through real use cases.

  6. Digital watermarking opportunities enabled by mobile media proliferation

    Science.gov (United States)

    Modro, Sierra; Sharma, Ravi K.

    2009-02-01

    Consumer usages of mobile devices and electronic media are changing. Mobile devices now include increased computational capabilities, mobile broadband access, better integrated sensors, and higher resolution screens. These enhanced features are driving increased consumption of media such as images, maps, e-books, audio, video, and games. As users become more accustomed to using mobile devices for media, opportunities arise for new digital watermarking usage models. For example, transient media, like images being displayed on screens, could be watermarked to provide a link between mobile devices. Applications based on these emerging usage models utilizing watermarking can provide richer user experiences and drive increased media consumption. We describe the enabling factors and highlight a few of the usage models and new opportunities. We also outline how the new opportunities are driving further innovation in watermarking technologies. We discuss challenges in market adoption of applications based on these usage models.

  7. Raexplore: Enabling Rapid, Automated Architecture Exploration for Full Applications

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yao [Argonne National Lab. (ANL), Argonne, IL (United States); Balaprakash, Prasanna [Argonne National Lab. (ANL), Argonne, IL (United States); Meng, Jiayuan [Argonne National Lab. (ANL), Argonne, IL (United States); Morozov, Vitali [Argonne National Lab. (ANL), Argonne, IL (United States); Parker, Scott [Argonne National Lab. (ANL), Argonne, IL (United States); Kumaran, Kalyan [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-12-01

    We present Raexplore, a performance modeling framework for architecture exploration. Raexplore enables rapid, automated, and systematic search of architecture design space by combining hardware counter-based performance characterization and analytical performance modeling. We demonstrate Raexplore for two recent manycore processors, the IBM Blue Gene/Q compute chip and the Intel Xeon Phi, targeting a set of scientific applications. Our framework is able to capture complex interactions between architectural components including instruction pipeline, cache, and memory, and to achieve a 3–22% error for same-architecture and cross-architecture performance predictions. Furthermore, we apply our framework to assess the two processors, and discover and evaluate a list of architectural scaling options for future processor designs.
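
    As an illustration of how counter-derived application characteristics can feed an analytical model, the sketch below uses a simple roofline-style bound; the formula, parameter names and numbers are illustrative assumptions, not the Raexplore model itself.

      # Minimal roofline-style sketch of analytical performance projection from
      # counter-derived application characteristics. The model and numbers are
      # illustrative assumptions, not the actual Raexplore framework.

      def projected_time_seconds(flops, bytes_moved, peak_gflops, mem_bw_gbs):
          """Estimate runtime as the slower of compute-bound and memory-bound times."""
          compute_time = flops / (peak_gflops * 1e9)
          memory_time = bytes_moved / (mem_bw_gbs * 1e9)
          return max(compute_time, memory_time)

      # Application characteristics, e.g. derived from hardware performance counters.
      app = {"flops": 4.0e12, "bytes_moved": 1.5e12}

      # Candidate architectures in a hypothetical design-space search.
      architectures = {
          "baseline":      {"peak_gflops": 200.0, "mem_bw_gbs": 40.0},
          "wider_simd":    {"peak_gflops": 400.0, "mem_bw_gbs": 40.0},
          "faster_memory": {"peak_gflops": 200.0, "mem_bw_gbs": 80.0},
      }

      for name, arch in architectures.items():
          t = projected_time_seconds(app["flops"], app["bytes_moved"], **arch)
          print(f"{name:14s} projected time: {t:.1f} s")

    Raexplore's actual model additionally captures interactions between the instruction pipeline, cache, and memory rather than this two-term bound.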

  8. Network Enabled - Unresolved Residual Analysis and Learning (NEURAL)

    Science.gov (United States)

    Temple, D.; Poole, M.; Camp, M.

    Since the advent of modern computational capacity, machine learning algorithms and techniques have served as a method through which to solve numerous challenging problems. However, for machine learning methods to be effective and robust, sufficient data sets must be available; specifically, in the space domain, these are generally difficult to acquire. Rapidly evolving commercial space-situational awareness companies boast the capability to collect hundreds of thousands of nightly observations of resident space objects (RSOs) using a ground-based optical sensor network. This provides the ability to maintain custody of and characterize thousands of objects persistently. With this information available, novel deep learning techniques can be implemented. The technique discussed in this paper utilizes deep learning to make distinctions between nightly data collects with and without maneuvers. Implementation of these techniques will allow the data collected from optical ground-based networks to enable well-informed and timely space domain decision making.

  9. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM - connections to GSI and IPP, preparation for Datex-P). (orig.)

  10. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  11. Identifying enabling management practices for employee engagement

    Directory of Open Access Journals (Sweden)

    Marius Joubert

    2011-12-01

    Full Text Available Orientation: A currently emerging viewpoint is that today's management practices no longer add value to organisations. The focus of this article is to conduct a systematic review of the scholarly literature on management practices that could be related to employee engagement. Research purpose: This study searched for evidence in support of the notion of a management value chain, and enabling management practices within each value chain component that could relate to employee engagement. Motivation for the study: An alternative management value chain model could contribute towards a better understanding of which management practices may potentially impact employee engagement. Research design, approach, and method: This is a non-empirical (theoretical) study, based on a systematic, in-depth literature review to identify the key management components and enabling practices within this proposed management value chain. Scholarly research databases were sourced for relevant peer-reviewed research conducted since 1990, not excluding important contributions prior to 1990. The literature was systematically searched, selected, studied, and contextualized within this study. Main findings: Support was found for the notion of a management value chain, for enabling management practices within each proposed management value chain component, and it was also established that these management practices indeed have an impact on employee engagement. Practical/managerial implications: The possibility that management work can be presented as a generic management value chain allows managers to approach engaging management practices more systematically. Contribution/value-add: This study highlights the importance of some management practices that have never been seen as part of management work.

  12. A wireless sensor enabled by wireless power.

    Science.gov (United States)

    Lee, Da-Sheng; Liu, Yu-Hong; Lin, Chii-Ruey

    2012-11-22

    Through harvesting energy by wireless charging and delivering data by wireless communication, this study proposes the concept of a wireless sensor enabled by wireless power (WPWS) and reports the fabrication of a prototype for functional tests. One WPWS node consists of a wireless power module and a sensor module with different chip-type sensors. Its main feature is the dual antenna structure. Following RFID system architecture, a power harvesting antenna was designed to gather power from a standard reader working in the 915 MHz band. Referring to the Modbus protocol, the other wireless communication antenna was integrated on a node to send sensor data in parallel. The dual antenna structure integrates both the advantages of an RFID system and a wireless sensor. Using a standard UHF RFID reader, WPWS can be enabled in a distributed area with a diameter up to 4 m. Working status is similar to that of a passive tag, except that a tag can only be queried statically, while the WPWS can send dynamic data from the sensors. The function is the same as a wireless sensor node. Different WPWSs equipped with temperature and humidity, optical and airflow velocity sensors are tested in this study. All sensors can send back detection data within 8 s. The accuracy is within 8% deviation compared with laboratory equipment. A wireless sensor network enabled by wireless power should be a totally wireless sensor network using WPWS. However, distributed WPWSs can only form a star topology, the simplest topology for constructing a sensor network. Because of shielding effects, it is difficult to apply other complex topologies. Despite this limitation, WPWS still can be used to extend sensor network applications in hazardous environments. Further research is needed to improve WPWS to realize a totally wireless sensor network.

  13. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  14. Mesh Network Architecture for Enabling Inter-Spacecraft Communication

    Science.gov (United States)

    Becker, Christopher; Merrill, Garrick

    2017-01-01

    To enable communication between spacecraft operating in a formation or small constellation, a mesh network architecture was developed and tested using a time division multiple access (TDMA) communication scheme. The network is designed to allow for the exchange of telemetry and other data between spacecraft to enable collaboration between small spacecraft. The system uses a peer-to-peer topology with no central router, so that it does not have a single point of failure. The mesh network is dynamically configurable to allow for addition and subtraction of new spacecraft into the communication network. Flight testing was performed using an unmanned aerial system (UAS) formation acting as a spacecraft analogue and providing a stressing environment to prove mesh network performance. The mesh network was primarily devised to provide low latency, high frequency communication but is flexible and can also be configured to provide higher bandwidth for applications desiring high data throughput. The network includes a relay functionality that extends the maximum range between spacecraft in the network by relaying data from node to node. The mesh network control is implemented completely in software making it hardware agnostic, thereby allowing it to function with a wide variety of existing radios and computing platforms.
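
    The TDMA scheme can be pictured as a repeating frame of transmit slots, one per node; the sketch below uses hypothetical node identifiers and slot lengths and is not the flight software.

      # Minimal sketch of a TDMA slot schedule for a small spacecraft mesh network.
      # Frame length, slot duration and node IDs are hypothetical assumptions.
      from dataclasses import dataclass

      @dataclass
      class TdmaSchedule:
          node_ids: list          # all nodes participating in the mesh
          slot_duration_s: float  # length of one transmit slot

          def frame_duration(self):
              return len(self.node_ids) * self.slot_duration_s

          def owner_of_slot(self, t):
              """Return the node allowed to transmit at time t (seconds since epoch)."""
              slot_index = int(t / self.slot_duration_s) % len(self.node_ids)
              return self.node_ids[slot_index]

      schedule = TdmaSchedule(node_ids=["sc-1", "sc-2", "sc-3", "sc-4"], slot_duration_s=0.1)
      print("frame length:", schedule.frame_duration(), "s")
      print("transmitter at t=0.25 s:", schedule.owner_of_slot(0.25))

    Adding a spacecraft to the network then amounts to extending node_ids, which mirrors the dynamically configurable membership described above.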

  15. Calculations enable optimum design of magnetic brake

    Science.gov (United States)

    Kosmahl, H. G.

    1966-01-01

    Mathematical analysis and computations determine optimum magnetic coil configurations for a magnetic brake which controllably decelerates a free falling load to a soft stop. Calculations on unconventionally wound coils determine the required parameters for the desired deceleration with minimum electrical energy supplied to the stationary coil.

  16. Software-Enabled Modular Instrumentation Systems

    NARCIS (Netherlands)

    Soijer, M.W.

    2003-01-01

    Like most other types of instrumentation systems, flight test instrumentation is not produced in series; its development is a one-time achievement by a test department. With the introduction of powerful digital computers, instrumentation systems have included data analysis tasks that were previously

  17. Introduction to computer networking

    CERN Document Server

    Robertazzi, Thomas G

    2017-01-01

    This book gives a broad look at both fundamental networking technology and new areas that support it and use it. It is a concise introduction to the most prominent, recent technological topics in computer networking. Topics include network technology such as wired and wireless networks, enabling technologies such as data centers, software defined networking, cloud and grid computing and applications such as networks on chips, space networking and network security. The accessible writing style and non-mathematical treatment make this a useful book for the student, network and communications engineer, computer scientist and IT professional. • Features a concise, accessible treatment of computer networking, focusing on new technological topics; • Provides non-mathematical introduction to networks in their most common forms today; • Includes new developments in switching, optical networks, WiFi, Bluetooth, LTE, 5G, and quantum cryptography.

  18. Physicist or computer specialist?

    Energy Technology Data Exchange (ETDEWEB)

    Clifton, J S [University College Hospital, London (United Kingdom)

    1966-06-15

    Since to most clinicians physical and computer science are two of the great mysteries of the world, the physicist in a hospital is expected by clinicians to be fully conversant with, and competent to make profound pronouncements on, all methods of computing, specific computing problems, and the suitability of computing machinery ranging from desk calculators to Atlas. This is not surprising since the proportion of the syllabus devoted to physics and mathematics in an M.B. degree is indeed meagre, and the word 'computer' has been surrounded with an aura of mysticism which suggests that it is some fantastic piece of electronic gadgetry comprehensible only to a veritable genius. The clinician consequently turns to the only scientific colleague with whom he has direct contact - the medical physicist - and expects him to be an authority. The physicist is thus thrust, however unwillingly, into the forefront of the advance of computer assistance to scientific medicine. It is therefore essential for him to acquire sufficient knowledge of computing science to enable him to provide satisfactory answers for the clinicians' queries, to proffer more detailed advice as to programming, and to convince clinicians that the computer is really a 'simpleton' which can only add and subtract, and even that only under instruction.

  19. Thermodynamic theory of dislocation-enabled plasticity

    International Nuclear Information System (INIS)

    Langer, J. S.

    2017-01-01

    The thermodynamic theory of dislocation-enabled plasticity is based on two unconventional hypotheses. The first of these is that a system of dislocations, driven by external forces and irreversibly exchanging heat with its environment, must be characterized by a thermodynamically defined effective temperature that is not the same as the ordinary temperature. The second hypothesis is that the overwhelmingly dominant mechanism controlling plastic deformation is thermally activated depinning of entangled pairs of dislocations. This paper consists of a systematic reformulation of this theory followed by examples of its use in analyses of experimentally observed phenomena including strain hardening, grain-size (Hall-Petch) effects, yielding transitions, and adiabatic shear banding.

  20. PHM Enabled Autonomous Propellant Loading Operations

    Science.gov (United States)

    Walker, Mark; Figueroa, Fernando

    2017-01-01

    The utility of Prognostics and Health Management (PHM) software capability applied to Autonomous Operations (AO) remains an active research area within aerospace applications. The ability to gain insight into which assets and subsystems are functioning properly, along with the derivation of confident predictions concerning future ability, reliability, and availability, are important enablers for making sound mission planning decisions. When coupled with software that fully supports mission planning and execution, an integrated solution can be developed that leverages state assessment and estimation for the purposes of delivering autonomous operations. The authors have been applying this integrated, model-based approach to the autonomous loading of cryogenic spacecraft propellants at Kennedy Space Center.

  1. Blended Learning: enabling Higher Education Reform

    Directory of Open Access Journals (Sweden)

    Kathleen Matheos

    2018-01-01

    Full Text Available Blended learning research and practice have been areas of growth for two decades in Canada, with over 95% of Canadian higher education institutions involved in some form of blended learning. Despite strong evidence-based research and practice, blended learning has, for the most part, remained sidelined in Canadian universities. The article argues the need for blended learning to situate itself within the timely and crucial Higher Education Reform (HER) agenda. By aligning the affordances of blended learning with the components of HER, blended learning can clearly serve as an enabler for HER.

  2. Product Line Enabled Intelligent Mobile Middleware

    DEFF Research Database (Denmark)

    Zhang, Weishan; Kunz, Thomas; Hansen, Klaus Marius

    2007-01-01

    This paper presents a research project called PLIMM that focuses on user-centered application scenarios. PLIMM is designed based on software product line ideas, which make specialized customization and optimization possible for different purposes and hardware/software platforms. To enable intelligence, the middleware needs access to a range of context models. We model these contexts with OWL, focusing on user-centered concepts. The basic building block of PLIMM is the enhanced BDI agent, where OWL context ontology logic reasoning will add indirect beliefs to the belief sets. Our approach also addresses the handling...

  3. Framework for Enabling User-Generated Content

    OpenAIRE

    Nilsson, Karin H

    2012-01-01

    User-generated content, UGC, is a modern topic today and refers to media and creative works created by Internet users and posted on the Internet. More and more application developers want to offer sharing functionalities in their applications and on their websites. The alternatives for doing so today are to use UGC platforms' APIs, like Facebook and Twitter, to upload the content to that specific platform, or to implement the framework ShareKit that enables the user to share their content on mul...

  4. Enabling information sharing in a port

    DEFF Research Database (Denmark)

    Olesen, Peter Bjerg; Hvolby, Hans-Henrik; Dukovska-Popovska, Iskra

    2012-01-01

    Ports are integral parts of many supply chains and are as such a contributing factor to the overall efficiency of the supply chain. Ports are also dynamic entities where things change continuously. The dynamic nature of ports is also a problem when trying to optimise the utilisation of resources and ensure a low lead-time. Information sharing is a very important tool to reduce the effect of dynamism. This paper attempts to explain how information sharing is enabled in such an environment, and which considerations are relevant, both in regards to the information and required technology. The paper...

  5. Nanoarchitecture Control Enabled by Ionic Liquids

    Science.gov (United States)

    Murdoch, Heather A.; Limmer, Krista R.; Labukas, Joseph P.

    2017-04-01

    Ionic liquids have many advantages over traditional aqueous electrosynthesis for fabrication of functional nanoarchitectures, including enabling the integration of nanoparticles into traditional coatings, superhydrophobicity, nanofoams, and other hierarchical structures. Shape and size control through ionic liquid selection and processing conditions can synthesize nanoparticles and nanoarchitectures without the use of capping agents, surfactants, or templates that are often deleterious to the functionality of the resultant system. Here we give a brief overview of some recent and interesting applications of ionic liquids to the synthesis of nanoparticles and nanoarchitectures.

  6. Principles for enabling deep secondary design

    DEFF Research Database (Denmark)

    Pries-Heje, Jan; Hansen, Magnus Rotvit Perlt

    2017-01-01

    The paper explores deep secondary design by analyzing two cases where secondary designers fundamentally change functionality, content and technology complexity level. The first case redesigns a decision model for agile development in an insurance company; the second creates a contingency model for choosing project management tools and techniques in a hospital. Our analysis of the two cases leads to the identification of four principles of design implementation that primary designers can apply to enable secondary design and four corresponding design implementation principles that secondary designers themselves need to apply.

  7. Quantum computing on encrypted data.

    Science.gov (United States)

    Fisher, K A G; Broadbent, A; Shalm, L K; Yan, Z; Lavoie, J; Prevedel, R; Jennewein, T; Resch, K J

    2014-01-01

    The ability to perform computations on encrypted data is a powerful tool for protecting privacy. Recently, protocols to achieve this on classical computing systems have been found. Here, we present an efficient solution to the quantum analogue of this problem that enables arbitrary quantum computations to be carried out on encrypted quantum data. We prove that an untrusted server can implement a universal set of quantum gates on encrypted quantum bits (qubits) without learning any information about the inputs, while the client, knowing the decryption key, can easily decrypt the results of the computation. We experimentally demonstrate, using single photons and linear optics, the encryption and decryption scheme on a set of gates sufficient for arbitrary quantum computations. As our protocol requires few extra resources compared with other schemes it can be easily incorporated into the design of future quantum servers. These results will play a key role in enabling the development of secure distributed quantum systems.

  8. A diversity-oriented synthesis strategy enabling the combinatorial-type variation of macrocyclic peptidomimetic scaffolds

    Science.gov (United States)

    Isidro-Llobet, Albert; Hadje Georgiou, Kathy; Galloway, Warren R. J. D.; Giacomini, Elisa; Hansen, Mette R.; Méndez-Abt, Gabriela; Tan, Yaw Sing; Carro, Laura; Sore, Hannah F.

    2015-01-01

    Macrocyclic peptidomimetics are associated with a broad range of biological activities. However, despite such potentially valuable properties, the macrocyclic peptidomimetic structural class is generally considered as being poorly explored within drug discovery. This has been attributed to the lack of general methods for producing collections of macrocyclic peptidomimetics with high levels of structural, and thus shape, diversity. In particular, there is a lack of scaffold diversity in current macrocyclic peptidomimetic libraries; indeed, the efficient construction of diverse molecular scaffolds presents a formidable general challenge to the synthetic chemist. Herein we describe a new, advanced strategy for the diversity-oriented synthesis (DOS) of macrocyclic peptidomimetics that enables the combinatorial variation of molecular scaffolds (core macrocyclic ring architectures). The generality and robustness of this DOS strategy is demonstrated by the step-efficient synthesis of a structurally diverse library of over 200 macrocyclic peptidomimetic compounds, each based around a distinct molecular scaffold and isolated in milligram quantities, from readily available building-blocks. To the best of our knowledge this represents an unprecedented level of scaffold diversity in a synthetically derived library of macrocyclic peptidomimetics. Cheminformatic analysis indicated that the library compounds access regions of chemical space that are distinct from those addressed by top-selling brand-name drugs and macrocyclic natural products, illustrating the value of our DOS approach to sample regions of chemical space underexploited in current drug discovery efforts. An analysis of three-dimensional molecular shapes illustrated that the DOS library has a relatively high level of shape diversity. PMID:25778821

  9. Gantry for computed tomography

    International Nuclear Information System (INIS)

    Kelman, A.L.; Peterson, T.E.

    1981-01-01

    A novel design of gantry for use in computed tomography is described in detail. In the new gantry, curved tracks are mounted to the laterally spaced apart sides of the frame which rotates and carries the detector and X-ray source. This permits the frame to be tilted either side of vertical enabling angular slices of body layers to be viewed and allows simplification of the algorithm which the computer uses for image reconstruction. The tracks are supported on rollers which carry the substantial weight. Explicit engineering details are presented especially of the ball bearing races used in the rotation. (U.K.)

  10. Gantry for computed tomography

    International Nuclear Information System (INIS)

    Brandt, R.T.; Hein, P.W.

    1981-01-01

    A novel design of gantry for use in computed tomography is described in detail. In the new gantry, curved tracks are mounted to the laterally spaced apart sides of the frame which rotates and carries the detector and X-ray source. This permits the frame to be tilted either side of vertical enabling angular slices of body layers to be viewed and allows simplification of the algorithm which the computer uses for image reconstruction. The tracks are supported on rollers which carry the substantial weight. Explicit engineering details are presented. (U.K.)

  11. Efficiently outsourcing multiparty computation under multiple keys

    NARCIS (Netherlands)

    Peter, Andreas; Tews, Erik; Katzenbeisser, Stefan

    2013-01-01

    Secure multiparty computation enables a set of users to evaluate certain functionalities on their respective inputs while keeping these inputs encrypted throughout the computation. In many applications, however, outsourcing these computations to an untrusted server is desirable, so that the server
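
    The idea of evaluating a function while inputs stay hidden can be illustrated with additive secret sharing, a standard building block of secure multiparty computation; the sketch below is only that illustration, not the encrypted multi-key outsourcing scheme studied in the paper.

      # Minimal sketch of computing on hidden inputs via additive secret sharing,
      # one standard building block of secure multiparty computation. This is an
      # illustration only, not the multi-key outsourcing protocol of the paper.
      import secrets

      MODULUS = 2**61 - 1  # all arithmetic is done modulo a fixed prime

      def share(value, n_parties):
          """Split a value into n random shares that sum to the value mod MODULUS."""
          shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
          shares.append((value - sum(shares)) % MODULUS)
          return shares

      def reconstruct(shares):
          return sum(shares) % MODULUS

      # Two users share their inputs with three non-colluding servers.
      a_shares = share(42, 3)
      b_shares = share(100, 3)

      # Each server adds its shares locally, learning nothing about a or b.
      sum_shares = [(a + b) % MODULUS for a, b in zip(a_shares, b_shares)]

      print(reconstruct(sum_shares))  # 142, recovered only when shares are combined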

  12. Protean appearance of craniopharyngioma on computed tomography

    International Nuclear Information System (INIS)

    Danziger, A.; Price, H.I.

    1979-01-01

    Craniopharyngiomas present a diverse appearance on computed tomography. Histological diagnosis is not always possible, but computed tomography is of great assistance in the delineation of the tumour as well as of the degree of associated hydrocephalus. Computed tomography also enables rapid non-invasive follow-up after surgery or radiotherapy, or both

  13. Quantum computing based on semiconductor nanowires

    NARCIS (Netherlands)

    Frolov, S.M.; Plissard, S.R.; Nadj-Perge, S.; Kouwenhoven, L.P.; Bakkers, E.P.A.M.

    2013-01-01

    A quantum computer will have computational power beyond that of conventional computers, which can be exploited for solving important and complex problems, such as predicting the conformations of large biological molecules. Materials play a major role in this emerging technology, as they can enable

  14. Enablers of and barriers to abortion training.

    Science.gov (United States)

    Guiahi, Maryam; Lim, Sahnah; Westover, Corey; Gold, Marji; Westhoff, Carolyn L

    2013-06-01

    Since the legalization of abortion services in the United States, provision of abortions has remained a controversial issue of high political interest. Routine abortion training is not offered at all obstetrics and gynecology (Ob-Gyn) training programs, despite a specific training requirement by the Accreditation Council for Graduate Medical Education. Previous studies that described Ob-Gyn programs with routine abortion training either examined associations by using national surveys of program directors or described the experience of a single program. We set out to identify enablers of and barriers to Ob-Gyn abortion training in the context of a New York City political initiative, in order to better understand how to improve abortion training at other sites. We conducted in-depth qualitative interviews with 22 stakeholders from 7 New York City public hospitals and focus group interviews with 62 current residents at 6 sites. Enablers of abortion training included program location, high-capacity services, faculty commitment to abortion training, external programmatic support, and resident interest. Barriers to abortion training included lack of leadership continuity, leadership conflict, lack of second-trimester abortion services, difficulty obtaining mifepristone, optional rather than routine training, and antiabortion values of hospital personnel. Supportive leadership, faculty commitment, and external programmatic support appear to be key elements for establishing routine abortion training at Ob-Gyn residency training programs.

  15. Enabling technologies for oil sands development

    International Nuclear Information System (INIS)

    Bailey, R.T.

    1998-01-01

    A review of oil sands production and expansion possibilities in Alberta was presented. The enabling technologies for oil sands projects include mining (bucketwheels, draglines, trucks, shovels, conveyors, slurry hydrotransport); extraction (conditioning tumblers, pipelines, tanks, hot water, caustic, cold water, frothers); froth cleaning (centrifuges, solvent treatment); tailings (tailings ponds, consolidated tailings); and upgrading (coking, hydrotreating for SCO, hydrocracking and multiple products). The enabling technologies for in situ production include cyclic steam stimulation for vertical wells, steam assisted gravity drainage (SAGD) for dual horizontal wells, and cold production with wormholes. This paper described the recovery potential of each of these processes. It also discussed the role of government and industry in research and cooperative research involving both the private and public sectors. Examples of each of these were described, such as SAGD, the OSLO cold water extraction process, the consolidated tailings (CT) project, the low energy extraction process (slurry production, hydrotransport, pipeline conditioning and warm water extraction), and research in fine tailings, to demonstrate that although objectives may differ, government and industry research objectives are complementary

  16. Blue space geographies: Enabling health in place.

    Science.gov (United States)

    Foley, Ronan; Kistemann, Thomas

    2015-09-01

    Drawing from research on therapeutic landscapes and relationships between environment, health and wellbeing, we propose the idea of 'healthy blue space' as an important new development. Complementing research on healthy green space, blue space is defined as 'health-enabling places and spaces, where water is at the centre of a range of environments with identifiable potential for the promotion of human wellbeing'. Using theoretical ideas from emotional and relational geographies and critical understandings of salutogenesis, the value of blue space to health and wellbeing is recognised and evaluated. Six individual papers from five different countries consider how health can be enabled in mixed blue space settings. Four sub-themes - embodiment, inter-subjectivity, activity and meaning - document multiple experiences within a range of healthy blue spaces. Finally, we suggest a considerable research agenda - theoretical, methodological and applied - for future work within different forms of blue space. All are suggested as having public health policy relevance in social and public space. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Recon3D enables a three-dimensional view of gene variation in human metabolism

    DEFF Research Database (Denmark)

    Brunk, Elizabeth; Sahoo, Swagatika; Zielinski, Daniel C.

    2018-01-01

    Genome-scale network reconstructions have helped uncover the molecular basis of metabolism. Here we present Recon3D, a computational resource that includes three-dimensional (3D) metabolite and protein structure data and enables integrated analyses of metabolic functions in humans. We use Recon3D...

  18. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  19. Enabling Virtual Sensing as a Service

    Directory of Open Access Journals (Sweden)

    Yang Li

    2016-03-01

    Full Text Available In many situations, placing a physical sensor in the ideal position in or on the human body to acquire sensing data is incredibly difficult. Virtual sensors, in contrast to physical sensors, can provide indirect measurements by making use of other available sensor data. In this paper, we demonstrate a virtual sensing application developed as a service on top of a cloud-based health sensor data management platform called Wiki-Health. The proposed application "implants" virtual sensors in the human body by integrating environmental, geographic and personal sensor data with physiological models to compute temperature estimations of various parts of the body. The feasibility of the proposed virtual sensing service is supported by a case study. The ability to share computational models for performing calculations on measured data on the go is also discussed.
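
    To illustrate the idea of a virtual sensor that derives an estimate from other available measurements, the sketch below uses a toy linear surrogate; the coefficients and inputs are invented and do not represent the physiological models used by Wiki-Health.

      # Minimal sketch of a "virtual sensor": estimating a body-site temperature from
      # other available sensor readings through a simple model. The linear model and
      # its coefficients are toy assumptions, not the physiological models of Wiki-Health.
      def estimate_skin_temperature_c(ambient_c, heart_rate_bpm, activity_level):
          """Toy linear surrogate for a physiological model (illustrative only)."""
          baseline = 33.0                       # nominal resting skin temperature, deg C
          return (baseline
                  + 0.08 * (ambient_c - 22.0)   # warmer surroundings raise skin temperature
                  + 0.01 * (heart_rate_bpm - 60)
                  + 0.5 * activity_level)       # activity_level in [0, 1]

      # Physical-sensor inputs: weather feed, wearable heart-rate monitor, accelerometer.
      print(round(estimate_skin_temperature_c(ambient_c=28.0, heart_rate_bpm=95, activity_level=0.4), 1))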

  20. Rapid hyperspectral image classification to enable autonomous search systems

    Directory of Open Access Journals (Sweden)

    Raj Bridgelal

    2016-11-01

    Full Text Available The emergence of lightweight full-frame hyperspectral cameras is destined to enable autonomous search vehicles in the air, on the ground and in water. Self-contained and long-endurance systems will yield important new applications, for example, in emergency response and the timely identification of environmental hazards. One missing capability is rapid classification of hyperspectral scenes so that search vehicles can immediately take actions to verify potential targets. Onsite verifications minimise false positives and preclude the expense of repeat missions. Verifications will require enhanced image quality, which is achievable by either moving closer to the potential target or by adjusting the optical system. Such a solution, however, is currently impractical for small mobile platforms with finite energy sources. Rapid classifications with current methods demand large computing capacity that will quickly deplete the on-board battery or fuel. To develop the missing capability, the authors propose a low-complexity hyperspectral image classifier that approaches the performance of prevalent classifiers. This research determines that the new method will require at least 19-fold less computing capacity than the prevalent classifier. To assess relative performances, the authors developed a benchmark that compares a statistic of library endmember separability in their respective feature spaces.
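
    One simple way to quantify library endmember separability is the average pairwise spectral angle; the sketch below uses that measure on made-up spectra, and the authors' actual statistic and feature spaces may differ.

      # Minimal sketch of a library endmember separability statistic based on
      # pairwise spectral angles. The spectra are made up; the authors' actual
      # statistic and feature spaces may differ.
      import numpy as np

      def spectral_angle(a, b):
          """Angle (radians) between two spectra, as in spectral angle mapping."""
          cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
          return np.arccos(np.clip(cos, -1.0, 1.0))

      def mean_pairwise_separability(endmembers):
          """Average spectral angle over all endmember pairs in the library."""
          angles = [spectral_angle(endmembers[i], endmembers[j])
                    for i in range(len(endmembers))
                    for j in range(i + 1, len(endmembers))]
          return float(np.mean(angles))

      # Hypothetical 5-band endmember library (e.g. vegetation, soil, water).
      library = np.array([
          [0.05, 0.08, 0.04, 0.50, 0.30],
          [0.20, 0.25, 0.30, 0.35, 0.40],
          [0.08, 0.07, 0.06, 0.03, 0.02],
      ])
      print(f"mean pairwise separability: {mean_pairwise_separability(library):.3f} rad")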

  1. Enabling Research Network Connectivity to Clouds with Virtual Router Technology

    Science.gov (United States)

    Seuster, R.; Casteels, K.; Leavett-Brown, CR; Paterson, M.; Sobie, RJ

    2017-10-01

    The use of opportunistic cloud resources by HEP experiments has significantly increased over the past few years. Clouds that are owned or managed by the HEP community are connected to the LHCONE network or the research network with global access to HEP computing resources. Private clouds, such as those supported by non-HEP research funds are generally connected to the international research network; however, commercial clouds are either not connected to the research network or only connect to research sites within their national boundaries. Since research network connectivity is a requirement for HEP applications, we need to find a solution that provides a high-speed connection. We are studying a solution with a virtual router that will address the use case when a commercial cloud has research network connectivity in a limited region. In this situation, we host a virtual router in our HEP site and require that all traffic from the commercial site transit through the virtual router. Although this may increase the network path and also the load on the HEP site, it is a workable solution that would enable the use of the remote cloud for low I/O applications. We are exploring some simple open-source solutions. In this paper, we present the results of our studies and how it will benefit our use of private and public clouds for HEP computing.

  2. Optimising LAN access to grid enabled storage elements

    International Nuclear Information System (INIS)

    Stewart, G A; Dunne, B; Elwell, A; Millar, A P; Cowan, G A

    2008-01-01

    When operational, the Large Hadron Collider experiments at CERN will collect tens of petabytes of physics data per year. The worldwide LHC computing grid (WLCG) will distribute this data to over two hundred Tier-1 and Tier-2 computing centres, enabling particle physicists around the globe to access the data for analysis. Although different middleware solutions exist for effective management of storage systems at collaborating institutes, the patterns of access envisaged for Tier-2s fall into two distinct categories. The first involves bulk transfer of data between different Grid storage elements using protocols such as GridFTP. This data movement will principally involve writing ESD and AOD files into Tier-2 storage. Secondly, once datasets are stored at a Tier-2, physics analysis jobs will read the data from the local SE. Such jobs require a POSIX-like interface to the storage so that individual physics events can be extracted. In this paper we consider the performance of POSIX-like access to files held in Disk Pool Manager (DPM) storage elements, a popular lightweight SRM storage manager from EGEE

  3. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  4. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  5. The Community Seismic Network: Enabling Observations Through Citizen Science Participation

    Science.gov (United States)

    Kohler, M. D.; Clayton, R. W.; Heaton, T. H.; Bunn, J.; Guy, R.; Massari, A.; Chandy, K. M.

    2017-12-01

    The Community Seismic Network is a dense accelerometer array deployed in the greater Los Angeles area and represents the future of densely instrumented urban cities where localized vibration measurements are collected continuously throughout the free-field and built environment. The hardware takes advantage of developments in the semiconductor industry in the form of inexpensive MEMS accelerometers that are each coupled with a single board computer. The data processing and archival architecture borrows from developments in cloud computing and network connectedness. The ability to deploy densely in the free field and in upper stories of mid/high-rise buildings is enabled by community hosts for sensor locations. To this end, CSN has partnered with the Los Angeles Unified School District (LAUSD), the NASA-Jet Propulsion Laboratory (JPL), and commercial and civic building owners to host sensors. At these sites, site amplification estimates from RMS noise measurements illustrate the lateral variation in amplification over length scales of 100 m or less, that correlate with gradients in the local geology such as sedimentary basins that abut crystalline rock foothills. This is complemented by high-resolution, shallow seismic velocity models obtained using an H/V method. In addition, noise statistics are used to determine the reliability of sites for ShakeMap and earthquake early warning data. The LAUSD and JPL deployments are examples of how situational awareness and centralized warning products such as ShakeMap and ShakeCast are enabled by citizen science participation. Several buildings have been instrumented with at least one triaxial accelerometer per floor, providing measurements for real-time structural health monitoring through local, customized displays. For real-time and post-event evaluation, the free-field and built environment CSN data and products illustrate the feasibility of order-of-magnitude higher spatial resolution mapping compared to what is currently
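
    A rough picture of how RMS noise measurements translate into relative site amplification is given by the ratio of each station's RMS level to that of a reference rock site; the records below are synthetic and the method is a simplification of the analysis described above.

      # Minimal sketch of relative site amplification estimated from RMS ambient
      # noise, as a ratio against a reference station. The waveforms are synthetic.
      import numpy as np

      def rms(samples):
          return float(np.sqrt(np.mean(np.square(samples))))

      def relative_amplification(station_records, reference_id):
          """RMS noise of each station divided by the reference station's RMS."""
          ref = rms(station_records[reference_id])
          return {sid: rms(rec) / ref for sid, rec in station_records.items()}

      rng = np.random.default_rng(0)
      records = {
          "rock_site":  0.5 * rng.standard_normal(10_000),   # reference on crystalline rock
          "basin_edge": 1.2 * rng.standard_normal(10_000),
          "deep_basin": 2.5 * rng.standard_normal(10_000),
      }
      print(relative_amplification(records, reference_id="rock_site"))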

  6. GapMap: Enabling Comprehensive Autism Resource Epidemiology.

    Science.gov (United States)

    Albert, Nikhila; Daniels, Jena; Schwartz, Jessey; Du, Michael; Wall, Dennis P

    2017-05-04

    For individuals with autism spectrum disorder (ASD), finding resources can be a lengthy and difficult process. The difficulty in obtaining global, fine-grained autism epidemiological data hinders researchers from quickly and efficiently studying large-scale correlations among ASD, environmental factors, and geographical and cultural factors. The objective of this study was to define resource load and resource availability for families affected by autism and subsequently create a platform to enable a more accurate representation of prevalence rates and resource epidemiology. We created a mobile application, GapMap, to collect locational, diagnostic, and resource use information from individuals with autism to compute accurate prevalence rates and better understand autism resource epidemiology. GapMap is hosted on AWS S3, running on a React and Redux front-end framework. The backend framework comprises an AWS API Gateway and Lambda Function setup, with secure and scalable endpoints for retrieving prevalence and resource data, and for submitting participant data. Measures of autism resource scarcity, including resource load, resource availability, and resource gaps were defined and preliminarily computed using simulated or scraped data. The average distance from an individual in the United States to the nearest diagnostic center is approximately 182 km (113 miles), with a standard deviation of 235 km (146 miles). The average distance from an individual with ASD to the nearest diagnostic center, however, is only 32 km (20 miles), suggesting that individuals who live closer to diagnostic services are more likely to be diagnosed. This study confirmed that individuals closer to diagnostic services are more likely to be diagnosed and proposes GapMap, a means to measure and enable the alleviation of increasingly overburdened diagnostic centers and resource-poor areas where parents are unable to diagnose their children as quickly and easily as needed. GapMap will

  7. Microsystem enabled photovoltaic modules and systems

    Science.gov (United States)

    Nielson, Gregory N; Sweatt, William C; Okandan, Murat

    2015-05-12

    A microsystem enabled photovoltaic (MEPV) module including: an absorber layer; a fixed optic layer coupled to the absorber layer; a translatable optic layer; a translation stage coupled between the fixed and translatable optic layers; and a motion processor electrically coupled to the translation stage to control motion of the translatable optic layer relative to the fixed optic layer. The absorber layer includes an array of photovoltaic (PV) elements. The fixed optic layer includes an array of quasi-collimating (QC) micro-optical elements designed and arranged to couple incident radiation from an intermediate image formed by the translatable optic layer into one of the PV elements such that it is quasi-collimated. The translatable optic layer includes an array of focusing micro-optical elements corresponding to the QC micro-optical element array. Each focusing micro-optical element is designed to produce a quasi-telecentric intermediate image from substantially collimated radiation incident within a predetermined field of view.

  8. Technology enabled evolutions in liquids marketing

    International Nuclear Information System (INIS)

    Manning, S.

    1998-01-01

    Deregulation, mergers, changing economic conditions, and downsizing have captured the headlines in the energy industry in recent times. To say that companies have struggled to react to these changes would be an understatement. Huge trading organizations have grown from nothing in a few years, while entire industry segments have been forced to restructure themselves. Information technology has enabled much of this change. By bringing information management out of the back office and onto the trading floors, companies have radically redesigned their work processes. The future promises even faster change, with business focus turning to innovative packaging of services with products, expanding asset bases, and reducing costs. Information technology will fuel this transformation by providing enterprise-wide trading solutions and, ultimately, linking the entire industry into a virtual supply chain. To remain competitive, companies need a strategy to manage information technology as a core asset

  9. Enabling MEMS technologies for communications systems

    Science.gov (United States)

    Lubecke, Victor M.; Barber, Bradley P.; Arney, Susanne

    2001-11-01

    Modern communications demands have been steadily growing not only in size, but sophistication. Phone calls over copper wires have evolved into high definition video conferencing over optical fibers, and wireless internet browsing. The technology used to meet these demands is under constant pressure to provide increased capacity, speed, and efficiency, all with reduced size and cost. Various MEMS technologies have shown great promise for meeting these challenges by extending the performance of conventional circuitry and introducing radical new systems approaches. A variety of strategic MEMS structures including various cost-effective free-space optics and high-Q RF components are described, along with related practical implementation issues. These components are rapidly becoming essential for enabling the development of progressive new communications systems technologies including all-optical networks, and low cost multi-system wireless terminals and basestations.

  10. Enabling women to achieve their breastfeeding goals.

    Science.gov (United States)

    Stuebe, Alison M

    2014-03-01

    In mammalian physiology, lactation follows pregnancy, and disruption of this physiology is associated with adverse health outcomes for mother and child. Although lactation is the physiologic norm, cultural norms for infant feeding have changed dramatically over the past century. Breastfeeding initiation fell from 70% in the early 1900s to 22% in 1972. In the past 40 years, rates have risen substantially, to 77% in 2010. Although more mothers are initiating breastfeeding, many report that they do not continue as long as they desire. As reproductive health care experts, obstetricians are uniquely positioned to assist women to make an informed feeding decision, offer anticipatory guidance, support normal lactation physiology, and evaluate and treat breastfeeding complications. Integration of care among the obstetrician, pediatric provider, and lactation consultant may enable more women to achieve their breastfeeding goals, thereby improving health outcomes across two generations.

  11. Flexibility-enabling Contracts in Electricity Markets

    DEFF Research Database (Denmark)

    Boscan, Luis; Poudineh, Rahmatallah

    As the share of intermittent renewable energy increases in the generation mix, power systems are exposed to greater levels of uncertainty and risk, which requires planners, policy and business decision makers to incentivise flexibility, that is: their adaptability to unforeseen variations....... Additionally, along with traditional sources, which already enable flexibility, a number of business models, such as thermostat-based demand response, aggregators and small storage providers, are emerging in electricity markets and expected to constitute important sources of flexibility in future decentralised...... power systems. However, due to presence of high transaction costs, relative to the size of resource, the emerging small resources cannot directly participate in an organised electricity market and/or compete. This paper asks the fundamental question of how should the provision of flexibility, as a multi...

  12. Provision of enabling technology in professional sports.

    Science.gov (United States)

    McBride, D K

    2000-06-01

    Multiple-round golf tournaments are designed intentionally to separate individuals' scores as play proceeds. Variance analyses and consideration of individual differences (vs group mean effects) for a sample of professional events confirm that 3-, 4-, and 5-round tournaments show significantly increased variability (though stable means) from first to last rounds. It is argued here that the dispersion of scores increases as play proceeds because the more physically or mentally fit players emerge and continue to perform best. Furthermore, a marginal income analysis indicates that the average gain in earnings from a one-shot improvement in score is approximately $8,000. An interpretation based on fatigue, competition, and stress supports the Professional Golf Association's claim that provision of enabling devices, like a golf cart for disabled players, is also an enhancement and is thus unfair.

  13. Bluetooth-enabled teleradiology: applications and complications.

    Science.gov (United States)

    Hura, Angela M

    2002-01-01

    Wireless personal area networks and local area networks are becoming increasingly more prevalent in the teleradiology and telemedicine industry. Although there has been much debate about the role that Bluetooth will play in the future of wireless technology, both promoters and doubters acknowledge that Bluetooth will have an impact on networking, even if only as a "niche" product. This article provides an overview of the Bluetooth standard and highlights current and future areas of inclusion for use in a teleradiology environment. The possibilities for Bluetooth in a teleradiology environment without wires are nearly boundless and an overview of current and proposed Bluetooth-enabled radiology equipment and vendors is provided. A comparison of Bluetooth and other wireless technologies is provided, including areas of similarity and potential conflict. Bluetooth and other wireless technologies can not only peacefully coexist but also complement each other and provide enhanced teleradiology services.

  14. Metasurface-Enabled Remote Quantum Interference.

    Science.gov (United States)

    Jha, Pankaj K; Ni, Xingjie; Wu, Chihhui; Wang, Yuan; Zhang, Xiang

    2015-07-10

    An anisotropic quantum vacuum (AQV) opens novel pathways for controlling light-matter interaction in quantum optics, condensed matter physics, etc. Here, we theoretically demonstrate a strong AQV over macroscopic distances enabled by a judiciously designed array of subwavelength-scale nanoantennas-a metasurface. We harness the phase-control ability and the polarization-dependent response of the metasurface to achieve strong anisotropy in the decay rate of a quantum emitter located over distances of hundreds of wavelengths. Such an AQV induces quantum interference among radiative decay channels in an atom with orthogonal transitions. Quantum vacuum engineering with metasurfaces holds promise for exploring new paradigms of long-range light-matter interaction for atom optics, solid-state quantum optics, quantum information processing, etc.

  15. Framework Resources Multiply Computing Power

    Science.gov (United States)

    2010-01-01

    As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  16. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  17. Cloud Computing Governance Lifecycle

    Directory of Open Access Journals (Sweden)

    Soňa Karkošková

    2016-06-01

    Full Text Available Externally provisioned cloud services enable flexible and on-demand sourcing of IT resources. Cloud computing introduces new challenges such as the need for business process redefinition, the establishment of specialized governance, management and organizational structures, relationships with external providers, and the management of new types of risk arising from dependency on external providers. There is a general consensus that cloud computing brings many benefits in addition to challenges, but it is unclear how to achieve them. Cloud computing governance helps to create business value by obtaining benefits from the use of cloud computing services while optimizing investment and risk. The challenge organizations face in governing cloud services is how to design and implement cloud computing governance so as to gain the expected benefits. This paper aims to provide guidance on the implementation activities of the proposed cloud computing governance lifecycle from the cloud consumer perspective. The proposed model is based on the SOA Governance Framework and consists of a lifecycle for the implementation and continuous improvement of a cloud computing governance model.

  18. VACET: Proposed SciDAC2 Visualization and Analytics Center for Enabling Technologies

    International Nuclear Information System (INIS)

    Bethel, W; Johnson, C; Hansen, C; Parker, S; Sanderson, A; Silva, C; Tricoche, X; Pascucci, V; Childs, H; Cohen, J; Duchaineau, M; Laney, D; Lindstrom, P; Ahern, S; Meredith, J; Ostrouchov, G; Joy, K; Hamann, B

    2006-01-01

    This project focuses on leveraging scientific visualization and analytics software technology as an enabling technology for increasing scientific productivity and insight. Advances in computational technology have resulted in an 'information big bang', which in turn has created a significant data understanding challenge. This challenge is widely acknowledged to be one of the primary bottlenecks in contemporary science. The vision for our Center is to respond directly to that challenge by adapting, extending, creating when necessary and deploying visualization and data understanding technologies for our science stakeholders. Using an organizational model as a Visualization and Analytics Center for Enabling Technologies (VACET), we are well positioned to be responsive to the needs of a diverse set of scientific stakeholders in a coordinated fashion using a range of visualization, mathematics, statistics, computer and computational science and data management technologies

  19. Environmental Models as a Service: Enabling Interoperability ...

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantage of streamlined deployment processes and affordable cloud access to move algorithms and data to the web for discoverability and consumption. In these deployments, environmental models can become available to end users through RESTful web services and consistent application program interfaces (APIs) that consume, manipulate, and store modeling data. RESTful modeling APIs also promote discoverability and guide usability through self-documentation. Embracing the RESTful paradigm allows models to be accessible via a web standard, and the resulting endpoints are platform- and implementation-agnostic while simultaneously presenting significant computational capabilities for spatial and temporal scaling. RESTful APIs present data in a simple verb-noun web request interface: the verb dictates how a resource is consumed using HTTP methods (e.g., GET, POST, and PUT) and the noun represents the URL reference of the resource on which the verb will act. The RESTful API can self-document in both the HTTP response and an interactive web page using the Open API standard. This lets models function as an interoperable service that promotes sharing, documentation, and discoverability. Here, we discuss the
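
    As a rough illustration of the verb-noun pattern described above, the sketch below submits a model run with POST and retrieves its result with GET. The service root, resource names, and payload fields are hypothetical; a real environmental-model API defines its own schema.

```python
# Hypothetical sketch of calling a RESTful environmental-model service.
# The base URL, resource path, and payload fields below are illustrative
# assumptions, not part of any specific deployed API.
import requests

BASE = "https://example.org/api/v1"  # hypothetical service root

# POST: submit inputs for a model run (verb = POST, noun = /watershed/runs)
run = requests.post(
    f"{BASE}/watershed/runs",
    json={"start": "2020-01-01", "end": "2020-12-31", "site_id": "demo-basin"},
)
run_id = run.json()["id"]  # assumed response field

# GET: retrieve the stored result for that run (verb = GET, noun = the run URL)
result = requests.get(f"{BASE}/watershed/runs/{run_id}")
print(result.status_code, result.json())
```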

  20. Enabling technologies for fiber optic sensing

    Science.gov (United States)

    Ibrahim, Selwan K.; Farnan, Martin; Karabacak, Devrez M.; Singer, Johannes M.

    2016-04-01

    In order for fiber optic sensors to compete with electrical sensors, several critical parameters need to be addressed such as performance, cost, size, reliability, etc. Relying on technologies developed in different industrial sectors helps to achieve this goal in a more efficient and cost effective way. FAZ Technology has developed a tunable laser based optical interrogator based on technologies developed in the telecommunication sector and optical transducer/sensors based on components sourced from the automotive market. Combining Fiber Bragg Grating (FBG) sensing technology with the above, high speed, high precision, reliable quasi distributed optical sensing systems for temperature, pressure, acoustics, acceleration, etc. have been developed. Careful design is needed to filter out any sources of measurement drifts/errors due to different effects, e.g. polarization and birefringence, coating imperfections, sensor packaging, etc. Also, to achieve high speed and high performance optical sensing systems, combining and synchronizing multiple optical interrogators, similar to the way multiple processors are combined to deliver supercomputing power, is an attractive solution. This path can be achieved by using photonic integrated circuit (PIC) technology which opens the doors to scaling up and delivering powerful optical sensing systems in an efficient and cost effective way.

  1. Europium enabled luminescent nanoparticles for biomedical applications

    Energy Technology Data Exchange (ETDEWEB)

    Syamchand, S.S., E-mail: syamchand.ss@gmail.com; Sony, G., E-mail: emailtosony@gmail.com

    2015-09-15

    Lanthanide based nanoparticles are receiving great attention owing to their excellent luminescent and magnetic properties and find challenging biomedical applications. Among the luminescent lanthanide NPs, europium based NPs (Eu-NPs) are better candidates for immunoassay and imaging applications. The Eu-NPs have an edge over quantum dots (QDs) by means of their stable luminescence, long fluorescence lifetime, sharp emission peaks with narrow band width, lack of blinking and biocompatibility. This review surveys the synthesis and properties of a variety of Eu-NPs consolidated from different research articles, for their applications in medicine and biology. The exquisite luminescent properties of Eu-NPs are explored for developing biomedical applications such as immunoassay and bioimaging including multimodal imaging. The biomedical applications of Eu-NPs are mostly diagnostic in nature and mainly focus on various key analytes present in biological systems. The luminescent properties of europium enabled NPs are influenced by a number of factors such as the site symmetry, the metal nanoparticles, metal ions, quantum dots, surfactants, morphology of Eu-NPs, crystal defect, phenomena like antenna effect and physical parameters like temperature. Through this review we explore and assimilate all the factors which affect the luminescence in Eu-NPs and coil a new thread of parameters that control the luminescence in Eu-NPs, which would provide further insight in developing Eu-based nanoprobes for future biomedical prospects. - Highlights: • The review describes 14 major factors that influence the luminescence properties of europium enabled luminescent nanoparticles (Eu-NPs). • Surveys different types of europium containing nanoparticles that have been reported for their biomedical applications. • Eu-NPs are conveniently divided into four different categories, based on the type of the substrates involved. The four categories are (1) virgin Eu-substrate based NPs; (2

  2. Europium enabled luminescent nanoparticles for biomedical applications

    International Nuclear Information System (INIS)

    Syamchand, S.S.; Sony, G.

    2015-01-01

    Lanthanide based nanoparticles are receiving great attention owing to their excellent luminescent and magnetic properties and find challenging biomedical applications. Among the luminescent lanthanide NPs, europium based NPs (Eu-NPs) are better candidates for immunoassay and imaging applications. The Eu-NPs have an edge over quantum dots (QDs) by means of their stable luminescence, long fluorescence lifetime, sharp emission peaks with narrow band width, lack of blinking and biocompatibility. This review surveys the synthesis and properties of a variety of Eu-NPs consolidated from different research articles, for their applications in medicine and biology. The exquisite luminescent properties of Eu-NPs are explored for developing biomedical applications such as immunoassay and bioimaging including multimodal imaging. The biomedical applications of Eu-NPs are mostly diagnostic in nature and mainly focus on various key analytes present in biological systems. The luminescent properties of europium enabled NPs are influenced by a number of factors such as the site symmetry, the metal nanoparticles, metal ions, quantum dots, surfactants, morphology of Eu-NPs, crystal defect, phenomena like antenna effect and physical parameters like temperature. Through this review we explore and assimilate all the factors which affect the luminescence in Eu-NPs and coil a new thread of parameters that control the luminescence in Eu-NPs, which would provide further insight in developing Eu-based nanoprobes for future biomedical prospects. - Highlights: • The review describes 14 major factors that influence the luminescence properties of europium enabled luminescent nanoparticles (Eu-NPs). • Surveys different types of europium containing nanoparticles that have been reported for their biomedical applications. • Eu-NPs are conveniently divided into four different categories, based on the type of the substrates involved. The four categories are (1) virgin Eu-substrate based NPs; (2

  3. Computer network defense system

    Science.gov (United States)

    Urias, Vincent; Stout, William M. S.; Loverro, Caleb

    2017-08-22

    A method and apparatus for protecting virtual machines. A computer system creates a copy of a group of the virtual machines in an operating network in a deception network to form a group of cloned virtual machines in the deception network when the group of the virtual machines is accessed by an adversary. The computer system creates an emulation of components from the operating network in the deception network. The components are accessible by the group of the cloned virtual machines as if the group of the cloned virtual machines was in the operating network. The computer system moves network connections for the group of the virtual machines in the operating network used by the adversary from the group of the virtual machines in the operating network to the group of the cloned virtual machines, enabling protection of the group of the virtual machines from actions performed by the adversary.

  4. Computational Protein Design

    DEFF Research Database (Denmark)

    Johansson, Kristoffer Enøe

    Proteins are the major functional group of molecules in biology. The impact of protein science on medicine and chemical productions is rapidly increasing. However, the greatest potential remains to be realized. The field of protein design has advanced computational modeling from a tool of support...... to a central method that enables new developments. For example, novel enzymes with functions not found in natural proteins have been de novo designed to give enough activity for experimental optimization. This thesis presents the current state-of-the-art within computational design methods together...... with a novel method based on probability theory. With the aim of assembling a complete pipeline for protein design, this work touches upon several aspects of protein design. The presented work is the computational half of a design project where the other half is dedicated to the experimental part...

  5. Glass ceramic ZERODUR enabling nanometer precision

    Science.gov (United States)

    Jedamzik, Ralf; Kunisch, Clemens; Nieder, Johannes; Westerhoff, Thomas

    2014-03-01

    The IC Lithography roadmap foresees the manufacturing of devices with critical dimensions in the single-digit nanometer range, which calls for nanometer positioning accuracy and hence sub-nanometer position measurement accuracy. The glass ceramic ZERODUR® is a well-established material in critical components of microlithography wafer steppers and is offered with an extremely low coefficient of thermal expansion (CTE) at the tightest tolerance available on the market. SCHOTT is continuously improving its manufacturing processes and its methods to measure and characterize the CTE behavior of ZERODUR® to fulfill the ever tighter CTE specification for wafer stepper components. In this paper we present the ZERODUR® Lithography Roadmap on CTE metrology and tolerance. Additionally, simulation calculations based on a physical model are presented predicting the long-term CTE behavior of ZERODUR® components to optimize the dimensional stability of precision positioning devices. CTE data of several low thermal expansion materials are compared regarding their temperature dependence between -50°C and +100°C. ZERODUR® TAILORED 22°C fulfills the tight CTE tolerance of ±10 ppb/K within the broadest temperature interval compared to all other materials in this investigation. The data presented in this paper explicitly demonstrate the capability of ZERODUR® to enable the nanometer precision required for future generations of lithography equipment and processes.

  6. Breakthrough Science Enabled by Smallsat Optical Communication

    Science.gov (United States)

    Gorjian, V.

    2017-12-01

    The recent NRC panel on "Achieving Science with Cubesats" found that "CubeSats have already proven themselves to be an important scientific tool. CubeSats can produce high-value science, as demonstrated by peer-reviewed publications that address decadal survey science goals." While some science is purely related to the size of the collecting aperture, there are plentiful examples of new and exciting experiments that can be achieved using the relatively inexpensive Cubesat platforms. We will present various potential science applications that can benefit from higher bandwidth communication. For example, on or near Earth orbit, Cubesats could provide hyperspectral imaging, gravity field mapping, atmospheric probing, and terrain mapping. These can be achieved either as large constellations of Cubesats or a few Cubesats that provide multi-point observations. Away from the Earth (up to 1AU) astrophysical variability studies, detections of solar particles between the Earth and Venus, mapping near earth objects, and high-speed videos of the Sun will also be enabled by high bandwidth communications.

  7. Enabling Semantic Queries Against the Spatial Database

    Directory of Open Access Journals (Sweden)

    PENG, X.

    2012-02-01

    Full Text Available The spatial database based upon the object-relational database management system (ORDBMS) has the merits of a clear data model, good operability and high query efficiency. That is why it has been widely used in spatial data organization and management. However, it cannot express the semantic relationships among geospatial objects, making it difficult for the query results to meet the user's requirements well. Therefore, this paper represents an attempt to combine Semantic Web technology with the spatial database so as to make up for the traditional database's disadvantages. In this way, on the one hand, users can take advantage of the ORDBMS to store and manage spatial data; on the other hand, if the spatial database is released in the form of the Semantic Web, users can describe a query more concisely with a cognitive pattern similar to that of daily life. As a consequence, this methodology makes the benefits of both the Semantic Web and the object-relational database (ORDB) available. The paper discusses systematically the semantically enriched spatial database's architecture, key technologies and implementation. Subsequently, we demonstrate the function of spatial semantic queries via a practical prototype system. The query results indicate that the method used in this study is feasible.

  8. Enabler for the agile virtual enterprise

    Science.gov (United States)

    Fuerst, Karl; Schmidt, Thomas; Wippel, Gerald

    2001-10-01

    In this presentation, a new approach for a flexible low-cost Internet extended enterprise (project FLoCI-EE) will be presented. FLoCI-EE is a project in the fifth framework program of the European commission with 8 partners from 4 countries, which started in January 2001 and will be finished in December 2003. The main objective of FLoCI-EE is the development of a software prototype, which enables flexible enterprise cooperation with the aim to design, manufacture and sell products commonly, independent of enterprise borderlines. The needed IT-support includes functions of product data management (PDM), enterprise resource planning (ERP), supply chain management (SCM) and customer relationship management (CRM). Especially for small and medium sized enterprises, existing solutions are too expensive and inflexible to be of use under current turbulent market conditions. The second part of this paper covers the item Web Services, because in the role-specific support approach of FLoCI-EE, there are user- interface-components, which are tailored for specific roles in an enterprise. These components integrate automatically the services of the so-called basic-components, and the externally offered Web Services like UDDI.

  9. Barriers and enablers to academic health leadership.

    Science.gov (United States)

    Bharwani, Aleem; Kline, Theresa; Patterson, Margaret; Craighead, Peter

    2017-02-06

    Purpose This study sought to identify the barriers and enablers to leadership enactment in academic health-care settings. Design/methodology/approach Semi-structured interviews ( n = 77) with programme stakeholders (medical school trainees, university leaders, clinical leaders, medical scientists and directors external to the medical school) were conducted, and the responses content-analysed. Findings Both contextual and individual factors were identified as playing a role in affecting academic health leadership enactment that has an impact on programme development, success and maintenance. Contextual factors included sufficient resources allocated to the programme, opportunities for learners to practise leadership skills, a competent team around the leader once that person is in place, clear expectations for the leader and a culture that fosters open communication. Contextual barriers included highly bureaucratic structures, fear-of-failure and non-trusting cultures and inappropriate performance systems. Programmes were advised to select participants based on self-awareness, strong communication skills and an innovative thinking style. Filling specific knowledge and skill gaps, particularly for those not trained in medical school, was viewed as essential. Ineffective decision-making styles and tendencies to get involved in day-to-day activities were barriers to the development of academic health leaders. Originality/value Programmes designed to develop academic health-care leaders will be most effective if they develop leadership at all levels; ensure that the organisation's culture, structure and processes reinforce positive leadership practices; and recognise the critical role of teams in supporting its leaders.

  10. BEST: barcode enabled sequencing of tetrads.

    Science.gov (United States)

    Scott, Adrian C; Ludlow, Catherine L; Cromie, Gareth A; Dudley, Aimée M

    2014-05-01

    Tetrad analysis is a valuable tool for yeast genetics, but the laborious manual nature of the process has hindered its application on large scales. Barcode Enabled Sequencing of Tetrads (BEST) replaces the manual processes of isolating, disrupting and spacing tetrads. BEST isolates tetrads by virtue of a sporulation-specific GFP fusion protein that permits fluorescence-activated cell sorting of tetrads directly onto agar plates, where the ascus is enzymatically digested and the spores are disrupted and randomly arrayed by glass bead plating. The haploid colonies are then assigned sister spore relationships, i.e. information about which spores originated from the same tetrad, using molecular barcodes read during genotyping. By removing the bottleneck of manual dissection, hundreds or even thousands of tetrads can be isolated in minutes. Here we present a detailed description of the experimental procedures required to perform BEST in the yeast Saccharomyces cerevisiae, starting with a heterozygous diploid strain through the isolation of colonies derived from the haploid meiotic progeny.

  11. Survey of Enabling Technologies for CAPS

    Science.gov (United States)

    Antol, Jeffrey; Mazanek, Daniel D.; Koons, Robert H.

    2005-01-01

    The enabling technologies required for the development of a viable Comet/Asteroid Protection System (CAPS) can be divided into two principal areas: detection and deflection/orbit modification. With the proper funding levels, many of the technologies needed to support a CAPS architecture could be achievable within the next 15 to 20 years. In fact, many advanced detection technologies are currently in development for future in-space telescope systems such as the James Webb Space Telescope (JWST), formerly known as the Next Generation Space Telescope. It is anticipated that many of the JWST technologies would be available for application for CAPS detection concepts. Deflection/orbit modification technologies are also currently being studied as part of advanced power and propulsion research. However, many of these technologies, such as extremely high-output power systems, advanced propulsion, heat rejection, and directed energy systems, would likely be farther term in availability than many of the detection technologies. Discussed subsequently is a preliminary examination of the main technologies that have been identified as being essential to providing the element functionality defined during the CAPS conceptual study. The detailed requirements for many of the technology areas are still unknown, and many additional technologies will be identified as future in-depth studies are conducted in this area.

  12. Enabling technologies for industrial energy demand management

    International Nuclear Information System (INIS)

    Dyer, Caroline H.; Hammond, Geoffrey P.; Jones, Craig I.; McKenna, Russell C.

    2008-01-01

    This state-of-science review sets out to provide an indicative assessment of enabling technologies for reducing UK industrial energy demand and carbon emissions to 2050. In the short term, i.e. the period that will rely on current or existing technologies, the road map and priorities are clear. A variety of available technologies will lead to energy demand reduction in industrial processes, boiler operation, compressed air usage, electric motor efficiency, heating and lighting, and ancillary uses such as transport. The prospects for the commercial exploitation of innovative technologies by the middle of the 21st century are more speculative. Emphasis is therefore placed on the range of technology assessment methods that are likely to provide policy makers with a guide to progress in the development of high-temperature processes, improved materials, process integration and intensification, and improved industrial process control and monitoring. Key among the appraisal methods applicable to the energy sector is thermodynamic analysis, making use of energy, exergy and 'exergoeconomic' techniques. Technical and economic barriers will limit the improvement potential to perhaps a 30% cut in industrial energy use, which would make a significant contribution to reducing energy demand and carbon emissions in UK industry. Non-technological drivers for, and barriers to, the take-up of innovative, low-carbon energy technologies for industry are also outlined

  13. Imaging enabled platforms for development of therapeutics

    Science.gov (United States)

    Celli, Jonathan; Rizvi, Imran; Blanden, Adam R.; Evans, Conor L.; Abu-Yousif, Adnan O.; Spring, Bryan Q.; Muzikansky, Alona; Pogue, Brian W.; Finkelstein, Dianne M.; Hasan, Tayyaba

    2011-03-01

    Advances in imaging and spectroscopic technologies have enabled the optimization of many therapeutic modalities in cancer and noncancer pathologies either by earlier disease detection or by allowing therapy monitoring. Amongst the therapeutic options benefiting from developments in imaging technologies, photodynamic therapy (PDT) is exceptional. PDT is a photochemistry-based therapeutic approach where a light-sensitive molecule (photosensitizer) is activated with light of appropriate energy (wavelength) to produce reactive molecular species such as free radicals and singlet oxygen. These molecular entities then react with biological targets such as DNA, membranes and other cellular components to impair their function and lead to eventual cell and tissue death. Development of PDT-based imaging also provides a platform for rapid screening of new therapeutics in novel in vitro models prior to expensive and labor-intensive animal studies. In this study we demonstrate how an imaging platform can be used for strategizing a novel combination treatment strategy for multifocal ovarian cancer. Using an in vitro 3D model for micrometastatic ovarian cancer in conjunction with quantitative imaging we examine dose and scheduling strategies for PDT in combination with carboplatin, a chemotherapeutic agent presently in clinical use for management of this deadly form of cancer.

  14. Multimode Communication Protocols Enabling Reconfigurable Radios

    Directory of Open Access Journals (Sweden)

    Berlemann Lars

    2005-01-01

    Full Text Available This paper focuses on the realization and application of a generic protocol stack for reconfigurable wireless communication systems. This focus extends the field of software-defined radios which usually concentrates on the physical layer. The generic protocol stack comprises common protocol functionality and behavior which are extended through specific parts of the targeted radio access technology. This paper considers parameterizable modules of basic protocol functions residing in the data link layer of the ISO/OSI model. System-specific functionality of the protocol software is realized through adequate parameterization and composition of the generic modules. The generic protocol stack allows an efficient realization of reconfigurable protocol software and enables a completely reconfigurable wireless communication system. It is a first step from side-by-side realized, preinstalled modes in a terminal towards a dynamic reconfigurable anymode terminal. The presented modules of the generic protocol stack can also be regarded as a toolbox for the accelerated and cost-efficient development of future communication protocols.

  15. Enabling electroweak baryogenesis through dark matter

    International Nuclear Information System (INIS)

    Lewicki, Marek; Rindler-Daller, Tanja; Wells, James D.

    2016-01-01

    We study the impact on electroweak baryogenesis from a swifter cosmological expansion induced by dark matter. We detail the experimental bounds that one can place on models that realize it, and we investigate the modifications of these bounds that result from a non-standard cosmological history. The modifications can be sizeable if the expansion rate of the Universe increases by several orders of magnitude. We illustrate the impact through the example of scalar field dark matter, which can alter the cosmological history enough to enable a strong-enough first-order phase transition in the Standard Model when it is supplemented by a dimension six operator directly modifying the Higgs boson potential. We show that due to the modified cosmological history, electroweak baryogenesis can be realized, while keeping deviations of the triple Higgs coupling below HL-LHC sensitivities. The required scale of new physics to effectuate a strong-enough first order phase transition can change by as much as twenty percent as the expansion rate increases by six orders of magnitude.
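
    For orientation, a commonly studied way to modify the potential adds a dimension-six term suppressed by a new-physics scale; the specific operator written below is an assumption, since the abstract does not spell it out.

```latex
% Higgs potential with an assumed dimension-six operator suppressed by a scale \Lambda
V(H) = -\mu^{2}\,(H^{\dagger}H) + \lambda\,(H^{\dagger}H)^{2} + \frac{(H^{\dagger}H)^{3}}{\Lambda^{2}}
```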

  16. Water: A Critical Material Enabling Space Exploration

    Science.gov (United States)

    Pickering, Karen D.

    2014-01-01

    Water is one of the most critical materials in human spaceflight. The availability of water defines the duration of a space mission; the volume of water required for a long-duration space mission becomes too large, heavy, and expensive for launch vehicles to carry. Since the mission duration is limited by the amount of water a space vehicle can carry, the capability to recycle water enables space exploration. In addition, water management in microgravity impacts spaceflight in other respects, such as the recent emergency termination of a spacewalk caused by free water in an astronaut's spacesuit helmet. A variety of separation technologies are used onboard spacecraft to ensure that water is always available for use, and meets the stringent water quality required for human space exploration. These separation technologies are often adapted for use in a microgravity environment, where water behaves in unique ways. The use of distillation, membrane processes, ion exchange and granular activated carbon will be reviewed. Examples of microgravity effects on operations will also be presented. A roadmap for future technologies, needed to supply water resources for the exploration of Mars, will also be reviewed.

  17. Enabling Process Alignment for IT Entrepreneurship

    Directory of Open Access Journals (Sweden)

    Sonia D. Bot

    2012-11-01

    Full Text Available All firms use information technology (IT. Larger firms have IT organizations whose business function is to supply and manage IT infrastructure and applications to support the firm's business objectives. Regardless of whether the IT function has been outsourced or is resident within a firm, the objectives of the IT organization must be aligned to the strategic needs of the business. It is often a challenge to balance the demand for IT against the available supply within the firm. Most IT organizations have little capacity to carry out activities that go beyond the incremental ones that are needed to run the immediate needs of the business. A process-ambidexterity framework for IT improves the IT organization's entrepreneurial ability, which in turn, better aligns the IT function with the business functions in the firm. Process ambidexterity utilizes both process alignment and process adaptability. This article presents a framework for process alignment in IT. This is useful for understanding how the processes in Business Demand Management, a core component of the process-ambidexterity framework for IT, relate to those in IT Governance and IT Supply Chain Management. The framework is presented through three lenses (governance, business, and technology along with real-world examples from major firms in the USA. Enabling process alignment in the IT function, and process ambidexterity overall, benefits those who govern IT, the executives who lead IT, as well as their peers in the business functions that depend on IT.

  18. Smart Sensors Enable Smart Air Conditioning Control

    Directory of Open Access Journals (Sweden)

    Chin-Chi Cheng

    2014-06-01

    Full Text Available In this study, mobile phones, wearable devices, temperature and human motion detectors are integrated as smart sensors for enabling smart air conditioning control. Smart sensors obtain feedback, especially occupants’ information, from mobile phones and wearable devices placed on the human body. The information can be used to adjust air conditioners in advance according to humans’ intentions, in so-called intention-causing control. Experimental results show that the indoor temperature can be controlled accurately with errors of less than ±0.1 °C. Rapid cool down can be achieved within 2 min to the optimized indoor capacity after occupants enter a room. It is also noted that within a two-hour operation the total compressor output of the smart air conditioner is 48.4% less than that of one using On-Off control. The smart air conditioner with wearable devices could detect the human temperature and activity during sleep to determine the sleeping state and adjust the sleeping function flexibly. The sleeping function optimized by the smart air conditioner with wearable devices could reduce energy consumption by up to 46.9% while maintaining occupant health. The presented smart air conditioner could provide a comfortable environment and achieve the goals of energy conservation and environmental protection.

  19. Translator program converts computer printout into braille language

    Science.gov (United States)

    Powell, R. A.

    1967-01-01

    Computer program converts print image tape files into six dot Braille cells, enabling a blind computer programmer to monitor and evaluate data generated by his own programs. The Braille output is printed 8 lines per inch.
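
    A minimal modern sketch of the same idea, mapping letters to six-dot cells, is shown below. It renders cells as Unicode braille patterns rather than printer output, so it is only an illustration of the character-to-cell mapping; the dot assignments are the standard Grade 1 letter patterns.

```python
# Minimal sketch: map lowercase letters to 6-dot braille cells (Grade 1,
# letters only) and render them as Unicode braille patterns. Illustrative
# only -- the original 1967 program drove a line printer, not Unicode.
DOTS = {  # which of dots 1-6 are raised for each letter
    'a': (1,), 'b': (1, 2), 'c': (1, 4), 'd': (1, 4, 5), 'e': (1, 5),
    'f': (1, 2, 4), 'g': (1, 2, 4, 5), 'h': (1, 2, 5), 'i': (2, 4),
    'j': (2, 4, 5), 'k': (1, 3), 'l': (1, 2, 3), 'm': (1, 3, 4),
    'n': (1, 3, 4, 5), 'o': (1, 3, 5), 'p': (1, 2, 3, 4),
    'q': (1, 2, 3, 4, 5), 'r': (1, 2, 3, 5), 's': (2, 3, 4),
    't': (2, 3, 4, 5), 'u': (1, 3, 6), 'v': (1, 2, 3, 6), 'w': (2, 4, 5, 6),
    'x': (1, 3, 4, 6), 'y': (1, 3, 4, 5, 6), 'z': (1, 3, 5, 6), ' ': (),
}

def to_braille(text):
    # The Unicode braille block starts at U+2800; dot n sets bit (n - 1).
    cells = []
    for ch in text.lower():
        bits = sum(1 << (d - 1) for d in DOTS.get(ch, ()))
        cells.append(chr(0x2800 + bits))
    return "".join(cells)

print(to_braille("hello world"))
```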

  20. Supporting hypothesis generation by learners exploring an interactive computer simulation

    NARCIS (Netherlands)

    van Joolingen, Wouter R.; de Jong, Ton

    1992-01-01

    Computer simulations provide environments enabling exploratory learning. Research has shown that these types of learning environments are promising applications of computer assisted learning but also that they introduce complex learning settings, involving a large number of learning processes. This

  1. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours...

  2. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  3. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  4. Internet ware cloud computing: Challenges

    OpenAIRE

    Qamar, S; Lal, Niranjan; Singh, Mrityunjay

    2010-01-01

    After decades of engineering development and infrastructural investment, Internet connections have become a commodity product in many countries, and Internet-scale “cloud computing” has started to compete with traditional software business through its technological advantages and economy of scale. Cloud computing is a promising enabling technology of Internet ware. Cloud computing is termed the next big thing in the modern corporate world. Apart from the present-day software and technologies,...

  5. VLSI Architectures for Computing DFT's

    Science.gov (United States)

    Truong, T. K.; Chang, J. J.; Hsu, I. S.; Reed, I. S.; Pei, D. Y.

    1986-01-01

    Simplifications result from use of residue Fermat number systems. System of finite arithmetic over residue Fermat number systems enables calculation of discrete Fourier transform (DFT) of series of complex numbers with reduced number of multiplications. Computer architectures based on approach suitable for design of very-large-scale integrated (VLSI) circuits for computing DFT's. General approach not limited to DFT's; Applicable to decoding of error-correcting codes and other transform calculations. System readily implemented in VLSI.
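
    The sketch below illustrates the underlying idea of a transform computed in modular arithmetic over the Fermat prime 2^16 + 1. It is a plain direct-evaluation number-theoretic transform for exposition only, not the VLSI architecture or the full residue-number scheme of the record above.

```python
# Illustrative number-theoretic transform (NTT) over the Fermat prime
# p = 2^16 + 1. A DFT-like transform where all arithmetic is modular;
# this is an expository sketch, not the architecture described above.
P = (1 << 16) + 1          # 65537, the Fermat prime F4
G = 3                      # a primitive root modulo 65537

def ntt(x, invert=False):
    n = len(x)                         # n must divide 2^16
    w = pow(G, (P - 1) // n, P)        # primitive n-th root of unity mod P
    if invert:
        w = pow(w, P - 2, P)           # modular inverse of the root
    out = [sum(x[j] * pow(w, j * k, P) for j in range(n)) % P
           for k in range(n)]
    if invert:
        n_inv = pow(n, P - 2, P)       # divide by n in the inverse transform
        out = [(v * n_inv) % P for v in out]
    return out

data = [1, 2, 3, 4, 0, 0, 0, 0]
assert ntt(ntt(data), invert=True) == data   # round-trips exactly, no rounding
```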

  6. Ontology Enabled Generation of Embedded Web Services

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Zhang, Weishan; Soares, Goncalo Teofilo Afonso Pinheiro

    2008-01-01

    Web services are increasingly adopted as a service provision mechanism in pervasive computing environments. Implementing web services on networked, embedded devices raises a number of challenges, for example efficiency of web services, handling of variability and dependencies of hardware...... and software platforms, and of devices state and context changes. To address these challenges, we developed a Web service compiler, Limbo, in which Web Ontology Language (OWL) ontologies are used to make the Limbo compiler aware of its compilation context, such as targeted hardware and software. At the same...... time, knowledge on device details, platform dependencies, and resource/power consumption is built into the supporting ontologies, which are used to configure Limbo for generating resource efficient web service code. A state machine ontology is used to generate stub code to facilitate handling of state...

  7. Privacy enabling technology for video surveillance

    Science.gov (United States)

    Dufaux, Frédéric; Ouaret, Mourad; Abdeljaoued, Yousri; Navarro, Alfonso; Vergnenègre, Fabrice; Ebrahimi, Touradj

    2006-05-01

    In this paper, we address the problem of privacy in video surveillance. We propose an efficient solution based on transform-domain scrambling of regions of interest in a video sequence. More specifically, the sign of selected transform coefficients is flipped during encoding. We address more specifically the case of Motion JPEG 2000. Simulation results show that the technique can be successfully applied to conceal information in regions of interest in the scene while providing a good level of security. Furthermore, the scrambling is flexible and allows adjusting the amount of distortion introduced. This is achieved with a small impact on coding performance and negligible computational complexity increase. In the proposed video surveillance system, heterogeneous clients can remotely access the system through the Internet or 2G/3G mobile phone network. Thanks to the inherently scalable Motion JPEG 2000 codestream, the server is able to adapt the resolution and bandwidth of the delivered video depending on the usage environment of the client.
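
    A minimal sketch of sign-flip scrambling on an array of transform coefficients is given below. The seeded pseudo-random mask plays the role of the key, and applying the same mask again descrambles; this stands in for the in-codec Motion JPEG 2000 operation, which is not reproduced here.

```python
# Minimal sketch of sign-flip scrambling of transform coefficients in a
# region of interest (ROI). A seeded pseudo-random +/-1 mask acts as the
# key; applying the same mask a second time restores the original values.
import numpy as np

def scramble_roi(coeffs, roi, seed):
    rng = np.random.default_rng(seed)
    out = coeffs.copy()
    block = out[roi]
    mask = rng.integers(0, 2, size=block.shape) * 2 - 1   # values in {-1, +1}
    out[roi] = block * mask
    return out

coeffs = np.arange(-8, 8, dtype=float).reshape(4, 4)   # toy "transform" block
roi = (slice(1, 3), slice(1, 3))                       # protected region

scrambled = scramble_roi(coeffs, roi, seed=42)
restored = scramble_roi(scrambled, roi, seed=42)       # same key undoes it
assert np.array_equal(restored, coeffs)
```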

  8. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

    Full Text Available The inherently limited processing power and battery lifetime of mobile phones hinder the possible execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading of computationally intensive application parts from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed by using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring and computation offloading. These elastic mobile applications can run as a usual mobile application, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications, which involve costly computations, can benefit from offloading with around 95% energy savings and significant performance gains compared to local execution only.
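
    As a toy illustration of the offloading trade-off, the sketch below compares an estimated local execution time with the cost of shipping the input data and running remotely. The cost model, parameter names, and numbers are assumptions for illustration, not the actual MACS partitioning logic.

```python
# Toy offloading heuristic: run a method remotely only when the estimated
# remote cost (data transfer plus cloud execution) beats local execution.
# Illustrative cost model only; not the MACS middleware's algorithm.
def should_offload(local_secs, cloud_secs, payload_mb,
                   uplink_mbps=5.0, rtt_secs=0.1):
    transfer = payload_mb * 8 / uplink_mbps + rtt_secs   # seconds to ship input
    return transfer + cloud_secs < local_secs

# A heavy task is worth offloading; a quick one is not.
print(should_offload(local_secs=30.0, cloud_secs=3.0, payload_mb=2.0))   # True
print(should_offload(local_secs=0.5, cloud_secs=0.05, payload_mb=2.0))   # False
```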

  9. Virtual Machine Logbook - Enabling virtualization for ATLAS

    International Nuclear Information System (INIS)

    Yao Yushu; Calafiura, Paolo; Leggett, Charles; Poffet, Julien; Cavalli, Andrea; Frederic, Bapst

    2010-01-01

    ATLAS software has been developed mostly on the CERN linux cluster lxplus or on similar facilities at the experiment Tier 1 centers. The fast rise of virtualization technology has the potential to change this model, turning every laptop or desktop into an ATLAS analysis platform. In the context of the CernVM project we are developing a suite of tools and CernVM plug-in extensions to promote the use of virtualization for ATLAS analysis and software development. The Virtual Machine Logbook (VML), in particular, is an application to organize the work of physicists on multiple projects, logging their progress, and speeding up 'context switches' from one project to another. An important feature of VML is the ability to share with a single 'click' the status of a given project with other colleagues. VML builds upon the save and restore capabilities of mainstream virtualization software like VMware, and provides a technology-independent client interface to them. A lot of emphasis in the design and implementation has gone into optimizing the save and restore process to make it practical to store many VML entries on a typical laptop disk or to share a VML entry over the network. At the same time, taking advantage of CernVM's plugin capabilities, we are extending the CernVM platform to help increase the usability of ATLAS software. For example, we added the ability to start the ATLAS event display on any computer running CernVM simply by clicking a button in a web browser. We want to integrate VML seamlessly with CernVM's unique file system design to distribute ATLAS software efficiently to every physicist's computer. The CernVM File System (CVMFS) downloads files on demand via HTTP and caches them locally for future use. This reduces download sizes by an order of magnitude, making it practical for a developer to work with multiple software releases on a virtual machine.

  10. Enabling Graph Appliance for Genome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Rina [ORNL; Graves, Jeffrey A [ORNL; Lee, Sangkeun (Matt) [ORNL; Sukumar, Sreenivas R [ORNL; Shankar, Mallikarjun [ORNL

    2015-01-01

    In recent years, there has been a huge growth in the amount of genomic data available as reads generated from various genome sequencers. The number of reads generated can be huge, ranging from hundreds to billions of nucleotides, each varying in size. Assembling such large amounts of data is one of the challenging computational problems for both biomedical and data scientists. Most of the genome assemblers developed have used de Bruijn graph techniques. A de Bruijn graph represents a collection of read sequences by billions of vertices and edges, which require large amounts of memory and computational power to store and process. This is the major drawback to de Bruijn graph assembly. Massively parallel, multi-threaded, shared memory systems can be leveraged to overcome some of these issues. The objective of our research is to investigate the feasibility and scalability issues of de Bruijn graph assembly on Cray's Urika-GD system; Urika-GD is a high performance graph appliance with a large shared memory and massively multithreaded custom processor designed for executing SPARQL queries over large-scale RDF data sets. However, to the best of our knowledge, there is no research on representing a de Bruijn graph as an RDF graph or finding Eulerian paths in RDF graphs using SPARQL for potential genome discovery. In this paper, we address the issues involved in representing de Bruijn graphs as RDF graphs and propose an iterative querying approach for finding Eulerian paths in large RDF graphs. We evaluate the performance of our implementation on real-world Ebola genome datasets and illustrate how genome assembly can be accomplished with Urika-GD using iterative SPARQL queries.
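
    The sketch below shows the basic idea of turning reads into a k-mer de Bruijn graph and emitting its edges as RDF-style triples. The namespace and predicate are made up for illustration; the abstract does not specify the vocabulary or loading path used on Urika-GD.

```python
# Toy construction of a de Bruijn graph from reads, emitted as RDF-style
# triples. Nodes are (k-1)-mers and each k-mer contributes one edge. The
# namespace and predicate below are hypothetical, chosen for illustration.
def debruijn_triples(reads, k=4, ns="http://example.org/dbg#"):
    triples = []
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            left, right = kmer[:-1], kmer[1:]      # (k-1)-mer prefix and suffix
            triples.append((f"<{ns}{left}>", f"<{ns}overlaps>", f"<{ns}{right}>"))
    return triples

# Print the edges in an N-Triples-like form for two toy reads.
for s, p, o in debruijn_triples(["ACGTAC", "GTACGT"], k=4):
    print(s, p, o, ".")
```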

  11. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  12. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  13. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  14. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  15. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today's microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  16. "Nanotechnology Enabled Advanced Industrial Heat Transfer Fluids"

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Ganesh Skandan; Dr. Amit Singhal; Mr. Kenneth Eberts; Mr. Damian Sobrevilla; Prof. Jerry Shan; Stephen Tse; Toby Rossmann

    2008-06-12

    ABSTRACT: "Nanotechnology Enabled Advanced Industrial Heat Transfer Fluids." Improving the efficiency of industrial heat exchangers offers a great opportunity to improve overall process efficiencies in diverse industries such as pharmaceutical, materials manufacturing and food processing. The higher efficiencies can come in part from improved heat transfer during both cooling and heating of the material being processed. Additionally, there is great interest in enhancing the performance and reducing the weight of heat exchangers used in automobiles in order to increase fuel efficiency. The goal of the Phase I program was to develop nanoparticle-containing heat transfer fluids (e.g., antifreeze, water, silicone and hydrocarbon-based oils) that are used in transportation and in the chemical industry for heating, cooling and recovering waste heat. Much work has been done to date on investigating the potential use of nanoparticle-enhanced thermal fluids to improve heat transfer in heat exchangers. In most cases the effect in a commercial heat transfer fluid has been marginal at best. In the Phase I work, we demonstrated that the thermal conductivity, and hence heat transfer, of a fluid containing nanoparticles can be dramatically increased when subjected to an external influence. The increase in thermal conductivity was significantly larger than what is predicted by commonly used thermal models for two-phase materials. Additionally, the surface of the nanoparticles was engineered so as to have a minimal influence on the viscosity of the fluid. As a result, a nanoparticle-laden fluid was successfully developed that can lead to enhanced heat transfer in both industrial and automotive heat exchangers.

  17. The Grid-Enabled NMR Spectroscopy

    International Nuclear Information System (INIS)

    Lawenda, M.; Meyer, N.; Stroinski, M.; Popenda, L.; Gdaniec, Z.; Adamiak, R.W.

    2005-01-01

    The laboratory equipment used for experimental work is very expensive and unique as well. Only big regional or national centers can afford to purchase and use it, but on a very limited scale. That is a real problem that disqualifies all other research groups not having direct access to these instruments. Therefore the proposed framework plays a crucial role in equalizing the chances of all research groups. The Virtual Laboratory (VLab) project focuses its activity on embedding laboratory equipment in grid environments (handling HPC and visualization), touching some crucial issues not solved yet. In general, the issues concern standardizing the definition of laboratory equipment so that it can be treated as a simple grid resource, supporting the end user through workflow definition, introducing accounting, and prioritizing jobs that follow experiments on the equipment. Nowadays a lot of equipment can be accessed remotely via the network, but only in a way that moves the local management console/display across the network to simplify access. To manage experimental and post-processing data, as well as store them in an organized way, a special Digital Science Library was developed. The project delivers a framework to enable the usage of many different scientific facilities. The physical layer of the architecture includes the existing high-speed network, like PIONIER in Poland, and the HPC and visualization infrastructure. The application, in fact the framework, can be used in all experimental disciplines where access to physical equipment is crucial, e.g., chemistry (spectrometer), radio astronomy (radio telescope), and medicine (CAT scanner). The poster presentation will show how we deployed the concept in chemistry, supporting these disciplines with a grid environment and embedding the Bruker Avance 600 MHz and Varian 300 MHz spectrometers. (author)

  18. Risk Management Considerations in Cloud Computing Adoption

    OpenAIRE

    Doherty, Eileen; Carcary, Marian; Conway, Gerard

    2012-01-01

    Information and Communication Technology (ICT) plays a pivotal role in enabling organizational capability and productivity, and in initiating and facilitating innovation across all industry sectors. In recent years, cloud computing has emerged as a growing trend because it serves as an enabler of scalable, flexible and powerful computing. Consequently, each year significant global investment is made in migrating to the cloud environment. However, despite its growing po...

  19. Electromagnetic Induction: A Computer-Assisted Experiment

    Science.gov (United States)

    Fredrickson, J. E.; Moreland, L.

    1972-01-01

    By using minimal equipment it is possible to demonstrate Faraday's Law. An electronic desk calculator enables sophomore students to solve a difficult mathematical expression for the induced EMF. Polaroid pictures of the plot of induced EMF, together with the computer facility, enable students to make comparisons. (PS)
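    As a small illustration of the quantity the students compute, the induced EMF can be estimated from sampled flux values via Faraday's law, EMF = -N dPhi/dt; the turn count and flux samples below are invented for the example.

        # Sketch: estimating induced EMF from sampled magnetic flux, emf = -N * dPhi/dt.
        N = 200                                    # turns in the coil (made-up value)
        t = [0.00, 0.01, 0.02, 0.03]               # time samples (s)
        phi = [0.0, 2.0e-4, 3.5e-4, 4.2e-4]        # flux through one turn (Wb)

        emf = [-N * (phi[i + 1] - phi[i]) / (t[i + 1] - t[i]) for i in range(len(t) - 1)]
        print(emf)                                 # approx. [-4.0, -3.0, -1.4] volts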

  20. Heat Treatment Used to Strengthen Enabling Coating Technology for Oil-Free Turbomachinery

    Science.gov (United States)

    Edmonds, Brian J.; DellaCorte, Christopher

    2002-01-01

    The PS304 high-temperature solid lubricant coating is a key enabling technology for Oil- Free turbomachinery propulsion and power systems. Breakthroughs in the performance of advanced foil air bearings and improvements in computer-based finite element modeling techniques are the key technologies enabling the development of Oil-Free aircraft engines being pursued by the Oil-Free Turbomachinery team at the NASA Glenn Research Center. PS304 is a plasma spray coating applied to the surface of shafts operating against foil air bearings or in any other component requiring solid lubrication at high temperatures, where conventional materials such as graphite cannot function.

  1. Computer applications in radiation protection

    International Nuclear Information System (INIS)

    Cole, P.R.; Moores, B.M.

    1995-01-01

    Computer applications in general, and in diagnostic radiology in particular, are becoming more widespread. Their application to the field of radiation protection in medical imaging, including quality control initiatives, is similarly becoming more widespread. Advances in computer technology have enabled departments of diagnostic radiology to have access to powerful yet affordable personal computers. The application of databases, expert systems and computer-based learning is under way. Executive information systems for the management of dose and QA data that are under way at IRS are discussed. An important consideration in developing these pragmatic software tools has been the range of computer literacy within the end user group. User interfaces have been specifically designed to reflect the requirements of the many end users who will have little or no computer knowledge. (Author)

  2. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed. Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  3. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  4. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into the design of future computers in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  5. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  6. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  7. Enhancement of automated blood flow estimates (ENABLE) from arterial spin-labeled MRI.

    Science.gov (United States)

    Shirzadi, Zahra; Stefanovic, Bojana; Chappell, Michael A; Ramirez, Joel; Schwindt, Graeme; Masellis, Mario; Black, Sandra E; MacIntosh, Bradley J

    2018-03-01

    To validate a multiparametric automated algorithm, ENhancement of Automated Blood fLow Estimates (ENABLE), that identifies useful and poor arterial spin-labeled (ASL) difference images in multiple postlabeling delay (PLD) acquisitions and thereby improve clinical ASL. ENABLE is a sort/check algorithm that uses a linear combination of ASL quality features. ENABLE uses simulations to determine quality weighting factors based on an unconstrained nonlinear optimization. We acquired a set of 6-PLD ASL images with 1.5T or 3.0T systems among 98 healthy elderly and adults with mild cognitive impairment or dementia. We contrasted the signal-to-noise ratio (SNR) of cerebral blood flow (CBF) images obtained with ENABLE vs. conventional ASL analysis. In a subgroup, we validated our CBF estimates with single-photon emission computed tomography (SPECT) CBF images. ENABLE produced significantly increased SNR compared to a conventional ASL analysis (Wilcoxon signed-rank test, P < 0.0001); ENABLE-derived CBF images were similar to the SPECT CBF images (Wilcoxon signed-rank test, P < 0.0001), and this similarity was strongly related to ASL SNR (t = 24, P < 0.0001). These findings suggest that ENABLE improves CBF image quality from multiple PLD ASL in dementia cohorts at either 1.5T or 3.0T, achieved by multiparametric quality features that guided postprocessing of dementia ASL. Level of Evidence: 2. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018;47:647-655. © 2017 International Society for Magnetic Resonance in Medicine.
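    The central step, scoring each ASL difference image with a linear combination of quality features and keeping only the best-scoring volumes, can be sketched as follows; the feature names, weights, and threshold are illustrative stand-ins rather than the published ENABLE parameters.

        # Sketch: rank ASL difference images by a weighted sum of quality features
        # (hypothetical weights and threshold, not the published ENABLE values).
        import numpy as np

        weights = np.array([0.5, 0.3, 0.2])            # hypothetical feature weights
        volumes = {                                    # per-volume quality features,
            "pld1_rep1": np.array([0.9, 0.8, 0.7]),    # e.g. (similarity, SNR, smoothness)
            "pld1_rep2": np.array([0.2, 0.3, 0.4]),
            "pld2_rep1": np.array([0.8, 0.9, 0.6]),
        }

        scores = {name: float(np.dot(feat, weights)) for name, feat in volumes.items()}
        keep = [name for name, s in sorted(scores.items(), key=lambda kv: -kv[1]) if s > 0.5]
        print(keep)                                    # volumes retained for CBF estimation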

  8. Arduino-enabled Patron Interaction Counting

    Directory of Open Access Journals (Sweden)

    Tim Ribaric

    2013-04-01

    Using the Arduino development board (http://arduino.cc) has become a very popular way to create hardware prototypes that bridge the divide between the physical world and the Internet. This article outlines how to use an Arduino, some off-the-shelf electronic parts, the Processing programming language, and Google Documents to create a push-button reference desk transaction tally device. The design: plugged into a computer at the reference desk, staff members push the appropriate button on the device when a reference transaction occurs, and the action is instantly tallied in a Google Document. Having a physical device on the desktop increases the chances of proper collection of information, since it is constantly visible and easily accessible, versus requiring staff members to click through a series of options in a piece of software running on the PC. The data can be tabulated in Google Documents or any other source that processes form-based HTML data. This article covers all of the major components of creating the project: constructing the Arduino circuit and programming it, creating the Google Docs form, and creating the Processing program that will listen for information from the Arduino and send it to the Google Docs form.
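    The glue between the device and the spreadsheet can be sketched as a small listener that reads button identifiers from the serial port and submits them to a form endpoint. The article itself uses a Processing sketch and a Google Docs form; the Python version below (using the pyserial and requests libraries, with a made-up port name and form URL) only illustrates the same pattern.

        # Sketch: relay button presses from the Arduino's serial port to a web form.
        # The port name and form URL are placeholders; the article uses Processing.
        import serial      # pyserial
        import requests

        PORT = "/dev/ttyUSB0"                          # hypothetical serial port
        FORM_URL = "https://example.org/tally-form"    # hypothetical form endpoint

        with serial.Serial(PORT, 9600, timeout=1) as arduino:
            while True:
                label = arduino.readline().decode("ascii", errors="ignore").strip()
                if label:                              # e.g. "directional" or "reference"
                    requests.post(FORM_URL, data={"transaction_type": label})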

  9. Bioblendstocks that Enable High Efficiency Engine Designs

    Energy Technology Data Exchange (ETDEWEB)

    McCormick, Robert L.; Fioroni, Gina M.; Ratcliff, Matthew A.; Zigler, Bradley T.; Farrell, John

    2016-11-03

    The past decade has seen a high level of innovation in production of biofuels from sugar, lipid, and lignocellulose feedstocks. As discussed in several talks at this workshop, ethanol blends in the E25 to E50 range could enable more highly efficient spark-ignited (SI) engines. This is because of their knock resistance properties that include not only high research octane number (RON), but also charge cooling from high heat of vaporization, and high flame speed. Emerging alcohol fuels such as isobutanol or mixed alcohols have desirable properties such as reduced gasoline blend vapor pressure, but also have lower RON than ethanol. These fuels may be able to achieve the same knock resistance benefits, but likely will require higher blend levels or higher RON hydrocarbon blendstocks. A group of very high RON (>150) oxygenates such as dimethyl furan, methyl anisole, and related compounds are also produced from biomass. While providing no increase in charge cooling, their very high octane numbers may provide adequate knock resistance for future highly efficient SI engines. Given this range of options for highly knock resistant fuels there appears to be a critical need for a fuel knock resistance metric that includes effects of octane number, heat of vaporization, and potentially flame speed. Emerging diesel fuels include highly branched long-chain alkanes from hydroprocessing of fats and oils, as well as sugar-derived terpenoids. These have relatively high cetane number (CN), which may have some benefits in designing more efficient CI engines. Fast pyrolysis of biomass can produce diesel boiling range streams that are high in aromatic, oxygen and acid contents. Hydroprocessing can be applied to remove oxygen and consequently reduce acidity, however there are strong economic incentives to leave up to 2 wt% oxygen in the product. This oxygen will primarily be present as low CN alkyl phenols and aryl ethers. While these have high heating value, their presence in diesel fuel

  10. Robust Brain-Computer Interfaces

    NARCIS (Netherlands)

    Reuderink, B.

    2011-01-01

    A brain-computer interface (BCI) enables direct communication from the brain to devices, bypassing the traditional pathway of peripheral nerves and muscles. Current BCIs aimed at patients require that the user invests weeks, or even months, to learn the skill to intentionally modify their brain

  11. Engineering and Computing Portal to Solve Environmental Problems

    Science.gov (United States)

    Gudov, A. M.; Zavozkin, S. Y.; Sotnikov, I. Y.

    2018-01-01

    This paper describes architecture and services of the Engineering and Computing Portal, which is considered to be a complex solution that provides access to high-performance computing resources, enables to carry out computational experiments, teach parallel technologies and solve computing tasks, including technogenic safety ones.

  12. An Overview of Computer-Based Natural Language Processing.

    Science.gov (United States)

    Gevarter, William B.

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines using natural languages (English, Japanese, German, etc.) rather than formal computer languages. NLP is a major research area in the fields of artificial intelligence and computational linguistics. Commercial…

  13. Cloud computing as a new technology trend in education

    OpenAIRE

    Шамина, Ольга Борисовна; Буланова, Татьяна Валентиновна

    2014-01-01

    The construction and operation of extremely large-scale, commodity-computer datacenters was the key enabler of Cloud Computing. Cloud Computing offers services that can be put to good use in education. With Cloud Computing it is possible to increase the quality of education, improve communicative culture, and give teachers and students new application opportunities.

  14. Enabling web users and developers to script accessibility with Accessmonkey.

    Science.gov (United States)

    Bigham, Jeffrey P; Brudvik, Jeremy T; Leung, Jessica O; Ladner, Richard E

    2009-07-01

    Efficient web access remains elusive for blind computer users. Previous efforts to improve web accessibility have focused on developer awareness, automated improvement, and legislation, but these approaches have left remaining concerns. First, while many tools can help produce accessible content, most are difficult to integrate into existing developer workflows and rarely offer specific suggestions that developers can implement. Second, tools that automatically improve web content for users generally solve specific problems and are difficult to combine and use on a diversity of existing assistive technology. Finally, although blind web users have proven adept at overcoming the shortcomings of the web and existing tools, they have been only marginally involved in improving the accessibility of their own web experience. In a step toward addressing these concerns, we have developed Accessmonkey, a common scripting framework that web users, web developers and web researchers can use to collaboratively improve accessibility. This framework advances the idea that Javascript and dynamic web content can be used to improve inaccessible content instead of being a cause of it. Using Accessmonkey, web users and developers on different platforms and with potentially different goals can collaboratively make the web more accessible. In this article, we first present the design of the Accessmonkey framework and offer several example scripts that demonstrate the utility of our approach. We conclude by discussing possible future extensions that will provide easy access to scripts as users browse the web and enable non-technical blind users to independently create and share improvements.

  15. A review of the Technologies Enabling Agile Manufacturing program

    Energy Technology Data Exchange (ETDEWEB)

    Gray, W.H.; Neal, R.E.; Cobb, C.K.

    1996-10-01

    Addressing a technical plan developed in consideration with major US manufacturers, software and hardware providers, and government representatives, the Technologies Enabling Agile Manufacturing (TEAM) program is leveraging the expertise and resources of industry, universities, and federal agencies to develop, integrate, and deploy leap-ahead manufacturing technologies. One of the TEAM program's goals is to transition products from design to production faster, more efficiently, and at less cost. TEAM's technology development strategy also provides all participants with early experience in establishing and working within an electronic enterprise that includes access to high-speed networks and high-performance computing and storage systems. The TEAM program uses the cross-cutting tools it collects, develops, and integrates to demonstrate and deploy agile manufacturing capabilities for three high-priority processes identified by industry: material removal, sheet metal forming, and electro-mechanical assembly. This paper reviews the current status of the TEAM program with emphasis upon TEAM's information infrastructure.

  16. SimPhospho: a software tool enabling confident phosphosite assignment.

    Science.gov (United States)

    Suni, Veronika; Suomi, Tomi; Tsubosaka, Tomoya; Imanishi, Susumu Y; Elo, Laura L; Corthals, Garry L

    2018-03-27

    Mass spectrometry combined with enrichment strategies for phosphorylated peptides has been successfully employed for two decades to identify sites of phosphorylation. However, unambiguous phosphosite assignment is considered challenging. Given that site-specific phosphorylation events function as different molecular switches, validation of phosphorylation sites is of utmost importance. In our earlier study we developed a method based on simulated phosphopeptide spectral libraries, which enables highly sensitive and accurate phosphosite assignments. To promote more widespread use of this method, we here introduce a software implementation with improved usability and performance. We present SimPhospho, a fast and user-friendly tool for accurate simulation of phosphopeptide tandem mass spectra. Simulated phosphopeptide spectral libraries are used to validate and supplement database search results, with a goal to improve reliable phosphoproteome identification and reporting. The presented program can be easily used together with the Trans-Proteomic Pipeline and integrated in a phosphoproteomics data analysis workflow. SimPhospho is available for Windows, Linux and Mac operating systems at https://sourceforge.net/projects/simphospho/. It is open source and implemented in C++. A user's manual with detailed description of data analysis using SimPhospho as well as test data can be found as supplementary material of this article. Supplementary data are available at https://www.btk.fi/research/computational-biomedicine/software/.

  17. Electrical bioimpedance enabling prompt intervention in traumatic brain injury

    Science.gov (United States)

    Seoane, Fernando; Atefi, S. Reza

    2017-05-01

    Electrical Bioimpedance (EBI) is a well-established technology used in clinical practice across the world. Advancements in textile material technology, with conductive textile fabrics and textile-electronics integration, have allowed the exploration of potential applications for wearable measurement sensors and systems. The sensing principle of electrical bioimpedance is based on the intrinsic passive dielectric properties of biological tissue. Using a pair of electrodes, tissue is electrically stimulated and the electrical response can be sensed with another pair of surface electrodes. EBI spectroscopy for cerebral monitoring of neurological conditions such as stroke and perinatal asphyxia in newborns has been justified using animal studies and computational simulations. Such studies have shown proof of principle that neurological pathologies indeed modify the dielectric composition of the brain in a way that is detectable via EBI. Similar to stroke, Traumatic Brain Injury (TBI) also affects the dielectric properties of brain tissue, and these changes can be detected via EBI measurements. Considering the portable and noninvasive characteristics of EBI, it is potentially useful for prehospital triage of TBI patients. On the battlefield, blast-induced traumatic brain injuries are very common, and brain damage must be assessed promptly to have a chance of preventing severe damage or eventually death. The relatively low complexity of the sensing hardware required for EBI and the already proven compatibility with textile electrodes suggest that EBI is indeed a candidate technology for developing a handheld device, equipped with a sensorized textile cap, that can produce an examination in minutes and enable medically guided prompt intervention.

  18. Enabling search services on outsourced private spatial data

    KAUST Repository

    Yiu, Man Lung

    2009-10-30

    Cloud computing services enable organizations and individuals to outsource the management of their data to a service provider in order to save on hardware investments and reduce maintenance costs. Only authorized users are allowed to access the data. Nobody else, including the service provider, should be able to view the data. For instance, a real-estate company that owns a large database of properties wants to allow its paying customers to query for houses according to location. On the other hand, the untrusted service provider should not be able to learn the property locations and, e.g., sell the information to a competitor. To tackle the problem, we propose to transform the location datasets before uploading them to the service provider. The paper develops a spatial transformation that re-distributes the locations in space, and it also proposes a cryptographic-based transformation. The data owner selects the transformation key and shares it with authorized users. Without the key, it is infeasible to reconstruct the original data points from the transformed points. The proposed transformations present distinct trade-offs between query efficiency and data confidentiality. In addition, we describe attack models for studying the security properties of the transformations. Empirical studies demonstrate that the proposed methods are efficient and applicable in practice. © 2009 Springer-Verlag.
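    One simple instance of a key-driven spatial transformation (not the specific schemes proposed in the paper) is to rotate, scale, and translate every point using parameters derived from a secret key, so that an authorized user holding the key can map query points into the same transformed space.

        # Sketch: a keyed affine transformation of 2-D locations (illustrative only;
        # the paper's actual spatial and cryptographic transformations differ).
        import hashlib
        import math

        def params_from_key(key: bytes):
            """Derive rotation, scale, and translation deterministically from the key."""
            digest = hashlib.sha256(key).digest()
            angle = int.from_bytes(digest[:4], "big") / 2**32 * 2 * math.pi
            scale = 1.0 + int.from_bytes(digest[4:8], "big") / 2**32
            tx = int.from_bytes(digest[8:12], "big") / 2**16
            ty = int.from_bytes(digest[12:16], "big") / 2**16
            return angle, scale, tx, ty

        def transform(point, key: bytes):
            """Map an (x, y) location into the key-dependent transformed space."""
            x, y = point
            angle, scale, tx, ty = params_from_key(key)
            xt = scale * (x * math.cos(angle) - y * math.sin(angle)) + tx
            yt = scale * (x * math.sin(angle) + y * math.cos(angle)) + ty
            return xt, yt

        # The owner transforms all locations before outsourcing them; an authorized
        # user transforms the query point with the same shared key.
        print(transform((40.7128, -74.0060), key=b"shared-secret"))

    A plain affine transformation like this is easy to analyze and therefore weak on its own, which is one reason the paper examines attack models and trades off query efficiency against data confidentiality across several transformations.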

  19. Social-ecological enabling conditions for payments for ecosystem services

    OpenAIRE

    Heidi R. Huber-Stearns; Drew E. Bennett; Stephen Posner; Ryan C. Richards; Jenn Hoyle. Fair; Stella J. M. Cousins; Chelsie L. Romulo

    2017-01-01

    The concept of "enabling conditions" centers on conditions that facilitate approaches to addressing social and ecological challenges. Although multiple fields have independently addressed the concept of enabling conditions, the literature lacks a shared understanding or integration of concepts. We propose a more synthesized understanding of enabling conditions beyond disciplinary boundaries by focusing on the enabling conditions that influence the implementation of a range of environmental p...

  20. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, is an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, at 0.46.

  1. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  2. People avoid situations that enable them to deceive others

    NARCIS (Netherlands)

    Shalvi, S.; Handgraaf, M.J.J.; de Dreu, C.K.W.

    2011-01-01

    Information advantage enables people to benefit themselves by deceiving their counterparts. Using a modified ultimatum bargaining game with an exit option, we find that people are more likely to avoid settings enabling them to privately deceive their counterparts than settings which do not enable

  3. Perceptually-Inspired Computing

    Directory of Open Access Journals (Sweden)

    Ming Lin

    2015-08-01

    Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and develop perceptually-inspired computational models. In this talk, I will survey some recent efforts on perceptually-inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, as well as perceptually-inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multi-modal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experience. I will conclude by discussing possible future research directions.

  4. Progress in computational toxicology.

    Science.gov (United States)

    Ekins, Sean

    2014-01-01

    Computational methods have been widely applied to toxicology across pharmaceutical, consumer product and environmental fields over the past decade. Progress in computational toxicology is now reviewed. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity. In addition, various publications that use machine learning methods have been highlighted. Several computational toxicology model datasets from past publications were used to compare Bayesian and Support Vector Machine (SVM) learning methods. The increasing amounts of data for defined toxicology endpoints have enabled machine learning models that are increasingly used for predictions. It is shown that across many different models Bayesian and SVM perform similarly based on cross-validation data. Considerable progress has been made in computational toxicology in a decade, in both model development and the availability of larger scale or 'big data' models. Future efforts in toxicology data generation will likely provide us with hundreds of thousands of compounds that are readily accessible for machine learning models. These models will cover relevant chemistry space for pharmaceutical, consumer product and environmental applications. Copyright © 2013 Elsevier Inc. All rights reserved.
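    The kind of head-to-head comparison described, Bayesian versus SVM learners evaluated by cross-validation, can be reproduced in outline with scikit-learn; the fingerprints and labels below are random placeholders rather than a real toxicology dataset.

        # Sketch: comparing a Bayesian classifier and an SVM by cross-validation
        # on placeholder data (a real study would use molecular descriptors).
        import numpy as np
        from sklearn.naive_bayes import BernoulliNB
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.integers(0, 2, size=(500, 1024))    # stand-in binary fingerprints
        y = rng.integers(0, 2, size=500)            # stand-in toxic / non-toxic labels

        for name, model in [("Bernoulli NB", BernoulliNB()), ("SVM (RBF)", SVC())]:
            auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
            print(f"{name}: AUC = {auc.mean():.2f} +/- {auc.std():.2f}")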

  5. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  6. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  7. The OptIPuter microscopy demonstrator: enabling science through a transatlantic lightpath

    Science.gov (United States)

    Ellisman, M.; Hutton, T.; Kirkland, A.; Lin, A.; Lin, C.; Molina, T.; Peltier, S.; Singh, R.; Tang, K.; Trefethen, A.E.; Wallom, D.C.H.; Xiong, X.

    2009-01-01

    The OptIPuter microscopy demonstrator project has been designed to enable concurrent and remote usage of world-class electron microscopes located in Oxford and San Diego. The project has constructed a network consisting of microscopes and computational and data resources that are all connected by a dedicated network infrastructure using the UK Lightpath and US Starlight systems. Key science drivers include examples from both materials and biological science. The resulting system is now a permanent link between the Oxford and San Diego microscopy centres. This will form the basis of further projects between the sites and expansion of the types of systems that can be remotely controlled, including optical, as well as electron, microscopy. Other improvements will include the updating of the Microsoft cluster software to the high performance computing (HPC) server 2008, which includes the HPC basic profile implementation that will enable the development of interoperable clients. PMID:19487201

  8. The OptIPuter microscopy demonstrator: enabling science through a transatlantic lightpath.

    Science.gov (United States)

    Ellisman, M; Hutton, T; Kirkland, A; Lin, A; Lin, C; Molina, T; Peltier, S; Singh, R; Tang, K; Trefethen, A E; Wallom, D C H; Xiong, X

    2009-07-13

    The OptIPuter microscopy demonstrator project has been designed to enable concurrent and remote usage of world-class electron microscopes located in Oxford and San Diego. The project has constructed a network consisting of microscopes and computational and data resources that are all connected by a dedicated network infrastructure using the UK Lightpath and US Starlight systems. Key science drivers include examples from both materials and biological science. The resulting system is now a permanent link between the Oxford and San Diego microscopy centres. This will form the basis of further projects between the sites and expansion of the types of systems that can be remotely controlled, including optical, as well as electron, microscopy. Other improvements will include the updating of the Microsoft cluster software to the high performance computing (HPC) server 2008, which includes the HPC basic profile implementation that will enable the development of interoperable clients.

  9. The fusion code XGC: Enabling kinetic study of multi-scale edge turbulent transport in ITER

    Energy Technology Data Exchange (ETDEWEB)

    D' Azevedo, Eduardo [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Abbott, Stephen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Koskela, Tuomas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Worley, Patrick [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ku, Seung-Hoe [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Ethier, Stephane [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Yoon, Eisung [Rensselaer Polytechnic Inst., Troy, NY (United States); Shephard, Mark [Rensselaer Polytechnic Inst., Troy, NY (United States); Hager, Robert [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Lang, Jianying [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Intel Corporation, Santa Clara, CA (United States); Choi, Jong [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Podhorszki, Norbert [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Klasky, Scott [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Parashar, Manish [Rutgers Univ., Piscataway, NJ (United States); Chang, Choong-Seock [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States)

    2017-01-01

    The XGC fusion gyrokinetic code combines state-of-the-art, portable computational and algorithmic technologies to enable complicated multiscale simulations of turbulence and transport dynamics in ITER edge plasma on the largest US open-science computer, the Cray XK7 Titan, at its maximal heterogeneous capability. Such simulations were not possible before because the time-to-solution fell short, by a factor of more than 10, of completing one physics case in less than 5 days of wall-clock time. Frontier techniques employed include nested OpenMP parallelism, adaptive parallel I/O, staging I/O, data reduction using dynamic and asynchronous application interactions, and dynamic repartitioning.

  10. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  11. Lean computing for the cloud

    CERN Document Server

    Bauer, Eric

    2016-01-01

    Applies lean manufacturing principles across the cloud service delivery chain to enable application and infrastructure service providers to sustainably achieve the shortest lead time, best quality, and value. This book focuses on lean in the context of cloud computing capacity management of applications and the physical and virtual cloud resources that support them. Lean Computing for the Cloud considers business, architectural and operational aspects of efficiently delivering valuable services to end users via cloud-based applications hosted on shared cloud infrastructure. The work also focuses on overall optimization of the service delivery chain to enable both application service and infrastructure service providers to adopt leaner, demand driven operations to serve end users more efficiently. The book's early chapters analyze how capacity management morphs with cloud computing into interlocked physical infrastructure capacity management, virtual resource capacity management, and application capacity ma...

  12. Ocean Tide Loading Computation

    Science.gov (United States)

    Agnew, Duncan Carr

    2005-01-01

    September 15, 2003 through May 15, 2005. This grant funds the maintenance, updating, and distribution of programs for computing ocean tide loading, to enable the corrections for such loading to be more widely applied in space-geodetic and gravity measurements. These programs, developed under funding from the CDP and DOSE programs, incorporate the most recent global tidal models developed from TOPEX/Poseidon data, and also local tide models for regions around North America; the design of the algorithm and software makes it straightforward to combine local and global models.

  13. Method for computed tomography

    International Nuclear Information System (INIS)

    Wagner, W.

    1980-01-01

    In transversal computer tomography apparatus, in which the positioning zone in which the patient can be positioned is larger than the scanning zone in which a body slice can be scanned, reconstruction errors are liable to occur. These errors are caused by incomplete irradiation of the body during examination. They become manifest not only as an incorrect image of the area not irradiated, but also have an adverse effect on the image of the other, completely irradiated areas. The invention enables reduction of these errors

  14. Current status and future prospects for enabling chemistry technology in the drug discovery process.

    Science.gov (United States)

    Djuric, Stevan W; Hutchins, Charles W; Talaty, Nari N

    2016-01-01

    This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of "dangerous" reagents. Also featured are advances in the "computer-assisted drug design" area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities.

  15. Current status and future prospects for enabling chemistry technology in the drug discovery process

    Science.gov (United States)

    Djuric, Stevan W.; Hutchins, Charles W.; Talaty, Nari N.

    2016-01-01

    This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of “dangerous” reagents. Also featured are advances in the “computer-assisted drug design” area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities. PMID:27781094

  16. Secure and Lightweight Cloud-Assisted Video Reporting Protocol over 5G-Enabled Vehicular Networks.

    Science.gov (United States)

    Nkenyereye, Lewis; Kwon, Joonho; Choi, Yoon-Ho

    2017-09-23

    In vehicular networks, the real-time video reporting service is used to send videos recorded in the vehicle to the cloud. However, when facilitating the real-time video reporting service in vehicular networks, the fourth generation (4G) long term evolution (LTE) was shown to suffer from latency, while the IEEE 802.11p standard does not offer sufficient scalability for such a congested environment. To overcome those drawbacks, the fifth-generation (5G)-enabled vehicular network is considered a promising technology for empowering the real-time video reporting service. In this paper, we note that security and privacy related issues should also be carefully addressed to boost the early adoption of 5G-enabled vehicular networks. There exist a few research works on secure video reporting service in 5G-enabled vehicular networks. However, their usage is limited because of public key certificates and expensive pairing operations. Thus, we propose a secure and lightweight protocol for cloud-assisted video reporting service in 5G-enabled vehicular networks. Compared to conventional public key certificates, the proposed protocol achieves entities' authorization through anonymous credentials. Also, by using lightweight security primitives instead of expensive bilinear pairing operations, the proposed protocol minimizes the computational overhead. From the evaluation results, we show that the proposed protocol requires less computation and communication time for the cryptographic primitives than the well-known Eiza-Ni-Shi protocol.

  17. Computer-Based Concept Maps for Enabling Multilingual Education in Computer Science: A Basque, English and Spanish Languages Case

    Science.gov (United States)

    Arruarte, Ana; Elorriaga, Jon A.; Calvo, Inaki; Larranaga, Mikel; Rueda, Urko

    2012-01-01

    Inside the globalisation era in which society is immersed, one of the current challenges for any educational system is to provide quality education. While some countries are linguistically homogeneous, many countries and regions display a wealth of linguistic diversity and it is essential to adapt the educational system to those realities. In…

  18. Coordination processes in computer supported collaborative writing

    NARCIS (Netherlands)

    Kanselaar, G.; Erkens, Gijsbert; Jaspers, Jos; Prangsma, M.E.

    2005-01-01

    In the COSAR-project a computer-supported collaborative learning environment enables students to collaborate in writing an argumentative essay. The TC3 groupware environment (TC3: Text Composer, Computer supported and Collaborative) offers access to relevant information sources, a private notepad, a

  19. Beyond CMOS computing with spin and polarization

    Science.gov (United States)

    Manipatruni, Sasikanth; Nikonov, Dmitri E.; Young, Ian A.

    2018-04-01

    Spintronic and multiferroic systems are leading candidates for achieving attojoule-class logic gates for computing, thereby enabling the continuation of Moore's law for transistor scaling. However, shifting the materials focus of computing towards oxides and topological materials requires a holistic approach addressing energy, stochasticity and complexity.

  20. Memristor for computing: myth or reality?

    NARCIS (Netherlands)

    Hamdioui, S.; Kvatinsky, S.; Cauwenberghs, G.; Xie, L.; Wald, N.; Joshi, S.; Elsayed, H.M.; Corporaal, H.; Bertels, K.

    2017-01-01

    CMOS technology and its sustainable scaling have been the enablers for the design and manufacturing of computer architectures that have been fuelling a wider range of applications. Today, however, both the technology and the computer architectures are suffering from serious challenges/ walls making

  1. The Fourth Revolution--Computers and Learning.

    Science.gov (United States)

    Bork, Alfred

    The personal computer is sparking a major historical change in the way people learn, a change that could lead to the disappearance of formal education as we know it. The computer can help resolve many of the difficulties now crippling education by enabling expert teachers and curriculum developers to prepare interactive and individualized…

  2. Revolutionize Propulsion Test Facility High-Speed Video Imaging with Disruptive Computational Photography Enabling Technology

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced rocket propulsion testing requires high-speed video recording that can capture essential information for NASA during rocket engine flight certification...

  3. Enabling three-dimensional densitometric measurements using laboratory source X-ray micro-computed tomography

    Science.gov (United States)

    Pankhurst, M. J.; Fowler, R.; Courtois, L.; Nonni, S.; Zuddas, F.; Atwood, R. C.; Davis, G. R.; Lee, P. D.

    2018-01-01

    We present new software allowing significantly improved quantitative mapping of the three-dimensional density distribution of objects using laboratory source polychromatic X-rays via a beam characterisation approach (c.f. filtering or comparison to phantoms). One key advantage is that a precise representation of the specimen material is not required. The method exploits well-established, widely available, non-destructive and increasingly accessible laboratory-source X-ray tomography. Beam characterisation is performed in two stages: (1) projection data are collected through a range of known materials utilising a novel hardware design integrated into the rotation stage; and (2) a Python code optimises a spectral response model of the system. We provide hardware designs for use with a rotation stage able to be tilted, yet the concept is easily adaptable to virtually any laboratory system and sample, and implicitly corrects the image artefact known as beam hardening.
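    The beam-characterisation step amounts to finding an effective spectral weighting that reproduces the transmission measured through the known calibration materials under the Beer-Lambert law. The sketch below fits such per-energy weights with non-negative least squares; the energy bins, attenuation values, and measurements are invented, and the authors' actual spectral response model is more detailed.

        # Sketch: estimate effective spectral weights w(E) so a polychromatic
        # Beer-Lambert model matches transmissions measured through known materials.
        # Energy bins, attenuation values, and measurements are invented.
        import numpy as np
        from scipy.optimize import nnls

        # mu * thickness for each calibration step (rows) and energy bin (columns):
        attenuation = np.array([
            [0.8, 0.3, 0.1],      # thin aluminium step
            [1.6, 0.6, 0.2],      # thick aluminium step
            [3.0, 1.2, 0.5],      # copper step
        ])
        measured_T = np.array([0.55, 0.35, 0.18])    # measured transmissions per step

        A = np.exp(-attenuation)           # per-bin transmission for each step
        w, residual = nnls(A, measured_T)  # non-negative effective spectral weights
        w /= w.sum()                       # normalise so the weights sum to one
        print(w, residual)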

  4. WorkStream-- A Design Pattern for Multicore-Enabled Finite Element Computations

    KAUST Repository

    Turcksin, Bruno; Kronbichler, Martin; Bangerth, Wolfgang

    2016-01-01

    , matrix assembly, estimating discretization errors, or converting nodal values into data structures that can be output in visualization file formats all fall into this class of operations. Using this realization, we identify a software design pattern

  5. Using Partial Reconfiguration and Message Passing to Enable FPGA-Based Generic Computing Platforms

    Directory of Open Access Journals (Sweden)

    Manuel Saldaña

    2012-01-01

    Partial reconfiguration (PR) is an FPGA feature that allows the modification of certain parts of an FPGA while the rest of the system continues to operate without disruption. This distinctive characteristic of FPGAs has many potential benefits but also challenges. The lack of good CAD tools and the deep hardware knowledge requirement result in a hard-to-use feature. In this paper, the new partition-based Xilinx PR flow is used to incorporate PR within our MPI-based message-passing framework to allow hardware designers to create template bitstreams, which are predesigned, prerouted, generic bitstreams that can be reused for multiple applications. As an example of the generality of this approach, four different applications that use the same template bitstream are run consecutively, with a PR operation performed at the beginning of each application to instantiate the desired application engine. We demonstrate a simplified, reusable, high-level, and portable PR interface for X86-FPGA hybrid machines. PR issues such as local resets of reconfigurable modules and context saving and restoring are addressed in this paper followed by some examples and preliminary PR overhead measurements.

  6. Enabling On-Demand Database Computing with MIT SuperCloud Database Management System

    Science.gov (United States)

    2015-09-15

    arc.liv.ac.uk/trac/SGE) provides these services and is independent of programming language (C, Fortran, Java, Matlab, etc.) or parallel programming...a MySQL database to store DNS records. The DNS records are controlled via a simple web service interface that allows records to be created

  7. Combining Archetypes, Ontologies and Formalization Enables Automated Computation of Quality Indicators

    NARCIS (Netherlands)

    Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald

    2017-01-01

    ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a

  8. High Performance Computing and Enabling Technologies for Nano and Bio Systems and Interfaces

    Science.gov (United States)

    2014-12-12

    Thuy Hien T., Z. Liu, and Preston B. Moore. 2013. Molecular Dynamics Simulations of Homo-oligomeric Bundles Embedded Within a Lipid Bilayer... assembly process that leads to the formation of different nanostructures [21]. Fullerenes functionalized with different ionic groups have been shown to... relation [14]. The analysis indicated that there is no collective diffusion of molecular clusters in the mixture and the pure liquid. The present work...

  9. Role of proactive behaviour enabled by advanced computational intelligence and ICT in Smart Energy Grids

    NARCIS (Netherlands)

    Nguyen, P.H.; Kling, W.L.; Ribeiro, P.F.; Venayagamoorthy, G.K.; Croes, R.

    2013-01-01

    The significant increase in renewable energy production and new forms of consumption has an enormous impact on electrical power grid operation. A Smart Energy Grid (SEG) is needed to overcome the challenge of a sustainable and reliable energy supply by merging advanced ICT and control techniques to...

  10. A Design for Computationally Enabled Analyses Supporting the Pre-Intervention Analytical Framework (PIAF)

    Science.gov (United States)

    2015-06-01

    public release; distribution is unlimited. The US Army Engineer Research and Development Center (ERDC) solves the nation's toughest engineering and... Framework (PIAF), Timothy K. Perkins and Chris C. Rewerts, Construction Engineering Research Laboratory, U.S. Army Engineer Research and Development Center... Prepared for U.S. Army Corps of Engineers, Washington, DC 20314-1000, Under Project P2 335530, "Cultural Reasoning and Ethnographic Analysis for the...

  11. Mesoscopic distinct element method-enabled multiscale computational design of carbon nanotube-based composite materials

    Data.gov (United States)

    National Aeronautics and Space Administration — There is a sustained effort to develop super-lightweight composites by using polymer impregnation of carbon nanotube (CNT) sheets. This promising area is still in...

  12. The Livermore Brain: Massive Deep Learning Networks Enabled by High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Barry Y. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-11-29

    The proliferation of inexpensive sensor technologies like the ubiquitous digital image sensors has resulted in the collection and sharing of vast amounts of unsorted and unexploited raw data. Companies and governments who are able to collect and make sense of large datasets to help them make better decisions more rapidly will have a competitive advantage in the information era. Machine Learning technologies play a critical role for automating the data understanding process; however, to be maximally effective, useful intermediate representations of the data are required. These representations or “features” are transformations of the raw data into a form where patterns are more easily recognized. Recent breakthroughs in Deep Learning have made it possible to learn these features from large amounts of labeled data. The focus of this project is to develop and extend Deep Learning algorithms for learning features from vast amounts of unlabeled data and to develop the HPC neural network training platform to support the training of massive network models. This LDRD project succeeded in developing new unsupervised feature learning algorithms for images and video and created a scalable neural network training toolkit for HPC. Additionally, this LDRD helped create the world’s largest freely-available image and video dataset supporting open multimedia research and used this dataset for training our deep neural networks. This research helped LLNL capture several work-for-others (WFO) projects, attract new talent, and establish collaborations with leading academic and commercial partners. Finally, this project demonstrated the successful training of the largest unsupervised image neural network using HPC resources and helped establish LLNL leadership at the intersection of Machine Learning and HPC research.
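
    The record above centres on learning features from unlabeled data; the toy Python autoencoder below is a hedged stand-in for that idea at miniature scale. It is not the LLNL toolkit, and the data, layer sizes and learning rate are arbitrary assumptions.

    # Illustrative-only sketch of unsupervised feature learning: a tiny
    # autoencoder trained with plain gradient descent on unlabeled vectors.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(256, 32))          # unlabeled "images" flattened to 32 dims
    n_hidden, lr = 8, 1e-2

    W1 = rng.normal(scale=0.1, size=(32, n_hidden))
    W2 = rng.normal(scale=0.1, size=(n_hidden, 32))

    for step in range(500):
        H = np.tanh(X @ W1)                 # learned features (encoder)
        X_hat = H @ W2                      # reconstruction (decoder)
        err = X_hat - X
        loss = np.mean(err ** 2)
        # Backpropagation of the reconstruction error through both layers.
        dW2 = H.T @ err / len(X)
        dH = err @ W2.T * (1.0 - H ** 2)    # tanh derivative
        dW1 = X.T @ dH / len(X)
        W1 -= lr * dW1
        W2 -= lr * dW2

    print(f"final reconstruction loss: {loss:.4f}")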

  13. Thermal/Heat Transfer Analysis Using a Graphic Processing Unit (GPU) Enabled Computing Environment

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in propulsion test facility design and development by assessing risks, identifying failure modes, and predicting...

  14. FY 1999 Blue Book: Computing, Information, and Communications: Networked Computing for the 21st Century

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — U.S. research and development (R&D) in computing, communications, and information technologies has enabled unprecedented scientific and engineering advances,...

  15. Plasmonic computing of spatial differentiation

    Science.gov (United States)

    Zhu, Tengfeng; Zhou, Yihan; Lou, Yijie; Ye, Hui; Qiu, Min; Ruan, Zhichao; Fan, Shanhui

    2017-05-01

    Optical analog computing offers high-throughput low-power-consumption operation for specialized computational tasks. Traditionally, optical analog computing in the spatial domain uses a bulky system of lenses and filters. Recent developments in metamaterials enable the miniaturization of such computing elements down to a subwavelength scale. However, the required metamaterial consists of a complex array of meta-atoms, and direct demonstration of image processing is challenging. Here, we show that the interference effects associated with surface plasmon excitations at a single metal-dielectric interface can perform spatial differentiation. And we experimentally demonstrate edge detection of an image without any Fourier lens. This work points to a simple yet powerful mechanism for optical analog computing at the nanoscale.

  16. Plasmonic computing of spatial differentiation.

    Science.gov (United States)

    Zhu, Tengfeng; Zhou, Yihan; Lou, Yijie; Ye, Hui; Qiu, Min; Ruan, Zhichao; Fan, Shanhui

    2017-05-19

    Optical analog computing offers high-throughput low-power-consumption operation for specialized computational tasks. Traditionally, optical analog computing in the spatial domain uses a bulky system of lenses and filters. Recent developments in metamaterials enable the miniaturization of such computing elements down to a subwavelength scale. However, the required metamaterial consists of a complex array of meta-atoms, and direct demonstration of image processing is challenging. Here, we show that the interference effects associated with surface plasmon excitations at a single metal-dielectric interface can perform spatial differentiation. And we experimentally demonstrate edge detection of an image without any Fourier lens. This work points to a simple yet powerful mechanism for optical analog computing at the nanoscale.
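
    As a conceptual aid only, the short Python sketch below emulates numerically the operation the plasmonic interface performs optically: a first-order spatial derivative of an image, whose magnitude highlights edges. The synthetic image and threshold are assumptions for illustration.

    # Conceptual sketch: spatial differentiation as an edge detector.
    import numpy as np

    # Synthetic image: a bright square on a dark background.
    img = np.zeros((64, 64))
    img[20:44, 20:44] = 1.0

    # First-order spatial derivative along x (here computed numerically;
    # in the papers above it is realized optically by the plasmonic interface).
    d_dx = np.gradient(img, axis=1)

    # Edges appear where the derivative magnitude is large.
    edges = np.abs(d_dx) > 0.25
    print("edge pixels:", int(edges.sum()))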

  17. BLAST in Grid (BiG): A Grid-Enabled Software Architecture and Implementation of Parallel and Sequential BLAST

    International Nuclear Information System (INIS)

    Aparicio, G.; Blanquer, I.; Hernandez, V.; Segrelles, D.

    2007-01-01

    The integration of high-performance computing tools is a key issue in biomedical research. Many computer-based applications, such as BLAST, have been migrated to high-performance computers to deal with their computing and storage needs. However, the use of clusters and computing farms presents problems in scalability. A higher layer of parallelism that splits the task into highly independent long jobs that can be executed in parallel can improve performance while maintaining efficiency. Grid technologies combined with parallel computing resources are an important enabling technology. This work presents a software architecture for executing BLAST on an international Grid infrastructure that guarantees security, scalability and fault tolerance. The software architecture is modular and adaptable to many other high-throughput applications, both inside the field of bio-computing and outside. (Author)
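
    The higher layer of parallelism described above amounts to splitting queries into independent jobs; the hedged Python sketch below illustrates that structure, with a hypothetical run_blast_chunk() placeholder standing in for one grid-submitted BLAST job. It is not the BiG architecture itself.

    # Illustration of coarse-grained parallelism: split query sequences into
    # independent chunks and run each chunk as a separate long-running job.
    from concurrent.futures import ProcessPoolExecutor

    def run_blast_chunk(chunk):
        """Hypothetical stand-in for one independent BLAST job."""
        return [(seq_id, f"hit-for-{seq_id}") for seq_id in chunk]

    def split(items, n_chunks):
        """Deal items round-robin into n_chunks independent work units."""
        return [items[i::n_chunks] for i in range(n_chunks)]

    if __name__ == "__main__":
        queries = [f"seq{i:04d}" for i in range(1000)]   # query sequence IDs
        chunks = split(queries, n_chunks=8)
        with ProcessPoolExecutor(max_workers=8) as pool:
            results = [hit for part in pool.map(run_blast_chunk, chunks) for hit in part]
        print(len(results), "alignments collected")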

  18. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.
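
    For readers new to the data-parallel model shared by CUDA, Stream and OpenCL, the following Python sketch emulates, sequentially and only conceptually, what a vector-add kernel launch does: one kernel invocation per global thread index, with the usual grid-size arithmetic. It is an illustration, not CUDA code.

    # Conceptual stand-in for a GPU kernel launch: one "thread" per data element.
    import numpy as np

    def vector_add_kernel(tid, a, b, out):
        """Body executed by each thread; tid plays the role of
        blockIdx.x * blockDim.x + threadIdx.x."""
        if tid < len(out):              # guard against out-of-range threads
            out[tid] = a[tid] + b[tid]

    def launch(kernel, n_threads, *args):
        """Emulated kernel launch: one invocation per global thread index."""
        for tid in range(n_threads):
            kernel(tid, *args)

    n = 1000
    a, b = np.arange(n, dtype=float), np.ones(n)
    out = np.empty(n)
    block = 256
    grid = (n + block - 1) // block     # same grid-size arithmetic used in CUDA
    launch(vector_add_kernel, grid * block, a, b, out)
    assert np.allclose(out, a + b)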

  19. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay, Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp. 69-81.

  20. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  1. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  2. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  3. Gantry for computed tomography

    International Nuclear Information System (INIS)

    Kelman, A.L.; O'Dell, W.R.; Brook, R.F.; Hein, P.W.; Brandt, R.T.

    1981-01-01

    A novel design of gantry for use in computed tomography is described in detail. In the new gantry, curved tracks are mounted to the laterally spaced apart sides of the frame which rotates and carries the detector and X-ray source. This permits the frame to be tilted either side of vertical enabling angular slices of body layers to be viewed and allows simplification of the algorithm which the computer uses for image reconstruction. A failsafe, solenoid brake is described which can lock the shaft against rotation. The gantry also contains a hoist mechanism which aids maintenance of the heavy X-ray tube and/or detector arrays. Explicit engineering details are presented. (U.K.)

  4. Wireless infrared computer control

    Science.gov (United States)

    Chen, George C.; He, Xiaofei

    2004-04-01

    A wireless mouse is not restricted by cable length and has an advantage over its wired counterpart. However, all the mice available on the market have a detection range of less than 2 meters and angular coverage of less than 180 degrees. Furthermore, commercial infrared mice rely on a track ball and rollers to detect movement. This restricts their use on occasions where users want dynamic movement, such as presentations and meetings. This paper presents our newly developed infrared wireless mouse, which has a detection range of 6 meters and angular coverage of 180 degrees. The new mouse uses buttons instead of the traditional track ball and is designed as a hand-held device, like a remote controller. It enables users to control the cursor at a distance from the computer, freeing the mouse from desk-bound operation.

  5. [Computer program "PANCREAS"].

    Science.gov (United States)

    Jakubowicz, J; Jankowski, M; Szomański, B; Switka, S; Zagórowicz, E; Pertkiewicz, M; Szczygieł, B

    1998-01-01

    Contemporary computer technology allows precise and fast analysis of large databases. Widespread and common use depends on appropriate, user-friendly software, which is usually lacking in specialised medical applications. The aim of this work was to develop an integrated system designed to store, explore and analyze data on patients treated for pancreatic cancer. For that purpose, the database administration system MS Visual FoxPro 3.0 was used and a special application conforming to the ISO 9000 series was developed. The system works under MS Windows 95, with easy adaptation to MS Windows 3.11 or MS Windows NT via its graphical user interface. The system stores personal data, laboratory results, visual and histological analyses, and information on treatment course and complications. It archives these data and enables the preparation of reports according to individual and statistical needs. Help and security settings allow the system to be used by those not familiar with computer science.

  6. Spintronics-based computing

    CERN Document Server

    Prenat, Guillaume

    2015-01-01

    This book provides a comprehensive introduction to spintronics-based computing for the next generation of ultra-low-power, highly reliable logic, which is widely considered a promising candidate to replace conventional, pure CMOS-based logic. It covers aspects from the device to the system level, including magnetic memory cells, device modeling, hybrid circuit structure, design methodology, CAD tools, and technological integration methods. The book is accessible to a variety of readers, and little or no background in magnetism and spin electronics is required to understand its content. The multidisciplinary team of expert authors from circuits, devices, computer architecture, CAD and system design reveal to readers the potential of spintronics nanodevices to reduce power consumption, improve reliability and enable new functionality.

  7. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    With technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production..., for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  8. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  9. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  10. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to affirmation in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking, which is dominant among educators, is described along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described; this process is connected with the evolution of computer and information technologies as well as with the increase in the number of tasks for whose effective solution computational thinking is required. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with...

  11. Security and Privacy Issues in Cloud Computing

    OpenAIRE

    Sen, Jaydip

    2013-01-01

    Today, cloud computing is defined and talked about across the ICT industry under different contexts and with different definitions attached to it. It is a new paradigm in the evolution of Information Technology, as it is one of the biggest revolutions in this field to have taken place in recent times. According to the National Institute for Standards and Technology (NIST), “cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing ...

  12. Cloud Computing Security Issues - Challenges and Opportunities

    OpenAIRE

    Vaikunth, Pai T.; Aithal, P. S.

    2017-01-01

    Cloud computing services, enabled through information and communication technology and delivered to customers as services over the Internet on a leased basis, have the capability to scale up or down according to service requirements or needs. In this model, the infrastructure is owned by a third-party vendor and the cloud computing services are delivered to the requesting customers. The cloud computing model has many advantages, including scalability, flexibility, elasticity and efficiency, and it supports outsourcing ...

  13. Music Learning Based on Computer Software

    OpenAIRE

    Baihui Yan; Qiao Zhou

    2017-01-01

    In order to better develop and improve students’ music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We therefore conducted an in-depth analysis of computer-enabled music learning and of the status of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and yet teach...

  14. PERSPECTIVES FOR FOG COMPUTING IN MANUFACTURING

    Directory of Open Access Journals (Sweden)

    Jakub PIZOŃ

    2016-09-01

    Full Text Available This article discusses ongoing efforts to enable the fog computing vision in manufacturing. As a new computing paradigm, fog computing faces many implementation challenges, which also open the perspective of new applications within the field of manufacturing. It is expected that fog computing will be one of the factors accelerating the fourth industrial revolution. In this article we discuss the perspectives of manufacturing companies surrounded by new solutions such as CPS, CPPS and CM in relation to fog computing.

  15. AFC-Enabled Simplified High-Lift System Integration Study

    Science.gov (United States)

    Hartwich, Peter M.; Dickey, Eric D.; Sclafani, Anthony J.; Camacho, Peter; Gonzales, Antonio B.; Lawson, Edward L.; Mairs, Ron Y.; Shmilovich, Arvin

    2014-01-01

    The primary objective of this trade study report is to explore the potential of using Active Flow Control (AFC) for achieving lighter and mechanically simpler high-lift systems for transonic commercial transport aircraft. This assessment was conducted in four steps. First, based on the Common Research Model (CRM) outer mold line (OML) definition, two high-lift concepts were developed. One concept, representative of current production-type commercial transonic transports, features leading edge slats and slotted trailing edge flaps with Fowler motion. The other CRM-based design relies on drooped leading edges and simply hinged trailing edge flaps for high-lift generation. The relative high-lift performance of these two high-lift CRM variants is established using Computational Fluid Dynamics (CFD) solutions to the Reynolds-Averaged Navier-Stokes (RANS) equations for steady flow. These CFD assessments identify the high-lift performance that needs to be recovered through AFC to have the CRM variant with the lighter and mechanically simpler high-lift system match the performance of the conventional high-lift system. Conceptual design integration studies for the AFC-enhanced high-lift systems were conducted with a NASA Environmentally Responsible Aircraft (ERA) reference configuration, the so-called ERA-0003 concept. These design trades identify AFC performance targets that need to be met to produce economically feasible ERA-0003-like concepts with lighter and mechanically simpler high-lift designs that match the performance of conventional high-lift systems. Finally, technical challenges associated with the application of AFC-enabled high-lift systems to modern transonic commercial transports are identified for future technology maturation efforts.

  16. Enabling image fusion for a CT guided needle placement robot

    Science.gov (United States)

    Seifabadi, Reza; Xu, Sheng; Aalamifar, Fereshteh; Velusamy, Gnanasekar; Puhazhendi, Kaliyappan; Wood, Bradford J.

    2017-03-01

    Purpose: This study presents development and integration of hardware and software that enables ultrasound (US) and computer tomography (CT) fusion for a FDA-approved CT-guided needle placement robot. Having real-time US image registered to a priori-taken intraoperative CT image provides more anatomic information during needle insertion, in order to target hard-to-see lesions or avoid critical structures invisible to CT, track target motion, and to better monitor ablation treatment zone in relation to the tumor location. Method: A passive encoded mechanical arm is developed for the robot in order to hold and track an abdominal US transducer. This 4 degrees of freedom (DOF) arm is designed to attach to the robot end-effector. The arm is locked by default and is released by a press of button. The arm is designed such that the needle is always in plane with US image. The articulated arm is calibrated to improve its accuracy. Custom designed software (OncoNav, NIH) was developed to fuse real-time US image to a priori-taken CT. Results: The accuracy of the end effector before and after passive arm calibration was 7.07mm +/- 4.14mm and 1.74mm +/-1.60mm, respectively. The accuracy of the US image to the arm calibration was 5mm. The feasibility of US-CT fusion using the proposed hardware and software was demonstrated in an abdominal commercial phantom. Conclusions: Calibration significantly improved the accuracy of the arm in US image tracking. Fusion of US to CT using the proposed hardware and software was feasible.
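
    As a hedged illustration of the coordinate chain such US-CT fusion relies on, the Python sketch below maps an ultrasound pixel into CT space by composing rigid transforms (image to probe to arm to robot to CT). Every matrix and value is a made-up placeholder; the real ones come from the arm encoders and the calibrations described above.

    # Hedged sketch: chain of homogeneous transforms for US-to-CT mapping.
    import numpy as np

    def rigid(rz_deg=0.0, t=(0.0, 0.0, 0.0)):
        """Homogeneous transform: rotation about z plus a translation (mm)."""
        c, s = np.cos(np.radians(rz_deg)), np.sin(np.radians(rz_deg))
        T = np.eye(4)
        T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
        T[:3, 3] = t
        return T

    ct_T_robot = rigid(0.0, (120.0, -40.0, 300.0))    # robot base in CT frame (made up)
    robot_T_arm = rigid(30.0, (50.0, 0.0, 80.0))      # from arm joint encoders (made up)
    arm_T_probe = rigid(-15.0, (0.0, 12.0, 45.0))     # probe-holder calibration (made up)
    probe_T_px = np.diag([0.2, 0.2, 1.0, 1.0])        # assumed 0.2 mm/pixel US scaling

    us_pixel = np.array([256.0, 128.0, 0.0, 1.0])     # a point picked in the US image
    ct_point = ct_T_robot @ robot_T_arm @ arm_T_probe @ probe_T_px @ us_pixel
    print("US pixel maps to CT coordinate (mm):", np.round(ct_point[:3], 1))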

  17. TEACH (Train to Enable/Achieve Culturally Sensitive Healthcare)

    Science.gov (United States)

    Maulitz, Russell; Santarelli, Thomas; Barnieu, Joanne; Rosenzweig, Larry; Yi, Na Yi; Zachary, Wayne; OConnor, Bonnie

    2010-01-01

    Personnel from diverse ethnic and demographic backgrounds come together in both civilian and military healthcare systems, facing diagnoses that at one level are equalizers: coronary disease is coronary disease, breast cancer is breast cancer. Yet the expression of disease in individuals from different backgrounds, individual patient experience of disease as a particular illness, and interactions between patients and providers occurring in any given disease scenario, all vary enormously depending on the fortuity of the equation of "which patient happens to arrive in whose exam room." Previously, providers' absorption of lessons-learned depended on learning as an apprentice would when exposed over time to multiple populations. As a result, and because providers are often thrown into situations where communications falter through inadequate direct patient experience, diversity in medicine remains a training challenge. The questions then become: Can simulation and virtual training environments (VTEs) be deployed to short-track and standardize this sort of random-walk problem? Can we overcome the unevenness of training caused by some providers obtaining the valuable exposure to diverse populations, whereas others are left to "sink or swim"? This paper summarizes developing a computer-based VTE called TEACH (Training to Enable/Achieve Culturally Sensitive Healthcare). TEACH was developed to enhance healthcare providers' skills in delivering culturally sensitive care to African-American women with breast cancer. With an authoring system under development to ensure extensibility, TEACH allows users to role-play in clinical oncology settings with virtual characters who interact on the basis of different combinations of African American sub-cultural beliefs regarding breast cancer. The paper reports on the roll-out and evaluation of the degree to which these interactions allow providers to acquire, practice, and refine culturally appropriate communication skills and to

  18. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  19. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  20. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  1. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  2. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes’ structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part presents the most important computational techniques, finite element and boundary element formulations, and shows solutions of viscoelastic problems with Abaqus.

  3. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  4. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  5. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  6. Cloud computing for comparative genomics

    Directory of Open Access Journals (Sweden)

    Pivovarov Rimma

    2010-05-01

    Full Text Available Abstract Background Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.

  7. Cloud computing for comparative genomics.

    Science.gov (United States)

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
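
    The two records above describe an embarrassingly parallel workload; the Python sketch below shows that job-farming structure in miniature, with rsd_orthologs() as a hypothetical stand-in for one reciprocal-smallest-distance computation. It does not reproduce the RSD-cloud or EC2 setup.

    # Illustrative sketch: each genome pair is an independent job farmed out
    # to worker processes, mirroring the structure (not the scale) of RSD-cloud.
    from concurrent.futures import ProcessPoolExecutor
    from itertools import combinations

    def rsd_orthologs(pair):
        """Hypothetical placeholder for one ortholog calculation between two genomes."""
        a, b = pair
        return (a, b, f"orthologs({a},{b})")

    if __name__ == "__main__":
        genomes = [f"genome_{i:02d}" for i in range(10)]
        pairs = list(combinations(genomes, 2))          # 45 independent jobs
        with ProcessPoolExecutor(max_workers=8) as pool:
            results = list(pool.map(rsd_orthologs, pairs))
        print(len(results), "pairwise ortholog sets computed")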

  8. Cloud@Home: A New Enhanced Computing Paradigm

    Science.gov (United States)

    Distefano, Salvatore; Cunsolo, Vincenzo D.; Puliafito, Antonio; Scarpa, Marco

    Cloud computing is a distributed computing paradigm that mixes aspects of Grid computing ("… hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities" (Foster, 2002)), Internet computing ("… a computing platform geographically distributed across the Internet" (Milenkovic et al., 2003)), Utility computing ("a collection of technologies and business practices that enables computing to be delivered seamlessly and reliably across multiple computers, ... available as needed and billed according to usage, much like water and electricity are today" (Ross & Westerman, 2004)), Autonomic computing ("computing systems that can manage themselves given high-level objectives from administrators" (Kephart & Chess, 2003)), Edge computing ("… provides a generic template facility for any type of application to spread its execution across a dedicated grid, balancing the load …" (Davis, Parikh, & Weihl, 2004)) and Green computing (a new frontier of ethical computing starting from the assumption that in the near future energy costs will be related to environmental pollution).

  9. Sparks in the Fog: Social and Economic Mechanisms as Enablers for Community Network Clouds

    Directory of Open Access Journals (Sweden)

    Muhammad Amin KHAN

    2014-10-01

    Full Text Available Internet and communication technologies have lowered the costs of enabling individuals and communities to collaborate together. This collaboration has provided new services like user-generated content and social computing, as evident from success stories like Wikipedia. Through collaboration, collectively built infrastructures like community wireless mesh networks where users provide the communication network, have also emerged. Community networks have demonstrated successful bandwidth sharing, but have not been able to extend their collective effort to other computing resources like storage and processing. The success of cloud computing has been enabled by economies of scale and the need for elastic, flexible and on-demand provisioning of computing services. The consolidation of today’s cloud technologies offers now the possibility of collectively built community clouds, building upon user-generated content and user-provided networks towards an ecosystem of cloud services. We explore in this paper how social and economic mechanisms can play a role in overcoming the barriers of voluntary resource provisioning in such community clouds, by analysing the costs involved in building these services and how they give value to the participants. We indicate socio-economic policies and how they can be implemented in community networks, to ease the uptake and ensure the sustainability of community clouds.

  10. Semantic Sensor Web Enablement for COAST, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Sensor Web Enablement (SWE) is an Open Geospatial Consortium (OGC) standard Service Oriented Architecture (SOA) that facilitates discovery and integration of...

  11. Logistics Reduction: RFID Enabled Autonomous Logistics Management (REALM)

    Data.gov (United States)

    National Aeronautics and Space Administration — The Advanced Exploration Systems (AES) Logistics Reduction (LR) project Radio-frequency identification (RFID) Enabled Autonomous Logistics Management (REALM) task...

  12. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword; Preface; Computing Paradigms; Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading; Cloud Computing Fundamentals; Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact...

  13. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  14. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  15. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers (1) an intermediary step between any theoretical construct and its targeted empirical space and (2) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  16. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  17. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an ... Scientific and engineering applications (e.g., TeraGrid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical Layer.

  18. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    The physical foundations and developments in transmission and emission computer tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of computer emission tomography (ECT), unknown in Poland, is described. An evaluation of two methods of ECT, namely positron and single-photon emission tomography, is made. (author)

  19. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  20. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing