WorldWideScience

Sample records for epics applications tools

  1. EPICS Application Based on RTEMS

    International Nuclear Information System (INIS)

    Shi Haoli; Wang Chunhong; Tang Jingyu

    2009-01-01

    At present, accelerator control systems all over the world are built with EPICS, a software toolkit for Ethernet-based distributed control systems. Early versions of EPICS were developed for VxWorks. Now the EPICS international collaboration is committed to supporting the free, open-source RTEMS, and RTEMS support has been extended into the newer EPICS 3.14 releases. The paper describes the development and characteristics of RTEMS, compares the performance of several common real-time operating systems, builds the RTEMS operating system on the MVME5500 board, and finally presents an EPICS application example on RTEMS. (authors)

  2. EPICS application source/release control

    International Nuclear Information System (INIS)

    Zieman, B.; Anderson, J.; Kraimer, M.

    1995-01-01

    This manual describes a set of Application Source/Release Control tools (appSR) that can be used to develop software for EPICS based control systems. The Application Source/Release Control System (appSR) has been unbundled from base EPICS and is now available as an EPICS extension. Due to this unbundling, two new directories must be added to a user's path (see section ''Environment'' on page 3 for more information) and a new command getapp must be issued after the getrel command to get a specific version of appSR (see section ''Creating The Initial Application System Area'' on page 7 for more information). It is now required that GNU make version 3.71 or later be used for builds instead of SUN make. Users should now type gmake instead of make.

  3. EPIC'S NEW REMOTE SENSING DATA AND INFORMATION TOOLS AVAILABLE FOR EPA CUSTOMERS

    Science.gov (United States)

    EPIC's New Remote Sensing Data and Information Tools Available for EPA Customers Donald Garofalo Environmental Photographic Interpretation Center (EPIC) Landscape Ecology Branch Environmental Sciences Division National Exposure Research Laboratory Several new too...

  4. A shared memory based interface of MARTe with EPICS for real-time applications

    International Nuclear Information System (INIS)

    Yun, Sangwon; Neto, André C.; Park, Mikyung; Lee, Sangil; Park, Kaprai

    2014-01-01

    Highlights: • We implemented a shared memory based interface of MARTe with EPICS. • We implemented an EPICS module providing device and driver support. • We implemented an example EPICS IOC and CSS OPI for evaluation. - Abstract: The Multithreaded Application Real-Time executor (MARTe) is a multi-platform C++ middleware designed for the implementation of real-time control systems. It currently supports the Linux, Linux + RTAI, VxWorks, Solaris and MS Windows platforms. In the fusion community MARTe is being used at JET, COMPASS, ISTTOK, FTU and RFX [1]. The Experimental Physics and Industrial Control System (EPICS), a standard framework for the control systems of KSTAR and ITER, is a set of software tools and applications which provide a software infrastructure for building distributed control systems to operate devices. For a MARTe based application to cooperate with an EPICS based application, an interface layer between MARTe and EPICS is required. To address this, a number of interfacing solutions have been proposed and some of them have been implemented. Nevertheless, a new approach is required to mitigate the functional limitations of existing solutions and to improve their performance for real-time applications. This paper describes the design and implementation of a shared memory based interface between MARTe and EPICS.

  5. A shared memory based interface of MARTe with EPICS for real-time applications

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Sangwon, E-mail: yunsw@nfri.re.kr [National Fusion Research Institute (NFRI), Gwahangno 169-148, Yuseong-Gu, Daejeon 305-806 (Korea, Republic of); Neto, André C. [Associação EURATOM/IST, Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico, Universidade Técnica de Lisboa, P-1049-001 Lisboa (Portugal); Park, Mikyung; Lee, Sangil; Park, Kaprai [National Fusion Research Institute (NFRI), Gwahangno 169-148, Yuseong-Gu, Daejeon 305-806 (Korea, Republic of)

    2014-05-15

    Highlights: • We implemented a shared memory based interface of MARTe with EPICS. • We implemented an EPICS module providing device and driver support. • We implemented an example EPICS IOC and CSS OPI for evaluation. - Abstract: The Multithreaded Application Real-Time executor (MARTe) is a multi-platform C++ middleware designed for the implementation of real-time control systems. It currently supports the Linux, Linux + RTAI, VxWorks, Solaris and MS Windows platforms. In the fusion community MARTe is being used at JET, COMPASS, ISTTOK, FTU and RFX [1]. The Experimental Physics and Industrial Control System (EPICS), a standard framework for the control systems of KSTAR and ITER, is a set of software tools and applications which provide a software infrastructure for building distributed control systems to operate devices. For a MARTe based application to cooperate with an EPICS based application, an interface layer between MARTe and EPICS is required. To address this, a number of interfacing solutions have been proposed and some of them have been implemented. Nevertheless, a new approach is required to mitigate the functional limitations of existing solutions and to improve their performance for real-time applications. This paper describes the design and implementation of a shared memory based interface between MARTe and EPICS.

  6. Experimental Physics and Industrial Control System (EPICS): Application source/release control for EPICS R3.11.6

    International Nuclear Information System (INIS)

    Zieman, B.; Kraimer, M.

    1994-01-01

    This manual describes a set of tools that can be used to develop software for EPICS based control systems. It provides the following features: multiple applications (the entire system is composed of an arbitrary number of applications); source/release control (all files created or modified by the application developers can be put under sccs, a UNIX source/release control utility); multiple developers (a number of application developers can work separately during the development phase but combine their applications for system testing and for a production system); and makefiles (makefiles are provided to automatically rebuild the various application components, including C and state notation programs).

  7. Experience with EPICS in a wide variety of applications

    International Nuclear Information System (INIS)

    Kraimer, M.R.; Clausen, M.; Lupton, W.; Watson, C.

    1997-01-01

    Currently more than 70 organizations have obtained permission to use the Experimental Physics and Industrial Control System (EPICS), a set of software packages for building real-time control systems. In this paper representatives from four of these sites discuss the reasons their sites chose EPICS, provide a brief discussion of their control system development, and discuss additional control system tools obtained elsewhere or developed locally

  8. Experience with EPICS in a wide variety of applications

    Energy Technology Data Exchange (ETDEWEB)

    Kraimer, M.R. [Argonne National Lab., IL (United States); Clausen, M. [Deutsches Elektronen Synchrotron, Hamburg (Germany); Lupton, W. [W.M. Keck Observatory, Kamuela, HI (United States); Watson, C. [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States)

    1997-08-01

    Currently more than 70 organizations have obtained permission to use the Experimental Physics and Industrial Control System (EPICS), a set of software packages for building real-time control systems. In this paper representatives from four of these sites discuss the reasons their sites chose EPICS, provide a brief discussion of their control system development, and discuss additional control system tools obtained elsewhere or developed locally.

  9. RIO EPICS device support application case study on an ion source control system (ISHP)

    Energy Technology Data Exchange (ETDEWEB)

    Sanz, Diego [UPM – Universidad Politécnica de Madrid, Madrid (Spain); Ruiz, Mariano, E-mail: mariano.ruiz@upm.es [UPM – Universidad Politécnica de Madrid, Madrid (Spain); Eguiraun, Mikel [Department of Electricity and Electronic, Faculty of Science and Technology, University of Basque Country, Bilbao (Spain); Arredondo, Iñigo [ESS Bilbao Consortium, Zamudio (Spain); Badillo, Inari; Jugo, Josu [Department of Electricity and Electronic, Faculty of Science and Technology, University of Basque Country, Bilbao (Spain); Vega, Jesús; Castro, Rodrigo [Asociación EURATOM/CIEMAT, Madrid (Spain)

    2015-10-15

    Highlights: • A use case example of the RIO/FlexRIO design methodology is described. • The ion source device is controlled and monitored by means of EPICS IOCs. • The NIRIO EPICS device support demonstrates that it is able to manage RIO devices. • Easy and fast deployment is possible using the RIO/FlexRIO design methodology with NIRIO-EDS. • RIO/FlexRIO technology and EPICS are a good combination for supporting large scale experiments in fusion environments. - Abstract: The Experimental Physics and Industrial Control System (EPICS) is a software tool that in recent years has become relevant as a main framework for deploying distributed control systems in large scientific environments. At the moment, ESS Bilbao uses this middleware to perform the control of their Ion Source Hydrogen Positive (ISHP) project. The implementation of the control system was based on PXI Real Time controllers using the LabVIEW-RT and LabVIEW-EPICS tools, and on RIO devices based on Field-Programmable Gate Array (FPGA) technology. In order to provide fully compliant EPICS IOCs for RIO devices and to avoid additional efforts on system maintainability, a migration of the current system to a derivative Red Hat Linux (CentOS) environment has been conducted. This paper presents a real application case study for using the NIRIO EPICS device support (NIRIO-EDS) to give support to the ISHP. Although the RIO FPGA configurations are particular solutions for ISHP performance, the NIRIO-EDS has permitted the control and monitoring of devices by applying a well-defined design methodology to the previous FPGA configuration for RIO/FlexRIO devices. This methodology has permitted a fast and easy deployment of the new robust, scalable and maintainable software to support RIO devices in the ISHP control architecture.

  10. Rapid application development by KEKB accelerator operators using EPICS/Python

    International Nuclear Information System (INIS)

    Tanaka, M.; Satoh, Y.; Kitabayashi, T.

    2004-01-01

    In the KEKB accelerator facility, the control system is constructed based on the EPICS framework. By using the EPICS/Python API, which originated at KEK, we can develop an EPICS Channel Access application with simple Python code and only a little knowledge of the EPICS Channel Access protocol. The operators' new tuning ideas are quickly implemented in the control system. In this paper, we introduce the EPICS/Python API and report the effectiveness of rapid application development by the KEKB operators using the API. (author)
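
    As a rough illustration of the kind of Channel Access scripting the abstract describes, the sketch below uses the widely available pyepics package rather than the KEK EPICS/Python API itself (which is not shown in the record); all PV names are hypothetical.

        import epics  # pyepics: a thin Python binding to EPICS Channel Access

        # Read the stored beam current and a corrector setpoint (hypothetical PV names).
        beam_current = epics.caget("KEKB:DCCT:CURRENT")
        print("stored current:", beam_current)

        # Nudge a steering magnet and wait until the IOC has processed the write.
        ok = epics.caput("KEKB:STEER:H123:SETPOINT", 0.42, wait=True)
        print("caput completed:", ok)

        # Subscribe to updates so a tuning panel can react to every monitor event.
        pv = epics.PV("KEKB:BPM:R1:X")
        pv.add_callback(lambda pvname=None, value=None, **kw: print(pvname, "=", value))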

  11. EPICS and its role in data acquisition and beamline control

    International Nuclear Information System (INIS)

    Mooney, T. M.; Arnold, N. D.; Boucher, E.; Cha, B. K.; Goetze, K. A.; Kraimer, M. R.; Rivers, M. L.; Sluiter, R. L.; Sullivan, J. P.; Wallis, D. B.

    1999-01-01

    Beamline-control and data-acquisition software based on EPICS (a tool kit for building distributed control systems) has been running on many Advanced Photon Source beamlines for several years. EPICS itself, the collaborative software-development effort surrounding it, and EPICS-based beamline software have been described previously in general terms. This talk will review and update that material, focusing on the role EPICS core software plays in beamline applications and on the effects of a few defining characteristics of EPICS on the beamline software we have developed with it

  12. Construction of a remote controlled monitoring system with GPIB devices and EPICS

    International Nuclear Information System (INIS)

    Yoshikawa, Takeshi; Yamamoto, Noboru.

    1995-01-01

    The Experimental Physics and Industrial Control System (EPICS) has been used for accelerator control systems in recent years. EPICS has a rich set of tools to create applications with graphical user interfaces (GUIs). It reduces the load of complex GUI programming and shortens the application development period. This paper describes a remote temperature monitoring system using EPICS. (author)

  13. EPICS architecture

    International Nuclear Information System (INIS)

    Dalesio, L.R.; Kozubal, A.J.; Kraimer, M.R.

    1992-01-01

    The Experimental Physics and Industrial Control System (EPICS) provides control and data acquisition for the experimental physics community. Because the capabilities required by the experimental physics community for control were not available through industry, we began the design and implementation of EPICS. It is a distributed process control system built on a software communication bus. The functional subsystems, which provide data acquisition, supervisory control, closed loop control, archiving, and alarm management, greatly reduce the need for programming. Sequential control is provided through a sequential control language, allowing the implementer to express state diagrams easily. Data analysis of the archived data is provided through an interactive tool. The timing system provides distributed synchronization for control and time stamped data for data correlation across nodes in the network. The system is scalable from a single test station with a low channel count to a large distributed network with thousands of channels. The functions provided to the physics applications have proven helpful to the experiments while greatly reducing the time to deliver controls. (author)

  14. Study on managing EPICS database using ORACLE

    International Nuclear Information System (INIS)

    Liu Shu; Wang Chunhong; Zhao Jijiu

    2007-01-01

    EPICS is used as a development toolkit for the BEPCII control system. The core of EPICS is a distributed database residing in front-end machines. The distributed database is usually created by tools such as VDCT or a text editor on the host, then loaded into the front-end target IOCs through the network. In the BEPCII control system there are about 20,000 signals, distributed over more than 20 IOCs. All the databases are developed by device control engineers using VDCT or a text editor, and there are no uniform tools providing transparent management. The paper first presents the current status of EPICS database management in many labs. Second, it studies the EPICS database and the interface between ORACLE and the EPICS database. Finally, it introduces the software development and its application in the BEPCII control system. (authors)

  15. EPICS Input/Output Controller (IOC) application developer's guide. APS Release 3.12

    International Nuclear Information System (INIS)

    Kraimer, M.R.

    1994-11-01

    This document describes the core software that resides in an Input/Output Controller (IOC), one of the major components of EPICS. The basic components are: the Operator Interface (OPI), a UNIX based workstation which can run various EPICS tools; the Input/Output Controller (IOC), a VME/VXI based chassis containing a Motorola 68xxx processor, various I/O modules, and VME modules that provide access to other I/O buses such as GPIB; and the Local Area Network (LAN), the communication network which allows the IOCs and OPIs to communicate. EPICS provides a software component, Channel Access, which provides network transparent communication between a Channel Access client and an arbitrary number of Channel Access servers.
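
    To make the Channel Access picture concrete, here is a minimal client-side sketch (using the pyepics binding, which is not part of the guide itself) showing that a client names only the record and lets Channel Access locate whichever IOC serves it; the PV name is hypothetical.

        from epics import PV

        # The client names only the record; Channel Access finds the IOC that serves it.
        pressure = PV("LINAC:VAC:GAUGE1:PRES")        # hypothetical process variable
        connected = pressure.wait_for_connection(timeout=5.0)

        if connected:
            print("value:", pressure.get())
            print("served by IOC at:", pressure.host)  # host:port discovered at runtime
        else:
            print("no IOC on the network is serving this record")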

  16. Design and application of an EPICS compatible slow plant system controller in J-TEXT tokamak

    International Nuclear Information System (INIS)

    Zhang, J.; Zhang, M.; Zheng, W.; Zhuang, G.; Ding, T.

    2014-01-01

    Highlights: • Underlying functionalities are encapsulated into plug-and-play modules. • The slow controller is EPICS compatible. • The slow controller can work as a PSH. - Abstract: The J-TEXT tokamak has recently implemented the J-TEXT COntrol, Data Access and Communication (CODAC) system on the principles of ITER CODAC. The control network in the J-TEXT CODAC system is based on the Experimental Physics and Industrial Control System (EPICS). However, former slow plant system controllers in J-TEXT did not support EPICS. Therefore, J-TEXT has designed an EPICS compatible slow controller. Moreover, the slow controller also acts as the Plant System Host (PSH), which helps non-EPICS controllers to keep working in the J-TEXT CODAC system. The basic functionalities dealing with user defined tasks have been modularized into driver or plug-in modules, which are plug-and-play and configured with XML files according to the specific control task. Developers are thus able to implement various kinds of control tasks with these reusable modules, regardless of how the lower-level functions are implemented, and can focus mainly on the control algorithm. It is also possible for them to develop custom-built modules. This paper presents the design of the slow controller. Some applications of the slow controller have been deployed in J-TEXT and are introduced in this paper.

  17. Design and application of an EPICS compatible slow plant system controller in J-TEXT tokamak

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, J.; Zhang, M. [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); College of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Zheng, W., E-mail: zhengwei@hust.edu.cn [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); College of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Zhuang, G.; Ding, T. [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); College of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China)

    2014-05-15

    Highlights: • Underlying functionalities are encapsulated into plug-and-play modules. • The slow controller is EPICS compatible. • The slow controller can work as a PSH. - Abstract: The J-TEXT tokamak has recently implemented the J-TEXT COntrol, Data Access and Communication (CODAC) system on the principles of ITER CODAC. The control network in the J-TEXT CODAC system is based on the Experimental Physics and Industrial Control System (EPICS). However, former slow plant system controllers in J-TEXT did not support EPICS. Therefore, J-TEXT has designed an EPICS compatible slow controller. Moreover, the slow controller also acts as the Plant System Host (PSH), which helps non-EPICS controllers to keep working in the J-TEXT CODAC system. The basic functionalities dealing with user defined tasks have been modularized into driver or plug-in modules, which are plug-and-play and configured with XML files according to the specific control task. Developers are thus able to implement various kinds of control tasks with these reusable modules, regardless of how the lower-level functions are implemented, and can focus mainly on the control algorithm. It is also possible for them to develop custom-built modules. This paper presents the design of the slow controller. Some applications of the slow controller have been deployed in J-TEXT and are introduced in this paper.

  18. Configuration and application of He RFQ LLRF control system based on EPICS

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Tae-Sung; Jeong, Hae-Seong; Kim, Seong-Gu; Song, Young-Gi; Kim, Han-Sung; Seol, Kyung-Tae; Kwon, Hyeok-Jung; Cho, Yong-Sub [Korea Multipurpose Accelerator Complex, Gyeongju (Korea, Republic of)

    2015-10-15

    In the He RFQ device, the high-power radio frequency (RF) is very important because it is responsible for the stable delivery and efficient acceleration of the beam. For this reason a control system for the high-power RF must be developed; this system is called the LLRF control system. The LLRF control system requires precise amplitude control within a ±1% error range, so a precise remote control system is needed. This paper presents the configuration of the LLRF control system in terms of software layers based on EPICS. It also explains the application of the LLRF control system to the test environment (hardware), presents test results and suggests future work. The LLRF control system for the He RFQ is very important. The configuration of the LLRF control system has been completed on the software side and on the hardware modules: VxWorks operating system installation, EPICS base compilation, module source code compilation, object file loading and execution on VxWorks, EPICS IOC operation checks, etc. The application of the LLRF control system to the modules has been implemented: ADC module, DAC module, and EPICS IOC tests.

  19. Performance Comparison of EPICS IOC and MARTe in a Hard Real-Time Control Application

    Science.gov (United States)

    Barbalace, Antonio; Manduchi, Gabriele; Neto, A.; De Tommasi, G.; Sartori, F.; Valcarcel, D. F.

    2011-12-01

    EPICS is used worldwide, mostly for controlling accelerators and large experimental physics facilities. Although EPICS is well suited to the design and development of automation systems, which are typically VME or PLC-based, and to soft real-time systems, it may present several drawbacks when used to develop hard real-time systems and applications, especially when general purpose operating systems such as plain Linux are chosen. This is in particular true in fusion research devices, which typically employ several hard real-time systems, such as the magnetic control systems, that may require strict determinism and high performance in terms of jitter and latency. Serious deterioration of important plasma parameters may happen otherwise, possibly leading to an abrupt termination of the plasma discharge. The MARTe framework has been recently developed to fulfill the demanding requirements of such real-time systems that are intended to run on general purpose operating systems, possibly integrated with the low-latency real-time preemption patches. MARTe has been adopted to develop a number of real-time systems in different tokamaks. In this paper, we first summarize differences and similarities between an EPICS IOC and MARTe. Then we report on a set of performance measurements executed on an x86 64 bit multicore machine running Linux, with an I/O control algorithm implemented both in an EPICS IOC and in MARTe.

  20. Multipurpose Controller with EPICS integration and data logging: BPM application for ESS Bilbao

    International Nuclear Information System (INIS)

    Arredondo, I.; Campo, M. del; Echevarria, P.; Jugo, J.; Etxebarria, V.

    2013-01-01

    This work presents a multipurpose configurable control system which can be integrated into an EPICS control network, with this functionality configured through an XML configuration file. The core of the system is the so-called Hardware Controller, which is in charge of the control hardware management, the setup of and communication with the EPICS network, and the data storage. The reconfigurable nature of the controller is based on a single XML file, allowing any final user to easily modify and adjust the control system to any specific requirement. The selected Java development environment ensures multiplatform operation and great versatility, even regarding the hardware to be controlled. Specifically, this paper, focused on fast control based on a high performance FPGA, also describes an application approach for ESS Bilbao's Beam Position Monitoring system. The implementation of the XML configuration file and the satisfactory performance achieved are presented, as well as a general description of the Multipurpose Controller itself.
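
    The abstract does not reproduce the actual ESS Bilbao XML schema, so the sketch below only illustrates the configuration-driven idea in Python with an invented tag layout: a single XML file tells a generic controller which hardware channels to expose to the control network.

        import xml.etree.ElementTree as ET

        # Invented example configuration; the real file layout is not shown in the abstract.
        EXAMPLE_XML = """
        <controller name="bpm01">
          <channel pv="BPM01:X"   register="0x10" type="float"/>
          <channel pv="BPM01:Y"   register="0x14" type="float"/>
          <channel pv="BPM01:SUM" register="0x18" type="float"/>
        </controller>
        """

        def load_channels(xml_text):
            """Parse the controller description and return a list of channel definitions."""
            root = ET.fromstring(xml_text)
            return [
                {"pv": ch.get("pv"),
                 "register": int(ch.get("register"), 16),
                 "type": ch.get("type")}
                for ch in root.findall("channel")
            ]

        if __name__ == "__main__":
            for channel in load_channels(EXAMPLE_XML):
                print(channel)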

  1. Multipurpose Controller with EPICS integration and data logging: BPM application for ESS Bilbao

    Energy Technology Data Exchange (ETDEWEB)

    Arredondo, I., E-mail: iarredondo@essbilbao.org [ESS Bilbao, Edificio Cosimet Paseode Landabarri 2, 48940 Leioa, Bizkaia (Spain); Campo, M. del; Echevarria, P. [ESS Bilbao, Edificio Cosimet Paseode Landabarri 2, 48940 Leioa, Bizkaia (Spain); Jugo, J.; Etxebarria, V. [University of Basque Country (UPV/EHU), Department of Electricity and Electronics, Science and Technology Fac., Barrio Sarriena s/n, 48940 Leioa, Bizkaia (Spain)

    2013-10-21

    This work presents a multipurpose configurable control system which can be integrated into an EPICS control network, with this functionality configured through an XML configuration file. The core of the system is the so-called Hardware Controller, which is in charge of the control hardware management, the setup of and communication with the EPICS network, and the data storage. The reconfigurable nature of the controller is based on a single XML file, allowing any final user to easily modify and adjust the control system to any specific requirement. The selected Java development environment ensures multiplatform operation and great versatility, even regarding the hardware to be controlled. Specifically, this paper, focused on fast control based on a high performance FPGA, also describes an application approach for ESS Bilbao's Beam Position Monitoring system. The implementation of the XML configuration file and the satisfactory performance achieved are presented, as well as a general description of the Multipurpose Controller itself.

  2. EPICS V4 expands support to physics application, data acquisition, and data analysis

    International Nuclear Information System (INIS)

    Dalesio, L.; Carcassi, G.; Kraimer, M.R.; Malitsky, N.; Shen, G.; Davidsaver, M.; Lange, R.; Sekoranja, M.; Rowland, J.; White, G.; Korhonen, T.

    2012-01-01

    EPICS version 4 extends the functionality of version 3 by providing the ability to define, transport, and introspect composite data types. Version 3 provided a set of process variables and a data protocol that adequately defined scalar data along with an atomic set of attributes. While remaining backward compatible, Version 4 is able to easily expand this set with a data protocol capable of exchanging complex data types and parameterized data requests. Additionally, a group of engineers defined reference types for some applications in this environment. The goal of this work is to define a narrow interface with the minimal set of data types needed to support a distributed architecture for physics applications, data acquisition, and data analysis. (authors)

  3. The EPICS process variable Gateway Version 2

    International Nuclear Information System (INIS)

    Evans, K.

    2005-01-01

    The EPICS Process Variable Gateway is both a Channel Access Server and Channel Access Client that provides a means for many clients, typically on different subnets, to access a process variable while making only one connection to the server that owns the process variable. It also provides additional access security beyond that implemented on the server. It thus protects critical servers while providing suitably restricted access to needed process variables. The original version of the Gateway worked with EPICS Base 3.13 but required a special version, since the changes necessary for its operation were never incorporated into EPICS Base. Version 2 works with any standard EPICS Base 3.14.6 or later and has many improvements in both performance and features over the older version. The Gateway is now used at many institutions and has become a stable, high-performance application. It is capable of handling tens of thousands of process variables with hundreds of thousands of events per second. It has run for over three months in a production environment without having to be restarted. It has many internal process variables that can be used to monitor its state using standard EPICS client tools, such as MEDM and StripTool. Other internal process variables can be used to stop the Gateway, make several kinds of reports, or change the access security without stopping the Gateway. It can even be started on remote workstations from MEDM by using a Secure Shell script. This paper will describe the new Gateway and how it is used. The Gateway is both a server (like an EPICS Input/Output Controller (IOC)) and a client (like the EPICS Motif Editor and Display Manager (MEDM), StripTool, and others). Clients connect to the server side, and the client side connects to IOCs and other servers, possibly other Gateways. See Fig. 1. There are perhaps three principal reasons for using the Gateway: (1) it allows many clients to access a process variable while making only one connection to

  4. EPICS-based control and data acquisition for the APS slope profiler (Conference Presentation)

    Science.gov (United States)

    Sullivan, Joseph; Assoufid, Lahsen; Qian, Jun; Jemian, Peter R.; Mooney, Tim; Rivers, Mark L.; Goetze, Kurt; Sluiter, Ronald L.; Lang, Keenan

    2016-09-01

    The motion control, data acquisition and analysis system for the APS Slope Measuring Profiler was implemented using the Experimental Physics and Industrial Control System (EPICS). EPICS was designed as a framework with software tools and applications that provide a software infrastructure for building distributed control systems to operate devices such as particle accelerators, large experiments and major telescopes. EPICS was chosen to implement the APS Slope Measuring Profiler because it is also applicable to single purpose systems. The control and data handling capability available in the EPICS framework provides the basic functionality needed for high precision X-ray mirror measurement. These built-in capabilities include hardware integration of high-performance motion control systems (3-axis gantry and tip-tilt stages), mirror measurement devices (autocollimator, laser spot camera) and temperature sensors. Scanning the mirror and taking measurements were accomplished with an EPICS feature (the sscan record) which synchronizes motor positioning with measurement triggers and data storage. Various mirror scanning modes were automatically configured using EPICS built-in scripting. EPICS tools also provide low-level image processing (areaDetector). Operation screens were created using EPICS-aware GUI screen development tools.
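
    The sscan record runs inside the IOC and is configured in the EPICS database; as a rough client-side analogue of the move-trigger-read cycle it coordinates, the following pyepics sketch steps a stage and records one reading per point. All PV names are hypothetical.

        import epics

        MOTOR   = "SMP:GANTRY:X"        # hypothetical gantry axis setpoint
        TRIGGER = "SMP:AUTOCOLL:ACQ"    # hypothetical autocollimator acquire PV
        ANGLE   = "SMP:AUTOCOLL:ANGLE"  # hypothetical slope readback PV

        def step_scan(start, stop, npts):
            """Move the stage through npts positions, trigger a reading at each one."""
            data = []
            for i in range(npts):
                pos = start + i * (stop - start) / (npts - 1)
                epics.caput(MOTOR, pos, wait=True)      # position the stage and wait
                epics.caput(TRIGGER, 1, wait=True)       # trigger one measurement
                data.append((pos, epics.caget(ANGLE)))   # store position/slope pair
            return data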

  5. EPICS system: system structure and user interface

    International Nuclear Information System (INIS)

    West, R.E.; Bartlett, J.F.; Bobbitt, J.S.; Lahey, T.E.; Kramper, B.J.; MacKinnon, B.A.

    1984-02-01

    This paper presents the user's view and the general organization of the EPICS control system at Fermilab. Various subsystems of the EPICS control system are discussed, including the user command language, software protection, the device database, remote computer interfaces, and several application utilities. This paper is related to two other papers on EPICS: an overview paper and a detailed implementation paper.

  6. EPICS V4 in Python

    International Nuclear Information System (INIS)

    Guobao Shen; Kraimer, M.; Davidsaver, M.

    2012-01-01

    At NSLS-II, Python has been selected as the primary development language for physics applications. Interest in Python as a rapid application development environment continues to grow, and many large experimental scientific facilities have adopted Python for beam commissioning and operation. The EPICS control system framework has become the de facto standard for the control of large experimental facilities, where it is in use at over 100 facilities. The next version of EPICS (EPICS V4), under active development, will extend the support for physics applications, data acquisition, and data analysis. Python support for EPICS V4 will provide an effective framework to address these requirements. This paper presents the design, development and status of activities focused on EPICS V4 in Python.
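
    The paper predates today's Python bindings, so the following is not the implementation it describes; it is only a minimal sketch of pvAccess-style structured gets and puts using the p4p package, with hypothetical PV names.

        from p4p.client.thread import Context

        ctxt = Context('pva')              # pvAccess client context

        value = ctxt.get('DEMO:IMAGE')     # returns a structured Value, not just a scalar
        print(value)                       # introspect the composite data type

        ctxt.put('DEMO:SETPOINT', 42.0)    # write to a (hypothetical) scalar PV

        ctxt.close()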

  7. Monitor and Control for PEFP System using EPICS

    International Nuclear Information System (INIS)

    Choi, Hyun Mi; Hong, I. S.; Song, Y. G.; Cho, Y. S.

    2005-01-01

    The construction of the PEFP project, whose final objective is to build a 100 MeV proton accelerator, started in 2002 and is expected to finish in 2012. In 2005 we achieved a 20 mA proton beam at 20 MeV. For developing the control systems of the 20 MeV accelerator as well as the 100 MeV accelerator, we chose EPICS (Experimental Physics and Industrial Control System) as the most suitable tool. We have studied EPICS applications for various situations and, as one application, developed a vacuum control system using EPICS base 3.14.4 as the core software and EPICS extensions (e.g., EDM (Extensible Display Manager), MEDM (Motif Editor and Display Manager), etc.) as the user interface. There are a number of projects using EPICS for a broad spectrum of applications. EPICS began as a collaboration between Argonne National Laboratory and Los Alamos National Laboratory in 1991, building on work that was initially done at the Ground Test Accelerator. It is now running on accelerators that have as many as 180 distributed front-end controllers and control rooms with 20 consoles and a gateway to make system parameters available to offices, web sites, and other remote control stations. It is also used in single-controller, single-workstation systems. We use the EPICS toolkit as the foundation of our control system. We developed a vacuum monitor and an RFQ/DTL turbo pump control system using Ethernet multi-serial device servers in the PEFP control system. The control system now shows characteristics stable and reliable enough to meet our control requirements. However, the control system is continuously being upgraded to accommodate additional control requirements such as vacuum device control.

  8. Application study of EPICS-based redundant method for reactor control system

    International Nuclear Information System (INIS)

    Zhang Ning; Han Lifeng; Chen Yongzhong; Guo Bing; Yin Congcong

    2013-01-01

    In the reactor control system prototype development of the TMSR (Thorium Molten Salt Reactor) project at CAS, EPICS (Experimental Physics and Industrial Control System) is adopted as the instrumentation and control software platform. To achieve IOC (Input/Output Controller) redundancy and data synchronization in the system, the EPICS-based RMT (Redundancy Monitor Task) software package and its data-synchronization component CCE (Continuous Control Executive) were introduced. By developing the related IOC driver, redundant switch-over control of the server IOC was implemented. The method of redundancy implementation using RMT in the server and a redundancy performance test for the power control system are discussed in this paper. (authors)

  9. J-TEXT-EPICS: An EPICS toolkit attempted to improve productivity

    International Nuclear Information System (INIS)

    Zheng, Wei; Zhang, Ming; Zhang, Jing; Zhuang, Ge

    2013-01-01

    Highlights: • Tokamak control applications can be developed in a very short period with J-TEXT-EPICS. • J-TEXT-EPICS enables users to build control applications with device-oriented functions. • J-TEXT-EPICS is fully compatible with the EPICS Channel Access protocol. • J-TEXT-EPICS can be easily extended by plug-ins and drivers. -- Abstract: The Joint Texas Experimental Tokamak (J-TEXT) team has developed a new software toolkit for building Experimental Physics and Industrial Control System (EPICS) control applications, called J-TEXT-EPICS. It aims to improve the development efficiency of control applications. With its device-oriented features, it can be used to set or obtain the configuration or status of a device as well as to invoke methods on a device. With its modularized design, its functions can be easily extended. J-TEXT-EPICS is completely compatible with the original EPICS Channel Access protocol and can be integrated into existing EPICS control systems smoothly. It is fully implemented in C#, so it benefits from the abundant resources of the .NET Framework. The J-TEXT control system is built with this toolkit. This paper presents the design and implementation of J-TEXT-EPICS as well as its application in the J-TEXT control system.
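
    J-TEXT-EPICS itself is a C# toolkit and its API is not reproduced in the record; the Python sketch below (using pyepics) merely illustrates the device-oriented idea it describes, grouping the PVs of one device behind a single object. All PV names are hypothetical.

        from epics import PV

        class PowerSupply:
            """Device-oriented wrapper around the Channel Access records of one supply."""

            def __init__(self, prefix):
                self._setpoint = PV(prefix + ":CURRENT_SP")
                self._readback = PV(prefix + ":CURRENT_RB")
                self._enable   = PV(prefix + ":ENABLE")

            def turn_on(self):
                self._enable.put(1, wait=True)

            def set_current(self, amps):
                self._setpoint.put(amps, wait=True)

            @property
            def current(self):
                return self._readback.get()

        # Usage: ps = PowerSupply("JTEXT:PS01"); ps.turn_on(); ps.set_current(120.0)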

  10. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  11. EPICS release 3.11 specific documentation -- EPICS release notes for 3.11

    International Nuclear Information System (INIS)

    1994-01-01

    EPICS release 3.11 is now ready for user testing. A person who wants to set up a simplified application environment to boot an IOC and create databases using R3.11 should follow the directions in Appendix B, page 27, of the EPICS Source/Release Control Manual, Sept. 20, 1993. The R3.11 EPICS path at ANL/APS is /net/phebos/epics/R3.11 so the command to get the new release is /net/phebos/epics/R3.11/Unix/share/bin/getrel /net/phebos/epics/R3.11. An existing R3.8 short form report can be copied to this new directory and used to create a database. ANL/APS is currently testing an Application Developers Source/Release control system. It is not yet ready for general distribution. Attached are the EPICS R3.11 release notes

  12. EPICS: Experimental Physics and Industrial Control System

    Science.gov (United States)

    Epics Development Team

    2013-02-01

    EPICS is a set of software tools and applications developed collaboratively and used to create distributed soft real-time control systems for scientific instruments such as particle accelerators and telescopes. Such distributed control systems typically comprise tens or even hundreds of computers, networked together to allow communication between them and to provide control and feedback of the various parts of the device from a central control room, or even remotely over the internet. EPICS uses Client/Server and Publish/Subscribe techniques to communicate between the various computers. A Channel Access Gateway allows engineers and physicists elsewhere in the building to examine the current state of the IOCs, but prevents them from making unauthorized adjustments to the running system. In many cases the engineers can make a secure internet connection from home to diagnose and fix faults without having to travel to the site. EPICS is used by many facilities worldwide, including the Advanced Photon Source at Argonne National Laboratory, Fermilab, Keck Observatory, Laboratori Nazionali di Legnaro, Brazilian Synchrotron Light Source, Los Alamos National Laboratory, Australian Synchrotron, and Stanford Linear Accelerator Center.

  13. Doing accelerator physics using SDDS, UNIX, and EPICS

    International Nuclear Information System (INIS)

    Borland, M.; Emery, L.; Sereno, N.

    1995-01-01

    The use of the SDDS (Self-Describing Data Sets) file protocol, together with the UNIX operating system and EPICS (Experimental Physics and Industrial Control System), has proved powerful during the commissioning of the APS (Advanced Photon Source) accelerator complex. The SDDS file protocol has permitted a tool-oriented approach to developing applications, wherein generic programs are written that function as part of multiple applications. While EPICS-specific tools were written for data collection, automated experiment execution, closed-loop control, and so forth, data processing and display are done with the SDDS Toolkit. Experiments and data reduction are implemented as UNIX shell scripts that coordinate the execution of EPICS-specific tools and SDDS tools. Because of the power and generic nature of the individual tools and of the UNIX shell environment, automated experiments can be prepared and executed rapidly in response to unanticipated needs or new ideas. Examples are given of the application of this methodology to beam motion characterization, beam-position-monitor offset measurements, and klystron characterization.

  14. The PSI web interface to the EPICS channel archiver

    International Nuclear Information System (INIS)

    Gaudenz Jud; Luedeke, A.; Portmann, W.

    2012-01-01

    The EPICS (Experimental Physics and Industrial Control System) channel archiver is used at different facilities at PSI (Paul Scherrer Institute), such as the Swiss Light Source and the medical cyclotron. The EPICS channel archiver is a powerful tool to collect control system data from thousands of EPICS process variables, each at rates of many hertz, into an archive for later retrieval. The channel archiver version 2 package includes a Java application for graphical data retrieval and a command line tool for data extraction into different file formats. At PSI we wanted the possibility to retrieve the archived data from a web interface. It was desired to have flexible retrieval functions and to allow interchanging data references by e-mail. This web interface has been implemented by the PSI controls group and has now been in operation for several years. This paper highlights the special features of the PSI web interface to the EPICS channel archiver.

  15. An autonomous observation and control system based on EPICS and RTS2 for Antarctic telescopes

    Science.gov (United States)

    Zhang, Guang-yu; Wang, Jian; Tang, Peng-yi; Jia, Ming-hao; Chen, Jie; Dong, Shu-cheng; Jiang, Fengxin; Wu, Wen-qing; Liu, Jia-jing; Zhang, Hong-fei

    2016-01-01

    For unattended telescopes in Antarctica, remote operation, autonomous observation and control are essential. An autonomous observation and control system with remote operation, based on EPICS (Experimental Physics and Industrial Control System) and RTS2 (Remote Telescope System, 2nd Version), is introduced in this paper. EPICS is a set of open source software tools, libraries and applications developed collaboratively and used worldwide to create distributed soft real-time control systems for scientific instruments, while RTS2 is an open source environment for control of a fully autonomous observatory. Taking advantage of EPICS and RTS2 respectively, a combined integrated software framework for autonomous observation and control is established that uses RTS2 for the astronomical observation functions and EPICS for the device control of the telescope. A command and status interface between EPICS and RTS2 is designed so that EPICS IOC (Input/Output Controller) components integrate with RTS2 directly. To meet the specifications and requirements of a telescope control system in Antarctica, core components for autonomous observation, named Executor and Auto-focus, are designed and implemented, with a remote operation user interface based on the browser-server model. The whole system, including the telescope, is tested at Lijiang Observatory in Yunnan Province in practical observations to verify the autonomous observation and control, including telescope control, camera control, dome control, and weather information acquisition with both local and remote operation.

  16. Joint effect of unlinked genotypes: application to type 2 diabetes in the EPIC-Potsdam case-cohort study.

    Science.gov (United States)

    Knüppel, Sven; Meidtner, Karina; Arregui, Maria; Holzhütter, Hermann-Georg; Boeing, Heiner

    2015-07-01

    Analyzing multiple single nucleotide polymorphisms (SNPs) is a promising approach to finding genetic effects beyond single-locus associations. We proposed the use of multilocus stepwise regression (MSR) to screen for allele combinations as a method to model joint effects, and compared the results with the often-used genetic risk score (GRS), conventional stepwise selection, and the shrinkage method LASSO. In contrast to MSR, the GRS, conventional stepwise selection, and LASSO model each genotype by the risk allele dose. We reanalyzed 20 unlinked SNPs related to type 2 diabetes (T2D) in the EPIC-Potsdam case-cohort study (760 cases, 2193 noncases). No SNP-SNP interactions and no nonlinear effects were found. Two SNP combinations selected by MSR (Nagelkerke's R² = 0.050 and 0.048) included eight SNPs with a mean allele combination frequency of 2%. GRS and stepwise selection selected nearly the same SNP combinations, consisting of 12 and 13 SNPs (Nagelkerke's R² ranged from 0.020 to 0.029). LASSO showed similar results. The MSR method showed the best model fit as measured by Nagelkerke's R², suggesting that further improvement may render this method a useful tool in genetic research. However, our comparison suggests that the GRS is a simple way to model genetic effects, since it does not consider linkage, SNP-SNP interactions, or nonlinear effects. © 2015 John Wiley & Sons Ltd/University College London.
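
    As an illustration of how a GRS models each genotype by its risk allele dose (this is not the study's code, and the data below are synthetic), the score can be formed as the per-subject sum of doses and related to the outcome with logistic regression:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n_subjects, n_snps = 500, 20

        # Risk allele doses coded 0/1/2 per SNP (synthetic genotypes).
        doses = rng.integers(0, 3, size=(n_subjects, n_snps))

        # Unweighted GRS: sum of risk allele doses across the 20 SNPs.
        grs = doses.sum(axis=1)

        # Synthetic case/non-case outcome loosely related to the score.
        outcome = (grs + rng.normal(0, 4, n_subjects) > grs.mean()).astype(int)

        model = LogisticRegression().fit(grs.reshape(-1, 1), outcome)
        print("per-allele odds ratio:", float(np.exp(model.coef_[0][0])))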

  17. PRACTICAL APPLICATION OF QUALITY TOOLS

    Directory of Open Access Journals (Sweden)

    Duško Pavletić

    2008-09-01

    Full Text Available The paper deals with one segment of a broader study of the universality and systematic application of the seven basic quality tools (7QC tools), which can be used in different areas: power plants, the process industry, government, and health and tourism services. The aim of the paper is to show through practical examples that application of the 7QC tools is a realistic possibility. Furthermore, the research examines to what extent the selected tools are in use and what the reasons are for avoiding their broader application. A simple example of successful application of the quality tools is shown for a selected company in the process industry.

  18. EPICS based DAQ system

    International Nuclear Information System (INIS)

    Cheng Weixing; Chen Yongzhong; Zhou Weimin; Ye Kairong; Liu Dekang

    2002-01-01

    EPICS is the most popular development platform for building control systems and beam diagnostic systems in modern physics experimental facilities. An EPICS based data acquisition system was built on the Red Hat 6.2 operating system. The system is successfully used in beam position monitor mapping and improves the mapping process considerably.

  19. Integrating EPICS and MDSplus

    International Nuclear Information System (INIS)

    Mastrovito, D.; Davis, W.; Dong, J.; Roney, P.; Sichta, P.

    2006-01-01

    The National Spherical Torus Experiment (NSTX) has been in operation at the Princeton Plasma Physics Laboratory (PPPL) since 1999. Since then, NSTX has made use of the Experimental Physics and Industrial Control System (EPICS) and MDSplus software packages, among others, for control and data acquisition. To date, the two products have been integrated using special 'bridging' programs that include client components for the EPICS and MDSplus servers. Recent improvements in the EPICS software have made it easier to develop a direct interface with MDSplus. This paper will describe the new EPICS extensions developed at PPPL that provide: (1) a direct data interface between EPICS process variables and MDSplus nodes; and (2) an interface between EPICS events and MDSplus events. These extensions have been developed for use with EPICS on Solaris and are currently being modified for use on real-time operating systems. Separately, an XML-RPC client was written to access EPICS 'trended' data, sampled usually once per minute during a 24 h period. The client extracts and writes a day's worth of trended data to a 'daily' MDSplus tree.
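
    For concreteness, the sketch below shows only the older client-side 'bridging' style that the paper contrasts with its new IOC-level extensions: a Channel Access monitor callback copies each new PV value into an MDSplus node. The PV, tree and node names are hypothetical, and the pyepics and MDSplus Python packages are assumed to be installed.

        from epics import PV
        from MDSplus import Tree

        tree = Tree("nstx", 130000)               # open an existing shot (hypothetical)
        node = tree.getNode(".EPICS.COIL_TEMP")   # destination node (hypothetical path)

        def store(pvname=None, value=None, **kw):
            # Write the latest Channel Access value into the MDSplus node.
            node.putData(value)

        pv = PV("NSTX:COIL1:TEMP")                # hypothetical EPICS process variable
        pv.add_callback(store)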

  20. EPICS: operating system independent device/driver support

    International Nuclear Information System (INIS)

    Kraimer, M.R.

    2003-01-01

    Originally EPICS input/output controllers (IOCs) were only supported on VME-based systems running the vxWorks operating system. Now IOCs are supported on many systems: vxWorks, RTEMS, Solaris, HPUX, Linux, WIN32, and Darwin. A challenge is to provide operating-system-independent device and driver support. This paper presents some techniques for providing such support. EPICS (Experimental Physics and Industrial Control System) is a set of software tools, libraries, and applications developed collaboratively and used worldwide to create distributed, real-time control systems for scientific instruments such as particle accelerators, telescopes, and other large scientific experiments. An important component of all EPICS-based control systems is a collection of input/output controllers (IOCs). An IOC has three primary components: (1) a real-time database; (2) channel access, which provides network access to the database; and (3) device/driver support for interfacing to equipment. This paper describes some projects related to providing device/driver support on non-vxWorks systems. In order to support IOCs on platforms other than vxWorks, operating-system-independent (OSI) application program interfaces (APIs) were defined for threads, semaphores, timers, etc. Providing support for a new platform consists of providing an operating-system-dependent implementation of the OSI APIs.

  1. EPICS system: an overview

    International Nuclear Information System (INIS)

    Bartlett, J.F.; Bobbitt, J.S.; Kramper, B.J.; Lahey, T.E.; MacKinnon, B.A.; West, R.E.

    1984-02-01

    This paper presents an overview of the EPICS control system at FERMILAB. EPICS is a distributed, multi-user, interactive system for the control and monitoring of particle beamlines at a high-energy experimental physics laboratory. The overview discusses the operating environment of the control system, the requirements which determined the design decisions, the hardware and software configurations, and plans for the future growth and enhancement of the present system. This paper is the first of three related papers on the EPICS system. The other two cover (1) the system structure and user interface and (2) RSX implementation issues

  2. Development of EPICS IOC for TPLC-32 platform

    International Nuclear Information System (INIS)

    Jain, Sanjay Kumar; Bhamra, Ratna; Kavalan, P.K.; Vaidya, U.W.

    2014-01-01

    The Experimental Physics and Industrial Control System (EPICS) SCADA software package is popular worldwide for deploying accelerator control systems. EPICS base allows building server applications that interact with EPICS compliant lower level hardware, embedded systems and commercial PLCs on one side, and facilitates seamless connectivity with EPICS clients on the other side. A large number of such lower level (EPICS compliant) networked systems work in a collaborative environment for a physics experiment. RCnD has developed a PLC platform, the Trombay Programmable Logic Controller (TPLC-32), for deploying C and I systems of NPPs and allied utilities. This platform is now also being used for deploying accelerator C and I systems. Hence the activity of developing a TPLC-32 interface with EPICS was taken up in RCnD. This paper introduces the architecture of the EPICS Input Output Controller (IOC) and describes the implementation of the EPICS SoftIOC for TPLC-32. It also describes the operator interface developed using the MEDM EPICS client for the LEHIPA vacuum control system built using TPLC-32. (author)

  3. DOE's Pollution Prevention Information Clearinghouse (EPIC)

    International Nuclear Information System (INIS)

    Otis, P.T.

    1994-05-01

    The US Department of Energy's (DOE's) Pollution Prevention Information Clearinghouse (EPIC) is a computer system intended for the exchange of pollution prevention information DOE-wide. EPIC is being developed as a distributed system that will allow access to other databases and applications. The first prototype of EPIC (Prototype I) was put on-line in January 1994. Prototype I contains information on EM-funded pollution prevention projects; relevant laws, regulations, guidance, and policy; facility and DOE contacts; and meetings and conferences. Prototype I also gives users access to the INEL Hazardous Solvent Substitution Data System (HSSDS) and to information contained on the US Environmental Protection Agency's (EPA's) Pollution Prevention Information Exchange System (PIES) as a test of the distributed system concept. An initial user group of about 35 is testing and providing feedback on Prototype I. Prototype II, with a Graphical User Interface (GUI), is planned for the end of CY94. This paper describes the current state of EPIC in terms of architecture, user interface, and information content. Plans for Prototype II and the final system are then discussed. The EPIC development effort is being coordinated with EPA and US Department of Defense (DoD) efforts to develop or upgrade their pollution prevention information exchange systems.

  4. EPICS IOC development on open source RTEMS RTOS

    International Nuclear Information System (INIS)

    Bharade, S.K.; Joshi, Gopal; Das, D.

    2015-01-01

    Modern control system applications are often built on top of a real-time operating system. For the LEHIPA beamlines, open source control systems offer a modern solution combining cost effectiveness and technical competence. The Experimental Physics and Industrial Control System (EPICS) and the Real-Time Operating System for Multiprocessor Systems (RTEMS) were chosen to develop the core control system. Presently, the EPICS/RTEMS/MVME5500 control system is implemented for the RF protection interlock system and the BPM. This paper shares the experience of building RTEMS for the MVME5500, and of configuring and running EPICS for the RTEMS-MVME5500 architecture. It further shows EPICS benchmarking results for this architecture using the EPICS catime and hamesstest utilities. (author)

  5. An EPICS IOC builder

    International Nuclear Information System (INIS)

    Abbott, M.G.; Cobb, T.

    2012-01-01

    An EPICS IOC (Input/Output Controller) is typically assembled from a number of standard components, each with potentially quite complex hardware or software initialization procedures intermixed with a good deal of repetitive boiler-plate code. Assembling and maintaining a complex IOC can be quite a difficult and error-prone process, particularly if the components are unfamiliar. The EPICS IOC builder is a Python library designed to automate the assembly of a complete IOC from a concise component-level description. The dependencies and interactions between components, as well as their detailed initialization procedures, are automatically managed by the IOC builder through component description files maintained with the individual components. At Diamond Light Source we have a large library of components that can be assembled into EPICS IOCs. The IOC builder is also finding increasing use in helping non-expert users to assemble an IOC without specialist knowledge. (authors)
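
    Diamond's actual iocbuilder API is not reproduced in the record; the self-contained Python sketch below only illustrates the idea of components that carry their own boiler-plate and a builder that assembles them into an IOC startup script. The component classes and generated lines are schematic, not the real syntax of any particular EPICS module.

        class Component:
            def startup_lines(self):
                raise NotImplementedError

        class SerialPort(Component):
            def __init__(self, port, device):
                self.port, self.device = port, device

            def startup_lines(self):
                # Schematic asyn-style configuration line for illustration only.
                return [f'drvAsynSerialPortConfigure("{self.port}", "{self.device}", 0, 0, 0)']

        class TemperatureController(Component):
            def __init__(self, name, port):
                self.name, self.port = name, port

            def startup_lines(self):
                return [f'dbLoadRecords("tempCtrl.db", "P={self.name}, PORT={self.port}")']

        def build_ioc(components):
            """Concatenate each component's initialization into one startup script."""
            lines = ["# generated startup script"]
            for c in components:
                lines.extend(c.startup_lines())
            lines.append("iocInit()")
            return "\n".join(lines)

        if __name__ == "__main__":
            print(build_ioc([SerialPort("L0", "/dev/ttyS0"),
                             TemperatureController("BL01:TC1", "L0")]))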

  6. Advanced SOA tools and applications

    CERN Document Server

    Brzezinski, Jerzy; Cellary, Wojciech; Grzech, Adam; Zielinski, Krzysztof

    2014-01-01

    This book presents advanced software development tools for the construction, deployment and governance of Service Oriented Architecture (SOA) applications. Novel technical concepts and paradigms, formulated during the research stage and during development of such tools, are presented and illustrated by practical usage examples. Hence this book will be of interest not only to theoreticians but also to engineers who cope with real-life problems. Additionally, each chapter contains an overview of related work, enabling comparison of the proposed concepts with existing solutions in various areas of the SOA development process. This makes the book interesting also for students and scientists who investigate similar issues.

  7. Fast radiative transfer models for retrieval of cloud properties in the back-scattering region: application to DSCOVR-EPIC sensor

    Science.gov (United States)

    Molina Garcia, Victor; Sasi, Sruthy; Efremenko, Dmitry; Doicu, Adrian; Loyola, Diego

    2017-04-01

    In this work, the requirements for the retrieval of cloud properties in the back-scattering region are described, and their application to the measurements taken by the Earth Polychromatic Imaging Camera (EPIC) on board the Deep Space Climate Observatory (DSCOVR) is shown. Various radiative transfer models and their linearizations are implemented, and their advantages and issues are analyzed. As radiative transfer calculations in the back-scattering region are computationally time-consuming, several acceleration techniques are also studied. The radiative transfer models analyzed include the exact Discrete Ordinate method with Matrix Exponential (DOME), the Matrix Operator method with Matrix Exponential (MOME), and the approximate asymptotic and equivalent Lambertian cloud models. To reduce the computational cost of the line-by-line (LBL) calculations, the k-distribution method, the Principal Component Analysis (PCA) and a combination of the k-distribution method plus PCA are used. The linearized radiative transfer models for retrieval of cloud properties include the Linearized Discrete Ordinate method with Matrix Exponential (LDOME), the Linearized Matrix Operator method with Matrix Exponential (LMOME) and the Forward-Adjoint Discrete Ordinate method with Matrix Exponential (FADOME). These models were applied to the EPIC oxygen-A band absorption channel at 764 nm. It is shown that the approximate asymptotic and equivalent Lambertian cloud models give inaccurate results, so an offline processor for the retrieval of cloud properties in the back-scattering region requires the use of exact models such as DOME and MOME, which behave similarly. The combination of the k-distribution method plus PCA presents similar accuracy to the LBL calculations, but it is up to 360 times faster, and the relative errors for the computed radiances are less than 1.5% compared to the results when the exact phase function is used. Finally, the linearized models studied show similar behavior

  8. Laboratory Instrumentation Design Research for Scalable Next Generation Epitaxy: Non-Equilibrium Wide Application Epitaxial Patterning by Intelligent Control (NEW-EPIC). Volume 1. 3D Composition/Doping Control via Micromiror Patterned Deep UV Photodesorption: Revolutionary in situ Characterization/Control

    Science.gov (United States)

    2009-02-19

    Report documentation fragment; recoverable details include the author list (Doolittle, Frazier, Burnham, Pritchett, Billingsley) and a manuscript prepared for submission to Applied Physics Letters, "Positron Annihilation Spectroscopy of Annealed and As-grown Be-doped GaN".

  9. Challenges in Comparative Oral Epic

    Directory of Open Access Journals (Sweden)

    John Miles Foley

    2012-10-01

    Full Text Available Originally written in 2001 and subsequently published in China, this collaborative essay explores five questions central to comparative oral epic with regard to Mongolian, South Slavic, ancient Greek, and Old English traditions: “What is a poem in oral epic tradition?” “What is a typical scene or theme in oral epic tradition?” “What is a poetic line in oral epic tradition?” “What is a formula in an oral epic tradition?” “What is the register in oral epic poetry?” Now available for the first time in English, this essay reflects a foundational stage of what has become a productive and long-term collaboration between the Center for Studies in Oral Tradition and the Institute of Ethnic Literature of the Chinese Academy of Social Sciences.

  10. Qt based GUI system for EPICS control systems

    International Nuclear Information System (INIS)

    Rhyder, A.; Fernandes, R.N.; Starritt, A.

    2012-01-01

    The Qt-based GUI system developed at the Australian Synchrotron for use on EPICS control systems has recently been enhanced to include support for imaging, plotting, user login, logging and configuration recipes. Plans are also being made to broaden its appeal within the wider EPICS community by expanding the range of development options and adding support for EPICS V4. Current features include graphical and non-graphical application development as well as simple 'code-free' GUI design. Additional features will allow developers to let the GUI system handle its own data using Qt-based EPICS-aware classes or, as an alternative, use other control systems data such as PSI's CAFE. (author)

  11. GDA and EPICS: working in unison for science driven data acquisition and control at Diamond light source

    International Nuclear Information System (INIS)

    Gibbons, E.P.; Heron, M.T.; Rees, N.P.

    2012-01-01

    Diamond Light Source has recently received funding for an additional 10 photon beamlines, bringing the total to 32 beamlines and around 40 end-stations. These all use EPICS (Experimental Physics and Industrial Control System) for the control of the underlying instrumentation associated with photon delivery, the experiment and most of the data acquisition hardware. For the scientific users Diamond has developed the Generic Data Acquisition (GDA) application framework to provide a consistent science interface across all beamlines. While each application is customized to the science of its beamline, all applications are built from the framework and predominantly interface to the underlying instrumentation through the EPICS abstraction. We will describe the complete system, illustrate how it can be configured for a specific beamline application, and how other synchrotrons are, and can, adapt these tools for their needs. (authors)

  12. EPICS MySQL Archiver - integration between EPICS and MySQL

    International Nuclear Information System (INIS)

    Roy, A.; Bhole, R.B.; Pal, S.; Sarkar, D.

    2012-01-01

    The performance evaluation and analysis of intersystem dependency of the various subsystems of the Superconducting Cyclotron (SCC) demand a well configured data logging, archiving and historic analysis facility for a large number of control parameters, along with an on-line failure analysis facility for every system. Experimental Physics and Industrial Control System (EPICS) is used as the development architecture of the control system of these systems, with MySQL as the database for large amounts of relational data management. This combination requires integration between EPICS and the MySQL server. For this purpose, MySQL Archiver was developed as an EPICS Extension for data logging and archiving of control parameters into a MySQL database. This extension also provides a web based tool for online monitoring of control parameters and historic analysis of archived data. This paper describes the software architecture and implementation, as well as the method of configuration for any other EPICS based control system as a utility. This facility is also elaborated with examples, web page views and experiences of deploying it in SCC. (author)
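
    The core archiving idea can be sketched in a few lines of Python, assuming the third-party pyepics and PyMySQL packages; this is not the VECC extension itself, and the PV names, table schema and credentials are placeholders. Each Channel Access monitor update is inserted as one row into a MySQL table, which a web front end can later query for trends and history.

```python
import time

import epics      # pyepics Channel Access client
import pymysql    # PyMySQL driver

# Assumed schema:
#   CREATE TABLE pv_archive (pvname VARCHAR(64), ts DOUBLE, value DOUBLE);
conn = pymysql.connect(host="localhost", user="archiver",
                       password="secret", database="epics_archive")

def archive(pvname=None, value=None, timestamp=None, **kw):
    """Channel Access monitor callback: store one row per PV update."""
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO pv_archive (pvname, ts, value) VALUES (%s, %s, %s)",
            (pvname, timestamp, value))
    conn.commit()

# Placeholder PV names; a real deployment would read these from configuration.
pvs = [epics.PV(name, callback=archive)
       for name in ("SCC:MAG:CURRENT", "SCC:VAC:PRESSURE")]

while True:
    time.sleep(1.0)   # updates arrive via pyepics' monitor callbacks
```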

  13. WAPTT - Web Application Penetration Testing Tool

    Directory of Open Access Journals (Sweden)

    DURIC, Z.

    2014-02-01

    Full Text Available Web application vulnerabilities allow attackers to perform malicious actions that range from gaining unauthorized account access to obtaining sensitive data. The number of reported web application vulnerabilities has increased dramatically over the last decade. Most of these vulnerabilities result from improper input validation and sanitization; the most important are SQL injection (SQLI), Cross-Site Scripting (XSS) and Buffer Overflow (BOF). To address these vulnerabilities we designed and developed WAPTT (Web Application Penetration Testing Tool). Unlike other web application penetration testing tools, this tool is modular and can be easily extended by the end-user. To improve the efficiency of SQLI vulnerability detection, WAPTT uses an efficient algorithm for page similarity detection. The proposed tool showed promising results compared to six well-known web application scanners in detecting various web application vulnerabilities.

  14. EPICS system: RSX implementation issues

    International Nuclear Information System (INIS)

    Lahey, T.E.; Bartlett, J.F.; Bobbitt, J.S.; Kramper, B.J.; MacKinnon, B.A.; West, R.E.

    1984-02-01

    This paper presents implementation details of the Experimental Physics Interactive Control System (EPICS). EPICS is used to control accelerated particle beams for high-energy physics experiments at the Fermi National Accelerator Laboratory. The topics discussed are: interprocessor communication, support of beamline terminals and devices, resource management, mapping, various problems, some solutions to the problems, performance measurement, and modifications and extensions to RSX-11M. This paper is the third of three related papers on the EPICS system. The other two cover (1) the system overview and (2) the system structure and user interface

  15. The KSTAR integrated control system based on EPICS

    International Nuclear Information System (INIS)

    Kim, K.H.; Ju, C.J.; Kim, M.K.; Park, M.K.; Choi, J.W.; Kyum, M.C.; Kwon, M.

    2006-01-01

    The Korea Superconducting Tokamak Advanced Research (KSTAR) control system will be developed with several subsystems, which consist of the central control system (e.g. plasma control, machine control, diagnostic control, time synchronization, and interlock systems) and local control systems for various subsystems. We are planning to connect the entire system with several networks, viz. a reflective-memory-based real-time network, an optical timing network, a gigabit Ethernet network for generic machine control, and a storage network. Then it will evolve into a network-based, distributed real-time control system. Thus, we have to consider the standard communication protocols among the subsystems and how to handle the various kinds of hardware in a homogeneous way. To satisfy these requirements, EPICS has been chosen for the KSTAR control. The EPICS framework provides network-based real-time distributed control, operating system independent programming tools, operator interface tools, archiving tools, and interface tools with other commercial and non-commercial software. The most important advantage of the use of the EPICS framework is in providing homogeneity of the system for the control system developer. The developer does not have to be concerned about the specifics of the local system, but can concentrate on the implementation of the control logic with EPICS tools. We will present the details of the integration issues and also will give a brief summary of the entire KSTAR control system from an integration point of view

  16. EPICS GPIB device support

    International Nuclear Information System (INIS)

    Winans, J.

    1993-01-01

    A GPIB device support module is used to provide access to the operating parameters of a GPIB device. GPIB devices may be accessed via National Instruments 1014 cards or via Bitbus Universal Gateways. GPIB devices typically have many parameters, each of which may be thought of in terms of the standard types of database records available in EPICS. It is the job of the device support module designer to decide how the mapping of these parameters will be made to the available record types. Once this mapping is complete, the device support module may be written. The writing of the device support module consists primarily of the construction of a parameter table. This table is used to associate the database record types with the operating parameters of the GPIB instrument. Other aspects of module design include the handling of SRQ events and errors. SRQ events are made available to the device support module if so desired. The processing of an SRQ event is completely up to the designer of the module. They may be ignored, tied to event based record processing, or anything else the designer wishes. Error conditions may be handled in a similar fashion
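
    Although the real devGpib support is written in C, the parameter-table concept can be illustrated schematically in Python; the record types, instrument commands and reply formats below are invented examples, not an actual instrument mapping or the devGpib API.

```python
# Schematic parameter table: each entry binds a database record type to the
# GPIB command that reads or writes the corresponding instrument parameter
# and the format used to parse the reply (None for write-only commands).
GPIB_PARAM_TABLE = [
    # (record type, direction, GPIB command,   reply format)
    ("ai",        "read",  "MEAS:VOLT?",      "%lf"),
    ("ao",        "write", "SOUR:VOLT %g",    None),
    ("bi",        "read",  "OUTP:STAT?",      "%d"),
    ("stringin",  "read",  "*IDN?",           "%39s"),
]

def lookup(record_type, direction):
    """Return the instrument command and parse format bound to a record."""
    for rtype, dirn, command, fmt in GPIB_PARAM_TABLE:
        if rtype == record_type and dirn == direction:
            return command, fmt
    raise KeyError(f"no GPIB mapping for {record_type}/{direction}")

print(lookup("ai", "read"))    # -> ('MEAS:VOLT?', '%lf')
```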

  17. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  18. Tooling Foam for Structural Composite Applications

    Science.gov (United States)

    DeLay, Tom; Smith, Brett H.; Ely, Kevin; MacArthur, Doug

    1998-01-01

    Tooling technology applications for composite structures fabrication have been expanded at MSFC's Productivity Enhancement Complex (PEC). Engineers from NASA/MSFC and Lockheed Martin Corporation have developed a tooling foam for use in composite materials processing and manufacturing that exhibits superior thermal and mechanical properties in comparison with other tooling foam materials. This tooling foam is also compatible with most preimpregnated composite resins such as epoxy, bismaleimide, phenolic and their associated cure cycles. MARCORE tooling foam has excellent processability for applications requiring either integral or removable tooling. It can also be tailored to meet the requirements for composite processing of parts with unlimited cross-sectional area. A shelf life of at least six months is easily maintained when components are stored between 50 °F and 70 °F. The MARCORE tooling foam system is a two-component urethane-modified polyisocyanurate, high-density rigid foam with zero ozone depletion potential. This readily machineable, lightweight tooling foam is ideal for composite structures fabrication and is dimensionally stable at temperatures up to 350 °F and pressures of 100 psi.

  19. EPICS: Channel Access security design

    International Nuclear Information System (INIS)

    Kraimer, M.; Hill, J.

    1994-05-01

    This document presents the design for implementing the requirements specified in: EPICS -- Channel Access Security -- functional requirements, Ned. D. Arnold, 03/09/92. Use of the access security system is described along with a summary of the functional requirements. The programmer's interface is given. Security protocol is described and finally aids for reading the access security code are provided

  20. EPICS channel access using websocket

    International Nuclear Information System (INIS)

    Uchiyama, A.; Furukawa, K.; Higurashi, Y.

    2012-01-01

    Web technology is useful as a means of widely disseminating accelerator and beam status information. For this purpose, WebOPI was implemented by SNS as a web-based system using Ajax (asynchronous JavaScript and XML) with EPICS. On the other hand, it is often necessary to control the accelerator from different locations as well as the central control room during beam operation and maintenance. However, it is not realistic to replace the GUI-based operator interface (OPI) with a Web-based system using Ajax technology because of interactive performance issues. Therefore, as a next generation OPI over the web using EPICS Channel Access (CA), we developed a client system based on WebSocket, which is a new protocol provided by the Internet Engineering Task Force (IETF) for Web-based systems. WebSocket is a web technology that provides bidirectional, full-duplex communication channels over a single TCP connection. By utilizing Node.js and the WebSocket access library called Socket.IO, a WebSocket server was implemented. Node.js is a server-side JavaScript language built on the Google V8 JavaScript Engine. In order to construct the WebSocket server as an EPICS CA client, an add-on for Node.js was developed in C/C++ using the EPICS CA library, which is included in the EPICS base. As a result, for accelerator operation, Web-based client systems became available not only in the central control room but also with various types of equipment. (author)
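
    The bridging concept can be sketched in Python as well, although the system described here was implemented in Node.js with Socket.IO and a C/C++ add-on. The sketch below assumes the third-party pyepics and websockets packages and uses placeholder PV names; a browser client opens a WebSocket, names a PV, and then receives each Channel Access monitor update as a JSON message.

```python
import asyncio
import json

import epics          # pyepics Channel Access client
import websockets     # third-party WebSocket server

async def serve_pv(websocket, path=None):
    # The first message from the browser names the PV to monitor.
    pvname = await websocket.recv()
    loop = asyncio.get_running_loop()
    queue = asyncio.Queue()

    def on_update(pvname=None, value=None, timestamp=None, **kw):
        # Runs in pyepics' CA thread; hand the update over to the asyncio loop.
        loop.call_soon_threadsafe(
            queue.put_nowait, {"pv": pvname, "value": value, "time": timestamp})

    pv = epics.PV(pvname, callback=on_update)
    try:
        while True:
            await websocket.send(json.dumps(await queue.get()))
    finally:
        pv.clear_callbacks()

async def main():
    async with websockets.serve(serve_pv, "0.0.0.0", 8765):
        await asyncio.Future()   # serve until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```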

  1. Optogenetic Tools for Subcellular Applications in Neuroscience.

    Science.gov (United States)

    Rost, Benjamin R; Schneider-Warme, Franziska; Schmitz, Dietmar; Hegemann, Peter

    2017-11-01

    The ability to study cellular physiology using photosensitive, genetically encoded molecules has profoundly transformed neuroscience. The modern optogenetic toolbox includes fluorescent sensors to visualize signaling events in living cells and optogenetic actuators enabling manipulation of numerous cellular activities. Most optogenetic tools are not targeted to specific subcellular compartments but are localized with limited discrimination throughout the cell. Therefore, optogenetic activation often does not reflect context-dependent effects of highly localized intracellular signaling events. Subcellular targeting is required to achieve more specific optogenetic readouts and photomanipulation. Here we first provide a detailed overview of the available optogenetic tools with a focus on optogenetic actuators. Second, we review established strategies for targeting these tools to specific subcellular compartments. Finally, we discuss useful tools and targeting strategies that are currently missing from the optogenetics repertoire and provide suggestions for novel subcellular optogenetic applications. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. The implement of the interface between EPICS and LabVIEW

    International Nuclear Information System (INIS)

    Liu Jia; Wang Chunhong

    2009-01-01

    The control system of BEPCII (Beijing Electron Positron Collider) is based on EPICS (Experimental Physics and Industrial Control System). LabVIEW is often used to develop data acquisition systems on the Windows platform. EPICS IOC version 3.14 or later can currently run on Windows. The SharedMemory interface developed by SNS (Spallation Neutron Source) can implement the interface between a Windows IOC and LabVIEW. The paper describes how to use SharedMemory, illustrated with an application in a magnet measurement system, so that LabVIEW and EPICS can share data with each other. (authors)

  3. Usability of human Infinium MethylationEPIC BeadChip for mouse DNA methylation studies.

    Science.gov (United States)

    Needhamsen, Maria; Ewing, Ewoud; Lund, Harald; Gomez-Cabrero, David; Harris, Robert Adam; Kular, Lara; Jagodic, Maja

    2017-11-15

    The advent of array-based genome-wide DNA methylation methods has enabled quantitative measurement of single CpG methylation status at relatively low cost and sample input. Whereas the use of Infinium Human Methylation BeadChips has shown great utility in clinical studies, no equivalent tool is available for rodent samples. We examined the feasibility of using the new Infinium MethylationEPIC BeadChip for studying DNA methylation in mouse. In silico, we identified 19,420 EPIC probes (referred to as mEPIC probes), which align with a unique best alignment score to the bisulfite converted reference mouse genome mm10. Further annotation revealed that 85% of mEPIC probes overlapped with mm10.refSeq genes at different genomic features including promoters (TSS1500 and TSS200), 1st exons, 5'UTRs, 3'UTRs, CpG islands, shores, shelves, open seas and FANTOM5 enhancers. Hybridization of mouse samples to Infinium Human MethylationEPIC BeadChips showed successful measurement of mEPIC probes and reproducibility between inter-array biological replicates. Finally, we demonstrated the utility of mEPIC probes for data exploration such as hierarchical clustering. Given the absence of cost- and labor-efficient genome-wide technologies in the murine system, our findings show that the Infinium MethylationEPIC BeadChip platform is suitable for investigation of the mouse methylome. Furthermore, we provide the "mEPICmanifest" with genomic features, available to users of Infinium Human MethylationEPIC arrays for mouse samples.
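
    A typical exploration step of the kind described, subsetting an EPIC beta-value matrix to the mouse-alignable (mEPIC) probes and clustering the samples, might look like the following Python sketch; the file names and column names are hypothetical placeholders rather than files distributed with the study.

```python
import pandas as pd
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import pdist

# Hypothetical inputs: an mEPIC probe annotation table and a probes-by-samples
# matrix of beta values exported from the EPIC array.
manifest = pd.read_csv("mEPICmanifest.csv", index_col="probe_id")
betas = pd.read_csv("betas.csv", index_col="probe_id")

# Keep only probes that align uniquely to the mouse genome (the mEPIC subset).
mepic_betas = betas.loc[betas.index.intersection(manifest.index)]

# Hierarchical clustering of samples on the mEPIC probe subset.
distances = pdist(mepic_betas.T.values, metric="euclidean")
tree = linkage(distances, method="average")
dendrogram(tree, labels=mepic_betas.columns.tolist())
```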

  4. Lessons learned enhancing EPICS CA for LANSCE timed and flavored data

    International Nuclear Information System (INIS)

    Hill, Jeffrey O.

    2009-01-01

    A previous paper described an upgrade to EPICS enabling client-side tools at LANSCE to receive subscription updates filtered selectively to match a logical configuration of LANSCE beam gates, as configured by the control room. The upgrade required fundamental changes in the EPICS core components. First, the event queue in the EPICS server was upgraded to buffer record (function block) and device-specific parameters accessed generically via software interfaces for introspection of third-party data. In contrast, event queues in previous versions of EPICS were strictly limited to buffering only value, timestamp, and alarm status tuples. Second, the Channel Access server is being upgraded to filter subscription updates. In this follow-on paper, some necessary mid-project design changes and the lessons learned during the software development will be described.

  5. Extrasolar Planetary Imaging Coronagraph (EPIC)

    Science.gov (United States)

    Clampin, Mark

    2009-01-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a proposed NASA Exoplanet Probe mission to image and characterize extrasolar giant planets. EPIC will provide insights into the physical nature and architecture of a variety of planets in other solar systems. Initially, it will detect and characterize the atmospheres of planets identified by radial velocity surveys, determine orbital inclinations and masses and characterize the atmospheres around A and F type stars which cannot be found with RV techniques. It will also observe the inner spatial structure of exozodiacal disks. EPIC has a heliocentric Earth trailing drift-away orbit, with a 5 year mission lifetime. The robust mission design is simple and flexible ensuring mission success while minimizing cost and risk. The science payload consists of a heritage optical telescope assembly (OTA), and visible nulling coronagraph (VNC) instrument. The instrument achieves a contrast ratio of 10^9 over a 5 arcsecond field-of-view with an unprecedented inner working angle of 0.13 arcseconds over the spectral range of 440-880 nm. The telescope is a 1.65 meter off-axis Cassegrain with an OTA wavefront error of lambda/9, which when coupled to the VNC greatly reduces the requirements on the large scale optics.

  6. Simulation using Xorbit with EPICS

    International Nuclear Information System (INIS)

    Evans, K. Jr.

    1995-01-01

    The accelerator code Xorbit has an interface to the Experimental Physics and Industrial Control System (EPICS). This means that machine data such as magnet settings can be sent to Xorbit via EPICS, and the resulting orbit parameters such as beta functions, etc., can be calculated. In addition, Xorbit can be made to simulate the real machine, whether the latter is running or not. To accomplish this for the APS, there is a database of process variables in an IOC corresponding to each APS ring and beamline. These process variables are very similar to the real process variables that read and set power supplies and read monitors, except that when a setting is changed, Xorbit is notified via a callback, calculates a new orbit, and outputs the appropriate readbacks to the database. By attaching the string "Xorbit:" to a control name, the control system will respond to the simulation rather than the real system. This allows the testing of control algorithms, orbit diagnostics, and many other components of the control system (as well as EPICS itself). It is fast enough to be visually similar to accessing the real system
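
    A much-simplified stand-in for this behaviour can be written as a soft Channel Access server in Python using the third-party pcaspy package (the APS implementation used real IOC databases with callbacks into Xorbit); the PV names and the toy orbit response below are invented for illustration.

```python
from pcaspy import Driver, SimpleServer

PREFIX = "Xorbit:"
PVDB = {
    "HC1:SETPOINT": {"prec": 4, "unit": "A"},    # simulated corrector setting
    "BPM1:X":       {"prec": 4, "unit": "mm"},   # simulated orbit readback
}

class XorbitSimDriver(Driver):
    def write(self, reason, value):
        """Writing a setpoint triggers a (toy) orbit recalculation."""
        self.setParam(reason, value)
        if reason == "HC1:SETPOINT":
            self.setParam("BPM1:X", 0.42 * value)   # stand-in for the orbit code
        self.updatePVs()
        return True

server = SimpleServer()
server.createPV(PREFIX, PVDB)
driver = XorbitSimDriver()

# caput Xorbit:HC1:SETPOINT 1.0, then caget Xorbit:BPM1:X shows the new readback.
while True:
    server.process(0.1)
```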

  7. EPICS SCA CLIENTS ON THE .NET X64 PLATFORM

    International Nuclear Information System (INIS)

    Timossi, Chris; Nishimura, Hiroshi

    2006-01-01

    We have developed a .NET assembly, which we call SCA.NET, which we have been using for building EPICS based control room applications at the Advanced Light Source (ALS). In this paper we report on our experiences building a 64-bit version of SCA.NET and the underlying channel access libraries for Windows XP x64 (using a dual core AMD Athlon CPU). We also report on our progress in building new accelerator control applications for this environment

  8. Dynamic Binary Modification Tools, Techniques and Applications

    CERN Document Server

    Hazelwood, Kim

    2011-01-01

    Dynamic binary modification tools form a software layer between a running application and the underlying operating system, providing the powerful opportunity to inspect and potentially modify every user-level guest application instruction that executes. Toolkits built upon this technology have enabled computer architects to build powerful simulators and emulators for design-space exploration, compiler writers to analyze and debug the code generated by their compilers, software developers to fully explore the features, bottlenecks, and performance of their software, and even end-users to extend

  9. The Philippine Epics and Ballads Multimedia Archive

    Directory of Open Access Journals (Sweden)

    Nicole Revel

    2013-10-01

    Full Text Available This essay offers an introduction to the Philippine Epics and Ballads Archive. This collection is a joint endeavor between singers, scholars, knowledgeable local persons, and technical assistants. This archive exemplifies a part of the cultural heritage among 15 national cultural communities and their respective languages. A multi-media eCompanion offers an interactive version of a Palawan epic song.

  10. [Is the socioeconomic deprivation EPICES score useful in obstetrics?].

    Science.gov (United States)

    Convers, M; Langeron, A; Sass, C; Moulin, J-J; Augier, A; Varlet, M-N; Seffert, P; Chêne, G

    2012-04-01

    To describe a validated and multifactorial deprivation score to study the relationship between socioeconomic deprivation and perinatal risks. The index of deprivation EPICES (Evaluation of Precarity and Inequalities in Health Examination Centers) was used to characterize the deprivation status of 234 women in post-partum in comparison with perinatal morbidity. The cutoff value of 30.7 was the threshold to define deprivation. Two hundred and eight patients were included in this retrospective study, of whom 48 (23%) had a score of deprivation higher than 30.7. Maternofetal morbidity was more severe in deprived patients. The current results show that the EPICES score could be a useful obstetrical tool for the identification of deprived women during pregnancy. Copyright © 2011 Elsevier Masson SAS. All rights reserved.

  11. EPICS IOC module development and implementation for the ISTTOK machine subsystem operation and control

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Paulo, E-mail: pricardofc@ipfn.ist.utl.pt [Associacao EURATOM/IST, Instituto de Plasmas e Fusao Nuclear-Laboratorio Associado, Instituto Superior Tecnico, P-1049-001 Lisboa (Portugal); Duarte, Andre; Pereira, Tiago; Carvalho, Bernardo; Sousa, Jorge; Fernandes, Horacio [Associacao EURATOM/IST, Instituto de Plasmas e Fusao Nuclear-Laboratorio Associado, Instituto Superior Tecnico, P-1049-001 Lisboa (Portugal); Correia, Carlos [Grupo de Electronica e Instrumentacao-Centro de Instrumentacao, Departamento de Fisica, Universidade de Coimbra, P-3004-516 Coimbra (Portugal); Goncalves, Bruno; Varandas, Carlos [Associacao EURATOM/IST, Instituto de Plasmas e Fusao Nuclear-Laboratorio Associado, Instituto Superior Tecnico, P-1049-001 Lisboa (Portugal)

    2011-10-15

    This paper presents a developed, tested and integrated EPICS IOC (I/O controller) module solution for the ISTTOK tokamak machine operation and control for the vacuum and gas injection systems. The work is organized in two software layers which communicate through a serial RS-232 communication protocol. The first software layer is an EPICS IOC module running as a computer server application, capable of receiving requests from remote or local clients and providing a driver interface to the system by forwarding requested commands and receiving system and control operation status. The second software layer is the firmware running in Microchip dsPIC microcontroller modules, which performs the interface from the RS-232 optical fiber serial protocol to the EPICS IOC module. The dsPIC module communicates with the ISTTOK tokamak sensors and actuators via RS-485 and is programmed with a new protocol developed for this purpose that allows EPICS IOC module command sending/receiving, machine operation control and monitoring, and system status information. Communication between the EPICS IOC module and clients is achieved via a TCP/IP and UDP protocol referred to as Channel Access. In addition, the EPICS IOC module provides access for user client applications, allowing operators to perform remote or local monitoring, operation and control.

  12. Organic Bioelectronic Tools for Biomedical Applications

    Directory of Open Access Journals (Sweden)

    Susanne Löffler

    2015-11-01

    Full Text Available Organic bioelectronics forms the basis of conductive polymer tools with great potential for application in biomedical science and medicine. It is a rapidly growing field of both academic and industrial interest since conductive polymers bridge the gap between electronics and biology by being electronically and ionically conductive. This feature can be employed in numerous ways by choosing the right polyelectrolyte system and tuning its properties towards the intended application. This review highlights how active organic bioelectronic surfaces can be used to control cell attachment and release as well as to trigger cell signaling by means of electrical, chemical or mechanical actuation. Furthermore, we report on the unique properties of conductive polymers that make them outstanding materials for labeled or label-free biosensors. Techniques for electronically controlled ion transport in organic bioelectronic devices are introduced, and examples are provided to illustrate their use in self-regulated medical devices. Organic bioelectronics have great potential to become a primary platform in future bioelectronics. We therefore introduce current applications that will aid in the development of advanced in vitro systems for biomedical science and of automated systems for applications in neuroscience, cell biology and infection biology. Considering this broad spectrum of applications, organic bioelectronics could lead to timely detection of disease, and facilitate the use of remote and personalized medicine. As such, organic bioelectronics might contribute to efficient healthcare and reduced hospitalization times for patients.

  13. EPICS V4 Evaluation for SNS Neutron Data

    Energy Technology Data Exchange (ETDEWEB)

    Kasemir, Kay [ORNL; Pearson, Matthew R [ORNL; Guyotte, Greg S [ORNL

    2015-01-01

    Version 4 of the Experimental Physics and Industrial Control System (EPICS) toolkit allows defining application-specific structured data types (pvData) and offers a network protocol for their efficient exchange (pvAccess). We evaluated V4 for the transport of neutron events from the detectors of the Spallation Neutron Source (SNS) to data acquisition and experiment monitoring systems. This includes the comparison of possible data structures, performance tests, and experience using V4 in production on a beam line.
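
    From Python, the same structured-data exchange over pvAccess can be sketched with the third-party p4p binding (the SNS evaluation itself used the C++ pvData/pvAccess modules); the PV name below is a placeholder for a structure carrying neutron events.

```python
from p4p.client.thread import Context

ctxt = Context("pva")                      # pvAccess client context

# A pvAccess get returns the whole structure, not just a scalar value.
events = ctxt.get("BL99:Det:Neutrons")     # placeholder PV name
print(events)

def on_update(value):
    """Called for every monitor update with the decoded pvData structure."""
    print("received", value)

sub = ctxt.monitor("BL99:Det:Neutrons", on_update)
# ... keep the program alive while updates arrive; sub.close() to stop.
```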

  14. The CEBAF accelerator control system: migrating from a TACL to an EPICS based system

    International Nuclear Information System (INIS)

    Watson, W.A. III; Barker, David; Bickley, Matthew; Gupta, Pratik; Johnson, R.P.

    1994-01-01

    CEBAF is in the process of migrating its accelerator and experimental hall control systems to one based upon EPICS, a control system toolkit developed by a collaboration among several DOE laboratories in the US. The new system design interfaces existing CAMAC hardware via a CAMAC serial highway to VME-based I/O controllers running the EPICS software; future additions and upgrades will for the most part go directly into VME. The decision to use EPICS followed difficulties in scaling the CEBAF-developed TACL system to full machine operation. TACL and EPICS share many design concepts, facilitating the conversion of software from one toolkit to the other. In particular, each supports graphical entry of algorithms built up from modular code, graphical displays with a display editor, and a client-server architecture with name-based I/O. During the migration, TACL and EPICS will interoperate through a socket-based I/O gateway. As part of a collaboration with other laboratories, CEBAF will add relational database support for system management and high level applications support. Initial experience with EPICS is presented, along with a plan for the full migration which is expected to be finished next year. ((orig.))

  15. Experience using EPICS on PC platforms

    International Nuclear Information System (INIS)

    Hill, J.O.; Kasemire, K.U.

    1997-03-01

    The Experimental Physics and Industrial Control System (EPICS) has been widely adopted in the accelerator community. Although EPICS is available on many platforms, the majority of implementations have used UNIX workstations as clients, and VME- or VXI-based processors for distributed input output controllers. Recently, a significant portion of EPICS has been ported to personal computer (PC) hardware platforms running Microsoft's operating systems, and also Wind River System's real time vxWorks operating system. This development should significantly reduce the cost of deploying EPICS systems, and the prospect of using EPICS together with the many high quality commercial components available for PC platforms is also encouraging. A hybrid system using both PC and traditional platforms is currently being implemented at LANL for LEDA, the low energy demonstration accelerator under construction as part of the Accelerator Production of Tritium (APT) project. To illustrate these developments the authors compare their recent experience deploying a PC-based EPICS system with experience deploying similar systems based on traditional (UNIX-hosted) EPICS hardware and software platforms

  16. Three Object-Oriented enhancements for EPICS

    Science.gov (United States)

    Osberg, E. A.; Dohan, D. A.; Richter, R.; Biggs, R.; Chillara, K.; Wade, D.; Bossom, J.

    1994-12-01

    In line with our group's intention of producing software using, where possible, Object-Oriented methodologies and techniques in the development of RF control systems, we have undertaken three projects to enhance the EPICS software environment. Two of the projects involve interfaces to EPICS Channel Access from Object-Oriented languages. The third is an enhancement to the EPICS State Notation Language to better support the Shlaer-Mellor Object-Oriented Analysis and Design Methodology. This paper discusses the motivation, approaches, results and future directions of these three projects.

  17. Integrating commercial and legacy systems with EPICS

    International Nuclear Information System (INIS)

    Hill, J.O.; Kasemir, K.U.

    1997-01-01

    The Experimental Physics and Industrial Control System (EPICS) is a software toolkit, developed by a worldwide collaboration, which significantly reduces the level of effort required to implement a new control system. Recent developments now also significantly reduce the level of effort required to integrate commercial, legacy and/or site-authored control systems with EPICS. This paper will illustrate with an example both the level and type of effort required to use EPICS with other control system components as well as the benefits that may arise

  18. Implementation of the Integrated Alarm System for KOMAC facility using EPICS framework and Eclipse

    International Nuclear Information System (INIS)

    Song, Young-Gi; Kim, Jae-Ha; Kim, Han-Sung; Kwon, Hyeok-Jung; Cho, Yong-Sub

    2017-01-01

    The alarm detecting layer is the component that monitors alarm signals, which are transported to the processing part through a message queue. The main purpose of the processing part is to transfer the alarm signals, linking each alarm identification and its state, to the database system. The operation interface for system-level signal links has been developed with the EPICS framework, and EPICS tools have been used for monitoring device alarm status. The KOMAC alarm system was developed to offer a user-friendly, intuitive user interface. The alarm system is implemented with an EPICS IOC for the alarm server, the Eclipse Mars integrated development tool for the alarm viewer, and MariaDB for the alarm log. The new alarm system supports an intuitive user interface for alarm information and alarm history. Planned additions to the alarm viewer include a login function, user permissions for alarm acknowledgement and PV import, and search and report functions.

  19. Web services interface to EPICS channel access

    Institute of Scientific and Technical Information of China (English)

    DUAN Lei; SHEN Liren

    2008-01-01

    Web services are used in the Experimental Physics and Industrial Control System (EPICS). Combined with the EPICS Channel Access protocol, the high usability, platform independence and language independence of Web services can be used to design a fully transparent and uniform software interface layer, which helps us complete channel data acquisition, modification and monitoring functions. This software interface layer is cross-platform and cross-language, with good interoperability and reusability.

  20. Web services interface to EPICS channel access

    International Nuclear Information System (INIS)

    Duan Lei; Shen Liren

    2008-01-01

    Web services are used in the Experimental Physics and Industrial Control System (EPICS). Combined with the EPICS Channel Access protocol, the high usability, platform independence and language independence of Web services can be used to design a fully transparent and uniform software interface layer, which helps us complete channel data acquisition, modification and monitoring functions. This software interface layer is cross-platform and cross-language, with good interoperability and reusability. (authors)

  1. EPIC: an Error Propagation/Inquiry Code

    International Nuclear Information System (INIS)

    Baker, A.L.

    1985-01-01

    The use of a computer program EPIC (Error Propagation/Inquiry Code) will be discussed. EPIC calculates the variance of a materials balance closed about a materials balance area (MBA) in a processing plant operated under steady-state conditions. It was designed for use in evaluating the significance of inventory differences in the Department of Energy (DOE) nuclear plants. EPIC rapidly estimates the variance of a materials balance using average plant operating data. The intent is to learn as much as possible about problem areas in a process with simple straightforward calculations assuming a process is running in a steady-state mode. EPIC is designed to be used by plant personnel or others with little computer background. However, the user should be knowledgeable about measurement errors in the system being evaluated and have a limited knowledge of how error terms are combined in error propagation analyses. EPIC contains six variance equations; the appropriate equation is used to calculate the variance at each measurement point. After all of these variances are calculated, the total variance for the MBA is calculated using a simple algebraic sum of variances. The EPIC code runs on any computer that accepts a standard form of the BASIC language. 2 refs., 1 fig., 6 tabs
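
    The final propagation step is a plain sum of the per-point variances, as the short sketch below illustrates with made-up component variances.

```python
# Toy illustration of the propagation step: the variance of the materials
# balance is the algebraic sum of the variances computed at each measurement
# point. The component variances below are made-up numbers.
point_variances = {
    "feed measurement":      0.012,
    "product measurement":   0.008,
    "waste measurement":     0.003,
    "inventory measurement": 0.021,
}

total_variance = sum(point_variances.values())
print(f"materials balance variance = {total_variance:.3f}, "
      f"standard deviation = {total_variance ** 0.5:.3f}")
```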

  2. Hyperarchiver: an evolution of EPICS channel archiver

    International Nuclear Information System (INIS)

    Campo, M. del; Arredondo, I.; Jugo, J.; Giacchini, M.; Giovannini, L.

    2012-01-01

    Data storage is a primary issue in any research facility. In the EPICS middleware based accelerator community, Channel Archiver has always been considered the main reference. It works with Oracle and MySQL, probably the best known relational databases. However, demanding requirements at minimum cost have fostered the development of a wide range of alternatives, like MDSplus (Consorzio RFX), SciDB (BNL) or Hypertable (INFN). This document presents a tool called HyperArchiver, which was first developed at INFN (Italy) and eventually customised by ESS Bilbao (Spain). Based on a NoSQL database named Hypertable, it focuses on the management of large data sets with maximum scalability, reliability and performance. Besides the update and further customization made at ESS Bilbao, HyperArchiver is presented with a set of GUIs in order to provide easy use and integration with any general control system. A LabVIEW VI and two cross-platform PyQt GUIs for both Hypertable data retrieval and HyperArchiver control have been developed and successfully tested at ESS Bilbao. (author)

  3. Collaborative development of the EPICS Qt framework Phase I Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Mayssat, Robert E. [Lyncean Technologies, Inc., Palo Alto, CA (United States)

    2015-01-15

    At Lyncean, a private company spun-off from technology developed at the SLAC National Lab, we have been using EPICS for over a decade. EPICS is ubiquitous on our flagship product – the Compact Light Source. EPICS is not only used to control our laser and accelerator systems, but also to control our x-ray beamlines. The goal of this SBIR is for Lyncean Technologies to spearhead a worldwide collaborative effort for the development of control system tools for EPICS using the Qt framework, a C++-based coding environment that could serve as a competitive alternative to the Java-based Control System Studio (CSS). This grant's Phase I, not unlike a feasibility study, is designed for planning and scoping the preparatory work needed for Phase II or other funding opportunities. The three main objectives of this Phase I are (1) to become better acquainted with the existing EPICS Qt software and Qt framework in order to evaluate the best options for ongoing development, (2) to demonstrate that our engineers can lead the EPICS community and jump-start the Qt collaboration, and (3) to identify a scope for our future work with solicited feedback from the EPICS community. This Phase I report includes key technical findings. It clarifies the differences between the two apparently-competing EPICS Qt implementations, caQtDM and the QE Framework; it explains how to create python-bindings, and compares Qt graphical libraries. But this report is also a personal story that narrates the birth of a collaboration. Starting a collaboration is not the work of a single individual, but the work of many. Therefore this report is also an attempt to publicly give credit to many who supported the effort. The main take-away from this grant is the successful birth of an EPICS Qt collaboration, seeded with existing software from the PSI and the Australian Synchrotron. But a lot more needs to be done for the collaboration founders' vision to be realized, and for the collaboration to reach

  4. Application of Genomic Tools in Plant Breeding

    OpenAIRE

    Pérez-de-Castro, A.M.; Vilanova, S.; Cañizares, J.; Pascual, L.; Blanca, J.M.; Díez, M.J.; Prohens, J.; Picó, B.

    2012-01-01

    Plant breeding has been very successful in developing improved varieties using conventional tools and methodologies. Nowadays, the availability of genomic tools and resources is leading to a new revolution of plant breeding, as they facilitate the study of the genotype and its relationship with the phenotype, in particular for complex traits. Next Generation Sequencing (NGS) technologies are allowing the mass sequencing of genomes and transcriptomes, which is producing a vast array of genomic...

  5. 77 FR 73655 - Epic Marketplace, Inc., and Epic Media Group, LLC; Analysis of Proposed Consent Order To Aid...

    Science.gov (United States)

    2012-12-11

    ... action or make final the agreement's proposed order. Epic Marketplace, Inc. (``Epic'') is an advertising company that engages in online behavioral advertising, which is the practice of tracking a consumer's online activities in order to deliver advertising targeted to the consumer's interests. Epic is a wholly...

  6. Hedging tools and cross market applications

    International Nuclear Information System (INIS)

    Schlenker, C.

    1997-01-01

    The nature of the basic tools of the market - put, call, and swap - their various combinations and how they are used in various market transactions were explained. The role and effect of the use of these tools on prices, price fluctuations and risks were outlined. Predictions for the future for producers (reduced use of price optimization and its replacement by price diversification strategies, realignment of risk management tools to be in accord with the changed marketing techniques), and for end users (focus on managing short term fluctuations in input costs, short-term price fixing, more frequent utilization of traditional option strategies and spread options), were summarized. For the electricity market in particular, the prominence of gas-fired generation units as options on the spread between gas and electricity, providing opportunities for outperformance options was predicted

  7. Centrally managed name resolution schemes for EPICS

    International Nuclear Information System (INIS)

    Jun, D.

    1997-01-01

    The Experimental Physics and Industrial Control System (EPICS) uses a broadcast method to locate resources and controls distributed across control servers. There are many advantages offered by using a centrally managed name resolution method, in which resources are located using a repository. The suitability of DCE Directory Service as a name resolution method is explored, and results from a study involving DCE are discussed. An alternative nameserver method developed and in use at the Thomas Jefferson national Accelerator Facility (Jefferson Lab) is described and results of integrating this new method with existing EPICS utilities presented. The various methods discussed in the paper are compared

  8. Decommissioned Data Tools and Web Applications

    Science.gov (United States)

    Census Bureau listing of decommissioned data tools and web applications, including American FactFinder and Census Business Builder.

  9. Database development tool DELPHI and its application

    International Nuclear Information System (INIS)

    Ma Mei

    2000-01-01

    The authors describe the progress of software development technologies and tools, the features and performance of Borland Delphi, and a software development example: the management information system of the tank region storage and transportation control center for Zhenhai Refining and Chemical Co., Ltd. in Zhejiang province

  10. Artificial intelligence tool development and applications to nuclear power

    International Nuclear Information System (INIS)

    Naser, J.A.

    1987-01-01

    Two parallel efforts are being performed at the Electric Power Research Institute (EPRI) to help the electric utility industry take advantage of expert system technology. The first effort is the development of expert system building tools, which are tailored to electric utility industry applications. The second effort is the development of expert system applications. These two efforts complement each other. The application development tests the tools and identifies additional tool capabilities that are required. The tool development helps define the applications that can be successfully developed. Artificial intelligence, as demonstrated by the developments described, is being established as a credible technological tool for the electric utility industry. The challenge in transferring artificial intelligence technology and an understanding of its potential to the electric utility industry is to gain an understanding of the problems that reduce power plant performance and identify which can be successfully addressed using artificial intelligence

  11. OpenDOAR Policy tools and applications

    CERN Document Server

    CERN. Geneva; Van de Sompel, Herbert

    2007-01-01

    OpenDOAR conducted a survey of the world's repositories that showed 2/3 as having unusable or missing policies for content re-use and other issues. Such policies are essential for service providers to be able to develop innovative services that use the full potential of open access. OpenDOAR has developed a set of policy generator tools for repository administrators and is contacting administrators to advocate policy development. It is hoped that one outcome from this work will be some standardisation of policies in vocabulary and intent. Other developments include an OpenDOAR API. This presentation looks at the way that the tools and API have been developed and the implications for their use.

  12. OpenDOAR Policy tools and applications

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    OpenDOAR conducted a survey of the world's repositories that showed 2/3 as having unusable or missing policies for content re-use and other issues. Such policies are essential for service providers to be able to develop innovative services that use the full potential of open access. OpenDOAR has developed a set of policy generator tools for repository administrators and is contacting administrators to advocate policy development. It is hoped that one outcome from this work will be some standardisation of policies in vocabulary and intent. Other developments include an OpenDOAR API. This presentation looks at the way that the tools and API have been developed and the implications for their use.

  13. Application of diamond tools when decontaminating concrete

    International Nuclear Information System (INIS)

    Woods, B.L.; Gossett, R.F.

    1980-01-01

    The utilization of diamond concrete cutting tools offers new potential approaches to the recurring problems of removing contaminated concrete. Innovative techniques can provide exacting removal within a dust-free environment. Present day technology allows remote control operated equipment to perform tasks heretofore considered impossible. Experience gained from years of removing concrete within the construction industry hopefully can contribute new and improved methods to D and D projects

  14. Application of genomic tools in plant breeding.

    Science.gov (United States)

    Pérez-de-Castro, A M; Vilanova, S; Cañizares, J; Pascual, L; Blanca, J M; Díez, M J; Prohens, J; Picó, B

    2012-05-01

    Plant breeding has been very successful in developing improved varieties using conventional tools and methodologies. Nowadays, the availability of genomic tools and resources is leading to a new revolution of plant breeding, as they facilitate the study of the genotype and its relationship with the phenotype, in particular for complex traits. Next Generation Sequencing (NGS) technologies are allowing the mass sequencing of genomes and transcriptomes, which is producing a vast array of genomic information. The analysis of NGS data by means of bioinformatics developments allows discovering new genes and regulatory sequences and their positions, and makes available large collections of molecular markers. Genome-wide expression studies provide breeders with an understanding of the molecular basis of complex traits. Genomic approaches include TILLING and EcoTILLING, which make it possible to screen mutant and germplasm collections for allelic variants in target genes. Re-sequencing of genomes is very useful for the genome-wide discovery of markers amenable for high-throughput genotyping platforms, like SSRs and SNPs, or the construction of high density genetic maps. All these tools and resources facilitate studying the genetic diversity, which is important for germplasm management, enhancement and use. Also, they allow the identification of markers linked to genes and QTLs, using a diversity of techniques like bulked segregant analysis (BSA), fine genetic mapping, or association mapping. These new markers are used for marker assisted selection, including marker assisted backcross selection, 'breeding by design', or new strategies, like genomic selection. In conclusion, advances in genomics are providing breeders with new tools and methodologies that allow a great leap forward in plant breeding, including the 'superdomestication' of crops and the genetic dissection and breeding for complex traits.

  15. Education for Change: Epic Charter School

    Science.gov (United States)

    EDUCAUSE, 2015

    2015-01-01

    The student-centered school model of Epic Charter School in Oakland, California, framed around a hero's journey empowers middle school students with sense of unity and purpose in life, where they can feel part of a culture with a shared experience and with more opportunities to experiences growth and accomplishment. Design and engineering is front…

  16. The coal epic at Freyming-Merlebach

    International Nuclear Information System (INIS)

    2003-01-01

    This information document has been realized on the closure of the Merlebach site. It is devoted to the coal epic at Freyming-Merlebach. The historical aspects of the exploitation, the working conditions, the economic and environmental aspects of the mine and the today situation are detailed. (A.L.B.)

  17. JBluIce-EPICS control system for macromolecular crystallography

    International Nuclear Information System (INIS)

    Stepanov, S.; Makarov, O.; Hilgart, M.; Pothineni, S.; Urakhchin, A.; Devarapalli, S.; Yoder, D.; Becker, M.; Ogata, C.; Sanishvili, R.; Nagarajan, V.; Smith, J.L.; Fischetti, R.F.

    2011-01-01

    The trio of macromolecular crystallography beamlines constructed by the General Medicine and Cancer Institutes Collaborative Access Team (GM/CA-CAT) in Sector 23 of the Advanced Photon Source (APS) have been in growing demand owing to their outstanding beam quality and capacity to measure data from crystals of only a few micrometres in size. To take full advantage of the state-of-the-art mechanical and optical design of these beamlines, a significant effort has been devoted to designing fast, convenient, intuitive and robust beamline controls that could easily accommodate new beamline developments. The GM/CA-CAT beamline controls are based on the power of EPICS for distributed hardware control, the rich Java graphical user interface of Eclipse RCP and the task-oriented philosophy as well as the look and feel of the successful SSRL BluIce graphical user interface for crystallography. These beamline controls feature a minimum number of software layers, the wide use of plug-ins that can be written in any language and unified motion controls that allow on-the-fly scanning and optimization of any beamline component. This paper describes the ways in which BluIce was combined with EPICS and converted into the Java-based JBluIce, discusses the solutions aimed at streamlining and speeding up operations and gives an overview of the tools that are provided by this new open-source control system for facilitating crystallographic experiments, especially in the field of microcrystallography.

  18. Mixed Reality: Concepts, Tools and Applications

    Directory of Open Access Journals (Sweden)

    Ildeberto Aparecido Rodello

    2010-10-01

    Full Text Available Mixed Reality proposes scenes that combine virtual and real worlds, offering the user an intuitive way of interaction suited to the specific application. This tutorial paper aims at presenting the fundamental concepts of this emergent kind of human-computer interface.

  19. Enhancing the Effectiveness of ICT Applications and Tools for ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Enhancing the Effectiveness of ICT Applications and Tools for Disaster ... of disaster management in the Caribbean, including early warning systems and collection ... to enhancing regional strategies to respond to natural hazards using ICTs.

  20. Preliminary design and integration of EPICS operation interface for the Taiwan photon source

    International Nuclear Information System (INIS)

    Cheng, Y.S.; Jenny Chen; Chiu, P.C.; Kuo, C.H.; Liao, C.Y.; Hsu, K.T.; Wu, C.Y.

    2012-01-01

    The TPS (Taiwan Photon Source) is the latest-generation 3 GeV synchrotron light source, which has been under construction since 2010. The EPICS (Experimental Physics and Industrial Control System) framework is adopted as the control system infrastructure for the TPS. The EPICS IOCs (Input Output Controllers) and various database records have been gradually implemented to control and monitor the currently available TPS subsystems, which include timing, power supplies, motion controllers, miscellaneous Ethernet-compliant devices, etc. Through EPICS PV (Process Variable) channel access, I/O data accessed remotely over the Ethernet interface can be observed with graphical toolkits such as the EDM (Extensible Display Manager) and MATLAB. The operation interfaces mainly include setting, reading, save and restore functions. Integration of the operation interfaces will depend upon the properties of each subsystem. In addition, a centralized management method is utilized to serve every client from file servers in order to maintain consistent versions of the related EPICS files. (authors)
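
    As an illustration of the kind of PV channel access described above, the following is a minimal sketch using the pyepics Python client; the PV names are hypothetical placeholders, not actual TPS record names.

        # Minimal sketch: reading and writing process variables over Channel Access.
        # Requires the pyepics client library; PV names below are hypothetical.
        from epics import PV, caget, caput

        # One-shot read/write, e.g. for a save/restore style operation
        current = caget("TPS:PS:DIPOLE:CURRENT")        # hypothetical PV name
        caput("TPS:PS:DIPOLE:CURRENT_SP", 120.5)        # hypothetical setpoint PV

        # Monitor a PV and react to value changes, as an operator display would
        def on_change(pvname=None, value=None, **kwargs):
            print(f"{pvname} changed to {value}")

        status_pv = PV("TPS:TIMING:STATUS", callback=on_change)  # hypothetical PV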

  1. Sensitivity of an Integrated Mesoscale Atmosphere and Agriculture Land Modeling System (WRF/CMAQ-EPIC) to MODIS Vegetation and Lightning Assimilation

    Science.gov (United States)

    Ran, L.; Cooter, E. J.; Gilliam, R. C.; Foroutan, H.; Kang, D.; Appel, W.; Wong, D. C.; Pleim, J. E.; Benson, V.; Pouliot, G.

    2017-12-01

    The combined meteorology and air quality modeling system composed of the Weather Research and Forecast (WRF) model and Community Multiscale Air Quality (CMAQ) model is an important decision support tool that is used in research and regulatory decisions related to emissions, meteorology, climate, and chemical transport. The Environmental Policy Integrated Climate (EPIC) is a cropping model which has long been used in a range of applications related to soil erosion, crop productivity, climate change, and water quality around the world. We have integrated WRF/CMAQ with EPIC using the Fertilizer Emission Scenario Tool for CMAQ (FEST-C) to estimate daily soil N information with fertilization for CMAQ bi-directional ammonia flux modeling. Driven by the weather and N deposition from WRF/CMAQ, FEST-C EPIC simulations are conducted on 22 different agricultural production systems ranging from managed grass lands (e.g. hay and alfalfa) to crop lands (e.g. corn grain and soybean) with rainfed and irrigated information across any defined conterminous United States (U.S.) CMAQ domain and grid resolution. In recent years, this integrated system has been enhanced and applied in many different air quality and ecosystem assessment projects related to land-water-atmosphere interactions. These enhancements have advanced this system to become a valuable tool for integrated assessments of air, land and water quality in light of social drivers and human and ecological outcomes. This presentation will focus on evaluating the sensitivity of precipitation and N deposition in the integrated system to MODIS vegetation input and lightning assimilation and their impacts on agricultural production and fertilization. We will describe the integrated modeling system and evaluate simulated precipitation and N deposition along with other weather information (e.g. temperature, humidity) for 2011 over the conterminous U.S. at 12 km grids from a coupled WRF/CMAQ with MODIS and lightning assimilation

  2. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering tools such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in the project management in power systems ...

  3. Calibration of the DSCOVR EPIC visible and NIR channels using MODIS Terra and Aqua data and EPIC lunar observations

    Directory of Open Access Journals (Sweden)

    I. V. Geogdzhayev

    2018-01-01

    Full Text Available The unique position of the Deep Space Climate Observatory (DSCOVR) Earth Polychromatic Imaging Camera (EPIC) at the Lagrange 1 point makes it an important addition to the data from currently operating low Earth orbit observing instruments. The EPIC instrument does not have an onboard calibration facility. One approach to its calibration is to compare EPIC observations to the measurements from polar-orbiting radiometers. The Moderate Resolution Imaging Spectroradiometer (MODIS) is a natural choice for such comparison due to its well-established calibration record and wide use in remote sensing. We use MODIS Aqua and Terra L1B 1 km reflectances to infer calibration coefficients for four EPIC visible and NIR channels: 443, 551, 680 and 780 nm. MODIS and EPIC measurements made between June 2015 and 2016 are employed for comparison. We first identify favorable MODIS pixels with scattering angle matching temporally collocated EPIC observations. Each EPIC pixel is then spatially collocated to a subset of the favorable MODIS pixels within a 25 km radius. The standard deviation of the selected MODIS pixels as well as of the adjacent EPIC pixels is used to find the most homogeneous scenes. These scenes are then used to determine calibration coefficients using a linear regression between EPIC counts s−1 and reflectances in the close MODIS spectral channels. We present the thus inferred EPIC calibration coefficients and discuss sources of uncertainties. The lunar EPIC observations are used to calibrate the EPIC O2 absorbing channels (688 and 764 nm), assuming that there is only a small difference between moon reflectances separated by ∼ 10 nm in wavelength and provided that the calibration factors of the red (680 nm) and NIR (780 nm) channels are known from the comparison between EPIC and MODIS.
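
    A minimal sketch of the regression step described above, assuming synthetic placeholder values in place of real collocated EPIC/MODIS samples:

        # Sketch of the calibration step: a linear fit of MODIS reflectance
        # against EPIC counts/s over collocated, homogeneous scenes.
        # The arrays below are synthetic placeholders, not real collocation data.
        import numpy as np

        epic_counts = np.array([1.2e4, 2.5e4, 3.1e4, 4.8e4, 6.0e4])   # counts/s
        modis_refl  = np.array([0.061, 0.128, 0.158, 0.247, 0.308])   # reflectance

        # Least-squares fit: reflectance ~ gain * (counts/s) + offset
        gain, offset = np.polyfit(epic_counts, modis_refl, 1)
        print(f"gain = {gain:.3e} reflectance per (counts/s), offset = {offset:.3e}")

        # The coefficients then convert EPIC counts/s to calibrated reflectance
        calibrated = gain * epic_counts + offset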

  4. Calibration of the DSCOVR EPIC Visible and NIR Channels using MODIS Terra and Aqua Data and EPIC Lunar Observations

    Science.gov (United States)

    Geogdzhayev, Igor V.; Marshak, Alexander

    2018-01-01

    The unique position of the Deep Space Climate Observatory (DSCOVR) Earth Polychromatic Imaging Camera (EPIC) at the Lagrange 1 point makes it an important addition to the data from currently operating low Earth orbit observing instruments. The EPIC instrument does not have an onboard calibration facility. One approach to its calibration is to compare EPIC observations to the measurements from polar-orbiting radiometers. The Moderate Resolution Imaging Spectroradiometer (MODIS) is a natural choice for such comparison due to its well-established calibration record and wide use in remote sensing. We use MODIS Aqua and Terra L1B 1 km reflectances to infer calibration coefficients for four EPIC visible and NIR channels: 443, 551, 680 and 780 nm. MODIS and EPIC measurements made between June 2015 and 2016 are employed for comparison. We first identify favorable MODIS pixels with scattering angle matching temporally collocated EPIC observations. Each EPIC pixel is then spatially collocated to a subset of the favorable MODIS pixels within a 25 km radius. The standard deviation of the selected MODIS pixels as well as of the adjacent EPIC pixels is used to find the most homogeneous scenes. These scenes are then used to determine calibration coefficients using a linear regression between EPIC counts/sec and reflectances in the close MODIS spectral channels. We present the thus inferred EPIC calibration coefficients and discuss sources of uncertainties. The lunar EPIC observations are used to calibrate the EPIC O2 absorbing channels (688 and 764 nm), assuming that there is only a small difference between moon reflectances separated by approximately 10 nm in wavelength, provided that the calibration factors of the red (680 nm) and near-IR (780 nm) channels are known from the comparison between EPIC and MODIS.

  5. EPICS device support module as ATCA system manager for the ITER fast plant system controller

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Paulo F., E-mail: pricardofc@ipfn.ist.utl.pt [Associação EURATOM/IST, Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico – Universidade Técnica de Lisboa, Lisboa (Portugal); Santos, Bruno; Gonçalves, Bruno; Carvalho, Bernardo B.; Sousa, Jorge; Rodrigues, A.P.; Batista, António J.N.; Correia, Miguel; Combo, Álvaro [Associação EURATOM/IST, Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico – Universidade Técnica de Lisboa, Lisboa (Portugal); Correia, Carlos M.B.A. [Centro de Instrumentação, Departamento de Física, Universidade de Coimbra, Coimbra (Portugal); Varandas, Carlos A.F. [Associação EURATOM/IST, Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico – Universidade Técnica de Lisboa, Lisboa (Portugal)

    2013-10-15

    Highlights: ► In nuclear fusion, demanding security and high-availability requirements call for redundancy to be available. ► ATCA-based nuclear fusion systems are composed of several electronic and mechanical components. ► Control and monitoring of ATCA electronic systems are recommended. ► ITER Fast Plant System Controller project CODAC system prototype. ► EPICS device support module as an external ATCA system manager solution. -- Abstract: This paper presents an Experimental Physics and Industrial Control System (EPICS) device support module for the International Thermonuclear Experimental Reactor (ITER) Fast Plant System Controller (FPSC) project, based on the Advanced Telecommunications Computing Architecture (ATCA) specification. The developed EPICS device support module provides an External System Manager (ESM) solution for monitoring and controlling the ITER FPSC ATCA shelf system and data acquisition boards in order to take proper action and report problems to a control room operator or high-level management unit in case of any system failure. The EPICS device support module acts as a Channel Access (CA) server to report problems and publish ATCA system data information to the control room operator, high-level management unit or other CA network clients such as Control System Studio Operator Interfaces (CSS OPIs), the Best Ever Alarm System Toolkit (BEAST), the Best Ever Archive Utility (BEAUTY) or other CA client applications. The EPICS device support module communicates with the ATCA Shelf Manager (ShM) using the HTTP protocol to send and receive commands through the POST method in order to get and set system and shelf component properties such as fan speed measurements, temperature readings, module status and ATCA board acquisition and configuration parameters. All system properties, states, commands and parameters are available through the EPICS device support module CA server in EPICS Process Variable (PV) and signal format. The ATCA ShM receives the HTTP protocol
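
    The sketch below illustrates the general polling pattern described above (HTTP POST to a shelf manager, with readings republished for a CA server); the URL, payload fields and PV-style names are hypothetical and do not reflect the actual ITER FPSC interface.

        # Illustrative sketch of polling a shelf manager over HTTP and republishing
        # readings. URL, payload fields and names are hypothetical placeholders.
        import requests

        SHELF_MANAGER_URL = "http://atca-shm.example.org/cgi-bin/properties"  # hypothetical

        def read_fan_speed(fan_id: int) -> float:
            # The device support in the abstract drives the shelf manager via HTTP POST.
            response = requests.post(
                SHELF_MANAGER_URL,
                data={"command": "get", "property": f"fan{fan_id}.speed"},
                timeout=2.0)
            response.raise_for_status()
            return float(response.text.strip())

        for fan in range(1, 4):
            speed = read_fan_speed(fan)
            print(f"ATCA-SHM:FAN{fan}:SPEED = {speed} rpm")  # a real IOC would publish this as a PV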

  6. Novel diamond-coated tools for dental drilling applications.

    Science.gov (United States)

    Jackson, M J; Sein, H; Ahmed, W; Woodwards, R

    2007-01-01

    The application of diamond coatings on cemented tungsten carbide (WC-Co) tools has been the subject of much attention in recent years in order to improve cutting performance and tool life in orthodontic applications. WC-Co tools containing 6% Co metal and 94% WC substrate with an average grain size of 1–3 µm were used in this study. In order to improve the adhesion between diamond and WC substrates it is necessary to etch cobalt from the surface and prepare it for subsequent diamond growth. Alternatively, a titanium nitride (TiN) interlayer can be used prior to diamond deposition. Hot filament chemical vapour deposition (HFCVD) with a modified vertical filament arrangement has been employed for the deposition of diamond films to TiN and etched WC substrates. Diamond film quality and purity has been characterized using scanning electron microscopy (SEM) and micro Raman spectroscopy. The performances of diamond-coated WC-Co tools, uncoated WC-Co tools, and diamond embedded (sintered) tools have been compared by drilling a series of holes into various materials such as human tooth, borosilicate glass, and acrylic tooth materials. Flank wear has been used to assess the wear rates of the tools when machining biomedical materials such as those described above. It is shown that using an interlayer such as TiN prior to diamond deposition provides the best surface preparation for producing dental tools.

  7. Enlisted Personnel Individualized Career System (EPICS) Test and Evaluation

    Science.gov (United States)

    1984-01-01

    The EPICS program, which was developed using an integrated personnel systems approach (IPSA), delays formal school training until after personnel have...received shipboard on-job training complemented by job performance aids (JPAs). Early phases of the program, which involved developing the IPSA EPICS...detailed description of the conception and development of the EPICS IPSA model, the execution of the front-end job design analyses, JPA and instructional

  8. EPICS data archiver at SSRF beamlines

    International Nuclear Information System (INIS)

    Hu Zheng; Mi Qingru; Zheng Lifang; Li Zhong

    2014-01-01

    The control system of SSRF (Shanghai Synchrotron Radiation Facility) is based on EPICS (Experimental Physics and Industrial Control System). Operation data storage for a synchrotron radiation facility is important for status monitoring and analysis. At SSRF, operation data used to be stored as index files recorded by the traditional EPICS Channel Archiver. Nevertheless, index files are not suitable for long-term maintenance and are difficult to use for data analysis. Now, the RDB Channel Archiver and MySQL are used for SSRF beamline operation data archiving, so as to improve data storage reliability and usability. By applying a new uploading mechanism to the RDB Channel Archiver, its writing performance is improved. A web-based GUI (Graphical User Interface) is also developed to make it easier to access the database. (authors)
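
    A minimal sketch of the relational archiving idea described above (one row per PV sample). SQLite is used here only to keep the example self-contained, whereas the SSRF archiver uses MySQL; the table and channel names are illustrative.

        # One row per archived PV sample; schema and names are illustrative only.
        import sqlite3
        import time

        conn = sqlite3.connect("archive_demo.db")
        conn.execute("""
            CREATE TABLE IF NOT EXISTS sample (
                channel_name TEXT    NOT NULL,
                sample_time  REAL    NOT NULL,   -- seconds since the epoch
                value        REAL,
                severity     INTEGER DEFAULT 0
            )
        """)

        def archive(channel: str, value: float, severity: int = 0) -> None:
            conn.execute("INSERT INTO sample VALUES (?, ?, ?, ?)",
                         (channel, time.time(), value, severity))
            conn.commit()

        archive("BL08U:EM:CURRENT", 0.302)   # hypothetical beamline channel name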

  9. Recent applications of synthetic biology tools for yeast metabolic engineering

    DEFF Research Database (Denmark)

    Jensen, Michael Krogh; Keasling, Jay

    2015-01-01

    to engineer microbial chemical factories has steadily decreased, improvement is still needed. Through the development of synthetic biology tools for key microbial hosts, it should be possible to further decrease the development times and improve the reliability of the resulting microorganism. Together...... with continuous decreases in price and improvements in DNA synthesis, assembly and sequencing, synthetic biology tools will rationalize time-consuming strain engineering, improve control of metabolic fluxes, and diversify screening assays for cellular metabolism. This review outlines some recently developed...... synthetic biology tools and their application to improve production of chemicals and fuels in yeast. Finally, we provide a perspective for the challenges that lie ahead....

  10. Application of Risk Monitor MAREas tool in operation and maintenance

    International Nuclear Information System (INIS)

    Carretero, J. A.; Fuentes, I.

    2004-01-01

    From the very beginning, an ongoing application objective of Probabilistic Safety Assessment (PSA) has been to develop risk monitors. Their development was contingent on the capacity of the computer tools to compute the PSA model. The availability of effective tools, as well as the requirements of the Consejo de Seguridad Nuclear to apply the Maintenance Rule, have driven their implementation in Spain. The MARE application of Empresarios Agrupados presented herein has been developed for that purpose and has been implemented at Almaraz NPP. This article describes the process followed and the experience in using it. (Author)

  11. Developing security tools of WSN and WBAN networks applications

    CERN Document Server

    A M El-Bendary, Mohsen

    2015-01-01

    This book focuses on two of the most rapidly developing areas in wireless technology (WT) applications, namely, wireless sensors networks (WSNs) and wireless body area networks (WBANs). These networks can be considered smart applications of the recent WT revolutions. The book presents various security tools and scenarios for the proposed enhanced-security of WSNs, which are supplemented with numerous computer simulations. In the computer simulation section, WSN modeling is addressed using MATLAB programming language.

  12. Remote controlled tool systems for nuclear sites have subsea applications

    International Nuclear Information System (INIS)

    Bath, B.; Yemington, C.; Kuhta, B.

    1995-10-01

    Remotely operated underwater tool systems designed to operate in Nuclear Fuel Storage Basins can be applied to deep water, subsea oilfield applications. Spent nuclear fuel rods are stored underwater in large indoor swimming pool-like facilities where the water cover shields the workers from the radiation. This paper describes three specialized tooling systems that were designed and built by Sonsub for work at the Department of Energy's Hanford site. The Door Seal Tool removed an existing seal system, cleaned a 20 ft. tall, carbon steel, underwater hatch and installed a new stainless steel gasket surface with underwater epoxy. The Concrete Sampling Tool was built to take core samples from the vertical, concrete walls of the basins. The tool has three hydraulic drills with proprietary hollow core drill bits to cut and retrieve the concrete samples. The Rack Saw remotely attached itself to a structure, cut a variety of steel shapes and pipes, and retained the cut pieces for retrieval. All of these systems are remotely operated with onboard video cameras and debris collection systems. The methods and equipment proven in this application are available to refurbish sealing surfaces and to drill or sample concrete in offshore oil field applications

  13. Knowledge Management Tools in Application to Regulatory Body Activity

    International Nuclear Information System (INIS)

    Volkov, E.

    2016-01-01

    Full text: The paper presents the application of knowledge management tools to regulatory authority activity. Knowledge management tools are considered a means for improving the efficiency of regulator activities. Three case studies are considered: 1. a knowledge management audit procedure in the regulator (tools for knowledge management audit application, results and the audit outcomes); 2. the development of a guide for identifying the causes of discrepancies and shortcomings revealed during inspections of NPP maintenance (development of ontologies of factors influencing maintenance quality and of causes of discrepancies and shortcomings); 3. the development of a knowledge portal for the regulator (regulator needs that could be covered by the portal, definition and basic functions of the portal, its functioning principles, development goals and tasks, common model, development stages). (author)

  14. Applications of computational tools in biosciences and medical engineering

    CERN Document Server

    Altenbach, Holm

    2015-01-01

    This book presents the latest developments and applications of computational tools related to the biosciences and medical engineering. It also reports the findings of different multi-disciplinary research projects, for example, from the areas of scaffolds and synthetic bones, implants and medical devices, and medical materials. It is also shown that the application of computational tools often requires mathematical and experimental methods. Computational tools such as finite element methods, computer-aided design and optimization, as well as visualization techniques such as computed axial tomography, open up completely new research fields that combine engineering with the biological and medical sciences. Nevertheless, there are still hurdles, since both directions are based on quite different ways of education. Often even the “language” can vary from discipline to discipline.

  15. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

  16. Metabolic interrelationships software application: Interactive learning tool for intermediary metabolism

    NARCIS (Netherlands)

    A.J.M. Verhoeven (Adrie); M. Doets (Mathijs); J.M.J. Lamers (Jos); J.F. Koster (Johan)

    2005-01-01

    We developed and implemented the software application titled Metabolic Interrelationships as a self-learning and -teaching tool for intermediary metabolism. It is used by undergraduate medical students in an integrated organ systems-based and disease-oriented core curriculum, which

  17. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughput performance...

  18. EPIC: A Framework for Using Video Games in Ethics Education

    Science.gov (United States)

    Schrier, Karen

    2015-01-01

    Ethics education can potentially be supplemented through the use of video games. This article proposes a novel framework (Ethics Practice and Implementation Categorization [EPIC] Framework), which helps educators choose games to be used for ethics education purposes. The EPIC Framework is derived from a number of classic moral development,…

  19. Design for Change : EPIC pillars for Persuasive Design for Health

    NARCIS (Netherlands)

    Tjin-Kam-Jet-Siemons, Liseth; van Gemert-Pijnen, Julia E.W.C.

    2016-01-01

    What makes technology now truly empathic? How to develop designs that matter? We apply the EPIC for change model for persuasive and empathic designs. EPIC stands for: • Engagement: Creating experience, flow using persuasive strategies and triggers in development, using positive psychology concepts;

  20. EPIC: Helping School Life and Family Support Each Other.

    Science.gov (United States)

    Montgomery, David

    1992-01-01

    Born out of a 1981 murder, Buffalo (New York) Public Schools' EPIC (Effective Parenting Information for Children) program successfully combines parenting, effective teaching, and community programs to help family and school life support each other. Under EPIC, teachers are advised to help students acquire 23 skills involving self-esteem, rules,…

  1. Perfusion Electronic Record Documentation Using Epic Systems Software.

    Science.gov (United States)

    Riley, Jeffrey B; Justison, George A

    2015-12-01

    The authors comment on Steffens and Gunser's article describing the University of Wisconsin adoption of the Epic anesthesia record to include perfusion information from the cardiopulmonary bypass patient experience. We highlight the current-day lessons and the valuable quality and safety principles the Wisconsin-Epic model anesthesia-perfusion record provides.

  2. Jllumina - A comprehensive Java-based API for statistical Illumina Infinium HumanMethylation450 and Infinium MethylationEPIC BeadChip data processing

    Directory of Open Access Journals (Sweden)

    Almeida Diogo

    2016-10-01

    Full Text Available Measuring differential DNA methylation is nowadays the most common approach to linking epigenetic modifications to diseases (called epigenome-wide association studies, EWAS). Owing to their low cost, efficiency and easy handling, the Illumina HumanMethylation450 BeadChip and its successor, the Infinium MethylationEPIC BeadChip, are by far the most popular techniques for conducting EWAS in large patient cohorts. Despite the popularity of this chip technology, raw data processing and statistical analysis of the array data remain far from trivial and still lack dedicated software libraries enabling high-quality and statistically sound downstream analyses. As of yet, only R-based solutions are freely available for low-level processing of the Illumina chip data. However, the lack of alternative libraries poses a hurdle for the development of new bioinformatic tools, in particular when it comes to web services or applications where run time and memory consumption matter, or where EWAS data analysis is an integral part of a bigger framework or data analysis pipeline. We have therefore developed and implemented Jllumina, an open-source Java library for raw data manipulation of Illumina Infinium HumanMethylation450 and Infinium MethylationEPIC BeadChip data, supporting the developer with Java functions covering reading and preprocessing of the raw data, down to statistical assessment, permutation tests, and identification of differentially methylated loci. Jllumina is fully parallelizable and publicly available at http://dimmer.compbio.sdu.dk/download.html
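
    As a generic illustration (in Python, not the Jllumina Java API) of the permutation testing mentioned above, the following compares mean beta values at a single CpG locus between two groups of samples, using synthetic data:

        # Permutation test on synthetic methylation beta values at one CpG locus.
        import numpy as np

        rng = np.random.default_rng(0)
        cases    = rng.beta(2, 5, size=30)   # synthetic beta values, group 1
        controls = rng.beta(2, 6, size=30)   # synthetic beta values, group 2

        observed = abs(cases.mean() - controls.mean())
        pooled = np.concatenate([cases, controls])

        n_perm, hits = 10000, 0
        for _ in range(n_perm):
            rng.shuffle(pooled)
            diff = abs(pooled[:30].mean() - pooled[30:].mean())
            if diff >= observed:
                hits += 1

        print(f"permutation p-value ~ {(hits + 1) / (n_perm + 1):.4f}")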

  3. Study on the utilization of the cognitive architecture EPIC to the task analysis of a nuclear power plant operator

    International Nuclear Information System (INIS)

    Soares, Herculano Vieira

    2003-02-01

    This work presents a study of the use of the integrative cognitive architecture EPIC (Executive-Process Interactive-Control), designed to evaluate the performance of a person performing tasks in parallel at a man-machine interface, as a methodology for cognitive task analysis of a nuclear power plant operator. The results obtained by EPIC simulation are compared with those obtained by applying the MHP model to the tasks performed by a shift operator during the execution of procedure PO-E-3 (Steam Generator Tube Rupture) of the Angra 1 Nuclear Power Plant. To support this comparison, an experiment was performed at the Angra 2 Nuclear Power Plant full-scope simulator in which three operator tasks were executed and their completion times were measured and compared with the results of the MHP and EPIC modeling. (author)

  4. Topography measurements and applications in ballistics and tool mark identifications*

    Science.gov (United States)

    Vorburger, T V; Song, J; Petraco, N

    2016-01-01

    The application of surface topography measurement methods to the field of firearm and toolmark analysis is fairly new. The field has been boosted by the development of a number of competing optical methods, which has improved the speed and accuracy of surface topography acquisitions. We describe here some of these measurement methods as well as several analytical methods for assessing similarities and differences among pairs of surfaces. We also provide a few examples of research results to identify cartridge cases originating from the same firearm or tool marks produced by the same tool. Physical standards and issues of traceability are also discussed. PMID:27182440

  5. Beam simulation tools for GEANT4 (and neutrino source applications)

    International Nuclear Information System (INIS)

    V. Daniel Elvira, Paul Lebrun and Panagiotis Spentzouris (email: daniel@fnal.gov)

    2002-01-01

    Geant4 is a tool kit developed by a collaboration of physicists and computer professionals in the High Energy Physics field for simulation of the passage of particles through matter. The motivation for the development of the Beam Tools is to extend the Geant4 applications to accelerator physics. Although there are many computer programs for beam physics simulations, Geant4 is ideal to model a beam going through material or a system with a beam line integrated to a complex detector. There are many examples in the current international High Energy Physics programs, such as studies related to a future Neutrino Factory, a Linear Collider, and a very Large Hadron Collider

  6. Overview of Automotive Core Tools: Applications and Benefits

    Science.gov (United States)

    Doshi, Jigar A.; Desai, Darshak

    2017-08-01

    Continuous improvement of product and process quality is always a challenging and creative task in today's era of globalization. Various quality tools are available and used for this purpose; some of them are successful and a few of them are not. Considering the complexity of the continuous quality improvement (CQI) process, various new techniques have been introduced by industry as well as proposed by researchers and academia; Lean Manufacturing, Six Sigma and Lean Six Sigma are some of these techniques. In recent years, new tools have been adopted by industry, especially automotive, called Automotive Core Tools (ACT). The intention of this paper is to review the applications and benefits, along with existing research, on Automotive Core Tools with special emphasis on continuous quality improvement. The methodology uses an extensive review of literature through reputed publications (journals, conference proceedings, research theses, etc.). This paper provides an overview of ACT, its enablers and exertions, how it evolved into sophisticated methodologies, and the benefits of its use in organisations. It should be of value to practitioners of Automotive Core Tools and to academics who are interested in how CQI can be achieved using ACT. It needs to be stressed here that this paper is not intended to scorn Automotive Core Tools; rather, its purpose is limited only to providing a balance to the prevailing positive views toward ACT.

  7. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, were presented in the first part of the paper. In this part, they are applied for testing modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving the model behaviour was finally obtained by optimisation techniques. This example of application shows how analysis of the model parameters space is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residuals analysis techniques. (author)
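
    A minimal sketch of the principal components diagnosis step referred to above, applied to synthetic residuals between measured and simulated outputs:

        # PCA of residuals between measured and simulated outputs (synthetic data).
        import numpy as np

        rng = np.random.default_rng(1)
        # residuals: 200 time steps x 4 model outputs (e.g. zone temperatures)
        residuals = rng.normal(size=(200, 4)) @ np.diag([3.0, 1.0, 0.5, 0.1])

        centered = residuals - residuals.mean(axis=0)
        eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))

        # The largest eigenvalues point to the dominant error directions to investigate
        order = np.argsort(eigvals)[::-1]
        explained = eigvals[order] / eigvals.sum()
        print("variance explained by each component:", np.round(explained, 3))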

  8. Application of GIS tools in determining the navigability of waterways

    Science.gov (United States)

    Nadolny, Grzegorz; Rabant, Hubert; Szatten, Dawid

    2017-11-01

    This article presents the results of research conducted on the Lower Noteć river concerning the application of geographic information system (GIS) tools. The study consisted of longitudinal profile soundings of the navigable route combined with statistical analysis of water levels. GIS software (ArcMap v. 10.0) was used to perform the analysis of changes in waterway depth depending on hydrological conditions. The mileage of waterway sections was specified according to whether or not they met classification requirements in accordance with Polish law. The application of the spatial data for the Lower Noteć river developed for the purpose of this article is presented. The conducted analyses and obtained results demonstrate the importance of GIS tools in inland navigation studies.

  9. UTOPIA—User-Friendly Tools for Operating Informatics Applications

    Science.gov (United States)

    Sinnott, J. R.; Attwood, T. K.

    2004-01-01

    Bioinformaticians routinely analyse vast amounts of information held both in large remote databases and in flat data files hosted on local machines. The contemporary toolkit available for this purpose consists of an ad hoc collection of data manipulation tools, scripting languages and visualization systems; these must often be combined in complex and bespoke ways, the result frequently being an unwieldy artefact capable of one specific task, which cannot easily be exploited or extended by other practitioners. Owing to the sizes of current databases and the scale of the analyses necessary, routine bioinformatics tasks are often automated, but many still require the unique experience and intuition of human researchers: this requires tools that support real-time interaction with complex datasets. Many existing tools have poor user interfaces and limited real-time performance when applied to realistically large datasets; much of the user's cognitive capacity is therefore focused on controlling the tool rather than on performing the research. The UTOPIA project is addressing some of these issues by building reusable software components that can be combined to make useful applications in the field of bioinformatics. Expertise in the fields of human computer interaction, high-performance rendering, and distributed systems is being guided by bioinformaticians and end-user biologists to create a toolkit that is both architecturally sound from a computing point of view, and directly addresses end-user and application-developer requirements. PMID:18629035

  10. A MySQL Based EPICS Archiver

    Energy Technology Data Exchange (ETDEWEB)

    Christopher Slominski

    2009-10-01

    Archiving a large fraction of the EPICS signals within the Jefferson Lab (JLAB) Accelerator control system is vital for postmortem and real-time analysis of the accelerator performance. This analysis is performed on a daily basis by scientists, operators, engineers, technicians, and software developers. Archiving poses unique challenges due to the magnitude of the control system. A MySQL Archiving system (Mya) was developed to scale to the needs of the control system; currently archiving 58,000 EPICS variables, updating at a rate of 11,000 events per second. In addition to the large collection rate, retrieval of the archived data must also be fast and robust. Archived data retrieval clients obtain data at a rate over 100,000 data points per second. Managing the data in a relational database provides a number of benefits. This paper describes an archiving solution that uses an open source database and standard off the shelf hardware to reach high performance archiving needs. Mya has been in production at Jefferson Lab since February of 2007.
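
    A back-of-the-envelope estimate of the archive volume implied by the collection rate quoted above; the per-sample row size is an assumption made only for illustration:

        # Rough daily archive volume at the quoted event rate.
        events_per_second = 11_000        # from the abstract
        bytes_per_sample = 30             # assumed: timestamp + value + metadata
        seconds_per_day = 86_400

        daily_bytes = events_per_second * bytes_per_sample * seconds_per_day
        print(f"~{daily_bytes / 1e9:.1f} GB/day before compression")   # about 28.5 GB/day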

  11. A MySQL Based EPICS Archiver

    International Nuclear Information System (INIS)

    Slominski, Christopher

    2009-01-01

    Archiving a large fraction of the EPICS signals within the Jefferson Lab (JLAB) Accelerator control system is vital for postmortem and real-time analysis of the accelerator performance. This analysis is performed on a daily basis by scientists, operators, engineers, technicians, and software developers. Archiving poses unique challenges due to the magnitude of the control system. A MySQL Archiving system (Mya) was developed to scale to the needs of the control system; currently archiving 58,000 EPICS variables, updating at a rate of 11,000 events per second. In addition to the large collection rate, retrieval of the archived data must also be fast and robust. Archived data retrieval clients obtain data at a rate over 100,000 data points per second. Managing the data in a relational database provides a number of benefits. This paper describes an archiving solution that uses an open source database and standard off the shelf hardware to reach high performance archiving needs. Mya has been in production at Jefferson Lab since February of 2007.

  12. Math tools 500+ applications in science and arts

    CERN Document Server

    Glaeser, Georg

    2017-01-01

    In this book, topics such as algebra, trigonometry, calculus and statistics are brought to life through over 500 applications ranging from biology, physics and chemistry to astronomy, geography and music. With over 600 illustrations emphasizing the beauty of mathematics, Math Tools complements more theoretical textbooks on the market, bringing the subject closer to the reader and providing a useful reference to students. By highlighting the ubiquity of mathematics in practical fields, the book will appeal not only to students and teachers, but to anyone with a keen interest in mathematics and its applications.

  13. Field-programmable custom computing technology architectures, tools, and applications

    CERN Document Server

    Luk, Wayne; Pocek, Ken

    2000-01-01

    Field-Programmable Custom Computing Technology: Architectures, Tools, and Applications brings together in one place important contributions and up-to-date research results in this fast-moving area. In seven selected chapters, the book describes the latest advances in architectures, design methods, and applications of field-programmable devices for high-performance reconfigurable systems. The contributors to this work were selected from the leading researchers and practitioners in the field. It will be valuable to anyone working or researching in the field of custom computing technology. It serves as an excellent reference, providing insight into some of the most challenging issues being examined today.

  14. Applications of Spatial Data Using Business Analytics Tools

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2011-12-01

    Full Text Available This paper addresses the possibilities of using spatial data in business analytics tools, with emphasis on SAS software. Various kinds of map data sets containing spatial data are presented and discussed. Examples of map charts illustrating macroeconomic parameters demonstrate the application of spatial data for the creation of map charts in SAS Enterprise Guide. Extended features of map charts are exemplified by producing charts via SAS programming procedures.

  15. Internet tomography an introduction to concepts, techniques, tools and applications

    CERN Document Server

    Moloisane, Abia; O’Droma, Máirtín

    2013-01-01

    Internet tomography, introduced from basic principles through to techniques, tools and applications, is the subject of this book. The design of Internet Tomography Measurement Systems (ITMS) aimed at mapping the Internet performance profile spatially and temporally over paths between probing stations is a particular focus. The Internet Tomography Measurement System design criteria addressed include: minimally invasive, independent and autonomous, active or passive measurement; flexibility and scalability; capability of targeting local, regional and global Internet paths and underlying IP networks;

  16. A Cloud Top Pressure Algorithm for DSCOVR-EPIC

    Science.gov (United States)

    Min, Q.; Morgan, E. C.; Yang, Y.; Marshak, A.; Davis, A. B.

    2017-12-01

    The Earth Polychromatic Imaging Camera (EPIC) sensor on the Deep Space Climate Observatory (DSCOVR) satellite presents unique opportunities to derive cloud properties of the entire daytime Earth. In particular, the Oxygen A- and B-band and corresponding reference channels provide cloud top pressure information. In order to address the in-cloud penetration depth issue—and ensuing retrieval bias—a comprehensive sensitivity study has been conducted to simulate satellite-observed radiances for a wide variety of cloud structures and optical properties. Based on this sensitivity study, a cloud top pressure algorithm for DSCOVR-EPIC has been developed. Further, the algorithm has been applied to EPIC measurements.

  17. EPIC229426032 b and EPIC246067459 b: discovery and characterization of two new transiting hot Jupiters from K2

    Science.gov (United States)

    Soto, M. G.; Díaz, M. R.; Jenkins, J. S.; Rojas, F.; Espinoza, N.; Brahm, R.; Drass, H.; Jones, M. I.; Rabus, M.; Hartman, J.; Sarkis, P.; Jordán, A.; Lachaume, R.; Pantoja, B.; Vučković, M.; Ciardi, D. R.; Crossfield, I.; Dressing, C.; Gonzales, E.; Hirsch, L.

    2018-05-01

    We report the discovery of two hot Jupiters orbiting the stars EPIC229426032 and EPIC246067459. We used photometric data from Campaigns 11 and 12 of the Kepler K2 Mission and radial velocity data obtained using the HARPS, FEROS, and CORALIE spectrographs. EPIC229426032 b and EPIC246067459 b have masses of 1.60 (+0.11/-0.11) and 0.86 (+0.13/-0.12) M_Jup, radii of 1.65 (+0.07/-0.08) and 1.30 (+0.15/-0.14) R_Jup, and are orbiting their host stars in 2.18- and 3.20-day orbits, respectively. The large radius of EPIC229426032 b leads us to conclude that this candidate corresponds to a highly inflated hot Jupiter. EPIC246067459 b has a radius consistent with theoretical models, considering the high incident flux falling on the planet. We consider EPIC229426032 b to be an excellent system for follow-up studies, since not only is it very inflated, but it also orbits a relatively bright star (V = 11.6).

  18. Soft real-time EPICS extensions for fast control: A case study applied to a TCV equilibrium algorithm

    International Nuclear Information System (INIS)

    Castro, R.; Romero, J.A.; Vega, J.; Nieto, J.; Ruiz, M.; Sanz, D.; Barrera, E.; De Arcas, G.

    2014-01-01

    Highlights: • Implementation of a soft real-time control system based on EPICS technology. • High data throughput control system implementation. • GPU technology applied to fast control. • EPICS-based fast control solution. • Fast control and data acquisition in Linux. - Abstract: For new control system developments, ITER distributes the CODAC Core System, a software package based on Linux Red Hat that includes EPICS (Experimental Physics and Industrial Control System) as the software control system solution. EPICS technology is being widely used for implementing control systems in research experiments and is a very well tested technology, but it presents important gaps when it comes to meeting fast control requirements. To manage and process massive amounts of acquired data, EPICS requires additional functions such as data-block-oriented transmissions, links with speed-optimized data buffers, and synchronization mechanisms not based on system interrupts. This EPICS limitation became clear during the development of the Fast Plant System Controller prototype for ITER based on the PXIe platform. In this work, we present a solution that, on the one hand, is completely compatible with and based on EPICS technology, and on the other hand, extends EPICS for implementing high-performance fast control systems with soft real-time characteristics. This development includes components such as data acquisition, processing, monitoring, data archiving, and data streaming (via network and shared memory). Additionally, it is important to remark that this system is compatible with multiple Graphics Processing Units (GPUs) and is able to integrate MATLAB code through MATLAB engine connections. It preserves EPICS modularity, enabling system modification or extension with a simple change of configuration, and finally it enables parallelization based on data distribution to different processing components. With the objective of illustrating the presented solution in an actual
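
    As a conceptual sketch (not the actual EPICS extension described above), the following shows block-oriented data exchange through a named shared-memory buffer, in contrast to per-sample PV updates; the block size and segment name are arbitrary:

        # Block-oriented data exchange via a named shared-memory segment.
        import numpy as np
        from multiprocessing import shared_memory

        BLOCK_SAMPLES = 4096  # assumed block size, for illustration only

        # Producer side: acquire a block and publish it in one shot
        shm = shared_memory.SharedMemory(create=True, size=BLOCK_SAMPLES * 8, name="fast_block")
        block = np.ndarray((BLOCK_SAMPLES,), dtype=np.float64, buffer=shm.buf)
        block[:] = np.random.default_rng().normal(size=BLOCK_SAMPLES)  # stand-in for ADC data

        # Consumer side (normally another process): attach and read the whole block
        reader = shared_memory.SharedMemory(name="fast_block")
        readback = np.ndarray((BLOCK_SAMPLES,), dtype=np.float64, buffer=reader.buf)
        print("block mean:", readback.mean())

        reader.close()
        shm.close()
        shm.unlink()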

  19. Development of Multi-Sensor Global Cloud and Radiance Composites for DSCOVR EPIC Imager with Subpixel Definition

    Science.gov (United States)

    Khlopenkov, K. V.; Duda, D. P.; Thieman, M. M.; Sun-Mack, S.; Su, W.; Minnis, P.; Bedka, K. M.

    2017-12-01

    retrieval). Overall, the composite product has been generated for every EPIC observation from June 2015 to December 2016, typically 300-500 composites per month, which makes it useful for many climate applications.

  20. Nuclear Tools For Oilfield Logging-While-Drilling Applications

    International Nuclear Information System (INIS)

    Reijonen, Jani

    2011-01-01

    Schlumberger is an international oilfield service company with nearly 80,000 employees of 140 nationalities, operating globally in 80 countries. As a market leader in oilfield services, Schlumberger has developed a suite of technologies to assess the downhole environment, including, among others, electromagnetic, seismic, chemical, and nuclear measurements. In the past 10 years there has been a radical shift in the oilfield service industry from traditional wireline measurements to logging-while-drilling (LWD) analysis. For LWD measurements, the analysis is performed and the instruments are operated while the borehole is being drilled. The high temperature, high shock, and extreme vibration environment of LWD imposes stringent requirements for the devices used in these applications. This has a significant impact on the design of the components and subcomponents of a downhole tool. Another significant change in the past few years for nuclear-based oilwell logging tools is the desire to replace the sealed radioisotope sources with active, electronic ones. These active radiation sources provide great benefits compared to the isotopic sources, ranging from handling and safety to nonproliferation and well contamination issues. The challenge is to develop electronic generators that have a high degree of reliability for the entire lifetime of a downhole tool. LWD tool testing and operations are highlighted with particular emphasis on electronic radiation sources and nuclear detectors for the downhole environment.

  1. DAMT - DISTRIBUTED APPLICATION MONITOR TOOL (HP9000 VERSION)

    Science.gov (United States)

    Keith, B.

    1994-01-01

    Typical network monitors measure status of host computers and data traffic among hosts. A monitor to collect statistics about individual processes must be unobtrusive and possess the ability to locate and monitor processes, locate and monitor circuits between processes, and report traffic back to the user through a single application program interface (API). DAMT, Distributed Application Monitor Tool, is a distributed application program that will collect network statistics and make them available to the user. This distributed application has one component (i.e., process) on each host the user wishes to monitor as well as a set of components at a centralized location. DAMT provides the first known implementation of a network monitor at the application layer of abstraction. Potential users only need to know the process names of the distributed application they wish to monitor. The tool locates the processes and the circuit between them, and reports any traffic between them at a user-defined rate. The tool operates without the cooperation of the processes it monitors. Application processes require no changes to be monitored by this tool. Neither does DAMT require the UNIX kernel to be recompiled. The tool obtains process and circuit information by accessing the operating system's existing process database. This database contains all information available about currently executing processes. Expanding the information monitored by the tool can be done by utilizing more information from the process database. Traffic on a circuit between processes is monitored by a low-level LAN analyzer that has access to the raw network data. The tool also provides features such as dynamic event reporting and virtual path routing. A reusable object approach was used in the design of DAMT. The tool has four main components; the Virtual Path Switcher, the Central Monitor Complex, the Remote Monitor, and the LAN Analyzer. All of DAMT's components are independent, asynchronously executing
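
    A present-day analogue (not the original HP9000 implementation) of the "locate the processes, then their circuits" idea, sketched with the psutil library; the process name used at the bottom is simply an example the monitor's user would supply:

        # Locate processes by name and list their network connections (psutil).
        import psutil

        def find_processes(name_fragment: str):
            """Return processes whose name contains the given fragment."""
            return [p for p in psutil.process_iter(["pid", "name"])
                    if name_fragment in (p.info["name"] or "")]

        def report_circuits(name_fragment: str) -> None:
            for proc in find_processes(name_fragment):
                try:
                    conns = proc.connections(kind="inet")
                except (psutil.AccessDenied, psutil.NoSuchProcess):
                    continue
                for c in conns:
                    print(f"pid {proc.pid} ({proc.info['name']}): {c.laddr} -> {c.raddr}")

        report_circuits("python")   # example: report traffic endpoints for "python" processes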

  2. The Safety Assessment Framework Tool (SAFRAN) - Description, Overview and Applicability

    International Nuclear Information System (INIS)

    Alujevic, Luka

    2014-01-01

    The SAFRAN tool (Safety Assessment Framework) is a user-friendly software application that incorporates the methodologies developed in the SADRWMS (Safety Assessment Driven Radioactive Waste Management Solutions) project. The International Atomic Energy Agency (IAEA) organized the International Project on Safety Assessment Driving Radioactive Waste Management Solutions (SADRWMS) to examine international approaches to safety assessment for predisposal management of all types of radioactive waste, including disused sources, small volumes, legacy and decommissioning waste, operational waste, and large volume naturally occurring radioactive material residues. SAFRAN provides aid in: Describing the predisposal RW management activities in a systematic way, Conducting the SA (safety assessment) with clear documentation of the methodology, assumptions, input data and models, Establishing a traceable and transparent record of the safety basis for decisions on the proposed RW management solutions, Demonstrating clear consideration of and compliance with national and international safety standards and recommendations. The SAFRAN tool allows the user to visibly, systematically and logically address predisposal radioactive waste management and decommissioning challenges in a structured way. It also records the decisions taken in such a way that it constitutes a justifiable safety assessment of the proposed management solutions. The objective of this paper is to describe the SAFRAN architecture and features, properly define the terms safety case and safety assessment, and to predict the future development of the SAFRAN tool and assess its applicability to the construction of a future LILW (Low and Intermediate Level Waste) storage facility and repository in Croatia, taking into account all the capabilities and modelling features of the SAFRAN tool. (author)

  3. Radiation monitoring system based on EPICS

    International Nuclear Information System (INIS)

    Wang Weizhen; Li Jianmin; Wang Xiaobing; Hua Zhengdong; Xu Xunjiang

    2008-01-01

    Shanghai Synchrotron Radiation Facility (SSRF for short) is a third-generation light source being built in China, including a 150 MeV injector, 3.5 GeV booster, 3.5 GeV storage ring and a number of beamline stations. During operation, a large amount of synchrotron radiation will be produced by electrons in the booster and the storage ring. Bremsstrahlung and neutrons will also be produced as a result of the interaction between the electrons, especially lost beam, and the wall of the vacuum beam pipe. The SSRF Radiation Monitoring System is established to monitor the radiation dose in working areas and the environment while SSRF is operating. The system consists of detectors, intelligent data-collecting modules, a monitoring computer, and a managing computer. The software system is developed based on EPICS (Experimental Physics and Industrial Control System), implementing the collection and monitoring of the data output from the intelligent modules, data analysis, and so on. (authors)

  4. Mapping Romanzo Criminale. An Epic Narrative Ecosystem?

    Directory of Open Access Journals (Sweden)

    Marta Boni

    2015-05-01

    Full Text Available Romanzo Criminale is one of the few recent Italian media products that has emerged as a societal phenomenon and as a vehicle for the exportation of a national culture. It is a complex narrative which extends in time and space through its various adaptations and intermedial crossovers. Following the path of complexity, and drawing on Edgar Morin's work, Romanzo Criminale will be thought of as a complex system. As previous studies on the intertwining of official and grassroots discourses show, Romanzo Criminale becomes a complex world, with its own boundaries and internal organization. This paper will show that Romanzo Criminale can be studied as a semiosphere (Lotman 2005), or a semiotic space which is defined by, and encourages, the intertwining of texts and audience appropriations, creating an epic process. Some methodological perspectives used for mapping this phenomenon will be discussed, namely Franco Moretti's distant reading.

  5. NEW EPICS/RTEMS IOC BASED ON ALTERA SOC AT JEFFERSON LAB

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Jianxun [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Seaton, Chad [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Allison, Trent L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Bevins, Brian S. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Cuffe, Anthony W. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2018-02-01

    A new EPICS/RTEMS IOC based on the Altera System-on-Chip (SoC) FPGA is being designed at Jefferson Lab. The Altera SoC FPGA integrates a dual ARM Cortex-A9 Hard Processor System (HPS) consisting of processor, peripherals and memory interfaces tied seamlessly with the FPGA fabric using a high-bandwidth interconnect backbone. The embedded Altera SoC IOC has features of remote network boot via U-Boot from SD card or QSPI flash, 1 Gb Ethernet, 1 GB DDR3 SDRAM on the HPS, UART serial ports, and an ISA bus interface. The RTEMS BSP for the ARM processor was built with the CEXP shell, which dynamically loads EPICS applications at runtime. U-Boot is the primary bootloader; it remotely loads the kernel image into local memory from a DHCP/TFTP server over Ethernet and automatically runs RTEMS and EPICS. The first design of the SoC IOC will be compatible with Jefferson Lab's current PC104 IOCs, which have been running in CEBAF for 10 years. The next design would be mounted in a chassis and connected to a daughter card via standard HSMC connectors. This standard SoC IOC will become the next generation of low-level IOC for the accelerator controls at Jefferson Lab.

  6. Operational experience from a large EPICS-based accelerator facility

    International Nuclear Information System (INIS)

    Ciarlette, D.J.; Gerig, R.

    1995-01-01

    The Advanced Photon Source (APS) at Argonne National Laboratory is a third-generation x-ray light source which uses the Experimental Physics and Industrial Control System (EPICS) to operate its linear accelerator, positron accumulator ring, booster synchrotron, and storage ring equipment. EPICS has been used at the APS since the beginning of installation and commissioning. Currently, EPICS controls approximately 100 VME crates containing over 100,000 process variables. With this complexity, the APS has had to review some of the methods originally employed and make changes as necessary. In addition, due to commissioning and operational needs, higher-level operator software needed to be created. EPICS has been flexible enough to allow this

  7. Practical applications of surface analytic tools in tribology

    Science.gov (United States)

    Ferrante, J.

    1980-01-01

    Many of the currently widely used tools available for surface analysis are described. Those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology, and are truly surface sensitive (that is, less than 10 atomic layers), are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  8. Sustainability rating tools for buildings and its wider application

    Directory of Open Access Journals (Sweden)

    Siew Renard

    2017-01-01

    Full Text Available This paper provides a commentary on the latest research in measuring the sustainability of buildings and its wider application. The emergence of sustainability rating tools (SRTs) has faced critique from scholars due to deficiencies such as the overemphasis on environmental criteria, the neglect of uncertainty in scoring, and the existence of non-scientific criteria benchmarks, among many others. This may have contributed to the mixed evidence in the literature on the benefits of SRTs. A future research direction is proposed to advance the state of the art in this field.

  9. Ergodic optimization in the expanding case concepts, tools and applications

    CERN Document Server

    Garibaldi, Eduardo

    2017-01-01

    This book focuses on the interpretation of ergodic optimal problems as questions of variational dynamics, employing a comparable approach to that of the Aubry-Mather theory for Lagrangian systems. Ergodic optimization is primarily concerned with the study of optimizing probability measures. This work presents and discusses the fundamental concepts of the theory, including the use and relevance of Sub-actions as analogues to subsolutions of the Hamilton-Jacobi equation. Further, it provides evidence for the impressively broad applicability of the tools inspired by the weak KAM theory.

  10. iPhone Open Application Development Write Native Applications Using the Open Source Tool Chain

    CERN Document Server

    Zdziarski, Jonathan

    2008-01-01

    Developers everywhere are eager to create applications for the iPhone, and many of them prefer the open source, community-developed tool chain to Apple's own toolkit. This new edition of iPhone Open Application Development covers the latest version of the open toolkit -- now updated for Apple's iPhone 2.x software and iPhone 3G -- and explains in clear language how to create applications using Objective-C and the iPhone API.

  11. The rebirth of the epic from the Nietzsche's Philosophy

    Directory of Open Access Journals (Sweden)

    Reza Samim

    2016-12-01

    Full Text Available Some Iranian commentators have identified his philosophy with Iranian mysticism. Such identification is fundamentally flawed and contradicts Nietzsche's ontological principles and moral values. Some of the Iranian commentators, expert in Nietzsche's philosophy, have recognized that Nietzsche's thought is pregnant with epic universal values, not mystical patterns. Understanding Nietzsche's philosophy is possible with the help of the Shahnameh and the Iliad, not mysticism. The reason for this fundamental error lies in the fact that these Iranian commentators fail to distinguish the subtle differences between mysticism and epic, and this failure has led to their mixing of Nietzsche's thought with Iranian mysticism. Epic and mysticism are related through their differences, not their similarities. Although there could be some similarities between the mystical worldview and that of epic, they are merely outward and superficial. In effect, in terms of epistemic, moral and ontological principles, epic contradicts mysticism. At best, mysticism can be considered the negative correspondence of epic and called “Negative Epic”. Nietzsche's thought has been affected by Greek culture to a greater extent than, and prior to, the Iranian traditions. Nietzsche's symbolic recourse to Zoroaster cannot be a cogent basis for these commentators' claim. Moreover, Nietzsche's grasp of the Zoroastrian worldview is blurred and incomplete. He appreciates the Greek culture, not the Iranian traditions. Therefore, autonomy, voluntarism, appreciation of life and denunciation of passivity are the set of values and principles associating Nietzsche's philosophy with epic. These are exactly the principles disregarded and even denied in mystical thought. In other words, Nietzsche's philosophy can be considered the rebirth of the epic in the sphere of philosophical thought.

  12. Butterflies and Dragon-Eagles: Processing Epics from Southwest China

    Directory of Open Access Journals (Sweden)

    Mark Bender

    2012-03-01

    Full Text Available In the mountains of southwest China, epic narratives are part of the traditional performance-scapes of many ethnic minority cultures. In some cases locals participate in the preservation of oral or oral-connected epics from their respective areas. This article discusses the dynamics of acquiring and translating texts from two major ethnic minority groups in cooperation with local tradition-bearers, poets, and scholars.

  13. EPICS - MDSplus integration in the ITER Neutral Beam Test Facility

    International Nuclear Information System (INIS)

    Luchetta, Adriano; Manduchi, Gabriele; Barbalace, Antonio; Soppelsa, Anton; Taliercio, Cesare

    2011-01-01

    SPIDER, the ITER-size ion-source test bed in the ITER Neutral Beam Test Facility, is a fusion device requiring a complex central system, referred to as CODAS, to provide control and data acquisition. The CODAS software architecture will rely on EPICS and MDSplus, two open-source, collaborative software frameworks targeted at control and data acquisition, respectively. EPICS has been selected as the ITER CODAC middleware and, as the final deliverable of the Neutral Beam Test Facility is the procurement of the ITER Heating Neutral Beam Injector, we decided to adopt this ITER technology. MDSplus is a software package for data management supporting advanced concepts such as platform and underlying-hardware independence, self-describing data, and a data-driven model. The combined use of EPICS and MDSplus is not new in fusion, but their level of integration will be new in SPIDER, achieved by a more refined data access layer. The paper presents the integration software needed to use EPICS and MDSplus effectively, including the definition of appropriate EPICS records to interact with MDSplus. The MDSplus and EPICS archive concepts are also compared on the basis of performance tests, and data streaming is investigated by ad hoc measurements.
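
    The record above describes bridging EPICS (control) with MDSplus (data management). As an illustration only, the sketch below copies an EPICS process variable into an MDSplus pulse file using the pyepics and MDSplus Python bindings; the PV name, tree name and node path are hypothetical, and this is not the SPIDER/CODAS integration layer itself.

    ```python
    import time
    from epics import PV          # EPICS Channel Access client (pyepics)
    from MDSplus import Tree      # MDSplus data-management bindings

    SHOT = 1                                  # hypothetical pulse number
    pv = PV("SPIDER:ISRC:CURRENT")            # hypothetical EPICS process variable

    def archive(pvname=None, value=None, timestamp=None, **kw):
        """Called by Channel Access on every PV update; stores the sample."""
        tree = Tree("spider", SHOT)           # open the (hypothetical) pulse file
        node = tree.getNode("ISRC_CURRENT")   # hypothetical node name
        node.putData(value)
        print(f"{pvname} = {value} archived (t = {timestamp})")

    pv.add_callback(archive)   # asynchronous monitor: no polling loop required
    time.sleep(30)             # keep the process alive while updates arrive
    ```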

  14. BASINs and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (Final Report)

    Science.gov (United States)

    EPA announced the release of the final report, BASINs and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications. This report supports application of two recently developed water modeling tools, the Better Assessment Science Integrating point & ...

  15. Remote tooling applications at the FFTF/IEM cell

    International Nuclear Information System (INIS)

    Webb, R.H.

    1990-01-01

    At the fast flux test facility, a US Government-owned 400-MW(thermal) sodium-cooled fast reactor, the interim examination and maintenance (IEM) cell is used for the remote disassembly of irradiated fuel and material experiments and remote maintenance operations. The IEM cell has been a challenging area both for maintenance and operation of remote equipment. Innovative tooling has been required to provide the reliability, strength and dexterity for performing myriad bolting, cutting, gripping, and other such functions. Over the years, a set of basic components that can be modified and adapted to several applications has been developed. These include torque multipliers, torque limiters, right-angle drives, and many common hand tools with fittings designed to be handled by master-slave manipulators (MSM) or electromechanical manipulator (EMM) hands. An example of such a system is the closed loop ex-vessel machine (CLEM) grapple change tool, which is designed for both hands-on use in a glove box and remote use in the IEM cell

  16. Materials modelling - a possible design tool for advanced nuclear applications

    International Nuclear Information System (INIS)

    Hoffelner, W.; Samaras, M.; Bako, B.; Iglesias, R.

    2008-01-01

    The design of components for power plants is usually based on codes, standards and design rules or code cases. However, it is very difficult to obtain the experimental data needed to prove these lifetime assessment procedures for long-term applications in environments where complex damage interactions (temperature, stress, environment, irradiation) can occur. The rules used are often very simple and do not have a basis that takes physical damage into consideration. The linear life-fraction rule for creep and fatigue interaction can be taken as a prominent example. Materials modelling based on a multi-scale approach in principle provides a tool to convert microstructural findings into mechanical response and therefore has the capability of providing a set of tools for the improvement of design life assessments. The strength of current multi-scale modelling efforts is the insight they offer into experimental phenomena. To obtain an understanding of these phenomena it is important to focus on issues which are relevant at the various time and length scales of the modelling code. In this presentation the multi-scale path is demonstrated with a few recent examples which focus on VHTR applications. (authors)

  17. Data Integration Tool: From Permafrost Data Translation Research Tool to A Robust Research Application

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.

    2016-12-01

    The United States National Science Foundation funded PermaData project, led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P), aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage these data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps, or operations, called widgets. Each widget performs a specific operation, such as read, multiply by a constant, sort, plot, or write data. DIT allows the user to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative data-manipulation and quality-control process: visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led first to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open-source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open-source tool on GitHub, packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets
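
    As a minimal sketch of the widget-pipeline idea described above (hypothetical names, not the actual DIT code), each widget wraps a single operation and a workflow is simply an ordered list of widgets applied in sequence:

    ```python
    from typing import Callable, List

    class Widget:
        """One named operation applied to a list of values."""
        def __init__(self, name: str, func: Callable[[list], list]):
            self.name = name
            self.func = func

        def run(self, data: list) -> list:
            print(f"running widget: {self.name}")
            return self.func(data)

    class Workflow:
        """An ordered chain of widgets chosen by the user."""
        def __init__(self, widgets: List[Widget]):
            self.widgets = widgets

        def run(self, data: list) -> list:
            for widget in self.widgets:   # apply each operation in sequence
                data = widget.run(data)
            return data

    # Example chain mirroring the kinds of steps listed in the abstract
    # (read-like input, clean, multiply by a constant, sort).
    pipeline = Workflow([
        Widget("drop_missing", lambda d: [x for x in d if x is not None]),
        Widget("scale", lambda d: [x * 0.1 for x in d]),
        Widget("sort", sorted),
    ])
    print(pipeline.run([5, None, 3, 1]))   # -> [0.1, 0.3, 0.5]
    ```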

  18. On the Issue of Origin of the Yakut Epic Olonkho

    Directory of Open Access Journals (Sweden)

    Vasiliy Nikolayevich Ivanov

    2018-03-01

    Full Text Available The issue of the origin of the Yakut heroic epic Olonkho was addressed in works on the history and ethnography of the Yakuts as far back as the 19th century, for instance in the famous monograph Yakuts: An Experience of Ethnographic Research by the Polish exile V.L. Seroshevskiy (1896). Since that time the issue has interested many, but no dedicated monographic study has yet been produced. Currently, the question of Olonkho's origin is gaining special scientific and general cultural significance, as on November 25, 2005 the Yakut heroic epic Olonkho was granted the high status of “Masterpiece of the Oral and Intangible Heritage of Humanity” by the historic decision of UNESCO. The Yakut epic is part of the multicomponent epic creativity of the Turkic nations, but it was the only one to receive such high international recognition. This paper aims to revive scholarly interest in the genesis of the Yakut epic. To date, rich source-related and historiographical material has been accumulated for broader generalizations; the main point is that the Yakut epic is becoming an important object of comparative historical analysis of the origin of all Turkic epics. Epic researchers acknowledge that almost a thousand years of existence, isolated from the rest of the Turkic world in North-East Asia, preserved many archaic features of the epics of the ancient ancestors, natives of Central Asia and Southern Siberia. It became clear that the origin of Olonkho is organically linked with the ethnic history of its nation. The paper follows this comprehensive process as reflected in works by archaeologists, ethnographers, historians and linguists. Their latest achievements are impressive, bringing much that is new to the conventional views of the origins and development of the Yakut epic. The paper attempts to specify that novelty and to argue that the time has come to introduce it into scholarship to solve the long-standing issue of origin of

  19. EPIC Calibration/Validation Experiment Field Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Steven E [National Severe Storm Laboratory/NOAA; Chilson, Phillip [University of Oklahoma; Argrow, Brian [University of Colorado

    2017-03-15

    A field exercise involving several different kinds of Unmanned Aerial Systems (UAS) and supporting instrumentation systems provided by DOE/ARM and NOAA/NSSL was conducted at the ARM SGP site in Lamont, Oklahoma on 29-30 October 2016. This campaign was part of a larger National Oceanic and Atmospheric Administration (NOAA) UAS Program Office program awarded to the National Severe Storms Laboratory (NSSL), named Environmental Profiling and Initiation of Convection (EPIC). The EPIC Field Campaign (Test and Calibration/Validation) proposed to ARM was a test or “dry-run” for a follow-up campaign to be requested for spring/summer 2017. The EPIC project addresses NOAA’s objective to “evaluate options for UAS profiling of the lower atmosphere with applications for severe weather.” The project goal is to demonstrate that fixed-wing and rotary-wing small UAS have the combined potential to provide a unique observing system capable of providing detailed profiles of temperature, moisture, and winds within the atmospheric boundary layer (ABL) to help determine the potential for severe weather development. Specific project objectives are: 1) to develop small UAS capable of acquiring the needed wind and thermodynamic profiles and transects of the ABL using one fixed-wing UAS operating in tandem with two different rotary-wing UAS pairs; 2) to adapt and test miniaturized, high-precision, fast-response atmospheric sensors with high accuracy in the strong winds characteristic of the pre-convective ABL in Oklahoma; 3) to conduct targeted short-duration experiments at the ARM Southern Great Plains site in northern Oklahoma concurrently with a second site chosen in real time from the Oklahoma Mesonet in coordination with the National Weather Service (NWS) Norman Forecast Office; and 4) to gain valuable experience in pursuit of NOAA’s goals for determining the value of airborne, mobile observing systems for monitoring rapidly evolving high-impact severe weather

  20. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  1. Development and Application of Camelid Molecular Cytogenetic Tools

    Science.gov (United States)

    Avila, Felipe; Das, Pranab J.; Kutzler, Michelle; Owens, Elaine; Perelman, Polina; Rubes, Jiri; Hornak, Miroslav; Johnson, Warren E.

    2014-01-01

    Cytogenetic chromosome maps offer molecular tools for genome analysis and clinical cytogenetics and are of particular importance for species with difficult karyotypes, such as camelids (2n = 74). Building on the available human–camel zoo-fluorescence in situ hybridization (FISH) data, we developed the first cytogenetic map for the alpaca (Lama pacos, LPA) genome by isolating and identifying 151 alpaca bacterial artificial chromosome (BAC) clones corresponding to 44 specific genes. The genes were mapped by FISH to 31 alpaca autosomes and the sex chromosomes; 11 chromosomes had 2 markers, which were ordered by dual-color FISH. The STS gene mapped to Xpter/Ypter, demarcating the pseudoautosomal region, whereas no markers were assigned to chromosomes 14, 21, 22, 28, and 36. The chromosome-specific markers were applied in clinical cytogenetics to identify LPA20, the major histocompatibility complex (MHC)-carrying chromosome, as a part of an autosomal translocation in a sterile male llama (Lama glama, LGL; 2n = 73,XY). FISH with LPAX BACs and LPA36 paints, as well as comparative genomic hybridization, were also used to investigate the origin of the minute chromosome, an abnormally small LPA36 in infertile female alpacas. This collection of cytogenetically mapped markers represents a new tool for camelid clinical cytogenetics and has applications for the improvement of the alpaca genome map and sequence assembly. PMID:23109720

  2. FOSS Tools and Applications for Education in Geospatial Sciences

    Directory of Open Access Journals (Sweden)

    Marco Ciolli

    2017-07-01

    Full Text Available While the theory and implementation of geographic information systems (GIS) have a history of more than 50 years, the development of dedicated educational tools and applications in this field is more recent. This paper presents a free and open source software (FOSS) approach to education in the geospatial disciplines, which has been used over the last 20 years at two Italian universities. The motivations behind the choice of FOSS are discussed with respect to software availability and development, as well as educational material licensing. Following this philosophy, a wide range of educational tools has been developed, covering topics from numerical cartography and GIS principles to the specifics of different systems for the management and analysis of spatial data. Various courses have been implemented for diverse recipients, ranging from professional training workshops to PhD courses. Feedback from the students of those courses provides an invaluable assessment of the effectiveness of the approach, supplying at the same time directions for further improvement. Finally, lessons learned after 20 years are discussed, highlighting how the management of educational materials can be difficult even with a very open approach to licensing. Overall, the use of free and open source software for geospatial (FOSS4G) science provides a clear advantage over other approaches, not only simplifying software and data management, but also ensuring that all of the information related to system design and implementation is available.

  3. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  4. Applications of rotary jetting tool with coiled tubing offshore Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, Ricardo; Almeida, Victor; Mendez, Alfredo; Dean, Greg [BJ Services do Brasil Ltda., RJ (Brazil)

    2004-07-01

    It is well known that offshore operators are continuously looking for alternatives to reduce rig time, especially for workover operations, because of their high cost. The introduction of a Rotary Jetting Tool (RJT) in conjunction with coiled tubing was successfully tested and proved to be a better alternative, not only because of its efficiency but also due to a reduction in the duration of intervention operations. The RJT was created to remove scales and well obstructions by means of stress-cycling jetting. Stress cycling is a jetting mechanism that consists of pressurizing and energizing fluid against a material. This mechanism breaks scales or obstructions and vibrates proppants in gravel pack completions. The RJT is composed of turbines that generate spinning and magnets that control the rotation. Most fluids used in the oil industry for remedial operations are compatible with this tool, hence its wide range of applications. This paper presents case histories ranging from hydrate and scale removal to matrix stimulation, including cleaning of gravel pack completions. The usage of this RJT has demonstrated effectiveness as a new alternative to improve well production and reduce rig time when compared to other methods commonly used in the area. (author)

  5. LabVIEW Library to EPICS Channel Access

    CERN Document Server

    Liyu, Andrei; Thompson, Dave H

    2005-01-01

    The Spallation Neutron Source (SNS) accelerator systems will deliver a 1.0 GeV, 1.4 MW proton beam to a liquid mercury target for neutron scattering research. The accelerator complex consists of a 1 GeV linear accelerator, an accumulator ring and associated transport lines. The SNS diagnostics platform is PC-based and will run Windows as its OS and LabVIEW as its programming language. Data acquisition hardware will be based on PCI cards. There will be about 300 rack-mounted computers. The Channel Access (CA) protocol of the Experimental Physics and Industrial Control System (EPICS) is the SNS control system communication standard. This paper describes the approach, implementation, and features of a LabVIEW library for CA on Windows, Linux, and Mac OS X. We also discuss how the library implements the asynchronous CA monitor routine using LabVIEW's occurrence mechanism instead of a callback function (which is not available in LabVIEW). The library is used to acquire accelerator data and applications have been ...
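
    For readers unfamiliar with CA monitors, the sketch below shows the callback-based asynchronous monitor pattern in Python (pyepics); this is the mechanism that the LabVIEW library reproduces with occurrences, since callbacks are not available in LabVIEW. The PV name is hypothetical.

    ```python
    import time
    from epics import PV

    def on_update(pvname=None, value=None, timestamp=None, **kw):
        # Channel Access invokes this function from its own thread on each update.
        print(f"{pvname} -> {value} (t = {timestamp})")

    # Hypothetical beam-position-monitor PV; subscribe as soon as it connects.
    bpm = PV("SNS_Diag:BPM01:xAvg", callback=on_update)

    time.sleep(10)          # keep the process alive while monitor events arrive
    bpm.clear_callbacks()   # unsubscribe before exiting
    ```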

  6. SNS online display technologies for EPICS

    International Nuclear Information System (INIS)

    Kasemir, K.U.; Chen, X.; Purcell, J.; Danilova, E.

    2012-01-01

    The ubiquity of web clients, from personal computers to cell phones, results in a growing demand for web-based access to control system data. At the Oak Ridge National Laboratory Spallation Neutron Source (SNS) we have investigated different technical approaches to provide read access to data in the Experimental Physics and Industrial Control System (EPICS) for a wide variety of web client devices. The core web technology, HTTP, is less than ideal for online control system displays. Appropriate use of Ajax, especially the Long Poll paradigm, can alleviate fundamental HTTP limitations. The SNS Status web uses basic Ajax technology to generate generic displays for a wide audience. The Dashboard uses Long Poll and more client-side JavaScript to offer more customization and faster updates for users who need specialized displays. The Web OPI uses RAP to provide web access to any BOY display, offering the utmost flexibility because users can create their own BOY displays in CSS. These three approaches complement each other. Users can access generic status displays with zero effort, invest time in creating fully customized displays for the Web OPI, or use the Dashboard as an intermediate solution
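
    The Long Poll paradigm mentioned above can be illustrated from the client side with a short sketch; the endpoint and its query parameters are hypothetical, and the real SNS services differ in detail. The idea is that each HTTP request is held open by the server until new data exists (or a timeout expires), so the client sees near-immediate updates over plain HTTP.

    ```python
    import time
    import requests

    URL = "https://status.example.org/api/pv-updates"   # hypothetical endpoint
    last_serial = 0

    while True:
        try:
            # The server replies as soon as something newer than last_serial
            # exists, or after ~30 s with an empty list (then we poll again).
            resp = requests.get(URL, params={"since": last_serial}, timeout=35)
            resp.raise_for_status()
            for update in resp.json():
                last_serial = max(last_serial, update["serial"])
                print(update["pv"], update["value"])
        except requests.RequestException:
            # Network hiccup: back off briefly, then re-issue the long poll.
            time.sleep(5)
    ```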

  7. Mathematical Tools for Discovery of Nanoporous Materials for Energy Applications

    International Nuclear Information System (INIS)

    Haranczyk, M; Martin, R L

    2015-01-01

    Porous materials such as zeolites and metal-organic frameworks have been of growing importance as materials for energy-related applications such as CO2 capture, hydrogen and methane storage, and catalysis. Current state-of-the-art molecular simulations allow for accurate in silico prediction of materials' properties, but the computational cost of such calculations prohibits their application to the characterisation of very large sets of structures, which would be required to perform brute-force screening. Our work focuses on the development of novel methodologies to efficiently characterise and explore this complex materials space. In particular, we have been developing algorithms and tools for enumeration and characterisation of porous material databases as well as efficient screening approaches. Our methodology represents an ensemble of mathematical methods. We have used Voronoi tessellation-based techniques to enable high-throughput structure characterisation, statistical techniques to perform comparison and screening, and continuous optimisation to design materials. This article outlines our developments in material design
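
    A much-simplified sketch of the Voronoi tessellation idea mentioned above: the Voronoi vertices of the framework atoms are candidate pore centres, and the distance to the nearest atom estimates a pore radius. It ignores periodic boundary conditions and atomic radii, so it illustrates the approach rather than reproducing the authors' tools.

    ```python
    import numpy as np
    from scipy.spatial import Voronoi, cKDTree

    rng = np.random.default_rng(0)
    atoms = rng.uniform(0.0, 20.0, size=(200, 3))   # fake framework coordinates (Å)

    vor = Voronoi(atoms)          # Voronoi tessellation of the atom positions
    tree = cKDTree(atoms)

    # Radius of the empty sphere centred on each Voronoi vertex.
    radii, _ = tree.query(vor.vertices)

    # Keep vertices inside the box and report the largest candidate pores.
    inside = np.all((vor.vertices >= 0) & (vor.vertices <= 20), axis=1)
    order = np.argsort(radii[inside])[::-1][:5]
    for centre, r in zip(vor.vertices[inside][order], radii[inside][order]):
        print(f"pore centre {centre.round(2)}  radius ≈ {r:.2f} Å")
    ```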

  8. The Cosmopolitan Epics of 2004: A Case Study

    Directory of Open Access Journals (Sweden)

    Assoc. Prof. Saverio Giovacchini

    2011-01-01

    Full Text Available In 2004 Hollywood produced three purportedly blockbuster epic films: Troy, King Arthur and Alexander. Many critics suggested a direct link between the 1950s “sword and sandal” epic and this new crop of movies. Similarities between the two cycles certainly exist but in this essay I want to emphasize a crucial difference between the contemporary, cosmopolitan, epic and the previous, more nation-bound, 1950s cycle. Rather than being in tune with key elements of American foreign policy, the new cycle of “sword and sandal” films offers a somber assessment of American imperial adventures. I shall contend, in fact, that the new crop of epic films had to choose between two generic conventions that are, at present, not compatible. On the one hand, epic films had traditionally been the bearers of the foreign policy vision of the country that produced them. On the other, their inflated budgets made them dependent on an international market. Deeply aware of a globalized and rising opposition to US foreign policy and of the fact that foreign box office now exceeds the domestic take of a blockbuster, it may be no wonder that the makers of these films chose to craft them into citizens of the world.

  9. Chronicle and the epic: Machado de Assis in homeric verses

    Directory of Open Access Journals (Sweden)

    Ionara Satin

    2015-12-01

    Full Text Available The aim of this study is to show the presence of the classical epic, Homer's Iliad, in the chronicles of Machado de Assis, analyzing the intertextual dialogue between Machado de Assis and Homer's epic poem in light of the concept of intertextuality developed by Julia Kristeva from the philosophical conceptions of Bakhtin. In the chronicle of March 18, 1894, written for the Sunday column "A Semana" of the newspaper Gazeta de Notícias, Machado de Assis carries Homer's epic over into his chronicle, rewriting the epic text for the daily issues of his weekly column. For Tiphaine Samoyault, writing is rewriting, to "stand on the existing foundations and contribute to a continued creation" (2008, p. 77), one of the principles of intertextuality. It was observed that, through the reading and assimilation of the classic poem, Machado de Assis brings together genres that are far apart, drawing the verses into his prose and leaving it closer to poetry. In this sense, we can see the richness of Machado de Assis's chronicles, often left on the sidelines in favor of his short stories and novels. In addition, by rewriting the epic in his chronicle through this dialogue, Machado seems to contribute to this "continuous creation", reviving the memory of literature and emphasizing the permanence of the classical work.

  10. 76 FR 71341 - BASINS and WEPP Climate Assessment Tools: Case Study Guide to Potential Applications

    Science.gov (United States)

    2011-11-17

    ... Climate Assessment Tools: Case Study Guide to Potential Applications AGENCY: Environmental Protection... Tools (CAT): Case Study Guide to Potential Applications (EPA/600/R-11/123A). EPA also is announcing that... report presents a series of short case studies designed to illustrate the capabilities of these tools for...

  11. Implementation of KoHLT-EB DAQ System using compact RIO with EPICS

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Dae-Sik; Kim, Suk-Kwon; Lee, Dong Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    EPICS (Experimental Physics and Industrial Control System) is a collaboratively developed collection of software tools that can be integrated to provide a comprehensive and scalable control system. Currently there is increasing use of such systems in large physics experiments such as KSTAR, ITER and DAIC (Daejeon Accelerator Ion Complex). The Korean heat load test facility (KoHLT-EB) was installed at KAERI. This facility is used for qualification tests of plasma-facing components (PFC) for the ITER first wall and the DEMO divertor, and for thermo-hydraulic experiments. The existing data acquisition device was an Agilent 34980A multifunction switch and measurement unit controlled by Agilent VEE. In the present paper, we report the newly upgraded, EPICS-based KoHLT-EB DAQ system, an advanced data acquisition system built on FPGA-based reconfigurable DAQ devices such as CompactRIO. The operator interface of the KoHLT-EB DAQ system is composed of Control System Studio (CSS); another server archives the related data using the standalone archive tool, and the archive viewer can retrieve those data at any time within the internal network.
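
    As a minimal stand-in for the archiving path described above, the sketch below periodically reads a few EPICS PVs with pyepics and appends them to a CSV file; the PV names are hypothetical, and the actual system uses the CSS archiver rather than a script.

    ```python
    import csv
    import time
    from epics import caget

    # Hypothetical thermocouple and flow-rate process variables.
    PVS = ["KoHLT:TC01:TEMP", "KoHLT:TC02:TEMP", "KoHLT:FLOW:RATE"]

    with open("kohlt_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time"] + PVS)
        for _ in range(60):                       # one minute of logging at 1 Hz
            row = [time.time()] + [caget(pv) for pv in PVS]
            writer.writerow(row)
            f.flush()
            time.sleep(1.0)
    ```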

  12. Design tool for offgrid hydrogen refuelling systems for aerospace applications

    International Nuclear Information System (INIS)

    Troncoso, E.; Lapeña-Rey, N.; Gonzalez, M.

    2016-01-01

    Highlights: • A simulation tool for offgrid CPV-based hydrogen refuelling systems is presented. • Simulations of system configurations with specific UAS hydrogen demand scenarios. • Regarding system size and reliability, the most critical components are the CPV array and batteries. • In terms of energy efficiency, the most critical component is the electrolyser. - Abstract: To develop an environmentally acceptable refuelling solution for fuel cell-powered unmanned aerial systems (UASs) to operate in remote areas, hydrogen fuel must be produced on-site from renewable energy sources. This paper describes a Matlab-based simulation tool specifically developed to pre-design offgrid hydrogen refuelling systems for UAS applications. The refuelling system comprises a high-concentration PV array (CPV), an electrolyser, a hydrogen buffer tank and a diaphragm hydrogen compressor. Small composite tanks are also included for fast refuelling of the UAV platforms at any time during the year. The novel approach of selecting a CPV power source is justified on the basis of minimizing the system footprint (versus flat-plate or low-concentration PV), aiming for a containerized, remotely deployable UAS offgrid refuelling solution. To validate the simulation tool a number of simulations were performed using experimental data from a prototype offgrid hydrogen refuelling station for UAVs developed by Boeing Research & Technology Europe. Solar irradiation data for a selected location and daily UAS hydrogen demands of between 2.8 and 15.8 Nm3 were employed as the primary inputs, in order to calculate a recommended system sizing solution and assess the expected operation of the refuelling system across a given year. The specific energy consumption of the refuelling system obtained from the simulations is between 5.6 and 8.9 kWh_e per kg of hydrogen delivered to the UAVs, being lower for larger daily hydrogen demands. Increasing the CPV area and electrolyser size in order to supply higher
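
    A back-of-envelope energy balance in the spirit of the simulation tool is sketched below; every number in it (efficiencies, irradiation, areas) is an illustrative assumption rather than a value taken from the paper.

    ```python
    # Rough daily energy balance: CPV electricity -> electrolysis -> compression.
    CPV_EFFICIENCY = 0.30          # assumed CPV module efficiency
    ELECTROLYSER_KWH_PER_KG = 55.0 # assumed electrolyser specific consumption
    COMPRESSOR_KWH_PER_KG = 3.0    # assumed compression overhead
    DAILY_IRRADIATION = 7.0        # kWh/m2/day (sunny site, assumed)

    def daily_hydrogen_kg(cpv_area_m2: float) -> float:
        """Hydrogen produced per day for a given CPV aperture area."""
        electric_kwh = cpv_area_m2 * DAILY_IRRADIATION * CPV_EFFICIENCY
        return electric_kwh / (ELECTROLYSER_KWH_PER_KG + COMPRESSOR_KWH_PER_KG)

    # A demand of 15.8 Nm3/day is roughly 1.4 kg/day (H2 density ~0.0899 kg/Nm3).
    demand_kg = 15.8 * 0.0899
    for area in (20, 40, 60):
        prod = daily_hydrogen_kg(area)
        status = "meets" if prod >= demand_kg else "below"
        print(f"{area:>3} m2 CPV -> {prod:.2f} kg/day ({status} demand of {demand_kg:.2f} kg)")
    ```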

  13. Pirate Alterity and Mimesis in Colonial Epic Poetry

    Directory of Open Access Journals (Sweden)

    Javier de Navascués

    2016-05-01

    Full Text Available Pirate representation is studied in a series of epic poems from the late sixteenth and early seventeenth centuries. The ambiguous image of the English enemy is read in texts by Juan de Miramontes, Pedro de Oña, Martín del Barco Centenera and Juan de Castellanos, among others. On the one hand, the colonial epic ignores some important differences between privateers and pirates, since privateering had been legally accepted by all European nations, including Spain. Besides, the pirate is always called «Lutheran», revealing his absolute otherness with respect to the Catholic model. On the other hand, the epic proposes a laudatory portrait of the enemy, painted from the imitation of the values accepted by colonial society. The relationship between the Spanish hero and the privateer is represented not in a vertical direction, as could happen between colonizer and colonized subject, but on a level of rivalry.

  14. Development of Nylon Based FDM Filament for Rapid Tooling Application

    Science.gov (United States)

    Singh, R.; Singh, S.

    2014-04-01

    There has been a critical need for the development of a cost-effective nylon-based wire to be used as feedstock filament for fused deposition modelling (FDM) machines. Hitherto, however, very little work has been reported on developing an alternative to the acrylonitrile butadiene styrene (ABS) based wire presently used in most FDM machines. The present research work is focused on the development of a nylon-based wire as an alternative to ABS wire (to be used as feedstock filament on FDM) without changing any hardware or software of the machine. For the present study aluminium oxide (Al2O3) was used as an additive in different proportions with nylon fibre. A single-screw extruder was used for wire preparation, and the wire thus produced was tested on the FDM machine. The mechanical properties, i.e. tensile strength and percentage elongation, of the finally developed wire were optimized using the Taguchi L9 technique. The work represents a major step towards reducing cost and time in rapid tooling applications.
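
    The Taguchi analysis mentioned above relies on signal-to-noise ratios computed per trial run; a sketch of the standard "larger-is-better" S/N ratio, applied to made-up tensile-strength replicates (not the paper's measurements), is shown below.

    ```python
    import math

    def sn_larger_is_better(responses):
        """S/N = -10 * log10( (1/n) * sum(1 / y_i^2) ) -- larger is better."""
        n = len(responses)
        return -10.0 * math.log10(sum(1.0 / (y * y) for y in responses) / n)

    # Hypothetical tensile-strength replicates (MPa) for three of the nine
    # L9 trial runs (each run is one combination of factor levels).
    trials = {
        "run1": [31.2, 30.8, 31.5],
        "run2": [34.0, 33.6, 34.4],
        "run3": [28.9, 29.3, 28.5],
    }
    for run, values in trials.items():
        print(f"{run}: S/N = {sn_larger_is_better(values):.2f} dB")
    ```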

  15. Bioimmobilization of uranium-practical tools for field applications

    Science.gov (United States)

    Istok, J. D.

    2011-12-01

    Extensive laboratory and field research has conclusively demonstrated that it is possible to stimulate indigenous microbial activity and create conditions favorable for the reductive precipitation of uranium from groundwater, reducing aqueous U concentrations below regulatory levels. A wide variety of complex and coupled biogeochemical processes have been identified and specific reaction mechanisms and parameters have been quantified for a variety of experimental systems including pure, mixed, and natural microbial cultures, and single mineral, artificial, and natural sediments, and groundwater aquifers at scales ranging from very small (10s nm) to very large (10s m). Multicomponent coupled reactive transport models have also been developed to simulate various aspects of this process in 3D heterogeneous environments. Nevertheless, full-scale application of reductive bioimmobilization of uranium (and other radionuclides and metals) remains problematical because of the technical and logistical difficulties in creating and maintaining reducing environment in the many large U contaminated groundwater aquifers currently under aerobic and oxidizing conditions and often containing high concentrations of competing and more energetically favorable electron acceptors (esp. nitrate). This talk will discuss how simple tools, including small-scale in situ testing and geochemical reaction path modeling, can be used to quickly assess the feasibility of applying bioimmobilization to remediate U contaminated groundwater aquifers and provide data needed for full-scale design.

  16. Nuclear and radiation applications in industry: Tools for innovation

    International Nuclear Information System (INIS)

    Machi, S.; Iyer, R.

    1994-01-01

    Applications of nuclear and radiation technologies have been contributing to industrial efficiency, energy conservation, and environmental protection for many years. Some of these are: Manufacturing industries: Radiation processing technologies are playing increasing roles during manufacturing of such everyday products as wire and cable, automobile tires, plastic films and sheets, and surface materials. Production processes: Other techniques employing radioisotope gauges are indispensable for on-line thickness measurements during paper, plastic, and steel plate production. Processing and quality checks are made using nucleonic control systems that are common features of industrial production lines. Sterilization of medical products using electron beam accelerators or cobalt-60 radiation is better than the conventional methods. Industrial safety and product quality: Non-destructive examination or testing using gamma- or X-ray radiography is widely used for checking welds, casting, machinery, and ceramics to ensure quality and safety. Additionally, radiotracer techniques are unique tools for the optimization of chemical processes in reactors, leakage detection, and wear and corrosion studies, for example. Environmental protection: An innovative technology using electron beams to simultaneously remove sulfur dioxide (SO 2 ) and nitrogen oxides (NO x ) has been under development. The electron beam technology is very cost competitive and its byproduct can be used as agricultural fertilizer

  17. On Collecting and Publishing the Albanian Oral Epic

    Directory of Open Access Journals (Sweden)

    Arbnora Dushi

    2014-05-01

    Full Text Available The aim of this paper is to examine how the Albanian epic known as the ‘Cycle of the Frontier Warriors’ has been presented in Albanian folklore collections. I will examine seven written versions of the song ‘The Wedding of Ali Bajraktari’, which belongs to this epic cycle. The ‘Cycle of the Frontier Warriors’, has been an object of collection since the beginning of the twentieth century. There are now dozens of volumes published, but the studies published to date concentrate on historical, thematic and comparative rather than contextual and textual issues.

  18. EPICS: A control system software co-development success story

    International Nuclear Information System (INIS)

    Knott, M.; Gurd, D.; Lewis, S.; Thuot, M.

    1993-01-01

    The Experimental Physics and Industrial Control System (EPICS) is the result of a software sharing and co-development effort of major importance now underway. The initial two participants, LANL and ANL, have now been joined by three other labs, and an earlier version of the software has been transferred to three commercial firms and is currently undergoing separate development. The reasons for EPICS's success are useful to enumerate and explain, and the desire and prospects for its continued development are certainly worth examining

  19. Tools for signal compression applications to speech and audio coding

    CERN Document Server

    Moreau, Nicolas

    2013-01-01

    This book presents tools and algorithms required to compress/uncompress signals such as speech and music. These algorithms are largely used in mobile phones, DVD players, HDTV sets, etc. In a first rather theoretical part, this book presents the standard tools used in compression systems: scalar and vector quantization, predictive quantization, transform quantization, entropy coding. In particular we show the consistency between these different tools. The second part explains how these tools are used in the latest speech and audio coders. The third part gives Matlab programs simulating t

  20. Implementation of Epic Beaker Clinical Pathology at an academic medical center

    Directory of Open Access Journals (Sweden)

    Matthew D Krasowski

    2016-01-01

    Full Text Available Background: Epic Beaker Clinical Pathology (CP) is a relatively new laboratory information system (LIS) operating within the Epic suite of software applications. To date, there have not been any publications describing implementation of Beaker CP. In this report, we describe our experience in implementing Beaker CP version 2012 at a state academic medical center with a go-live of August 2014 and a subsequent upgrade to Beaker version 2014 in May 2015. The implementation of Beaker CP was concurrent with implementations of Epic modules for revenue cycle, patient scheduling, and patient registration. Methods: Our analysis covers approximately 3 years of time (2 years preimplementation of Beaker CP and roughly 1 year after), using data summarized from pre- and post-implementation meetings, debriefings, and the closure document for the project. Results: We summarize positive aspects of, and key factors leading to, a successful implementation of Beaker CP. The early inclusion of subject matter experts in the design and validation of Beaker workflows was very helpful. Since Beaker CP does not directly interface with laboratory instrumentation, the clinical laboratories spent extensive preimplementation effort establishing middleware interfaces. Immediate challenges postimplementation included bar code scanning and nursing adaptation to Beaker CP specimen collection. The most substantial changes in laboratory workflow occurred with microbiology orders. This posed a considerable challenge with microbiology orders from the operating rooms and required intensive interventions in the weeks following go-live. In postimplementation surveys, pathology staff, informatics staff, and end-users expressed satisfaction with the new LIS. Conclusions: Beaker CP can serve as an effective LIS for an academic medical center. Careful planning and preparation aid the transition to this LIS.

  1. Development of EPICS Input Output Controller and User Interface for the PEFP Low Level RF Control System

    Energy Technology Data Exchange (ETDEWEB)

    Song, Young Gi; Kim, Han Sung; Seol, Kyung Tae; Kwon, Hyeok Jung; Cho, Yong Sub [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-05-15

    The Low-Level RF (LLRF) control system of the Proton Engineering Frontier Project (PEFP) was developed in 2006 for handling the driving frequency for the Radio Frequency Quadrupole (RFQ) and Drift Tube Linac (DTL) cavities. The RF amplitude and phase of the accelerating field were controlled to within 1% and 1 degree, respectively, as set by the stability requirements. Operators have been using the LLRF control system through a Windows-based text console as the operator interface. The LLRF control system could not be integrated with the Experimental Physics and Industrial Control System (EPICS) Input Output Controllers (IOCs) for each subsection of the PEFP facility. The main objective of this study is to provide operators of the LLRF control system with a user-friendly and convenient operating environment. The new LLRF control system is composed of a Versa Module Eurocard (VME) baseboard, a PCI Mezzanine Card (PMC), a Board Support Package (BSP), the EPICS software tools and the Real-Time Operating System (RTOS) VxWorks. A test with a dummy cavity of the new LLRF control system shows that operators can control and monitor operation parameters for a desired feedback action by using EPICS Channel Access (CA).
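
    From the client side, controlling and monitoring such set points over EPICS Channel Access could look like the sketch below; the PV names are hypothetical and the tolerances simply echo the 1% / 1 degree requirement quoted above.

    ```python
    from epics import caget, caput

    # Hypothetical set-point and readback process variables.
    AMP_SP, AMP_RB = "PEFP:LLRF:AMP_SP", "PEFP:LLRF:AMP_RB"
    PHS_SP, PHS_RB = "PEFP:LLRF:PHASE_SP", "PEFP:LLRF:PHASE_RB"

    caput(AMP_SP, 1.00)    # requested cavity amplitude (normalised)
    caput(PHS_SP, 30.0)    # requested phase (degrees)

    amp = caget(AMP_RB)
    phs = caget(PHS_RB)
    amp_ok = amp is not None and abs(amp - 1.00) <= 0.01   # within 1 %
    phs_ok = phs is not None and abs(phs - 30.0) <= 1.0    # within 1 degree
    print(f"amplitude {amp} ok={amp_ok}, phase {phs} ok={phs_ok}")
    ```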

  2. Development of EPICS Input Output Controller and User Interface for the PEFP Low Level RF Control System

    International Nuclear Information System (INIS)

    Song, Young Gi; Kim, Han Sung; Seol, Kyung Tae; Kwon, Hyeok Jung; Cho, Yong Sub

    2010-01-01

    The Low-Level RF (LLRF) control system of the Proton Engineering Frontier Project (PEFP) was developed in 2006 for handling the driving frequency for the Radio Frequency Quadrupole (RFQ) and Drift Tube Linac (DTL) cavities. The RF amplitude and phase of the accelerating field were controlled to within 1% and 1 degree, respectively, as set by the stability requirements. Operators have been using the LLRF control system through a Windows-based text console as the operator interface. The LLRF control system could not be integrated with the Experimental Physics and Industrial Control System (EPICS) Input Output Controllers (IOCs) for each subsection of the PEFP facility. The main objective of this study is to provide operators of the LLRF control system with a user-friendly and convenient operating environment. The new LLRF control system is composed of a Versa Module Eurocard (VME) baseboard, a PCI Mezzanine Card (PMC), a Board Support Package (BSP), the EPICS software tools and the Real-Time Operating System (RTOS) VxWorks. A test with a dummy cavity of the new LLRF control system shows that operators can control and monitor operation parameters for a desired feedback action by using EPICS Channel Access (CA).

  3. Application of new tool material for electrical discharge machining ...

    Indian Academy of Sciences (India)

    Administrator

    MST Division, National Metallurgical Laboratory, Jamshedpur 831 007, India. MS received 8 July 2007; revised 25 April 2009. Abstract. In EDM, Cu and graphite are commonly used as tool materials. The poor wear resistance is the drawback of these tools. In the current study, an attempt has been made to develop a ...

  4. Tool-Supported User-Centred Prototyping of Mobile Applications

    DEFF Research Database (Denmark)

    Leichtenstern, Karin; André, Elisabeth; Rehm, Matthias

    2011-01-01

    the process both cost-effective and time-effective. In this paper we address that problem and provide insights into so-called user-centred prototyping (UCP) tools, which support the production of prototypes as well as their evaluation with end-users. In particular, we introduce our UCP tool called Mo...

  5. Application of molecular genetic tools for forest pathology

    Science.gov (United States)

    Mee-Sook Kim; John Hanna; Amy Ross-Davis; Ned Klopfenstein

    2012-01-01

    In recent years, advances in molecular genetics have provided powerful tools to address critical issues in forest pathology to help promote resilient forests. Although molecular genetic tools are initially applied to understand individual components of forest pathosystems, forest pathosystems involve dynamic interactions among biotic and abiotic components of the...

  6. Extrasolar Planetary Imaging Coronagraph (EPIC): visible nulling cornagraph testbed results

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Melnick, Gary; Tolls, Volker; Woodruff, Robert; Vasudevan, Gopal

    2008-07-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a NASA Astrophysics Strategic Mission Concept under study for the upcoming Exoplanet Probe. EPIC's mission would be to image and characterize extrasolar giant planets, and potential super-Earths, in orbits with semi-major axes between 2 and 10 AU. EPIC will provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It will detect and characterize the atmospheres of planets identified by radial velocity surveys and potentially some transits, determine orbital inclinations and masses, characterize the atmospheres of gas giants around A and F stars, and observe the inner spatial structure and colors of inner Spitzer-selected debris disks. EPIC would be launched into a heliocentric Earth-trailing drift-away orbit, with a 3-year mission lifetime (5-year goal), and will revisit planets at least three times. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables high-order starlight suppression in broadband light. To demonstrate the VNC approach and advance its technology readiness, the NASA/Goddard Space Flight Center and Lockheed-Martin have developed a laboratory VNC and have demonstrated white-light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  7. Anthropometry and the risk of lung cancer in EPIC

    NARCIS (Netherlands)

    Dewi, Nikmah Utami; Boshuizen, Hendriek C.; Johansson, Mattias; Vineis, Paolo; Kampman, Ellen; Steffen, Annika; Tjønneland, Anne; Halkjær, Jytte; Overvad, Kim; Severi, Gianluca; Fagherazzi, Guy; Boutron-Ruault, Marie Christine; Kaaks, Rudolf; Li, Kuanrong; Boeing, Heiner; Trichopoulou, Antonia; Bamia, Christina; Klinaki, Eleni; Tumino, Rosario; Palli, Domenico; Mattiello, Amalia; Tagliabue, Giovanna; Peeters, Petra H.; Vermeulen, Roel; Weiderpass, Elisabete; Gram, Inger Torhild; Huerta, José María; Agudo, Antonio; Sánchez, María José; Ardanaz, Eva; Dorronsoro, Miren; Quirós, José Ramón; Sonestedt, Emily; Johansson, Mikael; Grankvist, Kjell; Key, Tim; Khaw, Kay Tee; Wareham, Nick; Cross, Amanda J.; Norat, Teresa; Riboli, Elio; Fanidi, Anouar; Muller, David; Bueno-De-Mesquita, H.B.

    2016-01-01

    The associations of body mass index (BMI) and other anthropometric measurements with lung cancer were examined in 348,108 participants in the European Investigation Into Cancer and Nutrition (EPIC) between 1992 and 2010. The study population included 2,400 case patients with incident lung cancer,

  8. Anthropometry and the Risk of Lung Cancer in EPIC

    NARCIS (Netherlands)

    Dewi, Nikmah Utami; Boshuizen, Hendriek C; Johansson, Mattias; Vineis, Paolo; Kampman, Ellen; Steffen, Annika; Tjønneland, Anne; Halkjær, Jytte; Overvad, Kim; Severi, Gianluca; Fagherazzi, Guy; Boutron-Ruault, Marie-Christine; Kaaks, Rudolf; Li, Kuanrong; Boeing, Heiner; Trichopoulou, Antonia; Bamia, Christina; Klinaki, Eleni; Tumino, Rosario; Palli, Domenico; Mattiello, Amalia; Tagliabue, Giovanna; Peeters, Petra H; Vermeulen, Roel; Weiderpass, Elisabete; Torhild Gram, Inger; Huerta, José María; Agudo, Antonio; Sánchez, María-José; Ardanaz, Eva; Dorronsoro, Miren; Quirós, José Ramón; Sonestedt, Emily; Johansson, Mikael; Grankvist, Kjell; Key, Tim; Khaw, Kay-Tee; Wareham, Nick; Cross, Amanda J; Norat, Teresa; Riboli, Elio; Fanidi, Anouar; Muller, David; Bueno-de-Mesquita, H Bas

    2016-01-01

    The associations of body mass index (BMI) and other anthropometric measurements with lung cancer were examined in 348,108 participants in the European Investigation Into Cancer and Nutrition (EPIC) between 1992 and 2010. The study population included 2,400 case patients with incident lung cancer,

  9. Anthropometry and the risk of lung cancer in EPIC

    NARCIS (Netherlands)

    Dewi, Nikmah Utami; Boshuizen, Hendriek C.; Johansson, Mattias; Vineis, Paolo; Kampman, Ellen; Steffen, Annika; Tjønneland, Anne; Halkjær, Jytte; Overvad, Kim; Severi, Gianluca; Fagherazzi, Guy; Boutron-Ruault, Marie Christine; Kaaks, Rudolf; Li, Kuanrong; Boeing, Heiner; Trichopoulou, Antonia; Bamia, Christina; Klinaki, Eleni; Tumino, Rosario; Palli, Domenico; Mattiello, Amalia; Tagliabue, Giovanna; Peeters, Petra H.; Vermeulen, Roel; Weiderpass, Elisabete; Gram, Inger Torhild; Huerta, José María; Agudo, Antonio; Sánchez, María José; Ardanaz, Eva; Dorronsoro, Miren; Quirós, José Ramón; Sonestedt, Emily; Johansson, Mikael; Grankvist, Kjell; Key, Tim; Khaw, Kay Tee; Wareham, Nick; Cross, Amanda J.; Norat, Teresa; Riboli, Elio; Fanidi, Anouar; Muller, David; Bueno-De-Mesquita, H. Bas

    2016-01-01

    The associations of body mass index (BMI) and other anthropometric measurements with lung cancer were examined in 348,108 participants in the European Investigation Into Cancer and Nutrition (EPIC) between 1992 and 2010. The study population included 2,400 case patients with incident lung cancer,

  10. EPICS-QT based graphical user interface for accelerator control

    International Nuclear Information System (INIS)

    Basu, A.; Singh, S.K.; Rosily, Sherry; Bhagwat, P.V.

    2016-01-01

    Particle accelerators, like many complex industrial systems, require robust and efficient controls for proper operation, in order to achieve the required beam quality and to ensure the safety of subcomponents and of all working personnel. This control is exercised via a graphical user interface through which an operator interacts with the accelerator to achieve the desired state of the machine and its output. The Experimental Physics and Industrial Control System (EPICS) is a widely used control system framework in the field of accelerator control. It acts as a middle layer between field devices and the graphical user interface used by the operator. Field devices can also be made EPICS compliant by running EPICS-based software on them. Qt, on the other hand, is a C++ framework widely used for creating professional-looking and user-friendly graphical components. For the Low Energy High Intensity Proton Accelerator (LEHIPA), the first stage of the three-stage Accelerator Driven System (ADS) program taken up by Bhabha Atomic Research Centre (BARC), it was decided that EPICS will be used for controlling the accelerator and Qt will be used for developing the various Graphical User Interfaces (GUIs) for operation and diagnostics. This paper discusses the work carried out to achieve this goal in LEHIPA
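
    A tiny readback display in the spirit of the EPICS + Qt GUIs described above is sketched below, using PyQt5 and pyepics for brevity (the LEHIPA GUIs themselves are built on the C++ EPICS/Qt stack). The PV name is hypothetical, and a QTimer poll is used instead of CA callbacks so that all widget updates stay in the GUI thread.

    ```python
    import sys
    from PyQt5.QtWidgets import QApplication, QLabel
    from PyQt5.QtCore import QTimer
    from epics import PV

    app = QApplication(sys.argv)
    label = QLabel("connecting...")
    label.setWindowTitle("LEHIPA beam current")
    pv = PV("LEHIPA:BD:CURRENT")          # hypothetical process variable

    def refresh():
        value = pv.get(timeout=0.2)       # None while disconnected
        label.setText("disconnected" if value is None else f"{value:.2f} mA")

    timer = QTimer()
    timer.timeout.connect(refresh)
    timer.start(500)                      # update twice per second
    label.resize(240, 60)
    label.show()
    sys.exit(app.exec_())
    ```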

  11. CAFE, a modern C++ interface to the EPICS channel access library

    International Nuclear Information System (INIS)

    Chrin, J.; Sloan, M.C.

    2012-01-01

    CAFE (Channel Access interface) is a C++ library that provides a modern, multifaceted interface to the EPICS-based control systems found, for instance, in particle accelerators. CAFE makes extensive use of templates and containers with multiple STL-compatible access methods to enhance efficiency, flexibility and performance. Stability and robustness are accomplished by ensuring that connectivity to EPICS channels remains in a well-defined state in every eventuality, and that the results of all synchronous and asynchronous operations are captured and reported with integrity. CAFE presents the user with a number of options for writing data to and retrieving data from the control system. In addition to basic read and write operations, a further abstraction layer provides transparency to more intricate functionalities involving logical sets of data; such 'group' objects are easily instantiated through an XML-based configuration mechanism. CAFE's suitability for use in a broad spectrum of applications is demonstrated. These range from high-performance Qt GUI (Graphical User Interface) control widgets, to event-processing agents that propagate data through the Object Management Group's Data Distribution Service (OMG-DDS), to script-like frameworks such as MATLAB. The methodology for the modular use of CAFE serves to improve maintainability by enforcing a logical boundary between the channel access components and the programming extensions of the application framework at hand. (authors)

  12. Costs and financing of routine immunization: Approach and selected findings of a multi-country study (EPIC).

    Science.gov (United States)

    Brenzel, Logan; Young, Darwin; Walker, Damian G

    2015-05-07

    Few detailed facility-based costing studies of routine immunization (RI) programs have been conducted in recent years, with planners, managers and donors relying on older information or data from planning tools. To fill gaps and improve the quality of information, a multi-country study on costing and financing of routine immunization and new vaccines (EPIC) was conducted in Benin, Ghana, Honduras, Moldova, Uganda and Zambia. This paper provides the rationale for the launch of the EPIC study and outlines the methods used in a Common Approach on facility sampling, data collection, and cost and financial flow estimation for both the routine program and new vaccine introduction. Costing relied on an ingredients-based approach from a government perspective. Approaches for estimating the incremental economic costs of new vaccine introduction in contexts with excess capacity are highlighted. The use of more disaggregated System of Health Accounts (SHA) coding to evaluate financial flows is presented. The EPIC studies resulted in a sample of 319 primary health care facilities, with 65% of facilities in rural areas. The EPIC studies found wide variation in total and unit costs within each country, as well as between countries. Costs increased with level of scale and socio-economic status of the country. Governments are financing an increasing share of total RI financing. This study provides a wealth of high quality information on total and unit costs and financing for RI, and demonstrates the value of in-depth facility approaches. The paper discusses the lessons learned from using a standardized approach and proposes further areas of methodology development. The paper also discusses how results can be used for resource mobilization and allocation, for improving the efficiency of services at the country level, and to inform policies at the global level. Efforts at routinizing cost analysis to support sustainability efforts would be beneficial. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Application of hard coatings for blanking and piercing tools

    DEFF Research Database (Denmark)

    Podgornik, B.; Zajec, B.; Bay, Niels

    2011-01-01

    The aim of the present investigation was to examine the possibility of reducing lubrication and replacing expensive tungsten carbide material in blanking/piercing through introduction of hard tool coatings. Results show that hard PVD coatings can be successfully used in blanking/piercing...... critical value under dry friction conditions and leads to tool failure. Therefore, at present oxidation and temperature resistant hard coatings can give improved wear resistance of stamping tools, but elimination of lubricants in blanking and piercing processes is still not feasible....

  14. BASINS and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (External Review Draft)

    Science.gov (United States)

    This draft report supports application of two recently developed water modeling tools, the BASINS and WEPP climate assessment tools. The report presents a series of short case studies designed to illustrate the capabilities of these tools for conducting scenario based assessments...

  15. High Fidelity Regolith Simulation Tool for ISRU Applications, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA has serious unmet needs for simulation tools capable of predicting the behavior of lunar regolith in proposed excavation, transport and handling systems....

  16. comparative analysis of diagnostic applications of autoscan tools

    African Journals Online (AJOL)

    user

    A structured questionnaire with a 3-item checklist and 2 auto scan tools were used for the study, and tests were carried out on 5 vehicle systems at the 3 centers where Innova ... maintenance, auto-analyzers, solid work design and can-.

  17. Sustainability assessment in the 21. century. Tools, trends and applications. Symposium abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    Focus on sustainability of products and corporations has been increasing over the last decade. New market trends develop, engendering new tools and application areas with the purpose of increasing sustainability, thus setting new demands for industry and academia. The 2012 SETAC LCA Case Study Symposium focuses on the experiences gained in industry and academia on the application of LCA and on the application of new tools for sustainability assessment. These tools may relate to environmental 'footprint' assessments, such as carbon, water or chemical footprints, as well as life cycle oriented tools for assessing other dimensions of sustainability. (LN)

  19. Micromilling of hardened tool steel for mould making applications

    DEFF Research Database (Denmark)

    Bissacco, Giuliano; Hansen, Hans Nørgaard; De Chiffre, Leonardo

    2005-01-01

    geometries as those characterizing injection moulding moulds. The realization of the micromilling process in connection with hardened tool steel as workpiece material is particularly challenging. The low strength of the miniaturized end mills implies reduction and accurate control of the chip load which...... wear. This paper presents the micromilling process applied to the manufacturing of micro injection moulding moulds in hardened tool steel, presenting experimental evidence and possible solutions to the above-mentioned issues....

  20. Proceedings of the of the Eleventh Workshop on Language Descriptions, Tools and Applications (LDTA 2011)

    DEFF Research Database (Denmark)

    This volume contains the proceedings of the Eleventh Workshop on Language Descriptions, Tools and Applications (LDTA 2011), held in Saarbrücken, Germany on March 26 & 27, 2011. LDTA is a two-day satellite event of ETAPS (European Joint Conferences on Theory and Practice of Software) and organized...... in cooperation with ACM SIGPLAN. LDTA is an application and tool-oriented workshop focused on grammarware---software based on grammars in some form. Grammarware applications are typically language processing applications and traditional examples include parsers, program analyzers, optimizers and translators......, as well as techniques and tools, to the test in a new way in the form of the LDTA Tool Challenge. Tool developers were invited to participate in the Challenge by developing solutions to a range of language processing tasks over a simple but evolving set of imperative programming languages. Tool challenge...

  1. Afghanistan, history and beyond - GIS based application tool

    Science.gov (United States)

    Swamy, Rahul Chidananda

    The emphasis of this tool is to provide an insight into the history of Afghanistan. Afghanistan has been a warring nation for decades; this tool provides a brief account of the reasons behind the importance of Afghanistan, which led to its invasion by Britain, Russia and the USA. The timeline for this thesis was set from 1879 to 1990, ranging from the Barakzai Dynasty to the Soviet invasion. Maps are used judiciously to show battles during the British invasion. Maps that show roads, rivers, lakes and provinces are incorporated into the tool to provide an overview of the present situation. The user has options to filter this data by using the timeline and a filtering tool. To quench the user's thirst for more information, HTML pages are used judiciously. HTML pages are embedded in key events to provide detailed insight into these events with the help of pictures and videos. An intuitive slider is used to show the people who played a significant role in Afghanistan. The user interface was made intuitive and easy to use, keeping in mind the novice user. A help menu is provided to guide the user on the tool. Spending time researching Afghanistan has helped me gain a new perspective on Afghanistan and its people. With this tool, I hope I can provide a valuable channel for people to understand Afghanistan and gain a fresh perspective on this war-ridden nation.

  2. Tools for man-machine interface development in accelerator control applications

    International Nuclear Information System (INIS)

    Kopylov, L.; Mikhev, M.; Trofimov, N.; Yurpalov, V.

    1994-01-01

    For the UNK Project, development of the Accelerator Control Applications is in progress. These applications will use a specific Graphical User Interface for data presentation and accelerator parameter management. A number of tools have been developed based on the Motif Tool Kit. They contain a set of problem oriented screen templates and libraries. Using these tools, full scale prototype applications for UNK tune and orbit measurement and correction were developed and are described as examples. A subset of these tools allows the creation of synoptic control screens from Autocad picture files and Oracle DB equipment descriptions. The basic concepts and a few application examples are presented. ((orig.))

  3. Cooking of meat and fish in Europe--results from the European Prospective Investigation into Cancer and Nutrition (EPIC).

    Science.gov (United States)

    Rohrmann, S; Linseisen, J; Becker, N; Norat, T; Sinha, R; Skeie, G; Lund, E; Martínez, C; Barricarte, A; Mattisson, I; Berglund, G; Welch, A; Davey, G; Overvad, K; Tjønneland, A; Clavel-Chapelon, F; Kesse, E; Lotze, G; Klipstein-Grobusch, K; Vasilopoulou, E; Polychronopoulos, E; Pala, V; Celentano, E; Bueno-De-Mesquita, H B; Peeters, P H M; Riboli, E; Slimani, N

    2002-12-01

    There is epidemiologic evidence that the consumption of fried, grilled or barbecued meat and fish that are well-done or browned may be associated with an increased cancer risk. These high-temperature cooking methods are thought to be surrogates for mutagens and carcinogens produced in meat and fish, eg heterocyclic amines or polycyclic hydrocarbons. Since data on food cooking methods are scarce, the aim of this study was to describe the variation in meat and fish cooking methods in different parts of Europe. Using a standardized 24 h recall from a sub-sample of the EPIC cohort (35 644 persons, 35-75 y old), mean daily intake of meat and fish prepared by different cooking methods and the relative contribution of the cooking methods to the overall cooking of meat and fish was calculated. Whereas frying was more often noted in northern Europe, roasting and stir frying were more often used in the south. Concerning high-temperature cooking methods, their frequency of application varies between 15% in the EPIC cohort of North-Italy and 49% in the cohort of The Netherlands. Average consumption of fried, grilled and barbecued meat and fish ranges from a low of 12 g/day in the centres in southern Spain to a high of 91 g/day in northern Spain. High variation in both the kind of meat/fish consumed as well as its cooking methods is observed within EPIC. In order to use this variation for the evaluation of the impact of cooking methods on cancer risk, a questionnaire on meat and fish cooking methods is being developed and could be applied in the whole EPIC cohort.

  4. Psychometric Assessment of the Chinese Version of the Abbreviated Expanded Prostate Cancer Index Composite (EPIC-26) and the Clinical Practice Version (EPIC-CP) in Chinese Men With Prostate Cancer.

    Science.gov (United States)

    Lam, Wendy W T; Tse, Michael A; Ng, Chris N L; Chung, Edward K M; Fielding, Richard

    2017-06-01

    The Expanded Prostate Cancer Index Composite (EPIC) instrument was designed to assess a range of health-related quality-of-life issues specifically relevant to patients with prostate cancer. This study examined the validity and reliability of Chinese versions of the 26-item EPIC and of the 16-item EPIC for Clinical Practice (EPIC-CP) in Chinese patients with prostate cancer. A Chinese version of the 26-item EPIC and the 16-item EPIC-CP were self-completed by 252 Chinese patients with prostate cancer who were recruited from three community-based cancer service centers. Confirmatory factor analysis assessed the factor structures of the EPIC and the EPIC-CP. Internal consistency and construct and clinical validities of the factor structures were assessed. Confirmatory factor analysis revealed that the original factor structure of both EPIC-26 and EPIC-CP showed good fit to this sample. A correlated model was superior to a hierarchical model in both EPIC-26 and EPIC-CP, supporting the utility of the domain scores over the total scores. Cronbach α ranged from 0.55 to 0.91 for EPIC-26 and 0.44 to 0.67 for EPIC-CP. Construct validity was supported by correlations between EPIC-26/EPIC-CP and psychological distress measures. Clinical validity was supported by differentiation between patients with and without prostatectomy. These Chinese versions of the five-factor EPIC-26 and the EPIC-CP are valid and practical measures for assessing a range of health-related quality-of-life issues related to the diagnosis and treatment of prostate cancer, highlighting their utility in assessing health-related quality of life for patients diagnosed with prostate cancer. Copyright © 2017 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  5. Software tool for data mining and its applications

    Science.gov (United States)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough set, support vector machine) and computational intelligence (neural network, genetic algorithm, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rule, fuzzy rule, neural network, genetic algorithm, Hyper Envelop, support vector machine and visualization. The principle and knowledge representation of some function models of data mining are described. The software tool is realized in Visual C++ under Windows 2000. Non-monotonicity in data mining is dealt with by concept hierarchy and layered mining. The software tool has been satisfactorily applied to the prediction of regularities in the formation of ternary intermetallic compounds in alloy systems and to the diagnosis of brain glioma.

  6. Use of EPICS and Python technology for the development of a computational toolkit for high heat flux testing of plasma facing components

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, Ritesh, E-mail: ritesh@ipr.res.in; Swamy, Rajamannar, E-mail: rajamannar@ipr.res.in; Khirwadkar, Samir, E-mail: sameer@ipr.res.in

    2016-11-15

    Highlights: • An integrated approach to software development for computational processing and experimental control. • Use of open source, cross platform, robust and advanced tools for computational code development. • Prediction of optimized process parameters for critical heat flux model. • Virtual experimentation for high heat flux testing of plasma facing components. - Abstract: The high heat flux testing and characterization of the divertor and first wall components are a challenging engineering problem of a tokamak. These components are subject to steady state and transient heat load of high magnitude. Therefore, the accurate prediction and control of the cooling parameters is crucial to prevent burnout. The prediction of the cooling parameters is based on the numerical solution of the critical heat flux (CHF) model. In a test facility for high heat flux testing of plasma facing components (PFC), the integration of computations and experimental control is an essential requirement. Experimental physics and industrial control system (EPICS) provides powerful tools for steering controls, data simulation, hardware interfacing and wider usability. Python provides an open source alternative for numerical computations and scripting. We have integrated these two open source technologies to develop a graphical software for a typical high heat flux experiment. The implementation uses EPICS based tools namely IOC (I/O controller) server, control system studio (CSS) and Python based tools namely Numpy, Scipy, Matplotlib and NOSE. EPICS and Python are integrated using PyEpics library. This toolkit is currently under operation at high heat flux test facility at Institute for Plasma Research (IPR) and is also useful for the experimental labs working in the similar research areas. The paper reports the software architectural design, implementation tools and rationale for their selection, test and validation.
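
    The integration pattern described in this abstract (read process variables over Channel Access with pyepics, run a numerical calculation, publish the result back for operator displays) can be sketched as below. The PV names and the correlation are invented placeholders for illustration, not the actual CHF model or IOC database of the IPR facility.

    # Hedged sketch of the read-compute-write loop; PV names are hypothetical.
    from epics import caget, caput

    def chf_margin(heat_flux_mw_m2, flow_velocity_m_s, subcooling_k):
        """Toy critical-heat-flux estimate, a placeholder rather than a validated model."""
        chf = 1.0 + 0.1 * flow_velocity_m_s + 0.02 * subcooling_k   # MW/m^2
        return chf / heat_flux_mw_m2

    heat_flux = caget("HHF:TARGET:HEATFLUX")      # MW/m^2, hypothetical PV
    velocity  = caget("HHF:COOLANT:VELOCITY")     # m/s,    hypothetical PV
    subcool   = caget("HHF:COOLANT:SUBCOOLING")   # K,      hypothetical PV

    if None not in (heat_flux, velocity, subcool):
        margin = chf_margin(heat_flux, velocity, subcool)
        caput("HHF:CALC:CHF_MARGIN", margin)      # publish result for CSS displays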

  7. Use of EPICS and Python technology for the development of a computational toolkit for high heat flux testing of plasma facing components

    International Nuclear Information System (INIS)

    Sugandhi, Ritesh; Swamy, Rajamannar; Khirwadkar, Samir

    2016-01-01

    Highlights: • An integrated approach to software development for computational processing and experimental control. • Use of open source, cross platform, robust and advanced tools for computational code development. • Prediction of optimized process parameters for critical heat flux model. • Virtual experimentation for high heat flux testing of plasma facing components. - Abstract: The high heat flux testing and characterization of the divertor and first wall components are a challenging engineering problem of a tokamak. These components are subject to steady state and transient heat load of high magnitude. Therefore, the accurate prediction and control of the cooling parameters is crucial to prevent burnout. The prediction of the cooling parameters is based on the numerical solution of the critical heat flux (CHF) model. In a test facility for high heat flux testing of plasma facing components (PFC), the integration of computations and experimental control is an essential requirement. Experimental physics and industrial control system (EPICS) provides powerful tools for steering controls, data simulation, hardware interfacing and wider usability. Python provides an open source alternative for numerical computations and scripting. We have integrated these two open source technologies to develop a graphical software for a typical high heat flux experiment. The implementation uses EPICS based tools namely IOC (I/O controller) server, control system studio (CSS) and Python based tools namely Numpy, Scipy, Matplotlib and NOSE. EPICS and Python are integrated using PyEpics library. This toolkit is currently under operation at high heat flux test facility at Institute for Plasma Research (IPR) and is also useful for the experimental labs working in the similar research areas. The paper reports the software architectural design, implementation tools and rationale for their selection, test and validation.

  8. GIS based application tool -- history of East India Company

    Science.gov (United States)

    Phophaliya, Sudhir

    The emphasis of the thesis is to build an intuitive and robust GIS (Geographic Information Systems) tool which gives in-depth information on the history of the East India Company. The GIS tool also incorporates various achievements of the East India Company which helped it establish its business all over the world, especially in India. The user has the option to select these movements and acts by clicking on any of the marked states on the world map. The world map also incorporates key features for the East India Company, like the landing of the East India Company in India, the Darjeeling tea establishment, the East India Company Stock Redemption Act, etc. The user can learn more about these features simply by clicking on each of them. The primary focus of the tool is to give the user a unique insight into the East India Company; for this the tool has several HTML (Hypertext Markup Language) pages which the user can select. These HTML pages give information on various topics like the first voyage, trade with China, the 1857 revolt, etc. The tool has been developed in Java. For the Indian map, MOJO (Map Objects Java Objects) is used. MOJO is developed by ESRI. The major features shown on the world map were designed using MOJO. MOJO made it easy to incorporate the statistical data with these features. The user interface was intentionally kept simple and easy to use. To keep the user engaged, key aspects are explained using HTML pages. The idea is that pictures will help the user garner interest in the history of the East India Company.

  9. Learning curve tool applications in DOE materials management activities

    International Nuclear Information System (INIS)

    Lipinski, A.

    1994-01-01

    This paper will examine the application of learning curve theory, an economic theory that quantifies cost savings over time in a labor intensive process. Learning curve theory has been traditionally applied to a production process. This paper examines the application of learning curve theory in cost estimating of waste characterization in storage at a DOE facility
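
    The learning curve theory referred to above has a standard quantitative form (Wright's model): the cost of the x-th repetition equals the first-unit cost times x raised to log2 of the learning rate. The short sketch below works through that arithmetic with invented numbers, not DOE data, to show how repeated characterization tasks would be expected to get cheaper.

    # Worked example of an 80% learning curve (each doubling of cumulative
    # output reduces unit cost by 20%); figures are illustrative only.
    import math

    first_unit_cost = 1000.0           # assumed cost of the first repetition
    learning_rate = 0.80               # 80% curve
    b = math.log(learning_rate, 2)     # exponent, about -0.322

    def unit_cost(x):
        return first_unit_cost * x ** b

    for x in (1, 2, 4, 8):
        print(f"unit {x}: {unit_cost(x):.0f}")   # 1000, 800, 640, 512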

  10. An object oriented framework of EPICS for MicroTCA based control system

    International Nuclear Information System (INIS)

    Geng, Z.

    2012-01-01

    EPICS (Experimental Physics and Industrial Control System) is a distributed control system platform which has been widely used for the control of large scientific devices such as particle accelerators and fusion plants. EPICS has introduced object oriented (C++) interfaces to most of the core services. But the major part of EPICS, the run-time database, only provides C interfaces, which makes it hard to bring the EPICS record related data and routines into an object oriented software architecture. This paper presents an object oriented framework which contains abstract classes that encapsulate the EPICS record related data and routines in C++ classes, so that full OOA (Object Oriented Analysis) and OOD (Object Oriented Design) methodologies can be used for EPICS IOC design. We also present a dynamic device management scheme for the hot swap capability of the MicroTCA based control system. (authors)
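
    The framework above is written in C++ against the EPICS run-time database; as a hedged, language-shifted sketch of the same object-oriented idea, the snippet below wraps a channel in a Python class with pyepics so that application code deals with objects rather than raw channel access calls. The record name is hypothetical.

    # Object-oriented wrapper around a single process variable (illustrative only).
    from epics import PV

    class TemperatureChannel:
        """Encapsulates one EPICS channel behind an object interface."""

        def __init__(self, pvname):
            self._pv = PV(pvname)

        @property
        def connected(self):
            return self._pv.connected

        @property
        def value(self):
            return self._pv.get()

        def on_change(self, callback):
            # pyepics invokes callbacks with keyword arguments such as value and timestamp
            self._pv.add_callback(lambda **kw: callback(kw.get("value")))

    sensor = TemperatureChannel("CRYO:HX1:TEMP")   # hypothetical record name
    print(sensor.connected, sensor.value)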

  11. Image acquisition and analysis for beam diagnostics, applications of the Taiwan photon source

    International Nuclear Information System (INIS)

    Liao, C.Y.; Chen, J.; Cheng, Y.S.; Hsu, K.T.; Hu, K.H.; Kuo, C.H.; Wu, C.Y.

    2012-01-01

    Design and implementation of image acquisition and analysis is in progress for Taiwan Photon Source (TPS) diagnostic applications. The optical system contains screen, lens, and lighting system. A CCD camera with Gigabit Ethernet interface (GigE Vision) will be the standard image acquisition device. Image acquisition will be done on an EPICS IOC via PV channel access, and the image properties will be analyzed with a Matlab tool to evaluate the beam profile (sigma), beam size, position, tilt angle, etc. The EPICS IOC integrated with Matlab as a data processing system can be used not only for image analysis but also for many other types of equipment data processing applications. Progress of the project will be summarized in this report. (authors)
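
    As an illustration of the kind of analysis described (the TPS system itself performs the equivalent processing in a Matlab tool), the sketch below grabs a camera frame from an assumed EPICS array PV and estimates the beam centroid and RMS size with NumPy; the PV name and sensor geometry are invented.

    # Beam profile estimation from a camera waveform PV (hypothetical name and shape).
    import numpy as np
    from epics import caget

    width, height = 1280, 960                        # assumed sensor geometry
    flat = caget("TPS:DIAG:CAM1:IMAGE")              # hypothetical waveform PV
    image = np.asarray(flat, dtype=float).reshape(height, width)

    y_idx, x_idx = np.indices(image.shape)
    total = image.sum()
    x0 = (image * x_idx).sum() / total               # centroid (pixels)
    y0 = (image * y_idx).sum() / total
    sigma_x = np.sqrt((image * (x_idx - x0) ** 2).sum() / total)   # RMS beam size
    sigma_y = np.sqrt((image * (y_idx - y0) ** 2).sum() / total)
    print(f"centroid=({x0:.1f}, {y0:.1f}) px, sigma=({sigma_x:.1f}, {sigma_y:.1f}) px")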

  12. Connect high speed analog-digital converter with EPICS based on LabVIEW

    International Nuclear Information System (INIS)

    Wang Wei; Chi Yunlong

    2008-01-01

    This paper introduces a method to connect a high speed analog-digital converter (ADC212/100) with EPICS on the Windows platform using LabVIEW. We use LabVIEW to communicate with the converter, then use interface sub-VIs between LabVIEW and EPICS to access the EPICS IOC via Channel Access (CA). Because of LabVIEW's easy-to-use graphical programming language, this method can shorten the development period and reduce manpower costs. (authors)

  13. A Consistent EPIC Visible Channel Calibration Using VIIRS and MODIS as a Reference.

    Science.gov (United States)

    Haney, C.; Doelling, D. R.; Minnis, P.; Bhatt, R.; Scarino, B. R.; Gopalan, A.

    2017-12-01

    The Earth Polychromatic Imaging Camera (EPIC) aboard the Deep Space Climate Observatory (DSCOVR) satellite constantly images the sunlit disk of Earth from the Lagrange-1 (L1) point in 10 spectral channels spanning the UV, VIS, and NIR spectra. Recently, the DSCOVR EPIC team has publicly released the version 2 dataset, which implements improved navigation, stray-light correction, and flat-fielding of the CCD array. The EPIC 2-year data record must be well-calibrated for consistent cloud, aerosol, trace gas, land use and other retrievals. Because EPIC lacks onboard calibrators, the observations made by EPIC channels must be calibrated vicariously using coincident measurements from radiometrically stable instruments that have onboard calibration systems. MODIS and VIIRS are the best-suited instruments for this task as they contain similar spectral bands that are well-calibrated onboard using solar diffusers and lunar tracking. We have previously calibrated the EPIC version 1 dataset by using EPIC and VIIRS angularly matched radiance pairs over both all-sky ocean and deep convective clouds (DCC). We noted that the EPIC images required navigation adjustments, and that the EPIC stray-light correction provided an offset term closer to zero based on the linear regression of the EPIC and VIIRS ray-matched radiance pairs. We will evaluate the EPIC version 2 navigation and stray-light improvements using the same techniques. In addition, we will monitor the EPIC channel calibration over the two years for any temporal degradation or anomalous behavior. These two calibration methods will be further validated using desert and DCC invariant Earth targets. The radiometric characterization of the selected invariant targets is performed using multiple years of MODIS and VIIRS measurements. Results of these studies will be shown at the conference.
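
    The ray-matching calibration described above boils down to a linear fit of coincident, angle-matched measurement pairs; the gain and offset of the EPIC channel follow from the regression. The sketch below shows that step with synthetic stand-in numbers, not DSCOVR or VIIRS data.

    # Linear regression of (EPIC counts, matched VIIRS radiance) pairs; values are synthetic.
    import numpy as np

    epic_counts = np.array([120.0, 340.0, 610.0, 880.0, 1150.0])   # EPIC channel counts
    viirs_rad   = np.array([10.2, 29.8, 54.1, 77.9, 101.6])        # matched radiances

    gain, offset = np.polyfit(epic_counts, viirs_rad, 1)           # radiance = gain*counts + offset
    print(f"gain = {gain:.4f}, offset = {offset:.2f}")             # offset near zero after stray-light correction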

  14. EPICS based control system for cryogenic plant at VECC

    International Nuclear Information System (INIS)

    Panda, Umashankar; Pal, Sandip; Mandal, Anupam; Dey, Ranadhir

    2012-01-01

    The cryogenic plant of the Variable Energy Cyclotron Centre consists of two helium refrigerators (250 W and 415 W at 4.5 K), a valve box with sub-cooler and associated sub-systems such as pure gas storage, a helium purifier and impure gas recovery. The system also includes a 3.1K liter liquid nitrogen (LN2) storage and delivery system. Many of the systems are procured from different suppliers and some are also developed in house. Due to the variety of systems and suppliers, the control philosophies, communication protocols and components also differ, so the supervisory control and data acquisition (SCADA) module has to take care of this variance and bring everything onto a common control platform. For this purpose the EPICS (Experimental Physics and Industrial Control System) architecture has been adopted. EPICS has the advantage of being open source, flexible and unrestricted, as compared to commercial SCADA packages. (author)

  15. Application of Lean Manufacturing Tools in the Food and Beverage Industries

    Directory of Open Access Journals (Sweden)

    Rui Borges Lopes

    2015-10-01

    Full Text Available Recent years have shown an increasing use of lean manufacturing (LM) principles and tools in several industrial sectors. Already a well-established management philosophy, it has shown numerous successful applications even outside production environments. This work presents the application of some LM tools, and the corresponding shift in philosophy, in two Portuguese companies of the food and beverage industries. Main implementation issues are presented and discussed, followed by the results obtained from the application of LM tools in the production system of these companies. Significant gains are obtained in both companies and, more importantly, it instills a continuous improvement culture and increases production flexibility while reducing lead times.

  16. Tools and tool application for the dismantling of the nuclear power plant Brennilis in France

    International Nuclear Information System (INIS)

    Bienia, Harald; Welbers, Philipp; Krueger, Peter; Noll, Thomas

    2012-01-01

    The EL-4 reactor at the Brennilis NPP in France is a CO2 cooled, heavy water moderated test reactor with a net power of 70 MW; the reactor started operation in 1967 and was decommissioned in 1985. Due to its construction features it was not necessary to enter the reactor area during operation, and therefore the reactor pressure vessel and the surrounding piping systems were built in a very compact way. The dismantling procedures are therefore different from those for German BWR or PWR systems, and the remote cutting and handling tools have to be adapted to these features. Because of the high local dose rate in the reactor hall, the erection of the dismantling equipment must also be performed by robot systems. For cutting the piping system, a new plasma cutting technique, the hot wire method, will be used. Other, mechanical cutting techniques have to be used, for instance for zircaloy containing components, for fire prevention purposes. The time required for tool and manipulator changes, including wear part replacements, constitutes a significant part of the dismantling schedule. The suction/exhaust system for radioactive dust removal allowed a reduction of the total personnel dose by one third of the allowed dose.

  17. A versatile trigger and synchronization module with IEEE1588 capabilities and EPICS support

    International Nuclear Information System (INIS)

    Lopez, J.M.; Ruiz, M.; Borrego, J.; Arcas, G. de; Barrera, E.; Vega, J.

    2010-01-01

    Event timing and synchronization are two key aspects to improve in the implementation of distributed data acquisition (dDAQ) systems such as the ones used in fusion experiments. The integration of dDAQ systems into control and measurement networks is also of great importance. This paper analyzes the applicability of the IEEE1588 and EPICS standards to solve these problems, and presents a hardware module implementation based on both of them that allows these functionalities to be added to any DAQ. The IEEE1588 standard facilitates the integration of event timing and synchronization mechanisms in distributed data acquisition systems based on IEEE 802.3 (Ethernet). An optimal implementation of such a system requires the use of network interface devices which include specific hardware resources devoted to the IEEE1588 functionalities. Unfortunately, this is not the approach followed in most of the large number of applications available nowadays; most solutions are based on software and use standard hardware network interfaces. This paper presents the development of a hardware module (GI2E) with IEEE1588 capabilities which includes USB, RS232, RS485 and CAN interfaces. This permits any DAQ element that uses these interfaces to be integrated into dDAQ systems in an efficient and simple way. The module has been developed with Motorola's Coldfire MCF5234 processor and National Semiconductor's PHY DP83640T, giving it the possibility to implement the PTP protocol of IEEE1588 in hardware, and therefore increasing its performance over other implementations based on software. To facilitate the integration of the dDAQ system in control and measurement networks, the module includes basic Input/Output Controller (IOC) functionality from the Experimental Physics and Industrial Control System (EPICS) architecture. The paper discusses the implementation details of this module and presents its use in advanced dDAQ applications in the fusion community.

  18. Development of KOMAC Beam Monitoring System Using EPICS

    Energy Technology Data Exchange (ETDEWEB)

    Song, Young-Gi; Yun, Sang-Pil; Kim, Han-Sung; Kwon, Hyeok-Jung; Cho, Yong-Sub [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    The beam loss signals must be digitized and the sampling has to be synchronized to a reference signal which is an external trigger for beam operation. The digitized data must be accessible by the Experimental Physics and Industrial Control System (EPICS)-based control system, which manages the whole accelerator control. In order to satisfy this requirement, an Input/Output Controller (IOC), which runs Linux on a CPU module with PCI Express based Analog to Digital Converter (ADC) modules, has been adopted. An associated Linux driver and EPICS device support module have also been developed. The IOC meets the requirements, and the development and maintenance of the software for the IOC is considerably efficient. The data acquisition system running EPICS will be used during the beam power increase phase of the KOrea Multi-purpose Accelerator Complex (KOMAC). The beam monitoring system integrates BLM and BPM signals into the control system and offers real-time data to operators. The IOC, which is implemented with Linux and a PCI driver, supports data acquisition as a very flexible solution.

  19. Epic and Romance in The Lord Of The Rings

    Directory of Open Access Journals (Sweden)

    Martin Simonson

    2016-10-01

    Full Text Available In the field of comparative literature The Lord of the Rings has been most frequently studied within the contexts of romance and epic. This approach, however, leaves out important generic aspects of the global picture, such as the narrative’s strong adherence to the novel genre and to mythic traditions beyond romance and epic narratives. If we choose one particular genre as the yardstick against which to measure the work’s success in narrative terms, we tend to end up with the conclusion that The Lord of the Rings does not quite make sense within the given limits of the genre in question. In Tolkien’s work there is a narrative and stylistic exploration of the different genres’ constraints in which the Western narrative traditions – myth, epic, romance, the novel, and their respective subgenres – interact in a previously unknown but still very much coherent world that, because of the particular cohesion required by such a chronotope, exhibits a clear contextualization of references to the previous traditions. As opposed to many contemporary literary expressions, the ensuing absence of irony and parody creates a generic dialogue, in which the various narrative traditions explore and interrogate each other’s limits without rendering the others absurdly incompatible, ridiculous or superfluous.

  20. Development of KOMAC Beam Monitoring System Using EPICS

    International Nuclear Information System (INIS)

    Song, Young-Gi; Yun, Sang-Pil; Kim, Han-Sung; Kwon, Hyeok-Jung; Cho, Yong-Sub

    2014-01-01

    The beam loss signals must be digitized and the sampling has to be synchronized to a reference signal which is an external trigger for beam operation. The digitized data must be accessible by the Experimental Physics and Industrial Control System (EPICS)-based control system, which manages the whole accelerator control. In order to satisfy this requirement, an Input/Output Controller (IOC), which runs Linux on a CPU module with PCI Express based Analog to Digital Converter (ADC) modules, has been adopted. An associated Linux driver and EPICS device support module have also been developed. The IOC meets the requirements, and the development and maintenance of the software for the IOC is considerably efficient. The data acquisition system running EPICS will be used during the beam power increase phase of the KOrea Multi-purpose Accelerator Complex (KOMAC). The beam monitoring system integrates BLM and BPM signals into the control system and offers real-time data to operators. The IOC, which is implemented with Linux and a PCI driver, supports data acquisition as a very flexible solution.

  1. Practical Applications of Quality Tools in Polish Manufacturing Companies

    Directory of Open Access Journals (Sweden)

    Starzyńska Beata

    2014-08-01

    Full Text Available Background and Purpose: Modern companies have found themselves in a situation where the ability to adapt dynamically to changing market conditions is a key competitive advantage. Therefore they are continually searching for intensive ways of improving their processes and products. The basis for implementing such a strategy is the efficient use of information resources. In quality management, appropriate tools and techniques equip decision-makers with the information necessary to take correction, corrective, preventive and, finally, improvement actions.

  2. Formulation of EPICS record naming conventions in J-PARC linac and RCS. Build process of unique and standardized name

    International Nuclear Information System (INIS)

    Fukuta, Shinpei; Kawase, Masato; Kikuzawa, Nobuhiro; Watanabe, Kazuhiko; Sakaki, Hironao; Takahashi, Hiroki

    2011-02-01

    J-PARC (Japan Proton Accelerator Research Complex) accelerator devices are controlled using software called EPICS (Experimental Physics and Industrial Control System). A unique name, called an EPICS record, is given to each control signal and data acquisition item, and accelerator device control is achieved using these EPICS records. There are two requirements for an EPICS record name: (1) no overlap between EPICS record names, and (2) the control contents can be easily inferred from the EPICS record name. To manage EPICS records in a relational database for the information management of J-PARC accelerator devices, a naming structure is required so that mechanical processing can be performed easily. It was necessary to standardize the EPICS record name and the EPICS record structure to achieve these requirements. Therefore, we have formulated a guideline called 'EPICS record naming conventions' to determine EPICS record names uniquely and in a standardized way. The abbreviated keyword list of the accelerator devices and control signals that compose the EPICS record names is appended to the EPICS record naming conventions. (author)
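
    The actual conventions are defined in the report and its appended keyword list; purely as a hedged illustration of how such a rule can be checked mechanically before records are registered in the relational database, the sketch below validates a hypothetical SYSTEM:DEVICE:SIGNAL pattern and rejects duplicates.

    # Hypothetical record-name check (the real J-PARC pattern may differ).
    import re

    RECORD_NAME = re.compile(r"^[A-Z0-9]{2,6}:[A-Z0-9_]{2,12}:[A-Z0-9_]{2,12}$")

    def is_valid(name, registered):
        """Accept a name only if it matches the pattern and is not already taken."""
        return bool(RECORD_NAME.match(name)) and name not in registered

    registered = {"LI:BPM01:PHASE"}
    print(is_valid("LI:BPM02:PHASE", registered))   # True
    print(is_valid("LI:BPM01:PHASE", registered))   # False, duplicate name
    print(is_valid("li:bpm03:phase", registered))   # False, lower case not allowed in this pattern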

  3. Development and first application of an operating events ranking tool

    International Nuclear Information System (INIS)

    Šimić, Zdenko; Zerger, Benoit; Banov, Reni

    2015-01-01

    Highlights: • A method using the analytical hierarchy process for ranking operating events is developed and tested. • The method is applied to 5 years of U.S. NRC Licensee Event Reports (1453 events). • Uncertainty and sensitivity of the ranking results are evaluated. • Real events assessment shows the potential of the method for operating experience feedback. - Abstract: Operating experience feedback is important for maintaining and improving safety and availability in nuclear power plants. Detailed investigation of all events is challenging since it requires excessive resources, especially in the case of large event databases. This paper presents an event groups ranking method to complement the analysis of individual operating events. The basis for the method is the use of an internationally accepted events characterization scheme that allows different ways of events grouping and ranking. The ranking method itself consists of implementing the analytical hierarchy process (AHP) by means of a custom developed tool which allows events ranking based on ranking indexes pre-determined by expert judgment. Following the development phase, the tool was applied to analyze a complete set of 5 years of real nuclear power plant operating events (1453 events). The paper presents the potential of this ranking method to identify possible patterns throughout the event database and therefore to give additional insights into the events, as well as to give quantitative input for the prioritization of further, more detailed investigation of selected event groups.
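
    The ranking step rests on the analytical hierarchy process; as a generic AHP sketch (not the authors' custom tool), the snippet below derives priority weights for three event groups from a reciprocal pairwise-comparison matrix via its principal eigenvector and checks the consistency ratio. The judgment values are invented.

    # Generic AHP priority calculation for a 3x3 pairwise-comparison matrix.
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],     # expert judgments: group 1 vs 2 vs 3
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                    # principal eigenvalue
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                       # normalized priority vector

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
    ri = 0.58                                      # random index for n = 3 (Saaty)
    print("weights:", np.round(weights, 3), "consistency ratio:", round(ci / ri, 3))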

  4. Geomorphic Unit Tool (GUT): Applications of Fluvial Mapping

    Science.gov (United States)

    Kramer, N.; Bangen, S. G.; Wheaton, J. M.; Bouwes, N.; Wall, E.; Saunders, C.; Bennett, S.; Fortney, S.

    2017-12-01

    Geomorphic units are the building blocks of rivers and represent distinct habitat patches for many fluvial organisms. We present the Geomorphic Unit Toolkit (GUT), a flexible GIS geomorphic unit mapping tool, to generate maps of fluvial landforms from topography. GUT applies attributes to landforms based on flow stage (Tier 1), topographic signatures (Tier 2), geomorphic characteristics (Tier 3) and patch characteristics (Tier 4) to derive attributed maps at the level of detail required by analysts. We hypothesize that if more rigorous and consistent geomorphic mapping is conducted, better correlations between physical habitat units and ecohydraulic model results will be obtained compared to past work. Using output from GUT for coarse bed tributary streams in the Columbia River Basin, we explore relationships between salmonid habitat and geomorphic spatial metrics. We also highlight case studies of how GUT can be used to showcase geomorphic impact from large wood restoration efforts. Provided high resolution topography exists, this tool can be used to quickly assess changes in fluvial geomorphology in watersheds impacted by human activities.

  5. Application of corrosion screening tools for riser inspection

    International Nuclear Information System (INIS)

    Zamir Mohamed Daud; Vijayan, S.

    2003-01-01

    As offshore facilities approach the end of their design life, owners would like to assess the condition and integrity of plant and equipment. Detailed inspection, including non-destructive testing (NDT), is implemented and the results are utilised for predictive maintenance and estimating useful remaining life. Except for risk based inspection, the extent of surface coverage required would be greater compared to inspection of pre-determined spots. Risers, for example, usually have several layers of coating that prevent the use of conventional techniques for inspection of corrosion. Complete coverage requires access (including removal of coatings and insulation). Inspection utilising conventional NDT tools can be very slow and expensive. However, recent advances have promoted the use of specialised NDT techniques that were developed for inspection of corrosion under insulation (CUI). This paper details two screening inspection tools, LIXI Profiler and RTD-INCOTEST, that have been applied to the inspection of risers. The LIXI Profiler is based on the attenuation of penetrating radiation by materials, and RTD-INCOTEST is based on the decay of pulsed eddy currents in materials. (Author)

  6. [Application of microelectronics CAD tools to synthetic biology].

    Science.gov (United States)

    Madec, Morgan; Haiech, Jacques; Rosati, Élise; Rezgui, Abir; Gendrault, Yves; Lallement, Christophe

    2017-02-01

    Synthetic biology is an emerging science that aims to create new biological functions that do not exist in nature, based on the knowledge acquired in life science over the last century. Since the beginning of this century, several projects in synthetic biology have emerged. The complexity of the developed artificial bio-functions is relatively low, so that empirical design methods could be used for the design process. Nevertheless, with the increasing complexity of biological circuits, this is no longer the case, and a large number of computer-aided design software tools have been developed in the past few years. These tools include languages for the behavioral description and the mathematical modelling of biological systems, simulators at different levels of abstraction, libraries of biological devices and circuit design automation algorithms. All of these tools already exist in other fields of engineering sciences, particularly in microelectronics. This is the approach that is put forward in this paper. © 2017 médecine/sciences – Inserm.

  7. Case study applications of the BASINS climate assessment tool (CAT)

    Science.gov (United States)

    This EPA report will illustrate the application of different climate assessment capabilities within EPA’s BASINS modeling system for assessing a range of potential questions about the effects of climate change on streamflow and water quality in different watershed settings and us...

  8. ACHILLES AS A MARKETING TOOL FOR VIRTUAL HERITAGE APPLICATIONS

    Directory of Open Access Journals (Sweden)

    Mohamed Nabil Arafa

    2017-11-01

    Full Text Available Virtual Reality technology has made it possible for people to visit places and enjoy different exciting experiences while remaining at home. It gives an opportunity to enjoy the past at its best. Virtual Reality was introduced in 1929 with interactive training devices that simulated fighter planes. In 1957, the Sensorama simulator was designed which could generate city smells and wind sensations. The need for tourism to become virtual becomes more urgent than ever before. Virtual Reality applications provide this chance, not only in place, but in time as well. This paper presents a guide to the heritage applications' builders and marketers to reach more online users. The paper helps the builder to understand the consumer behaviour for marketing research. The paper illustrates eight levels, with each one leading to the next. The author named the eight levels A.C.H.I.L.L.E.S. Each letter represents a level; beginning with the awareness and ending with the sustainability. ACHILLES represents a sequence that shows three main phases of mobile application usage. It aims for a better management for the online visitors' engagement. This aim can be accomplished through the understanding of the different stages that the online visitors go through. In addition, it shows the correlation between the users and the mobile application.

  9. Towards evolution-guided microbial engineering - tools development and applications

    DEFF Research Database (Denmark)

    Genee, Hans Jasper

    is the development of highly robust biosensor-based synthetic selection systems that enable high-throughput functional interrogation of complex phenotypic libraries. Using the model organism Escherichia coli as a host, I deploy these systems to i) perform metagenome wide sequence-independent identification of novel...... for microbial engineering and demonstrates direct applications to gene discovery, protein engineering and cell factory development....

  10. Identification of New Tools to Predict Surgical Performance of Novices using a Plastic Surgery Simulator.

    Science.gov (United States)

    Kazan, Roy; Viezel-Mathieu, Alex; Cyr, Shantale; Hemmerling, Thomas M; Lin, Samuel J; Gilardino, Mirko S

    2018-04-09

    To identify new tools capable of predicting the surgical performance of novices on an augmentation mammoplasty simulator. The pace of technical skills acquisition varies between residents and may necessitate more time than that allotted by residency training before reaching competence. Identifying applicants with superior innate technical abilities might shorten learning curves and the time to reach competence. The objective of this study is to identify new tools that could predict the surgical performance of novices on a mammoplasty simulator. We recruited 14 medical students and recorded their performance in 2 skill games, Mikado and Perplexus Epic, and in 2 video games, Star Wars Racer (Sony PlayStation 3) and Super Monkey Ball 2 (Nintendo Wii). Then, each participant performed an augmentation mammoplasty procedure on a Mammoplasty Part-task Trainer, which allows the simulation of the essential steps of the procedure. The average age of participants was 25.4 years. Correlation studies showed significant associations between the Perplexus Epic, Star Wars Racer and Super Monkey Ball scores and the modified OSATS score, with rs values ranging from 0.7309 (p < 0.003) to 0.8491, but not with the Mikado score (rs = -0.0255, p = 0.9). Linear regressions were strongest for the Perplexus Epic and Super Monkey Ball scores, with coefficients of determination of 0.59 and 0.55, respectively. A combined score (Perplexus/Super-Monkey-Ball) was computed and showed a significant correlation with the modified OSATS score, with rs = 0.8107 (p < 0.001) and R2 = 0.75. This study identified a combination of skill games that correlated with better performance of novices on a surgical simulator. With refinement, such tools could serve to help screen plastic surgery applicants and identify those with higher surgical performance predictors. Copyright © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
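
    For readers unfamiliar with the statistic reported above, Spearman's rank correlation measures how well one set of scores tracks another in rank order; the minimal example below uses invented game and OSATS scores, not the study data.

    # Spearman rank correlation with scipy; the score lists are made up.
    from scipy.stats import spearmanr

    game_scores  = [12, 30, 25, 41, 18, 36, 22, 47, 15, 33, 28, 39, 20, 44]
    osats_scores = [10, 17, 16, 24, 12, 20, 15, 25, 9, 21, 18, 22, 13, 26]

    r_s, p_value = spearmanr(game_scores, osats_scores)
    print(f"r_s = {r_s:.3f}, p = {p_value:.2g}")   # about 0.98 for these invented scores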

  11. PC graphics generation and management tool for real-time applications

    Science.gov (United States)

    Truong, Long V.

    1992-01-01

    A graphics tool was designed and developed for easy generation and management of personal computer graphics. It also provides methods and 'run-time' software for many common artificial intelligence (AI) or expert system (ES) applications.

  12. Transgene Expression in Microalgae-From Tools to Applications.

    Science.gov (United States)

    Doron, Lior; Segal, Na'ama; Shapira, Michal

    2016-01-01

    Microalgae comprise a biodiverse group of photosynthetic organisms that reside in water sources and sediments. The green microalgae Chlamydomonas reinhardtii was adopted as a useful model organism for studying various physiological systems. Its ability to grow under both photosynthetic and heterotrophic conditions allows efficient growth of non-photosynthetic mutants, making Chlamydomonas a useful genetic tool to study photosynthesis. In addition, this green alga can grow as haploid or diploid cells, similar to yeast, providing a powerful genetic system. As a result, easy and efficient transformation systems have been developed for Chlamydomonas, targeting both the chloroplast and nuclear genomes. Since microalgae comprise a rich repertoire of species that offer variable advantages for biotech and biomed industries, gene transfer technologies were further developed for many microalgae to allow for the expression of foreign proteins of interest. Expressing foreign genes in the chloroplast enables the targeting of foreign DNA to specific sites by homologous recombination. Chloroplast transformation also allows for the introduction of genes encoding several enzymes from a complex pathway, possibly as an operon. Expressing foreign proteins in the chloroplast can also be achieved by introducing the target gene into the nuclear genome, with the protein product bearing a targeting signal that directs import of the transgene-product into the chloroplast, like other endogenous chloroplast proteins. Integration of foreign genes into the nuclear genome is mostly random, resulting in large variability between different clones, such that extensive screening is required. The use of different selection modalities is also described, with special emphasis on the use of herbicides and metabolic markers which are considered to be friendly to the environment, as compared to drug-resistance genes that are commonly used. Finally, despite the development of a wide range of transformation

  13. Transgene expression in microalgae – from tools to applications

    Directory of Open Access Journals (Sweden)

    Lior eDoron

    2016-04-01

    Full Text Available Microalgae comprise a biodiverse group of photosynthetic organisms that reside in water sources and sediments. The green microalgae Chlamydomonas reinhardtii was adopted as a useful model organism for studying various physiological systems. Its ability to grow under both photosynthetic and heterotrophic conditions allows efficient growth of non-photosynthetic mutants, making Chlamydomonas a useful genetic tool to study photosynthesis. In addition, this green alga can grow as haploid or diploid cells, similar to yeast, providing a powerful genetic system. As a result, easy and efficient transformation systems have been developed for Chlamydomonas, targeting both the chloroplast and nuclear genomes. Since microalgae comprise a rich repertoire of species that offer variable advantages for biotech and biomed industries, gene transfer technologies were further developed for many microalgae to allow for the expression of foreign proteins of interest. Expressing foreign genes in the chloroplast enables the targeting of foreign DNA to specific sites by homologous recombination. Chloroplast transformation also allows for the introduction of genes encoding several enzymes from a complex pathway, possibly as an operon. Expressing foreign proteins in the chloroplast can also be achieved by introducing the target gene into the nuclear genome, with the protein product bearing a targeting signal that directs import of the transgene-product into the chloroplast, like other endogenous chloroplast proteins. Integration of foreign genes into the nuclear genome is mostly random, resulting in large variability between different clones, such that extensive screening is required. The use of different selection modalities is also described, with special emphasis on the use of herbicides and metabolic markers which are considered to be friendly to the environment, as compared to drug-resistance genes that are commonly used. Finally, despite the development of a wide

  14. An Evaluation Tool for Agricultural Health and Safety Mobile Applications.

    Science.gov (United States)

    Reyes, Iris; Ellis, Tammy; Yoder, Aaron; Keifer, Matthew C

    2016-01-01

    As the use of mobile devices and their software applications, or apps, becomes ubiquitous, use amongst agricultural working populations is expanding as well. The smart device paired with a well-designed app has potential for improving workplace health and safety in the hands of those who can act upon the information provided. Many apps designed to assess workplace hazards and implementation of worker protections already exist. However, the abundance and diversity of such applications also presents challenges regarding evaluation practices and assignation of value. This is particularly true in the agricultural workspace, as there is currently little information on the value of these apps for agricultural safety and health. This project proposes a framework for developing and evaluating apps that have potential usefulness in agricultural health and safety. The evaluation framework is easily transferable, with little modification for evaluation of apps in several agriculture-specific areas.

  15. Applications of inventory difference tool at Los Alamos Plutonium Facility

    International Nuclear Information System (INIS)

    Hench, K.W.; Longmire, V.; Yarbro, T.F.; Zardecki, A.

    1998-01-01

    A prototype computer program reads the inventory entries directly from the Microsoft Access database. Based on historical data, the program then displays temporal trends and constructs a library of rules that encapsulate the system behavior. The analysis of inventory data is illustrated using a combination of realistic and simulated facility examples. Potential payoffs of this methodology include a reduction in time and resources needed to perform statistical tests and a broad applicability to DOE needs such as treaty verification

  16. Mojo Hand, a TALEN design tool for genome editing applications

    Directory of Open Access Journals (Sweden)

    Neff Kevin L

    2013-01-01

    Full Text Available Abstract Background Recent studies of transcription activator-like (TAL) effector domains fused to nucleases (TALENs) demonstrate enormous potential for genome editing. Effective design of TALENs requires a combination of selecting appropriate genetic features, finding pairs of binding sites based on a consensus sequence, and, in some cases, identifying endogenous restriction sites for downstream molecular genetic applications. Results We present the web-based program Mojo Hand for designing TAL and TALEN constructs for genome editing applications (http://www.talendesign.org). We describe the algorithm and its implementation. The features of Mojo Hand include (1) automatic download of genomic data from the National Center for Biotechnology Information, (2) analysis of any DNA sequence to reveal pairs of binding sites based on a user-defined template, (3) selection of restriction-enzyme recognition sites in the spacer between the TAL monomer binding sites including options for the selection of restriction enzyme suppliers, and (4) output files designed for subsequent TALEN construction using the Golden Gate assembly method. Conclusions Mojo Hand enables the rapid identification of TAL binding sites for use in TALEN design. The assembly of TALEN constructs is also simplified by using the TAL-site prediction program in conjunction with a spreadsheet management aid of reagent concentrations and TALEN formulation. Mojo Hand enables scientists to more rapidly deploy TALENs for genome editing applications.

  17. Electrochemical immunosensors - A powerful tool for analytical applications.

    Science.gov (United States)

    Felix, Fabiana S; Angnes, Lúcio

    2018-04-15

    Immunosensors are biosensors based on interactions between an antibody and antigen on a transducer surface. Either antibody or antigen can be the species immobilized on the transducer to detect antigen or antibody, respectively. Because of the strong binding forces between these biomolecules, immunosensors present high selectivity and very high sensitivity, making them very attractive for many applications in different science fields. Electrochemical immunosensors explore measurements of an electrical signal produced on an electrochemical transducer. This signal can be voltammetric, potentiometric, conductometric or impedimetric. Immunosensors utilizing electrochemical detection have been explored in several analyses since they are specific, simple, portable, and generally disposable and can carry out in situ or automated detection. This review addresses the potential of immunosensors destined for application in food and environmental analysis, and cancer biomarker diagnosis. Emphasis is given to the approaches that have been used for construction of electrochemical immunosensors. Additionally, the fundamentals of immunosensors, the technology of transducers and nanomaterials, and a general overview of the possible applications of electrochemical immunosensors to the food, environmental, and disease analysis fields are described. Copyright © 2017. Published by Elsevier B.V.

  18. Mojo Hand, a TALEN design tool for genome editing applications.

    Science.gov (United States)

    Neff, Kevin L; Argue, David P; Ma, Alvin C; Lee, Han B; Clark, Karl J; Ekker, Stephen C

    2013-01-16

    Recent studies of transcription activator-like (TAL) effector domains fused to nucleases (TALENs) demonstrate enormous potential for genome editing. Effective design of TALENs requires a combination of selecting appropriate genetic features, finding pairs of binding sites based on a consensus sequence, and, in some cases, identifying endogenous restriction sites for downstream molecular genetic applications. We present the web-based program Mojo Hand for designing TAL and TALEN constructs for genome editing applications (http://www.talendesign.org). We describe the algorithm and its implementation. The features of Mojo Hand include (1) automatic download of genomic data from the National Center for Biotechnology Information, (2) analysis of any DNA sequence to reveal pairs of binding sites based on a user-defined template, (3) selection of restriction-enzyme recognition sites in the spacer between the TAL monomer binding sites, including options for the selection of restriction enzyme suppliers, and (4) output files designed for subsequent TALEN construction using the Golden Gate assembly method. Mojo Hand enables the rapid identification of TAL binding sites for use in TALEN design. The assembly of TALEN constructs is also simplified by using the TAL-site prediction program in conjunction with a spreadsheet management aid for reagent concentrations and TALEN formulation. Mojo Hand enables scientists to more rapidly deploy TALENs for genome editing applications.
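    The abstract does not reproduce Mojo Hand's algorithm, so the following is only a toy sketch of the general idea behind any TALEN-pair search: find two candidate binding sites, each preceded by a 5' T on its own strand, separated by a spacer of a fixed length range. The site length, spacer range, and demo sequence are illustrative assumptions rather than Mojo Hand defaults.

    # Toy TALEN-pair scan: two sites preceded by a 5' T on opposite strands,
    # separated by a spacer. All lengths are assumed values, not Mojo Hand defaults.
    import re

    def revcomp(seq):
        return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

    def find_talen_pairs(seq, site_len=16, spacer=(14, 18)):
        seq = seq.upper()
        pairs = []
        for m in re.finditer("T", seq):                 # left site follows a 5' T
            start = m.end()
            left = seq[start:start + site_len]
            if len(left) < site_len:
                continue
            for gap in range(spacer[0], spacer[1] + 1):
                right_start = start + site_len + gap
                right = seq[right_start:right_start + site_len]
                nxt = seq[right_start + site_len:right_start + site_len + 1]
                # an A here is a T 5' of the right site on the reverse strand
                if len(right) == site_len and nxt == "A":
                    pairs.append((start, left, gap, revcomp(right)))
        return pairs

    demo = "TACGTTGACCTGATCGATGGTCATGCATGCTAGCTAGGCTAAGCTTACGATCGATCGTAGCTAAGC"
    for pos, left, gap, right_rc in find_talen_pairs(demo):
        print(pos, left, gap, right_rc)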

  19. Simulation tools for industrial applications of phased array inspection techniques

    International Nuclear Information System (INIS)

    Mahaut, St.; Roy, O.; Chatillon, S.; Calmon, P.

    2001-01-01

    Ultrasonic phased array techniques have been developed at the French Atomic Energy Commission in order to improve defect characterization and adaptability to various inspection configurations (complex geometry specimens). Such transducers allow 'standard' techniques (adjustable beam steering and focusing) or more 'advanced' techniques (self-focusing on defects, for instance). To estimate the performances of those techniques, models have been developed which allow computation of the ultrasonic field radiated by an arbitrary phased array transducer through any complex specimen, and prediction of the ultrasonic response of various defects inspected with a known beam. Both modeling applications are gathered in the CIVA software, dedicated to NDT expertise. The use of those complementary models makes it possible to evaluate the ability of a phased array to steer and focus the ultrasonic beam, and therefore its relevance for detecting and characterizing defects. These models are specifically developed to give accurate solutions to realistic inspection applications. This paper briefly describes the CIVA models and presents some applications dedicated to the inspection of complex specimens containing various defects with a phased array used to steer and focus the beam. Defect detection and characterization performances are discussed for the various configurations. Some experimental validations of both models are also presented. (authors)
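    The CIVA models themselves are not described in detail in this record; for readers unfamiliar with phased-array focal laws, the sketch below shows only the textbook delay-law calculation behind beam steering and focusing (each element fires earlier the farther it sits from the focal point). The array geometry and sound velocity are example values with no relation to any CIVA configuration.

    # Textbook focal-law sketch: per-element delays that focus a linear array at a point.
    # Array geometry and velocity are example values, unrelated to any CIVA model.
    import numpy as np

    def focal_delays(n_elements=16, pitch=0.6e-3, focus=(5e-3, 20e-3), c=5900.0):
        """Return firing delays (s) focusing at (x, z); c is an assumed velocity in steel."""
        x = (np.arange(n_elements) - (n_elements - 1) / 2.0) * pitch   # element positions
        fx, fz = focus
        tof = np.hypot(x - fx, fz) / c        # time of flight from each element to the focus
        return tof.max() - tof                 # fire the farthest element first (delay 0)

    delays = focal_delays()
    print(np.round(delays * 1e9, 1), "ns")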

  20. New applications of statistical tools in plant pathology.

    Science.gov (United States)

    Garrett, K A; Madden, L V; Hughes, G; Pfender, W F

    2004-09-01

    The series of papers introduced by this one address a range of statistical applications in plant pathology, including survival analysis, nonparametric analysis of disease associations, multivariate analyses, neural networks, meta-analysis, and Bayesian statistics. Here we present an overview of additional applications of statistics in plant pathology. An analysis of variance based on the assumption of normally distributed responses with equal variances has been a standard approach in biology for decades. Advances in statistical theory and computation now make it convenient to appropriately deal with discrete responses using generalized linear models, with adjustments for overdispersion as needed. New nonparametric approaches are available for analysis of ordinal data such as disease ratings. Many experiments require the use of models with fixed and random effects for data analysis. New or expanded computing packages, such as SAS PROC MIXED, coupled with extensive advances in statistical theory, allow for appropriate analyses of normally distributed data using linear mixed models, and discrete data with generalized linear mixed models. Decision theory offers a framework in plant pathology for contexts such as the decision about whether to apply or withhold a treatment. Model selection can be performed using Akaike's information criterion. Plant pathologists studying pathogens at the population level have traditionally been the main consumers of statistical approaches in plant pathology, but new technologies such as microarrays supply estimates of gene expression for thousands of genes simultaneously and present challenges for statistical analysis. Applications to the study of the landscape of the field and of the genome share the risk of pseudoreplication, the problem of determining the appropriate scale of the experimental unit and of obtaining sufficient replication at that scale.
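    None of the cited analyses is reproduced in this record; the fragment below merely illustrates one technique the overview mentions, comparing generalized linear models for disease-count data by Akaike's information criterion, using statsmodels on simulated data.

    # Illustrative AIC-based comparison of two GLMs for disease-count data (simulated).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    rainfall = rng.uniform(0, 100, 60)
    lesions = rng.poisson(np.exp(0.5 + 0.02 * rainfall))      # simulated lesion counts

    X_null = np.ones((len(lesions), 1))                       # intercept-only model
    X_rain = sm.add_constant(rainfall)                        # model with rainfall

    fit_null = sm.GLM(lesions, X_null, family=sm.families.Poisson()).fit()
    fit_rain = sm.GLM(lesions, X_rain, family=sm.families.Poisson()).fit()

    print("AIC (intercept only):", round(fit_null.aic, 1))
    print("AIC (with rainfall): ", round(fit_rain.aic, 1))    # lower AIC is preferred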

  1. Transient analysis of power systems solution techniques, tools and applications

    CERN Document Server

    Martinez-Velasco, J

    2014-01-01

    A comprehensive introduction and up-to-date reference to SiC power semiconductor devices covering topics from material properties to applications. Based on a number of breakthroughs in SiC material science and fabrication technology in the 1980s and 1990s, the first SiC Schottky barrier diodes (SBDs) were released as commercial products in 2001. The SiC SBD market has grown significantly since that time, and SBDs are now used in a variety of power systems, particularly switch-mode power supplies and motor controls. SiC power MOSFETs entered commercial production in 2011, providing rugged, hig

  2. Average energetic ion flux variations associated with geomagnetic activity from EPIC/STICS on Geotail

    Science.gov (United States)

    Christon, S. P.; Gloeckler, G.; Eastman, T. E.; McEntire, R. W.; Roelof, E. C.; Lui, A. T. Y.; Williams, D. J.; Frank, L. A.; Paterson, W. R.; Kokubun, S.

    1996-01-01

    The magnetotail ion flux measurements from the Geotail spacecraft are analyzed both with and without the application of selection criteria that identify the plasma regime in which an observation is obtained. The different results are compared with each other. The initial results on the changes of energetic ion flux and composition correlated with average substorm activity in different magnetotail plasma regimes are discussed. The energetic ions are measured using the energetic particles and ion composition (EPIC) experiment and the suprathermal ion composition spectrometer (STICS). The plasma, wave and field instruments of the Geotail satellite were used to identify the principal magnetotail plasma regimes of plasma sheet, lobe, and magnetospheric boundary layer, as well as the magnetosheath and solar wind. Energetic O and H ions were observed in all the plasma regimes.

  3. Implementation of an EPICS IOC on an Embedded Soft Core Processor Using Field Programmable Gate Arrays

    International Nuclear Information System (INIS)

    Douglas Curry; Alicia Hofler; Hai Dong; Trent Allison; J. Hovater; Kelly Mahoney

    2005-01-01

    At Jefferson Lab, we have been evaluating soft core processors running an EPICS IOC over μClinux on our custom hardware. A soft core processor is a flexible CPU architecture that is configured in the FPGA, as opposed to a hard core processor, which is fixed in silicon. Combined with an on-board Ethernet port, the technology incorporates the IOC and digital control hardware within a single FPGA. By eliminating the general purpose computer IOC, the designer is no longer tied to a specific platform, e.g. PC, VME, or VXI, to serve as the intermediary between the high level controls and the field hardware. This paper will discuss the design and development process as well as specific applications for JLab's next generation low-level RF controls and Machine Protection Systems.

  4. Research tools application for female fashion underwear comfort assesment

    Directory of Open Access Journals (Sweden)

    Andreia Salvan Pagnan

    2016-06-01

    Within the universe of women's clothing, underwear long remained an afterthought with regard to the development of new textile materials, shapes, and colors. The panties once known as breeches or long underwear only became a necessity around the twentieth century, with the vaporous dresses of Christian Dior in the 1950s. Technological advances in the textile industry brought spandex, created by the American company DuPont and better known as Lycra. The elasticity of the fabric gave comfort to women's lingerie, and this attribute came to be considered a quality factor in lingerie. To understand the desires of users, a qualitative study was conducted with women aged 18-45, collecting opinions on the perceived comfort of existing models compared to a new one to be launched. Through the Quality Function Deployment (QFD) tool, the data obtained from the users' answers were interpreted so as to prioritize targets for the development of a product based on analyses of desired characteristics, which are converted into technical attributes.

  5. Hackathons as A Capacity Building Tool for Environmental Applications

    Science.gov (United States)

    Bye, B. L.; Mildorf, T.; Charvat, K.; Berre, A. J.

    2017-12-01

    Today's society requires easy, reliable and quick access to environmental information published by various organizations and initiatives. Environmental questions require many activities that produce various sorts of data: by authorities through the operation of instruments such as satellites, and through informal local and community activities producing videos, photos, or oral stories. The collected information can contribute to up-to-date data. Volunteered geographic information (VGI) is the harnessing of tools to create, assemble, and disseminate geographic data provided voluntarily by individuals. Under the INSPIRE (Infrastructure for Spatial Information in Europe) umbrella, a number of EU projects co-organize hackathons, the INSPIRE Hack. The INSPIRE Hack focuses on methods through which citizens are able to contribute to different environmental and societal issues through smart phones and other sensors. The INSPIRE Hack supports creativity, innovation, technical capabilities and knowledge sharing by combining open data, VGI, and data from citizens' observatories or other citizen science activities. This presentation offers a capacity-building perspective on the INSPIRE hackathons, their co-design aspects, and their agility with respect to accelerating technological and social innovation and effective uptake in societal use. Starting in Europe, the concept can be broadened to encompass all continents.

  6. Application of a Novel Tool for Diagnosing Bile Acid Diarrhoea

    Directory of Open Access Journals (Sweden)

    Karna D. Bardhan

    2013-09-01

    Bile acid diarrhoea (BAD) is a common disease that requires expensive imaging to diagnose. We have tested the efficacy of a new method to identify BAD, based on the detection of differences in volatile organic compounds (VOC) in urine headspace of BAD vs. ulcerative colitis and healthy controls. A total of 110 patients were recruited; 23 with BAD, 42 with ulcerative colitis (UC) and 45 controls. Patients with BAD also received standard imaging (Se75HCAT) for confirmation. Urine samples were collected and the headspace analysed using an AlphaMOS Fox 4000 electronic nose in combination with an Owlstone Lonestar Field Asymmetric Ion Mobility Spectrometer (FAIMS). A subset was also tested by gas chromatography-mass spectrometry (GCMS). Linear Discriminant Analysis (LDA) was used to explore both the electronic nose and FAIMS data. LDA showed statistical differences between the groups, with reclassification success rates (using an n-1 approach) at typically 83%. GCMS experiments confirmed these results and showed that patients with BAD had two chemical compounds, 2-propanol and acetamide, that were either not present or were in much reduced quantities in the ulcerative colitis and control samples. We believe that this work may lead to a new tool to diagnose BAD, which is cheaper, quicker and easier than current methods.

  7. Application of a Novel Tool for Diagnosing Bile Acid Diarrhoea

    Science.gov (United States)

    Covington, James A.; Westenbrink, Eric W.; Ouaret, Nathalie; Harbord, Ruth; Bailey, Catherine; O'Connell, Nicola; Cullis, James; Williams, Nigel; Nwokolo, Chuka U.; Bardhan, Karna D.; Arasaradnam, Ramesh P.

    2013-01-01

    Bile acid diarrhoea (BAD) is a common disease that requires expensive imaging to diagnose. We have tested the efficacy of a new method to identify BAD, based on the detection of differences in volatile organic compounds (VOC) in urine headspace of BAD vs. ulcerative colitis and healthy controls. A total of 110 patients were recruited; 23 with BAD, 42 with ulcerative colitis (UC) and 45 controls. Patients with BAD also received standard imaging (Se75HCAT) for confirmation. Urine samples were collected and the headspace analysed using an AlphaMOS Fox 4000 electronic nose in combination with an Owlstone Lonestar Field Asymmetric Ion Mobility Spectrometer (FAIMS). A subset was also tested by gas chromatography-mass spectrometry (GCMS). Linear Discriminant Analysis (LDA) was used to explore both the electronic nose and FAIMS data. LDA showed statistical differences between the groups, with reclassification success rates (using an n-1 approach) at typically 83%. GCMS experiments confirmed these results and showed that patients with BAD had two chemical compounds, 2-propanol and acetamide, that were either not present or were in much reduced quantities in the ulcerative colitis and control samples. We believe that this work may lead to a new tool to diagnose BAD, which is cheaper, quicker and easier than current methods. PMID:24018955
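    The raw electronic-nose and FAIMS data are not available in this record, so the snippet below only sketches the analysis pattern the abstract describes, Linear Discriminant Analysis with leave-one-out reclassification, applied here to simulated feature vectors standing in for the VOC measurements.

    # Sketch of LDA with leave-one-out reclassification on simulated "VOC" features.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(1)
    bad  = rng.normal(loc=1.0,  scale=1.0, size=(23, 8))   # 23 simulated BAD samples
    uc   = rng.normal(loc=0.0,  scale=1.0, size=(42, 8))   # 42 simulated UC samples
    ctrl = rng.normal(loc=-0.5, scale=1.0, size=(45, 8))   # 45 simulated controls

    X = np.vstack([bad, uc, ctrl])
    y = np.array([0] * 23 + [1] * 42 + [2] * 45)

    scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
    print(f"leave-one-out reclassification rate: {scores.mean():.0%}")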

  8. Application of a 16-bit microprocessor to the digital control of machine tools

    International Nuclear Information System (INIS)

    Issaly, Alain

    1979-01-01

    After an overview of machine tools (various types, definition standardization, associated technologies for motors and position sensors), this research thesis describes the principles of computer-based digital control: classification of machine tool command systems, machining programming, programming languages, dialog function, interpolation function, servo-control function, tool compensation function. The author reports the application of a 16-bit microprocessor to the computer-based digital control of a machine tool: feasibility, selection of microprocessor, hardware presentation, software development and description, machining mode, translation-loading mode

  9. Smart Cutting Tools and Smart Machining: Development Approaches, and Their Implementation and Application Perspectives

    Science.gov (United States)

    Cheng, Kai; Niu, Zhi-Chao; Wang, Robin C.; Rakowski, Richard; Bateman, Richard

    2017-09-01

    Smart machining has tremendous potential and is becoming one of the new generation of high-value precision manufacturing technologies, in line with the advance of Industry 4.0 concepts. This paper presents some innovative design concepts and, in particular, the development of four types of smart cutting tools, including a force-based smart cutting tool, a temperature-based internally-cooled cutting tool, a fast tool servo (FTS) and smart collets for ultraprecision and micro manufacturing purposes. Implementation and application perspectives of these smart cutting tools are explored and discussed particularly for smart machining against a number of industrial application requirements. These include contamination-free machining, machining of tool-wear-prone Si-based infra-red devices and medical applications, high speed micro milling and micro drilling, etc. Furthermore, implementation techniques are presented focusing on: (a) plug-and-produce design principle and the associated smart control algorithms, (b) piezoelectric film and surface acoustic wave transducers to measure cutting forces in process, (c) critical cutting temperature control in real-time machining, (d) in-process calibration through machining trials, (e) FE-based design and analysis of smart cutting tools, and (f) application exemplars on adaptive smart machining.

  10. PTaaS: Platform for Providing Software Developing Applications and Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    Cloud computing has become an established paradigm for enabling organizations to build scalable software systems and to meet challenges of rapid demand of computing and storage resources. There has been significant success in building cloud-enabled applications for many disciplines, ranging from ... technological support that is not limited to one specific tool or a particular phase of the software development life cycle. In this thesis, we have explored the possibility of offering software development applications and tools as services that can be acquired on demand according to the software ... with process. Information gained from the review of literature on GSD tools and processes is used to extract functional requirements for the middleware platform for provisioning of software development applications and tools as services. Findings from the review of literature on architecture solutions for cloud ...

  11. 78 FR 69363 - Lake Tahoe Basin Management Unit, California, Heavenly Mountain Resort Epic Discovery Project

    Science.gov (United States)

    2013-11-19

    ... DEPARTMENT OF AGRICULTURE Forest Service Lake Tahoe Basin Management Unit, California, Heavenly Mountain Resort Epic Discovery Project AGENCY: Lake Tahoe Basin Management Unit, Forest Service, USDA...: The Epic Discovery Project is intended to enhance summer activities in response to the USDA Forest...

  12. 75 FR 65985 - Safety Zone: Epic Roasthouse Private Party Firework Display, San Francisco, CA

    Science.gov (United States)

    2010-10-27

    ... the navigable waters of San Francisco Bay 1,000 yards off Epic Roasthouse Restaurant, San Francisco.... Wright, Program Manager, Docket Operations, telephone 202-366-9826. SUPPLEMENTARY INFORMATION: Regulatory... waters of San Francisco Bay, 1,000 yards off Epic Roasthouse Restaurant, San Francisco, CA. The fireworks...

  13. The application of human error prevention tool in Tianwan nuclear power station

    International Nuclear Information System (INIS)

    Qiao Zhiguo

    2013-01-01

    This paper mainly discusses the application and popularization of the human error prevention tool at the Tianwan nuclear power station, including studies of the project implementation background, main contents and innovation, performance management, innovation practice and development, and the performance of the innovation application. (authors)

  14. Evanescent field: A potential light-tool for theranostics application

    Science.gov (United States)

    Polley, Nabarun; Singh, Soumendra; Giri, Anupam; Pal, Samir Kumar

    2014-03-01

    A noninvasive or minimally invasive optical approach for theranostics, which would reinforce diagnosis, treatment, and preferably guidance simultaneously, is considered a major challenge in biomedical instrument design. In the present work, we have developed an evanescent field-based fiber optic strategy for a potential theranostics application in hyperbilirubinemia, an increased concentration of bilirubin in the blood that is a potential cause of permanent brain damage or even death in newborn babies. The potential problem of bilirubin deposition on the hydroxylated fiber surface at physiological pH (7.4), which masks the sensing efficacy and the extraction of information on the pigment level, has also been addressed. Removal of bilirubin in a blood-phantom (hemoglobin and human serum albumin) solution from an enhanced level of 77 μM/l (human jaundice >50 μM/l) to ~30 μM/l (normal level ~25 μM/l in humans) using our strategy has been successfully demonstrated. In a model experiment using chromatography paper as a mimic of a biological membrane, we have shown efficient degradation of the bilirubin under continuous monitoring, for guidance of the immediate/future course of action.

  15. Application of chemical tools to evaluate phytoremediation of weathered hydrocarbons

    International Nuclear Information System (INIS)

    Camp, H.; Kulakow, P.; Smart, D.R.; O'Reilly, K.

    2002-01-01

    The effectiveness of using phytoremediation methods to treat soils contaminated with hydrocarbons was tested in a three-year study at a site in northern California at a treatment pond for refinery process water. The treatment pond was drained several years ago and is targeted for cleanup. The petroleum hydrocarbons from the refinery waste were already highly degraded from natural weathering processes by the time the study began. The soil consists of about 23 per cent sand, 38 per cent silt, and 39 per cent clay. The study followed the Environmental Protection Agency's standardized field protocol and analytical approach. During the study, chemical data for several hydrocarbon parameters were gathered. Soil samples were Soxhlet-extracted in organic solvent and measured for oil and grease and total petroleum hydrocarbons using gravimetric techniques. One of the objectives was to develop an accurate quantitative way to identify sites and conditions where phytoremediation will be effective, to supplement decision-tree-type approaches. The focus of the study is the application of chemical data in evaluating the effectiveness of the treatment process. Phytoremediation uses living plants for in situ remediation of polluted soils. The basic benefits of the technique are that it is aesthetically pleasing, natural, and passive. In addition, it is effective in cleaning up sites with low to moderate levels of pollution at shallow depths. A particular form of phytoremediation called rhizodegradation or enhanced rhizosphere biodegradation was the treatment used in this study. It is a treatment in which microorganisms digest organic substances and break them down by biodegradation while being supported in the plant root structure. Test results indicate that the effects of phytoremediation treatments are subtle for highly weathered source material. It was noted that more statistical analysis will be performed with the data to determine compositional changes due to phytoremediation.

  16. Using Mobile App Development Tools to Build a GIS Application

    Science.gov (United States)

    Mital, A.; Catchen, M.; Mital, K.

    2014-12-01

    Our group designed and built working web, Android, and iOS applications using different mapping libraries as bases on which to overlay fire data from NASA. The group originally planned to make app versions for Google Maps, Leaflet, and OpenLayers. However, because the Leaflet library did not properly load on Android, the group focused efforts on the other two mapping libraries. For Google Maps, the group first designed a UI for the web app and made a working version of the app. After updating the source of fire data to one which also provided historical fire data, the design had to be modified to include the extra data. After completing a working version of the web app, the group used WebView in Android, a built-in resource which allowed porting the web app to Android without rewriting the code for Android. Upon completing this, the group found Apple iOS devices had a similar capability, and so decided to add an iOS app to the project using a function similar to WebView. Alongside this effort, the group began implementing an OpenLayers fire map using a simpler UI. This web app was completed fairly quickly relative to Google Maps; however, it did not include functionality such as satellite imagery or searchable locations. The group finished the project with a working Android version of the Google Maps based app supporting API levels 14-19 and an OpenLayers based app supporting API levels 8-19, as well as a Google Maps based iOS app supporting both old and new screen formats. This project was implemented by high school and college students under an SGT Inc. STEM internship program.

  17. Juvencus and the biblical epic: specificity and literary criticism

    Directory of Open Access Journals (Sweden)

    Elena María Calderón de Cuervo

    2015-04-01

    Latin Christian poetry emerged in the Constantinian era and flourished between 400 and 800. It has a fundamental role in the development of literary theory and critical discourse because, except for Prudentius, the poets of this first period chose to adapt the classical canon to Christian themes. The Christian epic is therefore one of the earliest such genres, and it begins as biblical epic. The first major work of this type is the Gospel Harmony of the Spanish poet Juvencus, around 330. This work begins a long series of biblical poetry, Latin at first and later continued in the vernaculars, as in Caedmon, Cynewulf, the Heliand, and the Clermont Passion, down to Ojeda, Milton and Klopstock. The dedication to established authority, the subordination of art's purpose to the salvation of the soul, and the desire to legitimize poetry with Christian arguments remain fundamental premises in the construction of the genre. When the modern epic appears, its engagement with new theological aporias does not shed these extraliterary requirements inherited from its origins. Keywords: Latin Christian poetry; Constantine Era; Virgil.

  18. Exon-primed intron-crossing (EPIC markers for non-model teleost fishes

    Directory of Open Access Journals (Sweden)

    Riethoven Jean-Jack M

    2010-03-01

    Background: Exon-primed intron-crossing (EPIC) markers have three advantages over anonymous genomic sequences in studying evolution of natural populations. First, the universal primers designed in exon regions can be applied across a broad taxonomic range. Second, the homology of EPIC-amplified sequences can be easily determined by comparing either their exon or intron portion depending on the genetic distance between the taxa. Third, having both the exon and intron fragments could help in examining genetic variation at the intraspecific and interspecific level simultaneously, which is particularly helpful when studying species complexes. However, the paucity of EPIC markers has hindered multilocus studies using nuclear gene sequences, particularly in teleost fishes. Results: We introduce a bioinformatics pipeline for developing EPIC markers by comparing the whole genome sequences between two or more species. By applying this approach to five teleost fishes whose genomes were available in the Ensembl database (http://www.ensembl.org), we identified 210 EPIC markers that have single-copy and conserved exon regions with identity greater than 85% among the five teleost fishes. We tested 12 randomly chosen EPIC markers in nine teleost species having a wide phylogenetic range. The success rate of amplifying and sequencing those markers varied from 44% to 100% in different species. We analyzed the exon sequences of the 12 EPIC markers from 13 teleosts. The resulting phylogeny contains many traditionally well-supported clades, indicating the usefulness of the exon portion of EPIC markers in reconstructing species phylogeny, in addition to the value of the intron portion of EPIC markers in interrogating the population history. Conclusions: This study illustrated an effective approach to develop EPIC markers in a taxonomic group where two or more genome sequences are available. The markers identified could be amplified across a broad taxonomic range of teleost fishes.
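    The actual pipeline compares whole genomes held in Ensembl; as a toy illustration of its core filter, keeping exons whose identity across species exceeds 85% so that universal primers can sit inside them, one might write something like the following. The 85% threshold mirrors the abstract, but the sequences and everything else are assumptions.

    # Toy filter for candidate EPIC exons: keep aligned exon pairs with >85% identity.
    # Alignment handling is grossly simplified; real pipelines use genome alignments.
    def identity(a, b):
        """Percent identity of two equal-length aligned exon sequences."""
        matches = sum(x == y for x, y in zip(a, b))
        return 100.0 * matches / len(a)

    exon_pairs = {                      # simulated aligned exon fragments from two fishes
        "exonA": ("ATGGCCATTGTAATGGGCCGC", "ATGGCCATCGTAATGGGCCGC"),
        "exonB": ("ATGTTTGACCTGAAGGGATCC", "ATGCTAGACCAGAAAGGTTCA"),
    }

    candidates = {name: identity(a, b) for name, (a, b) in exon_pairs.items()
                  if identity(a, b) > 85.0}
    print(candidates)   # exons conserved enough to anchor universal EPIC primers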

  19. Transportable Applications Environment (TAE) Plus: A NASA tool for building and managing graphical user interfaces

    Science.gov (United States)

    Szczur, Martha R.

    1993-01-01

    The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUI's). TAE Plus supports the rapid prototyping of GUI's and allows applications to be ported easily between different platforms. This paper will discuss the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUI's easier for application developers. TAE Plus is being applied to many types of applications, and this paper discusses how it has been used both within and outside NASA.

  20. Epic of Awesome: animatic of a pilot episode

    OpenAIRE

    PEIRÓ TIMONER, LLORENÇ ANDREU

    2016-01-01

    [EN] Epic of Awesome is a pre-production project for a 2D animated short, produced entirely using specialized software for storyboard creation. The project includes every phase prior to the final animation stage needed for the production of audiovisual narratives: idea, research, script, designs, initial storyboard, final storyboard, layout, voice acting, music and effects, animatic, clean-up and the final edit of the animated short. The story, told in a comedic tone with the in...

  1. Poetry’s Politics in Archaic Greek Epic and Lyric

    Directory of Open Access Journals (Sweden)

    David F. Elmer

    2013-03-01

    This essay builds on work in The Poetics of Consent (2013, which argues that the Iliad’s representation of politics reflects the workings of the oral tradition underlying the poem as we have it, a tradition that developed in the context of Panhellenic festivals. Applying a similar perspective to poetry belonging to the very different performative context of the symposium, this essay draws evidence from Theognis and Alcaeus suggesting that the social dynamics of sympotic performance could be expressed in terms of political fragmentation and alienation. In the Odyssey, the contrast between the epic singers Phemios and Demodokos reflects an awareness of the difference between these performative contexts.

  2. HyperArchiver: an EPICS archiver prototype based on Hypertable

    International Nuclear Information System (INIS)

    Giacchini, M.; Giovannini, L.; Montis, M.; Bassato, G.; Vasquez, J.A.; Prete, G.; Andrighetto, A.; Petkus, R.; Lange, R.; Kasemir, K.; Del Campo, M.; Jugo, J.

    2012-01-01

    This work started in the context of the NSLS2 project at Brookhaven National Laboratory. The NSLS2 control system foresees a very high number of PVs and has strict requirements in terms of archiving/retrieval rates: our goal was to store 10 K PV/sec and retrieve 4 K PV/sec for a group of 4 signals. The HyperArchiver is an EPICS Archiver implementation powered by Hypertable, an open source database whose internal architecture is derived from Google's Bigtable. We discuss the performance of HyperArchiver and present the results of some comparative tests. (authors)
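    The Hypertable client code used by HyperArchiver is not shown in this record; the sketch below demonstrates only the EPICS-facing half that any such archiver needs, subscribing to a PV with pyepics and buffering timestamped samples, with the database write reduced to a stub. The PV name and buffering policy are assumptions.

    # Minimal PV-archiving sketch with pyepics: monitor a PV and buffer timestamped samples.
    # The storage back end (Hypertable in HyperArchiver) is reduced to a stub here.
    import time
    import epics

    samples = []

    def on_update(pvname=None, value=None, timestamp=None, **kw):
        samples.append((timestamp, pvname, value))      # a real archiver would write to its DB

    pv = epics.PV("SR:C01:BPM:X", callback=on_update)   # hypothetical PV name
    time.sleep(10)                                      # collect updates for a while
    print(f"buffered {len(samples)} samples from {pv.pvname}")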

  3. Using the missed opportunity tool as an application of the Lives Saved Tool (LiST) for intervention prioritization.

    Science.gov (United States)

    Tam, Yvonne; Pearson, Luwei

    2017-11-07

    The Missed Opportunity tool was developed as an application in the Lives Saved Tool (LiST) to allow users to quickly compare the relative impact of interventions. Global Financing Facility (GFF) investment cases have been identified as a potential application of the Missed Opportunity analyses in the Democratic Republic of the Congo (DRC), Ethiopia, Kenya, and Tanzania, using 'lives saved' as a normative factor to set priorities. The Missed Opportunity analysis draws on data and methods in LiST to project maternal, stillbirth, and child deaths averted based on changes in interventions' coverage. Coverage of each individual intervention in LiST was automated to be scaled up from current coverage to 90% in the next year, to simulate a scenario where almost every mother and child receive the proven interventions that they need. The main outcome of the Missed Opportunity analysis is deaths averted due to each intervention. When reducing unmet need for contraception is included in the analysis, it ranks as the top missed opportunity across the four countries. When it is not included in the analysis, the top interventions with the most total deaths averted are hospital-based interventions such as labor and delivery management at the CEmOC and BEmOC levels, and full treatment and supportive care for premature babies and for sepsis/pneumonia. The Missed Opportunity tool can be used to provide a quick, first look at missed opportunities in a country or geographic region, and help identify interventions for prioritization. While it is a useful advocacy aid for evidence-based priority setting, decision makers need to consider other factors that influence decision making, and also discuss how to implement, deliver, and sustain programs to achieve high coverage.

  4. Transportable Applications Environment (TAE) Plus - A NASA productivity tool used to develop graphical user interfaces

    Science.gov (United States)

    Szczur, Martha R.

    1991-01-01

    The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUIs), supports prototyping, allows applications to be ported easily between different platforms, and encourages appropriate levels of user interface consistency between applications. This paper discusses the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUIs easier for the application developers. The paper also explains how tools like TAE Plus provide for reusability and ensure reliability of UI software components, as well as how they aid in the reduction of development and maintenance costs.

  5. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    Science.gov (United States)

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  6. A systematic review on popularity, application and characteristics of protein secondary structure prediction tools.

    Science.gov (United States)

    Kashani-Amin, Elaheh; Tabatabaei-Malazy, Ozra; Sakhteman, Amirhossein; Larijani, Bagher; Ebrahim-Habibi, Azadeh

    2018-02-27

    Prediction of protein secondary structure is one of the major steps in the generation of homology models. These models provide structural information which is used to design suitable ligands for potential medicinal targets. However, selecting a proper tool among multiple secondary structure prediction (SSP) options is challenging. The current study is an insight into currently favored methods and tools, within various contexts. A systematic review was performed for comprehensive access to recent (2013-2016) studies which used or recommended protein SSP tools. Three databases (Web of Science, PubMed and Scopus) were systematically searched and 99 out of 209 studies were finally found eligible for data extraction. Four categories of applications for the 59 retrieved SSP tools were: (I) prediction of structural features of a given sequence, (II) evaluation of a method, (III) providing input for a new SSP method and (IV) integrating a SSP tool as a component of a program. PSIPRED was found to be the most popular tool in all four categories. JPred and tools utilizing the PHD (Profile network from HeiDelberg) method occupied second and third places of popularity in categories I and II. JPred was only found in the first two categories, while PHD was present in three fields. This study provides a comprehensive insight into the recent usage of SSP tools, which could be helpful for selecting a proper tool. Copyright © Bentham Science Publishers.

  7. Mass Production Tools and Process Readiness for Uniform Parts—Injection Molding Application

    DEFF Research Database (Denmark)

    Boorla, Srinivasa Murthy; Eifler, Tobias; Howard, Thomas J.

    2017-01-01

    Mass production always aims to produce uniformly performing products. Production tools such as pressing dies, casting dies and injection moulds play a significant role by producing uniform parts for achieving final products. Tool complexity increases when multiple cavities are present. These tools ... pass through several stages of quality maturation before starting production, where the tool capability for part uniformity can be assessed, corrected and aligned to mass production variables. This research article describes the process of systematically understanding the impact of variables ... and of finding opportunities to counter them. The application is assessed on a hypothetical plastic injection mould and found feasible. The proposed process could evaluate the tool capability for producing uniform parts, at its digital design verification and its physical validation ...

  8. Design of an instrumented smart cutting tool and its implementation and application perspectives

    International Nuclear Information System (INIS)

    Wang, Chao; Cheng, Kai; Chen, Xun; Minton, Timothy; Rakowski, Richard

    2014-01-01

    This paper presents an innovative design of a smart cutting tool, using two surface acoustic wave (SAW) strain sensors mounted onto the top and the side surface of the tool shank respectively, and its implementation and application perspectives. This surface acoustic wave-based smart cutting tool is capable of measuring the cutting force and the feed force in a real machining environment, after a calibration process under known cutting conditions. A hybrid dissimilar workpiece is then machined using the SAW-based smart cutting tool. The hybrid dissimilar material is made of two different materials, NiCu alloy (Monel) and steel, welded together to form a single bar; this can be used to simulate an abrupt change in material properties. The property transition zone is successfully detected by the tool; the sensor feedback can then be used to initiate a change in the machining parameters to compensate for the altered material properties. (paper)
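    The paper's calibration procedure is only summarized in the abstract; the fragment below sketches the generic idea of such a calibration, solving a least-squares mapping from the two SAW sensor readings to the cutting and feed forces using trials at known loads. All numbers are invented.

    # Generic two-sensor force calibration by least squares (invented example data).
    import numpy as np

    # Sensor readings (top SAW, side SAW) recorded at known cutting/feed force pairs.
    readings = np.array([[0.10, 0.02], [0.21, 0.05], [0.32, 0.06], [0.18, 0.15]])
    forces   = np.array([[50.0, 10.0], [100.0, 20.0], [150.0, 25.0], [80.0, 60.0]])

    # Calibration matrix C such that forces ~ readings @ C.
    C, *_ = np.linalg.lstsq(readings, forces, rcond=None)

    new_reading = np.array([0.25, 0.08])
    print("estimated [cutting, feed] force:", new_reading @ C)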

  9. An Integrated Development Tool for a safety application using FBD language

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jun; Lee, Jang Soo; Lee, Dong Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    With the digitalization of nuclear instrumentation and control systems, the application program responsible for the safety functions of nuclear I and C systems must ensure the robustness of the safety function through development, testing, and validation roles over the software development life cycle. The importance of software in nuclear systems increases continuously. The integrated engineering tools used to develop, test, and validate safety application programs must handle increasingly complex parts among the many components within nuclear digital I and C systems. This paper introduces the integrated engineering tool (SafeCASE-PLC) developed by our project. The SafeCASE-PLC is a software engineering tool to develop, test, and validate the nuclear application program that runs in an automatic controller.

  10. Climate risk screening tools and their application: A guide to the guidance

    Energy Technology Data Exchange (ETDEWEB)

    Traerup, S.; Olhoff, A.

    2011-07-01

    Climate risk screening is an integral part of efforts to ascertain current and future vulnerabilities and risks related to climate change. It is a prerequisite for identifying and designing adaptation measures, and an important element in the process of integrating, or mainstreaming, climate change adaptation into development project, planning and policy processes. There is an increasing demand and attention among national stakeholders in developing countries to take into account potential implications of climate variability and change for planning and prioritizing of development strategies and activities. Subsequently, there is a need for user friendly guidance on climate risk screening tools and their potentials for application that targets developing country stakeholders. This need is amplified by the sheer volume of climate change mainstreaming guidance documents and risk screening and assessment tools available and currently under development. Against this background, this paper sets out to provide potential users in developing countries, including project and programme developers and managers, with an informational entry point to climate risk screening tools. The emphasis in this report is on providing: 1) An overview of available climate risk screening and assessment tools along with indications of the tools available and relevant for specific purposes and contexts (Section 3). 2) Examples of application of climate risk screening and assessment tools along with links to further information (Section 4). Before turning to the respective sections on available climate risk screening tools and examples of their application, a delimitation of the tools included in this paper is included in Section 2. This section also provides a brief overview of how climate screening and related tools fit into decision making steps at various planning and decision making levels in conjunction with an outline of overall considerations to make when choosing a tool. The paper is

  11. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provided with a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
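    BSM itself is an ArcMAP add-in and its source is not reproduced here. The short sketch below shows only the frequency-ratio calculation that such a tool automates: the share of hazard occurrences falling in a factor class divided by the share of total area in that class. The class counts are made up.

    # Frequency-ratio sketch for one conditioning factor (made-up class counts).
    def frequency_ratio(hazard_pixels, class_pixels):
        total_hazard = sum(hazard_pixels.values())
        total_area = sum(class_pixels.values())
        return {c: (hazard_pixels[c] / total_hazard) / (class_pixels[c] / total_area)
                for c in class_pixels}

    slope_classes  = {"0-10 deg": 50_000, "10-25 deg": 30_000, "25+ deg": 20_000}
    flooded_pixels = {"0-10 deg": 900,    "10-25 deg": 350,    "25+ deg": 50}

    for cls, fr in frequency_ratio(flooded_pixels, slope_classes).items():
        print(f"{cls}: FR = {fr:.2f}")   # FR > 1 means the class is hazard-prone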

  12. Nanobody-derived nanobiotechnology tool kits for diverse biomedical and biotechnology applications.

    Science.gov (United States)

    Wang, Yongzhong; Fan, Zhen; Shao, Lei; Kong, Xiaowei; Hou, Xianjuan; Tian, Dongrui; Sun, Ying; Xiao, Yazhong; Yu, Li

    2016-01-01

    Owing to the peculiar properties of the nanobody, including its nanoscale size, robust structure, stable and soluble behavior in aqueous solution, reversible refolding, high affinity and specificity for only one cognate target, superior access to cryptic clefts, and deep tissue penetration, as well as a sustainable source, it has been an ideal research tool for the development of sophisticated nanobiotechnologies. Currently, the nanobody has evolved into versatile research and application tool kits for diverse biomedical and biotechnology applications. Various nanobody-derived formats, including the nanobody itself, radionuclide- or fluorescent-labeled nanobodies, nanobody homo- or heteromultimers, nanobody-coated nanoparticles, and nanobody-displayed bacteriophages, have been successfully demonstrated as powerful nanobiotechnological tool kits for basic biomedical research, targeted drug delivery and therapy, disease diagnosis, bioimaging, and agricultural and plant protection. These applications indicate a special advantage of these nanobody-derived technologies, already surpassing the "me-too" products of other equivalent binders, such as the full-length antibodies, single-chain variable fragments, antigen-binding fragments, targeting peptides, and DNA-based aptamers. In this review, we summarize the current state of the art in nanobody research, focusing on the nanobody structural features, nanobody production approach, nanobody-derived nanobiotechnology tool kits, and the potentially diverse applications in biomedicine and biotechnology. The future trends, challenges, and limitations of the nanobody-derived nanobiotechnology tool kits are also discussed.

  13. Computer-aided design in power engineering. Application of software tools

    International Nuclear Information System (INIS)

    Stojkovic, Zlatan

    2012-01-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems in the design of power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides through the complex problems, presenting them on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on the graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  14. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems in the design of power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides through the complex problems, presenting them on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on the graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  15. [Dietary habits and cancer: the experience of EPIC-Italy].

    Science.gov (United States)

    Sieri, Sabina; Agnoli, Claudia; Pala, Valeria; Mattiello, Amalia; Panico, Salvatore; Masala, Giovanna; Assedi, Melania; Tumino, Rosario; Frasca, Graziella; Sacerdote, Carlotta; Vineis, Paolo; Krogh, Vittorio

    2015-01-01

    To investigate hypothesised relationships between diet and cancer by assessing diet as a whole, in the Italian cohort EPIC. Multicentric prospective study: 47,749 volunteers were recruited between 1993 and 1998 in the centres of Varese and Turin (Northern Italy), Florence (Central Italy), Naples and Ragusa (Southern Italy). Information on diet and lifestyle was collected through validated questionnaires. Anthropometric measurements were taken and biological samples collected using standardised protocols. Follow-up was carried out by accessing regional cancer and mortality registries, hospital discharge records, and by telephone inquiries (only for Naples). After a median follow-up of 11 years, 879 incident cases of breast cancer, 421 cases of colorectal cancer, and 152 deaths were identified. Multivariate Cox regression models were used to estimate risks in relation to dietary characteristics. The "Olive oil & Salad" dietary pattern, characterised by high consumption of raw vegetables and olive oil, was associated with a lower risk of overall mortality in the elderly. Adherence to a Mediterranean diet rich in vegetables and fruit was associated with reduced risk of colon cancer. Consumption of high-glycemic carbohydrates was associated with higher incidence of breast cancer and colorectal cancer. Reduced risk of colon cancer was also found in regular consumers of yoghurt. The accuracy and comprehensiveness of EPIC-Italy data made it possible to investigate both individual dietary components and dietary habits as a whole, to thereby provide Italians with dietary and lifestyle advice that will help them to remain healthy.

  16. Development and Validation of WebQuests in Teaching Epics

    Directory of Open Access Journals (Sweden)

    Ronald Candy Santos Lasaten

    2017-05-01

    Using the Research and Development (R&D) methodology, the study aimed to develop and validate WebQuests which can be used in literature subjects, particularly at the tertiary level, to address the need of literature teachers for pedagogy in the teaching of epics. The development of the WebQuests was anchored on the Theory of Constructivism. Two groups of experts validated the WebQuests: the literature experts and the ICT experts. The Content Validation Checklist, used by the literature experts, was utilized to evaluate the content of the WebQuests. Meanwhile, the Rubric for Evaluating WebQuests, used by the ICT experts, was utilized to evaluate the design characteristics of the WebQuests. Computed weighted means using range intervals of point scores were employed to treat the data gathered from the evaluations conducted by both groups of experts. The WebQuests developed contain five major parts: (1) introduction; (2) task; (3) process; (4) evaluation; and (5) conclusion. Based on the findings, the content of the WebQuests developed is valid in terms of objectives, activities and instructional characteristics. Likewise, the design characteristics of the WebQuests are excellent in terms of introductions, tasks, processes, resources, evaluations, conclusions and overall designs. Thus, the WebQuests developed are acceptable and can be utilized as instructional materials by literature teachers in the teaching of epics.

  17. Indigenous women in Spanish American Historic Epic Poetry

    Directory of Open Access Journals (Sweden)

    Lise Segas

    2016-05-01

    Epic poetry has always been considered a masculine genre. The emergence of a group identity, masculine, white, aristocratic and Christian, is the result of the representation and the exclusion of the Other, depicted as fictitious and singular but in fact composed of a variety of ethnic groups, origins, sexes, genders, religions and differing degrees between fiction and historicity. Indeed, in the historical epic poetry that narrated the Conquest, apart from the conquistadors listed at length and the indigenous kings and caciques, only a few characters are distinguished by historical individualisation. The Other, Amerindian and female, makes a shy entrance into history, into singularity, into the (historical and Christian) truth. This is the case of the interpreters Malinche and India Catalina, the only historical native women who appear both in the narrative plot and in the conquest enterprise in the poems of Lasso de la Vega (Cortés valeroso y Mexicana, Mexicana), of Juan de Castellanos (Elegías de varones ilustres de Indias) and of Saavedra Guzmán (El peregrino indiano).

  18. EPICS Controlled Collimator for Controlling Beam Sizes in HIPPO

    Energy Technology Data Exchange (ETDEWEB)

    Napolitano, Arthur Soriano [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vogel, Sven C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-03

    Controlling the beam spot size and shape in a diffraction experiment determines the probed sample volume. The HIPPO (High-Pressure-Preferred Orientation) neutron time-of-flight diffractometer is located at the Lujan Neutron Scattering Center at Los Alamos National Laboratory. HIPPO characterizes microstructural parameters, such as phase composition, strains, grain size, or texture, of bulk (cm-sized) samples. In the current setup, the beam spot has a 10 mm diameter. Using a collimator, consisting of two pairs of neutron-absorbing boron-nitride slabs, the horizontal and vertical dimensions of a rectangular beam spot can be defined. Using the HIPPO robotic sample changer for sample motion, the collimator would enable scanning of e.g. cylindrical samples along the cylinder axis by probing slices of such samples. The project presented here describes the implementation of such a collimator, in particular the motion control software. We utilized the EPICS (Experimental Physics and Industrial Control System) software interface to integrate the collimator control into the HIPPO instrument control system. Using EPICS, commands are sent to commercial stepper motors that move the beam windows.
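
    As a rough client-side illustration (not the HIPPO code itself), the standard EPICS motor record exposes fields such as .VAL (requested position), .RBV (readback) and .DMOV (motion done), which a Python client can drive with pyepics; the PV prefix below is a hypothetical placeholder.

      # Sketch: command a collimator slit axis through a standard EPICS motor record.
      # Assumes pyepics is installed; "HIPPO:COL:X" is a made-up PV prefix, not from the paper.
      from epics import caput, caget

      caput("HIPPO:COL:X.VAL", 5.0, wait=True)   # request a 5 mm opening; put completes when motion ends
      print(caget("HIPPO:COL:X.RBV"))            # readback position after the move
      print(caget("HIPPO:COL:X.DMOV"))           # 1 once the motor record reports motion done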

  19. Workshop on the applications of new computer tools to thermal engineering; Applications a la thermique des nouveaux outils informatiques

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    This workshop on the applications of new computer tools to thermal engineering has been organized by the French society of thermal engineers. Seven papers have been presented, from which two papers dealing with thermal diffusivity measurements in materials and with the optimization of dryers have been selected for ETDE. (J.S.)

  20. Development of EPICS based beam-line experimental control employing motor controller for precision positioning

    International Nuclear Information System (INIS)

    Tuli, Anupriya; Jain, Rajiv; Vora, H.S.

    2015-01-01

    In a synchrotron radiation source, beamline experiments are carried out in a radiation-prone environment inside the hutch, which demands that experiments be conducted remotely. These experiments involve instrument control and data acquisition from various devices. Another factor which contributes to system complexity is the precise positioning of samples and placement of detectors. A large number of stepper motors are engaged for achieving the required precision positioning. This work is the result of the development of an Experimental Physics and Industrial Control System (EPICS) based control system to interface a stepper motor controller developed indigenously by the Laser Electronics Support Division of RRCAT. EPICS is an internationally accepted open-source software environment which follows a toolkit approach and the standard model paradigm. The operator interface for the control system software was implemented using CSS BOY. The system was successfully tested for Ethernet-based remote access. The developed control software comprises an OPI and an alarm handler (EPICS ALH). Both the OPI and ALH are linked to PVs defined in database files. The development process resulted in a set of EPICS based commands for controlling the stepper motor. These commands are independent of the operator interface, i.e. the stepper motor can be controlled by using this set of commands directly at the EPICS prompt. The EPICS Alarm Handler was also tested independently by running these commands at the EPICS prompt. If not using ALH, the operator can read the alarm status of a PV using its 'SEVR' and 'STAT' attributes. (author)
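
    Since the abstract mentions reading a PV's alarm state through its 'SEVR' and 'STAT' attributes, a minimal pyepics sketch of that readback is given below; the PV name is a hypothetical placeholder, not one from the RRCAT system.

      # Sketch: read a PV's value and its alarm severity/status fields over Channel Access.
      # "BL:SMC:MOTOR1" is a placeholder PV name; requires pyepics and a reachable IOC.
      from epics import caget

      value = caget("BL:SMC:MOTOR1")
      severity = caget("BL:SMC:MOTOR1.SEVR", as_string=True)  # NO_ALARM, MINOR, MAJOR or INVALID
      status = caget("BL:SMC:MOTOR1.STAT", as_string=True)    # alarm reason, e.g. HIHI or LOLO
      print(value, severity, status)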

  1. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    Science.gov (United States)

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  2. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    Full Text Available The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  3. MOBILE APPLICATIONS AS TOOL FOR EXPLOITING CULTURAL HERITAGE IN THE REGION OF TURIN AND MILAN

    OpenAIRE

    A. Rolando; A. Scandiffio

    2013-01-01

    The current research aims at showing how applications working on personal mobile communication terminals, such as smartphones, can be useful for the exploration of places and, at the same time, act as tools able to develop interaction between cultural heritage and users. In this sense, the use of smartphone applications can be combined with GIS in order to make a platform of knowledge useful to support research studies in the field of cultural heritage, with specific reference to accessibility...

  4. A study of some features of the ultra high vacuum systems for EPIC

    International Nuclear Information System (INIS)

    Elsey, R.J.; Bennett, J.R.J.; Dossett, A.J.

    1977-01-01

    This report covers the experimental work carried out towards the development of the ultra high vacuum for the proposed electron positron storage ring, EPIC. Experiments included outgassing tests on samples of materials and pump-down tests on full scale aluminium vessels. The effect of baking was investigated. The approval of the similar machine PETRA at Hamburg and the subsequent withdrawal of the EPIC proposal in October 1975 curtailed the vacuum work. The experiments reported here are therefore incomplete, but nevertheless proved useful in showing that there should have been no major problems with building the vacuum system for EPIC. (author)

  5. Monitoring commercial conventional facilities control with the APS control system: The Metasys-to-EPICS interface

    International Nuclear Information System (INIS)

    Nawrocki, G.J.; Seaver, C.L.; Kowalkowski, J.B.

    1995-01-01

    As controls needs at the Advanced Photon Source matured from an installation phase to an operational phase, the need to monitor the existing conventional facilities control system with the EPICS-based accelerator control system was realized. This existing conventional facilities control network is based on a proprietary system from Johnson Controls called Metasys. Initially read-only monitoring of the Metasys parameters will be provided; however, the ability for possible future expansion to full control is available. This paper describes a method of using commercially available hardware and existing EPICS software as a bridge between the Metasys and EPICS control systems
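
    For illustration only, read-only monitoring of bridged points looks, from an EPICS client's perspective, like subscribing to ordinary PVs; the sketch below uses pyepics camonitor with a made-up PV name standing in for a Metasys point.

      # Sketch: subscribe to a (read-only) facility PV and print updates as they arrive.
      # "CF:AHU1:SupplyTemp" is a placeholder name; requires pyepics and a reachable gateway/IOC.
      import time
      from epics import camonitor

      def on_change(pvname=None, value=None, timestamp=None, **kw):
          print(f"{pvname} -> {value} @ {timestamp}")

      camonitor("CF:AHU1:SupplyTemp", callback=on_change)  # asynchronous subscription
      time.sleep(60)  # keep the process alive while updates stream in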

  6. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    Science.gov (United States)

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools to the pharmaceutical and biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in the biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.

  7. Semantic Web applications and tools for the life sciences: SWAT4LS 2010.

    Science.gov (United States)

    Burger, Albert; Paschke, Adrian; Romano, Paolo; Marshall, M Scott; Splendiani, Andrea

    2012-01-25

    As Semantic Web technologies mature and new releases of key elements, such as SPARQL 1.1 and OWL 2.0, become available, the Life Sciences continue to push the boundaries of these technologies with ever more sophisticated tools and applications. Unsurprisingly, therefore, interest in the SWAT4LS (Semantic Web Applications and Tools for the Life Sciences) activities has remained high, as was evident during the third international SWAT4LS workshop held in Berlin in December 2010. Contributors to this workshop were invited to submit extended versions of their papers, the best of which are now made available in this special supplement of BMC Bioinformatics. The papers reflect the wide range of work in this area, covering the storage and querying of Life Sciences data in RDF triple stores, tools for the development of biomedical ontologies, and the semantics-based integration of Life Sciences as well as clinical data.

  8. CMS Partial Releases Model, Tools, and Applications. Online and Framework-Light Releases

    CERN Document Server

    Jones, Christopher D; Meschi, Emilio; Shahzad Muzaffar; Andreas Pfeiffer; Ratnikova, Natalia; Sexton-Kennedy, Elizabeth

    2009-01-01

    The CMS Software project CMSSW embraces more than a thousand packages organized in subsystems for analysis, event display, reconstruction, simulation, detector description, data formats, framework, utilities and tools. The release integration process is highly automated by using tools developed or adopted by CMS. Packaging in rpm format is a built-in step in the software build process. For several well-defined applications it is highly desirable to have only a subset of the CMSSW full package bundle. For example, High Level Trigger algorithms that run on the Online farm, and need to be rebuilt in a special way, require no simulation, event display, or analysis packages. Physics analysis applications in the ROOT environment require only a few core libraries and the description of CMS-specific data formats. We present a model of CMS Partial Releases, used for preparation of the customized CMS software builds, including a description of the tools used, the implementation, and how we deal with technical challenges, suc...

  9. Discover Space Weather and Sun's Superpowers: Using CCMC's innovative tools and applications

    Science.gov (United States)

    Mendoza, A. M. M.; Maddox, M. M.; Kuznetsova, M. M.; Chulaki, A.; Rastaetter, L.; Mullinix, R.; Weigand, C.; Boblitt, J.; Taktakishvili, A.; MacNeice, P. J.; Pulkkinen, A. A.; Pembroke, A. D.; Mays, M. L.; Zheng, Y.; Shim, J. S.

    2015-12-01

    Community Coordinated Modeling Center (CCMC) has developed a comprehensive set of tools and applications that are directly applicable to space weather and space science education. These tools, some of which were developed by our student interns, are capable of serving a wide range of student audiences, from middle school to postgraduate research. They include a web-based point of access to sophisticated space physics models and visualizations, and a powerful space weather information dissemination system, available on the web and as a mobile app. In this demonstration, we will use CCMC's innovative tools to engage the audience in real-time space weather analysis and forecasting and will share some of our interns' hands-on experiences while being trained as junior space weather forecasters. The main portals to CCMC's educational material are ccmc.gsfc.nasa.gov and iswa.gsfc.nasa.gov

  10. The VI-Suite: a set of environmental analysis tools with geospatial data applications

    NARCIS (Netherlands)

    Southall, Ryan; Biljecki, F.

    2017-01-01

    Background: The VI-Suite is a free and open-source addon for the 3D content creation application Blender, developed primarily as a tool for the contextual and performative analysis of buildings. Its functionality has grown from simple, static lighting analysis to fully parametric lighting,

  11. PETrA : A Software-Based Tool for Estimating the Energy Profile of Android Applications

    NARCIS (Netherlands)

    Di Nucci, D.; Palomba, F.; Prota, Antonio; Panichella, A.; Zaidman, A.E.; De Lucia, Andrea

    2017-01-01

    Energy efficiency is a vital characteristic of any mobile application, and indeed is becoming an important factor for user satisfaction. For this reason, in recent years several approaches and tools for measuring the energy consumption of mobile devices have been proposed. Hardware-based solutions

  12. Application of ocean bottom cable as a new tool in offshore 3-D ...

    African Journals Online (AJOL)

    Application of ocean bottom cable as a new tool in offshore 3-D seismic data acquisition. CC Ugbor, KM Onuoha, LI Mamah. Abstract: No Abstract. Journal of Mining and Geology Vol. 43 (1) 2007: pp. 63-69.

  13. Multimedia Instructional Tools' Impact on Student Motivation and Learning Strategies in Computer Applications Courses

    Science.gov (United States)

    Chapman, Debra; Wang, Shuyan

    2015-01-01

    Multimedia instructional tools (MMITs) have been identified as a way to effectively and economically present instructional material. MMITs are commonly used in introductory computer applications courses, as MMITs should be effective in increasing student knowledge and positively impact motivation and learning strategies without increasing costs. This…

  14. A Client/Server Architecture for Supporting Science Data Using EPICS Version 4

    Energy Technology Data Exchange (ETDEWEB)

    Dalesio, Leo [EPIC Consulting, Jacksonville, FL (United States)

    2015-04-21

    The Phase 1 grant that serves as a precursor to this proposal prototyped complex storage techniques for the high speed structured data that is being produced in accelerator diagnostics and beam line experiments. It demonstrates the technologies that can be used to archive and retrieve complex data structures and provide the performance required by our new accelerators, instrumentation, and detectors. Phase 2 is proposed to develop a high-performance platform for data acquisition and analysis to provide physicists and operators with a better understanding of the beam dynamics. This proposal includes developing a platform for reading 109 MHz data at 10 KHz rates through a multicore front end processor, archiving the data to an archive repository that is then indexed for fast retrieval. The data is then retrieved from this data archive and integrated with the scalar data to provide data sets to client applications for analysis, for use in feedback, and to aid in identifying problems with the instrumentation, plant, beam steering, or model. This development is built on EPICS version 4, which is being successfully deployed to implement physics applications. Through prior SBIR grants, EPICS version 4 has a solid communication protocol for middle layer services (PVAccess), structured data representation and methods for efficient transportation and access (PVData), an operational hierarchical record environment (JAVA IOC), and prototypes for standard structured data (Normative Types). This work was further developed through project funding to successfully deploy the first service-based physics application environment, with demonstrated services that provide arbitrary object views, save sets, model, lattice, and unit conversion. Thin client physics applications have been developed in Python that implement quad centering, orbit display, bump control, and slow orbit feedback. This service-based architecture has provided a very modular and robust environment that enables commissioning teams
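
    As a hedged illustration of fetching structured data over PVAccess (the EPICS version 4 protocol mentioned above), the sketch below uses the p4p client library, which is one common Python client but is not necessarily what this project used; the channel name is a placeholder.

      # Sketch: get a structured PV (e.g. a normative-type array) over PVAccess with p4p.
      # "DIAG:BPM:orbit" is a made-up channel name; requires the p4p package and a pvAccess server.
      from p4p.client.thread import Context

      ctx = Context("pva")              # PVAccess client context
      orbit = ctx.get("DIAG:BPM:orbit")
      print(orbit)                      # unwrapped normative-type value
      ctx.close()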

  15. Tools for Empirical and Operational Analysis of Mobile Offloading in Loop-Based Applications

    Directory of Open Access Journals (Sweden)

    Alexandru-Corneliu OLTEANU

    2013-01-01

    Full Text Available Offloading for mobile devices is an increasingly popular research topic, matching the popularity mobile devices have in the general population. Studying mobile offloading is challenging because of device and application heterogeneity. However, we believe that focusing on a specific type of application can bring advances in offloading for mobile devices, while still keeping a wide range of applicability. In this paper we focus on loop-based applications, in which most of the functionality is given by iterating an execution loop. We model the main loop of the application with a graph that consists of a cycle and propose an operational analysis to study offloading on this model. We also propose a testbed based on a real-world application to empirically evaluate offloading. We conduct performance evaluation using both tools and compare the analytical and empirical results.

  16. Integrating Gigabit ethernet cameras into EPICS at Diamond light source

    International Nuclear Information System (INIS)

    Cobb, T.

    2012-01-01

    At Diamond Light Source a range of cameras are used to provide images for diagnostic purposes in both the accelerator and photon beamlines. The accelerator and existing beamlines use Point Grey Flea and Flea2 Firewire cameras. We have selected Gigabit Ethernet cameras supporting GigE Vision for our new photon beamlines. GigE Vision is an interface standard for high speed Ethernet cameras which encourages inter-operability between manufacturers. This paper describes the challenges encountered while integrating GigE Vision cameras from a range of vendors into EPICS. GigE Vision cameras appear to be more reliable than the Firewire cameras, and the simple cabling makes it much easier to move the cameras to different positions. Upcoming power-over-Ethernet versions of the cameras will reduce the number of cables still further
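
    For orientation only, a client typically reads such camera images from the IOC as a flat waveform PV; the sketch below assumes the usual areaDetector-style PV names, which are placeholders here rather than Diamond's actual names.

      # Sketch: fetch one frame from an areaDetector-style camera IOC over Channel Access.
      # "CAM1:image1:" follows the common areaDetector naming convention; it is a placeholder prefix.
      import numpy as np
      from epics import caget

      data = caget("CAM1:image1:ArrayData")        # flat pixel array (assumes a mono camera)
      nx = caget("CAM1:image1:ArraySize0_RBV")     # image width in pixels
      ny = caget("CAM1:image1:ArraySize1_RBV")     # image height in pixels
      frame = np.asarray(data[: nx * ny]).reshape(ny, nx)
      print(frame.shape, frame.min(), frame.max())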

  17. Numerical and modeling techniques used in the EPIC code

    International Nuclear Information System (INIS)

    Pizzica, P.A.; Abramson, P.B.

    1977-01-01

    EPIC models fuel and coolant motion which result from internal fuel pin pressure (from fission gas or fuel vapor) and/or from the generation of sodium vapor pressures in the coolant channel subsequent to pin failure in an LMFBR. The modeling includes the ejection of molten fuel from the pin into a coolant channel with any amount of voiding through a clad rip which may be of any length or which may expand with time. One-dimensional Eulerian hydrodynamics is used to model both the motion of fuel and fission gas inside a molten fuel cavity and the mixture of two-phase sodium and fission gas in the channel. Motion of molten fuel particles in the coolant channel is tracked with a particle-in-cell technique

  18. Retrieving Smoke Aerosol Height from DSCOVR/EPIC

    Science.gov (United States)

    Xu, X.; Wang, J.; Wang, Y.

    2017-12-01

    Unlike industrial pollutant particles that are often confined within the planetary boundary layer, smoke from forest and agriculture fires can inject massive carbonaceous aerosols into the upper troposphere due to the intense pyro-convection. The sensitivity of weather and climate to absorbing carbonaceous aerosols is regulated by the altitude of those aerosol layers. However, aerosol height information remains limited from passive satellite sensors. Here we present an algorithm to estimate smoke aerosol height from radiances in the oxygen A and B bands measured by the Earth Polychromatic Imaging Camera (EPIC) from the Deep Space Climate Observatory (DSCOVR). With a suite of case studies and validation efforts, we demonstrate that smoke aerosol height can be well retrieved over both ocean and land surfaces multiple times daily.

  19. Using EPICS enabled industrial hardware for upgrading control systems

    International Nuclear Information System (INIS)

    Bjorkland, Eric A.; Veeramani, Arun; Debelle, Thierry

    2009-01-01

    Los Alamos National Laboratory has been working with National Instruments (NI) and Cosylab to implement EPICS Input Output Controller (IOC) software that runs directly on the NI CompactRIO Real Time Controller (RTC) and communicates with NI LabVIEW through a shared memory interface. In this presentation, we will discuss our current progress in upgrading the control system at the Los Alamos Neutron Science Center (LANSCE) and what we have learned about integrating CompactRIO into large experimental physics facilities. We will also discuss the implications of using the Channel Access Server for LabVIEW, which will enable more commercial hardware platforms to be used in upgrading existing facilities or in commissioning new ones.

  20. Radiation effects in wild terrestrial vertebrates - the EPIC collection.

    Science.gov (United States)

    Sazykina, Tatiana; Kryshev, Ivan I

    2006-01-01

    The paper presents data on radiation effects in populations of wild vertebrate animals inhabiting contaminated terrestrial ecosystems. The data were extracted from the database "Radiation effects on biota", compiled within the framework of the EC Project EPIC (2000-2003). The data collection, based on publications in Russian, demonstrates radiation effects in the areas characterized with high levels of radionuclides (Kyshtym radioactive trace; "spots" of enhanced natural radioactivity in the Komi region of Russia; territories contaminated from the Chernobyl fallout). The data covers a wide range of exposures from acute accidental irradiation to lifetime exposures at relatively low dose rates. Radiation effects include mortality, changes in reproduction, decrease of health, ecological effects, cytogenetic effects, adaptation to radiation, and others. Peculiarities of radiation effects caused by different radionuclides are described, also the severity of effects as they appear in different organisms (e.g. mice, frogs, birds, etc.).

  1. Radiation effects in wild terrestrial vertebrates - the EPIC collection

    International Nuclear Information System (INIS)

    Sazykina, Tatiana; Kryshev, Ivan I.

    2006-01-01

    The paper presents data on radiation effects in populations of wild vertebrate animals inhabiting contaminated terrestrial ecosystems. The data were extracted from the database 'Radiation effects on biota', compiled within the framework of the EC Project EPIC (2000-2003). The data collection, based on publications in Russian, demonstrates radiation effects in the areas characterized with high levels of radionuclides (Kyshtym radioactive trace; 'spots' of enhanced natural radioactivity in the Komi region of Russia; territories contaminated from the Chernobyl fallout). The data covers a wide range of exposures from acute accidental irradiation to lifetime exposures at relatively low dose rates. Radiation effects include mortality, changes in reproduction, decrease of health, ecological effects, cytogenetic effects, adaptation to radiation, and others. Peculiarities of radiation effects caused by different radionuclides are described, also the severity of effects as they appear in different organisms (e.g. mice, frogs, birds, etc.)

  2. ISAC EPICS on Linux: the march of the penguins

    International Nuclear Information System (INIS)

    Richards, J.; Nussbaumer, R.; Rapaz, S.; Waters, G.

    2012-01-01

    The DC linear accelerators of the ISAC radioactive beam facility at TRIUMF do not impose rigorous timing constraints on the control system. Therefore a real-time operating system is not essential for device control. The ISAC Control System is completing a move to the use of the Linux operating system for hosting all EPICS IOCs. The IOC platforms include GE-Fanuc VME based CPUs for control of most optics and diagnostics, rack mounted servers for supervising PLCs, small desktop PCs for GPIB and RS232 instruments, as well as embedded ARM processors controlling CAN-bus devices that provide a suitcase sized control system. This article focuses on the experience of creating a customized Linux distribution for front-end IOC deployment. Rationale, a road-map of the process, and efficiency advantages in personnel training and system management realized by using a single OS will be discussed. (authors)

  3. Trends in HFE Methods and Tools and Their Applicability to Safety Reviews

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J.M.; Plott, C.; Milanski, J.; Ronan, A.; Scheff, S.; Laux, L.; Bzostek, J.

    2009-09-30

    The U.S. Nuclear Regulatory Commission (NRC) conducts human factors engineering (HFE) safety reviews of applicant submittals for new plants and for changes to existing plants. The reviews include the evaluation of the methods and tools (M&T) used by applicants as part of their HFE program. The technology used to perform HFE activities has been rapidly evolving, resulting in a whole new generation of HFE M&Ts. The objectives of this research were to identify the current trends in HFE methods and tools, determine their applicability to NRC safety reviews, and identify topics for which the NRC may need additional guidance to support the NRC's safety reviews. We conducted a survey that identified over 100 new HFE M&Ts. The M&Ts were assessed to identify general trends. Seven trends were identified: Computer Applications for Performing Traditional Analyses, Computer-Aided Design, Integration of HFE Methods and Tools, Rapid Development Engineering, Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. We assessed each trend to determine its applicability to the NRC's review by considering (1) whether the nuclear industry is making use of M&Ts for each trend, and (2) whether M&Ts reflecting the trend can be reviewed using the current design review guidance. We concluded that M&T trends that are applicable to the commercial nuclear industry and are expected to impact safety reviews may be considered for review guidance development. Three trends fell into this category: Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. The other trends do not need to be addressed at this time.

  4. Trends in HFE Methods and Tools and Their Applicability to Safety Reviews

    International Nuclear Information System (INIS)

    O'Hara, J.M.; Plott, C.; Milanski, J.; Ronan, A.; Scheff, S.; Laux, L.; Bzostek, J.

    2009-01-01

    The U.S. Nuclear Regulatory Commission (NRC) conducts human factors engineering (HFE) safety reviews of applicant submittals for new plants and for changes to existing plants. The reviews include the evaluation of the methods and tools (M and T) used by applicants as part of their HFE program. The technology used to perform HFE activities has been rapidly evolving, resulting in a whole new generation of HFE M and Ts. The objectives of this research were to identify the current trends in HFE methods and tools, determine their applicability to NRC safety reviews, and identify topics for which the NRC may need additional guidance to support the NRC's safety reviews. We conducted a survey that identified over 100 new HFE M and Ts. The M and Ts were assessed to identify general trends. Seven trends were identified: Computer Applications for Performing Traditional Analyses, Computer-Aided Design, Integration of HFE Methods and Tools, Rapid Development Engineering, Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. We assessed each trend to determine its applicability to the NRC's review by considering (1) whether the nuclear industry is making use of M and Ts for each trend, and (2) whether M and Ts reflecting the trend can be reviewed using the current design review guidance. We concluded that M and T trends that are applicable to the commercial nuclear industry and are expected to impact safety reviews may be considered for review guidance development. Three trends fell into this category: Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. The other trends do not need to be addressed at this time.

  5. Vegetation Earth System Data Record from DSCOVR EPIC Observations

    Science.gov (United States)

    Knyazikhin, Y.; Song, W.; Yang, B.; Mottus, M.; Rautiainen, M.; Stenberg, P.

    2017-12-01

    NASA's Earth Polychromatic Imaging Camera (EPIC) onboard NOAA's Deep Space Climate Observatory (DSCOVR) mission was launched on February 11, 2015 to the Sun-Earth Lagrangian L1 point, where it began to collect radiance data of the entire sunlit Earth every 65 to 110 min in June 2015. It provides imagery in near-backscattering directions, with the scattering angle between 168° and 176°, at ten ultraviolet to near-infrared (NIR) narrow spectral bands centered at 317.5 (band width 1.0) nm, 325.0 (2.0) nm, 340.0 (3.0) nm, 388.0 (3.0) nm, 433.0 (3.0) nm, 551.0 (3.0) nm, 680.0 (3.0) nm, 687.8 (0.8) nm, 764.0 (1.0) nm and 779.5 (2.0) nm. This poster presents the current status of the Vegetation Earth System Data Record of global Leaf Area Index (LAI), solar zenith angle dependent Sunlit Leaf Area Index (SLAI), Fraction of vegetation-absorbed Photosynthetically Active Radiation (FPAR) and Normalized Difference Vegetation Index (NDVI) derived from the DSCOVR EPIC observations. Whereas LAI is a standard product of many satellite missions, SLAI is a new satellite-derived parameter. Sunlit and shaded leaves exhibit different radiative responses to incident Photosynthetically Active Radiation (400-700 nm), which in turn triggers the various physiological and physical processes required for the functioning of plants. FPAR, LAI and SLAI are key state parameters in most ecosystem productivity models and in the carbon/nitrogen cycle. The product, at a 10 km sinusoidal grid and 65 to 110 min temporal frequency, as well as accompanying Quality Assessment (QA) variables, will be publicly available from the NASA Langley Atmospheric Science Data Center. The Algorithm Theoretical Basis (ATBD) and product validation strategy are also discussed in this poster.
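
    One of the listed products, NDVI, is computed from the red (~680 nm) and near-infrared (~780 nm) reflectances as NDVI = (NIR - Red) / (NIR + Red); a minimal sketch with synthetic reflectance values (not EPIC data) is shown below.

      # Sketch: NDVI from red and NIR surface reflectances; the arrays are synthetic placeholders.
      import numpy as np

      red = np.array([0.05, 0.08, 0.12])   # reflectance near 680 nm
      nir = np.array([0.40, 0.35, 0.20])   # reflectance near 780 nm
      ndvi = (nir - red) / (nir + red)
      print(np.round(ndvi, 2))             # higher values indicate denser green vegetation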

  6. Nitrous Oxide Emissions from Biofuel Crops and Parameterization in the EPIC Biogeochemical Model

    Science.gov (United States)

    This presentation describes year 1 field measurements of N2O fluxes and crop yields which are used to parameterize the EPIC biogeochemical model for the corresponding field site. Initial model simulations are also presented.

  7. Prospect-EPIC Utrecht: Study design and characteristics of the cohort population

    NARCIS (Netherlands)

    Boker, L.K.; Noord, P.A.H. van; Schouw, Y.T. van der; Koot, V.C.M.; Bueno-de-Mesquita, H.B.; Riboli, E.; Grobbee, D.E.; Peeters, P.H.M.

    2001-01-01

    The European Prospective Investigation into Cancer and Nutrition (EPIC), which has been established in order to investigate the relations between nutrition and cancer, was initiated in 1990 and involves 10 European countries with heterogeneous dietary patterns and differing cancer incidence rates. This

  8. Exploring Lyric, Epic, and Dramatic Voices: Stages of Incandescence in the Poetry of the Aged.

    Science.gov (United States)

    Reed, M. Ann

    1992-01-01

    Identifies true relationships between the psyche and the lyric, epic, and dramatic voices of poetry. Shows how the acts of identifying, responding to, and composing in these three voices engage healing, inspiration, and active imagination among the aging. (SR)

  9. A Woman Voice in an Epic: Tracing Gendered Motifs in Anne Vabarna's Peko

    Directory of Open Access Journals (Sweden)

    Andreas Kalkun

    2008-12-01

    Full Text Available In the article the gendered motifs found in Anne Vabarna's Seto epic Peko are analysed. Besides the narrative telling of the life of the male hero, the motifs regarding eating, refusing to eat or offering food, and the aspect of the female body or its control deserve to be noticed. These scenes do not advance the main plot, they are often related to minor characters of the epic and slow down the narrative, but at the same time they clearly carry artistic purpose and meaning. I consider these motifs, present in the liminal parts of the epic, to be the dominant symbols of the epic, where the author's feminine world is exposed. Observing these motifs of Peko in the context of the Seto religious worldview, the life of Anne Vabarna and the social position of Seto women, the symbols become eloquent and informative.

  10. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Wucherl [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Koo, Michelle [Univ. of California, Berkeley, CA (United States); Cao, Yu [California Inst. of Technology (CalTech), Pasadena, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-09-17

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance involving terabytes or petabytes of workflow data or measurement data of the executions, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug the performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework, using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply the most sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from the genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.

  11. Application of a tool of aid to the energy planning in isolated rural communities. Case of application Las Peladas

    International Nuclear Information System (INIS)

    Benítez Leyva, Lázaro Ventura; Jerez Pereira, Rubén; Pompa Chávez, Yanel; Tamayo Saborit, Michel; Rosa Andino, Alain de la

    2014-01-01

    This work reports a case study carried out in the rural community of Las Peladas, located in the municipality of Bartolome Maso Marquez, Granma province. It presents the application of a multi-objective mathematical model as a computer tool to aid energy planning, in accordance with the specific characteristics of the locality analysed. The field study was based on the results of a participatory survey, observation and data compilation, which made it possible to characterise the community and thereby delimit the parameters needed to apply the tool. Five alternatives were evaluated: wind energy, biomass, solar, hydraulic energy and connection to the national grid. Of these, the model suggests that solar photovoltaic energy exerts the greatest influence on improving the integral sustainability of the natural, physical, financial, human and social capitals of the community. (author)

  12. ALARA radiation protection applications at NPP A1 decommissioning using VISPLAN planning tool

    International Nuclear Information System (INIS)

    Slavik, O.; Kucharova, D.; Listjak, M.

    2005-01-01

    During the decommissioning of the BR3 reactor, SCK.CEN developed a graphically interfaced 3D dose assessment tool aimed at such dose optimization problems. The tool was further improved and commercialized under the name VISIPLAN 3D ALARA planning tool. The use of VISIPLAN at NPP A1 decommissioning was advised by EDF within the IAEA TCP SLR/4008 project missions as a component of a 3D technological chain ensuring acquisition and evaluation of digitized information leading to ALARA optimisation of the developed decommissioning working procedures (see Fig. 1). A VISIPLAN license was purchased by the IAEA and granted to VUJE and NPP A1 within the IAEA TCP project SLR/4008, covering also the necessary basic and advanced training (in the real decommissioning environment of NPP A1), and the tool is currently used by VUJE analysts at NPP A1 decommissioning as an ALARA tool for intervention planning and optimisation. VISIPLAN allows a fast dose assessment for work planned in a radioactive environment. The calculations are based on a 3D model of the workplace. This PC-based tool is user friendly and calculates a detailed dose account for different work scenarios defined by the ALARA analyst, taking into account worker position, work duration and subsequent geometry and source distribution changes. The VISIPLAN methodology is described. The VISIPLAN tool was applied to several ALARA studies carried out by VUJE within the development and analysis of various work programmes for NPP A1 decommissioning tasks. The applications ranged from simpler to complex ALARA planning tasks, covering simple shielding applications to the selection of the most suitable order of complex working procedures. Removal of inner liners from underground reservoirs 6/1 and 6/2, as well as dismantling of pipes and components from a (hostile) 60 m corridor, are described. (authors)

  13. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually using Microsoft Excel or other programs; this process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, the frequency ratio, weight-of-evidence (WoE) and evidential belief function (EBF) models, are implemented in the newly proposed ArcMAP tool. The tool is programmed in Python and provided with a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
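
    As a rough sketch of the frequency ratio statistic mentioned above (the share of hazard occurrences falling in a class divided by the share of total area in that class), the following Python fragment uses synthetic rasters; a real workflow would read the factor and inventory layers from GIS data.

      # Sketch: per-class frequency ratio FR = (% of hazard cells in class) / (% of all cells in class).
      import numpy as np

      rng = np.random.default_rng(0)
      classes = rng.integers(1, 4, size=(100, 100))   # factor map with classes 1..3 (synthetic)
      hazard = rng.random((100, 100)) < 0.1            # boolean hazard-occurrence map (synthetic)

      for c in np.unique(classes):
          in_class = classes == c
          pct_hazard = hazard[in_class].sum() / hazard.sum()
          pct_area = in_class.sum() / classes.size
          print(f"class {c}: FR = {pct_hazard / pct_area:.2f}")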

  14. PROMO: a computerized tool to support process monitoring activities -application in CANDU simulators

    International Nuclear Information System (INIS)

    Singh, D.T.; Singh, P.P.

    1995-01-01

    PROMO, a prototype computerized PROcess MOnitoring tool, has been designed for the resolution of perceived complexity under conditions of time constraints and criticality. It is suggested that this makes it uniquely suitable for applications such as nuclear power plant operator training and support. This paper describes the tool, the theory underlying its design, and results from preliminary laboratory experiments. While field tests are necessary prior to the drawing of conclusions, the results from the laboratory trials are promising. Efforts are currently underway to extend the research setting to power plant operator training centers. (author). 57 refs., 1 fig

  15. PROMO: a computerized tool to support process monitoring activities -application in CANDU simulators

    Energy Technology Data Exchange (ETDEWEB)

    Singh, D T [York Univ., Toronto, ON (Canada); Singh, P P [Case Western Reserve Univ., Cleveland, OH (United States)

    1996-12-31

    PROMO, a prototype computerized PROcess MOnitoring tool, has been designed for the resolution of perceived complexity under conditions of time constraints and criticality. It is suggested that this makes it uniquely suitable for applications such as nuclear power plant operator training and support. This paper describes the tool, the theory underlying its design, and results from preliminary laboratory experiments. While field tests are necessary prior to the drawing of conclusions, the results from the laboratory trials are promising. Efforts are currently underway to extend the research setting to power plant operator training centers. (author). 57 refs., 1 fig.

  16. Application of lean tools in the supply chain of a maintenance environment

    Directory of Open Access Journals (Sweden)

    Fourie, C. J.

    2017-05-01

    Full Text Available Historically, Lean thinking has had limited application in the maintenance environment (that is, a non-manufacturing environment). This article reports on the Lean tools that can be implemented in the maintenance environment. To achieve this, the supply chain management of a typical rolling stock service organisation was used for analysis and validation. The approach was initially to map the current supply chain process through the standard method of value stream mapping so as to identify non-Lean activities. After mapping the current state, other suitable Lean tools for the current supply chain management were applied. Finally, performance indicators were formulated for continuous review and assessment.

  17. Application analysis tools for ASIP design: application profiling and instruction-set customization

    National Research Council Canada - National Science Library

    Karuri, Kingshuk; Leupers, Ranier

    2011-01-01

    ... of the operating system they are targeting and execute them on the main application processor in the SmartPhone. However, these are not the only applications that need to run on the SmartPhone. There are many underlying applications, or let us rather call them algorithms, that have very demanding performance and power consumption targets, yet need to be flexible. Here is where the custom processor comes into play for the implementation of algorithms that will have certain variability but are covering a narrow enough design sp...

  18. Hydrological Scenario Using Tools and Applications Available in enviroGRIDS Portal

    Science.gov (United States)

    Bacu, V.; Mihon, D.; Stefanut, T.; Rodila, D.; Cau, P.; Manca, S.; Soru, C.; Gorgan, D.

    2012-04-01

    Nowadays decision makers, but also citizens, are concerned with the sustainability and vulnerability of land management practices in various respects, and in particular with water quality and quantity in complex watersheds. The Black Sea Catchment is an important watershed in Central and Eastern Europe. In the FP7 project enviroGRIDS [1], a web portal was developed that incorporates different tools and applications focused on geospatial data management, hydrologic model calibration, execution and visualization, and training activities. This presentation highlights, from the end-user point of view, the scenario related to hydrological models using the tools and applications available in the enviroGRIDS web portal [2]. The development of SWAT (Soil and Water Assessment Tool) hydrological models is a well-known procedure for hydrological specialists [3]. Starting from the primary data (information on weather, soil properties, topography, vegetation and land management practices of the particular watershed) used to develop SWAT hydrological models, up to specific reports about the water quality in the studied watershed, the hydrological specialist will use different applications available in the enviroGRIDS portal. The tools and applications available through the enviroGRIDS portal do not deal with building the SWAT hydrological models themselves. They are mainly focused on: the calibration procedure (gSWAT [4]), which uses the grid computational infrastructure to speed up the calibration process; the development of specific scenarios (BASHYT [5]), which starts from an already calibrated SWAT hydrological model and defines new scenarios; the execution of scenarios (gSWATSim [6]), which executes the scenarios exported from BASHYT; and visualization (BASHYT), which displays charts, tables and maps. Each application is built up as a stack of functional layers. We combine different layers of applications by vertical interoperability in order to build the desired complex functionality. On

  19. Sustainable Use of Pesticide Applications in Citrus: A Support Tool for Volume Rate Adjustment

    Directory of Open Access Journals (Sweden)

    Cruz Garcerá

    2017-06-01

    Full Text Available Rational application of pesticides by properly adjusting the amount of product to the actual needs and specific conditions for application is a key factor for sustainable plant protection. However, current plant protection product (PPP) labels registered for citrus in the EU are usually expressed as concentration (%; rate/hl) and/or as the maximum dose of product per unit of ground surface, without taking into account those conditions. In this work, the fundamentals of a support tool, called CitrusVol, developed to recommend mix volume rates in PPP applications in citrus orchards using airblast sprayers, are presented. This tool takes into consideration crop characteristics (geometry, leaf area density), pests, and product and application efficiency, and it is based on scientific data obtained previously regarding the minimum deposit required to achieve maximum efficacy, efficiency of airblast sprayers in citrus orchards, and characterization of the crop. The use of this tool in several commercial orchards allowed a reduction of the volume rate and the PPPs used in comparison with the commonly used by farmers of between 11% and 74%, with an average of 31%, without affecting the efficacy. CitrusVol is freely available on a website and in an app for smartphones.

  20. Application of the NCSA Habanero tool for collaboration on structural integrity assessments

    International Nuclear Information System (INIS)

    Bass, B.R.; Kruse, K.; Dodds, R.H. Jr.; Malik, S.N.M.

    1998-11-01

    The Habanero software was developed by the National Center for Supercomputing Applications at the University of Illinois, Urbana-Champaign, as a framework for the collaborative sharing of Java applications. The Habanero tool performs distributed communication of single-user, computer software interactions to a multiuser collaborative environment. An investigation was conducted to evaluate the capabilities of the Habanero tool in providing an Internet-based collaborative framework for researchers located at different sites and operating on different workstations. These collaborative sessions focused on the sharing of test data and analysis results from materials engineering areas (i.e., fracture mechanics and structural integrity evaluations) related to reactor pressure vessel safety research sponsored by the US Nuclear Regulatory Commission. This report defines collaborative-system requirements for engineering applications and provides an overview of collaborative systems within the project. The installation, application, and detailed evaluation of the performance of the Habanero collaborative tool are compared to those of another commercially available collaborative product. Recommendations are given for future work in collaborative communications

  1. Application of the GEM Inventory Data Capture Tools for Dynamic Vulnerability Assessment and Recovery Modelling

    Science.gov (United States)

    Verrucci, Enrica; Bevington, John; Vicini, Alessandro

    2014-05-01

    A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely-sensed imagery and statistically-sampled in-situ field data of buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically-inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project. These are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of 1) dynamic vulnerability assessment (pre-event), and 2) recovery monitoring and evaluation (post-event) are discussed. Strategies for using the IDC Tools for these purposes are discussed. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, it is clear the IDCT tools have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities and resilience trends can be

  2. Replica sizing strategy for aortic valve replacement improves haemodynamic outcome of the Epic Supra valve.

    Science.gov (United States)

    Gonzalez-Lopez, David; Faerber, Gloria; Diab, Mahmoud; Amorim, Paulo; Zeynalov, Natig; Doenst, Torsten

    2017-10-01

    Current sizing strategies suggest valve selection based on annulus diameter despite supra-annular placement of biological prostheses potentially allowing placement of a larger size. We assessed the frequency of selecting a larger prosthesis if prosthesis size was selected using a replica (upsizing) and evaluated its impact on haemodynamics. We analysed all discharge echocardiograms between June 2012 and June 2014, where a replica sizer was used for isolated aortic valve replacement (Epic Supra: 266 patients, Trifecta: 49 patients). Upsizing was possible in 71% of the Epic Supra valves (by 1 size: 168, by 2 sizes: 20) and in 59% of the Trifectas (by 1 size: 26, by 2 sizes: 3). Patients for whom upsizing was possible had the lowest pressure gradients within their annulus size groups. The difference was significant in annulus diameters of 21-22 or 25-26 mm (Epic Supra) and 23-24 mm (Trifecta). Trifecta gradients were the lowest. However, the ability to upsize the Epic Supra by 2 sizes eliminated the differences between Epic Supra and Trifecta. Upsizing did not cause intraoperative complications. Using replica sizers for aortic prosthesis size selection allows the implantation of bigger prostheses than recommended in most cases and reduces postoperative gradients, specifically for Epic Supra. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  3. The EPIC nutrient database project (ENDB): a first attempt to standardize nutrient databases across the 10 European countries participating in the EPIC study

    DEFF Research Database (Denmark)

    Slimani, N.; Deharveng, G.; Unwin, I.

    2007-01-01

    because there is currently no European reference NDB available. Design: A large network involving national compilers, nutritionists and experts on food chemistry and computer science was set up for the 'EPIC Nutrient DataBase' (ENDB) project. A total of 550-1500 foods derived from about 37 000...... standardized EPIC 24-h dietary recalls (24-HDRS) were matched as closely as possible to foods available in the 10 national NDBs. The resulting national data sets (NDS) were then successively documented, standardized and evaluated according to common guidelines and using a DataBase Management System

  4. Application of Statistical Tools for Data Analysis and Interpretation in Rice Plant Pathology

    Directory of Open Access Journals (Sweden)

    Parsuram Nayak

    2018-01-01

    Full Text Available There has been a significant advancement in the application of statistical tools in plant pathology during the past four decades. These tools include multivariate analysis of disease dynamics involving principal component analysis, cluster analysis, factor analysis, pattern analysis, discriminant analysis, multivariate analysis of variance, correspondence analysis, canonical correlation analysis, redundancy analysis, genetic diversity analysis, and stability analysis, which involves joint regression, additive main effects and multiplicative interactions, and genotype-by-environment interaction biplot analysis. The advanced statistical tools, such as non-parametric analysis of disease association, meta-analysis, Bayesian analysis, and decision theory, take an important place in the analysis of disease dynamics. Disease forecasting methods based on simulation models have great potential in practical disease control strategies. Common mathematical models such as the monomolecular, exponential, logistic, Gompertz and linked differential equations are central to growth-curve analysis of disease epidemics. Construction of box-and-whisker plots has been suggested as a highly informative means of displaying a range of numerical data. Probable applications of recent advanced linear and non-linear mixed-model tools, such as the linear mixed model, generalized linear model, and generalized linear mixed model, are also presented. The most recent technologies such as micro-array analysis, though cost effective, provide estimates of gene expression for thousands of genes simultaneously and need attention from molecular biologists. Some of these advanced tools can be well applied in different branches of rice research, including crop improvement, crop production, crop protection, social sciences as well as agricultural engineering. The rice research scientists should take advantage of these new opportunities adequately in
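
    As a small worked example of the growth-curve analysis mentioned above, the sketch below fits a logistic disease-progress curve, y(t) = K / (1 + exp(-r(t - t0))), to synthetic severity data with SciPy; the observations and starting values are invented purely for illustration.

      import numpy as np
      from scipy.optimize import curve_fit

      def logistic(t, K, r, t0):
          """Logistic disease-progress curve: K = upper asymptote, r = rate, t0 = inflection time."""
          return K / (1.0 + np.exp(-r * (t - t0)))

      # Synthetic disease-severity observations (proportion of tissue affected) by day.
      days = np.array([0, 7, 14, 21, 28, 35, 42], dtype=float)
      severity = np.array([0.01, 0.03, 0.10, 0.28, 0.55, 0.72, 0.80])

      params, _ = curve_fit(logistic, days, severity, p0=[0.9, 0.2, 25.0])
      K, r, t0 = params
      print(f"K = {K:.2f}, r = {r:.3f} per day, t0 = {t0:.1f} days")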

  5. Analytics for smart energy management tools and applications for sustainable manufacturing

    CERN Document Server

    Oh, Seog-Chan

    2016-01-01

    This book introduces the issues and problems that arise when implementing smart energy management for sustainable manufacturing in the automotive manufacturing industry and the analytical tools and applications to deal with them. It uses a number of illustrative examples to explain energy management in automotive manufacturing, which involves most types of manufacturing technology and various levels of energy consumption. It demonstrates how analytical tools can help improve energy management processes, including forecasting, consumption, and performance analysis, emerging new technology identification as well as investment decisions for establishing smart energy consumption practices. It also details practical energy management systems, making it a valuable resource for professionals involved in real energy management processes, and allowing readers to implement the procedures and applications presented.

  6. Proceedings of the of the Tenth Workshop on Language Descriptions, Tools and Applications (LDTA 2010)

    DEFF Research Database (Denmark)

    Brabrand, Claus

    2010-01-01

    -Louis Giavitto ("A Domain Specific Language for Complex Natural & Artificial Systems Simulations") and the 11 contributed papers that were selected for presentation and the proceedings by the programme committee from 30 submissions (i.e., 37% acceptance rate). Every submission was reviewed by at least three......This volume contains the proceedings of the Tenth Workshop on Language Descriptions, Tools and Applications (LDTA 2010), held in Paphos, Cyprus on March 28--29, 2010. LDTA is a two-day satellite event of ETAPS (European Joint Conferences on Theory and Practice of Software) organized in cooperation...... with ACM Sigplan. LDTA is an application and tool-oriented forum on meta programming in a broad sense. A meta program is a program that takes other programs as input or output. The focus of LDTA is on generated or otherwise efficiently implemented meta programs, possibly using high level descriptions...

  7. Development and application of modeling tools for sodium fast reactor inspection

    Energy Technology Data Exchange (ETDEWEB)

    Le Bourdais, Florian; Marchand, Benoît; Baronian, Vahan [CEA LIST, Centre de Saclay F-91191 Gif-sur-Yvette (France)

    2014-02-18

    To support the development of in-service inspection methods for the Advanced Sodium Test Reactor for Industrial Demonstration (ASTRID) project led by the French Atomic Energy Commission (CEA), several tools that allow situations specific to Sodium cooled Fast Reactors (SFR) to be modeled have been implemented in the CIVA software and exploited. This paper details specific applications and results obtained. For instance, a new specular reflection model allows the calculation of complex echoes from scattering structures inside the reactor vessel. EMAT transducer simulation models have been implemented to develop new transducers for sodium visualization and imaging. Guided wave analysis tools have been developed to permit defect detection in the vessel shell. Application examples and comparisons with experimental data are presented.

  8. MURMoT. Design and Application of Microbial Uranium Reduction Monitoring Tools

    Energy Technology Data Exchange (ETDEWEB)

    Loeffler, Frank E. [Univ. of Tennessee, Knoxville, TN (United States)

    2014-12-31

    Uranium (U) contamination in the subsurface is a major remediation challenge at many DOE sites. Traditional site remedies present enormous costs to DOE; hence, enhanced bioremediation technologies (i.e., biostimulation and bioaugmentation) combined with monitoring efforts are being considered as cost-effective corrective actions to address subsurface contamination. This research effort improved understanding of the microbial U reduction process and developed new tools for monitoring microbial activities. Application of these tools will promote science-based site management decisions that achieve contaminant detoxification, plume control, and long-term stewardship in the most efficient manner. The overarching hypothesis was that the design, validation and application of a suite of new molecular and biogeochemical tools advance process understanding, and improve environmental monitoring regimes to assess and predict in situ U immobilization. Accomplishments: This project (i) advanced nucleic acid-based approaches to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-detoxifying bacteria; (ii) developed proteomics workflows for detection of metal reduction biomarker proteins in laboratory cultures and contaminated site groundwater; (iii) developed and demonstrated the utility of U isotopic fractionation using high precision mass spectrometry to quantify U(VI) reduction for a range of reduction mechanisms and environmental conditions; and (iv) validated the new tools using field samples from U-contaminated IFRC sites, and demonstrated their prognostic and diagnostic capabilities in guiding decision making for environmental remediation and long-term site stewardship.

  9. Applicability of the Existing CVD Risk Assessment Tools to Type II Diabetics in Oman: A Review

    Directory of Open Access Journals (Sweden)

    Abdulhakeem Al-Rawahi

    2015-09-01

    Full Text Available Patients with type II diabetes (T2DM have an elevated risk for cardiovascular disease (CVD, and it is considered to be a leading cause of morbidity and premature mortality in these patients. Many traditional risk factors such as age, male sex, hypertension, dyslipidemia, glycemic control, diabetes duration, renal dysfunction, obesity, and smoking have been studied and identified as independent factors for CVD. Quantifying the risk of CVD among diabetics using the common risk factors in order to plan the treatment and preventive measures is important in the management of these patients as recommended by many clinical guidelines. Therefore, several risk assessment tools have been developed in different parts of the world for this purpose. These include the tools that have been developed for general populations and considered T2DM as a risk factor, and the tools that have been developed for T2DM populations specifically. However, due to the differences in sociodemographic factors and lifestyle patterns, as well as the differences in the distribution of various CVD risk factors in different diabetic populations, the external applicability of these tools on different populations is questionable. This review aims to address the applicability of the existing CVD risk models to the Omani diabetic population.

  10. Solid waste management in primary healthcare centers: application of a facilitation tool

    Directory of Open Access Journals (Sweden)

    Ana Maria Maniero Moreira

    Full Text Available Abstract Objectives: to propose a tool to facilitate diagnosis, formulation and evaluation of the Waste Management Plan in Primary Healthcare Centers and to present the results of its application in four selected units. Method: descriptive research, covering the stages of formulation/application of the proposed instrument and the evaluation of waste management performance at the units. Results: the tool consists of five forms; specific indicators of waste generation for outpatient healthcare units were proposed, together with performance indicators that score compliance with current legislation. The studied units generate common waste (52-60%), infectious-sharps waste (31-42%) and recyclables (5-17%). The average generation rates are 0.09 kg of total waste per outpatient assistance and 0.09 kg of infectious-sharps waste per outpatient procedure. Compliance with regulations, initially 26-30%, reached 30-38% a year later. Conclusion: the tool proved easy to use despite the complex range of existing regulatory requirements; it allowed non-conformities to be identified, pointed out corrective measures and evaluated the performance of waste management. In this sense, it contributes to decision making and management practices relating to waste, tasks usually assigned to nurses. It is recommended that the tool be applied in similar healthcare units for comparative studies, with the necessary adaptations implemented for other medical services.
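
    The generation-rate and compliance indicators reported above reduce to simple ratio calculations. The snippet below is only an illustrative sketch with placeholder figures; it is not the tool's actual forms or scoring rule.

      # Illustrative indicator calculations for one healthcare unit (placeholder data).
      total_waste_kg = 45.0          # total waste generated in the period
      infectious_sharps_kg = 16.0    # infectious and sharps waste in the period
      outpatient_assistances = 500   # outpatient visits in the period
      outpatient_procedures = 180    # procedures generating infectious/sharps waste

      rate_total = total_waste_kg / outpatient_assistances
      rate_infectious = infectious_sharps_kg / outpatient_procedures
      print(f"{rate_total:.2f} kg total waste per outpatient assistance")
      print(f"{rate_infectious:.2f} kg infectious-sharps waste per outpatient procedure")

      # A simple compliance score: share of applicable legal requirements met (hypothetical checklist).
      requirements_met, requirements_total = 19, 63
      print(f"compliance: {100 * requirements_met / requirements_total:.0f}%")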

  11. Solid waste management in primary healthcare centers: application of a facilitation tool 1

    Science.gov (United States)

    Moreira, Ana Maria Maniero; Günther, Wanda Maria Risso

    2016-01-01

    Abstract Objectives: to propose a tool to facilitate diagnosis, formulation and evaluation of the Waste Management Plan in Primary Healthcare Centers and to present the results of its application in four selected units. Method: descriptive research, covering the stages of formulation/application of the proposed instrument and the evaluation of waste management performance at the units. Results: the tool consists of five forms; specific indicators of waste generation for outpatient healthcare units were proposed, together with performance indicators that score compliance with current legislation. The studied units generate common waste (52-60%), infectious-sharps waste (31-42%) and recyclables (5-17%). The average generation rates are 0.09 kg of total waste per outpatient assistance and 0.09 kg of infectious-sharps waste per outpatient procedure. Compliance with regulations, initially 26-30%, reached 30-38% a year later. Conclusion: the tool proved easy to use despite the complex range of existing regulatory requirements; it allowed non-conformities to be identified, pointed out corrective measures and evaluated the performance of waste management. In this sense, it contributes to decision making and management practices relating to waste, tasks usually assigned to nurses. It is recommended that the tool be applied in similar healthcare units for comparative studies, with the necessary adaptations implemented for other medical services. PMID:27556874

  12. GPS as a tool used in tourism as illustrated by selected mobile applications

    Science.gov (United States)

    Szark-Eckardt, Mirosława

    2017-11-01

    Mobile technologies have permanently changed our way of life. Their availability, common use and introduction into virtually all areas of human activity mean that we can call present times the age of mobility [1]. Mobile applications based on the GPS module are among the most dynamically developing apps, as is particularly reflected in tourism. A multitude of applications dedicated to different participants of tourism, which can be operated by means of smartphones or simple GPS trackers, encourage more people to reach for this kind of technology, perceiving it as a basic tool of today's tourism. With increasingly wide access to mobile applications, not only can more dynamic development of tourism itself be observed, but also the growth of healthy behaviours that are a positive "side effect" of tourism based on mobile technology. This article demonstrates a correlation between the health and physical condition of the population and the use of mobile applications.

  13. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    Science.gov (United States)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review on the last achievements of Digital Holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progresses and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.

  14. THREE-DIMENSIONAL WEB-BASED PHYSICS SIMULATION APPLICATION FOR PHYSICS LEARNING TOOL

    Directory of Open Access Journals (Sweden)

    William Salim

    2012-10-01

    Full Text Available The purpose of this research is to present a multimedia application for performing simulations in physics. The application is a web-based simulator implemented with HTML5, WebGL, and JavaScript. The objects and the environment are rendered in three-dimensional views. The application is intended to serve as a substitute for practicum activities. In its current state, the application covers only Newtonian mechanics. Questionnaires and a literature study were used as the data-collection methods, and the waterfall method was used as the design method. The result is a three-dimensional physics simulator delivered as an online web application. Three-dimensional design and a mentor-mentee relationship are the key features of this application. The conclusion is that the three-dimensional physics simulator already satisfies users in both design and functionality, and it also helps them to understand Newtonian mechanics through simulation. Improvements are needed, because the application currently covers only Newtonian mechanics; in the future the simulation could also cover other physics topics such as optics, energy, or electricity. Keywords: simulation, physics, learning tool, HTML5, WebGL
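
    The simulator itself runs in the browser with HTML5, WebGL and JavaScript; the sketch below uses Python only to illustrate the kind of per-frame Newtonian-mechanics update such a tool performs - a projectile under gravity advanced with explicit Euler steps. All values are arbitrary.

      # Minimal Newtonian-mechanics update loop (explicit Euler), of the kind a
      # browser physics simulator runs once per animation frame; arbitrary SI values.
      dt = 1.0 / 60.0          # time step (one frame at 60 fps)
      g = (0.0, -9.81, 0.0)    # gravitational acceleration (m/s^2)
      pos = [0.0, 1.5, 0.0]    # initial position (m)
      vel = [3.0, 4.0, 0.0]    # initial velocity (m/s)

      t = 0.0
      while pos[1] > 0.0:      # stop when the projectile reaches the ground plane
          for i in range(3):
              vel[i] += g[i] * dt
              pos[i] += vel[i] * dt
          t += dt

      print(f"landed after {t:.2f} s at x = {pos[0]:.2f} m")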

  15. The Evaluation and Application Plan Report for the Development of Nuclear Power Plant DCS Using CASE Tools

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B.Y.; Moon, H.J.; Yoon, M.H.; Lee, Y.K. [Korea Electric Power Research Institute, Taejon (Korea)

    2000-06-01

    This report contains the evaluation and application plan for the development of a nuclear power plant DCS using CASE tools. The necessity of using CASE tools is considered and an available CASE environment is suggested. In addition, in accordance with IEEE Std 1209, Recommended Practice for the Evaluation and Selection of CASE Tools, a functional and economic evaluation of available commercial CASE tools is performed and described. (author). 6 figs., 3 tabs.

  16. A Critique of the Application of Neighborhood Sustainability Assessment Tools in Urban Regeneration

    Directory of Open Access Journals (Sweden)

    Luke Boyle

    2018-03-01

    Full Text Available Neighbourhood Sustainability Assessment Tools (NSA tools are fast becoming the principal framework for urban planners and developers for promoting urban sustainability. The majority of NSA tools promote a specific type of urban development that effectively excludes regeneration projects from the urban sustainability conversation. Given that the world’s megacities are mostly built, it is argued that it is essential that strategies for global sustainability consider that urban development is focussed internally to address existing, under-serviced communities in particular need of meaningful intervention and sustainable redevelopment frameworks. The paper uses existing knowledge on NSA tools to highlight the shortcomings of outcomes-based approaches to urban governance and builds the case that the technocratic “one-size-fits-all” approach adopted by many tools inadequately accounts for underlying institutional, social and economic arrangements that influence urban development, making them inappropriate for application in both planned and existing communities. The paper proposes that urban redevelopment strategies need to be derived from the urban realities of a particular place or context. Such strategies must be grounded in principles of urban governance, participatory action and an understanding of market dynamics. Without these collaborative procedural frameworks, urban regeneration projects will continue to inadequately transition towards more comprehensive sustainability.

  17. Ratsnake: A Versatile Image Annotation Tool with Application to Computer-Aided Diagnosis

    Directory of Open Access Journals (Sweden)

    D. K. Iakovidis

    2014-01-01

    Full Text Available Image segmentation and annotation are key components of image-based medical computer-aided diagnosis (CAD) systems. In this paper we present Ratsnake, a publicly available generic image annotation tool providing annotation efficiency, semantic awareness, versatility, and extensibility, features that can be exploited to transform it into an effective CAD system. In order to demonstrate this unique capability, we present its novel application for the evaluation and quantification of salient objects and structures of interest in kidney biopsy images. Accurate annotation identifying and quantifying such structures in microscopy images can provide an estimation of pathogenesis in obstructive nephropathy, which is a rather common disease with severe implications in children and infants. However, a tool for detecting and quantifying the disease is not yet available. A machine learning-based approach, which utilizes prior domain knowledge and textural image features, is considered for the generation of an image force field customizing the presented tool for automatic evaluation of kidney biopsy images. The experimental evaluation of the proposed application of Ratsnake demonstrates its efficiency and effectiveness and promises its wide applicability across a variety of medical imaging domains.
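
    The textural image features mentioned above can be made concrete with a small numerical example. The sketch below computes a grey-level co-occurrence matrix and its contrast for a toy image patch in plain NumPy; it only illustrates the kind of texture descriptor such a machine-learning step might consume and is not code from Ratsnake.

      import numpy as np

      def glcm_contrast(patch, levels=4, dx=1, dy=0):
          """Grey-level co-occurrence matrix for one pixel offset, plus its contrast measure."""
          glcm = np.zeros((levels, levels), dtype=float)
          h, w = patch.shape
          for y in range(h - dy):
              for x in range(w - dx):
                  glcm[patch[y, x], patch[y + dy, x + dx]] += 1
          glcm /= glcm.sum()
          i, j = np.indices(glcm.shape)
          return glcm, float(np.sum(glcm * (i - j) ** 2))

      # Toy 4-level image patch (e.g. a coarsely quantized region of a biopsy image).
      patch = np.array([[0, 0, 1, 1],
                        [0, 0, 1, 1],
                        [2, 2, 3, 3],
                        [2, 2, 3, 3]])
      _, contrast = glcm_contrast(patch)
      print(f"GLCM contrast: {contrast:.3f}")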

  18. Continuity with Creole Epic and Innovation in the «History of New Mexico», 1610

    Directory of Open Access Journals (Sweden)

    Pedro Cebollero

    2017-11-01

    Full Text Available This is an introductory study to Gaspar de Villagrá’s History of New Mexico (1610 in the context of the Cortés epic cycle tradition. The concept of «epic continuity» is utilized in order to better understand how the poem adheres to epic tradition at the same time that it goes through formal changes needed for its renovation. Part of the renovation is the inclusion of prose as a legal defense strategy, and its use as a vehicle for the struggles of Mexican Creoles. It is proposed that this poem belongs in a Frontier Creole epic sub-cycle.

  19. Usefulness of the automatic quantitative estimation tool for cerebral blood flow: clinical assessment of the application software tool AQCEL.

    Science.gov (United States)

    Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi

    2011-01-01

    coefficient between AQCEL and conventional methods were 0.973 and 0.986 for the normal and affected sides at rest, respectively, and 0.977 and 0.984 for the normal and affected sides after ACZ loading, respectively. The quality of images reconstructed using the application software AQCEL was superior to that obtained using the conventional method after ACZ loading, and high quantitative correlations were shown at rest and after ACZ loading. This software can be applied in clinical practice and is a useful tool for improving reproducibility and throughput.

  20. The Applicability of Taylor’s Model to the Drilling of CFRP Using Uncoated WC-Co Tools: The Influence of Cutting Speed on Tool Wear

    OpenAIRE

    Merino Perez, J.L.; Merson, E.; Ayvar-Soberanis, S.; Hodzic, A.

    2014-01-01

    This work investigates the applicability of Taylor’s model to the drilling of CFRP using uncoated WC-Co tools, by assessing the influence of cutting speed (Vc) on tool wear. Two different resins, possessing low and high glass transition temperatures (Tg), and two different reinforcements, high-strength and high-modulus woven fabrics, were combined into three different systems. The flank-wear-rate gradient was found to be more reinforcement dependent, while the actual flank wear rate was shown to be ...
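
    Taylor's model referred to in the title relates cutting speed V and tool life T through V * T^n = C. The sketch below only evaluates that relation with invented constants; real values of n and C for uncoated WC-Co tools drilling CFRP would have to come from the wear measurements themselves.

      # Taylor's tool-life equation: V * T**n = C  (V = cutting speed, T = tool life).
      # The constants below are placeholders, not values fitted in the study.
      n, C = 0.35, 400.0

      def tool_life_minutes(v_cutting):
          """Predicted tool life T (min) for a given cutting speed V (m/min)."""
          return (C / v_cutting) ** (1.0 / n)

      for v in (60.0, 90.0, 120.0):
          print(f"V = {v:5.1f} m/min  ->  T = {tool_life_minutes(v):7.1f} min")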

  1. SWAT Check: A Screening Tool to Assist Users in the Identification of Potential Model Application Problems.

    Science.gov (United States)

    White, Michael J; Harmel, R Daren; Arnold, Jeff G; Williams, Jimmy R

    2014-01-01

    The Soil and Water Assessment Tool (SWAT) is a basin-scale hydrologic model developed by the United States Department of Agriculture Agricultural Research Service. SWAT's broad applicability, user-friendly model interfaces, and automatic calibration software have led to a rapid increase in the number of new users. These advancements also allow less experienced users to conduct SWAT modeling applications. In particular, the use of automated calibration software may produce simulated values that appear appropriate because they adequately mimic measured data used in calibration and validation. Autocalibrated model applications (and often those of inexperienced modelers) may contain input data errors and inappropriate parameter adjustments not readily identified by users or the autocalibration software. The objective of this research was to develop a program to assist users in the identification of potential model application problems. The resulting "SWAT Check" is a stand-alone Microsoft Windows program that (i) reads selected SWAT output and alerts users of values outside the typical range; (ii) creates process-based figures for visualization of the appropriateness of output values, including important outputs that are commonly ignored; and (iii) detects and alerts users of common model application errors. By alerting users to potential model application problems, this software should assist the SWAT community in developing more reliable modeling applications. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
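
    A minimal illustration of the first function described above (flagging outputs that fall outside a typical range) is sketched below. The output names and bounds are invented; SWAT Check itself is a stand-alone Windows program, not this script.

      # Illustrative screening of model outputs against typical ranges.
      # Output names and bounds are placeholders, not the actual SWAT Check rules.
      typical_ranges = {
          "curve_number": (35.0, 98.0),
          "et_fraction_of_precip": (0.3, 0.95),   # evapotranspiration / precipitation
          "surface_runoff_mm": (10.0, 600.0),
      }

      simulated = {"curve_number": 99.2, "et_fraction_of_precip": 0.62, "surface_runoff_mm": 4.1}

      for name, value in simulated.items():
          lo, hi = typical_ranges[name]
          if not (lo <= value <= hi):
              print(f"WARNING: {name} = {value} is outside the typical range [{lo}, {hi}]")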

  2. Development and Application of a Tool for Optimizing Composite Matrix Viscoplastic Material Parameters

    Science.gov (United States)

    Murthy, Pappu L. N.; Naghipour Ghezeljeh, Paria; Bednarcyk, Brett A.

    2018-01-01

    This document describes a recently developed analysis tool that enhances the resident capabilities of the Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) and its application. MAC/GMC is a composite material and laminate analysis software package developed at NASA Glenn Research Center. The primary focus of the current effort is to provide a graphical user interface (GUI) capability that helps users optimize highly nonlinear viscoplastic constitutive law parameters by fitting experimentally observed/measured stress-strain responses under various thermo-mechanical conditions for braided composites. The tool has been developed utilizing the MATrix LABoratory (MATLAB) (The Mathworks, Inc., Natick, MA) programming language. Illustrative examples shown are for a specific braided composite system wherein the matrix viscoplastic behavior is represented by a constitutive law described by seven parameters. The tool is general enough to fit any number of experimentally observed stress-strain responses of the material. The number of parameters to be optimized, as well as the importance given to each stress-strain response, is left to the user. Three different optimization algorithms are included: (1) optimization based on the gradient method, (2) genetic algorithm (GA) based optimization and (3) particle swarm optimization (PSO). The user can mix and match the three algorithms. For example, one can start optimization with either 2 or 3 and then use the optimized solution to further fine-tune with approach 1. The secondary focus of this paper is to demonstrate the application of this tool to optimize/calibrate parameters for a nonlinear viscoplastic matrix to predict stress-strain curves (for constituent and composite levels) at different rates, temperatures and/or loading conditions utilizing the Generalized Method of Cells. After preliminary validation of the tool through comparison with experimental results, a detailed virtual parametric study is
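
    The heart of the calibration workflow described above is a least-squares fit of constitutive parameters to measured stress-strain curves. The sketch below shows that idea on a deliberately crude two-parameter elastic-plastic law with a SciPy minimizer (Nelder-Mead, for simplicity); the actual tool fits a seven-parameter viscoplastic law through MAC/GMC and also offers GA and PSO, none of which is reproduced here.

      import numpy as np
      from scipy.optimize import minimize

      def model_stress(strain, E, sigma0):
          """Toy bilinear elastic-plastic law standing in for the real viscoplastic model."""
          return np.minimum(E * strain, sigma0 + 0.05 * E * strain)

      # Synthetic "measured" stress-strain curve (placeholder data with noise).
      strain = np.linspace(0.0, 0.02, 50)
      measured = model_stress(strain, 70e3, 300.0) + np.random.default_rng(0).normal(0.0, 2.0, 50)

      def objective(p):
          E, sigma0 = p
          return np.sum((model_stress(strain, E, sigma0) - measured) ** 2)

      result = minimize(objective, x0=[50e3, 200.0], method="Nelder-Mead")
      print("fitted E, sigma0:", result.x)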

  3. CRISP. Requirements Specifications of Intelligent ICT Simulation Tools for Power Applications

    Energy Technology Data Exchange (ETDEWEB)

    Warmer, C.J.; Kester, J.C.P.; Kamphuis, I.G. [ECN Energy in the Built Environment and Networks, Petten (Netherlands); Carlsson, P [EnerSearch, Malmoe (Sweden); Fontela, M. [Laboratory of Electrical Engineering LEG, Grenoble (France); Gustavsson, R. [Blekinge Institute of Technology BTH, Karlskrona (Sweden)

    2003-10-15

    This report, deliverable D2.1 in the CRISP project, serves as a preparation report for the development of simulation tools and prototype software which will be developed in forthcoming stages of the CRISP project. Application areas for these simulations are: fault detection and diagnosis, supply and demand matching and intelligent load shedding. The context in which these applications function is the power network with a high degree of distributed generation, including renewables. In order to control a so-called distributed grid we can benefit from a high level of distributed control and intelligence. This requires, on top of the power system network, an information and communication network. We argue that such a network should be seen as an enabler of distributed control and intelligence. The applications, through which control and intelligence is implemented, then form a third network layer, the service oriented network. Building upon this three-layered network model we derive in this report the requirements for a simulation tool and experiments which study new techniques for fault detection and diagnostics and for simulation tools and experiments implementing intelligent load shedding and supply and demand matching scenarios. We also look at future implementation of these services within the three-layered network model and the requirements that follow for the core information and communication network and for the service oriented network. These requirements, supported by the studies performed in the CRISP Workpackage 1, serve as a basis for development of the simulation tools in the tasks 2.2 to 2.4.

  4. CRISP. Requirements Specifications of Intelligent ICT Simulation Tools for Power Applications

    International Nuclear Information System (INIS)

    Warmer, C.J.; Kester, J.C.P.; Kamphuis, I.G.; Carlsson, P; Fontela, M.; Gustavsson, R.

    2003-10-01

    This report, deliverable D2.1 in the CRISP project, serves as a preparation report for the development of simulation tools and prototype software which will be developed in forthcoming stages of the CRISP project. Application areas for these simulations are: fault detection and diagnosis, supply and demand matching and intelligent load shedding. The context in which these applications function is the power network with a high degree of distributed generation, including renewables. In order to control a so-called distributed grid we can benefit from a high level of distributed control and intelligence. This requires, on top of the power system network, an information and communication network. We argue that such a network should be seen as an enabler of distributed control and intelligence. The applications, through which control and intelligence is implemented, then form a third network layer, the service oriented network. Building upon this three-layered network model we derive in this report the requirements for a simulation tool and experiments which study new techniques for fault detection and diagnostics and for simulation tools and experiments implementing intelligent load shedding and supply and demand matching scenarios. We also look at future implementation of these services within the three-layered network model and the requirements that follow for the core information and communication network and for the service oriented network. These requirements, supported by the studies performed in the CRISP Workpackage 1, serve as a basis for development of the simulation tools in the tasks 2.2 to 2.4.

  5. The application of a multi-physics tool kit to spatial reactor dynamics

    International Nuclear Information System (INIS)

    Clifford, I.; Jasak, H.

    2009-01-01

    Traditionally, coupled-field nuclear reactor analysis has been carried out using several loosely coupled solvers, each having been developed independently from the others. In the field of multi-physics, the current generation of object-oriented tool kits provides robust close coupling of multiple fields on a single framework. This paper describes the initial results obtained as part of continuing research in the use of the OpenFOAM multi-physics tool kit for reactor dynamics application development. An unstructured, three-dimensional, time-dependent multi-group diffusion code, Diffusion FOAM, has been developed using the OpenFOAM multi-physics tool kit as a basis. The code is based on the finite-volume methodology and uses a newly developed block-coupled sparse matrix solver for the coupled solution of the multi-group diffusion equations. A description of this code is given with particular emphasis on the newly developed block-coupled solver, along with a selection of results obtained thus far. The code has performed well, indicating that the OpenFOAM tool kit is suited to reactor dynamics applications. This work has shown that the neutronics and simplified thermal-hydraulics of a reactor may be represented and solved for using a common calculation platform, and opens up the possibility for research into robust close-coupling of neutron diffusion and thermal-fluid calculations. This work has further opened up the possibility for research in a number of other areas, including research into three-dimensional unstructured meshes for reactor dynamics applications. (authors)
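
    To make the diffusion setting concrete, the sketch below solves a one-group, one-dimensional slab criticality problem with finite differences and power iteration - a drastic simplification of the unstructured, three-dimensional, block-coupled multi-group solver described above, using made-up cross sections.

      import numpy as np

      # One-group, 1-D slab diffusion criticality by power iteration (illustrative only):
      # -D phi'' + Sigma_a phi = (1/k) nuSigma_f phi, with phi = 0 at both boundaries.
      D, sigma_a, nu_sigma_f = 1.0, 0.07, 0.08   # made-up one-group constants (cm, 1/cm)
      width, n = 100.0, 200                      # slab width (cm) and number of interior nodes
      h = width / (n + 1)

      # Loss operator (diffusion + absorption) on the interior nodes.
      A = np.zeros((n, n))
      for i in range(n):
          A[i, i] = 2.0 * D / h**2 + sigma_a
          if i > 0:
              A[i, i - 1] = -D / h**2
          if i < n - 1:
              A[i, i + 1] = -D / h**2

      phi, k = np.ones(n), 1.0
      for _ in range(200):                       # power iteration on the fission source
          phi_new = np.linalg.solve(A, nu_sigma_f * phi / k)
          k *= phi_new.sum() / phi.sum()
          phi = phi_new

      print(f"k-effective = {k:.4f}")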

  6. Application of performance assessment as a tool for guiding project work

    International Nuclear Information System (INIS)

    McCombie, C.; Zuidema, P.

    1992-01-01

    The ultimate aim of the performance assessment methodology developed over the last 10-15 years is to predict quantitatively the behavior of disposal systems over periods of time into the far future. The methodology can, however, also be applied to a range of tasks during repository development and is used in many programmes as a tool for improving or optimizing the design of subsystem components or for guiding the course of project planning. In the Swiss waste management programme, there are several examples of the use of performance assessment as a tool in the manner mentioned above. The interaction between research models, assessment models and simplified models is considered to be of key importance and corresponding measures are taken to properly structure the process and to track the data: first, the results of all applications of the models are included in a consistent manner in the scenario analyses for the different sites and systems and, second, consistency in the underlying assumptions and in the data used in the different model calculations is assured by the consistent application of a configuration data management system (CDM). Almost all the applications of performance assessment have been included in the Swiss work, but for this paper only two examples have been selected: applications of performance assessment in both the HLW and the LLW programmes, and acceptance of specific waste types and their allocation to an appropriate repository on the basis of simplified safety analyses

  7. A generative tool for building health applications driven by ISO 13606 archetypes.

    Science.gov (United States)

    Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás

    2012-10-01

    The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of the information and knowledge. However, the impact of such standards is biased by the limited availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been generically designed, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. Such good properties are based on the combination of standards for the representation of generic user interfaces and model-driven engineering techniques.

  8. Versatile Data Acquisition and Controls for EPICS using VME-based FPGAs

    International Nuclear Information System (INIS)

    Trent Allison; Roger Flood

    2001-01-01

    Field Programmable Gate Arrays (FPGAs) have provided versatile VME-based data acquisition and control systems with minimal development times for the Thomas Jefferson National Accelerator Facility (Jefferson Lab). FPGAs have been used to interface with VME controllers using both standard A16 and A24 address modes. VME vector-interrupt capability has also been implemented for timing issues in some applications. FPGA designs have additionally been used to provide controls for various systems by interfacing with Digital to Analog Converters (DACs), interlocks, and other drive signals. These controls can be molded to the individual needs of each system and can provide operators with indicators and controls in EPICS via a VME interface. This allows the developer to choose components and make specifications that are not available commercially. Jefferson Lab has begun developing standard FPGA libraries that result in quick turnaround times and inexpensive designs. There have been approximately eight VME-based FPGA designs implemented in one department at Jefferson Lab and they are becoming more widespread. FPGAs continue to become larger and faster, enabling systems to be incorporated on one integrated circuit. Inherently, FPGAs can process data faster than many microprocessors due to the small processing overhead associated with a custom FPGA design. The ability to modify FPGA code enables the developer to easily implement future additions to a system, making the design flexible and expandable. This work was supported by U.S. DOE Contract No. DE-AC05-84ER40150
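
    Indicators and controls exposed to EPICS by such a VME/FPGA design are ordinary process variables, so they can be reached with any Channel Access client. The sketch below uses the pyepics library against hypothetical PV names (the actual record names used at Jefferson Lab are not given in the text).

      import time
      from epics import PV   # pyepics Channel Access client

      # Hypothetical PV names for one FPGA-based acquisition/control channel.
      setpoint = PV("FPGA:CH1:DAC_SETPOINT")
      readback = PV("FPGA:CH1:ADC_READBACK")
      interlock = PV("FPGA:CH1:INTERLOCK_STATUS")

      def on_interlock(pvname=None, value=None, **kw):
          """Callback fired by the Channel Access monitor when the interlock status changes."""
          print(f"{pvname} changed to {value}")

      interlock.add_callback(on_interlock)

      setpoint.put(1.25)     # drive the FPGA's DAC output
      time.sleep(0.5)        # allow the hardware and records to settle
      print("readback:", readback.get())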

  9. A method to encapsulate model structural uncertainty in ensemble projections of future climate: EPIC v1.0

    Science.gov (United States)

    Lewis, Jared; Bodeker, Greg E.; Kremser, Stefanie; Tait, Andrew

    2017-12-01

    computationally efficient than running multiple GCM or RCM simulations. Such a large ensemble of projections permits a description of a probability density function (PDF) of future climate states rather than a small number of individual story lines within that PDF, which may not be representative of the PDF as a whole; the EPIC method largely corrects for such potential sampling biases. The method is useful for providing projections of changes in climate to users wishing to investigate the impacts and implications of climate change in a probabilistic way. A web-based tool, using the EPIC method to provide probabilistic projections of changes in daily maximum and minimum temperatures for New Zealand, has been developed and is described in this paper.

  10. Creation of a Web-Based GIS Server and Custom Geoprocessing Tools for Enhanced Hydrologic Applications

    Science.gov (United States)

    Welton, B.; Chouinard, K.; Sultan, M.; Becker, D.; Milewski, A.; Becker, R.

    2010-12-01

    Rising populations in the arid and semi-arid parts of the world are increasing the demand for fresh water supplies worldwide. Many data sets needed for assessment of hydrologic applications across vast regions of the world are expensive, unpublished, difficult to obtain, or at varying scales, which complicates their use. Fortunately, this situation is changing with the development of global remote sensing datasets and web-based platforms such as GIS Server. GIS provides a cost effective vehicle for comparing, analyzing, and querying a variety of spatial datasets as geographically referenced layers. We have recently constructed a web-based GIS that incorporates all relevant geological, geochemical, geophysical, and remote sensing data sets, which were readily used to identify reservoir types and potential well locations on local and regional scales in various tectonic settings including: (1) extensional environment (Red Sea rift), (2) transcurrent fault system (Najd Fault in the Arabian-Nubian Shield), and (3) compressional environments (Himalayas). The web-based GIS could also be used to detect spatial and temporal trends in precipitation, recharge, and runoff in large watersheds on local, regional, and continental scales. These applications were enabled through the construction of a web-based ArcGIS Server with a Google Maps interface and the development of customized geoprocessing tools. ArcGIS Server provides out-of-the-box setups that are generic in nature. This platform includes all of the standard web-based GIS tools (e.g. pan, zoom, identify, search, data querying, and measurement). In addition to the standard suite of tools provided by ArcGIS Server, a set of advanced data manipulation and display tools was also developed to allow for a more complete and customizable view of the area of interest. The most notable additions to the standard GIS Server tools are the custom on-demand geoprocessing tools (e.g., graph, statistical functions, custom raster

  11. Nanobody-derived nanobiotechnology tool kits for diverse biomedical and biotechnology applications

    Directory of Open Access Journals (Sweden)

    Wang Y

    2016-07-01

    Full Text Available Yongzhong Wang,1 Zhen Fan,2 Lei Shao,3 Xiaowei Kong,1 Xianjuan Hou,1 Dongrui Tian,1 Ying Sun,1 Yazhong Xiao,1 Li Yu4 1School of Life Sciences, Collaborative Innovation Center of Modern Bio-manufacture, Anhui University, Hefei, People’s Republic of China; 2Department of Biomedical Engineering, The Ohio State University, Columbus, OH, USA; 3State Key Laboratory of New Drugs and Pharmaceutical Process, Shanghai Institute of Pharmaceutical Industry, Shanghai, 4Department of Microbiology and Parasitology, Anhui Provincial Laboratory of Microbiology and Parasitology, Anhui Key Laboratory of Zoonoses, Anhui Medical University, Hefei, People’s Republic of China Abstract: Owing to peculiar properties of nanobody, including nanoscale size, robust structure, stable and soluble behaviors in aqueous solution, reversible refolding, high affinity and specificity for only one cognate target, superior cryptic cleft accessibility, and deep tissue penetration, as well as a sustainable source, it has been an ideal research tool for the development of sophisticated nanobiotechnologies. Currently, the nanobody has been evolved into versatile research and application tool kits for diverse biomedical and biotechnology applications. Various nanobody-derived formats, including the nanobody itself, the radionuclide or fluorescent-labeled nanobodies, nanobody homo- or heteromultimers, nanobody-coated nanoparticles, and nanobody-displayed bacteriophages, have been successfully demonstrated as powerful nanobiotechnological tool kits for basic biomedical research, targeting drug delivery and therapy, disease diagnosis, bioimaging, and agricultural and plant protection. These applications indicate a special advantage of these nanobody-derived technologies, already surpassing the “me-too” products of other equivalent binders, such as the full-length antibodies, single-chain variable fragments, antigen-binding fragments, targeting peptides, and DNA-based aptamers. In

  12. Contemporary molecular tools in microbial ecology and their application to advancing biotechnology.

    Science.gov (United States)

    Rashid, Mamoon; Stingl, Ulrich

    2015-12-01

    Novel methods in microbial ecology are revolutionizing our understanding of the structure and function of microbes in the environment, but concomitant advances in applications of these tools to biotechnology are mostly lagging behind. After more than a century of efforts to improve microbial culturing techniques, about 70-80% of microbial diversity - recently called the "microbial dark matter" - remains uncultured. In early attempts to identify and sample these so far uncultured taxonomic lineages, methods that amplify and sequence ribosomal RNA genes were extensively used. Recent developments in cell separation techniques, DNA amplification, and high-throughput DNA sequencing platforms have now made the discovery of genes/genomes of uncultured microorganisms from different environments possible through the use of metagenomic techniques and single-cell genomics. When used synergistically, these metagenomic and single-cell techniques create a powerful tool to study microbial diversity. These genomics techniques have already been successfully exploited to identify sources for i) novel enzymes or natural products for biotechnology applications, ii) novel genes from extremophiles, and iii) whole genomes or operons from uncultured microbes. More can be done to utilize these tools more efficiently in biotechnology. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Contemporary molecular tools in microbial ecology and their application to advancing biotechnology

    KAUST Repository

    Rashid, Mamoon; Stingl, Ulrich

    2015-01-01

    Novel methods in microbial ecology are revolutionizing our understanding of the structure and function of microbes in the environment, but concomitant advances in applications of these tools to biotechnology are mostly lagging behind. After more than a century of efforts to improve microbial culturing techniques, about 70–80% of microbial diversity – recently called the “microbial dark matter” – remains uncultured. In early attempts to identify and sample these so far uncultured taxonomic lineages, methods that amplify and sequence ribosomal RNA genes were extensively used. Recent developments in cell separation techniques, DNA amplification, and high-throughput DNA sequencing platforms have now made the discovery of genes/genomes of uncultured microorganisms from different environments possible through the use of metagenomic techniques and single-cell genomics. When used synergistically, these metagenomic and single-cell techniques create a powerful tool to study microbial diversity. These genomics techniques have already been successfully exploited to identify sources for i) novel enzymes or natural products for biotechnology applications, ii) novel genes from extremophiles, and iii) whole genomes or operons from uncultured microbes. More can be done to utilize these tools more efficiently in biotechnology.

  14. Methods and tools to support real time risk-based flood forecasting - a UK pilot application

    Directory of Open Access Journals (Sweden)

    Brown Emma

    2016-01-01

    Full Text Available Flood managers have traditionally used probabilistic models to assess potential flood risk for strategic planning and non-operational applications. Computational restrictions on data volumes and simulation times have meant that information on the risk of flooding has not been available for operational flood forecasting purposes. In practice, however, the operational flood manager has probabilistic questions to answer, which are not completely supported by the outputs of traditional, deterministic flood forecasting systems. In a collaborative approach, HR Wallingford and Deltares have developed methods, tools and techniques to extend existing flood forecasting systems with elements of strategic flood risk analysis, including probabilistic failure analysis, two dimensional flood spreading simulation and the analysis of flood impacts and consequences. This paper presents the results of the application of these new operational flood risk management tools to a pilot catchment in the UK. It discusses the problems of performing probabilistic flood risk assessment in real time and how these have been addressed in this study. It also describes the challenges of the communication of risk to operational flood managers and to the general public, and how these new methods and tools can provide risk-based supporting evidence to assist with this process.

  15. Techniques and tools for measuring energy efficiency of scientific software applications

    International Nuclear Information System (INIS)

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Pestana, Gonçalo; Khan, Kashif; Nurminen, Jukka K; Nyback, Filip; Ou, Zhonghong

    2015-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running on ARM and Intel architectures, and compare their power consumption and performance. We leverage several profiling tools (both in hardware and software) to extract different characteristics of the power use. We report the results of these measurements and the experience gained in developing a set of measurement techniques and profiling tools to accurately assess the power consumption for scientific workloads. (paper)
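
    One software-based measurement route available on Intel machines is the Linux powercap/RAPL interface, which exposes cumulative package energy in microjoules through sysfs. The snippet below is a minimal sketch of reading it around a workload; whether the file exists and is readable depends on the machine and kernel, counter wrap-around is ignored here, and ARM boards typically require external meters instead.

      import time

      RAPL_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package-0 cumulative energy (microjoules)

      def read_energy_uj():
          with open(RAPL_FILE) as f:
              return int(f.read().strip())

      def measure(workload):
          """Return (energy in joules, wall time in seconds) for a callable workload."""
          e0, t0 = read_energy_uj(), time.time()
          workload()
          e1, t1 = read_energy_uj(), time.time()
          return (e1 - e0) / 1e6, t1 - t0

      energy_j, seconds = measure(lambda: sum(i * i for i in range(10_000_000)))
      print(f"{energy_j:.2f} J over {seconds:.2f} s ({energy_j / seconds:.1f} W average)")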

  16. Contemporary molecular tools in microbial ecology and their application to advancing biotechnology

    KAUST Repository

    Rashid, Mamoon

    2015-09-25

    Novel methods in microbial ecology are revolutionizing our understanding of the structure and function of microbes in the environment, but concomitant advances in applications of these tools to biotechnology are mostly lagging behind. After more than a century of efforts to improve microbial culturing techniques, about 70–80% of microbial diversity – recently called the “microbial dark matter” – remains uncultured. In early attempts to identify and sample these so far uncultured taxonomic lineages, methods that amplify and sequence ribosomal RNA genes were extensively used. Recent developments in cell separation techniques, DNA amplification, and high-throughput DNA sequencing platforms have now made the discovery of genes/genomes of uncultured microorganisms from different environments possible through the use of metagenomic techniques and single-cell genomics. When used synergistically, these metagenomic and single-cell techniques create a powerful tool to study microbial diversity. These genomics techniques have already been successfully exploited to identify sources for i) novel enzymes or natural products for biotechnology applications, ii) novel genes from extremophiles, and iii) whole genomes or operons from uncultured microbes. More can be done to utilize these tools more efficiently in biotechnology.

  17. Application of ICT tools in communicating information and knowledge to artisanal fishermen communities in Zanzibar

    Directory of Open Access Journals (Sweden)

    Ronald Benard

    2017-06-01

    Full Text Available This article assesses the application of ICT tools in communicating information and knowledge to artisanal fishermen communities in Zanzibar. The study was carried out in four purposefully selected wards in Unguja District in Zanzibar, Tanzania, and involved a sample size of 80 respondents. Data were collected using document reviews, questionnaires, focus group discussions and personal observations. Results showed that artisanal fishermen need information and knowledge on weather conditions, modern fish-capturing methods, markets and marketing, and fish preservation and processing. The study also found that mobile phones and radio are the ICT tools most used by the artisanal fishermen. The findings further revealed that communicating information and knowledge through ICT tools was limited by lack of funds, poor network connectivity, lack of training and seminars on the use of ICTs in accessing information, and poor coverage of radio and television transmission. It is therefore recommended that the government support artisanal fishermen in acquiring fishing gear and ICT tools such as GPS and sonar by subsidizing them.

  18. Android Based Binus Profile Applications as the Marketing Tools of Bina Nusantara University

    Science.gov (United States)

    Iskandar, Karto

    2014-03-01

    Smartphones with apps are not a new phenomenon; both technologies have become fused with today's lifestyle. The ease and speed of access to information lead many companies to use them in marketing products to the public, with the objective of winning in increasingly fierce competition. The purpose of this research is to create an Android-based mobile application to assist in marketing and introducing the Bina Nusantara University profile to prospective students. The research uses the software-engineering waterfall model to produce the Android-based mobile application. The result is an Android-based mobile application that can be used as a viral marketing tool for Bina Nusantara University. The study concludes that mobile technology can be used as a medium for effective marketing and branding, especially for Bina Nusantara University. Android-based technology for marketing applications suits the Bina Nusantara University applicant segment, which consists mostly of young people. In the future, as network quality improves and costs become affordable, the application can be made fully online, so that features such as chat, maps, and others can be used optimally.

  19. Android Based Binus Profile Applications as the Marketing Tools of Bina Nusantara University

    Directory of Open Access Journals (Sweden)

    Iskandar Karto

    2014-03-01

    Full Text Available Smartphones with apps are not a new phenomenon; both technologies have become fused with today's lifestyle. The ease and speed of access to information lead many companies to use them in marketing products to the public, with the objective of winning in increasingly fierce competition. The purpose of this research is to create an Android-based mobile application to assist in marketing and introducing the Bina Nusantara University profile to prospective students. The research uses the software-engineering waterfall model to produce the Android-based mobile application. The result is an Android-based mobile application that can be used as a viral marketing tool for Bina Nusantara University. The study concludes that mobile technology can be used as a medium for effective marketing and branding, especially for Bina Nusantara University. Android-based technology for marketing applications suits the Bina Nusantara University applicant segment, which consists mostly of young people. In the future, as network quality improves and costs become affordable, the application can be made fully online, so that features such as chat, maps, and others can be used optimally.

  20. Implementation of EPICS based Control System for Radioisotope Beam line

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae-Ha; Ahn, Tae-Sung; Song, Young-Gi; Kwon, Hyeok-Jung; Cho, Yong-Sub [Korea Atomic Energy Research Institute, Gyeongju (Korea, Republic of)

    2015-10-15

    The Korea Multi-purpose Accelerator Complex (KOMAC) has been operating a 100 MeV proton linear accelerator. To operate the 100 MeV linac, various control systems have been implemented, such as vacuum, power supply and RCCS. KOMAC operates two beam lines so that clients can use the 100 MeV proton beam for their experiments, sending the beam to the beam line and target room using two dipole magnets and several quadrupole magnets. As demand for experiments and radioisotope (RI) production using the beam has increased, another beam line is under construction and an RI beam line control system is needed. To stay consistent with the KOMAC control system, the RI beam line control system is based on Experimental Physics and Industrial Control System (EPICS) software. The beam is transported to the RI beam line by controlling the magnet power supplies and vacuum. The implementation of the RI beam line control system is presented and some preliminary results are reported. The basic RI beam line control system has been implemented; it can control the beam direction and vacuum. By comparing archived data and current data, the RI beam line and its control system will be improved. In the future, the scroll pump and gate control system will be implemented using a programmable logic controller (PLC), and an RI beam interlock sequence will be added to the KOMAC interlock system to protect the linac.

  1. IFMIF LLRF control system architecture based on EPICS

    International Nuclear Information System (INIS)

    Calvo, J.; Ibarra, A.; Miguel Angel Patricio; Rivers, M.

    2012-01-01

    The IFMIF-EVEDA (International Fusion Materials Irradiation Facility - Engineering Validation and Engineering Design Activity) linear accelerator will be a 9 MeV, 125 mA CW (Continuous Wave) deuteron accelerator prototype to validate the technical options of the accelerator design for IFMIF. The primary mission of such a facility is to test and verify materials performance when subjected to extensive neutron irradiation of the type encountered in a fusion reactor. The RF (Radio Frequency) power system of IFMIF-EVEDA consists of 18 RF chains working at 175 MHz with three amplification stages each. The LLRF (Low-Level Radio Frequency) system controls the amplitude and phase of the signal to be synchronized with the beam and it also controls the resonance frequency of the cavities. The system is based on a commercial cPCI (Compact Peripheral Component Interconnect) FPGA (Field Programmable Gate Array) board provided by Lyrtech and controlled by a Windows host PC. For this purpose, the cPCI FPGA board must communicate with EPICS Channel Access by building an IOC (Input/Output Controller). A new software architecture for designing device support, using the asynPortDriver class and CSS as a GUI (Graphical User Interface), is also presented. (authors)

  2. Social Life Cycle Assessment as a Management Tool: Methodology for Application in Tourism

    Directory of Open Access Journals (Sweden)

    Roberto Merli

    2013-08-01

    Full Text Available As is widely known, sustainability is an important factor in competition, increasing the added value of a company in terms of image and credibility. However, it is important that sustainability assessments are effectively addressed from a global perspective. Therefore, life cycle tools are adopted to evaluate environmental and social impacts. Among these, of particular significance is the Social Life Cycle Assessment (SLCA), which, although in its early stage of development, appears to have extremely promising methodological features. For this reason, it seemed interesting to propose a first application to the tourism sector, which may be better suited than other sectors to being studied in terms of social sustainability data: the particular characteristics of service delivery lend themselves more readily than those of other sectors to the development of data related to social sustainability. In this paper the results of a case study carried out using social accounting and business management tools are shown.

  3. Safety-barrier diagrams as a tool for modelling safety of hydrogen applications

    DEFF Research Database (Denmark)

    Duijm, Nijs Jan; Markert, Frank

    2009-01-01

    Safety-barrier diagrams have proven to be a useful tool in documenting the safety measures taken to prevent incidents and accidents in the process industry. Especially during the introduction of new hydrogen technologies or applications, e.g. hydrogen refuelling stations, safety-barrier diagrams are considered a valuable supplement to other traditional risk analysis tools to support the communication with authorities and other stakeholders during the permitting process. Another advantage of safety-barrier diagrams is that they highlight the importance of functional and reliable safety barriers in any system, placing a direct focus on those barriers that need to be subject to safety management in terms of design and installation, operational use, inspection and monitoring, and maintenance. Safety-barrier diagrams support both quantitative and qualitative approaches. The paper will describe...

  4. Development of Application Programming Tool for Safety Grade PLC (POSAFE-Q)

    International Nuclear Information System (INIS)

    Koo, Kyungmo; You, Byungyong; Kim, Tae-Wook; Cho, Sengjae; Lee, Jin S.

    2006-01-01

    The pSET (POSAFE-Q Software Engineering Tool) is an application programming tool for POSAFE-Q, a safety-grade programmable logic controller (PLC) developed for the reactor protection system of nuclear power plants. The pSET provides an integrated development environment (IDE) which includes editors, a compiler, a simulator, a downloader, a debugger, and a monitor. The pSET supports the IEC 61131-3 standard software model and languages such as LD (ladder diagram) and FBD (function block diagram), two of the most widely used PLC programming languages in industrial fields. The pSET will also support the SFC (sequential function chart) language. The pSET has been developed as part of the Korea Nuclear Instrumentation and Control System (KNICS) project.

  5. Mobile computing device as tools for college student education: a case on flashcards application

    Science.gov (United States)

    Kang, Congying

    2012-04-01

    Traditionally, college students have used flash cards as a tool to remember large amounts of knowledge, such as nomenclature, structures, and reactions in chemistry. Educational and information technology has enabled flashcards to be viewed on computers, for example as slides in PowerPoint, working as channels of drilling and feedback for learners. The current generation of students is more adept with information technology and mobile computing devices; for example, they use their mobile phones much more intensively every day. Trends in using the mobile phone as an educational tool are analyzed and an educational technology initiative is proposed, which uses mobile phone flash card applications to help students learn biology and chemistry. Experiments show that users responded positively to these mobile flash cards.

  6. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    Science.gov (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species-specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.

  7. Mobile application as a prenatal education and engagement tool: A randomized controlled pilot.

    Science.gov (United States)

    Ledford, Christy J W; Canzona, Mollie Rose; Cafferty, Lauren A; Hodge, Joshua A

    2016-04-01

    Research has shown that mobile applications provide a powerful alternative to traditional paper diaries; however, few data exist comparing apps with traditional paper materials as patient education and engagement tools in the clinical setting. This study was designed to compare the effectiveness of a mobile app versus a spiral-notebook guide throughout prenatal care. This randomized (n=173) controlled pilot was conducted at an East Coast community hospital. Chi-square and repeated-measures analysis of variance were used to test intervention effects in the sample of 127 pregnant mothers who completed their prenatal care in the healthcare system. Patients who received the mobile application used the tool to record information about pregnancy more frequently (p=.04) and developed greater patient activation (p=.02) than patients who received notebooks. No difference was detected in interpersonal clinical communication. A mobile application successfully activated a patient population in which self-management is a critical factor. This study shows that mobile apps can prompt greater use and result in more activated patients. Findings may be translated to other patient populations who receive recurring care for chronic disease. Published by Elsevier Ireland Ltd.
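
    As a hedged numerical aside, the chi-square comparison mentioned above can be sketched on an invented 2x2 table of app versus notebook users who did or did not record pregnancy information; the counts below are illustrative only and are not the trial's data.

      # Illustrative only: invented counts, not data from the cited trial.
      import numpy as np
      from scipy.stats import chi2_contingency

      # Rows: app group, notebook group; columns: recorded info, did not record.
      table = np.array([[52, 12],
                        [41, 22]])

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")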

  8. BioTapestry now provides a web application and improved drawing and layout tools.

    Science.gov (United States)

    Paquette, Suzanne M; Leinonen, Kalle; Longabaugh, William J R

    2016-01-01

    Gene regulatory networks (GRNs) control embryonic development, and to understand this process in depth, researchers need to have a detailed understanding of both the network architecture and its dynamic evolution over time and space. Interactive visualization tools better enable researchers to conceptualize, understand, and share GRN models. BioTapestry is an established application designed to fill this role, and recent enhancements released in Versions 6 and 7 have targeted two major facets of the program. First, we introduced significant improvements for network drawing and automatic layout that have now made it much easier for the user to create larger, more organized network drawings. Second, we revised the program architecture so it could continue to support the current Java desktop Editor program, while introducing a new BioTapestry GRN Viewer that runs as a JavaScript web application in a browser. We have deployed a number of GRN models using this new web application. These improvements will ensure that BioTapestry remains viable as a research tool in the face of the continuing evolution of web technologies, and as our understanding of GRN models grows.

  9. Perceptions of Contractors and Consultants Toward Application of Greenship Rating Tools on Apartment Buildings in Surabaya

    Directory of Open Access Journals (Sweden)

    Herry Pintardi Chandra

    2014-04-01

    Full Text Available During the last ten years, the growth of apartment buildings in Surabaya has been accompanied by the bitter experience of global warming, resource depletion, energy scarcity, and other environmental impacts. We cannot avoid them, but we can minimize the negative impacts of global warming. The green building concept is one method to minimize environmental impact; it takes into account principles of sustainable development in planning, construction, operation, and maintenance. Greenship Rating Tools are used to evaluate and calculate green achievements prior to green building certification. The aim of this research is to present the perceptions of contractors and consultants toward the application of Greenship Rating Tools to apartment buildings in Surabaya. Based on data obtained from a questionnaire survey of 41 respondents, the mean value ranking method is used to evaluate the main factors of Greenship: Appropriate Site Development, Energy Efficiency and Conservation, Water Conservation, Material Resources and Cycle, Indoor Health and Comfort, and Building Environmental Management. In general, the results of this research show a number of differences between the perceptions of contractors and consultants toward the application of Greenship Rating Tools to apartment buildings in Surabaya. According to the contractors' perception, Visual Comfort is a factor that would be easy to apply, whilst for the consultants it is Landscape. On the other hand, there are factors that would be difficult to apply: for the contractors this is Climate Change, while for the consultants it is Renewable Energy. In summary, Greenship Rating Tools can be applied according to both contractors' and consultants' perceptions, although there are some variables that cannot be applied.

  10. Applications and issues of GIS as tool for civil engineering modeling

    Science.gov (United States)

    Miles, S.B.; Ho, C.L.

    1999-01-01

    A tool that has proliferated within civil engineering in recent years is geographic information systems (GIS). The goal of a tool is to supplement ability and knowledge that already exists, not to serve as a replacement for that which is lacking. To secure the benefits and avoid misuse of a burgeoning tool, engineers must understand the limitations, alternatives, and context of the tool. The common benefits of using GIS as a supplement to engineering modeling are summarized. Several brief case studies of GIS modeling applications are taken from popular civil engineering literature to demonstrate the wide use and varied implementation of GIS across the discipline. Drawing from the case studies, limitations regarding traditional GIS data models and the implementation of civil engineering models within current GIS are identified and countered by discussing the direction of the next generation of GIS. The paper concludes by highlighting the potential for the misuse of GIS in the context of engineering modeling and suggests that this potential can be reduced through education and awareness. The goal of this paper is to promote awareness of the issues related to GIS-based modeling and to assist in the formulation of questions regarding the application of current GIS. The technology has experienced much publicity of late, with many engineers being perhaps too excited about the usefulness of current GIS. An undoubtedly beneficial side effect of this, however, is that engineers are becoming more aware of GIS and, hopefully, the associated subtleties. Civil engineers must stay informed of GIS issues and progress, but more importantly, civil engineers must inform the GIS community to direct the technology development optimally.

  11. Quantitative food intake in the EPIC-Germany cohorts. European Investigation into Cancer and Nutrition.

    Science.gov (United States)

    Schulze, M B; Brandstetter, B R; Kroke, A; Wahrendorf, J; Boeing, H

    1999-01-01

    The EPIC-Heidelberg and the EPIC-Potsdam studies with about 53,000 study participants represent the German contribution to the EPIC (European Investigation into Cancer and Nutrition) cohort study. Within the EPIC study, standardized 24-hour dietary recalls were applied as a quantitative calibration method in order to estimate the amount of scaling bias introduced by the varying center-specific dietary assessment methods. This article presents intake of food items and food groups in the two German cohorts estimated by 24-hour quantitative dietary recalls. Recalls from 1,013 men and 1,078 women in Heidelberg and 1,032 men and 898 women in Potsdam were included in the analysis. The intake of recorded food items or recipe ingredients as well as fat used for cooking was summarized into 16 main food groups and a variety of different subgroups stratified by sex and weighted for the day of the week and age. In more than 90% of the recalls, consumption of dairy products, cereals and cereal products, bread, fat, and non-alcoholic beverages, particularly coffee/tea, was reported. Inter-cohort evaluations revealed that bread, potatoes, fruit and fat were consumed in higher amounts in the Potsdam cohort while the opposite was found for pasta/rice, non-alcoholic, and alcoholic beverages. It was concluded that the exposure variation was increased by having two instead of one EPIC study centers in Germany. Copyright 1999 S. Karger AG, Basel

  12. Uncertainties in cloud phase and optical thickness retrievals from the Earth Polychromatic Imaging Camera (EPIC)

    Science.gov (United States)

    Meyer, Kerry; Yang, Yuekui; Platnick, Steven

    2018-01-01

    This paper presents an investigation of the expected uncertainties of a single channel cloud optical thickness (COT) retrieval technique, as well as a simple cloud temperature threshold based thermodynamic phase approach, in support of the Deep Space Climate Observatory (DSCOVR) mission. DSCOVR cloud products will be derived from Earth Polychromatic Imaging Camera (EPIC) observations in the ultraviolet and visible spectra. Since EPIC is not equipped with a spectral channel in the shortwave or mid-wave infrared that is sensitive to cloud effective radius (CER), COT will be inferred from a single visible channel with the assumption of appropriate CER values for liquid and ice phase clouds. One month of Aqua MODIS daytime granules from April 2005 is selected for investigating cloud phase sensitivity, and a subset of these granules that has similar EPIC sun-view geometry is selected for investigating COT uncertainties. EPIC COT retrievals are simulated with the same algorithm as the operational MODIS cloud products (MOD06), except using fixed phase-dependent CER values. Uncertainty estimates are derived by comparing the single channel COT retrievals with the baseline bi-spectral MODIS retrievals. Results show that a single channel COT retrieval is feasible for EPIC. For ice clouds, single channel retrieval errors are minimal (< 2%) due to the particle size insensitivity of the assumed ice crystal (i.e., severely roughened aggregate of hexagonal columns) scattering properties at visible wavelengths, while for liquid clouds the error is mostly limited to within 10%, although for thin clouds (COT < 2) the error can be higher. Potential uncertainties in EPIC cloud masking and cloud temperature retrievals are not considered in this study. PMID:29619116

  13. Software tool for resolution of inverse problems using artificial intelligence techniques: an application in neutron spectrometry

    International Nuclear Information System (INIS)

    Castaneda M, V. H.; Martinez B, M. R.; Solis S, L. O.; Castaneda M, R.; Leon P, A. A.; Hernandez P, C. F.; Espinoza G, J. G.; Ortiz R, J. M.; Vega C, H. R.; Mendez, R.; Gallego, E.; Sousa L, M. A.

    2016-10-01

    The Taguchi methodology has proved to be highly efficient for solving inverse problems, in which the values of some parameters of a model must be obtained from the observed data. There are intrinsic mathematical characteristics that make a problem an inverse one. Inverse problems appear in many branches of science, engineering and mathematics. To solve this type of problem, researchers have used different techniques; recently, techniques based on Artificial Intelligence technology are being explored. This paper presents the use of a software tool based on generalized regression artificial neural networks for the solution of inverse problems with application in high energy physics, specifically the problem of neutron spectrometry. To solve this problem we use a software tool developed in the MATLAB programming environment, which offers a friendly, intuitive and easy-to-use interface. This computational tool solves the inverse problem involved in the reconstruction of the neutron spectrum based on measurements made with a Bonner spheres spectrometric system. Given this information, the neural network is able to reconstruct the neutron spectrum with high performance and generalization capability. The tool does not require the end user to have great training or technical knowledge in the development and/or use of software, which facilitates the use of the program for the resolution of inverse problems arising in several areas of knowledge. Artificial Intelligence techniques have shown particular effectiveness in solving inverse problems, given the characteristics of artificial neural networks and their network topology; the developed tool has therefore been very useful, since the results generated by the artificial neural network require little time compared with other techniques and agree well with the actual data of the experiment. (Author)
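
    A generalized regression neural network is, at its core, Nadaraya-Watson kernel regression over the stored training patterns. The sketch below shows that core idea on synthetic arrays; it is a simplified stand-in for the MATLAB tool described in the record, and the Bonner-sphere inputs and spectrum bins are assumed shapes, not the paper's data.

      # Simplified GRNN (Nadaraya-Watson kernel regression) sketch on synthetic data.
      import numpy as np

      def grnn_predict(X_train, Y_train, x, sigma=0.1):
          # Gaussian kernel weight of each stored pattern for the query x.
          d2 = np.sum((X_train - x) ** 2, axis=1)
          w = np.exp(-d2 / (2.0 * sigma ** 2))
          # Weighted average of the training targets (the regression layer).
          return (w @ Y_train) / (np.sum(w) + 1e-12)

      rng = np.random.default_rng(0)
      X_train = rng.random((200, 7))     # e.g. 7 sphere count rates (assumed shape)
      Y_train = rng.random((200, 31))    # e.g. 31-bin neutron spectrum (assumed shape)
      x_new = rng.random(7)

      spectrum = grnn_predict(X_train, Y_train, x_new)
      print(spectrum.shape)              # (31,)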

  14. Direct push driven in situ color logging tool (CLT): technique, analysis routines, and application

    Science.gov (United States)

    Werban, U.; Hausmann, J.; Dietrich, P.; Vienken, T.

    2014-12-01

    Direct push technologies have recently seen broad development, providing several tools for in situ parameterization of unconsolidated sediments. One of these techniques is the measurement of soil colors - proxy information that relates to soil/sediment properties. We introduce the direct push driven color logging tool (CLT) for real-time and depth-resolved investigation of soil colors within the visible spectrum. Until now, no routines exist for handling highly resolved (mm-scale) soil color data. To develop such a routine, we transform raw data (CIEXYZ) into soil color surrogates of selected color spaces (CIExyY, CIEL*a*b*, CIEL*c*h*, sRGB) and denoise small-scale natural variability by Haar and Daubechies-4 wavelet transformation, gathering interpretable color logs over depth. However, interpreting color log data as a single application remains challenging. Additional information, such as site-specific knowledge of the geological setting, is required to correlate soil color data to specific layer properties. Hence, we provide exemplary results from a joint interpretation of in situ-obtained soil color data and 'state-of-the-art' direct push based profiling tool data, and discuss the benefit of the additional data. The developed routine is capable of transferring the colorimetric data into interpretable color surrogates. Soil color data proved to correlate with small-scale lithological/chemical changes (e.g., grain size, oxidative and reductive conditions), especially when combined with additional direct push vertical high-resolution data (e.g., cone penetration testing and soil sampling). Thus, the technique allows enhanced profiling by providing another reproducible high-resolution parameter for analyzing subsurface conditions. This opens potential new areas of application and new outputs for such data in site investigation. It is our intention to improve color measurements by means of method of application and data
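
    The Haar-wavelet denoising step can be sketched as follows on a synthetic lightness (L*) log, assuming the PyWavelets package is available; the soft-threshold value and the synthetic signal are illustrative choices, not the routine or data from the record.

      # Illustrative Haar wavelet denoising of a synthetic colour log.
      import numpy as np
      import pywt

      rng = np.random.default_rng(1)
      depth = np.linspace(0.0, 10.0, 1024)   # metres (synthetic)
      lightness = 50 + 10 * np.sign(np.sin(depth)) + rng.normal(0, 2, depth.size)

      coeffs = pywt.wavedec(lightness, "haar", level=5)
      # Soft-threshold the detail coefficients to suppress small-scale variability.
      coeffs = [coeffs[0]] + [pywt.threshold(c, 3.0, mode="soft") for c in coeffs[1:]]
      denoised = pywt.waverec(coeffs, "haar")[: lightness.size]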

  15. Cross entropy-based memetic algorithms: An application study over the tool switching problem

    Directory of Open Access Journals (Sweden)

    Jhon Edgar Amaya

    2013-05-01

    Full Text Available This paper presents a parameterized schema for building memetic algorithms based on cross-entropy (CE) methods. This novel schema is general in nature, and features multiple probability mass functions and Lamarckian learning. The applicability of the approach is assessed by considering the Tool Switching Problem, a complex combinatorial problem in the field of Flexible Manufacturing Systems. An exhaustive evaluation (including techniques ranging from local search and evolutionary algorithms to constructive methods) provides evidence of the effectiveness of CE-based memetic algorithms.
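
    The cross-entropy core of such a memetic scheme can be sketched on a toy binary problem: sample candidates from a probability mass function, improve the elite samples with a greedy local search (the Lamarckian step), and refit the probabilities to the improved elites. The code below is a generic illustration under those assumptions, not the parameterized schema or the Tool Switching Problem instance from the record.

      # Toy cross-entropy + local-search (memetic) sketch on a binary problem.
      import numpy as np

      rng = np.random.default_rng(2)
      n, pop, elite_frac, smooth = 30, 100, 0.2, 0.7
      target = rng.integers(0, 2, n)            # hidden optimum of the toy objective

      def fitness(x):
          return int(np.sum(x == target))       # higher is better

      def local_search(x):
          # Lamarckian step: greedy single-bit flips while they improve fitness.
          x = x.copy()
          for i in range(n):
              y = x.copy(); y[i] ^= 1
              if fitness(y) > fitness(x):
                  x = y
          return x

      p = np.full(n, 0.5)                       # Bernoulli probability mass function
      for _ in range(30):
          samples = (rng.random((pop, n)) < p).astype(int)
          samples = np.array([local_search(s) for s in samples])
          scores = np.array([fitness(s) for s in samples])
          elites = samples[np.argsort(scores)[-int(elite_frac * pop):]]
          p = smooth * elites.mean(axis=0) + (1 - smooth) * p   # CE update
      print(fitness((p > 0.5).astype(int)), "of", n)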

  16. The potential application of military fleet scheduling tools to the Federal Waste Management System Transportation System

    International Nuclear Information System (INIS)

    Harrison, I.G.; Pope, R.B.; Kraemer, R.D.; Hilliard, M.R.

    1991-01-01

    This paper discusses the feasibility of adapting concepts and tools that were developed for the US military's transportation management systems to the management of the Federal Waste Management System's (FWMS) Transportation System. Many of the lessons in the development of the planning and scheduling software for the US military are applicable to the development of similar software for the FWMS Transportation System. The resulting system would be invaluable to the US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management (OCRWM), both initially, for long-range planning, and later, in day-to-day scheduling and management activities

  17. Applications of biological tools or biomarkers in aquatic biota: A case study of the Tamar estuary, South West England.

    Science.gov (United States)

    Dallas, Lorna J; Jha, Awadhesh N

    2015-06-30

    Biological systems are the ultimate recipients of pollutant-induced damage. Consequently, our traditional reliance on analytical tools is not enough to assess ecosystem health. Biological responses or biomarkers are therefore also considered to be important tools for environmental hazard and risk assessments. Due to historical mining, other anthropogenic activities, and its conservational importance (e.g. NATURA sites, SACs), the Tamar estuary in South West England is an ideal environment in which to examine applications of such biological tools. This review presents a thorough and critical evaluation of the different biological tools used in the Tamar estuary thus far, while also discussing future perspectives for biomarker studies from a global perspective. In particular, we focus on the challenges which hinder applications of biological tools from being more readily incorporated into regulatory frameworks, with the aim of enabling both policymakers and primary stakeholders to maximise the environmental relevance and regulatory usefulness of such tools. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. The standardized computerized 24-h dietary recall method EPIC-Soft adapted for pan-European dietary monitoring

    DEFF Research Database (Denmark)

    Slimani, N.; Casagrande, C.; Nicolas, G.

    2011-01-01

    Within European Food Consumption Validation (EFCOVAL), EPIC-Soft was adapted and further developed on various aspects that were required to optimize its use for pan-European dietary monitoring. In this paper, we present the structure and main interview steps of the EPIC-Soft program, after implementation of a series of new...

  19. Unified Lambert Tool for Massively Parallel Applications in Space Situational Awareness

    Science.gov (United States)

    Woollands, Robyn M.; Read, Julie; Hernandez, Kevin; Probe, Austin; Junkins, John L.

    2018-03-01

    Awareness computer cluster at the LASR Lab, Texas A&M University. We demonstrate the power of our tool by solving a highly parallel example problem, that is the generation of extremal field maps for optimal spacecraft rendezvous (and eventual orbit debris removal). In addition we demonstrate the need for including perturbative effects in simulations for satellite tracking or data association. The unified Lambert tool is ideal for but not limited to space situational awareness applications.

  20. Feature Usage Explorer: Usage Monitoring and Visualization Tool in HTML5 Based Applications

    Directory of Open Access Journals (Sweden)

    Sarunas Marciuska

    2013-10-01

    Full Text Available Feature Usage Explorer is a JavaScript library which automatically detects features in HTML5 based applications and monitors their usage. The collected information can be visualized in a Feature Usage Diagram, which is automatically generated from an input json file. Currently, the users of Feature Usage Explorer have to design their own tool in order to generate the json file from the collected usage information. This option remains viable when using the library in order not to constrain the user's choice of preferred data storage. Feature Usage Explorer can be reused in any HTML5 based application where an understanding of how users interact with the system is required (i.e. user experience and usability studies, the human computer interaction field, or the requirement prioritization area).

  1. An Overview of the Monte Carlo Application ToolKit (MCATK)

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Travis John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-07

    MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library designed to build specialized applications and designed to provide new functionality in existing general-purpose Monte Carlo codes like MCNP; it was developed with Agile software engineering methodologies under the motivation to reduce costs. The characteristics of MCATK can be summarized as follows: MCATK physics – continuous energy neutron-gamma transport with multi-temperature treatment, static eigenvalue (k and α) algorithms, time-dependent algorithm, fission chain algorithms; MCATK geometry – mesh geometries, solid body geometries. MCATK provides verified, unit-tested Monte Carlo components, flexibility in Monte Carlo applications development, and numerous tools such as geometry and cross section plotters. Recent work has involved deterministic and Monte Carlo analysis of stochastic systems. Static and dynamic analysis is discussed, and the results of a dynamic test problem are given.

  2. Analysis Tools for Sizing and Placement of Energy Storage for Grid Applications - A Literature Review

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Michael G.; Kintner-Meyer, Michael CW; Sadovsky, Artyom; DeSteese, John G.

    2010-09-24

    The purpose of this report was to review pertinent literature and studies that might reveal models capable of optimizing the siting, sizing and economic value of energy storage in the future smart grid infrastructure. Energy storage technology and utility system deployment have been subjects of intense research and development for over three decades. During this time, many models have been developed that consider energy storage implementation in the electric power industry and other applications. Nevertheless, this review of literature discovered no actual models and only a few software tools that relate specifically to the application environment and expected requirements of the evolving smart grid infrastructure. This report indicates the existing need for such a model and describes a pathway for developing it.

  3. Techniques and tools for measuring energy efficiency of scientific software applications

    CERN Document Server

    Abdurachmanov, David; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Goncalo; Ou, Zhonghong; Khan, Kashif

    2014-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running o...

  4. Transportable Applications Environment (TAE) Plus: A NASA tool used to develop and manage graphical user interfaces

    Science.gov (United States)

    Szczur, Martha R.

    1992-01-01

    The Transportable Applications Environment (TAE) Plus was built to support the construction of graphical user interfaces (GUI's) for highly interactive applications, such as real-time processing systems and scientific analysis systems. It is a general purpose portable tool that includes a 'What You See Is What You Get' WorkBench that allows user interface designers to layout and manipulate windows and interaction objects. The WorkBench includes both user entry objects (e.g., radio buttons, menus) and data-driven objects (e.g., dials, gages, stripcharts), which dynamically change based on values of realtime data. Discussed here is what TAE Plus provides, how the implementation has utilized state-of-the-art technologies within graphic workstations, and how it has been used both within and without NASA.

  5. A case study: application of statistical process control tool for determining process capability and sigma level.

    Science.gov (United States)

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles for achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical...
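
    As a hedged numerical illustration of the capability and sigma-level calculations such a study relies on, the sketch below computes Cp, Cpk and an approximate short-term sigma level for a hypothetical tablet-weight attribute; the specification limits and data are invented, not taken from the study.

      # Illustrative process capability calculation (invented tablet-weight data).
      import numpy as np

      rng = np.random.default_rng(3)
      weights = rng.normal(250.0, 2.0, 200)   # mg, hypothetical in-process data
      lsl, usl = 242.5, 257.5                 # hypothetical specification limits

      mean, sd = weights.mean(), weights.std(ddof=1)
      cp = (usl - lsl) / (6 * sd)
      cpk = min(usl - mean, mean - lsl) / (3 * sd)
      sigma_level = 3 * cpk                   # common short-term approximation
      print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  sigma level ~ {sigma_level:.1f}")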

  6. Systematic Review: Concept and Tool Development with Application in the Integrated Risk Information System (IRIS) Assessment Process

    Science.gov (United States)

    Systematic Review: Concept and tool development with application to the National Toxicology Program (NTP) and the Integrated Risk Information System (IRIS) Assessment Processes. There is growing interest within the environmental health community to incorporate systematic review m...

  7. I-HASTREAM : density-based hierarchical clustering of big data streams and its application to big graph analytics tools

    NARCIS (Netherlands)

    Hassani, M.; Spaus, P.; Cuzzocrea, A.; Seidl, T.

    2016-01-01

    Big Data Streams are very popular nowadays, stirred up by a plethora of modern applications such as sensor networks, scientific computing tools, Web intelligence, social network analysis and mining tools, and so forth. Here, the main research issue consists in how to effectively and efficiently...

  8. Development of EPICS channel access embedded ActiveX components for GUI development

    International Nuclear Information System (INIS)

    Roy, A.; Bhole, R.B.; Pal, S.

    2012-01-01

    The paper describes the integration of the Experimental Physics and Industrial Control System (EPICS) Channel Access (CA) protocol and Microsoft ActiveX technology towards developing a generalized operator interface (OPI) building facility for the Windows platform. EPICS is used as the development architecture of the control system in the Superconducting Cyclotron (SCC). Considering the operators' familiarity and compatibility with third party software, it was decided to use the MS-Windows platform at the operator interface level in SCC during commissioning. Microsoft Visual Basic (VB) is used on a trial basis as the OPI building platform to incorporate user-specific features, e.g. file system access for data storage and analysis, user authentication at the OPI level, etc. A set of EPICS Channel Access embedded ActiveX components has been developed to ease programming complexity and reduce the development time of OPIs for the Windows platform. OPIs developed using these components, containing hundreds of process parameters, are being used reliably over a considerable period of time. (author)
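
    The Channel Access operations that such OPI components wrap (read, write, monitor) can be sketched as follows; this uses pyepics on invented PV names purely as a cross-platform illustration of the underlying CA calls, not the ActiveX/Visual Basic components described in the record.

      # Hedged illustration of the CA calls an OPI widget typically wraps.
      # PV names are hypothetical; pyepics stands in for the ActiveX layer.
      import epics

      value = epics.caget("SCC:MAGNET:CURRENT")       # read a process parameter
      epics.caput("SCC:MAGNET:CURRENT:SP", 125.0)     # write a set point

      def update_display(pvname=None, value=None, **kws):
          # In an OPI this callback would refresh the widget showing the parameter.
          print(pvname, value)

      epics.camonitor("SCC:MAGNET:CURRENT", callback=update_display)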

  9. Application of MCDM based hybrid optimization tool during turning of ASTM A588

    Directory of Open Access Journals (Sweden)

    Himadri Majumder

    2017-07-01

    Full Text Available The multi-criteria decision making approach is a powerful tool for solving tangled optimization problems in the machining area due to its capability of handling complex optimization problems in the production process. Turning is widely used in manufacturing processes as it offers advantages such as good product quality, customer satisfaction, economy and relative ease of application. A contemporary approach, MOORA coupled with PCA, was used to ascertain an optimal combination of input parameters (spindle speed, depth of cut and feed rate) for the given output parameters (power consumption, average surface roughness and frequency of tool vibration) using an L27 orthogonal array for turning of ASTM A588 mild steel. Comparison between MOORA-PCA and TOPSIS-PCA shows the effectiveness of MOORA over the TOPSIS method. The optimum parameter combination for multi-performance characteristics established for ASTM A588 mild steel is spindle speed 160 rpm, depth of cut 0.1 mm and feed rate 0.08 mm/rev. Therefore, this study focuses on the application of the hybrid MCDM approach as a vital decision-making tool to deal with multi-objective optimization problems.
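
    The MOORA ratio system used above reduces to a few array operations: normalize each criterion column by its Euclidean norm, then sum the normalized benefit criteria and subtract the normalized cost criteria to rank the alternatives. The sketch below runs on an invented decision matrix, not the L27 experimental data from the record.

      # MOORA ratio-system sketch on an invented decision matrix.
      import numpy as np

      # Rows = parameter combinations; columns = power consumption, average
      # surface roughness, tool vibration (all treated as cost criteria here).
      X = np.array([[1.2, 3.4, 0.8],
                    [1.0, 3.1, 0.9],
                    [1.5, 2.8, 0.7]])
      is_benefit = np.array([False, False, False])

      norm = X / np.sqrt((X ** 2).sum(axis=0))          # vector normalization
      score = norm[:, is_benefit].sum(axis=1) - norm[:, ~is_benefit].sum(axis=1)
      ranking = np.argsort(-score)                      # best alternative first
      print(score, ranking)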

  10. A Tool and Application Programming Interface for Browsing Historical Geostationary Satellite Data

    Science.gov (United States)

    Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Ayers, J.

    2013-12-01

    Providing access to information is a key concern for NASA Langley Research Center. We describe a tool and method that allows end users to easily browse and access information that is otherwise difficult to acquire and manipulate. The tool described has as its core the application-programming interface that is made available to the public. One goal of the tool is to provide a demonstration to end users so that they can use the enhanced imagery as an input into their own work flows. This project builds upon NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite imagery accessible and easily searchable. As we see the increasing use of virtual supply chains that provide additional value at each link there is value in making satellite imagery available through a simple access method as well as allowing users to browse and view that imagery as they need rather than in a manner most convenient for the data provider.

  11. Application of cleaner production tools and failure modes and effects analysis in pig slaughterhouses

    Directory of Open Access Journals (Sweden)

    J. M. Fonseca

    2017-07-01

    Full Text Available Cleaner production (CP) programs and Failure Modes and Effects Analysis (FMEA) are tools used to improve the sustainability of industries, ensuring greater profitability, quality, reliability and safety of their products and services. The meat industry is among the most polluting industries because of the large amounts of organic waste produced during meat processing. The objective of this study was to combine the CP and FMEA tools and to apply them in a pig slaughterhouse in order to detect critical points along the production chain that have a major environmental impact and to establish corrective and preventive actions that could minimize these problems. The results showed that water is the resource most consumed by the industry and also the main source of waste due to microbiological contamination with animal feces and with blood and meat residues. All impacts were found to be real due to their daily occurrence in the industry. Their severity, occurrence, detection and coverage were classified as moderate and high, high, low and moderate, and moderate and high, respectively. The application of the CP and FMEA tools was efficient in identifying and evaluating the environmental impacts caused by the slaughter and processing of pork carcasses. Liquid slaughter effluents and solid wastes (blood and bones) are the factors that pose the greatest risks to the environment. The substitution of treatment plant chemicals with decomposing microorganisms, composting, and the production of animal meal and feed from solid waste are appropriate measures the industry could adopt to minimize the contamination of water resources and soil.

  12. De-extinction and Barriers to the Application of New Conservation Tools.

    Science.gov (United States)

    Seddon, Philip J

    2017-07-01

    Decades of globally coordinated work in conservation have failed to slow the loss of biodiversity. To do better-even if that means nothing more than failing less spectacularly-bolder thinking is necessary. One of the first possible conservation applications of synthetic biology to be debated is the use of genetic tools to resurrect once-extinct species. Since the currency of conservation is biodiversity and the discipline of conservation biology was formed around the prevention of species extinctions, the prospect of reversing extinctions might have been expected to generate unreserved enthusiasm. But it was not universal acclaim that greeted the coming-out party for "de-extinction" that was the TEDx conference and accompanying National Geographic feature in 2013. Why the concern, the skepticism, even the hostility among many conservationists about the idea of restoring lost species? And how does this professional concern relate to public perception and support for conservation? This essay explores the barriers to the acceptance of risky new genomic-based conservation tools by considering five key areas and associated questions that could be addressed in relation to any new conservation tool. I illustrate these using the specific example of de-extinction, and in doing so, I consider whether de-extinction would necessarily be the best first point of engagement between conservation biology and synthetic biology. © 2017 The Hastings Center.

  13. Development of a Nursing Handoff Tool: A Web-Based Application to Enhance Patient Safety

    Science.gov (United States)

    Goldsmith, Denise; Boomhower, Marc; Lancaster, Diane R.; Antonelli, Mary; Kenyon, Mary Anne Murphy; Benoit, Angela; Chang, Frank; Dykes, Patricia C.

    2010-01-01

    Dynamic and complex clinical environments present many challenges for effective communication among health care providers. The omission of accurate, timely, easily accessible vital information by health care providers significantly increases risk of patient harm and can have devastating consequences for patient care. An effective nursing handoff supports the standardized transfer of accurate, timely, critical patient information, as well as continuity of care and treatment, resulting in enhanced patient safety. The Brigham and Women’s/Faulkner Hospital Healthcare Information Technology Innovation Program (HIP) is supporting the development of a web based nursing handoff tool (NHT). The goal of this project is to develop a “proof of concept” handoff application to be evaluated by nurses on the inpatient intermediate care units. The handoff tool would enable nurses to use existing knowledge of evidence-based handoff methodology in their everyday practice to improve patient care and safety. In this paper, we discuss the results of nursing focus groups designed to identify the current state of handoff practice as well as the functional and data element requirements of a web based Nursing Handoff Tool (NHT). PMID:21346980

  14. Alternatives of applications in the information taking with artificial radioactive tools in the mature fields of the South region

    International Nuclear Information System (INIS)

    Fuentes, J.L.

    2005-01-01

    This work describes in detail the application of the saturation control tool (RST) used in the Rodador field (Mexico), and more briefly the reservoir monitoring tool (RMT) and the reservoir performance monitor (RPM) used for the in-situ evaluation of mature wells. All three tools currently rely on nuclear reaction mechanisms, namely neutron capture (PNC) and inelastic scattering (IS), to obtain water and hydrocarbon saturation. These tools have been designed to help evaluate mature wells in the fields, deriving water and hydrocarbon saturation from measurements made through the casing using nuclear techniques. The basic principles of radioactivity and their application in radioactive tools are described, as well as the operational aspects of the tools mentioned above; some practical applications of the saturation control tool and a cost-benefit study are presented, and it is shown how technological advances have allowed considerable progress in acquiring information from mature wells, thus helping to build better geological models of the fields, which in turn help to increase hydrocarbon production in wells that have been in operation for many years. (Author)

  15. Application of bioinformatics tools and databases in microbial dehalogenation research (a review).

    Science.gov (United States)

    Satpathy, R; Konkimalla, V B; Ratha, J

    2015-01-01

    Microbial dehalogenation is a biochemical process in which halogenated substances are catalyzed enzymatically into their non-halogenated forms. Microorganisms have a wide range of organohalogen degradation abilities, both specific and non-specific in nature. Most of these halogenated organic compounds, being pollutants, need to be remediated; therefore, current approaches explore the potential of microbes at a molecular level for effective biodegradation of these substances. Several microorganisms with dehalogenation activity have been identified and characterized. In this respect, bioinformatics plays a key role in gaining deeper knowledge in the field of dehalogenation. To facilitate data mining, many tools have been developed to annotate these data from databases. Therefore, with the discovery of a microorganism, one can predict genes/proteins, perform sequence analysis and structural modelling, and carry out metabolic pathway and biodegradation studies, among other analyses. This review highlights various bioinformatics approaches, describing the application of various databases and specific tools in the microbial dehalogenation field, with special focus on dehalogenase enzymes. Attempts have also been made to decipher some recent applications of in silico modelling methods, comprising gene finding, protein modelling, Quantitative Structure Biodegradability Relationship (QSBR) studies and reconstruction of metabolic pathways, employed in the dehalogenation research area.

  16. Haemocompatibility of iron oxide nanoparticles synthesized for theranostic applications: a high-sensitivity microfluidic tool

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Raquel O. [Polytechnic Institute of Bragança, Laboratory of Separation and Reaction Engineering-Laboratory of Catalysis and Materials (LSRE-LCM) (Portugal); Bañobre-López, Manuel; Gallo, Juan [INL-International Iberian Nanotechnology Laboratory, Advanced (Magnetic) Theranostic Nanostructures Lab (Portugal); Tavares, Pedro B. [Universidade de Trás-os-Montes e Alto Douro, CQVR-Centro de Química-Vila Real (Portugal); Silva, Adrián M. T. [Universidade do Porto, Laboratory of Separation and Reaction Engineering-Laboratory of Catalysis and Materials (LSRE-LCM), Faculdade de Engenharia (Portugal); Lima, Rui, E-mail: rl@dem.uminho.pt [MEtRiCS, University of Minho, Mechanical Engineering Department (Portugal); Gomes, Helder T. [Polytechnic Institute of Bragança, Laboratory of Separation and Reaction Engineering-Laboratory of Catalysis and Materials (LSRE-LCM) (Portugal)

    2016-07-15

    The poor heating efficiency of the most reported magnetic nanoparticles (MNPs), allied to the lack of comprehensive biocompatibility and haemodynamic studies, hampers the spread of multifunctional nanoparticles as the next generation of therapeutic bio-agents in medicine. The present work reports the synthesis and characterization, with special focus on biological/toxicological compatibility, of superparamagnetic nanoparticles with diameter around 18 nm, suitable for theranostic applications (i.e. simultaneous diagnosis and therapy of cancer). Envisioning more insights into the complex nanoparticle-red blood cells (RBCs) membrane interaction, the deformability of the human RBCs in contact with magnetic nanoparticles (MNPs) was assessed for the first time with a microfluidic extensional approach, and used as an indicator of haematological disorders in comparison with a conventional haematological test, i.e. the haemolysis analysis. Microfluidic results highlight the potential of this microfluidic tool over traditional haemolysis analysis, by detecting small increments in the rigidity of the blood cells, when traditional haemotoxicology analysis showed no significant alteration (haemolysis rates lower than 2 %). The detected rigidity has been predicted to be due to the wrapping of small MNPs by the bilayer membrane of the RBCs, which is directly related to MNPs size, shape and composition. The proposed microfluidic tool adds a new dimension into the field of nanomedicine, allowing to be applied as a high-sensitivity technique capable of bringing a better understanding of the biological impact of nanoparticles developed for clinical applications.

  17. Uncertainties in cloud phase and optical thickness retrievals from the Earth Polychromatic Imaging Camera (EPIC).

    Science.gov (United States)

    Meyer, Kerry; Yang, Yuekui; Platnick, Steven

    2016-01-01

    This paper presents an investigation of the expected uncertainties of a single channel cloud optical thickness (COT) retrieval technique, as well as a simple cloud temperature threshold based thermodynamic phase approach, in support of the Deep Space Climate Observatory (DSCOVR) mission. DSCOVR cloud products will be derived from Earth Polychromatic Imaging Camera (EPIC) observations in the ultraviolet and visible spectra. Since EPIC is not equipped with a spectral channel in the shortwave or mid-wave infrared that is sensitive to cloud effective radius (CER), COT will be inferred from a single visible channel with the assumption of appropriate CER values for liquid and ice phase clouds. One month of Aqua MODIS daytime granules from April 2005 is selected for investigating cloud phase sensitivity, and a subset of these granules that has similar EPIC sun-view geometry is selected for investigating COT uncertainties. EPIC COT retrievals are simulated with the same algorithm as the operational MODIS cloud products (MOD06), except using fixed phase-dependent CER values. Uncertainty estimates are derived by comparing the single channel COT retrievals with the baseline bi-spectral MODIS retrievals. Results show that a single channel COT retrieval is feasible for EPIC. For ice clouds, single channel retrieval errors are minimal (< 2%) due to the particle size insensitivity of the assumed ice crystal (i.e., severely roughened aggregate of hexagonal columns) scattering properties at visible wavelengths, while for liquid clouds the error is mostly limited to within 10%, although for thin clouds (COT < 2) the error can be higher. Potential uncertainties in EPIC cloud masking and cloud temperature retrievals are not considered in this study.

  18. Quality assurance of the international computerised 24 h dietary recall method (EPIC-Soft).

    Science.gov (United States)

    Crispim, Sandra P; Nicolas, Genevieve; Casagrande, Corinne; Knaze, Viktoria; Illner, Anne-Kathrin; Huybrechts, Inge; Slimani, Nadia

    2014-02-01

    The interview-administered 24 h dietary recall (24-HDR) EPIC-Soft® has a series of controls to guarantee the quality of dietary data across countries. These comprise all steps that are part of fieldwork preparation, data collection and data management; however, a complete characterisation of these quality controls is still lacking. The present paper describes in detail the quality controls applied in EPIC-Soft, which are, to a large extent, built on the basis of the EPIC-Soft error model and are present in three phases: (1) before, (2) during and (3) after the 24-HDR interviews. Quality controls for consistency and harmonisation are implemented before the interviews while preparing the seventy databases constituting an EPIC-Soft version (e.g. pre-defined and coded foods and recipes). During the interviews, EPIC-Soft uses a cognitive approach by helping the respondent to recall the dietary intake information in a stepwise manner and includes controls for consistency (e.g. probing questions) as well as for completeness of the collected data (e.g. system calculation for some unknown amounts). After the interviews, a series of controls can be applied by dietitians and data managers to further guarantee data quality. For example, the interview-specific 'note files' that were created to track any problems or missing information during the interviews can be checked to clarify the information initially provided. Overall, the quality controls employed in the EPIC-Soft methodology are not always perceivable, but prove to be of assistance for its overall standardisation and possibly for the accuracy of the collected data.

  19. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would allow particular features of the nanocrystals to be exploited, such as their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, the possibility of encapsulation in different materials while retaining the native luminescence (providing the means for the implementation of renewable chemosensors), and even the utilisation of more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their utilisation in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis.

  20. Development of a zonal applicability tool for remote handling equipment in DEMO

    Energy Technology Data Exchange (ETDEWEB)

    Madzharov, Vladimir, E-mail: vladimir.madzharov@kit.edu [Karlsruhe Institute of Technology, Institute for Material Handling and Logistics, Karlsruhe (Germany); Mittwollen, Martin [Karlsruhe Institute of Technology, Institute for Material Handling and Logistics, Karlsruhe (Germany); Leichtle, Dieter [Fusion for Energy F4E, Barcelona (Spain); Hermon, Gary [Culham Center for Fusion Energy, Culham Science Centre, OX14 3DB Abingdon (United Kingdom)

    2015-10-15

    Highlights: • Radiation-hardness assessment of remote handling (RH) components used in DEMO. • A radiation assessment tool for supporting remote handling engineers. • Connecting data from the radiation field analysis to the radiation hardness data. • Output is the expected lifetime of the selected RH component used for maintenance. - Abstract: Radiation-induced damage caused by ionizing radiation can induce malfunctioning of the remote handling equipment (RHE) used during maintenance in fusion power plants, other nuclear power stations and high-energy accelerator facilities such as IFMIF. Therefore, to achieve a sufficient length of operational time inside future fusion power plants, suitably radiation-tolerant RHE for maintenance operations in radiation environments is inevitably required. To assess the influence of radiation on remote handling equipment (RHE), an investigation of the radiation hardness of typically used RHE components has been performed. Additionally, information about the absorbed total dose that every component can withstand before failure was collected. Furthermore, the development of a zonal applicability tool for supporting RHE designers has been started using Excel VBA. The tool connects the data from the radiation field analysis (3-D radiation map) to the radiation hardness data of the planned RHE for DEMO remote maintenance. The intelligent combination of the available information on radiation behaviour and the radiation level at a certain time and location may help in making decisions about the application of RHE in a radiation environment. The user inputs the following parameters: the specific device used in the RHE, the planned location and the maintenance period. The output is the expected lifetime of the selected RHE component at the given location and maintenance period. Planned action times also have to be considered. Once all the parameters are available, it can be decided whether specific RHE...
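
    The lifetime estimate such a tool produces is, at its core, a ratio of the component's tolerable total dose to the dose rate at the planned location. The sketch below illustrates that bookkeeping in Python with invented dose-rate and hardness figures; the actual tool is implemented in Excel VBA and works from the DEMO 3-D radiation map, neither of which is reproduced here.

      # Illustrative dose-budget calculation (invented numbers, not DEMO data).
      dose_rate_at_location = {        # Gy/h, hypothetical radiation-map entries
          "upper_port_cell": 25.0,
          "divertor_area": 180.0,
      }
      tolerable_dose = {               # Gy, hypothetical radiation-hardness data
          "ccd_camera": 1.0e3,
          "resolver": 1.0e6,
      }

      def expected_lifetime_hours(component, location):
          # Hours of exposure before the tolerable total dose is reached.
          return tolerable_dose[component] / dose_rate_at_location[location]

      print(expected_lifetime_hours("ccd_camera", "divertor_area"))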

  1. Improving adherence to the Epic Beacon ambulatory workflow.

    Science.gov (United States)

    Chackunkal, Ellen; Dhanapal Vogel, Vishnuprabha; Grycki, Meredith; Kostoff, Diana

    2017-06-01

    Computerized physician order entry has been shown to significantly improve chemotherapy safety by reducing the number of prescribing errors. Epic's Beacon Oncology Information System of computerized physician order entry and electronic medication administration was implemented in Henry Ford Health System's ambulatory oncology infusion centers on 9 November 2013. Since that time, compliance to the infusion workflow had not been assessed. The objective of this study was to optimize the current workflow and improve the compliance to this workflow in the ambulatory oncology setting. This study was a retrospective, quasi-experimental study which analyzed the composite workflow compliance rate of patient encounters from 9 to 23 November 2014. Based on this analysis, an intervention was identified and implemented in February 2015 to improve workflow compliance. The primary endpoint was to compare the composite compliance rate to the Beacon workflow before and after a pharmacy-initiated intervention. The intervention, which was education of infusion center staff, was initiated by ambulatory-based, oncology pharmacists and implemented by a multi-disciplinary team of pharmacists and nurses. The composite compliance rate was then reassessed for patient encounters from 2 to 13 March 2015 in order to analyze the effects of the determined intervention on compliance. The initial analysis in November 2014 revealed a composite compliance rate of 38%, and data analysis after the intervention revealed a statistically significant increase in the composite compliance rate to 83% (p < 0.001). This study supports that a pharmacist-initiated educational intervention can improve compliance to an ambulatory, oncology infusion workflow.
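    For illustration only, the before/after comparison reported above can be reproduced with a standard two-proportion z-test; the encounter counts below are invented for the sketch, and only the 38% and 83% composite rates come from the abstract:

    ```python
    from math import erfc, sqrt

    n_pre, x_pre = 200, 76      # hypothetical pre-intervention encounters, 38% compliant
    n_post, x_post = 200, 166   # hypothetical post-intervention encounters, 83% compliant

    p_pre, p_post = x_pre / n_pre, x_post / n_post
    p_pool = (x_pre + x_post) / (n_pre + n_post)
    z = (p_post - p_pre) / sqrt(p_pool * (1 - p_pool) * (1 / n_pre + 1 / n_post))
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal approximation

    print(f"z = {z:.2f}, p = {p_value:.2e}")  # p << 0.001 for these counts
    ```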

  2. Application of the CO2-PENS risk analysis tool to the Rock Springs Uplift, Wyoming

    Science.gov (United States)

    Stauffer, P.H.; Pawar, R.J.; Surdam, R.C.; Jiao, Z.; Deng, H.; Lettelier, B.C.; Viswanathan, H.S.; Sanzo, D.L.; Keating, G.N.

    2011-01-01

    We describe preliminary application of the CO2-PENS performance and risk analysis tool to a planned geologic CO2 sequestration demonstration project in the Rock Springs Uplift (RSU), located in southwestern Wyoming. We use data from the RSU to populate CO2-PENS, an evolving system-level modeling tool developed at Los Alamos National Laboratory. This tool has been designed to generate performance and risk assessment calculations for the geologic sequestration of carbon dioxide. Our approach follows Systems Analysis logic and includes estimates of uncertainty in model parameters and Monte-Carlo simulations that lead to probabilistic results. Probabilistic results provide decision makers with a range in the likelihood of different outcomes. Herein we present results from a newly implemented approach in CO2-PENS that captures site-specific spatially coherent details such as topography on the reservoir/cap-rock interface, changes in saturation and pressure during injection, and dip on overlying aquifers that may be impacted by leakage upward through wellbores and faults. We present simulations of CO2 injection under different uncertainty distributions for hypothetical leaking wells and faults. Although results are preliminary and to be used only for demonstration of the approach, future results of the risk analysis will form the basis for a discussion on methods to reduce uncertainty in the risk calculations. Additionally, we present ideas on using the model to help locate monitoring equipment to detect potential leaks. By maintaining site-specific details in the CO2-PENS analysis we provide a tool that allows more logical presentations to stakeholders in the region. © 2011 Published by Elsevier Ltd.
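    A toy Monte Carlo sketch in the spirit of the probabilistic approach described above, with invented distributions and an invented leakage response standing in for the actual CO2-PENS physics:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_realizations = 10_000

    # Uncertain inputs: log-uniform permeability of a leaky well, uniform fault transmissivity (both hypothetical)
    well_perm = 10 ** rng.uniform(-17, -12, n_realizations)   # m^2
    fault_trans = rng.uniform(0.0, 1.0e-6, n_realizations)    # arbitrary units

    # Toy leakage response: rate proportional to the two pathways combined (made-up scaling)
    leak_rate = 3.0e13 * well_perm + 5.0e4 * fault_trans      # tonnes CO2 / yr

    threshold = 1.0  # tonnes CO2 / yr considered significant in this sketch
    prob_exceed = np.mean(leak_rate > threshold)
    print(f"P(leak rate > {threshold} t/yr) ~ {prob_exceed:.3f}")
    ```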

  3. Study of 3D visualization of fast active reflector based on openGL and EPICS

    International Nuclear Information System (INIS)

    Luo Mingcheng; Wu Wenqing; Liu Jiajing; Tang Pengyi; Wang Jian

    2014-01-01

    The Active Reflector is one of the innovations of the Five-hundred-meter Aperture Spherical Telescope (FAST). Its performance influences the performance of the whole telescope. To display the status of the Active Reflector System (ARS) in real time, EPICS (Experimental Physics and Industrial Control System) is used to develop the control system of the ARS, and the virtual 3D technology OpenGL is used to visualize the status. Owing to the real-time performance of EPICS, the status visualization is also displayed in real time, helping users improve the efficiency of telescope observing. (authors)

  4. DSCOVR/EPIC observations of SO2 reveal dynamics of young volcanic eruption clouds

    Science.gov (United States)

    Carn, S. A.; Krotkov, N. A.; Taylor, S.; Fisher, B. L.; Li, C.; Bhartia, P. K.; Prata, F. J.

    2017-12-01

    Volcanic emissions of sulfur dioxide (SO2) and ash have been measured by ultraviolet (UV) and infrared (IR) sensors on US and European polar-orbiting satellites since the late 1970s. Although successful, the main limitation of these observations from low Earth orbit (LEO) is poor temporal resolution (once per day at low latitudes). Furthermore, most currently operational geostationary satellites cannot detect SO2, a key tracer of volcanic plumes, limiting our ability to elucidate processes in fresh, rapidly evolving volcanic eruption clouds. In 2015, the launch of the Earth Polychromatic Imaging Camera (EPIC) aboard the Deep Space Climate Observatory (DSCOVR) provided the first opportunity to observe volcanic clouds from the L1 Lagrange point. EPIC is a 10-band spectroradiometer spanning UV to near-IR wavelengths with two UV channels sensitive to SO2, and a ground resolution of 25 km. The unique L1 vantage point provides continuous observations of the sunlit Earth disk, from sunrise to sunset, offering multiple daily observations of volcanic SO2 and ash clouds in the EPIC field of view. When coupled with complementary retrievals from polar-orbiting UV and IR sensors such as the Ozone Monitoring Instrument (OMI), the Ozone Mapping and Profiler Suite (OMPS), and the Atmospheric Infrared Sounder (AIRS), we demonstrate how the increased observation frequency afforded by DSCOVR/EPIC permits more timely volcanic eruption detection and novel analyses of the temporal evolution of volcanic clouds. Although EPIC has detected several mid- to high-latitude volcanic eruptions since launch, we focus on recent eruptions of Bogoslof volcano (Aleutian Islands, AK, USA). A series of EPIC exposures from May 28-29, 2017, uniquely captures the evolution of SO2 mass in a young Bogoslof eruption cloud, showing separation of SO2- and ice-rich regions of the cloud. We show how analyses of these sequences of EPIC SO2 data can elucidate poorly understood processes in transient eruption

  5. RealCalc : a real time Java calculation tool. Application to HVSR estimation

    Science.gov (United States)

    Hloupis, G.; Vallianatos, F.

    2009-04-01

    The Java computation platform is not a newcomer in the seismology field. It is mainly used for applications concerned with collecting, requesting, distributing and visualizing seismological data, because it is productive, safe and has low maintenance costs. Although it has very attractive characteristics for engineers, Java has not been used frequently in real-time applications, where predictability and reliability are required as a reaction to real-world events. The main reasons for this are the absence of priority support (such as priority-ceiling or priority-inheritance protocols to avoid priority inversion) and the use of automated memory management (the garbage collector). To overcome these problems a number of extensions have been proposed, with the Real Time Specification for Java (RTSJ) being the most promising and most used one. In the current study we used the RTSJ to build an application that receives data continuously and provides estimations in real time. The application consists of four main modules: incoming data, preprocessing, estimation and publication. As an application example we present real-time HVSR estimation. Microtremor recordings are collected continuously by the incoming-data module. The preprocessing module consists of a wavelet-based window selector tool applied to the incoming data stream in order to derive the most stationary parts. The estimation module provides all the necessary calculations according to user specifications. Finally, the publication module, besides presenting the results, also calculates attributes and relevant statistics for each site (temporal variations, HVSR stability). Acknowledgements This work is partially supported by the Greek General Secretariat of Research and Technology in the frame of Crete Regional Project 2000-2006 (M1.2): "TALOS: An integrated system of seismic hazard monitoring and management in the front of the Hellenic Arc", CRETE PEP7 (KP_7).
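    A minimal sketch of the core HVSR (horizontal-to-vertical spectral ratio) computation that the estimation module performs for one stationary window; spectral smoothing and the wavelet-based window selection are omitted, and the data below are synthetic:

    ```python
    import numpy as np

    def hvsr(north: np.ndarray, east: np.ndarray, vertical: np.ndarray, fs: float):
        """Return frequencies and the H/V ratio for one microtremor window."""
        window = np.hanning(len(vertical))
        spec = lambda x: np.abs(np.fft.rfft((x - x.mean()) * window))
        h = np.sqrt(spec(north) ** 2 + spec(east) ** 2) / np.sqrt(2.0)  # combined horizontal spectrum
        v = spec(vertical)
        freqs = np.fft.rfftfreq(len(vertical), d=1.0 / fs)
        return freqs[1:], h[1:] / v[1:]  # drop the DC bin to avoid division issues

    # Usage with synthetic noise (real input would come from the incoming-data module)
    fs, n = 100.0, 4096
    rng = np.random.default_rng(0)
    f, ratio = hvsr(rng.normal(size=n), rng.normal(size=n), rng.normal(size=n), fs)
    print(f[np.argmax(ratio)], ratio.max())
    ```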

  6. Suitability evaluation tool for lands (rice, corn and soybean) as mobile application

    Science.gov (United States)

    Rahim, S. E.; Supli, A. A.; Damiri, N.

    2017-09-01

    Evaluation of land suitability for specific purposes, e.g. for food crops, is essential as a means to understand the determining factors to be considered in managing a land successfully. A framework for evaluating land suitability for agricultural purposes was first introduced by the Food and Agriculture Organization (FAO) in the late 1970s. Using the framework manually is time consuming and unattractive for land users. Therefore, the authors have developed an effective tool by transforming the FAO framework into a smart mobile application. This application is designed using simple language for each factor and a rule-based system (RBS) algorithm. The factors involved are soil type, depth of soil solum, soil fertility, soil pH, drainage, risk of flood, etc. Suitability in this paper is limited to rice, corn and soybean. The application is found to be easier to understand and can automatically determine the suitability of land. Usability testing was also conducted with 75 respondents. The results showed that usability was in the “very good” classification. The program is urgently needed by land managers, farmers, lecturers, students and government officials (planners) to help them manage their land more easily for a better future.
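    A toy rule-based classification in the spirit of the FAO framework and the RBS algorithm mentioned above; the thresholds, factors and classes are illustrative assumptions, not the rules used by the published application:

    ```python
    def rice_suitability(ph: float, solum_depth_cm: float, drainage: str, flood_risk: str) -> str:
        """Return an FAO-style suitability class (S1 highly suitable ... N not suitable)."""
        if flood_risk == "severe" or solum_depth_cm < 20:
            return "N"            # a single severe limiting factor rules out the crop
        limitations = 0
        if not 5.5 <= ph <= 7.0:
            limitations += 1
        if solum_depth_cm < 50:
            limitations += 1
        if drainage == "excessive":
            limitations += 1
        return ["S1", "S2", "S3", "N"][min(limitations, 3)]

    print(rice_suitability(ph=6.2, solum_depth_cm=80, drainage="poor", flood_risk="low"))  # S1
    ```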

  7. Application of a pilot control banding tool for risk level assessment and control of nanoparticle exposures.

    Science.gov (United States)

    Paik, Samuel Y; Zalk, David M; Swuste, Paul

    2008-08-01

    Control banding (CB) strategies offer simplified solutions for controlling worker exposures to constituents that are found in the workplace in the absence of firm toxicological and exposure data. These strategies may be particularly useful in nanotechnology applications, considering the overwhelming level of uncertainty over what nanomaterials and nanotechnologies present as potential work-related health risks, what about these materials might lead to adverse toxicological activity, how risk related to these might be assessed and how to manage these issues in the absence of this information. This study introduces a pilot CB tool or 'CB Nanotool' that was developed specifically for characterizing the health aspects of working with engineered nanoparticles and determining the level of risk and associated controls for five ongoing nanotechnology-related operations being conducted at two Department of Energy research laboratories. Based on the application of the CB Nanotool, four of the five operations evaluated in this study were found to have implemented controls consistent with what was recommended by the CB Nanotool, with one operation even exceeding the required controls for that activity. The one remaining operation was determined to require an upgrade in controls. By developing this dynamic CB Nanotool within the realm of the scientific information available, this application of CB appears to be a useful approach for assessing the risk of nanomaterial operations, providing recommendations for appropriate engineering controls and facilitating the allocation of resources to the activities that most need them.
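    A highly simplified control-banding sketch: a severity score and a probability (exposure) score are combined into a risk level with a recommended control. The band boundaries, matrix and control names are illustrative assumptions, not the published CB Nanotool scoring rules:

    ```python
    def risk_band(severity: float, probability: float) -> tuple[str, str]:
        """Map 0-100 severity and probability scores to a risk level and control (toy matrix)."""
        controls = {
            "RL1": "general ventilation",
            "RL2": "fume hood / local exhaust ventilation",
            "RL3": "containment / glovebox",
            "RL4": "seek specialist advice",
        }
        matrix = [                      # rows: severity band, columns: probability band
            ["RL1", "RL1", "RL2", "RL3"],
            ["RL1", "RL2", "RL2", "RL3"],
            ["RL2", "RL2", "RL3", "RL4"],
            ["RL3", "RL3", "RL4", "RL4"],
        ]
        sev_band = min(int(severity // 25), 3)     # four bands of 25 points each
        prob_band = min(int(probability // 25), 3)
        level = matrix[sev_band][prob_band]
        return level, controls[level]

    print(risk_band(severity=60, probability=30))  # ('RL2', 'fume hood / local exhaust ventilation')
    ```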

  8. Developing Cancer Informatics Applications and Tools Using the NCI Genomic Data Commons API.

    Science.gov (United States)

    Wilson, Shane; Fitzsimons, Michael; Ferguson, Martin; Heath, Allison; Jensen, Mark; Miller, Josh; Murphy, Mark W; Porter, James; Sahni, Himanso; Staudt, Louis; Tang, Yajing; Wang, Zhining; Yu, Christine; Zhang, Junjun; Ferretti, Vincent; Grossman, Robert L

    2017-11-01

    The NCI Genomic Data Commons (GDC) was launched in 2016 and makes available over 4 petabytes (PB) of cancer genomic and associated clinical data to the research community. This dataset continues to grow and currently includes over 14,500 patients. The GDC is an example of a biomedical data commons, which collocates biomedical data with storage and computing infrastructure and commonly used web services, software applications, and tools to create a secure, interoperable, and extensible resource for researchers. The GDC is (i) a data repository for downloading data that have been submitted to it, and also a system that (ii) applies a common set of bioinformatics pipelines to submitted data; (iii) reanalyzes existing data when new pipelines are developed; and (iv) allows users to build their own applications and systems that interoperate with the GDC using the GDC Application Programming Interface (API). We describe the GDC API and how it has been used both by the GDC itself and by third parties. Cancer Res; 77(21); e15-18. ©2017 AACR. ©2017 American Association for Cancer Research.
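    A minimal example of querying the GDC API described above; the endpoint and parameter names follow the public GDC documentation, but treat them as assumptions and check api.gdc.cancer.gov for the current interface:

    ```python
    import json
    import requests

    # Filter: cases belonging to the TCGA-BRCA project
    filters = {
        "op": "=",
        "content": {"field": "cases.project.project_id", "value": "TCGA-BRCA"},
    }
    params = {
        "filters": json.dumps(filters),
        "fields": "case_id,submitter_id,primary_site",
        "size": "5",
        "format": "JSON",
    }

    resp = requests.get("https://api.gdc.cancer.gov/cases", params=params, timeout=30)
    resp.raise_for_status()
    for case in resp.json()["data"]["hits"]:
        print(case["submitter_id"], case.get("primary_site"))
    ```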

  9. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    International Nuclear Information System (INIS)

    Coleman, Justin L.; Bolisetti, Chandu; Veeraraghavan, Swetha; Parisi, Carlo; Prescott, Steven R.; Gupta, Abhinav

    2016-01-01

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  10. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bolisetti, Chandu [Idaho National Lab. (INL), Idaho Falls, ID (United States); Veeraraghavan, Swetha [Idaho National Lab. (INL), Idaho Falls, ID (United States); Parisi, Carlo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steven R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gupta, Abhinav [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  11. Cointegration as a data normalization tool for structural health monitoring applications

    Science.gov (United States)

    Harvey, Dustin Y.; Todd, Michael D.

    2012-04-01

    The structural health monitoring literature has shown an abundance of features sensitive to various types of damage in laboratory tests. However, robust feature extraction in the presence of varying operational and environmental conditions has proven to be one of the largest obstacles in the development of practical structural health monitoring systems. Cointegration, a technique adapted from the field of econometrics, has recently been introduced to the SHM field as one solution to the data normalization problem. Response measurements and feature histories often show long-run nonstationarity due to fluctuating temperature, load conditions, or other factors, which leads to the occurrence of false positives. Cointegration theory allows nonstationary trends common to two or more time series to be modeled and subsequently removed. Thus, the residual retains sensitivity to damage with dependence on operational and environmental variability removed. This study further explores the use of cointegration as a data normalization tool for structural health monitoring applications.
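    A small sketch of cointegration-based data normalization as described above: two feature histories sharing a temperature-driven trend are regressed against each other, and the (ideally stationary) residual becomes the normalized damage-sensitive feature. The data are synthetic and the variable names are illustrative:

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import adfuller

    rng = np.random.default_rng(1)
    n = 1000
    temperature_trend = np.cumsum(rng.normal(size=n))           # shared nonstationary trend
    feature_a = 1.0 * temperature_trend + rng.normal(size=n)    # e.g. first natural frequency
    feature_b = 0.5 * temperature_trend + rng.normal(size=n)    # e.g. second natural frequency

    # Estimate the cointegrating relation a ~ beta * b + c by ordinary least squares
    beta, c = np.polyfit(feature_b, feature_a, deg=1)
    residual = feature_a - (beta * feature_b + c)

    # Augmented Dickey-Fuller test: a small p-value indicates the residual is stationary
    adf_stat, p_value, *_ = adfuller(residual)
    print(f"beta = {beta:.2f}, ADF p-value = {p_value:.3f}")
    ```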

  12. Application of Molecular Tools for Gut Health of Pet Animals: A Review

    Directory of Open Access Journals (Sweden)

    Lipismita Samal

    2011-04-01

    Full Text Available Gut health is an important facet of the well-being of pet animals; it is in this context that various nutritional and biotechnological approaches have been proposed to manipulate gut health by specifically targeting the colonic microbiota. Nutritional approaches include supplementation of antioxidants and phytochemicals like flavonoids, isoflavonoids and carotenoids. Biotechnological approaches include supplementation of probiotics, prebiotics and synbiotics in the diet and the potential application of molecular tools like fluorescent in situ hybridization, denaturing gradient gel electrophoresis, quantitative dot blot hybridization, and restriction fragment length polymorphism, etc., in studying the fecal microbiota composition. Post-genomic and related technologies, i.e. genomics, nutrigenomics, transcriptomics, proteomics, metabolomics and epigenomics, in the study of the gastrointestinal tract also put forward challenges for nutritionists and microbiologists to elucidate the complex interactions between gut microbiota and host.

  13. Methodology for validating technical tools to assess customer Demand Response: Application to a commercial customer

    International Nuclear Information System (INIS)

    Alcazar-Ortega, Manuel; Escriva-Escriva, Guillermo; Segura-Heras, Isidoro

    2011-01-01

    The authors present a methodology, which is demonstrated with some applications to the commercial sector, in order to validate a Demand Response (DR) evaluation method previously developed and applied to a wide range of industrial and commercial segments, whose flexibility was evaluated by modeling. DR is playing a more and more important role in the framework of electricity systems management for the effective integration of other distributed energy resources. Consequently, customers must identify what they are using the energy for in order to use their flexible loads for management purposes. Modeling tools are used to predict the impact of flexibility on the behavior of customers, but this result needs to be validated since both customers and grid operators have to be confident in these flexibility predictions. An easy-to-use two-step method to achieve this goal is presented in this paper.

  14. Delay dynamical systems and applications to nonlinear machine-tool chatter

    International Nuclear Information System (INIS)

    Fofana, M.S.

    2003-01-01

    The stability behaviour of machine chatter that exhibits Hopf and degenerate bifurcations has been examined without the assumption of small delays between successive cuts. Delay dynamical system theory leading to the reduction of the infinite-dimensional character of the governing delay differential equations (DDEs) to a finite-dimensional set of ordinary differential equations has been employed. The essential mathematical arguments for these systems in the context of retarded DDEs are summarized. Then the application of these arguments in the stability study of machine-tool chatter with multiple time delays is presented. Explicit analytical expressions ensuring stable and unstable machining when perturbations are periodic, stochastic and nonlinear have been derived using the integral averaging method and Lyapunov exponents.
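    As a purely illustrative companion to the analytical treatment above, the single-degree-of-freedom regenerative chatter model x'' + 2ζωn x' + ωn²x = −k[x(t) − x(t − τ)] can be integrated numerically with a history buffer for the delayed term; all parameter values below are arbitrary:

    ```python
    import numpy as np

    zeta, wn, k, tau = 0.05, 2 * np.pi * 50.0, 2.0e4, 0.02  # damping ratio, rad/s, cutting stiffness term, delay (s)
    dt, t_end = 1.0e-5, 0.5
    n_steps, n_delay = int(t_end / dt), int(tau / dt)

    x = np.zeros(n_steps)
    v = 0.0
    x[0] = 1.0e-6  # small initial perturbation of the tool position

    for i in range(1, n_steps):
        # Delayed displacement from the previous cut (zero before one delay has elapsed)
        x_delayed = x[i - 1 - n_delay] if i - 1 >= n_delay else 0.0
        a = -2 * zeta * wn * v - wn**2 * x[i - 1] - k * (x[i - 1] - x_delayed)
        v += a * dt                  # semi-implicit Euler step
        x[i] = x[i - 1] + v * dt

    print("growing oscillation (chatter)" if abs(x[-1]) > abs(x[0]) else "decaying response (stable cut)")
    ```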

  15. Analysis on the applicability of environmental management tools for the management of natural wetlands within Colombia

    International Nuclear Information System (INIS)

    Herrera A, Maria A; Sepulveda L, Monica V; Aguirre R, Nestor J

    2008-01-01

    Based on an investigation of scientific and legislative information about the environmental management of natural wetlands in Colombia, this paper analyzes the applicability of the administration tools employed in the integrated management of these ecosystems. For this work, key categories and subcategories of analysis were identified, based on a ranking of the natural inland wetlands of the country and their current state, a review of existing environmental regulations, a discussion of selected wetland management plans and, finally, the identification of the scientific groups in Colciencias conducting studies on this subject. The information is systematized on the basis of these results, thereby generating an approximation to an analytical proposal that will help decision makers and scientists to guide future research and policy on the management of wetlands in the country.

  16. Control theory in physics and other fields of science concepts, tools and applications

    CERN Document Server

    Schulz, Michael

    2006-01-01

    This book covers systematically and in a simple language the mathematical and physical foundations of controlling deterministic and stochastic evolutionary processes in systems with a high degree of complexity. Strong emphasis is placed on concepts, methods and techniques for modelling, assessment and the solution or estimation of control problems in an attempt to understand the large variability of these problems in several branches of physics, chemistry and biology as well as in technology and economics. The main focus of the book is on a clear physical and mathematical understanding of the dynamics and kinetics behind several kinds of control problems and their relation to self-organizing principles in complex systems. The book is a modern introduction and a helpful tool for researchers, engineers as well as post-docs and graduate students interested in an application oriented control theory and related topics.

  17. Graphical Environment Tools for Application to Gamma-Ray Energy Tracking Arrays

    Energy Technology Data Exchange (ETDEWEB)

    Todd, Richard A. [RIS Corp.; Radford, David C. [ORNL Physics Div.

    2013-12-30

    Highly segmented, position-sensitive germanium detector systems are being developed for nuclear physics research where traditional electronic signal processing with mixed analog and digital function blocks would be enormously complex and costly. Future systems will be constructed using pipelined processing of high-speed digitized signals as is done in the telecommunications industry. Techniques which provide rapid algorithm and system development for future systems are desirable. This project has used digital signal processing concepts and existing graphical system design tools to develop a set of re-usable modular functions and libraries targeted for the nuclear physics community. Researchers working with complex nuclear detector arrays such as the Gamma-Ray Energy Tracking Array (GRETA) have been able to construct advanced data processing algorithms for implementation in field programmable gate arrays (FPGAs) through application of these library functions using intuitive graphical interfaces.
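    An example of the kind of reusable DSP building block such a library might provide: a trapezoidal shaping filter applied to a digitized step-like detector pulse (this is a generic pulse-processing algorithm, not necessarily one of the GRETA library functions, and pole-zero correction for exponential preamplifier decay is omitted):

    ```python
    import numpy as np

    def trapezoid(signal: np.ndarray, rise: int, flat: int) -> np.ndarray:
        """Recursive trapezoidal filter; flat-top height ~ rise * step amplitude for a step input."""
        gap = rise + flat
        d = np.zeros(len(signal))
        for n in range(len(signal)):
            d[n] = (signal[n]
                    - (signal[n - rise] if n >= rise else 0.0)
                    - (signal[n - gap] if n >= gap else 0.0)
                    + (signal[n - gap - rise] if n >= gap + rise else 0.0))
        return np.cumsum(d)

    # Synthetic step of amplitude 1.0 at sample 100, with a little noise
    rng = np.random.default_rng(7)
    pulse = np.concatenate([np.zeros(100), np.ones(400)]) + 0.01 * rng.normal(size=500)
    shaped = trapezoid(pulse, rise=40, flat=20)
    print(shaped.max() / 40)  # ~ 1.0, the recovered pulse amplitude
    ```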

  18. Warehousing in the Global Supply Chain Advanced Models, Tools and Applications for Storage Systems

    CERN Document Server

    2012-01-01

    With increased globalization and offshore sourcing, global supply chain management is becoming an important issue for many businesses as it involves a company's worldwide interests and suppliers rather than simply a local or national orientation. The storage systems significantly affect the level of quality of products, the customer’s service level, and the global logistic cost. The mission of warehousing systems design, control and optimization is to effectively ship products to the right place, at the right time, and in the right quantity (i.e. in any configuration) without any damage or alteration, while minimizing costs. Warehousing in the Global Supply Chain presents and discusses a set of models, tools and real applications, including a few case studies rarely presented in sufficient detail elsewhere in the literature, to illustrate the main challenges in warehousing activities. This includes all warehouse operations (from receiving to shipping), problems and issues (e.g. storage allocation, assignment,...

  19. The CRISPR/Cas genome-editing tool: application in improvement of crops

    Directory of Open Access Journals (Sweden)

    SURENDER eKHATODIA

    2016-04-01

    Full Text Available The Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)-associated Cas9/sgRNA system is a novel targeted genome-editing technique derived from the bacterial immune system; it is a cheap, easy and rapidly adopted genome-editing tool that is transforming into a revolutionary paradigm. The Cas9 protein is an RNA-guided endonuclease used for creating targeted double-stranded breaks, with only a short RNA sequence required to confer recognition of the target in animals and plants. The development of genetically edited (GE) crops similar to those developed by conventional or mutation breeding makes this potential technique promising and extremely versatile for providing sustainable, productive agriculture to better feed a rapidly growing population in a changing climate. The emerging areas of research for genome editing in plants include interrogating gene function, rewiring regulatory signalling networks, and sgRNA libraries for high-throughput loss-of-function screening. In this review, we discuss the broad applicability of Cas9 nuclease-mediated targeted plant genome editing for the development of designer crops. The regulatory uncertainty and social acceptance of plant breeding by Cas9 genome editing are also discussed. Non-GM, genetically edited plants could support climate-resilient and sustainable-energy agriculture in the coming future, maximizing yield by combating abiotic and biotic stresses with this new innovative plant breeding technique.
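    A toy illustration of how Cas9 target sites are chosen: scan a DNA sequence for 20-nt protospacers followed by an NGG PAM on the forward strand. Real sgRNA design tools also score off-targets, GC content and secondary structure; the sequence below is made up:

    ```python
    import re

    def find_sgrna_targets(dna: str):
        """Yield (position, protospacer, PAM) for every 20-mer followed by an NGG PAM."""
        dna = dna.upper()
        # Lookahead allows overlapping candidate sites to be reported
        for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", dna):
            yield m.start(), m.group(1), m.group(2)

    seq = "ATGCTGACCGTTAGCCTAGGATCCAGGTTGACCATGGCGTACGTTAGGCTAAGG"
    for pos, spacer, pam in find_sgrna_targets(seq):
        print(pos, spacer, pam)
    ```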

  20. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Sandra Brown; Patrick Gonzalez; Brent Sohngen; Neil Sampson; Mark Anderson; Miguel Calmon; Sean Grimland; Zoe Kant; Dan Morse; Sarah Woodhouse Murdock; Arlene Olivero; Tim Pearson; Sarah Walker; Jon Winsten; Chris Zganjar

    2007-03-31

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between January 1st and March 31st 2007. The specific tasks discussed include: Task 1--carbon inventory advancements; Task 2--emerging technologies for remote sensing of terrestrial carbon; Task 3--baseline method development; Task 4--third-party technical advisory panel meetings; Task 5--new project feasibility studies; and Task 6--development of new project software screening tool.

  1. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Sandra Brown; Patrick Gonzalez; Brent Sohngen; Neil Sampson; Mark Anderson; Miguel Calmon; Sean Grimland; Ellen Hawes; Zoe Kant; Dan Morse; Sarah Woodhouse Murdock; Arlene Olivero; Tim Pearson; Sarah Walker; Jon Winsten; Chris Zganjar

    2006-09-30

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st and July 30th 2006. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  2. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Patrick Gonzalez; Sandra Brown; Jenny Henman; Zoe Kant; Sarah Woodhouse Murdock; Neil Sampson; Gilberto Tiepolo; Tim Pearson; Sarah Walker; Miguel Calmon

    2006-01-01

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st , 2005 and June 30th, 2005. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  3. APPLICATION AND DEVELOPMENT OF APPROPRIATE TOOLS AND TECHNOLOGIES FOR COST-EFFECTIVE CARBON

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Sandra Brown; Ellen Hawes; Zoe Kant; Miguel Calmon; Patrick Gonzalez; Brad Kreps; Gilberto Tiepolo

    2003-09-01

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The research described in this report occurred between July 1, 2002 and June 30, 2003. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: advanced videography testing; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  4. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Patrick Gonzalez; Sandra Brown; Jenny Henman; Sarah Woodhouse Murdock; Neil Sampson; Tim Pearson; Sarah Walker; Zoe Kant; Miguel Calmon

    2006-04-01

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between January 1st and March 31st 2006. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  5. THE APPLICATION AND DEVELOPMENT OF APPROPRIATE TOOLS AND TECHNOLOGIES FOR COST-EFFECTIVE CARBON SEQUESTRATION

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Sandra Brown; Ellen Hawes; Zoe Kant; Miguel Calmon; Gilberto Tiepolo

    2002-09-01

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research projects is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: advanced videography testing; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  6. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Patrick Gonzalez; Sandra Brown; Sarah Woodhouse Murdock; Jenny Henman; Zoe Kant; Gilberto Tiepolo; Tim Pearson; Neil Sampson; Miguel Calmon

    2005-10-01

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st , 2005 and June 30th, 2005. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  7. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Bill Stanley; Sandra Brown; Patrick Gonzalez; Zoe Kant; Gilberto Tiepolo; Wilber Sabido; Ellen Hawes; Jenny Henman; Miguel Calmon; Michael Ebinger

    2004-07-10

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The research described in this report occurred between July 1, 2002 and June 30, 2003. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: remote sensing for carbon analysis; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  8. Applications of nanoparticles in cancer detection and diagnostic tool for hepatocellular carcinoma

    International Nuclear Information System (INIS)

    Venkatasalam, C.; Nagappan, A.

    2012-01-01

    Cancer nanotechnology is a multidisciplinary area of science and technology. In recent years nanoparticles have been used in the medical field as diagnostic tools, providing highly precise and accurate measurements for detecting many diseases. One broad application in cancer biology is molecular imaging and the molecular diagnosis of cancer cells. The present study deals with nanoparticles that are widely used as biomarker imaging agents for tumour detection in cancer. An ample choice of materials may be used to construct nanoparticles that can be coated to increase delivery capability or to provide unique structural and electrical properties for imaging. These exclusive properties have been exploited in several functional nanoparticles that have already been demonstrated, including some clinically approved liposome drugs and metallic imaging agents. In the early detection of hepatocellular carcinoma, metallic nanoparticles play a vital role in imaging technology. Several functions of nanoparticles may eventually add to the understanding of image production, especially the darkening and enlargement of images. These nanoparticles may be able to identify malignant cells by means of molecular detection, to visualize their location in the body by providing enhanced contrast in medical imaging technology and, through selective particle targeting and monitoring, to identify multiplied cells in different organs of the body. In the future, nanoparticles will have a vital role in the medical field for detecting cancer cells. (author)

  9. A computer tool for daily application of the linear quadratic model

    International Nuclear Information System (INIS)

    Macias Jaen, J.; Galan Montenegro, P.; Bodineau Gil, C.; Wals Zurita, A.; Serradilla Gil, A.M.

    2001-01-01

    The aim of this paper is to indicate the relevance of the A.S.A.R.A. (As Short As Reasonably Achievable) criterion in the optimization of a fractionated radiotherapy schedule, and to present a Windows computer program as an easy tool in order to: evaluate the Biological Equivalent Dose (BED) of a fractionated schedule; compare different treatments; and compensate a treatment when a delay has occurred, using a version of the Linear Quadratic model that takes the accelerated-repopulation factor into account. Conclusions: Delays in the normal radiotherapy schedule have to be controlled as much as possible, because overall treatment time can be a very important parameter for the proper delivery of treatment, principally when the tumour is fast growing, and it is therefore necessary to evaluate them. The A.S.A.R.A. criterion is useful to indicate the relevance of this aspect, and computer tools like this one can help to achieve it. (author)
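    A worked sketch of the linear-quadratic quantities such a tool computes; the repopulation-corrected BED formula is the standard textbook form, while the parameter values (α/β, α, Tpot, kick-off time) are illustrative rather than the program's defaults:

    ```python
    from math import log

    def bed(n_fractions: int, dose_per_fx: float, alpha_beta: float,
            overall_time: float = 0.0, kickoff_time: float = 0.0,
            alpha: float = 0.3, t_pot: float = 3.0) -> float:
        """Biologically effective dose in Gy, with an accelerated-repopulation correction."""
        bed_no_time = n_fractions * dose_per_fx * (1 + dose_per_fx / alpha_beta)
        repop = 0.0
        if overall_time > kickoff_time:
            # Dose "lost" to repopulation after the kick-off time
            repop = log(2) * (overall_time - kickoff_time) / (alpha * t_pot)
        return bed_no_time - repop

    # 35 x 2 Gy delivered in 46 days vs. the same schedule delayed by one week
    print(bed(35, 2.0, alpha_beta=10, overall_time=46, kickoff_time=28))  # ~ 70.1 Gy
    print(bed(35, 2.0, alpha_beta=10, overall_time=53, kickoff_time=28))  # ~ 64.7 Gy
    ```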

  10. The NOAA Local Climate Analysis Tool - An Application in Support of a Weather Ready Nation

    Science.gov (United States)

    Timofeyeva, M. M.; Horsfall, F. M.

    2012-12-01

    Citizens across the U.S., including decision makers from the local to the national level, have a multitude of questions about climate, such as the current state and how that state fits into the historical context, and more importantly, how climate will impact them, especially with regard to linkages to extreme weather events. Developing answers to these types of questions for locations has typically required extensive work to gather data, conduct analyses, and generate relevant explanations and graphics. Too frequently providers don't have ready access to or knowledge of reliable, trusted data sets, nor sound, scientifically accepted analysis techniques such that they can provide a rapid response to queries they receive. In order to support National Weather Service (NWS) local office forecasters with information they need to deliver timely responses to climate-related questions from their customers, we have developed the Local Climate Analysis Tool (LCAT). LCAT uses the principles of artificial intelligence to respond to queries, in particular, through use of machine technology that responds intelligently to input from users. A user translates customer questions into primary variables and issues and LCAT pulls the most relevant data and analysis techniques to provide information back to the user, who in turn responds to their customer. Most responses take on the order of 10 seconds, which includes providing statistics, graphical displays of information, translations for users, metadata, and a summary of the user request to LCAT. Applications in Phase I of LCAT, which is targeted for the NWS field offices, include Climate Change Impacts, Climate Variability Impacts, Drought Analysis and Impacts, Water Resources Applications, Attribution of Extreme Events, and analysis techniques such as time series analysis, trend analysis, compositing, and correlation and regression techniques. Data accessed by LCAT are homogenized historical COOP and Climate Prediction Center
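    As a small illustration of one of the analysis techniques listed above (trend analysis), a linear trend can be fitted to a monthly temperature series; the data here are synthetic, not COOP records:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    months = np.arange(360)                                          # 30 years of monthly values
    temps = 10 + 0.002 * months + rng.normal(0, 1.5, months.size)    # degrees C with a slight warming trend

    slope_per_month, intercept = np.polyfit(months, temps, deg=1)
    print(f"trend: {slope_per_month * 120:.2f} degrees C per decade")
    ```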

  11. Sulfate, nitrate and blood pressure - An EPIC interaction between sulfur and nitrogen.

    Science.gov (United States)

    Kuhnle, Gunter G; Luben, Robert; Khaw, Kay-Tee; Feelisch, Martin

    2017-08-01

    Nitrate (NO₃⁻)-rich foods such as green leafy vegetables are not only part of a healthy diet, but increasingly marketed for primary prevention of cardiovascular disease (CVD) and used as ergogenic aids by competitive athletes. While there is abundant evidence for mild hypotensive effects of nitrate on acute application there is limited data on chronic intake in humans, and results from animal studies suggest no long-term benefit. This is important as nitrate can also promote the formation of nitrosamines. It is therefore classified as 'probably carcinogenic to humans', although a beneficial effect on CVD risk might compensate for an increased cancer risk. Dietary nitrate requires reduction to nitrite (NO₂⁻) by oral commensal bacteria to contribute to the formation of nitric oxide (NO). The extensive crosstalk between NO and hydrogen sulfide (H₂S) related metabolites may further affect nitrate's bioactivity. Using nitrate and nitrite concentrations of drinking water - the only dietary source continuously monitored for which detailed data exist - in conjunction with data of >14,000 participants of the EPIC-Norfolk study, we found no inverse associations with blood pressure or CVD risk. Instead, we found a strong interaction with sulfate (SO₄²⁻). At low sulfate concentrations, nitrate was inversely associated with BP (-4 mmHg in top quintile) whereas this was reversed at higher concentrations (+3 mmHg in top quintile). Our findings have a potentially significant impact for pharmacology, physiology and public health, redirecting our attention from the oral microbiome and mouthwash use to interaction with sulfur-containing dietary constituents. These results also indicate that nitrate bioactivation is more complex than hitherto assumed. The modulation of nitrate bioactivity by sulfate may render dietary lifestyle interventions aimed at increasing nitrate intake ineffective and even reverse potential antihypertensive effects, warranting further investigation

  12. Politics in poetry: epic poetry as a critique of Dutch culture

    Directory of Open Access Journals (Sweden)

    O.M. Heynders

    2010-07-01

    Full Text Available This article describes a Dutch volume of epic poetry, using a disciplinary strategy (concepts and devices from narrative studies and a cultural-analytical and rhetorical approach). The volume “Roeshoofd hemelt” by Joost Zwagerman (2005) is a political poetic text that raises fundamental questions on issues of mental illness and on consumerism in contemporary Dutch society.

  13. Diversity of dietary patterns observed in the European Prospective Investigation into Cancer and Nutrition (EPIC) project

    NARCIS (Netherlands)

    Slimani, N.; Fahey, M.; Welch, A.A.; Wirfalt, E.; Stripp, C.; Bergstrom, E.; Linseisen, J.; Schulze, M.B.; Bamia, C.; Chloptsios, Y.; Veglia, F.; Panico, S.; Bueno de Mesquita, B.; Ocké, M.C.; Brustadt, M.; Lund, E.; Gonzalez, C.A.; Barcos, A.; Berglund, G.; Winkvist, A.; Mulligan, A.; Appleby, P.; Overvad, K.; Tjonneland, A.; Clavel-Chapelon, F.; Kesse, E.; Ferrari, P.; Staveren, van W.A.; Riboli, E.

    2002-01-01

    Objective: To describe the diversity in dietary patterns existing across centres/regions participating in the European Prospective Investigation into Cancer and Nutrition (EPIC). Design and setting: Single 24-hour dietary recall measurements were obtained by means of standardised face-to-face

  14. The True Lion King of Africa: The Epic History of Sundiata, King of Old Mali.

    Science.gov (United States)

    Paterno, Domenica R.

    David Wisniewski's 1992 picture book version of the African epic of "Sundiata, Lion King of Mali" and the actual historical account of the 13th century Lion King, Sundiata, are both badly served by Disney's "The Lion King." Disney has been praised for using African animals as story characters; for using the African landscape as…

  15. Implementation of EPICS based vacuum control system for variable energy cyclotron centre, Kolkata

    Science.gov (United States)

    Roy, Anindya; Bhole, R. B.; Nandy, Partha P.; Yadav, R. C.; Pal, Sarbajit; Roy, Amitava

    2015-03-01

    The vacuum system of the Room Temperature (K = 130) Cyclotron of the Variable Energy Cyclotron Centre comprises the vacuum systems of the main machine and the Beam Transport System. The vacuum control system has been upgraded from the initial relay-based manual system to a PLC-based automated system. The supervisory control of the vacuum system is implemented in the Experimental Physics and Industrial Control System (EPICS). An EPICS-embedded, ARM-based vacuum gauge controller has been developed to remove the need for vendor-specific gauge controllers and to allow seamless integration of the gauge controllers with the control system. A set of MS-Windows ActiveX components with an embedded EPICS Channel Access interface has been developed to build operator interfaces with less complex programming and to incorporate typical Windows features, e.g., user authentication, file handling, better fonts, colors, mouse actions, etc., into the operator interfaces. The control parameters, monitoring parameters, and system interlocks of the system are archived in the MySQL-based EPICS MySQL Archiver developed indigenously. In this paper, we describe the architecture, the implementation details, and the performance of the system.
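    A minimal sketch of the supervisory-side pattern described in this record: monitoring an EPICS process variable over Channel Access and archiving value changes to a database. pyepics is used here, SQLite stands in for the MySQL archiver, and the PV name is hypothetical:

    ```python
    import sqlite3
    import time
    import epics  # pyepics Channel Access client

    # SQLite stands in for the MySQL archiver; allow use from CA callback threads
    db = sqlite3.connect("vacuum_archive.db", check_same_thread=False)
    db.execute("CREATE TABLE IF NOT EXISTS archive (ts REAL, pv TEXT, value REAL)")

    def on_change(pvname=None, value=None, timestamp=None, **kw):
        """Channel Access monitor callback: store every value update."""
        db.execute("INSERT INTO archive VALUES (?, ?, ?)", (timestamp, pvname, value))
        db.commit()

    # Hypothetical PV name; monitor callbacks fire on every value change
    gauge = epics.PV("VECC:CYC:VAC:GAUGE1:PRESSURE", callback=on_change)

    time.sleep(60)  # keep the process alive while monitor callbacks arrive
    ```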

  16. Implementation of EPICS based vacuum control system for variable energy cyclotron centre, Kolkata

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Anindya, E-mail: r-ani@vecc.gov.in; Bhole, R. B.; Nandy, Partha P.; Yadav, R. C.; Pal, Sarbajit; Roy, Amitava [Variable Energy Cyclotron Centre, 1/AF Bidhan Nagar, Kolkata 700064 (India)

    2015-03-15

    The vacuum system of the Room Temperature (K = 130) Cyclotron of the Variable Energy Cyclotron Centre comprises the vacuum systems of the main machine and the Beam Transport System. The vacuum control system has been upgraded from the initial relay-based manual system to a PLC-based automated system. The supervisory control of the vacuum system is implemented in the Experimental Physics and Industrial Control System (EPICS). An EPICS-embedded, ARM-based vacuum gauge controller has been developed to remove the need for vendor-specific gauge controllers and to allow seamless integration of the gauge controllers with the control system. A set of MS-Windows ActiveX components with an embedded EPICS Channel Access interface has been developed to build operator interfaces with less complex programming and to incorporate typical Windows features, e.g., user authentication, file handling, better fonts, colors, mouse actions, etc., into the operator interfaces. The control parameters, monitoring parameters, and system interlocks of the system are archived in the MySQL-based EPICS MySQL Archiver developed indigenously. In this paper, we describe the architecture, the implementation details, and the performance of the system.

  17. The Epic Poem "Raol de Cambrai" and Student Analysis of the French Feudal Aristocracy.

    Science.gov (United States)

    Madison, Kenneth G.

    1980-01-01

    Suggests how college history teachers can help students understand the French aristocracy and its role in medieval society by using a twelfth century epic. "Raol de Cambrai" gives students a sense that the poem's action could have happened to real people. A content analysis of the poem's action and characters is included. (DB)

  18. Constraining the Origin of Phobos with the Elpasolite Planetary Ice and Composition Spectrometer (EPICS) - Simulated Performance

    Science.gov (United States)

    Nowicki, S. F.; Mesick, K.; Coupland, D. D. S.; Dallmann, N. A.; Feldman, W. C.; Stonehill, L. C.; Hardgrove, C.; Dibb, S.; Gabriel, T. S. J.; West, S.

    2017-12-01

    Elpasolites are a promising new family of inorganic scintillators that can detect both gamma rays and neutrons within a single detector volume, reducing the instrument size, weight, and power (SWaP), all of which are critical for planetary science missions. Neutron and gamma-ray events are distinguished through pulse shape discrimination (PSD). The Elpasolite Planetary Ice and Composition Spectrometer (EPICS) utilizes elpasolites in a next-generation, highly capable, low-SWaP gamma-ray and neutron spectrometer. We present the simulated sensitivity of EPICS to neutrons and gamma rays, and demonstrate how EPICS can constrain the origin of Phobos among the following three main hypotheses: 1) accretion after a giant impact with Mars, 2) co-accretion with Mars, and 3) capture of an external body. The MCNP6 code was used to calculate the neutron and gamma-ray fluxes that escape the surface of Phobos, and GEANT4 to model the response of the EPICS instrument in orbit around Phobos.
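
    As a rough illustration of the pulse shape discrimination idea the abstract relies on (not the EPICS flight algorithm), the sketch below computes a charge-comparison PSD parameter, the ratio of the delayed-tail integral to the total integral of a digitized scintillation pulse; the window lengths and synthetic pulse shapes are assumed values.

```python
# Charge-comparison PSD sketch: tail-to-total integral ratio of a digitized pulse.
# Window lengths and the synthetic pulse shapes are assumptions chosen only to
# illustrate the technique, not the EPICS instrument parameters.
import numpy as np

def psd_ratio(waveform, peak_index, tail_start=20, total_len=200):
    """Return tail/total charge ratio; larger values suggest a neutron-like pulse."""
    total = waveform[peak_index:peak_index + total_len].sum()
    tail = waveform[peak_index + tail_start:peak_index + total_len].sum()
    return tail / total if total > 0 else 0.0

def synthetic_pulse(n=400, fast=1.0, slow=0.0, tau_fast=5.0, tau_slow=60.0):
    t = np.arange(n, dtype=float)
    return fast * np.exp(-t / tau_fast) + slow * np.exp(-t / tau_slow)

if __name__ == "__main__":
    gamma_like = synthetic_pulse(slow=0.05)    # mostly fast scintillation component
    neutron_like = synthetic_pulse(slow=0.30)  # enhanced slow (delayed) light
    print("gamma-like PSD  :", round(psd_ratio(gamma_like, 0), 3))
    print("neutron-like PSD:", round(psd_ratio(neutron_like, 0), 3))
```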

  19. European Prospective Investigation into Cancer and Nutrition (EPIC): study populations and data collection

    DEFF Research Database (Denmark)

    Riboli, E.; Hunt, K.J.; Slimani, N.

    2002-01-01

    , mostly in liquid nitrogen. To calibrate dietary measurements, a standardised, computer-assisted 24-hour dietary recall was implemented at each centre on stratified random samples of the participants, for a total of 36 900 subjects. EPIC represents the largest single resource available today world...

  20. EPIC 219217635: A Doubly Eclipsing Quadruple System Containing an Evolved Binary

    DEFF Research Database (Denmark)

    Borkovits, T.; Albrecht, S.; Rappaport, S.

    2018-01-01

    We have discovered a doubly eclipsing, bound, quadruple star system in the field of K2 Campaign 7. EPIC 219217635 is a stellar image with Kp = 12.7 that contains an eclipsing binary (‘EB’) with PA = 3.59470 d and a second EB with PB = 0.61825 d. We have obtained followup radial-velocity (‘RV’) sp...

  1. POPULARIZING EPIC NARRATIVE IN GEORGE R.R. MARTIN'S A GAME OF THRONES

    Directory of Open Access Journals (Sweden)

    Ida Rochani Adi

    2012-11-01

    Full Text Available This research is intended to show the sustainability of the epic in recent years of human history through one of the most phenomenal fantasies in American literature, A Game of Thrones. Along with the capability of human beings to think clearly and sensibly, it is commonsensical that people tend to free themselves from irrationality. The reality shows, however, that epic fantasy still has the power to appeal to audiences and readers. This is the case with A Game of Thrones, written by George R.R. Martin, who was named one of Time magazine's Most Influential People in 2011. This qualitative research, using a genre approach, finds that in order to be compatible with today's society, the epic seen in A Game of Thrones, which is commonly known as a story centering on a legendary hero and his heroic deeds in the oral folk tradition, keeps its power as an epic fantasy narrative through certain archetypes and formulas. Through genre analysis using a semiotic approach, the research concludes that the elements of high fantasy, elements built through rational representation, and a smart combination of convention and invention bring about its popularity. It is also concluded that there is a close relationship between the myth and the mode of people's living even in the most modern context.

  2. Mediterranean Style Diet and 12-Year Incidence of Cardiovascular Diseases: The EPIC-NL Cohort Study

    NARCIS (Netherlands)

    Hoevenaar-Blom, M.P.; Nooyens, A.J.C.; Kromhout, D.; Spijkerman, A.M.W.; Beulens, W.J.; Schouw, van der Y.T.; Bueno-de-Mesquita, B.; Verschuren, W.M.M.

    2012-01-01

    Background: A recent meta-analysis showed that a Mediterranean style diet may protect against cardiovascular diseases (CVD). Studies on disease-specific associations are limited. We evaluated the Mediterranean Diet Score (MDS) in relation to incidence of total and specific CVDs. Methods: The EPIC-NL

  3. Implementation of EPICS based vacuum control system for variable energy cyclotron centre, Kolkata

    International Nuclear Information System (INIS)

    Roy, Anindya; Bhole, R. B.; Nandy, Partha P.; Yadav, R. C.; Pal, Sarbajit; Roy, Amitava

    2015-01-01

    The vacuum system of the Room Temperature (K = 130) Cyclotron of the Variable Energy Cyclotron Centre comprises the vacuum systems of the main machine and the Beam Transport System. The vacuum control system has been upgraded from the initial relay-based manual system to a PLC-based automated system. Supervisory control of the vacuum system is implemented in the Experimental Physics and Industrial Control System (EPICS). An EPICS-embedded, ARM-based vacuum gauge controller was developed to remove the dependence on vendor-specific gauge controllers and to allow seamless integration of the gauge controllers with the control system. A set of MS-Windows ActiveX components with an embedded EPICS Channel Access interface was developed to build operator interfaces with less complex programming and to bring typical Windows features, e.g., user authentication, file handling, better fonts, colors, and mouse actions, into the operator interfaces. The control parameters, monitoring parameters, and system interlocks are archived in an indigenously developed, MySQL-based EPICS archiver. In this paper, we describe the architecture, the implementation details, and the performance of the system.

  4. Design and Implementation of a Web-based Monitoring System by using EPICS Channel Access Protocol

    International Nuclear Information System (INIS)

    An, Eun Mi; Song, Yong Gi

    2009-01-01

    The Proton Engineering Frontier Project (PEFP) has developed a 20 MeV proton accelerator and established an EPICS-based distributed control system for sub-system components such as the vacuum unit, beam diagnostics, and the power supply system. The control system includes real-time monitoring and alarm functions. The EPICS software framework was adopted for efficient maintenance of the control system and straightforward extension to additional subsystems. In addition, a control system should provide easy access for users and real-time monitoring on a user screen. Therefore, we have implemented a new web-based monitoring server with several libraries. By adding a DB module, the new IOC web monitoring system makes it possible to monitor the system through the web. By integrating the EPICS Channel Access (CA) and database libraries into the DB module, the web-based monitoring system allows users to monitor sub-system status through an internet browser. In this study, we developed a web-based monitoring system by using an EPICS IOC (Input Output Controller) with an IBM server.
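
    A minimal sketch of the same idea, assuming a Python environment with pyepics and Flask (the PEFP implementation was built with its own libraries and DB module; the PV names below are hypothetical placeholders):

```python
# Minimal web-monitoring sketch: read EPICS PVs over Channel Access and
# serve them as JSON to a browser. PV names are hypothetical placeholders.
from epics import caget
from flask import Flask, jsonify

app = Flask(__name__)

MONITORED_PVS = ["VAC:ION_PUMP:CURRENT", "BD:BPM01:XPOS"]  # hypothetical PVs

@app.route("/status")
def status():
    # caget returns None if the PV is unreachable; report that explicitly.
    readings = {pv: caget(pv, timeout=1.0) for pv in MONITORED_PVS}
    return jsonify({pv: (val if val is not None else "disconnected")
                    for pv, val in readings.items()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```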

  5. Poetics of the Epic and Survivals of the Genre in Swaziland ...

    African Journals Online (AJOL)

    This paper seeks to suggest that the ongoing and robust tradition of bringing together excerpts from legends, praise poems, songs and genealogical recitations during national ceremonies and festivals such as Incwala, vouch for an erstwhile epic culture in Swaziland. International Journal of Humanistic Studies Vol.3 2004: ...

  6. Health-related quality of life using SF-8 and EPIC questionnaires after treatment with radical retropubic prostatectomy and permanent prostate brachytherapy

    International Nuclear Information System (INIS)

    Hashine, Katsuyoshi; Kusuhara, Yoshito; Miura, Noriyoshi; Shirato, Akitomi; Sumiyoshi, Yoshiteru; Kataoka, Masaaki

    2009-01-01

    The health-related quality of life (HRQOL) after treatment of prostate cancer is examined using a new HRQOL tool. HRQOL, based on the expanded prostate cancer index composite (EPIC) and SF-8 questionnaires, was prospectively compared after either a radical retropubic prostatectomy (RRP) or a permanent prostate brachytherapy (PPB) at a single institute. Between October 2005 and June 2007, 96 patients were treated by an RRP and 88 patients were treated by a PPB. A HRQOL survey was completed at baseline, and at 1, 3, 6 and 12 months after treatment, prospectively. The general HRQOL in the RRP and PPB groups was not different after 3 months. However, at baseline and 1 month after treatment, the mental component summary was significantly better in the PPB group than in the RRP group. Moreover, the disease-specific HRQOL was worse regarding urinary and sexual functions in the RRP group. Urinary irritative/obstructive was worse in the PPB group, but urinary incontinence was worse in the RRP group and had not recovered to baseline after 12 months. The bowel function and bother were worse in the PPB group than in the RRP group after 3 months. In the RRP group, the patients with nerve sparing demonstrated the same scores in sexual function as the PPB group. This prospective study revealed the differences in the HRQOL after an RRP and PPB. Disease-specific HRQOL is clarified by using EPIC survey. These results will be helpful for making treatment decisions. (author)

  7. Epic Dimensions: a Comparative Analysis of 3d Acquisition Methods

    Science.gov (United States)

    Graham, C. A.; Akoglu, K. G.; Lassen, A. W.; Simon, S.

    2017-08-01

    When it comes to capturing the geometry of a cultural heritage artifact, there is certainly no dearth of possible acquisition techniques. As technology has rapidly developed, the availability of intuitive 3D generating tools has increased exponentially and made it possible even for non-specialists to create many models quickly. Though the by-products of these different acquisition methods may be incongruent in terms of quality, these discrepancies are not problematic, as there are many applications of 3D models, each with their own set of requirements. Comparisons of high-resolution 3D models of an iconic Babylonian tablet, captured via the four different close-range technologies discussed in this paper, assess which methods of 3D digitization best suit specific intended purposes related to research, conservation and education. Taking into consideration repeatability, time and resource implications, qualitative and quantitative potential, and ease of use, this paper presents a study of the strengths and weaknesses of structured light scanning, triangulation laser scanning, photometric stereo and close-range photogrammetry, in the context of interactive investigation, condition monitoring, engagement, and dissemination.

  8. Road salt application planning tool for winter de-icing operations

    Science.gov (United States)

    Trenouth, William R.; Gharabaghi, Bahram; Perera, Nandana

    2015-05-01

    Road authorities, who are charged with the task of maintaining safe, driveable road conditions during severe winter storm events are coming under increasing pressure to protect salt vulnerable areas (SVAs). For the purpose of modelling urban winter hydrology, the temperature index method was modified to incorporate ploughing and salting considerations and was calibrated using winter field data from two sites in Southern Ontario and validated using data collected from a section of Highway 401 - Canada's busiest highway. The modified temperature index model (MTIM) accurately predicted salt-induced melt (R2 = 0.98 and 0.99, RMSE = 19.9 and 282.4 m3, CRM = -0.003 and 0.006 for calibration and validation sites respectively), and showed a demonstrable ability to calculate the Bare Pavement Regain Time (BPRT). The BPRT is a key factor on road safety and the basis for many winter maintenance performance standards for different classes of highways. Optimizing salt application rate scenarios can be achieved using the MTIM with only two meteorological forecast inputs for the storm event - readily available on-line through the Road Weather Information System (RWIS) - and can serve as a simple yet effective tool for winter road maintenance practitioners seeking to optimize salt application rates for a given storm event in salt vulnerable areas.
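
    For orientation only, the sketch below shows the classical degree-hour temperature-index relation that such melt models extend; the melt coefficient, the salt-induced freezing-point-depression term, and the input values are assumed placeholders, not the calibrated MTIM of this study.

```python
# Illustrative temperature-index melt sketch (NOT the calibrated MTIM).
# Coefficients and the salt freezing-point-depression term are assumptions.
def hourly_melt_mm(air_temp_c, melt_factor=0.2, base_temp_c=0.0):
    """Classical degree-hour melt: M = Cm * (Ta - Tb), zero below the base temperature."""
    return max(0.0, melt_factor * (air_temp_c - base_temp_c))

def salted_melt_mm(air_temp_c, salt_depression_c, melt_factor=0.2):
    """Salt lowers the effective freezing point, extending melt to sub-zero air temperatures."""
    return hourly_melt_mm(air_temp_c, melt_factor, base_temp_c=-salt_depression_c)

if __name__ == "__main__":
    temps = [-4.0, -2.0, -1.0, 0.5, 2.0]  # example hourly air temperatures (deg C)
    for t in temps:
        print(f"T={t:+.1f} C  bare: {hourly_melt_mm(t):.2f} mm  salted: {salted_melt_mm(t, 3.0):.2f} mm")
```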

  9. Nutrient patterns and their food sources in an International Study Setting: report from the EPIC study.

    Science.gov (United States)

    Moskal, Aurelie; Pisa, Pedro T; Ferrari, Pietro; Byrnes, Graham; Freisling, Heinz; Boutron-Ruault, Marie-Christine; Cadeau, Claire; Nailler, Laura; Wendt, Andrea; Kühn, Tilman; Boeing, Heiner; Buijsse, Brian; Tjønneland, Anne; Halkjær, Jytte; Dahm, Christina C; Chiuve, Stephanie E; Quirós, Jose R; Buckland, Genevieve; Molina-Montes, Esther; Amiano, Pilar; Huerta Castaño, José M; Gurrea, Aurelio Barricarte; Khaw, Kay-Tee; Lentjes, Marleen A; Key, Timothy J; Romaguera, Dora; Vergnaud, Anne-Claire; Trichopoulou, Antonia; Bamia, Christina; Orfanos, Philippos; Palli, Domenico; Pala, Valeria; Tumino, Rosario; Sacerdote, Carlotta; de Magistris, Maria Santucci; Bueno-de-Mesquita, H Bas; Ocké, Marga C; Beulens, Joline W J; Ericson, Ulrika; Drake, Isabel; Nilsson, Lena M; Winkvist, Anna; Weiderpass, Elisabete; Hjartåker, Anette; Riboli, Elio; Slimani, Nadia

    2014-01-01

    Compared to food patterns, nutrient patterns have been rarely used, particularly at the international level. We studied, in the context of a multi-center study with heterogeneous data, the methodological challenges regarding pattern analyses. We identified nutrient patterns from food frequency questionnaires (FFQ) in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study and used 24-hour dietary recall (24-HDR) data to validate and describe the nutrient patterns and their related food sources. Associations between lifestyle factors and the nutrient patterns were also examined. Principal component analysis (PCA) was applied on 23 nutrients derived from country-specific FFQ combining data from all EPIC centers (N = 477,312). Harmonized 24-HDRs available for a representative sample of the EPIC populations (N = 34,436) provided accurate mean group estimates of nutrients and foods by quintiles of pattern scores, presented graphically. An overall PCA combining all data captured a good proportion of the variance explained in each EPIC center. Four nutrient patterns were identified explaining 67% of the total variance: principal component (PC) 1 was characterized by a high contribution of nutrients from plant food sources and a low contribution of nutrients from animal food sources; PC2 by a high contribution of micro-nutrients and proteins; PC3 was characterized by polyunsaturated fatty acids and vitamin D; PC4 was characterized by calcium, proteins, riboflavin, and phosphorus. The nutrients with high loadings on a particular pattern as derived from country-specific FFQ also showed high deviations in their mean EPIC intakes by quintiles of pattern scores when estimated from 24-HDR. Center and energy intake explained most of the variability in pattern scores. The use of 24-HDR enabled internal validation and facilitated the interpretation of the nutrient patterns derived from FFQs in terms of food sources. These outcomes open research opportunities and
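
    A minimal sketch of the principal component step described above, using scikit-learn on a synthetic intake matrix (the 23 real nutrient variables and the EPIC data are of course not reproduced here; the matrix is a random placeholder):

```python
# PCA sketch on a synthetic "participants x nutrients" intake matrix.
# The data are random placeholders; only the analysis pattern mirrors the text.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
intakes = rng.lognormal(mean=1.0, sigma=0.4, size=(1000, 23))  # 1000 subjects, 23 nutrients

# Standardize so each nutrient contributes on the same scale, then extract 4 patterns.
z = StandardScaler().fit_transform(intakes)
pca = PCA(n_components=4)
pattern_scores = pca.fit_transform(z)   # per-subject scores on each nutrient pattern

print("variance explained by PC1-PC4:", pca.explained_variance_ratio_.round(3))
print("PC1 loadings on the first 5 nutrients:", pca.components_[0, :5].round(3))
```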

  10. Nutrient patterns and their food sources in an International Study Setting: report from the EPIC study.

    Directory of Open Access Journals (Sweden)

    Aurelie Moskal

    Full Text Available Compared to food patterns, nutrient patterns have been rarely used, particularly at the international level. We studied, in the context of a multi-center study with heterogeneous data, the methodological challenges regarding pattern analyses. We identified nutrient patterns from food frequency questionnaires (FFQ) in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study and used 24-hour dietary recall (24-HDR) data to validate and describe the nutrient patterns and their related food sources. Associations between lifestyle factors and the nutrient patterns were also examined. Principal component analysis (PCA) was applied on 23 nutrients derived from country-specific FFQ combining data from all EPIC centers (N = 477,312). Harmonized 24-HDRs available for a representative sample of the EPIC populations (N = 34,436) provided accurate mean group estimates of nutrients and foods by quintiles of pattern scores, presented graphically. An overall PCA combining all data captured a good proportion of the variance explained in each EPIC center. Four nutrient patterns were identified explaining 67% of the total variance: principal component (PC) 1 was characterized by a high contribution of nutrients from plant food sources and a low contribution of nutrients from animal food sources; PC2 by a high contribution of micro-nutrients and proteins; PC3 was characterized by polyunsaturated fatty acids and vitamin D; PC4 was characterized by calcium, proteins, riboflavin, and phosphorus. The nutrients with high loadings on a particular pattern as derived from country-specific FFQ also showed high deviations in their mean EPIC intakes by quintiles of pattern scores when estimated from 24-HDR. Center and energy intake explained most of the variability in pattern scores. The use of 24-HDR enabled internal validation and facilitated the interpretation of the nutrient patterns derived from FFQs in terms of food sources. These outcomes open research

  11. ATCA Shelf Manager EPICS device support for ITER CODAC Core System

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Bruno, E-mail: bsantos@ipfn.ist.utl.pt [Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico, Universidade de Lisboa, 1049-001 Lisboa (Portugal); Carvalho, Paulo F.; Rodrigues, A.P.; Carvalho, Bernardo B.; Sousa, Jorge; Batista, António J.N.; Correia, Miguel; Combo, Álvaro M.; Cruz, Nuno [Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico, Universidade de Lisboa, 1049-001 Lisboa (Portugal); Correia, Carlos M.B.A. [Centro de Instrumentação, Departamento de Física, Universidade de Coimbra, 3004-516 Coimbra (Portugal); Gonçalves, Bruno [Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico, Universidade de Lisboa, 1049-001 Lisboa (Portugal)

    2015-10-15

    Highlights: • This architecture targets the integration of health management into the NDS. • The developed solution supports the ShM redundancy features specified by ATCA. • The average RTT was around 59 ms and in 99.9% of cases was less than 130 ms. • Without losing any update cycle, it can monitor a system shelf with approximately 400 sensors. • The solution enables the user to configure the entire system in DB files and the st.cmd. - Abstract: The ITER CODAC Core System (CCS) is responsible for supervising and monitoring plant Instrumentation and Control (I&C). This system uses the Experimental Physics and Industrial Control System (EPICS) Channel Access (CA) protocol as the interface with the Plant Operation Network (PON). This paper presents a generic EPICS device support developed for the integration of the ATCA Shelf Manager (ShM) into the ITER CCS, providing scalability and easy configuration. The device support uses the HTTP interface available on the Shelf Manager as its communication layer. Both the HTTP server and the sensor/actuator definitions can be configured in the EPICS database file and the Input/Output Controller (IOC) initialization file. A proposal based on this device support is also presented, targeting the Nominal Device Support (NDS) for health management. The EPICS device support running in an IOC provides Process Variables (PVs) carrying the system information to the PON network, and these PVs can be used by all CA clients, such as EPICS user interface clients, alarm systems and archive systems. Operation with redundant ATCA ShMs and device support scalability tests were performed and the results are presented.
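
    As a hedged illustration of the communication layer described above (polling the shelf manager's HTTP interface and measuring the round-trip time), the sketch below uses a hypothetical sensor URL and JSON layout; the real ShM HTTP API, the EPICS device support code, and the ITER PV names are not reproduced.

```python
# Sketch: poll an ATCA shelf-manager-style HTTP endpoint and measure round-trip time.
# The URL, its JSON layout, and the polling period are hypothetical assumptions.
import time
import requests

SHM_URL = "http://shelf-manager.example/api/sensors"   # hypothetical endpoint
PERIOD_S = 0.5

def poll_once():
    t0 = time.perf_counter()
    resp = requests.get(SHM_URL, timeout=2.0)
    rtt_ms = (time.perf_counter() - t0) * 1000.0
    resp.raise_for_status()
    return rtt_ms, resp.json()   # assume the endpoint returns a JSON sensor list

if __name__ == "__main__":
    for _ in range(10):
        try:
            rtt, sensors = poll_once()
            print(f"RTT {rtt:6.1f} ms, {len(sensors)} sensors read")
        except requests.RequestException as err:
            print("poll failed:", err)
        time.sleep(PERIOD_S)
```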

  12. E-learning tools for education: regulatory aspects, current applications in radiology and future prospects.

    Science.gov (United States)

    Pinto, A; Selvaggi, S; Sicignano, G; Vollono, E; Iervolino, L; Amato, F; Molinari, A; Grassi, R

    2008-02-01

    E-learning, an abbreviation of electronic learning, indicates the provision of education and training on the Internet or the World Wide Web. The impact of networks and the Internet on radiology is undoubtedly important, as it is for medicine as a whole. The Internet offers numerous advantages compared with other mass media: it provides access to a large amount of information previously known only to individual specialists; it is flexible, permitting the use of images or video; and it allows linking to Web sites on a specific subject, thus contributing to further expand knowledge. Our purpose is to illustrate the regulatory aspects (including Internet copyright laws), current radiological applications and future prospects of e-learning. Our experience with the installation of an e-learning platform is also presented. We performed a PubMed search on the published literature (without time limits) dealing with e-learning tools and applications in the health sector with specific reference to radiology. The search included all study types in the English language with the following key words: e-learning, education, teaching, online exam, radiology and radiologists. The Fiaso study was referred to for the regulatory aspects of e-learning. The application of e-learning to radiology requires the development of a model that involves selecting and creating e-learning platforms, creating and technologically adapting multimedia teaching modules, creating and managing a unified catalogue of teaching modules, planning training actions, defining training pathways and Continuing Education in Medicine (CME) credits, identifying levels of teaching and technological complexity of support tools, sharing an organisational and methodological model, training the trainers, operators' participation and relational devices, providing training, monitoring progress of the activities, and measuring the effectiveness of training. Since 2004, a platform--LiveLearning--has been used at our

  13. An Intracranial Electroencephalography (iEEG) Brain Function Mapping Tool with an Application to Epilepsy Surgery Evaluation

    Directory of Open Access Journals (Sweden)

    Yinghua eWang

    2016-04-01

    Full Text Available Object: Before epilepsy surgeries, intracranial electroencephalography (iEEG) is often employed in function mapping and epileptogenic foci localization. Although the implanted electrodes provide crucial information for epileptogenic zone resection, a convenient clinical tool for electrode position registration and brain function mapping visualization is still lacking. In this study, we developed a Brain Function Mapping (BFM) Tool, which facilitates electrode position registration and brain function mapping visualization, with an application to epilepsy surgeries. Methods: The BFM Tool mainly utilizes electrode location registration and function mapping based on pre-defined brain models from other software. In addition, the electrode node and mapping properties, such as node size/color, edge color/thickness, and mapping method, can be adjusted easily using the settings panel. Moreover, users may manually import/export location and connectivity data to generate figures for further applications. The role of this software is demonstrated by a clinical study of language area localization. Results: The BFM Tool helps clinical doctors and researchers visualize implanted electrodes and brain functions in an easy, quick and flexible manner. Conclusions: Our tool provides convenient electrode registration and easy brain function visualization, and has good performance. It is clinically oriented and is easy to deploy and use. The BFM Tool is suitable for epilepsy and other clinical iEEG applications.

  14. Multicomponent Musculoskeletal Movement Assessment Tools: A Systematic Review and Critical Appraisal of Their Development and Applicability to Professional Practice.

    Science.gov (United States)

    Bennett, Hunter; Davison, Kade; Arnold, John; Slattery, Flynn; Martin, Max; Norton, Kevin

    2017-10-01

    Multicomponent movement assessment tools have become commonplace to measure movement quality, proposing to indicate injury risk and performance capabilities. Despite popular use, there has been no attempt to compare the components of each tool reported in the literature, the processes in which they were developed, or the underpinning rationale for their included content. As such, the objective of this systematic review was to provide a comprehensive summary of current movement assessment tools and appraise the evidence supporting their development. A systematic literature search was performed using PRISMA guidelines to identify multicomponent movement assessment tools. Commonalities between tools and the evidence provided to support the content of each tool was identified. Each tool underwent critical appraisal to identify the rigor in which it was developed, and its applicability to professional practice. Eleven tools were identified, of which 5 provided evidence to support their content as assessments of movement quality. One assessment tool (Soccer Injury Movement Screen [SIMS]) received an overall score of above 65% on critical appraisal, with a further 2 tools (Movement Competency Screen [MCS] and modified 4 movement screen [M4-MS]) scoring above 60%. Only the MCS provided clear justification for its developmental process. The remaining 8 tools scored between 40 and 60%. On appraisal, the MCS, M4-MS, and SIMS seem to provide the most practical value for assessing movement quality as they provide the strongest reports of developmental rigor and an identifiable evidence base. In addition, considering the evidence provided, these tools may have the strongest potential for identifying performance capabilities and guiding exercise prescription in athletic and sport-specific populations.

  15. The balanced scorecard as a strategic management tool: its application in the regional public health system in Campania.

    Science.gov (United States)

    Impagliazzo, Cira; Ippolito, Adelaide; Zoccoli, Paola

    2009-01-01

    Health, as a primary and advanced need, can only be guaranteed through the appropriate management of dedicated resources. As in any situation where funds are limited, it is vital to have logical frameworks and tools to set up structures capable of making a complex system like the health service work. Only through an appropriate and competent activity of governance can such structures be identified, organized, and rendered operational. This can be achieved by using ad hoc tools such as the Balanced Scorecard. Its application in the case of the Regional Government of Campania indicates that it is a valid tool in all circumstances except in situations of crisis.

  16. Selection and application of familiar and novel tools in patients with left and right hemispheric stroke: Psychometrics and normative data.

    Science.gov (United States)

    Buchmann, Ilka; Randerath, Jennifer

    2017-09-01

    Frequently left brain damage (LBD) leads to limb apraxia, a disorder that can affect tool-use. Despite its impact on daily life, classical tests examining the pantomime of tool-use and imitation of gestures are seldom applied in clinical practice. The study's aim was to present a diagnostic approach which appears more strongly related to actions in daily life in order to sensitize applicants and patients about the relevance of the disorder before patients are discharged. Two tests were introduced that evaluate actual tool selection and tool-object-application: the Novel Tools (NTT) and the Familiar Tools (FTT) Test (parts of the DILA-S: Diagnostic Instrument for Limb Apraxia - Short Version). Normative data in healthy subjects (N = 82) was collected. Then the tests were applied in stroke patients with unilateral left brain damage (LBD: N = 33), a control right brain damage group (RBD: N = 20) as well as healthy age and gender matched controls (CL: N = 28, and CR, N = 18). The tests showed appropriate interrater-reliability and internal consistency as well as concurrent and divergent validity. To examine criterion validity based on the well-known left lateralization of limb apraxia, group comparisons were run. As expected, the LBD group demonstrated a high prevalence of tool-use apraxia (NTT: 36.4%, FTT: 48.5%) ranging from mild to severe impairment and scored worse than their control group (CL). A few RBD patients did demonstrate impairments in tool-use (NTT: 15%, FTT: 15%). On a group level they did not differ from their healthy controls (CR). Further, it was demonstrated that the selection and application of familiar and novel tools can be impaired selectively. Our study results suggest that real tool-use tests evaluating tool selection and tool application should be considered for standard diagnosis of limb apraxia in left as well as right brain damaged patients. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  17. EPIC Forest LAI Dataset: LAI estimates generated from the USDA Environmental Policy Impact Climate (EPIC) model (a widely used, field-scale, biogeochemical model) on four forest complexes spanning three physiographic provinces in VA and NC.

    Data.gov (United States)

    U.S. Environmental Protection Agency — This data depicts calculated and validated LAI estimates generated from the USDA Environmental Policy Impact Climate (EPIC) model (a widely used, field-scale,...

  18. Pseudo-random tool paths for CNC sub-aperture polishing and other applications.

    Science.gov (United States)

    Dunn, Christina R; Walker, David D

    2008-11-10

    In this paper we first contrast classical and CNC polishing techniques in regard to the repetitiveness of the machine motions. We then present a pseudo-random tool path for use with CNC sub-aperture polishing techniques and report polishing results from equivalent random and raster tool-paths. The random tool-path used - the unicursal random tool-path - employs a random seed to generate a pattern which never crosses itself. Because of this property, this tool-path is directly compatible with dwell time maps for corrective polishing. The tool-path can be used to polish any continuous area of any boundary shape, including surfaces with interior perforations.

  19. LCFM - LIVING COLOR FRAME MAKER: PC GRAPHICS GENERATION AND MANAGEMENT TOOL FOR REAL-TIME APPLICATIONS

    Science.gov (United States)

    Truong, L. V.

    1994-01-01

    Computer graphics are often applied for better understanding and interpretation of data under observation. These graphics become more complicated when animation is required during "run-time", as found in many typical modern artificial intelligence and expert systems. Living Color Frame Maker is a solution to many of these real-time graphics problems. Living Color Frame Maker (LCFM) is a graphics generation and management tool for IBM or IBM compatible personal computers. To eliminate graphics programming, the graphic designer can use LCFM to generate computer graphics frames. The graphical frames are then saved as text files, in a readable and disclosed format, which can be easily accessed and manipulated by user programs for a wide range of "real-time" visual information applications. For example, LCFM can be implemented in a frame-based expert system for visual aids in management of systems. For monitoring, diagnosis, and/or controlling purposes, circuit or systems diagrams can be brought to "life" by using designated video colors and intensities to symbolize the status of hardware components (via real-time feedback from sensors). Thus status of the system itself can be displayed. The Living Color Frame Maker is user friendly with graphical interfaces, and provides on-line help instructions. All options are executed using mouse commands and are displayed on a single menu for fast and easy operation. LCFM is written in C++ using the Borland C++ 2.0 compiler for IBM PC series computers and compatible computers running MS-DOS. The program requires a mouse and an EGA/VGA display. A minimum of 77K of RAM is also required for execution. The documentation is provided in electronic form on the distribution medium in WordPerfect format. A sample MS-DOS executable is provided on the distribution medium. The standard distribution medium for this program is one 5.25 inch 360K MS-DOS format diskette. The contents of the diskette are compressed using the PKWARE archiving tools

  20. STAR- A SIMPLE TOOL FOR AUTOMATED REASONING SUPPORTING HYBRID APPLICATIONS OF ARTIFICIAL INTELLIGENCE (UNIX VERSION)

    Science.gov (United States)

    Borchardt, G. C.

    1994-01-01

    The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input

  1. STAR- A SIMPLE TOOL FOR AUTOMATED REASONING SUPPORTING HYBRID APPLICATIONS OF ARTIFICIAL INTELLIGENCE (DEC VAX VERSION)

    Science.gov (United States)

    Borchardt, G. C.

    1994-01-01

    The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input

  2. MODBUS APPLICATION AT JEFFERSON LAB

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Jianxun [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Seaton, Chad [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Philip, Sarin [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2018-02-01

    Modbus is a client/server communication model. In our applications, the embedded Ethernet device XPort is designed as the server and a SoftIOC running EPICS Modbus is the client. The SoftIOC builds a Modbus request from parameters contained in a demand sent by the EPICS application to the Modbus client interface. On reception of the Modbus request, the Modbus server activates a local action to read, write, or perform some other operation. The main Modbus server functions are therefore to wait for a Modbus request on TCP port 502, process the request, and then build a Modbus response.
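
    A minimal, self-contained sketch of the client side of this exchange, building a Modbus/TCP "read holding registers" request over port 502 and parsing the response, is shown below; the server address, unit id, and register block are assumed placeholders, not the XPort configuration used at Jefferson Lab.

```python
# Minimal Modbus/TCP client sketch: function code 0x03 (read holding registers).
# Server IP, unit id and register addresses are assumed placeholders.
import socket
import struct

def read_holding_registers(host, start_addr, count, unit_id=1, port=502):
    # MBAP header: transaction id, protocol id (0), length, unit id; then the PDU.
    request = struct.pack(">HHHBBHH", 1, 0, 6, unit_id, 0x03, start_addr, count)
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(request)
        header = sock.recv(9)                 # MBAP (7 bytes) + function code + byte count
        _, _, _, _, func, nbytes = struct.unpack(">HHHBBB", header)
        if func & 0x80:
            raise IOError("Modbus exception response")
        data = sock.recv(nbytes)              # register payload (sketch: single recv)
        return list(struct.unpack(f">{count}H", data))

if __name__ == "__main__":
    # Hypothetical XPort-style server and register block.
    print(read_holding_registers("192.168.1.50", start_addr=0, count=4))
```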

  3. A Pragmatic and Sociolinguistic Account of δαιμόνιε in Early Greek Epic

    Directory of Open Access Journals (Sweden)

    H. Paul Brown

    2011-11-01

    Full Text Available Consideration of the social standing and emotional relation of speaker to addressee in epic shows that the address daimonie implies negative disposition and surprise, not the friendship term that would emerge later.

  4. Advanced Data Acquisition System Implementation for the ITER Neutron Diagnostic Use Case Using EPICS and FlexRIO Technology on a PXIe Platform

    Science.gov (United States)

    Sanz, D.; Ruiz, M.; Castro, R.; Vega, J.; Afif, M.; Monroe, M.; Simrock, S.; Debelle, T.; Marawar, R.; Glass, B.

    2016-04-01

    To aid in assessing the functional performance of ITER, Fission Chambers (FC) based on the neutron diagnostic use case deliver timestamped measurements of neutron source strength and fusion power. To demonstrate the Plant System Instrumentation & Control (I&C) required for such a system, ITER Organization (IO) has developed a neutron diagnostics use case that fully complies with guidelines presented in the Plant Control Design Handbook (PCDH). The implementation presented in this paper has been developed on the PXI Express (PXIe) platform using products from the ITER catalog of standard I&C hardware for fast controllers. Using FlexRIO technology, detector signals are acquired at 125 MS/s, while filtering, decimation, and three methods of neutron counting are performed in real-time via the onboard Field Programmable Gate Array (FPGA). Measurement results are reported every 1 ms through Experimental Physics and Industrial Control System (EPICS) Channel Access (CA), with real-time timestamps derived from the ITER Timing Communication Network (TCN) based on IEEE 1588-2008. Furthermore, in accordance with ITER specifications for CODAC Core System (CCS) application development, the software responsible for the management, configuration, and monitoring of system devices has been developed in compliance with a new EPICS module called Nominal Device Support (NDS) and RIO/FlexRIO design methodology.
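
    For illustration of the counting stage only (not the FPGA implementation described in the paper), the sketch below applies simple threshold-crossing pulse counting to 1 ms blocks of a waveform sampled at 125 MS/s; the threshold and the synthetic data are assumptions.

```python
# Threshold-crossing neutron pulse counting over 1 ms reporting intervals.
# Sampling rate matches the 125 MS/s in the text; threshold and data are assumptions.
import numpy as np

FS = 125_000_000          # samples per second
BLOCK = FS // 1000        # samples per 1 ms reporting interval

def count_pulses(block, threshold=0.5):
    """Count rising edges that cross the threshold within one block."""
    above = block > threshold
    rising = np.logical_and(above[1:], ~above[:-1])
    return int(rising.sum())

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic 1 ms block: baseline noise plus a handful of short pulses.
    samples = rng.normal(0.0, 0.05, BLOCK)
    for start in rng.integers(0, BLOCK - 50, size=7):
        samples[start:start + 50] += 1.0
    print("counts in this 1 ms interval:", count_pulses(samples))
```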

  5. Manufacture of functional surfaces through combined application of tool manufacturing processes and Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Eriksen, Rasmus Solmer; Arentoft, Mogens; Grønbæk, J.

    2012-01-01

    The tool surface topography is often a key parameter in the tribological performance of modern metal forming tools. A new generation of multifunctional surfaces is achieved by combination of conventional tool manufacturing processes with a novel Robot Assisted Polishing process. This novel surface...

  6. Computational protein design-the next generation tool to expand synthetic biology applications.

    Science.gov (United States)

    Gainza-Cirauqui, Pablo; Correia, Bruno Emanuel

    2018-05-02

    One powerful approach to engineer synthetic biology pathways is the assembly of proteins sourced from one or more natural organisms. However, synthetic pathways often require custom functions or biophysical properties not displayed by natural proteins, limitations that could be overcome through modern protein engineering techniques. Structure-based computational protein design is a powerful tool to engineer new functional capabilities in proteins, and it is beginning to have a profound impact in synthetic biology. Here, we review efforts to increase the capabilities of synthetic biology using computational protein design. We focus primarily on computationally designed proteins not only validated in vitro, but also shown to modulate different activities in living cells. Efforts made to validate computational designs in cells can illustrate both the challenges and opportunities in the intersection of protein design and synthetic biology. We also highlight protein design approaches, which although not validated as conveyors of new cellular function in situ, may have rapid and innovative applications in synthetic biology. We foresee that in the near-future, computational protein design will vastly expand the functional capabilities of synthetic cells. Copyright © 2018. Published by Elsevier Ltd.

  7. Compliance monitoring in business processes: Functionalities, application, and tool-support.

    Science.gov (United States)

    Ly, Linh Thao; Maggi, Fabrizio Maria; Montali, Marco; Rinderle-Ma, Stefanie; van der Aalst, Wil M P

    2015-12-01

    In recent years, monitoring the compliance of business processes with relevant regulations, constraints, and rules during runtime has evolved as major concern in literature and practice. Monitoring not only refers to continuously observing possible compliance violations, but also includes the ability to provide fine-grained feedback and to predict possible compliance violations in the future. The body of literature on business process compliance is large and approaches specifically addressing process monitoring are hard to identify. Moreover, proper means for the systematic comparison of these approaches are missing. Hence, it is unclear which approaches are suitable for particular scenarios. The goal of this paper is to define a framework for Compliance Monitoring Functionalities (CMF) that enables the systematic comparison of existing and new approaches for monitoring compliance rules over business processes during runtime. To define the scope of the framework, at first, related areas are identified and discussed. The CMFs are harvested based on a systematic literature review and five selected case studies. The appropriateness of the selection of CMFs is demonstrated in two ways: (a) a systematic comparison with pattern-based compliance approaches and (b) a classification of existing compliance monitoring approaches using the CMFs. Moreover, the application of the CMFs is showcased using three existing tools that are applied to two realistic data sets. Overall, the CMF framework provides powerful means to position existing and future compliance monitoring approaches.

  8. SINDBAD: a realistic multi-purpose and scalable X-ray simulation tool for NDT applications

    International Nuclear Information System (INIS)

    Tabary, J.; Hugonnard, P.; Mathy, F.

    2007-01-01

    The X-ray radiographic simulation software SINDBAD has been developed to support the design stage of radiographic systems and to evaluate the efficiency of image processing techniques, in both the medical imaging and Non-Destructive Evaluation (NDE) industrial fields. The software can model any radiographic set-up, including the X-ray source, the beam interaction inside the object represented by its Computer Aided Design (CAD) model, and the imaging process in the detector. For each step of the virtual experimental bench, SINDBAD combines different modelling modules, accessed via Graphical User Interfaces (GUI), to provide realistic synthetic images. In this paper, we present an overview of all the functionalities available in SINDBAD, with a complete description of all the physics taken into account in the models, as well as the CAD and GUI facilities available on many computing platforms. We highlight the different modules usable for different applications, which make SINDBAD a multi-purpose and scalable X-ray simulation tool. (authors)
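
    To give a sense of the core physics such a radiographic simulator models (this is not SINDBAD code), the sketch below applies the Beer-Lambert attenuation law to a simple two-layer object; the attenuation coefficients and thicknesses are illustrative assumptions.

```python
# Beer-Lambert attenuation sketch for a simple two-layer object (NOT SINDBAD itself).
# Linear attenuation coefficients and thicknesses are illustrative assumptions.
import math

def transmitted_intensity(i0, layers):
    """layers: list of (mu_per_cm, thickness_cm); returns intensity behind the object."""
    return i0 * math.exp(-sum(mu * t for mu, t in layers))

if __name__ == "__main__":
    i0 = 1.0e6  # incident photon count (arbitrary units)
    object_layers = [(0.2, 2.0),   # e.g. 2 cm of a light material
                     (1.1, 0.5)]   # e.g. 0.5 cm of a denser material
    print("transmitted:", round(transmitted_intensity(i0, object_layers), 1))
```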

  9. CSAU (code scaling, applicability and uncertainty), a tool to prioritize advanced reactor research

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1990-01-01

    Best Estimate computer codes have been accepted by the US Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, providing their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. At the process level, the method is generic to any application which relies on best estimate computer code simulations to determine safe operating margins. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. Applied early, during the period when alternate designs are being evaluated, the methodology can identify the relative importance of the sources of uncertainty in the knowledge of each plant behavior and, thereby, help prioritize the research needed to bring the new designs to fruition. This paper describes the CSAU methodology, at the generic process level, and provides the general principles whereby it may be applied to evaluations of advanced reactor designs. 9 refs., 1 fig., 1 tab

  10. Design and application of a formal verification tool for VHDL designs

    International Nuclear Information System (INIS)

    John, Ajith K.; Bhattacharjee, A.K.; Sharma, Mukesh; Ganesh, G.; Dhodapkar, S.D.; Biswas, B.B.

    2012-01-01

    The design of Control and Instrumentation (C and I) systems used in safety-critical applications such as nuclear power plants involves partitioning the overall system functionality into subparts and implementing each subpart in hardware and/or software as appropriate. With the increasing use of programmable devices like FPGAs, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since functional bugs in hardware subsystems used in safety-critical C and I systems have disastrous consequences, it is important to use rigorous reasoning to verify the functionality of the HDL models. This paper describes work on developing a software tool named VBMC (VHDL Bounded Model Checker) for mathematically proving functional properties of hardware designs described in VHDL. It is based on the principle of bounded model checking. Although the design of VBMC is still evolving, it is currently also being used for the functional verification of FPGA-based intelligent I/O (EHS) boards developed in the Reactor Control Division, BARC.
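
    To illustrate the bounded model checking principle behind VBMC (unrolling a transition relation to a finite depth k and searching for a property violation), the toy sketch below enumerates the unrolling explicitly over a small hand-written state machine; a real tool such as VBMC instead encodes the unrolled design and the property as a SAT/SMT problem, and the design here is invented for illustration only.

```python
# Toy bounded model checking sketch: explicit unrolling of a small transition
# system to depth k, searching for a violation of a safety property.
# The state machine and property are invented for illustration only.
from itertools import product

def step(state, enable):
    # A 2-bit counter with an "error" latch that is (deliberately) buggy.
    count, error = state
    new_count = (count + 1) % 4 if enable else count
    new_error = error or (new_count == 3 and enable)
    return (new_count, new_error)

def safety(state):
    return not state[1]   # property: the error latch must stay low

def bmc(initial, depth):
    """Return a counterexample input trace of length <= depth, or None."""
    for k in range(1, depth + 1):
        for inputs in product([False, True], repeat=k):
            state = initial
            for i, en in enumerate(inputs):
                state = step(state, en)
                if not safety(state):
                    return inputs[:i + 1]
    return None

if __name__ == "__main__":
    trace = bmc(initial=(0, False), depth=6)
    print("counterexample inputs:", trace)
```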

  11. Various MRS application tools for Alzheimer disease and mild cognitive impairment.

    Science.gov (United States)

    Gao, F; Barker, P B

    2014-06-01

    MR spectroscopy is a noninvasive technique that allows the detection of several naturally occurring compounds (metabolites) from well-defined regions of interest within the human brain. Alzheimer disease, a progressive neurodegenerative disorder, is the most common cause of dementia in the elderly. During the past 20 years, multiple studies have been performed on MR spectroscopy in patients with both mild cognitive impairment and Alzheimer disease. Generally, MR spectroscopy studies have found decreased N-acetylaspartate and increased myo-inositol in both patients with mild cognitive impairment and Alzheimer disease, with greater changes in Alzheimer disease than in mild cognitive impairment. This review summarizes the information content of proton brain MR spectroscopy and its related technical aspects, as well as applications of MR spectroscopy to mild cognitive impairment and Alzheimer disease. While MR spectroscopy may have some value in the differential diagnosis of dementias and assessing prognosis, more likely its role in the near future will be predominantly as a tool for monitoring disease response or progression in treatment trials. More work is needed to evaluate the role of MR spectroscopy as a biomarker in Alzheimer disease and its relationship to other imaging modalities. © 2014 by American Journal of Neuroradiology.

  12. Power-Production Diagnostic Tools for Low-Density Wind Farms with Applications to Wake Steering

    Science.gov (United States)

    Takle, E. S.; Herzmann, D.; Rajewski, D. A.; Lundquist, J. K.; Rhodes, M. E.

    2016-12-01

    Hansen (2011) provided guidelines for wind farm wake analysis with applications to "high density" wind farms (where average distance between turbines is less than ten times rotor diameter). For "low-density" (average distance greater than fifteen times rotor diameter) wind farms, or sections of wind farms we demonstrate simpler sorting and visualization tools that reveal wake interactions and opportunities for wind farm power prediction and wake steering. SCADA data from a segment of a large mid-continent wind farm, together with surface flux measurements and lidar data are subjected to analysis and visualization of wake interactions. A time-history animated visualization of a plan view of power level of individual turbines provides a quick analysis of wake interaction dynamics. Yaw-based sectoral histograms of enhancement/decline of wind speed and power from wind farm reference levels reveals angular width of wake interactions and identifies the turbine(s) responsible for the power reduction. Concurrent surface flux measurements within the wind farm allowed us to evaluate stability influence on wake loss. A one-season climatology is used to identify high-priority candidates for wake steering based on estimated power recovery. Typical clearing prices on the day-ahead market are used to estimate the added value of wake steering. Current research is exploring options for identifying candidate locations for wind farm "build-in" in existing low-density wind farms.
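
    A minimal sketch of the yaw-based sectoral analysis described above, assuming a SCADA table with per-turbine power and nacelle/wind direction (the column names, bin width, and data below are assumptions, not the wind farm dataset used in the study):

```python
# Sectoral wake-loss sketch: bin a turbine's power ratio (relative to a free-stream
# reference turbine) by wind direction sector. Column names and data are assumptions.
import numpy as np
import pandas as pd

def sector_power_ratio(df, sector_deg=10):
    df = df.copy()
    df["sector"] = (df["wind_dir_deg"] // sector_deg) * sector_deg
    df["ratio"] = df["power_waked_kw"] / df["power_reference_kw"]
    return df.groupby("sector")["ratio"].mean()

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    n = 5000
    wd = rng.uniform(0, 360, n)
    ref = rng.uniform(500, 2000, n)
    # Impose a synthetic wake deficit when the upwind turbine sits at ~270 degrees.
    deficit = np.where(np.abs(wd - 270) < 15, 0.6, 1.0)
    waked = ref * deficit * rng.normal(1.0, 0.05, n)
    data = pd.DataFrame({"wind_dir_deg": wd,
                         "power_reference_kw": ref,
                         "power_waked_kw": waked})
    print(sector_power_ratio(data).round(2))
```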

  13. OCT as a convenient tool to assess the quality and application of organotypic retinal samples

    Science.gov (United States)

    Gater, Rachel; Khoshnaw, Nicholas; Nguyen, Dan; El Haj, Alicia J.; Yang, Ying

    2016-03-01

    Eye diseases such as macular degeneration and glaucoma have profound consequences on the quality of human life. Without treatment, these diseases can lead to loss of sight. To develop better treatments for retinal diseases, including cell therapies and drug intervention, establishment of an efficient and reproducible 3D native retinal tissue system, enabled over a prolonged culture duration, will be valuable. The retina is a complex tissue, consisting of ten layers with a different density and cellular composition to each. Uniquely, as a light transmitting tissue, retinal refraction of light differs among the layers, forming a good basis to use optical coherence tomography (OCT) in assessing the layered structure of the retina and its change during the culture and treatments. In this study, we develop a new methodology to generate retinal organotypic tissues and compare two substrates: filter paper and collagen hydrogel, to culture the organotypic tissue. Freshly slaughtered pig eyes have been obtained for use in this study. The layered morphology of intact organotypic retinal tissue cultured on two different substrates has been examined by spectral domain OCT. The viability of the tissues has been examined by live/dead fluorescence dye kit to cross validate the OCT images. For the first time, it is demonstrated that the use of a collagen hydrogel supports the viability of retinal organotypic tissue, capable of prolonged culture up to 2 weeks. OCT is a convenient tool for appraising the quality and application of organotypic retinal samples and is important in the development of current organotypic models.

  14. Application of Infrared Thermography as a Diagnostic Tool of Knee Osteoarthritis

    Science.gov (United States)

    Arfaoui, Ahlem; Bouzid, Mohamed Amine; Pron, Hervé; Taiar, Redha; Polidori, Guillaume

    This paper aimed to study the feasibility of application of infrared thermography to detect osteoarthritis of the knee and to compare the distribution of skin temperature between participants with osteoarthritis and those without pathology. All tests were conducted at LACM (Laboratory of Mechanical Stresses Analysis) and the gymnasium of the University of Reims Champagne Ardennes. IR thermography was performed using an IR camera. Ten participants with knee osteoarthritis and 12 reference healthy participants without OA participated in this study. Questionnaires were also used. The participants with osteoarthritis of the knee were selected on clinical examination and a series of radiographs. The level of pain was recorded by using a simple verbal scale (0-4). Infrared thermography reveals relevant disease by highlighting asymmetrical behavior in thermal color maps of both knees. Moreover, a linear evolution of skin temperature in the knee area versus time has been found whatever the participant group is in the first stage following a given effort. Results clearly show that the temperature can be regarded as a key parameter for evaluating pain. Thermal images of the knee were taken with an infrared camera. The study shows that with the advantage of being noninvasive and easily repeatable, IRT appears to be a useful tool to detect quantifiable patterns of surface temperatures and predict the singular thermal behavior of this pathology. It also seems that this non-intrusive technique enables to detect the early clinical manifestations of knee OA.

  15. Potential substitution of mineral P fertilizer by manure: EPIC development and implementation

    Science.gov (United States)

    Azevedo, Ligia B.; Vadas, Peter A.; Balkovič, Juraj; Skalský, Rastislav; Folberth, Christian; van der Velde, Marijn; Obersteiner, Michael

    2016-04-01

    Sources of mineral phosphorus (P) fertilizers are non-renewable. Although the longevity of P mines and the risk of future P depletion are highly debated, P scarcity may be detrimental to agriculture in various ways. Some of these impacts include increasing food insecurity and nitrogen (N) and P imbalances, serious fluctuations in global fertilizer and crop market prices, and contributions to geopolitical conflicts. P-rich waste produced from livestock production activities (i.e. manure) is an alternative to mineral P fertilizer. The substitution of mineral fertilizer with manure (1) delays the depletion of phosphate rock stocks, (2) reduces the vulnerability of P fertilizer importing countries to sudden changes in the fertilizer market, (3) reduces the chances of geopolitical conflicts arising from P exploitation pressures, (4) avoids the need for environmental protection policies in livestock systems, (5) is an opportunity for boosting crop yields in low nutrient input agricultural systems, and (6) contributes to the inflow of not only P but also other essential nutrients to agricultural soils. The Environmental Policy Integrated Climate model (EPIC) is a widely used, process-based crop model integrating the various environmental flows relevant to crop production as well as environmental quality assessments. We simulate crop yields using a powerful computer cluster infrastructure (known as EPIC-IIASA) in combination with spatially explicit EPIC input data on climate, management, soils, and landscape. EPIC-IIASA contains over 131,000 simulation units and has a 5 arc-min resolution. In this work, we implement two process-based models of manure biogeochemistry into EPIC-IIASA, i.e. SurPhos (for P) and Manure DNDC (for N and carbon), and a fate model describing nutrient outflows from fertilizer via runoff. For EGU, we will use EPIC-IIASA to quantify the potential of mineral P fertilizer substitution with manure. Specifically, we will estimate the relative

  16. A Framework for the Application of Robust Design Methods and Tools

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Howard, Thomas J.

    2014-01-01

    can deliver are not always clear. Expectations of the output are sometimes misleading and imply incorrect utilization of tools. A categorization of the tools, methods and techniques typically associated with robust design methodology in the literature is provided in this paper in terms of purpose...... and deliverables of the individual tool or method. The majority of tools aim at optimizing an existing design solution or indicating how robust a design is, which requires a somewhat settled design. Furthermore, the categorization presented in this paper shows a lack in the methodology for tools...... of the existing tools. When to apply which tool or method, and for which purpose, can be concluded. The paper also contributes a framework for researchers to derive a generic landscape or database for RDM built upon the main premises and deliverables of each method....

  17. Short Tools to Assess Young Children's Dietary Intake: A Systematic Review Focusing on Application to Dietary Index Research

    Directory of Open Access Journals (Sweden)

    Lucinda K. Bell

    2013-01-01

    Dietary indices evaluate diet quality, usually based on current dietary guidelines. Indices can therefore contribute to our understanding of early-life obesity-risk dietary behaviours. Yet indices are commonly applied to dietary data collected by onerous methods (e.g., recalls or records). Short dietary assessment instruments are an attractive alternative for collecting data from which to derive an index score. A systematic review of studies published before April 2013 was conducted to identify short (≤50 items) tools that measure whole-of-diet intake of young children (birth to five years) and are applicable to dietary indices, in particular for screening obesogenic dietary behaviours. The search identified 3686 papers, of which 16, reporting on 15 tools (n=7 for infants and toddlers, birth-24 months; n=8 for preschoolers, 2–5 years), met the inclusion criteria. Most tools were food frequency questionnaires (n=14), with one innovative dietary questionnaire identified. Seven were tested for validity or reliability, and one was tested for both. Six tools (n=2, infants and toddlers; n=4, preschoolers) are applicable for use with current dietary indices, five of which screen obesogenic dietary behaviours. Given the limited number of brief, valid and reliable dietary assessment tools for young children to which an index can be applied, future short tool development is warranted, particularly for screening obesogenic dietary behaviours.

  18. MOBILE APPLICATIONS AS TOOL FOR EXPLOITING CULTURAL HERITAGE IN THE REGION OF TURIN AND MILAN

    Directory of Open Access Journals (Sweden)

    A. Rolando

    2013-07-01

    The current research aims at showing how applications running on personal mobile communication terminals such as smartphones can be useful for exploring places and, at the same time, can serve as tools for developing interaction between cultural heritage and users. In this sense, smartphone applications can be combined with GIS to build a knowledge platform supporting research in the field of cultural heritage, with specific reference to accessibility issues and to the combined use of integrated technologies such as GPS, QR codes and GIS, with the final aim of finding a useful methodology for collecting data from visitors and visualizing them through mapping techniques. The research shows how the integration of different systems and technologies can be used as a method for inquiring into the interactions between users and cultural heritage in terms of accessibility to places. GPS devices can be used to record visitors' movements (cultural routes) in terms of space and time; QR codes can be used for user interaction with cultural heritage (tourist opinions, heritage ranking, facilities, accessibility); GIS software can be used for data management, analysis and mapping (tourist flows, most visited places). The focus of the research is the combination of information related to cultural routes with information related to single cultural places. The current research shows the potential of smartphone applications, as mobile devices for collecting data, as a means to record routes and the places most visited by tourists. The research can be divided into three steps: the first concerns GPS, used to record routes; the second deals with the interaction between tourists and cultural heritage through a system based on QR codes; the third is about GIS, used as a tool for management

  19. Mobile Applications as Tool for Exploiting Cultural Heritage in the Region of Turin and Milan

    Science.gov (United States)

    Rolando, A.; Scandiffio, A.

    2013-07-01

    The current research aims at showing how applications running on personal mobile communication terminals such as smartphones can be useful for exploring places and, at the same time, can serve as tools for developing interaction between cultural heritage and users. In this sense, smartphone applications can be combined with GIS to build a knowledge platform supporting research in the field of cultural heritage, with specific reference to accessibility issues and to the combined use of integrated technologies such as GPS, QR codes and GIS, with the final aim of finding a useful methodology for collecting data from visitors and visualizing them through mapping techniques. The research shows how the integration of different systems and technologies can be used as a method for inquiring into the interactions between users and cultural heritage in terms of accessibility to places. GPS devices can be used to record visitors' movements (cultural routes) in terms of space and time; QR codes can be used for user interaction with cultural heritage (tourist opinions, heritage ranking, facilities, accessibility); GIS software can be used for data management, analysis and mapping (tourist flows, most visited places). The focus of the research is the combination of information related to cultural routes with information related to single cultural places. The current research shows the potential of smartphone applications, as mobile devices for collecting data, as a means to record routes and the places most visited by tourists. The research can be divided into three steps: the first concerns GPS, used to record routes; the second deals with the interaction between tourists and cultural heritage through a system based on QR codes; the third is about GIS, used as a tool for management, analysis and

  20. Design for Reliability and Robustness Tool Platform for Power Electronic Systems – Study Case on Motor Drive Applications

    DEFF Research Database (Denmark)

    Vernica, Ionut; Wang, Huai; Blaabjerg, Frede

    2018-01-01

    conventional approach, mainly based on failure statistics from the field, the reliability evaluation of the power devices is still a challenging task. In order to address the given problem, a MATLAB based reliability assessment tool has been developed. The Design for Reliability and Robustness (DfR2) tool...... allows the user to easily investigate the reliability performance of the power electronic components (or sub-systems) under given input mission profiles and operating conditions. The main concept of the tool and its framework are introduced, highlighting the reliability assessment procedure for power...... semiconductor devices. Finally, a motor drive application is implemented and the reliability performance of the power devices is investigated with the help of the DfR2 tool, and the resulting reliability metrics are presented....

  1. Development and application of the practice tool to deal with severe accident

    International Nuclear Information System (INIS)

    Kawasaki, Ikuo; Yoshida, Yoshitaka; Iwasaki, Yoshito

    2014-01-01

    We developed a practice tool to simulate communications between operators at a nuclear power station and staff at the headquarters during a severe accident (SA). The tool was developed in light of the lessons learned in dealing with the accident at the Tokyo Electric Power Company Fukushima Daiichi Nuclear Power Station, especially those related to making appropriate responses to events. The tool allows users at headquarters to learn the configuration of a specific plant and to reply and state judgments based on SA knowledge. The scenarios used in the practice tool were built from SPDS data from past disaster prevention drills. In a test, SA education of headquarters workers was carried out using this tool, and we confirmed that users were able to judge the phenomena correctly and communicate effectively based on the knowledge provided by the tool. (author)

  2. Simulation Tools for Forest Health Analysis: An Application in the Red River Watershed, Idaho

    Science.gov (United States)

    Andrew J. McMahan; Eric L. Smith

    2006-01-01

    Software tools for landscape analyses--including FVS model extensions, and a number of FVS-related pre- and post-processing “tools”--are presented, using an analysis in the Red River Watershed, Nez Perce National Forest as an example. We present (1) a discussion of pre-simulation data analysis; (2) the Physiographic Information Extraction System (PIES), a tool that can...

  3. Computer assisted audit tools and techniques in real world: CAATT's applications and approaches in context

    OpenAIRE

    Pedrosa, I.; Costa, C. J.

    2012-01-01

    Nowadays, Computer Aided Audit Tools and Techniques (CAATTs) support almost all audit processes concerning data extraction and analysis. These tools were initially aimed at supporting financial auditing processes. However, their scope goes beyond this; therefore, we present case studies and good practices in an academic context. Although audit tools for data extraction and analysis are very common in large auditing companies and applied in several contexts, we realized that it is not easy to find practical...

  4. European Prospective Investigation into Cancer and Nutrition (EPIC) calibration study: rationale, design and population characteristics

    DEFF Research Database (Denmark)

    Slimani, N.; Kaaks, R.; Ferrari, P.

    2002-01-01

    The European Prospective Investigation into Cancer and Nutrition (EPIC), which covers a large cohort of half a million men and women from 23 European centres in 10 Western European countries, was designed to study the relationship between diet and the risk of chronic diseases, particularly cancer......, a calibration approach was developed. This approach involved an additional dietary assessment common across study populations to re-express individual dietary intakes according to the same reference scale. A single 24-hour diet recall was therefore collected, as the EPIC reference calibration method, from...... in a large multi-centre European study. These studies showed that, despite certain inherent methodological and logistic constraints, a study design such as this one works relatively well in practice. The average response in the calibration study was 78.3% and ranged from 46.5% to 92.5%. The calibration...

  5. Diversity of dietary patterns observed in the European Prospective Investigation into Cancer and Nutrition (EPIC) project

    DEFF Research Database (Denmark)

    Slimani, N.; Fahey, M.; Welch, A.A.

    2002-01-01

    differences were observed across centres, the countries participating in EPIC are characterised by specific dietary patterns. Overall, Italy and Greece have a dietary pattern characterised by plant foods (except potatoes) and a lower consumption of animal and processed foods, compared with the other EPIC...... countries. France and particularly Spain have more heterogeneous dietary patterns, with a relatively high consumption of both plant foods and animal products. Apart from characteristics specific to vegetarian groups, the UK 'health-conscious' group shares with the UK general population a relatively high...... consumption of tea, sauces, cakes, soft drinks (women), margarine and butter. In contrast, the diet in the Nordic countries, The Netherlands, Germany and the UK general population is relatively high in potatoes and animal, processed and sweetened/refined foods, with proportions varying across countries...

  6. Analisis Efektivitas Iklan Jejaring Sosial sebagai Media Promosi Menggunakan EPIC Model

    Directory of Open Access Journals (Sweden)

    Nur Hasanah

    2016-02-01

    The success of an advertisement or promotion that matches the public's need for information on educational services in social networks depends strongly on an attractive presentation and on the message conveyed. Achieving the goals of such a promotion requires sustained and well-targeted action. To determine whether the Huma Harati Facebook page is effective, the page itself must be measured, one approach being the EPIC method (Empathy, Persuasion, Impact, and Communication). The analysis shows that the Huma Harati fan page is an effective promotional medium, as indicated by the empathy, persuasion, impact, and communication scores obtained. The average EPIC rate is 3.978, and the communication dimension received the highest score of the four dimensions, namely 4.02.
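
    As a rough illustration of how an EPIC rate such as the 3.978 reported above can be obtained, the Python sketch below averages the four dimension scores (Empathy, Persuasion, Impact, Communication). It assumes the EPIC rate is the unweighted mean of the dimension scores, and the respondent answers are hypothetical.

# Minimal sketch of an EPIC-rate calculation, assuming the rate is the
# unweighted mean of the four dimension scores; all responses are hypothetical.

def dimension_score(responses):
    """Average a list of 1-5 Likert responses for one EPIC dimension."""
    return sum(responses) / len(responses)

def epic_rate(empathy, persuasion, impact, communication):
    """EPIC rate as the unweighted mean of the four dimension scores."""
    return (empathy + persuasion + impact + communication) / 4.0

scores = {
    "empathy": dimension_score([4, 4, 5, 3]),
    "persuasion": dimension_score([4, 3, 4, 4]),
    "impact": dimension_score([4, 4, 4, 3]),
    "communication": dimension_score([5, 4, 4, 4]),
}
print(scores)
print("EPIC rate:", epic_rate(**scores))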

  7. A GIS Tool for evaluating and improving NEXRAD and its application in distributed hydrologic modeling

    Science.gov (United States)

    Zhang, X.; Srinivasan, R.

    2008-12-01

    In this study, a user-friendly GIS tool was developed for evaluating and improving NEXRAD precipitation estimates using raingauge data. The tool can automatically read in raingauge and NEXRAD data, evaluate the accuracy of NEXRAD for each time unit, implement several geostatistical methods to improve the accuracy of NEXRAD using the raingauge data, and output spatial precipitation maps for distributed hydrologic models. The geostatistical methods incorporated in the tool include Simple Kriging with varying local means, Kriging with External Drift, Regression Kriging, Co-Kriging, and a new geostatistical method developed by Li et al. (2008). The tool was applied in two test watersheds at hourly and daily temporal scales. Preliminary cross-validation results show that incorporating raingauge data to calibrate NEXRAD can markedly change the spatial pattern of NEXRAD and improve its accuracy. Using the different geostatistical methods, the GIS tool was applied to produce long-term precipitation input for a distributed hydrologic model, the Soil and Water Assessment Tool (SWAT). Animated videos were generated to vividly illustrate the effect of using different precipitation input data on distributed hydrologic modeling. Currently, this GIS tool is implemented as an extension of SWAT, which is used as a water quantity and quality modeling tool by USDA and EPA. The flexible, module-based design of the tool also makes it easy to adapt to other hydrologic models for hydrological modeling and water resources management.
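
    The kriging methods listed above are not reproduced here; the Python sketch below only illustrates the simpler underlying idea of adjusting radar (NEXRAD) estimates with raingauge data and checking the adjustment by cross-validation. It uses a single mean-field bias factor instead of kriging, and the gauge and radar values are hypothetical.

# Simplified sketch of gauge-based adjustment of radar precipitation estimates.
# A mean-field bias factor stands in for the kriging methods named in the
# record; all sample values are hypothetical.
import numpy as np

def mean_field_bias(gauge, radar_at_gauges):
    """Single multiplicative bias factor: total gauge depth / total radar depth."""
    return gauge.sum() / radar_at_gauges.sum()

def loo_mae(gauge, radar_at_gauges):
    """Leave-one-out mean absolute error of the bias-adjusted radar estimates."""
    n = len(gauge)
    errors = []
    for i in range(n):
        mask = np.arange(n) != i
        bias = mean_field_bias(gauge[mask], radar_at_gauges[mask])
        errors.append(abs(bias * radar_at_gauges[i] - gauge[i]))
    return float(np.mean(errors))

gauge = np.array([5.2, 3.1, 0.0, 7.8, 4.4])   # hourly gauge totals, mm
radar = np.array([4.0, 2.5, 0.3, 6.1, 3.9])   # collocated radar estimates, mm
print("bias factor:", round(mean_field_bias(gauge, radar), 3))
print("leave-one-out MAE:", round(loo_mae(gauge, radar), 3), "mm")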

  8. Solid waste management in primary healthcare centers: application of a facilitation tool.

    Science.gov (United States)

    Moreira, Ana Maria Maniero; Günther, Wanda Maria Risso

    2016-08-18

    To propose a tool to facilitate the diagnosis, formulation and evaluation of the Waste Management Plan in Primary Healthcare Centers, and to present the results of its application in four selected units. Descriptive research, covering the stages of formulation and application of the proposed instrument and the evaluation of waste management performance at the units. The tool consists of five forms; specific indicators of waste generation were proposed for outpatient healthcare units, together with performance indicators that score compliance with current legislation. The studied units generate common waste (52-60%), infectious-sharps waste (31-42%) and recyclables (5-17%). The average generation rates are 0.09 kg of total waste per outpatient visit and 0.09 kg of infectious-sharps waste per outpatient procedure. Compliance with regulations, initially 26-30%, reached 30-38% a year later. The tool proved easy to use, cut through the complex range of existing regulatory requirements, allowed non-conformities to be identified, pointed out corrective measures and evaluated waste management performance. In this sense, it contributes to decision making and to management practices relating to waste, tasks usually assigned to nurses. It is recommended that the tool be applied in similar healthcare units for comparative studies, with the adaptations necessary for other medical services.
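
    The generation and compliance indicators described above reduce to simple ratios; the Python sketch below illustrates them. The visit and procedure counts and the checklist are hypothetical (chosen so that the rates reproduce the reported 0.09 kg figures), and the tool's five forms are not reproduced here.

# Sketch of the waste-generation and compliance indicators described in the
# record; all counts and checklist items are hypothetical.

def generation_rate(total_kg, units):
    """kg of waste per outpatient visit (or per procedure)."""
    return total_kg / units

def compliance(checks):
    """Percentage of legal requirements met, given pass/fail checklist items."""
    return 100.0 * sum(checks) / len(checks)

total_waste_kg, visits = 270.0, 3000          # hypothetical monthly figures
sharps_kg, procedures = 90.0, 1000
checklist = [True, False, True, False, True]  # hypothetical requirement checks

print(round(generation_rate(total_waste_kg, visits), 2), "kg of total waste per visit")
print(round(generation_rate(sharps_kg, procedures), 2), "kg of infectious-sharps waste per procedure")
print(compliance(checklist), "% compliance")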

  9. EPIC: A Testbed for Scientifically Rigorous Cyber-Physical Security Experimentation

    OpenAIRE

    SIATERLIS CHRISTOS; GENGE BELA; HOHENADEL MARC

    2013-01-01

    Recent malware, like Stuxnet and Flame, constitute a major threat to Networked Critical Infrastructures (NCIs), e.g., power plants. They revealed several vulnerabilities in today's NCIs, but most importantly they highlighted the lack of an efficient scientific approach to conduct experiments that measure the impact of cyber threats on both the physical and the cyber parts of NCIs. In this paper we present EPIC, a novel cyber-physical testbed and a modern scientific instrument that can pr...

  10. The improved physical activity index for measuring physical activity in EPIC Germany.

    Science.gov (United States)

    Wientzek, Angelika; Vigl, Matthäus; Steindorf, Karen; Brühmann, Boris; Bergmann, Manuela M; Harttig, Ulrich; Katzke, Verena; Kaaks, Rudolf; Boeing, Heiner

    2014-01-01

    In the European Prospective Investigation into Cancer and Nutrition study (EPIC), physical activity (PA) has been indexed as a cross-tabulation between PA at work and recreational activity. As the proportion of non-working participants increases, other categorization strategies are needed. Therefore, our aim was to develop a valid PA index for this population that is also able to express PA continuously. In the German EPIC centers Potsdam and Heidelberg, a clustered sample of 3,766 participants was re-invited to the study center; 1,615 agreed to participate and 1,344 were finally included in this study. PA was measured by questionnaires on defined activities and by a 7-day combined heart rate and acceleration sensor. In a training sample of 433 participants, the Improved Physical Activity Index (IPAI) was developed. Its performance was evaluated in a validation sample of 911 participants and compared with the Cambridge Index and the Total PA Index. The IPAI consists of items covering five areas: PA at work, sport, cycling, television viewing, and computer use. The correlations of the IPAI with accelerometer counts in the training and validation samples ranged from r = 0.40 to 0.43, and with physical activity energy expenditure (PAEE) from r = 0.33 to 0.40, higher than for the Cambridge Index and the Total PA Index previously applied in EPIC. In non-working participants the IPAI showed higher correlations than the Cambridge Index and the Total PA Index, with r = 0.34 for accelerometer counts and r = 0.29 for PAEE. In conclusion, we developed a valid physical activity index that can express PA continuously as well as categorize participants according to their PA level. In populations with increasing proportions of non-working people, the IPAI performs better than the established indices used in EPIC.

  11. The improved physical activity index for measuring physical activity in EPIC Germany.

    Directory of Open Access Journals (Sweden)

    Angelika Wientzek

    In the European Prospective Investigation into Cancer and Nutrition study (EPIC), physical activity (PA) has been indexed as a cross-tabulation between PA at work and recreational activity. As the proportion of non-working participants increases, other categorization strategies are needed. Therefore, our aim was to develop a valid PA index for this population that is also able to express PA continuously. In the German EPIC centers Potsdam and Heidelberg, a clustered sample of 3,766 participants was re-invited to the study center; 1,615 agreed to participate and 1,344 were finally included in this study. PA was measured by questionnaires on defined activities and by a 7-day combined heart rate and acceleration sensor. In a training sample of 433 participants, the Improved Physical Activity Index (IPAI) was developed. Its performance was evaluated in a validation sample of 911 participants and compared with the Cambridge Index and the Total PA Index. The IPAI consists of items covering five areas: PA at work, sport, cycling, television viewing, and computer use. The correlations of the IPAI with accelerometer counts in the training and validation samples ranged from r = 0.40 to 0.43, and with physical activity energy expenditure (PAEE) from r = 0.33 to 0.40, higher than for the Cambridge Index and the Total PA Index previously applied in EPIC. In non-working participants the IPAI showed higher correlations than the Cambridge Index and the Total PA Index, with r = 0.34 for accelerometer counts and r = 0.29 for PAEE. In conclusion, we developed a valid physical activity index that can express PA continuously as well as categorize participants according to their PA level. In populations with increasing proportions of non-working people, the IPAI performs better than the established indices used in EPIC.

  12. Fuel and coolant motions following pin failure: EPIC models and the PBE-5S experiment

    International Nuclear Information System (INIS)

    Garner, P.L.; Abramson, P.B.

    1979-01-01

    The EPIC computer code has been used to analyze the post-fuel-pin-failure behavior in the PBE-5S experiment performed at Sandia Laboratories. The effects of modeling uncertainties on the calculation are examined. The calculations indicate that the majority of the piston motion observed in the test is due to the initial pressurization of the coolant channel by fuel vapor at cladding failure. A more definitive analysis requires improvements in calculational capabilities and experiment diagnostics

  13. Feasibility of 2 × 24-h dietary recalls combined with a food-recording booklet, using EPIC-Soft, among schoolchildren.

    Science.gov (United States)

    Trolle, E; Amiano, P; Ege, M; Bower, E; Lioret, S; Brants, H; Kaic-Rak, A; de Boer, E J; Andersen, L F

    2011-07-01

    The aim of this study was to evaluate the feasibility of the suggested trans-European methodology for undertaking representative dietary surveys among schoolchildren: 2 × 24-h dietary recalls (24-HDRs) combined with a food-recording booklet, using the EPIC-Soft PC program (the software developed to conduct 24-HDRs in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study). A total of 75 children aged 7-8 years and 70 children aged 12-13 years were recruited through the Civil Registration System in Denmark, and 57 children aged 7-8 years and 47 children aged 12-13 years were recruited through schools in Spain. Each child, together with one parent, completed two face-to-face 24-HDRs, combined with optional use of a food-recording booklet (FRB) filled in by the child, a parent or other proxy persons in preparation for the recalls. Feasibility was evaluated by questionnaires completed by parents, children and interviewers, and by selected data from the 24-HDRs. The face-to-face interviews with the child and a parent together were confirmed as feasible. The children participated actively in the interviews, the oldest children being most active. The children, parents and interviewers agreed that children needed help from their parents and that the parents were of help to the child. In both countries, other proxy persons, such as teachers or school cafeteria staff, were involved before the interview, and the majority of parents and children reported that the FRB had helped the child during the interview. Further results point to specific improvements needed in the tools. The evaluated method is shown to be feasible in two culturally diverse European populations. However, the feasibility study also points to specific improvements to the tools and the data collection protocol that are strongly recommended before the method is implemented in each country of a pan-European dietary survey.

  14. Validity and applicability of a video-based animated tool to assess mobility in elderly Latin American populations.

    Science.gov (United States)

    Guerra, Ricardo Oliveira; Oliveira, Bruna Silva; Alvarado, Beatriz Eugenia; Curcio, Carmen Lucia; Rejeski, W Jack; Marsh, Anthony P; Ip, Edward H; Barnard, Ryan T; Guralnik, Jack M; Zunzunegui, Maria Victoria

    2014-10-01

    To assess the reliability and the validity of Portuguese- and Spanish-translated versions of the video-based short-form Mobility Assessment Tool in assessing self-reported mobility, and to provide evidence for the applicability of these videos in elderly Latin American populations as a complement to physical performance measures. The sample consisted of 300 elderly participants (150 from Brazil, 150 from Colombia) recruited at neighborhood social centers. Mobility was assessed with the Mobility Assessment Tool, and compared with the Short Physical Performance Battery score and self-reported functional limitations. Reliability was calculated using intraclass correlation coefficients. Multiple linear regression analyses were used to assess associations among mobility assessment tools and health, and sociodemographic variables. A significant gradient of increasing Mobility Assessment Tool score with better physical function was observed for both self-reported and objective measures, and in each city. Associations between self-reported mobility and health were strong, and significant. Mobility Assessment Tool scores were lower in women at both sites. Intraclass correlation coefficients of the Mobility Assessment Tool were 0.94 (95% confidence interval 0.90-0.97) in Brazil and 0.81 (95% confidence interval 0.66-0.91) in Colombia. Mobility Assessment Tool scores were lower in Manizales than in Natal after adjustment by Short Physical Performance Battery, self-rated health and sex. These results provide evidence for high reliability and good validity of the Mobility Assessment Tool in its Spanish and Portuguese versions used in Latin American populations. In addition, the Mobility Assessment Tool can detect mobility differences related to environmental features that cannot be captured by objective performance measures. © 2013 Japan Geriatrics Society.

  15. Macronutrient, vitamin, and mineral intakes in the EPIC-Germany cohorts.

    Science.gov (United States)

    Schulze, M B; Linseisen, J; Kroke, A; Boeing, H

    2001-01-01

    This article presents intakes of nutrients in the EPIC-Heidelberg and the EPIC-Potsdam (European Investigation into Cancer and Nutrition) studies. Estimates are based on standardized 24-hour dietary recalls. Recalls from 1,013 men and 1,078 women in Heidelberg and from 1,032 men and 898 women in Potsdam were included in the analysis. The estimated nutrient intake was based on the German Food Code and Nutrient Data Base version II.3. Analyses were carried out stratified by sex and weighted for the day of the week and age. Men in Potsdam reported significantly higher intakes of energy (mean Potsdam = 10,718 kJ, mean Heidelberg = 10,387 kJ) and higher intakes of vitamins and minerals as compared with men in Heidelberg. However, Heidelberg men consumed more alcohol, alpha-tocopherol, phosphorus, calcium, and magnesium. Potsdam women reported lower energy (mean Potsdam = 7,537 kJ, mean Heidelberg = 7,855 kJ), alcohol, and cholesterol intakes as compared with Heidelberg women. Vitamin and mineral intakes were lower too, except for retinol and ascorbic acid. The intakes of energy and most nutrients observed in the Potsdam and Heidelberg study populations were within the range reported from other German studies. The observed differences between both study populations indicate different dietary patterns, increasing the exposure variation in the EPIC study. Copyright 2001 S. Karger AG, Basel

  16. Expert System Application of Forward Chaining and Certainty Factors Method for The Decision of Contraception Tools

    Science.gov (United States)

    Prambudi, Dwi Arief; Widodo, Catur Edi; Widodo, Aris Puji

    2018-02-01

    The choice of a contraceptive tool is not easy, because its risks or side effects affect a body that has never used it before. On the other hand, no single contraceptive suits everybody, because each body's circumstances are different, so extensive knowledge is needed of the advantages and disadvantages of each contraceptive tool, which must then be matched to the user's body. The expert system for contraceptive tools uses the forward chaining search method combined with the certainty factor (CF) method. These methods evaluate the patient's indications, and the expert system outputs the kind of contraceptive tool suited to the patient. The results obtained can help people identify the indications that point to an appropriate contraceptive tool, together with advice or suggestions about that tool. The success rate of the contraceptive decision experienced by the user with forward chaining combined with the CF computation method is also influenced by the number of indication criteria selected by the user. Based on the testing performed, the contraceptive decision expert system has an accuracy of 75%.
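
    As a rough sketch of the two techniques named above, the Python fragment below performs a single forward-chaining pass over a small rule base and combines certainty factors MYCIN-style when more than one rule supports the same conclusion. The rules, indications and CF values are hypothetical and are not the knowledge base of the cited system.

# Forward chaining with certainty-factor (CF) combination; hypothetical rules.

RULES = [
    # (required indications, conclusion, rule CF)
    ({"breastfeeding", "wants_long_term"}, "IUD", 0.8),
    ({"smoker_over_35"}, "IUD", 0.6),
    ({"wants_short_term"}, "condom", 0.7),
]

def combine_cf(cf_old, cf_new):
    """Combine two positive CFs for the same conclusion (MYCIN-style)."""
    return cf_old + cf_new * (1.0 - cf_old)

def forward_chain(facts):
    """Fire every rule whose conditions are satisfied and accumulate CFs."""
    conclusions = {}
    for conditions, conclusion, cf in RULES:
        if conditions <= facts:
            conclusions[conclusion] = combine_cf(conclusions.get(conclusion, 0.0), cf)
    return conclusions

print(forward_chain({"breastfeeding", "wants_long_term", "smoker_over_35"}))
# {'IUD': 0.92}: the two IUD rules fire and 0.8 + 0.6 * (1 - 0.8) = 0.92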

  17. The application of springback compensation to the CAD geometries of forming tools: Milestone 2

    NARCIS (Netherlands)

    Lingbeek, R.A.

    2005-01-01

    For car body parts, generally a surface modeling CAD system is used. The geometry of the tools that are used for forming the parts is based directly on this description. To compensate for springback after forming, the tools have to be modified. This turns out to be a complicated and time consuming

  18. Advanced REACH Tool : Development and application of the substance emission potential modifying factor

    NARCIS (Netherlands)

    Tongeren, M. van; Fransman, W.; Spankie, S.; Tischer, M.; Brouwer, D.; Schinkel, J.; Cherrie, J.W.; Tielemans, E.

    2011-01-01

    The Advanced REACH Tool (ART) is an exposure assessment tool that combines mechanistically modelled inhalation exposure estimates with available exposure data using a Bayesian approach. The mechanistic model is based on nine independent principal modifying factors (MF). One of these MF is the
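
    The ART model itself is not reproduced here; the Python sketch below only illustrates the general idea of combining a mechanistically modelled exposure estimate (used as a prior) with measured exposure data through a conjugate normal update on log-transformed exposure. The prior, the assumed measurement variance and the measurement values are all hypothetical.

# Generic Bayesian combination of a mechanistic exposure estimate (prior) with
# measurements, via a conjugate normal update on log exposure. Not the actual
# ART model; all numbers are hypothetical.
import math

def combine(prior_mean_log, prior_var_log, measurements_mgm3, data_var_log=0.25):
    """Posterior mean/variance of log exposure, assuming known measurement variance."""
    logs = [math.log(x) for x in measurements_mgm3]
    n = len(logs)
    data_mean = sum(logs) / n
    post_var = 1.0 / (1.0 / prior_var_log + n / data_var_log)
    post_mean = post_var * (prior_mean_log / prior_var_log + n * data_mean / data_var_log)
    return post_mean, post_var

post_mean, post_var = combine(math.log(1.2), 1.0, [0.8, 1.5, 1.1])
print("posterior median exposure:", round(math.exp(post_mean), 2), "mg/m^3")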

  19. Simulation Tool for Designing off-Grid PV Applications for the Urban Environments

    DEFF Research Database (Denmark)

    Poulsen, Peter Behrensdorff; Dam-Hansen, Carsten; Thorseth, Anders

    2013-01-01

    A barrier to exploiting standalone solar lighting in the urban environment seems to be a lack of knowledge and of available tools for proper dimensioning. In this work, the first part of the development of a powerful dimensioning tool is described and initial measurements are presented....

  20. Possibilities of Application of High Pressure Jet Assisted Machining in Hard Turning with Carbide Tools

    Directory of Open Access Journals (Sweden)

    G. Globočki Lakić

    2017-06-01

    High Pressure Jet Assisted Machining (HPJAM) in turning is a hybrid machining method in which a jet of cooling and lubrication fluid (CLF) under high pressure (50 MPa) is directed into the zone between the cutting tool edge and the workpiece. An experimental study was performed to investigate the capabilities of conventional and high pressure cooling (HPC) in the turning of hard-to-machine materials: hard-chromed and surface-hardened steel Ck45 (58 HRC) and hardened bearing steel 100Cr6 (62 HRC). Machining experiments were performed using coated carbide tools and high cutting speeds, with measurements taken for different input process parameters. The cooling capabilities are compared by monitoring tool wear, tool life, cooling efficiency, and surface roughness, and a connection between tool wear and surface roughness is established. The experimental research shows that hard turning with carbide cutting tools and a high pressure supply of CLF provides numerous advantages from the techno-economic point of view: greater productivity, reduced temperature in the cutting zone, improved control of chip formation, extended tool life, low intensity of tool wear, surface roughness within acceptable limits, and a significant reduction of production costs related to the CLF.