WorldWideScience

Sample records for radar software toolkit

  1. SIGKit: Software for Introductory Geophysics Toolkit

    Science.gov (United States)

    Kruse, S.; Bank, C. G.; Esmaeili, S.; Jazayeri, S.; Liu, S.; Stoikopoulos, N.

    2017-12-01

    The Software for Introductory Geophysics Toolkit (SIGKit) affords students the opportunity to create model data and perform simple processing of field data for various geophysical methods. SIGkit provides a graphical user interface built with the MATLAB programming language, but can run even without a MATLAB installation. At this time SIGkit allows students to pick first arrivals and match a two-layer model to seismic refraction data; grid total-field magnetic data, extract a profile, and compare this to a synthetic profile; and perform simple processing steps (subtraction of a mean trace, hyperbola fit) to ground-penetrating radar data. We also have preliminary tools for gravity, resistivity, and EM data representation and analysis. SIGkit is being built by students for students, and the intent of the toolkit is to provide an intuitive interface for simple data analysis and understanding of the methods, and act as an entrance to more sophisticated software. The toolkit has been used in introductory courses as well as field courses. First reactions from students are positive. Think-aloud observations of students using the toolkit have helped identify problems and helped shape it. We are planning to compare the learning outcomes of students who have used the toolkit in a field course to students in a previous course to test its effectiveness.
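
    One of the processing steps mentioned above, the hyperbola fit to ground-penetrating radar data, can be illustrated with a short stand-alone sketch. This is not SIGkit code (SIGkit is MATLAB-based); it is a hedged NumPy/SciPy example in which the positions, picked traveltimes and starting values are made up for illustration.

        # Illustrative sketch only (not SIGkit): least-squares fit of the diffraction
        # hyperbola t(x) = (2/v) * sqrt((x - x0)^2 + d^2) to picked GPR arrival times.
        import numpy as np
        from scipy.optimize import curve_fit

        def hyperbola(x, v, d, x0):
            # two-way traveltime over a point diffractor at depth d below x0, velocity v
            return 2.0 * np.sqrt((x - x0) ** 2 + d ** 2) / v

        # synthetic picks: v = 0.1 m/ns, diffractor 1.5 m deep below x0 = 5 m
        x = np.linspace(0.0, 10.0, 41)                                # antenna positions (m)
        t_picked = hyperbola(x, 0.1, 1.5, 5.0) + np.random.normal(0.0, 0.3, x.size)  # ns

        (v_est, d_est, x0_est), _ = curve_fit(hyperbola, x, t_picked, p0=[0.08, 1.0, 4.0])
        print(f"velocity ~ {v_est:.3f} m/ns, depth ~ {d_est:.2f} m, apex at x ~ {x0_est:.2f} m")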

  2. The Python ARM Radar Toolkit (Py-ART), a Library for Working with Weather Radar Data in the Python Programming Language

    Directory of Open Access Journals (Sweden)

    Jonathan J Helmus

    2016-07-01

    Full Text Available The Python ARM Radar Toolkit is a package for reading, visualizing, correcting and analysing data from weather radars. Development began to meet the needs of the Atmospheric Radiation Measurement Climate Research Facility and has since expanded to provide a general-purpose framework for working with data from weather radars in the Python programming language. The toolkit is built on top of libraries in the Scientific Python ecosystem including NumPy, SciPy, and matplotlib, and makes use of Cython for interfacing with existing radar libraries written in C and to speed up computationally demanding algorithms. The source code for the toolkit is available on GitHub and is distributed under a BSD license.
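
    A minimal usage sketch consistent with the description above (the file name is a placeholder and the 'reflectivity' field name depends on the data source):

        # Read a radar volume, list its moments and plot a PPI of the lowest sweep.
        import matplotlib.pyplot as plt
        import pyart

        radar = pyart.io.read("example_radar_volume.nc")   # many common radar formats supported
        print(list(radar.fields.keys()))                   # available moments

        display = pyart.graph.RadarDisplay(radar)
        fig, ax = plt.subplots(figsize=(6, 5))
        display.plot("reflectivity", sweep=0, ax=ax)       # PPI of the lowest elevation sweep
        plt.show()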

  3. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, which is a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture that decouples application software from low-level hardware can easily adopt a "user requirements-oriented" development methodology instead of the traditional "specific function-oriented" development methodology. Compared with a traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are given in the conclusion.
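
    The decoupling of application software from low-level hardware described above can be sketched, purely as an illustration, with a reconfigurable chain of processing stages behind a common interface; the class and stage names below are hypothetical and are not taken from RadarLab 2.0.

        # Conceptual sketch of an open, reconfigurable processing chain (hypothetical names).
        import numpy as np

        class Stage:
            def process(self, samples: np.ndarray) -> np.ndarray:
                raise NotImplementedError

        class PulseCompression(Stage):
            def __init__(self, reference: np.ndarray):
                self.matched_filter = np.conj(reference[::-1])   # matched-filter taps
            def process(self, samples):
                return np.convolve(samples, self.matched_filter, mode="same")

        class EnvelopeDetector(Stage):
            def process(self, samples):
                return np.abs(samples)

        class Pipeline:
            def __init__(self, stages):
                self.stages = list(stages)        # reconfigurable at run time
            def run(self, samples):
                for stage in self.stages:
                    samples = stage.process(samples)
                return samples

        # swapping stages changes the "application" without touching the hardware-facing layer
        chirp = np.exp(1j * np.pi * 100.0 * np.linspace(0.0, 1.0, 256) ** 2)
        echo = chirp + 0.05 * np.random.randn(256)
        profile = Pipeline([PulseCompression(chirp), EnvelopeDetector()]).run(echo)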

  4. The Python ARM Radar Toolkit (Py-ART), a Library for Working with Weather Radar Data in the Python Programming Language

    OpenAIRE

    Helmus, Jonathan J; Collis, Scott M

    2016-01-01

    The Python ARM Radar Toolkit is a package for reading, visualizing, correcting and analysing data from weather radars. Development began to meet the needs of the Atmospheric Radiation Measurement Climate Research Facility and has since expanded to provide a general-purpose framework for working with data from weather radars in the Python programming language. The toolkit is built on top of libraries in the Scientific Python ecosystem including NumPy, SciPy, and matplotlib, and makes use of Cython for interfacing with existing radar libraries written in C and to speed up computationally demanding algorithms. The source code for the toolkit is available on GitHub and is distributed under a BSD license.

  5. PsyToolkit: a software package for programming psychological experiments using Linux.

    Science.gov (United States)

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the GNU General Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  6. Integrating existing software toolkits into VO system

    Science.gov (United States)

    Cui, Chenzhou; Zhao, Yong-Heng; Wang, Xiaoqian; Sang, Jian; Luo, Ze

    2004-09-01

    Virtual Observatory (VO) is a collection of interoperating data archives and software tools. Taking advantage of the latest information technologies, it aims to provide a data-intensive online research environment for astronomers all around the world. A large number of high-quality astronomical software packages and libraries are powerful and easy to use, and have been widely used by astronomers for many years. Integrating those toolkits into the VO system is a necessary and important task for the VO developers. VO architecture greatly depends on Grid and Web services, consequently the general VO integration route is "Java Ready - Grid Ready - VO Ready". In the paper, we discuss the importance of VO integration for existing toolkits and the possible solutions. We introduce two efforts in the field from the China-VO project, "gImageMagick" and "Galactic abundance gradients statistical research under grid environment". We also discuss what additional work should be done to convert Grid services to VO services.

  7. Ground and Space Radar Volume Matching and Comparison Software

    Science.gov (United States)

    Morris, Kenneth; Schwaller, Mathew

    2010-01-01

    This software enables easy comparison of ground- and space-based radar observations. The software was initially designed to compare ground radar reflectivity from operational, ground-based S- and C-band meteorological radars with comparable measurements from the Tropical Rainfall Measuring Mission (TRMM) satellite's Precipitation Radar (PR) instrument. The software is also applicable to other ground-based and space-based radars. The ground and space radar volume matching and comparison software was developed in response to requirements defined by the Ground Validation System (GVS) of Goddard's Global Precipitation Measurement (GPM) project. This software innovation is specifically concerned with simplifying the comparison of ground- and space-based radar measurements for the purpose of GPM algorithm and data product validation. This software is unique in that it provides an operational environment to routinely create comparison products, and uses a direct geometric approach to derive common volumes of space- and ground-based radar data. In this approach, spatially coincident volumes are defined by the intersection of individual space-based Precipitation Radar rays with each of the conical elevation sweeps of the ground radar. Thus, the resampled volume elements of the space and ground radar reflectivity can be directly compared to one another.
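
    The geometric approach described above can be illustrated with a simplified, stand-alone sketch (not the GVS software itself) that uses the standard 4/3-effective-Earth-radius beam-height approximation to locate a ground-radar elevation sweep in height and ground range; the sweep angle and ranges below are arbitrary examples.

        # Simplified illustration of the geometric matching idea: where a spaceborne
        # radar ray meets a ground-radar sweep, the sweep's height follows the
        # standard 4/3-effective-Earth-radius beam-height approximation.
        import numpy as np

        R_EARTH = 6371.0e3          # mean Earth radius (m)
        K_E = 4.0 / 3.0             # effective Earth radius factor

        def beam_height(slant_range_m, elev_deg, antenna_alt_m=0.0):
            # height above ground of the beam at a given slant range and elevation
            re = K_E * R_EARTH
            theta = np.deg2rad(elev_deg)
            return (np.sqrt(slant_range_m**2 + re**2 + 2.0 * slant_range_m * re * np.sin(theta))
                    - re + antenna_alt_m)

        def ground_range(slant_range_m, elev_deg):
            # great-circle distance from the radar to the point below the beam
            re = K_E * R_EARTH
            theta = np.deg2rad(elev_deg)
            h = beam_height(slant_range_m, elev_deg)
            return re * np.arcsin(slant_range_m * np.cos(theta) / (re + h))

        # e.g. the 1.5 degree sweep sampled every 25 km in slant range
        for r in np.arange(25e3, 150e3, 25e3):
            print(f"slant {r/1e3:5.0f} km -> ground {ground_range(r, 1.5)/1e3:6.1f} km, "
                  f"height {beam_height(r, 1.5)/1e3:4.2f} km")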

  8. The Liquid Argon Software Toolkit (LArSoft): Goals, Status and Plan

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Rush [Fermilab; Snider, Erica [Fermilab

    2016-08-17

    LArSoft is a toolkit that provides a software infrastructure and algorithms for the simulation, reconstruction and analysis of events in Liquid Argon Time Projection Chambers (LArTPCs). It is used by the ArgoNeuT, LArIAT, MicroBooNE, DUNE (including the 35-ton prototype and ProtoDUNE) and SBND experiments. The LArSoft collaboration provides an environment for the development, use, and sharing of code across experiments. The ultimate goal is to develop fully automatic processes for reconstruction and analysis of LArTPC events. The toolkit is based on the art framework and has a well-defined architecture for interfacing to other packages, including the GEANT4 and GENIE simulation software and the Pandora software development kit for pattern recognition. It is designed to facilitate and support the evolution of algorithms including their transition to new computing platforms. The development of the toolkit is driven by the scientific stakeholders involved. The core infrastructure includes standard definitions of types and constants, means to input experiment geometries as well as metadata and event data in several formats, and relevant general utilities. Examples of algorithms experiments have contributed to date are: photon propagation; particle identification; hit finding; track finding and fitting; and electromagnetic shower identification and reconstruction. We report on the status of the toolkit and plans for future work.

  9. Guest editors' introduction to the 4th issue of Experimental Software and Toolkits (EST-4)

    NARCIS (Netherlands)

    Brand, van den M.G.J.; Kienle, H.M.; Mens, K.

    2014-01-01

    Experimental software and toolkits play a crucial role in computer science. Elsevier’s Science of Computer Programming special issues on Experimental Software and Toolkits (EST) provide a means for academic tool builders to get more visibility and credit for their work, by publishing a paper along

  10. Debris Examination Using Ballistic and Radar Integrated Software

    Science.gov (United States)

    Griffith, Anthony; Schottel, Matthew; Lee, David; Scully, Robert; Hamilton, Joseph; Kent, Brian; Thomas, Christopher; Benson, Jonathan; Branch, Eric; Hardman, Paul

    2012-01-01

    The Debris Examination Using Ballistic and Radar Integrated Software (DEBRIS) program was developed to provide rapid and accurate analysis of debris observed by the NASA Debris Radar (NDR). This software provides a greatly improved analysis capacity over earlier manual processes, allowing for up to four times as much data to be analyzed by one-quarter of the personnel required by earlier methods. There are two applications that comprise the DEBRIS system: the Automated Radar Debris Examination Tool (ARDENT) and the primary DEBRIS tool.

  11. STAR: Software Toolkit for Analysis Research

    International Nuclear Information System (INIS)

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R.; Helman, P.

    1993-01-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element for nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities which have made it possible to formulate large integrated databases comprising vast amounts of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be unstructured text, structured data, signals, or images. The data can be numerical and/or character, stored in raw data files, databases, or streams of bytes, or compressed into bits in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems.

  12. A GIS Software Toolkit for Monitoring Areal Snow Cover and Producing Daily Hydrologic Forecasts using NASA Satellite Imagery, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aniuk Consulting, LLC, proposes to create a GIS software toolkit for monitoring areal snow cover extent and producing streamflow forecasts. This toolkit will be...

  13. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Science.gov (United States)

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael; Perdue, Gabriel; Wenzel, Hans; Wright, Dennis H.; Yarba, Julia

    2017-10-01

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with the physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
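
    The variant-and-bookkeeping idea can be illustrated with a toy sketch (written here in Python; it is not the toolkit itself, which is built on C++ event-processing infrastructure): vary one model parameter, histogram the resulting observable for each variant, and take the spread across variants as an uncertainty estimate.

        # Toy illustration of the parameter-variation workflow (not the actual toolkit).
        import numpy as np

        rng = np.random.default_rng(seed=1)

        def toy_model(n_events, slope):
            # stand-in for a simulation: observable drawn from an exponential "model"
            return rng.exponential(scale=slope, size=n_events)

        nominal = 1.00
        variants = [0.90, 0.95, 1.00, 1.05, 1.10]        # parameter values to scan
        bins = np.linspace(0.0, 5.0, 26)

        histograms = {s: np.histogram(toy_model(100_000, s), bins=bins)[0] for s in variants}
        stack = np.vstack([histograms[s] for s in variants])

        central = histograms[nominal]
        band = stack.max(axis=0) - stack.min(axis=0)     # envelope across variants
        rel_unc = np.divide(band, central, out=np.zeros_like(band, dtype=float), where=central > 0)
        print("relative spread per bin:", np.round(rel_unc[:5], 3))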

  14. A software toolkit for implementing low-cost virtual reality training systems

    International Nuclear Information System (INIS)

    Louka, Michael N.

    1999-04-01

    VR is a powerful technology for implementing training systems but better tools are needed to achieve wider usage and acceptance for desktop computer-based training applications. A need has been identified for a software tool kit to support the efficient implementation of well-structured desktop VR training systems. A powerful toolkit for implementing scalable low-cost VR training applications is described in this report (author) (ml)

  15. Simulation of FMCW radar systems based on Software Defined Radio (Simulación de sistemas radar FMCW basado en Software Defined Radio)

    OpenAIRE

    Vidal Morera, Marc

    2016-01-01

    This project consists of the simulation of radar systems based on a Software Defined Radio architecture. The project was born from the idea of introducing Software Defined Radio (SDR) into the skeleton of a radar device. SDR aims to achieve, in a single programmable device, what a radio transmitter and receiver do, meaning that most of their components run in the digital domain. Within the architecture of a radar, it comes to replace the analog components to ...
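
    The core of such a simulation can be sketched in a few lines (an independent illustration with assumed parameters, not code from the thesis): generate a chirp, delay it to emulate a target echo, de-chirp by mixing, and read the range off the beat frequency via R = c*f_b*T/(2*B).

        # Minimal FMCW simulation sketch with assumed parameters.
        import numpy as np

        c = 3e8
        B = 150e6            # sweep bandwidth (Hz)
        T = 1e-3             # sweep duration (s)
        fs = 2e6             # baseband sample rate (Hz)
        R_true = 900.0       # target range (m)

        t = np.arange(0, T, 1 / fs)
        k = B / T                                   # chirp slope (Hz/s)
        tx = np.exp(1j * np.pi * k * t**2)          # complex baseband up-chirp

        tau = 2 * R_true / c                        # round-trip delay
        rx = np.exp(1j * np.pi * k * (t - tau)**2)  # delayed echo (unit amplitude)

        beat = tx * np.conj(rx)                     # de-chirp / mix down
        spectrum = np.abs(np.fft.rfft(beat * np.hanning(t.size)))
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        f_beat = freqs[np.argmax(spectrum)]

        print(f"beat frequency ~ {f_beat/1e3:.1f} kHz -> range ~ {c * f_beat * T / (2 * B):.0f} m")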

  16. Integrated Systems Health Management (ISHM) Toolkit

    Science.gov (United States)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  17. Software Toolkits: Practical Aspects of the Internet of Things—A Survey

    OpenAIRE

    Wang, Feng; Hu, Liang; Zhou, Jin; Wu, Yang; Hu, Jiejun; Zhao, Kuo

    2015-01-01

    The Internet of Things (IoT) is neither science fiction nor industry hype; rather it is based on solid technological advances and visions of network ubiquity that are zealously being realized. The paper serves to provide guidance regarding the practical aspects of the IoT. Such guidance is largely missing in the current literature in which the focus has been more on research problems and less on issues describing how to set up an IoT system and what software toolkits are required. This paper ...

  18. Sierra toolkit computational mesh conceptual model

    International Nuclear Information System (INIS)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-01-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  19. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    CERN Document Server

    Genser, Krzysztof; Perdue, Gabriel; Wenzel, Hans; Yarba, Julia; Kelsey, Michael; Wright, Dennis H

    2016-01-01

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.

  20. A Statistical Toolkit for Data Analysis

    International Nuclear Information System (INIS)

    Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.

    2006-01-01

    The present project aims to develop an open-source and object-oriented software Toolkit for statistical data analysis. Its statistical testing component contains a variety of Goodness-of-Fit tests, from Chi-squared to Kolmogorov-Smirnov, to less well-known but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper and Tiku. Thanks to the component-based design and the usage of standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated in experimental software frameworks. The Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms, the computational features of the Toolkit, and the code validation.
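
    Several of the tests named above are also available in SciPy; the snippet below (which uses SciPy, not the C++ Toolkit's interfaces) shows what applying them to a sample looks like in practice.

        # Goodness-of-fit tests named in the abstract, illustrated with SciPy.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        sample = rng.normal(loc=0.0, scale=1.0, size=500)

        # Kolmogorov-Smirnov and Cramer-von Mises against a standard normal hypothesis
        print("KS:", stats.kstest(sample, "norm"))
        print("Cramer-von Mises:", stats.cramervonmises(sample, "norm"))

        # Anderson-Darling (more weight in the tails than KS)
        print("Anderson-Darling:", stats.anderson(sample, dist="norm"))

        # Chi-squared on binned data against the expected normal bin contents
        counts, edges = np.histogram(sample, bins=20, range=(-4, 4))
        expected = sample.size * np.diff(stats.norm.cdf(edges))
        expected *= counts.sum() / expected.sum()          # match totals for the test
        print("Chi-squared:", stats.chisquare(counts, expected))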

  1. LAIT: a local ancestry inference toolkit.

    Science.gov (United States)

    Hui, Daniel; Fang, Zhou; Lin, Jerome; Duan, Qing; Li, Yun; Hu, Ming; Chen, Wei

    2017-09-06

    Inferring local ancestry in individuals of mixed ancestry has many applications, most notably in identifying disease-susceptible loci that vary among different ethnic groups. Many software packages are available for inferring local ancestry in admixed individuals. However, most of these existing software packages require specifically formatted input files and generate output files in various types, yielding practical inconvenience. We developed a tool set, the Local Ancestry Inference Toolkit (LAIT), which can convert standardized files into software-specific input file formats as well as standardize and summarize inference results for four popular local ancestry inference software packages: HAPMIX, LAMP, LAMP-LD, and ELAI. We tested LAIT using both simulated and real data sets and demonstrated that LAIT makes it convenient to run multiple local ancestry inference packages. In addition, we evaluated the performance of the different supported software packages, mainly focusing on inference accuracy and computational resources used. We provide a toolkit to facilitate the use of local ancestry inference software, especially for users with limited bioinformatics background.

  2. Quad channel software defined receiver for passive radar application

    Directory of Open Access Journals (Sweden)

    Pető Tamás

    2017-03-01

    Full Text Available In recent times the growing utilization of the electromagnetic environment brings passive radar research more and more to the fore. For the utilization of the wide range of illuminators of opportunity, the application of wideband radio receivers is required. At the same time the multichannel receiver structure is also of critical importance for target direction finding and interference suppression. This paper presents the development of a multichannel software defined receiver specifically for passive radar applications. One of the relevant features of the developed receiver platform is its up-to-date SoC (System on Chip) based structure, which greatly enhances the integration and signal processing capacity of the system, all while keeping the costs low. The software defined operation of the discussed receiver system is demonstrated using a DVB-T (Digital Video Broadcast - Terrestrial) signal as illuminator of opportunity. During this demonstration the multichannel capabilities of the realized system are also tested with real data using direction finding and beamforming algorithms.
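
    The core passive-radar processing step implied above, correlating a reference channel (the direct DVB-T signal) against a surveillance channel over delay and Doppler, can be sketched as follows; this is an independent NumPy illustration, not the receiver's own signal-processing code, and all parameters are made up.

        # Cross-ambiguity map between reference and surveillance channels.
        import numpy as np

        def cross_ambiguity(ref, surv, n_delays, doppler_bins, fs):
            # |CAF(delay, doppler)| for complex baseband reference/surveillance data
            n = len(ref)
            t = np.arange(n) / fs
            caf = np.empty((len(doppler_bins), n_delays))
            for i, fd in enumerate(doppler_bins):
                shifted = surv * np.exp(-2j * np.pi * fd * t)     # remove trial Doppler
                for d in range(n_delays):                         # correlate at each delay
                    caf[i, d] = np.abs(np.vdot(ref[: n - d], shifted[d:]))
            return caf

        # toy data: surveillance = reference delayed by 40 samples, Doppler-shifted 120 Hz
        fs = 1e5
        n = 4096
        rng = np.random.default_rng(0)
        ref = rng.standard_normal(n) + 1j * rng.standard_normal(n)
        t = np.arange(n) / fs
        surv = np.roll(ref, 40) * np.exp(2j * np.pi * 120.0 * t) + 0.1 * rng.standard_normal(n)

        dopplers = np.arange(-200, 201, 20)
        caf = cross_ambiguity(ref, surv, n_delays=64, doppler_bins=dopplers, fs=fs)
        dopp_idx, delay_idx = np.unravel_index(np.argmax(caf), caf.shape)
        print("detected delay bin:", delay_idx, "Doppler (Hz):", dopplers[dopp_idx])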

  3. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  4. JAVA Stereo Display Toolkit

    Science.gov (United States)

    Edmonds, Karina

    2008-01-01

    This toolkit provides a common interface for displaying graphical user interface (GUI) components in stereo using either specialized stereo display hardware (e.g., liquid crystal shutter or polarized glasses) or anaglyph display (red/blue glasses) on standard workstation displays. An application using this toolkit will work without modification in either environment, allowing stereo software to reach a wider audience without sacrificing high-quality display on dedicated hardware. The toolkit is written in Java for use with the Swing GUI Toolkit and has cross-platform compatibility. It hooks into the graphics system, allowing any standard Swing component to be displayed in stereo. It uses the OpenGL graphics library to control the stereo hardware and to perform the rendering. It also supports anaglyph and special stereo hardware using the same API (application-program interface), and has the ability to simulate color stereo in anaglyph mode by combining the red band of the left image with the green/blue bands of the right image. This is a low-level toolkit that accomplishes simply the display of components (including the JadeDisplay image display component). It does not include higher-level functions such as disparity adjustment, 3D cursor, or overlays, all of which can be built using this toolkit.
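
    The anaglyph rule described above (red band from the left-eye image, green/blue bands from the right-eye image) is simple to state in code; the sketch below uses NumPy rather than the toolkit's Java API, purely as an illustration.

        # Red/cyan anaglyph from a stereo pair of H x W x 3 uint8 images.
        import numpy as np

        def anaglyph(left_rgb: np.ndarray, right_rgb: np.ndarray) -> np.ndarray:
            out = right_rgb.copy()
            out[..., 0] = left_rgb[..., 0]      # channel 0 = red from the left-eye view
            return out                          # channels 1, 2 = green/blue from the right

        # toy example with random "images"
        left = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
        right = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
        stereo = anaglyph(left, right)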

  5. Fragment Impact Toolkit (FIT)

    Energy Technology Data Exchange (ETDEWEB)

    Shevitz, Daniel Wolf [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Key, Brian P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Garcia, Daniel B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-05

    The Fragment Impact Toolkit (FIT) is a software package used for probabilistic consequence evaluation of fragmenting sources. The typical use case for FIT is to simulate an exploding shell and evaluate the consequence on nearby objects. FIT is written in the programming language Python and is designed as a collection of interacting software modules. Each module has a function that interacts with the other modules to produce desired results.

  6. Antenna toolkit

    CERN Document Server

    Carr, Joseph

    2006-01-01

    Joe Carr has provided radio amateurs and short-wave listeners with the definitive design guide for sending and receiving radio signals with Antenna Toolkit 2nd edition.Together with the powerful suite of CD software, the reader will have a complete solution for constructing or using an antenna - bar the actual hardware! The software provides a simple Windows-based aid to carrying out the design calculations at the heart of successful antenna design. All the user needs to do is select the antenna type and set the frequency - a much more fun and less error prone method than using a con
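
    The design calculations referred to above are of the kind shown below; these are standard textbook approximations (not output of, or code from, the book's CD software).

        # Approximate element lengths for a half-wave dipole and a quarter-wave vertical.
        def half_wave_dipole_length_m(freq_mhz, velocity_factor=0.95):
            # 0.95 accounts for end effects in practical wire antennas
            wavelength = 300.0 / freq_mhz           # free-space wavelength in metres
            return velocity_factor * wavelength / 2.0

        def quarter_wave_vertical_length_m(freq_mhz, velocity_factor=0.95):
            return velocity_factor * (300.0 / freq_mhz) / 4.0

        for f in (7.1, 14.2, 28.5):                 # common amateur-band frequencies (MHz)
            print(f"{f:5.1f} MHz: dipole ~ {half_wave_dipole_length_m(f):5.2f} m, "
                  f"vertical ~ {quarter_wave_vertical_length_m(f):4.2f} m")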

  7. The development of an artificial organic networks toolkit for LabVIEW.

    Science.gov (United States)

    Ponce, Hiram; Ponce, Pedro; Molina, Arturo

    2015-03-15

    Two of the most challenging problems that scientists and researchers face when they want to experiment with new cutting-edge algorithms are the time required to encode them and the difficulty of linking them with other technologies and devices. In that sense, this article introduces the artificial organic networks toolkit for LabVIEW™ (AON-TL) from the implementation point of view. The toolkit is based on the framework provided by the artificial organic networks technique, giving it the potential to add new algorithms in the future based on this technique. Moreover, the toolkit inherits both the rapid prototyping and the easy-to-use characteristics of the LabVIEW™ software (e.g., graphical programming, transparent use of other software and devices, built-in event-driven programming for user interfaces), to make it simple for the end-user. In fact, the article describes the global architecture of the toolkit, with particular emphasis on the software implementation of the so-called artificial hydrocarbon networks algorithm. Lastly, the article includes two case studies for engineering purposes (i.e., sensor characterization) and chemistry applications (i.e., blood-brain barrier partitioning data model) to show the usage of the toolkit and the potential scalability of the artificial organic networks technique. © 2015 Wiley Periodicals, Inc.

  8. Health Equity Assessment Toolkit (HEAT): software for exploring and comparing health inequalities in countries

    Directory of Open Access Journals (Sweden)

    Ahmad Reza Hosseinpoor

    2016-10-01

    Full Text Available Abstract Background It is widely recognised that the pursuit of sustainable development cannot be accomplished without addressing inequality, or observed differences between subgroups of a population. Monitoring health inequalities allows for the identification of health topics where major group differences exist, dimensions of inequality that must be prioritised to effect improvements in multiple health domains, and also population subgroups that are multiply disadvantaged. While availability of data to monitor health inequalities is gradually improving, there is a commensurate need to increase, within countries, the technical capacity for analysis of these data and interpretation of results for decision-making. Prior efforts to build capacity have yielded demand for a toolkit with the computational ability to display disaggregated data and summary measures of inequality in an interactive and customisable fashion that would facilitate interpretation and reporting of health inequality in a given country. Methods To answer this demand, the Health Equity Assessment Toolkit (HEAT) was developed between 2014 and 2016. The software, which contains the World Health Organization's Health Equity Monitor database, allows the assessment of inequalities within a country using over 30 reproductive, maternal, newborn and child health indicators and five dimensions of inequality (economic status, education, place of residence, subnational region and child's sex, where applicable). Results/Conclusion HEAT was beta-tested in 2015 as part of ongoing capacity building workshops on health inequality monitoring. This is the first and only application of its kind; further developments are proposed to introduce an upload data feature, translate it into different languages and increase interactivity of the software. This article will present the main features and functionalities of HEAT and discuss its relevance and use for health inequality monitoring.

  9. The Reconstruction Toolkit (RTK), an open-source cone-beam CT reconstruction toolkit based on the Insight Toolkit (ITK)

    International Nuclear Information System (INIS)

    Rit, S; Vila Oliva, M; Sarrut, D; Brousmiche, S; Labarbe, R; Sharp, G C

    2014-01-01

    We propose the Reconstruction Toolkit (RTK, http://www.openrtk.org), an open-source toolkit for fast cone-beam CT reconstruction, based on the Insight Toolkit (ITK) and using GPU code extracted from Plastimatch. RTK is developed by an open consortium (see affiliations) under the non-contaminating Apache 2.0 license. The quality of the platform is daily checked with regression tests in partnership with Kitware, the company supporting ITK. Several features are already available: Elekta, Varian and IBA inputs, multi-threaded Feldkamp-Davis-Kress reconstruction on CPU and GPU, Parker short scan weighting, multi-threaded CPU and GPU forward projectors, etc. Each feature is either accessible through command line tools or C++ classes that can be included in independent software. A MIDAS community has been opened to share CatPhan datasets of several vendors (Elekta, Varian and IBA). RTK will be used in the upcoming cone-beam CT scanner developed by IBA for proton therapy rooms. Many features are under development: new input format support, iterative reconstruction, hybrid Monte Carlo / deterministic CBCT simulation, etc. RTK has been built to freely share tomographic reconstruction developments between researchers and is open for new contributions.

  10. Automated prototyping tool-kit (APT)

    OpenAIRE

    Nada, Nader; Shing, M.; Berzins, V.; Luqi

    2002-01-01

    Automated prototyping tool-kit (APT) is an integrated set of software tools that generate source programs directly from real-time requirements. The APT system uses a fifth-generation prototyping language to model the communication structure, timing constraints, I/O control, and data buffering that comprise the requirements for an embedded software system. The language supports the specification of hard real-time systems with reusable components from domain specific component libraries. APT ha...

  11. Geant4 - A Simulation Toolkit

    International Nuclear Information System (INIS)

    2002-01-01

    Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilized, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  12. GEANT4 A Simulation toolkit

    CERN Document Server

    Agostinelli, S; Amako, K; Apostolakis, John; Araújo, H M; Arce, P; Asai, M; Axen, D A; Banerjee, S; Barrand, G; Behner, F; Bellagamba, L; Boudreau, J; Broglia, L; Brunengo, A; Chauvie, S; Chuma, J; Chytracek, R; Cooperman, G; Cosmo, G; Degtyarenko, P V; Dell'Acqua, A; De Paola, G O; Dietrich, D D; Enami, R; Feliciello, A; Ferguson, C; Fesefeldt, H S; Folger, G; Foppiano, F; Forti, A C; Garelli, S; Giani, S; Giannitrapani, R; Gibin, D; Gómez-Cadenas, J J; González, I; Gracía-Abríl, G; Greeniaus, L G; Greiner, W; Grichine, V M; Grossheim, A; Gumplinger, P; Hamatsu, R; Hashimoto, K; Hasui, H; Heikkinen, A M; Howard, A; Hutton, A M; Ivanchenko, V N; Johnson, A; Jones, F W; Kallenbach, Jeff; Kanaya, N; Kawabata, M; Kawabata, Y; Kawaguti, M; Kelner, S; Kent, P; Kodama, T; Kokoulin, R P; Kossov, M; Kurashige, H; Lamanna, E; Lampen, T; Lara, V; Lefébure, V; Lei, F; Liendl, M; Lockman, W; Longo, F; Magni, S; Maire, M; Mecking, B A; Medernach, E; Minamimoto, K; Mora de Freitas, P; Morita, Y; Murakami, K; Nagamatu, M; Nartallo, R; Nieminen, P; Nishimura, T; Ohtsubo, K; Okamura, M; O'Neale, S W; O'Ohata, Y; Perl, J; Pfeiffer, A; Pia, M G; Ranjard, F; Rybin, A; Sadilov, S; Di Salvo, E; Santin, G; Sasaki, T; Savvas, N; Sawada, Y; Scherer, S; Sei, S; Sirotenko, V I; Smith, D; Starkov, N; Stöcker, H; Sulkimo, J; Takahata, M; Tanaka, S; Chernyaev, E; Safai-Tehrani, F; Tropeano, M; Truscott, P R; Uno, H; Urbàn, L; Urban, P; Verderi, M; Walkden, A; Wander, W; Weber, H; Wellisch, J P; Wenaus, T; Williams, D C; Wright, D; Yamada, T; Yoshida, H; Zschiesche, D

    2003-01-01

    Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  13. Geant4 - A Simulation Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Dennis H

    2002-08-09

    GEANT4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilized, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  14. CRISPR-Cas9 Toolkit for Actinomycete Genome Editing

    DEFF Research Database (Denmark)

    Tong, Yaojun; Robertsen, Helene Lunde; Blin, Kai

    2018-01-01

    engineering approaches for boosting known and discovering novel natural products. In order to facilitate the genome editing for actinomycetes, we developed a CRISPR-Cas9 toolkit with high efficiency for actinomycete genome editing. This basic toolkit includes software for spacer (sgRNA) identification, a system for in-frame gene/gene cluster knockout, a system for gene loss-of-function study, a system for generating a random size deletion library, and a system for gene knockdown. For the latter, a uracil-specific excision reagent (USER) cloning technology was adapted to simplify the CRISPR vector construction process. The application of this toolkit was successfully demonstrated by perturbation of genomes of Streptomyces coelicolor A3(2) and Streptomyces collinus Tü 365. The CRISPR-Cas9 toolkit and related protocol described here can be widely used for metabolic engineering of actinomycetes.

  15. TRSkit: A Simple Digital Library Toolkit

    Science.gov (United States)

    Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    This paper introduces TRSkit, a simple and effective toolkit for building digital libraries on the World Wide Web. The toolkit was developed for the creation of the Langley Technical Report Server and the NASA Technical Report Server, but is applicable to most simple distribution paradigms. TRSkit contains a handful of freely available software components designed to be run under the UNIX operating system and served via the World Wide Web. The intended customer is the person who must continuously and synchronously distribute anywhere from 100 to 100,000s of information units and does not have extensive resources to devote to the problem.

  16. Design Optimization Toolkit: Users' Manual

    Energy Technology Data Exchange (ETDEWEB)

    Aguilo Valentin, Miguel Alejandro [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computational Solid Mechanics and Structural Dynamics

    2014-07-01

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited to gradient- and nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this inherent flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  17. Toolkit for Conceptual Modeling (TCM): User's Guide and Reference

    NARCIS (Netherlands)

    Dehne, F.; Wieringa, Roelf J.

    1997-01-01

    The Toolkit for Conceptual Modeling (TCM) is a suite of graphical editors for a number of graphical notation systems that are used in software specification methods. The notations can be used to represent the conceptual structure of the software - hence the name of the suite. This manual describes

  18. Iterative software kernels

    Energy Technology Data Exchange (ETDEWEB)

    Duff, I.

    1994-12-31

    This workshop focuses on kernels for iterative software packages. Specifically, the three speakers discuss various aspects of sparse BLAS kernels. Their topics are: "Current status of user-level sparse BLAS"; "Current status of the sparse BLAS toolkit"; and "Adding matrix-matrix and matrix-matrix-matrix multiply to the sparse BLAS toolkit".

  19. Matlab based Toolkits used to Interface with Optical Design Software for NASA's James Webb Space Telescope

    Science.gov (United States)

    Howard, Joseph

    2007-01-01

    The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of Matlab toolkits including CodeV, OSLO, and Zemax Toolkits. The toolkit overview examines purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.

  20. Development of Radar Control system for Multi-mode Active Phased Array Radar for atmospheric probing

    Science.gov (United States)

    Yasodha, Polisetti; Jayaraman, Achuthan; Thriveni, A.

    2016-07-01

    Modern multi-mode active phased array radars require a highly efficient radar control system for hassle-free real-time radar operation. The requirement comes from the distributed architecture of the active phased array radar, where each antenna element in the array is connected to a dedicated Transmit-Receive (TR) module. Controlling the TR modules, which are generally a few hundred in number, and operating them in synchronisation is a huge task during real-time radar operation and should be handled with utmost care. The Indian MST Radar, located at NARL, Gadanki, which was established during the early 1990s as an outcome of the middle atmospheric program, is a remote sensing instrument for probing the atmosphere. This radar has a semi-active array, consisting of 1024 antenna elements, with limited beam steering possible only along the principal planes. To overcome these limitations and difficulties, the radar is being augmented into a fully active phased array, to accomplish beam agility and multi-mode operations. Each antenna element is excited with a dedicated 1 kW TR module, located in the field, which enables the radar beam to be positioned within a 20° conical volume. A multi-channel receiver allows the radar to operate in various modes like Doppler Beam Swinging (DBS), Spaced Antenna (SA), Frequency Domain Interferometry (FDI) etc. The present work describes the real-time radar control (RC) system for the above described active phased array radar. The radar control system consists of a Spartan 6 FPGA based Timing and Control Signal Generator (TCSG), and a computer containing the software for controlling all the subsystems of the radar during real-time radar operation and also for calibrating the radar. The main function of the TCSG is to generate the control and timing waveforms required for various subsystems of the radar. Important components of the RC system software are (i) TR module configuring software which does programming, controlling and health parameter monitoring of the
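
    The Doppler Beam Swinging (DBS) mode mentioned above retrieves the wind vector from radial velocities measured on differently pointed beams. The sketch below illustrates that retrieval for a zenith beam plus east- and north-tilted beams; it is an independent example with made-up numbers, not part of the radar control software.

        # DBS wind retrieval from three radial velocities (m/s).
        import numpy as np

        def dbs_wind(vr_zenith, vr_east, vr_north, tilt_deg):
            th = np.deg2rad(tilt_deg)
            # radial-velocity model: vr = u*sin(th)*sin(az) + v*sin(th)*cos(az) + w*cos(th)
            geometry = np.array([
                [0.0,        0.0,        1.0],          # zenith beam
                [np.sin(th), 0.0,        np.cos(th)],   # beam tilted toward east
                [0.0,        np.sin(th), np.cos(th)],   # beam tilted toward north
            ])
            vr = np.array([vr_zenith, vr_east, vr_north])
            u, v, w = np.linalg.solve(geometry, vr)
            return u, v, w

        u, v, w = dbs_wind(vr_zenith=0.12, vr_east=3.4, vr_north=-1.8, tilt_deg=10.0)
        print(f"u ~ {u:.1f} m/s, v ~ {v:.1f} m/s, w ~ {w:.2f} m/s")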

  1. The MUSOS (MUsic SOftware System) Toolkit: A computer-based, open source application for testing memory for melodies.

    Science.gov (United States)

    Rainsford, M; Palmer, M A; Paine, G

    2018-04-01

    Despite numerous innovative studies, rates of replication in the field of music psychology are extremely low (Frieler et al., 2013). Two key methodological challenges affecting researchers wishing to administer and reproduce studies in music cognition are the difficulty of measuring musical responses, particularly when conducting free-recall studies, and access to a reliable set of novel stimuli unrestricted by copyright or licensing issues. In this article, we propose a solution for these challenges in computer-based administration. We present a computer-based application for testing memory for melodies. Created using the software Max/MSP (Cycling '74, 2014a), the MUSOS (Music Software System) Toolkit uses a simple modular framework configurable for testing common paradigms such as recall, old-new recognition, and stem completion. The program is accompanied by a stimulus set of 156 novel, copyright-free melodies, in audio and Max/MSP file formats. Two pilot tests were conducted to establish the properties of the accompanying stimulus set that are relevant to music cognition and general memory research. By using this software, a researcher without specialist musical training may administer and accurately measure responses from common paradigms used in the study of memory for music.

  2. Integrated System Health Management Development Toolkit

    Science.gov (United States)

    Figueroa, Jorge; Smith, Harvey; Morris, Jon

    2009-01-01

    This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.

  3. Open source tools and toolkits for bioinformatics: significance, and where are we?

    Science.gov (United States)

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  4. The IGUANA interactive graphics toolkit with examples from CMS and D0

    International Nuclear Information System (INIS)

    Alverson, G.; Osborne, I.; Taylor, L.; Tuura, L.

    2001-01-01

    IGUANA (Interactive Graphics for User ANAlysis) is a C++ toolkit for developing graphical user interfaces and high performance 2-D and 3-D graphics applications, such as data browsers and detector and event visualisation programs. The IGUANA strategy is to use freely available software (e.g. Qt, SoQt, OpenInventor, OpenGL, HEPVis) and package and extend it to provide a general-purpose and experiment-independent toolkit. The authors describe the evaluation and choices of publicly available GUI/graphics software and the additional functionality currently provided by IGUANA. The authors demonstrate the use of IGUANA with several applications built for CMS and D0

  5. Development of a Software-Defined Radar

    Science.gov (United States)

    2017-10-01

    disrupt desired radar operation. The cognitive radar system discussed herein mitigates the effects of RFI by sensing and adapting the transmitted... [Remainder of the record is figure-caption residue describing the report's GUI: panels to present received data and plot processed data, and a calculated "flicker" rate caused by an unknown issue where blank data are received due to missed...]

  6. XPIWIT--an XML pipeline wrapper for the Insight Toolkit.

    Science.gov (United States)

    Bartschat, Andreas; Hübner, Eduard; Reischl, Markus; Mikut, Ralf; Stegmaier, Johannes

    2016-01-15

    The Insight Toolkit offers plenty of features for multidimensional image analysis. Current implementations, however, often suffer either from a lack of flexibility due to hard-coded C++ pipelines for a certain task or from slow execution times, e.g. caused by inefficient implementations or multiple read/write operations for separate filter execution. We present an XML-based wrapper application for the Insight Toolkit that combines the performance of a pure C++ implementation with an easy-to-use graphical setup of dynamic image analysis pipelines. Created XML pipelines can be interpreted and executed by XPIWIT in console mode either locally or on large clusters. We successfully applied the software tool for the automated analysis of terabyte-scale, time-resolved 3D image data of zebrafish embryos. XPIWIT is implemented in C++ using the Insight Toolkit and the Qt SDK. It has been successfully compiled and tested under Windows and Unix-based systems. Software and documentation are distributed under Apache 2.0 license and are publicly available for download at https://bitbucket.org/jstegmaier/xpiwit/downloads/. johannes.stegmaier@kit.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
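
    The XML pipelines that XPIWIT executes correspond to chains of ITK filters. For orientation only, the sketch below writes a comparable small chain directly against the SimpleITK Python bindings; this is not XPIWIT's own API, and the file names are placeholders.

        # Smooth, threshold, label and measure a 3D image with SimpleITK (illustrative only).
        import SimpleITK as sitk

        image = sitk.ReadImage("embryo_t001.tif")                    # placeholder input
        smoothed = sitk.SmoothingRecursiveGaussian(image, sigma=2.0) # denoise
        binary = sitk.OtsuThreshold(smoothed, 0, 1)                  # background=0, objects=1
        labels = sitk.ConnectedComponent(binary)                     # label connected objects
        sitk.WriteImage(sitk.Cast(labels, sitk.sitkUInt16), "embryo_t001_labels.tif")

        stats = sitk.LabelShapeStatisticsImageFilter()               # per-object measurements
        stats.Execute(labels)
        print("number of segmented objects:", len(stats.GetLabels()))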

  7. The MOLGENIS toolkit : rapid prototyping of biosoftware at the push of a button

    NARCIS (Netherlands)

    Swertz, Morris A.; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K.; Kanterakis, Alexandros; Roos, Erik T.; Lops, Joris; Thorisson, Gudmundur A.; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J.; de Brock, Engbert O.; Jansen, Ritsert C.; Parkinson, Helen

    2010-01-01

    Background: There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly

  8. Open Drug Discovery Toolkit (ODDT): a new open-source player in the drug discovery field.

    Science.gov (United States)

    Wójcikowski, Maciej; Zielenkiewicz, Piotr; Siedlecki, Pawel

    2015-01-01

    There has been huge progress in the open cheminformatics field in both methods and software development. Unfortunately, there has been little effort to unite those methods and software into one package. We here describe the Open Drug Discovery Toolkit (ODDT), which aims to fulfill the need for comprehensive and open source drug discovery software. The Open Drug Discovery Toolkit was developed as a free and open source tool for both computer aided drug discovery (CADD) developers and researchers. ODDT reimplements many state-of-the-art methods, such as machine learning scoring functions (RF-Score and NNScore) and wraps other external software to ease the process of developing CADD pipelines. ODDT is an out-of-the-box solution designed to be easily customizable and extensible. Therefore, users are strongly encouraged to extend it and develop new methods. We here present three use cases for ODDT in common tasks in computer-aided drug discovery. Open Drug Discovery Toolkit is released on a permissive 3-clause BSD license for both academic and industrial use. ODDT's source code, additional examples and documentation are available on GitHub (https://github.com/oddt/oddt).

  9. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Science.gov (United States)

    Adolf-Bryfogle, Jared; Dunbrack, Roland L

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.

  10. Pulse Doppler radar

    CERN Document Server

    Alabaster, Clive

    2012-01-01

    This book is a practitioner's guide to all aspects of pulse Doppler radar. It concentrates on airborne military radar systems since they are the most used, most complex, and most interesting of the pulse Doppler radars; however, ground-based and non-military systems are also included. It covers the fundamental science, signal processing, hardware issues, systems design and case studies of typical systems. It will be a useful resource for engineers of all types (hardware, software and systems), academics, post-graduate students, scientists in radar and radar electronic warfare sectors and milit
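
    A few of the textbook pulse-Doppler relationships such a practitioner's guide develops can be worked as a short example (generic formulas with assumed carrier frequency and PRF, not material reproduced from the book):

        # Doppler shift, unambiguous range and first blind speed for a given PRF.
        C = 3.0e8  # speed of light, m/s

        def doppler_shift_hz(radial_velocity_ms, freq_hz):
            return 2.0 * radial_velocity_ms * freq_hz / C

        def unambiguous_range_m(prf_hz):
            return C / (2.0 * prf_hz)

        def first_blind_speed_ms(prf_hz, freq_hz):
            wavelength = C / freq_hz
            return wavelength * prf_hz / 2.0        # target appears stationary at multiples of this

        freq = 10e9       # assumed X-band carrier (Hz)
        prf = 100e3       # assumed high PRF (Hz)
        print(f"300 m/s closing target -> Doppler {doppler_shift_hz(300, freq)/1e3:.0f} kHz")
        print(f"unambiguous range at {prf/1e3:.0f} kHz PRF: {unambiguous_range_m(prf)/1e3:.1f} km")
        print(f"first blind speed: {first_blind_speed_ms(prf, freq):.0f} m/s")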

  11. Commercial Building Energy Saver: An energy retrofit analysis toolkit

    International Nuclear Information System (INIS)

    Hong, Tianzhen; Piette, Mary Ann; Chen, Yixing; Lee, Sang Hoon; Taylor-Lange, Sarah C.; Zhang, Rongpeng; Sun, Kaiyu; Price, Phillip

    2015-01-01

    Highlights: • Commercial Building Energy Saver is a powerful toolkit for energy retrofit analysis. • CBES provides benchmarking, load shape analysis, and model-based retrofit assessment. • CBES covers 7 building types, 6 vintages, 16 climates, and 100 energy measures. • CBES includes a web app, API, and a database of energy efficiency performance. • CBES API can be extended and integrated with third party energy software tools. - Abstract: Small commercial buildings in the United States consume 47% of the total primary energy of the buildings sector. Retrofitting small and medium commercial buildings poses a huge challenge for owners because they usually lack the expertise and resources to identify and evaluate cost-effective energy retrofit strategies. This paper presents the Commercial Building Energy Saver (CBES), an energy retrofit analysis toolkit, which calculates the energy use of a building, identifies and evaluates retrofit measures in terms of energy savings, energy cost savings and payback. The CBES Toolkit includes a web app (APP) for end users and the CBES Application Programming Interface (API) for integrating CBES with other energy software tools. The toolkit provides a rich set of features including: (1) Energy Benchmarking providing an Energy Star score, (2) Load Shape Analysis to identify potential building operation improvements, (3) Preliminary Retrofit Analysis which uses a custom developed pre-simulated database and, (4) Detailed Retrofit Analysis which utilizes real-time EnergyPlus simulations. CBES includes 100 configurable energy conservation measures (ECMs) that encompass IAQ, technical performance and cost data, for assessing 7 different prototype buildings in 16 climate zones in California and 6 vintages. A case study of a small office building demonstrates the use of the toolkit for retrofit analysis. The development of CBES provides a new contribution to the field by providing a straightforward and uncomplicated decision

  12. Application experiences with the Globus toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    Brunett, S.

    1998-06-09

    The Globus grid toolkit is a collection of software components designed to support the development of applications for high-performance distributed computing environments, or ''computational grids'' [14]. The Globus toolkit is an implementation of a ''bag of services'' architecture, which provides application and tool developers not with a monolithic system but rather with a set of stand-alone services. Each Globus component provides a basic service, such as authentication, resource allocation, information, communication, fault detection, and remote data access. Different applications and tools can combine these services in different ways to construct ''grid-enabled'' systems. The Globus toolkit has been used to construct the Globus Ubiquitous Supercomputing Testbed, or GUSTO: a large-scale testbed spanning 20 sites and including over 4000 compute nodes for a total compute power of over 2 TFLOPS. Over the past six months, we and others have used this testbed to conduct a variety of application experiments, including multi-user collaborative environments (tele-immersion), computational steering, distributed supercomputing, and high throughput computing. The goal of this paper is to review what has been learned from these experiments regarding the effectiveness of the toolkit approach. To this end, we describe two of the application experiments in detail, noting what worked well and what worked less well. The two applications are a distributed supercomputing application, SF-Express, in which multiple supercomputers are harnessed to perform large distributed interactive simulations; and a tele-immersion application, CAVERNsoft, in which the focus is on connecting multiple people to a distributed simulated world.

  13. Tribal Green Building Toolkit

    Science.gov (United States)

    This Tribal Green Building Toolkit (Toolkit) is designed to help tribal officials, community members, planners, developers, and architects develop and adopt building codes to support green building practices. Anyone can use this toolkit!

  14. The Medical Imaging Interaction Toolkit: challenges and advances : 10 years of open-source development.

    Science.gov (United States)

    Nolden, Marco; Zelzer, Sascha; Seitel, Alexander; Wald, Diana; Müller, Michael; Franz, Alfred M; Maleike, Daniel; Fangerau, Markus; Baumhauer, Matthias; Maier-Hein, Lena; Maier-Hein, Klaus H; Meinzer, Hans-Peter; Wolf, Ivo

    2013-07-01

    The Medical Imaging Interaction Toolkit (MITK) has been available as open-source software for almost 10 years now. In this period the requirements of software systems in the medical image processing domain have become increasingly complex. The aim of this paper is to show how MITK evolved into a software system that is able to cover all steps of a clinical workflow including data retrieval, image analysis, diagnosis, treatment planning, intervention support, and treatment control. MITK provides modularization and extensibility on different levels. In addition to the original toolkit, a module system, micro services for small, system-wide features, a service-oriented architecture based on the Open Services Gateway initiative (OSGi) standard, and an extensible and configurable application framework allow MITK to be used, extended and deployed as needed. A refined software process was implemented to deliver high-quality software, ease the fulfillment of regulatory requirements, and enable teamwork in mixed-competence teams. MITK has been applied by a worldwide community and integrated into a variety of solutions, either at the toolkit level or as an application framework with custom extensions. The MITK Workbench has been released as a highly extensible and customizable end-user application. Optional support for tool tracking, image-guided therapy, diffusion imaging as well as various external packages (e.g. CTK, DCMTK, OpenCV, SOFA, Python) is available. MITK has also been used in several FDA/CE-certified applications, which demonstrates the high-quality software and rigorous development process. MITK provides a versatile platform with a high degree of modularization and interoperability and is well suited to meet the challenging tasks of today's and tomorrow's clinically motivated research.

  15. ECCE Toolkit: Prototyping Sensor-Based Interaction

    Directory of Open Access Journals (Sweden)

    Andrea Bellucci

    2017-02-01

    Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit.

  16. The Data Warehouse Lifecycle Toolkit

    CERN Document Server

    Kimball, Ralph; Thornthwaite, Warren; Mundy, Joy; Becker, Bob

    2011-01-01

    A thorough update to the industry standard for designing, developing, and deploying data warehouse and business intelligence systems. The world of data warehousing has changed remarkably since the first edition of The Data Warehouse Lifecycle Toolkit was published in 1998. In that time, the data warehouse industry has reached full maturity and acceptance, hardware and software have made staggering advances, and the techniques promoted in the premiere edition of this book have been adopted by nearly all data warehouse vendors and practitioners. In addition, the term "business intelligence" emerge

  17. Advanced processing and simulation of MRS data using the FID appliance (FID-A)-An open source, MATLAB-based toolkit.

    Science.gov (United States)

    Simpson, Robin; Devenyi, Gabriel A; Jezzard, Peter; Hennessy, T Jay; Near, Jamie

    2017-01-01

    To introduce a new toolkit for simulation and processing of magnetic resonance spectroscopy (MRS) data, and to demonstrate some of its novel features. The FID appliance (FID-A) is an open-source, MATLAB-based software toolkit for simulation and processing of MRS data. The software is designed specifically for processing data with multiple dimensions (eg, multiple radiofrequency channels, averages, spectral editing dimensions). It is equipped with functions for importing data in the formats of most major MRI vendors (eg, Siemens, Philips, GE, Agilent) and for exporting data into the formats of several common processing software packages (eg, LCModel, jMRUI, Tarquin). This paper introduces the FID-A software toolkit and uses examples to demonstrate its novel features, namely 1) the use of a spectral registration algorithm to carry out useful processing routines automatically, 2) automatic detection and removal of motion-corrupted scans, and 3) the ability to perform several major aspects of the MRS computational workflow from a single piece of software. This latter feature is illustrated through both high-level processing of in vivo GABA-edited MEGA-PRESS MRS data, as well as detailed quantum mechanical simulations to generate an accurate LCModel basis set for analysis of the same data. All of the described processing steps resulted in a marked improvement in spectral quality compared with unprocessed data. Fitting of MEGA-PRESS data using a customized basis set resulted in improved fitting accuracy compared with a generic MEGA-PRESS basis set. The FID-A software toolkit enables high-level processing of MRS data and accurate simulation of in vivo MRS experiments. Magn Reson Med 77:23-33, 2017. © 2015 Wiley Periodicals, Inc.

  18. NeuroMatic: An Integrated Open-Source Software Toolkit for Acquisition, Analysis and Simulation of Electrophysiological Data

    Science.gov (United States)

    Rothman, Jason S.; Silver, R. Angus

    2018-01-01

    Acquisition, analysis and simulation of electrophysiological properties of the nervous system require multiple software packages. This makes it difficult to conserve experimental metadata and track the analysis performed. It also complicates certain experimental approaches such as online analysis. To address this, we developed NeuroMatic, an open-source software toolkit that performs data acquisition (episodic, continuous and triggered recordings), data analysis (spike rasters, spontaneous event detection, curve fitting, stationarity) and simulations (stochastic synaptic transmission, synaptic short-term plasticity, integrate-and-fire and Hodgkin-Huxley-like single-compartment models). The merging of a wide range of tools into a single package facilitates a more integrated style of research, from the development of online analysis functions during data acquisition, to the simulation of synaptic conductance trains during dynamic-clamp experiments. Moreover, NeuroMatic has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions, a history window for reviewing the user's workflow and the ability to produce publication-quality graphics. Since its original release, NeuroMatic has been used in a wide range of scientific studies and its user base has grown considerably. NeuroMatic version 3.0 can be found at http://www.neuromatic.thinkrandom.com and https://github.com/SilverLabUCL/NeuroMatic. PMID:29670519

  19. The MicroAnalysis Toolkit: X-ray Fluorescence Image Processing Software

    International Nuclear Information System (INIS)

    Webb, S. M.

    2011-01-01

    The MicroAnalysis Toolkit is an analysis suite designed for the processing of x-ray fluorescence microprobe data. The program contains a wide variety of analysis tools, including image maps, correlation plots, simple image math, image filtering, multiple energy image fitting, semi-quantitative elemental analysis, x-ray fluorescence spectrum analysis, principal component analysis, and tomographic reconstructions. To be as widely useful as possible, data formats from many synchrotron sources can be read by the program with more formats available by request. An overview of the most common features will be presented.

  20. Web-based Toolkit for Dynamic Generation of Data Processors

    Science.gov (United States)

    Patel, J.; Dascalu, S.; Harris, F. C.; Benedict, K. K.; Gollberg, G.; Sheneman, L.

    2011-12-01

    All computation-intensive scientific research uses structured datasets, including hydrology and all other types of climate-related research. When it comes to testing their hypotheses, researchers might use the same dataset differently, and modify, transform, or convert it to meet their research needs. Currently, many researchers spend a good amount of time performing data processing and building tools to speed up this process. They might routinely repeat the same process activities for new research projects, spending precious time that otherwise could be dedicated to analyzing and interpreting the data. Numerous tools are available to run tests on prepared datasets and many of them work with datasets in different formats. However, there is still a significant need for applications that can comprehensively handle data transformation and conversion activities and help prepare the various processed datasets required by the researchers. We propose a web-based application (a software toolkit) that dynamically generates data processors capable of performing data conversions, transformations, and customizations based on user-defined mappings and selections. As a first step, the proposed solution allows the users to define various data structures and, in the next step, can select various file formats and data conversions for their datasets of interest. In a simple scenario, the core of the proposed web-based toolkit allows the users to define direct mappings between input and output data structures. The toolkit will also support defining complex mappings involving the use of pre-defined sets of mathematical, statistical, date/time, and text manipulation functions. Furthermore, the users will be allowed to define logical cases for input data filtering and sampling. At the end of the process, the toolkit is designed to generate reusable source code and executable binary files for download and use by the scientists. The application is also designed to store all data
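
    The core idea described above is a user-defined mapping between input and output data structures, optionally combined with transformation functions. The following minimal Python sketch illustrates that idea; the field names, units and functions are purely illustrative and are not part of the proposed toolkit.

```python
# Minimal sketch of a mapping-driven data processor of the kind described
# above: each output field is defined by an input field plus an optional
# transformation. Field names and functions are illustrative only.
import csv
import io

MAPPING = {
    # output column: (input column, transform)
    "site_id":  ("station", str.strip),
    "temp_c":   ("temp_f", lambda v: round((float(v) - 32.0) * 5.0 / 9.0, 2)),
    "flow_m3s": ("flow_cfs", lambda v: round(float(v) * 0.0283168, 4)),
}

def convert(rows):
    """Apply the user-defined mapping to an iterable of input dicts."""
    for row in rows:
        yield {out: fn(row[src]) for out, (src, fn) in MAPPING.items()}

raw = "station,temp_f,flow_cfs\nUSGS-01, 68.0, 120\n"
reader = csv.DictReader(io.StringIO(raw))
for converted in convert(reader):
    print(converted)
```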

  1. chemf: A purely functional chemistry toolkit.

    Science.gov (United States)

    Höck, Stefan; Riedl, Rainer

    2012-12-20

    Although programming in a type-safe and referentially transparent style offers several advantages over working with mutable data structures and side effects, this style of programming has not seen much use in chemistry-related software. Since functional programming languages were designed with referential transparency in mind, these languages offer a lot of support when writing immutable data structures and side-effects free code. We therefore started implementing our own toolkit based on the above programming paradigms in a modern, versatile programming language. We present our initial results with functional programming in chemistry by first describing an immutable data structure for molecular graphs together with a couple of simple algorithms to calculate basic molecular properties before writing a complete SMILES parser in accordance with the OpenSMILES specification. Along the way we show how to deal with input validation, error handling, bulk operations, and parallelization in a purely functional way. At the end we also analyze and improve our algorithms and data structures in terms of performance and compare it to existing toolkits both object-oriented and purely functional. All code was written in Scala, a modern multi-paradigm programming language with a strong support for functional programming and a highly sophisticated type system. We have successfully made the first important steps towards a purely functional chemistry toolkit. The data structures and algorithms presented in this article perform well while at the same time they can be safely used in parallelized applications, such as computer aided drug design experiments, without further adjustments. This stands in contrast to existing object-oriented toolkits where thread safety of data structures and algorithms is a deliberate design decision that can be hard to implement. Finally, the level of type-safety achieved by Scala highly increased the reliability of our code as well as the productivity of
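
    The toolkit described above is written in Scala; to keep a single example language in this document, the sketch below only illustrates the underlying idea in Python: an immutable molecular-graph value whose "modifications" return new values instead of mutating state.

```python
# Python illustration of the immutability idea described above (the toolkit
# itself is written in Scala): a hashable, frozen molecular-graph value that
# can be shared between threads without defensive copying.
from dataclasses import dataclass

@dataclass(frozen=True)
class Atom:
    symbol: str
    charge: int = 0

@dataclass(frozen=True)
class Bond:
    a: int          # index of first atom
    b: int          # index of second atom
    order: int = 1

@dataclass(frozen=True)
class Molecule:
    atoms: tuple    # tuple of Atom
    bonds: tuple    # tuple of Bond

    def add_atom(self, atom):
        # "Mutation" returns a new value; the original molecule is untouched.
        return Molecule(self.atoms + (atom,), self.bonds)

water = Molecule(atoms=(Atom("O"), Atom("H"), Atom("H")),
                 bonds=(Bond(0, 1), Bond(0, 2)))
print(len(water.add_atom(Atom("H")).atoms), len(water.atoms))  # 4 3
```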

  2. Practical computational toolkits for dendrimers and dendrons structure design

    Science.gov (United States)

    Martinho, Nuno; Silva, Liana C.; Florindo, Helena F.; Brocchini, Steve; Barata, Teresa; Zloh, Mire

    2017-09-01

    Dendrimers and dendrons offer an excellent platform for developing novel drug delivery systems and medicines. The rational design and further development of these repetitively branched systems are restricted by difficulties in scalable synthesis and structural determination, which can be overcome by judicious use of molecular modelling and molecular simulations. A major difficulty in utilising in silico studies to design dendrimers lies in the laborious generation of their structures. Current modelling tools utilise automated assembly of simpler dendrimers or the inefficient manual assembly of monomer precursors to generate more complicated dendrimer structures. Herein we describe two novel graphical user interface toolkits written in Python that provide an improved degree of automation for rapid assembly of dendrimers and generation of their 2D and 3D structures. Our first toolkit uses the RDKit library, SMILES nomenclature of monomers and SMARTS reaction nomenclature to generate SMILES and mol files of dendrimers without 3D coordinates. These files are used for simple graphical representations and storing their structures in databases. The second toolkit assembles complex topology dendrimers from monomers to construct 3D dendrimer structures to be used as starting points for simulation using existing and widely available software and force fields. Both tools were validated for ease of use in prototyping dendrimer structures, and the second toolkit was especially relevant for dendrimers of high complexity and size.
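
    The first toolkit described above builds dendrimer structures from monomer SMILES and SMARTS reaction definitions via RDKit. The sketch below shows that general pattern with a generic amide-coupling reaction; the SMARTS and monomers are textbook examples, not the toolkit's own building rules.

```python
# Sketch of the RDKit-based pattern described above: assemble a larger
# structure from monomer SMILES using a SMARTS reaction. The amide-coupling
# SMARTS below is a generic example, not the toolkit's own rule set.
from rdkit import Chem
from rdkit.Chem import AllChem

acid  = Chem.MolFromSmiles("CC(=O)O")    # monomer with a carboxylic acid
amine = Chem.MolFromSmiles("NCCN")       # monomer with amine "branch points"

# Generic amide-bond formation written as a SMARTS reaction.
coupling = AllChem.ReactionFromSmarts("[C:1](=O)[OH].[NX3;H2:2]>>[C:1](=O)[N:2]")

products = coupling.RunReactants((acid, amine))
for (prod,) in products:
    Chem.SanitizeMol(prod)
    print(Chem.MolToSmiles(prod))        # e.g. CC(=O)NCCN
```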

  3. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. Explains two maintenance standards: IEEE/EIA 1219 and ISO/IEC 14764. Discusses several commercial reverse and domain engineering toolkits. Slides for instructors are available online. Information is based on the IEEE SWEBOK (Software Engineering Body of Knowledge).

  4. Microgrid Design Toolkit (MDT) User Guide Software v1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    The Microgrid Design Toolkit (MDT) supports decision analysis for new ("greenfield") microgrid designs as well as microgrids with existing infrastructure. The current version of MDT includes two main capabilities. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new, grid connected microgrid in the early stages of the design process. MSC is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on designing a microgrid for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and Performance Reliability Model (PRM).

  5. Radar signal analysis and processing using Matlab

    CERN Document Server

    Mahafza, Bassem R

    2008-01-01

    Offering radar-related software for the analysis and design of radar waveforms and signal processing, this book provides comprehensive coverage of radar signals and signal processing techniques and algorithms. It contains numerous graphical plots, common radar-related functions, table format outputs, and end-of-chapter problems. The complete set of MATLAB® functions and routines is available for download online.
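
    The book above ships its routines in MATLAB; as a hedged NumPy analogue of the kind of waveform analysis it covers, the sketch below generates a linear-FM (chirp) pulse and compresses it with a matched filter. Parameter values are arbitrary examples.

```python
# NumPy analogue of a typical pulse-compression exercise: generate a linear-FM
# (chirp) pulse and apply a matched filter. Parameter values are arbitrary.
import numpy as np

fs = 10e6          # sample rate [Hz]
tau = 20e-6        # pulse width [s]
bandwidth = 2e6    # swept bandwidth [Hz]

t = np.arange(0, tau, 1 / fs)
k = bandwidth / tau                        # chirp rate [Hz/s]
chirp = np.exp(1j * np.pi * k * t**2)      # baseband LFM pulse

# Matched filter = time-reversed complex conjugate of the transmitted pulse.
matched = np.conj(chirp[::-1])
compressed = np.convolve(chirp, matched)

print("time-bandwidth product:", bandwidth * tau)
print("peak of compressed output at sample",
      int(np.argmax(np.abs(compressed))), "of", compressed.size)
```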

  6. The MOLGENIS toolkit: rapid prototyping of biosoftware at the push of a button.

    Science.gov (United States)

    Swertz, Morris A; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K; Kanterakis, Alexandros; Roos, Erik T; Lops, Joris; Thorisson, Gudmundur A; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J; de Brock, Engbert O; Jansen, Ritsert C; Parkinson, Helen

    2010-12-21

    There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly produce the bespoke MOLecular GENetics Information Systems needed. The MOLGENIS toolkit provides bioinformaticians with a simple language to model biological data structures and user interfaces. At the push of a button, MOLGENIS' generator suite automatically translates these models into a feature-rich, ready-to-use web application including database, user interfaces, exchange formats, and scriptable interfaces. Each generator is a template of SQL, JAVA, R, or HTML code that would require much effort to write by hand. This 'model-driven' method ensures reuse of best practices and improves quality because the modeling language and generators are shared between all MOLGENIS applications, so that errors are found quickly and improvements are shared easily by a re-generation. A plug-in mechanism ensures that both the generator suite and generated product can be customized just as much as hand-written software. In recent years we have successfully evaluated the MOLGENIS toolkit for the rapid prototyping of many types of biomedical applications, including next-generation sequencing, GWAS, QTL, proteomics and biobanking. Writing 500 lines of model XML typically replaces 15,000 lines of hand-written programming code, which allows for quick adaptation if the information system is not yet to the biologist's satisfaction. Each application generated with MOLGENIS comes with an optimized database back-end, user interfaces for biologists to manage and exploit their data, programming interfaces for bioinformaticians to script analysis tools in R, Java, SOAP, REST/JSON and RDF, a tab-delimited file format to ease upload and exchange of data, and detailed technical documentation. Existing databases

  7. Perl Template Toolkit

    CERN Document Server

    Chamberlain, Darren; Cross, David; Torkington, Nathan; Diaz, tatiana Apandi

    2004-01-01

    Among the many different approaches to "templating" with Perl--such as Embperl, Mason, HTML::Template, and hundreds of other lesser known systems--the Template Toolkit is widely recognized as one of the most versatile. Like other templating systems, the Template Toolkit allows programmers to embed Perl code and custom macros into HTML documents in order to create customized documents on the fly. But unlike the others, the Template Toolkit is as facile at producing HTML as it is at producing XML, PDF, or any other output format. And because it has its own simple templating language, templates

  8. Accelerator physics analysis with an integrated toolkit

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.; Satogata, T.

    1992-08-01

    Work is in progress on an integrated software toolkit for linear and nonlinear accelerator design, analysis, and simulation. As a first application, the ''beamline'' and ''MXYZPTLK'' (differential algebra) class libraries were used with an X Windows graphics library to build a user-friendly, interactive phase-space tracker which, additionally, finds periodic orbits. This program was used to analyse a theoretical lattice containing octupoles and decapoles, to find the 20th-order stable and unstable periodic orbits, and to explore the local phase-space structure

  9. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability, further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  10. State transition storyboards: A tool for designing the Goldstone solar system radar data acquisition system user interface software

    Science.gov (United States)

    Howard, S. D.

    1987-01-01

    Effective user interface design in software systems is a complex task that takes place without adequate modeling tools. By combining state transition diagrams and the storyboard technique of filmmakers, State Transition Storyboards were developed to provide a detailed modeling technique for the Goldstone Solar System Radar Data Acquisition System human-machine interface. Illustrations are included with a description of the modeling technique.
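
    As an illustration of the modeling idea described above, the sketch below encodes a small state transition diagram for a user interface as a lookup table and walks it with a sequence of events; the states and events are invented and are not taken from the Goldstone system.

```python
# Toy illustration of driving a user interface from a state transition table,
# in the spirit of the modeling technique described above. The states and
# events are invented for this example.
TRANSITIONS = {
    ("IDLE", "start_pass"):      "CONFIGURING",
    ("CONFIGURING", "confirm"):  "RECORDING",
    ("CONFIGURING", "cancel"):   "IDLE",
    ("RECORDING", "stop_pass"):  "IDLE",
}

def step(state, event):
    """Return the next state, or stay put (with a warning) on an illegal event."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        print(f"event {event!r} not allowed in state {state!r}")
        return state

state = "IDLE"
for event in ["start_pass", "confirm", "stop_pass", "stop_pass"]:
    state = step(state, event)
    print(event, "->", state)
```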

  11. WING/WORLD: An Open Experimental Toolkit for the Design and Deployment of IEEE 802.11-Based Wireless Mesh Networks Testbeds

    Directory of Open Access Journals (Sweden)

    Daniele Miorandi

    2010-01-01

    Wireless Mesh Networks represent an interesting instance of light-infrastructure wireless networks. Due to their flexibility and resiliency to network failures, wireless mesh networks are particularly suitable for incremental and rapid deployments of wireless access networks in both metropolitan and rural areas. This paper illustrates the design and development of an open toolkit aimed at supporting the design of different solutions for wireless mesh networking by enabling real evaluation, validation, and demonstration. The resulting testbed is based on off-the-shelf hardware components and open-source software and is focused on IEEE 802.11 commodity devices. The software toolkit is based on an “open” philosophy and aims at providing the scientific community with a tool for effective and reproducible performance analysis of WMNs. The paper describes the architecture of the toolkit, and its core functionalities, as well as its potential evolutions.

  12. Mars, accessing the third dimension: a software tool to exploit Mars ground penetrating radars data.

    Science.gov (United States)

    Cantini, Federico; Ivanov, Anton B.

    2016-04-01

    The Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS), on board ESA's Mars Express, and the SHAllow RADar (SHARAD), on board NASA's Mars Reconnaissance Orbiter, are two ground penetrating radars (GPRs) designed to probe the crust of Mars and explore the subsurface structure of the planet. They have now been collecting data for about 10 years, covering a large fraction of the Mars surface. As on Earth, these GPRs collect data by sending electromagnetic (EM) pulses toward the surface and listening to the return echoes produced at dielectric discontinuities on the planet's surface and subsurface. The wavelengths used allow MARSIS EM pulses to penetrate the crust for several kilometers. The data products (radargrams) are matrices where the x-axis spans different sampling points on the planet's surface and the y-axis is the power of the echoes over time in the listening window. No standard way to manage this kind of data is established in the planetary science community, and data analysis and interpretation very often require some knowledge of radar signal processing. Our software tool aims to ease access to these data, in particular for scientists without a specific background in signal processing. MARSIS and SHARAD geometrical data, such as probing point latitude and longitude and spacecraft altitude, are stored, together with relevant acquisition metadata, in a geo-enabled relational database implemented using PostgreSQL and PostGIS. Data are extracted from officially released ESA and NASA products using self-developed Python classes and scripts and inserted into the database using OGR utilities. This software is also intended to be the core of a collection of classes and scripts implementing more complex GPR data analysis. Geometrical data and metadata are exposed as WFS layers using a QGIS server, which can be further integrated with other data, such as imaging, spectroscopy and topography. Radar geometry data will be available as a part of the iMars Web
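
    A radargram, as described above, is simply a matrix of echo power versus along-track sample and delay time. The sketch below renders such a matrix with NumPy and Matplotlib using synthetic data; loading real MARSIS or SHARAD products from PDS files is outside its scope.

```python
# Hedged sketch of displaying a radargram of the kind described above: a 2D
# array of echo power versus along-track sample and delay time. The array is
# synthetic; real MARSIS/SHARAD products would be read from PDS files instead.
import numpy as np
import matplotlib.pyplot as plt

n_traces, n_samples = 500, 256
rng = np.random.default_rng(0)
radargram = rng.normal(0.0, 1.0, (n_samples, n_traces)) ** 2  # noise floor

# Fake a surface echo and a weaker subsurface reflector.
surface = 60 + (10 * np.sin(np.linspace(0, 3, n_traces))).astype(int)
for i, s in enumerate(surface):
    radargram[s, i] += 80.0        # bright surface return
    radargram[s + 40, i] += 20.0   # fainter subsurface return

plt.imshow(10 * np.log10(radargram), aspect="auto", cmap="gray",
           extent=[0, n_traces, n_samples, 0])
plt.xlabel("along-track sample")
plt.ylabel("delay sample")
plt.colorbar(label="echo power [dB]")
plt.title("Synthetic radargram")
plt.show()
```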

  13. RAVE-a Detector-independent vertex reconstruction toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Waltenberger, Wolfgang [Institute of High Energy Physics, Austrian Academy of Sciences A-1050 Vienna (Austria)], E-mail: walten@hephy.oeaw.ac.at; Mitaroff, Winfried; Moser, Fabian [Institute of High Energy Physics, Austrian Academy of Sciences A-1050 Vienna (Austria)

    2007-10-21

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  14. RAVE-a Detector-independent vertex reconstruction toolkit

    International Nuclear Information System (INIS)

    Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian

    2007-01-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available

  15. Pydpiper: a flexible toolkit for constructing novel registration pipelines.

    Science.gov (United States)

    Friedel, Miriam; van Eede, Matthijs C; Pipitone, Jon; Chakravarty, M Mallar; Lerch, Jason P

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.
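
    Two of the Pydpiper features listed above are a file-handling class that keeps every stage's outputs accessible and the elimination of duplicate stages. The sketch below is a conceptual Python illustration of those two ideas only; it is not Pydpiper's actual API, and the command strings are placeholders.

```python
# Conceptual sketch (not Pydpiper's actual API) of two ideas from the abstract:
# a pipeline that records every stage's output files and silently collapses
# duplicate stages that would redo identical work.
class Stage:
    def __init__(self, cmd, inputs, outputs):
        self.cmd, self.inputs, self.outputs = cmd, tuple(inputs), tuple(outputs)

    def key(self):
        return (self.cmd, self.inputs, self.outputs)

class Pipeline:
    def __init__(self):
        self._stages = {}    # key -> Stage (duplicates collapse here)
        self.outputs = {}    # stage name -> output files, queryable later

    def add(self, name, stage):
        if stage.key() in self._stages:
            return self._stages[stage.key()]   # duplicate stage eliminated
        self._stages[stage.key()] = stage
        self.outputs[name] = stage.outputs
        return stage

p = Pipeline()
p.add("lsq6_a", Stage("register a.mnc b.mnc", ["a.mnc", "b.mnc"], ["a_to_b.xfm"]))
p.add("lsq6_dup", Stage("register a.mnc b.mnc", ["a.mnc", "b.mnc"], ["a_to_b.xfm"]))
print(len(p.outputs), "unique stage(s)")   # 1
print(p.outputs)
```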

  16. Toolkit Design for Interactive Structured Graphics

    National Research Council Canada - National Science Library

    Bederson, Benjamin B; Grosjean, Jesse; Meyer, Jon

    2003-01-01

    .... We compare hand-crafted custom code to polylithic and monolithic toolkit-based solutions. Polylithic toolkits follow a design philosophy similar to 3D scene graphs supported by toolkits including Java3D and OpenInventor...

  17. Atrioventricular junction (AVJ) motion tracking: a software tool with ITK/VTK/Qt.

    Science.gov (United States)

    Pengdong Xiao; Shuang Leng; Xiaodan Zhao; Hua Zou; Ru San Tan; Wong, Philip; Liang Zhong

    2016-08-01

    The quantitative measurement of Atrioventricular Junction (AVJ) motion is an important index of ventricular function over one cardiac cycle, including systole and diastole. In this paper, a software tool that can conduct AVJ motion tracking from cardiovascular magnetic resonance (CMR) images is presented, built using the Insight Segmentation and Registration Toolkit (ITK), the Visualization Toolkit (VTK) and Qt. The software tool is written in C++ using the Visual Studio Community 2013 integrated development environment (IDE), which contains both an editor and a Microsoft compiler. The software package has been successfully implemented. From this software engineering practice, it is concluded that ITK, VTK, and Qt are very handy software systems for implementing automatic image analysis functions for CMR images, such as the quantitative measurement of motion by visual tracking.

  18. Penn State Radar Systems: Implementation and Observations

    Science.gov (United States)

    Urbina, J. V.; Seal, R.; Sorbello, R.; Kuyeng, K.; Dyrud, L. P.

    2014-12-01

    Software Defined Radio/Radar (SDR) platforms have become increasingly popular as researchers, hobbyists, and the military seek more efficient and cost-effective means for radar construction and operation. SDR platforms, by definition, utilize a software-based interface for configuration in contrast to traditional, hard-wired platforms. In an effort to provide new and improved radar sensing capabilities, Penn State has been developing advanced instruments and technologies for future radars, with primary objectives of making such instruments more capable, portable, and more cost effective. This paper will describe the design and implementation of two low-cost radar systems and their deployment in ionospheric research at both low and mid-latitudes. One radar has been installed near the Penn State campus, University Park, Pennsylvania (77.97°W, 40.70°N), to make continuous observations of meteors and mid-latitude plasma irregularities. The second radar is being installed in Huancayo (12.05°S, 75.33°W), Peru, and is capable of detecting E and F region plasma irregularities as well as meteor reflections. In this paper, we examine and compare the diurnal and seasonal variability of specular, non-specular, and head echoes collected with these two new radar systems and discuss sampling biases of each meteor observation technique. We report our current efforts to validate and calibrate these radar systems with other VHF radars such as Jicamarca and SOUSY. We also present the general characteristics of continuous measurements of E-region and F-region coherent echoes using these modern radar systems and compare them with coherent radar events observed at other geographic mid-latitude radar stations.

  19. A new open-source Python-based Space Weather data access, visualization, and analysis toolkit

    Science.gov (United States)

    de Larquier, S.; Ribeiro, A.; Frissell, N. A.; Spaleta, J.; Kunduri, B.; Thomas, E. G.; Ruohoniemi, J.; Baker, J. B.

    2013-12-01

    Space weather research relies heavily on combining and comparing data from multiple observational platforms. Current frameworks exist to aggregate some of the data sources, most based on file downloads via web or ftp interfaces. Empirical models are mostly fortran based and lack interfaces with more useful scripting languages. In an effort to improve data and model access, the SuperDARN community has been developing a Python-based Space Science Data Visualization Toolkit (DaViTpy). At the center of this development was a redesign of how our data (from 30 years of SuperDARN radars) was made available. Several access solutions are now wrapped into one convenient Python interface which probes local directories, a new remote NoSQL database, and an FTP server to retrieve the requested data based on availability. Motivated by the efficiency of this interface and the inherent need for data from multiple instruments, we implemented similar modules for other space science datasets (POES, OMNI, Kp, AE...), and also included fundamental empirical models with Python interfaces to enhance data analysis (IRI, HWM, MSIS...). All these modules and more are gathered in a single convenient toolkit, which is collaboratively developed and distributed using Github and continues to grow. While still in its early stages, we expect this toolkit will facilitate multi-instrument space weather research and improve scientific productivity.
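
    The data-access design described above probes local directories, a remote database and an FTP server behind one interface. The sketch below shows that tiered-fallback pattern in generic Python; the function names, file layout and example radar code are invented and are not DaViTpy's real interface.

```python
# Generic sketch of the tiered data access described above: try local files
# first, then a remote database, then an FTP mirror. Function and path names
# are invented for illustration and are not DaViTpy's real interface.
import os

def _from_local(day, radar):
    path = f"/data/superdarn/{day:%Y%m%d}.{radar}.fitacf"   # assumed layout
    return path if os.path.exists(path) else None

def _from_database(day, radar):
    return None      # placeholder: query a remote (e.g. NoSQL) store here

def _from_ftp(day, radar):
    return None      # placeholder: download from an FTP mirror here

def fetch(day, radar):
    """Return a local file for (day, radar), trying each source in turn."""
    for source in (_from_local, _from_database, _from_ftp):
        result = source(day, radar)
        if result is not None:
            return result
    raise FileNotFoundError(f"no data for {radar} on {day:%Y-%m-%d}")

if __name__ == "__main__":
    import datetime
    try:
        print(fetch(datetime.date(2013, 5, 1), "bks"))
    except FileNotFoundError as exc:
        print(exc)
```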

  20. A Toolkit to Study Sensitivity of the Geant4 Predictions to the Variations of the Physics Model Parameters

    Energy Technology Data Exchange (ETDEWEB)

    Fields, Laura [Fermilab; Genser, Krzysztof [Fermilab; Hatcher, Robert [Fermilab; Kelsey, Michael [SLAC; Perdue, Gabriel [Fermilab; Wenzel, Hans [Fermilab; Wright, Dennis H. [SLAC; Yarba, Julia [Fermilab

    2017-08-21

    Geant4 is the leading detector simulation toolkit used in high energy physics to design detectors and to optimize calibration and reconstruction software. It employs a set of carefully validated physics models to simulate interactions of particles with matter across a wide range of interaction energies. These models, especially the hadronic ones, rely largely on directly measured cross-sections and phenomenological predictions with physically motivated parameters estimated by theoretical calculation or measurement. Because these models are tuned to cover a very wide range of possible simulation tasks, they may not always be optimized for a given process or a given material. This raises several critical questions, e.g. how sensitive Geant4 predictions are to the variations of the model parameters, or what uncertainties are associated with a particular tune of a Geant4 physics model, or a group of models, or how to consistently derive guidance for Geant4 model development and improvement from a wide range of available experimental data. We have designed and implemented a comprehensive, modular, user-friendly software toolkit to study and address such questions. It allows one to easily modify parameters of one or several Geant4 physics models involved in the simulation, and to perform collective analysis of multiple variants of the resulting physics observables of interest and comparison against a variety of corresponding experimental data. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented and illustrated with results obtained with Geant4 key hadronic models.

  1. BIT: Biosignal Igniter Toolkit.

    Science.gov (United States)

    da Silva, Hugo Plácido; Lourenço, André; Fred, Ana; Martins, Raúl

    2014-06-01

    The study of biosignals has had a transforming role in multiple aspects of our society, which go well beyond the health sciences domains with which they were traditionally associated. While biomedical engineering is a classical discipline where the topic is amply covered, today biosignals are a matter of interest for students, researchers and hobbyists in areas including computer science, informatics and electrical engineering, among others. Regardless of the context, the use of biosignals in experimental activities and practical projects is heavily bounded by cost and by limited access to adequate support materials. In this paper we present an accessible yet versatile toolkit, composed of low-cost hardware and software, which was created to reinforce the engagement of different people in the field of biosignals. The hardware consists of a modular wireless biosignal acquisition system that can be used to support classroom activities, interface with other devices, or perform rapid prototyping of end-user applications. The software comprises a set of programming APIs, a biosignal processing toolbox, and a framework for real-time data acquisition and postprocessing. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. The Automatic Parallelisation of Scientific Application Codes Using a Computer Aided Parallelisation Toolkit

    Science.gov (United States)

    Ierotheou, C.; Johnson, S.; Leggett, P.; Cross, M.; Evans, E.; Jin, Hao-Qiang; Frumkin, M.; Yan, J.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. Historically, the lack of a programming standard for using directives and the rather limited performance due to poor scalability have affected the take-up of this programming model. Significant progress has been made in hardware and software technologies; as a result, the performance of parallel programs using compiler directives has also improved. The introduction of an industrial standard for shared-memory programming with directives, OpenMP, has also addressed the issue of portability. In this study, we have extended the computer aided parallelization toolkit (developed at the University of Greenwich) to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline the way in which loop types are categorized and how efficient OpenMP directives can be defined and placed using the in-depth interprocedural analysis that is carried out by the toolkit. We also discuss the application of the toolkit to the NAS Parallel Benchmarks and a number of real-world application codes. This work not only demonstrates the great potential of using the toolkit to quickly parallelize serial programs but also the good performance achievable on up to 300 processors for hybrid message passing and directive-based parallelizations.

  3. X-CSIT: a toolkit for simulating 2D pixel detectors

    Science.gov (United States)

    Joy, A.; Wing, M.; Hauf, S.; Kuster, M.; Rüter, T.

    2015-04-01

    A new, modular toolkit for creating simulations of 2D X-ray pixel detectors, X-CSIT (X-ray Camera SImulation Toolkit), is being developed. The toolkit uses three sequential simulations of detector processes which model photon interactions, electron charge cloud spreading with a high charge density plasma model and common electronic components used in detector readout. In addition, because of the wide variety in pixel detector design, X-CSIT has been designed as a modular platform so that existing functions can be modified or additional functionality added if the specific design of a detector demands it. X-CSIT will be used to create simulations of the detectors at the European XFEL, including three bespoke 2D detectors: the Adaptive Gain Integrating Pixel Detector (AGIPD), Large Pixel Detector (LPD) and DePFET Sensor with Signal Compression (DSSC). These simulations will be used by the detector group at the European XFEL for detector characterisation and calibration. For this purpose, X-CSIT has been integrated into the European XFEL's software framework, Karabo. This will further make it available to users to aid with the planning of experiments and analysis of data. In addition, X-CSIT will be released as a standalone, open source version for other users, collaborations and groups intending to create simulations of their own detectors.

  4. Applications toolkit for accelerator control and analysis

    International Nuclear Information System (INIS)

    Borland, M.

    1997-01-01

    The Advanced Photon Source (APS) has taken a unique approach to creating high-level software applications for accelerator operation and analysis. The approach is based on self-describing data, modular program toolkits, and scripts. Self-describing data provide a communication standard that aids the creation of modular program toolkits by allowing compliant programs to be used in essentially arbitrary combinations. These modular programs can be used as part of an arbitrary number of high-level applications. At APS, a group of about 70 data analysis, manipulation, and display tools is used in concert with about 20 control-system-specific tools to implement applications for commissioning and operations. High-level applications are created using scripts, which are relatively simple interpreted programs. The Tcl/Tk scripting language is used, allowing the creation of graphical user interfaces (GUIs) and a library of algorithms that are separate from the interface. This last factor allows greater automation of control by making it easy to take the human out of the loop. Applications of this methodology to operational tasks such as orbit correction, configuration management, and data review will be discussed.

  5. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    Science.gov (United States)

    Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.

    2008-07-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  6. Autonomous Non-Linear Classification of LPI Radar Signal Modulations

    National Research Council Canada - National Science Library

    Gulum, Taylan O

    2007-01-01

    ...) radar modulations is investigated. A software engineering architecture that allows a full investigation of various preprocessing algorithms and classification techniques is applied to a database of important LPI radar waveform...

  7. Business/Employers Influenza Toolkit

    Centers for Disease Control (CDC) Podcasts

    This podcast promotes the "Make It Your Business To Fight The Flu" toolkit for Businesses and Employers. The toolkit provides information and recommended strategies to help businesses and employers promote the seasonal flu vaccine. Additionally, employers will find flyers, posters, and other materials to post and distribute in the workplace.

  8. Improving Code Quality of the Compact Muon Solenoid Electromagnetic Calorimeter Control Software to Increase System Maintainability

    CERN Multimedia

    Holme, Oliver; Dissertori, Günther; Djambazov, Lubomir; Lustermann, Werner; Zelepoukine, Serguei

    2013-01-01

    The Detector Control System (DCS) software of the Electromagnetic Calorimeter (ECAL) of the Compact Muon Solenoid (CMS) experiment at CERN is designed primarily to enable safe and efficient operation of the detector during Large Hadron Collider (LHC) data-taking periods. Through a manual analysis of the code and the adoption of ConQAT [1], a software quality assessment toolkit, the CMS ECAL DCS team has made significant progress in reducing complexity and improving code quality, with observable results in terms of a reduction in the effort dedicated to software maintenance. This paper explains the methodology followed, including the motivation to adopt ConQAT, the specific details of how this toolkit was used and the outcomes that have been achieved. [1] ConQAT, Continuous Quality Assessment Toolkit; https://www.conqat.org/

  9. Development and formative evaluation of the e-Health Implementation Toolkit (e-HIT)

    Directory of Open Access Journals (Sweden)

    Mair Frances

    2010-10-01

    Background: The use of Information and Communication Technology (ICT) or e-Health is seen as essential for a modern, cost-effective health service. However, there are well documented problems with the implementation of e-Health initiatives, despite the existence of a great deal of research into how best to implement e-Health (an example of the gap between research and practice). This paper reports on the development and formative evaluation of an e-Health Implementation Toolkit (e-HIT) which aims to summarise and synthesise new and existing research on implementation of e-Health initiatives, and present it to senior managers in a user-friendly format. Results: The content of the e-HIT was derived by combining data from a systematic review of reviews of barriers and facilitators to implementation of e-Health initiatives with qualitative data derived from interviews of "implementers", that is, people who had been charged with implementing an e-Health initiative. These data were summarised, synthesised and combined with the constructs from the Normalisation Process Model. The software for the toolkit was developed by a commercial company (RocketScience). Formative evaluation was undertaken by obtaining user feedback. There are three components to the toolkit - a section on background and instructions for use aimed at novice users; the toolkit itself; and the report generated by completing the toolkit. It is available to download from http://www.ucl.ac.uk/pcph/research/ehealth/documents/e-HIT.xls Conclusions: The e-HIT shows potential as a tool for enhancing future e-Health implementations. Further work is needed to make it fully web-enabled, and to determine its predictive potential for future implementations.

  10. NGS QC Toolkit: a toolkit for quality control of next generation sequencing data.

    Directory of Open Access Journals (Sweden)

    Ravi K Patel

    Next generation sequencing (NGS) technologies provide a high-throughput means to generate large amounts of sequence data. However, quality control (QC) of sequence data generated from these technologies is extremely important for meaningful downstream analysis. Further, highly efficient and fast processing tools are required to handle the large volume of datasets. Here, we have developed an application, NGS QC Toolkit, for quality check and filtering of high-quality data. This toolkit is a standalone and open source application freely available at http://www.nipgr.res.in/ngsqctoolkit.html. All the tools in the application have been implemented in the Perl programming language. The toolkit is comprised of user-friendly tools for QC of sequencing data generated using Roche 454 and Illumina platforms, and additional tools to aid QC (sequence format converter and trimming tools) and analysis (statistics tools). A variety of options have been provided to facilitate the QC at user-defined parameters. The toolkit is expected to be very useful for the QC of NGS data to facilitate better downstream analysis.

  11. BALTRAD Advanced Weather Radar Networking

    Directory of Open Access Journals (Sweden)

    Daniel Michelson

    2018-03-01

    BALTRAD software exchanges weather-radar data internationally, operationally, and in real-time, and it processes the data using a common toolbox of algorithms available to every node in the decentralized radar network. This approach enables each node to access and process its own and international data to meet its local needs. The software system is developed collaboratively by the BALTRAD partnership, mostly comprising the national Meteorological and Hydrological institutes in the European Union’s Baltic Sea Region. The most important sub-systems are for data exchange, data management, scheduling and event handling, and data processing. C, Java, and Python languages are used depending on the sub-system, and sub-systems communicate using well-defined interfaces. Software is available from a dedicated Git server. BALTRAD software has been deployed throughout Europe and more recently in Canada. Funding statement: From 2009–2014, the BALTRAD and BALTRAD+ projects were part-financed by the European Union (European Regional Development Fund and European Neighbourhood and Partnership Instrument, with project numbers #009 and #101, respectively.

  12. Audit: Automated Disk Investigation Toolkit

    Directory of Open Access Journals (Sweden)

    Umit Karabiyik

    2014-09-01

    Software tools designed for disk analysis play a critical role today in forensics investigations. However, these digital forensics tools are often difficult to use, usually task specific, and generally require professionally trained users with IT backgrounds. The relevant tools are also often open source, requiring additional technical knowledge and proper configuration. This makes it difficult for investigators without some computer science background to easily conduct the needed disk analysis. In this paper, we present AUDIT, a novel automated disk investigation toolkit that supports investigations conducted by non-expert (in IT and disk technology) and expert investigators. Our proof of concept design and implementation of AUDIT intelligently integrates open source tools and guides non-IT professionals while requiring minimal technical knowledge about the disk structures and file systems of the target disk image.

  13. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    Energy Technology Data Exchange (ETDEWEB)

    Waltenberger, W; Mitaroff, W; Moser, F; Pflugfelder, B; Riedel, H V [Austrian Academy of Sciences, Institute of High Energy Physics, A-1050 Vienna (Austria)], E-mail: walten@hephy.oeaw.ac.at

    2008-07-15

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  14. Wetland Resources Action Planning (WRAP) toolkit

    DEFF Research Database (Denmark)

    Bunting, Stuart W.; Smith, Kevin G.; Lund, Søren

    2013-01-01

    The Wetland Resources Action Planning (WRAP) toolkit is a toolkit of research methods and better management practices used in HighARCS (Highland Aquatic Resources Conservation and Sustainable Development), an EU-funded project with field experiences in China, Vietnam and India. It aims to communicate best practices in conserving biodiversity and sustaining ecosystem services to potential users and to promote the wise use of aquatic resources, improve livelihoods and enhance policy information.

  15. Impact of Shutting Down En Route Primary Radars within CONUS Interior

    Science.gov (United States)

    1993-06-01

    Remote Control Interface Unit (RCIU) RMS software for the primary radar will be deleted. Any dependency of the secondary radar on the primary radar data... Acronyms from the report's glossary: RCIU, Remote Control and Interface Unit; RMM, Remote Monitoring and Maintenance; RMMS, Remote Maintenance Monitoring System; RMS, Remote Maintenance...

  16. VaST: A variability search toolkit

    Science.gov (United States)

    Sokolovsky, K. V.; Lebedev, A. A.

    2018-01-01

    Variability Search Toolkit (VaST) is a software package designed to find variable objects in a series of sky images. It can be run from a script or interactively using its graphical interface. VaST relies on source list matching as opposed to image subtraction. SExtractor is used to generate source lists and perform aperture or PSF-fitting photometry (with PSFEx). Variability indices that characterize scatter and smoothness of a lightcurve are computed for all objects. Candidate variables are identified as objects having high variability index values compared to other objects of similar brightness. The two distinguishing features of VaST are its ability to perform accurate aperture photometry of images obtained with non-linear detectors and handle complex image distortions. The software has been successfully applied to images obtained with telescopes ranging from 0.08 to 2.5 m in diameter equipped with a variety of detectors including CCD, CMOS, MIC and photographic plates. About 1800 variable stars have been discovered with VaST. It is used as a transient detection engine in the New Milky Way (NMW) nova patrol. The code is written in C and can be easily compiled on the majority of UNIX-like systems. VaST is free software available at http://scan.sai.msu.ru/vast/.
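
    The candidate-selection idea described above, flagging objects whose variability index is high compared with objects of similar brightness, can be illustrated with a simple scatter-based index. The NumPy sketch below uses synthetic lightcurves and an illustrative cut; it is not VaST's actual algorithm.

```python
# Illustration of scatter-based candidate selection (not VaST's actual code):
# flag objects whose lightcurve standard deviation greatly exceeds that of
# other objects of similar mean brightness. Data here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_stars, n_epochs = 1000, 50

mean_mag = rng.uniform(12.0, 17.0, n_stars)
mean_mag[0] = 12.5                               # the star we will make variable
noise = 0.01 * 10 ** (0.3 * (mean_mag - 12.0))   # fainter stars are noisier
lightcurves = mean_mag[:, None] + rng.normal(0.0, noise[:, None],
                                             (n_stars, n_epochs))
lightcurves[0] += 0.4 * np.sin(np.linspace(0, 6 * np.pi, n_epochs))  # injected variable

sigma = lightcurves.std(axis=1)        # simple scatter-based variability index

candidates = []
for i in range(n_stars):
    similar = np.abs(mean_mag - mean_mag[i]) < 0.25      # similar brightness
    if sigma[i] > 3.0 * np.median(sigma[similar]):       # illustrative cut
        candidates.append(i)

print("candidate variables:", candidates)   # expected to contain star 0
```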

  17. Supporting LGBT Communities: Police ToolKit

    OpenAIRE

    Vasquez del Aguila, Ernesto; Franey, Paul

    2013-01-01

    This toolkit provides police forces with practical educational tools, which can be used as part of a comprehensive LGBT strategy centred on diversity, equality, and non-discrimination. These materials are based on lessons learned through real life policing experiences with LGBT persons. The Toolkit is divided into seven scenarios where police awareness of LGBT issues has been identified as important. The toolkit employs a practical, scenario-based, problem-solving approach to help police offi...

  18. Commissioning software tools at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Emery, L.

    1995-01-01

    A software tool-oriented approach has been adopted in the commissioning of the Advanced Photon Source (APS) at Argonne National Laboratory, particularly in the commissioning of the Positron Accumulator Ring (PAR). The general philosophy is to decompose a complicated procedure involving measurement, data processing, and control into a series of simpler steps, each accomplished by a generic toolkit program. The implementation is greatly facilitated by adopting the SDDS (Self-Describing Data Sets) protocol, which comes with its own toolkit. The combined toolkit has made accelerator physics measurements easier. For instance, the measurement of the optical functions of the PAR and the beamlines connected to it has been largely automated. Complicated measurements are feasible with a combination of tools running independently.

  19. Movement and respiration detection using statistical properties of the FMCW radar signal

    KAUST Repository

    Kiuru, Tero; Metso, Mikko; Jardak, Seifallah; Pursula, Pekka; Hakli, Janne; Hirvonen, Mervi; Sepponen, Raimo

    2016-01-01

    This paper presents a 24 GHz FMCW radar system for the detection of movement and respiration using changes in the statistical properties of the received radar signal, in both amplitude and phase. We present the hardware and software segments of the radar
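
    The abstract is truncated, but the underlying idea, detecting motion from short-term changes in the statistics of the received signal, can be sketched as follows. This is a hypothetical illustration, not the authors' hardware or software; the sampling rate, window length and threshold are assumptions.

    ```python
    # Minimal sketch: flag movement/respiration from the short-term standard deviation
    # of the phase of a complex baseband radar signal (not the authors' implementation).
    import numpy as np

    fs = 100.0                                    # slow-time sampling rate in Hz (assumed)
    t = np.arange(0, 30, 1 / fs)
    rng = np.random.default_rng(2)

    # simulated return from one range bin: small phase modulation from chest-wall
    # motion at ~0.25 Hz (15 breaths per minute) plus receiver noise
    phase = 0.2 * np.sin(2 * np.pi * 0.25 * t)
    iq = np.exp(1j * phase) + 0.05 * (rng.normal(size=t.size) + 1j * rng.normal(size=t.size))

    unwrapped = np.unwrap(np.angle(iq))
    win = int(2 * fs)                             # 2-second analysis window
    activity = np.array([unwrapped[i:i + win].std() for i in range(t.size - win)])
    print("movement/respiration detected:", bool((activity > 0.1).any()))
    ```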

  20. Toolkit Design for Interactive Structured Graphics

    National Research Council Canada - National Science Library

    Bederson, Benjamin B; Grosjean, Jesse; Meyer, Jon

    2003-01-01

    .... We describe Jazz (a polylithic toolkit) and Piccolo (a monolithic toolkit), each of which we built to support interactive 2D structured graphics applications in general, and Zoomable User Interface applications in particular...

  1. Business/Employers Influenza Toolkit

    Centers for Disease Control (CDC) Podcasts

    2011-09-06

    This podcast promotes the "Make It Your Business To Fight The Flu" toolkit for Businesses and Employers. The toolkit provides information and recommended strategies to help businesses and employers promote the seasonal flu vaccine. Additionally, employers will find flyers, posters, and other materials to post and distribute in the workplace.  Created: 9/6/2011 by Office of Infectious Diseases, Office of the Director (OD).   Date Released: 9/7/2011.

  2. Solar Integration National Dataset Toolkit | Grid Modernization | NREL

    Science.gov (United States)

    NREL is working on a Solar Integration National Dataset (SIND) Toolkit to enable researchers to perform U.S. regional solar generation integration studies. It will provide modeled, coherent subhourly solar power data

  3. CheMentor Software System by H. A. Peoples

    Science.gov (United States)

    Reid, Brian P.

    1997-09-01

    CheMentor Software System H. A. Peoples. Computerized Learning Enhancements: http://www.ecis.com/~clehap; email: clehap@ecis.com; 1996 - 1997. CheMentor is a series of software packages for introductory-level chemistry, which includes Practice Items (I), Stoichiometry (I), Calculating Chemical Formulae, and the CheMentor Toolkit. The first three packages provide practice problems for students and various types of help to solve them; the Toolkit includes "calculators" for determining chemical quantities as well as the Practice Items (I) set of problems. The set of software packages is designed so that each individual product acts as a module of a common CheMentor program. As the name CheMentor implies, the software is designed as a "mentor" for students learning introductory chemistry concepts and problems. The typical use of the software would be by individual students (or perhaps small groups) as an adjunct to lectures. CheMentor is a HyperCard application and the modules are HyperCard stacks. The requirements to run the packages include a Macintosh computer with at least 1 MB of RAM, a hard drive with several MB of available space depending upon the packages selected (10 MB were required for all the packages reviewed here), and the Mac operating system 6.0.5 or later.

  4. Wind Integration National Dataset Toolkit | Grid Modernization | NREL

    Science.gov (United States)

    The Wind Integration National Dataset (WIND) Toolkit is an update and expansion of the Eastern Wind Integration Data Set and Western Wind Integration Data Set. It supports the next generation of wind integration studies.

  5. Simplified formulae for the estimation of offshore wind turbines clutter on marine radars.

    Science.gov (United States)

    Grande, Olatz; Cañizo, Josune; Angulo, Itziar; Jenn, David; Danoon, Laith R; Guerra, David; de la Vega, David

    2014-01-01

    The potential impact that offshore wind farms may cause on nearby marine radars should be considered before the wind farm is installed. Strong radar echoes from the turbines may degrade radars' detection capability in the area around the wind farm. Although conventional computational methods provide accurate results of scattering by wind turbines, they are not directly implementable in software tools that can be used to conduct the impact studies. This paper proposes a simple model to assess the clutter that wind turbines may generate on marine radars. This method can be easily implemented in the system modeling software tools for the impact analysis of a wind farm in a real scenario.
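

    The paper's simplified formulae are not reproduced in this record, but the kind of estimate such a tool produces can be illustrated with the standard radar range equation and an assumed turbine radar cross-section. All parameter values below are assumptions chosen for illustration, not taken from the paper.

    ```python
    # A generic radar-equation estimate (not the paper's simplified formulae): received
    # echo power from a wind turbine of assumed radar cross-section at range R.
    import numpy as np

    def echo_power_dbm(pt_w, gain_dbi, freq_hz, rcs_m2, r_m):
        lam = 3e8 / freq_hz
        g = 10 ** (gain_dbi / 10)
        pr = pt_w * g**2 * lam**2 * rcs_m2 / ((4 * np.pi) ** 3 * r_m ** 4)
        return 10 * np.log10(pr * 1e3)            # dBm

    # assumed values for a marine X-band radar and a large offshore turbine
    print(echo_power_dbm(pt_w=25e3, gain_dbi=30, freq_hz=9.4e9, rcs_m2=1e4, r_m=5e3))
    ```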

  6. GEANT 4: an Object-Oriented toolkit for simulation in HEP

    CERN Multimedia

    Kent, P; Sirotenko, V; Komogorov, M; Pavliouk, A; Greeniaus, G L; Kayal, P I; Routenburg, P; Tanaka, S; Duellmann, D; Innocente, V; Paoli, S; Ranjard, F; Riccardi, F; Ruggier, M; Shiers, J; Egli, S; Kimura, A; Urban, P; Prior, S; Walkden, A; Forti, A; Magni, S; Strahl, K; Kokoulin, R; Braune, K; Volcker, C; Ullrich, T; Takahata, M; Nieminen, P; Ballocchi, G; Mora De Freitas, P; Verderi, M; Rybine, A; Langeveld, W; Nagamatsu, M; Hamatsu, R; Katayama, N; Chuma, J; Felawka, L; Gumplinger, P; Axen, D

    2002-01-01

    The GEANT4 software has been developed by a world-wide collaboration of about 100 scientists from over 40 institutions and laboratories participating in more than 10 experiments in Europe, Russia, Japan, Canada, and the United States. The GEANT4 detector simulation toolkit has been designed for the next generation of High Energy Physics (HEP) experiments, with primary requirements from the LHC, the CP violation, and the heavy ions experiments. In addition, GEANT4 also meets the requirements from the space and medical communities, thanks to very low energy extensions developed in a joint project with the European Space Agency (ESA). GEANT4 has exploited advanced software engineering techniques (for example PSS-05) and Object-Oriented technology to improve the validation process of the physics results, and at the same time to make possible the distributed software design and development in the world-wide collaboration. Fifteen specialised working groups have been responsible for fields as diver...

  7. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    Science.gov (United States)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy; Kim, Youngkwang; Conway, Claire; Conway, Darrel J.

    2017-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.

  8. A qualitative study of clinic and community member perspectives on intervention toolkits: "Unless the toolkit is used it won't help solve the problem".

    Science.gov (United States)

    Davis, Melinda M; Howk, Sonya; Spurlock, Margaret; McGinnis, Paul B; Cohen, Deborah J; Fagnan, Lyle J

    2017-07-18

    Intervention toolkits are common products of grant-funded research in public health and primary care settings. Toolkits are designed to address the knowledge translation gap by speeding implementation and dissemination of research into practice. However, few studies describe characteristics of effective intervention toolkits and their implementation. Therefore, we conducted this study to explore what clinic and community-based users want in intervention toolkits and to identify the factors that support application in practice. In this qualitative descriptive study we conducted focus groups and interviews with a purposive sample of community health coalition members, public health experts, and primary care professionals between November 2010 and January 2012. The transdisciplinary research team used thematic analysis to identify themes and a cross-case comparative analysis to explore variation by participant role and toolkit experience. Ninety-six participants representing primary care (n = 54, 56%) and community settings (n = 42, 44%) participated in 18 sessions (13 focus groups, five key informant interviews). Participants ranged from those naïve through expert in toolkit development; many reported limited application of toolkits in actual practice. Participants wanted toolkits targeted at the right audience and demonstrated to be effective. Well-organized toolkits, often with a quick-start guide, with tools that were easy to tailor and apply, were desired. Irrespective of perceived quality, participants experienced with practice change emphasized that leadership, staff buy-in, and facilitative support were essential for intervention toolkits to be translated into changes in clinic or public health practice. Given the emphasis on toolkits in supporting implementation and dissemination of research and clinical guidelines, studies are warranted to determine when and how toolkits are used. Funders, policy makers, researchers, and leaders in primary care and

  9. Energy retrofit analysis toolkits for commercial buildings: A review

    International Nuclear Information System (INIS)

    Lee, Sang Hoon; Hong, Tianzhen; Piette, Mary Ann; Taylor-Lange, Sarah C.

    2015-01-01

    Retrofit analysis toolkits can be used to optimize energy or cost savings from retrofit strategies, accelerating the adoption of ECMs (energy conservation measures) in buildings. This paper provides an up-to-date review of the features and capabilities of 18 energy retrofit toolkits, including ECMs and the calculation engines. The fidelity of the calculation techniques, a driving component of retrofit toolkits, was evaluated. An evaluation of the issues that hinder effective retrofit analysis in terms of accessibility, usability, data requirements, and the application of efficiency measures provides valuable insights into moving the field forward. Following this review, several general observations emerged: (1) toolkits developed primarily in the private sector use empirically data-driven methods or benchmarking to provide ease of use, (2) almost all of the toolkits which used EnergyPlus or DOE-2 were freely accessible, but suffered from complexity, longer data input and simulation run time, (3) in general, there appeared to be a fine line between having too much detail, resulting in a long analysis time, or too little detail, which sacrificed modeling fidelity. These insights provide an opportunity to enhance the design and development of existing and new retrofit toolkits in the future. - Highlights: • Retrofit analysis toolkits can accelerate the adoption of energy efficiency measures. • A comprehensive review of 19 retrofit analysis toolkits was conducted. • Retrofit toolkits have diverse features, data requirements and computing methods. • Empirical data-driven, normative and detailed energy modeling methods are used. • Identified immediate areas for improvement for retrofit analysis toolkits

  10. Security Assessment Simulation Toolkit (SAST) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Meitzler, Wayne D.; Ouderkirk, Steven J.; Hughes, Chad O.

    2009-11-15

    The Department of Defense Technical Support Working Group (DoD TSWG) investment in the Pacific Northwest National Laboratory (PNNL) Security Assessment Simulation Toolkit (SAST) research planted a technology seed that germinated into a suite of follow-on Research and Development (R&D) projects culminating in software that is used by multiple DoD organizations. The DoD TSWG technology transfer goal for SAST is already in progress. The Defense Information Systems Agency (DISA), the Defense-wide Information Assurance Program (DIAP), the Marine Corps, Office Of Naval Research (ONR) National Center For Advanced Secure Systems Research (NCASSR) and Office Of Secretary Of Defense International Exercise Program (OSD NII) are currently investing to take SAST to the next level. PNNL currently distributes the software to over 6 government organizations and 30 DoD users. For the past five DoD wide Bulwark Defender exercises, the adoption of this new technology created an expanding role for SAST. In 2009, SAST was also used in the OSD NII International Exercise and is currently scheduled for use in 2010.

  11. Archival standards, in archival open access software And offer appropriate software for internal archival centers

    Directory of Open Access Journals (Sweden)

    Abdolreza Izadi

    2016-12-01

    Full Text Available The purpose of this study is to examine descriptive metadata standards in open-source archival software, to determine the most appropriate descriptive metadata standard(s), and to identify which software support these standards. The study combines library research, the Delphi method and a descriptive survey; data were gathered using fiches for the library research, a questionnaire for the Delphi rounds and a checklist for the survey. The statistical population comprises five open-source archival software packages. The findings suggest that five metadata standards (EAD, ISAD, EAC-CPF, ISAAR and ISDF) were judged by the Delphi panel members to be the most appropriate descriptive metadata standards for archival software. Moreover, ICA-AtoM and Archivists' Toolkit, in terms of their support for these standards, were judged to be the most appropriate archival software.

  12. Software process in Geant4

    International Nuclear Information System (INIS)

    Cosmo, G.

    2001-01-01

    Since its earliest years of R and D, the GEANT4 simulation toolkit has been developed following software process standards which dictated the overall evolution of the project. The complexity of the software involved, the wide areas of application of the software product, the huge amount of code and Category complexity, the size and distributed nature of the Collaboration itself are all ingredients which involve and correlate a wide variety of software processes. Although in 'production' and available to the public since December 1998, the GEANT4 software product includes Category Domains which are still under active development. Therefore they require different treatment also in terms of improvement of the development cycle, system testing and user support. This paper describes some of the software processes as they are applied in GEANT4 for development, testing and maintenance of the software.

  13. Buy, don't build -- What does that mean for a software developer?

    International Nuclear Information System (INIS)

    Little, T.; Rahi, M.A.; Sinclair, C.

    1995-01-01

    The buzz phrase of the 1990s for the petroleum software industry has become "buy, don't build." For an end user in an oil company, this generally means acquiring application software rather than developing it internally. The concept of buy, don't build can also apply to a software developer. Purchasing software toolkit components can expedite the development of an application as well as reduce future support requirements.

  14. Water Security Toolkit User Manual: Version 1.3 | Science ...

    Science.gov (United States)

    User manual: Data Product/Software The Water Security Toolkit (WST) is a suite of tools that help provide the information necessary to make good decisions resulting in the minimization of further human exposure to contaminants, and the maximization of the effectiveness of intervention strategies. WST assists in the evaluation of multiple response actions in order to select the most beneficial consequence management strategy. It includes hydraulic and water quality modeling software and optimization methodologies to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove or destroy contaminants, (5) locations in the network to take grab sample to confirm contamination or cleanup and (6) valves to close in order to isolate contaminated areas of the network.
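
    As a toy illustration of the first of these tasks, sensor placement for contamination detection, the sketch below applies a greedy heuristic to a random detection matrix. It is not WST's algorithm or API; the function, the matrix and all numbers are hypothetical.

    ```python
    # Minimal sketch of a sensor-placement problem of the kind WST addresses: greedily
    # choose sensor locations that detect the largest number of uncovered scenarios.
    import numpy as np

    def greedy_placement(detects, n_sensors):
        """detects[i, j] is True if a sensor at node i detects contamination scenario j."""
        chosen, covered = [], np.zeros(detects.shape[1], dtype=bool)
        for _ in range(n_sensors):
            gains = (detects & ~covered).sum(axis=1)   # new scenarios each node would add
            best = int(np.argmax(gains))
            if gains[best] == 0:
                break
            chosen.append(best)
            covered |= detects[best]
        return chosen, covered.mean()

    rng = np.random.default_rng(3)
    detects = rng.random((50, 200)) < 0.1              # 50 candidate nodes, 200 scenarios
    nodes, coverage = greedy_placement(detects, n_sensors=5)
    print(nodes, f"{coverage:.0%} of scenarios detected")
    ```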

  15. A User Guide for Smoothing Air Traffic Radar Data

    Science.gov (United States)

    Bach, Ralph E.; Paielli, Russell A.

    2014-01-01

    Matlab software was written to provide smoothing of radar tracking data to simulate ADS-B (Automatic Dependent Surveillance-Broadcast) data in order to test a tactical conflict probe. The probe, called TSAFE (Tactical Separation-Assured Flight Environment), is designed to handle air-traffic conflicts left undetected or unresolved when loss-of-separation is predicted to occur within approximately two minutes. The data stream that is down-linked from an aircraft equipped with an ADS-B system would include accurate GPS-derived position and velocity information at sample rates of 1 Hz. Nation-wide ADS-B equipage (mandated by 2020) should improve surveillance accuracy and TSAFE performance. Currently, position data are provided by Center radar (nominal 12-sec samples) and Terminal radar (nominal 4.8-sec samples). Aircraft ground speed and ground track are estimated using real-time filtering, causing lags up to 60 sec, compromising performance of a tactical resolution tool. Offline smoothing of radar data reduces wild-point errors, provides a sample rate as high as 1 Hz, and yields more accurate and lag-free estimates of ground speed, ground track, and climb rate. Until full ADS-B implementation is available, smoothed radar data should provide reasonable track estimates for testing TSAFE in an ADS-B-like environment. An example illustrates the smoothing of radar data and shows a comparison of smoothed-radar and ADS-B tracking. This document is intended to serve as a guide for using the smoothing software.
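
    A minimal sketch of this kind of offline smoothing (not the Matlab code described in the guide) is shown below: a Savitzky-Golay filter is fitted to 12-second radar position samples, and its derivative gives essentially lag-free ground-speed and ground-track estimates. The track model, window length and noise level are assumptions.

    ```python
    # Minimal sketch: smooth noisy radar position samples and differentiate the fit to
    # obtain ground speed and ground track without the lag of a real-time filter.
    import numpy as np
    from scipy.signal import savgol_filter

    dt = 12.0                                      # Center radar sample interval, s
    t = np.arange(0, 600, dt)
    rng = np.random.default_rng(4)

    # simulated track: ~240 m/s eastbound with a gentle weave, plus radar position noise
    x = 240 * t + 2000 * np.sin(2 * np.pi * t / 600) + rng.normal(scale=300, size=t.size)
    y = 0.05 * 240 * t + rng.normal(scale=300, size=t.size)

    win, poly = 9, 3                               # 9 samples ~ 108 s smoothing window
    vx = savgol_filter(x, win, poly, deriv=1, delta=dt)
    vy = savgol_filter(y, win, poly, deriv=1, delta=dt)
    ground_speed = np.hypot(vx, vy)                # m/s
    ground_track = np.degrees(np.arctan2(vx, vy)) % 360.0   # bearing if x = east, y = north
    print(ground_speed[:5], ground_track[:5])
    ```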

  16. MONTE: the next generation of mission design and navigation software

    Science.gov (United States)

    Evans, Scott; Taber, William; Drain, Theodore; Smith, Jonathon; Wu, Hsi-Cheng; Guevara, Michelle; Sunseri, Richard; Evans, James

    2018-03-01

    The Mission analysis, Operations and Navigation Toolkit Environment (MONTE) (Sunseri et al. in NASA Tech Briefs 36(9), 2012) is an astrodynamic toolkit produced by the Mission Design and Navigation Software Group at the Jet Propulsion Laboratory. It provides a single integrated environment for all phases of deep space and Earth orbiting missions. Capabilities include: trajectory optimization and analysis, operational orbit determination, flight path control, and 2D/3D visualization. MONTE is presented to the user as an importable Python language module. This allows a simple but powerful user interface via CLUI or script. In addition, the Python interface allows MONTE to be used seamlessly with other canonical scientific programming tools such as SciPy, NumPy, and Matplotlib. MONTE is the prime operational orbit determination software for all JPL navigated missions.

  17. Network Science Research Laboratory (NSRL) Discrete Event Toolkit

    Science.gov (United States)

    2016-01-01

    ARL-TR-7579, January 2016. US Army Research Laboratory, Network Science Research Laboratory (NSRL) Discrete Event Toolkit, by Theron Trout and Andrew J Toth, Computational and Information Sciences Directorate, ARL.

  18. How to create an interface between UrQMD and Geant4 toolkit

    CERN Document Server

    Abdel-Waged, Khaled; Uzhinskii, V.V.

    2012-01-01

    An interface between the UrQMD-1.3cr model (version 1.3 for cosmic air showers) and the Geant4 transport toolkit has been developed. Compared to the current Geant4 (hybrid) hadronic models, this provides the ability to simulate at the microscopic level hadron, nucleus, and anti-nucleus interactions with matter from 0 to 1 TeV with a single transport code. This document provides installation requirements and instructions, as well as class and member function descriptions of the software.

  19. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM
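
    The record lists the methodologies without detail; the sketch below illustrates two of them, Monte Carlo uncertainty propagation and a simple correlation-based sensitivity measure, on a stand-in model. It is not the KAERI toolkit; the model function and the input distributions are invented for illustration.

    ```python
    # Minimal sketch: propagate input uncertainties through a model by Monte Carlo
    # sampling and rank input importance with crude correlation-based sensitivities.
    import numpy as np

    def model(x):
        """Stand-in for an engineering simulation code: y = f(inputs)."""
        k, q, h = x.T
        return q / (k * h)                         # a temperature-like response

    rng = np.random.default_rng(5)
    n = 10_000
    samples = np.column_stack([
        rng.normal(10.0, 1.0, n),                  # k: conductivity-like parameter
        rng.normal(500.0, 50.0, n),                # q: source term
        rng.normal(2.0, 0.1, n),                   # h: geometry factor
    ])
    y = model(samples)

    print("mean = %.3f, 95%% interval = %s" % (y.mean(), np.percentile(y, [2.5, 97.5])))
    # Pearson correlation of each input with the output as a simple sensitivity measure
    for name, col in zip("kqh", samples.T):
        print(name, np.corrcoef(col, y)[0, 1].round(2))
    ```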

  20. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok; Kim, Kyung Doo [KAERI, Daejeon (Korea, Republic of)]

    2015-05-15

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM.

  1. Measuring employee satisfaction in new offices - the WODI toolkit

    NARCIS (Netherlands)

    Maarleveld, M.; Volker, L.; van der Voordt, Theo

    2009-01-01

    Purpose: This paper presents a toolkit to measure employee satisfaction and perceived labour productivity as affected by different workplace strategies. The toolkit is being illustrated by a case study of the Dutch Revenue Service.
    Methodology: The toolkit has been developed by a review of

  2. Postmodern Software Design with NYAM: Not Yet Another Method

    NARCIS (Netherlands)

    Wieringa, Roelf J.; Broy, M.; Rumpe, B.

    1998-01-01

    This paper presents a conceptual toolbox for software specification and design that contains techniques from structured and object-oriented specification and design methods. The toolbox is called TRADE (Toolkit for Requirements and Design Engineering). The TRADE tools are used in teaching

  3. OpenDBDDAS Toolkit: Secure MapReduce and Hadoop-like Systems

    KAUST Repository

    Fabiano, Enrico

    2015-06-01

    The OpenDBDDAS Toolkit is a software framework to provide support for more easily creating and expanding dynamic big data-driven application systems (DBDDAS) that are common in environmental systems, many engineering applications, disaster management, traffic management, and manufacturing. In this paper, we describe key features needed to implement a secure MapReduce and Hadoop-like system for high performance clusters that guarantees a certain level of privacy of data from other concurrent users of the system. We also provide examples of a secure MapReduce prototype and compare it to another high performance MapReduce, MR-MPI.
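
    For readers unfamiliar with the programming model being secured, a bare-bones map/shuffle/reduce sketch is given below. It illustrates only the MapReduce pattern itself, not the OpenDBDDAS framework, its security features, or its API.

    ```python
    # Minimal sketch of the MapReduce pattern: map records to key/value pairs, group
    # values by key, then reduce each group (not the OpenDBDDAS implementation).
    from collections import defaultdict

    def map_reduce(records, mapper, reducer):
        shuffled = defaultdict(list)
        for record in records:
            for key, value in mapper(record):       # map phase
                shuffled[key].append(value)          # shuffle/group by key
        return {k: reducer(k, vs) for k, vs in shuffled.items()}   # reduce phase

    lines = ["big data driven systems", "data driven disaster management"]
    counts = map_reduce(
        lines,
        mapper=lambda line: [(word, 1) for word in line.split()],
        reducer=lambda word, ones: sum(ones),
    )
    print(counts)
    ```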

  4. New software developments for quality mesh generation and optimization from biomedical imaging data.

    Science.gov (United States)

    Yu, Zeyun; Wang, Jun; Gao, Zhanheng; Xu, Ming; Hoshijima, Masahiko

    2014-01-01

    In this paper we present a new software toolkit for generating and optimizing surface and volumetric meshes from three-dimensional (3D) biomedical imaging data, targeted at image-based finite element analysis of some biomedical activities in a single material domain. Our toolkit includes a series of geometric processing algorithms including surface re-meshing and quality-guaranteed tetrahedral mesh generation and optimization. All methods described have been encapsulated into a user-friendly graphical interface for easy manipulation and informative visualization of biomedical images and mesh models. Numerous examples are presented to demonstrate the effectiveness and efficiency of the described methods and toolkit. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  5. phylo-node: A molecular phylogenetic toolkit using Node.js.

    Science.gov (United States)

    O'Halloran, Damien M

    2017-01-01

    Node.js is an open-source and cross-platform environment that provides a JavaScript codebase for back-end server-side applications. JavaScript has been used to develop very fast and user-friendly front-end tools for bioinformatic and phylogenetic analyses. However, no such toolkits are available using Node.js to conduct comprehensive molecular phylogenetic analysis. To address this problem, I have developed phylo-node, which was built using Node.js and provides a stable and scalable toolkit that allows the user to perform diverse molecular and phylogenetic tasks. phylo-node can execute the analysis and process the resulting outputs from a suite of software options that provides tools for read processing and genome alignment, sequence retrieval, multiple sequence alignment, primer design, evolutionary modeling, and phylogeny reconstruction. Furthermore, phylo-node enables the user to deploy server-dependent applications, and also provides simple integration and interoperation with other Node modules and languages using Node inheritance patterns, and a customized piping module to support the production of diverse pipelines. phylo-node is open-source and freely available to all users without sign-up or login requirements. All source code and user guidelines are openly available at the GitHub repository: https://github.com/dohalloran/phylo-node.

  6. Transportation librarian's toolkit

    Science.gov (United States)

    2007-12-01

    The Transportation Librarians Toolkit is a product of the Transportation Library Connectivity pooled fund study, TPF- 5(105), a collaborative, grass-roots effort by transportation libraries to enhance information accessibility and professional expert...

  7. Universal Library for Building Radar Operator Interface

    Directory of Open Access Journals (Sweden)

    A. A. Karankevich

    2014-01-01

    Full Text Available The article presents the results of developing a software library used for building operator interfaces for radars under development at the BMSTU Radioelectronic Technics Scientific and Research Institute. The library is an application library written in C++ using the Qt and OpenGL libraries. The article describes the requirements the library is supposed to meet, in particular cross-platform capability and versatility of the solution, and the data types the library uses. The interface elements developed are described and illustrated with pictures of their operation. The main interface elements are: «Matrix», which shows two-dimensional data; «Waterfall», which is used for time scanning of a specified parameter; and «Plan Position Indicator», which shows the circular scan from a surveillance radar without geometric distortions. The section «Library implementation» gives the example of a radar station interface based on this library and used in a working model of an ultrashort-pulse radar, together with some results of its operation. In the experiment the system observed two people in the field; as the people start to move, the system becomes capable of distinguishing moving targets from the stationary surface. The article shows the system operating exactly as the operator sees it through the interface. The conclusion summarizes the results of the development, the application area of the software, and the prospects for further development of the library.
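
    A rough impression of one of these widgets, the Plan Position Indicator, can be given with a short plotting sketch. This is not the C++/Qt library described in the article; it simply renders simulated range-azimuth echo amplitudes on a polar plot with assumed parameters.

    ```python
    # Minimal sketch of a Plan Position Indicator (PPI) view: echo amplitude as a
    # function of azimuth and range on a polar display (not the library itself).
    import numpy as np
    import matplotlib.pyplot as plt

    n_az, n_rng = 360, 200
    az = np.radians(np.arange(n_az))               # one ray per degree of antenna rotation
    rng_bins = np.linspace(0, 10_000, n_rng)       # range bins out to 10 km

    rng = np.random.default_rng(6)
    echo = rng.rayleigh(scale=1.0, size=(n_az, n_rng))       # noise background
    echo[40:46, 120:125] += 12.0                   # a strong target near 6 km, azimuth ~42 deg

    fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
    ax.set_theta_zero_location("N")                # 0 degrees at the top, like a radar display
    ax.set_theta_direction(-1)                     # clockwise azimuth
    theta, r = np.meshgrid(az, rng_bins, indexing="ij")
    ax.pcolormesh(theta, r, echo, shading="auto")
    ax.set_title("PPI sketch: echo amplitude vs azimuth and range")
    plt.show()
    ```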

  8. Pydpiper: A Flexible Toolkit for Constructing Novel Registration Pipelines

    Directory of Open Access Journals (Sweden)

    Miriam eFriedel

    2014-07-01

    Full Text Available Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available pipeline framework that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.

  9. A computer simulation of a CWFM radar showing the tradeoffs of performance as a function of range

    Science.gov (United States)

    Gordy, Robert S.; Zoledziowski, Severyn

    2010-04-01

    This paper describes a study of the operation of CWFM radar using "System View" software for modeling and simulation. The System View software is currently offered by Agilent; a link to the website is given in the footnote. The models that were studied include: a model illustrating the basic principle of operation of the CWFM radar, the range resolution of the radar, the effect of nonlinear distortions on the detected signals, and the effect of interference and jamming on the reception of CWFM signals. The study was performed as part of the design of an airborne CWFM radar.
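
    The basic CWFM (FMCW) ranging principle referred to above can be sketched without System View: after de-chirping, the beat frequency is proportional to target range, and the range resolution is c/(2B) for a sweep bandwidth B. The sweep parameters and target range below are assumptions chosen for illustration.

    ```python
    # Minimal sketch of FMCW ranging: the de-chirped beat frequency maps to range, and
    # the achievable range resolution is c / (2 * sweep bandwidth).
    import numpy as np

    c = 3e8
    B, T = 150e6, 1e-3                             # 150 MHz sweep over 1 ms (assumed)
    fs = 4e6                                       # beat-signal sampling rate
    R_true = 1200.0                                # target range in metres

    t = np.arange(0, T, 1 / fs)
    f_beat = 2 * R_true * B / (c * T)              # expected beat frequency
    beat = np.cos(2 * np.pi * f_beat * t) + 0.1 * np.random.default_rng(7).normal(size=t.size)

    spec = np.abs(np.fft.rfft(beat * np.hanning(t.size)))
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    R_est = freqs[np.argmax(spec)] * c * T / (2 * B)

    print(f"estimated range: {R_est:.1f} m, range resolution: {c / (2 * B):.1f} m")
    ```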

  10. Windows .NET Network Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST)

    Directory of Open Access Journals (Sweden)

    Oliver Melvin J

    2005-04-01

    Full Text Available Abstract Background BLAST is one of the most common and useful tools for Genetic Research. This paper describes a software application we have termed Windows .NET Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST), which enhances the BLAST utility by improving usability, fault recovery, and scalability in a Windows desktop environment. Our goal was to develop an easy to use, fault tolerant, high-throughput BLAST solution that incorporates a comprehensive BLAST result viewer with curation and annotation functionality. Results W.ND-BLAST is a comprehensive Windows-based software toolkit that targets researchers, including those with minimal computer skills, and provides the ability to increase the performance of BLAST by distributing BLAST queries to any number of Windows based machines across local area networks (LANs). W.ND-BLAST provides intuitive Graphic User Interfaces (GUIs) for BLAST database creation, BLAST execution, BLAST output evaluation and BLAST result exportation. This software also provides several layers of fault tolerance and fault recovery to prevent loss of data if nodes or master machines fail. This paper lays out the functionality of W.ND-BLAST. W.ND-BLAST displays close to 100% performance efficiency when distributing tasks to 12 remote computers of the same performance class. A high throughput BLAST job which took 662.68 minutes (11 hours) on one average machine was completed in 44.97 minutes when distributed to 17 nodes, which included lower performance class machines. Finally, there are comprehensive high-throughput BLAST Output Viewer (BOV) and Annotation Engine components, which provide comprehensive exportation of BLAST hits to text files, annotated fasta files, tables, or association files. Conclusion W.ND-BLAST provides an interactive tool that allows scientists to easily utilize their available computing resources for high throughput and comprehensive sequence analyses. The install package for W.ND-BLAST is

  11. A Modular Toolkit for Distributed Interactions

    Directory of Open Access Journals (Sweden)

    Julien Lange

    2011-10-01

    Full Text Available We discuss the design, architecture, and implementation of a toolkit which supports some theories for distributed interactions. The main design principles of our architecture are flexibility and modularity. Our main goal is to provide an easily extensible workbench to encompass current algorithms and incorporate future developments of the theories. With the help of some examples, we illustrate the main features of our toolkit.

  12. Development of wide band digital receiver for atmospheric radars using COTS board based SDR

    Science.gov (United States)

    Yasodha, Polisetti; Jayaraman, Achuthan; Thriveni, A.

    2016-07-01

    A digital receiver extracts the received echo signal information and is a potential subsystem for an atmospheric radar, also referred to as a wind profiling radar (WPR), which provides vertical profiles of the 3-dimensional wind vector in the atmosphere. This paper presents the development of a digital receiver using a COTS-board-based Software Defined Radio technique, which can be used for atmospheric radars. The development work is being carried out at the National Atmospheric Research Laboratory (NARL), Gadanki. The digital receiver consists of a commercially available software defined radio (SDR) board called the Universal Software Radio Peripheral B210 (USRP B210) and a personal computer. The USRP B210 operates over a wide frequency range from 70 MHz to 6 GHz and hence can be used for a variety of radars, such as Doppler weather radars operating in the S/C bands, in addition to wind profiling radars operating in the VHF, UHF and L bands. Due to the flexibility and re-configurability of SDR, where the component functionalities are implemented in software, it is easy to modify the software to receive the echoes and process them according to the requirements of the intended radar. Hence, the USRP B210 board together with the computer forms a versatile digital receiver from 70 MHz to 6 GHz. It has an inbuilt direct-conversion transceiver with two transmit and two receive channels, which can be operated in a fully coherent 2x2 MIMO fashion, and thus it can be used as a two-channel receiver. Multiple USRP B210 boards can be synchronized using the pulse-per-second (PPS) input provided on the board to configure a multi-channel digital receiver system. The RF gain of the transceiver can be varied from 0 to 70 dB. The board can be controlled from the computer via a USB 3.0 interface through the USRP hardware driver (UHD), which is an open-source cross-platform driver. The USRP B210 board is connected to the personal computer through USB 3.0. Reference (10 MHz) clock signal from the radar master oscillator
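
    The core operation of such a digital receiver, digital down-conversion, can be sketched in a few lines. This is a generic illustration, not the NARL/USRP implementation; the IF, sampling rate, filter length and decimation factor are assumptions.

    ```python
    # Minimal sketch of digital down-conversion: mix the digitised IF signal to baseband
    # with a complex local oscillator, low-pass filter, and decimate.
    import numpy as np
    from scipy.signal import firwin, lfilter

    fs = 10e6                                      # ADC sampling rate (assumed)
    f_if = 2.5e6                                   # intermediate frequency of the echo
    t = np.arange(0, 1e-3, 1 / fs)
    rng = np.random.default_rng(8)

    echo = np.cos(2 * np.pi * f_if * t + 0.7) + 0.1 * rng.normal(size=t.size)

    lo = np.exp(-2j * np.pi * f_if * t)            # numerically controlled oscillator
    baseband = echo * lo                           # shift the echo spectrum to 0 Hz
    taps = firwin(numtaps=129, cutoff=200e3, fs=fs)
    filtered = lfilter(taps, 1.0, baseband)        # remove the 2*f_if image
    decimated = filtered[::10]                     # reduce the output data rate

    print(np.angle(decimated[200:205]))            # recovered echo phase (~0.7 rad)
    ```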

  13. Radar-to-Radar Interference Suppression for Distributed Radar Sensor Networks

    Directory of Open Access Journals (Sweden)

    Wen-Qin Wang

    2014-01-01

    Full Text Available Radar sensor networks, including bi- and multi-static radars, provide several operational advantages, like reduced vulnerability, good system flexibility and an increased radar cross-section. However, radar-to-radar interference suppression is a major problem in distributed radar sensor networks. In this paper, we present a cross-matched filtering-based radar-to-radar interference suppression algorithm. This algorithm first uses an iterative filtering algorithm to suppress the radar-to-radar interferences and then applies matched filtering separately for each radar. Besides the detailed algorithm derivation, extensive numerical simulation examples are performed with down-chirp and up-chirp waveforms, partially overlapped or inverse chirp-rate linear frequency modulation (LFM) waveforms and orthogonal frequency division multiplexing (OFDM) chirp diverse waveforms. The effectiveness of the algorithm is verified by the simulation results.
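
    The matched-filtering stage mentioned above can be illustrated with a short sketch (this is not the authors' cross-matched filtering algorithm): correlating the received signal with a radar's own chirp replica compresses that radar's echo while leaving an interfering chirp of opposite slope uncompressed. The waveform parameters and delays are assumed.

    ```python
    # Minimal sketch of matched filtering in the presence of an interfering radar:
    # an up-chirp echo compresses under its own replica, the down-chirp interferer does not.
    import numpy as np
    from scipy.signal import chirp, correlate

    fs, T = 10e6, 50e-6
    t = np.arange(0, T, 1 / fs)
    up = chirp(t, f0=0, f1=2e6, t1=T)              # radar A: up-chirp
    down = chirp(t, f0=2e6, f1=0, t1=T)            # radar B: down-chirp (interferer)

    rng = np.random.default_rng(9)
    rx = np.zeros(4096)
    rx[1000:1000 + t.size] += up                   # echo of radar A's own waveform
    rx[1500:1500 + t.size] += 0.8 * down           # interference from radar B
    rx += 0.1 * rng.normal(size=rx.size)

    out = correlate(rx, up, mode="valid")          # matched filter for radar A
    print("peak at sample", int(np.argmax(np.abs(out))))   # ~1000: A's echo compresses
    ```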

  14. Microsoft BizTalk ESB Toolkit 2.1

    CERN Document Server

    Benito, Andrés Del Río

    2013-01-01

    A practical guide to the architecture and features that make up the services and components of the ESB Toolkit. This book is for experienced BizTalk developers, administrators, and architects, as well as IT managers and BizTalk business analysts. Knowledge of and experience with the Toolkit are not a requirement.

  15. Basic Radar Altimetry Toolbox: Tools and Tutorial To Use Radar Altimetry For Cryosphere

    Science.gov (United States)

    Benveniste, J. J.; Bronner, E.; Dinardo, S.; Lucas, B. M.; Rosmorduc, V.; Earith, D.

    2010-12-01

    Radar altimetry is a technique whose range of applications keeps expanding. While considerable effort has been made for oceanography users (including easy-to-use data), the use of these data for cryosphere applications, especially with the new ESA CryoSat-2 mission data, is still somewhat tedious, particularly for new users of altimetry data products. ESA and CNES therefore had the Basic Radar Altimetry Toolbox developed a few years ago, and are improving and upgrading it to fit new missions and the growing number of altimetry uses. The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral missions, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL, - as processing/extraction routines, through the on-line command mode, - as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. It is an opportunity to teach remote sensing with practical training. It has been available since April 2007 and has been demonstrated during training courses and scientific meetings. About 1200 people have downloaded it (as of summer 2010), with many "newcomers" to altimetry among them, including teachers

  16. Digital Beamforming Synthetic Aperture Radar Developments at NASA Goddard Space Flight Center

    Science.gov (United States)

    Rincon, Rafael; Fatoyinbo, Temilola; Osmanoglu, Batuhan; Lee, Seung Kuk; Du Toit, Cornelis F.; Perrine, Martin; Ranson, K. Jon; Sun, Guoqing; Deshpande, Manohar; Beck, Jaclyn

    2016-01-01

    Advanced Digital Beamforming (DBF) Synthetic Aperture Radar (SAR) technology is an area of research and development pursued at the NASA Goddard Space Flight Center (GSFC). Advanced SAR architectures enhance radar performance and open a new set of capabilities in radar remote sensing. DBSAR-2 and EcoSAR are two state-of-the-art radar systems recently developed and tested. These new instruments employ multiple input-multiple output (MIMO) architectures characterized by multi-mode operation, software defined waveform generation, digital beamforming, and configurable radar parameters. The instruments have been developed to support several disciplines in Earth and Planetary sciences. This paper describes the radars' advanced features and reports on the latest SAR processing and calibration efforts.
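
    Digital beamforming itself can be sketched compactly (this is not the DBSAR-2/EcoSAR processing chain): phase weights applied to a uniform linear array make the channels add coherently in a chosen look direction. The frequency, element count and spacing below are assumptions.

    ```python
    # Minimal sketch of delay-and-sum (phase-shift) beamforming on a uniform linear array.
    import numpy as np

    c, f = 3e8, 435e6                              # P-band-like frequency (assumed)
    lam = c / f
    n_elem, d = 8, lam / 2                         # 8 receive channels, half-wavelength spacing

    def steering_vector(theta_deg):
        n = np.arange(n_elem)
        return np.exp(-2j * np.pi * d * n * np.sin(np.radians(theta_deg)) / lam)

    # simulate a plane wave arriving from 20 degrees off boresight, plus channel noise
    rng = np.random.default_rng(10)
    signal = np.exp(1j * 2 * np.pi * rng.random())           # unit-amplitude snapshot
    snapshot = signal * steering_vector(20.0) + 0.05 * (
        rng.normal(size=n_elem) + 1j * rng.normal(size=n_elem))

    # scan the beam and find where the array output power peaks
    angles = np.linspace(-60, 60, 241)
    power = [np.abs(np.vdot(steering_vector(a), snapshot)) ** 2 for a in angles]
    print("beam peak at", angles[int(np.argmax(power))], "degrees")
    ```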

  17. An Industrial Physics Toolkit

    Science.gov (United States)

    Cummings, Bill

    2004-03-01

    Physicists possess many skills highly valued in industrial companies. However, with the exception of a decreasing number of positions in long range research at large companies, job openings in industry rarely say "Physicist Required." One key to a successful industrial career is to know what subset of your physics skills is most highly valued by a given industry and to continue to build these skills while working. This combination of skills from both academic and industrial experience becomes your "Industrial Physics Toolkit" and is a transferable resource when you change positions or companies. This presentation will describe how one builds and sells your own "Industrial Physics Toolkit" using concrete examples from the speaker's industrial experience.

  18. ImTK: an open source multi-center information management toolkit

    Science.gov (United States)

    Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.

    2008-03-01

    The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications (DICOM), Health Level 7 (HL7), and the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.

  19. J-TEXT-EPICS: An EPICS toolkit attempted to improve productivity

    International Nuclear Information System (INIS)

    Zheng, Wei; Zhang, Ming; Zhang, Jing; Zhuang, Ge

    2013-01-01

    Highlights: • Tokamak control applications can be developed in a very short period with J-TEXT-EPICS. • J-TEXT-EPICS enables users to build control applications with device-oriented functions. • J-TEXT-EPICS is fully compatible with the EPICS Channel Access protocol. • J-TEXT-EPICS can be easily extended by plug-ins and drivers. -- Abstract: The Joint Texas Experimental Tokamak (J-TEXT) team has developed a new software toolkit for building Experimental Physics and Industrial Control System (EPICS) control applications called J-TEXT-EPICS. It aims to improve the development efficiency of control applications. With device-oriented features, it can be used to set or obtain the configuration or status of a device as well as invoke methods on a device. With its modularized design, its functions can be easily extended. J-TEXT-EPICS is completely compatible with the original EPICS Channel Access protocol and can be integrated into existing EPICS control systems smoothly. It is fully implemented in C#, and thus benefits from the abundant resources in the .NET Framework. The J-TEXT control system is built with this toolkit. This paper presents the design and implementation of J-TEXT-EPICS as well as its application in the J-TEXT control system.

  20. Modern Radar Techniques for Geophysical Applications: Two Examples

    Science.gov (United States)

    Arokiasamy, B. J.; Bianchi, C.; Sciacca, U.; Tutone, G.; Zirizzotti, A.; Zuccheretti, E.

    2005-01-01

    The last decade of the evolution of radar was heavily influenced by the rapid increase in information processing capabilities. Advances in solid-state HF radio devices, digital technology, computing architectures and software enabled designers to develop very efficient radars. In designing modern radars the emphasis goes towards simplifying the system hardware and reducing overall power, which is compensated for by coding and real-time signal processing techniques. Radars are commonly employed in geophysical radio soundings such as ionospheric probing, stratosphere-mesosphere measurements, weather forecasting, GPR and radio-glaciology. In the laboratorio di Geofisica Ambientale of the Istituto Nazionale di Geofisica e Vulcanologia (INGV), Rome, Italy, we developed two pulse compression radars. The first is an HF radar called AIS-INGV (Advanced Ionospheric Sounder), designed both for research purposes and for the routine service of HF radio wave propagation forecasting. The second is a VHF radar called GLACIORADAR, which will replace the high-power envelope radar used by the Italian glaciological group. It will be employed in studying the subglacial structures of Antarctica, giving information about layering, the bedrock and subglacial lakes if present. These are low-power radars, which rely heavily on advanced hardware and powerful real-time signal processing. Additional information is included in the original extended abstract.

  1. Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]

    2017-12-19

    Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit presentation from the WPTO FY14-FY16 Peer Review. The toolkit is aimed at regulatory agencies, consultants, project developers, the public, and any other party interested in learning more about the hydropower regulatory process.

  2. A computer simulation of a long-range CWFM radar showing the tradeoffs of performance as a function of range

    Science.gov (United States)

    Gordy, Robert S.; Zoledziowski, Severyn

    2011-06-01

    This paper describes a study of the operation of a long range CWFM radar using "System View" software for modeling and simulation. The System View software is currently offered by Agilent. The models that were studied include: a model illustrating the basic principle of operation of the CWFM radar, the range resolution of the radar, the effect of long range processing and the resultant approach with the tradeoff of detected range resolution due to Doppler frequency shift as a function of range distance. The study was performed as part of the design of an airborne CWFM radar. The radar can be designed with a single antenna or a dual antenna. The dual antenna approach is presented in this paper.

  3. Water Security Toolkit User Manual Version 1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Katherine A.; Siirola, John Daniel; Hart, David; Hart, William Eugene; Phillips, Cynthia Ann; Haxton, Terranna; Murray, Regan; Janke, Robert; Taxon, Thomas; Laird, Carl; Seth, Arpan; Hackebeil, Gabriel; McGee, Shawn; Mann, Angelica

    2014-08-01

    The Water Security Toolkit (WST) is a suite of open source software tools that can be used by water utilities to create response strategies to reduce the impact of contamination in a water distribution network. WST includes hydraulic and water quality modeling software, optimization methodologies, and visualization tools to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove, or destroy contaminants, (5) locations in the network to take grab samples to help identify the source of contamination and (6) valves to close in order to isolate contaminated areas of the network. This user manual describes the different components of WST, along with examples and case studies. License Notice The Water Security Toolkit (WST) v.1.2 Copyright (c) 2012 Sandia Corporation. Under the terms of Contract DE-AC04-94AL85000, there is a non-exclusive license for use of this work by or on behalf of the U.S. government. This software is distributed under the Revised BSD License (see below). In addition, WST leverages a variety of third-party software packages, which have separate licensing policies: Acro (Revised BSD License); argparse (Python Software Foundation License); Boost (Boost Software License); Coopr (Revised BSD License); Coverage (BSD License); Distribute (Python Software Foundation License / Zope Public License); EPANET (Public Domain); EPANET-ERD (Revised BSD License); EPANET-MSX (GNU Lesser General Public License (LGPL) v.3); gcovr (Revised BSD License); GRASP (AT&T Commercial License for noncommercial use; includes randomsample and sideconstraints executable files); LZMA SDK (Public Domain); nose (GNU Lesser General Public License (LGPL) v.2.1); ordereddict (MIT License); pip (MIT License); PLY (BSD License); PyEPANET (Revised BSD License); Pyro (MIT License); PyUtilib (Revised BSD License); Py

  4. Provenance tracking for scientific software toolchains through on-demand release and archiving

    Science.gov (United States)

    Ham, David

    2017-04-01

    There is an emerging consensus that published computational science results must be backed by a provenance chain tying results to the exact versions of input data and the code which generated them. There is also now an impressive range of web services devoted to revision control of software, and the archiving in citeable form of both software and input data. However, much scientific software itself builds on libraries and toolkits, and these themselves have dependencies. Further, it is common for cutting edge research to depend on the latest version of software in online repositories, rather than the official release version. This creates a situation in which an author who wishes to follow best practice in recording the provenance chain of their results must archive and cite unreleased versions of a series of dependencies. Here, we present an alternative which toolkit authors can easily implement to provide a semi-automatic mechanism for creating and archiving custom software releases of the precise version of a package used in a particular simulation. This approach leverages the excellent services provided by GitHub and Zenodo to generate a connected set of citeable DOIs for the archived software. We present the integration of this workflow into the Firedrake automated finite element framework as a practical example of this approach in use on a complex geoscientific tool chain in practical use.

  5. Pyradi: an open-source toolkit for infrared calculation and data processing

    CSIR Research Space (South Africa)

    Willers, CJ

    2012-09-01

    Full Text Available of such a toolkit facilitates and increases productivity during subsequent tool development: “develop once and use many times”. The concept of an extendible toolkit lends itself naturally to the open-source philosophy, where the toolkit user-base develops...

  6. The self-describing data sets file protocol and Toolkit

    International Nuclear Information System (INIS)

    Borland, M.; Emery, L.

    1995-01-01

    The Self-Describing Data Sets (SDDS) file protocol continues to be used extensively in commissioning the Advanced Photon Source (APS) accelerator complex. SDDS protocol has proved useful primarily due to the existence of the SDDS Toolkit, a growing set of about 60 generic commandline programs that read and/or write SDDS files. The SDDS Toolkit is also used extensively for simulation postprocessing, giving physicists a single environment for experiment and simulation. With the Toolkit, new SDDS data is displayed and subjected to complex processing without developing new programs. Data from EPICS, lab instruments, simulation, and other sources are easily integrated. Because the SDDS tools are commandline-based, data processing scripts are readily written using the user's preferred shell language. Since users work within a UNIX shell rather than an application-specific shell or GUI, they may add SDDS-compliant programs and scripts to their personal toolkits without restriction or complication. The SDDS Toolkit has been run under UNIX on SUN OS4, HP-UX, and LINUX. Application of SDDS to accelerator operation is being pursued using Tcl/Tk to provide a GUI

  7. The Exoplanet Characterization ToolKit (ExoCTK)

    Science.gov (United States)

    Stevenson, Kevin; Fowler, Julia; Lewis, Nikole K.; Fraine, Jonathan; Pueyo, Laurent; Valenti, Jeff; Bruno, Giovanni; Filippazzo, Joseph; Hill, Matthew; Batalha, Natasha E.; Bushra, Rafia

    2018-01-01

    The success of exoplanet characterization depends critically on a patchwork of analysis tools and spectroscopic libraries that currently require extensive development and lack a centralized support system. Due to the complexity of spectroscopic analyses and initial time commitment required to become productive, there are currently a limited number of teams that are actively advancing the field. New teams with significant expertise, but without the proper tools, face prohibitively steep hills to climb before they can contribute. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface focused primarily on atmospheric characterization of exoplanets and exoplanet transit observation planning with JWST. The foundation of these software tools and libraries exist within pockets of the exoplanet community. Our project will gather these seedling tools and grow a robust, uniform, and well maintained exoplanet characterization toolkit.

  8. Implementing a user-driven online quality improvement toolkit for cancer care.

    Science.gov (United States)

    Luck, Jeff; York, Laura S; Bowman, Candice; Gale, Randall C; Smith, Nina; Asch, Steven M

    2015-05-01

    Peer-to-peer collaboration within integrated health systems requires a mechanism for sharing quality improvement lessons. The Veterans Health Administration (VA) developed online compendia of tools linked to specific cancer quality indicators. We evaluated awareness and use of the toolkits, variation across facilities, impact of social marketing, and factors influencing toolkit use. A diffusion of innovations conceptual framework guided the collection of user activity data from the Toolkit Series SharePoint site and an online survey of potential Lung Cancer Care Toolkit users. The VA Toolkit Series site had 5,088 unique visitors in its first 22 months; 5% of users accounted for 40% of page views. Social marketing communications were correlated with site usage. Of survey respondents (n = 355), 54% had visited the site, of whom 24% downloaded at least one tool. Respondents' awareness of the lung cancer quality performance of their facility, and facility participation in quality improvement collaboratives, were positively associated with Toolkit Series site use. Facility-level lung cancer tool implementation varied widely across tool types. The VA Toolkit Series achieved widespread use and a high degree of user engagement, although use varied widely across facilities. The most active users were aware of and active in cancer care quality improvement. Toolkit use seemed to be reinforced by other quality improvement activities. A combination of user-driven tool creation and centralized toolkit development seemed to be effective for leveraging health information technology to spread disease-specific quality improvement tools within an integrated health care system. Copyright © 2015 by American Society of Clinical Oncology.

  9. Fpga based L-band pulse doppler radar design and implementation

    Science.gov (United States)

    Savci, Kubilay

    As its name implies, RADAR (Radio Detection and Ranging) is an electromagnetic sensor used for detecting and locating targets from their return signals. Radar systems propagate electromagnetic energy from the antenna, which is in part intercepted by an object. Objects reradiate a portion of this energy, which is captured by the radar receiver. The received signal is then processed for information extraction. Radar systems are widely used for surveillance, air security, navigation, and weather hazard detection, as well as remote sensing applications. In this work, an FPGA-based L-band Pulse Doppler radar prototype, which is used for target detection, localization, and velocity calculation, has been built and a general-purpose Pulse Doppler radar processor has been developed. This radar is a ground-based stationary monopulse radar, which transmits a short pulse with a certain pulse repetition frequency (PRF). Return signals from the target are processed and information about their location and velocity is extracted. Discrete components are used for the transmitter and receiver chain. The hardware solution is based on a Xilinx Virtex-6 ML605 FPGA board, responsible for the control of the radar system and the digital signal processing of the received signal, which involves Constant False Alarm Rate (CFAR) detection and Pulse Doppler processing. The algorithm is implemented in MATLAB/SIMULINK using the Xilinx System Generator for DSP tool. The field programmable gate array (FPGA) implementation of the radar system provides the flexibility of changing parameters such as the PRF and pulse length; therefore, it can be used with different radar configurations as well. A VHDL design has been developed for a 1 Gbit Ethernet connection to transfer the digitized return signal and detection results to a PC. A-Scope software has been developed in the C# programming language to display time-domain radar signals and detection results on the PC. Data are processed both in the FPGA chip and on the PC. FPGA uses fixed
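
    The CFAR step mentioned above can be illustrated with a simple cell-averaging CFAR in NumPy; this is a generic sketch of the algorithm, not the VHDL/System Generator implementation described in the record.

        # Cell-averaging CFAR: estimate noise from training cells around the
        # cell under test (excluding guard cells) and declare a detection when
        # the cell exceeds a scaled noise estimate. Generic sketch in NumPy.
        import numpy as np

        def ca_cfar(power, num_train=16, num_guard=2, scale=5.0):
            n = len(power)
            detections = np.zeros(n, dtype=bool)
            half = num_train // 2 + num_guard
            for i in range(half, n - half):
                leading = power[i - half : i - num_guard]
                trailing = power[i + num_guard + 1 : i + half + 1]
                noise = np.mean(np.concatenate((leading, trailing)))
                detections[i] = power[i] > scale * noise
            return detections

        rng = np.random.default_rng(0)
        profile = rng.exponential(1.0, 1024)    # noise-like range profile
        profile[500] += 40.0                    # injected target return
        print(np.nonzero(ca_cfar(profile))[0])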

  10. Design and Implementation of Modular Software for Programming Mobile Robots

    Directory of Open Access Journals (Sweden)

    Alessandro Farinelli

    2006-03-01

    Full Text Available This article describes a software development toolkit for programming mobile robots that has been used on different platforms and for different robotic applications. We address design choices, implementation issues and results in the realization of our robot programming environment, which has been devised and built by many people since 1998. We believe that the proposed framework is extremely useful not only for experienced robotic software developers, but also for students approaching robotic research projects.

  11. The 2016 ACCP Pharmacotherapy Didactic Curriculum Toolkit.

    Science.gov (United States)

    Schwinghammer, Terry L; Crannage, Andrew J; Boyce, Eric G; Bradley, Bridget; Christensen, Alyssa; Dunnenberger, Henry M; Fravel, Michelle; Gurgle, Holly; Hammond, Drayton A; Kwon, Jennifer; Slain, Douglas; Wargo, Kurt A

    2016-11-01

    The 2016 American College of Clinical Pharmacy (ACCP) Educational Affairs Committee was charged with updating and contemporizing ACCP's 2009 Pharmacotherapy Didactic Curriculum Toolkit. The toolkit has been designed to guide schools and colleges of pharmacy in developing, maintaining, and modifying their curricula. The 2016 committee reviewed the recent medical literature and other documents to identify disease states that are responsive to drug therapy. Diseases and content topics were organized by organ system, when feasible, and grouped into tiers as defined by practice competency. Tier 1 topics should be taught in a manner that prepares all students to provide collaborative, patient-centered care upon graduation and licensure. Tier 2 topics are generally taught in the professional curriculum, but students may require additional knowledge or skills after graduation (e.g., residency training) to achieve competency in providing direct patient care. Tier 3 topics may not be taught in the professional curriculum; thus, graduates will be required to obtain the necessary knowledge and skills on their own to provide direct patient care, if required in their practice. The 2016 toolkit contains 276 diseases and content topics, of which 87 (32%) are categorized as tier 1, 133 (48%) as tier 2, and 56 (20%) as tier 3. The large number of tier 1 topics will require schools and colleges to use creative pedagogical strategies to achieve the necessary practice competencies. Almost half of the topics (48%) are tier 2, highlighting the importance of postgraduate residency training or equivalent practice experience to competently care for patients with these disorders. The Pharmacotherapy Didactic Curriculum Toolkit will continue to be updated to provide guidance to faculty at schools and colleges of pharmacy as these academic pharmacy institutions regularly evaluate and modify their curricula to keep abreast of scientific advances and associated practice changes. Access the

  12. Beyond Bundles - Reproducible Software Environments with GNU Guix

    CERN Multimedia

    CERN. Geneva; Wurmus, Ricardo

    2018-01-01

    Building reproducible data analysis pipelines and numerical experiments is a key challenge for reproducible science, in which tools to reproduce software environments play a critical role. The advent of “container-based” deployment tools such as Docker and Singularity has made it easier to replicate software environments. These tools are very much about bundling the bits of software binaries in a convenient way, not so much about describing how software is composed. Science is not just about replicating, though—it demands the ability to inspect and to experiment. In this talk we will present GNU Guix, a software management toolkit. Guix departs from container-based solutions in that it enables declarative composition of software environments. It is comparable to “package managers” like apt or yum, but with a significant difference: Guix provides accurate provenance tracking of build artifacts, and bit-reproducible software. We will illustrate the many ways in which Guix can improve how software en...

  13. A GIS-based disaggregate spatial watershed analysis using RADAR data

    International Nuclear Information System (INIS)

    Al-Hamdan, M.

    2002-01-01

    Hydrology is the study of water in all its forms, origins, and destinations on the earth. This paper develops a novel modeling technique using a geographic information system (GIS) to facilitate watershed hydrological routing using RADAR data. The RADAR rainfall data, segmented to 4 km by 4 km blocks, divides the watershed into several sub-basins which are modeled independently. A case study for the GIS-based disaggregate spatial watershed analysis using RADAR data is provided for South Fork Cowikee Creek near Batesville, Alabama. All the data necessary to complete the analysis is maintained in the ArcView GIS software. This paper concludes that the GIS-based disaggregate spatial watershed analysis using RADAR data is a viable method to calculate hydrological routing for large watersheds. (author)

  14. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit

    Directory of Open Access Journals (Sweden)

    Morley Chris

    2008-03-01

    Full Text Available Abstract Background Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Results Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Conclusion Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers.
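
    For context, a minimal usage sketch of the module described here, assuming the classic standalone pybel import used by Open Babel 2.x (recent Open Babel releases expose the same module as openbabel.pybel):

        # Minimal Pybel sketch: parse SMILES strings, report a property,
        # compute fingerprints and a Tanimoto similarity. Assumes Open Babel
        # with its Python bindings is installed.
        import pybel   # in Open Babel 3.x: from openbabel import pybel

        ethanol = pybel.readstring("smi", "CCO")
        propanol = pybel.readstring("smi", "CCCO")

        print("MW of ethanol:", ethanol.molwt)
        print("Tanimoto:", ethanol.calcfp() | propanol.calcfp())

        # Iterate over molecules in a file (readfile returns a generator).
        # for mol in pybel.readfile("sdf", "library.sdf"):
        #     print(mol.title, mol.molwt)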

  15. An open source toolkit for medical imaging de-identification

    International Nuclear Information System (INIS)

    Rodriguez Gonzalez, David; Carpenter, Trevor; Wardlaw, Joanna; Hemert, Jano I. van

    2010-01-01

    Medical imaging acquired for clinical purposes can have several legitimate secondary uses in research projects and teaching libraries. No commonly accepted solution for anonymising these images exists because the amount of personal data that should be preserved varies case by case. Our objective is to provide a flexible mechanism for anonymising Digital Imaging and Communications in Medicine (DICOM) data that meets the requirements for deployment in multicentre trials. We reviewed our current de-identification practices and defined the relevant use cases to extract the requirements for the de-identification process. We then used these requirements in the design and implementation of the toolkit. Finally, we tested the toolkit taking as a reference those requirements, including a multicentre deployment. The toolkit successfully anonymised DICOM data from various sources. Furthermore, it was shown that it could forward anonymous data to remote destinations, remove burned-in annotations, and add tracking information to the header. The toolkit also implements the DICOM standard confidentiality mechanism. A DICOM de-identification toolkit that facilitates the enforcement of privacy policies was developed. It is highly extensible, provides the necessary flexibility to account for different de-identification requirements and has a low adoption barrier for new users. (orig.)
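
    The basic de-identification operations described here (removing or replacing header elements while keeping the pixel data) can be sketched with the independent pydicom library. This is not the toolkit's own API, and a production workflow would follow the full DICOM confidentiality profile rather than this short replace-and-strip example.

        # Sketch of simple DICOM header de-identification using pydicom
        # (a separate open-source library, not the toolkit in this record).
        import pydicom

        REPLACE = {                      # illustrative subset of identifying tags
            "PatientName": "ANONYMOUS",
            "PatientID": "SUBJ-0001",
            "PatientBirthDate": "",
        }

        def deidentify(in_path, out_path):
            ds = pydicom.dcmread(in_path)
            for keyword, value in REPLACE.items():
                if keyword in ds:
                    setattr(ds, keyword, value)
            ds.remove_private_tags()     # drop vendor-specific private elements
            # A tracking element could be added here per local policy.
            ds.save_as(out_path)

        # deidentify("scan.dcm", "scan_anon.dcm")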

  16. CHASM and SNVBox: toolkit for detecting biologically important single nucleotide mutations in cancer.

    Science.gov (United States)

    Wong, Wing Chung; Kim, Dewey; Carter, Hannah; Diekhans, Mark; Ryan, Michael C; Karchin, Rachel

    2011-08-01

    Thousands of cancer exomes are currently being sequenced, yielding millions of non-synonymous single nucleotide variants (SNVs) of possible relevance to disease etiology. Here, we provide a software toolkit to prioritize SNVs based on their predicted contribution to tumorigenesis. It includes a database of precomputed, predictive features covering all positions in the annotated human exome and can be used either stand-alone or as part of a larger variant discovery pipeline. MySQL database, source code and binaries freely available for academic/government use at http://wiki.chasmsoftware.org. Source in Python and C++. Requires 32 or 64-bit Linux system (tested on Fedora Core 8,10,11 and Ubuntu 10), 2.5 ≤ Python < 3.0, 60 GB available hard disk space (50 MB for software and data files, 40 GB for MySQL database dump when uncompressed), 2 GB of RAM.

  17. Veterinary Immunology Committee Toolkit Workshop 2010: Progress and plans

    Science.gov (United States)

    The Third Veterinary Immunology Committee (VIC) Toolkit Workshop took place at the Ninth International Veterinary Immunology Symposium (IVIS) in Tokyo, Japan on August 18, 2010. The Workshop built on previous Toolkit Workshops and covered various aspects of reagent development, commercialisation an...

  18. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    DEFF Research Database (Denmark)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver

    2017-01-01

    ...processing tools for the analysis of large arrayed MSI sample sets. The OpenMSI Arrayed Analysis Toolkit (OMAAT) is a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside...

  19. Radar detection of ultra high energy cosmic rays

    Science.gov (United States)

    Myers, Isaac J.

    TARA (Telescope Array Radar) is a cosmic ray radar detection experiment co-located with Telescope Array, the conventional surface scintillation detector (SD) and fluorescence telescope detector (FD) near Delta, UT. The TARA detector combines a 40 kW transmitter and high gain transmitting antenna which broadcasts the radar carrier over the SD array and in the FD field of view to a 250 MS/s DAQ receiver. Data collection began in August, 2013. TARA stands apart from other cosmic ray radar experiments in that radar data is directly compared with conventional cosmic ray detector events. The transmitter is also directly controlled by TARA researchers. Waveforms from the FD-triggered data stream are time-matched with TA events and searched for signal using a novel signal search technique in which the expected (simulated) radar echo of a particular air shower is used as a matched filter template and compared to radio waveforms. This technique is used to calculate the radar cross-section (RCS) upper-limit on all triggers that correspond to well-reconstructed TA FD monocular events. Our lowest cosmic ray RCS upper-limit is 42 cm2 for an 11 EeV event. An introduction to cosmic rays is presented with the evolution of detection and the necessity of new detection techniques, of which radar detection is a candidate. The software simulation of radar scattering from cosmic rays follows. The TARA detector, including transmitter and receiver systems, are discussed in detail. Our search algorithm and methodology for calculating RCS is presented for the purpose of being repeatable. Search results are explained in context of the usefulness and future of cosmic ray radar detection.
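
    The matched-filter search described above amounts to cross-correlating each recorded waveform with a simulated echo template and examining the peak response. The NumPy sketch below is a generic illustration of that idea, not the TARA analysis code, and the synthetic template and waveform are made up for the example.

        # Generic matched-filter sketch: correlate a received waveform against
        # an echo template and report the best-matching offset and score.
        import numpy as np

        def matched_filter_peak(waveform, template):
            template = (template - template.mean()) / np.linalg.norm(template)
            corr = np.correlate(waveform, template, mode="valid")
            k = int(np.argmax(np.abs(corr)))
            return k, float(corr[k])

        rng = np.random.default_rng(1)
        template = np.sin(2 * np.pi * 0.05 * np.arange(200)) * np.hanning(200)
        waveform = rng.normal(0, 1.0, 5000)
        waveform[3000:3200] += 0.5 * template          # weak buried echo
        offset, score = matched_filter_peak(waveform, template)
        print(offset, score)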

  20. Radar equations for modern radar

    CERN Document Server

    Barton, David K

    2012-01-01

    Based on the classic Radar Range-Performance Analysis from 1980, this practical volume extends that work to ensure applicability of radar equations to the design and analysis of modern radars. This unique book helps you identify what information on the radar and its environment is needed to predict detection range. Moreover, it provides equations and data to improve the accuracy of range calculations. You find detailed information on propagation effects, methods of range calculation in environments that include clutter, jamming and thermal noise, as well as loss factors that reduce radar perfo
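
    For orientation, the starting point for such range-performance analysis is the classic monostatic radar range equation; in its simplest free-space form (before the propagation, clutter, and loss factors the book adds) it can be written as:

        R_{\max} = \left[ \frac{P_t \, G^2 \, \lambda^2 \, \sigma}{(4\pi)^3 \, S_{\min}} \right]^{1/4}

    where P_t is the transmitted power, G the antenna gain, lambda the wavelength, sigma the target radar cross section, and S_min the minimum detectable signal power.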

  1. SIGKit: a New Data-based Software for Learning Introductory Geophysics

    Science.gov (United States)

    Zhang, Y.; Kruse, S.; George, O.; Esmaeili, S.; Papadimitrios, K. S.; Bank, C. G.; Cadmus, A.; Kenneally, N.; Patton, K.; Brusher, J.

    2016-12-01

    Students of diverse academic backgrounds take introductory geophysics courses to learn the theory of a variety of measurement and analysis methods with the expectation to be able to apply their basic knowledge to real data. Ideally, such data is collected in field courses and also used in lecture-based courses because they provide a critical context for better learning and understanding of geophysical methods. Each method requires a separate software package for the data processing steps, and the complexity and variety of professional software makes the path through data processing to data interpretation a strenuous learning process for students and a challenging teaching task for instructors. SIGKit (Student Investigation of Geophysics Toolkit) being developed as a collaboration between the University of South Florida, the University of Toronto, and MathWorks intends to address these shortcomings by showing the most essential processing steps and allowing students to visualize the underlying physics of the various methods. It is based on MATLAB software and offered as an easy-to-use graphical user interface and packaged so it can run as an executable in the classroom and the field even on computers without MATLAB licenses. An evaluation of the software based on student feedback from focus-group interviews and think-aloud observations helps drive its development and refinement. The toolkit provides a logical gateway into the more sophisticated and costly software students will encounter later in their training and careers by combining essential visualization, modeling, processing, and analysis steps for seismic, GPR, magnetics, gravity, resistivity, and electromagnetic data.

  2. Communities and Spontaneous Urban Planning: A Toolkit for Urban ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    State-led urban planning is often absent, which creates unsustainable environments and hinders the integration of migrants. Communities' prospects of ... This toolkit is expected to be a viable alternative for planning urban expansion wherever it cannot be carried out through traditional means. The toolkit will be tested in ...

  3. Specification for a surface-search radar-detection-range model

    Science.gov (United States)

    Hattan, Claude P.

    1990-09-01

    A model that predicts surface-search radar detection range versus a variety of combatants has been developed at the Naval Ocean Systems Center. This model uses a simplified ship radar cross section (RCS) model and the U.S. Navy Oceanographic and Atmospheric Mission Library Standard Electromagnetic Propagation Model. It provides the user with a method of assessing the effects of the environment on the performance of a surface-search radar system. The software implementation of the model is written in ANSI FORTRAN 77, with MIL-STD-1753 extensions. The program provides the user with a table of expected detection ranges when the model is supplied with the proper environmental and radar system inputs. The target model includes the variation in RCS as a function of aspect angle and the distribution of reflected radar energy as a function of height above the waterline. The modeled propagation effects include refraction caused by a multisegmented refractivity profile, sea-surface roughness caused by local winds, evaporation ducting, and surface-based ducts caused by atmospheric layering.
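
    A minimal numeric counterpart to such a detection-range table is sketched below using the free-space radar equation only; it deliberately ignores the ducting, sea-surface roughness, losses, and aspect-dependent RCS effects the model actually treats, and all parameter values are hypothetical.

        # Free-space detection range from the monostatic radar equation.
        # Deliberately ignores propagation effects (ducting, clutter, losses)
        # that the model described in this record accounts for.
        import math

        def free_space_detection_range(pt_w, gain_db, freq_hz, rcs_m2, smin_w):
            g = 10 ** (gain_db / 10.0)
            lam = 3.0e8 / freq_hz
            r4 = (pt_w * g**2 * lam**2 * rcs_m2) / ((4 * math.pi) ** 3 * smin_w)
            return r4 ** 0.25

        # Hypothetical surface-search radar parameters:
        r = free_space_detection_range(pt_w=25e3, gain_db=30.0, freq_hz=9.4e9,
                                       rcs_m2=1.0e3, smin_w=1.0e-13)
        print(f"Free-space detection range: {r/1e3:.1f} km")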

  4. Experiment in Onboard Synthetic Aperture Radar Data Processing

    Science.gov (United States)

    Holland, Matthew

    2011-01-01

    Single event upsets (SEUs) are a threat to any computing system running on hardware that has not been physically radiation hardened. In addition to mandating the use of performance-limited, hardened heritage equipment, prior techniques for dealing with the SEU problem often involved hardware-based error detection and correction (EDAC). With limited computing resources, software-based EDAC, or any more elaborate recovery methods, were often not feasible. Synthetic aperture radars (SARs), when operated in the space environment, are interesting due to their relevance to NASA's objectives, but problematic in the sense of producing prodigious amounts of raw data. Prior implementations of the SAR data processing algorithm have been too slow, too computationally intensive, and require too much application memory for onboard execution to be a realistic option when using the type of heritage processing technology described above. This standard C-language implementation of SAR data processing is distributed over many cores of a Tilera Multicore Processor, and employs novel Radiation Hardening by Software (RHBS) techniques designed to protect the component processes (one per core) and their shared application memory from the sort of SEUs expected in the space environment. The source code includes calls to Tilera APIs, and a specialized Tilera compiler is required to produce a Tilera executable. The compiled application reads input data describing the position and orientation of a radar platform, as well as its radar-burst data, over time and writes out processed data in a form that is useful for analysis of the radar observations.
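
    The record does not spell out the RHBS techniques used, but one common software-hardening pattern is redundant computation with majority voting, sketched generically below; in a real system the redundant copies would run on separate cores and memory regions rather than in a simple loop.

        # Generic software-hardening sketch: run a computation redundantly and
        # vote on the results, so a single upset in one copy is outvoted.
        # Illustrative of RHBS in general, not the Tilera implementation here.
        from collections import Counter

        def voted(compute, *args, copies=3):
            results = [compute(*args) for _ in range(copies)]
            value, count = Counter(results).most_common(1)[0]
            if count <= copies // 2:
                raise RuntimeError("no majority - recompute or flag the block")
            return value

        def range_compress(sample):          # stand-in for a SAR processing step
            return sample * 2 + 1

        print(voted(range_compress, 21))     # -> 43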

  5. Design-based learning in classrooms using playful digital toolkits

    NARCIS (Netherlands)

    Scheltenaar, K.J.; van der Poel, J.E.C.; Bekker, Tilde

    2015-01-01

    The goal of this paper is to explore how to implement Design Based Learning (DBL) with digital toolkits to teach 21st century skills in (Dutch) schools. It describes the outcomes of a literature study and two design case studies in which such a DBL approach with digital toolkits was iteratively

  6. Technical Note: DIRART- A software suite for deformable image registration and adaptive radiotherapy research

    International Nuclear Information System (INIS)

    Yang Deshan; Brame, Scott; El Naqa, Issam; Apte, Aditya; Wu Yu; Murty Goddu, S.; Mutic, Sasa; Deasy, Joseph O.; Low, Daniel A.

    2011-01-01

    Purpose: Recent years have witnessed tremendous progress in image-guided radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated to adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). Methods: DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with a focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse consistency algorithms that provide consistent displacement vector fields in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. Results: DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers a good range of options for DIR results visualization, evaluation, and validation. Conclusions: By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research.

  7. An Overview of the Geant4 Toolkit

    CERN Document Server

    Apostolakis, John

    2007-01-01

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice between different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, visualise and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. App...

  8. The Lean and Environment Toolkit

    Science.gov (United States)

    This Lean and Environment Toolkit assembles practical experience collected by the U.S. Environmental Protection Agency (EPA) and partner companies and organizations that have experience with coordinating Lean implementation and environmental management.

  9. Decision support toolkit for integrated analysis and design of reclaimed water infrastructure.

    Science.gov (United States)

    Lee, Eun Jung; Criddle, Craig S; Geza, Mengistu; Cath, Tzahi Y; Freyberg, David L

    2018-05-01

    Planning of water reuse systems is a complex endeavor. We have developed a software toolkit, IRIPT (Integrated Urban Reclaimed Water Infrastructure Planning Toolkit) that facilitates planning and design of reclaimed water infrastructure for both centralized and hybrid configurations that incorporate satellite treatment plants (STPs). The toolkit includes a Pipeline Designer (PRODOT) that optimizes routing and sizing of pipelines for wastewater capture and reclaimed water distribution, a Selector (SelWTP) that assembles and optimizes wastewater treatment trains, and a Calculator (CalcBenefit) that estimates fees, revenues, and subsidies of alternative designs. For hybrid configurations, a Locator (LocSTP) optimizes siting of STPs and associated wastewater diversions by identifying manhole locations where the flowrates are sufficient to ensure that wastewater extracted and treated at an adjacent STP can generate the revenue needed to pay for treatment and delivery to customers. Practical local constraints are also applied to screen and identify STP locations. Once suitable sites are selected, System Integrator (ToolIntegrator) identifies a set of centralized and hybrid configurations that: (1) maximize reclaimed water supply, (2) maximize reclaimed water supply while also ensuring a financial benefit for the system, and (3) maximize the net financial benefit for the system. The resulting configurations are then evaluated by an Analyst (SANNA) that uses monetary and non-monetary criteria, with weights assigned to appropriate metrics by a decision-maker, to identify a preferred configuration. To illustrate the structure, assumptions, and use of IRIPT, we apply it to a case study for the city of Golden, CO. The criteria weightings provided by a local decision-maker lead to a preference for a centralized configuration in this case. The Golden case study demonstrates that IRIPT can efficiently analyze centralized and hybrid water reuse configurations and rank them

  10. NEAMS Software Licensing, Release, and Distribution: Implications for FY2013 Work Package Planning

    International Nuclear Information System (INIS)

    Bernholdt, David E.

    2012-01-01

    The vision of the NEAMS program is to bring truly predictive modeling and simulation (M and S) capabilities to the nuclear engineering community in order to enable a new approach to the analysis of nuclear systems. NEAMS anticipates issuing in FY 2018 a full release of its computational 'Fermi Toolkit' aimed at advanced reactor and fuel cycles. The NEAMS toolkit involves extensive software development activities, some of which have already been underway for several years, however, the Advanced Modeling and Simulation Office (AMSO), which sponsors the NEAMS program, has not yet issued any official guidance regarding software licensing, release, and distribution policies. This motivated an FY12 task in the Capability Transfer work package to develop and recommend an appropriate set of policies. The current preliminary report is intended to provide awareness of issues with implications for work package planning for FY13. We anticipate a small amount of effort associated with putting into place formal licenses and contributor agreements for NEAMS software which doesn't already have them. We do not anticipate any additional effort or costs associated with software release procedures or schedules beyond those dictated by the quality expectations for the software. The largest potential costs we anticipate would be associated with the setup and maintenance of shared code repositories for development and early access to NEAMS software products. We also anticipate an opportunity, with modest associated costs, to work with the Radiation Safety Information Computational Center (RSICC) to clarify export control assessment policies for software under development.

  11. Lean and Information Technology Toolkit

    Science.gov (United States)

    The Lean and Information Technology Toolkit is a how-to guide which provides resources to environmental agencies to help them use Lean Startup, Lean process improvement, and Agile tools to streamline and automate processes.

  12. Outage Risk Assessment and Management (ORAM) thermal-hydraulics toolkit

    International Nuclear Information System (INIS)

    Denny, V.E.; Wassel, A.T.; Issacci, F.; Pal Kalra, S.

    2004-01-01

    A PC-based thermal-hydraulic toolkit for use in support of outage optimization, management and risk assessment has been developed. This mechanistic toolkit incorporates simple models of key thermal-hydraulic processes which occur during an outage, such as recovery from or mitigation of outage upsets; this includes heat-up of water pools following loss of shutdown cooling, inadvertent drain down of the RCS, boiloff of coolant inventory, heatup of the uncovered core, and reflux cooling. This paper provides a list of key toolkit elements, briefly describes the technical basis and presents illustrative results for RCS transient behavior during reflux cooling, peak clad temperatures for an uncovered core and RCS response to loss of shutdown cooling. (author)
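
    As an illustration of the simple mechanistic models mentioned here (heat-up of a water pool after loss of shutdown cooling, followed by boiloff), a back-of-the-envelope energy balance is sketched below; the decay-heat and pool parameters are made up for the example and no structural heat sinks or losses are modeled.

        # Back-of-the-envelope pool heat-up and boiloff after loss of cooling.
        # Constant decay heat, no heat sinks or losses; parameter values are
        # hypothetical and for illustration only.
        CP_WATER = 4186.0       # J/(kg K)
        H_VAP = 2.257e6         # J/kg latent heat of vaporization

        def time_to_boil_hours(mass_kg, t_initial_c, decay_heat_w, t_sat_c=100.0):
            energy_j = mass_kg * CP_WATER * (t_sat_c - t_initial_c)
            return energy_j / decay_heat_w / 3600.0

        def boiloff_rate_kg_per_h(decay_heat_w):
            return decay_heat_w / H_VAP * 3600.0

        pool_mass = 2.0e5       # kg of water (hypothetical)
        decay_heat = 5.0e6      # W (hypothetical)
        print("Hours to saturation:", round(time_to_boil_hours(pool_mass, 40.0, decay_heat), 1))
        print("Boiloff rate (kg/h):", round(boiloff_rate_kg_per_h(decay_heat)))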

  13. WEB APPLICATION TO MANAGE DOCUMENTS USING THE GOOGLE WEB TOOLKIT AND APP ENGINE TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Velázquez Santana Eugenio César

    2017-12-01

    Full Text Available The application of new information technologies such as Google Web Toolkit and App Engine is making a difference in the academic management of Higher Education Institutions (IES), which seek to streamline their processes as well as reduce infrastructure costs. However, they encounter problems with regard to acquisition costs, the infrastructure necessary for their use, and the maintenance of the software; it is for this reason that the present research aims to describe the application of these new technologies in HEIs, as well as to identify their advantages and disadvantages and the key success factors in their implementation. SCRUM was used as the software development methodology, along with PMBOK as a project management tool. The main results were related to the application of these technologies in the development of customized software for teachers, students and administrators, as well as the weaknesses and strengths of using them in the cloud. On the other hand, it was also possible to describe the paradigm shift that data warehouses are generating with respect to today's relational databases.

  14. ‘Survival’: a simulation toolkit introducing a modular approach for radiobiological evaluations in ion beam therapy

    Science.gov (United States)

    Manganaro, L.; Russo, G.; Bourhaleb, F.; Fausti, F.; Giordanengo, S.; Monaco, V.; Sacchi, R.; Vignati, A.; Cirio, R.; Attili, A.

    2018-04-01

    One major rationale for the application of heavy ion beams in tumour therapy is their increased relative biological effectiveness (RBE). The complex dependencies of the RBE on dose, biological endpoint, position in the field, etc., require the use of biophysical models in treatment planning and clinical analysis. This study aims to introduce a new software package, named ‘Survival’, to facilitate the radiobiological computations needed in ion therapy. The simulation toolkit was written in C++ and it was developed with a modular architecture in order to easily incorporate different radiobiological models. The following models were successfully implemented: the local effect model (LEM, version I, II and III) and variants of the microdosimetric-kinetic model (MKM). Different numerical evaluation approaches were also implemented: Monte Carlo (MC) numerical methods and a set of faster analytical approximations. Among the possible applications, the toolkit was used to reproduce the RBE versus LET for different ions (proton, He, C, O, Ne) and different cell lines (CHO, HSG). Intercomparisons between different models (LEM and MKM) and computational approaches (MC and fast approximations) were performed. The developed software could represent an important tool for the evaluation of the biological effectiveness of charged particles in ion beam therapy, in particular when coupled with treatment simulations. Its modular architecture facilitates benchmarking and inter-comparison between different models and evaluation approaches. The code is open source (GPL2 license) and available at https://github.com/batuff/Survival.
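
    The bookkeeping behind an RBE evaluation can be sketched with the linear-quadratic survival model: given (alpha, beta) pairs produced by a model such as LEM or MKM for the ion and for the reference radiation, the RBE at a chosen survival level is the ratio of iso-effective doses. The values below are hypothetical placeholders; the toolkit's models compute the ion parameters from microdosimetric and track-structure inputs.

        # Linear-quadratic survival and iso-effect RBE. The (alpha, beta) pairs
        # are hypothetical placeholders for values a model like LEM or MKM
        # would supply for a given ion, LET and cell line.
        import math

        def lq_survival(dose, alpha, beta):
            return math.exp(-(alpha * dose + beta * dose**2))

        def iso_effect_dose(survival, alpha, beta):
            # Solve beta*D^2 + alpha*D + ln(S) = 0 for the positive root.
            ln_s = math.log(survival)
            return (-alpha + math.sqrt(alpha**2 - 4.0 * beta * ln_s)) / (2.0 * beta)

        def rbe(survival, ref_ab, ion_ab):
            return iso_effect_dose(survival, *ref_ab) / iso_effect_dose(survival, *ion_ab)

        reference = (0.15, 0.05)     # photons: alpha [1/Gy], beta [1/Gy^2] (hypothetical)
        carbon = (0.60, 0.05)        # carbon ions at some LET (hypothetical)
        print("Survival at 2 Gy (ref):", round(lq_survival(2.0, *reference), 3))
        print("RBE at 10% survival:", round(rbe(0.10, reference, carbon), 2))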

  15. Phased-array radar design application of radar fundamentals

    CERN Document Server

    Jeffrey, Thomas

    2009-01-01

    Phased-Array Radar Design is a text-reference designed for electrical engineering graduate students in colleges and universities as well as for corporate in-house training programs for radar design engineers, especially systems engineers and analysts who would like to gain hands-on, practical knowledge and skills in radar design fundamentals, advanced radar concepts, trade-offs for radar design and radar performance analysis.

  16. Cinfony – combining Open Source cheminformatics toolkits behind a common interface

    Directory of Open Access Journals (Sweden)

    Hutchison Geoffrey R

    2008-12-01

    Full Text Available Abstract Background Open Source cheminformatics toolkits such as OpenBabel, the CDK and the RDKit share the same core functionality but support different sets of file formats and forcefields, and calculate different fingerprints and descriptors. Despite their complementary features, using these toolkits in the same program is difficult as they are implemented in different languages (C++ versus Java), have different underlying chemical models and have different application programming interfaces (APIs). Results We describe Cinfony, a Python module that presents a common interface to all three of these toolkits, allowing the user to easily combine methods and results from any of the toolkits. In general, the run time of the Cinfony modules is almost as fast as accessing the underlying toolkits directly from C++ or Java, but Cinfony makes it much easier to carry out common tasks in cheminformatics such as reading file formats and calculating descriptors. Conclusion By providing a simplified interface and improving interoperability, Cinfony makes it easy to combine complementary features of OpenBabel, the CDK and the RDKit.

  17. Spatial distribution of errors associated with multistatic meteor radar

    Science.gov (United States)

    Hocking, W. K.

    2018-06-01

    With the recent increase in numbers of small and versatile low-power meteor radars, the opportunity exists to benefit from simultaneous application of multiple systems spaced by only a few hundred km or less. Transmissions from one site can be recorded at adjacent receiving sites using various degrees of forward scatter, potentially allowing atmospheric conditions in the mesopause regions between stations to be diagnosed. This can allow a better spatial overview of the atmospheric conditions at any time. Such studies have been carried out using a small version of such so-called multistatic meteor radars, e.g. Chau et al. (Radio Sci 52:811-828, 2017, https://doi.org/10.1002/2016rs006225). These authors were also able to make measurements of vorticity and divergence. However, measurement uncertainties arise which need to be considered in any application of such techniques. Some errors are so severe that they prohibit useful application of the technique in certain locations, particularly for zones at the midpoints of the radar sites. In this paper, software is developed to allow these errors to be determined, and examples of typical errors involved are discussed. The software should be of value to others who wish to optimize their own MMR systems.

  18. Provider perceptions of an integrated primary care quality improvement strategy: The PPAQ toolkit.

    Science.gov (United States)

    Beehler, Gregory P; Lilienthal, Kaitlin R

    2017-02-01

    The Primary Care Behavioral Health (PCBH) model of integrated primary care is challenging to implement with high fidelity. The Primary Care Behavioral Health Provider Adherence Questionnaire (PPAQ) was designed to assess provider adherence to essential model components and has recently been adapted into a quality improvement toolkit. The aim of this pilot project was to gather preliminary feedback on providers' perceptions of the acceptability and utility of the PPAQ toolkit for making beneficial practice changes. Twelve mental health providers working in Department of Veterans Affairs integrated primary care clinics participated in semistructured interviews to gather quantitative and qualitative data. Descriptive statistics and qualitative content analysis were used to analyze data. Providers identified several positive features of the PPAQ toolkit organization and structure that resulted in high ratings of acceptability, while also identifying several toolkit components in need of modification to improve usability. Toolkit content was considered highly representative of the PCBH model and therefore could be used as a diagnostic self-assessment of model adherence. The toolkit was considered to be high in applicability to providers regardless of their degree of prior professional preparation or current clinical setting. Additionally, providers identified several system-level contextual factors that could impact the usefulness of the toolkit. These findings suggest that frontline mental health providers working in PCBH settings may be receptive to using an adherence-focused toolkit for ongoing quality improvement. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  19. Knowledge information management toolkit and method

    Science.gov (United States)

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  20. Movement and respiration detection using statistical properties of the FMCW radar signal

    KAUST Repository

    Kiuru, Tero

    2016-07-26

    This paper presents a 24 GHz FMCW radar system for detection of movement and respiration using changes in the statistical properties of the received radar signal, both amplitude and phase. We present the hardware and software segments of the radar system as well as algorithms with measurement results for two distinct use-cases: 1. FMCW radar as a respiration monitor and 2. a dual-use of the same radar system for smart lighting and intrusion detection. By using changes in the statistical properties of the signal for detection, several system parameters can be relaxed, including, for example, pulse repetition rate, power consumption, computational load, processor speed, and memory space. We also demonstrate that the capability to switch between received signal strength and phase difference enables dual-use cases, with one requiring extreme sensitivity to movement and the other robustness against small sources of interference. © 2016 IEEE.
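
    A generic sketch of the statistical-change idea (not the authors' algorithm): compute a short-window variance of the demodulated signal and flag movement when it rises well above a longer-term baseline. Window lengths, the threshold ratio, and the synthetic data are arbitrary illustrative choices.

        # Movement detection from change in signal statistics: compare a short
        # sliding-window variance against a longer-term baseline. Generic sketch.
        import numpy as np

        def motion_flags(signal, short_win=32, long_win=512, ratio=4.0):
            flags = np.zeros(len(signal), dtype=bool)
            for i in range(long_win, len(signal)):
                recent = np.var(signal[i - short_win:i])
                baseline = np.var(signal[i - long_win:i - short_win]) + 1e-12
                flags[i] = recent > ratio * baseline
            return flags

        rng = np.random.default_rng(2)
        phase = rng.normal(0, 0.01, 4000)                           # quiet scene
        phase[2500:2600] += 0.2 * np.sin(np.linspace(0, 20, 100))   # movement burst
        print(np.nonzero(motion_flags(phase))[0][:5])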

  1. Kekule.js: An Open Source JavaScript Chemoinformatics Toolkit.

    Science.gov (United States)

    Jiang, Chen; Jin, Xi; Dong, Ying; Chen, Ming

    2016-06-27

    Kekule.js is an open-source, object-oriented JavaScript toolkit for chemoinformatics. It provides methods for many common tasks in molecular informatics, including chemical data input/output (I/O), two- and three-dimensional (2D/3D) rendering of chemical structure, stereo identification, ring perception, structure comparison, and substructure search. Encapsulated widgets to display and edit chemical structures directly in web context are also supplied. Developed with web standards, the toolkit is ideal for building chemoinformatics applications over the Internet. Moreover, it is highly platform-independent and can also be used in desktop or mobile environments. Some initial applications, such as plugins for inputting chemical structures on the web and uses in chemistry education, have been developed based on the toolkit.

  2. Validation of Power Output for the WIND Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    King, J.; Clifton, A.; Hodge, B. M.

    2014-09-01

    Renewable energy integration studies require wind data sets of high quality with realistic representations of the variability, ramping characteristics, and forecast performance for current wind power plants. The Wind Integration National Data Set (WIND) Toolkit is meant to be an update for and expansion of the original data sets created for the weather years from 2004 through 2006 during the Western Wind and Solar Integration Study and the Eastern Wind Integration Study. The WIND Toolkit expands these data sets to include the entire continental United States, increasing the total number of sites represented, and it includes the weather years from 2007 through 2012. In addition, the WIND Toolkit has a finer resolution for both the temporal and geographic dimensions. Three separate data sets will be created: a meteorological data set, a wind power data set, and a forecast data set. This report describes the validation of the wind power data set.

  3. Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040

    Science.gov (United States)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.

    2012-01-01

    Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated, optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating-to-mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backwards in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs. Finally, the software is managed in accordance with the CMMI (Capability Maturity Model

  4. APPLICATION OF SENTINEL-1 RADAR DATA FOR MAPPING HARD-TO-REACH NORTHERN TERRITORIES

    Directory of Open Access Journals (Sweden)

    Е. А. Baldina

    2017-01-01

    Full Text Available The new European space satellites Sentinel-1A and 1B with C-band radars on board, launched in 2014 and 2016 respectively, provide regular radar data on the Earth’s surface with high temporal resolution. These new non-commercial data provide extensive opportunities for research on remote Arctic territories, which are poorly covered by optical imagery due to cloud conditions. Difficulties in recognizing objects on radar images can be compensated for by the use of multiple repeated surveys, which make it possible to identify areas of terrain with a similar character of change. In the study, four Sentinel-1A images of the largest of the New Siberian Islands – Kotelny – were used, which were acquired during the summer period from July 3 to August 20, 2015. After preprocessing aimed at improving the visual properties and coregistration of the multitemporal images, an automated clustering of the multitemporal image set was carried out. Clustering results were analyzed by comparison with additional sources of spatial information. Both specialized software for Sentinel-1 radar data processing (SNAP) and the ArcGIS GIS software suite were used. The latter was used to create the spatial database for comparing the results of radar data processing and cartographic sources. A zoning map of the territory was obtained from the clustering results, based on the changes in the normalized radar cross section (sigma nought) over the summer period, and an approximate correspondence of the delineated areas to the main relief and landscape types of the island was established.
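
    A hedged sketch of the kind of automated clustering applied to a multitemporal backscatter stack, using scikit-learn's k-means on synthetic data rather than the SNAP/ArcGIS workflow of the study; the array shapes and k=4 are illustrative assumptions.

        # Unsupervised clustering of a multitemporal sigma-nought stack:
        # each pixel becomes a feature vector of its values across the dates.
        import numpy as np
        from sklearn.cluster import KMeans

        dates, rows, cols = 4, 100, 100
        rng = np.random.default_rng(3)
        stack_db = rng.normal(-12.0, 3.0, size=(dates, rows, cols))   # fake sigma0 [dB]

        pixels = stack_db.reshape(dates, -1).T        # (n_pixels, n_dates)
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(pixels)
        zoning_map = labels.reshape(rows, cols)       # cluster id per pixel
        print(np.bincount(labels))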

  5. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Directory of Open Access Journals (Sweden)

    Qi Zheng

    2016-10-01

    Full Text Available Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.
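
    The Bayesian flavour of such a framework can be illustrated in a few lines: turn the alignment scores of a read's candidate hits into posterior probabilities and report a Phred-scaled mapping quality for the best one. This is a generic sketch, not AlignerBoost's actual model (which also incorporates priors and SNP awareness).

        # Generic Bayesian mapping-quality sketch: softmax the candidate
        # alignment (log-likelihood-like) scores into posteriors, then report
        # MAPQ = -10*log10(1 - P(best)). Not AlignerBoost's exact model.
        import math

        def mapping_quality(scores, max_q=60):
            m = max(scores)
            weights = [math.exp(s - m) for s in scores]       # stable softmax
            total = sum(weights)
            p_best = max(weights) / total
            if p_best >= 1.0:
                return max_q
            return min(max_q, int(round(-10.0 * math.log10(1.0 - p_best))))

        print(mapping_quality([-5.0]))                 # unique hit -> 60
        print(mapping_quality([-5.0, -5.0]))           # two equal hits -> ~3
        print(mapping_quality([-5.0, -12.0, -15.0]))   # one clearly best hit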

  6. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Science.gov (United States)

    Zheng, Qi; Grice, Elizabeth A

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.

  7. ARC Code TI: Crisis Mapping Toolkit

    Data.gov (United States)

    National Aeronautics and Space Administration — The Crisis Mapping Toolkit (CMT) is a collection of tools for processing geospatial data (images, satellite data, etc.) into cartographic products that improve...

  8. Sealed radioactive sources toolkit

    International Nuclear Information System (INIS)

    Mac Kenzie, C.

    2005-09-01

    The IAEA has developed a Sealed Radioactive Sources Toolkit to provide information to key groups about the safety and security of sealed radioactive sources. The key groups addressed are officials in government agencies, medical users, industrial users and the scrap metal industry. The general public may also benefit from an understanding of the fundamentals of radiation safety

  9. The ECVET toolkit customization for the nuclear energy sector

    Energy Technology Data Exchange (ETDEWEB)

    Ceclan, Mihail; Ramos, Cesar Chenel; Estorff, Ulrike von [European Commission, Joint Research Centre, Petten (Netherlands). Inst. for Energy and Transport

    2015-04-15

    As part of its support to the introduction of ECVET in the nuclear energy sector, the Institute for Energy and Transport (IET) of the Joint Research Centre (JRC), European Commission (EC), through the ECVET Team of the European Human Resources Observatory for the Nuclear energy sector (EHRO-N), developed in the last six years (2009-2014) a sectorial approach and a road map for ECVET implementation in the nuclear energy sector. In order to observe the road map for the ECVET implementation, the toolkit customization for nuclear energy sector is required. This article describes the outcomes of the toolkit customization, based on ECVET approach, for nuclear qualifications design. The process of the toolkit customization took into account the fact that nuclear qualifications are mostly of higher levels (five and above) of the European Qualifications Framework.

  10. The ECVET toolkit customization for the nuclear energy sector

    International Nuclear Information System (INIS)

    Ceclan, Mihail; Ramos, Cesar Chenel; Estorff, Ulrike von

    2015-01-01

    As part of its support to the introduction of ECVET in the nuclear energy sector, the Institute for Energy and Transport (IET) of the Joint Research Centre (JRC), European Commission (EC), through the ECVET Team of the European Human Resources Observatory for the Nuclear energy sector (EHRO-N), developed in the last six years (2009-2014) a sectorial approach and a road map for ECVET implementation in the nuclear energy sector. In order to observe the road map for the ECVET implementation, the toolkit customization for nuclear energy sector is required. This article describes the outcomes of the toolkit customization, based on ECVET approach, for nuclear qualifications design. The process of the toolkit customization took into account the fact that nuclear qualifications are mostly of higher levels (five and above) of the European Qualifications Framework.

  11. A hardware-in-the-loop simulation program for ground-based radar

    Science.gov (United States)

    Lam, Eric P.; Black, Dennis W.; Ebisu, Jason S.; Magallon, Julianna

    2011-06-01

    A radar system created using an embedded computer system needs testing. Testing an embedded computer system differs from the debugging approaches used on desktop computers. One way to test a radar system is to feed it artificial inputs and analyze the radar's outputs. Often, not all of the building blocks of the radar system are available to test, which requires the engineer to test parts of the radar system using a "black box" approach. A common way to test software code in a desktop simulation is to use breakpoints so that it pauses after each cycle through its calculations; the outputs are then compared against the expected values, which requires the engineer to use valid test scenarios. We present a hardware-in-the-loop simulator that allows the embedded system to think it is operating with real-world inputs and outputs. From the embedded system's point of view, it is operating in real time. The hardware-in-the-loop simulation is based on our Desktop PC Simulation (PCS) testbed. In the past, PCS was used for ground-based radars. This embedded simulation, called Embedded PCS, allows rapid simulated evaluation of ground-based radar performance in a laboratory environment.
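
    The "black box" testing idea above can be pictured with a minimal, hypothetical harness (not the Embedded PCS code itself) that feeds scripted scenario inputs to a unit under test cycle by cycle and compares each output against an expected value within a tolerance:

        def run_black_box_test(unit_under_test, scenario, tolerance=1e-3):
            """Feed scripted stimuli to a black-box function and check each output cycle by cycle."""
            failures = []
            for cycle, (stimulus, expected) in enumerate(scenario):
                observed = unit_under_test(stimulus)
                if abs(observed - expected) > tolerance:
                    failures.append((cycle, stimulus, expected, observed))
            return failures

        # Hypothetical unit: a converter that should return half the round-trip distance.
        scenario = [(100.0, 50.0), (250.0, 125.0), (302.0, 151.0)]
        print(run_black_box_test(lambda d: d / 2.0, scenario))   # [] means every cycle passed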

  12. Multimethod evaluation of the VA's peer-to-peer Toolkit for patient-centered medical home implementation.

    Science.gov (United States)

    Luck, Jeff; Bowman, Candice; York, Laura; Midboe, Amanda; Taylor, Thomas; Gale, Randall; Asch, Steven

    2014-07-01

    Effective implementation of the patient-centered medical home (PCMH) in primary care practices requires training and other resources, such as online toolkits, to share strategies and materials. The Veterans Health Administration (VA) developed an online Toolkit of user-sourced tools to support teams implementing its Patient Aligned Care Team (PACT) medical home model. To present findings from an evaluation of the PACT Toolkit, including use, variation across facilities, effect of social marketing, and factors influencing use. The Toolkit is an online repository of ready-to-use tools created by VA clinic staff that physicians, nurses, and other team members may share, download, and adopt in order to more effectively implement PCMH principles and improve local performance on VA metrics. Multimethod evaluation using: (1) website usage analytics, (2) an online survey of the PACT community of practice's use of the Toolkit, and (3) key informant interviews. Survey respondents were PACT team members and coaches (n = 544) at 136 VA facilities. Interview respondents were Toolkit users and non-users (n = 32). For survey data, multivariable logistic models were used to predict Toolkit awareness and use. Interviews and open-text survey comments were coded using a "common themes" framework. The Consolidated Framework for Implementation Research (CFIR) guided data collection and analyses. The Toolkit was used by 6,745 staff in the first 19 months of availability. Among members of the target audience, 80 % had heard of the Toolkit, and of those, 70 % had visited the website. Tools had been implemented at 65 % of facilities. Qualitative findings revealed a range of user perspectives from enthusiastic support to lack of sufficient time to browse the Toolkit. An online Toolkit to support PCMH implementation was used at VA facilities nationwide. Other complex health care organizations may benefit from adopting similar online peer-to-peer resource libraries.

  13. POLCAL - POLARIMETRIC RADAR CALIBRATION

    Science.gov (United States)

    Vanzyl, J.

    1994-01-01

    Calibration of polarimetric radar systems is a field of research in which great progress has been made over the last few years. POLCAL (Polarimetric Radar Calibration) is a software tool intended to assist in the calibration of Synthetic Aperture Radar (SAR) systems. In particular, POLCAL calibrates Stokes matrix format data produced as the standard product by the NASA/Jet Propulsion Laboratory (JPL) airborne imaging synthetic aperture radar (AIRSAR). POLCAL was designed to be used in conjunction with data collected by the NASA/JPL AIRSAR system. AIRSAR is a multifrequency (6 cm, 24 cm, and 68 cm wavelength), fully polarimetric SAR system which produces 12 x 12 km imagery at 10 m resolution. AIRSAR was designed as a testbed for NASA's Spaceborne Imaging Radar program. While the images produced after 1991 are thought to be calibrated (phase calibrated, cross-talk removed, channel imbalance removed, and absolutely calibrated), POLCAL can and should still be used to check the accuracy of the calibration and to correct it if necessary. Version 4.0 of POLCAL is an upgrade of POLCAL version 2.0 released to AIRSAR investigators in June, 1990. New options in version 4.0 include automatic absolute calibration of 89/90 data, distributed target analysis, calibration of nearby scenes with calibration parameters from a scene with corner reflectors, altitude or roll angle corrections, and calibration of errors introduced by known topography. Many sources of error can lead to false conclusions about the nature of scatterers on the surface. Errors in the phase relationship between polarization channels result in incorrect synthesis of polarization states. Cross-talk, caused by imperfections in the radar antenna itself, can also lead to error. POLCAL reduces cross-talk and corrects phase calibration without the use of ground calibration equipment. Removing the antenna patterns during SAR processing also forms a very important part of the calibration of SAR data. Errors in the
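
    The polarization-synthesis step that such phase errors corrupt can be sketched schematically as follows, assuming a 4 x 4 Stokes (Kennaugh) matrix per pixel and the standard synthesis relation P = 0.5 * s_r^T K s_t; the matrix K below is a made-up single-pixel example, and this is not POLCAL's code:

        import numpy as np

        def antenna_stokes(psi_deg, chi_deg):
            """Unit Stokes vector of an antenna with orientation psi and ellipticity chi (degrees)."""
            psi, chi = np.radians(psi_deg), np.radians(chi_deg)
            return np.array([1.0,
                             np.cos(2 * psi) * np.cos(2 * chi),
                             np.sin(2 * psi) * np.cos(2 * chi),
                             np.sin(2 * chi)])

        def synthesized_power(kennaugh, tx=(0.0, 0.0), rx=(0.0, 0.0)):
            """Received power for arbitrary transmit/receive polarizations from a 4x4 Stokes matrix."""
            return 0.5 * antenna_stokes(*rx) @ kennaugh @ antenna_stokes(*tx)

        K = np.diag([2.0, 1.0, 1.0, -1.0])                   # made-up single-pixel Stokes matrix
        print(synthesized_power(K, tx=(0, 0), rx=(0, 0)))    # HH: both antennas horizontal linear
        print(synthesized_power(K, tx=(90, 0), rx=(90, 0)))  # VV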

  14. Reconfigurable signal processor designs for advanced digital array radar systems

    Science.gov (United States)

    Suarez, Hernan; Zhang, Yan (Rockee); Yu, Xining

    2017-05-01

    The new challenges originating from Digital Array Radar (DAR) demand a new generation of reconfigurable backend processors in the system. The new FPGA devices can support much higher speed, more bandwidth and more processing capability for the needs of the digital Line Replaceable Unit (LRU). This study focuses on using the latest Altera and Xilinx devices in an adaptive beamforming processor. The field-reprogrammable RF devices from Analog Devices are used as analog front-end transceivers. Unlike other existing Software-Defined Radio transceivers on the market, this processor is designed for distributed adaptive beamforming in a networked environment. The following aspects of the novel radar processor are presented: (1) a new system-on-chip architecture based on Altera devices and an adaptive processing module, especially for adaptive beamforming and pulse compression; (2) successful implementation of Generation 2 Serial RapidIO data links on FPGA, which support the VITA-49 radio packet format for large distributed DAR processing; (3) demonstration of the feasibility and capabilities of the processor in a MicroTCA-based, SRIO switching backplane to support multichannel beamforming in real time; and (4) application of this processor in ongoing radar system development projects, including OU's dual-polarized digital array radar, the planned new cylindrical array radars, and future airborne radars.
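
    Of the processing stages listed above, pulse compression is the easiest to illustrate in isolation; the following generic matched-filter sketch (plain NumPy with assumed waveform parameters, not the FPGA firmware) correlates a simulated receive line against a linear FM reference and recovers the injected target delay:

        import numpy as np

        fs, T, B = 20e6, 10e-6, 5e6                      # sample rate, pulse width, sweep bandwidth
        n = int(T * fs)                                  # 200 samples in the reference pulse
        t = np.arange(n) / fs
        chirp = np.exp(1j * np.pi * (B / T) * t ** 2)    # linear FM reference

        rx = np.zeros(1024, dtype=complex)               # simulated receive line
        rx[200:200 + n] = chirp                          # echo injected at a 200-sample delay
        rx += 0.1 * (np.random.randn(1024) + 1j * np.random.randn(1024))

        compressed = np.correlate(rx, chirp, mode="valid")   # matched filter (correlate conjugates)
        print(int(np.argmax(np.abs(compressed))))            # prints 200, the injected delay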

  15. Terrain-Toolkit

    DEFF Research Database (Denmark)

    Wang, Qi; Kaul, Manohar; Long, Cheng

    2014-01-01

    Terrain data is becoming increasingly popular both in industry and in academia. Many tools have been developed for visualizing terrain data. However, we find that (1) they usually accept very few data formats of terrain data only; (2) they do not support terrain simplification well which, as will be shown, is used heavily for query processing in spatial databases; and (3) they do not provide the surface distance operator which is fundamental for many applications based on terrain data. Motivated by this, we developed a tool called Terrain-Toolkit for terrain data which accepts a comprehensive set...
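
    The surface distance operator mentioned in point (3) can be approximated naively by summing 3-D segment lengths along a sampled path over the terrain; the sketch below is only meant to convey the definition and is not Terrain-Toolkit's algorithm:

        import math

        def surface_distance(path, elevation):
            """Naive surface length of a path given as (x, y) grid cells plus a dict of elevations."""
            total = 0.0
            for (x0, y0), (x1, y1) in zip(path, path[1:]):
                dz = elevation[(x1, y1)] - elevation[(x0, y0)]
                total += math.sqrt((x1 - x0) ** 2 + (y1 - y0) ** 2 + dz ** 2)
            return total

        # Hypothetical profile: climbing one unit of height over each of two unit steps.
        heights = {(0, 0): 0.0, (1, 0): 1.0, (2, 0): 2.0}
        print(surface_distance([(0, 0), (1, 0), (2, 0)], heights))   # 2 * sqrt(2) ~ 2.83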

  16. An Overview of the GEANT4 Toolkit

    International Nuclear Information System (INIS)

    Apostolakis, John; CERN; Wright, Dennis H.

    2007-01-01

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice of physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, and visualize and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.
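
    The kind of transport simulation the toolkit performs can be caricatured by a one-dimensional toy model that samples exponential free paths and terminates histories on absorption; this is purely illustrative, with made-up coefficients, and bears no relation to Geant4's actual physics models or code:

        import random

        random.seed(0)

        def fraction_transmitted(thickness_cm, mu_total=0.2, scatter_prob=0.6, n_photons=100000):
            """Toy 1-D slab: exponential free paths; scattering keeps the photon moving forward."""
            transmitted = 0
            for _ in range(n_photons):
                depth = 0.0
                while True:
                    depth += random.expovariate(mu_total)      # distance to the next interaction (cm)
                    if depth >= thickness_cm:
                        transmitted += 1                        # photon escaped the slab
                        break
                    if random.random() > scatter_prob:
                        break                                   # absorbed; history ends
            return transmitted / n_photons

        # With forward-only scattering this tends to exp(-(1 - scatter_prob) * mu_total * thickness), about 0.67 here.
        print(fraction_transmitted(5.0))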

  17. Microgrid Design Toolkit (MDT) Technical Documentation and Component Summaries

    Energy Technology Data Exchange (ETDEWEB)

    Arguello, Bryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gearhart, Jared Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Katherine A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and the Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation-based reliability model, to assess the performance of these designs. TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.
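
    The TMO/PRM interplay described above is a generate-evaluate-refine loop; the toy genetic-algorithm sketch below (made-up cost and reliability surrogates, not the MDT models) shows the shape of that loop:

        import random

        random.seed(0)

        def reliability_score(design):
            """Stand-in for a performance/reliability simulation of a (pv_kw, batt_kwh, diesel_kw) design."""
            pv, batt, diesel = design
            load_served = min(100.0, 0.4 * pv + 0.3 * batt + 0.5 * diesel)   # toy islanded metric
            cost = 1.0 * pv + 0.6 * batt + 1.4 * diesel                      # toy capital cost
            return load_served - 0.2 * cost                                  # single fitness value

        def random_design():
            return (random.uniform(0, 200), random.uniform(0, 300), random.uniform(0, 100))

        population = [random_design() for _ in range(30)]
        for generation in range(50):
            population.sort(key=reliability_score, reverse=True)
            parents = population[:10]                                        # keep the fittest designs
            children = [tuple(max(0.0, g + random.gauss(0, 5)) for g in random.choice(parents))
                        for _ in range(20)]                                  # mutate copies of parents
            population = parents + children

        best = max(population, key=reliability_score)
        print([round(g, 1) for g in best], round(reliability_score(best), 1))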

  18. NOAA Weather and Climate Toolkit (WCT)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Weather and Climate Toolkit is an application that provides simple visualization and data export of weather and climatological data archived at NCDC. The...

  19. A User Interface Toolkit for a Small Screen Device.

    OpenAIRE

    UOTILA, ALEKSI

    2000-01-01

    The appearance of different kinds of networked mobile devices and network appliances creates special requirements for user interfaces that are not met by existing widget based user interface creation toolkits. This thesis studies the problem domain of user interface creation toolkits for portable network connected devices. The portable nature of these devices places great restrictions on the user interface capabilities. One main characteristic of the devices is that they have small screens co...

  20. Field tests of a participatory ergonomics toolkit for Total Worker Health.

    Science.gov (United States)

    Nobrega, Suzanne; Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2017-04-01

    Growing interest in Total Worker Health® (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two-committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and teamwork skills of participants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Field tests of a participatory ergonomics toolkit for Total Worker Health

    Science.gov (United States)

    Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2018-01-01

    Growing interest in Total Worker Health® (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and team-work skills of participants. PMID:28166897

  2. Quantum radar

    CERN Document Server

    Lanzagorta, Marco

    2011-01-01

    This book offers a concise review of quantum radar theory. Our approach is pedagogical, placing emphasis on the physics behind the operation of a hypothetical quantum radar. We concentrate our discussion on the two major models proposed to date: interferometric quantum radar and quantum illumination. In addition, this book offers some new results, including an analytical study of quantum interferometry in the X-band radar region with a variety of atmospheric conditions, a derivation of a quantum radar equation, and a discussion of quantum radar jamming. This book assumes the reader is familiar w

  3. Newnes electronics toolkit

    CERN Document Server

    Phillips, Geoff

    2013-01-01

    Newnes Electronics Toolkit brings together fundamental facts, concepts, and applications of electronic components and circuits, and presents them in a clear, concise, and unambiguous format, to provide a reference book for engineers. The book contains 10 chapters that discuss the following concepts: resistors, capacitors, inductors, semiconductors, circuit concepts, electromagnetic compatibility, sound, light, heat, and connections. The engineer's job does not end when the circuit diagram is completed; the design for the manufacturing process is just as important if volume production is to be

  4. VariVis: a visualisation toolkit for variation databases

    Directory of Open Access Journals (Sweden)

    Smith Timothy D

    2008-04-01

    Full Text Available Abstract Background With the completion of the Human Genome Project and recent advancements in mutation detection technologies, the volume of data available on genetic variations has risen considerably. These data are stored in online variation databases and provide important clues to the cause of diseases and potential side effects or resistance to drugs. However, the data presentation techniques employed by most of these databases make them difficult to use and understand. Results Here we present a visualisation toolkit that can be employed by online variation databases to generate graphical models of gene sequence with corresponding variations and their consequences. The VariVis software package can run on any web server capable of executing Perl CGI scripts and can interface with numerous Database Management Systems and "flat-file" data files. VariVis produces two easily understandable graphical depictions of any gene sequence and matches these with variant data. While developed with the goal of improving the utility of human variation databases, the VariVis package can be used in any variation database to enhance utilisation of, and access to, critical information.

  5. A Qualitative Evaluation of Web-Based Cancer Care Quality Improvement Toolkit Use in the Veterans Health Administration.

    Science.gov (United States)

    Bowman, Candice; Luck, Jeff; Gale, Randall C; Smith, Nina; York, Laura S; Asch, Steven

    2015-01-01

    Disease severity, complexity, and patient burden highlight cancer care as a target for quality improvement (QI) interventions. The Veterans Health Administration (VHA) implemented a series of disease-specific online cancer care QI toolkits. To describe characteristics of the toolkits, target users, and VHA cancer care facilities that influenced toolkit access and use and assess whether such resources were beneficial for users. Deductive content analysis of detailed notes from 94 telephone interviews with individuals from 48 VHA facilities. We evaluated toolkit access and use across cancer types, participation in learning collaboratives, and affiliation with VHA cancer care facilities. The presence of champions was identified as a strong facilitator of toolkit use, and learning collaboratives were important for spreading information about toolkit availability. Identified barriers included lack of personnel and financial resources and complicated approval processes to support tool use. Online cancer care toolkits are well received across cancer specialties and provider types. Clinicians, administrators, and QI staff may benefit from the availability of toolkits as they become more reliant on rapid access to strategies that support comprehensive delivery of evidence-based care. Toolkits should be considered as a complement to other QI approaches.

  6. Remote access to mathematical software

    International Nuclear Information System (INIS)

    Dolan, E.; Hovland, P.; More, J.; Norris, B.; Smith, B.

    2001-01-01

    The network-oriented application services paradigm is becoming increasingly common for scientific computing. The popularity of this approach can be attributed to the numerous advantages to both user and developer provided by network-enabled mathematical software. The burden of installing and maintaining complex systems is lifted from the user, while enabling developers to provide frequent updates without disrupting service. Access to software with similar functionality can be unified under the same interface. Remote servers can utilize potentially more powerful computing resources than may be available locally. We discuss some of the application services developed by the Mathematics and Computer Science Division at Argonne National Laboratory, including the Network Enabled Optimization System (NEOS) Server and the Automatic Differentiation of C (ADIC) Server, as well as preliminary work on Web access to the Portable Extensible Toolkit for Scientific Computing (PETSc). We also provide a brief survey of related work.

  7. Overview and Meteorological Validation of the Wind Integration National Dataset toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, C. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, B. M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McCaa, J. [3TIER by Vaisala, Seattle, WA (United States)

    2015-04-13

    The Wind Integration National Dataset (WIND) Toolkit described in this report fulfills these requirements, and constitutes a state-of-the-art national wind resource data set covering the contiguous United States from 2007 to 2013 for use in a variety of next-generation wind integration analyses and wind power planning. The toolkit is a wind resource data set, wind forecast data set, and wind power production and forecast data set derived from the Weather Research and Forecasting (WRF) numerical weather prediction model. WIND Toolkit data are available online for over 116,000 land-based and 10,000 offshore sites representing existing and potential wind facilities.

  8. Quick Way to Port Existing C/C++ Chemoinformatics Toolkits to the Web Using Emscripten.

    Science.gov (United States)

    Jiang, Chen; Jin, Xi

    2017-10-23

    Emscripten is a special open source compiler that compiles C and C++ code into JavaScript. By utilizing this compiler, some typical C/C++ chemoinformatics toolkits and libraries are quickly ported to the web. The compiled JavaScript files have sizes similar to native programs, and from a series of constructed benchmarks, the performance of the compiled JavaScript code is also close to that of the native code and better than that of handwritten JavaScript code. Therefore, we believe that Emscripten is a feasible and practical tool for reusing existing C/C++ code on the web, and many other chemoinformatics or molecular calculation software tools can also be easily ported by Emscripten.

  9. Progress on Ultra-Wideband (UWB) Multi-Antenna radar imaging for MIGA

    Directory of Open Access Journals (Sweden)

    Yedlin Matthew

    2016-01-01

    Full Text Available Progress on the development of the multi-channel, ground-penetrating radar imaging system is presented from hardware and software perspectives. A new exponentially tapered slot antenna, with an operating bandwidth from 100 MHz to 1.5 GHz, was fabricated and tested using the eight-port vector network analyzer designed by Rohde and Schwarz for this imaging project. An eight-element antenna array, mounted on two carts with automatic motor drive, was designed for optimal common midpoint (CMP) data acquisition. Data acquisition scenarios were tested using the acoustic version of the NORSAR2D seismic ray-tracing software. This package enables the synthesis and analysis of multi-channel, multi-offset data acquisitions comprising more than a hundred thousand traces. Preliminary processing is in good agreement with published bistatic ground-penetrating radar images obtained in the tunnels of the Low-noise Underground Laboratory (LSBB) at Rustrel, France.

  10. IChem: A Versatile Toolkit for Detecting, Comparing, and Predicting Protein-Ligand Interactions.

    Science.gov (United States)

    Da Silva, Franck; Desaphy, Jeremy; Rognan, Didier

    2018-03-20

    Structure-based ligand design requires an exact description of the topology of molecular entities under scrutiny. IChem is a software package that reflects the many contributions of our research group in this area over the last decade. It facilitates and automates many tasks (e.g., ligand/cofactor atom typing, identification of key water molecules) usually left to the modeler's choice. It therefore permits the detection of molecular interactions between two molecules in a very precise and flexible manner. Moreover, IChem enables the conversion of intricate three-dimensional (3D) molecular objects into simple representations (fingerprints, graphs) that facilitate knowledge acquisition at very high throughput. The toolkit is an ideal companion for setting up and performing many structure-based design computations. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  11. The Populist Toolkit

    OpenAIRE

    Ylä-Anttila, Tuukka Salu Santeri

    2017-01-01

    Populism has often been understood as a description of political parties and politicians, who have been labelled either populist or not. This dissertation argues that it is more useful to conceive of populism in action: as something that is done rather than something that is. I propose that the populist toolkit is a collection of cultural practices, which politicians and citizens use to make sense of and do politics, by claiming that ‘the people’ are opposed by a corrupt elite – a powerful cl...

  12. Detection and localization of multiple short range targets using FMCW radar signal

    KAUST Repository

    Jardak, Seifallah; Kiuru, Tero; Metso, Mikko; Pursula, Pekka; Hakli, Janne; Hirvonen, Mervi; Ahmed, Sajid; Alouini, Mohamed-Slim

    2016-01-01

    In this paper, a 24 GHz frequency-modulated continuous wave radar is used to detect and localize both stationary and moving targets. Depending on the application, the implemented software offers different modes of operation. For example, it can
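
    For a stationary target, FMCW range estimation reduces to measuring the beat frequency f_b and applying R = c * f_b * T / (2 * B); the following generic sketch (assumed sweep parameters, not the software described in the record) simulates one sweep and recovers the range from the dominant FFT bin:

        import numpy as np

        c, B, T, fs = 3e8, 250e6, 1e-3, 2e6          # sweep bandwidth, chirp duration, ADC rate (assumed)
        target_range = 12.0                          # metres

        f_beat = 2 * target_range * B / (c * T)      # beat frequency of a stationary target
        n = int(T * fs)                              # samples per sweep
        t = np.arange(n) / fs
        beat = np.cos(2 * np.pi * f_beat * t) + 0.05 * np.random.randn(n)

        spectrum = np.abs(np.fft.rfft(beat))
        f_hat = np.fft.rfftfreq(n, 1 / fs)[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
        print(c * T * f_hat / (2 * B))               # range estimate, close to 12 m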

  13. A toolkit for promoting healthy ageing

    NARCIS (Netherlands)

    Jeroen Knevel; Aly Gruppen

    2016-01-01

    This toolkit therefore focusses on self-management abilities. That means finding and maintaining effective, positive coping methods in relation to our health. We included many common and frequently discussed topics such as drinking, eating, physical exercise, believing in the future, resilience,

  14. QALMA: A computational toolkit for the analysis of quality protocols for medical linear accelerators in radiation therapy

    Science.gov (United States)

    Rahman, Md Mushfiqur; Lei, Yu; Kalantzis, Georgios

    2018-01-01

    Quality Assurance (QA) for medical linear accelerators (linacs) is one of the primary concerns in external beam radiation therapy. Continued advancements in clinical accelerators and computer control technology make the QA procedures more complex and time consuming, often requiring adequate software together with specific phantoms. To ameliorate that matter, we introduce QALMA (Quality Assurance for Linac with MATLAB), a MATLAB toolkit which aims to simplify the quantitative analysis of linac QA, including Star-Shot analysis, the Picket Fence test, the Winston-Lutz test, Multileaf Collimator (MLC) log file analysis and verification of the light and radiation field coincidence test.
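
    As an illustration of the sort of quantity these tests extract (a toy Winston-Lutz-style check on a synthetic portal image, not QALMA's MATLAB code), the offset between the radiation field centre and the ball centre can be estimated from two centroids:

        import numpy as np
        from scipy import ndimage

        img = np.zeros((200, 200))
        img[80:120, 80:120] = 1.0                                  # toy radiation field
        yy, xx = np.mgrid[:200, :200]
        ball = (yy - 100) ** 2 + (xx - 103) ** 2 <= 8 ** 2         # ball deliberately offset by ~3.5 px
        img[ball] *= 0.2                                           # ball attenuates the beam

        field_c = ndimage.center_of_mass(img > 0.05)               # centroid of the irradiated area
        ball_c = ndimage.center_of_mass(ball)                      # centroid of the ball shadow
        print(round(float(np.hypot(field_c[0] - ball_c[0], field_c[1] - ball_c[1])), 2))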

  15. User's manual for the two-dimensional transputer graphics toolkit

    Science.gov (United States)

    Ellis, Graham K.

    1988-01-01

    The user manual for the 2-D graphics toolkit for a transputer based parallel processor is presented. The toolkit consists of a package of 2-D display routines that can be used for the simulation visualizations. It supports multiple windows, double buffered screens for animations, and simple graphics transformations such as translation, rotation, and scaling. The display routines are written in occam to take advantage of the multiprocessing features available on transputers. The package is designed to run on a transputer separate from the graphics board.
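
    The translation, rotation, and scaling operations supported by the toolkit compose naturally as 3 x 3 homogeneous matrices; the snippet below illustrates the idea in NumPy (the toolkit itself is written in occam for transputers):

        import numpy as np

        def translate(tx, ty):
            return np.array([[1, 0, tx], [0, 1, ty], [0, 0, 1]], dtype=float)

        def rotate(theta_deg):
            c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
            return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

        def scale(sx, sy):
            return np.diag([sx, sy, 1.0])

        # Compose right to left: scale, then rotate, then translate the point (1, 0).
        M = translate(5, 2) @ rotate(90) @ scale(2, 2)
        print(M @ np.array([1.0, 0.0, 1.0]))   # approximately [5, 4, 1]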

  16. A toolkit for detecting technical surprise.

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.
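
    A Self-Organizing Map of the kind mentioned above can be trained with a very small amount of code; the sketch below is a generic SOM loop on toy 2-D data, not the ThreatView extension itself:

        import numpy as np

        rng = np.random.default_rng(0)
        data = rng.random((500, 2))                        # toy 2-D feature vectors
        grid = np.array([(i, j) for i in range(8) for j in range(8)], dtype=float)
        weights = rng.random((64, 2))                      # one prototype per map node

        for epoch in range(2000):
            lr = 0.5 * (1 - epoch / 2000)                  # decaying learning rate
            radius = 3.0 * (1 - epoch / 2000) + 0.5        # decaying neighbourhood radius
            x = data[rng.integers(len(data))]
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))      # best-matching unit
            d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)             # squared grid distance to the BMU
            h = np.exp(-d2 / (2 * radius ** 2))                    # Gaussian neighbourhood function
            weights += lr * h[:, None] * (x - weights)             # pull neighbours toward the sample

        print(weights[:3])                                 # trained prototypes span the input space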

  17. Design and Implementation of Radar Cross-Section Models on a Virtex-6 FPGA

    Directory of Open Access Journals (Sweden)

    B. U. V. Prashanth

    2014-01-01

    Full Text Available The simulation of radar cross-section (RCS) models in FPGA is illustrated. The models adopted are the Swerling ones. Radar cross-section (RCS), also termed echo area, gives the amount of power scattered from a target towards the radar. This paper elucidates the simulation of RCS to represent the specified targets under different conditions, namely, aspect angle and frequency. This model is used for the performance evaluation of radar. RCS models have been developed for various targets, from simple objects to complex objects like aircraft, missiles, tanks, and so forth. First, the model was developed in the MATLAB real-time simulation environment and, after successful verification, the same was implemented in FPGA. Xilinx ISE software was used for VHDL coding. This simulation model was used for the testing of a radar system. The results were compared with MATLAB simulations and FPGA-based timing diagrams and RTL synthesis. The paper illustrates the simulation of various target radar cross-section (RCS) models. These models are simulated in MATLAB and in FPGA, with the aim of implementing them efficiently on a radar system. This method can be generalized to apply to objects of arbitrary geometry for the two configurations of transmitter and receiver in the same as well as different locations.
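
    The Swerling cases differ only in the statistics assumed for the fluctuating RCS: cases 1 and 2 use an exponential distribution, cases 3 and 4 a chi-squared distribution with four degrees of freedom, and case 0/5 is non-fluctuating. A generic sampling sketch (not the MATLAB or VHDL models of the paper) is:

        import numpy as np

        rng = np.random.default_rng(1)

        def swerling_rcs(mean_rcs, case, n):
            """Draw n RCS samples (m^2) for the classical Swerling fluctuation statistics."""
            if case in (1, 2):                       # exponential: many comparable scatterers
                return rng.exponential(mean_rcs, n)
            if case in (3, 4):                       # chi-squared with 4 degrees of freedom
                return mean_rcs / 4.0 * rng.chisquare(4, n)
            if case == 0:                            # Swerling 0/5: non-fluctuating target
                return np.full(n, mean_rcs)
            raise ValueError("unknown Swerling case")

        samples = swerling_rcs(10.0, 1, 100_000)
        print(samples.mean())                        # close to the 10 m^2 mean for every case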

  18. Radar Fundamentals, Presentation

    OpenAIRE

    Jenn, David

    2008-01-01

    Topics include: introduction, radar functions, antennas basics, radar range equation, system parameters, electromagnetic waves, scattering mechanisms, radar cross section and stealth, and sample radar systems.

  19. Software for virtual accelerator designing

    International Nuclear Information System (INIS)

    Kulabukhova, N.; Ivanov, A.; Korkhov, V.; Lazarev, A.

    2012-01-01

    The article discusses appropriate technologies for software implementation of the Virtual Accelerator. The Virtual Accelerator is considered as a set of services and tools enabling transparent execution of computational software for modeling beam dynamics in accelerators on distributed computing resources. Distributed storage and information processing facilities utilized by the Virtual Accelerator make use of the Service-Oriented Architecture (SOA) according to a cloud computing paradigm. Control system tool-kits (such as EPICS, TANGO), computing modules (including high-performance computing), realization of the GUI with existing frameworks and visualization of the data are discussed in the paper. The presented research consists of software analysis for realization of interaction between all levels of the Virtual Accelerator and some samples of middle-ware implementation. A set of the servers and clusters at St.-Petersburg State University form the infrastructure of the computing environment for Virtual Accelerator design. Usage of component-oriented technology for realization of Virtual Accelerator levels interaction is proposed. The article concludes with an overview and substantiation of a choice of technologies that will be used for design and implementation of the Virtual Accelerator. (authors)

  20. Weather Radar Stations

    Data.gov (United States)

    Department of Homeland Security — These data represent Next-Generation Radar (NEXRAD) and Terminal Doppler Weather Radar (TDWR) weather radar stations within the US. The NEXRAD radar stations are...

  1. PDB@: an offline toolkit for exploration and analysis of PDB files.

    Science.gov (United States)

    Mani, Udayakumar; Ravisankar, Sadhana; Ramakrishnan, Sai Mukund

    2013-12-01

    Protein Data Bank (PDB) is a freely accessible archive of the 3-D structural data of biological molecules. Structure-based studies offer a unique vantage point for inferring the properties of a protein molecule from structural data. This is too big a task to be done manually. Moreover, there is no single tool, software or server that comprehensively analyses all structure-based properties. The objective of the present work is to develop an offline computational toolkit, PDB@, containing built-in algorithms that help categorize the structural properties of a protein molecule. The user has the facility to view and edit the PDB file as needed. Some features of the present work are unique in themselves and others are an improvement over existing tools. Also, the representation of protein properties in both graphical and textual formats helps in predicting all the necessary details of a protein molecule on a single platform.
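
    PDB files are fixed-column text, so the first step of any such toolkit is pulling coordinates out of ATOM/HETATM records; the minimal reader below illustrates that step only (hypothetical file name, not PDB@'s code):

        def read_atoms(pdb_path):
            """Yield (atom name, residue name, chain, x, y, z) from ATOM/HETATM records."""
            with open(pdb_path) as handle:
                for line in handle:
                    if line.startswith(("ATOM  ", "HETATM")):
                        yield (line[12:16].strip(), line[17:20].strip(), line[21],
                               float(line[30:38]), float(line[38:46]), float(line[46:54]))

        # Hypothetical usage: count C-alpha atoms per chain as a first structural summary.
        # from collections import Counter
        # print(Counter(chain for name, res, chain, *_ in read_atoms("example.pdb") if name == "CA"))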

  2. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    Energy Technology Data Exchange (ETDEWEB)

    de Raad, Markus [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); de Rond, Tristan [Univ. of California, Berkeley, CA (United States); Rübel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Keasling, Jay D. [Univ. of California, Berkeley, CA (United States); Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Technical Univ. of Denmark, Lyngby (Denmark); Northen, Trent R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Bowen, Benjamin P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States)

    2017-05-03

    Mass spectrometry imaging (MSI) has primarily been applied in localizing biomolecules within biological matrices. Although well-suited, the application of MSI for comparing thousands of spatially defined spotted samples has been limited. One reason for this is a lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. In this paper, we present the OpenMSI Arrayed Analysis Toolkit (OMAAT), a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated for screening metabolic activities of different sized soil particles, including hydrolysis of sugars, revealing a pattern of size-dependent activities. These results establish OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat.

  3. ASSIMILATION OF DOPPLER RADAR DATA INTO NUMERICAL WEATHER MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Chiswell, S.; Buckley, R.

    2009-01-15

    During the year 2008, the United States National Weather Service (NWS) completed an eightfold increase in sampling capability for weather radars to 250 m resolution. This increase is expected to improve warning lead times by detecting small-scale features sooner with increased reliability; however, current NWS operational model domains utilize grid spacing an order of magnitude larger than the radar data resolution, and therefore the added resolution of radar data is not fully exploited. The assimilation of radar reflectivity and velocity data into high-resolution numerical weather model forecasts where grid spacing is comparable to the radar data resolution was investigated under a Laboratory Directed Research and Development (LDRD) 'quick hit' grant to determine the impact of improved data resolution on model predictions with specific initial proof of concept application to daily Savannah River Site operations and emergency response. Development of software to process NWS radar reflectivity and radial velocity data was undertaken for assimilation of observations into numerical models. Data values within the radar data volume undergo automated quality control (QC) analysis routines developed in support of this project to eliminate empty/missing data points, decrease anomalous propagation values, and determine error thresholds by utilizing the calculated variances among data values. The Weather Research and Forecasting model (WRF) three dimensional variational data assimilation package (WRF-3DVAR) was used to incorporate the QC'ed radar data into input and boundary conditions. The lack of observational data in the vicinity of SRS available to NWS operational models signifies an important data void where radar observations can provide significant input. These observations greatly enhance the knowledge of storm structures and the environmental conditions which influence their development. As the increase in computational power and availability has
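
    A variance-based gate rejection step of the kind described can be sketched as follows (made-up thresholds and a toy reflectivity field; not the SRS processing software):

        import numpy as np

        def qc_reflectivity(field, missing=-999.0, window=3, max_std=10.0):
            """Mask missing gates and gates whose local standard deviation exceeds a threshold (dBZ)."""
            data = np.where(field == missing, np.nan, field.astype(float))
            half = window // 2
            padded = np.pad(data, half, mode="edge")
            cleaned = data.copy()
            for i in range(data.shape[0]):
                for j in range(data.shape[1]):
                    block = padded[i:i + window, j:j + window]
                    if np.isnan(data[i, j]) or np.nanstd(block) > max_std:
                        cleaned[i, j] = np.nan       # reject: missing, or too noisy (clutter/AP-like)
            return cleaned

        demo = np.full((5, 5), 20.0)
        demo[2, 2] = 65.0             # isolated spike, the signature of anomalous propagation
        demo[0, 0] = -999.0           # missing gate
        print(np.isnan(qc_reflectivity(demo)).sum())   # 10: the spike's 3x3 neighbourhood plus the gap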

  4. Texas Team: Academic Progression and IOM Toolkit.

    Science.gov (United States)

    Reid, Helen; Tart, Kathryn; Tietze, Mari; Joseph, Nitha Mathew; Easley, Carson

    The Institute of Medicine (IOM) Future of Nursing report identified eight recommendations for nursing to improve health care for all Americans. The Texas Team for Advancing Health Through Nursing embraced the challenge of implementing the recommendations through two diverse projects. One group conducted a broad, online survey of leadership, practice, and academia, focusing on the IOM recommendations. The other focused specifically on academic progression through the use of CABNET (Consortium for Advancing Baccalaureate Nursing Education in Texas) articulation agreements. The survey revealed a lack of knowledge and understanding of the IOM recommendations, prompting development of an online IOM toolkit. The articulation agreements provide a clear pathway for students to the RN-to-BSN degree. The toolkit and articulation agreements provide rich resources for implementation of the IOM recommendations.

  5. Wind Integration National Dataset (WIND) Toolkit; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, Caroline; Hodge, Bri-Mathias

    2015-07-14

    A webinar about the Wind Integration National Dataset (WIND) Toolkit was presented by Bri-Mathias Hodge and Caroline Draxl on July 14, 2015. It was hosted by the Southern Alliance for Clean Energy. The toolkit is a grid integration data set that contains meteorological and power data at a 5-minute resolution across the continental United States for 7 years and hourly power forecasts.

  6. Improving Weather Radar Precipitation Estimates by Combining two Types of Radars

    DEFF Research Database (Denmark)

    Nielsen, Jesper Ellerbæk; Thorndahl, Søren Liedtke; Rasmussen, Michael R.

    2014-01-01

    This paper presents a demonstration of how Local Area Weather Radar (LAWR) X-band measurements can be combined with meteorological C-band measurements into a single radar product. For this purpose, a blending method has been developed which combines the strengths of the two radar systems. Combining the two radar types achieves a radar product with both long range and high temporal resolution. It is validated that the blended radar product performs better than the individual radars based on ground observations from laser disdrometers. However, the data combination is challenged by lower performance of the LAWR. Although both radars benefit from the data combination, it is also found that advection-based temporal interpolation is a more favourable method for increasing the temporal resolution of meteorological C-band measurements.
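
    One simple way to picture such a blend is a distance-dependent weight that trusts the short-range X-band radar close in and the C-band radar farther out; the sketch below uses an assumed logistic weight and toy rain-rate profiles, and is not the published blending method:

        import numpy as np

        def blend(lawr_field, cband_field, distance_km, crossover_km=15.0, width_km=5.0):
            """Distance-weighted blend: trust the X-band LAWR close in, the C-band radar far out."""
            w = 1.0 / (1.0 + np.exp((distance_km - crossover_km) / width_km))   # logistic weight
            return w * lawr_field + (1.0 - w) * cband_field

        rng_km = np.linspace(0, 60, 7)                                   # range from the LAWR
        lawr = np.array([5.0, 5.2, 4.8, 4.0, np.nan, np.nan, np.nan])    # LAWR is range-limited
        cband = np.array([4.5, 4.6, 4.7, 4.2, 3.9, 3.5, 3.0])            # rain rates in mm/h
        merged = np.where(np.isnan(lawr), cband, blend(lawr, cband, rng_km))
        print(np.round(merged, 2))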

  7. Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit.

    Science.gov (United States)

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Jenkins, Daniel P

    2015-01-01

    Cognitive work analysis has been applied in the design of numerous sociotechnical systems. The process used to translate analysis outputs into design concepts, however, is not always clear. Moreover, structured processes for translating the outputs of ergonomics methods into concrete designs are lacking. This paper introduces the Cognitive Work Analysis Design Toolkit (CWA-DT), a design approach which has been developed specifically to provide a structured means of incorporating cognitive work analysis outputs in design using design principles and values derived from sociotechnical systems theory. This paper outlines the CWA-DT and describes its application in a public transport ticketing design case study. Qualitative and quantitative evaluations of the process provide promising early evidence that the toolkit fulfils the evaluation criteria identified for its success, with opportunities for improvement also highlighted. The Cognitive Work Analysis Design Toolkit has been developed to provide ergonomics practitioners with a structured approach for translating the outputs of cognitive work analysis into design solutions. This paper demonstrates an application of the toolkit and provides evaluation findings.

  8. FY17Q4 Ristra project: Release Version 1.0 of a production toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Daniel, David John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-21

    The Next Generation Code project will release Version 1.0 of a production toolkit for multi-physics application development on advanced architectures. Features of this toolkit will include remap and link utilities, control and state manager, setup, visualization and I/O, as well as support for a variety of mesh and particle data representations. Numerical physics packages that operate atop this foundational toolkit will be employed in a multi-physics demonstration problem and released to the community along with results from the demonstration.

  9. Development of a Multimedia Toolkit for Engineering Graphics Education

    Directory of Open Access Journals (Sweden)

    Moudar Zgoul

    2009-09-01

    Full Text Available This paper focuses upon the development of a multimedia toolkit to support the teaching of an Engineering Graphics course. The project used different elements for the toolkit: animations, videos and presentations, which were then integrated in a dedicated internet website. The purpose of using these elements is to assist the students in building and practicing the needed engineering skills at their own pace as part of an e-Learning solution. Furthermore, this kit allows students to repeat and view the processes and techniques of graphical construction and visualization as much as needed, allowing them to follow and practice on their own.

  10. AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments

    Science.gov (United States)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  11. AutoMicromanager: a microscopy scripting toolkit for LABVIEW and other programming environments.

    Science.gov (United States)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  12. Open cyberGIS software for geospatial research and education in the big data era

    Science.gov (United States)

    Wang, Shaowen; Liu, Yan; Padmanabhan, Anand

    CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as new-generation GIS that enable unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies (open access, source, and integration) to serve various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and achieved broad and significant impacts.

  13. Open cyberGIS software for geospatial research and education in the big data era

    Directory of Open Access Journals (Sweden)

    Shaowen Wang

    2016-01-01

    Full Text Available CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS, spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as new-generation GIS that enable unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies–open access, source, and integration–to serve various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and achieved broad and significant impacts.

  14. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  15. VIDE: The Void IDentification and Examination toolkit

    Science.gov (United States)

    Sutter, P. M.; Lavaux, G.; Hamaus, N.; Pisani, A.; Wandelt, B. D.; Warren, M.; Villaescusa-Navarro, F.; Zivick, P.; Mao, Q.; Thompson, B. B.

    2015-03-01

    We present VIDE, the Void IDentification and Examination toolkit, an open-source Python/C++ code for finding cosmic voids in galaxy redshift surveys and N-body simulations, characterizing their properties, and providing a platform for more detailed analysis. At its core, VIDE uses a substantially enhanced version of ZOBOV (Neyrinck 2008) to calculate a Voronoi tessellation for estimating the density field and performing a watershed transform to construct voids. Additionally, VIDE provides significant functionality for both pre- and post-processing: for example, VIDE can work with volume- or magnitude-limited galaxy samples with arbitrary survey geometries, or dark matter particles or halo catalogs in a variety of common formats. It can also randomly subsample inputs and includes a Halo Occupation Distribution model for constructing mock galaxy populations. VIDE uses the watershed levels to place voids in a hierarchical tree, outputs a summary of void properties in plain ASCII, and provides a Python API to perform many analysis tasks, such as loading and manipulating void catalogs and particle members, filtering, plotting, computing clustering statistics, stacking, comparing catalogs, and fitting density profiles. While centered around ZOBOV, the toolkit is designed to be as modular as possible and accommodate other void finders. VIDE has been in development for several years and has already been used to produce a wealth of results, which we summarize in this work to highlight the capabilities of the toolkit. VIDE is publicly available at http://bitbucket.org/cosmicvoids/vide_public and http://www.cosmicvoids.net.
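
    The ZOBOV step can be illustrated in two dimensions: each tracer's density is estimated as the inverse of its Voronoi cell volume, and voids grow from local density minima. The sketch below uses SciPy on toy points and is only a cartoon of that first step, not VIDE/ZOBOV itself:

        import numpy as np
        from scipy.spatial import ConvexHull, Voronoi

        rng = np.random.default_rng(2)
        points = rng.random((400, 2))                    # toy 2-D tracer positions in a unit box
        vor = Voronoi(points)

        densities = np.full(len(points), np.nan)
        for i, region_index in enumerate(vor.point_region):
            region = vor.regions[region_index]
            if -1 in region or len(region) == 0:         # skip unbounded cells on the box edge
                continue
            cell_area = ConvexHull(vor.vertices[region]).volume   # 'volume' is area in 2-D
            densities[i] = 1.0 / cell_area               # ZOBOV-style inverse-cell-volume density

        print(int(np.nanargmin(densities)))              # the emptiest cell: a void candidate seed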

  16. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    International Nuclear Information System (INIS)

    Wei, J; Yuan, A; Li, G

    2014-01-01

    Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload from the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully-automated 3D lung segmentation algorithm was designed and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, lung and its air component, and local volume changes at the diaphragm and chest wall to characterize breathing pattern. Segmented lung volumes in 12 patients are compared with those from a treatment planning system (TPS). Voxel conversion was introduced from CT# to other physical parameters, such as gravity-induced pressure, to create a secondary 4D image. A demon algorithm was applied in deformable image registration and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as Naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to and will be integrated with the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this and the TPS is <±2% and the time saving is by 1–2 orders of magnitude. Conclusion: A framework of 4D toolkit has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational
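
    The voxel-counting step described above amounts to multiplying the number of segmented voxels by the voxel volume for each breathing phase; a generic NumPy equivalent (assumed voxel spacing, synthetic masks; the toolkit itself is in MATLAB) is:

        import numpy as np

        def lung_volume_litres(mask, spacing_mm=(0.98, 0.98, 2.5)):
            """Volume of a binary segmentation: voxel count times voxel volume."""
            voxel_mm3 = float(np.prod(spacing_mm))
            return mask.sum() * voxel_mm3 / 1e6          # mm^3 -> litres

        # Toy 4D case: ten breathing phases of a synthetic 'lung' whose extent oscillates.
        phases = [np.ones((60 + 2 * k, 100, 100), dtype=bool) for k in range(10)]
        print([round(lung_volume_litres(m), 2) for m in phases])   # phase-resolved volume curve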

  17. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    Energy Technology Data Exchange (ETDEWEB)

    Wei, J [City College of New York, New York, NY (United States); Yuan, A; Li, G [Memorial Sloan Kettering Cancer Center, New York, NY (United States)

    2014-06-15

    Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload from the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully-automated 3D lung segmentation algorithm was designed and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, lung and its air component, and local volume changes at the diaphragm and chest wall to characterize breathing pattern. Segmented lung volumes in 12 patients are compared with those from a treatment planning system (TPS). Voxel conversion was introduced from CT# to other physical parameters, such as gravity-induced pressure, to create a secondary 4D image. A demon algorithm was applied in deformable image registration and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as Naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to and will be integrated with the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this and the TPS is <±2% and the time saving is by 1–2 orders of magnitude. Conclusion: A framework of 4D toolkit has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational

  18. The Customer Flow Toolkit: A Framework for Designing High Quality Customer Services.

    Science.gov (United States)

    New York Association of Training and Employment Professionals, Albany.

    This document presents a toolkit to assist staff involved in the design and development of New York's one-stop system. Section 1 describes the preplanning issues to be addressed and the intended outcomes that serve as the framework for creation of the customer flow toolkit. Section 2 outlines the following strategies to assist in designing local…

  19. BAT - The Bayesian Analysis Toolkit

    CERN Document Server

    Caldwell, Allen C; Kröninger, Kevin

    2009-01-01

    We describe the development of a new toolkit for data analysis. The analysis package is based on Bayes' Theorem, and is realized with the use of Markov Chain Monte Carlo. This gives access to the full posterior probability distribution. Parameter estimation, limit setting and uncertainty propagation are implemented in a straightforward manner. A goodness-of-fit criterion is presented which is intuitive and of great practical use.
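
    The Markov Chain Monte Carlo machinery behind the toolkit can be conveyed by a minimal random-walk Metropolis sampler for a single location parameter; this Python sketch shows the idea only and is unrelated to BAT's C++ implementation:

        import math
        import random

        random.seed(3)

        def log_posterior(mu, data, sigma=1.0):
            """Flat prior times a Gaussian likelihood for a single location parameter."""
            return -0.5 * sum((x - mu) ** 2 for x in data) / sigma ** 2

        data = [random.gauss(2.0, 1.0) for _ in range(50)]
        chain, mu = [], 0.0
        for _ in range(20000):
            proposal = mu + random.gauss(0.0, 0.3)                   # random-walk proposal
            if math.log(random.random()) < log_posterior(proposal, data) - log_posterior(mu, data):
                mu = proposal                                        # Metropolis acceptance
            chain.append(mu)

        burned = chain[5000:]                                        # discard burn-in
        print(sum(burned) / len(burned))                             # posterior mean, near 2.0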

  20. Implementing a modeling software for animated protein-complex interactions using a physics simulation library.

    Science.gov (United States)

    Ueno, Yutaka; Ito, Shuntaro; Konagaya, Akihiko

    2014-12-01

    To better understand the behaviors and structural dynamics of proteins within a cell, novel software tools are being developed that can create molecular animations based on the findings of structural biology. This study presents our method, developed from earlier prototypes, for detecting collisions and examining the soft-body dynamics of molecular models. The code was implemented with a software development toolkit for rigid-body dynamics simulation and a three-dimensional graphics library. The essential functions of the target software system included the basic molecular modeling environment, collision detection in the molecular models, and physical simulations of the movement of the model. Taking advantage of recent software technologies such as physics simulation modules and an interpreted scripting language, the functions required for accurate and meaningful molecular animation were implemented efficiently.

  1. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    Directory of Open Access Journals (Sweden)

    Delphine Ribes

    Full Text Available In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.

  2. Integrating Radar Image Data with Google Maps

    Science.gov (United States)

    Chapman, Bruce D.; Gibas, Sarah

    2010-01-01

    A public Web site has been developed as a method for displaying the multitude of radar imagery collected by NASA's Airborne Synthetic Aperture Radar (AIRSAR) instrument during its 16-year mission. Utilizing NASA's internal AIRSAR site, the new Web site features more sophisticated visualization tools that enable the general public to have access to these images. The site was originally maintained at NASA on six computers: one that held the Oracle database, two that took care of the software for the interactive map, and three that were for the Web site itself. Several tasks were involved in moving this complicated setup to just one computer. First, the AIRSAR database was migrated from Oracle to MySQL. Then the back-end of the AIRSAR Web site was updated in order to access the MySQL database. To do this, a few of the scripts needed to be modified; specifically, the three Perl scripts that query the database. The database connections were then updated from Oracle to MySQL, numerous syntax errors were corrected, and a query was implemented that replaced one of the stored Oracle procedures. Lastly, the interactive map was designed, implemented, and tested so that users could easily browse and access the radar imagery through the Google Maps interface.

  3. A Geospatial Decision Support System Toolkit, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and commercialize a working prototype Geospatial Decision Support Toolkit (GeoKit). GeoKit will enable scientists, agencies, and stakeholders to...

  4. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels.

    Directory of Open Access Journals (Sweden)

    Kota Kasahara

    Full Text Available Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML, which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
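
    Because the toolkit's central output is an ion-binding state graph written in standard GML and viewable in Cytoscape, a minimal illustration of producing such a file may help. The sketch below is not IBiSA_tools itself; it is a hypothetical Python/networkx example in which the state names and transition counts are invented.

        import networkx as nx

        # Hypothetical binding states (which sites hold an ion) and transition counts
        transitions = {("S0-S2", "S1-S3"): 42,
                       ("S1-S3", "S2-S4"): 17,
                       ("S2-S4", "S1-S3"): 9}

        G = nx.DiGraph()
        for (src, dst), count in transitions.items():
            G.add_edge(src, dst, weight=count)      # edge weight = observed transitions

        nx.write_gml(G, "ion_binding_states.gml")   # standard GML, loadable in Cytoscape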

  5. BuddySuite: Command-Line Toolkits for Manipulating Sequences, Alignments, and Phylogenetic Trees.

    Science.gov (United States)

    Bond, Stephen R; Keat, Karl E; Barreira, Sofia N; Baxevanis, Andreas D

    2017-06-01

    The ability to manipulate sequence, alignment, and phylogenetic tree files has become an increasingly important skill in the life sciences, whether to generate summary information or to prepare data for further downstream analysis. The command line can be an extremely powerful environment for interacting with these resources, but only if the user has the appropriate general-purpose tools on hand. BuddySuite is a collection of four independent yet interrelated command-line toolkits that facilitate each step in the workflow of sequence discovery, curation, alignment, and phylogenetic reconstruction. Most common sequence, alignment, and tree file formats are automatically detected and parsed, and over 100 tools have been implemented for manipulating these data. The project has been engineered to easily accommodate the addition of new tools, is written in the popular programming language Python, and is hosted on the Python Package Index and GitHub to maximize accessibility. Documentation for each BuddySuite tool, including usage examples, is available at http://tiny.cc/buddysuite_wiki. All software is open source and freely available through http://research.nhgri.nih.gov/software/BuddySuite. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution 2017. This work is written by US Government employees and is in the public domain in the US.

  6. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    Science.gov (United States)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.

  7. Heart Failure: Self-care to Success: Development and evaluation of a program toolkit.

    Science.gov (United States)

    Bryant, Rebecca

    2017-08-17

    The Heart Failure: Self-care to Success toolkit was developed to assist NPs in empowering patients with heart failure (HF) to improve individual self-care behaviors. This article details the evolution of this toolkit for NPs, its effectiveness with patients with HF, and recommendations for future research and dissemination strategies.

  8. Hardware And Software Architectures For Reconfigurable Time-Critical Control Tasks

    Directory of Open Access Journals (Sweden)

    Adam Piłat

    2007-01-01

    Full Text Available The most popular configuration of the controlled laboratory test-rigs is the personal computer (PC) equipped with the I/O board. The dedicated software components allow a wide range of user-defined tasks to be conducted. The typical configuration functionality can be customized by PC hardware components and their programmable reconfiguration. The next step in the automatic control system design is the embedded solution. Usually, the design process of the embedded control system is supported by high-level software. The dedicated programming tools support the multitasking property of the microcontroller by selection of different sampling frequencies of algorithm blocks. In this case the multi-layer and multitasking control strategy can be realized on the chip. The proposed solutions implement the rapid prototyping approach. The available toolkits and device drivers integrate the system-level design environment and the real-time application software, transferring the functionality of MATLAB/Simulink programs to PC or microcontroller application environments.

  9. Marine Debris and Plastic Source Reduction Toolkit

    Science.gov (United States)

    Many plastic food service ware items originate on college and university campuses—in cafeterias, snack rooms, cafés, and eateries with take-out dining options. This Campus Toolkit is a detailed “how to” guide for reducing plastic waste on college campuses.

  10. Simulation of a weather radar display for over-water airborne radar approaches

    Science.gov (United States)

    Clary, G. R.

    1983-01-01

    Airborne radar approach (ARA) concepts are being investigated as a part of NASA's Rotorcraft All-Weather Operations Research Program on advanced guidance and navigation methods. This research is being conducted using both piloted simulations and flight test evaluations. For the piloted simulations, a mathematical model of the airborne radar was developed for over-water ARAs to offshore platforms. This simulated flight scenario requires radar simulation of point targets, such as oil rigs and ships, distributed sea clutter, and transponder beacon replies. Radar theory, weather radar characteristics, and empirical data derived from in-flight radar photographs are combined to model a civil weather/mapping radar typical of those used in offshore rotorcraft operations. The resulting radar simulation is realistic and provides the needed simulation capability for ongoing ARA research.
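
    As a rough illustration of what simulating a point target such as an oil rig involves, the sketch below evaluates the standard monostatic radar range equation, Pr = Pt G^2 λ^2 σ / ((4π)^3 R^4). It is a generic textbook calculation, not the NASA simulation model described above, and every parameter value is hypothetical.

        import numpy as np

        def point_target_power(pt_w, gain, wavelength_m, rcs_m2, range_m):
            """Received power (W) from a point target via the monostatic radar equation:
               Pr = Pt * G^2 * lambda^2 * sigma / ((4*pi)^3 * R^4)
            """
            return (pt_w * gain**2 * wavelength_m**2 * rcs_m2) / ((4 * np.pi) ** 3 * range_m ** 4)

        # Hypothetical X-band weather/mapping radar observing a large point target
        pr = point_target_power(pt_w=10e3, gain=10**(30 / 10), wavelength_m=0.032,
                                rcs_m2=1e3, range_m=20e3)
        print(f"received power = {10 * np.log10(pr / 1e-3):.1f} dBm")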

  11. Three-dimensional subsurface imaging synthetic aperture radar

    International Nuclear Information System (INIS)

    Moussally, G.J.

    1995-01-01

    The objective of this applied research and development project is to develop a system known as '3-D SISAR'. This system consists of a ground penetrating radar with software algorithms designed for the detection, location, and identification of buried objects in the underground hazardous waste environments found at DOE storage sites. Three-dimensional maps of the object locations will be produced which can assist the development of remediation strategies and the characterization of the digface during remediation operations. It is expected that the 3-D SISAR will also prove useful for monitoring hydrocarbon-based contaminant migration after remediation. The underground imaging technique being developed under this contract utilizes a spotlight-mode Synthetic Aperture Radar (SAR) approach which, due to its inherent stand-off capability, will permit the rapid survey of a site and achieve a high degree of productivity over large areas. When deployed from an airborne platform, the stand-off technique is also seen as a way to overcome practical survey limitations encountered at vegetated sites.

  12. Guide to Using the WIND Toolkit Validation Code

    Energy Technology Data Exchange (ETDEWEB)

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

    In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density at more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at 2-km and at a 5-minute resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, as then corrective steps can be taken. Therefore, we provide validation code written in R that will be made public to provide users with tools to validate data for their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well observational data compare to model data. Ideally, validation will confirm beneficial locations to utilize wind energy and encourage regional wind integration studies using the WIND Toolkit.
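
    The error metrics listed above are straightforward to reproduce. The sketch below gives hedged reference implementations in Python (the toolkit's own validation code is written in R) for bias, RMSE, centered RMSE, MAE, and percent error; the paired wind-speed values are invented.

        import numpy as np

        def validation_metrics(obs, mod):
            obs, mod = np.asarray(obs, float), np.asarray(mod, float)
            err = mod - obs
            return {
                "bias": err.mean(),
                "rmse": np.sqrt(np.mean(err ** 2)),
                # centered RMSE: RMSE after removing the mean bias
                "crmse": np.sqrt(np.mean((err - err.mean()) ** 2)),
                "mae": np.abs(err).mean(),
                "percent_error": 100.0 * err.mean() / obs.mean(),
            }

        obs = [5.1, 6.3, 7.8, 4.2, 9.0]   # observed wind speeds (m/s), made-up values
        mod = [5.6, 6.0, 8.4, 4.9, 8.1]   # modeled wind speeds (m/s)
        print(validation_metrics(obs, mod))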

  13. Planetary Radar

    Science.gov (United States)

    Neish, Catherine D.; Carter, Lynn M.

    2015-01-01

    This chapter describes the principles of planetary radar, and the primary scientific discoveries that have been made using this technique. The chapter starts by describing the different types of radar systems and how they are used to acquire images and accurate topography of planetary surfaces and probe their subsurface structure. It then explains how these products can be used to understand the properties of the target being investigated. Several examples of discoveries made with planetary radar are then summarized, covering solar system objects from Mercury to Saturn. Finally, opportunities for future discoveries in planetary radar are outlined and discussed.

  14. Innovations and Challenges of Implementing a Glucose Gel Toolkit for Neonatal Hypoglycemia.

    Science.gov (United States)

    Hammer, Denise; Pohl, Carla; Jacobs, Peggy J; Kaufman, Susan; Drury, Brenda

    2018-05-24

    Transient neonatal hypoglycemia occurs most commonly in newborns who are small for gestational age, large for gestational age, infants of diabetic mothers, and late preterm infants. An exact blood glucose value has not been determined for neonatal hypoglycemia, and it is important to note that poor neurologic outcomes can occur if hypoglycemia is left untreated. Interventions that separate mothers and newborns, as well as use of formula to treat hypoglycemia, have the potential to disrupt exclusive breastfeeding. To determine whether implementation of a toolkit designed to support staff in the adaptation of the practice change for management of newborns at risk for hypoglycemia, that includes 40% glucose gel in an obstetric unit with a level 2 nursery will decrease admissions to the Intermediate Care Nursery, and increase exclusive breastfeeding. This descriptive study used a retrospective chart review for pre/postimplementation of the Management of Newborns at Risk for Hypoglycemia Toolkit (Toolkit) using a convenience sample of at-risk newborns in the first 2 days of life to evaluate the proposed outcomes. Following implementation of the Toolkit, at-risk newborns had a clinically but not statistically significant 6.5% increase in exclusive breastfeeding and a clinically but not statistically significant 5% decrease in admissions to the Intermediate Care Nursery. The Toolkit was designed for ease of staff use and to improve outcomes for the at-risk newborn. Future research includes replication at other level 2 and level 1 obstetric centers and investigation into the number of 40% glucose gel doses that can safely be administered.

  15. Understanding radar systems

    CERN Document Server

    Kingsley, Simon

    1999-01-01

    What is radar? What systems are currently in use? How do they work? This book provides engineers and scientists with answers to these critical questions, focusing on actual radar systems in use today. It is a perfect resource for those just entering the field, or as a quick refresher for experienced practitioners. The book leads readers through the specialized language and calculations that comprise the complex world of radar engineering as seen in dozens of state-of-the-art radar systems. An easy to read, wide ranging guide to the world of modern radar systems.

  16. Radar orthogonality and radar length in Finsler and metric spacetime geometry

    Science.gov (United States)

    Pfeifer, Christian

    2014-09-01

    The radar experiment connects the geometry of spacetime with an observer's measurement of spatial length. We investigate the radar experiment on Finsler spacetimes, which leads to a general definition of radar orthogonality and radar length. The directions radar orthogonal to an observer form the spatial equal-time surface the observer experiences, and the radar length is the physical length the observer associates with spatial objects. We demonstrate these concepts on a fourth-order polynomial Finsler spacetime geometry which may emerge from area metric or premetric linear electrodynamics or in quantum gravity phenomenology. In an explicit generalization of Minkowski spacetime geometry we derive the deviation from the Euclidean spatial length measure in an observer's rest frame explicitly.
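
    For orientation, the familiar metric-spacetime limit of these definitions can be written compactly. In the sketch below (the standard relativistic radar construction, not the Finsler generalization derived in the paper), an observer emits a light signal at proper time τ₁ and receives the echo from an event at proper time τ₂:

        \ell_{\mathrm{radar}} = \frac{c}{2}\,(\tau_2 - \tau_1),
        \qquad
        \tau_{\mathrm{simultaneous}} = \frac{1}{2}\,(\tau_1 + \tau_2)

    Directions radar orthogonal to the observer are those lying in the equal-time surface picked out by τ_simultaneous; the paper generalizes both notions to Finsler geometries.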

  17. Observations of Phobos by the Mars Express radar MARSIS: Description of the detection techniques and preliminary results

    Science.gov (United States)

    Cicchetti, A.; Nenna, C.; Plaut, J. J.; Plettemeier, D.; Noschese, R.; Cartacci, M.; Orosei, R.

    2017-11-01

    The Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS) (Picardi et al., 2005) is a synthetic aperture low frequency radar altimeter, onboard the ESA Mars Express orbiter, launched in June 2003. It is the first and so far the only spaceborne radar that has observed the Martian moon Phobos. Radar echoes were collected on different flyby trajectories. The primary aim of sounding Phobos is to prove the feasibility of deep sounding into its subsurface. MARSIS is optimized for deep penetration investigations and is capable of transmitting at four different bands between 1.3 MHz and 5.5 MHz with a 1 MHz bandwidth. Unfortunately the instrument was originally designed to operate exclusively on Mars, assuming that Phobos would not be observed. Following this assumption, a protection mechanism was implemented in the hardware (HW) to maintain a minimum time separation between transmission and reception phases of the radar. This limitation does not have any impact on Mars observation but it prevented the observation of Phobos. In order to successfully operate the instrument at Phobos, a particular configuration of the MARSIS onboard software (SW) parameters, called "Range Ambiguity," was implemented to override the HW protection zone, ensuring at the same time a high level of safety of the instrument. This paper describes the principles of MARSIS onboard processing, and the procedure through which the parameters of the processing software were tuned to observe targets below the minimum distance allowed by the hardware. Some preliminary results of data analysis are shown, with the support of radar echo simulations. A qualitative comparison between the simulated results and the actual data does not support the detection of subsurface reflectors.

  18. pypet: A Python Toolkit for Data Management of Parameter Explorations.

    Science.gov (United States)

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.

  19. Wide Band and Wide Azimuth Beam Effect on High-resolution Synthetic Aperture Radar Radiometric Calibration

    Directory of Open Access Journals (Sweden)

    Hong Jun

    2015-06-01

    Full Text Available Passive corner reflectors and active transponders are often used as man-made reference targets in Synthetic Aperture Radar (SAR) radiometric calibration. With the emergence of new radar systems and the increasing demand for greater accuracy, wide-band and wide-beam radars challenge the hypothesis that the Radar Cross Section (RCS) of reference targets is constant. In this study, the FEKO electromagnetic simulation software is used to obtain the change curve of the target RCS as a function of frequency and aspect angle, combined with high-resolution point-target SAR simulation, and the effect of this modulation on SAR images is quantitatively analyzed. The simulation results suggest that the abovementioned factors affect the SAR calibration by more than 0.2 dB within a fractional bandwidth greater than 10% or an azimuth beam width of more than 20°, which must be corrected in the data processing.
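
    The frequency dependence at issue is visible even in the textbook peak-RCS formula for a triangular trihedral corner reflector, σ = 4πa⁴/(3λ²). The short sketch below (a generic calculation, not the FEKO simulation used in the paper; the reflector size and centre frequency are hypothetical) evaluates the spread in RCS across a 10% fractional bandwidth.

        import numpy as np

        C = 299792458.0  # speed of light, m/s

        def trihedral_peak_rcs(a_m, freq_hz):
            """Peak RCS (m^2) of a triangular trihedral corner reflector with edge length a."""
            lam = C / freq_hz
            return 4 * np.pi * a_m ** 4 / (3 * lam ** 2)

        f0 = 9.6e9                                       # hypothetical X-band centre frequency
        freqs = np.linspace(0.95 * f0, 1.05 * f0, 101)   # ~10% fractional bandwidth
        rcs = trihedral_peak_rcs(0.3, freqs)             # 0.3 m corner reflector
        spread_db = 10 * np.log10(rcs.max() / rcs.min())
        print(f"RCS variation across the band: {spread_db:.2f} dB")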

  20. Three-dimensional, subsurface imaging synthetic aperture radar

    International Nuclear Information System (INIS)

    Moussally, G.J.

    1994-01-01

    The objective of this applied research and development project is to develop a system known as 3-D SISAR. This system consists of a ground penetrating radar with software algorithms designed for detection, location, and identification of buried objects in the underground hazardous waste environments found at US DOE storage sites. Three-dimensional maps can assist the development of remediation strategies and characterization of the digface during remediation. The system should also be useful for monitoring hydrocarbon-based contaminant migration after remediation. 5 figs

  1. Propagation of radar rainfall uncertainty in urban flood simulations

    Science.gov (United States)

    Liguori, Sara; Rico-Ramirez, Miguel

    2013-04-01

    hydrodynamic sewer network model implemented in the Infoworks software was used to model the rainfall-runoff process in the urban area. The software calculates the flow through the sewer conduits of the urban model using rainfall as the primary input. The sewer network is covered by 25 radar pixels with a spatial resolution of 1 km2. The majority of the sewer system is combined, carrying both urban rainfall runoff as well as domestic and trade waste water [11]. The urban model was configured to receive the probabilistic radar rainfall fields. The results showed that the radar rainfall ensembles provide additional information about the uncertainty in the radar rainfall measurements that can be propagated in urban flood modelling. The peaks of the measured flow hydrographs are often bounded within the uncertainty area produced by using the radar rainfall ensembles. This is in fact one of the benefits of using radar rainfall ensembles in urban flood modelling. More work needs to be done in improving the urban models, but this is out of the scope of this research. The rainfall uncertainty cannot explain the whole uncertainty shown in the flow simulations, and additional sources of uncertainty will come from the structure of the urban models as well as the large number of parameters required by these models. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and the UK Environment Agency for providing the various data sets. We also thank Yorkshire Water Services Ltd for providing the urban model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1. References [1] Browning KA, 1978. Meteorological applications of radar. Reports on Progress in Physics 41 761 Doi: 10.1088/0034-4885/41/5/003 [2] Rico-Ramirez MA, Cluckie ID, Shepherd G, Pallot A, 2007. A high-resolution radar experiment on the island of Jersey. Meteorological Applications 14: 117-129. [3] Villarini G, Krajewski WF

  2. Building the Support for Radar Processing across Memory Hierarchies: On the Development of an Array Class with Shapes using Expression Templates in C++̂

    National Research Council Canada - National Science Library

    Mullin, Lenore

    2004-01-01

    ...), could be used to develop software for radar and other DSP applications. This software needs to be tuned to use the levels of memory hierarchies efficiently without the materialization of array valued temporaries 3...

  3. Monitoring the grid with the Globus Toolkit MDS4

    International Nuclear Information System (INIS)

    Schopf, Jennifer M; Pearlman, Laura; Miller, Neill; Kesselman, Carl; Foster, Ian; D'Arcy, Mike; Chervenak, Ann

    2006-01-01

    The Globus Toolkit Monitoring and Discovery System (MDS4) defines and implements mechanisms for service and resource discovery and monitoring in distributed environments. MDS4 is distinguished from previous similar systems by its extensive use of interfaces and behaviors defined in the WS-Resource Framework and WS-Notification specifications, and by its deep integration into essentially every component of the Globus Toolkit. We describe the MDS4 architecture and the Web service interfaces and behaviors that allow users to discover resources and services, monitor resource and service states, receive updates on current status, and visualize monitoring results. We present two current deployments to provide insights into the functionality that can be achieved via the use of these mechanisms

  4. Advances in bistatic radar

    CERN Document Server

    Willis, Nick

    2007-01-01

    Advances in Bistatic Radar updates and extends bistatic and multistatic radar developments since publication of Willis' Bistatic Radar in 1991. New and recently declassified military applications are documented. Civil applications are detailed including commercial and scientific systems. Leading radar engineers provide expertise to each of these applications. Advances in Bistatic Radar consists of two major sections: Bistatic/Multistatic Radar Systems and Bistatic Clutter and Signal Processing. Starting with a history update, the first section documents the early and now declassified military

  5. Recommendation on Transition from Primary/Secondary Radar to Secondary- Only Radar Capability

    Science.gov (United States)

    1994-10-01

    Radar Beacon Performance Monitor; RCIU: Remote Control Interface Unit; RCL: Remote Communications Link; R,E&D: Research, Engineering and Development; RML: Radar...rate. 3.1.2.5 Maintenance: The current LRRs have limited remote maintenance monitoring (RMM) capabilities via the Remote Control Interface Unit (RCIU)...1, -2 and FPS-20 radars required an upgrade of some of the radar subsystems, namely the RCIU to respond as an RMS and the CD to interface with radar

  6. Energy Savings Performance Contract Energy Sales Agreement Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    None

    2017-08-14

    FEMP developed the Energy Savings Performance Contracting Energy Sales Agreement (ESPC ESA) Toolkit to provide federal agency contracting officers and other acquisition team members with information that will facilitate the timely execution of ESPC ESA projects.

  7. Radar-Based Depth Area Reduction Factors for Colorado

    Science.gov (United States)

    Curtis, D. C.; Humphrey, J. H.; Bare, D.

    2011-12-01

    More than 340,000 fifteen-minute storm cells, nearly 45,000 one-hour cells, and over 20,000 three-hour cells found in 21 months of gage-adjusted radar-rainfall estimates (GARR) over El Paso County, CO, were identified and evaluated using TITAN (Thunderstorm Identification, Tracking, Analysis and Nowcasting) software. TITAN's storm cell identification capability enabled the analysis of the geometric properties of storms, time step by time step. The gage-adjusted radar-rainfall data set was derived for months containing runoff-producing events observed in the Fountain Creek Watershed within El Paso County from 1994-2008. Storm-centered Depth Area Reduction Factors (DARFs) were computed and compared to DARFs published by the U.S. National Weather Service (NWS) in Technical Paper 29, which are widely used in stormwater infrastructure design. Radar-based storm-centered DARFs decay much more sharply than the NWS standard curves. The results suggest lower watershed average rainfall inputs from radar-based storm-centered DARFs than from standard NWS DARFs for a given watershed area. The results also suggest that DARFs are variable by return period and, perhaps, by location. Both findings could have significant impacts on design storm standards. Lower design volumes for a given return period translate to lower capacity requirements and lower cost infrastructure. Conversely, the higher volume requirements implied for the NWS DARFs translate to higher capacity requirements, higher costs, but lower risk of failure. Ultimately, a decision about which approach to use depends on the risk tolerance of the decision maker. However, the growing volume of historical radar rainfall estimates, coupled with the type of analysis described herein, supports a better understanding of risk and more informed decision-making by local officials.
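
    A storm-centered depth-area reduction factor is, in essence, the ratio of the rainfall depth averaged over an area surrounding the storm peak to the peak point depth itself. The sketch below is a generic illustration of that ratio (approximating the area around the peak by the deepest pixels of a 1-km² grid), not the TITAN-based workflow of the study; the rainfall field is synthetic.

        import numpy as np

        def storm_centered_darf(rain, cell_km2, areas_km2):
            """DARF(A) = mean depth over the A km^2 nearest the storm peak / peak depth."""
            flat = np.sort(rain.ravel())[::-1]          # pixel depths, largest first
            peak = flat[0]
            darfs = {}
            for a in areas_km2:
                n = max(1, int(round(a / cell_km2)))    # number of pixels covering area A
                darfs[a] = flat[:n].mean() / peak
            return darfs

        rain = np.random.gamma(shape=2.0, scale=5.0, size=(50, 50))   # synthetic field (mm)
        print(storm_centered_darf(rain, cell_km2=1.0, areas_km2=[10, 50, 100, 500]))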

  8. Adaptive radar resource management

    CERN Document Server

    Moo, Peter

    2015-01-01

    Radar Resource Management (RRM) is vital for optimizing the performance of modern phased array radars, which are the primary sensor for aircraft, ships, and land platforms. Adaptive Radar Resource Management gives an introduction to radar resource management (RRM), presenting a clear overview of different approaches and techniques, making it very suitable for radar practitioners and researchers in industry and universities. Coverage includes: RRM's role in optimizing the performance of modern phased array radars; the advantages of adaptivity in implementing RRM; the role that modelling and

  9. Radar and ARPA manual

    CERN Document Server

    Bole, A G

    2013-01-01

    Radar and ARPA Manual focuses on the theoretical and practical aspects of electronic navigation. The manual first discusses basic radar principles, including principles of range and bearing measurements and picture orientation and presentation. The text then looks at the operational principles of radar systems. Function of units; aerial, receiver, and display principles; transmitter principles; and sitting of units on board ships are discussed. The book also describes target detection, Automatic Radar Plotting Aids (ARPA), and operational controls of radar systems, and then discusses radar plo

  10. PAGANI Toolkit: Parallel graph-theoretical analysis package for brain network big data.

    Science.gov (United States)

    Du, Haixiao; Xia, Mingrui; Zhao, Kang; Liao, Xuhong; Yang, Huazhong; Wang, Yu; He, Yong

    2018-05-01

    The recent collection of unprecedented quantities of neuroimaging data with high spatial resolution has led to brain network big data. However, a toolkit for fast and scalable computational solutions is still lacking. Here, we developed the PArallel Graph-theoretical ANalysIs (PAGANI) Toolkit based on a hybrid central processing unit-graphics processing unit (CPU-GPU) framework with a graphical user interface to facilitate the mapping and characterization of high-resolution brain networks. Specifically, the toolkit provides flexible parameters for users to customize computations of graph metrics in brain network analyses. As an empirical example, the PAGANI Toolkit was applied to individual voxel-based brain networks with ∼200,000 nodes that were derived from a resting-state fMRI dataset of 624 healthy young adults from the Human Connectome Project. Using a personal computer, this toolbox completed all computations in ∼27 h for one subject, which is markedly less than the 118 h required with a single-thread implementation. The voxel-based functional brain networks exhibited prominent small-world characteristics and densely connected hubs, which were mainly located in the medial and lateral fronto-parietal cortices. Moreover, the female group had significantly higher modularity and nodal betweenness centrality mainly in the medial/lateral fronto-parietal and occipital cortices than the male group. Significant correlations between the intelligence quotient and nodal metrics were also observed in several frontal regions. Collectively, the PAGANI Toolkit shows high computational performance and good scalability for analyzing connectome big data and provides a friendly interface without the complicated configuration of computing environments, thereby facilitating high-resolution connectomics research in health and disease. © 2018 Wiley Periodicals, Inc.

  11. Basic Radar Altimetry Toolbox: Tools and Tutorial to Use Cryosat Data

    Science.gov (United States)

    Benveniste, J.; Bronner, E.; Dinardo, S.; Lucas, B. M.; Rosmorduc, V.; Earith, D.; Niemeijer, S.

    2011-12-01

    Radar altimetry is very much a technique expanding its applications. Even if quite a lot of effort has been invested for oceanography users, the use of altimetry data for cryosphere applications, especially with the new ESA CryoSat-2 mission data, is still somewhat tedious for new altimetry data products users. ESA and CNES therefore developed the Basic Radar Altimetry Toolbox a few years ago, and are improving and upgrading it to fit new missions and the growing number of altimetry uses. The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral mission, and is ready for adaptation to Sentinel-3 products; to perform some processing, data editing and statistics; and to visualize the results. It can be used at several levels and in several ways: as a data reading tool, with APIs for C, Fortran, Matlab and IDL; as processing/extraction routines, through the on-line command mode; and as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent manners of using altimetry data. It is an opportunity to teach remote sensing with practical training. It has been available since April 2007, and has been demonstrated during training courses and scientific meetings. About 2000 people downloaded it (Summer 2011), with many "newcomers" to altimetry among them

  12. Social Radar

    Science.gov (United States)

    2012-01-01

    Barry Costa and John Boiney, MITRE Corporation. ...defenders require an integrated set of capabilities that we refer to as a "social radar." Such a system would support strategic- to operational-level situation awareness, alerting, course of action analysis, and measures of effectiveness for each action undertaken. Success of a social radar

  13. NNCTRL - a CANCSD toolkit for MATLAB(R)

    DEFF Research Database (Denmark)

    Nørgård, Peter Magnus; Ravn, Ole; Poulsen, Niels Kjølstad

    1996-01-01

    A set of tools for computer-aided neuro-control system design (CANCSD) has been developed for the MATLAB environment. The tools can be used for construction and simulation of a variety of neural network based control systems. The design methods featured in the toolkit are: direct inverse control...

  14. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    Science.gov (United States)

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. Ethnography in design: Tool-kit or analytic science?

    DEFF Research Database (Denmark)

    Bossen, Claus

    2002-01-01

    The role of ethnography in system development is discussed through the selective application of an easy-to-use ethnographic toolkit, Contextual design, by a computer firm in the initial stages of the development of a health care system.

  16. Novel radar techniques and applications

    CERN Document Server

    Klemm, Richard; Lombardo, Pierfrancesco; Nickel, Ulrich

    2017-01-01

    Novel Radar Techniques and Applications presents the state-of-the-art in advanced radar, with emphasis on ongoing novel research and development and contributions from an international team of leading radar experts. This volume covers: Real aperture array radar; Imaging radar and Passive and multistatic radar.

  17. ATK-ForceField: a new generation molecular dynamics software package

    Science.gov (United States)

    Schneider, Julian; Hamaekers, Jan; Chill, Samuel T.; Smidstrup, Søren; Bulin, Johannes; Thesen, Ralph; Blom, Anders; Stokbro, Kurt

    2017-12-01

    ATK-ForceField is a software package for atomistic simulations using classical interatomic potentials. It is implemented as a part of the Atomistix ToolKit (ATK), which is a Python programming environment that makes it easy to create and analyze both standard and highly customized simulations. This paper will focus on the atomic interaction potentials, molecular dynamics, and geometry optimization features of the software; however, many more advanced modeling features are available. The implementation details of these algorithms and their computational performance will be shown. We present three illustrative examples of the types of calculations that are possible with ATK-ForceField: modeling thermal transport properties in a silicon germanium crystal, vapor deposition of selenium molecules on a selenium surface, and a simulation of creep in a copper polycrystal.

  18. The nursing human resource planning best practice toolkit: creating a best practice resource for nursing managers.

    Science.gov (United States)

    Vincent, Leslie; Beduz, Mary Agnes

    2010-05-01

    Evidence of acute nursing shortages in urban hospitals has been surfacing since 2000. Further, new graduate nurses account for more than 50% of total nurse turnover in some hospitals and between 35% and 60% of new graduates change workplace during the first year. Critical to organizational success, first line nurse managers must have the knowledge and skills to ensure the accurate projection of nursing resource requirements and to develop proactive recruitment and retention programs that are effective, promote positive nursing socialization, and provide early exposure to the clinical setting. The Nursing Human Resource Planning Best Practice Toolkit project supported the creation of a network of teaching and community hospitals to develop a best practice toolkit in nursing human resource planning targeted at first line nursing managers. The toolkit includes the development of a framework including the conceptual building blocks of planning tools, manager interventions, retention and recruitment and professional practice models. The development of the toolkit involved conducting a review of the literature for best practices in nursing human resource planning, using a mixed method approach to data collection including a survey and extensive interviews of managers and completing a comprehensive scan of human resource practices in the participating organizations. This paper will provide an overview of the process used to develop the toolkit, a description of the toolkit contents and a reflection on the outcomes of the project.

  19. Monitoring internal organ motion with continuous wave radar in CT

    International Nuclear Information System (INIS)

    Pfanner, Florian; Maier, Joscha; Allmendinger, Thomas; Flohr, Thomas; Kachelrieß, Marc

    2013-01-01

    Purpose: To avoid motion artifacts in medical imaging or to minimize the exposure of healthy tissues in radiation therapy, medical devices are often synchronized with the patient's respiratory motion. Today's respiratory motion monitors require additional effort to prepare the patients, e.g., mounting a motion belt or placing an optical reflector on the patient's breast. Furthermore, they are not able to measure internal organ motion without implanting markers. An interesting alternative to assess the patient's organ motion is continuous wave radar. The aim of this work is to design, implement, and evaluate such a radar system focusing on application in CT.Methods: The authors designed a radar system operating in the 860 MHz band to monitor the patient motion. In the intended application of the radar system, the antennas are located close to the patient's body inside the table of a CT system. One receive and four transmitting antennas are used to avoid the requirement of exact patient positioning. The radar waves propagate into the patient's body and are reflected at tissue boundaries, for example at the borderline between muscle and adipose tissue, or at the boundaries of organs. At present, the authors focus on the detection of respiratory motion. The radar system consists of the hardware mentioned above as well as of dedicated signal processing software to extract the desired information from the radar signal. The system was evaluated using simulations and measurements. To simulate the radar system, a simulation model based on radar and wave field equations was designed and 4D respiratory-gated CT data sets were used as input. The simulated radar signals and the measured data were processed in the same way. The radar system hardware and the signal processing algorithms were tested with data from ten volunteers. As a reference, the respiratory motion signal was recorded using a breast belt simultaneously with the radar measurements.Results: Concerning the
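
    The underlying signal-processing idea of such a continuous-wave motion radar is that a displacement d(t) of a reflecting boundary modulates the received carrier phase by 4πd(t)/λ. The sketch below is a generic, hypothetical illustration of recovering a respiratory displacement from a simulated I/Q signal by phase demodulation; it is not the authors' dedicated processing software, and all signal parameters apart from the 860 MHz band are invented.

        import numpy as np

        C = 299792458.0
        f_carrier = 860e6                   # carrier in the 860 MHz band (as in the paper)
        lam = C / f_carrier

        fs = 100.0                                      # sampling rate, Hz
        t = np.arange(0, 30, 1 / fs)                    # 30 s of data
        d_true = 0.005 * np.sin(2 * np.pi * 0.25 * t)   # 5 mm motion at 15 breaths/min

        phase = 4 * np.pi * d_true / lam                # round-trip phase modulation
        iq = np.exp(1j * phase) + 0.05 * (np.random.randn(t.size) +
                                          1j * np.random.randn(t.size))

        d_est = np.unwrap(np.angle(iq)) * lam / (4 * np.pi)   # phase -> displacement
        d_est -= d_est.mean()                                  # remove constant offset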

  20. Research standardization tools: pregnancy measures in the PhenX Toolkit.

    Science.gov (United States)

    Malinowski, Ann Kinga; Ananth, Cande V; Catalano, Patrick; Hines, Erin P; Kirby, Russell S; Klebanoff, Mark A; Mulvihill, John J; Simhan, Hyagriv; Hamilton, Carol M; Hendershot, Tabitha P; Phillips, Michael J; Kilpatrick, Lisa A; Maiese, Deborah R; Ramos, Erin M; Wright, Rosalind J; Dolan, Siobhan M

    2017-09-01

    Only through concerted and well-executed research endeavors can we gain the requisite knowledge to advance pregnancy care and have a positive impact on maternal and newborn health. Yet the heterogeneity inherent in individual studies limits our ability to compare and synthesize study results, thus impeding the capacity to draw meaningful conclusions that can be trusted to inform clinical care. The PhenX Toolkit (http://www.phenxtoolkit.org), supported since 2007 by the National Institutes of Health, is a web-based catalog of standardized protocols for measuring phenotypes and exposures relevant for clinical research. In 2016, a working group of pregnancy experts recommended 15 measures for the PhenX Toolkit that are highly relevant to pregnancy research. The working group followed the established PhenX consensus process to recommend protocols that are broadly validated, well established, nonproprietary, and have a relatively low burden for investigators and participants. The working group considered input from the pregnancy experts and the broader research community and included measures addressing the mode of conception, gestational age, fetal growth assessment, prenatal care, the mode of delivery, gestational diabetes, behavioral and mental health, and environmental exposure biomarkers. These pregnancy measures complement the existing measures for other established domains in the PhenX Toolkit, including reproductive health, anthropometrics, demographic characteristics, and alcohol, tobacco, and other substances. The preceding domains influence a woman's health during pregnancy. For each measure, the PhenX Toolkit includes data dictionaries and data collection worksheets that facilitate incorporation of the protocol into new or existing studies. The measures within the pregnancy domain offer a valuable resource to investigators and clinicians and are well poised to facilitate collaborative pregnancy research with the goal to improve patient care. To achieve this

  1. HF Radar Observations of Current, Wave and Wind Parameters in the South Australian Gulf

    Science.gov (United States)

    Middleditch, A.; Cosoli, S.

    2016-12-01

    The Australian Coastal Ocean Radar Network (ACORN) has been measuring metocean parameters from an array of HF radar systems since 2007. Current, wave and wind measurements from a WERA phased-array radar system in the South Australian Gulf are evaluated using current meter, wave buoy and weather station data over a 12-month period. The spatial and temporal scales of the radar deployment have been configured for the measurement of surface currents from the first order backscatter spectra. Quality control procedures are applied to the radar currents that relate to the geometric configurations, statistical properties, and diagnostic variables provided by the analysis software. Wave measurements are obtained through an iterative inversion algorithm that provides an estimate of the directional frequency spectrum. The standard static configurations and data sampling strategies are not optimised for waves and so additional signal processing steps need to be implemented in order to provide reliable estimates. These techniques are currently only applied in offline mode but a real-time approach is in development. Improvements in the quality of extracted wave data are found through increased averaging of the raw radar data but the impact of temporal non-stationarity and spatial inhomogeneities in the WERA measurement region needs to be taken into account. Validations of wind direction data from a weather station on Neptune Island show the potential of using HF radar to combat the spread of bushfires in South Australia.

  2. pypet: A Python Toolkit for Data Management of Parameter Explorations

    Directory of Open Access Journals (Sweden)

    Robert Meyer

    2016-08-01

    Full Text Available pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.

  3. Minimum redundancy MIMO radars

    OpenAIRE

    Chen, Chun-Yang; Vaidyanathan, P. P.

    2008-01-01

    The multiple-input multiple-output (MIMO) radar concept has drawn considerable attention recently. In the traditional single-input multiple-output (SIMO) radar system, the transmitter emits scaled versions of a single waveform. However, in the MIMO radar system, the transmitter transmits independent waveforms. It has been shown that the MIMO radar can be used to improve system performance. Most of the MIMO radar research so far has focused on the uniform array. However, i...

  4. Integrating Satellite, Radar and Surface Observation with Time and Space Matching

    Science.gov (United States)

    Ho, Y.; Weber, J.

    2015-12-01

    The Integrated Data Viewer (IDV) from Unidata is a Java™-based software framework for analyzing and visualizing geoscience data. It brings together the ability to display and work with satellite imagery, gridded data, surface observations, balloon soundings, NWS WSR-88D Level II and Level III RADAR data, and NOAA National Profiler Network data, all within a unified interface. Applying time and space matching on the satellite, radar and surface observation datasets will automatically synchronize the display from different data sources and spatially subset to match the display area in the view window. These features allow the IDV users to effectively integrate these observations and provide 3 dimensional views of the weather system to better understand the underlying dynamics and physics of weather phenomena.

  5. BRDF profile of Tyvek and its implementation in the Geant4 simulation toolkit.

    Science.gov (United States)

    Nozka, Libor; Pech, Miroslav; Hiklova, Helena; Mandat, Dusan; Hrabovsky, Miroslav; Schovanek, Petr; Palatka, Miroslav

    2011-02-28

    Diffuse and specular characteristics of the Tyvek 1025-BL material are reported with respect to their implementation in the Geant4 Monte Carlo simulation toolkit. This toolkit incorporates the UNIFIED model. Coefficients defined by the UNIFIED model were calculated from the bidirectional reflectance distribution function (BRDF) profiles measured with a scatterometer for several angles of incidence. Results were amended with profile measurements made by a profilometer.

  6. Principles of modern radar systems

    CERN Document Server

    Carpentier, Michel H

    1988-01-01

    Introduction to random functions; signal and noise: the ideal receiver; performance of radar systems equipped with ideal receivers; analysis of the operating principles of some types of radar; behavior of real targets, fluctuation of targets; angle measurement using radar; data processing of radar information, radar coverage; applications of electronic scanning antennas to radar; introduction to Hilbert spaces.

  7. Building Emergency Contraception Awareness among Adolescents. A Toolkit for Schools and Community-Based Organizations.

    Science.gov (United States)

    Simkin, Linda; Radosh, Alice; Nelsesteun, Kari; Silverstein, Stacy

    This toolkit presents emergency contraception (EC) as a method to help adolescent women avoid pregnancy and abortion after unprotected sexual intercourse. The sections of this toolkit are designed to help you increase your knowledge of EC and stay up to date. They provide suggestions for increasing EC awareness in the workplace, whether it is a school…

  8. Acquisition and use of Orlando, Florida and Continental Airbus radar flight test data

    Science.gov (United States)

    Eide, Michael C.; Mathews, Bruce

    1992-01-01

    Westinghouse is developing a lookdown pulse Doppler radar for production as the sensor and processor of a forward-looking hazardous windshear detection and avoidance system. A data collection prototype of that product was ready for flight testing in Orlando to encounter low-level windshear in corroboration with the FAA Terminal Doppler Weather Radar (TDWR). Airborne real-time processing and display of the hazard factor were demonstrated with TDWR-facilitated intercepts and penetrations of over 80 microbursts in a three-day period, including microbursts with hazard factors in excess of .16 (with 500 ft. PIREP altitude loss) and the hazard factor display at 6 n.mi. of a visually transparent ('dry') microburst with TDWR-corroborated outflow reflectivities of +5 dBz. Range-gated Doppler spectrum data was recorded for subsequent development and refinement of hazard factor detection and urban clutter rejection algorithms. Following Orlando, the data collection radar was supplemental type certified for use in revenue service on a Continental Airlines Airbus, on an automatic and non-interfering basis with its ARINC 708 radar, to allow Westinghouse to confirm its understanding of commercial aircraft installation, interface realities, and urban airport clutter. A number of software upgrades, all of which were verified at the Receiver-Transmitter-Processor (RTP) hardware bench with Orlando microburst data to produce the desired advanced-warning hazard factor detection, included some preliminary loads with automatic (sliding-window average hazard factor) detection and annunciation recording. The current (14-APR-92) configured software is free from false and/or nuisance alerts (CAUTIONS, WARNINGS, etc.) for all take-off and landing approaches, under 2500 ft. altitude to weight-on-wheels, into all encountered airports, including Newark (NJ), LAX, Denver, Houston, Cleveland, etc. Using the Orlando data collected on hazardous microbursts, Westinghouse has developed a lookdown pulse Doppler
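
    The 'hazard factor' referred to throughout is commonly defined in the windshear literature (one standard form, not necessarily Westinghouse's exact formulation) as F = (dWx/dt)/g − w/V, combining the along-track horizontal wind gradient with the vertical wind w and airspeed V; values around 0.1–0.15 and above indicate a hazardous performance loss. The sketch below evaluates this definition for a hypothetical microburst penetration.

        import numpy as np

        G = 9.81  # gravitational acceleration, m/s^2

        def hazard_factor(dwx_dt, w_vert, airspeed):
            """One standard form of the windshear hazard (F) factor:
               F = (dWx/dt)/g - w/V
            dwx_dt   : rate of change of horizontal wind along the flight path (m/s^2)
            w_vert   : vertical wind, positive upward (m/s); downdrafts are negative
            airspeed : true airspeed (m/s)
            """
            return dwx_dt / G - w_vert / airspeed

        # Hypothetical penetration: 1 m/s^2 headwind-to-tailwind change plus a
        # 5 m/s downdraft at 75 m/s approach speed
        print(hazard_factor(dwx_dt=1.0, w_vert=-5.0, airspeed=75.0))   # ~0.17 -> hazardous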

  9. Roofline model toolkit: A practical tool for architectural and program analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Yu Jung [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Van Straalen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ligocki, Terry J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cordery, Matthew J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wright, Nicholas J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hall, Mary W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-04-18

    We present preliminary results of the Roofline Toolkit for multicore, manycore, and accelerated architectures. This paper focuses on the processor architecture characterization engine, a collection of portable, instrumented microbenchmarks implemented with the Message Passing Interface (MPI) and OpenMP to express thread-level parallelism. These benchmarks are specialized to quantify the behavior of different architectural features. Compared to previous work on performance characterization, these microbenchmarks focus on capturing the performance of each level of the memory hierarchy, along with thread-level parallelism, instruction-level parallelism, and explicit SIMD parallelism, measured in the context of the compilers and run-time environments. We also measure sustained PCIe throughput with four GPU memory management mechanisms. By combining results from the architecture characterization with the Roofline model based solely on architectural specifications, this work offers insights for performance prediction of current and future architectures and their software systems. To that end, we instrument three applications and plot their resultant performance on the corresponding Roofline model when run on a Blue Gene/Q architecture.
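
    The Roofline bound itself is simple to state: attainable throughput is the minimum of the peak compute rate and the product of arithmetic intensity and peak memory bandwidth. The sketch below plots that bound for placeholder machine numbers; it is not part of the Roofline Toolkit.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # Placeholder machine characteristics (GFLOP/s and GB/s); real values would
    # come from the toolkit's micro-benchmarks.
    PEAK_GFLOPS = 500.0
    PEAK_BW_GBS = 100.0

    def roofline(arithmetic_intensity):
        """Attainable GFLOP/s for a given arithmetic intensity (FLOPs/byte)."""
        return np.minimum(PEAK_GFLOPS, PEAK_BW_GBS * arithmetic_intensity)

    ai = np.logspace(-2, 3, 200)          # FLOPs per byte
    plt.loglog(ai, roofline(ai), label="roofline")
    plt.axvline(PEAK_GFLOPS / PEAK_BW_GBS, ls="--", label="ridge point")
    plt.xlabel("Arithmetic intensity (FLOPs/byte)")
    plt.ylabel("Attainable GFLOP/s")
    plt.legend()
    plt.show()
    ```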

  10. The Radar Correlation and Interpolation (C&I) Algorithms Deployed in the ASR-9 Processor Augmentation Card (9PAC)

    National Research Council Canada - National Science Library

    Elkin, G

    2001-01-01

    .... The increased processing speed and memory size of the 9PAC hardware made it possible for new surveillance algorithms to be developed in software in order to provide improved primary radar and beacon...

  11. Numerical relativity in spherical coordinates with the Einstein Toolkit

    Science.gov (United States)

    Mewes, Vassilios; Zlochower, Yosef; Campanelli, Manuela; Ruchlin, Ian; Etienne, Zachariah B.; Baumgarte, Thomas W.

    2018-04-01

    Numerical relativity codes that do not make assumptions on spatial symmetries most commonly adopt Cartesian coordinates. While these coordinates have many attractive features, spherical coordinates are much better suited to take advantage of approximate symmetries in a number of astrophysical objects, including single stars, black holes, and accretion disks. While the appearance of coordinate singularities often spoils numerical relativity simulations in spherical coordinates, especially in the absence of any symmetry assumptions, it has recently been demonstrated that these problems can be avoided if the coordinate singularities are handled analytically. This is possible with the help of a reference-metric version of the Baumgarte-Shapiro-Shibata-Nakamura formulation together with a proper rescaling of tensorial quantities. In this paper we report on an implementation of this formalism in the Einstein Toolkit. We adapt the Einstein Toolkit infrastructure, originally designed for Cartesian coordinates, to handle spherical coordinates, by providing appropriate boundary conditions at both inner and outer boundaries. We perform numerical simulations for a disturbed Kerr black hole, extract the gravitational wave signal, and demonstrate that the noise in these signals is orders of magnitude smaller when computed on spherical grids rather than Cartesian grids. With the public release of our new Einstein Toolkit thorns, our methods for numerical relativity in spherical coordinates will become available to the entire numerical relativity community.
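
    To make the rescaling idea concrete, the spherical reference metric and a schematic rescaling of a vector's components can be written as follows (the notation here is chosen for illustration and is not quoted from the paper):

    ```latex
    % Spherical reference metric and a schematic rescaling of a vector V^i,
    % so that the evolved components v^i remain regular at r = 0 and sin(theta) = 0.
    \hat\gamma_{ij} = \mathrm{diag}\left(1,\; r^{2},\; r^{2}\sin^{2}\theta\right),
    \qquad
    V^{r} = v^{r}, \quad
    V^{\theta} = \frac{v^{\theta}}{r}, \quad
    V^{\phi} = \frac{v^{\phi}}{r\,\sin\theta}.
    ```

    The singular factors 1/r and 1/(r sin θ) are then handled analytically through the reference-metric machinery rather than being finite-differenced numerically.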

  12. Radar Chart

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Radar Chart collection is an archived product of summarized radar data. The geographic coverage is the 48 contiguous states of the United States. These hourly...

  13. A toolkit for promoting healthy ageing

    OpenAIRE

    Knevel, Jeroen; Gruppen, Aly

    2016-01-01

    This toolkit therefore focusses on self-management abilities. That means finding and maintaining effective, positive coping methods in relation to our health. We included many common and frequently discussed topics such as drinking, eating, physical exercise, believing in the future, resilience, preventing loneliness and social participation. Besides some concise background information, we offer you a great diversity of exercises per theme which can help you discuss, assess, change or strengt...

  14. Using stakeholder perspectives to develop an ePrescribing toolkit for NHS Hospitals: a questionnaire study.

    Science.gov (United States)

    Lee, Lisa; Cresswell, Kathrin; Slee, Ann; Slight, Sarah P; Coleman, Jamie; Sheikh, Aziz

    2014-10-01

    To evaluate how an online toolkit may support ePrescribing deployments in National Health Service hospitals, by assessing the type of knowledge-based resources currently sought by key stakeholders. Questionnaire-based survey of attendees at a national ePrescribing symposium. 2013 National ePrescribing Symposium in London, UK. Eighty-four delegates were eligible for inclusion in the survey, of whom 70 completed and returned the questionnaire. Estimate of the usefulness and type of content to be included in an ePrescribing toolkit. Interest in a toolkit designed to support the implementation and use of ePrescribing systems was high (n = 64; 91.4%). As could be expected given the current dearth of such a resource, few respondents (n = 2; 2.9%) had access or used an ePrescribing toolkit at the time of the survey. Anticipated users for the toolkit included implementation (n = 62; 88.6%) and information technology (n = 61; 87.1%) teams, pharmacists (n = 61; 87.1%), doctors (n = 58; 82.9%) and nurses (n = 56; 80.0%). Summary guidance for every stage of the implementation (n = 48; 68.6%), planning and monitoring tools (n = 47; 67.1%) and case studies of hospitals' experiences (n = 45; 64.3%) were considered the most useful types of content. There is a clear need for reliable and up-to-date knowledge to support ePrescribing system deployments and longer term use. The findings highlight how a toolkit may become a useful instrument for the management of knowledge in the field, not least by allowing the exchange of ideas and shared learning.

  15. Development of a Human Physiologically Based Pharmacokinetic (PBPK) Toolkit for Environmental Pollutants

    Directory of Open Access Journals (Sweden)

    Patricia Ruiz

    2011-10-01

    Full Text Available Physiologically Based Pharmacokinetic (PBPK) models can be used to determine the internal dose and strengthen exposure assessment. Many PBPK models are available, but they are not easily accessible for field use. The Agency for Toxic Substances and Disease Registry (ATSDR) has conducted translational research to develop a human PBPK model toolkit by recoding published PBPK models. This toolkit, when fully developed, will provide a platform that consists of a series of priority PBPK models of environmental pollutants. Presented here is work on recoded PBPK models for volatile organic compounds (VOCs) and metals. Good agreement was generally obtained between the original and the recoded models. This toolkit will be available for ATSDR scientists and public health assessors to perform simulations of exposures from contaminated environmental media at sites of concern and to help interpret biomonitoring data. It can be used as a screening tool that provides useful information for the protection of the public.
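
    As a toy illustration of the kind of model being recoded (not one of the ATSDR toolkit models), a one-compartment pharmacokinetic model with first-order absorption and elimination can be written as a small ODE system; the rate constants and dose below are arbitrary placeholders.

    ```python
    import numpy as np
    from scipy.integrate import odeint

    # Placeholder rate constants (1/h), volume (L) and dose (mg); illustrative only.
    ka, ke, V_d = 1.0, 0.2, 40.0
    dose = 100.0

    def pk_model(y, t):
        """One-compartment model: drug amount in the gut and in the central compartment."""
        gut, central = y
        dgut = -ka * gut
        dcentral = ka * gut - ke * central
        return [dgut, dcentral]

    t = np.linspace(0, 48, 200)                      # hours
    gut, central = odeint(pk_model, [dose, 0.0], t).T
    concentration = central / V_d                    # mg/L in the central compartment
    print(f"Peak concentration: {concentration.max():.2f} mg/L")
    ```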

  16. 77 FR 73022 - U.S. Environmental Solutions Toolkit

    Science.gov (United States)

    2012-12-07

    ... relevant to reducing air pollution from oil and natural gas production and processing. The Department of... environmental officials and foreign end-users of environmental technologies that will outline U.S. approaches to.... technologies. The Toolkit will support the President's National Export Initiative by fostering export...

  17. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    Science.gov (United States)

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-08-31

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  18. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    Science.gov (United States)

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  19. Combined radar and telemetry system

    Energy Technology Data Exchange (ETDEWEB)

    Rodenbeck, Christopher T.; Young, Derek; Chou, Tina; Hsieh, Lung-Hwa; Conover, Kurt; Heintzleman, Richard

    2017-08-01

    A combined radar and telemetry system is described. The combined radar and telemetry system includes a processing unit that executes instructions, where the instructions define a radar waveform and a telemetry waveform. The processor outputs a digital baseband signal based upon the instructions, where the digital baseband signal is based upon the radar waveform and the telemetry waveform. A radar and telemetry circuit transmits, simultaneously, a radar signal and telemetry signal based upon the digital baseband signal.
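
    The abstract stays at the system level, but the idea of a single digital baseband carrying both waveforms can be sketched, for example, by time-interleaving a linear-FM radar chirp with a BPSK telemetry burst. Every parameter below is invented for illustration and is not taken from the patent.

    ```python
    import numpy as np

    fs = 10e6                                  # sample rate (Hz), placeholder
    t_chirp = np.arange(0, 100e-6, 1 / fs)
    bw = 2e6                                   # chirp bandwidth (Hz), placeholder

    # Linear-FM radar chirp at baseband.
    chirp = np.exp(1j * np.pi * (bw / t_chirp[-1]) * t_chirp**2)

    # BPSK telemetry burst: random bits, 10 samples per symbol.
    bits = np.random.randint(0, 2, 200)
    telemetry = np.repeat(2 * bits - 1, 10).astype(complex)

    # One possible combination: transmit the chirp, then the telemetry burst.
    baseband = np.concatenate([chirp, telemetry])
    print(baseband.shape, baseband.dtype)
    ```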

  20. Reducing Surface Clutter in Cloud Profiling Radar Data

    Science.gov (United States)

    Tanelli, Simone; Pak, Kyung; Durden, Stephen; Im, Eastwood

    2008-01-01

    An algorithm has been devised to reduce ground clutter in the data products of the CloudSat Cloud Profiling Radar (CPR), which is a nadir-looking radar instrument, in orbit around the Earth, that measures power backscattered by clouds as a function of distance from the instrument. Ground clutter contaminates the CPR data in the lowest 1 km of the atmospheric profile, heretofore making it impossible to use CPR data to satisfy the scientific interest in studying clouds and light rainfall at low altitude. The algorithm is based partly on the fact that the CloudSat orbit is such that the geodetic altitude of the CPR varies continuously over a range of approximately 25 km. As the geodetic altitude changes, the radar timing parameters are changed at intervals defined by flight software in order to keep the troposphere inside a data-collection time window. However, within each interval, the surface of the Earth continuously "scans through" (that is, it moves across) a few range bins of the data time window. For each radar profile, only a few samples [one for every range-bin increment (Δr = 240 m)] of the surface-clutter signature are available around the range bin in which the peak of the surface return is observed, but samples in consecutive radar profiles are offset slightly (by amounts much less than Δr) with respect to each other according to the relative change in geodetic altitude. As a consequence, in a case in which the surface area under examination is homogeneous (e.g., an ocean surface), a sequence of consecutive radar profiles of the surface in that area contains samples of the surface response with range resolution Δp much finer than the range-bin increment, yielding a clutter reduction of >10 dB and a reduction of the contaminated altitude over the ocean from about 1 km to about 0.5 km. The algorithm has been embedded in CloudSat L1B processing as of Release 04 (July 2007), and the estimated flat surface clutter is removed in L2B-GEOPROF product from the
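
    The oversampling idea can be sketched with synthetic numbers: consecutive profiles sample the same surface response at slightly different sub-bin offsets, so interleaving samples by offset yields a clutter signature on a grid much finer than the 240 m bin. This is only a schematic of the resampling step, not the CloudSat production algorithm.

    ```python
    import numpy as np

    dr = 240.0                                    # range-bin increment (m)
    def surface_response(r):                      # idealized clutter shape vs. range (m)
        return np.exp(-0.5 * (r / 150.0) ** 2)

    # Each profile samples the response at bin centers shifted by a small,
    # slowly varying offset driven by the change in geodetic altitude.
    offsets = np.linspace(0.0, dr, 50, endpoint=False)   # sub-bin offset per profile
    samples, ranges = [], []
    for off in offsets:
        r = np.arange(-3, 4) * dr + off           # a few bins around the surface peak
        ranges.extend(r)
        samples.extend(surface_response(r))

    # Sorting by range interleaves the profiles into a finely sampled signature.
    order = np.argsort(ranges)
    fine_range = np.array(ranges)[order]
    fine_clutter = np.array(samples)[order]
    print(f"effective sampling: {np.median(np.diff(fine_range)):.1f} m vs. bin {dr} m")
    ```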

  1. The Best Ever Alarm System Toolkit

    International Nuclear Information System (INIS)

    Kasemir, Kay; Chen, Xihui; Danilova, Ekaterina N.

    2009-01-01

    Learning from our experience with the standard Experimental Physics and Industrial Control System (EPICS) alarm handler (ALH), as well as a similar intermediate approach based on script-generated operator screens, we developed the Best Ever Alarm System Toolkit (BEAST). It is based on Java and Eclipse on the Control System Studio (CSS) platform, using a relational database (RDB) to store the configuration and log actions. It employs the Java Message Service (JMS) for communication between the modular pieces of the toolkit, which include an Alarm Server to maintain the current alarm state, an arbitrary number of Alarm Client user interfaces (GUIs), and tools to annunciate alarms or log alarm-related actions. Web reports allow us to monitor the alarm system performance and spot deficiencies in the alarm configuration. The Alarm Client GUI not only gives end users various ways to view alarms in tree and table form, but also makes it easy to access the guidance information, the related operator displays and other CSS tools. It also allows the configuration to be modified online directly from the GUI. Coupled with a good 'alarm philosophy' on how to provide useful alarms, we can finally improve the configuration to achieve an effective alarm system.

  2. BioRuby: bioinformatics software for the Ruby programming language.

    Science.gov (United States)

    Goto, Naohisa; Prins, Pjotr; Nakao, Mitsuteru; Bonnal, Raoul; Aerts, Jan; Katayama, Toshiaki

    2010-10-15

    The BioRuby software toolkit contains a comprehensive set of free development tools and libraries for bioinformatics and molecular biology, written in the Ruby programming language. BioRuby has components for sequence analysis, pathway analysis, protein modelling and phylogenetic analysis; it supports many widely used data formats and provides easy access to databases, external programs and public web services, including BLAST, KEGG, GenBank, MEDLINE and GO. BioRuby comes with a tutorial, documentation and an interactive environment, which can be used in the shell, and in the web browser. BioRuby is free and open source software, made available under the Ruby license. BioRuby runs on all platforms that support Ruby, including Linux, Mac OS X and Windows. And, with JRuby, BioRuby runs on the Java Virtual Machine. The source code is available from http://www.bioruby.org/. katayama@bioruby.org

  3. A Genetic Toolkit for Dissecting Dopamine Circuit Function in Drosophila

    Directory of Open Access Journals (Sweden)

    Tingting Xie

    2018-04-01

    Full Text Available Summary: The neuromodulator dopamine (DA) plays a key role in motor control, motivated behaviors, and higher-order cognitive processes. Dissecting how these DA neural networks tune the activity of local neural circuits to regulate behavior requires tools for manipulating small groups of DA neurons. To address this need, we assembled a genetic toolkit that allows for an exquisite level of control over the DA neural network in Drosophila. To further refine targeting of specific DA neurons, we also created reagents that allow for the conversion of any existing GAL4 line into Split GAL4 or GAL80 lines. We demonstrated how this toolkit can be used with recently developed computational methods to rapidly generate additional reagents for manipulating small subsets or individual DA neurons. Finally, we used the toolkit to reveal a dynamic interaction between a small subset of DA neurons and rearing conditions in a social space behavioral assay. The rapid analysis of how dopaminergic circuits regulate behavior is limited by the genetic tools available to target and manipulate small numbers of these neurons. Xie et al. present genetic tools in Drosophila that allow rational targeting of sparse dopaminergic neuronal subsets and selective knockdown of dopamine signaling. Keywords: dopamine, genetics, behavior, neural circuits, neuromodulation, Drosophila

  4. Determination of radar MTF

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, D. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    The ultimate goal of the Current Meter Array (CMA) is to be able to compare the current patterns detected with the array with radar images of the water surface. The internal wave current patterns modulate the waves on the water surface, giving a detectable modulation of the radar cross-section (RCS). The function relating the RCS modulations to the current patterns is the Modulation Transfer Function (MTF). By comparing radar images directly with co-located CMA measurements, the MTF can be determined. In this talk, radar images and CMA measurements from a recent experiment at Loch Linnhe, Scotland, will be used to make the first direct determination of the MTF for X- and S-band radars at low grazing angles. The technical problems associated with comparing radar images to CMA data will be explained and the solution method discussed. The results suggest that both current and strain rate contribute equally to the radar modulation for X band. For S band, the strain rate contributes more than the current. The magnitude of the MTF and the RCS modulations are consistent with previous estimates when the wind is blowing perpendicular to the radar look direction.
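
    One standard way to estimate such a transfer function is as the ratio of the cross-spectrum between the RCS modulation and the surface current to the current's auto-spectrum. The sketch below uses synthetic signals and SciPy's spectral estimators; it is illustrative and not the analysis presented in the talk.

    ```python
    import numpy as np
    from scipy.signal import csd, welch

    fs = 2.0                                   # samples per metre along track (made up)
    x = np.arange(4096) / fs
    current = np.sin(2 * np.pi * 0.05 * x) + 0.3 * np.random.randn(x.size)
    # Pretend the RCS modulation is a shifted, scaled version of the current plus noise.
    rcs_mod = 0.8 * np.roll(current, 3) + 0.5 * np.random.randn(x.size)

    f, Pxy = csd(current, rcs_mod, fs=fs, nperseg=512)   # cross-spectrum
    _, Pxx = welch(current, fs=fs, nperseg=512)          # auto-spectrum of the current
    mtf = Pxy / Pxx                                      # complex transfer-function estimate
    print("MTF magnitude near 0.05 cycles/m:",
          np.abs(mtf[np.argmin(np.abs(f - 0.05))]))
    ```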

  5. Local Safety Toolkit: Enabling safe communities of opportunity

    CSIR Research Space (South Africa)

    Holtmann, B

    2010-08-31

    Full Text Available remain inadequate to achieve safety. The Local Safety Toolkit supports a strategy for a Safe South Africa through the implementation of a model for a Safe Community of Opportunity. The model is the outcome of work undertaken over the course of the past...

  6. Quantitative 177Lu-SPECT/CT imaging and validation of a commercial dosimetry software package

    International Nuclear Information System (INIS)

    D'Ambrosio, L.; Aloj, L.; Morisco, A.; Aurilio, M.; Prisco, A.; Di Gennaro, F.; Lastoria, S.; Madesani, D.

    2015-01-01

    Full text of publication follows. Aim: 3D dosimetry is an appealing yet complex application of SPECT/CT in patients undergoing radionuclide therapy. In this study we have developed a quantitative imaging protocol and have validated commercially available dosimetry software (Dosimetry Toolkit Package, GE Healthcare) in patients undergoing 177Lu-DOTATATE therapy. Materials and methods: the Dosimetry Toolkit uses multiple SPECT/CT and/or whole-body (WB) planar datasets to quantify changes in radiopharmaceutical uptake over time and determine residence times. This software includes tools for performing reconstruction of SPECT/CT data, registration of all scans to a common reference, segmentation of the different organs, creation of time-activity curves, curve fitting, and calculation of residence times. All acquisitions were performed using a hybrid dual-head SPECT/CT camera (Discovery 670, GE Healthcare) equipped with a medium-energy collimator, using a triple-energy window. SPECT images were reconstructed using an iterative reconstruction algorithm with attenuation, scatter, and collimator depth-dependent three-dimensional resolution recovery corrections. Camera sensitivity and dead time were evaluated. The accuracy of activity quantification was assessed on a large homogeneous source with the addition of attenuating/scattering medium. A NEMA/IEC body phantom was used to measure the recovery coefficient, which the software does not take into account. The residence times for organs at risk were calculated in five patients. OLINDA/EXM software was used to calculate absorbed doses. Results: the 177Lu sensitivity factor was 13 counts/MBq/s. Dead time was <3% with 1.11 GBq in the field of view. The measured activity was consistent with the decay-corrected calibrated activity for large volumes (>100 cc). The recovery coefficient varied from 0.71 (26.5 ml) to 0.16 (2.5 ml) in the absence of background activity and from 0.58 to 0.13 with a source-to-background activity concentration ratio of 20:1. The
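
    The residence-time step can be illustrated independently of the commercial package: fit a mono-exponential to an organ's time-activity samples from the serial scans, integrate it analytically, and divide by the administered activity. The numbers below are invented and are not patient data.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Invented time-activity samples for one organ: time (h) and activity (MBq).
    t = np.array([4.0, 24.0, 72.0, 168.0])
    A = np.array([950.0, 700.0, 320.0, 60.0])
    A_admin = 7400.0                           # administered activity (MBq)

    def mono_exp(t, A0, lam):
        return A0 * np.exp(-lam * t)

    (A0, lam), _ = curve_fit(mono_exp, t, A, p0=(1000.0, 0.02))
    residence_time_h = (A0 / lam) / A_admin    # integral of A(t)/A_admin from 0 to infinity
    print(f"Effective half-life: {np.log(2)/lam:.1f} h, "
          f"residence time: {residence_time_h:.2f} h")
    ```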

  7. Pipe Penetrating Radar: a New Tool for the Assessment of Critical Infrastructure

    Science.gov (United States)

    Ekes, C.; Neducz, B.

    2012-04-01

    This paper describes the development of Pipe Penetrating Radar (PPR), the underground in-pipe application of GPR, a non-destructive testing method that can detect defects and cavities within and outside mainline-diameter (>18 in / 450 mm) non-metallic (concrete, PVC, HDPE, etc.) underground pipes. The method uses two or more high-frequency GPR antennae carried by a robot into underground pipes. The radar data is transmitted to the surface via fibre optic cable and is recorded together with the output from CCTV (and optionally sonar and laser). Proprietary software analyzes the data and pinpoints defects or cavities within and outside the pipe. Thus the testing can identify existing pipe and pipe-bedding symptoms that can be addressed to prevent catastrophic failure due to sinkhole development, and can provide useful information about the remaining service life of the pipe. The key innovative aspect is the unique ability to map pipe wall thickness and deterioration, including cracks and voids outside the pipe, enabling accurate predictability of needed intervention or the timing of replacement. This reliable non-destructive testing method significantly impacts condition-based asset management of subsurface infrastructure by supplying previously unattainable measurable conditions. Keywords: pipe penetrating radar (PPR), ground penetrating radar (GPR), pipe inspection, concrete deterioration, municipal engineering

  8. Performance Engineering Technology for Scientific Component Software

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D.

    2007-05-08

    Large-scale, complex scientific applications are beginning to benefit from the use of component software design methodology and technology for software development. Integral to the success of component-based applications is the ability to achieve high-performing code solutions through the use of performance engineering tools for both intra-component and inter-component analysis and optimization. Our work on this project aimed to develop performance engineering technology for scientific component software in association with the DOE CCTTSS SciDAC project (active during the contract period) and the broader Common Component Architecture (CCA) community. Our specific implementation objectives were to extend the TAU performance system and Program Database Toolkit (PDT) to support performance instrumentation, measurement, and analysis of CCA components and frameworks, and to develop performance measurement and monitoring infrastructure that could be integrated in CCA applications. These objectives have been met in the completion of all project milestones and in the transfer of the technology into the continuing CCA activities as part of the DOE TASCS SciDAC2 effort. In addition to these achievements, over the past three years, we have been an active member of the CCA Forum, attending all meetings and serving in several working groups, such as the CCA Toolkit working group, the CQoS working group, and the Tutorial working group. We have contributed significantly to CCA tutorials since SC'04, hosted two CCA meetings, participated in the annual ACTS workshops, and were co-authors on the recent CCA journal paper [24]. There are four main areas where our project has delivered results: component performance instrumentation and measurement, component performance modeling and optimization, performance database and data mining, and online performance monitoring. This final report outlines the achievements in these areas for the entire project period. The submitted progress

  9. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    Science.gov (United States)

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/
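
    In practice, "connectome data mining" often comes down to standard graph analysis on a connectivity matrix. The sketch below uses NetworkX on a random symmetric matrix as a stand-in for data loaded through the toolkit; the loading API itself is omitted because it is not described in the abstract.

    ```python
    import numpy as np
    import networkx as nx

    # Stand-in for a structural connectivity matrix (e.g., fiber counts between regions).
    rng = np.random.default_rng(0)
    conn = rng.random((90, 90))
    conn = (conn + conn.T) / 2                 # make it symmetric
    np.fill_diagonal(conn, 0)
    conn[conn < 0.9] = 0                       # keep only the strongest connections

    G = nx.from_numpy_array(conn)
    print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
    print("mean clustering:", nx.average_clustering(G, weight="weight"))
    degree_hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:5]
    print("highest-degree regions:", degree_hubs)
    ```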

  10. The Connectome Viewer Toolkit: an open source framework to manage, analyze and visualize connectomes

    Directory of Open Access Journals (Sweden)

    Stephan eGerhard

    2011-06-01

    Full Text Available Abstract Advanced neuroinformatics tools are required for methods of connectome mapping, analysis and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration and sharing. We have designed and implemented the Connectome Viewer Toolkit --- a set of free and extensible open-source neuroimaging tools written in Python. The key components of the toolkit are as follows: 1. The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. 2. The Connectome File Format Library enables management and sharing of connectome files. 3. The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/.

  11. National eHealth strategy toolkit

    CERN Document Server

    2012-01-01

    Worldwide, the application of information and communication technologies to support national health-care services is rapidly expanding and increasingly important. This is especially so at a time when all health systems face stringent economic challenges and greater demands to provide more and better care, especially to those most in need. The National eHealth Strategy Toolkit is an expert, practical guide that provides governments, their ministries and stakeholders with a solid foundation and method for the development and implementation of a national eHealth vision, action plan and monitoring fram

  12. RADAR PPI Scope Overlay

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — RADAR PPI Scope Overlays are used to position a RADAR image over a station at the correct resolution. The archive maintains several different RADAR resolution types,...

  13. Pragmatic Software Testing Becoming an Effective and Efficient Test Professional

    CERN Document Server

    Black, Rex

    2011-01-01

    A hands-on guide to testing techniques that deliver reliable software and systems. Testing even a simple system can quickly turn into a potentially infinite task. Faced with tight costs and schedules, testers need to have a toolkit of practical techniques combined with hands-on experience and the right strategies in order to complete a successful project. World-renowned testing expert Rex Black provides you with the proven methods and concepts that test professionals must know. He presents you with the fundamental techniques for testing and clearly shows you how to select and apply successful st

  14. Effects of a Short Video-Based Resident-as-Teacher Training Toolkit on Resident Teaching.

    Science.gov (United States)

    Ricciotti, Hope A; Freret, Taylor S; Aluko, Ashley; McKeon, Bri Anne; Haviland, Miriam J; Newman, Lori R

    2017-10-01

    To pilot a short video-based resident-as-teacher training toolkit and assess its effect on resident teaching skills in clinical settings. A video-based resident-as-teacher training toolkit was previously developed by educational experts at Beth Israel Deaconess Medical Center, Harvard Medical School. Residents were recruited from two academic hospitals, watched two videos from the toolkit ("Clinical Teaching Skills" and "Effective Clinical Supervision"), and completed an accompanying self-study guide. A novel assessment instrument for evaluating the effect of the toolkit on teaching was created through a modified Delphi process. Before and after the intervention, residents were observed leading a clinical teaching encounter and scored using the 15-item assessment instrument. The primary outcome of interest was the change in number of skills exhibited, which was assessed using the Wilcoxon signed-rank test. Twenty-eight residents from two academic hospitals were enrolled, and 20 (71%) completed all phases of the study. More than one third of residents who volunteered to participate reported no prior formal teacher training. After completing two training modules, residents demonstrated a significant increase in the median number of teaching skills exhibited in a clinical teaching encounter, from 7.5 (interquartile range 6.5-9.5) to 10.0 (interquartile range 9.0-11.5; P<.001). Of the 15 teaching skills assessed, there were significant improvements in asking for the learner's perspective (P=.01), providing feedback (P=.005), and encouraging questions (P=.046). Using a resident-as-teacher video-based toolkit was associated with improvements in teaching skills in residents from multiple specialties.

  15. Educator Toolkits on Second Victim Syndrome, Mindfulness and Meditation, and Positive Psychology: The 2017 Resident Wellness Consensus Summit

    Directory of Open Access Journals (Sweden)

    Jon Smart

    2018-02-01

    Full Text Available Introduction: Burnout, depression, and suicidality among residents of all specialties have become a critical focus of attention for the medical education community. Methods: As part of the 2017 Resident Wellness Consensus Summit in Las Vegas, Nevada, resident participants from 31 programs collaborated in the Educator Toolkit workgroup. Over a seven-month period leading up to the summit, this workgroup convened virtually in the Wellness Think Tank, an online resident community, to perform a literature review and draft curricular plans on three core wellness topics. These topics were second victim syndrome, mindfulness and meditation, and positive psychology. At the live summit event, the workgroup expanded to include residents outside the Wellness Think Tank to obtain a broader consensus of the evidence-based toolkits for these three topics. Results: Three educator toolkits were developed. The second victim syndrome toolkit has four modules, each with a pre-reading material and a leader (educator) guide. In the mindfulness and meditation toolkit, there are three modules with a leader guide in addition to a longitudinal, guided meditation plan. The positive psychology toolkit has two modules, each with a leader guide and a PowerPoint slide set. These toolkits provide educators the necessary resources, reading materials, and lesson plans to implement didactic sessions in their residency curriculum. Conclusion: Residents from across the world collaborated and convened to reach a consensus on high-yield (and potentially high-impact) lesson plans that programs can use to promote and improve resident wellness. These lesson plans may stand alone or be incorporated into a larger wellness curriculum.

  16. Educator Toolkits on Second Victim Syndrome, Mindfulness and Meditation, and Positive Psychology: The 2017 Resident Wellness Consensus Summit.

    Science.gov (United States)

    Chung, Arlene S; Smart, Jon; Zdradzinski, Michael; Roth, Sarah; Gende, Alecia; Conroy, Kylie; Battaglioli, Nicole

    2018-03-01

    Burnout, depression, and suicidality among residents of all specialties have become a critical focus of attention for the medical education community. As part of the 2017 Resident Wellness Consensus Summit in Las Vegas, Nevada, resident participants from 31 programs collaborated in the Educator Toolkit workgroup. Over a seven-month period leading up to the summit, this workgroup convened virtually in the Wellness Think Tank, an online resident community, to perform a literature review and draft curricular plans on three core wellness topics. These topics were second victim syndrome, mindfulness and meditation, and positive psychology. At the live summit event, the workgroup expanded to include residents outside the Wellness Think Tank to obtain a broader consensus of the evidence-based toolkits for these three topics. Three educator toolkits were developed. The second victim syndrome toolkit has four modules, each with a pre-reading material and a leader (educator) guide. In the mindfulness and meditation toolkit, there are three modules with a leader guide in addition to a longitudinal, guided meditation plan. The positive psychology toolkit has two modules, each with a leader guide and a PowerPoint slide set. These toolkits provide educators the necessary resources, reading materials, and lesson plans to implement didactic sessions in their residency curriculum. Residents from across the world collaborated and convened to reach a consensus on high-yield (and potentially high-impact) lesson plans that programs can use to promote and improve resident wellness. These lesson plans may stand alone or be incorporated into a larger wellness curriculum.

  17. Systems and Methods for Radar Data Communication

    Science.gov (United States)

    Bunch, Brian (Inventor); Szeto, Roland (Inventor); Miller, Brad (Inventor)

    2013-01-01

    A radar information processing system is operable to process high-bandwidth radar information received from a radar system into low-bandwidth radar information that may be communicated to a low-bandwidth connection coupled to an electronic flight bag (EFB). An exemplary embodiment receives radar information from a radar system, the radar information communicated from the radar system at a first bandwidth; processes the received radar information into processed radar information, the processed radar information configured for communication over a connection operable at a second bandwidth, the second bandwidth lower than the first bandwidth; and communicates the processed radar information over the connection operable at the second bandwidth.
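
    The patent abstract stays at the block-diagram level; one simple way to picture the bandwidth reduction is block-averaging high-rate samples before sending them over the slow link to the EFB. This is purely illustrative and not the claimed processing method.

    ```python
    import numpy as np

    def reduce_bandwidth(samples: np.ndarray, factor: int) -> np.ndarray:
        """Average non-overlapping blocks of `factor` samples to cut the data rate."""
        n = (samples.size // factor) * factor
        return samples[:n].reshape(-1, factor).mean(axis=1)

    high_rate = np.random.rayleigh(size=100_000)      # stand-in for radar amplitude data
    low_rate = reduce_bandwidth(high_rate, factor=50)
    print(f"{high_rate.size} samples -> {low_rate.size} samples")
    ```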

  18. An integrated radar model solution for mission level performance and cost trades

    Science.gov (United States)

    Hodge, John; Duncan, Kerron; Zimmerman, Madeline; Drupp, Rob; Manno, Mike; Barrett, Donald; Smith, Amelia

    2017-05-01

    A fully integrated Mission-Level Radar model is in development as part of a multi-year effort under the Northrop Grumman Mission Systems (NGMS) sector's Model Based Engineering (MBE) initiative to digitally interconnect and unify previously separate performance and cost models. In 2016, an NGMS internal research and development (IR&D) funded multidisciplinary team integrated radio frequency (RF), power, control, size, weight, thermal, and cost models together using commercial off-the-shelf software, ModelCenter, for an Active Electronically Scanned Array (AESA) radar system. Each represented model was digitally connected with standard interfaces and unified to allow end-to-end mission system optimization and trade studies. The radar model was then linked to the Air Force's own mission modeling framework (AFSIM). The team first had to identify the necessary models and, with the aid of subject matter experts (SMEs), understand and document the inputs, outputs, and behaviors of the component models. This agile development process and collaboration enabled rapid integration of disparate models and the validation of their combined system performance. This MBE framework will allow NGMS to design systems more efficiently and affordably, optimize architectures, and provide increased value to the customer. The model integrates detailed component models that validate cost and performance at the physics level with high-level models that provide visualization of a platform mission. This connectivity of component to mission models allows hardware and software design solutions to be better optimized to meet mission needs, creating cost-optimal solutions for the customer, while reducing design cycle time through risk mitigation and early validation of design decisions.

  19. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    Directory of Open Access Journals (Sweden)

    Juan Mateu

    2015-08-01

    Full Text Available In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  20. ProtoMD: A prototyping toolkit for multiscale molecular dynamics

    Science.gov (United States)

    Somogyi, Endre; Mansour, Andrew Abi; Ortoleva, Peter J.

    2016-05-01

    ProtoMD is a toolkit that facilitates the development of algorithms for multiscale molecular dynamics (MD) simulations. It is designed for multiscale methods which capture the dynamic transfer of information across multiple spatial scales, such as the atomic to the mesoscopic scale, via coevolving microscopic and coarse-grained (CG) variables. ProtoMD can also be used to calibrate parameters needed in traditional CG-MD methods. The toolkit integrates 'GROMACS wrapper' to initiate MD simulations, and 'MDAnalysis' to analyze and manipulate trajectory files. It facilitates experimentation with a spectrum of coarse-grained variables, prototyping rare events (such as chemical reactions), or simulating nanocharacterization experiments such as terahertz spectroscopy, AFM, nanopore, and time-of-flight mass spectroscopy. ProtoMD is written in Python and is freely available under the GNU General Public License from github.com/CTCNano/proto_md.
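
    In the simplest case, the coarse-grained variables co-evolved with the atomistic ones are mass-weighted averages over groups of atoms. The snippet below computes centre-of-mass CG coordinates for arbitrary atom groupings in plain NumPy; it does not use the ProtoMD or MDAnalysis APIs.

    ```python
    import numpy as np

    def coarse_grain(positions, masses, groups):
        """Map atomistic positions (N, 3) to CG sites: one centre of mass per group."""
        cg = np.empty((len(groups), 3))
        for i, idx in enumerate(groups):
            m = masses[idx][:, None]
            cg[i] = (m * positions[idx]).sum(axis=0) / m.sum()
        return cg

    # Toy system: 12 atoms split into 3 CG beads of 4 atoms each.
    pos = np.random.rand(12, 3)
    mass = np.ones(12)
    groups = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
    print(coarse_grain(pos, mass, groups))
    ```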

  1. New inverse synthetic aperture radar algorithm for translational motion compensation

    Science.gov (United States)

    Bocker, Richard P.; Henderson, Thomas B.; Jones, Scott A.; Frieden, B. R.

    1991-10-01

    Inverse synthetic aperture radar (ISAR) is an imaging technique that shows real promise in classifying airborne targets in real time under all weather conditions. Over the past few years a large body of ISAR data has been collected and considerable effort has been expended to develop algorithms to form high-resolution images from this data. One important goal of workers in this field is to develop software that will do the best job of imaging under the widest range of conditions. The success of classifying targets using ISAR is predicated upon forming highly focused radar images of these targets. Efforts to develop highly focused imaging computer software have been challenging, mainly because the imaging depends on and is affected by the motion of the target, which in general is not precisely known. Specifically, the target generally has both rotational motion about some axis and translational motion as a whole with respect to the radar. The slant-range translational motion kinematic quantities must be first accurately estimated from the data and compensated before the image can be focused. Following slant-range motion compensation, the image is further focused by determining and correcting for target rotation. The use of the burst derivative measure is proposed as a means to improve the computational efficiency of currently used ISAR algorithms. The use of this measure in motion compensation ISAR algorithms for estimating the slant-range translational motion kinematic quantities of an uncooperative target is described. Preliminary tests have been performed on simulated as well as actual ISAR data using both a Sun 4 workstation and a parallel processing transputer array. Results indicate that the burst derivative measure gives significant improvement in processing speed over the traditional entropy measure now employed.
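
    A focus measure is central to this kind of motion compensation: candidate slant-range velocity corrections are applied, the image is re-formed, and the correction giving the sharpest image is kept. The sketch below uses the classical entropy measure and a brute-force search with stand-in data; the burst derivative measure proposed in the paper is not reproduced here.

    ```python
    import numpy as np

    def image_entropy(img):
        """Shannon entropy of the normalized image intensity; lower = better focused."""
        p = np.abs(img) ** 2
        p = p / p.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def focus_image(phase_history, velocities, wavelength, prf):
        """Try a set of slant-range velocities and keep the best-focused ISAR image."""
        n_pulses = phase_history.shape[0]
        t = np.arange(n_pulses) / prf
        best = None
        for v in velocities:
            # Remove the phase ramp caused by translational motion at velocity v.
            correction = np.exp(1j * 4 * np.pi * v * t / wavelength)[:, None]
            img = np.fft.fftshift(np.fft.fft2(phase_history * correction))
            e = image_entropy(img)
            if best is None or e < best[0]:
                best = (e, v, img)
        return best

    data = np.random.randn(64, 128) + 1j * np.random.randn(64, 128)  # stand-in data
    entropy, v_hat, img = focus_image(data, np.linspace(-50, 50, 101), 0.03, 1000.0)
    print(f"best velocity estimate: {v_hat:.1f} m/s (entropy {entropy:.2f})")
    ```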

  2. SlideToolkit: an assistive toolset for the histological quantification of whole slide images.

    Directory of Open Access Journals (Sweden)

    Bastiaan G L Nelissen

    Full Text Available The demand for accurate and reproducible phenotyping of a disease trait increases with the rising number of biobanks and genome-wide association studies. Detailed analysis of histology is a powerful way of phenotyping human tissues. Nonetheless, purely visual assessment of histological slides is time-consuming and liable to sampling variation, optical illusions and thereby observer variation, and external validation may be cumbersome. Therefore, within our own biobank, computerized quantification of digitized histological slides is often preferred as a more precise and reproducible, and sometimes more sensitive, approach. Relatively few free toolkits are, however, available for fully digitized microscopic slides, usually known as whole slide images. In order to meet this need, we developed the slideToolkit as a fast method to handle large quantities of low-contrast whole slide images using advanced cell-detection algorithms. The slideToolkit has been developed for modern personal computers and high-performance clusters (HPCs) and is available as an open-source project on github.com. We here illustrate the power of slideToolkit by a repeated measurement of 303 digital slides containing CD3-stained (DAB) abdominal aortic aneurysm tissue from a tissue biobank. Our workflow consists of four consecutive steps. In the first step (acquisition), whole slide images are collected and converted to TIFF files. In the second step (preparation), files are organized. The third step (tiles) creates multiple manageable tiles to count. In the fourth step (analysis), tissue is analyzed and results are stored in a data set. Using this method, two consecutive measurements of 303 slides showed an intraclass correlation of 0.99. In conclusion, slideToolkit provides a free, powerful and versatile collection of tools for automated feature analysis of whole slide images to create reproducible and meaningful phenotypic data sets.
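
    The 'tiles' step can be approximated with the openslide-python library, which is widely used for reading whole slide images (the slideToolkit wraps its own scripts, so this is not its actual code); the file name, tile size and pyramid level below are placeholders.

    ```python
    import openslide

    slide = openslide.OpenSlide("example_slide.tif")   # placeholder path
    tile_size = 2048
    level = 0
    width, height = slide.level_dimensions[level]

    tiles = []
    for y in range(0, height, tile_size):
        for x in range(0, width, tile_size):
            # read_region returns an RGBA PIL image for the requested tile.
            tile = slide.read_region((x, y), level, (tile_size, tile_size))
            tiles.append(((x, y), tile))
    print(f"{len(tiles)} tiles of {tile_size}x{tile_size} px")
    ```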

  3. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  4. Synthetic impulse and aperture radar (SIAR) a novel multi-frequency MIMO radar

    CERN Document Server

    Chen, Baixiao

    2014-01-01

    Analyzes and discusses the operating principle, signal processing method, and experimental results of this advanced radar technology. This book systematically discusses the operating principle, signal processing method, target measurement technology, and experimental results of a new kind of radar called synthetic impulse and aperture radar (SIAR). The purpose is to help readers acquire an insight into the concept and principle of the SIAR, to know its operation mode, signal processing method, the differences between it and traditional radar, the designing ideals, and the developing me

  5. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPAS-nBio

    International Nuclear Information System (INIS)

    McNamara, A; Held, K; Paganetti, H; Schuemann, J; Perl, J; Piersimoni, P; Ramos-Mendez, J; Faddegon, B

    2016-01-01

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprising nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex

  6. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPAS-nBio

    Energy Technology Data Exchange (ETDEWEB)

    McNamara, A; Held, K; Paganetti, H; Schuemann, J [Massachusetts General Hospital & Harvard Med. School, Boston, MA (United States); Perl, J [Stanford Linear Accelerator Center, Menlo Park, CA (United States); Piersimoni, P; Ramos-Mendez, J; Faddegon, B [University of California, San Francisco, San Francisco, CA (United States)

    2016-06-15

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprising nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex

  7. A Toolkit For CryoSat Investigations By The ESRIN EOP-SER Altimetry Team

    Science.gov (United States)

    Dinardo, Salvatore; Bruno, Lucas; Benveniste, Jerome

    2013-12-01

    The scope of this work is to present the new tool for the exploitation of CryoSat data, designed and developed entirely by the Altimetry Team at ESRIN EOP-SER (Earth Observation - Exploitation, Research and Development). The tool framework is composed of two separate components: the first one handles data collection and management, the second one is the processing toolkit. The CryoSat FBR (Full Bit Rate) data is downlinked uncompressed from the satellite, containing un-averaged individual echoes. This data is made available on the Kiruna CalVal server in a 10-day rolling archive. Daily at ESRIN, all the CryoSat FBR data in SAR and SARin mode (around 30 gigabytes) are downloaded, catalogued and archived on local ESRIN EOP-SER workstations. As of March 2013, the total amount of FBR data is over 9 terabytes, with CryoSat acquisition dates spanning January 2011 to February 2013 (with some gaps). This archive was built by merging partial datasets available at ESTEC and NOAA, which have been kindly made available to the EOP-SER team. On-demand access to this low-level data is restricted to expert users with validated ESA P.I. credentials. Currently the main users of the archiving functionality are the team members of the Project CP4O (STSE CryoSat Plus for Ocean), CNES and NOAA. The second component of the service is the processing toolkit. On the EOP-SER workstations there is internally and independently developed software that is able to process the FBR data in SAR/SARin mode to generate multi-looked echoes (Level 1B) and subsequently to re-track them in SAR and SARin mode (Level 2) over the open ocean, exploiting the SAMOSA model and other internally developed models. The processing segment is used for research and development purposes, supporting awarded development contracts by cross-checking their deliverables to ESA, for on-site demonstrations and training of selected users, for cross-comparison against third-party products (the CLS/CNES CPP products, for instance), preparation

  8. DeepInfer: open-source deep learning deployment toolkit for image-guided therapy

    Science.gov (United States)

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A.; Kapur, Tina; Wells, William M.; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-03-01

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into clinical research workflows, causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  9. Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data.

    Science.gov (United States)

    Althammer, Sonja; González-Vallinas, Juan; Ballaré, Cecilia; Beato, Miguel; Eyras, Eduardo

    2011-12-15

    High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein-DNA and protein-RNA binding, as well as for measuring RNA expression. With increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, either punctuated or broad signals, CLIP-Seq and RNA-Seq. We prove the effectiveness of Pyicos to select for significant signals and show that its accuracy is comparable and sometimes superior to that of methods specifically designed for each particular type of experiment. Pyicos facilitates the analysis of a variety of HTS datatypes through its flexibility and memory efficiency, providing a useful framework for data integration into models of regulatory genomics. Open-source software, with tutorials and protocol files, is available at http://regulatorygenomics.upf.edu/pyicos or as a Galaxy server at http://regulatorygenomics.upf.edu/galaxy eduardo.eyras@upf.edu Supplementary data are available at Bioinformatics online.

  10. Graph algorithms in the titan toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    McLendon, William Clarence, III; Wylie, Brian Neil

    2009-10-01

    Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.

  11. Array-Based Ultrawideband through-Wall Radar: Prediction and Assessment of Real Radar Abilities

    Directory of Open Access Journals (Sweden)

    Nadia Maaref

    2013-01-01

    This paper deals with a new through-the-wall (TTW) radar demonstrator for the detection and the localisation of people in a room (in a noncooperative way, with the radar situated outside but in the vicinity of the first wall). After modelling the propagation through various walls and quantifying the backscattering by the human body, an analysis of the technical considerations which aims at defining the radar design is presented. Finally, an ultrawideband (UWB) frequency modulated continuous wave (FMCW) radar is proposed, designed, and implemented. Some representative trials show that this radar is able to localise and track moving people behind a wall in real time.
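
    The range-measurement principle behind such an FMCW radar can be illustrated with a short simulation: de-chirping the received echo yields a beat frequency f_b proportional to range, R = c*f_b/(2*S), where S is the chirp slope. The sketch below uses arbitrary illustration parameters, not those of the demonstrator described in the paper.

        # FMCW range estimation sketch: the beat frequency of the de-chirped echo gives
        # R = c * f_b / (2 * S), with S the chirp slope in Hz/s.  All parameter values are
        # arbitrary illustration choices, not those of the through-the-wall demonstrator.
        import numpy as np

        c = 3e8                      # speed of light (m/s)
        B, T = 2e9, 1e-3             # sweep bandwidth (Hz) and chirp duration (s)
        S = B / T                    # chirp slope (Hz/s)
        fs = 10e6                    # sampling rate of the de-chirped (beat) signal (Hz)
        R_true = 7.5                 # simulated target range (m)

        tau = 2 * R_true / c                     # round-trip delay (s)
        t = np.arange(0, T, 1 / fs)
        beat = np.cos(2 * np.pi * S * tau * t)   # ideal de-chirped beat signal

        spec = np.abs(np.fft.rfft(beat))
        freqs = np.fft.rfftfreq(len(beat), 1 / fs)
        f_b = freqs[np.argmax(spec[1:]) + 1]     # dominant beat frequency (skip the DC bin)
        print("estimated range: %.2f m" % (c * f_b / (2 * S)))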

  12. EasyInterface: A toolkit for rapid development of GUIs for research prototype tools

    OpenAIRE

    Doménech, Jesús; Genaim, Samir; Johnsen, Einar Broch; Schlatte, Rudolf

    2017-01-01

    In this paper we describe EasyInterface, an open-source toolkit for rapid development of web-based graphical user interfaces (GUIs). This toolkit addresses the need of researchers to make their research prototype tools available to the community, and integrating them in a common environment, rapidly and without being familiar with web programming or GUI libraries in general. If a tool can be executed from a command-line and its output goes to the standard output, then in few minutes one can m...

  13. Novel radar techniques and applications

    CERN Document Server

    Klemm, Richard; Koch, Wolfgang

    2017-01-01

    Novel Radar Techniques and Applications presents the state-of-the-art in advanced radar, with emphasis on ongoing novel research and development and contributions from an international team of leading radar experts. This volume covers waveform diversity and cognitive radar, and target tracking and data fusion.

  14. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    Science.gov (United States)

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to a rapid increase in the volume, rate of data generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volumes of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and the Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability-the ability to efficiently process increasing volumes of data; (b) Adaptability-the toolkit can be deployed across different computing configurations; and (c) Ease of programming-the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 data nodes. The evaluation results demonstrate that the toolkit

  15. A flexible open-source toolkit for lava flow simulations

    Science.gov (United States)

    Mossoux, Sophie; Feltz, Adelin; Poppe, Sam; Canters, Frank; Kervyn, Matthieu

    2014-05-01

    Lava flow hazard modeling is a useful tool for scientists and stakeholders confronted with imminent or long term hazard from basaltic volcanoes. It can improve their understanding of the spatial distribution of volcanic hazard, influence their land use decisions and improve city evacuation during a volcanic crisis. Although a range of empirical, stochastic and physically-based lava flow models exists, these models are rarely available or require a large number of physical constraints. We present a GIS toolkit which models lava flow propagation from one or multiple eruptive vents, defined interactively on a Digital Elevation Model (DEM). It combines existing probabilistic (VORIS) and deterministic (FLOWGO) models in order to improve the simulation of lava flow spatial spread and terminal length. Not only is this toolkit open-source, running in Python, which allows users to adapt the code to their needs, but it also allows users to combine the included models in different ways. The lava flow paths are determined based on the probabilistic steepest slope (VORIS model - Felpeto et al., 2001), which can be constrained in order to favour concentrated or dispersed flow fields. Moreover, the toolkit allows the inclusion of a corrective factor so that the lava can overcome small topographical obstacles or pits. The lava flow terminal length can be constrained using a fixed length value, a Gaussian probability density function, or can be calculated based on the thermo-rheological properties of the open-channel lava flow (FLOWGO model - Harris and Rowland, 2001). These slope-constrained properties allow estimation of the flow velocity and its heat losses. The lava flow stops when its velocity is zero or the lava temperature reaches the solidus. Recent lava flows of Karthala volcano (Comoros islands) are used here to demonstrate the quality of lava flow simulations with the toolkit, using a quantitative assessment of the match of the simulation with the real lava flows. The
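
    The steepest-slope routing idea at the core of such path simulations can be sketched in a few lines of Python: from the vent cell, repeatedly step to the lowest of the eight neighbouring DEM cells until a pit or a step limit is reached. This is an illustration of the routing principle only, not the VORIS/FLOWGO code, and the DEM, vent position and step limit are hypothetical.

        # Minimal D8 steepest-descent path sketch on a DEM grid (illustration of the routing
        # idea only, not the VORIS/FLOWGO implementation; DEM and vent are synthetic).
        import numpy as np

        def steepest_descent_path(dem, vent, max_steps=500):
            """Follow the steepest downhill neighbour from the vent cell until a pit."""
            path = [vent]
            r, c = vent
            for _ in range(max_steps):
                neighbours = [(r + dr, c + dc)
                              for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                              if (dr, dc) != (0, 0)
                              and 0 <= r + dr < dem.shape[0]
                              and 0 <= c + dc < dem.shape[1]]
                nr, nc = min(neighbours, key=lambda ij: dem[ij])
                if dem[nr, nc] >= dem[r, c]:      # local pit or flat: flow stops
                    break
                r, c = nr, nc
                path.append((r, c))
            return path

        # Example on a synthetic cone: elevation decreases away from the summit at (20, 20).
        y, x = np.mgrid[0:100, 0:100]
        dem = -np.hypot(x - 20, y - 20)
        path = steepest_descent_path(dem, vent=(20, 20))
        print("path length:", len(path), "cells; terminus:", path[-1])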

  16. The SpeX Prism Library Analysis Toolkit: Design Considerations and First Results

    Science.gov (United States)

    Burgasser, Adam J.; Aganze, Christian; Escala, Ivana; Lopez, Mike; Choban, Caleb; Jin, Yuhui; Iyer, Aishwarya; Tallis, Melisa; Suarez, Adrian; Sahi, Maitrayee

    2016-01-01

    Various observational and theoretical spectral libraries now exist for galaxies, stars, planets and other objects, which have proven useful for classification, interpretation, simulation and model development. Effective use of these libraries relies on analysis tools, which are often left to users to develop. In this poster, we describe a program to develop a combined spectral data repository and Python-based analysis toolkit for low-resolution spectra of very low mass dwarfs (late M, L and T dwarfs), which enables visualization, spectral index analysis, classification, atmosphere model comparison, and binary modeling for nearly 2000 library spectra and user-submitted data. The SpeX Prism Library Analysis Toolkit (SPLAT) is being constructed as a collaborative, student-centered, learning-through-research model with high school, undergraduate and graduate students and regional science teachers, who populate the database and build the analysis tools through quarterly challenge exercises and summer research projects. In this poster, I describe the design considerations of the toolkit, its current status and development plan, and report the first published results led by undergraduate students. The combined data and analysis tools are ideal for characterizing cool stellar and exoplanetary atmospheres (including direct exoplanetary spectra observations by Gemini/GPI, VLT/SPHERE, and JWST), and the toolkit design can be readily adapted for other spectral datasets as well. This material is based upon work supported by the National Aeronautics and Space Administration under Grant No. NNX15AI75G. SPLAT code can be found at https://github.com/aburgasser/splat.
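
    The kind of spectral-index analysis such a toolkit supports can be illustrated generically: an index is commonly the ratio of the median flux in two narrow wavelength windows. The Python sketch below is not SPLAT itself (the toolkit lives at the GitHub URL above); the window limits and the synthetic spectrum are arbitrary illustration values, not a published index definition.

        # Generic spectral-index illustration: median-flux ratio of two wavelength windows.
        # Window limits and the spectrum are arbitrary; this is not SPLAT code or a published index.
        import numpy as np

        def spectral_index(wave, flux, num_window, den_window):
            """Return the median-flux ratio between two wavelength windows (same units as wave)."""
            num = np.median(flux[(wave >= num_window[0]) & (wave <= num_window[1])])
            den = np.median(flux[(wave >= den_window[0]) & (wave <= den_window[1])])
            return num / den

        # Example with a synthetic near-infrared spectrum:
        wave = np.linspace(0.9, 2.4, 1500)                  # wavelength grid (microns)
        flux = np.exp(-(wave - 1.25) ** 2 / 0.05) + 0.1     # toy spectral shape
        print(spectral_index(wave, flux, (1.14, 1.17), (1.26, 1.29)))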

  17. Toolkit for healthcare facility design evaluation - some case studies

    CSIR Research Space (South Africa)

    De Jager, Peta

    2007-10-01

    themes in approach. Further study is indicated, but preliminary research shows that, whilst these toolkits can be applied to the South African context, there are compelling reasons for them to be adapted. This paper briefly outlines these three case...

  19. New Careers in Nursing Scholar Alumni Toolkit: Development of an Innovative Resource for Transition to Practice.

    Science.gov (United States)

    Mauro, Ann Marie P; Escallier, Lori A; Rosario-Sim, Maria G

    2016-01-01

    The transition from student to professional nurse is challenging and may be more difficult for underrepresented minority nurses. The Robert Wood Johnson Foundation New Careers in Nursing (NCIN) program supported development of a toolkit that would serve as a transition-to-practice resource to promote retention of NCIN alumni and other new nurses. Thirteen recent NCIN alumni (54% male, 23% Hispanic/Latino, 23% African Americans) from 3 schools gave preliminary content feedback. An e-mail survey was sent to a convenience sample of 29 recent NCIN alumni who evaluated the draft toolkit using a Likert scale (poor = 1; excellent = 5). Twenty NCIN alumni draft toolkit reviewers (response rate 69%) were primarily female (80%) and Hispanic/Latino (40%). Individual chapters' mean overall rating of 4.67 demonstrated strong validation. Mean scores for overall toolkit content (4.57), usability (4.5), relevance (4.79), and quality (4.71) were also excellent. Qualitative comments were analyzed using thematic content analysis and supported the toolkit's relevance and utility. A multilevel peer review process was also conducted. Peer reviewer feedback resulted in a 6-chapter document that offers resources for successful transition to practice and lays the groundwork for continued professional growth. Future research is needed to determine the ideal time to introduce this resource. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. RGtk2: A Graphical User Interface Toolkit for R

    Directory of Open Access Journals (Sweden)

    Duncan Temple Lang

    2011-01-01

    Graphical user interfaces (GUIs) are growing in popularity as a complement or alternative to the traditional command line interfaces to R. RGtk2 is an R package for creating GUIs in R. The package provides programmatic access to GTK+ 2.0, an open-source GUI toolkit written in C. To construct a GUI, the R programmer calls RGtk2 functions that map to functions in the underlying GTK+ library. This paper introduces the basic concepts underlying GTK+ and explains how to use RGtk2 to construct GUIs from R. The tutorial is based on simple and practical programming examples. We also provide more complex examples illustrating the advanced features of the package. The design of the RGtk2 API and the low-level interface from R to GTK+ are discussed at length. We compare RGtk2 to alternative GUI toolkits for R.

  1. A methodological toolkit for field assessments of artisanally mined alluvial diamond deposits

    Science.gov (United States)

    Chirico, Peter G.; Malpeli, Katherine C.

    2014-01-01

    This toolkit provides a standardized checklist of critical issues relevant to artisanal mining-related field research. An integrated sociophysical geographic approach to collecting data at artisanal mine sites is outlined. The implementation and results of a multistakeholder approach to data collection, carried out in the assessment of Guinea’s artisanally mined diamond deposits, also are summarized. This toolkit, based on recent and successful field campaigns in West Africa, has been developed as a reference document to assist other government agencies or organizations in collecting the data necessary for artisanal diamond mining or similar natural resource assessments.

  2. iDC: A comprehensive toolkit for the analysis of residual dipolar couplings for macromolecular structure determination

    International Nuclear Information System (INIS)

    Wei Yufeng; Werner, Milton H.

    2006-01-01

    Measurement of residual dipolar couplings (RDCs) has become an important method for the determination and validation of protein or nucleic acid structures by NMR spectroscopy. A number of toolkits have been devised for the handling of RDC data which run in the Linux/Unix operating environment and require specifically formatted input files. The outputs from these programs, while informative, require format modification prior to the incorporation of this data into commonly used personal computer programs for manuscript preparation. To bridge the gap between analysis and publication, an easy-to-use, comprehensive toolkit for RDC analysis has been created, iDC. iDC is written for the WaveMetrics Igor Pro mathematics program, a widely used graphing and data analysis software program that runs on both Windows PC and Mac OS X computers. Experimental RDC values can be loaded into iDC using simple data formats accessible to Igor's tabular data function. The program can perform most useful RDC analyses, including alignment tensor estimation from a histogram of RDC occurrence versus values and order tensor analysis by singular value decomposition (SVD). SVD analysis can be performed on an entire structure family at once, a feature missing in other applications of this kind. iDC can also import from and export to several different commonly used programs for the analysis of RDC data (DC, PALES, REDCAT) and can prepare formatted files for RDC-based refinement of macromolecular structures using XPLOR-NIH, CNS and ARIA. The graphical user interface provides an easy-to-use I/O for data, structures and formatted outputs.
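
    The order-tensor (SVD) analysis mentioned above rests on the fact that each normalized RDC is a linear function of the five independent elements of the traceless Saupe matrix, D = (y²-x²)Syy + (z²-x²)Szz + 2xySxy + 2xzSxz + 2yzSyz with Sxx = -(Syy+Szz), so the tensor follows from a linear least-squares/SVD solve. The NumPy sketch below illustrates that linear algebra only; it is not iDC's Igor Pro implementation, and the bond vectors and couplings are synthetic.

        # Least-squares/SVD fit of the Saupe order matrix from RDCs (illustration only,
        # not iDC's Igor Pro code; bond vectors and couplings below are synthetic).
        import numpy as np

        def fit_saupe(bond_vectors, rdcs):
            """Fit the traceless symmetric 3x3 Saupe matrix from normalized RDCs."""
            v = bond_vectors / np.linalg.norm(bond_vectors, axis=1, keepdims=True)
            x, y, z = v[:, 0], v[:, 1], v[:, 2]
            A = np.column_stack([y**2 - x**2, z**2 - x**2, 2*x*y, 2*x*z, 2*y*z])
            syy, szz, sxy, sxz, syz = np.linalg.lstsq(A, rdcs, rcond=None)[0]
            sxx = -(syy + szz)
            return np.array([[sxx, sxy, sxz],
                             [sxy, syy, syz],
                             [sxz, syz, szz]])

        # Example: recover a known test tensor from back-calculated couplings.
        rng = np.random.default_rng(0)
        vecs = rng.normal(size=(20, 3))
        u = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
        S_true = np.diag([1e-4, 2e-4, -3e-4])               # traceless test tensor
        rdc = np.einsum('ni,ij,nj->n', u, S_true, u)        # D_i = u_i^T S u_i
        print(np.round(fit_saupe(vecs, rdc), 6))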

  3. Penetration Tester's Open Source Toolkit

    CERN Document Server

    Faircloth, Jeremy

    2011-01-01

    Great commercial penetration testing tools can be very expensive and sometimes hard to use or of questionable accuracy. This book helps solve both of these problems. The open source, no-cost penetration testing tools presented do a great job and can be modified by the user for each situation. Many tools, even ones that cost thousands of dollars, do not come with any type of instruction on how and in which situations the penetration tester can best use them. Penetration Tester's Open Source Toolkit, Third Edition, expands upon existing instructions so that a professional can get the most accura

  4. Google Web Toolkit for Ajax

    CERN Document Server

    Perry, Bruce

    2007-01-01

    The Google Web Toolkit (GWT) is a nifty framework that Java programmers can use to create Ajax applications. The GWT allows you to create an Ajax application in your favorite IDE, such as IntelliJ IDEA or Eclipse, using paradigms and mechanisms similar to programming a Java Swing application. After you code the application in Java, the GWT's tools generate the JavaScript code the application needs. You can also use typical Java project tools such as JUnit and Ant when creating GWT applications. The GWT is a free download, and you can freely distribute the client- and server-side code you c

  5. Radar remote sensing in biology

    Science.gov (United States)

    Moore, Richard K.; Simonett, David S.

    1967-01-01

    The present status of research on discrimination of natural and cultivated vegetation using radar imaging systems is sketched. The value of multiple polarization radar in improved discrimination of vegetation types over monoscopic radars is also documented. Possible future use of multi-frequency, multi-polarization radar systems for all weather agricultural survey is noted.

  6. The Knowledge Translation Toolkit: Bridging the Know–Do Gap: A ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2011-06-06

    Jun 6, 2011 ... It presents the theories, tools, and strategies required to encourage and enable ... Toolkit: Bridging the Know–Do Gap: A Resource for Researchers ... violence, and make digital platforms work for inclusive development.

  7. Processing of Ground Penetrating Radar (GPR) data using the MATGPR R-3.5 software

    Directory of Open Access Journals (Sweden)

    Elfarabi Amien

    2017-03-01

    A ground-penetrating radar (GPR) instrument transmits electromagnetic wave signals into the ground and then records the electromagnetic waves once they return to the surface. The GPR instrument can map the subsurface conditions along its path; it is also very sensitive to objects with a large electrical or magnetic component or charge. Such objects can be regarded as noise sources. This noise affects the output, and data processing is therefore required to filter it out, so that good results are obtained and no confusion arises during data interpretation.
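
    A standard filtering step in this kind of GPR processing is background (mean-trace) removal, which suppresses horizontally coherent noise such as antenna ringing while preserving dipping reflections. The NumPy sketch below illustrates the idea on a synthetic radargram; it is a generic example, not MATGPR's MATLAB code.

        # Background removal (mean-trace subtraction) for a GPR radargram: subtracting the
        # average trace suppresses horizontally coherent ringing while keeping dipping events.
        # Generic illustration only, not MATGPR code; the radargram below is synthetic.
        import numpy as np

        def remove_background(radargram):
            """radargram: 2-D array (time samples x traces); subtract the mean trace."""
            return radargram - radargram.mean(axis=1, keepdims=True)

        # Synthetic example: flat ringing at sample 50 plus one dipping reflector.
        n_samples, n_traces = 256, 100
        data = np.zeros((n_samples, n_traces))
        data[50, :] = 1.0                          # horizontal ringing (removed by the filter)
        for j in range(n_traces):
            data[100 + j // 2, j] += 0.5           # dipping reflector (largely preserved)
        filtered = remove_background(data)
        print(abs(filtered[50]).max(), abs(filtered[120]).max())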

  8. Signal processing in noise waveform radar

    CERN Document Server

    Kulpa, Krzysztof

    2013-01-01

    This book is devoted to the emerging technology of noise waveform radar and its signal processing aspects. It is a new kind of radar, which uses a noise-like waveform to illuminate the target. The book includes an introduction to basic radar theory, starting from classical pulse radar, signal compression, and wave radar. The book then discusses the properties, difficulties and potential of noise radar systems, primarily for low-power and short-range civil applications. The contribution of modern signal processing techniques to making noise radar practical is emphasized, and application examples
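
    The signal-compression step that makes a noise waveform usable is cross-correlation of the received echo with a stored replica of the transmitted noise: the lag of the correlation peak gives the target delay. A short illustrative simulation follows; the waveform length, echo amplitude and noise level are arbitrary assumptions.

        # Noise-radar pulse compression sketch: correlating the echo with a replica of the
        # transmitted noise locates the target delay at the correlation peak.
        # All parameters are arbitrary illustration values.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 4096
        tx = rng.standard_normal(n)                 # transmitted noise waveform (stored replica)

        delay = 350                                 # true target delay (samples)
        rx = np.zeros(n)
        rx[delay:] = 0.2 * tx[:n - delay]           # weak delayed echo
        rx += 0.5 * rng.standard_normal(n)          # receiver noise

        corr = np.correlate(rx, tx, mode="full")    # matched filtering / compression
        lag = np.argmax(corr) - (n - 1)             # lag of the correlation peak
        print("estimated delay:", lag, "samples")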

  9. Developing Climate Resilience Toolkit Decision Support Training Sectio

    Science.gov (United States)

    Livezey, M. M.; Herring, D.; Keck, J.; Meyers, J. C.

    2014-12-01

    The Climate Resilience Toolkit (CRT) is a Federal government effort to address the U.S. President's Climate Action Plan and Executive Order for Climate Preparedness. The toolkit will provide access to tools and products useful for climate-sensitive decision making. To optimize the user experience, the toolkit will also provide access to training materials. The National Oceanic and Atmospheric Administration (NOAA) has been building a climate training capability for 15 years. The target audience for the training has historically been mainly NOAA staff with some modified training programs for external users and stakeholders. NOAA is now using this climate training capacity for the CRT. To organize the CRT training section, we collaborated with the Association of Climate Change Officers to determine the best strategy and identified four additional complementary skills needed for successful decision making: climate literacy, environmental literacy, risk assessment and management, and strategic execution and monitoring. Developing the climate literacy skills requires knowledge of climate variability and change, as well as an introduction to the suite of available products and services. For the development of an environmental literacy category, specific topics needed include knowledge of climate impacts on specific environmental systems. Climate risk assessment and management introduces a process for decision making and provides knowledge on communication of climate information and integration of climate information in planning processes. The strategic execution and monitoring category provides information on use of NOAA climate products, services, and partnership opportunities for decision making. In order to use the existing training modules, it was necessary to assess their level of complexity, catalog them, and develop guidance for users on a curriculum to take advantage of the training resources to enhance their learning experience. With the development of this CRT

  10. The DLESE Evaluation Toolkit Project

    Science.gov (United States)

    Buhr, S. M.; Barker, L. J.; Marlino, M.

    2002-12-01

    The Evaluation Toolkit and Community project is a new Digital Library for Earth System Education (DLESE) collection designed to raise awareness of project evaluation within the geoscience education community, and to enable principal investigators, teachers, and evaluators to implement project evaluation more readily. This new resource is grounded in the needs of geoscience educators, and will provide a virtual home for a geoscience education evaluation community. The goals of the project are to 1) provide a robust collection of evaluation resources useful for Earth systems educators, 2) establish a forum and community for evaluation dialogue within DLESE, and 3) disseminate the resources through the DLESE infrastructure and through professional society workshops and proceedings. Collaboration and expertise in education, geoscience and evaluation are necessary if we are to conduct the best possible geoscience education. The Toolkit allows users to engage in evaluation at whichever level best suits their needs, get more evaluation professional development if desired, and access the expertise of other segments of the community. To date, a test web site has been built and populated, initial community feedback from the DLESE and broader community is being garnered, and we have begun to heighten awareness of geoscience education evaluation within our community. The web site contains features that allow users to access professional development about evaluation, search and find evaluation resources, submit resources, find or offer evaluation services, sign up for upcoming workshops, take the user survey, and submit calendar items. The evaluation resource matrix currently contains resources that have met our initial review. The resources are currently organized by type; they will become searchable on multiple dimensions of project type, audience, objectives and evaluation resource type as efforts to develop a collection-specific search engine mature. The peer review

  11. Radar Weather Observation

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Radar Weather Observation is a set of archived historical manuscripts stored on microfiche. The primary source of these radar weather observations manuscript records...

  12. A review of array radars

    Science.gov (United States)

    Brookner, E.

    1981-10-01

    Achievements in the area of array radars are illustrated by such activities as the operational deployment of the large high-power, high-range-resolution Cobra Dane; the operational deployment of two all-solid-state high-power, large UHF Pave Paws radars; and the development of the SAM multifunction Patriot radar. This paper reviews the following topics: array radars steered in azimuth and elevation by phase shifting (phase-phase steered arrays); arrays steered ±60 deg, limited scan arrays, hemispherical coverage, and omnidirectional coverage arrays; array radars steering electronically in only one dimension, either by frequency or by phase steering; and array radar antennas which use no electronic scanning but instead use array antennas for achieving low antenna sidelobes.

  13. A survey exploring National Health Service ePrescribing Toolkit use and perceived usefulness amongst English hospitals

    Directory of Open Access Journals (Sweden)

    Kathrin Cresswell

    2017-06-01

    Conclusions: Interactive elements and lessons learned from early adopter sites that had accumulated experience of implementing systems were viewed as the most helpful aspects of the ePrescribing Toolkit. The Toolkit now needs to be further developed to facilitate the continuing implementation/optimisation of ePrescribing and other health information technology across the NHS.

  14. Doppler radar physiological sensing

    CERN Document Server

    Lubecke, Victor M; Droitcour, Amy D; Park, Byung-Kwon; Singh, Aditya

    2016-01-01

    Presents a comprehensive description of the theory and practical implementation of Doppler radar-based physiological monitoring. This book includes an overview of current physiological monitoring techniques and explains the fundamental technology used in remote non-contact monitoring methods. Basic radio wave propagation and radar principles are introduced along with the fundamentals of physiological motion and measurement. Specific design and implementation considerations for physiological monitoring radar systems are then discussed in detail. The authors address current research and commercial development of Doppler radar based physiological monitoring for healthcare and other applications.

  15. Radar Remote Sensing

    Science.gov (United States)

    Rosen, Paul A.

    2012-01-01

    This lecture was just a taste of radar remote sensing techniques and applications. Other important areas include stereo radargrammetry, PolInSAR for volumetric structure mapping, and agricultural monitoring, soil moisture, ice mapping, etc. The broad range of sensor types, frequencies of observation and availability of sensors have enabled radar sensors to make significant contributions in a wide area of earth and planetary remote sensing sciences. The range of applications, both qualitative and quantitative, continues to expand with each new generation of sensors.

  16. Implementation of the Good School Toolkit in Uganda: a quantitative process evaluation of a successful violence prevention program.

    Science.gov (United States)

    Knight, Louise; Allen, Elizabeth; Mirembe, Angel; Nakuti, Janet; Namy, Sophie; Child, Jennifer C; Sturgess, Joanna; Kyegombe, Nambusi; Walakira, Eddy J; Elbourne, Diana; Naker, Dipak; Devries, Karen M

    2018-05-09

    The Good School Toolkit, a complex behavioural intervention designed by Raising Voices, a Ugandan NGO, reduced past week physical violence from school staff to primary students by an average of 42% in a recent randomised controlled trial. This process evaluation quantitatively examines what was implemented across the twenty-one intervention schools, variations in school prevalence of violence after the intervention, factors that influence exposure to the intervention and factors associated with students' experience of physical violence from staff at study endline. Implementation measures were captured prospectively in the twenty-one intervention schools over four school terms from 2012 to 2014 and Toolkit exposure captured in the student (n = 1921) and staff (n = 286) endline cross-sectional surveys in 2014. Implementation measures and the prevalence of violence are summarised across schools and are assessed for correlation using Spearman's Rank Correlation Coefficient. Regression models are used to explore individual factors associated with Toolkit exposure and with physical violence at endline. School prevalence of past week physical violence from staff against students ranged from 7% to 65% across schools at endline. Schools with higher mean levels of teacher Toolkit exposure had larger decreases in violence during the study. Students in schools categorised as implementing a 'low' number of program school-led activities reported less exposure to the Toolkit. Higher student Toolkit exposure was associated with decreased odds of experiencing physical violence from staff (OR: 0.76, 95% CI: 0.67-0.86). Effectiveness of the Toolkit may be increased by further targeting and supporting teachers' engagement with girls and students with mental health difficulties. The trial is registered at clinicaltrials.gov, NCT01678846, August 24th 2012.

  17. Patient-Centered Personal Health Record and Portal Implementation Toolkit for Ambulatory Clinics: A Feasibility Study.

    Science.gov (United States)

    Nahm, Eun-Shim; Diblasi, Catherine; Gonzales, Eva; Silver, Kristi; Zhu, Shijun; Sagherian, Knar; Kongs, Katherine

    2017-04-01

    Personal health records and patient portals have been shown to be effective in managing chronic illnesses. Despite recent nationwide implementation efforts, the personal health record and patient portal adoption rates among patients are low, and the lack of support for patients using the programs remains a critical gap in most implementation processes. In this study, we implemented the Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit in a large diabetes/endocrinology center and assessed its preliminary impact on personal health record and patient portal knowledge, self-efficacy, patient-provider communication, and adherence to treatment plans. Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit is composed of Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General, clinic-level resources for clinicians, staff, and patients, and Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit Plus, an optional 4-week online resource program for patients ("MyHealthPortal"). First, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General was implemented, and all clinicians and staff were educated about the center's personal health record and patient portal. Then general patient education was initiated, while a randomized controlled trial was conducted to test the preliminary effects of "MyHealthPortal" using a small sample (n = 74) with three observations (baseline and 4 and 12 weeks). The intervention group showed significantly greater improvement than the control group in patient-provider communication at 4 weeks (t56 = 3.00, P = .004). For other variables, the intervention group tended to show greater improvement; however, the differences were not significant. In this preliminary study, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit showed potential for filling the gap in the current

  18. Using features of local densities, statistics and HMM toolkit (HTK) for offline Arabic handwriting text recognition

    Directory of Open Access Journals (Sweden)

    El Moubtahij Hicham

    2017-12-01

    This paper presents an analytical approach to an offline handwritten Arabic text recognition system. It is based on the Hidden Markov Model Toolkit (HTK) without explicit segmentation. The first phase is preprocessing, where the data are introduced into the system after quality enhancements. Then, a set of features (local densities and statistical features) is extracted using a sliding-window technique. Subsequently, the resulting feature vectors are injected into the Hidden Markov Model Toolkit (HTK). The simple database "Arabic-Numbers" and IFN/ENIT are used to evaluate the performance of this system. Keywords: Hidden Markov Model Toolkit (HTK), sliding windows
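
    The sliding-window feature extraction used in this kind of system can be illustrated generically: a narrow window slides along the writing direction of the binarized text-line image and, at each position, the foreground-pixel density of each vertical zone becomes one feature for the HMM. The sketch below uses hypothetical window and zone sizes and is not the authors' exact feature set.

        # Generic sliding-window local-density features for a binarized text-line image.
        # Window width, step and zone count are hypothetical, not the authors' exact setup.
        import numpy as np

        def sliding_window_densities(binary_line, win_width=8, step=4, n_zones=8):
            """binary_line: 2-D {0,1} array (height x width); returns (n_windows, n_zones)."""
            h, w = binary_line.shape
            zone_edges = np.linspace(0, h, n_zones + 1).astype(int)
            features = []
            for x0 in range(0, w - win_width + 1, step):
                window = binary_line[:, x0:x0 + win_width]
                features.append([window[zone_edges[z]:zone_edges[z + 1]].mean()
                                 for z in range(n_zones)])
            return np.asarray(features)

        # Example on a synthetic text line (48 pixels high, 400 wide):
        line = (np.random.default_rng(2).random((48, 400)) > 0.9).astype(np.uint8)
        print(sliding_window_densities(line).shape)        # -> (99, 8)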

  19. Principles of modern radar: radar applications

    CERN Document Server

    Scheer, James A

    2013-01-01

    Principles of Modern Radar: Radar Applications is the third of the three-volume series of what was originally designed to be accomplished in one volume. As the final volume of the set, it finishes the original vision of a complete yet bounded reference for radar technology. This volume describes fifteen different system applications or classes of applications in more detail than can be found in Volumes I or II. As different as the applications described, there is a difference in how these topics are treated by the authors. Whereas in Volumes I and II there is strict adherence to chapter format and leve

  20. Computational Chemistry Toolkit for Energetic Materials Design

    Science.gov (United States)

    2006-11-01

    industry are aggressively engaged in efforts to develop multiscale modeling and simulation methodologies to model and analyze complex phenomena across...energetic materials design. It is hoped that this toolkit will evolve into a collection of well-integrated multiscale modeling methodologies... [Table fragment — Experimental / Theoretical / This Work: 1,5-diamino-4-methyl-tetrazolium nitrate: 8.4 / 41.7 / 47.5; 1,5-diamino-4-methyl-tetrazolium azide: 138.1 / 161.6]

  1. Accelerator Operators and Software Development

    International Nuclear Information System (INIS)

    April Miller; Michele Joyce

    2001-01-01

    At Thomas Jefferson National Accelerator Facility, accelerator operators perform tasks in their areas of specialization in addition to their machine operations duties. One crucial area in which operators contribute is software development. Operators with programming skills are uniquely qualified to develop certain controls applications because of their expertise in the day-to-day operation of the accelerator. Jefferson Lab is one of the few laboratories that utilizes the skills and knowledge of operators to create software that enhances machine operations. Through the programs written by operators, Jefferson Lab has improved machine efficiency and beam availability. Because many of these applications involve automation of procedures and need graphical user interfaces, the scripting language Tcl and the Tk toolkit have been adopted. In addition to automation, some operator-developed applications are used for information distribution. For this purpose, several standard web development tools such as perl, VBScript, and ASP are used. Examples of applications written by operators include injector steering, spin angle changes, system status reports, magnet cycling routines, and quantum efficiency measurements. This paper summarizes how the unique knowledge of accelerator operators has contributed to the success of the Jefferson Lab control system. *This work was supported by the U.S. DOE contract No. DE-AC05-84-ER40150

  2. Survey of Ultra-wideband Radar

    Science.gov (United States)

    Mokole, Eric L.; Hansen, Pete

    The development of UWB radar over the last four decades is very briefly summarized. A discussion of the meaning of UWB is followed by a short history of UWB radar developments and discussions of key supporting technologies and current UWB radars. Selected UWB radars and the associated applications are highlighted. Applications include detecting and imaging buried mines, detecting and mapping underground utilities, detecting and imaging objects obscured by foliage, through-wall detection in urban areas, short-range detection of suicide bombs, and the characterization of the impulse responses of various artificial and naturally occurring scattering objects. In particular, the Naval Research Laboratory's experimental, low-power, dual-polarized, short-pulse, ultra-high resolution radar is used to discuss applications and issues of UWB radar. Some crucial issues that are problematic to UWB radar are spectral availability, electromagnetic interference and compatibility, difficulties with waveform control/shaping, hardware limitations in the transmission chain, and the unreliability of high-power sources for sustained use above 2 GHz.

  3. Risk assessment of chemicals in foundries: The International Chemical Toolkit pilot-project

    International Nuclear Information System (INIS)

    Ribeiro, Marcela G.; Filho, Walter R.P.

    2006-01-01

    In Brazil, problems regarding protection from hazardous substances in small-sized enterprises are similar to those observed in many other countries. Looking for a simple tool to assess and control such exposures, FUNDACENTRO started a pilot project in 2005 to implement the International Chemical Control Toolkit. During the series of visits to foundries, it was observed that although many changes have occurred in foundry technology, occupational exposures to silica dust and metal fumes continue to occur, due to a lack of perception of occupational exposure in the work environment. After introducing the Chemical Toolkit concept to the foundry work group, it was possible to show that the activities undertaken to improve the management of chemicals, according to its concept, will support companies in fulfilling government legislation related to chemical management, occupational health and safety, and environmental impact. In the following meetings, the foundry work group and FUNDACENTRO research team will identify 'inadequate work situations'. Based on the Chemical Toolkit, improvement measures will be proposed. Afterwards, a survey will verify the efficiency of those measures in the control of hazards and consequently on the management of chemicals. This step is now in progress.

  4. German Radar Observation Shuttle Experiment (ROSE)

    Science.gov (United States)

    Sleber, A. J.; Hartl, P.; Haydn, R.; Hildebrandt, G.; Konecny, G.; Muehlfeld, R.

    1984-01-01

    The success of radar sensors in several different application areas of interest depends on the knowledge of the backscatter of radar waves from the targets of interest, the variance of these interaction mechanisms with respect to changing measurement parameters, and the determination of the influence of the measuring systems on the results. The incidence-angle dependency of the radar cross section of different natural targets is derived. Problems involved by the combination of data gained with different sensors, e.g., MSS-, TM-, SPOT- and SAR-images, are analyzed. Radar cross-section values gained with ground-based radar spectrometers and spaceborne radar imaging, and non-imaging scatterometers and spaceborne radar images from the same areal target are correlated. The penetration of L-band radar waves into vegetated and nonvegetated surfaces is analyzed.

  5. Meteor detection on ST (MST) radars

    International Nuclear Information System (INIS)

    Avery, S.K.

    1987-01-01

    The ability to detect radar echoes from backscatter due to turbulent irregularities of the radio refractive index in the clear atmosphere has led to an increasing number of established mesosphere - stratosphere - troposphere (MST or ST) radars. Humidity and temperature variations are responsible for the echo in the troposphere and stratosphere, and turbulence acting on electron density gradients provides the echo in the mesosphere. The MST radar and its smaller version, the ST radar, are pulsed Doppler radars operating in the VHF - UHF frequency range. These echoes can be used to determine upper atmosphere winds at little extra cost to the ST radar configuration. In addition, the meteor echoes can supplement mesospheric data from an MST radar. The detection techniques required on the ST radar for delineating meteor echo returns are described.

  6. Field trials of a novel toolkit for evaluating 'intangible' values-related dimensions of projects.

    Science.gov (United States)

    Burford, Gemma; Velasco, Ismael; Janoušková, Svatava; Zahradnik, Martin; Hak, Tomas; Podger, Dimity; Piggot, Georgia; Harder, Marie K

    2013-02-01

    A novel toolkit has been developed, using an original approach to develop its components, for the purpose of evaluating 'soft' outcomes and processes that have previously been generally considered 'intangible': those which are specifically values based. This represents a step-wise, significant change in provision for the assessment of values-based achievements that are of absolutely key importance to most civil society organisations (CSOs) and values-based businesses, and fills a known gap in evaluation practice. In this paper, we demonstrate the significance and rigour of the toolkit by presenting an evaluation of it in three diverse scenarios where different CSOs use it to co-evaluate locally relevant outcomes and processes to obtain results which are both meaningful to them and potentially comparable across organisations. A key strength of the toolkit is its original use of a prior generated, peer-elicited 'menu' of values-based indicators which provides a framework for user CSOs to localise. Principles of participatory, process-based and utilisation-focused evaluation are embedded in this toolkit and shown to be critical to its success, achieving high face-validity and wide applicability. The emerging contribution of this next-generation evaluation tool to other fields, such as environmental values, development and environmental sustainable development, shared values, business, education and organisational change is outlined. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. CAMEX-4 TOGA RADAR V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The TOGA radar dataset consists of browse and radar data collected from the TOGA radar during the CAMEX-4 experiment. TOGA is a C-band linear polarized doppler radar...

  8. Cognitive Radio Application for Evaluating Coexistence with Cognitive Radars: A Software User’s Guide

    Science.gov (United States)

    2017-10-01

    with both conventional wireless systems as well as other types of cognitive RF systems (e.g., cognitive radar). The radio hardware for this...the base stations are at fixed positions and often elevated and operating with relatively high power compared with mobiles, it is straightforward...for cognitive RF systems to detect the base station’s transmissions and avoid activity that would harm this downlink. By contrast, the mobile

  9. Toolkit for local decision makers aims to strengthen environmental sustainability

    CSIR Research Space (South Africa)

    Murambadoro, M

    2011-11-01

    Members of the South African Risk and Vulnerability Atlas were involved in a meeting aimed at the development of a toolkit for improved integration of climate change into local government's integrated development planning (IDP) process....

  10. Radar Plan Position Indicator Scope

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Radar Plan Position Indicator Scope is the collection of weather radar imagery for the period prior to the beginning of the Next Generation Radar (NEXRAD) system...

  11. Improving safety on rural local and tribal roads safety toolkit.

    Science.gov (United States)

    2014-08-01

    Rural roadway safety is an important issue for communities throughout the country and presents a challenge for state, local, and Tribal agencies. The Improving Safety on Rural Local and Tribal Roads Safety Toolkit was created to help rural local ...

  12. PEA: an integrated R toolkit for plant epitranscriptome analysis.

    Science.gov (United States)

    Zhai, Jingjing; Song, Jie; Cheng, Qian; Tang, Yunjia; Ma, Chuang

    2018-05-29

    The epitranscriptome, also known as chemical modifications of RNA (CMRs), is a newly discovered layer of gene regulation, the biological importance of which emerged through analysis of only a small fraction of CMRs detected by high-throughput sequencing technologies. Understanding of the epitranscriptome is hampered by the absence of computational tools for the systematic analysis of epitranscriptome sequencing data. In addition, no tools have yet been designed for accurate prediction of CMRs in plants, or to extend epitranscriptome analysis from a fraction of the transcriptome to its entirety. Here, we introduce PEA, an integrated R toolkit to facilitate the analysis of plant epitranscriptome data. The PEA toolkit contains a comprehensive collection of functions required for read mapping, CMR calling, motif scanning and discovery, and gene functional enrichment analysis. PEA also takes advantage of machine learning technologies for transcriptome-scale CMR prediction, with high prediction accuracy, using the Positive Samples Only Learning algorithm, which addresses the two-class classification problem by using only positive samples (CMRs), in the absence of negative samples (non-CMRs). Hence PEA is a versatile epitranscriptome analysis pipeline covering CMR calling, prediction, and annotation, and we describe its application to predict N6-methyladenosine (m6A) modifications in Arabidopsis thaliana. Experimental results demonstrate that the toolkit achieved 71.6% sensitivity and 73.7% specificity, which is superior to existing m6A predictors. PEA is potentially broadly applicable to the in-depth study of epitranscriptomics. PEA Docker image is available at https://hub.docker.com/r/malab/pea, source codes and user manual are available at https://github.com/cma2015/PEA. Contact: chuangma2006@gmail.com. Supplementary data are available at Bioinformatics online.
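
    The positive-samples-only setting that PSOL addresses can be illustrated with a generic positive-unlabelled (PU) bagging sketch: repeatedly treat a random bootstrap of the unlabelled examples as provisional negatives, train against the known positives, and average the held-out scores. The Python code below is an illustration of the PU idea only; it is not PEA's R implementation or the PSOL algorithm, and all data are synthetic.

        # Generic PU-bagging sketch (illustration of positive-unlabelled learning only;
        # not PEA's PSOL algorithm).  Synthetic data; scores near 1 suggest hidden positives.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def pu_bagging_scores(X_pos, X_unl, n_rounds=50, seed=0):
            rng = np.random.default_rng(seed)
            scores = np.zeros(len(X_unl))
            counts = np.zeros(len(X_unl))
            for _ in range(n_rounds):
                idx = rng.choice(len(X_unl), size=len(X_pos), replace=True)
                X = np.vstack([X_pos, X_unl[idx]])
                y = np.r_[np.ones(len(X_pos)), np.zeros(len(idx))]
                clf = RandomForestClassifier(n_estimators=50).fit(X, y)
                held_out = np.setdiff1d(np.arange(len(X_unl)), idx)
                scores[held_out] += clf.predict_proba(X_unl[held_out])[:, 1]
                counts[held_out] += 1
            return scores / np.maximum(counts, 1)

        rng = np.random.default_rng(1)
        X_pos = rng.normal(2.0, 1.0, size=(40, 2))                  # known positives
        X_unl = np.vstack([rng.normal(2.0, 1.0, size=(60, 2)),      # hidden positives
                           rng.normal(-2.0, 1.0, size=(140, 2))])   # likely negatives
        print(np.round(pu_bagging_scores(X_pos, X_unl)[:5], 2))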

  13. The Virtual Physiological Human ToolKit.

    Science.gov (United States)

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  14. Performance Prediction Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-25

    The Performance Prediction Toolkit (PPT) is a scalable co-design tool that contains the hardware and middle-ware models, which accept proxy applications as input in runtime prediction. PPT relies on Simian, a parallel discrete event simulation engine in Python or Lua, that uses the process concept, where each computing unit (host, node, core) is a Simian entity. Processes perform their task through message exchanges to remain active, sleep, wake-up, begin and end. The PPT hardware model of a compute core (such as a Haswell core) consists of a set of parameters, such as clock speed, memory hierarchy levels, their respective sizes, cache-lines, access times for different cache levels, average cycle counts of ALU operations, etc. These parameters are ideally read off a spec sheet or are learned using regression models trained on hardware counter (PAPI) data. The compute core model offers an API to the software model, a function called time_compute(), which takes as input a tasklist. A tasklist is an unordered set of ALU and other CPU-type operations (in particular virtual memory loads and stores). The PPT application model mimics the loop structure of the application and replaces the computational kernels with a call to the hardware model's time_compute() function, giving tasklists as input that model the compute kernel. A PPT application model thus consists of tasklists representing kernels and the higher-level loop structure that we like to think of as pseudo code. The key challenge for the hardware model's time_compute() function is to translate virtual memory accesses into actual cache hierarchy level hits and misses. PPT also contains another CPU core level hardware model, the Analytical Memory Model (AMM). The AMM solves this challenge soundly, whereas our previous alternatives explicitly include the L1, L2, and L3 hit-rates as inputs to the tasklists. Explicit hit-rates inevitably only reflect the application modeler's best guess, perhaps informed by a few
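
    The time_compute(tasklist) pattern described above can be illustrated with a minimal stand-alone sketch: operation counts in a tasklist are mapped to cycles using per-operation costs derived from the core parameters. The parameter values and task names below are hypothetical illustration choices, not PPT's actual hardware model.

        # Minimal illustration of a time_compute(tasklist)-style hardware model: operation
        # counts are weighted by per-operation cycle costs taken from core parameters.
        # Parameter values and task names are hypothetical, not those of PPT's models.

        CORE = {
            "clock_hz": 2.3e9,          # clock speed
            "cycles_per_alu": 1.0,      # average ALU operation cost (cycles)
            "cycles_per_l1_hit": 4.0,   # L1 load latency (cycles)
            "cycles_per_dram": 200.0,   # main-memory access latency (cycles)
        }

        def time_compute(tasklist, l1_hit_rate=0.95):
            """tasklist: dict of operation counts, e.g. {'alu': 1e9, 'loads': 2e8}."""
            cycles = tasklist.get("alu", 0) * CORE["cycles_per_alu"]
            loads = tasklist.get("loads", 0)
            cycles += loads * (l1_hit_rate * CORE["cycles_per_l1_hit"]
                               + (1.0 - l1_hit_rate) * CORE["cycles_per_dram"])
            return cycles / CORE["clock_hz"]       # predicted seconds for this kernel

        # An application model replaces a kernel body with a tasklist describing it:
        kernel_tasklist = {"alu": 5e9, "loads": 1.2e9}
        print("predicted kernel time: %.3f s" % time_compute(kernel_tasklist))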

  15. Airborne Radar Observations of Severe Hailstorms: Implications for Future Spaceborne Radar

    Science.gov (United States)

    Heymsfield, Gerald M.; Tian, Lin; Li, Lihua; McLinden, Matthew; Cervantes, Jaime I.

    2013-01-01

    A new dual-frequency (Ku and Ka band) nadir-pointing Doppler radar on the high-altitude NASA ER-2 aircraft, called the High-Altitude Imaging Wind and Rain Airborne Profiler (HIWRAP), has collected data over severe thunderstorms in Oklahoma and Kansas during the Midlatitude Continental Convective Clouds Experiment (MC3E). The overarching motivation for this study is to understand the behavior of the dual-wavelength airborne radar measurements in a global variety of thunderstorms and how these may relate to future spaceborne-radar measurements. HIWRAP is operated at frequencies that are similar to those of the precipitation radar on the Tropical Rainfall Measuring Mission (Ku band) and the upcoming Global Precipitation Measurement mission satellite's dual-frequency (Ku and Ka bands) precipitation radar. The aircraft measurements of strong hailstorms have been combined with ground-based polarimetric measurements to obtain a better understanding of the response of the Ku- and Ka-band radar to the vertical distribution of the hydrometeors, including hail. Data from two flight lines on 24 May 2011 are presented. Doppler velocities were approx. 39 m/s at 10.7-km altitude on the first flight line early on 24 May, and a lower value of approx. 25 m/s on a second flight line later in the day. Vertical motions estimated using a fall speed estimate for large graupel and hail suggested that the first storm had an updraft that possibly exceeded 60 m/s for the more intense part of the storm. This large updraft speed, along with reports of 5-cm hail at the surface, reflectivities reaching 70 dBZ at S band in the storm cores, and hail signals from polarimetric data, provides a highly challenging situation for spaceborne-radar measurements in intense convective systems. The Ku- and Ka-band reflectivities rarely exceed approx. 47 and approx. 37 dBZ, respectively, in these storms.

  16. Business plans--tips from the toolkit 6.

    Science.gov (United States)

    Steer, Neville

    2010-07-01

    General practice is a business. Most practices can stay afloat by having appointments, billing patients, managing the administration processes and working long hours. What distinguishes the high performance organisation from the average organisation is a business plan. This article examines how to create a simple business plan that can be applied to the general practice setting and is drawn from material contained in The Royal Australian College of General Practitioners' 'General practice management toolkit'.

  17. EPANET Multi-Species Extension Software and User's Manual ...

    Science.gov (United States)

    EPANET is used in homeland security research to model contamination threats to water systems. Historically, EPANET has been limited to tracking the dynamics of a single chemical transported through a network of pipes and storage tanks, such as fluoride used in a tracer study or free chlorine used in a disinfection decay study. Recently, the NHSRC released a new extension to EPANET called EPANET-MSX (Multi-Species eXtension) that allows for the consideration of multiple interacting species in the bulk flow and on the pipe walls. This capability has been incorporated into both a stand-alone executable program as well as a toolkit library of functions that programmers can use to build customized applications.

  18. The Nuclear Energy Advanced Modeling and Simulation Safeguards and Separations Reprocessing Plant Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    McCaskey, Alex [ORNL; Billings, Jay Jay [ORNL; de Almeida, Valmor F [ORNL

    2011-08-01

    This report details the progress made in the development of the Reprocessing Plant Toolkit (RPTk) for the DOE Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. RPTk is an ongoing development effort intended to provide users with an extensible, integrated, and scalable software framework for the modeling and simulation of spent nuclear fuel reprocessing plants by enabling the insertion and coupling of user-developed physicochemical modules of variable fidelity. The NEAMS Safeguards and Separations IPSC (SafeSeps) and the Enabling Computational Technologies (ECT) supporting program element have partnered to release an initial version of the RPTk with a focus on software usability and utility. RPTk implements a data flow architecture that is the source of the system's extensibility and scalability. Data flows through physicochemical modules sequentially, with each module importing data, evolving it, and exporting the updated data to the next downstream module. This is accomplished through various architectural abstractions designed to give RPTk true plug-and-play capabilities. A simple application of this architecture, as well as RPTk data flow and evolution, is demonstrated in Section 6 with an application consisting of two coupled physicochemical modules. The remaining sections describe this ongoing work in full, from system vision and design inception to full implementation. Section 3 describes the relevant software development processes used by the RPTk development team. These processes allow the team to manage system complexity and ensure stakeholder satisfaction. This section also details the work done on the RPTk "black box" and "white box" models, with a special focus on the separation of concerns between the RPTk user interface and application runtime. Sections 4 and 5 discuss the application runtime component in more detail, and describe the dependencies, behavior, and rigorous testing of its constituent components.
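
    The data-flow coupling described above — each module imports the shared data, evolves it, and exports the updated data to the next downstream module — can be sketched generically as a chain of module objects passing a data dictionary along. The module names and fields below are hypothetical; this is not RPTk code.

        # Generic sketch of sequential data-flow coupling of physicochemical modules
        # (import -> evolve -> export).  Module names, fields and factors are hypothetical;
        # this is not RPTk code.

        class Module:
            def evolve(self, data):
                raise NotImplementedError

        class Dissolver(Module):
            def evolve(self, data):
                data["dissolved_kg"] = 0.95 * data["fuel_kg"]        # toy dissolution yield
                return data

        class SolventExtraction(Module):
            def evolve(self, data):
                data["u_product_kg"] = 0.80 * data["dissolved_kg"]   # toy extraction efficiency
                return data

        def run_pipeline(modules, data):
            """Pass the data through each module in sequence."""
            for module in modules:
                data = module.evolve(data)
            return data

        print(run_pipeline([Dissolver(), SolventExtraction()], {"fuel_kg": 1000.0}))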

  19. Testing Video and Social Media for Engaging Users of the U.S. Climate Resilience Toolkit

    Science.gov (United States)

    Green, C. J.; Gardiner, N.; Niepold, F., III; Esposito, C.

    2015-12-01

    We developed a custom video production style and a method for analyzing social media behavior so that we may deliberately build and track audience growth for decision-support tools and case studies within the U.S. Climate Resilience Toolkit. The new style of video focuses quickly on decision processes; its 30-second format is well-suited for deployment through social media. We measured both traffic and engagement with video using Google Analytics. Each video included an embedded tag, allowing us to measure viewers' behavior: whether or not they entered the toolkit website; the duration of their session on the website; and the number of pages they visited in that session. Results showed that video promotion was more effective on Facebook than Twitter. Facebook links generated twice the number of visits to the toolkit. Videos also increased Facebook interaction overall. Because most Facebook users are return visitors, this campaign did not substantially draw new site visitors. We continue to research and apply these methods in a targeted engagement and outreach campaign that utilizes the theory of social diffusion and social influence strategies to grow our audience of "influential" decision-makers and people within their social networks. Our goal is to increase access and use of the U.S. Climate Resilience Toolkit.

  20. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    Science.gov (United States)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  1. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert [UConn Health, Department of Molecular Biology and Biophysics (United States); Martyn, Timothy O. [Rensselaer at Hartford, Department of Engineering and Science (United States); Ellis, Heidi J. C. [Western New England College, Department of Computer Science and Information Technology (United States); Gryk, Michael R., E-mail: gryk@uchc.edu [UConn Health, Department of Molecular Biology and Biophysics (United States)

    2015-07-15

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  2. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    International Nuclear Information System (INIS)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J. C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses

  3. Transition Toolkit 3.0: Meeting the Educational Needs of Youth Exposed to the Juvenile Justice System. Third Edition

    Science.gov (United States)

    Clark, Heather Griller; Mathur, Sarup; Brock, Leslie; O'Cummings, Mindee; Milligan, DeAngela

    2016-01-01

    The third edition of the National Technical Assistance Center for the Education of Neglected or Delinquent Children and Youth's (NDTAC's) "Transition Toolkit" provides updated information on existing policies, practices, strategies, and resources for transition that build on field experience and research. The "Toolkit" offers…

  4. GENFIT - a generic track-fitting toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Rauch, Johannes [Technische Universitaet Muenchen (Germany); Schlueter, Tobias [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2014-07-01

    GENFIT is an experiment-independent track-fitting toolkit, which combines fitting algorithms, track representations, and measurement geometries into a modular framework. We report on a significantly improved version of GENFIT, based on experience gained in the Belle II, PANDA, and FOPI experiments. Improvements concern the implementation of additional track-fitting algorithms, enhanced implementations of Kalman fitters, enhanced visualization capabilities, and additional implementations of measurement types suited for various kinds of tracking detectors. The data model has been revised, allowing for efficient track merging, smoothing, residual calculation and alignment.
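
    Since the report emphasizes its enhanced Kalman fitters, a generic single measurement update may help readers place the toolkit's role. The NumPy sketch below is the textbook Kalman update, not GENFIT's C++ API; the 4-parameter track state and measurement model are illustrative assumptions.

```python
# Generic Kalman filter measurement update (textbook form), not the GENFIT API.
import numpy as np


def kalman_update(x, P, z, H, R):
    """Update state x and covariance P with measurement z.

    H maps state space to measurement space; R is the measurement covariance.
    """
    y = z - H @ x                      # innovation (residual)
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new


# Example: update a 4-parameter track state with a 2D position measurement.
x = np.zeros(4)                        # e.g. (x, y, dx/dz, dy/dz)
P = np.eye(4)
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])
R = 0.01 * np.eye(2)
z = np.array([0.12, -0.05])
x, P = kalman_update(x, P, z, H, R)
```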

  5. NBII-SAIN Data Management Toolkit

    Science.gov (United States)

    Burley, Thomas E.; Peine, John D.

    2009-01-01

    percent of the cost of a spatial information system is associated with spatial data collection and management (U.S. General Accounting Office, 2003). These figures indicate that the resources (time, personnel, money) of many agencies and organizations could be used more efficiently and effectively. Dedicated and conscientious data management coordination and documentation is critical for reducing such redundancy. Substantial cost savings and increased efficiency are direct results of a pro-active data management approach. In addition, details of projects as well as data and information are frequently lost as a result of real-world occurrences such as the passing of time, job turnover, and equipment changes and failure. A standardized, well documented database allows resource managers to identify issues, analyze options, and ultimately make better decisions in the context of adaptive management (National Land and Water Resources Audit and the Australia New Zealand Land Information Council on behalf of the Australian National Government, 2003). Many environmentally focused, scientific, or natural resource management organizations collect and create both spatial and non-spatial data in some form. Data management appropriate for those data will be contingent upon the project goal(s) and objectives and thus will vary on a case-by-case basis. This project and the resulting Data Management Toolkit, hereafter referred to as the Toolkit, is therefore not intended to be comprehensive in terms of addressing all of the data management needs of all projects that contain biological, geospatial, and other types of data. The Toolkit emphasizes the idea of connecting a project's data and the related management needs to the defined project goals and objectives from the outset. In that context, the Toolkit presents and describes the fundamental components of sound data and information management that are common to projects involving biological, geospatial, and other related data

  6. The use of radar for bathymetry assessment

    OpenAIRE

    Aardoom, J.H.; Greidanus, H.S.F.

    1998-01-01

    The bottom topography in shallow seas can be observed by air- and spaceborne imaging radar. Bathymetric information derived from radar data is limited in accuracy, but radar has good spatial coverage. The accuracy can be increased by assimilating the radar imagery into existing or in situ gathered bathymetric data. The paper reviews the concepts of bathymetry assessment by radar, the radar imaging mechanism, and the possibilities and limitations of the use of radar data in rapid assessment.

  7. A HWIL test facility of infrared imaging laser radar using direct signal injection

    Science.gov (United States)

    Wang, Qian; Lu, Wei; Wang, Chunhui; Wang, Qi

    2005-01-01

    Laser radar has been widely used in recent years, and hardware-in-the-loop (HWIL) testing of laser radar has become important because of its low cost and high fidelity compared with on-the-fly testing and all-digital simulation. Scene generation and projection are two key technologies of hardware-in-the-loop testing of laser radar, and they are complicated because the 3D images result from time delay. The scene generation process begins with the definition of the target geometry, reflectivity, and range. The real-time 3D scene generation computer is PC-based hardware, and the 3D target models were modeled using 3dsMAX. The scene generation software was written in C and OpenGL and extracts the Z-buffer from the bit planes to main memory as a range image. These pixels contain each target position x, y, z and its respective intensity and range value. Work on expensive optical injection technologies for scene projection, such as LDP arrays, VCSEL arrays, and DMDs, and the associated scene generation is ongoing, but optical scene projection is complicated and often unaffordable. In this paper a cheaper test facility is described that uses direct electronic injection to provide range images for laser radar testing. The electronic delay and pulse shaping circuits inject the scenes directly into the seeker's signal processing unit.
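
    The key step of the electronic injection approach is turning the extracted Z-buffer into a metric range image. The sketch below uses Python/NumPy rather than the authors' C/OpenGL code and shows the standard linearization of perspective window-space depth; the near and far plane values are placeholders.

```python
# Convert normalized depth-buffer values to metric range (standard OpenGL
# perspective depth linearization). Python/NumPy sketch; the original system
# used C and OpenGL, and the near/far planes below are placeholder values.
import numpy as np


def depth_to_range(depth, near, far):
    """depth: array of window-space depth values in [0, 1]."""
    z_ndc = 2.0 * depth - 1.0                       # to normalized device coords
    return (2.0 * near * far) / (far + near - z_ndc * (far - near))


# Example: a 4x4 depth buffer with near = 1 m and far = 5000 m.
zbuf = np.random.uniform(0.0, 1.0, size=(4, 4))
range_image = depth_to_range(zbuf, near=1.0, far=5000.0)
```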

  8. Open software architecture for east articulated maintenance arm

    International Nuclear Information System (INIS)

    Wu, Jing; Wu, Huapeng; Song, Yuntao; Li, Ming; Yang, Yang; Alcina, Daniel A.M.

    2016-01-01

    Highlights: • The software requirements of a serial-articulated robot for EAST assembly and maintenance are presented. • An open software architecture for the robot is developed. • A component-based, distributed model system with real-time communication for the robot is constructed. - Abstract: For inside inspection and maintenance of the vacuum vessel in EAST, an articulated maintenance arm has been developed. In this article, an open software architecture developed for the EAST articulated maintenance arm (EAMA) is described, which offers robust, reliable performance and a straightforward user experience based on the standard open robotic platform OROCOS. The paper presents a component-based software architecture using a multi-layer structure: end layer, up layer, middle layer, and down layer. In the end layer the components are defined off-line in the task planner. The components in the up layer perform trajectory planning. CORBA is adopted as the communication framework to exchange data between the distributed components. The contributors use Real-Time Workshop from MATLAB/Simulink to generate the components in the middle layer. The Real-Time Toolkit guarantees that control applications run in hard real-time mode. Ethernet and the CAN bus are used for data transfer in the down layer, where the components implement the hardware functions. The distributed architecture of the control system associates each processing node with a joint, which is mapped to a component with all functioning features of the framework.
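
    The layered, component-based structure described above can be pictured as components exchanging data through ports under a periodic update loop. The Python sketch below is only an analogy for that design; OROCOS components are C++ classes built on the Real-Time Toolkit, and every name here is hypothetical.

```python
# Illustrative layered component model with port-based data exchange and a
# periodic update loop. This mimics the idea of OROCOS-style components but
# is not OROCOS code; all names are hypothetical.
import time


class Port:
    """A single-slot data connection between two components."""
    def __init__(self):
        self.value = None


class Component:
    def __init__(self, name):
        self.name = name
        self.in_port = Port()
        self.out_port = Port()

    def update(self):
        """Called once per control cycle; override in concrete components."""
        raise NotImplementedError


class TrajectoryPlanner(Component):        # "up layer": trajectory planning
    def update(self):
        self.out_port.value = {"joint_setpoint": 0.5}


class JointDriver(Component):              # "down layer": hardware interface
    def update(self):
        cmd = self.in_port.value
        if cmd is not None:
            print(f"{self.name}: commanding joint to {cmd['joint_setpoint']}")


planner, driver = TrajectoryPlanner("planner"), JointDriver("joint1")
driver.in_port = planner.out_port          # connect the layers

for _ in range(3):                         # periodic (soft) real-time loop
    planner.update()
    driver.update()
    time.sleep(0.01)
```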

  9. Open software architecture for east articulated maintenance arm

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jing, E-mail: wujing@ipp.ac.cn [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Wu, Huapeng [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Song, Yuntao [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Li, Ming [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Yang, Yang [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Alcina, Daniel A.M. [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland)

    2016-11-01

    Highlights: • A software requirement of serial-articulated robot for EAST assembly and maintains is presented. • A open software architecture of the robot is developed. • A component-based model distribution system with real-time communication of the robot is constructed. - Abstract: For the inside inspection and the maintenance of vacuum vessel in the EAST, an articulated maintenance arm is developed. In this article, an open software architecture developed for the EAST articulated maintenance arm (EAMA) is described, which offers a robust and proper performance and easy-going experience based on standard open robotic platform OROCOS. The paper presents a component-based model software architecture using multi-layer structure: end layer, up layer, middle, and down layer. In the end layer the components are defined off-line in the task planner manner. The components in up layer complete the function of trajectory plan. The CORBA, as a communication framework, is adopted to exchange the data between the distributed components. The contributors use Real-Time Workshop from the MATLAB/Simulink to generate the components in the middle layer. Real-time Toolkit guarantees control applications running in the hard real-time mode. Ethernets and the CAN bus are used for data transfer in the down layer, where the components implement the hardware functions. The distributed architecture of control system associates each processing node with each joint, which is mapped to a component with all functioning features of the framework.

  10. The Identification of Potential Resilient Estuary-based Enterprises to Encourage Economic Empowerment in South Africa: a Toolkit Approach

    Directory of Open Access Journals (Sweden)

    Rebecca Bowd

    2012-09-01

    Full Text Available It has been argued that ecosystem services can be used as the foundation to provide economic opportunities to empower the disadvantaged. The Ecosystem Services Framework (ESF) approach for poverty alleviation, which balances resource conservation and human resource use, has received much attention in the literature. However, few projects have successfully achieved both conservation and economic objectives. This is partly due to there being a hiatus between theory and practice, due to the absence of tools that help make the transition from conceptual frameworks and theory to practical integration of ecosystem services into decision making. To address this hiatus, an existing conceptual framework for analyzing the robustness of social-ecological systems was translated into a practical toolkit to help understand the complexity of social-ecological systems (SES). The toolkit can be used by a diversity of stakeholders as a decision making aid for assessing ecosystem services supply and demand and associated enterprise opportunities. The toolkit is participatory and combines a generic "top-down" scientific approach with a case-specific "bottom-up" approach. It promotes a shared understanding of the utilization of ecosystem services, which is the foundation of identifying resilient enterprises. The toolkit comprises four steps: (i) ecosystem services supply and demand assessment; (ii) roles identification; (iii) enterprise opportunity identification; and (iv) enterprise risk assessment, and was tested at two estuary study sites. Implementation of the toolkit requires the populating of preprogrammed Excel worksheets through the holding of workshops that are attended by stakeholders associated with the ecosystems. It was concluded that for an enterprise to be resilient, it must be resilient at an external SES level, which the toolkit addresses, and at an internal business functioning level, e.g., social dynamics among personnel, skills, and literacy

  11. Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data

    Science.gov (United States)

    Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.

    2013-01-01

    Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and Digital Elevation Map data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving timeliness, quality, and science value of the collected data
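
    The core idea of combining sparse GPS zenith delays with a weather-model field while accounting for terrain height can be sketched briefly. The Python fragment below is a simplified illustration, not the JPL software; the 50/50 blend and the 2 km exponential scale height are assumptions.

```python
# Simplified sketch of building a tropospheric correction map from sparse GPS
# zenith delays, a weather-model PWV field, and a DEM. Not the JPL software;
# the blending weights and the 2 km scale height are assumptions.
import numpy as np
from scipy.interpolate import griddata

# Hypothetical inputs: GPS station positions (x, y in km), zenith wet delays (m),
# a gridded PWV delay field (m), and a DEM (m).
gps_xy = np.array([[10.0, 20.0], [40.0, 15.0], [25.0, 45.0]])
gps_zwd = np.array([0.12, 0.10, 0.15])

ny, nx = 64, 64
xs = np.linspace(0.0, 50.0, nx)
ys = np.linspace(0.0, 50.0, ny)
grid_x, grid_y = np.meshgrid(xs, ys)
pwv_delay = 0.11 + 0.01 * np.random.rand(ny, nx)     # stand-in weather model field
dem = 500.0 + 300.0 * np.random.rand(ny, nx)         # stand-in terrain heights

# Interpolate the sparse GPS delays onto the grid (long-wavelength component).
gps_map = griddata(gps_xy, gps_zwd, (grid_x, grid_y), method="linear")
gps_map = np.where(np.isnan(gps_map), gps_zwd.mean(), gps_map)

# Blend with the weather-model field and rescale for terrain height, assuming
# the wet delay decays exponentially with elevation (scale height ~2 km).
blended = 0.5 * gps_map + 0.5 * pwv_delay
correction_map = blended * np.exp(-dem / 2000.0)
```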

  12. The doctor-patient relationship as a toolkit for uncertain clinical decisions.

    Science.gov (United States)

    Diamond-Brown, Lauren

    2016-06-01

    Medical uncertainty is a well-recognized problem in healthcare, yet how doctors make decisions in the face of uncertainty remains to be understood. This article draws on interdisciplinary literature on uncertainty and physician decision-making to examine a specific physician response to uncertainty: using the doctor-patient relationship as a toolkit. Additionally, I ask what happens to this process when the doctor-patient relationship becomes fragmented. I answer these questions by examining obstetrician-gynecologists' narratives regarding how they make decisions when faced with uncertainty in childbirth. Between 2013 and 2014, I performed 21 semi-structured interviews with obstetricians in the United States. Obstetricians were selected to maximize variation in relevant physician, hospital, and practice characteristics. I began with grounded theory and moved to analytical coding of themes in relation to relevant literature. My analysis renders it evident that some physicians use the doctor-patient relationship as a toolkit for dealing with uncertainty. I analyze how this process varies for physicians in different models of care by comparing doctors' experiences in models with continuous versus fragmented doctor-patient relationships. My key findings are that obstetricians in both models appealed to the ideal of patient-centered decision-making to cope with uncertain decisions, but in practice physicians in fragmented care faced a number of challenges to using the doctor-patient relationship as a toolkit for decision-making. These challenges led to additional uncertainties and in some cases to poor outcomes for doctors and/or patients; they also raised concerns about the reproduction of inequality. Thus organization of care delivery mitigates the efficacy of doctors' use of the doctor-patient relationship toolkit for uncertain decisions. These findings have implications for theorizing about decision-making under conditions of medical uncertainty, for understanding

  13. Evaluating Teaching Development Activities in Higher Education: A Toolkit

    Science.gov (United States)

    Kneale, Pauline; Winter, Jennie; Turner, Rebecca; Spowart, Lucy; Hughes, Jane; McKenna, Colleen; Muneer, Reema

    2016-01-01

    This toolkit is developed as a resource for providers of teaching-related continuing professional development (CPD) in higher education (HE). It focuses on capturing the longer-term value and impact of CPD for teachers and learners, and moving away from immediate satisfaction measures. It is informed by the literature on evaluating higher…

  14. Development of an evidence-informed leisure time physical activity resource for adults with spinal cord injury: the SCI Get Fit Toolkit.

    Science.gov (United States)

    Arbour-Nicitopoulos, K P; Martin Ginis, K A; Latimer-Cheung, A E; Bourne, C; Campbell, D; Cappe, S; Ginis, S; Hicks, A L; Pomerleau, P; Smith, K

    2013-06-01

    To systematically develop an evidence-informed leisure time physical activity (LTPA) resource for adults with spinal cord injury (SCI). Canada. The Appraisal of Guidelines, Research and Evaluation (AGREE) II protocol was used to develop a toolkit to teach and encourage adults with SCI how to make smart and informed choices about being physically active. A multidisciplinary expert panel appraised the evidence and generated specific recommendations for the content of the toolkit. Pilot testing was conducted to refine the toolkit's presentation. Recommendations emanating from the consultation process were that the toolkit be a brief, evidence-based resource that contains images of adults with tetraplegia and paraplegia, and links to more detailed online information. The content of the toolkit should include the physical activity guidelines (PAGs) for adults with SCI, activities tailored to manual and power chair users, the benefits of LTPA, and strategies to overcome common LTPA barriers for adults with SCI. The inclusion of action plans and safety tips was also recommended. These recommendations have resulted in the development of an evidence-informed LTPA resource to assist adults with SCI in meeting the PAGs. This toolkit will have important implications for consumers, health care professionals and policy makers for encouraging LTPA in the SCI community.

  15. Interception of LPI radar signals

    Science.gov (United States)

    Lee, Jim P.

    1991-11-01

    Most current radars are designed to transmit short duration pulses with relatively high peak power. These radars can be detected easily by the use of relatively modest EW intercept receivers. Three radar functions (search, anti-ship missile (ASM) seeker, and navigation) are examined to evaluate the effectiveness of potential low probability of intercept (LPI) techniques, such as waveform coding, antenna profile control, and power management that a radar may employ against current Electronic Warfare (EW) receivers. The general conclusion is that it is possible to design a LPI radar which is effective against current intercept EW receivers. LPI operation is most easily achieved at close ranges and against a target with a large radar cross section. The general system sensitivity requirement for the detection of current and projected LPI radars is found to be on the order of -100 dBmi which cannot be met by current EW receivers. Finally, three potential LPI receiver architectures, using channelized, superhet, and acousto-optic receivers with narrow RF and video bandwidths are discussed. They have shown some potential in terms of providing the sensitivity and capability in an environment where both conventional and LPI signals are present.
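
    The quoted sensitivity can be related to intercept range through the standard one-way intercept range equation, a textbook relation rather than a result of this report:

$$ R_i = \sqrt{\frac{P_t\, G_t\, G_i\, \lambda^2}{(4\pi)^2\, S_i\, L_i}} $$

    Here $P_t$ is the radar's peak transmitted power, $G_t$ its antenna gain toward the intercept receiver, $G_i$ the intercept antenna gain, $\lambda$ the wavelength, $S_i$ the intercept receiver sensitivity, and $L_i$ the losses. Lowering peak power and suppressing sidelobes shrink $R_i$ directly, while waveform coding lets the radar retain detection range through processing gain, which is the essence of the LPI techniques assessed above.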

  16. Air and spaceborne radar systems an introduction

    CERN Document Server

    Lacomme, Philippe; Hardange, Jean-Philippe; Normant, Eric

    2001-01-01

    A practical tool on radar systems that will be of major help to technicians, student engineers and engineers working in industry and in radar research and development. The many users of radar as well as systems engineers and designers will also find it highly useful. Also of interest to pilots and flight engineers and military command personnel and military contractors. "This introduction to the field of radar is intended for actual users of radar. It focuses on the history, main principles, functions, modes, properties and specific nature of modern airborne radar. The book examines radar's

  17. OpenADR Open Source Toolkit: Developing Open Source Software for the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    McParland, Charles

    2011-02-01

    Demand response (DR) is becoming an increasingly important part of power grid planning and operation. The advent of the Smart Grid, which mandates its use, further motivates selection and development of suitable software protocols to enable DR functionality. The OpenADR protocol has been developed and is being standardized to serve this goal. We believe that the development of a distributable, open source implementation of OpenADR will benefit this effort and motivate critical evaluation of its capabilities, by the wider community, for providing wide-scale DR services
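
    To make the notion of a dispatched DR event concrete, the sketch below shows a toy handler for a simplified event message. The fields and the load-shed rule are purely illustrative assumptions and are not the OpenADR schema, which is an XML-based specification with far richer semantics.

```python
# Toy demand-response event handler. The dictionary-like fields below are
# hypothetical simplifications and are NOT the actual OpenADR schema.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class DREvent:
    event_id: str
    start: datetime
    duration: timedelta
    shed_level_kw: float


def handle_event(event: DREvent, current_load_kw: float) -> float:
    """Return the target load during the event window (simple load-shed rule)."""
    return max(0.0, current_load_kw - event.shed_level_kw)


event = DREvent("evt-001", datetime(2011, 2, 1, 14, 0), timedelta(hours=2), 50.0)
print(handle_event(event, current_load_kw=320.0))   # -> 270.0
```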

  18. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  19. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  20. Human walking estimation with radar

    NARCIS (Netherlands)

    Dorp, Ph. van; Groen, F.C.A.

    2003-01-01

    Radar can be used to observe humans that are obscured by objects such as walls. These humans cannot be visually observed. The radar measurements are used to animate an obscured human in virtual reality. This requires detailed information about the motion. The radar measurements give detailed

  1. Software development based on high speed PC oscilloscope for automated pulsed magnetic field measurement system

    International Nuclear Information System (INIS)

    Sun Yuxiang; Shang Lei; Li Ji; Ge Lei

    2011-01-01

    This paper introduces a software development method based on a high-speed PC oscilloscope for an automated pulsed magnetic field measurement system. The design improves on the previous one; a high-speed virtual oscilloscope has been used in this field for the first time. The design realizes automatic data acquisition, data processing, data analysis, and storage. Automated point checking reduces the workload, and the use of a precise motion bench increases the positioning accuracy. The software acquires data from the PC oscilloscope by calling DLLs and provides oscilloscope functions such as trigger, range, and sample-rate settings. Spline interpolation and a bandstop filter are used to denoise the signals. The core of the software is a state machine that controls the motion of the stepper motors, performs data acquisition, and stores the data automatically. NI Vision Acquisition Software and the Database Connectivity Toolkit provide video surveillance of the laboratory and MySQL database connectivity. The raw and processed signals are compared in this paper; the waveform is greatly improved by the signal processing. (authors)
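
    The denoising chain mentioned above, spline smoothing followed by a bandstop filter, can be reproduced with standard SciPy tools. The Python sketch below is a generic illustration with made-up sampling and stop-band parameters, not the LabVIEW implementation described in the record.

```python
# Generic denoising sketch: smoothing spline followed by a Butterworth bandstop
# filter. Parameters (sample rate, stop band) are placeholders, and this is not
# the LabVIEW implementation used in the measurement system.
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.signal import butter, filtfilt

fs = 10_000.0                                    # assumed sample rate, Hz
t = np.arange(0.0, 0.1, 1.0 / fs)
signal = np.exp(-t / 0.02) * np.sin(2 * np.pi * 120 * t)   # pulsed-field-like decay
noisy = signal + 0.05 * np.random.randn(t.size) + 0.02 * np.sin(2 * np.pi * 1000 * t)

# Smoothing spline removes broadband jitter.
spline = UnivariateSpline(t, noisy, s=0.5)
smoothed = spline(t)

# Bandstop (notch-like) filter removes narrowband interference near 1 kHz.
b, a = butter(4, [900.0, 1100.0], btype="bandstop", fs=fs)
cleaned = filtfilt(b, a, smoothed)
```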

  2. FMWC Radar for Breath Detection

    DEFF Research Database (Denmark)

    Suhr, Lau Frejstrup; Tafur Monroy, Idelfonso; Vegas Olmos, Juan José

    We report on the experimental demonstration of an FMCW radar operating in the 25.7 - 26.6 GHz range with a repetition rate of 500 sweeps per second. The radar is able to track the breathing rate of an adult human from a distance of 1 meter. The experiments utilized a 50 second recording window ... to accurately track the breathing rate. The radar utilizes a sawtooth modulation format and a low latency receiver. A breath tracking radar is useful both in medical scenarios, diagnosing disorders such as sleep apnea, and for home use where users can monitor their health. Breathing is a central part of every ... radar chip which, through the use of a simple modulation scheme, is able to measure the breathing rate of an adult human from a distance. A high frequency output ensures that the radar cannot penetrate solid obstacles, which is a wanted feature in private homes where people therefore cannot measure...
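
    Once the slow-time signal at the subject's range bin is available, the breathing rate is essentially the dominant low-frequency spectral peak. The Python sketch below illustrates that final step on synthetic data; it is not the authors' processing chain, and the breathing band limits are assumptions.

```python
# Estimate breathing rate from a slow-time signal by locating the dominant
# low-frequency spectral peak. Synthetic data; not the authors' processing chain.
import numpy as np

sweep_rate = 500.0                                # sweeps per second (slow-time rate)
t = np.arange(0, 50.0, 1.0 / sweep_rate)          # 50 s recording window
breath_hz = 0.25                                  # ~15 breaths per minute
slow_time = np.sin(2 * np.pi * breath_hz * t) + 0.3 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(slow_time - slow_time.mean()))
freqs = np.fft.rfftfreq(t.size, d=1.0 / sweep_rate)

band = (freqs > 0.1) & (freqs < 0.7)              # plausible breathing band, Hz
estimated_hz = freqs[band][np.argmax(spectrum[band])]
print(f"estimated breathing rate: {estimated_hz * 60:.1f} breaths/min")
```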

  3. Radar meteor rates and solar activity

    International Nuclear Information System (INIS)

    Prikryl, P.

    1983-01-01

    The short-term variation of diurnal radar meteor rates with solar activity, represented by the solar microwave flux F10.7 and the relative sunspot number Rz, is investigated. Applying superposed-epoch analysis to the observational material of radar meteor rates from Christchurch (1960-61 and 1963-65), a decrease in the recorded radar rates is found during days of enhanced solar activity. No effect of geomagnetic activity similar to the one reported for the Swedish and Canadian radar meteor data was found by the author in the Christchurch data. A possible explanation of the absence of the geomagnetic effect in the New Zealand radar meteor rates, due to the lower echo ceiling height of the Christchurch radar, is suggested. The variation of the atmospheric parameters as a possible cause of the observed variation in radar meteor rates is also discussed. (author)
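
    Superposed-epoch analysis simply aligns segments of a time series on a set of key days and averages them. A minimal Python sketch is shown below, with synthetic numbers standing in for the Christchurch meteor rates.

```python
# Minimal superposed-epoch analysis: average windows of a daily series centred
# on a list of key days. Synthetic data stands in for the radar meteor rates.
import numpy as np

rng = np.random.default_rng(0)
daily_rates = 100 + 10 * rng.standard_normal(1000)     # synthetic daily meteor rates
key_days = [120, 340, 560, 780]                        # e.g. days of enhanced solar activity
half_window = 10                                       # days before/after each key day

segments = [daily_rates[d - half_window:d + half_window + 1]
            for d in key_days
            if d - half_window >= 0 and d + half_window < daily_rates.size]
superposed = np.mean(segments, axis=0)                 # epoch -10 ... +10 days
lags = np.arange(-half_window, half_window + 1)
```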

  4. Measuring acceptance of an assistive social robot: a suggested toolkit

    NARCIS (Netherlands)

    Heerink, M.; Kröse, B.; Evers, V.; Wielinga, B.

    2009-01-01

    The human robot interaction community is multidisciplinary by nature and has members from social science to engineering backgrounds. In this paper we aim to provide human robot developers with a straightforward toolkit to evaluate users' acceptance of assistive social robots they are designing or

  5. Improving Radar QPE's in Complex Terrain for Improved Flash Flood Monitoring and Prediction

    Science.gov (United States)

    Cifelli, R.; Streubel, D. P.; Reynolds, D.

    2010-12-01

    Quantitative Precipitation Estimation (QPE) is extremely challenging in regions of complex terrain due to a combination of issues related to sampling. In particular, radar beams are often blocked or scan above the liquid precipitation zone while rain gauge density is often too low to properly characterize the spatial distribution of precipitation. Due to poor radar coverage, rain gauge networks are used by the National Weather Service (NWS) River Forecast Centers as the principal source for QPE across the western U.S. The California Nevada River Forecast Center (CNRFC) uses point rainfall measurements and historical rainfall runoff relationships to derive river stage forecasts. The point measurements are interpolated to a 4 km grid using Parameter-elevation Regressions on Independent Slopes Model (PRISM) data to develop a gridded 6-hour QPE product (hereafter referred to as RFC QPE). Local forecast offices can utilize the Multi-sensor Precipitation Estimator (MPE) software to improve local QPE’s and thus local flash flood monitoring and prediction. MPE uses radar and rain gauge data to develop a combined QPE product at 1-hour intervals. The rain gauge information is used to bias correct the radar precipitation estimates so that, in situations where the rain gauge density and radar coverage are adequate, MPE can take advantage of the spatial coverage of the radar and the “ground truth” of the rain gauges to provide an accurate QPE. The MPE 1-hour QPE analysis should provide better spatial and temporal resolution for short duration hydrologic events as compared to 6-hour analyses. These hourly QPEs are then used to correct radar derived rain rates used by the Flash Flood Monitoring and Prediction (FFMP) software in forecast offices for issuance of flash flood warnings. Although widely used by forecasters across the eastern U.S., MPE is not used extensively by the NWS in the west. Part of the reason for the lack of use of MPE across the west is that there has
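
    The gauge-based adjustment at the heart of MPE can be illustrated with a simple mean-field bias correction, in which the ratio of summed gauge to summed collocated radar accumulations rescales the whole radar field. The Python sketch below is a schematic simplification of that idea, not the MPE algorithm itself.

```python
# Schematic mean-field bias correction of a radar QPE field using rain gauges.
# This is a simplification for illustration, not the MPE algorithm itself.
import numpy as np

radar_qpe = np.random.gamma(shape=2.0, scale=1.5, size=(100, 100))   # mm, gridded
gauge_ij = [(10, 12), (45, 60), (80, 33)]                            # gauge grid cells
gauge_mm = np.array([4.1, 2.3, 6.0])                                 # gauge accumulations

radar_at_gauges = np.array([radar_qpe[i, j] for i, j in gauge_ij])
bias = gauge_mm.sum() / max(radar_at_gauges.sum(), 1e-6)             # mean-field bias

corrected_qpe = bias * radar_qpe
```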

  6. Radar cross section

    CERN Document Server

    Knott, Gene; Tuley, Michael

    2004-01-01

    This is the second edition of the first and foremost book on this subject for self-study, training, and course work. Radar cross section (RCS) is a comparison of two radar signal strengths. One is the strength of the radar beam sweeping over a target, the other is the strength of the reflected echo sensed by the receiver. This book shows how the RCS ?gauge? can be predicted for theoretical objects and how it can be measured for real targets. Predicting RCS is not easy, even for simple objects like spheres or cylinders, but this book explains the two ?exact? forms of theory so well that even a

  7. Classification and correction of the radar bright band with polarimetric radar

    Science.gov (United States)

    Hall, Will; Rico-Ramirez, Miguel; Kramer, Stefan

    2015-04-01

    The annular region of enhanced radar reflectivity, known as the Bright Band (BB), occurs when the radar beam intersects a layer of melting hydrometeors. Radar reflectivity is related to rainfall through a power law equation and so this enhanced region can lead to overestimations of rainfall by a factor of up to 5, so it is important to correct for this. The BB region can be identified by using several techniques including hydrometeor classification and freezing level forecasts from mesoscale meteorological models. Advances in dual-polarisation radar measurements and continued research in the field has led to increased accuracy in the ability to identify the melting snow region. A method proposed by Kitchen et al (1994), a form of which is currently used operationally in the UK, utilises idealised Vertical Profiles of Reflectivity (VPR) to correct for the BB enhancement. A simpler and more computationally efficient method involves the formation of an average VPR from multiple elevations for correction that can still cause a significant decrease in error (Vignal 2000). The purpose of this research is to evaluate a method that relies only on analysis of measurements from an operational C-band polarimetric radar without the need for computationally expensive models. Initial results show that LDR is a strong classifier of melting snow with a high Critical Success Index of 97% when compared to the other variables. An algorithm based on idealised VPRs resulted in the largest decrease in error when BB corrected scans are compared to rain gauges and to lower level scans with a reduction in RMSE of 61% for rain-rate measurements. References Kitchen, M., R. Brown, and A. G. Davies, 1994: Real-time correction of weather radar data for the effects of bright band, range and orographic growth in widespread precipitation. Q.J.R. Meteorol. Soc., 120, 1231-1254. Vignal, B. et al, 2000: Three methods to determine profiles of reflectivity from volumetric radar data to correct
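
    The average-VPR approach mentioned above amounts to subtracting the mean profile's enhancement (in dB) at each height so that the bright-band peak is removed before converting reflectivity to rain rate. The Python sketch below is schematic, with an invented mean profile, and is not the algorithm evaluated in the study.

```python
# Schematic average-VPR bright-band correction: remove the melting-layer
# enhancement given by the mean vertical profile of reflectivity.
# Illustrative only; not the algorithm evaluated in the study.
import numpy as np

heights_km = np.arange(0.5, 6.0, 0.5)
# Mean VPR in dB relative to the lowest level; the peak near 2.5 km mimics the BB.
mean_vpr_db = np.array([0.0, 0.5, 1.0, 4.0, 6.5, 4.0, 1.0, 0.0, -1.0, -2.0, -3.0])

def correct_reflectivity(dbz, height_km):
    """Remove the bright-band enhancement using the mean VPR at this height."""
    enhancement_db = np.interp(height_km, heights_km, mean_vpr_db)
    return dbz - enhancement_db

# A 35 dBZ measurement inside the bright band maps back to ~28.5 dBZ.
print(correct_reflectivity(35.0, height_km=2.5))
```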

  8. 46 CFR 184.404 - Radars.

    Science.gov (United States)

    2010-10-01

    ... within one mile of land must be fitted with a Federal Communications Commission (FCC) type-accepted general marine radar system for surface navigation.

  9. Radar network communication through sensing of frequency hopping

    Science.gov (United States)

    Dowla, Farid; Nekoogar, Faranak

    2013-05-28

    In one embodiment, a radar communication system includes a plurality of radars having a communication range and being capable of operating at a sensing frequency and a reporting frequency, wherein the reporting frequency is different than the sensing frequency, each radar is adapted for operating at the sensing frequency until an event is detected, each radar in the plurality of radars has an identification/location frequency for reporting information different from the sensing frequency, a first radar of the radars which senses the event sends a reporting frequency corresponding to its identification/location frequency when the event is detected, and all other radars in the plurality of radars switch their reporting frequencies to match the reporting frequency of the first radar upon detecting the reporting frequency switch of a radar within the communication range. In another embodiment, a method is presented for communicating information in a radar system.

  10. Sensor management in RADAR/IRST track fusion

    Science.gov (United States)

    Hu, Shi-qiang; Jing, Zhong-liang

    2004-07-01

    In this paper, a novel radar management strategy suitable for RADAR/IRST track fusion, based on the Fisher Information Matrix (FIM) and a fuzzy stochastic decision approach, is put forward. First, an optimal schedule of radar measurements is obtained by maximizing the determinant of the Fisher information matrix of the radar and IRST measurements, managed by an expert system. A "pseudo sensor" is then introduced to predict the possible target position with a polynomial method based on the radar and IRST measurements; the "pseudo sensor" model estimates the target position even when the radar is turned off. Finally, based on the tracking performance and the state of the target maneuver, a fuzzy stochastic decision is used to adjust the optimal radar schedule and update the model parameters of the "pseudo sensor". The experimental results indicate that the algorithm not only limits radar activity effectively but also maintains the tracking accuracy of the active/passive system, eliminating the drawback of traditional radar management methods in which radar activity is fixed and not easy to control and protect.

  11. An Ethical Toolkit for Food Companies: Reflection on its Use

    NARCIS (Netherlands)

    Deblonde, M.K.; Graaff, R.; Brom, F.W.A.

    2007-01-01

    Nowadays many debates are going on that relate to the agricultural and food sector. It looks as if present technological and organizational developments within the agricultural and food sector are badly geared to societal needs and expectations. In this article we briefly present a toolkit for moral

  12. ISTEF Laser Radar Program

    National Research Council Canada - National Science Library

    Stryjewski, John

    1998-01-01

    The BMDO Innovative Science and Technology Experimentation Facility (BMDO/ISTEF) laser radar program is engaged in an ongoing program to develop and demonstrate advanced laser radar concepts for Ballistic Missile Defense (BMD...

  13. Toolkit for US colleges/schools of pharmacy to prepare learners for careers in academia.

    Science.gov (United States)

    Haines, Seena L; Summa, Maria A; Peeters, Michael J; Dy-Boarman, Eliza A; Boyle, Jaclyn A; Clifford, Kalin M; Willson, Megan N

    2017-09-01

    The objective of this article is to provide an academic toolkit for use by colleges/schools of pharmacy to prepare student pharmacists/residents for academic careers. Through the American Association of Colleges of Pharmacy (AACP) Section of Pharmacy Practice, the Student Resident Engagement Task Force (SRETF) collated teaching materials used by colleges/schools of pharmacy from a previously reported national survey. The SRETF developed a toolkit for student pharmacists/residents interested in academic pharmacy. Eighteen institutions provided materials; five provided materials describing didactic coursework; over fifteen provided materials for an academia-focused Advanced Pharmacy Practice Experience (APPE), while one provided materials for an APPE teaching-research elective. SRETF members created a syllabus template and sample lesson plan by integrating submitted resources. Submissions still needed to complete the toolkit include examples of curricular tracks and certificate programs. Pharmacy faculty vacancies still exist in pharmacy education. Engaging student pharmacists/residents with the academic pillars of teaching, scholarship, and service is critical for the future success of the academy. Published by Elsevier Inc.

  14. Advances in directional borehole radar data analysis and visualization

    Science.gov (United States)

    Smith, D.V.G.; Brown, P.J.

    2002-01-01

    The U.S. Geological Survey is developing a directional borehole radar (DBOR) tool for mapping fractures and lithologic changes and for detecting underground utilities and voids. An important part of the development of the DBOR tool is data analysis and visualization, with the aim of making the software graphical user interface (GUI) intuitive and easy to use. The DBOR software system consists of a suite of signal and image processing routines written in Research Systems' Interactive Data Language (IDL). The software also serves as a front-end to many widely accepted Colorado School of Mines Center for Wave Phenomena (CWP) Seismic UNIX (SU) algorithms (Cohen and Stockwell, 2001). Although the SU collection runs natively in a UNIX environment, our system seamlessly emulates a UNIX session within a widely used PC operating system (Microsoft Windows) using GNU tools (Noer, 1998). Examples are presented of laboratory data acquired with the prototype tool from two different experimental settings. The first experiment imaged plastic pipes in a macro-scale sand tank. The second experiment monitored the progress of an invasion front resulting from oil injection. Finally, challenges to further development and planned future work are discussed.

  15. A Toolkit For Storage Qos Provisioning For Data-Intensive Applications

    Directory of Open Access Journals (Sweden)

    Renata Słota

    2012-01-01

    Full Text Available This paper describes a programming toolkit developed in the PL-Grid project, named QStorMan, which supports storage QoS provisioning for data-intensive applications in distributed environments. QStorMan exploits knowledge-oriented methods for matching storage resources to non-functional requirements, which are defined for a data-intensive application. In order to support various usage scenarios, QStorMan provides two interfaces: a set of programming libraries and a web portal. The interfaces allow the requirements to be defined either directly in an application's source code or by using an intuitive graphical interface. The first way provides finer granularity, e.g., each portion of data processed by an application can define a different set of requirements. The second method is aimed at supporting legacy applications, whose source code cannot be modified. The toolkit has been evaluated using synthetic benchmarks and the production infrastructure of PL-Grid, in particular its storage infrastructure, which utilizes the Lustre file system.

  16. Srijan: a graphical toolkit for sensor network macroprogramming

    OpenAIRE

    Pathak , Animesh; Gowda , Mahanth K.

    2009-01-01

    International audience; Macroprogramming is an application development technique for wireless sensor networks (WSNs) where the developer specifies the behavior of the system, as opposed to that of the constituent nodes. In this proposed demonstration, we would like to present Srijan, a toolkit that enables application development for WSNs in a graphical manner using data-driven macroprogramming. It can be used in various stages of application development, viz. i) specification of application ...

  17. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    Furutaka, Kazuyoshi

    2015-02-01

    A suite of software tools has been developed to facilitate the development of apparatus using the radiation transport simulation code PHITS by enabling 4D visualization (3D space and time) and quantitative analysis of so-called die-away plots. To deliver usable tools as soon as possible, existing software was utilized as much as possible; ParaView will be used for the 4D visualization of the results, whereas the analysis of die-away plots will be done with the ROOT toolkit using a tool named "diana". To enable 4D visualization using ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed to convert the data format to one that can be read by ParaView and to ease the visualization. (author)
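
    A die-away plot is typically summarized by fitting an exponential decay to the time histogram of counts. As a rough illustration of the kind of quantitative analysis the "diana" tool targets (the tool itself is built on ROOT), the Python sketch below fits a single-exponential die-away curve to synthetic data.

```python
# Fit a single-exponential die-away curve N(t) = A * exp(-t / tau) + B to a
# synthetic time histogram. Rough illustration only; the "diana" tool itself
# is built on ROOT, and these numbers are made up.
import numpy as np
from scipy.optimize import curve_fit

def dieaway(t, amplitude, tau, background):
    return amplitude * np.exp(-t / tau) + background

t_us = np.linspace(0.0, 2000.0, 200)                       # time after pulse, microseconds
true = dieaway(t_us, amplitude=500.0, tau=300.0, background=5.0)
counts = np.random.poisson(true).astype(float)

popt, pcov = curve_fit(dieaway, t_us, counts, p0=[400.0, 200.0, 1.0])
amplitude, tau, background = popt
print(f"fitted die-away time constant: {tau:.1f} us")
```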

  18. Mocapy++ - a toolkit for inference and learning in dynamic Bayesian networks

    DEFF Research Database (Denmark)

    Paluszewski, Martin; Hamelryck, Thomas Wim

    2010-01-01

    Background Mocapy++ is a toolkit for parameter learning and inference in dynamic Bayesian networks (DBNs). It supports a wide range of DBN architectures and probability distributions, including distributions from directional statistics (the statistics of angles, directions and orientations...

  19. Radar and electronic navigation

    CERN Document Server

    Sonnenberg, G J

    2013-01-01

    Radar and Electronic Navigation, Sixth Edition discusses radar in marine navigation, underwater navigational aids, direction finding, the Decca navigator system, and the Omega system. The book also describes the Loran system for position fixing, the navy navigation satellite system, and the global positioning system (GPS). It reviews the principles, operation, presentations, specifications, and uses of radar. It also describes GPS, a real time position-fixing system in three dimensions (longitude, latitude, altitude), plus velocity information with Universal Time Coordinated (UTC). It is accur

  20. Weather Radar Impact Zones

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — These data represent an inventory of the national impacts of wind turbine interference with NEXRAD radar stations. This inventory was developed by the NOAA Radar...

  1. Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS

    Science.gov (United States)

    Long, S. M.; Grosfils, E. B.

    2005-01-01

    Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted based on pixel latitude back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.

  2. ONKALO EDZ-measurements using ground penetrating radar (GPR) method

    Energy Technology Data Exchange (ETDEWEB)

    Silvast, M.; Wiljanen, B. (Roadscanners Oy, Rovaniemi (Finland))

    2008-09-15

    This report presents pilot project results from various Ground Penetrating Radar (GPR) tests performed on bedrock in ONKALO, the research tunnel system being built for the final disposal of spent nuclear fuel (in Finland). In recent years the GPR technology for structure inspection has improved to faster systems and higher frequencies. Processing and interpretation software has been developed for better visualization of processed data. GPR is a powerful non-destructive testing method with major advantages such as fast measurement speed and continuous survey lines. The purpose of the tests was to determine the capacity of GPR in identifying the Excavation Damaged or Disturbed Zone (EDZ). Topics included comparison of different types of GPR systems and antennas in select locations in the tunnel system and data presentation. High quality GPR data was obtained from all systems that were used on surfaces without concrete or steel reinforcement. Data processed using Geo Doctor software, which enables integrated analysis of available datasets on a single screen, provided promising results. (orig.)

  3. ONKALO EDZ-measurements using ground penetrating radar (GPR) method

    International Nuclear Information System (INIS)

    Silvast, M.; Wiljanen, B.

    2008-09-01

    This report presents pilot project results from various Ground Penetrating Radar (GPR) tests performed on bedrock in ONKALO, the research tunnel system being built for the final disposal of spent nuclear fuel (in Finland). In recent years the GPR technology for structure inspection has improved to faster systems and higher frequencies. Processing and interpretation software has been developed for better visualization of processed data. GPR is a powerful non-destructive testing method with major advantages such as fast measurement speed and continuous survey lines. The purpose of the tests was to determine the capacity of GPR in identifying the Excavation Damaged or Disturbed Zone (EDZ). Topics included comparison of different types of GPR systems and antennas in select locations in the tunnel system and data presentation. High quality GPR data was obtained from all systems that were used on surfaces without concrete or steel reinforcement. Data processed using Geo Doctor software, which enables integrated analysis of available datasets on a single screen, provided promising results. (orig.)

  4. Pocket radar guide key facts, equations, and data

    CERN Document Server

    Curry, G Richard

    2010-01-01

    The Pocket Radar Guide is a concise collection of key radar facts and important radar data that provides you with necessary radar information when you are away from your office or references. It includes statements and comments on radar design, operation, and performance; equations describing the characteristics and performance of radar systems and their components; and tables with data on radar characteristics and key performance issues. It is intended to supplement other radar information sources by providing a pocket companion to refresh memory and provide details whenever you need them such a

  5. Blending of Radial HF Radar Surface Current and Model Using ETKF Scheme For The Sunda Strait

    Science.gov (United States)

    Mujiasih, Subekti; Riyadi, Mochammad; Wandono, Dr; Wayan Suardana, I.; Nyoman Gede Wiryajaya, I.; Nyoman Suarsa, I.; Hartanto, Dwi; Barth, Alexander; Beckers, Jean-Marie

    2017-04-01

    A preliminary study of blending surface current data for the Sunda Strait, Indonesia, has been carried out using the analysis scheme of the Ensemble Transform Kalman Filter (ETKF). The method is used to combine radial velocities from HF radar with the u and v velocity components from the global Copernicus Marine Environment Monitoring Service (CMEMS) model. The initial ensemble is based on the time variability of the CMEMS model result. The data tested come from two CODAR SeaSonde radar sites in the Sunda Strait on two dates, 09 September 2013 and 08 February 2016, at 12.00 UTC. The radial HF radar data have an hourly temporal resolution, a 20-60 km spatial range, 3 km range resolution, 5 degree angular and spatial resolution, and an 11.5-14 MHz frequency range. The u and v components of the model velocity represent a daily mean with 1/12 degree spatial resolution. The radial data from one HF radar site are analyzed and the result is compared to the equivalent radial velocity from CMEMS for the second HF radar site. Errors are quantified with the root mean squared error (RMSE). The ensemble analysis and ensemble mean are calculated using the Sangoma software package. The observation error covariance matrix R is taken as a diagonal matrix with diagonal elements equal to 0.05, 0.5, or 1.0 m2/s2. The initial ensemble members come from a model simulation spanning one month (September 2013 or February 2016), one year (2013), or four years (2013-2016). The spatial distribution of the radial current is analyzed and the RMSE values obtained from the independent HF radar station are optimized. It was verified that the analysis reproduces well the structure contained in the analyzed HF radar data. More importantly, the analysis was also improved relative to the second, independent HF radar site; the RMSE of the improved analysis is better than that of the analysis using the first HF radar site alone. The best result of the blending exercise was obtained for an observation error variance equal to 0.05 m2/s2. This study is
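
    The analysis step can be pictured as an ensemble Kalman update in which radial HF-radar observations correct each ensemble member drawn from the model. The Python sketch below uses a simplified perturbed-observation EnKF rather than the deterministic ETKF transform applied in the study; dimensions and values are arbitrary placeholders.

```python
# Simplified perturbed-observation ensemble Kalman update for blending radial
# HF-radar velocities with model ensemble states. This is NOT the ETKF transform
# used in the study; dimensions and numbers are arbitrary placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_state, n_ens, n_obs = 200, 30, 40

X = rng.standard_normal((n_state, n_ens))          # ensemble of model states (u, v stacked)
H = np.zeros((n_obs, n_state))                     # maps state to radial velocities
H[np.arange(n_obs), rng.choice(n_state, n_obs, replace=False)] = 1.0
R = 0.05 * np.eye(n_obs)                           # obs error covariance (0.05 m^2/s^2)
y = rng.standard_normal(n_obs)                     # radial HF-radar observations

Xm = X.mean(axis=1, keepdims=True)
A = X - Xm                                         # ensemble anomalies
HA = H @ A
P_yy = HA @ HA.T / (n_ens - 1) + R
K = (A @ HA.T / (n_ens - 1)) @ np.linalg.inv(P_yy) # Kalman gain

# Perturbed observations give each member its own innovation.
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, size=n_ens).T
X_analysis = X + K @ (Y - H @ X)
```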

  6. Synthetic aperture radar capabilities in development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, M. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

The Imaging and Detection Program (IDP) within the Laser Program is currently developing an X-band Synthetic Aperture Radar (SAR) to support the Joint US/UK Radar Ocean Imaging Program. The radar system will be mounted in the program's Airborne Experimental Test-Bed (AETB), where the initial mission is to image ocean surfaces and better understand the physics of low grazing angle backscatter. The Synthetic Aperture Radar presentation will discuss the system's overall functionality and briefly review the AETB's capabilities. Vital subsystems including radar, computer, navigation, antenna stabilization, and SAR focusing algorithms will be examined in more detail.

  7. Extended Target Recognition in Cognitive Radar Networks

    Directory of Open Access Journals (Sweden)

    Xiqin Wang

    2010-11-01

We address the problem of adaptive waveform design for extended target recognition in cognitive radar networks. A closed-loop active target recognition radar system is extended to the case of a centralized cognitive radar network, in which a generalized likelihood ratio (GLR) based sequential hypothesis testing (SHT) framework is employed. Using Doppler velocities measured by multiple radars, the target aspect angle for each radar is calculated. The joint probability of each target hypothesis is then updated using observations from the different radar lines of sight (LOS). Based on these probabilities, a minimum correlation algorithm is proposed to adaptively design the transmit waveform for each radar in an amplitude-fluctuation situation. Simulation results demonstrate performance improvements due to the cognitive radar network and adaptive waveform design. Our minimum correlation algorithm outperforms the eigen-waveform solution and other non-cognitive waveform design approaches.
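
    The abstract names a minimum correlation criterion but not its implementation; the following is a loose, hypothetical rendering of the idea (all names are invented), choosing the candidate waveform whose predicted echoes across the competing target hypotheses are least mutually correlated, so the hypotheses are easiest to separate.

    ```python
    import numpy as np

    def pick_min_correlation_waveform(waveforms, target_irs):
        """Return the index of the candidate waveform whose normalized echoes,
        one per hypothesized target impulse response, have the smallest
        worst-case pairwise correlation. Assumes at least two hypotheses and
        impulse responses of equal length."""
        best_idx, best_score = None, np.inf
        for i, w in enumerate(waveforms):
            echoes = [np.convolve(w, h) for h in target_irs]
            echoes = [e / (np.linalg.norm(e) + 1e-12) for e in echoes]
            n = len(echoes)
            score = max(abs(np.dot(echoes[a], echoes[b]))
                        for a in range(n) for b in range(a + 1, n))
            if score < best_score:
                best_idx, best_score = i, score
        return best_idx
    ```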

  8. Detection of Weather Radar Clutter

    DEFF Research Database (Denmark)

    Bøvith, Thomas

    2008-01-01

Weather radars provide valuable information on precipitation in the atmosphere, but due to the way radars work, not only precipitation is observed by the weather radar. Weather radar clutter, echoes from non-precipitating targets, occurs frequently in the data, resulting in lowered data quality. Especially in the application of weather radar data to quantitative precipitation estimation and forecasting, high data quality is important, and clutter detection is one of the key components in achieving this goal. This thesis presents three methods for detection of clutter. The methods use supervised classification and a range of different techniques and input data. The first method uses external information from multispectral satellite images to detect clutter: the information in the visual, near-infrared, and infrared parts of the spectrum can be used to distinguish between cloud and cloud-free areas.

  9. Scanning Cloud Radar Observations at Azores: Preliminary 3D Cloud Products

    Energy Technology Data Exchange (ETDEWEB)

    Kollias, P.; Johnson, K.; Jo, I.; Tatarevic, A.; Giangrande, S.; Widener, K.; Bharadwaj, N.; Mead, J.

    2010-03-15

    The deployment of the Scanning W-Band ARM Cloud Radar (SWACR) during the AMF campaign at Azores signals the first deployment of an ARM Facility-owned scanning cloud radar and offers a prelude for the type of 3D cloud observations that ARM will have the capability to provide at all the ARM Climate Research Facility sites by the end of 2010. The primary objective of the deployment of Scanning ARM Cloud Radars (SACRs) at the ARM Facility sites is to map continuously (operationally) the 3D structure of clouds and shallow precipitation and to provide 3D microphysical and dynamical retrievals for cloud life cycle and cloud-scale process studies. This is a challenging task, never attempted before, and requires significant research and development efforts in order to understand the radar's capabilities and limitations. At the same time, we need to look beyond the radar meteorology aspects of the challenge and ensure that the hardware and software capabilities of the new systems are utilized for the development of 3D data products that address the scientific needs of the new Atmospheric System Research (ASR) program. The SWACR observations at Azores provide a first look at such observations and the challenges associated with their analysis and interpretation. The set of scan strategies applied during the SWACR deployment and their merit is discussed. The scan strategies were adjusted for the detection of marine stratocumulus and shallow cumulus that were frequently observed at the Azores deployment. Quality control procedures for the radar reflectivity and Doppler products are presented. Finally, preliminary 3D-Active Remote Sensing of Cloud Locations (3D-ARSCL) products on a regular grid will be presented, and the challenges associated with their development discussed. In addition to data from the Azores deployment, limited data from the follow-up deployment of the SWACR at the ARM SGP site will be presented. This effort provides a blueprint for the effort required

  10. Noise and LPI radar as part of counter-drone mitigation system measures

    Science.gov (United States)

    Zhang, Yan (Rockee); Huang, Yih-Ru; Thumann, Charles

    2017-05-01

With the rapid proliferation of small unmanned aerial systems (UAS) in the national airspace, small operational drones are sometimes considered a security threat for critical infrastructure such as sports stadiums, military facilities, and airports. Many civilian counter-drone solutions and products have been reported, including radar and electromagnetic counter measures. Current electromagnetic solutions are usually limited to a particular detection and counter-measure scheme that is effective only against specific types of drones. In addition, the control and communication link technologies used even in RC drones are now more sophisticated, making them more difficult to detect, decode and counter. Facing these challenges, our team proposes a "software-defined" solution based on noise and LPI radar. For detection, wideband noise radar has the resolution performance to discriminate possible micro-Doppler features of the drone versus biological scatterers. It also has the benefit of being more adaptable to different types of drones and of detecting covertly, which suits security applications. For counter-measures, random noise can be combined with a "random sweeping" jamming scheme to achieve the optimal balance between the allowed peak power and the effective jamming probability. Some theoretical analysis of the proposed solution is provided in this study, a design case study is developed, and initial laboratory experiments as well as outdoor tests are conducted to validate the basic concepts and theories. The study demonstrates the basic feasibility of the Drone Detection and Mitigation Radar (DDMR) concept, while much work remains to be done for a complete and field-worthy technology.
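
    A minimal sketch of generating the kind of band-limited noise waveform a noise radar might transmit (sampling rate, band edges, and the FFT-mask approach are illustrative choices, not taken from the paper).

    ```python
    import numpy as np

    def bandlimited_noise(n_samples, fs, f_lo, f_hi, seed=0):
        """Unit-RMS Gaussian noise confined to [f_lo, f_hi] Hz by zeroing
        spectral bins of a white-noise draw outside the band."""
        rng = np.random.default_rng(seed)
        spectrum = np.fft.rfft(rng.standard_normal(n_samples))
        freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
        spectrum[(freqs < f_lo) | (freqs > f_hi)] = 0.0
        y = np.fft.irfft(spectrum, n=n_samples)
        return y / np.std(y)

    # Example: a noise burst occupying 0.5-1.0 GHz at a 2 GHz sampling rate.
    waveform = bandlimited_noise(4096, fs=2e9, f_lo=0.5e9, f_hi=1.0e9)
    ```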

  11. SMAP RADAR Calibration and Validation

    Science.gov (United States)

    West, R. D.; Jaruwatanadilok, S.; Chaubel, M. J.; Spencer, M.; Chan, S. F.; Chen, C. W.; Fore, A.

    2015-12-01

    The Soil Moisture Active Passive (SMAP) mission launched on Jan 31, 2015. The mission employs L-band radar and radiometer measurements to estimate soil moisture with 4% volumetric accuracy at a resolution of 10 km, and freeze-thaw state at a resolution of 1-3 km. Immediately following launch, there was a three month instrument checkout period, followed by six months of level 1 (L1) calibration and validation. In this presentation, we will discuss the calibration and validation activities and results for the L1 radar data. Early SMAP radar data were used to check commanded timing parameters, and to work out issues in the low- and high-resolution radar processors. From April 3-13 the radar collected receive only mode data to conduct a survey of RFI sources. Analysis of the RFI environment led to a preferred operating frequency. The RFI survey data were also used to validate noise subtraction and scaling operations in the radar processors. Normal radar operations resumed on April 13. All radar data were examined closely for image quality and calibration issues which led to improvements in the radar data products for the beta release at the end of July. Radar data were used to determine and correct for small biases in the reported spacecraft attitude. Geo-location was validated against coastline positions and the known positions of corner reflectors. Residual errors at the time of the beta release are about 350 m. Intra-swath biases in the high-resolution backscatter images are reduced to less than 0.3 dB for all polarizations. Radiometric cross-calibration with Aquarius was performed using areas of the Amazon rain forest. Cross-calibration was also examined using ocean data from the low-resolution processor and comparing with the Aquarius wind model function. Using all a-priori calibration constants provided good results with co-polarized measurements matching to better than 1 dB, and cross-polarized measurements matching to about 1 dB in the beta release. During the

  12. Solid-state radar switchboard

    Science.gov (United States)

    Thiebaud, P.; Cross, D. C.

    1980-07-01

    A new solid-state radar switchboard equipped with 16 input ports which will output data to 16 displays is presented. Each of the ports will handle a single two-dimensional radar input, or three ports will accommodate a three-dimensional radar input. A video switch card of the switchboard is used to switch all signals, with the exception of the IFF-mode-control lines. Each card accepts inputs from up to 16 sources and can pass a signal with bandwidth greater than 20 MHz to the display assigned to that card. The synchro amplifier of current systems has been eliminated and in the new design each PPI receives radar data via a single coaxial cable. This significant reduction in cabling is achieved by adding a serial-to-parallel interface and a digital-to-synchro converter located at the PPI.

  13. A patient and public involvement (PPI) toolkit for meaningful and flexible involvement in clinical trials - a work in progress.

    Science.gov (United States)

    Bagley, Heather J; Short, Hannah; Harman, Nicola L; Hickey, Helen R; Gamble, Carrol L; Woolfall, Kerry; Young, Bridget; Williamson, Paula R

    2016-01-01

    Funders of research are increasingly requiring researchers to involve patients and the public in their research. Patient and public involvement (PPI) in research can potentially help researchers make sure that the design of their research is relevant, that it is participant friendly and ethically sound. Using and sharing PPI resources can benefit those involved in undertaking PPI, but existing PPI resources are not used consistently and this can lead to duplication of effort. This paper describes how we are developing a toolkit to support clinical trials teams in a clinical trials unit. The toolkit will provide a key 'off the shelf' resource to support trial teams with limited resources, in undertaking PPI. Key activities in further developing and maintaining the toolkit are to: ● listen to the views and experience of both research teams and patient and public contributors who use the tools; ● modify the tools based on our experience of using them; ● identify the need for future tools; ● update the toolkit based on any newly identified resources that come to light; ● raise awareness of the toolkit and ● work in collaboration with others to either develop or test out PPI resources in order to reduce duplication of work in PPI. Background Patient and public involvement (PPI) in research is increasingly a funder requirement due to the potential benefits in the design of relevant, participant friendly, ethically sound research. The use and sharing of resources can benefit PPI, but available resources are not consistently used leading to duplication of effort. This paper describes a developing toolkit to support clinical trials teams to undertake effective and meaningful PPI. Methods The first phase in developing the toolkit was to describe which PPI activities should be considered in the pathway of a clinical trial and at what stage these activities should take place. This pathway was informed through review of the type and timing of PPI activities within

  14. Advanced ground-penetrating, imaging radar for bridge inspection

    International Nuclear Information System (INIS)

    Warhus, J.P.; Nelson, S.D.; Mast, J.E.; Johansson, E.M.

    1994-01-01

During FY-93, the authors continued development and experimental evaluation of components and system concepts aimed at improving ground-penetrating imaging radar (GPIR) for nondestructive evaluation of bridge decks and other high-value concrete structures. They developed and implemented a laboratory test bed, including features to facilitate component testing, antenna system configuration evaluation, and collection of experimental data from realistic test objects. In addition, they developed pulse generators and antennas for evaluation and use in antenna configuration studies. This project was part of a cooperative effort with the Computational Electronics and Electromagnetics and Remote Imaging and Signal Engineering Thrust Areas, which contributed signal- and image-processing algorithm and software development and modeling support

  15. A graphical user interface (GUI) toolkit for the calculation of three-dimensional (3D) multi-phase biological effective dose (BED) distributions including statistical analyses.

    Science.gov (United States)

    Kauweloa, Kevin I; Gutierrez, Alonso N; Stathakis, Sotirios; Papanikolaou, Niko; Mavroidis, Panayiotis

    2016-07-01

    A toolkit has been developed for calculating the 3-dimensional biological effective dose (BED) distributions in multi-phase, external beam radiotherapy treatments such as those applied in liver stereotactic body radiation therapy (SBRT) and in multi-prescription treatments. This toolkit also provides a wide range of statistical results related to dose and BED distributions. MATLAB 2010a, version 7.10 was used to create this GUI toolkit. The input data consist of the dose distribution matrices, organ contour coordinates, and treatment planning parameters from the treatment planning system (TPS). The toolkit has the capability of calculating the multi-phase BED distributions using different formulas (denoted as true and approximate). Following the calculations of the BED distributions, the dose and BED distributions can be viewed in different projections (e.g. coronal, sagittal and transverse). The different elements of this toolkit are presented and the important steps for the execution of its calculations are illustrated. The toolkit is applied on brain, head & neck and prostate cancer patients, who received primary and boost phases in order to demonstrate its capability in calculating BED distributions, as well as measuring the inaccuracy and imprecision of the approximate BED distributions. Finally, the clinical situations in which the use of the present toolkit would have a significant clinical impact are indicated. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
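
    The abstract refers to "true" and "approximate" multi-phase BED formulas without stating them; the sketch below uses the standard linear-quadratic expression BED = n*d*(1 + d/(alpha/beta)), summed over phases, which may differ from the toolkit's exact formulations.

    ```python
    import numpy as np

    def bed(dose_per_fraction_gy, n_fractions, alpha_beta_gy):
        """Linear-quadratic biologically effective dose, element-wise on voxel
        arrays: BED = n * d * (1 + d / (alpha/beta))."""
        d = np.asarray(dose_per_fraction_gy, dtype=float)
        return n_fractions * d * (1.0 + d / alpha_beta_gy)

    # Illustrative two-phase plan on a 10-voxel slab: 25 x 2 Gy then 5 x 3 Gy,
    # summed as BED with alpha/beta = 10 Gy.
    total_bed = bed(np.full(10, 2.0), 25, 10.0) + bed(np.full(10, 3.0), 5, 10.0)
    print(total_bed[0])
    ```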

  16. Use of Remote Sensing Data to Enhance NWS Storm Damage Toolkit

    Science.gov (United States)

    Jedlove, Gary J.; Molthan, Andrew L.; White, Kris; Burks, Jason; Stellman, Keith; Smith, Mathew

    2012-01-01

In the wake of a natural disaster such as a tornado, the National Weather Service (NWS) is required to provide a very detailed and timely storm damage assessment to local, state and federal homeland security officials. The Post-Storm Data Acquisition (PSDA) procedure involves the acquisition and assembly of highly perishable data necessary for accurate post-event analysis and potential integration into a geographic information system (GIS) available to its end users and associated decision makers. Information gained from the process also enables the NWS to increase its knowledge of extreme events, learn how to better use existing equipment, improve NWS warning programs, and provide accurate storm intensity and damage information to the news media and academia. To help collect and manage all of this information, forecasters in NWS Southern Region are currently developing a Storm Damage Assessment Toolkit (SDAT), which incorporates GIS-capable phones and laptops into the PSDA process by tagging damage photography, location, and storm damage details with GPS coordinates for aggregation within the GIS database. However, this tool alone does not fully integrate radar and ground-based storm damage reports nor does it help to identify undetected storm damage regions. In many cases, information on storm damage location (beginning and ending points, swath width, etc.) from ground surveys is incomplete or difficult to obtain. Geographic factors (terrain and limited roads in rural areas), manpower limitations, and other logistical constraints often prevent the gathering of a comprehensive picture of tornado or hail damage, and may allow damage regions to go undetected. Molthan et al. (2011) have shown that high resolution satellite data can provide additional valuable information on storm damage tracks to augment this database. This paper presents initial development to integrate satellite-derived damage track information into the SDAT for near real-time use by forecasters

  17. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin Leigh [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  18. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    International Nuclear Information System (INIS)

    Coleman, Justin Leigh

    2016-01-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  19. DUL Radio: A light-weight, wireless toolkit for sketching in hardware

    DEFF Research Database (Denmark)

    Brynskov, Martin; Lunding, Rasmus; Vestergaard, Lasse Steenbock

    2011-01-01

    -mobile prototyping where fast reaction is needed (e.g. in controlling sound). The target audiences include designers, students, artists etc. with minimal programming and hardware skills. This presentation covers our motivations for creating the toolkit, specifications, test results, comparison to related products...

  20. The use of radar for bathymetry assessment

    NARCIS (Netherlands)

    Aardoom, J.H.; Greidanus, H.S.F.

    1998-01-01

The bottom topography in shallow seas can be observed by air- and spaceborne imaging radar. Bathymetric information derived from radar data is limited in accuracy, but radar has good spatial coverage. The accuracy can be increased by assimilating the radar imagery into existing or in-situ gathered

  1. Margins of safety provided by COSHH Essentials and the ILO Chemical Control Toolkit.

    Science.gov (United States)

    Jones, Rachael M; Nicas, Mark

    2006-03-01

COSHH Essentials, developed by the UK Health and Safety Executive, and the Chemical Control Toolkit (Toolkit) proposed by the International Labor Organization, are 'control banding' approaches to workplace risk management intended for use by proprietors of small and medium-sized businesses. Both systems group chemical substances into hazard bands based on toxicological endpoint and potency. COSHH Essentials uses the European Union's Risk-phrases (R-phrases), whereas the Toolkit uses R-phrases and the Globally Harmonized System (GHS) of Classification and Labeling of Chemicals. Each hazard band is associated with a range of airborne concentrations, termed exposure bands, which are to be attained by the implementation of recommended control technologies. Here we analyze the margin of safety afforded by the systems and, for each hazard band, define the minimal margin as the ratio of the minimum airborne concentration that produced the toxicological endpoint of interest in experimental animals to the maximum concentration in workplace air permitted by the exposure band. We found that the minimal margins were always occupational exposure limits, we argue that the minimal margins are better indicators of health protection. Further, given the small margins observed, we feel it is important that revisions of these systems provide the exposure bands to users, so as to permit evaluation of control technology capture efficiency.
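
    A minimal sketch of the margin defined in the abstract; the numbers in the example are hypothetical and only illustrate the ratio.

    ```python
    def minimal_margin(min_effect_conc, exposure_band_max):
        """Minimal margin of safety as defined in the abstract: the lowest airborne
        concentration producing the toxicological endpoint in animal studies,
        divided by the highest concentration the assigned exposure band permits
        (both in the same units, e.g. mg/m3)."""
        return min_effect_conc / exposure_band_max

    # Hypothetical values: an endpoint seen at 50 mg/m3 against a band ceiling
    # of 10 mg/m3 gives a margin of 5.
    print(minimal_margin(50.0, 10.0))
    ```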

  2. Bistatic radar

    CERN Document Server

    Willis, Nick

    2004-01-01

This book is a major extension of a chapter on bistatic radar written by the author for the Radar Handbook, 2nd edition, edited by Merrill Skolnik. It provides a history of bistatic systems that points out to potential designers the applications that have worked and the dead-ends not worth pursuing. The text reviews the basic concepts and definitions, and explains the mathematical development of relationships such as geometry, Ovals of Cassini, dynamic range, isorange and isodoppler contours, target Doppler, and clutter Doppler spread. Key Features: All development and analysis are

  3. Designing a Portable and Low Cost Home Energy Management Toolkit

    NARCIS (Netherlands)

    Keyson, D.V.; Al Mahmud, A.; De Hoogh, M.; Luxen, R.

    2013-01-01

    In this paper we describe the design of a home energy and comfort management system. The system has three components such as a smart plug with a wireless module, a residential gateway and a mobile app. The combined system is called a home energy management and comfort toolkit. The design is inspired

  4. Report of the Los Alamos accelerator automation application toolkit workshop

    International Nuclear Information System (INIS)

    Clout, P.; Daneels, A.

    1990-01-01

    A 5 day workshop was held in November 1988 at Los Alamos National Laboratory to address the viability of providing a toolkit optimized for building accelerator control systems. The workshop arose from work started independently at Los Alamos and CERN. This paper presents the discussion and the results of the meeting. (orig.)

  5. Livermore Big Artificial Neural Network Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-01

LBANN is a toolkit designed to train artificial neural networks efficiently on high performance computing architectures. It is optimized to take advantage of key High Performance Computing features to accelerate neural network training, specifically low-latency, high-bandwidth interconnects, node-local NVRAM, node-local GPU accelerators, and high-bandwidth parallel file systems. It is built on top of the open source Elemental distributed-memory dense and sparse-direct linear algebra and optimization library, which is released under the BSD license. The algorithms contained within LBANN are drawn from the academic literature and implemented to work within a distributed-memory framework.

  6. Sierra Toolkit Manual Version 4.48.

    Energy Technology Data Exchange (ETDEWEB)

    Sierra Toolkit Team

    2018-03-01

This report provides documentation for the SIERRA Toolkit (STK) modules. STK modules are intended to provide infrastructure that assists the development of computational engineering software such as finite-element analysis applications. STK includes modules for unstructured-mesh data structures, reading/writing mesh files, geometric proximity search, and various utilities. This document contains a chapter for each module, and each chapter contains overview descriptions and usage examples. Usage examples are primarily code listings which are generated from working test programs that are included in the STK code-base. A goal of this approach is to ensure that the usage examples will not fall out of date.

  7. Tips from the toolkit: 1 - know yourself.

    Science.gov (United States)

    Steer, Neville

    2010-01-01

    High performance organisations review their strategy and business processes as part of usual business operations. If you are new to the field of general practice, do you have a career plan for the next 5-10 years? If you are an experienced general practitioner, are you using much the same business model and processes as when you started out? The following article sets out some ideas you might use to have a fresh approach to your professional career. It is based on The Royal Australian College of General Practitioners' 'General practice management toolkit'.

  8. Radar spectrum opportunities for cognitive communications transmission

    OpenAIRE

    Wang, L; McGeehan, JP; Williams, C; Doufexi, A

    2008-01-01

    In relation to opportunistic access to radar spectrum, the impact of the radar on a communication system is investigated in this paper. This paper illustrates that by exploring the spatial and temporal opportunities in the radar spectrum and therefore improving the tolerance level to radar interference, a substantial increase on the throughput of a communication system is possible. Results are presented regarding the impact of swept radars on a WiMAX system. The results show the impact of SIR...

  9. FATES: a flexible analysis toolkit for the exploration of single-particle mass spectrometer data

    Science.gov (United States)

    Sultana, Camille M.; Cornwell, Gavin C.; Rodriguez, Paul; Prather, Kimberly A.

    2017-04-01

    Single-particle mass spectrometer (SPMS) analysis of aerosols has become increasingly popular since its invention in the 1990s. Today many iterations of commercial and lab-built SPMSs are in use worldwide. However, supporting analysis toolkits for these powerful instruments are outdated, have limited functionality, or are versions that are not available to the scientific community at large. In an effort to advance this field and allow better communication and collaboration between scientists, we have developed FATES (Flexible Analysis Toolkit for the Exploration of SPMS data), a MATLAB toolkit easily extensible to an array of SPMS designs and data formats. FATES was developed to minimize the computational demands of working with large data sets while still allowing easy maintenance, modification, and utilization by novice programmers. FATES permits scientists to explore, without constraint, complex SPMS data with simple scripts in a language popular for scientific numerical analysis. In addition FATES contains an array of data visualization graphic user interfaces (GUIs) which can aid both novice and expert users in calibration of raw data; exploration of the dependence of mass spectral characteristics on size, time, and peak intensity; and investigations of clustered data sets.

  10. Buying in to bioinformatics: an introduction to commercial sequence analysis software.

    Science.gov (United States)

    Smith, David Roy

    2015-07-01

    Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. If you are just beginning your foray into molecular sequence analysis or an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry detached world of bioinformatics. © The Author 2014. Published by Oxford University Press.

  11. Collision of Physics and Software in the Monte Carlo Application Toolkit (MCATK)

    Energy Technology Data Exchange (ETDEWEB)

    Sweezy, Jeremy Ed [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-21

    The topic is presented in a series of slides organized as follows: MCATK overview, development strategy, available algorithms, problem modeling (sources, geometry, data, tallies), parallelism, miscellaneous tools/features, example MCATK application, recent areas of research, and summary and future work. MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library with continuous energy neutron and photon transport. Designed to build specialized applications and to provide new functionality in existing general-purpose Monte Carlo codes like MCNP, it reads ACE formatted nuclear data generated by NJOY. The motivation behind MCATK was to reduce costs. MCATK physics involves continuous energy neutron & gamma transport with multi-temperature treatment, static eigenvalue (keff and α) algorithms, time-dependent algorithm, and fission chain algorithms. MCATK geometry includes mesh geometries and solid body geometries. MCATK provides verified, unit-test Monte Carlo components, flexibility in Monte Carlo application development, and numerous tools such as geometry and cross section plotters.

  12. Typhoon 9707 observations with the MU radar and L-band boundary layer radar

    Directory of Open Access Journals (Sweden)

    M. Teshiba

    2001-08-01

Typhoon 9707 (Opal) was observed with the VHF-band Middle and Upper atmosphere (MU) radar, an L-band boundary layer radar (BLR), and a vertical-pointing C-band meteorological radar at the Shigaraki MU Observatory in Shiga prefecture, Japan on 20 June 1997. The typhoon center passed about 80 km southeast of the radar site. Mesoscale precipitating clouds developed due to warm, moist airmass transport from the typhoon and passed over the MU radar site with easterly or southeasterly winds. We primarily present the wind behaviour, including the vertical component which a conventional meteorological Doppler radar cannot directly observe, and discuss the relationship between the wind behaviour of the typhoon and the precipitating system. To investigate the dynamic structure of the typhoon, the observed wind was divided into radial and tangential wind components under the assumption that the typhoon had an axi-symmetric structure. The altitude range of the outflow ascended from 1–3 km to 2–10 km with increasing distance (within 80–260 km range) from the typhoon center, and inflow was observed above and below the outflow. Outflow and inflow were associated with updraft and downdraft, respectively. In the tangential wind, the maximum speed of the counterclockwise winds was confirmed at 1–2 km altitude. Based on the vertical velocity and the reflectivity obtained with the MU radar and the C-band meteorological radar, respectively, the precipitating clouds accompanying the wind behaviour of the typhoon were classified into stratiform and convective precipitating clouds. In the stratiform precipitating clouds, a vertical shear of the radial wind and the maximum speed of the counterclockwise wind were observed. There was a strong reflectivity layer, called a 'bright band', around the 4.2 km altitude. We confirmed strong updrafts and downdrafts below and above it, respectively, and the existence of a relatively dry layer around the bright band level from radiosonde
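
    A minimal sketch of the radial/tangential decomposition described, assuming an axisymmetric vortex; the coordinate frame, sign conventions, and function name are assumptions, not taken from the paper.

    ```python
    import numpy as np

    def radial_tangential(u, v, site_xy, center_xy):
        """Decompose a horizontal wind (u east, v north, m/s) observed at site_xy
        into radial (positive outward) and tangential (positive counterclockwise)
        components about the typhoon centre. Coordinates are (x, y) in km in a
        local east/north frame."""
        dx, dy = site_xy[0] - center_xy[0], site_xy[1] - center_xy[1]
        r = np.hypot(dx, dy)
        er = np.array([dx, dy]) / r      # unit vector pointing away from the centre
        et = np.array([-dy, dx]) / r     # unit vector 90 degrees counterclockwise of er
        wind = np.array([u, v])
        return float(wind @ er), float(wind @ et)

    # An easterly 20 m/s wind at a site 80 km due north of the centre is purely
    # tangential (counterclockwise) flow: prints (0.0, 20.0).
    print(radial_tangential(-20.0, 0.0, site_xy=(0.0, 80.0), center_xy=(0.0, 0.0)))
    ```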

  14. Radar observations of Mercury

    International Nuclear Information System (INIS)

    Harmon, J.K.; Campbell, D.B.

    1988-01-01

    Some of the radar altimetry profiles of Mercury obtained on the basis of data from the Arecibo Observatory are presented. In these measurements, the delay-Doppler method was used to measure altitudes along the Doppler equator, rather than to map radar reflectivity. The profiles, derived from observations made over a 6-yr period, provide extensive coverage over a restricted equatorial band and permit the identification of radar signatures for features as small as 50-km diameter craters and 1-km-high arcuate scarps. The data allowed identification of large-scale topographic features such as smooth plains subsidence zones and major highland regions

  15. Experiment for buried pipes by stepped FM-CW radar; Step shiki FM-CW radar ni yoru maisetsukan tansa jikken

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, K.; Ito, M. [Kawasaki Geological Engineering, Co. Ltd., Tokyo (Japan); Tanabe, K. [Central Research Institute of Electric Power Industry, Tokyo (Japan)

    1997-05-27

Underground radar exploration is used for surveys of cavities under roads and of buried pipes, since it yields high-resolution results. However, the exploration depth of such radar is shallow, 2-3 m in a soil basement, and its field of application has therefore been limited. The continuous-wave radar (FM-CW radar) was devised to achieve greater exploration depth, but it has mainly been used for geological structure surveys, such as fault surveys, since its resolution is lower than that of pulse radar. Therefore, to exploit the characteristics of the continuous-wave radar while enhancing resolution in the shallow part, an experiment on buried pipes was conducted for the purpose of assessing and improving the FM-CW radar. In the processing, the waveform treatment used in reflection seismic surveys was adopted for the radar survey. Some problems remain, but it is effective to apply the same algorithms as those used in seismic surveys to radar exploration. The achievable exploration depth was estimated from the attenuation rate of the electromagnetic waves and the dynamic range of the equipment at the experimental site, and 7 m was obtained. 5 figs., 1 tab.

  16. Radar reflection off extensive air showers

    CERN Document Server

    Stasielak, J; Bertaina, M; Blümer, J; Chiavassa, A; Engel, R; Haungs, A; Huege, T; Kampert, K -H; Klages, H; Kleifges, M; Krömer, O; Ludwig, M; Mathys, S; Neunteufel, P; Pekala, J; Rautenberg, J; Riegel, M; Roth, M; Salamida, F; Schieler, H; Šmída, R; Unger, M; Weber, M; Werner, F; Wilczyński, H; Wochele, J

    2012-01-01

    We investigate the possibility of detecting extensive air showers by the radar technique. Considering a bistatic radar system and different shower geometries, we simulate reflection of radio waves off the static plasma produced by the shower in the air. Using the Thomson cross-section for radio wave reflection, we obtain the time evolution of the signal received by the antennas. The frequency upshift of the radar echo and the power received are studied to verify the feasibility of the radar detection technique.
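
    For context (a standard physics result, not a quotation from the paper), the per-electron Thomson cross-section that such simulations rely on is

    ```latex
    \sigma_T \;=\; \frac{8\pi}{3}\, r_e^{2}
             \;=\; \frac{8\pi}{3}\left(\frac{e^{2}}{4\pi\varepsilon_0 m_e c^{2}}\right)^{2}
             \;\approx\; 6.65\times10^{-29}\ \mathrm{m^2}
    ```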

  17. Investigations on the sensitivity of a stepped-frequency radar utilizing a vector network analyzer for Ground Penetrating Radar

    Science.gov (United States)

    Seyfried, Daniel; Schubert, Karsten; Schoebel, Joerg

    2014-12-01

Employing a continuous-wave radar system, with the stepped-frequency radar being one type of this class, all reflections from the environment are present continuously and simultaneously at the receiver. When such a radar system is used for Ground Penetrating Radar purposes, antenna cross-talk and the ground bounce reflection form an overall dominant signal contribution, while reflections from objects buried in the ground have quite weak amplitudes due to attenuation in the ground. This requires a large dynamic range of the receiver, which in turn requires high sensitivity of the radar system. In this paper we analyze the sensitivity of our vector network analyzer utilized as a stepped-frequency radar system for GPR pipe detection. We furthermore investigate the performance of increasing the sensitivity of the radar by means of appropriate averaging and low-noise pre-amplification of the received signal. It turns out that the improvement in sensitivity actually achievable may differ significantly from theoretical expectations. In addition, we give a descriptive explanation of why our experiments demonstrate that the sensitivity of the receiver is independent of the distance between the target object and the source of the dominant signal contribution. Finally, the investigations presented in this paper lead to a preferred setting of operation for our vector network analyzer in order to achieve the best detection capability for weak reflection amplitudes, hence making the radar system applicable for Ground Penetrating Radar purposes.
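
    As a rough illustration of the theoretical expectation the paper compares against (not the authors' measurement setup), coherent averaging of N independent traces should improve the signal-to-noise ratio by a factor of N, i.e. about 10*log10(N) dB.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_avg, n_samples = 100, 2048
    signal = 0.01 * np.sin(2 * np.pi * np.arange(n_samples) / 64)   # weak synthetic echo

    def snr_db(trace, clean):
        noise = trace - clean
        return 10 * np.log10(np.mean(clean ** 2) / np.mean(noise ** 2))

    single = signal + rng.standard_normal(n_samples)
    averaged = signal + rng.standard_normal((n_avg, n_samples)).mean(axis=0)

    print(f"single trace SNR : {snr_db(single, signal):6.1f} dB")
    print(f"{n_avg}-trace average: {snr_db(averaged, signal):6.1f} dB "
          f"(theory: ~{10 * np.log10(n_avg):.0f} dB improvement)")
    ```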

  18. The PRIDE (Partnership to Improve Diabetes Education) Toolkit: Development and Evaluation of Novel Literacy and Culturally Sensitive Diabetes Education Materials.

    Science.gov (United States)

    Wolff, Kathleen; Chambers, Laura; Bumol, Stefan; White, Richard O; Gregory, Becky Pratt; Davis, Dianne; Rothman, Russell L

    2016-02-01

    Patients with low literacy, low numeracy, and/or linguistic needs can experience challenges understanding diabetes information and applying concepts to their self-management. The authors designed a toolkit of education materials that are sensitive to patients' literacy and numeracy levels, language preferences, and cultural norms and that encourage shared goal setting to improve diabetes self-management and health outcomes. The Partnership to Improve Diabetes Education (PRIDE) toolkit was developed to facilitate diabetes self-management education and support. The PRIDE toolkit includes a comprehensive set of 30 interactive education modules in English and Spanish to support diabetes self-management activities. The toolkit builds upon the authors' previously validated Diabetes Literacy and Numeracy Education Toolkit (DLNET) by adding a focus on shared goal setting, addressing the needs of Spanish-speaking patients, and including a broader range of diabetes management topics. Each PRIDE module was evaluated using the Suitability Assessment of Materials (SAM) instrument to determine the material's cultural appropriateness and its sensitivity to the needs of patients with low literacy and low numeracy. Reading grade level was also assessed using the Automated Readability Index (ARI), Coleman-Liau, Flesch-Kincaid, Fry, and SMOG formulas. The average reading grade level of the materials was 5.3 (SD 1.0), with a mean SAM of 91.2 (SD 5.4). All of the 30 modules received a "superior" score (SAM >70%) when evaluated by 2 independent raters. The PRIDE toolkit modules can be used by all members of a multidisciplinary team to assist patients with low literacy and low numeracy in managing their diabetes. © 2015 The Author(s).
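
    The abstract lists several readability formulas; as a rough illustration (not the toolkit's own implementation), the sketch below applies the standard Flesch-Kincaid grade-level formula with a crude vowel-group syllable counter.

    ```python
    import re

    def count_syllables(word):
        """Very rough syllable estimate: count groups of consecutive vowels."""
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text):
        """0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59"""
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59

    print(round(flesch_kincaid_grade("Check your blood sugar before every meal."), 1))
    ```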

  19. Detecting and classifying low probability of intercept radar

    CERN Document Server

    Pace, Philip E

    2008-01-01

    This revised and expanded second edition brings you to the cutting edge with new chapters on LPI radar design, including over-the-horizon radar, random noise radar, and netted LPI radar. You also discover critical LPI detection techniques, parameter extraction signal processing techniques, and anti-radiation missile design strategies to counter LPI radar.

  20. Radar probing of the auroral plasma

    International Nuclear Information System (INIS)

    Brekke, A.

    1977-01-01

The European Incoherent Scatter Radar in the Auroral Zone (EISCAT) is an inter-European organization planning to install an incoherent scatter radar system in Northern Scandinavia. It is supported by Finland, France, Norway, Great Britain, Sweden and West Germany, and its headquarters is in Kiruna, Sweden. The radar is planned to be operating in 1979. In order to introduce students and young scientists to the incoherent scatter radar technique, a summer school was held in Tromsoe from 5th to 13th June 1975. In these proceedings an introduction to the basic theory of fluctuations in a plasma is given. Some of the incoherent scatter radars now in use are presented and special considerations with respect to the planned EISCAT facility are discussed. Reviews of some recent results and scientific problems relevant to EISCAT are also presented, and finally a presentation of some observational techniques complementary to incoherent scatter radars is included. (Ed.)

  1. Pemfokusan Citra Radar untuk Hasil Pemodelan Radar Penembus Permukaan menggunakan Algoritma Migrasi Jarak

    Directory of Open Access Journals (Sweden)

    AZIZAH AZIZAH

    2016-02-01

Ground Penetrating Radar (GPR) images represent a buried object as a hyperbolic curve. This hyperbolic curve has low resolution, making it difficult to analyse the actual object position, so a process is needed to make the image more focused. This process is usually called transformation or migration; one such method is the range migration algorithm. Several steps were carried out in this research. First, GPR modelling was done using software. Next, the range migration algorithm was implemented on the data resulting from the modelling. Finally, the results were analysed. Information about the number and position of objects is obtained from the migrated image, with a margin of error on the x-axis of 4% for 1 object, 17% for 2 objects, and 4% for 3 objects, and on the y-axis of 2% for 1 object, 4% for 2 objects, and 8% for 3 objects. Keywords: GPR, migration, algorithm, range migration, focus
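
    A minimal sketch of the point-scatterer travel-time hyperbola that migration algorithms, including the range migration algorithm used here, collapse back onto the scatterer location; the velocity and permittivity values are illustrative only.

    ```python
    import numpy as np

    def diffraction_hyperbola(x, x0, depth, v):
        """Two-way travel time t(x) = 2*sqrt((x - x0)^2 + depth^2) / v for a point
        scatterer at horizontal position x0 and depth `depth`, wave speed v.
        For GPR, v = c / sqrt(eps_r), with eps_r the relative permittivity."""
        return 2.0 * np.sqrt((x - x0) ** 2 + depth ** 2) / v

    # A pipe at 1 m depth in soil with eps_r ~ 9 (v ~ 0.1 m/ns): travel times in ns
    # along a 4 m profile, showing the hyperbola with its apex at 20 ns.
    x = np.linspace(-2.0, 2.0, 9)
    print(np.round(diffraction_hyperbola(x, x0=0.0, depth=1.0, v=0.1), 1))
    ```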

  2. A universal postprocessing toolkit for accelerator simulation and data analysis

    International Nuclear Information System (INIS)

    Borland, M.

    1998-01-01

The Self-Describing Data Sets (SDDS) toolkit comprises about 70 generally-applicable programs sharing a common data protocol. At the Advanced Photon Source (APS), SDDS performs the vast majority of operational data collection and processing, most data display functions, and many control functions. In addition, a number of accelerator simulation codes use SDDS for all post-processing and data display. This has three principal advantages: first, simulation codes need not provide customized post-processing tools, thus simplifying development and maintenance. Second, users can enhance code capabilities without changing the code itself, by adding SDDS-based pre- and post-processing. Third, multiple codes can be used together more easily, by employing SDDS for data transfer and adaptation. Given its broad applicability, the SDDS file protocol is surprisingly simple, making it quite easy for simulations to generate SDDS-compliant data. This paper discusses the philosophy behind SDDS, contrasting it with some recent trends, and outlines the capabilities of the toolkit. The paper also gives examples of using SDDS for accelerator simulation

  3. Method for radar detection of persons wearing wires

    OpenAIRE

    Fox, William P.

    2014-01-01

    8,730,098 B1 Methods are described for radar detection of persons wearing wires using radar spectra data including the vertical polarization (VV) radar cross section and the horizontal polarization (HH) radar cross section for a person. In one embodiment, the ratio of the vertical polarization (VV) radar cross section to the horizontal polarization (HH) radar cross section for a person is compared to a detection threshold to determine whether the person is wearing wire...
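
    A minimal sketch of the decision rule described in this record; the threshold value is hypothetical, since the record does not give one.

    ```python
    def wearing_wire(vv_rcs, hh_rcs, threshold=1.5):
        """Flag a person when the ratio of vertically polarized (VV) to horizontally
        polarized (HH) radar cross section exceeds a detection threshold."""
        return (vv_rcs / hh_rcs) > threshold

    # Hypothetical RCS values in square metres.
    print(wearing_wire(vv_rcs=0.9, hh_rcs=0.4))    # True  -> flagged
    print(wearing_wire(vv_rcs=0.5, hh_rcs=0.45))   # False -> not flagged
    ```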

  4. Phased-array radars

    Science.gov (United States)

    Brookner, E.

    1985-02-01

    The operating principles, technology, and applications of phased-array radars are reviewed and illustrated with diagrams and photographs. Consideration is given to the antenna elements, circuitry for time delays, phase shifters, pulse coding and compression, and hybrid radars combining phased arrays with lenses to alter the beam characteristics. The capabilities and typical hardware of phased arrays are shown using the US military systems COBRA DANE and PAVE PAWS as examples.
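
    A minimal illustration of the beam-steering principle behind the phase shifters mentioned above (not drawn from the article itself): element n of a uniform linear array with spacing d is driven with phase phi_n = -2*pi*n*d*sin(theta0)/lambda to steer the main beam to angle theta0 from broadside.

    ```python
    import numpy as np

    def steering_phases_deg(n_elements, spacing_m, wavelength_m, steer_deg):
        """Per-element phase shifts (degrees, wrapped to [0, 360)) for a uniform
        linear array steered to steer_deg from broadside."""
        n = np.arange(n_elements)
        phi = -2.0 * np.pi * n * spacing_m * np.sin(np.deg2rad(steer_deg)) / wavelength_m
        return np.rad2deg(np.mod(phi, 2.0 * np.pi))

    # Eight half-wavelength-spaced elements steered 30 degrees off broadside:
    # the phase changes by 90 degrees from element to element.
    print(np.round(steering_phases_deg(8, spacing_m=0.05, wavelength_m=0.1, steer_deg=30.0), 1))
    ```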

  5. Downhole pulse radar

    Science.gov (United States)

    Chang, Hsi-Tien

    1987-09-28

A borehole logging tool generates a fast rise-time, short duration, high peak-power radar pulse having broad energy distribution between 30 MHz and 300 MHz through directional transmitting and receiving antennas having barium titanate in the electromagnetically active region to reduce the wavelength to within an order of magnitude of the diameter of the antenna. Radar returns from geological discontinuities are sampled for transmission uphole. 7 figs.

  6. CyberGIS software: a synthetic review and integration roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Shaowen [University of Illinois, Urbana-Champaign; Anselin, Luc [Arizona State University; Bhaduri, Budhendra L [ORNL; Cosby, Christopher [University Navstar Consortium, Boulder, CO; Goodchild, Michael [University of California, Santa Barbara; Liu, Yan [University of Illinois, Urbana-Champaign; Nygers, Timothy L. [University of Washington, Seattle

    2013-01-01

CyberGIS, defined as cyberinfrastructure-based geographic information systems (GIS), has emerged as a new generation of GIS representing an important research direction for both cyberinfrastructure and geographic information science. This study introduces a 5-year effort funded by the US National Science Foundation to advance the science and applications of CyberGIS, particularly for enabling the analysis of big spatial data, computationally intensive spatial analysis and modeling (SAM), and collaborative geospatial problem-solving and decision-making, simultaneously conducted by a large number of users. Several fundamental research questions are raised and addressed while a set of CyberGIS challenges and opportunities are identified from scientific perspectives. The study reviews several key CyberGIS software tools that are used to elucidate a vision and roadmap for CyberGIS software research. The roadmap focuses on software integration and synthesis of cyberinfrastructure, GIS, and SAM by defining several key integration dimensions and strategies. CyberGIS, based on this holistic integration roadmap, exhibits the following key characteristics: high-performance and scalable, open and distributed, collaborative, service-oriented, user-centric, and community-driven. As a major result of the roadmap, two key CyberGIS modalities, gateway and toolkit, combined with a community-driven and participatory approach have laid a solid foundation to achieve scientific breakthroughs across many geospatial communities that would be otherwise impossible.

  7. Radar reflection off extensive air showers

    Directory of Open Access Journals (Sweden)

    Werner F.

    2013-06-01

We investigate the possibility of detecting extensive air showers by the radar technique. Considering a bistatic radar system and different shower geometries, we simulate reflection of radio waves off the static plasma produced by the shower in the air. Using the Thomson cross-section for radio wave reflection, we obtain the time evolution of the signal received by the antennas. The frequency upshift of the radar echo and the power received are studied to verify the feasibility of the radar detection technique.

  8. Making Schools the Model for Healthier Environments Toolkit: What It Is

    Science.gov (United States)

    Robert Wood Johnson Foundation, 2012

    2012-01-01

    Healthy students perform better. Poor nutrition and inadequate physical activity can affect not only academic achievement, but also other factors such as absenteeism, classroom behavior, ability to concentrate, self-esteem, cognitive performance, and test scores. This toolkit provides information to help make schools the model for healthier…

  9. Millimeter wave radars raise weapon IQ

    Science.gov (United States)

    Lerner, E. J.

    1985-02-01

    The problems encountered by laser and IR homing devices for guided munitions may be tractable with warhead-mounted mm-wave radars. Operating at about 100 GHz and having several kilometers range, mm-wave radars see through darkness, fog, rain and smoke. The radar must be coupled with an analyzer that discerns moving and stationary targets and higher priority targets. The target lock-on can include shut-off of the transmitter and reception of naturally-generated mm-waves bouncing off the target when in the terminal phase of the flight. Monopulse transmitters have simplified the radar design, although mass production of finline small radar units has yet to be accomplished, particularly in combining GaAs, ferrites and other materials on one monolithic chip.

  10. The Basic Radar Altimetry Toolbox for Sentinel 3 Users

    Science.gov (United States)

    Lucas, Bruno; Rosmorduc, Vinca; Niemeijer, Sander; Bronner, Emilie; Dinardo, Salvatore; Benveniste, Jérôme

    2013-04-01

The Basic Radar Altimetry Toolbox (BRAT) is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. The project started in 2006 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales). The latest version of the software, 3.1, was released in March 2012. The tools enable users to interact with the most common altimetry data formats, most commonly through the Graphical User Interface (BratGui). This GUI is a front-end for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or from C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. The BratDisplay (graphic visualizer) can be launched from BratGui, or used as a stand-alone tool to visualize netCDF files; it is distributed with another ESA toolbox (GUT) as the visualizer. The most frequent uses of BRAT are teaching remote sensing, altimetry data reading (all missions from ERS-1 to Saral and soon Sentinel-3), quick data visualization/export, and simple computation on the data fields. BRAT can be used for importing data and having a quick look at its contents, with several different types of plotting available. One can also use it to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BratGui involving combinations of data fields, which the user can save for later reuse, or using the already embedded formulas that include the standard oceanographic altimetry formulas (MSS, SSH, MSLA, editing of spurious data, etc.). The documentation collection includes the standard user manual explaining all the ways to interact with the set of software tools, but the most important item is the Radar Altimeter Tutorial, that contains a strong introduction to

  11. HF Radar Sea-echo from Shallow Water

    Directory of Open Access Journals (Sweden)

    Josh Kohut

    2008-08-01

HF radar systems are widely and routinely used for the measurement of ocean surface currents and waves. Analysis methods presently in use are based on the assumption of infinite water depth, and may therefore be inadequate close to shore where the radar echo is strongest. In this paper, we treat the situation when the radar echo is returned from ocean waves that interact with the ocean floor. Simulations are described which demonstrate the effect of shallow water on radar sea-echo. These are used to investigate limits on the existing theory and to define water depths at which shallow-water effects become significant. The second-order spectral energy increases relative to the first-order as the water depth decreases, resulting in spectral saturation when the waveheight exceeds a limit defined by the radar transmit frequency. This effect is particularly marked for lower radar transmit frequencies, and the saturation limit on waveheight is lower in shallow water. Shallow water affects second-order spectra (which give wave information) far more than first-order spectra (which give information on current velocities), the latter being significantly affected only for the lowest radar transmit frequencies over extremely shallow water. We describe analysis of radar echo from shallow water measured by a Rutgers University HF radar system to give ocean wave spectral estimates. Radar-derived wave height, period and direction are compared with simultaneous shallow-water in-situ measurements.
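
    A minimal sketch of why finite depth enters the first-order sea echo (a standard result, not code from the paper): the Bragg-resonant ocean wavenumber is twice the radar wavenumber, and the Bragg frequency follows the finite-depth gravity-wave dispersion relation omega^2 = g*k*tanh(k*d).

    ```python
    import numpy as np

    def bragg_frequency_hz(radar_freq_hz, depth_m, g=9.81, c=3.0e8):
        """First-order Bragg Doppler frequency over water of finite depth."""
        k_bragg = 2.0 * (2.0 * np.pi * radar_freq_hz / c)   # twice the radar wavenumber
        omega = np.sqrt(g * k_bragg * np.tanh(k_bragg * depth_m))
        return omega / (2.0 * np.pi)

    # At a 13 MHz transmit frequency, deep water versus a 3 m deep site:
    for d in (1000.0, 3.0):
        print(f"depth {d:6.1f} m -> Bragg frequency {bragg_frequency_hz(13e6, d):.3f} Hz")
    ```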

  12. Falling Less in Kansas: Development of a Fall Risk Reduction Toolkit

    Directory of Open Access Journals (Sweden)

    Teresa S. Radebaugh

    2011-01-01

    Full Text Available Falls are a serious health risk for older adults. But for those living in rural and frontier areas of the USA, the risks are higher because of limited access to health care providers and resources. This study employed a community-based participatory research approach to develop a fall prevention toolkit to be used by residents of rural and frontier areas without the assistance of health care providers. Qualitative data were gathered from both key informant interviews and focus groups with a broad range of participants. Data analysis revealed that to be effective and accepted, the toolkit should be not only evidence based but also practical, low-cost, self-explanatory, and usable without the assistance of a health care provider. Materials must be engaging, visually interesting, empowering, sensitive to reading level, and appropriate for low-vision users. These findings should be useful to other researchers developing education and awareness materials for older adults in rural areas.

  13. Evidence-based Metrics Toolkit for Measuring Safety and Efficiency in Human-Automation Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — APRIL 2016 NOTE: Principal Investigator moved to Rice University in mid-2015. Project continues at Rice with the same title (Evidence-based Metrics Toolkit for...

  14. Marine X-band Weather Radar Data Calibration

    DEFF Research Database (Denmark)

    Thorndahl, Søren Liedtke; Rasmussen, Michael R.

    2012-01-01

    Application of weather radar data in urban hydrology is evolving and radar data is now applied for both modelling, analysis, and real time control purposes. In these contexts, it is all-important that the radar data is well calibrated and adjusted in order to obtain valid quantitative precipitation estimates. This paper presents some of the challenges in small marine X-band radar calibration by comparing three calibration procedures for assessing the relationship between radar and rain gauge data. Validation shows similar results for precipitation volumes but more diverse results on peak rain intensities.

  15. Development of radar cross section analysis system of naval ships

    Directory of Open Access Journals (Sweden)

    Kookhyun Kim

    2012-03-01

    Full Text Available A software system for complex object scattering analysis, named SYSCOS, has been developed for systematic radar cross section (RCS) analysis and reduction design. The system is based on the high-frequency analysis methods of physical optics, geometrical optics, and the physical theory of diffraction, which are suitable for RCS analysis of electromagnetically large and complex targets such as naval ships. In addition, a direct scattering center analysis function has been included, which gives a relatively simple and intuitive way to identify problem areas at the design stage, compared with conventional image-based approaches. In this paper, the theoretical background and the organization of the SYSCOS system are presented. To verify its accuracy and to demonstrate its applicability, numerical analyses for a square plate, a sphere and a cylinder, a weapon system and a virtual naval ship have been carried out, the results of which have been compared with analytic solutions and with those obtained by other existing software.
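
    High-frequency codes of this kind are usually checked against targets with closed-form answers: a flat square plate of side a at normal incidence has a physical-optics peak RCS of 4*pi*a^4/lambda^2, and a large conducting sphere has RCS pi*r^2. The sketch below simply evaluates those textbook checks; the 10 GHz test frequency is an assumption, and the code is not part of SYSCOS.

    ```python
    # Sketch: analytic high-frequency RCS values commonly used to verify PO-based codes
    # (flat square plate at normal incidence, sphere in the optical region).
    import math

    def rcs_square_plate(side_m, wavelength_m):
        """Peak normal-incidence physical-optics RCS of a flat square plate, in m^2."""
        return 4.0 * math.pi * side_m**4 / wavelength_m**2

    def rcs_sphere(radius_m):
        """Optical-region RCS of a perfectly conducting sphere, in m^2."""
        return math.pi * radius_m**2

    wavelength = 0.03                                # 10 GHz, an assumed test frequency
    for label, sigma in (("1 m plate", rcs_square_plate(1.0, wavelength)),
                         ("1 m sphere", rcs_sphere(1.0))):
        print(f"{label}: {sigma:10.2f} m^2 = {10.0 * math.log10(sigma):6.2f} dBsm")
    ```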

  16. Raindrop size distribution and radar reflectivity-rain rate relationships for radar hydrology

    NARCIS (Netherlands)

    Uijlenhoet, R.

    2001-01-01

    The conversion of the radar reflectivity factor Z (mm^6 m^-3) to rain rate R (mm h^-1) is a crucial step in the hydrological application of weather radar measurements. It has been common practice for over 50 years now to take for this conversion a simple power law relationship between Z and R. It is the
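
    The power law has the form Z = a R^b, so a measured reflectivity in dBZ is inverted as R = (10^(dBZ/10) / a)^(1/b). The sketch below uses the classical Marshall-Palmer coefficients (a = 200, b = 1.6) purely as an illustration; the variability of such coefficients is exactly what work like this record addresses.

    ```python
    # Sketch: converting radar reflectivity (dBZ) to rain rate through Z = a*R^b.
    # a=200, b=1.6 are the classical Marshall-Palmer values, used here only as an example.

    def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
        """Rain rate R (mm/h) from reflectivity factor Z (mm^6 m^-3) via Z = a*R^b."""
        z_linear = 10.0 ** (dbz / 10.0)
        return (z_linear / a) ** (1.0 / b)

    for dbz in (20, 30, 40, 50):
        print(f"{dbz} dBZ -> {rain_rate_from_dbz(dbz):6.2f} mm/h")
    ```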

  17. 5 year radar-based rainfall statistics: disturbances analysis and development of a post-correction scheme for the German radar composite

    Science.gov (United States)

    Wagner, A.; Seltmann, J.; Kunstmann, H.

    2015-02-01

    A radar-based rainfall statistic demands high-quality data that provide realistic precipitation amounts in space and time. Instead of correcting single radar images, we developed a post-correction scheme for long-term composite radar data that corrects corrupted areas but preserves the original precipitation patterns. The post-correction scheme is based on a 5-year statistical analysis of radar composite data and its constituents. The accumulation of radar images reveals artificial effects that are not visible in the individual radar images. Some of them are already inherent in single-radar data, such as the effects of increasing beam height, beam blockage or clutter remnants. More artificial effects are introduced in the process of compositing, such as sharp gradients at the boundaries of overlapping areas due to different beam heights and resolutions. The causes of these disturbances and their behaviour with respect to reflectivity level, season or altitude are analysed based on time series of two radar products: the single radar reflectivity product PX for each of the 16 radar systems of the German Meteorological Service (DWD) for the time span 2000 to 2006, and the radar composite product RX of DWD from 2005 through 2009. These statistics result in additional quality information on radar data that is not available elsewhere. The resulting robust characteristics of disturbances, e.g. the dependency of the frequencies of occurrence of radar reflectivities on beam height, are then used as a basis for the post-correction algorithm. The scheme comprises corrections for shading effects and speckles, such as clutter remnants or overfiltering, as well as for systematic differences in frequencies of occurrence of radar reflectivities between the near and the far ranges of individual radar sites. An adjustment to rain gauges is also included, as sketched below. Applying this correction, the Root-Mean-Square Error for the comparison of radar-derived annual rain amounts with rain gauge data
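
    One standard way to implement a gauge adjustment of this kind is a mean-field bias factor: the ratio of summed gauge to summed radar accumulations, applied multiplicatively to the radar field. The sketch illustrates only that generic step with made-up accumulations; it is not the authors' specific correction scheme.

    ```python
    # Sketch: generic mean-field bias adjustment of radar rainfall to rain gauges.
    # Gauge/radar values are made-up accumulations (mm); not the scheme from the record.
    import numpy as np

    gauge = np.array([12.0, 8.5, 20.1, 5.3])          # gauge accumulations at gauge sites
    radar = np.array([9.8, 7.0, 16.0, 4.1])           # collocated radar accumulations

    bias = gauge.sum() / radar.sum()                  # single multiplicative bias factor
    radar_grid = np.array([[3.2, 5.1], [7.7, 0.4]])   # any radar accumulation grid
    adjusted_grid = bias * radar_grid

    rmse_before = np.sqrt(np.mean((radar - gauge) ** 2))
    rmse_after = np.sqrt(np.mean((bias * radar - gauge) ** 2))
    print(f"bias={bias:.2f}  RMSE before={rmse_before:.2f} mm  after={rmse_after:.2f} mm")
    ```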

  18. SAR-EDU - An education initiative for applied Synthetic Aperture Radar remote sensing

    Science.gov (United States)

    Eckardt, Robert; Richter, Nicole; Auer, Stefan; Eineder, Michael; Roth, Achim; Hajnsek, Irena; Walter, Diana; Braun, Matthias; Motagh, Mahdi; Pathe, Carsten; Pleskachevsky, Andrey; Thiel, Christian; Schmullius, Christiane

    2013-04-01

    Since the 1970s, radar remote sensing techniques have evolved rapidly and are increasingly employed in all fields of the earth sciences. Applications are manifold and still expanding due to the continuous development of new instruments and missions as well as the availability of very high-quality data. The trend worldwide is towards operational employment of the various algorithms and methods that have been developed. However, the utilization of operational services does not yet keep up with the rate of technical developments and the improvements in sensor technology. With the increasing availability and variety of spaceborne Synthetic Aperture Radar (SAR) data and a growing number of analysis algorithms, the need for a vital user community is increasing. Therefore the German Aerospace Center (DLR), together with the Friedrich-Schiller-University Jena (FSU) and the Technical University Munich (TUM), launched the education initiative SAR-EDU. The aim of the project is to facilitate access to expert knowledge in the scientific field of radar remote sensing. Within this effort a web portal will be created to provide seminar material on SAR basics, methods and applications to support both lecturers and students. The overall intention of the project SAR-EDU is to provide seminar material for higher education in radar remote sensing, covering the topic holistically from the very basics to the most advanced methods and applications available. The principles of processing and interpreting SAR data will be taught using test data sets and open-source as well as commercial software packages. The material provided by SAR-EDU will be accessible at no charge from a DLR web portal. The educational tool will have a modular structure, consisting of separate modules that each address a particular topic. The aim of implementing SAR-EDU as an application-oriented radar remote sensing educational tool is to advocate the development and wider use of

  19. Low-Cost Mini Radar: Design Prototyping and Tests

    Directory of Open Access Journals (Sweden)

    Dario Tarchi

    2017-01-01

    Full Text Available Radar systems are largely employed for surveillance of wide and remote areas; the recent advent of drones gives the opportunity to exploit radar sensors on board unmanned aerial platforms. Nevertheless, whereas drone radars are currently available for military applications, their employment in the civilian domain is still limited. The present research focuses on the design, prototyping, and testing of an agile, low-cost, mini radar system, to be carried on board Remotely Piloted Aircraft (RPAs) or tethered aerostats. In particular, the paper addresses the challenge of integrating the in-house-developed radar sensor with a low-cost navigation board, which is used to estimate attitude and positioning data. In fact, a suitable synchronization between radar and navigation data is essential to properly reconstruct the radar picture whenever the platform is moving or the radar is scanning different azimuthal sectors. Preliminary results from tests conducted in preoperational conditions are provided and used to assess the consistency of the obtained radar pictures. The results show a high consistency between the radar images and the surveyed environment; finally, the comparison of radar images obtained in different scans shows the stability of the platform.
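
    Fusing each radar detection with the attitude and position reported by the navigation board is, at its core, a coordinate transformation. The sketch below shows a deliberately simplified, flat-earth, heading-only version of that step; the function and values are illustrative and not taken from the paper.

    ```python
    # Sketch: placing a radar detection (range, azimuth in the antenna frame) into a
    # local east-north frame using platform heading and position from a navigation
    # board. Flat-earth, heading-only simplification for illustration.
    import math

    def detection_to_local_en(range_m, azimuth_deg, heading_deg, platform_e, platform_n):
        """Return (east, north) in metres of a detection, given platform heading/position."""
        bearing = math.radians(heading_deg + azimuth_deg)   # antenna azimuth rotated by heading
        east = platform_e + range_m * math.sin(bearing)
        north = platform_n + range_m * math.cos(bearing)
        return east, north

    # Detection at 120 m range, 30 deg right of boresight, platform heading 45 deg
    print(detection_to_local_en(120.0, 30.0, 45.0, platform_e=10.0, platform_n=-5.0))
    ```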

  20. Radar efficiency and the calculation of decade-long PMSE backscatter cross-section for the Resolute Bay VHF radar

    Directory of Open Access Journals (Sweden)

    N. Swarnalingam

    2009-04-01

    Full Text Available The Resolute Bay VHF radar, located in Nunavut, Canada (75.0° N, 95.0° W) and operating at 51.5 MHz, has been used to investigate Polar Mesosphere Summer Echoes (PMSE) since 1997. PMSE are a unique form of strong coherent radar echoes, and their understanding has been a challenge to the scientific community since their discovery more than three decades ago. While other high-latitude radars have recorded strong PMSE activity, the Resolute Bay radar has observed relatively weak PMSE strengths. In order to derive absolute measurements of PMSE strength at this site, a technique is developed to determine the radar efficiency using cosmic (sky) noise variations together with a calibrated noise source. VHF radars are only rarely calibrated, but determination of efficiency is even less common. Here we emphasize the importance of efficiency for the determination of cross-section measurements. The significant advantage of this method is that it can be directly applied to any MST radar system anywhere in the world as long as the sky noise variations are known. The radar efficiencies for two on-site radars at Resolute Bay are determined. The PMSE backscatter cross-section is estimated, and decade-long PMSE strength variations at this location are investigated. It was noticed that the median of the backscatter cross-section distribution remains relatively unchanged, but over the years a great deal of variability occurs in the high-power tail of the distribution.
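
    The efficiency estimate amounts to comparing the sky-noise power the radar actually registers (referenced through the calibrated noise source) with the power expected from the cosmic background. The sketch below shows only that ratio; the temperatures and bandwidth are illustrative placeholders, and the expected sky temperature would in practice come from a sky-noise model rather than the constant assumed here.

    ```python
    # Sketch: estimating receiver-path efficiency as the ratio of measured to expected
    # cosmic sky-noise power (P = k*T*B). All numbers are illustrative placeholders.
    K_BOLTZMANN = 1.380649e-23

    def noise_power_watts(temperature_k, bandwidth_hz):
        """Thermal noise power P = k*T*B."""
        return K_BOLTZMANN * temperature_k * bandwidth_hz

    bandwidth = 1.0e6            # receiver bandwidth (Hz), assumed
    t_sky_expected = 6000.0      # cosmic noise temperature near 50 MHz, illustrative model value
    t_sky_measured = 3600.0      # temperature inferred through the calibrated noise source

    efficiency = (noise_power_watts(t_sky_measured, bandwidth)
                  / noise_power_watts(t_sky_expected, bandwidth))
    print(f"estimated efficiency ~ {efficiency:.2f}")
    ```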

  1. 'RADAR': Euratom's standard unattended data acquisition system

    International Nuclear Information System (INIS)

    Schwalbach, P.; Holzleitner, L.; Jung, S.; Chare, P.; Smejkal, A.; Swinhoe, M.; Kloeckner, W.

    2001-01-01

    Full text: The physical verification of nuclear material is an essential part of Euratom's inspection activities. Industrial plants handling large amounts of bulk material typically require large numbers of measurements. Modern plants, particularly plutonium-handling facilities, are normally automated and make it difficult for the inspector to access the material. To adapt to plant requirements with respect to safety and security as well as economics (throughput), safeguards instrumentation is today often integrated into the plant. In order to optimize scarce inspection resources, the required measurements as well as the data analysis have to be done automatically as far as feasible. For automatic measurements Euratom has developed a new unattended data acquisition system, called RADAR (Remote Acquisition of Data and Review), which has been deployed to more than a dozen installations, handling more than 100 sensors (neutron and gamma radiation detectors, balances, seals, identity readers, switches, etc.). RADAR is the standard choice for new systems but is also gradually replacing older automatic data systems as they become outdated. RADAR and most of the associated analysis tools are the result of an in-house development, with the support of external software contractors where appropriate. Experience with turn-key systems led, in 1997, to the conclusion that in-house development would be a more effective use of resources than buying third-party products. RADAR has several layers, which will be discussed in detail in the presentation. The inner core of the package consists of services running under Windows NT. This core has watchdog and logging functions, contains a scheduler and takes care of replicating files across a network. Message and file exchange is based on TCP/IP. The replicator service contains compression and encryption facilities; the encryption is based on PGP. With the help of routers, e.g. from CISCO, network connections to remote locations can be

  2. Wind farm radar study

    International Nuclear Information System (INIS)

    Davies, N.G.

    1995-01-01

    This report examines the possible degradation of radar performance that may be caused by the presence of a wind turbine generator within the radar coverage area. A brief literature survey reviews the previously published work, which is mainly concerned with degradation of broadcast TV reception. Estimates are made of wind turbine generator scattering cross-sections, and of the time and Doppler characteristics of the echo signals from a representative wind turbine generator. The general characteristics of radar detection and tracking methods are described, and the behaviour of such systems in the presence of strong returns from a wind turbine generator (or an array of them) is discussed. (author)
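
    The Doppler character of the turbine echo is set by blade motion: with the usual two-way relation f_d = 2v/lambda, the blade-tip speed bounds the Doppler spread a radar has to contend with. The sketch below evaluates that bound for an assumed turbine geometry and radar frequency; the numbers are illustrative, not values from the report.

    ```python
    # Sketch: maximum Doppler shift produced by a wind turbine blade tip, using the
    # two-way Doppler relation f_d = 2*v/lambda. Turbine and radar values are assumed.
    import math

    def blade_tip_doppler_hz(radar_freq_hz, blade_length_m, rotor_rpm, c=3.0e8):
        """Peak Doppler shift (Hz) of the blade tip for a radar of the given frequency."""
        tip_speed = 2.0 * math.pi * blade_length_m * rotor_rpm / 60.0   # m/s
        wavelength = c / radar_freq_hz
        return 2.0 * tip_speed / wavelength

    # e.g. a 40 m blade turning at 15 rpm seen by a 3 GHz (S-band) radar
    print(f"{blade_tip_doppler_hz(3.0e9, 40.0, 15.0):.0f} Hz")
    ```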

  3. Radar techniques using array antennas

    CERN Document Server

    Wirth, Wulf-Dieter

    2013-01-01

    Radar Techniques Using Array Antennas is a thorough introduction to the possibilities of radar technology based on electronically steerable and active array antennas. Topics covered include array signal processing, array calibration, adaptive digital beamforming, adaptive monopulse, superresolution, pulse compression, sequential detection, target detection with long pulse series, space-time adaptive processing (STAP), moving target detection using synthetic aperture radar (SAR), target imaging, energy management and system parameter relations. The discussed methods are confirmed by simulation stud
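
    Several of the listed topics (adaptive digital beamforming, monopulse, STAP) build on the same primitive: per-element phase weights that steer a beam. The sketch below is the generic uniform-linear-array construction, not code from the book; the element count, spacing and steering angle are arbitrary example values.

    ```python
    # Sketch: array factor of a uniform linear array (ULA) with beamforming weights
    # steered to a chosen angle. A generic textbook construction with example values.
    import numpy as np

    def array_factor_db(n_elements, spacing_wl, steer_deg, scan_deg):
        """Normalised array factor (dB) over scan angles for a ULA steered to steer_deg."""
        n = np.arange(n_elements)
        weights = np.exp(-2j * np.pi * spacing_wl * n * np.sin(np.radians(steer_deg)))
        scan = np.radians(np.asarray(scan_deg, dtype=float))
        steering = np.exp(2j * np.pi * spacing_wl * np.outer(np.sin(scan), n))
        af = np.abs(steering @ weights) / n_elements
        return 20.0 * np.log10(np.maximum(af, 1e-6))

    angles = np.array([-60.0, -30.0, 0.0, 10.0, 20.0, 30.0, 60.0])
    for angle, level in zip(angles, array_factor_db(16, 0.5, 20.0, angles)):
        print(f"{angle:6.1f} deg: {level:7.1f} dB")      # 0 dB at the 20 deg steer angle
    ```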

  4. New look at radar auroral motions

    International Nuclear Information System (INIS)

    Greenwald, R.A.; Ecklund, W.L.

    1975-01-01

    During October 1974, three modifications were temporarily added to the NOAA radar auroral backscatter facility located at Anchorage, Alaska. These modifications included (1) a multiple azimuth antenna system, (2) an on-line computer for processing amplitude and mean Doppler profiles of the radar backscatter, and (3) a 13-baud Barker coder. In combination with the radar, these modifications provided data relevant to understanding both the microscopic and the macroscopic nature of the radar aurora. Appreciable structure was often found in the Doppler velocity profiles of radar auroral irregularities. Doppler velocities of nearly 2000 m/s were observed. By combining scatter amplitude profiles and mean Doppler profiles from the five azimuths, we have produced contour maps of the scatter intensity and the Doppler velocity. The scatter intensity maps often indicate appreciable temporal and spatial structure in the radar auroral irregularities, corroborating the results of Tsunoda et al. (1974). The mean Doppler contour maps indicate that there is also appreciable temporal and spatial structure in the flow velocities of radar auroral irregularities. At those times when there appears to be large-scale uniformity in the irregularity flow, the Doppler velocity varies with azimuth in a manner that is consistent with a cosine-dependent azimuthal variation
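
    The Barker coder refers to the length-13 Barker code, whose matched-filter (autocorrelation) output has a peak of 13 with sidelobes no larger than 1 in magnitude, giving pulse compression without severe range sidelobes. The sketch below only demonstrates that property; it is not the original instrument software.

    ```python
    # Sketch: pulse compression property of the length-13 Barker code. Its aperiodic
    # autocorrelation peaks at 13 with all sidelobes of magnitude <= 1.
    import numpy as np

    barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

    # Matched filtering is correlation of the received code with a replica of itself.
    autocorr = np.correlate(barker13, barker13, mode="full")
    print(autocorr.astype(int))      # central peak of 13, all other values 0 or +/-1
    ```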

  5. Comet radar explorer

    Science.gov (United States)

    Farnham, Tony; Asphaug, Erik; Barucci, Antonella; Belton, Mike; Bockelee-Morvan, Dominique; Brownlee, Donald; Capria, Maria Teresa; Carter, Lynn; Chesley, Steve; Farnham, Tony; Gaskell, Robert; Gim, Young; Heggy, Essam; Herique, Alain; Klaasen, Ken; Kofman, Wlodek; Kreslavsky, Misha; Lisse, Casey; Orosei, Roberto; Plaut, Jeff; Scheeres, Dan

    The Comet Radar Explorer (CORE) is designed to perform a comprehensive and detailed exploration of the interior, surface, and inner coma structures of a scientifically important Jupiter family comet. These structures will be used to investigate the origins of cometary nuclei, their physical and geological evolution, and the mechanisms driving their spectacular activity. CORE is a high heritage spacecraft, injected by solar electric propulsion into orbit around a comet. It is capable of coherent deep radar imaging at decameter wavelengths, high resolution stereo color imaging, and near-IR imaging spectroscopy. Its primary objective is to obtain a high-resolution map of the interior structure of a comet nucleus at a resolution of ~100 elements across the diameter. This structure shall be related to the surface geology and morphology, and to the structural details of the coma proximal to the nucleus. This is an ideal complement to the science from recent comet missions, providing insight into how comets work. Knowing the structure of the interior of a comet (what's inside) and how cometary activity works is required before we can understand the requirements for a cryogenic sample return mission. But more than that, CORE is fundamental to understanding the origin of comets and their evolution in time. The mission is made feasible at low cost by the use of now-standard MARSIS-SHARAD reflection radar imaging hardware and data processing, together with the proven flight heritage of solar electric propulsion. Radar flight heritage has been demonstrated by the MARSIS radar on Mars Express (Picardi et al., Science 2005; Plaut et al., Science 2007), the SHARAD radar onboard the Mars Reconnaissance Orbiter (Seu et al., JGR 2007), and the LRS radar onboard Kaguya (Ono et al., EPS 2007). These instruments have discovered detailed subsurface structure to depths of several kilometers in a variety of terrains on Mars and the Moon. A reflection radar deployed in orbit about a comet

  6. Hydrologic applications of weather radar

    Science.gov (United States)

    Seo, Dong-Jun; Habib, Emad; Andrieu, Hervé; Morin, Efrat

    2015-12-01

    By providing high-resolution quantitative precipitation information (QPI), weather radars have revolutionized hydrology in the last two decades. With the aid of GIS technology, radar-based quantitative precipitation estimates (QPE) have enabled routine high-resolution hydrologic modeling in many parts of the world. Given the ever-increasing need for higher-resolution hydrologic and water resources information for a wide range of applications, one may expect that the use of weather radar will only grow. Despite the tremendous progress, a number of significant scientific, technological and engineering challenges remain to realize its potential. New challenges are also emerging as new areas of applications are discovered, explored and pursued. The purpose of this special issue is to provide the readership with some of the latest advances, lessons learned, experiences gained, and science issues and challenges related to hydrologic applications of weather radar. The special issue features 20 contributions on various topics which reflect the increasing diversity as well as the areas of focus in radar hydrology today. The contributions may be grouped as follows:

  7. Classification of radar echoes using fractal geometry

    International Nuclear Information System (INIS)

    Azzaz, Nafissa; Haddad, Boualem

    2017-01-01

    Highlights: • Implementation of two concepts of fractal geometry to classify two types of meteorological radar echoes. • A new approach, called the multi-scale fractal dimension, is used for classification between fixed echoes and rain echoes. • An automatic identification system for meteorological radar echoes is proposed using fractal geometry. - Abstract: This paper deals with the discrimination between precipitation echoes and ground echoes in meteorological radar images using fractal geometry. The study aims to improve the measurement of precipitation by weather radars. For this, we considered three radar sites: Bordeaux (France), Dakar (Senegal) and Melbourne (USA). We showed that the contourlet-based fractal dimension and the fractal lacunarity are pertinent for discriminating between ground and precipitation echoes. We also demonstrated that the ground echoes have a multifractal structure, whereas the precipitation echoes are more homogeneous, whatever the prevailing climate. We therefore developed an automatic radar echo classification system with a graphical interface. This interface, based on fractal geometry, makes the identification of radar echo types possible in real time. The system can be integrated into weather radars to improve precipitation estimation.
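
    The most common estimator behind such fractal measures is box counting, where the dimension is the slope of log N(eps) against log(1/eps) for boxes of size eps covering the echo mask. The sketch below is that generic estimator applied to a random stand-in mask; it is not the contourlet-based multi-scale dimension proposed in the paper.

    ```python
    # Sketch: box-counting fractal dimension of a binary echo mask. Generic estimator,
    # not the contourlet-based multi-scale dimension from the record above.
    import numpy as np

    def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
        """Estimate fractal dimension as the slope of log N(eps) versus log(1/eps)."""
        counts = []
        for s in box_sizes:
            h = mask.shape[0] // s * s                       # crop to a multiple of the box size
            w = mask.shape[1] // s * s
            blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
            counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes, float)), np.log(counts), 1)
        return slope

    rng = np.random.default_rng(0)
    echo_mask = rng.random((256, 256)) > 0.7                 # stand-in for a thresholded echo image
    print(f"estimated dimension ~ {box_counting_dimension(echo_mask):.2f}")
    ```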

  8. The Supertree Toolkit 2: a new and improved software package with a Graphical User Interface for supertree construction

    Directory of Open Access Journals (Sweden)

    Jon Hill

    2014-03-01

    Full Text Available Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology and in conservation and biodiversity. No easy to use and fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses well defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) a revised storage format that integrates both tree- and meta-data into a single file. These data can then be manipulated according to a well-defined, but flexible, processing pipeline using either the GUI or a command-line based tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analyses using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.

  9. The Supertree Toolkit 2: a new and improved software package with a Graphical User Interface for supertree construction.

    Science.gov (United States)

    Hill, Jon; Davis, Katie E

    2014-01-01

    Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology and in conservation and biodiversity. No easy to use and fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses well defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) a revised storage format that integrates both tree- and meta-data into a single file. These data can then be manipulated according to a well-defined, but flexible, processing pipeline using either the GUI or a command-line based tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analyses using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.

  10. MST radar data-base management

    Science.gov (United States)

    Wickwar, V. B.

    1983-01-01

    Data management for Mesospheric-Stratospheric-Tropospheric (MST) radars is addressed. An incoherent-scatter radar data base is discussed in terms of purpose, centralization, scope, and nature of the data base management system.

  11. Stepped-frequency radar sensors theory, analysis and design

    CERN Document Server

    Nguyen, Cam

    2016-01-01

    This book presents the theory, analysis and design of microwave stepped-frequency radar sensors. Stepped-frequency radar sensors are attractive for various sensing applications that require fine resolution. The book consists of five chapters. The first chapter describes the fundamentals of radar sensors including applications followed by a review of ultra-wideband pulsed, frequency-modulated continuous-wave (FMCW), and stepped-frequency radar sensors. The second chapter discusses a general analysis of radar sensors including wave propagation in media and scattering on targets, as well as the radar equation. The third chapter addresses the analysis of stepped-frequency radar sensors including their principles and design parameters. Chapter 4 presents the development of two stepped-frequency radar sensors at microwave and millimeter-wave frequencies based on microwave integrated circuits (MICs), microwave monolithic integrated circuits (MMICs) and printed-circuit antennas, and discusses their signal processing....
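
    The defining idea of a stepped-frequency sensor is that N frequency steps of size Δf, transmitted sequentially, can be combined by an inverse DFT into a synthetic range profile with resolution c/(2NΔf). The sketch below simulates that synthesis for two point targets; the step count, step size, carrier and target ranges are arbitrary illustrative values, not parameters from the book.

    ```python
    # Sketch: synthesising a high-resolution range profile from stepped-frequency returns.
    # N steps of size df give range resolution c/(2*N*df); two point targets are simulated.
    import numpy as np

    c = 3.0e8
    n_steps, df, f0 = 64, 10e6, 10e9                  # 64 steps of 10 MHz starting at 10 GHz
    freqs = f0 + df * np.arange(n_steps)
    target_ranges_m = [4.0, 7.3]                      # assumed point-target ranges

    # Complex return at each frequency: two-way phase delay, summed over targets.
    returns = sum(np.exp(-1j * 4.0 * np.pi * freqs * r / c) for r in target_ranges_m)
    profile = np.abs(np.fft.ifft(returns))

    bin_size = c / (2.0 * n_steps * df)               # range-bin spacing (~0.23 m here)
    peak_bins = np.sort(np.argsort(profile)[-2:])
    print(f"resolution {bin_size:.2f} m, detected ranges ~ "
          f"{[round(float(b) * bin_size, 2) for b in peak_bins]} m")
    ```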

  12. Challenges in X-band Weather Radar Data Calibration

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Rasmussen, Michael R.

    2009-01-01

    Application of weather radar data in urban hydrology is evolving and radar data is now applied for both modelling, analysis and real time control purposes. In these contexts, it is all-important that the radar data is well calibrated and adjusted in order to obtain valid quantitative precipitation estimates. This paper compares two calibration procedures for a small marine X-band radar by comparing radar data with rain gauge data. Validation shows a very good consensus with regards to precipitation volumes, but more diverse results on peak rain intensities.

  13. MX: A beamline control system toolkit

    Science.gov (United States)

    Lavender, William M.

    2000-06-01

    The development of experimental and beamline control systems for two Collaborative Access Teams at the Advanced Photon Source has resulted in the creation of a portable data acquisition and control toolkit called MX. MX consists of a set of servers, application programs and libraries that enable the creation of command line and graphical user interface applications that may be easily retargeted to new and different kinds of motor and device controllers. The source code for MX is written in ANSI C and Tcl/Tk with interprocess communication via TCP/IP. MX is available for several versions of Unix, Windows 95/98/NT and DOS. It may be downloaded from the web site http://www.imca.aps.anl.gov/mx/.

  14. Radar principles for the nonspecialist, 3rd edition

    CERN Document Server

    Toomay, John

    2004-01-01

    Radar Principles for the Non-specialist, Third Edition continues its popular tradition: to distill the very complex technology of radar into its fundamentals, tying them to the laws of nature on one end and to the most modern and complex systems on the other. It starts with electromagnetic propagation, describes a radar of the utmost simplicity, and derives the radar range equation from that simple radar. Once the range equation is available, the book attacks the meaning of each term in it, moving through antennas, detection and tracking, radar cross-section, waveforms and signal proces

  15. Methodology for the development of a taxonomy and toolkit to evaluate health-related habits and lifestyle (eVITAL

    Directory of Open Access Journals (Sweden)

    Walsh Carolyn O

    2010-03-01

    Full Text Available Abstract Background Chronic diseases cause an ever-increasing percentage of morbidity and mortality, but many have modifiable risk factors. Many behaviors that predispose an individual to, or protect against, chronic disease are interrelated, and therefore are best approached using an integrated model of health and the longevity paradigm, using years lived without disability as the endpoint. Findings This study used a 4-phase mixed qualitative design to create a taxonomy and related online toolkit for the evaluation of health-related habits. Core members of a working group conducted a literature review and created a framing document that defined relevant constructs. This document was revised, first by a working group and then by a series of multidisciplinary expert groups. The working group and expert panels also designed a systematic evaluation of health behaviors and risks, which was computerized and evaluated for feasibility. A demonstration study of the toolkit was performed in 11 healthy volunteers. Discussion In this protocol, we used forms of the community intelligence approach, including frame analysis, feasibility, and demonstration, to develop a clinical taxonomy and an online toolkit with standardized procedures for screening and evaluation of multiple domains of health, with a focus on longevity and the goal of integrating the toolkit into routine clinical practice. Trial Registration IMSERSO registry 200700012672

  16. Principles of modern radar advanced techniques

    CERN Document Server

    Melvin, William

    2012-01-01

    Principles of Modern Radar: Advanced Techniques is a professional reference for practicing engineers that provides a stepping stone to advanced practice with in-depth discussions of the most commonly used advanced techniques for radar design. It will also serve advanced radar academic and training courses with a complete set of problems for students as well as solutions for instructors.

  17. An evaluation capacity building toolkit for principal investigators of undergraduate research experiences: A demonstration of transforming theory into practice.

    Science.gov (United States)

    Rorrer, Audrey S

    2016-04-01

    This paper describes the approach and process undertaken to develop evaluation capacity among the leaders of a federally funded undergraduate research program. An evaluation toolkit was developed for Computer and Information Sciences and Engineering Research Experiences for Undergraduates (CISE REU) programs to address the ongoing need for evaluation capacity among principal investigators who manage program evaluation. The toolkit was the result of collaboration within the CISE REU community, with the purpose of providing targeted instructional resources and tools for quality program evaluation. Challenges were to balance the desire for standardized assessment with the responsibility to account for individual program contexts. Toolkit contents included instructional materials about evaluation practice, a standardized applicant management tool, and a modulated outcomes measure. Resulting benefits from toolkit deployment included cost-effective, sustainable evaluation tools, a community evaluation forum, and aggregate measurement of key program outcomes for the national program. Lessons learned included the imperative of understanding the evaluation context, engaging stakeholders, and building stakeholder trust. Results from project measures are presented along with a discussion of guidelines for facilitating evaluation capacity building that will serve a variety of contexts.

  18. Mutual information-based LPI optimisation for radar network

    Science.gov (United States)

    Shi, Chenguang; Zhou, Jianjiang; Wang, Fei; Chen, Jun

    2015-07-01

    A radar network can offer significant performance improvement for target detection and information extraction by employing spatial diversity. For a fixed number of radars, the achievable mutual information (MI) for estimating the target parameters may extend beyond a predefined threshold with full power transmission. In this paper, an effective low probability of intercept (LPI) optimisation algorithm is presented to improve the LPI performance of a radar network. Based on the radar network system model, we first adopt the Schleher intercept factor of the radar network as an optimisation metric for LPI performance. Then, a novel LPI optimisation algorithm is presented in which, for a predefined MI threshold, the Schleher intercept factor of the radar network is minimised by optimising the transmission power allocation among the radars in the network, so that enhanced LPI performance can be achieved. A genetic algorithm based on nonlinear programming (GA-NP) is employed to solve the resulting nonconvex and nonlinear optimisation problem. Simulations demonstrate that the proposed algorithm is valuable and effective in improving the LPI performance of the radar network.
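
    The core of such a scheme is a constrained power allocation: push transmit power down while an aggregate mutual-information measure stays above its threshold. The sketch below is a heavily simplified stand-in, minimising total power under a placeholder MI model sum(log(1 + g_i p_i)) with a generic SLSQP solver; the gains, threshold, objective and solver are all assumptions and not the paper's GA-NP formulation or its Schleher intercept factor.

    ```python
    # Sketch: minimise total transmit power of a radar network subject to a mutual-
    # information constraint. The MI model and the use of total power as a proxy for
    # interceptability are simplifications, not the formulation used in the record.
    import numpy as np
    from scipy.optimize import minimize

    gains = np.array([0.8, 1.5, 0.5, 1.1])     # per-radar "MI efficiency" (assumed)
    mi_threshold = 3.0                         # required mutual information (nats, assumed)
    p_max = 10.0                               # per-radar power limit (arbitrary units)

    objective = lambda p: p.sum()              # proxy for how interceptable the network is
    mi_constraint = {"type": "ineq",
                     "fun": lambda p: np.sum(np.log1p(gains * p)) - mi_threshold}

    result = minimize(objective, x0=np.full(4, 1.0), method="SLSQP",
                      bounds=[(0.0, p_max)] * 4, constraints=[mi_constraint])
    print("allocation:", result.x.round(3), "total power:", round(float(result.x.sum()), 3))
    ```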

  19. The PAZAR database of gene regulatory information coupled to the ORCA toolkit for the study of regulatory sequences

    Science.gov (United States)

    Portales-Casamar, Elodie; Arenillas, David; Lim, Jonathan; Swanson, Magdalena I.; Jiang, Steven; McCallum, Anthony; Kirov, Stefan; Wasserman, Wyeth W.

    2009-01-01

    The PAZAR database unites independently created and maintained data collections of transcription factor and regulatory sequence annotation. The flexible PAZAR schema permits the representation of diverse information derived from experiments ranging from biochemical protein–DNA binding to cellular reporter gene assays. Data collections can be made available to the public, or restricted to specific system users. The data ‘boutiques’ within the shopping-mall-inspired system facilitate the analysis of genomics data and the creation of predictive models of gene regulation. Since its initial release, PAZAR has grown in terms of data, features and through the addition of an associated package of software tools called the ORCA toolkit (ORCAtk). ORCAtk allows users to rapidly develop analyses based on the information stored in the PAZAR system. PAZAR is available at http://www.pazar.info. ORCAtk can be accessed through convenient buttons located in the PAZAR pages or via our website at http://www.cisreg.ca/ORCAtk. PMID:18971253

  20. Innovations in oral health: A toolkit for interprofessional education.

    Science.gov (United States)

    Dolce, Maria C; Parker, Jessica L; Werrlein, Debra T

    2017-05-01

    The integration of oral health competencies into non-dental health professions curricula can serve as an effective driver for interprofessional education (IPE). The purpose of this report is to describe a replicable oral-health-driven IPE model and corresponding online toolkit, both of which were developed as part of the Innovations in Oral Health (IOH): Technology, Instruction, Practice, and Service programme at Bouvé College of Health Sciences, Northeastern University, USA. Tooth decay is a largely preventable disease that is connected to overall health and wellness, and it affects the majority of adults and a fifth of children in the United States. To prepare all health professionals to address this problem, the IOH model couples programming from the online resource Smiles for Life: A National Oral Health Curriculum with experiential learning opportunities designed for undergraduate and graduate students that include simulation learning (technology), hands-on workshops and didactic sessions (instruction), and opportunities for both cooperative education (practice) and community-based learning (service). The IOH Toolkit provides the means for others to replicate portions of the IOH model or to establish a large-scale IPE initiative that will support the creation of an interprofessional workforce, one equipped with oral health competencies and ready for collaborative practice.