WorldWideScience

Sample records for protocol analysis tools

  1. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order...
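
The intermediate XML format itself is not given in the abstract; as a rough illustration of the first tool-chain step, the sketch below turns a small state/transition table into XML. The table layout, element names and attribute names are all assumptions for illustration, not the paper's actual schema.

```python
# Hypothetical first tool-chain step: a state/transition table rendered as
# an intermediate XML document. Table layout and XML schema are assumptions;
# the paper's actual format may differ.
import xml.etree.ElementTree as ET

# (state, received message) -> (next state, sent message)
transitions = {
    ("Init", "request"): ("Wait", "ack"),
    ("Wait", "data"): ("Done", "confirm"),
}

protocol = ET.Element("protocol", name="SampleWebService")
for (state, msg_in), (next_state, msg_out) in transitions.items():
    t = ET.SubElement(protocol, "transition")
    t.set("from", state)
    t.set("receive", msg_in)
    t.set("to", next_state)
    t.set("send", msg_out)

ET.dump(protocol)  # prints the intermediate XML to stdout
```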

  2. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim of identifying how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on the attempt to further develop these areas of study and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.
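
To make the sampling point concrete, the toy sketch below computes a standard areal texture parameter, Sq (root-mean-square height), over patches of different sizes. The synthetic height map and patch sizes are invented for illustration and do not reproduce the paper's data or algorithms.

```python
# Toy illustration of scale-dependent surface texture measurement:
# Sq (root-mean-square height) computed over patches of several sizes.
# The height map is synthetic; real analyses use confocal microscopy data.
import numpy as np

rng = np.random.default_rng(0)
surface = rng.normal(0.0, 0.5, size=(256, 256))  # synthetic height map (um)

def sq(patch):
    """RMS deviation from the mean plane of a height-map patch."""
    return float(np.sqrt(np.mean((patch - patch.mean()) ** 2)))

for size in (32, 64, 128):          # the sampling scale matters, as noted above
    patch = surface[:size, :size]   # fixed sampling location
    print(f"Sq over {size}x{size} patch: {sq(patch):.3f} um")
```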

  3. TH-C-18A-08: A Management Tool for CT Dose Monitoring, Analysis, and Protocol Review

    International Nuclear Information System (INIS)

    Wang, J; Chan, F; Newman, B; Larson, D; Leung, A; Fleischmann, D; Molvin, L; Marsh, D; Zorich, C; Phillips, L

    2014-01-01

Purpose: To develop a customizable tool for enterprise-wide management of CT protocols and analysis of radiation dose information of CT exams for a variety of quality control applications. Methods: All clinical CT protocols implemented on the 11 CT scanners at our institution were extracted in digital format. The original protocols had been preset by our CT management team. A commercial CT dose tracking software (DoseWatch, GE Healthcare, WI) was used to collect exam information (exam date, patient age, etc.), scanning parameters, and radiation doses for all CT exams. We developed a Matlab-based program (MathWorks, MA) with a graphic user interface which allows the user to analyze the scanning protocols together with the actual dose estimates, and to compare the data to national (ACR, AAPM) and internal reference values for CT quality control. Results: The CT protocol review portion of our tool allows the user to look up the scanning and image reconstruction parameters of any protocol on any of the installed CT systems, among about 120 protocols per scanner. In the dose analysis tool, dose information of all CT exams (from 05/2013 to 02/2014) was stratified on a protocol level, and within a protocol down to series level, i.e. each individual exposure event. This allows numerical and graphical review of dose information of any combination of scanner models, protocols and series. The key functions of the tool include: statistics of CTDI, DLP and SSDE; dose monitoring using user-set CTDI/DLP/SSDE thresholds; look-up of any CT exam dose data; and CT protocol review. Conclusion: Our in-house CT management tool provides radiologists, technologists and administration first-hand, near real-time, enterprise-wide knowledge of CT dose levels of different exam types. Medical physicists use this tool to manage CT protocols and to compare and optimize dose levels across different scanner models. It provides technologists feedback on CT scanning operation, and knowledge of important dose baselines and thresholds.
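
A minimal sketch of the dose-monitoring idea, assuming invented column names and threshold values: exam records are stratified by protocol and exposures above a user-set CTDIvol limit are flagged.

```python
# Sketch: stratify CT exam dose records by protocol and flag exposures
# above user-set CTDIvol thresholds. Column names and limits are
# assumptions for illustration, not the tool's actual configuration.
import pandas as pd

exams = pd.DataFrame({
    "scanner":  ["CT1", "CT1", "CT2", "CT2"],
    "protocol": ["Head", "Head", "Chest", "Chest"],
    "ctdi_vol": [55.0, 48.0, 12.5, 21.0],    # mGy
})
thresholds = {"Head": 60.0, "Chest": 15.0}   # user-set CTDIvol limits (mGy)

exams["over_threshold"] = exams.apply(
    lambda r: r["ctdi_vol"] > thresholds[r["protocol"]], axis=1)

print(exams.groupby("protocol")["ctdi_vol"].describe())  # per-protocol statistics
print(exams[exams["over_threshold"]])                    # flagged exposures
```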

  4. Analysis of MD5 authentication in various routing protocols using simulation tools

    Science.gov (United States)

    Dinakaran, M.; Darshan, K. N.; Patel, Harsh

    2017-11-01

Authentication is an important paradigm of security, and computer networks require secure paths to make the flow of data even more secure through security protocols. MD-5 (Message Digest 5) helps by providing data integrity for the data sent through it and authentication for the network devices. This paper gives a brief introduction to MD-5 and presents a simulation of networks with MD-5 authentication enabled, using various routing protocols such as OSPF, EIGRP and RIPv2. GNS3 is used to simulate the scenarios. Analysis of the MD-5 authentication is done in the later sections of the paper.
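
For concreteness, RIPv2 and OSPF keyed-MD5 authentication computes the digest over the packet bytes with the shared secret appended; the receiver recomputes it with its own copy of the key. The sketch below shows that scheme in outline, with details such as key padding and sequence numbers omitted.

```python
# Outline of keyed-MD5 authentication as used by RIPv2/OSPF: the digest
# is MD5 over the packet with the shared secret appended. Key padding
# and anti-replay sequence numbers are omitted in this sketch.
import hashlib

def md5_auth_digest(packet: bytes, key: bytes) -> bytes:
    return hashlib.md5(packet + key).digest()

packet = b"\x02\x01\x00\x30" + b"routing-update-payload"  # fabricated update
key = b"shared-secret"

digest = md5_auth_digest(packet, key)
print(digest.hex())

# The receiver accepts the update only if its own digest matches.
assert md5_auth_digest(packet, key) == digest
```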

  5. Analysis and Verification of a Key Agreement Protocol over Cloud Computing Using Scyther Tool

    OpenAIRE

    Hazem A Elbaz

    2015-01-01

Most cloud computing authentication mechanisms use public key infrastructure (PKI). Hierarchical Identity Based Cryptography (HIBC) has several advantages that align well with the demands of cloud computing. The main objectives of cloud computing authentication protocols are security and efficiency. In this paper, we clarify the Hierarchical Identity Based Authentication Key Agreement (HIB-AKA) protocol, providing a lightweight key management approach for cloud computing users. Then, we...

  6. Metabolomics Workbench: An international repository for metabolomics data and metadata, metabolite standards, protocols, tutorials and training, and analysis tools.

    Science.gov (United States)

    Sud, Manish; Fahy, Eoin; Cotter, Dawn; Azam, Kenan; Vadivelu, Ilango; Burant, Charles; Edison, Arthur; Fiehn, Oliver; Higashi, Richard; Nair, K Sreekumaran; Sumner, Susan; Subramaniam, Shankar

    2016-01-04

The Metabolomics Workbench, available at www.metabolomicsworkbench.org, is a public repository for metabolomics metadata and experimental data spanning various species and experimental platforms, metabolite standards, metabolite structures, protocols, tutorials, and training material and other educational resources. It provides a computational platform to integrate, analyze, track, deposit and disseminate large volumes of heterogeneous data from a wide variety of metabolomics studies including mass spectrometry (MS) and nuclear magnetic resonance spectrometry (NMR) data spanning over 20 different species covering all the major taxonomic categories including humans and other mammals, plants, insects, invertebrates and microorganisms. Additionally, a number of protocols are provided for a range of metabolite classes, sample types, and both MS- and NMR-based studies, along with a metabolite structure database. The metabolites characterized in the studies available on the Metabolomics Workbench are linked to chemical structures in the metabolite structure database to facilitate comparative analysis across studies. The Metabolomics Workbench, part of the data coordinating effort of the National Institutes of Health (NIH) Common Fund's Metabolomics Program, provides data from the Common Fund's Metabolomics Resource Cores, metabolite standards, and analysis tools to the wider metabolomics community and seeks data depositions from metabolomics researchers across the world. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. An Exploratory Factor Analysis of the Sheltered Instruction Observation Protocol as an Evaluation Tool to Measure Teaching Effectiveness

    Science.gov (United States)

    Polat, Nihat; Cepik, Saban

    2016-01-01

    To narrow the achievement gap between English language learners (ELLs) and their native-speaking peers in K-12 settings in the United States, effective instructional models must be identified. However, identifying valid observation protocols that can measure the effectiveness of specially designed instructional practices is not an easy task. This…

  8. Mobile Internet Protocol Analysis

    National Research Council Canada - National Science Library

    Brachfeld, Lawrence

    1999-01-01

    ...) and User Datagram Protocol (UDP). Mobile IP allows mobile computers to send and receive packets addressed with their home network IP address, regardless of the IP address of their current point of attachment on the Internet...

  9. A communication protocol for interactively controlling software tools

    NARCIS (Netherlands)

    Wulp, van der J.

    2008-01-01

    We present a protocol for interactively using software tools in a loosely coupled tool environment. Such an environment can assist the user in doing tasks that require the use of multiple tools. For example, it can invoke tools on certain input, set processing parameters, await task completion and

  10. Tools for Performance Assessment of OLSR Protocol

    Directory of Open Access Journals (Sweden)

    Makoto Ikeda

    2009-01-01

In this paper, we evaluate the performance of the Optimized Link State Routing (OLSR) protocol by experimental and simulation results. The experiments are carried out by using our implemented testbed and the simulations by using the ns-2 simulator. We also designed and implemented a new interface for the ad-hoc network testbed in order to make the experiments easier. The comparison between experimental and simulation results shows that for the same parameter set, in the simulation we did not notice any packet loss. On the other hand, in the experiments we experienced packet loss because of environmental effects and traffic interference.

  11. Cryptographic protocol security analysis based on bounded constructing algorithm

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

An efficient approach to analyzing cryptographic protocols is to develop automatic analysis tools based on formal methods. However, the approach has encountered the problem of high computational complexity, because protocol participants are arbitrary, their message structures are complex, and their executions are concurrent. We propose an efficient automatic verifying algorithm for analyzing cryptographic protocols based on the recently proposed Cryptographic Protocol Algebra (CPA) model, in which algebraic techniques are used to simplify the description of cryptographic protocols and their executions. Redundant states generated in the analysis process are much reduced by introducing a new algebraic technique called Universal Polynomial Equation, and the algorithm can be used to verify the correctness of protocols in an infinite state space. We have implemented an efficient automatic analysis tool for cryptographic protocols, called ACT-SPA, based on this algorithm, and used the tool to check more than 20 cryptographic protocols. The analysis results show that this tool is more efficient, and an attack instance not reported previously was detected by using this tool.

  12. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

Network protocol analysis is a technical means necessary for a network sniffer to capture packets and further analyze and understand them. Network sniffing intercepts packets and reassembles the binary format of the original message content. In order to obtain the information contained in them, the packets must be decoded according to the TCP/IP protocol stack specifications, restoring the protocol format and content at each protocol layer, up to the actual data transferred at the application tier.
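
A minimal sketch of this layer-by-layer decoding, using a fabricated sample frame: strip the Ethernet header, then read the IPv4 fields at the offsets given by the protocol specification.

```python
# Layer-by-layer decoding sketch: skip the Ethernet header, then parse
# IPv4 header fields at their specified offsets. The frame is fabricated.
import struct

def parse_ipv4_header(frame: bytes):
    ip = frame[14:]                          # skip 14-byte Ethernet header
    ver_ihl, tos, total_len = struct.unpack("!BBH", ip[:4])
    proto = ip[9]                            # 6 = TCP, 17 = UDP
    src = ".".join(str(b) for b in ip[12:16])
    dst = ".".join(str(b) for b in ip[16:20])
    return {"version": ver_ihl >> 4, "length": total_len,
            "protocol": proto, "src": src, "dst": dst}

frame = (b"\xff" * 6 + b"\xaa" * 6 + b"\x08\x00" +      # Ethernet header
         bytes.fromhex("450000280001000040067ccd") +    # IPv4 header, part 1
         bytes.fromhex("c0a80001c0a80002"))             # 192.168.0.1 -> .2
print(parse_ipv4_header(frame))
```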

  13. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees.

    Science.gov (United States)

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-04-01

    A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  14. Security analysis of session initiation protocol

    OpenAIRE

    Dobson, Lucas E.

    2010-01-01

    Approved for public release; distribution is unlimited The goal of this thesis is to investigate the security of the Session Initiation Protocol (SIP). This was accomplished by researching previously discovered protocol and implementation vulnerabilities, evaluating the current state of security tools and using those tools to discover new vulnerabilities in SIP software. The CVSS v2 system was used to score protocol and implementation vulnerabilities to give them a meaning that was us...

  15. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1991-04-01

There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level ones, such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For purposes of this paper, analysis is broken down into five main stages, which are also classified into the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, thus it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed.

  16. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

A building energy analysis system includes: a building component library configured to store a plurality of building components; a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library; a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis, and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models; and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  17. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  18. Contamination Analysis Tools

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

This talk presents three different tools developed recently for contamination analysis: (1) an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; (2) a Java RGA extractor, which can load multiple SRS .ana files and extract pressure vs. time data; and (3) a C++ contamination simulation code, a 3D particle-tracing code for modeling the transport of dust particulates and molecules. It uses residence time to determine if molecules stick; particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  19. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
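
The abstract notes that the prototype is Python code driving PSS/E. The skeleton below sketches only the hybrid loop structure described above; the Simulator interface is hypothetical and stands in for the actual psspy calls.

```python
# Structural sketch of a hybrid cascading-outage loop like the one DCAT
# automates. The Simulator class is a hypothetical stand-in for a PSS/E
# session; it is not the tool's real API.
class Simulator:
    def apply_contingency(self, element): ...
    def run_dynamics(self, seconds): ...
    def solve_steady_state(self): ...
    def overloaded_elements(self): return []   # protection-driven trips

def cascade(sim, initiating_event, max_steps=10):
    """Alternate fast dynamic and slower quasi-steady-state stages."""
    queue = [initiating_event]
    steps = 0
    while queue and steps < max_steps:
        sim.apply_contingency(queue.pop(0))
        sim.run_dynamics(seconds=10)    # fast dynamic events
        sim.solve_steady_state()        # slower steady-state events
        queue.extend(sim.overloaded_elements())
        steps += 1
    return steps

print(cascade(Simulator(), initiating_event="line-1042"))
```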

  20. Optimizing the ASC WAN: evaluating network performance tools for comparing transport protocols.

    Energy Technology Data Exchange (ETDEWEB)

    Lydick, Christopher L.

    2007-07-01

    The Advanced Simulation & Computing Wide Area Network (ASC WAN), which is a high delay-bandwidth network connection between US Department of Energy National Laboratories, is constantly being examined and evaluated for efficiency. One of the current transport-layer protocols which is used, TCP, was developed for traffic demands which are different from that on the ASC WAN. The Stream Control Transport Protocol (SCTP), on the other hand, has shown characteristics which make it more appealing to networks such as these. Most important, before considering a replacement for TCP on any network, a testing tool that performs well against certain criteria needs to be found. In order to try to find such a tool, two popular networking tools (Netperf v.2.4.3 & v.2.4.6 (OpenSS7 STREAMS), and Iperf v.2.0.6) were tested. These tools implement both TCP and SCTP and were evaluated using four metrics: (1) How effectively can the tool reach a throughput near the bandwidth? (2) How much of the CPU does the tool utilize during operation? (3) Is the tool freely and widely available? And, (4) Is the tool actively developed? Following the analysis of those tools, this paper goes further into explaining some recommendations and ideas for future work.
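
The first two evaluation metrics can be illustrated without either tool: the self-contained sketch below measures achieved throughput and CPU utilisation for a plain TCP transfer over loopback. It illustrates the metrics only; it is not Netperf or Iperf.

```python
# Measure throughput (metric 1) and CPU utilisation (metric 2) of a TCP
# transfer over loopback. Self-contained stand-in for a benchmark tool.
import socket, threading, time

PAYLOAD = b"x" * (1 << 20)   # 1 MiB per send
ROUNDS = 64                  # 64 MiB total

def drain(listener):
    conn, _ = listener.accept()
    while conn.recv(1 << 16):
        pass
    conn.close()

listener = socket.create_server(("127.0.0.1", 0))
threading.Thread(target=drain, args=(listener,), daemon=True).start()

client = socket.create_connection(listener.getsockname())
wall0, cpu0 = time.perf_counter(), time.process_time()
for _ in range(ROUNDS):
    client.sendall(PAYLOAD)
client.close()
wall, cpu = time.perf_counter() - wall0, time.process_time() - cpu0

mbytes = ROUNDS * len(PAYLOAD) / 1e6
print(f"throughput: {mbytes / wall:.1f} MB/s, CPU utilisation: {cpu / wall:.0%}")
```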

  21. Symbolic Analysis of Cryptographic Protocols

    DEFF Research Database (Denmark)

    Dahl, Morten

We present our work on using abstract models for formally analysing cryptographic protocols: First, we present an efficient method for verifying trace-based authenticity properties of protocols using nonces, symmetric encryption, and asymmetric encryption. The method is based on a type system of Gordon et al., which we modify to support fully-automated type inference. Tests conducted via an implementation of our algorithm found it to be very efficient. Second, we show how privacy may be captured in a symbolic model using an equivalence-based property and give a formal definition. We formalise...

  22. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
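
A simplified illustration of the frequency response measure: the MW change between the pre-event (A) and settling (B) points per 0.1 Hz of frequency deviation. The event values below are invented, and BAL-003-1 details such as point selection and averaging windows are omitted.

```python
# Frequency response measure, simplified: MW response per 0.1 Hz between
# the A (pre-event) and B (settling) points of an under-frequency event.
# All numbers are invented for illustration.
freq_a, freq_b = 60.000, 59.950                        # Hz
net_interchange_a, net_interchange_b = 1500.0, 1380.0  # MW

delta_f = freq_b - freq_a                              # -0.050 Hz
delta_mw = net_interchange_b - net_interchange_a       # -120 MW

frm = delta_mw / (delta_f * 10.0)                      # MW per 0.1 Hz
print(f"frequency response: {frm:.1f} MW/0.1 Hz")      # 240.0 MW/0.1 Hz
```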

  23. A Logical Analysis of Quantum Voting Protocols

    Science.gov (United States)

    Rad, Soroush Rafiee; Shirinkalam, Elahe; Smets, Sonja

    2017-12-01

In this paper we provide a logical analysis of the Quantum Voting Protocol for Anonymous Surveying as developed by Horoshko and Kilin in Phys. Lett. A 375, 1172-1175 (2011). In particular we make use of the probabilistic logic of quantum programs as developed in Int. J. Theor. Phys. 53, 3628-3647 (2014) to provide a formal specification of the protocol and to derive its correctness. Our analysis is part of a wider program on the application of quantum logics to the formal verification of protocols in quantum communication and quantum computation.

  24. Draper Station Analysis Tool

    Science.gov (United States)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  25. An Evaluation Methodology for Protocol Analysis Systems

    Science.gov (United States)

    2007-03-01

Acronyms: NS: Needham-Schroeder; NSL: Needham-Schroeder-Lowe; OCaml: Objective Caml; POSIX: Portable Operating System ... methodology is needed. As with any field, there is a specialized language used within the protocol analysis community. ... ProVerif requires that Objective Caml (OCaml) be installed on the system; OCaml version 3.09.3 was installed.

  26. Minimising post-operative risk using a Post-Anaesthetic Care Tool (PACT): protocol for a prospective observational study and cost-effectiveness analysis.

    Science.gov (United States)

    Street, Maryann; Phillips, Nicole M; Kent, Bridie; Colgan, Stephen; Mohebbi, Mohammadreza

    2015-06-01

    While the risk of adverse events following surgery has been identified, the impact of nursing care on early detection of these events is not well established. A systematic review of the evidence and an expert consensus study in post-anaesthetic care identified essential criteria for nursing assessment of patient readiness for discharge from the post-anaesthetic care unit (PACU). These criteria were included in a new nursing assessment tool, the Post-Anaesthetic Care Tool (PACT), and incorporated into the post-anaesthetic documentation at a large health service. The aim of this study is to test the clinical reliability of the PACT and evaluate whether the use of PACT will (1) enhance the recognition and response to patients at risk of deterioration in PACU; (2) improve documentation for handover from PACU nurse to ward nurse; (3) result in improved patient outcomes and (4) reduce healthcare costs. A prospective, non-randomised, pre-implementation and post-implementation design comparing: (1) patients (n=750) who have surgery prior to the implementation of the PACT and (2) patients (n=750) who have surgery after PACT. The study will examine the use of the tool through the observation of patient care and nursing handover. Patient outcomes and cost-effectiveness will be determined from health service data and medical record audit. Descriptive statistics will be used to describe the sample and compare the two patient groups (pre-intervention and post-intervention). Differences in patient outcomes between the two groups will be compared using the Cochran-Mantel-Haenszel test and regression analyses and reported as ORs with the corresponding 95% CIs. This study will test the clinical reliability and cost-effectiveness of the PACT. It is hypothesised that the PACT will enable nurses to recognise and respond to patients at risk of deterioration, improve handover to ward nurses, improve patient outcomes, and reduce healthcare costs. Published by the BMJ Publishing Group
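
As a sketch of the planned effect estimate, the snippet below computes an odds ratio with a Woolf-type 95% CI from an invented 2x2 table; the study's actual analysis additionally applies the Cochran-Mantel-Haenszel test and regression adjustment.

```python
# Odds ratio with a Woolf-type 95% CI for a deterioration outcome,
# pre- vs post-PACT. The counts are invented for illustration.
import math

# 2x2 table: rows = pre/post groups, columns = event / no event
a, b = 45, 705   # pre-PACT:  deterioration events, non-events
c, d = 28, 722   # post-PACT: deterioration events, non-events

or_ = (a * d) / (b * c)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```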

  27. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animation without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60°N-60°S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60°N-60°S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of these data, beginning in February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, mesoscale convection systems, etc. Basic functions include selection of area of

  28. Analysis of Security Protocols by Annotations

    DEFF Research Database (Denmark)

    Gao, Han

The trend in Information Technology is that distributed systems and networks are becoming increasingly important, as most of the services and opportunities that characterise the modern society are based on these technologies. Communication among agents over networks has therefore acquired a great deal of research interest. In order to provide effective and reliable means of communication, more and more communication protocols are invented, and for most of them, security is a significant goal. It has long been a challenge to determine conclusively whether a given protocol is secure or not. The development of formal techniques, e.g. control flow analyses, that can check various security properties, is an important tool to meet this challenge. This dissertation contributes to the development of such techniques. In this dissertation, security protocols are modelled in the process calculus LySa...

  29. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  30. Microgrid Analysis Tools Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Antonio [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Haase, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mathur, Shivani [Formerly NREL

    2018-03-05

The over-arching goal of the Alaska Microgrid Partnership is to reduce the use of total imported fuel into communities to secure all energy services by at least 50% in Alaska's remote microgrids without increasing system life cycle costs while also improving overall system reliability, security, and resilience. One goal of the Alaska Microgrid Partnership is to investigate whether a combination of energy efficiency and high-contribution (from renewable energy) power systems can reduce total imported energy usage by 50% while reducing life cycle costs and improving reliability and resiliency. This presentation provides an overview of four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) Tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is from the respective tool websites, tool developers, and author experience.

  31. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, “Suite of open-source applications and models for advanced synchrophasor analysis”) and it is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
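
A toy version of the mode-identification step: fit a damped sinusoid to a synthetic ringdown and report its frequency and damping ratio. Production tools work from PMU measurements with more robust estimators (e.g. Prony or matrix-pencil methods); this is only a sketch.

```python
# Fit a damped sinusoid to a synthetic ringdown to recover an oscillation
# mode's frequency and damping. Data and initial guesses are invented.
import numpy as np
from scipy.optimize import curve_fit

def mode(t, amp, sigma, freq, phase):
    return amp * np.exp(sigma * t) * np.cos(2 * np.pi * freq * t + phase)

t = np.linspace(0, 10, 1000)
clean = mode(t, 1.0, -0.25, 0.7, 0.3)   # 0.7 Hz inter-area-like mode
noisy = clean + np.random.default_rng(1).normal(0, 0.02, t.size)

(amp, sigma, freq, phase), _ = curve_fit(mode, t, noisy, p0=[1, -0.1, 0.5, 0])
zeta = -sigma / np.sqrt(sigma**2 + (2 * np.pi * freq)**2)  # damping ratio
print(f"frequency: {freq:.3f} Hz, damping ratio: {zeta:.1%}")
```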

  32. The Alcohol Environment Protocol: A new tool for alcohol policy.

    Science.gov (United States)

    Casswell, Sally; Morojele, Neo; Williams, Petal Petersen; Chaiyasong, Surasak; Gordon, Ross; Gray-Philip, Gaile; Viet Cuong, Pham; MacKintosh, Anne-Marie; Halliday, Sharon; Railton, Renee; Randerson, Steve; Parry, Charles D H

    2018-01-04

To report data on the implementation of alcohol policies regarding availability, marketing and drink driving, along with ratings of enforcement, from two small high-income countries, three high-middle-income countries, and one low-middle-income country. This study uses the Alcohol Environment Protocol, an International Alcohol Control study research tool, which documents the alcohol policy environment through standardised collection of data from administrative sources, observational studies and interviews with key informants, allowing cross-country comparison and tracking of change over time. All countries showed adoption, to varying extents, of key effective policy approaches outlined in the World Health Organization Global Strategy to Reduce the Harmful Use of Alcohol (2010). High-income countries were more likely to allocate resources to enforcement. However, where enforcement and implementation were high, policy on availability was fairly liberal. Key informants judged alcohol to be very available in both high- and middle-income countries, reflecting liberal policy in the former and less implementation and enforcement and informal (unlicensed) sale of alcohol in the latter. Marketing was largely unrestricted in all countries, and while drink-driving legislation was in place, it was less well enforced in middle-income countries. In countries with fewer resources, alcohol policies are less effective because of lack of implementation and enforcement and, in the case of marketing, lack of regulation. This has implications for the increase in consumption taking place as a result of the expanding distribution and marketing of commercial alcohol and consequent increases in alcohol-related harm. © 2018 The Authors Drug and Alcohol Review published by John Wiley & Sons Australia, Ltd on behalf of Australasian Professional Society on Alcohol and other Drugs.

  33. Power Tools for Talking: Custom Protocols Enrich Coaching Conversations

    Science.gov (United States)

    Pomerantz, Francesca; Ippolito, Jacy

    2015-01-01

    Discussion-based protocols--an "agreed upon set of discussion or observation rules that guide coach/teacher/student work, discussion, and interactions" (Ippolito & Lieberman, 2012, p. 79)--can help focus and structure productive professional learning discussions. However, while protocols are slowly growing into essential elements of…

  34. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  35. Rethinking Protocol Analysis from a Cultural Perspective.

    Science.gov (United States)

    Smagorinsky, Peter

    2001-01-01

    Outlines a cultural-historical activity theory (CHAT) perspective that accounts for protocol analysis along three key dimensions: the relationship between thinking and speech from a representational standpoint; the social role of speech in research methodology; and the influence of speech on thinking and data collection. (Author/VWL)

  36. Emulation Platform for Cyber Analysis of Wireless Communication Network Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Van Leeuwen, Brian P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldridge, John M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

Wireless networking and mobile communications are increasing around the world and in all sectors of our lives. With increasing use, the density and complexity of the systems increase, with more base stations and advanced protocols to enable higher data throughputs. The security of data transported over wireless networks must also evolve with the advances in technologies enabling more capable wireless networks. However, means for analysis of the effectiveness of security approaches and implementations used on wireless networks are lacking. More specifically, a capability to analyze the lower-layer protocols (i.e., Link and Physical layers) is a major challenge. An analysis approach that incorporates protocol implementations without the need for RF emissions is necessary. In this research paper, several emulation tools and custom extensions that enable an analysis platform to perform cyber security analysis of lower-layer wireless networks are presented. A use case of a published exploit in the 802.11 (i.e., WiFi) protocol family is provided to demonstrate the effectiveness of the described emulation platform.

  37. A security analysis of the 802.11s wireless mesh network routing protocol and its secure routing protocols.

    Science.gov (United States)

    Tan, Whye Kit; Lee, Sang-Gon; Lam, Jun Huy; Yoo, Seong-Moo

    2013-09-02

    Wireless mesh networks (WMNs) can act as a scalable backbone by connecting separate sensor networks and even by connecting WMNs to a wired network. The Hybrid Wireless Mesh Protocol (HWMP) is the default routing protocol for the 802.11s WMN. The routing protocol is one of the most important parts of the network, and it requires protection, especially in the wireless environment. The existing security protocols, such as the Broadcast Integrity Protocol (BIP), Counter with cipher block chaining message authentication code protocol (CCMP), Secure Hybrid Wireless Mesh Protocol (SHWMP), Identity Based Cryptography HWMP (IBC-HWMP), Elliptic Curve Digital Signature Algorithm HWMP (ECDSA-HWMP), and Watchdog-HWMP aim to protect the HWMP frames. In this paper, we have analyzed the vulnerabilities of the HWMP and developed security requirements to protect these identified vulnerabilities. We applied the security requirements to analyze the existing secure schemes for HWMP. The results of our analysis indicate that none of these protocols is able to satisfy all of the security requirements. We also present a quantitative complexity comparison among the protocols and an example of a security scheme for HWMP to demonstrate how the result of our research can be utilized. Our research results thus provide a tool for designing secure schemes for the HWMP.

  38. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  39. Costing 'healthy' food baskets in Australia - a systematic review of food price and affordability monitoring tools, protocols and methods.

    Science.gov (United States)

    Lewis, Meron; Lee, Amanda

    2016-11-01

    To undertake a systematic review to determine similarities and differences in metrics and results between recently and/or currently used tools, protocols and methods for monitoring Australian healthy food prices and affordability. Electronic databases of peer-reviewed literature and online grey literature were systematically searched using the PRISMA approach for articles and reports relating to healthy food and diet price assessment tools, protocols, methods and results that utilised retail pricing. National, state, regional and local areas of Australia from 1995 to 2015. Assessment tools, protocols and methods to measure the price of 'healthy' foods and diets. The search identified fifty-nine discrete surveys of 'healthy' food pricing incorporating six major food pricing tools (those used in multiple areas and time periods) and five minor food pricing tools (those used in a single survey area or time period). Analysis demonstrated methodological differences regarding: included foods; reference households; use of availability and/or quality measures; household income sources; store sampling methods; data collection protocols; analysis methods; and results. 'Healthy' food price assessment methods used in Australia lack comparability across all metrics and most do not fully align with a 'healthy' diet as recommended by the current Australian Dietary Guidelines. None have been applied nationally. Assessment of the price, price differential and affordability of healthy (recommended) and current (unhealthy) diets would provide more robust and meaningful data to inform health and fiscal policy in Australia. The INFORMAS 'optimal' approach provides a potential framework for development of these methods.

  40. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  41. Dynamic federations: storage aggregation using open tools and protocols

    CERN Document Server

Furano, Fabrizio; Brito da Rocha, Ricardo; Devresse, Adrien; Keeble, Oliver; Alvarez Ayllon, Alejandro

    2012-01-01

A number of storage elements now offer standard protocol interfaces like NFS 4.1/pNFS and WebDAV, for access to their data repositories, in line with the standardization effort of the European Middleware Initiative (EMI). Also the LCG FileCatalogue (LFC) can offer such features. Here we report on work that seeks to exploit the federation potential of these protocols and build a system that offers a unique view of the storage and metadata ensemble and the possibility of integration of other compatible resources such as those from cloud providers. The challenge, here undertaken by the providers of dCache and DPM, and pragmatically open to other Grid and Cloud storage solutions, is to build such a system while being able to accommodate name translations from existing catalogues (e.g. LFCs), experiment-based metadata catalogues, or stateless algorithmic name translations, also known as “trivial file catalogues”. Such so-called storage federations of standard protocols-based storage elements give a unique vie...

  42. Toward Synthesis, Analysis, and Certification of Security Protocols

    Science.gov (United States)

    Schumann, Johann

    2004-01-01

... multiple tries with invalid passwords caused the expected error message (too many retries), but let the user nevertheless pass. Finally, security can be compromised by silly implementation bugs or design decisions. In one commercial VPN software package, all calls to the encryption routines were incidentally replaced by stubs, probably during factory testing. The product worked nicely, and the error (an open VPN) would have gone undetected if a team member had not inspected the low-level traffic out of curiosity. Also, the use of secret proprietary encryption routines can backfire, because such algorithms often exhibit weaknesses which can be exploited easily (see e.g., DVD encoding). Summarizing, there is a large number of possibilities to make errors which can compromise the security of a protocol. In today's world, with short time-to-market and the use of security protocols in open and hostile networks for safety-critical applications (e.g., power or air-traffic control), such slips could lead to catastrophic situations. Thus, formal methods and automatic reasoning techniques should not be used just for the formal proof of absence of an attack, but they ought to be used to provide an end-to-end tool-supported framework for security software. With such an approach all required artifacts (code, documentation, test cases), formal analyses, and reliable certification will be generated automatically, given a single, high-level specification. By a combination of program synthesis, formal protocol analysis, certification, and proof-carrying code, this goal is within practical reach, since all the important technologies for such an approach actually exist and only need to be assembled in the right way.

  43. Dynamic federations: storage aggregation using open tools and protocols

    International Nuclear Information System (INIS)

    Furano, Fabrizio; Brito da Rocha, Ricardo; Devresse, Adrien; Keeble, Oliver; Álvarez Ayllón, Alejandro; Fuhrmann, Patrick

    2012-01-01

A number of storage elements now offer standard protocol interfaces like NFS 4.1/pNFS and WebDAV, for access to their data repositories, in line with the standardization effort of the European Middleware Initiative (EMI). Also the LCG FileCatalogue (LFC) can offer such features. Here we report on work that seeks to exploit the federation potential of these protocols and build a system that offers a unique view of the storage and metadata ensemble and the possibility of integration of other compatible resources such as those from cloud providers. The challenge, here undertaken by the providers of dCache and DPM, and pragmatically open to other Grid and Cloud storage solutions, is to build such a system while being able to accommodate name translations from existing catalogues (e.g. LFCs), experiment-based metadata catalogues, or stateless algorithmic name translations, also known as “trivial file catalogues”. Such so-called storage federations of standard protocols-based storage elements give a unique view of their content, thus promoting simplicity in accessing the data they contain and offering new possibilities for resilience and data placement strategies. The goal is to consider HTTP and NFS4.1-based storage elements and metadata catalogues and make them able to cooperate through an architecture that properly feeds the redirection mechanisms that they are based upon, thus giving the functionalities of a “loosely coupled” storage federation. One of the key requirements is to use standard clients (provided by OSes or open source distributions, e.g. Web browsers) to access an already aggregated system; this approach is quite different from aggregating the repositories at the client side through some wrapper API, like for instance GFAL, or by developing new custom clients. Other technical challenges that will determine the success of this initiative include performance, latency and scalability, and the ability to create worldwide storage federations that
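
A minimal sketch of the redirection idea, with placeholder endpoint URLs: probe equivalent HTTP replicas of a file and hand a standard client the first one that answers, as a federation front-end would before issuing an HTTP redirect. Name translation, catalogue integration and performance are out of scope here.

```python
# Probe equivalent HTTP replicas with HEAD requests and return the first
# live copy. Endpoint URLs are placeholders, not real storage elements.
from urllib.request import Request, urlopen
from urllib.error import URLError

REPLICAS = [
    "https://storage-a.example.org/data",   # hypothetical endpoints
    "https://storage-b.example.org/data",
]

def locate(path, timeout=2.0):
    """Return the first replica URL that answers a HEAD request."""
    for base in REPLICAS:
        try:
            req = Request(base + path, method="HEAD")
            if urlopen(req, timeout=timeout).status < 400:
                return base + path      # a front-end would 302 here
        except URLError:
            continue                    # dead replica: try the next one
    raise FileNotFoundError(path)

# print(locate("/lhc/dataset42.root"))  # needs real endpoints to run
```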

  44. Channel CAT: A Tactical Link Analysis Tool

    National Research Council Canada - National Science Library

    Coleman, Michael

    1997-01-01

    .... This thesis produced an analysis tool, the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client-server software...

  45. An Overview of the Object Protocol Model (OPM) and the OPM Data Management Tools.

    Science.gov (United States)

    Chen, I-Min A.; Markowitz, Victor M.

    1995-01-01

    Discussion of database management tools for scientific information focuses on the Object Protocol Model (OPM) and data management tools based on OPM. Topics include the need for new constructs for modeling scientific experiments, modeling object structures and experiments in OPM, queries and updates, and developing scientific database applications…

  46. Physics Analysis Tools Workshop 2007

    CERN Multimedia

Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  47. Bioinspired Security Analysis of Wireless Protocols

    DEFF Research Database (Denmark)

    Petrocchi, Marinella; Spognardi, Angelo; Santi, Paolo

    2016-01-01

... work, this paper investigates the feasibility of adopting fraglets as a model for specifying security protocols and analysing their properties. In particular, we give concrete sample analyses over a secure RFID protocol, showing the evolution of the protocol run as chemical dynamics and simulating an adversary...

  48. Formal analysis of a fair payment protocol

    NARCIS (Netherlands)

    J.G. Cederquist; M.T. Dashti (Mohammad)

    2004-01-01

We formally specify a payment protocol. This protocol is intended for fair exchange of time-sensitive data. Here the μCRL language is used to formalize the protocol. Fair exchange properties are expressed in the regular alternation-free μ-calculus. These properties are then verified

  49. Formal Analysis of a Fair Payment Protocol

    NARCIS (Netherlands)

    Cederquist, J.G.; Dashti, M.T.

    2004-01-01

We formally specify a payment protocol. This protocol is intended for fair exchange of time-sensitive data. Here the μCRL language is used to formalize the protocol. Fair exchange properties are expressed in the regular alternation-free μ-calculus. These properties are then verified using the finite

  50. Formal Analysis of a Fair Payment Protocol

    NARCIS (Netherlands)

    Cederquist, J.G.; Dashti, Muhammad Torabi; Dimitrakos, Theo; Martinelli, Fabio

We formally specify a payment protocol described by Vogt et al. This protocol is intended for fair exchange of time-sensitive data. Here the μCRL language is used to formalize the protocol. Fair exchange properties are expressed in the regular alternation-free μ-calculus. These properties are then

  51. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following grounds: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; and Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  12. Network Analysis Tools: from biological networks to clusters and pathways.

    Science.gov (United States)

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.
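
    As a hedged sketch of such a stepwise workflow (NeAT itself is a web-based suite, so the open-source networkx library stands in here; the edge-list file name, score cutoff and protein identifiers are hypothetical):

```python
# Illustrative network-analysis workflow in the spirit of NeAT, using
# networkx as a stand-in. "string_edges.tsv", the score cutoff and the
# protein names are hypothetical placeholders, not NeAT inputs.
import networkx as nx
from collections import Counter

G = nx.Graph()
with open("string_edges.tsv") as fh:          # tab-separated: protA protB score
    for line in fh:
        a, b, score = line.split()
        if float(score) >= 700:               # keep high-confidence interactions
            G.add_edge(a, b)

# Degree distribution
degrees = Counter(dict(G.degree()).values())
print("degree distribution:", sorted(degrees.items())[:10])

# Network-based clustering (connected components as a crude example)
clusters = list(nx.connected_components(G))
print("number of clusters:", len(clusters))

# Path finding between two proteins of interest (hypothetical names)
if nx.has_path(G, "P53_HUMAN", "MDM2_HUMAN"):
    print(nx.shortest_path(G, "P53_HUMAN", "MDM2_HUMAN"))
```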

  13. Analysis of a security protocol in μCRL

    NARCIS (Netherlands)

    J. Pang

    2002-01-01

    Needham-Schroeder public-key protocol. With the growth and commercialization of the Internet, the security of communication between computers becomes a crucial point. A variety of security protocols based on cryptographic primitives are used to establish secure communication over

  14. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of artificial intelligence to connect human and computer perceptions of how data and scientific techniques should be applied, while processing multiple users' tasks simultaneously. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System reanalysis (CFSR) data and NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer-term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in the development of tools that facilitate scientific advancements and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).

  15. Channel CAT: A Tactical Link Analysis Tool

    Science.gov (United States)

    1997-09-01

    Naval Postgraduate School Master's thesis by Michael Glenn Coleman, September 1997. The thesis presents the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client

  16. Technical Analysis of SSP-21 Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Bromberger, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-06-09

    As part of the California Energy Systems for the Twenty-First Century (CES-21) program, in December 2016 San Diego Gas and Electric (SDG&E) contracted with Lawrence Livermore National Laboratory (LLNL) to perform an independent verification and validation (IV&V) of a white paper describing their Secure SCADA Protocol for the Twenty-First Century (SSP-21) in order to analyze the effectiveness and propriety of cryptographic protocol use within the SSP-21 specification. SSP-21 is designed to use cryptographic protocols to provide (optional) encryption, authentication, and nonrepudiation, among other capabilities. The cryptographic protocols to be used reflect current industry standards; future versions of SSP-21 will use other advanced technologies to provide a subset of security services.

  17. A Survey of Automatic Protocol Reverse Engineering Approaches, Methods, and Tools on the Inputs and Outputs View

    OpenAIRE

    Baraka D. Sija; Young-Hoon Goo; Kyu-Seok Shim; Huru Hasanova; Myung-Sup Kim

    2018-01-01

    A network protocol defines rules that control communications between two or more machines on the Internet, whereas Automatic Protocol Reverse Engineering (APRE) defines the way of extracting the structure of a network protocol without accessing its specifications. Enough knowledge on undocumented protocols is essential for security purposes, network policy implementation, and management of network resources. This paper reviews and analyzes a total of 39 approaches, methods, and tools towards ...

  18. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters through a user-friendly GUI. The second part of the reload safety analysis calculations is handled by CycleKit, a code linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  19. Mean-Field Analysis for the Evaluation of Gossip Protocols

    NARCIS (Netherlands)

    Bakhshi, Rena; Cloth, L.; Fokkink, Wan; Haverkort, Boudewijn R.H.M.

    Gossip protocols are designed to operate in very large, decentralised networks. A node in such a network bases its decision to interact (gossip) with another node on its partial view of the global system. Because of the size of these networks, analysis of gossip protocols is mostly done using

  20. Mean-field analysis for the evaluation of gossip protocols

    NARCIS (Netherlands)

    Bakhshi, Rena; Cloth, L.; Fokkink, Wan; Haverkort, Boudewijn R.H.M.

    2008-01-01

    Gossip protocols are designed to operate in very large, decentralised networks. A node in such a network bases its decision to interact (gossip) with another node on its partial view of the global system. Because of the size of these networks, analysis of gossip protocols is mostly done using
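
    To illustrate what a mean-field analysis can look like in this setting (a textbook push-gossip approximation, not necessarily the exact model of these two papers): the fraction x(t) of informed nodes is approximated by the ODE dx/dt = x(1 - x), which can be integrated in a few lines:

```python
# Euler integration of the classic mean-field ODE for push gossip:
# dx/dt = x * (1 - x), where x is the fraction of informed nodes.
# Illustrative model only; the cited papers may use a different formulation.
def mean_field_gossip(x0=0.01, dt=0.01, steps=2000):
    x = x0
    trajectory = [x]
    for _ in range(steps):
        x += dt * x * (1.0 - x)   # each informed node pushes to a random peer
        trajectory.append(x)
    return trajectory

traj = mean_field_gossip()
print(f"informed fraction after {len(traj) - 1} steps: {traj[-1]:.3f}")
```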

  1. A Calculus for Control Flow Analysis of Security Protocols

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Nielson, Hanne Riis; Nielson, Flemming

    2004-01-01

    The design of a process calculus for analysing security protocols is governed by three factors: how to express the security protocol in a precise and faithful manner, how to accommodate the variety of attack scenarios, and how to utilise the strengths (and limit the weaknesses) of the underlying...... analysis methodology. We pursue an analysis methodology based on control flow analysis in flow logic style and we have previously shown its ability to analyse a variety of security protocols. This paper develops a calculus, LysaNS, that allows for much greater control and clarity in the description...

  2. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  3. Analysis of Security Protocols in Embedded Systems

    DEFF Research Database (Denmark)

    Bruni, Alessandro

    Embedded real-time systems have been adopted in a wide range of safety-critical applications—including automotive, avionics, and train control systems—where the focus has long been on safety (i.e., protecting the external world from the potential damage caused by the system) rather than security (i.e., protecting the system from the external world). With increased connectivity of these systems to external networks the attack surface has grown, and consequently there is a need for securing the system from external attacks. Introducing security protocols in safety critical systems requires careful...... in this direction is to extend saturation-based techniques so that enough state information can be modelled and analysed. Finally, we present a methodology for proving the same security properties in the computational model, by means of typing protocol implementations....

  4. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools aimed at completing a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools included in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of such testing executed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  5. Analysis of Security Protocols for Mobile Healthcare.

    Science.gov (United States)

    Wazid, Mohammad; Zeadally, Sherali; Das, Ashok Kumar; Odelu, Vanga

    2016-11-01

    Mobile Healthcare (mHealth) continues to improve because of significant improvements in, and the decreasing costs of, Information and Communication Technologies (ICTs). mHealth is a medical and public health practice, which is supported by mobile devices (for example, smartphones) and patient monitoring devices (for example, various types of wearable sensors, etc.). An mHealth system enables healthcare experts and professionals to have ubiquitous access to a patient's health data along with providing any ongoing medical treatment at any time, any place, and from any device. It also helps the patient requiring continuous medical monitoring to stay in touch with the appropriate medical staff and healthcare experts remotely. Thus, mHealth has become a major driving force in improving the health of citizens today. First, we discuss the security requirements, issues and threats to the mHealth system. We then present a taxonomy of recently proposed security protocols for the mHealth system based on features supported and possible attacks, computation cost and communication cost. Our detailed taxonomy demonstrates the strengths and weaknesses of recently proposed security protocols for the mHealth system. Finally, we identify some of the challenges in the area of security protocols for mHealth systems that still need to be addressed in the future to enable cost-effective, secure and robust mHealth systems.

  6. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on their fate in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interaction of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning acquired from the SEM image. The image analysis capabilities of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity
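
    As a hedged sketch of the Newton-Raphson balance calculations mentioned above (generic code with made-up heat-capacity coefficients, not the EERC spreadsheet logic):

```python
# Newton-Raphson solve of a simple energy balance f(T) = 0 for an
# adiabatic flame temperature. Coefficients are illustrative placeholders,
# not EERC data.
def f(T, q_release=2.0e6, n=50.0, a=29.0, b=0.005, T0=298.0):
    # Sensible heat of products (cp = a + b*T, J/mol/K) minus heat released (J)
    return n * (a * (T - T0) + 0.5 * b * (T**2 - T0**2)) - q_release

def fprime(T, n=50.0, a=29.0, b=0.005):
    return n * (a + b * T)

T = 1500.0                          # initial guess, in kelvin
for _ in range(50):
    step = f(T) / fprime(T)
    T -= step
    if abs(step) < 1e-6:            # converged
        break
print(f"adiabatic flame temperature ~ {T:.1f} K")
```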

  7. A Survey of Automatic Protocol Reverse Engineering Approaches, Methods, and Tools on the Inputs and Outputs View

    Directory of Open Access Journals (Sweden)

    Baraka D. Sija

    2018-01-01

    A network protocol defines rules that control communications between two or more machines on the Internet, whereas Automatic Protocol Reverse Engineering (APRE) defines the way of extracting the structure of a network protocol without accessing its specifications. Enough knowledge of undocumented protocols is essential for security purposes, network policy implementation, and management of network resources. This paper reviews and analyzes a total of 39 approaches, methods, and tools for Protocol Reverse Engineering (PRE) and classifies them into four divisions: approaches that reverse engineer protocol finite state machines, approaches that reverse engineer protocol formats, approaches that reverse engineer both protocol finite state machines and protocol formats, and approaches that focus directly on neither reverse engineering protocol formats nor protocol finite state machines. The efficiency of all approaches' outputs based on their selected inputs is analyzed in general, along with the appropriate reverse engineering input formats. Additionally, we present a discussion and an extended classification in terms of automated to manual approaches, known and novel categories of reverse engineered protocols, and a literature review of reverse engineered protocols in relation to the seven-layer OSI (Open Systems Interconnection) model.

  8. Sustainability Tools Inventory - Initial Gaps Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite's analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities to achieve sustainability. It contributes to SHC 1.61.4

  9. The Additional Protocol as an important tool for the strengthening of the safeguards system

    International Nuclear Information System (INIS)

    Loosch, Reinhard

    2001-01-01

    Full text: The following main points will be dealt with and underlined by illustrative examples: 0. A preliminary clarification: Contrary to the title's short-hand language, it is, of course, not the Additional Protocols entered into by the Agency and States and other Parties to Safeguards Agreements since 1997, nor the Model Additional Protocol adopted by the Board of Governors and endorsed by the General Conference in 1997, that are, by themselves, an important tool for the strengthening of the Agency's safeguards system. They are, however, the necessary legal prerequisite, as well as a strong political and moral boost, for enabling the Agency to develop and apply additional tools in order to make the international nuclear non-proliferation regime more effective and therefore more reassuring and, at the same time, more efficient and therefore more widely accepted. 1. The importance of the new tools cannot be assessed yet. Hopefully, it will grow quickly and consistently. This will depend primarily on two factors: - The extent to which Additional Protocols are entered into force, and at what speed this is achieved, and the extent to which these Protocols cover all important peaceful nuclear activities and resources, whether these exist in states with comprehensive safeguards agreements or not; - The extent to which the Agency succeeds in merging the new measures with those applicable before into an optimized, integrated toolbox. 2. The first factor tends to increase effectiveness by permitting the collection of safeguards-relevant data provided not only in reports from countries in which such activities are conducted or such resources exist, but also in information coming from other sources, such as publications in, or intelligence made available by, other countries. Cross-checking all those data against each other may, in the best case, reinforce their credibility or, in the worst case, reveal gaps and inconsistencies, but will at any rate, in one way or another, help

  10. Analysis of protection spanning-tree protocol

    Directory of Open Access Journals (Sweden)

    Б.Я. Корнієнко

    2007-01-01

    Extraordinarily rapid IT development causes vulnerabilities and, thereafter, attacks that exploit these vulnerabilities. That is why the invention of new information security systems, as well as the improvement of old ones, must be sped up, after the fact or even in advance. This article concerns the Spanning-Tree Protocol – a vivid example of a case where the cure for one vulnerability creates a dozen new "weak spots".

  11. Assessment tools for the measurement of the self-efficacy of drug users: protocol for a systematic review.

    Science.gov (United States)

    Vasconcelos, Selene Cordeiro; Frazão, Iracema da Silva; Sougey, Everton Botelho; Souza, Sandra Lopes de; Silva, Tatiana de Paula Santana da; Lima, Murilo Duarte da Costa

    2018-03-14

    The abuse of alcohol and other drugs is a worldwide problem, the treatment of which poses a challenge to healthcare workers. This study presents a proposal for a systematic review to analyse the psychometric properties of assessment tools developed to measure the self-efficacy of drug users with regard to resisting the urge to take drugs in high-risk situations. The guiding question was based on PICOS (Population, Intervention, Comparator, Outcome, Setting), and the methods of the review protocol were reported in accordance with the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P). Searches will be performed in the PsycINFO, Cochrane, Pubmed, Web of Science, SCOPUS and CINAHL databases, followed by the use of the 'snowball' strategy. The inclusion criteria for the articles will be (1) assessment tool validation studies; (2) assessment tools developed to measure self-efficacy; (3) quantitative measures; (4) measures designed for use on adults; (5) data from self-reports of the participants; (6) studies involving a description of the psychometric properties of the measures; and (7) studies that explain how the level of self-efficacy is scored. The search, selection and analysis will be performed by two independent reviewers. In cases of a divergence of opinion, a third reviewer will be consulted. The COSMIN checklist will be used for the appraisal of the methodological quality of the assessment tools, and the certainty of the evidence in the articles (risk of bias) will be analysed using the GRADE (Grading of Recommendations Assessment, Development and Evaluation) approach. This protocol does not require ethical approval. However, this protocol is part of the thesis entitled Drug-Taking Confidence Questionnaire for use in Brazil, presented for obtaining a doctorate in neuropsychiatry and behavioural sciences from the Federal University of Pernambuco, and has received approval from the human research ethics committee of the Federal

  12. Protein analysis tools and services at IBIVU

    Directory of Open Access Journals (Sweden)

    Brandt Bernd W.

    2011-06-01

    In recent years, several new tools applicable to protein analysis have been made available on the IBIVU web site. Recently, a number of tools, ranging from multiple sequence alignment construction to domain prediction, have been updated and/or extended with services for programmatic access using SOAP. We provide an overview of these tools and their application.

  13. Protocols for Improvement of Black Pepper (Piper nigrum L.) Utilizing Biotechnological Tools.

    Science.gov (United States)

    Nirmal Babu, K; Divakaran, Minoo; Yamuna, G; Ravindran, P N; Peter, K V

    2016-01-01

    Black pepper, Piper nigrum L., the "King of Spices", is the most widely used spice, growing in the South-Western region of India. The humid tropical evergreen forest bordering the Malabar Coast (the Western Ghats, one of the hot-spot areas of plant biodiversity on Earth) is its center of origin and diversity. However, the crop faces constraints such as rampant fungal and viral diseases and a lack of disease-free planting material; biotechnological tools can be utilized to address these problems, and successful strides have been made. The standardization of micropropagation, somatic embryogenesis, in vitro conservation, protoplast isolation, and genetic transformation protocols is described here. The protocols could be utilized to achieve similar goals in related species of Piper too.

  14. Large-scale hydrological simulations using the soil water assessment tool, protocol development, and application in the danube basin.

    Science.gov (United States)

    Pagliero, Liliana; Bouraoui, Fayçal; Willems, Patrick; Diels, Jan

    2014-01-01

    The Water Framework Directive of the European Union requires member states to achieve good ecological status of all water bodies. A harmonized pan-European assessment of water resources availability and quality, as affected by various management options, is necessary for a successful implementation of European environmental legislation. In this context, we developed a methodology to predict surface water flow at the pan-European scale using available datasets. Among the hydrological models available, the Soil Water Assessment Tool was selected because its characteristics make it suitable for large-scale applications with limited data requirements. This paper presents the results for the Danube pilot basin. The Danube Basin is one of the largest European watersheds, covering approximately 803,000 km² and portions of 14 countries. The modeling data used included land use and management information, a detailed soil parameters map, and high-resolution climate data. The Danube Basin was divided into 4663 subwatersheds of an average size of 179 km². A modeling protocol is proposed to cope with the problems of hydrological regionalization from gauged to ungauged watersheds and of overparameterization and identifiability, which are usually present during calibration. The protocol involves a cluster analysis for the determination of hydrological regions and multiobjective calibration using a combination of manual and automated calibration. The proposed protocol was successfully implemented, with the modeled discharges capturing well the overall hydrological behavior of the basin. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
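
    As a hedged sketch of the cluster-analysis step used for hydrological regionalization (scikit-learn k-means on synthetic catchment descriptors; the features and the number of clusters are illustrative assumptions, not the paper's configuration):

```python
# K-means grouping of subwatersheds into candidate hydrological regions.
# Feature choice and k are illustrative; the paper's protocol derives
# regions from its own descriptor set.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical descriptors per subwatershed: mean elevation (m),
# mean annual precipitation (mm), fraction of forest cover.
X = np.column_stack([
    rng.uniform(50, 2500, 4663),
    rng.uniform(400, 1600, 4663),
    rng.uniform(0.0, 1.0, 4663),
])

X_scaled = StandardScaler().fit_transform(X)   # put features on one scale
regions = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X_scaled)
print("subwatersheds per region:", np.bincount(regions))
```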

  15. Design and analysis of communication protocols for quantum repeater networks

    International Nuclear Information System (INIS)

    Jones, Cody; Kim, Danny; Rakher, Matthew T; Ladd, Thaddeus D; Kwiat, Paul G

    2016-01-01

    We analyze how the performance of a quantum-repeater network depends on the protocol employed to distribute entanglement, and we find that the choice of repeater-to-repeater link protocol has a profound impact on entanglement-distribution rate as a function of hardware parameters. We develop numerical simulations of quantum networks using different protocols, where the repeater hardware is modeled in terms of key performance parameters, such as photon generation rate and collection efficiency. These parameters are motivated by recent experimental demonstrations in quantum dots, trapped ions, and nitrogen-vacancy centers in diamond. We find that a quantum-dot repeater with the newest protocol (‘MidpointSource’) delivers the highest entanglement-distribution rate for typical cases where there is low probability of establishing entanglement per transmission, and in some cases the rate is orders of magnitude higher than other schemes. Our simulation tools can be used to evaluate communication protocols as part of designing a large-scale quantum network. (paper)
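
    As a hedged toy calculation in the spirit of such simulations (a single link with a midpoint source, where both photons must be collected, so the per-attempt success probability scales roughly as the collection efficiency squared; all numbers are illustrative assumptions, not the paper's parameters):

```python
# Toy estimate of entanglement-distribution rate on one repeater link.
# Parameters are illustrative placeholders, not values from the paper.
import random

attempt_rate = 1e6                # entanglement attempts per second
collection_eff = 0.05             # probability of collecting each photon
p_success = collection_eff ** 2   # both photons must be collected

# Analytic expectation: successes per second on the link.
print("expected rate:", attempt_rate * p_success, "pairs/s")

# Monte Carlo check: attempts until first success (geometric distribution).
trials = [next(n for n in range(1, 10**7) if random.random() < p_success)
          for _ in range(1000)]
print("mean attempts to success ~", sum(trials) / len(trials),
      "(analytic:", 1 / p_success, ")")
```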

  16. A Software Tool to Visualize Verbal Protocols to Enhance Strategic and Metacognitive Abilities in Basic Programming

    Directory of Open Access Journals (Sweden)

    Carlos A. Arévalo

    2011-07-01

    Learning to program is difficult for many first-year undergraduate students. Instructional strategies of traditional programming courses tend to focus on syntactic issues and on assigning practice exercises using the presentation-examples-practice formula, showing the verbal and visual explanation of a teacher during the "step by step" process of writing a computer program. Cognitive literature regarding the mental processes involved in programming suggests that the explicit teaching of certain aspects, such as mental models, strategic knowledge and metacognitive abilities, is critical to learning how to write and assemble the pieces of a computer program. Verbal protocols are often used in software engineering as a technique to record the short-term cognitive processes of a user or expert in evaluation or problem-solving scenarios. We argue that verbal protocols can be used as a mechanism to explicitly show the strategic and metacognitive processes of an instructor when writing a program. In this paper we present an information system prototype developed to store and visualize worked examples derived from transcribed verbal protocols during the process of writing introductory-level programs. Empirical data comparing the grades obtained by two groups of novice programming students, analyzed using ANOVA, indicate a statistically significant difference in performance in favor of the group using the tool, even though these results cannot yet be extrapolated to the general population, given the reported limitations of this study.

  17. Affordances of agricultural systems analysis tools

    NARCIS (Netherlands)

    Ditzler, Lenora; Klerkx, Laurens; Chan-Dentoni, Jacqueline; Posthumus, Helena; Krupnik, Timothy J.; Ridaura, Santiago López; Andersson, Jens A.; Baudron, Frédéric; Groot, Jeroen C.J.

    2018-01-01

    The increasingly complex challenges facing agricultural systems require problem-solving processes and systems analysis (SA) tools that engage multiple actors across disciplines. In this article, we employ the theory of affordances to unravel what tools may furnish users, and how those affordances

  18. Economic and Financial Analysis Tools | Energy Analysis | NREL

    Science.gov (United States)

    Use these economic and financial analysis tools. Job and Economic Development Impact (JEDI) Model: use these easy-to-use, spreadsheet-based tools to analyze the economic impacts of constructing and operating power generation and biofuel plants at the

  19. Post-Flight Data Analysis Tool

    Science.gov (United States)

    George, Marina

    2018-01-01

    A software tool that facilitates the retrieval and analysis of post-flight data. This allows our team and other teams to effectively and efficiently analyze and evaluate post-flight data in order to certify commercial providers.

  20. Quick Spacecraft Thermal Analysis Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  1. Protocol Analysis as a Method for Analyzing Conversational Data.

    Science.gov (United States)

    Aleman, Carlos G.; Vangelisti, Anita L.

    Protocol analysis, a technique that uses people's verbal reports about their cognitions as they engage in an assigned task, has been used in a number of applications to provide insight into how people mentally plan, assess, and carry out those assignments. Using a system of networked computers where actors communicate with each other over…

  2. Formal Security Analysis of the MaCAN Protocol

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Sojka, Michal; Nielson, Flemming

    2014-01-01

    analysis identifies two flaws in the original protocol: one creates unavailability concerns during key establishment, and the other allows re-using authenticated signals for different purposes. We propose and analyse a modification that improves its behaviour while fitting the constraints of CAN bus...

  3. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    Science.gov (United States)

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  4. Health-related quality of life in cancer patients at the end of life, translation, validation, and longitudinal analysis of specific tools: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Poirier, Anne-Lise; Kwiatkowski, Fabrice; Commer, Jean-Marie; D'Aillières, Bénédicte; Berger, Virginie; Mercier, Mariette; Bonnetain, Franck

    2012-04-20

    The end of life for cancer patients is the ultimate stage of the disease, and care in this setting is important as it can improve the wellbeing not only of patients, but also of the patients' family and close friends. As it is a matter of profoundly personal concerns, patients' perception of this phase of the disease is difficult to assess and has thus been insufficiently studied. Nonetheless, caregivers are required to provide specific care to help patients and to treat them in order to improve their wellbeing during this period. While tools to assess health-related quality of life (QoL) in cancer patients at the end of life exist in English, to our knowledge, no validated tools are available in French. This randomized multicenter cohort study will be carried out to cross-culturally adapt and validate a French version of the English QUAL-E and the Missoula Vitas Quality Of Life Index (MVQOLI) questionnaires for advanced cancer patients in a palliative setting. A randomized clinical trial component, in addition to the cohort study, is implemented in order to test psychometric hypotheses: the order effect and the improvement of sensitivity to change. The validation procedure will ensure that the psychometric properties are maintained. The main criterion to assess the reliability of the questionnaires will be reproducibility (test-retest method) using intraclass correlation coefficients. It will be necessary to include 372 patients. The sensitivity to change, discriminant capability, as well as convergent validity will also be investigated. If the cross-culturally validated MVQOLI and QUAL-E questionnaires for advanced cancer patients in a palliative setting have satisfactory psychometric properties, this will allow us to assess the specific dimensions of QoL at the end of life. Current Controlled Trials NCT01545921.

  5. The RUBA Watchdog Video Analysis Tool

    DEFF Research Database (Denmark)

    Bahnsen, Chris Holmberg; Madsen, Tanja Kidholm Osmann; Jensen, Morten Bornø

    We have developed a watchdog video analysis tool called RUBA (Road User Behaviour Analysis) to use for processing of traffic video. This report provides an overview of the functions of RUBA and gives a brief introduction into how analyses can be made in RUBA.

  6. IPv4 and IPv6 protocol compatibility options analysis

    Directory of Open Access Journals (Sweden)

    Regina Misevičienė

    2013-09-01

    The popularity of the Internet has led to very rapid growth in the number of IPv4 (Internet Protocol version 4) users. This caused a shortage of IP addresses, so a new version – IPv6 (Internet Protocol version 6) – was created. Currently, two versions of IP are in use, IPv4 and IPv6. Due to large differences in addressing, the IPv4 and IPv6 protocols are incompatible, so it is necessary to find ways to move from IPv4 to IPv6. To facilitate the transition from one version to the other, various mechanisms and strategies have been developed. In this work, a comparative analysis is performed for the dual stack, 6to4 tunnel and NAT64 mechanisms. It reveals the shortcomings of these mechanisms and their application in the selection of implementation decisions.
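
    As a hedged illustration of the dual-stack mechanism compared above (a minimal Python sketch; whether IPV6_V6ONLY can be disabled is platform-dependent, and the port number is arbitrary):

```python
# Minimal dual-stack TCP listener: one IPv6 socket that also accepts
# IPv4 clients as IPv4-mapped addresses (::ffff:a.b.c.d), on platforms
# that allow disabling IPV6_V6ONLY. Port 8080 is an arbitrary example.
import socket

srv = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 0)  # accept v4 too
srv.bind(("::", 8080))      # wildcard address for both protocol families
srv.listen(5)
conn, addr = srv.accept()   # addr[0] is "::ffff:x.x.x.x" for IPv4 peers
print("client:", addr[0])
conn.close()
srv.close()
```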

  7. RCRA groundwater data analysis protocol for the Hanford Site, Washington

    International Nuclear Information System (INIS)

    Chou, C.J.; Jackson, R.L.

    1992-04-01

    The Resource Conservation and Recovery Act of 1976 (RCRA) groundwater monitoring program currently involves site-specific monitoring of 20 facilities on the Hanford Site in southeastern Washington. The RCRA groundwater monitoring program has collected abundant data on groundwater quality. These data are used to assess the impact of a facility on groundwater quality or whether remediation efforts under RCRA corrective action programs are effective. Both evaluations rely on statistical analysis of groundwater monitoring data. The need for information on groundwater quality by regulators and environmental managers makes statistical analysis of monitoring data an important part of RCRA groundwater monitoring programs. The complexity of groundwater monitoring programs and variabilities (spatial, temporal, and analytical) exhibited in groundwater quality variables indicate the need for a data analysis protocol to guide statistical analysis. A data analysis protocol was developed from the perspective of addressing regulatory requirements, data quality, and management information needs. This data analysis protocol contains four elements: data handling methods; graphical evaluation techniques; statistical tests for trend, central tendency, and excursion analysis; and reporting procedures for presenting results to users
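
    As a hedged example of the kind of trend test such a protocol can prescribe (the Mann-Kendall test is a common nonparametric choice for groundwater series, though the Hanford protocol itself is not quoted here; this implementation ignores tied values, and the data are made up):

```python
# Mann-Kendall trend test (no correction for ties) on a groundwater
# quality series. Data values are illustrative, not Hanford measurements.
import math

def mann_kendall(x):
    n = len(x)
    s = sum((xj > xi) - (xj < xi)             # sign of each pairwise difference
            for i, xi in enumerate(x)
            for xj in x[i + 1:])
    var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance without tie correction
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z                               # |z| > 1.96 ~ trend at 5% level

series = [4.1, 4.3, 4.2, 4.6, 4.8, 4.7, 5.0, 5.1, 5.3, 5.2]
s, z = mann_kendall(series)
print(f"S = {s}, Z = {z:.2f}")
```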

  8. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the paediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. The reliability and validity of APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between computerized and manual (traditional) methods. The development of these tools contributes to filling existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.

  9. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, the merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration for objects near a cube.

  10. Protocol design and analysis for cooperative wireless networks

    CERN Document Server

    Song, Wei; Jin, A-Long

    2017-01-01

    This book focuses on the design and analysis of protocols for cooperative wireless networks, especially at the medium access control (MAC) layer and for crosslayer design between the MAC layer and the physical layer. It highlights two main points that are often neglected in other books: energy-efficiency and spatial random distribution of wireless devices. Effective methods in stochastic geometry for the design and analysis of wireless networks are also explored. After providing a comprehensive review of existing studies in the literature, the authors point out the challenges that are worth further investigation. Then, they introduce several novel solutions for cooperative wireless network protocols that reduce energy consumption and address spatial random distribution of wireless nodes. For each solution, the book offers a clear system model and problem formulation, details of the proposed cooperative schemes, comprehensive performance analysis, and extensive numerical and simulation results that validate th...

  11. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe...... a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest...

  12. A new aid to decision-making tool for the elaboration of treatment protocols by the DTPA after dose calculations

    International Nuclear Information System (INIS)

    Fritsch, P.; Poncy, J.L.; Berard, P.; Grappin, L.; Blanchin, N.; Breustedt, B.; Blanchardon, E.

    2010-01-01

    The authors summarize the presentation of a new tool aimed at assessing the reduction of doses associated with different therapeutic protocols after an internal contamination by plutonium (by inhalation or injury). This tool couples a dissolution model with a model describing the evolution of diethylenetriamine pentaacetate (DTPA) and of Pu-DTPA compounds. Some recent experimental data are used for validation purposes

  13. [Professional divers: analysis of critical issues and proposal of a health protocol for work fitness].

    Science.gov (United States)

    Pedata, Paola; Corvino, Anna Rita; Napolitano, Raffaele Carmine; Garzillo, Elpidio Maria; Furfaro, Ciro; Lamberti, Monica

    2016-01-20

    For many years now, thanks to the development of modern diving techniques, there has been a rapid spread of diving activities everywhere. In fact, divers are ever more numerous, both among the Armed Forces and among civilians who dive for work, in fields such as fishing, biological research and archaeology. The aim of the study was to propose a health protocol for the work fitness of professional divers, keeping in mind the peculiarities of this work activity, the existing Italian legislation, which is almost out of date, and the technical and scientific evolution in this occupational field. We performed an analysis of the diseases occurring most frequently among professional divers and of the clinical investigation and imaging techniques used for the work fitness assessment of professional divers. From an analysis of the health protocol recommended by D.M. 13 January 1979 (Ministerial Decree), the one most used by occupational health physicians, several critical issues emerged. Very often the clinical investigation and imaging techniques still used are almost obsolete, neglecting the execution of simple and inexpensive investigations that are more useful for work fitness assessment. Considering the outdated legislation concerning diving disciplines, it is necessary to draw up a common health protocol that takes into account the clinical and scientific knowledge and skills acquired in this area. This protocol's aim is to propose a useful tool for occupational health physicians who work in this sector.

  14. Analysis of security protocols based on challenge-response

    Institute of Scientific and Technical Information of China (English)

    LUO JunZhou; YANG Ming

    2007-01-01

    Security protocol is specified as the procedure of challenge-response, which uses applied cryptography to confirm the existence of other principals and fulfill some data negotiation such as session keys. Most of the existing analysis methods, which adopt either theorem-proving techniques such as state exploration or logic-reasoning techniques such as authentication logic, face a conflict between analysis power and operability. To solve the problem, a new efficient method is proposed that provides SSM semantics-based definitions of secrecy and authentication goals and applies authentication logic as the fundamental analysis technique, in which secrecy analysis is split into two parts: Explicit-Information-Leakage and Implicit-Information-Leakage, and correspondence analysis is concluded as the analysis of the existence relationship of Strands and the agreement of Strand parameters. This new method combines the power of the Strand Space Model with the concision of authentication logic.

  15. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  16. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.

    1993-05-01

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties

  17. A markerless protocol for genetic analysis of Aggregatibacter actinomycetemcomitans

    Science.gov (United States)

    Cheng, Ya-An; Jee, Jason; Hsu, Genie; Huang, Yanyan; Chen, Casey; Lin, Chun-Pin

    2015-01-01

    Background/Purpose: The genomes of different Aggregatibacter actinomycetemcomitans strains contain many strain-specific genes and genomic islands (defined as DNA found in some but not all strains) of unknown functions. Genetic analysis of the functions of these islands will be constrained by the limited availability of genetic markers and vectors for A. actinomycetemcomitans. In this study we tested a novel genetic approach of gene deletion and restoration in a naturally competent A. actinomycetemcomitans strain, D7S-1. Methods: Deletion mutants of specific genes, and mutants restored with the deleted genes, were constructed with a markerless loxP/Cre system. In mutants carrying sequential deletions of multiple genes, loxP sites with different spacer regions were used to avoid unwanted recombination between loxP sites. Results: Eight single-gene deletion mutants, four multiple-gene deletion mutants, and two mutants with restored genes were constructed. No unintended non-specific deletion mutants were generated by this protocol. The protocol did not negatively affect the growth and biofilm formation of A. actinomycetemcomitans. Conclusion: The protocol described in this study is efficient and specific for the genetic manipulation of A. actinomycetemcomitans, and will be amenable for the functional analysis of multiple genes in A. actinomycetemcomitans. PMID:24530245

  18. Paramedir: A Tool for Programmable Performance Analysis

    Science.gov (United States)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  19. The CANDU alarm analysis tool (CAAT)

    Energy Technology Data Exchange (ETDEWEB)

    Davey, E C; Feher, M P; Lupton, L R [Control Centre Technology Branch, ON (Canada)

    1997-09-01

    AECL undertook the development of a software tool to assist alarm system designers and maintainers based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs.

  20. The CANDU alarm analysis tool (CAAT)

    International Nuclear Information System (INIS)

    Davey, E.C.; Feher, M.P.; Lupton, L.R.

    1997-01-01

    AECL undertook the development of a software tool to assist alarm system designers and maintainers based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs

  1. Hippocampal unified multi-atlas network (HUMAN): protocol and scale validation of a novel segmentation tool.

    Science.gov (United States)

    Amoroso, N; Errico, R; Bruno, S; Chincarini, A; Garuccio, E; Sensi, F; Tangaro, S; Tateo, A; Bellotti, R

    2015-11-21

    In this study we present a novel fully automated Hippocampal Unified Multi-Atlas-Networks (HUMAN) algorithm for the segmentation of the hippocampus in structural magnetic resonance imaging. In multi-atlas approaches atlas selection is of crucial importance for the accuracy of the segmentation. Here we present an optimized method based on the definition of a small peri-hippocampal region to target the atlas learning with linear and non-linear embedded manifolds. All atlases were co-registered to a data driven template resulting in a computationally efficient method that requires only one test registration. The optimal atlases identified were used to train dedicated artificial neural networks whose labels were then propagated and fused to obtain the final segmentation. To quantify data heterogeneity and protocol inherent effects, HUMAN was tested on two independent data sets provided by the Alzheimer's Disease Neuroimaging Initiative and the Open Access Series of Imaging Studies. HUMAN is accurate and achieves state-of-the-art performance (Dice[Formula: see text] and Dice[Formula: see text]). It is also a robust method that remains stable when applied to the whole hippocampus or to sub-regions (patches). HUMAN also compares favorably with a basic multi-atlas approach and a benchmark segmentation tool such as FreeSurfer.
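
    As a hedged illustration of the label-fusion step described above (plain majority voting over propagated atlas labels; HUMAN itself fuses the outputs of trained neural networks, and the arrays below are synthetic):

```python
# Majority-vote fusion of binary hippocampus labels propagated from
# several atlases. Synthetic data; HUMAN's actual fusion uses neural
# network outputs rather than a plain vote.
import numpy as np

rng = np.random.default_rng(1)
n_atlases, shape = 7, (4, 4, 4)          # tiny toy volume
labels = rng.integers(0, 2, size=(n_atlases, *shape))  # one mask per atlas

votes = labels.sum(axis=0)               # how many atlases say "hippocampus"
fused = (votes > n_atlases // 2).astype(np.uint8)
print("fused voxels labelled hippocampus:", int(fused.sum()))
```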

  2. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.

  3. Design and Analysis of an Enhanced Patient-Server Mutual Authentication Protocol for Telecare Medical Information System.

    Science.gov (United States)

    Amin, Ruhul; Islam, S K Hafizul; Biswas, G P; Khan, Muhammad Khurram; Obaidat, Mohammad S

    2015-11-01

    In order to access a remote medical server, patients generally use a smart card to log in to the server. It has been observed that most user (patient) authentication protocols suffer from the stolen-smart-card attack, which means that an attacker can mount several common attacks after extracting the smart card's information. Recently, Lu et al. proposed a session key agreement protocol between the patient and the remote medical server and claimed that the protocol is secure against the relevant security attacks. However, this paper presents several security attacks on Lu et al.'s protocol, namely an identity trace attack, a new-smart-card-issue attack, a patient impersonation attack and a medical server impersonation attack. In order to fix the mentioned security pitfalls, including the stolen-smart-card attack, this paper proposes an efficient remote mutual authentication protocol using a smart card. We have simulated the proposed protocol using the widely accepted AVISPA simulation tool, whose results confirm that the protocol is secure against active and passive attacks, including replay and man-in-the-middle attacks. Moreover, rigorous security analysis proves that the proposed protocol provides strong protection against the relevant security attacks, including the stolen-smart-card attack. We compare the proposed scheme with several related schemes in terms of computation cost and communication cost, as well as security functionality, and find that it is comparatively better than the related existing schemes.
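
    The flavour of such smart-card-based mutual authentication can be sketched as a generic HMAC challenge-response exchange. This is a minimal illustration, not the protocol proposed in the paper: the shared key K is assumed to be pre-established, whereas in the actual scheme it would be derived from smart card secrets and the patient's credentials.

        import hmac, hashlib, secrets

        K = secrets.token_bytes(32)   # shared secret, assumed pre-established

        def tag(*parts):
            """HMAC-SHA256 over the concatenated protocol fields."""
            return hmac.new(K, b"|".join(parts), hashlib.sha256).digest()

        # Both sides exchange fresh nonces instead of static identities,
        # which is what blocks identity tracing and replay.
        n_patient = secrets.token_bytes(16)
        n_server = secrets.token_bytes(16)

        # Server proves knowledge of K over both nonces; patient verifies.
        server_proof = tag(b"server", n_patient, n_server)
        assert hmac.compare_digest(server_proof, tag(b"server", n_patient, n_server))

        # Patient answers with its own proof; server verifies symmetrically.
        patient_proof = tag(b"patient", n_server, n_patient)
        assert hmac.compare_digest(patient_proof, tag(b"patient", n_server, n_patient))

        # Both ends can now derive the same session key.
        session_key = tag(b"session", n_patient, n_server)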

  4. Implementation and Analysis of Real-Time Streaming Protocols.

    Science.gov (United States)

    Santos-González, Iván; Rivero-García, Alexandra; Molina-Gil, Jezabel; Caballero-Gil, Pino

    2017-04-12

    Communication media have become the primary way of interaction thanks to the discovery and innovation of many new technologies. One of the most widely used communication systems today is video streaming, which is constantly evolving. Such communications are a good alternative to face-to-face meetings, and are therefore very useful for coping with many problems caused by distance. However, they suffer from issues such as bandwidth limitation, network congestion, energy efficiency, cost, reliability and connectivity. Hence, quality of service and quality of experience are considered the two most important issues for this type of communication. This work presents a complete comparative study of two of the most widely used video streaming protocols: the Real Time Streaming Protocol (RTSP) and Web Real-Time Communication (WebRTC). In addition, this paper proposes two new mobile applications that implement these protocols in Android, with the objective of measuring how they are influenced by the aspects that most affect streaming quality of service: connection establishment time and stream reception time. The new video streaming applications are also compared with the most popular video streaming applications for Android, and the experimental results show that the developed WebRTC implementation outperforms the most popular video streaming applications with respect to stream packet delay.
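
    The first metric measured by the applications, connection establishment time, can be approximated for RTSP with a plain socket exchange such as the sketch below; the server address is a placeholder, and a real client would continue with DESCRIBE/SETUP/PLAY before any media flows.

        import socket, time

        HOST, PORT = "rtsp.example.com", 554   # placeholder server address

        request = (f"OPTIONS rtsp://{HOST}/ RTSP/1.0\r\n"
                   "CSeq: 1\r\n\r\n").encode()

        start = time.perf_counter()
        with socket.create_connection((HOST, PORT), timeout=5) as sock:
            sock.sendall(request)          # first RTSP message of the session
            reply = sock.recv(4096)        # expect e.g. "RTSP/1.0 200 OK"
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f"TCP connect + OPTIONS round trip: {elapsed_ms:.1f} ms")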

  5. Analysis of Intracellular Metabolites from Microorganisms: Quenching and Extraction Protocols.

    Science.gov (United States)

    Pinu, Farhana R; Villas-Boas, Silas G; Aggio, Raphael

    2017-10-23

    Sample preparation is one of the most important steps in metabolome analysis. The challenges of determining the microbial metabolome have been well discussed within the research community, and many improvements have already been achieved in the last decade. The analysis of intracellular metabolites is particularly challenging. Environmental perturbations may considerably affect microbial metabolism, resulting in intracellular metabolites being rapidly degraded or metabolized by enzymatic reactions. Therefore, quenching, the complete arrest of cell metabolism, is a prerequisite for accurate intracellular metabolite analysis. After quenching, metabolites need to be extracted from the intracellular compartment. The choice of the most suitable metabolite extraction method(s) is another crucial step. The literature indicates that specific classes of metabolites are better extracted by different extraction protocols. In this review, we discuss the technical aspects and advancements in quenching and extraction protocols for the analysis of intracellular metabolites from microbial cells.

  6. Analysis of Intracellular Metabolites from Microorganisms: Quenching and Extraction Protocols

    Directory of Open Access Journals (Sweden)

    Farhana R. Pinu

    2017-10-01

    Full Text Available Sample preparation is one of the most important steps in metabolome analysis. The challenges of determining the microbial metabolome have been well discussed within the research community, and many improvements have already been achieved in the last decade. The analysis of intracellular metabolites is particularly challenging. Environmental perturbations may considerably affect microbial metabolism, resulting in intracellular metabolites being rapidly degraded or metabolized by enzymatic reactions. Therefore, quenching, the complete arrest of cell metabolism, is a prerequisite for accurate intracellular metabolite analysis. After quenching, metabolites need to be extracted from the intracellular compartment. The choice of the most suitable metabolite extraction method(s) is another crucial step. The literature indicates that specific classes of metabolites are better extracted by different extraction protocols. In this review, we discuss the technical aspects and advancements in quenching and extraction protocols for the analysis of intracellular metabolites from microbial cells.

  7. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    Full Text Available This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. The findings from this coding process were then combined with principles of text analysis derived from the literature, focusing specifically on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we put forward in the article is for systematic and well-theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  8. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as the plotting section, by inputting data. There are five evaluations: the singular-value robustness test, singular-value loop-transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time-response simulations; a time response for a random white-noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
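
    Two of the listed evaluations, the Bode frequency response and the closed-loop eigenvalues, are easy to reproduce outside MATRIXx. The sketch below uses SciPy on an invented second-order plant with unity negative feedback; it mirrors the kind of analysis described, not the package itself.

        import numpy as np
        from scipy import signal

        # Hypothetical plant G(s) = 10 / (s^2 + 2s + 10), unity negative feedback.
        num, den = [10.0], [1.0, 2.0, 10.0]

        # Bode frequency response of the open-loop plant.
        w, mag_db, phase_deg = signal.bode(signal.TransferFunction(num, den))

        # Closed-loop characteristic polynomial 1 + G(s) = 0 -> den + num;
        # its roots are the closed-loop eigenvalues.
        poles = np.roots(np.polyadd(den, num))
        print("closed-loop eigenvalues:", poles)
        print("stable:", bool(np.all(poles.real < 0)))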

  9. Indicators and measurement tools for health system integration: a knowledge synthesis protocol.

    Science.gov (United States)

    Oelke, Nelly D; Suter, Esther; da Silva Lima, Maria Alice Dias; Van Vliet-Brown, Cheryl

    2015-07-29

    Health system integration is a key component of health system reform, with the goal of improving outcomes for patients, providers, and the health system. Although health systems continue to strive for better integration, current delivery of health services continues to be fragmented. A key gap in the literature is the lack of information on what successful integration looks like and how to measure achievement towards an integrated system. This multi-site study protocol builds on a prior knowledge synthesis completed by two of the primary investigators, which identified 10 key principles that collectively support health system integration. The aim is to answer two research questions: What are appropriate indicators for each of the 10 key integration principles developed in our previous knowledge synthesis, and what measurement tools are used to measure these indicators? To enhance the generalizability of the findings, a partnership between Canada and Brazil was created, as health system integration is a priority in both countries and they share similar contexts. This knowledge synthesis will follow an iterative scoping review process, with emerging information from knowledge-user engagement leading to the refinement of research questions and study selection. This paper describes the methods for each phase of the study. Research questions were developed with stakeholder input. Indicator identification and prioritization will utilize a modified Delphi method and patient/user focus groups. Based on the priority indicators, a search of the literature will be completed and studies screened for inclusion. Quality appraisal of relevant studies will be completed prior to data extraction. Results will be used to develop recommendations and key messages to be presented through integrated and end-of-grant knowledge translation strategies with researchers and knowledge-users from the three jurisdictions. This project will directly benefit policy and decision-makers by providing an easy-to-use set of indicators and measurement tools for health system integration.

  10. Caries risk assessment tool and prevention protocol for public health nurses in mother and child health centers, Israel.

    Science.gov (United States)

    Natapov, Lena; Dekel-Markovich, Dan; Granit-Palmon, Hadas; Aflalo, Efrat; Zusman, Shlomo Paul

    2018-01-01

    Dental caries is the most prevalent chronic disease in children. Caries risk assessment tools enable dentists, physicians, and nondental health care providers to assess an individual's risk. Intervention by nurses in primary care settings can contribute to the establishment of oral health habits and the prevention of dental disease. In Israel, Mother and Child Health Centers provide free preventive services for pregnant women and children through public health nurses. A caries prevention program in these health centers started in 2015. Nurses underwent special training regarding caries prevention, and a customized Caries Risk Assessment Tool and Prevention Protocol for nurses, based on the AAPD tool, was introduced. A two-step evaluation was conducted, which included a questionnaire and in-depth phone interviews. Twenty-eight (out of 46) health centers returned a completed questionnaire. Most nurses believed that oral health preventive services should be incorporated into their daily work. In the in-depth phone interviews, nurses stated that the integration of the program into their busy daily schedule was realistic and appropriate. The lack of a dedicated dental module in the computer system was mentioned as an implementation difficulty. The wide use of our tool by nurses supports its simplicity and feasibility, enabling quick calculation and informed decision making. The nurses readily embraced the tool, and it became an integral part of their toolkit. We provide public health nurses with a caries risk assessment tool and prevention protocol, thus integrating oral health into the general health care of infants and toddlers. © 2017 Wiley Periodicals, Inc.

  11. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    Full Text Available The polishing process is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation and considerable time and energy consumption. Two different research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed depending on the polishing parameters in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotating and translating tool, with dry abrasive on its front face, mounted at the end of a robot. A tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling, aeronautics or automotive industries. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. Smaller amounts of material can be removed in controlled areas of a three-dimensional workpiece.
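
    A common starting point for such material-removal models is Preston's law, which takes the removed volume as proportional to pressure, relative speed and dwell time. The sketch below fits a Preston coefficient to invented trial data by least squares, purely to illustrate the fitting step; it is not the paper's actual model.

        import numpy as np

        # Hypothetical polishing trials: pressure [Pa], relative speed [m/s],
        # dwell time [s], and measured removed volume [mm^3].
        P = np.array([1e4, 2e4, 2e4, 3e4])
        V = np.array([0.5, 0.5, 1.0, 1.0])
        t = np.array([10.0, 10.0, 10.0, 10.0])
        removed = np.array([0.051, 0.098, 0.205, 0.310])

        # Preston's law: removed = k * P * V * t; least-squares estimate of k.
        x = P * V * t
        k = (x @ removed) / (x @ x)
        deviation = np.abs(k * x - removed) / removed
        print(f"Preston coefficient k = {k:.3e}, max deviation = {deviation.max():.1%}")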

  12. NADA Protocol for Behavioral Health. Putting Tools in the Hands of Behavioral Health Providers: The Case for Auricular Detoxification Specialists.

    Science.gov (United States)

    Stuyt, Elizabeth B; Voyles, Claudia A; Bursac, Sara

    2018-02-07

    Background: The National Acupuncture Detoxification Association (NADA) protocol, a simple standardized auricular treatment, has the potential to provide vast public health relief for issues currently challenging our world. These include addiction, such as the opioid epidemic, as well as mental health, trauma, PTSD, chronic stress, and the symptoms associated with these conditions. Simple, accessible tools that improve outcomes can make profound differences. We assert that the NADA protocol can have the greatest impact when broadly applied by behavioral health professionals, Auricular Detoxification Specialists (ADSes). Methods: The ADS concept is described, along with how current laws vary from state to state. Using available national data, a survey of practitioners in three selected states with vastly different laws regarding ADSes, and interviews of publicly funded programs that are successfully incorporating the NADA protocol, we consider the possible effects of ADS-friendly conditions. Results: The data presented support the idea that conditions conducive to ADS practice lead to greater implementation. Program interviews reflect settings in which adding ADSes can in turn lead to improved outcomes. Discussion: The primary purpose of non-acupuncturist ADSes is to expand access to this simple but effective treatment for all who are suffering from addictions, stress, or trauma, and to allow programs to incorporate acupuncture in the form of the NADA protocol at minimal cost, when and where it is needed. States that have changed laws to allow ADS practice for this standardized ear acupuncture protocol have seen increased access to this treatment, benefiting both patients and programs.

  13. BBAT: Bunch and bucket analysis tool

    International Nuclear Information System (INIS)

    Deng, D.P.

    1995-01-01

    BBAT was written to meet the need for an interactive graphical tool to explore the longitudinal phase space. It is designed for quickly testing new ideas or tricks, and is especially suitable for machine physicists and operations staff, both in the control room during machine studies and off-line for data analysis. The heart of the package is a set of C routines that do the number crunching. The graphics layer is built with the Tcl/Tk scripting language and BLT. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter is varied via a sliding scale. BBAT deals with a single-RF system. For a double-RF system, one can use Dr. BBAT, which stands for Double-RF Bunch and Bucket Analysis Tool. One use of Dr. BBAT is to visualize the process of bunch coalescing and flat-bunch creation.
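
    The number crunching behind bunch and bucket plots reduces to the standard longitudinal one-turn map (phase drift followed by an RF energy kick). The sketch below tracks a few particles in a single-RF stationary bucket; the machine parameters are invented and the sign conventions are chosen so that the origin is stable.

        import numpy as np

        # Invented single-RF machine parameters; the signs are chosen so that
        # (phi, delta) = (0, 0) is a stable stationary bucket.
        h_eta = 0.08                # harmonic number times |slip factor|
        kick = 1e-3                 # eV/(beta^2 * E) energy kick per turn

        def track(phi, delta, n_turns=2000):
            """Standard longitudinal one-turn map: phase drift, then RF kick."""
            for _ in range(n_turns):
                phi = phi + 2 * np.pi * h_eta * delta
                delta = delta - kick * np.sin(phi)
            return phi, delta

        # Particles launched on the phase axis inside the bucket oscillate
        # and stay bounded; their delta excursions map out the bucket height.
        phi0 = np.array([0.5, 1.0, 2.0, 3.0])
        phi, delta = track(phi0, np.zeros_like(phi0))
        print(np.round(delta, 4))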

  14. Symbolic Model Checking and Analysis for E-Commerce Protocol

    Institute of Scientific and Technical Information of China (English)

    WEN Jing-Hua; ZHANG Mei; LI Xiang

    2005-01-01

    A new approach is proposed for analyzing the non-repudiation and fairness of e-commerce protocols. The authentication e-mail protocol CMP1 is modeled as a finite state machine and analyzed with the SMV model checker in two vital aspects: non-repudiation and fairness. The analysis shows that the CMP1 protocol is not fair, and we have improved it. This result shows that it is effective to analyze and check these features of e-commerce protocols using the SMV model checker.

  15. SIMONE: Tool for Data Analysis and Simulation

    International Nuclear Information System (INIS)

    Chudoba, V.; Hnatio, B.; Sharov, P.; Papka, Paul

    2013-06-01

    SIMONE is a software tool based on the ROOT Data Analysis Framework, developed in a collaboration between FLNR JINR and iThemba LABS. It is intended for physicists planning experiments and analysing experimental data. The goal of the SIMONE framework is to provide a flexible, user-friendly, efficient and well-documented system for the simulation of a wide range of nuclear physics experiments. The most significant conditions and physical processes can be taken into account during simulation of the experiment. Users can create their own experimental setup from a set of predefined detector geometries. Simulated data are made available in the same format as real experimental data, so that both can be analysed identically. A significant reduction in time is expected during experiment planning and data analysis. (authors)

  16. The comparative cost analysis of EAP Re-authentication Protocol and EAP TLS Protocol

    OpenAIRE

    Seema Mehla; Bhawna Gupta

    2010-01-01

    The Extensible Authentication Protocol (EAP) is a generic framework supporting multiple types of authentication methods. In systems where EAP is used for authentication, it is desirable not to repeat the entire EAP exchange with another authenticator. The EAP Re-authentication Protocol provides consistent, method-independent and low-latency re-authentication. It is an extension to the current EAP mechanism to support intra-domain handoff authentication. This paper analyzes the performance of the EAP Re-authentication Protocol in comparison with the EAP TLS protocol.

  17. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    Full Text Available The purpose of this paper is to propose a methodology for setup analysis that can be implemented mainly in small and medium enterprises that are not yet convinced of the value of setup improvement. The methodology was developed after research that defined the problem. Companies still have difficulties with long setup times, and many of them do nothing to decrease them. A long setup alone is not a sufficient reason for companies to undertake actions towards setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to discover problems. The methodology proposed can genuinely encourage management to take a decision about SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern setup analysis in a chosen area of a company, such as a work stand that is a bottleneck with many setups; the goal there is to convince management to begin actions concerning setup improvement. The last three steps relate to a particular setup, and there the goal is to reduce the setup time and the risk of problems that can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis and FMEA were used.

  18. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    Lee, Byung Do

    2006-01-01

    During the diagnostic process for the various oral and maxillofacial lesions, we should consider the following: When should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make optimal medical decisions? To help clinicians make proper judgements, several decision-making tools have been suggested. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision-making tools, such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the basic concept of the ORAD program.
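
    As a worked example of the two tools, the sketch below converts an assumed sensitivity/specificity pair into a post-test probability with Bayes' theorem, and notes that the same pair is a single operating point on a ROC curve; all numbers are illustrative and not taken from the article.

        def post_test_probability(prevalence, sensitivity, specificity):
            """P(disease | positive test) via Bayes' theorem."""
            tp = sensitivity * prevalence              # true-positive mass
            fp = (1 - specificity) * (1 - prevalence)  # false-positive mass
            return tp / (tp + fp)

        # Illustrative test: sensitivity 0.90, specificity 0.80.
        for prev in (0.05, 0.30, 0.70):
            p = post_test_probability(prev, 0.90, 0.80)
            print(f"pre-test {prev:.0%} -> post-test {p:.0%}")

        # The same pair is one ROC operating point (FPR, TPR) = (0.20, 0.90);
        # sweeping the decision threshold traces out the full ROC curve.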

  19. Authentication Test-Based the RFID Authentication Protocol with Security Analysis

    Directory of Open Access Journals (Sweden)

    Minghui Wang

    2014-08-01

    Full Text Available Many recently proposed RFID authentication protocols have soon been found to have security holes. We analyzed the main reason, which is that protocol design is often not rigorous, so the correctness of the protocol cannot be guaranteed. To this end, the authentication test method was adopted for the formal analysis and strict proof of the RFID protocol proposed in this paper. The authentication test is a new analysis and design method for security protocols based on the strand space model, and it can be used for most types of security protocols. The security analysis shows that the proposed protocol meets the RFID security demands: information confidentiality, data integrity and identity authentication.

  20. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    Science.gov (United States)

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach to gene set enrichment analysis. However, neither the performance of this method nor its implementation as a web-based tool had been assessed. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
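
    The geometric core of PAEA, principal angles between subspaces, can be computed from the SVD of the product of orthonormal bases (SciPy ships the same computation as scipy.linalg.subspace_angles). Below is a small sketch with random matrices standing in for expression data; it illustrates the primitive, not the full PAEA method.

        import numpy as np

        def principal_angles(A, B):
            """Principal angles (radians) between the column spaces of A and B."""
            Qa, _ = np.linalg.qr(A)     # orthonormal basis of span(A)
            Qb, _ = np.linalg.qr(B)     # orthonormal basis of span(B)
            sigma = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
            return np.arccos(np.clip(sigma, -1.0, 1.0))

        rng = np.random.default_rng(0)
        genes, k = 100, 3
        expr = rng.normal(size=(genes, k))   # stand-in "expression" subspace
        # A gene-set subspace sharing one direction with expr.
        gene_set = np.hstack([expr[:, :1], rng.normal(size=(genes, k - 1))])

        angles = principal_angles(expr, gene_set)
        print(np.degrees(angles))   # smallest angle ~0: shared direction found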

  1. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Pluess, Ch.; Montanarini, M.; Bernauer, M.

    1998-01-01

    Full text of publication follows: several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of industry. In order to minimize the effort of such studies, a generic risk analysis for an industrial sector has proved helpful, since standardised procedures can then be derived for efficient performance of the risk investigations. This procedure was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development process of the generic risk analysis involved an intense discussion between industry and the authorities about the methodology of assessment and the criteria of acceptance. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline is presented, where this method was successfully employed. Although this pipeline traverses densely populated areas of Switzerland, the risk problems could be solved with this established communication method without delaying the planning process. (authors)

  2. Simulation modelling as a tool for knowledge mobilisation in health policy settings: a case study protocol.

    Science.gov (United States)

    Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L

    2016-09-21

    Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low-risk and low-cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process and the perceived value of the participatory process.

  3. PROMOTION OF PRODUCTS AND ANALYSIS OF MARKET OF POWER TOOLS

    Directory of Open Access Journals (Sweden)

    Sergey S. Rakhmanov

    2014-01-01

    Full Text Available The article describes the general situation of the power tools market, both in Russia and worldwide. It presents a comparative analysis of competitors, an analysis of the structure of the power tools market, and an assessment of the competitiveness of some major product lines. It also analyses the promotion methods used by companies selling tools, with a competitive analysis of the Bosch range, the leader in its segment of the power tools available on the Russian market.

  4. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized 'intelligent' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium-treated steel; interstitial-free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; and quality assessment of steel cleanliness.

  5. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    Gary Casuccio; Michael Potter; Fred Schwerer; Richard J. Fruehan; Dr. Scott Story

    2005-01-01

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized 'intelligent' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium-treated steel; interstitial-free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; and quality assessment of steel cleanliness.

  6. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years, the number, complexity and size of large-scale networks have increased. The best example of a large-scale network is the Internet, and more recent ones are the data centers in cloud environments. Managing such networks involves several tasks, such as traffic monitoring, security and performance optimization, which together form a big task for the network administrator. This research studies different management protocols, i.e., conventional protocols like the Simple Network Management Protocol and the newer Gossip-based protocols.

  7. Performance analysis of signaling protocols on OBS switches

    Science.gov (United States)

    Kirci, Pinar; Zaim, A. Halim

    2005-10-01

    In this paper, Just-In-Time (JIT), Just-Enough-Time (JET) and Horizon signalling schemes for Optical Burst Switched (OBS) networks are presented. These signalling schemes run over a core dWDM network, and a network architecture based on optical burst switches is proposed to support IP, ATM and burst traffic. In IP and ATM traffic, several packets are assembled into a single unit called a burst, and burst contention is handled by burst dropping. The burst length distribution in IP traffic is arbitrary between 0 and 1, and is fixed in ATM traffic at 0.5; burst traffic, on the other hand, is arbitrary between 1 and 5. The Setup and Setup-ack length distributions are arbitrary. We apply the Poisson model with rate λ and the self-similar model with Pareto-distributed rate α to identify inter-arrival times in these protocols. We consider a communication between a source client node and a destination client node over an ingress switch and one or more intermediate switches, with buffering only in the ingress node. The communication is based on single-burst connections, in which the connection is set up just before sending a burst and then closed as soon as the burst is sent. Our analysis accounts for several important parameters, including the burst setup, burst setup ack, keepalive messages and the optical switching protocol. We compare the performance of the three signalling schemes in terms of burst dropping probability under a range of network scenarios.
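
    The two traffic models compared here differ only in how inter-arrival times are drawn. The sketch below generates both streams with NumPy, using invented rate and shape values chosen so the two streams share the same mean gap, which makes the heavy tail of the self-similar model easy to see.

        import numpy as np

        rng = np.random.default_rng(42)
        n, lam = 10_000, 2.0            # number of gaps, mean arrival rate (invented)

        # Poisson arrivals: exponential inter-arrival times with mean 1/lambda.
        poisson_gaps = rng.exponential(1.0 / lam, size=n)

        # Self-similar traffic: Pareto inter-arrival times with shape alpha; the
        # scale xm is chosen so both streams have the same mean gap 1/lambda.
        alpha = 1.5
        xm = (alpha - 1.0) / (alpha * lam)
        pareto_gaps = xm * (1.0 + rng.pareto(alpha, size=n))

        for name, gaps in (("Poisson", poisson_gaps), ("Pareto ", pareto_gaps)):
            print(f"{name}: mean gap {gaps.mean():.3f}s, "
                  f"99.9th percentile {np.quantile(gaps, 0.999):.2f}s")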

  8. Analysis of limiting information characteristics of quantum-cryptography protocols

    International Nuclear Information System (INIS)

    Sych, D V; Grishanin, Boris A; Zadkov, Viktor N

    2005-01-01

    The problem of increasing the critical error rate of quantum-cryptography protocols by varying the set of letters in a quantum alphabet for a space of fixed dimensionality is studied. Quantum alphabets forming regular polyhedra on the Bloch sphere, and a continuous alphabet that includes all quantum states equally, are considered. It is shown that, in the absence of basis reconciliation, the protocol with the tetrahedral alphabet has the highest critical error rate among the protocols considered, while after basis reconciliation, the protocol with the continuous alphabet possesses the highest critical error rate. (quantum optics and quantum computation)

  9. Development of a physical activity monitoring tool for Thai medical schools: a protocol for a mixed methods study

    Science.gov (United States)

    Wattanapisit, Apichai; Vijitpongjinda, Surasak; Saengow, Udomsak; Amaek, Waluka; Thanamee, Sanhapan; Petchuay, Prachyapan

    2017-01-01

    Introduction Physical activity (PA) is important in promoting health, as well as in the treatment and prevention of diseases. However, insufficient PA is still a global health problem, and it is also a problem in medical schools. PA training in medical curricula is still sparse or non-existent. There is a need for a comprehensive understanding of the extent of PA in medical schools through several indicators, including people, places and policies. This study includes a survey of PA prevalence in a medical school and the development of a tool, the Medical School Physical Activity Report Card (MSPARC), which will contain concise and understandable infographics and information for exploring, monitoring and reporting PA prevalence. Methods and analysis This mixed methods study will run from January to September 2017. It will involve the School of Medicine, Walailak University, Thailand, and its medical students (n=285). Data collection will consist of both primary and secondary data, divided into four parts: general information, people, places and policies. We will investigate PA metrics concerning (1) people: the prevalence of PA and sedentary behaviours; (2) places: the quality and accessibility of walkable neighbourhoods, bicycle facilities and recreational areas; and (3) policies: PA promotion programmes for medical students, education metrics and investments related to PA. The MSPARC will be developed using simple symbols, infographics and short texts to evaluate the PA metrics of the medical school. Ethics and dissemination This study has been approved by the Human Research Ethics Committee of Walailak University (protocol number: WUEC-16-005-01). Findings will be published in peer-reviewed journals and presented at national or international conferences. The MSPARC and the full report will be disseminated to relevant stakeholders, policymakers, staff and clients. PMID:28963299

  10. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today's markets. The authors address traditional machining topics such as single- and multiple-point cutting processes; grinding; component accuracy and metrology; shear stress in cutting; cutting temperature and analysis; and chatter. They also address non-traditional machining, such as electrical discharge machining, electrochemical machining, and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems.

  11. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton [Albuquerque, NM; Phillips, Cynthia A [Albuquerque, NM

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
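
    Finding such high-risk paths reduces to a weighted shortest-path search over the attack graph: the lowest-cost path under an attacker-effort weighting is the one most worth defending. Below is a self-contained Dijkstra sketch on a made-up four-state graph; the states, edges and effort values are all hypothetical.

        import heapq

        # Hypothetical attack graph: state -> [(next_state, attacker_effort), ...]
        graph = {
            "outside":      [("dmz_shell", 3.0), ("phished_user", 1.0)],
            "phished_user": [("dmz_shell", 1.5), ("db_admin", 6.0)],
            "dmz_shell":    [("db_admin", 2.5)],
            "db_admin":     [],
        }

        def cheapest_attack(start, goal):
            """Dijkstra: the lowest-effort path is the highest-risk path."""
            frontier = [(0.0, start, [start])]
            visited = set()
            while frontier:
                cost, node, path = heapq.heappop(frontier)
                if node == goal:
                    return cost, path
                if node in visited:
                    continue
                visited.add(node)
                for nxt, effort in graph[node]:
                    if nxt not in visited:
                        heapq.heappush(frontier, (cost + effort, nxt, path + [nxt]))
            return float("inf"), []

        print(cheapest_attack("outside", "db_admin"))
        # -> (5.0, ['outside', 'phished_user', 'dmz_shell', 'db_admin'])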

  12. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  13. Constructing Benchmark Databases and Protocols for Medical Image Analysis: Diabetic Retinopathy

    Directory of Open Access Journals (Sweden)

    Tomi Kauppi

    2013-01-01

    Full Text Available We address the performance evaluation practices for developing medical image analysis methods, in particular, how to establish and share databases of medical images with verified ground truth and solid evaluation protocols. Such databases support the development of better algorithms, execution of profound method comparisons, and, consequently, technology transfer from research laboratories to clinical practice. For this purpose, we propose a framework consisting of reusable methods and tools for the laborious task of constructing a benchmark database. We provide a software tool for medical image annotation that helps collect class labels, spatial spans, and the experts' confidence on lesions, together with a method to appropriately combine the manual segmentations from multiple experts. The tool and all necessary functionality for method evaluation are provided as public software packages. As a case study, we utilized the framework and tools to establish the DiaRetDB1 V2.1 database for benchmarking diabetic retinopathy detection algorithms. The database contains a set of retinal images, ground truth based on information from multiple experts, and a baseline algorithm for the detection of retinopathy lesions.

  14. NADA Protocol for Behavioral Health. Putting Tools in the Hands of Behavioral Health Providers: The Case for Auricular Detoxification Specialists

    Directory of Open Access Journals (Sweden)

    Elizabeth B Stuyt

    2018-02-01

    Full Text Available Background: The National Acupuncture Detoxification Association (NADA) protocol, a simple standardized auricular treatment, has the potential to provide vast public health relief for issues currently challenging our world. These include addiction, such as the opioid epidemic, as well as mental health, trauma, PTSD, chronic stress, and the symptoms associated with these conditions. Simple, accessible tools that improve outcomes can make profound differences. We assert that the NADA protocol can have the greatest impact when broadly applied by behavioral health professionals, Auricular Detoxification Specialists (ADSes). Methods: The ADS concept is described, along with how current laws vary from state to state. Using available national data, a survey of practitioners in three selected states with vastly different laws regarding ADSes, and interviews of publicly funded programs that are successfully incorporating the NADA protocol, we consider the possible effects of ADS-friendly conditions. Results: The data presented support the idea that conditions conducive to ADS practice lead to greater implementation. Program interviews reflect settings in which adding ADSes can in turn lead to improved outcomes. Discussion: The primary purpose of non-acupuncturist ADSes is to expand access to this simple but effective treatment for all who are suffering from addictions, stress, or trauma, and to allow programs to incorporate acupuncture in the form of the NADA protocol at minimal cost, when and where it is needed. States that have changed laws to allow ADS practice for this standardized ear acupuncture protocol have seen increased access to this treatment, benefiting both patients and programs.

  15. Communicating systems with UML 2 modeling and analysis of network protocols

    CERN Document Server

    Barrera, David Garduno

    2013-01-01

    This book gives a practical approach to modeling and analyzing communication protocols using UML 2. Network protocols are always presented from a point of view focusing on partial mechanisms and starting models. This book aims at giving the basis needed for anybody to model and validate their own protocols. It follows a practical approach and gives many examples for the description and analysis of well-known basic network mechanisms for protocols. The book first shows how to describe and validate the main protocol issues (such as synchronization problems, client-server interactions, and layering).

  16. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

  17. Performance Analysis of TDMA Protocol in a Femtocell Network

    Directory of Open Access Journals (Sweden)

    Wanod Kumar

    2014-07-01

    Full Text Available In this paper, we evaluate the performance of the TDMA (Time Division Multiple Access) protocol in a femtocell network using queuing theory. Fair use of the wireless channel among the users of the network is achieved with the TDMA protocol. The arrivals of data packets from M communicating nodes form a superposition of Poisson processes. The time slots of the TDMA protocol represent c servers that carry data packets from the communicating nodes to the input of the FAP (Femtocell Access Point). The service time of each server (time slot) is exponentially distributed. This communication scenario is modeled as an M/M/c queue. The performance of the protocol is evaluated in terms of the mean number in the system, average system delay and utilization for varying traffic intensity.
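
    The quoted M/M/c metrics follow from the Erlang-C formula. The sketch below computes per-server utilization, mean number in the system and mean delay for invented slot and traffic values; the formulas are the standard queueing results, not figures from the paper.

        from math import factorial

        def mmc_metrics(lam, mu, c):
            """M/M/c queue: (per-server utilization, mean number, mean delay)."""
            a = lam / mu                    # offered load in Erlangs
            rho = a / c                     # per-server utilization, must be < 1
            assert rho < 1, "queue is unstable"
            p0 = 1.0 / (sum(a**k / factorial(k) for k in range(c))
                        + a**c / (factorial(c) * (1 - rho)))
            erlang_c = a**c / (factorial(c) * (1 - rho)) * p0   # P(wait > 0)
            lq = erlang_c * rho / (1 - rho)  # mean queue length
            l = lq + a                       # mean number in system (Little's law)
            return rho, l, l / lam           # delay from Little's law: W = L/lambda

        # Illustrative femtocell: 4 TDMA slots, 3.2 packets/s, 1 packet/s per slot.
        rho, l, w = mmc_metrics(lam=3.2, mu=1.0, c=4)
        print(f"utilization={rho:.2f}, mean in system={l:.2f}, mean delay={w:.2f}s")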

  18. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite's analytic capabilities. These tools address activities that significantly influence resource consumption...

  19. Indicators and measurement tools for health system integration : a knowledge synthesis protocol

    OpenAIRE

    Oelke, Nelly Donszelmann; Suter, Esther; Lima, Maria Alice Dias da Silva; Vliet-Brown, Cheryl Van

    2015-01-01

    Background: Health system integration is a key component of health system reform, with the goal of improving outcomes for patients, providers, and the health system. Although health systems continue to strive for better integration, current delivery of health services continues to be fragmented. A key gap in the literature is the lack of information on what successful integration looks like and how to measure achievement towards an integrated system. This multi-site study protocol builds on a prior knowledge synthesis completed by two of the primary investigators, which identified 10 key principles that collectively support health system integration.

  20. THE SMALL BODY GEOPHYSICAL ANALYSIS TOOL

    Science.gov (United States)

    Bercovici, Benjamin; McMahon, Jay

    2017-10-01

    The Small Body Geophysical Analysis Tool (SBGAT) that we are developing aims at providing scientists and mission designers with a comprehensive, easy to use, open-source analysis tool. SBGAT is meant for seamless generation of valuable simulated data originating from small-body shape models, combined with advanced shape-modification properties. The current status of SBGAT is as follows: The modular software architecture that was specified in the original SBGAT proposal was implemented in the form of two distinct packages: a dynamic library, SBGAT Core, containing the data structure and algorithm backbone of SBGAT, and SBGAT Gui, which wraps the former inside a VTK/Qt user interface to facilitate user/data interaction. This modular development facilitates maintenance and the addition of new features. Note that SBGAT Core can be utilized independently of SBGAT Gui. SBGAT is presently hosted in a public GitHub repository owned by SBGAT's main developer, which can be accessed at https://github.com/bbercovici/SBGAT. Along with the commented code, one can find the code documentation at https://bbercovici.github.io/sbgat-doc/index.html; this documentation is constantly updated to reflect new functionality. SBGAT's user manual is available at https://github.com/bbercovici/SBGAT/wiki and contains a comprehensive tutorial indicating how to retrieve, compile and run SBGAT from scratch. Some of the upcoming development goals are listed hereafter. First, SBGAT's dynamics module will be extended: the PGM algorithm is the only analysis method currently implemented, so future work will broaden SBGAT's capabilities with the spherical harmonics expansion of the gravity field and the calculation of YORP coefficients. Second, synthetic measurements will soon be available within SBGAT; the software should be able to generate synthetic observations of different types (radar, lightcurve, point clouds).

  1. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Chakraborty, S.

    1985-01-01

    From 1983 to 1985 a lecture series entitled 'Risk-benefit analysis' was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting, the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland have contributed reports to this joint volume on the uses of such models. Following an introductory synopsis on risk analysis and risk assessment, the book deals with practical examples in the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers, taking into account the economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP)

  2. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    Science.gov (United States)

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
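
    The genetic-algorithm half of such an approach can be sketched generically: evolve candidate protocol messages under a fitness function that rewards code coverage. In the runnable toy below the coverage signal is faked by a simple function; a real fuzzer would obtain it from an instrumented target, and every constant here is invented.

        import random

        random.seed(1)
        FIELDS = 8                                   # bytes in a toy protocol message

        def coverage(msg):
            """Stand-in fitness: pretend each 'magic' byte unlocks one new branch."""
            return sum(1 for i, b in enumerate(msg) if b == (i * 37) % 256)

        def mutate(msg, rate=0.2):
            return bytes(random.randrange(256) if random.random() < rate else b
                         for b in msg)

        def crossover(a, b):
            cut = random.randrange(1, FIELDS)
            return a[:cut] + b[cut:]

        pop = [bytes(random.randrange(256) for _ in range(FIELDS)) for _ in range(50)]
        for _ in range(200):
            pop.sort(key=coverage, reverse=True)
            elite = pop[:10]                          # keep the best test cases
            pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                           for _ in range(40)]
        pop.sort(key=coverage, reverse=True)
        print("best simulated branch coverage:", coverage(pop[0]), "of", FIELDS)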

  3. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    Directory of Open Access Journals (Sweden)

    Shameng Wen

    Full Text Available Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.

  4. Performance analysis of routing protocols for IoT

    Science.gov (United States)

    Manda, Sridhar; Nalini, N.

    2018-04-01

    The Internet of Things (IoT) is an interdisciplinary collection of technologies used to achieve an effective combination of physical and digital things. With IoT, physical things can have their own virtual identities and participate in distributed computing. Realizing IoT requires the use of sensors appropriate to the sector in which IoT is deployed. For instance, in the healthcare domain, IoT needs to integrate with wearable sensors used by patients. As sensor devices produce huge amounts of data, often called big data, efficient routing protocols must be in place. As far as wireless networks are concerned, there are existing protocols such as OLSR, DSR and AODV. This paper also throws light on the Trust-based Routing Protocol for low-power and lossy networks (TRPL) for IoT. These are widely used wireless routing protocols. As IoT adoption grows, it is essential to investigate routing protocols and evaluate their performance in terms of throughput, end-to-end delay, and routing overhead. These performance insights can help in making well-informed decisions when integrating wireless networks with IoT. In this paper, we analyzed different routing protocols and compared their performance. It was found that AODV showed better performance than the other routing protocols mentioned.

  5. Protocol for sampling and analysis of bone specimens

    International Nuclear Information System (INIS)

    Aras, N.K.

    2000-01-01

    The iliac crest of the hip bone was chosen as the most suitable sampling site for several reasons: local variation in the elemental concentration along the iliac crest is minimal; iliac crest biopsies are commonly taken clinically from patients; the cortical part of the sample is small (∼2 mm) and can be separated easily from the trabecular bone; and the use of the trabecular part of the iliac crest for trace element analysis has the advantage of rapidly reflecting changes in the composition of bone due to external parameters, including medication. Biopsy studies, although in some ways more difficult than autopsy studies because of the need to obtain the informed consent of the subjects, are potentially more useful: many problems of postmortem migration of elements can be avoided, and reliable dietary and other data can be collected simultaneously. Select the subjects among patients undergoing orthopedic surgery for any reason other than osteoporosis. Follow an established protocol to obtain bone biopsies. Patients undergoing surgery should fill in the 'Osteoporosis Project Questionnaire Form', including information on lifestyle variables, dietary intakes, the reason for surgery, etc. If possible, measure the bone mineral density (BMD) prior to removal of the biopsy sample. However, it may not be possible to obtain BMD results for all subjects because of the difficulty of DEXA measurement after an accident.

  6. Performance analysis of simultaneous dense coding protocol under decoherence

    Science.gov (United States)

    Huang, Zhiming; Zhang, Cai; Situ, Haozhen

    2017-09-01

    The simultaneous dense coding (SDC) protocol is useful in designing quantum protocols. We analyze the performance of the SDC protocol under the influence of noisy quantum channels. Six kinds of paradigmatic Markovian noise along with one kind of non-Markovian noise are considered. The joint success probability of both receivers and the success probabilities of one receiver are calculated for three different locking operators. Some interesting properties have been found, such as invariance and symmetry. Among the three locking operators we consider, the SWAP gate is most resistant to noise and results in the same success probabilities for both receivers.
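
    For intuition about this kind of noise analysis, the sketch below pushes ordinary (not simultaneous) dense coding through a single-qubit depolarizing channel and reports Bob's Bell-measurement probabilities. The SDC locking operators and the paper's specific noise models are not reproduced here; the noise strength p = 0.1 is arbitrary.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1, -1]).astype(complex)

def depolarize(rho, p, qubit):
    """Depolarizing channel with strength p on one qubit of a 2-qubit state."""
    out = (1 - p) * rho
    for P in (X, Y, Z):
        K = np.kron(P, I2) if qubit == 0 else np.kron(I2, P)
        out += (p / 3) * K @ rho @ K.conj().T
    return out

phi = np.zeros(4, dtype=complex); phi[0] = phi[3] = 1 / np.sqrt(2)  # |Phi+>
rho0 = np.outer(phi, phi.conj())

# Alice encodes two classical bits with I, X, Z, XZ on her qubit; the channel
# depolarizes that qubit; Bob projects onto the four Bell states.
bell = [np.kron(U, I2) @ phi for U in (I2, X, Z, X @ Z)]
for name, U in zip(("I", "X", "Z", "XZ"), (I2, X, Z, X @ Z)):
    rho = np.kron(U, I2) @ rho0 @ np.kron(U, I2).conj().T
    rho = depolarize(rho, p=0.1, qubit=0)
    probs = [np.real(b.conj() @ rho @ b) for b in bell]
    print(name, np.round(probs, 3))   # success probability 1 - p on the diagonal
```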

  7. An approach to standardization of urine sediment analysis via suggestion of a common manual protocol.

    Science.gov (United States)

    Ko, Dae-Hyun; Ji, Misuk; Kim, Sollip; Cho, Eun-Jung; Lee, Woochang; Yun, Yeo-Min; Chun, Sail; Min, Won-Ki

    2016-01-01

    The results of urine sediment analysis have traditionally been reported semiquantitatively. However, as recent guidelines recommend quantitative reporting of urine sediment, and with the development of automated urine sediment analyzers, there is an increasing need for quantitative analysis of urine sediment. Here, we developed a protocol for urine sediment analysis and quantified the results. Based on questionnaires, various reports, guidelines, and experimental results, we developed a protocol for urine sediment analysis. The results of this new protocol were compared with those obtained with a standardized chamber and an automated sediment analyzer. Reference intervals were also estimated using the new protocol. We developed a protocol with centrifugation at 400 g for 5 min and an average concentration factor of 30. The correlations between the quantitative results of urine sediment analysis, the standardized chamber, and the automated sediment analyzer were generally good. The conversion factor derived from the new protocol showed a better fit with the results of manual counting than the default conversion factor in the automated sediment analyzer. We developed a protocol for manual urine sediment analysis to quantitatively report the results. This protocol may provide a means for standardization of urine sediment analysis.
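
    A back-of-the-envelope version of the quantitative conversion such a protocol relies on is sketched below. The parameter names are assumptions, and the 12 mL / 0.4 mL volumes are only one example consistent with a concentration factor of 30.

```python
def cells_per_microliter(count, fields, field_volume_ul, concentration_factor=30):
    """Convert a manual sediment count to cells/uL of uncentrifuged urine.
    Illustrative only: e.g. 12 mL of urine spun down and resuspended in
    0.4 mL of sediment gives a concentration factor of 12 / 0.4 = 30."""
    per_ul_sediment = count / (fields * field_volume_ul)   # cells/uL of sediment
    return per_ul_sediment / concentration_factor          # cells/uL of urine

# Example: 60 cells over 10 fields of 0.1 uL each, concentrated 30-fold.
print(cells_per_microliter(60, fields=10, field_volume_ul=0.1))  # 2.0 cells/uL
```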

  8. Parallel Enhancements of the General Mission Analysis Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  9. Hypersensitivity and desensitization to antineoplastic agents: outcomes of 189 procedures with a new short protocol and novel diagnostic tools assessment.

    Science.gov (United States)

    Madrigal-Burgaleta, R; Berges-Gimeno, M P; Angel-Pereira, D; Ferreiro-Monteagudo, R; Guillen-Ponce, C; Pueyo, C; Gomez de Salazar, E; Alvarez-Cuesta, E

    2013-07-01

    Desensitization to antineoplastic agents is becoming a standard of care. Efforts to establish and improve these techniques are being made at many institutions. Our aims are to evaluate a new rapid desensitization protocol designed to be shorter (approximately 4 h) and safer (reducing hazardous-drug exposure risks) and to assess oxaliplatin-specific immunoglobulin E (IgE) as a novel diagnostic tool. This was a prospective, observational, longitudinal study of patients who, over a 1-year period, suffered reactions to antineoplastic agents and were referred to the Desensitization Program at Ramon y Cajal University Hospital (RCUH). Patients were included or excluded as desensitization candidates after anamnesis, skin testing, risk assessment, and graded challenge. Specific IgE was determined in oxaliplatin-reactive patients. Candidate patients were desensitized using the new RCUH rapid desensitization protocol. Of 189 intravenous rapid desensitizations, 188 were successfully accomplished in the 23 patients who met inclusion criteria for desensitization (of 58 referred patients). No breakthrough reactions occurred in 94% of desensitizations, and most breakthrough reactions were mild. In 10 oxaliplatin-reactive patients, 38 desensitizations were successfully accomplished. Sensitivity for oxaliplatin-specific IgE was 38% (0.35 UI/l cutoff point) and 54% (0.10 UI/l cutoff point); specificity was 100% for both cutoff points. In the hands of a Desensitization Program managed by drug desensitization experts, this new protocol has proven to be an effective therapeutic tool for hypersensitivity to several antineoplastic agents (oxaliplatin, carboplatin, paclitaxel, docetaxel, cyclophosphamide, and rituximab); moreover, it improves the safe handling of hazardous drugs. We report the first large series of oxaliplatin desensitizations. Oxaliplatin-specific IgE determination could be helpful. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  10. Development and Usability Testing of a Computer-Tailored Decision Support Tool for Lung Cancer Screening: Study Protocol.

    Science.gov (United States)

    Carter-Harris, Lisa; Comer, Robert Skipworth; Goyal, Anurag; Vode, Emilee Christine; Hanna, Nasser; Ceppa, DuyKhanh; Rawl, Susan M

    2017-11-16

    follow-up survey. We expect to have results by December 31, 2017 and to have data analysis completed by March 1, 2018. Results from usability testing indicate that the computer-tailored decision support tool is easy to use, is helpful, and provides a satisfactory experience for the user. At the conclusion of phase III (pilot RCT), we will have preliminary effect sizes to inform a future fully powered RCT on changes in (1) knowledge about lung cancer and screening, (2) perceived risk of lung cancer, (3) perceived benefits of lung cancer screening, (4) perceived barriers to lung cancer screening, (5) self-efficacy for lung cancer screening, and (6) perceptions of being adequately prepared to engage in a discussion with their clinician about lung cancer screening. ©Lisa Carter-Harris, Robert Skipworth Comer, Anurag Goyal, Emilee Christine Vode, Nasser Hanna, DuyKhanh Ceppa, Susan M Rawl. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 16.11.2017.

  11. Tool Wear Monitoring Using Time Series Analysis

    Science.gov (United States)

    Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu

    A tool wear monitoring approach considering the nonlinear behavior of the cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through cutting experiments and actual turning operations. Moreover, the variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the estimated signal obtained from a time series model corresponding to the dynamic model of cutting is introduced as the diagnostic feature. Consequently, it is found that the early tool wear state (i.e. flank wear under 40 µm) can be monitored, and the optimal tool exchange time and the tool wear state in actual turning operations can be judged from the change in this residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8 µm can be estimated by monitoring the residual error.
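
    A minimal sketch of the underlying idea: fit an autoregressive model to a reference vibration signal from a sharp tool, then track the RMS of the one-step prediction residual as the wear feature. The model order and the use of a plain least-squares AR fit are assumptions; the paper's actual model structure and thresholds are not reproduced.

```python
import numpy as np

def ar_fit(signal, order=8):
    """Least-squares fit of an AR(order) model to a 1-D numpy array
    recorded from a sharp tool (the reference cutting state)."""
    N = len(signal)
    A = np.column_stack([signal[order - k - 1 : N - k - 1] for k in range(order)])
    coeffs, *_ = np.linalg.lstsq(A, signal[order:], rcond=None)
    return coeffs

def residual_rms(signal, coeffs):
    """RMS of the one-step prediction error; growth of this residual
    relative to the sharp-tool baseline flags wear or chipping."""
    order, N = len(coeffs), len(signal)
    A = np.column_stack([signal[order - k - 1 : N - k - 1] for k in range(order)])
    return np.sqrt(np.mean((signal[order:] - A @ coeffs) ** 2))

rng = np.random.default_rng(0)
baseline = rng.standard_normal(2048)        # stand-in for a sharp-tool signal
coeffs = ar_fit(baseline)
print(residual_rms(baseline, coeffs))       # baseline level to compare against
```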

  12. Analysis of the LTE Access Reservation Protocol for Real-Time Traffic

    DEFF Research Database (Denmark)

    Thomsen, Henning; Kiilerich Pratas, Nuno; Stefanovic, Cedomir

    2013-01-01

    LTE is increasingly seen as a system for serving real-time Machine-to-Machine (M2M) communication needs. The asynchronous M2M user access in LTE is obtained through a two-phase access reservation protocol (contention and data phase). Existing analysis related to these protocols is based… of the two-phase LTE reservation protocol and assess its performance, when assumptions (1) and (2) do not hold.

  13. Special Section on "Tools and Algorithms for the Construction and Analysis of Systems"

    DEFF Research Database (Denmark)

    2006-01-01

    This special section contains the revised and expanded versions of eight of the papers from the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS) held in March/April 2004 in Barcelona, Spain. The conference proceedings appeared as volume 2988 in the Lecture Notes in Computer Science series published by Springer. TACAS is a forum for researchers, developers and users interested in rigorously based tools for the construction and analysis of systems. The conference serves to bridge the gaps between different communities – including but not limited to those devoted to formal methods, software and hardware verification, static analysis, programming languages, software engineering, real-time systems, and communications protocols – that share common interests in, and techniques for, tool development. Other more theoretical papers from the conference…

  14. Effectiveness of a medication-adherence tool : Study protocol for a randomized controlled trial

    NARCIS (Netherlands)

    Hilbink, Mirrian; Lacroix, Joyca; Bremer-van der Heiden, Linda; van Halteren, Aart; Teichert, Martina; van Lieshout, Jan

    2016-01-01

    Background: Research shows that more than half of the people taking medication for a chronic condition are non-adherent. Nonadherence hinders disease control with a burden on patient quality of life and healthcare systems. We developed a tool that provides insight into nonadherence risks and

  15. Effectiveness of a medication-adherence tool: study protocol for a randomized controlled trial

    NARCIS (Netherlands)

    Hilbink-Smolders, M.; Lacroix, J.; Bremer-van der Heiden, L.; Halteren, A. van; Teichert, M.; Lieshout, J. van

    2016-01-01

    BACKGROUND: Research shows that more than half of the people taking medication for a chronic condition are non-adherent. Nonadherence hinders disease control with a burden on patient quality of life and healthcare systems. We developed a tool that provides insight into nonadherence risks and

  16. Ball Bearing Analysis with the ORBIS Tool

    Science.gov (United States)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life, all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with predictions of bearing internal load distributions, stiffness, deflection and stresses.

  17. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  18. Dynamic Channel Slot Allocation Scheme and Performance Analysis of Cyclic Quorum Multichannel MAC Protocol

    Directory of Open Access Journals (Sweden)

    Xing Hu

    2017-01-01

    Full Text Available In high-diversity node situations, a multichannel MAC protocol can improve frequency efficiency, owing to fewer collisions compared with a single-channel MAC protocol, and the performance of the cyclic quorum-based multichannel (CQM) MAC protocol is outstanding. Based on a cyclic quorum system and channel slot allocation, it can avoid the bottleneck that others suffer from and can be easily realized with only one transceiver. To obtain the accurate performance of the CQM MAC protocol, a Markov chain model, which combines the channel-hopping strategy of the CQM protocol with the IEEE 802.11 distributed coordination function (DCF), is proposed. The results of numerical analysis show that the optimal performance of the CQM protocol is obtained in the saturation bound situation, and we then obtain the saturation bound of the CQM system by the bird swarm algorithm. In addition, to improve the performance of the CQM protocol in unsaturated situations, a dynamic channel slot allocation CQM (DCQM) protocol is proposed, based on a wavelet neural network. Finally, the performance of the CQM and DCQM protocols is simulated on the Qualnet platform. The simulation results show that the analytic and simulation results match very well, and that DCQM performs better in unsaturated situations.
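
    The rendezvous guarantee that CQM builds on comes from cyclic quorum systems: every rotation of a suitable base set intersects every other rotation, so any two nodes share at least one slot. A toy sketch, assuming the classic difference set {0, 1, 3} modulo 7 rather than the paper's parameters:

```python
def cyclic_quorums(base, n):
    """Generate the n quorums of a cyclic quorum system from a base set
    that is a (relaxed) difference set modulo n. Pairwise intersection of
    all rotations is what guarantees a rendezvous slot for channel hopping.
    Example base for n = 7: {0, 1, 3}."""
    return [sorted((a + i) % n for a in base) for i in range(n)]

quorums = cyclic_quorums({0, 1, 3}, 7)
# Every pair of quorums overlaps in at least one slot index:
assert all(set(q1) & set(q2) for q1 in quorums for q2 in quorums)
print(quorums[0], quorums[4])   # e.g. [0, 1, 3] and [4, 5, 0] share slot 0
```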

  19. The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA)

    DEFF Research Database (Denmark)

    Muharemovic, O; Troelsen, A; Thomsen, M G

    2018-01-01

    INTRODUCTION: Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether implementation of personalized RSA patient protocols could increase image quality and decrease examination time...... imaging. Radiographers in the control group used a standard RSA protocol. RESULTS: At three months, radiographers in the case group significantly reduced (p .... No significant improvements were found in the control group at any time point. CONCLUSION: There is strong evidence that personalized RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personal patient protocols as a RSA standard will contribute...

  20. Analysis of the differential-phase-shift-keying protocol in the quantum-key-distribution system

    International Nuclear Information System (INIS)

    Rong-Zhen, Jiao; Chen-Xu, Feng; Hai-Qiang, Ma

    2009-01-01

    The analysis is based on the error rate and the secure communication rate as functions of distance for three quantum-key-distribution (QKD) protocols: the Bennett–Brassard 1984, the Bennett–Brassard–Mermin 1992, and the coherent differential-phase-shift keying (DPSK) protocols. We consider the secure communication rate of the DPSK protocol against an arbitrary individual attack, including the most commonly considered intercept-resend and photon-number-splitting attacks, and conclude that the simple and efficient differential-phase-shift-keying protocol allows for more than 200 km of secure communication distance with high communication rates. (general)
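
    The qualitative shape of rate-versus-distance curves like these is dominated by exponential fiber loss. A rough sketch with illustrative, assumed parameter values (not the paper's numbers, and without the DPSK security analysis itself):

```python
def sifted_rate(distance_km, pulse_rate_hz=1e9, mu=0.2,
                alpha_db_per_km=0.2, detector_eff=0.1, sift=0.5):
    """Back-of-the-envelope sifted-key rate for a fiber QKD link.
    All parameter values here are illustrative assumptions; the point is
    the exponential decay of rate with distance."""
    transmittance = 10 ** (-alpha_db_per_km * distance_km / 10)
    return pulse_rate_hz * mu * transmittance * detector_eff * sift

for d in (50, 100, 200):
    print(f"{d} km: {sifted_rate(d):.3g} bit/s")   # drops ~100x per 100 km
```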

  1. PERFORMANCE ANALYSIS OF DISTINCT SECURED AUTHENTICATION PROTOCOLS USED IN THE RESOURCE CONSTRAINED PLATFORM

    Directory of Open Access Journals (Sweden)

    S. Prasanna

    2014-03-01

    Full Text Available Most e-commerce and m-commerce applications in the current e-business world have adopted asymmetric key cryptography in their authentication protocols to provide efficient authentication of the involved parties. This paper presents a performance analysis of distinct authentication protocols that implement public key cryptography using RSA, ECC and HECC. The comparison is made based on the key generation, signature generation and signature verification processes. The results show that the performance achieved by the HECC-based authentication protocol is better than that of the ECC- and RSA-based authentication protocols.
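
    A comparison of this kind can be reproduced for RSA and ECC with the widely used Python `cryptography` package, as sketched below. HECC is omitted because mainstream libraries do not implement it, and the message and iteration count are arbitrary choices, not the paper's benchmark setup.

```python
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, ec, padding

def bench(label, gen, sign, verify, n=20):
    """Time key generation, signing, and verification for one scheme."""
    t0 = time.perf_counter(); key = gen(); t1 = time.perf_counter()
    msg = b"auth challenge"
    t2 = time.perf_counter()
    for _ in range(n): sig = sign(key, msg)
    t3 = time.perf_counter()
    for _ in range(n): verify(key, sig, msg)
    t4 = time.perf_counter()
    print(f"{label}: keygen {t1-t0:.4f}s, "
          f"sign {(t3-t2)/n:.4f}s, verify {(t4-t3)/n:.4f}s")

bench("RSA-2048",
      lambda: rsa.generate_private_key(public_exponent=65537, key_size=2048),
      lambda k, m: k.sign(m, padding.PKCS1v15(), hashes.SHA256()),
      lambda k, s, m: k.public_key().verify(s, m, padding.PKCS1v15(),
                                            hashes.SHA256()))
bench("ECDSA-P256",
      lambda: ec.generate_private_key(ec.SECP256R1()),
      lambda k, m: k.sign(m, ec.ECDSA(hashes.SHA256())),
      lambda k, s, m: k.public_key().verify(s, m, ec.ECDSA(hashes.SHA256())))
```

    Typically ECDSA wins on key generation and signing while RSA verifies faster, which is why such papers compare all three operations rather than a single number.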

  2. Modelling and Analysis of a Collision Avoidance Protocol using SPIN and UPPAAL

    DEFF Research Database (Denmark)

    Skou, Arne; Larsen, Kim Guldstrand; Jensen, Henrik Ejersbo

    1997-01-01

    This paper compares the tools SPIN and UPPAAL by modelling and verifying a Collision Avoidance Protocol for an Ethernet-like medium. We find that SPIN is well suited for modelling the untimed aspects of the protocol processes and for expressing the relevant (untimed) properties. However, the modelling of the media becomes awkward due to the lack of broadcast communication in the PROMELA language. On the other hand, we find it easy to model the timed aspects using the UPPAAL tool. Especially, the notion of committed locations supports the modelling of broadcast communication. However…

  3. A Web-Based Decision Tool to Improve Contraceptive Counseling for Women With Chronic Medical Conditions: Protocol For a Mixed Methods Implementation Study.

    Science.gov (United States)

    Wu, Justine P; Damschroder, Laura J; Fetters, Michael D; Zikmund-Fisher, Brian J; Crabtree, Benjamin F; Hudson, Shawna V; Ruffin, Mack T; Fucinari, Juliana; Kang, Minji; Taichman, L Susan; Creswell, John W

    2018-04-18

    Women with chronic medical conditions, such as diabetes and hypertension, have a higher risk of pregnancy-related complications compared with women without medical conditions and should be offered contraception if desired. Although evidence based guidelines for contraceptive selection in the presence of medical conditions are available via the United States Medical Eligibility Criteria (US MEC), these guidelines are underutilized. Research also supports the use of decision tools to promote shared decision making between patients and providers during contraceptive counseling. The overall goal of the MiHealth, MiChoice project is to design and implement a theory-driven, Web-based tool that incorporates the US MEC (provider-level intervention) within the vehicle of a contraceptive decision tool for women with chronic medical conditions (patient-level intervention) in community-based primary care settings (practice-level intervention). This will be a 3-phase study that includes a predesign phase, a design phase, and a testing phase in a randomized controlled trial. This study protocol describes phase 1 and aim 1, which is to determine patient-, provider-, and practice-level factors that are relevant to the design and implementation of the contraceptive decision tool. This is a mixed methods implementation study. To customize the delivery of the US MEC in the decision tool, we selected high-priority constructs from the Consolidated Framework for Implementation Research and the Theoretical Domains Framework to drive data collection and analysis at the practice and provider level, respectively. A conceptual model that incorporates constructs from the transtheoretical model and the health beliefs model undergirds patient-level data collection and analysis and will inform customization of the decision tool for this population. We will recruit 6 community-based primary care practices and conduct quantitative surveys and semistructured qualitative interviews with women who

  4. A Formal Analysis of the Web Services Atomic Transaction Protocol with UPPAAL

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2010-01-01

    We present a formal analysis of the Web Services Atomic Transaction (WS-AT) protocol. WS-AT is a part of the WS-Coordination framework and describes an algorithm for reaching agreement on the outcome of a distributed transaction. The protocol is modelled and verified using the model checker UPPAAL...

  5. Design and Analysis of Transport Protocols for Reliable High-Speed Communications

    NARCIS (Netherlands)

    Oláh, A.

    1997-01-01

    The design and analysis of transport protocols for reliable communications constitutes the topic of this dissertation. These transport protocols guarantee the sequenced and complete delivery of user data over networks which may lose, duplicate and reorder packets. Reliable transport services are

  6. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Full Text Available Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message should be aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest towards the use of video as a medium for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show that all of them feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistleblowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools, removing any contents that may have been hidden, and removing any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, which highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and to make them available for the general public. The results presented in this work can also be seen as a useful

  7. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system's … The list of such variables and functional relations constitutes the system's structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid. In a structure graph, this is seen as the disappearance of one or more nodes of the graph. SaTool makes analysis of the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.

  8. A Protocol for the Comprehensive Flow Cytometric Analysis of Immune Cells in Normal and Inflamed Murine Non-Lymphoid Tissues

    Science.gov (United States)

    Yu, Yen-Rei A.; O’Koren, Emily G.; Hotten, Danielle F.; Kan, Matthew J.; Kopin, David; Nelson, Erik R.; Que, Loretta; Gunn, Michael D.

    2016-01-01

    Flow cytometry is used extensively to examine immune cells in non-lymphoid tissues. However, a method of flow cytometric analysis that is both comprehensive and widely applicable has not been described. We developed a protocol for the flow cytometric analysis of non-lymphoid tissues, including methods of tissue preparation, a 10-fluorochrome panel for cell staining, and a standardized gating strategy, that allows the simultaneous identification and quantification of all major immune cell types in a variety of normal and inflamed non-lymphoid tissues. We demonstrate that our basic protocol minimizes cell loss, reliably distinguishes macrophages from dendritic cells (DC), and identifies all major granulocytic and mononuclear phagocytic cell types. This protocol is able to accurately quantify 11 distinct immune cell types, including T cells, B cells, NK cells, neutrophils, eosinophils, inflammatory monocytes, resident monocytes, alveolar macrophages, resident/interstitial macrophages, CD11b- DC, and CD11b+ DC, in normal lung, heart, liver, kidney, intestine, skin, eyes, and mammary gland. We also characterized the expression patterns of several commonly used myeloid and macrophage markers. This basic protocol can be expanded to identify additional cell types such as mast cells, basophils, and plasmacytoid DC, or perform detailed phenotyping of specific cell types. In examining models of primary and metastatic mammary tumors, this protocol allowed the identification of several distinct tumor associated macrophage phenotypes, the appearance of which was highly specific to individual tumor cell lines. This protocol provides a valuable tool to examine immune cell repertoires and follow immune responses in a wide variety of tissues and experimental conditions. PMID:26938654

  9. Mobile Health Technology Interventions for Suicide Prevention: Protocol for a Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Melia, Ruth; Francis, Kady; Duggan, Jim; Bogue, John; O'Sullivan, Mary; Chambers, Derek; Young, Karen

    2018-01-26

    Previous research has reported that two of the major barriers to help-seeking for individuals at risk of suicide are stigma and geographical isolation. Mobile technology offers a potential means of delivering evidence-based interventions with greater specificity to the individual, and at the time that it is needed. Despite documented motivation by at-risk individuals to use mobile technology to track mental health and to support psychological interventions, there is a shortfall of outcomes data on the efficacy of mobile health (mHealth) technology on suicide-specific outcomes. The objective of this study is to develop a protocol for a systematic review and meta-analysis that aims to evaluate the effectiveness of mobile technology-based interventions for suicide prevention. The search includes the Cochrane Central Register of Controlled Trials (CENTRAL: The Cochrane Library), MEDLINE, Embase, PsycINFO, CRESP and relevant sources of gray literature. Studies that have evaluated psychological or nonpsychological interventions delivered via mobile computing and communication technology, and have suicidality as an outcome measure will be included. Two authors will independently extract data and assess the study suitability in accordance with the Cochrane Collaboration Risk of Bias Tool. Studies will be included if they measure at least one suicide outcome variable (ie, suicidal ideation, suicidal intent, nonsuicidal self-injurious behavior, suicidal behavior). Secondary outcomes will be measures of symptoms of depression. Where studies are sufficiently homogenous and reported outcomes are amenable for pooled synthesis, meta-analysis will be performed. A narrative synthesis will be conducted if the data is unsuitable for a meta-analysis. The review is in progress, with findings expected by summer 2018. To date, evaluations of mobile technology-based interventions in suicide prevention have focused on evaluating content as opposed to efficacy. Indeed, previous research has

  10. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    Science.gov (United States)

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  11. A Hybrid Analysis for Security Protocols with State

    Science.gov (United States)

    2014-07-16


  12. Differentiation/Purification Protocol for Retinal Pigment Epithelium from Mouse Induced Pluripotent Stem Cells as a Research Tool.

    Directory of Open Access Journals (Sweden)

    Yuko Iwasaki

    Full Text Available To establish a novel protocol for differentiation of retinal pigment epithelium (RPE with high purity from mouse induced pluripotent stem cells (iPSC.Retinal progenitor cells were differentiated from mouse iPSC, and RPE differentiation was then enhanced by activation of the Wnt signaling pathway, inhibition of the fibroblast growth factor signaling pathway, and inhibition of the Rho-associated, coiled-coil containing protein kinase signaling pathway. Expanded pigmented cells were purified by plate adhesion after Accutase® treatment. Enriched cells were cultured until they developed a cobblestone appearance with cuboidal shape. The characteristics of iPS-RPE were confirmed by gene expression, immunocytochemistry, and electron microscopy. Functions and immunologic features of the iPS-RPE were also evaluated.We obtained iPS-RPE at high purity (approximately 98%. The iPS-RPE showed apical-basal polarity and cellular structure characteristic of RPE. Expression levels of several RPE markers were lower than those of freshly isolated mouse RPE but comparable to those of primary cultured RPE. The iPS-RPE could form tight junctions, phagocytose photoreceptor outer segments, express immune antigens, and suppress lymphocyte proliferation.We successfully developed a differentiation/purification protocol to obtain mouse iPS-RPE. The mouse iPS-RPE can serve as an attractive tool for functional and morphological studies of RPE.

  13. A SERS protocol as a potential tool to access 6-mercaptopurine release accelerated by glutathione-S-transferase.

    Science.gov (United States)

    Wang, Ying; Sun, Jie; Yang, Qingran; Lu, Wenbo; Li, Yan; Dong, Jian; Qian, Weiping

    2015-11-21

    The developed method for monitoring GST, an important drug-metabolizing enzyme, could greatly facilitate research in related biological fields. In this work, we have developed a SERS technique to monitor the adsorption behaviour of 6-mercaptopurine (6-MP) and its glutathione-S-transferase (GST)-accelerated, glutathione (GSH)-triggered release behaviour on the surface of gold nanoflowers (GNFs), using the GNFs as excellent SERS substrates. The SERS signal was used as an indicator of adsorption or release of 6-MP on the gold surface. We found that GST can accelerate the GSH-triggered release of 6-MP from the gold surface. We speculate that GST catalyzes nucleophilic GSH to competitively bind with the electrophilic substance 6-MP. Experimental results have proved that the presented SERS protocol can be utilized as an effective tool for assessing the release of anticancer drugs.

  14. A systematic review protocol: social network analysis of tobacco use.

    Science.gov (United States)

    Maddox, Raglan; Davey, Rachel; Lovett, Ray; van der Sterren, Anke; Corbett, Joan; Cochrane, Tom

    2014-08-08

    Tobacco use is the single most preventable cause of death in the world. Evidence indicates that behaviours such as tobacco use can influence social networks, and that social network structures can influence behaviours. Social network analysis provides a set of analytic tools to undertake methodical analysis of social networks. We will undertake a systematic review to provide a comprehensive synthesis of the literature regarding social network analysis and tobacco use. The review will answer the following research questions: among participants who use tobacco, does social network structure/position influence tobacco use? Does tobacco use influence peer selection? Does peer selection influence tobacco use? We will follow the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines and search the following databases for relevant articles: CINAHL (Cumulative Index to Nursing and Allied Health Literature); Informit Health Collection; PsycINFO; PubMed/MEDLINE; Scopus/Embase; Web of Science; and the Wiley Online Library. Keywords include tobacco, smoking, smokeless, cigarettes, cigar and 'social network', and reference lists of included articles will be hand searched. Studies will be included that provide descriptions of social network analysis of tobacco use. Qualitative, quantitative and mixed method data that meets the inclusion criteria for the review, including methodological rigour, credibility and quality standards, will be synthesized using narrative synthesis. Results will be presented using outcome statistics that address each of the research questions. This systematic review will provide a timely evidence base on the role of social network analysis of tobacco use, forming a basis for future research, policy and practice in this area. This systematic review will synthesise the evidence, supporting the hypothesis that social network structures can influence tobacco use. This will also include exploring the relationship between social

  15. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
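
    The paper derives its FDTD updates in spherical coordinates for the conical antenna; purely as an illustration of the leapfrog update structure such codes share, here is a standard 1-D Cartesian free-space Yee scheme with arbitrary grid parameters.

```python
import numpy as np

# Minimal 1-D Yee FDTD update (Cartesian, free space), showing the
# staggered E/H leapfrog that the paper's spherical-coordinate
# equations generalize. Grid sizes and the source are arbitrary.
c0, nz, nt = 3e8, 400, 1000
dz = 1e-3
dt = dz / (2 * c0)                  # Courant-stable time step
mu0, eps0 = 4e-7 * np.pi, 8.854e-12
ez = np.zeros(nz)
hy = np.zeros(nz - 1)
for n in range(nt):
    hy += dt / (mu0 * dz) * np.diff(ez)          # update H from curl of E
    ez[1:-1] += dt / (eps0 * dz) * np.diff(hy)   # update E from curl of H
    ez[nz // 2] += np.exp(-((n - 30) / 10) ** 2) # soft Gaussian pulse source
```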

  17. SIMMER as a safety analysis tool

    International Nuclear Information System (INIS)

    Smith, L.L.; Bell, C.R.; Bohl, W.R.; Bott, T.F.; Dearing, J.F.; Luck, L.B.

    1982-01-01

    SIMMER has been used for numerous applications in fast reactor safety, encompassing both accident and experiment analysis. Recent analyses of transition-phase behavior in potential core disruptive accidents have integrated SIMMER testing with the accident analysis. Results of both the accident analysis and the verification effort are presented as a comprehensive safety analysis program

  18. 5D Task Analysis Visualization Tool Phase II, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  19. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  20. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov (United States)

    NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) and find cost-competitive solutions, including the ADOPT Vehicle Simulator, used to analyze the performance and fuel economy of conventional and advanced light- and

  1. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    Science.gov (United States)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be
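
    The two core digraph queries FEAT performs (downstream effects of a set of failures, upstream candidate causes of observed failures) can be sketched with simple graph reachability, as below. The miniature model is hypothetical, and plain reachability ignores the AND/OR redundancy semantics that real FEAT digraphs capture, so this overstates effects when components are redundant.

```python
import networkx as nx

# Hypothetical miniature model: an edge u -> v means "failure of u
# propagates to v".
g = nx.DiGraph([("pump_A", "coolant_loop"), ("pump_B", "coolant_loop"),
                ("coolant_loop", "reactor_temp"), ("power_bus", "pump_A"),
                ("power_bus", "pump_B")])

def effects(graph, failures):
    """Everything downstream of a set of failure events."""
    return set().union(*(nx.descendants(graph, f) for f in failures))

def possible_causes(graph, observed):
    """Candidate root causes: everything upstream of observed failures."""
    return set().union(*(nx.ancestors(graph, o) for o in observed))

print(effects(g, {"power_bus"}))            # both pumps, the loop, the temp
print(possible_causes(g, {"coolant_loop"})) # pumps and the power bus
```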

  2. Analysis of logging data from nuclear borehole tools

    International Nuclear Information System (INIS)

    Hovgaard, J.; Oelgaard, P.L.

    1989-12-01

    The processing procedure for logging data from a borehole of the Stenlille project of Dansk Naturgas A/S has been analysed. The tools considered in the analysis were an integral, natural-gamma tool, a neutron porosity tool, a gamma density tool and a caliper tool. It is believed that in most cases the processing procedure used by the logging company in the interpretation of the raw data is fully understood. An exception is the epithermal part of the neutron porosity tool where all data needed for an interpretation were not available. The analysis has shown that some parts of the interpretation procedure may not be consistent with the physical principle of the tools. (author)

  3. Time Error Analysis of SOE System Using Network Time Protocol

    International Nuclear Information System (INIS)

    Keum, Jong Yong; Park, Geun Ok; Park, Heui Youn

    2005-01-01

    To determine the accuracy of time in the fully digitalized SOE (Sequence of Events) system, we used a formal specification of the Network Time Protocol (NTP) Version 3, which is used to synchronize timekeeping among a set of distributed computers. By constructing a simple experimental environment and experimenting with Internet time synchronization, we analyzed the time errors of the local clocks of the SOE system synchronized with a time server via computer networks
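
    The offset and delay that NTP derives from its four timestamps follow a standard formula, shown below with an illustrative worked example (a client clock 25 ms ahead over a symmetric 10 ms round trip); the values are made up for the demonstration.

```python
def ntp_offset_delay(t0, t1, t2, t3):
    """Standard NTPv3 arithmetic: t0/t3 are the client's send and receive
    times, t1/t2 the server's receive and send times (all in seconds).
    Returns (clock offset, round-trip delay)."""
    offset = ((t1 - t0) + (t2 - t3)) / 2
    delay = (t3 - t0) - (t2 - t1)
    return offset, delay

# Client clock runs 25 ms ahead; each path direction takes 5 ms:
print(ntp_offset_delay(100.025, 100.005, 100.006, 100.036))
# -> (-0.025, 0.010): correct the client by -25 ms; RTT is 10 ms.
```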

  4. Nutrition screening tools: an analysis of the evidence.

    Science.gov (United States)

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and four tools (the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST)) received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  5. Structural analysis of ITER sub-assembly tools

    International Nuclear Information System (INIS)

    Nam, K.O.; Park, H.K.; Kim, D.J.; Ahn, H.J.; Lee, J.H.; Kim, K.K.; Im, K.; Shaw, R.

    2011-01-01

    The ITER Tokamak assembly tools are purpose-built tools for completing the ITER Tokamak machine, which includes the cryostat and the components contained therein. The sector sub-assembly tools described in this paper are the main assembly tools used to assemble the vacuum vessel, thermal shield and toroidal field coils into a complete 40° sector. The 40° sector sub-assembly tools comprise the sector sub-assembly tool itself, including the radial beam, the vacuum vessel supports and the mid-plane brace tools. These tools shall have sufficient strength to transport and handle ITER Tokamak components whose weight reaches several hundred tons. Therefore these tools should be designed and analyzed to confirm both strength and structural stability, even under conservative assumptions. To verify the structural stability of the sector sub-assembly tools in terms of strength and deflection, the ANSYS code was used for linear static analysis. The results of the analysis show that these tools are designed with sufficient strength and stiffness. The conceptual designs of these tools are also briefly described in this paper.

  6. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.

  7. The physics analysis tools project for the ATLAS experiment

    International Nuclear Information System (INIS)

    Lenzi, Bruno

    2012-01-01

    The Large Hadron Collider is expected to start colliding proton beams in 2009. The enormous amount of data produced by the ATLAS experiment (≅1 PB per year) will be used in searches for the Higgs boson and physics beyond the Standard Model. In order to meet this challenge, a suite of common Physics Analysis Tools has been developed as part of the Physics Analysis software project. These tools run within the ATLAS software framework, ATHENA, covering a wide range of applications. There are tools responsible for event selection based on analysed data and detector quality information, tools responsible for specific physics analysis operations including data quality monitoring and physics validation, and complete analysis toolkits (frameworks) whose goal is to aid physicists in performing their analyses while hiding the details of the ATHENA framework. (authors)

  8. Analysis of Pervasive Mobile Ad Hoc Routing Protocols

    Science.gov (United States)

    Qadri, Nadia N.; Liotta, Antonio

    Mobile ad hoc networks (MANETs) are a fundamental element of pervasive networks and therefore of pervasive systems that truly support pervasive computing, where users can communicate anywhere, anytime and on-the-fly. In fact, future advances in pervasive computing rely on advancements in mobile communication, which includes both infrastructure-based wireless networks and non-infrastructure-based MANETs. MANETs introduce a new communication paradigm which does not require a fixed infrastructure; they rely on wireless terminals for routing and transport services. Due to the highly dynamic topology, the absence of an established infrastructure for centralized administration, bandwidth-constrained wireless links, and limited resources in MANETs, it is challenging to design an efficient and reliable routing protocol. This chapter reviews the key studies carried out so far on the performance of mobile ad hoc routing protocols. We discuss performance issues and metrics required for the evaluation of ad hoc routing protocols. This leads to a survey of existing work, which captures the performance of ad hoc routing algorithms and their behaviour from different perspectives and highlights avenues for future research.

  9. Graphical Acoustic Liner Design and Analysis Tool

    Science.gov (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
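
    For a single acoustic channel, the one-dimensional transmission line model mentioned here reduces to the standard impedance-transformation formula. A sketch with illustrative values, not the tool's actual implementation:

```python
import numpy as np

def liner_impedance(z_termination, k, length, z0=1.0):
    """Input impedance of one acoustic channel modelled as a 1-D
    transmission line of given length (impedances normalized to rho*c):
    Z_in = Z0 (Z_L + j Z0 tan(kL)) / (Z0 + j Z_L tan(kL)).
    Parameter values below are illustrative assumptions."""
    t = np.tan(k * length)
    return z0 * (z_termination + 1j * z0 * t) / (z0 + 1j * z_termination * t)

# Quarter-wave channel over a rigid backing (Z_L -> infinity):
f, c = 1000.0, 343.0                 # Hz, m/s
k = 2 * np.pi * f / c
print(liner_impedance(1e9, k, length=c / (4 * f)))  # near zero: resonance
```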

  10. Performance Analysis of Untraceability Protocols for Mobile Agents Using an Adaptable Framework

    OpenAIRE

    LESZCZYNA RAFAL; GORSKI Janusz Kazimierz

    2006-01-01

    Recently we proposed two untraceability protocols for mobile agents and began investigating their quality. We believe that quality evaluation of security protocols should extend beyond a sole validation of their security and cover other quality aspects, primarily their efficiency. Thus, after conducting a security analysis, we wanted to complement it with a performance analysis. For this purpose we developed a performance evaluation framework, which, as we realised, with certain adjustments, can ...

  11. Performance Analysis of the Mobile IP Protocol (RFC 3344 and Related RFCS)

    Science.gov (United States)

    2006-12-01

    A type field of 9 identifies the ICMP message as an advertisement, and Mobile IP home agents and foreign agents use the code value of 16 to prevent any nodes… Performance Analysis of the Mobile IP Protocol (RFC 3344 and Related RFCs), by Chin Chin Ng; Master's thesis, December 2006; thesis co-advisors: George W. Dinolt, J. D. …

  12. Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4, Volume IV: Inherent Optical Properties: Instruments, Characterizations, Field Measurements and Data Analysis Protocols

    Science.gov (United States)

    Mueller, J. L.; Fargion, G. S.; McClain, C. R. (Editor); Pegau, S.; Zanefeld, J. R. V.; Mitchell, B. G.; Kahru, M.; Wieland, J.; Stramska, M.

    2003-01-01

    This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background, and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 is entirely superseded by the six volumes of Revision 4 listed above.

  13. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  14. The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA).

    Science.gov (United States)

    Muharemovic, O; Troelsen, A; Thomsen, M G; Kallemose, T; Gosvig, K K

    2018-05-01

    Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether implementation of personalized RSA patient protocols could increase image quality and decrease examination time and the number of exposure repetitions. Forty patients undergoing primary total hip arthroplasty were equally randomized to either a case or a control group. Radiographers in the case group were assisted by personalized patient protocols containing information about each patient's post-operative RSA imaging. Radiographers in the control group used a standard RSA protocol. At three months, radiographers in the case group significantly reduced (p … There is strong evidence that personalized RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personal patient protocols as an RSA standard will contribute to the reduction of examination time, thus ensuring a cost benefit for the department and patient safety. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.

  15. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation (also provided with the ETA Tool software release package) that were used to generate the reports presented in the manual

  16. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.

  17. Analysis of transmission speed of AX.25 Protocol implemented in satellital earth station UPTC

    Directory of Open Access Journals (Sweden)

    Oscar Fernando Vera Cely

    2015-11-01

    Full Text Available One of the important parameters for the proper functioning of the satellite ground station planned at the Pedagogical and Technological University of Colombia (UPTC) is the efficiency of the transmission speed of its communications protocol. This paper shows the results of an analysis of the transmission speed of the AX.25 protocol implemented in the communication system of the UPTC satellite ground station. It begins with a brief description of the implemented hardware; the behavior of the transmission rate is then evaluated through a theoretical analysis based on equations that estimate this parameter during protocol operation; tests are then performed using the hardware of the UPTC ground station; finally, the conclusions are presented. Based on a comparison of the theoretical analysis with the experimentally obtained results, it became apparent that AX.25 protocol efficiency is higher when the number of frames is increased.
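
    As a back-of-the-envelope companion to that conclusion, the following Python sketch models link efficiency from AX.25 frame overhead. The field sizes follow the AX.25 specification, but the fixed three-frame connection handshake is an assumption for illustration, not the paper's equations:

        # AX.25 overhead per I-frame: flags, addresses, control, PID, FCS (bytes).
        FLAG, ADDR, CTRL, PID, FCS = 1, 14, 1, 1, 2
        PER_FRAME_OVERHEAD = FLAG + ADDR + CTRL + PID + FCS + FLAG
        CONTROL_FRAME = FLAG + ADDR + CTRL + FCS + FLAG  # U-frame, no info field

        def efficiency(n_frames, info_bytes=256):
            """Payload fraction for a connected-mode transfer of n_frames I-frames."""
            payload = n_frames * info_bytes
            data = n_frames * (info_bytes + PER_FRAME_OVERHEAD)
            handshake = 3 * CONTROL_FRAME  # SABM + UA + DISC, roughly
            return payload / (data + handshake)

        for n in (1, 5, 20, 100):
            print(f"{n:4d} frames -> efficiency {efficiency(n):.3f}")

    The fixed handshake cost amortizes over more frames, which is one way the reported trend (higher efficiency with more frames) can arise.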

  18. A Web-Based Decision Tool to Improve Contraceptive Counseling for Women With Chronic Medical Conditions: Protocol For a Mixed Methods Implementation Study

    Science.gov (United States)

    Damschroder, Laura J; Fetters, Michael D; Zikmund-Fisher, Brian J; Crabtree, Benjamin F; Hudson, Shawna V; Ruffin IV, Mack T; Fucinari, Juliana; Kang, Minji; Taichman, L Susan; Creswell, John W

    2018-01-01

    Background Women with chronic medical conditions, such as diabetes and hypertension, have a higher risk of pregnancy-related complications compared with women without medical conditions and should be offered contraception if desired. Although evidence-based guidelines for contraceptive selection in the presence of medical conditions are available via the United States Medical Eligibility Criteria (US MEC), these guidelines are underutilized. Research also supports the use of decision tools to promote shared decision making between patients and providers during contraceptive counseling. Objective The overall goal of the MiHealth, MiChoice project is to design and implement a theory-driven, Web-based tool that incorporates the US MEC (provider-level intervention) within the vehicle of a contraceptive decision tool for women with chronic medical conditions (patient-level intervention) in community-based primary care settings (practice-level intervention). This will be a 3-phase study that includes a predesign phase, a design phase, and a testing phase in a randomized controlled trial. This study protocol describes phase 1 and aim 1, which is to determine patient-, provider-, and practice-level factors that are relevant to the design and implementation of the contraceptive decision tool. Methods This is a mixed methods implementation study. To customize the delivery of the US MEC in the decision tool, we selected high-priority constructs from the Consolidated Framework for Implementation Research and the Theoretical Domains Framework to drive data collection and analysis at the practice and provider level, respectively. A conceptual model that incorporates constructs from the transtheoretical model and the health beliefs model undergirds patient-level data collection and analysis and will inform customization of the decision tool for this population. We will recruit 6 community-based primary care practices and conduct quantitative surveys and semistructured qualitative interviews.

  19. A Lexical Analysis Tool with Ambiguity Support

    OpenAIRE

    Quesada, Luis; Berzal, Fernando; Cortijo, Francisco J.

    2012-01-01

    Lexical ambiguities naturally arise in languages. We present Lamb, a lexical analyzer that produces a lexical analysis graph describing all the possible sequences of tokens that can be found within the input string. Parsers can process such lexical analysis graphs and discard any sequence of tokens that does not produce a valid syntactic sentence, therefore performing, together with Lamb, a context-sensitive lexical analysis in lexically-ambiguous language specifications.
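
    The idea of a lexical analysis graph can be conveyed with a toy enumerator of all tokenizations of an ambiguous input (Python; the lexicon and token names are invented, and this is not Lamb's implementation):

        # Enumerate every way a string splits into lexicon tokens, i.e. the
        # paths of a (toy) lexical analysis graph.
        LEXICON = {"for": "KEYWORD", "foreach": "KEYWORD",
                   "fo": "IDENT", "r": "IDENT", "each": "IDENT"}

        def tokenizations(s, start=0):
            if start == len(s):
                yield []
                return
            for end in range(start + 1, len(s) + 1):
                word = s[start:end]
                if word in LEXICON:
                    for rest in tokenizations(s, end):
                        yield [(LEXICON[word], word)] + rest

        for seq in tokenizations("foreach"):
            print(seq)  # three readings: for+each, fo+r+each, foreach

    A parser consuming this graph can then discard the readings that do not yield a valid syntactic sentence, as the abstract describes.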

  20. Performance Analysis of On-Demand Routing Protocols in Wireless Mesh Networks

    Directory of Open Access Journals (Sweden)

    Arafatur RAHMAN

    2009-01-01

    Full Text Available Wireless Mesh Networks (WMNs) have recently gained a lot of popularity due to their rapid deployment and instant communication capabilities. WMNs are dynamically self-organizing, self-configuring and self-healing, with the nodes in the network automatically establishing an ad hoc network and preserving the mesh connectivity. Designing a routing protocol for WMNs requires several aspects to be considered, such as wireless networks, fixed applications, mobile applications, scalability, better performance metrics, efficient routing within the infrastructure, load balancing, throughput enhancement, interference and robustness. To support communication, various routing protocols have been designed for various networks (e.g., ad hoc, sensor and wired networks). However, not all of these protocols are suitable for WMNs, because of the architectural differences among the networks. In this paper, a detailed simulation-based performance study and analysis is performed on the reactive routing protocols to verify their suitability for such networks. The Ad Hoc On-Demand Distance Vector (AODV), Dynamic Source Routing (DSR) and Dynamic MANET On-demand (DYMO) routing protocols are considered as representatives of reactive routing protocols. The performance differentials are investigated using varying traffic load and number of sources. Based on the simulation results, recommendations are also made on how the performance of each protocol can be improved.

  1. Time Analysis: Still an Important Accountability Tool.

    Science.gov (United States)

    Fairchild, Thomas N.; Seeley, Tracey J.

    1994-01-01

    Reviews benefits to school counselors of conducting a time analysis. Describes time analysis system that authors have used, including case illustration of how authors used data to effect counseling program changes. System described followed process outlined by Fairchild: identifying services, devising coding system, keeping records, synthesizing…

  2. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a computer-aided reliability analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst becomes aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  3. Integrating usability testing and think-aloud protocol analysis with "near-live" clinical simulations in evaluating clinical decision support.

    Science.gov (United States)

    Li, Alice C; Kannry, Joseph L; Kushniruk, Andre; Chrimes, Dillon; McGinn, Thomas G; Edonyabo, Daniel; Mann, Devin M

    2012-11-01

    Usability evaluations can improve the usability and workflow integration of clinical decision support (CDS). Traditional usability testing using scripted scenarios with think-aloud protocol analysis provides a useful but incomplete assessment of how new CDS tools interact with users and clinical workflow. "Near-live" clinical simulations are a newer usability evaluation tool that more closely mimics clinical workflow and allows for a complementary evaluation of CDS usability as well as impact on workflow. This study employed two phases of testing a new CDS tool that embedded clinical prediction rules (an evidence-based medicine tool) into primary care workflow within a commercial electronic health record. Phase I applied usability testing involving "think-aloud" protocol analysis of 8 primary care providers encountering several scripted clinical scenarios. Phase II used "near-live" clinical simulations of 8 providers interacting with video clips of standardized trained patient actors enacting the clinical scenario. In both phases, all sessions were audiotaped and had screen-capture software activated for onscreen recordings. Transcripts were coded using qualitative analysis methods. In Phase I, the impact of the CDS on navigation and workflow was associated with the largest volume of negative comments (accounting for over 90% of user-raised issues), while the overall usability and the content of the CDS were associated with the most positive comments. However, usability had a positive-to-negative comment ratio of only 0.93, reflecting mixed perceptions about the usability of the CDS. In Phase II, the duration of encounters with simulated patients was approximately 12 min, with 71% of the clinical prediction rules being activated after half of the visit had already elapsed. Upon activation, providers accepted the CDS tool pathway 82% of the times it was offered and completed all of its elements in 53% of all simulation cases. Only 12.2% of encounter time was spent using the CDS tool.

  4. Formal Analysis of SET and NSL Protocols Using the Interpretation Functions-Based Method

    Directory of Open Access Journals (Sweden)

    Hanane Houmani

    2012-01-01

    Full Text Available Most applications on the Internet, such as e-banking and e-commerce, use the SET and the NSL protocols to protect the communication channel between the client and the server. It is therefore crucial to ensure that these protocols respect security properties such as confidentiality, authentication, and integrity. In this paper, we analyze the SET and the NSL protocols with respect to the confidentiality (secrecy) property. To perform this analysis, we use the interpretation functions-based method. The main idea behind the interpretation functions-based technique is to give sufficient conditions that guarantee that a cryptographic protocol respects the secrecy property. The flexibility of the proposed conditions allows the verification of daily-life protocols such as SET and NSL. The method can also be used under different assumptions, such as a variety of intruder abilities, including algebraic properties of cryptographic primitives. The NSL protocol, for instance, is analyzed with and without the homomorphism property. Using the SET protocol, we also show the usefulness of this approach for correcting weaknesses and problems discovered during the analysis.
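
    The interpretation-functions method itself does not reduce to a few lines, but the flavour of automated secrecy checking can be conveyed with a generic Dolev-Yao knowledge closure (Python sketch; this is a textbook intruder model, not the authors' technique, and the terms are invented):

        # Terms are atoms or tuples ("pair", a, b) / ("enc", msg, key).
        def closure(knowledge):
            known = set(knowledge)
            changed = True
            while changed:
                changed = False
                for t in list(known):
                    if isinstance(t, tuple) and t[0] == "pair":
                        new = {t[1], t[2]}              # split pairs
                    elif isinstance(t, tuple) and t[0] == "enc" and t[2] in known:
                        new = {t[1]}                    # decrypt with a known key
                    else:
                        continue
                    if not new <= known:
                        known |= new
                        changed = True
            return known

        observed = {("enc", "secret", "k_ab"), ("pair", "nonce", "k_c")}
        print("secret leaked?", "secret" in closure(observed))  # False: k_ab unknown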

  5. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  6. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  7. Analysis of quality control protocol implementation of equipment in radiotherapy services

    International Nuclear Information System (INIS)

    Calcina, Carmen S. Guzman; Lima, Luciana P. de; Rubo, Rodrigo A.; Ferraz, Eduardo; Almeida, Adelaide de

    2000-01-01

    Considering the importance of quality assurance in radiotherapy services, there was interest in evaluating, classifying and comparing the quality control tests performed on cobalt units, linear accelerators and simulators. The work is proposed as a suggestion that can serve as a tool both for medical physicists who are starting to work in radiotherapy and for more experienced ones. The discussion was based on a compilation of local tests and official protocols, resulting in a suggested minimum protocol for routine work, emphasizing the periodicity and tolerance level of each test. (author)

  8. Standardization of a Protocol for Obtaining Platelet Rich Plasma from blood Donors; a Tool for Tissue Regeneration Procedures.

    Science.gov (United States)

    Gómez, Lina Andrea; Escobar, Magally; Peñuela, Oscar

    2015-01-01

    To develop a protocol for obtaining autologous platelet-rich plasma (PRP) from healthy individuals and to determine the concentration of five major growth factors before platelet activation. This protocol could be integrated into guidelines of good clinical practice and research in regenerative medicine. Platelet-rich plasma was isolated by centrifugation from 38 healthy men and 42 women ranging from 18 to 59 years old. The platelet count and quantification of growth factors were analyzed in eighty samples, stratified by age and gender of the donor. Analyses were performed using the parametric t-test or Pearson's analysis for non-parametric distributions. The procedure concentrated platelet counts by 1.6 to 4.9 times (mean = 2.8). There was no correlation between platelet concentration and the level of the following growth factors: VEGF-D (r = 0.009, p = 0.4105), VEGF-A (r = 0.0068, p = 0.953), PDGF subunit AA (p = 0.3618; r = 0.1047), PDGF-BB (p = 0.5936; r = 0.6095). In the same way, there was no correlation between donor gender and growth factor concentrations. Only TGF-β concentration was correlated with platelet concentration (r = 0.3163, p = 0.0175). The procedure used allowed us to make preparations rich in platelets, low in leukocytes and red blood cells, and sterile. Our results showed biological variations in the content of growth factors in PRP. The factors influencing these results should be studied further.

  9. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from numerous submissions.

  10. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    Science.gov (United States)

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  11. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  12. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game. This book is ideal for video game developers who want to experiment with the game analytics approach for their own productions. It provides a good overview of the themes you need to pay attention to, and paves the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to improve their practice.

  13. Analysis of NASA communications (Nascom) II network protocols and performance

    Science.gov (United States)

    Omidyar, Guy C.; Butler, Thomas E.

    1991-01-01

    The NASA Communications (Nascom) Division of the Mission Operations and Data Systems Directorate is to undertake a major initiative to develop the Nascom II (NII) network to achieve its long-range service objectives for operational data transport to support the Space Station Freedom Program, the Earth Observing System, and other projects. NII is the Nascom ground communications network being developed to accommodate the operational traffic of the mid-1990s and beyond. The authors describe various baseline protocol architectures based on current and evolving technologies. They address the internetworking issues suggested for reliable transfer of data over heterogeneous segments. They also describe the NII architecture, topology, system components, and services. A comparative evaluation of the current and evolving technologies was made, and suggestions for further study are described. It is shown that the direction of the NII configuration and the subsystem component design will clearly depend on the advances made in the area of broadband integrated services.

  14. Sylvia Plath: a protocol analysis of her last poems.

    Science.gov (United States)

    Leenaars, A A; Wenckstern, S

    1998-01-01

    Personal documents have a significant place in psychological research. Suicide notes, diaries, novels, poems, and so on allow us to better understand the suicidal mind. The works of Sylvia Plath--a poet who killed herself at age 30--are prime examples for such protocol study. This article examines the last 6 months of Plath's poetry, revealing a suicidal malaise. Associating the results to the lives of Cesare Pavese and the case study of Natalie, a Terman-Shneidman subject of the intellectually gifted, the study shows a unit thema that facilitates the process of death. The poems reveal such themes as unbearable pain, loss, and abandonment that likely contributed significantly to death becoming the only solution.

  15. Bayesian data analysis tools for atomic physics

    Science.gov (United States)

    Trassinelli, Martino

    2017-10-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that makes it possible to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or absence of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine the spectrum modelling uniquely. For these two studies, we use the program Nested_fit to calculate the different probability distributions and other related quantities. Nested_fit is a Fortran90/Python code developed during recent years for the analysis of atomic spectra. As indicated by the name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
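
    For intuition about the Bayesian evidence, the one-dimensional case can be brute-forced on a grid instead of using nested sampling (Python sketch with invented data; Nested_fit's actual computation is different):

        import numpy as np

        data = np.array([4.8, 5.1, 5.3, 4.9, 5.2])
        sigma = 0.3

        def log_like(mu):
            return (-0.5 * np.sum((data - mu) ** 2) / sigma**2
                    - len(data) * np.log(sigma * np.sqrt(2 * np.pi)))

        # Model A: mu fixed at 5.0 (no free parameter) -> Z_A = L(5.0).
        z_a = np.exp(log_like(5.0))

        # Model B: mu free, uniform prior on [0, 10] -> Z_B = (1/10) * integral of L.
        mus = np.linspace(0.0, 10.0, 10001)
        likes = np.exp([log_like(m) for m in mus])
        z_b = likes.sum() * (mus[1] - mus[0]) / 10.0   # Riemann sum

        print(f"Bayes factor Z_A/Z_B = {z_a / z_b:.1f}")  # > 1 favours the simpler model here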

  16. ResStock Analysis Tool | Buildings | NREL

    Science.gov (United States)

    ResStock supports large-scale analysis of energy and cost savings for U.S. homes by combining large public and private data sources; analyses with the tool have uncovered $49 billion in potential annual utility bill savings through cost-effective energy efficiency.

  17. LEAP2000: tools for sustainable energy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Heaps, C.; Lazarus, M.; Raskin, P. [SEU-Boston, Boston, MA (USA)

    2000-09-01

    LEAP2000 is a collaborative initiative, led by the Boston center of the Stockholm Environment Institute, to create a new suite of analytical software and databases for integrated energy-environment analysis. The LEAP2000 software and the Technology and Environmental Database (TED) are described. 5 refs., 5 figs.

  18. Spreadsheet as a tool of engineering analysis

    International Nuclear Information System (INIS)

    Becker, M.

    1985-01-01

    In engineering analysis, problems tend to be categorized into those that can be done by hand and those that require a computer for solution. The advent of personal computers, and in particular of spreadsheet software, blurs this distinction, creating an intermediate category of problems well suited to interactive personal computing.

  19. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of its Matlab(R)-based implementation is presented and special features, motivated by industrial users, are introduced. Salient features of the tool are presented, including the ability to specify the behavior of a complex system at a high level of functional abstraction, analyze single and multiple fault scenarios, and automatically generate parity relations for diagnosis of the system in normal and impaired conditions. User interface and algorithmic details are also presented.

  20. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede


  1. Generalized Aliasing as a Basis for Program Analysis Tools

    National Research Council Canada - National Science Library

    O'Callahan, Robert

    2000-01-01

    This dissertation describes the design of a system, Ajax, that addresses this problem by using semantics-based program analysis as the basis for a number of different tools to aid Java programmers...

  2. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Logical Framework Analysis (LFA) is an essential tool for designing agricultural projects. The paper provides an overview of the process and of the structure of the Logical Framework Matrix, or Logframe, derivable from it.

  3. Surface Operations Data Analysis and Adaptation Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  4. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph-based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention-preserving stochastic branching.

  5. Protocol: An updated integrated methodology for analysis of metabolites and enzyme activities of ethylene biosynthesis

    Directory of Open Access Journals (Sweden)

    Geeraerd Annemie H

    2011-06-01

    Full Text Available Abstract Background The foundations for ethylene research were laid many years ago by researchers such as Lizada, Yang and Hoffman. Nowadays, most of the methods developed by them are still being used. Technological developments since then have led to small but significant improvements, contributing to a more efficient workflow. Despite this, many of these improvements have never been properly documented. Results This article provides an updated, integrated set of protocols suitable for the assembly of a complete picture of ethylene biosynthesis, including the measurement of ethylene itself. The original protocols for the metabolites 1-aminocyclopropane-1-carboxylic acid and 1-(malonylamino)cyclopropane-1-carboxylic acid have been updated and downscaled, while protocols to determine in vitro activities of the key enzymes 1-aminocyclopropane-1-carboxylate synthase and 1-aminocyclopropane-1-carboxylate oxidase have been optimised for efficiency, repeatability and accuracy. All the protocols described were optimised for apple fruit, but have been proven to be suitable for the analysis of tomato fruit as well. Conclusions This work collates an integrated set of detailed protocols for the measurement of components of the ethylene biosynthetic pathway, starting from well-established methods. These protocols have been optimised for smaller sample volumes, increased efficiency, repeatability and accuracy. The detailed protocol allows other scientists to rapidly implement these methods in their own laboratories in a consistent and efficient way.

  6. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, were presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour was finally obtained by optimisation techniques. This example of application shows how analysis of the model parameter space is a powerful tool for empirical validation. In particular, diagnosis possibilities are greatly increased in comparison with residuals analysis techniques. (author)

  7. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  8. IMPLEMENTATION AND VALIDATION OF STATISTICAL TESTS IN RESEARCH'S SOFTWARE HELPING DATA COLLECTION AND PROTOCOLS ANALYSIS IN SURGERY.

    Science.gov (United States)

    Kuretzki, Carlos Henrique; Campos, Antônio Carlos Ligocki; Malafaia, Osvaldo; Soares, Sandramara Scandelari Kusano de Paula; Tenório, Sérgio Bernardo; Timi, Jorge Rufino Ribas

    2016-03-01

    Information technology is widely applied in healthcare. With regard to scientific research, SINPE(c) - Integrated Electronic Protocols - was created as a tool to support researchers, offering clinical data standardization. Until then, SINPE(c) lacked statistical tests obtained by automatic analysis. The objective was to add to SINPE(c) features for the automatic execution of the main statistical methods used in medicine. The study was divided into four topics: checking the interest of users in the implementation of the tests; surveying the frequency of their use in healthcare; carrying out the implementation; and validating the results with researchers and their protocols. It was applied to a group of users of this software working on their stricto sensu master's and doctoral theses in a postgraduate program in surgery. To assess the reliability of the statistics, the data obtained automatically by SINPE(c) were compared with those computed manually by a statistician experienced with this type of study. There was interest in the use of automatic statistical tests, with good acceptance. The chi-square, Mann-Whitney, Fisher and Student's t-tests were considered the tests most frequently used by participants in medical studies. These methods were implemented and thereafter approved as expected. The incorporation of automatic statistical analysis into SINPE(c) was shown to be reliable and equivalent to manual analysis, validating its use as a tool for medical research.
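
    As a rough sketch of the four tests named above, here are the corresponding SciPy calls on made-up samples (illustration only; not SINPE(c)'s own implementation):

        from scipy import stats

        group_a = [12.1, 13.4, 11.8, 14.0, 12.9, 13.7]
        group_b = [10.2, 11.1, 10.8, 11.9, 10.5, 11.4]

        print(stats.ttest_ind(group_a, group_b))      # Student's t-test
        print(stats.mannwhitneyu(group_a, group_b))   # Mann-Whitney U

        table = [[18, 7], [6, 19]]                    # 2x2 contingency table
        chi2, p, dof, _ = stats.chi2_contingency(table)
        print(f"chi-square: chi2={chi2:.2f}, p={p:.4f}")
        print(stats.fisher_exact(table))              # Fisher's exact test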

  9. Measurement properties of self-report physical activity assessment tools in stroke: a protocol for a systematic review

    Science.gov (United States)

    Martins, Júlia Caetano; Aguiar, Larissa Tavares; Nadeau, Sylvie; Scianni, Aline Alvim; Teixeira-Salmela, Luci Fuscaldi; Faria, Christina Danielli Coelho de Morais

    2017-01-01

    Introduction Self-report physical activity assessment tools are commonly used for the evaluation of physical activity levels in individuals with stroke. A great variety of these tools have been developed and widely used in recent years, which justifies the need to examine their measurement properties and clinical utility. Therefore, the main objectives of this systematic review are to examine the measurement properties and clinical utility of self-report measures of physical activity and to discuss the strengths and limitations of the identified tools. Methods and analysis A systematic review of studies that investigated the measurement properties and/or clinical utility of self-report physical activity assessment tools in stroke will be conducted. Electronic searches will be performed in five databases: Medical Literature Analysis and Retrieval System Online (MEDLINE) (PubMed), Excerpta Medica Database (EMBASE), Physiotherapy Evidence Database (PEDro), Literatura Latino-Americana e do Caribe em Ciências da Saúde (LILACS) and Scientific Electronic Library Online (SciELO), followed by hand searches of the reference lists of the included studies. Two independent reviewers will screen all retrieved titles, abstracts, and full texts according to the inclusion criteria and will also extract the data. A third reviewer will be consulted to resolve any disagreement. A descriptive summary of the included studies will contain the design and participants, as well as the characteristics, measurement properties, and clinical utility of the self-report tools. The methodological quality of the studies will be evaluated using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist, and the clinical utility of the identified tools will be assessed using predefined criteria. This systematic review will follow the Preferred Reporting Items for Systematic Review and Meta-Analyses (PRISMA) statement. Discussion This systematic review will provide an extensive review of the measurement properties and clinical utility of self-report physical activity assessment tools used with individuals with stroke.

  10. A protocol for the development of a critical thinking assessment tool for nurses using a Delphi technique.

    Science.gov (United States)

    Jacob, Elisabeth; Duffield, Christine; Jacob, Darren

    2017-08-01

    The aim of this study was to develop an assessment tool to measure the critical thinking ability of nurses. As an increasing number of complex patients are admitted to hospitals, it becomes ever more important that nurses recognize changes in health status and pick up on deterioration. Detecting early signs of complication requires critical thinking skills. Registered Nurses are expected to commence their clinical careers with the critical thinking skills necessary to ensure safe nursing practice. Currently, there is no published tool to assess critical thinking skills that is context-specific to Australian nurses. A modified Delphi study will be used for the project. This study will develop a series of unfolding case scenarios, using national health data, with multiple-choice questions to assess critical thinking. Face validity of the scenarios will be determined by an expert reference group of clinical and academic nurses. A Delphi study will determine the answers to the scenario questions. Panel members will be expert clinicians and educators from two states in Australia. Rasch analysis of the questionnaire will assess the validity and reliability of the tool. Funding for the study and Research Ethics Committee approval were obtained in March and November 2016, respectively. Patient outcomes and safety are directly linked to nurses' critical thinking skills. This study will develop an assessment tool to provide a standardized method of measuring nurses' critical thinking skills across Australia. This will give healthcare providers greater confidence in the critical thinking level of graduate Registered Nurses. © 2017 John Wiley & Sons Ltd.

  11. Analyzing the effect of routing protocols on media access control protocols in radio networks

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, C. L. (Christopher L.); Drozda, M. (Martin); Marathe, A. (Achla); Marathe, M. V. (Madhav V.)

    2002-01-01

    We study the effect of routing protocols on the performance of media access control (MAC) protocols in wireless radio networks. Three well-known MAC protocols are considered: 802.11, CSMA, and MACA. Similarly, three recently proposed routing protocols are considered: AODV, DSR and LAR scheme 1. The experimental analysis was carried out using GloMoSim, a tool for simulating wireless networks. The main focus of our experiments was to study how the routing protocols affect the performance of the MAC protocols when the underlying network and traffic parameters are varied. The performance of the protocols was measured with respect to five important parameters: (i) number of received packets, (ii) average latency of each packet, (iii) throughput, (iv) long-term fairness and (v) number of control packets at the MAC layer level. Our results show that combinations of routing and MAC protocols yield varying performance under varying network topology and traffic situations. This result has an important implication: no combination of routing protocol and MAC protocol is best over all situations. Also, the performance of protocols at a given level in the protocol stack needs to be studied not locally in isolation but as a part of the complete protocol stack. A novel aspect of our work is the use of a statistical technique, ANOVA (Analysis of Variance), to characterize the effect of routing protocols on MAC protocols. This technique is of independent interest and can be utilized in several other simulation and empirical studies.
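
    In the same spirit as the paper's use of ANOVA, a one-way test of whether the routing protocol shifts a MAC-level metric looks like this (Python sketch; the throughput samples are invented):

        from scipy import stats

        # Hypothetical throughput samples (packets/s) per routing protocol.
        aodv = [410, 395, 430, 405, 418]
        dsr = [380, 402, 388, 395, 391]
        lar = [440, 452, 433, 445, 450]

        f_stat, p_value = stats.f_oneway(aodv, dsr, lar)
        print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
        # A small p-value suggests the routing protocol significantly
        # affects the measured MAC-layer performance.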

  12. Comparative analysis of protocols for DNA extraction from soybean caterpillars.

    Science.gov (United States)

    Palma, J; Valmorbida, I; da Costa, I F D; Guedes, J V C

    2016-04-07

    Genomic DNA extraction is crucial for molecular research, including diagnostics and genome characterization of different organisms. The aim of this study was to comparatively analyze protocols of DNA extraction based on cell lysis by sarcosyl, cetyltrimethylammonium bromide, and sodium dodecyl sulfate, and to determine the most efficient method applicable to soybean caterpillars. DNA was extracted from specimens of Chrysodeixis includens and Spodoptera eridania using the aforementioned three methods. DNA quantification was performed using spectrophotometry and high molecular weight DNA ladders. The purity of the extracted DNA was determined by calculating the A260/A280 ratio. Cost and time for each DNA extraction method were estimated and analyzed statistically. The amount of DNA extracted by these three methods was sufficient for PCR amplification. The sarcosyl method yielded DNA of higher purity, because it generated a clearer pellet without viscosity, and yielded high-quality amplification products of the COI gene. The sarcosyl method showed the lowest cost per extraction and did not differ from the other methods with respect to preparation time. Cell lysis by sarcosyl represents the best method for DNA extraction in terms of yield, quality, and cost-effectiveness.

  13. Advanced tools for in vivo skin analysis.

    Science.gov (United States)

    Cal, Krzysztof; Zakowiecki, Daniel; Stefanowska, Justyna

    2010-05-01

    A thorough examination of the skin is essential for accurate disease diagnostics, evaluation of the effectiveness of topically applied drugs and the assessment of the results of dermatologic surgeries such as skin grafts. Knowledge of skin parameters is also important in the cosmetics industry, where the effects of skin care products are evaluated. Due to significant progress in the electronics and computer industries, sophisticated analytic devices are increasingly available for day-to-day diagnostics. The aim of this article is to review several advanced methods for in vivo skin analysis in humans: magnetic resonance imaging, electron paramagnetic resonance, laser Doppler flowmetry and time domain reflectometry. The molecular bases of these techniques are presented, and several interesting applications in the field are discussed. Methods for in vivo assessment of the biomechanical properties of human skin are also reviewed.

  14. NMR spectroscopy: a tool for conformational analysis

    International Nuclear Information System (INIS)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto; Freitas, Matheus P.

    2011-01-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods such as infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds belonging to a given organic function, such as ketones or esters. Studies of single compounds, even those of special relevance, were excluded, since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects on the conformational equilibrium. Moreover, the huge amount of data available in the literature in this research field imposed some limitations, which are detailed in the Introduction; it should be noted in advance that these limitations mostly concern the period in which the results were published. (author)

  15. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language is an obstacle to their development. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor. In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of types.
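
    As a minimal picture of what a pointer analysis computes, here is an inclusion-based (Andersen-style) points-to pass over two statement kinds (Python sketch of the textbook algorithm; a real JavaScript analysis must also handle dynamic properties, prototypes, and much more):

        # Statements: "x = new_i" (allocation) and "x = y" (copy).
        allocs = [("a", "obj1"), ("b", "obj2")]   # a = new obj1; b = new obj2
        copies = [("c", "a"), ("a", "b")]         # c = a; a = b

        points_to = {}
        for var, site in allocs:
            points_to.setdefault(var, set()).add(site)

        changed = True
        while changed:                            # propagate to a fixed point
            changed = False
            for dst, src in copies:
                src_set = points_to.get(src, set())
                dst_set = points_to.setdefault(dst, set())
                if not src_set <= dst_set:
                    dst_set |= src_set
                    changed = True

        print(points_to)  # a -> {obj1, obj2}, b -> {obj2}, c -> {obj1, obj2}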

  16. A content analysis of posthumous sperm procurement protocols with considerations for developing an institutional policy.

    Science.gov (United States)

    Bahm, Sarah M; Karkazis, Katrina; Magnus, David

    2013-09-01

    To identify and analyze existing posthumous sperm procurement (PSP) protocols in order to outline central themes for institutions to consider when developing future policies. Qualitative content analysis. Large academic institutions across the United States. We performed a literature search and contacted 40 institutions to obtain nine full PSP protocols. We then performed a content analysis on these policies to identify major themes and factors to consider when developing a PSP protocol. Presence of a PSP policy. We identified six components of a thorough PSP protocol: Standard of Evidence, Terms of Eligibility, Sperm Designee, Restrictions on Use in Reproduction, Logistics, and Contraindications. We also identified two different approaches to policy structure. In the Limited Role approach, institutions have stricter consent requirements and limit their involvement to the time of procurement. In the Family-Centered approach, substituted judgment is permitted but a mandatory wait period is enforced before sperm use in reproduction. Institutions seeking to implement a PSP protocol will benefit from considering the six major building blocks of a thorough protocol and where they would like to fall on the spectrum from a Limited Role to a Family-Centered approach. Copyright © 2013 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  17. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  18. Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.

    Science.gov (United States)

    Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E

    2015-01-01

    Infants recovering from anesthesia are at risk of life-threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorer ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for the study of rare events such as POA.

  19. Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.

    Directory of Open Access Journals (Sweden)

    Carlos Alejandro Robles-Rubio

    Full Text Available Infants recovering from anesthesia are at risk of life-threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorer ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for the study of rare events such as POA.

  20. Pharmacotherapies for fatigue in chronic liver disease (CLD): a systematic review and meta-analysis (protocol).

    Science.gov (United States)

    Effiong, Andem; Kumari, Prerna

    2018-02-14

    This is the protocol for a systematic review (and meta-analysis) of an intervention. The primary objective of this systematic review will be to assess the benefits and harms of pharmacological therapies (pharmacotherapies) for the management of fatigue in adults with CLD of any etiology. The effects of pharmacological therapies on fatigue in CLD will be compared against those of placebo, no intervention, or non-pharmacological interventions. Specifically, this review will examine whether pharmacological therapies improve CLD-associated fatigue, and if they do, what key elements are associated with their effectiveness. The results of this systematic review will assist clinicians, policy-makers, researchers, and people with CLD in decision-making on how best to manage fatigue and its associated symptoms. MEDLINE, SCOPUS, EMBASE, EU Clinical Trials Register, WHO International Clinical Trials Registry Platform, CENTRAL (The Cochrane Library), ClinicalTrials.gov, reference lists of articles and conference proceedings will be searched for relevant studies. No language or date restrictions will be applied. Eligible studies will include adults with CLD of any etiology. Included studies will be randomized controlled trials. From included studies, data on participant characteristics, study design, setting, research ethics compliance, and intervention outcomes will be extracted. Risk of bias in included studies will be assessed using the Cochrane Risk of Bias Tool. A random-effects meta-analysis will be conducted. If substantial or considerable levels of heterogeneity are detected, analysis will be limited to a narrative synthesis. This systematic review will examine the effectiveness of pharmacological therapies on fatigue reduction in people with CLD. Such therapies may be more effective than non-pharmacological interventions in treating fatigue symptoms in CLD. Evidence derived from the findings of this study will guide future practice, policy, and research. PROSPERO, CRD

  1. Development and evaluation of an algorithm-based tool for Medication Management in nursing homes: the AMBER study protocol.

    Science.gov (United States)

    Erzkamp, Susanne; Rose, Olaf

    2018-04-20

    Residents of nursing homes are susceptible to risks from medication. Medication Reviews (MR) can improve clinical outcomes and the quality of medication therapy. Limited resources and barriers between healthcare practitioners are potential obstacles to performing MR in nursing homes. Focusing on frequent and relevant problems can support pharmacists in the provision of pharmaceutical care services. This study aims to develop and evaluate an algorithm-based tool that facilitates the provision of Medication Management in clinical practice. The study is subdivided into three phases. In phase I, semistructured interviews with healthcare practitioners and patients will be performed, and a mixed methods approach will be chosen. Qualitative content analysis and rating of the aspects concerning the frequency and relevance of problems in the medication process in nursing homes will be performed. In phase II, a systematic review of the current literature on problems and interventions will be conducted. The findings will be presented narratively. The results of both phases will be combined to develop an algorithm for MRs. For further refinement of the aspects detected, a Delphi survey will be conducted. In conclusion, a tool for clinical practice will be created. In phase III, the tool will be tested on MRs in nursing homes. In addition, effectiveness, acceptance, feasibility and reproducibility will be assessed. The primary outcome of phase III will be the reduction of drug-related problems (DRPs), which will be detected using the tool. The secondary outcomes will be the proportion of DRPs, the acceptance of pharmaceutical recommendations, the expenditure of time using the tool, and inter-rater reliability. The study intervention is approved by the local Ethics Committee. The findings of the study will be presented at national and international scientific conferences and published in peer-reviewed journals. Trial registration: DRKS00010995.

  2. Measurement properties of self-report physical activity assessment tools in stroke: a protocol for a systematic review.

    Science.gov (United States)

    Martins, Júlia Caetano; Aguiar, Larissa Tavares; Nadeau, Sylvie; Scianni, Aline Alvim; Teixeira-Salmela, Luci Fuscaldi; Faria, Christina Danielli Coelho de Morais

    2017-02-13

    Self-report physical activity assessment tools are commonly used for the evaluation of physical activity levels in individuals with stroke. A great variety of these tools have been developed and widely used in recent years, which justifies the need to examine their measurement properties and clinical utility. Therefore, the main objectives of this systematic review are to examine the measurement properties and clinical utility of self-report measures of physical activity and to discuss the strengths and limitations of the identified tools. A systematic review of studies that investigated the measurement properties and/or clinical utility of self-report physical activity assessment tools in stroke will be conducted. Electronic searches will be performed in five databases: Medical Literature Analysis and Retrieval System Online (MEDLINE) (PubMed), Excerpta Medica Database (EMBASE), Physiotherapy Evidence Database (PEDro), Literatura Latino-Americana e do Caribe em Ciências da Saúde (LILACS) and Scientific Electronic Library Online (SciELO), followed by hand searches of the reference lists of the included studies. Two independent reviewers will screen all retrieved titles, abstracts, and full texts according to the inclusion criteria and will also extract the data. A third reviewer will be consulted to resolve any disagreement. A descriptive summary of the included studies will contain the design and participants, as well as the characteristics, measurement properties, and clinical utility of the self-report tools. The methodological quality of the studies will be evaluated using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist, and the clinical utility of the identified tools will be assessed using predefined criteria. This systematic review will follow the Preferred Reporting Items for Systematic Review and Meta-Analyses (PRISMA) statement. This systematic review will provide an extensive review of the measurement properties and clinical utility of self-report physical activity assessment tools in stroke.

  3. SECIMTools: a suite of metabolomics data analysis tools.

    Science.gov (United States)

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of identified features. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modulated modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.

  4. Timing Analysis of the FlexRay Communication Protocol

    DEFF Research Database (Denmark)

    Pop, Traian; Pop, Paul; Eles, Petru

    2006-01-01

    FlexRay will very likely become the de-facto standard for in-vehicle communications. However, before it can be successfully used for safety-critical applications that require predictability, timing analysis techniques are necessary for providing bounds for the message communication times....... In this paper, we propose techniques for determining the timing properties of messages transmitted in both the static (ST) and the dynamic (DYN) segments of a FlexRay communication cycle. The analysis techniques for messages are integrated in the context of a holistic schedulability analysis that computes...

  5. Tool for efficient intermodulation analysis using conventional HB packages

    OpenAIRE

    Vannini, G.; Filicori, F.; Traverso, P.

    1999-01-01

    A simple and efficient approach is proposed for the intermodulation analysis of nonlinear microwave circuits. The algorithm, which is based on a very mild assumption about the frequency response of the linear part of the circuit, allows for a reduction in computing time and memory requirements. Moreover, it can be easily implemented using any conventional tool for harmonic-balance circuit analysis.

  6. Gender analysis of use of participatory tools among extension workers

    African Journals Online (AJOL)

    (χ² = 0.833, p = 0.361; t = 0.737, p = 0.737, CC = 0.396) Participatory tools used by both male and female extension personnel include resource map, mobility map, transect map, focus group discussion, Venn diagram, seasonal calendar, SWOT analysis, semi-structured interview, daily activity schedule, resource analysis, ...

  7. Upper Midwest Gap Analysis Program, Image Processing Protocol

    National Research Council Canada - National Science Library

    Lillesand, Thomas

    1998-01-01

    This document presents a series of technical guidelines by which land cover information is being extracted from Landsat Thematic Mapper data as part of the Upper Midwest Gap Analysis Program (UMGAP...

  8. Cost analysis of hybrid adaptive routing protocol for heterogeneous ...

    Indian Academy of Sciences (India)

    NONITA SHARMA

    Event detection; wireless sensor networks; hybrid routing; cost benefit analysis; proactive routing; reactive routing. 1. ... additional energy, high processing power, etc. are deployed to extend the .... transmit to its parent node. (2) Reactive ...

  9. Simple Public Key Infrastructure Protocol Analysis and Design

    National Research Council Canada - National Science Library

    Vidergar, Alexander G

    2005-01-01

    ...). This thesis aims at proving the applicability of the Simple Public Key Infrastructure (SPKI) as a means of PKC. The strand space approach of Guttman and Thayer is used to provide an appropriate model for analysis...

  10. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formedness of modules, call dependencies between modules, etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs

  11. Development of Visualization Tools for ZPPR-15 Analysis

    International Nuclear Information System (INIS)

    Lee, Min Jae; Kim, Sang Ji

    2014-01-01

    ZPPR-15 cores consist of various drawer masters that have great heterogeneity. In order to build a proper homogenization strategy, the geometry of the drawer masters should be carefully analyzed with a visualization. Additionally, a visualization of drawer masters and the core configuration is necessary for minimizing human error during input processing. For this purpose, visualization tools for ZPPR-15 analysis have been developed based on a Perl script. In the following section, the implementation of the visualization tools is described and various visualization samples for both drawer masters and ZPPR-15 cores are demonstrated. Visualization tools for drawer masters and a core configuration were successfully developed for ZPPR-15 analysis. The visualization tools are expected to be useful for understanding ZPPR-15 experiments and for finding deterministic models of ZPPR-15. It turned out that generating VTK files is straightforward, and the files become a powerful analysis aid when viewed with the VisIt program.

  12. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up for new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or not existing link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered. The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and designers.

  13. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  14. Development of a physical activity monitoring tool for Thai medical schools: a protocol for a mixed methods study.

    Science.gov (United States)

    Wattanapisit, Apichai; Vijitpongjinda, Surasak; Saengow, Udomsak; Amaek, Waluka; Thanamee, Sanhapan; Petchuay, Prachyapan

    2017-09-27

    Physical activity (PA) is important in promoting health, as well as in the treatment and prevention of diseases. However, insufficient PA is still a global health problem and it is also a problem in medical schools. PA training in medical curricula is still sparse or non-existent. There is a need for a comprehensive understanding of the extent of PA in medical schools through several indicators, including people, places and policies. This study includes a survey of the PA prevalence in a medical school and development of a tool, the Medical School Physical Activity Report Card (MSPARC), which will contain concise and understandable infographics and information for exploring, monitoring and reporting information relating to PA prevalence. This mixed methods study will run from January to September 2017. We will involve the School of Medicine, Walailak University, Thailand, and its medical students (n=285). Data collection will consist of both primary and secondary data, divided into four parts: general information, people, places and policies. We will investigate the PA metrics about (1) people: the prevalence of PA and sedentary behaviours; (2) place: the quality and accessibility of walkable neighbourhoods, bicycle facilities and recreational areas; and (3) policy: PA promotion programmes for medical students, education metrics and investments related to PA. The MSPARC will be developed using simple symbols, infographics and short texts to evaluate the PA metrics of the medical school. This study has been approved by the Human Research Ethics Committee of Walailak University (protocol number: WUEC-16-005-01). Findings will be published in peer-reviewed journals and presented at national or international conferences. The MSPARC and full report will be disseminated to relevant stakeholders, policymakers, staff and clients.

  15. Practical security analysis of a quantum stream cipher by the Yuen 2000 protocol

    International Nuclear Information System (INIS)

    Hirota, Osamu

    2007-01-01

    There exists a great gap between the one-time pad with perfect secrecy and conventional mathematical encryption. The Yuen 2000 (Y00) protocol or αη scheme may provide a protocol which spans from conventional security to the ultimate one, depending on the implementation. This paper presents a complexity-theoretic security analysis of some models of the Y00 protocol with a nonlinear pseudo-random-number generator and quantum noise diffusion mapping (QDM). Algebraic attacks and fast correlation attacks are applied to a model of the Y00 protocol that uses nonlinear filtering, like the Toyocrypt stream cipher, as the running-key generator, and it is shown that these attacks in principle do not work on such models even when the mapping between running key and quantum state signal is fixed. In addition, a security property of the Y00 protocol with QDM is clarified. Consequently, we show that the Y00 protocol has a potential which cannot be realized by conventional cryptography and that it goes beyond mathematical encryption with physical encryption.

  16. IEEE 802.11 Wireless LANs: Performance Analysis and Protocol Refinement

    Directory of Open Access Journals (Sweden)

    Chatzimisios P.

    2005-01-01

    The IEEE 802.11 protocol is emerging as a widely used standard and has become the most mature technology for wireless local area networks (WLANs). In this paper, we focus on the tuning of the IEEE 802.11 protocol parameters taking into consideration, in addition to throughput efficiency, performance metrics such as the average packet delay, the probability of a packet being discarded when it reaches the maximum retransmission limit, the average time to drop a packet, and the packet interarrival time. We present an analysis, validated by simulation, that is based on a Markov chain model commonly used in the literature. We further study the improvement on these performance metrics by employing suitable protocol parameters according to the specific communication needs of the IEEE 802.11 protocol for both basic access and RTS/CTS access schemes. We show that the use of a higher initial contention window size does not considerably degrade performance in small networks and performs significantly better in any other scenario. Moreover, we conclude that the combination of a lower maximum contention window size and a higher retry limit considerably improves performance. Results indicate that the appropriate adjustment of the protocol parameters enhances performance and improves the services that the IEEE 802.11 protocol provides to various communication applications.
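
    The Markov chain model referred to here is commonly solved as a two-equation fixed point relating a station's transmission probability tau to its conditional collision probability p. The sketch below iterates that fixed point in plain Python; it is a minimal illustration of the standard Bianchi-style saturation model, not the authors' exact analysis, and the parameter values (W, m, n) are illustrative.

        def dcf_fixed_point(n, W=32, m=5, iters=500):
            """Saturation fixed point: tau (transmit prob.) vs p (collision prob.)."""
            tau = 0.1
            for _ in range(iters):
                p = 1.0 - (1.0 - tau) ** (n - 1)       # collision seen by one station
                tau_new = (2.0 * (1.0 - 2.0 * p)) / (
                    (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m)
                )
                tau = 0.5 * tau + 0.5 * tau_new        # damped update for stability
            return tau, p

        for n in (5, 20, 50):                           # number of contending stations
            tau, p = dcf_fixed_point(n)
            print(f"n={n:3d}  tau={tau:.4f}  p={p:.4f}")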

  17. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    Science.gov (United States)

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data but PO gives a distinctive insight, revealing what people are really doing, instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study was carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and it required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights that cannot be seen or revealed by other data collection methods. It is not commonly discussed in nursing research, and therefore this paper can serve as a useful tool for those who intend to use PO and grounded theory in their nursing research.

  18. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    One of the key goals in the SEMI industry is to improve equipment throughput and ensure equipment production efficiency is maximized. This paper is based on SEMI standards in semiconductor equipment control, defines the transition rules between different tool states, and presents a TEA system model which analyses tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified successfully, yielding the parameter values used to measure equipment performance as well as advice for improvement.

  19. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  20. Development of a climate data analysis tool (CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations and graphically displaying the results. This computer software will meet the demanding needs of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  1. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X; Martins, N; Bianco, A; Pinto, H J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  2. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers.

  4. Analysis Tool Web Services from the EMBL-EBI.

    Science.gov (United States)

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
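
    A minimal sketch of calling one of these services from Python follows, using only the standard library. The dbfetch URL and parameters shown follow the EBI documentation of the period and are assumptions that may have changed since; treat them as illustrative rather than authoritative.

        from urllib.parse import urlencode
        from urllib.request import urlopen

        # Fetch a UniProtKB entry in FASTA format via the dbfetch REST interface
        # (endpoint and parameter names assumed from the EBI documentation).
        params = urlencode({"db": "uniprotkb", "id": "P12345",
                            "format": "fasta", "style": "raw"})
        url = "https://www.ebi.ac.uk/Tools/dbfetch/dbfetch?" + params
        with urlopen(url) as response:
            print(response.read().decode("utf-8"))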

  5. Beyond protocols

    DEFF Research Database (Denmark)

    Vanderhoeven, Sonia; Branquart, Etienne; Casaer, Jim

    2017-01-01

    Risk assessment tools for listing invasive alien species need to incorporate all available evidence and expertise. Beyond the wealth of protocols developed to date, we argue that the current way of performing risk analysis has several shortcomings. In particular, lack of data on ecological impact, transparency and repeatability of assessments as well as the incorporation of uncertainty should all be explicitly considered. We recommend improved quality control of risk assessments through formalized peer review with clear feedback between assessors and reviewers. Alternatively, a consensus building ... information on risk and the exploration of improved methods for decision making on biodiversity management. This is crucial for efficient conservation resource allocation and uptake by stakeholders and the public.

  6. Performance Analysis of an Enhanced PRMA-HS Protocol for LEO Satellite Communication

    Institute of Scientific and Technical Information of China (English)

    ZHUO Yong-ning; YAN Shao-hu; WU Shi-qi

    2005-01-01

    The packet reservation multiple access with hindering state (PRMA-HS) is a protocol suitable for LEO satellite mobile communication. Although it works well with a light system payload (number of user terminals), the protocol imposes high channel congestion on a system with a heavy payload, thus degrading the system's quality of service. To control the channel congestion, a scheme of enhanced PRMA-HS protocol is proposed, which aims to reduce the collision of voice packets by adopting a mechanism of access control. Through theoretic analysis, the system's mathematical model is presented and the packet drop probability of the scheme is deduced. To verify the performance of the scheme, a simulation is performed and the results support our analysis.

  7. Physics analysis tools for beauty physics in ATLAS

    International Nuclear Information System (INIS)

    Anastopoulos, C; Bouhova-Thacker, E; Catmore, J; Mora, L de; Dallison, S; Derue, F; Epp, B; Jussel, P; Kaczmarska, A; Radziewski, H v; Stahl, T; Reznicek, P

    2008-01-01

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, Monte Carlo truth information association, used to process simulated data during software validation, which is an important part of the development of the physics analysis tools.

  8. Understanding context in knowledge translation: a concept analysis study protocol.

    Science.gov (United States)

    Squires, Janet E; Graham, Ian D; Hutchinson, Alison M; Linklater, Stefanie; Brehaut, Jamie C; Curran, Janet; Ivers, Noah; Lavis, John N; Michie, Susan; Sales, Anne E; Fiander, Michelle; Fenton, Shannon; Noseworthy, Thomas; Vine, Jocelyn; Grimshaw, Jeremy M

    2015-05-01

    To conduct a concept analysis of clinical practice contexts (work environments) that facilitate or militate against the uptake of research evidence by healthcare professionals in clinical practice. This will involve developing a clear definition of context by describing its features, domains and defining characteristics. The context where clinical care is delivered influences that care. While research shows that context is important to knowledge translation (implementation), we lack conceptual clarity on what is context, which contextual factors probably modify the effect of knowledge translation interventions (and hence should be considered when designing interventions) and which contextual factors themselves could be targeted as part of a knowledge translation intervention (context modification). Concept analysis. The Walker and Avant concept analysis method, comprised of eight systematic steps, will be used: (1) concept selection; (2) determination of aims; (3) identification of uses of context; (4) determination of defining attributes of context; (5) identification/construction of a model case of context; (6) identification/construction of additional cases of context; (7) identification/construction of antecedents and consequences of context; and (8) definition of empirical referents of context. This study is funded by the Canadian Institutes of Health Research (January 2014). This study will result in a much needed framework of context for knowledge translation, which identifies specific elements that, if assessed and used to tailor knowledge translation activities, will result in increased research use by nurses and other healthcare professionals in clinical practice, ultimately leading to better patient care. © 2014 John Wiley & Sons Ltd.

  9. Analysis of design tool attributes with regards to sustainability benefits

    Science.gov (United States)

    Zain, S.; Ismail, A. F.; Ahmad, Z.; Adesta, E. Y. T.

    2018-01-01

    The trend of global manufacturing competitiveness has shown a significant shift from profit- and customer-driven business to a more harmonious sustainability paradigm. This new direction, which emphasises the interests of the three pillars of sustainability, i.e., the social, economic and environmental dimensions, has changed the ways products are designed. As a result, the roles of design tools in the product development stage of manufacturing in adapting to the new strategy are vital and increasingly challenging. The aim of this paper is to review the literature on the attributes of design tools with regards to the sustainability perspective. Four well-established design tools are selected, namely Quality Function Deployment (QFD), Failure Mode and Effects Analysis (FMEA), Design for Six Sigma (DFSS) and Design for Environment (DfE). By analysing previous studies, the main attributes of each design tool and its benefits with respect to each sustainability dimension throughout four stages of the product lifecycle are discussed. From this study, it is learnt that each of the design tools contributes to the three pillars of sustainability either directly or indirectly, but they are unbalanced and not holistic. Therefore, the prospect of improving and optimising the design tools is projected, and the possibility of collaboration between the different tools is discussed.

  10. 75 FR 74007 - Federal Aquatic Nuisance Species Research Risk Analysis Protocol

    Science.gov (United States)

    2010-11-30

    ... site, http://anstaskforce.gov/documents.php . To obtain a hard copy of the Protocol, see Document... aquatic species that are the target of this risk analysis. Language used in the NANPCA differentiates...: http://anstaskforce.gov/documents.php Write: Susan Pasko, National Oceanic and Atmospheric...

  11. A simple protocol for NMR analysis of the enantiomeric purity of chiral hydroxylamines.

    Science.gov (United States)

    Tickell, David A; Mahon, Mary F; Bull, Steven D; James, Tony D

    2013-02-15

    A practically simple three-component chiral derivatization protocol for determining the enantiopurity of chiral hydroxylamines by (1)H NMR spectroscopic analysis is described, involving their treatment with 2-formylphenylboronic acid and enantiopure BINOL to afford a mixture of diastereomeric nitrono-boronate esters whose ratio is an accurate reflection of the enantiopurity of the parent hydroxylamine.
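
    The arithmetic behind such a protocol is simple: because each enantiomer of the hydroxylamine gives a distinct diastereomeric ester, the ratio of the two NMR integrals equals the enantiomeric ratio, from which the enantiomeric excess (ee) follows. A minimal sketch, with hypothetical integral values:

        def enantiomeric_excess(integral_major, integral_minor):
            """ee (%) from the NMR integrals of the two diastereomeric derivatives."""
            total = integral_major + integral_minor
            return 100.0 * abs(integral_major - integral_minor) / total

        print(enantiomeric_excess(97.0, 3.0))  # 94.0 (% ee) for a 97:3 ratio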

  12. Protocol Analysis of Group Problem Solving in Mathematics: A Cognitive-Metacognitive Framework for Assessment.

    Science.gov (United States)

    Artzt, Alice F.; Armour-Thomas, Eleanor

    The roles of cognition and metacognition were examined in the mathematical problem-solving behaviors of students as they worked in small groups. As an outcome, a framework that links the literature of cognitive science and mathematical problem solving was developed for protocol analysis of mathematical problem solving. Within this framework, each…

  13. A Concise Protocol for the Validation of Language ENvironment Analysis (LENA) Conversational Turn Counts in Vietnamese

    Science.gov (United States)

    Ganek, Hillary V.; Eriks-Brophy, Alice

    2018-01-01

    The aim of this study was to present a protocol for the validation of the Language ENvironment Analysis (LENA) System's conversational turn count (CTC) for Vietnamese speakers. Ten families of children aged between 22 and 42 months, recruited near Ho Chi Minh City, participated in this project. Each child wore the LENA audio recorder for a full…

  14. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  15. Optimization of oligonucleotide arrays and RNA amplification protocols for analysis of transcript structure and alternative splicing.

    Science.gov (United States)

    Castle, John; Garrett-Engele, Phil; Armour, Christopher D; Duenwald, Sven J; Loerch, Patrick M; Meyer, Michael R; Schadt, Eric E; Stoughton, Roland; Parrish, Mark L; Shoemaker, Daniel D; Johnson, Jason M

    2003-01-01

    Microarrays offer a high-resolution means for monitoring pre-mRNA splicing on a genomic scale. We have developed a novel, unbiased amplification protocol that permits labeling of entire transcripts. Also, hybridization conditions, probe characteristics, and analysis algorithms were optimized for detection of exons, exon-intron edges, and exon junctions. These optimized protocols can be used to detect small variations and isoform mixtures, map the tissue specificity of known human alternative isoforms, and provide a robust, scalable platform for high-throughput discovery of alternative splicing.

  16. Allowing Students to Select Deliverables for Peer Review: Analysis of a Free-Selection Protocol

    DEFF Research Database (Denmark)

    Papadopoulos, Pantelis M.; Lagkas, Thomas; Demetriadis, Stavros

    2011-01-01

    This study analyzes the benefits and limitations of a “free-selection” peer assignment protocol by comparing them to the widely implemented “assigned-pair” protocol. The primary motivation was to circumvent the issues that often appear to the instructors implementing peer review activities with pre-assigned pairs. ... Free-Selection, where students were able to explore and select peer work for review. Result analysis showed a very strong tendency in favor of the Free-Selection students regarding both domain-specific (conceptual) and domain-general (reviewing) knowledge.

  17. Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis

    Science.gov (United States)

    Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej

    2016-01-01

    Microplastic pollution in the marine environment is a scientific topic that has received increasing attention over the last decade. The majority of scientific publications address microplastic pollution of the sea surface. The protocol below describes the methodology for sampling, sample preparation, separation and chemical identification of microplastic particles. A manta net fixed on an »A frame« attached to the side of the vessel was used for sampling. Microplastic particles caught in the cod end of the net were separated from samples by visual identification and use of stereomicroscopes. Particles were analyzed for their size using an image analysis program and for their chemical structure using ATR-FTIR and micro FTIR spectroscopy. The described protocol is in line with recommendations for microplastics monitoring published by the Marine Strategy Framework Directive (MSFD) Technical Subgroup on Marine Litter. This written protocol with video guide will support the work of researchers that deal with microplastics monitoring all over the world. PMID:28060297

  18. Security analysis of standards-driven communication protocols for healthcare scenarios.

    Science.gov (United States)

    Masi, Massimiliano; Pugliese, Rosario; Tiezzi, Francesco

    2012-12-01

    The importance of the Electronic Health Record (EHR), that stores all healthcare-related data belonging to a patient, has been recognised in recent years by governments, institutions and industry. Initiatives like the Integrating the Healthcare Enterprise (IHE) have been developed for the definition of standard methodologies for secure and interoperable EHR exchanges among clinics and hospitals. Using the requisites specified by these initiatives, many large scale projects have been set up for enabling healthcare professionals to handle patients' EHRs. The success of applications developed in these contexts crucially depends on ensuring such security properties as confidentiality, authentication, and authorization. In this paper, we first propose a communication protocol, based on the IHE specifications, for authenticating healthcare professionals and assuring patients' safety. By means of a formal analysis carried out by using the specification language COWS and the model checker CMC, we reveal a security flaw in the protocol thus demonstrating that to simply adopt the international standards does not guarantee the absence of such type of flaws. We then propose how to emend the IHE specifications and modify the protocol accordingly. Finally, we show how to tailor our protocol for application to more critical scenarios with no assumptions on the communication channels. To demonstrate feasibility and effectiveness of our protocols we have fully implemented them.

  19. Tabhu: tools for antibody humanization.

    KAUST Repository

    Olimpieri, Pier Paolo; Marcatili, Paolo; Tramontano, Anna

    2014-01-01

    for antibody humanization. Tabhu includes tools for human template selection, grafting, back-mutation evaluation, antibody modelling and structural analysis, helping the user in all the critical steps of the humanization experiment protocol. AVAILABILITY: http

  20. Extraction and Analysis of Information Related to Research & Development Declared Under an Additional Protocol

    International Nuclear Information System (INIS)

    Idinger, J.; Labella, R.; Rialhe, A.; Teller, N.

    2015-01-01

    The additional protocol (AP) provides important tools to strengthen and improve the effectiveness and efficiency of the safeguards system. Safeguards are designed to verify that States comply with their international commitments not to use nuclear material or to engage in nuclear-related activities for the purpose of developing nuclear weapons or other nuclear explosive devices. Under an AP based on INFCIRC/540, a State must provide to the IAEA additional information about, and inspector access to, all parts of its nuclear fuel cycle. In addition, the State has to supply information about its nuclear fuel cycle-related research and development (R&D) activities. The majority of States declare their R&D activities under the AP Articles 2.a.(i), 2.a.(x), and 2.b.(i) as part of initial declarations and their annual updates under the AP. In order to verify consistency and completeness of information provided under the AP by States, the Agency has started to analyze declared R&D information by identifying interrelationships between States in different R&D areas relevant to safeguards. The paper outlines the quality of R&D information provided by States to the Agency, describes how the extraction and analysis of relevant declarations are currently carried out at the Agency and specifies what kinds of difficulties arise during evaluation in respect to cross-linking international projects and finding gaps in reporting. In addition, the paper tries to elaborate how the reporting quality of AP information with reference to R&D activities and the assessment process of R&D information could be improved. (author)

  1. Interactive exploratory data analysis tool in Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Diana Furcila

    2015-04-01

    Thus, MorExAn provides us with the possibility to relate histopathological data to neuropsychological and clinical variables. This interactive visualization tool makes it possible to reach unexpected conclusions beyond the insight provided by simple statistical analysis, as well as to improve neuroscientists’ productivity.

  2. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and design...

  3. Assessment of Available Numerical Tools for Dynamic Mooring Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Eskilsson, Claes; Ferri, Francesco

    This report covers a preliminary assessment of available numerical tools to be used in the upcoming full dynamic analysis of the mooring systems assessed in the project _Mooring Solutions for Large Wave Energy Converters_. The assessments tend to cover potential candidate software and subsequently c...

  4. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  5. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  6. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  7. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  8. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

    The paper presents models which may be applied as tools of analysis of a network organisation. The starting point of the discussion is defining the following terms: supply chain and network organisation. Further parts of the paper present the basic assumptions underlying the analysis of a network organisation. Then the study characterises the best known models utilised in the analysis of a network organisation. The purpose of the article is to define the notion and the essence of network organizations and to present the models used for their analysis.

  9. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic indexing... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engine results.

  10. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    International Nuclear Information System (INIS)

    Clough, D; Fletcher, S; Longstaff, A P; Willoughby, P

    2012-01-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of the thermal errors. In particular, there is a requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various testing methods, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
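
    One common way to realise such a temperature-to-error link is an empirical least-squares model that predicts the hard-to-measure thermal displacement from easily measured structure temperatures. The sketch below illustrates the idea on synthetic data; the sensor count, coefficients and noise level are hypothetical, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        T = rng.uniform(20.0, 45.0, size=(200, 4))   # 4 temperature sensors, 200 samples
        true_w = np.array([1.8, 0.6, -0.4, 1.1])     # hypothetical um of drift per deg C
        z = T @ true_w + rng.normal(0.0, 0.5, 200)   # "measured" displacement (um)

        A = np.column_stack([T, np.ones(len(T))])    # design matrix with intercept
        coef, *_ = np.linalg.lstsq(A, z, rcond=None) # fit displacement ~ temperatures
        residual = z - A @ coef
        print("coefficients:", np.round(coef, 2))
        print("RMS residual (um): %.2f" % np.sqrt(np.mean(residual ** 2)))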

  11. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool’s interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  12. Porcupine: A visual pipeline tool for neuroimaging analysis.

    Directory of Open Access Journals (Sweden)

    Tim van Mourik

    2018-05-01

    The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one's analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one's analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0.

  13. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  14. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  15. Campaign effects and self-analysis Internet tool

    Energy Technology Data Exchange (ETDEWEB)

    Brange, Birgitte [Danish Electricity Saving Trust (Denmark); Fjordbak Larsen, Troels [IT Energy ApS (Denmark); Wilke, Goeran [Danish Electricity Saving Trust (Denmark)

    2007-07-01

    In October 2006, the Danish Electricity Saving Trust launched a large TV campaign targeting domestic electricity consumption. The campaign was based on the central message '1000 kWh/year per person is enough'. The campaign was accompanied by a new internet portal with updated information about numerous household appliances, and by analysis tools for bringing down electricity consumption to 1000 kWh/year per person. The effects of the campaign are monitored through repeated surveys and analysed in relation to usage of internet tools.

  16. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
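
    The performance bounds such graph analysis determines can be illustrated with a small sketch: for a dataflow DAG executed repetitively on P identical processors, the iteration period is bounded below by both the critical path length and the total work divided by P. The task graph and execution times below are hypothetical, not taken from the paper.

        import functools

        tasks = {"read": 2, "fft": 5, "ctrl": 3, "out": 1}           # execution times
        succ = {"read": ["fft"], "fft": ["ctrl"], "ctrl": ["out"], "out": []}

        @functools.lru_cache(maxsize=None)
        def longest_path(node):
            """Length of the longest path starting at node (critical path)."""
            nxt = succ[node]
            return tasks[node] + (max(longest_path(s) for s in nxt) if nxt else 0)

        def period_lower_bound(P):
            critical_path = max(longest_path(n) for n in tasks)      # precedence bound
            work_bound = sum(tasks.values()) / P                     # resource bound
            return max(critical_path, work_bound)

        print(period_lower_bound(P=2))   # 11 here: the critical path dominates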

  17. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited paper and two short tool descriptions were carefully reviewed and selected from a total of 107 submissions. The papers are organized in topical sections on software and formal methods, formal methods, timed and hybrid systems, infinite and parameterized systems, diagnostic and test generation, efficient...

  18. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.

  19. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    Science.gov (United States)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  20. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    Directory of Open Access Journals (Sweden)

    McNally James

    2009-01-01

    Full Text Available Abstract Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way, and has developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data and to facilitate data-sharing. Software which enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium- and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and the flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system, which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community.

  1. Correlation dimension based nonlinear analysis of network traffics with different application protocols

    International Nuclear Information System (INIS)

    Wang Jun-Song; Yuan Jing; Li Qiang; Yuan Rui-Xi

    2011-01-01

    This paper uses a correlation dimension based nonlinear analysis approach to analyse the dynamics of network traffic under three different application protocols: HTTP, FTP and SMTP. First, the phase space is reconstructed and the embedding parameters are obtained by the mutual information method. Secondly, the correlation dimensions of the three traffic types are calculated, and the analysis demonstrates that the dynamics of the three application protocol traffics differ in nature: HTTP and FTP traffic are chaotic, with the former more complex than the latter, whereas SMTP traffic is stochastic. The correlation dimension approach is thus shown to be an efficient method for understanding and characterizing the nonlinear dynamics of HTTP, FTP and SMTP network traffic. This analysis provides insight into, and a more accurate understanding of, the nonlinear dynamics of Internet traffic, which is a complex mixture of chaotic and stochastic components. (general)
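
    A minimal sketch of the correlation-dimension estimate (in the Grassberger-Procaccia style) applied to a delay-embedded series. The embedding dimension, delay and the logistic-map stand-in series are assumptions for illustration; they are not the traffic traces or the mutual-information-derived parameters used in the paper.

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_dimension(x, m=3, tau=1):
    """Delay-embed the series in m dimensions, compute the correlation sum
    C(r), and fit the slope of log C(r) versus log r over mid-range radii."""
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(m)])
    d = pdist(emb)                                    # all pairwise distances
    radii = np.logspace(np.log10(np.percentile(d, 1)),
                        np.log10(np.percentile(d, 50)), 12)
    c = np.array([np.mean(d < r) for r in radii])     # correlation sum C(r)
    return np.polyfit(np.log(radii), np.log(c), 1)[0]

x = np.empty(1200)
x[0] = 0.31
for k in range(1199):
    x[k + 1] = 4.0 * x[k] * (1.0 - x[k])   # chaotic logistic map as a stand-in
print(correlation_dimension(x))            # a value near 1 is expected here
```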

  2. Analysis tools for discovering strong parity violation at hadron colliders

    International Nuclear Information System (INIS)

    Backovic, Mihailo; Ralston, John P.

    2011-01-01

    Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or 'azimuthal flow'. Analysis uses the representations of the orthogonal group O(2) and dihedral groups D_N necessary to define parity completely in two dimensions. Classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single 'reaction plane'. Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of 'event-shape sorting' to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust, and can be applied equally to low or high multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.
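
    The event-by-event collective angles discussed above can be illustrated with a toy Q-vector calculation. Splitting the event into two subevents keeps the sine (parity-odd-sensitive) term from vanishing identically by construction. This is a generic flow-analysis sketch on invented data, not the authors' O(2)/D_N classification.

```python
import numpy as np

def flow_observables(phi, n=2):
    """Estimate the collective angle Psi_n from one half of the particles
    and evaluate cosine (parity-even) and sine (parity-odd-sensitive)
    moments on the other half, avoiding trivial autocorrelation."""
    half = len(phi) // 2
    a, b = phi[:half], phi[half:]
    psi = np.arctan2(np.sin(n * a).sum(), np.cos(n * a).sum()) / n
    vn = np.mean(np.cos(n * (b - psi)))    # flow coefficient, P-even
    sn = np.mean(np.sin(n * (b - psi)))    # P-odd-sensitive counterpart
    return psi, vn, sn

rng = np.random.default_rng(3)
phi = rng.uniform(-np.pi, np.pi, 500)      # one toy isotropic event
print(flow_observables(phi))
```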

  3. Anaphe - OO Libraries and Tools for Data Analysis

    CERN Document Server

    Couet, O; Molnar, Z; Moscicki, J T; Pfeiffer, A; Sang, M

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  4. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    Full Text Available System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated against the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is required. This paper proposes the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT, by presenting the test results.

  5. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft, so it is very important to analyze and improve the reliability performance of the SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. Based on the functional division of the distributed network, a task-based reliability analysis method is proposed: the reliability analysis of every task leads to the system reliability matrix, and the reliability of the network system is deduced by integrating all the reliability indexes in the matrix. With this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and the multi-path-task reliability are also implemented. Using this tool, we analyzed several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. Finally, this reliability analysis tool will have a direct influence on both task division and topology selection in the design phase of a SpaceWire network system.
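
    The redundancy advantage reported in the case studies can be reproduced with elementary series/parallel reliability algebra. The component reliabilities and the task path below are hypothetical, not figures from the paper.

```python
def series(*r):
    """All units along the task path must work."""
    p = 1.0
    for x in r:
        p *= x
    return p

def parallel(*r):
    """Hot redundancy: the group works if any one unit works."""
    q = 1.0
    for x in r:
        q *= (1.0 - x)
    return 1.0 - q

# hypothetical task path: source node -> link -> router -> link -> sink node
node, router, link = 0.999, 0.995, 0.998
basic = series(node, link, router, link, node)
dual = series(node, link, parallel(router, router), link, node)
print(f"basic={basic:.5f}  dual-router={dual:.5f}")  # redundancy wins
```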

  6. Sensitivity Analysis of Per-Protocol Time-to-Event Treatment Efficacy in Randomized Clinical Trials

    Science.gov (United States)

    Gilbert, Peter B.; Shepherd, Bryan E.; Hudgens, Michael G.

    2013-01-01

    Summary Assessing per-protocol treatment efficacy on a time-to-event endpoint is a common objective of randomized clinical trials. The typical analysis uses the same method employed for the intention-to-treat analysis (e.g., standard survival analysis) applied to the subgroup meeting protocol adherence criteria. However, due to potential post-randomization selection bias, this analysis may be misleading about treatment efficacy. Moreover, while there is extensive literature on methods for assessing causal treatment effects in compliers, these methods do not apply to a common class of trials where a) the primary objective compares survival curves, b) it is inconceivable to assign participants to be adherent and event-free before adherence is measured, and c) the exclusion restriction assumption fails to hold. HIV vaccine efficacy trials, including the recent RV144 trial, exemplify this class, because many primary endpoints (e.g., HIV infections) occur before adherence is measured, and nonadherent subjects who receive some of the planned immunizations may be partially protected. Therefore, we develop methods for assessing per-protocol treatment efficacy for this problem class, considering three causal estimands of interest. Because these estimands are not identifiable from the observable data, we develop nonparametric bounds and semiparametric sensitivity analysis methods that yield estimated ignorance and uncertainty intervals. The methods are applied to RV144. PMID:24187408

  7. Status of CONRAD, a nuclear reaction analysis tool

    International Nuclear Information System (INIS)

    Saint Jean, C. de; Habert, B.; Litaize, O.; Noguere, G.; Suteau, C.

    2008-01-01

    The development of a software tool (CONRAD) was initiated at CEA/Cadarache to give answers to various problems arising in the data analysis of nuclear reactions. This tool is then characterized by the handling of uncertainties from experimental values to covariance matrices for multi-group cross sections. An object oriented design was chosen allowing an easy interface with graphical tool for input/output data and being a natural framework for innovative nuclear models (Fission). The major achieved developments are a data model for describing channels, nuclear reactions, nuclear models and processes with interface to classical data formats, theoretical calculations for the resolved resonance range (Reich-Moore) and unresolved resonance range (Hauser-Feshbach, Gilbert-Cameron,...) with nuclear model parameters adjustment on experimental data sets and a Monte Carlo method based on conditional probabilities developed to calculate properly covariance matrices. The on-going developments deal with the experimental data description (covariance matrices) and the graphical user interface. (authors)
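
    A minimal sketch of the Monte Carlo route to covariance matrices that CONRAD implements in a far more sophisticated form: sample nuclear-model parameters, propagate them through a cross-section model (here a toy Lorentzian resonance shape, not Reich-Moore), and take the sample covariance of the multi-group outputs.

```python
import numpy as np

def mc_covariance(model, mean, cov_params, n=5000, seed=0):
    """Propagate parameter uncertainties to an observable vector by Monte
    Carlo sampling; returns the output mean and covariance matrix."""
    rng = np.random.default_rng(seed)
    theta = rng.multivariate_normal(mean, cov_params, size=n)
    y = np.array([model(t) for t in theta])
    return y.mean(axis=0), np.cov(y, rowvar=False)

E = np.linspace(0.5, 2.0, 8)                  # 8 energy groups (arbitrary)
def xs(p):                                    # toy resonance: p = (E0, Gamma)
    return 1.0 / ((E - p[0]) ** 2 + (p[1] / 2.0) ** 2)

mu, C = mc_covariance(xs, mean=[1.2, 0.3], cov_params=np.diag([0.01, 0.001]))
corr = C / np.sqrt(np.outer(np.diag(C), np.diag(C)))   # group correlations
print(np.round(corr, 2))
```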

  8. Development of a HS-SPME-GC/MS protocol assisted by chemometric tools to study herbivore-induced volatiles in Myrcia splendens.

    Science.gov (United States)

    Souza Silva, Érica A; Saboia, Giovanni; Jorge, Nina C; Hoffmann, Camila; Dos Santos Isaias, Rosy Mary; Soares, Geraldo L G; Zini, Claudia A

    2017-12-01

    A headspace solid phase microextraction (HS-SPME) method combined with gas chromatography-mass spectrometry (GC/MS) was developed and optimized for extraction and analysis of volatile organic compounds (VOC) of leaves and galls of Myrcia splendens. Through a process of optimization of the main factors affecting HS-SPME efficiency, the coating divinylbenzene-carboxen-polydimethylsiloxane (DVB/Car/PDMS) was chosen as the optimum extraction phase, not only in terms of extraction efficiency, but also for its broader analyte coverage. The optimum extraction temperature was 30°C, while an extraction time of 15 min provided the best compromise between extraction efficiencies of lower and higher molecular weight compounds. The optimized protocol was demonstrated to be capable of sampling plant material with high reproducibility, considering that most classes of analytes met the 20% RSD FDA criterion. The optimized method was employed for the analysis of three classes of M. splendens samples, generating a final list of 65 tentatively identified VOC, including alcohols, aldehydes, esters, ketones, phenol derivatives, as well as mono- and sesquiterpenes. Significant differences were evident amongst the volatile profiles obtained from non-galled leaves (NGL) and leaf-folding galls (LFG) of M. splendens. Several differences pertaining to amounts of alcohols and aldehydes were detected between samples, particularly regarding quantities of green leaf volatiles (GLV). Alcohols represented about 14% of compounds detected in gall samples, whereas in non-galled samples, alcohol content was below 5%. Phenolic derived compounds were virtually absent in reference samples, while in non-galled leaves and galls their content ranged around 0.2% and 0.4%, respectively. Likewise, methyl salicylate, a well-known signal of plant distress, accounted for 1.2% of the sample content of galls, whereas it was only present in trace levels in reference samples. Chemometric analysis based on Heatmap associated

  9. Auditory brainstem response as a diagnostic tool for patients suffering from schizophrenia, attention deficit hyperactivity disorder, and bipolar disorder: protocol.

    Science.gov (United States)

    Wahlström, Viktor; Åhlander, Fredrik; Wynn, Rolf

    2015-02-12

    Psychiatric disorders, such as schizophrenia, attention deficit hyperactivity disorder (ADHD), and bipolar disorder, may sometimes be difficult to diagnose. There is a great need for a valid and reliable diagnostic tool to aid clinicians in arriving at the diagnoses in a timely and accurate manner. Prior studies have suggested that patients suffering from schizophrenia and ADHD may process certain sound stimuli in the brainstem in an unusual manner. When these patient groups have been examined with the electrophysiological method of brainstem audiometry, some studies have found illness-specific aberrations. Such aberrations may also exist for patients suffering from bipolar disorder. In this study, we will examine whether the method of brainstem audiometry can be used as a diagnostic tool for patients suffering from schizophrenia, ADHD, and bipolar disorder. The method includes three steps: (1) auditory stimulation with specific sound stimuli, (2) simultaneous measurement of brainstem activity, and (3) automated interpretation of the resulting brain stem audiograms with data-based signal analysis. We will compare three groups of 12 individuals with confirmed diagnoses of schizophrenia, ADHD, or bipolar disorder with 12 healthy subjects under blinded conditions for a total of 48 participants. The extent to which the method can be used to reach the correct diagnosis will be investigated. The project is now in a recruiting phase. When all patients and controls have been recruited and the measurements have been performed, the data will be analyzed according to a previously arranged algorithm. We expect the recruiting phase and measurements to be completed in early 2015, the analyses to be performed in mid-2015, and the results of the study to be published in early 2016. If the results support previous findings, this will lend strength to the idea that brainstem audiometry can offer objective diagnostic support for patients suffering from schizophrenia, ADHD, and

  10. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match the pre-defined requirements. Further development of computers in recent years has opened up the implementation of new features in the existing tools and also the development of new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component lifetime, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts, generated by the heat and mass balance programs, can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increased number of tools and their various construction and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems have been briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author
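
    The part-load calculation via local linearization mentioned at the end can be sketched as a finite-difference Jacobian around a design point, giving a first-order model f(x) ≈ f(x0) + J(x - x0). The toy plant map and its coefficients are invented for illustration.

```python
import numpy as np

def linearize(f, x0, eps=1e-6):
    """Finite-difference Jacobian of a plant map f around operating point x0."""
    x0 = np.asarray(x0, dtype=float)
    f0 = np.atleast_1d(np.asarray(f(x0), dtype=float))
    J = np.empty((f0.size, x0.size))
    for j in range(x0.size):
        dx = np.zeros_like(x0)
        dx[j] = eps
        J[:, j] = (np.atleast_1d(f(x0 + dx)) - f0) / eps
    return f0, J

# toy plant map: (power MW, exhaust temp C) vs (fuel flow, ambient temp C)
def plant(x):
    fuel, t_amb = x
    return [50.0 * fuel - 0.1 * t_amb, 400.0 + 3.0 * fuel + 0.8 * t_amb]

f0, J = linearize(plant, [1.0, 15.0])          # design point
delta = np.array([0.9, 20.0]) - np.array([1.0, 15.0])
print(f0 + J @ delta)                          # fast part-load estimate
```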

  11. Microscopy image segmentation tool: Robust image data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Valmianski, Ilya, E-mail: ivalmian@ucsd.edu; Monton, Carlos; Schuller, Ivan K. [Department of Physics and Center for Advanced Nanoscience, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093 (United States)

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.
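
    MIST's segmentation algorithm itself is not spelled out in the abstract, so the following is only a generic stand-in for the kind of ROI pipeline described: threshold, remove small objects, label connected components, and report per-ROI statistics (using scikit-image).

```python
import numpy as np
from skimage import filters, measure, morphology

def segment_rois(image, min_area=20):
    """Otsu threshold -> small-object removal -> connected-component
    labelling -> per-ROI statistics."""
    binary = image > filters.threshold_otsu(image)
    binary = morphology.remove_small_objects(binary, min_size=min_area)
    labels = measure.label(binary)
    return [(r.label, r.area, r.centroid, r.eccentricity)
            for r in measure.regionprops(labels)]

rng = np.random.default_rng(0)                 # synthetic SEM-like frame
img = rng.normal(0.2, 0.05, (128, 128))
img[30:40, 50:60] += 0.6                       # two bright "pores"
img[80:95, 20:30] += 0.6
for roi in segment_rois(img):
    print(roi)
```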

  12. Microscopy image segmentation tool: Robust image data analysis

    Science.gov (United States)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  13. Microscopy image segmentation tool: Robust image data analysis

    International Nuclear Information System (INIS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-01-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy

  14. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    Science.gov (United States)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lightly damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
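
    The lock-in condition behind this concern follows from the Strouhal relation f = St·U/D: dynamic loads peak when the shedding frequency meets a structural mode, so the critical wind speed is U = f_n·D/St. The sketch below uses the commonly quoted St ≈ 0.2 for circular cylinders and an invented vehicle geometry.

```python
def critical_wind_speed(f_n_hz, diameter_m, strouhal=0.2):
    """Wind speed at which the vortex-shedding frequency (f = St * U / D)
    matches a structural natural frequency, i.e. the lock-in condition."""
    return f_n_hz * diameter_m / strouhal

# hypothetical vehicle: 0.45 Hz first bending mode, 3.7 m core diameter
print(f"lock-in near {critical_wind_speed(0.45, 3.7):.1f} m/s")
```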

  15. Spaceborne Differential SAR Interferometry: Data Analysis Tools for Deformation Measurement

    Directory of Open Access Journals (Sweden)

    Michele Crosetto

    2011-02-01

    Full Text Available This paper is focused on spaceborne Differential Interferometric SAR (DInSAR) for land deformation measurement and monitoring. In the last two decades several DInSAR data analysis procedures have been proposed. The objective of this paper is to describe the DInSAR data processing and analysis tools developed at the Institute of Geomatics in almost ten years of research activities. Four main DInSAR analysis procedures are described, which range from the standard DInSAR analysis based on a single interferogram to more advanced Persistent Scatterer Interferometry (PSI) approaches. These different procedures guarantee sufficient flexibility in DInSAR data processing. In order to provide a technical insight into these analysis procedures, a whole section discusses their main data processing and analysis steps, especially those needed in PSI analyses. A specific section is devoted to the core of our PSI analysis tools: the so-called 2+1D phase unwrapping procedure, which couples a 2D phase unwrapping, performed interferogram-wise, with a kind of 1D phase unwrapping along time, performed pixel-wise. In the last part of the paper, some examples of DInSAR results are discussed, which were derived by standard DInSAR or PSI analyses. Most of these results were derived from X-band SAR data coming from the TerraSAR-X and COSMO-SkyMed sensors.
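
    The "1D along time" half of the 2+1D scheme can be sketched pixel-wise with NumPy's unwrap, which is valid whenever the phase changes by less than pi between consecutive acquisitions. The deformation rates and noise level below are synthetic.

```python
import numpy as np

# wrapped interferometric phase for 4 pixels over 30 acquisitions (synthetic)
rng = np.random.default_rng(0)
t = np.arange(30)
rates = np.array([0.0, 0.3, 0.6, 0.9])[:, None]       # rad per acquisition
true_phase = rates * t + 0.1 * rng.standard_normal((4, 30))
wrapped = np.angle(np.exp(1j * true_phase))           # wrap into (-pi, pi]

# pixel-wise temporal unwrapping: exact while increments stay below pi
unwrapped = np.unwrap(wrapped, axis=1)
print(np.abs(unwrapped - true_phase).max())           # tiny residual
```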

  16. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
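
    A toy version of the profile analysis behind such tools: Hack's stream-length gradient index, SL = (dH/dL)·L with L the distance from the source, flags knickpoints as local SL anomalies. The synthetic profile below is illustrative only and is not Knickpoint Finder's actual algorithm.

```python
import numpy as np

def sl_index(dist_m, elev_m):
    """Stream-length gradient index along a longitudinal river profile."""
    slope = -np.gradient(elev_m, dist_m)   # positive slope downstream
    return slope * dist_m

# synthetic profile: smooth concave river with one 15 m step (a knickpoint)
d = np.linspace(1.0, 10_000.0, 400)
z = 800.0 * np.exp(-d / 4000.0)
z[d > 6000.0] -= 15.0
sl = sl_index(d, z)
print(f"strongest anomaly near {d[np.argmax(sl)]:.0f} m downstream")
```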

  17. Rapid analysis of vessel elements (RAVE): a tool for studying physiologic, pathologic and tumor angiogenesis.

    Directory of Open Access Journals (Sweden)

    Marc E Seaman

    Full Text Available Quantification of microvascular network structure is important in a myriad of emerging research fields including microvessel remodeling in response to ischemia and drug therapy, tumor angiogenesis, and retinopathy. To mitigate analyst-specific variation in measurements and to ensure that measurements represent actual changes in vessel network structure and morphology, a reliable and automatic tool for quantifying microvascular network architecture is needed. Moreover, an analysis tool capable of acquiring and processing large data sets will facilitate advanced computational analysis and simulation of microvascular growth and remodeling processes and enable more high throughput discovery. To this end, we have produced an automatic and rapid vessel detection and quantification system using a MATLAB graphical user interface (GUI) that vastly reduces time spent on analysis and greatly increases repeatability. Analysis yields numerical measures of vessel volume fraction, vessel length density, fractal dimension (a measure of tortuosity), and radii of murine vascular networks. Because our GUI is open-sourced to all, it can be easily modified to measure parameters such as percent coverage of non-endothelial cells, number of loops in a vascular bed, amount of perfusion and two-dimensional branch angle. Importantly, the GUI is compatible with standard fluorescent staining and imaging protocols, but also has utility analyzing brightfield vascular images, obtained, for example, in dorsal skinfold chambers. Manual measurement of a single image typically takes 20 minutes to 1 hour. In stark comparison, using our GUI, image analysis time is reduced to around 1 minute. This drastic reduction in analysis time coupled with increased repeatability makes this tool valuable for all vessel research, especially studies requiring rapid and reproducible results, such as anti-angiogenic drug screening.
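
    Two of the reported measures can be approximated from a binary vessel mask with a skeleton-based calculation. This is a generic sketch, not RAVE's implementation, and the ribbon-vessel radius formula (radius = area / (2 x length)) is a deliberately crude assumption.

```python
import numpy as np
from skimage.morphology import skeletonize

def vessel_metrics(mask, pixel_um=1.0):
    """Area (volume) fraction, vessel length density (skeleton length per
    unit image area), and a crude mean radius from a binary vessel mask."""
    skel = skeletonize(mask)
    area_fraction = mask.mean()
    length_um = skel.sum() * pixel_um              # centreline length
    field_um2 = mask.size * pixel_um ** 2
    mean_radius = (mask.sum() * pixel_um ** 2) / (2.0 * length_um + 1e-9)
    return area_fraction, length_um / field_um2, mean_radius

mask = np.zeros((200, 200), dtype=bool)
mask[98:103, :] = True                             # one straight 5-px "vessel"
print(vessel_metrics(mask, pixel_um=2.0))
```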

  18. Voice over Internet Protocol (VoIP) Technology as a Global Learning Tool: Information Systems Success and Control Belief Perspectives

    Science.gov (United States)

    Chen, Charlie C.; Vannoy, Sandra

    2013-01-01

    Voice over Internet Protocol (VoIP)-enabled online learning service providers struggle with high attrition rates and low customer loyalty despite VoIP's high degree of system fit for online global learning applications. Effective solutions to this prevalent problem rely on the understanding of system quality, information quality, and…

  19. Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.

    Science.gov (United States)

    The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...

  20. Applying AI tools to operational space environmental analysis

    Science.gov (United States)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  1. A Tailored Web-based Advice Tool for Skiers and Snowboarders: Protocol for a Randomized Controlled Trial.

    Science.gov (United States)

    Kemler, Ellen; Gouttebarge, Vincent

    2018-01-17

    principle (as initially assigned). The project was funded in 2016 and enrolment was completed in 2017. Data analysis is currently under way and the first results are expected to be submitted for publication in 2018. To combat the negative side effects of sports participation, the use of injury preventive measures is desirable. As the use of injury prevention is usually not compulsory in skiing and snowboarding, a behavioral change is necessary to increase the use of effective injury preventive measures in winter sports. Dutch Trial Registry NTR6233; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=6233 (Archived by WebCite at  http://www.webcitation.org/6wXZPzjUi). ©Ellen Kemler, Vincent Gouttebarge. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 17.01.2018.

  2. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    Science.gov (United States)

    Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  3. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    International Nuclear Information System (INIS)

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; Garcia-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; Da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document

  4. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  5. Modelling and analysis of distributed simulation protocols with distributed graph transformation

    OpenAIRE

    Lara, Juan de; Taentzer, Gabriele

    2005-01-01

    J. de Lara, and G. Taentzer, "Modelling and analysis of distributed simulation protocols with distributed graph transformation...

  6. Analysis of functionality free CASE-tools databases design

    Directory of Open Access Journals (Sweden)

    A. V. Gavrilov

    2016-01-01

    Full Text Available The introduction of database-design CASE technologies into the educational process requires significant costs from the institution for the purchase of software. A possible solution could be the use of free software analogues. At the same time, this kind of substitution should be based on an even-handed representation of the functional characteristics and operational features of these programs. The purpose of the article is a review of free and non-profit CASE-tools for database design, as well as their classification on the basis of an analysis of functionality. When writing this article, materials from the official websites of the tool developers were used. Evaluation of the functional characteristics of CASE-tools for database design was made exclusively empirically, through direct work with the software products. The analysis of the tools' functionality allows two categories of CASE-tools for database design to be distinguished. The first category includes systems with a basic set of features and tools. The most important basic functions of these systems are: management of connections to database servers, visual tools to create and modify database objects (tables, views, triggers, procedures), the ability to enter and edit data in table mode, user and privilege management tools, an SQL-code editor, and means of export/import of data. CASE-systems of the first category can be used to design and develop simple databases and to manage data, as well as serving as database server administration tools. A distinctive feature of the second category of CASE-tools for database design (full-featured systems) is the presence of a visual designer, allowing the construction of the database model and the automatic creation of the database on the server based on this model. CASE-systems of this category can be used for the design and development of databases of any structural complexity, as well as for database server administration. The article concluded that the

  7. Analysis and Prediction of Micromilling Stability with Variable Tool Geometry

    Directory of Open Access Journals (Sweden)

    Ziyang Cao

    2014-11-01

    Full Text Available Micromilling can fabricate miniaturized components using a micro-end mill at high rotational speeds. The analysis of machining stability in micromilling plays an important role in characterizing the cutting process, estimating the tool life, and optimizing the process. A numerical analysis and an experimental method are presented to investigate chatter stability in the micro-end milling process with variable milling tool geometry. The schematic model of the micromilling process is constructed and the calculation formula to predict cutting force and displacements is derived. This is followed by a detailed numerical analysis of micromilling forces for helical ball and square end mills using time-domain and frequency-domain methods, and the results are compared. Furthermore, a detailed time-domain simulation of micro end milling with straight-tooth and helical-tooth end mills is conducted based on the machine-tool system frequency response function obtained through a modal experiment. The forces and displacements are predicted, and the simulation results for the different cutter geometries are compared in depth. The simulation results have important significance for the actual milling process.
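
    A minimal single-degree-of-freedom regenerative-chatter simulation in the spirit of the time-domain analysis above (the paper's model covers variable tool geometry and measured FRFs; every parameter here is invented): the chip thickness feeds back through the surface left one tooth period earlier, and a growing late-time amplitude marks an unstable depth of cut.

```python
import numpy as np

def chatter_sim(a_p, n_rev=60, steps_per_rev=250, rpm=24000,
                m=0.02, zeta=0.02, fn=3500.0, Kt=6e8, h0=20e-6):
    """Semi-implicit Euler integration of m*x'' + c*x' + k*x = Kt*a_p*h(t),
    with regenerative chip thickness h = max(h0 + x(t-T) - x(t), 0); the
    clipping models the tool jumping out of the cut."""
    wn = 2.0 * np.pi * fn
    k, c = m * wn ** 2, 2.0 * zeta * m * wn
    dt = (60.0 / rpm) / steps_per_rev          # single-tooth tool: T = 1 rev
    n = n_rev * steps_per_rev
    x = np.zeros(n)
    v = 0.0
    for i in range(1, n):
        x_delay = x[i - steps_per_rev] if i >= steps_per_rev else 0.0
        h = max(h0 + x_delay - x[i - 1], 0.0)
        a = (Kt * a_p * h - c * v - k * x[i - 1]) / m
        v += a * dt
        x[i] = x[i - 1] + v * dt
    return np.abs(x[-10 * steps_per_rev:]).max()   # late-time amplitude

for a_p in (0.2e-3, 2.0e-3):                       # axial depths of cut [m]
    print(f"a_p={a_p:.1e} m -> amplitude {chatter_sim(a_p):.2e} m")
```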

  8. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
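
    The essence of the reduced-form methodology can be sketched in a few lines: sample inputs, run the complex model (a stand-in function here) to produce synthetic data, and fit one regression that a decision maker can evaluate instantly. The functional forms and coefficients below are invented, not E-CAT's.

```python
import numpy as np

rng = np.random.default_rng(42)

def cge_stand_in(duration, severity, resilience):
    """Opaque 'expensive model' stand-in: loss vs threat characteristics
    (duration, severity) and a background condition (resilience)."""
    return (1200.0 * severity * duration ** 0.8) * (1.0 - 0.5 * resilience) \
           + 50.0 * rng.standard_normal()

# 1) many simulation runs over sampled inputs -> synthetic data
X = rng.uniform([1.0, 0.1, 0.0], [30.0, 1.0, 0.9], size=(500, 3))
y = np.array([cge_stand_in(*row) for row in X])

# 2) one reduced-form equation, log-linear in the drivers
A = np.column_stack([np.ones(len(X)), np.log(X[:, 0]), np.log(X[:, 1]), X[:, 2]])
coef, *_ = np.linalg.lstsq(A, np.log(np.maximum(y, 1e-9)), rcond=None)

# 3) the cheap surrogate, evaluable in microseconds
predict = lambda d, s, r: float(np.exp(coef @ [1.0, np.log(d), np.log(s), r]))
print(predict(10.0, 0.5, 0.3))
```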

  9. An improved method of studying user-system interaction by combining transaction log analysis and protocol analysis

    Directory of Open Access Journals (Sweden)

    Jillian R. Griffiths

    2002-01-01

    Full Text Available The paper reports a novel approach to studying user-system interaction that captures a complete record of the searcher's actions, the system responses and synchronised talk-aloud comments from the searcher. The data is recorded unobtrusively and is available for later analysis. The approach is set in context by a discussion of transaction logging and protocol analysis, and examples of the search logging in operation are presented.

  10. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify
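
    A miniature of the temporal processing described: merge NDVI stacks from two sensors (cloudy pixels already masked to NaN) and take per-pixel maximum-value composites over a window, exploiting the fact that residual cloud contamination biases vegetation indices low. All array shapes and values are synthetic.

```python
import numpy as np

def composite_ndvi(stacks, window=16):
    """Concatenate multi-sensor NDVI stacks along time and return one
    per-pixel maximum-value composite per compositing window."""
    merged = np.concatenate(stacks, axis=0)          # (days, rows, cols)
    n_win = merged.shape[0] // window
    out = np.full((n_win,) + merged.shape[1:], np.nan)
    for w in range(n_win):
        out[w] = np.nanmax(merged[w * window:(w + 1) * window], axis=0)
    return out

rng = np.random.default_rng(1)
terra = rng.uniform(0.2, 0.8, (32, 50, 50))          # synthetic daily NDVI
aqua = np.clip(terra + 0.02, 0.0, 1.0)
terra[terra < 0.3] = np.nan                          # pretend cloud mask
print(composite_ndvi([terra, aqua]).shape)           # -> (4, 50, 50)
```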

  11. Ontology-based configuration of problem-solving methods and generation of knowledge-acquisition tools: application of PROTEGE-II to protocol-based decision support.

    Science.gov (United States)

    Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A

    1995-06-01

    PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.

  12. ISAC: A tool for aeroservoelastic modeling and analysis

    Science.gov (United States)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  13. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses the Federal metering data analysis needs and some existing tools.

  14. Using the Contextual Hub Analysis Tool (CHAT) in Cytoscape to Identify Contextually Relevant Network Hubs.

    Science.gov (United States)

    Muetze, Tanja; Lynn, David J

    2017-09-13

    Highly connected nodes in biological networks are called network hubs. Hubs are topologically important to the structure of the network and have been shown to be preferentially associated with a range of phenotypes of interest. The relative importance of a hub node, however, can change depending on the biological context. Here, we provide a step-by-step protocol for using the Contextual Hub Analysis Tool (CHAT), an application within Cytoscape 3, which enables users to easily construct and visualize a network of interactions from a gene or protein list of interest, integrate contextual information, such as gene or protein expression data, and identify hub nodes that are more highly connected to contextual nodes than expected by chance. © 2017 by John Wiley & Sons, Inc.
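
    A sketch of the kind of test described (CHAT's exact statistic may differ): for each node, a hypergeometric test asks whether its neighbourhood contains more contextual nodes, for example differentially expressed genes, than expected by chance. The toy graph and contextual set are invented.

```python
import networkx as nx
from scipy.stats import hypergeom

def contextual_hubs(g, contextual, alpha=0.05):
    """Nodes whose neighbourhoods are significantly enriched for
    'contextual' nodes, by a one-sided hypergeometric test."""
    n_pop = g.number_of_nodes()
    n_ctx = len(contextual)
    hits = []
    for node in g:
        nbrs = set(g[node])
        k = len(nbrs & contextual)
        p = hypergeom.sf(k - 1, n_pop, n_ctx, len(nbrs))  # P(X >= k)
        if p < alpha:
            hits.append((node, k, len(nbrs), p))
    return sorted(hits, key=lambda t: t[-1])

g = nx.barabasi_albert_graph(200, 3, seed=0)   # toy scale-free interactome
ctx = set(range(40))                           # hypothetical DE-gene nodes
print(contextual_hubs(g, ctx)[:5])
```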

  15. [Perinatal bioethics: euthanasia or end-of-life decisions? Analysis of the Groningen Protocol].

    Science.gov (United States)

    Halac, Jacobo; Halac, Eduardo; Moya, Martín P; Olmas, José M; Dopazo, Silvina L; Dolagaray, Nora

    2009-12-01

    The so-called "Groningen Protocol" was conceived as a framework to discuss euthanasia in neonates. Originally, it presents three groups of babies who might be candidates for this option. We analyzed the protocol in its original context and that of the Dutch society in which it was created. The analysis started with a careful reading of the protocol in both its English and Dutch versions, translated later into Spanish. The medical and nursing staff participated in discussing it, and a final consensus was reached. The Institutional Ethics Committee at our hospital discussed it freely and made recommendations for its application as a guideline to honestly discuss with parents the clinical condition of their babies, without permitting the option literally included in the word euthanasia. We selected four extremely ill infants. Their parents were interviewed at least twice daily, and three stages were identified: an initial one of promoting all possible treatments; a second one of guarded and cautious requests for the staff to evaluate "suffering"; and a last one in which requests were made to reduce therapeutic efforts and provide a dignified death. A week after the death of their infants, the parents were presented with the facts of the protocol and the limits of our legal system. In all four cases the parents suggested that they would have chosen to end the life of their infants in order to spare them undue suffering. They clearly pointed out that this option emerged as a viable one to them once the ultimate outcome was evident. The protocol must not be viewed as a guideline for euthanasia in newborns, but rather as a means to discuss the critical condition of an infant with the parents. Its direct implementation in our setting remains difficult. A clear limitation to its general application remains the definition of what is considered "unbearable suffering" in newborns, and how to certify when an infant has "no prospect". We emphasize the benefits of securing the help of the Ethics

  16. ADVANCED AND RAPID DEVELOPMENT OF DYNAMIC ANALYSIS TOOLS FOR JAVA

    Directory of Open Access Journals (Sweden)

    Alex Villazón

    2012-01-01

    Full Text Available Low-level bytecode instrumentation techniques are widely used in many software-engineering tools for the Java Virtual Machine (JVM) that perform some form of dynamic program analysis, such as profilers or debuggers. While program manipulation at the bytecode level is very flexible, because the possible bytecode transformations are not restricted, tool development based on this technique is tedious and error-prone. As a promising alternative, the specification of bytecode instrumentation at a higher level using aspect-oriented programming (AOP) can reduce tool development time and cost. Unfortunately, prevailing AOP frameworks lack some features that are essential for certain dynamic analyses. In this article, we focus on three common shortcomings in AOP frameworks with respect to the development of aspect-based tools: (1) the lack of mechanisms for passing data between woven advices in local variables, (2) the support for user-defined static analyses at weaving time, and (3) the absence of pointcuts at the level of individual basic blocks of code. We propose @J, an annotation-based AOP language and weaver that integrates support for these three features. The benefits of the proposed features are illustrated with concrete examples.

  17. MultiSense: A Multimodal Sensor Tool Enabling the High-Throughput Analysis of Respiration.

    Science.gov (United States)

    Keil, Peter; Liebsch, Gregor; Borisjuk, Ljudmilla; Rolletschek, Hardy

    2017-01-01

    The high-throughput analysis of respiratory activity has become an important component of many biological investigations. Here, a technological platform, denoted the "MultiSense tool," is described. The tool enables the parallel monitoring of respiration in 100 samples over an extended time period, by dynamically tracking the concentrations of oxygen (O2) and/or carbon dioxide (CO2) and/or pH within an airtight vial. Its flexible design supports the quantification of respiration based on either oxygen consumption or carbon dioxide release, thereby allowing for the determination of the physiologically significant respiratory quotient (the ratio between the quantities of CO2 released and the O2 consumed). It requires an LED light source to be mounted above the sample, together with a CCD camera system, adjusted to enable the capture of analyte-specific wavelengths, and fluorescent sensor spots inserted into the sample vial. Here, a demonstration is given of the use of the MultiSense tool to quantify respiration in imbibing plant seeds, for which an appropriate step-by-step protocol is provided. The technology can be easily adapted for a wide range of applications, including the monitoring of gas exchange in any kind of liquid culture system (algae, embryo and tissue culture, cell suspensions, microbial cultures).
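
    The respiratory quotient described above can be computed from the headspace time series with two straight-line slope fits, assuming approximately linear gas exchange over the measurement window; the series below are synthetic.

```python
import numpy as np

def respiratory_quotient(t_h, o2_pct, co2_pct):
    """RQ = CO2 release rate / O2 consumption rate, each rate taken as the
    slope of a least-squares line over the measurement window."""
    d_o2 = np.polyfit(t_h, o2_pct, 1)[0]    # %/h, negative while consumed
    d_co2 = np.polyfit(t_h, co2_pct, 1)[0]  # %/h, positive while released
    return d_co2 / -d_o2

t = np.linspace(0.0, 12.0, 25)              # hours after sealing the vial
rng = np.random.default_rng(7)
o2 = 20.9 - 0.30 * t + 0.02 * rng.standard_normal(25)
co2 = 0.04 + 0.29 * t + 0.02 * rng.standard_normal(25)
print(round(respiratory_quotient(t, o2, co2), 2))   # ~0.97, carbohydrate-like
```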

  18. Barberry rust survey – developing tools for diagnosis, analysis and data management

    DEFF Research Database (Denmark)

    Justesen, Annemarie Fejer; Hansen, Jens Grønbech; Hovmøller, Mogens Støvring

    Barberry (Berberis spp.) may serve as alternate host of several Puccinia species, including Puccinia graminis and P. striiformis, causing stem and yellow rust on cereals and grasses, respectively. In order to study the importance of barberry in the epidemiology of Puccinia species in the CWANA region … a rust survey was initiated. The aim was to 1) develop a surveillance protocol, 2) develop molecular diagnostic tools for identifying Puccinia spp. from aecial samples, and 3) develop a data management and display system of results as part of the Wheat Rust ToolBox (http…). Due to variable quality of aecial samples, DNA extraction was not successful for 40% of the samples. Sequences of EF1α, β-tubulin or ITS were analysed and compared to reference sequences of rust fungi infecting cereals and grasses. The analysis supported the presence of P. graminis s.l., P. arrhenatheri and P. striiformoides on barberry species. Survey and DNA sample maps with species designation were displayed in the Wheat Rust ToolBox. The future aim is to integrate barberry rust survey data based on molecular diagnostics and infection assays from research groups world-wide in order to gain…

  19. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    International Nuclear Information System (INIS)

    Plott, B.

    2006-01-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  20. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    Energy Technology Data Exchange (ETDEWEB)

    Plott, B. [Alion Science and Technology, MA and D Operation, 4949 Pearl E. Circle, 300, Boulder, CO 80301 (United States)

    2006-07-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  1. Security Analysis of DTN Architecture and Bundle Protocol Specification for Space-Based Networks

    Science.gov (United States)

    Ivancic, William D.

    2009-01-01

    A Delay-Tolerant Network (DTN) Architecture (Request for Comment, RFC-4838) and Bundle Protocol Specification, RFC-5050, have been proposed for space and terrestrial networks. Additional security specifications have been provided via the Bundle Security Specification (currently a work in progress as an Internet Research Task Force internet-draft) and, for link-layer protocols applicable to space networks, the Licklider Transport Protocol Security Extensions. This document provides a security analysis of the current DTN RFCs and proposed security-related internet drafts, with a focus on space-based communication networks, which are a rather restricted subset of DTN networks. Note that the original focus and motivation of DTN work was the Interplanetary Internet. This document does not address general store-and-forward network overlays, just the current work being done by the Internet Research Task Force (IRTF) and the Consultative Committee for Space Data Systems (CCSDS) Space Internetworking Services Area (SIS) - DTN working group under the DTN and Bundle umbrellas. However, much of the analysis is relevant to general store-and-forward overlays.

  2. Performance Analysis of a Cluster-Based MAC Protocol for Wireless Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Jesús Alonso-Zárate

    2010-01-01

    An analytical model to evaluate the non-saturated performance of the Distributed Queuing Medium Access Control Protocol for Ad Hoc Networks (DQMAN) in single-hop networks is presented in this paper. DQMAN comprises a spontaneous, temporary, and dynamic clustering mechanism integrated with a near-optimum distributed queuing Medium Access Control (MAC) protocol. Clustering is executed in a distributed manner using a mechanism inspired by the Distributed Coordination Function (DCF) of IEEE 802.11. Once a station seizes the channel, it becomes the temporary clusterhead of a spontaneous cluster and coordinates the peer-to-peer communications between the cluster members. Within each cluster, a near-optimum distributed queuing MAC protocol is executed. The theoretical analysis under non-saturation conditions integrates the clustering mechanism into the MAC layer model. To the best of the authors' knowledge, this approach is novel in the literature. In addition, the performance of an ad hoc network using DQMAN is compared to that obtained when using the DCF of IEEE 802.11 as a benchmark reference.
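
    The paper's contribution is an analytical model; the toy Monte Carlo below only illustrates the underlying DCF-style contention by which a station seizes the channel and becomes temporary clusterhead (a clean seizure requires a unique minimum backoff). All parameters are invented.

        # Toy simulation, not the authors' model: n stations each draw a random
        # backoff from a contention window; the channel is seized cleanly only
        # when exactly one station holds the minimum backoff.
        import random

        def seizure_probability(n_stations, cw=32, trials=100_000):
            clean = 0
            for _ in range(trials):
                backoffs = [random.randrange(cw) for _ in range(n_stations)]
                if backoffs.count(min(backoffs)) == 1:
                    clean += 1  # single winner becomes temporary clusterhead
            return clean / trials

        for n in (2, 5, 10, 20):
            print(f"{n:2d} stations: P(clean seizure) = {seizure_probability(n):.3f}")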

  3. Organ donation in the ICU: A document analysis of institutional policies, protocols, and order sets.

    Science.gov (United States)

    Oczkowski, Simon J W; Centofanti, John E; Durepos, Pamela; Arseneau, Erika; Kelecevic, Julija; Cook, Deborah J; Meade, Maureen O

    2018-04-01

    To better understand how local policies influence organ donation rates, we conducted a document analysis of our ICU organ donation policies, protocols and order sets. We used a systematic search of our institution's policy library to identify documents related to organ donation, Mindnode software to create a publication timeline, basic statistics to describe document characteristics, and qualitative content analysis to extract document themes. Documents were retrieved from Hamilton Health Sciences, an academic hospital system with a high volume of organ donation, from database inception to October 2015. We retrieved 12 active organ donation documents, including six protocols, two policies, two order sets, and two unclassified documents, the majority (75%) created after the introduction of donation after circulatory death in 2006. Four major themes emerged: the organ donation process, quality of care, patient- and family-centred care, and the role of the institution. These themes indicate areas where documented institutional standards may be beneficial. Further research is necessary to determine the relationship of local policies, protocols, and order sets to actual organ donation practices, and to identify barriers and facilitators to improving donation rates.

  4. Evaluation of a clinical decision support tool for osteoporosis disease management: protocol for an interrupted time series design.

    Science.gov (United States)

    Kastner, Monika; Sawka, Anna; Thorpe, Kevin; Chignel, Mark; Marquez, Christine; Newton, David; Straus, Sharon E

    2011-07-22

    Osteoporosis affects over 200 million people worldwide at a high cost to healthcare systems. Although guidelines on assessing and managing osteoporosis are available, many patients are not receiving appropriate diagnostic testing or treatment. Findings from a systematic review of osteoporosis interventions, a series of mixed-methods studies, and advice from experts in osteoporosis and human-factors engineering were used collectively to develop a multicomponent tool (targeted to family physicians and patients at risk for osteoporosis) that may support clinical decision making in osteoporosis disease management at the point of care. A three-phased approach will be used to evaluate the osteoporosis tool. In phase 1, the tool will be implemented in three family practices. It will involve ensuring optimal functioning of the tool while minimizing disruption to usual practice. In phase 2, the tool will be pilot tested in a quasi-experimental interrupted time series (ITS) design to determine if it can improve osteoporosis disease management at the point of care. Phase 3 will involve conducting a qualitative postintervention follow-up study to better understand participants' experiences and perceived utility of the tool and readiness to adopt the tool at the point of care. The osteoporosis tool has the potential to make several contributions to the development and evaluation of complex, chronic disease interventions, such as the inclusion of an implementation strategy prior to conducting an evaluation study. Anticipated benefits of the tool may be to increase awareness for patients about osteoporosis and its associated risks and provide an opportunity to discuss a management plan with their physician, which may all facilitate patient self-management.
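
    For readers unfamiliar with the interrupted time series design, a common analysis (not necessarily the authors' planned one) is segmented regression with level-change and slope-change terms at the intervention point; the sketch below uses simulated monthly data.

        # Segmented regression for an ITS design: baseline trend, level change
        # at the intervention, and slope change after it. Data are simulated.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        months = np.arange(24)
        intervention = (months >= 12).astype(float)   # tool introduced at month 12
        time_after = np.maximum(0, months - 12)

        # simulated monthly rate of appropriate osteoporosis management (%)
        y = 40 + 0.2 * months + 8 * intervention + 0.5 * time_after \
            + rng.normal(0, 1.5, 24)

        X = sm.add_constant(np.column_stack([months, intervention, time_after]))
        fit = sm.OLS(y, X).fit()
        print(fit.params)  # [intercept, baseline slope, level change, slope change]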

  5. Replication protocol analysis: a method for the study of real-world design thinking

    DEFF Research Database (Denmark)

    Galle, Per; Kovacs, L. B.

    1996-01-01

    ’ is refined into a method called ‘replication protocol analysis’ (RPA), and discussed from a methodological perspective of design research. It is argued that for the study of real-world design thinking this method offers distinct advantages over traditional ‘design protocol analysis’, which seeks to capture......Given the brief of an architectural competition on site planning, and the design awarded the first prize, the first author (trained as an architect but not a participant in the competition) produced a line of reasoning that might have led from brief to design. In the paper, such ‘design replication...... the designer’s authentic line of reasoning. To illustrate how RPA can be used, the site planning case is briefly presented, and part of the replicated line of reasoning analysed. One result of the analysis is a glimpse of a ‘logic of design’; another is an insight which sheds new light on Darke’s classical...

  6. Performance Analysis of an Optical CDMA MAC Protocol With Variable-Size Sliding Window

    Science.gov (United States)

    Mohamed, Mohamed Aly A.; Shalaby, Hossam M. H.; Abdel-Moety El-Badawy, El-Sayed

    2006-10-01

    A media access control protocol for optical code-division multiple-access packet networks with variable-length data traffic is proposed. This protocol exhibits a sliding window of variable size. A model for interference-level fluctuation and an accurate analysis of channel usage are presented. Both multiple-access interference (MAI) and the photodetector's shot noise are considered, and both chip-level and correlation receivers are adopted. The system performance is evaluated in terms of the traditional measures of average system throughput and average delay. Finally, in order to enhance the overall performance, error control codes (ECCs) are applied. The results indicate that performance peaks when the ECC uses an optimum number of correctable errors. Furthermore, chip-level receivers are shown to give much higher performance than correlation receivers, and MAI is shown to be the main source of signal degradation.
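
    The existence of an optimum number of correctable errors can be illustrated with a toy calculation: a t-error-correcting code over an n-chip packet succeeds with the binomial tail P = sum_{i=0..t} C(n,i) p^i (1-p)^(n-i), but spends more of each packet on parity as t grows. The packet length, chip-error probability, and parity cost below are assumptions, not the paper's model.

        # Why an optimum t exists: success probability rises with t while the
        # useful fraction k/n of the packet falls. Numbers are illustrative.
        from math import comb

        def p_success(n, p, t):
            # probability that at most t of n chips are in error
            return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(t + 1))

        n, p = 1024, 0.004          # packet length, chip-error probability (assumed)
        for t in range(11):
            k = n - 25 * t          # assumed parity cost: 25 bits per correctable error
            print(f"t={t:2d}: useful-throughput fraction = {k / n * p_success(n, p, t):.3f}")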

  7. Protocol for validation of the 4AT, a rapid screening tool for delirium: a multicentre prospective diagnostic test accuracy study.

    Science.gov (United States)

    Shenkin, Susan D; Fox, Christopher; Godfrey, Mary; Siddiqi, Najma; Goodacre, Steve; Young, John; Anand, Atul; Gray, Alasdair; Smith, Joel; Ryan, Tracy; Hanley, Janet; MacRaild, Allan; Steven, Jill; Black, Polly L; Boyd, Julia; Weir, Christopher J; MacLullich, Alasdair Mj

    2018-02-10

    Delirium is a severe neuropsychiatric syndrome of rapid onset, commonly precipitated by acute illness. It is common in older people in the emergency department (ED) and acute hospital, but greatly under-recognised in these and other settings. Delirium and other forms of cognitive impairment, particularly dementia, commonly coexist. There is a need for a rapid delirium screening tool that can be administered by a range of professional-level healthcare staff to patients with sensory or functional impairments in a busy clinical environment, and which also incorporates general cognitive assessment. We developed the 4 'A's Test (4AT) for this purpose. This study's primary objective is to validate the 4AT against a reference standard. Secondary objectives include (1) comparing the 4AT with another widely used test (the Confusion Assessment Method (CAM)); (2) determining if the 4AT is sensitive to general cognitive impairment; (3) assessing if 4AT scores predict outcomes; and (4) a health economic analysis. 900 patients aged 70 or over in EDs or acute general medical wards will be recruited in three sites (Edinburgh, Bradford and Sheffield) over 18 months. Each patient will undergo a reference standard delirium assessment and will be randomised to assessment with either the 4AT or the CAM. At 12 weeks, outcomes (length of stay, institutionalisation and mortality) and resource utilisation will be collected by a questionnaire and via the electronic patient record. Ethical approval was granted in Scotland and England. The study involves administering tests commonly used in clinical practice. The main ethical issue is the essential recruitment of people without capacity. Dissemination is planned via publication in high impact journals, presentation at conferences, social media and the website www.the4AT.com. ISRCTN53388093; Pre-results.
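
    The primary endpoint of such a validation reduces to a 2x2 table of index test result against the reference standard; a minimal sketch with invented counts:

        # Sensitivity and specificity of an index test (e.g., the 4AT) against
        # a reference-standard delirium assessment. Counts below are invented.
        def sens_spec(tp, fn, fp, tn):
            sensitivity = tp / (tp + fn)   # true positives among reference-positive
            specificity = tn / (tn + fp)   # true negatives among reference-negative
            return sensitivity, specificity

        tp, fn, fp, tn = 76, 14, 45, 315   # hypothetical 2x2 table
        se, sp = sens_spec(tp, fn, fp, tn)
        print(f"sensitivity = {se:.2f}, specificity = {sp:.2f}")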

  8. Hearing and vision screening tools for long-term care residents with dementia: protocol for a scoping review

    Science.gov (United States)

    McGilton, Katherine S; Höbler, Fiona; Campos, Jennifer; Dupuis, Kate; Labreche, Tammy; Guthrie, Dawn M; Jarry, Jonathan; Singh, Gurjit; Wittich, Walter

    2016-01-01

    Introduction Hearing and vision loss among long-term care (LTC) residents with dementia frequently goes unnoticed and untreated. Despite negative consequences for these residents, there is little information available about their sensory abilities and care assessments and practices seldom take these abilities or accessibility needs into account. Without adequate knowledge regarding such sensory loss, it is difficult for LTC staff to determine the level of an individual's residual basic competence for communication and independent functioning. We will conduct a scoping review to identify the screening measures used in research and clinical contexts that test hearing and vision in adults aged over 65 years with dementia, aiming to: (1) provide an overview of hearing and vision screening in older adults with dementia; and (2) evaluate the sensibility of the screening tools. Methods and analysis This scoping review will be conducted using the framework by Arksey and O'Malley and furthered by methodological enhancements from cited researchers. We will conduct electronic database searches in CENTRAL, CINAHL, EMBASE, MEDLINE and PsycINFO. We will also carry out a ‘grey literature’ search for studies or materials not formally published, both online and through interview discussions with healthcare professionals and research clinicians working in the field. Our aim is to find new and existing hearing and vision screening measures used in research and by clinical professionals of optometry and audiology. Abstracts will be independently reviewed twice for acceptance by a multidisciplinary team of researchers and research clinicians. Ethics and dissemination This review will inform health professionals working with this growing population. With the review findings, we aim to develop a toolkit and an algorithmic process to select the most appropriate hearing and vision screening assessments for LTC residents with dementia that will facilitate accurate testing and can

  9. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. The new direct method of calculating the first-order sensitivity coefficients, applying sparse matrix technology to chemical kinetics, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their Jacobian analytical expressions are generated automatically from a chemical mechanism. The kinetic representation for the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines in packages, such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
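
    A minimal instance of the direct method described above: the model equation dc/dt = -k c is integrated together with its sensitivity equation ds/dt = -c - k s (where s = dc/dk, obtained by differentiating the model with respect to k) using a Gear-type (BDF) stiff solver. The rate constant is invented.

        # Direct-method sensitivity analysis on a one-reaction toy mechanism.
        import numpy as np
        from scipy.integrate import solve_ivp

        k = 0.5  # assumed rate constant

        def rhs(t, y):
            c, s = y
            return [-k * c, -c - k * s]   # model equation + sensitivity equation

        sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], method="BDF", rtol=1e-8)
        c_end, s_end = sol.y[:, -1]
        print(f"c(10) = {c_end:.5f}, dc/dk at t=10 = {s_end:.5f}")
        print(f"analytic check: {-10 * np.exp(-k * 10):.5f}")  # dc/dk = -t*exp(-k*t)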

  10. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    Science.gov (United States)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  11. Analysis of 213 currently used rehabilitation protocols in foot and ankle fractures.

    Science.gov (United States)

    Pfeifer, Christian G; Grechenig, Stephan; Frankewycz, Borys; Ernstberger, Antonio; Nerlich, Michael; Krutsch, Werner

    2015-10-01

    Fractures of the ankle, hind- and midfoot are amongst the five most common fractures. Besides initial operative or non-operative treatment, rehabilitation of the patients plays a crucial role in fracture union and long-term functional outcome. Limited evidence is available with regard to what a rehabilitation regimen should include and what guidelines should be in place for the initial clinical course of these patients. This study therefore investigated current rehabilitation concepts after fractures of the ankle, hind- and midfoot. Written rehabilitation protocols provided by orthopedic and trauma surgery institutions, in terms of recommendations for weight bearing, range of motion (ROM), physiotherapy and choice of orthosis, were screened and analysed. All protocols for lateral ankle fractures type AO 44A1, AO 44B1 and AO 44C1, for calcaneal fractures and for fractures of the metatarsals, as well as others not specific, were included. Descriptive analysis was carried out and statistical analysis applied where appropriate. 209 rehabilitation protocols for ankle fractures type AO 44B1 and AO 44C1, 98 for AO 44A1, 193 for metatarsal fractures, 142 for calcaneal fractures, 107 for 5th metatarsal base fractures and 70 for 5th metatarsal Jones fractures were evaluated. The mean time recommended for orthosis treatment was 6.04 (SD 0.04) weeks. While the majority of protocols showed a trend towards increased weight bearing and increased ROM over time, the best consensus was noted for weight bearing recommendations. Our study shows that huge variability exists in the rehabilitation of fractures of the ankle, hind- and midfoot. This may be attributed to a lack of consensus (e.g. missing publication of guidelines), individualized patient care (e.g. in fragility fractures) or lack of specialization. This study might serve as a basis for prospective randomized controlled trials in order to optimize rehabilitation for these common fractures.

  12. Cleft Audit Protocol for Speech (CAPS-A): A Comprehensive Training Package for Speech Analysis

    Science.gov (United States)

    Sell, D.; John, A.; Harding-Bell, A.; Sweeney, T.; Hegarty, F.; Freeman, J.

    2009-01-01

    Background: The previous literature has largely focused on speech analysis systems and ignored process issues, such as the nature of adequate speech samples, data acquisition, recording and playback. Although there has been recognition of the need for training on tools used in speech analysis associated with cleft palate, little attention has been…

  13. The reliability analysis of cutting tools in the HSM processes

    OpenAIRE

    W.S. Lin

    2008-01-01

    Purpose: This article mainly describes the reliability of cutting tools in high speed turning using a normal distribution model. Design/methodology/approach: A series of experimental tests was done to evaluate the reliability variation of the cutting tools. From the experimental results, the tool wear distribution and the tool life are determined, and the tool life distribution and the reliability function of cutting tools are derived. Further, the reliability of cutting tools at any time for h…
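
    The normal-distribution reliability model named in the abstract is compact enough to state directly: if tool life T ~ N(mu, sigma^2), then R(t) = P(T > t) = 1 - Phi((t - mu)/sigma). Parameter values below are invented.

        # Reliability of a cutting tool under an assumed normal life model.
        from statistics import NormalDist

        mu, sigma = 30.0, 5.0          # assumed mean tool life and spread, minutes
        life = NormalDist(mu, sigma)

        for t in (20, 25, 30, 35, 40):
            R = 1 - life.cdf(t)        # probability the tool survives past t
            print(f"R({t} min) = {R:.3f}")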

  14. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. Statistical analysis gives meaning to meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
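
    As a small illustration of the parametric/non-parametric pairing the article summarises, here are the same simulated samples compared with a t-test and with its rank-based counterpart, the Mann-Whitney U test.

        # Parametric vs non-parametric comparison of two groups. Data simulated.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        group_a = rng.normal(5.0, 1.0, 30)
        group_b = rng.normal(5.8, 1.0, 30)

        t_stat, t_p = stats.ttest_ind(group_a, group_b)
        u_stat, u_p = stats.mannwhitneyu(group_a, group_b)
        print(f"t-test:       p = {t_p:.4f}")
        print(f"Mann-Whitney: p = {u_p:.4f}")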

  15. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
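
    The enrichment score underlying most GO tools (module-level scores included) is typically a hypergeometric tail probability: given N annotated genes of which K carry a term, how surprising is it to see k carriers in a study set of n genes? A sketch with invented counts:

        # Hypergeometric enrichment test, the statistical core of GO analysis.
        from scipy.stats import hypergeom

        N, K = 20_000, 150    # genome size, genes annotated with the term (assumed)
        n, k = 300, 12        # study-set size, study-set genes with the term (assumed)

        p_value = hypergeom.sf(k - 1, N, K, n)   # P(X >= k)
        print(f"enrichment p-value = {p_value:.2e}")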

  16. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Computerized accounting systems have seen an increase in complexity in recent years due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining, multidimensional data analysis can be performed, fraud can be detected, and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions of data mining, but all boil down to the same idea: a process of extracting new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only responds to the question of what is happening but at the same time argues and shows why certain things are happening. In this paper we present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  17. How women are treated during facility-based childbirth: development and validation of measurement tools in four countries - phase 1 formative research study protocol.

    Science.gov (United States)

    Vogel, Joshua P; Bohren, Meghan A; Tunçalp, Özge; Oladapo, Olufemi T; Adanu, Richard M; Baldé, Mamadou Diouldé; Maung, Thae Maung; Fawole, Bukola; Adu-Bonsaffoh, Kwame; Dako-Gyeke, Phyllis; Maya, Ernest Tei; Camara, Mohamed Campell; Diallo, Alfa Boubacar; Diallo, Safiatou; Wai, Khin Thet; Myint, Theingi; Olutayo, Lanre; Titiloye, Musibau; Alu, Frank; Idris, Hadiza; Gülmezoglu, Metin A

    2015-07-22

    Every woman has the right to dignified, respectful care during childbirth. Recent evidence has demonstrated that globally many women experience mistreatment during labour and childbirth in health facilities, which can pose a significant barrier to women attending facilities for delivery and can contribute to poor birth experiences and adverse outcomes for women and newborns. However, there is no clear consensus on how mistreatment of women during childbirth in facilities is defined and measured. We propose using a two-phased, mixed-methods study design in four countries to address these research gaps. This protocol describes the Phase 1 qualitative research activities. We will employ qualitative research methodologies among women, healthcare providers and administrators in the facility catchment areas of two health facilities in each country: Ghana, Guinea, Myanmar and Nigeria. In-depth interviews (IDIs) and focus group discussions (FGDs) will be conducted among women of reproductive age (15-49 years) to explore their perceptions and experiences of facility-based childbirth care, focused on how they were treated by healthcare workers and perceived factors affecting how they were treated. IDIs will also be conducted with healthcare providers of different cadres (e.g. nurses, midwives, medical officers, specialist obstetricians) and facility administrators working in the selected facilities to explore healthcare providers' perceptions and experiences of facility-based childbirth care and how staff are treated by colleagues and supervisors. Audio recordings will be transcribed and translated into English. Textual data will be analysed using a thematic framework approach and will consist of two levels of analysis: (1) conduct of local analysis workshops with the research assistants in each country; and (2) line-by-line coding to develop a thematic framework and coding scheme. This study serves several roles. It will provide an in-depth understanding of how women are

  18. Analysis tools for discovering strong parity violation at hadron colliders

    Science.gov (United States)

    Backović, Mihailo; Ralston, John P.

    2011-07-01

    Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or “azimuthal flow.” Analysis uses the representations of the orthogonal group O(2) and dihedral groups DN necessary to define parity completely in two dimensions. Classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single “reaction plane.” Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of “event-shape sorting” to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust, and can be applied equally to low or high multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.
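
    The azimuthal observables discussed above are built from event-by-event flow vectors Q_n = sum_j exp(i n phi_j); each harmonic carries its own collective angle, which is why no single "reaction plane" suffices. A toy event with simulated azimuths and an assumed second-harmonic symmetry plane:

        # Event-by-event flow vectors per harmonic n; data are simulated.
        import numpy as np

        rng = np.random.default_rng(2)
        M = 500
        phi = rng.uniform(0, 2 * np.pi, M)
        cluster = rng.random(M) < 0.3
        # clustered particles sit near psi2 or psi2 + pi: a pure second harmonic
        phi[cluster] = 0.4 + np.pi * rng.integers(0, 2, cluster.sum()) \
            + rng.normal(0, 0.4, cluster.sum())

        for n in (1, 2, 3):
            qn = np.exp(1j * n * phi).sum()
            print(f"n={n}: |Q_n|/M = {abs(qn) / M:.3f}, "
                  f"collective angle = {np.angle(qn) / n:+.3f}")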

  19. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements of cryogenics and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for maximum liquefaction in the plant under the constraints of the other parameters. The analysis results give a clear idea for deciding various parameter values before implementation of the actual plant in the field, as well as an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
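
    The first-law estimate that a simple Linde-cycle analysis rests on (and that HYSYS resolves with rigorous property models) is the cold-box energy balance y = (h1 - h2)/(h1 - hf), where h2 is the high-pressure feed enthalpy at ambient temperature, h1 the low-pressure return-gas enthalpy at ambient temperature, and hf the saturated-liquid enthalpy. The enthalpy values below are rough illustrative numbers for air, not HYSYS output.

        # Linde-Hampson liquid yield from an energy balance around the
        # heat exchanger and JT valve. Enthalpies are assumed, in kJ/kg.
        def linde_yield(h1, h2, hf):
            return (h1 - h2) / (h1 - hf)

        h1 = 298.0   # air at 1 bar, 300 K (assumed)
        h2 = 260.0   # air at 200 bar, 300 K (assumed)
        hf = -127.0  # saturated liquid air (assumed reference)

        print(f"liquid yield y = {linde_yield(h1, h2, hf):.3f}")  # ~0.09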

  20. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements of cryogenics and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for maximum liquefaction in the plant under the constraints of the other parameters. The analysis results give a clear idea for deciding various parameter values before implementation of the actual plant in the field, as well as an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.

  1. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    Couet, O.; Ferrero-Merlino, B.; Molnar, Z.; Moscicki, J.T.; Pfeiffer, A.; Sang, M.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors will give an overview of the architecture and design choices and will present the current status and future developments of the project
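
    Anaphe's front end was Python driving C++ components through AIDA-style abstract interfaces; the snippet below only mimics the flavour of such a scripted histogram-and-fit session with numpy/scipy stand-ins. It is not the AIDA interfaces or the Anaphe API.

        # Generic Python analogy of a command-line-driven analysis session.
        import numpy as np
        from scipy.optimize import curve_fit

        data = np.random.default_rng(3).normal(10.0, 2.0, 5000)
        counts, edges = np.histogram(data, bins=50)        # "histogram management"
        centers = 0.5 * (edges[:-1] + edges[1:])

        def gauss(x, a, mu, sigma):
            return a * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

        (a, mu, sigma), _ = curve_fit(gauss, centers, counts, p0=[300, 9, 1])  # "fitting"
        print(f"fitted mu = {mu:.2f}, sigma = {sigma:.2f}")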

  2. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  3. Principles and tools for collaborative entity-based intelligence analysis.

    Science.gov (United States)

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested in whether software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.

  4. Development of a Ground Test and Analysis Protocol for NASA's NextSTEP Phase 2 Habitation Concepts

    Science.gov (United States)

    Gernhardt, Michael L.; Beaton, Kara H.; Chappell, Steven P.; Bekdash, Omar S.; Abercromby, Andrew F. J.

    2018-01-01

    The NASA Next Space Technologies for Exploration Partnerships (NextSTEP) program is a public-private partnership model that seeks commercial development of deep space exploration capabilities to support human spaceflight missions around and beyond cislunar space. NASA first issued the Phase 1 NextSTEP Broad Agency Announcement to U.S. industries in 2014, which called for innovative cislunar habitation concepts that leveraged commercialization plans for low-Earth orbit. These habitats will be part of the Deep Space Gateway (DSG), the cislunar space station planned by NASA for construction in the 2020s. In 2016, Phase 2 of the NextSTEP program selected five commercial partners to develop ground prototypes. A team of NASA research engineers and subject matter experts (SMEs) have been tasked with developing the ground-test protocol that will serve as the primary means by which these Phase 2 prototypes will be evaluated. Since 2008, this core test team has successfully conducted multiple spaceflight analog mission evaluations utilizing a consistent set of operational tools, methods, and metrics to enable the iterative development, testing, analysis, and validation of evolving exploration architectures, operations concepts, and vehicle designs. The purpose of implementing a similar evaluation process for the Phase 2 Habitation Concepts is to consistently evaluate different commercial partner ground prototypes to provide data-driven, actionable recommendations for Phase 3. This paper describes the process by which the ground test protocol was developed and the objectives, methods, and metrics by which the NextSTEP Phase 2 Habitation Concepts will be rigorously and systematically evaluated. The protocol has been developed using both a top-down and bottom-up approach. Top-down development began with the Human Exploration and Operations Mission Directorate (HEOMD) exploration objectives and ISS Exploration Capability Study Team (IECST) candidate flight objectives. Strategic

  5. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi Fukushima

    2014-11-01

    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome, combined with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.

  6. Modeling energy technology choices. Which investment analysis tools are appropriate?

    International Nuclear Information System (INIS)

    Johnson, B.E.

    1994-01-01

    A variety of tools from modern investment theory appear to hold promise for unraveling observed energy technology investment behavior that often appears anomalous when analyzed using traditional investment analysis methods. This paper reviews the assumptions and important insights of the investment theories most commonly suggested as candidates for explaining the apparent "energy technology investment paradox". The applicability of each theory is considered in the light of important aspects of energy technology investment problems, such as sunk costs, uncertainty and imperfect information. The theories addressed include the capital asset pricing model, the arbitrage pricing theory, and the theory of irreversible investment. Enhanced net present value methods are also considered. (author)
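
    The baseline that all the "enhanced" methods extend is plain discounted cash flow. With invented figures, the sketch below shows how the accept/reject decision flips with the discount rate, one reason observed investment behavior can look anomalous under a single assumed rate.

        # Net present value of a cash-flow stream at discount rate r.
        def npv(rate, cashflows):
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

        # year-0 investment in an efficiency technology, then annual energy savings
        flows = [-10_000, 2_500, 2_500, 2_500, 2_500, 2_500]
        for r in (0.05, 0.10, 0.20):
            print(f"r = {r:.0%}: NPV = {npv(r, flows):8.0f}")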

  7. Schema for the LANL infrasound analysis tool, infrapy

    Energy Technology Data Exchange (ETDEWEB)

    Dannemann, Fransiska Kate [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Marcillo, Omar Eduardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-14

    The purpose of this document is to define the schema used for the operation of the infrasound analysis tool, infrapy. The tables described by this document extend the CSS3.0 or KB core schema to include information required for the operation of infrapy. This document is divided into three sections, the first being this introduction. Section two defines eight new, infrasonic data processing-specific database tables. Both internal (ORACLE) and external formats for the attributes are defined, along with a short description of each attribute. Section three of the document shows the relationships between the different tables by using entity-relationship diagrams.
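
    The document itself defines the real tables; purely to show the shape of a CSS3.0-style schema extension, here is a hypothetical detection-results table. Table and column names are invented, not infrapy's actual schema.

        # Sketch of extending a CSS3.0-style schema with a results table.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""
            CREATE TABLE fk_results (          -- hypothetical results table
                pfkid     INTEGER PRIMARY KEY, -- processing run id
                sta       TEXT NOT NULL,       -- station/array code (CSS3.0-style)
                time      REAL NOT NULL,       -- epoch seconds of detection window
                back_az   REAL,                -- back azimuth, degrees
                trace_vel REAL,                -- trace velocity, m/s
                fstat     REAL                 -- F-statistic of the beam
            )""")
        conn.execute("INSERT INTO fk_results VALUES "
                     "(1, 'I57US', 1491177600.0, 214.3, 345.0, 8.7)")
        print(conn.execute("SELECT sta, back_az, fstat FROM fk_results").fetchall())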

  8. Predicting SPE Fluxes: Coupled Simulations and Analysis Tools

    Science.gov (United States)

    Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.

    2017-12-01

    Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc.'s Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS-simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM, and a variety of parameters can be set for each run: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations is produced, as well as a library of analysis tools for deeper inquiries. All steps will be covered end-to-end, as will the framework's user interface and availability.

  9. A design and performance analysis tool for superconducting RF systems

    International Nuclear Information System (INIS)

    Schilcher, T.; Simrock, S.N.; Merminga, L.; Wang, D.X.

    1997-01-01

    Superconducting rf systems are usually operated with continuous rf power or with rf pulse lengths exceeding 1 ms to maximize the overall wall plug power efficiency. Typical examples are CEBAF at the Thomas Jefferson National Accelerator Facility (Jefferson Lab) and the TESLA Test Facility at DESY. The long pulses allow for effective application of feedback to stabilize the accelerating field in the presence of microphonics, Lorentz force detuning, and fluctuations of the beam current. In this paper the authors describe a set of tools to be used with MATLAB and SIMULINK, which allow analysis of the quality of field regulation for a given design. The tools include models for the cavities, the rf power source, the beam, sources of field perturbations, and the rf feedback system. The rf-control-relevant electrical and mechanical characteristics of the cavity are described in the form of time-varying state space models. The power source is modeled as a current generator and includes saturation characteristics and noise. An arbitrary time structure can be imposed on the beam current to reflect a macro-pulse structure and bunch charge fluctuations. For rf feedback several schemes can be selected: traditional amplitude and phase control as well as I/Q control. The choices for the feedback controller include analog or digital approaches and various choices of frequency response. Feedforward can be added to further suppress repetitive errors. The results of a performance analysis of the CEBAF and TESLA linac rf systems using these tools are presented.
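
    A toy fixed-parameter version of such a state-space cavity model: the baseband I/Q envelope of a cavity with half-bandwidth w12 and microphonics detuning dw. The numerical values are assumed for illustration, not CEBAF's or TESLA's.

        # Baseband I/Q cavity envelope as a linear state-space system.
        import numpy as np
        from scipy.signal import StateSpace, lsim

        w12 = 2 * np.pi * 215.0     # half-bandwidth [rad/s] (assumed)
        dw = 2 * np.pi * 100.0      # detuning from microphonics [rad/s] (assumed)

        A = [[-w12, -dw], [dw, -w12]]
        B = [[w12, 0.0], [0.0, w12]]
        sys = StateSpace(A, B, np.eye(2), np.zeros((2, 2)))

        t = np.linspace(0, 0.01, 2000)
        u = np.column_stack([np.ones_like(t), np.zeros_like(t)])  # step drive on I
        _, y, _ = lsim(sys, u, t)
        print(f"steady-state I/Q: {y[-1, 0]:.3f}, {y[-1, 1]:.3f}")  # detuning rotates phase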

  10. Adapting the capacities and vulnerabilities approach: a gender analysis tool.

    Science.gov (United States)

    Birks, Lauren; Powell, Christopher; Hatfield, Jennifer

    2017-12-01

    Gender analysis methodology is increasingly being considered as essential to health research because 'women's social, economic and political status undermine their ability to protect and promote their own physical, emotional and mental health, including their effective use of health information and services' (World Health Organization, Gender Analysis in Health: a review of selected tools, 2003; www.who.int/gender/documents/en/Gender.pdf, accessed 20 February 2008). By examining gendered roles, responsibilities and norms through the lens of gender analysis, we can develop an in-depth understanding of social power differentials, and be better able to address gender inequalities and inequities within institutions and between men and women. When conducting gender analysis, tools and frameworks may help to aid community engagement and to provide a framework to ensure that relevant gendered nuances are assessed. The capacities and vulnerabilities approach (CVA) is one such gender analysis framework that critically considers gender and its associated roles, responsibilities and power dynamics in a particular community and seeks to meet a social need of that community. Although the original intent of the CVA was to guide humanitarian intervention and disaster preparedness, we adapted this framework to a different context, focusing on identifying and addressing emerging problems and social issues that affect a particular community's specific needs, such as an infectious disease outbreak or difficulty accessing health information and resources. We provide an example of our CVA adaptation, which served to facilitate a better understanding of how health-related disparities affect Maasai women in a remote, resource-poor setting in Northern Tanzania.

  11. Developing a tool for assessing competency in root cause analysis.

    Science.gov (United States)

    Gupta, Priyanka; Varkey, Prathibha

    2009-01-01

    Root cause analysis (RCA) is a tool for identifying the key cause(s) contributing to a sentinel event or near miss. Although training in RCA is gaining popularity in medical education, there is no published literature on valid or reliable methods for assessing competency in the same. A tool for assessing competency in RCA was pilot tested as part of an eight-station Objective Structured Clinical Examination that was conducted at the completion of a three-week quality improvement (QI) curriculum for the Mayo Clinic Preventive Medicine and Endocrinology fellowship programs. As part of the curriculum, fellows completed a QI project to enhance physician communication of the diagnosis and treatment plan at the end of a patient visit. They had a didactic session on RCA, followed by process mapping of the information flow at the project clinic, after which fellows conducted an actual RCA using the Ishikawa fishbone diagram. For the RCA competency assessment, fellows performed an RCA regarding a scenario describing an adverse medication event and provided possible solutions to prevent such errors in the future. All faculty strongly agreed or agreed that they were able to accurately assess competency in RCA using the tool. Interrater reliability for the global competency rating and checklist scoring were 0.96 and 0.85, respectively. Internal consistency (Cronbach's alpha) was 0.76. Six of eight of the fellows found the difficulty level of the test to be optimal. Assessment methods must accompany education programs to ensure that graduates are competent in QI methodologies and are able to apply them effectively in the workplace. The RCA assessment tool was found to be a valid, reliable, feasible, and acceptable method for assessing competency in RCA. Further research is needed to examine its predictive validity and generalizability.
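
    Internal consistency of the kind reported above (Cronbach's alpha) is simple to compute: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The ratings below are invented, not the study's data.

        # Cronbach's alpha from a candidates-by-items score matrix.
        import numpy as np

        # rows = candidates, columns = checklist items (invented ratings)
        scores = np.array([
            [4, 5, 4, 4], [3, 4, 3, 3], [5, 5, 4, 5],
            [2, 3, 2, 3], [4, 4, 5, 4], [3, 3, 3, 2],
        ])
        k = scores.shape[1]
        item_var = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        alpha = k / (k - 1) * (1 - item_var / total_var)
        print(f"Cronbach's alpha = {alpha:.2f}")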

  12. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Background: Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results: The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion: The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is

  13. A survey of tools for the analysis of quantitative PCR (qPCR) data

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    2014-09-01

    Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  14. Analysis tools for the interplay between genome layout and regulation.

    Science.gov (United States)

    Bouyioukos, Costas; Elati, Mohamed; Képès, François

    2016-06-06

    Genome layout and gene regulation appear to be interdependent. Understanding this interdependence is key to exploring the dynamic nature of chromosome conformation and to engineering functional genomes. Evidence for non-random genome layout, defined as the relative positioning of either co-functional or co-regulated genes, stems from two main approaches. Firstly, the analysis of contiguous genome segments across species, has highlighted the conservation of gene arrangement (synteny) along chromosomal regions. Secondly, the study of long-range interactions along a chromosome has emphasised regularities in the positioning of microbial genes that are co-regulated, co-expressed or evolutionarily correlated. While one-dimensional pattern analysis is a mature field, it is often powerless on biological datasets which tend to be incomplete, and partly incorrect. Moreover, there is a lack of comprehensive, user-friendly tools to systematically analyse, visualise, integrate and exploit regularities along genomes. Here we present the Genome REgulatory and Architecture Tools SCAN (GREAT:SCAN) software for the systematic study of the interplay between genome layout and gene expression regulation. SCAN is a collection of related and interconnected applications currently able to perform systematic analyses of genome regularities as well as to improve transcription factor binding sites (TFBS) and gene regulatory network predictions based on gene positional information. We demonstrate the capabilities of these tools by studying on one hand the regular patterns of genome layout in the major regulons of the bacterium Escherichia coli. On the other hand, we demonstrate the capabilities to improve TFBS prediction in microbes. Finally, we highlight, by visualisation of multivariate techniques, the interplay between position and sequence information for effective transcription regulation.

  15. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Background: Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides; after correction for various intervening variables, loss or gain in the test DNA can be indicated by spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results: We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options facilitate also the definition of shortest regions of overlap and simplify the

  16. A new tool for accelerator system modeling and analysis

    International Nuclear Information System (INIS)

    Gillespie, G.H.; Hill, B.W.; Jameson, R.A.

    1994-01-01

    A novel computer code is being developed to generate system-level designs of radiofrequency ion accelerators. The goal of the Accelerator System Model (ASM) code is to create a modeling and analysis tool that is easy to use, automates many of the initial design calculations, supports trade studies used in assessing alternate designs, and yet is flexible enough to incorporate new technology concepts as they emerge. Hardware engineering parameters and beam dynamics are modeled at comparable levels of fidelity. Existing scaling models of accelerator subsystems were used to produce a prototype of ASM (version 1.0) working within the Shell for Particle Accelerator Related Codes (SPARC) graphical user interface. A small user group has been testing and evaluating the prototype for about a year. Several enhancements and improvements are now being developed. The current version (1.1) of ASM is briefly described and an example of the modeling and analysis capabilities is illustrated

  17. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight into design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost-effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed
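
    GRESS and EXAP worked by instrumenting FORTRAN source so that derivatives propagate alongside values; the underlying principle, forward-mode automatic differentiation, can be sketched with dual numbers. This illustrates the idea only, not the ORNL compilers themselves.

```python
import math
from dataclasses import dataclass

@dataclass
class Dual:
    """A value together with its derivative, propagated through arithmetic."""
    val: float
    der: float = 0.0

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def dexp(x: Dual) -> Dual:
    return Dual(math.exp(x.val), math.exp(x.val) * x.der)

def model(k):
    # Invented response whose sensitivity to the parameter k we want
    return 3.0 * k * k + dexp(k)

out = model(Dual(2.0, 1.0))  # seed d(k)/d(k) = 1
print(f"response = {out.val:.3f}, d(response)/dk = {out.der:.3f}")
# analytic check: derivative is 6k + exp(k) = 12 + e^2 at k = 2
```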

  18. Message Correlation Analysis Tool for NOvA

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
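
    The Message Analyzer code itself is not public in this record, but the pattern-plus-trigger scheme it describes (match log messages against patterns, correlate hits inside a time window, fire a rule) is easy to sketch generically. Patterns, thresholds and the example messages below are all invented.

```python
import re
import time
from collections import defaultdict, deque

class RuleEngine:
    """Minimal log-correlation engine: regex patterns feed sliding-window
    counters; a rule fires once its pattern exceeds a threshold in-window."""

    def __init__(self):
        self.rules = []                      # (name, regex, threshold, window_s)
        self.hits = defaultdict(deque)       # rule name -> recent hit times

    def add_rule(self, name, pattern, threshold, window_s):
        self.rules.append((name, re.compile(pattern), threshold, window_s))

    def feed(self, message, now=None):
        now = time.time() if now is None else now
        fired = []
        for name, rx, threshold, window_s in self.rules:
            if rx.search(message):
                q = self.hits[name]
                q.append(now)
                while q and now - q[0] > window_s:
                    q.popleft()              # drop hits outside the window
                if len(q) >= threshold:
                    fired.append(name)       # a real engine would react here
        return fired

engine = RuleEngine()
engine.add_rule("buffer-overrun-burst", r"buffer overrun on node \d+", 3, 10.0)
for t in (0.0, 2.0, 4.0):
    print(engine.feed("ERROR buffer overrun on node 17", now=t))
# -> [], [], ['buffer-overrun-burst']
```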

  19. Message Correlation Analysis Tool for NOvA

    International Nuclear Information System (INIS)

    Lu Qiming; Biery, Kurt A; Kowalkowski, James B

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  20. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of the investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for gauging the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool for energy planners considering their electric power options in the future.
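
    As a worked illustration of the accounting involved (the figures are invented, not the author's), the headline quantity of a net energy analysis is simply the lifetime energy delivered divided by the energy invested in building, fueling and operating the system:

```python
def net_energy_ratio(delivered_gwh, invested_gwh):
    """Energy delivered per unit of energy invested; > 1 means a net producer."""
    return delivered_gwh / invested_gwh

# Invented plant: 400 GWh/yr for 30 years against 1500 GWh of embodied
# and operating energy inputs
print(net_energy_ratio(400 * 30, 1500))  # -> 8.0
```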

  1. Message correlation analysis tool for NOvA

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Qiming [Fermilab; Biery, Kurt A. [Fermilab; Kowalkowski, James B. [Fermilab

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  2. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and the data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design stage of the product development process.

  3. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    International Nuclear Information System (INIS)

    Shamir, Lior

    2011-01-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ∼10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
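
    The Ganalyzer binaries are distributed at the URL above; the first stages of the measurement pipeline the abstract describes (threshold the galaxy from the background, locate its centre, build the radial intensity plot) can be sketched as below. The slope analysis of peaks that yields spirality is omitted, and all parameters are invented.

```python
import numpy as np

def radial_intensity_profile(image, background_sigma=3.0, n_bins=50):
    """Mask galaxy pixels, find their centroid, and return the mean
    intensity as a function of radius."""
    img = np.asarray(image, float)
    galaxy = img > img.mean() + background_sigma * img.std()  # crude mask
    ys, xs = np.nonzero(galaxy)
    cy, cx = ys.mean(), xs.mean()
    r = np.hypot(ys - cy, xs - cx)
    bins = np.minimum((r / max(r.max(), 1e-9) * n_bins).astype(int), n_bins - 1)
    profile = np.zeros(n_bins)
    for b in range(n_bins):
        sel = bins == b
        profile[b] = img[ys[sel], xs[sel]].mean() if sel.any() else 0.0
    return profile

# Invented image: a Gaussian "galaxy" on a noisy background
rng = np.random.default_rng(2)
y, x = np.mgrid[0:128, 0:128]
image = 100.0 * np.exp(-((y - 64) ** 2 + (x - 64) ** 2) / (2 * 12.0 ** 2))
image += rng.normal(0, 1, image.shape)
print(np.round(radial_intensity_profile(image)[:5], 1))  # falls off with radius
```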

  4. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    Science.gov (United States)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  5. The Climate Data Analysis Tools (CDAT): Scientific Discovery Made Easy

    Science.gov (United States)

    Doutriaux, C. M.; Williams, D. N.; Drach, R. S.; McCoy, R. B.; Mlaker, V.

    2008-12-01

    In recent years, the amount of data available to climate scientists has grown exponentially. Whether we're looking at the increasing number of organizations providing data, the finer resolutions of climate models, or the escalating number of experiments and realizations for those experiments, every aspect of climate research leads to an unprecedented growth of the volume of data to analyze. The recent success and visibility of the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) is boosting the demand to unprecedented levels and keeping the numbers increasing. Meanwhile, the technology available for scientists to analyze the data has remained largely unchanged since the early days. One tool, however, has proven itself flexible enough not only to follow the trend of escalating demand, but also to be ahead of the game: the Climate Data Analysis Tools (CDAT) from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). While providing the cutting-edge technology necessary to distribute the IPCC AR4 data via the Earth System Grid, PCMDI has continuously evolved CDAT to handle new grids and higher definitions, and to provide new diagnostics. In the near future, in time for AR5, PCMDI will use CDAT for state-of-the-art remote data analysis in a grid computing environment.

  6. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  7. XQCAT eXtra Quark Combined Analysis Tool

    CERN Document Server

    Barducci, D; Buchkremer, M; Marrouche, J; Moretti, S; Panizzi, L

    2015-01-01

    XQCAT (eXtra Quark Combined Analysis Tool) is a tool aimed at determining exclusion Confidence Levels (eCLs) for scenarios of new physics characterised by the presence of one or multiple heavy extra quarks (XQ) which interact through Yukawa couplings with any of the Standard Model (SM) quarks. The code uses a database of efficiencies for pre-simulated processes of Quantum Chromo-Dynamics (QCD) pair production and on-shell decays of extra quarks. In version 1.0 of XQCAT, the efficiencies have been computed for a set of seven publicly available search results from the CMS experiment, and the package is subject to future updates to include further searches by both the ATLAS and CMS collaborations. The input for the code is a text file in which the masses, branching ratios (BRs) and dominant chirality of the couplings of the new quarks are provided. The output of the code is the eCL of the test point for each implemented experimental analysis, considered individually and, when possible, in statistical combination.

  8. Simple Strategic Analysis Tools at SMEs in Ecuador

    Directory of Open Access Journals (Sweden)

    Diego H. Álvarez Peralta

    2015-06-01

    Full Text Available This article explores the possible applications of Strategic Analysis Tools (SAT) in SMEs located in emerging countries such as Ecuador (where there are no formal studies on the subject). It analyses whether or not it is feasible to effectively apply a set of proposed tools to guide the mental maps of executives when decisions on strategy have to be made. Through an in-depth review of the state of the art regarding SAT, and interviews with key participants such as chambers of commerce and executives of different firms, the feasibility of their application is shown. This analysis is complemented with specialist interviews to deepen our insights and obtain valid conclusions. Our conclusion is that SMEs can smoothly develop and apply an appropriate set of SAT when opting for very relevant choices. However, there are some obstacles to be solved, connected with resources (such as people's abilities and technology), behavioural (cultural) factors and methodological processes. Once these barriers are removed, current approaches to strategic decision-making are more likely to be enriched and made even more effective. This is a qualitative investigation with a non-experimental, cross-sectional research design, as it relates to a specific moment in time.

  9. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to assess inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score, with a high Pearson correlation coefficient of 0.909. The modified BAT® software improves the correlation between subjective and objective BSI values, and may be a new standard for trials evaluating breast symmetry.

  10. A fresh look at the freeze-all protocol: a SWOT analysis.

    Science.gov (United States)

    Blockeel, Christophe; Drakopoulos, Panagiotis; Santos-Ribeiro, Samuel; Polyzos, Nikolaos P; Tournaye, Herman

    2016-03-01

    The 'freeze-all' strategy with the segmentation of IVF treatment, namely with the use of a GnRH antagonist protocol, GnRH agonist triggering, the elective cryopreservation of all embryos by vitrification and a frozen-thawed embryo transfer in a subsequent cycle, has become more popular. However, the approach still encounters drawbacks. In this opinion paper, a SWOT (strengths, weaknesses, opportunities and threats) analysis sheds light on the different aspects of this strategy.

  11. Establishing a protocol for element determination in human nail clippings by neutron activation analysis

    International Nuclear Information System (INIS)

    Sanches, Thalita Pinheiro; Saiki, Mitiko

    2011-01-01

    Human nail samples have been analyzed to evaluate occupational exposure and nutritional status, and to diagnose certain diseases. However, sampling and washing protocols for nail analyses vary from study to study, preventing comparisons between studies. One of the difficulties in analyzing nail samples is to eliminate only surface contamination without removing elements of interest in this tissue. In the present study, a protocol was defined in order to obtain reliable results for element concentrations in human nail clippings. Nail clippings collected from all 10 fingers or toes were first pre-cleaned using an ethyl alcohol solution to eliminate microbes. The clippings were then cut into small pieces and washed by shaking in different reagents. Neutron activation analysis (NAA) was applied for the nail analyses, which consisted of irradiating aliquots of samples together with synthetic elemental standards in the IEA-R1 nuclear research reactor, followed by gamma-ray spectrometry. Comparisons between the results obtained for nails cleaned with different reagents indicated that the procedure using acetone and Triton X100 solution is more effective than that using nitric acid solution. Analyses in triplicate of a nail sample gave results with relative standard deviations below 15% for most elements, showing the homogeneity of the prepared sample. Qualitative analyses of different nail polishes showed that the presence of the elements determined in the present study is negligible in these products. Quality control of the analytical results indicated that the applied NAA procedure is adequate for human nail analysis. (author)
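
    For readers unfamiliar with the quantitative step: in relative (comparator) NAA, the element mass in a sample scales with the ratio of its gamma-peak counts to those of the co-irradiated standard. A minimal sketch with invented numbers, assuming decay and geometry corrections are already applied:

```python
def concentration_by_comparator(counts_sample, mass_sample_g,
                                counts_std, element_mass_std_ug):
    """Relative NAA: element micrograms in the sample follow the ratio of
    gamma-peak counts to the co-irradiated elemental standard."""
    element_ug = element_mass_std_ug * counts_sample / counts_std
    return element_ug / mass_sample_g  # concentration in ug/g

# Invented determination in a 0.15 g nail-clipping aliquot
print(f"{concentration_by_comparator(5200, 0.15, 48000, 2.0):.2f} ug/g")
```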

  12. ELECTRA © Launch and Re-Entry Safety Analysis Tool

    Science.gov (United States)

    Lazare, B.; Arnal, M. H.; Aussilhou, C.; Blazquez, A.; Chemama, F.

    2010-09-01

    The French Space Operation Act sets the protection of people, property, public health and the environment as the prime objective of the national technical regulations. In this frame, an independent technical assessment of French space operations is delegated to CNES. To perform this task, and also for its own operations, CNES needs efficient state-of-the-art tools for evaluating risks. The development of the ELECTRA© tool, undertaken in 2007, meets the requirement for precise quantification of the risks involved in the launch and re-entry of spacecraft. The ELECTRA© project draws on the proven expertise of CNES technical centers in the fields of flight analysis and safety, spaceflight dynamics and spacecraft design. The ELECTRA© tool was specifically designed to evaluate the risks involved in the re-entry and return to Earth of all or part of a spacecraft. It will also be used for locating and visualizing nominal or accidental re-entry zones and comparing them with suitable geographic data such as population density, urban areas and shipping lanes, among others. The method chosen for ELECTRA© consists of two main steps: calculating the possible re-entry trajectories of each fragment after the spacecraft breaks up, and calculating the risks while taking into account the energy of the fragments, the population density and the protection afforded by buildings. For launch operations and active re-entry, the risk calculation will be weighted by the probability of instantaneous failure of the spacecraft and integrated over the whole trajectory. ELECTRA©'s development is now at the end of the validation phase, the last step before delivery to users. The validation process has been performed in several ways: numerical verification of the risk formulation; benchmarking of the casualty area, fragment energy levels and the protection afforded by housing modules; adoption of space transportation industry best practices concerning dependability evaluation; and a benchmarking process for
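
    The two-step method described above reduces, per trajectory sample, to multiplying the instantaneous failure probability by the population exposed under the debris footprint. A toy sketch of that integral follows; none of the numbers are ELECTRA©'s, and the sheltering treatment is deliberately simplistic.

```python
def expected_casualties(track, casualty_area_m2, shelter_factor=0.8):
    """Sum over ground-track samples of P(failure now) * people exposed
    under the debris casualty area, discounted by building protection."""
    total = 0.0
    for p_fail, pop_per_m2 in track:
        exposed = pop_per_m2 * casualty_area_m2 * (1.0 - shelter_factor)
        total += p_fail * exposed
    return total

# Invented track: (instantaneous failure probability, population density /m^2)
track = [(1e-5, 0.0),     # over ocean
         (1e-5, 2e-4),    # rural overflight
         (5e-6, 3e-3)]    # brushing a town
print(f"expected casualties: {expected_casualties(track, 50.0):.2e}")
```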

  13. Comprehensive protocol of traceability during IVF: the result of a multicentre failure mode and effect analysis.

    Science.gov (United States)

    Rienzi, L; Bariani, F; Dalla Zorza, M; Albani, E; Benini, F; Chamayou, S; Minasi, M G; Parmegiani, L; Restelli, L; Vizziello, G; Costa, A Nanni

    2017-08-01

    Can traceability of gametes and embryos be ensured during IVF? The use of a simple and comprehensive traceability system that includes the most susceptible phases during the IVF process minimizes the risk of mismatches. Mismatches in IVF are very rare but unfortunately possible, with dramatic consequences for both patients and health care professionals. Traceability is thus a fundamental aspect of the treatment. A clear process of patient and cell identification involving witnessing protocols has to be in place in every unit. To identify potential failures in the traceability process and to develop strategies to mitigate the risk of mismatches, failure mode and effects analysis (FMEA) has previously been used effectively. The FMEA approach is, however, a subjective analysis, strictly related to specific protocols, and thus the results are not always widely applicable. To reduce subjectivity and to obtain a widely applicable, comprehensive protocol of traceability, a centrally coordinated multicentre FMEA was performed. Seven representative Italian centres (three public and four private) were selected. The study had a duration of 21 months (from April 2015 to December 2016) and was centrally coordinated by a team of experts: a risk analysis specialist, an expert embryologist and a specialist in human factors. The principal investigators of each centre were first instructed in proactive risk assessment and FMEA methodology. A multidisciplinary team to perform the FMEA analysis was then formed in each centre. After mapping the traceability process, each team identified the possible causes of mistakes in their protocol. A risk priority number (RPN) for each identified potential failure mode was calculated. The results of the FMEA analyses were centrally investigated and consistent corrective measures suggested. The teams performed new FMEA analyses after the recommended implementations. In each centre, this study involved: the laboratory director, the Quality Control & Quality
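
    The risk priority number used by the teams is the standard FMEA product of severity, occurrence and detectability ratings; a minimal sketch (the failure modes and scores below are invented, not the study's):

```python
def risk_priority_number(severity, occurrence, detection):
    """Classic FMEA score: each factor rated on a fixed scale (e.g. 1-10);
    higher RPN -> that failure mode is corrected first."""
    return severity * occurrence * detection

# Invented failure modes with invented ratings
failure_modes = {
    "dish mislabelled before insemination": risk_priority_number(10, 2, 3),
    "verbal-only witnessing of a transfer": risk_priority_number(8, 3, 2),
}
for mode, rpn in sorted(failure_modes.items(), key=lambda kv: -kv[1]):
    print(f"RPN {rpn:3d}  {mode}")
```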

  14. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N

    2013-01-01

    a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein...... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  15. MSP-tool: A VBA-based software tool for the analysis of multispecimen paleointensity data

    NARCIS (Netherlands)

    Monster, Marilyn W L; de Groot, Lennart V.; Dekkers, Mark J.

    2015-01-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are

  16. Financial incentive policies at workplace cafeterias for preventing obesity--a systematic review and meta-analysis (Protocol).

    Science.gov (United States)

    Sawada, Kimi; Ota, Erika; Shahrook, Sadequa; Mori, Rintaro

    2014-10-28

    Various studies are currently investigating ways to prevent lifestyle-related diseases and obesity among workers through interventions using incentive strategies, including price discounts for low-fat snacks and sugar-free beverages at workplace cafeterias or vending machines, and the provision of a free salad bar in cafeterias. Rather than assessing individual or group interventions, we will focus on the effectiveness of nutrition education programs at the population level, which primarily incorporate financial incentive strategies to prevent obesity. This paper describes the protocol of a systematic review that will examine the effectiveness of financial incentive programs at company cafeterias in improving dietary habits, nutrient intake, and obesity prevention. We will conduct searches in the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, Embase, and PsycINFO. Interventions will be assessed using data from randomized control trials (RCTs) and cluster RCTs. However, if few such trials exist, we will include quasi-RCTs. We will exclude controlled before-and-after studies and crossover RCTs. We will assess food-based interventions that include financial incentive strategies (discount strategies or social marketing) for workplace cafeterias, vending machines, and kiosks. Two authors will independently review studies for inclusion and will resolve differences by discussion and, if required, through consultation with a third author. We will assess the risk of bias of included studies according to the Cochrane Collaboration's "risk of bias" tool. The purpose of this paper is to outline the study protocol for a systematic review and meta-analysis that will investigate the effectiveness of population-level, incentive-focused interventions at the workplace cafeteria that aim to promote and prevent obesity. This review will give an important overview of the available evidence about the effectiveness of incentive-based environmental interventions to

  17. Dementia Population Risk Tool (DemPoRT): study protocol for a predictive algorithm assessing dementia risk in the community

    OpenAIRE

    Fisher, Stacey; Hsu, Amy; Mojaverian, Nassim; Taljaard, Monica; Huyer, Gregory; Manuel, Douglas G; Tanuseputro, Peter

    2017-01-01

    Introduction The burden of disease from dementia is a growing global concern as incidence increases dramatically with age, and average life expectancy has been increasing around the world. Planning for an ageing population requires reliable projections of dementia prevalence; however, existing population projections are simple and have poor predictive accuracy. The Dementia Population Risk Tool (DemPoRT) will predict incidence of dementia in the population setting using multivariable modellin...

  18. Microbial Genome Analysis and Comparisons: Web-based Protocols and Resources

    Science.gov (United States)

    Fully annotated genome sequences of many microorganisms are publicly available as a resource. However, in-depth analysis of these genomes using specialized tools is required to derive meaningful information. We describe here the utility of three powerful publicly available genome databases and ana...

  19. Effect of standardized training on the reliability of the Cochrane risk of bias assessment tool: a study protocol.

    Science.gov (United States)

    da Costa, Bruno R; Resta, Nina M; Beckett, Brooke; Israel-Stahre, Nicholas; Diaz, Alison; Johnston, Bradley C; Egger, Matthias; Jüni, Peter; Armijo-Olivo, Susan

    2014-12-13

    The Cochrane risk of bias (RoB) tool has been widely embraced by the systematic review community, but several studies have reported that its reliability is low. We aim to investigate whether training of raters, including objective and standardized instructions on how to assess risk of bias, can improve the reliability of this tool. We describe the methods that will be used in this investigation and present an intensive standardized training package for risk of bias assessment that could be used by contributors to the Cochrane Collaboration and other reviewers. This is a pilot study. We will first perform a systematic literature review to identify randomized clinical trials (RCTs) that will be used for risk of bias assessment. Using the identified RCTs, we will then do a randomized experiment, where raters will be allocated to two different training schemes: minimal training and intensive standardized training. We will calculate the chance-corrected weighted Kappa with 95% confidence intervals to quantify within- and between-group Kappa agreement for each of the domains of the risk of bias tool. To calculate between-group Kappa agreement, we will use risk of bias assessments from pairs of raters after resolution of disagreements. Between-group Kappa agreement will quantify the agreement between the risk of bias assessment of raters in the training groups and the risk of bias assessment of experienced raters. To compare agreement of raters under different training conditions, we will calculate differences between Kappa values with 95% confidence intervals. This study will investigate whether the reliability of the risk of bias tool can be improved by training raters using standardized instructions for risk of bias assessment. One group of inexperienced raters will receive intensive training on risk of bias assessment and the other will receive minimal training. By including a control group with minimal training, we will attempt to mimic what many review authors
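
    The chance-corrected weighted kappa planned here is available off the shelf; a minimal sketch with invented ratings (scikit-learn's cohen_kappa_score supports linear and quadratic weighting). The confidence intervals called for in the protocol would typically be added by bootstrapping.

```python
from sklearn.metrics import cohen_kappa_score

# Invented risk-of-bias ratings for ten trials by two raters
# (0 = low risk, 1 = unclear, 2 = high risk)
rater_a = [0, 1, 2, 0, 1, 2, 0, 0, 1, 2]
rater_b = [0, 1, 2, 1, 1, 2, 0, 1, 1, 2]

kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")
print(f"linear weighted kappa = {kappa:.3f}")
```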

  20. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    Science.gov (United States)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.

  1. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce large amounts of data, presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. The filtering may be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins which meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application designed to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using their Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers of any level of programming experience to benefit from efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.
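
    msBiodat's query backend runs server-side, but the screening it describes (keep only confident hits that match a curated annotation) reduces to a join and a filter; a minimal pandas sketch with invented column names and identifiers:

```python
import pandas as pd

# Invented MS output: one row per identified protein
ms_hits = pd.DataFrame({
    "uniprot": ["P12345", "Q67890", "P00533", "O43175"],
    "peptides": [12, 3, 25, 7],
})

# Invented annotation table, e.g. from a public database dump
annotations = pd.DataFrame({
    "uniprot": ["P12345", "P00533"],
    "go_term": ["GO:0016301", "GO:0016301"],  # kinase activity
})

# Keep confident hits (>= 5 peptides) carrying the GO term of interest
relevant = (ms_hits[ms_hits["peptides"] >= 5]
            .merge(annotations[annotations["go_term"] == "GO:0016301"],
                   on="uniprot"))
print(relevant)
```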

  2. Analysis and specification tools in relation to the APSE

    Science.gov (United States)

    Hendricks, John W.

    1986-01-01

    Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem has been translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada-based or Ada-related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening, after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few, chosen specification and design tools could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.

  3. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Full Text Available Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being the fact that software for corpus analysis has been developed with the linguist in mind, which means that such tools are generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.
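
    TranslatorBank is a standalone tool, but the concordancer at its core is a classic keyword-in-context (KWIC) lookup that is easy to sketch; the corpus and query below are invented.

```python
import re

def kwic(corpus_text, query, width=30):
    """Keyword-in-context lines for every match of `query` in the corpus."""
    out = []
    for m in re.finditer(re.escape(query), corpus_text, re.IGNORECASE):
        left = corpus_text[max(0, m.start() - width):m.start()]
        right = corpus_text[m.end():m.end() + width]
        out.append(f"{left:>{width}} [{m.group()}] {right}")
    return out

corpus = ("The turbine housing must be inspected before assembly. "
          "Mount the turbine housing on the test rig and torque the bolts.")
for line in kwic(corpus, "turbine housing"):
    print(line)
```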

  4. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    Science.gov (United States)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and user-programmable, the need for software modifications when configuring it for a particular spacecraft can be reduced or even eliminated. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.
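
    MMPAT itself is JPL software; the heart of any such simulator, a time-stepped energy balance between source, loads and battery, can be sketched in a few lines. Every parameter below is invented.

```python
def simulate_power(timeline_h, array_w, loads_w, battery_wh, capacity_wh, dt_h=1.0):
    """March an energy balance through time: surplus charges the battery
    (clipped at capacity), deficit discharges it. Returns state of charge."""
    soc = []
    for t in timeline_h:
        net_w = array_w(t) - loads_w(t)
        battery_wh = min(max(battery_wh + net_w * dt_h, 0.0), capacity_wh)
        soc.append(battery_wh / capacity_wh)
    return soc

# Invented orbiter: one hour of eclipse every two hours, constant 300 W load
array = lambda t: 0.0 if (t % 2) < 1 else 800.0   # W, crude eclipse model
loads = lambda t: 300.0                           # W
soc = simulate_power(range(24), array, loads, battery_wh=900.0, capacity_wh=1000.0)
print([round(s, 2) for s in soc[:6]])
```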

  5. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-04-01

    Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being the fact that software for corpus analysis has been developed with the linguist in mind, which means that such tools are generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.

  6. Failure mode and effects analysis of witnessing protocols for ensuring traceability during IVF.

    Science.gov (United States)

    Rienzi, Laura; Bariani, Fiorenza; Dalla Zorza, Michela; Romano, Stefania; Scarica, Catello; Maggiulli, Roberta; Nanni Costa, Alessandro; Ubaldi, Filippo Maria

    2015-10-01

    Traceability of cells during IVF is a fundamental aspect of treatment, and involves witnessing protocols. Failure mode and effects analysis (FMEA) is a method of identifying real or potential breakdowns in processes, and allows strategies to mitigate risks to be developed. To examine the risks associated with witnessing protocols, an FMEA was carried out in a busy IVF centre, before and after implementation of an electronic witnessing system (EWS). A multidisciplinary team was formed and moderated by human factors specialists. Possible causes of failures and their potential effects were identified, and a risk priority number (RPN) for each failure was calculated. A second FMEA analysis was carried out after implementation of an EWS. The IVF team identified seven main process phases, 19 associated process steps and 32 possible failure modes. The highest RPN was 30, confirming the relatively low risk that mismatches may occur in IVF when a manual witnessing system is used. The introduction of the EWS reduced the moderate-risk failure modes by two-thirds (highest RPN = 10). In our experience, FMEA is effective in supporting multidisciplinary IVF groups to understand the witnessing process, identify critical steps and plan changes in practice that enhance safety.

  7. Analysis of agreement between cardiac risk stratification protocols applied to participants of a center for cardiac rehabilitation

    Directory of Open Access Journals (Sweden)

    Ana A. S. Santos

    2016-01-01

    Full Text Available ABSTRACT Background Cardiac risk stratification is related to the risk of the occurrence of events induced by exercise. Despite the existence of several protocols to calculate risk stratification, studies indicating whether these protocols agree with one another are still lacking. Objective To evaluate the agreement between the existing protocols on cardiac risk rating in cardiac patients. Method The records of 50 patients from a cardiac rehabilitation program were analyzed, from which the following information was extracted: age, sex, weight, height, clinical diagnosis, medical history, risk factors, associated diseases, and the results from the most recent laboratory and complementary tests performed. This information was used for risk stratification of the patients according to the protocols of the American College of Sports Medicine, the Brazilian Society of Cardiology, the American Heart Association, the protocol designed by Frederic J. Pashkow, the American Association of Cardiovascular and Pulmonary Rehabilitation, the Société Française de Cardiologie, and the Sociedad Española de Cardiología. Descriptive statistics were used to characterize the sample, and the agreement between the protocols was calculated using the Kappa coefficient. Differences were considered significant at the 5% level. Results Of the 21 analyses of agreement, 12 were considered significant between the protocols used for risk classification, with nine classified as moderate and three as low. No agreements were classified as excellent. Different proportions were observed in each risk category, with significant differences between the protocols for all risk categories. Conclusion The agreement between the protocols was considered low to moderate, and the risk proportions differed between protocols.

  8. Abstract Interfaces for Data Analysis Component Architecture for Data Analysis Tools

    CERN Document Server

    Barrand, G; Dönszelmann, M; Johnson, A; Pfeiffer, A

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interfaces and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of each component individually. The interfaces have been defined in Java and C++ and i...
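
    The AIDA interfaces were defined in Java and C++; the decoupling idea, in which a plotter or fitter depends only on a small abstract interface and never on a concrete histogram, translates directly. A compact Python rendering (all class names invented):

```python
from abc import ABC, abstractmethod

class IHistogram1D(ABC):
    """Abstract histogram: consumers depend on this, not on implementations."""

    @abstractmethod
    def fill(self, x: float, weight: float = 1.0) -> None: ...

    @abstractmethod
    def bin_height(self, index: int) -> float: ...

class FixedBinHistogram(IHistogram1D):
    def __init__(self, nbins: int, lo: float, hi: float):
        self.nbins, self.lo, self.hi = nbins, lo, hi
        self.heights = [0.0] * nbins

    def fill(self, x: float, weight: float = 1.0) -> None:
        if self.lo <= x < self.hi:
            i = int((x - self.lo) / (self.hi - self.lo) * self.nbins)
            self.heights[i] += weight

    def bin_height(self, index: int) -> float:
        return self.heights[index]

def plot_ascii(h: IHistogram1D, nbins: int) -> None:
    # A "Plotter" component that sees only the interface
    for i in range(nbins):
        print(f"bin {i} | {'#' * int(h.bin_height(i))}")

h = FixedBinHistogram(5, 0.0, 5.0)
for x in (0.5, 1.2, 1.7, 3.3, 3.4, 3.9):
    h.fill(x)
plot_ascii(h, 5)
```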

  9. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: the funnel diagram, which is the distribution of the estimates as a function of their precision; the funnel asymmetry test (FAT); and the meta average, of which the PET is one estimate. The FAT-PET MRA is a meta regression analysis...
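
    For readers new to the machinery: the FAT-PET meta-regression fits each reported estimate against its standard error, so the intercept (the PET) is the precision-corrected effect and the slope tests funnel asymmetry (the FAT). A minimal sketch on simulated estimates:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
true_effect = 0.2
se = rng.uniform(0.05, 0.5, 200)                  # varying study precision
est = true_effect + rng.normal(0, se) + 0.8 * se  # plus a selection-style bias

# FAT-PET: estimate_i = b0 + b1 * SE_i, weighted by precision 1/SE^2
fit = sm.WLS(est, sm.add_constant(se), weights=1.0 / se**2).fit()
b0, b1 = fit.params
print(f"PET (corrected effect) = {b0:.3f}, FAT slope = {b1:.3f}")
print(f"FAT p-value = {fit.pvalues[1]:.4f}")      # asymmetry detected?
```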

  10. Nuclear Fuel Cycle Analysis and Simulation Tool (FAST)

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kwon, Eun Ha; Kim, Ho Dong

    2005-06-15

    This paper describes the Nuclear Fuel Cycle Analysis and Simulation Tool (FAST), which has been developed by the Korea Atomic Energy Research Institute (KAERI). Categorizing various mixes of nuclear reactors and fuel cycles into 11 scenario groups, FAST calculates all the required quantities for each nuclear fuel cycle component, such as mining, conversion, enrichment and fuel fabrication, for each scenario. A major advantage of FAST is that the code employs an MS Excel spreadsheet with Visual Basic for Applications, allowing users to manipulate it with ease. Calculation is also quick enough to compare different options in a considerably short time. This user-friendly simulation code is expected to be beneficial to further studies on the nuclear fuel cycle to find the best options for the future, with proliferation risk, environmental impact and economic costs all considered.
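
    FAST's spreadsheets are not reproduced in this record, but the kind of quantity they compute for the enrichment component follows from the standard material-balance relations; a sketch of natural-uranium feed and separative work for a given product (textbook formulas, invented assays):

```python
import math

def value_fn(x):
    """Separation potential V(x) used in SWU accounting."""
    return (2 * x - 1) * math.log(x / (1 - x))

def enrichment_step(product_kg, xp=0.045, xf=0.00711, xt=0.0025):
    """Feed and SWU needed for `product_kg` of uranium at product assay xp,
    from natural feed (assay xf) with tails assay xt."""
    feed_kg = product_kg * (xp - xt) / (xf - xt)
    tails_kg = feed_kg - product_kg
    swu = (product_kg * value_fn(xp) + tails_kg * value_fn(xt)
           - feed_kg * value_fn(xf))
    return feed_kg, swu

feed, swu = enrichment_step(1000.0)  # one tonne of 4.5%-enriched product
print(f"natural U feed: {feed:,.0f} kg, separative work: {swu:,.0f} SWU")
```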

  11. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be addressed in software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which UML captures in the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the ability of modern UML modelling tools to lay out UML sequence diagrams automatically, and analyses them according to criteria required for diagram perception.

  12. SINEBase: a database and tool for SINE analysis.

    Science.gov (United States)

    Vassetzky, Nikita S; Kramerov, Dmitri A

    2013-01-01

    SINEBase (http://sines.eimb.ru) integrates the revisited body of knowledge about short interspersed elements (SINEs). A set of formal definitions concerning SINEs was introduced. All available sequence data were screened through these definitions and the genetic elements misidentified as SINEs were discarded. As a result, 175 SINE families have been recognized in animals, flowering plants and green algae. These families were classified by the modular structure of their nucleotide sequences and the frequencies of different patterns were evaluated. These data formed the basis for the database of SINEs. The SINEBase website can be used in two ways: first, to explore the database of SINE families, and second, to analyse candidate SINE sequences using specifically developed tools. This article presents an overview of the database and the process of SINE identification and analysis.

  13. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.

  14. In silico tools for the analysis of antibiotic biosynthetic pathways

    DEFF Research Database (Denmark)

    Weber, Tilmann

    2014-01-01

    Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel...... screening methods based on the genome sequences of potential producing organisms. The basic principle of such genome mining approaches is to identify genes, which are involved in the biosynthesis of such molecules, and to predict the products of the identified pathways. Thus, bioinformatics methods...... and tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data....

  15. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise, which is very common in industrial environments. The PFA Toolbox can be used to face these scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.
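
    The PFA Toolbox itself is MATLAB; the baseline problem it generalizes, estimating unknown fluxes v under the steady-state constraint S·v = 0 given a few measured fluxes, can be sketched as a weighted least-squares problem. The toy network below is invented.

```python
import numpy as np

# Invented network at steady state (S @ v = 0):
#   v0: -> A,   v1: A -> B,   v2: B -> (out)
S = np.array([
    [1.0, -1.0,  0.0],   # balance of metabolite A
    [0.0,  1.0, -1.0],   # balance of metabolite B
])

measured = {0: 10.0}     # only the uptake flux v0 is measured
w = 100.0                # weight pulling the solution toward measurements

# Stack the balance rows with heavily weighted measurement rows
rows, rhs = [S], [np.zeros(S.shape[0])]
for i, val in measured.items():
    e = np.zeros(S.shape[1])
    e[i] = w
    rows.append(e[None, :])
    rhs.append([w * val])
A = np.vstack(rows)
b = np.concatenate(rhs)

v, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated fluxes:", np.round(v, 3))  # -> all ~10.0
```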

  16. Steam Generator Analysis Tools and Modeling of Degradation Mechanisms

    International Nuclear Information System (INIS)

    Yetisir, M.; Pietralik, J.; Tapping, R.L.

    2004-01-01

    The degradation of steam generators (SGs) has a significant effect on nuclear heat transport system effectiveness and on the lifetime and overall efficiency of a nuclear power plant. Hence, quantification of the effects of degradation mechanisms is an integral part of an SG degradation management strategy. Numerical analysis tools such as THIRST, a 3-dimensional (3D) thermal hydraulics code for recirculating SGs; SLUDGE, a 3D sludge prediction code; CHECWORKS, a flow-accelerated corrosion prediction code for nuclear piping; PIPO-FE, an SG tube vibration code; and VIBIC and H3DMAP, 3D non-linear finite-element codes to predict SG tube fretting wear, can be used to assess the impacts of various maintenance activities on SG thermal performance. These tools are also found to be invaluable at the design stage to influence the design by determining margins or by helping the designers minimize or avoid known degradation mechanisms. In this paper, the aforementioned numerical tools and their application to degradation mechanisms in CANDU recirculating SGs are described. In addition, the following degradation mechanisms are identified and their effects on SG thermal efficiency and lifetime are quantified: primary-side fouling, secondary-side fouling, fretting wear, and flow-accelerated corrosion (FAC). Primary-side tube inner-diameter fouling has been a major contributor to SG thermal degradation. Using the results of thermalhydraulic analysis and field data, fouling margins are calculated. The individual effects of primary- and secondary-side fouling are separated through analyses, which allows station operators to decide what type of maintenance activity to perform and when to perform it. Prediction of the fretting-wear rate of tubes allows designers to decide on the number and locations of support plates and U-bend supports. The prediction of FAC rates for SG internals allows designers to select proper materials, and allows operators to adjust the SG maintenance

  17. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section has a quantitative focus, shifting from theory to an empirical approach, and subsequently presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms from either an IT or a marketing branch. The paper contributes to highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  18. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Coated tools are regularly used in today's metal cutting industry because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have significantly contributed to improvements in cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate also decrease as the friction coefficient decreases. In the present study, stress analysis is carried out for an HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt is also made to determine the damage zones during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  19. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    Directory of Open Access Journals (Sweden)

    Matthew V Caruana

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns.

  20. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    Science.gov (United States)

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  1. The Seamless Transfer-of-Care Protocol: a randomized controlled trial assessing the efficacy of an electronic transfer-of-care communication tool

    Directory of Open Access Journals (Sweden)

    Okoniewska Barbara M

    2012-11-01

    evaluation will assess the cost per life saved, cost per readmission avoided and cost per QALY gained with the TOC communication tool compared to traditional dictation summaries. Discussion: This paper outlines the study protocol for a randomized controlled trial evaluating an electronic transfer-of-care communication tool, with sufficient statistical power to assess the impact of the tool on the significant outcomes of post-discharge death or readmission. The study findings will inform health systems around the world on the potential benefits of such tools, and the value for money associated with their widespread implementation. Trial registration: ClinicalTrials.gov NCT01402609.

  2. The Seamless Transfer-of-Care Protocol: a randomized controlled trial assessing the efficacy of an electronic transfer-of-care communication tool.

    Science.gov (United States)

    Okoniewska, Barbara M; Santana, Maria J; Holroyd-Leduc, Jayna; Flemons, Ward; O'Beirne, Maeve; White, Deborah; Clement, Fiona; Forster, Alan; Ghali, William A

    2012-11-21

    readmission avoided and cost per QALY gained with the TOC communication tool compared to traditional dictation summaries. This paper outlines the study protocol for a randomized controlled trial evaluating an electronic transfer-of-care communication tool, with sufficient statistical power to assess the impact of the tool on the significant outcomes of post-discharge death or readmission. The study findings will inform health systems around the world on the potential benefits of such tools, and the value for money associated with their widespread implementation. ClinicalTrials.gov NCT01402609.

  3. Data Collection Protocols for Adhesive Testing Results Using the Materials Selection and Analysis Tool

    Science.gov (United States)

    2012-06-01

    processing envelope will also be equally broad, as the Army employs thermosetting, thermoplastic, paste, and film adhesives cured using a variety of... Testing: ASTM D 1002 was the basis standard used for the single-lap-joint testing. Aluminum adherends (Alloy 2024-T3) were used with dimensions of... "Surface Preparation of Aluminum Alloys to Be Adhesively Bonded in Honeycomb Shelter Panels." ASTM International, West Conshohocken, PA, 2001, DOI

  4. MetaFIND: A feature analysis tool for metabolomics data

    Directory of Open Access Journals (Sweden)

    Cunningham Pádraig

    2008-11-01

    Background: Metabolomics, or metabonomics, refers to the quantitative analysis of all metabolites present within a biological sample and is generally carried out using NMR spectroscopy or Mass Spectrometry. Such analysis produces a set of peaks, or features, indicative of the metabolic composition of the sample and may be used as a basis for sample classification. Feature selection may be employed to improve classification accuracy or aid model explanation by establishing a subset of class discriminating features. Factors such as experimental noise, choice of technique and threshold selection may adversely affect the set of selected features retrieved. Furthermore, the high dimensionality and multi-collinearity inherent within metabolomics data may exacerbate discrepancies between the set of features retrieved and those required to provide a complete explanation of metabolite signatures. Given these issues, the latter in particular, we present the MetaFIND application for 'post-feature selection' correlation analysis of metabolomics data. Results: In our evaluation we show how MetaFIND may be used to elucidate metabolite signatures from the set of features selected by diverse techniques over two metabolomics datasets. Importantly, we also show how MetaFIND may augment standard feature selection and aid the discovery of additional significant features, including those which represent novel class discriminating metabolites. MetaFIND also supports the discovery of higher level metabolite correlations. Conclusion: Standard feature selection techniques may fail to capture the full set of relevant features in the case of high dimensional, multi-collinear metabolomics data. We show that the MetaFIND 'post-feature selection' analysis tool may aid metabolite signature elucidation, feature discovery and inference of metabolic correlations.
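
    As an illustration of the 'post-feature selection' correlation idea (not MetaFIND's actual algorithm or interface), the following Python sketch ranks the peaks that a feature selector left out by their strongest absolute correlation with any selected peak; the data and peak names are invented.

    ```python
    # Post-feature-selection correlation sketch: given peaks already chosen by
    # a feature selector, rank the remaining peaks by their strongest absolute
    # correlation with any selected peak (candidate co-varying signals).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    X = pd.DataFrame(rng.normal(size=(40, 6)),
                     columns=[f"peak{i}" for i in range(6)])
    X["peak5"] = X["peak0"] * 0.9 + rng.normal(scale=0.2, size=40)  # hidden partner

    selected = ["peak0", "peak2"]                      # from any selection method
    rest = [c for c in X.columns if c not in selected]
    corr = X[rest].apply(lambda col: X[selected].corrwith(col).abs().max())
    print(corr.sort_values(ascending=False))           # peak5 should rank first
    ```

    In a real dataset, a high-ranking left-out peak is a candidate co-varying metabolite signal worth inspecting alongside the selected features.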

  5. Split bolus technique in polytrauma: a prospective study on scan protocols for trauma analysis

    NARCIS (Netherlands)

    Beenen, Ludo F. M.; Sierink, Joanne C.; Kolkman, Saskia; Nio, C. Yung; Saltzherr, Teun Peter; Dijkgraaf, Marcel G. W.; Goslings, J. Carel

    2015-01-01

    For the evaluation of severely injured trauma patients, a variety of total-body computed tomography (CT) scanning protocols exist. Frequently, multiple-pass protocols are used. A split-bolus contrast protocol can reduce the number of passes through the body, and thereby radiation exposure, in this

  6. A standardised protocol for texture feature analysis of endoscopic images in gynaecological cancer

    Directory of Open Access Journals (Sweden)

    Pattichis Marios S

    2007-11-01

    Background: In the development of tissue classification methods, classifiers rely on significant differences between texture features extracted from normal and abnormal regions. Yet, significant differences can arise due to variations in the image acquisition method. For endoscopic imaging of the endometrium, we propose a standardized image acquisition protocol to eliminate significant statistical differences due to variations in: (i) the distance from the tissue (panoramic vs close up), (ii) differences in viewing angles, and (iii) color correction. Methods: We investigate texture feature variability for a variety of targets encountered in clinical endoscopy. All images were captured at clinically optimum illumination and focus using 720 × 576 pixels and 24-bit color for: (i) a variety of testing targets from a color palette with a known color distribution, (ii) different viewing angles, and (iii) two different distances from calf endometrial tissue and from a chicken cavity. Also, human images from the endometrium were captured and analysed. For texture feature analysis, three different sets were considered: (i) Statistical Features (SF), (ii) Spatial Gray Level Dependence Matrices (SGLDM), and (iii) Gray Level Difference Statistics (GLDS). All images were gamma corrected and the extracted texture feature values were compared against the texture feature values extracted from the uncorrected images. Statistical tests were applied to compare images from different viewing conditions so as to determine any significant differences. Results: For the proposed acquisition procedure, results indicate that there is no significant difference in texture features between the panoramic and close up views or between angles. For a calibrated target image, gamma correction provided an acquired image that was a significantly better approximation to the original target image. In turn, this implies that the texture features extracted from the corrected images provided for better
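
    Of the three feature sets, the SGLDM features are classical gray-level co-occurrence statistics, which are straightforward to reproduce. A minimal Python sketch using scikit-image follows; the random images merely stand in for frames captured under two viewing conditions.

    ```python
    # SGLDM-style texture features via a gray-level co-occurrence matrix (GLCM),
    # as one way to compare texture between two acquisition conditions.
    # Uses scikit-image; images here are random stand-ins for endoscopic frames.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(1)
    panoramic = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
    close_up = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)

    def glcm_features(img):
        glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        return {p: graycoprops(glcm, p).mean()
                for p in ("contrast", "homogeneity", "energy", "correlation")}

    print(glcm_features(panoramic))
    print(glcm_features(close_up))  # compare feature sets across viewing conditions
    ```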

  7. Portfolio: a prototype workstation for development and evaluation of tools for analysis and management of digital portal images

    International Nuclear Information System (INIS)

    Boxwala, Aziz A.; Chaney, Edward L.; Fritsch, Daniel S.; Friedman, Charles P.; Rosenman, Julian G.

    1998-01-01

    Purpose: The purpose of this investigation was to design and implement a prototype physician workstation, called PortFolio, as a platform for developing and evaluating, by means of controlled observer studies, user interfaces and interactive tools for analyzing and managing digital portal images. The first observer study was designed to measure physician acceptance of workstation technology, as an alternative to a view box, for inspection and analysis of portal images for detection of treatment setup errors. Methods and Materials: The observer study was conducted in a controlled experimental setting to evaluate physician acceptance of the prototype workstation technology exemplified by PortFolio. PortFolio incorporates a windows user interface, a compact kit of carefully selected image analysis tools, and an object-oriented data base infrastructure. The kit evaluated in the observer study included tools for contrast enhancement, registration, and multimodal image visualization. Acceptance was measured in the context of performing portal image analysis in a structured protocol designed to simulate clinical practice. The acceptability and usage patterns were measured from semistructured questionnaires and logs of user interactions. Results: Radiation oncologists, the subjects for this study, perceived the tools in PortFolio to be acceptable clinical aids. Concerns were expressed regarding user efficiency, particularly with respect to the image registration tools. Conclusions: The results of our observer study indicate that workstation technology is acceptable to radiation oncologists as an alternative to a view box for clinical detection of setup errors from digital portal images. Improvements in implementation, including more tools and a greater degree of automation in the image analysis tasks, are needed to make PortFolio more clinically practical

  8. HISTOPATHOLOGICAL AND CYTOLOGICAL ANALYSIS OF TRANSMISSIBLE VENEREAL TUMOR IN DOGS AFTER TWO TREATMENT PROTOCOLS

    Directory of Open Access Journals (Sweden)

    Fabiana Aguena Sales Lapa

    2012-06-01

    The transmissible venereal tumor (TVT) is a contagious neoplasm of round cells that frequently affects dogs. Treatment consists of chemotherapy, with vincristine alone being the most effective option; however, the emergence of resistance to this agent, due to the multidrug resistance conferred by P-glycoprotein (P-gp), a transporter protein encoded by the MDR1 gene, has led to its association with other drugs. Recent studies demonstrated the antitumoral effect of the avermectins when combined with vincristine in the treatment of some neoplasms. Therefore, the objective of the present study was to compare the effectiveness of the standard vincristine-only treatment of TVT with that of combined treatment with vincristine and ivermectin, evaluated through the number of applications required under the two protocols and through histopathological and cytological analysis of 50 dogs diagnosed with TVT during the period 2007 to 2010. The combined protocol significantly reduced the number of applications, and the cytological and histopathological findings corroborate the hypothesis that the combination of vincristine and ivermectin promotes faster healing than the use of vincristine alone. Combined treatment with vincristine and ivermectin could in the future be an excellent therapeutic alternative for the treatment of TVT, probably reducing resistance to vincristine while simultaneously reducing the cost of TVT treatment and promoting a faster recovery of the dog.

  9. Trade-Space Analysis Tool for Constellations (TAT-C)

    Science.gov (United States)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja

    2016-01-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a-priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost-risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the trade space of variables for pre-defined science, cost and risk goals and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C, including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; and an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, first with inputs from the Knowledge Base, then, in collaboration with the Orbit Coverage, Reduction Metrics, and Cost Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as ad-hoc constellations, and its cost model represents an aggregate model consisting of
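
    As a toy illustration of the trade-space search loop (with invented design variables and coverage/cost models, not TAT-C's actual modules), the following Python sketch enumerates Walker-style constellation options and keeps the Pareto-efficient designs:

    ```python
    # Trade-space sketch: enumerate constellation options, score each with toy
    # coverage and cost models, and keep the Pareto-efficient designs.
    from itertools import product

    planes_opts = (1, 2, 3, 4)
    sats_per_plane_opts = (1, 2, 4, 8)
    altitudes_km = (500, 700)

    def coverage(planes, per_plane, alt):      # toy: more sats / higher = better
        return 1.0 - 0.9 ** (planes * per_plane) * (500.0 / alt)

    def cost(planes, per_plane, alt):          # toy: per-sat cost plus launches
        return planes * per_plane * 5.0 + planes * 10.0 + alt * 0.01

    designs = [(p, s, a, coverage(p, s, a), cost(p, s, a))
               for p, s, a in product(planes_opts, sats_per_plane_opts, altitudes_km)]

    # Keep designs not dominated by any other (higher coverage at lower cost).
    pareto = [d for d in designs
              if not any(o[3] >= d[3] and o[4] <= d[4] and o != d for o in designs)]
    for p, s, a, cov, c in sorted(pareto, key=lambda d: d[4]):
        print(f"{p} planes x {s} sats @ {a} km: coverage={cov:.2f} cost={c:.0f}")
    ```

    A real iterator would replace the toy scoring functions with orbit-propagation-based coverage and a calibrated cost model, but the enumerate-score-filter structure is the same.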

  10. Thermal buckling comparative analysis using Different FE (Finite Element) tools

    Energy Technology Data Exchange (ETDEWEB)

    Banasiak, Waldemar; Labouriau, Pedro [INTECSEA do Brasil, Rio de Janeiro, RJ (Brazil); Burnett, Christopher [INTECSEA UK, Surrey (United Kingdom); Falepin, Hendrik [Fugro Engineers SA/NV, Brussels (Belgium)

    2009-12-19

    High operational temperature and pressure in offshore pipelines may lead to unexpected lateral movements, sometimes called lateral buckling, which can have serious consequences for the integrity of the pipeline. The phenomenon of lateral buckling in offshore pipelines needs to be analysed in the design phase using FEM. The analysis should take into account many parameters, including operational temperature and pressure, fluid characteristics, seabed profile, soil parameters, coatings of the pipe, free spans, etc. The buckling initiation force is sensitive to small changes in any initial geometric out-of-straightness, thus the modeling of the as-laid state of the pipeline is an important part of the design process. Recently some dedicated finite element programs have been created, making modeling of the offshore environment more convenient than has been the case with general purpose finite element software. The present paper aims to compare thermal buckling analyses of a subsea pipeline performed using different finite element tools, i.e. general purpose programs (ANSYS, ABAQUS) and dedicated software (SAGE Profile 3D), for a single pipeline resting on the seabed. The analyses considered the pipeline resting on a flat seabed with small levels of out-of-straightness initiating the lateral buckling. The results show quite good agreement for buckling in the elastic range, and in the conclusions further comparative analyses with sensitivity cases are recommended. (author)

  11. Orienting the Neighborhood: A Subdivision Energy Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-01-01

    In subdivisions, house orientations are largely determined by street layout. The resulting house orientations affect energy consumption (annual and on-peak) for heating and cooling, depending on window area distributions and shading from neighboring houses. House orientations also affect energy production (annual and on-peak) from solar thermal and photovoltaic systems, depending on available roof surfaces. Therefore, house orientations fundamentally influence both energy consumption and production, and an appropriate street layout is a prerequisite for taking full advantage of energy efficiency and renewable energy opportunities. The potential influence of street layout on solar performance is often acknowledged, but solar and energy issues must compete with many other criteria and constraints that influence subdivision street layout. When only general guidelines regarding energy are available, these factors may be ignored or have limited effect. Also, typical guidelines are often not site-specific and do not account for local parameters such as climate and the time value of energy. For energy to be given its due consideration in subdivision design, energy impacts need to be accurately quantified and displayed interactively to facilitate analysis of design alternatives. This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  12. Recurrence time statistics: versatile tools for genomic DNA sequence analysis.

    Science.gov (United States)

    Cao, Yinhe; Tung, Wen-Wen; Gao, J B

    2004-01-01

    With the completion of the human and a few model organisms' genomes, and the genomes of many other organisms waiting to be sequenced, it has become increasingly important to develop faster computational tools which are capable of easily identifying the structures and extracting features from DNA sequences. One of the more important structures in a DNA sequence is repeat-related. Often they have to be masked before protein coding regions along a DNA sequence are to be identified or redundant expressed sequence tags (ESTs) are to be sequenced. Here we report a novel recurrence time based method for sequence analysis. The method can conveniently study all kinds of periodicity and exhaustively find all repeat-related features from a genomic DNA sequence. An efficient codon index is also derived from the recurrence time statistics, which has the salient features of being largely species-independent and working well on very short sequences. Efficient codon indices are key elements of successful gene finding algorithms, and are particularly useful for determining whether a suspected EST belongs to a coding or non-coding region. We illustrate the power of the method by studying the genomes of E. coli, the yeast S. cerevisiae, the nematode worm C. elegans, and the human, Homo sapiens. Computationally, our method is very efficient. It allows us to carry out analysis of genomes on the whole genomic scale on a PC.
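
    The central quantity is simple to compute: for each k-mer, the recurrence time is the gap between successive occurrences along the sequence. A minimal Python sketch on a toy sequence:

    ```python
    # Recurrence-time sketch: for each k-mer, record the gaps between successive
    # occurrences along the sequence. Periodicities and repeats show up as
    # over-represented recurrence times.
    from collections import defaultdict

    def recurrence_times(seq, k=3):
        last_seen, times = {}, defaultdict(list)
        for i in range(len(seq) - k + 1):
            kmer = seq[i:i + k]
            if kmer in last_seen:
                times[kmer].append(i - last_seen[kmer])
            last_seen[kmer] = i
        return times

    seq = "ATGGCGATGACGATGTTGATG"     # toy sequence: ATG recurs every 6 bases
    print(recurrence_times(seq)["ATG"])  # -> [6, 6, 6]
    ```

    Histogramming these gaps across a genome exposes periodicities, such as the period-3 structure of codons in coding regions, while repeat families appear as over-represented recurrence times.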

  13. Actigraphy and motion analysis: new tools for psychiatry.

    Science.gov (United States)

    Teicher, M H

    1995-01-01

    Altered locomotor activity is a cardinal sign of several psychiatric disorders. With advances in technology, activity can now be measured precisely. Contemporary studies quantifying activity in psychiatric patients are reviewed. Studies were located by a Medline search (1965 to present; English language only) cross-referencing motor activity and major psychiatric disorders. The review focused on mood disorders and attention-deficit hyperactivity disorder (ADHD). Activity levels are elevated in mania, agitated depression, and ADHD and attenuated in bipolar depression and seasonal depression. The percentage of low-level daytime activity is directly related to severity of depression, and change in this parameter accurately mirrors recovery. Demanding cognitive tasks elicit fidgeting in children with ADHD, and precise measures of activity and attention may provide a sensitive and specific marker for this disorder. Circadian rhythm analysis enhances the sophistication of activity measures. Affective disorders in children and adolescents are characterized by an attenuated circadian rhythm and an enhanced 12-hour harmonic rhythm (diurnal variation). Circadian analysis may help to distinguish between the activity patterns of mania (dysregulated) and ADHD (intact or enhanced). Persistence of hyperactivity or circadian dysregulation in bipolar patients treated with lithium appears to predict rapid relapse once medication is discontinued. Activity monitoring is a valuable research tool, with the potential to aid clinicians in diagnosis and in prediction of treatment response.
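
    Circadian rhythm analysis of actigraphy counts is often performed with a cosinor model, which is linear in its parameters and can be fitted by ordinary least squares. A Python sketch under that assumption, using synthetic counts rather than a clinical recording:

    ```python
    # Cosinor sketch: fit activity counts to
    #   y = mesor + beta*cos(w*t) + gamma*sin(w*t),  w = 2*pi/24,
    # by linear least squares; amplitude and peak time follow from (beta, gamma).
    import numpy as np

    hours = np.arange(0, 72, 0.5)                            # 3 days, 30-min epochs
    true = 50 + 20 * np.cos(2 * np.pi * (hours - 14) / 24)   # peak at 14:00
    counts = true + np.random.default_rng(2).normal(scale=5, size=hours.size)

    w = 2 * np.pi / 24                                       # 24-h component
    X = np.column_stack([np.ones_like(hours), np.cos(w * hours), np.sin(w * hours)])
    mesor, beta, gamma = np.linalg.lstsq(X, counts, rcond=None)[0]
    amplitude = np.hypot(beta, gamma)
    acrophase_h = (np.arctan2(gamma, beta) / w) % 24         # hour of peak activity

    print(f"mesor={mesor:.1f} amplitude={amplitude:.1f} peak at {acrophase_h:.1f} h")
    ```

    An attenuated fitted amplitude is one way the "attenuated circadian rhythm" described in the abstract would show up numerically; adding a 12-hour harmonic term to the design matrix captures the diurnal-variation component in the same least-squares framework.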

  14. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in the Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
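
    A basic measurement such a tool provides is total harmonic distortion (THD) under sine excitation. The original toolkit is Matlab code; the following Python sketch only illustrates the measurement, with a tanh nonlinearity standing in for a tube-like stage:

    ```python
    # Distortion-analysis sketch: drive a memoryless nonlinearity with a sine
    # and estimate THD from the FFT magnitude spectrum.
    import numpy as np

    fs, f0, n = 48000, 1000, 48000                 # 1 s of a 1 kHz tone at 48 kHz
    t = np.arange(n) / fs
    x = np.sin(2 * np.pi * f0 * t)
    y = np.tanh(2.0 * x)                           # stand-in for a tube-like stage

    spec = np.abs(np.fft.rfft(y * np.hanning(n)))
    bins = [int(k * f0 * n / fs) for k in range(1, 6)]  # fundamental + harmonics 2..5
    amps = [spec[b] for b in bins]
    thd = np.sqrt(sum(a * a for a in amps[1:])) / amps[0]
    print(f"THD = {100 * thd:.1f} %")
    ```

    Because tanh is an odd function, only the odd harmonics carry energy here; an asymmetric nonlinearity would also raise the even harmonics, which is exactly the kind of signature a distortion-analysis tool is meant to reveal.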

  15. Abstract interfaces for data analysis - component architecture for data analysis tools

    International Nuclear Information System (INIS)

    Barrand, G.; Binko, P.; Doenszelmann, M.; Pfeiffer, A.; Johnson, A.

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis'99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. The authors give an overview of the architecture and design of the various components for data analysis as discussed in AIDA
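
    The architectural point, namely components coupled only through abstract interfaces, can be sketched in a few lines. The names below are illustrative, not AIDA's actual interfaces:

    ```python
    # Component-architecture sketch: analysis categories defined as abstract
    # interfaces so that implementations stay interchangeable.
    from abc import ABC, abstractmethod

    class IHistogram(ABC):
        @abstractmethod
        def fill(self, x: float, weight: float = 1.0) -> None: ...
        @abstractmethod
        def bins(self) -> int: ...
        @abstractmethod
        def bin_height(self, index: int) -> float: ...

    class IPlotter(ABC):
        @abstractmethod
        def plot(self, hist: IHistogram) -> None: ...

    class FixedBinHistogram(IHistogram):
        def __init__(self, nbins: int, lo: float, hi: float):
            self._nbins, self._lo, self._hi = nbins, lo, hi
            self._heights = [0.0] * nbins
        def fill(self, x, weight=1.0):
            if self._lo <= x < self._hi:
                i = int((x - self._lo) / (self._hi - self._lo) * self._nbins)
                self._heights[i] += weight
        def bins(self):
            return self._nbins
        def bin_height(self, index):
            return self._heights[index]

    class AsciiPlotter(IPlotter):
        def plot(self, hist):
            for i in range(hist.bins()):        # depends only on the contract
                print(f"bin {i}: " + "#" * int(hist.bin_height(i)))

    h = FixedBinHistogram(4, 0.0, 4.0)
    for x in (0.3, 1.1, 1.7, 2.5, 1.2):
        h.fill(x)
    AsciiPlotter().plot(h)
    ```

    Because AsciiPlotter sees only the IHistogram contract, any histogram implementation can be swapped in without touching the plotting code, which is the decoupling the abstract interfaces aim for.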

  16. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.

    1999-01-01

    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models...... the number of forming cycles is calculated first to crack initiation and then during crack growth to fatal failure. An investigation of a critical die insert in an industrial cold-forging tool as regards the influence of notch radius, the amount and method of pre-stressing and the selected tool material...

  17. Development of a Ground Test and Analysis Protocol to Support NASA's NextSTEP Phase 2 Habitation Concepts

    Science.gov (United States)

    Beaton, Kara H.; Chappell, Steven P.; Bekdash, Omar S.; Gernhardt, Michael L.

    2018-01-01

    The NASA Next Space Technologies for Exploration Partnerships (NextSTEP) program is a public-private partnership model that seeks commercial development of deep space exploration capabilities to support extensive human spaceflight missions around and beyond cislunar space. NASA first issued the Phase 1 NextSTEP Broad Agency Announcement to U.S. industries in 2014, which called for innovative cislunar habitation concepts that leveraged commercialization plans for low Earth orbit. These habitats will be part of the Deep Space Gateway (DSG), the cislunar space station planned by NASA for construction in the 2020s. In 2016, Phase 2 of the NextSTEP program selected five commercial partners to develop ground prototypes. A team of NASA research engineers and subject matter experts have been tasked with developing the ground test protocol that will serve as the primary means by which these Phase 2 prototype habitats will be evaluated. Since 2008, this core test team has successfully conducted multiple spaceflight analog mission evaluations utilizing a consistent set of operational products, tools, methods, and metrics to enable the iterative development, testing, analysis, and validation of evolving exploration architectures, operations concepts, and vehicle designs. The purpose of implementing a similar evaluation process for the NextSTEP Phase 2 Habitation Concepts is to consistently evaluate the different commercial partner ground prototypes to provide data-driven, actionable recommendations for Phase 3.

  18. The interventional effect of new drugs combined with the Stupp protocol on glioblastoma: A network meta-analysis.

    Science.gov (United States)

    Li, Mei; Song, Xiangqi; Zhu, Jun; Fu, Aijun; Li, Jianmin; Chen, Tong

    2017-08-01

    New therapeutic agents in combination with the standard Stupp protocol (a protocol combining temozolomide with radiotherapy for glioblastoma, first reported by Stupp R in 2005) were assessed to evaluate whether they were superior to the Stupp protocol alone, to determine the optimum treatment regimen for patients with newly diagnosed glioblastoma. We implemented a search strategy to identify studies in the following databases: PubMed, Cochrane Library, EMBASE, CNKI, CBM, Wanfang, and VIP, and assessed the quality of extracted data from the trials included. Statistical software was used to perform network meta-analysis. The use of novel therapeutic agents in combination with the Stupp protocol were all shown to be superior to the Stupp protocol alone for the treatment of newly diagnosed glioblastoma, ranked as follows: cilengitide 2000mg/5/week, bevacizumab in combination with irinotecan, nimotuzumab, bevacizumab, cilengitide 2000mg/2/week, cytokine-induced killer cell immunotherapy, and the Stupp protocol. In terms of serious adverse effects, the intervention group showed a 29% increase in the incidence of adverse events compared with the control group (patients treated only with the Stupp protocol), with a statistically significant difference (RR=1.29; 95%CI 1.17-1.43; P<0.001). The most common adverse events were thrombocytopenia, lymphopenia, neutropenia, pneumonia, nausea, and vomiting, none of which were significantly different between the groups except for neutropenia, pneumonia, and embolism. All intervention drugs evaluated in our study were superior to the Stupp protocol alone when used in combination with it. However, we could not conclusively confirm whether cilengitide 2000mg/5/week was the optimum regimen, as only one trial using this protocol was included in our study.

  19. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence (“BI”) is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is

  20. Study of academic achievements using spatial analysis tools

    Science.gov (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education. These degrees are namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study was carried out with these students about their academic achievement, with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two sittings per year of this examination the latter mark was obtained. Similarly, another group of 77 students were evaluated independently of the former group. These students were those who entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables, such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or

  1. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    Science.gov (United States)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, picked or theoretical arrival times and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution pdf files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.

  2. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    OpenAIRE

    Shang-Liang Chen; Yin-Ting Cheng; Chin-Fa Su

    2015-01-01

    Recently, intelligent systems have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal-condition warnings and the improvement of cutting efficiency. During processing, the movement of the spindle unit is among the most frequent and important actions, for example in the automatic tool changer. The vibration detection system includes the development of hardware and software, such as ...

  3. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    OpenAIRE

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnost...

  4. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Otoshi, Tsunehiko

    2004-01-01

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as materials science, biology, geochemistry and so on. However, given the advantages of NAA, samples available only in small amounts or precious samples are the most suitable for NAA, because NAA is capable of trace analysis and non-destructive determination. In this paper, among these fields, NAA of atmospheric particulate matter (PM) samples is discussed, with emphasis on the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect a vast amount of sample even using a high-volume air sampling device. Therefore, highly sensitive NAA is suitable for determining elements in PM samples. The main components of PM are crust-derived silicates and the like in rural/remote areas, while carbonaceous materials and heavy metals are concentrated in PM in urban areas, because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site. Trends in air pollution can be traced by periodic monitoring of PM by the NAA method. Elemental concentrations in air change by season. For example, crustal elements increase in the dry season, and sea salt components increase in concentration when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM, and increase in concentration during the winter season, when emissions from heating systems are high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of some environmental samples and source apportionment techniques are useful. (author)

  5. Flow injection analysis: Emerging tool for laboratory automation in radiochemistry

    International Nuclear Information System (INIS)

    Egorov, O.; Ruzicka, J.; Grate, J.W.; Janata, J.

    1996-01-01

    Automation of routine and serial assays is common practice in the modern analytical laboratory, while it is virtually nonexistent in the field of radiochemistry. Flow injection analysis (FIA) is a general solution handling methodology that has been extensively used for automation of routine assays in many areas of analytical chemistry. Reproducible automated solution handling and on-line separation capabilities are among several distinctive features that make FI a very promising, yet underutilized tool for automation in analytical radiochemistry. The potential of the technique is demonstrated through the development of an automated 90Sr analyzer and its application in the analysis of tank waste samples from the Hanford site. Sequential injection (SI), the latest generation of FIA, is used to rapidly separate 90Sr from interfering radionuclides and deliver the separated Sr zone to a flow-through liquid scintillation detector. The separation is performed on a mini-column containing Sr-specific sorbent extraction material, which selectively retains Sr under acidic conditions. The 90Sr is eluted with water, mixed with scintillation cocktail, and sent through the flow cell of a flow-through counter, where 90Sr radioactivity is detected as a transient signal. Both peak area and peak height can be used for quantification of sample radioactivity. Alternatively, stopped-flow detection can be performed to improve detection precision for low-activity samples. The authors' current research activities are focused on expansion of radiochemical applications of the FIA methodology, with the ultimate goal of creating a set of automated methods that will cover the basic needs of radiochemical analysis at the Hanford site. The results of preliminary experiments indicate that FIA is a highly suitable technique for the automation of chemically more challenging separations, such as the separation of actinide elements
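
    Quantification from the transient signal reduces to integrating the detector trace above baseline (peak area) or taking its maximum (peak height). A Python sketch with a synthetic elution peak, illustrative rather than the authors' actual processing:

    ```python
    # Transient-peak quantification sketch: peak area and peak height from a
    # detector trace, the two quantities the record says can be used for
    # calibration. The signal here is synthetic.
    import numpy as np

    t = np.linspace(0, 60, 601)                         # seconds
    baseline = 2.0
    signal = baseline + 40.0 * np.exp(-0.5 * ((t - 25.0) / 3.0) ** 2)  # eluting zone
    signal += np.random.default_rng(3).normal(scale=0.3, size=t.size)

    above = np.clip(signal - baseline, 0.0, None)
    peak_area = np.trapz(above, t)      # signal*s; assumed proportional to activity
    peak_height = above.max()
    print(f"area={peak_area:.1f}  height={peak_height:.1f}")
    ```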

  6. Study protocol: pragmatic randomized control trial of an internet-based intervention (My tools 4 care) for family carers.

    Science.gov (United States)

    Duggleby, Wendy; Ploeg, Jenny; McAiney, Carrie; Fisher, Kathryn; Swindle, Jenny; Chambers, Tracey; Ghosh, Sunita; Peacock, Shelley; Markle-Reid, Maureen; Triscott, Jean; Williams, Allison; Forbes, Dorothy; Pollard, Lori

    2017-08-14

    Family carers of older persons with Alzheimer's disease and related dementia (ADRD) and multiple chronic conditions (MCC) experience significant, complex, and distressing transitions such as changes to their environment, roles and relationships, physical health, and mental health. An online intervention (My Tools 4 Care) was developed for family carers of persons with ADRD and MCC living at home, with the aim of supporting these carers through transitions and increasing their self-efficacy, hope, and health related quality of life (HRQoL). This study will evaluate My Tools 4 Care (MT4C) by asking the following research questions: 1. Does use of MT4C result in a 3 month (immediately post intervention) and 6-month (3 months after intervention) increase in HRQoL, self-efficacy, and hope, in carers of persons with ADRD and MCC compared to an educational control group? 2. Does use of MT4C help carers of community-dwelling older adults with ADRD and MCC deal with significant changes they experience as carers? and 3. Are the effects/benefits of the MT4C intervention achieved at no additional cost compared to an educational control group? Using a pragmatic mixed methods randomized controlled trial design, 180 family carers of community dwelling older persons (65 years of age and older) with ADRD and MCC will participate in the study. Data will be collected from the intervention and an educational control group at four time points: baseline, 1 month, 3 and 6 months. We expect to find that family carers using MT4C will show greater improvement in hope, self-efficacy and HRQoL, at no additional cost from a societal perspective, compared to those in the educational control group. Generalized estimating equations will be used to determine differences between groups and over time. Data collection began in Ontario and Alberta Canada in June 2015 and is expected to be completed in June 2017. The results will inform policy and practice as MT4C can be easily revised for local

  7. Comparison of Nutrigenomics Technology Interface Tools for Consumers and Health Professionals: Protocol for a Mixed-Methods Study.

    Science.gov (United States)

    Littlejohn, Paula; Cop, Irene; Brown, Erin; Afroze, Rimi; Davison, Karen M

    2018-06-11

    because they did not complete all required data collection forms or were later found to be ineligible. The final sample (n=55) was mostly female (75%), married (85%), and those who had completed postsecondary education (62%). This study will leverage quantitative and qualitative findings, which will guide the development of nutrigenomics-based products in electronic formats that are user-friendly for consumers and health professionals. Although the quantitative data have not been analyzed yet, the overwhelming interest in the study and the extremely high retention rate show that there is a great degree of interest in this field. Given this interest and the fact that nutrigenomics is an evolving science, a need for continued research exists to further the understanding of the role of genetic variation and its role and applications in nutrition practice. Clinicaltrials.gov NCT03310814; http://clinicaltrials.gov/ct2/show/NCT03310814 (Archived by WebCite at http://www.webcitation.org/6yGnU5deB). RR1-10.2196/9846.

  8. Analysis of the End-by-Hop Protocol for Secure Aggregation in Sensor Networks

    DEFF Research Database (Denmark)

    Zenner, Erik

    In order to save bandwidth and thus battery power, sensor network measurements are sometimes aggregated en-route while being reported back to the querying server. Authentication of the measurements then becomes a challenge if message integrity is important for the application. At ESAS 2007, the End-by-Hop protocol for securing in-network aggregation for sensor nodes was presented. The solution was claimed to be secure and efficient and to provide the possibility of trading off bandwidth against computation time on the server. In this paper, we disprove these claims. We describe several attacks against the proposed solution and point out shortcomings in the original complexity analysis. In particular, we show that the proposed solution is inferior to a naive solution without in-network aggregation both in security and in efficiency.

  9. Aromatherapy for managing menopausal symptoms: A protocol for systematic review and meta-analysis.

    Science.gov (United States)

    Choi, Jiae; Lee, Hye Won; Lee, Ju Ah; Lim, Hyun-Ja; Lee, Myeong Soo

    2018-02-01

    Aromatherapy is often used as a complementary therapy for women's health. This systematic review aims to evaluate the therapeutic effects of aromatherapy as a management for menopausal symptoms. Eleven electronic databases will be searched from inception to February 2018. Randomized controlled trials that evaluated any type of aromatherapy against any type of control in individuals with menopausal symptoms will be eligible. The methodological quality will be assessed using the Cochrane risk of bias tool. Two authors will independently assess each study for eligibility and risk of bias and extract data. This study will provide a high quality synthesis of current evidence of aromatherapy for menopausal symptoms measured with the Menopause Rating Scale, the Kupperman Index, the Greene Climacteric Scale, or other validated questionnaires. The conclusion of our systematic review will provide evidence to judge whether aromatherapy is an effective intervention for women with menopausal symptoms. Ethical approval will not be required, given that this protocol is for a systematic review. The systematic review will be published in a peer-reviewed journal. The review will also be disseminated electronically and in print. PROSPERO CRD42017079191.

  10. Security Protocols: Specification, Verification, Implementation, and Composition

    DEFF Research Database (Denmark)

    Almousa, Omar

    An important aspect of Internet security is the security of the cryptographic protocols that it deploys. We need to make sure that such protocols achieve their goals, whether in isolation or in composition, i.e., security protocols must not suffer from any flaw that enables hostile intruders to break their security. Among others, tools like OFMC [MV09b] and Proverif [Bla01] are quite efficient for the automatic formal verification of a large class of protocols. These tools use different approaches such as symbolic model checking or static analysis. Either approach has its own pros and cons, and therefore, we... results. The most important generalization is the support for all security properties of the geometric fragment proposed by [Gut14].

  11. Is there a social gradient of sarcopenia? A meta-analysis and systematic review protocol.

    Science.gov (United States)

    Green, Darci; Duque, Gustavo; Fredman, Nick; Rizvi, Aoun; Brennan-Olsen, Sharon Lee

    2018-01-13

    Sarcopenia (or loss of muscle mass and function) is a relatively new area within the field of musculoskeletal research and medicine. Investigating whether there is a social gradient of sarcopenia, including by occupation type and income level, as observed for other diseases, will contribute significantly to the limited evidence base for this disease. This new information may inform the prevention and management of sarcopenia and widen the evidence base to support existing and future health campaigns. We will conduct a systematic search of the databases PubMed, Ovid, CINAHL, Scopus and EMBASE to identify articles that investigate associations between social determinants of health and sarcopenia in adults aged 50 years and older. Eligibility of the selected studies will be determined by two independent reviewers. The methodological quality of eligible studies will be assessed according to predetermined criteria. Established statistical methods to identify and control for heterogeneity will be used, and where appropriate, we will conduct a meta-analysis. In the event that heterogeneity prevents numerical synthesis, a best evidence analysis will be employed. This systematic review protocol adheres to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocols reporting guidelines and will be registered with the International Prospective Register of Systematic Reviews (PROSPERO). This systematic review will use published data, thus ethical permissions will not be required. In addition to peer-reviewed publication, our results will be presented at (inter)national conferences relevant to the field of sarcopenia, ageing and/or musculoskeletal health and disseminated both electronically and in print. CRD42017072253.

  12. The use of current risk analysis tools evaluated towards preventing external domino accidents

    NARCIS (Netherlands)

    Reniers, Genserik L L; Dullaert, W.; Ale, B. J.M.; Soudan, K.

    Risk analysis is an essential tool for company safety policy. Risk analysis consists of identifying and evaluating all possible risks. The efficiency of risk analysis tools depends on the rigour of identifying and evaluating all possible risks. The diversity in risk analysis procedures is such that

  13. Analysis of the influence of tool dynamics in diamond turning

    Energy Technology Data Exchange (ETDEWEB)

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  14. Dementia Population Risk Tool (DemPoRT): study protocol for a predictive algorithm assessing dementia risk in the community.

    Science.gov (United States)

    Fisher, Stacey; Hsu, Amy; Mojaverian, Nassim; Taljaard, Monica; Huyer, Gregory; Manuel, Douglas G; Tanuseputro, Peter

    2017-10-24

    The burden of disease from dementia is a growing global concern as incidence increases dramatically with age, and average life expectancy has been increasing around the world. Planning for an ageing population requires reliable projections of dementia prevalence; however, existing population projections are simple and have poor predictive accuracy. The Dementia Population Risk Tool (DemPoRT) will predict incidence of dementia in the population setting using multivariable modelling techniques and will be used to project dementia prevalence. The derivation cohort will consist of elderly Ontario respondents of the Canadian Community Health Survey (CCHS) (2001, 2003, 2005 and 2007; 18 764 males and 25 288 females). Prespecified predictors include sociodemographic, general health, behavioural, functional and health condition variables. Incident dementia will be identified through individual linkage of survey respondents to population-level administrative healthcare databases (1797 and 3281 events, and 117 795 and 166 573 person-years of follow-up, for males and females, respectively, until 31 March 2014). Using time of first dementia capture as the primary outcome and death as a competing risk, sex-specific proportional hazards regression models will be estimated. The 2008/2009 CCHS survey will be used for validation (approximately 4600 males and 6300 females). Overall calibration and discrimination will be assessed, as well as calibration within predefined subgroups of importance to clinicians and policy makers. Research ethics approval has been granted by the Ottawa Health Science Network Research Ethics Board. DemPoRT results will be submitted for publication in peer-reviewed journals and presented at scientific meetings. The algorithm will be accessible online for both population and individual uses. ClinicalTrials.gov NCT03155815, pre-results.
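
    A cause-specific proportional hazards model of this kind, with death treated as censoring when modelling the dementia-capture hazard, can be sketched with the lifelines library. The columns and synthetic data below are hypothetical, and DemPoRT's actual competing-risk handling may differ (for example, a Fine-Gray subdistribution model):

    ```python
    # Sex-specific, cause-specific Cox models: dementia capture is the event;
    # death and administrative end of follow-up censor it. Data are synthetic.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(4)
    n = 200
    df = pd.DataFrame({
        "age": rng.integers(66, 95, n),
        "smoker": rng.integers(0, 2, n),
        "sex": rng.choice(["F", "M"], n),
    })
    # Hypothetical event times: dementia hazard rises with age; death competes.
    dementia_t = rng.exponential(40 - 0.3 * (df["age"] - 65))
    death_t = rng.exponential(15, n)
    df["follow_up_years"] = np.minimum.reduce([dementia_t, death_t, np.full(n, 10.0)])
    df["dementia"] = (dementia_t <= np.minimum(death_t, 10.0)).astype(int)

    for sex, group in df.groupby("sex"):                 # sex-specific models
        cph = CoxPHFitter().fit(group.drop(columns="sex"),
                                duration_col="follow_up_years",
                                event_col="dementia")
        print(sex, cph.hazard_ratios_.round(2).to_dict())
    ```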

  15. Hearing and vision screening tools for long-term care residents with dementia: protocol for a scoping review.

    Science.gov (United States)

    McGilton, Katherine S; Höbler, Fiona; Campos, Jennifer; Dupuis, Kate; Labreche, Tammy; Guthrie, Dawn M; Jarry, Jonathan; Singh, Gurjit; Wittich, Walter

    2016-07-26

    Hearing and vision loss among long-term care (LTC) residents with dementia frequently goes unnoticed and untreated. Despite negative consequences for these residents, there is little information available about their sensory abilities and care assessments and practices seldom take these abilities or accessibility needs into account. Without adequate knowledge regarding such sensory loss, it is difficult for LTC staff to determine the level of an individual's residual basic competence for communication and independent functioning. We will conduct a scoping review to identify the screening measures used in research and clinical contexts that test hearing and vision in adults aged over 65 years with dementia, aiming to: (1) provide an overview of hearing and vision screening in older adults with dementia; and (2) evaluate the sensibility of the screening tools. This scoping review will be conducted using the framework by Arksey and O'Malley and furthered by methodological enhancements from cited researchers. We will conduct electronic database searches in CENTRAL, CINAHL, EMBASE, MEDLINE and PsycINFO. We will also carry out a 'grey literature' search for studies or materials not formally published, both online and through interview discussions with healthcare professionals and research clinicians working in the field. Our aim is to find new and existing hearing and vision screening measures used in research and by clinical professionals of optometry and audiology. Abstracts will be independently reviewed twice for acceptance by a multidisciplinary team of researchers and research clinicians. This review will inform health professionals working with this growing population. With the review findings, we aim to develop a toolkit and an algorithmic process to select the most appropriate hearing and vision screening assessments for LTC residents with dementia that will facilitate accurate testing and can inform care planning, thereby improving residents' quality of life

  16. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool, which...

  17. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies.

    Science.gov (United States)

    Uebe, Steffen; Pasutto, Francesca; Krumbiegel, Mandy; Schanze, Denny; Ekici, Arif B; Reis, André

    2010-09-21

    Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not intended or is supposed to be performed with other command line tools, e.g. gnuplot. Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  18. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies

    Directory of Open Access Journals (Sweden)

    Schanze Denny

    2010-09-01

    Full Text Available Abstract Background Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not intended or is supposed to be performed with other command line tools, e.g. gnuplot. Results Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Conclusions Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  19. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    Science.gov (United States)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

  20. Real-Time QoS Routing Protocols in Wireless Multimedia Sensor Networks: Study and Analysis.

    Science.gov (United States)

    Alanazi, Adwan; Elleithy, Khaled

    2015-09-02

    Many routing protocols have been proposed for wireless sensor networks. These routing protocols are almost always based on energy efficiency. However, recent advances in complementary metal-oxide semiconductor (CMOS) cameras and small microphones have led to the development of Wireless Multimedia Sensor Networks (WMSN) as a class of wireless sensor networks which pose additional challenges. The transmission of imaging and video data needs routing protocols with both energy efficiency and Quality of Service (QoS) characteristics in order to guarantee the efficient use of the sensor nodes and effective access to the collected data. Also, with the integration of real-time applications in Wireless Sensor Networks (WSNs), the use of QoS routing protocols is not only becoming a significant topic, but is also gaining the attention of researchers. In designing an efficient QoS routing protocol, guaranteeing reliability and end-to-end delay while conserving energy is critical. Thus, considerable research has been focused on designing energy-efficient and robust QoS routing protocols. In this paper, we present a state-of-the-art review of the real-time QoS routing protocols that have been proposed for WMSNs. This paper categorizes the real-time QoS routing protocols into probabilistic and deterministic protocols. In addition, both categories are classified into soft and hard real-time protocols by highlighting the QoS issues including the limitations and features of each protocol. Furthermore, we have compared the performance of mobility-aware query-based real-time QoS routing protocols from each category using Network Simulator-2 (NS2). This paper also focuses on the design challenges and future research directions as well as highlighting the characteristics of each QoS routing protocol.
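
    As a concrete illustration of the deadline-driven next-hop decision such protocols make, the Python sketch below follows the spirit of velocity-based forwarding, a common technique in this protocol family. It is not taken from any specific protocol in the survey; all names, distances and delays are invented.

        def required_velocity(dist_to_sink_m, deadline_s):
            # Minimum sustained speed (m/s) for the packet to arrive in time.
            return dist_to_sink_m / deadline_s

        def pick_next_hop(neighbours, my_dist_m, deadline_s):
            """neighbours: list of (node_id, dist_to_sink_m, est_hop_delay_s)."""
            v_req = required_velocity(my_dist_m, deadline_s)
            candidates = []
            for node_id, dist, delay in neighbours:
                progress = my_dist_m - dist        # metres gained toward the sink
                if progress <= 0:
                    continue                       # skip neighbours that move us backwards
                v_offered = progress / delay       # speed this hop actually delivers
                if v_offered >= v_req:             # hard real-time variant: must meet deadline
                    candidates.append((v_offered, node_id))
            # Deterministic choice: best offered velocity; a probabilistic
            # variant would sample among candidates weighted by v_offered.
            return max(candidates)[1] if candidates else None

        # Node 100 m from the sink with 0.5 s of deadline budget remaining.
        print(pick_next_hop([("a", 80.0, 0.05), ("b", 60.0, 0.20)], 100.0, 0.5))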

  1. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, and 2) provides an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK (Registered Trademark) contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
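
    Brent's method, mentioned above, is a bracketed root finder, a natural fit for locating contact-window boundaries where an elevation angle crosses a field-of-view mask. The sketch below illustrates the idea with an invented, purely illustrative elevation profile; it is not SCENIC or ITACA code.

        import math
        from scipy.optimize import brentq

        ELEV_MASK_DEG = 10.0   # illustrative field-of-view mask angle

        def elevation_deg(t):
            # Toy elevation profile of one pass (degrees vs seconds); a real
            # tool would evaluate propagated orbital geometry here.
            return 45.0 * math.sin(math.pi * t / 600.0) - 5.0

        # Bracket the rise and set crossings of (elevation - mask), then let
        # Brent's method refine each root to high precision.
        f = lambda t: elevation_deg(t) - ELEV_MASK_DEG
        rise = brentq(f, 0.0, 300.0)
        fall = brentq(f, 300.0, 600.0)
        print(f"contact window: {rise:.1f} s to {fall:.1f} s")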

  2. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    Science.gov (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to direct actions optimally in case of an emergency. In the framework of the Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular a large event in 1987 which affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is composed of a real-time, highly detailed, decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for risk scenarios previously defined. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  3. Microfabricated tools for manipulation and analysis of magnetic microcarriers

    International Nuclear Information System (INIS)

    Tondra, Mark; Popple, Anthony; Jander, Albrecht; Millen, Rachel L.; Pekas, Nikola; Porter, Marc D.

    2005-01-01

    Tools for manipulating and detecting magnetic microcarriers are being developed with microscale features. Microfabricated giant magnetoresistive (GMR) sensors and wires are used for detection, and for creating high local field gradients. Microfluidic structures are added to control flow, and positioning of samples and microcarriers. These tools are designed for work in analytical chemistry and biology

  4. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets...

  5. Tool wear analysis during duplex stainless steel trochoidal milling

    Science.gov (United States)

    Amaro, Paulo; Ferreira, Pedro; Simões, Fernando

    2018-05-01

    In this study a tool with interchangeable inserts of sintered carbides coated with AlTiN was used to mill a duplex stainless steel with trochoidal strategies. Cutting speeds ranging from 120 to 300 m/min were used, and the evaluation of tool deterioration and tool life was made according to the international standard ISO 8688-1. A progressive development of flank wear was observed, together with a cumulative cyclic process of localized adhesion of the chip to the cutting edge, followed by chipping, loss of the coating and substrate exposure. The tool life reached a maximum of 35 min for a cutting speed of 120 m/min. However, it was possible to maintain a tool life of 20-25 minutes when the cutting speed was increased up to 240 m/min.

  6. An analysis of moderate sedation protocols used in dental specialty programs: a retrospective observational study.

    Science.gov (United States)

    Setty, Madhavi; Montagnese, Thomas A; Baur, Dale; Aminoshariae, Anita; Mickel, Andre

    2014-09-01

    Pain and anxiety control is critical in dental practice. Moderate sedation is a useful adjunct in managing a variety of conditions that make it difficult or impossible for some people to undergo certain dental procedures. The purpose of this study was to analyze the sedation protocols used in 3 dental specialty programs at the Case Western Reserve University School of Dental Medicine, Cleveland, OH. A retrospective analysis was performed using dental school records of patients receiving moderate sedation in the graduate endodontic, periodontic, and oral surgery programs from January 1, 2010, to December 31, 2012. Information was gathered and the data compiled regarding the reasons for sedation, age, sex, pertinent medical conditions, American Society of Anesthesiologists physical status classifications, routes of administration, drugs, dosages, failures, complications, and other information that was recorded. The reasons for the use of moderate sedation were anxiety (54%), local anesthesia failures (15%), fear of needles (15%), severe gag reflex (8%), and claustrophobia with the rubber dam (8%). The most common medical conditions were hypertension (17%), asthma (15%), and bipolar disorder (8%). Most patients were classified as American Society of Anesthesiologists class II. More women (63.1%) were treated than men (36.9%). The mean age was 45 years. Monitoring and drugs varied among the programs. The most common tooth treated in the endodontic program was the mandibular molar. There are differences in the moderate sedation protocols used in the endodontic, periodontic, and oral surgery programs regarding monitoring, drugs used, and record keeping. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  7. Hierarchical evaluation of electrical stimulation protocols for chronic wound healing: An effect size meta-analysis.

    Science.gov (United States)

    Khouri, Charles; Kotzki, Sylvain; Roustit, Matthieu; Blaise, Sophie; Gueyffier, Francois; Cracowski, Jean-Luc

    2017-09-01

    Electrical stimulation (ES) has been tested for decades to improve chronic wound healing. However, uncertainty remains on the magnitude of the efficacy and on the best applicable protocol. We conducted an effect size meta-analysis to assess the overall efficacy of ES on wound healing, to compare the efficacy of the different modalities of electrical stimulation, and to determine whether efficacy differs depending on the etiology, size, and age of the chronic wound. Twenty-nine randomized clinical trials with 1,510 patients and 1,753 ulcers were selected. The overall efficacy of ES on wound healing was a 0.72 SMD (95% CI: 0.48, 1), corresponding to a moderate to large effect size. We found that unidirectional high voltage pulsed current (HVPC) with the active electrode over the wound was the best evidence-based protocol to improve wound healing, with a 0.8 SMD (95% CI: 0.38, 1.21), while evaluation of the efficacy of direct current was limited by the small number of studies. ES was more effective on pressure ulcers compared to venous and diabetic ulcers, and efficacy tended to be inversely associated with wound size and duration. This study confirms the overall efficacy of ES to enhance healing of chronic wounds and highlights the superiority of HVPC over other types of current; HVPC is more effective on pressure ulcers, and its efficacy is inversely associated with wound size and duration. This will enable standardization of future ES practices. © 2017 by the Wound Healing Society.
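
    For readers unfamiliar with the standardized mean difference (SMD) figures quoted above, the sketch below shows minimal inverse-variance pooling of per-study SMDs. The study values are invented, and a fixed-effect model is used for brevity; the review itself applies methods that also handle between-study heterogeneity.

        import math

        # (smd, variance) per study -- illustrative values, not the review's data.
        studies = [(0.8, 0.04), (0.6, 0.09), (0.9, 0.16)]

        # Fixed-effect inverse-variance pooling; a random-effects model would
        # first widen each variance by the between-study component tau^2.
        weights = [1.0 / v for _, v in studies]
        pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
        se = math.sqrt(1.0 / sum(weights))
        lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
        print(f"pooled SMD = {pooled:.2f} (95% CI {lo:.2f}, {hi:.2f})")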

  8. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.

    Science.gov (United States)

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-10-23

    Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
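
    The abstract does not enumerate the add-in's functions, but a typical "basic genetic analysis" of genotype data is a Hardy-Weinberg equilibrium check. The following Python sketch, with invented genotype counts, illustrates the kind of calculation such a tool automates.

        # Hardy-Weinberg equilibrium check from genotype counts (invented data).
        n_AA, n_Aa, n_aa = 440, 480, 80
        n = n_AA + n_Aa + n_aa
        p = (2 * n_AA + n_Aa) / (2 * n)              # frequency of allele A
        q = 1.0 - p
        expected = [p * p * n, 2 * p * q * n, q * q * n]
        observed = [n_AA, n_Aa, n_aa]
        chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
        # Compare chi2 against 3.84 (chi-square critical value, 1 df, alpha = 0.05).
        print(f"p(A) = {p:.3f}, chi-square = {chi2:.2f}")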

  9. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  10. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  11. Evaluation of Extraction Protocols for Simultaneous Polar and Non-Polar Yeast Metabolite Analysis Using Multivariate Projection Methods

    Directory of Open Access Journals (Sweden)

    Nicolas P. Tambellini

    2013-07-01

    Full Text Available Metabolomic and lipidomic approaches aim to measure metabolites or lipids in the cell. Metabolite extraction is a key step in obtaining useful and reliable data for successful metabolite studies. Significant efforts have been made to identify the optimal extraction protocol for various platforms and biological systems, for both polar and non-polar metabolites. Here we report an approach utilizing chemoinformatics for systematic comparison of protocols to extract both classes from a single sample of the model yeast organism Saccharomyces cerevisiae. Three chloroform/methanol/water partitioning-based extraction protocols found in the literature were evaluated for their effectiveness at reproducibly extracting both polar and non-polar metabolites. Fatty acid methyl esters and methoxyamine/trimethylsilyl-derivatized aqueous compounds were analyzed by gas chromatography mass spectrometry to evaluate non-polar and polar metabolite recovery, respectively. The comparative breadth and amount of recovered metabolites were evaluated using multivariate projection methods. This approach identified an optimal protocol that recovered 64 identified polar metabolites from 105 ion hits and 12 fatty acids, and will potentially attenuate the error and variation associated with combining metabolite profiles from different samples for untargeted analysis with both polar and non-polar analytes. It also confirmed the value of using multivariate projection methods to compare established extraction protocols.
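
    "Multivariate projection" here refers to methods such as principal component analysis, in which replicates of a reproducible protocol should cluster together in score space. A minimal sketch with invented recovery data (not the study's measurements):

        import numpy as np
        from sklearn.decomposition import PCA

        # Rows: extraction replicates (3 protocols x 2 replicates); columns:
        # recovered peak areas for a few metabolites.  All values invented.
        X = np.array([
            [5.1, 2.0, 0.9, 3.2],   # protocol A, rep 1
            [5.0, 2.1, 1.0, 3.1],   # protocol A, rep 2
            [3.2, 4.1, 0.4, 2.0],   # protocol B, rep 1
            [3.3, 4.0, 0.5, 2.1],   # protocol B, rep 2
            [4.1, 3.0, 2.2, 1.1],   # protocol C, rep 1
            [4.0, 3.1, 2.1, 1.2],   # protocol C, rep 2
        ])
        # Mean-centre and unit-variance scale before projection.
        Xc = (X - X.mean(axis=0)) / X.std(axis=0)
        scores = PCA(n_components=2).fit_transform(Xc)
        print(scores.round(2))   # replicates of one protocol should cluster together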

  12. Online Analysis of Wind and Solar Part I: Ramping Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

  13. Bioinformatics Database Tools in Analysis of Genetics of Neurodevelopmental Disorders

    Directory of Open Access Journals (Sweden)

    Dibyashree Mallik

    2017-10-01

    Full Text Available Bioinformatics tools are now used in many sectors of biology, and many questions regarding neurodevelopmental disorders, which have recently emerged as a major health issue, can be addressed using bioinformatics databases. Schizophrenia is one such mental disorder, now a major threat to young people because it mostly manifests during late adolescence or early adulthood. Databases such as DisGeNET, GWAS, PharmGKB and DrugBank hold large repositories of genes associated with schizophrenia. We found many genes associated with schizophrenia, of which approximately 200 are present in at least one of these databases. After a further screening process, 20 genes were found to be highly associated with each other and to be common to many other diseases as well. They also serve as common target genes for many antipsychotic drugs. Analysis of various biological properties and molecular functions showed that these 20 genes are mostly involved in biological regulation processes, have receptor activity, and belong mainly to the receptor protein class. Among these 20 genes, CYP2C9, CYP3A4, DRD2, HTR1A and HTR2A are the main target genes of most antipsychotic drugs and are associated with more than 40% of the diseases considered. The findings of the present study suggest that a suitable combination drug could be designed by targeting these genes, which could be used for better treatment of schizophrenia.
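
    The screening described — retaining genes present across several disease-gene resources — reduces to set intersection. A hedged sketch follows, with small invented gene lists standing in for the actual database exports (only a few of the names below appear in the abstract; the rest are placeholders):

        # Genes common to several disease-gene resources (invented toy lists).
        disgenet = {"DRD2", "HTR2A", "CYP2C9", "COMT", "BDNF"}
        gwas     = {"DRD2", "HTR2A", "CYP3A4", "COMT", "HTR1A"}
        pharmgkb = {"DRD2", "HTR2A", "CYP2C9", "CYP3A4", "HTR1A"}
        drugbank = {"DRD2", "HTR2A", "HTR1A", "CYP3A4", "CYP2C9"}

        in_all = disgenet & gwas & pharmgkb & drugbank
        in_any = disgenet | gwas | pharmgkb | drugbank
        print(f"{len(in_all)} of {len(in_any)} genes appear in all four:", sorted(in_all))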

  14. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
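
    The "conventional interpretation" criticized above assumes each interval's hydraulic conductivity is proportional to the flow gained across it. A minimal sketch of that baseline calculation, with invented depths and flows, shows the assumption AnalyzeHOLE is designed to improve upon:

        # Conventional flow-log interpretation: relative K proportional to the
        # flow gained across each interval.  Depths and flows are invented;
        # AnalyzeHOLE exists because this proportionality can fail near real wells.
        depths_m = [10, 20, 30, 40, 50]      # interval boundaries, top to bottom
        flow_lpm = [100, 80, 45, 40, 0]      # upward flow measured at each depth

        for top, bottom, q_top, q_bot in zip(depths_m, depths_m[1:],
                                             flow_lpm, flow_lpm[1:]):
            gain = q_top - q_bot             # water entering this interval
            frac = gain / flow_lpm[0]        # fraction of total well yield
            thickness = bottom - top
            print(f"{top}-{bottom} m: {gain} L/min inflow, "
                  f"relative K ~ {frac / thickness:.4f} per metre")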

  15. MTpy - Python Tools for Magnetotelluric Data Processing and Analysis

    Science.gov (United States)

    Krieger, Lars; Peacock, Jared; Thiel, Stephan; Inverarity, Kent; Kirkby, Alison; Robertson, Kate; Soeffky, Paul; Didana, Yohannes

    2014-05-01

    We present the Python package MTpy, which provides functions for the processing, analysis, and handling of magnetotelluric (MT) data sets. MT is a relatively immature and not widely applied geophysical method in comparison to other geophysical techniques such as seismology. As a result, the data processing within the academic MT community is not thoroughly standardised and is often based on a loose collection of software, adapted to the respective local specifications. We have developed MTpy to overcome problems that arise from missing standards, and to provide a simplification of the general handling of MT data. MTpy is written in Python, and the open-source code is freely available from a GitHub repository. The setup follows the modular approach of successful geoscience software packages such as GMT or Obspy. It contains sub-packages and modules for the various tasks within the standard work-flow of MT data processing and interpretation. In order to allow the inclusion of already existing and well established software, MTpy does not only provide pure Python classes and functions, but also wrapping command-line scripts to run standalone tools, e.g. modelling and inversion codes. Our aim is to provide a flexible framework, which is open for future dynamic extensions. MTpy has the potential to promote the standardisation of processing procedures and at the same time be a versatile supplement for existing algorithms. Here, we introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing, interpretation, and visualisation utilising MTpy on example data sets collected over different regions of Australia and the USA.
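
    A core MT calculation that such a package wraps is converting an impedance tensor element into apparent resistivity and phase, via the standard relation rho_a = |Z|^2 / (omega * mu0) in SI units. The sketch below deliberately avoids MTpy's own API (whose exact call signatures are not given in the abstract); the impedance value is invented.

        import numpy as np

        MU0 = 4e-7 * np.pi   # vacuum permeability (H/m)

        def apparent_resistivity(z_ohm, freq_hz):
            """Standard MT relation rho_a = |Z|^2 / (omega * mu0), Z in SI ohms."""
            omega = 2.0 * np.pi * freq_hz
            return (np.abs(z_ohm) ** 2) / (omega * MU0)

        def phase_deg(z_ohm):
            return np.degrees(np.angle(z_ohm))

        z_xy = 0.05 + 0.05j   # invented off-diagonal impedance element (ohm)
        f = 1.0               # sounding frequency (Hz)
        print(f"rho_a = {apparent_resistivity(z_xy, f):.1f} ohm-m, "
              f"phase = {phase_deg(z_xy):.1f} deg")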

  16. Use of Virtual Reality Tools for Vestibular Disorders Rehabilitation: A Comprehensive Analysis

    OpenAIRE

    Bergeron, Mathieu; Lortie, Catherine L.; Guitton, Matthieu J.

    2015-01-01

    Classical peripheral vestibular disorders rehabilitation is a long and costly process. While virtual reality settings have been repeatedly suggested to represent possible tools to help the rehabilitation process, no systematic study had been conducted so far. We systematically reviewed the current literature to analyze the published protocols documenting the use of virtual reality settings for peripheral vestibular disorders rehabilitation. There is an important diversity of settings and prot...

  17. Protocol Implementation Generator

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.

    2010-01-01

    Users expect communication systems to guarantee, amongst others, privacy and integrity of their data. These can be ensured by using well-established protocols; the best protocol, however, is useless if not all parties involved in a communication have a correct implementation of the protocol and all necessary tools. In this paper, we present the Protocol Implementation Generator (PiG), a framework that can be used to add protocol generation to protocol negotiation, or to easily share and implement new protocols throughout a network. PiG enables the sharing, verification, and translation of protocols through a ... Generator framework based on the LySatool and a translator from the LySa language into C or Java.

  18. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    Science.gov (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest

  19. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    Energy Technology Data Exchange (ETDEWEB)

    Ourghanlian, Alain [EDF Lab CHATOU, Simulation and Information Technologies for Power Generation Systems Department, EDF R and D, Cedex (France)

    2015-03-15

    We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Électricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown systems software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted for the new analysis tools.

  20. A critical analysis of a locally agreed protocol for clinical practice

    International Nuclear Information System (INIS)

    Owen, A.; Hogg, P.; Nightingale, J.

    2004-01-01

    Within the traditional scope of radiographic practice (including advanced practice) there is a need to demonstrate effective patient care and management. Such practice should be set within a context of appropriate evidence and should also reflect peer practice. In order to achieve such practice the use of protocols is encouraged. Effective protocols can maximise care and management by minimising inter- and intra-professional variation; they can also allow for detailed procedural records to be kept in case of legal claims. However, whilst literature exists to encourage the use of protocols there is little published material available to indicate how to create, manage and archive them. This article uses an analytical approach to propose a suitable method for protocol creation and archival; it also offers suggestions on the scope and content of a protocol. To achieve this, an existing clinical protocol for radiographer reporting of barium enemas is analysed to draw out the general issues. Proposals for protocol creation, management and archival were identified. The clinical practice described or inferred in the protocol should be drawn from evidence; such evidence could include peer-reviewed material, national standards and peer practice. The protocol should include an explanation of how to proceed when the radiographers reach the limit of their ability. It should refer to the initial training required to undertake the clinical duties as well as the ongoing continual professional updating required to maintain competence. Audit of practice should be indicated, including the preferred audit methodology, and associated with this should be a clear statement about standards and what to do if standards are not adequately met. Protocols should be archived, in a paper-based form, for lengthy periods in case of legal claims. The archived protocol should record the dates during which it was in clinical use.

  1. Network protocols and sockets

    OpenAIRE

    BALEJ, Marek

    2010-01-01

    My work deals with network protocols and sockets and their use in the programming language C#. It therefore covers programming network applications on Microsoft's .NET platform and the instruments which C# provides for this purpose. It describes the tools and methods for programming network applications, and presents sample applications that work with sockets and application protocols.

  2. Comparative analysis of diagnostic applications of autoscan tools

    African Journals Online (AJOL)

    A structured questionnaire with 3 item questions as a checklist and 2 auto scan tools were used for the study, with tests carried out on 5 vehicle systems at the 3 centers where Innova ... maintenance, auto-analyzers, solid work design and can-.

  3. Technology Combination Analysis Tool (TCAT) for Active Debris Removal

    Science.gov (United States)

    Chamot, B.; Richard, M.; Salmon, T.; Pisseloup, A.; Cougnet, C.; Axthelm, R.; Saunder, C.; Dupont, C.; Lequette, L.

    2013-08-01

    This paper presents the work of the Swiss Space Center EPFL within the CNES-funded OTV-2 study. In order to find the most performant Active Debris Removal (ADR) mission architectures and technologies, a tool was developed to design and compare ADR spacecraft, and to plan ADR campaigns to remove large debris. Two types of architectures are considered to be efficient: the Chaser (single-debris spacecraft) and the Mothership/Kits (multiple-debris spacecraft). Both are able to perform controlled re-entry. The tool includes modules to optimise the launch dates and the order of capture, to design missions and spacecraft, and to select launch vehicles. The propulsion, power and structure subsystems are sized by the tool using high-level parametric models, whilst the other subsystems are defined by their mass and power consumption. Final results are still under investigation by the consortium, but two concrete examples of the tool's outputs are presented in the paper.
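
    High-level parametric propulsion sizing of the kind mentioned above typically starts from the Tsiolkovsky rocket equation. A minimal sketch with invented chaser numbers (not values from the study):

        import math

        def propellant_mass(m_dry_kg, dv_ms, isp_s, g0=9.80665):
            """Tsiolkovsky sizing: propellant needed for a dry mass and delta-v."""
            return m_dry_kg * (math.exp(dv_ms / (isp_s * g0)) - 1.0)

        # Invented numbers: a 500 kg chaser with a 300 m/s rendezvous and
        # de-orbit budget on a 320 s specific-impulse bipropellant system.
        print(f"{propellant_mass(500.0, 300.0, 320.0):.1f} kg of propellant")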

  4. Assessing the Possibility of Implementing Tools of Technical Analysis for Real Estate Market Analysis

    Directory of Open Access Journals (Sweden)

    Brzezicka Justyna

    2016-06-01

    Full Text Available Technical analysis (TA) and its different aspects are widely used to study the capital market. In the traditional approach, this analysis is used to determine the probability of changes in current rates on the basis of their past changes, accounting for factors which had, have or may have an influence on shaping the supply and demand of a given asset. In the practical sense, TA is a set of techniques used for assessing the value of an asset based on the analysis of the asset's trajectories as well as statistical tools.

  5. Virtual observatory tools and amateur radio observations supporting scientific analysis of Jupiter radio emissions

    Science.gov (United States)

    Cecconi, Baptiste; Hess, Sebastien; Le Sidaner, Pierre; Savalle, Renaud; Stéphane, Erard; Coffre, Andrée; Thétas, Emmanuel; André, Nicolas; Génot, Vincent; Thieman, Jim; Typinski, Dave; Sky, Jim; Higgins, Chuck; Imai, Masafumi

    2016-04-01

    In the framework of the preparation of the NASA/JUNO and ESA/JUICE (Jupiter Icy Moon Explorer) missions, and the development of a planetary sciences virtual observatory (VO), we are proposing a new set of tools directed to data providers as well as users, in order to ease data sharing and discovery. We will focus on ground based planetary radio observations (thus mainly Jupiter radio emissions), trying for instance to enhance the temporal coverage of jovian decametric emission. The data service we will be using is EPN-TAP, a planetary science data access protocol developed by Europlanet-VESPA (Virtual European Solar and Planetary Access). This protocol is derived from IVOA (International Virtual Observatory Alliance) standards. The Jupiter Routine Observations from the Nancay Decameter Array are already shared on the planetary science VO using this protocol, as well as data from the Iitate Low Frequency Radio Antenna, in Japan. Amateur radio data from the RadioJOVE project is also available. The attached figure shows data from those three providers. We will first introduce the VO tools and concepts of interest for the planetary radioastronomy community. We will then present the various data formats now used for such data services, as well as their associated metadata. We will finally show various prototypical tools that make use of these shared datasets.
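
    EPN-TAP services are queried through the standard IVOA TAP mechanism. The sketch below shows a synchronous ADQL query in Python; the endpoint URL and schema name are placeholders, and a real service's documentation should be consulted for the actual values.

        import requests

        # Synchronous ADQL query against a TAP service, the IVOA mechanism
        # that EPN-TAP builds on.  URL and schema name below are placeholders.
        TAP_SYNC = "https://example.org/tap/sync"
        adql = ("SELECT TOP 10 granule_uid, time_min, time_max "
                "FROM schema_name.epn_core WHERE target_name = 'Jupiter'")

        resp = requests.get(TAP_SYNC, params={
            "REQUEST": "doQuery", "LANG": "ADQL",
            "FORMAT": "csv", "QUERY": adql,
        }, timeout=30)
        resp.raise_for_status()
        print(resp.text.splitlines()[:5])   # header row plus first results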

  6. Analysis of Marketing Mix Tools in a Chosen Company

    OpenAIRE

    Dvořáková, Vendula

    2009-01-01

    The complex structure of the marketing mix of the company Velteko s.r.o. is analysed in my work. The main objectives are to characterise the company, to analyse the individual tools of its marketing mix, and to evaluate a client survey. The conclusion focuses on rating the level and efficiency of the utilization of the tools used, with relevant recommendations added.

  7. Specimen preparation, imaging, and analysis protocols for knife-edge scanning microscopy.

    Science.gov (United States)

    Choe, Yoonsuck; Mayerich, David; Kwon, Jaerock; Miller, Daniel E; Sung, Chul; Chung, Ji Ryang; Huffman, Todd; Keyser, John; Abbott, Louise C

    2011-12-09

    Major advances in high-throughput, high-resolution, 3D microscopy techniques have enabled the acquisition of large volumes of neuroanatomical data at submicrometer resolution. One of the first such instruments producing whole-brain-scale data is the Knife-Edge Scanning Microscope (KESM), developed and hosted in the authors' lab. KESM has been used to section and image whole mouse brains at submicrometer resolution, revealing the intricate details of the neuronal networks (Golgi), vascular networks (India ink), and cell body distribution (Nissl). The use of KESM is not restricted to the mouse nor the brain. We have successfully imaged the octopus brain, mouse lung, and rat brain. We are currently working on whole zebra fish embryos. Data like these can greatly contribute to connectomics research; to microcirculation and hemodynamic research; and to stereology research by providing an exact ground-truth. In this article, we will describe the pipeline, including specimen preparation (fixing, staining, and embedding), KESM configuration and setup, sectioning and imaging with the KESM, image processing, data preparation, and data visualization and analysis. The emphasis will be on specimen preparation and visualization/analysis of obtained KESM data. We expect the detailed protocol presented in this article to help broaden the access to KESM and increase its utilization.

  8. Host based internet protocol (IP) packet analysis to enhance network security

    International Nuclear Information System (INIS)

    Ahmad, T.; Ahmad, S.Z.; Yasin, M.M.

    2007-01-01

    Data communication in a computer network environment is facing serious security threats from numerous sources such as viruses, worms, Zombies etc. These threats can be broadly characterized as internal or external security threats. Internal threats are mainly attributed to sneaker-nets, utility modems and unauthorized users, which can be minimized by skillful network administration, password management and optimum usage policy definition. The external threats need more serious attention as these attacks mostly come from public networks such as the Internet. The frequency and complexity of such attacks is much higher compared to internal attacks. This paper presents a host-based network-layer screening of external and internal IP packets for logging, analyzing and real-time detection of possible IP spoofing and Denial of Service attacks. This work can also be used in tuning security rule definitions for gateway firewalls. Software has been developed which intercepts IP traffic and analyses it with respect to the integrity and origin of each IP packet. The received IP packets are parsed and analyzed for possible signs of intrusion. The results show that monitoring and categorizing the composition of the various transport protocols such as TCP, UDP and ICMP, along with verifying the origin of each received IP packet, can help in devising real-time firewall rules and blocking possible external attacks. This is highly desirable for fighting zero-day attacks and can result in a better Mean Time Between Failures (MTBF), increasing the survivability of a computer network. Used in the right context, packet screening and filtering can be a useful tool for the provision of reliable and stable network services. (author)
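
    The parsing and origin checks described can be illustrated in a few lines of Python: decode the fixed IPv4 header, then apply one spoofing heuristic. This is a didactic sketch, not the authors' software; the heuristic and the hand-built packet are invented.

        import ipaddress
        import struct

        def parse_ipv4_header(packet: bytes):
            """Decode the fixed 20-byte IPv4 header (no options)."""
            (ver_ihl, tos, total_len, ident, flags_frag,
             ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s",
                                                             packet[:20])
            return {
                "version": ver_ihl >> 4,
                "ttl": ttl,
                "protocol": proto,                 # 6=TCP, 17=UDP, 1=ICMP
                "src": ipaddress.IPv4Address(src),
                "dst": ipaddress.IPv4Address(dst),
            }

        def looks_spoofed(hdr, came_from_external: bool) -> bool:
            # One simple heuristic of the kind described above: a private
            # source address arriving on the external interface is suspect.
            return came_from_external and hdr["src"].is_private

        # A hand-built header for demonstration: TCP packet "from" 192.168.1.5.
        hdr_bytes = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                                ipaddress.IPv4Address("192.168.1.5").packed,
                                ipaddress.IPv4Address("8.8.8.8").packed)
        hdr = parse_ipv4_header(hdr_bytes)
        print(hdr["src"], "spoof suspect:", looks_spoofed(hdr, True))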

  9. CSReport: A New Computational Tool Designed for Automatic Analysis of Class Switch Recombination Junctions Sequenced by High-Throughput Sequencing.

    Science.gov (United States)

    Boyer, François; Boutouil, Hend; Dalloul, Iman; Dalloul, Zeinab; Cook-Moreau, Jeanne; Aldigier, Jean-Claude; Carrion, Claire; Herve, Bastien; Scaon, Erwan; Cogné, Michel; Péron, Sophie

    2017-05-15

    B cells ensure humoral immune responses due to the production of Ag-specific memory B cells and Ab-secreting plasma cells. In secondary lymphoid organs, Ag-driven B cell activation induces terminal maturation and Ig isotype class switch (class switch recombination [CSR]). CSR creates a virtually unique IgH locus in every B cell clone by intrachromosomal recombination between two switch (S) regions upstream of each C region gene. The amount and structural features of CSR junctions reveal valuable information about the CSR mechanism, and analysis of CSR junctions is useful in basic and clinical research studies of B cell functions. To provide an automated tool able to analyze large data sets of CSR junction sequences produced by high-throughput sequencing (HTS), we designed CSReport, a software program dedicated to supporting analysis of CSR junctions sequenced with an HTS-based protocol (Ion Torrent technology). CSReport was assessed using simulated data sets of CSR junctions and then used for analysis of Sμ-Sα and Sμ-Sγ1 junctions from CH12F3 cells and primary murine B cells, respectively. CSReport identifies junction segment breakpoints on reference sequences and junction structure (blunt-ended junctions or junctions with insertions or microhomology). Besides the ability to analyze unprecedentedly large libraries of junction sequences, CSReport will provide a unified framework for CSR junction studies. Our results show that CSReport is an accurate tool for analysis of sequences from our HTS-based protocol for CSR junctions, thereby facilitating and accelerating their study. Copyright © 2017 by The American Association of Immunologists, Inc.
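
    The junction classification described (blunt, insertion, or microhomology) reduces to comparing where the two S-region segments map within a read. A simplified sketch follows; the coordinates are invented, and CSReport's real logic operates on full aligner output and is more involved.

        def junction_structure(donor_end: int, acceptor_start: int) -> str:
            """Classify a CSR junction from read coordinates of its two segments.

            donor_end      -- last read position matching the upstream S region
            acceptor_start -- first read position matching the downstream S region
            (0-based, both inclusive; illustrative simplification)
            """
            gap = acceptor_start - donor_end - 1
            if gap > 0:
                return f"insertion of {gap} nt"
            if gap == 0:
                return "blunt junction"
            return f"microhomology of {-gap} nt"

        for donor_end, acceptor_start in [(49, 50), (49, 56), (49, 46)]:
            print(junction_structure(donor_end, acceptor_start))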

  10. Modelling, Verification, and Comparative Performance Analysis of the B.A.T.M.A.N. Protocol

    NARCIS (Netherlands)

    Chaudhary, Kaylash; Fehnker, Ansgar; Mehta, Vinay; Hermanns, Holger; Höfner, Peter

    2017-01-01

    This paper considers a network routing protocol known as Better Approach to Mobile Ad hoc Networks (B.A.T.M.A.N.). The protocol serves two aims: first, to discover all bidirectional links, and second, to identify the best next hop for every other node in the network. A key element is that each
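
    B.A.T.M.A.N.'s best-next-hop rule counts, per originator, how many recent originator messages (OGMs) arrived through each neighbour. A toy sketch of that bookkeeping follows; it omits TTLs, rebroadcasting and echo suppression, so it is a simplification rather than the verified protocol model.

        from collections import defaultdict, deque

        WINDOW = 64   # sliding window of recent OGM sequence numbers

        class OgmTable:
            """Count recent OGMs per (originator, neighbour); the neighbour
            with the most received OGMs becomes the best next hop."""

            def __init__(self):
                self.seen = defaultdict(lambda: deque(maxlen=WINDOW))

            def receive_ogm(self, originator, seqno, via_neighbour):
                self.seen[originator].append((seqno, via_neighbour))

            def best_next_hop(self, originator):
                counts = defaultdict(int)
                for _, nbr in self.seen[originator]:
                    counts[nbr] += 1
                return max(counts, key=counts.get) if counts else None

        table = OgmTable()
        for seq in range(10):
            table.receive_ogm("node-D", seq, "nbr-A" if seq % 3 else "nbr-B")
        print(table.best_next_hop("node-D"))   # nbr-A: more OGMs arrived via it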

  11. An Overview and Analysis of Mobile Internet Protocols in Cellular Environments.

    Science.gov (United States)

    Chao, Han-Chieh

    2001-01-01

    Notes that cellular is the inevitable future architecture for the personal communication service system. Discusses the current cellular support based on Mobile Internet Protocol version 6 (Ipv6) and points out the shortfalls of using Mobile IP. Highlights protocols especially for mobile management schemes which can optimize a high-speed mobile…

  12. 75 FR 53273 - Federal Aquatic Nuisance Species Research Risk Analysis Protocol

    Science.gov (United States)

    2010-08-31

    ... Aquatic Nuisance Species Task Force (ANSTF). The Protocol is available for public review and comment... the draft revised Protocol are available on the ANSTF website, http://anstaskforce.gov/documents.php... nonindigenous species (ANS) and is designed to reduce the risk that research activities may cause introduction...

  13. Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis

    Science.gov (United States)

    Ball, Richard; Medeiros, Norm

    2012-01-01

    This article describes a protocol the authors developed for teaching undergraduates to document their statistical analyses for empirical research projects so that their results are completely reproducible and verifiable. The protocol is guided by the principle that the documentation prepared to accompany an empirical research project should be…

  14. Modified Scoring, Traditional Item Analysis, and Sato's Caution Index Used To Investigate the Reading Recall Protocol.

    Science.gov (United States)

    Deville, Craig W.; Chalhoub-Deville, Micheline

    A study demonstrated the utility of item analyses to investigate which items function well or poorly in a second language reading recall protocol instrument. Data were drawn from a larger study of 56 learners of German as a second language at various proficiency levels. Pausal units of scored recall protocols were analyzed using both classical…

  15. Racism as a determinant of health: a protocol for conducting a systematic review and meta-analysis

    Science.gov (United States)

    2013-01-01

    Background Racism is increasingly recognized as a key determinant of health. A growing body of epidemiological evidence shows strong associations between self-reported racism and poor health outcomes across diverse minority groups in developed countries. While the relationship between racism and health has received increasing attention over the last two decades, a comprehensive meta-analysis focused on the health effects of racism has yet to be conducted. The aim of this review protocol is to provide a structure from which to conduct a systematic review and meta-analysis of studies that assess the relationship between racism and health. Methods This research will consist of a systematic review and meta-analysis. Studies will be considered for review if they are empirical studies reporting quantitative data on the association between racism and health for adults and/or children of all ages from any racial/ethnic/cultural groups. Outcome measures will include general health and well-being, physical health, mental health, healthcare use and health behaviors. Scientific databases (for example, Medline) will be searched using a comprehensive search strategy and reference lists will be manually searched for relevant studies. In addition, use of online search engines (for example, Google Scholar), key websites, and personal contact with experts will also be undertaken. Screening of search results and extraction of data from included studies will be independently conducted by at least two authors, including assessment of inter-rater reliability. Studies included in the review will be appraised for quality using tools tailored to each study design. Summary statistics of study characteristics and findings will be compiled and findings synthesized in a narrative summary as well as a meta-analysis. Discussion This review aims to examine associations between reported racism and health outcomes. This comprehensive and systematic review and meta-analysis of empirical research

  16. Racism as a determinant of health: a protocol for conducting a systematic review and meta-analysis.

    Science.gov (United States)

    Paradies, Yin; Priest, Naomi; Ben, Jehonathan; Truong, Mandy; Gupta, Arpana; Pieterse, Alex; Kelaher, Margaret; Gee, Gilbert

    2013-09-23

    Racism is increasingly recognized as a key determinant of health. A growing body of epidemiological evidence shows strong associations between self-reported racism and poor health outcomes across diverse minority groups in developed countries. While the relationship between racism and health has received increasing attention over the last two decades, a comprehensive meta-analysis focused on the health effects of racism has yet to be conducted. The aim of this review protocol is to provide a structure from which to conduct a systematic review and meta-analysis of studies that assess the relationship between racism and health. This research will consist of a systematic review and meta-analysis. Studies will be considered for review if they are empirical studies reporting quantitative data on the association between racism and health for adults and/or children of all ages from any racial/ethnic/cultural groups. Outcome measures will include general health and well-being, physical health, mental health, healthcare use and health behaviors. Scientific databases (for example, Medline) will be searched using a comprehensive search strategy and reference lists will be manually searched for relevant studies. In addition, use of online search engines (for example, Google Scholar), key websites, and personal contact with experts will also be undertaken. Screening of search results and extraction of data from included studies will be independently conducted by at least two authors, including assessment of inter-rater reliability. Studies included in the review will be appraised for quality using tools tailored to each study design. Summary statistics of study characteristics and findings will be compiled and findings synthesized in a narrative summary as well as a meta-analysis. This review aims to examine associations between reported racism and health outcomes. This comprehensive and systematic review and meta-analysis of empirical research will provide a rigorous and

  17. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  18. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.
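
    At its core, a financial analysis scenario tool automates discounted-cash-flow arithmetic. A minimal sketch of NPV and a bisection IRR on an invented hydrogen-station cash-flow profile follows; these are not H2FAST's models or data.

        def npv(rate, cashflows):
            # Net present value of cash flows indexed by year (year 0 first).
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

        def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-6):
            # Bisection on NPV(rate) = 0; assumes one sign change in cashflows.
            while hi - lo > tol:
                mid = (lo + hi) / 2.0
                if npv(mid, cashflows) > 0:
                    lo = mid
                else:
                    hi = mid
            return (lo + hi) / 2.0

        # Invented profile: station build cost, then ten years of net revenue.
        flows = [-2_000_000] + [350_000] * 10
        print(f"NPV @ 8% = ${npv(0.08, flows):,.0f}; IRR = {irr(flows):.1%}")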

  19. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    Science.gov (United States)

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  20. A novel tool for continuous fracture aftercare - Clinical feasibility and first results of a new telemetric gait analysis insole.

    Science.gov (United States)

    Braun, Benedikt J; Bushuven, Eva; Hell, Rebecca; Veith, Nils T; Buschbaum, Jan; Holstein, Joerg H; Pohlemann, Tim

    2016-02-01

    Weight bearing after lower extremity fractures still remains a highly controversial issue. Even for ankle fractures, the most common lower extremity injury, no standard aftercare protocol has been established. Average non-weight-bearing times range from 0 to 7 weeks, with standardised, radiological healing controls at fixed time intervals. Recent literature calls for patient-adapted aftercare protocols based on individual fracture and load scenarios. We show the clinical feasibility and first results of a new, insole-embedded gait analysis tool for continuous monitoring of gait, load and activity. Ten patients were monitored with a new, independent gait analysis insole for up to 3 months postoperatively. Strict 20 kg partial weight bearing was ordered for 6 weeks. Overall activity, load spectrum, ground reaction forces, clinical scoring and general health data were recorded and correlated. Statistical analysis with power analysis, t-test and Spearman correlation was performed. Only one patient completely adhered to the set weight bearing limit. The average time over the limit was 374 minutes. Based on the parameters load, activity, gait time over the 20 kg weight bearing limit and maximum ground reaction force, high and low performers were defined after 3 weeks. A significant difference in time to painless full weight bearing between high and low performers was shown. Correlation analysis revealed a significant correlation between weight bearing and clinical scoring as well as pain (American Orthopaedic Foot and Ankle Society (AOFAS) Score rs=0.74; Olerud-Molander Score rs=0.93; VAS pain rs=-0.95). Early, continuous gait analysis is able to define aftercare performers with significant differences in time to full painless weight bearing where clinical or radiographic controls could not. Patient compliance with standardised weight bearing limits and protocols is low. Highly individual rehabilitation patterns were seen in all patients. Aftercare protocols should be adjusted to real
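
    The load-monitoring summary described — time spent above a partial-weight-bearing limit — is straightforward to compute from insole samples. A sketch with invented 1 Hz readings (the actual device's sampling rate and data format are not given in the abstract):

        # Summarise insole load data against a partial-weight-bearing limit.
        LIMIT_KG = 20.0
        samples = [5, 12, 18, 25, 40, 38, 15, 22, 30, 8]   # vertical load, 1 Hz

        over = [s for s in samples if s > LIMIT_KG]
        time_over_s = len(over)          # at 1 Hz, one second per sample
        peak = max(samples)
        print(f"{time_over_s} s over the {LIMIT_KG:.0f} kg limit "
              f"({time_over_s / len(samples):.0%} of recording), peak {peak} kg")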