Sample records for reliable assay protocol

  1. Development of a reliable assay protocol for identification of diseases (RAPID)-bioactive amplification with probing (BAP) for detection of Newcastle disease virus.

    Wang, Chi-Young; Hsu, Chia-Jen; Chen, Heng-Ju; Chulu, Julius L C; Liu, Hung-Jen


    Due to the appearance of new genotypes of Newcastle disease virus (NDV) with no cross-protection by vaccine strains, outbreaks have been reported in Taiwan that caused significant damage to the poultry industry. A reliable assay protocol for identification of diseases (RAPID)-bioactive amplification with probing (BAP) for detection of NDV, which uses a nested PCR and a magnetic bead-based probe to increase sensitivity and specificity, was developed. Primers and probes were designed based on the conserved region of the F protein-encoding gene sequences of all NDV Taiwan isolates. The optimal annealing temperature for nested reverse transcription-polymerase chain reaction (RT-PCR) to amplify the gene was 61 °C, and optimal hybridization occurred in 1× SSC buffer with 0.5% SDS at 50 °C. The sensitivity of RAPID-BAP was 1 copy/μl for standard plasmids and 10 copies/μl for the transcribed F protein-encoding gene of NDV, with comparable linearity (R² = 0.984 versus R² = 0.99). This sensitivity was superior to that of other techniques currently in use. The assay was also highly specific: the negative controls, including classical swine fever virus, avian influenza virus, avian reovirus, and infectious bursal disease virus, could not be detected. Thirty-four field samples were tested using conventional RT-PCR, nested RT-PCR, real-time quantitative RT-PCR, and the RAPID-BAP assay; the positive rates were 24%, 30%, 41%, and 53%, respectively. The developed assay allows rapid, correct, and sensitive detection of NDV and fulfils all of the key requirements for clinical applicability. It can reliably rule out false negatives from antibody-based assays and facilitate rapid diagnosis in the early phase of disease for emergency quarantine, which may help prevent large-scale outbreaks.

  2. The reliable multicast protocol application programming interface

    Montgomery , Todd; Whetten, Brian


    The Application Programming Interface for the Berkeley/WVU implementation of the Reliable Multicast Protocol is described. This transport layer protocol is implemented as a user library that applications and software buses link against.

  3. Reliable multicasting in the Xpress Transport Protocol

    Atwood, J.W. [Concordia Univ., Montreal, PQ (Canada)]; Catrina, O. [Polytehnica Univ., Bucharest (Romania)]; Fenton, J. [Mentat, Inc., Los Angeles, CA (United States)]; Strayer, W.T. [Sandia National Labs., Livermore, CA (United States)]


    The Xpress Transport Protocol (XTP) is designed to meet the needs of distributed, real-time, and multimedia systems. This paper describes the genesis of recent improvements to XTP that provide mechanisms for reliable management of multicast groups, and gives details of the mechanisms used.

  4. Fault recovery in the reliable multicast protocol

    Callahan, John R.; Montgomery, Todd L.; Whetten, Brian


    The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages to other group members in a distributed environment, using an underlying IP Multicast [12, 5] medium, even in the case of reformations. A distributed application can use the various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.

  5. Joint Architecture Standard (JAS) Reliable Data Delivery Protocol (RDDP) specification.

    Enderle, Justin Wayne; Daniels, James W.; Gardner, Michael T.; Eldridge, John M.; Hunt, Richard D.; Gallegos, Daniel E.


    The Joint Architecture Standard (JAS) program at Sandia National Laboratories requires the use of a reliable data delivery protocol over SpaceWire. The National Aeronautics and Space Administration at the Goddard Space Flight Center in Greenbelt, Maryland, developed and specified a reliable protocol for its Geostationary Operational Environmental Satellite program, known as the GOES-R Reliable Data Delivery Protocol (GRDDP). The JAS program implemented and tested GRDDP and then suggested a number of modifications to the original specification to meet its program-specific requirements. This document details the full RDDP specification as modified for JAS. The JAS Reliable Data Delivery Protocol uses the lower-level SpaceWire data link layer to provide reliable packet delivery services to one or more higher-level host application processes. This document specifies the functional requirements for JRDDP but does not specify the interfaces to the lower- or higher-level processes, which may be implementation-dependent.

  6. Verification and validation of a reliable multicast protocol

    Callahan, John R.; Montgomery, Todd L.


    This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.

  7. Cascades: A reliable dissemination protocol for data collection sensor networks

    Peng, Y.; Song, W.; Huang, R.; Xu, M.; Shirazi, B.; LaHusen, R.; Pei, G.


    In this paper, we propose Cascades, a fast and reliable data dissemination protocol for delivering data from the sink (base station) to all or a subset of nodes in a data collection sensor network. Cascades makes use of a parent-monitor-children analogy to ensure reliable dissemination: each node monitors whether or not its children have received the broadcast messages, either by snooping the children's rebroadcasts or by waiting for explicit ACKs. If a node detects a gap in its message sequence, it can fetch the missing messages from its neighbours reactively. Cascades also considers many practical issues for field deployment, such as dynamic topology and link/node failure. It therefore guarantees that a disseminated message from the sink reaches all intended receivers and that dissemination terminates within a short time period. Note that existing dissemination protocols either do not guarantee reliability or do not terminate [1, 2], which does not meet the requirements of real-time command and control. We conducted experimental evaluations in both the TOSSIM simulator and a sensor network testbed to compare Cascades with existing dissemination protocols in TinyOS sensor networks; the results show that Cascades achieves a higher degree of reliability, lower communication cost, and less delivery delay. © 2009 IEEE.
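The gap-detection and reactive-repair step described above can be sketched in a few lines; the class and method names below are illustrative, not taken from the Cascades implementation.

```python
class DisseminationNode:
    """Tracks the highest sequence number seen and any gaps below it."""

    def __init__(self):
        self.highest_seq = -1
        self.received = set()

    def on_message(self, seq):
        self.received.add(seq)
        self.highest_seq = max(self.highest_seq, seq)

    def missing_gaps(self):
        # Gaps are sequence numbers below the highest seen but never received;
        # in Cascades these would be fetched reactively from neighbours.
        return [s for s in range(self.highest_seq + 1) if s not in self.received]

    def repair_from(self, neighbour):
        # Pull any missing messages that a neighbour happens to hold.
        for seq in self.missing_gaps():
            if seq in neighbour.received:
                self.on_message(seq)
```

A node that received messages 0, 1, 3, 4 would report sequence number 2 as missing and fetch it from any neighbour holding it.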

  8. Enhanced reliable transmission control protocol for spatial information networks

    Qin, Zhihong; Zhang, Juan; Wang, Junfeng


    Satellite channels are generally characterized by a high bit error rate (BER), long propagation delay, and large bandwidth-delay product (BDP). These conditions tend to cause serious performance degradation for traditional TCP in satellite networks. Therefore, a TCP-compatible reliable transmission protocol for spatial information networks, TCP-AX, is proposed in this paper, and a bandwidth probing mechanism is designed to distinguish network congestion from link errors. Simulation results show that TCP-AX performs better than several popular enhanced TCP protocols.
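The congestion/link-error discrimination that bandwidth probing enables can be illustrated with a toy decision rule; the threshold, function name, and units are assumptions for illustration, not details from the TCP-AX paper.

```python
def classify_loss(probed_bandwidth, expected_bandwidth, threshold=0.9):
    """Toy decision rule: if probing still measures close to the expected
    bottleneck bandwidth, the path is likely not congested, so a loss is
    attributed to a link (bit) error; otherwise treat it as congestion."""
    if probed_bandwidth >= threshold * expected_bandwidth:
        return "link_error"   # keep the sending rate, just retransmit
    return "congestion"       # back off as standard TCP would
```

Only the congestion branch triggers a window reduction, which is why such schemes avoid the throughput collapse standard TCP suffers on high-BER satellite links.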

  9. Reliable adaptive multicast protocol in wireless Ad hoc networks

    Sun Baolin; Li Layuan


    In wireless ad hoc network environments, every link is wireless and every node is mobile. These features make data easily lost and render multicasting inefficient and unreliable; efficient, reliable multicast in wireless ad hoc networks remains a difficult issue. Providing a high delivery ratio for each packet transmission is a major challenge, because link changes in a multicast tree cause transmission delays and packet losses. In this paper, we propose and evaluate the Reliable Adaptive Multicast Protocol (RAMP), based on a relay-node concept. Relay nodes are placed along the multicast tree, and data recovery is performed between relay nodes. RAMP supports reliable multicasting suitable for mobile ad hoc networks by reducing the number of packet retransmissions. We compare RAMP with SRM (Scalable Reliable Multicast). Simulation results show that RAMP achieves a high delivery ratio and low end-to-end delay for packet transmission.

  10. Reliable and Energy Efficient Protocol for MANET Multicasting

    Bander H. AlQarni


    A mobile ad hoc network (MANET) consists of a self-configured set of portable mobile nodes without any central infrastructure to regulate traffic in the network. These networks present problems such as lack of congestion control, reliability, and energy consumption. In this paper, we present a new model for MANET multicasting called Reliable and Energy Efficient Protocol Depending on Distance and Remaining Energy (REEDDRE). Our proposal is based on a tone system to provide more efficiency and better performance, and it combines solutions over the Medium Access Control (MAC) layer. The protocol consists of a new construction method for mobile nodes using a clustering approach that depends on distance and remaining energy to provide more stability and to reduce energy consumption. In addition, we propose an adjustment to the typical multicast flow by adding unicast links between clusters. We further present in our model a technique to provide more reliability based on a busy-tone system (RMBTM) to reduce the excessive control overhead caused by control packets in error recovery. We simulate our proposal using OPNET, and the results show enhancements in terms of reliability, packet delivery ratio (PDR), energy consumption, and throughput.

  11. Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis

    Montgomery, Todd L.


    This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed, so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority-resilient, and totally resilient atomic delivery. These guarantees are selectable on a per-message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense; RMP discounts this belief. The first implementation of RMP has been shown to provide high-throughput performance on Local Area Networks (LANs): for two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP occurred concurrently. This allowed the verification to maintain high fidelity among the design model, the implementation model, and the verification model. The restrictions of implementation influenced the design earlier than in normal sequential approaches, and the protocol as a whole matured more smoothly through the inclusion of several different perspectives into the product development.

  12. Reliable and Energy Efficient Protocol for Wireless Sensor Network

    KAN Baoqiang; CAI Li; XU Yongjun


    Low-power design is one of the most important issues in wireless sensor networks (WSNs), while reliable information transfer must be ensured as well. Transmission power (TP) control is a simple method for reducing power consumption, but excessive interference from potentially adjacent operating links and communication reliability between nodes must be considered. In this paper, a reliable and energy-efficient protocol is presented, which adopts adaptive rate control based on an optimal TP. A mathematical model considering average interference and network connectivity was used to predict the optimal TP. Then, at the optimal TP, active nodes adaptively chose the data rate according to changes in bit error rate (BER) performance. The efficiency of the new strategy was validated by mathematical analysis and simulations. Compared with 802.11 DCF, which uses a maximum unified TP, and the BASIC protocol, the results show that higher average throughput can be achieved while the energy consumption per useful bit is reduced.
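The adaptive rate-control step, choosing a data rate from the BER observed at the fixed optimal TP, might look like the following sketch; the rate table and BER thresholds are invented for illustration, not values from the paper.

```python
def select_rate(ber, rate_table):
    """Pick the highest data rate whose maximum tolerable BER is not exceeded.

    rate_table: list of (rate_kbps, max_ber) pairs, sorted by descending rate.
    """
    for rate, max_ber in rate_table:
        if ber <= max_ber:
            return rate
    # Channel worse than every threshold: fall back to the most robust rate.
    return rate_table[-1][0]

# Hypothetical rate table: faster rates demand a cleaner channel.
RATES = [(250, 1e-6), (100, 1e-4), (40, 1e-2)]
```

A node measuring a BER of 1e-7 would transmit at 250 kbps, while one seeing 5e-5 would drop to 100 kbps.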

  13. Characterization of perovskite solar cells: Towards a reliable measurement protocol

    Eugen Zimmermann


    Lead halide perovskite solar cells have shown a tremendous rise in power conversion efficiency, with reported record efficiencies of over 20%, making this material very promising as a low-cost alternative to conventional inorganic solar cells. However, due to "hysteretic" behaviour of varying severity during current density-voltage measurements, which strongly depends on scan rate, device and measurement history, preparation method, device architecture, etc., commonly used solar cell measurements do not give reliable or even reproducible results. For commercialization, and to make it possible to compare results of different devices among different laboratories, it is necessary to establish a measurement protocol which gives reproducible results. Therefore, we compare device characteristics derived from standard current density-voltage measurements with stabilized values obtained from adaptive tracking of the maximum power point and the open-circuit voltage, as well as characteristics extracted from time-resolved current density-voltage measurements. Our results provide insight into the challenges of correctly determining device performance and propose a measurement protocol for a reliable characterisation which is easy to implement and has been tested on various perovskite solar cells fabricated in different laboratories.
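The adaptive maximum-power-point tracking mentioned above is, in spirit, a perturb-and-observe loop: nudge the voltage, keep going while power rises, reverse when it falls. A minimal sketch on a toy, non-physical power-voltage curve (all numbers and names are illustrative, not from the paper):

```python
def track_mpp(power, v0=0.1, step=0.01, iters=300):
    """Perturb-and-observe MPP tracking on a power(v) curve."""
    v, direction = v0, +1
    p_prev = power(v)
    for _ in range(iters):
        v += direction * step
        p = power(v)
        if p < p_prev:
            direction = -direction  # power dropped: reverse the perturbation
        p_prev = p
    return v

# Toy curve: current collapses steeply as v approaches Voc ~ 1.1 V.
toy_power = lambda v: v * max(0.0, 1.0 - (v / 1.1) ** 8)
```

On this toy curve the tracker settles into a small oscillation around the maximum power point near 0.84 V, which is why stabilized MPP values are less scan-rate-dependent than a single J-V sweep.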

  14. Characterization of perovskite solar cells: Towards a reliable measurement protocol

    Zimmermann, Eugen; Wong, Ka Kan; Müller, Michael; Hu, Hao; Ehrenreich, Philipp; Kohlstädt, Markus; Würfel, Uli; Mastroianni, Simone; Mathiazhagan, Gayathri; Hinsch, Andreas; Gujar, Tanaji P.; Thelakkat, Mukundan; Pfadler, Thomas; Schmidt-Mende, Lukas


    Lead halide perovskite solar cells have shown a tremendous rise in power conversion efficiency, with reported record efficiencies of over 20%, making this material very promising as a low-cost alternative to conventional inorganic solar cells. However, due to "hysteretic" behaviour of varying severity during current density-voltage measurements, which strongly depends on scan rate, device and measurement history, preparation method, device architecture, etc., commonly used solar cell measurements do not give reliable or even reproducible results. For commercialization, and to make it possible to compare results of different devices among different laboratories, it is necessary to establish a measurement protocol which gives reproducible results. Therefore, we compare device characteristics derived from standard current density-voltage measurements with stabilized values obtained from adaptive tracking of the maximum power point and the open-circuit voltage, as well as characteristics extracted from time-resolved current density-voltage measurements. Our results provide insight into the challenges of correctly determining device performance and propose a measurement protocol for a reliable characterisation which is easy to implement and has been tested on various perovskite solar cells fabricated in different laboratories.

  15. A Reliable Routing Protocol for Wireless Vehicular Networks

    Mohsen Madani


    Recently, much attention has been paid to Vehicular Ad hoc Networks (VANETs). VANETs address direct communication from vehicle to vehicle and from vehicles to roadside units (RSUs). They are similar to Mobile Ad hoc Networks (MANETs) in their rapid and dynamic network topology changes due to the fast motion of nodes. The high mobility of nodes and the limitations of network resources have made routing one of the most important challenges in VANET research. Therefore, guaranteeing a stable and reliable routing algorithm over a VANET is one of the main steps toward realizing effective vehicular communications. In this paper, a two-step AODV-based routing protocol is proposed for VANETs. First, node grouping is performed using mobility information such as speed and movement direction. If the first step cannot respond efficiently, the algorithm enters the second step, which uses link expiration time (LET) information in the formation of the groups. The goal of the proposed protocol is to increase the stability of the routing algorithm by selecting long-lived routes and decreasing link breakages. The proposed algorithm is compared with the AODV and DSR protocols via the Network Simulator NS-2. It is shown that the proposed algorithm increases the delivery ratio and decreases the routing control overhead.

  16. Connectivity-Based Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs

    Woo-Yong Choi


    We propose an efficient reliable multicast MAC protocol based on the connectivity information among the recipients. Enhancing the BMMM (Batch Mode Multicast MAC) protocol, the reliable multicast MAC protocol significantly reduces the number of RAK (Request for ACK) frame transmissions in a reasonable computational time and enhances MAC performance. Through analytical performance analysis, the throughputs of the BMMM protocol and our proposed MAC protocol are derived. Numerical examples show that our proposed MAC protocol increases reliable multicast MAC performance for IEEE 802.11 wireless LANs.

  17. A Hierarchical Energy Efficient Reliable Transport Protocol for Wireless Sensor Networks

    Prabhudutta Mohanty


    Two important requirements for many Wireless Sensor Networks (WSNs) are prolonged network lifetime and end-to-end reliability. Sensor nodes consume more energy during data transmission than during sensing. In a WSN, redundant data increase energy consumption and latency and reduce reliability during data transmission. It is therefore important to support energy-efficient, reliable data transport in WSNs. In this paper, we present a Hierarchical Energy Efficient Reliable Transport Protocol (HEERTP) for data transmission within the WSN. This protocol maximises the network lifetime by controlling redundant data transmission with the co-ordination of the Base Station (BS). The proposed protocol also achieves end-to-end reliability using a hop-by-hop acknowledgement scheme. We evaluate the performance of the proposed protocol through simulation. The simulation results reveal that our proposed protocol achieves better performance in terms of energy efficiency, latency, and reliability than existing protocols.
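A hop-by-hop acknowledgement scheme of the kind described above can be sketched as follows; the function names and retry limit are assumptions for illustration, not details of HEERTP itself.

```python
def send_over_path(links, packet, max_retries=3):
    """Forward a packet hop by hop along a path of links.

    Each hop retransmits until the next node ACKs (the link callable
    returns True) or the retry budget is exhausted. Returns the number
    of attempts each hop needed, so callers can see where losses occur.
    """
    attempts = []
    for link in links:
        for attempt in range(1, max_retries + 1):
            if link(packet):
                attempts.append(attempt)
                break
        else:
            # No ACK after max_retries: report the unrecoverable hop.
            raise RuntimeError("hop failed after retries")
    return attempts
```

Because each loss is repaired at the hop where it happened, a retransmission crosses one link rather than the whole path, which is the energy argument for hop-by-hop over end-to-end recovery.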

  18. Reliable Data Delivery in MANETs Using POR Protocol

    B. Sunil Kumar


    This paper addresses the delivery of data packets in highly dynamic mobile ad hoc networks in a reliable and timely manner. Most existing ad hoc routing protocols are susceptible to node mobility, especially in large-scale networks. To address this problem, a Position-based Opportunistic Routing (POR) protocol is adopted, which takes advantage of the stateless nature of geographic routing and the broadcast nature of the wireless medium. When a data packet is sent out, several of the neighbour nodes that have overheard the transmission serve as forwarding candidates and take turns forwarding the packet if it is not relayed by the designated best forwarder within a certain period of time. By utilizing such in-the-air backup, communication is maintained without interruption; the additional latency incurred by local route recovery is greatly reduced, and the duplicate relaying caused by packet rerouting is also decreased. In the case of a communication void, a Virtual Destination-based Void Handling (VDVH) scheme is further proposed to work together with POR. Both theoretical analysis and simulation results show that POR achieves excellent performance even under high node mobility, with acceptable overhead, and that the new void-handling scheme also works well.

  19. RMAC: A Reliable MAC Protocol Supporting Multicast for Wireless Ad Hoc Networks

    Wei-Sheng Si; Cheng-Zhi Li


    This paper presents a new reliable MAC protocol called RMAC, supporting reliable multicast for wireless ad hoc networks. By utilizing busy tones to realize multicast reliability, RMAC has three novelties: (1) it uses a variable-length control frame to stipulate an order in which the receivers respond, thus solving the feedback collision problem; (2) it extends the use of the busy tone for preventing data frame collisions to the multicast scenario; and (3) it introduces a new use of the busy tone for positively acknowledging data frames. In addition, RMAC is generalized into a comprehensive MAC protocol that provides both reliable and unreliable services for all three modes of communication: unicast, multicast, and broadcast, making it capable of supporting various upper-layer protocols. The evaluation shows that RMAC achieves high reliability with very limited overhead. RMAC is also compared with other reliable MAC protocols, showing that RMAC not only provides higher reliability but also involves lower cost.

  20. An Acetylcholinesterase-Based Chronoamperometric Biosensor for Fast and Reliable Assay of Nerve Agents

    Rene Kizek


    The enzyme acetylcholinesterase (AChE) is an important part of the cholinergic nervous system, where it stops neurotransmission by hydrolysis of the neurotransmitter acetylcholine. It is sensitive to inhibition by organophosphate and carbamate insecticides, some Alzheimer's disease drugs, secondary metabolites such as aflatoxins, and nerve agents used in chemical warfare. When immobilized on a sensor (physico-chemical transducer), it can be used for assay of these inhibitors. In the experiments described herein, an AChE-based electrochemical biosensor using screen-printed electrode systems was prepared. The biosensor was used for assay of nerve agents such as sarin, soman, tabun, and VX. The limits of detection achieved in a measuring protocol lasting ten minutes were 7.41 × 10−12 mol/L for sarin, 6.31 × 10−12 mol/L for soman, 6.17 × 10−11 mol/L for tabun, and 2.19 × 10−11 mol/L for VX, respectively. The assay was reliable, with only minor interferences caused by the organic solvents ethanol, methanol, isopropanol, and acetonitrile. Isopropanol was chosen as a suitable medium for processing lipophilic samples.

  1. ZyFISH: a simple, rapid and reliable zygosity assay for transgenic mice.

    Donal McHugh

    Microinjection of DNA constructs into fertilized mouse oocytes typically results in random transgene integration at a single genomic locus. The resulting transgenic founders can be used to establish hemizygous transgenic mouse lines. However, practical and experimental reasons often require that such lines be bred to homozygosity. Transgene zygosity can be determined by progeny testing assays, which are expensive and time-consuming; by quantitative Southern blotting, which is labor-intensive; or by quantitative PCR (qPCR), which requires transgene-specific design. Here, we describe a zygosity assessment procedure based on fluorescence in situ hybridization (zyFISH). The zyFISH protocol entails the detection of transgenic loci by FISH and the concomitant assignment of homozygosity using a concise and unbiased scoring system. The method requires small volumes of blood, is scalable to at least 40 determinations per assay, and produces results entirely consistent with the progeny testing assay. This combination of reliability, simplicity, and cost-effectiveness makes zyFISH a method of choice for transgenic mouse zygosity determinations.

  2. Corrections to "Connectivity-Based Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs"

    Choi Woo-Yong


    We have found errors in the throughput formulae presented in our paper "Connectivity-Based Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs". We provide the corrected formulae and numerical results.

  3. Shoulder muscle endurance: the development of a standardized and reliable protocol

    Roy Jean-Sébastien


    Abstract. Background: Shoulder muscle fatigue has been proposed as a possible link to explain the association between repetitive arm use and the development of rotator cuff disorders. To our knowledge, no standardized clinical endurance protocol has been developed to evaluate the effects of muscle fatigue on shoulder function. Such a test could improve clinical examination of individuals with shoulder disorders. Therefore, the purpose of this study was to establish a reliable protocol for objective assessment of shoulder muscle endurance. Methods: An endurance protocol was developed on a stationary dynamometer (Biodex System 3). The endurance protocol was performed in isotonic mode with the resistance set at 50% of each subject's peak torque as measured for shoulder external rotation (ER) and internal rotation (IR). Each subject performed 60 continuous repetitions of IR/ER rotation. The endurance protocol was performed by 36 healthy individuals on two separate occasions at least two days apart. Maximal isometric shoulder strength tests were performed before and after the fatigue protocol to evaluate the effects of the endurance protocol and its reliability. Paired t-tests were used to evaluate the reduction in shoulder strength due to the protocol, while intraclass correlation coefficients (ICC) and minimal detectable change (MDC) were used to evaluate its reliability. Results: Maximal isometric strength was significantly decreased after the endurance protocol, and the protocol's reliability was high (ICC > 0.84). Conclusions: The changes in muscular performance observed during and after the endurance protocol suggest that the protocol did induce muscular fatigue. Furthermore, this study established that the effects of fatigue from the proposed isotonic protocol were reproducible over time. The protocol was performed without difficulty by all volunteers and took less than 10 minutes, suggesting that it is feasible for clinical practice. This protocol could be used to induce […]

  4. Micronuclei assay: A potential biomonitoring protocol in occupational exposure studies.

    Palanikumar, L; Panneerselvam, N


    As micronuclei (MN) derive from chromosomal fragments and whole chromosomes lagging behind at anaphase, the MN assay can be used to show both clastogenic and aneugenic effects. This particularly concerns the use of MN as a biomarker of genotoxic exposure and effects, where differences in MN frequencies between exposed subjects and referents are expected to be small. The present paper reviews the use of the MN assay in biomonitoring of occupational exposure studies.

  5. Opportunistic Cooperative Reliable Transmission Protocol for Wireless Sensor Networks

    Hua Guo


    The high reliability requirement for data transmission in wireless sensor networks is a major challenge due to random mobility and network topology instability, as well as restrictions on energy consumption and system lifetime. In this work, a relationship matrix of distance and speed is first created. A binary tree of mobility and reliability is then constructed based on the distance relationship matrix, and the optimal relay nodes with high reliability are selected by searching this binary tree. Finally, a highly reliable opportunistic cooperative data transmission scheme is presented. The results of both mathematical analysis and NS simulation indicate that the proposed mechanism is superior to traditional data transmission mechanisms in terms of reliability, system throughput, energy efficiency, and so on.

  6. Reliable and Power Efficient Routing Protocol for MANETs

    V.Sowmya Devi


    In this paper, we focus on one of the most widespread network types, the Mobile Ad hoc Network (MANET). Due to the dynamic nature and limited power of nodes, routes fail frequently, which in turn causes high power dissipation. This paper proposes reliable and power-efficient routing through nodes with a high power level. The approach also concentrates on reducing power consumption during route failures by adopting an on-demand local route recovery mechanism through a set of helping nodes, called Support Nodes (SNs). The cooperation of support nodes reduces power consumption and significantly increases reliability. The performance of the proposed approach was evaluated in terms of average energy consumption, packet delivery ratio, and end-to-end delay over varying node speeds and packet sizes. The power optimization and reliability achieved by the proposed approach offer a practical solution for long-lived communication in MANETs.

  7. Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs with Extended Service Range

    Choi, Woo-Yong


    In this paper, we propose an efficient reliable multicast MAC protocol by which the AP (Access Point) can reliably transmit its multicast data frames to recipients within the AP's one-hop or two-hop transmission range. The AP uses the STAs (Stations) that are directly associated with it as relays for data delivery to remote recipients that cannot be reached directly. Based on the connectivity information among the recipients, the reliable multicast MAC protocol optimizes the number of RAK (Request for ACK) frame transmissions in a reasonable computational time. Numerical examples show that our proposed MAC protocol significantly enhances MAC performance compared with the BMMM (Batch Mode Multicast MAC) protocol extended to support recipients in the AP's one-hop or two-hop transmission range in IEEE 802.11 wireless LANs.

  8. An Energy-Efficient Link Layer Protocol for Reliable Transmission over Wireless Networks

    Iqbal Adnan


    Full Text Available In multihop wireless networks, hop-by-hop reliability is generally achieved through positive acknowledgments at the MAC layer. However, positive acknowledgments introduce significant energy inefficiencies on battery-constrained devices, and this inefficiency becomes particularly pronounced on high-error-rate channels. We propose to reduce the energy consumed by retransmissions using a novel protocol that localizes bit-errors at the MAC layer. The proposed protocol, referred to as Selective Retransmission using Virtual Fragmentation (SRVF), requires simple modifications to the positive-ACK-based reliability mechanism but provides substantial improvements in energy efficiency. The main premise of the protocol is to localize bit-errors by performing partial checksums on disjoint parts, or virtual fragments, of a packet. In case of error, only the corrupted virtual fragments are retransmitted. We develop stochastic models of the Simple Positive-ACK-based reliability, the previously proposed Packet Length Optimization (PLO) protocol, and the SRVF protocol operating over an arbitrary-order Markov wireless channel. Our analytical models show that SRVF provides significant theoretical improvements in energy efficiency over existing protocols. We then use bit-error traces collected over different real networks to empirically compare the proposed and existing protocols. These experimental results further substantiate that SRVF provides considerably better energy efficiency than the Simple Positive-ACK and Packet Length Optimization protocols.
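    The virtual-fragmentation idea, partial checksums over disjoint parts of a packet so that only corrupted parts are retransmitted, can be illustrated with a small sketch. The fragment count and the choice of CRC-32 are our assumptions, not the paper's exact design:

```python
import zlib

def fragments(payload: bytes, n: int):
    """Split a payload into n roughly equal virtual fragments."""
    size = -(-len(payload) // n)  # ceiling division
    return [payload[i * size:(i + 1) * size] for i in range(n)]

def partial_checksums(frags):
    """One checksum per virtual fragment (CRC-32 here, purely illustrative)."""
    return [zlib.crc32(f) for f in frags]

def corrupted(frags_rx, sums_tx):
    """Indices of fragments whose received bytes no longer match the sender's
    checksum; only these fragments need to be retransmitted."""
    return [i for i, (f, s) in enumerate(zip(frags_rx, sums_tx))
            if zlib.crc32(f) != s]

packet = bytes(range(64))
tx = fragments(packet, 4)
sums = partial_checksums(tx)
rx = list(tx)
rx[2] = bytes([rx[2][0] ^ 0xFF]) + rx[2][1:]  # flip one byte in fragment 2
print(corrupted(rx, sums))  # [2]
```

    A single corrupted byte thus triggers retransmission of one fragment rather than the whole packet, which is where the energy saving in SRVF-style schemes comes from.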

  9. Inter-laboratory variation in DNA damage using a standard comet assay protocol

    Forchhammer, Lykke; Ersson, Clara; Loft, Steffen


    There are substantial inter-laboratory variations in the levels of DNA damage measured by the comet assay. The aim of this study was to investigate whether adherence to a standard comet assay protocol would reduce inter-laboratory variation in reported values of DNA damage. Fourteen laboratories ...

  10. Ada-MAC: An Adaptive MAC Protocol for Real-time and Reliable Health Monitoring

    Xia, Feng; Wang, Linqiang; Zhang, Daqiang; Zhang, Xue (Department of Physics, Liaoning Normal University, Dalian, 116029, China); Gao, Ruixia


    IEEE 802.15.4 is regarded as one of the most suitable communication protocols for cyber-physical applications of wireless sensor and actuator networks, because it achieves low-power and low-cost transmission in wireless personal area networks. However, most cyber-physical systems (CPSs) require a degree of real-time performance and reliability from the underlying communication protocol, some more strictly than others, and the IEEE 802.15.4 protocol cannot provide relia...

  11. Energy based reliable multicast routing protocol for packet forwarding in MANET

    S. Gopinath


    Full Text Available Mobile Ad hoc Networks consist of mobile nodes without any supporting infrastructure. Node mobility causes network partitions, which lead to heavy overhead and a low packet forwarding ratio. In this work, a Residual Energy based Reliable Multicast Routing Protocol (RERMR) is proposed to attain longer network lifetime and increased packet delivery and forwarding rates. A multicast backbone is constructed to achieve greater stability based on node familiarity and trustability. A reliable-path criterion is estimated to choose the best path among all available paths, and data packets are forwarded once the reliable path is chosen. We also demonstrate that the residual energy of paths helps maximize network lifetime. Simulation results show that the proposed work achieves better performance than previous protocols in terms of packet reliability rate, network stability rate, end-to-end delay, end-to-end transmission and communication overhead.
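    A common way to formalize a residual-energy path criterion like the one described is a bottleneck (max-min) metric: a path is only as reliable as its least-charged node. The sketch below is our own illustration of that idea, not RERMR's actual formula; node names and energy values are invented:

```python
def path_metric(path, residual):
    """Bottleneck metric: the smallest residual energy along the path."""
    return min(residual[n] for n in path)

def best_path(paths, residual):
    """Choose the path whose weakest node has the most energy left."""
    return max(paths, key=lambda p: path_metric(p, residual))

# hypothetical node energies (joules) and candidate source-to-destination paths
residual = {"s": 9.0, "a": 2.0, "b": 6.0, "c": 5.0, "d": 9.0}
paths = [["s", "a", "d"], ["s", "b", "c", "d"]]
print(best_path(paths, residual))  # ['s', 'b', 'c', 'd']
```

    The longer path wins here because its bottleneck node still has 5.0 J, whereas the shorter path would die at node "a" with only 2.0 J, which is exactly the failure mode a residual-energy criterion is meant to avoid.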

  12. Development of high-reliable real-time communication network protocol for SMART

    Song, Ki Sang; Kim, Young Sik [Korea National University of Education, Chongwon (Korea); No, Hee Chon [Korea Advanced Institute of Science and Technology, Taejon (Korea)


    In this research, we first define protocol subsets for the SMART (System-integrated Modular Advanced Reactor) communication network, based on SMART MMIS transmission delay and traffic requirements and the OSI (Open System Interconnection) 7-layer network protocol functions. Current industrial LAN protocols are analyzed and the applicability of commercialized protocols is checked. For the suitability test, we applied approximated SMART data traffic and the maximum allowable transmission delay requirement. From the simulation results, we conclude that IEEE 802.5 and FDDI, an ANSI standard, are the most suitable for SMART. We further analyzed the FDDI and token ring protocols for the SMART and nuclear plant network environment, including IEEE 802.4, IEEE 802.5, and ARCnet. The most suitable protocol for SMART is FDDI; the FDDI MAC and RMT protocol specifications have been verified with LOTOS, and the verification results show that FDDI MAC and RMT satisfy reachability and liveness and exhibit neither deadlock nor livelock. Therefore, we conclude that FDDI MAC and RMT constitute a highly reliable protocol for the SMART MMIS network. We then consider the stacking fault of the IEEE 802.5 token ring protocol and propose a fault-tolerant MAM (Modified Active Monitor) protocol. The simulation results show that the MAM protocol improves the lower-priority traffic service rate when a stacking fault occurs. Therefore, the proposed MAM protocol can be applied to the SMART communication network for high-reliability and hard real-time communication in data acquisition and inter-channel networks. (author). 37 refs., 79 figs., 39 tabs.

  13. Reliability of Mobile Agents for Reliable Service Discovery Protocol in MANET

    Neogy, Roshni; Neogy, Sarmistha; 10.5121/ijwmn.2011.3518


    Recently, mobile agents have been used to discover services in mobile ad-hoc networks (MANETs), where agents travel through the network, collecting and sometimes spreading the dynamically changing service information. It is important to investigate how reliable the agents are for this application, since the dependability issues (reliability and availability) of a MANET are strongly affected by its dynamic nature. The complexity of the underlying MANET makes it hard to obtain the route reliability of mobile agent systems (MAS) analytically; instead, we estimate it using Monte Carlo simulation. We propose an algorithm for estimating the task route reliability of a MAS deployed for discovering services, which takes into account the effect of node mobility in the MANET. We also show, by considering different mobility models, that the mobility pattern of the nodes affects MAS performance. Multipath propagation of the radio signal is considered when deciding link existence, and transient link errors are also considered. Finally we propose a metric t...


    Jayanthi Gokulakrishnan


    Full Text Available A network that merges the use of public and private networks, and uses security software to compress, encrypt and mask the digital packets transmitted over it, is called a Virtual Private Network (VPN). In a VPN, communication between the user ends is maintained such that it appears as if the source end were directly linked to the destination end over a dedicated leased line. A VPN uses a public network such as the Internet to link remote locations with users. In this study, we propose a new reliable protocol called the Topology Aware Reliable Routing Protocol (TARRP) for large-scale VPNs and compare its performance with the traditional protocol, the Border Gateway Protocol (BGP). In this protocol, communication between end nodes takes place in two phases: a routing phase and an authentication phase. In the routing phase, the upstream and downstream routing paths are determined by the source node using a topology learning protocol. Based on dynamic link-failure information, the sender selects a failure-free path towards the destination. In the authentication phase, the VPN gateway authenticates each packet before it is transmitted through the core. Thus, this technique efficiently allows packets to be transmitted with ensured security. Simulation results show that our proposed protocol outperforms the traditional VPN routing protocol.


    Pallavi Kaliyar


    Full Text Available A mobile ad-hoc network (MANET) is an infrastructureless network in which the mobile nodes communicate with each other. Owing to characteristics such as a highly dynamic topology and the limited battery power of the nodes, routing is one of the key issues, and since the mobile nodes of an ad-hoc network cannot be given a significant amount of power, energy consumption is also important. Limited battery power raises further problems: if a node fails, data packets are lost and reliable data transfer cannot be guaranteed. In this paper, an algorithm is proposed for data transmission that detects an impending node failure (due to energy depletion) before it actually happens, which improves network lifetime. The proposed routing algorithm is more energy-efficient than the AODV routing algorithm. The performance is analyzed using the NS2 simulator on the basis of several metrics: energy consumption, packet delivery ratio, network lifetime, network routing overhead and the number of exhausted nodes in the network.

  16. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    Wu, Yunqing


    The Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helps identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  17. Reliable Robust and Real-Time Communication Protocol for Data Delivery in Wireless sensor Networks

    Virmani, Deepali; Jain, Satbir


    WSNs can be considered a distributed control system designed to react to sensor information with effective and timely action. For this reason, it is important in WSNs to provide real-time coordination and communication to guarantee timely execution of the right actions. In this paper a new communication protocol, RRRT, is proposed to support robust, real-time and reliable event data delivery with minimum energy consumption and congestion avoidance in WSNs. The proposed protocol uses the ...

  18. Utilizing high throughput screening data for predictive toxicology models: protocols and application to MLSCN assays

    Guha, Rajarshi; Schürer, Stephan C.


    Computational toxicology is emerging as an encouraging alternative to experimental testing. The Molecular Libraries Screening Center Network (MLSCN) as part of the NIH Molecular Libraries Roadmap has recently started generating large and diverse screening datasets, which are publicly available in PubChem. In this report, we investigate various aspects of developing computational models to predict cell toxicity based on cell proliferation screening data generated in the MLSCN. By capturing feature-based information in those datasets, such predictive models would be useful in evaluating cell-based screening results in general (for example from reporter assays) and could be used as an aid to identify and eliminate potentially undesired compounds. Specifically we present the results of random forest ensemble models developed using different cell proliferation datasets and highlight protocols to take into account their extremely imbalanced nature. Depending on the nature of the datasets and the descriptors employed we were able to achieve percentage correct classification rates between 70% and 85% on the prediction set, though the accuracy rate dropped significantly when the models were applied to in vivo data. In this context we also compare the MLSCN cell proliferation results with animal acute toxicity data to investigate to what extent animal toxicity can be correlated and potentially predicted by proliferation results. Finally, we present a visualization technique that allows one to compare a new dataset to the training set of the models to decide whether the new dataset may be reliably predicted.
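    One simple protocol for the extreme class imbalance the authors highlight is to undersample the majority class before training each ensemble member. The sketch below shows that balancing step only; it is a generic technique and not necessarily the authors' exact procedure, and the toy data are invented:

```python
import random

def undersample(X, y, rng):
    """Balance a binary dataset by randomly undersampling the majority class."""
    pos = [i for i, label in enumerate(y) if label == 1]
    neg = [i for i, label in enumerate(y) if label == 0]
    minority, majority = (pos, neg) if len(pos) <= len(neg) else (neg, pos)
    keep = minority + rng.sample(majority, len(minority))
    rng.shuffle(keep)
    return [X[i] for i in keep], [y[i] for i in keep]

# hypothetical screen: 8 inactive compounds, 2 toxic ones
X = [[float(i)] for i in range(10)]
y = [0] * 8 + [1] * 2
Xb, yb = undersample(X, y, random.Random(0))
print(len(yb), sum(yb))  # 4 2
```

    Repeating this sampling with different seeds and training one model per balanced subset gives an ensemble in the spirit of the random-forest models described, without letting the majority class dominate every tree.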

  19. Evaluation of the reliability of maize reference assays for GMO quantification.

    Papazova, Nina; Zhang, David; Gruden, Kristina; Vojvoda, Jana; Yang, Litao; Buh Gasparic, Meti; Blejec, Andrej; Fouilloux, Stephane; De Loose, Marc; Taverniers, Isabel


    A reliable PCR reference assay for relative genetically modified organism (GMO) quantification must be specific for the target taxon and amplify uniformly along the commercialised varieties within the considered taxon. Different reference assays for maize (Zea mays L.) are used in official methods for GMO quantification. In this study, we evaluated the reliability of eight existing maize reference assays, four of which are used in combination with an event-specific polymerase chain reaction (PCR) assay validated and published by the Community Reference Laboratory (CRL). We analysed the nucleotide sequence variation in the target genomic regions in a broad range of transgenic and conventional varieties and lines: MON 810 varieties cultivated in Spain and conventional varieties from various geographical origins and breeding history. In addition, the reliability of the assays was evaluated based on their PCR amplification performance. A single base pair substitution, corresponding to a single nucleotide polymorphism (SNP) reported in an earlier study, was observed in the forward primer of one of the studied alcohol dehydrogenase 1 (Adh1) (70) assays in a large number of varieties. The SNP presence is consistent with a poor PCR performance observed for this assay along the tested varieties. The obtained data show that the Adh1 (70) assay used in the official CRL NK603 assay is unreliable. Based on our results from both the nucleotide stability study and the PCR performance test, we can conclude that the Adh1 (136) reference assay (T25 and Bt11 assays) as well as the tested high mobility group protein gene assay, which also form parts of CRL methods for quantification, are highly reliable. Despite the observed uniformity in the nucleotide sequence of the invertase gene assay, the PCR performance test reveals that this target sequence might occur in more than one copy. Finally, although currently not forming a part of official quantification methods, zein and SSIIb

  20. Inter-laboratory variation in DNA damage using a standard comet assay protocol.

    Forchhammer, Lykke; Ersson, Clara; Loft, Steffen; Möller, Lennart; Godschalk, Roger W L; van Schooten, Frederik J; Jones, George D D; Higgins, Jennifer A; Cooke, Marcus; Mistry, Vilas; Karbaschi, Mahsa; Collins, Andrew R; Azqueta, Amaya; Phillips, David H; Sozeri, Osman; Routledge, Michael N; Nelson-Smith, Kirsty; Riso, Patrizia; Porrini, Marisa; Matullo, Giuseppe; Allione, Alessandra; Stępnik, Maciej; Komorowska, Magdalena; Teixeira, João Paulo; Costa, Solange; Corcuera, Laura-Ana; López de Cerain, Adela; Laffon, Blanca; Valdiglesias, Vanessa; Møller, Peter


    There are substantial inter-laboratory variations in the levels of DNA damage measured by the comet assay. The aim of this study was to investigate whether adherence to a standard comet assay protocol would reduce inter-laboratory variation in reported values of DNA damage. Fourteen laboratories determined the baseline level of DNA strand breaks (SBs)/alkaline labile sites and formamidopyrimidine DNA glycosylase (FPG)-sensitive sites in coded samples of mononuclear blood cells (MNBCs) from healthy volunteers. There were technical problems in seven laboratories in adopting the standard protocol, which were not related to the level of experience. Therefore, the inter-laboratory variation in DNA damage was only analysed using the results from laboratories that had obtained complete data with the standard comet assay protocol. This analysis showed that the differences between reported levels of DNA SBs/alkaline labile sites in MNBCs were not reduced by applying the standard assay protocol as compared with the laboratory's own protocol. There was large inter-laboratory variation in FPG-sensitive sites by the laboratory-specific protocol and the variation was reduced when the samples were analysed by the standard protocol. The SBs and FPG-sensitive sites were measured in the same experiment, indicating that the large spread in the latter lesions was the main reason for the reduced inter-laboratory variation. However, it remains worrying that half of the participating laboratories obtained poor results using the standard procedure. This study indicates that future comet assay validation trials should take steps to evaluate the implementation of standard procedures in participating laboratories.

  1. RLT Code Based Handshake-Free Reliable MAC Protocol for Underwater Sensor Networks

    Xiujuan Du


    Full Text Available The characteristics of underwater acoustic channels, such as long propagation delay and low bit rate, make medium access control (MAC) protocols designed for radio channels either inapplicable or inefficient for underwater sensor networks (UWSNs). Meanwhile, due to high bit-error rates, conventional end-to-end reliable transfer solutions cause too many retransmissions and are inefficient in UWSNs. In this paper, we present a recursive LT (RLT) code. With a small degree distribution and recursive encoding, RLT achieves reliable hop-by-hop transmission while reducing the complexity of encoding and decoding in UWSNs. We further propose an RLT code based handshake-free (RCHF) reliable MAC protocol. In the RCHF protocol, each node maintains a neighbor table including a state field, and packets are forwarded according to the state of the receiver, which avoids sending-receiving and overhearing collisions. The transmission-avoidance time in RCHF decreases data-ACK collisions dramatically. Without RTS/CTS handshaking, the RCHF protocol improves channel utilization while achieving reliable transmission. Simulation results show that, compared with existing reliable data transport approaches for underwater networks, RCHF improves network throughput while decreasing end-to-end overhead.
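    An LT-style fountain code like the RLT code builds each encoded packet by XOR-ing a small random subset of source blocks and decodes by "peeling" packets with a single unknown block. The sketch below shows that generic mechanism with a toy degree distribution; the recursive encoding that distinguishes RLT is not reproduced here:

```python
import random

def lt_encode(blocks, n_packets, rng):
    """Each packet XORs a random subset of source blocks (its 'degree')."""
    k = len(blocks)
    packets = []
    for _ in range(n_packets):
        d = rng.choice([1, 1, 2, 2, 3, 4])  # toy degree distribution
        idxs = frozenset(rng.sample(range(k), d))
        value = 0
        for i in idxs:
            value ^= blocks[i]
        packets.append((idxs, value))
    return packets

def lt_decode(packets, k):
    """Peeling decoder: repeatedly resolve packets with one unknown block."""
    known = {}
    work = [(set(idxs), value) for idxs, value in packets]
    progress = True
    while progress and len(known) < k:
        progress = False
        for idxs, value in work:
            unknown = idxs - set(known)
            if len(unknown) == 1:
                i = unknown.pop()
                for j in idxs - {i}:
                    value ^= known[j]
                known[i] = value
                progress = True
    return [known.get(i) for i in range(k)]

# hand-built packets over blocks [5, 9, 12, 7] so decoding is deterministic
pkts = [({0}, 5), ({0, 1}, 5 ^ 9), ({1, 2}, 9 ^ 12), ({2, 3}, 12 ^ 7)]
print(lt_decode(pkts, 4))  # [5, 9, 12, 7]
```

    The appeal for underwater links is that the receiver needs no handshake: any sufficiently large set of encoded packets lets it peel out the source blocks, which matches the handshake-free design goal of RCHF.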

  2. Reliability and criterion validity of an observation protocol for working technique assessments in cash register work.

    Palm, Peter; Josephson, Malin; Mathiassen, Svend Erik; Kjellberg, Katarina


    We evaluated the intra- and inter-observer reliability and criterion validity of an observation protocol, developed in an iterative process involving practicing ergonomists, for assessment of working technique during cash register work for the purpose of preventing upper extremity symptoms. Two ergonomists independently assessed 17 15-min videos of cash register work on two occasions each, as a basis for examining reliability. Criterion validity was assessed by comparing these assessments with meticulous video-based analyses by researchers. Intra-observer reliability was acceptable (i.e. proportional agreement >0.7 and kappa >0.4) for 10/10 questions. Inter-observer reliability was acceptable for only 3/10 questions. An acceptable inter-observer reliability combined with an acceptable criterion validity was obtained only for one working technique aspect, 'Quality of movements'. Thus, major elements of the cashiers' working technique could not be assessed with an acceptable accuracy from short periods of observations by one observer, such as often desired by practitioners. Practitioner Summary: We examined an observation protocol for assessing working technique in cash register work. It was feasible in use, but inter-observer reliability and criterion validity were generally not acceptable when working technique aspects were assessed from short periods of work. We recommend the protocol to be used for educational purposes only.
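    The acceptability thresholds quoted above (proportional agreement > 0.7 and kappa > 0.4) can be checked with a short script. Cohen's kappa corrects raw agreement for the agreement expected by chance; the ratings below are invented for illustration:

```python
from collections import Counter

def proportional_agreement(a, b):
    """Fraction of items on which the two observers gave the same rating."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement: (p_o - p_e) / (1 - p_e)."""
    n = len(a)
    po = proportional_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[c] * cb[c] for c in set(a) | set(b)) / (n * n)
    return (po - pe) / (1 - pe)

# hypothetical ratings (1 = acceptable technique, 0 = not) from two observers
obs1 = [1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
obs2 = [1, 1, 0, 0, 0, 1, 0, 1, 1, 1]
po, k = proportional_agreement(obs1, obs2), cohens_kappa(obs1, obs2)
print(round(po, 2), round(k, 2))  # 0.8 0.58
print(po > 0.7 and k > 0.4)       # True
```

    Note how 80% raw agreement shrinks to kappa of about 0.58 once chance agreement (0.52 here) is discounted, which is why the study's criteria require both measures.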

  3. Improving the AODV Protocol to Satisfy the Required Level of Reliability for Home Area Networks

    Hossein Jafari Pozveh


    Full Text Available For decades, the structure of existing power grids has not changed. It is an old structure that depends heavily on fossil fuels as an energy source, which is likely to become critical for the energy sector. To solve these problems and to make optimal use of energy resources, a new concept has been proposed, called the Smart Grid. The Smart Grid is an electric power distribution automation system that can provide a two-way flow of electricity and information between power plants and consumers. The Smart Grid communications infrastructure consists of different network components, such as the Home Area Network (HAN), Neighborhood Area Network (NAN) and Wide Area Network (WAN). Achieving the required level of reliability in the transmission of information to all sections, including the HAN, is one of the main objectives in the design and implementation of the Smart Grid. This study offers a routing protocol that considers the parameters and constraints of the HAN and, by improving the AODV routing protocol, achieves the required level of reliability for data transmission in this network. The improvements include making the AODV routing protocol table-driven, extending it to compute multiple paths in a single route discovery, simplifying it, and accounting for the effect of HAN parameters. The results of NS2 simulations indicate that applying this improved routing protocol in the HAN satisfies the required level of reliability of the network, which is over 98%.

  4. An improved IEEE 802.11 protocol for reliable data transmission in power distribution fault diagnosis

    Campoccia, F.; Di Silvestre, M.L.; Sanseverino, E.R.; Zizzo, G. [Palermo Univ., Palermo (Italy)


    In power systems, on-line transmission between local units and the central unit can be carried out by means of power line communications or wireless technology. During an electrical fault, the reliability of the distribution system depends on the security and timeliness of the protective and restorative actions on the network. This paper focused on WiFi because of its economy and ease of installation. However, WiFi systems are typically managed by the IEEE 802.11 protocol, which is not reliable in terms of data communication security. In WiFi networks, data are divided into packets sent in succession to reduce errors within the radio channel. The IEEE 802.11 protocol has a high probability of packet loss or transmission delay. In order to guarantee data transmission times between two terminal units connected by WiFi stations, a new protocol was derived by modifying IEEE 802.11. The improvements of the new protocol were highlighted and its suitability for the diagnostic service was verified. The modified protocol eliminates the danger of collisions between packets and optimizes the transmission time for sending information. 6 refs., 7 tabs., 8 figs.

  5. The Reliability and Validity of Protocols for the Assessment of Endurance Sports Performance: An Updated Review

    Stevens, Christopher John; Dascombe, Ben James


    Sports performance testing is one of the most common and important measures used in sport science. Performance testing protocols must have high reliability to ensure any changes are not due to measurement error or inter-individual differences. High validity is also important to ensure test performance reflects true performance. Time-trial…


    B. Narasimhan


    Full Text Available Wireless sensor networks (WSNs) carry noteworthy advantages over traditional communication. However, harsh and complex environments pose great challenges to the reliability of WSN communications. It is therefore vital to develop a reliable unipath dynamic source routing protocol (RDSRP) for WSNs to provide better quality of service (QoS) in energy-harvesting wireless sensor networks (EH-WSNs). This paper proposes a dynamic source routing approach for attaining the most reliable route in EH-WSNs. Performance evaluation is carried out using NS-2, with throughput and packet delivery ratio chosen as the metrics.

  7. Resazurin Live Cell Assay: Setup and Fine-Tuning for Reliable Cytotoxicity Results.

    Rodríguez-Corrales, José Á; Josan, Jatinder S


    In vitro cytotoxicity tests allow fast and inexpensive screening of drug efficacy prior to in vivo studies. The resazurin assay (commercialized as Alamar Blue®) has been extensively utilized for this purpose in 2D and 3D cell cultures and high-throughput screening. However, improper or absent assay validation can generate unreliable results and limit reproducibility. Herein, we report a detailed protocol for the optimization of the resazurin assay to determine relevant analytical (limits of detection, quantification, and linear range) and biological (growth kinetics) parameters and thus provide accurate cytotoxicity results. Fine-tuning of the resazurin assay allows accurate and fast quantification of cytotoxicity for drug discovery. Unlike more complicated methods (e.g., mass spectrometry), this assay relies on fluorescence spectroscopy and thus provides a less costly way to observe changes in the reductase proteome of the cells.

  8. Salivary Cortisol Protocol Adherence and Reliability by Sociodemographic Features: the Multi-Ethnic Study of Atherosclerosis

    Golden, Sherita Hill; Sánchez, Brisa N.; DeSantis, Amy S.; Wu, Meihua; Castro, Cecilia; Seeman, Teresa E.; Tadros, Sameh; Shrager, Sandi; Diez Roux, Ana V.


    Collection of salivary cortisol has become increasingly popular in large population-based studies. However, the impact of protocol compliance on day-to-day reliabilities of measures, and the extent to which reliabilities differ systematically according to socio-demographic characteristics, has not been well characterized in large-scale population-based studies to date. Using data on 935 men and women from the Multi-ethnic Study of Atherosclerosis, we investigated whether sampling protocol compliance differs systematically according to socio-demographic factors and whether compliance was associated with cortisol estimates, as well as whether associations of cortisol with both compliance and socio-demographic characteristics were robust to adjustments for one another. We further assessed the day-to-day reliability for cortisol features and the extent to which reliabilities vary according to socio-demographic factors and sampling protocol compliance. Overall, we found higher compliance among persons with higher levels of income and education. Lower compliance was significantly associated with a less pronounced cortisol awakening response (CAR) but was not associated with any other cortisol features, and adjustment for compliance did not affect associations of socio-demographic characteristics with cortisol. Reliability was higher for area under the curve (AUC) and wake up values than for other features, but generally did not vary according to socio-demographic characteristics, with few exceptions. Our findings regarding intra-class correlation coefficients (ICCs) support prior research indicating that multiple day collection is preferable to single day collection, particularly for CAR and slopes, more so than wakeup and AUC. There were few differences in reliability by socio-demographic characteristics. Thus, it is unlikely that group-specific sampling protocols are warranted. PMID:24703168
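    Of the cortisol features mentioned, the area under the curve is the most mechanical to compute; a common convention (for example, Pruessner's AUC with respect to ground) is trapezoidal integration over the day's sampling times. The sampling times and values below are invented, and this sketch assumes that convention rather than reproducing the study's exact feature definitions:

```python
def auc_ground(times_min, values):
    """Trapezoidal area under the curve with respect to ground (AUCg):
    sum of trapezoids between consecutive sampling times."""
    return sum((values[i] + values[i + 1]) / 2 * (times_min[i + 1] - times_min[i])
               for i in range(len(values) - 1))

# hypothetical day: wake, wake+30 min, afternoon, bedtime (minutes since wake)
times = [0, 30, 480, 960]
cortisol = [12.0, 18.0, 8.0, 4.0]  # nmol/L
print(auc_ground(times, cortisol))  # 9180.0
```

    Because AUCg pools the whole day into one number, day-to-day noise partly averages out, which is consistent with the higher reliability the study reports for AUC than for the awakening response or diurnal slope.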

  9. Timer-based mechanisms in reliable transport protocol connection management: a comparison of the TCP and Delta-t protocol approaches

    Watson, R.W.


    There is a need for timer-based mechanisms (other than retransmission timers) to achieve reliable connection management in transport protocols. This need is illustrated by comparing the timer mechanisms in the Department of Defense Transmission Control Protocol (TCP), initially designed using only a message-exchange-based mechanism, with those in the Lawrence Livermore Laboratory Delta-t protocol, designed explicitly to be timer based. Bounding the maximum packet lifetime and related parameters is important for achieving transport protocol reliability, and a mechanism is outlined for enforcing such a bound.
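    The core of the timer-based approach is that, if every packet's lifetime is bounded, a receiver can discard anything older than that bound instead of depending on handshake state to detect stale duplicates. A minimal sketch of such an acceptance check follows; the 30-second bound and the function names are arbitrary assumptions, not values from TCP or Delta-t:

```python
MPL_SECONDS = 30.0  # assumed enforced maximum packet lifetime bound

def accept(sent_at, now):
    """Accept a packet only if it is provably younger than the MPL bound;
    anything older could be a stale duplicate from an earlier connection."""
    return (now - sent_at) <= MPL_SECONDS

print(accept(sent_at=100.0, now=120.0))  # True
print(accept(sent_at=100.0, now=140.0))  # False
```

    In a real protocol the bound must actually be enforced in the network (packets are destroyed once their lifetime expires), which is what allows connection state to be safely discarded after a fixed interval.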

  10. Rorschach intercoder reliability for protocol-level comprehensive system variables in an international sample.

    Sahly, Jennifer; Shaffer, Thomas W; Erdberg, Philip; O'Toole, Siobhan


    This study examines the intercoder reliability of Rorschach Comprehensive System (CS; Exner, 2001) protocol-level variables. A large international sample was combined to obtain intercoder agreement for 489 Rorschach protocols coded using the CS. Intercoder agreement was calculated using an Iota coefficient, a statistical coefficient similar to kappa that is corrected for chance. Iota values for the variables analyzed ranged from .31 to 1.00, with 2 in the poor range of agreement, 4 in the fair range, 25 in the good range, and 116 in the excellent range of agreement. Discrepancies between variables are discussed.

  11. Energy Constrained Reliable Routing Optimized Cluster Head Protocol for Multihop under Water Acoustic Sensor Networks



    Full Text Available An underwater acoustic sensor network is an emerging technology consisting of sensor nodes and AUVs that work together to sense various phenomena, convert the sensed information into digital data, store the data and communicate it to base stations through intermediate nodes. Underwater acoustic sensor networks play a major role in ocean applications. Unfortunately, their efficiency is inferior to that of terrestrial sensor networks due to long propagation delay, narrow bandwidth and high error rates, and the battery life and storage capacity of a node are limited. Many routing protocols have been proposed to improve the efficiency of underwater acoustic sensor networks, but their improvements are not sufficient, so a routing protocol is needed that considers all these limitations and makes communication in an underwater network viable. In this paper, we propose the Reliable Routing Optimized Cluster Head (RROCH) protocol, a network coding approach for multihop topologies. We use performance metrics such as packet delivery ratio, energy consumption, end-to-end delay and throughput of sensor nodes. LEACH, HMR-LEACH and LEACH-M are compared for their performance at different traffic conditions, numbers of nodes and depths. Our simulation results show that the RROCH protocol may be used for denser networks with low traffic, while the HMR-LEACH protocol is suitable for higher traffic with fewer nodes.

  12. An easy and efficient permeabilization protocol for in vivo enzyme activity assays in cyanobacteria

    Rasmussen, Randi Engelberth; Erstad, Simon Matthé; Ramos Martinez, Erick Miguel


    microbial cell factories. Better understanding of the activities of enzymes involved in the central carbon metabolism would lead to increasing product yields. Currently cell-free lysates are the most widely used method for determination of intracellular enzyme activities. However, due to thick cell walls...... and subsequent activity assays were successfully adapted to the 96-well plate system. CONCLUSIONS: An easy, efficient and scalable permeabilization protocol was established for cyanobacteria. The permeabilized cells can be directly applied for measurement of G6PDH and Rubisco activities without using...... radioisotopes and the protocol may be readily adapted to studies of other cyanobacterial species and other intracellular enzymes. The permeabilization and enzyme assays can be performed in 96-well plates in a high-throughput manner....

  13. Comparison of mRNA splicing assay protocols across multiple laboratories: recommendations for best practice in standardized clinical testing.

    Whiley, Phillip J; de la Hoya, Miguel; Thomassen, Mads; Becker, Alexandra; Brandão, Rita; Pedersen, Inge Sokilde; Montagna, Marco; Menéndez, Mireia; Quiles, Francisco; Gutiérrez-Enríquez, Sara; De Leeneer, Kim; Tenés, Anna; Montalban, Gemma; Tserpelis, Demis; Yoshimatsu, Toshio; Tirapo, Carole; Raponi, Michela; Caldes, Trinidad; Blanco, Ana; Santamariña, Marta; Guidugli, Lucia; de Garibay, Gorka Ruiz; Wong, Ming; Tancredi, Mariella; Fachal, Laura; Ding, Yuan Chun; Kruse, Torben; Lattimore, Vanessa; Kwong, Ava; Chan, Tsun Leung; Colombo, Mara; De Vecchi, Giovanni; Caligo, Maria; Baralle, Diana; Lázaro, Conxi; Couch, Fergus; Radice, Paolo; Southey, Melissa C; Neuhausen, Susan; Houdayer, Claude; Fackenthal, Jim; Hansen, Thomas Van Overeem; Vega, Ana; Diez, Orland; Blok, Rien; Claes, Kathleen; Wappenschmidt, Barbara; Walker, Logan; Spurdle, Amanda B; Brown, Melissa A


    Accurate evaluation of unclassified sequence variants in cancer predisposition genes is essential for clinical management and depends on a multifactorial analysis of clinical, genetic, pathologic, and bioinformatic variables and assays of transcript length and abundance. The integrity of assay data in turn relies on appropriate assay design, interpretation, and reporting. We conducted a multicenter investigation to compare mRNA splicing assay protocols used by members of the ENIGMA (Evidence-Based Network for the Interpretation of Germline Mutant Alleles) consortium. We compared similarities and differences in results derived from analysis of a panel of breast cancer 1, early onset (BRCA1) and breast cancer 2, early onset (BRCA2) gene variants known to alter splicing (BRCA1: c.135-1G>T, c.591C>T, c.594-2A>C, c.671-2A>G, and c.5467+5G>C and BRCA2: c.426-12_8delGTTTT, c.7988A>T, c.8632+1G>A, and c.9501+3A>T). Differences in protocols were then assessed to determine which elements were critical in reliable assay design. PCR primer design strategies, PCR conditions, and product detection methods, combined with a prior knowledge of expected alternative transcripts, were the key factors for accurate splicing assay results. For example, because of the position of primers and PCR extension times, several isoforms associated with BRCA1, c.594-2A>C and c.671-2A>G, were not detected by many sites. Variation was most evident for the detection of low-abundance transcripts (e.g., BRCA2 c.8632+1G>A Δ19,20 and BRCA1 c.135-1G>T Δ5q and Δ3). Detection of low-abundance transcripts was sometimes addressed by using more analytically sensitive detection methods (e.g., BRCA2 c.426-12_8delGTTTT ins18bp). We provide recommendations for best practice and raise key issues to consider when designing mRNA assays for evaluation of unclassified sequence variants.

  14. Mechanisms for a reliable timer-based protocol. [Packet-switched network]

    Fletcher, J.G.; Watson, R.W.


    Timer-based protocol mechanisms are developed for reliable and efficient transmission of both single-message and message-stream traffic; that is, correct data delivery is assured in the face of lost, damaged, duplicate, and out-of-sequence packets. The protocol mechanisms seem particularly useful in a high-speed local network environment. Current reliable protocol design approaches are not well suited for single-message modes of communication appropriate, for example, to distributed network operating systems. The timer intervals that must be maintained for sender and receiver are developed along with the rules for timer operation, packet acceptance, packet creation, and connection opening and closing. The underlying assumptions about network characteristics required for the timer-based approach to work correctly are discussed, particularly, that maximum packet lifetime can be bounded. The timer-based mechanisms are compared with mechanisms designed to deal with the same problems by using the exchange of multiple messages to open and close logical connections or virtual circuits. 5 figures.
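    The acceptance rules above rely on the key assumption that maximum packet lifetime is bounded, so a sequence number is never ambiguous between old and new packets. A toy sketch of a receiver-side acceptance window in that spirit (the window size and in-order delivery policy are illustrative assumptions, not the paper's exact rules):

```python
class TimerReceiver:
    """Toy receiver: drops duplicates and stale packets, delivers in order.

    Assumes the sender never reuses a sequence number within the maximum
    packet lifetime, which is the bound the timer-based approach requires.
    """

    def __init__(self, window=8):
        self.expected = 0      # next sequence number to deliver
        self.window = window   # acceptance window beyond `expected`
        self.buffer = {}       # out-of-order packets held for reassembly
        self.delivered = []

    def on_packet(self, seq):
        # duplicates (seq < expected) and packets beyond the window are dropped
        if seq < self.expected or seq >= self.expected + self.window:
            return
        self.buffer[seq] = True
        # deliver any now-contiguous run of packets in order
        while self.expected in self.buffer:
            del self.buffer[self.expected]
            self.delivered.append(self.expected)
            self.expected += 1
```

    Feeding the out-of-order, duplicated stream 0, 2, 2, 1, 3 yields in-order delivery of 0, 1, 2, 3 with the duplicate discarded.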

  15. Validity and reliability of a protocol of orofacial myofunctional evaluation for patients with obstructive sleep apnea.

    Folha, Gislaine A; Valera, Fabiana C P; de Felício, Cláudia M


    There is no standardized protocol for the clinical evaluation of orofacial components and functions in patients with obstructive sleep apnea. The aim of this study was to examine the validity, reliability, and psychometric properties of the Expanded Protocol of Orofacial Myofunctional Evaluation with Scores (OMES-expanded) in subjects with obstructive sleep apnea. Patients with obstructive sleep apnea and control subjects were evaluated, and the validity of OMES-expanded was tested by construct validity (i.e. the ability to discriminate orofacial status between apneic and control subjects) and criterion validity (i.e. correlation between OMES-expanded and a reference instrument). Construct validity was adequate; the apneic group showed significantly worse orofacial status than did control subjects. Criterion validity of OMES-expanded was good, as was its reliability. The OMES-expanded is valid and reliable for evaluating orofacial myofunctional disorders of patients with obstructive sleep apnea, with adequate psychometric properties. It may be useful to plan a therapeutic strategy and to determine whether the effects of therapy are related to improved muscle and orofacial functions.

  16. Intra-rater and inter-rater reliability of the standardized ultrasound protocol for assessing subacromial structures

    Hougs Kjær, Birgitte; Ellegaard, Karen; Wieland, Ina


    BACKGROUND: US-examinations related to shoulder impingement (SI) often vary due to methodological differences, examiner positions, transducers, and recording parameters. Reliable US protocols for examination of different structures related to shoulder impingement are therefore needed. OBJECTIVES:...

  17. Reassessing the reliability of the salivary cortisol assay for the diagnosis of Cushing syndrome.

    Zhang, Qian; Dou, Jingtao; Gu, Weijun; Yang, Guoqing; Lu, Juming


    The cortisol concentration in saliva is 10-fold lower than total serum cortisol and accurately reflects the serum concentration, both levels being lowest around midnight. The salivary cortisol assay measures free cortisol and is unaffected by confounding factors. This study analysed published data on the sensitivity and specificity of salivary cortisol levels in the diagnosis of Cushing syndrome. Data from studies on the use of different salivary cortisol assay techniques in the diagnosis of Cushing syndrome, published between 1998 and 2012 and retrieved using Ovid MEDLINE®, were analysed for variance and correlation. For the 11 studies analysed, mean sensitivity and specificity of the salivary cortisol assay were both >90%. Repeated measurements were easily made with this assay, enabling improved diagnostic accuracy in comparison with total serum cortisol measurements. This analysis confirms the reliability of the salivary cortisol assay as a pragmatic tool for the accurate diagnosis of Cushing syndrome. With many countries reporting a rising prevalence of metabolic syndrome, diabetes and obesity, in which there is often a high circulating cortisol level, salivary cortisol measurement will help distinguish these states from Cushing syndrome.
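    Sensitivity and specificity, the two figures pooled across the 11 studies, come straight from a diagnostic confusion table. A short sketch with hypothetical counts (not data from the analysis):

```python
def sensitivity(true_pos, false_neg):
    """Fraction of diseased subjects the assay correctly flags."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of healthy subjects the assay correctly clears."""
    return true_neg / (true_neg + false_pos)

# hypothetical counts: 50 Cushing patients, 100 controls
sens = sensitivity(true_pos=45, false_neg=5)    # 0.90
spec = specificity(true_neg=92, false_pos=8)    # 0.92
```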

  18. Specification and Design of a Fault Recovery Model for the Reliable Multicast Protocol

    Montgomery, Todd; Callahan, John R.; Whetten, Brian


    The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages over an underlying IP Multicast medium to other group members in a distributed environment, even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.

  19. Practical digest for evaluating the uncertainty of analytical assays from validation data according to the LGC/VAM protocol.

    González, A Gustavo; Angeles Herrador, M; Asuero, Agustín G


    The estimation of the measurement uncertainty of analytical assays based on the LGC/VAM protocol from validation data is fully revisited and discussed in the light of the study of precision, trueness and robustness.

  20. A step-by-step protocol for assaying protein carbonylation in biological samples.

    Colombo, Graziano; Clerici, Marco; Garavaglia, Maria Elisa; Giustarini, Daniela; Rossi, Ranieri; Milzani, Aldo; Dalle-Donne, Isabella


    Protein carbonylation represents the most frequent and usually irreversible oxidative modification affecting proteins. This modification is chemically stable, a feature particularly important for the storage and detection of carbonylated proteins. Many biochemical and analytical methods have been developed during the last thirty years to assay protein carbonylation. The most successful method consists of protein carbonyl (PCO) derivatization with 2,4-dinitrophenylhydrazine (DNPH) and subsequent spectrophotometric assay. This assay allows a global quantification of PCO content due to the ability of DNPH to react with carbonyls, giving rise to an adduct that absorbs at 366 nm. Similar approaches were also developed employing chromatographic separation, in particular HPLC, with parallel detection of the absorbing adducts. Subsequently, immunological techniques, such as Western immunoblot or ELISA, have been developed, increasing the sensitivity of protein carbonylation detection. Currently, they are widely employed to evaluate changes in total protein carbonylation and, eventually, to highlight the specific proteins undergoing selective oxidation. In the last decade, many mass spectrometry (MS) approaches have been developed for identifying carbonylated proteins and the amino acid residues modified to carbonyl derivatives. Although these MS methods are much more focused and detailed, owing to their ability to identify the amino acid residues undergoing carbonylation, they still require expensive equipment and are therefore not widely available. In this protocol paper, we summarise and comment on the most widespread protocols that a standard laboratory can employ to assess protein carbonylation; in particular, we describe the different protocols step by step, adding suggestions drawn from our on-bench experience.
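    The spectrophotometric step reduces to Beer-Lambert arithmetic: absorbance at 366 nm, the hydrazone's molar absorption coefficient, and the protein concentration give carbonyl content. A sketch using the commonly cited coefficient of ~22,000 M⁻¹cm⁻¹ for the DNP-hydrazone; verify the coefficient and wavelength against your own assay conditions:

```python
def carbonyl_nmol_per_mg(a366, protein_mg_per_ml, eps=22000.0, path_cm=1.0):
    """Carbonyl content of a DNPH-derivatized sample via Beer-Lambert.

    eps: molar absorption coefficient of the DNP-hydrazone near 366 nm,
    commonly cited as ~22,000 M^-1 cm^-1; an assumption to check locally.
    """
    molar = a366 / (eps * path_cm)   # mol/L of carbonyl groups
    nmol_per_ml = molar * 1e6        # 1 mol/L equals 1e6 nmol/mL
    return nmol_per_ml / protein_mg_per_ml
```

    For example, an absorbance of 0.22 in a 1 cm cuvette at 2 mg/mL protein corresponds to about 5 nmol carbonyl per mg protein.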

  1. A reliable transmission protocol for ZigBee-based wireless patient monitoring.

    Chen, Shyr-Kuen; Kao, Tsair; Chan, Chia-Tai; Huang, Chih-Ning; Chiang, Chih-Yen; Lai, Chin-Yu; Tung, Tse-Hua; Wang, Pi-Chung


    Patient monitoring systems are gaining importance as the fast-growing global elderly population increases demands for caretaking. These systems use wireless technologies to transmit vital signs for medical evaluation. In a multihop ZigBee network, existing systems usually use broadcast or multicast schemes to increase the reliability of signal transmission; however, both schemes lead to significantly higher network traffic and end-to-end transmission delay. In this paper, we present a reliable transmission protocol based on anycast routing for wireless patient monitoring. Our scheme automatically selects the closest data receiver in an anycast group as a destination to reduce the transmission latency as well as the control overhead. The new protocol also shortens the latency of path recovery by initiating route recovery from the intermediate routers of the original path. On the basis of this reliable transmission scheme, we implement a ZigBee device for fall monitoring, which integrates fall detection, indoor positioning, and ECG monitoring. When the triaxial accelerometer of the device detects a fall, the current position of the patient is transmitted to an emergency center through a ZigBee network. In order to clarify the situation of the fallen patient, 4-s ECG signals are also transmitted. Our transmission scheme ensures the successful transmission of these critical messages. The experimental results show that our scheme is fast and reliable. We also demonstrate that our devices can seamlessly integrate with the next-generation wireless wide area network technology, worldwide interoperability for microwave access, to achieve real-time patient monitoring.


    Craig A. Williams


    Full Text Available This study investigated the between-trial variation of thermoregulatory measures during a heat acclimation protocol. Eight 14-16 y old boys completed three bouts of 20-min cycling at 45% peak VO2 in a hot environment (35.1 ± 1.2 °C and 46.4 ± 1.0% relative humidity) on two occasions separated by a minimum of 24 h. Reliability was assessed through analysis of within-subject variation, the change in the mean, and retest correlation for measurements of aural temperature (Tau), mean skin temperature (Tsk), heart rate (HR) and oxygen uptake (VO2). Between-trial differences were low for Tau, Tsk bout 1, Tsk bouts 2 and 3, and HR, with coefficients of variation of 0.6%, 1.5%, 0.5% and 4.0%, respectively. The results demonstrate good reliability that will allow future investigators to precisely determine the extent of heat acclimation protocols in relation to the measurement error.
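    The within-subject coefficient of variation reported above can be computed from paired trial values: for each subject, the SD of the two trials divided by their mean, then averaged across subjects. A minimal sketch on hypothetical trial data:

```python
from statistics import mean, stdev

def within_subject_cv(trial1, trial2):
    """Mean within-subject CV (%) across paired measurements from two trials."""
    cvs = []
    for a, b in zip(trial1, trial2):
        m = (a + b) / 2
        s = stdev([a, b])          # SD of this subject's two trial values
        cvs.append(100 * s / m)
    return mean(cvs)

# hypothetical aural temperatures (degrees C) for two subjects, two trials
cv = within_subject_cv([37.1, 37.4], [37.2, 37.3])
```

    Identical trials give a CV of 0%; the 0.6% reported for Tau indicates very small trial-to-trial variation relative to the mean.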

  3. An approach to verification and validation of a reliable multicasting protocol: Extended Abstract

    Callahan, John R.; Montgomery, Todd L.


    This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. This initial version did not handle off-nominal cases such as network partitions or site failures. Meanwhile, the V&V team concurrently developed a formal model of the requirements using a variant of SCR-based state tables. Based on these requirements tables, the V&V team developed test cases to exercise the implementation. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test in the model and implementation agreed, then the test either found a potential problem or verified a required behavior. However, if the execution of a test was different in the model and implementation, then the differences helped identify inconsistencies between the model and implementation. In either case, the dialogue between both teams drove the co-evolution of the model and implementation. We have found that this

  4. The MoCA 5-min protocol is a brief, valid, reliable and feasible cognitive screen for telephone administration

    Wong, Adrian; Nyenhuis, David; Black, Sandra E; Law, Lorraine S.N.; Lo, Eugene S.K.; Kwan, Pauline W.L.; Au, Lisa; Chan, Anne YY; Wong, Lawrence KS; Nasreddine, Ziad; Mok, Vincent


    Background and Purpose The NINDS-CSN vascular cognitive impairment (VCI) Harmonization working group proposed a brief cognitive protocol for screening of VCI. We investigated the validity, reliability and feasibility of the Montreal Cognitive Assessment 5-minute protocol (MoCA 5-min protocol) administered over the telephone. Methods Four items examining attention, verbal learning and memory, executive functions/language and orientation were extracted from the MoCA to form the MoCA 5-min protocol. One hundred and four patients with stroke or TIA, including 53 with normal cognition (CDR 0) and 51 with cognitive impairment (CDR 0.5 or 1), were administered the MoCA in clinic and, a month later, the MoCA 5-min protocol over the telephone. Results Administration of the MoCA 5-min protocol took five minutes over the telephone. Total score of the MoCA 5-min protocol correlated negatively with age (r=-0.36, p<0.05 for difference; Cohen's d for group difference 0.80 to 1.13). It differentiated cognitively impaired patients with executive domain impairment from those without (AUC=0.89, p<0.001; Cohen's d=1.7 for group difference). 30-day test-retest reliability was excellent (intraclass correlation coefficient=0.89). Conclusions The MoCA 5-min protocol is a free, valid and reliable cognitive screen for stroke and TIA. It is brief and highly feasible for telephone administration. PMID:25700290

  5. The reliability of the ELEPAP clinical protocol for the 3D kinematic evaluation of upper limb function.

    Vanezis, Athanasios; Robinson, Mark A; Darras, Nikolaos


    Upper limb (UL) kinematic assessment protocols are becoming integrated into clinical practice due to their development over the last few years. We propose the ELEPAP UL protocol, a contemporary UL kinematic protocol that can be applied to different pathological conditions. This model is based on ISB modeling recommendations, uses functional joint definitions, and models three joints of the shoulder girdle. The specific aim of this study was to determine the within- and between-session reliability of the ELEPAP UL model. Ten healthy subjects (mean age: 13.6±4.3 years) performed four reach-to-grasp and five functional tasks, which included a novel throwing task to assess a wide spectrum of motor skills. Three trials of every task in two different sessions were analyzed. The reliability of angular waveforms was evaluated by measurement error (σ) and coefficient of multiple correlation (CMC). Spatiotemporal parameters were assessed by standard error of measurement (SEM). Generally, joint kinematics presented low σw and σb errors, and high CMC values (>0.60) were found, demonstrating good to excellent reliability, especially in joints with larger ranges of motion. The throwing task proved equally reliable, enhancing the universal application of the protocol. Compared to the literature, this study demonstrated higher reliability of the thorax, scapula and wrist joints. This was attributed to the highly standardized procedure and the implementation of recent methodological advancements. In conclusion, the ELEPAP protocol was proved a reliable tool to analyze UL kinematics. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Optimization of the transcranial magnetic stimulation protocol by defining a reliable estimate for corticospinal excitability.

    Koen Cuypers

    Full Text Available The goal of this study was to optimize the transcranial magnetic stimulation (TMS) protocol for acquiring a reliable estimate of corticospinal excitability (CSE) using single-pulse TMS. Moreover, the minimal number of stimuli required to obtain a reliable estimate of CSE was investigated. In addition, the effect of two frequently used stimulation intensities [110% relative to the resting motor threshold (rMT) and 120% rMT] and gender was evaluated. Thirty-six healthy young subjects (18 males and 18 females) participated in a double-blind crossover procedure. They received 2 blocks of 40 consecutive TMS stimuli at either 110% rMT or 120% rMT in a randomized order. Based upon our data, we advise that at least 30 consecutive stimuli are required to obtain the most reliable estimate for CSE. Stimulation intensity and gender had no significant influence on CSE estimation. In addition, our results revealed that for subjects with a higher rMT, fewer consecutive stimuli were required to reach a stable estimate of CSE. The current findings can be used to optimize the design of similar TMS experiments.

  7. Optimization of the transcranial magnetic stimulation protocol by defining a reliable estimate for corticospinal excitability.

    Cuypers, Koen; Thijs, Herbert; Meesen, Raf L J


    The goal of this study was to optimize the transcranial magnetic stimulation (TMS) protocol for acquiring a reliable estimate of corticospinal excitability (CSE) using single-pulse TMS. Moreover, the minimal number of stimuli required to obtain a reliable estimate of CSE was investigated. In addition, the effect of two frequently used stimulation intensities [110% relative to the resting motor threshold (rMT) and 120% rMT] and gender was evaluated. Thirty-six healthy young subjects (18 males and 18 females) participated in a double-blind crossover procedure. They received 2 blocks of 40 consecutive TMS stimuli at either 110% rMT or 120% rMT in a randomized order. Based upon our data, we advise that at least 30 consecutive stimuli are required to obtain the most reliable estimate for CSE. Stimulation intensity and gender had no significant influence on CSE estimation. In addition, our results revealed that for subjects with a higher rMT, fewer consecutive stimuli were required to reach a stable estimate of CSE. The current findings can be used to optimize the design of similar TMS experiments.
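    One way to operationalize "a stable estimate after ~30 stimuli" is a cumulative-mean criterion over the train of motor evoked potential (MEP) amplitudes: find the first stimulus count from which every subsequent cumulative mean stays close to the grand mean. The 5% tolerance below is an illustrative assumption, not the authors' criterion:

```python
from statistics import mean

def stable_after(mep_amps, tol=0.05):
    """Smallest n such that every cumulative mean from stimulus n onward
    stays within tol (fractional) of the grand mean of all stimuli."""
    final = mean(mep_amps)
    for n in range(1, len(mep_amps) + 1):
        if all(abs(mean(mep_amps[:k]) - final) <= tol * final
               for k in range(n, len(mep_amps) + 1)):
            return n
    return len(mep_amps)
```

    Applied per subject, such a criterion would also reproduce the paper's observation that less variable MEP trains (here, subjects with higher rMT) stabilize after fewer stimuli.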

  8. Reliable beacon transmission based MAC protocol for LR-WPANs over WLAN interferences

    Ji-Hoon PARK; Byung-Seo KIM


    The use of IEEE 802.15.4 standard based application systems has been rapidly increasing, for example, in medical services, sensor networks, public safety systems, and home automation systems. However, issues arise from the fact that IEEE 802.15.4 standard based low rate wireless personal area networks (LR-WPANs) use the same frequency bands as wireless local area networks (WLANs), and they interfere with each other. Based on past research on this issue, the interference has a more serious impact on LR-WPANs’ performance than on WLANs’ performance. In this paper we propose a method to improve LR-WPANs’ performance while coexisting with WLANs, which is called the reliable beacon transmission based medium access control (MAC) protocol. Since the reliability of a beacon frame is important, in this method, only the beacon frame is transmitted in interference-free channels, and the data packets are transmitted in interfered channels instead of abandoning the channels altogether. This method increases the reliability of beacon frames as well as overall channel utilizations. The effectiveness of the proposed method was evaluated through extensive simulations, and this paper proves that this method improves the performance of IEEE 802.15.4 based wireless sensor networks (WSNs) over WLANs’ interferences.

  9. A Performance Evaluation of NACK-Oriented Protocols as the Foundation of Reliable Delay-Tolerant Networking Convergence Layers

    Iannicca, Dennis; Hylton, Alan; Ishac, Joseph


    Delay-Tolerant Networking (DTN) is an active area of research in the space communications community. DTN uses a standard layered approach with the Bundle Protocol operating on top of transport layer protocols known as convergence layers that actually transmit the data between nodes. Several common transport layer protocols have been implemented as convergence layers in DTN implementations, including User Datagram Protocol (UDP), Transmission Control Protocol (TCP), and Licklider Transmission Protocol (LTP). The purpose of this paper is to evaluate several stand-alone implementations of negative-acknowledgment based transport layer protocols to determine how they perform under a variety of link conditions. The transport protocols chosen for this evaluation include the Consultative Committee for Space Data Systems (CCSDS) File Delivery Protocol (CFDP), Licklider Transmission Protocol (LTP), NACK-Oriented Reliable Multicast (NORM), and Saratoga. The test parameters that the protocols were subjected to are characteristic of common communications links ranging from terrestrial to cis-lunar, applying different levels of delay, line rate, and error.

  10. Comparison of mRNA Splicing Assay Protocols across Multiple Laboratories

    Whiley, Phillip; de la Hoya, Miguel; Thomassen, Mads


    and differences in results derived from analysis of a panel of breast cancer 1, early onset (BRCA1) and breast cancer 2, early onset (BRCA2) gene variants known to alter splicing (BRCA1: c.135-1G>T, c.591C>T, c.594-2A>C, c.671-2A>G, and c.5467+5G>C and BRCA2: c.426-12_8delGTTTT, c.7988A>T, c.8632+1G>A, and c.9501...... in turn relies on appropriate assay design, interpretation, and reporting.METHODS: We conducted a multicenter investigation to compare mRNA splicing assay protocols used by members of the ENIGMA (Evidence-Based Network for the Interpretation of Germline Mutant Alleles) consortium. We compared similarities...... for accurate splicing assay results. For example, because of the position of primers and PCR extension times, several isoforms associated with BRCA1, c.594-2A>C and c.671-2A>G, were not detected by many sites. Variation was most evident for the detection of low-abundance transcripts (e.g., BRCA2 c.8632+1G>A Δ...

  11. SM_TCP: a new reliable multicast transport protocol for satellite IP networks

    Liu, Gongliang; Gu, Xuemai; Li, Shizhong


    A new reliable multicast transport protocol SM_TCP is proposed for satellite IP networks in this paper. In SM_TCP, the XOR scheme with the aid of on-board buffering and processing is used for error recovery and an optimal retransmission algorithm is designed, which can reduce the recovery time by half of the RTT and minimize the number of retransmissions. In order to avoid the unnecessary decrease of congestion window in the high BER satellite channels, the occupied buffer sizes at bottlenecks are measured in adjusting the congestion window, instead of depending on the packet loss information. The average session rate of TCP sessions and of multicast sessions passing through the satellite are also measured and compared in adjusting the congestion window, which contributes to bandwidth fairness. Analysis and simulation results show fairness with TCP flows and scalability.
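    The XOR error-recovery idea mentioned above rests on a simple algebraic fact: a parity block XORed with all surviving data blocks reconstructs the one missing block, so a single retransmission (or none) can repair a loss. A minimal sketch, independent of the paper's on-board buffering details:

```python
def xor_parity(blocks):
    """XOR parity over equal-length byte blocks."""
    parity = bytes(len(blocks[0]))
    for block in blocks:
        parity = bytes(x ^ y for x, y in zip(parity, block))
    return parity

def recover_missing(received, parity):
    """Reconstruct the single lost block from survivors plus parity."""
    missing = parity
    for block in received:
        missing = bytes(x ^ y for x, y in zip(missing, block))
    return missing
```

    If blocks A, B, C were sent with parity P = A^B^C and B is lost, then P^A^C = B, so the receiver repairs the loss without waiting a full RTT for a retransmission of B itself.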

  12. The rate of force development scaling factor (RFD-SF): protocol, reliability, and muscle comparisons.

    Bellumori, Maria; Jaric, Slobodan; Knight, Christopher A


    Performing a set of isometric muscular contractions to varied amplitudes with instructions to generate force most rapidly reveals a strong linear relationship between peak forces (PF) achieved and corresponding peak rates of force development (RFD). The slope of this relationship, termed the RFD scaling factor (RFD-SF), quantifies the extent to which RFD scales with contraction amplitude. Such scaling allows relative invariance in the time required to reach PF regardless of contraction size. Considering the increasing use of this relationship to study quickness and consequences of slowness in older adults and movement disorders, our purpose was to further develop the protocol to measure RFD-SF. Fifteen adults (19-28 years) performed 125 rapid isometric contractions to a variety of force levels in elbow extensors, index finger abductors, and knee extensors, on 2 days. Data were used to determine (1) how the number of pulses affects computation of the RFD-SF, (2) day-to-day reliability of the RFD-SF, and (3) the nature of RFD-SF differences between diverse muscle groups. While sensitive to the number of pulses used in its computation (P<0.05), RFD-SF was reliable with 50 pulses (ICC>.7) and more so with 100-125 pulses (ICC=.8-.92). Despite differences in size and function across muscles, RFD-SF was generally similar (i.e., only 8.5% greater in elbow extensors than in index finger abductors and knee extensors; P=.049). Results support this protocol as a reliable means to assess how RFD scales with PF in rapid isometric contractions as well as a simple, non-invasive probe into neuromuscular health.
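    Since the RFD-SF is defined as the slope of the PF versus peak-RFD relationship, it reduces to an ordinary least-squares fit over the set of rapid pulses. A sketch with hypothetical force and RFD values:

```python
def rfd_scaling_factor(peak_forces, peak_rfds):
    """Slope of the least-squares line through (PF, peak RFD) pairs.

    The slope (units: 1/s) is the RFD scaling factor; steeper means RFD
    grows faster with contraction amplitude.
    """
    n = len(peak_forces)
    mx = sum(peak_forces) / n
    my = sum(peak_rfds) / n
    sxx = sum((x - mx) ** 2 for x in peak_forces)
    sxy = sum((x - mx) * (y - my) for x, y in zip(peak_forces, peak_rfds))
    return sxy / sxx

# hypothetical pulses: peak force (N) and peak RFD (N/s)
slope = rfd_scaling_factor([100, 200, 300], [800, 1600, 2400])
```

    In practice the study fits 50-125 such pulses per muscle; with the perfectly proportional toy data above the slope is 8 s⁻¹.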

  13. Detection of telomerase activity by combination of telomeric repeat amplification protocol and electrochemiluminescence assay

    Xiao Ming Zhou; Li Jia


    A highly sensitive telomerase detection method that combines the telomeric repeat amplification protocol (TRAP) and a magnetic bead-based electrochemiluminescence (ECL) assay has been developed. Briefly, telomerase recognizes the biotinylated telomerase synthesis primer (B-TS) and synthesizes extension products, which then serve as the templates for PCR amplification using B-TS as the forward primer and tris(2,2'-bipyridyl)ruthenium (TBR)-labeled ACX (TBR-ACX) as the reverse primer. The amplified product is captured on streptavidin-coated paramagnetic beads and detected by ECL. Telomerase-positive HeLa cells were used to validate the feasibility of the method. The experimental results showed that as few as 10 cancer cells can be detected easily. The method is a useful tool for telomerase activity analysis due to its sensitivity, rapidity, safety, high throughput, and low cost. It can be used for screening a large number of clinical samples.

  14. Tackling reliability and construct validity: the systematic development of a qualitative protocol for skill and incident analysis.

    Savage, Trevor Nicholas; McIntosh, Andrew Stuart


    It is important to understand factors contributing to and directly causing sports injuries to improve the effectiveness and safety of sports skills. The characteristics of injury events must be evaluated and described meaningfully and reliably. However, many complex skills cannot be effectively investigated quantitatively because of ethical, technological and validity considerations. Increasingly, qualitative methods are being used to investigate human movement for research purposes, but there are concerns about reliability and measurement bias of such methods. Using the tackle in Rugby union as an example, we outline a systematic approach for developing a skill analysis protocol with a focus on improving objectivity, validity and reliability. Characteristics for analysis were selected using qualitative analysis and biomechanical theoretical models and epidemiological and coaching literature. An expert panel comprising subject matter experts provided feedback and the inter-rater reliability of the protocol was assessed using ten trained raters. The inter-rater reliability results were reviewed by the expert panel and the protocol was revised and assessed in a second inter-rater reliability study. Mean agreement in the second study improved and was comparable (52-90% agreement and ICC between 0.6 and 0.9) with other studies that have reported inter-rater reliability of qualitative analysis of human movement.

  15. Development of a Standard Protocol for the Harmonic Analysis of Radial Pulse Wave and Assessing Its Reliability in Healthy Humans



    This study aimed to establish a standard protocol and to quantitatively assess the reliability of harmonic analysis of the radial pulse wave measured by a harmonic wave analyzer (the TD01C system). Both intraobserver and interobserver assessments were conducted to investigate whether the values of the harmonics are stable across successive measurements. An intraclass correlation coefficient (ICC) and a Bland–Altman plot were used for this purpose. For the reliability assessments of the intraobserver ...
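The two statistics named above have compact closed forms. Below is a minimal NumPy sketch; the ICC form shown is ICC(2,1) (two-way random effects, absolute agreement, single measures), which is one common choice for observer-reliability studies, not necessarily the exact variant used with the TD01C data:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.
    ratings: (n_subjects, k_raters) array of measurements."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means
    # Mean squares from the two-way ANOVA decomposition
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # raters
    sse = np.sum((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement between two methods."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

For harmonic values, each row would hold one subject's value as scored in repeated measurements or by different observers; an ICC near 1 and Bland-Altman limits tight around zero indicate the stability the study tests for.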

  16. Design and Evaluation of Reliable Data Transmission Protocol in Wireless Sensor Networks

    Ailixier Aikebaier


    Full Text Available A wireless sensor-actuator network (WSAN) is composed of sensor nodes and actuator nodes interconnected in wireless networks. A sensor node collects information on the physical world and sends a sensed value over the wireless network, and other sensor nodes forward the sensed value until it is delivered to an actuator node. Because of its weak radio, a sensor node can deliver messages with sensed values only to nearby nodes, so messages are forwarded by sensor nodes to an actuator node by a type of flooding protocol: a sensor node senses an event and sends a message with the sensed value, and on receipt of a message with a sensed value from another sensor node, it forwards that value. Messages transmitted by sensor nodes might be lost due to noise and collisions. In this paper, we discuss a redundant data transmission (RT) protocol to reliably and efficiently deliver values sensed by sensor nodes to an actuator node. Here, a sensor node sends a message with not only its own sensed value but also sensed values received from other sensor nodes. The more sensed values a message carries, the more likely the message is to be lost; hence each message carries only as many sensed values as will not increase the message loss ratio. Even if a message with a sensed value v is lost in the wireless network, an actuator node can receive the value v from a message sent by another sensor node. Thus, each sensed value is redundantly carried in multiple messages, and this redundancy naturally increases because sensed values are broadcast. In order to reduce the redundancy of a sensed value, we adopt a strategy in which sensor nodes farther from the actuator node forward fewer sensed values. We evaluate the RT protocol in terms of the loss ratio, redundancy, and delay time of a sensed value. We show that about 80% of sensed values can be delivered to an actuator node even if 95% of messages are lost due to noise.
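The headline figure (80% of values delivered at 95% message loss) follows from carrying each value in many independent messages: a value is lost only if every carrier is lost, so the delivery probability is 1 - loss^R for redundancy R. A Monte Carlo sketch of this idealized model (independent losses and a fixed redundancy per value; the real RT protocol varies redundancy with distance from the actuator):

```python
import random

def delivery_ratio(n_values=10_000, redundancy=20, loss=0.95, seed=1):
    """Estimate the fraction of sensed values reaching the actuator when each
    value rides in `redundancy` independently transmitted messages and each
    message is lost with probability `loss`. A value is delivered if at least
    one of its carrier messages survives."""
    rng = random.Random(seed)
    delivered = sum(
        any(rng.random() > loss for _ in range(redundancy))
        for _ in range(n_values)
    )
    return delivered / n_values
```

With loss = 0.95, the closed form 1 - 0.95**R passes 0.8 once R reaches about 32, consistent with the ~80% figure under heavy redundancy.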

  17. A home-brew real-time PCR assay for reliable detection and quantification of mature miR-122.

    Naderi, Mahmood; Abdul Tehrani, Hossein; Soleimani, Masoud; Shabani, Iman; Hashemi, Seyed Mahmoud


    miR-122 is a liver-specific miRNA that shows significant gene expression alterations in response to specific pathophysiological circumstances of the liver such as drug-induced liver injury, hepatocellular carcinoma, and hepatitis B and C virus infections. Therefore, accurate and precise quantification of miR-122 is very important for clinical diagnostics. Given the lack of in vitro diagnostic assays for miR-122 detection and quantification, the existence of an open-source assay would enable external evaluation by other researchers and the chance of improving the assay when required. The aim of this study was to develop a TaqMan real-time polymerase chain reaction assay capable of robust and reliable quantification of miR-122 in different sample types. We used a stem-loop methodology to design a specific TaqMan real-time polymerase chain reaction assay for miR-122. This technique enabled us to reliably and reproducibly quantify short-length oligonucleotides such as miR-122. The specificity, sensitivity, interassay and intra-assay variability, and the dynamic range of the assay were experimentally determined by their respective methodologies. The assay had a linear dynamic range of 3E to 4.8E miR-122 copies/reaction, and the limit of detection was determined to be between 960 and 192 copies/reaction with a 95% confidence interval. The assay gave a low coefficient of variation for the Ct values. Given that miR-122 is expressed at more than 50,000 copies per hepatocyte, this assay is able to satisfy the need for reliable detection and quantification of this miRNA. Therefore, this study can be considered a starting point for standardizing miR-122 quantification.
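Copy-number figures such as the dynamic range above come from a standard curve relating Ct to input copies. A generic sketch of that calculation (the slope and intercept below are hypothetical illustration values, not the paper's fitted curve):

```python
import numpy as np

def fit_standard_curve(copies, ct):
    """Fit Ct = slope * log10(copies) + intercept over a dilution series.
    Perfect doubling per cycle gives slope ~ -3.32 and efficiency ~ 1.0."""
    slope, intercept = np.polyfit(np.log10(copies), ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def quantify(ct, slope, intercept):
    """Interpolate the copy number for an unknown sample's Ct value."""
    return 10 ** ((ct - intercept) / slope)
```

An unknown sample's Ct is converted to copies/reaction only if it falls inside the calibrated linear range; outside it, the assay's stated limit of detection applies.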

  18. Scientific Literature on the Reliability and Validity of the Manchester Triage System (MTS) Protocol: An Integrative Literature Review

    Cristiane Chaves de Souza


    Full Text Available OBJECTIVE To analyze the scientific production on the validity and reliability of the Manchester Triage System (MTS) protocol. METHOD A descriptive study in the form of an integrative literature review. Articles on the validity and reliability of the MTS conducted with children and adults and published between 1999 and 2013 were included. RESULTS 14 articles were selected from a total of 8438: nine on validity and five on reliability. The reliability of the MTS ranged from moderate to almost perfect, and was higher for intra-rater assessments. Regarding validity, the results seem to point to equivalent and satisfactory sensitivity and specificity levels of the MTS. The instrument proved to be a good predictor of the need for hospitalization and of hospital mortality. CONCLUSION The reliability and validity of the MTS obtained in the studies are varied. It is recommended that new studies indicate the modifications necessary for the MTS to be used more safely by nurses.

  19. Computer simulation on reliability of retention index with FDG-PET and optimization of dual-time-point imaging protocol


    The inherent noise in positron emission tomography (PET) leads to instability of quantitative indicators, which may affect the diagnostic accuracy for differentiating malignant and benign lesions in the management of lung cancer. In this paper, the reliability of the retention index (RI) is systematically investigated by computer simulation for the dual-time-point imaging protocol. The area under the receiver operating characteristic (ROC) curve is used to evaluate the optimal protocol. Results demonstrate that the reliability of the RI is affected by several factors, including noise level, lesion type, and imaging schedule. RIs with small absolute values suffer from worse reliability than larger ones. The ROC curves show that an overly delayed second scan does not further improve the diagnostic performance, whereas an early first scan is desirable. The method of optimization based on ROC analysis can easily be extended to cover as many lesion types as possible.
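The retention index in dual-time-point imaging is the percent change in standardized uptake value (SUV) between the two scans, and its sensitivity to noise can be illustrated with a toy Monte Carlo. The SUVs and noise level below are hypothetical illustration values, not the paper's simulation setup:

```python
import random

def retention_index(suv_early, suv_delayed):
    """RI (%) between the two time points of a dual-time-point FDG-PET scan."""
    return 100.0 * (suv_delayed - suv_early) / suv_early

def simulate_ri(suv_early, suv_delayed, noise_sd, n=10_000, seed=0):
    """Monte Carlo spread of the RI when Gaussian noise perturbs both SUV
    measurements; returns (mean RI, standard deviation of RI)."""
    rng = random.Random(seed)
    ris = [retention_index(suv_early + rng.gauss(0, noise_sd),
                           suv_delayed + rng.gauss(0, noise_sd))
           for _ in range(n)]
    mean = sum(ris) / n
    sd = (sum((r - mean) ** 2 for r in ris) / (n - 1)) ** 0.5
    return mean, sd
```

Running the same simulation with suv_delayed close to suv_early shows the RI spread swamping the signal, which is the small-|RI| reliability effect the study reports.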

  20. Enhancing the Communication Range and Reliability of Mobile ADHOC Network Using AODV-OSPF Protocol

    Onkar Nath Thakur,; Amit Saxena


    The increasing density of nodes and the communication range of mobile nodes raise problems such as packet dropping and degraded network performance. To improve the performance of the AODV routing protocol in mobile ad-hoc networks, various authors have used different sizes of adjacency matrix in the AODV routing protocol. This paper proposes an improved AODV routing protocol that uses the OSPF routing adjacency matrix within AODV. The OSPF matrix is larger than that of AO...

  1. An Optimized Protocol for Electrophoretic Mobility Shift Assay Using Infrared Fluorescent Dye-labeled Oligonucleotides.

    Hsieh, Yi-Wen; Alqadah, Amel; Chuang, Chiou-Fen


    Electrophoretic mobility shift assays (EMSAs) are an instrumental tool for characterizing the interactions between proteins and their target DNA sequences. Radioactivity has been the predominant method of DNA labeling in EMSAs. However, recent advances in fluorescent dyes and scanning methods have prompted the use of fluorescent tagging of DNA as an alternative to radioactivity, with the advantages of easy handling, time savings, reduced cost, and improved safety. We have recently used fluorescent EMSA (fEMSA) to successfully address an important biological question. Our fEMSA analysis provides mechanistic insight into the effect of a missense mutation, G73E, in the highly conserved HMG transcription factor SOX-2 on olfactory neuron type diversification. We found that the mutant SOX-2(G73E) protein alters specific DNA binding activity, thereby causing olfactory neuron identity transformation. Here, we present an optimized and cost-effective step-by-step protocol for fEMSA using infrared fluorescent dye-labeled oligonucleotides containing the LIM-4/SOX-2 adjacent target sites and purified SOX-2 proteins (WT and mutant SOX-2(G73E)) as a biological example.

  2. Brazilian version of the Protocole Montréal d'Evaluation de la Communication (Protocole MEC): normative and reliability data.

    Fonseca, Rochele Paz; Joanette, Yves; Côté, Hélène; Ska, Bernadette; Giroux, Francine; Fachel, Jandyra Maria Guimarães; Damasceno Ferreira, Gabriela; Parente, Maria Alice de Mattos Pimenta


    A lack of standardized instruments to evaluate communication disorders related to the right hemisphere had been identified. A new evaluation tool was developed, the Protocole Montréal d'Evaluation de la Communication (Protocole MEC), and adapted to Brazilian Portuguese as the Bateria Montreal de Avaliação da Comunicação (Bateria MAC; Montreal Evaluation of Communication Battery). The purpose was to present normative data stratified by age and educational level, and to verify the reliability parameters of the MEC Battery. 300 individuals between the ages of 19 and 75 years, with 2 to 35 years of formal education, participated in this study. They were divided equally into six normative groups according to three age categories (young adults, intermediate age, and seniors) and two educational levels (low and high). Two procedures were used to check reliability: Cronbach's alpha and inter-rater reliability. Norms were established at the 10th percentile, with an alert point per task for each normative group. Cronbach's alpha was, in general, between .70 and .90, and the average rate of agreement between evaluators varied from .62 to .94. Norms by age and education were established, and the reliability of the instrument was verified. The psychometric validation of the MEC Battery will contribute to the diagnostic process for communication disorders.
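Cronbach's alpha, the internal-consistency statistic reported above, has a short closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal NumPy sketch (synthetic scores, not the MEC Battery data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (n_subjects, k_items) score matrix."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the sum score
    return k / (k - 1) * (1 - item_var / total_var)
```

Values between .70 and .90, as reported for the MEC Battery, are conventionally read as acceptable-to-good internal consistency.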

  3. Enhancing the Communication Range and Reliability of Mobile ADHOC Network Using AODV-OSPF Protocol

    Onkar Nath Thakur,


    Full Text Available The increasing density of nodes and the communication range of mobile nodes raise problems such as packet dropping and degraded network performance. To improve the performance of the AODV routing protocol in mobile ad-hoc networks, various authors have used different sizes of adjacency matrix in the AODV routing protocol. This paper proposes an improved AODV routing protocol that uses the OSPF routing adjacency matrix within AODV. The OSPF matrix is larger than that of AODV, and changing the size of the matrix increases the communication range of mobile nodes. The increased communication range in turn increases the throughput of the mobile ad-hoc network. The proposed model is simulated in ns-2.34 and compared with the AODV routing protocol. Our experimental results show that the AODV-OSPF routing protocol performs better.

  4. Modeling of coupled differential equations for cellular chemical signaling pathways: Implications for assay protocols utilized in cellular engineering.

    O'Clock, George D


    Cellular engineering involves the modification and control of cell properties, and requires an understanding of the fundamentals and mechanisms of action for cellular-derived product development. One of the keys to success in cellular engineering is the quality and validity of results obtained from cell chemical signaling pathway assays. The accuracy of the assay data cannot be verified or assured if the effects of positive feedback, nonlinearities, and interrelationships between cell chemical signaling pathway elements are not understood, modeled, and simulated. Nonlinearities and positive feedback in the cell chemical signaling pathway can produce significant aberrations in assay data collection, and simulating the pathway can reveal potential instability problems that will affect assay results. A simulation using an electrical analog for the coupled differential equations representing each segment of the pathway provides an excellent tool for assay validation purposes. With this approach, voltages represent pathway enzyme concentrations, and operational amplifier feedback resistance and input resistance values determine pathway gain and rate constants. The understanding provided by pathway modeling and simulation is strategically important for establishing experimental controls for assay protocol structure, the time frames specified between assays, and assay concentration variation limits, so as to ensure accuracy and reproducibility of results.
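The instability argument can be made concrete with a toy version of such a coupled system: a two-element pathway with mutual positive-feedback coupling, integrated by forward Euler. All rate constants, the input, and the gain threshold below are hypothetical illustration values, not taken from the paper:

```python
def simulate_pathway(gain, steps=4000, dt=0.01):
    """Two-element cascade with mutual positive-feedback coupling:
        dx/dt = -a*x + gain*y + u
        dy/dt = -b*y + gain*x
    With a = b = 1, the linear system is stable only while a*b > gain**2;
    past that threshold the 'concentrations' grow without bound, the analog
    of the assay instability discussed above."""
    a, b, u = 1.0, 1.0, 1.0
    x = y = 0.0
    for _ in range(steps):
        # Simultaneous Euler update: both right-hand sides use old x, y.
        x, y = (x + dt * (-a * x + gain * y + u),
                y + dt * (-b * y + gain * x))
    return x, y
```

At gain = 0.5 the system settles to a finite steady state (x = 4/3, y = 2/3 for these constants); at gain = 1.5 the same integration diverges, which is the kind of behavior a pathway simulation can flag before an assay is run.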

  5. Can the comet assay be used reliably to detect nanoparticle-induced genotoxicity?

    Karlsson, Hanna L; Di Bucchianico, Sebastiano; Collins, Andrew R; Dusinska, Maria


    The comet assay is a sensitive method to detect DNA strand breaks as well as oxidatively damaged DNA at the level of single cells. Today the assay is commonly used in nano-genotoxicology. In this review we critically discuss possible interactions between nanoparticles (NPs) and the comet assay. Concerns for such interactions have arisen from the occasional observation of NPs in the "comet head", which implies that NPs may be present while the assay is being performed. This could give rise to false positive or false negative results, depending on the type of comet assay endpoint and NP. For most NPs, an interaction that substantially impacts the comet assay results is unlikely. For photocatalytically active NPs such as TiO2, on the other hand, exposure to light containing UV can lead to increased DNA damage; samples should therefore not be exposed to such light. By comparing studies in which both the comet assay and the micronucleus assay have been used, a good consistency between the assays was found in general (69%); consistency was even higher when excluding studies on TiO2 NPs (81%). The strong consistency between the comet and micronucleus assays for a range of different NPs, even though the two tests measure different endpoints, implies that both can be trusted in assessing the genotoxicity of NPs, and that both could be useful in a standard battery of test methods.

  6. Temperature Switch PCR (TSP): Robust assay design for reliable amplification and genotyping of SNPs

    Mather Diane E


    Full Text Available Abstract Background: Many research and diagnostic applications rely upon the assay of individual single nucleotide polymorphisms (SNPs). Thus, methods to improve the speed and efficiency of single-marker SNP genotyping are highly desirable. Here, we describe the method of temperature-switch PCR (TSP), a biphasic four-primer PCR system with a universal primer design that permits amplification of the target locus in the first phase of thermal cycling before switching to the detection of the alleles. TSP can simplify assay design for a range of commonly used single-marker SNP genotyping methods, and reduce the requirement for individual assay optimization and operator expertise in the deployment of SNP assays. Results: We demonstrate the utility of TSP for the rapid construction of robust and convenient endpoint SNP genotyping assays based on allele-specific PCR and high-resolution melt analysis by generating a total of 11,232 data points. The TSP assays were performed under standardised reaction conditions, requiring minimal optimization of individual assays. High genotyping accuracy was verified by 100% concordance of TSP genotypes in a blinded study with an independent genotyping method. Conclusion: Theoretically, TSP can be directly incorporated into the design of assays for most current single-marker SNP genotyping methods. TSP provides several technological advances for single-marker SNP genotyping, including simplified assay design and development, increased assay specificity and genotyping accuracy, and opportunities for assay automation. By reducing the requirement for operator expertise, TSP provides opportunities to deploy a wider range of single-marker SNP genotyping methods in the laboratory. TSP has broad applications and can be deployed in any animal and plant species.

  7. An efficient and reliable geographic routing protocol based on partial network coding for underwater sensor networks.

    Hao, Kun; Jin, Zhigang; Shen, Haifeng; Wang, Ying


    Efficient routing protocols for data packet delivery are crucial to underwater sensor networks (UWSNs). However, communication in UWSNs is a challenging task because of the characteristics of the acoustic channel. Network coding is a promising technique for efficient data packet delivery thanks to the broadcast nature of acoustic channels and the relatively high computation capabilities of the sensor nodes. In this work, we present GPNC, a novel geographic routing protocol for UWSNs that incorporates partial network coding to encode data packets and uses sensor nodes' location information to greedily forward data packets to sink nodes. GPNC can effectively reduce network delays and retransmissions of redundant packets causing additional network energy consumption. Simulation results show that GPNC can significantly improve network throughput and packet delivery ratio, while reducing energy consumption and network latency when compared with other routing protocols.
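The greedy geographic forwarding step on which GPNC builds can be sketched in a few lines: each node hands the packet to the neighbor closest to the sink, provided that neighbor is closer than the node itself. This is a generic sketch with 2-D coordinates; the partial-network-coding layer described in the abstract is omitted:

```python
import math

def greedy_next_hop(current, neighbors, sink):
    """Greedy geographic forwarding: return the neighbor nearest the sink
    among those strictly closer to the sink than the current node, or None
    when no such neighbor exists (a local void)."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    closer = [n for n in neighbors if dist(n, sink) < dist(current, sink)]
    return min(closer, key=lambda n: dist(n, sink)) if closer else None
```

Returning None at a void is exactly the case where protocols add recovery logic; in GPNC's setting, the redundancy introduced by network-coded packets helps mask such failures.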

  8. An Efficient and Reliable Geographic Routing Protocol Based on Partial Network Coding for Underwater Sensor Networks

    Kun Hao


    Full Text Available Efficient routing protocols for data packet delivery are crucial to underwater sensor networks (UWSNs). However, communication in UWSNs is a challenging task because of the characteristics of the acoustic channel. Network coding is a promising technique for efficient data packet delivery thanks to the broadcast nature of acoustic channels and the relatively high computation capabilities of the sensor nodes. In this work, we present GPNC, a novel geographic routing protocol for UWSNs that incorporates partial network coding to encode data packets and uses sensor nodes' location information to greedily forward data packets to sink nodes. GPNC can effectively reduce network delays and retransmissions of redundant packets causing additional network energy consumption. Simulation results show that GPNC can significantly improve network throughput and packet delivery ratio, while reducing energy consumption and network latency when compared with other routing protocols.

  9. Interface Assignment-Based AODV Routing Protocol to Improve Reliability in Multi-Interface Multichannel Wireless Mesh Networks

    Won-Suk Kim


    Full Text Available The utilization of wireless mesh networks (WMNs) has greatly increased, and the multi-interface multichannel (MIMC) technique has been widely used for the backbone network. Unfortunately, the ad hoc on-demand distance vector (AODV) routing protocol defined in the IEEE 802.11s standard was designed for WMNs using the single-interface single-channel technique. We therefore define a problem that occurs when the legacy AODV is used in MIMC WMNs and propose an interface assignment-based AODV (IA-AODV) in order to resolve that problem. IA-AODV, which is based on multitarget path request, consists of a PREQ prediction scheme, a PREQ loss recovery scheme, and a PREQ sender assignment scheme. A detailed operation according to various network conditions and services is introduced, and the routing efficiency and network reliability of a network using IA-AODV are analyzed over the presented system model. Finally, after a real-world test-bed for MIMC WMNs using the IA-AODV routing protocol is implemented, the various indicators of the network are evaluated through experiments. When the proposed routing protocol is compared with the existing AODV routing protocol, it performs the path update using only 14.33% of the management frames, completely removes the routing malfunction, and reduces the UDP packet loss ratio by 0.0012%.

  10. A Framework for Reliable Reception of Wireless Metering Data using Protocol Side Information

    Melchior Jacobsen, Rasmus; Popovski, Petar


    the deterministic protocol structure to obtain side information and group the packets from the same meter. We derive the probability of falsely pairing packets from different senders in the simple case of no channel errors, and show through simulation and data from an experimental deployment the probability...

  11. An efficient and reliable multi-hop geographical broadcast protocol in vehicular ad-hoc networks

    Rajendran, R.; Jongh, J. de


    In Intelligent Transportation Systems (ITS), disseminating warning messages in a timely and efficient way through wireless short-range communications can save many lives and reduce traffic congestion. A geographical broadcast protocol provides data delivery to specified geographical areas, using mul

  12. A simple and rapid fluorescence in situ hybridization microwave protocol for reliable dicentric chromosome analysis.

    Cartwright, Ian M; Genet, Matthew D; Kato, Takamitsu A


    Fluorescence in situ hybridization (FISH) is an extremely effective and sensitive approach to analyzing chromosome aberrations. Until recently, this procedure took multiple days to complete. The introduction of telomeric and centromeric peptide nucleic acid (PNA) probes has reduced the procedure's duration to several hours, but the protocols still call for a high-temperature (80-90°C) step followed by 1-3 h of hybridization. The newest method to speed up the FISH protocol is the use of a microwave to shorten the heating step to less than a minute; however, this protocol still calls for a 1-h hybridization period. We have utilized PNA centromere/telomere probes in conjunction with a microwave oven to show telomere and centromere staining in as little as 30 s. We have optimized the hybridization conditions to increase the sensitivity and effectiveness of the new protocol and can effectively stain chromosomes with 2 min and 30 s of incubation. We have found that our new approach to FISH produces extremely clear and distinct signals. Radiation-induced dicentric formation in mouse and human fibroblast cells was analyzed by two individual scorers, and the observed dicentrics matched very well.

  13. A reliable, delay bounded and less complex communication protocol for multicluster FANETs

    Wajiya Zafar


    Full Text Available Recently, Flying Ad-hoc Networks (FANETs), which enable ad-hoc networking between Unmanned Aerial Vehicles (UAVs), have been gaining importance in several military and civilian applications. The sensitivity of the applications requires an adaptive, efficient, delay-bounded and scalable communication network among UAVs for data transmission. Due to communication protocol complexity, rigidity, the cost of commercial-off-the-shelf (COTS) components, limited radio bandwidth, high mobility, and limited computational resources, maintaining the desired level of Quality of Service (QoS) becomes a daunting task. For the first time in this research, we propose multicluster FANETs for efficient network management; the proposed scheme considerably reduces communication cost and optimizes network performance, and it exploits the low-power, less complex and low-cost IEEE 802.15.4 MAC protocol for intercluster and intracluster communication. In this research, both the beacon-enabled mode and the beaconless mode have been investigated, with Guaranteed Time Slots (GTS) and virtual Time Division Multiple Access (TDMA), respectively. The methodology plays a key role in reserving bandwidth for latency-critical applications and eliminating collisions and medium access delays. Moreover, an analysis of ad-hoc routing protocols, including two proactive (OLSR, DSDV) and one reactive (AODV), is also presented. The results show that the proposed scheme guarantees high packet delivery ratios while maintaining acceptable levels of latency, comparable with more complex and dedicatedly designed protocols in the literature.

  14. Re-use of explanted osteosynthesis devices: a reliable and inexpensive reprocessing protocol.

    Danesi, Valentina; Cristofolini, Luca; Stea, Susanna; Traina, Francesco; Beraudi, Alina; Tersi, Luca; Harman, Melinda; Viceconti, Marco


    Orthopaedic surgical treatments emphasizing immobilization using open reduction and internal fixation with osteosynthesis devices are widely accepted for their efficacy in treating complex fractures and reducing permanent musculoskeletal deformity. However, such treatments are profoundly underutilized in low- and middle-income countries (LMIC), partially due to inadequate availability of the costly osteosynthesis devices. Orthopaedic surgeons in some LMIC regularly re-use osteosynthesis devices in an effort to meet treatment demands, even though such devices typically are regulated for single-use only. The purpose of this study is to report a reprocessing protocol applied to explanted osteosynthesis devices obtained at a leading trauma care hospital. Explanted osteosynthesis devices were identified through a Register of Explanted Orthopaedic Prostheses. Guidelines to handle ethical issues were approved by the local Ethical Committee and informed patient consent was obtained at the time of explant surgery. Primary acceptance criteria were established and applied to osteosynthesis devices explanted between 2005 and 2008. A rigorous protocol for conducting decontamination and visual inspection based on specific screening criteria was implemented using simple equipment that is readily available in LMIC. A total of 2050 osteosynthesis devices, including a large variety of plates, screws and staples, were reprocessed using the decontamination and inspection protocols. The acceptance rate was 66%. Estimated labour time and implementation time of the protocol to reprocess a typical osteosynthesis unit (1 plate and 5 screws) was 25 min, with an estimated fixed cost (in Italy) of €10 per unit for implementing the protocol, plus an additional €5 for final sterilization at the end-user hospital site. This study was motivated by the treatment demands encountered by orthopaedic surgeons providing medical treatment in several different LMIC and their need for access to basic

  15. Distinct gene expression responses of two anticonvulsant drugs in a novel human embryonic stem cell based neural differentiation assay protocol.

    Schulpen, Sjors H W; de Jong, Esther; de la Fonteyne, Liset J J; de Klerk, Arja; Piersma, Aldert H


    Hazard assessment of chemicals and pharmaceuticals is increasingly gaining from knowledge about molecular mechanisms of toxic action acquired in dedicated in vitro assays. We have developed an efficient human embryonic stem cell neural differentiation test (hESTn) that allows the study of the molecular interaction of compounds with the neural differentiation process. Within the 11-day differentiation protocol of the assay, embryonic stem cells lost their pluripotency, evidenced by the reduced expression of stem cell markers Pou5F1 and Nanog. Moreover, stem cells differentiated into neural cells, with morphologically visible neural structures together with increased expression of neural differentiation-related genes such as βIII-tubulin, Map2, Neurogin1, Mapt and Reelin. Valproic acid (VPA) and carbamazepine (CBZ) exposure during hESTn differentiation led to concentration-dependent reduced expression of βIII-tubulin, Neurogin1 and Reelin. In parallel VPA caused an increased gene expression of Map2 and Mapt which is possibly related to the neural protective effect of VPA. These findings illustrate the added value of gene expression analysis for detecting compound specific effects in hESTn. Our findings were in line with and could explain effects observed in animal studies. This study demonstrates the potential of this assay protocol for mechanistic analysis of specific compound-induced inhibition of human neural cell differentiation.

  16. The clubfoot assessment protocol (CAP): description and reliability of a structured multi-level instrument for follow-up

    Jarnlo Gun-Britt


    Full Text Available Abstract Background In most clubfoot studies, the outcome instruments used are designed to evaluate classification or long-term cross-sectional results. The variables deal mainly with factors at the body function/structure level. Wide scoring intervals and total sum scores increase the risk that important changes and information are not detected. Studies of the reliability, validity and responsiveness of these instruments are sparse. The lack of an instrument for longitudinal follow-up led the investigators to develop the Clubfoot Assessment Protocol (CAP). The aim of this article is to introduce and describe the CAP and to evaluate the inter- and intra-rater reliability of its items in relation to patient age. Methods The CAP comprises 22 items divided between the body function/structure (three subgroups) and activity (one subgroup) levels according to the International Classification of Functioning, Disability and Health (ICF). The focus is on item and subgroup development. Two experienced examiners assessed 69 clubfeet in 48 children who had a median age of 2.1 years (range, 0 to 6.7 years). Both treated and untreated feet with different grades of severity were included. Three age groups were constructed for studying the influence of age on reliability. The intra-rater study included 32 feet in 20 children who had a median age of 2.5 years (range, 4 months to 6.8 years). Unweighted kappa statistics, percentage observed agreement, and the number of categories defined how reliability was interpreted. Results The inter-rater reliability was assessed as moderate to good for all but one item. Eighteen items had kappa values > 0.40; three items varied from 0.35 to 0.38. The mean percentage observed agreement was 82% (range, 62 to 95%). The different age groups showed sufficient agreement. Intra-rater, all items had kappa values > 0.40 [range, 0.54 to 1.00] and a mean percentage agreement of 89.5%. Categories varied from 3 to 5. Conclusion The CAP contains more detailed
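The unweighted kappa used in the CAP reliability studies corrects the observed agreement between two raters for the agreement expected by chance. A minimal sketch of Cohen's kappa for categorical item scores (synthetic scores, not the CAP data):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Unweighted Cohen's kappa for two raters' categorical scores."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    # Chance agreement from the raters' marginal category frequencies
    expected = sum(c1[c] * c2[c] for c in c1) / n ** 2
    return (observed - expected) / (1 - expected)
```

On the common interpretation scale, the CAP's reported values > 0.40 fall in the moderate-or-better band, matching the abstract's "moderate to good" reading.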

  17. Protocol and Demonstrations of Probabilistic Reliability Assessment for Structural Health Monitoring Systems (Preprint)


    ... location. Keywords: model-assisted POD evaluation, probability of detection (POD), reliability, structural health monitoring. 1. Introduction ... damage detection method. An ETrema brand Terfenol-D magnetostrictive actuator was used for band-limited pseudo-random excitation up to 1200 Hz, and ...

  18. ZyFISH: A simple, rapid and reliable zygosity assay for transgenic mice

    Donal McHugh; Tracy O'Connor; Juliane Bremer; Adriano Aguzzi


    Microinjection of DNA constructs into fertilized mouse oocytes typically results in random transgene integration at a single genomic locus. The resulting transgenic founders can be used to establish hemizygous transgenic mouse lines. However, practical and experimental reasons often require that such lines be bred to homozygosity. Transgene zygosity can be determined by progeny testing assays which are expensive and time-consuming, by quantitative Southern blotting which is labor-intensive, o...

  19. Reliable Multihop Broadcast Protocol with a Low-Overhead Link Quality Assessment for ITS Based on VANETs in Highway Scenarios

    Alejandro Galaviz-Mosqueda


    Full Text Available Vehicular ad hoc networks (VANETs) have been identified as a key technology to enable intelligent transport systems (ITS), which aim to radically improve the safety, comfort, and greenness of vehicles on the road. However, in order to fully exploit the potential of VANETs, several issues must be addressed. Because of the high dynamics of VANETs and the impairments in the wireless channel, one key issue is the multihop dissemination of broadcast packets for safety and infotainment applications. In this paper a reliable low-overhead multihop broadcast (RLMB) protocol is proposed to address the well-known broadcast storm problem. The proposed RLMB takes advantage of the hello messages exchanged between the vehicles and processes this information to intelligently select a relay set and reduce redundant broadcasts. Additionally, to reduce the dependency on the hello message rate, RLMB uses a point-to-zone link evaluation approach. RLMB's performance is compared with that of one of the leading multihop broadcast protocols existing to date. Performance metrics show that our RLMB solution outperforms the leading protocol in terms of important metrics such as packet dissemination ratio, overhead, and delay.

  20. Application of the Welfare Quality protocol to dairy buffalo farms: Prevalence and reliability of selected measures.

    De Rosa, G; Grasso, F; Winckler, C; Bilancione, A; Pacelli, C; Masucci, F; Napolitano, F


    Within the general aim of developing a Welfare Quality system for monitoring dairy buffalo welfare, this study focused on prevalence and interobserver reliability of the animal-related variables to be included in the scheme. As most of the measures were developed for cattle, the study also aimed to verify their prevalence for buffaloes. Thirty animal-based measures (22 clinical and 8 behavioral measurements) and 20 terms used for qualitative behavior assessment were assessed in 42 loose-housed buffalo farms. All farms were located in central-southern Italy. Two assessors were used (1 male and 1 female). The time needed to record all measures (animal-, resource-, and management-based) was 5.47 ± 0.48 h (mean ± SD). Interobserver reliability of animal-based measures was evaluated using the Spearman rank correlation coefficient (rs). If 0.7 is considered the threshold for high interobserver reliability, all animal-based measures were above this level. In particular, most of the coefficients were above 0.85, with higher values observed for prevalence of animals that can be touched (rs = 0.99) and prevalence of animals with iatrogenic abscess (rs = 0.97), whereas lower coefficients were found for the prevalence of vulvar discharge (rs = 0.74) and dewlap edema (rs = 0.73). Twelve out of the 20 terms used for the qualitative behavior assessment reached a satisfactory interobserver reliability (rs = 0.65). Principal component analysis of qualitative behavior assessment scores was conducted for each assessor. Both principal component 1 and principal component 2 showed high interobserver reliability (rs = 0.80 and 0.79, respectively). In addition, relevant proportions of animals were affected by welfare issues specific to buffaloes, such as overgrown claws (median = 34.1%), withers hygroma (median = 13.3%), and vulvar or uterine prolapse (median = 9.3%). 
We concluded that most of the investigated measures could be reliably included in the final scheme, which can be used as
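The interobserver reliability statistic used above, Spearman's rank correlation between the two assessors' paired farm scores, can be computed with a short stdlib-only sketch (the tie-averaging rank step is the part that differs from plain Pearson correlation):

```python
def rank(xs):
    """1-based ranks with average ranks assigned to ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied 1-based positions
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(a, b):
    """Spearman rs = Pearson correlation of the rank vectors."""
    ra, rb = rank(a), rank(b)
    n = len(a)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    va = sum((x - ma) ** 2 for x in ra)
    vb = sum((y - mb) ** 2 for y in rb)
    return cov / (va * vb) ** 0.5
```

A measure would be retained under the study's criterion when `spearman(scores_assessor1, scores_assessor2) >= 0.7`.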

  1. A New Scalable and Reliable Cost Effective Key Agreement Protocol for Secure Group Communication

    S. J. Begum


    Full Text Available Problem statement: In a heterogeneous environment, for secure multicast communication, the group members have to share a secret key which is used to encrypt/decrypt the secret messages exchanged among the members. Securing group communication for a large-scale multicast group in a dynamic environment is more complex than securing one-to-one communication, due to the inherent scalability issue of group key management. Since the group members are dynamic in nature, joining or leaving the group, key updating is performed among the valid members without interrupting the multicast session, so that non-members cannot gain access to future renewed keys. Approach: The main aim is to develop a scheme which reduces the computational overhead, the number of messages needed at key-refresh time, and the number of keys stored in servers and members. The cost of key establishment and renewal is proportional to the size of the group and consequently becomes a performance bottleneck for scalability. By using a Cluster Based Hierarchical Key Distribution Protocol, the load of key management can be shared among dummy nodes of a cluster without revealing the group messages to them. Results: In particular, the proposed model incurs very low computational and communication overhead during key renewal. The proposed scheme yields better scalability because the key computation cost, the number of keys stored in the key server, and the number of rekey messages needed are all small. Conclusion: Our proposed protocol is based on the elliptic curve cryptography algorithm to form the secure group key; even with a smaller key size, it is capable of providing more security. This protocol can be used in both wired and wireless environments.
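The scalability claim, rekeying cost proportional to group size in a flat scheme versus logarithmic in a hierarchical one, can be made concrete with a toy message-count model. This is a generic LKH-style count under simple assumptions (balanced d-ary key tree, one encrypted update per sibling key on the leaf-to-root path), not the paper's exact protocol:

```python
def flat_rekey_messages(n):
    """Naive scheme: on a member leave, the server sends the new group
    key individually to each of the n-1 remaining members."""
    return n - 1

def tree_rekey_messages(n, d=2):
    """Hierarchical scheme: roughly d encrypted updates per level of a
    balanced d-ary key tree, i.e. about d * ceil(log_d n) messages."""
    depth, cap = 0, 1
    while cap < n:          # exact integer ceil(log_d n), no float log
        cap *= d
        depth += 1
    return d * depth
```

For a 1024-member group the flat scheme needs 1023 rekey messages while the binary-tree scheme needs about 20, which is the kind of gap that motivates cluster-based hierarchies.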

  2. Miniprimer PCR assay targeting multiple genes: a new rapid and reliable tool for genotyping Pantoea stewartii subsp. stewartii.

    Xu, R; Chen, Q; Robleh Djama, Z; Tambong, J T


    This study describes the development of a 'miniprimer' PCR assay for genotyping Pantoea stewartii subsp. stewartii, the causal agent of Stewart's bacterial wilt of maize. Four 10-nucleotide (10-nt) 'miniprimer' sets were designed and evaluated in the presence of Titanium Taq DNA polymerase. Under optimal reaction conditions, the miniprimer pair Uni-BacF-10/Uni-BacR-10 reproducibly generated identical banding patterns among 10 strains of P. stewartii subsp. stewartii, and patterns distinct from those of strains of P. stewartii subsp. indologenes, other Pantoea species, Clavibacter michiganensis, Pectobacterium spp., Pseudomonas spp. and other bacterial species. The amplicons of Pantoea stewartii subsp. stewartii were cloned and sequenced to identify genes or DNA fragments targeted by the miniprimer PCR assay. Of the 14 'clone types' identified, sequences of a 1.23-kb fragment had 99.8% similarity to part of the Pantoea stewartii zeaxanthin diglucoside biosynthetic operon (AY166713). Other dominant cloned fragments included a 411-bp amplicon that exhibited 99.8% similarity to the psaU gene (syn: ysaU; GQ249669), part of a type III protein-secretion system complex of P. stewartii subsp. stewartii strain DC283, while a 548-bp fragment showed 63% homology to the Asp/Glu racemase encoding gene in Erwinia tasmaniensis strain ET1/99. The miniprimer PCR assay reported here is highly discriminatory and reproducible in genotyping Pantoea stewartii subsp. stewartii. This miniprimer PCR assay could be a new reliable and rapid tool for fingerprinting the Stewart's wilt pathogen of maize.

  3. Comet assay: a reliable tool for the assessment of DNA damage in different models.

    Dhawan, Alok; Bajpayee, Mahima; Parmar, Devendra


    New chemicals are added each year to the existing burden of toxic substances in the environment. This has led to increased pollution of ecosystems as well as deterioration of air, water, and soil quality. Excessive agricultural and industrial activities adversely affect biodiversity, threatening the survival of species in a particular habitat as well as posing disease risks to humans. Some of the chemicals, e.g., pesticides and heavy metals, may be genotoxic to sentinel species and/or to non-target species, causing deleterious effects in somatic or germ cells. Test systems which help in hazard prediction and risk assessment are important for assessing the genotoxic potential of chemicals before their release into the environment or commercial use, as well as DNA damage in flora and fauna affected by contaminated/polluted habitats. The Comet assay has been widely accepted as a simple, sensitive, and rapid tool for assessing DNA damage and repair in individual eukaryotic as well as some prokaryotic cells, and has increasingly found application in diverse fields ranging from genetic toxicology to human epidemiology. This review is an attempt to comprehensively cover the use of the Comet assay in different models from bacteria to man, employing diverse cell types to assess the DNA-damaging potential of chemicals and/or environmental conditions. Sentinel species are the first to be affected by adverse changes in their environment. Determination of DNA damage using the Comet assay in these indicator organisms would thus provide information about the genotoxic potential of their habitat at an early stage. This would allow intervention strategies to be implemented for prevention or reduction of deleterious health effects in the sentinel species as well as in humans.

  4. Impact of Transport Layer Protocols on Reliable Information Access in Smart Grids

    Shahid, Kamal; Saeed, Aamir; Kristensen, Thomas le Fevre


    Time is critical for certain types of dynamic information (e.g. frequency control) in a smart grid scenario. The usefulness of such information depends upon its arrival within a specific frame of time, without which it may not serve its purpose and may affect the controller's performance... end-to-end delays at the cost of an unreliable, best-effort data transportation service. The research question raised in this paper is thus which protocol is preferred for the delay-critical applications of smart grids, and at what degree of packet loss and round-trip time TCP is preferable to UDP and vice versa... of events at grid assets as well as the information update strategy in one single metric, which is otherwise unintuitive and difficult to compare in a similarly useful way. Further, the analysis is concluded by providing a clear guide on the selection of the transport protocol to meet application...
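The TCP-versus-UDP trade-off for deadline-bound data can be illustrated with a toy probability model: UDP gets one attempt with low delay, while a TCP-like sender retransmits on an exponentially backed-off timeout until the deadline expires. This is a deliberately simplified sketch (independent losses, fixed delays), not the paper's analysis:

```python
def udp_deadline_prob(p_loss, owd, deadline):
    """UDP: single attempt; arrives iff not lost and the one-way
    delay fits within the deadline."""
    return (1 - p_loss) if owd <= deadline else 0.0

def tcp_deadline_prob(p_loss, rtt, rto, deadline):
    """TCP-like: retransmit after an exponentially backed-off timeout
    until the deadline; each attempt succeeds with prob 1 - p_loss."""
    prob, miss = 0.0, 1.0
    t, timeout = rtt / 2, rto      # first attempt arrives after ~rtt/2
    while t <= deadline:
        prob += miss * (1 - p_loss)
        miss *= p_loss
        t += timeout
        timeout *= 2               # exponential backoff
    return prob
```

With a loose deadline the retransmissions make TCP's delivery probability approach 1, while under a tight deadline only the UDP-style first attempt counts, which is the qualitative crossover the paper investigates.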

  5. A reliable primary human CNS culture protocol for morphological studies of dendritic and synaptic elements.

    Hammond, Robert R; Iskander, Sam; Achim, Cristian L; Hearn, Stephen; Nassif, Jane; Wiley, Clayton A


    Primary dissociated human fetal forebrain cultures were grown in defined serum-free conditions. At 4 weeks in vitro the cultures contained abundant morphologically well differentiated neurons with complex dendritic arbors. Astrocytic proliferation was negligible without the use of antimitotic agents. Confocal scanning laser microscopy (CSLM) and electron microscopy confirmed the presence of a dense neuropil, numerous cell-cell contacts and synapses. Neurons expressed a variety of proteins including growth associated protein-43 (GAP43), microtubule associated protein-2ab (MAP), class-III beta tubulin (C3BT), neurofilaments (NF), synaptophysin (SYN), parvalbumin (PA) and calbindin (CB). The cultures have proven to be reliable and simple to initiate and maintain for many weeks without passaging. They are useful in investigations of dendritic growth and injury of primary human CNS neurons.

  6. The effects on DNA migration of altering parameters in the comet assay protocol such as agarose density, electrophoresis conditions and durations of the enzyme or the alkaline treatments.

    Ersson, Clara; Möller, Lennart


    The single cell gel electrophoresis (comet assay) is a popular method for measuring DNA migration as an estimate of DNA damage. No standardised comet assay protocol exists, which makes comparisons between studies complicated. In a previous inter-laboratory validation study of the comet assay, we identified important parameters in the protocol that might affect DNA migration. The aim of this study was to assess how different comet assay protocols affect DNA migration. The results of this study suggest that (i) there is a significant linear dose-response relationship between the agarose gel's density and DNA migration, and damaged cells are more sensitive to the agarose gel's density; (ii) incubation with formamidopyrimidine DNA glycosylase for 10 min is inadequate, whereas 30 min is sufficient; (iii) the typically used 20 min of alkaline treatment might be too short when analysing samples that contain particular alkali-labile sites (ALS); and (iv) the duration of electrophoresis as well as the strength of the applied electric field affects DNA migration. By using protocol-specific calibration curves, it is possible to reduce the variation in DNA migration caused by differences in comet assay protocols. This does not, however, completely remove the impact of the durations of alkaline treatment and electrophoresis when analysing cells containing ALS that are relatively resistant to high alkaline treatment.
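A protocol-specific calibration curve of the kind mentioned above amounts to fitting a linear dose-response (e.g. reference dose versus measured DNA migration) per protocol and then inverting it to express a sample's migration as an equivalent dose. A minimal sketch (function names and the example dose/migration values are hypothetical):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def migration_to_dose(migration, a, b):
    """Invert the protocol-specific calibration: dose = (y - b) / a."""
    return (migration - b) / a
```

Two laboratories with different protocols would each fit their own `(a, b)`; reporting in equivalent dose rather than raw migration is what removes most of the inter-protocol variation.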

  7. Arabidopsis seedling flood-inoculation technique: a rapid and reliable assay for studying plant-bacterial interactions

    Uppalapati Srinivasa R


    Full Text Available Abstract Background The Arabidopsis thaliana-Pseudomonas syringae model pathosystem is one of the most widely used systems for understanding the mechanisms of microbial pathogenesis and plant innate immunity. Several inoculation methods have been used to study plant-pathogen interactions in this model system. However, none of the methods reported to date are similar to those occurring in nature and amenable to large-scale mutant screens. Results In this study, we developed a rapid and reliable seedling flood-inoculation method based on young Arabidopsis seedlings grown on MS medium. This method has several advantages over conventional soil-grown plant inoculation assays, including a shorter growth and incubation period, ease of inoculation and handling, uniform infection and disease development, lower growth chamber space requirements, and suitability for high-throughput screens. In this study we demonstrated the efficacy of the Arabidopsis seedling assay to study (1) the virulence factors of P. syringae pv. tomato DC3000, including the type III protein secretion system (TTSS) and the phytotoxin coronatine (COR); (2) effector-triggered immunity; and (3) Arabidopsis mutants affected in salicylic acid (SA)- and pathogen-associated molecular pattern (PAMP)-mediated pathways. Furthermore, we applied this technique to study nonhost resistance (NHR) responses in Arabidopsis using nonhost pathogens, such as P. syringae pv. tabaci, pv. glycinea and pv. tomato T1, and confirmed the functional role of FLAGELLIN-SENSING 2 (FLS2) in NHR. Conclusions The Arabidopsis seedling flood-inoculation assay provides a rapid, efficient and economical method for studying Arabidopsis-Pseudomonas interactions with minimal growth chamber space and time. This assay could also provide an excellent system for investigating the virulence mechanisms of P. syringae. Using this method, we demonstrated that FLS2 plays a critical role in conferring NHR against nonhost pathovars of P. syringae, but not to

  8. Evaluation of an automated protocol for efficient and reliable DNA extraction of dietary samples.

    Wallinger, Corinna; Staudacher, Karin; Sint, Daniela; Thalinger, Bettina; Oehm, Johannes; Juen, Anita; Traugott, Michael


    Molecular techniques have become an important tool to empirically assess feeding interactions. The increased usage of next-generation sequencing approaches has stressed the need for fast DNA extraction that does not compromise DNA quality. Dietary samples here pose a particular challenge, as they demand high-quality DNA extraction procedures for obtaining the minute quantities of short-fragmented food DNA. Automated high-throughput procedures significantly decrease time and costs and allow for standardization of total DNA extraction. However, these approaches had not yet been evaluated for dietary samples. We tested the efficiency of an automated DNA extraction platform and a traditional CTAB protocol, employing a variety of dietary samples including invertebrate whole-body extracts as well as invertebrate and vertebrate gut content samples and feces. Extraction efficacy was quantified using the proportions of successful PCR amplifications of both total and prey DNA, and cost was estimated in terms of time and material expense. For extraction of total DNA, the automated platform performed better for both invertebrate and vertebrate samples. This was also true for prey detection in vertebrate samples. For dietary analysis in invertebrates, there is still room for improvement when using the high-throughput system for optimal DNA yields. Overall, the automated DNA extraction system turned out to be a promising alternative to labor-intensive, low-throughput manual extraction methods such as CTAB, opening up the opportunity for extensive use of this cost-efficient and innovative methodology, with low contamination risk, in trophic ecology as well.

  9. Ultrasensitive detection in optically dense physiological media: applications to fast reliable biological assays

    Matveeva, Evgenia G.; Gryczynski, Ignacy; Berndt, Klaus W.; Lakowicz, Joseph R.; Goldys, Ewa; Gryczynski, Zygmunt


    We present a novel approach for performing fluorescence immunoassay in serum and whole blood using fluorescently labeled anti-rabbit IgG. This approach, which is based on Surface Plasmon-Coupled Emission (SPCE), provides increased sensitivity and substantial background reduction due to exclusive selection of the signal from fluorophores located near a bio-affinity surface. The effective coupling range for SPCE is only a couple of hundred nanometers from the metallic surface. Excited fluorophores outside the coupling layer do not contribute to SPCE, and their free-space emission is not transmitted through the opaque metallic film into the glass substrate. An antigen (rabbit IgG) was adsorbed to a slide covered with a thin silver metal layer, and the SPCE signal from the fluorophore-labeled anti-rabbit antibody, binding to the immobilized antigen, was detected. The effect of the sample matrix (buffer, human serum, or human whole blood) on the end-point immunoassay SPCE signal is discussed. The kinetics of binding could be monitored directly in whole blood or serum. The results showed that human serum and human whole blood attenuate the SPCE end-point signal and the immunoassay kinetic signal only approximately 2- and 3-fold, respectively (compared to buffer), resulting in signals that are easily detectable even in whole blood. The high optical absorption of hemoglobin can be tolerated because only fluorophores within a couple of hundred nanometers of the metallic film contribute to SPCE. Both glass and plastic slides can be used for SPCE-based assays. We believe that SPCE has the potential of becoming a powerful approach for performing immunoassays based on surface-bound analytes or antibodies for many biomarkers directly in dense samples such as whole blood, without any need for washing steps.

  10. Improving the communication reliability of body sensor networks based on the IEEE 802.15.4 protocol.

    Gomes, Diogo; Afonso, José A


    Body sensor networks (BSNs) enable continuous monitoring of patients anywhere, with minimum constraints on daily life activities. Although the IEEE 802.15.4 and ZigBee(®) (ZigBee Alliance, San Ramon, CA) standards were mainly developed for use in wireless sensor network (WSN) applications, they are also widely used in BSN applications because of device characteristics such as low power, low cost, and small form factor. However, compared with WSNs, BSNs present some very distinctive characteristics in terms of traffic and mobility patterns, heterogeneity of the nodes, and quality of service requirements. This article evaluates the suitability of the carrier sense multiple access with collision avoidance (CSMA-CA) protocol, used by the IEEE 802.15.4 and ZigBee standards, for data-intensive BSN applications, through the execution of experimental tests in different evaluation scenarios, in order to take into account the effects of contention, clock drift, and hidden nodes on communication reliability. Results show that the delivery ratio may decrease substantially during transitory periods, which can last for several minutes, to a minimum of 90% with retransmissions and 13% without retransmissions. This article also proposes and evaluates the performance of the BSN contention avoidance mechanism, which was designed to solve the identified reliability problems. This mechanism was able to restore the delivery ratio to 100% even in the scenario without retransmissions.
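The gap between the with- and without-retransmission delivery ratios above can be related to a simple repetition model. Assuming each attempt succeeds independently with the same probability (a simplification: hidden-node collisions are in practice correlated, which is why real transitory periods look worse), a sketch:

```python
def delivery_ratio(p_attempt, max_retries):
    """Probability a frame is delivered within 1 + max_retries attempts,
    assuming independent attempts each succeeding with p_attempt."""
    return 1 - (1 - p_attempt) ** (1 + max_retries)
```

Starting from a 13% single-attempt success rate, even three MAC retransmissions only lift the delivery ratio to roughly 43%, which illustrates why the authors needed a contention avoidance mechanism rather than more retries.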

  11. A simple, rapid and reliable protocol to localize hydrogen peroxide in large plant organs by DAB-mediated tissue printing

    Yonghua Liu


    Full Text Available Hydrogen peroxide (H2O2) is a major reactive oxygen species (ROS) and plays diverse roles in plant development and stress responses. However, its localization in large and thick plant organs (e.g. stems, roots and fruits), other than leaves, has proven challenging because the commonly used H2O2-specific chemicals, such as 3,3'-diaminobenzidine (DAB), cerium chloride (CeCl3) and 2',7'-dichlorofluorescin diacetate (H2DCF-DA), penetrate those organs poorly. In principle, the reaction of endogenous H2O2 with these chemicals could be facilitated by using thin organ sections. However, the rapid production of wound-induced H2O2 associated with this procedure inevitably disturbs the original distribution of H2O2 in vivo. Here, employing tomato seedling stems and fruits as testing materials, we report a novel, simple and rapid protocol to localize H2O2 in these organs using DAB-mediated tissue printing. The rapidity of the protocol (within 15 s) completely avoids interference from wound-induced H2O2 during experimentation. Moreover, the H2O2 signal on the print was stable for at least 1 h, with little or no background produced. We conclude that the DAB-mediated tissue printing developed here provides a new, feasible and reliable method to localize H2O2 in large plant organs, and hence should have broad applications in studying ROS biology.

  12. A reliable indirect cell-labelling protocol for optical imaging allows ex vivo visualisation of mesenchymal stem cells after transplantation.

    Diana, Valentina; Libani, Ilaria Vittoria; Armentero, Marie-Therese; Blandini, Fabio; Lucignani, Giovanni; Silani, Vincenzo; Cova, Lidia; Ottobrini, Luisa


    We set out to assess the feasibility of exploiting expression of the mCherry gene, after lentiviral infection, in order to visualise bone marrow-derived human mesenchymal stem cells (hMSCs) by optical imaging, and to provide proof of principle of this approach as a method for cell tracking and quantification in pre-clinical models. Commercial hMSCs were infected with a lentiviral vector carrying the mCherry gene under the control of the phosphoglycerate kinase promoter. After extensive in vitro culture, infected hMSCs were analysed for viability, morphology, differentiation capability, and maintenance of fluorescence. Thereafter, mCherry-positive cells were transplanted into unilaterally 6-hydroxy-dopamine lesioned rats (an experimental model of Parkinson's disease). Our analysis showed that hMSCs can be efficiently transduced with the lentiviral vector, retaining their biological features even in the long term. Intrastriatally transplanted mCherry-positive hMSCs can be detected ex vivo by a sensitive cooled CCD camera, both in the whole brain and in serial slices, and relatively quantified. Our protocol was found to be a reliable means of studying the viability of implanted hMSCs. mCherry labelling appears to be readily applicable in the post-transplantation tracking of stem cells and could favour the rapid development of new therapeutic targets for clinical treatments.

  13. Implementing voice over Internet protocol in mobile ad hoc network – analysing its features regarding efficiency, reliability and security

    Naveed Ahmed Sheikh


    Full Text Available Providing secure and efficient real-time voice communication in a mobile ad hoc network (MANET) environment is a challenging problem. Voice over Internet protocol (VoIP) has originally been developed over the past two decades for infrastructure-based networks. There are strict timing constraints for acceptable quality VoIP services, in addition to registration and discovery issues in VoIP end-points. In MANETs, the ad hoc nature of the networks and the multi-hop wireless environment, with significant packet loss and delays, present formidable challenges to the implementation. Providing a secure real-time VoIP service on a MANET is the main design objective of this paper. The authors have successfully developed a prototype system that establishes reliable and efficient VoIP communication and provides an extremely flexible method for voice communication in MANETs. The authors' cooperative mesh-based MANET implementation can be used for rapidly deployable VoIP communication with survivable and efficient dynamic networking using open source software.

  14. An Improved High-Reliability AODV Routing Protocol for MANETs

    李晓婷; 沈桂华


    The route discovery and route maintenance procedures of the standard AODV routing protocol incur large overhead. To address this problem, this paper proposes an improved AODV routing protocol. Simulation results show that the improved AODV protocol effectively reduces routing overhead and average end-to-end data transfer delay, and improves the efficiency and reliability of the protocol.

  15. A simple protocol for using a LDH-based cytotoxicity assay to assess the effects of death and growth inhibition at the same time.

    Shilo M Smith

    Full Text Available Analyzing the effects of cell growth inhibition and/or cell death has been an important component of biological research. The MTS assay and LDH-based cytotoxicity assays are two of the most commonly used methods for this purpose. However, the data here showed that the MTS cell proliferation assay could not distinguish the effects of cell death from cell growth inhibition. In addition, the original LDH-based cytotoxicity protocol grossly underestimated the proportion of dead cells in conditions with growth inhibition. To overcome this limitation, we present here a simple modified LDH-based cytotoxicity protocol that adds condition-specific controls. This modified protocol can thus provide a more accurate measurement of killing effects in addition to the measurement of overall effects, especially in conditions with growth inhibition. In summary, we present here a simple, modified cytotoxicity assay which can determine the overall effects, percentage of cell killing and growth inhibition in one 96-well based assay. This is a viable option for primary screening for many laboratories, and could be adapted for high-throughput screening.
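The arithmetic behind an LDH readout with condition-specific controls can be sketched as follows. The formulas are the standard percent-cytotoxicity calculation plus a total-LDH comparison for growth inhibition; the function names and plate layout are illustrative, not necessarily the authors' exact scheme:

```python
def percent_cytotoxicity(treated, spont_ctrl, max_ctrl):
    """Classic LDH cytotoxicity with condition-specific controls.

    treated    -- LDH released into medium under the test condition
    spont_ctrl -- LDH from unlysed cells under the SAME condition
    max_ctrl   -- LDH after full lysis of cells under the SAME condition
    """
    return 100.0 * (treated - spont_ctrl) / (max_ctrl - spont_ctrl)

def percent_growth_inhibition(max_ctrl_condition, max_ctrl_untreated):
    """Total LDH of fully lysed wells tracks total cell number, so the
    drop versus untreated lysed wells estimates growth inhibition."""
    return 100.0 * (1 - max_ctrl_condition / max_ctrl_untreated)
```

Using the untreated maximum-release control for a growth-inhibited condition (as in the original protocol) inflates the denominator and underestimates killing, which is exactly the bias the condition-specific controls remove.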

  16. Integration of GC-MSD and ER-Calux® assay into a single protocol for determining steroid estrogens in environmental samples.

    Avberšek, Miha; Žegura, Bojana; Filipič, Metka; Heath, Ester


    There are many published studies that use either chemical or biological methods to investigate steroid estrogens in the aquatic environment, but studies combining both are rarer. In this study, gas chromatography with mass selective detection (GC-MSD) and the ER-Calux(®) estrogenicity assay were integrated into a single protocol for simultaneous determination of natural (estrone--E1, 17β-estradiol--E2, estriol--E3) and synthetic (17α-ethinylestradiol--EE2) steroid estrogen concentrations and the total estrogenic potential of environmental samples. For integration purposes, several solvents were investigated, and the dimethyl sulphoxide (DMSO) commonly used in the ER-Calux(®) assay was replaced by ethyl acetate, which is more compatible with gas chromatography and enables the same sample to be analysed by both GC-MSD and the ER-Calux(®) assay. The integrated protocol was initially tested using a standard mixture of estrogens. The results for pure standards showed that the estrogenicity calculated on the basis of GC-MSD and the ER-Calux(®) assay exhibited good correlation (r(2)=0.96; α=0.94). The result remained the same when spiked waste water extracts were tested (r(2)=0.92, α=1.02). When applied to real waste water influent and effluent samples, the results confirmed the applicability of the protocol (r(2)=0.93; α=0.99). The main advantages of this newly developed protocol are simple sample handling for both methods, and reduced material consumption and labour. In addition, it can be applied as either a complete or sequential analysis in which the ER-Calux(®) assay is used as a pre-screening method prior to the chemical analysis.
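Comparing chemically measured concentrations against a bioassay reading is typically done via estradiol equivalents (EEQ): each concentration is weighted by a relative estrogenic potency and summed. A sketch of that bookkeeping; the EEF values below are illustrative assumptions, not the paper's calibration:

```python
# Hypothetical relative estrogenic potencies (17beta-estradiol = 1.0).
EEF = {"E1": 0.2, "E2": 1.0, "E3": 0.01, "EE2": 1.2}

def estradiol_equivalents(conc_ng_per_l):
    """Chemically derived estrogenicity in ng-E2-equivalents per litre:
    sum of concentration x potency over the measured estrogens. The
    result is what gets correlated against the ER-Calux measured EEQ."""
    return sum(conc_ng_per_l[c] * EEF[c] for c in conc_ng_per_l)
```

The r(2) and slope values reported in the abstract are exactly the regression of this chemically derived EEQ against the bioassay's directly measured EEQ.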

  17. Delay-bounded semi-reliable vital sign transmission protocol for mobile telemedicine over a CDMA 1x EV-DO network.

    Lee, Tong H; Yoo, Sun K


    The reliable and instant transmission of vital signs is important for remote time-critical patient care through a telemedicine system. However, the reliability and instantaneity requirements sometimes cannot be satisfied simultaneously over a high-noise mobile network, because they are in tension with each other. In this paper, a vital sign transmission protocol (VSTP) running over a CDMA 1x EV-DO (Code Division Multiple Access 1x Evolution Data Only) mobile network is proposed to comply with both the reliability and instantaneity requirements. A switching buffer management scheme is combined with a hybrid error control scheme consisting of forward error correction (FEC) and automatic repeat request (ARQ). The CDMA 1x EV-DO mobile network is modeled with two states using a Markov wireless channel model to test transmission performance under diverse network conditions. In simulations of noisy environments, the performance of the VSTP is compared with the Transmission Control Protocol (TCP) and User Datagram Protocol (UDP) to demonstrate its efficacy over an error-prone mobile network.
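A two-state Markov wireless channel of the kind used to evaluate VSTP can be simulated in a few lines: a "good" and a "bad" state with different loss probabilities and random transitions between them (a Gilbert-Elliott-style model). The state names, loss rates and transition probabilities below are illustrative, not the paper's parameters:

```python
import random

def simulate_channel(n, p_gb, p_bg, loss_good=0.01, loss_bad=0.5, seed=1):
    """Simulate n packets over a two-state Markov channel.

    p_gb / p_bg -- per-packet probability of flipping good->bad / bad->good
    loss_good / loss_bad -- packet loss probability in each state
    Returns the observed packet loss ratio.
    """
    rng = random.Random(seed)   # fixed seed for reproducibility
    state, lost = "good", 0
    for _ in range(n):
        if rng.random() < (loss_good if state == "good" else loss_bad):
            lost += 1
        if rng.random() < (p_gb if state == "good" else p_bg):
            state = "bad" if state == "good" else "good"
    return lost / n
```

The burstiness of the bad state is what separates FEC (good against scattered losses) from ARQ (needed for bursts), which motivates the hybrid error control scheme in the protocol.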

  18. A Reliable Routing Protocol for WSN Based on Load Balancing

    李宗海; 柳少军; 王燕; 牛晓光


    The wireless links between neighboring nodes may have poor and unstable communication quality because of node energy exhaustion, physical destruction of nodes, or environmental interference. The failure of wireless links causes route breakdown, packet loss and retransmission, which weaken the reliability of data transmission. Numerous studies show that the locally unbalanced distribution of traffic load makes it difficult to improve the energy efficiency and reliability of routing protocols. This paper proposes a reliable routing protocol based on load balancing, namely RR. The protocol employs a cross-layer cooperative design, including a dynamic link-quality evaluation mechanism and a precise graded evaluation of inter-node distance, to establish and maintain transmission paths from sensor nodes to the sink node and deliver monitoring data reliably. Experimental results show that the protocol achieves better performance in energy consumption, load balancing and delivery rate, effectively reducing the peak load of nodes near the sink and improving the data reception rate.
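Dynamic link-quality evaluation is commonly implemented as an exponentially weighted moving average (EWMA) over per-window delivery ratios, so that the estimate tracks recent channel behaviour without overreacting to a single lossy window. This is a generic sketch, not necessarily RR's exact mechanism:

```python
def ewma_link_quality(samples, alpha=0.2, q0=1.0):
    """Smooth per-window delivery ratios into a link-quality estimate.

    samples -- iterable of delivery ratios in [0, 1], newest last
    alpha   -- weight of the newest sample (higher = more reactive)
    q0      -- optimistic initial estimate for a freshly heard link
    Returns the estimate after each sample.
    """
    q, history = q0, []
    for s in samples:
        q = alpha * s + (1 - alpha) * q
        history.append(q)
    return history
```

A parent-selection rule can then compare neighbours' current estimates and route around links whose quality has degraded, before they fail outright.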


    K.G. Santhiya


    Full Text Available MANET (mobile ad hoc network) is an infrastructure-less wireless ad hoc network that does not require any central control. The topology of the network changes drastically due to the very fast mobility of nodes, so an adaptive routing protocol is needed for routing in a MANET. AODV (Ad hoc On-demand Distance Vector) routing is one of the most effective and prominent on-demand ad hoc routing protocols. During the route establishment phase in traditional AODV, only one route reply message is sent along the reverse path to establish the routing path. The high mobility of nodes may affect the reply messages, leading to retransmission of the route request message by the sender, which in turn leads to higher communication delay and power consumption and a reduction in the ratio of packets delivered. Sending multiple route reply messages and establishing multiple paths in a single path discovery reduces the routing overhead involved in maintaining the connection between source and destination nodes. Multipath routing can provide high scalability, end-to-end throughput and load balancing in a MANET. The proposed Multipath QoS-aware reliable routing protocol establishes two routes of maximum node-disjoint paths, and data transfer is carried out on the two paths simultaneously. To select the best paths, the proposed protocol uses three parameters: link eminence, MAC overhead and node residual energy. The experimental values prove that the MQARR-AODV protocol achieves high reliability, stability and low latency, and outperforms AODV through lower energy consumption, overhead and delay.
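Scoring candidate paths on the three named parameters and then keeping two node-disjoint routes can be sketched as follows. The weights and the scoring formula are hypothetical illustrations; the abstract names the metrics but not how they are combined:

```python
def path_score(link_quality, mac_overhead, residual_energy,
               w=(0.5, 0.2, 0.3)):
    """Hypothetical weighted score over the three metrics named in the
    abstract; higher is better, MAC overhead counts against the path."""
    return w[0] * link_quality - w[1] * mac_overhead + w[2] * residual_energy

def best_disjoint_pair(paths):
    """paths -- list of (node_tuple, score), endpoints shared by design.
    Return the top-scoring path plus the best path whose intermediate
    nodes do not overlap with it (node-disjoint), or None if none exists."""
    ranked = sorted(paths, key=lambda p: p[1], reverse=True)
    best = ranked[0]
    for cand in ranked[1:]:
        # compare intermediate nodes only; source/destination are shared
        if not set(best[0][1:-1]) & set(cand[0][1:-1]):
            return best, cand
    return best, None
```

Transferring data on both returned paths at once is what gives the protocol its resilience: a single node failure can break at most one of the two disjoint routes.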

  20. Temporomandibular joint involvement in Juvenile Idiopathic Arthritis: reliability and validity of a screening protocol for the rheumatologist

    Steenks, Michel H.; Giancane, G; de Leeuw, Rob R. J.; Bronkhorst, Ewald M.; van Es, Robert J. J.; Koole, Ron; van Bruggen, H. Willemijn; Wulffraat, NM


    Background: In Juvenile Idiopathic Arthritis (JIA) the temporomandibular joint (TMJ) can be involved, leading to pain, dysfunction, and growth disturbances of the mandible and associated structures. There may be value in a three-minute screening protocol allowing the rheumatologist to detect TMJ involvement.

  1. Demonstration of Model Assisted Reliability Assessment Protocol on a Proposed Low Frequency Vibration Based Damage Sensing Case


    demonstration and broad use of the methodology and protocol. INTRODUCTION: The successful deployment of systems for health monitoring of ... based damage detection method. An ETrema brand Terfenol-D magnetostrictive actuator was used for band-limited pseudo-random excitation up to 1200 Hz.

  2. Angiogenesis Assays.

    Nambiar, Dhanya K; Kujur, Praveen K; Singh, Rana P


    Neoangiogenesis constitutes one of the first steps of tumor progression beyond a critical tumor size, supplying a dormant mass of cancerous cells with the nutrient supply and gaseous exchange, through blood vessels, essential for their sustained and aggressive growth. To understand any biological process, it is imperative to use models that mimic the actual biological system as closely as possible; finding the most appropriate model is therefore a vital part of any experimental design. Angiogenesis research has been hampered by a lack of simple, reliable, and relevant models that can be easily quantified. Angiogenesis models have been used extensively for studying the agonistic or antagonistic action of various molecules and the associated mechanisms. Here, we describe two protocols, or models, that are popularly used for studying angiogenic parameters. The rat aortic ring assay bridges the gap between in vitro and in vivo models. The chorioallantoic membrane (CAM) assay is one of the most widely used in vivo model systems for angiogenesis-related studies; the CAM is a highly vascularized tissue of the avian embryo and serves as a good model to study the effects of various test compounds on neoangiogenesis.

  3. A Simple and Reliable Assay for Detecting Specific Nucleotide Sequences in Plants Using Optical Thin-film Biosensor Chips

    S. Bai; X. Zhong; L. Ma; W. Zheng; L. Fan; N. Wei; X.W. Deng


    Here we report the adaptation and optimization of an efficient, accurate, and inexpensive assay that employs custom-designed silicon-based optical thin-film biosensor chips to detect unique transgenes in genetically modified (GM) crops and SNP markers in model plant genomes.


    A. Gopi Saminathan


    Data aggregation protocols are required in Wireless Sensor Networks (WSNs) to improve data accuracy and to extend network lifetime by reducing energy consumption. The existing Data Aggregation-Optimal LEACH (DAO-LEACH) protocol for WSNs is enhanced in terms of security and fault tolerance based on Gracefully Degraded Data Aggregation (GDDA), which ensures the integrity of the aggregated data, and Hybrid Layer User Authentication (HLUA), which ensures its confidentiality. This data aggregation scheme rejects false data from compromised and malfunctioning Sensor Nodes (SNs). HLUA combines a Secret Key Cryptography (SKC) method, a Message Authentication Code (MAC) algorithm, with a Public Key Cryptography (PKC) method, Elliptic Curve Cryptography (ECC). The MAC algorithm is used between Cluster Heads (CHs) and SNs to meet their lower power budget, while ECC is applied for User Authentication (UA) between CHs and users. The enhanced DAO-LEACH protocol resists security attacks such as replay attacks, node-compromise attacks, and impersonation attacks, and performs better in terms of energy consumption, number of nodes alive, End-to-End Delay (EED), false data detection, and aggregation accuracy.
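The symmetric half of HLUA authenticates sensor-node data to the cluster head with a MAC. A minimal sketch using HMAC-SHA256 from the Python standard library as a stand-in for whatever MAC construction the protocol actually specifies; the key, node ID, and message layout are illustrative assumptions. The PKC half (ECC-based user authentication between CHs and users) is not sketched here.

```python
import hmac
import hashlib

def tag_reading(shared_key: bytes, node_id: str, reading: bytes) -> bytes:
    """Sensor-node side: authenticate a reading to its cluster head."""
    return hmac.new(shared_key, node_id.encode() + reading, hashlib.sha256).digest()

def verify_reading(shared_key: bytes, node_id: str, reading: bytes, tag: bytes) -> bool:
    """Cluster-head side: reject false data from compromised or
    malfunctioning nodes."""
    expected = tag_reading(shared_key, node_id, reading)
    return hmac.compare_digest(expected, tag)  # constant-time comparison

key = b"pairwise-sn-ch-key"          # illustrative pre-shared key
tag = tag_reading(key, "SN-17", b"temp=23.4")
assert verify_reading(key, "SN-17", b"temp=23.4", tag)
assert not verify_reading(key, "SN-17", b"temp=99.9", tag)  # tampered reading rejected
```

The HMAC check is what lets a CH drop forged readings before they contaminate the aggregate, which is the integrity property GDDA relies on.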

  5. Development and systematic validation of qPCR assays for rapid and reliable differentiation of Xylella fastidiosa strains causing citrus variegated chlorosis.

    Li, Wenbin; Teixeira, Diva C; Hartung, John S; Huang, Qi; Duan, Yongping; Zhou, Lijuan; Chen, Jianchi; Lin, Hong; Lopes, Silvio; Ayres, A Juliano; Levy, Laurene


    The xylem-limited, Gram-negative, fastidious plant bacterium Xylella fastidiosa is the causal agent of citrus variegated chlorosis (CVC), a destructive disease affecting approximately half of the citrus plantations in the State of São Paulo, Brazil. The disease was recently found in Central America and is threatening the multi-billion-dollar U.S. citrus industry. Many strains of X. fastidiosa are pathogens or endophytes in various plants growing in the U.S., and some strains cross-infect several host plants. In this study, a TaqMan-based assay targeting the 16S rDNA signature region was developed for the identification of X. fastidiosa at the species level. Another TaqMan-based assay was developed for the specific identification of the CVC strains. Both new assays were systematically validated against the primer/probe sets of four previously published assays on one platform and under similar PCR conditions, and shown to be superior. The species-specific assay detected all X. fastidiosa strains and did not amplify any other citrus pathogen or endophyte tested. The CVC-specific assay detected all CVC strains but did not amplify any non-CVC X. fastidiosa or any other citrus pathogen or endophyte evaluated. Both sets were multiplexed with a reliable internal-control assay targeting host plant DNA, and their diagnostic specificity and sensitivity remained unchanged. This internal control provides quality assurance for DNA extraction and for the performance of PCR reagents, platforms, and operators. The limit of detection for both assays was equivalent to 2 to 10 cells of X. fastidiosa per reaction for field citrus samples. Petioles and midribs of symptomatic leaves of sweet orange harbored the highest populations of X. fastidiosa, providing the best materials for detection of the pathogen. The new species-specific assay will be invaluable for molecular identification of X. fastidiosa at the species level, and the CVC-specific assay will be very powerful for the

  6. Reliable Zone Routing Protocol Based on Segmented Repair in Ad-hoc Networks

    吴静; 侯国照; 赵蕴龙


    In ad-hoc networks, the interzone route-maintenance scheme of ZRP leads to many lost packets and frequent route rediscovery. The former lowers ZRP's packet delivery ratio, and hence its reliability; the latter raises its transmission delay and routing overhead. To address these problems, a Segmented Repairment based Zone Routing Protocol (SRZRP) is proposed. In SRZRP, each node saves multiple backup routes to every node in its zone by maintaining a directed acyclic graph of the intrazone topology, and interzone routes are repaired using backup segment routes wherever possible. Theoretical analysis shows that SRZRP has higher reliability. Simulation results show that SRZRP improves the packet delivery ratio, confirming the improved reliability, while also reducing the average end-to-end delay and routing overhead of the protocol.

  7. Interface Assignment-Based AODV Routing Protocol to Improve Reliability in Multi-Interface Multichannel Wireless Mesh Networks

    Won-Suk Kim; Sang-Hwa Chung


    The utilization of wireless mesh networks (WMNs) has greatly increased, and the multi-interface multichannel (MIMC) technique has been widely used for the backbone network. Unfortunately, the ad hoc on-demand distance vector (AODV) routing protocol defined in the IEEE 802.11s standard was designed for WMNs using the single-interface single-channel technique. So, we define a problem that happens when the legacy AODV is used in MIMC WMNs and propose an interface assignment-based AODV (IA-AODV) in o...

  8. Distinct gene expression responses of two anticonvulsant drugs in a novel human embryonic stem cell based neural differentiation assay protocol

    Schulpen, Sjors H. W.; de Jong, Esther; de la Fonteyne, Liset J. J.; de Klerk, Arja; Piersma, Aldert H.


    Hazard assessment of chemicals and pharmaceuticals is increasingly gaining from knowledge about molecular mechanisms of toxic action acquired in dedicated in vitro assays. We have developed an efficient human embryonic stem cell neural differentiation test (hESTn) that allows the study of the molecu


  11. The sentence verification task: a reliable fMRI protocol for mapping receptive language in individual subjects

    Sanjuan, Ana; Avila, Cesar [Universitat Jaume I, Departamento de Psicologia Basica, Clinica y Psicobiologia, Castellon de la Plana (Spain); Hospital La Fe, Unidad de Epilepsia, Servicio de Neurologia, Valencia (Spain); Forn, Cristina; Ventura-Campos, Noelia; Rodriguez-Pujadas, Aina; Garcia-Porcar, Maria [Universitat Jaume I, Departamento de Psicologia Basica, Clinica y Psicobiologia, Castellon de la Plana (Spain); Belloch, Vicente [Hospital La Fe, Eresa, Servicio de Radiologia, Valencia (Spain); Villanueva, Vicente [Hospital La Fe, Unidad de Epilepsia, Servicio de Neurologia, Valencia (Spain)


    To test the capacity of a sentence verification (SV) task to reliably activate receptive language areas. Presurgical evaluation of language is useful in predicting postsurgical deficits in patients who are candidates for neurosurgery. Productive language tasks have been successfully elaborated, but more conflicting results have been found in receptive language mapping. Twenty-two right-handed healthy controls made true-false semantic judgements of brief sentences presented auditorily. Group maps showed reliable functional activations in the frontal and temporoparietal language areas. At the individual level, the SV task showed activation located in receptive language areas in 100% of the participants with strong left-sided distributions (mean lateralisation index of 69.27). The SV task can be considered a useful tool in evaluating receptive language function in individual subjects. This study is a first step towards designing the fMRI task which may serve to presurgically map receptive language functions. (orig.)
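The reported mean lateralisation index of 69.27 suggests the conventional voxel-count formula, LI = 100 * (L - R) / (L + R); whether the paper uses exactly this variant is an assumption. A small sketch:

```python
# Common fMRI lateralisation index (an assumption here; the study may use a
# variant): compares activated voxel counts in left- and right-hemisphere
# language regions. LI near +100 means strongly left-lateralised, near -100
# strongly right-lateralised.

def lateralisation_index(left_voxels: int, right_voxels: int) -> float:
    if left_voxels + right_voxels == 0:
        raise ValueError("no activated voxels in either hemisphere")
    return 100.0 * (left_voxels - right_voxels) / (left_voxels + right_voxels)

# e.g. 1100 left-hemisphere vs 200 right-hemisphere activated voxels:
li = lateralisation_index(1100, 200)   # strongly left-sided, li ~ 69.2
```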

  12. An improved behavioural assay demonstrates that ultrasound vocalizations constitute a reliable indicator of chronic cancer pain and neuropathic pain

    Selvaraj Deepitha


    Background: On-going pain is one of the most debilitating symptoms associated with a variety of chronic pain disorders. An understanding of the mechanisms underlying on-going, i.e. stimulus-independent, pain has so far been hampered by a lack of behavioural parameters enabling its study in experimental animals. Ultrasound vocalizations (USVs) have been proposed to correlate with pain evoked by acute activation of nociceptors; however, the literature on the utility of USVs as an indicator of chronic pain is very controversial. A majority of these inconsistencies arise from factors confounding behavioural experiments, including novelty, fear, and restraint stress, among others. Results: We have developed an improved assay which overcomes these confounding factors and enables studying USVs in freely moving mice repetitively over several weeks. Using this improved assay, we report here that USVs increase significantly in mice with bone metastases-induced cancer pain or neuropathic pain for several weeks, in comparison to sham-treated mice. Importantly, analgesic drugs known to alleviate tumour pain or neuropathic pain in human patients significantly reduce USVs as well as mechanical allodynia in the corresponding mouse models. Conclusions: Studying USVs and mechanical allodynia in the same cohort of mice enables comparing the temporal progression of on-going (stimulus-independent) pain and stimulus-evoked pain in these clinically highly relevant forms of chronic pain.

  13. The Taste and Smell Protocol in the 2011–2014 US National Health and Nutrition Examination Survey (NHANES): Test–Retest Reliability and Validity Testing

    Rawal, Shristi; Hoffman, Howard J.; Honda, Mallory; Huedo-Medin, Tania B.; Duffy, Valerie B.


    Introduction: The US NHANES 2011–2014 protocol includes a taste and smell questionnaire (CSQ) in home-based interviews and brief assessments in mobile exam centers. We report the short- and longer-term test–retest reliability and validity of this protocol against broader chemosensory measures. Methods: A convenience sample of 73 adults (age = 39.5 ± 20.8 years) underwent the NHANES protocol at baseline, 2 weeks, and 6 months. For taste, participants rated intensities of two tastants (1 M NaCl, 1 mM quinine) applied to the tongue tip and three tastants (1 M NaCl, 1 mM quinine, 0.32 M NaCl) sampled with the whole mouth. Smell function was assessed with a Pocket Smell Test™ (PST; eight-item odor identification test). The CSQ asked about chemosensory problems, distortions, and age-related changes. Broader baseline measurements were a 40-item olfactometer-generated identification task and additional whole-mouth taste intensities (1 M sucrose, 32 mM citric acid, 3.2 mM propylthiouracil). Results: Intraclass correlations (ICCs) for NHANES taste measures showed moderate-to-good agreement after 2 weeks and 6 months (ICCs 0.42–0.71). Whole-mouth quinine intensity was significantly correlated with other taste intensities, supporting its utility as a marker for overall taste functioning. Olfactory classification from PSTs agreed for 98.5% of participants across 2 weeks (κ = 0.85; 95% CI 0.71–0.99) and had good correspondence with the olfactometer task. CSQ items showed good-to-excellent agreement over 6 months (ICCs 0.66–0.90). Conclusions: These findings further support that the NHANES chemosensory protocol has moderate-to-good test–retest reliability when administered to healthy, educated adults. Despite being a brief procedure with limited measures, the NHANES taste and smell assessments provided good information when compared to broader measures of taste and smell function. PMID:27833669
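The reliability results above are reported as intraclass correlation coefficients. Below is a minimal sketch of ICC(2,1) (two-way random effects, absolute agreement, single measurement, in the Shrout-Fleiss terminology) computed from a subjects-by-sessions matrix; which ICC form the study actually used is an assumption here.

```python
# Minimal ICC(2,1) from scratch; data is a list of rows, one per subject,
# with one column per repeated session.

def icc_2_1(data):
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]

    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)   # subjects
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)   # sessions
    sse = sum((data[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))                                # residual

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Perfectly repeatable ratings give ICC = 1.0:
print(icc_2_1([[1, 1], [2, 2], [3, 3]]))  # -> 1.0
```

A constant session-to-session shift lowers ICC(2,1) because this form penalises absolute disagreement, not just inconsistency.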

  14. Lymphocyte transformation assay for C. neoformans antigen is not reliable for detecting cellular impairment in patients with Neurocryptococcosis

    Rocha Katya C


    Background: Cryptococcus neoformans causes meningitis and disseminated infection in healthy individuals, but more commonly in hosts with defective immune responses. Cell-mediated immunity is an important component of the immune response to a great variety of infections, including yeast infections. We aimed to evaluate a specific lymphocyte transformation assay to Cryptococcus neoformans in order to identify immunodeficiency associated with neurocryptococcosis (NCC) as a primary cause of the mycosis. Methods: Healthy volunteers, poultry growers, and HIV-seronegative patients with neurocryptococcosis were tested for cellular immune response. Cryptococcal meningitis was diagnosed by India ink staining of cerebrospinal fluid and a cryptococcal antigen test (Immunomycol-Inc, SP, Brazil). Isolated peripheral blood mononuclear cells were stimulated with C. neoformans antigen, C. albicans antigen, and pokeweed mitogen. The amount of 3H-thymidine incorporated was assessed, and the results were expressed as stimulation index (SI) and log SI, sensitivity, specificity, and cut-off value (receiver operating characteristic curve). We applied unpaired Student t-tests to compare data and considered significant differences for p. Results: The lymphotoxin alpha showed a low capacity with all the stimuli for classifying patients as responders and non-responders. Lymphotoxin alpha stimulated by heat-killed antigen from patients with neurocryptococcosis was not affected by the TCD4+ cell count, and the intensity of response did not correlate with the clinical evolution of neurocryptococcosis. Conclusion: Response to the lymphocyte transformation assay should be analyzed based on a normal range and using more than one stimulator. The use of a cut-off value to classify patients with neurocryptococcosis is inadequate. Statistical analysis should be based on the log transformation of SI. A more purified antigen for evaluating the specific response to C. neoformans is needed.
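The abstract expresses lymphoproliferation as a stimulation index (SI) and recommends basing statistics on log SI. SI is conventionally the ratio of 3H-thymidine counts (cpm) in stimulated versus unstimulated cultures; the counts below are made-up illustrative values.

```python
import math

# Conventional stimulation index from 3H-thymidine incorporation counts.
# The cpm values used here are made-up illustrative numbers.

def stimulation_index(stimulated_cpm: float, unstimulated_cpm: float) -> float:
    return stimulated_cpm / unstimulated_cpm

def log_si(stimulated_cpm: float, unstimulated_cpm: float) -> float:
    # Log-transforming SI makes the typically right-skewed ratios more
    # symmetric, which is why the authors base statistics on log SI.
    return math.log10(stimulation_index(stimulated_cpm, unstimulated_cpm))

si = stimulation_index(12500, 500)   # -> 25.0
print(log_si(12500, 500))            # log10(25) ~ 1.40
```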

  15. Lesion Explorer: a video-guided, standardized protocol for accurate and reliable MRI-derived volumetrics in Alzheimer's disease and normal elderly.

    Ramirez, Joel; Scott, Christopher J M; McNeely, Alicia A; Berezuk, Courtney; Gao, Fuqiang; Szilagyi, Gregory M; Black, Sandra E


    Obtaining in vivo human brain tissue volumetrics from MRI is often complicated by various technical and biological issues. These challenges are exacerbated when significant brain atrophy and age-related white matter changes (e.g. Leukoaraiosis) are present. Lesion Explorer (LE) is an accurate and reliable neuroimaging pipeline specifically developed to address such issues commonly observed on MRI of Alzheimer's disease and normal elderly. The pipeline is a complex set of semi-automatic procedures which has been previously validated in a series of internal and external reliability tests(1,2). However, LE's accuracy and reliability is highly dependent on properly trained manual operators to execute commands, identify distinct anatomical landmarks, and manually edit/verify various computer-generated segmentation outputs. LE can be divided into 3 main components, each requiring a set of commands and manual operations: 1) Brain-Sizer, 2) SABRE, and 3) Lesion-Seg. Brain-Sizer's manual operations involve editing of the automatic skull-stripped total intracranial vault (TIV) extraction mask, designation of ventricular cerebrospinal fluid (vCSF), and removal of subtentorial structures. The SABRE component requires checking of image alignment along the anterior and posterior commissure (ACPC) plane, and identification of several anatomical landmarks required for regional parcellation. Finally, the Lesion-Seg component involves manual checking of the automatic lesion segmentation of subcortical hyperintensities (SH) for false positive errors. While on-site training of the LE pipeline is preferable, readily available visual teaching tools with interactive training images are a viable alternative. Developed to ensure a high degree of accuracy and reliability, the following is a step-by-step, video-guided, standardized protocol for LE's manual procedures.

  16. Comprehensive neuromechanical assessment in stroke patients: reliability and responsiveness of a protocol to measure neural and non-neural wrist properties.

    van der Krogt, Hanneke; Klomp, Asbjørn; de Groot, Jurriaan H; de Vlugt, Erwin; van der Helm, Frans Ct; Meskers, Carel Gm; Arendzen, J Hans


    Understanding movement disorders after stroke and providing targeted treatment for post-stroke patients requires valid and reliable identification of biomechanical (passive) and neural (active and reflexive) contributors. The aim of this study was to assess the test-retest reliability of passive, active, and reflexive parameters and to determine clinical responsiveness in a cohort of stroke patients with upper-extremity impairments and healthy volunteers. Thirty-two community-residing chronic stroke patients with an impairment of an upper limb and fourteen healthy volunteers were assessed with a comprehensive neuromechanical assessment protocol consisting of active and passive tasks and different stretch reflex-eliciting measuring velocities, using a haptic manipulator and surface electromyography of wrist flexor and extensor muscles (Netherlands Trial Registry number NTR1424). Intraclass correlation coefficients (ICCs) and the Standard Error of Measurement were calculated to establish the relative and absolute test-retest reliability of passive, active, and reflexive parameters. Clinical responsiveness was tested with the Kruskal-Wallis test for differences between groups. ICCs of passive parameters were fair to excellent (0.45 to 0.91). ICCs of active parameters were excellent (0.88-0.99). ICCs of reflexive parameters were fair to good (0.50-0.74); only the reflexive loop time of the extensor muscles performed poorly (ICC 0.18). Significant differences between chronic stroke patients and healthy volunteers were found in ten out of fourteen parameters. Passive, active, and reflexive parameters can be assessed with high reliability in post-stroke patients, and the parameters were responsive to clinical status. The next step is longitudinal measurement of passive, active, and reflexive parameters to establish their predictive value for functional outcome after stroke.

  17. Targeted Next Generation Sequencing as a Reliable Diagnostic Assay for the Detection of Somatic Mutations in Tumours Using Minimal DNA Amounts from Formalin Fixed Paraffin Embedded Material.

    Wendy W J de Leng

    Targeted Next Generation Sequencing (NGS) offers a way to implement testing of multiple genetic aberrations in diagnostic pathology practice, which is necessary for personalized cancer treatment. However, no standards regarding input material have been defined. This study therefore aimed to determine the effect of the type of input material (e.g. formalin-fixed paraffin-embedded (FFPE) versus fresh frozen (FF) tissue) on NGS-derived results. Moreover, this study aimed to explore a standardized analysis pipeline to support consistent clinical decision-making. We used the Ion Torrent PGM sequencing platform in combination with the Ion AmpliSeq Cancer Hotspot Panel v2 to sequence frequently mutated regions in 50 cancer-related genes, and validated the NGS-detected variants in 250 FFPE samples using standard diagnostic assays. Next, 386 tumour samples were sequenced to explore the effect of input material on variant detection variables. For variant calling, Ion Torrent analysis software was supplemented with additional variant annotation and filtering. Both FFPE and FF tissue could be sequenced reliably, with a sensitivity of 99.1%. Validation showed a 98.5% concordance between NGS and conventional sequencing techniques, where NGS provided both the advantage of low input DNA concentration and the detection of low-frequency variants. The reliability of mutation analysis could be further improved with manual inspection of sequence data. Targeted NGS can be reliably implemented in cancer diagnostics using both FFPE and FF tissue when using appropriate analysis settings, even with low input DNA.
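The authors supplement the vendor's variant caller with additional annotation and filtering. Below is a sketch of the general kind of depth and allele-frequency filter such pipelines apply; the thresholds, field names, and example calls are illustrative assumptions, not the study's actual settings.

```python
# Illustrative post-calling filter: keep variants with adequate read depth
# and a variant allele frequency (VAF) above a noise floor. Thresholds
# (depth >= 100 reads, VAF >= 5%) are assumptions, not the study's values.

def passes_filters(variant, min_depth=100, min_vaf=0.05):
    vaf = variant["alt_reads"] / variant["depth"]
    return variant["depth"] >= min_depth and vaf >= min_vaf

calls = [
    {"gene": "KRAS", "depth": 1500, "alt_reads": 120},  # VAF 8%   -> keep
    {"gene": "EGFR", "depth": 80,   "alt_reads": 40},   # low depth -> drop
    {"gene": "BRAF", "depth": 2000, "alt_reads": 30},   # VAF 1.5% -> drop
]
reportable = [v for v in calls if passes_filters(v)]
# only the KRAS call survives both filters
```

Manual inspection of the dropped calls (as the abstract recommends) can rescue true low-frequency variants that a fixed VAF cutoff would discard.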

  18. A Hybrid Reliable Data Transmission based on Ant-agent Resource Allocation Technique in EEMCC Protocol for MANETS

    Dr. M. Rajanbabu


    Real-time multicast applications in mobile ad-hoc networks bring added advantages to wireless networking. The fragile and mobile environment of an ad-hoc network creates the need for bandwidth allocation for real-time applications. Reliability is also an important factor in multicasting in mobile ad-hoc networks (MANETs), as it confirms eventual delivery of all the data to all the group members, without enforcing any particular delivery order, in EEMCCP. In the first phase of this paper, we design an "ant agent-resource allocation" technique for reserving bandwidth for real-time multicast applications. In the forward phase, the source sends a forward ant agent which collects the bandwidth information of intermediate nodes and reserves bandwidth for the real-time flow for each multicast receiver. In the backward phase, the backward ant confirms the allocation and feeds the bandwidth information back to the source.

  19. Design and performance testing of a DNA extraction assay for sensitive and reliable quantification of acetic acid bacteria directly in red wine using real time PCR

    Longin, Cédric


    Although strategies exist to prevent acetic acid bacteria (AAB) contamination, the increased interest in wines with low sulfite addition leads to greater AAB spoilage. Hence there is a real need for a rapid, specific, sensitive, and reliable method for detecting these spoilage bacteria. All these requirements are met by real-time Polymerase Chain Reaction (quantitative PCR; qPCR). Here, we compare existing methods of isolating DNA and their adaptation to a red wine matrix. Two different protocols for isolating DNA and three PCR mix compositions were tested to select the best method. The addition of insoluble polyvinylpolypyrrolidone (PVPP) at 1% (v/v) during DNA extraction succeeded in eliminating PCR inhibitors from red wine. We developed a bacterial internal control which was efficient in avoiding false negative results due to decreases in the efficiency of DNA isolation and/or amplification. The specificity, linearity, repeatability, and reproducibility of the method were evaluated. A standard curve was established for the enumeration of AAB inoculated into red wines. The limit of quantification in red wine was 3.7 log AAB/mL, and about 2.8 log AAB/mL when the sample volume was increased from 1 mL to 10 mL. Thus the DNA extraction method developed in this paper allows sensitive and reliable AAB quantification without underestimation, thanks to the presence of an internal control. Moreover, monitoring of both the AAB population and the amount of acetic acid in ethanol medium and red wine highlighted that a minimum of about 6.0 log cells/mL of AAB is needed to significantly increase the production of acetic acid leading to spoilage.
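Enumeration by qPCR rests on a standard curve relating Ct to log10(cells/mL) for known inocula, which is then inverted for unknown samples. A self-contained sketch with made-up Ct values (a slope near -3.32 would correspond to roughly 100% amplification efficiency):

```python
# qPCR standard-curve quantification. The Ct values below are made-up
# illustrative numbers forming an exactly linear curve with slope -3.3.

def fit_line(xs, ys):
    """Ordinary least squares: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

log_conc = [3.0, 4.0, 5.0, 6.0, 7.0]       # log10 AAB cells/mL standards
ct = [33.4, 30.1, 26.8, 23.5, 20.2]        # illustrative Ct values

slope, intercept = fit_line(log_conc, ct)  # slope ~ -3.3

def quantify(ct_sample):
    """Invert the curve: log10 cells/mL for an unknown sample's Ct."""
    return (ct_sample - intercept) / slope

# A sample with Ct 28.45 sits halfway between the 4 and 5 log standards:
print(round(quantify(28.45), 2))           # -> 4.5
```

The paper's limit of quantification (3.7 log AAB/mL) is the lowest point on such a curve at which this inversion remains trustworthy.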


  1. Flow cytometric 96-well microplate-based in vitro micronucleus assay with human TK6 cells: protocol optimization and transferability assessment.

    Bryce, Steven M; Avlasevich, Svetlana L; Bemis, Jeffrey C; Tate, Matthew; Walmsley, Richard M; Saad, Frédéric; Van Dijck, Kris; De Boeck, Marlies; Van Goethem, Freddy; Lukamowicz-Rajska, Magdalena; Elhajouji, Azeddine; Dertinger, Stephen D


    An automated approach for scoring in vitro micronuclei (MN) has been described in which flow cytometric analysis is combined with compound exposure, processing, and sampling in a single 96-well plate (Bryce SM et al. [2010]: Mutat Res 703:191-199). The current report describes protocol optimization and an interlaboratory assessment of the assay's transferability and reproducibility. In a training phase, the methodology was refined and collaborating laboratories were qualified by repeatedly testing three compounds. Second, a set of 32 chemicals comprising reference genotoxicants and presumed non-genotoxicants was tested at each of four sites. TK6 cells were exposed to 10 closely spaced compound concentrations for 1.5 to 2 cell-population doublings, and were then stained and lysed for flow cytometric analysis. MN frequencies were determined by evaluating ≥ 5,000 cells per replicate well, and several indices of cytotoxicity were acquired. The prevalence of positive results varied according to the MN fold-increase used to signify a genotoxic result, as well as the endpoint used to define a cytotoxicity limit. By varying these parameters, assay sensitivity and specificity values ranged from 82 to 98% and 86 to 97%, respectively. In a third phase, one laboratory tested a further six genotoxicants and five non-genotoxic apoptosis inducers. In these experiments, assay specificity was markedly improved when top-concentration selection was based on two cytotoxicity endpoints: relative survival and quantification of ethidium monoazide-positive events. Collectively, the results indicate that the miniaturized assay is transferable across laboratories. The 96-well format consumes considerably less compound than conventional in vitro MN test methods, and the high information content provided by flow cytometry helps guard against irrelevant positive results arising from overt toxicity.
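The sensitivity and specificity ranges quoted above depend jointly on the MN fold-increase threshold chosen to call a result genotoxic. A small sketch of that trade-off with made-up fold-increase data; the numbers are not from the study.

```python
# How the genotoxic-call threshold trades sensitivity against specificity.
# Fold-increase values below are made-up illustrative data.

def classify(fold_increase, threshold):
    return fold_increase >= threshold          # "positive" = genotoxic call

def sens_spec(genotoxicants, non_genotoxicants, threshold):
    tp = sum(classify(f, threshold) for f in genotoxicants)
    tn = sum(not classify(f, threshold) for f in non_genotoxicants)
    return tp / len(genotoxicants), tn / len(non_genotoxicants)

geno = [4.1, 2.6, 8.0, 3.2, 1.9, 5.5]      # MN fold increases, genotoxic set
non_geno = [1.1, 1.4, 2.2, 0.9, 1.3, 1.0]  # non-genotoxic set

for threshold in (1.5, 2.0, 3.0):
    s, p = sens_spec(geno, non_geno, threshold)
    # raising the threshold lowers sensitivity but raises specificity
    print(threshold, round(s, 2), round(p, 2))
```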

  2. Optimization of diagnostic RT-PCR protocols and sampling procedures for the reliable and cost-effective detection of Cassava brown streak virus.

    Abarshi, M M; Mohammed, I U; Wasswa, P; Hillocks, R J; Holt, J; Legg, J P; Seal, S E; Maruthi, M N


    Sampling procedures and diagnostic protocols were optimized for accurate diagnosis of Cassava brown streak virus (CBSV) (genus Ipomovirus, family Potyviridae). A cetyl trimethyl ammonium bromide (CTAB) method was optimized for sample preparation from infected cassava plants and compared with the RNeasy plant mini kit (Qiagen) for sensitivity, reproducibility, and cost. CBSV was readily detectable in total RNA extracted using either method. The major difference between the two methods was the cost of consumables, with CTAB 10x cheaper (£0.53 = US$0.80 per sample) than the RNeasy method (£5.91 = US$8.86 per sample). A two-step RT-PCR (£1.34 = US$2.01 per sample), although less sensitive, was at least 3 times cheaper than a one-step RT-PCR (£4.48 = US$6.72). The two RT-PCR tests consistently revealed the presence of CBSV in both symptomatic and asymptomatic leaves, indicating that asymptomatic leaves can be used reliably for virus diagnosis. Depending on the accuracy required, sampling 100-400 plants per field is an appropriate recommendation for CBSD diagnosis, giving a 99.9% probability of detecting a disease incidence of 6.7% or 1.7%, respectively. CBSV was detected at 10^-4-fold dilutions in composite sampling, indicating that the most efficient way to index many samples for CBSV is to screen pooled samples. The diagnostic protocols described here are reliable and currently the most cost-effective methods available for detecting CBSV.
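The 100-400 plants recommendation follows from the probability of catching at least one infected plant when drawing n plants independently at incidence p: P(detect) = 1 - (1 - p)^n. This reproduces the abstract's 99.9% figures.

```python
# Probability of detecting at least one infected plant in a field of
# incidence p when sampling n plants (assuming independent sampling).

def detection_probability(n_plants: int, incidence: float) -> float:
    return 1.0 - (1.0 - incidence) ** n_plants

# Reproducing the abstract's figures:
print(round(detection_probability(100, 0.067), 3))  # -> 0.999
print(round(detection_probability(400, 0.017), 3))  # -> 0.999
```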

  3. Tips and step-by-step protocol for the optimization of important factors affecting cellular enzyme-linked immunosorbent assay (CELISA).

    Morandini, R; Boeynaems, J M; Wérenne, J; Ghanem, G


    CELISA, or cellular enzyme-linked immunosorbent assay, is a powerful and easy-to-use technique for studying cell surface antigens under different stimulations. Nevertheless, several factors must be optimized before a reproducible CELISA can be achieved. These include cell density, fixative agent, blocking agent, culture medium, optimal antibody dilutions, and incubation time. In this paper, we first present a short review of the CELISA literature, comparing these parameters, followed by a description of each. We then study these parameters using practical examples, with TNF-induced ICAM-1 expression as an end point, on HBL melanoma cells and HUVEC. These cell lines were also chosen because they differ in their ability to grow as discontinuous and continuous layers, respectively. Furthermore, we provide a comprehensive flow chart as well as a complete step-by-step protocol for CELISA optimization.

  4. Validation of a standard forensic anthropology examination protocol by measurement of applicability and reliability on exhumed and archive samples of known biological attribution.

    Francisco, Raffaela Arrabaça; Evison, Martin Paul; Costa Junior, Moacyr Lobo da; Silveira, Teresa Cristina Pantozzi; Secchieri, José Marcelo; Guimarães, Marco Aurelio


    Forensic anthropology makes an important contribution to human identification and to assessment of the causes and mechanisms of death and body disposal in criminal and civil investigations, including those related to atrocity, disaster and trafficking victim identification. The methods used are comparative, relying on assignment of questioned material to categories observed in standard reference material of known attribution. Reference collections typically originate in Europe and North America, and are not necessarily representative of contemporary global populations. Methods based on them must be validated when applied to novel populations. This study describes the validation of a standardized forensic anthropology examination protocol by application to two contemporary Brazilian skeletal samples of known attribution. One sample (n=90) was collected from exhumations following 7-35 years of burial, and the second (n=30) was collected following successful identifications in routine casework. The study presents measurement of (1) the applicability of each of the methods used and (2) the reliability with which the biographic parameters were assigned in each case. The results are discussed with reference to published assessments of methodological reliability regarding sex, age and, in particular, ancestry estimation. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Use of a quality-by-design approach to justify removal of the HPLC weight % assay from routine API stability testing protocols.

    Skrdla, Peter J; Wang, Tao; Antonucci, Vincent; Dowling, Thomas; Ge, Zhihong; Ellison, Dean; Curran, John; Mohan, Ganapathy; Wyvratt, Jean


    Due to the high method variability (typically ≥0.5%, based on a literature survey and internal Merck experience) encountered in the HPLC weight percent (%) assays of various active pharmaceutical ingredients (APIs), it is proposed that the routine use of the test in stability studies should be discouraged on the basis that it is frequently not sufficiently precise to yield results that are stability-indicating. The high method variability of HPLC weight % methods is not consistent with the current ICH practice of reporting impurities/degradation products down to the 0.05% level, and it can lead to erroneous out-of-specification (OOS) results that are due to experimental error and are not attributable to API degradation. For the vast majority of cases, the HPLC impurity profile provides much better (earlier and more sensitive) detection of low-level degradation products. Based on these observations, a Quality-by-Design (QbD) approach is proposed to phase out the HPLC weight % assay from routine API stability testing protocols.

  6. BRAF(V600E) assessment by pyrosequencing in fine needle aspirates of thyroid nodules with concurrent Hashimoto's thyroiditis is a reliable assay.

    Guerra, Anna; Di Stasi, Vincenza; Zeppa, Pio; Faggiano, Antongiulio; Marotta, Vincenzo; Vitale, Mario


    Detection of BRAF mutation in cytology specimens has been proposed as an adjunctive diagnostic tool in the evaluation of thyroid nodules with indeterminate cytology findings. Concurrent papillary thyroid carcinoma and Hashimoto's thyroiditis (HT), a disease characterized by thyroid lymphocytic infiltration, is a frequent occurrence. A large lymphocytic infiltrate might reduce the sensitivity of methods employed to detect BRAF mutation in thyroid cytology specimens. To determine whether testing for BRAF mutational status in fine needle aspirates (FNA) is reliable also in the presence of HT lymphocytic infiltration, we assessed the BRAF status by direct sequencing and pyrosequencing in a series of FNAs with and without concomitant HT lymphocytic infiltration. We also performed the same assessment by pyrosequencing in the corresponding tissue samples. Pyrosequencing proved more sensitive than direct sequencing. The percentage of mutant BRAF(V600E) alleles was higher in FNAs than in the corresponding tissues, probably because of the lower stromal contamination in FNA than in the sections. Even in the presence of lymphocytic infiltration, the percentage of mutant BRAF(V600E) alleles determined by pyrosequencing remained higher in FNAs than in the corresponding tissue samples, indicating that lymphocytic contamination did not compromise detection in FNA. The diagnostic value of BRAF(V600E) in inconclusive FNAs was not hampered by thyroid lymphocytic infiltration. These results indicate that BRAF(V600E) assessment by pyrosequencing is a reliable assay useful to refine inconclusive cytology of thyroid nodules also in the presence of concurrent HT.

  7. Generation of a reliable full-length cDNA of infectious Tembusu virus using a PCR-based protocol.

    Liang, Te; Liu, Xiaoxiao; Cui, Shulin; Qu, Shenghua; Wang, Dan; Liu, Ning; Wang, Fumin; Ning, Kang; Zhang, Bing; Zhang, Dabing


    Full-length cDNA of Tembusu virus (TMUV) cloned in a plasmid has been found unstable in bacterial hosts. Using a PCR-based protocol, we generated a stable full-length cDNA of TMUV. Different cDNA fragments of TMUV were amplified by reverse transcription (RT)-PCR and cloned into plasmids. Fragmented cDNAs were amplified and assembled by fusion PCR to produce a full-length cDNA, using the recombinant plasmids as templates. Subsequently, a full-length RNA was transcribed from the full-length cDNA in vitro and transfected into BHK-21 cells; infectious viral particles were rescued successfully. Following several passages in BHK-21 cells, the rescued virus was compared with the parental virus by genetic marker checks, growth curve determinations and animal experiments. These assays clearly demonstrated the genetic and biological stability of the rescued virus. The present work will be useful for future investigations on the molecular mechanisms involved in replication and pathogenesis of TMUV. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Differential responses of sexual and asexual Artemia to genotoxicity by a reference mutagen: Is the comet assay a reliable predictor of population level responses?

    Sukumaran, Sandhya; Grant, Alastair


    The impact of chronic genotoxicity on natural populations is often questioned because of their reproductive surplus. We used a comet assay to quantify primary DNA damage after exposure to a reference mutagen, ethyl methane sulfonate, in two species of crustacean with different reproductive strategies (sexual Artemia franciscana and asexual Artemia parthenogenetica). We then assessed whether this predicted individual performance and population growth rate over three generations. Artemia were exposed to different chronic concentrations (0.78 mM, 1.01 mM, 1.24 mM and 1.48 mM) of ethyl methane sulfonate from instar 1 onwards for 3 h, 24 h, 7 days, 14 days and 21 days, and percentage tail DNA values were used for comparisons between species. In A. franciscana, percentage tail DNA was consistently elevated up to 7 days and declined from 14 days onwards, whereas in A. parthenogenetica such a reduction was evident only at the 21-day assessment. The percentage tail DNA values after 21 days were compared with population-level fitness parameters (growth, survival, fecundity and population growth rate) to determine whether primary DNA damage as measured by the comet assay is a reliable biomarker. A substantial increase in tail DNA values was associated with substantial reductions in all fitness parameters in the parental generation of A. franciscana and in the parental, F1 and F2 generations of A. parthenogenetica, so comet results were more predictive across generations in the asexual species. These results point to the importance of interpreting biomarker responses in the light of multigenerational consequences, life history traits and reproductive strategies in ecological risk assessments.

  9. A standardized and reproducible protocol for serum-free monolayer culturing of primary paediatric brain tumours to be utilized for therapeutic assays.

    Sandén, Emma; Eberstål, Sofia; Visse, Edward; Siesjö, Peter; Darabi, Anna


    In vitro cultured brain tumour cells are indispensable tools for drug screening and therapeutic development. Serum-free culture conditions tentatively preserve the features of the original tumour, but commonly require neurosphere propagation, which is a technically challenging procedure. Here, we define a simple, inexpensive and reproducible serum-free cell culture protocol for establishment and propagation of primary paediatric brain tumour cultures as adherent monolayers. The success rates for establishment of primary cultures (including medulloblastomas, atypical rhabdoid tumour, ependymomas and astrocytomas) were 65% (11/17) for sphere cultures and 78% (14/18) for monolayers. Monolayer culturing was particularly feasible for less aggressive tumour subsets, where neurosphere cultures could not be generated. We show by immunofluorescent labelling that monolayers display phenotypic similarities with corresponding sphere cultures and primary tumours, and secrete clinically relevant inflammatory factors, including PGE2, VEGF, IL-6, IL-8 and IL-15. Moreover, secretion of PGE2 was considerably reduced by treatment with the COX-2 inhibitor valdecoxib, demonstrating the functional utility of the newly established monolayers for preclinical therapeutic assays. Our findings suggest that this culture method could increase the availability and comparability of clinically representative in vitro models of paediatric brain tumours, and encourage further molecular evaluation of serum-free monolayer cultures.

  10. Evaluation of a novel assay for detection of the fetal marker RASSF1A: facilitating improved diagnostic reliability of noninvasive prenatal diagnosis.

    Helen E White

    Full Text Available BACKGROUND: Analysis of cell-free fetal DNA (cffDNA) in maternal plasma is used routinely for noninvasive prenatal diagnosis (NIPD) of fetal sex, fetal rhesus D status and some single gene disorders. True positive results rely on detection of the fetal target being analysed. No amplification of the target may be interpreted either as a true negative result or as a false negative result due to the absence or very low levels of cffDNA. The hypermethylated RASSF1A promoter has been reported as a universal fetal marker to confirm the presence of cffDNA. Using methylation-sensitive restriction enzymes, hypomethylated maternal sequences are digested, leaving hypermethylated fetal sequences detectable. Complete digestion of maternal sequences is required to eliminate false positive results. METHODS: cfDNA was extracted from maternal plasma (n = 90) and digested with methylation-sensitive and -insensitive restriction enzymes. Analysis of RASSF1A, SRY and DYS14 was performed by real-time PCR. RESULTS: Hypermethylated RASSF1A was amplified for 79 samples (88%), indicating the presence of cffDNA. SRY real-time PCR results and fetal sex at delivery were 100% concordant. Eleven samples (12%) had no detectable hypermethylated RASSF1A, and 10 of these (91%) had gestational ages of less than 7 weeks 2 days. Six of these samples were male at delivery, five had inconclusive results for SRY analysis, and one sample had no amplifiable SRY. CONCLUSION: Use of this assay for the detection of hypermethylated RASSF1A as a universal fetal marker has the potential to improve the diagnostic reliability of NIPD for fetal sex determination and single gene disorders.
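
    The interpretation logic that the RASSF1A control supports can be sketched as follows. The function name and return labels are illustrative, not the authors' software: the point is that a confirmed cffDNA presence lets a negative SRY result be read as a true negative rather than a possible false negative.

```python
def interpret_nipd(rassf1a_detected, sry_detected):
    """Sketch of the decision logic described in the abstract.
    Hypermethylated RASSF1A confirms that cffDNA is present, so a
    negative SRY result can then be called a true negative (female
    fetus) instead of a possible false negative from too little
    cffDNA (e.g. at early gestational age)."""
    if not rassf1a_detected:
        return "inconclusive: cffDNA not confirmed (e.g. early gestation)"
    if sry_detected:
        return "male fetus"
    return "female fetus (true negative supported by RASSF1A control)"
```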

  11. Use of a standardized JaCVAM in vivo rat comet assay protocol to assess the genotoxicity of three coded test compounds; ampicillin trihydrate, 1,2-dimethylhydrazine dihydrochloride, and N-nitrosodimethylamine.

    McNamee, J P; Bellier, P V


    As part of the Japanese Center for the Validation of Alternative Methods (JaCVAM)-initiated international validation study of the in vivo rat alkaline comet assay (comet assay), our laboratory examined ampicillin trihydrate (AMP), 1,2-dimethylhydrazine dihydrochloride (DMH), and N-nitrosodimethylamine (NDA) using a standard comet assay validation protocol (v14.2) developed by the JaCVAM validation management team (VMT). Coded samples were received by our laboratory along with basic MSDS information. Solubility analysis and range-finding experiments on the coded test compounds were conducted for dose selection. Animal dosing schedules, comet assay processing and analysis, and statistical analysis were conducted in accordance with the standard protocol. Based upon our blinded evaluation, AMP did not exhibit evidence of genotoxicity in either the rat liver or stomach. However, both NDA and DMH caused a significant increase in % tail DNA in the rat liver at all dose levels tested. While acute hepatotoxicity was observed for these compounds in the high dose group, in the investigators' opinion there were a sufficient number of consistently damaged yet measurable cells in the medium and low dose groups to judge these compounds genotoxic. There was no evidence of genotoxicity from either NDA or DMH in the rat stomach. In conclusion, our laboratory observed increased DNA damage from two blinded test compounds in rat liver (later identified as genotoxic carcinogens), while no evidence of genotoxicity was observed for the third blinded test compound (later identified as a non-genotoxic non-carcinogen). These data support the use of a standardized protocol for the in vivo comet assay as a cost-effective alternative genotoxicity assay for regulatory testing purposes.

  12. Design and Implementation of a Large-Scale Active Reliable Multicast Protocol

    陈晓林; 李冀; 魏明亮; 陆桑璐; 陈贵海; 谢立


    This paper proposes LARMP (Large-scale Active Reliable Multicast Protocol), an active-network-based protocol for large-scale reliable multicast. It comprehensively addresses five problems facing reliable multicast on the Internet: NACK/ACK (negative acknowledgement/acknowledgement) implosion, selective retransmission, distribution of the recovery burden, congestion control, and robustness.

  13. A new experimental protocol as an alternative to the colony-forming unit-granulocyte/macrophage (CFU-GM) clonogenic assay to assess the haematotoxic potential of new drugs.

    Dal Negro, Gianni; Vandin, Luca; Bonato, Monica; Repeto, Paolo; Sciuscio, Davide


    In this work, a first attempt to set up a new in vitro experimental protocol with culture in liquid medium and flow cytometric analysis of bone marrow progenitors is described. This protocol is proposed as an alternative to the colony-forming unit-granulocyte/macrophage (CFU-GM) clonogenic in vitro assay currently used to assess the toxic potential of new drugs in the bone marrow. This new experimental approach should make it possible to speed up the in vitro assessment of haematotoxic potential, to reduce inter-experimental variability and to enhance result accuracy. Preliminary results demonstrated that counting progenitor cells by flow cytometry, in place of counting granulocyte/macrophage colonies by light microscopy, represents a substantial improvement in accuracy and standardisation. Moreover, differential counts of cell sub-populations can be performed using specific monoclonal antibodies. Furthermore, the method proved time-saving, since a 4-day cell incubation period is required instead of the 7-14 day incubation of the CFU-GM clonogenic assay. On the basis of the results obtained so far, the proposed experimental protocol looks like a promising alternative to the CFU-GM clonogenic assay currently in use.

  14. Comprehensive neuromechanical assessment in stroke patients: reliability and responsiveness of a protocol to measure neural and non-neural wrist properties

    Van der Krogt, H.; Klomp, A.; De Groot, J.H.; De Vlugt, E.; Van der Helm, F.C.T.; Meskers, C.G.M.; Arendzen, J.H.


    Background: Understanding movement disorders after stroke and providing targeted treatment for post-stroke patients requires valid and reliable identification of biomechanical (passive) and neural (active and reflexive) contributors. The aim of this study was to assess the test-retest reliability of passive,

  15. Reliable multicast protocol based on reputation mechanism in wireless multi-hop networks

    许力; 蒋佳铭


    To improve the reliability of the multicast tree and the efficiency of multicast, a multicast protocol based on a reputation mechanism is proposed. The protocol takes the reputation values of nodes into account when constructing the multicast tree, thereby excluding selfish nodes and making the multicast path comparatively reliable. Simulation results show that the proposed protocol can significantly improve the efficiency of multicast at a low cost.
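
    The core idea, excluding low-reputation (selfish) nodes before building the multicast tree, might be sketched as follows. The BFS tree construction and the 0.5 threshold are assumptions for illustration; the paper's construction details are not given in the abstract:

```python
from collections import deque

def build_multicast_tree(adj, source, reputation, threshold=0.5):
    """Filter out nodes whose reputation falls below `threshold`,
    then build a multicast tree over the remaining nodes via BFS.
    `adj` maps node -> list of neighbours; `reputation` maps
    node -> score in [0, 1]. Returns the tree as a node -> parent map."""
    trusted = {n for n, r in reputation.items() if r >= threshold}
    trusted.add(source)                      # source always participates
    parent, queue = {source: None}, deque([source])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v in trusted and v not in parent:
                parent[v] = u
                queue.append(v)
    return parent
```

    A selfish node is simply never enqueued, so no multicast branch is routed through it.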

  16. CTL ELISPOT assay.

    Ranieri, Elena; Popescu, Iulia; Gigante, Margherita


    Enzyme-linked immunospot (Elispot) is a quantitative method for measuring relevant parameters of T cell activation. The sensitivity of Elispot allows the detection of low-frequency antigen-specific T cells that secrete cytokines and effector molecules, such as granzyme B and perforin. Cytotoxic T cell (CTL) studies have taken advantage of this high-throughput technology, which provides insights into quantity and immune kinetics. The accuracy, sensitivity, reproducibility, and robustness of Elispot have resulted in a wide range of applications in research as well as in the diagnostic field. Indeed, CTL monitoring by Elispot is a gold standard for the evaluation of antigen-specific T cell immunity in clinical trials and vaccine candidates, where the ability to detect rare antigen-specific T cells is of relevance for immune diagnostics. The most widely used Elispot assay is the interferon-gamma (IFN-γ) test, a marker for CD8(+) CTL activation, but Elispot can also be used to distinguish different subsets of activated T cells by using other cytokines, such as T-helper (Th) 1-type cells (characterized by the production of IFN-γ, IL-2, IL-6, IL-12, IL-21, and TNF-α), Th2 cells (producing cytokines like IL-4, IL-5, IL-10, and IL-13), and Th17 (IL-17) cells. The reliability of Elispot-generated data, through evaluation of the frequency of T cells recognizing an individual antigen/peptide, is the core of this method, which is currently applied widely to investigate specific immune responses in cancer, infections, allergies, and autoimmune diseases. The Elispot assay competes with other methods measuring single-cell cytokine production, e.g., intracellular cytokine staining by FACS or the Miltenyi cytokine secretion assay. Other types of lymphocyte frequency and function assays include the limiting dilution assay (LDA), the cytotoxic T cell (CTL) assay, and tetramer staining. With respect to sensitivity, the Elispot assay outranks the other methods in defining the frequency of antigen-specific lymphocytes. The method

  17. Optimization of killer assays for yeast selection protocols

    C. A. Lopes


    Full Text Available A new optimized semiquantitative yeast killer assay is reported for the first time. The killer activity of 36 yeast isolates belonging to three species, namely Metschnikowia pulcherrima, Wickerhamomyces anomala and Torulaspora delbrueckii, was tested with a view to potentially using these yeasts as biocontrol agents against the wine spoilage species Pichia guilliermondii and Pichia membranifaciens. The effectiveness of the classical streak-based (qualitative) method and the new semiquantitative technique was compared. The percentage of yeasts showing killer activity was higher by the semiquantitative technique (60%) than by the qualitative method (45%). In all cases, the addition of 1% NaCl to the medium allowed better observation of the killer phenomenon. Important differences were observed in the killer capacity of different isolates belonging to the same killer species. The broadest spectrum of action was detected in isolates of W. anomala NPCC 1023 and 1025, and M. pulcherrima NPCC 1009 and 1013. We also provide experimental evidence supporting the importance of adequately selecting the sensitive isolate used in killer evaluation. The new semiquantitative method proposed in this work makes it possible to visualize the relationship between the number of yeasts tested and the size of the inhibition halo (specific productivity). Hence, this experimental approach could become an interesting tool for killer yeast selection protocols.

  18. Monoclonal antibody-based dipstick assay: a reliable field applicable technique for diagnosis of Schistosoma mansoni infection using human serum and urine samples.

    Demerdash, Zeinab; Mohamed, Salwa; Hendawy, Mohamed; Rabia, Ibrahim; Attia, Mohy; Shaker, Zeinab; Diab, Tarek M


    A field-applicable diagnostic technique, the dipstick assay, was evaluated for its sensitivity and specificity in diagnosing human Schistosoma mansoni infection. A monoclonal antibody (mAb) against S. mansoni adult worm tegumental antigen (AWTA) was employed in dipstick and sandwich ELISA formats for detection of circulating schistosome antigen (CSA) in both serum and urine samples. Based on clinical and parasitological examinations, 60 S. mansoni-infected patients, 30 patients infected with parasites other than schistosomes, and 30 uninfected healthy individuals were selected. The sensitivity and specificity of the dipstick assay in urine samples were 86.7% and 90.0%, respectively, compared to 90.0% sensitivity and 91.7% specificity for sandwich ELISA. In serum samples, the sensitivity and specificity were 88.3% and 91.7% for the dipstick assay vs. 91.7% and 95.0% for sandwich ELISA, respectively. The diagnostic efficacy of the dipstick assay in urine and serum samples was 88.3% and 90.0%, while it was 90.8% and 93.3% for sandwich ELISA, respectively. The diagnostic indices of the dipstick assay and ELISA, in either serum or urine, were statistically comparable (P>0.05). In conclusion, the dipstick assay offers a simple, rapid, non-invasive alternative for detecting CSA and a complement to stool examination, especially in field studies.
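
    The reported diagnostic efficacy (overall accuracy) figures can be recomputed from the stated sensitivities and specificities, assuming the 60 infected and 60 non-infected (30 other-parasite plus 30 healthy) subjects described in the abstract:

```python
def diagnostic_efficacy(sens, spec, n_pos, n_neg):
    """Overall accuracy from sensitivity/specificity and group sizes:
    (true positives + true negatives) / all subjects, as a percentage."""
    tp = round(sens * n_pos)   # correctly detected infected subjects
    tn = round(spec * n_neg)   # correctly classified controls
    return round(100 * (tp + tn) / (n_pos + n_neg), 1)

# Dipstick in urine: 86.7% sensitivity, 90.0% specificity -> 88.3% efficacy
print(diagnostic_efficacy(0.867, 0.900, 60, 60))  # 88.3
# Sandwich ELISA in serum: 91.7% sensitivity, 95.0% specificity -> 93.3%
print(diagnostic_efficacy(0.917, 0.950, 60, 60))  # 93.3
```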

  19. Complete validation of a unique digestion assay to detect Trichinella larvae in horsemeat demonstrates its reliability for meeting food safety and trade requirements.

    A tissue digestion assay using a double separatory funnel (DSF) procedure for the detection of Trichinella larvae in horsemeat was validated for application in food safety programs and trade. It consisted of a pepsin-HCl digestion step to release larvae from muscle tissue followed by two sequential ...

  20. A simple, rapid and reliable enzyme-linked immunosorbent assay for the detection of bovine virus diarrhoea virus (BVDV) specific antibodies in cattle serum, plasma and bulk milk

    Kramps, J.A.; Maanen, van C.; Wetering, van de G.; Stienstra, G.; Quak, S.; Brinkhof, J.; Ronsholt, L.; Nylin, B.


    To detect Bovine Virus Diarrhoea Virus (BVDV)-specific antibodies in cattle serum, plasma and bulk milk, a simple, reliable and rapid blocking ELISA ("Ceditest") has been developed using two monoclonal antibodies ("WB112" and "WB103") directed to different highly conserved epitopes on the non-struct

  1. Fast and reliable DNA extraction protocol for identification of species in raw and processed meat products sold on the commercial market

    Alvarado Pavel Espinoza


    Full Text Available In this work a protocol for the extraction of DNA from the meat of different animals (beef, pork, and horse) was established. The protocol used a TE lysis buffer with varying concentrations of phenol and chloroform as the base reagent. Reactions were carried out for varying time periods and at differing temperatures. All samples analyzed were obtained from commercial-grade meat sourced from the local region. Twelve samples were used for methodological optimization, with 30 repetitions per sample. Once optimized, purity for the three species was 1.7, with a concentration (determined spectrophotometrically at 260 nm) of 100 μl/ml of DNA. The protocol was then tested on 465 meat samples from different animal species, both fresh and processed. These showed a purity of 1.35 ± 0.076 and a DNA concentration of 70 ± 0.31 μl for a processing time of 1.5 hours. The extracts were verified by polymerase chain reaction (PCR), as reported by several authors, using horse-specific primers; 39 samples were positive. The proposed methodology provides an efficient way to obtain DNA of a concentration and purity suitable for PCR amplification.

  2. Indicating spinal joint mobilisations or manipulations in patients with neck or low-back pain: Protocol of an inter-examiner reliability study among manual therapists

    E. van Trijffel (Emiel); R. Lindeboom (Robert); P.M.M. Bossuyt (Patrick); M.A. Schmitt (Maarten); C. Lucas (Cees); B.W. Koes (Bart); R.A. Oostendorp (Robert)


    Background: Manual spinal joint mobilisations and manipulations are widely used treatments in patients with neck and low-back pain. Inter-examiner reliability of passive intervertebral motion assessment of the cervical and lumbar spine, perceived as important for indicating these interventions,

  3. Comet assay with gill cells of Mytilus galloprovincialis as end point tool for biomonitoring of water antibiotic contamination: Biological treatment is a reliable process for detoxification.

    Mustapha, Nadia; Zouiten, Amina; Dridi, Dorra; Tahrani, Leyla; Zouiten, Dorra; Mosrati, Ridha; Cherif, Ameur; Chekir-Ghedira, Leila; Mansour, Hedi Ben


    This article investigates the ability of Pseudomonas peli to treat industrial pharmaceutical wastewater (PW). Liquid chromatography-tandem mass spectrometry (LC-MS/MS) analysis revealed the presence in this PW of a variety of antibiotics such as sulfathiazole, sulfamoxole, norfloxacin, cloxacillin, doxycycline, and cefquinome. P. peli grew readily in PW and induced a remarkable increase in chemical oxygen demand and biochemical oxygen demand (140.31% and 148.51%, respectively). In addition, the genotoxicity of the studied effluent, before and after 24 h of shaking incubation with P. peli, was evaluated in vivo in the Mediterranean wild mussel Mytilus galloprovincialis using the comet assay to quantify DNA fragmentation. The results show that PW exhibited a statistically significant genotoxic effect that was reduced after bacterial treatment, indicating that comet assay genotoxicity end points are useful tools to biomonitor the physicochemical and biological quality of water and that P. peli can treat and detoxify the studied PW.

  4. A reliable and reproducible method for the lipase assay in an AOT/isooctane reversed micellar system: modification of the copper-soap colorimetric method.

    Kwon, Chang Woo; Park, Kyung-Min; Choi, Seung Jun; Chang, Pahn-Shick


    The copper-soap method, which is based on the absorbance of a fatty acid-copper complex at 715 nm, is a widely used colorimetric assay to determine lipase activity in a reversed micellar system. However, the absorbance of the bis(2-ethylhexyl) sodium sulfosuccinate (AOT)-copper complex prevents its use in an AOT/isooctane reversed micellar system. An extraction step was therefore added to the original procedure to remove AOT and eliminate interference from the AOT-copper complex. Among the solvents tested, acetonitrile was the most suitable because it allowed the generation of a reproducible calibration curve with oleic acid, independent of AOT concentration. Based on the validation data, the modified method, which is free of interference from the AOT-copper complex, could be a useful lipase assay with enhanced accuracy and reproducibility.

  5. Design of an Energy-Balanced Reliable Routing Protocol Based on a Parking Guidance System

    周雪; 朱小明; 陈立建; 方凯; 雷艳静; 毛科技


    Node energy in wireless sensor networks is limited, so designing an energy-efficient routing protocol is a central research topic. After analysing readily available protocols, this paper proposes a reliable energy-balanced routing protocol for a practical parking guidance system. The protocol establishes a global network topology in the initial stage, then selects the next-hop node according to an energy cost function, and updates a node's reliability information when there are obstacles above the node. By jointly considering the nodes' residual energy, hop counts, distances to neighbours and reliability, the protocol balances energy consumption across nodes and effectively prolongs the lifetime of the network. We simulated the reliable energy-balanced routing protocol and compared it with shortest-path routing in terms of network lifetime, mean energy consumption and energy variance, confirming the superiority of the proposed protocol. Finally, the protocol was applied to an actual parking guidance system, where it achieved notable results.
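
    A next-hop selection driven by an energy cost function over residual energy, hop count, distance and reliability, as the abstract describes, might look like the following sketch. The weights and the functional form are assumptions for illustration, not the paper's formula:

```python
def next_hop(candidates, alpha=0.4, beta=0.2, gamma=0.2, delta=0.2):
    """Pick the neighbour minimizing a weighted cost. Each candidate is
    (node_id, residual_energy, hops, distance, reliability), with
    energy, distance and reliability normalized to [0, 1]. Low residual
    energy and low reliability raise the cost, so the protocol steers
    traffic away from depleted or unreliable nodes."""
    def cost(c):
        _, energy, hops, dist, rel = c
        return (alpha * (1 - energy) + beta * hops
                + gamma * dist + delta * (1 - rel))
    return min(candidates, key=cost)[0]
```

    Weighting residual energy (alpha) most heavily is one way to balance load across nodes and extend network lifetime, which is the stated design goal.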

  6. Novel Rotavirus VP7 Typing Assay Using a One-Step Reverse Transcriptase PCR Protocol and Product Sequencing and Utility of the Assay for Epidemiological Studies and Strain Characterization, Including Serotype Subgroup Analysis

    DiStefano, Daniel J.; Kraiouchkine, Nikolai; Mallette, Laura; Maliga, Marianne; Kulnis, Gregory; Keller, Paul M.; Clark, H. Fred; Shaw, Alan R.


    Rotavirus is the most common cause of severe dehydrating gastroenteritis in infants. To date, 10 different serotypes of rotavirus have been identified in human stools. While four or five serotypes dominate, serotype circulation varies with season and geography. Since our laboratory has been involved in the development of a multivalent rotavirus vaccine, it is important to identify the serotypes of rotavirus encountered during our clinical trials. We have developed methodologies for the molecular identification of rotavirus strains based on VP7 gene segment sequence. A 365-bp reverse transcriptase PCR product was generated from the VP7 gene segment using a pair of novel degenerate primers. All serotypes tested (both animal and human) yielded an identically sized product after amplification. Sequencing of these products is performed using truncated versions of the original primers. The sequence generated is compared against a database of rotavirus VP7 sequences, with the G type determined, based on the sequence homology. Using this assay, we have correctly identified human VP7 strains from a panel of available serotypes, as well as numerous animal strains. The assay was qualified using rotavirus positive stool samples, negative stool samples, and rotavirus-spiked stool samples. In addition, samples from cases of acute gastroenteritis collected at Children's Hospital of Philadelphia have been evaluated and indicate that the assay is able to discriminate subtle differences within serotypes. The assay has been utilized in the testing of >3,000 antigen-positive (enzyme immunoassay) samples collected during clinical trials of a rotavirus vaccine (RotaTeq) and identified a serotype in ∼92% of samples (3, 17, 19). PMID:16333070

  7. Reliable Quantification of the Potential for Equations Based on Spot Urine Samples to Estimate Population Salt Intake: Protocol for a Systematic Review and Meta-Analysis

    Huang, Liping; Crino, Michelle; Wu, Jason HY; Woodward, Mark; Land, Mary-Anne; McLean, Rachael; Webster, Jacqui; Enkhtungalag, Batsaikhan; Nowson, Caryl A; Elliott, Paul; Cogswell, Mary; Toft, Ulla; Mill, Jose G.; Furlanetto, Tania W.; Ilich, Jasminka Z.


    Background Methods based on spot urine samples (a single sample at one time-point) have been identified as a possible alternative approach to 24-hour urine samples for determining mean population salt intake. Objective The aim of this study is to identify a reliable method for estimating mean population salt intake from spot urine samples. This will be done by comparing the performance of existing equations against one another and against estimates derived from 24-hour urine samples. The effect...

  8. A reliable protocol for the stable transformation of non-embryogenic cell cultures of grapevine (Vitis vinifera L.) and Taxus x media

    Ascensión Martínez-Márquez; Jaime Morante-Carriel; Karla Ramírez-Estrada; Rosa Cusido; Susana Sellés-Marchart; Javier Palazon; Maria Angeles Pedreño; Roque Bru-Martínez


    One of the major aims of metabolic engineering in cell culture systems is to increase yields of secondary metabolites. Efficient transformation methods are a priority for successfully applying metabolic engineering to cell cultures of plants that produce bioactive or therapeutic compounds, such as Vitis vinifera and Taxus x media. The aim of this study was to establish a reliable method to transform non-embryogenic cell cultures of these species. The V. vinifera cv. Gamay/cv. Monastrell cell li...

  9. Optimization of a sperm-oviduct binding assay mimicking in vivo conditions. Adoption of sperm separation methods and protocols for analysing sperm motility and intracellular Ca2+ level

    Narud, Birgitte


    An in vitro model that mimics the interactions between spermatozoa and oviductal epithelial cells can be used to increase the knowledge about the function of the oviduct and the formation of a sperm reservoir in vivo. The aim of the present study was to optimize methods for culturing bovine epithelial cells (BOECs) bi-dimensionally on plastic and three-dimensionally on polyester membrane. These cells were used in a sperm binding assay for evaluation of sperm-BOEC binding and relea...

  10. Evaluating the reliability of a novel neck-strength assessment protocol for healthy adults using self-generated resistance with a hand-held dynamometer.

    Versteegh, Theo; Beaudet, Danielle; Greenbaum, Marla; Hellyer, Leah; Tritton, Amanda; Walton, Dave


    Objective: To assess the intra- and inter-session test-retest reliability of a novel neck-strength assessment protocol using a hand-held dynamometer. Background: A literature review revealed a lack of neck-strength assessment protocols that are both portable and reliable. Hand-held dynamometry is a portable, inexpensive method of assessing muscle strength, but it is not commonly used to assess neck strength. Methods: A hand-held dynamometer was used to assess neck strength in 30 healthy participants. Peak force was measured in cervical flexion, extension, lateral flexion, lateral flexion with rotation, and pure rotation, using the ipsilateral hand to apply isometric resistance for 3 seconds. Three measurements were taken over 6 to 8 days. Results: Test-retest intraclass correlation coefficients (ICCs) showed high reliability, ranging from 0.94 to 0.97 for all directions tested from trial 1 to trial 2 (intra-session reliability) [ICC(2,1), absolute]. ICC values demonstrated good to high inter-session reliability, ranging from 0.87 to 0.95 for all directions tested from trial 1 to trial 3 [ICC(2,1), absolute]. Conclusion: The results indicate that the neck and upper-quadrant strength assessment protocol across the five test positions is feasible using hand-held dynamometry and yields good to high reliability.
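The ICC(2,1) absolute-agreement coefficients reported in this record come from a two-way random-effects ANOVA decomposition of a subjects × sessions score matrix. A minimal sketch of that computation, assuming complete data with no missing cells (function name and toy data are illustrative, not from the study):

```python
def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.

    ratings: list of n subjects, each a list of k session/rater scores.
    """
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_err = ss_total - ss_rows - ss_cols
    ms_r = ss_rows / (n - 1)              # between-subjects mean square
    ms_c = ss_cols / (k - 1)              # between-sessions mean square
    ms_e = ss_err / ((n - 1) * (k - 1))   # residual mean square
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Identical scores across sessions give perfect agreement:
perfect = icc_2_1([[1, 1], [2, 2], [3, 3]])   # 1.0
# A constant 1-point session offset lowers the absolute-agreement ICC:
offset = icc_2_1([[1, 2], [2, 3], [3, 4]])
```

Because ICC(2,1) measures absolute agreement, a systematic session-to-session shift penalizes the coefficient even when the subject ranking is preserved.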

  11. Nano-immunosafety: issues in assay validation

    Boraschi, Diana; Italiani, Paola [Institute of Biomedical Technologies, National Research Council, Via G. Moruzzi 1, 56124 Pisa (Italy); Oostingh, Gertie J; Duschl, Albert [Department of Molecular Biology, University of Salzburg, Hellbrunnerstrasse 34, 5020 Salzburg (Austria); Casals, Eudald; Puntes, Victor F [Institut Catala de Nanotecnologia, Campus de la UAB - Facultat de Ciencies, Edifici CM7, 08193 Bellaterra (Spain); Nelissen, Inge, E-mail: [VITO NV, Boeretang 200, BE-2400 Mol (Belgium)


    Assessing the safety of engineered nanomaterials for human health must include a thorough evaluation of their effects on the immune system, which is responsible for defending the integrity of our body from damage and disease. An array of robust and representative assays should be set up and validated, which could be predictive of the effects of nanomaterials on immune responses. In a trans-European collaborative work, in vitro assays have been developed to this end. In vitro tests have been preferred for their suitability to standardisation and easier applicability. Adapting classical assays to testing the immunotoxicological effects of nanoparticulate materials has raised a series of issues that needed to be appropriately addressed in order to ensure reliability of results. Besides the exquisitely immunological problem of selecting representative endpoints predictive of the risk of developing disease, assay results turned out to be significantly biased by artefactual interference of the nanomaterials or contaminating agents with the assay protocol. Having addressed such problems, a series of robust and representative assays have been developed that describe the effects of engineered nanoparticles on professional and non-professional human defence cells. Two of such assays are described here, one based on primary human monocytes and the other employing human lung epithelial cells transfected with a reporter gene.

  12. A reliable protocol for the stable transformation of non-embryogenic cell cultures of grapevine (Vitis vinifera L.) and Taxus x media

    Ascensión Martínez-Márquez


    One of the major aims of metabolic engineering in cell culture systems is to increase yields of secondary metabolites. Efficient transformation methods are a priority for successfully applying metabolic engineering to cell cultures of plants that produce bioactive or therapeutic compounds, such as Vitis vinifera and Taxus x media. The aim of this study was to establish a reliable method to transform non-embryogenic cell cultures of these species. The V. vinifera cv. Gamay/cv. Monastrell cell lines and Taxus x media were used for Agrobacterium-mediated transformation using the Gateway-compatible Agrobacterium sp. binary vector system for fast, reliable DNA cloning. The Taxus x media and Vitis cell lines were maintained in culture for more than 4 and 15 months, respectively, with no loss of reporter gene expression or antibiotic resistance. The introduced genes had no discernible effect on cell growth or on the extracellular accumulation of the phytoalexin trans-resveratrol (t-R) in response to elicitation with methylated cyclodextrins (MBCD) and methyl jasmonate (MeJA) in the grapevine transgenic cell lines compared to the parental control. The method described herein provides an excellent tool to exploit exponentially growing genomic resources to enhance, optimize or diversify the production of bioactive compounds generated by grapevine and yew cell cultures, and offers a better understanding of many areas of grapevine and yew biology.

  13. Using Multiple Phenotype Assays and Epistasis Testing to Enhance the Reliability of RNAi Screening and Identify Regulators of Muscle Protein Degradation

    Nathaniel J. Szewczyk


    RNAi is a convenient, widely used tool for screening for genes of interest. We have recently used this technology to screen roughly 750 candidate genes, in C. elegans, for potential roles in regulating muscle protein degradation in vivo. To maximize confidence and assess reproducibility, we have only used previously validated RNAi constructs and have included time courses and replicates. To maximize mechanistic understanding, we have examined multiple sub-cellular phenotypes in multiple compartments in muscle. We have also tested knockdowns of putative regulators of degradation in the context of mutations or drugs that were previously shown to inhibit protein degradation by diverse mechanisms. Here we discuss how assaying multiple phenotypes, multiplexing RNAi screens with use of mutations and drugs, and use of bioinformatics can provide more data on rates of potential false positives and negatives as well as more mechanistic insight than simple RNAi screening.

  14. An outbreak of scrub typhus in military personnel despite protocols for antibiotic prophylaxis: doxycycline resistance excluded by a quantitative PCR-based susceptibility assay.

    Harris, Patrick N A; Oltvolgyi, Csongor; Islam, Aminul; Hussain-Yusuf, Hazizul; Loewenthal, Mark R; Vincent, Gemma; Stenos, John; Graves, Stephen


    Scrub typhus is caused by the obligate intracellular bacterium Orientia tsutsugamushi and is endemic to many countries in the Asia-Pacific region, including tropical Australia. We describe a recent large outbreak amongst military personnel in north Queensland. A total of 45 clinical cases were identified (36% of all potentially exposed individuals). This occurred despite existing military protocols stipulating the provision of doxycycline prophylaxis. Doxycycline resistance in O. tsutsugamushi has been described in South-East Asia, but not Australia. In one case, O. tsutsugamushi was cultured from eschar tissue and blood. Using quantitative real-time PCR to determine susceptibility to doxycycline for the outbreak strain, a minimum inhibitory concentration (MIC) of ≤0.04 μg/mL was found, indicating susceptibility to this agent. It seems most probable that failure to adhere to adequate prophylaxis over the duration of the military exercise accounted for the large number of cases encountered rather than doxycycline resistance.

  15. Is the Scale for Measuring Motivational Interviewing Skills a valid and reliable instrument for measuring the primary care professionals motivational skills?: EVEM study protocol

    Pérula Luis Á


    Abstract Background Lifestyle is one of the main determinants of people's health. It is essential to find the most effective prevention strategies that can be used to encourage behavioral changes in patients. Many theories are available that explain change or adherence to specific health behaviors in subjects. In this sense, the so-called Motivational Interviewing has increasingly gained relevance. Few well-validated instruments are available for measuring doctors' communication skills, and more specifically Motivational Interviewing. Methods/Design The hypothesis of this study is that the Scale for Measuring Motivational Interviewing Skills (EVEM) questionnaire is a valid and reliable instrument for measuring primary care professionals' skills to promote behavior change in patients. To test the hypothesis we have designed a prospective, observational, multi-center study to validate a measuring instrument. -Scope: Thirty-two primary care centers in Spain. -Sampling and Size: a) face and consensual validity: a group composed of 15 experts in Motivational Interviewing; b) assessment of the psychometric properties of the scale: 50 physician-patient encounters will be videoed; a total of 162 interviews will be conducted with six standardized patients, and another 200 interviews will be conducted with 50 real patients (n=362). Four physicians will be specially trained to assess 30 interviews randomly selected to test the scale's reproducibility. -Measurements to test the hypothesis: a) Face validity: development of a draft questionnaire based on a theoretical model, using Delphi-type methodology with experts. b) Scale psychometric properties: intraobservers will evaluate video-recorded interviews: content-scalability validity (Exploratory Factor Analysis), internal consistency (Cronbach alpha), intra-/inter-observer reliability (Kappa index, intraclass correlation coefficient, Bland & Altman methodology), generalizability, construct validity and

  16. Is the Scale for Measuring Motivational Interviewing Skills a valid and reliable instrument for measuring the primary care professionals motivational skills?: EVEM study protocol.

    Pérula, Luis Á; Campiñez, Manuel; Bosch, Josep M; Barragán Brun, Nieves; Arboniés, Juan C; Bóveda Fontán, Julia; Martín Alvarez, Remedios; Prados, Jose A; Martín-Rioboó, Enrique; Massons, Josep; Criado, Margarita; Fernández, José Á; Parras, Juan M; Ruiz-Moral, Roger; Novo, Jesús M


    Lifestyle is one of the main determinants of people's health. It is essential to find the most effective prevention strategies that can be used to encourage behavioral changes in patients. Many theories are available that explain change or adherence to specific health behaviors in subjects. In this sense, the so-called Motivational Interviewing has increasingly gained relevance. Few well-validated instruments are available for measuring doctors' communication skills, and more specifically Motivational Interviewing. The hypothesis of this study is that the Scale for Measuring Motivational Interviewing Skills (EVEM questionnaire) is a valid and reliable instrument for measuring primary care professionals' skills to promote behavior change in patients. To test the hypothesis we have designed a prospective, observational, multi-center study to validate a measuring instrument. -Scope: Thirty-two primary care centers in Spain. -Sampling and Size: a) face and consensual validity: a group composed of 15 experts in Motivational Interviewing; b) assessment of the psychometric properties of the scale: 50 physician-patient encounters will be videoed; a total of 162 interviews will be conducted with six standardized patients, and another 200 interviews will be conducted with 50 real patients (n=362). Four physicians will be specially trained to assess 30 interviews randomly selected to test the scale's reproducibility. -Measurements to test the hypothesis: a) Face validity: development of a draft questionnaire based on a theoretical model, using Delphi-type methodology with experts. b) Scale psychometric properties: intraobservers will evaluate video-recorded interviews: content-scalability validity (Exploratory Factor Analysis), internal consistency (Cronbach alpha), intra-/inter-observer reliability (Kappa index, intraclass correlation coefficient, Bland & Altman methodology), generalizability, construct validity and sensitivity to change (Pearson product-moment correlation
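Among the psychometric properties listed in this protocol, internal consistency via Cronbach's alpha has a compact closed form: alpha = k/(k-1) · (1 - Σ item variances / variance of the summed score). A minimal sketch, assuming an n-respondents × k-items score matrix (data below are illustrative, not from the EVEM study):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for internal consistency.

    scores: list of n respondents, each a list of k item scores.
    """
    k = len(scores[0])

    def var(xs):
        # Sample variance (ddof=1), matching the usual psychometric convention.
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var([row[j] for row in scores]) for j in range(k))
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - item_vars / total_var)
```

Perfectly parallel items (every item rises and falls together) yield alpha = 1; weakly related items push alpha toward zero or below.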

  17. Terminal deoxynucleotidyl transferase dUTP nick end labeling (TUNEL) assay using bench top flow cytometer for evaluation of sperm DNA fragmentation in fertility laboratories: protocol, reference values, and quality control.

    Sharma, Rakesh; Ahmad, Gulfam; Esteves, Sandro C; Agarwal, Ashok


    The purpose of this study is to provide a detailed protocol and quality control steps for measuring sperm DNA fragmentation (SDF) by terminal deoxynucleotidyl transferase deoxyuridine triphosphate (dUTP) nick end labeling (TUNEL) assay using a new bench top flow cytometer, determine the reference value of SDF, and assess sensitivity, specificity, and distribution of SDF in infertile men and controls with proven and unproven fertility. Semen specimens from 95 controls and 261 infertile men referred to a male infertility testing laboratory were tested for SDF by TUNEL assay using Apo-Direct kit and a bench top flow cytometer. Percentage of cells positive for TUNEL was calculated. Inter- and intraobserver variability was examined. TUNEL cutoff value, sensitivity, specificity, and distribution of different cutoff values in controls and infertile patients were calculated. The reference value of SDF by TUNEL assay was 16.8 % with a specificity of 91.6 % and sensitivity of 32.6 %. The positive and negative predictive values were 91.4 and 33.1 %, respectively. The upper limit of DNA damage in infertile men was significantly higher (68.9 %) than that in the controls (19.6 %). TUNEL assay using flow cytometry is a reproducible and easy method to determine SDF. At a cutoff point of 16.8 %, the test showed high specificity and positive predictive value. The results of this test could identify infertile men whose sperm DNA fragmentation does not contribute to their infertility and confirm that a man who tests positive is likely to be infertile due to elevated sperm DNA fragmentation.
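The sensitivity, specificity, and predictive values quoted in this record follow from the standard 2×2 confusion-matrix definitions applied at the chosen SDF cutoff. A minimal sketch (the counts in the example are illustrative, not the study's actual data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 confusion-matrix metrics for a diagnostic cutoff.

    tp/fp/fn/tn: true-positive, false-positive, false-negative,
    true-negative counts at the cutoff.
    """
    return {
        "sensitivity": tp / (tp + fn),  # true-positive rate
        "specificity": tn / (tn + fp),  # true-negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Illustrative counts only:
m = diagnostic_metrics(tp=90, fp=10, fn=10, tn=90)
```

Note that PPV and NPV depend on the case mix of the tested population, which is why the record's high PPV coexists with a low sensitivity at the 16.8% cutoff.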

  18. Development of highly reliable in silico SNP resource and genotyping assay from exome capture and sequencing: an example from black spruce (Picea mariana).

    Pavy, Nathalie; Gagnon, France; Deschênes, Astrid; Boyle, Brian; Beaulieu, Jean; Bousquet, Jean


    Picea mariana is a widely distributed boreal conifer across Canada and the subject of advanced breeding programmes for which population genomics and genomic selection approaches are being developed. Targeted sequencing was achieved after capturing P. mariana exome with probes designed from the sequenced transcriptome of Picea glauca, a distant relative. A high capture efficiency of 75.9% was reached although spruce has a complex and large genome including gene sequences interspersed by some long introns. The results confirmed the relevance of using probes from congeneric species to successfully perform interspecific exome capture in the genus Picea. A bioinformatics pipeline was developed including stringent criteria that helped detect a set of 97,075 highly reliable in silico SNPs. These SNPs were distributed across 14,909 genes. Part of an Infinium iSelect array was used to estimate the rate of true positives by validating 4267 of the predicted in silico SNPs by genotyping trees from P. mariana populations. The true positive rate was 96.2% for in silico SNPs, compared to a genotyping success rate of 96.7% for a set of 1115 P. mariana control SNPs recycled from previous genotyping arrays. These results indicate the high success rate of the genotyping array and the relevance of the selection criteria used to delineate the new P. mariana in silico SNP resource. Furthermore, in silico SNPs were generally of medium to high frequency in natural populations, thus providing high informative value for future population genomics applications. © 2015 John Wiley & Sons Ltd.

  19. Performance of a commercial assay for the diagnosis of influenza A (H1N1) infection in comparison to the Centers for Disease Control and Prevention protocol of real-time RT-PCR

    María G Barbás


    At the time of the influenza A (H1N1) emergency, the WHO responded with remarkable speed by releasing guidelines and a protocol for a real-time RT-PCR assay (rRT-PCR). The aim of the present study was to evaluate the performance of the "Real Time Ready Influenza A/H1N1 Detection Set" (June 2009, Roche kit) in comparison to the CDC reference rRT-PCR protocol. The overall sensitivity of the Roche assay for detection of the Inf A gene in the presence or absence of the H1 gene was 74.5%. The sensitivity for detecting samples that were positive only for the Inf A gene (absence of the H1 gene) was 53.3%, whereas the sensitivity for H1N1-positive samples (presence of the Inf A gene and any other swine gene) was 76.4%. The specificity of the assay was 97.1%. A new version of the kit (November 2009) is now available, and a recent evaluation of its performance showed good sensitivity to detect pandemic H1N1 compared to other molecular assays.

  20. Reliable LC-MS/MS assay for the estimation of rilpivirine in human plasma: application to a bioequivalence study and incurred sample reanalysis.

    Gupta, Ajay; Guttikar, Swati; Patel, Yogesh; Shrivastav, Pranav S; Sanyal, Mallika


    A simple, precise, and rapid stable isotope dilution liquid chromatography-tandem mass spectrometry method has been developed and validated for the quantification of rilpivirine, a non-nucleoside reverse transcriptase inhibitor in human plasma. Rilpivirine and its deuterated analogue, rilpivirine-d6, used as an internal standard (IS) were quantitatively extracted by liquid-liquid extraction with methyl-tert-butyl ether and diethyl ether solvent mixture from 50 μL plasma. The chromatography was achieved on Gemini C18 (150 × 4.6 mm, 5 µm) analytical column in a run time of 2.2 min. The precursor → product ion transitions for rilpivirine (m/z 367.1 → 128.0) and IS (m/z 373.2 → 134.2) were monitored on a triple quadrupole mass spectrometer in the positive ionization mode. The linearity of the method was established in the concentration range of 0.5-200 ng/mL. The mean extraction recovery for rilpivirine (94.9%) and IS (99.9%) from spiked plasma samples was consistent and reproducible. The IS-normalized matrix factors for rilpivirine ranged from 0.98 to 1.02 across three quality controls. Bench top, freeze-thaw, wet extract, and long-term stability of rilpivirine was examined in spiked plasma samples. The application of the method was demonstrated by a bioequivalence study with 25 mg rilpivirine tablet formulation in 40 healthy subjects. The assay reproducibility was shown by reanalysis of 200 study samples and the % change in the concentration of repeat values from the original values was within ±15%.
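Incurred sample reanalysis (ISR) as described in this record compares each repeat concentration with its original value and checks that the percent change stays within ±15%. A minimal sketch of that check, assuming percent change is taken relative to the original value as the abstract states (regulatory guidelines more commonly divide by the mean of the two values and accept a ±20% limit for chromatographic assays, with at least two-thirds of samples required to pass):

```python
def isr_percent_changes(original, repeat):
    """Percent change of each repeat concentration relative to its original value."""
    return [100.0 * (r - o) / o for o, r in zip(original, repeat)]

def isr_pass_rate(original, repeat, limit=15.0):
    """Fraction of samples whose repeat value falls within +/- limit % of the original."""
    changes = isr_percent_changes(original, repeat)
    return sum(abs(c) <= limit for c in changes) / len(changes)

# Illustrative concentrations (ng/mL), not study data:
rate = isr_pass_rate([100.0, 50.0, 20.0], [108.0, 47.5, 21.0])
```

Swapping the denominator from `o` to `(o + r) / 2` and `limit` to 20.0 would reproduce the more common regulatory formulation.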

  1. A reliable solid phase microextraction-gas chromatography-triple quadrupole mass spectrometry method for the assay of selenomethionine and selenomethylselenocysteine in aqueous extracts: difference between selenized and not-enriched selenium potatoes.

    Gionfriddo, Emanuela; Naccarato, Attilio; Sindona, Giovanni; Tagarelli, Antonio


    A new analytical approach is exploited for the assay of selenium speciation in selenized and non-selenium-enriched potatoes, based on the widely available solid-phase microextraction (SPME) technique coupled to gas chromatography-triple quadrupole mass spectrometry (SPME-GC-QqQ MS). The assay of selenomethionine (SeMet) and selenomethylselenocysteine (SeMeSeCys) in potatoes reported here provides clues to the effectiveness of the SPME technique combined with gas chromatography-tandem mass spectrometry, which could be of general use. For the development of the GC method, the selected analytes were converted into their N(O,S)-alkoxycarbonyl alkyl ester derivatives by direct treatment with alkyl chloroformate in aqueous extracts. The performance of five SPME fibers and three chloroformates was tested in univariate mode, and the best results were obtained using the divinylbenzene/carboxen/polydimethylsiloxane fiber and propyl chloroformate. The variables affecting the efficiency of SPME analysis were optimized by the multivariate approach of design of experiments (DoE); in particular, a central composite design (CCD) was applied. Tandem mass spectrometry in selected reaction monitoring (SRM) mode allowed the elimination of matrix interferences, providing reconstructed chromatograms with well-resolved peaks and very satisfactory detection and quantification limits. The recovery and precision of the proposed protocol, tested at concentrations of 8 and 40 μg kg(-1) (dry matter), ranged from 82.3 to 116.3% and from 8.5 to 13.1%, respectively. The application of the method to commercial samples of selenized and non-selenium-enriched potatoes proved that Se fertilization significantly increases the concentration of these bioavailable selenoamino acids.

  2. Accuracy and reliability of the sensory test performed using the laryngopharyngeal endoscopic esthesiometer and rangefinder in patients with suspected obstructive sleep apnoea hypopnoea: protocol for a prospective double-blinded, randomised, exploratory study.

    Giraldo-Cadavid, Luis Fernando; Bastidas, Alirio Rodrigo; Padilla-Ortiz, Diana Marcela; Concha-Galan, Diana Carolina; Bazurto, María Angelica; Vargas, Leslie


    Patients with obstructive sleep apnoea hypopnoea syndrome (OSA) might have varying degrees of laryngopharyngeal mechanical hyposensitivity that might impair the brain's capacity to prevent airway collapse during sleep. However, this knowledge about sensory compromises in OSA comes from studies performed using methods with little evidence of their validity. Hence, the purpose of this study is to assess the reliability and accuracy of the measurement of laryngopharyngeal mechanosensitivity in patients with OSA using a recently developed laryngopharyngeal endoscopic esthesiometer and rangefinder (LPEER). The study will be prospective and double blinded, with a randomised crossover assignment of raters performing the sensory tests. Subjects will be recruited from patients with suspected OSA referred for baseline polysomnography to a university hospital sleep laboratory. Intra-rater and inter-rater reliability will be evaluated using the Bland-Altman limits of agreement plot, the intraclass correlation coefficient, and the Pearson or Spearman correlation coefficient, depending on the distribution of the variables. Diagnostic accuracy will be evaluated plotting ROC curves using standard baseline polysomnography as a reference. The sensory threshold values for patients with mild, moderate and severe OSA will be determined and compared using ANOVA or the Kruskal-Wallis test, depending on the distribution of the variables. The LPEER could be a new tool for evaluating and monitoring laryngopharyngeal sensory impairment in patients with OSA. If it is shown to be valid, it could help to increase our understanding of the pathophysiological mechanisms of this condition and potentially help in finding new therapeutic interventions for OSA. The protocol has been approved by the Institutional Review Board of Fundacion Neumologica Colombiana. The results will be disseminated through conference presentations and peer-reviewed publication. This trial was registered at Clinical

  3. Reliability of front-Crawl's force-time curve in a short duration protocol

    Augusto Carvalho Barbosa


    The aim of this study was to analyze the reliability of the biomechanical parameters of the front-crawl force-time curve in a 10-s tethered-swimming protocol. Sixteen male swimmers (age: 20.4 ± 4.0 years; 100-m freestyle best time: 53.68 ± 0.99 s) performed two 10-s maximal efforts in tethered swimming. Peak force, average force, rate of force development, impulse, stroke duration, time to peak force, and minimum force were represented by the mean of eight consecutive strokes obtained in each trial. Student's t-test was used to detect differences between the two efforts for each parameter, with the significance level set at 5%. Relative reliability was measured by Pearson's correlation coefficient, and consistency between the two trials by the intraclass correlation coefficient (ICC). Absolute reliability was verified by the coefficient of variation (CV). No statistically significant difference was found for any biomechanical parameter between the two efforts. The high ICCs and low CVs indicated high internal consistency of the analyzed parameters. It is concluded that the biomechanical parameters obtained from tethered swimming are reproducible when a short-duration protocol is used, demonstrating that coaches and athletes can use the protocol with a high degree of reliability.

  4. 19. The HUman Micro Nucleus project. International Data Base Comparison for results with the cytokinesis-block micronucleus assay in human lymphocytes. I. Effect of laboratory protocol, scoring criteria, and host factors on the frequency of micronuclei


    The first results of an analysis of pooled data from laboratories using the cytokinesis-block micronucleus assay in human lymphocytes and participating in the HUMN (HUman MicroNucleus project) international collaborative study are presented. The effects of laboratory protocol, scoring criteria, and host factors on baseline micronucleus (MN) frequency are evaluated, and a reference range of "normal" values against which future studies may be compared is provided. Primary data from historical records were submitted by 25 laboratories distributed in 16 countries. This resulted in a database of nearly 7000 subjects. Potentially significant differences were present in the methods used by participating laboratories, such as in the type of culture medium, the concentration of Cytochalasin-B, the percentage of fetal calf serum, and in the culture method. Differences in criteria for scoring MN were also evident. The overall median MN frequency in non-exposed (i.e., normal) subjects was 6.5‰ and the interquartile range was between 3‰ and 12‰. An increase in MN frequency with age was evident in all but two laboratories. The effect of gender, although not so evident in all databases, was also present, with females having a 19% higher level of MN (95% C.I.: 14-24%). Statistical analyses were performed using random-effects models for correlated data. Our best model, which included exposure to genotoxic factors, host factors, methods, and scoring criteria, explained 75% of the total variance, with the largest contribution attributable to laboratory methods.

  5. A Reliable Protocol for Plant Regeneration from Pedicel Axillary Bud of Phalaenopsis in vitro

    王敬文; 明凤; 叶鸣明; 董玉光; 梁斌; 陈龙英; 沈大棱


    An efficient and simple method for high-frequency protocorm regeneration from pedicel axillary buds of Phalaenopsis is described. Pedicel axillary buds were cultured on Murashige and Skoog's basal medium (MS) with combinations of 6-benzylaminopurine (BA) (2-3 mg/L) and α-naphthaleneacetic acid (NAA) (0.1-0.5 mg/L). Medium supplemented with 3 mg/L BA in combination with 0.1 mg/L NAA produced the best response in protocorm occurrence (80%) without the addition of any organic material. On subculture to the same medium, protocorms produced shoots within 4 weeks of culture, and well-developing shoots were obtained separately. Forty percent of the developed shoots produced roots on medium supplemented with 0.1 mg/L IAA, and the plants were transferred to moss for growth. Because of the simplicity of the culture medium ingredients, this is a reliable protocol for plant regeneration of Phalaenopsis for factory production in developing countries and is also useful for genetic improvement programs.

  6. Reliability of diagnostic imaging techniques in suspected acute appendicitis: proposed diagnostic protocol

    Cura del, J. L.; Oleaga, L.; Grande, D.; Vela, A. C.; Ibanez, A. M. [Hospital de Basurto, Bilbao (Spain)]


    To study the utility of ultrasound and computed tomography (CT) in cases of suspected appendicitis; to determine the diagnostic yield in different clinical contexts and patient characteristics; and to assess the costs and benefits of introducing these techniques, proposing a protocol for their use. Negative appendectomies, complications, and length of hospital stay in a group of 152 patients with suspected appendicitis who underwent ultrasound and CT were compared with those of 180 patients who underwent appendectomy during the same time period but had not been selected for the first group; the costs for each group were calculated. In the first group, the diagnostic value of the clinical signs was also evaluated. The reliability of the clinical signs was limited, while the results with ultrasound and CT were excellent. The incidence of negative appendectomy was 9.6% in the study group and 12.2% in the control group. Moreover, there were fewer complications and a shorter hospital stay in the first group. Among men, however, the rate of negative appendectomy was lower in the control group. The cost of using ultrasound and CT in the management of appendicitis was only slightly higher than that of the control group. Although ultrasound and CT are not necessary in cases in which the probability of appendicitis is low or in men presenting clear clinical evidence, the use of these techniques is indicated in the remaining cases in which appendicitis is suspected. In children, ultrasound is the technique of choice. In all other patients, if negative results are obtained with one of the two techniques, the other should be performed. (Author) 49 refs.

  7. Critical issues with the in vivo comet assay: A report of the comet assay working group in the 6th International Workshop on Genotoxicity Testing (IWGT).

    Speit, Günter; Kojima, Hajime; Burlinson, Brian; Collins, Andrew R; Kasper, Peter; Plappert-Helbig, Ulla; Uno, Yoshifumi; Vasquez, Marie; Beevers, Carol; De Boeck, Marlies; Escobar, Patricia A; Kitamoto, Sachiko; Pant, Kamala; Pfuhler, Stefan; Tanaka, Jin; Levy, Dan D


    As a part of the 6th IWGT, an expert working group on the comet assay evaluated critical topics related to the use of the in vivo comet assay in regulatory genotoxicity testing. The areas covered were: identification of the domain of applicability and regulatory acceptance, identification of critical parameters of the protocol and attempts to standardize the assay, experience with combination and integration with other in vivo studies, demonstration of laboratory proficiency, sensitivity and power of the protocol used, use of different tissues, freezing of samples, and choice of appropriate measures of cytotoxicity. The standard protocol detects various types of DNA lesions but does not detect all types of DNA damage. Modifications of the standard protocol may be used to detect additional types of specific DNA damage (e.g., cross-links, bulky adducts, oxidized bases). In addition, the working group identified critical parameters that should be carefully controlled and described in detail in every published study protocol. In vivo comet assay results are more reliable if they are obtained in laboratories that have demonstrated proficiency. This includes demonstration of an adequate response to vehicle controls and an adequate response to a positive control for each tissue being examined. There was general agreement that freezing of samples is an option, but more data are needed in order to establish generally accepted protocols. With regard to tissue toxicity, the working group concluded that cytotoxicity could be a confounder of comet results. It is recommended to look at multiple parameters, such as histopathological observations, organ-specific clinical chemistry, and indicators of tissue inflammation, to decide whether compound-specific toxicity might influence the result. The expert working group concluded that the alkaline in vivo comet assay is a mature test for the evaluation of genotoxicity and can be recommended to regulatory agencies for use.

  8. MEMS reliability

    Hartzell, Allyson L; Shea, Herbert R


    This book focuses on the reliability and manufacturability of MEMS at a fundamental level. It demonstrates how to design MEMS for reliability and provides detailed information on the different types of failure modes and how to avoid them.

  9. Tube-Forming Assays.

    Brown, Ryan M; Meah, Christopher J; Heath, Victoria L; Styles, Iain B; Bicknell, Roy


    Angiogenesis involves the generation of new blood vessels from the existing vasculature and is dependent on many growth factors and signaling events. In vivo angiogenesis is dynamic and complex, meaning assays are commonly utilized to explore specific targets for research into this area. Tube-forming assays offer an excellent overview of the molecular processes in angiogenesis. The Matrigel tube-forming assay is a simple-to-implement but powerful tool for identifying biomolecules involved in angiogenesis. A detailed experimental protocol on the implementation of the assay is described in conjunction with an in-depth review of methods that can be applied to the analysis of the tube formation. In addition, an ImageJ plug-in is presented which allows automatic quantification of tube images, reducing analysis times while removing user bias and subjectivity.

  10. The comet assay – from toy to tool

    Guenter Speit


    conditions for the assay protocol, the study design and the statistical analysis. The recently launched ComNet project aims to validate the comet assay as a reliable biomonitoring tool. Unfortunately, the comet assay has sometimes been used without the necessary knowledge about the principles underlying the method and the kind of information it provides. Such knowledge gaps may lead to misconceptions regarding the use of the assay and the interpretation of results. This presentation will briefly discuss the developments and applications of the comet assay, its advantages and limitations and the requirements for appropriate test performance.

  11. Software reliability

    Bendell, A


    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo
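The reliability growth models surveyed here can be made concrete with a small numeric sketch. Below is the Goel-Okumoto non-homogeneous Poisson process model, one classic software reliability growth model; the choice of model and the parameter values are illustrative assumptions, not taken from the book:

```python
import math

def goel_okumoto_mean(t, a, b):
    """Expected cumulative failures by time t: m(t) = a * (1 - exp(-b*t)).
    a = total expected faults, b = per-fault detection rate (assumed values)."""
    return a * (1.0 - math.exp(-b * t))

def failure_intensity(t, a, b):
    """Instantaneous failure rate lambda(t) = dm/dt = a*b*exp(-b*t)."""
    return a * b * math.exp(-b * t)

def conditional_reliability(x, t, a, b):
    """Probability of observing no failure in (t, t+x], given testing up to t."""
    return math.exp(-(goel_okumoto_mean(t + x, a, b) - goel_okumoto_mean(t, a, b)))

# Hypothetical parameters: 100 latent faults, detection rate 0.05 per time unit.
a, b = 100.0, 0.05
found_by_10 = goel_okumoto_mean(10.0, a, b)
rel_next_unit = conditional_reliability(1.0, 10.0, a, b)
```

As testing proceeds, the failure intensity decays and the conditional reliability over a fixed horizon rises, which is the "reliability growth" such models are used to predict.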

  12. Two novel nonradioactive polymerase chain reaction-based assays of dried blood spots, genomic DNA, or whole cells for fast, reliable detection of Z and S mutations in the alpha 1-antitrypsin gene

    Andresen, B S; Knudsen, I; Jensen, P K;


    Two new nonradioactive polymerase chain reaction (PCR)-based assays for the Z and S mutations in the alpha 1-antitrypsin gene are presented. The assays take advantage of PCR-mediated mutagenesis, creating new diagnostic restriction enzyme sites for unambiguous discrimination between test samples...

  13. Reliability Engineering

    Lazzaroni, Massimo


    This book gives a practical guide for designers and users in Information and Communication Technology context. In particular, in the first Section, the definition of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing the laboratory tests, puts in evidence the reliability concept from the experimental point of view. In ICT context, the failure rate for a given system can be

  14. (dtltt) protocol


    Mar 1, 2013 ... Keywords: multi-access, multiservice, network, synchronous, asynchronous, traffic, timed-token. The references include SAFENET, the Manufacturing Automation Protocol (MAP), and work on timed-token circulation in mobile ad hoc networks.

  15. Matrix-Assisted Laser Desorption Ionization–Time of Flight Mass Spectrometry Meropenem Hydrolysis Assay with NH4HCO3, a Reliable Tool for Direct Detection of Carbapenemase Activity

    Študentová, Vendula; Izdebski, Radoslaw; Oikonomou, Olga; Pfeifer, Yvonne; Petinaki, Efthimia; Hrabák, Jaroslav


    A comparison of a matrix-assisted laser desorption ionization–time of flight mass spectrometric (MALDI-TOF MS) meropenem hydrolysis assay with the Carba NP test showed that both methods exhibited low sensitivity (approximately 76%), mainly due to the false-negative results obtained with OXA-48-type producers. The addition of NH4HCO3 to the reaction buffer for the MALDI-TOF MS assay dramatically improved its sensitivity (98%). Automatic interpretation of the MALDI-TOF MS assay, using the MBT STAR-BL software, generally agreed with the results obtained after manual analysis. For the Carba NP test, spectrophotometric analysis found six additional carbapenemase producers. PMID:25694522

  16. Mac protocols for cyber-physical systems

    Xia, Feng


    This book provides a literature review of various wireless MAC protocols and techniques for achieving real-time and reliable communications in the context of cyber-physical systems (CPS). The evaluation analysis of IEEE 802.15.4 for CPS therein will give insights into configuration and optimization of critical design parameters of MAC protocols. In addition, this book also presents the design and evaluation of an adaptive MAC protocol for medical CPS, which exemplifies how to facilitate real-time and reliable communications in CPS by exploiting IEEE 802.15.4 based MAC protocols. This book wil

  17. Enzyme assays.

    Reymond, Jean-Louis; Fluxà, Viviana S; Maillard, Noélie


    Enzyme assays are analytical tools to visualize enzyme activities. In recent years a large variety of enzyme assays have been developed to assist the discovery and optimization of industrial enzymes, in particular for "white biotechnology" where selective enzymes are used with great success for economically viable, mild and environmentally benign production processes. The present article highlights the aspects of fluorogenic and chromogenic substrates, sensors, and enzyme fingerprinting, which are our particular areas of interest.

  18. A Spectrophotometric Assay Optimizing Conditions for Pepsin Activity.

    Harding, Ethelynda E.; Kimsey, R. Scott


    Describes a laboratory protocol optimizing the conditions for the assay of pepsin activity using the Coomassie Blue dye-binding assay of protein concentration. The dye binds through strong, noncovalent interactions to basic and aromatic amino acid residues. (DDR)

  19. DSTC Layering Protocols in Wireless Cooperative Networks

    Elamvazhuthi, P S; Dey, B K


    In ad hoc wireless relay networks, layers of relays are used to communicate from a source to a destination to achieve better reliability. In this paper, we consider five protocols derived from an earlier proposed protocol, in which the relays perform simple processing before transmitting and as a result achieve a distributed space-time code. Four of the protocols discussed utilize more complicated relaying schemes than the simple layered protocols proposed in earlier literature. We have analyzed the effectiveness of these protocols in various power loss configurations among the paths. The optimum allocation of the total power among the various transmissions has been found by a reasonably fine search for all the protocols. Bit error rate plots are compared under optimum power allocation for these protocols. From the simulation results, we draw some guidelines as to which protocol is suited to which kind of environment.
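The fine search over the power split described above can be sketched as a brute-force grid search. The error-probability proxy below (sum of inverse link SNRs, so the weaker transmission dominates) and the channel-gain parameters are simplifying assumptions for illustration, not the paper's actual relay model:

```python
def ber_proxy(alpha, total_power, g1, g2):
    """Crude high-SNR error proxy for a two-transmission scheme: the sum of
    inverse link SNRs, so the weaker transmission dominates the error rate.
    alpha is the fraction of total power given to the first transmission;
    g1 and g2 are assumed path-loss gains."""
    snr1 = alpha * total_power * g1
    snr2 = (1.0 - alpha) * total_power * g2
    return 1.0 / snr1 + 1.0 / snr2

def fine_search(total_power, g1, g2, steps=10000):
    """Fine grid search over the power split, mirroring the paper's approach."""
    best_alpha, best_val = None, float("inf")
    for i in range(1, steps):
        alpha = i / steps
        val = ber_proxy(alpha, total_power, g1, g2)
        if val < best_val:
            best_alpha, best_val = alpha, val
    return best_alpha
```

For equal gains the search returns an even split; when the first path is stronger (g1 > g2), power shifts toward the weaker second transmission, matching the intuition that the optimum allocation depends on the power-loss configuration.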

  20. The corneal pocket assay.

    Ziche, Marina; Morbidelli, Lucia


    The cornea in most species is physiologically avascular, and thus this assay allows the measurement of newly formed vessels. The continuous monitoring of neovascular growth in the same animal allows the evaluation of drugs acting as suppressors or stimulators of angiogenesis. Under anesthesia a micropocket is produced in the cornea thickness and the angiogenesis stimulus (tumor tissue, cell suspension, growth factor) is placed into the pocket in order to induce vascular outgrowth from the limbal capillaries. Neovascular development and progression can be modified by the presence of locally released or applied inhibitory factors or by systemic treatments. In this chapter the experimental details of the avascular cornea assay, the technical challenges, and advantages and disadvantages in different species are discussed. Protocols for local drug treatment and tissue sampling for histology and pharmacokinetic profile are reported.

  1. Concurrent Validity of a Pragmatic Protocol.

    Duncan, Julie Condon; Perozzi, Joseph A.


    The scores of 11 non-handicapped kindergarten children on the Pragmatic Protocol (used in assessing language-handicapped children) were correlated with their ratings by five experienced judges on a 7-point equal-appearing interval scale of communicative competence. The concurrent validity of the Protocol and interobserver reliability were…

  2. Microelectronics Reliability


    This report was cleared for public release. Presented is testing for reliability prediction of devices exhibiting multiple failure mechanisms, along with an integrated accelerating and measuring methodology.

  3. A high-throughput, high-quality plant genomic DNA extraction protocol.

    Li, H; Li, J; Cong, X H; Duan, Y B; Li, L; Wei, P C; Lu, X Z; Yang, J B


    The isolation of high-quality genomic DNA (gDNA) is a crucial technique in plant molecular biology. The quality of gDNA determines the reliability of real-time polymerase chain reaction (PCR) analysis. In this paper, we report a high-quality gDNA extraction protocol optimized for real-time PCR in a variety of plant species. Performed in a 96-well block, our protocol provides high throughput. Without the need for phenol-chloroform and liquid nitrogen or dry ice, our protocol is safer and more cost-efficient than traditional DNA extraction methods. The method requires 10 mg of leaf tissue and yields 5-10 µg of high-quality gDNA. Spectral measurement and electrophoresis were used to demonstrate gDNA purity. The extracted DNA was qualified in a restriction enzyme digestion assay and conventional PCR. The real-time PCR amplification was sufficiently sensitive to detect gDNA at very low concentrations (3 pg/µL). The standard curve of gDNA dilutions from our phenol-chloroform-free protocol showed better linearity (R(2) = 0.9967) than the phenol-chloroform protocol (R(2) = 0.9876). The results indicate that the gDNA was of high quality and fit for real-time PCR. This safe, high-throughput plant gDNA extraction protocol could be used to isolate high-quality gDNA for real-time PCR and other downstream molecular applications.
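The linearity figures quoted above (R(2) = 0.9967 vs. 0.9876) come from an ordinary least-squares fit of Ct values against log10 template amount across a dilution series. A minimal sketch with hypothetical dilution data (the numbers below are not the paper's):

```python
import math

def linear_fit(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept, r_squared)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot
    return slope, intercept, r2

# Hypothetical 10-fold dilution series: log10(template amount) vs. observed Ct.
logs = [math.log10(c) for c in (30000, 3000, 300, 30, 3)]
cts = [18.1, 21.5, 24.8, 28.2, 31.6]
slope, intercept, r2 = linear_fit(logs, cts)
```

A slope near -3.3 with R(2) close to 1 is the usual sign of a well-behaved standard curve; a lower R(2), as in the phenol-chloroform comparison, signals scatter across the dilutions.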

  4. Kinetic Tetrazolium Microtiter Assay

    Pierson, Duane L.; Stowe, Raymond; Koenig, David


    Kinetic tetrazolium microtiter assay (KTMA) involves use of tetrazolium salts and Triton X-100 (or equivalent), a nontoxic in vitro color developer that solubilizes the colored metabolite formazan without injuring or killing metabolizing cells. Provides for continuous measurement of metabolism and makes it possible to determine rate of action of antimicrobial agent in real time, as well as effective inhibitory concentrations. Used to monitor growth after addition of stimulatory compounds. Provides for kinetic determination of efficacy of biocide, greatly increasing reliability and precision of results. Also used to determine relative effectiveness of antimicrobial agent as function of time. Capability of generating results on day of test is extremely important in treatment of water and waste, disinfection of hospital rooms, and in pharmaceutical, agricultural, and food-processing industries. Assay also used in many aspects of cell biology.

  5. Histology protocols

    CarloAlberto Redi


    Full Text Available. Tim D. Hewitson & Ian A. Darby (Eds.), Humana Press, Totowa, New Jersey (USA). Series: Springer Protocols, Methods in Molecular Biology, Volume 611, 2010. Pages: 230; € 83.15. ISBN: 978-1-60327-344-2. Impressive as it may sound in an era in which biology sees a clear dominance of reductionism, with the idea that complexity can be disentangled more and more thanks to the use of molecular tools, the reader will remain fascinated by this slim and agile volume devoted to bringing together what apparently are two separate worlds: molecular biology and histology. Simply remembering to the youngest scientists.....

  6. An improved protocol for the preparation of protoplasts from an established Arabidopsis thaliana cell suspension culture and infection with RNA of turnip yellow mosaic tymovirus: a simple and reliable method.

    Schirawski, J; Planchais, S; Haenni, A L


    An improved method for preparation of protoplasts of Arabidopsis thaliana cells grown in suspension culture is presented. This method is fast, reliable and can be used for the production of virtually an unlimited number of protoplasts at any time. These protoplasts can be transformed efficiently with RNA from turnip yellow mosaic tymovirus (TYMV) by polyethyleneglycol-mediated transfection. The simple transfection procedure has been optimized at various steps. Replication of TYMV can be monitored routinely by detection of the coat protein in as few as 2 x 10(4) infected protoplasts.

  7. Quantitative multiplex real-time PCR assay for shrimp allergen: comparison of commercial master mixes and PCR platforms in rapid cycling.

    Eischeid, Anne C; Kasko, Sasha M


    Real-time PCR has been used widely in numerous fields. In food safety, it has been applied to detection of microbes and other contaminants, including food allergens. Interest in rapid (fast) cycling real-time PCR has grown because it yields results in less time than does conventional cycling. However, fast cycling can adversely affect assay performance. Here we report on tests of commercial master mixes specifically designed for fast real-time PCR using a shrimp allergen assay we previously developed and validated. The objective of this work was to determine whether specialized commercial master mixes lead to improved assay performance in rapid cycling. Real-time PCR assays were carried out using four different master mixes and two different rapid cycling protocols. Results indicated that specialized master mixes did yield quality results. In many cases, linear ranges spanned up to 7 orders of magnitude, R(2) values were at least 0.95, and reaction efficiencies were within or near the optimal range of 90 to 110%. In the faster of the two rapid cycling protocols tested, assay performance and PCR amplification were markedly better for the shorter PCR product. In conclusion, specialized commercial master mixes were effective as part of rapid cycling protocols, but conventional cycling as used in our previous work is more reliable for the shrimp assay tested.
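The "reaction efficiencies within or near the optimal range of 90 to 110%" mentioned above are conventionally derived from the slope of the standard curve (Ct vs. log10 input) as E = 10^(-1/slope) - 1. A small sketch; the function names are our own:

```python
def efficiency_from_slope(slope):
    """PCR amplification efficiency (%) from a standard-curve slope.
    A perfectly doubling reaction gives slope = -1/log10(2) ~ -3.32 and E = 100%."""
    return (10.0 ** (-1.0 / slope) - 1.0) * 100.0

def within_optimal_range(slope, low=90.0, high=110.0):
    """True if the implied efficiency falls in the conventional 90-110% window."""
    return low <= efficiency_from_slope(slope) <= high
```

Slopes much shallower than -3.32 imply sub-100% efficiency (e.g., inhibition), while steeper-than-expected apparent efficiencies above 110% often indicate artifacts rather than genuinely faster amplification.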

  8. Network Coding Protocols for Smart Grid Communications

    Prior, Rui; Roetter, Daniel Enrique Lucani; Phulpin, Yannick


    We propose a robust network coding protocol for enhancing the reliability and speed of data gathering in smart grids. At the heart of our protocol lies the idea of tunable sparse network coding, which adopts the transmission of sparsely coded packets at the beginning of the transmission process...... but then switches to a denser coding structure towards the end. Our systematic mechanism maintains the sparse structure during the recombination of packets at the intermediate nodes. The performance of our protocol is compared by means of simulations of IEEE reference grids against standard master-slave protocols...
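The tunable sparse coding idea above, sparsely coded packets early and a denser coding structure towards the end, can be sketched over GF(2). The density values, generation size, and switch point below are illustrative assumptions, not the protocol's parameters:

```python
import random

def coding_vector(generation_size, density, rng):
    """Random GF(2) coefficient vector; each source packet is included with
    probability `density`. Retries until at least one coefficient is nonzero."""
    while True:
        v = [1 if rng.random() < density else 0 for _ in range(generation_size)]
        if any(v):
            return v

def tunable_sparse_schedule(generation_size, n_packets, switch_at=0.8,
                            sparse_density=0.1, dense_density=0.5, seed=1):
    """Sparse vectors for the first `switch_at` fraction of transmissions
    (cheap to recombine and decode), then denser vectors so the receiver can
    close its remaining degrees of freedom."""
    rng = random.Random(seed)
    vectors = []
    for i in range(n_packets):
        density = sparse_density if i < switch_at * n_packets else dense_density
        vectors.append(coding_vector(generation_size, density, rng))
    return vectors

vectors = tunable_sparse_schedule(32, 100)
```

Intermediate nodes that recombine only a few sparse packets keep the result sparse, which reflects the systematic mechanism the abstract describes for maintaining the sparse structure.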

  9. Traffic Performance Analysis of Manet Routing Protocol

    Rajeswari, S; 10.5121/ijdps.2011.2306


    The primary objective of this research work is to study and investigate the performance measures of the Gossip routing protocol and Energy Efficient and Reliable Adaptive Gossip routing protocols. We use TCP- and CBR-based traffic models to analyze the performance of the above-mentioned protocols based on the parameters of packet delivery ratio, average end-to-end delay, and throughput. We investigate the effect of changes in the simulation time and the number of nodes for the MANET routing protocols. For simulation, we have used the ns-2 simulator.
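The three figures of merit named in the abstract are simple functions of a packet trace. A minimal sketch over a hypothetical trace format (packet id, send time, receive time or None if dropped, size in bits), not the actual ns-2 trace parser:

```python
def traffic_metrics(events, sim_time):
    """Packet delivery ratio, average end-to-end delay (s), throughput (bit/s)."""
    sent = len(events)
    delivered = [(s, r, bits) for _, s, r, bits in events if r is not None]
    pdr = len(delivered) / sent if sent else 0.0
    avg_delay = sum(r - s for s, r, _ in delivered) / len(delivered) if delivered else 0.0
    throughput = sum(bits for _, _, bits in delivered) / sim_time
    return pdr, avg_delay, throughput

# Hypothetical four-packet trace over a 2-second simulation.
trace = [
    (1, 0.00, 0.12, 4096),
    (2, 0.50, 0.61, 4096),
    (3, 1.00, None, 4096),   # dropped in the network
    (4, 1.50, 1.68, 4096),
]
pdr, delay, tput = traffic_metrics(trace, sim_time=2.0)
```

Only delivered packets contribute to delay and throughput, while drops lower the delivery ratio; this is the standard way such metrics are extracted from simulator traces.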

  10. Grid reliability

    Saiz, P; Rocha, R; Andreeva, J


    We are offering a system to track the efficiency of different components of the GRID. We can study the performance of both the WMS and the data transfers. At the moment, we have set up different parts of the system for ALICE, ATLAS, CMS and LHCb. None of the components that we have developed are VO specific, therefore it would be very easy to deploy them for any other VO. Our main goal is basically to improve the reliability of the GRID. The main idea is to discover the different problems that have happened as soon as possible and inform those responsible. Since we study the jobs and transfers issued by real users, we see the same problems that users see. As a matter of fact, we see even more problems than the end user does, since we are also interested in following up the errors that GRID components can overcome by themselves (for instance, in case of a job failure, resubmitting the job to a different site). This kind of information is very useful to site and VO administrators. They can find out the efficien...

  11. Specific PCR and real-time PCR assays for detection and quantitation of 'Candidatus Phytoplasma phoenicium'.

    Jawhari, Maan; Abrahamian, Peter; Sater, Ali Abdel; Sobh, Hana; Tawidian, Patil; Abou-Jawdah, Yusuf


    Almond witches' broom (AlmWB) is a fast-spreading lethal disease of almond, peach and nectarine associated with 'Candidatus Phytoplasma phoenicium'. The development of PCR and quantitative real-time PCR (qPCR) assays for the sensitive and specific detection of the phytoplasma is of prime importance for early detection of 'Ca. P. phoenicium' and for epidemiological studies. The developed qPCR assay herein uses a TaqMan(®) probe labeled with Black Hole Quencher Plus. The specificity of the PCR and that of the qPCR detection protocols were tested on 17 phytoplasma isolates belonging to 11 phytoplasma 16S rRNA groups, on samples of almond, peach, nectarine, native plants and insects infected or uninfected with the phytoplasma. The developed assays showed high specificity against 'Ca. P. phoenicium' and no cross-reactivity against any other phytoplasma, plant or insect tested. The sensitivity of the developed PCR and qPCR assays was similar to the conventional nested PCR protocol using universal primers. The qPCR assay was further validated by quantitating AlmWB phytoplasma in different hosts, plant parts and potential insect vectors. The highest titers of 'Ca. P. phoenicium' were detected in the phloem tissues of stems and roots of almond and nectarine trees, where they averaged from 10(5) to 10(6) genomic units per nanogram of host DNA (GU/ng of DNA). The newly developed PCR and qPCR protocols are reliable, specific and sensitive methods that are easily applicable to high-throughput diagnosis of AlmWB in plants and insects and can be used for surveys of potential vectors and alternative hosts.

  12. CR reliability testing

    Honeyman-Buck, Janice C.; Rill, Lynn; Frost, Meryll M.; Staab, Edward V.


    The purpose of this work was to develop a method for systematically testing the reliability of a CR system under realistic daily loads in a non-clinical environment prior to its clinical adoption. Once digital imaging replaces film, it will be very difficult to revert back should the digital system become unreliable. Prior to the beginning of the test, a formal evaluation was performed to set the benchmarks for performance and functionality. A formal protocol was established that included all the 62 imaging plates in the inventory for each 24-hour period in the study. Imaging plates were exposed using different combinations of collimation, orientation, and SID. Anthropomorphic phantoms were used to acquire images of different sizes. Each combination was chosen randomly to simulate the differences that could occur in clinical practice. The tests were performed over a wide range of times with batches of plates processed to simulate the temporal constraints required by the nature of portable radiographs taken in the Intensive Care Unit (ICU). Current patient demographics were used for the test studies so automatic routing algorithms could be tested. During the test, only three minor reliability problems occurred, two of which were not directly related to the CR unit. One plate was discovered to cause a segmentation error that essentially reduced the image to only black and white with no gray levels. This plate was removed from the inventory to be replaced. Another problem was a PACS routing problem that occurred when the DICOM server with which the CR was communicating had a problem with disk space. The final problem was a network printing failure to the laser cameras. Although the units passed the reliability test, problems with interfacing to workstations were discovered. The two issues that were identified were the interpretation of what constitutes a study for CR and the construction of the look-up table for a proper gray scale display.

  13. Research on Reliable Routing Protocol Based on Random-Walk Model in UWSN

    朱剑; 刘君; 赵海; 徐野


    Compared with traditional wireless sensor networks, UWSN (Underwater Wireless Sensor Networks) uses acoustic signals for data transmission. Because of the high transmission delay this introduces, data loss caused by signal conflicts becomes prominent, and reliable network communication faces new challenges. To achieve low-consumption, highly reliable network communication in such an environment, this paper designs a minimum-conflict-probability routing algorithm, MCR (Minimum Conflict probability Routing). The algorithm combines node degree with node workload to form a new routing metric, DBM (Degree and Buffer based Metric), and on the basis of this metric uses the random-walk model from graph theory to select a path between the source node and the sink node. The core idea of MCR is to select the path with the lowest conflict probability between two nodes for data transmission. Although the algorithm cannot eliminate conflict-related packet loss at the MAC layer, simulation results based on NS-2 show that, in a UWSN environment, MCR effectively reduces the probability of conflict-related packet loss along the path compared with traditional routing algorithms, improves end-to-end link reliability, and achieves higher and more stable network throughput.
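A random-walk next-hop choice biased by a degree-and-buffer score can be sketched as follows; the exact form of the DBM metric is not given in the abstract, so the score below (degree divided by one plus buffer load) is a hypothetical stand-in:

```python
import random

def next_hop(neighbors, rng):
    """Pick the next hop with probability proportional to a DBM-like score.
    `neighbors` maps node id -> (degree, buffer_load); the score favors
    well-connected, lightly loaded nodes (hypothetical form of the metric)."""
    scores = {n: deg / (1.0 + load) for n, (deg, load) in neighbors.items()}
    total = sum(scores.values())
    r = rng.random() * total
    acc = 0.0
    for n, s in scores.items():
        acc += s
        if r <= acc:
            return n
    return n  # fallback for floating-point edge cases
```

Nodes with higher degree and lighter load are proportionally more likely to be chosen, so repeated walks concentrate on low-conflict paths while retaining the randomness of the walk model.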

  14. Frontiers of reliability

    Basu, Asit P; Basu, Sujit K


    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibul

  15. Performance Analysis of AODV-UI Routing Protocol With Energy Consumption Improvement Under Mobility Models in Hybrid Ad hoc Network

    Abdusy Syarif; Riri Fitri Sari


    In mobile ad hoc networks (MANETs), the routing protocol plays the most important role. In the last decade, the Ad hoc On-demand Distance Vector (AODV) routing protocol became the focus of research on MANETs worldwide. Many protocols have been proposed to improve and optimize AODV in the quest for a reliable protocol. In this paper, we present some suggested improvements to the AODV routing protocol. Our proposed protocol, called AODV-UI, improved AODV in g...

  16. Delta-Reliability

    Eugster, P.; Guerraoui, R.; Kouznetsov, P.


    This paper presents a new, non-binary measure of the reliability of broadcast algorithms, called Delta-Reliability. This measure quantifies the reliability of practical broadcast algorithms that, on the one hand, were devised with some form of reliability in mind, but, on the other hand, are not considered reliable according to the “traditional” notion of broadcast reliability [HT94]. Our specification of Delta-Reliability suggests a further step towards bridging the gap between theory and...

  17. Reliability computation from reliability block diagrams

    Chelson, P. O.; Eckstein, E. Y.


    Computer program computes system reliability for very general class of reliability block diagrams. Four factors are considered in calculating probability of system success: active block redundancy, standby block redundancy, partial redundancy, and presence of equivalent blocks in the diagram.
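Three of the four factors listed (active redundancy, partial redundancy, and series combination of blocks) reduce to standard block-diagram formulas; a minimal sketch, leaving out standby redundancy, which additionally needs switch-failure assumptions:

```python
from math import comb

def series(blocks):
    """System works only if every block works."""
    r = 1.0
    for p in blocks:
        r *= p
    return r

def parallel(blocks):
    """Active redundancy: system works if at least one block works."""
    q = 1.0
    for p in blocks:
        q *= (1.0 - p)
    return 1.0 - q

def k_of_n(k, n, p):
    """Partial redundancy: at least k of n identical blocks must work."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Example: two redundant pumps (0.9 each) in series with a controller (0.99).
system = series([parallel([0.9, 0.9]), 0.99])
```

Nested calls mirror the diagram: redundant groups are collapsed to a single equivalent block first, and the resulting reliabilities are multiplied along the series path.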

  18. Reliability of plantar pressure platforms.

    Hafer, Jocelyn F; Lenhoff, Mark W; Song, Jinsup; Jordan, Joanne M; Hannan, Marian T; Hillstrom, Howard J


    Plantar pressure measurement is common practice in many research and clinical protocols. While the accuracy of some plantar pressure measuring devices and methods for ensuring consistency in data collection on plantar pressure measuring devices have been reported, the reliability of different devices when testing the same individuals is not known. This study calculated intra-mat, intra-manufacturer, and inter-manufacturer reliability of plantar pressure parameters as well as the number of plantar pressure trials needed to reach a stable estimate of the mean for an individual. Twenty-two healthy adults completed ten walking trials across each of two Novel emed-x(®) and two Tekscan MatScan(®) plantar pressure measuring devices in a single visit. Intraclass correlation (ICC) was used to describe the agreement between values measured by different devices. All intra-platform reliability correlations were greater than 0.70. All inter-emed-x(®) reliability correlations were greater than 0.70. Inter-MatScan(®) reliability correlations were greater than 0.70 in 31 and 52 of 56 parameters when looking at a 10-trial average and a 5-trial average, respectively. Inter-manufacturer reliability including all four devices was greater than 0.70 for 52 and 56 of 56 parameters when looking at a 10-trial average and a 5-trial average, respectively. All parameters reached a value within 90% of an unbiased estimate of the mean within five trials. Overall, reliability results are encouraging for investigators and clinicians who may have plantar pressure data sets that include data collected on different devices.
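The intraclass correlations reported above can be reproduced from a two-way ANOVA decomposition. The sketch below implements ICC(2,1) (Shrout & Fleiss: two-way random effects, absolute agreement, single measurement); the study does not state which ICC form it used, so this choice is an assumption:

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    `data` is a list of rows: one subject per row, one device/rater per column."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)     # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)     # between devices
    ss_tot = sum((data[i][j] - grand) ** 2 for i in range(n) for j in range(k))
    ss_err = ss_tot - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

With a constant offset between two devices, the absolute-agreement ICC drops below 1 even when the rankings agree perfectly, which is one reason inter-manufacturer agreement can lag intra-platform agreement.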

  19. Development of a multiplex PCR assay to detect Edwardsiella tarda, Streptococcus parauberis, and Streptococcus iniae in olive flounder (Paralichthys olivaceus).

    Park, Seong Bin; Kwon, Kyoung; Cha, In Seok; Jang, Ho Bin; Nho, Seong Won; Fagutao, Fernand F; Kim, Young Kyu; Yu, Jong Earn; Jung, Tae Sung


    A multiplex PCR protocol was established to simultaneously detect major bacterial pathogens in olive flounder (Paralichthys olivaceus) including Edwardsiella (E.) tarda, Streptococcus (S.) parauberis, and S. iniae. The PCR assay was able to detect 0.01 ng of E. tarda, 0.1 ng of S. parauberis, and 1 ng of S. iniae genomic DNA. Furthermore, this technique was found to have high specificity when tested with related bacterial species. This method represents a cheaper, faster, and reliable alternative for identifying major bacterial pathogens in olive flounder, the most important farmed fish in Korea.

  20. Protocol Implementation Generator

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.


    …necessary tools. In this paper, we present the Protocol Implementation Generator (PiG), a framework that can be used to add protocol generation to protocol negotiation, or to easily share and implement new protocols throughout a network. PiG enables the sharing, verification, and translation of communication protocols. With it, partners can suggest a new protocol by sending its specification. After formally verifying the specification, each partner generates an implementation, which can then be used for establishing communication. We also present a practical realisation of the Protocol Implementation Generator framework based on the LySatool and a translator from the LySa language into C or Java…

  1. A real-time, quantitative PCR protocol for assessing the relative parasitemia of Leucocytozoon in waterfowl

    Smith, Matthew M.; Schmutz, Joel A.; Apelgren, Chloe; Ramey, Andy M.


    Microscopic examination of blood smears can be effective at diagnosing and quantifying hematozoa infections. However, this method requires highly trained observers, is time consuming, and may be inaccurate for detection of infections at low levels of parasitemia. To develop a molecular methodology for identifying and quantifying Leucocytozoon parasite infection in wild waterfowl (Anseriformes), we designed a real-time, quantitative PCR protocol to amplify Leucocytozoon mitochondrial DNA using TaqMan fluorogenic probes and validated our methodology using blood samples collected from waterfowl in interior Alaska during late summer and autumn (n = 105). By comparing our qPCR results to those derived from a widely used nested PCR protocol, we determined that our assay showed high levels of sensitivity (91%) and specificity (100%) in detecting Leucocytozoon DNA from host blood samples. Additionally, results of a linear regression revealed significant correlation between the raw measure of parasitemia produced by our qPCR assay (Ct values) and numbers of parasites observed on blood smears (R2 = 0.694, P = 0.003), indicating that our assay can reliably determine the relative parasitemia levels among samples. This methodology provides a powerful new tool for studies assessing effects of haemosporidian infection in wild avian species.
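    Sensitivity and specificity against the nested PCR reference reduce to simple ratios. The 2x2 counts below are hypothetical, chosen only to be consistent with the reported 91% sensitivity and 100% specificity on n = 105:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP),
    taking the nested PCR result as the reference standard."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 2x2 counts for n = 105 samples
sens, spec = sensitivity_specificity(tp=32, fn=3, tn=70, fp=0)
print(round(sens, 2), round(spec, 2))  # 0.91 1.0
```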

  2. VLSI Reliability in Europe

    Verweij, Jan F.


    Several issue's regarding VLSI reliability research in Europe are discussed. Organizations involved in stimulating the activities on reliability by exchanging information or supporting research programs are described. Within one such program, ESPRIT, a technical interest group on IC reliability was

  3. Terrestrial Photovoltaic Module Accelerated Test-To-Failure Protocol

    Osterwald, C. R.


    This technical report documents a test-to-failure protocol that may be used to obtain quantitative information about the reliability of photovoltaic modules using accelerated testing in environmental temperature-humidity chambers.

  4. Reliable data delivery protocols for underwater sensor networks

    Nowsheen, Nusrat


    Underwater Acoustic Sensor Networks (UASNs) are becoming increasingly promising to monitor aquatic environment. The network is formed by deploying a number of sensor nodes and/or Autonomous Underwater Vehicles (AUVs) to support diverse applications such as pollution monitoring, oceanographic data collection, disaster recovery and surveillance. These applications require transmission of data packets from the source to a sink or gateway in a multihop fashion and eventually to a message ferry or...

  5. EXACT2: the semantics of biomedical protocols.

    Soldatova, Larisa N; Nadis, Daniel; King, Ross D; Basu, Piyali S; Haddi, Emma; Baumlé, Véronique; Saunders, Nigel J; Marwan, Wolfgang; Rudkin, Brian B


    The reliability and reproducibility of experimental procedures is a cornerstone of scientific practice. There is a pressing technological need for the better representation of biomedical protocols to enable other agents (human or machine) to better reproduce results. A framework that ensures that all information required for the replication of experimental protocols is captured is essential to achieving reproducibility. To construct EXACT2 we manually inspected hundreds of published and commercial biomedical protocols from several areas of biomedicine. After establishing a clear pattern for extracting the required information, we utilized text-mining tools to translate the protocols into a machine-amenable format. We verified the utility of EXACT2 through the successful processing of previously 'unseen' protocols (not used for the construction of EXACT2). We have developed the ontology EXACT2 (EXperimental ACTions), designed to capture the full semantics of biomedical protocols required for their reproducibility. The paper reports on a fundamentally new version of EXACT2 that supports the semantically defined representation of biomedical protocols. The ability of EXACT2 to capture the semantics of biomedical procedures was verified through a text-mining use case, in which EXACT2 serves as a reference model for text-mining tools to identify terms pertinent to experimental actions, and their properties, in biomedical protocols expressed in natural language. An EXACT2-based framework for the translation of biomedical protocols to a machine-amenable format is proposed. The EXACT2 ontology is sufficient to record, in a machine-processable form, the essential information about biomedical protocols. EXACT2 defines explicit semantics of experimental actions and can be used by various computer applications. It can serve as a reference model for the translation of biomedical protocols in natural language into a semantically defined format.

  6. Chromosome aberration assays in Allium

    Grant, W.F.


    The common onion (Allium cepa) is an excellent plant for the assay of chromosome aberrations after chemical treatment. Other species of Allium (A. cepa var. proliferum, A. carinatum, A. fistulosum and A. sativum) have also been used but to a much lesser extent. Protocols have been given for using root tips from either bulbs or seeds of Allium cepa to study the cytological end-points, such as chromosome breaks and exchanges, which follow the testing of chemicals in somatic cells. It is considered that both mitotic and meiotic end-points should be used to a greater extent in assaying the cytogenetic effects of a chemical. From a literature survey, 148 chemicals are tabulated that have been assayed in 164 Allium tests for their clastogenic effect. Of the 164 assays which have been carried out, 75 are reported as giving a positive reaction, 49 positive and with a dose response, 1 positive and temperature-related, 9 borderline positive, and 30 negative; 76% of the chemicals gave a definite positive response. It is proposed that the Allium test be included among those tests routinely used for assessing chromosomal damage induced by chemicals.

  7. Physical layer bootstrapping protocol for cognitive radio networks

    Doost-Mohammady, R.; Paweczak, P.; Janssen, G.J.M.; Segers, J.C.M.


    In this paper a novel signaling protocol for coexistence and spectrum sharing among cognitive radio nodes is proposed. This protocol allows the radios to rendezvous with each other in a statically allocated spectrum band through on-off keying signaling and reliable spectrum sensing. It enables the r

  8. Developing a yeast-based assay protocol to monitor total ...



  9. Reliability Generalization: "Lapsus Linguae"

    Smith, Julie M.


    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  10. Two Aspects of Scorer Reliability in the Bender Gestalt Test

    Morsbach, Gisela; And Others


    This study investigated (a) interscorer reliability of the Bender-Gestalt Test by using more than one person to score the same test protocols; and (b) rate-rerate reliability of the Bender-Gestalt Test after a half-year interval. (Author)

  11. Research on Protocol Migration

    汪芸; 顾冠群; et al.


    This paper elaborates on the concept and model of protocol migration in network interconnection. Migration strategies and principles are discussed, and several cases are studied in detail which show the basic procedure and techniques used in protocol migration.

  12. Ocular irritation reversibility assessment for personal care products using a porcine corneal culture assay.

    Donahue, Douglas A; Avalos, Javier; Kaufman, Lewis E; Simion, F Anthony; Cerven, Daniel R


    Personal care product manufacturers have used a broad spectrum of alternative ocular irritation assays during the past two decades because these tests do not require the use of live animals, they provide reliable predictive data, and they are relatively inexpensive to conduct. To complement these assays, the ex vivo Porcine Corneal Opacity Reversibility Assay (PorCORA) was recently developed using a corneal culture model to predict reversibility of ocular irritants. Three commercially available consumer products (a shampoo, a hair color glaze, and a hair colorant system containing 12% hydrogen peroxide) were each tested in two PorCORA study replicates in order to assess potential ocular damage reversibility for surfactant-, propylene carbonate-, and peroxide-based formulations, respectively. Under the exaggerated, in vitro study conditions, the surfactant-based shampoo may cause irreversible porcine corneal damage (histological changes in the epithelial squamous cell and/or basal cell layers), whereas the hair color glaze and 12% hydrogen peroxide product caused fully reversible ocular irritation (microscopic changes only in the superficial squamous cell layer). The hair color glaze and peroxide product results correlate with established in vivo data for similar compounds, but the shampoo results contradicted previous BCOP results (expected to be only a mild irritant). Therefore, although the PorCORA protocol shows promise in predicting the extent and reversibility of potential ocular damage caused by accidental consumer eye exposure to personal care products, the contradictory results for the surfactant-based shampoo indicate that more extensive validation testing of the PorCORA is necessary to definitively establish the protocol's reliability as a Draize test replacement.

  13. Multicast Routing Protocols in Adhoc Mobile networks



    The majority of applications are in areas where rapid deployment and dynamic reconfiguration are necessary and a wireline network is not available. These include military battlefields, emergency search and rescue sites, classrooms, and conventions where participants share information dynamically using their mobile devices. Well-established routing protocols exist to offer efficient multicasting services in conventional wired networks. These protocols, having been designed for fixed networks, may fail to keep up with node movements and frequent topology changes in a MANET. Therefore, adapting existing wired multicast protocols as such to a MANET, which completely lacks infrastructure, appears less promising. Providing efficient multicasting over a MANET faces many challenges, including scalability, quality of service, reliable service, security, address configuration, and applications for multicast over MANET. Existing multicast routing protocols do not address these issues effectively over Mobile Ad hoc Networks (MANETs).

  14. Comparison of on Demand Routing Protocols

    Bharat Bhushan


    A routing protocol is used to facilitate communication in an ad hoc network. The primary goal of such a routing protocol is to provide an efficient and reliable path between a pair of nodes. Routing protocols for ad hoc networks can be categorized into three categories: table-driven, on-demand, and hybrid routing. The table-driven and hybrid routing strategies require periodic exchange of hello messages between nodes of the ad hoc network and thus have high processing and bandwidth requirements. On the other hand, the on-demand routing strategy creates routes when required and hence is very much suitable for ad hoc networks. This paper therefore examines the performance of three on-demand routing protocols at the application layer using the QualNet-5.01 simulator.

  15. Reliability of Arctic offshore installations

    Bercha, F.G. [Bercha Group, Calgary, AB (Canada); Gudmestad, O.T. [Stavanger Univ., Stavanger (Norway)]|[Statoil, Stavanger (Norway)]|[Norwegian Univ. of Technology, Stavanger (Norway); Foschi, R. [British Columbia Univ., Vancouver, BC (Canada). Dept. of Civil Engineering; Sliggers, F. [Shell International Exploration and Production, Rijswijk (Netherlands); Nikitina, N. [VNIIG, St. Petersburg (Russian Federation); Nevel, D.


    Life threatening and fatal failures of offshore structures can be attributed to a broad range of causes such as fires and explosions, buoyancy losses, and structural overloads. This paper addressed the different severities of failure types, categorized as catastrophic failure, local failure or serviceability failure. Offshore tragedies were also highlighted, namely the failures of P-36, the Ocean Ranger, the Piper Alpha, and the Alexander Kieland which all resulted in losses of human life. P-36 and the Ocean Ranger both failed ultimately due to a loss of buoyancy. The Piper Alpha was destroyed by a natural gas fire, while the Alexander Kieland failed due to fatigue induced structural failure. The mode of failure was described as being the specific way in which a failure occurs from a given cause. Current reliability measures in the context of offshore installations only consider a limited number of causes such as environmental loads. However, it was emphasized that a realistic value of the catastrophic failure probability should consider all credible causes of failure. This paper presented a general method for evaluating all credible causes of failure of an installation. The approach to calculating integrated reliability involves the use of network methods such as fault trees to combine the probabilities of all factors that can cause a catastrophic failure, as well as those which can cause a local failure with the potential to escalate to a catastrophic failure. This paper also proposed a protocol for setting credible reliability targets such as the consideration of life safety targets and escape, evacuation, and rescue (EER) success probabilities. A set of realistic reliability targets for both catastrophic and local failures for representative safety and consequence categories associated with offshore installations was also presented. The reliability targets were expressed as maximum average annual failure probabilities. The method for converting these annual…

  16. Topological Design of Protocols

    Jaffe, Arthur; Wozniakowski, Alex


    We give a topological simulation for tensor networks that we call the two-string model. In this approach we give a new way to design protocols, and we discover a new multipartite quantum communication protocol. We introduce the notion of topologically compressed transformations. Our new protocol can implement multiple, non-local compressed transformations among multi-parties using one multipartite resource state.

  17. Vertical Protocol Composition

    Groß, Thomas; Mödersheim, Sebastian Alexander


    The security of key exchange and secure channel protocols, such as TLS, has been studied intensively. However, only few works have considered what happens when the established keys are actually used—to run some protocol securely over the established “channel”. We call this a vertical protocol com...

  18. The Bead Assay for Biofilms: A Quick, Easy and Robust Method for Testing Disinfectants.

    Katharina Konrat

    Bacteria live primarily in microbial communities (biofilms), where they exhibit considerably higher biocide tolerance than their planktonic counterparts. Current standardized efficacy testing protocols for disinfectants, however, employ predominantly planktonic bacteria. In order to test the efficacy of biocides on biofilms in a standardized manner, a new assay was developed and optimized for easy handling, quickness, low running costs, and, above all, repeatability. In this assay, 5 mm glass or polytetrafluoroethylene beads in 24-well microtiter plates served as the substrate for Pseudomonas aeruginosa biofilms. After optimizing result-relevant steps, the actual performance of the assay was explored by treating P. aeruginosa biofilms with glutaraldehyde, isopropanol, or peracetic acid in predefined concentrations. The targeted 5 log10 reduction in CFU counts was achieved by glutaraldehyde at 5% (30 min) and by peracetic acid at 0.3% (10 min). In contrast, 80% isopropanol (30 min) failed to meet the reduction goal. However, the main accomplishment of this study was to unveil the potential of the assay itself, most noteworthy a reliable repeatability of results. The new bead assay for biofilms is a robust, quick, and cost-effective method for assessing the efficacy of biocides against biofilms.
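    The 5 log10 reduction criterion used above is computed from viable counts before and after treatment; a minimal sketch with hypothetical CFU values:

```python
from math import log10

def log_reduction(cfu_control, cfu_treated):
    """Log10 reduction in viable counts after biocide treatment."""
    return log10(cfu_control / cfu_treated)

# Hypothetical counts: 2e8 CFU/ml untreated vs 1e3 CFU/ml after treatment
print(round(log_reduction(2e8, 1e3), 2))  # 5.3 -> meets a 5-log target
```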

  19. Sensitive non-isotopic DNA hybridisation assay or immediate-early antigen detection for rapid identification of human cytomegalovirus in urine.

    Kimpton, C P; Morris, D J; Corbitt, G


    A sensitive non-radioactive DNA hybridisation assay employing digoxigenin-labelled probes was compared with immediate-early antigen detection and conventional virus isolation for the identification of human cytomegalovirus (HCMV) in 249 urine samples. Of 44 specimens yielding HCMV by virus isolation, more were positive by DNA hybridisation (32; 73%) than by immediate-early antigen detection (25; 52%) (P = 0.05). The specificity of the hybridisation assay in 45 apparently falsely positive specimens was supported by detection of HCMV DNA in 40 of these specimens using the polymerase chain reaction. Many urine specimens may thus contain large amounts of non-viable virus or free viral DNA. Evaluation of various protocols for the extraction and denaturation of virus DNA prior to hybridisation showed that proteinase K digestion with phenol/chloroform extraction was the most sensitive and reliable procedure. We conclude that the non-radioactive DNA hybridisation assay described is a potentially valuable routine diagnostic test.

  20. A rapid and robust assay for detection of S-phase cell cycle progression in plant cells and tissues by using ethynyl deoxyuridine

    Horváth Gábor V


    Background: Progress in plant cell cycle research is highly dependent on reliable methods for detection of cells replicating DNA. The frequency of S-phase cells (cells in the DNA synthesis phase) is a basic parameter in studies on the control of the cell division cycle and the developmental events of plant cells. Here we extend the microscopy and flow cytometry applications of the recently developed EdU (5-ethynyl-2'-deoxyuridine)-based S-phase assay to various plant species and tissues. We demonstrate that the presented protocols ensure improved preservation of cell and tissue structure and allow a significant reduction in assay duration. In comparison with the frequently used detection of bromodeoxyuridine (BrdU) and tritiated-thymidine incorporation, this new methodology offers several advantages, as we discuss here. Results: Applications of the EdU-based S-phase assay in microscopy and flow cytometry are presented using cultured cells of alfalfa, Arabidopsis, grape, maize, rice, and tobacco. We present the advantages of the EdU assay as compared to the BrdU-based replication assay and demonstrate that the EdU assay, which does not require plant cell wall digestion or DNA denaturation steps, offers reduced assay duration and better preservation of cellular, nuclear, and chromosomal morphologies. We have also shown that the fast and efficient EdU assay can be an efficient tool for dual-parameter flow cytometry analysis and for quantitative assessment of replication in thick root samples of rice. Conclusions: In plant cell cycle studies, EdU-based S-phase detection offers a superior alternative to existing S-phase assays. The EdU method is reliable, versatile, fast, simple, and non-radioactive, and it can be readily applied to many different plant systems.

  1. Rate Control Protocol for Fast Flows: A Survey

    Mr. Gaganpreet Singh,


    In today's world, congestion control is a main objective to maximize fairness, utilization, and throughput of the Internet. Every protocol has its own features for handling congestion. The most widely used protocol over the Internet is the Transmission Control Protocol (TCP). It aims at reliable, in-order delivery of bytes to the higher layer and also protects the network from congestion. Other congestion control protocols are XCP and RCP, which are advancements over TCP. We study a new congestion control protocol, the Rate Control Protocol (RCP), that makes flows complete quickly compared to TCP, other versions of TCP, and XCP. In this paper we present a comparison between TCP, XCP, and RCP, which shows that RCP is a superior choice for making flows over the Internet complete quickly.

  2. Importance of a suitable working protocol for tape stripping experiments on porcine ear skin: Influence of lipophilic formulations and strip adhesion impairment.

    Nagelreiter, C; Mahrhauser, D; Wiatschka, K; Skipiol, S; Valenta, C


    The tape stripping method is a very important tool for dermopharmacokinetic experiments in vitro, and accurate measurement of the removed corneocytes is key to a reliable calculation of a drug's skin penetration behavior. Various methods to quantify the amount of corneocytes removed with each tape strip have therefore been employed, ranging from gravimetric approaches to protein assays; recently, near-infrared densitometry (NIR) has become very widely used. As this method is based on a reduction of light intensity, interference from formulation components seems conceivable, as they could scatter light and change the results. In this study, NIR measurements were compared to a protein assay, and in addition, the influence of highly lipophilic formulations on the results of tape stripping experiments was investigated, as impairment of the adherence of strips has been reported. To this end, different tape stripping protocols were employed. The obtained results confirm the suitability of the NIR method and moreover suggest a more pronounced influence on adherence with increasing lipophilicity of the applied formulations. The results show that adapting the tape stripping protocol to the specifications of the envisioned experiments is important for reliable results. Two protocols were found favorable and are presented in this work.

  3. Blind Collective Signature Protocol

    Nikolay A. Moldovyan


    Using the digital signature (DS) scheme specified by the Belarusian DS standard, collective and blind collective DS protocols are designed. Signature formation is performed simultaneously by all of the assigned signers; therefore, the proposed protocols can also be used as protocols for simultaneously signing a contract. The proposed blind collective DS protocol represents a particular implementation of blind multisignature schemes, a novel type of signature scheme. The proposed protocols are the first implementations of multisignature schemes based on the Belarusian signature standard.

  4. Assuring reliability program effectiveness.

    Ball, L. W.


    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  5. The Accelerator Reliability Forum

    Lüdeke, Andreas; Giachino, R


    High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution will describe the forum and advertise its use in the community.

  6. Enlightenment on Computer Network Reliability From Transportation Network Reliability

    Hu Wenjun; Zhou Xizhao


    Referring to the transportation network reliability problem, five new computer network reliability definitions are proposed and discussed: computer network connectivity reliability, computer network time reliability, computer network capacity reliability, computer network behavior reliability, and computer network potential reliability. Finally, strategies are suggested to enhance network reliability.
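    Connectivity reliability, the first of the proposed definitions, can be estimated for small topologies by Monte Carlo simulation; a sketch under the assumption of independent link failures (the topology and probabilities are hypothetical):

```python
import random

def connectivity_reliability(edges, s, t, p_link, trials=20000, seed=1):
    """Monte Carlo estimate of two-terminal connectivity reliability:
    the probability that s can still reach t when each link survives
    independently with probability p_link."""
    random.seed(seed)
    hits = 0
    for _ in range(trials):
        up = [e for e in edges if random.random() < p_link]
        # graph traversal over surviving links
        reach, stack = {s}, [s]
        while stack:
            node = stack.pop()
            for a, b in up:
                nxt = b if a == node else a if b == node else None
                if nxt is not None and nxt not in reach:
                    reach.add(nxt)
                    stack.append(nxt)
        hits += t in reach
    return hits / trials

# Hypothetical 4-node "bridge" topology; each link survives with p = 0.9
net = [(0, 1), (1, 3), (0, 2), (2, 3), (1, 2)]
print(connectivity_reliability(net, 0, 3, 0.9))  # close to the exact 0.978
```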

  7. Assessment of assay sensitivity and precision in a malaria antibody ELISA.

    Rajasekariah, G Halli R; Kay, Graeme E; Russell, Natrice V; Smithyman, Anthony M


    Many types of ELISA-based immunodiagnostic test kits are commercially available for specific indications. These kits provide the necessary assay components, reagents, and guidelines to perform the assay under designated optimal conditions. Using these kits, any unknown or test sample can be assessed as negative or positive based on the results of referral calibrator (Ref+ve and Ref-ve) samples. It is essential to provide reliable test kits to end-users with adequate quality control analysis; therefore, it is necessary to check each kit for any variation in its performance. While developing a malaria antibody ELISA test kit, we optimized assay conditions with chequer-board analyses and developed an assay protocol. Kits were taken randomly from the assembly line and evaluated by operators new to the test kits, with assays performed as per the test guidelines provided. Serially diluted sera showed a clear discriminatory signal between negative and positive samples. A cut-off value (COV) is determined by evaluating the Ref-ve calibrator in replicate antigen-coated wells from 6 different plates; this COV is used as a tool to determine the S/N ratio of test samples. Besides the Ref-ve and Ref+ve calibrators, additional field serum samples are tested with the test kit. Several performance indices, such as the mean, standard deviation, and %CV, are calculated, and the inter- and intra-assay variations determined. Assay precision is determined with large and small replicate samples. In addition, assays are performed concurrently in triplicate, duplicate, and single wells, and the results analyzed for any assay variation. Different plate areas are identified in antigen-coated 96-well plates and tested blind to detect any variation. The S/N ratio is found to be a very effective tool in determining assay sensitivity. The %CV was within 10-15%. Variations seen in the assays are found to be due to operator errors and not due to kit reagents. These…
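    A common convention for a cut-off value (mean of the negative-calibrator replicates plus a multiple of their standard deviation) and for the S/N ratio can be sketched as follows; the k = 3 rule and the OD readings are assumptions for illustration, not necessarily this kit's actual rule:

```python
from statistics import mean, stdev

def cutoff_value(neg_ods, k=3):
    """Cut-off from replicate negative-calibrator ODs: mean + k*SD
    (k = 3 is a common convention; a given kit's rule may differ)."""
    return mean(neg_ods) + k * stdev(neg_ods)

def signal_to_noise(sample_od, neg_ods):
    """S/N ratio of a test sample against the negative-calibrator mean."""
    return sample_od / mean(neg_ods)

neg = [0.08, 0.10, 0.09, 0.11, 0.09, 0.10]  # hypothetical ODs, one well per plate
print(round(cutoff_value(neg), 3))           # 0.126
print(round(signal_to_noise(0.85, neg), 1))  # 8.9
```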

  8. Assays for Determination of Protein Concentration.

    Olson, Bradley J S C


    Biochemical analysis of proteins relies on accurate quantification of protein concentration. Detailed in this appendix are some commonly used methods for protein analysis, e.g., Lowry, Bradford, bicinchoninic acid (BCA), UV spectroscopic, and 3-(4-carboxybenzoyl)quinoline-2-carboxaldehyde (CBQCA) assays. The primary focus of this report is assay selection, emphasizing sample and buffer compatibility. The fundamentals of generating protein assay standard curves and of data processing are considered, as are high-throughput adaptations of the more commonly used protein assays. Also included is a rapid, inexpensive, and reliable BCA assay of total protein in SDS-PAGE sample buffer that is used for equal loading of SDS-PAGE gels. © 2016 by John Wiley & Sons, Inc.
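    Generating a standard curve and interpolating an unknown, as discussed above, can be sketched with an ordinary least-squares fit; the BSA standards and absorbance values below are hypothetical:

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Hypothetical BSA standards (ug/ml) and absorbances at 562 nm (BCA-style)
bsa_ug_ml = [0, 125, 250, 500, 1000]
a562 = [0.02, 0.15, 0.27, 0.52, 1.01]
m, c = linear_fit(bsa_ug_ml, a562)

# Interpolate an unknown sample from its absorbance
unknown_od = 0.40
print(round((unknown_od - c) / m))  # estimated concentration, ug/ml
```

    In practice, many assays (BCA included) are only approximately linear, so a narrower standard range or a nonlinear fit may be preferable.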

  9. BTP: a Block Transfer Protocol for Delay Tolerant Wireless Sensor Networks

    Hansen, Morten Tranberg; Biagioni, Edoardo S.


    …proposes a Block Transfer Protocol (BTP) designed for efficient and reliable transmission in wireless sensor networks. BTP reduces the time it takes to reliably transfer a block of packets compared to conventional link layer protocols, by piggybacking in data packets information about the transfer…

  10. Human Reliability Program Overview

    Bodin, Michael


    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  11. Power electronics reliability analysis.

    Smith, Mark A.; Atcitty, Stanley


    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
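    The fault-tree combination of component probabilities described above can be sketched with AND/OR gates over independent events; the device structure and failure probabilities are invented for illustration:

```python
def or_gate(probs):
    """Top event occurs if ANY independent input event occurs."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(probs):
    """Top event occurs only if ALL independent input events occur
    (e.g., failure of every unit in a redundant pair)."""
    r = 1.0
    for p in probs:
        r *= p
    return r

# Invented example: the device fails if its controller fails OR both
# redundant power stages fail within the year
p_fail = or_gate([0.001, and_gate([0.02, 0.02])])
print(round(p_fail, 4))  # 0.0014
```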

  12. Single-experiment displacement assay for quantifying high-affinity binding by isothermal titration calorimetry.

    Krainer, Georg; Keller, Sandro


    Isothermal titration calorimetry (ITC) is the gold standard for dissecting the thermodynamics of a biomolecular binding process within a single experiment. However, reliable determination of the dissociation constant (KD) from a single titration is typically limited to the range 100 μM>KD>1 nM. Interactions characterized by a lower KD can be assessed indirectly by so-called competition or displacement assays, provided that a suitable competitive ligand is available whose KD falls within the directly accessible window. However, this protocol is limited by the fact that it necessitates at least two titrations to characterize one high-affinity inhibitor, resulting in considerable consumption of both sample material and time. Here, we introduce a fast and efficient ITC displacement assay that allows for the simultaneous characterization of both a high-affinity ligand and a moderate-affinity ligand competing for the same binding site on a receptor within a single experiment. The protocol is based on a titration of the high-affinity ligand into a solution containing the moderate-affinity ligand bound to the receptor present in excess. The resulting biphasic binding isotherm enables accurate and precise determination of KD values and binding enthalpies (ΔH) of both ligands. We discuss the theoretical background underlying the approach, demonstrate its practical application to metal ion chelation, explore its potential and limitations with the aid of simulations and statistical analyses, and elaborate on potential applications to protein-inhibitor interactions.
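    The reason displacement brings a tight binder into the measurable window can be illustrated with the standard competitive-binding relation: in the presence of a moderate-affinity competitor B, the high-affinity ligand A titrates with an apparent dissociation constant KD_app = KD_A * (1 + [B]/KD_B). The numbers below are illustrative, not from the paper:

```python
def apparent_kd(kd_a, kd_b, conc_b):
    """Apparent KD of ligand A when competitor B (concentration conc_b) is present."""
    return kd_a * (1.0 + conc_b / kd_b)

kd_a   = 1e-11  # 10 pM high-affinity ligand, too tight for direct ITC
kd_b   = 1e-6   # 1 uM moderate-affinity competitor, directly measurable
conc_b = 1e-3   # 1 mM competitor in the calorimeter cell

print(apparent_kd(kd_a, kd_b, conc_b))  # ~1e-8 M, back inside the ITC window
```

    Choosing [B] therefore tunes KD_app into the directly accessible 100 uM - 1 nM range quoted above.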

  13. Human primary osteoclasts: in vitro generation and applications as pharmacological and clinical assay

    Zamurovic Natasa


    Osteoclasts are cells of hematopoietic origin with the unique property of dissolving bone; their inhibition is a principle for the treatment of diseases of bone loss. Protocols for the generation of human osteoclasts in vitro have been described, but they often result in cells of low activity, raising questions about the cell phenotype and the suitability of such assays for screening bone resorption inhibitors. Here we describe an optimized protocol for the production of stable amounts of highly active human osteoclasts. Mononuclear cells were isolated from human peripheral blood by density centrifugation, seeded at 600,000 cells per well of a 96-well plate and cultured for 17 days in α-MEM medium supplemented with 10% of selected fetal calf serum, 1 μM dexamethasone and a mix of macrophage colony-stimulating factor (M-CSF, 25 ng/ml), receptor activator of NFκB ligand (RANKL, 50 ng/ml) and transforming growth factor-β1 (TGF-β1, 5 ng/ml). Thus, in addition to the widely recognized osteoclast-generating factors M-CSF and RANKL, other medium supplements and lengthy culture times were necessary. This assay reliably detected inhibition of osteoclast formation (multinucleated cells positive for tartrate-resistant acid phosphatase) and activity (resorbed area and collagen fragments released from bone slices) in dose-response curves with several classes of bone resorption inhibitors. Therefore, this assay can be applied for monitoring the bone-resorbing activity of novel drugs and as a clinical test for determining the capacity of blood cells to generate bone-resorbing osteoclasts. Isolation of large quantities of active human osteoclast mRNA and protein is also made possible by this assay.

  14. Achieving Reliable Communication in Dynamic Emergency Responses

    Chipara, Octav; Plymoth, Anders N.; Liu, Fang; Huang, Ricky; Evans, Brian; Johansson, Per; Rao, Ramesh; Griswold, William G.


    Emergency responses require the coordination of first responders to assess the condition of victims, stabilize their condition, and transport them to hospitals based on the severity of their injuries. WIISARD is a system designed to facilitate the collection of medical information and its reliable dissemination during emergency responses. A key challenge in WIISARD is to deliver data with high reliability as first responders move and operate in a dynamic radio environment fraught with frequent network disconnections. The initial WIISARD system employed a client-server architecture and an ad-hoc routing protocol was used to exchange data. The system had low reliability when deployed during emergency drills. In this paper, we identify the underlying causes of unreliability and propose a novel peer-to-peer architecture that in combination with a gossip-based communication protocol achieves high reliability. Empirical studies show that compared to the initial WIISARD system, the redesigned system improves reliability by as much as 37% while reducing the number of transmitted packets by 23%. PMID:22195075
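    The gossip-based dissemination that underlies the redesigned system can be sketched with a toy push-gossip simulation: each informed node forwards the update to a few random peers per round, which tolerates the disconnections described above because no single path is relied upon. All parameters are illustrative, not WIISARD's:

```python
import random

def gossip_rounds(n=50, fanout=3, seed=1):
    """Rounds of push gossip until every one of n nodes is informed."""
    rng = random.Random(seed)
    informed = {0}                  # node 0 holds the initial medical record
    rounds = 0
    while len(informed) < n:
        rounds += 1
        # every informed node pushes the update to `fanout` random peers
        pushes = [rng.randrange(n) for _ in informed for _ in range(fanout)]
        informed.update(pushes)
    return rounds

print(gossip_rounds())  # full coverage after a handful of rounds
```

    The round count grows roughly logarithmically in n, which is why gossip stays fast even as responders join and leave.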

  15. Reliable Design Versus Trust

    Berg, Melanie; LaBel, Kenneth A.


    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following questions are addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges of verifying a reliable design versus a trusted design?

  16. Viking Lander reliability program

    Pilny, M. J.


    The Viking Lander reliability program is reviewed with attention given to the development of the reliability program requirements, reliability program management, documents evaluation, failure modes evaluation, production variation control, failure reporting and correction, and the parts program. Lander hardware failures which have occurred during the mission are listed.

  17. Atomic and efficient e-cash transaction protocol

    王茜; 杨德礼


    Atomicity is necessary for reliable and secure electronic commerce transactions and to guarantee the participants' interests. An atomic and efficient e-cash (electronic cash) transaction protocol based on the classical e-cash scheme is presented. The delivery of digital goods is incorporated into the payment process of the protocol. Apart from ensuring all three levels of atomicity, the novel protocol features high efficiency and practicability, with an unfavorable strong assumption removed. Furthermore, the proposed protocol provides non-repudiation proofs for any future disputes. Finally, an analysis of the protocol's atomicity and efficiency is given.

  18. Reliable Multicast Protocol Based on Reliable Active Nodes: Simulation and Implementation of RANRM Using ANTS

    蔡洪斌; 周明天; 杨国纬



  19. Active-Node-Initiated Reliable Multicast

    蔡洪斌; 周明天; 杨国纬


    Sender-initiated and receiver-initiated reliable multicast protocols can suffer performance degradation as the number of receivers increases. The new technologies and services of active networks can address the problem of scalable, reliable multicast. This paper presents an Active-Node-Initiated Reliable Multicast (ANIRM) protocol. The protocol, which guarantees that data packets are transported correctly from active node to active node, shifts the burden of providing reliable data transfer to the active nodes and receivers. Theoretical analysis shows that, compared with traditional network protocols, ANIRM is better in terms of bandwidth usage and data-recovery delay.

  20. A Reliable Transport Protocol for Resource Constrained Nodes: CRCTP- Protocol Design


    ...wireless networks, and particularly in multi-hop networks. These two factors make TCP especially inefficient under certain... hop-by-hop (rather than end-to-end) transfer, a hybrid ACK and NACK mechanism, no congestion control, etc. This document presents the design of the...

  1. MDP: Reliable File Transfer for Space Missions

    Rash, James; Criscuolo, Ed; Hogie, Keith; Parise, Ron; Hennessy, Joseph F. (Technical Monitor)


    This paper presents work being done at NASA/GSFC by the Operating Missions as Nodes on the Internet (OMNI) project to demonstrate the application of the Multicast Dissemination Protocol (MDP) to space missions to reliably transfer files. This work builds on previous work by the OMNI project to apply Internet communication technologies to space communication. The goal of this effort is to provide an inexpensive, reliable, standard, and interoperable mechanism for transferring files in the space communication environment. Limited bandwidth, noise, delay, intermittent connectivity, link asymmetry, and one-way links are all possible issues for space missions. Although these are link-layer issues, they can have a profound effect on the performance of transport and application level protocols. MDP, a UDP-based reliable file transfer protocol, was designed for multicast environments which have to address these same issues, and it has done so successfully. Developed by the Naval Research Lab in the mid-1990s, MDP is now in daily use by both the US Post Office and the DoD. This paper describes the use of MDP to provide automated end-to-end data flow for space missions. It examines the results of a parametric study of MDP in a simulated space link environment and discusses the results in terms of their implications for space missions. Lessons learned are addressed, which suggest minor enhancements to the MDP user interface to add specific features for space mission requirements, such as dynamic control of data rate, and a checkpoint/resume capability. These are features that are provided for in the protocol, but are not implemented in the sample MDP application that was provided. A brief look is also taken at the status of standardization. A version of MDP known as NORM (NACK-Oriented Reliable Multicast) is in the process of becoming an IETF standard.
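    The NACK-based repair idea behind MDP/NORM can be sketched in a few lines: the sender multicasts a block of packets, receivers report only the sequence numbers they are missing, and the sender retransmits just those until the block completes. The loss model and sizes below are illustrative, not from the parametric study:

```python
import random

def transfer_block(n_packets=100, loss=0.3, seed=7):
    """Return the number of transmission passes needed to deliver a block
    over a lossy one-way link using NACK-driven selective repair."""
    rng = random.Random(seed)
    received = set()
    passes = 0
    missing = set(range(n_packets))        # initially, NACK everything
    while missing:
        passes += 1
        for seq in missing:                # sender repairs only NACKed packets
            if rng.random() > loss:        # packet survives the lossy link
                received.add(seq)
        missing = set(range(n_packets)) - received
    return passes

print(transfer_block())
```

    Because only missing packets are resent, the repair traffic shrinks geometrically with each pass, which suits the bandwidth-limited, high-delay links described above.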

  2. Encryption Switching Protocols

    Couteau, Geoffroy; Peters, Thomas; Pointcheval, David


    We formally define the primitive of an encryption switching protocol (ESP), which allows switching between two encryption schemes. Intuitively, this two-party protocol converts given ciphertexts from one scheme into ciphertexts of the same messages under the other scheme, for any polynomial number of switches, in any direction. Although ESP is a special kind of two-party computation protocol, it turns out that ESP implies general two-party computation (2-PC) under natural con...

  3. Reliable Delay Constrained Multihop Broadcasting in VANETs

    Koubek Martin


    Vehicular communication is regarded as a major innovative feature of in-car technology. While improving road safety is unanimously considered the major driving factor for the deployment of Intelligent Vehicle Safety Systems, the challenges of reliable multi-hop broadcasting are exigent in vehicular networking. In fact, safety applications must rely on very accurate and up-to-date information about the surrounding environment, which in turn requires the use of accurate positioning systems and smart communication protocols for exchanging information. Communication protocols for VANETs must guarantee fast and reliable delivery of information to all vehicles in the neighbourhood, where the wireless communication medium is shared and highly unreliable, with limited bandwidth. In this paper, we focus on mechanisms that improve the reliability of broadcasting protocols, with emphasis on satisfying the delay requirements of safety applications. We present the Pseudoacknowledgments (PACKs) scheme and compare it with existing methods over varying vehicle densities in an urban scenario using the network simulator OPNET.

  4. HPLC assay of tomato carotenoids: validation of a rapid microextraction technique.

    Sérino, Sylvie; Gomez, Laurent; Costagliola, Guy; Gautier, Hélène


    Carotenoids are studied for their role as pigments and as precursors of aromas, vitamin A, abscisic acid, and antioxidant compounds in different plant tissues. A novel, rapid, and inexpensive analytical protocol is proposed to enable the simultaneous analysis of four major tomato carotenoids: lutein, lycopene, beta-carotene, and phytoene. Microextraction is performed in the presence of sodium chloride, n-hexane, dichloromethane, and ethyl acetate on fresh tomato powder that has been finely ground in liquid nitrogen. The carotenoids are extracted by agitation and centrifugation and then analyzed by HPLC using a diode array detector. The principal advantage of this extraction resides in the absence of an evaporation step, which is often necessary to assay tomato carotenoids other than lycopene. Whatever the carotenoid, tests for accuracy, reproducibility, and linearity were satisfactory and indicative of the method's reliability. The stability of extracts over time (several days at -20 degrees C), as well as the satisfactory sensitivity of the assay whatever the fruit ripeness, contributed to the robustness of the method. Reliable, rapid, simple, and inexpensive, this extraction technique is appropriate for the routine analysis of carotenoids in small samples.

  5. Multiparty Quantum Cryptographic Protocol

    M. Ramzan; M. K. Khan


    We propose a multiparty quantum cryptographic protocol. Unitary operators applied by Bob and Charlie on their respective qubits of a tripartite entangled state encode a classical symbol that can be decoded at Alice's end with the help of a decoding matrix. Eve's presence can be detected by the disturbance of the decoding matrix. Our protocol is secure against intercept-resend attacks. Furthermore, it is efficient and deterministic in the sense that two classical bits can be transferred per entangled pair of qubits. It is worth mentioning that in this protocol, the same symbol can be used for key distribution and Eve's detection, which enhances the efficiency of the protocol.

  6. Kinetic viability assays using DRAQ7 probe.

    Wlodkowic, Donald; Akagi, Jin; Dobrucki, Jurek; Errington, Rachel; Smith, Paul J; Takeda, Kazuo; Darzynkiewicz, Zbigniew


    Cell death within cell populations is a stochastic process where cell-to-cell variation in temporal progression through the various stages of cell death arises from asynchrony of subtle fluctuations in the signaling pathways. Most cell death assays rely on detection of the specific marker of cell demise at the end-point of cell culturing. Such an approach cannot account for the asynchrony and the stochastic nature of cell response to the death-inducing signal. There is a need therefore for rapid and high-throughput bioassays capable of continuously tracking viability of individual cells from the time of encountering a stress signal up to final stages of their demise. In this context, a new anthracycline derivative, DRAQ7, is gaining increasing interest as an easy-to-use marker capable of long-term monitoring of cell death in real-time. This novel probe neither penetrates the plasma membrane of living cells nor does it affect the cells' susceptibility to the death-inducing agents. However, when the membrane integrity is compromised, DRAQ7 enters cells undergoing demise and binds readily to nuclear DNA to report cell death. Here, we provide three sets of protocols for viability assays using DRAQ7 probe. The first protocol describes the innovative use of single-color DRAQ7 real-time assay to dynamically track cell viability. The second protocol outlines a simplified end-point DRAQ7 staining approach. The final protocol highlights the real-time and multiparametric apoptosis assay utilizing DRAQ7 dye concurrently with tetramethylrhodamine methyl ester (TMRM), the mitochondrial trans-membrane electrochemical potential (ΔΨm) sensing probe.

  7. Radioreceptor assay: theory and applications to pharmacology

    Perret, G. (U.E.R. de Medecine, Sante et Biologie Humaine, 93 - Bobigny (France)); Simon, P. (Faculte de Medecine Pitie-Salpetriere, 75 - Paris (France))

    The aim of the first part of this work is to present the theory of the radioreceptor assay and to compare it to the other techniques of radioanalysis (radioimmunoassay, competitive protein binding assays). The technology of the radioreceptor assay is then presented and its components (preparation of the receptors, radioligand, incubation medium) are described. The analytical characteristics of the radioreceptor assay (specificity, sensitivity, reproducibility, accuracy) and the pharmacological significance of the results are discussed. The second part is devoted to the description of the radioreceptor assays of some pharmacological classes (neuroleptics, tricyclic antidepressants, benzodiazepines, beta-blockers, anticholinergic drugs) and to their use in therapeutic drug monitoring. In conclusion, by their nature, radioreceptor assays are highly sensitive, reliable, precise, accurate and simple to perform. Their chief disadvantage relates to specificity, since any substance having an appreciable affinity for the receptor site will displace the specifically bound radioligand. Paradoxically, in some cases this lack of specificity may be advantageous, in that it allows for the detection not only of the parent compound but of active metabolites and endogenous receptor agonists as well, and in that radioreceptor assays can be devised for a whole pharmacological class and not only for one drug, as is the case for classical physico-chemical techniques. For all these reasons, the future of the radioreceptor assay in pharmacology appears promising.

  8. New Routing Metrics for ADHOC Network Routing Protocols

    Reddy, P. C.


    The performance and reliability of the Internet are measured using different quantities. When the quantities measured are essential and have wide acceptance, they are called metrics. Performance metrics enable comparison and selection among alternatives. In computer networks, metrics are used to evaluate an application, protocol, etc. Routing in ad hoc networks is nontrivial. Routing protocols for ad hoc networks are still evolving, and there is a need for their continuous evaluation. In the existing literature, several routing protocols are evaluated using standard metrics under different conditions. This paper proposes new metrics for the evaluation of routing protocols and uses them to evaluate the ad hoc network routing protocols AODV, DSR, DSDV and TORA. The simulation environment is created using the NS-2 simulator. Typical ranges of speeds, pause times and data rates are used. The results provide new insights into the working of the routing protocols.
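    Two of the standard metrics such evaluations report, packet delivery ratio and mean end-to-end delay, can be computed from a simulation trace as sketched below. The trace format (packet id, send time, receive time or None for a drop) is an illustrative assumption, not the NS-2 trace format:

```python
# Simplified trace: (packet id, send time [s], receive time [s] or None)
trace = [
    (1, 0.00, 0.12),
    (2, 0.10, 0.35),
    (3, 0.20, None),   # dropped en route
    (4, 0.30, 0.41),
]

delivered = [(s, r) for _, s, r in trace if r is not None]

# Packet delivery ratio: delivered packets over all packets sent
pdr = len(delivered) / len(trace)

# Mean end-to-end delay over delivered packets only
mean_delay = sum(r - s for s, r in delivered) / len(delivered)

print(pdr)                      # 0.75
print(round(mean_delay, 3))     # 0.16
```

    Any new metric of the kind the paper proposes would be computed from the same trace by a similar reduction.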


    S. Fathima


    In this work we designed a routing protocol to overcome upcoming challenges in underwater wireless sensor networks. A routing protocol designed for specific roles leads to issues in the network. The major issues for the development of a routing protocol for underwater sensor networks are the harsh deployment environment, low bandwidth, high propagation delay, high energy requirements, temporary losses, fouling and corrosion, and high bit error rates. The issues addressed in this work are low bandwidth, energy efficiency and data delivery. The limitations of existing routing protocols are low data delivery ratio, energy efficiency, bandwidth efficiency and reliability. Three new protocols are designed to overcome these limitations of existing protocols in underwater wireless sensor networks.

  10. Chapter 22: Compressed Air Evaluation Protocol

    Benton, N.


    Compressed-air systems are used widely throughout industry for many operations, including pneumatic tools, packaging and automation equipment, conveyors, and other industrial process operations. Compressed-air systems are defined as a group of subsystems composed of air compressors, air treatment equipment, controls, piping, pneumatic tools, pneumatically powered machinery, and process applications using compressed air. A compressed-air system has three primary functional subsystems: supply, distribution, and demand. Air compressors are the primary energy consumers in a compressed-air system and are the primary focus of this protocol. The two compressed-air energy efficiency measures specifically addressed in this protocol are: high-efficiency/variable speed drive (VSD) compressor replacing modulating compressor; compressed-air leak survey and repairs. This protocol provides direction on how to reliably verify savings from these two measures using a consistent approach for each.
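    The savings-verification arithmetic this protocol standardizes boils down to comparing baseline and reporting-period energy use at equivalent air demand. A hedged sketch; the power draws, hours, and the flat-average model are illustrative assumptions, not figures or methods prescribed by the protocol:

```python
HOURS_PER_YEAR = 8760

def annual_kwh(avg_kw, load_hours=HOURS_PER_YEAR):
    """Annual energy from average electrical demand and loaded hours."""
    return avg_kw * load_hours

# Hypothetical plant: modulating compressor replaced by a VSD unit
baseline_kw = 75.0   # modulating compressor, poor part-load efficiency
vsd_kw      = 52.5   # VSD compressor at the same average airflow

savings_kwh = annual_kwh(baseline_kw) - annual_kwh(vsd_kw)
print(savings_kwh)  # 197100.0
```

    A leak-survey measure would be verified the same way, with the repaired leak flow converted to an average kW reduction first.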

  11. Analysis of Security Protocols by Annotations

    Gao, Han

    The trend in Information Technology is that distributed systems and networks are becoming increasingly important, as most of the services and opportunities that characterise the modern society are based on these technologies. Communication among agents over networks has therefore acquired a great deal of research interest. In order to provide effective and reliable means of communication, more and more communication protocols are invented, and for most of them, security is a significant goal. It has long been a challenge to determine conclusively whether a given protocol is secure or not. The development of formal techniques, e.g. control flow analyses, that can check various security properties, is an important tool to meet this challenge. This dissertation contributes to the development of such techniques. In this dissertation, security protocols are modelled in the process calculus LYSA...

  12. Reliability and safety engineering

    Verma, Ajit Kumar; Karanki, Durga Rao


    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology used in various engineering fields, viz., electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk-informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment, including dynamic system modeling and uncertainty management. Cas...

  13. Composing Interfering Abstract Protocols


    While protocol-based techniques to reason about interference abound, they do not address two practical concerns: the decidability of protocol...

  14. Coded Splitting Tree Protocols

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar


    This paper presents a novel approach to multiple access control called the coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early as possible. Evaluations show that the proposed protocol provides considerable gains over the standard tree splitting protocol applying SIC. The improvement comes at the expense of increased feedback and receiver complexity.

  15. Measurement System Reliability Assessment

    Kłos Ryszard


    Decision-making in problem situations is based on up-to-date and reliable information. A great deal of information is subject to rapid change; hence it may be outdated or manipulated and lead to erroneous decisions. It is crucial to be able to assess the obtained information. In order to ensure its reliability, it is best to obtain it through one's own measurement process. In such a case, assessing the reliability of the measurement system is crucial. The article describes a general approach to assessing the reliability of measurement systems.

  16. Reliable knowledge discovery

    Dai, Honghua; Smirnov, Evgueni


    Reliable Knowledge Discovery focuses on theory, methods, and techniques for RKDD, a new sub-field of KDD. It studies the theory and methods to assure the reliability and trustworthiness of discovered knowledge and to maintain the stability and consistency of knowledge discovery processes. RKDD has a broad spectrum of applications, especially in critical domains like medicine, finance, and the military. Reliable Knowledge Discovery also presents methods and techniques for designing robust knowledge-discovery processes. Approaches to assessing the reliability of the discovered knowledge are introduced.

  17. Reliability of fluid systems

    Kopáček Jaroslav


    This paper focuses on the importance of reliability assessment, especially in complex fluid systems for demanding production technology. The initial criterion for assessing reliability is the failure of an object (element), which is treated as a random variable whose data (values) can be processed using the mathematical methods of probability theory and statistics. The basic indicators of reliability are defined, along with their application in calculations for serial, parallel and backed-up (redundant) systems. For illustration, calculation examples of reliability indicators are given for various elements of the system and for a selected pneumatic circuit.
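    The basic indicators mentioned above can be illustrated for the common constant-failure-rate (exponential) model: the failure rate estimated from observed failure data, the mean time between failures, and the reliability over a mission time. The failure counts and hours are illustrative, not data from the paper:

```python
import math

# Hypothetical field data for one pneumatic element
failures = 4
operating_hours = 20000.0

lam = failures / operating_hours   # failure rate, lambda [1/h]
mtbf = 1.0 / lam                   # mean time between failures [h]

def reliability(t):
    """Survival probability to time t under R(t) = exp(-lambda * t)."""
    return math.exp(-lam * t)

print(mtbf)                        # 5000.0
print(round(reliability(1000), 3)) # 0.819
```

    Element reliabilities obtained this way are then combined with the serial/parallel formulas the paper describes to get the circuit-level figure.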

  18. Circuit design for reliability

    Cao, Yu; Wirth, Gilson


    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective to enhance the reliability of various circuit units.  The authors provide readers with techniques for state of the art and future technologies, ranging from technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides comprehensive review on various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  19. Reliable transmission of security-enabled multimedia over the Internet

    Moore, David E.; Ahmed, Farid


    In this paper we address the reliable transmission of security-enabled multimedia data over the Internet, which is becoming increasingly vulnerable to a variety of cyber-attacks. Because of its real-time nature, multimedia data on the Internet mostly uses the User Datagram Protocol (UDP) as the transport medium, as opposed to the Transmission Control Protocol (TCP). UDP is inherently an unreliable transport medium that results in certain unacknowledged packet losses. Multimedia applications can usually tolerate some packet losses for rendering at the receiver side. But for the security-enhanced multimedia considered here, reliable reception of most of the packets within a certain tolerance time needs to be guaranteed. We therefore propose a new protocol that ensures packet-level reliability as well as stream-level authentication of multimedia.

  20. Microbead agglutination based assays

    Kodzius, Rimantas


    We report a simple and rapid room-temperature assay for point-of-care (POC) testing that is based on specific agglutination. Agglutination tests are based on the aggregation of microbeads in the presence of a specific analyte, enabling macroscopic observation. Such tests are most often used to explore antibody-antigen reactions. Agglutination has been used for protein assays using a biotin/streptavidin system as well as for a hybridization-based assay. Agglutination systems are prone to self-termination of the linking analyte, to active-site saturation, and to loss of agglomeration at high analyte concentrations. We investigated the molecular target/ligand interaction, explaining the common agglutination problems related to analyte self-termination and linkage of the analyte to the same bead instead of different microbeads. We classified the agglutination process into three kinds of assays: a two-component assay, a three-component assay and a stepped three-component assay. Although we compared these three kinds of assays for recognizing DNA and protein molecules, the assay can be used for virtually any molecule, including ions and metabolites. In total, the optimized assay permits detecting analytes with high sensitivity in a short time, 5 min, at room temperature. Such a system is appropriate for POC testing.

  1. Colorimetric protein assay techniques.

    Sapan, C V; Lundblad, R L; Price, N C


    There has been an increase in the number of colorimetric assay techniques for the determination of protein concentration over the past 20 years. This has resulted in a perceived increase in sensitivity and accuracy with the advent of new techniques. The present review considers these advances with emphasis on the potential use of such technologies in the assay of biopharmaceuticals. The techniques reviewed include Coomassie Blue G-250 dye binding (the Bradford assay), the Lowry assay, the bicinchoninic acid assay and the biuret assay. It is shown that each assay has advantages and disadvantages relative to sensitivity, ease of performance, acceptance in the literature, accuracy and reproducibility/coefficient of variation/laboratory-to-laboratory variation. A comparison of the use of several assays with the same sample population is presented. It is suggested that the most critical issue in the use of a chromogenic protein assay for the characterization of a biopharmaceutical is the selection of a standard for the calibration of the assay; it is crucial that the standard be representative of the sample. If it is not possible to match the standard with the sample from the perspective of protein composition, then it is preferable to use an assay that is not sensitive to the composition of the protein such as a micro-Kjeldahl technique, quantitative amino acid analysis or the biuret assay. In a complex mixture it might be inappropriate to focus on a general method of protein determination and much more informative to use specific methods relating to the protein(s) of particular interest, using either specific assays or antibody-based methods. The key point is that whatever method is adopted as the 'gold standard' for a given protein, this method needs to be used routinely for calibration.

  2. Technical note: comparison of the PrestoBlue and LDH release assays with the MTT assay for skin viability assessment.

    Gaucher, Sonia; Jarraya, Mohamed


    MTT assay is the gold standard for assessing skin sample viability but it is time-consuming. Here we compared the MTT test with two other assays for the assessment of skin viability. The MTT, PrestoBlue (colorimetric method) and LDH release assays were applied to fresh and cryopreserved skin. Skin viability was considered proportional to the optical density values of the relevant analytes. PrestoBlue did not reliably distinguish between fresh and cryopreserved skin. The LDH release assay did not allow us to establish a viability index. We recommend the MTT assay for assessing skin viability.

  3. Protocol: a highly sensitive RT-PCR method for detection and quantification of microRNAs

    Walton Eric F


    Full Text Available MicroRNAs (miRNAs) are a class of small non-coding RNAs with a critical role in development and environmental responses. Efficient and reliable detection of miRNAs is an essential step towards understanding their roles in specific cells and tissues. However, the gel-based assays currently used to detect miRNAs are very limited in terms of throughput, sensitivity and specificity. Here we provide protocols for the detection and quantification of miRNAs by RT-PCR. We describe an end-point and a real-time looped RT-PCR procedure and demonstrate detection of miRNAs from as little as 20 pg of plant tissue total RNA and from total RNA isolated from as little as 0.1 μl of phloem sap. In addition, we have developed an alternative real-time PCR assay that can further improve specificity when detecting low-abundance miRNAs. Using this assay, we have demonstrated that miRNAs are differentially expressed in phloem sap and the surrounding vascular tissue. This method enables fast, sensitive and specific miRNA expression profiling and is suitable for high-throughput detection and quantification of miRNA expression.

  4. Correlative Förster Resonance Energy Transfer-Proximity Ligation Assay (FRET-PLA) Technique for Studying Interactions Involving Membrane Proteins.

    Ivanusic, Daniel; Denner, Joachim; Bannert, Norbert


    This unit provides a guide and detailed protocol for studying membrane protein-protein interactions (PPI) using the acceptor-sensitized Förster resonance energy transfer (FRET) method in combination with the proximity ligation assay (PLA). The protocol in this unit is focused on the preparation of FRET-PLA samples and the detection of correlative FRET/PLA signals, as well as on the analysis of FRET-PLA data and interpretation of correlative results, when using cyan fluorescent protein (CFP) as the FRET donor and yellow fluorescent protein (YFP) as the FRET acceptor. The correlative application of FRET and PLA combines two powerful tools for monitoring PPI, yielding results that are more reliable than with either technique alone. © 2016 by John Wiley & Sons, Inc.

  5. A Survey on Underwater Acoustic Sensor Network Routing Protocols

    Li, Ning; Martínez, José-Fernán; Meneses Chaus, Juan Manuel; Eckert, Martina


    Underwater acoustic sensor networks (UASNs) have become increasingly important in ocean exploration applications such as ocean monitoring, pollution detection, ocean resource management, and underwater device maintenance. Because the routing protocol guarantees reliable and effective data transmission from the source node to the destination node, routing protocol design is an attractive topic for researchers, and many routing algorithms have been proposed in recent years. To present the current state of development of UASN routing protocols, we review herein the UASN routing protocol designs reported in recent years. All the routing protocols have been classified into different groups according to their characteristics and routing algorithms: the non-cross-layer design routing protocol, the traditional cross-layer design routing protocol, and the intelligent-algorithm-based routing protocol. This is also the first paper that introduces intelligent-algorithm-based UASN routing protocols. In addition, we investigate the development trends of UASN routing protocols, which can provide researchers with clear and direct insights for further research. PMID:27011193

  6. A Survey on Underwater Acoustic Sensor Network Routing Protocols

    Ning Li


    Full Text Available Underwater acoustic sensor networks (UASNs) have become increasingly important in ocean exploration applications such as ocean monitoring, pollution detection, ocean resource management, and underwater device maintenance. Because the routing protocol guarantees reliable and effective data transmission from the source node to the destination node, routing protocol design is an attractive topic for researchers, and many routing algorithms have been proposed in recent years. To present the current state of development of UASN routing protocols, we review herein the UASN routing protocol designs reported in recent years. All the routing protocols have been classified into different groups according to their characteristics and routing algorithms: the non-cross-layer design routing protocol, the traditional cross-layer design routing protocol, and the intelligent-algorithm-based routing protocol. This is also the first paper that introduces intelligent-algorithm-based UASN routing protocols. In addition, we investigate the development trends of UASN routing protocols, which can provide researchers with clear and direct insights for further research.

  7. A Survey on Underwater Acoustic Sensor Network Routing Protocols.

    Li, Ning; Martínez, José-Fernán; Meneses Chaus, Juan Manuel; Eckert, Martina


    Underwater acoustic sensor networks (UASNs) have become increasingly important in ocean exploration applications such as ocean monitoring, pollution detection, ocean resource management, and underwater device maintenance. Because the routing protocol guarantees reliable and effective data transmission from the source node to the destination node, routing protocol design is an attractive topic for researchers, and many routing algorithms have been proposed in recent years. To present the current state of development of UASN routing protocols, we review herein the UASN routing protocol designs reported in recent years. All the routing protocols have been classified into different groups according to their characteristics and routing algorithms: the non-cross-layer design routing protocol, the traditional cross-layer design routing protocol, and the intelligent-algorithm-based routing protocol. This is also the first paper that introduces intelligent-algorithm-based UASN routing protocols. In addition, we investigate the development trends of UASN routing protocols, which can provide researchers with clear and direct insights for further research.

  8. Fungicide resistance assays for fungal plant pathogens.

    Secor, Gary A; Rivera, Viviana V


    Fungicide resistance assays are useful to determine whether a fungal pathogen has developed resistance to a fungicide used to manage the disease it causes. Laboratory assays are used to determine loss of sensitivity, or resistance, to a fungicide; they can explain fungicide failures and inform successful fungicide recommendations in the field. Laboratory assays for fungicide resistance are conducted by measuring reductions in growth or spore germination of fungi in the presence of fungicide, or by molecular procedures. This chapter describes two techniques for measuring fungicide resistance, using the sugarbeet leaf spot fungus Cercospora beticola as a model for the protocol. Two procedures are described for fungicides from two different classes: growth reduction for triazole (sterol demethylation inhibitor; DMI) fungicides, and inhibition of spore germination for quinone outside inhibitor (QoI) fungicides.

  9. IPv6 Protocol Analyzer


    The emerging next-generation Internet Protocol (IPv6) is expected to replace the current version of the Internet Protocol (IPv4), whose address space will be exhausted in the near future. Besides providing adequate address space, some other new features are included in the new 128-bit IP, such as address auto-configuration, quality of service, simple routing capability, security, mobility and multicasting. Current protocol analyzers are not able to handle IPv6 packets. This paper focuses on developing a protocol analyzer that decodes IPv6 packets. The IPv6 protocol analyzer is an application module that is able to decode an IPv6 packet and provide a detailed breakdown of the construction of the packet. It has to understand the detailed construction of IPv6 and provide a high-level abstraction of the bits and bytes of the IPv6 packet. It thus increases network administrators' understanding of the protocol and helps them solve protocol-related problems in an IPv6 network environment.
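
    As a sketch of what such a decoder does, the fragment below (a minimal illustration, not the analyzer described in the paper) unpacks the 40-byte fixed IPv6 header defined in RFC 8200:

```python
import struct

def decode_ipv6_header(packet: bytes) -> dict:
    """Decode the 40-byte fixed IPv6 header (RFC 8200) into its fields."""
    if len(packet) < 40:
        raise ValueError("IPv6 packet must be at least 40 bytes")
    # First 8 bytes: version/traffic class/flow label, payload length,
    # next header, hop limit -- all in network byte order.
    vtf, payload_len, next_header, hop_limit = struct.unpack("!IHBB", packet[:8])
    return {
        "version": vtf >> 28,
        "traffic_class": (vtf >> 20) & 0xFF,
        "flow_label": vtf & 0xFFFFF,
        "payload_length": payload_len,
        "next_header": next_header,   # e.g. 6 = TCP, 17 = UDP, 58 = ICMPv6
        "hop_limit": hop_limit,
        "src": packet[8:24].hex(),
        "dst": packet[24:40].hex(),
    }

# Minimal synthetic packet: version 6, 8-byte UDP payload, hop limit 64.
hdr = struct.pack("!IHBB", (6 << 28), 8, 17, 64) + bytes(16) + bytes(16)
fields = decode_ipv6_header(hdr)
print(fields["version"], fields["next_header"], fields["hop_limit"])
```

    A full analyzer would then dispatch on `next_header` to walk extension headers and decode the upper-layer protocol.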

  10. LED system reliability

    Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.


    This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. A SSL system is composed of a LED engine with micro-electronic driver(s) that supplies power to the optic design. Knowledge of system level reliability is not only a challenging scientific ex

  11. Principles of Bridge Reliability

    Thoft-Christensen, Palle; Nowak, Andrzej S.

    The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated...

  12. Improving machinery reliability

    Bloch, Heinz P


    This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.

  13. Hawaii Electric System Reliability

    Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)


    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
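
    The cost-integration method described above can be illustrated with a toy model (all numbers hypothetical, not Hawaii data): total cost is the cost of holding reserve capacity plus the expected outage cost valued at customers' value of lost load, and the optimal reserve minimizes the sum.

```python
import math

# Illustrative sketch of reserve-adequacy optimization. The cost figures and
# the exponential unserved-energy model are assumptions for demonstration.

CAPACITY_COST = 90.0          # $/kW-yr to hold one kW of reserve
VALUE_OF_LOST_LOAD = 6.0      # $/kWh customers assign to unserved energy

def expected_unserved_kwh(reserve_kw):
    # Assume unserved energy falls off exponentially with reserve margin.
    return 50000.0 * math.exp(-reserve_kw / 500.0)

def total_cost(reserve_kw):
    return (CAPACITY_COST * reserve_kw
            + VALUE_OF_LOST_LOAD * expected_unserved_kwh(reserve_kw))

# Search a grid of reserve levels for the minimum total cost.
best = min(range(0, 5001, 10), key=total_cost)
print(best)
```

    Under these assumed curves, adding reserve beyond the optimum costs more in capacity than it saves in avoided outages.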

  14. Hawaii electric system reliability.

    Silva Monroy, Cesar Augusto; Loose, Verne William


    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability "worth" and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  15. Absolute nuclear material assay

    Prasad, Manoj K [Pleasanton, CA; Snyderman, Neal J [Berkeley, CA; Rowland, Mark S [Alamo, CA


    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  16. Chapter 9: Reliability

    Algora, Carlos; Espinet-Gonzalez, Pilar; Vazquez, Manuel; Bosco, Nick; Miller, David; Kurtz, Sarah; Rubio, Francisca; McConnell,Robert


    This chapter describes the accumulated knowledge on CPV reliability, with its fundamentals and qualification. It explains the reliability of solar cells, modules (including optics) and plants. The chapter discusses the relevant statistical distributions, namely exponential, normal and Weibull. The treatment of solar cell reliability covers the issues in accelerated aging tests of CPV solar cells, types of failure, and failures in real-time operation. The chapter explores accelerated life tests, namely qualitative life tests (mainly HALT) and quantitative accelerated life tests (QALT). It examines other well-proven and experienced PV cells and/or semiconductor devices that share similar semiconductor materials, manufacturing techniques or operating conditions, namely III-V space solar cells and light-emitting diodes (LEDs). It addresses each of the identified reliability issues and presents the current state-of-the-art knowledge for their testing and evaluation. Finally, the chapter summarizes the CPV qualification and reliability standards.
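
    The Weibull distribution mentioned above is the workhorse of such reliability analyses. A minimal sketch (illustrative parameters, not values from the chapter): the shape parameter beta distinguishes infant mortality (beta < 1), random failures (beta = 1, the exponential special case) and wear-out (beta > 1).

```python
import math

# Sketch of the Weibull reliability model used in accelerated-life analysis.
# beta = shape parameter, eta = characteristic life; values are illustrative.

def weibull_reliability(t, beta, eta):
    """Probability of surviving to time t: R(t) = exp(-(t/eta)^beta)."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    """Instantaneous failure rate; constant when beta == 1."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# With beta = 1 the Weibull reduces to the exponential distribution:
print(round(weibull_reliability(1000, 1.0, 1000), 4))
```

    Fitting beta and eta to accelerated aging test data is what lets qualitative HALT observations become quantitative (QALT) lifetime predictions.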

  17. ATM and Internet protocol

    Bentall, M; Turton, B


    Asynchronous Transfer Mode (ATM) is a protocol that allows data, sound and video transferred between independent networks via ISDN links to be supplied to, and interpreted by, the various system protocols. ATM and Internet Protocol explains the working of the ATM and B-ISDN network for readers with a basic understanding of telecommunications. It provides a handy reference for everyone working with ATM who may not require the full standards in detail, but needs a comprehensive guide to ATM. A substantial section is devoted to the problems of running IP over ATM and there is some discussion o

  18. Playing With Population Protocols

    Xavier Koegler


    Full Text Available Population protocols have been introduced as a model of sensor networks consisting of very limited mobile agents with no control over their own movement: A collection of anonymous agents, modeled by finite automata, interact in pairs according to some rules. Predicates on the initial configurations that can be computed by such protocols have been characterized under several hypotheses. We discuss here whether and when the rules of interactions between agents can be seen as a game from game theory. We do so by discussing several basic protocols.
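
    The model above can be made concrete with a small simulation (an illustrative sketch, not a protocol from the paper): anonymous two-state agents interact in random pairs, and the one-way "epidemic" rule (1, 0) → (1, 1) computes the predicate "does any agent hold a 1?".

```python
import random

# Minimal population-protocol simulation: finite-state agents with no control
# over their movement interact pairwise under a fixed transition rule.

def interact(a, b):
    """Transition rule: an agent holding 1 converts its partner to 1."""
    if 1 in (a, b):
        return 1, 1
    return a, b

def run(states, steps=10000, seed=0):
    rng = random.Random(seed)
    states = list(states)
    for _ in range(steps):
        i, j = rng.sample(range(len(states)), 2)  # random pair interacts
        states[i], states[j] = interact(states[i], states[j])
    return states

# One agent starts in state 1; the epidemic spreads to the whole population.
final = run([1] + [0] * 49)
print(all(s == 1 for s in final))
```

    Viewing the rule as a two-player game, as the paper discusses, amounts to asking when such pairwise transitions can be interpreted as best responses.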

  19. Linear Logical Voting Protocols

    DeYoung, Henry; Schürmann, Carsten


    Current approaches to electronic implementations of voting protocols involve translating legal text to source code of an imperative programming language. Because the gap between legal text and source code is very large, it is difficult to trust that the program meets its legal specification. In response, we promote linear logic as a high-level language for both specifying and implementing voting protocols. Our linear logical specifications of the single-winner first-past-the-post (SW-FPTP) and single transferable vote (STV) protocols demonstrate that this approach leads to concise implementations that closely correspond to their legal specification, thereby increasing trust.

  20. CRPCG—Clustering Routing Protocol based on Connected Graph

    Feng Li


    Full Text Available In order to balance the load between cluster heads, save energy in inter-cluster routing, and enhance the reliability and flexibility of data transmission, this paper proposes a new clustering routing protocol based on a connected graph (CRPCG). The protocol optimizes and innovates in three aspects: cluster head election, cluster formation and inter-cluster routing. Eventually, a connected graph is constituted by the base station and all cluster heads, using well-established algorithms from graph theory to guarantee network connectivity and reliability, improve link quality, balance node energy and prolong the network life cycle. Simulation results show that the protocol significantly prolongs the network life cycle and balances the energy of network nodes, especially in the phase of inter-cluster data transmission, improving the reliability and efficiency of data transmission.
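
    The connectivity property CRPCG maintains can be sketched as a standard graph check (a hypothetical illustration; node positions and radio range are assumptions, not values from the paper): the base station plus all cluster heads must form one connected component.

```python
from collections import deque

# Sketch: verify that the base station (node 0) can reach every cluster head
# over links shorter than the radio range. Coordinates are illustrative.

def in_range(a, b, radio_range=40.0):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 <= radio_range

def is_connected(positions, radio_range=40.0):
    """Breadth-first search from the base station over the in-range graph."""
    seen = {0}
    queue = deque([0])
    while queue:
        u = queue.popleft()
        for v in range(len(positions)):
            if v not in seen and in_range(positions[u], positions[v], radio_range):
                seen.add(v)
                queue.append(v)
    return len(seen) == len(positions)

# Base station at the origin, cluster heads chained within radio range.
nodes = [(0, 0), (30, 0), (60, 0), (60, 30)]
print(is_connected(nodes))
```

    A protocol like CRPCG would rerun such a check (or maintain it incrementally) whenever cluster heads are re-elected.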

  1. Manet Load Balancing Parallel Routing Protocol

    Hesham Arafat Ali


    Full Text Available In recent years, multi-path routing protocols have attracted more attention in mobile ad hoc networks than other routing schemes due to their ability to improve communication bandwidth, increase delivery reliability, and respond to congestion and heavy traffic. Several protocols have been developed to address multi-path routing, but the discovered paths may not be 100% disjoint, data is sent over only one path until it breaks, and the discovery of multiple paths generates more overhead on the network. The Load Balancing Parallel Routing Protocol [LBPRP] tries to solve these multi-path problems by distributing traffic among multiple paths, sending data in parallel over all paths at the same time. We employed a simple test scenario to verify the efficiency of the proposed model and to validate the proposed protocol. [LBPRP] achieves load balancing in sending data, decreasing the end-to-end delay and increasing the packet delivery ratio and throughput; thus the performance of multi-path routing protocols can be improved accordingly.
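
    The core idea, using all discovered paths in parallel rather than one path until it breaks, can be sketched with a simple round-robin scheduler (a hypothetical illustration; the path names and packet IDs are assumptions, not LBPRP internals):

```python
# Sketch of parallel multi-path load balancing: packet i goes to path
# i mod len(paths), so traffic is spread evenly across all paths.

def distribute(packets, paths):
    """Round-robin assignment of packets to paths."""
    schedule = {p: [] for p in paths}
    for i, pkt in enumerate(packets):
        schedule[paths[i % len(paths)]].append(pkt)
    return schedule

paths = ["A", "B", "C"]          # three discovered (ideally disjoint) paths
schedule = distribute(list(range(9)), paths)
print(schedule)
```

    Even this naive scheduler shows why end-to-end delay drops: no single path carries the whole flow, so per-path queues stay shorter.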


    S. Rajeswari; Venkataramani, Y.


    In the Gossip Sleep Protocol, network performance is enhanced based on energy resources, but energy conservation is achieved at the cost of reduced throughput. In this paper, we propose a new protocol for mobile ad hoc networks that achieves reliability together with energy conservation. Based on the probability (p) values, the number of sleeping nodes is fixed initially. The probability value can be adaptively adjusted by a Remote Activated Switch during the transmission process. The adaptiveness of gossiping p...

  3. Testing AMQP protocol on unstable and mobile networks

    LUZURIAGA QUICHIMBO, JORGE ELOY; Pérez, Miguel; Boronat, Pablo; Cano Escribá, Juan Carlos; Tavares De Araujo Cesariny Calafate, Carlos Miguel; Manzoni ., Pietro


    The final publication is available at Springer via 10.1007/978-3-319-11692-1_22 AMQP is a middleware protocol extensively used for exchanging messages in distributed applications. It provides an abstraction of the different participating parts and simplifies communication programming details. AMQP provides reliability features and alleviates the coordination of different entities of an application. However, implementations of this protocol have not been w...

  4. 1996 : Track Count Protocol

    US Fish and Wildlife Service, Department of the Interior — The goal of St. Vincent National Wildlife Refuge's Track Count Protocol is to provide an index to the population size of game animals inhabiting St. Vincent Island.

  5. Quantum deniable authentication protocol

    Shi, Wei-Min; Zhou, Yi-Hua; Yang, Yu-Guang


    Previously proposed quantum identity authentication schemes involve only authentication between two communicators, but communications with deniability are often desired in electronic applications such as online negotiation and electronic voting. In this paper, we propose a quantum deniable authentication protocol. Based on the properties of unitary transformations and a quantum one-way function, this protocol ensures that only the specified receiver can identify the true source of a given message, and that the specified receiver cannot prove the source of the message to a third party, by means of a transcript simulation algorithm. Moreover, quantum key distribution and a quantum encryption algorithm guarantee the unconditional security of the scheme. Security analysis results show that this protocol satisfies the basic security requirements of a deniable authentication protocol, such as completeness and deniability, and can withstand forgery, impersonation and inter-resend attacks.

  6. Unconditionally Secure Protocols

    Meldgaard, Sigurd Torkel

    This thesis contains research on the theory of secure multi-party computation (MPC), especially information-theoretically (as opposed to computationally) secure protocols. It contains results from two main lines of work. One line is on information-theoretically secure oblivious RAMs and how they are used to speed up secure computation. An Oblivious RAM is a construction for a client with a small $O(1)$ internal memory to store $N$ pieces of data on a server while revealing nothing more than the size of the memory $N$ and the number of accesses; this specifically includes hiding the access pattern. We also look at the communication complexity of protocols in this model, and at perfectly secure protocols. We show general protocols for any finite functionality with statistical security and optimal communication complexity (but an exponential amount of preprocessing), and for two...

  7. USA-USSR protocol


    On 30 November the USA Atomic Energy Commission and the USSR State Committee for the Utilization of Atomic Energy signed, in Washington, a protocol 'on carrying out of joint projects in the field of high energy physics at the accelerators of the National Accelerator Laboratory (Batavia) and the Institute for High Energy Physics (Serpukhov)'. The protocol will be in force for five years and can be extended by mutual agreement.

  8. Cognitive Protocol Stack Design


    In the ARO “Cognitive Protocol Stack Design” project we proposed cognitive networking solutions published in international … (not directly related to the protocol stack, e.g., environmental or positioning data) that can be exploited to design and test novel cognitive networking … quality of service (QoS) is challenging. Currently, 5G technologies are being developed to answer the need for further increasing network capacity …

  9. Reliable Communication in Wireless Meshed Networks using Network Coding

    Pahlevani, Peyman; Paramanathan, Achuthan; Hundebøll, Martin; Heide, Janus; Rein, Stephan Alexander; Fitzek, Frank


    The advantages of network coding have been extensively studied in the field of wireless networks. Integrating network coding with the existing IEEE 802.11 MAC layer is a challenging problem. The IEEE 802.11 MAC does not provide any reliability mechanisms for overheard packets. This paper addresses this problem and suggests different mechanisms to support reliability as part of the MAC protocol. Analytical expressions for this problem are given to quantify the performance of the modified network codi...
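
    The coding idea that makes overheard packets valuable can be sketched in a few lines (a simplified COPE-style illustration, not the paper's MAC mechanism): a relay XORs two packets into one broadcast, and each receiver cancels out the packet it already holds to recover the other.

```python
# Sketch of XOR network coding at a relay. Packet contents are illustrative;
# real schemes pad packets to equal length and tag which packets were mixed.

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

pkt_from_alice = b"hello bob!"   # Alice -> relay -> Bob
pkt_from_bob   = b"hi alice!!"   # Bob -> relay -> Alice

coded = xor(pkt_from_alice, pkt_from_bob)   # relay broadcasts ONE coded packet

# Each receiver XORs out its own (known) packet to recover the other's.
decoded_for_bob   = xor(coded, pkt_from_bob)
decoded_for_alice = xor(coded, pkt_from_alice)
print(decoded_for_bob == pkt_from_alice, decoded_for_alice == pkt_from_bob)
```

    The reliability problem the paper targets is exactly this decoding step: if the overheard packet was lost and the MAC gives no acknowledgment for overhearing, the coded broadcast cannot be decoded.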


    A. Suruliandi


    Full Text Available Multicast is a process used to transfer the same message to multiple receivers at the same time. This paper presents the simulation and analysis of the performance of six different multicast routing protocols for wireless sensor networks (WSNs): On-Demand Multicast Routing Protocol (ODMRP), Protocol for Unified Multicasting through Announcement (PUMA), Multicast Ad hoc On-demand Distance Vector Protocol (MAODV), Overlay Boruvka-based Ad hoc Multicast Protocol (OBAMP), Application Layer Multicast Algorithm (ALMA) and an enhanced version of ALMA (ALMA-H). Among them, ODMRP, MAODV and PUMA are reactive protocols, while OBAMP, ALMA and ALMA-H are proactive protocols. This paper compares the performance of these protocols on common parameters such as throughput, reliability, end-to-end delay and packet delivery ratio (PDR), with increasing numbers of nodes and increasing node speed. The main objective of this work is to select the most efficient multicast routing protocol for WSNs among the six, based on the relative strengths and weaknesses of each protocol. A summary of the six multicast routing protocols is presented with a table of different performance characteristics. Experimental results show that ODMRP attains higher throughput, reliability and packet delivery ratio than the other multicast routing protocols, while incurring far less end-to-end delay.
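
    The comparison metrics used above are straightforward to compute from a packet trace. A hypothetical sketch (the trace values are invented for illustration): each record is (send time, receive time or None if lost, packet size in bits).

```python
# Sketch of the standard simulation metrics: packet delivery ratio (PDR),
# mean end-to-end delay, and throughput, computed from a packet trace.

def metrics(trace):
    delivered = [(s, r, b) for s, r, b in trace if r is not None]
    pdr = len(delivered) / len(trace)
    end_to_end_delay = sum(r - s for s, r, _ in delivered) / len(delivered)
    span = max(r for _, r, _ in delivered) - min(s for s, _, _ in trace)
    throughput = sum(b for *_, b in delivered) / span  # bits per second
    return pdr, end_to_end_delay, throughput

trace = [  # (send_time, recv_time, bits); one packet of four is lost
    (0.0, 0.05, 512), (0.1, 0.18, 512), (0.2, None, 512), (0.3, 0.36, 512),
]
pdr, delay, tput = metrics(trace)
print(pdr, round(delay, 3), round(tput, 1))
```

    Ranking protocols then reduces to comparing these tuples across simulation runs with varying node counts and speeds.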

  11. Photovoltaic system reliability

    Maish, A.B.; Atcitty, C. [Sandia National Labs., NM (United States); Greenberg, D. [Ascension Technology, Inc., Lincoln Center, MA (United States)] [and others]


    This paper discusses the reliability of several photovoltaic projects including SMUD's PV Pioneer project, various projects monitored by Ascension Technology, and the Colorado Parks project. System times-to-failure range from 1 to 16 years, and maintenance costs range from 1 to 16 cents per kilowatt-hour. Factors contributing to the reliability of these systems are discussed, and practices are recommended that can be applied to future projects. This paper also discusses the methodology used to collect and analyze PV system reliability data.

  12. Structural Reliability Methods

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature...

  13. High-Bandwidth Tactical-Network Data Analysis in a High-Performance-Computing (HPC) Environment: Transport Protocol (Transmission Control Protocol/User Datagram Protocol [TCP/UDP]) Analysis


    Kenneth D Renard and James R Adametz

    … guaranteed and ordered delivery of data from application to application. Internet Protocol (IP) does not guarantee either of these capabilities. … that require reliable transport services, careful examination of TCP protocol performance is necessary. Data collected during test events includes IP …

  14. Reliable Electronic Equipment

    N. A. Nayak


    Full Text Available The reliability aspects of electronic equipment are discussed. To obtain optimum results, close cooperation between the components engineer, the design engineer and the production engineer is suggested.

  15. Reliability prediction techniques

    Whittaker, B.; Worthington, B.; Lord, J.F.; Pinkard, D.


    The paper demonstrates the feasibility of applying reliability assessment techniques to mining equipment. A number of techniques are identified and described, and examples of their use in assessing mining equipment are given. These techniques include reliability prediction, failure analysis, design audit, maintainability, availability and life cycle costing. Specific conclusions regarding the usefulness of each technique are outlined. The choice of technique depends upon both the type of equipment being assessed and its stage of development, with numerical prediction best suited to electronic equipment, and fault analysis and design audit suited to mechanical equipment. Reliability assessments involve much detailed and time-consuming work, but it has been demonstrated that the resulting reliability improvements lead to savings in service costs which more than offset the cost of the evaluation.

  16. FLIPR assays of intracellular calcium in GPCR drug discovery

    Hansen, Kasper Bø; Bräuner-Osborne, Hans


    Fluorescent dyes sensitive to changes in intracellular calcium have become increasingly popular in G protein-coupled receptor (GPCR) drug discovery for several reasons. First of all, the assays using the dyes are easy to perform and are of low cost compared to other assays. Second, most non-Galph...... making them obtainable even for academic groups. Here, we present a protocol for measuring changes in intracellular calcium levels in living mammalian cells based on the fluorescent calcium binding dye, fluo-4....

  17. A double candidate survivable routing protocol for HAP network

    He, Panfeng; Li, Chunyue; Ni, Shuyan


    To improve HAP network invulnerability while accounting for the quasi-dynamic topology of HAP networks, a simple and reliable routing protocol is proposed in this paper. The protocol first uses a double-candidate strategy for next-node selection to provide better robustness. Then, during the maintenance stage, short hello packets, instead of long routing packets, are used only to check link connectivity in the quasi-dynamic HAP network. The route maintenance scheme based on short hello packets greatly reduces link overhead. Simulation results based on OPNET demonstrate the effectiveness of the proposed routing protocol.

  18. The rating reliability calculator

    Solomon David J


    Full Text Available Abstract Background Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open-source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program will upload them to the server for calculating the reliability and other statistics describing the ratings. Results When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion This simple Web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to obtain complete rating data. I would welcome other researchers revising and enhancing the program.
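
    The Spearman-Brown prophecy formula mentioned above is simple enough to sketch directly (the single-rating reliability of 0.60 below is an illustrative value, not one from the paper): it predicts the reliability of an average of k ratings from the reliability r of a single rating.

```python
# Spearman-Brown prophecy formula: reliability of the mean of k parallel
# ratings, given single-rating reliability r.

def spearman_brown(r, k):
    return (k * r) / (1 + (k - 1) * r)

# A single judge with reliability 0.60; averaging more judges raises it.
for k in (1, 2, 4):
    print(k, round(spearman_brown(0.60, k), 3))
```

    Read in reverse, the same formula tells an evaluator how many judges are needed to reach a target reliability.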

  19. Reliability of power connections

    BRAUNOVIC Milenko


    Despite the use of various preventive maintenance measures, there are still a number of problem areas that can adversely affect system reliability. Also, economic constraints have pushed the designs of power connections closer to the limits allowed by the existing standards. The major parameters influencing the reliability and life of Al-Al and Al-Cu connections are identified. The effectiveness of various palliative measures is determined, and misconceptions about their effectiveness are dealt with in detail.

  20. Design and Analysis for Reliability of Wireless Sensor Network

    Yongxian Song


    Full Text Available Reliability is an important performance indicator of wireless sensor networks; for application fields with high reliability demands, ensuring network reliability is particularly important. There is already a substantial body of research on wireless sensor network reliability, but it mainly improves reliability through network topology, reliable protocols, application-layer fault correction, and so on; work that considers reliability comprehensively from both the hardware and software aspects is much scarcer. This paper adopts bionic hardware to implement bionic reconfiguration of wireless sensor network nodes, so that nodes can change their structure and behavior autonomously and dynamically when part of the hardware fails, achieving bionic self-healing. Secondly, a Markov state diagram and probability analysis are adopted to solve a functional reliability model, establish the relationship between reliability and the characteristic parameters of sink nodes, and analyze the sink-node reliability model, so as to determine reasonable model parameters and ensure the reliability of sink nodes.
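
The paper's sink-node model is not reproduced in the abstract; as a minimal illustration of the Markov-state approach, the sketch below computes the steady-state availability of a self-healing node modeled as a two-state (working/failed) Markov chain. The failure and repair rates are hypothetical parameters:

```python
def steady_state_availability(failure_rate: float, repair_rate: float) -> float:
    """Two-state Markov chain (working <-> failed):
    steady-state P(working) = mu / (lambda + mu)."""
    return repair_rate / (failure_rate + repair_rate)

# A node failing once per 1000 h that self-heals in 10 h on average:
# steady_state_availability(0.001, 0.1) is roughly 0.990
```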

  1. Performance Analysis of MAC Layer Protocols in Wireless Sensor Network

    Hameeza Ahmed


    Full Text Available Media Access Control (MAC) layer protocols have a critical role in making a typical Wireless Sensor Network (WSN) more reliable and efficient. The choice of MAC layer protocol, together with other factors including the number of nodes, mobility, traffic rate and playground size, dictates the performance of a particular WSN. In this paper, the performance of an experimental WSN is evaluated using different MAC layer protocols. In this experiment, a WSN is created using the OMNeT++ MiXiM network simulator and its performance in terms of packet delivery ratio and mean latency is evaluated. The simulation results show that the IEEE 802.11 MAC layer protocol performs better than the CSMA, B-MAC and IEEE 802.15.4 MAC layer protocols. In the considered scenario, IEEE 802.15.4 is ranked second in performance, followed by CSMA and B-MAC.
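
The two metrics used in the evaluation are straightforward to compute from simulation traces; a sketch (the trace format, mapping packet id to timestamp, is an assumption, not the simulator's native output):

```python
def pdr_and_mean_latency(sent_times: dict, recv_times: dict):
    """sent_times / recv_times map packet id -> timestamp (seconds).
    Returns (packet delivery ratio, mean end-to-end latency of
    delivered packets)."""
    delivered = [p for p in sent_times if p in recv_times]
    pdr = len(delivered) / len(sent_times)
    mean_latency = sum(recv_times[p] - sent_times[p]
                       for p in delivered) / len(delivered)
    return pdr, mean_latency
```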

  2. Multidisciplinary System Reliability Analysis

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)


    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer and fluid flow disciplines.

  3. Sensitivity Analysis of Component Reliability



    Every component in a system has a unique position within the system and unique failure characteristics. When a component's reliability is changed, its effect on system reliability is not equal to that of other components. Component reliability sensitivity is a measure of the effect on system reliability when a component's reliability is changed. In this paper, the definition and relative matrix of component reliability sensitivity are proposed, and some of their characteristics are analyzed. All of this will help us analyze and improve system reliability.
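
The abstract does not give the sensitivity measure in closed form; for a series system with reliability R = p1 * p2 * ... * pn, the classic Birnbaum sensitivity dR/dp_i reduces to the product of the other components' reliabilities. A minimal sketch under that series-system assumption (which is ours, not the paper's):

```python
from math import prod

def series_reliability(p: list) -> float:
    """Reliability of a series system with component reliabilities p."""
    return prod(p)

def reliability_sensitivity(p: list, i: int) -> float:
    """Birnbaum sensitivity dR/dp_i for a series system:
    the product of all other components' reliabilities."""
    return prod(pj for j, pj in enumerate(p) if j != i)

# Note: the least reliable component has the highest sensitivity,
# so improving it yields the largest gain in system reliability.
```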

  4. Realization of Timed Reliable Communication over Off-The-Shelf Wireless Technologies

    Malinowsky, B.; Groenbaek, Jesper; Schwefel, Hans-Peter


    Industrial and safety-critical applications pose strict requirements for timeliness and reliability for the communication solution. Thereby the use of off-the-shelf (OTS) wireless communication technologies can be attractive to achieve low cost and easy deployment. This paper presents and analyses...... a protocol and its analytical model, enabling to configure for explicit timeliness and message reliability requirements under different link technologies and conditions. We assess the timing behavior and reliability properties studying a scenario of distributing safety-critical alerts. Our evaluation covers...... and link measurements to contribute to a (self-)configurable timed reliable protocol deployed on multiple technologies....

  5. In vitro detection of contact allergens: development of an optimized protocol using human peripheral blood monocyte-derived dendritic cells.

    Reuter, Hendrik; Spieker, Jochem; Gerlach, Silke; Engels, Ursula; Pape, Wolfgang; Kolbe, Ludger; Schmucker, Robert; Wenck, Horst; Diembeck, Walter; Wittern, Klaus-Peter; Reisinger, Kerstin; Schepky, Andreas G


    Allergic contact dermatitis is a delayed T-cell mediated allergic response associated with relevant social and economic impacts. Animal experiments (e.g. the local lymph node assay) are still supplying most of the data used to assess the sensitization potential of new chemicals. However, the 7th amendment to the EU Cosmetic Directive will introduce a testing ban for cosmetic ingredients after 2013. In vitro alternative methods are thus being actively developed. Although promising results have been obtained with cell lines, their reduced functionality and inherent genomic instability led us to reinvestigate the use of peripheral blood monocyte-derived dendritic cells (PBMDCs) for the establishment of a reliable in vitro sensitization test. To solve the issues associated with the use of primary cells, the culture and exposure conditions (cytokine concentrations, incubation time, readout, pooled vs. single donors and cytotoxicity) were re-assessed and optimized. Here we propose a stable and reproducible protocol based on PBMDCs. This should allow a wider acceptance of PBMDCs as a reliable test system for the detection of human skin sensitizers and the inclusion of this protocol in an integrated testing strategy.

  6. Protocol: A simple phenol-based method for 96-well extraction of high quality RNA from Arabidopsis

    Coustham Vincent


    Full Text Available Abstract Background Many experiments in modern plant molecular biology require the processing of large numbers of samples for a variety of applications from mutant screens to the analysis of natural variants. A severe bottleneck to many such analyses is the acquisition of good yields of high quality RNA suitable for use in sensitive downstream applications such as real time quantitative reverse-transcription-polymerase chain reaction (real time qRT-PCR. Although several commercial kits are available for high-throughput RNA extraction in 96-well format, only one non-kit method has been described in the literature using the commercial reagent TRIZOL. Results We describe an unusual phenomenon when using TRIZOL reagent with young Arabidopsis seedlings. This prompted us to develop a high-throughput RNA extraction protocol (HTP96 adapted from a well established phenol:chloroform-LiCl method (P:C-L that is cheap, reliable and requires no specialist equipment. With this protocol 192 high quality RNA samples can be prepared in 96-well format in three hours (less than 1 minute per sample with less than 1% loss of samples. We demonstrate that the RNA derived from this protocol is of high quality and suitable for use in real time qRT-PCR assays. Conclusion The development of the HTP96 protocol has vastly increased our sample throughput, allowing us to fully exploit the large sample capacity of modern real time qRT-PCR thermocyclers, now commonplace in many labs, and develop an effective high-throughput gene expression platform. We propose that the HTP96 protocol will significantly benefit any plant scientist with the task of obtaining hundreds of high quality RNA extractions.

  7. Cochleotoxicity monitoring protocol.

    Ferreira Penêda, José; Barros Lima, Nuno; Ribeiro, Leandro; Helena, Diamantino; Domingues, Bruno; Condé, Artur


    Cochlear damage is frequent in long-term aminoglycoside therapy and in chemotherapeutic treatments with platinum-based agents. Despite its prevalence, it is currently underestimated and underdiagnosed. A monitoring protocol is vital to the early detection of cochleotoxicity, and its implementation is widely encouraged in every hospital unit. Our aim was to elaborate a cochleotoxicity monitoring protocol for patients treated with platinum compounds or aminoglycoside antibiotics. The PubMed® database was searched using terms relevant to drug cochleotoxicity in order to identify the most adequate protocol. Several articles and guidelines influenced our decision. There is no consensus on a universal monitoring protocol. Its formulation and application rely heavily on available resources and personnel. High-frequency audiometry and otoacoustic emissions play an important role in the early detection of cochleotoxicity caused by aminoglycoside antibiotics and platinum compounds. A cochleotoxicity monitoring protocol consisting of an initial evaluation, treatment follow-up and post-treatment evaluation is proposed. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Otorrinolaringología y Cirugía de Cabeza y Cuello. All rights reserved.

  8. Assays for in vitro monitoring of human airway smooth muscle (ASM) and human pulmonary arterial vascular smooth muscle (VSM) cell migration.

    Goncharova, Elena A; Goncharov, Dmitry A; Krymskaya, Vera P


    Migration of human pulmonary vascular smooth muscle (VSM) cells contributes to vascular remodeling in pulmonary arterial hypertension and atherosclerosis. Evidence also indicates that, in part, migration of airway smooth muscle (ASM) cells may contribute to airway remodeling associated with asthma. Here we describe migration of VSM and ASM cells in vitro using Transwell or Boyden chamber assays. Because dissecting signaling mechanisms regulating cell migration requires molecular approaches, our protocol also describes how to assess migration of transfected VSM and ASM cells. Transwell or Boyden chamber assays can be completed in approximately 8 h and include plating of serum-deprived VSM or ASM cell suspension on membrane precoated with collagen, migration of cells toward chemotactic gradient and visual (Transwell) or digital (Boyden chamber) analysis of membrane. Although the Transwell assay is easy, the Boyden chamber assay requires hands-on experience; however, both assays are reliable cell-based approaches providing valuable information on how chemotactic and inflammatory factors modulate VSM and ASM migration.

  9. Optimisation for assay of fluorescein diacetate hydrolytic activity as a sensitive tool to evaluate impacts of pollutants and nutrients on microbial activity in coastal sediments.

    Jiang, Shan; Huang, Jing; Lu, Haoliang; Liu, JingChun; Yan, Chongling


    Fluorescein diacetate (FDA) assay has been widely applied in coastal research to quantify microbial activity in sediments. However, the FDA assay procedures currently embodied in sediment studies potentially include operational errors, since the protocol was established for studies of terrestrial soil. In the present study, we optimised the FDA assay procedure using sandy and cohesive sediments to improve experimental sensitivity and reproducibility. The optimised method describes quantitative measurement of the fluorescein produced when 1.0 g of fresh sediment is incubated with 50 mM phosphate buffer solution (pH 7.3) and glass beads (2 g) at 35°C for 1 h under rotation at 50 rpm. The coefficient of variation of the optimised method ranged from 1.9% to 3.8% and the method sensitivity ranged from 0.25 to 1.57. The improved protocol provides a more reliable measurement of the FDA hydrolysis rate over a wide range of sediments compared to the original method.

  10. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for detection of genotoxic carcinogens: II. Summary of definitive validation study results.

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamistu; Schechtman, Leonard M; Tice, Raymond R; Beevers, Carol; De Boeck, Marlies; Burlinson, Brian; Hobbs, Cheryl A; Kitamoto, Sachiko; Kraynak, Andrew R; McNamee, James; Nakagawa, Yuzuki; Pant, Kamala; Plappert-Helbig, Ulla; Priestley, Catherine; Takasawa, Hironao; Wada, Kunio; Wirnitzer, Uta; Asano, Norihide; Escobar, Patricia A; Lovell, David; Morita, Takeshi; Nakajima, Madoka; Ohno, Yasuo; Hayashi, Makoto


    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this exercise was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The study protocol was optimized in the pre-validation studies, and then the definitive (4th phase) validation study was conducted in two steps. In the 1st step, assay reproducibility was confirmed among laboratories using four coded reference chemicals and the positive control ethyl methanesulfonate. In the 2nd step, the predictive capability was investigated using 40 coded chemicals with known genotoxic and carcinogenic activity (i.e., genotoxic carcinogens, genotoxic non-carcinogens, non-genotoxic carcinogens, and non-genotoxic non-carcinogens). Based on the results obtained, the in vivo comet assay is concluded to be highly capable of identifying genotoxic chemicals and therefore can serve as a reliable predictor of rodent carcinogenicity.

  11. Cell viability assays: introduction.

    Stoddart, Martin J


    The measurement of cell viability plays a fundamental role in all forms of cell culture. Sometimes it is the main purpose of the experiment, such as in toxicity assays. Alternatively, cell viability can be used to correlate cell behaviour to cell number, providing a more accurate picture of, for example, anabolic activity. There is a wide array of cell viability methods, ranging from the routine trypan blue dye exclusion assay to highly complex analyses of individual cells, such as by Raman microscopy. The cost, speed, and complexity of the equipment required will all play a role in determining the assay used. This chapter aims to provide an overview of many of the assays available today.

  12. Transgenic Animal Mutation Assays

    Tao Chen, Ph.D., D.A.B.T.


    The novel transgenic mouse and rat mutation assays have provided a tool for analyzing in vivo mutation in any tissue, thus permitting the direct comparison of cancer incidence with mutant frequency.

  13. Assays for thrombopoietin

    McDonald, T.P.


    In summary, thrombopoietin levels have been determined indirectly by measuring thrombocytopoiesis in assay animals (platelet counting, measurement of isotope incorporation into newly formed platelets, changes in platelet sizes, or alterations in number and size of megakaryocytes) and by use of an immunoassay. Although much work remains, it seems clear at the present time that isotopic uptake into platelets of specially prepared assay mice (rebound-thrombocytosis) is superior to the other techniques now available for the measurement of thrombopoietin. However, the ideal assay for TSF which is specific, rapid, and inexpensive is yet to be developed. An immunoassay is in the development stage, but will require additional work before it can be utilized for the routine assay of TSF.

  14. 75 FR 43059 - Mandatory Reliability Standards for the Calculation of Available Transfer Capability, Capacity...


    ...-Power System; and Standards for Business Practices and Communications Protocols for Public Utilities..., Order No. 729-A, 131 FERC ¶ 61,109 (2010). \\2\\ Standards for Business Practices and Communication... of Available Transfer Capability, Capacity Benefit Margins, Transmission Reliability Margins,...

  15. New Rapid Spore Assay

    Kminek, Gerhard; Conley, Catharine


    The presentation will detail approved Planetary Protection specifications for the Rapid Spore Assay for spacecraft components and subsystems. Outlined will be the research and studies on which the specifications were based. The research, funded by ESA and NASA/JPL, was conducted over a period of two years and was followed by limited cleanroom studies to assess the feasibility of this assay during spacecraft assembly.

  16. Reliability and Validity of the Standing Heel-Rise Test

    Yocum, Allison; McCoy, Sarah Westcott; Bjornson, Kristie F.; Mullens, Pamela; Burton, Gay Naganuma


    A standardized protocol for a pediatric heel-rise test was developed and reliability and validity are reported. Fifty-seven children developing typically (CDT) and 34 children with plantar flexion weakness performed three tests: unilateral heel rise, vertical jump, and force measurement using handheld dynamometry. Intraclass correlation…

  17. Cytoskeleton - Methods and Protocols

    CarloAlberto Redi


    Full Text Available Cytoskeleton - Methods and Protocols. Second edition, 2010; Ray H. Gavin (Ed.); Springer Protocols - Methods in Molecular Biology, vol. 586; Humana Press, Totowa, New Jersey (USA); Pages: 390; €95.44; ISBN: 978-1-60761-375-6. Ray H. Gavin, from Brooklyn College of The City University of New York, Brooklyn, NY, USA, wrote a few lines as a preface to this book. This is quite understandable: there is no great need of words when there are facts that sustain and favour the dissemination of a cultural product. This is the case of the second edition of Cytoskeleton - Methods and Protocols, which appears just ten years after the first edition...

  18. DNA repair protocols

    Bjergbæk, Lotte

    In its 3rd edition, this Methods in Molecular Biology(TM) book covers the eukaryotic response to genomic insult including advanced protocols and standard techniques in the field of DNA repair. Offers expert guidance for DNA repair, recombination, and replication. Current knowledge of the mechanisms...... that regulate DNA repair has grown significantly over the past years with technology advances such as RNA interference, advanced proteomics and microscopy as well as high throughput screens. The third edition of DNA Repair Protocols covers various aspects of the eukaryotic response to genomic insult including...... recent advanced protocols as well as standard techniques used in the field of DNA repair. Both mammalian and non-mammalian model organisms are covered in the book, and many of the techniques can be applied with only minor modifications to other systems than the one described. Written in the highly...

  19. Blind Cognitive MAC Protocols

    Mehanna, Omar; Gamal, Hesham El


    We consider the design of cognitive Medium Access Control (MAC) protocols enabling an unlicensed (secondary) transmitter-receiver pair to communicate over the idle periods of a set of licensed channels, i.e., the primary network. The objective is to maximize data throughput while maintaining the synchronization between secondary users and avoiding interference with licensed (primary) users. No statistical information about the primary traffic is assumed to be available a-priori to the secondary user. We investigate two distinct sensing scenarios. In the first, the secondary transmitter is capable of sensing all the primary channels, whereas it senses one channel only in the second scenario. In both cases, we propose MAC protocols that efficiently learn the statistics of the primary traffic online. Our simulation results demonstrate that the proposed blind protocols asymptotically achieve the throughput obtained when prior knowledge of primary traffic statistics is available.
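
The abstract does not specify the learning rule; one common way to learn idle statistics online in the single-channel-sensing scenario is an epsilon-greedy estimate of each channel's empirical idle probability. The sketch below is illustrative only, not the authors' protocol, and its parameters are assumptions:

```python
import random

def choose_channel(idle_counts: list, sense_counts: list, eps: float = 0.1) -> int:
    """Pick the next primary channel to sense.
    idle_counts[c]: times channel c was observed idle;
    sense_counts[c]: times channel c was sensed.
    Explores with probability eps, otherwise exploits the channel
    with the highest empirical idle rate."""
    n = len(idle_counts)
    unsensed = [c for c in range(n) if sense_counts[c] == 0]
    if unsensed:                      # sense every channel at least once
        return unsensed[0]
    if random.random() < eps:         # exploration
        return random.randrange(n)
    return max(range(n), key=lambda c: idle_counts[c] / sense_counts[c])
```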


    Tamargazin, O. A.; National Aviation University; Vlasenko, P. O.; National Aviation University


    The airline's operational structure for Reliability Program implementation — engineering division, reliability division, reliability control division, aircraft maintenance division, quality assurance division — was considered. The airline's Reliability Program structure is shown. Use of the Reliability Program to reduce aircraft maintenance costs is proposed.

  1. Ultra reliability at NASA

    Shapiro, Andrew A.


    Ultra reliable systems are critical to NASA, particularly as consideration is being given to extended lunar missions and manned missions to Mars. NASA has formulated a program designed to improve the reliability of NASA systems. The long-term goal for NASA ultra reliability is ultimately to improve NASA systems by an order of magnitude. The approach outlined in this presentation involves the steps used in developing a strategic plan to achieve the long-term objective of ultra reliability. Consideration is given to: complex systems, hardware (including aircraft, aerospace craft and launch vehicles), software, human interactions, long-life missions, infrastructure development, and cross-cutting technologies. Several NASA-wide workshops have been held, identifying issues for reliability improvement and providing mitigation strategies for these issues. In addition to representation from all of the NASA centers, experts from government (NASA and non-NASA), universities and industry participated. Highlights of a strategic plan, which is being developed using the results from these workshops, will be presented.

  2. Photovoltaic module reliability workshop

    Mrig, L. (ed.)


    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986--1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, there is still a need for substantial research and testing to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  3. Controlling variation in the comet assay

    Andrew Richard Collins


    Full Text Available Variability of the comet assay is a serious issue, whether it occurs from experiment to experiment in the same laboratory, or between different laboratories analysing identical samples. Do we have to live with high variability, just because the comet assay is a biological assay rather than analytical chemistry? Numerous attempts have been made to limit variability by standardising the assay protocol, and the critical steps in the assay have been identified; agarose concentration, duration of alkaline incubation, and electrophoresis conditions (time, temperature and voltage gradient) are particularly important. Even when these are controlled, variation seems to be inevitable. It is helpful to include reference standards in experiments, i.e. cells with a known amount of specific damage to the DNA. They can be aliquots frozen from a single large batch of cells, either untreated (negative controls) or treated with, for example, H2O2 or X-rays to induce strand breaks (positive control for the basic assay), or photosensitiser plus light to oxidise guanine (positive control for Fpg- or OGG1-sensitive sites). Reference standards are especially valuable when performing a series of experiments over a long period - for example, analysing samples of white blood cells from a large human biomonitoring trial - to check that the assay is performing consistently, and to identify anomalous results necessitating a repeat experiment. The reference values of tail intensity can also be used to iron out small variations occurring from day to day. We present examples of the use of reference standards in human trials, both within one laboratory and between different laboratories, and describe procedures that can be used to control variation.

  4. Use of a Brine Shrimp Assay to Study Herbal Teas in the Classroom.

    Opler, Annette; Mizell, Rebecca; Robert, Alexander; Cervantes-Cervantes, Miguel; Kincaid, Dwight; Kennelly, Edward J.


    Introduces a brine shrimp assay to demonstrate the effects of the biological activity of herbal remedies. Describes two protocols, one using aqueous extracts and the other using methanol extracts. (Contains 21 references.) (YDS)

  5. A protocol for isolation and enriched monolayer cultivation of neural precursor cells from mouse dentate gyrus

    Harish eBabu


    Full Text Available In vitro assays are valuable tools to study the characteristics of adult neural precursor cells under controlled conditions with a defined set of parameters. We present here a detailed protocol based on our previous original publication (Babu et al., Enriched monolayer precursor cell cultures from micro-dissected adult mouse dentate gyrus yield functional granule cell-like neurons, PLoS One 2007, 2:e388) to isolate neural precursor cells from the hippocampus of adult mice and maintain and propagate them as adherent monolayer cultures. The strategy is based on the use of Percoll density gradient centrifugation to enrich precursor cells from the micro-dissected dentate gyrus. Based on the expression of Nestin and Sox2, a culture purity of more than 98% can be achieved. The cultures are expanded under serum-free conditions in Neurobasal A medium with addition of the mitogens EGF and FGF2 as well as the supplements Glutamax-1 and B27. Under differentiation conditions, the precursor cells reliably generate approximately 30% neurons with appropriate morphological, molecular and electrophysiological characteristics that might reflect granule cell properties of their in vivo counterparts. We also highlight potential modifications to the protocol.

  6. IP Routing Protocols

    Nolasco Pinto, Armando


    Uyless Black is a widely known expert in computer networks and data communications. He is author of more than ten books in the communication technologies field, which puts him in a good position to address this topic. In IP Routing Protocols he starts by providing the background and concepts required for understanding TCP/IP technology. This is done clearly and assumes little prior knowledge of the area. As might be expected, he emphasizes the IP route discovery problem. Later he details several routing protocols.

  7. Apoptosis - Methods and Protocols

    CarloAlberto Redi


    Full Text Available Apoptosis - Methods and ProtocolsSecond edition, 2009; Peter Erhardt and Ambrus Toth (Eds; Springer Protocols - Methods in molecular biology, vol. 559; Humana press, Totowa, New Jersey (USA; Pages: 400; €88.35; ISBN: 978-1-60327-016-8The editors rightly begin the preface telling us that: “The ability to detect and quantify apoptosis, to understand its biochemistry and to identify its regulatory genes and proteins is crucial to biomedical research”. Nowadays this is a grounding concept of biology and medicine. What is particularly remarkable...

  8. Reliability Centered Maintenance - Methodologies

    Kammerer, Catherine C.


    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  9. Comparison of different PCR protocols for the detection and diagnosis of Plasmodium falciparum.

    Oster, N; Abdel-Aziz, I Z; Stich, A; Coulibaly, B; Kouyatè, B; Andrews, K T; McLean, J E; Lanzer, M


    An assessment of differing PCR protocols for the diagnosis of Plasmodium falciparum infection was performed on samples from an area of holoendemic malaria transmission in western Burkina Faso. The PCR protocols had generally high sensitivities (>92%) and specificities (>69%), but the negative predictive values (NPV) were moderate and differed widely among the protocols tested. The PCR protocols that amplified either the P. falciparum pfcrt gene or the small-subunit ribosomal DNA were the most reliable diagnostic tools. However, the moderate NPV implies that more than one PCR protocol should be used for diagnosis in holoendemic areas.
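
The abstract's observation that NPV is only moderate in a holoendemic area follows directly from Bayes' rule: NPV falls as prevalence rises, even with fixed sensitivity and specificity. A sketch using the abstract's lower bounds (the prevalence values are illustrative):

```python
def negative_predictive_value(sensitivity: float, specificity: float,
                              prevalence: float) -> float:
    """NPV = P(no infection | negative test), via Bayes' rule."""
    true_neg = specificity * (1 - prevalence)
    false_neg = (1 - sensitivity) * prevalence
    return true_neg / (true_neg + false_neg)

# With sensitivity 0.92 and specificity 0.69:
# at 10% prevalence NPV is roughly 0.99; at 70% it drops to roughly 0.79
```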

  10. Assays for laboratory confirmation of novel human coronavirus (hCoV-EMC) infections.

    Corman, V M; Müller, M A; Costabel, U; Timm, J; Binger, T; Meyer, B; Kreher, P; Lattwein, E; Eschbach-Bludau, M; Nitsche, A; Bleicker, T; Landt, O; Schweiger, B; Drexler, J F; Osterhaus, A D; Haagmans, B L; Dittmer, U; Bonin, F; Wolff, T; Drosten, C


    We present a rigorously validated and highly sensitive confirmatory real-time RT-PCR assay (1A assay) that can be used in combination with the previously reported upE assay. Two additional RT-PCR assays for sequencing are described, targeting the RdRp gene (RdRpSeq assay) and N gene (NSeq assay), where an insertion/deletion polymorphism might exist among different hCoV-EMC strains. Finally, a simplified and biologically safe protocol for detection of antibody response by immunofluorescence microscopy was developed using convalescent patient serum.

  11. Gearbox Reliability Collaborative Update (Presentation)

    Sheng, S.; Keller, J.; Glinsky, C.


    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  12. A reliable routing algorithm based on fuzzy Petri net in mobile ad hoc networks

    HU Zhi-gang; MA Hao; WANG Guo-jun; LIAO Lin


    A novel reliable routing algorithm in mobile ad hoc networks using fuzzy Petri net with its reasoning mechanism was proposed to increase the reliability during the routing selection. The algorithm allows the structured representation of network topology, which has a fuzzy reasoning mechanism for finding the routing sprouting tree from the source node to the destination node in the mobile ad hoc environment. Finally, by comparing the degree of reliability in the routing sprouting tree, the most reliable route can be computed. The algorithm not only offers the local reliability between each neighboring node, but also provides global reliability for the whole selected route. The algorithm can be applied to most existing on-demand routing protocols, and the simulation results show that the routing reliability is increased by more than 80% when applying the proposed algorithm to the ad hoc on demand distance vector routing protocol.
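
The final comparison step can be illustrated without the Petri-net machinery: if each link carries a reliability in [0, 1], a route's reliability can be taken as the product of its link reliabilities, and the most reliable route maximizes that product (a simplified reading of the algorithm with hypothetical numbers, not the authors' implementation):

```python
from math import prod

# Hypothetical link reliabilities for a small ad hoc topology
# (illustrative values, not from the paper).
link_rel = {("s", "a"): 0.9, ("a", "t"): 0.9, ("s", "b"): 0.99, ("b", "t"): 0.8}

def route_reliability(route):
    """Reliability of a route = product of its link reliabilities."""
    return prod(link_rel[edge] for edge in zip(route, route[1:]))

# Candidate routes from source s to destination t.
routes = [["s", "a", "t"], ["s", "b", "t"]]
best = max(routes, key=route_reliability)
print(best, round(route_reliability(best), 3))
```

Note how a route with one excellent link can still lose to a route whose links are uniformly good, which is the point of comparing whole-route (global) rather than per-hop (local) reliability.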

  13. System Reliability Analysis: Foundations.


    SYSTEM RELIABILITY ANALYSIS: FOUNDATIONS. Richard E... Performance formulas for systems subject to preventive maintenance are given. The network reliability in this case is P{s can communicate with terminal t} = h(p), a polynomial in the common link reliability p. For undirected networks, the basic reference is A. Satyanarayana and Kevin Wood (1982). For directed networks, the basic reference is Avinash

  14. Analysis of Security Protocols in Embedded Systems

    Bruni, Alessandro

    Embedded real-time systems have been adopted in a wide range of safety-critical applications, including automotive, avionics, and train control systems, where the focus has long been on safety (i.e., protecting the external world from the potential damage caused by the system) rather than security (i.e., protecting the system from the external world). With increased connectivity of these systems to external networks the attack surface has grown, and consequently there is a need for securing the system from external attacks. Introducing security protocols in safety-critical systems requires careful considerations on the available resources, especially in meeting real-time and resource constraints, as well as cost and reliability requirements. For this reason many proposed security protocols in this domain have peculiar features, not present in traditional security literature. In this thesis we tackle...

  15. Security Protocol Design: A Case Study Using Key Distribution Protocols

    Reiner Dojen


    Full Text Available Nowadays security protocols are a key component in providing security services for fixed and mobile networks. These services include data confidentiality, radio link encryption, message integrity, mobile subscriber authentication, electronic payment, certified e-mail, contract signing and non-repudiation. This paper is concerned with the design of effective security protocols. Security protocols are introduced and some common attacks against security protocols are discussed. The vulnerabilities that lead to the attacks are analyzed and guidelines for effective security protocol design are proposed. The presented guidelines are applied to the Andrew Secure RPC protocol and its adapted versions. It is demonstrated that compliance with the guidelines successfully avoids freshness and parallel session attacks.

  16. Survey of Performance based Transmission Control Protocol in MANET

    Sapna Bagde


    Full Text Available Transmission Control Protocol (TCP) is a connection-oriented transport service that ensures the reliability of message delivery. It verifies that messages and data were received. TCP provides reliable, ordered delivery of a stream of bytes from a program on one computer to a program on another computer. TCP provides a communication service at an intermediate level between application programs. TCP is the protocol used by major Internet applications such as the World Wide Web, email, remote administration and file transfer. TCP is a reliable transport protocol that is well tuned to perform well in traditional networks. However, several experiments and analyses have shown that this protocol is not suitable for bulk data transfer in high-bandwidth, large round-trip-time networks because of its slow start and conservative congestion control mechanism. In this paper we present a survey of performance-based Transmission Control Protocol in the Mobile Ad-hoc Network environment. The performance-based techniques are categorized based upon different approaches such as throughput, end-to-end delay, congestion control, etc. We also analyze the major improvements in recent methods for performance-based TCP in MANETs.
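
The slow-start behaviour the abstract criticizes can be shown with a toy model (a deliberate simplification: real TCP grows the window per ACK and reacts to loss and timeouts, all omitted here):

```python
# Toy model of TCP window growth in segments per round-trip time (RTT):
# exponential during slow start, then linear in congestion avoidance.
# Illustrates why many RTTs are needed before a large bandwidth-delay
# product is filled -- the issue on long-delay paths noted in the survey.
def window_after(rtts, ssthresh=64):
    cwnd = 1
    for _ in range(rtts):
        cwnd = cwnd * 2 if cwnd < ssthresh else cwnd + 1
    return cwnd

# Window size after 1, 4, 7 and 10 RTTs.
print([window_after(n) for n in (1, 4, 7, 10)])  # -> [2, 16, 65, 68]
```

Once the threshold is reached, growth drops from doubling to one segment per RTT, which is the "conservative" phase the abstract refers to.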

  17. A Field-Based Testing Protocol for Assessing Gross Motor Skills in Preschool Children: The Children's Activity and Movement in Preschool Study Motor Skills Protocol

    Williams, Harriet G.; Pfeiffer, Karin A.; Dowda, Marsha; Jeter, Chevy; Jones, Shaverra; Pate, Russell R.


    The purpose of this study was to develop a valid and reliable tool for use in assessing motor skills in preschool children in field-based settings. The development of the Children's Activity and Movement in Preschool Study Motor Skills Protocol included evidence of its reliability and validity for use in field-based environments as part of large…

  18. Using ImageJ for the quantitative analysis of flow-based adhesion assays in real-time under physiologic flow conditions.

    Meyer dos Santos, Sascha; Klinkhardt, Ute; Schneppenheim, Reinhard; Harder, Sebastian


    This article intends to close the gap between the abundance of regular articles focusing on adhesive mechanisms of cells in a flow field and purely technical reports confined to the description of newly developed algorithms, not yet ready to be used by users without programming skills. A simple and robust method is presented for analysing raw videomicroscopic data of flow-based adhesion assays using the freely available public domain software ImageJ. We describe in detail the image processing routines used to rapidly and reliably evaluate the number of adherent and translocating platelets in videomicroscopic recordings. The depicted procedures were exemplified by analysing platelet interaction with immobilized von Willebrand factor and fibrinogen in flowing blood under physiological wall shear rates. Neutralizing GPIbalpha function reduced shear-dependent platelet translocation on von Willebrand factor and abolished firm platelet adhesion. Abciximab, Tirofiban and Eptifibatide completely inhibited GPIIb/IIIa-dependent stable platelet deposition on fibrinogen. The presented method to analyse videomicroscopic recordings from flow-based adhesion assays offers the advantage of providing a simple and reliable way to quantify flow-based adhesion assays, which is completely based on ImageJ and can easily be applied to study adhesion mechanisms of cells in non-fluorescent modes without the need to deviate from the presented protocol.

  19. Protocol Materials: A Clarification.

    Innerd, Wilfred; O'Gorman, David

    "Protocol materials" are records or recordings of a wide variety of behavioral situations. Characteristically they are neither simulated nor extensively edited. They are to be used for the empirical verification of concepts derived from both educational theory and the social sciences. They are attempts to capture reality so that it may be studied…

  20. Principles of Protocol Design

    Sharp, Robin

    This is a new and updated edition of a book first published in 1994. The book introduces the reader to the principles used in the construction of a large range of modern data communication protocols, as used in distributed computer systems of all kinds. The approach taken is rather a formal one...


    Allegra, Carmen J.


    During the past decade, biomedical technologies have undergone an explosive evolution: from the publication of the first complete human genome in 2003, after more than a decade of effort and at a cost of hundreds of millions of dollars, to the present time, where a complete genomic sequence can be available in less than a day and at a small fraction of the cost of the original sequence. The widespread availability of next-generation genomic sequencing has opened the door to the development of precision oncology. The need to test multiple new targeted agents, both alone and in combination with other targeted therapies as well as classic cytotoxic agents, demands the development of novel therapeutic platforms (particularly Master Protocols) capable of efficiently and effectively testing multiple targeted agents or targeted therapeutic strategies in relatively small patient subpopulations. Here, we describe the Master Protocol concept, with a focus on the expected gains and complexities of the use of this design. An overview of Master Protocols currently active or in development is provided, along with a more extensive discussion of the Lung Master Protocol (Lung-MAP study). PMID:26433553

  2. The reliability of knee joint position testing using electrogoniometry

    Winter Adele


    Full Text Available Abstract Background The current investigation examined the inter- and intra-tester reliability of knee joint angle measurements using a flexible Penny and Giles Biometric® electrogoniometer. The clinical utility of electrogoniometry was also addressed. Methods The first study examined the inter- and intra-tester reliability of measurements of knee joint angles in supine, sitting and standing in 35 healthy adults. The second study evaluated inter-tester and intra-tester reliability of knee joint angle measurements in standing and after walking 10 metres in 20 healthy adults, using an enhanced measurement protocol with a more detailed electrogoniometer attachment procedure. Both inter-tester reliability studies involved two testers. Results In the first study, inter-tester reliability (ICC[2,10]) ranged from 0.58–0.71 in supine, 0.68–0.79 in sitting and 0.57–0.80 in standing. The standard error of measurement between testers was less than 3.55° and the limits of agreement ranged from -12.51° to 12.21°. Reliability coefficients for intra-tester reliability (ICC[3,10]) ranged from 0.75–0.76 in supine, 0.86–0.87 in sitting and 0.87–0.88 in standing. The standard error of measurement for repeated measures by the same tester was less than 1.7° and the limits of agreement ranged from -8.13° to 7.90°. The second study showed that using a more detailed electrogoniometer attachment protocol reduced the error of measurement between testers to 0.5°. Conclusion Using a standardised protocol, reliable measures of knee joint angles can be gained in standing, supine and sitting by using a flexible goniometer.
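
Figures of this kind follow from standard reliability formulas; a sketch with hypothetical numbers, assuming SEM = SD·sqrt(1 − ICC) and Bland-Altman 95% limits of agreement (the study's own computation may differ in detail):

```python
import math
import statistics as st

def sem(sd, icc):
    """Standard error of measurement: SEM = SD * sqrt(1 - ICC)."""
    return sd * math.sqrt(1 - icc)

def limits_of_agreement(diffs):
    """Bland-Altman 95% limits: mean difference +/- 1.96 * SD of differences."""
    m, s = st.mean(diffs), st.stdev(diffs)
    return m - 1.96 * s, m + 1.96 * s

# Hypothetical inputs, not the study's data: between-subject SD of 5
# degrees and an ICC of 0.88, plus five tester-vs-tester differences.
print(round(sem(5.0, 0.88), 2))
lo, hi = limits_of_agreement([-2.0, 1.5, 0.5, -1.0, 2.0])
print(round(lo, 1), round(hi, 1))
```

The SEM shrinks as the ICC rises, which is why the more detailed attachment protocol in the second study produced a much smaller measurement error.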

  3. Observer Use of Standardized Observation Protocols in Consequential Observation Systems

    Bell, Courtney A.; Yi, Qi; Jones, Nathan D.; Lewis, Jennifer M.; McLeod, Monica; Liu, Shuangshuang


    Evidence from a handful of large-scale studies suggests that although observers can be trained to score reliably using observation protocols, there are concerns related to initial training and calibration activities designed to keep observers scoring accurately over time (e.g., Bell et al., 2012; BMGF, 2012). Studies offer little insight into how…

  4. Techniques for minimizing the effects of PCR inhibitors in the chytridiomycosis assay.

    Kosch, T A; Summers, K


    Chytridiomycosis is an amphibian disease of global conservation concern that is caused by the fungal pathogen Batrachochytrium dendrobatidis (Bd). Since the discovery of Bd in 1998, several methods have been used for detection of Bd; among these polymerase chain reaction (PCR) from skin swabs is accepted as the best method due to its noninvasiveness, high sensitivity and ease of use. However, PCR is not without problems - to be successful, this technique is dependent upon the presence of nondegraded DNA template and reaction contents that are free from inhibitors. Here, we report on an investigation of several techniques aimed at improving the reliability of the Bd PCR assay by minimizing the effects of humic acid (HA), a potent PCR inhibitor. We compared the effectiveness of four DNA extraction kits (DNeasy, QIAamp DNA Stool, PowerLyzer Power Soil and PrepMan Ultra) and four PCR methods (Amplitaq Gold, bovine serum albumin, PowerClean DNA Clean-up and inhibitor resistant Taq Polymerase). The results of this and previous studies indicate that chytridiomycosis studies that use PCR methods for disease detection may be significantly underestimating the occurrence of Bd. Our results suggest that to minimize the inhibitory effects of HA, DNeasy should be used for sample DNA extraction and Amplitaq Gold with bovine serum albumin should be used for the Bd PCR assay. We also outline protocols tested, show the results of our methods comparisons and discuss the pros and cons of each method.

  5. Expert system aids reliability

    Johnson, A.T. [Tennessee Gas Pipeline, Houston, TX (United States)


    Quality and reliability are key requirements in the energy transmission industry. Tennessee Gas Co., a division of El Paso Energy, has applied Gensym's G2, an object-oriented expert system programming language, as a standard tool for maintaining and improving quality and reliability in pipeline operation. Tennessee created a small team of gas controllers and engineers to develop a Proactive Controller's Assistant (ProCA) that provides recommendations for operating the pipeline more efficiently, reliably and safely. The controllers' pipeline operating knowledge is recreated in G2 in the form of rules and procedures in ProCA. Two G2 programmers supporting the gas control room add information to the ProCA knowledge base daily. The result is a dynamic, constantly improving system that supports not only the pipeline controllers in their operations, but also the measurement and communications departments' requests for special studies. The Proactive Controller's Assistant development focuses on the following areas: alarm management; pipeline efficiency; reliability; fuel efficiency; and controller development.

  6. Reliability based structural design

    Vrouwenvelder, A.C.W.M.


    According to ISO 2394, structures shall be designed, constructed and maintained in such a way that they are suited for their use during the design working life in an economic way. To fulfil this requirement one needs insight into the risk and reliability under expected and non-expected actions. A ke

  7. Reliability based structural design

    Vrouwenvelder, A.C.W.M.


    According to ISO 2394, structures shall be designed, constructed and maintained in such a way that they are suited for their use during the design working life in an economic way. To fulfil this requirement one needs insight into the risk and reliability under expected and non-expected actions. A ke

  8. The value of reliability

    Fosgerau, Mogens; Karlström, Anders


    We derive the value of reliability in the scheduling of an activity of random duration, such as travel under congested conditions. Using a simple formulation of scheduling utility, we show that the maximal expected utility is linear in the mean and standard deviation of trip duration, regardless...
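
The headline result (maximal expected utility linear in the mean and standard deviation of trip duration) can be written schematically; the symbols below are illustrative and not the authors' exact notation:

```latex
\max \mathbb{E}[U] \;=\; c \;-\; \alpha\,\mu_T \;-\; \beta\,\sigma_T,
\qquad
\text{value of reliability} \;=\; -\frac{\partial\,\mathbb{E}[U]}{\partial \sigma_T} \;=\; \beta,
```

where $\mu_T$ and $\sigma_T$ are the mean and standard deviation of the trip duration $T$, and $c$, $\alpha$, $\beta$ are constants determined by the scheduling preferences. The linearity in $\sigma_T$ is what lets a single coefficient price reliability.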

  9. Parametric Mass Reliability Study

    Holt, James P.


    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass-such as computer housings, pump casings, and the silicon board of PCBs-typically are the most reliable. Meanwhile components that tend to fail the earliest-such as seals or gaskets-typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs to reliability, as well as the mass of ORU subcomponents to reliability.
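
One minimal form such a parametric model could take (an illustrative assumption, not the study's fitted model) treats each subcomponent as having a constant failure rate, so an ORU behaves as a series system:

```python
import math

# Illustrative parametric sketch: each subcomponent has a constant
# failure rate lambda (per hour), so its reliability over t hours is
# exp(-lambda * t). Heavier structural parts are assigned lower rates
# than light seals/gaskets, mirroring the mass-reliability pattern
# described in the abstract. All numbers are hypothetical.
def oru_reliability(components, hours):
    """Series system: the ORU survives only if every subcomponent survives."""
    return math.prod(math.exp(-lam * hours) for _, lam in components)

components = [("housing", 1e-7), ("pump casing", 2e-7), ("seal", 5e-6)]
print(round(oru_reliability(components, hours=500 * 24), 4))
```

In this sketch the low-mass seal dominates the failure probability, which is the asymmetry the parametric study aims to quantify.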

  10. Avionics Design for Reliability



  11. Wind Energy - How Reliable.


    The reliability of a wind energy system depends on the size of the propeller and the size of the back-up energy storage. Design of the optimum system...speed incidents which generate a significant part of the wind energy. A nomogram is presented, based on some continuous wind speed measurements

  12. The reliability horizon

    Visser, M


    The "reliability horizon" for semi-classical quantum gravity quantifies the extent to which we should trust semi-classical quantum gravity, and gives a handle on just where the "Planck regime" resides. The key obstruction to pushing semi-classical quantum gravity into the Planck regime is often the existence of large metric fluctuations, rather than a large back-reaction.

  13. Reliability of semiology description.

    Heo, Jae-Hyeok; Kim, Dong Wook; Lee, Seo-Young; Cho, Jinwhan; Lee, Sang-Kun; Nam, Hyunwoo


    Seizure semiology is important for classifying patients' epilepsy. Physicians usually get most of the seizure information from observers, though there have been few reports on the reliability of the observers' descriptions. This study aims at determining the reliability of observers' descriptions of the semiology. We included 92 patients who had their habitual seizures recorded during video-EEG monitoring. We compared the semiology described by the observers with that recorded on the videotape, and reviewed which characteristics of the observers affected the reliability of their reported data. The classification of seizures and the individual components of the semiology based only on the observer description were somewhat discordant compared with the findings from the videotape (correct classification, 85%). The descriptions of some ictal behaviors such as oroalimentary automatism, tonic/dystonic limb posturing, and head versions were relatively accurate, but those of motionless staring and hand automatism were less accurate. The directions specified by the observers were relatively correct. The accuracy of the description was related to the educational level of the observers. Much of the information described by well-educated observers is reliable. However, every physician should keep in mind the limitations of this information and use it cautiously.

  14. High reliability organizations

    Gallis, R.; Zwetsloot, G.I.J.M.


    High Reliability Organizations (HROs) are organizations that constantly face serious and complex (safety) risks yet succeed in realising an excellent safety performance. In such situations acceptable levels of safety cannot be achieved by traditional safety management only. HROs manage safety

  15. WelFur - mink: development of on-farm welfare assessment protocols for mink

    Møller, Steen Henrik; Hansen, Steffen W; Rousing, Tine


    The European Fur Breeders' Association initiated the "WelFur" project in 2009 in order to develop a welfare assessment protocol for mink and fox farms after the Welfare Quality® standards. The assessment is based on four welfare principles (good feeding, good housing, good health and appropriate behaviour) and 12 underlying criteria, to be measured on-farm. The major steps in the development of the WelFur mink protocols are described: (1) writing literature reviews and listing potential measures; (2) identifying valid, reliable and feasible welfare measures; (3) developing registration protocols, descriptions, and schemes; (4) testing preliminary protocols in relevant seasons of the annual production. This paper focuses on the evaluation of validity, reliability and feasibility of the 22 measures that have been selected for the WelFur assessment protocols. These protocols have been tested in the three...

  16. Reliable Self-Stabilizing Communication for Quasi Rendezvous

    Johnen, Colette; Lavault, Christian


    The paper presents three self-stabilizing protocols for basic fair and reliable link communication primitives. We assume a link-register communication model under read/write atomicity, where every process can read from but cannot write into its neighbours' registers. The first primitive guarantees that any process writes a new value in its register(s) only after all its neighbours have read the previous value, whatever the initial scheduling of processes' actions. The second primitive implements a "weak rendezvous" communication mechanism by using an alternating-bit protocol: whenever a process consecutively writes n values (possibly the same ones) in a register, each neighbour is guaranteed to read each value from the register at least once. On the basis of the previous protocol, the third primitive implements a "quasi rendezvous": in words, this primitive ensures furthermore that there exists exactly one reading between two writing operations. All protocols are self-stabilizing and run in asynchronous arbitr...
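
The alternating-bit idea behind the "weak rendezvous" primitive can be sketched as follows (an illustrative single-writer model; the paper's protocols additionally self-stabilize from arbitrary initial states, which this sketch does not attempt):

```python
# Minimal sketch of the alternating-bit mechanism: the writer toggles a
# bit with every new value, so a reader can tell consecutive writes
# apart even when the written values themselves repeat.
class Register:
    """Single-writer register carrying a value plus an alternating bit."""
    def __init__(self):
        self.value, self.bit = None, 0

    def write(self, value):
        self.value, self.bit = value, 1 - self.bit  # toggle marks a fresh write

r, seen, last = Register(), [], 0
for v in ["x", "x", "y"]:   # three consecutive writes; values may repeat
    r.write(v)
    if r.bit != last:       # reader: a bit flip signals a new write
        seen.append(r.value)
        last = r.bit
print(seen)                 # -> ['x', 'x', 'y']
```

Without the bit, the reader could not distinguish the second write of "x" from a re-read of the first, which is exactly the guarantee the second primitive provides.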

  17. Energy Efficiency and Reliability in Wireless Biomedical Implant Systems

    Abouei, Jamshid; Plataniotis, Konstantinos N; Pasupathy, Subbarayan


    The use of wireless implant technology requires correct delivery of the vital physiological signs of the patient along with the energy management in power-constrained devices. Toward these goals, we present an augmentation protocol for the physical layer of the Medical Implant Communications Service (MICS) with focus on the energy efficiency of deployed devices over the MICS frequency band. The present protocol uses the rateless code with the Frequency Shift Keying (FSK) modulation scheme to overcome the reliability and power cost concerns in tiny implantable sensors due to the considerable attenuation of propagated signals across the human body. In addition, the protocol allows a fast start-up time for the transceiver circuitry. The main advantage of using rateless codes is to provide an inherent adaptive duty-cycling for power management, due to the flexibility of the rateless code rate. Analytical results demonstrate that an 80% energy saving is achievable with the proposed protocol when compared to the IE...

  18. A duplex PCR assay for the detection of Ralstonia solanacearum phylotype II strains in Musa spp.

    Gilles Cellier

    Full Text Available Banana wilt outbreaks that are attributable to Moko disease-causing strains of the pathogen Ralstonia solanacearum (Rs) remain a social and economic burden for both multinational corporations and subsistence farmers. All known Moko strains belong to the phylotype II lineage, which has been previously recognized for its broad genetic basis. Moko strains are paraphyletic and are distributed among seven related but distinct phylogenetic clusters (sequevars) that are potentially major threats to Musaceae, Solanaceae, and ornamental crops in many countries. Although clustered within the Moko IIB-4 sequevar, strains of the epidemiologically variant IIB-4NPB do not cause wilt on Cavendish or plantain bananas; instead, they establish a latent infection in the vascular tissues of plantains and demonstrate an expanded host range and high aggressiveness toward Solanaceae and Cucurbitaceae. Although most molecular diagnostic methods focus on strains that wilt Solanaceae (particularly potato), no relevant protocol has been described that universally detects strains of the Musaceae-infecting Rs phylotype II. Thus, a duplex PCR assay targeting Moko and IIB-4NPB variant strains was developed, and its performance was assessed using an extensive collection of 111 strains representing the known diversity of Rs Moko-related strains and IIB-4NPB variant strains along with certain related strains and families. The proposed diagnostic protocol demonstrated both high accuracy (inclusivity and exclusivity) and high repeatability, and detected targets in either pure culture or spiked plant extracts. Although they did not belong to the Moko clusters described at the time of the study, recently discovered banana-infecting strains from Brazil were also detected. According to our comprehensive evaluation, this duplex PCR assay appears suitable for both research and diagnostic laboratories and provides reliable detection of phylotype II Rs strains that infect Musaceae.

  19. 2014 Building America House Simulation Protocols

    Wilson, E.; Engebrecht-Metzger, C.; Horowitz, S.; Hendron, R.


    As Building America has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.

  20. 2014 Building America House Simulation Protocols

    Wilson, E. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Engebrecht, C. Metzger [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Horowitz, S. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hendron, R. [National Renewable Energy Laboratory (NREL), Golden, CO (United States)


    As Building America has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.


    Krishan Kumar


    Full Text Available Ad hoc networks offer various applications which are very much essential in wireless networks. But the vital problem concerning their security aspects is the major issue which must be solved. A mobile ad hoc network is a collection of nodes that are connected through a wireless medium forming rapidly changing topologies. The dynamic and cooperative nature of ad hoc networks presents challenges in securing these networks. Attacks on ad hoc network routing protocols are the main problem affecting network performance and reliability. Here a brief introduction is made of the most popular protocols that follow the table-driven approach and the source-initiated on-demand approach.

  2. Against vaccine assay secrecy.

    Herder, Matthew; Hatchette, Todd F; Halperin, Scott A; Langley, Joanne M


    Increasing the transparency of the evidence base behind health interventions such as pharmaceuticals, biologics, and medical devices, has become a major point of critique, conflict, and policy focus in recent years. Yet the lack of publicly available information regarding the immunogenicity assays upon which many important, widely used vaccines are based has received no attention to date. In this paper we draw attention to this critical public health problem by reporting on our efforts to secure vaccine assay information in respect of 10 vaccines through Canada's access to information law. We argue, under Canadian law, that the public health interest in having access to the methods for these laboratory procedures should override claims by vaccine manufacturers and regulators that this information is proprietary; and, we call upon several actors to take steps to ensure greater transparency with respect to vaccine assays, including regulators, private firms, researchers, research institutions, research funders, and journal editors.

  3. Against vaccine assay secrecy

    Herder, Matthew; Hatchette, Todd F; Halperin, Scott A; Langley, Joanne M


    Increasing the transparency of the evidence base behind health interventions such as pharmaceuticals, biologics, and medical devices, has become a major point of critique, conflict, and policy focus in recent years. Yet the lack of publicly available information regarding the immunogenicity assays upon which many important, widely used vaccines are based has received no attention to date. In this paper we draw attention to this critical public health problem by reporting on our efforts to secure vaccine assay information in respect of 10 vaccines through Canada's access to information law. We argue, under Canadian law, that the public health interest in having access to the methods for these laboratory procedures should override claims by vaccine manufacturers and regulators that this information is proprietary; and, we call upon several actors to take steps to ensure greater transparency with respect to vaccine assays, including regulators, private firms, researchers, research institutions, research funders, and journal editors. PMID:25826194

  4. Rover waste assay system

    Akers, D.W.; Stoots, C.M.; Kraft, N.C.; Marts, D.J. [Idaho National Engineering Lab., Idaho Falls, ID (United States)


    The Rover Waste Assay System (RWAS) is a nondestructive assay system designed for the rapid assay of highly-enriched {sup 235}U contaminated piping, tank sections, and debris from the Rover nuclear rocket fuel processing facility at the Idaho Chemical Processing Plant. A scanning system translates a NaI(Tl) detector/collimator system over the structural components, where both relative and calibrated measurements for {sup 137}Cs are made, from which {sup 235}U concentrations are determined. The system is in operation and is sufficiently automated that most functions are performed by the computer system. These functions include system calibration, problem identification, collimator control, data analysis, and reporting. Calibration of the system was done through a combination of measurements on calibration standards and benchmarked modeling. A description of the system is presented along with the methods and uncertainties associated with the calibration and analysis of the system for components from the Rover facility. 4 refs., 2 figs., 4 tabs.

  5. A Comparative Analysis of GBN Protocol and SR Protocol



    Go-Back-N protocol and Selective-Repeat protocol are two important protocols for reliable data transmission in the transport layer and link layer of computer networks. The design of the TCP protocol draws on the basic ideas of these two protocols. In this paper, the GBN protocol and the SR protocol are compared and analyzed in order to reveal the intrinsic and important characteristics of the two protocols.
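The retransmission trade-off the paper analyzes can be sketched as follows; the window size and loss position are hypothetical values chosen purely for illustration:

```python
def gbn_retransmissions(window: int, lost_index: int) -> int:
    # Go-Back-N: the sender rewinds to the lost packet and resends it and
    # every later packet already sent within the window.
    return window - lost_index

def sr_retransmissions(window: int, lost_index: int) -> int:
    # Selective Repeat: the receiver buffers out-of-order packets, so the
    # sender resends only the single lost packet.
    return 1

# With a window of 8 packets and the packet at index 2 lost:
print(gbn_retransmissions(8, 2))  # 6 packets resent under GBN
print(sr_retransmissions(8, 2))   # 1 packet resent under SR
```

The sketch shows why SR uses the channel more efficiently at the cost of receiver buffering, while GBN keeps the receiver simple at the cost of redundant retransmissions.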

  6. A family of quantum protocols

    Devetak, I; Winter, A


    We introduce two dual, purely quantum protocols: for entanglement distillation assisted by quantum communication (``mother'' protocol) and for entanglement assisted quantum communication (``father'' protocol). We show how a large class of ``children'' protocols (including many previously known ones) can be derived from the two by direct application of teleportation or super-dense coding. Furthermore, the parent may be recovered from most of the children protocols by making them ``coherent''. We also summarize the various resource trade-offs these protocols give rise to.

  7. Comparison Study of Transmission Control Protocol and User Datagram Protocol Behavior over Multi-Protocol Label Switching Networks in Case of Failures

    Taha A.A Radaei


    Full Text Available Problem statement: In only a few years, Multi-Protocol Label Switching (MPLS) has evolved from an exotic technology to a mainstream tool used by service providers to create revenue-generating services. MPLS provides a highly reliable Label Switched Path (LSP). MPLS failures may degrade the reliability of the MPLS networks. Approach: For that reason, many studies have been conducted to keep the high reliability and survivability of the MPLS networks. Unlike User Datagram Protocol (UDP), Transmission Control Protocol (TCP) does not perform well in the case of link failure of MPLS networks because of its inability to distinguish packet loss due to link failure. After the recovery time, TCP takes longer than UDP to continue as it was before the failure. Results: In terms of packet loss, TCP performs better than UDP. However, the receiving rate of the TCP traffic is much worse than that of UDP traffic. A mechanism to improve the behavior of TCP after a link failure is needed. This study focused on comparing the behavior of different types of TCP as well as UDP traffic over MPLS networks in the case of link, node, or congestion failures. Conclusion: Although extensions of the RSVP-TE protocol support the fast recovery mechanism of MPLS networks, the behavior of TCP will be affected during recovery time much more than that of UDP.
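A toy model (based on standard TCP slow-start behavior, not on the study's measurements) of why TCP needs longer than UDP to regain its sending rate after a link failure: a retransmission timeout collapses TCP's congestion window to one segment, which must then grow back, while UDP simply resumes at its constant rate.

```python
def tcp_rounds_to_recover(target_cwnd: int) -> int:
    """Rounds of slow start needed to grow cwnd from 1 back to target."""
    cwnd, rounds = 1, 0
    while cwnd < target_cwnd:
        cwnd *= 2          # slow start doubles cwnd each round-trip time
        rounds += 1
    return rounds

# Recovering a 64-segment window costs 6 extra RTTs of slow start;
# UDP has no congestion window, so its recovery cost is zero rounds.
print(tcp_rounds_to_recover(64))  # 6
```

Real TCP variants recover somewhat differently (fast retransmit, SACK), but the asymmetry with UDP that the study observes follows directly from this window mechanism.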

  8. Assays for calcitonin receptors

    Teitelbaum, A.P.; Nissenson, R.A.; Arnaud, C.D.


    The assays for calcitonin receptors described focus on their use in the study of the well-established target organs for calcitonin, bone and kidney. The radioligand used in virtually all calcitonin binding studies is /sup 125/I-labelled salmon calcitonin. The lack of methionine residues in this peptide permits the use of chloramine-T for the iodination reaction. Binding assays are described for intact bone, skeletal plasma membranes, renal plasma membranes, and primary kidney cell cultures of rats. Studies on calcitonin metabolism in laboratory animals and regulation of calcitonin receptors are reviewed.

  9. A quality control program within a clinical trial Consortium for PCR protocols to detect Plasmodium species.

    Taylor, Steve M; Mayor, Alfredo; Mombo-Ngoma, Ghyslain; Kenguele, Hilaire M; Ouédraogo, Smaïla; Ndam, Nicaise Tuikue; Mkali, Happy; Mwangoka, Grace; Valecha, Neena; Singh, Jai Prakash Narayan; Clark, Martha A; Verweij, Jaco J; Adegnika, Ayola Akim; Severini, Carlo; Menegon, Michela; Macete, Eusebio; Menendez, Clara; Cisteró, Pau; Njie, Fanta; Affara, Muna; Otieno, Kephas; Kariuki, Simon; ter Kuile, Feiko O; Meshnick, Steven R


    Malaria parasite infections that are only detectable by molecular methods are highly prevalent and represent a potential transmission reservoir. The methods used to detect these infections are not standardized, and their operating characteristics are often unknown. We designed a proficiency panel of Plasmodium spp. in order to compare the accuracy of parasite detection of molecular protocols used by labs in a clinical trial consortium. Ten dried blood spots (DBSs) were assembled that contained P. falciparum, P. vivax, P. malariae, and P. ovale; DBSs contained either a single species or a species mixed with P. falciparum. DBS panels were tested in 9 participating laboratories in a masked fashion. Of 90 tests, 68 (75.6%) were correct; there were 20 false-negative results and 2 false positives. The detection rate was 77.8% (49/63) for P. falciparum, 91.7% (11/12) for P. vivax, 83.3% (10/12) for P. malariae, and 70% (7/10) for P. ovale. Most false-negative P. falciparum results were from samples with an estimated ≤ 5 parasites per μl of blood. Between labs, accuracy ranged from 100% to 50%. In one lab, the inability to detect species in mixed-species infections prompted a redesign and improvement of the assay. Most PCR-based protocols were able to detect P. falciparum and P. vivax at higher densities, but these assays may not reliably detect parasites in samples with low P. falciparum densities. Accordingly, formal quality assurance for PCR should be employed whenever this method is used for diagnosis or surveillance. Such efforts will be important if PCR is to be widely employed to assist malaria elimination efforts.
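The reported accuracy figures can be checked directly from the stated counts; this is plain arithmetic on numbers given in the abstract:

```python
# Overall accuracy: 68 correct results out of 90 tests.
correct, total = 68, 90
print(round(100 * correct / total, 1))  # 75.6 % overall

# Per-species detection rates from the stated hit counts.
detected = {
    "P. falciparum": (49, 63),
    "P. vivax": (11, 12),
    "P. malariae": (10, 12),
    "P. ovale": (7, 10),
}
for species, (hits, n) in detected.items():
    print(species, round(100 * hits / n, 1))  # 77.8, 91.7, 83.3, 70.0
```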

  10. Optimisation of an Advanced Oxidation Protein Products Assay: Its Application to Studies of Oxidative Stress in Diabetes Mellitus

    Emma L. Taylor


    Full Text Available Advanced oxidation protein products (AOPP) are reportedly elevated in the plasma of patients with a number of diseases, including diabetes mellitus, that involve oxidative stress. However, the accurate measurement of AOPP in human plasma is hampered by the formation of a precipitate following the addition of potassium iodide and glacial acetic acid according to the published assay procedure. Here we describe a modification of the AOPP assay which eliminates interference by precipitation and provides a robust, reliable, and reproducible protocol for the measurement of iodide oxidising capacity in plasma samples (intra-assay CV 1.7–5.3%, interassay CV 5.3–10.5%). The improved method revealed a significant association of AOPP levels with age (p<0.05) and hypertension (p=0.01) in EDTA-anticoagulated plasma samples from 52 patients with diabetes and 38 nondiabetic control subjects, suggesting a possible link between plasma oxidising capacity and endothelial and/or vascular dysfunction. There was no significant difference between AOPP concentrations in diabetic (74.8 ± 7.2 μM chloramine T equivalents) and nondiabetic (75.5 ± 7.0 μM chloramine T equivalents) individuals.
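The intra-assay CV quoted above is the sample standard deviation of replicate measurements expressed as a percentage of their mean; a minimal sketch, with hypothetical replicate absorbance readings (the values below are invented for illustration):

```python
import statistics

def cv_percent(replicates: list[float]) -> float:
    """Coefficient of variation: sample SD as a percentage of the mean."""
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate readings for one plasma sample.
replicates = [0.52, 0.50, 0.53, 0.51, 0.52]
print(round(cv_percent(replicates), 1))  # 2.2, within the 1.7-5.3% range
```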

  11. Evaluation of ribonucleic acid amplification protocols for human oocyte transcriptome analysis

    E. Mantikou; O. Bruning; S. Mastenbroek; S. Repping; T.M. Breit; M. de Jong


    OBJECTIVE: To develop a reliable, reproducible, and sensitive method for investigating gene-expression profiles from individual human oocytes. DESIGN: Five commercially available protocols were investigated for their efficiency to amplify messenger RNA (mRNA) from 54 single human oocytes. Protocols

  12. Reliability in the utility computing era: Towards reliable Fog computing

    Madsen, Henrik; Burtschy, Bernard; Albeanu, G.


    This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm as a non-trivial extension of the Cloud is considered and the reliability of the networks of smart devices is discussed. Combining the reliability...... requirements of grid and cloud paradigms with the reliability requirements of networks of sensors and actuators it follows that designing a reliable Fog computing platform is feasible....

  13. Seasonal Variation, Microscopic and Chromatographic Analysis of Leaves in Malus hupehensis: A Protocol for Its Quality Control

    SHEN Tao; XIANG Lan; REN Dong-mei; WANG Shu-qi; YANG Ming-ren; LOU Hong-xiang


    Objective: To establish a quality control protocol based on microscopic, TLC, and HPLC methods, and to verify the optimal harvesting time for the leaves of Malus hupehensis (LMH). Methods: The LMH were pulverized into powder for microscopic identification or for TLC and HPLC analysis after ultrasonic extraction with methanol. Seasonal variations of the phlorizin content and average leaf weight were determined by HPLC analysis and by weighing the leaves collected from May to October. Results: Microscopic and macromorphologic characteristics are described for leaf identification. A qualitative TLC assay and a quantitative HPLC method were established for the quality control of LMH. Phlorizin was selected as a reference marker, which resolved at Rf 0.53 in the TLC assay and at 14.0 min in the HPLC assay. The content of phlorizin decreased gradually from 17.0% in leaves collected in May to 7.5% in October. The average leaf weight reached 0.6 g in August and was maintained until leaf fall. Conclusion: These methods are simple, selective, accurate, and reliable for the quality control of LMH. The period from late August to early September is suggested as the optimal harvesting time for LMH.
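Quantitative HPLC assays of this kind read the analyte content off a linear calibration curve of peak area against standard concentration. A minimal sketch of that step; the standard concentrations, peak areas, and the unknown's area below are all invented for illustration:

```python
def fit_line(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Least-squares slope and intercept for a calibration curve."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical phlorizin standards: concentration (mg/mL) -> peak area.
stds_conc = [0.05, 0.10, 0.20, 0.40]
stds_area = [125.0, 250.0, 500.0, 1000.0]  # perfectly linear here

slope, intercept = fit_line(stds_conc, stds_area)
sample_conc = (730.0 - intercept) / slope  # invert for an unknown's area
print(round(sample_conc, 3))  # 0.292 mg/mL under these assumptions
```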

  14. Detection of virulence, antibiotic resistance and toxin (VAT) genes in Campylobacter species using newly developed multiplex PCR assays.

    Laprade, Natacha; Cloutier, Michel; Lapen, David R; Topp, Edward; Wilkes, Graham; Villemur, Richard; Khan, Izhar U H


    Campylobacter species are one of the leading causes of bacterial gastroenteritis in humans worldwide. This twofold study sought to: i) develop and optimize four single-tube multiplex PCR (mPCR) assays for the detection of six virulence (ciaB, dnaJ, flaA, flaB, pldA and racR), three toxin (cdtA, cdtB and cdtC) and one antibiotic resistance tet(O) genes in thermophilic Campylobacter spp. and ii) apply and evaluate the developed mPCR assays by testing 470 previously identified C. jejuni, C. coli and C. lari isolates from agricultural water. In each mPCR assay, a combination of two or three sets of primer pairs for virulence, antibiotic resistance and toxin (VAT) genes was used and optimized. Assay 1 was developed for the detection of dnaJ, racR and cdtC genes with expected amplification sizes of 720, 584 and 182 bp. Assay 2 generated PCR amplicons for tet(O) and cdtA genes of 559 and 370 bp. Assay 3 amplified cdtB, ciaB and pldA genes with PCR amplicon sizes of 620, 527 and 385 bp. Assay 4 was optimized for flaA and flaB genes that generated PCR amplicons of 855 and 260 bp. The primer pairs and optimized PCR protocols did not show interference and/or cross-amplification with each other and generated the expected size of amplification products for each target VAT gene for the C. jejuni ATCC 33291 reference strain. Overall, all ten target VAT genes were detected at a variable frequency in tested isolates of thermophilic Campylobacter spp., where cdtC, flaB, ciaB, cdtB, cdtA and pldA were commonly detected compared to the flaA, racR, dnaJ and tet(O) genes, which were detected with less frequency. The developed mPCR assays are simple, rapid, reliable and sensitive tools for simultaneously assessing potential pathogenicity and antibiotic resistance profiling in thermophilic Campylobacter spp. The mPCR assays will be useful in diagnostic and analytical settings for routine screening of VAT characteristics of Campylobacter spp. as well as being applicable in epidemiological

  15. New oligosaccharyltransferase assay method.

    Kohda, Daisuke; Yamada, Masaki; Igura, Mayumi; Kamishikiryo, Jun; Maenaka, Katsumi


    We developed a new in vitro assay for oligosaccharyltransferase (OST), which catalyzes the transfer of preassembled oligosaccharides on lipid carriers onto asparagine residues in polypeptide chains. The asparagine residues reside in the sequon, Asn-X-Thr/Ser, where X can be any amino acid residue except Pro. We demonstrate the potency of our assay using the OST from yeast. In our method, polyacrylamide gel electrophoresis is used to separate the glycopeptide products from the peptide substrates. The substrate peptide is fluorescently labeled and the formation of glycopeptides is analyzed by fluorescence gel imaging. Two in vitro OST assay methods are now widely used, but both the methods depend on previous knowledge of the oligosaccharide moiety: One method uses lectin binding as the separation mechanism and the other method uses biosynthetically or chemoenzymatically synthesized lipid-linked oligosaccharides as donors. N-linked protein glycosylation is found in all three domains of life, but little is known about the N-glycosylation in Archaea. Thus, our new assay, which does not require a priori knowledge of the oligosaccharides, will be useful in such cases. Indeed, we have detected the OST activity in the membrane fraction from a hyperthermophilic archaeon, Pyrococcus furiosus.

  16. Hyaluronic Acid Assays

    Itenov, Theis S; Kirkby, Nikolai S; Bestle, Morten H


    BACKGROUND: Hyaluronic acid (HA) is proposed as a marker of functional liver capacity. The aim of the present study was to compare a new turbidimetric assay for measuring HA with the current standard method. METHODS: HA was measured by a particle-enhanced turbidimetric immunoassay (PETIA) and enzyme...

  17. Instrument for assaying radiation

    Coleman, Jody Rustyn; Farfan, Eduardo B.


    An instrument for assaying radiation includes a flat panel detector having a first side opposed to a second side. A collimated aperture covers at least a portion of the first side of the flat panel detector. At least one of a display screen or a radiation shield may cover at least a portion of the second side of the flat panel detector.

  18. Human Reliability Program Workshop

    Landers, John; Rogers, Erin; Gerke, Gretchen


    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  19. Accelerator reliability workshop

    Hardy, L.; Duru, Ph.; Koch, J.M.; Revol, J.L.; Van Vaerenbergh, P.; Volpe, A.M.; Clugnet, K.; Dely, A.; Goodhew, D


    About 80 experts attended this workshop, which brought together all accelerator communities: accelerator driven systems, X-ray sources, medical and industrial accelerators, spallation source projects (American and European), nuclear physics, etc. With newly proposed accelerator applications such as nuclear waste transmutation and the replacement of nuclear power plants, reliability has now become a number one priority for accelerator designers. Every part of an accelerator facility, from cryogenic systems to data storage via RF systems, is concerned with reliability. This aspect is now taken into account in the design/budget phase, especially for projects whose goal is to reach no more than 10 interruptions per year. This document gathers the slides but not the proceedings of the workshop.

  20. Reliability and construction control

    Sherif S. AbdelSalam


    Full Text Available The goal of this study was to determine the most reliable and efficient combination of design and construction methods required for vibro piles. For a wide range of static and dynamic formulas, the reliability-based resistance factors were calculated using EGYPT database, which houses load test results for 318 piles. The analysis was extended to introduce a construction control factor that determines the variation between the pile nominal capacities calculated using static versus dynamic formulae. From the major outcomes, the lowest coefficient of variation is associated with Davisson’s criterion, and the resistance factors calculated for the AASHTO method are relatively high compared with other methods. Additionally, the CPT-Nottingham and Schmertmann method provided the most economic design. Recommendations related to a pile construction control factor were also presented, and it was found that utilizing the factor can significantly reduce variations between calculated and actual capacities.

  1. Improving Power Converter Reliability

    Ghimire, Pramod; de Vega, Angel Ruiz; Beczkowski, Szymon


    The real-time junction temperature monitoring of a high-power insulated-gate bipolar transistor (IGBT) module is important to increase the overall reliability of power converters for industrial applications. This article proposes a new method to measure the on-state collector-emitter voltage...... of a high-power IGBT module during converter operation, which may play a vital role in improving the reliability of the power converters. The measured voltage is used to estimate the module average junction temperature of the high and low-voltage side of a half-bridge IGBT separately in every fundamental...... is measured in a wind power converter at a low fundamental frequency. To illustrate more, the test method as well as the performance of the measurement circuit are also presented. This measurement is also useful to indicate failure mechanisms such as bond wire lift-off and solder layer degradation......
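The underlying idea is that the on-state collector-emitter voltage, once calibrated against temperature at a known current, can be inverted to estimate junction temperature. A minimal sketch of that inversion; the calibration constants below are hypothetical, not values from the article:

```python
# Hypothetical linear calibration of on-state Vce versus temperature
# at a fixed load current: Vce(Tj) = V0 + K * (Tj - 25 degC).
V0 = 0.90    # V, assumed Vce at 25 degC
K = -0.002   # V/degC, assumed temperature sensitivity of Vce

def junction_temp(vce_on: float) -> float:
    """Estimate Tj (degC) from the measured on-state Vce."""
    return 25.0 + (vce_on - V0) / K

print(round(junction_temp(0.85), 1))  # 50.0 degC under these assumptions
```

In practice the calibration is current-dependent and nonlinear over wide ranges, which is part of what makes the in-converter measurement described above challenging.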

  2. Power electronics reliability.

    Kaplar, Robert James; Brock, Reinhard C.; Marinella, Matthew; King, Michael Patrick; Stanley, James K.; Smith, Mark A.; Atcitty, Stanley


    The project's goals are: (1) use experiments and modeling to investigate and characterize stress-related failure modes of post-silicon power electronic (PE) devices such as silicon carbide (SiC) and gallium nitride (GaN) switches; and (2) seek opportunities for condition monitoring (CM) and prognostics and health management (PHM) to further enhance the reliability of power electronics devices and equipment. CM - detect anomalies and diagnose problems that require maintenance. PHM - track damage growth, predict time to failure, and manage subsequent maintenance and operations in such a way to optimize overall system utility against cost. The benefits of CM/PHM are: (1) operate power conversion systems in ways that will preclude predicted failures; (2) reduce unscheduled downtime and thereby reduce costs; and (3) pioneering reliability in SiC and GaN.

  3. Mitosis Methods & Protocols

    CarloAlberto Redi


    Full Text Available Mitosis Methods & Protocols Andrew D. McAinsh (Ed.) Humana Press, Totowa, New Jersey (USA) Series: Springer Protocols Methods in Molecular Biology, Volume 545, 2009 ISBN: 978-1-60327-992-5   It is quite clear from the contents of this book that the remarkably fascinating phenomenon of mitosis (that captured, and still is capturing, the attention of entire generations of scientists) is still open to research. This is mainly due to our lack of knowledge of so many multifaceted events of this extraordinarily complex process. The reader giving a glance through the Contents and Contributors sections is speechless: all of the first-class models (i.e., budding yeast, Caenorhabditis, Drosophila, Xenopus and Human) are presented..... 

  4. Symmetric cryptographic protocols

    Ramkumar, Mahalingam


    This book focuses on protocols and constructions that make good use of symmetric pseudo random functions (PRF) like block ciphers and hash functions - the building blocks for symmetric cryptography. Readers will benefit from detailed discussion of several strategies for utilizing symmetric PRFs. Coverage includes various key distribution strategies for unicast, broadcast and multicast security, and strategies for constructing efficient digests of dynamic databases using binary hash trees. • Provides detailed coverage of symmetric key protocols • Describes various applications of symmetric building blocks • Includes strategies for constructing compact and efficient digests of dynamic databases
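The binary-hash-tree digest mentioned above can be sketched as a Merkle tree: the root hash summarizes the whole database, and modifying any record changes the root. This is a generic illustration of the technique, not code from the book; odd levels are handled here by duplicating the last node, one of several possible padding conventions:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Root hash of a binary hash tree over the given records."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

db = [b"alice:10", b"bob:20", b"carol:30"]
root1 = merkle_root(db)
root2 = merkle_root([b"alice:10", b"bob:99", b"carol:30"])
print(root1 != root2)  # True: one modified record changes the digest
```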

  5. Pressure-driven microfluidic perfusion culture device for integrated dose-response assays.

    Hattori, Koji; Sugiura, Shinji; Kanamori, Toshiyuki


    Cell-based assays are widely used in the various stages of drug discovery. Advances in microfluidic systems over the past two decades have enabled them to become a powerful tool for cell-based assays to achieve both reliability and high throughput. The interface between the micro-world and macro-world is important in industrial assay processes. Therefore, microfluidic cell-based assays using pressure-driven liquid handling are an ideal platform for integrated assays. The aim of this article is to review recent advancements in microfluidic cell-based assays focusing on a pressure-driven perfusion culture device. Here, we review the development of microfluidic cell-based assay devices and discuss the techniques involved in designing a microfluidic network, device fabrication, liquid and cell manipulation, and detection schemes for pressure-driven perfusion culture devices. Finally, we describe recent progress in semiautomatic and reliable pressure-driven microfluidic cell-based assays.

  6. Satellite Communications Using Commercial Protocols

    Ivancic, William D.; Griner, James H.; Dimond, Robert; Frantz, Brian D.; Kachmar, Brian; Shell, Dan


    NASA Glenn Research Center has been working with industry, academia, and other government agencies in assessing commercial communications protocols for satellite and space-based applications. In addition, NASA Glenn has been developing and advocating new satellite-friendly modifications to existing communications protocol standards. This paper summarizes recent research into the applicability of various commercial standard protocols for use over satellite and space- based communications networks as well as expectations for future protocol development. It serves as a reference point from which the detailed work can be readily accessed. Areas that will be addressed include asynchronous-transfer-mode quality of service; completed and ongoing work of the Internet Engineering Task Force; data-link-layer protocol development for unidirectional link routing; and protocols for aeronautical applications, including mobile Internet protocol routing for wireless/mobile hosts and the aeronautical telecommunications network protocol.

  7. ATLAS reliability analysis

    Bartsch, R.R.


    Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability which is to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with increasing probability of failure with the number of shots. For transmission line insulation, a minimum thickness is obtained and for the railgaps, a method for obtaining a maintenance interval from forthcoming life tests is suggested.
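The analysis combines per-component Weibull failure probabilities into a series-system reliability that must stay above 95% per shot. A sketch of that combination; the shape and scale parameters below are invented for illustration and are not the ATLAS data:

```python
import math

def weibull_reliability(n_shots: float, eta: float, beta: float) -> float:
    """Weibull survival probability after n_shots (scale eta, shape beta)."""
    return math.exp(-((n_shots / eta) ** beta))

# Hypothetical (eta, beta) pairs for three component classes.
components = [(5000, 1.5), (8000, 2.0), (12000, 1.2)]

def system_reliability(n_shots: float) -> float:
    """Series system: every component must survive, so multiply."""
    r = 1.0
    for eta, beta in components:
        r *= weibull_reliability(n_shots, eta, beta)
    return r

print(system_reliability(100))  # stays above 0.95 for these parameters
```

With shape parameters above 1, the hazard grows with shot count, which is why the analysis ties maintenance intervals to forthcoming life tests.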

  8. Reliability of Circumplex Axes

    Micha Strack


    Full Text Available We present a confirmatory factor analysis (CFA) procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing interpersonal and motivational constructs—Interpersonal Adjective List (IAL), Interpersonal Adjective Scales (revised; IAS-R), Inventory of Interpersonal Problems (IIP), Impact Messages Inventory (IMI), Circumplex Scales of Interpersonal Values (CSIV), Support Action Scale Circumplex (SAS-C), Interaction Problems With Animals (IPI-A), Team Role Circle (TRC), Competing Values Leadership Instrument (CV-LI), Love Styles, Organizational Culture Assessment Instrument (OCAI), Customer Orientation Circle (COC), and System for Multi-Level Observation of Groups (behavioral adjectives; SYMLOG)—in 17 German-speaking samples (29 subsamples, grouped by self-report, other report, and metaperception assessments). The general factor accounted for a proportion ranging from 1% to 48% of the item variance, the axes component for 2% to 30%, and scale-specificity for 1% to 28%, respectively. Reliability estimates varied considerably, from .13 to .92. An application of the Nunnally and Bernstein formula proposed by Markey, Markey, and Tinsley overestimated axes reliabilities in cases of large scale-specificity but otherwise worked effectively. Contemporary circumplex evaluations such as Tracey's RANDALL are sensitive to the ratio of the axes and scale-specificity components. In contrast, the proposed model isolates both components.
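As a crude illustration of the decomposition (not the actual CFA estimation, which fits the model to item covariances): each variance component gets a share of the total, and only the axes share feeds the reliability estimate. The component values below are made up:

```python
# Hypothetical variance components from a tau-equivalent CFA decomposition.
variance = {
    "general": 0.20,
    "axes": 0.25,
    "scale_specific": 0.15,
    "block_specific": 0.10,
    "item_specific": 0.30,
}

total = sum(variance.values())
axes_share = variance["axes"] / total  # only this share counts as reliable
print(round(axes_share, 2))  # 0.25 under these made-up components
```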

  9. Reliability of the Advanced Psychodiagnostic Interpretation (API) Scoring System for the Bender Gestalt.

    Aucone, Ernest J.; Raphael, Alan J.; Golden, Charles J.; Espe-Pfeifer, Patricia; Seldon, Jen; Pospisil, Tanya; Dornheim, Liane; Proctor-Weber, Zoe; Calabria, Michael


    Assessed the interrater reliability of the revised Advanced Psychodiagnostic Interpretation (API) (A. Raphael and C. Golden, 1998) scoring system for the Bender Gestalt Test (L. Bender, 1938). Agreement across nine raters exceeded 90% for each of three clinical protocols, and kappa statistics indicated good interrater reliability. (SLD)

  10. Dysphonia risk screening protocol

    Katia Nemr


    Full Text Available OBJECTIVE: To propose and test the applicability of a dysphonia risk screening protocol with score calculation in individuals with and without dysphonia. METHOD: This descriptive cross-sectional study included 365 individuals (41 children, 142 adult women, 91 adult men and 91 seniors) divided into a dysphonic group and a non-dysphonic group. The protocol consisted of 18 questions and a score was calculated using a 10-cm visual analog scale. The measured value on the visual analog scale was added to the overall score, along with other partial scores. Speech samples allowed for analysis/assessment of the overall degree of vocal deviation and initial definition of the respective groups and after six months, the separation of the groups was confirmed using an acoustic analysis. RESULTS: The mean total scores were different between the groups in all samples. Values ranged between 37.0 and 57.85 in the dysphonic group and between 12.95 and 19.28 in the non-dysphonic group, with overall means of 46.09 and 15.55, respectively. High sensitivity and specificity were demonstrated when discriminating between the groups with the following cut-off points: 22.50 (children), 29.25 (adult women), 22.75 (adult men), and 27.10 (seniors). CONCLUSION: The protocol demonstrated high sensitivity and specificity in differentiating groups of individuals with and without dysphonia in different sample groups and is thus an effective instrument for use in voice clinics.
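Applying the group-specific cut-off points reported above is a simple threshold check; the scoring itself (18 questions plus the visual analog scale) is summarised here as a single total score:

```python
# Cut-off points per group, taken from the results above.
CUTOFFS = {
    "child": 22.50,
    "adult_woman": 29.25,
    "adult_man": 22.75,
    "senior": 27.10,
}

def at_risk(total_score: float, group: str) -> bool:
    """Flag dysphonia risk when the total score exceeds the group cut-off."""
    return total_score > CUTOFFS[group]

print(at_risk(46.09, "adult_man"))  # True: the dysphonic-group overall mean
print(at_risk(15.55, "adult_man"))  # False: the non-dysphonic overall mean
```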

  11. Automatic Validation of Protocol Narration

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpaolo;


    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  12. Evaluation of molecular assays for identification Campylobacter fetus species and subspecies and development of a C. fetus specific real-time PCR assay

    Graaf-van Bloois, van der L.; Bergen, van M.A.P.; Wal, van der F.J.; Boer, de A.G.; Duim, B.; Schmidt, T.; Wagenaar, J.A.


    Phenotypic differentiation between Campylobacter fetus (C. fetus) subspecies fetus and C. fetus subspecies venerealis is hampered by poor reliability and reproducibility of biochemical assays. AFLP (amplified fragment length polymorphism) and MLST (multilocus sequence typing) are the molecular

  13. Rely-Guarantee Protocols


    Universidade Nova de Lisboa, Caparica, Portugal. This document is a companion technical report of the paper, "Rely-Guarantee Protocols", to appear in the...

  14. The Assessment of Parameters Affecting the Quality of Cord Blood by the Appliance of the Annexin V Staining Method and Correlation with CFU Assays.

    Radke, Teja Falk; Barbosa, David; Duggleby, Richard Charles; Saccardi, Riccardo; Querol, Sergio; Kögler, Gesine


    The assessment of nonviable haematopoietic cells by the Annexin V staining method in flow cytometry has recently been published by Duggleby et al. Because it correlates better with the observed colony formation in methylcellulose assays than the standard ISHAGE protocol does, it presents a promising method to predict cord blood potency. Herein, we applied this method to examine the processing parameters that could potentially affect cord blood viability. We could verify that the current standards regarding time and temperature are sufficient, since no significant difference was observed within 48 hours or in storage at 4°C up to 26°C. However, the addition of DMSO for cryopreservation alone leads to an inevitable increase in nonviable haematopoietic stem cells from initially 14.8% ± 4.3% to at least 30.6% ± 5.5%. Furthermore, CFU-assays with varied seeding density were performed in order to evaluate the applicability as a quantitative method. The results revealed that reproducible clonogenic efficiency (ClonE) could be assessed only in a narrow range, giving at least a semiquantitative estimation. We conclude that both the Annexin V staining method and CFU-assays with defined seeding density are reliable means leading to a better prediction of the final potency. Especially Annexin V, due to its fast readout, is a practical tool for examining and optimising specific steps in processing, while CFU-assays add a functional confirmation.
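Clonogenic efficiency is the fraction of seeded cells that form colonies; the counts below are hypothetical and simply illustrate the calculation whose reproducibility the study evaluates across seeding densities:

```python
def clonogenic_efficiency(colonies: int, cells_seeded: int) -> float:
    """ClonE as a percentage: colonies formed per cells seeded."""
    return 100.0 * colonies / cells_seeded

# Hypothetical plate: 45 colonies counted from 500 seeded cells.
print(clonogenic_efficiency(45, 500))  # 9.0 %
```

At high seeding densities adjacent colonies merge and the count (and thus ClonE) is underestimated, which is why only a narrow density range gives reproducible values.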



  16. Voice over Internet Protocol (VOIP): Future Potential

    Ms. Deepti


    Full Text Available VoIP (voice over IP) delivers standard voice telephone services over the Internet Protocol (IP). VoIP is the technology of digitizing sound, compressing it, breaking it up into data packets, and sending it over an IP (internet protocol) network, where it is reassembled, decompressed, and converted back into an analog waveform. Gateways are the key component required to facilitate IP telephony: a gateway bridges the traditional circuit-switched PSTN with the packet-switched Internet. The paper covers software, hardware, and protocol requirements, then weighs the VoIP advantages, such as low cost, portability, free and advanced features, bandwidth efficiency, and call recording and monitoring, against the VoIP disadvantages, such as power dependency, quality of voice and service, security, and reliability. With ever-increasing internet penetration and better broadband connectivity, VoIP is going to expand further, with businesses already using VoIP standalone or in a hybrid format, although our focus and scope here remain VoIP. Mobile VoIP, an infant with less than 4% market share, has so far focused on increasing active subscriptions without a sustainable revenue model, but has the potential to contend with static VoIP for space in the days ahead.

  17. B cell helper assays.

    Abrignani, Sergio; Tonti, Elena; Casorati, Giulia; Dellabona, Paolo


    Activation, proliferation and differentiation of naïve B lymphocytes into memory B cells and plasma cells require engagement of the B cell receptor (BCR) coupled to T-cell help (1, 2). T cells deliver help in a cognate fashion when they are activated upon recognition of specific MHC-peptide complexes presented by B cells. T cells can also deliver help in a non-cognate or bystander fashion, when they do not find specific MHC-peptide complexes on B cells and are activated by alternative mechanisms. T-cell-dependent activation of B cells can be studied in vitro with experimental models called "B cell helper assays", which are based on the co-culture of B cells with activated T cells. These assays make it possible to decipher the molecular bases of productive T-dependent B cell responses. We show here examples of in vitro B cell helper assays, which can be reproduced with any subset of T lymphocytes that displays the appropriate helper signals.

  18. Ultimately Reliable Pyrotechnic Systems

    Scott, John H.; Hinkel, Todd


    This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high-reliability operation in extreme environments and illustrates the potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the most successful in the history of human spaceflight. No pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably at temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, with never a failure. The recent landing on Mars of the Curiosity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror.' Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world.
Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing

  19. Advanced transport protocols for space communications

    Fang, Jian

    Satellite IP networks are characterized by high bit error rates, long propagation delays, low-bandwidth feedback links, and persistent fades resulting from varying weather patterns. A new unicast transport protocol is designed to address all the above challenges. Two new algorithms, Jump Start and Quick Recovery, are presented to replace the traditional Slow Start algorithm and to recover rapidly from multiple segment losses within one window of data. The characteristics of satellite IP networks also distinguish satellite multicasting from multicasting in terrestrial wireline networks. A reliable data multicast transport protocol, TCP-Peachtree, is proposed to solve the acknowledgment implosion and scalability problems in satellite IP networks. Developments in space technology are enabling the realization of deep space missions. The scientific data from these missions need to be delivered to the Earth successfully. To achieve this goal, the InterPlaNetary Internet is proposed as the Internet of the deep space planetary networks, which is characterized by extremely high propagation delays, high link errors, asymmetrical bandwidth, and blackouts. A reliable transport protocol, TP-Planet, is proposed for data traffic in the InterPlaNetary Internet. TP-Planet deploys rate-based additive-increase multiplicative-decrease (AIMD) congestion control and replaces the inefficient slow start algorithm with a novel Initial State algorithm that allows the capture of link resources in a very fast and controlled manner. A new congestion detection and control mechanism is developed and a Blackout State is incorporated into the protocol operation. Multimedia traffic is also one part of the aggregate traffic over InterPlaNetary Internet backbone links and it has additional requirements such as minimum bandwidth, smooth traffic, and error control. To address all the above challenges, RCP-Planet is proposed. RCP-Planet consists of two novel algorithms, i.e., Begin State and
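TP-Planet's rate-based congestion control follows the generic additive-increase/multiplicative-decrease (AIMD) pattern. As an illustration only (the increase step `alpha`, decrease factor `beta`, and rate floor below are hypothetical parameters, not values from TP-Planet, and this sketch does not reproduce its Initial State or Blackout State logic), one AIMD update per feedback round can be written as:

```python
def aimd_step(rate, loss, alpha=1.0, beta=0.5, floor=1.0):
    """One AIMD update: additive increase by `alpha` per round trip
    when no loss is detected, multiplicative decrease by `beta` on loss."""
    if loss:
        return max(floor, rate * beta)
    return rate + alpha

# Trace a hypothetical sending rate (segments/RTT) over six RTTs,
# with a single loss event detected at RTT 4.
rate = 10.0
trace = []
for loss in [False, False, False, True, False, False]:
    rate = aimd_step(rate, loss)
    trace.append(rate)
print(trace)  # [11.0, 12.0, 13.0, 6.5, 7.5, 8.5]
```

The slow linear probing after each halving is exactly why a slow-start-style ramp is inefficient over links with minutes-long feedback delays, motivating the faster controlled capture described above.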

  20. A rapid and efficient assay for extracting DNA from fungi

    Griffin, Dale W.; Kellogg, C.A.; Peak, K.K.; Shinn, E.A.


    Aims: A method for the rapid extraction of fungal DNA from small quantities of tissue in a batch-processing format was investigated. Methods and Results: Tissue (DNA for PCR/sequencing applications. Conclusions: The method allowed batch DNA extraction from multiple fungal isolates using a simple yet rapid and reliable assay. Significance and Impact of the Study: Use of this assay will allow researchers to obtain DNA from fungi quickly for use in molecular assays that previously required specialized instrumentation, was time-consuming or was not conducive to batch processing.

  1. Ferrite logic reliability study

    Baer, J. A.; Clark, C. B.


    Development and use of digital circuits called all-magnetic logic are reported. In these circuits the magnetic elements and their windings comprise the active circuit devices in the logic portion of a system. The ferrite logic (FLO) device belongs to the all-magnetic class of logic circuits. The FLO device is novel in that it makes use of a dual or bimaterial ferrite composition in one physical ceramic body. This bimaterial feature, coupled with its potential for relatively high-speed operation, makes it attractive for high-reliability applications. (Maximum speed of operation approximately 50 kHz.)

  2. Blade reliability collaborative :

    Ashwill, Thomas D.; Ogilvie, Alistair B.; Paquette, Joshua A.


    The Blade Reliability Collaborative (BRC) was started by the Wind Energy Technologies Department of Sandia National Laboratories and DOE in 2010 with the goal of gaining insight into planned and unplanned O&M issues associated with wind turbine blades. A significant part of BRC is the Blade Defect, Damage and Repair Survey task, which will gather data from blade manufacturers, service companies, operators and prior studies to determine details about the largest sources of blade unreliability. This report summarizes the initial findings from this work.

  3. The intergroup protocols: Scalable group communication for the internet

    Berket, Karlo [Univ. of California, Santa Barbara, CA (United States)


    Reliable group ordered delivery of multicast messages in a distributed system is a useful service that simplifies the programming of distributed applications. Such a service helps to maintain the consistency of replicated information and to coordinate the activities of the various processes. With the increasing popularity of the Internet, there is an increasing interest in scaling the protocols that provide this service to the environment of the Internet. The InterGroup protocol suite, described in this dissertation, provides such a service, and is intended for the environment of the Internet with scalability to large numbers of nodes and high latency links. The InterGroup protocols approach the scalability problem from various directions. They redefine the meaning of group membership, allow voluntary membership changes, add a receiver-oriented selection of delivery guarantees that permits heterogeneity of the receiver set, and provide a scalable reliability service. The InterGroup system comprises several components, executing at various sites within the system. Each component provides part of the services necessary to implement a group communication system for the wide-area. The components can be categorized as: (1) control hierarchy, (2) reliable multicast, (3) message distribution and delivery, and (4) process group membership. We have implemented a prototype of the InterGroup protocols in Java, and have tested the system performance in both local-area and wide-area networks.

  4. CT protocol review and optimization.

    Kofler, James M; Cody, Dianna D; Morin, Richard L


    To reduce the radiation dose associated with CT scans, much attention is focused on CT protocol review and improvement. In fact, annual protocol reviews will soon be required for ACR CT accreditation. A major challenge in the protocol review process is determining whether a current protocol is optimal and deciding what steps to take to improve it. In this paper, the authors describe methods for pinpointing deficiencies in CT protocols and provide a systematic approach for optimizing them. Emphasis is placed on a team approach, with a team consisting of at least one radiologist, one physicist, and one technologist. This core team completes a critical review of all aspects of a CT protocol and carefully evaluates proposed improvements. Changes to protocols are implemented only with consensus of the core team, with consideration of all aspects of the CT examination, including image quality, radiation dose, patient care and safety, and workflow.

  5. Comparison of real-time SYBR green dengue assay with real-time TaqMan RT-PCR dengue assay and the conventional nested PCR for diagnosis of primary and secondary dengue infection

    Damodar Paudel; Richard Jarman; Kriengsak Limkittikul; Chonticha Klungthong; Supat Chamnanchanunt; Ananda Nisalak; Robert Gibbons; Watcharee Chokejindachai


    Background: Dengue fever and dengue hemorrhagic fever are caused by dengue virus. Dengue infection remains a burning problem in many countries. To diagnose acute dengue in the early phase, we improved a low-cost, rapid SYBR green real-time assay and compared its sensitivity and specificity with those of the real-time TaqMan® assay and the conventional nested PCR assay. Aims: To develop a low-cost, rapid and reliable real-time SYBR green diagnostic dengue assay and compare it with the TaqMan real-time assay and conven...

  6. Materials and Reliability Handbook for Semiconductor Optical and Electron Devices

    Pearton, Stephen


    Materials and Reliability Handbook for Semiconductor Optical and Electron Devices provides comprehensive coverage of reliability procedures and approaches for electron and photonic devices. These include lasers and high speed electronics used in cell phones, satellites, data transmission systems and displays. Lifetime predictions for compound semiconductor devices are notoriously inaccurate due to the absence of standard protocols. Manufacturers have relied on extrapolation back to room temperature of accelerated testing at elevated temperature. This technique fails for scaled, high current density devices. Device failure is driven by electric field or current mechanisms or low activation energy processes that are masked by other mechanisms at high temperature. The Handbook addresses reliability engineering for III-V devices, including materials and electrical characterization, reliability testing, and electronic characterization. These are used to develop new simulation technologies for device operation and ...
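The temperature extrapolation the handbook criticizes is conventionally an Arrhenius calculation: a lifetime measured at an elevated stress temperature is scaled back to use conditions by an acceleration factor. A minimal sketch of that conventional approach follows; the activation energy, temperatures, and stress-test MTTF below are hypothetical illustrations, not values from the handbook:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between a stress temperature and the
    use temperature, for a single failure mechanism with activation
    energy ea_ev (eV). Temperatures are given in degrees Celsius."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_B) * (1.0 / t_use - 1.0 / t_stress))

# Hypothetical: Ea = 0.7 eV, use at 25 C, accelerated test at 125 C.
af = acceleration_factor(0.7, 25.0, 125.0)
# Projected use-condition MTTF from an observed 1,000 h stress MTTF:
print(round(af, 1), round(1000.0 * af, 1))
```

The handbook's warning is precisely that this single-mechanism model breaks down for scaled, high-current-density devices, where field- or current-driven and low-activation-energy mechanisms dominate at use conditions but are masked at high temperature.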


    Division des Ressources Humaines; Human Resources Division; Tel. 74683-79494


    Senior officials, holders of FRENCH PROTOCOL cards (blue cards) due to expire on 31.12.2000, are requested to return these cards and those of family members, for extension, to: Bureau des cartes, Bât 33.1-009/1-015. Should the three spaces for authentication on the back of the card be full, please enclose two passport photographs for a new card. In the case of children aged 14 and over, an attestation of dependency and a school certificate should be returned with the card.



    Division du Personnel


    Senior officials, holders of FRENCH PROTOCOL cards (blue cards) due to expire on 31.12.1999, are requested to return these cards and those of family members, for extension, to: Bureau des cartes, bâtiment 33.1-025. Should the 3 spaces for authentication on the back of the card be full, please enclose 2 passport photographs for a new card. In the case of children aged 14 and over, an attestation of dependency and a school certificate should be returned with the card. Personnel Division, Tel. 79494/74683

  10. Load Control System Reliability

    Trudnowski, Daniel [Montana Tech of the Univ. of Montana, Butte, MT (United States)


    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions to the project scope occurred in August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also included matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc.; and MSE. Research focused on two areas: real-time power-system load control methodologies, and power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  11. Supply chain reliability modelling

    Eugen Zaitsev


    Full Text Available Background: Today it is virtually impossible to operate alone at the international level in the logistics business. This promotes the establishment and development of new integrated business entities - logistic operators. However, such cooperation within a supply chain also creates many problems related to supply chain reliability and to the optimization of supply planning. The aim of this paper was to develop and formulate a mathematical model and algorithms to find the optimum plan of supplies using an economic criterion, together with a model for evaluating the probability of failure-free operation of the supply chain. Methods: The mathematical model and algorithms to find the optimum plan of supplies were developed and formulated using the economic criterion and the model for evaluating the probability of failure-free operation of the supply chain. Results and conclusions: The problem of ensuring failure-free performance of a goods supply channel analyzed in the paper is characteristic of distributed network systems that make active use of business-process outsourcing technologies. The complex planning problem occurring in such systems, which requires taking into account the consumer's requirements for failure-free performance in terms of supply volumes and correctness, can be reduced to a relatively simple linear programming problem through logical analysis of the structures. The sequence of operations that should be taken into account during supply planning with the supplier's functional reliability is presented.

  12. Nanoparticle-assay marker interaction: effects on nanotoxicity assessment

    Zhao, Xinxin; Xiong, Sijing; Huang, Liwen Charlotte; Ng, Kee Woei; Loo, Say Chye Joachim


    Protein-based cytotoxicity assays such as lactate dehydrogenase (LDH) and tumor necrosis factor-alpha (TNF-α) are commonly used in cytotoxic evaluation of nanoparticles (NPs) despite numerous reports on possible interactions with protein markers in these assays that can confound the results obtained. In this study, conventional cytotoxicity assays where assay markers may (LDH and TNF-α) or may not (PicoGreen and WST-8) come into contact with NPs were used to evaluate the cytotoxicity of NPs. The findings revealed selective interactions between negatively charged protein assay markers (LDH and TNF-α) and positively charged ZnO NPs under abiotic conditions. The adsorption and interaction with these protein assay markers were strongly influenced by the surface charge, concentration, and specific surface area of the NPs, thereby resulting in less than accurate cytotoxic measurements, as observed from actual cell viability measurements. An improved protocol for the LDH assay was, therefore, proposed and validated by eliminating any effects associated with protein-particle interactions. In view of this, additional measures and precautions should be taken when evaluating the cytotoxicity of NPs with standard protein-based assays, particularly when they are of opposite charges.
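One way such a protein-particle correction could work, sketched here purely as an illustration (this is not the authors' validated protocol; the absorbance values and the `np_loss_fraction` abiotic control are hypothetical), is to rescale the measured LDH signal by the marker fraction lost when NPs are incubated with LDH alone:

```python
def corrected_ldh_release(sample_abs, blank_abs, max_abs, np_loss_fraction):
    """Percent LDH release corrected for marker lost to nanoparticle
    adsorption, as measured in an abiotic NP + LDH control.

    np_loss_fraction: fraction of the LDH signal adsorbed by the NPs
    alone (0.0 = no interference, 0.3 = 30% of the marker lost).
    """
    if not 0.0 <= np_loss_fraction < 1.0:
        raise ValueError("np_loss_fraction must be in [0, 1)")
    raw = (sample_abs - blank_abs) / (max_abs - blank_abs)
    # Rescale for the marker fraction the particles removed abiotically.
    return 100.0 * raw / (1.0 - np_loss_fraction)

# Hypothetical absorbances: sample 0.62, blank 0.10, full-lysis 1.10,
# with 30% of the LDH signal lost to ZnO NP adsorption abiotically.
print(round(corrected_ldh_release(0.62, 0.10, 1.10, 0.30), 1))  # 74.3
```

Without the rescaling step, the same readings would report roughly 52% release, illustrating how marker adsorption alone can masquerade as reduced cytotoxicity.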


  14. Dynamic scapular movement analysis: is it feasible and reliable in stroke patients during arm elevation?

    Liesbet De Baets

    Full Text Available Knowledge of three-dimensional scapular movements is essential to understand post-stroke shoulder pain. The goal of the present work is to determine the feasibility and the within- and between-session reliability of a movement protocol for three-dimensional scapular movement analysis in stroke patients with mild to moderate impairment, using an optoelectronic measurement system. Scapular kinematics of 10 stroke patients and 10 healthy controls was recorded on two occasions during active anteflexion and abduction from 0° to 60° and from 0° to 120°. All tasks were executed unilaterally and bilaterally. The protocol's feasibility was first assessed, followed by the within- and between-session reliability of scapular total range of motion (ROM), joint angles at the start position, and angular waveforms. Additionally, measurement errors were calculated for all parameters. Results indicated that the protocol was generally feasible for this group of patients and assessors. Within-session reliability was very good for all tasks. Between sessions, scapular angles at the start position were measured reliably for most tasks, while scapular ROM was more reliable during the 120° tasks. In general, scapular angles showed higher reliability during anteflexion compared to abduction, especially for protraction. Scapular lateral rotations resulted in the smallest measurement errors. This study indicates that scapular kinematics can be measured reliably and with precision within one measurement session. In case of multiple test sessions, further methodological optimization is required for this protocol to be suitable for clinical decision-making and evaluation of treatment efficacy.

  15. Self-Adaptive and Energy-Efficient MAC Protocol Based on Event-Driven

    Xin Hou


    Full Text Available Combining the characteristics and design requirements of WSN MAC-layer protocols with the requirements of WSN monitoring applications, this paper puts forward an event-driven MAC protocol. The protocol's algorithm addresses the network congestion and unnecessary node energy consumption caused by transmitting and receiving large amounts of redundant monitoring data. It is an adaptive, low-power MAC-layer protocol built on the theoretical foundation of the S_MAC protocol, combining the event-driven mechanism with the characteristics of WSNs. It retains the periodic dormancy mechanism of S_MAC and, on the premise of data reliability, reduces data redundancy and communication delay, improves overall network throughput, and ensures network safety and reliability, which greatly extends node working time.
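The redundancy-suppression idea behind such an event-driven MAC protocol can be illustrated with a toy filter that transmits a reading only when it differs meaningfully from the last value reported. This sketch is purely illustrative of the event-driven principle and is not the paper's actual algorithm; the sample values and threshold are hypothetical:

```python
def event_driven_filter(readings, threshold):
    """Report only readings that differ from the last reported value
    by more than `threshold`, suppressing redundant monitoring data."""
    reported = []
    last = None
    for t, value in enumerate(readings):
        if last is None or abs(value - last) > threshold:
            reported.append((t, value))
            last = value
    return reported

# Hypothetical temperature samples; only significant changes are sent,
# so the radio can sleep through the redundant ones.
samples = [20.0, 20.1, 20.05, 22.5, 22.6, 25.0]
print(event_driven_filter(samples, 1.0))  # [(0, 20.0), (3, 22.5), (5, 25.0)]
```

Dropping three of six transmissions in this toy trace is the kind of saving that, combined with S_MAC-style periodic sleep, extends node working time.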

  16. Reliability Analysis and Modeling of ZigBee Networks

    Lin, Cheng-Min

    The architecture of ZigBee networks focuses on developing low-cost, low-speed ubiquitous communication between devices. The ZigBee technique is based on IEEE 802.15.4, which specifies the physical layer and medium access control (MAC) for a low rate wireless personal area network (LR-WPAN). Currently, numerous wireless sensor networks have adapted the ZigBee open standard to develop various services to promote improved communication quality in our daily lives. The problem of system and network reliability in providing stable services has become more important because these services will be stopped if the system and network reliability is unstable. The ZigBee standard has three kinds of networks: star, tree and mesh. The paper models the ZigBee protocol stack from the physical layer to the application layer and analyzes the reliability and mean time to failure (MTTF) of these layers. Channel resource usage, device role, network topology and application objects are used to evaluate reliability in the physical, medium access control, network, and application layers, respectively. In the star or tree networks, a series system and the reliability block diagram (RBD) technique can be used to solve their reliability problem. However, a division technology is applied here to overcome the problem because the network complexity is higher than that of the others. A mesh network using division technology is classified into several non-reducible series systems and edge parallel systems. Hence, the reliability of mesh networks is easily solved using series-parallel systems through our proposed scheme. The numerical results demonstrate that the reliability will increase for mesh networks when the number of edges in parallel systems increases, while the reliability quickly drops when the number of edges and the number of nodes increase for all three networks. Greater use of resources is another factor that decreases reliability. However, lower network reliability will occur due to
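The series/parallel reliability algebra behind the RBD technique mentioned above is compact: a series system works only if all blocks work, while a redundant parallel system fails only if all blocks fail. A minimal sketch, with hypothetical block reliabilities rather than values from the paper's case study:

```python
from math import prod

def series(reliabilities):
    """A series system works only if every block works."""
    return prod(reliabilities)

def parallel(reliabilities):
    """A parallel (redundant) system fails only if every block fails."""
    return 1.0 - prod(1.0 - r for r in reliabilities)

# Hypothetical mesh fragment: two redundant links (each R = 0.9)
# in parallel, in series with a coordinator node of R = 0.95.
links = parallel([0.9, 0.9])    # 1 - 0.1 * 0.1 = 0.99
system = series([links, 0.95])  # 0.99 * 0.95 = 0.9405
print(round(system, 4))
```

This also shows the paper's numerical trend in miniature: adding edges to the parallel block raises system reliability, while adding blocks in series lowers it.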

  17. OSS reliability measurement and assessment

    Yamada, Shigeru


    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.
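Among the NHPP models the book covers, the Goel-Okumoto model is the classic example. As an illustration with hypothetical parameter values (not ones from the book), its mean value function and a derived reliability estimate can be sketched as:

```python
import math

def go_mean_failures(t, a, b):
    """Goel-Okumoto NHPP mean value function: expected cumulative
    failures by test time t, where a is the total expected number of
    faults and b is the per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def go_reliability(x, t, a, b):
    """Probability of no failure in (t, t + x] after testing to time t."""
    return math.exp(-(go_mean_failures(t + x, a, b) - go_mean_failures(t, a, b)))

# Hypothetical fit: a = 100 faults, b = 0.02 per hour.
print(go_mean_failures(100.0, 100.0, 0.02))      # expected failures by t = 100 h
print(go_reliability(10.0, 100.0, 100.0, 0.02))  # reliability over the next 10 h
```

In practice `a` and `b` are estimated from observed OSS failure data (e.g. by maximum likelihood), which is where the book's measurement and assessment methods come in.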

  18. Reliability and validity in research.

    Roberts, Paula; Priest, Helena

    This article examines reliability and validity as ways to demonstrate the rigour and trustworthiness of quantitative and qualitative research. The authors discuss the basic principles of reliability and validity for readers who are new to research.

  19. Wolbachia detection: an assessment of standard PCR protocols.

    Simões, P M; Mialdea, G; Reiss, D; Sagot, M-F; Charlat, S


    Wolbachia is a large monophyletic genus of intracellular bacteria, traditionally detected using PCR assays. Its considerable phylogenetic diversity and impact on arthropods and nematodes make it urgent to assess the efficiency of these screening protocols. The sensitivity and range of commonly used PCR primers and of a new set of 16S primers were evaluated on a wide range of hosts and Wolbachia strains. We show that certain primer sets are significantly more efficient than others but that no single protocol can ensure the specific detection of all known Wolbachia infections.

  20. A Performance Comparison of Routing Protocols for Ad Hoc Networks

    Hicham Zougagh


    Full Text Available A Mobile Ad hoc Network (MANET) is a collection of mobile nodes in which the wireless links are frequently broken down due to mobility and dynamic infrastructure. Routing is a significant issue and challenge in ad hoc networks. Many routing protocols, such as OLSR and AODV, have been proposed so far to improve routing performance and reliability. In this paper, we describe the Optimized Link State Routing Protocol (OLSR) and the Ad hoc On-Demand Distance Vector (AODV) protocol. We evaluate their performance through exhaustive simulations using the Network Simulator 2 (ns2), varying conditions such as node mobility and network density.

  1. Reliability and Its Quantitative Measures

    Alexandru ISAIC-MANIU


    Full Text Available This article opens the discussion of software reliability issues through wide-ranging statistical indicators, which are designed based on information collected from operation or testing (samples). The reliability issues are also developed for the case of the main reliability laws (exponential, normal, Weibull), which, once validated for a particular system, allow the calculation of reliability indicators with a higher degree of accuracy and trustworthiness.
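For the exponential and Weibull laws named above, the basic reliability indicator R(t) is directly computable. A small sketch with hypothetical parameters, which also shows that the exponential law is the Weibull special case with shape parameter beta = 1:

```python
import math

def weibull_reliability(t, beta, eta):
    """Weibull survival function R(t) = exp(-(t/eta)**beta),
    with shape parameter beta and scale (characteristic life) eta."""
    return math.exp(-((t / eta) ** beta))

def exponential_reliability(t, lam):
    """Exponential law R(t) = exp(-lambda * t): the beta = 1 special case."""
    return math.exp(-lam * t)

# With beta = 1 and eta = 1/lambda the two laws coincide:
print(weibull_reliability(500.0, 1.0, 1000.0))   # both equal e**-0.5 ~ 0.6065
print(exponential_reliability(500.0, 0.001))
```

Fitting beta and eta from operating or test samples is exactly the estimation step the article describes; beta < 1, = 1, and > 1 correspond to decreasing, constant, and increasing hazard rates.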

  2. New simple spectrophotometric assay of total carotenes in margarines

    Luterotti, S.; Bicanic, D.D.; Pozgaj, R.


    A direct and reliable spectrophotometric method for assaying total carotenes (TC) in margarines with a minimum of sample manipulation is proposed. For the first time, the saponification step used in the determination of carotenes in margarines was omitted, leading to a substantial cost saving and reduction of

  3. Assessment of the skin irritation potential of chemicals by using the SkinEthic reconstructed human epidermal model and the common skin irritation protocol evaluated in the ECVAM skin irritation validation study.

    Kandárová, Helena; Liebsch, Manfred; Schmidt, Elisabeth; Genschow, Elke; Traue, Dieter; Spielmann, Horst; Meyer, Kirstin; Steinhoff, Claudia; Tornier, Carine; De Wever, Bart; Rosdy, Martin


    Currently, two reconstructed human skin models, EpiDerm and EPISKIN, are being evaluated in an ECVAM skin irritation validation study. A common skin irritation protocol has been developed, differing only in minor technical details between the two models. A small-scale study, applying this common skin irritation protocol to the SkinEthic reconstructed human epidermis (RHE), was performed at ZEBET at the BfR, Berlin, Germany, to consider whether this protocol could be successfully transferred to another epidermal model. Twenty substances from Phase III of the ECVAM prevalidation study on skin irritation were tested with the SkinEthic RHE. After minor, model-specific adaptations for the SkinEthic RHE, almost identical results to those obtained with the EpiDerm and EPISKIN models were achieved. The overall accuracy of the method was more than 80%, indicating a reliable prediction of the skin irritation potential of the tested chemicals when compared to in vivo rabbit data. As a next step, interlaboratory reproducibility was assessed in a study conducted between ZEBET and the Department of Experimental Toxicology, Schering AG, Berlin, Germany. Six coded substances were tested in both laboratories, with three different batches of the SkinEthic model. The assay results showed good reproducibility and correct predictions of the skin irritation potential for all six test chemicals. The results obtained with the SkinEthic RHE and the common protocol were reproducible in both phases, and the overall outcome is very similar to that of earlier studies with the EPISKIN and EpiDerm models. Therefore, the SkinEthic skin irritation assay test protocol can now be evaluated in a formal "catch-up" validation study.

  4. Growth cone collapse assay.

    Cook, Geoffrey M W; Jareonsettasin, Prem; Keynes, Roger J


    The growth cone collapse assay has proved invaluable in detecting and purifying axonal repellents. Glycoproteins/proteins present in detergent extracts of biological tissues are incorporated into liposomes, added to growth cones in culture and changes in morphology are then assessed. Alternatively purified or recombinant molecules in aqueous solution may be added directly to the cultures. In both cases after a defined period of time (up to 1 h), the cultures are fixed and then assessed by inverted phase contrast microscopy for the percentage of growth cones showing a collapsed profile with loss of flattened morphology, filopodia, and lamellipodia.



    The present invention relates to a device for use in performing assays on standard laboratory solid supports whereon chemical entities are attached. The invention furthermore relates to the use of such a device and a kit comprising such a device. The device according to the present invention is a......, when operatively connected, one or more chambers (21) comprising the chemical entities (41), the inlet(s) (5) and outlet(s) (6) and chambers (21) being in fluid connection. The device further comprises means for providing differing chemical conditions in each chamber (21)....

  6. Radon assay for SNO+

    Rumleskie, Janet [Laurentian University, Greater Sudbury, Ontario (Canada)]


    The SNO+ experiment will study neutrinos while located 6,800 feet below the surface of the earth at SNOLAB. Though shielded from surface backgrounds, emanation of radon radioisotopes from the surrounding rock leads to backgrounds. The characteristic decay of radon and its daughters allows for an alpha detection technique to count the number of Rn-222 atoms collected. Traps can collect Rn-222 from various positions and materials, including an assay skid that will collect Rn-222 from the organic liquid scintillator used to detect interactions within SNO+.
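
The alpha-counting step above rests on the radioactive-decay relation A = λN. As a rough illustration (not part of the SNO+ assay itself; the count, live time and detection efficiency below are invented), the number of collected Rn-222 atoms can be back-calculated from a measured alpha count:

```python
import math

RN222_HALF_LIFE_S = 3.8235 * 24 * 3600          # Rn-222 half-life, ~3.8235 days, in seconds
DECAY_CONST = math.log(2) / RN222_HALF_LIFE_S   # decay constant lambda (1/s)

def atoms_from_activity(alpha_counts: float, live_time_s: float, efficiency: float) -> float:
    """Estimate the number of Rn-222 atoms collected in a trap.

    Activity A = counts / (live_time * efficiency), and A = lambda * N,
    so N = A / lambda. Assumes the sample decays negligibly during counting.
    """
    activity_bq = alpha_counts / (live_time_s * efficiency)
    return activity_bq / DECAY_CONST

# Hypothetical measurement: 100 alpha counts in one hour at 30% detection efficiency.
n_atoms = atoms_from_activity(100, 3600.0, 0.30)
```
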


    Apoorva Dasari


    In layered networks, reliability is a major concern, as link failures at a lower layer have a great impact on network reliability. A failure at a lower layer may lead to multiple failures at the upper layers, which deteriorates network performance. In this paper, the scenario of such a layered wireless sensor network is considered for the Ad hoc On-Demand Distance Vector (AODV) and Multi-Commodity Flow (MCF) routing protocols. MCF is developed using polynomial-time approximation algorithms for the failure polynomial. Both protocols are compared in terms of different network parameters such as throughput, packet loss and end-to-end delay. It was shown that network reliability is better when the MCF protocol is used. It was also shown that maximizing the min cut of the layered network maximizes reliability in terms of successful packet transmission. The two routing protocols are implemented in the discrete-event network simulator NS-2.

  8. ALORT: a transport layer protocol using adaptive loss recovery method for WSN



    Recently, critical projects such as medical and military applications have been developed using wireless sensor networks (WSN). Performing reliable communication in WSN is extremely important for such projects, and for communication to be reliable, packet loss must be minimal. Packet loss, energy and latency are the most serious problems for transport protocols. In this paper, a reliable, energy- and delay-sensitive transport-layer protocol (ALORT) has been developed for WSN. The proposed protocol provides optimum reliability, optimum packet latency, minimum energy cost and a high packet delivery ratio by changing its loss recovery mechanism (LRM) according to channel error rates. The ALORT algorithm is compared with PSFQ and DTC in terms of packet delivery ratio, end-to-end latency and energy cost using the MiXiM framework in OMNeT++. The ALORT algorithm reduces energy cost and end-to-end latency and increases the packet delivery ratio.

  9. 2017 NREL Photovoltaic Reliability Workshop

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)


    NREL's Photovoltaic (PV) Reliability Workshop (PVRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology -- both critical goals for moving PV technologies deeper into the electricity marketplace.

  10. Testing for PV Reliability (Presentation)

    Kurtz, S.; Bansal, S.


    The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.

  11. Reliable Quantum Computers

    Preskill, J


    The new field of quantum error correction has developed spectacularly since its origin less than two years ago. Encoded quantum information can be protected from errors that arise due to uncontrolled interactions with the environment. Recovery from errors can work effectively even if occasional mistakes occur during the recovery procedure. Furthermore, encoded quantum information can be processed without serious propagation of errors. Hence, an arbitrarily long quantum computation can be performed reliably, provided that the average probability of error per quantum gate is less than a certain critical value, the accuracy threshold. A quantum computer storing about 10^6 qubits, with a probability of error per quantum gate of order 10^{-6}, would be a formidable factoring engine. Even a smaller, less accurate quantum computer would be able to perform many useful tasks. (This paper is based on a talk presented at the ITP Conference on Quantum Coherence and Decoherence, 15-18 December 1996.)
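
The role of the accuracy threshold can be illustrated with a back-of-the-envelope sketch (not taken from the paper): without error correction, and assuming independent gate errors, the chance that a long computation finishes error-free decays exponentially with the gate count, which is why per-gate error rates below the threshold matter.

```python
def p_all_gates_ok(p_gate: float, n_gates: int) -> float:
    """Probability that an n_gates computation finishes with no gate error,
    assuming independent errors and NO error correction."""
    return (1.0 - p_gate) ** n_gates

# With p = 1e-6 per gate, an unprotected 10^6-gate computation already fails
# roughly 63% of the time (success probability ~ 1/e), motivating fault-tolerant
# encoding whenever p is below the accuracy threshold.
p_ok = p_all_gates_ok(1e-6, 10**6)
```
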

  12. Development of a colorimetric assay for rapid quantitative measurement of clavulanic acid in microbial samples.

    Dai, Xida; Xiang, Sihai; Li, Jia; Gao, Qiang; Yang, Keqian


    We developed a colorimetric assay to quantify clavulanic acid (CA) in culture broth of Streptomyces clavuligerus, to facilitate screening of a large number of S. clavuligerus mutants. The assay is based on a β-lactamase-catalyzed reaction, in which the yellow substrate nitrocefin (λmax = 390 nm) is converted to a red product (λmax = 486 nm). Since CA can irreversibly inhibit β-lactamase activity, the level of CA in a sample can be measured as a function of the A390/A486 ratio in the assay mixture. The sensitivity and detection window of the assay were determined to be 50 μg L⁻¹ and 50 μg L⁻¹ to 10 mg L⁻¹, respectively. The reliability of the assay was confirmed by comparing assay results with those obtained by HPLC. The assay was used to screen a pool of 65 S. clavuligerus mutants and was reliable for identifying CA-overproducing mutants. Therefore, the assay saves time and labor in large-scale mutant screening and evaluation tasks. The detection window and the reliability of this assay are markedly better than those of previously reported CA assays. This assay method is suitable for high-throughput screening of microbial samples and allows direct visual observation of CA levels on agar plates.
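
The absorbance-ratio readout described above maps to a CA concentration via a standard curve. A minimal sketch of that mapping, with an entirely invented calibration table (real points must be measured with CA standards), might look like:

```python
import bisect

# Hypothetical standard curve: (A390/A486 ratio, CA concentration in ug/L).
# These calibration points are invented for illustration only.
STANDARD_CURVE = [(0.5, 50.0), (1.0, 500.0), (2.0, 2000.0), (4.0, 10000.0)]

def ca_concentration(a390: float, a486: float) -> float:
    """Estimate clavulanic acid concentration from the A390/A486 ratio
    by linear interpolation between standard-curve points; values outside
    the curve are clamped to the detection window."""
    ratio = a390 / a486
    ratios = [r for r, _ in STANDARD_CURVE]
    if ratio <= ratios[0]:
        return STANDARD_CURVE[0][1]
    if ratio >= ratios[-1]:
        return STANDARD_CURVE[-1][1]
    i = bisect.bisect_left(ratios, ratio)
    (r0, c0), (r1, c1) = STANDARD_CURVE[i - 1], STANDARD_CURVE[i]
    return c0 + (c1 - c0) * (ratio - r0) / (r1 - r0)
```
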

  13. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for the detection of genotoxic carcinogens: I. Summary of pre-validation study results.

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamitsu; Schechtman, Leonard M; Tice, Raymond R; Burlinson, Brian; Escobar, Patricia A; Kraynak, Andrew R; Nakagawa, Yuzuki; Nakajima, Madoka; Pant, Kamala; Asano, Norihide; Lovell, David; Morita, Takeshi; Ohno, Yasuo; Hayashi, Makoto


    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this validation effort was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The purpose of the pre-validation studies (i.e., Phase 1 through 3), conducted in four or five laboratories with extensive comet assay experience, was to optimize the protocol to be used during the definitive validation study.

  14. Assay optimization for molecular detection of Zika virus

    Corman, Victor M; Rasche, Andrea; Baronti, Cecile; Aldabbagh, Souhaib; Cadar, Daniel; Reusken, Chantal BEM; Pas, Suzan D; Goorhuis, Abraham; Schinkel, Janke; Molenkamp, Richard; Kümmerer, Beate M; Bleicker, Tobias; Brünink, Sebastian; Eschbach-Bludau, Monika; Eis-Hübinger, Anna M; Koopmans, Marion P; Schmidt-Chanasit, Jonas; Grobusch, Martin P; de Lamballerie, Xavier; Drosten, Christian


    Objective: To examine the diagnostic performance of real-time reverse transcription (RT)-polymerase chain reaction (PCR) assays for Zika virus detection. Methods: We compared seven published real-time RT-PCR assays and two new assays that we have developed. To determine the analytical sensitivity of each assay, we constructed a synthetic universal control ribonucleic acid (uncRNA) containing all of the assays' target regions on one RNA strand and spiked human blood or urine with known quantities of African or Asian Zika virus strains. Viral loads in 33 samples from Zika virus-infected patients were determined by using one of the new assays. Findings: Oligonucleotides of the published real-time RT-PCR assays showed up to 10 potential mismatches with the Asian lineage causing the current outbreak, compared with 0 to 4 mismatches for the new assays. The 95% lower detection limit of the seven most sensitive assays ranged from 2.1 to 12.1 uncRNA copies/reaction. Two assays had lower sensitivities of 17.0 and 1373.3 uncRNA copies/reaction and showed a similar sensitivity when using spiked samples. The mean viral loads in samples from Zika virus-infected patients were 5 × 10⁴ RNA copies/mL of blood and 2 × 10⁴ RNA copies/mL of urine. Conclusion: We provide reagents and updated protocols for Zika virus detection suitable for the current outbreak strains. Some published assays might be unsuitable for Zika virus detection, due to limited sensitivity and potential incompatibility with some strains. Viral concentrations in the clinical samples were close to the technical detection limit, suggesting that the use of insensitive assays will cause false-negative results. PMID:27994281
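
The primer-mismatch comparison between published oligonucleotides and outbreak strains amounts to a per-base comparison of aligned sequences. A minimal sketch (toy sequences, not real Zika primers; indels are ignored):

```python
def count_mismatches(oligo: str, target: str) -> int:
    """Count base mismatches between an oligonucleotide and an aligned
    target region of equal length (no indels considered)."""
    if len(oligo) != len(target):
        raise ValueError("oligo and target region must be aligned to equal length")
    return sum(1 for a, b in zip(oligo.upper(), target.upper()) if a != b)

# Toy example with invented sequences (not real Zika primers):
mismatches = count_mismatches("ATGCGT", "ATGAGT")  # differs only at position 4
```
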

  15. Practical and reliable enzyme test for the detection of mucopolysaccharidosis IVA (Morquio Syndrome type A) in dried blood samples.

    Camelier, Marli V; Burin, Maira G; De Mari, Jurema; Vieira, Taiane A; Marasca, Giórgia; Giugliani, Roberto


    Mucopolysaccharidosis IVA (MPS IVA), or Morquio Syndrome type A, is an autosomal recessive disease caused by deficiency of the lysosomal enzyme N-acetylgalactosamine-6-sulfatase (GALNS), resulting in excessive lysosomal storage of keratan sulfate in many tissues and organs. This accumulation causes a severe skeletal dysplasia with short stature, and affects the eye, heart and other organs, with many signs and symptoms. Morquio A syndrome is estimated to occur in 1 in 200,000 to 300,000 live births. Clinical trials with enzyme replacement therapy for this disease are in progress, and it is probable that the treatment, when available, will be more effective if started early. We describe an innovative fluorometric method for the assay of GALNS in dried blood spots (DBS). We used DBS as the enzyme source and compared the results with leukocyte samples, having studied 25 MPS IVA patients and 54 healthy controls. We optimized the assay conditions, including incubation time and stability of DBS samples. To Eppendorf-type tubes, each containing a 3-mm diameter blood spot, we added elution liquid and substrate solution. After 2 different incubations at 37°C, the amount of hydrolyzed product was compared with a calibrator to allow quantification of the enzyme activity. Results in DBS were compared to the ones obtained in leukocytes using the standard technique. The fluorometric methodology was validated in our laboratory and the assay was found sensitive and specific, allowing reliable detection of MPS IVA patients. The use of DBS simplifies the collection and transport steps, and is especially useful for testing patients from more remote areas of large countries, and when samples need to cross country borders. This assay could be easily incorporated into the protocol of reference laboratories and play a role in the screening for MPS IVA, contributing to earlier detection of affected patients. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. Stream Control Transmission Protocol Steganography

    Fraczek, Wojciech; Szczypiorski, Krzysztof


    Stream Control Transmission Protocol (SCTP) is a new transport-layer protocol that is due to replace TCP (Transmission Control Protocol) and UDP (User Datagram Protocol) in future IP networks. Currently, it is implemented in operating systems such as BSD, Linux, HP-UX and Sun Solaris. It is also supported in the Cisco network device operating system (Cisco IOS) and may be used in Windows. This paper describes potential steganographic methods that may be applied to SCTP and may pose a threat to network security. The proposed methods utilize new, characteristic SCTP features such as multi-homing and multi-streaming. The identified new threats and suggested countermeasures may be used as a supplement to RFC 5062, which describes security attacks on the SCTP protocol, and may motivate further standard modifications.

  17. RAS - Screens & Assays - Drug Discovery

    The RAS Drug Discovery group aims to develop assays that will reveal aspects of RAS biology upon which cancer cells depend. Successful assay formats are made available for high-throughput screening programs to yield potentially effective drug compounds.

  18. Electronics reliability calculation and design

    Dummer, Geoffrey W A; Hiller, N


    Electronics Reliability-Calculation and Design provides an introduction to the fundamental concepts of reliability. The increasing complexity of electronic equipment has made problems in designing and manufacturing a reliable product more and more difficult. Specific techniques have been developed that enable designers to integrate reliability into their products, and reliability has become a science in its own right. The book begins with a discussion of basic mathematical and statistical concepts, including arithmetic mean, frequency distribution, median and mode, scatter or dispersion of mea

  19. Survey protocol for invasive species

    Menza, Charles


    This protocol was developed by the Biogeography Branch of NOAA’s Center for Coastal Monitoring and Assessment to support invasive species research by the Papahānaumokuākea Marine National Monument. The protocol’s objective is to detect Carijoa riisei and Hypnea musciformis in deepwater habitats using visual surveys by technical divers. Note: This protocol is designed to detect the presence or absence of invasive species. A distinct protocol is required to collect information on abundance ...




    A prospective randomized clinical study of the outcome of labour following a "Programmed Labour Protocol" was done at the Department of OBG, MRMC Gulbarga. The protocol was aimed at the dual objective of providing pain relief during labour and reaching the goal of safe motherhood by optimizing obstetric outcome. AIMS AND OBJECTIVES: Shortening of the duration of labour; effect of labour analgesia; monitoring of the events during labour; lowering the incidence of operative deliveries. METHODS: 100 primiparous pregnant women admitted to the labour room were randomly selected. The study was designed to apply to low-risk primiparous women with a singleton cephalic presentation, no evidence of CPD, and spontaneous onset of labour. RESULTS: Shortened duration of all stages of labour, with an especially significant reduction in the duration of the active phase. CONCLUSION: Programmed labour is a simple, easy and effective method for painless and safe delivery.

  1. Protocols for Scholarly Communication

    Pepe, Alberto; Pepe, Alberto; Yeomans, Joanne


    CERN, the European Organization for Nuclear Research, has operated an institutional preprint repository for more than 10 years. The repository contains over 850,000 records of which more than 450,000 are full-text OA preprints, mostly in the field of particle physics, and it is integrated with the library's holdings of books, conference proceedings, journals and other grey literature. In order to encourage effective propagation and open access to scholarly material, CERN is implementing a range of innovative library services into its document repository: automatic keywording, reference extraction, collaborative management tools and bibliometric tools. Some of these services, such as user reviewing and automatic metadata extraction, could make up an interesting testbed for future publishing solutions and certainly provide an exciting environment for e-science possibilities. The future protocol for scientific communication should naturally guide authors towards OA publication and CERN wants to help reach a full...




    The introduction of pervasive and mobile devices has led to immense growth of real-time distributed processing. In such a context, the reliability of the computing environment is very important. Reliability is the probability that the devices, links, processes, programs and files work efficiently for the specified period of time and under the specified conditions. Distributed systems are available as conventional ring networks, clusters and agent-based systems; the reliability of such systems is the focus here. These networks are heterogeneous and scalable in nature. Several factors are to be considered for reliability estimation. These include application-related factors such as algorithms, data-set sizes, memory usage patterns, input-output, communication patterns, task granularity and load-balancing. They also include hardware-related factors such as processor architecture, memory hierarchy, input-output configuration and network. The software-related factors concerning reliability are operating systems, compilers, communication protocols, libraries and preprocessor performance. In estimating the reliability of a system, performance estimation is an important aspect. Reliability analysis is approached using probability.
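
The probability-based reliability estimation mentioned above can be illustrated with the textbook series/parallel composition rules; the component reliabilities below are hypothetical:

```python
from functools import reduce

def series_reliability(rs):
    """All components must work: product of component reliabilities."""
    return reduce(lambda acc, r: acc * r, rs, 1.0)

def parallel_reliability(rs):
    """System works if at least one redundant component works:
    1 minus the probability that all of them fail."""
    return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), rs, 1.0)

# Hypothetical: two links in series, each backed by a redundant pair.
link = parallel_reliability([0.9, 0.9])    # each redundant pair: 0.99
system = series_reliability([link, link])  # both links needed: 0.9801
```

Redundancy (parallel composition) raises reliability while chaining (series composition) lowers it, which is the basic trade-off behind the layered-network results discussed in these records.
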

  3. Hybrid reliability model for fatigue reliability analysis of steel bridges

    曹珊珊; 雷俊卿


    A hybrid reliability model is presented to solve fatigue reliability problems of steel bridges. The cumulative damage model is one of the models used in fatigue reliability analysis; its parameter characteristics can be described as probabilistic and interval. A two-stage hybrid reliability model is given, with a theoretical foundation and a solving algorithm for the hybrid reliability problems. The theoretical foundation is established from the consistency relationships of the interval reliability model and the probabilistic reliability model with normally distributed variables. The solving process combines the definition of the interval reliability index with the probabilistic algorithm. With consideration of the parameter characteristics of the S−N curve, the cumulative damage model with hybrid variables is given based on standards from different countries. Lastly, a steel structure case from the Neville Island Bridge is analyzed, based on the AASHTO standard, to verify the applicability of the hybrid reliability model in fatigue reliability analysis.
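
A common concrete form of the cumulative damage model is Miner's rule combined with a Basquin-form S−N curve, N = C / S^m. The sketch below uses invented constants and a toy loading spectrum; the paper's hybrid probabilistic/interval treatment of these parameters is not reproduced here.

```python
def cycles_to_failure(stress_range: float, c: float, m: float) -> float:
    """Cycles to failure from a Basquin-form S-N curve: N = C / S^m."""
    return c / stress_range ** m

def miner_damage(spectrum, c: float, m: float) -> float:
    """Miner's cumulative damage D = sum(n_i / N_i); failure is predicted
    when D reaches 1."""
    return sum(n / cycles_to_failure(s, c, m) for s, n in spectrum)

# Hypothetical loading spectrum: (stress range in MPa, applied cycles).
spectrum = [(100.0, 2e6), (150.0, 5e5)]
damage = miner_damage(spectrum, c=1e13, m=3.0)  # D < 1: no predicted failure yet
```
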

  4. Development of PCR protocols for specific identification of Clostridium spiroforme and detection of sas and sbs genes.

    Drigo, Ilenia; Bacchin, Cosetta; Cocchi, Monia; Bano, Luca; Agnoletti, Fabrizio


    Rabbit diarrhoea caused by toxigenic Clostridium spiroforme is responsible for significant losses in commercial rabbitries, but the accurate identification of this micro-organism is difficult due to the absence of both a commercial biochemical panel and biomolecular methods. The aim of this study was therefore to develop PCR protocols for the specific detection of C. spiroforme and its binary toxin encoding genes. The C. spiroforme species-specific primers were designed based on its published 16S rDNA sequences, and the specificity of these primers was tested with DNA extracted from closely related Clostridium species. The sa/bs_F and sa/bs_R C. spiroforme binary toxin specific primers were designed to be complementary, respectively, to a sequence of 21 bases on the 3' end of the sas gene and on the 5' end of the sbs gene. The detection limits of the in-house developed PCR protocols were 25 CFU/ml of bacterial suspension and 1.38×10⁴ CFU/g of caecal content for the species-specific primers, and 80 CFU/ml of bacterial suspension and 2.8×10⁴ CFU/g of caecal content for the sa/bs primers. These results indicate that the described PCR assays enable specific identification of C. spiroforme and its binary toxin genes and can therefore be considered a rapid, reliable tool for the diagnosis of C. spiroforme-related enterotoxaemia.

  5. An Improved Harvest and in Vitro Expansion Protocol for Murine Bone Marrow-Derived Mesenchymal Stem Cells

    Song Xu


    Compared to bone marrow (BM)-derived mesenchymal stem cells (MSCs) of human origin or from other species, the in vitro expansion and purification of murine MSCs (mMSCs) is much more difficult because of the low MSC yield and the unwanted growth of non-MSCs in the in vitro expansion cultures. We describe a modified protocol to isolate and expand murine BM-derived MSCs based on the combination of mechanical crushing and collagenase digestion at the moment of harvest, followed by an immunodepletion step using microbeads coated with CD11b, CD45 and CD34 antibodies. The number of isolated mMSCs, as estimated by the colony forming unit-fibroblast (CFU-F) assay, showed that this modified isolation method could yield 70.0% more primary colonies. After immunodepletion, a homogenous mMSC population could already be obtained after two passages. Immunodepleted mMSCs (ID-mMSCs) are uniformly positive for the stem cell antigen-1 (Sca-1), CD90, CD105 and CD73 cell surface markers, but negative for the hematopoietic surface markers CD14, CD34 and CD45. Moreover, the immunodepleted cell population exhibits more differentiation potential into adipogenic, osteogenic and chondrogenic lineages. Our data illustrate the development of an efficient and reliable expansion protocol increasing the yield and purity of mMSCs and reducing the overall expansion time.

  6. A Passive Testing Approach for Protocols in Wireless Sensor Networks

    Che, Xiaoping; Maag, Stephane; Tan, Hwee-Xian; Tan, Hwee-Pink; Zhou, Zhangbing


    Smart systems are today increasingly developed with the number of wireless sensor devices drastically increasing. They are implemented within several contexts throughout our environment. Thus, sensed data transported in ubiquitous systems are important, and the way to carry them must be efficient and reliable. For that purpose, several routing protocols have been proposed for wireless sensor networks (WSN). However, one stage that is often neglected before their deployment is the conformance testing process, a crucial and challenging step. Compared to active testing techniques commonly used in wired networks, passive approaches are more suitable to the WSN environment. While some works propose to specify the protocol with state models or to analyze them with simulators and emulators, we here propose a logic-based approach for formally specifying some functional requirements of a novel WSN routing protocol. We provide an algorithm to evaluate these properties on collected protocol execution traces. Further, we demonstrate the efficiency and suitability of our approach by its application into common WSN functional properties, as well as specific ones designed from our own routing protocol. We provide relevant testing verdicts through a real indoor testbed and the implementation of our protocol. Furthermore, the flexibility, genericity and practicability of our approach have been proven by the experimental results. PMID:26610495

  9. Synthesis of Reliable Telecommunication Networks

    Dusan Trstensky


    In many applications, the network designer may need to synthesise a reliable telecommunication network. Assume that a network, denoted G(n,e), has n nodes and e edges, and that the operational probability of each edge is known. The system reliability of the network is defined as the probability that every pair of nodes can communicate with each other. The network synthesis problem considered in this paper is to find a network G*(n,e) that maximises system reliability, over the classes of networks G(n,n-1), G(n,n) and G(n,n+1) respectively. In addition, an upper bound on the maximum reliability for networks with n nodes and e edges (e > n+2) is derived in terms of node degree. Computational experiments for the reliability upper bound are also presented. The results show that the proposed reliability upper bound is effective.
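
The system reliability defined above (every node pair can communicate) is the all-terminal reliability. For small networks it can be computed exactly by brute force over all edge up/down states; the three-node ring below is a toy example, not one of the paper's synthesized networks.

```python
from itertools import product

def all_terminal_reliability(n, edges):
    """Probability that the surviving edges connect all n nodes.

    edges: list of (u, v, p) with independent operational probability p.
    Brute force over all 2^e edge states -- only feasible for small graphs.
    """
    total = 0.0
    for states in product([True, False], repeat=len(edges)):
        prob = 1.0
        parent = list(range(n))  # union-find over surviving edges

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x

        for up, (u, v, p) in zip(states, edges):
            prob *= p if up else 1.0 - p
            if up:
                parent[find(u)] = find(v)
        if len({find(i) for i in range(n)}) == 1:  # one connected component
            total += prob
    return total

# Toy example: 3-node ring, each edge operational with probability 0.9.
# Connected iff at least two edges survive: 3*p^2*(1-p) + p^3 = 0.972.
ring = [(0, 1, 0.9), (1, 2, 0.9), (0, 2, 0.9)]
```
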

  10. Mathematical reliability an expository perspective

    Mazzuchi, Thomas; Singpurwalla, Nozer


    In this volume consideration was given to more advanced theoretical approaches and novel applications of reliability to ensure that topics having a futuristic impact were specifically included. Topics like finance, forensics, information, and orthopedics, as well as the more traditional reliability topics were purposefully undertaken to make this collection different from the existing books in reliability. The entries have been categorized into seven parts, each emphasizing a theme that seems poised for the future development of reliability as an academic discipline with relevance. The seven parts are networks and systems; recurrent events; information and design; failure rate function and burn-in; software reliability and random environments; reliability in composites and orthopedics, and reliability in finance and forensics. Embedded within the above are some of the other currently active topics such as causality, cascading, exchangeability, expert testimony, hierarchical modeling, optimization and survival...

  11. Static Validation of Security Protocols

    Bodei, Chiara; Buchholtz, Mikael; Degano, P.;


    We methodically expand protocol narrations into terms of a process algebra in order to specify some of the checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we demonstrate that these techniques ... suffice to identify several authentication flaws in symmetric and asymmetric key protocols such as Needham-Schroeder symmetric key, Otway-Rees, Yahalom, Andrew secure RPC, Needham-Schroeder asymmetric key, and Beller-Chang-Yacobi MSR...

  12. Foraging Path-length Protocol for Drosophila melanogaster Larvae.

    Anreiter, Ina; Vasquez, Oscar E; Allen, Aaron M; Sokolowski, Marla B


    The Drosophila melanogaster larval path-length phenotype is an established measure used to study the genetic and environmental contributions to behavioral variation. The larval path-length assay was developed to measure individual differences in foraging behavior that were later linked to the foraging gene. Larval path-length is an easily scored trait that facilitates the collection of large sample sizes, at minimal cost, for genetic screens. Here we provide a detailed description of the current protocol for the larval path-length assay first used by Sokolowski. The protocol details how to reproducibly handle test animals, perform the behavioral assay and analyze the data. An example of how the assay can be used to measure behavioral plasticity in response to environmental change, by manipulating feeding environment prior to performing the assay, is also provided. Finally, appropriate test design as well as environmental factors that can modify larval path-length such as food quality, developmental age and day effects are discussed.
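
Path-length scoring typically reduces to summing the segment lengths of the digitized larval trail. A minimal sketch, assuming the trail has already been traced into ordered (x, y) coordinates (the trail values below are toy data, not from the protocol):

```python
import math

def path_length(points):
    """Total length of a traced larval path given ordered (x, y) coordinates
    (e.g. digitized from the foraging arena), in the same units as the input."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

# Toy trail in cm: a right angle made of two 3-cm segments.
trail = [(0.0, 0.0), (3.0, 0.0), (3.0, 3.0)]
length_cm = path_length(trail)  # 6.0
```
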

  13. Validation of the Thermo Scientific SureTect Escherichia coli O157:H7 Real-Time PCR Assay for Raw Beef and Produce Matrixes.

    Cloke, Jonathan; Crowley, Erin; Bird, Patrick; Bastin, Ben; Flannery, Jonathan; Agin, James; Goins, David; Clark, Dorn; Radcliff, Roy; Wickstrand, Nina; Kauppinen, Mikko


    E. coli O157:NM isolate. Nonmotile isolates of E. coli O157 have been demonstrated to still contain the H7 gene; therefore, this result is not unexpected. Robustness testing was conducted to evaluate the performance of the SureTect assay with specific deviations from the assay protocol, which were outside the recommended parameters and which are open to variation. This study demonstrated that the SureTect assay gave reliable performance. A final study to verify the shelf life of the product under accelerated conditions was also conducted.

  14. Assessment of a novel multiplex real-time PCR assay for the detection of the CBPP agent Mycoplasma mycoides subsp. mycoides SC through experimental infection in cattle

    Tomaso Herbert


    Full Text Available Abstract Background Mycoplasma mycoides subsp. mycoides SC is the pathogenic agent of contagious bovine pleuropneumonia (CBPP), the most important disease of cattle in Africa, causing significant economic losses. The re-emergence of CBPP in Europe in the 1980s and 1990s illustrates that it is still a threat also to countries that have successfully eradicated the disease in the past. Nowadays, probe-based real-time PCR techniques are among the most advanced tools for reliable identification and sensitive detection of many pathogens, but only a few protocols have been published so far for CBPP diagnosis. Therefore we developed a novel TaqMan®-based real-time PCR assay comprising the amplification of two independent targets (MSC_0136 and MSC_1046) and an internal exogenous amplification control in a multiplex reaction, and evaluated its diagnostic performance with clinical samples. Results The assay detected 49 MmmSC strains of diverse temporal and geographical origin, but did not amplify DNA from 82 isolates of 20 non-target species, confirming a specificity of 100%. The detection limit was determined to be 10 fg DNA per reaction for the MSC_0136 assay and 100 fg per reaction for the MSC_1046 assay, corresponding to 8 and 80 genome equivalents, respectively. The diagnostic performance of the assay was evaluated with clinical samples from 19 experimentally infected cattle and from 20 cattle without CBPP, and compared to those of cultivation and a conventional PCR protocol. The two rt-PCR tests proved to be the most sensitive methods and identified all 19 infected animals. The different sample types used were not equally suitable for MmmSC detection. While 94.7% of lung samples from the infected cohort tested positive in the MSC_0136 assay, only 81% of pulmonary lymph nodes, 31% of mediastinal lymph nodes and 25% of pleural fluid samples gave a positive result. 
Conclusions The developed multiplex rt-PCR assay is recommended as an efficient tool

  15. Bacterial assays for recombinagens.

    Hoffmann, G R


    Two principal strategies have been used for studying recombinagenic effects of chemicals and radiation in bacteria: (1) measurement of homologous recombination involving defined alleles in a partially diploid strain, and (2) measurement of the formation and loss of genetic duplications in the bacterial chromosome. In the former category, most methods involve one allele in the bacterial chromosome and another in a plasmid, but it is also possible to detect recombination between two chromosomal alleles or between two extrachromosomal alleles. This review summarizes methods that use each of these approaches for detecting recombination and tabulates data on agents that have been found to be recombinagenic in bacteria. The assays are discussed with respect to their effectiveness in testing for recombinagens and their potential for elucidating mechanisms underlying recombinagenic effects.

  16. Standardization of cytokine flow cytometry assays

    Cox Josephine


    Full Text Available Abstract Background Cytokine flow cytometry (CFC) or intracellular cytokine staining (ICS) can quantitate antigen-specific T cell responses in settings such as experimental vaccination. Standardization of ICS among laboratories performing vaccine studies would provide a common platform by which to compare the immunogenicity of different vaccine candidates across multiple international organizations conducting clinical trials. As such, a study was carried out among several laboratories involved in HIV clinical trials, to define the inter-lab precision of ICS using various sample types and a common protocol for each experiment (see additional files online). Results Three sample types (activated, fixed, and frozen whole blood; fresh whole blood; and cryopreserved PBMC) were shipped to various sites, where ICS assays using cytomegalovirus (CMV) pp65 peptide mix or control antigens were performed in parallel in 96-well plates. For one experiment, antigens and antibody cocktails were lyophilised into 96-well plates to simplify and standardize the assay setup. Results (CD4+cytokine+ cells and CD8+cytokine+ cells) were determined by each site. Raw data were also sent to a central site for batch analysis with a dynamic gating template. Mean inter-laboratory coefficient of variation (C.V.) ranged from 17–44% depending upon the sample type and analysis method. Cryopreserved peripheral blood mononuclear cells (PBMC) yielded lower inter-lab C.V.'s than whole blood. Centralized analysis (using a dynamic gating template) reduced the inter-lab C.V. by 5–20%, depending upon the experiment. The inter-lab C.V. was lowest (18–24%) for samples with a mean of >0.5% IFNγ+ T cells, and highest (57–82%) for samples with lower mean frequencies. Conclusion ICS assays can be performed by multiple laboratories using a common protocol with good inter-laboratory precision, which improves as the frequency of responding cells increases. Cryopreserved PBMC may yield slightly more...
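The inter-laboratory precision metric used in the study above can be sketched in a few lines. This is a minimal illustration of a coefficient-of-variation calculation across sites; the per-lab response values below are hypothetical, not data from the study.

```python
# Sketch: inter-laboratory coefficient of variation (C.V.) for an ICS assay,
# comparing %cytokine+ T cell results reported by several labs for one sample.

def inter_lab_cv(values):
    """C.V. (%) = 100 * (sample standard deviation) / mean across laboratories."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

# Hypothetical %CD8+IFN-gamma+ results from five labs for one sample:
labs = [0.52, 0.61, 0.48, 0.70, 0.55]
cv = inter_lab_cv(labs)
```

A centralized re-analysis with a common gating template would simply re-run `inter_lab_cv` on the re-gated values, allowing the two sources of variability to be compared.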

  17. Optimized PCR assay for detection of white spot syndrome virus (WSSV).

    Nunan, Linda M; Lightner, Donald V


    A rapid PCR assay for detection of white spot syndrome virus (WSSV) was developed based on the nested PCR procedure described by Lo et al. (1996) and outlined as the recommended PCR diagnostic assay in the Manual of Diagnostic Tests for Aquatic Animals published by the Office of International Epizootics (OIE, 2009). The optimized procedure incorporated the second-step primers used in the nested WSSV PCR. By adjusting the annealing temperature and shortening the cycling times, this modified assay is substantially faster than, and as sensitive as, the recommended OIE protocol. The modified PCR test was compared directly to the two-step nested PCR protocol and a modified nested procedure. The sensitivity of the published assay was determined by template dilutions of semi-purified WSSV virions that had been quantitated using real-time PCR for detection of WSSV. Various isolates were tested using the modified procedure to ensure that the assay was able to detect WSSV from different geographical locations.

  18. Reduce microRNA RT-qPCR Assay Costs by More Than 10-fold Without Compromising Results

    Goldrick, Marianna; Busk, Peter Kamp; Lepovitz, Lance


    This white paper describes a detailed protocol for carrying out qPCR-based microRNA analysis for only ~$0.39 per assay, a cost-savings of >90% compared to commonly used alternative methods.

  19. Is quantitative electromyography reliable?

    Cecere, F; Ruf, S; Pancherz, H


    The reliability of quantitative electromyography (EMG) of the masticatory muscles was investigated in 14 subjects without any signs or symptoms of temporomandibular disorders. Integrated EMG activity from the anterior temporalis and masseter muscles was recorded bilaterally by means of bipolar surface electrodes during chewing and biting activities. In the first experiment, the influence of electrode relocation was investigated. No influence of electrode relocation on the recorded EMG signal could be detected. In a second experiment, three sessions of EMG recordings during five different chewing and biting activities were performed: in the morning (I); 1 hour later without intermediate removal of the electrodes (II); and in the afternoon, using new electrodes (III). The method errors for different time intervals (I-II and I-III errors) for each muscle and each function were calculated. Depending on the time interval between the EMG recordings, the muscles considered, and the function performed, the individual errors ranged from 5% to 63%. The method error increased significantly with the time interval between recordings; the error for the masseter (mean 27.2%) was higher than for the temporalis (mean 20.0%). The largest function error was found during maximal biting in intercuspal position (mean 23.1%). Based on these findings, quantitative electromyography of the masticatory muscles seems to be of limited value in diagnostics and in the evaluation of individual treatment results.
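The abstract above does not spell out how the method error was computed. A common choice for duplicate measurements is Dahlberg's formula, sketched here under that assumption; the EMG values below are hypothetical, not data from the study.

```python
# Sketch of a method-error calculation for repeated EMG recordings.
# Assumption: "method error" follows Dahlberg's formula for duplicate
# measurements, ME = sqrt( sum((x1 - x2)^2) / (2n) ).

def dahlberg_error(first, second):
    """Method error for paired recordings of the same quantity."""
    assert len(first) == len(second)
    n = len(first)
    return (sum((a - b) ** 2 for a, b in zip(first, second)) / (2 * n)) ** 0.5

# Hypothetical integrated EMG values (uV*s) for one muscle and function,
# recorded in session I and session II:
session1 = [120.0, 98.0, 143.0, 110.0]
session2 = [112.0, 105.0, 139.0, 118.0]
me = dahlberg_error(session1, session2)
# Expressing the error as a percentage of the grand mean gives a figure
# comparable to the 5%-63% range reported in the abstract:
relative = 100.0 * me / (sum(session1 + session2) / 8)
```

The I-II and I-III errors of the study would correspond to calling `dahlberg_error` on the respective session pairs.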

  20. Lab-on-a-Chip Multiplex Assays.

    Peter, Harald; Wienke, Julia; Bier, Frank F


    Lab-on-a-chip multiplex assays allow a rapid identification of multiple parameters in an automated manner. Here we describe a lab-based preparation followed by a rapid and fully automated DNA microarray hybridization and readout in less than 10 min using the Fraunhofer in vitro diagnostics (ivD) platform to enable rapid identification of bacterial species and detection of antibiotic resistance. The use of DNA microarrays allows a fast adaptation of new biomarkers enabling the identification of different genes as well as single-nucleotide-polymorphisms (SNPs) within these genes. In this protocol we describe a DNA microarray developed for identification of Staphylococcus aureus and the mecA resistance gene.

  1. Comparison of sample preparation methods for reliable plutonium and neptunium urinalysis using automatic extraction chromatography.

    Qiao, Jixin; Xu, Yihong; Hou, Xiaolin; Miró, Manuel


    This paper describes the improvement and comparison of analytical methods for simultaneous determination of trace-level plutonium and neptunium in urine samples by inductively coupled plasma mass spectrometry (ICP-MS). Four sample pre-concentration techniques, including calcium phosphate, iron hydroxide and manganese dioxide co-precipitation, and evaporation, were compared, and the applicability of the different techniques was discussed in order to evaluate and establish the optimal method for an in vivo radioassay program. The analytical results indicate that the various sample pre-concentration approaches afford dissimilar method performances, and care should be taken with specific experimental parameters to improve chemical yields. The best analytical performance in terms of turnaround time (6 h) and chemical yields for plutonium (88.7 ± 11.6%) and neptunium (94.2 ± 2.0%) was achieved by manganese dioxide co-precipitation. The need for dry ashing (≥ 7 h) in calcium phosphate co-precipitation and for long-term aging (5 d) in iron hydroxide co-precipitation made those analytical protocols time-consuming. Although evaporation is also somewhat time-consuming (1.5 d), it endows urinalysis methods with better reliability and repeatability compared with the co-precipitation techniques. In view of the applicability of the different pre-concentration techniques proposed previously in the literature, the main challenge in method development is the release of plutonium and neptunium associated with organic compounds in real urine assays. In this work, different protocols for decomposing organic matter in urine were investigated, of which potassium persulfate (K2S2O8) treatment provided the highest chemical yield of neptunium in the iron hydroxide co-precipitation step; yet the occurrence of sulfur compounds in the processed sample deteriorated the analytical performance of the ensuing extraction chromatographic separation with chemical

  2. Lacking applicability of in vitro eye irritation methods to identify seriously eye irritating agrochemical formulations: Results of bovine cornea opacity and permeability assay, isolated chicken eye test and the EpiOcular™ ET-50 method to classify according to UN GHS.

    Kolle, Susanne N; Van Cott, Andrew; van Ravenzwaay, Bennard; Landsiedel, Robert


    In vitro methods have gained regulatory acceptance for the prediction of serious eye damage (UN GHS Cat 1). However, the majority of in vitro methods do not state whether they are applicable to agrochemical formulations. This manuscript presents a study of up to 27 agrochemical formulations tested in three in vitro assays: three versions of the bovine corneal opacity and permeability test (BCOP, OECD TG 437), the isolated chicken eye test (ICE, OECD TG 438) and the EpiOcular™ ET-50 assay. The results were compared with already-available in vivo data. Only one of four tested formulations in the BCOP, one of five in the ICE, and six of eleven in the EpiOcular™ ET-50 Neat Protocol resulted in the correct UN GHS Cat 1 prediction. Overpredictions occurred in all assays. These data indicate a lack of applicability of the three in vitro methods to reliably predict UN GHS Cat 1 for agrochemical formulations. In order to ensure animal-free identification of seriously eye damaging agrochemical formulations, testing protocols and/or prediction models need to be modified, or classification rules should be tailored to in vitro testing rather than using in vivo Draize data as the standard. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  3. QoS-aware self-adaptation of communication protocols in a pervasive service middleware

    Zhang, Weishan; Hansen, Klaus Marius; Fernandes, João


    Pervasive computing is characterized by heterogeneous devices that usually have scarce resources requiring optimized usage. These devices may use different communication protocols, which can be switched at runtime. As different communication protocols have different quality of service (QoS) properties, this motivates optimized self-adaptation of protocols for devices, considering power consumption and other QoS requirements such as round-trip time (RTT) for service invocations, throughput, and reliability. In this paper, we present an extensible approach for self-adaptation of communication protocols for pervasive web services, where protocols are designed as reusable connectors and our middleware infrastructure hides the complexity of using different communication protocols from upper layers. We also propose to use Genetic Algorithms (GAs) to find optimized configurations at runtime...
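A GA-based configuration search of the kind described above could look roughly like the following sketch. The protocol choices, QoS cost numbers, and fitness weights are hypothetical stand-ins, not values from the paper; the point is the shape of the search: a population of protocol assignments evolved against a weighted QoS fitness.

```python
import random

# Hypothetical connector choices and per-link QoS costs (illustrative only).
PROTOCOLS = ["UDP", "TCP", "BT"]
POWER = {"UDP": 1.0, "TCP": 2.5, "BT": 1.8}    # relative power cost
RTT = {"UDP": 30.0, "TCP": 55.0, "BT": 80.0}   # round-trip time, ms

def fitness(config, w_power=1.0, w_rtt=0.05):
    # Lower is better: weighted sum of QoS costs over all device links.
    return sum(w_power * POWER[p] + w_rtt * RTT[p] for p in config)

def evolve(n_links=6, pop_size=20, generations=40, seed=1):
    rng = random.Random(seed)
    # Each individual assigns one protocol per device link.
    pop = [[rng.choice(PROTOCOLS) for _ in range(n_links)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_links)       # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:                # occasional mutation
                child[rng.randrange(n_links)] = rng.choice(PROTOCOLS)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
```

At runtime the middleware would re-run such a search whenever QoS requirements or device conditions change, then switch connectors to the winning configuration.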

  4. Diagnostic Certified Assay: Neuromuscular and Cardiac Assessments

    Rea Valaperta


    Full Text Available The expansion of the specific trinucleotide sequence [CTG] is the molecular pathological mechanism responsible for the clinical manifestations of DM1. Many studies have described different molecular genetic techniques to detect DM1, but as yet there are no data on the analytical performance of the techniques used so far in this disease. We therefore developed and validated a molecular method, “Myotonic Dystrophy SB kit,” to better characterize our DM1 population. 113 patients were examined: 20 DM1-positive, 11 DM1/DM2-negative, and 13 DM1-negative/DM2-positive, who had a previous molecular diagnosis, while 69 were new cases. This assay correctly identified 113/113 patients, and all were confirmed by different homemade assays. Comparative analysis revealed that the sensitivity and the specificity of the new kit were very high (>99%). The same results were obtained using several extraction procedures and different concentrations of DNA. The distribution of pathologic alleles showed a prevalence of the “classical” form, while among the 96 nonexpanded alleles 19 different allelic types were observed. Cardiac and neuromuscular parameters were used to clinically characterize our patients and support the new genetic analysis. Our findings suggest that this assay is a very robust and reliable molecular test, showing high reproducibility and giving an unambiguous interpretation of results.

  5. Advanced information processing system: Authentication protocols for network communication

    Harper, Richard E.; Adams, Stuart J.; Babikyan, Carol A.; Butler, Bryan P.; Clark, Anne L.; Lala, Jaynarayan H.


    In safety critical I/O and intercomputer communication networks, reliable message transmission is an important concern. Difficulties of communication and fault identification in networks arise primarily because the sender of a transmission cannot be identified with certainty, an intermediate node can corrupt a message without certainty of detection, and a babbling node cannot be identified and silenced without lengthy diagnosis and reconfiguration. Authentication protocols use digital signature techniques to verify the authenticity of messages with high probability. Such protocols appear to provide an efficient solution to many of these problems. The objective of this program is to develop, demonstrate, and evaluate intercomputer communication architectures which employ authentication. As a context for the evaluation, the authentication protocol-based communication concept was demonstrated under this program by hosting a real-time flight critical guidance, navigation and control algorithm on a distributed, heterogeneous, mixed redundancy system of workstations and embedded fault-tolerant computers.
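The core idea, that a receiver rejects any message whose authenticity tag does not verify, can be illustrated with a keyed MAC as a stand-in for the digital-signature scheme (HMAC is symmetric, unlike a true signature, but the accept/reject logic is the same). The key and message contents are hypothetical.

```python
import hmac
import hashlib

# Sketch: authenticated message exchange on an intercomputer network.
# HMAC-SHA256 stands in here for the protocol's digital-signature check;
# the key and node names are illustrative, not from the program described.

KEY = b"shared-secret-for-node-A"   # hypothetical per-sender key

def sign(message: bytes, key: bytes = KEY) -> bytes:
    """Produce an authenticity tag for an outgoing message."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes, key: bytes = KEY) -> bool:
    """Accept a message only if its tag verifies (constant-time compare)."""
    return hmac.compare_digest(sign(message, key), tag)

msg = b"TLM frame 0042 from node A"
tag = sign(msg)
assert verify(msg, tag)                      # authentic message accepted
assert not verify(b"corrupted frame", tag)   # corruption is detected
```

This is exactly the property the abstract relies on: an intermediate node cannot alter a message without the change being detected, and a babbling node cannot forge traffic attributed to another sender.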

  6. Analyzing Zone Routing Protocol in MANET Applying Authentic Parameter

    Lakhtaria, Kamaljit I


    Routing is the main part of wireless ad hoc networks. Conventionally there are two approaches: proactive and reactive. Both approaches have substantial disadvantages, and hybrid routing protocols were designed to overcome them. ZRP (Zone Routing Protocol) is one such hybrid protocol: it takes advantage of the proactive approach by providing reliability within the scalable zone, and beyond the scalable zone it relies on the reactive approach. ZRP uses proactive or reactive routing according to the needs of the application at a particular instant of time, depending upon the prevailing scenario. This work examines the performance of ZRP against realistic parameters by varying attributes such as the zone radius under different node densities. Results vary as the node density is changed on the Qualnet 4.0 network simulator.


    S. Rajeswari


    Full Text Available In the Gossip Sleep Protocol, network performance is enhanced based on energy resources, but energy conservation is achieved at the cost of reduced throughput. In this paper, we propose a new protocol for Mobile Ad hoc Networks to achieve reliability together with energy conservation. Based on the probability (p) value, the number of sleeping nodes is fixed initially. The probability value can be adaptively adjusted by a Remote Activated Switch during the transmission process. The adaptiveness of the gossiping probability is determined by the Packet Delivery Ratio. For performance comparison, we consider routing overhead, Packet Delivery Ratio, number of dropped packets and energy consumption with an increasing number of forwarding nodes. UDP-based traffic models were used to analyze the performance of this protocol, and TCP-based traffic models for average end-to-end delay. The NS-2 simulator was used.
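The abstract describes adjusting the gossip probability from the observed Packet Delivery Ratio but does not give the update rule. The sketch below shows one plausible feedback rule under that assumption; the target, step size, and bounds are hypothetical constants.

```python
# Sketch: adapting a gossip forwarding probability p from the observed
# packet delivery ratio (PDR). The update rule and constants are
# illustrative assumptions, not the paper's exact mechanism.

def adapt_gossip_p(p, pdr, target_pdr=0.95, step=0.05,
                   p_min=0.4, p_max=1.0):
    """Raise p when delivery falls short of the target; lower it when the
    target is met, letting more nodes sleep and saving energy."""
    if pdr < target_pdr:
        p = min(p_max, p + step)
    else:
        p = max(p_min, p - step)
    return p

# Example feedback loop over a few measurement rounds:
p = 0.65
for observed_pdr in [0.90, 0.92, 0.97, 0.96, 0.88]:
    p = adapt_gossip_p(p, observed_pdr)
```

In the protocol described, the Remote Activated Switch would apply such an adjustment during transmission, waking or sleeping forwarding nodes accordingly.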

  8. The Impact of Black-Hole Attack on ZRP Protocol

    CHAHIDI Badr


    Full Text Available The lack of infrastructure in ad hoc networks makes their deployment easier. Each node in an ad hoc network can route data using a routing protocol, which decreases the level of security. Ad hoc networks are exposed to several attacks, such as the blackhole attack. In this article, a study has been made of the impact of this attack on the hybrid routing protocol ZRP (Zone Routing Protocol). In this attack, a malicious node is placed between two or more nodes in order to drop data. The trick of the attack is simple: the malicious node declares that it has the most reliable route to the destination, so that the source chooses this path. In this study, NS2 is used to assess the impact of the attack on ZRP. Two metrics are measured, namely the packet delivery ratio and the end-to-end delay.

  9. A new communication protocol family for a distributed spacecraft control system

    Baldi, Andrea; Pace, Marco


    In this paper we describe the concepts behind and architecture of a communication protocol family, which was designed to fulfill the communication requirements of ESOC's new distributed spacecraft control system SCOS 2. A distributed spacecraft control system needs a data delivery subsystem to be used for telemetry (TLM) distribution, telecommand (TLC) dispatch and inter-application communication, characterized by the following properties: reliability, so that any operational workstation is guaranteed to receive the data it needs to accomplish its role; efficiency, so that telemetry distribution, even for missions with high telemetry rates, does not cause a degradation of the overall control system performance; scalability, so that the network is not the bottleneck, both in terms of bandwidth and reconfiguration; and flexibility, so that it can be efficiently used in many different situations. The new protocol family, which satisfies the above requirements, is built on top of widely used communication protocols (UDP and TCP), provides reliable point-to-point and broadcast communication (UDP+), and is implemented in C++. Reliability is achieved using a retransmission mechanism based on a sequence numbering scheme. Such a scheme allows cost-effective performance compared to traditional protocols, because retransmission is only triggered by applications which explicitly need reliability. This flexibility enables applications with different profiles to take advantage of the available protocols, so that the best trade-off between speed and reliability can be achieved case by case.
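The retransmission idea above, reliability layered on an unreliable UDP-like channel via sequence numbers and acknowledgements, can be sketched as follows. The channel model, loss rate, and message names are illustrative assumptions, not the SCOS 2 design.

```python
import random

# Sketch: sequence-numbered retransmission over a lossy, UDP-like channel.
# Each datagram is resent until acknowledged; the receiver uses the
# sequence number to deliver data in order exactly once.

def deliver_reliably(messages, loss_rate=0.3, seed=7, max_tries=50):
    rng = random.Random(seed)
    received = []
    expected_seq = 0
    for seq, payload in enumerate(messages):
        for _ in range(max_tries):              # sender retransmits until ACKed
            if rng.random() < loss_rate:
                continue                        # datagram lost in transit
            if seq == expected_seq:             # receiver checks sequence number
                received.append(payload)
                expected_seq += 1
            break                               # ACK returned to the sender
        else:
            raise TimeoutError(f"gave up on seq {seq}")
    return received

# Hypothetical telemetry/telecommand payloads delivered despite 30% loss:
data = ["tlm-1", "tlm-2", "tc-dispatch", "tlm-3"]
assert deliver_reliably(data) == data
```

Applications that do not need reliability would simply skip the ACK/retransmit machinery, which is the cost-effectiveness point made in the abstract: only traffic that asks for reliability pays for it.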

  10. Industrial wireless sensor networks applications, protocols, and standards

    Güngör, V Çagri


    The collaborative nature of industrial wireless sensor networks (IWSNs) brings several advantages over traditional wired industrial monitoring and control systems, including self-organization, rapid deployment, flexibility, and inherent intelligent processing. In this regard, IWSNs play a vital role in creating more reliable, efficient, and productive industrial systems, thus improving companies' competitiveness in the marketplace. Industrial Wireless Sensor Networks: Applications, Protocols, and Standards examines the current state of the art in industrial wireless sensor networks and outline

  11. A global protocol for monitoring of coral bleaching

    Oliver, J.; Setiasih, N.; Marshall, P.; Hansen, L


    Coral bleaching and subsequent mortality represent a major threat to the future health and productivity of coral reefs. However a lack of reliable data on occurrence, severity and other characteristics of bleaching events hampers research on the causes and consequences of this important phenomenon. This article describes a global protocol for monitoring coral bleaching events, which addresses this problem and can be used by people with different levels of expertise and resources.

  12. Looking for new biomarkers of skin wound vitality with a cytokine-based multiplex assay: preliminary study.

    Peyron, Pierre-Antoine; Baccino, Éric; Nagot, Nicolas; Lehmann, Sylvain; Delaby, Constance


    Determination of skin wound vitality is an important issue in forensic practice. No reliable biomarker currently exists. Quantification of inflammatory cytokines in injured skin with MSD(®) technology is an innovative and promising approach. This preliminary study aims to develop a protocol for the preparation and analysis of skin samples. Samples from ante mortem wounds, post mortem wounds, and intact skin ("control samples") were taken from corpses at autopsy. After the pre-analytical protocol had been optimized in terms of skin homogenization and protein extraction, the concentration of TNF-α was measured in each sample with the MSD(®) approach. Then five other cytokines of interest (IL-1β, IL-6, IL-10, IL-12p70 and IFN-γ) were simultaneously quantified with a MSD(®) multiplex assay. The optimal pre-analytical conditions consist of protein extraction from a 6 mm diameter skin sample in a PBS buffer with 0.05% Triton. Our results show the linearity and reproducibility of the TNF-α quantification with MSD(®), and an inter- and intra-individual variability in the concentrations of proteins. The MSD(®) multiplex assay is likely to detect differential skin concentrations for each cytokine of interest. This preliminary study was used to develop and optimize the pre-analytical and analytical conditions of the MSD(®) method using injured and healthy skin samples, for the purpose of identifying the cytokine, or set of cytokines, that may serve as biomarkers of skin wound vitality.

  13. Reliability Assessment Of Wind Turbines

    Sørensen, John Dalsgaard


    Reduction of the cost of energy for wind turbines is very important in order to make wind energy competitive compared to other energy sources. Therefore the turbine components should be designed to have sufficient reliability, but also not be too costly (and too safe). This paper presents models for uncertainty modeling and reliability assessment of especially the structural components such as tower, blades, substructure and foundation. But since the function of a wind turbine is highly dependent on many electrical and mechanical components as well as a control system, reliability aspects of these components are also discussed, and it is described how their reliability influences the reliability of the structural components. Two illustrative examples are presented considering uncertainty modeling, reliability assessment and calibration of partial safety factors for structural wind turbine components exposed...

  14. Nuclear weapon reliability evaluation methodology

    Wright, D.L. [Sandia National Labs., Albuquerque, NM (United States)


    This document provides an overview of the activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon's stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of review opportunities that occur throughout the entire process, which assure a consistent approach and appropriate use of the data for reliability evaluation purposes.

  15. Reliability engineering theory and practice

    Birolini, Alessandro


    This book shows how to build in, evaluate, and demonstrate the reliability and availability of components, equipment, and systems. It presents the state of the art of reliability engineering, both in theory and practice, and is based on the author's more than 30 years of experience in this field, half in industry and half as Professor of Reliability Engineering at the ETH, Zurich. The structure of the book allows rapid access to practical results. This final edition extends and replaces all previous editions. New are, in particular, a strategy to mitigate incomplete coverage, a comprehensive introduction to human reliability with design guidelines and new models, and a refinement of reliability allocation, design guidelines for maintainability, and concepts related to regenerative stochastic processes. The set of problems for homework has been extended. Methods and tools are given in a way that they can be tailored to cover different reliability requirement levels and be used for safety analysis. Because of the Appendice...

  16. Designing, optimization and validation of tetra-primer ARMS PCR protocol for genotyping mutations in caprine Fec genes.

    Ahlawat, Sonika; Sharma, Rekha; Maitra, A; Roy, Manoranjan; Tantia, M S


    New, quick, and inexpensive methods for genotyping novel caprine Fec gene polymorphisms through tetra-primer ARMS PCR were developed in the present investigation. Single nucleotide polymorphism (SNP) genotyping needs to be attempted to establish association between the identified mutations and traits of economic importance. In the current study, we have successfully genotyped three new SNPs identified in caprine fecundity genes, viz. T(-242)C (BMPR1B), G1189A (GDF9) and G735A (BMP15). The tetra-primer ARMS PCR protocol was optimized and validated for these SNPs with a short turnaround time and low costs. The optimized techniques were tested on 158 random samples of the Black Bengal goat breed. Samples with known genotypes for the described genes, previously tested in duplicate using sequencing methods, were employed for validation of the assay. Upon validation, complete concordance was observed between the tetra-primer ARMS PCR assays and the sequencing results. These results highlight the ability of tetra-primer ARMS PCR in genotyping of mutations in Fec genes. Any associated SNP could be used to accelerate the improvement of goat reproductive traits by identifying highly prolific animals at an early stage of life. Our results provide direct evidence that tetra-primer ARMS-PCR is a rapid, reliable, and cost-effective method for SNP genotyping of mutations in caprine Fec genes.

  17. Designing, optimization and validation of tetra-primer ARMS PCR protocol for genotyping mutations in caprine Fec genes

    Sonika Ahlawat


    Full Text Available New, quick, and inexpensive methods for genotyping novel caprine Fec gene polymorphisms through tetra-primer ARMS PCR were developed in the present investigation. Single nucleotide polymorphism (SNP) genotyping needs to be attempted to establish association between the identified mutations and traits of economic importance. In the current study, we have successfully genotyped three new SNPs identified in caprine fecundity genes, viz. T(-242)C (BMPR1B), G1189A (GDF9) and G735A (BMP15). The tetra-primer ARMS PCR protocol was optimized and validated for these SNPs with a short turnaround time and low costs. The optimized techniques were tested on 158 random samples of the Black Bengal goat breed. Samples with known genotypes for the described genes, previously tested in duplicate using sequencing methods, were employed for validation of the assay. Upon validation, complete concordance was observed between the tetra-primer ARMS PCR assays and the sequencing results. These results highlight the ability of tetra-primer ARMS PCR in genotyping of mutations in Fec genes. Any associated SNP could be used to accelerate the improvement of goat reproductive traits by identifying highly prolific animals at an early stage of life. Our results provide direct evidence that tetra-primer ARMS-PCR is a rapid, reliable, and cost-effective method for SNP genotyping of mutations in caprine Fec genes.

  18. Reliable Remote Relay Protection in Smart Grid

    Jiapeng Zhang; Yingfei Dong


    As the false trips of remote protection relays are among the main reasons behind cascading blackouts, it is critical to design reliable relay protection. Even though common protection schemes on traditional power systems have been investigated for a few decades, cascading failures in recent years indicate that more research is needed in this area. Consequently, researchers have proposed agent-based methods on the Smart Grid (SG) to address this issue. However, these existing agent-based methods simply use the TCP protocol without considering real-time communication requirements (such as bandwidth and delay). To deal with this issue, several methods for efficient network resource management are proposed. Furthermore, these existing methods do not consider the potential issues in practical communication networks, which may result in delay violations and trigger relay false trips. We discussed simple backup solutions in previous work. In this paper, in addition to network efficiency, we focus on improving system reliability by exploring known power system information and minimizing the chances of false trips of important remote relays, e.g., by defining power line priorities based on their importance. Moreover, to further improve system reliability, we also investigate peer-to-peer protection approaches to address the single point of failure of the centralized control center.

  19. Melanins and melanogenesis: methods, standards, protocols.

    d'Ischia, Marco; Wakamatsu, Kazumasa; Napolitano, Alessandra; Briganti, Stefania; Garcia-Borron, José-Carlos; Kovacs, Daniela; Meredith, Paul; Pezzella, Alessandro; Picardo, Mauro; Sarna, Tadeusz; Simon, John D; Ito, Shosuke


    Despite considerable advances in the past decade, melanin research still suffers from the lack of universally accepted and shared nomenclature, methodologies, and structural models. This paper stems from the joint efforts of chemists, biochemists, physicists, biologists, and physicians with recognized and consolidated expertise in the field of melanins and melanogenesis, who critically reviewed and experimentally revisited methods, standards, and protocols to provide for the first time a consensus set of recommended procedures to be adopted and shared by researchers involved in pigment cell research. The aim of the paper was to define an unprecedented frame of reference built on cutting-edge knowledge and state-of-the-art methodology, to enable reliable comparison of results among laboratories and new progress in the field based on standardized methods and shared information.

  20. Lithium battery safety and reliability

    Levy, Samuel C.

    Lithium batteries have been used in a variety of applications for a number of years. As their use continues to grow, particularly in the consumer market, a greater emphasis needs to be placed on safety and reliability. There is a useful technique which can help to design cells and batteries having a greater degree of safety and higher reliability. This technique, known as fault tree analysis, can also be useful in determining the cause of unsafe behavior and poor reliability in existing designs.