WorldWideScience

Sample records for reliable assay protocol

  1. Fault recovery in the reliable multicast protocol

    Science.gov (United States)

    Callahan, John R.; Montgomery, Todd L.; Whetten, Brian

    1995-01-01

    The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages using an underlying IP Multicast (12, 5) medium to other group members in a distributed environment, even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.
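
    The membership-view abstraction lends itself to a callback-style interface. The following minimal sketch (hypothetical names and signatures, not RMP's actual API) illustrates how an application might consume view-change and delivery notifications:

        # Hypothetical sketch of a membership-view callback interface; the class
        # and method names are illustrative and do not reflect RMP's real API.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class MembershipView:
            view_id: int          # monotonically increasing view number
            members: frozenset    # group members in this view

        class GroupApplication:
            def on_view_change(self, view: MembershipView) -> None:
                # Invoked whenever the group is reformed (join, leave, failure,
                # or partition heal), per the membership-view model above.
                print(f"view {view.view_id}: {len(view.members)} members")

            def on_deliver(self, sender: str, payload: bytes) -> None:
                # Reliable delivery of a message within the current view.
                print(f"message from {sender}: {payload!r}")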

  2. Developing a yeast-based assay protocol to monitor total ...

    African Journals Online (AJOL)

    A yeast-based assay protocol developed for detecting oestrogenic activity in activated sludge (AS) supernatant is described. The protocol used Saccharomyces cerevisiae construct RMY/ER-ERE with human oestrogen receptor (ERα) and lacZ reporter genes, and was developed by modifying existing assays for use with AS ...

  3. Cascades: A reliable dissemination protocol for data collection sensor network

    Science.gov (United States)

    Peng, Y.; Song, W.; Huang, R.; Xu, M.; Shirazi, B.; LaHusen, R.; Pei, G.

    2009-01-01

    In this paper, we propose a fast and reliable data dissemination protocol, Cascades, to disseminate data from the sink (base station) to all or a subset of nodes in a data collection sensor network. Cascades makes use of the parent-monitor-children analogy to ensure reliable dissemination. Each node monitors whether or not its children have received the broadcast messages through snooping children's rebroadcasts or waiting for explicit ACKs. If a node detects a gap in its message sequences, it can fetch the missing messages from its neighbours reactively. Cascades also considers many practical issues for field deployment, such as dynamic topology and link/node failure. It therefore guarantees that a disseminated message from the sink will reach all intended receivers and that the dissemination terminates in a short time period. Notice that all existing dissemination protocols either do not guarantee reliability or do not terminate [1, 2], which does not meet the requirement of real-time command control. We conducted experimental evaluations in both the TOSSIM simulator and a sensor network testbed to compare Cascades with existing dissemination protocols in TinyOS sensor networks, which show that Cascades achieves a higher degree of reliability, lower communication cost, and less delivery delay. © 2009 IEEE.
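
    The gap-detection and reactive-fetch behaviour described above can be sketched as follows (an illustrative reconstruction, not the authors' code; the method names are placeholders):

        # Illustrative sketch of sequence-gap detection with reactive recovery
        # from neighbours, in the spirit of the Cascades description above.
        class DisseminationNode:
            def __init__(self):
                self.received = {}        # sequence number -> message
                self.highest_seen = -1

            def on_broadcast(self, seq, msg, neighbours):
                self.received[seq] = msg
                if seq > self.highest_seen + 1:
                    # A gap means earlier messages were missed; fetch them reactively.
                    for missing in range(self.highest_seen + 1, seq):
                        if missing not in self.received:
                            self.request_from_neighbours(missing, neighbours)
                self.highest_seen = max(self.highest_seen, seq)
                self.rebroadcast(seq, msg)   # rebroadcast doubles as an implicit ACK to the parent

            def request_from_neighbours(self, seq, neighbours):
                ...  # send a fetch request for the missing sequence number

            def rebroadcast(self, seq, msg):
                ...  # forward the message to child nodes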

  4. Reliability of dipstick assay in predicting urinary tract infection

    Directory of Open Access Journals (Sweden)

    Anith Kumar Mambatta

    2015-01-01

    Aims: Urine dipstick analysis is a quick, cheap and useful test for predicting urinary tract infection (UTI) in hospitalized patients. Our aim is to evaluate the reliability (sensitivity) of urine dipstick analysis against urine culture in the diagnosis of UTI. Materials and Methods: Patients admitted to our hospital suspected of having UTI, with positive urine cultures, were included in this study over a 2-year period (January 2011 to December 2012). Dipstick urinalysis was done using Multistix 10 SG (Siemens) and a Clinitek Advantus analyzer. The sensitivity of dipstick nitrites, leukocyte esterase and blood in these culture-positive UTI patients was calculated retrospectively. Results: Urine dipstick analysis of 635 urine culture-positive patients was studied. The sensitivity of nitrite alone and of leukocyte esterase alone was 23.31% and 48.5%, respectively. The sensitivity of blood alone in positive urine culture was 63.94%, the highest sensitivity for a single screening test. The presence of leukocyte esterase and/or blood increased the sensitivity to 72.28%. The sensitivity was highest when nitrite, leukocyte esterase and blood were considered together. Conclusions: The nitrite and leukocyte esterase tests, when used individually, are not reliable enough to rule out UTI. Hence, symptomatic UTI patients with a negative dipstick assay should be subjected to urine culture for proper management.
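
    As a worked example of how such sensitivities are computed, the figure for blood alone can be reproduced from the abstract's own numbers (the count of 406 true positives is back-calculated from the reported 63.94% and is illustrative only):

        # Sensitivity = true positives / all culture-positive patients, here using
        # the 635 culture-positive samples reported above. The 406 is inferred
        # from the quoted 63.94% and is not taken from the paper itself.
        culture_positive_total = 635

        def sensitivity_percent(true_positives, total=culture_positive_total):
            return 100.0 * true_positives / total

        print(round(sensitivity_percent(406), 2))   # ~63.94, matching the reported value for blood alone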

  5. Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis

    Science.gov (United States)

    Montgomery, Todd L.

    1995-01-01

    This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority-resilient, and totally resilient atomic delivery. These guarantees are selectable on a per-message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense. RMP discounts this. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LANs). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently. This has allowed the verification to maintain a high fidelity between the design model, the implementation model, and the verification model. The restrictions of implementation have influenced the design earlier than in normal sequential approaches. The protocol as a whole has matured more smoothly through the inclusion of several different perspectives into the product development.

  6. Characterization of perovskite solar cells: Towards a reliable measurement protocol

    Directory of Open Access Journals (Sweden)

    Eugen Zimmermann

    2016-09-01

    Lead halide perovskite solar cells have shown a tremendous rise in power conversion efficiency, with reported record efficiencies of over 20%, making this material very promising as a low-cost alternative to conventional inorganic solar cells. However, due to a "hysteretic" behaviour of varying severity during current density-voltage measurements, which strongly depends on scan rate, device and measurement history, preparation method, device architecture, etc., commonly used solar cell measurements do not give reliable or even reproducible results. With a view to commercialization, and to allow results of different devices to be compared among different laboratories, it is necessary to establish a measurement protocol which gives reproducible results. Therefore, we compare device characteristics derived from standard current density-voltage measurements with stabilized values obtained from an adaptive tracking of the maximum power point and the open circuit voltage, as well as characteristics extracted from time-resolved current density-voltage measurements. Our results provide insight into the challenges of a correct determination of device performance and propose a measurement protocol for a reliable characterisation which is easy to implement and has been tested on varying perovskite solar cells fabricated in different laboratories.
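
    The "stabilized" performance values mentioned above come from tracking the maximum power point over time rather than from a single voltage sweep. A simple perturb-and-observe tracker is sketched below as one plausible way to do this (the paper's exact adaptive algorithm is not reproduced; function names, step size and the toy power curve are illustrative assumptions):

        # Minimal perturb-and-observe maximum-power-point tracker (illustrative).
        def track_mpp(measure_power, v_start, v_step=0.005, iterations=200):
            v, last_p = v_start, measure_power(v_start)
            direction = 1
            for _ in range(iterations):
                v_new = v + direction * v_step
                p_new = measure_power(v_new)
                if p_new < last_p:
                    direction = -direction   # overshot the maximum, reverse the perturbation
                v, last_p = v_new, p_new
            return v, last_p

        # Toy power-voltage curve with a maximum near 0.9 V, standing in for a real measurement:
        print(track_mpp(lambda v: 5.0 - (v - 0.9) ** 2, v_start=0.5))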

  7. Interoperability and Reliability of Multiplatform MPLS VPN: Comparison of Traffic Engineering with RSVP-TE Protocol and LDP Protocol

    Directory of Open Access Journals (Sweden)

    Nanang Ismail

    2017-10-01

    48% packet loss per 100 sent packets, while with RSVP the packet loss percentage is 35.5% per 100 sent packets. Both protocols have interoperability on the third layer of the multiplatform MPLS VPN, but under heavily loaded traffic conditions, the RSVP protocol has better reliability than the LDP protocol.

  8. ZyFISH: a simple, rapid and reliable zygosity assay for transgenic mice.

    Directory of Open Access Journals (Sweden)

    Donal McHugh

    Microinjection of DNA constructs into fertilized mouse oocytes typically results in random transgene integration at a single genomic locus. The resulting transgenic founders can be used to establish hemizygous transgenic mouse lines. However, practical and experimental reasons often require that such lines be bred to homozygosity. Transgene zygosity can be determined by progeny testing assays, which are expensive and time-consuming, by quantitative Southern blotting, which is labor-intensive, or by quantitative PCR (qPCR), which requires transgene-specific design. Here, we describe a zygosity assessment procedure based on fluorescent in situ hybridization (zyFISH). The zyFISH protocol entails the detection of transgenic loci by FISH and the concomitant assignment of homozygosity using a concise and unbiased scoring system. The method requires small volumes of blood, is scalable to at least 40 determinations per assay, and produces results entirely consistent with the progeny testing assay. This combination of reliability, simplicity and cost-effectiveness makes zyFISH a method of choice for transgenic mouse zygosity determinations.

  9. An Acetylcholinesterase-Based Chronoamperometric Biosensor for Fast and Reliable Assay of Nerve Agents

    Directory of Open Access Journals (Sweden)

    Rene Kizek

    2013-08-01

    The enzyme acetylcholinesterase (AChE) is an important part of the cholinergic nervous system, where it stops neurotransmission by hydrolysis of the neurotransmitter acetylcholine. It is sensitive to inhibition by organophosphate and carbamate insecticides, some Alzheimer's disease drugs, secondary metabolites such as aflatoxins, and nerve agents used in chemical warfare. When immobilized on a sensor (physico-chemical transducer), it can be used for assay of these inhibitors. In the experiments described herein, an AChE-based electrochemical biosensor using screen-printed electrode systems was prepared. The biosensor was used for assay of nerve agents such as sarin, soman, tabun and VX. The limits of detection achieved in a measuring protocol lasting ten minutes were 7.41 × 10−12 mol/L for sarin, 6.31 × 10−12 mol/L for soman, 6.17 × 10−11 mol/L for tabun, and 2.19 × 10−11 mol/L for VX. The assay was reliable, with minor interferences caused by the organic solvents ethanol, methanol, isopropanol and acetonitrile. Isopropanol was chosen as a suitable medium for processing lipophilic samples.

  10. Connectivity-Based Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs

    Directory of Open Access Journals (Sweden)

    Woo-Yong Choi

    2009-01-01

    We propose an efficient reliable multicast MAC protocol based on the connectivity information among the recipients. Enhancing the BMMM (Batch Mode Multicast MAC) protocol, the reliable multicast MAC protocol significantly reduces the number of RAK (Request for ACK) frame transmissions in a reasonable computational time and enhances the MAC performance. Through an analytical performance analysis, the throughputs of the BMMM protocol and our proposed MAC protocol are derived. Numerical examples show that our proposed MAC protocol increases the reliable multicast MAC performance for IEEE 802.11 wireless LANs.

  11. A Hierarchical Energy Efficient Reliable Transport Protocol for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Prabhudutta Mohanty

    2014-12-01

    The two important requirements for many Wireless Sensor Networks (WSNs) are prolonged network lifetime and end-to-end reliability. The sensor nodes consume more energy during data transmission than during data sensing. In WSNs, redundant data increase energy consumption and latency and reduce reliability during data transmission. Therefore, it is important to support energy-efficient reliable data transport in WSNs. In this paper, we present a Hierarchical Energy Efficient Reliable Transport Protocol (HEERTP) for data transmission within the WSN. This protocol maximises the network lifetime by controlling redundant data transmission with the co-ordination of the Base Station (BS). The proposed protocol also achieves end-to-end reliability using a hop-by-hop acknowledgement scheme. We evaluate the performance of the proposed protocol through simulation. The simulation results reveal that our proposed protocol achieves better performance in terms of energy efficiency, latency and reliability than the existing protocols.

  12. A Functional Assay for GPR55: Envision Protocol.

    Science.gov (United States)

    Anavi-Goffer, Sharon; Ross, Ruth A

    2016-01-01

    The AlphaScreen(®) SureFire(®) assay is a novel technology that combines luminescent oxygen channeling technology, nano-beads, and monoclonal antibodies to detect the level of a selected protein in a volume lower than 5 μl. This method is more sensitive than traditional enzyme-linked immunosorbent assays (ELISA), and can detect an increasing number of new targets. Here, we describe a method for the AlphaScreen(®) SureFire(®) assay that targets ERK1/2 phosphorylation, a primary downstream signaling pathway that conveys activation of GPR55 by L-α-lysophosphatidylinositol (LPI) and certain cannabinoids.

  13. Design and Analysis of Transport Protocols for Reliable High-Speed Communications

    NARCIS (Netherlands)

    Oláh, A.

    1997-01-01

    The design and analysis of transport protocols for reliable communications constitutes the topic of this dissertation. These transport protocols guarantee the sequenced and complete delivery of user data over networks which may lose, duplicate and reorder packets. Reliable transport services are

  14. Design Protocols and Analytical Strategies that Incorporate Structural Reliability Models

    Science.gov (United States)

    Duffy, Stephen F.

    1997-01-01

    Al single crystal turbine blade material; map a simplistic failure strength envelope of the material; develop a statistically based reliability computer algorithm, verify the reliability model and computer algorithm, and model stator vanes for rig tests. Thus establishing design protocols that enable the engineer to analyze and predict the mechanical behavior of ceramic composites and intermetallics would mitigate the prototype (trial and error) approach currently used by the engineering community. The primary objective of the research effort supported by this short term grant is the continued creation of enabling technologies for the macroanalysis of components fabricated from ceramic composites and intermetallic material systems. The creation of enabling technologies aids in shortening the product development cycle of components fabricated from the new high technology materials.

  15. Is gait variability reliable in older adults and Parkinson's disease? Towards an optimal testing protocol.

    Science.gov (United States)

    Galna, Brook; Lord, Sue; Rochester, Lynn

    2013-04-01

    Despite the widespread use of gait variability in research and clinical studies, testing protocols designed to optimise its reliability have not been established. This study evaluates the impact of testing protocol and pathology on the reliability of gait variability. The aims were to (i) estimate the reliability of gait variability during continuous and intermittent walking protocols in older adults and people with Parkinson's disease (PD), (ii) determine the optimal number of steps for acceptable levels of reliability of gait variability and (iii) provide sample size estimates for use in clinical trials. Gait variability was measured twice, one week apart, in 27 older adults and 25 PD participants. Participants walked at their preferred pace during: (i) a continuous 2 min walk and (ii) 3 intermittent walks over a 12 m walkway. Gait variability was calculated as the within-person standard deviation for step velocity, length and width, and step, stance and swing duration. Reliability of gait variability ranged from poor to excellent (intraclass correlations .041-.860; relative limits of agreement 34-89%). Gait variability was more reliable during continuous walks. Control and PD participants demonstrated similar reliability. Increasing the number of steps improved reliability, with most improvement seen across the first 30 steps. In this study, we identified testing protocols that improve the reliability of measuring gait variability. We recommend using a continuous walking protocol and collecting no fewer than 30 steps. Early PD does not appear to impact negatively on the reliability of gait variability. Copyright © 2012 Elsevier B.V. All rights reserved.
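
    The variability measure itself is simple to compute; the sketch below shows the within-person standard deviation for one step parameter, using made-up step lengths rather than study data:

        # Gait variability as the within-person standard deviation of a step
        # parameter across recorded steps (illustrative values only).
        import statistics

        def gait_variability(step_values):
            return statistics.stdev(step_values)

        step_lengths_m = [0.62, 0.65, 0.61, 0.66, 0.63, 0.64]
        print(round(gait_variability(step_lengths_m), 3))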

  16. Evaluation of the reliability of maize reference assays for GMO quantification.

    Science.gov (United States)

    Papazova, Nina; Zhang, David; Gruden, Kristina; Vojvoda, Jana; Yang, Litao; Buh Gasparic, Meti; Blejec, Andrej; Fouilloux, Stephane; De Loose, Marc; Taverniers, Isabel

    2010-03-01

    A reliable PCR reference assay for relative genetically modified organism (GMO) quantification must be specific for the target taxon and amplify uniformly across the commercialised varieties within the considered taxon. Different reference assays for maize (Zea mays L.) are used in official methods for GMO quantification. In this study, we evaluated the reliability of eight existing maize reference assays, four of which are used in combination with an event-specific polymerase chain reaction (PCR) assay validated and published by the Community Reference Laboratory (CRL). We analysed the nucleotide sequence variation in the target genomic regions in a broad range of transgenic and conventional varieties and lines: MON 810 varieties cultivated in Spain and conventional varieties from various geographical origins and breeding histories. In addition, the reliability of the assays was evaluated based on their PCR amplification performance. A single base pair substitution, corresponding to a single nucleotide polymorphism (SNP) reported in an earlier study, was observed in the forward primer of one of the studied alcohol dehydrogenase 1 (Adh1) (70) assays in a large number of varieties. The SNP presence is consistent with the poor PCR performance observed for this assay across the tested varieties. The obtained data show that the Adh1 (70) assay used in the official CRL NK603 assay is unreliable. Based on our results from both the nucleotide stability study and the PCR performance test, we can conclude that the Adh1 (136) reference assay (T25 and Bt11 assays) as well as the tested high mobility group protein gene assay, which also form parts of CRL methods for quantification, are highly reliable. Despite the observed uniformity in the nucleotide sequence of the invertase gene assay, the PCR performance test reveals that this target sequence might occur in more than one copy. Finally, although currently not forming a part of official quantification methods, zein and SSIIb

  17. An FEC Adaptive Multicast MAC Protocol for Providing Reliability in WLANs

    Science.gov (United States)

    Basalamah, Anas; Sato, Takuro

    For wireless multicast applications like multimedia conferencing, voice over IP and video/audio streaming, reliable transmission of packets within a short delivery delay is needed. Moreover, reliability is crucial to the performance of error-intolerant applications like file transfer, distributed computing, chat and whiteboard sharing. Forward Error Correction (FEC) is frequently used in wireless multicast to enhance Packet Error Rate (PER) performance, but it cannot assure full reliability unless coupled with Automatic Repeat Request, forming what is known as Hybrid-ARQ. While reliable FEC can be deployed at different levels of the protocol stack, it cannot be deployed on the MAC layer of the unreliable IEEE 802.11 WLAN due to its inability to exchange ACKs with multiple recipients. In this paper, we propose a Multicast MAC protocol that enhances WLAN reliability by using Adaptive FEC and study its performance through mathematical analysis and simulation. Our results show that our protocol can deliver high reliability and throughput performance.

  18. Corrections to "Connectivity-Based Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs"

    Directory of Open Access Journals (Sweden)

    Choi Woo-Yong

    2010-01-01

    We have found errors in the throughput formulae presented in our paper "Connectivity-based reliable multicast MAC protocol for IEEE 802.11 wireless LANs". We provide the corrected formulae and numerical results.

  19. Shoulder muscle endurance: the development of a standardized and reliable protocol

    Directory of Open Access Journals (Sweden)

    Roy Jean-Sébastien

    2011-01-01

    Background: Shoulder muscle fatigue has been proposed as a possible link to explain the association between repetitive arm use and the development of rotator cuff disorders. To our knowledge, no standardized clinical endurance protocol has been developed to evaluate the effects of muscle fatigue on shoulder function. Such a test could improve clinical examination of individuals with shoulder disorders. Therefore, the purpose of this study was to establish a reliable protocol for objective assessment of shoulder muscle endurance. Methods: An endurance protocol was developed on a stationary dynamometer (Biodex System 3). The endurance protocol was performed in isotonic mode with the resistance set at 50% of each subject's peak torque as measured for shoulder external (ER) and internal rotation (IR). Each subject performed 60 continuous repetitions of IR/ER rotation. The endurance protocol was performed by 36 healthy individuals on two separate occasions at least two days apart. Maximal isometric shoulder strength tests were performed before and after the fatigue protocol to evaluate the effects of the endurance protocol and its reliability. Paired t-tests were used to evaluate the reduction in shoulder strength due to the protocol, while intraclass correlation coefficients (ICC) and minimal detectable change (MDC) were used to evaluate its reliability. Results: Maximal isometric strength was significantly decreased after the endurance protocol (P 0.84). Conclusions: Changes in muscular performance observed during and after the muscular endurance protocol suggest that the protocol did result in muscular fatigue. Furthermore, this study established that the resultant effects of fatigue of the proposed isotonic protocol were reproducible over time. The protocol was performed without difficulty by all volunteers and took less than 10 minutes to perform, suggesting that it might be feasible for clinical practice. This protocol could be used to induce

  20. Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs with Extended Service Range

    Science.gov (United States)

    Choi, Woo-Yong

    2011-11-01

    In this paper, we propose an efficient reliable multicast MAC protocol by which the AP (Access Point) can reliably transmit its multicast data frames to recipients in the AP's one-hop or two-hop transmission range. The AP uses the STAs (Stations) that are directly associated with it as relays for data delivery to remote recipients that cannot be reached directly. Based on the connectivity information among the recipients, the reliable multicast MAC protocol optimizes the number of RAK (Request for ACK) frame transmissions in a reasonable computational time. Numerical examples show that our proposed MAC protocol significantly enhances the MAC performance compared with the BMMM (Batch Mode Multicast MAC) protocol extended to support recipients in the AP's one-hop or two-hop transmission range in IEEE 802.11 wireless LANs.

  1. An Energy-Efficient Link Layer Protocol for Reliable Transmission over Wireless Networks

    Directory of Open Access Journals (Sweden)

    Iqbal Adnan

    2009-01-01

    In multihop wireless networks, hop-by-hop reliability is generally achieved through positive acknowledgments at the MAC layer. However, positive acknowledgments introduce significant energy inefficiencies on battery-constrained devices. This inefficiency becomes particularly significant on high error rate channels. We propose to reduce the energy consumption during retransmissions using a novel protocol that localizes bit-errors at the MAC layer. The proposed protocol, referred to as Selective Retransmission using Virtual Fragmentation (SRVF), requires simple modifications to the positive-ACK-based reliability mechanism but provides substantial improvements in energy efficiency. The main premise of the protocol is to localize bit-errors by performing partial checksums on disjoint parts, or virtual fragments, of a packet. In case of error, only the corrupted virtual fragments are retransmitted. We develop stochastic models of the simple positive-ACK-based reliability scheme, the previously proposed Packet Length Optimization (PLO) protocol, and the SRVF protocol operating over an arbitrary-order Markov wireless channel. Our analytical models show that SRVF provides significant theoretical improvements in energy efficiency over existing protocols. We then use bit-error traces collected over different real networks to empirically compare the proposed and existing protocols. These experimental results further substantiate that SRVF provides considerably better energy efficiency than the Simple Positive-ACK and Packet Length Optimization protocols.
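
    The virtual-fragmentation idea can be illustrated in a few lines: split a packet into fixed virtual fragments, keep a checksum per fragment, and resend only the fragments whose checksums disagree. The sketch below is an illustrative reconstruction under those assumptions, not the authors' implementation:

        # Partial per-fragment checksums for selective retransmission (sketch).
        import zlib

        def virtual_fragments(packet: bytes, n: int = 4):
            size = -(-len(packet) // n)   # ceiling division
            return [packet[i * size:(i + 1) * size] for i in range(n)]

        def partial_checksums(packet: bytes, n: int = 4):
            return [zlib.crc32(f) for f in virtual_fragments(packet, n)]

        def fragments_to_retransmit(sent: bytes, received: bytes, n: int = 4):
            return [i for i, (a, b) in enumerate(zip(partial_checksums(sent, n),
                                                     partial_checksums(received, n))) if a != b]

        sent = b"abcdefghijklmnop"
        received = b"abcdeXghijklmnop"   # one corrupted byte, falling in fragment 1
        print(fragments_to_retransmit(sent, received))   # -> [1]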

  2. A universal and reliable assay for molecular sex identification of three-spined sticklebacks (Gasterosteus aculeatus).

    Science.gov (United States)

    Toli, E-A; Calboli, F C F; Shikano, T; Merilä, J

    2016-11-01

    In heterogametic species, biological differences between the two sexes are ubiquitous, and hence, errors in sex identification can be a significant source of noise and bias in studies where sex-related sources of variation are of interest or need to be controlled for. We developed and validated a universal multimarker assay for reliable sex identification of three-spined sticklebacks (Gasterosteus aculeatus). The assay makes use of genotype scores from three sex-linked loci and utilizes Bayesian probabilistic inference to identify sex of the genotyped individuals. The results, validated with 286 phenotypically sexed individuals from six populations of sticklebacks representing all major genetic lineages (cf. Pacific, Atlantic and Japan Sea), indicate that in contrast to commonly used single-marker-based sex identification assays, the developed multimarker assay should be 100% accurate. As the markers in the assay can be scored from agarose gels, it provides a quick and cost-efficient tool for universal sex identification of three-spined sticklebacks. The general principle of combining information from multiple markers to improve the reliability of sex identification is transferable and can be utilized to develop and validate similar assays for other species. © 2016 John Wiley & Sons Ltd.
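
    The key idea above, combining genotype scores from several sex-linked loci through Bayesian inference, can be sketched with a naive posterior update; the per-marker genotype probabilities below are placeholders, not the published assay values:

        # Naive Bayesian combination of sex-linked marker genotypes (illustrative).
        def posterior_male(marker_likelihoods, prior_male=0.5):
            # marker_likelihoods: list of (P(genotype | male), P(genotype | female))
            p_male, p_female = prior_male, 1.0 - prior_male
            for lm, lf in marker_likelihoods:
                p_male *= lm
                p_female *= lf
            return p_male / (p_male + p_female)

        # Three markers, each strongly but imperfectly diagnostic (made-up numbers):
        print(posterior_male([(0.95, 0.05), (0.90, 0.10), (0.98, 0.02)]))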

  3. Development of high-reliable real-time communication network protocol for SMART

    Energy Technology Data Exchange (ETDEWEB)

    Song, Ki Sang; Kim, Young Sik [Korea National University of Education, Chongwon (Korea); No, Hee Chon [Korea Advanced Institute of Science and Technology, Taejon (Korea)

    1999-04-01

    In this research, we first define protocol subsets for the SMART (System-integrated Modular Advanced Reactor) communication network based on the SMART MMIS transmission delay and traffic requirements and the OSI (Open System Interconnection) seven-layer network protocol functions. Current industrial LAN protocols are also analyzed and the applicability of commercialized protocols is checked. For the suitability test, we applied approximated SMART data traffic and the maximum allowable transmission delay requirement. From the simulation results, we conclude that IEEE 802.5 and FDDI, which is an ANSI standard, are the most suitable for SMART. We further analyzed the FDDI and token ring protocols for the SMART and nuclear plant network environment, including IEEE 802.4, IEEE 802.5, and ARCnet. The most suitable protocol for SMART is FDDI, and the FDDI MAC and RMT protocol specifications have been verified with LOTOS; the verification results show that FDDI MAC and RMT satisfy reachability and liveness and exhibit neither deadlock nor livelock. Therefore, we conclude that FDDI MAC and RMT form a highly reliable protocol for the SMART MMIS network. We then consider the stacking fault of the IEEE 802.5 token ring protocol and propose a fault-tolerant MAM (Modified Active Monitor) protocol. The simulation results show that the MAM protocol improves the lower-priority traffic service rate when a stacking fault occurs. Therefore, the proposed MAM protocol can be applied to the SMART communication network for high-reliability and hard real-time communication purposes in data acquisition and inter-channel networks. (author). 37 refs., 79 figs., 39 tabs.

  4. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    Science.gov (United States)

    Wu, Yunqing

    1995-01-01

    The Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  5. Assessment of a robust model protocol with accelerated throughput for a human recombinant full length estrogen receptor-alpha binding assay: protocol optimization and intralaboratory assay performance as initial steps towards validation.

    Science.gov (United States)

    Freyberger, Alexius; Wilson, Vickie; Weimer, Marc; Tan, Shirlee; Tran, Hoai-Son; Ahr, Hans-Jürgen

    2010-08-01

    Despite about two decades of research in the field of endocrine active compounds, no validated human recombinant (hr) estrogen receptor-alpha (ERalpha) binding assay is yet available, although hr-ERalpha is available from several sources. In a joint effort, the US EPA and Bayer Schering Pharma, with funding from the EU-sponsored 6th framework project ReProTect, developed a model protocol for such a binding assay. Important features of this assay are the use of a full-length hr-ERalpha and performance in a 96-well plate format. A full-length hr-ERalpha was chosen as it was considered to provide the most accurate and human-relevant results, whereas truncated receptors could perform differently. Besides three reference compounds [17beta-estradiol, norethynodrel, dibutylphthalate], nine test compounds with different affinities for the ERalpha [diethylstilbestrol (DES), ethynylestradiol, meso-hexestrol, equol, genistein, o,p'-DDT, nonylphenol, n-butylparaben, and corticosterone] were used to explore the performance of the assay. Three independent experiments per compound were performed on different days, and dilutions of test compounds from deep-frozen stocks, solutions of radiolabeled ligand and the receptor preparation were freshly prepared for each experiment. The ERalpha binding properties of reference and test compounds were well detected. As expected, dibutylphthalate and corticosterone were non-binders in this assay. In terms of the relative ranking of binding affinities, there was good agreement with published data obtained from experiments using a human recombinant ERalpha ligand binding domain. Irrespective of the chemical nature of the compound, individual IC(50)-values for a given compound varied by not more than a factor of 2.5. Our data demonstrate that the assay was robust and reliably ranked compounds with strong, weak, and no affinity for the ERalpha with high accuracy. It avoids the manipulation and use of animals, i.e., the preparation of uterine cytosol as

  6. Comparison of mRNA splicing assay protocols across multiple laboratories: recommendations for best practice in standardized clinical testing.

    Science.gov (United States)

    Whiley, Phillip J; de la Hoya, Miguel; Thomassen, Mads; Becker, Alexandra; Brandão, Rita; Pedersen, Inge Sokilde; Montagna, Marco; Menéndez, Mireia; Quiles, Francisco; Gutiérrez-Enríquez, Sara; De Leeneer, Kim; Tenés, Anna; Montalban, Gemma; Tserpelis, Demis; Yoshimatsu, Toshio; Tirapo, Carole; Raponi, Michela; Caldes, Trinidad; Blanco, Ana; Santamariña, Marta; Guidugli, Lucia; de Garibay, Gorka Ruiz; Wong, Ming; Tancredi, Mariella; Fachal, Laura; Ding, Yuan Chun; Kruse, Torben; Lattimore, Vanessa; Kwong, Ava; Chan, Tsun Leung; Colombo, Mara; De Vecchi, Giovanni; Caligo, Maria; Baralle, Diana; Lázaro, Conxi; Couch, Fergus; Radice, Paolo; Southey, Melissa C; Neuhausen, Susan; Houdayer, Claude; Fackenthal, Jim; Hansen, Thomas Van Overeem; Vega, Ana; Diez, Orland; Blok, Rien; Claes, Kathleen; Wappenschmidt, Barbara; Walker, Logan; Spurdle, Amanda B; Brown, Melissa A

    2014-02-01

    Accurate evaluation of unclassified sequence variants in cancer predisposition genes is essential for clinical management and depends on a multifactorial analysis of clinical, genetic, pathologic, and bioinformatic variables and assays of transcript length and abundance. The integrity of assay data in turn relies on appropriate assay design, interpretation, and reporting. We conducted a multicenter investigation to compare mRNA splicing assay protocols used by members of the ENIGMA (Evidence-Based Network for the Interpretation of Germline Mutant Alleles) consortium. We compared similarities and differences in results derived from analysis of a panel of breast cancer 1, early onset (BRCA1) and breast cancer 2, early onset (BRCA2) gene variants known to alter splicing (BRCA1: c.135-1G>T, c.591C>T, c.594-2A>C, c.671-2A>G, and c.5467+5G>C and BRCA2: c.426-12_8delGTTTT, c.7988A>T, c.8632+1G>A, and c.9501+3A>T). Differences in protocols were then assessed to determine which elements were critical in reliable assay design. PCR primer design strategies, PCR conditions, and product detection methods, combined with a prior knowledge of expected alternative transcripts, were the key factors for accurate splicing assay results. For example, because of the position of primers and PCR extension times, several isoforms associated with BRCA1, c.594-2A>C and c.671-2A>G, were not detected by many sites. Variation was most evident for the detection of low-abundance transcripts (e.g., BRCA2 c.8632+1G>A Δ19,20 and BRCA1 c.135-1G>T Δ5q and Δ3). Detection of low-abundance transcripts was sometimes addressed by using more analytically sensitive detection methods (e.g., BRCA2 c.426-12_8delGTTTT ins18bp). We provide recommendations for best practice and raise key issues to consider when designing mRNA assays for evaluation of unclassified sequence variants.

  7. Reliability of plant root comet assay in comparison with human leukocyte comet assay for assessment environmental genotoxic agents.

    Science.gov (United States)

    Reis, Gabriela Barreto Dos; Andrade-Vieira, Larissa Fonseca; Moraes, Isabella de Campos; César, Pedro Henrique Souza; Marcussi, Silvana; Davide, Lisete Chamma

    2017-08-01

    The comet assay is an efficient test to detect genotoxic compounds based on the observation of DNA damage. The aim of this work was to compare the results obtained from the comet assay in two different types of cells: cells extracted from root tips of Lactuca sativa L. and human blood cells. For this, Spent Pot Liner (SPL) and its components (aluminum and fluoride) were applied as toxic agents. SPL is a solid waste generated by the aluminum mining and processing industry with known toxicity. Three concentrations of all tested solutions were applied, and the damage observed was compared to negative and positive controls. An increase in the frequency of DNA damage was observed for human leukocytes and plant cells in all treatments. In human leukocytes, SPL induced the highest percentage of damage, with an average of 87.68%. For root tip cells of L. sativa the highest percentage of damage was detected for aluminum (93.89%). Considering the arbitrary units (AU), the average of nuclei with high levels of DNA fragmentation was significant for both cell types evaluated. The tested cells demonstrated equal effectiveness for detection of the genotoxicity induced by SPL and its chemical components, aluminum and fluoride. Further, using a single method, the comet assay, we showed that cells from root tips of Lactuca sativa represent a reliable model for detecting DNA damage induced by genotoxic pollutants, in agreement with the results observed using human leukocytes as a model. Thus, plant cells may be suggested as an important system to assess the toxicological risk of environmental agents. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Reliability and criterion validity of an observation protocol for working technique assessments in cash register work.

    Science.gov (United States)

    Palm, Peter; Josephson, Malin; Mathiassen, Svend Erik; Kjellberg, Katarina

    2016-06-01

    We evaluated the intra- and inter-observer reliability and criterion validity of an observation protocol, developed in an iterative process involving practicing ergonomists, for assessment of working technique during cash register work for the purpose of preventing upper extremity symptoms. Two ergonomists independently assessed 17 15-min videos of cash register work on two occasions each, as a basis for examining reliability. Criterion validity was assessed by comparing these assessments with meticulous video-based analyses by researchers. Intra-observer reliability was acceptable (i.e. proportional agreement >0.7 and kappa >0.4) for 10/10 questions. Inter-observer reliability was acceptable for only 3/10 questions. An acceptable inter-observer reliability combined with an acceptable criterion validity was obtained for only one working technique aspect, 'Quality of movements'. Thus, major elements of the cashiers' working technique could not be assessed with acceptable accuracy from short periods of observation by one observer, as is often desired by practitioners. Practitioner Summary: We examined an observation protocol for assessing working technique in cash register work. It was feasible in use, but inter-observer reliability and criterion validity were generally not acceptable when working technique aspects were assessed from short periods of work. We recommend the protocol be used for educational purposes only.
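
    For readers unfamiliar with the two acceptability criteria quoted above, the sketch below computes proportional agreement and Cohen's kappa for a single binary observation item; the ratings are invented for illustration:

        # Proportional agreement and Cohen's kappa for one yes/no item (sketch).
        def agreement_stats(rater_a, rater_b):
            n = len(rater_a)
            p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            p_a_yes = sum(rater_a) / n
            p_b_yes = sum(rater_b) / n
            p_exp = p_a_yes * p_b_yes + (1 - p_a_yes) * (1 - p_b_yes)
            kappa = (p_obs - p_exp) / (1 - p_exp) if p_exp < 1 else 1.0
            return p_obs, kappa

        a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]   # rater A, 1 = acceptable technique
        b = [1, 1, 0, 1, 0, 0, 1, 1, 1, 1]   # rater B
        print(agreement_stats(a, b))          # acceptable if p_obs > 0.7 and kappa > 0.4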

  9. An improved IEEE 802.11 protocol for reliable data transmission in power distribution fault diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Campoccia, F.; Di Silvestre, M.L.; Sanseverino, E.R.; Zizzo, G. [Palermo Univ., Palermo (Italy)

    2010-10-15

    In power systems, on-line transmission between local units and the central unit can be done by means of power line communications or wireless technology. During an electrical fault, the reliability of the distribution system depends on the security of the timely protective and restorative actions on the network. This paper focused on WiFi because of its economy and ease of installation. However, WiFi systems are typically managed by the IEEE 802.11 protocol, which is not reliable in terms of security in data communication. In WiFi networks, data are divided into packets and sent in succession to reduce errors within the radio channel. The IEEE 802.11 protocol has a high probability of packet loss or delay in transmission. In order to ensure reliable data transmission times between two terminal units connected by WiFi stations, a new protocol was derived by modifying IEEE 802.11. The improvements of the new protocol were highlighted and its capability for the diagnostic service was verified. The modified protocol eliminates the danger of collisions between packets and optimizes the transmission time for sending information. 6 refs., 7 tabs., 8 figs.

  10. IRR (Inter-Rater Reliability) of a COP (Classroom Observation Protocol)--A Critical Appraisal

    Science.gov (United States)

    Rui, Ning; Feldman, Jill M.

    2012-01-01

    Notwithstanding broad utility of COPs (classroom observation protocols), there has been limited documentation of the psychometric properties of even the most popular COPs. This study attempted to fill this void by closely examining the item and domain-level IRR (inter-rater reliability) of a COP that was used in a federally funded striving readers…

  11. RELIABLE DYNAMIC SOURCE ROUTING PROTOCOL (RDSRP) FOR ENERGY HARVESTING WIRELESS SENSOR NETWORKS

    Directory of Open Access Journals (Sweden)

    B. Narasimhan

    2015-03-01

    Wireless sensor networks (WSNs) carry noteworthy advantages over traditional communication. However, harsh and complex environments pose great challenges to the reliability of WSN communications. It is therefore vital to develop a reliable unipath dynamic source routing protocol (RDSRP) for WSNs to provide better quality of service (QoS) in energy harvesting wireless sensor networks (EH-WSNs). This paper proposes a dynamic source routing approach for attaining the most reliable route in EH-WSNs. Performance evaluation is carried out using NS-2, with throughput and packet delivery ratio chosen as the metrics.

  12. Reassessing the reliability of the salivary cortisol assay for the diagnosis of Cushing syndrome.

    Science.gov (United States)

    Zhang, Qian; Dou, Jingtao; Gu, Weijun; Yang, Guoqing; Lu, Juming

    2013-10-01

    The cortisol concentration in saliva is 10-fold lower than total serum cortisol and accurately reflects the serum concentration, both levels being lowest around midnight. The salivary cortisol assay measures free cortisol and is unaffected by confounding factors. This study analysed published data on the sensitivity and specificity of salivary cortisol levels in the diagnosis of Cushing syndrome. Data from studies on the use of different salivary cortisol assay techniques in the diagnosis of Cushing syndrome, published between 1998 and 2012 and retrieved using Ovid MEDLINE®, were analysed for variance and correlation. For the 11 studies analysed, the mean sensitivity and specificity of the salivary cortisol assay were both >90%. Repeated measurements were easily made with this assay, enabling improved diagnostic accuracy in comparison with total serum cortisol measurements. This analysis confirms the reliability of the salivary cortisol assay as a pragmatic tool for the accurate diagnosis of Cushing syndrome. With many countries reporting a rising prevalence of metabolic syndrome, diabetes and obesity--in which there is often a high circulating cortisol level--salivary cortisol measurement will help distinguish these states from Cushing syndrome.

  13. CPM Test-Retest Reliability: "Standard" vs "Single Test-Stimulus" Protocols.

    Science.gov (United States)

    Granovsky, Yelena; Miller-Barmak, Adi; Goldstein, Oren; Sprecher, Elliot; Yarnitsky, David

    2016-03-01

    Assessment of pain inhibitory mechanisms using conditioned pain modulation (CPM) is relevant clinically in prediction of pain and analgesic efficacy. Our objective is to provide necessary estimates of intersession CPM reliability, to enable transformation of the CPM paradigm into a clinical tool. Two cohorts of young healthy subjects (N = 65) participated in two dual-session studies. In Study I, a Bath-Thermode CPM protocol was used, with hot water immersion and contact heat as conditioning- and test-stimuli, respectively, in a classical parallel CPM design introducing the test-stimulus first, and then the conditioning- and repeated test-stimuli in parallel. Study II consisted of two CPM protocols: 1) Two-Thermodes, one for each of the stimuli, in the same parallel design as above, and 2) a single test-stimulus (STS) protocol with a single administration of a contact heat test-stimulus, partially overlapped in time by a remote shorter contact heat as the conditioning stimulus. Test-retest reliability was assessed within 3-7 days. The STS-CPM had superior reliability (intraclass correlation, ICC(2,1) = 0.59) over the Bath-Thermode (ICC(2,1) = 0.34) or Two-Thermodes (ICC(2,1) = 0.21) protocols. The hand immersion conditioning pain had higher reliability than thermode pain (ICC(2,1) = 0.76 vs ICC(2,1) = 0.16). Conditioned test-stimulus pain scores were of good (ICC(2,1) = 0.62) or fair (ICC(2,1) = 0.43) reliability for the Bath-Thermode and the STS, respectively, but not for the Two-Thermodes protocol (ICC(2,1) = 0.20). The newly developed STS-CPM paradigm was more reliable than the other CPM protocols tested here, and should be further investigated for its clinical relevance. It appears that the large contact size of the conditioning-stimulus and use of a single rather than dual test-stimulus pain contribute to augmentation of CPM reliability. © 2015 American Academy of Pain Medicine. All rights reserved. For permissions, please e

  14. A Smart Collaborative Routing Protocol for Reliable Data Diffusion in IoT Scenarios.

    Science.gov (United States)

    Ai, Zheng-Yang; Zhou, Yu-Tong; Song, Fei

    2018-06-13

    It is knotty for current routing protocols to meet the needs of reliable data diffusion during the Internet of Things (IoT) deployments. Due to the random placement, limited resources and unattended features of existing sensor nodes, the wireless transmissions are easily exposed to unauthorized users, which becomes a vulnerable area for various malicious attacks, such as wormhole and Sybil attacks. However, the scheme based on geographic location is a suitable candidate to defend against them. This paper is inspired to propose a smart collaborative routing protocol, Geographic energy aware routing and Inspecting Node (GIN), for guaranteeing the reliability of data exchanging. The proposed protocol integrates the directed diffusion routing, Greedy Perimeter Stateless Routing (GPSR), and the inspecting node mechanism. We first discuss current wireless routing protocols from three diverse perspectives (improving transmission rate, shortening transmission range and reducing transmission consumption). Then, the details of GIN, including the model establishment and implementation processes, are presented by means of the theoretical analysis. Through leveraging the game theory, the inspecting node is elected to monitor the network behaviors. Thirdly, we evaluate the network performances, in terms of transmission delay, packet loss ratio, and throughput, between GIN and three traditional schemes (i.e., Flooding, GPSR, and GEAR). The simulation results illustrate that the proposed protocol is able to outperform the others.

  15. Test-retest reliability of jump execution variables using mechanography: a comparison of jump protocols.

    Science.gov (United States)

    Fitzgerald, John S; Johnson, LuAnn; Tomkinson, Grant; Stein, Jesse; Roemmich, James N

    2018-05-01

    Mechanography during the vertical jump may enhance screening and help determine mechanistic causes underlying changes in physical performance. The utility of jump mechanography for evaluation is limited by scant test-retest reliability data on force-time variables. This study examined the test-retest reliability of eight jump execution variables assessed from mechanography. Thirty-two women (mean±SD: age 20.8 ± 1.3 yr) and 16 men (age 22.1 ± 1.9 yr) attended a familiarization session and two testing sessions, all one week apart. Participants performed two variations of the squat jump, with squat depth self-selected and controlled using a goniometer to 80º knee flexion. Test-retest reliability was quantified as the systematic error (using effect size between jumps), random error (using coefficients of variation), and test-retest correlations (using intra-class correlation coefficients). Overall, jump execution variables demonstrated acceptable reliability, evidenced by small systematic errors (mean±95%CI: 0.2 ± 0.07), moderate random errors (mean±95%CI: 17.8 ± 3.7%), and very strong test-retest correlations (range: 0.73-0.97). Differences in random errors between controlled and self-selected protocols were negligible (mean±95%CI: 1.3 ± 2.3%). Jump execution variables demonstrated acceptable reliability, with no meaningful differences between the controlled and self-selected jump protocols. To simplify testing, a self-selected jump protocol can be used to assess force-time variables with negligible impact on measurement error.
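
    The two error components named above are straightforward to compute from paired sessions. The sketch below uses invented jump heights and standard definitions (standardized mean difference for systematic error, typical error expressed as a coefficient of variation for random error); it is not the authors' analysis code:

        # Systematic error (effect size) and random error (typical error as CV)
        # between two testing sessions, with illustrative data only.
        import statistics

        def effect_size(s1, s2):
            diff = statistics.mean(s2) - statistics.mean(s1)
            pooled_sd = ((statistics.stdev(s1) ** 2 + statistics.stdev(s2) ** 2) / 2) ** 0.5
            return diff / pooled_sd

        def typical_error_cv(s1, s2):
            diffs = [b - a for a, b in zip(s1, s2)]
            typical_error = statistics.stdev(diffs) / (2 ** 0.5)
            return 100.0 * typical_error / statistics.mean(s1 + s2)

        session1 = [31.2, 28.4, 35.0, 30.1, 29.8]   # jump heights (cm), made up
        session2 = [31.9, 29.0, 34.2, 30.8, 30.5]
        print(round(effect_size(session1, session2), 2), round(typical_error_cv(session1, session2), 1))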

  16. Test-retest reliability of a balance testing protocol with external perturbations in young healthy adults.

    Science.gov (United States)

    Robbins, Shawn M; Caplan, Ryan M; Aponte, Daniel I; St-Onge, Nancy

    2017-10-01

    External perturbations are utilized to challenge balance and mimic realistic balance threats in patient populations. The reliability of such protocols has not been established. The purpose was to examine the test-retest reliability of balance testing with external perturbations. Healthy adults (n=34; mean age 23 years) underwent balance testing over two visits. Participants completed ten balance conditions in which the following parameters were combined: perturbation or non-perturbation, single or double leg, and eyes open or closed. Three trials were collected for each condition. Data were collected on a force plate and external perturbations were applied by translating the plate. Force plate center of pressure (CoP) data were summarized using 13 different CoP measures. Test-retest reliability was examined using intraclass correlation coefficients (ICC) and Bland-Altman plots. CoP measures of total speed and excursion in both anterior-posterior and medial-lateral directions generally had acceptable ICC values for perturbation conditions (ICC=0.46 to 0.87); however, many other CoP measures (e.g. range, area of ellipse) had unacceptable test-retest reliability. Adjustments to balance testing protocols that include external perturbations should be made to improve test-retest reliability and diminish learning, including more extensive participant training and increasing the number of trials. CoP measures that consider all data points (e.g. total speed) are more reliable than those that only consider a few data points. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. A safe and reliable neutralization assay based on pseudovirus to measure neutralizing antibody titer against poliovirus.

    Science.gov (United States)

    Liu, Shaohua; Song, Dongmei; Bai, Han; Lu, Weiwei; Dai, Xinxian; Hao, Chunsheng; Zhang, Zhongyang; Guo, Huijie; Zhang, Yue; Li, Xiuling

    2017-12-01

    With the promotion of the inactivated poliomyelitis vaccine (IPV) and the live attenuated oral poliomyelitis vaccine (OPV), the number of globally reported cases of poliomyelitis has fallen sharply, from 0.35 million in 1988 to 74 in 2015. The Polio Eradication & Endgame Strategic Plan published by WHO in 2013 included the strategy of implementing poliovirus safe handling and containment measures to minimize the risk of facility-associated reintroduction of the virus into the polio-free community, to prevent the re-importation of poliovirus. Toward this strategy, we produced replication-incompetent pseudoviruses of poliovirus type 1, 2 and 3 attenuated strains by constructing poliovirus capsid expression vectors and a poliovirus replicon and then transfecting HEK293T cells, and we developed a pseudovirus-based neutralization assay (pNA) to determine neutralizing antibody titers, which is more secure, time-saving and reliable than the conventional neutralization assay (cNA). Using anti-poliovirus rat serum, we demonstrated excellent correlation between neutralizing antibody titers measured by cNA and pNA. It was concluded that pNA can be a potential alternative to cNA as a safe and time-saving system for titer determination once live polioviruses are placed under containment. © 2017 Wiley Periodicals, Inc.

  18. Reliability of soluble IL-2 receptor measurements obtained with enzyme-linked immunosorbent assay

    International Nuclear Information System (INIS)

    Akiyama, Mitoshi; Takaishi, Masatoshi; Murakami, Yoshie; Ueda, Ryuzo; Yamakido, Michio; Tsubokura, Tokuo.

    1989-09-01

    Using an enzyme-linked immunosorbent assay (ELISA), human soluble interleukin-2 receptors (IL-2R) were measured in the serum of patients with various autoimmune system diseases. To study the sensitivity and specificity of the assay, soluble IL-2Rs were measured in the culture supernatants and in the cell extracts of peripheral blood mononuclear cells activated with phytohemagglutinin (PHA), purified protein derivative of tuberculin, and allogeneic lymphocytes, as well as in the serum of patients with various collagen diseases. The results correlated well with reports from other laboratories. For example, when stimulated by PHA, the greatest amount of soluble IL-2Rs was produced at the fastest rate. In addition, soluble IL-2R levels in the serum of collagen disease patients were significantly higher than those in healthy persons, who themselves exhibited low levels of detectable soluble IL-2Rs. It is hoped that reliable ELISA measurements of soluble IL-2Rs in the serum of atomic bomb survivors will assist in the interpretation of data collected during the work described in RP 2-87, a study of autoimmunity and autoimmune diseases in the Adult Health Study. (author)

  19. Reliability of the MODS assay decentralisation process in three health regions in Peru

    Science.gov (United States)

    Mendoza, A.; Castillo, E.; Gamarra, N.; Huamán, T.; Perea, M.; Monroi, Y.; Salazar, R.; Coronel, J.; Acurio, M.; Obregón, G.; Roper, M.; Bonilla, C.; Asencios, L.; Moore, D. A. J.

    2011-01-01

    OBJECTIVE: To deliver rapid isoniazid (INH) and rifampicin (RMP) drug susceptibility testing (DST) close to the patient, we designed a decentralisation process for the microscopic observation drug susceptibility (MODS) assay in Peru and evaluated its reliability. METHODS: After 2 weeks of training, laboratory staff processed ≥120 consecutive sputum samples each in three regional laboratories. Samples were processed in parallel with MODS testing at an expert laboratory. Blinded paired results were independently analysed by the Instituto Nacional de Salud (INS) according to predetermined criteria: concordance for culture, DST against INH and RMP, and diagnosis of multidrug-resistant tuberculosis (MDR-TB) ≥ 95%; McNemar's P > 0.05; kappa index (κ) ≥ 0.75; and contamination 1–4%. Sensitivity and specificity for MDR-TB were calculated. RESULTS: The accreditation process for Callao (126 samples, 79.4% smear-positive), Lima Sur (n = 130, 84%) and Arequipa (n = 126, 80%) took 94, 97 and 173 days, respectively. Results in all regional laboratories met or exceeded the predetermined criteria. The sensitivity and specificity for detecting MDR-TB in regional laboratories were >95%, except for sensitivity in Lima Sur, which was 91.7%. Contamination was 1.0–2.3%. The mean delay to positive MODS results was 9.9–12.9 days. CONCLUSION: Technology transfer of MODS was reliable, effective and fast, enabling the INS to accredit regional laboratories swiftly. PMID:21219684

  20. The reliability of a single protocol to determine endothelial, microvascular and autonomic functions in adolescents.

    Science.gov (United States)

    Bond, Bert; Williams, Craig A; Barker, Alan R

    2017-11-01

    Impairments in macrovascular, microvascular and autonomic function are present in asymptomatic youths with clustered cardiovascular disease risk factors. This study determines the within-day reliability and between-day reliability of a single protocol to non-invasively assess these outcomes in adolescents. Forty 12- to 15-year-old adolescents (20 boys) visited the laboratory in a fasted state on two occasions, approximately 1 week apart. One hour after a standardized cereal breakfast, macrovascular function was determined via flow-mediated dilation (FMD). Heart rate variability (root mean square of successive R-R intervals; RMSSD) was determined from the ECG-gated ultrasound images acquired during the FMD protocol prior to cuff occlusion. Microvascular function was simultaneously quantified as the peak (PRH) and total (TRH) hyperaemic response to occlusion in the cutaneous circulation of the forearm via laser Doppler imaging. To address within-day reliability, a subset of twenty adolescents (10 boys) repeated these measures 90 min afterwards on one occasion. The within-day typical error and between-day typical error expressed as a coefficient of variation of these outcomes are as follows: ratio-scaled FMD, 5·1% and 10·6%; allometrically scaled FMD, 4·4% and 9·4%; PRH, 11% and 13·3%; TRH, 29·9% and 23·1%; and RMSSD, 17·6% and 17·6%. The within- and between-day test-retest correlation coefficients for these outcomes were all significant (r > 0·54 for all). Macrovascular, microvascular and autonomic functions can be simultaneously and non-invasively determined in adolescents using a single protocol with an appropriate degree of reproducibility. Determining these outcomes may provide greater understanding of the progression of cardiovascular disease and aid early intervention. © 2016 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.

  1. A reliable, practical, and economical protocol for inducing diarrhea and severe dehydration in the neonatal calf.

    OpenAIRE

    Walker, P G; Constable, P D; Morin, D E; Drackley, J K; Foreman, J H; Thurmon, J C

    1998-01-01

    Fifteen healthy, colostrum-fed, male dairy calves, aged 2 to 7 d were used in a study to develop a diarrhea protocol for neonatal calves that is reliable, practical, and economical. After instrumentation and recording baseline data, diarrhea and dehydration were induced by administering milk replacer [16.5 mL/kg of body weight (BW), PO], sucrose (2 g/kg in a 20% aqueous solution, p.o.), spironolactone and hydrochlorothiazide (1 mg/kg, PO) every 8 h, and furosemide (2 mg/kg, i.m., q6h). Calves...

  2. Specification and Design of a Fault Recovery Model for the Reliable Multicast Protocol

    Science.gov (United States)

    Montgomery, Todd; Callahan, John R.; Whetten, Brian

    1996-01-01

    The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages using an underlying IP Multicast media to other group members in a distributed environment even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.

  3. A reliable transmission protocol for ZigBee-based wireless patient monitoring.

    Science.gov (United States)

    Chen, Shyr-Kuen; Kao, Tsair; Chan, Chia-Tai; Huang, Chih-Ning; Chiang, Chih-Yen; Lai, Chin-Yu; Tung, Tse-Hua; Wang, Pi-Chung

    2012-01-01

    Patient monitoring systems are gaining importance as the fast-growing global elderly population increases demands for caretaking. These systems use wireless technologies to transmit vital signs for medical evaluation. In a multihop ZigBee network, the existing systems usually use broadcast or multicast schemes to increase the reliability of signal transmission; however, both schemes lead to significantly higher network traffic and end-to-end transmission delay. In this paper, we present a reliable transmission protocol based on anycast routing for wireless patient monitoring. Our scheme automatically selects the closest data receiver in an anycast group as a destination to reduce the transmission latency as well as the control overhead. The new protocol also shortens the latency of path recovery by initiating route recovery from the intermediate routers of the original path. On the basis of a reliable transmission scheme, we implement a ZigBee device for fall monitoring, which integrates fall detection, indoor positioning, and ECG monitoring. When the triaxial accelerometer of the device detects a fall, the current position of the patient is transmitted to an emergency center through a ZigBee network. In order to clarify the situation of the fallen patient, 4-s ECG signals are also transmitted. Our transmission scheme ensures the successful transmission of these critical messages. The experimental results show that our scheme is fast and reliable. We also demonstrate that our devices can seamlessly integrate with the next generation technology of wireless wide area network, worldwide interoperability for microwave access, to achieve real-time patient monitoring.
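
    The core idea of the anycast step described above, choosing the nearest member of a group of equivalent data receivers, can be sketched in a few lines. This is an illustrative Python fragment with hypothetical node names and hop counts, not the authors' ZigBee implementation:

    from typing import Dict

    def select_anycast_destination(hop_counts: Dict[str, int]) -> str:
        # Pick the receiver in the anycast group with the fewest hops,
        # which reduces end-to-end latency and forwarding overhead.
        return min(hop_counts, key=hop_counts.get)

    # Hypothetical hop counts from a sensing node to three gateway receivers
    group = {"gateway-A": 4, "gateway-B": 2, "gateway-C": 3}
    print(select_anycast_destination(group))  # gateway-B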

  4. A reliable, practical, and economical protocol for inducing diarrhea and severe dehydration in the neonatal calf.

    Science.gov (United States)

    Walker, P G; Constable, P D; Morin, D E; Drackley, J K; Foreman, J H; Thurmon, J C

    1998-07-01

    Fifteen healthy, colostrum-fed, male dairy calves, aged 2 to 7 d were used in a study to develop a diarrhea protocol for neonatal calves that is reliable, practical, and economical. After instrumentation and recording baseline data, diarrhea and dehydration were induced by administering milk replacer [16.5 mL/kg of body weight (BW), PO], sucrose (2 g/kg in a 20% aqueous solution, p.o.), spironolactone and hydrochlorothiazide (1 mg/kg, PO) every 8 h, and furosemide (2 mg/kg, i.m., q6h). Calves were administered sucrose and diuretic agents for 48 h to induce diarrhea and severe dehydration. Clinical changes after 48 h were severe watery diarrhea, severe depression, and marked dehydration (mean, 14% BW loss). Cardiac output, stroke volume, mean central venous pressure, plasma volume, thiocyanate space, blood pH and bicarbonate concentration, base excess, serum chloride concentration, and fetlock temperature were decreased. Plasma lactate concentration, hematocrit, and serum potassium, creatinine, phosphorus, total protein and albumin concentrations were increased. This non-infectious calf diarrhea protocol has a 100% response rate, while providing a consistent and predictable hypovolemic state with diarrhea that reflects most of the clinicopathologic changes observed in osmotic/maldigestive diarrhea caused by infection with rotavirus, coronavirus or cryptosporidia. Limitations of the protocol, when compared to infectious diarrhea models, include failure to induce a severe metabolic acidosis, absence of hyponatremia, renal instead of enteric loss of chloride, renal as well as enteric loss of free water, absence of profound clinical depression and suspected differences in the morphologic and functional effect on intestinal epithelium. Despite these differences, the sucrose/diuretic protocol should be useful in the initial screening of new treatment modalities for calf diarrhea. To confirm their efficacy, the most effective treatment methods should then be examined in

  5. Determining Reliability of a Dual-Task Functional Mobility Protocol for Individuals With Lower Extremity Amputation.

    Science.gov (United States)

    Hunter, Susan W; Frengopoulos, Courtney; Holmes, Jeff; Viana, Ricardo; Payne, Michael W

    2018-04-01

    To determine the relative and absolute reliability of a dual-task functional mobility assessment. Cross-sectional study. Academic rehabilitation hospital. Individuals (N=60) with lower extremity amputation attending an outpatient amputee clinic (mean age, 58.21±12.59y; 18, 80% male) who were stratified into 3 groups: (1) transtibial amputation of vascular etiology (n=20); (2) transtibial amputation of nonvascular etiology (n=20); and (3) transfemoral or bilateral amputation of any etiology (n=20). Not applicable. Time to complete the L Test measured functional mobility under single- and dual-task conditions. The addition of a cognitive task (serial subtractions by 3's) created dual-task conditions. Single-task performance on the cognitive task was also reported. Intraclass correlation coefficients (ICCs) measured relative reliability; SEM and minimal detectable change with a 95% confidence interval (MDC 95 ) measured absolute reliability. Bland-Altman plots measured agreement between assessments. Relative reliability results were excellent for all 3 groups. Values for the dual-task L Test for those with transtibial amputation of vascular etiology (n=20; mean age, 60.36±7.84y; 19, 90% men) were ICC=.98 (95% confidence interval [CI], .94-.99), SEM=1.36 seconds, and MDC 95 =3.76 seconds; for those with transtibial amputation of nonvascular etiology (n=20; mean age, 55.85±14.08y; 17, 85% men), values were ICC=.93 (95% CI, .80-.98), SEM=1.34 seconds, and MDC 95 =3.71 seconds; and for those with transfemoral or bilateral amputation (n=20; mean age, 58.21±14.88y; 13, 65% men), values were ICC=.998 (95% CI, .996-.999), SEM=1.03 seconds, and MDC 95 =2.85 seconds. Bland-Altman plots indicated that assessments did not vary systematically for each group. This dual-task assessment protocol achieved approved levels of relative reliability values for the 3 groups tested. This protocol may be used clinically or in research settings to assess the interaction between cognition
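
    The absolute-reliability statistics quoted in this record follow the standard definitions (SEM derived from the standard deviation and the ICC; MDC95 derived from the SEM). As an illustration only, not the authors' analysis code, a minimal Python sketch of the two formulas with a hypothetical standard deviation:

    import math

    def sem_from_icc(sd: float, icc: float) -> float:
        # Standard error of measurement: SEM = SD * sqrt(1 - ICC)
        return sd * math.sqrt(1.0 - icc)

    def mdc95(sem: float) -> float:
        # Minimal detectable change at 95% confidence: MDC95 = SEM * 1.96 * sqrt(2)
        return sem * 1.96 * math.sqrt(2.0)

    # Hypothetical pooled SD of 9.6 s for a timed L Test with ICC = 0.98
    sem = sem_from_icc(sd=9.6, icc=0.98)
    print(round(sem, 2), round(mdc95(sem), 2))  # about 1.36 s and 3.76 s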

  6. Reliability and fatigue characteristics of a standing hip isometric endurance protocol.

    Science.gov (United States)

    Mutchler, Jessica A; Weinhandl, Joshua T; Hoch, Matthew C; Van Lunen, Bonnie L

    2015-08-01

    Muscle fatigue is a common consideration when evaluating and rehabilitating athletic injuries. The presence of muscular fatigue has been previously determined by quantifying median frequency (MF) through a power spectral analysis on EMG signals collected throughout an endurance task. Research has not yet determined if a prolonged isometric test in a standing position generates muscular fatigue of the hip. The purpose of this study was to determine the reliability and fatigue characteristics of a standing hip isometric endurance test. Twenty healthy participants completed one 60-s Maximum Voluntary Isometric Contraction of standing hip flexion, extension, adduction, and abduction. MF of the participants' dominant limb rectus femoris (RF), biceps femoris (BF), gluteus maximus (GMax), gluteus medius (GMed) and adductor longus (ADD) was determined via surface electromyography during two sessions, 30-min apart. Reliability values (ICC2,1) were moderate-to-excellent for all time intervals of each action (FlexionRF: >0.80; ExtensionBF: >0.89; ExtensionGMax: >0.60; AdductionADD: >0.78; AbductionGMed: >0.60) and MF significantly decreased over time for all actions. Results suggest the endurance test is a reliable technique to generate muscular fatigue for hip flexion, extension, adduction and abduction. It can be used as a time efficient fatigue protocol specific to the RF, BF, GMax, ADD and GMed. Copyright © 2015 Elsevier Ltd. All rights reserved.
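
    Median frequency is conventionally defined as the frequency that divides the EMG power spectrum into two halves of equal power. The following is a minimal sketch of that computation, using a Welch power spectral density estimate on a synthetic signal and an assumed 1 kHz sampling rate; it is not the authors' processing pipeline:

    import numpy as np
    from scipy.signal import welch

    def median_frequency(signal: np.ndarray, fs: float) -> float:
        # Estimate the power spectral density, then locate the frequency at which
        # cumulative power reaches half of the total power (the median frequency).
        freqs, psd = welch(signal, fs=fs, nperseg=1024)
        cumulative = np.cumsum(psd)
        return float(freqs[np.searchsorted(cumulative, cumulative[-1] / 2.0)])

    # Synthetic surface-EMG-like noise, 60 s at 1 kHz (hypothetical data)
    rng = np.random.default_rng(0)
    fs = 1000.0
    emg = rng.normal(size=60 * int(fs))
    print(median_frequency(emg, fs))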

  7. A Performance Evaluation of NACK-Oriented Protocols as the Foundation of Reliable Delay- Tolerant Networking Convergence Layers

    Science.gov (United States)

    Iannicca, Dennis; Hylton, Alan; Ishac, Joseph

    2012-01-01

    Delay-Tolerant Networking (DTN) is an active area of research in the space communications community. DTN uses a standard layered approach with the Bundle Protocol operating on top of transport layer protocols known as convergence layers that actually transmit the data between nodes. Several different common transport layer protocols have been implemented as convergence layers in DTN implementations including User Datagram Protocol (UDP), Transmission Control Protocol (TCP), and Licklider Transmission Protocol (LTP). The purpose of this paper is to evaluate several stand-alone implementations of negative-acknowledgment based transport layer protocols to determine how they perform in a variety of different link conditions. The transport protocols chosen for this evaluation include Consultative Committee for Space Data Systems (CCSDS) File Delivery Protocol (CFDP), Licklider Transmission Protocol (LTP), NACK-Oriented Reliable Multicast (NORM), and Saratoga. The test parameters that the protocols were subjected to are characteristic of common communications links ranging from terrestrial to cis-lunar and apply different levels of delay, line rate, and error.

  8. Establishment and intra-/inter-laboratory validation of a standard protocol of reactive oxygen species assay for chemical photosafety evaluation.

    Science.gov (United States)

    Onoue, Satomi; Hosoi, Kazuhiro; Wakuri, Shinobu; Iwase, Yumiko; Yamamoto, Toshinobu; Matsuoka, Naoko; Nakamura, Kazuichi; Toda, Tsuguto; Takagi, Hironori; Osaki, Naoto; Matsumoto, Yasuhiro; Kawakami, Satoru; Seto, Yoshiki; Kato, Masashi; Yamada, Shizuo; Ohno, Yasuo; Kojima, Hajime

    2013-11-01

    A reactive oxygen species (ROS) assay was previously developed for photosafety evaluation of pharmaceuticals, and the present multi-center study aimed to establish and validate a standard protocol for ROS assay. In three participating laboratories, two standards and 42 coded chemicals, including 23 phototoxins and 19 nonphototoxic drugs/chemicals, were assessed by the ROS assay according to the standardized protocol. Most phototoxins tended to generate singlet oxygen and/or superoxide under UV-vis exposure, but nonphototoxic chemicals were less photoreactive. In the ROS assay on quinine (200 µM), a typical phototoxic drug, the intra- and inter-day precisions (coefficient of variation; CV) were found to be 1.5-7.4% and 1.7-9.3%, respectively. The inter-laboratory CV for quinine averaged 15.4% for singlet oxygen and 17.0% for superoxide. The ROS assay on 42 coded chemicals (200 µM) provided no false negative predictions upon previously defined criteria as compared with the in vitro/in vivo phototoxicity, although several false positives appeared. Outcomes from the validation study were indicative of satisfactory transferability, intra- and inter-laboratory variability, and predictive capacity of the ROS assay. Copyright © 2012 John Wiley & Sons, Ltd.

  9. Overview of procalcitonin assays and procalcitonin-guided protocols for the management of patients with infections and sepsis.

    Science.gov (United States)

    Schuetz, Philipp; Bretscher, Celine; Bernasconi, Luca; Mueller, Beat

    2017-06-01

    Procalcitonin is a surrogate infection blood marker whose levels help estimate the likelihood of bacterial infections and correlate with their resolution. Recent trials have revealed the benefits of inclusion of procalcitonin in antibiotic stewardship protocols for initiation and discontinuation of antimicrobial therapy. Areas covered: Procalcitonin-guided antibiotic stewardship protocols have shown appreciable reductions in antibiotic use and duration of therapy in respiratory infections, sepsis, and other infections, with positive effects on clinical outcomes. Multiple fully automated and sensitive procalcitonin assays are routinely used in clinical practice. Utilization of these assays requires consideration of the clinical setting and knowledge of assay characteristics, particularly assay sensitivities, reproducibility, and performance across routinely used cut-off ranges. The authors provide an overview of the strengths and limitations of currently available procalcitonin assays and antibiotic therapy algorithms incorporating procalcitonin currently used in different clinical settings and in patients with different underlying infections. Expert commentary: Use of sensitive procalcitonin measurements in clinical algorithms can reduce antimicrobial overuse, decreasing the risk of side effects and controlling emerging bacterial multi-resistance. Before use in clinical practice, it is important to carefully assess the quality of novel PCT assays and rigorously evaluate them in target patient populations across clinically relevant cut-off ranges.

  10. Dose-Response Assessment of Four Genotoxic Chemicals in a Combined Mouse and Rat Micronucleus and Comet Assay Protocol

    Science.gov (United States)

    Recio, Leslie; Hobbs, Cheryl; Caspary, William; Witt, Kristine L.

    2012-01-01

    The in vivo micronucleus (MN) assay has proven to be an effective measure of genotoxicity potential. However, sampling a single tissue (bone marrow) for a single indicator of genetic damage using the MN assay provides a limited genotoxicity profile. The in vivo alkaline (pH>13) Comet assay, which detects a broad spectrum of DNA damage, can be applied to a variety of rodent tissues following administration of test agents. To determine if the Comet assay is a useful supplement to the in vivo MN assay, a combined test protocol (MN/Comet assay) was conducted in male B6C3F1 mice and F344/N rats using four model genotoxicants: ethyl methanesulfonate (EMS), acrylamide (ACM), cyclophosphamide (CP), and vincristine sulfate (VS). Test compounds were administered on 4 consecutive days at 24-hour intervals (VS was administered to rats for 3 days); animals were euthanized 4 hours after the last administration. All compounds induced significant increases in micronucleated reticulocytes (MN-RET) in the peripheral blood of mice, and all but ACM induced MN-RET in rats. EMS and ACM induced significant increases in DNA damage, measured by the Comet assay, in multiple tissues of mice and rats. CP-induced DNA damage was detected in leukocytes and duodenum cells. VS, a spindle fiber disrupting agent, was negative in the Comet assay. Based on these results, the MN/Comet assay holds promise for providing more comprehensive assessments of potential genotoxicants, and the National Toxicology Program is presently using this combined protocol in its overall evaluation of the genotoxicity of substances of public health concern. PMID:20371966

  11. Modeling of coupled differential equations for cellular chemical signaling pathways: Implications for assay protocols utilized in cellular engineering.

    Science.gov (United States)

    O'Clock, George D

    2016-08-01

    Cellular engineering involves modification and control of cell properties, and requires an understanding of fundamentals and mechanisms of action for cellular derived product development. One of the keys to success in cellular engineering involves the quality and validity of results obtained from cell chemical signaling pathway assays. The accuracy of the assay data cannot be verified or assured if the effect of positive feedback, nonlinearities, and interrelationships between cell chemical signaling pathway elements are not understood, modeled, and simulated. Nonlinearities and positive feedback in the cell chemical signaling pathway can produce significant aberrations in assay data collection. Simulating the pathway can reveal potential instability problems that will affect assay results. A simulation, using an electrical analog for the coupled differential equations representing each segment of the pathway, provides an excellent tool for assay validation purposes. With this approach, voltages represent pathway enzyme concentrations and operational amplifier feedback resistance and input resistance values determine pathway gain and rate constants. The understanding provided by pathway modeling and simulation is strategically important in order to establish experimental controls for assay protocol structure, time frames specified between assays, and assay concentration variation limits; to ensure accuracy and reproducibility of results.
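
    As a purely illustrative sketch of the kind of coupled-differential-equation simulation the author advocates (the electrical-analog details are not reproduced here), the fragment below integrates a hypothetical two-step pathway in which the downstream species feeds back positively on the upstream reaction; all rate constants are made up:

    import numpy as np
    from scipy.integrate import solve_ivp

    def pathway(t, y, k1, k2, k3, kf):
        # y[0], y[1]: concentrations of two pathway intermediates (arbitrary units).
        # Positive feedback: y[1] enhances production of y[0] through the kf term.
        a, b = y
        da = k1 + kf * b**2 / (1.0 + b**2) - k2 * a
        db = k2 * a - k3 * b
        return [da, db]

    sol = solve_ivp(pathway, (0.0, 100.0), y0=[0.1, 0.1],
                    args=(0.05, 0.2, 0.1, 1.0), dense_output=True)
    print(sol.sol(np.linspace(0.0, 100.0, 11))[1])  # downstream species over time

    Varying the feedback strength kf in such a toy model shows how positive feedback can push the system toward instability or bistability, which is the kind of behaviour the author warns can distort assay readouts.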

  12. Can the comet assay be used reliably to detect nanoparticle-induced genotoxicity?

    Science.gov (United States)

    Karlsson, Hanna L; Di Bucchianico, Sebastiano; Collins, Andrew R; Dusinska, Maria

    2015-03-01

    The comet assay is a sensitive method to detect DNA strand breaks as well as oxidatively damaged DNA at the level of single cells. Today the assay is commonly used in nano-genotoxicology. In this review we critically discuss possible interactions between nanoparticles (NPs) and the comet assay. Concerns for such interactions have arisen from the occasional observation of NPs in the "comet head", which implies that NPs may be present while the assay is being performed. This could give rise to false positive or false negative results, depending on the type of comet assay endpoint and NP. For most NPs, an interaction that substantially impacts the comet assay results is unlikely. For photocatalytically active NPs such as TiO2 , on the other hand, exposure to light containing UV can lead to increased DNA damage. Samples should therefore not be exposed to such light. By comparing studies in which both the comet assay and the micronucleus assay have been used, a good consistency between the assays was found in general (69%); consistency was even higher when excluding studies on TiO2 NPs (81%). The strong consistency between the comet and micronucleus assays for a range of different NPs-even though the two tests measure different endpoints-implies that both can be trusted in assessing the genotoxicity of NPs, and that both could be useful in a standard battery of test methods. © 2014 Wiley Periodicals, Inc.

  13. Comparison of mRNA Splicing Assay Protocols across Multiple Laboratories

    DEFF Research Database (Denmark)

    Whiley, Phillip J; de la Hoya, Miguel; Thomassen, Mads

    2014-01-01

    Accurate evaluation of unclassified sequence variants in cancer predisposition genes is essential for clinical management and depends on a multifactorial analysis of clinical, genetic, pathologic, and bioinformatic variables and assays of transcript length and abundance. The integrity of assay data...

  14. Temperature Switch PCR (TSP): Robust assay design for reliable amplification and genotyping of SNPs

    Directory of Open Access Journals (Sweden)

    Mather Diane E

    2009-12-01

    Background Many research and diagnostic applications rely upon the assay of individual single nucleotide polymorphisms (SNPs). Thus, methods to improve the speed and efficiency for single-marker SNP genotyping are highly desirable. Here, we describe the method of temperature-switch PCR (TSP), a biphasic four-primer PCR system with a universal primer design that permits amplification of the target locus in the first phase of thermal cycling before switching to the detection of the alleles. TSP can simplify assay design for a range of commonly used single-marker SNP genotyping methods, and reduce the requirement for individual assay optimization and operator expertise in the deployment of SNP assays. Results We demonstrate the utility of TSP for the rapid construction of robust and convenient endpoint SNP genotyping assays based on allele-specific PCR and high resolution melt analysis by generating a total of 11,232 data points. The TSP assays were performed under standardised reaction conditions, requiring minimal optimization of individual assays. High genotyping accuracy was verified by 100% concordance of TSP genotypes in a blinded study with an independent genotyping method. Conclusion Theoretically, TSP can be directly incorporated into the design of assays for most current single-marker SNP genotyping methods. TSP provides several technological advances for single-marker SNP genotyping including simplified assay design and development, increased assay specificity and genotyping accuracy, and opportunities for assay automation. By reducing the requirement for operator expertise, TSP provides opportunities to deploy a wider range of single-marker SNP genotyping methods in the laboratory. TSP has broad applications and can be deployed in any animal and plant species.

  15. Intra-rater and inter-rater reliability of the standardized ultrasound protocol for assessing subacromial structures

    DEFF Research Database (Denmark)

    Hougs Kjær, Birgitte; Ellegaard, Karen; Wieland, Ina

    2017-01-01

    BACKGROUND: US-examinations related to shoulder impingement (SI) often vary due to methodological differences, examiner positions, transducers, and recording parameters. Reliable US protocols for examination of different structures related to shoulder impingement are therefore needed. OBJECTIVES...... of the supraspinatus tendon (SUPRA) and subacromial subdeltoid (SASD) bursa in two imaging positions, and the acromial humeral distance (AHD) in one position. Additionally, agreement on dynamic impingement (DI) examination was performed. The intra- and inter-rater reliability was carried out on the same day...

  16. Interobserver reliability of the 'Welfare Quality(®) Animal Welfare Assessment Protocol for Growing Pigs'.

    Science.gov (United States)

    Czycholl, I; Kniese, C; Büttner, K; Beilage, E Grosse; Schrader, L; Krieter, J

    2016-01-01

    The present paper focuses on evaluating the interobserver reliability of the 'Welfare Quality(®) Animal Welfare Assessment Protocol for Growing Pigs'. The protocol for growing pigs mainly consists of a Qualitative Behaviour Assessment (QBA), direct behaviour observations (BO) carried out by instantaneous scan sampling and checks for different individual parameters (IP), e.g. presence of tail biting, wounds and bursitis. Three trained observers collected the data by performing 29 combined assessments, which were done at the same time and on the same animals, but were carried out completely independently of each other. The findings were compared by the calculation of Spearman Rank Correlation Coefficients (RS), Intraclass Correlation Coefficients (ICC), Smallest Detectable Changes (SDC) and Limits of Agreements (LoA). There was no agreement found concerning the adjectives belonging to the QBA (e.g. active: RS: 0.50, ICC: 0.30, SDC: 0.38, LoA: -0.05 to 0.45; fearful: RS: 0.06, ICC: 0.0, SDC: 0.26, LoA: -0.20 to 0.30). In contrast, the BO showed good agreement (e.g. social behaviour: RS: 0.45, ICC: 0.50, SDC: 0.09, LoA: -0.09 to 0.03; use of enrichment material: RS: 0.75, ICC: 0.68, SDC: 0.06, LoA: -0.03 to 0.03). Overall, observers agreed well in the IP, e.g. tail biting (RS: 0.52, ICC: 0.88; SDC: 0.05, LoA: -0.01 to 0.02) and wounds (RS: 0.43, ICC: 0.59, SDC: 0.10, LoA: -0.09 to 0.10). The parameter bursitis showed great differences (RS: 0.10, ICC: 0.0, SDC: 0.35, LoA: -0.37 to 0.40), which can be explained by difficulties in the assessment when the animals moved around quickly or their legs were soiled. In conclusion, the interobserver reliability was good in the BO and most IP, but not for the parameter bursitis and the QBA.
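
    The agreement statistics used throughout this record (Spearman rank correlation, RS, and Bland-Altman limits of agreement, LoA) are straightforward to reproduce. The sketch below uses made-up observer scores and is not the study's analysis script:

    import numpy as np
    from scipy.stats import spearmanr

    def limits_of_agreement(x: np.ndarray, y: np.ndarray):
        # Bland-Altman: mean difference (bias) +/- 1.96 * SD of the differences.
        diff = x - y
        bias = diff.mean()
        spread = 1.96 * diff.std(ddof=1)
        return bias - spread, bias + spread

    # Hypothetical paired scores from two observers over 29 combined assessments
    rng = np.random.default_rng(1)
    obs_a = rng.uniform(0.0, 1.0, size=29)
    obs_b = obs_a + rng.normal(0.0, 0.05, size=29)
    rs, _ = spearmanr(obs_a, obs_b)
    print(round(rs, 2), limits_of_agreement(obs_a, obs_b))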

  17. The Vitotox and ToxTracker assays: A two-test combination for quick and reliable assessment of genotoxic hazards.

    Science.gov (United States)

    Ates, Gamze; Favyts, Dorien; Hendriks, Giel; Derr, Remco; Mertens, Birgit; Verschaeve, Luc; Rogiers, Vera; Y Doktorova, Tatyana

    2016-11-01

    To ensure safety for humans, it is essential to characterize the genotoxic potential of new chemical entities, such as pharmaceutical and cosmetic substances. In a first tier, a battery of in vitro tests is recommended by international regulatory agencies. However, these tests suffer from inadequate specificity: compounds may be wrongly categorized as genotoxic, resulting in unnecessary, time-consuming, and expensive in vivo follow-up testing. In the last decade, novel assays (notably, reporter-based assays) have been developed in an attempt to overcome these drawbacks. Here, we have investigated the performance of two in vitro reporter-based assays, Vitotox and ToxTracker. A set of reference compounds was selected to span a variety of mechanisms of genotoxic action and applicability domains (e.g., pharmaceutical and cosmetic ingredients). Combining the performance of the two assays, we achieved 93% sensitivity and 79% specificity for prediction of genotoxicity for this set of compounds. Both assays permit quick high-throughput analysis of drug candidates, while requiring only small quantities of the test substances. Our study shows that these two assays, when combined, can be a reliable method for assessment of genotoxicity hazard. Copyright © 2016 Elsevier B.V. All rights reserved.
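
    Sensitivity and specificity figures such as the 93% and 79% quoted above follow directly from the two-by-two classification table of assay calls against known genotoxicity. A minimal sketch with hypothetical counts (not the study's raw data):

    def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
        # Sensitivity: fraction of genotoxic compounds correctly flagged.
        # Specificity: fraction of non-genotoxic compounds correctly cleared.
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical counts for a reference set of genotoxins and non-genotoxins
    sens, spec = sensitivity_specificity(tp=26, fn=2, tn=22, fp=6)
    print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")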

  18. Complete validation of a unique digestion assay to detect Trichinella larvae in horse meat demonstrates the reliability of this assay for meeting food safety and trade requirements.

    Science.gov (United States)

    Forbes, L B; Hill, D E; Parker, S; Tessaro, S V; Gamble, H R; Gajadhar, A A

    2008-03-01

    A tissue digestion assay using a double separatory funnel procedure for the detection of Trichinella larvae in horse meat was validated for application in food safety programs and trade. The assay consisted of a pepsin-HCl digestion step to release larvae from muscle tissue and two sequential sedimentation steps in separatory funnels to recover and concentrate larvae for detection with a stereomicroscope. With defined critical control points, the assay was conducted within a quality assurance system compliant with International Organization for Standardization-International Electrotechnical Commission (ISO/IEC) 17025 guidelines. Samples used in the validation were obtained from horses experimentally infected with Trichinella spiralis to obtain a range of muscle larvae densities. One-, 5-, and 10-g samples of infected tissue were combined with 99, 95, and 90 g, respectively, of known negative horse tissue to create a 100-g sample for testing. Samples of 5 and 10 g were more likely to be positive than were 1-g samples when larval densities were less than three larvae per gram (lpg). This difference is important because ingested meat with 1 lpg is considered the threshold for clinical disease in humans. Using a 5-g sample size, all samples containing 1.3 to 2 lpg were detected, and 60 to 100% of samples with infected horse meat containing 0.1 to 0.7 lpg were detected. In this study, the double separatory funnel digestion assay was efficient and reliable for its intended use in food safety and trade. This procedure is the only digestion assay for Trichinella in horse meat that has been validated as consistent and effective at critical levels of sensitivity.

  19. Development and utilization of an ex vivo bromodeoxyuridine local lymph node assay protocol for assessing potential chemical sensitizers.

    Science.gov (United States)

    Williams, W C; Copeland, C; Boykin, E; Quell, S J; Lehmann, D M

    2015-01-01

    The murine local lymph node assay (LLNA) is widely used to identify chemicals that may cause allergic contact dermatitis. Exposure to a dermal sensitizer results in proliferation of local lymph node T cells, which has traditionally been measured by in vivo incorporation of [3H]methyl thymidine. A more recent non-isotopic variation of the assay utilizes bromodeoxyuridine (BrdU) incorporation in vivo. To further improve the utility of this assay, we developed an ex vivo BrdU labeling procedure eliminating the need for in vivo injections. The results of this assay correctly identified a strong sensitizer (i.e., trimellitic anhydride) as well as weak/moderate sensitizers (i.e., eugenol, cinnamaldehyde and hexylcinnaminic aldehyde). As anticipated, neither non-sensitizers isopropanol and lactic acid nor the false negative chemical nickel II sulfate hexahydrate induced a positive threshold response in the assay. The results of this assay are in close agreement with those of the in vivo LLNA:BrdU-enzyme-linked immunosorbent assay labeling procedure. We also used the ex vivo BrdU LLNA procedure to evaluate ammonium hexachloroplatinate, ammonium tetrachloroplatinate and cis-diamminedichloroplatinum(II) and the assay correctly identified them as sensitizers based on the calculation of EC2 values. We conclude that this ex vivo BrdU labeling method offers predictive capacity comparable to previously established LLNA protocols while eliminating animal injections and the use of radioisotope. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  20. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.
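
    The record points readers to open-source software for reliability estimation. As one hedged illustration, and not the authors' recommended toolchain, a common internal-consistency estimate (Cronbach's alpha) can be computed in a few lines:

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        # items: 2-D array with rows = respondents and columns = test items.
        # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Hypothetical responses: 100 respondents answering 5 related items
    rng = np.random.default_rng(2)
    trait = rng.normal(size=(100, 1))
    responses = trait + rng.normal(scale=0.8, size=(100, 5))
    print(round(cronbach_alpha(responses), 2))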

  1. Reliability of single aliquot regenerative protocol (SAR) for dose estimation in quartz at different burial temperatures: A simulation study

    International Nuclear Information System (INIS)

    Koul, D.K.; Pagonis, V.; Patil, P.

    2016-01-01

    The single aliquot regenerative protocol (SAR) is a well-established technique for estimating naturally acquired radiation doses in quartz. This simulation work examines the reliability of SAR protocol for samples which experienced different ambient temperatures in nature in the range of −10 to 40 °C. The contribution of various experimental variables used in SAR protocols to the accuracy and precision of the method is simulated for different ambient temperatures. Specifically the effects of paleo-dose, test dose, pre-heating temperature and cut-heat temperature on the accuracy of equivalent dose (ED) estimation are simulated by using random combinations of the concentrations of traps and centers using a previously published comprehensive quartz model. The findings suggest that the ambient temperature has a significant bearing on the reliability of natural dose estimation using SAR protocol, especially for ambient temperatures above 0 °C. The main source of these inaccuracies seems to be thermal sensitization of the quartz samples caused by the well-known thermal transfer of holes between luminescence centers in quartz. The simulations suggest that most of this inaccuracy in the dose estimation can be removed by delivering the laboratory doses in pulses (pulsed irradiation procedures). - Highlights: • Ambient temperatures affect the reliability of SAR. • It overestimates the dose with increase in burial temperature and burial time periods. • Elevated temperature irradiation does not correct for these overestimations. • Inaccuracies in dose estimation can be removed by incorporating pulsed irradiation procedures.
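
    For readers unfamiliar with the SAR calculation itself: the equivalent dose is obtained by fitting the sensitivity-corrected regenerative-dose signals to a growth curve and interpolating the natural signal onto it. The sketch below is a generic illustration under a saturating-exponential assumption with invented data points; it is not the simulation code used in this study:

    import numpy as np
    from scipy.optimize import curve_fit

    def growth(dose, i_max, d0):
        # Saturating-exponential dose-response commonly assumed for quartz OSL.
        return i_max * (1.0 - np.exp(-dose / d0))

    # Hypothetical sensitivity-corrected (Lx/Tx) regenerative-dose points
    regen_doses = np.array([0.0, 5.0, 10.0, 20.0, 40.0])   # Gy
    regen_signal = np.array([0.0, 0.9, 1.6, 2.6, 3.6])
    natural_signal = 2.1

    (i_max, d0), _ = curve_fit(growth, regen_doses, regen_signal, p0=(4.0, 20.0))
    # Invert the fitted curve at the natural signal to obtain the equivalent dose.
    equivalent_dose = -d0 * np.log(1.0 - natural_signal / i_max)
    print(round(float(equivalent_dose), 1), "Gy")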

  2. Reliability of diagnostic imaging techniques in suspected acute appendicitis: proposed diagnostic protocol

    International Nuclear Information System (INIS)

    Cura del, J. L.; Oleaga, L.; Grande, D.; Vela, A. C.; Ibanez, A. M.

    2001-01-01

    To study the utility of ultrasound and computed tomography (CT) in cases of suspected appendicitis; to determine the diagnostic yield in different clinical contexts and patient characteristics; and to assess the costs and benefits of introducing these techniques and propose a protocol for their use. Negative appendectomies, complications and length of hospital stay in a group of 152 patients with suspected appendicitis who underwent ultrasound and CT were compared with those of 180 patients who underwent appendectomy during the same time period but had not been selected for the first group; the costs for each group were calculated. In the first group, the diagnostic value of the clinical signs was also evaluated. The reliability of the clinical signs was limited, while the results with ultrasound and CT were excellent. The incidence of negative appendectomy was 9.6% in the study group and 12.2% in the control group. Moreover, there were fewer complications and a shorter hospital stay in the first group. Among men, however, the rate of negative appendectomy was lower in the control group. The cost of using ultrasound and CT in the management of appendicitis was only slightly higher than that of the control group. Although ultrasound and CT are not necessary in cases in which the probability of appendicitis is low or in men presenting clear clinical evidence, the use of these techniques is indicated in the remaining cases in which appendicitis is suspected. In children, ultrasound is the technique of choice. In all other patients, if negative results are obtained with one of the two techniques, the other should be performed. (Author) 49 refs

  3. Inter-laboratory variation in DNA damage using a standard comet assay protocol

    DEFF Research Database (Denmark)

    Forchhammer, Lykke; Ersson, Clara; Loft, Steffen

    2012-01-01

    determined the baseline level of DNA strand breaks (SBs)/alkaline labile sites and formamidopyrimidine DNA glycosylase (FPG)-sensitive sites in coded samples of mononuclear blood cells (MNBCs) from healthy volunteers. There were technical problems in seven laboratories in adopting the standard protocol...... analysed by the standard protocol. The SBs and FPG-sensitive sites were measured in the same experiment, indicating that the large spread in the latter lesions was the main reason for the reduced inter-laboratory variation. However, it remains worrying that half of the participating laboratories obtained...

  4. Magnetic particle separation technique: a reliable and simple tool for RIA/IRMA and quantitative PCR assay

    International Nuclear Information System (INIS)

    Shen Rongsen; Shen Decun

    1998-01-01

    Five types of magnetic particles, without or with aldehyde, amino and carboxyl functional groups, respectively, were used to immobilize first or second antibody in three ways, i.e. physical adsorption, chemical coupling and immuno-affinity, forming four types of magnetic particle antibodies. The second antibody immobilized on polyacrolein magnetic particles through aldehyde functional groups and the first antibodies immobilized on carboxylic polystyrene magnetic particles through carboxyl functional groups were recommended for application in RIAs and/or IRMAs. Streptavidin immobilized on commercial magnetic particles through amino functional groups was successfully applied to separating the specific PCR product for quantification of human cytomegalovirus. In the paper, typical data on the reliability of these magnetic particle ligands were reported and the simplicity of the magnetic particle separation technique was discussed. The results showed that the technique was a reliable and simple tool for RIA/IRMA and quantitative PCR assay. (author)

  5. The reliability of a maximal isometric hip strength and simultaneous surface EMG screening protocol in elite, junior rugby league athletes.

    Science.gov (United States)

    Charlton, Paula C; Mentiplay, Benjamin F; Grimaldi, Alison; Pua, Yong-Hao; Clark, Ross A

    2017-02-01

    Firstly to describe the reliability of assessing maximal isometric strength of the hip abductor and adductor musculature using a hand held dynamometry (HHD) protocol with simultaneous wireless surface electromyographic (sEMG) evaluation of the gluteus medius (GM) and adductor longus (AL). Secondly, to describe the correlation between isometric strength recorded with the HHD protocol and a laboratory standard isokinetic device. Reliability and correlational study. A sample of 24 elite, male, junior, rugby league athletes, age 16-20 years participated in repeated HHD and isometric Kin-Com (KC) strength testing with simultaneous sEMG assessment, on average (range) 6 (5-7) days apart by a single assessor. Strength tests included; unilateral hip abduction (ABD) and adduction (ADD) and bilateral ADD assessed with squeeze (SQ) tests in 0 and 45° of hip flexion. HHD demonstrated good to excellent inter-session reliability for all outcome measures (ICC (2,1) =0.76-0.91) and good to excellent association with the laboratory reference KC (ICC (2,1) =0.80-0.88). Whilst intra-session, inter-trial reliability of EMG activation and co-activation outcome measures ranged from moderate to excellent (ICC (2,1) =0.70-0.94), inter-session reliability was poor (all ICC (2,1) Isometric strength testing of the hip ABD and ADD musculature using HHD may be measured reliably in elite, junior rugby league athletes. Due to the poor inter-session reliability of sEMG measures, it is not recommended for athlete screening purposes if using the techniques implemented in this study. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
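
    The ICC(2,1) values reported here correspond to a two-way random-effects, absolute-agreement, single-measure model. A generic computation under the Shrout and Fleiss formulation (a sketch with hypothetical test-retest data, not the authors' statistics code) looks like this:

    import numpy as np

    def icc_2_1(scores: np.ndarray) -> float:
        # scores: 2-D array with rows = subjects and columns = sessions (or raters).
        n, k = scores.shape
        grand = scores.mean()
        ms_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        ms_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum() / (k - 1)
        resid = scores - scores.mean(axis=1, keepdims=True) \
                       - scores.mean(axis=0, keepdims=True) + grand
        ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (
            ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    # Hypothetical test-retest isometric strength scores (N) for 24 athletes
    rng = np.random.default_rng(3)
    session1 = rng.normal(150.0, 25.0, size=24)
    session2 = session1 + rng.normal(0.0, 8.0, size=24)
    print(round(icc_2_1(np.column_stack([session1, session2])), 2))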

  6. Chromogenic in situ hybridization is a reliable assay for detection of ALK rearrangements in adenocarcinomas of the lung.

    Science.gov (United States)

    Schildhaus, Hans-Ulrich; Deml, Karl-Friedrich; Schmitz, Katja; Meiboom, Maren; Binot, Elke; Hauke, Sven; Merkelbach-Bruse, Sabine; Büttner, Reinhard

    2013-11-01

    Reliable detection of anaplastic lymphoma kinase (ALK) rearrangements is a prerequisite for personalized treatment of lung cancer patients, as ALK rearrangements represent a predictive biomarker for the therapy with specific tyrosine kinase inhibitors. Currently, fluorescent in situ hybridization (FISH) is considered to be the standard method for assessing formalin-fixed and paraffin-embedded tissue for ALK inversions and translocations. However, FISH requires a specialized equipment, the signals fade rapidly and it is difficult to detect overall morphology and tumor heterogeneity. Chromogenic in situ hybridization (CISH) has been successfully introduced as an alternative test for the detection of several genetic aberrations. This study validates a newly developed ALK CISH assay by comparing FISH and CISH signal patterns in lung cancer samples with and without ALK rearrangements. One hundred adenocarcinomas of the lung were included in this study, among them 17 with known ALK rearrangement. FISH and CISH were carried out and evaluated according to the manufacturers' recommendations. For both assays, tumors were considered positive if ≥15% of tumor cells showed either isolated 3' signals or break-apart patterns or a combination of both. A subset of tumors was exemplarily examined by using a novel EML4 (echinoderm microtubule-associated protein-like 4) CISH probe. Red, green and fusion CISH signals were clearcut and different signal patterns were easily recognized. The percentage of aberrant tumor cells was statistically highly correlated (PCISH. On the basis of 86 samples that were evaluable by ALK CISH, we found a 100% sensitivity and 100% specificity of this assay. Furthermore, EML4 rearrangements could be recognized by CISH. CISH is a highly reliable, sensitive and specific method for the detection of ALK gene rearrangements in pulmonary adenocarcinomas. Our results suggest that CISH might serve as a suitable alternative to FISH, which is the current gold

  7. Inter-rater and intra-rater reliability of a clinical protocol for measuring turnout in collegiate dancers.

    Science.gov (United States)

    Greene, Amanda; Lasner, Andrea; Deu, Rajwinder; Oliphant, Seth; Johnson, Kenneth

    2018-02-02

    Reliable methods of measuring turnout in dancers and comparing active turnout (used in class) with functional (uncompensated) turnout are needed. Authors have suggested measurement techniques but there is no clinically useful, easily reproducible technique with established inter-rater and intra-rater reliability. We adapted a technique based on previous research, which is easily reproducible. We hypothesized excellent inter-rater and intra-rater reliability between experienced physical therapists (PTs) and a briefly trained faculty member from a university's department of dance. Thirty-two participants were recruited from the same dance department. Dancers' active and functional turnout was measured by each rater. We found that our technique for measuring active and functional turnout has excellent inter-rater and intra-rater reliability when performed by two experienced PTs and by one briefly trained university-level dance faculty member. For active turnout, inter-rater reliability was 0.78 among all raters and 0.82 among only the PT raters; intra-rater reliability was 0.82 among all raters and 0.85 among only the PT raters. For functional turnout, inter-rater reliability was 0.86 among all raters and 0.88 among only the PT raters; intra-rater reliability was 0.87 among all raters and 0.88 among only the PT raters. The measurement technique described provides a standardized protocol with excellent inter-rater and intra-rater reliability when performed by experienced PTs or by a briefly trained university-level dance faculty member.

  8. An efficient and reliable multi-hop geographical broadcast protocol in vehicular ad-hoc networks

    NARCIS (Netherlands)

    Rajendran, R.; Jongh, J. de

    2013-01-01

    In Intelligent Transportation Systems (ITS), disseminating warning messages in a timely and efficient way through wireless short-range communications can save many lives and reduce traffic congestion. A geographical broadcast protocol provides data delivery to specified geographical areas, using

  9. A reliable protocol for the isolation of viable, chondrogenically differentiated human mesenchymal stem cells from high-density pellet cultures.

    Science.gov (United States)

    Ullah, Mujib; Hamouda, Houda; Stich, Stefan; Sittinger, Michael; Ringe, Jochen

    2012-12-01

    Administration of chondrogenically differentiated mesenchymal stem cells (MSC) is discussed as a promising approach for the regenerative treatment of injured or diseased cartilage. The high-density pellet culture is the standard culture for chondrogenic differentiation, but cells in pellets secrete extracellular matrix (ECM) that they become entrapped in. Protocols for cell isolation from pellets often result in cell damage and dedifferentiation towards less differentiated MSC. Therefore, our aim was to develop a reliable protocol for the isolation of viable, chondrogenically differentiated MSC from high-density pellet cultures. Human bone marrow MSC were chondrogenically stimulated with transforming growth factor-β3, and the cartilaginous structure of the pellets was verified by alcian blue staining of cartilage proteoglycans, antibody staining of cartilage collagen type II, and quantitative real-time reverse-transcription polymerase chain reaction of the marker genes COL2A1 and SOX9. Trypsin and collagenases II and P were tested alone or in combination, and for different concentrations and times, to find a protocol for optimized pellet digestion. Whereas trypsin was not able to release viable cells, 90-min digestion with 300 U of collagenase II, 20 U of collagenase P, and 2 mM CaCl2 worked quite well and resulted in about 2.5×10⁵ cells/pellet. The protocol was further optimized for the separation of released cells and ECM from each other. Cells were alcian blue and collagen type II positive and expressed COL2A1 and SOX9, verifying a chondrogenic character. However, they had different morphological shapes. The ECM was also uniformly alcian blue and collagen type II positive but showed different organizational and structural forms. To conclude, our protocol allows the reliable isolation of a defined number of viable, chondrogenically differentiated MSC from high-density pellet cultures. Such cells, as well as the ECM components, are of interest as

  10. Arabidopsis seedling flood-inoculation technique: a rapid and reliable assay for studying plant-bacterial interactions

    Directory of Open Access Journals (Sweden)

    Uppalapati Srinivasa R

    2011-10-01

    Background The Arabidopsis thaliana-Pseudomonas syringae model pathosystem is one of the most widely used systems to understand the mechanisms of microbial pathogenesis and plant innate immunity. Several inoculation methods have been used to study plant-pathogen interactions in this model system. However, none of the methods reported to date are similar to those occurring in nature and amenable to large-scale mutant screens. Results In this study, we developed a rapid and reliable seedling flood-inoculation method based on young Arabidopsis seedlings grown on MS medium. This method has several advantages over conventional soil-grown plant inoculation assays, including a shorter growth and incubation period, ease of inoculation and handling, uniform infection and disease development, less growth chamber space and suitability for high-throughput screens. In this study we demonstrated the efficacy of the Arabidopsis seedling assay to study (1) the virulence factors of P. syringae pv. tomato DC3000, including the type III protein secretion system (TTSS) and the phytotoxin coronatine (COR); (2) effector-triggered immunity; and (3) Arabidopsis mutants affected in salicylic acid (SA)- and pathogen-associated molecular pattern (PAMP)-mediated pathways. Furthermore, we applied this technique to study nonhost resistance (NHR) responses in Arabidopsis using nonhost pathogens, such as P. syringae pv. tabaci, pv. glycinea and pv. tomato T1, and confirmed the functional role of FLAGELLIN-SENSING 2 (FLS2) in NHR. Conclusions The Arabidopsis seedling flood-inoculation assay provides a rapid, efficient and economical method for studying Arabidopsis-Pseudomonas interactions with minimal growth chamber space and time. This assay could also provide an excellent system for investigating the virulence mechanisms of P. syringae. Using this method, we demonstrated that FLS2 plays a critical role in conferring NHR against nonhost pathovars of P. syringae, but not to

  11. Loop-mediated isothermal amplification as a reliable assay for Toxocara canis infection in pet dogs.

    Science.gov (United States)

    Khoshakhlagh, Paria; Spotin, Adel; Mahami-Oskouei, Mahmoud; Shahbazi, Abbas; Ozlati, Maryam

    2017-09-01

    Keeping infected dogs as pets creates potential transmission risk factors for shedding helminthic infections such as toxocariasis. The lack of accurate identification of Toxocara canis eggs in non-dewormed infected pet dogs remains a diagnostic concern among researchers. In this study, dog owners were asked to fill in a questionnaire regarding their pets and their attitude towards the deworming regimen. One hundred faecal samples were collected from pet dogs (Northwest Iran) and were subsequently examined by the ZnSO4 flotation technique, PCR and loop-mediated isothermal amplification (LAMP) assays. The DNA of the recovered T. canis eggs was then extracted and amplified by LAMP and PCR. Furthermore, ITS2 amplicons were sequenced for phylogenetic analysis. Nine, 5 and 11% of T. canis infections were identified by microscopy, PCR and LAMP, respectively. LAMP was 10 times (10⁻¹⁰ to 10⁻¹³ g/μl) more sensitive than PCR (10⁻¹⁰ to 10⁻¹² g/μl). The kappa value between LAMP and PCR indicated faint concurrence (0.463), whereas the kappa coefficient between LAMP and the flotation technique indicated strong agreement (0.667). The highest infection rate (n = 11) was detected in non-dewormed pet dogs, particularly those less than 3 months old (P canis eggs in infected pet dogs. It was proposed that dog owners' awareness is insufficient to implement regular deworming schedules. Additionally, regional policymakers should broadly revise anthelmintic treatment guidelines.
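
    The kappa values quoted above are Cohen's kappa for agreement between two detection methods. As a hedged illustration with hypothetical positive/negative calls (not the study's data), the coefficient can be computed as follows:

    from collections import Counter

    def cohens_kappa(calls_a, calls_b) -> float:
        # Observed agreement versus agreement expected by chance between two methods.
        n = len(calls_a)
        observed = sum(a == b for a, b in zip(calls_a, calls_b)) / n
        count_a, count_b = Counter(calls_a), Counter(calls_b)
        labels = set(calls_a) | set(calls_b)
        expected = sum(count_a[c] * count_b[c] for c in labels) / n**2
        return (observed - expected) / (1.0 - expected)

    # Hypothetical LAMP vs. PCR calls on 100 faecal samples (1 = positive)
    lamp = [1] * 11 + [0] * 89
    pcr = [1] * 5 + [0] * 95
    print(round(cohens_kappa(lamp, pcr), 3))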

  12. Interface Assignment-Based AODV Routing Protocol to Improve Reliability in Multi-Interface Multichannel Wireless Mesh Networks

    Directory of Open Access Journals (Sweden)

    Won-Suk Kim

    2015-01-01

    The utilization of wireless mesh networks (WMNs) has greatly increased, and the multi-interface multichannel (MIMC) technique has been widely used for the backbone network. Unfortunately, the ad hoc on-demand distance vector (AODV) routing protocol defined in the IEEE 802.11s standard was designed for WMNs using the single-interface single-channel technique. So, we define a problem that happens when the legacy AODV is used in MIMC WMNs and propose an interface assignment-based AODV (IA-AODV) in order to resolve that problem. IA-AODV, which is based on multitarget path request, consists of the PREQ prediction scheme, the PREQ loss recovery scheme, and the PREQ sender assignment scheme. A detailed operation according to various network conditions and services is introduced, and the routing efficiency and network reliability of a network using IA-AODV are analyzed over the presented system model. Finally, after a real-world test-bed for MIMC WMNs using the IA-AODV routing protocol is implemented, the various indicators of the network are evaluated through experiments. When the proposed routing protocol is compared with the existing AODV routing protocol, it performs the path update using only 14.33% of the management frames, completely removes the routing malfunction, and reduces the UDP packet loss ratio by 0.0012%.

  13. An easy and efficient permeabilization protocol for in vivo enzyme activity assays in cyanobacteria

    DEFF Research Database (Denmark)

    Rasmussen, Randi Engelberth; Erstad, Simon Matthé; Ramos Martinez, Erick Miguel

    2016-01-01

    microbial cell factories. A better understanding of the activities of enzymes involved in the central carbon metabolism would help increase product yields. Currently, cell-free lysates are the most widely used method for determination of intracellular enzyme activities. However, due to thick cell walls...... used directly in the assays, the permeabilized cells exhibited enzyme activities comparable to or even higher than those detected in cell-free lysates. Moreover, the permeabilized cells could be stored at -20 °C without losing the enzyme activities. The permeabilization process...... for permeabilization of the cyanobacteria Synechococcus sp. PCC 7002 and Synechocystis sp. PCC 6803, and determination of two intracellular enzymes, ribulose-1,5-bisphosphate carboxylase/oxygenase (Rubisco) and glucose-6-phosphate dehydrogenase (G6PDH), that play pivotal roles in the central carbon metabolism...

  14. A reliable, delay bounded and less complex communication protocol for multicluster FANETs

    Directory of Open Access Journals (Sweden)

    Wajiya Zafar

    2017-02-01

    Recently, Flying Ad-hoc Networks (FANETs), enabling ad-hoc networking between Unmanned Aerial Vehicles (UAVs), have been gaining importance in several military and civilian applications. The sensitivity of the applications requires an adaptive, efficient, delay-bounded and scalable communication network among UAVs for data transmission. Due to communication protocol complexity, rigidity, the cost of commercial-off-the-shelf (COT) components, limited radio bandwidth, high mobility and limited computational resources, maintaining the desired level of Quality of Service (QoS) becomes a daunting task. For the first time in this research we propose multicluster FANETs for efficient network management; the proposed scheme considerably reduces communication cost and optimizes network performance, and it exploits the low-power, less complex and low-cost IEEE 802.15.4 (MAC) protocol for intercluster and intracluster communication. In this research both beacon-enabled and beaconless modes have been investigated, with Guaranteed Time Slots (GTS) and virtual Time Division Multiple Access (TDMA), respectively. The methodology plays a key role in reserving bandwidth for latency-critical applications and eliminating collisions and medium access delays. Moreover, an analysis of ad-hoc routing protocols, including two proactive (OLSR, DSDV) and one reactive (AODV), is also presented. The results show that the proposed scheme guarantees high packet delivery ratios while maintaining acceptable levels of latency, comparable with more complex and dedicatedly designed protocols in the literature.

  15. Reproducibility of microbial mutagenicity assays. I. Tests with Salmonella typhimurium and Escherichia coli using a standardized protocol

    International Nuclear Information System (INIS)

    Dunkel, V.C.; Zeiger, E.; Brusick, D.; McCoy, E.; McGregor, D.; Mortelmans, K.; Rosenkranz, H.S.; Simmon, V.F.

    1984-01-01

    The Salmonella/microsome test developed by Ames and his coworkers has been widely used in the evaluation of chemicals for genotoxic potential. Although the value of this assay is well recognized, there have been no comprehensive studies on the interlaboratory reproducibility of the method using a standardized protocol. A program was therefore initiated to compare the results obtained in four laboratories from testing a series of coded mutagens and nonmutagens using a standardized protocol. Additional objectives of this study were to compare male Fischer 344 rat, B6C3F1 mouse, and Syrian hamster liver S-9 preparations for the activation of chemicals; to compare Aroclor 1254-induced liver S-9 from all three species with the corresponding non-induced liver S-9s; and to compare the response of Escherichia coli WP-2 uvrA with the Salmonella typhimurium tester strains recommended by Ames. Since a primary use of in vitro microbial mutagenesis tests is the identification of potential carcinogens by their mutagenicity, the authors decided to compare the animal species and strains used by the National Cancer Institute/National Toxicology Program (NCI/NTP) for animal carcinogenicity studies

  16. Coordination Protocols for a Reliable Sensor, Actuator, and Device Network (SADN

    Directory of Open Access Journals (Sweden)

    Keiji Ozaki

    2008-01-01

    Full Text Available A sensor, actuator, and device network (SADN) is composed of three types of nodes: sensor, actuator, and actuation device nodes. Sensor nodes and actuator nodes are interconnected in wireless networks, as discussed in wireless sensor and actuator networks (WSANs). Actuator nodes and device nodes are interconnected in two types of networks, i.e. wireless and wired networks. Sensor nodes sense a physical event and send sensed values of the event to actuator nodes. An actuator node makes a decision on proper actions on receipt of sensed values and then issues the action requests to the device nodes. A device node actually acts on the physical world, for example by moving a robot arm on receipt of the action request. Messages may be lost and nodes may be faulty. In particular, messages are lost due to noise and collision in a wireless network. We propose a fully redundant model for an SADN where each of the sensor, actuator, and device functions is replicated in multiple nodes and each of the sensor-actuator and actuator-device communications is realized in many-to-many types of communication protocols. Even if some number of nodes are faulty, the other nodes can perform the requested tasks. Here, each sensor node sends sensed values to multiple actuator nodes and each actuator node receives sensed values from multiple sensor nodes, while multiple actuator nodes communicate with multiple replica nodes of a device. Even if messages are lost and some number of nodes are faulty, device nodes can still receive the action requests required for the sensed values and the actions are performed. In this paper, we discuss a type of semi-passive coordination (SPC) protocol of multiple actuator nodes for multiple sensor nodes. We also discuss a type of active coordination protocol for multiple actuator nodes and multiple actuation device nodes. We evaluate the SPC protocol for the sensor-actuator coordination in terms of the number of messages exchanged among

  17. Reliable Multihop Broadcast Protocol with a Low-Overhead Link Quality Assessment for ITS Based on VANETs in Highway Scenarios

    Directory of Open Access Journals (Sweden)

    Alejandro Galaviz-Mosqueda

    2014-01-01

    Full Text Available Vehicular ad hoc networks (VANETs) have been identified as a key technology to enable intelligent transport systems (ITS), which are aimed at radically improving the safety, comfort, and greenness of vehicles on the road. However, in order to fully exploit the potential of VANETs, several issues must be addressed. Because of the high dynamics of VANETs and the impairments in the wireless channel, one key issue arising when working with VANETs is the multihop dissemination of broadcast packets for safety and infotainment applications. In this paper a reliable low-overhead multihop broadcast (RLMB) protocol is proposed to address the well-known broadcast storm problem. The proposed RLMB takes advantage of the hello messages exchanged between the vehicles and processes this information to intelligently select a relay set and reduce redundant broadcasts. Additionally, to reduce the dependency on the hello message rate, RLMB uses a point-to-zone link evaluation approach. RLMB performance is compared with one of the leading multihop broadcast protocols existing to date. Performance metrics show that our RLMB solution outperforms the leading protocol in terms of important metrics such as packet dissemination ratio, overhead, and delay.
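
    The abstract does not give the relay-selection rule itself, so the following is only a minimal Python sketch of the general idea of building a relay set from hello-message neighbour tables: a greedy cover of the 2-hop neighbourhood. Function and variable names are illustrative and are not taken from the RLMB specification.

        def select_relays(my_neighbors, neighbor_tables):
            """my_neighbors: set of 1-hop neighbour ids; neighbor_tables: dict mapping
            each 1-hop neighbour to its own 1-hop neighbour set (learned from hellos)."""
            two_hop = set().union(*neighbor_tables.values()) - my_neighbors
            relays, uncovered = set(), set(two_hop)
            candidates = set(my_neighbors)
            while uncovered and candidates:
                # pick the neighbour covering the most still-uncovered 2-hop nodes
                best = max(candidates, key=lambda n: len(neighbor_tables[n] & uncovered))
                gained = neighbor_tables[best] & uncovered
                if not gained:
                    break  # remaining 2-hop nodes are unreachable via current neighbours
                relays.add(best)
                candidates.discard(best)
                uncovered -= gained
            return relays

        tables = {"A": {"D", "E"}, "B": {"E", "F"}, "C": {"F"}}
        print(select_relays({"A", "B", "C"}, tables))   # e.g. {'A', 'B'}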

  18. A Framework for Reliable Reception of Wireless Metering Data using Protocol Side Information

    DEFF Research Database (Denmark)

    Melchior Jacobsen, Rasmus; Popovski, Petar

    2013-01-01

    the deterministic protocol structure to obtain side information and group the packets from the same meter. We derive the probability of falsely pairing packets from different senders in the simple case of no channel errors, and show through simulation and data from an experimental deployment the probability...... of false pairing with channel errors. The pairing is an essential step towards recovery of metering data from as many as possible meters under harsh channel conditions. From the experiment we find that more than 15% of all conducted pairings are between two erroneous packets, which sets an upper bound...

  19. Impact of Transport Layer Protocols on Reliable Information Access in Smart Grids

    DEFF Research Database (Denmark)

    Shahid, Kamal; Saeed, Aamir; Kristensen, Thomas le Fevre

    2017-01-01

    Time is critical for certain types of dynamic information (e.g. frequency control) in a smart grid scenario. The usefulness of such information depends upon its arrival within a specific frame of time; otherwise it may not serve the purpose and can affect the controller’s performance....... The question is addressed by analyzing the performance of UDP and TCP over imperfect network conditions to show how the selection of the transport layer protocol can dramatically affect the controller’s performance. This analysis is based on a quality metric called mismatch probability that considers occurrence

  20. An Indication of Reliability of the Two-Level Approach of the AWIN Welfare Assessment Protocol for Horses

    Directory of Open Access Journals (Sweden)

    Irena Czycholl

    2018-01-01

    Full Text Available To enhance feasibility, the Animal Welfare Indicators (AWIN) assessment protocol for horses consists of two levels: the first is a visual inspection of a sample of horses performed from a distance, the second a close-up inspection of all horses. The aim was to analyse whether information would be lost if only the first level were performed. In this study, 112 first-level and 112 second-level assessments carried out on a subsequent day by one observer were compared by calculating Spearman’s Rank Correlation Coefficients (RS), Intraclass Correlation Coefficients (ICC), Smallest Detectable Changes (SDC) and Limits of Agreement (LoA). Most indicators demonstrated sufficient reliability between the two levels. Exceptions were the Horse Grimace Scale, the Avoidance Distance Test and the Voluntary Human Approach Test (e.g., Voluntary Human Approach Test: RS: 0.38, ICC: 0.38, SDC: 0.21, LoA: −0.25–0.17), which could, however, also be interpreted as a lack of test-retest reliability. Further disagreement was found for the indicator consistency of manure (RS: 0.31, ICC: 0.38, SDC: 0.36, LoA: −0.38–0.36). For these indicators, an adaptation of the first level would be beneficial. Overall, in this study, the division into two levels was reliable and might therefore have the potential to enhance feasibility in other welfare assessment schemes.
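
    For readers unfamiliar with the agreement statistics listed above (RS, ICC, SDC, LoA), a compact sketch of how such values can be computed for paired first- and second-level scores is given below, using only numpy and scipy. The ICC form chosen here (two-way, absolute agreement, single measures) and the SEM-from-difference shortcut are assumptions for illustration; the study does not state which variants were used, and the example data are invented.

        import numpy as np
        from scipy import stats

        def agreement_stats(level1, level2):
            """level1/level2: paired scores for the same subjects (1-D sequences)."""
            x, y = np.asarray(level1, float), np.asarray(level2, float)
            rs, _ = stats.spearmanr(x, y)                       # RS
            data = np.column_stack([x, y])
            n, k = data.shape
            grand = data.mean()
            ms_rows = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_cols = n * ((data.mean(axis=0) - grand) ** 2).sum() / (k - 1)
            ss_err = ((data - data.mean(axis=1, keepdims=True)
                            - data.mean(axis=0) + grand) ** 2).sum()
            ms_err = ss_err / ((n - 1) * (k - 1))
            # ICC(A,1): two-way, absolute agreement, single measures
            icc = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                        + k * (ms_cols - ms_err) / n)
            diff = x - y
            sem = diff.std(ddof=1) / np.sqrt(2)                 # SEM from difference SD
            sdc = 1.96 * np.sqrt(2) * sem                       # smallest detectable change
            loa = (diff.mean() - 1.96 * diff.std(ddof=1),
                   diff.mean() + 1.96 * diff.std(ddof=1))       # Bland-Altman LoA
            return rs, icc, sdc, loa

        lvl1 = [2, 3, 1, 4, 2, 5, 3, 2, 4, 1]
        lvl2 = [2, 3, 2, 4, 3, 5, 3, 1, 4, 1]
        rs, icc, sdc, loa = agreement_stats(lvl1, lvl2)
        print(f"RS={rs:.2f}  ICC={icc:.2f}  SDC={sdc:.2f}  LoA=({loa[0]:.2f}, {loa[1]:.2f})")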

  1. Reliability and accuracy of a video analysis protocol to assess core ability.

    Science.gov (United States)

    McDonald, Dawn A; Delgadillo, James Q; Fredericson, Michael; McConnell, Jennifer; Hodgins, Melissa; Besier, Thor F

    2011-03-01

    To develop and test a method to measure core ability in healthy athletes with 2-dimensional video analysis software (SiliconCOACH). Specific objectives were to: (1) develop a standardized exercise battery with progressions of increasing difficulty to evaluate areas of core ability in elite athletes; (2) develop an objective and quantitative grading rubric with the use of video analysis software; (3) assess the test-retest reliability of the exercise battery; (4) assess the interrater and intrarater reliability of the video analysis system; and (5) assess the accuracy of the assessment. Test-retest repeatability and accuracy. Testing was conducted in the Stanford Human Performance Laboratory, Stanford University, Stanford, CA. Nine female gymnasts currently training with the Stanford Varsity Women's Gymnastics Team participated in testing. Participants completed a test battery composed of planks, side planks, and leg bridges of increasing difficulty. Subjects completed two 20-minute testing sessions within a 4- to 10-day period. Two-dimensional sagittal-plane video was captured simultaneously with 3-dimensional motion capture. The main outcome measures were pelvic displacement and time that elapsed until failure occurred, as measured with SiliconCOACH video analysis software. Test-retest and interrater and intrarater reliability of the video analysis measures was assessed. Accuracy as compared with 3-dimensional motion capture also was assessed. Levels reached during the side planks and leg bridges had an excellent test-retest correlation (r² = 0.84, r² = 0.95). Pelvis displacements measured by examiner 1 and examiner 2 had an excellent correlation (r² = 0.86, intraclass correlation coefficient = 0.92). Pelvis displacements measured by examiner 1 during independent grading sessions had an excellent correlation (r² = 0.92). Pelvis displacements from the plank and from a set of combined plank and side plank exercises both had an excellent correlation with 3

  2. Reliability of the Star Excursion Balance Test and Two New Similar Protocols to Measure Trunk Postural Control.

    Science.gov (United States)

    López-Plaza, Diego; Juan-Recio, Casto; Barbado, David; Ruiz-Pérez, Iñaki; Vera-Garcia, Francisco J

    2018-05-18

    Although the Star Excursion Balance test (SEBT) has shown good intrasession reliability, the intersession reliability of this test has not been deeply studied. Furthermore, there is an evident, strong influence of the lower limbs on the performance of the SEBT, so even if it has been used to measure core stability, it is possibly not the most suitable measurement. The aims of this study were (1) to assess the absolute and relative between-session reliability of the SEBT and 2 novel variations of this test to assess trunk postural control while sitting, ie, the Star Excursion Sitting Test (SEST) and the Star Excursion Timing Test (SETT); and (2) to analyze the relationships between these 3 test scores. Correlational and reliability test-retest study. Controlled laboratory environment. Twenty-seven physically active men (age: 24.54 ± 3.05 years). Relative and absolute reliability of the SEBT, SEST, and SETT were calculated through the intraclass correlation coefficient (ICC) and standard error of measurement (SEM), respectively. A Pearson correlation analysis was carried out between the variables of the 3 tests. Maximum normalized reach distances were assessed for different SEBT and SEST directions. In addition, composite indexes were calculated for SEBT, SEST, and SETT. The SEBT (dominant leg: ICC = 0.87 [0.73-0.94], SEM = 2.12 [1.66-2.93]; nondominant leg: ICC = 0.74 [0.50-0.87], SEM = 3.23 [2.54-4.45]), SEST (ICC = 0.85 [0.68-0.92], SEM = 1.27 [1.03-1.80]), and SETT (ICC = 0.61 [0.30-0.80], SEM = 2.31 [1.82-3.17]) composite indexes showed moderate-to-high 1-month reliability. A learning effect was detected for some SEBT and SEST directions and for SEST and SETT composite indexes. No significant correlations were found between SEBT and its 2 variations (r ≤ .366; P > .05). A significant correlation was found between the SEST and SETT composite indexes (r = .520; P > .01). SEBT, SEST, and SETT are reliable field protocols to measure postural control. However

  3. A reliable radiochromatographic assay technique for hepatic microsomal 16α-hydroxylase activity towards oestrone 3-sulphate

    International Nuclear Information System (INIS)

    Tsoutsoulis, C.J.; Hobkirk, R.

    1980-01-01

    A reliable procedure for the assay of liver microsomal 16α-hydroxylation of oestrone 3-sulphate has been developed for the guinea pig. It is based on the rapid, quantitative separation of oestradiol and oestriol by Sephadex LH-20 columns after the chemical reduction and enzymic hydrolysis of the incubation products. Microsomal preparations and incubation conditions that optimized 16α-hydroxylation of oestrone 3-sulphate were employed. Under these circumstances, reduction of the substrate at C-17 and hydrolysis of the sulphate were minimized. Conditions were established that yielded reaction linearity with respect to time and microsomal concentration. This hydroxylation had an absolute requirement for NADPH, which could not be satisfied by NADH. Apparent Km values for oestrone 3-sulphate and NADPH, under the conditions used, were 14 μM and 0.17 mM, respectively. 16α-hydroxylase activity was present in the liver microsomal fraction from heavily pigmented, female English Shorthaired guinea pigs. Much lower activity was detected in mature pigmented males and albino females. No activity could be demonstrated in mature, albino males. (author)
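
    As an illustration of how an apparent Km such as the 14 μM reported above for oestrone 3-sulphate is typically obtained, the sketch below fits the Michaelis-Menten equation to rate-versus-substrate data by non-linear least squares. The substrate concentrations and rates are invented for the example and are not data from the study.

        import numpy as np
        from scipy.optimize import curve_fit

        def michaelis_menten(s, vmax, km):
            # v = Vmax * [S] / (Km + [S])
            return vmax * s / (km + s)

        # Illustrative substrate concentrations (uM) and initial rates (arbitrary
        # units); not data from the study.
        s = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
        v = np.array([0.11, 0.24, 0.40, 0.57, 0.72, 0.82])

        (vmax, km), _ = curve_fit(michaelis_menten, s, v, p0=[1.0, 10.0])
        print(f"Vmax = {vmax:.2f} (arbitrary units), apparent Km = {km:.1f} uM")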

  4. Improving the communication reliability of body sensor networks based on the IEEE 802.15.4 protocol.

    Science.gov (United States)

    Gomes, Diogo; Afonso, José A

    2014-03-01

    Body sensor networks (BSNs) enable continuous monitoring of patients anywhere, with minimal constraints on daily life activities. Although the IEEE 802.15.4 and ZigBee® (ZigBee Alliance, San Ramon, CA) standards were mainly developed for use in wireless sensor network (WSN) applications, they are also widely used in BSN applications because of device characteristics such as low power, low cost, and small form factor. However, compared with WSNs, BSNs present some very distinctive characteristics in terms of traffic and mobility patterns, heterogeneity of the nodes, and quality of service requirements. This article evaluates the suitability of the carrier sense multiple access-collision avoidance protocol, used by the IEEE 802.15.4 and ZigBee standards, for data-intensive BSN applications, through the execution of experimental tests in different evaluation scenarios, in order to take into account the effects of contention, clock drift, and hidden nodes on the communication reliability. Results show that the delivery ratio may decrease substantially during transitory periods, which can last for several minutes, to a minimum of 90% with retransmissions and 13% without retransmissions. This article also proposes and evaluates the performance of the BSN contention avoidance mechanism, which was designed to solve the identified reliability problems. This mechanism was able to restore the delivery ratio to 100% even in the scenario without retransmissions.

  5. Integration of GC-MSD and ER-Calux® assay into a single protocol for determining steroid estrogens in environmental samples.

    Science.gov (United States)

    Avberšek, Miha; Žegura, Bojana; Filipič, Metka; Heath, Ester

    2011-11-01

    There are many published studies that use either chemical or biological methods to investigate steroid estrogens in the aquatic environment, but rarer are those that combine both. In this study, gas chromatography with mass selective detection (GC-MSD) and the ER-Calux® estrogenicity assay were integrated into a single protocol for simultaneous determination of the concentrations of natural (estrone, E1; 17β-estradiol, E2; estriol, E3) and synthetic (17α-ethinylestradiol, EE2) steroid estrogens and the total estrogenic potential of environmental samples. For integration purposes, several solvents were investigated and the dimethyl sulphoxide (DMSO) commonly used in the ER-Calux® assay was replaced by ethyl acetate, which is more compatible with gas chromatography and enables the same sample to be analysed by both GC-MSD and the ER-Calux® assay. The integrated protocol was initially tested using a standard mixture of estrogens. The results for pure standards showed that the estrogenicity calculated on the basis of GC-MSD and the ER-Calux® assay exhibited good correlation (r² = 0.96; α = 0.94). The result remained the same when spiked waste water extracts were tested (r² = 0.92, α = 1.02). When applied to real waste water influent and effluent samples the results proved (r² = 0.93; α = 0.99) the applicability of the protocol. The main advantages of this newly developed protocol are simple sample handling for both methods, and reduced material consumption and labour. In addition, it can be applied as either a complete or sequential analysis where the ER-Calux® assay is used as a pre-screening method prior to the chemical analysis. Copyright © 2011 Elsevier B.V. All rights reserved.
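
    A common way to compare chemically measured estrogen concentrations with a bioassay readout, as done above, is to convert the GC-MSD concentrations into 17β-estradiol equivalents (EEQ) using relative potency factors and regress them against the ER-Calux® EEQ. The sketch below illustrates that calculation; the potency factors and sample values are assumed for illustration only and are not those used in the study.

        import numpy as np

        # Illustrative relative potencies vs. 17beta-estradiol (E2 = 1); assumed
        # values, not those used in the study.
        POTENCY = {"E1": 0.2, "E2": 1.0, "E3": 0.01, "EE2": 1.2}

        def chemical_eeq(conc_ng_l):
            """Sum of concentration x relative potency = estradiol equivalents (ng/L)."""
            return sum(conc_ng_l[c] * POTENCY[c] for c in conc_ng_l)

        # Hypothetical paired samples: GC-MSD concentrations and ER-Calux EEQ readouts.
        samples = [
            ({"E1": 12.0, "E2": 3.0, "E3": 30.0, "EE2": 1.0}, 7.1),
            ({"E1": 5.0,  "E2": 1.5, "E3": 10.0, "EE2": 0.5}, 3.4),
            ({"E1": 20.0, "E2": 6.0, "E3": 45.0, "EE2": 2.0}, 12.6),
        ]
        calc = np.array([chemical_eeq(c) for c, _ in samples])
        bio  = np.array([b for _, b in samples])
        slope, intercept = np.polyfit(calc, bio, 1)
        r2 = np.corrcoef(calc, bio)[0, 1] ** 2
        print(f"bio = {slope:.2f} x calc + {intercept:.2f}, r^2 = {r2:.2f}")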

  6. A comparative study of PD-L1 immunohistochemical assays with four reliable antibodies in thymic carcinoma.

    Science.gov (United States)

    Sakane, Tadashi; Murase, Takayuki; Okuda, Katsuhiro; Takino, Hisashi; Masaki, Ayako; Oda, Risa; Watanabe, Takuya; Kawano, Osamu; Haneda, Hiroshi; Moriyama, Satoru; Saito, Yushi; Yamada, Takeshi; Nakanishi, Ryoichi; Inagaki, Hiroshi

    2018-01-23

    Currently, four immunohistochemical assays are registered with the US Food and Drug Administration to detect the expression of PD-L1. We investigated the PD-L1 expression in thymic carcinomas using these four diagnostic assays. The cases of 53 patients were reviewed and their specimens were subjected to four PD-L1 assays with different antibodies (SP142, SP263, 22C3, and 28-8). The PD-L1 expression in tumor cells (TCs) and immune cells (ICs) was evaluated. In TCs, the four assays showed similar scores in each case. Histopathologically, high TC scores were observed in squamous cell carcinomas (SqCCs). Meanwhile, there were no significant relationships among the IC scores in the four assays. In SqCCs, the high expression of PD-L1 (defined as ≥50% TC score) in TCs tended to be associated with early stage cancer. The patients with high expression levels of PD-L1 tended to show longer overall survival in the 22C3 assays (p=0.0200). In thymic carcinomas, the staining pattern showed high concordance among the four assays when TCs - rather than ICs - were stained. High PD-L1 positivity in TCs, especially in SqCCs, indicated that PD-1/PD-L1 targeted therapy may be a promising therapeutic approach.

  7. Implementing voice over Internet protocol in mobile ad hoc network – analysing its features regarding efficiency, reliability and security

    Directory of Open Access Journals (Sweden)

    Naveed Ahmed Sheikh

    2014-05-01

    Full Text Available Providing secure and efficient real-time voice communication in a mobile ad hoc network (MANET) environment is a challenging problem. Voice over Internet protocol (VoIP) has originally been developed over the past two decades for infrastructure-based networks. There are strict timing constraints for acceptable-quality VoIP services, in addition to registration and discovery issues for VoIP end-points. In MANETs, the ad hoc nature of the networks and the multi-hop wireless environment with significant packet loss and delays present formidable challenges to the implementation. Providing a secure real-time VoIP service on a MANET is the main design objective of this paper. The authors have successfully developed a prototype system that establishes reliable and efficient VoIP communication and provides an extremely flexible method for voice communication in MANETs. The authors’ cooperative mesh-based MANET implementation can be used for rapidly deployable VoIP communication with survivable and efficient dynamic networking using open source software.

  8. Simplified PCR protocols for INNO-LiPA HBV Genotyping and INNO-LiPA HBV PreCore assays

    NARCIS (Netherlands)

    Qutub, Mohammed O.; Germer, Jeffrey J.; Rebers, Sjoerd P. H.; Mandrekar, Jayawant N.; Beld, Marcel G. H. M.; Yao, Joseph D. C.

    2006-01-01

    INNO-LiPA HBV Genotyping (LiPA HBV GT) and INNO-LiPA HBV PreCore (LiPA HBV PC) are commercially available assays for hepatitis B virus (HBV) characterization. These assays are labor-intensive and may be prone to exogenous DNA contamination due to their use of nested PCR amplification procedures and

  9. Development and systematic validation of qPCR assays for rapid and reliable differentiation of Xylella fastidiosa strains causing citrus variegated chlorosis.

    Science.gov (United States)

    Li, Wenbin; Teixeira, Diva C; Hartung, John S; Huang, Qi; Duan, Yongping; Zhou, Lijuan; Chen, Jianchi; Lin, Hong; Lopes, Silvio; Ayres, A Juliano; Levy, Laurene

    2013-01-01

    The xylem-limited, Gram-negative, fastidious plant bacterium Xylella fastidiosa is the causal agent of citrus variegated chlorosis (CVC), a destructive disease affecting approximately half of the citrus plantations in the State of São Paulo, Brazil. The disease was recently found in Central America and is threatening the multi-billion U.S. citrus industry. Many strains of X. fastidiosa are pathogens or endophytes in various plants growing in the U.S., and some strains cross-infect several host plants. In this study, a TaqMan-based assay targeting the 16S rDNA signature region was developed for the identification of X. fastidiosa at the species level. Another TaqMan-based assay was developed for the specific identification of the CVC strains. Both new assays have been systematically validated in comparison with the primer/probe sets from four previously published assays on one platform and under similar PCR conditions, and shown to be superior. The species-specific assay detected all X. fastidiosa strains and did not amplify any other citrus pathogen or endophyte tested. The CVC-specific assay detected all CVC strains but did not amplify any non-CVC X. fastidiosa nor any other citrus pathogen or endophyte evaluated. Both sets were multiplexed with a reliable internal control assay targeting host plant DNA, and their diagnostic specificity and sensitivity remained unchanged. This internal control provides quality assurance for DNA extraction, performance of PCR reagents, platforms and operators. The limit of detection for both assays was equivalent to 2 to 10 cells of X. fastidiosa per reaction for field citrus samples. Petioles and midribs of symptomatic leaves of sweet orange harbored the highest populations of X. fastidiosa, providing the best materials for detection of the pathogen. The new species-specific assay will be invaluable for molecular identification of X. fastidiosa at the species level, and the CVC-specific assay will be very powerful for the
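
    The copy-number figures quoted above rest on standard-curve quantification: Ct values from a dilution series define a slope and intercept, from which unknowns are converted to copies per reaction and the amplification efficiency is derived. A minimal sketch with invented Ct values (not the assay's actual calibration data) follows.

        import numpy as np

        # Illustrative standard curve: log10(copies per reaction) vs. observed Ct.
        log_copies = np.array([6, 5, 4, 3, 2, 1], dtype=float)
        ct         = np.array([16.1, 19.5, 22.9, 26.3, 29.8, 33.2])

        slope, intercept = np.polyfit(log_copies, ct, 1)
        efficiency = 10 ** (-1.0 / slope) - 1        # ~1.0 means 100% efficient PCR

        def copies_from_ct(ct_value):
            # invert the calibration line: Ct = slope * log10(copies) + intercept
            return 10 ** ((ct_value - intercept) / slope)

        print(f"slope = {slope:.2f}, efficiency = {efficiency:.0%}")
        print(f"sample with Ct 31.5 ~ {copies_from_ct(31.5):.0f} copies/reaction")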

  10. Identifying rapidly parasiticidal anti-malarial drugs using a simple and reliable in vitro parasite viability fast assay.

    Science.gov (United States)

    Linares, María; Viera, Sara; Crespo, Benigno; Franco, Virginia; Gómez-Lorenzo, María G; Jiménez-Díaz, María Belén; Angulo-Barturen, Íñigo; Sanz, Laura María; Gamo, Francisco-Javier

    2015-11-05

    The emergence of Plasmodium falciparum resistance to artemisinins threatens to undermine the effectiveness of artemisinin-based combination anti-malarial therapy. Developing suitable drugs to replace artemisinins requires the identification of new compounds that display rapid parasite killing kinetics. However, no current methods fully meet the requirements to screen large compound libraries for candidates with such properties. This study describes the development and validation of an in vitro parasite viability fast assay for identifying rapidly parasiticidal anti-malarial drugs. Parasite killing kinetics were determined by first culturing unlabelled erythrocytes with P. falciparum in the presence of anti-malarial drugs for 24 or 48 h. After removing the drug, samples were added to erythrocytes pre-labelled with intracellular dye to allow their subsequent identification. The ability of viable parasites to re-establish infection in labelled erythrocytes could then be detected by two-colour flow cytometry after tagging of parasite DNA. Thus, double-stained erythrocytes (with the pre-labelled intracellular dye and the parasite DNA dye) result only after establishment of new infections by surviving parasites. The capacity of the test anti-malarial drugs to eliminate viable parasites within 24 or 48 h could, therefore, be determined. The parasite viability fast assay could be completed within 48 h following drug treatment and distinguished rapidly parasiticidal anti-malarial drugs from those acting more slowly. The assay was validated against ten standard anti-malarial agents with known properties and results correlated well with established methods. An abbreviated assay, suitable for adaptation to medium-high throughput screening, was validated and applied against a set of 20 compounds retrieved from the publicly available Medicines for Malaria Venture 'Malaria Box'. The quantification of new infections to determine parasite viability offers important
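
    The core readout described above is the fraction of erythrocytes positive for both the pre-label dye and the parasite DNA dye; drug effect is then expressed relative to an untreated control. The sketch below illustrates that gating arithmetic with simulated fluorescence intensities and arbitrary thresholds, not data or gates from the study.

        import numpy as np

        def double_positive_fraction(prelabel, dna, prelabel_cut, dna_cut):
            """Fraction of events above both fluorescence thresholds, i.e. new
            infections of pre-labelled erythrocytes."""
            prelabel, dna = np.asarray(prelabel), np.asarray(dna)
            return np.mean((prelabel > prelabel_cut) & (dna > dna_cut))

        rng = np.random.default_rng(0)
        # Illustrative fluorescence intensities for 50,000 events per sample.
        control = (rng.lognormal(3.0, 0.5, 50_000), rng.lognormal(2.0, 1.0, 50_000))
        treated = (rng.lognormal(3.0, 0.5, 50_000), rng.lognormal(0.5, 1.0, 50_000))

        f_control = double_positive_fraction(*control, prelabel_cut=10, dna_cut=20)
        f_treated = double_positive_fraction(*treated, prelabel_cut=10, dna_cut=20)
        print(f"viability after drug ~ {100 * f_treated / f_control:.1f}% of control")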

  11. Molecular recognition and self-assembly special feature: A general protocol for creating high-throughput screening assays for reaction yield and enantiomeric excess applied to hydrobenzoin.

    Science.gov (United States)

    Shabbir, Shagufta H; Regan, Clinton J; Anslyn, Eric V

    2009-06-30

    A general approach to high-throughput screening of enantiomeric excess (ee) and concentration was developed by using indicator displacement assays (IDAs), and the protocol was then applied to the vicinal diol hydrobenzoin. The method involves the sequential utilization of what we define herein as screening, training, and analysis plates. Several enantioselective boronic acid-based receptors were screened by using 96-well plates, both for their ability to discriminate the enantiomers of hydrobenzoin and to find their optimal pairing with indicators resulting in the largest optical responses. The best receptor/indicator combination was then used to train an artificial neural network to determine concentration and ee. To prove the practicality of the developed protocol, analysis plates were created containing true unknown samples of hydrobenzoin generated by established Sharpless asymmetric dihydroxylation reactions, and the best ligand was correctly identified.
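
    To illustrate the analysis step described above, the sketch below trains a small feed-forward neural network on synthetic optical readouts from two hypothetical enantioselective receptor/indicator wells and uses it to predict concentration and ee for an "unknown". The well response model, noise level and network size are assumptions; the study's actual training data and network are not reproduced here.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(1)

        # Synthetic training plate: each sample gives absorbances from wells holding
        # two (hypothetical) enantioselective receptor/indicator combinations.
        def simulate_wells(conc_mM, ee):
            r_frac = (1 + ee) / 2                      # fraction of one enantiomer
            a1 = 0.9 - 0.5 * conc_mM * (0.8 * r_frac + 0.2 * (1 - r_frac))
            a2 = 0.9 - 0.5 * conc_mM * (0.2 * r_frac + 0.8 * (1 - r_frac))
            return [a1 + rng.normal(0, 0.005), a2 + rng.normal(0, 0.005)]

        train_y = np.array([[c, e] for c in np.linspace(0.2, 1.0, 9)
                                    for e in np.linspace(-1, 1, 11)])
        train_x = np.array([simulate_wells(c, e) for c, e in train_y])

        scaler = StandardScaler().fit(train_x)
        net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
        net.fit(scaler.transform(train_x), train_y)

        unknown = simulate_wells(0.6, 0.5)             # "analysis plate" sample
        conc, ee = net.predict(scaler.transform([unknown]))[0]
        print(f"predicted concentration ~ {conc:.2f} mM, ee ~ {ee:.2f}")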

  12. Design and Implementation of a MAC Protocol for Timely and Reliable Delivery of Command and Data in Dynamic Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Phan Van Vinh

    2013-09-01

    Full Text Available This paper proposes and implements a new TDMA-based MAC protocol for providing timely and reliable delivery of data and commands for monitoring and control networks. In this kind of network, sensor nodes are required to sense data from the monitored environment periodically and then send the data to a sink. The sink determines whether the environment is safe or not by analyzing the acquired data. Sometimes, a command or control message is sent from the sink to a particular node or a group of nodes to execute services or to request further data of interest. The proposed MAC protocol enables bidirectional communication, controls the active and sleep modes of a sensor node to conserve energy, and addresses the problem of load imbalance between the nodes near a sink and the other nodes. It can improve the reliability of communication significantly while extending network lifetime. These claims are supported by the experimental results.
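
    As a rough illustration of the kind of slot allocation such a TDMA MAC implies, the sketch below builds a superframe in which nodes closer to the sink (which relay more traffic) receive extra uplink slots and one slot is reserved for sink-to-node commands. It is not the authors' scheduler; the names and the allocation rule are invented.

        # Illustrative TDMA superframe builder (not the paper's actual scheduler).
        def build_superframe(hops_to_sink, max_hops=None):
            """hops_to_sink: dict node -> hop count; returns an ordered slot list."""
            max_hops = max_hops or max(hops_to_sink.values())
            schedule = [("SINK", "command/broadcast")]           # downlink slot
            for node, hops in sorted(hops_to_sink.items(), key=lambda kv: kv[1]):
                n_slots = 1 + (max_hops - hops)                  # closer => more slots
                schedule.extend((node, "data uplink") for _ in range(n_slots))
            return schedule

        for slot, (owner, purpose) in enumerate(build_superframe({"A": 1, "B": 2, "C": 3})):
            print(f"slot {slot}: {owner:>4} ({purpose})")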

  13. Revaluation of biological variation of glycated hemoglobin (HbA1c) using an accurately designed protocol and an assay traceable to the IFCC reference system.

    Science.gov (United States)

    Braga, Federica; Dolci, Alberto; Montagnana, Martina; Pagani, Franca; Paleari, Renata; Guidi, Gian Cesare; Mosca, Andrea; Panteghini, Mauro

    2011-07-15

    Glycated hemoglobin (HbA1c) has a key role for diagnosing diabetes and monitoring glycemic state. As recently reviewed, available data on HbA1c biological variation show marked heterogeneity. Here we experimentally revaluated these data using a well designed protocol. We took five EDTA whole blood specimens from 18 apparently healthy subjects on the same day, every two weeks for two months. Samples were stored at -80°C until analysis and assayed in duplicate in a single run by Roche Tina-quant® Gen.2 immunoassay. Data were analyzed by ANOVA. To assess the assay traceability to the IFCC reference method, we preliminarily carried out a correlation experiment. The bias (mean±SD) of the Roche immunoassay was 0.3%±0.7%, confirming the traceability of the employed assay. No difference was found in HbA1c values between men and women. Within- and between-subject CV were 2.5% and 7.1%, respectively. Derived desirable analytical goals for imprecision, bias, and total error were 1.3%, 1.9%, and 3.9%, respectively. HbA1c had marked individuality, limiting the use of population-based reference limits for test interpretation. The estimated critical difference was ~10%. For the first time we defined biological variation and derived indices for the clinical application of HbA1c measurements using an accurately designed protocol and an assay standardized according to the IFCC. Copyright © 2011 Elsevier B.V. All rights reserved.
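
    The derived goals quoted above follow the usual formulas based on the within-subject (CVI) and between-subject (CVG) biological variation. The sketch below reproduces the reported figures; the analytical CV used for the critical difference (reference change value) is an assumption of about 2.5%, since it is not stated in this excerpt.

        import math

        cv_i, cv_g = 2.5, 7.1      # within- and between-subject CVs from the study (%)
        cv_a = 2.5                 # assumed analytical CV (%); not stated in this excerpt

        imprecision_goal = 0.5 * cv_i                                  # ~1.3 %
        bias_goal        = 0.25 * math.sqrt(cv_i**2 + cv_g**2)         # ~1.9 %
        total_error_goal = bias_goal + 1.65 * imprecision_goal         # ~3.9 %
        rcv              = 2.77 * math.sqrt(cv_a**2 + cv_i**2)         # critical difference, ~10 %
        index_individuality = cv_i / cv_g                              # <0.6 => marked individuality

        print(f"imprecision {imprecision_goal:.2f}%  bias {bias_goal:.2f}%  "
              f"total error {total_error_goal:.2f}%  RCV {rcv:.2f}%  "
              f"II {index_individuality:.2f}")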

  14. The precision and robustness of published protocols for disc diffusion assays of antimicrobial agent susceptibility: an inter-laboratory study

    DEFF Research Database (Denmark)

    Gabhainn, S.N.; Bergh, Ø.; Dixon, B.

    2004-01-01

    for each agent being 11.1%. Significant influences on zone size were detected for all three parameters of the protocol. Media source effects were particularly notable with respect to oxytetracycline and oxolinic acid discs, disc source effects with respect to ampicillin and sulphamethoxazole...

  15. reliability reliability

    African Journals Online (AJOL)

    eobe

    RELIABILITY .... V, given by the code of practice. However, checks must .... an optimization procedure over the failure domain F corresponding .... of Concrete Members based on Utility Theory, Technical ...

  16. Adaptations of the Saker-Solomons test: simple, reliable colorimetric field assays for chloroquine and its metabolites in urine.

    OpenAIRE

    Mount, D. L.; Nahlen, B. L.; Patchen, L. C.; Churchill, F. C.

    1989-01-01

    Two field-adapted colorimetric methods for measuring the antimalarial drug chloroquine in urine are described. Both are modifications of the method of Saker and Solomons for screening urine for phencyclidine and other drugs of abuse, using the colour reagent tetrabromophenolphthalein ethyl ester. One method is semiquantitative, detecting the presence of chloroquine (Cq) and its metabolites in urine with a 1 microgram/ml detection limit; it is more sensitive and reliable than the commonly used...

  17. Development and Utilization of an Ex Vivo Bromodeoxyuridine Local Lymph Node Assay (LLNA) Protocol for Assessing Potential Chemical Sensitizers

    Science.gov (United States)

    The murine local lymph node assay (LLNA) is widely used to identify chemicals that may cause allergic contact dermatitis. Exposure to a dermal sensitizer results in proliferation of local lymph node T cells, which has traditionally been measured by in vivo incorporation of [3H]m...

  18. Inter-observer reliability of animal-based welfare indicators included in the Animal Welfare Indicators welfare assessment protocol for dairy goats.

    Science.gov (United States)

    Vieira, A; Battini, M; Can, E; Mattiello, S; Stilwell, G

    2018-01-08

    This study was conducted within the context of the Animal Welfare Indicators (AWIN) project and the underlying scientific motivation for the development of the study was the scarcity of data regarding inter-observer reliability (IOR) of welfare indicators, particularly given the importance of reliability as a further step for developing on-farm welfare assessment protocols. The objective of this study is therefore to evaluate IOR of animal-based indicators (at group and individual-level) of the AWIN welfare assessment protocol (prototype) for dairy goats. In the design of the study, two pairs of observers, one in Portugal and another in Italy, visited 10 farms each and applied the AWIN prototype protocol. Farms in both countries were visited between January and March 2014, and all the observers received the same training before the farm visits were initiated. Data collected during farm visits, and analysed in this study, include group-level and individual-level observations. The results of our study allow us to conclude that most of the group-level indicators presented the highest IOR level ('substantial', 0.85 to 0.99) in both field studies, pointing to a usable set of animal-based welfare indicators that were therefore included in the first level of the final AWIN welfare assessment protocol for dairy goats. Inter-observer reliability of individual-level indicators was lower, but the majority of them still reached 'fair to good' (0.41 to 0.75) and 'excellent' (0.76 to 1) levels. In the paper we explore reasons for the differences found in IOR between the group and individual-level indicators, including how the number of individual-level indicators to be assessed on each animal and the restraining method may have affected the results. Furthermore, we discuss the differences found in the IOR of individual-level indicators in both countries: the Portuguese pair of observers reached a higher level of IOR, when compared with the Italian observers. We argue how the

  19. An improved behavioural assay demonstrates that ultrasound vocalizations constitute a reliable indicator of chronic cancer pain and neuropathic pain

    Directory of Open Access Journals (Sweden)

    Selvaraj Deepitha

    2010-03-01

    Full Text Available Abstract Background On-going pain is one of the most debilitating symptoms associated with a variety of chronic pain disorders. An understanding of mechanisms underlying on-going pain, i.e. stimulus-independent pain, has been hampered so far by a lack of behavioural parameters which enable studying it in experimental animals. Ultrasound vocalizations (USVs) have been proposed to correlate with pain evoked by an acute activation of nociceptors. However, literature on the utility of USVs as an indicator of chronic pain is very controversial. A majority of these inconsistencies arise from parameters confounding behavioural experiments, which include novelty, fear and stress due to restraint, amongst others. Results We have developed an improved assay which overcomes these confounding factors and enables studying USVs in freely moving mice repetitively over several weeks. Using this improved assay, we report here that USVs increase significantly in mice with bone metastases-induced cancer pain or neuropathic pain for several weeks, in comparison to sham-treated mice. Importantly, analgesic drugs which are known to alleviate tumour pain or neuropathic pain in human patients significantly reduce USVs as well as mechanical allodynia in corresponding mouse models. Conclusions We show that studying USVs and mechanical allodynia in the same cohort of mice enables comparing the temporal progression of on-going pain (i.e. stimulus-independent pain) and stimulus-evoked pain in these clinically highly-relevant forms of chronic pain.

  20. Addressing the need for biomarker liquid chromatography/mass spectrometry assays: a protocol for effective method development for the bioanalysis of endogenous compounds in cerebrospinal fluid.

    Science.gov (United States)

    Benitex, Yulia; McNaney, Colleen A; Luchetti, David; Schaeffer, Eric; Olah, Timothy V; Morgan, Daniel G; Drexler, Dieter M

    2013-08-30

    Research on disorders of the central nervous system (CNS) has shown that an imbalance in the levels of specific endogenous neurotransmitters may underlie certain CNS diseases. These alterations in neurotransmitter levels may provide insight into pathophysiology, but can also serve as disease and pharmacodynamic biomarkers. To measure these potential biomarkers in vivo, the relevant sample matrix is cerebrospinal fluid (CSF), which is in equilibrium with the brain's interstitial fluid and circulates through the ventricular system of the brain and spinal cord. Accurate analysis of these potential biomarkers can be challenging due to low CSF sample volume, low analyte levels, and potential interferences from other endogenous compounds. A protocol has been established for effective method development of bioanalytical assays for endogenous compounds in CSF. Database searches and standard-addition experiments are employed to qualify sample preparation and specificity of the detection thus evaluating accuracy and precision. This protocol was applied to the study of the histaminergic neurotransmitter system and the analysis of histamine and its metabolite 1-methylhistamine in rat CSF. The protocol resulted in a specific and sensitive novel method utilizing pre-column derivatization ultra high performance liquid chromatography/tandem mass spectrometry (UHPLC/MS/MS), which is also capable of separating an endogenous interfering compound, identified as taurine, from the analytes of interest. Copyright © 2013 John Wiley & Sons, Ltd.

  1. Calf-raise senior: a new test for assessment of plantar flexor muscle strength in older adults: protocol, validity, and reliability.

    Science.gov (United States)

    André, Helô-Isa; Carnide, Filomena; Borja, Edgar; Ramalho, Fátima; Santos-Rocha, Rita; Veloso, António P

    2016-01-01

    This study aimed to develop a new field test protocol with a standardized measurement of strength and power in plantar flexor muscles targeted to functionally independent older adults, the calf-raise senior (CRS) test, and also evaluate its reliability and validity. Forty-one subjects aged 65 years and older of both sexes participated in five different cross-sectional studies: 1) pilot (n=12); 2) inter- and intrarater agreement (n=12); 3) construct (n=41); 4) criterion validity (n=33); and 5) test-retest reliability (n=41). Different motion parameters were compared in order to define a specifically designed protocol for seniors. Two raters evaluated each participant twice, and the results of the same individual were compared between raters and participants to assess the interrater and intrarater agreement. The validity and reliability studies involved three testing sessions over 2 weeks, including a battery of functional fitness tests, the CRS test on two occasions, accelerometry, and strength assessments in an isokinetic dynamometer. The CRS test presented an excellent test-retest reliability (intraclass correlation coefficient [ICC] = 0.90, standard error of measurement = 2.0) and interrater reliability (ICC = 0.93-0.96), as well as a good intrarater agreement (ICC = 0.79-0.84). Participants with better results in the CRS test were younger and presented higher levels of physical activity and functional fitness. A significant association between test results and all strength parameters (isometric, r = 0.87, r² = 0.75; isokinetic, r = 0.86, r² = 0.74; and rate of force development, r = 0.77, r² = 0.59) was shown. This study was successful in demonstrating that the CRS test can meet the scientific criteria of validity and reliability. The test can be a good indicator of ankle strength in older adults and proved to discriminate significantly between individuals with improved functionality and levels of physical activity.

  2. Evaluation of a manual DNA extraction protocol and an isothermal amplification assay for detecting HIV-1 DNA from dried blood spots for use in resource-limited settings.

    Science.gov (United States)

    Jordan, Jeanne A; Ibe, Christine O; Moore, Miranda S; Host, Christel; Simon, Gary L

    2012-05-01

    In resource-limited settings (RLS) dried blood spots (DBS) are collected on infants and transported through provincial laboratories to a central facility where HIV-1 DNA PCR testing is performed using specialized equipment. Implementing a simpler approach not requiring such equipment or skilled personnel could allow the more numerous provincial laboratories to offer testing, improving turn-around-time to identify and treat infected infants sooner. Assess performances of a manual DNA extraction method and helicase-dependent amplification (HDA) assay for detecting HIV-1 DNA from DBS. 60 HIV-1 infected adults were enrolled, blood samples taken and DBS made. DBS extracts were assessed for DNA concentration and beta globin amplification using PCR and melt-curve analysis. These same extracts were then tested for HIV-1 DNA using HDA and compared to results generated by PCR and pyrosequencing. Finally, HDA limit of detection (LOD) studies were performed using DBS extracts prepared with known numbers of 8E5 cells. The manual extraction protocol consistently yielded high concentrations of amplifiable DNA from DBS. LOD assessment demonstrated HDA detected ∼470 copies/ml of HIV-1 DNA extracts in 4/4 replicates. No statistical difference was found using the McNemar's test when comparing HDA to PCR for detecting HIV-1 DNA from DBS. Using just a magnet, heat block and pipettes, the manual extraction protocol and HDA assay detected HIV-1 DNA from DBS at levels that would be useful for early infant diagnosis. Next steps will include assessing HDA for non-B HIV-1 subtypes recognition and comparison to Roche HIV-1 DNA v1.5 PCR assay. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. MQARR-AODV: A NOVEL MULTIPATH QOS AWARE RELIABLE REVERSE ON-DEMAND DISTANCE VECTOR ROUTING PROTOCOL FOR MOBILE AD-HOC NETWORKS

    Directory of Open Access Journals (Sweden)

    K.G. Santhiya

    2012-12-01

    Full Text Available MANET (Mobile Ad-hoc Network) is an infrastructure-less wireless ad-hoc network that does not require any central control. The topology of the network changes drastically due to the very fast mobility of nodes, so an adaptive routing protocol is needed for routing in a MANET. AODV (Ad-hoc On-demand Distance Vector) routing is an effective and prominent on-demand ad-hoc routing protocol. During the route establishment phase in traditional AODV, only one route reply message will be sent along the reverse path to establish the routing path. The high mobility of nodes may affect the reply messages, which leads to retransmission of the route request message by the sender, which in turn leads to higher communication delay, power consumption and a reduction in the ratio of packets delivered. Sending multiple route reply messages and establishing multiple paths in a single path discovery will reduce the routing overhead involved in maintaining the connection between source and destination nodes. Multipath routing can render high scalability and end-to-end throughput and provide load balancing in MANETs. The newly proposed Multipath QoS-aware reliable routing protocol establishes two maximally node-disjoint routes, and the data transfer is carried out on the two paths simultaneously. To select the best paths, the proposed protocol uses three parameters: Link Eminence, MAC overhead and node residual energy. The experimental values prove that the MQARR-AODV protocol achieves high reliability, stability and low latency, and outperforms AODV through lower energy consumption, overhead and delay.
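
    The abstract names three path-selection parameters but not how they are combined, so the sketch below only illustrates one plausible way to rank candidate node-disjoint paths from per-hop link quality, MAC overhead and residual energy. The weights and the min/max aggregation are assumptions, not the protocol's actual cost function.

        # Illustrative ranking of candidate node-disjoint paths; the weighting scheme
        # is an assumption, not MQARR-AODV's actual metric.
        def path_score(path, w_quality=0.4, w_overhead=0.3, w_energy=0.3):
            """path: list of per-hop dicts with link_quality (0-1), mac_overhead (0-1)
            and residual_energy (0-1). A path is only as good as its weakest hop."""
            quality  = min(hop["link_quality"] for hop in path)
            overhead = max(hop["mac_overhead"] for hop in path)
            energy   = min(hop["residual_energy"] for hop in path)
            return w_quality * quality + w_overhead * (1 - overhead) + w_energy * energy

        paths = {
            "path1": [{"link_quality": 0.9, "mac_overhead": 0.2, "residual_energy": 0.8},
                      {"link_quality": 0.7, "mac_overhead": 0.3, "residual_energy": 0.6}],
            "path2": [{"link_quality": 0.8, "mac_overhead": 0.5, "residual_energy": 0.9},
                      {"link_quality": 0.8, "mac_overhead": 0.4, "residual_energy": 0.7}],
        }
        best_two = sorted(paths, key=lambda p: path_score(paths[p]), reverse=True)[:2]
        print("paths selected for simultaneous transfer:", best_two)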

  4. Mutagenicity of the potent rat hepatocarcinogen 6BT to the liver of transgenic (lacI) rats: consideration of a reduced mutation assay protocol.

    Science.gov (United States)

    Lefevre, P A; Tinwell, H; Ashby, J

    1997-01-01

    6-(p-dimethylaminophenylazo)benzothiazole (6BT) is an unusually potent rat hepatocarcinogen, producing large malignant liver tumours after only 2-3 months of dietary administration in a riboflavin-deficient diet. This azocarcinogen has been evaluated in a Big Blue F344 transgenic rat (lacI) gene mutation assay. In a reproduction of the early stages of the carcinogenesis bioassay of this agent, rats were maintained on a riboflavin-deficient diet and were given 10 consecutive daily doses of 6BT (10 mg/kg) by oral gavage. The animals were killed and the livers examined 11 days after the final dose. The livers of 6BT-treated rats showed evidence of hepatocellular hypertrophy in centrolobular areas, with some indication of an increased incidence of mitotic figures. An approximately 10-fold increase in the mutation frequency of DNA isolated from an aliquot of the combined liver homogenates of 6BT-treated rats was observed over that obtained from an equivalent aliquot from control animals. Examination of DNA samples isolated from the livers of individual animals confirmed that 6BT was mutagenic in Big Blue rat livers. These data extend the sensitivity of this transgenic assay to include azo hepatocarcinogens. The determination of mutation frequencies using pooled tissue samples represented a major resource-saving adaptation of the assay protocol in the present study; the general advantages and disadvantages of this practice are discussed.

  5. Validation of an accelerated high-sensitivity troponin T assay protocol in an Australian cohort with chest pain.

    Science.gov (United States)

    Parsonage, William A; Greenslade, Jaimi H; Hammett, Christopher J; Lamanna, Arvin; Tate, Jillian R; Ungerer, Jacobus P; Chu, Kevin; Than, Martin; Brown, Anthony F T; Cullen, Louise

    2014-02-17

    To validate an accelerated biomarker strategy using a high-sensitivity cardiac troponin T (hs-cTnT) assay for diagnosing acute myocardial infarction (AMI) in patients presenting to the emergency department with chest pain; and to validate this strategy in combination with the National Heart Foundation of Australia/Cardiac Society of Australia and New Zealand risk stratification model. Single-centre, prospective, observational cohort study of 764 adults presenting to a tertiary hospital with symptoms of possible acute coronary syndrome between November 2008 and February 2011. AMI or cardiac death within 24 hours of presentation (primary), and major adverse cardiac events within 30 days (secondary). An elevated hs-cTnT assay result above the 99th percentile at either the 0 h or 2 h time points had sensitivity of 96.4% (95% CI, 87.9%-99.0%), specificity of 82.6% (95% CI, 79.7%-85.2%), negative predictive value of 99.7% (95% CI, 98.8%-99.9%) and positive predictive value of 30.5% (95% CI, 24.2%-37.6%) for diagnosing AMI. Compared with a traditional 6 h cardiac troponin testing strategy, the accelerated strategy led to reclassification of risk in only two patients with adverse cardiac outcomes, with no net effect on appropriate management. In patients presenting with chest pain, an accelerated biomarker strategy using the hs-cTnT assay performed well in the initial diagnosis of AMI. The accelerated strategy was also effective when incorporated into a comprehensive strategy of risk stratification that included clinical and demographic factors. The time saved by this approach could have a major impact on health service delivery. Australian New Zealand Clinical Trials Registry ACTRN12610000053022.
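
    The diagnostic performance figures above come from a standard 2x2 table. The sketch below shows how sensitivity, specificity, PPV and NPV with Wilson 95% confidence intervals are computed; the counts are illustrative values chosen to be consistent with the reported estimates and cohort size, not numbers taken from the paper.

        import math

        def wilson_ci(successes, n, z=1.96):
            # Wilson score interval for a binomial proportion
            p = successes / n
            denom = 1 + z**2 / n
            centre = (p + z**2 / (2 * n)) / denom
            half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
            return centre - half, centre + half

        def diagnostics(tp, fp, fn, tn):
            measures = {
                "sensitivity": (tp, tp + fn),
                "specificity": (tn, tn + fp),
                "PPV": (tp, tp + fp),
                "NPV": (tn, tn + fn),
            }
            for name, (num, den) in measures.items():
                lo, hi = wilson_ci(num, den)
                print(f"{name}: {num / den:.1%} (95% CI {lo:.1%}-{hi:.1%})")

        # Illustrative counts (elevated hs-cTnT at 0 h or 2 h vs. adjudicated AMI),
        # chosen only to match the reported estimates; not the study's raw data.
        diagnostics(tp=54, fp=123, fn=2, tn=585)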

  6. Harmonization of radiobiological assays: why and how?

    International Nuclear Information System (INIS)

    Prasanna, Pataje G.

    2014-01-01

    The International Atomic Energy Agency has made available a technical manual for cytogenetic biodosimetry assays (dicentric chromosome aberration (DCA) and cytokinesis-block micronucleus (CBMN) assays) used for radiation dose assessment in radiation accidents. The International Standardization Organization, which develops standards and guidelines and also provides an avenue for laboratory accreditation, has developed guidelines and recommendations for performing cytogenetic biodosimetry assays. Harmonization of DCA and CBMN assays has improved their accuracy. Double-blinded inter-laboratory comparison studies involving several networks have further validated DCA and CBMN assays and improved the confidence in their potential use for radiation dose assessment in mass casualties. This kind of international harmonization is lacking for pre-clinical radiobiology assays. The widely used pre-clinical assays that are relatively important to set the stage for clinical trials include clonogenic assays, flow-cytometry assays, apoptotic assays, and tumor regression and growth delay assays. However, significant inter-laboratory variations in data occur. This raises concerns about the reliability and reproducibility of preclinical data that drives further development and translation. Lack of reproducibility may stem from a variety of factors such as poor scientist training, less than optimal experimental design, inadequate description of methodology, and the impulse to publish only positive data, etc. Availability of technical manuals, standard operating procedures, accreditation avenues for laboratories performing such assays, inter-laboratory comparisons, and use of standardized protocols are necessary to enhance reliability and reproducibility. Thus, it is important that radiobiological assays are harmonized for laboratory protocols to ensure successful translation of pre-clinical research on radiation effect modulators to help design clinical trials with

  7. Adaptations of the Saker-Solomons test: simple, reliable colorimetric field assays for chloroquine and its metabolites in urine.

    Science.gov (United States)

    Mount, D L; Nahlen, B L; Patchen, L C; Churchill, F C

    1989-01-01

    Two field-adapted colorimetric methods for measuring the antimalarial drug chloroquine in urine are described. Both are modifications of the method of Saker and Solomons for screening urine for phencyclidine and other drugs of abuse, using the colour reagent tetrabromophenolphthalein ethyl ester. One method is semiquantitative, detecting the presence of chloroquine (Cq) and its metabolites in urine with a 1 microgram/ml detection limit; it is more sensitive and reliable than the commonly used Dill-Glazko method and is as easy to apply in the field. The second method uses a hand-held, battery-operated filter photometer to quantify Cq and its metabolites with a 2 microgram/ml detection limit and a linear range up to 8 micrograms/ml. The first method was validated in the field using a published quantitative colorimetric method and samples from a malaria study in Nigeria. The second method was validated in the laboratory against high-performance liquid chromatographic results on paired samples from the Nigerian study. Both methods may be used in remote locations where malaria is endemic and no electricity is available.

  8. Design and performance testing of a DNA extraction assay for sensitive and reliable quantification of acetic acid bacteria directly in red wine using real time PCR

    Directory of Open Access Journals (Sweden)

    Cédric Longin

    2016-06-01

    Full Text Available Although strategies exist to prevent AAB contamination, the increased interest in wines with low sulfite addition leads to greater AAB spoilage. Hence there is a real need for a rapid, specific, sensitive and reliable method for detecting these spoilage bacteria. All these requirements are met by real-time Polymerase Chain Reaction (or quantitative PCR; qPCR). Here, we compare existing methods of isolating DNA and their adaptation to a red wine matrix. Two different protocols for isolating DNA and three PCR mix compositions were tested to select the best method. The addition of insoluble polyvinylpolypyrrolidone (PVPP) at 1% (v/v) during DNA extraction using one of the protocols tested succeeded in eliminating PCR inhibitors from red wine. We developed a bacterial internal control which was efficient in avoiding false negative results due to decreases in the efficiency of DNA isolation and/or amplification. The specificity, linearity, repeatability and reproducibility of the method were evaluated. A standard curve was established for the enumeration of AAB inoculated into red wines. The limit of quantification in red wine was 3.7 log AAB/mL, and about 2.8 log AAB/mL when the volume of the samples was increased from 1 mL to 10 mL. Thus the DNA extraction method developed in this paper allows sensitive and reliable AAB quantification without underestimation, thanks to the presence of an internal control. Moreover, monitoring of both the AAB population and the amount of acetic acid in ethanol medium and red wine highlighted that a minimum of about 6.0 log cells/mL of AAB is needed to significantly increase the production of acetic acid leading to spoilage.

  9. The sentence verification task: a reliable fMRI protocol for mapping receptive language in individual subjects

    International Nuclear Information System (INIS)

    Sanjuan, Ana; Avila, Cesar; Forn, Cristina; Ventura-Campos, Noelia; Rodriguez-Pujadas, Aina; Garcia-Porcar, Maria; Belloch, Vicente; Villanueva, Vicente

    2010-01-01

    To test the capacity of a sentence verification (SV) task to reliably activate receptive language areas. Presurgical evaluation of language is useful in predicting postsurgical deficits in patients who are candidates for neurosurgery. Productive language tasks have been successfully elaborated, but more conflicting results have been found in receptive language mapping. Twenty-two right-handed healthy controls made true-false semantic judgements of brief sentences presented auditorily. Group maps showed reliable functional activations in the frontal and temporoparietal language areas. At the individual level, the SV task showed activation located in receptive language areas in 100% of the participants with strong left-sided distributions (mean lateralisation index of 69.27). The SV task can be considered a useful tool in evaluating receptive language function in individual subjects. This study is a first step towards designing the fMRI task which may serve to presurgically map receptive language functions. (orig.)
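
    A lateralisation index of the kind reported above (mean 69.27) is commonly defined as 100 x (L - R) / (L + R) over suprathreshold voxels in homologous left and right regions of interest; the sketch below assumes that definition, with simulated statistic maps and an arbitrary threshold, since the record does not specify how the index was computed.

        import numpy as np

        def lateralisation_index(left_map, right_map, threshold):
            """100 * (L - R) / (L + R) over suprathreshold voxels in homologous
            left/right language ROIs; positive values indicate left dominance."""
            l = int(np.sum(np.asarray(left_map) > threshold))
            r = int(np.sum(np.asarray(right_map) > threshold))
            return 100.0 * (l - r) / (l + r) if (l + r) else 0.0

        rng = np.random.default_rng(2)
        # Simulated t-values for 4000 voxels in each hemisphere's ROI.
        left, right = rng.normal(2.5, 1.0, 4000), rng.normal(1.2, 1.0, 4000)
        print(f"LI = {lateralisation_index(left, right, threshold=3.0):.1f}")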

  10. The sentence verification task: a reliable fMRI protocol for mapping receptive language in individual subjects

    Energy Technology Data Exchange (ETDEWEB)

    Sanjuan, Ana; Avila, Cesar [Universitat Jaume I, Departamento de Psicologia Basica, Clinica y Psicobiologia, Castellon de la Plana (Spain); Hospital La Fe, Unidad de Epilepsia, Servicio de Neurologia, Valencia (Spain); Forn, Cristina; Ventura-Campos, Noelia; Rodriguez-Pujadas, Aina; Garcia-Porcar, Maria [Universitat Jaume I, Departamento de Psicologia Basica, Clinica y Psicobiologia, Castellon de la Plana (Spain); Belloch, Vicente [Hospital La Fe, Eresa, Servicio de Radiologia, Valencia (Spain); Villanueva, Vicente [Hospital La Fe, Unidad de Epilepsia, Servicio de Neurologia, Valencia (Spain)

    2010-10-15

    To test the capacity of a sentence verification (SV) task to reliably activate receptive language areas. Presurgical evaluation of language is useful in predicting postsurgical deficits in patients who are candidates for neurosurgery. Productive language tasks have been successfully elaborated, but more conflicting results have been found in receptive language mapping. Twenty-two right-handed healthy controls made true-false semantic judgements of brief sentences presented auditorily. Group maps showed reliable functional activations in the frontal and temporoparietal language areas. At the individual level, the SV task showed activation located in receptive language areas in 100% of the participants with strong left-sided distributions (mean lateralisation index of 69.27). The SV task can be considered a useful tool in evaluating receptive language function in individual subjects. This study is a first step towards designing the fMRI task which may serve to presurgically map receptive language functions. (orig.)

  11. The best of both worlds: Building on the COPUS and RTOP observation protocols to easily and reliably measure various levels of reformed instructional practice.

    Science.gov (United States)

    Lund, Travis J; Pilarz, Matthew; Velasco, Jonathan B; Chakraverty, Devasmita; Rosploch, Kaitlyn; Undersander, Molly; Stains, Marilyne

    2015-01-01

    Researchers, university administrators, and faculty members are increasingly interested in measuring and describing instructional practices provided in science, technology, engineering, and mathematics (STEM) courses at the college level. Specifically, there is keen interest in comparing instructional practices between courses, monitoring changes over time, and mapping observed practices to research-based teaching. While increasingly common observation protocols (Reformed Teaching Observation Protocol [RTOP] and Classroom Observation Protocol in Undergraduate STEM [COPUS]) at the postsecondary level help achieve some of these goals, they also suffer from weaknesses that limit their applicability. In this study, we leverage the strengths of these protocols to provide an easy method that enables the reliable and valid characterization of instructional practices. This method was developed empirically via a cluster analysis using observations of 269 individual class periods, corresponding to 73 different faculty members, 28 different research-intensive institutions, and various STEM disciplines. Ten clusters, called COPUS profiles, emerged from this analysis; they represent the most common types of instructional practices enacted in the classrooms observed for this study. RTOP scores were used to validate the alignment of the 10 COPUS profiles with reformed teaching. Herein, we present a detailed description of the cluster analysis method, the COPUS profiles, and the distribution of the COPUS profiles across various STEM courses at research-intensive universities. © 2015 T. J. Lund et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
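
    The record does not state which clustering algorithm produced the ten COPUS profiles; purely as a hedged sketch, one common approach is to describe each class period by the fraction of observed intervals containing a few COPUS codes and group the periods with k-means (the code names, fractions and cluster count below are illustrative assumptions, not the study's data or method):

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Each row is one observed class period; columns are hypothetical fractions of
    # 2-minute intervals containing the codes Lecturing, Posing questions,
    # Group work and Clicker questions (values are made up).
    X = np.array([
        [0.95, 0.05, 0.00, 0.00],
        [0.70, 0.15, 0.05, 0.10],
        [0.40, 0.20, 0.30, 0.10],
        [0.20, 0.15, 0.55, 0.10],
    ])

    # Standardise the features, then group the class periods into k profiles.
    X_scaled = StandardScaler().fit_transform(X)
    profiles = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_scaled)
    print(dict(enumerate(profiles.labels_)))  # class period -> profile label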

  12. Effect of standardized training on the reliability of the Cochrane risk of bias assessment tool: a study protocol.

    Science.gov (United States)

    da Costa, Bruno R; Resta, Nina M; Beckett, Brooke; Israel-Stahre, Nicholas; Diaz, Alison; Johnston, Bradley C; Egger, Matthias; Jüni, Peter; Armijo-Olivo, Susan

    2014-12-13

    The Cochrane risk of bias (RoB) tool has been widely embraced by the systematic review community, but several studies have reported that its reliability is low. We aim to investigate whether training of raters, including objective and standardized instructions on how to assess risk of bias, can improve the reliability of this tool. We describe the methods that will be used in this investigation and present an intensive standardized training package for risk of bias assessment that could be used by contributors to the Cochrane Collaboration and other reviewers. This is a pilot study. We will first perform a systematic literature review to identify randomized clinical trials (RCTs) that will be used for risk of bias assessment. Using the identified RCTs, we will then do a randomized experiment, where raters will be allocated to two different training schemes: minimal training and intensive standardized training. We will calculate the chance-corrected weighted Kappa with 95% confidence intervals to quantify within- and between-group Kappa agreement for each of the domains of the risk of bias tool. To calculate between-group Kappa agreement, we will use risk of bias assessments from pairs of raters after resolution of disagreements. Between-group Kappa agreement will quantify the agreement between the risk of bias assessment of raters in the training groups and the risk of bias assessment of experienced raters. To compare agreement of raters under different training conditions, we will calculate differences between Kappa values with 95% confidence intervals. This study will investigate whether the reliability of the risk of bias tool can be improved by training raters using standardized instructions for risk of bias assessment. One group of inexperienced raters will receive intensive training on risk of bias assessment and the other will receive minimal training. By including a control group with minimal training, we will attempt to mimic what many review authors
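
    For illustration of the planned statistic only, chance-corrected weighted kappa between a trained rater and an experienced rater can be computed as sketched below; the ratings are invented, and the 95% confidence intervals described in the protocol would still need a bootstrap or analytic variance estimate:

    from sklearn.metrics import cohen_kappa_score

    # Illustrative risk-of-bias ratings for ten trials on one domain,
    # coded 0 = low risk, 1 = unclear, 2 = high risk (made-up values).
    rater_trained = [0, 1, 2, 0, 1, 2, 0, 0, 1, 2]
    rater_expert  = [0, 1, 2, 1, 1, 2, 0, 1, 1, 2]

    # Linear weights penalise larger disagreements more than adjacent ones.
    kappa = cohen_kappa_score(rater_trained, rater_expert, weights="linear")
    print(f"weighted kappa = {kappa:.2f}")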

  13. Quantitation of next generation sequencing library preparation protocol efficiencies using droplet digital PCR assays - a systematic comparison of DNA library preparation kits for Illumina sequencing.

    Science.gov (United States)

    Aigrain, Louise; Gu, Yong; Quail, Michael A

    2016-06-13

    The emergence of next-generation sequencing (NGS) technologies in the past decade has allowed the democratization of DNA sequencing both in terms of price per sequenced base and ease of producing DNA libraries. When it comes to preparing DNA sequencing libraries for Illumina, the current market leader, a plethora of kits are available and it can be difficult for users to determine which kit is the most appropriate and efficient for their applications; the main concerns are not only cost but also minimal bias, yield and time efficiency. We compared 9 commercially available library preparation kits in a systematic manner using the same DNA sample, by probing the amount of DNA remaining after each protocol step using a new droplet digital PCR (ddPCR) assay. This method allows the precise quantification of fragments bearing either adaptors or P5/P7 sequences on both ends just after ligation or PCR enrichment. We also investigated the potential influence of DNA input and DNA fragment size on the final library preparation efficiency. The overall library preparation efficiencies show important variations between the different kits, with the ones combining several steps into a single one exhibiting final yields 4 to 7 times higher than the other kits. Detailed ddPCR data also reveal that the adaptor ligation yield itself varies by more than a factor of 10 between kits, certain ligation efficiencies being so low that they could impair the original library complexity and impoverish the sequencing results. When a PCR enrichment step is necessary, lower adaptor-ligated DNA inputs lead to greater amplification yields, hiding the latent disparity between kits. We describe a ddPCR assay that allows us to probe the efficiency of the most critical step in the library preparation, ligation, and to draw conclusions on which kits are more likely to preserve the sample heterogeneity and reduce the need for amplification.
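
    As a hedged sketch of the per-step yield calculation that such a ddPCR assay enables (the helper function, copy numbers and dilution factor below are illustrative assumptions, not values from the study), the adaptor-ligation yield is simply the fraction of input molecules that carry adaptors on both ends:

    def step_yield(copies_out_per_ul, copies_in_per_ul, dilution_factor=1.0):
        """Fraction of input molecules carrying the expected ends after a step,
        correcting for any dilution applied before the ddPCR measurement."""
        return copies_out_per_ul * dilution_factor / copies_in_per_ul

    # Illustrative ddPCR readings (copies/uL) for one library preparation.
    input_fragments = 2.0e5   # fragmented, end-repaired input
    adaptor_ligated = 2.4e4   # fragments with adaptors on both ends, post-ligation

    print(f"adaptor ligation yield: {step_yield(adaptor_ligated, input_fragments):.1%}")  # 12.0%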

  14. Adaptation of a MR imaging protocol into a real-time clinical biometric ultrasound protocol for persons with spinal cord injury at risk for deep tissue injury: A reliability study.

    Science.gov (United States)

    Swaine, Jillian M; Moe, Andrew; Breidahl, William; Bader, Daniel L; Oomens, Cees W J; Lester, Leanne; O'Loughlin, Edmond; Santamaria, Nick; Stacey, Michael C

    2018-02-01

    High strains in soft tissues that overlie bony prominences are considered a risk factor for pressure ulcers (PUs) following spinal cord impairment (SCI) and have been computed using Finite Element methods (FEM). The aim of this study was to translate an MRI protocol into ultrasound (US) and determine the between-operator reliability of expert sonographers measuring the diameter of the inferior curvature of the ischial tuberosity (IT) and the thickness of the overlying soft tissue layers in able-bodied (AB) and SCI participants using real-time ultrasound. Part 1: Fourteen AB participants (mean age 36.7 ± 12.09 years; 7 males and 7 females) had their 3 soft tissue layers (tendon/muscle, skin/fat and total soft tissue) and the diameter of the IT in its short and long axes measured independently by 2 sonographers in loaded and unloaded sitting. Part 2: Nineteen participants with SCI were screened; three were excluded due to abnormal skin signs, and eight participants (42%) were excluded for abnormal US signs with normal skin. Eight SCI participants (mean age 31.6 ± 13.6 years, all male; 4 paraplegic and 4 tetraplegic) were measured by the same sonographers for skin, fat, tendon, muscle and total thickness. Skin/fat and tendon/muscle were computed. AB between-operator reliability was good (ICC = 0.81-0.90) for the 3 soft tissue layers in unloaded and loaded sitting and poor for both the IT short and long axes (ICC = -0.028 and -0.01). SCI between-operator reliability was good in unloaded and loaded sitting for total, muscle, fat, skin/fat and tendon/muscle (ICC = 0.75-0.97) and poor for tendon (ICC = 0.26 unloaded and ICC = -0.71 loaded) and skin (ICC = 0.37 unloaded and ICC = 0.10 loaded). An MRI protocol was successfully adapted into a reliable 3 soft tissue layer model and could be used in a 2-D FEM model designed to estimate soft tissue strain as a novel risk factor for the development of a PU. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
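
    The record summarises between-operator agreement as intraclass correlation coefficients without naming the ICC model; as a hedged sketch only, one widely used variant for two raters measuring the same subjects is ICC(2,1) (two-way random effects, absolute agreement, single measures), computed here from the classic ANOVA decomposition with invented thickness values:

    import numpy as np

    def icc_2_1(scores):
        """ICC(2,1) for a subjects-by-raters matrix of measurements."""
        n, k = scores.shape
        grand = scores.mean()
        row_means = scores.mean(axis=1)   # per-subject means
        col_means = scores.mean(axis=0)   # per-rater means
        msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # subjects mean square
        msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # raters mean square
        sse = np.sum((scores - row_means[:, None] - col_means[None, :] + grand) ** 2)
        mse = sse / ((n - 1) * (k - 1))                         # residual mean square
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    # Illustrative soft-tissue thickness measurements (mm) by two sonographers.
    measurements = np.array([[22.1, 21.8],
                             [18.4, 18.9],
                             [25.0, 24.2],
                             [19.7, 20.1],
                             [23.3, 23.0]])
    print(f"ICC(2,1) = {icc_2_1(measurements):.2f}")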

  15. Appraisal of the sensitising potential of orally and dermally administered mercaptobenzothiazole by a biphasic protocol of the local lymph node assay.

    Science.gov (United States)

    Ahuja, Varun; Wanner, Reinhard; Platzek, Thomas; Stahlmann, Ralf

    2009-10-01

    Mercaptobenzothiazole (MBT) is used while manufacturing natural rubber products. Our study deals with assessing its allergenic potential following dermal and oral routes of exposure, using a biphasic local lymph node assay (LLNA). Female Balb/c mice were treated with MBT (dermally 3, 10, 30% concentrations in DMSO; orally 1, 10, 100 mg/kg doses in corn oil) on the back (dermal study) or through oral administration (oral study) on days 1-3 followed by auricular application of 3, 10 and 30% concentrations, respectively, on days 15-17. End points determined on day 19 included ear thickness, ear punch weight, lymph node weight, lymph node cell count, and lymphocyte subpopulations (CD4+, CD8+, CD45+). After dermal application of 3% or 10% solution, a significant increase in cell count and lymph node weight along with significant decrease in CD8+ cells was observed. After initial oral administration of 1 mg/kg, we noticed a significant amplification in cell count. Following oral administration of 10 mg/kg, we observed a similar increase in cell count and lymph node weight. The results of our study show that the modified biphasic LLNA protocol can be used to study the sensitising potential of a compound also following the oral route of exposure.

  16. Validation of a standard forensic anthropology examination protocol by measurement of applicability and reliability on exhumed and archive samples of known biological attribution.

    Science.gov (United States)

    Francisco, Raffaela Arrabaça; Evison, Martin Paul; Costa Junior, Moacyr Lobo da; Silveira, Teresa Cristina Pantozzi; Secchieri, José Marcelo; Guimarães, Marco Aurelio

    2017-10-01

    Forensic anthropology makes an important contribution to human identification and to assessment of the causes and mechanisms of death and body disposal in criminal and civil investigations, including those related to atrocity, disaster and trafficking victim identification. The methods used are comparative, relying on assignment of questioned material to categories observed in standard reference material of known attribution. Reference collections typically originate in Europe and North America, and are not necessarily representative of contemporary global populations. Methods based on them must be validated when applied to novel populations. This study describes the validation of a standardized forensic anthropology examination protocol by application to two contemporary Brazilian skeletal samples of known attribution. One sample (n=90) was collected from exhumations following 7-35 years of burial and the second (n=30) was collected following successful investigations in routine case work. The study presents measurement of (1) the applicability of each of the methods used and (2) the reliability with which the biographic parameters were assigned in each case. The results are discussed with reference to published assessments of methodological reliability regarding sex, age and, in particular, ancestry estimation. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Use of a standardized JaCVAM in vivo rat comet assay protocol to assess the genotoxicity of three coded test compounds; ampicillin trihydrate, 1,2-dimethylhydrazine dihydrochloride, and N-nitrosodimethylamine.

    Science.gov (United States)

    McNamee, J P; Bellier, P V

    2015-07-01

    As part of the Japanese Center for the Validation of Alternative Methods (JaCVAM)-initiated international validation study of the in vivo rat alkaline comet assay (comet assay), our laboratory examined ampicillin trihydrate (AMP), 1,2-dimethylhydrazine dihydrochloride (DMH), and N-nitrosodimethylamine (NDA) using a standard comet assay validation protocol (v14.2) developed by the JaCVAM validation management team (VMT). Coded samples were received by our laboratory along with basic MSDS information. Solubility analysis and range-finding experiments of the coded test compounds were conducted for dose selection. Animal dosing schedules, comet assay processing and analysis, and statistical analysis were conducted in accordance with the standard protocol. Based upon our blinded evaluation, AMP did not exhibit evidence of genotoxicity in either the rat liver or stomach. However, both NDA and DMH were observed to cause a significant increase in % tail DNA in the rat liver at all dose levels tested. While acute hepatotoxicity was observed for these compounds in the high dose group, in the investigators' opinion there were a sufficient number of consistently damaged/measurable cells in the medium and low dose groups to judge these compounds as genotoxic. There was no evidence of genotoxicity from either NDA or DMH in the rat stomach. In conclusion, our laboratory observed increased DNA damage from two blinded test compounds in rat liver (later identified as genotoxic carcinogens), while no evidence of genotoxicity was observed for the third blinded test compound (later identified as a non-genotoxic non-carcinogen). These data support the use of a standardized protocol of the in vivo comet assay as a cost-effective alternative genotoxicity assay for regulatory testing purposes. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.

  18. Generation of a reliable full-length cDNA of infectious Tembusu virus using a PCR-based protocol.

    Science.gov (United States)

    Liang, Te; Liu, Xiaoxiao; Cui, Shulin; Qu, Shenghua; Wang, Dan; Liu, Ning; Wang, Fumin; Ning, Kang; Zhang, Bing; Zhang, Dabing

    2016-02-02

    Full-length cDNA of Tembusu virus (TMUV) cloned in a plasmid has been found unstable in bacterial hosts. Using a PCR-based protocol, we generated a stable full-length cDNA of TMUV. Different cDNA fragments of TMUV were amplified by reverse transcription (RT)-PCR and cloned into plasmids. Fragmented cDNAs were amplified and assembled by fusion PCR to produce a full-length cDNA using the recombinant plasmids as templates. Subsequently, a full-length RNA was transcribed from the full-length cDNA in vitro and transfected into BHK-21 cells; infectious viral particles were rescued successfully. Following several passages in BHK-21 cells, the rescued virus was compared with the parental virus by genetic marker checks, growth curve determinations and animal experiments. These assays clearly demonstrated the genetic and biological stabilities of the rescued virus. The present work will be useful for future investigations on the molecular mechanisms involved in replication and pathogenesis of TMUV. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Immunochemical protocols

    National Research Council Canada - National Science Library

    Pound, John D

    1998-01-01

    ... easy and important refinements often are not published. This much anticipated 2nd edition of Immunochemical Protocols therefore aims to provide a user-friendly up-to-date handbook of reliable techniques selected to suit the needs of molecular biologists. It covers the full breadth of the relevant established immunochemical methods, from protein blotting and immunoa...

  20. Quality Improvement to Demonstrate the Lack of Reliability of the Human Papillomavirus mRNA Assay to Identify Women With Latent Human Papillomavirus Infections.

    Science.gov (United States)

    Cotton, Sarah; Brown, Robert E; Nugent, Elizabeth K; Robazetti, Sonia C; Berens, Pamela D; Smith, Judith A

    2018-04-01

    To assess the consistency between human papillomavirus (HPV) mRNA testing in women with a history of previous HPV infections diagnosed by HPV DNA assay and the potential effects on follow-up HPV screening. This was a quality improvement study that used data from a pathology laboratory software database reviewed from November 2014 to June 2016 to identify female patients aged 30 years or older with greater than one HPV-positive result, including one or more HPV mRNA assay results and one or more documented HPV DNA assay results for comparison. Previous correlative cytology and colposcopic histopathology were also documented. American College of Obstetricians and Gynecologists' cervical cancer screening guidelines were used to compare potential differences in follow-up recommendations. Four hundred twenty-five charts for female patients 30 years of age or older were identified with one or more prior high-risk HPV infections by DNA assay. There was a 69.3% difference in HPV mRNA results compared with previous HPV DNA-positive results. There was a potential change in follow-up for 71.7% of patients with one prior high-risk-HPV-positive result and 60.0% of patients with two or more prior high-risk HPV-positive results. There were 231 colposcopy reports evaluated in this study. Of these, 62 (26.8%) were abnormal colposcopy reports, including 45 low-grade squamous intraepithelial lesions, 15 high-grade squamous intraepithelial lesions, and two cancers. Twenty-five (40.3%) abnormal colposcopy findings were in patients with a history of at least two prior HPV DNA-positive results and a report of currently being HPV-negative with the mRNA assay. The HPV mRNA assays are less sensitive for detection of latent HPV infections compared with HPV DNA assays. Based on these data and the potential change in follow-up care, the HPV mRNA assay should not be used as a primary screening tool for cervical cancer. Many pathology laboratories have shifted to using the HPV mRNA assay

  1. Monoclonal antibody-based dipstick assay: a reliable field applicable technique for diagnosis of Schistosoma mansoni infection using human serum and urine samples.

    Science.gov (United States)

    Demerdash, Zeinab; Mohamed, Salwa; Hendawy, Mohamed; Rabia, Ibrahim; Attia, Mohy; Shaker, Zeinab; Diab, Tarek M

    2013-02-01

    A field applicable diagnostic technique, the dipstick assay, was evaluated for its sensitivity and specificity in diagnosing human Schistosoma mansoni infection. A monoclonal antibody (mAb) against S. mansoni adult worm tegumental antigen (AWTA) was employed in dipstick and sandwich ELISA for detection of circulating schistosome antigen (CSA) in both serum and urine samples. Based on clinical and parasitological examinations, 60 S. mansoni-infected patients, 30 patients infected with parasites other than schistosomes, and 30 uninfected healthy individuals were selected. The sensitivity and specificity of the dipstick assay in urine samples were 86.7% and 90.0%, respectively, compared to 90.0% sensitivity and 91.7% specificity for sandwich ELISA. In serum samples, the sensitivity and specificity were 88.3% and 91.7% for the dipstick assay vs. 91.7% and 95.0% for sandwich ELISA, respectively. The diagnostic efficacy of the dipstick assay in urine and serum samples was 88.3% and 90.0%, while it was 90.8% and 93.3% for sandwich ELISA, respectively. The diagnostic indices of the dipstick assay and ELISA, either in serum or in urine, were statistically comparable (P>0.05). In conclusion, the dipstick assay offers a simple, rapid, non-invasive alternative for detecting CSA, or a complement to stool examination, especially in field studies.
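
    For illustration, the reported indices follow directly from a 2x2 contingency table; the counts below are chosen to be consistent with the urine dipstick percentages quoted above (60 infected and 60 non-infected subjects) and serve only to make the arithmetic explicit:

    def diagnostic_indices(tp, fp, fn, tn):
        """Sensitivity, specificity and overall diagnostic efficacy from a 2x2 table."""
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "efficacy": (tp + tn) / (tp + fp + fn + tn),
        }

    # 52 true positives and 54 true negatives reproduce 86.7% sensitivity,
    # 90.0% specificity and 88.3% diagnostic efficacy.
    print(diagnostic_indices(tp=52, fp=6, fn=8, tn=54))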

  2. Nano-immunosafety: issues in assay validation

    International Nuclear Information System (INIS)

    Boraschi, Diana; Italiani, Paola; Oostingh, Gertie J; Duschl, Albert; Casals, Eudald; Puntes, Victor F; Nelissen, Inge

    2011-01-01

    Assessing the safety of engineered nanomaterials for human health must include a thorough evaluation of their effects on the immune system, which is responsible for defending the integrity of our body from damage and disease. An array of robust and representative assays should be set up and validated, which could be predictive of the effects of nanomaterials on immune responses. In a trans-European collaborative work, in vitro assays have been developed to this end. In vitro tests have been preferred for their suitability to standardisation and easier applicability. Adapting classical assays to testing the immunotoxicological effects of nanoparticulate materials has raised a series of issues that needed to be appropriately addressed in order to ensure reliability of results. Besides the exquisitely immunological problem of selecting representative endpoints predictive of the risk of developing disease, assay results turned out to be significantly biased by artefactual interference of the nanomaterials or contaminating agents with the assay protocol. Having addressed such problems, a series of robust and representative assays have been developed that describe the effects of engineered nanoparticles on professional and non-professional human defence cells. Two of such assays are described here, one based on primary human monocytes and the other employing human lung epithelial cells transfected with a reporter gene.

  3. An improved 96-well turbidity assay for T4 lysozyme activity.

    Science.gov (United States)

    Toro, Tasha B; Nguyen, Thao P; Watt, Terry J

    2015-01-01

    T4 lysozyme (T4L) is an important model system for investigating the relationship between protein structure and function. Despite being extensively studied, a reliable, quantitative activity assay for T4L has not been developed. Here, we present an improved T4L turbidity assay as well as an affinity-based T4L expression and purification protocol. This assay is designed for 96-well format and utilizes conditions amenable for both T4L and other lysozymes. This protocol enables easy, efficient, and quantitative characterization of T4L variants and allows comparison between different lysozymes. Our method:
    • Is applicable for all lysozymes, with enhanced sensitivity for T4 lysozyme compared to other 96-well plate turbidity assays;
    • Utilizes standardized conditions for comparing T4 lysozyme variants and other lysozymes; and
    • Incorporates a simplified expression and purification protocol for T4 lysozyme.

  4. A multi-laboratory evaluation of a common in vitro pepsin digestion assay protocol used in assessing the safety of novel proteins

    NARCIS (Netherlands)

    Thomas, K.; Aalbers, M.; Bannon, G. A.; Bartels, M.; Dearman, R. J.; Esdaile, D. J.; Fu, T. J.; Glatt, C. M.; Hadfield, N.; Hatzos, C.; Hefle, S. L.; Heylings, J. R.; Goodman, R. E.; Henry, B.; Herouet, C.; Holsapple, M.; Ladics, G. S.; Landry, T. D.; MacIntosh, S. C.; Rice, E. A.; Privalle, L. S.; Steiner, H. Y.; Teshima, R.; van Ree, R.; Woolhiser, M.; Zawodny, J.

    2004-01-01

    Rationale. Evaluation of the potential allergenicity of proteins derived from genetically modified foods has involved a weight of evidence approach that incorporates an evaluation of protein digestibility in pepsin. Currently, there is no standardized protocol to assess the digestibility of proteins

  5. Efficacy of T2 Magnetic Resonance Assay in Monitoring Candidemia after Initiation of Antifungal Therapy: the Serial Therapeutic and Antifungal Monitoring Protocol (STAMP) Trial.

    Science.gov (United States)

    Mylonakis, Eleftherios; Zacharioudakis, Ioannis M; Clancy, Cornelius J; Nguyen, M Hong; Pappas, Peter G

    2018-04-01

    The performance of blood culture for monitoring candidemia clearance is hampered by its low sensitivity, especially during antifungal therapy. The T2 magnetic resonance (T2MR) assay combines magnetic resonance with nanotechnology to identify whole Candida species cells. A multicenter clinical trial studied the performance of T2MR in monitoring candidemia clearance compared to blood culture. Adults with a blood culture positive for yeast were enrolled and had blood cultures and T2MR testing performed on prespecified days. Thirty-one patients completed the trial. Thirteen of the 31 patients (41.9%) had at least one positive surveillance T2MR and/or blood culture result. All positive blood cultures (7/7 [100%]) had an accompanying positive T2MR result with concordance in the identified Candida sp., while only 7/23 (30.4%) T2MR results had an accompanying positive blood culture. There was one case of discordance in species identification between T2MR and the preenrollment blood culture with evidence to support deep-seated infection by the Candida spp. detected by the T2MR assay. Based on the log rank test, there was a statistically significant improvement in posttreatment surveillance using the T2MR assay compared to blood culture (P = 0.004). Limitations of the study include the small sample size and lack of outcome data. In conclusion, the T2MR assay significantly outperformed blood cultures for monitoring the clearance of candidemia in patients receiving antifungal therapy and may be useful in determining adequate source control, timing for deescalation, and optimal duration of treatment. However, further studies are needed to determine the viability of Candida species cells detected by the T2MR assay and correlate the results with patient outcomes. (This study is registered at ClinicalTrials.gov under registration number NCT02163889.) Copyright © 2018 Mylonakis et al.

  6. Comet assay with gill cells of Mytilus galloprovincialis end point tools for biomonitoring of water antibiotic contamination: Biological treatment is a reliable process for detoxification.

    Science.gov (United States)

    Mustapha, Nadia; Zouiten, Amina; Dridi, Dorra; Tahrani, Leyla; Zouiten, Dorra; Mosrati, Ridha; Cherif, Ameur; Chekir-Ghedira, Leila; Mansour, Hedi Ben

    2016-04-01

    This article investigates the ability of Pseudomonas peli to treat industrial pharmaceutical wastewater (PW). Liquid chromatography-tandem mass spectrometry (LC-MS/MS) analysis revealed the presence, in this PW, of a variety of antibiotics such as sulfathiazole, sulfamoxole, norfloxacin, cloxacillin, doxycycline, and cefquinome. P. peli grew readily in PW and induced a remarkable increase in chemical oxygen demand and biochemical oxygen demand (140.31 and 148.51%, respectively). On the other hand, the genotoxicity of the studied effluent, before and after 24 h of shaking incubation with P. peli, was evaluated in vivo in the Mediterranean wild mussel Mytilus galloprovincialis using the comet assay for quantification of DNA fragmentation. Results show that PW exhibited statistically significant genotoxicity at the tested doses (expressed in ml/kg body weight (b.w.), including 0.33 ml/kg b.w. of PW). However, genotoxicity decreased strongly when tested with the PW obtained after incubation with P. peli. We can conclude that comet assay genotoxicity end points are useful tools to biomonitor the physicochemical and biological quality of water. It could also be concluded that P. peli can treat and detoxify the studied PW. © The Author(s) 2013.

  7. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book starts by asking what reliability is, covering the origin of reliability problems, the definition of reliability, and the uses of reliability. It also deals with probability and the calculation of reliability; reliability functions and failure rates; probability distributions used in reliability; estimation of MTBF; probability distribution processes; down time, maintainability and availability; breakdown maintenance and preventive maintenance; design for reliability, including reliability prediction and statistics; reliability testing; and reliability data and the design and management of reliability.

  8. Performance of a commercial assay for the diagnosis of influenza A (H1N1) infection in comparison to the Centers for Disease Control and Prevention protocol of real time RT-PCR

    Directory of Open Access Journals (Sweden)

    María G Barbás

    2012-03-01

    Full Text Available At the time of the influenza A (H1N1) emergency, the WHO responded with remarkable speed by releasing guidelines and a protocol for a real-time RT-PCR assay (rRT-PCR). The aim of the present study was to evaluate the performance of the "Real Time Ready Influenza A/H1N1 Detection Set" (June 2009, Roche) kit in comparison to the CDC reference rRT-PCR protocol. The overall sensitivity of the Roche assay for detection of the Inf A gene in the presence or absence of the H1 gene was 74.5%. The sensitivity for detecting samples that were only positive for the Inf A gene (absence of the H1 gene) was 53.3%, whereas the sensitivity for H1N1-positive samples (presence of the Inf A gene and any other swine gene) was 76.4%. The specificity of the assay was 97.1%. A new version of the kit (November 2009) is now available, and a recent evaluation of its performance showed good sensitivity to detect pandemic H1N1 compared to other molecular assays.

  9. Development of highly reliable in silico SNP resource and genotyping assay from exome capture and sequencing: an example from black spruce (Picea mariana).

    Science.gov (United States)

    Pavy, Nathalie; Gagnon, France; Deschênes, Astrid; Boyle, Brian; Beaulieu, Jean; Bousquet, Jean

    2016-03-01

    Picea mariana is a widely distributed boreal conifer across Canada and the subject of advanced breeding programmes for which population genomics and genomic selection approaches are being developed. Targeted sequencing was achieved after capturing the P. mariana exome with probes designed from the sequenced transcriptome of Picea glauca, a distant relative. A high capture efficiency of 75.9% was reached although spruce has a complex and large genome including gene sequences interspersed with some long introns. The results confirmed the relevance of using probes from a congeneric species to successfully perform interspecific exome capture in the genus Picea. A bioinformatics pipeline was developed including stringent criteria that helped detect a set of 97,075 highly reliable in silico SNPs. These SNPs were distributed across 14,909 genes. Part of an Infinium iSelect array was used to estimate the rate of true positives by validating 4267 of the predicted in silico SNPs by genotyping trees from P. mariana populations. The true positive rate was 96.2% for in silico SNPs, compared to a genotyping success rate of 96.7% for a set of 1115 P. mariana control SNPs recycled from previous genotyping arrays. These results indicate the high success rate of the genotyping array and the relevance of the selection criteria used to delineate the new P. mariana in silico SNP resource. Furthermore, the in silico SNPs were generally of medium to high frequency in natural populations, thus providing high informative value for future population genomics applications. © 2015 John Wiley & Sons Ltd.

  10. A rapid and highly sensitive protocol for the detection of Escherichia coli O157:H7 based on immunochromatography assay combined with the enrichment technique of immunomagnetic nanoparticles

    Directory of Open Access Journals (Sweden)

    Qi H

    2011-11-01

    Full Text Available Hui Qi,1 Zhen Zhong,1 Han-Xin Zhou,1 Chun-Yan Deng,1 Hai Zhu,2 Jin-Feng Li,2 Xi-Li Wang,2 Fu-Rong Li1,3 (1Clinical Medical Research Center, The Second Clinical Medical College (Shenzhen People's Hospital), Jinan University; 2Shenzhen Bioeasy Biotechnologies Co, Ltd; 3Shenzhen Institute of Gerontology, Shenzhen, People's Republic of China). Background: Escherichia coli O157:H7 (E. coli O157:H7) is an important pathogenic bacterium that threatens human health. A rapid, simple, highly sensitive, and specific method for the detection of E. coli O157:H7 is necessary. Methods: In the present study, immunomagnetic nanoparticles (IMPs) were prepared with nanopure iron as the core, coated with E. coli O157:H7 polyclonal antibodies. These IMPs were used in combination with an immunochromatographic assay (ICA) to establish highly sensitive and rapid kits (IMPs+ICA) to detect E. coli O157:H7. The kits were then used to detect E. coli O157:H7 in 150 food samples and were compared with conventional ICA to evaluate their efficacy. Results: The average diameter of the IMPs was 56 nm and the amount of adsorbed antibodies was 106.0 µg/mg. The sensitivity of ICA and IMPs+ICA was 10⁵ colony-forming units (CFUs)/mL and 10³ CFUs/mL, respectively, for purified E. coli O157:H7 solution. The sensitivity of IMPs+ICA was thus increased by two orders of magnitude, and its specificity was similar to that of ICA. Conclusion: The kits have the potential to offer important social and economic benefits in the screening, monitoring, and control of food safety. Keywords: colloidal gold, immunomagnetic nanoparticles, Escherichia coli O157:H7, immunochromatographic assay

  11. Fast and reliable DNA extraction protocol for identification of species in raw and processed meat products sold on the commercial market

    Directory of Open Access Journals (Sweden)

    Alvarado Pavel Espinoza

    2017-08-01

    Full Text Available In this work, a protocol for the extraction of DNA from the meat of different animals (beef, pork, and horse) was established. The protocol utilized TE lysis buffer with varying concentrations of phenol and chloroform as a base reagent. Reactions were carried out for varying time periods and under differing temperatures. All samples analyzed were obtained from commercial grade meat sourced from the local region. Twelve samples were used for methodological optimization, with 30 repetitions per sample. Once optimized, purity results for the three species were 1.7, with a concentration (determined spectrophotometrically at 260 nm) of 100 μl/ml of DNA. The protocol was tested using 465 different meat samples from different animal species. All meat used was fresh and processed. Results showed a purity of 1.35 ± 0.076 and a DNA concentration of 70 ± 0.31 μl for a time duration of 1.5 hours. These results were tested by polymerase chain reaction (PCR), as reported by several authors. The extracts were tested in different PCR reactions using primers specific for horse. Results suggest that there were 39 positive samples. The proposed methodology provides an efficient way to obtain DNA of a concentration and purity suitable for amplification by PCR.

  12. A comprehensive protocol to diagnose and treat pain of muscular origin may successfully and reliably decrease or eliminate pain in a chronic pain population.

    Science.gov (United States)

    Marcus, Norman J; Gracely, Edward J; Keefe, Kelly O

    2010-01-01

    A comprehensive protocol is presented to identify muscular causes of regional pain syndromes utilizing an electrical stimulus in lieu of palpation, and combining elements of Prolotherapy with trigger point injections. One hundred seventy-six consecutive patients were evaluated for the presence of muscle pain by utilizing an electrical stimulus produced by the Muscle Pain Detection Device. The diagnosis of "Muscle Pain Amenable to Injection" (MPAI), rather than trigger points, was made if pain was produced for the duration of the stimulation. If MPAI was found, muscle tendon injections (MTI) were offered to patients along with post-MTI physical therapy, providing neuromuscular electrical stimulation followed by a validated exercise program [1]. A control group, evaluated 1 month prior to their actual consultation/evaluation when muscle pain was identified but not yet treated, was used for comparison. Forty-five patients who met criteria completed treatment. Patients' scores on the Brief Pain Inventory decreased an average of 62%; median 70% (P < 0.001) for pain severity and 68%; median 85% (P < 0.001) for pain interference one month following treatment. These changes were significantly greater (P < 0.001) than those observed in the untreated controls. A protocol incorporating an easily reproducible electrical stimulus to diagnose a muscle causing pain in a region of the body followed by an injection technique that involves the entirety of the muscle, and post injection restoration of muscle function, can successfully eliminate or significantly reduce regional pain present for years.

  13. Test-retest and interobserver reliability of quantitative sensory testing according to the protocol of the German Research Network on Neuropathic Pain (DFNS): a multi-centre study.

    Science.gov (United States)

    Geber, Christian; Klein, Thomas; Azad, Shahnaz; Birklein, Frank; Gierthmühlen, Janne; Huge, Volker; Lauchart, Meike; Nitzsche, Dorothee; Stengel, Maike; Valet, Michael; Baron, Ralf; Maier, Christoph; Tölle, Thomas; Treede, Rolf-Detlef

    2011-03-01

    Quantitative sensory testing (QST) is an instrument to assess positive and negative sensory signs, helping to identify mechanisms underlying pathologic pain conditions. In this study, we evaluated the test-retest reliability (TR-R) and the interobserver reliability (IO-R) of QST in patients with sensory disturbances of different etiologies. In 4 centres, 60 patients (37 male and 23 female, 56.4 ± 1.9 years) with lesions or diseases of the somatosensory system were included. QST comprised 13 parameters including detection and pain thresholds for thermal and mechanical stimuli. QST was performed in the clinically most affected test area and a less or unaffected control area in a morning and an afternoon session on 2 consecutive days by examiner pairs (4 QSTs/patient). For both TR-R and IO-R, there were high correlations (r=0.80-0.93) in the affected test area, except for wind-up ratio (TR-R: r=0.67; IO-R: r=0.56) and paradoxical heat sensations (TR-R: r=0.35; IO-R: r=0.44). Mean IO-R (r=0.83, 31% unexplained variance) was slightly lower than TR-R (r=0.86, 26% unexplained variance). Reliability was higher in the affected test area (TR-R: r=0.86; IO-R: r=0.83) than in the control area (TR-R: r=0.79; IO-R: r=0.71), supporting the reliability of QST. We conclude that standardized QST performed by trained examiners is a valuable diagnostic instrument with good test-retest and interobserver reliability within 2 days. With standardized training, observer bias is much lower than random variance. Quantitative sensory testing performed by trained examiners is a valuable diagnostic instrument with good interobserver and test-retest reliability for use in patients with sensory disturbances of different etiologies to help identify mechanisms of neuropathic and non-neuropathic pain. Copyright © 2010 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  14. Is the Scale for Measuring Motivational Interviewing Skills a valid and reliable instrument for measuring the primary care professionals motivational skills?: EVEM study protocol.

    Science.gov (United States)

    Pérula, Luis Á; Campiñez, Manuel; Bosch, Josep M; Barragán Brun, Nieves; Arboniés, Juan C; Bóveda Fontán, Julia; Martín Alvarez, Remedios; Prados, Jose A; Martín-Rioboó, Enrique; Massons, Josep; Criado, Margarita; Fernández, José Á; Parras, Juan M; Ruiz-Moral, Roger; Novo, Jesús M

    2012-11-22

    Lifestyle is one of the main determinants of people's health. It is essential to find the most effective prevention strategies to encourage behavioral changes in patients. Many theories are available that explain change or adherence to specific health behaviors in subjects; in this sense, Motivational Interviewing has increasingly gained relevance. Few well-validated instruments are available for measuring doctors' communication skills, and more specifically Motivational Interviewing. The hypothesis of this study is that the Scale for Measuring Motivational Interviewing Skills (EVEM questionnaire) is a valid and reliable instrument for measuring primary care professionals' skills in achieving behavior change in patients. To test the hypothesis we have designed a prospective, observational, multi-center study to validate a measuring instrument. Scope: thirty-two primary care centers in Spain. Sampling and size: (a) face and consensual validity: a group composed of 15 experts in Motivational Interviewing; (b) assessment of the psychometric properties of the scale: 50 physician-patient encounters will be videoed; a total of 162 interviews will be conducted with six standardized patients, and another 200 interviews will be conducted with 50 real patients (n=362). Four physicians will be specially trained to assess 30 interviews randomly selected to test the scale's reproducibility. Measurements to test the hypothesis: (a) face validity: development of a draft questionnaire based on a theoretical model, using Delphi-type methodology with experts; (b) scale psychometric properties: intraobserver evaluation of video-recorded interviews: content-scalability validity (Exploratory Factor Analysis), internal consistency (Cronbach alpha), intra-/inter-observer reliability (Kappa index, intraclass correlation coefficient, Bland & Altman methodology), generalizability, construct validity and sensitivity to change (Pearson product-moment correlation)

  15. Is the Scale for Measuring Motivational Interviewing Skills a valid and reliable instrument for measuring the primary care professionals motivational skills?: EVEM study protocol

    Directory of Open Access Journals (Sweden)

    Pérula Luis Á

    2012-11-01

    Full Text Available Abstract. Background: Lifestyle is one of the main determinants of people's health. It is essential to find the most effective prevention strategies to encourage behavioral changes in patients. Many theories are available that explain change or adherence to specific health behaviors in subjects; in this sense, Motivational Interviewing has increasingly gained relevance. Few well-validated instruments are available for measuring doctors' communication skills, and more specifically Motivational Interviewing. Methods/Design: The hypothesis of this study is that the Scale for Measuring Motivational Interviewing Skills (EVEM) questionnaire is a valid and reliable instrument for measuring primary care professionals' skills in achieving behavior change in patients. To test the hypothesis we have designed a prospective, observational, multi-center study to validate a measuring instrument. Scope: thirty-two primary care centers in Spain. Sampling and size: (a) face and consensual validity: a group composed of 15 experts in Motivational Interviewing; (b) assessment of the psychometric properties of the scale: 50 physician-patient encounters will be videoed; a total of 162 interviews will be conducted with six standardized patients, and another 200 interviews will be conducted with 50 real patients (n=362). Four physicians will be specially trained to assess 30 interviews randomly selected to test the scale's reproducibility. Measurements to test the hypothesis: (a) face validity: development of a draft questionnaire based on a theoretical model, using Delphi-type methodology with experts; (b) scale psychometric properties: intraobserver evaluation of video-recorded interviews: content-scalability validity (Exploratory Factor Analysis), internal consistency (Cronbach alpha), intra-/inter-observer reliability (Kappa index, intraclass correlation coefficient, Bland & Altman methodology), generalizability, construct validity and sensitivity to change (Pearson product-moment correlation)
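
    As a hedged sketch of one of the psychometric properties listed above (internal consistency via Cronbach's alpha), with invented item scores rather than EVEM data:

    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha; `items` has one row per interview and one column per scale item."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # Illustrative item scores (0-2) for six videotaped interviews and four items.
    scores = np.array([[2, 2, 1, 2],
                       [1, 1, 1, 0],
                       [2, 2, 2, 2],
                       [0, 1, 0, 0],
                       [1, 2, 1, 1],
                       [2, 1, 2, 2]])
    print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")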

  16. Critical issues with the in vivo comet assay: A report of the comet assay working group in the 6th International Workshop on Genotoxicity Testing (IWGT).

    Science.gov (United States)

    Speit, Günter; Kojima, Hajime; Burlinson, Brian; Collins, Andrew R; Kasper, Peter; Plappert-Helbig, Ulla; Uno, Yoshifumi; Vasquez, Marie; Beevers, Carol; De Boeck, Marlies; Escobar, Patricia A; Kitamoto, Sachiko; Pant, Kamala; Pfuhler, Stefan; Tanaka, Jin; Levy, Dan D

    2015-05-01

    As a part of the 6th IWGT, an expert working group on the comet assay evaluated critical topics related to the use of the in vivo comet assay in regulatory genotoxicity testing. The areas covered were: identification of the domain of applicability and regulatory acceptance, identification of critical parameters of the protocol and attempts to standardize the assay, experience with combination and integration with other in vivo studies, demonstration of laboratory proficiency, sensitivity and power of the protocol used, use of different tissues, freezing of samples, and choice of appropriate measures of cytotoxicity. The standard protocol detects various types of DNA lesions but it does not detect all types of DNA damage. Modifications of the standard protocol may be used to detect additional types of specific DNA damage (e.g., cross-links, bulky adducts, oxidized bases). In addition, the working group identified critical parameters that should be carefully controlled and described in detail in every published study protocol. In vivo comet assay results are more reliable if they were obtained in laboratories that have demonstrated proficiency. This includes demonstration of adequate response to vehicle controls and an adequate response to a positive control for each tissue being examined. There was a general agreement that freezing of samples is an option but more data are needed in order to establish generally accepted protocols. With regard to tissue toxicity, the working group concluded that cytotoxicity could be a confounder of comet results. It is recommended to look at multiple parameters such as histopathological observations, organ-specific clinical chemistry as well as indicators of tissue inflammation to decide whether compound-specific toxicity might influence the result. The expert working group concluded that the alkaline in vivo comet assay is a mature test for the evaluation of genotoxicity and can be recommended to regulatory agencies for use

  17. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It describes the definition and importance of reliability, the development of reliability engineering, failure rates and failure probability density functions and their types, CFR and the exponential distribution, IFR and the normal and Weibull distributions, maintainability and availability, reliability testing and reliability estimation for the exponential, normal and Weibull distribution types, reliability sampling tests, system reliability, reliability design, and functional failure analysis by FTA.

  18. Accuracy and reliability of the sensory test performed using the laryngopharyngeal endoscopic esthesiometer and rangefinder in patients with suspected obstructive sleep apnoea hypopnoea: protocol for a prospective double-blinded, randomised, exploratory study.

    Science.gov (United States)

    Giraldo-Cadavid, Luis Fernando; Bastidas, Alirio Rodrigo; Padilla-Ortiz, Diana Marcela; Concha-Galan, Diana Carolina; Bazurto, María Angelica; Vargas, Leslie

    2017-08-21

    Patients with obstructive sleep apnoea hypopnoea syndrome (OSA) might have varying degrees of laryngopharyngeal mechanical hyposensitivity that might impair the brain's capacity to prevent airway collapse during sleep. However, this knowledge about sensory compromises in OSA comes from studies performed using methods with little evidence of their validity. Hence, the purpose of this study is to assess the reliability and accuracy of the measurement of laryngopharyngeal mechanosensitivity in patients with OSA using a recently developed laryngopharyngeal endoscopic esthesiometer and rangefinder (LPEER). The study will be prospective and double blinded, with a randomised crossover assignment of raters performing the sensory tests. Subjects will be recruited from patients with suspected OSA referred for baseline polysomnography to a university hospital sleep laboratory. Intra-rater and inter-rater reliability will be evaluated using the Bland-Altman limits of agreement plot, the intraclass correlation coefficient, and the Pearson or Spearman correlation coefficient, depending on the distribution of the variables. Diagnostic accuracy will be evaluated plotting ROC curves using standard baseline polysomnography as a reference. The sensory threshold values for patients with mild, moderate and severe OSA will be determined and compared using ANOVA or the Kruskal-Wallis test, depending on the distribution of the variables. The LPEER could be a new tool for evaluating and monitoring laryngopharyngeal sensory impairment in patients with OSA. If it is shown to be valid, it could help to increase our understanding of the pathophysiological mechanisms of this condition and potentially help in finding new therapeutic interventions for OSA. The protocol has been approved by the Institutional Review Board of Fundacion Neumologica Colombiana. The results will be disseminated through conference presentations and peer-reviewed publication. This trial was registered at Clinical
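
    As an illustrative sketch of the Bland-Altman limits-of-agreement calculation mentioned in the protocol (the threshold values, units and session labels below are invented, not LPEER data):

    import numpy as np

    def limits_of_agreement(measure_a, measure_b):
        """Bland-Altman bias and 95% limits of agreement between two sets of measurements."""
        diff = measure_a - measure_b
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, bias - 1.96 * sd, bias + 1.96 * sd

    # Illustrative laryngopharyngeal sensory thresholds from two test sessions.
    session_1 = np.array([3.2, 4.1, 2.8, 5.0, 3.6, 4.4])
    session_2 = np.array([3.4, 3.9, 3.1, 4.7, 3.8, 4.2])
    bias, low, high = limits_of_agreement(session_1, session_2)
    print(f"bias = {bias:.2f}, 95% limits of agreement = [{low:.2f}, {high:.2f}]")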

  19. Field experience with a mobile tomographic nondestructive assay system

    International Nuclear Information System (INIS)

    Prettyman, T.H.; Betts, S.E.; Taggart, D.P.; Estep, R.J.; Nicholas, N.J.; Lucas, M.C.; Harlan, R.A.

    1995-01-01

    A mobile tomographic gamma-ray scanner (TGS) developed by Los Alamos National Laboratory was recently demonstrated at the Rocky Flats Environmental Technology Site and is currently in use at Los Alamos waste storage areas. The scanner was developed to assay radionuclides in low-level, transuranic, and mixed waste in containers ranging in size from 2 ft³ boxes to 83-gallon overpacks. The tomographic imaging capability provides a complete correction for source distribution and matrix attenuation effects, enabling accurate assays of Pu-239 and other gamma-ray emitting isotopes. In addition, the system can reliably detect self-absorbing material such as plutonium metal shot, and can correct for bias caused by self-absorption. The system can be quickly configured to execute far-field scans, segmented gamma-ray scans, and a host of intermediate scanning protocols, enabling higher throughput (up to 20 drums per 8-hour shift). In this paper, we will report on the results of field trials of the mobile system at Rocky Flats and Los Alamos. Assay accuracy is confirmed for cases in which TGS assays can be compared with assays (e.g. with calorimetry) of individual packages within the drums. The mobile tomographic technology is expected to considerably reduce characterization costs at DOE production and environmental technology sites

  20. Towards Reliable Integrated Services for Dependable Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Ravn, Anders Peter; Izadi-Zamanabadi, Roozbeh

    Reliability issues for various technical systems are discussed and focus is directed towards distributed systems, where communication facilities are vital to maintain system functionality. Reliability in communication subsystems is considered as a resource to be shared among a number of logical connections, and a reliability management framework is suggested. We suggest a network layer level reliability management protocol RRSVP (Reliability Resource Reservation Protocol) as a counterpart of the RSVP for bandwidth and time resource management. Active and passive standby redundancy is provided by background applications residing on alternative routes. Details are provided for the operation of RRSVP based on reliability slack calculus. Conclusions summarize the considerations and give directions for future research.

  1. Towards Reliable Integrated Services for Dependable Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Ravn, Anders Peter; Izadi-Zamanabadi, Roozbeh

    2003-01-01

    Reliability issues for various technical systems are discussed and focus is directed towards distributed systems, where communication facilities are vital to maintain system functionality. Reliability in communication subsystems is considered as a resource to be shared among a number of logical connections, and a reliability management framework is suggested. We suggest a network layer level reliability management protocol RRSVP (Reliability Resource Reservation Protocol) as a counterpart of the RSVP for bandwidth and time resource management. Active and passive standby redundancy is provided by background applications residing on alternative routes. Details are provided for the operation of RRSVP based on reliability slack calculus. Conclusions summarize the considerations and give directions for future research.

  2. Hormone assay

    International Nuclear Information System (INIS)

    Eisentraut, A.M.

    1977-01-01

    An improved radioimmunoassay is described for measuring total triiodothyronine or total thyroxine levels in a sample of serum containing free endogenous thyroid hormone and endogenous thyroid hormone bound to thyroid hormone binding protein. The thyroid hormone is released from the protein by adding hydrochloric acid to the serum. The pH of the separated thyroid hormone and thyroid hormone binding protein is raised in the absence of a blocking agent without interference from the endogenous protein. ¹²⁵I-labelled thyroid hormone and thyroid hormone antibodies are added to the mixture, allowing the labelled and unlabelled thyroid hormone and the thyroid hormone antibody to bind competitively. This results in free thyroid hormone being separated from antibody bound thyroid hormone and thus the unknown quantity of thyroid hormone may be determined. A thyroid hormone test assay kit is described for this radioimmunoassay. It provides a 'single tube' assay which does not require blocking agents for endogenous protein interference nor an external solid phase sorption step for the separation of bound and free hormone after the competitive binding step; it also requires a minimum number of manipulative steps. Examples of the assay are given to illustrate the reproducibility, linearity and specificity of the assay. (UK)

  3. Assay system

    International Nuclear Information System (INIS)

    Patzke, J.B.; Rosenberg, B.J.

    1984-01-01

    The accuracy of assays for monitoring concentrations of basic drugs in biological fluids containing α₁-acid glycoproteins, such as blood (serum or plasma), is improved by the addition of certain organic phosphate compounds to minimize the "protein effect". Kits containing the elements of the invention are also disclosed

  4. A shortened protocol for assessing cognitive bias in rats.

    Science.gov (United States)

    Brydges, Nichola M; Hall, Lynsey

    2017-07-15

    Reliable measurement of affective state in animals is a significant goal of animal welfare. Such measurements would also improve the validity of pre-clinical mental health research which relies on animal models. However, at present, affective states in animals are inaccessible to direct measurement. In humans, changes in cognitive processing can give reliable indications of emotional state. Therefore, similar techniques are increasingly being used to gain proxy measures of affective states in animals. In particular, the 'cognitive bias' assay has gained popularity in recent years. Major disadvantages of this technique include length of time taken for animals to acquire the task (typically several weeks), negative experiences associated with task training, and issues of motivation. Here we present a shortened cognitive bias protocol using only positive reinforcers which must actively be responded to. The protocol took an average of 4 days to complete, and produced similar results to previous, longer methods (minimum 30 days). Specifically, rats housed in standard laboratory conditions demonstrated negative cognitive biases when presented with ambiguous stimuli, and took longer to make a decision when faced with an ambiguous stimulus. Compared to previous methods, this protocol is significantly shorter (average 4 days vs. minimum 30 days), utilises only positive reinforcers to avoid inducing negative affective states, and requires active responses to all cues, avoiding potential confounds of motivational state. We have successfully developed a shortened cognitive bias protocol, suitable for use with laboratory rats. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  5. Bioremediation protocols

    National Research Council Canada - National Science Library

    Sheehan, David

    1997-01-01

    Table of contents excerpt: ... 3; 2. Granular Sludge Consortia for Bioremediation, Nina Christiansen, Indra M. Mathrani, and Birgitte K. Ahring ... 23; Part II: Protocols ...

  6. Reliability of diagnostic imaging techniques in suspected acute appendicitis: proposed diagnostic protocol; Indicacion de las tecnicas de diagnostico por la imagen en la sospecha de apendicitis aguda: propuesta de protocolo diagnostico

    Energy Technology Data Exchange (ETDEWEB)

    Cura del, J. L.; Oleaga, L.; Grande, D.; Vela, A. C.; Ibanez, A. M. [Hospital de Basurto. Bilbao (Spain)

    2001-07-01

    To study the utility of ultrasound and computed tomography (CT) in cases of suspected appendicitis. To determine the diagnostic yield in terms of different clinical contexts and patient characteristics. To assess the costs and benefits of introducing these techniques and propose a protocol for their use. Negative appendectomies, complications and length of hospital stay in a group of 152 patients with suspected appendicitis who underwent ultrasound and CT were compared with those of 180 patients who underwent appendectomy during the same time period, but had not been selected for the first group; the costs for each group of patients were calculated. In the first group, the diagnostic value of the clinical signs was also evaluated. The reliability of the clinical signs was limited, while the results with ultrasound and CT were excellent. The incidence of negative appendectomy was 9.6% in the study group and 12.2% in the control group. Moreover, there were fewer complications and a shorter hospital stay in the first group. Among men, however, the rate of negative appendectomy was lower in the control group. The cost of using ultrasound and CT in the management of appendicitis was only slightly higher than that of the control group. Although ultrasound and CT are not necessary in cases in which the probability of appendicitis is low or in men presenting clear clinical evidence, the use of these techniques is indicated in the remaining cases in which appendicitis is suspected. In children, ultrasound is the technique of choice. In all other patients, if negative results are obtained with one of the two techniques, the other should be performed. (Author) 49 refs.

  7. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth models.

  8. In silico toxicology protocols.

    Science.gov (United States)

    Myatt, Glenn J; Ahlberg, Ernst; Akahori, Yumi; Allen, David; Amberg, Alexander; Anger, Lennart T; Aptula, Aynur; Auerbach, Scott; Beilke, Lisa; Bellion, Phillip; Benigni, Romualdo; Bercu, Joel; Booth, Ewan D; Bower, Dave; Brigo, Alessandro; Burden, Natalie; Cammerer, Zoryana; Cronin, Mark T D; Cross, Kevin P; Custer, Laura; Dettwiler, Magdalena; Dobo, Krista; Ford, Kevin A; Fortin, Marie C; Gad-McDonald, Samantha E; Gellatly, Nichola; Gervais, Véronique; Glover, Kyle P; Glowienke, Susanne; Van Gompel, Jacky; Gutsell, Steve; Hardy, Barry; Harvey, James S; Hillegass, Jedd; Honma, Masamitsu; Hsieh, Jui-Hua; Hsu, Chia-Wen; Hughes, Kathy; Johnson, Candice; Jolly, Robert; Jones, David; Kemper, Ray; Kenyon, Michelle O; Kim, Marlene T; Kruhlak, Naomi L; Kulkarni, Sunil A; Kümmerer, Klaus; Leavitt, Penny; Majer, Bernhard; Masten, Scott; Miller, Scott; Moser, Janet; Mumtaz, Moiz; Muster, Wolfgang; Neilson, Louise; Oprea, Tudor I; Patlewicz, Grace; Paulino, Alexandre; Lo Piparo, Elena; Powley, Mark; Quigley, Donald P; Reddy, M Vijayaraj; Richarz, Andrea-Nicole; Ruiz, Patricia; Schilter, Benoit; Serafimova, Rositsa; Simpson, Wendy; Stavitskaya, Lidiya; Stidl, Reinhard; Suarez-Rodriguez, Diana; Szabo, David T; Teasdale, Andrew; Trejo-Martin, Alejandra; Valentin, Jean-Pierre; Vuorinen, Anna; Wall, Brian A; Watts, Pete; White, Angela T; Wichard, Joerg; Witt, Kristine L; Woolley, Adam; Woolley, David; Zwickl, Craig; Hasselgren, Catrin

    2018-04-17

    The present publication surveys several applications of in silico (i.e., computational) toxicology approaches across different industries and institutions. It highlights the need to develop standardized protocols when conducting toxicity-related predictions. This contribution articulates the information needed for protocols to support in silico predictions for major toxicological endpoints of concern (e.g., genetic toxicity, carcinogenicity, acute toxicity, reproductive toxicity, developmental toxicity) across several industries and regulatory bodies. Such novel in silico toxicology (IST) protocols, when fully developed and implemented, will ensure in silico toxicological assessments are performed and evaluated in a consistent, reproducible, and well-documented manner across industries and regulatory bodies to support wider uptake and acceptance of the approaches. The development of IST protocols is an initiative developed through a collaboration among an international consortium to reflect the state-of-the-art in in silico toxicology for hazard identification and characterization. A general outline for describing the development of such protocols is included and it is based on in silico predictions and/or available experimental data for a defined series of relevant toxicological effects or mechanisms. The publication presents a novel approach for determining the reliability of in silico predictions alongside experimental data. In addition, we discuss how to determine the level of confidence in the assessment based on the relevance and reliability of the information. Copyright © 2018. Published by Elsevier Inc.

  9. The fluorometric microculture cytotoxicity assay.

    Science.gov (United States)

    Lindhagen, Elin; Nygren, Peter; Larsson, Rolf

    2008-01-01

    The fluorometric microculture cytotoxicity assay (FMCA) is a nonclonogenic microplate-based cell viability assay used for measurement of the cytotoxic and/or cytostatic effect of different compounds in vitro. The assay is based on hydrolysis of the probe, fluorescein diacetate (FDA) by esterases in cells with intact plasma membranes. The assay is available as both a semiautomated 96-well plate setup and a 384-well plate version fully adaptable to robotics. Experimental plates are prepared with a small amount of drug solution and can be stored frozen. Cells are seeded on the plates and cell viability is evaluated after 72 h. The protocol described here is applicable both for cell lines and freshly prepared tumor cells from patients and is suitable both for screening in drug development and as a basis for a predictive test for individualization of anticancer drug therapy.

  10. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement based approaches, holistic techniques and decision analytic approaches. (UK)

  11. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems.

  12. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in Information and Communication Technology context. In particular, in the first Section, the definition of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing the laboratory tests, puts in evidence the reliability concept from the experimental point of view. In ICT context, the failure rate for a given system can be

  13. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  14. Assaying Cellular Viability Using the Neutral Red Uptake Assay.

    Science.gov (United States)

    Ates, Gamze; Vanhaecke, Tamara; Rogiers, Vera; Rodrigues, Robim M

    2017-01-01

    The neutral red uptake assay is a cell viability assay that allows in vitro quantification of xenobiotic-induced cytotoxicity. The assay relies on the ability of living cells to incorporate and bind neutral red, a weak cationic dye, in lysosomes. As such, cytotoxicity is expressed as a concentration-dependent reduction of the uptake of neutral red after exposure to the xenobiotic under investigation. The neutral red uptake assay is mainly used for hazard assessment in in vitro toxicology applications. This method has also been introduced in regulatory recommendations as part of the 3T3 NRU phototoxicity assay, which gained regulatory acceptance in all EU member states in 2000 and in the OECD member states in 2004 as a test guideline (TG 432). The present protocol describes the neutral red uptake assay using the human hepatoma cell line HepG2, which is often employed as an alternative in vitro model for human hepatocytes. As an example, the cytotoxicity of acetaminophen and acetyl salicylic acid is assessed.

  15. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)
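    As a minimal illustration of the Monte Carlo approach mentioned in this record (not the program it describes), the sketch below estimates the failure probability of a hypothetical 2-out-of-3 system with independent components, and adds a crude importance-sampling variant to hint at the kind of variance reduction the abstract refers to. All component failure probabilities and the inflated sampling probability are invented for the example.

    ```python
    import random

    # Hypothetical component failure probabilities (illustrative assumptions only).
    P_FAIL = [0.01, 0.02, 0.015]

    def system_fails(states):
        """A 2-out-of-3 system: it fails when at least two components have failed."""
        return sum(states) >= 2

    def crude_monte_carlo(n, rng):
        """Plain Monte Carlo estimate of the system failure probability."""
        hits = 0
        for _ in range(n):
            states = [rng.random() < p for p in P_FAIL]
            hits += system_fails(states)
        return hits / n

    def importance_sampling(n, rng, q=0.3):
        """Sample component failures from an inflated probability q and reweight,
        so that the rare system failures are observed far more often."""
        total = 0.0
        for _ in range(n):
            states = [rng.random() < q for _ in P_FAIL]
            if system_fails(states):
                weight = 1.0
                for failed, p in zip(states, P_FAIL):
                    weight *= (p / q) if failed else ((1 - p) / (1 - q))
                total += weight
        return total / n

    rng = random.Random(42)
    print("crude Monte Carlo estimate:  ", crude_monte_carlo(200_000, rng))
    print("importance-sampling estimate:", importance_sampling(200_000, rng))
    ```

    Because the true failure probability here is well below 10⁻³, the crude estimator needs very large sample sizes to stabilise, whereas the reweighted sampler concentrates effort on the failure region; this is the essence of the variance-reduction techniques such programs employ.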

  16. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology whereas this has yet to be fully achieved for large scale structures. Structural loading variants over the half-time of the plant are considered to be more difficult to analyse than for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions are considered which enter this problem. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)

  17. Mac protocols for cyber-physical systems

    CERN Document Server

    Xia, Feng

    2015-01-01

    This book provides a literature review of various wireless MAC protocols and techniques for achieving real-time and reliable communications in the context of cyber-physical systems (CPS). The evaluation analysis of IEEE 802.15.4 for CPS therein will give insights into configuration and optimization of critical design parameters of MAC protocols. In addition, this book also presents the design and evaluation of an adaptive MAC protocol for medical CPS, which exemplifies how to facilitate real-time and reliable communications in CPS by exploiting IEEE 802.15.4 based MAC protocols. This book wil

  18. Reliability of IP Tunnels in Military Networks

    Directory of Open Access Journals (Sweden)

    Pólkowski Marcin

    2016-10-01

    The military networks, contrary to commercial ones, require standards which provide the highest level of security and reliability. Assuring redundancy of the main connections by applying various protocols and transmission media creates problems with the time needed to re-establish virtual tunnels between different locations when a link is damaged. This article compares the reliability of different IP (Internet Protocol) tunnels implemented on military network devices.

  19. A Comprehensive Review on Clinical Applications of Comet Assay

    Science.gov (United States)

    Gunasekarana, Vidya; Chand, Parkash

    2015-01-01

    Increased levels of DNA damage and ineffective repair mechanisms are the underlying bio-molecular events in the pathogenesis of most of the life-threatening diseases like cancer and degenerative diseases. The sources of DNA damage can be either exogenous or endogenous in origin. Imbalance between the oxidants and antioxidants resulting in increased reactive oxygen species mostly accounts for the endogenously derived attacks on DNA. Among the various methods employed in the estimation of DNA damage, the alkaline comet assay has proven to be a relatively simple and versatile tool in the assessment of DNA damage and also in determining the efficacy of DNA repair mechanisms. The aim of this article is to review the application of the comet assay in the field of medicine towards human biomonitoring, understanding the pathogenesis of cancer and the progression of chronic and degenerative diseases, prediction of tumour radio- and chemosensitivity, and in male infertility. A standardized protocol and analysis system for the various variants of the comet assay, applied to different cell types and across laboratories, would provide a useful and reliable clinical tool in the field of medicine for the estimation of levels of DNA damage and repair mechanisms. PMID:25954633

  20. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2 - 'Human Reliability'. This group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology - GIS. It is composed of representatives of industry, representatives of research institutes, of technical control boards and universities, whose job it is to study how man fits into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the part of ergonomics dealing with human reliability in using technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de]

  1. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    Report excerpt (figure-list and text fragments): figure captions refer to inverters connected in a chain and to a typical graph of frequency versus the square root of an unspecified quantity; the recoverable text describes developing an experimental reliability-estimating methodology that could illuminate the lifetime reliability of advanced devices and circuits, giving an accurate estimate of the device lifetime, or FIT, and thus of the reliability of the device.

  2. Cryptographic Protocols:

    DEFF Research Database (Denmark)

    Geisler, Martin Joakim Bittel

    cryptography was thus concerned with message confidentiality and integrity. Modern cryptography covers a much wider range of subjects including the area of secure multiparty computation, which will be the main topic of this dissertation. Our first contribution is a new protocol for secure comparison, presented ... implemented the comparison protocol in Java and benchmarks show that it is highly competitive and practical. The biggest contribution of this dissertation is a general framework for secure multiparty computation. Instead of making new ad hoc implementations for each protocol, we want a single and extensible ... in Chapter 2. Comparisons play a key role in many systems such as online auctions and benchmarks — it is not unreasonable to say that when parties come together for a multiparty computation, it is because they want to make decisions that depend on private information. Decisions depend on comparisons. We have...

  3. Development of a fast and simple gas chromatographic protocol based on the combined use of alkyl chloroformate and solid phase microextraction for the assay of polyamines in human urine.

    Science.gov (United States)

    Naccarato, Attilio; Elliani, Rosangela; Cavaliere, Brunella; Sindona, Giovanni; Tagarelli, Antonio

    2018-05-11

    Polyamines are aliphatic amines with low molecular weight that are widely recognized as one of the most important cancer biomarkers for early diagnosis and treatment. The goal of the work herein presented is the development of a rapid and simple method for the quantification of free polyamines (i.e., putrescine, cadaverine, spermidine, spermine) and N-monoacetylated polyamines (i.e., N1-acetylspermidine, N8-acetylspermidine, and N1-acetylspermine) in human urine. A preliminary derivatization with propyl chloroformate combined with the use of solid phase microextraction (SPME) allowed for an easy and automatable protocol involving minimal sample handling and no consumption of organic solvents. The affinity of the analytes toward five commercial SPME coatings was evaluated in univariate mode, and the best result in terms of analyte extraction was achieved using the divinylbenzene/carboxen/polydimethylsiloxane fiber. The variables affecting the performance of SPME analysis were optimized by the multivariate approach of experimental design and, in particular, using a central composite design (CCD). The optimal working conditions in terms of response values are the following: extraction temperature 40 °C, extraction time of 15 min and no addition of NaCl. Analyses were carried out by gas chromatography-triple quadrupole mass spectrometry (GC-QqQ-MS) in selected reaction monitoring (SRM) acquisition mode. The developed method was validated according to the guidelines issued by the Food and Drug Administration (FDA). The satisfactory performances reached in terms of linearity, sensitivity (LOQs between 0.01 and 0.1 μg/mL), matrix effect (68-121%), accuracy, and precision (inter-day accuracy between -24% and +16% and precision in the range 3.3-28.4%, respectively) make the proposed protocol suitable to be adopted for quantification of these important biomarkers in urine samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Assessment of a recombinant androgen receptor binding assay: initial steps towards validation.

    Science.gov (United States)

    Freyberger, Alexius; Weimer, Marc; Tran, Hoai-Son; Ahr, Hans-Jürgen

    2010-08-01

    Despite more than a decade of research in the field of endocrine active compounds with affinity for the androgen receptor (AR), still no validated recombinant AR binding assay is available, although recombinant AR can be obtained from several sources. With funding from the European Union (EU)-sponsored 6th framework project, ReProTect, we developed a model protocol for such an assay based on a simple AR binding assay recently developed at our institution. Important features of the protocol were the use of a recombinant fusion protein of thioredoxin with both the hinge region and ligand binding domain (LBD) of the rat AR (which is identical to the human AR-LBD) and performance in a 96-well plate format. Besides two reference compounds [dihydrotestosterone (DHT), androstenedione], ten test compounds with different affinities for the AR [levonorgestrel, progesterone, prochloraz, 17alpha-methyltestosterone, flutamide, norethynodrel, o,p'-DDT, dibutylphthalate, vinclozolin, linuron] were used to explore the performance of the assay. At least three independent experiments per compound were performed. The AR binding properties of reference and test compounds were well detected; in terms of the relative ranking of binding affinities, there was good agreement with published data obtained from experiments using recombinant AR preparations. Irrespective of the chemical nature of the compound, individual IC50 values for a given compound varied by not more than a factor of 2.6. Our data demonstrate that the assay reliably ranked compounds with strong, weak, and no/marginal affinity for the AR with high accuracy. It avoids the manipulation and use of animals, as a recombinant protein is used, and thus contributes to the 3R concept. On the whole, this assay is a promising candidate for further validation. Copyright 2009 Elsevier Inc. All rights reserved.
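    As a hedged illustration of how IC50 values such as those reported above are typically extracted from competition-binding data, the sketch below fits a four-parameter logistic curve to invented data points; neither the numbers nor the fitting choices are taken from the ReProTect protocol itself.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Invented competition-binding data: competitor concentration (M) versus
    # percentage of radioligand remaining specifically bound.
    conc = np.array([1e-9, 1e-8, 1e-7, 1e-6, 1e-5, 1e-4])
    bound = np.array([98.0, 92.0, 71.0, 35.0, 12.0, 5.0])

    def four_pl(logx, top, bottom, log_ic50, hill):
        """Four-parameter logistic curve in log-concentration space."""
        return bottom + (top - bottom) / (1.0 + 10 ** (hill * (logx - log_ic50)))

    # Fitting on log10(concentration) keeps the optimiser well behaved.
    params, _ = curve_fit(four_pl, np.log10(conc), bound, p0=[100.0, 0.0, -6.0, 1.0])
    top, bottom, log_ic50, hill = params
    ic50 = 10 ** log_ic50
    print(f"fitted IC50 ≈ {ic50:.2e} M (Hill slope {hill:.2f})")
    ```

    In practice, IC50 values from such fits are usually converted to inhibition constants (for example via the Cheng-Prusoff relation) once the radioligand concentration and its dissociation constant are known.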

  5. Redefining reliability

    International Nuclear Information System (INIS)

    Paulson, S.L.

    1995-01-01

    Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options about the kind of natural gas service they need--and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and the new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas.

  6. Network Coding Protocols for Smart Grid Communications

    DEFF Research Database (Denmark)

    Prior, Rui; Roetter, Daniel Enrique Lucani; Phulpin, Yannick

    2014-01-01

    We propose a robust network coding protocol for enhancing the reliability and speed of data gathering in smart grids. At the heart of our protocol lies the idea of tunable sparse network coding, which adopts the transmission of sparsely coded packets at the beginning of the transmission process b...
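    Since the abstract above is truncated, the following is only a toy sketch of the general idea behind sparse random linear network coding over GF(2): each coded packet is the XOR of a small random subset of source packets, and the subset density can be tuned as transmission progresses. The packet sizes, densities and field choice are assumptions for illustration, not the protocol the authors propose.

    ```python
    import random

    def sparse_coded_packet(source_packets, density, rng):
        """Build one coded packet by XOR-ing a random sparse subset of the source
        packets (coding over GF(2)); return the coefficient vector and payload,
        since a decoder needs both."""
        coeffs = [1 if rng.random() < density else 0 for _ in source_packets]
        if not any(coeffs):                          # avoid a useless all-zero packet
            coeffs[rng.randrange(len(coeffs))] = 1
        payload = bytes(len(source_packets[0]))      # start from all zeros
        for used, pkt in zip(coeffs, source_packets):
            if used:
                payload = bytes(a ^ b for a, b in zip(payload, pkt))
        return coeffs, payload

    rng = random.Random(7)
    source = [bytes([i] * 8) for i in range(10)]     # ten 8-byte source packets
    # Start sparse (cheap to process), then raise the density towards the end of
    # the generation -- a rough stand-in for "tunable" sparsity.
    for density in (0.2, 0.2, 0.5, 1.0):
        coeffs, payload = sparse_coded_packet(source, density, rng)
        print(density, coeffs, payload.hex())
    ```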

  7. Rosette Assay: Highly Customizable Dot-Blot for SH2 Domain Screening.

    Science.gov (United States)

    Ng, Khong Y; Machida, Kazuya

    2017-01-01

    With a growing number of high-throughput studies, structural analyses, and availability of protein-protein interaction databases, it is now possible to apply web-based prediction tools to SH2 domain-interactions. However, in silico prediction is not always reliable and requires experimental validation. Rosette assay is a dot blot-based reverse-phase assay developed for the assessment of binding between SH2 domains and their ligands. It is conveniently customizable, allowing for low- to high-throughput analysis of interactions between various numbers of SH2 domains and their ligands, e.g., short peptides, purified proteins, and cell lysates. The binding assay is performed in a 96-well plate (MBA or MWA apparatus) in which a sample spotted membrane is incubated with up to 96 labeled SH2 domains. Bound domains are detected and quantified using a chemiluminescence or near-infrared fluorescence (IR) imaging system. In this chapter, we describe a practical protocol for rosette assay to assess interactions between synthesized tyrosine phosphorylated peptides and a library of GST-tagged SH2 domains. Since the methodology is not confined to assessment of SH2-pTyr interactions, rosette assay can be broadly utilized for ligand and drug screening using different protein interaction domains or antibodies.

  8. Overview of the InterGroup protocols

    Energy Technology Data Exchange (ETDEWEB)

    Berket, Karlo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Agarwal, Deborah A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Melliar-Smith, P. Michael [Univ. of California, Santa Barbara, CA (United States); Moser, Louise E. [Univ. of California, Santa Barbara, CA (United States)

    2001-03-01

    Existing reliable ordered group communication protocols have been developed for local-area networks and do not, in general, scale well to large numbers of nodes and wide-area networks. The InterGroup suite of protocols is a scalable group communication system that introduces a novel approach to handling group membership, and supports a receiver-oriented selection of service. The protocols are intended for a wide-area network, with a large number of nodes, that has highly variable delays and a high message loss rate, such as the Internet. The levels of the message delivery service range from unreliable unordered to reliable group timestamp ordered.

  9. Chromosome aberration assays in Allium

    Energy Technology Data Exchange (ETDEWEB)

    Grant, W.F.

    1982-01-01

    The common onion (Allium cepa) is an excellent plant for the assay of chromosome aberrations after chemical treatment. Other species of Allium (A. cepa var. proliferum, A. carinatum, A. fistulosum and A. sativum) have also been used but to a much lesser extent. Protocols have been given for using root tips from either bulbs or seeds of Allium cepa to study the cytological end-points, such as chromosome breaks and exchanges, which follow the testing of chemicals in somatic cells. It is considered that both mitotic and meiotic end-points should be used to a greater extent in assaying the cytogenetic effects of a chemical. From a literature survey, 148 chemicals are tabulated that have been assayed in 164 Allium tests for their clastogenic effect. Of the 164 assays which have been carried out, 75 are reported as giving a positive reaction, 49 positive and with a dose response, 1 positive and temperature-related, 9 borderline positive, and 30 negative; 76% of the chemicals gave a definite positive response. It is proposed that the Allium test be included among those tests routinely used for assessing chromosomal damage induced by chemicals.

  10. Cost-optimization of the IPv4 zeroconf protocol

    NARCIS (Netherlands)

    Bohnenkamp, H.C.; van der Stok, Peter; Hermanns, H.; Vaandrager, Frits

    2003-01-01

    This paper investigates the tradeoff between reliability and effectiveness for the IPv4 Zeroconf protocol, proposed by Cheshire/Aboba/Guttman in 2002, dedicated to the self-configuration of IP network interfaces. We develop a simple stochastic cost model of the protocol, where reliability is measured
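    The record's abstract is truncated, so purely as an illustration of the reliability/effectiveness tradeoff it mentions, the toy model below weighs configuration delay against the risk of settling on an address that is already in use: more probes cost time but reduce the chance of an undetected collision. All probabilities, delays and penalty values are invented assumptions, not the parameters of the published cost model.

    ```python
    # Toy cost model for IPv4 Zeroconf-style address probing (illustrative numbers).
    Q = 0.1         # probability that the randomly chosen address is already in use
    P_LOSS = 0.2    # probability that a single probe or its reply is lost
    R = 0.25        # seconds waited after each probe
    PENALTY = 60.0  # cost (in seconds-equivalent) of keeping a colliding address

    def expected_cost(n_probes):
        """Expected cost = configuration delay plus the expected penalty for an
        undetected collision (address in use, yet every probe went unanswered)."""
        p_undetected = Q * (P_LOSS ** n_probes)
        delay = n_probes * R
        return delay + p_undetected * PENALTY

    for n in range(1, 8):
        print(f"{n} probes: expected cost {expected_cost(n):.3f} s")
    ```

    Minimising such a cost over the number of probes is the kind of tradeoff analysis the abstract describes, although the published model is richer than this sketch.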

  11. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability, covering the definition of reliability, reliability requirements, the system life cycle and reliability, and reliability and failure rate (including reliability characteristics, chance failure, failure rates that change over time, failure modes and replacement), as well as reliability in engineering design, reliability tests under assumed failure rates, plotting of reliability data, prediction of system reliability, conservation of systems, and failure, including failure relay and the analysis of system safety.

  12. Automation of a Nile red staining assay enables high throughput quantification of microalgal lipid production.

    Science.gov (United States)

    Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco

    2016-02-09

    Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher throughput devices for small scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable reliably measuring large sets of samples within short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye based assay was established using a liquid handling robot to provide reproducible high throughput quantification of lipids with minimized hands-on-time. Lipid production was monitored using the fluorescent dye Nile red with dimethyl sulfoxide as solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96 well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids improving precision from ±8 to ±2 % on average. Implementation into an automated liquid handling platform allows for measuring up to 48 samples within 6.5 h, reducing hands-on-time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure for biomass concentration so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids avoiding limitations of currently established
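    A minimal sketch of the two normalisation ideas above: fluorescence is first scaled per unit of biovolume (rather than per optical density) and then converted to an absolute lipid concentration through a gravimetric calibration factor. Every number below, including the calibration slope and intercept, is an invented assumption rather than a value from the published assay.

    ```python
    # Illustrative readings: Nile red fluorescence (arbitrary units) and the
    # biovolume of the corresponding culture sample (µL of cells per mL).
    samples = {
        "culture_A": {"fluorescence": 5200.0, "biovolume": 1.8},
        "culture_B": {"fluorescence": 3100.0, "biovolume": 0.9},
    }

    # Hypothetical gravimetric calibration relating biovolume-specific
    # fluorescence to lipid concentration (g/L); slope and intercept are assumed.
    CAL_SLOPE = 3.4e-4
    CAL_INTERCEPT = 0.02

    for name, s in samples.items():
        specific_fluorescence = s["fluorescence"] / s["biovolume"]   # a.u. per (µL/mL)
        lipid_g_per_l = CAL_SLOPE * specific_fluorescence + CAL_INTERCEPT
        print(f"{name}: estimated lipid content {lipid_g_per_l:.2f} g/L")
    ```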

  13. Study protocol

    DEFF Research Database (Denmark)

    Smith, Benjamin E; Hendrick, Paul; Bateman, Marcus

    2017-01-01

    avoidance behaviours, catastrophising, self-efficacy, sport and leisure activity participation, and general quality of life. Follow-up will be 3 and 6 months. The analysis will focus on descriptive statistics and confidence intervals. The qualitative components will follow a thematic analysis approach....... DISCUSSION: This study will evaluate the feasibility of running a definitive large-scale trial on patients with patellofemoral pain, within the NHS in the UK. We will identify strengths and weaknesses of the proposed protocol and the utility and characteristics of the outcome measures. The results from...... this study will inform the design of a multicentre trial. TRIAL REGISTRATION: ISRCTN35272486....

  14. Beyond protocols

    DEFF Research Database (Denmark)

    Vanderhoeven, Sonia; Branquart, Etienne; Casaer, Jim

    2017-01-01

    Risk assessment tools for listing invasive alien species need to incorporate all available evidence and expertise. Beyond the wealth of protocols developed to date, we argue that the current way of performing risk analysis has several shortcomings. In particular, lack of data on ecological impact...... information on risk and the exploration of improved methods for decision making on biodiversity management. This is crucial for efficient conservation resource allocation and uptake by stakeholders and the public......., transparency and repeatability of assessments as well as the incorporation of uncertainty should all be explicitly considered. We recommend improved quality control of risk assessments through formalized peer review with clear feedback between assessors and reviewers. Alternatively, a consensus building...

  15. Linearization of the bradford protein assay.

    Science.gov (United States)

    Ernst, Orna; Zor, Tsaffrir

    2010-04-12

    Determination of microgram quantities of protein in the Bradford Coomassie brilliant blue assay is accomplished by measurement of absorbance at 590 nm. This most common assay enables rapid and simple protein quantification in cell lysates, cellular fractions, or recombinant protein samples, for the purpose of normalization of biochemical measurements. However, an intrinsic nonlinearity compromises the sensitivity and accuracy of this method. It is shown that under standard assay conditions, the ratio of the absorbance measurements at 590 nm and 450 nm is strictly linear with protein concentration. This simple procedure increases the accuracy and improves the sensitivity of the assay about 10-fold, permitting quantification down to 50 ng of bovine serum albumin. Furthermore, the interference commonly introduced by detergents that are used to create the cell lysates is greatly reduced by the new protocol. A linear equation developed on the basis of mass action and Beer's law perfectly fits the experimental data.
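    A small worked example of the linearisation described above: rather than using A590 alone, the A590/A450 ratio is regressed against the protein standards and unknowns are read off the resulting straight line. The absorbance values below are invented for illustration; only the use of the ratio follows the cited protocol.

    ```python
    import numpy as np

    # Invented BSA standard curve: protein per well (µg) with A590 and A450 readings.
    conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
    a590 = np.array([0.34, 0.40, 0.46, 0.57, 0.78, 1.15])
    a450 = np.array([0.71, 0.69, 0.67, 0.63, 0.57, 0.48])

    ratio = a590 / a450                            # quantity reported to be linear
    slope, intercept = np.polyfit(conc, ratio, 1)  # least-squares straight line

    def protein_amount(sample_a590, sample_a450):
        """Convert a sample's A590/A450 ratio back to µg protein via the standards."""
        return (sample_a590 / sample_a450 - intercept) / slope

    print(f"standard curve: ratio = {slope:.3f} * µg + {intercept:.3f}")
    print(f"unknown sample ≈ {protein_amount(0.62, 0.60):.2f} µg protein")
    ```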

  16. RELIABLE ASSAYS FOR DETERMINING ENDOGENOUS COMPONENTS OF HUMAN MILK

    Science.gov (United States)

    Healthy women from 18-38 years old (N=25) fasted for several hours and twice donated blood and milk (postpartum 2-7 weeks and 3-4 months) for the EPA's Methods Advancement for Milk Analysis study, a pilot for the National Children's Study (NCS). Endogenous components were chosen...

  17. Integrated cryptosporidium assay to determine oocyst density, infectivity, and genotype for risk assessment of source and reuse water.

    Science.gov (United States)

    King, Brendon; Fanok, Stella; Phillips, Renae; Swaffer, Brooke; Monis, Paul

    2015-05-15

    Cryptosporidium continues to be problematic for the water industry, with risk assessments often indicating that treatment barriers may fail under extreme conditions. However, risk analyses have historically used oocyst densities and not considered either oocyst infectivity or species/genotype, which can result in an overestimation of risk if the oocysts are not human infective. We describe an integrated assay for determining oocyst density, infectivity, and genotype from a single-sample concentrate, an important advance that overcomes the need for processing multiple-grab samples or splitting sample concentrates for separate analyses. The assay incorporates an oocyst recovery control and is compatible with standard primary concentration techniques. Oocysts were purified from primary concentrates using immunomagnetic separation prior to processing by an infectivity assay. Plate-based cell culture was used to detect infectious foci, with a monolayer washing protocol developed to allow recovery and enumeration of oocysts. A simple DNA extraction protocol was developed to allow typing of any wells containing infectious Cryptosporidium. Water samples from a variety of source water and wastewater matrices, including a semirural catchment, wastewater, an aquifer recharge site, and storm water, were analyzed using the assay. Results demonstrate that the assay can reliably determine oocyst densities, infectivity, and genotype from single-grab samples for a variety of water matrices and emphasize the varying nature of Cryptosporidium risk extant throughout source waters and wastewaters. This assay should therefore enable a more comprehensive understanding of Cryptosporidium risk for different water sources, assisting in the selection of appropriate risk mitigation measures. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  18. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers, and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibull

  19. A real-time, quantitative PCR protocol for assessing the relative parasitemia of Leucocytozoon in waterfowl

    Science.gov (United States)

    Smith, Matthew M.; Schmutz, Joel A.; Apelgren, Chloe; Ramey, Andy M.

    2015-01-01

    Microscopic examination of blood smears can be effective at diagnosing and quantifying hematozoa infections. However, this method requires highly trained observers, is time consuming, and may be inaccurate for detection of infections at low levels of parasitemia. To develop a molecular methodology for identifying and quantifying Leucocytozoon parasite infection in wild waterfowl (Anseriformes), we designed a real-time, quantitative PCR protocol to amplify Leucocytozoon mitochondrial DNA using TaqMan fluorogenic probes and validated our methodology using blood samples collected from waterfowl in interior Alaska during late summer and autumn (n = 105). By comparing our qPCR results to those derived from a widely used nested PCR protocol, we determined that our assay showed high levels of sensitivity (91%) and specificity (100%) in detecting Leucocytozoon DNA from host blood samples. Additionally, results of a linear regression revealed significant correlation between the raw measure of parasitemia produced by our qPCR assay (Ct values) and numbers of parasites observed on blood smears (R2 = 0.694, P = 0.003), indicating that our assay can reliably determine the relative parasitemia levels among samples. This methodology provides a powerful new tool for studies assessing effects of haemosporidian infection in wild avian species.
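    To make the two validation statistics reported above concrete, the sketch below shows how sensitivity and specificity against the nested-PCR reference, and the regression of smear counts on Ct values, would typically be computed. The handful of data points is fabricated purely to demonstrate the calculations and does not reproduce the study data.

    ```python
    import numpy as np

    # Fabricated paired detection results: nested PCR (reference) vs. the qPCR assay.
    nested = np.array([1, 1, 1, 1, 0, 0, 0, 1, 0, 1])   # 1 = Leucocytozoon detected
    qpcr   = np.array([1, 1, 0, 1, 0, 0, 0, 1, 0, 1])

    tp = int(np.sum((nested == 1) & (qpcr == 1)))
    fn = int(np.sum((nested == 1) & (qpcr == 0)))
    tn = int(np.sum((nested == 0) & (qpcr == 0)))
    fp = int(np.sum((nested == 0) & (qpcr == 1)))
    print("sensitivity:", tp / (tp + fn), " specificity:", tn / (tn + fp))

    # Fabricated relative-parasitemia check: log10 smear counts against Ct values.
    ct = np.array([22.1, 25.4, 28.0, 31.2, 34.5])
    log_parasites = np.array([4.8, 4.0, 3.1, 2.3, 1.4])
    slope, intercept = np.polyfit(ct, log_parasites, 1)
    r2 = np.corrcoef(ct, log_parasites)[0, 1] ** 2
    print(f"log10(parasites) = {slope:.2f} * Ct + {intercept:.2f},  R² = {r2:.3f}")
    ```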

  20. Identification of Cyclin-dependent Kinase 1 Specific Phosphorylation Sites by an In Vitro Kinase Assay.

    Science.gov (United States)

    Cui, Heying; Loftus, Kyle M; Noell, Crystal R; Solmaz, Sozanne R

    2018-05-03

    Cyclin-dependent kinase 1 (Cdk1) is a master controller for the cell cycle in all eukaryotes and phosphorylates an estimated 8 - 13% of the proteome; however, the number of identified targets for Cdk1, particularly in human cells is still low. The identification of Cdk1-specific phosphorylation sites is important, as they provide mechanistic insights into how Cdk1 controls the cell cycle. Cell cycle regulation is critical for faithful chromosome segregation, and defects in this complicated process lead to chromosomal aberrations and cancer. Here, we describe an in vitro kinase assay that is used to identify Cdk1-specific phosphorylation sites. In this assay, a purified protein is phosphorylated in vitro by commercially available human Cdk1/cyclin B. Successful phosphorylation is confirmed by SDS-PAGE, and phosphorylation sites are subsequently identified by mass spectrometry. We also describe purification protocols that yield highly pure and homogeneous protein preparations suitable for the kinase assay, and a binding assay for the functional verification of the identified phosphorylation sites, which probes the interaction between a classical nuclear localization signal (cNLS) and its nuclear transport receptor karyopherin α. To aid with experimental design, we review approaches for the prediction of Cdk1-specific phosphorylation sites from protein sequences. Together these protocols present a very powerful approach that yields Cdk1-specific phosphorylation sites and enables mechanistic studies into how Cdk1 controls the cell cycle. Since this method relies on purified proteins, it can be applied to any model organism and yields reliable results, especially when combined with cell functional studies.
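    The abstract mentions sequence-based prediction of Cdk1 sites; a common first-pass screen is simply to scan the protein sequence for the minimal cyclin-dependent kinase consensus S/T-P and the full Cdk1 consensus S/T-P-x-K/R. The example sequence below is made up, and a real analysis would complement such a bare regular-expression scan with dedicated predictors.

    ```python
    import re

    # Made-up protein sequence used only to demonstrate the motif scan.
    sequence = "MASTPLKRRSPQKDLTPAKSSPIRTPSKQE"

    # Minimal consensus: Ser/Thr followed by Pro.
    # Full Cdk1 consensus: Ser/Thr - Pro - any residue - Lys/Arg.
    minimal = [(m.start() + 1, m.group(1))
               for m in re.finditer(r"(?=([ST]P))", sequence)]
    full = [(m.start() + 1, m.group(1))
            for m in re.finditer(r"(?=([ST]P.[KR]))", sequence)]

    print("minimal S/T-P sites (position, motif):", minimal)
    print("full S/T-P-x-K/R sites (position, motif):", full)
    ```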

  1. Radioreceptor assay: theory and applications to pharmacology

    International Nuclear Information System (INIS)

    Perret, G.; Simon, P.

    1984-01-01

    The aim of the first part of this work is to present the theory of the radioreceptor assay and to compare it to the other techniques of radioanalysis (radioimmunoassay, competitive protein binding assays). The technology of the radioreceptor assay is then presented and its components (preparation of the receptors, radioligand, incubation medium) are described. The analytical characteristics of the radioreceptor assay (specificity, sensitivity, reproducibility, accuracy) and the pharmacological significance of the results are discussed. The second part is devoted to the description of the radioreceptor assays of some pharmacological classes (neuroleptics, tricyclic antidepressants, benzodiazepines, β-blockers, anticholinergic drugs) and to their use in therapeutic drug monitoring. In conclusion, by their nature, radioreceptor assays are highly sensitive, reliable, precise, accurate and simple to perform. Their chief disadvantage relates to specificity, since any substance having an appreciable affinity for the receptor site will displace the specifically bound radioligand. Paradoxically, in some cases this lack of specificity may be advantageous in that it allows for the detection not only of the parent compound but also of active metabolites and endogenous receptor agonists, and in that radioreceptor assays can be devised for a whole pharmacological class and not only for one drug, as is the case for classical physico-chemical techniques. For all these reasons the future of the radioreceptor assay in pharmacology appears promising [fr]

  2. Interpreting sperm DNA damage in a diverse range of mammalian sperm by means of the two-tailed comet assay

    Science.gov (United States)

    Cortés-Gutiérrez, Elva I.; López-Fernández, Carmen; Fernández, José Luis; Dávila-Rodríguez, Martha I.; Johnston, Stephen D.; Gosálvez, Jaime

    2014-01-01

    Key Concepts: (1) The two-dimensional Two-Tailed Comet assay (TT-comet) protocol is a valuable technique to differentiate between single-stranded DNA breaks (SSBs) and double-stranded DNA breaks (DSBs) on the same sperm cell. (2) The protein lysis inherent in the TT-comet protocol accounts for differences in sperm protamine composition at a species-specific level to produce reliable visualization of sperm DNA damage. (3) Alkaline treatment may break the sugar-phosphate backbone at abasic sites or at sites with deoxyribose damage, transforming these lesions into DNA breaks that are also converted into ssDNA; these lesions are known as alkali labile sites (ALSs). (4) DBD-FISH permits the in situ visualization of DNA breaks, abasic sites or alkaline-sensitive DNA regions. (5) The alkaline single comet assay reveals that all mammalian species display constitutive ALSs related to the requirement of the sperm to undergo transient changes in DNA structure linked with chromatin packing. (6) Sperm DNA damage is associated with fertilization failure, impaired pre- and post-implantation embryo development and poor pregnancy outcome. (7) The TT-comet is a valuable tool for identifying SSBs or DSBs in sperm cells with DNA fragmentation and can therefore be used for the purposes of fertility assessment. Sperm DNA damage is associated with fertilization failure, impaired pre- and post-implantation embryo development and poor pregnancy outcome. A series of methodologies to assess DNA damage in spermatozoa have been developed, but most are unable to differentiate between single-stranded DNA breaks (SSBs) and double-stranded DNA breaks (DSBs) on the same sperm cell. The two-dimensional Two-Tailed Comet assay (TT-comet) protocol highlighted in this review overcomes this limitation and emphasizes the importance of accounting for the differences in sperm protamine composition at a species-specific level for the appropriate preparation of the assay. The TT-comet is a modification of the original comet assay that uses a two-dimensional electrophoresis to

  3. Single-experiment displacement assay for quantifying high-affinity binding by isothermal titration calorimetry.

    Science.gov (United States)

    Krainer, Georg; Keller, Sandro

    2015-04-01

    Isothermal titration calorimetry (ITC) is the gold standard for dissecting the thermodynamics of a biomolecular binding process within a single experiment. However, reliable determination of the dissociation constant (KD) from a single titration is typically limited to the range 100 μM>KD>1 nM. Interactions characterized by a lower KD can be assessed indirectly by so-called competition or displacement assays, provided that a suitable competitive ligand is available whose KD falls within the directly accessible window. However, this protocol is limited by the fact that it necessitates at least two titrations to characterize one high-affinity inhibitor, resulting in considerable consumption of both sample material and time. Here, we introduce a fast and efficient ITC displacement assay that allows for the simultaneous characterization of both a high-affinity ligand and a moderate-affinity ligand competing for the same binding site on a receptor within a single experiment. The protocol is based on a titration of the high-affinity ligand into a solution containing the moderate-affinity ligand bound to the receptor present in excess. The resulting biphasic binding isotherm enables accurate and precise determination of KD values and binding enthalpies (ΔH) of both ligands. We discuss the theoretical background underlying the approach, demonstrate its practical application to metal ion chelation, explore its potential and limitations with the aid of simulations and statistical analyses, and elaborate on potential applications to protein-inhibitor interactions. Copyright © 2014 Elsevier Inc. All rights reserved.
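    The displacement principle above works because, with a moderate-affinity competitor present in excess, the high-affinity ligand binds with an apparent (weakened) dissociation constant that falls back inside the directly measurable window. The sketch below shows that arithmetic only; the KD values and competitor concentration are invented, and a full analysis would fit both binding events of the biphasic isotherm simultaneously.

    ```python
    # Illustrative displacement arithmetic (assumed values, not experimental data).
    KD_STRONG = 2e-10        # dissociation constant of the high-affinity ligand (M)
    KD_WEAK = 5e-6           # dissociation constant of the moderate-affinity ligand (M)
    COMPETITOR_CONC = 1e-3   # concentration of the weak ligand kept in the cell (M)

    # With the competitor in large excess, the strong ligand's apparent KD is
    # weakened by the competition factor (1 + [competitor] / KD_weak).
    kd_apparent = KD_STRONG * (1 + COMPETITOR_CONC / KD_WEAK)

    print(f"true KD    : {KD_STRONG:.1e} M (too tight to fit directly)")
    print(f"apparent KD: {kd_apparent:.1e} M (inside the accessible ITC window)")
    ```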

  4. Developing a yeast-based assay protocol to monitor total ...

    African Journals Online (AJOL)

    2005-06-21


  5. FtsZ Polymerization Assays : Simple Protocols and Considerations

    NARCIS (Netherlands)

    Król, Ewa; Scheffers, Dirk-Jan

    2013-01-01

    During bacterial cell division, the essential protein FtsZ assembles in the middle of the cell to form the so-called Z-ring. FtsZ polymerizes into long filaments in the presence of GTP in vitro, and polymerization is regulated by several accessory proteins. FtsZ polymerization has been extensively

  6. Microbead agglutination based assays

    KAUST Repository

    Kodzius, Rimantas

    2013-01-21

    We report a simple and rapid room temperature assay for point-of-care (POC) testing that is based on specific agglutination. Agglutination tests are based on aggregation of microbeads in the presence of a specific analyte, thus enabling macroscopic observation. Such tests are most often used to explore antibody-antigen reactions. Agglutination has been used for protein assays using a biotin/streptavidin system as well as for a hybridization-based assay. Agglutination systems are prone to self-termination of the linking analyte, to active-site saturation and to loss of agglomeration at high analyte concentrations. We investigated the molecular target/ligand interaction, explaining the common agglutination problems related to analyte self-termination and linkage of the analyte to the same bead instead of different microbeads. We classified the agglutination process into three kinds of assays: a two-component assay, a three-component assay and a stepped three-component assay. Although we compared these three kinds of assays for recognizing DNA and protein molecules, the assay can be used for virtually any molecule, including ions and metabolites. In total, the optimized assay permits detecting analytes with high sensitivity in a short time, 5 min, at room temperature. Such a system is appropriate for POC testing.

  7. The electrophoretic mobility shift assay (EMSA)

    OpenAIRE

    sprotocols

    2015-01-01

    The electrophoretic mobility shift assay (EMSA), also known as “gel shift assay”, is used to examine the binding parameters and relative affinities of protein and DNA interactions. We produced recombinant CCA1 protein and tested its binding affinity for the promoter fragments that contain CBS (AAAAATCT) or evening element (EE, AAAATATCT) (1) using a modified procedure adopted from published protocols (2,3).

  8. System Reliability Engineering

    International Nuclear Information System (INIS)

    Lim, Tae Jin

    2005-02-01

    This book tells of reliability engineering, which includes quality and reliability, reliability data, importance of reliability engineering, reliability and measure, the poisson process like goodness of fit test and the poisson arrival model, reliability estimation like exponential distribution, reliability of systems, availability, preventive maintenance such as replacement policies, minimal repair policy, shock models, spares, group maintenance and periodic inspection, analysis of common cause failure, and analysis model of repair effect.

  9. New Application of the Comet Assay

    Science.gov (United States)

    Cortés-Gutiérrez, Elva I.; Dávila-Rodríguez, Martha I.; Fernández, José Luís; López-Fernández, Carmen; Gosálbez, Altea; Gosálvez, Jaime

    2011-01-01

    The comet assay is a well-established, simple, versatile, visual, rapid, and sensitive tool used extensively to assess DNA damage and DNA repair quantitatively and qualitatively in single cells. The comet assay is most frequently used to analyze white blood cells or lymphocytes in human biomonitoring studies, although other cell types have been examined, including buccal, nasal, epithelial, and placental cells and even spermatozoa. This study was conducted to design a protocol that can be used to generate comets in subnuclear units, such as chromosomes. The new technique is based on the chromosome isolation protocols currently used for whole chromosome mounting in electron microscopy, coupled to the alkaline variant of the comet assay, to detect DNA damage. The results show that migrant DNA fragments can be visualized in whole nuclei and isolated chromosomes and that they exhibit patterns of DNA migration that depend on the level of DNA damage produced. This protocol has great potential for the highly reproducible study of DNA damage and repair in specific chromosomal domains. PMID:21540337

  10. Radioreceptor opioid assay

    International Nuclear Information System (INIS)

    Miller, R.J.; Chang, K.-J.

    1981-01-01

    A radioreceptor assay is described for assaying opioid drugs in biological fluids. The method enables the assay of total opioid activity, being specific for opioids as a class but lacking specificity within the class. A radio-iodinated opioid and the liquid test sample are incubated with an opiate receptor material. The percentage inhibition of the binding of the radio-iodinated compound to the opiate receptor is calculated and the opioid activity of the test liquid determined from a standard curve. Examples of preparing radio-iodinated opioids and assaying opioid activity are given. A test kit for the assay is described. Compared to other methods, this assay is cheap, easy and rapid. (U.K.)

  11. Absolute nuclear material assay

    Science.gov (United States)

    Prasad, Manoj K [Pleasanton, CA; Snyderman, Neal J [Berkeley, CA; Rowland, Mark S [Alamo, CA

    2010-07-13

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  12. Biomonitoring of genotoxic risk in radar facility workers: comparison of the comet assay with micronucleus assay and chromatid breakage assay

    International Nuclear Information System (INIS)

    Garaj-Vrhovac, V.; Kopjar, N.

    2003-01-01

    Genotoxic risks of occupational exposure in a radar facility were evaluated by using the alkaline comet assay, the micronucleus assay and the chromatid breakage assay on peripheral blood leukocytes in exposed subjects and corresponding controls. Results show that occupational exposure to microwave radiation correlates with an increase of genome damage in somatic cells. The levels of DNA damage in exposed subjects determined by using the alkaline comet assay were increased compared to control and showed interindividual variations. The incidence of micronuclei was also significantly increased compared to baseline control values. After short exposure of cultured lymphocytes to bleomycin, cells of occupationally exposed subjects responded with high numbers of chromatid breaks. Although the level of chromosome damage generated by bleomycin varied greatly between individuals, in exposed subjects a significantly elevated number of chromatid breaks was observed. Our results support data reported in the literature indicating that microwave radiation represents a potential DNA-damaging hazard. The alkaline comet assay is confirmed as a sensitive and highly reproducible technique for the detection of primary DNA damage inflicted in somatic cells. The micronucleus assay was confirmed as a reliable bio-marker of effect and the chromatid breakage assay as a sensitive bio-marker of individual cancer susceptibility. The results obtained also confirm the necessity to improve measures and to perform accurate health surveillance of individuals occupationally exposed to microwave radiation

  13. Introducing MINA--The Molecularly Imprinted Nanoparticle Assay.

    Science.gov (United States)

    Shutov, Roman V; Guerreiro, Antonio; Moczko, Ewa; de Vargas-Sansalvador, Isabel Perez; Chianella, Iva; Whitcombe, Michael J; Piletsky, Sergey A

    2014-03-26

    A new ELISA (enzyme-linked immunosorbent assay)-like assay is demonstrated in which no elements of biological origin are used for molecular recognition or signaling. Composite imprinted nanoparticles that contain a catalytic core and which are synthesized by using a solid-phase approach can simultaneously act as recognition/signaling elements, and be used with minimal modifications to standard assay protocols. This assay provides a new route towards replacement of unstable biomolecules in immunoassays. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Endogenous Locus Reporter Assays.

    Science.gov (United States)

    Liu, Yaping; Hermes, Jeffrey; Li, Jing; Tudor, Matthew

    2018-01-01

    Reporter gene assays are widely used in high-throughput screening (HTS) to identify compounds that modulate gene expression. Traditionally a reporter gene assay is built by cloning an endogenous promoter sequence or synthetic response elements in the regulatory region of a reporter gene to monitor transcriptional activity of a specific biological process (exogenous reporter assay). In contrast, an endogenous locus reporter has a reporter gene inserted in the endogenous gene locus that allows the reporter gene to be expressed under the control of the same regulatory elements as the endogenous gene, thus more accurately reflecting the changes seen in the regulation of the actual gene. In this chapter, we introduce some of the considerations behind building a reporter gene assay for high-throughput compound screening and describe the methods we have utilized to establish 1536-well format endogenous locus reporter and exogenous reporter assays for the screening of compounds that modulate Myc pathway activity.

  15. A murine monoclonal antibody based enzyme-linked immunosorbent assay for almond (Prunus dulcis L.) detection.

    Science.gov (United States)

    Su, Mengna; Venkatachalam, Mahesh; Liu, Changqi; Zhang, Ying; Roux, Kenneth H; Sathe, Shridhar K

    2013-11-13

    A sandwich enzyme-linked immunosorbent assay (ELISA) using anti-almond soluble protein rabbit polyclonal antibodies as capture antibodies and murine monoclonal antibody 4C10 as the detection antibody was developed. The assay is specific and sensitive (3-200 ng almond protein/mL) for almond detection. The standardized assay is accurate, with low assay variability. The assay did not register any cross-reactivity with the tested food matrices, suggesting the assay to be almond amandin specific. The assay could detect the presence of declared almond in the tested matched commercial samples. Further, the assay reliably detected the presence of almonds in laboratory-prepared food samples spiked with almond flour.
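
    As a hedged illustration of how such a sandwich ELISA is typically quantified (not the authors' exact protocol), the sketch below fits a four-parameter logistic standard curve to absorbance readings and back-calculates almond protein in an unknown; the concentrations and absorbances are invented, with the 3-200 ng/mL range mirroring the reported working range.

```python
# Illustrative 4PL standard-curve fit and back-calculation; data are invented.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4PL: a = response at zero, d = maximal response, c = EC50, b = slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

conc = np.array([3, 10, 30, 60, 100, 200], dtype=float)      # ng almond protein/mL
absorbance = np.array([0.08, 0.21, 0.55, 0.92, 1.25, 1.70])  # invented OD readings

popt, _ = curve_fit(four_pl, conc, absorbance, p0=[0.05, 1.0, 50.0, 2.0], maxfev=10000)

def back_calculate(od, a, b, c, d):
    """Invert the 4PL to estimate concentration from an absorbance reading."""
    return c * ((a - d) / (od - d) - 1.0) ** (1.0 / b)

print(back_calculate(0.70, *popt))   # estimated ng almond protein / mL in the unknown
```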

  16. Optimisation of the microplate resazurin assay for screening and bioassay-guided fractionation of phytochemical extracts against Mycobacterium tuberculosis.

    Science.gov (United States)

    O'Neill, Taryn E; Li, Haoxin; Colquhoun, Caitlyn D; Johnson, John A; Webster, Duncan; Gray, Christopher A

    2014-01-01

    Because of increased resistance to current drugs, there is an urgent need to discover new anti-mycobacterial compounds for the development of novel anti-tuberculosis drugs. The microplate resazurin assay (MRA) is commonly used to evaluate natural products and synthetic compounds for anti-mycobacterial activity. However, the assay can be problematic and unreliable when screening methanolic phytochemical extracts. The objective was to optimise the MRA for the screening and bioassay-guided fractionation of phytochemical extracts using Mycobacterium tuberculosis H37Ra. The effects of varying assay duration, resazurin solution composition, solvent (dimethyl sulphoxide - DMSO) concentration and type of microtitre plate used on the results and reliability of the MRA were investigated. The optimal bioassay protocol was applied to methanolic extracts of medicinal plants that have been reported to possess anti-mycobacterial activity. The variables investigated were found to have significant effects on the results obtained with the MRA. A standardised procedure that can reliably quantify anti-mycobacterial activity of phytochemical extracts in as little as 48 h was identified. The optimised MRA uses 2% aqueous DMSO, with an indicator solution of 62.5 µg/mL resazurin in 5% aqueous Tween 80, over a 96 h incubation. The study has identified an optimal procedure for the MRA when used with M. tuberculosis H37Ra that gives rapid, reliable and consistent results. The assay procedure has been used successfully for the screening and bioassay-guided fractionation of anti-mycobacterial compounds from methanol extracts of Canadian medicinal plants. Copyright © 2014 John Wiley & Sons, Ltd.
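
    A small sketch of one common way to reduce resazurin fluorescence readings to percent inhibition of growth (an assumed formula, not necessarily the calculation used in the paper); all fluorescence values are invented.

```python
# Assumed percent-inhibition calculation for a resazurin microplate assay.
def percent_inhibition(f_test, f_growth_control, f_sterile_control):
    """Inhibition relative to drug-free growth and cell-free background wells."""
    window = f_growth_control - f_sterile_control
    return 100.0 * (1.0 - (f_test - f_sterile_control) / window)

# Invented relative fluorescence units (RFU) after 48 h incubation
print(percent_inhibition(f_test=4200, f_growth_control=18500, f_sterile_control=900))
```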

  17. AMSAA Reliability Growth Guide

    National Research Council Canada - National Science Library

    Broemm, William

    2000-01-01

    ... has developed reliability growth methodology for all phases of the process, from planning to tracking to projection. The report presents this methodology and associated reliability growth concepts.

  18. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  19. Solid phase assays

    International Nuclear Information System (INIS)

    Reese, M.G.; Johnson, L.R.; Ransom, D.K.

    1980-01-01

    In a solid phase assay for quantitative determination of biological and other analytes, a sample such as serum is contacted with a receptor for the analyte being assayed, the receptor being supported on a solid support. No tracer for the analyte is added to the sample before contacting with the receptor; instead the tracer is contacted with the receptor after unbound analyte has been removed from the receptor. The assay can be otherwise performed in a conventional manner but can give greater sensitivity. (author)

  20. Real-time Quaking-induced Conversion Assay for Detection of CWD Prions in Fecal Material.

    Science.gov (United States)

    Cheng, Yo Ching; Hannaoui, Samia; John, Theodore Ralph; Dudas, Sandor; Czub, Stefanie; Gilch, Sabine

    2017-09-29

    The RT-QuIC technique is a sensitive in vitro cell-free prion amplification assay based mainly on the seeded misfolding and aggregation of recombinant prion protein (PrP) substrate using prion seeds as a template for the conversion. RT-QuIC is a novel high-throughput technique which is analogous to real-time polymerase chain reaction (PCR). Detection of amyloid fibril growth is based on the dye Thioflavin T, which fluoresces upon specific interaction with β-sheet rich proteins. Thus, amyloid formation can be detected in real time. We attempted to develop a reliable non-invasive screening test to detect chronic wasting disease (CWD) prions in fecal extract. Here, we have specifically adapted the RT-QuIC technique to reveal PrPSc seeding activity in feces of CWD infected cervids. Initially, the seeding activity of the fecal extracts we prepared was relatively low in RT-QuIC, possibly due to potential assay inhibitors in the fecal material. To improve seeding activity of feces extracts and remove potential assay inhibitors, we homogenized the fecal samples in a buffer containing detergents and protease inhibitors. We also submitted the samples to different methodologies to concentrate PrPSc on the basis of protein precipitation using sodium phosphotungstic acid, and centrifugal force. Finally, the feces extracts were tested by optimized RT-QuIC which included substrate replacement in the protocol to improve the sensitivity of detection. Thus, we established a protocol for sensitive detection of CWD prion seeding activity in feces of pre-clinical and clinical cervids by RT-QuIC, which can be a practical tool for non-invasive CWD diagnosis.
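
    The sketch below illustrates, with invented readings, one common way RT-QuIC kinetics are scored: a well is called positive when its Thioflavin T fluorescence exceeds a threshold derived from the early-cycle baseline, and the lag time is the first crossing. The threshold rule and numbers are assumptions, not the authors' exact analysis.

```python
# Illustrative RT-QuIC scoring; fluorescence trace and threshold rule assumed.
import numpy as np

times_h = np.arange(0, 60, 0.25)                                 # quarter-hour ThT reads
baseline = 5000 + 50 * np.random.default_rng(1).standard_normal(times_h.size)
signal = baseline + 40000 / (1 + np.exp(-(times_h - 30) / 2))    # sigmoidal seeding response

threshold = baseline[:16].mean() + 10 * baseline[:16].std()      # e.g. mean + 10 SD of early cycles

crossing = np.argmax(signal > threshold)
if signal[crossing] > threshold:
    print(f"positive well, lag time ~ {times_h[crossing]:.2f} h")
else:
    print("negative well within the run")
```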

  1. Relationship between the radioisotopic footpad assay and other immunological assays in tumor bearing rats

    International Nuclear Information System (INIS)

    Mizushima, Yutaka; Takeichi, Noritoshi; Minami, Akio; Kasai, Masaharu; Itaya, Toshiyuki

    1981-01-01

    KMT-17, a fibrosarcoma induced by 3-methylcholanthrene in a WKA rat, is a sensitive tumor to various kinds of immunological assays and is a suitable model tumor for the study of the immune status in tumor bearing hosts. The antitumor immune response of KMT-17 bearing rats was studied by a radioisotopic footpad assay (FPA) in comparison with other in vivo and in vitro assays. Delayed hypersensitivity to tumor antigens measured by the FPA was observed from the 8th day after transplantation of KMT-17 cells, reached a peak on the 12 - 15th day, and then declined in the late stage on the 17th day. The kinetics of the FPA correlated well with those of an in vivo Winn assay and of an in vitro lymphocyte cytotoxicity assay ( 51 Cr-release assay). The appearance of an antitumor antibody detected by a complement dependent cytotoxicity test also correlated well with the kinetics of the FPA. A growth inhibition assay (GIA) for non-specific cell-mediated immunity also showed similar kinetics to that of the FPA. The delayed hypersensitivity footpad reaction to tumor cell extracts measured by this FPA was tumor-specific. These results suggest that the FPA is a simple and reliable in vivo assay for evaluating antitumor immunity in tumor bearing hosts. (author)

  2. Advanced flooding-based routing protocols for underwater sensor networks

    OpenAIRE

    Isufi, E.; Dol, H.; Leus, G.J.T.

    2016-01-01

    Flooding-based protocols are a reliable solution to deliver packets in underwater sensor networks. However, these protocols potentially involve all the nodes in the forwarding process. Thus, the performance and energy efficiency are not optimal. In this work, we propose some advances of a flooding-based protocol with the goal to improve the performance and the energy efficiency. The first idea considers the node position information in order to reduce the number of relays that may apply flood...
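
    As a generic illustration of position-based relay suppression (our own example, not the protocol proposed in the paper), a node could re-broadcast a flooded packet only when it makes sufficient geometric progress toward the sink relative to the node it heard the packet from:

```python
# Generic position-gated flooding rule; node layout and threshold are invented.
from dataclasses import dataclass
import math

@dataclass
class Node:
    node_id: int
    x: float
    y: float
    depth: float

def distance_to_sink(node, sink):
    return math.dist((node.x, node.y, node.depth), (sink.x, sink.y, sink.depth))

def should_relay(me, previous_hop, sink, progress_threshold=10.0):
    """Relay only if forwarding makes enough geometric progress toward the sink."""
    progress = distance_to_sink(previous_hop, sink) - distance_to_sink(me, sink)
    return progress >= progress_threshold

sink = Node(0, 0.0, 0.0, 0.0)
prev = Node(7, 250.0, 40.0, 120.0)
me = Node(12, 180.0, 30.0, 100.0)
print(should_relay(me, prev, sink))
```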

  3. Factor IX assay

    Science.gov (United States)

    MedlinePlus encyclopedia entry: //medlineplus.gov/ency/article/003679.htm (Factor IX assay).

  4. Factor VIII assay

    Science.gov (United States)

    MedlinePlus encyclopedia entry: //medlineplus.gov/ency/article/003678.htm (Factor VIII assay).

  5. Factor II assay

    Science.gov (United States)

    MedlinePlus encyclopedia entry: //medlineplus.gov/ency/article/003674.htm (Factor II assay).

  6. Factor VII assay

    Science.gov (United States)

    MedlinePlus encyclopedia entry: //medlineplus.gov/ency/article/003676.htm (Factor VII assay).

  7. Microbead agglutination based assays

    KAUST Repository

    Kodzius, Rimantas; Castro, David; Foulds, Ian G.; Parameswaran, Ash M.; Sumanpreet, K. Chhina

    2013-01-01

    We report a simple and rapid room temperature assay for point-of-care (POC) testing that is based on specific agglutination. Agglutination tests are based on aggregation of microbeads in the presence of a specific analyte thus enabling

  8. Vertical Protocol Composition

    DEFF Research Database (Denmark)

    Groß, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    The security of key exchange and secure channel protocols, such as TLS, has been studied intensively. However, only a few works have considered what happens when the established keys are actually used—to run some protocol securely over the established “channel”. We call this a vertical protocol composition, and it must be ensured that the composition is secure, i.e., that the combination cannot introduce attacks that the individual protocols in isolation do not have. In this work, we prove a composability result in the symbolic model that allows for arbitrary vertical composition (including self-composition). It holds for protocols from any suite of channel and application...

  9. Controlling variation in the comet assay

    Directory of Open Access Journals (Sweden)

    Andrew Richard Collins

    2014-10-01

    Full Text Available Variability of the comet assay is a serious issue, whether it occurs from experiment to experiment in the same laboratory, or between different laboratories analysing identical samples. Do we have to live with high variability, just because the comet assay is a biological assay rather than analytical chemistry? Numerous attempts have been made to limit variability by standardising the assay protocol, and the critical steps in the assay have been identified; agarose concentration, duration of alkaline incubation, and electrophoresis conditions (time, temperature and voltage gradient) are particularly important. Even when these are controlled, variation seems to be inevitable. It is helpful to include in experiments reference standards, i.e. cells with a known amount of specific damage to the DNA. They can be aliquots frozen from a single large batch of cells, either untreated (negative controls) or treated with, for example, H2O2 or X-rays to induce strand breaks (positive control for the basic assay), or photosensitiser plus light to oxidise guanine (positive control for Fpg- or OGG1-sensitive sites). Reference standards are especially valuable when performing a series of experiments over a long period - for example, analysing samples of white blood cells from a large human biomonitoring trial - to check that the assay is performing consistently, and to identify anomalous results necessitating a repeat experiment. The reference values of tail intensity can also be used to iron out small variations occurring from day to day. We present examples of the use of reference standards in human trials, both within one laboratory and between different laboratories, and describe procedures that can be used to control variation.
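
    A hedged example of the day-to-day correction mentioned above: the sample's tail intensity is scaled by the ratio of the reference standard's nominal value to the value it scored in the same run. The correction factor and values are illustrative, not a prescribed procedure.

```python
# Illustrative reference-standard normalisation of comet % tail intensity.
def normalise_tail_intensity(sample_ti, ref_ti_today, ref_ti_nominal):
    """Scale a sample's % tail intensity by the ratio of the reference
    standard's nominal value to the value measured in the same run."""
    return sample_ti * (ref_ti_nominal / ref_ti_today)

# The frozen reference standard was characterised at 12% tail intensity,
# but scored 15% in today's electrophoresis run (invented values).
print(normalise_tail_intensity(sample_ti=22.0, ref_ti_today=15.0, ref_ti_nominal=12.0))
```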

  10. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on reliability (what it is, why it is needed, and how it is achieved and measured), the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks in different industries are mentioned in the following chapter. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the next chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and the IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European reliability data system, ERDS, and the development of a large data bank come next. The last three chapters look at 'Reliability data banks - friend, foe or a waste of time?' and at future developments. (UK)

  11. Suncor maintenance and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Little, S. [Suncor Energy, Calgary, AB (Canada)

    2006-07-01

    Fleet maintenance and reliability at Suncor Energy was discussed in this presentation, with reference to Suncor Energy's primary and support equipment fleets. This paper also discussed Suncor Energy's maintenance and reliability standard involving people, processes and technology. An organizational maturity chart that graphed organizational learning against organizational performance was illustrated. The presentation also reviewed the maintenance and reliability framework; maintenance reliability model; the process overview of the maintenance and reliability standard; a process flow chart of maintenance strategies and programs; and an asset reliability improvement process flow chart. An example of an improvement initiative was included, with reference to a shovel reliability review; a dipper trip reliability investigation; bucket related failures by type and frequency; root cause analysis of the reliability process; and additional actions taken. Last, the presentation provided a graph of the results of the improvement initiative and presented the key lessons learned. tabs., figs.

  12. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    A high reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with a high reliability. In order to optimize the overall reliability of an accelerator one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution will describe the forum and advertise its usage in the community.

  13. Assay method and compositions

    International Nuclear Information System (INIS)

    1977-01-01

    Methods are described for measuring catecholamine levels in human and animal body fluids and tissues using the catechol-O-methyl-transferase (COMT) radioassay. The assay involves incubating the biological sample with COMT and the tritiated methyl donor, S-adenosyl-L-methionine( 3 H)-methyl. The O-methylated ( 3 H) epinephrine and/or norepinephrine are extracted and oxidised to vanillin- 3 H which in turn is extracted and its radioactivity counted. When analysing dopamine levels the assay is extended by vanillin- 3 H and raising the pH of the aqueous periodate phase from which O-methylated ( 3 H) dopamine is extracted and counted. The assay may be modified depending on whether measurements of undifferentiated total endogenous catecholamine levels or differential analyses of the catecholamine levels are being performed. The sensitivity of the assay can be as low as 5 picograms for norepinephrine and epinephrine and 12 picograms for dopamine. The assemblance of the essential components of the assay into a kit for use in laboratories is also described. (U.K.)

  14. Serotype determination of Salmonella by xTAG assay.

    Science.gov (United States)

    Zheng, Zhibei; Zheng, Wei; Wang, Haoqiu; Pan, Jincao; Pu, Xiaoying

    2017-10-01

    Currently, no protocols or commercial kits are available to determine the serotypes of Salmonella by using Luminex MAGPIX®. In this study, an xTAG assay for serotype determination of Salmonella suitable for Luminex MAGPIX® is described and 228 Salmonella isolates were serotype determined by this xTAG assay. The xTAG assay consists of two steps: 1) Multiplex PCR to amplify simultaneously O, H and Vi antigen genes of Salmonella, and 2) Magplex-TAG™ microsphere hybridization to identify accurately the specific PCR products of different antigens. Compared with the serotyping results of traditional serum agglutination test, the sensitivity and specificity of the xTAG assay were 95.1% and 100%, respectively. The agreement rate of these two assays was 95.2%. Compared with Luminex xMAP® Salmonella Serotyping Assay (SSA) kit, the advantages of this xTAG assay are: First, the magnetic beads make it applicable to both the Luminex®100/200™ and MAGPIX® systems. Second, only primers rather than both primers and probes are needed in the xTAG assay, and the process of coupling antigen-specific oligonucleotide probes to beads is circumvented, which make the xTAG assay convenient to be utilized by other laboratories. The xTAG assay may serve as a rapid alternative or complementary method for traditional Salmonella serotyping tests, especially for laboratories that utilize the MAGPIX® systems. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Rover waste assay system

    Energy Technology Data Exchange (ETDEWEB)

    Akers, D.W.; Stoots, C.M.; Kraft, N.C.; Marts, D.J. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1997-11-01

    The Rover Waste Assay System (RWAS) is a nondestructive assay system designed for the rapid assay of highly-enriched 235U contaminated piping, tank sections, and debris from the Rover nuclear rocket fuel processing facility at the Idaho Chemical Processing Plant. A scanning system translates a NaI(Tl) detector/collimator system over the structural components, where both relative and calibrated measurements for 137Cs are made. Uranium-235 concentrations are determined from these measurements. The system is in operation and is sufficiently automated that most functions are performed by the computer system. These functions include system calibration, problem identification, collimator control, data analysis, and reporting. Calibration of the system was done through a combination of measurements on calibration standards and benchmarked modeling. A description of the system is presented along with the methods and uncertainties associated with the calibration and analysis of the system for components from the Rover facility. 4 refs., 2 figs., 4 tabs.

  16. Rover waste assay system

    International Nuclear Information System (INIS)

    Akers, D.W.; Stoots, C.M.; Kraft, N.C.; Marts, D.J.

    1997-01-01

    The Rover Waste Assay System (RWAS) is a nondestructive assay system designed for the rapid assay of highly-enriched 235 U contaminated piping, tank sections, and debris from the Rover nuclear rocket fuel processing facility at the Idaho Chemical Processing Plant. A scanning system translates a NaI(Tl) detector/collimator system over the structural components, where both relative and calibrated measurements for 137 Cs are made. Uranium-235 concentrations are determined from these measurements. The system is in operation and is sufficiently automated that most functions are performed by the computer system. These functions include system calibration, problem identification, collimator control, data analysis, and reporting. Calibration of the system was done through a combination of measurements on calibration standards and benchmarked modeling. A description of the system is presented along with the methods and uncertainties associated with the calibration and analysis of the system for components from the Rover facility. 4 refs., 2 figs., 4 tabs

  17. Radioreceptor assay for insulin

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Kazuo [Tokyo Univ. (Japan). Faculty of Medicine

    1975-04-01

    The radioreceptor assay of insulin is discussed with respect to the measuring method, its merits, remaining problems, and its clinical application. A rat liver 10 x g pellet was used as the receptor preparation, and enzymatic degradation of insulin by this fraction was inhibited by adding 1 mM p-CMB. 125I-labelled porcine insulin was prepared by the lactoperoxidase method, with overnight incubation at 4°C and subsequent purification on a Sephadex G-25 column and Whatman CF-11 cellulose powder. Dog pancreatic vein serum insulin during and after a glucose load was determined by both the radioreceptor assay and radioimmunoassay, and the two measurements agreed well. The radioreceptor assay should help clarify the pathology of disorders of glucose metabolism, including diabetes.

  18. Clonogenic assay: adherent cells.

    Science.gov (United States)

    Rafehi, Haloom; Orlowski, Christian; Georgiadis, George T; Ververis, Katherine; El-Osta, Assam; Karagiannis, Tom C

    2011-03-13

    The clonogenic (or colony forming) assay has been established for more than 50 years; the original paper describing the technique was published in 1956. Apart from documenting the method, the initial landmark study generated the first radiation-dose response curve for X-ray irradiated mammalian (HeLa) cells in culture. Basically, the clonogenic assay enables an assessment of the differences in reproductive viability (capacity of cells to produce progeny; i.e. a single cell to form a colony of 50 or more cells) between control untreated cells and cells that have undergone various treatments such as exposure to ionising radiation, various chemical compounds (e.g. cytotoxic agents) or in other cases genetic manipulation. The assay has become the most widely accepted technique in radiation biology and has been widely used for evaluating the radiation sensitivity of different cell lines. Further, the clonogenic assay is commonly used for monitoring the efficacy of radiation modifying compounds and for determining the effects of cytotoxic agents and other anti-cancer therapeutics on colony forming ability, in different cell lines. A typical clonogenic survival experiment using adherent cells lines involves three distinct components, 1) treatment of the cell monolayer in tissue culture flasks, 2) preparation of single cell suspensions and plating an appropriate number of cells in petri dishes and 3) fixing and staining colonies following a relevant incubation period, which could range from 1-3 weeks, depending on the cell line. Here we demonstrate the general procedure for performing the clonogenic assay with adherent cell lines with the use of an immortalized human keratinocyte cell line (FEP-1811). Also, our aims are to describe common features of clonogenic assays including calculation of the plating efficiency and survival fractions after exposure of cells to radiation, and to exemplify modification of radiation-response with the use of a natural antioxidant
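
    The plating-efficiency and survival-fraction arithmetic referred to above is standard for clonogenic assays; the sketch below shows it with invented colony counts.

```python
# Standard clonogenic-assay calculations; colony counts are invented examples.
def plating_efficiency(colonies_control, cells_seeded_control):
    """Fraction of seeded, untreated cells that form colonies."""
    return colonies_control / cells_seeded_control

def survival_fraction(colonies_treated, cells_seeded_treated, pe):
    """Colony-forming ability of treated cells, corrected for plating efficiency."""
    return colonies_treated / (cells_seeded_treated * pe)

pe = plating_efficiency(colonies_control=180, cells_seeded_control=200)       # 0.90
sf = survival_fraction(colonies_treated=45, cells_seeded_treated=500, pe=pe)  # 0.10
print(f"PE = {pe:.2f}, surviving fraction at this dose = {sf:.2f}")
```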

  19. Human Reliability Program Overview

    Energy Technology Data Exchange (ETDEWEB)

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  20. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
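
    A minimal sketch of the fault-tree idea described in the report: system reliability is derived from component reliabilities by combining series blocks (every component must work) and redundant parallel blocks (any one suffices). The component structure and values are invented.

```python
# Illustrative series/parallel reliability combination; values are invented.
def series(*reliabilities):
    """All components must work for the block to work."""
    p = 1.0
    for r in reliabilities:
        p *= r
    return p

def parallel(*reliabilities):
    """The block works if any one redundant component works."""
    q = 1.0
    for r in reliabilities:
        q *= (1.0 - r)
    return 1.0 - q

r_dc_link = 0.995                     # assumed component reliability over mission time
r_fan = 0.97                          # assumed reliability of each redundant cooling fan
r_system = series(r_dc_link, parallel(r_fan, r_fan))
print(f"system reliability over the mission time: {r_system:.4f}")
```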

  1. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP) [de

  2. Scintillation proximity assay

    International Nuclear Information System (INIS)

    Hart, H.

    1980-01-01

    In a method of immunological assay two different classes of particles which interact at short distances to produce characteristic detectable signals are employed in a modification of the usual latex fixation test. In one embodiment an aqueous suspension of antigen coated tritiated latex particles (LH) and antigen coated polystyrene scintillant particles (L*) is employed to assay antibody in the aqueous medium. The amount of (LH) (L*) dimer formation and higher order aggregation induced and therefore the concentration of antibody (or antigen) present which caused the aggregation can be determined by using standard liquid scintillation counting equipment. (author)

  3. Assays for calcitonin receptors

    International Nuclear Information System (INIS)

    Teitelbaum, A.P.; Nissenson, R.A.; Arnaud, C.D.

    1985-01-01

    The assays for calcitonin receptors described focus on their use in the study of the well-established target organs for calcitonin, bone and kidney. The radioligand used in virtually all calcitonin binding studies is 125 I-labelled salmon calcitonin. The lack of methionine residues in this peptide permits the use of chloramine-T for the iodination reaction. Binding assays are described for intact bone, skeletal plasma membranes, renal plasma membranes, and primary kidney cell cultures of rats. Studies on calcitonin metabolism in laboratory animals and regulation of calcitonin receptors are reviewed

  4. The Americleft Speech Project: A Training and Reliability Study.

    Science.gov (United States)

    Chapman, Kathy L; Baylis, Adriane; Trost-Cardamone, Judith; Cordero, Kelly Nett; Dixon, Angela; Dobbelsteyn, Cindy; Thurmes, Anna; Wilson, Kristina; Harding-Bell, Anne; Sweeney, Triona; Stoddard, Gregory; Sell, Debbie

    2016-01-01

    To describe the results of two reliability studies and to assess the effect of training on interrater reliability scores. The first study (1) examined interrater and intrarater reliability scores (weighted and unweighted kappas) and (2) compared interrater reliability scores before and after training on the use of the Cleft Audit Protocol for Speech-Augmented (CAPS-A) with British English-speaking children. The second study examined interrater and intrarater reliability on a modified version of the CAPS-A (CAPS-A Americleft Modification) with American and Canadian English-speaking children. Finally, comparisons were made between the interrater and intrarater reliability scores obtained for Study 1 and Study 2. The participants were speech-language pathologists from the Americleft Speech Project. In Study 1, interrater reliability scores improved for 6 of the 13 parameters following training on the CAPS-A protocol. Comparison of the reliability results for the two studies indicated lower scores for Study 2 compared with Study 1. However, this appeared to be an artifact of the kappa statistic that occurred due to insufficient variability in the reliability samples for Study 2. When percent agreement scores were also calculated, the ratings appeared similar across Study 1 and Study 2. The findings of this study suggested that improvements in interrater reliability could be obtained following a program of systematic training. However, improvements were not uniform across all parameters. Acceptable levels of reliability were achieved for those parameters most important for evaluation of velopharyngeal function.
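
    For readers unfamiliar with the statistic, the sketch below computes an unweighted Cohen's kappa for two raters scoring the same samples on a 3-point scale; the ratings are invented and are not CAPS-A data.

```python
# Illustrative unweighted Cohen's kappa for two raters; ratings are invented.
import numpy as np

rater_a = np.array([0, 1, 1, 2, 0, 2, 1, 0, 2, 1])
rater_b = np.array([0, 1, 2, 2, 0, 2, 1, 1, 2, 1])

def cohens_kappa(a, b, n_categories):
    confusion = np.zeros((n_categories, n_categories))
    for i, j in zip(a, b):
        confusion[i, j] += 1
    confusion /= confusion.sum()
    observed = np.trace(confusion)                                   # observed agreement
    expected = (confusion.sum(axis=1) * confusion.sum(axis=0)).sum() # chance agreement
    return (observed - expected) / (1.0 - expected)

print(f"kappa = {cohens_kappa(rater_a, rater_b, 3):.2f}")
```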

  5. The French dosimetry protocol

    International Nuclear Information System (INIS)

    Dutreix, A.

    1985-01-01

    After a general introduction the protocol is divided in five sections dealing with: determination of the quality of X-ray, γ-ray and electron beams; the measuring instrument; calibration of the reference instrument; determination of the reference absorbed dose in the user's beams; determination of the absorbed dose in water at other points, in other conditions. The French protocol is not essentially different from the Nordic protocol and it is based on the experience gained in using both the American and the Nordic protocols. Therefore, only the main difference with the published protocols are discussed. (Auth.)

  6. Reliable Design Versus Trust

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the users portion of the FPGA design flow. It is assumed that the manufacturer prior to hand-off to the user tests FPGA internal components. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  7. Pocket Handbook on Reliability

    Science.gov (United States)

    1975-09-01

    exponencial distributions Weibull distribution, -xtimating reliability, confidence intervals, relia- bility growth, 0. P- curves, Bayesian analysis. 20 A S...introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. LEWIS NERI, CHIEF...includes one or both of the following objectives: a) prediction of the current system reliability, b) projection on the system reliability for someI future

  8. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

    The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated......, and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown....
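
    The relation between the two fundamental concepts mentioned above, failure probability and reliability index, can be shown numerically; the target values used here are only examples.

```python
# Standard relation beta = -Phi^{-1}(P_f); the example probability is arbitrary.
from scipy.stats import norm

def reliability_index(p_failure):
    """Reliability index corresponding to a failure probability."""
    return -norm.ppf(p_failure)

def failure_probability(beta):
    """Failure probability corresponding to a reliability index."""
    return norm.cdf(-beta)

print(reliability_index(1e-4))     # ~ 3.72
print(failure_probability(3.72))   # ~ 1e-4
```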

  9. Reliability in engineering '87

    International Nuclear Information System (INIS)

    Tuma, M.

    1987-01-01

    The participants heard 51 papers dealing with the reliability of engineering products. Two of the papers were incorporated in INIS, namely ''Reliability comparison of two designs of low pressure regeneration of the 1000 MW unit at the Temelin nuclear power plant'' and ''Use of probability analysis of reliability in designing nuclear power facilities.''(J.B.)

  10. Targeted resequencing and variant validation using pxlence PCR assays

    Directory of Open Access Journals (Sweden)

    Frauke Coppieters

    2016-01-01

    Full Text Available The advent of next-generation sequencing technologies had a profound impact on molecular diagnostics. PCR is a popular method for target enrichment of disease gene panels. Using our proprietary primer-design pipeline, primerXL, we have created almost one million assays covering over 98% of the human exome. Here we describe the assay specification and both in silico and wet-lab validation of a selected set of 2294 assays using both next-generation sequencing and Sanger sequencing. Using a universal PCR protocol without optimization, these assays result in high coverage uniformity and limited non-specific coverage. In addition, data indicates a positive correlation between the predictive in silico specificity score and the amount of assay non-specific coverage.

  11. Protocol Implementation Generator

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.

    2010-01-01

    Users expect communication systems to guarantee, amongst others, privacy and integrity of their data. These can be ensured by using well-established protocols; the best protocol, however, is useless if not all parties involved in a communication have a correct implementation of the protocol and all necessary tools. In this paper, we present the Protocol Implementation Generator (PiG), a framework that can be used to add protocol generation to protocol negotiation, or to easily share and implement new protocols throughout a network. PiG enables the sharing, verification, and translation of protocols through a Protocol Implementation Generator framework based on the LySatool and a translator from the LySa language into C or Java.

  12. MDP: Reliable File Transfer for Space Missions

    Science.gov (United States)

    Rash, James; Criscuolo, Ed; Hogie, Keith; Parise, Ron; Hennessy, Joseph F. (Technical Monitor)

    2002-01-01

    This paper presents work being done at NASA/GSFC by the Operating Missions as Nodes on the Internet (OMNI) project to demonstrate the application of the Multicast Dissemination Protocol (MDP) to space missions to reliably transfer files. This work builds on previous work by the OMNI project to apply Internet communication technologies to space communication. The goal of this effort is to provide an inexpensive, reliable, standard, and interoperable mechanism for transferring files in the space communication environment. Limited bandwidth, noise, delay, intermittent connectivity, link asymmetry, and one-way links are all possible issues for space missions. Although these are link-layer issues, they can have a profound effect on the performance of transport and application level protocols. MDP, a UDP-based reliable file transfer protocol, was designed for multicast environments which have to address these same issues, and it has done so successfully. Developed by the Naval Research Lab in the mid-1990s, MDP is now in daily use by both the US Post Office and the DoD. This paper describes the use of MDP to provide automated end-to-end data flow for space missions. It examines the results of a parametric study of MDP in a simulated space link environment and discusses the results in terms of their implications for space missions. Lessons learned are addressed, which suggest minor enhancements to the MDP user interface to add specific features for space mission requirements, such as dynamic control of data rate, and a checkpoint/resume capability. These are features that are provided for in the protocol, but are not implemented in the sample MDP application that was provided. A brief look is also taken at the status of standardization. A version of MDP known as NORM (NACK-Oriented Reliable Multicast) is in the process of becoming an IETF standard.

  13. Lateral flow assays

    NARCIS (Netherlands)

    Posthuma-Trumpie, G.A.; Amerongen, van A.

    2012-01-01

    A simple version of immunochemical-based methods is the Lateral Flow Assay (LFA). It is a dry chemistry technique (reagents are included); the fluid from the sample runs through a porous membrane (often nitrocellulose) by capillary force. Typically the membrane is cut as a strip of 0.5*5 cm. In most

  14. Microchemiluminescent assay system

    Energy Technology Data Exchange (ETDEWEB)

    Kiel, J.L.

    1986-04-09

    The patent concerns a microchemiluminescent assay system, which can be used to detect ionizing radiation, heat or specific substances. The method involves the use of a complex formed from serum albumin and a luminescer which, in the presence of ionizing radiation (heat, or a specific analyte), will emit light in an amount proportional to the amount of radiation, etc. (U.K.).

  15. (MTT) dye reduction assay.

    African Journals Online (AJOL)

    to inhibit proliferation of HeLa cells was determined using the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyl-tetrazolium bromide (MTT) dye reduction assay. Extracts from roots of Agathisanthemum bojeri, Synaptolepis kirkii and Zanha africana and the leaf extract of Physalis peruviana at a concentration of 10 pg/ml inhibited cell ...

  16. Hyaluronic Acid Assays

    DEFF Research Database (Denmark)

    Itenov, Theis S; Kirkby, Nikolai S; Bestle, Morten H

    2015-01-01

    BACKGROUND: Hyaluronic acid (HA) is proposed as a marker of functional liver capacity. The aim of the present study was to compare a new turbidimetric assay for measuring HA with the current standard method. METHODS: HA was measured by a particle-enhanced turbidimetric immunoassay (PETIA) and enzyme...

  17. FLUIDICS DEVICE FOR ASSAY

    DEFF Research Database (Denmark)

    2007-01-01

    The present invention relates to a device for use in performing assays on standard laboratory solid supports whereon chemical entities are attached. The invention furthermore relates to the use of such a device and a kit comprising such a device. The device according to the present invention is a...

  18. Assessment and reduction of comet assay variation in relation to DNA damage: studies from the European Comet Assay Validation Group

    DEFF Research Database (Denmark)

    Møller, Peter; Möller, Lennart; Godschalk, Roger W L

    2010-01-01

    The alkaline single cell gel electrophoresis (comet) assay has become a widely used method for the detection of DNA damage and repair in cells and tissues. Still, it has been difficult to compare results from different investigators because of differences in assay conditions and because the data are reported in different units. The European Comet Assay Validation Group (ECVAG) was established for the purpose of validation of the comet assay with respect to measures of DNA damage formation and its repair. The results from this inter-laboratory validation trial showed a large variation in measured level... reliability for the measurement of DNA damage by the comet assay, but there is still a need for further validation to reduce both assay and inter-laboratory variation....

  19. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  20. Human factor reliability program

    International Nuclear Information System (INIS)

    Knoblochova, L.

    2017-01-01

    The human factor reliability program was introduced at Slovenske elektrarne, a.s. (SE) nuclear power plants as one of the components of the Excellent Performance initiatives in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to 3 major areas of improvement - need for improvement of the results, troubleshooting support, and supporting the achievement of the company's goals. In practice, the human factor reliability program includes: - tools to prevent human error; - managerial observation and coaching; - human factor analysis; - quick information about events involving the human factor; - human reliability timeline and performance indicators; - basic, periodic and extraordinary training in human factor reliability. (authors)

  1. Principles of validation of diagnostic assays for infectious diseases

    International Nuclear Information System (INIS)

    Jacobson, R.H.

    1998-01-01

    Assay validation requires a series of inter-related processes. Assay validation is an experimental process: reagents and protocols are optimized by experimentation to detect the analyte with accuracy and precision. Assay validation is a relative process: its diagnostic sensitivity and diagnostic specificity are calculated relative to test results obtained from reference animal populations of known infection/exposure status. Assay validation is a conditional process: classification of animals in the target population as infected or uninfected is conditional upon how well the reference animal population used to validate the assay represents the target population; accurate predictions of the infection status of animals from test results (PV+ and PV-) are conditional upon the estimated prevalence of disease/infection in the target population. Assay validation is an incremental process: confidence in the validity of an assay increases over time when use confirms that it is robust as demonstrated by accurate and precise results; the assay may also achieve increasing levels of validity as it is upgraded and extended by adding reference populations of known infection status. Assay validation is a continuous process: the assay remains valid only insofar as it continues to provide accurate and precise results as proven through statistical verification. Therefore, the work required for validation of diagnostic assays for infectious diseases does not end with a time-limited series of experiments based on a few reference samples; rather, assuring valid test results from an assay requires constant vigilance and maintenance of the assay, along with reassessment of its performance characteristics for each unique population of animals to which it is applied. (author)
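
    The conditional nature of PV+ and PV- described above can be made concrete with a short calculation: the same diagnostic sensitivity and specificity yield very different predictive values at different prevalences. The numbers are illustrative.

```python
# Standard predictive-value arithmetic; sensitivity, specificity and
# prevalence values below are illustrative examples.
def predictive_values(sensitivity, specificity, prevalence):
    ppv = (sensitivity * prevalence) / (
        sensitivity * prevalence + (1 - specificity) * (1 - prevalence))
    npv = (specificity * (1 - prevalence)) / (
        specificity * (1 - prevalence) + (1 - sensitivity) * prevalence)
    return ppv, npv

for prev in (0.01, 0.10, 0.40):
    ppv, npv = predictive_values(sensitivity=0.95, specificity=0.98, prevalence=prev)
    print(f"prevalence {prev:.0%}: PV+ = {ppv:.2f}, PV- = {npv:.3f}")
```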

  2. Detection and identification of Toscana and other phleboviruses by RT-nested-PCR assays with degenerated primers.

    Science.gov (United States)

    Sánchez-Seco, María-Paz; Echevarría, José-Manuel; Hernández, Lourdes; Estévez, Domingo; Navarro-Marí, José-María; Tenorio, Antonio

    2003-09-01

    Phleboviruses are a large and widespread group of viruses that are transmitted by arthropods. Toscana virus is one of the principal agents that causes meningitis in humans during the summer in Italy and, possibly, in other Mediterranean countries. Rift Valley Fever virus can cause serious illness in both animals and humans, leading to high morbidity and mortality, and is considered to be a potential agent for epizootics and human epidemics. Since information on this group of viruses is still scant, reliable laboratory tools for diagnosis and epidemiological surveillance must be developed, in order to ascertain their real impact on Public Health. Sequence data obtained from Spanish isolates of Toscana virus and other phleboviruses confirmed that natural genome variability may hamper the diagnosis of these agents by molecular methods, so this must be borne in mind when developing reliable assays. In view of the above, a novel and useful protocol has been developed for the detection and specific identification of every member of the phlebovirus genus present in a sample, including Toscana virus, based on a generic RT-nested-PCR, followed by sequencing of the amplified fragment. A change in this method also allowed specific direct detection and identification of wild isolates of Toscana virus of different geographical origin, using newly designed primers. Testing clinical samples with these assays confirmed the role of Toscana virus as an agent that causes acute aseptic meningitis in the central region of Spain. Copyright 2003 Wiley-Liss, Inc.

  3. A duplex PCR assay for the detection of Ralstonia solanacearum phylotype II strains in Musa spp.

    Directory of Open Access Journals (Sweden)

    Gilles Cellier

    Full Text Available Banana wilt outbreaks that are attributable to Moko disease-causing strains of the pathogen Ralstonia solanacearum (Rs remain a social and economic burden for both multinational corporations and subsistence farmers. All known Moko strains belong to the phylotype II lineage, which has been previously recognized for its broad genetic basis. Moko strains are paraphyletic and are distributed among seven related but distinct phylogenetic clusters (sequevars that are potentially major threats to Musaceae, Solanaceae, and ornamental crops in many countries. Although clustered within the Moko IIB-4 sequevar, strains of the epidemiologically variant IIB-4NPB do not cause wilt on Cavendish or plantain bananas; instead, they establish a latent infection in the vascular tissues of plantains and demonstrate an expanded host range and high aggressiveness toward Solanaceae and Cucurbitaceae. Although most molecular diagnostic methods focus on strains that wilt Solanaceae (particularly potato, no relevant protocol has been described that universally detects strains of the Musaceae-infecting Rs phylotype II. Thus, a duplex PCR assay targeting Moko and IIB-4NPB variant strains was developed, and its performance was assessed using an extensive collection of 111 strains representing the known diversity of Rs Moko-related strains and IIB-4NPB variant strains along with certain related strains and families. The proposed diagnostic protocol demonstrated both high accuracy (inclusivity and exclusivity and high repeatability, detected targets on either pure culture or spiked plant extracts. Although they did not belong to the Moko clusters described at the time of the study, recently discovered banana-infecting strains from Brazil were also detected. According to our comprehensive evaluation, this duplex PCR assay appears suitable for both research and diagnostic laboratories and provides reliable detection of phylotype II Rs strains that infect Musaceae.

  4. Radioreceptor assay for oxyphenonium

    International Nuclear Information System (INIS)

    Ensing, K.; Zeeuw, R.A. de

    1984-01-01

    The development of a radioreceptor assay for the quaternary anticholinergic drug, oxyphenonium, in plasma is reported. It is based on competition between this drug and 3 H-dexetimide for binding to muscarinic receptors. After ion pair extraction and reextraction, the drug can be determined in plasma at concentrations down to a value of 100 pg/ml. This permits pharmacokinetic studies to be made after inhalation of oxyphenonium. (author)

  5. Dual isotope assays

    International Nuclear Information System (INIS)

    Smith, G.F.W.; Stevens, R.A.J.; Jacoby, B.

    1980-01-01

    Dual isotope assays for thyroid function are performed by carrying out a radio-immunoassay for two of thyroxine (T4), tri-iodothyronine (T3), thyroid stimulating hormone (TSH), and thyroxine binding globulin (TBG), by a method wherein a version of one of the thyroid components, preferably T4 or T3 is labelled with Selenium-75 and the version of the other thyroid component is labelled with a different radionuclide, preferably Iodine-125. (author)

  6. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz.,electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  7. Automated amperometric plutonium assay system

    International Nuclear Information System (INIS)

    Burt, M.C.

    1985-01-01

    The amperometric titration for plutonium assay has been used in the nuclear industry for over twenty years and has been in routine use at the Hanford Engineering Development Laboratory since 1976 for the analysis of plutonium oxide and mixed oxide fuel material for the Fast Flux Test Facility. It has proven itself to be an accurate and reliable method. The method may be used as a direct end point titration, or an excess of titrant may be added and a back titration performed to aid in determination of the end point. Due to the slowness of the Pu(VI)-Fe(II) reaction it is difficult to recognize when the end point is being approached, and the titration is very time consuming if the current is allowed to decay to the residual value after each titrant addition. For this reason the back titration, in which the rapid Fe(II)-Cr(VI) reaction occurs, is used by most laboratories. The back titration is performed by the addition of excess ferrous solution followed by two measured aliquots of standard dichromate, with measurement of cell current after each addition.
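
    An illustrative mass balance for the back titration, under the usual stoichiometric assumptions that each Pu(VI) consumes two Fe(II) and each dichromate ion oxidises six Fe(II); the amounts are invented and the calculation is a sketch, not the HEDL procedure.

```python
# Sketch of a back-titration mass balance; stoichiometry assumed as noted above.
PU_ATOMIC_MASS = 239.05  # g/mol, assuming Pu-239 dominated material

def plutonium_mg(fe2_mmol_added, dichromate_mmol_used):
    fe2_excess = 6.0 * dichromate_mmol_used          # Fe(II) still present at the end point
    fe2_consumed_by_pu = fe2_mmol_added - fe2_excess
    pu_mmol = fe2_consumed_by_pu / 2.0               # two Fe(II) per Pu(VI)
    return pu_mmol * PU_ATOMIC_MASS

print(plutonium_mg(fe2_mmol_added=1.000, dichromate_mmol_used=0.100))  # mg Pu in the aliquot
```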

  8. Design of Bus Protocol Intelligent Initiation System Based On RS485

    Directory of Open Access Journals (Sweden)

    Li Liming

    2017-01-01

    Full Text Available In order to design an effective and reliable bus protocol based on RS485, this paper introduces the structure and transmission mode of the command frame and the response frame, and also introduces four control measures to ensure the communication quality of this system. The communication protocol is open, tolerant, reliable and fast, and can make ignition in the intelligent initiation system more reliable and accurate.
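
    A generic sketch of what a command frame with a simple checksum might look like on such a bus; the field layout (header, address, command, length, payload, XOR checksum) is our assumption for illustration, not the frame format defined in the paper.

```python
# Assumed frame layout for illustration only: header, address, command,
# payload length, payload bytes, XOR checksum.
import struct

FRAME_HEADER = 0xAA

def build_command_frame(address: int, command: int, payload: bytes) -> bytes:
    body = struct.pack("BBB", FRAME_HEADER, address, command) + bytes([len(payload)]) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

def verify_frame(frame: bytes) -> bool:
    checksum = 0
    for b in frame[:-1]:
        checksum ^= b
    return checksum == frame[-1]

frame = build_command_frame(address=0x12, command=0x01, payload=b"\x05\x00")  # hypothetical command
print(frame.hex(), verify_frame(frame))
```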

  9. Assessment of Quality of Assay Data on Drill Samples from Golden ...

    African Journals Online (AJOL)

    The success of a mining company is dependent on the integrity of the resource database. The quality of assay data and thus the validity of the database can only be guaranteed when appropriate sampling and assaying protocols have been implemented. It is necessary to convince investors and project financiers that ...

  10. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory. The treatment draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. It provides a history of human reliability analysis and includes examples of the application of the systems approach

  11. Reliability of electronic systems

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2001-01-01

    Reliability techniques have been developed in response to the needs of the various engineering disciplines, although many would argue that a great deal of work on reliability was done before the word itself was used in its current sense. The military, space and nuclear industries were the first to become involved in this topic; however, this quiet revolution in favour of higher reliability figures was not confined to those environments, but has extended to industry as a whole. Mass production, characteristic of modern industry, led four decades ago to a fall in the reliability of its products, partly because of the scale of production itself and partly because of newly introduced and not yet stabilised industrial techniques. Industry had to adapt to these two new requirements, creating products of medium complexity while assuring a level of reliability appropriate to production costs and controls. Reliability became an integral part of the manufactured product. With this philosophy, the book describes reliability techniques applied to electronic systems and provides a coherent and rigorous framework for these diverse activities, providing a unifying scientific basis for the entire subject. It consists of eight chapters plus a large set of statistical tables and an extensive annotated bibliography. The chapters cover the following topics: 1- Introduction to Reliability; 2- Basic Mathematical Concepts; 3- Catastrophic Failure Models; 4- Parametric Failure Models; 5- Systems Reliability; 6- Reliability in Design and Project; 7- Reliability Tests; 8- Software Reliability. This book is in the Spanish language and has a potentially diverse audience, serving as a textbook for courses from academic to industrial. (author)

  12. Operational safety reliability research

    International Nuclear Information System (INIS)

    Hall, R.E.; Boccio, J.L.

    1986-01-01

    Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime at the plant

  13. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective in enhancing the reliability of various circuit units. The authors provide readers with techniques for state-of-the-art and future technologies, ranging from technology modeling, fault detection and analysis, and circuit hardening to reliability management. Provides a comprehensive review of various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  14. High-throughput screening of carbohydrate-degrading enzymes using novel insoluble chromogenic substrate assay kits

    DEFF Research Database (Denmark)

    Schückel, Julia; Kracun, Stjepan Kresimir; Willats, William George Tycho

    2016-01-01

    for this is that advances in genome and transcriptome sequencing, together with associated bioinformatics tools allow for rapid identification of candidate CAZymes, but technology for determining an enzyme's biochemical characteristics has advanced more slowly. To address this technology gap, a novel high-throughput assay...... CPH and ICB substrates are provided in a 96-well high-throughput assay system. The CPH substrates can be made in four different colors, enabling them to be mixed together and thus increasing assay throughput. The protocol describes a 96-well plate assay and illustrates how this assay can be used...... for screening the activities of enzymes, enzyme cocktails, and broths....

  15. Analysis of Security Protocols by Annotations

    DEFF Research Database (Denmark)

    Gao, Han

    . The development of formal techniques, e.g. control flow analyses, that can check various security properties, is an important tool to meet this challenge. This dissertation contributes to the development of such techniques. In this dissertation, security protocols are modelled in the process calculus LYSA......The trend in Information Technology is that distributed systems and networks are becoming increasingly important, as most of the services and opportunities that characterise the modern society are based on these technologies. Communication among agents over networks has therefore acquired a great...... deal of research interest. In order to provide effective and reliable means of communication, more and more communication protocols are invented, and for most of them, security is a significant goal. It has long been a challenge to determine conclusively whether a given protocol is secure or not...

  16. Radioligand assay for biotin in liver tissues

    International Nuclear Information System (INIS)

    Rettenmaier, R.

    1979-01-01

    A radioligand assay for biotin in liver tissue is described. ³H-biotin is used as tracer and avidin as binder. The biotin-loaded avidin is separated from free biotin on dextran-coated charcoal, which leaves the avidin-biotin complex in the supernatant liquid. Thus, the avidin-biotin complex can easily be utilized for determination of the radioactivity. Calibration with known additions of biotin in the range 0.25-8.0 ng per assay sample yields a linear logit-log plot. The biotin is extracted from liver tissues by enzymatic proteolysis with papain. This treatment is optimized to liberate the bound forms of the vitamin. Microbiological parallel assays with Lactobacillus plantarum were in good agreement with the radioligand assay giving a regression coefficient of 0.974 (n=44). The coefficient of variation was found to be 4.2% in the range 500-1200 ng of biotin per g of liver tissue (n=46). The method is simple and reliable and allows the simultaneous analysis of a considerable number of samples. (Auth.)
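
    The logit-log calibration mentioned above lends itself to a short worked example. The sketch below (Python, with invented standard doses and bound fractions, not the paper's data) regresses logit(B/B0) on log(dose) and reads an unknown sample back off the fitted line.

```python
# Hypothetical sketch of a logit-log calibration: bound fraction B/B0 for each
# standard is transformed with the logit and regressed on log(dose); unknowns
# are then interpolated from the fitted line.  All numbers are assumed.
import numpy as np

def logit(p):
    return np.log(p / (1.0 - p))

doses = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 8.0])                 # ng per tube (assumed)
bound_fraction = np.array([0.88, 0.78, 0.63, 0.47, 0.32, 0.20])    # B/B0 (assumed)

slope, intercept = np.polyfit(np.log(doses), logit(bound_fraction), 1)

def dose_from_bound_fraction(b_over_b0):
    """Interpolate an unknown sample's biotin content from its bound fraction."""
    return np.exp((logit(b_over_b0) - intercept) / slope)

print(dose_from_bound_fraction(0.55))   # estimated ng biotin per assay sample
```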

  17. Nanoparticle-assay marker interaction: effects on nanotoxicity assessment

    International Nuclear Information System (INIS)

    Zhao, Xinxin; Xiong, Sijing; Huang, Liwen Charlotte; Ng, Kee Woei; Loo, Say Chye Joachim

    2015-01-01

    Protein-based cytotoxicity assays such as lactate dehydrogenase (LDH) and tumor necrosis factor-alpha (TNF-α) are commonly used in cytotoxic evaluation of nanoparticles (NPs) despite numerous reports on possible interactions with protein markers in these assays that can confound the results obtained. In this study, conventional cytotoxicity assays where assay markers may (LDH and TNF- α) or may not (PicoGreen and WST-8) come into contact with NPs were used to evaluate the cytotoxicity of NPs. The findings revealed selective interactions between negatively charged protein assay markers (LDH and TNF- α) and positively charged ZnO NPs under abiotic conditions. The adsorption and interaction with these protein assay markers were strongly influenced by surface charge, concentration, and specific surface area of the NPs, thereby resulting in less than accurate cytotoxic measurements, as observed from actual cell viability measurements. An improved protocol for LDH assay was, therefore, proposed and validated by eliminating any effects associated with protein–particle interactions. In view of this, additional measures and precautions should be taken when evaluating cytotoxicity of NPs with standard protein-based assays, particularly when they are of opposite charges

  18. Nanoparticle-assay marker interaction: effects on nanotoxicity assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Xinxin; Xiong, Sijing; Huang, Liwen Charlotte; Ng, Kee Woei, E-mail: kwng@ntu.edu.sg; Loo, Say Chye Joachim, E-mail: joachimloo@ntu.edu.sg [Nanyang Technological University, School of Materials Science and Engineering (Singapore)

    2015-01-15

    Protein-based cytotoxicity assays such as lactate dehydrogenase (LDH) and tumor necrosis factor-alpha (TNF-α) are commonly used in cytotoxic evaluation of nanoparticles (NPs) despite numerous reports on possible interactions with protein markers in these assays that can confound the results obtained. In this study, conventional cytotoxicity assays where assay markers may (LDH and TNF- α) or may not (PicoGreen and WST-8) come into contact with NPs were used to evaluate the cytotoxicity of NPs. The findings revealed selective interactions between negatively charged protein assay markers (LDH and TNF- α) and positively charged ZnO NPs under abiotic conditions. The adsorption and interaction with these protein assay markers were strongly influenced by surface charge, concentration, and specific surface area of the NPs, thereby resulting in less than accurate cytotoxic measurements, as observed from actual cell viability measurements. An improved protocol for LDH assay was, therefore, proposed and validated by eliminating any effects associated with protein–particle interactions. In view of this, additional measures and precautions should be taken when evaluating cytotoxicity of NPs with standard protein-based assays, particularly when they are of opposite charges.

  19. Nanoparticle-assay marker interaction: effects on nanotoxicity assessment

    Science.gov (United States)

    Zhao, Xinxin; Xiong, Sijing; Huang, Liwen Charlotte; Ng, Kee Woei; Loo, Say Chye Joachim

    2015-01-01

    Protein-based cytotoxicity assays such as lactate dehydrogenase (LDH) and tumor necrosis factor-alpha (TNF-α) are commonly used in cytotoxic evaluation of nanoparticles (NPs) despite numerous reports on possible interactions with protein markers in these assays that can confound the results obtained. In this study, conventional cytotoxicity assays where assay markers may (LDH and TNF- α) or may not (PicoGreen and WST-8) come into contact with NPs were used to evaluate the cytotoxicity of NPs. The findings revealed selective interactions between negatively charged protein assay markers (LDH and TNF- α) and positively charged ZnO NPs under abiotic conditions. The adsorption and interaction with these protein assay markers were strongly influenced by surface charge, concentration, and specific surface area of the NPs, thereby resulting in less than accurate cytotoxic measurements, as observed from actual cell viability measurements. An improved protocol for LDH assay was, therefore, proposed and validated by eliminating any effects associated with protein-particle interactions. In view of this, additional measures and precautions should be taken when evaluating cytotoxicity of NPs with standard protein-based assays, particularly when they are of opposite charges.

  20. Results of in vitro chemosensitivity assays

    International Nuclear Information System (INIS)

    Tanigawa, Nobuhiko; Morimoto, Hideki; Akita, Toshiaki; Inoue, Hiroshi; Tanaka, Takeo.

    1986-01-01

    The authors reviewed their experiences to date with chemosensitivity testing of 629 tumors by human tumor clonogenic assay (HTCA) and of 199 tumors by scintillation assay (SA). HTCA and SA were both performed using a double-layer-soft-agar system with continuous exposure of cells to one concentration of standard anticancer drugs. Overall, 60 % of specimens in HTCA and 58 % in SA produced significant growth in vitro. HTCA was 52 % (13/25) reliable for predicting in vivo sensitivity, and 95 % (36/38) reliable for in vivo resistance, whereas SA was 40 % (8/20) reliable for in vivo sensitivity and 88 % (21/24) for in vivo resistance. In vitro success rates were variable, depending on the tumor histology. In vitro growth of gastric cancer specimens was characteristically lower than that of colon cancer specimens (48 % and 60 % in HTCA, and 46 % and 68 % in SA, respectively). (p < 0.005). Optimal in vitro-in vivo drug concentrations and culture conditions are still being defined. Correlation studies of in vitro-in vivo responses of gastrointestinal cancers suggested that in vitro concentrations of 5-fluorouracil and mitomycin C used in this study were considerably higher than their optimal doses. Tumor cell heterogeneity poses significant problems in the clinical use of chemosensitivity assays. In this last study, we sought evidence of tumor heterogeneity by comparing chemosensitivity responses between : 1) different portions of a single tumor, 2) a primary and a metastatic biopsy taken from a patient on the same day, and 3) different metastases from a patient taken on the same day. The results demonstrated the presence of considerable heterogeneity of response to chemotherapy among different tumors from the same patient, and even within the same tumor. The reported discrepancies of in vitro and in vivo sensitivity may be due to such therapeutic heterogeneity among tumors. (J.P.N.)

  1. Network protocols and sockets

    OpenAIRE

    BALEJ, Marek

    2010-01-01

    My work deals with network protocols and sockets and their use in the programming language C#. It therefore covers programming network applications on Microsoft's .NET platform and the tools that C# provides for this purpose. It describes the tools and methods for programming network applications and presents sample applications that work with sockets and application protocols.

  2. The Assessment of Parameters Affecting the Quality of Cord Blood by the Appliance of the Annexin V Staining Method and Correlation with CFU Assays

    Directory of Open Access Journals (Sweden)

    Teja Falk Radke

    2013-01-01

    The assessment of nonviable haematopoietic cells by the Annexin V staining method in flow cytometry has recently been published by Duggleby et al. Resulting in a better correlation with the observed colony formation in methylcellulose assays than the standard ISHAGE protocol, it presents a promising method to predict cord blood potency. Herein, we applied this method to examine the processing parameters which could potentially affect cord blood viability. We could verify that the current standards regarding time and temperature are sufficient, since no significant difference was observed within 48 hours or in storage at 4°C up to 26°C. However, the addition of DMSO for cryopreservation alone leads to an inevitable increase in nonviable haematopoietic stem cells, from initially 14.8% ± 4.3% to at least 30.6% ± 5.5%. Furthermore, CFU assays with varied seeding density were performed in order to evaluate the applicability as a quantitative method. The results revealed that reproducible clonogenic efficiency (ClonE) could be assessed only within a narrow range, giving at least a semiquantitative estimation. We conclude that both the Annexin V staining method and CFU assays with defined seeding density are reliable means leading to a better prediction of the final potency. Especially Annexin V, due to its fast readout, is a practical tool for examining and optimising specific steps in processing, while CFU assays add a functional confirmation.

  3. Radiorespirometric assay device

    International Nuclear Information System (INIS)

    Levin, G.V.; Straat, P.A.

    1981-01-01

    A radiorespirometric assay device is described in which the presence of microorganisms in a sample is determined by placing the sample in contact with a metabolisable, radioactively labelled substrate, collecting any gas evolved, exposing a photosensitive material to the gas and determining whether a spot is produced on the material. A spot indicates the presence of radioactivity, showing that the substrate has been metabolized by a microorganism. Bacteria may be detected in body fluids, hospital operating rooms, water, food, cosmetics and drugs. (U.K.)

  4. Radon assay for SNO+

    Energy Technology Data Exchange (ETDEWEB)

    Rumleskie, Janet [Laurentian University, Greater Sudbury, Ontario (Canada)

    2015-12-31

    The SNO+ experiment will study neutrinos while located 6,800 feet below the surface of the earth at SNOLAB. Though the experiment is shielded from surface backgrounds, emanation of radon radioisotopes from the surrounding rock leads to backgrounds. The characteristic decay of radon and its daughters allows an alpha detection technique to be used to count the number of Rn-222 atoms collected. Traps can collect Rn-222 from various positions and materials, including an assay skid that will collect Rn-222 from the organic liquid scintillator used to detect interactions within SNO+.

  5. Hawaii Electric System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  6. Hawaii electric system reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  7. Improving machinery reliability

    CERN Document Server

    Bloch, Heinz P

    1998-01-01

    This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.

  8. LED system reliability

    NARCIS (Netherlands)

    Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.

    2011-01-01

    This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. A SSL system is composed of a LED engine with micro-electronic driver(s) that supplies power to the optic design. Knowledge of system level reliability is not only a challenging scientific

  9. Integrated system reliability analysis

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Describe quantitative and qualitative measures...

  10. Reliability of neural encoding

    DEFF Research Database (Denmark)

    Alstrøm, Preben; Beierholm, Ulrik; Nielsen, Carsten Dahl

    2002-01-01

    The reliability with which a neuron is able to create the same firing pattern when presented with the same stimulus is of critical importance to the understanding of neuronal information processing. We show that reliability is closely related to the process of phaselocking. Experimental results f...

  11. Telomerase Repeated Amplification Protocol (TRAP).

    Science.gov (United States)

    Mender, Ilgen; Shay, Jerry W

    2015-11-20

    Telomeres are found at the end of eukaryotic linear chromosomes, and proteins that bind to telomeres protect DNA from being recognized as double-strand breaks thus preventing end-to-end fusions (Griffith et al. , 1999). However, due to the end replication problem and other factors such as oxidative damage, the limited life span of cultured cells (Hayflick limit) results in progressive shortening of these protective structures (Hayflick and Moorhead, 1961; Olovnikov, 1973). The ribonucleoprotein enzyme complex telomerase-consisting of a protein catalytic component hTERT and a functional RNA component hTR or hTERC - counteracts telomere shortening by adding telomeric repeats to the end of chromosomes in ~90% of primary human tumors and in some transiently proliferating stem-like cells (Shay and Wright, 1996; Shay and Wright, 2001). This results in continuous proliferation of cells which is a hallmark of cancer. Therefore, telomere biology has a central role in aging, cancer progression/metastasis as well as targeted cancer therapies. There are commonly used methods in telomere biology such as Telomere Restriction Fragment (TRF) (Mender and Shay, 2015b), Telomere Repeat Amplification Protocol (TRAP) and Telomere dysfunction Induced Foci (TIF) analysis (Mender and Shay, 2015a). In this detailed protocol we describe Telomere Repeat Amplification Protocol (TRAP). The TRAP assay is a popular method to determine telomerase activity in mammalian cells and tissue samples (Kim et al. , 1994). The TRAP assay includes three steps: extension, amplification, and detection of telomerase products. In the extension step, telomeric repeats are added to the telomerase substrate (which is actually a non telomeric oligonucleotide, TS) by telomerase. In the amplification step, the extension products are amplified by the polymerase chain reaction (PCR) using specific primers (TS upstream primer and ACX downstream primer) and in the detection step, the presence or absence of telomerase is

  12. Design reliability engineering

    International Nuclear Information System (INIS)

    Buden, D.; Hunt, R.N.M.

    1989-01-01

    Improved design techniques are needed to achieve high reliability at minimum cost. This is especially true of space systems where lifetimes of many years without maintenance are needed and severe mass limitations exist. Reliability must be designed into these systems from the start. Techniques are now being explored to structure a formal design process that will be more complete and less expensive. The intent is to integrate the best features of design, reliability analysis, and expert systems to design highly reliable systems to meet stressing needs. Taken into account are the large uncertainties that exist in materials, design models, and fabrication techniques. Expert systems are a convenient method to integrate into the design process a complete definition of all elements that should be considered and an opportunity to integrate the design process with reliability, safety, test engineering, maintenance and operator training. 1 fig

  13. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
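
    The course material summarized above centres on Bayesian updating of reliability parameters. As a minimal illustration (not taken from the proceedings), the sketch below performs a conjugate Beta-Binomial update for a component failure probability on demand; the prior parameters and demand data are assumed purely for demonstration.

```python
# Illustrative only: conjugate Beta-Binomial update for a failure probability
# on demand, the simplest case of Bayesian reliability estimation.
from scipy import stats

prior_a, prior_b = 1.0, 19.0        # assumed prior, mean failure prob ~0.05
failures, demands = 2, 50           # assumed observed demand data

posterior = stats.beta(prior_a + failures, prior_b + (demands - failures))
print("posterior mean failure probability:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```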

  14. RAS - Screens & Assays - Drug Discovery

    Science.gov (United States)

    The RAS Drug Discovery group aims to develop assays that will reveal aspects of RAS biology upon which cancer cells depend. Successful assay formats are made available for high-throughput screening programs to yield potentially effective drug compounds.

  15. Improving shuffler assay accuracy

    International Nuclear Information System (INIS)

    Rinard, P.M.

    1995-01-01

    Drums of uranium waste should be disposed of in an economical and environmentally sound manner. The most accurate possible assays of the uranium masses in the drums are required for proper disposal. The accuracies of assays from a shuffler are affected by the type of matrix material in the drums. Non-hydrogenous matrices have little effect on neutron transport and accuracies are very good. If self-shielding is known to be a minor problem, good accuracies are also obtained with hydrogenous matrices when a polyethylene sleeve is placed around the drums. But for those cases where self-shielding may be a problem, matrices are hydrogenous, and uranium distributions are non-uniform throughout the drums, the accuracies are degraded. They can be greatly improved by determining the distributions of the uranium and then applying correction factors based on the distributions. This paper describes a technique for determining uranium distributions by using the neutron count rates in detector banks around the waste drum and solving a set of overdetermined linear equations. Other approaches were studied to determine the distributions and are described briefly. Implementation of this correction is anticipated on an existing shuffler next year
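
    The unfolding step described above, solving an overdetermined set of linear equations for the uranium distribution from detector-bank count rates, can be sketched in a few lines. The response matrix and count rates below are invented placeholders, not the shuffler's calibration data; a real system would use measured bank responses per region of the drum.

```python
# Minimal sketch: the count rate in each detector bank is modelled as a linear
# response to the uranium mass in each coarse region of the drum, and the
# overdetermined system is solved by least squares.  All numbers are assumed.
import numpy as np

# response[i, j] = counts in bank i per gram of uranium in region j (assumed)
response = np.array([
    [1.2, 0.6, 0.3],
    [0.7, 1.1, 0.7],
    [0.3, 0.6, 1.2],
    [0.8, 0.8, 0.8],
])
count_rates = np.array([45.0, 52.0, 40.0, 48.0])   # measured rates (assumed)

masses, *_ = np.linalg.lstsq(response, count_rates, rcond=None)
print("estimated uranium mass per region (g):", masses)
# A distribution-dependent correction factor could then be applied to the assay.
```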

  16. Competitive protein binding assay

    International Nuclear Information System (INIS)

    Kaneko, Toshio; Oka, Hiroshi

    1975-01-01

    The measurement of cyclic GMP (cGMP) by competitive protein binding assay is described and discussed. The principle of the binding assay is presented briefly. The procedure of our method consisted of preparation of the cGMP binding protein, selection of a commercially available ³H-cyclic GMP, and the measurement steps. In our method, the binding protein was isolated from the chrysalis of the silk worm. The method is discussed with regard to the incubation medium, the specificity of the binding protein, the separation of bound cGMP from free cGMP, and the treatment of the tissue from which cGMP was extracted. The cGMP present in tissue amounts to only one-tenth to one-twentieth of the cAMP, which in addition competes with cGMP in binding to the binding protein. Therefore, Murad's technique was applied to the isolation of cGMP. This method provided the measurement with sufficient accuracy; the contamination by cAMP was within several per cent. (Kanao, N.)

  17. Using generalizability theory to develop clinical assessment protocols.

    Science.gov (United States)

    Preuss, Richard A

    2013-04-01

    Clinical assessment protocols must produce data that are reliable, with a clinically attainable minimal detectable change (MDC). In a reliability study, generalizability theory has 2 advantages over classical test theory. These advantages provide information that allows assessment protocols to be adjusted to match individual patient profiles. First, generalizability theory allows the user to simultaneously consider multiple sources of measurement error variance (facets). Second, it allows the user to generalize the findings of the main study across the different study facets and to recalculate the reliability and MDC based on different combinations of facet conditions. In doing so, clinical assessment protocols can be chosen based on minimizing the number of measures that must be taken to achieve a realistic MDC, using repeated measures to minimize the MDC, or simply based on the combination that best allows the clinician to monitor an individual patient's progress over a specified period of time.
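
    The D-study logic sketched in the abstract, recomputing reliability and the MDC for different combinations of facet conditions, can be illustrated with a small calculation. The variance components below are assumed values for a hypothetical person × rater × trial design, and MDC95 = 1.96 × √2 × SEM is used as the minimal detectable change.

```python
# Illustrative D-study: given assumed variance components from a G-study,
# recompute the G coefficient and MDC95 for different numbers of raters/trials.
import math

var_person = 4.0          # all variance components below are assumed
var_person_rater = 0.5
var_person_trial = 0.4
var_residual = 1.2

def d_study(n_raters, n_trials):
    rel_error = (var_person_rater / n_raters
                 + var_person_trial / n_trials
                 + var_residual / (n_raters * n_trials))
    g_coefficient = var_person / (var_person + rel_error)
    mdc95 = 1.96 * math.sqrt(2) * math.sqrt(rel_error)
    return g_coefficient, mdc95

for n_r, n_t in [(1, 1), (1, 3), (2, 3)]:
    g, mdc = d_study(n_r, n_t)
    print(f"raters={n_r}, trials={n_t}: G={g:.2f}, MDC95={mdc:.2f}")
```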

  18. A Lightweight Protocol for Secure Video Streaming.

    Science.gov (United States)

    Venčkauskas, Algimantas; Morkevicius, Nerijus; Bagdonas, Kazimieras; Damaševičius, Robertas; Maskeliūnas, Rytis

    2018-05-14

    The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point to point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing "Fog Node-End Device" layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard.
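
    The core idea of embedding authentication data into streaming datagrams can be sketched as follows. This is not the authors' wire format: the 4-byte sequence header, the HMAC-SHA256 tag, and the pre-shared key are assumptions for illustration, and encryption, key exchange, and multicast handling are omitted.

```python
# Sketch only: HMAC-authenticated payload embedded in a UDP datagram.
import hashlib
import hmac
import socket
import struct

SHARED_KEY = b"pre-shared-key-for-illustration"     # assumed out-of-band key

def build_packet(seq: int, payload: bytes) -> bytes:
    header = struct.pack("!I", seq)                  # 4-byte sequence number
    tag = hmac.new(SHARED_KEY, header + payload, hashlib.sha256).digest()
    return header + tag + payload                    # auth data embedded

def verify_packet(packet: bytes):
    header, tag, payload = packet[:4], packet[4:36], packet[36:]
    expected = hmac.new(SHARED_KEY, header + payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None                                  # drop forged packet
    return struct.unpack("!I", header)[0], payload

# Example: send one authenticated frame to a hypothetical fog node address.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(build_packet(1, b"frame-data"), ("127.0.0.1", 9999))
```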

  19. Protocol: A simple phenol-based method for 96-well extraction of high quality RNA from Arabidopsis

    Directory of Open Access Journals (Sweden)

    Coustham Vincent

    2011-03-01

    Background: Many experiments in modern plant molecular biology require the processing of large numbers of samples for a variety of applications, from mutant screens to the analysis of natural variants. A severe bottleneck to many such analyses is the acquisition of good yields of high quality RNA suitable for use in sensitive downstream applications such as real time quantitative reverse-transcription-polymerase chain reaction (real time qRT-PCR). Although several commercial kits are available for high-throughput RNA extraction in 96-well format, only one non-kit method has been described in the literature, using the commercial reagent TRIZOL. Results: We describe an unusual phenomenon when using TRIZOL reagent with young Arabidopsis seedlings. This prompted us to develop a high-throughput RNA extraction protocol (HTP96) adapted from a well-established phenol:chloroform-LiCl method (P:C-L) that is cheap, reliable and requires no specialist equipment. With this protocol 192 high quality RNA samples can be prepared in 96-well format in three hours (less than 1 minute per sample) with less than 1% loss of samples. We demonstrate that the RNA derived from this protocol is of high quality and suitable for use in real time qRT-PCR assays. Conclusion: The development of the HTP96 protocol has vastly increased our sample throughput, allowing us to fully exploit the large sample capacity of modern real time qRT-PCR thermocyclers, now commonplace in many labs, and to develop an effective high-throughput gene expression platform. We propose that the HTP96 protocol will significantly benefit any plant scientist with the task of obtaining hundreds of high quality RNA extractions.

  20. New simple spectrophotometric assay of total carotenes in margarines

    NARCIS (Netherlands)

    Luterotti, S.; Bicanic, D.D.; Pozgaj, R.

    2006-01-01

    A direct and reliable spectrophotometric method for assaying total carotenes (TC) in margarines with a minimum of sample manipulation is proposed. For the first time, the saponification step used in the determination of carotenes in margarines was omitted, leading to a substantial cost saving and reduction of

  1. Non destructive assay techniques applied to nuclear materials

    International Nuclear Information System (INIS)

    Gavron, A.

    2001-01-01

    Nondestructive assay is a suite of techniques that has matured and become precise, easily implementable, and remotely usable. These techniques provide elaborate safeguards of nuclear material by providing the necessary information for materials accounting. NDA techniques are ubiquitous, reliable, essentially tamper proof, and simple to use. They make the world a safer place to live in, and they make nuclear energy viable. (author)

  2. MS transport assays for γ-aminobutyric acid transporters--an efficient alternative for radiometric assays.

    Science.gov (United States)

    Schmitt, Sebastian; Höfner, Georg; Wanner, Klaus T

    2014-08-05

    Transport assays for neurotransmitters based on radiolabeled substrates are widely spread and often indispensable in basic research and the drug development process, although the use of radioisotopes is inherently coupled to issues concerning radioactive waste and safety precautions. To overcome these disadvantages, we developed mass spectrometry (MS)-based transport assays for γ-aminobutyric acid (GABA), which is the major inhibitory neurotransmitter in the central nervous system (CNS). These "MS Transport Assays" provide all capabilities of [(3)H]GABA transport assays and therefore represent the first substitute for the latter. The performance of our approach is demonstrated for GAT1, the most important GABA transporter (GAT) subtype. As GABA is endogenously present in COS-7 cells employed as hGAT1 expression system, ((2)H6)GABA was used as a substrate to differentiate transported from endogenous GABA. To record transported ((2)H6)GABA, a highly sensitive, short, robust, and reliable HILIC-ESI-MS/MS quantification method using ((2)H2)GABA as an internal standard was developed and validated according to the Center for Drug Evaluation and Research (CDER) guidelines. Based on this LC-MS quantification, a setup to characterize hGAT1 mediated ((2)H6)GABA transport in a 96-well format was established, that enables automated processing and avoids any sample preparation. The K(m) value for ((2)H6)GABA determined for hGAT1 is in excellent agreement with results obtained from [(3)H]GABA uptake assays. In addition, the established assay format enables efficient determination of the inhibitory potency of GAT1 inhibitors, is capable of identifying those inhibitors transported as substrates, and furthermore allows characterization of efflux. The approach described here combines the strengths of LC-MS/MS with the high efficiency of transport assays based on radiolabeled substrates and is applicable to all GABA transporter subtypes.

  3. A SURVEY ON MULTICAST ROUTING PROTOCOLS FOR PERFORMANCE EVALUATION IN WIRELESS SENSOR NETWORK

    Directory of Open Access Journals (Sweden)

    A. Suruliandi

    2015-03-01

    Multicast is a process used to transfer the same message to multiple receivers at the same time. This paper presents the simulation and analysis of the performance of six different multicast routing protocols for Wireless Sensor Networks (WSN). They are the On Demand Multicast Routing Protocol (ODMRP), the Protocol for Unified Multicasting through Announcement (PUMA), the Multicast Adhoc On demand Distance Vector Protocol (MAODV), the Overlay Boruvka-based Adhoc Multicast Protocol (OBAMP), the Application Layer Multicast Algorithm (ALMA) and an enhanced version of ALMA (ALMA-H) for WSN. Among them, ODMRP, MAODV and PUMA are reactive protocols, while OBAMP, ALMA and ALMA-H are proactive protocols. This paper compares the performance of these protocols using common parameters such as throughput, reliability, end-to-end delay and Packet Delivery Ratio (PDR), for increasing numbers of nodes and increasing node speeds. The main objective of this work is to select the most efficient multicast routing protocol for WSN among the six, based on the relative strengths and weaknesses of each protocol. A summary of the six multicast routing protocols is presented with a table of their performance characteristics. Experimental results show that ODMRP attains higher throughput, reliability and packet delivery ratio than the other multicast routing protocols, while incurring far less end-to-end delay.

  4. Reliability of construction materials

    International Nuclear Information System (INIS)

    Merz, H.

    1976-01-01

    One can also speak of reliability with respect to materials. While for the reliability of components the MTBF (mean time between failures) is regarded as the main criterion, with regard to materials this is replaced by possible failure mechanisms such as physical/chemical reaction mechanisms, disturbances of physical or chemical equilibrium, or other interactions or changes of the system. The main tasks of the reliability analysis of materials are therefore the prediction of the various causes of failure, the identification of interactions, and the development of nondestructive testing methods. (RW) [de

  5. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation...... of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature...... of the uncertainties and their interplay is then developed, step by step. The concepts presented are illustrated by numerous examples throughout the text....

  6. Reliability and mechanical design

    International Nuclear Information System (INIS)

    Lemaire, Maurice

    1997-01-01

    Many results in mechanical design are obtained from a model of physical reality and from a numerical solution leading to the evaluation of needs and resources. The goal of reliability analysis is to evaluate the confidence that can be placed in the chosen design through the calculation of a probability of failure associated with the retained scenario. Two types of analysis are proposed: sensitivity analysis and reliability analysis. Approximate methods are applicable to problems related to reliability, availability, maintainability and safety (RAMS).

  7. RTE - 2013 Reliability Report

    International Nuclear Information System (INIS)

    Denis, Anne-Marie

    2014-01-01

    RTE publishes a yearly reliability report based on a standard model to facilitate comparisons and highlight long-term trends. The 2013 report not only states the facts of the Significant System Events (ESS) but also underlines the main elements concerning the reliability of the electrical power system. It highlights the various elements which contribute to present and future reliability and provides an overview of the interaction between the various stakeholders of the Electrical Power System on the scale of the European Interconnected Network. (author)

  8. Protocol Fuel Mix reporting

    International Nuclear Information System (INIS)

    2002-07-01

    The protocol in this document describes a method for an Electricity Distribution Company (EDC) to account for the fuel mix of electricity that it delivers to its customers, based on the best available information. Own production, purchase and sale of electricity, and certificates trading are taken into account. In chapter 2 the actual protocol is outlined. In the appendixes additional (supporting) information is given: (A) Dutch Standard Fuel Mix, 2000; (B) Calculation of the Dutch Standard fuel mix; (C) Procedures to estimate and benchmark the fuel mix; (D) Quality management; (E) External verification; (F) Recommendation for further development of the protocol; (G) Reporting examples

  9. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each...... instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...

  10. Playing With Population Protocols

    Directory of Open Access Journals (Sweden)

    Xavier Koegler

    2009-06-01

    Population protocols have been introduced as a model of sensor networks consisting of very limited mobile agents with no control over their own movement: A collection of anonymous agents, modeled by finite automata, interact in pairs according to some rules. Predicates on the initial configurations that can be computed by such protocols have been characterized under several hypotheses. We discuss here whether and when the rules of interactions between agents can be seen as a game from game theory. We do so by discussing several basic protocols.
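
    The interaction model described above is easy to simulate. The toy protocol below is an assumption for illustration, not one analysed in the paper: it stably computes the OR of the inputs, since whenever two agents meet both move to state 1 if either holds 1, so a single 1 eventually spreads to the whole population under a random scheduler.

```python
# Toy population-protocol simulation: anonymous finite-state agents interact
# in randomly chosen pairs; the rule below computes the OR of the inputs.
import random

def interact(a: int, b: int):
    out = 1 if (a or b) else 0          # transition rule (a, b) -> (out, out)
    return out, out

def run(initial_states, steps=10_000, seed=0):
    rng = random.Random(seed)
    states = list(initial_states)
    for _ in range(steps):
        i, j = rng.sample(range(len(states)), 2)    # random scheduler picks a pair
        states[i], states[j] = interact(states[i], states[j])
    return states

population = [0] * 49 + [1]             # one agent starts with input 1
print(set(run(population)))             # expected to converge to {1}
```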

  11. ATM and Internet protocol

    CERN Document Server

    Bentall, M; Turton, B

    1998-01-01

    Asynchronous Transfer Mode (ATM) is a protocol that allows data, sound and video transferred between independent networks via ISDN links to be supplied to, and interpreted by, the various system protocols. ATM and Internet Protocol explains the working of the ATM and B-ISDN network for readers with a basic understanding of telecommunications. It provides a handy reference for everyone working with ATM who may not require the full standards in detail but who needs a comprehensive guide to ATM. A substantial section is devoted to the problems of running IP over ATM and there is some discussion o

  12. An acoustic prion assay

    Directory of Open Access Journals (Sweden)

    Gordon Hayward

    2016-12-01

    An acoustic prion assay has been demonstrated for sheep brain samples. Only five false positives and no false negatives were observed in a test of 45 positive and 45 negative samples. The acoustic prion sensor was constructed using a thickness shear mode quartz resonator coated with a covalently bound recombinant prion protein. The characteristic indicator of a scrapie infected sheep brain sample was an observed shoulder in the frequency decrease in response to a sample. The response of the sensor aligns with a conformational shift in the surface protein and with the propagation mechanism of the disease. This alignment is evident in the response timing and shape, dependence on concentration, cross species behaviour and impact of blood plasma. This alignment is far from sufficient to prove the mechanism of the sensor but it does offer the possibility of a rapid and inexpensive additional tool to explore prion disease. Keywords: Prions, Thickness shear mode quartz sensor

  13. Assay of oestrogen

    International Nuclear Information System (INIS)

    Edwards, J.C.

    1981-01-01

    A particular problem with the direct radioimmunoassay of unconjugated oestriol in pregnancy is caused by the increased amount of steroid-binding proteins present in pregnancy serum and plasma. The steroid-binding proteins react with oestriol and 125 I-labelled oestriol during the assay procedure and the steroid-protein bound 125 I-labelled oestriol is precipitated along with the antibody-bound 125 I-labelled oestriol by the ammonium sulphate solution separation system. A novel method is described whereby progesterone (1-20 μg/ml) is used to block the action of steroid-binding proteins in pregnancy serum and plasma samples, thus minimizing interference in a direct radioimmunoassay for unconjugated oestriol using a specific anti-oestriol serum. (U.K.)

  14. Integrated bioassays in microfluidic devices: botulinum toxin assays.

    Science.gov (United States)

    Mangru, Shakuntala; Bentz, Bryan L; Davis, Timothy J; Desai, Nitin; Stabile, Paul J; Schmidt, James J; Millard, Charles B; Bavari, Sina; Kodukula, Krishna

    2005-12-01

    A microfluidic assay was developed for screening botulinum neurotoxin serotype A (BoNT-A) by using a fluorescent resonance energy transfer (FRET) assay. Molded silicone microdevices with integral valves, pumps, and reagent reservoirs were designed and fabricated. Electrical and pneumatic control hardware were constructed, and software was written to automate the assay protocol and data acquisition. Detection was accomplished by fluorescence microscopy. The system was validated with a peptide inhibitor, running 2 parallel assays, as a feasibility demonstration. The small footprint of each bioreactor cell (0.5 cm2) and scalable fluidic architecture enabled many parallel assays on a single chip. The chip is programmable to run a dilution series in each lane, generating concentration-response data for multiple inhibitors. The assay results showed good agreement with the corresponding experiments done at a macroscale level. Although the system has been developed for BoNT-A screening, a wide variety of assays can be performed on the microfluidic chip with little or no modification.

  15. Approach to reliability assessment

    International Nuclear Information System (INIS)

    Green, A.E.; Bourne, A.J.

    1975-01-01

    Experience has shown that reliability assessments can play an important role in the early design and subsequent operation of technological systems where reliability is at a premium. The approaches to and techniques for such assessments, which have been outlined in the paper, have been successfully applied in a variety of applications ranging from individual equipment to large and complex systems. The general approach involves the logical and systematic establishment of the purpose, performance requirements and reliability criteria of systems. This is followed by an appraisal of likely system achievement based on the understanding of different types of variational behavior. A fundamental reliability model emerges from the correlation between the appropriate Q and H functions for performance requirement and achievement. This model may cover the complete spectrum of performance behavior in all the system dimensions.

  16. A novel approach for reliable detection of cathepsin S activities in mouse antigen presenting cells.

    Science.gov (United States)

    Steimle, Alex; Kalbacher, Hubert; Maurer, Andreas; Beifuss, Brigitte; Bender, Annika; Schäfer, Andrea; Müller, Ricarda; Autenrieth, Ingo B; Frick, Julia-Stefanie

    2016-05-01

    Cathepsin S (CTSS) is a eukaryotic protease mostly expressed in professional antigen presenting cells (APCs). Since CTSS activity regulation plays a role in the pathogenesis of various autoimmune diseases such as multiple sclerosis, atherosclerosis, Sjögren's syndrome and psoriasis, as well as in cancer progression, there is ongoing interest in the reliable detection of cathepsin S activity. Various assays have been devised for specific detection of this enzyme. However, most of them have only been shown to be suitable for human samples, do not deliver quantitative results, or require technical equipment that is not commonly available in a standard laboratory. We have tested a fluorogenic substrate, Mca-GRWPPMGLPWE-Lys(Dnp)-DArg-NH2, that has been described to specifically detect CTSS activity in human APCs, for its potential use with mouse samples. We have modified the protocol and thereby offer a cheap, easy, reproducible and quick activity assay to detect CTSS activity in mouse APCs. Since most basic research on CTSS is performed in mice, this method closes a gap and offers a possibility for reliable and quantitative CTSS activity detection that can be performed in almost every laboratory. Copyright © 2016. Published by Elsevier B.V.

  17. The rating reliability calculator

    Directory of Open Access Journals (Sweden)

    Solomon David J

    2004-04-01

    Background: Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods: The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program uploads them to the server to calculate the reliability and other statistics describing the ratings. Results: When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion: This simple web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to provide complete rating data. I would welcome other researchers revising and enhancing the program.
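
    The Spearman-Brown step mentioned in the results can be stated compactly: if r is the reliability of a single rating, the reliability of the mean of k ratings is kr / (1 + (k - 1)r). The sketch below applies that formula with an assumed single-rating reliability; Ebel's algorithm for estimating r from incomplete rating data is not reproduced here.

```python
# Spearman-Brown prophecy formula; the single-rating reliability is assumed.
def spearman_brown(single_rating_reliability: float, k: float) -> float:
    r = single_rating_reliability
    return (k * r) / (1.0 + (k - 1.0) * r)

single_r = 0.45                          # assumed single-rating reliability
for k in (2, 3, 5):
    print(f"mean of {k} ratings: reliability = {spearman_brown(single_r, k):.2f}")
```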

  18. Structural systems reliability analysis

    International Nuclear Information System (INIS)

    Frangopol, D.

    1975-01-01

    For an exact evaluation of the reliability of a structure it appears necessary to determine the distribution densities of the loads and resistances and to calculate the correlation coefficients between loads and between resistances. These statistical characteristics can be obtained only on the basis of a long activity period. In cases where such studies are missing, the statistical properties formulated here give upper and lower bounds on the reliability. (orig./HP) [de

  19. Reliability and maintainability

    International Nuclear Information System (INIS)

    1994-01-01

    Several communications in this conference are concerned with nuclear plant reliability and maintainability; their titles are: maintenance optimization of stand-by Diesels of 900 MW nuclear power plants; CLAIRE: an event-based simulation tool for software testing; reliability as one important issue within the periodic safety review of nuclear power plants; design of nuclear building ventilation by the means of functional analysis; operation characteristic analysis for a power industry plant park, as a function of influence parameters

  20. Reliability data book

    International Nuclear Information System (INIS)

    Bento, J.P.; Boerje, S.; Ericsson, G.; Hasler, A.; Lyden, C.O.; Wallin, L.; Poern, K.; Aakerlund, O.

    1985-01-01

    The main objective for the report is to improve failure data for reliability calculations as parts of safety analyses for Swedish nuclear power plants. The work is based primarily on evaluations of failure reports as well as information provided by the operation and maintenance staff of each plant. In the report are presented charts of reliability data for: pumps, valves, control rods/rod drives, electrical components, and instruments. (L.E.)

  1. Rapid identification of tomato Sw-5 resistance-breaking isolates of Tomato spotted wilt virus using high resolution melting and TaqMan SNP Genotyping assays as allelic discrimination techniques.

    Directory of Open Access Journals (Sweden)

    Valentina di Rienzo

    In tomato, resistance to Tomato spotted wilt virus (TSWV) is conferred by the dominant gene designated Sw-5. Virulent Sw-5 resistance breaking (SRB) mutants of TSWV have been reported on Sw-5 tomato cultivars. Two different PCR-based allelic discrimination techniques, namely Custom TaqMan™ SNP Genotyping and high-resolution melting (HRM) assays, were developed and compared for their ability to distinguish between avirulent (Sw-5 non-infecting, SNI) and SRB biotypes. TaqMan assays proved to be more sensitive (threshold of detection in a range of 50-70 TSWV RNA copies) and more reliable than HRM, assigning 25 TSWV isolates to their correct genotype with an accuracy of 100%. Moreover, the TaqMan SNP assays were further improved by developing a rapid and simple protocol that included crude leaf extraction for RNA template preparation. On the other hand, HRM assays showed higher levels of sensitivity than TaqMan when used to co-detect both biotypes in different artificial mixtures. These diagnostic assays provided preliminary information on the epidemiology of TSWV isolates under open-field conditions. In fact, the presented data suggest that SRB isolates are present as stable populations established year round, persisting on both winter (globe artichoke) and summer (tomato) crops in the same cultivated areas of Southern Italy.

  2. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for the detection of genotoxic carcinogens: I. Summary of pre-validation study results.

    Science.gov (United States)

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamistu; Schechtman, Leonard M; Tice, Raymond R; Burlinson, Brian; Escobar, Patricia A; Kraynak, Andrew R; Nakagawa, Yuzuki; Nakajima, Madoka; Pant, Kamala; Asano, Norihide; Lovell, David; Morita, Takeshi; Ohno, Yasuo; Hayashi, Makoto

    2015-07-01

    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this validation effort was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The purpose of the pre-validation studies (i.e., Phase 1 through 3), conducted in four or five laboratories with extensive comet assay experience, was to optimize the protocol to be used during the definitive validation study. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Mobile Internet Protocol Analysis

    National Research Council Canada - National Science Library

    Brachfeld, Lawrence

    1999-01-01

    ...) and User Datagram Protocol (UDP). Mobile IP allows mobile computers to send and receive packets addressed with their home network IP address, regardless of the IP address of their current point of attachment on the Internet...

  4. Reliable Internet Routing

    Science.gov (United States)

    2011-09-01


  5. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  6. Analysis and Application of Reliability

    International Nuclear Information System (INIS)

    Jeong, Hae Seong; Park, Dong Ho; Kim, Jae Ju

    1999-05-01

    This book covers the analysis and application of reliability, including the definition, importance, and historical background of reliability; the reliability function and failure rate; life distributions and reliability assumptions; the reliability of non-repairable and repairable systems; reliability sampling tests; failure analysis, such as analysis by FMEA and FTA, with case studies; accelerated life testing, including basic concepts, acceleration and acceleration factors, and the analysis of accelerated life test data; and maintenance policies concerning replacement and inspection.

  7. USA-USSR protocol

    CERN Multimedia

    1970-01-01

    On 30 November the USA Atomic Energy Commission and the USSR State Committee for the Utilization of Atomic Energy signed, in Washington, a protocol 'on carrying out of joint projects in the field of high energy physics at the accelerators of the National Accelerator Laboratory (Batavia) and the Institute for High Energy Physics (Serpukhov)'. The protocol will be in force for five years and can be extended by mutual agreement.

  8. The Design of Finite State Machine for Asynchronous Replication Protocol

    Science.gov (United States)

    Wang, Yanlong; Li, Zhanhuai; Lin, Wei; Hei, Minglei; Hao, Jianhua

    Data replication is a key way to design a disaster tolerance system and to achieve reliability and availability. It is difficult for a replication protocol to deal with diverse and complex environments, which means that data is often less well replicated than it ought to be. To reduce data loss and to optimize replication protocols, we (1) present a finite state machine, (2) run it to manage an asynchronous replication protocol, and (3) report a simple evaluation of the asynchronous replication protocol based on our state machine. We show that the state machine keeps the asynchronous replication protocol in a proper state to the largest extent possible in the face of the various events that can occur. It can also help in building replication-based disaster tolerance systems that ensure business continuity.
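
    The paper's transition table is not reproduced in this record, so the following is only a minimal sketch of how a replication protocol can be driven by a finite state machine; the states and events (REPLICATING, BUFFERING, RESYNCING, link_down, and so on) are invented for illustration and are not taken from the authors' design.

```python
# Minimal finite-state-machine sketch for an asynchronous replication protocol.
# States, events, and transitions are hypothetical and only illustrate the idea.

class ReplicationFSM:
    # (current state, event) -> next state
    TRANSITIONS = {
        ("REPLICATING", "link_down"):      "BUFFERING",   # keep writes locally
        ("REPLICATING", "replica_failed"): "DEGRADED",
        ("BUFFERING",   "link_up"):        "RESYNCING",   # ship the backlog
        ("RESYNCING",   "resync_done"):    "REPLICATING",
        ("RESYNCING",   "link_down"):      "BUFFERING",
        ("DEGRADED",    "replica_joined"): "RESYNCING",
    }

    def __init__(self) -> None:
        self.state = "REPLICATING"

    def handle(self, event: str) -> str:
        """Apply an event; events with no defined transition leave the state unchanged."""
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state


if __name__ == "__main__":
    fsm = ReplicationFSM()
    for event in ["link_down", "link_up", "resync_done"]:
        print(f"{event} -> {fsm.handle(event)}")
```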

  9. Further Development and Validation of the Frog Embryo Teratogenesis Assay - Xenopus (FETAX). Phase III

    National Research Council Canada - National Science Library

    Bantle, John

    1996-01-01

    This interlaboratory study of the Frog Embryo Teratogenesis Assay (FETAX) was undertaken in order to assess the repeatability and reliability of data collected under the guide published by the American Society for Testing and Materials...

  10. Reliable Communication in Wireless Meshed Networks using Network Coding

    DEFF Research Database (Denmark)

    Pahlevani, Peyman; Paramanathan, Achuthan; Hundebøll, Martin

    2012-01-01

    The advantages of network coding have been extensively studied in the field of wireless networks. Integrating network coding with the existing IEEE 802.11 MAC layer is a challenging problem. The IEEE 802.11 MAC does not provide any reliability mechanisms for overheard packets. This paper addresses...... this problem and suggests different mechanisms to support reliability as part of the MAC protocol. Analytical expressions for this problem are given to quantify the performance of the modified network coding. These expressions are confirmed by numerical results. While the suggested reliability mechanisms...

  11. Validation of the Thermo Scientific SureTect Escherichia coli O157:H7 Real-Time PCR Assay for Raw Beef and Produce Matrixes.

    Science.gov (United States)

    Cloke, Jonathan; Crowley, Erin; Bird, Patrick; Bastin, Ben; Flannery, Jonathan; Agin, James; Goins, David; Clark, Dorn; Radcliff, Roy; Wickstrand, Nina; Kauppinen, Mikko

    2015-01-01

    . coli O157:NM isolate. Nonmotile isolates of E. coli O157 have been demonstrated to still contain the H7 gene; therefore, this result is not unexpected. Robustness testing was conducted to evaluate the performance of the SureTect assay with specific deviations from the assay protocol, which were outside the recommended parameters and which are open to variation. This study demonstrated that the SureTect assay gave reliable performance. A final study to verify the shelf life of the product under accelerated conditions was also conducted.

  12. Safety and reliability criteria

    International Nuclear Information System (INIS)

    O'Neil, R.

    1978-01-01

    Nuclear power plants and, in particular, reactor pressure boundary components have unique reliability requirements, in that usually no significant redundancy is possible, and a single failure can give rise to possible widespread core damage and fission product release. Reliability may be required for availability or safety reasons, but in the case of the pressure boundary and certain other systems safety may dominate. Possible Safety and Reliability (S and R) criteria are proposed which would produce acceptable reactor design. Without some S and R requirement the designer has no way of knowing how far he must go in analysing his system or component, or whether his proposed solution is likely to gain acceptance. The paper shows how reliability targets for given components and systems can be individually considered against the derived S and R criteria at the design and construction stage. Since in the case of nuclear pressure boundary components there is often very little direct experience on which to base reliability studies, relevant non-nuclear experience is examined. (author)

  13. Proposed reliability cost model

    Science.gov (United States)

    Delionback, L. M.

    1973-01-01

    The research investigations which were involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand, and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach is dependent upon the use of a series of subsystem-oriented CER's and sometimes possible CTR's, in devising a suitable cost-effective policy.

  14. Genotoxicity testing: Comparison of the γH2AX focus assay with the alkaline and neutral comet assays.

    Science.gov (United States)

    Nikolova, Teodora; Marini, Federico; Kaina, Bernd

    2017-10-01

    Genotoxicity testing relies on the quantitative measurement of adverse effects, such as chromosome aberrations, micronuclei, and mutations, resulting from primary DNA damage. Ideally, assays will detect DNA damage and cellular responses with high sensitivity, reliability, and throughput. Several novel genotoxicity assays may fulfill these requirements, including the comet assay and the more recently developed γH2AX assay. Although they are thought to be specific for genotoxicants, a systematic comparison of the assays has not yet been undertaken. In the present study, we compare the γH2AX focus assay with the alkaline and neutral versions of the comet assay, as to their sensitivities and limitations for detection of genetic damage. We investigated the dose-response relationships of γH2AX foci and comet tail intensities at various times following treatment with four prototypical genotoxicants, methyl methanesulfonate (MMS), N-methyl-N'-nitro-N-nitrosoguanidine (MNNG), mitomycin C, and hydrogen peroxide (H2O2) and we tested whether there is a correlation between the endpoints, i.e., alkali-labile sites and DNA strand breaks on the one hand and the cell's response to DNA double-strand breaks and blocked replication forks on the other. Induction of γH2AX foci gave a linear dose response and all agents tested were positive in the assay. The increase in comet tail intensity was also a function of dose; however, mitomycin C was almost completely ineffective in the comet assay, and the doses needed to achieve a significant effect were somewhat higher for some treatments in the comet assay than in the γH2AX foci assay, which was confirmed by threshold analysis. There was high correlation between tail intensity and γH2AX foci for MMS and H2O2, less for MNNG, and none for mitomycin C. From this we infer that the γH2AX foci assay is more reliable, sensitive, and robust than the comet assay for detecting genotoxicant-induced DNA damage. Copyright © 2017 Elsevier
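
    The comparison above rests on linear dose-response fits for each endpoint and on the correlation between comet tail intensity and γH2AX focus counts. The sketch below shows how such a fit and a Pearson correlation could be computed; the dose and response values are placeholders, not data from the study.

```python
import numpy as np

# Placeholder dose-response data (arbitrary units); not values from the study.
dose           = np.array([0.0, 5.0, 10.0, 20.0, 40.0])   # genotoxicant dose
gamma_h2ax     = np.array([2.1, 6.0, 11.8, 22.5, 44.0])   # mean foci per cell
tail_intensity = np.array([1.0, 2.5,  4.8,  9.0, 16.5])   # % DNA in comet tail

# Linear dose-response fit (slope) for each endpoint.
slope_foci, _ = np.polyfit(dose, gamma_h2ax, 1)
slope_tail, _ = np.polyfit(dose, tail_intensity, 1)

# Pearson correlation between the two endpoints across doses.
r = np.corrcoef(gamma_h2ax, tail_intensity)[0, 1]

print(f"gammaH2AX slope: {slope_foci:.2f} foci per dose unit")
print(f"tail intensity slope: {slope_tail:.2f} % per dose unit")
print(f"Pearson r between endpoints: {r:.3f}")
```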

  15. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  16. Issues in cognitive reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Hitchler, M.J.; Rumancik, J.A.

    1984-01-01

    This chapter examines some problems in current methods to assess reactor operator reliability at cognitive tasks and discusses new approaches to solve these problems. The two types of human failures are errors in the execution of an intention and errors in the formation/selection of an intention. Topics considered include the types of description, error correction, cognitive performance and response time, the speed-accuracy tradeoff function, function based task analysis, and cognitive task analysis. One problem of human reliability analysis (HRA) techniques in general is the question of what the units of behavior are whose reliability is to be determined. A second problem for HRA is that people often detect and correct their errors. The use of function based analysis, which maps the problem space for plant control, is recommended

  17. Reliability issues in PACS

    Science.gov (United States)

    Taira, Ricky K.; Chan, Kelby K.; Stewart, Brent K.; Weinberg, Wolfram S.

    1991-07-01

    Reliability is an increasing concern when moving PACS from the experimental laboratory to the clinical environment. Any system downtime may seriously affect patient care. The authors report on the several classes of errors encountered during the pre-clinical release of the PACS during the past several months and present the solutions implemented to handle them. The reliability issues discussed include: (1) environmental precautions, (2) database backups, (3) monitor routines of critical resources and processes, (4) hardware redundancy (networks, archives), and (5) development of a PACS quality control program.

  18. Reliability Parts Derating Guidelines

    Science.gov (United States)

    1982-06-01

    226-30, October 1974. 26. "Reliability of GaAs Injection Lasers", De Loach, B. C., Jr., 1973 IEEE/OSA Conference on Laser Engineering and... Vol. R-23, No. 4, 226-30, October 1974. 28. "Reliability of GaAs Injection Lasers", De Loach, B. C., Jr., 1973 IEEE/OSA Conference on Laser... operation at ... deg C, mounted on a 4-inch square, 0.250-inch thick aluminum alloy panel. This mounting technique should be taken into consideration.

  19. International Network for Comparison of HIV Neutralization Assays: The NeutNet Report

    NARCIS (Netherlands)

    Fenyö, Eva Maria; Heath, Alan; Dispinseri, Stefania; Holmes, Harvey; Lusso, Paolo; Zolla-Pazner, Susan; Donners, Helen; Heyndrickx, Leo; Alcami, Jose; Bongertz, Vera; Jassoy, Christian; Malnati, Mauro; Montefiori, David; Moog, Christiane; Morris, Lynn; Osmanov, Saladin; Polonis, Victoria; Sattentau, Quentin; Schuitemaker, Hanneke; Sutthent, Ruengpung; Wrin, Terri; Scarlatti, Gabriella

    2009-01-01

    BACKGROUND: Neutralizing antibody assessments play a central role in human immunodeficiency virus type-1 (HIV-1) vaccine development but it is unclear which assay, or combination of assays, will provide reliable measures of correlates of protection. To address this, an international collaboration

  20. Rapid multiple immunoenzyme assay of mycotoxins.

    Science.gov (United States)

    Urusov, Alexandr E; Zherdev, Anatoly V; Petrakova, Alina V; Sadykhov, Elchin G; Koroleva, Olga V; Dzantiev, Boris B

    2015-01-27

    Mycotoxins are low molecular weight fungal metabolites that pose a threat as toxic contaminants of food products, thereby necessitating their effective monitoring and control. Microplate ELISA can be used for this purpose, but this method is characteristically time consuming, with a duration extending to several hours. This report proposes a variant of the ELISA method for the detection and quantification of three mycotoxins, ochratoxin A, aflatoxin B1 and zearalenone, in the kinetic regime. The main requirement for the proposed kinetic protocol was to provide a rapid method that combined sensitivity and accuracy. The use of biotin with an extended spacer together with a streptavidin-polyperoxidase conjugate provided high signal levels, despite these interactions occurring under non-equilibrium conditions. Duration of the individual mycotoxin assays was 20 min, whereas the analysis of all three mycotoxins in parallel reached a maximum duration of 25 min. Recovery of at least 95% mycotoxins in water-organic extracts was shown. The developed assays were successfully validated using poultry processing products and corn samples spiked with known quantities of mycotoxins. The detection limits for aflatoxin B1, ochratoxin A and zearalenone in these substances were 0.24, 1.2 and 3 ng/g, respectively.
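
    Quantification in an ELISA of this kind ultimately comes from a calibration curve fitted to standards. The sketch below fits a four-parameter logistic curve and interpolates an unknown sample, purely to illustrate the principle; the standard concentrations and optical densities are invented, and the four-parameter logistic form is a common choice for competitive ELISA rather than a model stated in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: a = upper asymptote, d = lower asymptote,
    c = inflection concentration, b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Invented calibration standards for a competitive mycotoxin ELISA
# (signal decreases as analyte concentration increases).
conc   = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])       # ng/mL
signal = np.array([1.95, 1.80, 1.40, 0.90, 0.45, 0.20])   # optical density

params, _ = curve_fit(four_pl, conc, signal, p0=[2.0, 1.0, 2.0, 0.1])

def conc_from_signal(y, a, b, c, d):
    """Invert the fitted 4PL curve to estimate concentration from a sample signal."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

sample_od = 1.10
print(f"Estimated concentration: {conc_from_signal(sample_od, *params):.2f} ng/mL")
```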

  1. Rapid Multiple Immunoenzyme Assay of Mycotoxins

    Directory of Open Access Journals (Sweden)

    Alexandr E. Urusov

    2015-01-01

    Full Text Available Mycotoxins are low molecular weight fungal metabolites that pose a threat as toxic contaminants of food products, thereby necessitating their effective monitoring and control. Microplate ELISA can be used for this purpose, but this method is characteristically time consuming, with a duration extending to several hours. This report proposes a variant of the ELISA method for the detection and quantification of three mycotoxins, ochratoxin A, aflatoxin B1 and zearalenone, in the kinetic regime. The main requirement for the proposed kinetic protocol was to provide a rapid method that combined sensitivity and accuracy. The use of biotin with an extended spacer together with a streptavidin–polyperoxidase conjugate provided high signal levels, despite these interactions occurring under non-equilibrium conditions. Duration of the individual mycotoxin assays was 20 min, whereas the analysis of all three mycotoxins in parallel reached a maximum duration of 25 min. Recovery of at least 95% mycotoxins in water-organic extracts was shown. The developed assays were successfully validated using poultry processing products and corn samples spiked with known quantities of mycotoxins. The detection limits for aflatoxin B1, ochratoxin A and zearalenone in these substances were 0.24, 1.2 and 3 ng/g, respectively.

  2. The reliability of knee joint position testing using electrogoniometry

    Directory of Open Access Journals (Sweden)

    Winter Adele

    2008-01-01

    Full Text Available Abstract Background The current investigation examined the inter- and intra-tester reliability of knee joint angle measurements using a flexible Penny and Giles Biometric® electrogoniometer. The clinical utility of electrogoniometry was also addressed. Methods The first study examined the inter- and intra-tester reliability of measurements of knee joint angles in supine, sitting and standing in 35 healthy adults. The second study evaluated inter-tester and intra-tester reliability of knee joint angle measurements in standing and after walking 10 metres in 20 healthy adults, using an enhanced measurement protocol with a more detailed electrogoniometer attachment procedure. Both inter-tester reliability studies involved two testers. Results In the first study, inter-tester reliability (ICC[2,10]) ranged from 0.58–0.71 in supine, 0.68–0.79 in sitting and 0.57–0.80 in standing. The standard error of measurement between testers was less than 3.55° and the limits of agreement ranged from -12.51° to 12.21°. Reliability coefficients for intra-tester reliability (ICC[3,10]) ranged from 0.75–0.76 in supine, 0.86–0.87 in sitting and 0.87–0.88 in standing. The standard error of measurement for repeated measures by the same tester was less than 1.7° and the limits of agreement ranged from -8.13° to 7.90°. The second study showed that using a more detailed electrogoniometer attachment protocol reduced the error of measurement between testers to 0.5°. Conclusion Using a standardised protocol, reliable measures of knee joint angles can be gained in standing, supine and sitting by using a flexible goniometer.
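
    The statistics quoted above (ICC, standard error of measurement, limits of agreement) can be reproduced from a subjects-by-raters table of angle readings. The sketch below computes a single-measures ICC(2,1), an SEM, and Bland-Altman limits of agreement from invented data; note that the study itself reports average-measures coefficients (ICC[2,10] and ICC[3,10]), and SEM = SD·sqrt(1 − ICC) is only one common convention.

```python
import numpy as np

# Invented knee-angle readings (degrees): rows = subjects, columns = raters.
x = np.array([
    [58.0, 60.0],
    [42.5, 44.0],
    [65.0, 63.5],
    [50.0, 51.5],
    [70.5, 72.0],
])
n, k = x.shape
grand = x.mean()

# Two-way ANOVA mean squares (no replication).
ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
ss_err = ((x - x.mean(axis=1, keepdims=True)
             - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
ms_err = ss_err / ((n - 1) * (k - 1))

# ICC(2,1): two-way random effects, absolute agreement, single measures.
icc = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# SEM = SD * sqrt(1 - ICC) is one common convention; LoA from rater differences.
sem = x.std(ddof=1) * np.sqrt(1.0 - icc)
diff = x[:, 0] - x[:, 1]
loa = (diff.mean() - 1.96 * diff.std(ddof=1), diff.mean() + 1.96 * diff.std(ddof=1))

print(f"ICC(2,1) = {icc:.2f}, SEM = {sem:.2f} deg, LoA = ({loa[0]:.2f}, {loa[1]:.2f}) deg")
```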

  3. Business protocol in integrated Europe

    OpenAIRE

    Pavelová, Nina

    2009-01-01

    The first chapter devotes to definitions of basic terms such as protocol or business protocol, to differences between protocol and etiquette, and between social etiquette and business etiquette. The second chapter focuses on the factors influencing the European business protocol. The third chapter is devoted to the etiquette of business protocol in the European countries. It touches the topics such as punctuality and planning of business appointment, greeting, business cards, dress and appear...

  4. Security Protocols in a Nutshell

    OpenAIRE

    Toorani, Mohsen

    2016-01-01

    Security protocols are building blocks in secure communications. They deploy some security mechanisms to provide certain security services. Security protocols are considered abstract when analyzed, but they can have extra vulnerabilities when implemented. This manuscript provides a holistic study on security protocols. It reviews foundations of security protocols, taxonomy of attacks on security protocols and their implementations, and different methods and models for security analysis of pro...

  5. Rapid 2,2'-bicinchoninic-based xylanase assay compatible with high throughput screening

    Science.gov (United States)

    William R. Kenealy; Thomas W. Jeffries

    2003-01-01

    High-throughput screening requires simple assays that give reliable quantitative results. A microplate assay was developed for reducing sugar analysis that uses a 2,2'-bicinchoninic-based protein reagent. Endo-1,4-β-D-xylanase activity against oat spelt xylan was detected at activities of 0.002 to 0.011 IU ml−1. The assay is linear for sugar...
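
    Activity in a reducing-sugar endpoint assay of this kind is normally derived from a sugar standard curve. The sketch below fits a linear xylose standard curve and converts a sample absorbance into international units per millilitre; the absorbances, incubation time, enzyme volume, and dilution factor are all assumed values used only to illustrate the arithmetic.

```python
import numpy as np

# Invented xylose standard curve for a BCA-type reducing-sugar microplate assay.
xylose_umol = np.array([0.00, 0.05, 0.10, 0.20, 0.40])   # µmol reducing sugar per well
absorbance  = np.array([0.05, 0.17, 0.29, 0.55, 1.05])   # measured absorbance

slope, intercept = np.polyfit(xylose_umol, absorbance, 1)

def umol_from_abs(a: float) -> float:
    """Convert an absorbance reading to µmol reducing sugar via the standard curve."""
    return (a - intercept) / slope

# Assumed assay conditions (illustrative only).
incubation_min   = 30.0    # reaction time, minutes
enzyme_volume_ml = 0.05    # volume of diluted enzyme added per well, mL
dilution_factor  = 10.0    # pre-dilution of the enzyme stock

sample_abs, blank_abs = 0.62, 0.06
released_umol = umol_from_abs(sample_abs) - umol_from_abs(blank_abs)

# 1 IU = 1 µmol reducing sugar released per minute.
activity_iu_per_ml = released_umol / incubation_min / enzyme_volume_ml * dilution_factor
print(f"Xylanase activity: {activity_iu_per_ml:.2f} IU/mL")
```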

  6. Genotyping assay for differentiation of wild-type and vaccine viruses in subjects immunized with live attenuated influenza vaccine.

    Directory of Open Access Journals (Sweden)

    Victoria Matyushenko

    Full Text Available Live attenuated influenza vaccines (LAIVs) are considered a safe and effective tool to control influenza in different age groups, especially in young children. An important part of the LAIV safety evaluation is the detection of vaccine virus replication in the nasopharynx of the vaccinees, with special attention to a potential virus transmission to the unvaccinated close contacts. Conducting LAIV clinical trials in some geographical regions with year-round circulation of influenza viruses warrants the development of robust and reliable tools for differentiating vaccine viruses from wild-type influenza viruses in nasal pharyngeal wash (NPW) specimens of vaccinated subjects. Here we report the development of a genotyping assay for the detection of wild-type and vaccine-type influenza virus genes in NPW specimens of young children immunized with Russian-backbone seasonal trivalent LAIV using Sanger sequencing from newly designed universal primers. The new primer set allowed amplification and sequencing of short fragments of viral genes in NPW specimens and appeared to be more sensitive than conventional real-time RT-PCR protocols routinely used for the detection and typing/subtyping of influenza virus in humans. Furthermore, the new assay is capable of defining the origin of wild-type influenza virus through BLAST search with the generated sequences of viral gene fragments.

  7. Performance standard-based validation study for local lymph node assay: 5-bromo-2-deoxyuridine-flow cytometry method.

    Science.gov (United States)

    Ahn, Ilyoung; Kim, Tae-Sung; Jung, Eun-Sun; Yi, Jung-Sun; Jang, Won-Hee; Jung, Kyoung-Mi; Park, Miyoung; Jung, Mi-Sook; Jeon, Eun-Young; Yeo, Kyeong-Uk; Jo, Ji-Hoon; Park, Jung-Eun; Kim, Chang-Yul; Park, Yeong-Chul; Seong, Won-Keun; Lee, Ai-Young; Chun, Young Jin; Jeong, Tae Cheon; Jeung, Eui Bae; Lim, Kyung-Min; Bae, SeungJin; Sohn, Soojung; Heo, Yong

    2016-10-01

    Local lymph node assay: 5-bromo-2-deoxyuridine-flow cytometry method (LLNA: BrdU-FCM) is a modified non-radioisotopic technique with the additional advantages of accommodating multiple endpoints with the introduction of FCM, and refinement and reduction of animal use by using a sophisticated prescreening scheme. Reliability and accuracy of the LLNA: BrdU-FCM were determined according to OECD Test Guideline (TG) No. 429 (Skin Sensitization: Local Lymph Node Assay) performance standards (PS), with the participation of four laboratories. Transferability was demonstrated through successfully producing stimulation index (SI) values for 25% hexyl cinnamic aldehyde (HCA) consistently greater than 3, a predetermined threshold, by all participating laboratories. Within- and between-laboratory reproducibility was shown using HCA and 2,4-dinitrochlorobenzene, in which EC2.7 values (the estimated concentrations eliciting an SI of 2.7, the threshold for LLNA: BrdU-FCM) fell consistently within the acceptance ranges, 0.025-0.1% and 5-20%, respectively. Predictive capacity was tested using the final protocol version 1.3 for the 18 reference chemicals listed in OECD TG 429; the results showed 84.6% sensitivity, 100% specificity, and 88.9% accuracy compared with the original LLNA. The data presented are considered to meet the performance criteria for the PS, and its predictive capacity was also sufficiently validated. Copyright © 2016 Elsevier Inc. All rights reserved.
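
    The predictive-capacity figures quoted above follow from simple confusion-matrix arithmetic over the 18 reference chemicals. The split into 13 sensitizers and 5 non-sensitizers used below is inferred from the percentages rather than stated in the abstract.

```python
# Confusion-matrix arithmetic consistent with the quoted figures.
# The 13/5 split between sensitizers and non-sensitizers is inferred, not quoted.
tp, fn = 11, 2   # sensitizers classified correctly / incorrectly
tn, fp = 5, 0    # non-sensitizers classified correctly / incorrectly

sensitivity = tp / (tp + fn)                   # 11/13 ~ 84.6%
specificity = tn / (tn + fp)                   # 5/5   = 100%
accuracy    = (tp + tn) / (tp + tn + fp + fn)  # 16/18 ~ 88.9%

print(f"sensitivity={sensitivity:.1%}  specificity={specificity:.1%}  accuracy={accuracy:.1%}")
```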

  8. DNA repair protocols

    DEFF Research Database (Denmark)

    Bjergbæk, Lotte

    In its 3rd edition, this Methods in Molecular Biology(TM) book covers the eukaryotic response to genomic insult including advanced protocols and standard techniques in the field of DNA repair. Offers expert guidance for DNA repair, recombination, and replication. Current knowledge of the mechanisms...... that regulate DNA repair has grown significantly over the past years with technology advances such as RNA interference, advanced proteomics and microscopy as well as high throughput screens. The third edition of DNA Repair Protocols covers various aspects of the eukaryotic response to genomic insult including...... recent advanced protocols as well as standard techniques used in the field of DNA repair. Both mammalian and non-mammalian model organisms are covered in the book, and many of the techniques can be applied with only minor modifications to other systems than the one described. Written in the highly...

  9. Development of kits for radioimmunometric assays for tumour markers. Final report of a co-ordinated research project 1997-2001

    International Nuclear Information System (INIS)

    2002-08-01

    Many tumour marker assays have been reported over the years and their role is well recognized and acknowledged in the follow-up of known cancer cases. However, their true potential for use in primary diagnosis or screening of high risk groups is still to be fully realized due to the need to achieve better specificity. Among the various tumour markers, the one for prostate cancer - prostate specific antigen (PSA) - appears to have better specificity, coming close to a tumour specific antigen. Prostate cancer is a commonly encountered cancer in men, and can be effectively treated if detected early. PSA levels in serum appear to provide good correlation with tumour burden. Estimation of free PSA in serum is reported to further improve the diagnosis. In several developed countries routine screening of men above 50 years of age for prostate cancer using serum PSA as marker is recommended. Radioimmunometric assay techniques offer themselves as attractive candidates for measurement of tumour markers. They are robust, economical and didactic, thus eminently suitable for technology transfer, training and teaching. Preparation of primary reagents is relatively easy. The methodology is flexible. As a result of co-operation projects of the IAEA, many developing Member States have built up indigenous capabilities to perform radioimmunometric assays, which can be extended to development of kits for tumour marker assays. Considering the need for indigenous development of capabilities to produce reliable kits for radioimmunometric assays for PSA, in 1997 the IAEA initiated a Co-ordinated Research Project (CRP) on Development of Kits for Radioimmunometric Assays for Tumour Markers. Even though the focus of the project was PSA, it was expected that the expertise to be gained by the participants would also help them undertake development of kits for other tumour markers, essentially using the same methodology. Ten laboratories from Europe, Asia, Africa and the Americas participated

  10. Columbus safety and reliability

    Science.gov (United States)

    Longhurst, F.; Wessels, H.

    1988-10-01

    Analyses carried out to ensure Columbus reliability, availability, and maintainability, and operational and design safety are summarized. Failure modes/effects/criticality is the main qualitative tool used. The main aspects studied are fault tolerance, hazard consequence control, risk minimization, human error effects, restorability, and safe-life design.

  11. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination, or reproducibility of results is very poor. On the other hand, a defect can be detected on each subsequent examination, giving high reliability, while the reproducibility of the results remains poor.

  12. Power transformer reliability modelling

    NARCIS (Netherlands)

    Schijndel, van A.

    2010-01-01

    Problem description Electrical power grids serve to transport and distribute electrical power with high reliability and availability at acceptable costs and risks. These grids play a crucial though preferably invisible role in supplying sufficient power in a convenient form. Today’s society has

  13. Designing reliability into accelerators

    International Nuclear Information System (INIS)

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories," reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: concept; design; motivation; management techniques; and fault diagnosis

  14. Proof tests on reliability

    International Nuclear Information System (INIS)

    Mishima, Yoshitsugu

    1983-01-01

    In order to obtain public understanding of nuclear power plants, tests should be carried out to prove the reliability and safety of present LWR plants. For example, the aseismicity of nuclear power plants must be verified by using a large-scale earthquake simulator. Reliability testing began in fiscal 1975, and the proof tests on steam generators and on PWR support and flexure pins against stress corrosion cracking have already been completed; the results have been highly appreciated internationally. The capacity factor of nuclear power plant operation in Japan rose to 80% in the summer of 1983, which, allowing for the periods of regular inspection, means operation at almost full capacity. Japanese LWR technology has now risen to the top place in the world after having overcome its early defects. The significance of the reliability tests is to ensure functioning until the age limit is reached, to confirm correct forecasting of deterioration processes, to confirm the effectiveness of remedies for defects, and to confirm the accuracy of predictions of facility behavior. The reliability of nuclear valves, fuel assemblies, heat-affected zones in welding, reactor cooling pumps, and electric instruments has been tested or is being tested. (Kako, I.)

  15. Reliability and code level

    NARCIS (Netherlands)

    Kasperski, M.; Geurts, C.P.W.

    2005-01-01

    The paper describes the work of the IAWE Working Group WBG - Reliability and Code Level, one of the International Codification Working Groups set up at ICWE10 in Copenhagen. The following topics are covered: sources of uncertainties in the design wind load, appropriate design target values for the

  16. Reliability of Plastic Slabs

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    1989-01-01

    In the paper it is shown how upper and lower bounds for the reliability of plastic slabs can be determined. For the fundamental case it is shown that optimal bounds of a deterministic and a stochastic analysis are obtained on the basis of the same failure mechanisms and the same stress fields....

  17. Reliability based structural design

    NARCIS (Netherlands)

    Vrouwenvelder, A.C.W.M.

    2014-01-01

    According to ISO 2394, structures shall be designed, constructed and maintained in such a way that they are suited for their use during the design working life in an economic way. To fulfil this requirement one needs insight into the risk and reliability under expected and non-expected actions. A

  18. Travel time reliability modeling.

    Science.gov (United States)

    2011-07-01

    This report includes three papers as follows: : 1. Guo F., Rakha H., and Park S. (2010), "A Multi-state Travel Time Reliability Model," : Transportation Research Record: Journal of the Transportation Research Board, n 2188, : pp. 46-54. : 2. Park S.,...

  19. Reliability and Model Fit

    Science.gov (United States)

    Stanley, Leanne M.; Edwards, Michael C.

    2016-01-01

    The purpose of this article is to highlight the distinction between the reliability of test scores and the fit of psychometric measurement models, reminding readers why it is important to consider both when evaluating whether test scores are valid for a proposed interpretation and/or use. It is often the case that an investigator judges both the…

  20. Parametric Mass Reliability Study

    Science.gov (United States)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon board of PCBs, typically are the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs to reliability, as well as the mass of ORU subcomponents to reliability.

  1. A Field-Based Testing Protocol for Assessing Gross Motor Skills in Preschool Children: The Children's Activity and Movement in Preschool Study Motor Skills Protocol

    Science.gov (United States)

    Williams, Harriet G.; Pfeiffer, Karin A.; Dowda, Marsha; Jeter, Chevy; Jones, Shaverra; Pate, Russell R.

    2009-01-01

    The purpose of this study was to develop a valid and reliable tool for use in assessing motor skills in preschool children in field-based settings. The development of the Children's Activity and Movement in Preschool Study Motor Skills Protocol included evidence of its reliability and validity for use in field-based environments as part of large…

  2. A Protocol for Advanced Psychometric Assessment of Surveys

    Science.gov (United States)

    Squires, Janet E.; Hayduk, Leslie; Hutchinson, Alison M.; Cranley, Lisa A.; Gierl, Mark; Cummings, Greta G.; Norton, Peter G.; Estabrooks, Carole A.

    2013-01-01

    Background and Purpose. In this paper, we present a protocol for advanced psychometric assessments of surveys based on the Standards for Educational and Psychological Testing. We use the Alberta Context Tool (ACT) as an exemplar survey to which this protocol can be applied. Methods. Data mapping, acceptability, reliability, and validity are addressed. Acceptability is assessed with missing data frequencies and the time required to complete the survey. Reliability is assessed with internal consistency coefficients and information functions. A unitary approach to validity consisting of accumulating evidence based on instrument content, response processes, internal structure, and relations to other variables is taken. We also address assessing performance of survey data when aggregated to higher levels (e.g., nursing unit). Discussion. In this paper we present a protocol for advanced psychometric assessment of survey data using the Alberta Context Tool (ACT) as an exemplar survey; application of the protocol to the ACT survey is underway. Psychometric assessment of any survey is essential to obtaining reliable and valid research findings. This protocol can be adapted for use with any nursing survey. PMID:23401759
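
    Two of the assessments named in the protocol, missing-data frequencies for acceptability and internal-consistency coefficients for reliability, are straightforward to compute from raw item responses. The sketch below does both for a small invented response matrix; Cronbach's alpha is used as the internal-consistency coefficient, which is one common choice rather than a requirement of the protocol.

```python
import numpy as np

# Invented item responses: rows = respondents, columns = items of one concept;
# np.nan marks an item the respondent skipped.
responses = np.array([
    [4, 5, 4, np.nan],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [2, 2, np.nan, 2],
    [4, 4, 4, 5],
], dtype=float)

# Acceptability: missing-data frequency per item.
missing_rate = np.isnan(responses).mean(axis=0)
print("missing-data frequency per item:", np.round(missing_rate, 2))

# Reliability: Cronbach's alpha on complete cases.
complete = responses[~np.isnan(responses).any(axis=1)]
k = complete.shape[1]
item_variances = complete.var(axis=0, ddof=1).sum()
total_variance = complete.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1.0 - item_variances / total_variance)
print(f"Cronbach's alpha (complete cases): {alpha:.2f}")
```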

  3. Reliability Approach of a Compressor System using Reliability Block ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... This paper presents a reliability analysis of such a system using reliability ... Keywords-compressor system, reliability, reliability block diagram, RBD .... the same structure has been kept with the three subsystems: air flow, oil flow and .... and Safety in Engineering Design", Springer, 2009. [3] P. O'Connor ...

  4. Nested PCR Assay for Detection of Leishmania donovani in Slit Aspirates from Post-Kala-Azar Dermal Leishmaniasis Lesions

    Science.gov (United States)

    Sreenivas, Gannavaram; Ansari, N. A.; Kataria, Joginder; Salotra, Poonam

    2004-01-01

    A nested PCR assay to detect parasite DNA in slit aspirates from skin lesions of patients with post-kala-azar dermal leishmaniasis (PKDL) is described. PCR results were positive in 27 of 29 (93%) samples by nested PCR assay, while only 20 of 29 (69%) were positive in a primary PCR assay. The nested PCR assay allowed reliable diagnosis of PKDL in a noninvasive manner. PMID:15071047

  5. Nested PCR Assay for Detection of Leishmania donovani in Slit Aspirates from Post-Kala-Azar Dermal Leishmaniasis Lesions

    OpenAIRE

    Sreenivas, Gannavaram; Ansari, N. A.; Kataria, Joginder; Salotra, Poonam

    2004-01-01

    A nested PCR assay to detect parasite DNA in slit aspirates from skin lesions of patients with post-kala-azar dermal leishmaniasis (PKDL) is described. PCR results were positive in 27 of 29 (93%) samples by nested PCR assay, while only 20 of 29 (69%) were positive in a primary PCR assay. The nested PCR assay allowed reliable diagnosis of PKDL in a noninvasive manner.

  6. Teleportation protocol with non-ideal conditional local operations

    Energy Technology Data Exchange (ETDEWEB)

    Di Franco, C., E-mail: cdifranco@caesar.ucc.i [Department of Physics, University College Cork, Cork (Ireland); Ballester, D. [School of Mathematics and Physics, Queen's University, Belfast BT7 1NN (United Kingdom)

    2010-07-12

    We analyze the teleportation protocol when some of the receiver's conditional operations are more reliable than others and a non-maximally entangled channel is shared by the two parties. We show that the average fidelity of teleportation can be maximized by a proper choice of the basis in which the sender performs her two-qubit measurement.

  7. Protocol Monitoring Energy Conservation; Protocol Monitoring Energiebesparing

    Energy Technology Data Exchange (ETDEWEB)

    Boonekamp, P.G.M. [ECN Beleidsstudies, Petten (Netherlands); Mannaerts, H. [Centraal Planburea CPB, Den Haag (Netherlands); Tinbergen, W. [Centraal Bureau voor de Statistiek CBS, Den Haag (Netherlands); Vreuls, H.H.J. [Nederlandse onderneming voor energie en milieu Novem, Utrecht (Netherlands); Wesselink, B. [Rijksinstituut voor Volksgezondheid en Milieuhygiene RIVM, Bilthoven (Netherlands)

    2001-12-01

    At the request of the Dutch ministry of Economic Affairs, five institutes have collaborated to create a 'Protocol Monitoring Energy Conservation', a common method and database to calculate the amount of energy savings realised in past years. The institutes concerned are the Central Bureau of Statistics (CBS), the Netherlands Bureau for Economic Policy Analysis (CPB), the Energy research Centre of the Netherlands (ECN), the National Agency for Energy and Environment (Novem) and the Netherlands Institute of Public Health and the Environment (RIVM). The institutes have agreed upon a clear definition of energy use and energy savings. The demarcation with renewable energy, the saving effects of substitution between energy carriers and the role of import and export of energy have been elaborated. A decomposition method is used to split up the observed change in energy use into a number of effects, on a national and sectoral level. This method includes an analysis of growth effects, effects of structural changes in production and consumption activities and savings on end use or with more efficient conversion processes. To calculate these effects the total energy use is disaggregated as much as possible. For each segment a reference energy use is calculated according to the trend in a variable which is supposed to be representative for the use without savings. The difference with the actual energy use is taken as the savings realised. Results are given for the sectors households, industry, agriculture, services and government, transportation and the energy sector; as well as a national figure. A special feature of the protocol method is the application of primary energy use figures in the determination of savings for end users. This means that the use of each energy carrier is increased by a certain amount, according to the conversion losses caused elsewhere in the energy system. The losses concern the base year energy sector and losses abroad for imports of secondary
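
    The decomposition described above compares actual energy use with a reference use computed from an activity driver at base-year intensity, attributing the remainder to savings. A minimal single-sector sketch of that idea is given below with invented figures; a structure effect would require disaggregating the sector into separate activities and is omitted here.

```python
# Minimal single-sector decomposition sketch (all figures invented).
# Reference use = base-year energy intensity applied to current-year activity;
# savings = reference use minus actual use.

base_year = {"activity": 100.0, "energy_pj": 400.0}   # activity index, energy in PJ
current   = {"activity": 112.0, "energy_pj": 430.0}

base_intensity = base_year["energy_pj"] / base_year["activity"]   # PJ per activity unit

reference_use = base_intensity * current["activity"]    # use expected without savings
growth_effect = reference_use - base_year["energy_pj"]  # change explained by volume growth
savings       = reference_use - current["energy_pj"]    # efficiency / end-use savings

print(f"reference use:   {reference_use:.1f} PJ")
print(f"growth effect:  +{growth_effect:.1f} PJ")
print(f"savings:         {savings:.1f} PJ")
print(f"observed change: {current['energy_pj'] - base_year['energy_pj']:+.1f} PJ")
```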

  8. Code verification and Y2K testing, calibration, testing, and installation of the radionuclide assay system-photon (RAS-P) at multiple sites for the Savannah River Site

    International Nuclear Information System (INIS)

    Hodge, C.A.

    2000-01-01

    The Radionuclide Assay System - Photon (RAS-P) is a near-field, transmission-corrected assay system developed for measurement of the actinide content of relatively homogeneous waste generated by facility operations. It is intended for use by facility operations personnel, and has features to enhance its usefulness and efficiency. These include multinuclide assay capability, automatic (off-shift) collection of background and straight-through transmission source data, enforcement of measurement control requirements, Go-NoGo or Assay modes, password protection, and reporting of total fissile gram equivalent values. System hardware consists of a shielded high-resolution germanium detector, a turntable, a shielded transmission source and shutter assembly, and a desktop computer and laser printer mounted on a compact frame. RAS-P was designed to assay the contents of cylindrical containers up to 30 inches diameter by 32 inches high, boxes up to 30 inches diagonal by 32 inches high, and HEPA filters up to 2 x 2 x 1 feet. Prior to installation at the Savannah River Site (SRS), code validation, system performance, and assurance against Y2K effects all were confirmed. Code validation was accomplished using spreadsheet calculations that were independent of the original code to calculate the intermediate and final results produced by RAS-P. System testing was performed by repeated operation of the instrument under all required circumstances. Y2K testing was performed simultaneously with code validation following a protocol prescribed by the SRS Y2K subcommittee that required assays with dates varying throughout the expected useful life of the RAS-P, particularly those bracketing Y2K boundaries. Performance history has been compiled demonstrating reliability (system availability), diversability (the ability to alter assay parameters and obtain results quickly), and measurement control characteristics

  9. A quantitative assay for lysosomal acidification rates in human osteoclasts

    DEFF Research Database (Denmark)

    Jensen, Vicki Kaiser; Nosjean, Olivier; Dziegiel, Morten Hanefeld

    2011-01-01

    The osteoclast initiates resorption by creating a resorption lacuna. The ruffled border surrounding the lacunae arises from exocytosis of lysosomes. To dissolve the inorganic phase of the bone, the vacuolar adenosine triphosphatase, located in the ruffled border, pumps protons into the resorption...... assay with respect to lysosomal acidification and assess whether it is a reliable test of a compound's ability to inhibit acidification. Investigated were the expression levels of the lysosomal acidification machinery, the activation of the assay by adenosine triphosphate, H(+) and Cl(-) dependency...

  10. Reliability in the utility computing era: Towards reliable Fog computing

    DEFF Research Database (Denmark)

    Madsen, Henrik; Burtschy, Bernard; Albeanu, G.

    2013-01-01

    This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm, as a non-trivial extension of the Cloud, is considered, and the reliability of the networks of smart devices is discussed. Combining the reliability...... requirements of grid and cloud paradigms with the reliability requirements of networks of sensors and actuators, it follows that designing a reliable Fog computing platform is feasible....

  11. Micronucleus assay for human peripheral blood lymphocytes as a biomarker of individual sensitivity to assessing radiation health risk in different population

    International Nuclear Information System (INIS)

    Kang, C.-M.; Jeon, H.-J.; Lee, Y.-S.; Lee, S.-J.; Jin, Y.-H.; Kim, Y.-H.; Kim, T.-W.; Cho, C.-K.

    2003-01-01

    Full text: Our studies were designed to evaluate the micronucleus (MN) assay for human peripheral blood lymphocytes (HPBL) as a biomarker of individual sensitivity for assessing radiation health risk in different populations in Korea. Further studies were carried out to provide evidence for the existence of individual variations in age-dependent responses. For the MN assay, HPBLs were irradiated with doses of 0, 1, 2, 4, and 8 Gy of 60Co γ-rays. Spontaneous MN frequencies vary greatly not only between individuals but also between working or living areas, because groups with different lifestyles live in different ecological situations and react differently to radiation exposure. An increased level of spontaneous cells with MN was observed with increasing age. The relationship between radiosensitivity and the spontaneous level of MN may be one of inverse proportion. Age and gender are the demographic variables with the greatest impact on the MN index, with MN frequencies in females being greater than those in males by a factor depending on the age group. For both sexes, MN frequency was significantly and positively correlated with age. Among the lifestyle factors influencing the MN index, the spontaneous frequency of micronuclei was significantly and positively correlated with smoking. The results show that the rate of genetic damage, as reflected in the MN index, is significantly correlated with age, sex, and lifestyle factors in human populations. It is therefore evident that all future studies applying the MN assay to evaluate radiation health risks in different populations have to take the influence of age, gender, and lifestyle into account. The results suggest that the MN assay has high potential, provided appropriate quality control and a standard documentation protocol are ensured, for the epidemiological monitoring of large populations exposed to radiation. We conclude that the determination of individual radiosensitivity with the MN assay is

  12. Immunocytochemical methods and protocols

    National Research Council Canada - National Science Library

    Javois, Lorette C

    1999-01-01

    ... monoclonal antibodies to study cell differentiation during embryonic development. For a select few disciplines volumes have been published focusing on the specific application of immunocytochemical techniques to that discipline. What distinguished Immunocytochemical Methods and Protocols from earlier books when it was first published four years ago was i...

  13. Critical Response Protocol

    Science.gov (United States)

    Ellingson, Charlene; Roehrig, Gillian; Bakkum, Kris; Dubinsky, Janet M.

    2016-01-01

    This article introduces the Critical Response Protocol (CRP), an arts-based technique that engages students in equitable critical discourse and aligns with the "Next Generation Science Standards" vision for providing students opportunities for language learning while advancing science learning (NGSS Lead States 2013). CRP helps teachers…

  14. Linear Logical Voting Protocols

    DEFF Research Database (Denmark)

    DeYoung, Henry; Schürmann, Carsten

    2012-01-01

    Current approaches to electronic implementations of voting protocols involve translating legal text to source code of an imperative programming language. Because the gap between legal text and source code is very large, it is difficult to trust that the program meets its legal specification. In r...

  15. Principles of Protocol Design

    DEFF Research Database (Denmark)

    Sharp, Robin

    This is a new and updated edition of a book first published in 1994. The book introduces the reader to the principles used in the construction of a large range of modern data communication protocols, as used in distributed computer systems of all kinds. The approach taken is rather a formal one...

  16. RTE - Reliability report 2016

    International Nuclear Information System (INIS)

    2017-06-01

    Every year, RTE produces a reliability report for the past year. This document lays out the main factors that affected the electrical power system's operational reliability in 2016 and the initiatives currently under way intended to ensure its reliability in the future. Within a context of the energy transition, changes to the European interconnected network mean that RTE has to adapt on an on-going basis. These changes include the increase in the share of renewables injecting an intermittent power supply into networks, resulting in a need for flexibility, and a diversification in the numbers of stakeholders operating in the energy sector and changes in the ways in which they behave. These changes are dramatically changing the structure of the power system of tomorrow and the way in which it will operate - particularly the way in which voltage and frequency are controlled, as well as the distribution of flows, the power system's stability, the level of reserves needed to ensure supply-demand balance, network studies, assets' operating and control rules, the tools used and the expertise of operators. The results obtained in 2016 are evidence of a globally satisfactory level of reliability for RTE's operations in somewhat demanding circumstances: more complex supply-demand balance management, cross-border schedules at interconnections indicating operation that is closer to its limits and - most noteworthy - having to manage a cold spell just as several nuclear power plants had been shut down. In a drive to keep pace with the changes expected to occur in these circumstances, RTE implemented numerous initiatives to ensure high levels of reliability: - maintaining investment levels of euro 1.5 billion per year; - increasing cross-zonal capacity at borders with our neighbouring countries, thus bolstering the security of our electricity supply; - implementing new mechanisms (demand response, capacity mechanism, interruptibility, etc.); - involvement in tests or projects

  17. 2014 Building America House Simulation Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engebrecht-Metzger, C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Horowitz, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hendron, R. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-03-01

    As BA has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.

  18. 2014 Building America House Simulation Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, E. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Engebrecht, C. Metzger [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Horowitz, S. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hendron, R. [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2014-03-01

    As Building America has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.

  19. Model Additional Protocol

    International Nuclear Information System (INIS)

    Rockwood, Laura

    2001-01-01

    Since the end of the cold war a series of events has changed the circumstances and requirements of the safeguards system. The discovery of a clandestine nuclear weapons program in Iraq, the continuing difficulty in verifying the initial report of Democratic People's Republic of Korea upon entry into force of their safeguards agreement, and the decision of the South African Government to give up its nuclear weapons program and join the Treaty on the Non-Proliferation of Nuclear Weapons have all played a role in an ambitious effort by IAEA Member States and the Secretariat to strengthen the safeguards system. A major milestone in this effort was reached in May 1997 when the IAEA Board of Governors approved a Model Protocol Additional to Safeguards Agreements. The Model Additional Protocol was negotiated over a period of less than a year by an open-ended committee of the Board involving some 70 Member States and two regional inspectorates. The IAEA is now in the process of negotiating additional protocols, State by State, and implementing them. These additional protocols will provide the IAEA with rights of access to information about all activities related to the use of nuclear material in States with comprehensive safeguards agreements and greatly expanded physical access for IAEA inspectors to confirm or verify this information. In conjunction with this, the IAEA is working on the integration of these measures with those provided for in comprehensive safeguards agreements, with a view to maximizing the effectiveness and efficiency, within available resources, the implementation of safeguards. Details concerning the Model Additional Protocol are given. (author)

  20. Waste package reliability analysis

    International Nuclear Information System (INIS)

    Pescatore, C.; Sastre, C.

    1983-01-01

    Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table

  1. Accelerator reliability workshop

    Energy Technology Data Exchange (ETDEWEB)

    Hardy, L; Duru, Ph; Koch, J M; Revol, J L; Van Vaerenbergh, P; Volpe, A M; Clugnet, K; Dely, A; Goodhew, D

    2002-07-01

    About 80 experts attended this workshop, which brought together all accelerator communities: accelerator driven systems, X-ray sources, medical and industrial accelerators, spallation sources projects (American and European), nuclear physics, etc. With newly proposed accelerator applications such as nuclear waste transmutation, replacement of nuclear power plants and others, reliability has now become a number one priority for accelerator designers. Every part of an accelerator facility, from cryogenic systems to data storage via RF systems, is concerned by reliability. This aspect is now taken into account in the design/budget phase, especially for projects whose goal is to reach no more than 10 interruptions per year. This document gathers the slides but not the proceedings of the workshop.

  2. Human Reliability Program Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  3. Reliability and construction control

    Directory of Open Access Journals (Sweden)

    Sherif S. AbdelSalam

    2016-06-01

    Full Text Available The goal of this study was to determine the most reliable and efficient combination of design and construction methods required for vibro piles. For a wide range of static and dynamic formulas, the reliability-based resistance factors were calculated using the EGYPT database, which houses load test results for 318 piles. The analysis was extended to introduce a construction control factor that determines the variation between the pile nominal capacities calculated using static versus dynamic formulas. Among the major outcomes, the lowest coefficient of variation is associated with Davisson's criterion, and the resistance factors calculated for the AASHTO method are relatively high compared with other methods. Additionally, the CPT-Nottingham and Schmertmann method provided the most economical design. Recommendations related to a pile construction control factor were also presented, and it was found that utilizing the factor can significantly reduce variations between calculated and actual capacities.

  4. Scyllac equipment reliability analysis

    International Nuclear Information System (INIS)

    Gutscher, W.D.; Johnson, K.J.

    1975-01-01

    Most of the failures in Scyllac can be related to crowbar trigger cable faults. A new cable has been designed, procured, and is currently undergoing evaluation. When the new cable has been proven, it will be worked into the system as quickly as possible without causing too much additional down time. The cable-tip problem may not be easy or even desirable to solve. A tightly fastened permanent connection that maximizes contact area would be more reliable than the plug-in type of connection in use now, but it would make system changes and repairs much more difficult. The remaining failures have such a low occurrence rate that they do not cause much down time, and no major effort is underway to eliminate them. Even though Scyllac was built as an experimental system and has many thousands of components, its reliability is very good. Because of this the experiment has been able to progress at a reasonable pace

  5. Improving Power Converter Reliability

    DEFF Research Database (Denmark)

    Ghimire, Pramod; de Vega, Angel Ruiz; Beczkowski, Szymon

    2014-01-01

    The real-time junction temperature monitoring of a high-power insulated-gate bipolar transistor (IGBT) module is important to increase the overall reliability of power converters for industrial applications. This article proposes a new method to measure the on-state collector-emitter voltage of a high-power IGBT module during converter operation, which may play a vital role in improving the reliability of the power converters. The measured voltage is used to estimate the module average junction temperature of the high- and low-voltage sides of a half-bridge IGBT separately in every fundamental period; the voltage is measured in a wind power converter at a low fundamental frequency. The test method as well as the performance of the measurement circuit are also presented. This measurement is also useful to indicate failure mechanisms such as bond wire lift-off and solder layer degradation.

  6. Accelerator reliability workshop

    International Nuclear Information System (INIS)

    Hardy, L.; Duru, Ph.; Koch, J.M.; Revol, J.L.; Van Vaerenbergh, P.; Volpe, A.M.; Clugnet, K.; Dely, A.; Goodhew, D.

    2002-01-01

    About 80 experts attended this workshop, which brought together all accelerator communities: accelerator driven systems, X-ray sources, medical and industrial accelerators, spallation sources projects (American and European), nuclear physics, etc. With newly proposed accelerator applications such as nuclear waste transmutation and the replacement of nuclear power plants, reliability has now become a number one priority for accelerator designers. Every part of an accelerator facility, from cryogenic systems to RF systems to data storage, is concerned by reliability. This aspect is now taken into account in the design/budget phase, especially for projects whose goal is to reach no more than 10 interruptions per year. This document gathers the slides but not the proceedings of the workshop

  7. Safety and reliability assessment

    International Nuclear Information System (INIS)

    1979-01-01

    This report contains the papers delivered at the course on safety and reliability assessment held at the CSIR Conference Centre, Scientia, Pretoria. The following topics were discussed: safety standards; licensing; biological effects of radiation; what is a PWR; safety principles in the design of a nuclear reactor; radio-release analysis; quality assurance; the staffing, organisation and training for a nuclear power plant project; event trees, fault trees and probability; Automatic Protective Systems; sources of failure-rate data; interpretation of failure data; synthesis and reliability; quantification of human error in man-machine systems; dispersion of noxious substances through the atmosphere; criticality aspects of enrichment and recovery plants; and risk and hazard analysis. Extensive examples are given as well as case studies

  8. Reliability of Circumplex Axes

    Directory of Open Access Journals (Sweden)

    Micha Strack

    2013-06-01

    Full Text Available We present a confirmatory factor analysis (CFA) procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing interpersonal and motivational constructs—Interpersonal Adjective List (IAL), Interpersonal Adjective Scales (revised; IAS-R), Inventory of Interpersonal Problems (IIP), Impact Messages Inventory (IMI), Circumplex Scales of Interpersonal Values (CSIV), Support Action Scale Circumplex (SAS-C), Interaction Problems With Animals (IPI-A), Team Role Circle (TRC), Competing Values Leadership Instrument (CV-LI), Love Styles, Organizational Culture Assessment Instrument (OCAI), Customer Orientation Circle (COC), and System for Multi-Level Observation of Groups (behavioral adjectives; SYMLOG)—in 17 German-speaking samples (29 subsamples, grouped by self-report, other report, and metaperception assessments). The general factor accounted for a proportion ranging from 1% to 48% of the item variance, the axes component for 2% to 30%, and scale specificity for 1% to 28%, respectively. Reliability estimates varied considerably from .13 to .92. An application of the Nunnally and Bernstein formula proposed by Markey, Markey, and Tinsley overestimated axes reliabilities in cases of large scale-specificities but otherwise works effectively. Contemporary circumplex evaluations such as Tracey's RANDALL are sensitive to the ratio of the axes and scale-specificity components. In contrast, the proposed model isolates both components.

  9. The cost of reliability

    International Nuclear Information System (INIS)

    Ilic, M.

    1998-01-01

    In this article the restructuring process under way in the US power industry is revisited from the point of view of transmission system provision and reliability. While in the past the cost of reliability was rolled into the average cost of electricity to all customers, it is not so obvious how this cost is managed in the new industry. A new MIT approach to transmission pricing is suggested here as a possible solution.

  10. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  11. Investment in new product reliability

    International Nuclear Information System (INIS)

    Murthy, D.N.P.; Rausand, M.; Virtanen, S.

    2009-01-01

    Product reliability is of great importance to both manufacturers and customers. Building reliability into a new product is costly, but the consequences of inadequate product reliability can be costlier. This implies that manufacturers need to decide on the optimal investment in new product reliability by achieving a suitable trade-off between the two costs. This paper develops a framework and proposes an approach to help manufacturers decide on the investment in new product reliability.

  12. Data transmission protocol for Pi-of-the-Sky cameras

    Science.gov (United States)

    Uzycki, J.; Kasprowicz, G.; Mankiewicz, M.; Nawrocki, K.; Sitek, P.; Sokolowski, M.; Sulej, R.; Tlaczala, W.

    2006-10-01

    The large amount of data collected by the automatic astronomical cameras has to be transferred to the fast computers in a reliable way. The method chosen should ensure data streaming in both directions but in a nonsymmetrical way. The Ethernet interface is a very good choice because of its popularity and proven performance. However, it requires a TCP/IP stack implementation in devices like cameras for full compliance with existing networks and operating systems. This paper describes the NUDP protocol, which was designed as a supplement to the standard UDP protocol and can be used as a simple network protocol. NUDP does not need a TCP protocol implementation and makes it possible to run the Ethernet network with simple devices based on microcontroller and/or FPGA chips. The data transmission idea was created especially for the "Pi of the Sky" project.
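
    The record does not specify NUDP's wire format, but the general idea of layering light-weight reliability on top of UDP without a full TCP/IP stack can be illustrated with a short, hypothetical stop-and-wait sketch in Python (the header layout, timeout and retry values are assumptions, not the actual NUDP design):

        # Hypothetical stop-and-wait reliability layer over UDP; illustrates the idea
        # behind NUDP-style protocols, not the actual NUDP specification.
        import socket
        import struct

        HEADER = struct.Struct("!I")  # 4-byte sequence number, network byte order

        def send_reliable(sock, addr, payload, seq, timeout=0.5, retries=5):
            """Send one datagram and wait for an ACK echoing the same sequence number."""
            sock.settimeout(timeout)
            packet = HEADER.pack(seq) + payload
            for _ in range(retries):
                sock.sendto(packet, addr)
                try:
                    ack, _ = sock.recvfrom(HEADER.size)
                    if HEADER.unpack(ack)[0] == seq:
                        return True
                except socket.timeout:
                    continue  # retransmit on timeout
            return False

        def receive_once(sock):
            """Receive one datagram, acknowledge it, and return (seq, payload)."""
            data, addr = sock.recvfrom(65535)
            seq = HEADER.unpack(data[:HEADER.size])[0]
            sock.sendto(HEADER.pack(seq), addr)  # the ACK is just the sequence number
            return seq, data[HEADER.size:]

    A scheme of this kind is simple enough to implement on a microcontroller or FPGA, because the receiver only has to echo a fixed-size header, which is consistent with the motivation given in the abstract.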

  13. Individual response to ionising radiation: What predictive assay(s) to choose?

    International Nuclear Information System (INIS)

    Granzotto, A.; Viau, M.; Devic, C.; Maalouf, M.; Thomas, Ch.; Vogin, G.; Foray, N.; Granzotto, A.; Vogin, G.; Balosso, J.; Joubert, A.; Maalouf, M.; Vogin, G.; Colin, C.; Malek, K.; Balosso, J.; Colin, C.

    2011-01-01

    Individual response to ionizing radiation is important information required to apply an efficient radiotherapy treatment against the tumour and to avoid adverse effects in normal tissues. In 1981, Fertil and Malaise demonstrated that the post-irradiation local tumour control determined in vivo is correlated with clonogenic cell survival assessed in vitro. Furthermore, these authors recalled the relevance of the concept of intrinsic radiosensitivity, which is specific to each individual organ (Fertil and Malaise, 1981) [1]. To date, since clonogenicity assays are too time-consuming and do not provide any other molecular information, a plethora of research groups have attempted to determine the molecular bases of intrinsic radiosensitivity in order to propose reliable and faster predictive assays. To this aim, several approaches have been developed. Notably, the recent revolution in genomic and proteomic technologies is providing a considerable amount of data, but their link with radiosensitivity still remains to be elucidated. On the other hand, the systematic screening of candidate genes potentially involved in the radiation response is highlighting the complexity of the molecular and cellular mechanisms of DNA damage sensing and signalling, and shows that an abnormal radiation response is not necessarily due to the impairment of a single protein. Finally, more modest approaches focusing on specific functions of DNA repair seem to provide more reliable clues to predict over-acute reactions caused by radiotherapy. In this review, we endeavour to analyse the contributions of these major approaches to predicting human radiosensitivity. (authors)

  14. A simple and novel modification of comet assay for determination of bacteriophage mediated bacterial cell lysis.

    Science.gov (United States)

    Khairnar, Krishna; Sanmukh, Swapnil; Chandekar, Rajshree; Paunikar, Waman

    2014-07-01

    The comet assay is a widely used method for in vitro toxicity testing and an alternative to the use of animal models for in vivo testing. Since its inception in 1984 by Ostling and Johansson, it has been modified frequently for a wide range of applications. In spite of its wide applicability, there is unfortunately no report of its application in bacteriophage research. In this study, a novel application of the comet assay for the detection of bacteriophage-mediated bacterial cell lysis is described. The conventional method in bacteriophage research for studying bacterial lysis by bacteriophages is the plaque assay, which is time consuming, laborious and costly. The lytic activity of a bacteriophage destroys the bacterial cell, releasing the bacterial genomic material, which is then detected by ethidium bromide staining in the comet assay protocol. The objective of this study was to compare the efficacy of the comet assay with other assays used to study phage-mediated bacterial lysis. The assay was performed on culture isolates (N=80); the modified comet assay appeared to have relatively higher sensitivity and specificity than the other assays. The results of the study showed that the application of the comet assay can be an economical, time-saving and less laborious alternative to the conventional plaque assay for the detection of bacteriophage-mediated bacterial cell lysis. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Binding Assays Using Recombinant SH2 Domains: Far-Western, Pull-Down, and Fluorescence Polarization.

    Science.gov (United States)

    Machida, Kazuya; Liu, Bernard

    2017-01-01

    Recognition of phosphotyrosine-containing sequences by SH2 domains confers specificity in tyrosine kinase pathways. By assessing interactions between isolated SH2 domains and their binding proteins, it is possible to gain insight into otherwise inaccessible complex cellular systems. Far-Western, pull-down, and fluorescence polarization (FP) have been frequently used for characterization of phosphotyrosine signaling. Here, we outline standard protocols for these established assays using recombinant SH2 domain, emphasizing the importance of appropriate sample preparation and assay controls.

  16. Fluorometric assay for phenotypic differentiation of drug-resistant HIV mutants

    OpenAIRE

    Zhu, Qinchang; Yu, Zhiqiang; Kabashima, Tsutomu; Yin, Sheng; Dragusha, Shpend; El-Mahdy, Ahmed F. M.; Ejupi, Valon; Shibata, Takayuki; Kai, Masaaki

    2015-01-01

    Convenient drug-resistance testing of viral mutants is indispensable to effective treatment of viral infection. We developed a novel fluorometric assay for phenotypic differentiation of drug-resistant mutants of human immunodeficiency virus-I protease (HIV-PR) which uses enzymatic and peptide-specific fluorescence (FL) reactions and high-performance liquid chromatography (HPLC) of three HIV-PR substrates. This assay protocol enables use of non-purified enzyme sources and multiple substrates f...

  17. Solution assay instrument operations manual

    International Nuclear Information System (INIS)

    Li, T.K.; Marks, T.; Parker, J.L.

    1983-09-01

    An at-line solution assay instrument (SAI) has been developed and installed in a plutonium purification and americium recovery process area in the Los Alamos Plutonium Processing Facility. The instrument was designed for accurate, timely, and simultaneous nondestructive analysis of plutonium and americium in process solutions that have a wide range of concentrations and americium/plutonium ratios and for routine operation by process technicians who lack instrumentation background. The SAI, based on transmission-corrected, high-resolution gamma-ray spectroscopy, has two measurement stations attached to a single multichannel analyzer/computer system. To ensure the quality of assay results, the SAI has an internal measurement control program, which requires daily and weekly check runs and monitors key aspects of all assay runs. For a 25-ml sample, the assay precision is 5 g/l within a 2000-s count time

  18. Satellite Communications Using Commercial Protocols

    Science.gov (United States)

    Ivancic, William D.; Griner, James H.; Dimond, Robert; Frantz, Brian D.; Kachmar, Brian; Shell, Dan

    2000-01-01

    NASA Glenn Research Center has been working with industry, academia, and other government agencies in assessing commercial communications protocols for satellite and space-based applications. In addition, NASA Glenn has been developing and advocating new satellite-friendly modifications to existing communications protocol standards. This paper summarizes recent research into the applicability of various commercial standard protocols for use over satellite and space-based communications networks as well as expectations for future protocol development. It serves as a reference point from which the detailed work can be readily accessed. Areas that will be addressed include asynchronous-transfer-mode quality of service; completed and ongoing work of the Internet Engineering Task Force; data-link-layer protocol development for unidirectional link routing; and protocols for aeronautical applications, including mobile Internet protocol routing for wireless/mobile hosts and the aeronautical telecommunications network protocol.

  19. Radioligand assay in reproductive biology

    International Nuclear Information System (INIS)

    Korenman, S.G.; Sherman, B.M.

    1975-01-01

    Radioligand assays have been developed for the principal reproductive steroids and peptide hormones. Specific binding reagents have included antibodies, plasma binders, and intracellular receptors. In each assay, problems of specificity, sensitivity, and nonspecific inhibitors were encountered. Many features of the endocrine physiology in childhood, during puberty, and in adulthood have been characterized. Hormonal evaluations of endocrine disorders of reproduction are characterized on the basis of their characteristic pathophysiologic alterations. (U.S.)

  20. Epistemic Protocols for Distributed Gossiping

    Directory of Open Access Journals (Sweden)

    Krzysztof R. Apt

    2016-06-01

    Full Text Available Gossip protocols aim at arriving, by means of point-to-point or group communications, at a situation in which all the agents know each other's secrets. We consider distributed gossip protocols which are expressed by means of epistemic logic. We provide an operational semantics of such protocols and set up an appropriate framework to argue about their correctness. Then we analyze specific protocols for complete graphs and for directed rings.
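
    As a concrete illustration of the behaviour such protocols describe, the following small Python simulation (an illustrative sketch, not one of the epistemic protocols analysed in the paper) runs random point-to-point calls on a complete graph until every agent knows every secret:

        # Illustrative gossip simulation: each call merges the two agents' sets of
        # known secrets; the loop ends when all agents know all secrets.
        import random

        def gossip(n, seed=0):
            rng = random.Random(seed)
            knows = [{i} for i in range(n)]      # agent i initially knows only its own secret
            calls = 0
            while any(len(k) < n for k in knows):
                a, b = rng.sample(range(n), 2)   # a point-to-point call between two agents
                merged = knows[a] | knows[b]
                knows[a], knows[b] = merged, set(merged)
                calls += 1
            return calls

        print(gossip(6))  # number of random calls needed for six agents

    Epistemic protocols replace the random choice of callers with conditions on what the agents know, which is exactly what the paper's operational semantics and correctness framework are designed to reason about.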

  1. Symmetric cryptographic protocols

    CERN Document Server

    Ramkumar, Mahalingam

    2014-01-01

    This book focuses on protocols and constructions that make good use of symmetric pseudo random functions (PRF) like block ciphers and hash functions - the building blocks for symmetric cryptography. Readers will benefit from detailed discussion of several strategies for utilizing symmetric PRFs. Coverage includes various key distribution strategies for unicast, broadcast and multicast security, and strategies for constructing efficient digests of dynamic databases using binary hash trees.
    • Provides detailed coverage of symmetric key protocols
    • Describes various applications of symmetric building blocks
    • Includes strategies for constructing compact and efficient digests of dynamic databases
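
    The "binary hash tree" digest mentioned above can be sketched in a few lines of Python (an illustrative Merkle-style construction using SHA-256; it is not taken from the book and omits membership proofs and incremental updates):

        # Minimal binary hash (Merkle) tree root over a list of database records.
        import hashlib

        def h(data: bytes) -> bytes:
            return hashlib.sha256(data).digest()

        def merkle_root(records):
            level = [h(r) for r in records]
            if not level:
                return h(b"")
            while len(level) > 1:
                if len(level) % 2:                   # duplicate the last node on odd levels
                    level.append(level[-1])
                level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
            return level[0]                          # a compact digest of the whole database

        print(merkle_root([b"row-1", b"row-2", b"row-3"]).hex())

    Updating one record only changes the hashes along a single root-to-leaf path, which is why such trees give compact and efficient digests of dynamic databases.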

  2. Analytical validation of a flow cytometric protocol for quantification of platelet microparticles in dogs.

    Science.gov (United States)

    Cremer, Signe E; Krogh, Anne K H; Hedström, Matilda E K; Christiansen, Liselotte B; Tarnow, Inge; Kristensen, Annemarie T

    2018-06-01

    Platelet microparticles (PMPs) are subcellular procoagulant vesicles released upon platelet activation. In people with clinical diseases, alterations in PMP concentrations have been extensively investigated, but few canine studies exist. This study aims to validate a canine flow cytometric protocol for PMP quantification and to assess the influence of calcium on PMP concentrations. Microparticles (MP) were quantified in citrated whole blood (WB) and platelet-poor plasma (PPP) using flow cytometry. Anti-CD61 antibody and Annexin V (AnV) were used to detect platelets and phosphatidylserine, respectively. In 13 healthy dogs, CD61+/AnV- concentrations were analyzed with/without a calcium buffer. CD61+/AnV-, CD61+/AnV+, and CD61-/AnV+ MP quantification were validated in 10 healthy dogs. The coefficient of variation (CV) for duplicate (intra-assay) and parallel (inter-assay) analyses and detection limits (DLs) were calculated. CD61+/AnV- concentrations were higher in calcium buffer: 841,800 MP/μL (526,000-1,666,200) vs without: 474,200 MP/μL (278,800-997,500), P < .05. In WB, PMP were above DLs and demonstrated acceptable (<20%) intra-assay and inter-assay CVs in 9/10 dogs: 1.7% (0.5-8.9) and 9.0% (0.9-11.9), respectively, for CD61+/AnV- and 2.4% (0.2-8.7) and 7.8% (0.0-12.8), respectively, for CD61+/AnV+. Acceptable CVs were not seen for the CD61-/AnV+ MP. In PPP, quantifications were challenged by high inter-assay CVs and overlapping DLs, and hemolysis and lipemia interfered with quantification in 5/10 dogs. Calcium induced higher in vitro PMP concentrations, likely due to platelet activation. PMP concentrations were reliably quantified in WB, indicating the potential for clinical applications. PPP analyses were unreliable due to high inter-assay CV and DL overlap, and not obtainable due to hemolysis and lipemia interference. © 2018 American Society for Veterinary Clinical Pathology.
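
    For reference, the intra- and inter-assay precision figures quoted above are coefficients of variation, which can be computed from duplicate or parallel measurements as in this small sketch (the example values are made up, not taken from the study):

        # Coefficient of variation (%) of repeated measurements of one sample.
        import statistics

        def cv_percent(values):
            return 100.0 * statistics.stdev(values) / statistics.mean(values)

        duplicate_pmp_counts = [841800.0, 861500.0]  # hypothetical duplicate PMP concentrations, MP/uL
        print(round(cv_percent(duplicate_pmp_counts), 1))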

  3. Diplomacy and Diplomatic Protocol

    Directory of Open Access Journals (Sweden)

    Lect. Ph.D Oana Iucu

    2008-12-01

    Full Text Available The present study aims to observe relationships and determining factors between diplomacy and diplomatic protocol as outlined by historical and contextual analyses. The approach is very dynamic, provided that concepts are able to show their richness, antiquity and polyvalence at the level of connotations, semantics, grammatical and social syntax. The fact that this information is up to date determines an attitude of appreciation and a state of positive contamination.

  4. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  5. Dysphonia risk screening protocol

    Directory of Open Access Journals (Sweden)

    Katia Nemr

    2016-03-01

    Full Text Available OBJECTIVE: To propose and test the applicability of a dysphonia risk screening protocol with score calculation in individuals with and without dysphonia. METHOD: This descriptive cross-sectional study included 365 individuals (41 children, 142 adult women, 91 adult men and 91 seniors) divided into a dysphonic group and a non-dysphonic group. The protocol consisted of 18 questions and a score was calculated using a 10-cm visual analog scale. The measured value on the visual analog scale was added to the overall score, along with other partial scores. Speech samples allowed for analysis/assessment of the overall degree of vocal deviation and initial definition of the respective groups and after six months, the separation of the groups was confirmed using an acoustic analysis. RESULTS: The mean total scores were different between the groups in all samples. Values ranged between 37.0 and 57.85 in the dysphonic group and between 12.95 and 19.28 in the non-dysphonic group, with overall means of 46.09 and 15.55, respectively. High sensitivity and specificity were demonstrated when discriminating between the groups with the following cut-off points: 22.50 (children), 29.25 (adult women), 22.75 (adult men), and 27.10 (seniors). CONCLUSION: The protocol demonstrated high sensitivity and specificity in differentiating groups of individuals with and without dysphonia in different sample groups and is thus an effective instrument for use in voice clinics.

  6. Dysphonia risk screening protocol

    Science.gov (United States)

    Nemr, Katia; Simões-Zenari, Marcia; da Trindade Duarte, João Marcos; Lobrigate, Karen Elena; Bagatini, Flavia Alves

    2016-01-01

    OBJECTIVE: To propose and test the applicability of a dysphonia risk screening protocol with score calculation in individuals with and without dysphonia. METHOD: This descriptive cross-sectional study included 365 individuals (41 children, 142 adult women, 91 adult men and 91 seniors) divided into a dysphonic group and a non-dysphonic group. The protocol consisted of 18 questions and a score was calculated using a 10-cm visual analog scale. The measured value on the visual analog scale was added to the overall score, along with other partial scores. Speech samples allowed for analysis/assessment of the overall degree of vocal deviation and initial definition of the respective groups and after six months, the separation of the groups was confirmed using an acoustic analysis. RESULTS: The mean total scores were different between the groups in all samples. Values ranged between 37.0 and 57.85 in the dysphonic group and between 12.95 and 19.28 in the non-dysphonic group, with overall means of 46.09 and 15.55, respectively. High sensitivity and specificity were demonstrated when discriminating between the groups with the following cut-off points: 22.50 (children), 29.25 (adult women), 22.75 (adult men), and 27.10 (seniors). CONCLUSION: The protocol demonstrated high sensitivity and specificity in differentiating groups of individuals with and without dysphonia in different sample groups and is thus an effective instrument for use in voice clinics. PMID:27074171

  7. Analytical performances of the Diazyme ADA assay on the Cobas® 6000 system.

    Science.gov (United States)

    Delacour, Hervé; Sauvanet, Christophe; Ceppa, Franck; Burnat, Pascal

    2010-12-01

    To evaluate the analytical performance of the Diazyme ADA assay on the Cobas® 6000 system for pleural fluid sample analysis. Imprecision, linearity, calibration curve stability, interference, and correlation studies were completed. The Diazyme ADA assay demonstrated excellent precision; it correlated well with the Giusti method (r(2)=0.93) but exhibited a negative bias (~ -30%). The Diazyme ADA assay on the Cobas® 6000 system represents a rapid, accurate, precise and reliable method for determination of ADA activity in pleural fluid samples. Copyright © 2010 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  8. Nuclear performance and reliability

    International Nuclear Information System (INIS)

    Rothwell, G.

    1993-01-01

    There has been a significant improvement in nuclear power plant performance, due largely to a decline in the forced outage rate and a dramatic drop in the average number of forced outages per fuel cycle. If fewer forced outages are a sign of improved safety, nuclear power plants have become safer and more productive over time. To encourage further increases in performance, regulatory incentive schemes should reward reactor operators for improved reliability and safety, as well as for improved performance

  9. [How Reliable is Neuronavigation?].

    Science.gov (United States)

    Stieglitz, Lennart Henning

    2016-02-17

    Neuronavigation plays a central role in modern neurosurgery. It allows instruments and three-dimensional image data to be visualized intraoperatively and supports spatial orientation, thereby helping to reduce surgical risks and speed up complex surgical procedures. The growing availability and importance of neuronavigation makes it clear how relevant it is to know about its reliability and accuracy. Various factors may influence accuracy during surgery without being noticed, misleading the surgeon. Besides the best possible optimization of the systems themselves, a good knowledge of their weaknesses is mandatory for every neurosurgeon.

  10. The value of reliability

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Karlström, Anders

    2010-01-01

    We derive the value of reliability in the scheduling of an activity of random duration, such as travel under congested conditions. Using a simple formulation of scheduling utility, we show that the maximal expected utility is linear in the mean and standard deviation of trip duration, regardless of the form of the standardised distribution of trip durations. This insight provides a unification of the scheduling model and models that include the standard deviation of trip duration directly as an argument in the cost or utility function. The results generalise approximately to the case where the mean...
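
    In the notation the abstract suggests, the headline result can be written schematically as follows (an illustrative LaTeX restatement; the coefficients and their derivation are given in the paper and are not reproduced here):

        \max_{d}\; E\big[U(d, T)\big] \;=\; \alpha \;+\; \beta\,\mu_T \;+\; \gamma\,\sigma_T ,

    where \mu_T and \sigma_T are the mean and standard deviation of the random trip duration T, d is the chosen departure time, and \alpha, \beta, \gamma are constants determined by the scheduling preferences (and the standardised distribution), not by \mu_T or \sigma_T themselves.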

  11. Multivariate performance reliability prediction in real-time

    International Nuclear Information System (INIS)

    Lu, S.; Lu, H.; Kolarik, W.J.

    2001-01-01

    This paper presents a technique for predicting system performance reliability in real-time considering multiple failure modes. The technique includes on-line multivariate monitoring and forecasting of selected performance measures and conditional performance reliability estimates. The performance measures across time are treated as a multivariate time series. A state-space approach is used to model the multivariate time series. Recursive forecasting is performed by adopting Kalman filtering. The predicted mean vectors and covariance matrix of performance measures are used for the assessment of system survival/reliability with respect to the conditional performance reliability. The technique and modeling protocol discussed in this paper provide a means to forecast and evaluate the performance of an individual system in a dynamic environment in real-time. The paper also presents an example to demonstrate the technique
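
    A minimal, purely illustrative version of the recursive forecasting step (a scalar Kalman filter with an assumed random-walk state model; the paper treats the multivariate, state-space case) might look like this in Python:

        # Scalar Kalman filter sketch: recursively forecast a performance measure.
        # q: process noise variance, r: measurement noise variance (assumed values).
        def kalman_forecast(observations, q=0.01, r=0.1, x0=0.0, p0=1.0):
            x, p = x0, p0
            one_step_forecasts = []
            for z in observations:
                x_pred, p_pred = x, p + q          # predict (random-walk transition)
                one_step_forecasts.append(x_pred)
                k = p_pred / (p_pred + r)          # Kalman gain
                x = x_pred + k * (z - x_pred)      # update with the new measurement
                p = (1.0 - k) * p_pred
            return one_step_forecasts, x, p

        forecasts, x_last, p_last = kalman_forecast([1.00, 1.15, 0.92, 1.30, 1.24])
        print(forecasts, x_last, p_last)

    The predicted mean and variance from such a recursion are what feed the conditional reliability assessment described in the abstract; extending the sketch to the multivariate case replaces the scalars with mean vectors and covariance matrices.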

  12. Materials and Reliability Handbook for Semiconductor Optical and Electron Devices

    CERN Document Server

    Pearton, Stephen

    2013-01-01

    Materials and Reliability Handbook for Semiconductor Optical and Electron Devices provides comprehensive coverage of reliability procedures and approaches for electron and photonic devices. These include lasers and high speed electronics used in cell phones, satellites, data transmission systems and displays. Lifetime predictions for compound semiconductor devices are notoriously inaccurate due to the absence of standard protocols. Manufacturers have relied on extrapolation back to room temperature of accelerated testing at elevated temperature. This technique fails for scaled, high current density devices. Device failure is driven by electric field or current mechanisms or low activation energy processes that are masked by other mechanisms at high temperature. The Handbook addresses reliability engineering for III-V devices, including materials and electrical characterization, reliability testing, and electronic characterization. These are used to develop new simulation technologies for device operation and ...
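
    The extrapolation practice criticized above typically relies on an Arrhenius acceleration factor; the short sketch below (illustrative numbers, not values from the handbook) shows why a low-activation-energy mechanism can dominate at use conditions while being masked during a high-temperature stress test:

        # Arrhenius acceleration factor between use and stress temperatures.
        import math

        K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

        def acceleration_factor(ea_ev, t_use_c, t_stress_c):
            t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
            return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

        print(acceleration_factor(0.7, 55, 150))  # high-Ea mechanism: strongly accelerated
        print(acceleration_factor(0.2, 55, 150))  # low-Ea mechanism: barely accelerated

    Because the low-Ea mechanism is barely accelerated, a stress test sized for the high-Ea mechanism can miss it entirely, which is one reason room-temperature extrapolation fails for scaled, high-current-density devices.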

  13. Application of statistical process control to qualitative molecular diagnostic assays

    LENUS (Irish Health Repository)

    O'Brien, Cathal P.

    2014-11-01

    Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and where present is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.
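
    The general idea of a frequency estimate plus confidence interval can be sketched as follows (a hypothetical normal-approximation binomial check; the paper's exact procedure, thresholds and sample sizes are not reproduced here):

        # Flag a run of qualitative results when the observed mutation frequency falls
        # outside a confidence interval around the expected frequency.
        import math

        def frequency_out_of_control(positives, n, expected, z=1.96):
            observed = positives / n
            se = math.sqrt(expected * (1.0 - expected) / n)    # normal approximation
            low, high = expected - z * se, expected + z * se
            return observed, (low, high), not (low <= observed <= high)

        # Hypothetical example: 4 mutation-positive results in 60 assays, 15% expected.
        print(frequency_out_of_control(positives=4, n=60, expected=0.15))

    Consistent with the abstract, the interval narrows only slowly with n, so low expected frequencies or small deviations require large sample numbers before a shift is flagged.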

  14. Application of statistical process control to qualitative molecular diagnostic assays.

    Science.gov (United States)

    O'Brien, Cathal P; Finn, Stephen P

    2014-01-01

    Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and where present is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.

  15. Field Programmable Gate Array Reliability Analysis Guidelines for Launch Vehicle Reliability Block Diagrams

    Science.gov (United States)

    Al Hassan, Mohammad; Britton, Paul; Hatfield, Glen Spencer; Novack, Steven D.

    2017-01-01

    Field-programmable gate array (FPGA) integrated circuits (ICs) are among the key electronic components in today's sophisticated launch and space vehicle complex avionic systems, largely due to their superb reprogrammable and reconfigurable capabilities combined with relatively low non-recurring engineering (NRE) costs and a short design cycle. Consequently, FPGAs are prevalent ICs in communication protocols and control signal commands. This paper will identify reliability concerns and high-level guidelines to estimate FPGA total failure rates in a launch vehicle application. The paper will discuss hardware, hardware description language, and radiation-induced failures. The hardware contribution of the approach accounts for physical failures of the IC. The hardware description language portion will discuss the high-level FPGA programming languages and software/code reliability growth. The radiation portion will discuss FPGA susceptibility to space environment radiation.

  16. Recommendations for safety testing with the in vivo comet assay.

    Science.gov (United States)

    Vasquez, Marie Z

    2012-08-30

    While the in vivo comet assay increases its role in regulatory safety testing, deliberations about the interpretation of comet data continue. Concerns can arise regarding comet assay publications with limited data from non-blind testing of positive control compounds and using protocols (e.g. dose concentrations, sample times, and tissues) known to give an expected effect. There may be a tendency towards bias when the validation or interpretation of comet assay data is based on results generated by widely accepted but non-validated assays. The greatest advantages of the comet assay are its sensitivity and its ability to detect genotoxicity in tissues and at sample times that could not previously be evaluated. Guidelines for its use and interpretation in safety testing should take these factors into account. Guidelines should be derived from objective review of data generated by blind testing of unknown compounds dosed at non-toxic concentrations and evaluated in a true safety-testing environment, where the experimental design and conclusions must be defensible. However, positive in vivo comet findings with such compounds are rarely submitted to regulatory agencies and this data is typically unavailable for publication due to its proprietary nature. To enhance the development of guidelines for safety testing with the comet assay, and with the permission of several sponsors, this paper presents and discusses relevant data from multiple GLP comet studies conducted blind, with unknown pharmaceuticals and consumer products. Based on these data and the lessons we have learned through the course of conducting these studies, I suggest significant adjustments to the current conventions, and I provide recommendations for interpreting in vivo comet assay results in situations where risk must be evaluated in the absence of carcinogenicity or clinical data. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. A high performance totally ordered multicast protocol

    Science.gov (United States)

    Montgomery, Todd; Whetten, Brian; Kaplan, Simon

    1995-01-01

    This paper presents the Reliable Multicast Protocol (RMP). RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service such as IP Multicasting. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communication load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority resilient, and totally resilient atomic delivery. These QoS guarantees are selectable on a per packet basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, an implicit naming service, mutually exclusive handlers for messages, and mutually exclusive locks. It has commonly been held that a large performance penalty must be paid in order to implement total ordering -- RMP discounts this. On SparcStation 10's on a 1250 KB/sec Ethernet, RMP provides totally ordered packet delivery to one destination at 842 KB/sec throughput and with 3.1 ms packet latency. The performance stays roughly constant independent of the number of destinations. For two or more destinations on a LAN, RMP provides higher throughput than any protocol that does not use multicast or broadcast.
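
    To make the total-ordering guarantee concrete, the toy sketch below shows the simplest possible scheme: a fixed sequencer that stamps every message with a global sequence number and receivers that deliver strictly in stamp order. RMP itself achieves total ordering with a distributed, token-based design rather than a fixed sequencer, so this is only an illustration of the delivery property, not of RMP's mechanism:

        # Toy fixed-sequencer illustration of totally ordered delivery.
        import itertools

        class Sequencer:
            def __init__(self):
                self._counter = itertools.count()
            def stamp(self, msg):
                return (next(self._counter), msg)       # global order for every message

        class Receiver:
            def __init__(self):
                self._expected = 0
                self._pending = {}
            def receive(self, stamped):
                seq, msg = stamped
                self._pending[seq] = msg
                delivered = []
                while self._expected in self._pending:  # deliver only in sequence order
                    delivered.append(self._pending.pop(self._expected))
                    self._expected += 1
                return delivered

        seq = Sequencer()
        r = Receiver()
        m0, m1 = seq.stamp("a"), seq.stamp("b")
        print(r.receive(m1), r.receive(m0))  # "b" is held back until "a" arrives

    Every receiver applying the same rule delivers the messages in the same order, which is the property the protocol's stronger QoS levels guarantee.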

  18. Validation and modification of dried blood spot-based glycosylated hemoglobin assay for the longitudinal aging study in India.

    Science.gov (United States)

    Hu, Peifeng; Edenfield, Michael; Potter, Alan; Kale, Varsha; Risbud, Arun; Williams, Sharon; Lee, Jinkook; Bloom, David E; Crimmins, Eileen; Seeman, Teresa

    2015-01-01

    This study aims to validate a modified dried blood spot (DBS)-based glycosylated hemoglobin (HbA1c) assay protocol, after a pretest in India showed poor correlation between the original DBS-based protocol and venous results. The original protocol was tested on different chemistry analyzers and then simplified at the University of Washington (UW). A second pretest was conducted in India to validate the modified assay protocol, using 44 quality control specimens. Data from UW indicated that, using the original protocol, the correlation coefficients between DBS and venous results were above 0.98 on both Bio-Rad and Olympus chemistry analyzers. The protocol worked equally well on filter paper, with or without pre-treatment, and when the recommended amount of blood spot material, or less, was used. A second pretest of the modified protocol confirmed that DBS-based levels from both Olympus and Roche chemistry analyzers were well correlated with DBS results from UW (correlation coefficients were above 0.96), as well as with venous values (correlation coefficients were above 0.94). The DBS-based HbA1c values are highly correlated with venous results. The pre-treatment of filter paper does not appear to be necessary. The poor results from the first pretest are probably due to factors unrelated to the protocol, such as problems with the chemistry analyzer or assay reagents. © 2015 Wiley Periodicals, Inc.

  19. Correlation between the genotoxicity endpoints measured by two different genotoxicity assays: comet assay and CBMN assay

    Directory of Open Access Journals (Sweden)

    Carina Ladeira

    2015-06-01

    The results concerning positive findings by micronuclei and non-significant ones by the comet assay are corroborated by the Deng et al. (2005) study performed in workers occupationally exposed to methotrexate, also a cytostatic drug. According to Cavallo et al. (2009), the comet assay seems to be more suitable for the prompt evaluation of the genotoxic effects of, for instance, polycyclic aromatic hydrocarbon mixtures containing volatile substances, whereas the micronucleus test seems more appropriate to evaluate the effects of exposure to antineoplastic agents. However, there are studies that observed an increase in both the comet assay and the micronucleus test in nurses handling antineoplastic drugs, although statistical significance was only seen in the comet assay, quite the opposite of our results (Maluf & Erdtmann, 2000; Laffon et al. 2005).

  20. Study of QoS control and reliable routing method for utility communication network. Application of differentiated service to the network and alternative route establishment by the IP routing protocol; Denryokuyo IP network no QoS seigyo to shinraisei kakuho no hoho. DiffServ ni yoru QoS seigyo no koka to IP ni yoru fuku root ka no kento

    Energy Technology Data Exchange (ETDEWEB)

    Oba, E.

    2000-05-01

    A QoS control method that satisfies utility communication network requirements and an alternative route establishment method for sustaining communication during a failure are studied. The applicability of DiffServ (Differentiated Services), one of the most promising QoS control methods on IP networks and one being studied energetically in an IETF WG, is examined, and it is found that most applications used in the utility communication network, except for relaying system information, could be accommodated by the DiffServ network. An example of the mapping of the utility communication applications to the DiffServ PHB (Per Hop Behavior) is shown in this paper. Regarding the alternative route, the usual IP routing protocols cannot establish an alternative route that shares no common links or nodes with the primary path to a destination. IP address duplication, with some modification of the routing protocol, enables such alternative route establishment. MPLS, the distance vector algorithm and the link state algorithm are evaluated qualitatively, and as a result, MPLS is found to be a promising way to establish the route. Quantitative evaluation will be future work. (author)
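
    The per-application PHB mapping discussed above amounts, in practice, to marking each application's packets with a DSCP value; the snippet below is a purely hypothetical illustration of such a mapping using the IP_TOS socket option (the class assignments are invented for the example and are not the mapping proposed in the paper):

        # Hypothetical mapping of utility-network application classes to DSCP values,
        # applied by setting the IP_TOS byte (the DSCP occupies its upper six bits).
        import socket

        DSCP = {
            "EF":   46,   # Expedited Forwarding, e.g. latency-critical control traffic
            "AF31": 26,   # Assured Forwarding, e.g. operational telemetry
            "BE":    0,   # Best Effort, e.g. bulk file transfer
        }

        def open_marked_udp_socket(phb):
            s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP[phb] << 2)
            return s

        sock = open_marked_udp_socket("AF31")
        sock.sendto(b"telemetry sample", ("192.0.2.10", 5000))  # 192.0.2.10: documentation address

    Routers configured for DiffServ then queue and drop packets according to the PHB encoded in this field, which is how the differentiated service levels considered in the study would be enforced end to end.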

  1. Interactive reliability assessment using an integrated reliability data bank

    International Nuclear Information System (INIS)

    Allan, R.N.; Whitehead, A.M.

    1986-01-01

    The logical structure, techniques and practical application of a computer-aided technique based on a microcomputer using floppy disc Random Access Files is described. This interactive computational technique is efficient if the reliability prediction program is coupled directly to a relevant source of data to create an integrated reliability assessment/reliability data bank system. (DG)

  2. Load Control System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, Daniel [Montana Tech of the Univ. of Montana, Butte, MT (United States)]

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech April 2006. Follow-on DOE awards and expansions to the project scope occurred August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also consisted of matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies; and power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  3. Microprocessor hardware reliability

    Energy Technology Data Exchange (ETDEWEB)

    Wright, R I

    1982-01-01

    Microprocessor-based technology has had an impact in nearly every area of industrial electronics and many applications have important safety implications. Microprocessors are being used for the monitoring and control of hazardous processes in the chemical, oil and power generation industries, for the control and instrumentation of aircraft and other transport systems and for the control of industrial machinery. Even in the field of nuclear reactor protection, where designers are particularly conservative, microprocessors are used to implement certain safety functions and may play increasingly important roles in protection systems in the future. Where microprocessors are simply replacing conventional hard-wired control and instrumentation systems no new hazards are created by their use. In the field of robotics, however, the microprocessor has opened up a totally new technology and with it has created possible new and as yet unknown hazards. The paper discusses some of the design and manufacturing techniques which may be used to enhance the reliability of microprocessor-based systems and examines the available reliability data on LSI/VLSI microcircuits. 12 references.

  4. Supply chain reliability modelling

    Directory of Open Access Journals (Sweden)

    Eugen Zaitsev

    2012-03-01

    Full Text Available Background: Today it is virtually impossible to operate alone on the international level in the logistics business. This promotes the establishment and development of new integrated business entities - logistics operators. However, such cooperation within a supply chain also creates many problems related to supply chain reliability as well as the optimization of supply planning. The aim of this paper was to develop and formulate a mathematical model and algorithms for finding the optimum supply plan, using an economic criterion and a model for evaluating the probability of failure-free operation of the supply chain. Methods: The mathematical model and algorithms for finding the optimum supply plan were developed and formulated using an economic criterion and a model for evaluating the probability of failure-free operation of the supply chain. Results and conclusions: The problem of ensuring the failure-free performance of a goods supply channel analyzed in the paper is characteristic of distributed network systems that make active use of business process outsourcing technologies. The complex planning problem occurring in such systems, which requires taking into account the consumer's requirements for failure-free performance in terms of supply volumes and correctness, can be reduced to a relatively simple linear programming problem through logical analysis of the structures. The sequence of operations that should be taken into account during supply planning with the supplier's functional reliability was also presented.

  5. The intergroup protocols: Scalable group communication for the internet

    Energy Technology Data Exchange (ETDEWEB)

    Berket, Karlo [Univ. of California, Santa Barbara, CA (United States)]

    2000-12-04

    Reliable group ordered delivery of multicast messages in a distributed system is a useful service that simplifies the programming of distributed applications. Such a service helps to maintain the consistency of replicated information and to coordinate the activities of the various processes. With the increasing popularity of the Internet, there is an increasing interest in scaling the protocols that provide this service to the environment of the Internet. The InterGroup protocol suite, described in this dissertation, provides such a service, and is intended for the environment of the Internet with scalability to large numbers of nodes and high latency links. The InterGroup protocols approach the scalability problem from various directions. They redefine the meaning of group membership, allow voluntary membership changes, add a receiver-oriented selection of delivery guarantees that permits heterogeneity of the receiver set, and provide a scalable reliability service. The InterGroup system comprises several components, executing at various sites within the system. Each component provides part of the services necessary to implement a group communication system for the wide-area. The components can be categorized as: (1) control hierarchy, (2) reliable multicast, (3) message distribution and delivery, and (4) process group membership. We have implemented a prototype of the InterGroup protocols in Java, and have tested the system performance in both local-area and wide-area networks.

  6. OSS reliability measurement and assessment

    CERN Document Server

    Yamada, Shigeru

    2016-01-01

    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.
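
    As a concrete flavour of the NHPP models mentioned, the short sketch below implements one classic member of that family, a Goel-Okumoto-type mean value function, with made-up parameters (the book's own parameter estimation and the AHP/ANP analyses are not shown):

        # Goel-Okumoto-type NHPP software reliability growth model (illustrative parameters).
        import math

        def mean_failures(t, a=120.0, b=0.05):
            """Expected cumulative number of faults detected by time t: m(t) = a(1 - e^(-b t))."""
            return a * (1.0 - math.exp(-b * t))

        def reliability(x, t, a=120.0, b=0.05):
            """Probability of no failure in (t, t + x], i.e. exp(-(m(t + x) - m(t)))."""
            return math.exp(-(mean_failures(t + x, a, b) - mean_failures(t, a, b)))

        print(round(mean_failures(40.0), 1), round(reliability(10.0, 40.0), 3))

    Fitting a and b to an OSS project's fault-report history and reading off the reliability over a future interval is the kind of quantitative assessment the book formalizes.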

  7. Transit ridership, reliability, and retention.

    Science.gov (United States)

    2008-10-01

    This project explores two major components that affect transit ridership: travel time reliability and rider retention. It has been recognized that transit travel time reliability may have a significant impact on attractiveness of transit to many ...

  8. Travel reliability inventory for Chicago.

    Science.gov (United States)

    2013-04-01

    The overarching goal of this research project is to enable state DOTs to document and monitor the reliability performance of their highway networks. To this end, a computer tool, TRIC, was developed to produce travel reliability inventories from ...

  9. DOE assay methods used for characterization of contact-handled transuranic waste

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, F.J. (Oak Ridge National Lab., TN (United States)); Caldwell, J.T. (Pajarito Scientific Corp., Los Alamos, NM (United States))

    1991-08-01

    US Department of Energy methods used for characterization of contact-handled transuranic (CH-TRU) waste prior to shipment to the Waste Isolation Pilot Plant (WIPP) are described and listed by contractor site. The methods described are part of the certification process. All CH-TRU waste must be assayed for determination of fissile material content and decay heat values prior to shipment and prior to storage on-site. Both nondestructive assay (NDA) and destructive assay methods are discussed, and new NDA developments such as passive-active neutron (PAN) crate counter improvements and neutron imaging are detailed. Specifically addressed are assay method physics; applicability to CH-TRU wastes; calibration standards and implementation; operator training requirements and practices; assay procedures; assay precision, bias, and limit of detection; and assay limitations. While PAN is a new technique and does not yet have established American Society for Testing and Materials, American National Standards Institute, or Nuclear Regulatory Commission guidelines or methods describing proper calibration procedures, equipment setup, etc., comparisons of PAN data with the more established assay methods (e.g., segmented gamma scanning) have demonstrated its reliability and accuracy. Assay methods employed by DOE have been shown to be reliable and accurate in determining fissile, radionuclide, and alpha-curie content, and decay heat values of CH-TRU wastes. These parameters are therefore used to characterize packaged waste for use in certification programs such as that used in shipment of CH-TRU waste to the WIPP. 36 refs., 10 figs., 7 tabs.

  10. DOE assay methods used for characterization of contact-handled transuranic waste

    International Nuclear Information System (INIS)

    Schultz, F.J.; Caldwell, J.T.

    1991-08-01

    US Department of Energy methods used for characterization of contact-handled transuranic (CH-TRU) waste prior to shipment to the Waste Isolation Pilot Plant (WIPP) are described and listed by contractor site. The methods described are part of the certification process. All CH-TRU waste must be assayed for determination of fissile material content and decay heat values prior to shipment and prior to storage on-site. Both nondestructive assay (NDA) and destructive assay methods are discussed, and new NDA developments such as passive-active neutron (PAN) crate counter improvements and neutron imaging are detailed. Specifically addressed are assay method physics; applicability to CH-TRU wastes; calibration standards and implementation; operator training requirements and practices; assay procedures; assay precision, bias, and limit of detection; and assay limitations. While PAN is a new technique and does not yet have established American Society for Testing and Materials, American National Standards Institute, or Nuclear Regulatory Commission guidelines or methods describing proper calibration procedures, equipment setup, etc., comparisons of PAN data with the more established assay methods (e.g., segmented gamma scanning) have demonstrated its reliability and accuracy. Assay methods employed by DOE have been shown to be reliable and accurate in determining fissile, radionuclide, and alpha-curie content, and decay heat values of CH-TRU wastes. These parameters are therefore used to characterize packaged waste for use in certification programs such as that used in shipment of CH-TRU waste to the WIPP. 36 refs., 10 figs., 7 tabs

  11. Evaluation of PCR and DNA hybridization protocols for detection of viable enterotoxigenic Clostridium perfringens in irradiated beef

    International Nuclear Information System (INIS)

    Baez, L.A.; Juneja, V.K.; Thayer, D.W.; Sackitey, S.

    1997-01-01

    The sensitivity of DNA hybridization and the polymerase chain reaction (PCR) was evaluated in irradiated cooked and raw beef samples. A membrane-based colony hybridization assay and a PCR protocol, both with specificity for the enterotoxin A gene of Clostridium perfringens, were compared with viable plate counts. The results of the colony hybridization procedure were in agreement with viable plate counts for detection and enumeration of enterotoxigenic C. perfringens. The PCR procedure combined a 4 h enrichment followed by a nucleic acid extraction step and assessed the amplification of 183 and 750 base pair enterotoxin gene targets. Detection of C. perfringens by PCR did not show a reliable correlation with viable plate counts or the colony hybridization assay. C. perfringens cells killed by irradiation were not detected by the plate count or colony hybridization methods; however, killed cells were detected with the PCR technique. By relying on the growth of viable cells for detection and/or enumeration, the colony hybridization and plate count methods provided a direct correlation with the presence of viable bacteria

  12. 2017 NREL Photovoltaic Reliability Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]

    2017-08-15

    NREL's Photovoltaic (PV) Reliability Workshop (PVRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology -- both critical goals for moving PV technologies deeper into the electricity marketplace.

  13. AECL's reliability and maintainability program

    International Nuclear Information System (INIS)

    Wolfe, W.A.; Nieuwhof, G.W.E.

    1976-05-01

    AECL's reliability and maintainability program for nuclear generating stations is described. How the various resources of the company are organized to design and construct stations that operate reliably and safely is shown. Reliability and maintainability includes not only special mathematically oriented techniques, but also the technical skills and organizational abilities of the company. (author)

  14. Reliable discrimination of 10 ungulate species using high resolution melting analysis of faecal DNA.

    Directory of Open Access Journals (Sweden)

    Ana Ramón-Laca

    Full Text Available Identifying species occupying an area is essential for many ecological and conservation studies. Faecal DNA is a potentially powerful method for identifying cryptic mammalian species. In New Zealand, 10 species of ungulate (Order: Artiodactyla) have established wild populations and are managed as pests because of their impacts on native ecosystems. However, identifying the ungulate species present within a management area based on pellet morphology is unreliable. We present a method that enables reliable identification of 10 ungulate species (red deer, sika deer, rusa deer, fallow deer, sambar deer, white-tailed deer, Himalayan tahr, Alpine chamois, feral sheep, and feral goat) from swabs of faecal pellets. A high resolution melting (HRM) assay, targeting a fragment of the 12S rRNA gene, was developed. Species-specific primers were designed and combined in a multiplex PCR resulting in fragments of different length and therefore different melting behaviour for each species. The method was developed using tissue from each of the 10 species, and was validated in blind trials. Our protocol enabled species to be determined for 94% of faecal pellet swabs collected during routine monitoring by the New Zealand Department of Conservation. Our HRM method enables high-throughput and cost-effective species identification from low DNA template samples, and could readily be adapted to discriminate other mammalian species from faecal DNA.
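
    As an illustration of the decoding step in an assay of this kind, the hedged sketch below assigns a species call from an observed amplicon melting temperature by matching it against a table of expected values; the species-to-temperature mapping and the tolerance are hypothetical placeholders, not values from the study.

      # Hypothetical sketch: call an ungulate species from an observed HRM melting
      # temperature (Tm). The reference Tm values and tolerance below are invented
      # for illustration only; a real assay would use empirically calibrated values.
      REFERENCE_TM = {          # degrees Celsius, hypothetical
          "red deer": 78.2,
          "sika deer": 79.0,
          "fallow deer": 80.1,
          "feral goat": 81.5,
      }
      TOLERANCE = 0.3           # acceptable deviation in degrees Celsius

      def call_species(observed_tm):
          """Return the best-matching species, or None if no reference is close enough."""
          best = min(REFERENCE_TM, key=lambda sp: abs(REFERENCE_TM[sp] - observed_tm))
          return best if abs(REFERENCE_TM[best] - observed_tm) <= TOLERANCE else None

      print(call_species(79.1))   # -> 'sika deer'
      print(call_species(83.0))   # -> None (no confident call)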

  15. A Study on IP Network Recovery through Routing Protocols

    Directory of Open Access Journals (Sweden)

    K. Karthik

    2016-09-01

    Full Text Available The Internet has taken a major role in our communication infrastructure, and the requirements for its availability and reliability have increased accordingly. The major causes of network failure are failure of a node and failure of a link between nodes; these can reduce the performance of major applications in an IP network. Network recovery should be fast enough to limit the service interruption caused by a link or node failure. The new path taken by the diverted traffic can be computed either at the time of failure or before failures occur; these mechanisms are known as reactive and proactive protocols, respectively. In this paper, we survey reactive and proactive protocol mechanisms for IP network recovery.

  16. Study and development of a remote biometric authentication protocol

    OpenAIRE

    Bistarelli, Stefano; Claudio, Viti

    2003-01-01

    This paper reports the phases of study and implementation of a remote biometric authentication protocol developed during my internship at the I.i.t. of the C.n.r. in Pisa. Starting from the study of the history of authentication, we looked at systems from the first ones used in the 1960s to the latest technology; this helped us understand how we could realize a working demonstration protocol that could achieve remote web authentication with good reliability: to do this we chose to modify the SS...

  17. Barcoded microchips for biomolecular assays.

    Science.gov (United States)

    Zhang, Yi; Sun, Jiashu; Zou, Yu; Chen, Wenwen; Zhang, Wei; Xi, Jianzhong Jeff; Jiang, Xingyu

    2015-01-20

    Multiplexed assay of analytes is of great importance for clinical diagnostics and other analytical applications. Barcode-based bioassays with the ability to encode and decode may realize this goal in a straightforward and consistent manner. We present here a microfluidic barcoded chip containing several sets of microchannels with different widths, imitating the commonly used barcode. A single barcoded microchip can carry out tens of individual protein/nucleic acid assays (encode) and immediately yield all assay results by a portable barcode reader or a smartphone (decode). The applicability of a barcoded microchip is demonstrated by human immunodeficiency virus (HIV) immunoassays for simultaneous detection of three targets (anti-gp41 antibody, anti-gp120 antibody, and anti-gp36 antibody) from six human serum samples. We can also determine seven pathogen-specific oligonucleotides by a single chip containing both positive and negative controls.

  18. Business of reliability

    Science.gov (United States)

    Engel, Pierre

    1999-12-01

    The presentation is organized around three themes: (1) The decrease of reception equipment costs allows non-Remote Sensing organizations to access a technology until recently reserved to a scientific elite. What this means is the rise of 'operational' executive agencies considering space-based technology and operations as a viable input to their daily tasks. This is possible thanks to totally dedicated ground receiving entities focusing on one application for themselves, rather than serving a vast community of users. (2) The multiplication of earth observation platforms will form the base for reliable technical and financial solutions. One obstacle to the growth of the earth observation industry is the variety of policies (commercial versus non-commercial) ruling the distribution of the data and value-added products. In particular, the high volume of data sales required for the return on investment does conflict with traditional low-volume data use for most applications. Constant access to data sources supposes monitoring needs as well as technical proficiency. (3) Large-volume use of data coupled with low-cost equipment is only possible when the technology has proven reliable, in terms of application results, financial risks and data supply. Each of these factors is reviewed. The expectation is that international cooperation between agencies and private ventures will pave the way for future business models. As an illustration, the presentation proposes to use some recent non-traditional monitoring applications that may lead to significant use of earth observation data, value added products and services: flood monitoring, ship detection, marine oil pollution deterrent systems and rice acreage monitoring.

  19. FRENCH PROTOCOL CARDS

    CERN Multimedia

    Division du Personnel

    1999-01-01

    Senior officials, holders of FRENCH PROTOCOL cards (blue cards) due to expire on 31.12.1999, are requested to return these cards and those of family members, for extension to:Bureau des cartes, bâtiment 33.1-025Should the 3 spaces for authentication on the back of the card be full, please enclose 2 passport photographs for a new card.In the case of children aged 14 and over, an attestation of dependency and a school certificate should be returned with the card.Personnel DivisionTel. 79494/74683

  20. FRENCH PROTOCOL CARDS

    CERN Multimedia

    Human Resources Division

    2000-01-01

    Senior officials, holders of FRENCH PROTOCOL cards (blue cards) due to expire on 31.12.2000, are requested to return these cards and those of family members, for extension to: Bureau des cartes, Bât 33.1-009/1-015 Should the three spaces for authentication on the back of the card be full, please enclose two passport photographs for a new card. In the case of children aged 14 and over, an attestation of dependency and a school certificate should be returned with the card.

  1. Sincalide - the final protocol

    International Nuclear Information System (INIS)

    Clarke, E.A.; Notghi, A.; Hesslewood, S.R.; Harding, L.K.

    2002-01-01

    Full text: HIDA biliary studies examine the gallbladder (GB) to give a percentage ejection fraction (EF). Porcine CCK was an accepted agent for stimulating the GB prior to being withdrawn in the UK from 1998. Sincalide (a synthetic CCK) was the suggested replacement. We have tried many administration regimes in an attempt to get results comparable with our established CCK protocols. Dose concentration and length of infusion times have been studied. Initially a dose of 10 ngm/kg/min given over 2 minutes (manufacturer's recommended dose) was used. This gave falsely low ejection fractions. The dose was reduced to 3 ngm/kg/min over 3 minutes as it was felt the higher dose may be causing constriction of the sphincter of Oddi. This gave a slight improvement with 22 % of patients having normal EF (>35 %). The length of infusion was extended to 15 minutes and the dose concentration reduced again to 0.6 ngm/kg/min. 62 % of patients had a normal EF. However, on many of the curves the gallbladder was still contracting on completion of the 15 minute infusion and began to refill immediately after stopping Sincalide. A further change of protocol was indicated. The infusion time was extended to 30 minutes and the dose concentration per minute kept the same. Imaging began at 30 minutes post HIDA injection and continued for a total of 50 minutes. Sincalide infusion began at 35 minutes if a GB was visualized. This protocol has been performed on 17 patients. 53 % of these had a normal result (comparable with a normal rate of 40 % previously established with CCK) with a mean EF of 60 %. The mean EF of patients with abnormal studies was 15 %. Curves showed a plateau by 30 minutes in 94 % of patients indicating that gallbladder contraction was complete. No normal range is available so results were compared with ultrasound (US). All patients who had an abnormal US scan also had abnormal HIDA results. Three patients had a normal US scan and abnormal HIDA study. These are currently
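
    For readers unfamiliar with how a gallbladder ejection fraction is derived from a HIDA time-activity curve, the following sketch shows one common calculation (background-corrected peak-to-post-infusion counts); the count values are hypothetical and the formula is a generic illustration rather than the authors' exact processing.

      # Hedged illustration of a gallbladder ejection fraction (EF) calculation from
      # a HIDA time-activity curve. Counts are hypothetical; real studies use
      # decay- and background-corrected regions of interest.
      def ejection_fraction(gb_peak_counts, gb_post_counts, background=0.0):
          """EF (%) = (peak - post) / (peak - background) * 100."""
          net_peak = gb_peak_counts - background
          net_post = gb_post_counts - background
          return (net_peak - net_post) / net_peak * 100.0

      # Example: peak gallbladder counts 12,000, counts after Sincalide 4,500,
      # background 600 -> EF of roughly 66%, which would read as normal (>35%).
      print(round(ejection_fraction(12000, 4500, background=600), 1))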

  2. Distributed Network Protocols

    Science.gov (United States)

    1980-07-01

    All protocols are extended to networks with changing topology. Each node knows its adjacent links, but not necessarily the identity of its neighbors, i.e., the nodes at the other end of the links.

  3. Cyber Security Vulnerability Impact on I and C Reliability

    International Nuclear Information System (INIS)

    Hadley, Mark D.; McBride, Justin B.

    2006-01-01

    We present a discussion of the cyber security vulnerability impact on instrument and control reliability. In the discussion we demonstrate the likely vector of attack and vulnerabilities associated with commodity hardware, protocols and communication media. The current fleet of nuclear power plants in the United States utilizes aging analog instrument and control systems which are more frequently suffering from obsolescence and failure. The commodity equipment available now and in the near future incorporates features from information technology systems which compound cyber vulnerabilities

  4. An eDNA Assay to Monitor a Globally Invasive Fish Species from Flowing Freshwater.

    Science.gov (United States)

    Adrian-Kalchhauser, Irene; Burkhardt-Holm, Patricia

    2016-01-01

    Ponto-Caspian gobies are a flock of five invasive fish species that have colonized freshwaters and brackish waters in Europe and North America. One of them, the round goby Neogobius melanostomus, figures among the 100 worst invaders in Europe. Current methods to detect the presence of Ponto-Caspian gobies involve catching or sighting the fish. These approaches are labor-intensive and not very sensitive. Consequently, populations are usually detected only when they have reached high densities and when management or containment efforts are futile. To improve monitoring, we developed an assay based on the detection of DNA traces (environmental DNA, or eDNA) of Ponto-Caspian gobies in river water. The assay specifically detects invasive goby DNA and does not react to any native fish species. We apply the assay to environmental samples and demonstrate that parameters such as sampling depth, sampling location, extraction protocol, PCR protocol and PCR inhibition greatly impact detection. We further successfully outline the invasion front of Ponto-Caspian gobies in a large river, the High Rhine in Switzerland, and thus demonstrate the applicability of the assay to lotic environments. The eDNA assay requires less time, equipment, manpower, skills, and financial resources than conventional monitoring methods such as electrofishing, angling or diving. Samples can be taken by untrained individuals, and the assay can be performed by any molecular biologist on a conventional PCR machine. Therefore, this assay enables environment managers to map invaded areas independently of fishermen's reports and fish community monitoring.

  5. A Direct, Competitive Enzyme-Linked Immunosorbent Assay (ELISA) as a Quantitative Technique for Small Molecules

    Science.gov (United States)

    Powers, Jennifer L.; Rippe, Karen Duda; Imarhia, Kelly; Swift, Aileen; Scholten, Melanie; Islam, Naina

    2012-01-01

    ELISA (enzyme-linked immunosorbent assay) is a widely used technique with applications in disease diagnosis, detection of contaminated foods, and screening for drugs of abuse or environmental contaminants. However, published protocols with a focus on quantitative detection of small molecules designed for teaching laboratories are limited. A…

  6. How to best freeze liver samples to perform the in vivo mammalian alkaline comet assay

    Directory of Open Access Journals (Sweden)

    José Manuel Enciso Gadea

    2015-06-01

    None of the different methods used was capable of giving good results, except immersing the liver samples in liquid nitrogen followed by the thawing protocol of Jackson et al. (2013), suggesting that the thawing process may be as critical as the freezing process. To sum up, these results highlight the importance of further exploring the possibility of performing the comet assay with frozen tissue.

  7. The local lymph node assay (LLNA).

    Science.gov (United States)

    Rovida, Costanza; Ryan, Cindy; Cinelli, Serena; Basketter, David; Dearman, Rebecca; Kimber, Ian

    2012-02-01

    The murine local lymph node assay (LLNA) is a widely accepted method for assessing the skin sensitization potential of chemicals. Compared with other in vivo methods in guinea pig, the LLNA offers important advantages with respect to animal welfare, including a requirement for reduced animal numbers as well as reduced pain and trauma. In addition to hazard identification, the LLNA is used for determining the relative skin sensitizing potency of contact allergens as a pivotal contribution to the risk assessment process. The LLNA is the only in vivo method that has been subjected to a formal validation process. The original LLNA protocol is based on measurement of the proliferative activity of draining lymph node cells (LNC), as determined by incorporation of radiolabeled thymidine. Several variants to the original LLNA have been developed to eliminate the use of radioactive materials. One such alternative is considered here: the LLNA:BrdU-ELISA method, which uses 5-bromo-2-deoxyuridine (BrdU) in place of radiolabeled thymidine to measure LNC proliferation in draining nodes. © 2012 by John Wiley & Sons, Inc.

  8. Evaluation of a Noncontact, Alternative Mosquito Repellent Assay System.

    Science.gov (United States)

    Tisgratog, Rungarun; Kongmee, Monthathip; Sanguanpong, Unchalee; Prabaripai, Atchariya; Bangs, Michael J; Chareonviriyaphap, Theeraphap

    2016-09-01

    A novel noncontact repellency assay system (NCRAS) was designed and evaluated as a possible alternative method for testing compounds that repel or inhibit mosquitoes from blood feeding. Deet and Aedes aegypti were used in a controlled laboratory setting. Using 2 study designs, a highly significant difference was seen between deet-treated and untreated skin placed behind the protective screens, indicating that deet was detected and was acting as a deterrent to mosquito landing and probing behavior. However, a 2nd study showed significant differences between protected (behind a metal screen barrier) and unprotected (exposed) deet-treated forearms, indicating the screen mesh might restrict the detection of deet and thus influence the landing/biting response. These findings indicate the prototype NCRAS shows good promise but requires further evaluation and possible modification in design and testing protocol to achieve more desirable operational attributes in comparison with direct skin-contact repellency mosquito assays.

  9. A sensitive radioimmunosorbent assay for the detection of plant viruses

    International Nuclear Information System (INIS)

    Ghabrial, S.A.; Shepherd, R.J.

    1980-01-01

    A simple and highly sensitive radioimmunosorbent assay (RISA) for the detection of plant viruses is described. The RISA procedure is a microplate method based on the principle of 'double-antibody sandwich' and follows essentially the protocol of the enzyme-linked immunosorbent assay (ELISA) (Clark and Adams, 1977), with the exception that 125 I-labelled γ-globulin is substituted for the γ-globulin enzyme conjugate; the bound 125 I-γ-globulin is dissociated by acidification from the double-antibody sandwich. The radioactivity is proportional to virus concentration, and cauliflower mosaic virus (CaMV) and lettuce mosaic virus (LMV) could be detected at concentrations as low as 5 and 2 ng/ml, respectively. Direct evidence of the adverse effects of conjugation with enzyme on the binding abilities of antibodies is presented. The RISA procedure should prove valuable with viruses for which the ELISA values are too low to be dependable. (author)

  10. Real‑time, fast neutron detection for stimulated safeguards assay

    International Nuclear Information System (INIS)

    Joyce, Malcolm J.; Adamczyk, Justyna; Plenteda, Romano; Aspinall, Michael D.; Cave, Francis D.

    2015-01-01

    The advent of low‑hazard organic liquid scintillation detectors and real‑time pulse‑shape discrimination (PSD) processing has suggested a variety of modalities by which fast neutrons, as opposed to neutrons moderated prior to detection, can be used directly to benefit safeguards needs. In this paper we describe a development of a fast‑neutron based safeguards assay system designed for the assessment of 235 U content in fresh fuel. The system benefits from real‑time pulse‑shape discrimination processing and auto‑calibration of the detector system parameters to ensure a rapid and effective set‑up protocol. These requirements are essential in optimising the speed and limit of detection of the fast neutron technique, whilst minimising the intervention needed to perform the assay.

  11. Electronics reliability calculation and design

    CERN Document Server

    Dummer, Geoffrey W A; Hiller, N

    1966-01-01

    Electronics Reliability-Calculation and Design provides an introduction to the fundamental concepts of reliability. The increasing complexity of electronic equipment has made problems in designing and manufacturing a reliable product more and more difficult. Specific techniques have been developed that enable designers to integrate reliability into their products, and reliability has become a science in its own right. The book begins with a discussion of basic mathematical and statistical concepts, including arithmetic mean, frequency distribution, median and mode, scatter or dispersion of mea
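
    The introductory statistics the book opens with (arithmetic mean, frequency distribution, median, mode and dispersion) can be reproduced in a few lines; the sample of component lifetimes below is invented purely to show the calculations.

      # Basic descriptive statistics of the kind used in reliability work,
      # computed with the Python standard library on an invented sample of
      # component lifetimes (hours).
      import statistics
      from collections import Counter

      lifetimes = [1200, 1350, 1350, 1420, 1500, 1620, 1350, 1480]

      print("mean:", statistics.mean(lifetimes))
      print("median:", statistics.median(lifetimes))
      print("mode:", statistics.mode(lifetimes))
      print("std dev (dispersion):", round(statistics.stdev(lifetimes), 1))
      print("frequency distribution:", Counter(lifetimes))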

  12. The use of comet assay in plant toxicology: recent advances

    Directory of Open Access Journals (Sweden)

    Conceição LV Santos

    2015-06-01

    Full Text Available The systematic study of genotoxicity in plants induced by contaminants and other stress agents has been hindered to date by the lack of reliable and robust biomarkers. The comet assay is a versatile and sensitive method for the evaluation of DNA damages and DNA repair capacity at single-cell level. Due to its simplicity and sensitivity, and the small number of cells required to obtain robust results, the use of plant comet assay has drastically increased in the last decade. For years its use was restricted to a few model species, e.g. Allium cepa, Nicotiana tabacum, Vicia faba, or Arabidopsis thaliana but this number largely increased in the last years. Plant comet assay has been used to study the genotoxic impact of radiation, chemicals including pesticides, phytocompounds, heavy metals, nanoparticles or contaminated complex matrices. Here we will review the most recent data on the use of this technique as a standard approach for studying the genotoxic effects of different stress conditions on plants. Also, we will discuss the integration of information provided by the comet assay with other DNA-damage indicators, and with cellular responses including oxidative stress, cell division or cell death. Finally, we will focus on putative relations between transcripts related with DNA damage pathways, DNA replication and repair, oxidative stress and cell cycle progression that have been identified in plant cells with comet assays demonstrating DNA damage.

  13. The use of comet assay in plant toxicology: recent advances

    Science.gov (United States)

    Santos, Conceição L. V.; Pourrut, Bertrand; Ferreira de Oliveira, José M. P.

    2015-01-01

    The systematic study of genotoxicity in plants induced by contaminants and other stress agents has been hindered to date by the lack of reliable and robust biomarkers. The comet assay is a versatile and sensitive method for the evaluation of DNA damages and DNA repair capacity at single-cell level. Due to its simplicity and sensitivity, and the small number of cells required to obtain robust results, the use of plant comet assay has drastically increased in the last decade. For years its use was restricted to a few model species, e.g., Allium cepa, Nicotiana tabacum, Vicia faba, or Arabidopsis thaliana but this number largely increased in the last years. Plant comet assay has been used to study the genotoxic impact of radiation, chemicals including pesticides, phytocompounds, heavy metals, nanoparticles or contaminated complex matrices. Here we will review the most recent data on the use of this technique as a standard approach for studying the genotoxic effects of different stress conditions on plants. Also, we will discuss the integration of information provided by the comet assay with other DNA-damage indicators, and with cellular responses including oxidative stress, cell division or cell death. Finally, we will focus on putative relations between transcripts related with DNA damage pathways, DNA replication and repair, oxidative stress and cell cycle progression that have been identified in plant cells with comet assays demonstrating DNA damage. PMID:26175750

  14. Quantification of methanogenic biomass by enzyme-linked immunosorbent assay and by analysis of specific methanogenic cofactors

    Energy Technology Data Exchange (ETDEWEB)

    Gorris, L G.M.; Kemp, H A; Archer, D B

    1987-01-01

    The reliability and accuracy with which enzyme-linked immunosorbent assay (ELISA) and an assay of methanogenic cofactors detect and quantify methanogenic species were investigated. Both assays required standardization with laboratory cultures of methanogenic bacteria and were applied to mixtures of pure cultures and samples from anaerobic digesters. ELISA was shown to be a simple method for detecting and quantifying individual methanogenic species. The range of species which can be assayed is limited by the range of antisera available but, potentially, ELISA can be applied to all methanogens. Although the cofactor assay is not species-specific it can distinguish hydrogenotrophic and acetotrophic methanogens and is quantitative.
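
    Quantification by ELISA ultimately reduces to reading unknowns off a standard curve built from the laboratory cultures used for standardization; the sketch below fits a simple linear curve to hypothetical absorbance readings and interpolates a sample, standing in for whatever curve model the original work actually used.

      # Hedged sketch of ELISA quantification against a standard curve.
      # Absorbance values and cell densities are hypothetical; real assays often
      # use four-parameter logistic fits rather than the linear fit shown here.
      import numpy as np

      std_conc = np.array([1e5, 1e6, 1e7, 1e8])      # cells/mL of a methanogen standard
      std_abs  = np.array([0.08, 0.21, 0.55, 1.10])  # measured absorbance (hypothetical)

      # Fit absorbance as a linear function of log10(concentration).
      slope, intercept = np.polyfit(np.log10(std_conc), std_abs, 1)

      def quantify(sample_absorbance):
          """Interpolate a sample back to cells/mL using the fitted curve."""
          return 10 ** ((sample_absorbance - intercept) / slope)

      print(f"{quantify(0.40):.2e} cells/mL")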

  15. Security and SCADA protocols

    International Nuclear Information System (INIS)

    Igure, V. M.; Williams, R. D.

    2006-01-01

    Supervisory control and data acquisition (SCADA) networks have replaced discrete wiring for many industrial processes, and the efficiency of the network alternative suggests a trend toward more SCADA networks in the future. This paper broadly considers SCADA to include distributed control systems (DCS) and digital control systems. These networks offer many advantages, but they also introduce potential vulnerabilities that can be exploited by adversaries. Inter-connectivity exposes SCADA networks to many of the same threats that face the public internet and many of the established defenses therefore show promise if adapted to the SCADA differences. This paper provides an overview of security issues in SCADA networks and ongoing efforts to improve the security of these networks. Initially, a few samples from the range of threats to SCADA network security are offered. Next, attention is focused on security assessment of SCADA communication protocols. Three challenges must be addressed to strengthen SCADA networks. Access control mechanisms need to be introduced or strengthened, improvements are needed inside of the network to enhance security and network monitoring, and SCADA security management improvements and policies are needed. This paper discusses each of these challenges. This paper uses the Profibus protocol as an example to illustrate some of the vulnerabilities that arise within SCADA networks. The example Profibus security assessment establishes a network model and an attacker model before proceeding to a list of example attacks. (authors)

  16. Mathematical reliability an expository perspective

    CERN Document Server

    Mazzuchi, Thomas; Singpurwalla, Nozer

    2004-01-01

    In this volume consideration was given to more advanced theoretical approaches and novel applications of reliability to ensure that topics having a futuristic impact were specifically included. Topics like finance, forensics, information, and orthopedics, as well as the more traditional reliability topics were purposefully undertaken to make this collection different from the existing books in reliability. The entries have been categorized into seven parts, each emphasizing a theme that seems poised for the future development of reliability as an academic discipline with relevance. The seven parts are networks and systems; recurrent events; information and design; failure rate function and burn-in; software reliability and random environments; reliability in composites and orthopedics, and reliability in finance and forensics. Embedded within the above are some of the other currently active topics such as causality, cascading, exchangeability, expert testimony, hierarchical modeling, optimization and survival...

  17. Comet Assay on Daphnia magna in eco-genotoxicity testing.

    Science.gov (United States)

    Pellegri, Valerio; Gorbi, Gessica; Buschini, Annamaria

    2014-10-01

    Detection of potentially hazardous compounds in water bodies is a priority in environmental risk assessment. For the evaluation and monitoring of water quality, a series of methodologies may be applied. Among them, the worldwide used toxicity tests with organisms of the genus Daphnia is one of the most powerful. In recent years, some attempts were made to utilize Daphnia magna in genotoxicity testing as many of the new environmental contaminants are described as DNA-damaging agents in aquatic organisms. The aim of this research was to develop a highly standardized protocol of the Comet Assay adapted for D. magna, especially regarding the isolation of cells derived from the same tissue (haemolymph) from newborn organisms exposed in vivo. Several methods for haemolymph extraction and different Comet Assay parameters were compared. Electrophoretic conditions were adapted in order to obtain minimum DNA migration in cells derived from untreated organisms and, at the same time, maximum sensitivity in specimens treated with known genotoxicants (CdCl2 and H2O2). Additional tests were performed to investigate if life-history traits of the cladoceran (such as the age of adult organisms that provide newborns, the clutch size of origin, the number of generations reared in standard conditions) and the water composition as well, might influence the response of the assay. This study confirms the potential application of the Comet Assay in D. magna for assessing genotoxic loads in aqueous solution. The newly developed protocol could integrate the acute toxicity bioassay, thus expanding the possibility of using this model species in freshwater monitoring (waters, sediment and soil elutriates) and is in line with the spirit of the EU Water Framework Directive in reducing the number of bioassays that involve medium-sized species. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. BioAssay templates for the semantic web

    Directory of Open Access Journals (Sweden)

    Alex M. Clark

    2016-05-01

    Full Text Available Annotation of bioassay protocols using semantic web vocabulary is a way to make experiment descriptions machine-readable. Protocols are communicated using concise scientific English, which precludes most kinds of analysis by software algorithms. Given the availability of a sufficiently expressive ontology, some or all of the pertinent information can be captured by asserting a series of facts, expressed as semantic web triples (subject, predicate, object). With appropriate annotation, assays can be searched, clustered, tagged and evaluated in a multitude of ways, analogous to other segments of drug discovery informatics. The BioAssay Ontology (BAO) has been previously designed for this express purpose, and provides a layered hierarchy of meaningful terms which can be linked to. Currently the biggest challenge is the issue of content creation: scientists cannot be expected to use the BAO effectively without having access to software tools that make it straightforward to use the vocabulary in a canonical way. We have sought to remove this barrier by: (1) defining a BioAssay Template (BAT) data model; (2) creating a software tool for experts to create or modify templates to suit their needs; and (3) designing a common assay template (CAT) to leverage the most value from the BAO terms. The CAT was carefully assembled by biologists in order to find a balance between the maximum amount of information captured vs. low degrees of freedom in order to keep the user experience as simple as possible. The data format that we use for describing templates and corresponding annotations is the native format of the semantic web (RDF triples), and we demonstrate some of the ways that generated content can be meaningfully queried using the SPARQL language. We have made all of these materials available as open source (http://github.com/cdd/bioassay-template), in order to encourage community input and use within diverse projects, including but not limited to our own
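
    To make the triple-based annotation concrete, here is a small, hypothetical example using the rdflib library: it asserts a few facts about an assay as (subject, predicate, object) triples and retrieves them with a SPARQL query; the namespace and property names are illustrative stand-ins, not terms taken from the BioAssay Ontology itself.

      # Hypothetical sketch of semantic-web annotation of an assay as RDF triples,
      # queried with SPARQL via rdflib. The vocabulary below is invented for
      # illustration; real annotations would link to BAO terms.
      from rdflib import Graph, Namespace, Literal

      EX = Namespace("http://example.org/assay/")
      g = Graph()
      g.add((EX.assay42, EX.hasTargetClass, Literal("kinase")))
      g.add((EX.assay42, EX.hasDetectionMethod, Literal("luminescence")))
      g.add((EX.assay99, EX.hasDetectionMethod, Literal("fluorescence")))

      query = """
          SELECT ?assay WHERE {
              ?assay <http://example.org/assay/hasDetectionMethod> "luminescence" .
          }
      """
      for row in g.query(query):
          print(row.assay)   # -> http://example.org/assay/assay42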

  19. A Reliable Method for the Evaluation of the Anaphylactoid Reaction Caused by Injectable Drugs

    Directory of Open Access Journals (Sweden)

    Fang Wang

    2016-10-01

    Full Text Available Adverse reactions of injectable drugs usually occur at first administration and are closely associated with the dosage and speed of injection. This phenomenon is correlated with the anaphylactoid reaction. However, up to now, study methods based on antigen detection have still not gained wide acceptance and single physiological indicators cannot be utilized to differentiate anaphylactoid reactions from allergic reactions and inflammatory reactions. In this study, a reliable method for the evaluation of anaphylactoid reactions caused by injectable drugs was established by using multiple physiological indicators. We used compound 48/80, ovalbumin and endotoxin as the sensitization agents to induce anaphylactoid, allergic and inflammatory reactions. Different experimental animals (guinea pig and nude rat) and different modes of administration (intramuscular, intravenous and intraperitoneal injection) and different times (15 min, 30 min and 60 min) were evaluated to optimize the study protocol. The results showed that the optimal way to achieve sensitization involved treating guinea pigs with the different agents by intravenous injection for 30 min. Further, seven related humoral factors including 5-HT, SC5b-9, Bb, C4d, IL-6, C3a and histamine were detected by HPLC analysis and ELISA assay to determine their expression level. The results showed that five of them, including 5-HT, SC5b-9, Bb, C4d and IL-6, displayed significant differences between anaphylactoid, allergic and inflammatory reactions, which indicated that their combination could be used to distinguish these three reactions. Then different injectable drugs were used to verify this method and the results showed that the chosen indicators exhibited good correlation with the anaphylactoid reaction which indicated that the established method was both practical and reliable. Our research provides a feasible method for the diagnosis of the serious adverse reactions caused by injectable drugs which

  20. Bacteriophage amplification assay for detection of Listeria spp. using virucidal laser treatment

    Directory of Open Access Journals (Sweden)

    I.C. Oliveira

    2012-09-01

    Full Text Available A protocol for the bacteriophage amplification technique was developed for quantitative detection of viable Listeria monocytogenes cells using the A511 listeriophage with plaque formation as the end-point assay. Laser and toluidine blue O (TBO) were employed as a selective virucidal treatment for destruction of exogenous bacteriophage. Laser and TBO can bring about a total reduction in phage titer (ca. 10^8 pfu/mL) without affecting the viability of L. monocytogenes cells. Artificially inoculated skimmed milk revealed mean populations of the bacteria as low as 13 cfu/mL (1.11 log cfu/mL) after a 10-h assay duration. Virucidal laser treatment demonstrated better protection of Listeria cells than the other agents previously tested. The protocol was faster and easier to perform than standard procedures. This protocol constitutes an alternative for rapid, sensitive and quantitative detection of L. monocytogenes.

  1. Assays of D-Amino Acid Oxidase Activity

    Directory of Open Access Journals (Sweden)

    Elena Rosini

    2018-01-01

    Full Text Available D-amino acid oxidase (DAAO) is a well-known flavoenzyme that catalyzes the oxidative FAD-dependent deamination of D-amino acids. As a result of the absolute stereoselectivity and broad substrate specificity, microbial DAAOs have been employed as industrial biocatalysts in the production of semi-synthetic cephalosporins and enantiomerically pure amino acids. Moreover, in mammals, DAAO is present in specific brain areas and degrades D-serine, an endogenous coagonist of the N-methyl-D-aspartate receptors (NMDARs). Dysregulation of D-serine metabolism due to an altered DAAO functionality is related to pathological NMDARs dysfunctions such as in amyotrophic lateral sclerosis and schizophrenia. In this protocol paper, we describe a variety of direct assays based on the determination of molecular oxygen consumption, reduction of alternative electron acceptors, or α-keto acid production, of coupled assays to detect the hydrogen peroxide or the ammonium production, and an indirect assay of the α-keto acid production based on a chemical derivatization. These analytical assays allow the determination of DAAO activity both on recombinant enzyme preparations, in cells, and in tissue samples.
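
    As a worked example of turning raw readings from one of these assays into an activity value, the sketch below converts an absorbance slope from a coupled peroxidase/chromogen assay into units of enzyme activity via the Beer-Lambert law; the extinction coefficient and volumes are hypothetical placeholders rather than values from the protocol.

      # Hedged sketch: enzyme activity (U = umol product/min) from the slope of an
      # absorbance-vs-time trace in a coupled H2O2 detection assay.
      # All numeric inputs below are hypothetical.
      def activity_units(delta_abs_per_min, extinction_mM, path_cm, assay_volume_mL, enzyme_volume_mL):
          """Beer-Lambert: rate (mM/min) = dA/dt / (epsilon * l); scale to U/mL of enzyme."""
          rate_mM_per_min = delta_abs_per_min / (extinction_mM * path_cm)
          umol_per_min = rate_mM_per_min * assay_volume_mL        # mM * mL = umol
          return umol_per_min / enzyme_volume_mL                  # U per mL of enzyme added

      # Example: dA/dt = 0.12 /min, epsilon = 13 mM^-1 cm^-1 (hypothetical chromophore),
      # 1 cm path, 1.0 mL assay, 0.05 mL enzyme sample.
      print(round(activity_units(0.12, 13.0, 1.0, 1.0, 0.05), 3), "U/mL")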

  2. Of plants and reliability

    International Nuclear Information System (INIS)

    Schneider Horst

    2009-01-01

    Behind the political statements made about the transformer event at the Kruemmel nuclear power station (KKK) in the summer of 2009 there are fundamental issues of atomic law. Pursuant to Articles 20 and 28 of its Basic Law, Germany is a state in which the rule of law applies. Consequently, the aspects of atomic law associated with the incident merit a closer look, all the more so as the items concerned have been known for many years. Important aspects in the debate about the Kruemmel nuclear power plant are the fact that the transformer is considered part of the nuclear power station under atomic law and thus a ''plant'' subject to surveillance by the nuclear regulatory agencies, on the one hand, and the reliability under atomic law of the operator and the executive personnel responsible, on the other hand. Both ''plant'' and ''reliability'' are terms focusing on nuclear safety. Hence the question to what extent safety was affected in the Kruemmel incident. The classification of the event as 0 = no or only a very slight safety impact on the INES scale (INES = International Nuclear Event Scale) should not be used to put aside the safety issue once and for all. Points of fact and their technical significance must be considered prior to any legal assessment. Legal assessments and regulations are associated with facts and circumstances. Any legal examination is based on the facts as determined and elucidated. Any other procedure would be tantamount to an inadmissible legal advance conviction. Now, what is the position of political statements, i.e. political assessments and political responsibility? If everything is done the correct way, they come at the end, after exploration of the facts and evaluation under applicable law. Sometimes things are handled differently, with consequences which are not very helpful. In the light of the provisions about the rule of law as laid down in the Basic Law, the new federal government should be made to observe the proper sequence of

  3. A Passive Testing Approach for Protocols in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xiaoping Che

    2015-11-01

    Full Text Available Smart systems are increasingly being developed today, with the number of wireless sensor devices drastically increasing. They are implemented within several contexts throughout our environment. Thus, sensed data transported in ubiquitous systems are important, and the way to carry them must be efficient and reliable. For that purpose, several routing protocols have been proposed for wireless sensor networks (WSN). However, one stage that is often neglected before their deployment is the conformance testing process, a crucial and challenging step. Compared to active testing techniques commonly used in wired networks, passive approaches are more suitable to the WSN environment. While some works propose to specify the protocol with state models or to analyze them with simulators and emulators, we here propose a logic-based approach for formally specifying some functional requirements of a novel WSN routing protocol. We provide an algorithm to evaluate these properties on collected protocol execution traces. Further, we demonstrate the efficiency and suitability of our approach by applying it to common WSN functional properties, as well as specific ones designed from our own routing protocol. We provide relevant testing verdicts through a real indoor testbed and the implementation of our protocol. Furthermore, the flexibility, genericity and practicability of our approach have been proven by the experimental results.
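
    A minimal illustration of evaluating a functional property on a collected execution trace is given below: it checks a generic "every data request is eventually answered" requirement over a list of logged events; the event names and the property itself are invented for the sketch and are not the properties defined in the paper.

      # Hypothetical sketch of passive testing: evaluate a simple liveness-style
      # property ("every REQUEST with an id is eventually followed by a REPLY with
      # the same id") over a recorded protocol trace. The event format is invented.
      def property_holds(trace):
          pending = set()
          for event, msg_id in trace:
              if event == "REQUEST":
                  pending.add(msg_id)
              elif event == "REPLY":
                  pending.discard(msg_id)
          return not pending   # verdict: True = pass, False = fail

      trace_ok   = [("REQUEST", 1), ("REPLY", 1), ("REQUEST", 2), ("REPLY", 2)]
      trace_fail = [("REQUEST", 1), ("REPLY", 1), ("REQUEST", 2)]
      print(property_holds(trace_ok))    # True
      print(property_holds(trace_fail))  # False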

  4. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of the software failure/correction processes. Later on, MERF was approximated by a much simpler exponential reliability function (EARF), which keeps most of MERF's mathematical properties, so the two functions together make up a single reliability model. The reliability model MERF/EARF considers the software failure process as a non-homogeneous Poisson process (NHPP), and the repair (correction) process, a multinomial distribution. The model supposes that both processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties and its application to software reliability. Nevertheless, applications of the model to the inspection and maintenance of physical systems are also foreseen. The paper includes a complete numerical example of the model application to a software reliability analysis
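
    To illustrate the kind of quantity such a model yields, the sketch below uses a standard NHPP formulation (the Goel-Okumoto mean value function) rather than MERF/EARF itself, since the latter's exact equations are not reproduced in this abstract; the parameter values are hypothetical.

      # Hedged illustration of NHPP-based software reliability (Goel-Okumoto model),
      # used here as a stand-in for the MERF/EARF equations, which are not given in
      # the abstract. Parameters a (total expected faults) and b (detection rate)
      # are hypothetical.
      import math

      def mean_failures(t, a, b):
          """Expected cumulative failures by time t: m(t) = a * (1 - exp(-b t))."""
          return a * (1.0 - math.exp(-b * t))

      def reliability(x, t, a, b):
          """Probability of no failure in (t, t+x]: R(x|t) = exp(-(m(t+x) - m(t)))."""
          return math.exp(-(mean_failures(t + x, a, b) - mean_failures(t, a, b)))

      a, b = 120.0, 0.02            # hypothetical fitted parameters
      print(round(mean_failures(100, a, b), 1))      # failures expected by t = 100
      print(round(reliability(10, 100, a, b), 3))    # reliability over next 10 time units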

  5. Development of simple immunoradiometric assays using avidin coupled to polystyrene beads as a common solid phase

    International Nuclear Information System (INIS)

    Jyotsna, N.; Singh, Y.; Chouthkanthiwar, V.; Paradkar, S.; Sivaprasad, N.

    1998-01-01

    In this paper, we describe the preparation and application of avidin coupled polystyrene beads as a common solid phase for use in immunoradiometric assays (IRMAs). The assay system is based on two matched commercial monoclonal antibodies, of which, the capture antibody is biotinylated using biotinamidocaproate N-hydroxysuccinimide ester and the detection antibody is radiolabeled with 125 I by conventional Chloramine-T method. Avidin was immobilized on the polystyrene beads through a primary coat of bovine serum albumin using glutaraldhyde activation method. Various factors, such as concentration of reagents, incubation time, etc. were optimised to obtain a simple assay protocol consisting of only two pipetting steps, namely, that of a mixture of the two labelled antibodies (radiolabelled and biotinylated) and of the standard or sample. The advantage of the Avidin-Biotin system is the improved sensitivity, economy of antibody and the possibility to use a common solid phase in assays for different analytes. Using the polystyrene beads along with the novel decanting device, it has been possible to achieve the convenience of the 'coated-tube' technology without the expensive automation necessary for large scale preparation of antibody coated tubes. This protocol has been successfully applied to Prolactin, LH and FSH assays. The sensitivity of the Prolactin assay is 8μIU/mL (0.3 ng/mL), that of the FSH assay is 1mIU/mL and that of the LH assay is 0.9 mIU/mL. The intra-assay and inter-assay variations were <10%. Shelf life of the avidin coupled beads was found to be about 8 months and that of the biotin labelled antibodies up to 18 months. (author)

  6. Quality assurance and reliability

    International Nuclear Information System (INIS)

    Normand, J.; Charon, M.

    1975-01-01

    Concern for obtaining high-quality products which will function properly when required to do so is nothing new - it is one manifestation of a conscientious attitude to work. However, the complexity and cost of equipment and the consequences of even temporary immobilization are such that it has become necessary to make special arrangements for obtaining high-quality products and examining what one has obtained. Each unit within an enterprise must examine its own work or arrange for it to be examined; a unit whose specific task is quality assurance is responsible for overall checking, but does not relieve other units of their responsibility. Quality assurance is a form of mutual assistance within an enterprise, designed to remove the causes of faults as far as possible. It begins very early in a project and continues through the ordering stage, construction, start-up trials and operation. Quality and hence reliability are the direct result of what is done at all stages of a project. They depend on constant attention to detail, for even a minor piece of poor workmanship can, in the case of an essential item of equipment, give rise to serious operational difficulties

  7. Framework and indicator testing protocol for developing and piloting quality indicators for the UK quality and outcomes framework

    NARCIS (Netherlands)

    Campbell, S.M.; Kontopantelis, E.; Hannon, K.; Burke, M.; Barber, A.; Lester, H.E.

    2011-01-01

    BACKGROUND: Quality measures should be subjected to a testing protocol before being used in practice using key attributes such as acceptability, feasibility and reliability, as well as identifying issues derived from actual implementation and unintended consequences. We describe the methodologies

  8. [Computerized clinical protocol for occlusion].

    Science.gov (United States)

    Salsench, J; Ferrer, J; Nogueras, J

    1988-11-01

    In making a protocol it is necessary that all members of the team who are going to collect information have the same unity of criterion about the different variables that compose it. The drawing up of this document is as much or more necessary than the protocol itself. In this work we present all the data collected in the protocol and give the explanations of each concept.

  9. Reliability of Oronasal Fistula Classification.

    Science.gov (United States)

    Sitzman, Thomas J; Allori, Alexander C; Matic, Damir B; Beals, Stephen P; Fisher, David M; Samson, Thomas D; Marcus, Jeffrey R; Tse, Raymond W

    2018-01-01

    Objective Oronasal fistula is an important complication of cleft palate repair that is frequently used to evaluate surgical quality, yet reliability of fistula classification has never been examined. The objective of this study was to determine the reliability of oronasal fistula classification both within individual surgeons and between multiple surgeons. Design Using intraoral photographs of children with repaired cleft palate, surgeons rated the location of palatal fistulae using the Pittsburgh Fistula Classification System. Intrarater and interrater reliability scores were calculated for each region of the palate. Participants Eight cleft surgeons rated photographs obtained from 29 children. Results Within individual surgeons reliability for each region of the Pittsburgh classification ranged from moderate to almost perfect (κ = .60-.96). By contrast, reliability between surgeons was lower, ranging from fair to substantial (κ = .23-.70). Between-surgeon reliability was lowest for the junction of the soft and hard palates (κ = .23). Within-surgeon and between-surgeon reliability were almost perfect for the more general classification of fistula in the secondary palate (κ = .95 and κ = .83, respectively). Conclusions This is the first reliability study of fistula classification. We show that the Pittsburgh Fistula Classification System is reliable when used by an individual surgeon, but less reliable when used among multiple surgeons. Comparisons of fistula occurrence among surgeons may be subject to less bias if they use the more general classification of "presence or absence of fistula of the secondary palate" rather than the Pittsburgh Fistula Classification System.
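
    Since the reported reliability figures are kappa statistics, a short sketch of how an inter-rater kappa is computed from two surgeons' ratings may help; the rating lists are invented and the implementation is a generic unweighted Cohen's kappa, not the exact analysis pipeline of the study.

      # Hedged sketch: unweighted Cohen's kappa for agreement between two raters.
      # The example ratings (fistula present/absent per photograph) are invented.
      from collections import Counter

      def cohens_kappa(rater_a, rater_b):
          n = len(rater_a)
          observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
          freq_a, freq_b = Counter(rater_a), Counter(rater_b)
          expected = sum(freq_a[c] * freq_b[c] for c in set(rater_a) | set(rater_b)) / n ** 2
          return (observed - expected) / (1 - expected)

      rater_a = ["present", "absent", "absent", "present", "absent", "absent"]
      rater_b = ["present", "absent", "present", "present", "absent", "absent"]
      print(round(cohens_kappa(rater_a, rater_b), 2))   # -> 0.67 for this invented example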

  10. Automation of the dicentric chromosome assay and related assays

    International Nuclear Information System (INIS)

    Balajee, Adayabalam S.; Dainiak, Nicholas

    2016-01-01

    Dicentric Chromosome Assay (DCA) is considered to be the 'gold standard' for personalized dose assessment in humans after accidental or incidental radiation exposure. Although this technique is superior to other cytogenetic assays in terms of specificity and sensitivity, its potential application to radiation mass casualty scenarios is highly restricted because DCA is time consuming and labor intensive when performed manually. Therefore, it is imperative to develop high throughput automation techniques to make DCA suitable for radiological triage scenarios. At the Cytogenetic Biodosimetry Laboratory in Oak Ridge, efforts are underway to develop high throughput automation of DCA. Current status on development of various automated cytogenetic techniques in meeting the biodosimetry needs of radiological/nuclear incident(s) will be discussed
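
    The dose-assessment step that such automation ultimately feeds is conventionally based on a linear-quadratic calibration curve relating dicentric yield to absorbed dose; the sketch below inverts such a curve for a hypothetical scored sample, with calibration coefficients that are placeholders rather than laboratory values.

      # Hedged sketch of biological dose estimation from a dicentric yield using a
      # linear-quadratic calibration curve Y = c + alpha*D + beta*D^2.
      # The coefficients below are hypothetical placeholders, not laboratory values.
      import math

      def estimate_dose(dicentrics, cells, c=0.001, alpha=0.03, beta=0.06):
          """Solve beta*D^2 + alpha*D + (c - Y) = 0 for the absorbed dose D (Gy)."""
          y = dicentrics / cells                     # observed dicentric yield per cell
          disc = alpha ** 2 - 4 * beta * (c - y)
          return (-alpha + math.sqrt(disc)) / (2 * beta)

      # Example: 45 dicentrics scored in 500 metaphases -> yield 0.09 per cell.
      print(round(estimate_dose(45, 500), 2), "Gy")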

  11. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

    Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where...... the leading failure mechanism(s) is described by physics of failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena...... identification. Application of the proposed method can be found in many real world systems....

  12. Assay strategies and methods for phospholipases

    International Nuclear Information System (INIS)

    Reynolds, L.J.; Washburn, W.N.; Deems, R.A.; Dennis, E.A.

    1991-01-01

    Of the general considerations discussed, the two issues which are most important in choosing an assay are (1) what sensitivity is required to assay a particular enzyme and (2) whether the assay must be continuous. One can narrow the options further by considering substrate availability, enzyme specificity, assay convenience, or the presence of incompatible side reactions. In addition, the specific preference of a particular phospholipase for polar head group, micellar versus vesicular substrates, and anionic versus nonionic detergents may further restrict the options. Of the many assays described in this chapter, several have limited applicability or serious drawbacks and are not commonly employed. The most commonly used phospholipase assays are the radioactive TLC assay and the pH-stat assay. The TLC assay is probably the most accurate, sensitive assay available. These aspects often outweigh the disadvantages of being discontinuous, tedious, and expensive. The radioactive E. coli assay has become popular recently as an alternative to the TLC assay for the purification of the mammalian nonpancreatic phospholipases. The assay is less time consuming and less expensive than the TLC assay, but it is not appropriate when careful kinetics are required. Where less sensitivity is needed, or when a continuous assay is necessary, the pH-stat assay is often employed. With purified enzymes, when free thiol groups are not present, a spectrophotometric thiol assay can be used. This assay is ∼ as sensitive as the pH-stat assay but is more convenient and more reproducible, although the substrate is not available commercially. Despite the many assay choices available, the search continues for a convenient, generally applicable assay that is both sensitive and continuous

  13. Static Validation of Security Protocols

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, P.

    2005-01-01

    We methodically expand protocol narrations into terms of a process algebra in order to specify some of the checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we demonstrate that these techniques ...... suffice to identify several authentication flaws in symmetric and asymmetric key protocols such as Needham-Schroeder symmetric key, Otway-Rees, Yahalom, Andrew secure RPC, Needham-Schroeder asymmetric key, and Beller-Chang-Yacobi MSR...

  14. [Evaluation of serum PIVKA-II by Lumipulse PrestoII assay].

    Science.gov (United States)

    Hiramatsu, Kumiko; Tanaka, Yasuhito; Takagi, Kazumi; Kani, Satomi; Goto, Takaaki; Takasaka, Yoshimitsu; Matsuura, Kentaro; Sugauchi, Fuminaka; Moriyama, Kazushige; Murakami, Hiroshi; Kitajima, Sachiko; Mizokami, Masashi

    2009-03-01

    Measurements of serum concentrations of Des-gamma-carboxy Prothrombin (PIVKA-II) are widely used for diagnosing hepatocellular carcinoma (HCC). Recently, in the Lumipulse f assay, it was reported that antibodies against alkaline phosphatase (ALP) derived from anti-bleeding sheets led to falsely high values of PIVKA-II in patients after HCC resection. To address this issue, the newly developed Lumipulse PrestoII assay was examined. (1) The assay was reliable and positively correlated with the previous assays (Lumipulse f and Picolumi; R = 0.997 and 0.994 (n=115), respectively). (2) Eleven cases, which had falsely high values of PIVKA-II by the Lumipulse f assay, were examined by the PrestoII assay with an excess of inactive ALP. The falsely high values of 10 cases were improved, but one was still high. The false reactivity of this case was stronger than in the other cases, so more effective adsorption was required. (3) Comparing the adsorbent activity of inactive ALP among 6 different kinds, we found an inactive ALP with much higher adsorbent activity. When this inactive ALP was applied to the assay, falsely high values of PIVKA-II were improved in all 11 cases. In conclusion, the PrestoII assay, which applies the inactive ALP with high activity, is reliable and useful for clinical screening.

  15. How reliable are Functional Movement Screening scores? A systematic review of rater reliability.

    Science.gov (United States)

    Moran, Robert W; Schneiders, Anthony G; Major, Katherine M; Sullivan, S John

    2016-05-01

    Several physical assessment protocols to identify intrinsic risk factors for injury aetiology related to movement quality have been described. The Functional Movement Screen (FMS) is a standardised, field-expedient test battery intended to assess movement quality and has been used clinically in preparticipation screening and in sports injury research. To critically appraise and summarise research investigating the reliability of scores obtained using the FMS battery. Systematic literature review. Systematic search of Google Scholar, Scopus (including ScienceDirect and PubMed), EBSCO (including Academic Search Complete, AMED, CINAHL, Health Source: Nursing/Academic Edition), MEDLINE and SPORTDiscus. Studies meeting eligibility criteria were assessed by 2 reviewers for risk of bias using the Quality Appraisal of Reliability Studies checklist. Overall quality of evidence was determined using van Tulder's levels of evidence approach. 12 studies were appraised. Overall, there was a 'moderate' level of evidence in favour of 'acceptable' (intraclass correlation coefficient ≥0.6) inter-rater and intra-rater reliability for composite scores derived from live scoring. For inter-rater reliability of composite scores derived from video recordings there was 'conflicting' evidence, and 'limited' evidence for intra-rater reliability. For inter-rater reliability based on live scoring of individual subtests there was 'moderate' evidence of 'acceptable' reliability (κ≥0.4) for 4 subtests (Deep Squat, Shoulder Mobility, Active Straight-leg Raise, Trunk Stability Push-up) and 'conflicting' evidence for the remaining 3 (Hurdle Step, In-line Lunge, Rotary Stability). This review found 'moderate' evidence that raters can achieve acceptable levels of inter-rater and intra-rater reliability of composite FMS scores when using live ratings. Overall, there were few high-quality studies, and the quality of several studies was impacted by poor study reporting particularly in relation to

  16. Energy efficient routing protocols for wireless sensor networks: comparison and future directions

    Directory of Open Access Journals (Sweden)

    Loganathan Murukesan

    2017-01-01

    Full Text Available Wireless sensor networks consist of nodes with limited resources. Hence, it is important to design protocols or algorithms which increase energy efficiency in order to improve the network lifetime. In this paper, techniques used in the network layer (routing) of the internet protocol stack to achieve energy efficiency are reviewed. Usually, routing protocols are classified into four main schemes: (1) Network Structure, (2) Communication Model, (3) Topology Based, and (4) Reliable Routing. In this work, only network structure based routing protocols are reviewed due to the page constraint. Besides, this type of protocol is popular among researchers since it is fairly simple to implement and produces good results, as presented in this paper. Also, the pros and cons of each protocol are presented. Finally, the paper concludes with possible further research directions.

  17. QoS-aware self-adaptation of communication protocols in a pervasive service middleware

    DEFF Research Database (Denmark)

    Zhang, Weishan; Hansen, Klaus Marius; Fernandes, João

    2010-01-01

    Pervasive computing is characterized by heterogeneous devices that usually have scarce resources requiring optimized usage. These devices may use different communication protocols which can be switched at runtime. As different communication protocols have different quality of service (QoS) properties, this motivates optimized self-adaptation of protocols for devices, e.g., considering power consumption and other QoS requirements such as round trip time (RTT) for service invocations, throughput, and reliability. In this paper, we present an extensible approach for self-adaptation of communication protocols for pervasive web services, where protocols are designed as reusable connectors and our middleware infrastructure can hide the complexity of using different communication protocols from upper layers. We also propose to use Genetic Algorithms (GAs) to find optimized configurations at runtime...
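    As a rough sketch of how a genetic algorithm could search for a QoS-optimized protocol assignment at runtime: the protocol names, QoS figures and weights below are hypothetical, and this is not the paper's actual middleware implementation.

        import random

        PROTOCOLS = [            # hypothetical per-protocol QoS figures
            {"name": "SOAP/HTTP", "power": 9.0, "rtt": 40.0},
            {"name": "CoAP",      "power": 3.0, "rtt": 25.0},
            {"name": "MQTT",      "power": 4.0, "rtt": 30.0},
        ]
        N_DEVICES = 8
        W_POWER, W_RTT = 0.6, 0.4    # hypothetical QoS weights

        def fitness(cfg):
            # lower weighted cost is better; cfg[i] selects the protocol for device i
            cost = sum(W_POWER * PROTOCOLS[g]["power"] + W_RTT * PROTOCOLS[g]["rtt"] for g in cfg)
            return -cost

        def evolve(pop_size=30, generations=50, p_mut=0.1):
            pop = [[random.randrange(len(PROTOCOLS)) for _ in range(N_DEVICES)]
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents = pop[: pop_size // 2]
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, N_DEVICES)          # one-point crossover
                    child = a[:cut] + b[cut:]
                    if random.random() < p_mut:                   # random mutation
                        child[random.randrange(N_DEVICES)] = random.randrange(len(PROTOCOLS))
                    children.append(child)
                pop = parents + children
            return max(pop, key=fitness)

        best = evolve()
        print([PROTOCOLS[g]["name"] for g in best])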

  18. Unconditionally Secure Protocols

    DEFF Research Database (Denmark)

    Meldgaard, Sigurd Torkel

    This thesis contains research on the theory of secure multi-party computation (MPC), especially information theoretically (as opposed to computationally) secure protocols. It contains results from two main lines of work. One line is on Information Theoretically Secure Oblivious RAMs, and how they are used to speed up secure computation. An Oblivious RAM is a construction for a client with a small $O(1)$ internal memory to store $N$ pieces of data on a server while revealing nothing more than the size of the memory $N$ and the number of accesses; this specifically includes hiding the access pattern. We construct an oblivious RAM that hides the client's access pattern with information theoretic security with an amortized $\log^3 N$ query overhead, and show how to employ a second server that is guaranteed not to conspire with the first to improve the overhead to $\log^2 N$, while also avoiding...

  19. Protocols for Scholarly Communication

    CERN Document Server

    Pepe, Alberto; Pepe, Alberto; Yeomans, Joanne

    2007-01-01

    CERN, the European Organization for Nuclear Research, has operated an institutional preprint repository for more than 10 years. The repository contains over 850,000 records of which more than 450,000 are full-text OA preprints, mostly in the field of particle physics, and it is integrated with the library's holdings of books, conference proceedings, journals and other grey literature. In order to encourage effective propagation and open access to scholarly material, CERN is implementing a range of innovative library services into its document repository: automatic keywording, reference extraction, collaborative management tools and bibliometric tools. Some of these services, such as user reviewing and automatic metadata extraction, could make up an interesting testbed for future publishing solutions and certainly provide an exciting environment for e-science possibilities. The future protocol for scientific communication should naturally guide authors towards OA publication and CERN wants to help reach a full...

  20. Hippocampal MRI volumetry at 3 Tesla: reliability and practical guidance.

    Science.gov (United States)

    Jeukens, Cécile R L P N; Vlooswijk, Mariëlle C G; Majoie, H J Marian; de Krom, Marc C T F M; Aldenkamp, Albert P; Hofman, Paul A M; Jansen, Jacobus F A; Backes, Walter H

    2009-09-01

    Although volumetry of the hippocampus is considered to be an established technique, protocols reported in literature are not described in great detail. This article provides a complete and detailed protocol for hippocampal volumetry applicable to T1-weighted magnetic resonance (MR) images acquired at 3 Tesla, which has become the standard for structural brain research. The protocol encompasses T1-weighted image acquisition at 3 Tesla, anatomic guidelines for manual hippocampus delineation, requirements of delineation software, reliability measures, and criteria to assess and ensure sufficient reliability. Moreover, the validity of the correction for total intracranial volume size was critically assessed. The protocol was applied by 2 readers to the MR images of 36 patients with cryptogenic localization-related epilepsy, 4 patients with unilateral hippocampal sclerosis, and 20 healthy control subjects. The uncorrected hippocampal volumes were 2923 +/- 500 mm3 (mean +/- SD) (left) and 3120 +/- 416 mm3 (right) for the patient group and 3185 +/- 411 mm3 (left) and 3302 +/- 411 mm3 (right) for the healthy control group. The volume of the 4 pathologic hippocampi of the patients with unilateral hippocampal sclerosis was 2980 +/- 422 mm3. The inter-reader reliability values were determined: intraclass-correlation-coefficient (ICC) = 0.87 (left) and 0.86 (right), percentage volume difference (VD) = 7.0 +/- 4.7% (left) and 6.0 +/- 3.8% (right), and overlap ratio (OR) = 0.82 +/- 0.04 (left) and 0.82 +/- 0.03 (right). The positive Pearson correlation between hippocampal volume and total intracranial volume was found to be low: r = 0.48 (P = 0.03, left) and r = 0.62 (P = 0.004, right) and did not significantly reduce the volumetric variances, showing the limited benefit of the brain size correction. A protocol was described to determine hippocampal volumes based on 3 Tesla MR images with high inter-reader reliability. Although the reliability of hippocampal volumetry at 3 Tesla
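    For readers who want to reproduce the two agreement measures quoted above, the sketch below computes a percentage volume difference and an overlap ratio from two raters' binary segmentation masks. The exact definitions used in the protocol may differ (the overlap ratio here is taken as intersection over union), so treat this as an illustrative assumption rather than the published formulas.

        import numpy as np

        def volume_difference_pct(mask_a, mask_b, voxel_volume_mm3=1.0):
            """Percentage volume difference between two binary hippocampus masks."""
            va = mask_a.sum() * voxel_volume_mm3
            vb = mask_b.sum() * voxel_volume_mm3
            return abs(va - vb) / ((va + vb) / 2.0) * 100.0

        def overlap_ratio(mask_a, mask_b):
            """Overlap ratio, here taken as intersection over union (Jaccard index)."""
            inter = np.logical_and(mask_a, mask_b).sum()
            union = np.logical_or(mask_a, mask_b).sum()
            return inter / union

        # toy example with two slightly different delineations
        a = np.zeros((10, 10, 10), dtype=bool); a[2:8, 2:8, 2:8] = True
        b = np.zeros((10, 10, 10), dtype=bool); b[3:8, 2:8, 2:8] = True
        print(volume_difference_pct(a, b), overlap_ratio(a, b))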

  1. Development of a Taqman real-time PCR assay for rapid detection and quantification of Vibrio tapetis in extrapallial fluids of clams

    Directory of Open Access Journals (Sweden)

    Adeline Bidault

    2015-12-01

    Full Text Available The Gram-negative bacterium Vibrio tapetis is known as the causative agent of Brown Ring Disease (BRD) in the Manila clam Venerupis (=Ruditapes) philippinarum. This bivalve is the second most important species produced in aquaculture and has a high commercial value. In spite of the development of several molecular methods, no survey has yet been achieved to rapidly quantify the bacterium in the clam. In this study, we developed a Taqman real-time PCR assay targeting the virB4 gene for accurate and quantitative identification of V. tapetis strains pathogenic to clams. Sensitivity and reproducibility of the method were assessed using either filtered sea water or extrapallial fluids of clams injected with the CECT4600T V. tapetis strain. Quantification curves of the V. tapetis strain seeded in filtered seawater (FSW) or extrapallial fluid (EF) samples were equivalent, showing reliable qPCR efficiencies. With this protocol, we were able to specifically detect V. tapetis strains down to 1.125 × 10¹ bacteria per mL of EF or FSW, taking into account the dilution factor used for appropriate template DNA preparation. This qPCR assay allowed us to monitor V. tapetis loads in both experimentally and naturally infected Manila clams. This technique will be particularly useful for monitoring the kinetics of massive infections by V. tapetis and for designing appropriate control measures for aquaculture purposes.
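    A minimal sketch of the standard-curve arithmetic behind this kind of absolute qPCR quantification: the Cq values, copy numbers and dilution factor below are hypothetical, not the published calibration for the virB4 assay.

        import numpy as np

        # hypothetical dilution series: log10(copies per reaction) vs measured Cq
        log10_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0, 1.0])
        cq = np.array([16.1, 19.5, 22.9, 26.4, 29.8, 33.3])

        slope, intercept = np.polyfit(log10_copies, cq, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0    # ~1.0 means 100 % amplification efficiency

        def quantify(cq_sample, dilution_factor=1.0):
            """Back-calculate copies per reaction from a sample Cq, then correct for dilution."""
            return 10 ** ((cq_sample - intercept) / slope) * dilution_factor

        print(round(efficiency, 2), quantify(29.8, dilution_factor=10.0))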

  2. Reliability of reactor materials

    International Nuclear Information System (INIS)

    Toerroenen, K.; Aho-Mantila, I.

    1986-05-01

    This report is the final technical report of the fracture mechanics part of the Reliability of Reactor Materials Programme, which was carried out at the Technical Research Centre of Finland (VTT) through the years 1981 to 1983. Research and development work was carried out in five major areas, viz. statistical treatment and modelling of cleavage fracture, crack arrest, ductile fracture, instrumented impact testing, as well as comparison of numerical and experimental elastic-plastic fracture mechanics. In the area of cleavage fracture the critical variables affecting the fracture of steels are considered within the framework of a statistical model, the so-called WST model. Comparison of fracture toughness values predicted by the model and corresponding experimental values shows excellent agreement for a variety of microstructures. Different possibilities for using the model are discussed. The development work in the area of crack arrest testing was concentrated on crack starter properties, test arrangement and computer control. A computerized elastic-plastic fracture testing method with a variety of test specimen geometries in a large temperature range was developed to the routine stage. Ductile fracture characteristics of reactor pressure vessel steel A533B and comparable weld material are given. The features of a new, patented instrumented impact tester are described. Experimental and theoretical comparisons between the new and conventional testers clearly indicated the improvements achieved with the new tester. A comparison of numerical and experimental elastic-plastic fracture mechanics capabilities at VTT was carried out. The comparison consisted of two-dimensional linear elastic as well as elastic-plastic finite element analysis of four specimen geometries and equivalent experimental tests. (author)

  3. Field reliability of electronic systems

    International Nuclear Information System (INIS)

    Elm, T.

    1984-02-01

    This report investigates, through several examples from the field, the reliability of electronic units in a broader sense. That is, it treats not just random parts failure, but also inadequate reliability design and (externally and internally) induced failures. The report is not meant to be merely an indication of the state of the art of the reliability prediction methods we know, but also a contribution to the investigation of the man-machine interplay in the operation and repair of electronic equipment. The report firmly links electronics reliability to safety and risk analysis approaches, with a broader, system-oriented view of reliability prediction and with post-failure stress analysis. It is intended to reveal, in a qualitative manner, the existence of symptom and cause patterns. It provides a background for further investigations to identify the detailed mechanisms of the faults and the remedial actions and precautions for achieving cost-effective reliability. (author)

  4. Reliability Assessment Of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2014-01-01

    Reduction of the cost of energy for wind turbines is very important in order to make wind energy competitive compared to other energy sources. Therefore the turbine components should be designed to have sufficient reliability but also not be too costly (and safe). This paper presents models for uncertainty modeling and reliability assessment of especially the structural components such as tower, blades, substructure and foundation. But since the function of a wind turbine is highly dependent on many electrical and mechanical components as well as a control system, reliability aspects of these components are also discussed, and it is described how their reliability influences the reliability of the structural components. Two illustrative examples are presented considering uncertainty modeling, reliability assessment and calibration of partial safety factors for structural wind turbine components exposed...

  5. Reliability engineering theory and practice

    CERN Document Server

    Birolini, Alessandro

    2014-01-01

    This book shows how to build in, evaluate, and demonstrate reliability and availability of components, equipment, and systems. It presents the state-of-the-art of reliability engineering, both in theory and practice, and is based on the author's more than 30 years of experience in this field, half in industry and half as Professor of Reliability Engineering at the ETH, Zurich. The structure of the book allows rapid access to practical results. This final edition extends and replaces all previous editions. New are, in particular, a strategy to mitigate incomplete coverage, a comprehensive introduction to human reliability with design guidelines and new models, and a refinement of reliability allocation, design guidelines for maintainability, and concepts related to regenerative stochastic processes. The set of problems for homework has been extended. Methods & tools are given in a way that they can be tailored to cover different reliability requirement levels and be used for safety analysis. Because of the Appendice...

  6. Reliability of Wireless Sensor Networks

    Science.gov (United States)

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (by increasing the network lifetime) and increase the reliability of the network (by improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (multipath strategy), which is important for reliability, but they significantly increase the WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs considering the battery level as a key factor. Moreover, this model is based on routing algorithms used by WSNs. In order to evaluate the proposed models, three scenarios were considered to show the impact of the power consumption on the reliability of WSNs. PMID:25157553
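    The sketch below illustrates the kind of calculation involved, treating a path as a series system of links and nodes and multipath routing as a parallel system, with a battery-dependent availability factor per node. This is a generic textbook-style model under stated assumptions, not the paper's actual formulation.

        def path_reliability(link_reliabilities, battery_levels):
            """Reliability of one path: product of per-hop link reliabilities,
            each scaled by a battery-dependent node availability (hypothetical model)."""
            r = 1.0
            for link, battery in zip(link_reliabilities, battery_levels):
                r *= link * battery
            return r

        def multipath_reliability(paths):
            """Packet delivered if at least one path succeeds (paths assumed independent)."""
            fail = 1.0
            for p in paths:
                fail *= 1.0 - p
            return 1.0 - fail

        # example: two disjoint 3-hop paths with different battery levels
        p1 = path_reliability([0.95, 0.90, 0.92], [1.0, 0.8, 0.9])
        p2 = path_reliability([0.90, 0.93, 0.90], [0.7, 1.0, 0.95])
        print(multipath_reliability([p1, p2]))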

  7. ADAPTIVE GOSSIP BASED PROTOCOL FOR ENERGY EFFICIENT MOBILE ADHOC NETWORK

    Directory of Open Access Journals (Sweden)

    S. Rajeswari

    2012-03-01

    Full Text Available In the Gossip Sleep Protocol, network performance is enhanced based on energy resources, but energy conservation is achieved at the cost of reduced throughput. In this paper, a new protocol for Mobile Ad hoc Networks is proposed to achieve reliability together with energy conservation. Based on the gossip probability (p), the number of sleeping nodes is fixed initially. The probability value can be adaptively adjusted by a Remote Activated Switch during the transmission process. The adaptiveness of the gossiping probability is determined by the Packet Delivery Ratio. For performance comparison, we have taken routing overhead, Packet Delivery Ratio, number of dropped packets and energy consumption with an increasing number of forwarding nodes. We used UDP-based traffic models to analyze the performance of this protocol and TCP-based traffic models for average end-to-end delay. We used the NS-2 simulator.
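    A minimal sketch of the adaptive idea, assuming p is nudged up when the measured Packet Delivery Ratio falls below a target and down otherwise. The target, step size and bounds are hypothetical, and this is not the paper's Remote Activated Switch logic.

        import random

        class AdaptiveGossip:
            def __init__(self, p=0.7, target_pdr=0.95, step=0.05, p_min=0.1, p_max=1.0):
                self.p = p                    # current gossip (forwarding) probability
                self.target_pdr = target_pdr  # desired Packet Delivery Ratio
                self.step = step
                self.p_min, self.p_max = p_min, p_max

            def should_forward(self):
                """Decide probabilistically whether this node rebroadcasts a packet."""
                return random.random() < self.p

            def update(self, measured_pdr):
                """Adapt p from the PDR observed over the last measurement window."""
                if measured_pdr < self.target_pdr:
                    self.p = min(self.p_max, self.p + self.step)   # too many losses: gossip more
                else:
                    self.p = max(self.p_min, self.p - self.step)   # reliable enough: save energy

        g = AdaptiveGossip()
        g.update(0.90)
        print(g.p, g.should_forward())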

  8. The value of service reliability

    OpenAIRE

    Benezech, Vincent; Coulombel, Nicolas

    2013-01-01

    This paper studies the impact of service frequency and reliability on the choice of departure time and the travel cost of transit users. When the user has (α, β, γ) scheduling preferences, we show that the optimal head start decreases with service reliability, as expected. It does not necessarily decrease with service frequency, however. We derive the value of service headway (VoSH) and the value of service reliability (VoSR), which measure the marginal effect on the e...
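    For reference, the standard (α, β, γ) scheduling-preference cost that this line of work builds on can be written as below; this is the generic textbook formulation, not necessarily the paper's exact notation. Here $T$ is the in-vehicle travel time, $t_a$ the arrival time, $t^{*}$ the preferred arrival time, and $(x)^{+} = \max(x, 0)$.

        \[
          C \;=\; \alpha\,\mathbb{E}[T] \;+\; \beta\,\mathbb{E}\big[(t^{*} - t_a)^{+}\big] \;+\; \gamma\,\mathbb{E}\big[(t_a - t^{*})^{+}\big]
        \]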

  9. Distribution-Independent Reliable Learning

    OpenAIRE

    Kanade, Varun; Thaler, Justin

    2014-01-01

    We study several questions in the reliable agnostic learning framework of Kalai et al. (2009), which captures learning tasks in which one type of error is costlier than others. A positive reliable classifier is one that makes no false positive errors. The goal in the positive reliable agnostic framework is to output a hypothesis with the following properties: (i) its false positive error rate is at most $\\epsilon$, (ii) its false negative error rate is at most $\\epsilon$ more than that of the...

  10. Industrial wireless sensor networks applications, protocols, and standards

    CERN Document Server

    Güngör, V Çagri

    2013-01-01

    The collaborative nature of industrial wireless sensor networks (IWSNs) brings several advantages over traditional wired industrial monitoring and control systems, including self-organization, rapid deployment, flexibility, and inherent intelligent processing. In this regard, IWSNs play a vital role in creating more reliable, efficient, and productive industrial systems, thus improving companies' competitiveness in the marketplace. Industrial Wireless Sensor Networks: Applications, Protocols, and Standards examines the current state of the art in industrial wireless sensor networks and outline

  11. A global protocol for monitoring of coral bleaching

    OpenAIRE

    Oliver, J.; Setiasih, N.; Marshall, P.; Hansen, L.

    2004-01-01

    Coral bleaching and subsequent mortality represent a major threat to the future health and productivity of coral reefs. However a lack of reliable data on occurrence, severity and other characteristics of bleaching events hampers research on the causes and consequences of this important phenomenon. This article describes a global protocol for monitoring coral bleaching events, which addresses this problem and can be used by people with different levels of expertise and resources.

  12. Evaluation of the HISCL Anti-Treponema pallidum Assay as a Screening Test for Syphilis

    OpenAIRE

    An, Jingna; Chen, Qixia; Liu, Qianqian; Rao, Chenli; Li, Dongdong; Wang, Tingting; Tao, Chuanmin; Wang, Lanlan

    2015-01-01

    The resurgence of syphilis in recent years has become a serious threat to public health worldwide, and the serological detection of specific antibodies against Treponema pallidum remains the most reliable method for laboratory diagnosis of syphilis. This study examined the performance of the recently launched HISCL anti-Treponema pallidum (anti-TP) assay as a screening test for syphilis in a high-volume laboratory. The HISCL anti-TP assay was tested in 300 preselected syphilis-positive sample...

  13. RTE - 2015 Reliability Report. Summary

    International Nuclear Information System (INIS)

    2016-01-01

    Every year, RTE produces a reliability report for the past year. This report includes a number of results from previous years so that year-to-year comparisons can be drawn and long-term trends analysed. The 2015 report underlines the major factors that have impacted on the reliability of the electrical power system, without focusing exclusively on Significant System Events (ESS). It describes various factors which contribute to present and future reliability and the numerous actions implemented by RTE to ensure reliability today and in the future, as well as the ways in which the various parties involved in the electrical power system interact across the whole European interconnected network

  14. Nondestructive assay of SALE materials

    International Nuclear Information System (INIS)

    Rodenburg, W.W.; Fleissner, J.G.

    1981-01-01

    This paper covers three primary areas: (1) reasons for performing nondestructive assay on SALE materials; (2) techniques used; and (3) discussion of investigators' revised results. The study shows that nondestructive calorimetric assay of plutonium offers a viable alternative to traditional wet chemical techniques. For these samples, the precision ranged from 0.4 to 0.6% with biases less than 0.2%. Thus, for those materials where sampling errors are the predominant source of uncertainty, this technique can provide improved accuracy and precision while saving time and money as well as reducing the amount of liquid wastes to be handled. In addition, high resolution gamma-ray spectroscopy measurements of solids can provide isotopic analysis data in a cost effective and timely manner. The timeliness of the method can be especially useful to the plant operator for production control and quality control measurements

  15. Comet Assay in Cancer Chemoprevention.

    Science.gov (United States)

    Santoro, Raffaela; Ferraiuolo, Maria; Morgano, Gian Paolo; Muti, Paola; Strano, Sabrina

    2016-01-01

    The comet assay can be useful in monitoring DNA damage in single cells caused by exposure to genotoxic agents, such as those causing air, water, and soil pollution (e.g., pesticides, dioxins, electromagnetic fields) and chemo- and radiotherapy in cancer patients, or in the assessment of the genoprotective effects of chemopreventive molecules. Therefore, it has particular importance in the fields of pharmacology and toxicology, and in both environmental and human biomonitoring. It allows the detection of single-strand breaks as well as double-strand breaks and can be used in both normal and cancer cells. Here we describe the alkali method for the comet assay, which allows the detection of both single- and double-strand DNA breaks.

  16. Radioreceptor assay for somatomedin A

    Energy Technology Data Exchange (ETDEWEB)

    Takano, K [Tokyo Women's Medical Coll. (Japan)]

    1975-04-01

    A method for measuring somatomedin A by radioreceptor assay using the human placental membrane is described and discussed. The binding rate of ¹²⁵I-somatomedin A to its receptors was studied under various conditions of incubation time and temperature, and pH of the system. The influence of somatomedin A, porcine insulin, and porcine calcitonin on receptor-bound ¹²⁵I-somatomedin A was studied, and these hormones showed competitive binding to somatomedin A receptors to some extent. The specificity, recovery rate, and clinical applications of the somatomedin A assay are also discussed. The radioreceptor assay for somatomedin A provided easier, faster, and more accurate measurements than conventional bioassay. This technique would be very useful for studying the somatomedin A receptor and the functions of insulin.

  17. Impact of ubiquitous inhibitors on the GUS gene reporter system: evidence from the model plants Arabidopsis, tobacco and rice and correction methods for quantitative assays of transgenic and endogenous GUS

    Directory of Open Access Journals (Sweden)

    Gerola Paolo D

    2009-12-01

    Full Text Available Abstract Background The β-glucuronidase (GUS) gene reporter system is one of the most effective and widely employed techniques in the study of gene regulation in plant molecular biology. Improved protocols for GUS assays have rendered the original method described by Jefferson amenable to various requirements and conditions, but the serious limitation caused by inhibitors of the enzyme activity in plant tissues has thus far been underestimated. Results We report that inhibitors of GUS activity are ubiquitous in organ tissues of Arabidopsis, tobacco and rice, and significantly bias the quantitative assessment of GUS activity in plant transformation experiments. Combined with previous literature reports on non-model species, our findings suggest that inhibitors may be common components of plant cells, with variable affinity towards the E. coli enzyme. The reduced inhibitory capacity towards the endogenous plant GUS discredits the hypothesis of a regulatory role of these compounds in plant cells, and their effect on the bacterial enzyme is better interpreted as a side effect due to their interaction with GUS during the assay. This is likely to have a bearing also on histochemical analyses, leading to inaccurate evaluations of GUS expression. Conclusions In order to achieve reliable results, inhibitor activity should be routinely tested during quantitative GUS assays. Two separate methods to correct the measured activity of the transgenic and endogenous GUS are presented.
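    One simple way to test and correct for such inhibition is a spike-recovery calculation, sketched below. This is a generic correction under stated assumptions, not necessarily either of the two methods presented in the paper.

        def corrected_gus_activity(measured, measured_spiked, spike_activity):
            """Spike-recovery correction for inhibitor-containing plant extracts.
            measured:        GUS activity of the extract alone
            measured_spiked: activity after adding a known amount of purified GUS
            spike_activity:  activity the spike alone would give in inhibitor-free buffer"""
            recovery = (measured_spiked - measured) / spike_activity   # fraction of spike recovered
            return measured / recovery

        # hypothetical example: only 60 % of a 100-unit spike is recovered
        print(corrected_gus_activity(measured=42.0, measured_spiked=102.0, spike_activity=100.0))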

  18. Assay of vitamin B12

    International Nuclear Information System (INIS)

    Tovey, K.C.; Carrick, D.T.

    1982-01-01

    A radioassay is described for vitamin B12 which involves denaturing serum protein binding proteins with alkali. In the denaturation step a dithiopolyol and cyanide are used and in the intrinsic factor assay step a vitamin B12 analogue such as cobinamide is used to bind with any remaining serum proteins. The invention also includes a kit in which the dithiopolyol is provided in admixture with the alkali. The dithiopolyol may be dithiothreitol or dithioerythritol. (author)

  19. Detection of proteins using a colorimetric bio-barcode assay.

    Science.gov (United States)

    Nam, Jwa-Min; Jang, Kyung-Jin; Groves, Jay T

    2007-01-01

    The colorimetric bio-barcode assay is a red-to-blue color change-based protein detection method with ultrahigh sensitivity. This assay is based on both the bio-barcode amplification method, which allows for detecting minuscule amounts of targets with attomolar sensitivity, and the gold nanoparticle-based colorimetric DNA detection method, which allows for simple and straightforward detection of biomolecules of interest (here we detect interleukin-2, an important biomarker (cytokine) for many immunodeficiency-related diseases and cancers). The protocol is composed of the following steps: (i) conjugation of target capture molecules and barcode DNA strands onto silica microparticles, (ii) target capture with probes, (iii) separation and release of barcode DNA strands from the separated probes, (iv) detection of released barcode DNA using DNA-modified gold nanoparticle probes and (v) red-to-blue color change analysis with graphics software. Actual target detection and quantification steps with premade probes take approximately 3 h (the whole protocol including probe preparations takes approximately 3 days).

  20. Assay of ribulose bisphosphate carboxylase

    International Nuclear Information System (INIS)

    Pike, C.; Berry, J.

    1987-01-01

    Assays of ribulose bisphosphate carboxylase (rubisco) can be used to illustrate many properties of photosynthetic systems. Many different leaves have been assayed with this standard procedure. The tissue is ground with a mortar and pestle in extraction buffer. The supernatant after centrifugation is used as the source of enzyme. Buffer, RuBP, [¹⁴C]-NaHCO₃, and enzyme are combined in a scintillation vial; the reaction is run for 1 min at 30 °C. The acid-stable products are counted. Reproducibility in student experiments has been excellent. The assay data can be combined with analyses of leaf properties such as fresh and dry weight, chlorophyll and protein content, etc. Students have done projects such as the response of the enzyme to temperature and to various inhibitors. They also report on the use of a transition state analog, carboxyarabinitol bisphosphate, to titrate the molar concentration of rubisco molecules (active sites) in an enzyme sample. Thus, using crude extracts, the catalytic activity of a sample can be compared to the absolute quantity of enzyme or to the turnover number
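    The arithmetic for turning the acid-stable counts into an activity is straightforward; the sketch below is a hypothetical worked example (the counting efficiency, specific radioactivity and protein amount are made-up inputs, not values from the text).

        def rubisco_specific_activity(cpm, counting_efficiency, dpm_per_nmol_hco3,
                                      minutes, mg_protein):
            """Convert acid-stable 14C counts from the fixation assay into a specific activity.
            dpm_per_nmol_hco3 is the specific radioactivity of the [14C]-NaHCO3 in the reaction."""
            dpm = cpm / counting_efficiency              # counts -> disintegrations per minute
            nmol_co2_fixed = dpm / dpm_per_nmol_hco3     # disintegrations -> nmol CO2 fixed
            return nmol_co2_fixed / minutes / mg_protein # nmol CO2 min^-1 (mg protein)^-1

        # hypothetical run: 12,000 cpm, 90 % counting efficiency, 200 dpm/nmol, 1 min, 0.05 mg protein
        print(rubisco_specific_activity(12000, 0.90, 200.0, 1.0, 0.05))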

  1. Gene probes: principles and protocols

    National Research Council Canada - National Science Library

    Aquino de Muro, Marilena; Rapley, Ralph

    2002-01-01

    ... of labeled DNA has allowed genes to be mapped to single chromosomes and in many cases to a single chromosome band, promoting significant advance in human genome mapping. Gene Probes: Principles and Protocols presents the principles for gene probe design, labeling, detection, target format, and hybridization conditions together with detailed protocols, accom...

  2. Developing frameworks for protocol implementation

    NARCIS (Netherlands)

    de Barros Barbosa, C.; Ferreira Pires, Luis

    1999-01-01

    This paper presents a method to develop frameworks for protocol implementation. Frameworks are software structures developed for a specific application domain, which can be reused in the implementation of various different concrete systems in this domain. The use of frameworks support a protocol

  3. Infrared microspectroscopy of live cells in microfluidic devices (MD-IRMS): toward a powerful label-free cell-based assay.

    Science.gov (United States)

    Vaccari, L; Birarda, G; Businaro, L; Pacor, S; Grenci, G

    2012-06-05

    Until now, most infrared microspectroscopy (IRMS) experiments on biological specimens (i.e., tissues or cells) have been routinely carried out on fixed or dried samples in order to circumvent water absorption problems. In this paper, we demonstrate the possibility of widening the range of in-vitro IRMS experiments to the vibrational analysis of live cellular samples, thanks to the development of novel biocompatible IR-visible transparent microfluidic devices (MD). In order to highlight the biological relevance of IRMS in MD (MD-IRMS), we performed a systematic exploration of the biochemical alterations induced by different fixation protocols, ethanol 70% and formaldehyde solution 4%, as well as air-drying, on U937 leukemic monocytes by comparing their IR vibrational features with those of the live U937 counterpart. Both fixation and air-drying procedures affected lipid composition and order as well as protein structure to different extents, while both induced structural alterations in nucleic acids. Therefore, only IRMS of live cells can provide reliable information on both DNA and RNA structure and on their cellular dynamics. In summary, we show that MD-IRMS of live cells is feasible, reliable, and biologically relevant, and deserves to be recognized as a label-free cell-based assay.

  4. Matrix metalloproteinase activity assays: Importance of zymography.

    Science.gov (United States)

    Kupai, K; Szucs, G; Cseh, S; Hajdu, I; Csonka, C; Csont, T; Ferdinandy, P

    2010-01-01

    Matrix metalloproteinases (MMPs) are zinc-dependent endopeptidases capable of degrading extracellular matrix, including the basement membrane. MMPs are associated with various physiological processes such as morphogenesis, angiogenesis, and tissue repair. Moreover, due to the novel non-matrix related intra- and extracellular targets of MMPs, dysregulation of MMP activity has been implicated in a number of acute and chronic pathological processes, such as arthritis, acute myocardial infarction, chronic heart failure, chronic obstructive pulmonary disease, inflammation, and cancer metastasis. MMPs are considered as viable drug targets in the therapy of the above diseases. For the development of selective MMP inhibitor molecules, reliable methods are necessary for target validation and lead development. Here, we discuss the major methods used for MMP assays, focusing on substrate zymography. We highlight some problems frequently encountered during sample preparations, electrophoresis, and data analysis of zymograms. Zymography is a widely used technique to study extracellular matrix-degrading enzymes, such as MMPs, from tissue extracts, cell cultures, serum or urine. This simple and sensitive technique identifies MMPs by the degradation of their substrate and by their molecular weight and therefore helps to understand the widespread role of MMPs in different pathologies and cellular pathways. Copyright 2010 Elsevier Inc. All rights reserved.

  5. Immunochromatographic assay of cadmium levels in oysters.

    Science.gov (United States)

    Nishi, Kosuke; Kim, In-Hae; Itai, Takaaki; Sugahara, Takuya; Takeyama, Haruko; Ohkawa, Hideo

    2012-08-15

    Oysters are one of the foodstuffs containing a relatively high amount of cadmium. Here we report on the establishment of an immunochromatographic assay (ICA) method for cadmium levels in oysters. Cadmium was extracted with 0.1 mol L⁻¹ HCl from oysters and cleaned up from other metals by the use of an anion-exchange column. The behavior of five metals, Mn, Fe, Cu, Zn, and Cd, was monitored at each step of the extraction and clean-up procedure for the ICA method in an inductively coupled plasma-mass spectrometry (ICP-MS) analysis. The results revealed that a simple extraction method with the HCl solution was efficient enough to extract almost all of the cadmium from oysters. Clean-up with an anion-exchange column presented almost no loss of cadmium adsorbed on the column and an efficient removal of metals other than cadmium. When a spiked recovery test was performed in the ICA method, the recovery ranged from 98% to 112% with relative standard deviations between 5.9% and 9.2%. The measured values of cadmium in various oyster samples in the ICA method were favorably correlated with those from ICP-MS analysis (r² = 0.97). Overall, the results indicate that the ICA method established in the present study is an adequate and reliable detection method for cadmium levels in oysters. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Validation of IT-based Data Communication Protocol for Nuclear Power Plant

    International Nuclear Information System (INIS)

    Jeong, K. I.; Kim, D. H.; Lee, J. C.

    2009-12-01

    The communication network designed to transmit control and processing signals in digital Instrumentation and Control (I and C) systems in a Nuclear Power Plant (NPP) should provide a high level of safety and reliability. The communication networks of NPPs differ from other commercial communication networks: safety and reliability are the most important factors in the communication networks of an NPP, rather than the efficiency that drives commercial communication network design. To develop a data communication protocol for NPPs, we analyzed the design criteria and performance requirements of existing commercial communication protocols based on Information Technology (IT), and examined their adaptability to the communication protocol of an NPP. Based on these results, we developed our own protocol (Nuclear power plant Safety Communication Protocol: NSCP) for NPP I and C, which meets the required specifications, by designing the overall protocol architecture and data frame format and defining the functional requirements and specifications. NSCP is the communication protocol designed for a safety-grade control network in the nuclear power plant. In this report, we specified the NSCP protocol with a Formal Description Technique (FDT) and established validation procedures based on the validation methodology. Specification errors, the validity of major functions, and the reachability of NSCP were confirmed by performing simulation and the validation process using the Telelogic Tau tool.

  7. Reliability in automotive ethernet networks

    DEFF Research Database (Denmark)

    Soares, Fabio L.; Campelo, Divanilson R.; Yan, Ying

    2015-01-01

    This paper provides an overview of in-vehicle communication networks and addresses the challenges of providing reliability in automotive Ethernet in particular.

  8. Estimation of Bridge Reliability Distributions

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    In this paper it is shown how the so-called reliability distributions can be estimated using crude Monte Carlo simulation. The main purpose is to demonstrate the methodology. Therefore, very exact data concerning reliability and deterioration are not needed. However, it is intended in the paper to ...
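    A minimal sketch of crude Monte Carlo simulation for a reliability estimate, using a hypothetical resistance-minus-load limit state; the distributions and parameters are illustrative, not the bridge models of the paper.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)

        def crude_mc_failure_probability(n_samples=1_000_000):
            """Estimate P(failure) for the limit state g = R - S < 0 by direct sampling."""
            resistance = rng.normal(5.0, 0.5, n_samples)    # hypothetical resistance R
            load_effect = rng.normal(3.0, 0.6, n_samples)   # hypothetical load effect S
            return np.mean(resistance - load_effect < 0.0)

        pf = crude_mc_failure_probability()
        beta = -norm.ppf(pf)          # corresponding reliability index
        print(pf, beta)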

  9. Reliability of wind turbine subassemblies

    NARCIS (Netherlands)

    Spinato, F.; Tavner, P.J.; Bussel, van G.J.W.; Koutoulakos, E.

    2009-01-01

    We have investigated the reliability of more than 6000 modern onshore wind turbines and their subassemblies in Denmark and Germany over 11 years and particularly changes in reliability of generators, gearboxes and converters in a subset of 650 turbines in Schleswig Holstein, Germany. We first start

  10. Reliability engineering theory and practice

    CERN Document Server

    Birolini, Alessandro

    2010-01-01

    Presenting a solid overview of reliability engineering, this volume enables readers to build and evaluate the reliability of various components, equipment and systems. Current applications are presented, and the text itself is based on the author's 30 years of experience in the field.

  11. Hardening Stratum, the Bitcoin Pool Mining Protocol

    Directory of Open Access Journals (Sweden)

    Recabarren Ruben

    2017-07-01

    Full Text Available Stratum, the de-facto mining communication protocol used by blockchain based cryptocurrency systems, enables miners to reliably and efficiently fetch jobs from mining pool servers. In this paper we exploit Stratum’s lack of encryption to develop passive and active attacks on Bitcoin’s mining protocol, with important implications on the privacy, security and even safety of mining equipment owners. We introduce StraTap and ISP Log attacks, that infer miner earnings if given access to miner communications, or even their logs. We develop BiteCoin, an active attack that hijacks shares submitted by miners, and their associated payouts. We build BiteCoin on WireGhost, a tool we developed to hijack and surreptitiously maintain Stratum connections. Our attacks reveal that securing Stratum through pervasive encryption is not only undesirable (due to large overheads), but also ineffective: an adversary can predict miner earnings even when given access to only packet timestamps. Instead, we devise Bedrock, a minimalistic Stratum extension that protects the privacy and security of mining participants. We introduce and leverage the mining cookie concept, a secret that each miner shares with the pool and includes in its puzzle computations, and that prevents attackers from reconstructing or hijacking the puzzles.

  12. Protocols for Scholarly Communication

    Science.gov (United States)

    Pepe, A.; Yeomans, J.

    2007-10-01

    CERN, the European Organization for Nuclear Research, has operated an institutional preprint repository for more than 10 years. The repository contains over 850,000 records of which more than 450,000 are full-text OA preprints, mostly in the field of particle physics, and it is integrated with the library's holdings of books, conference proceedings, journals and other grey literature. In order to encourage effective propagation and open access to scholarly material, CERN is implementing a range of innovative library services into its document repository: automatic keywording, reference extraction, collaborative management tools and bibliometric tools. Some of these services, such as user reviewing and automatic metadata extraction, could make up an interesting testbed for future publishing solutions and certainly provide an exciting environment for e-science possibilities. The future protocol for scientific communication should guide authors naturally towards OA publication, and CERN wants to help reach a full open access publishing environment for the particle physics community and related sciences in the next few years.

  13. Reliability-Based Optimization in Structural Engineering

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Sørensen, John Dalsgaard

    1994-01-01

    In this paper reliability-based optimization problems in structural engineering are formulated on the basis of classical decision theory. Several formulations are presented: reliability-based optimal design of structural systems with component or systems reliability constraints, reliability...

  14. A continuous spectrophotometric assay for monitoring adenosine 5'-monophosphate production.

    Science.gov (United States)

    First, Eric A

    2015-08-15

    A number of biologically important enzymes release adenosine 5'-monophosphate (AMP) as a product, including aminoacyl-tRNA synthetases, cyclic AMP (cAMP) phosphodiesterases, ubiquitin and ubiquitin-like ligases, DNA ligases, coenzyme A (CoA) ligases, polyA deadenylases, and ribonucleases. In contrast to the abundance of assays available for monitoring the conversion of adenosine 5'-triphosphate (ATP) to ADP, there are relatively few assays for monitoring the conversion of ATP (or cAMP) to AMP. In this article, we describe a homogeneous assay that continuously monitors the production of AMP. Specifically, we have coupled the conversion of AMP to inosine 5'-monophosphate (IMP) (by AMP deaminase) to the oxidation of IMP (by IMP dehydrogenase). This results in the reduction of oxidized nicotinamide adenine dinucleotide (NAD+) to reduced nicotinamide adenine dinucleotide (NADH), allowing AMP formation to be monitored by the change in the absorbance at 340 nm. Changes in AMP concentrations of 5 μM or more can be reliably detected. The ease of use and relatively low expense make the AMP assay suitable for both high-throughput screening and kinetic analyses. Copyright © 2015 Elsevier Inc. All rights reserved.
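    Converting the observed A340 change into an AMP production rate is a direct Beer-Lambert calculation, sketched below with the commonly cited NADH extinction coefficient; the stoichiometry assumes one NADH formed per IMP oxidized and one IMP per AMP deaminated, as in the coupled reaction described above.

        EPS_NADH_340 = 6220.0   # M^-1 cm^-1, commonly cited extinction coefficient of NADH at 340 nm

        def amp_rate_uM_per_min(delta_a340_per_min, path_length_cm=1.0):
            """AMP production rate in µM/min from the slope of A340 versus time."""
            molar_per_min = delta_a340_per_min / (EPS_NADH_340 * path_length_cm)
            return molar_per_min * 1e6

        print(amp_rate_uM_per_min(0.031))   # e.g. 0.031 A340/min corresponds to ~5 µM AMP/min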

  15. Evaluation of environmental genotoxicity by comet assay in Columba livia.

    Science.gov (United States)

    González-Acevedo, Anahi; García-Salas, Juan A; Gosálvez, Jaime; Fernández, José Luis; Dávila-Rodríguez, Martha I; Cerda-Flores, Ricardo M; Méndez-López, Luis F; Cortés-Gutiérrez, Elva I

    2016-01-01

    The concentrations of recognized or suspected genotoxic and carcinogenic agents found in the air of large cities and, in particular, developing countries, have raised concerns about the potential for chronic health effects in the populations exposed to them. The biomonitoring of environmental genotoxicity requires the selection of representative organisms as "sentinels," as well as the development of suitable and sensitive assays, such as those aimed at assessing DNA damage. The aim of this study was to evaluate DNA damage levels in erythrocytes from Columba livia living in the metropolitan area of Monterrey, Mexico, compared with control animals via comet assay, and to confirm the results via Micronuclei test (MN) and DNA breakage detection-fluorescence in situ hybridization (DBD-FISH). Our results showed a significant increase in DNA migration in animals from the area assayed compared with that observed in control animals sampled in non-contaminated areas. These results were confirmed by MN test and DBD-FISH. In conclusion, these observations confirm that the examination of erythrocytes from Columba livia via alkaline comet assay provides a sensitive and reliable end point for the detection of environmental genotoxicants.

  16. The comet assay: Reflections on its development, evolution and applications.

    Science.gov (United States)

    Singh, Narendra P

    2016-01-01

    The study of DNA damage and its repair is critical to our understanding of human aging and cancer. This review reflects on the development of a simple technique, now known as the comet assay, to study the accumulation of DNA damage and its repair. It describes my journey into aging research and the need for a method that sensitively quantifies DNA damage on a cell-by-cell basis and on a day-by-day basis. My inspirations, obstacles and successes on the path to developing this assay and improving its reliability and sensitivity are discussed. Recent modifications, applications, and the process of standardizing the technique are also described. What was once untried and unknown has become a technique used around the world for understanding and monitoring DNA damage. The comet assay's use has grown exponentially in the new millennium, as emphasis on studying biological phenomena at the single-cell level has increased. I and others have applied the technique across cell types (including germ cells) and species (including bacteria). As it enters new realms and gains clinical relevance, the comet assay may very well illuminate human aging and its prevention. Copyright © 2016. Published by Elsevier B.V.

  17. An indicator cell assay for blood-based diagnostics.

    Directory of Open Access Journals (Sweden)

    Samuel A Danziger

    Full Text Available We have established proof of principle for the Indicator Cell Assay Platform™ (iCAP™), a broadly applicable tool for blood-based diagnostics that uses specifically-selected, standardized cells as biosensors, relying on their innate ability to integrate and respond to diverse signals present in patients' blood. To develop an assay, indicator cells are exposed in vitro to serum from case or control subjects and their global differential response patterns are used to train reliable disease classifiers based on a small number of features. In a feasibility study, the iCAP detected pre-symptomatic disease in a murine model of amyotrophic lateral sclerosis (ALS) with 94% accuracy (p-value = 3.81E-6) and correctly identified samples from a murine Huntington's disease model as non-carriers of ALS. Beyond the mouse model, in a preliminary human disease study, the iCAP detected early stage Alzheimer's disease with 72% cross-validated accuracy (p-value = 3.10E-3). For both assays, iCAP features were enriched for disease-related genes, supporting the assay's relevance for disease research.
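    The classifier-training step can be illustrated with a short cross-validation sketch; the feature matrix, labels and model choice below are entirely hypothetical stand-ins for the indicator-cell response data, not the authors' actual pipeline.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        # hypothetical indicator-cell response matrix: rows = serum donors, columns = selected features
        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 12))
        y = np.r_[np.ones(20, dtype=int), np.zeros(20, dtype=int)]   # case vs. control labels

        clf = LogisticRegression(max_iter=1000)
        accuracy = cross_val_score(clf, X, y, cv=5).mean()   # cross-validated accuracy, cf. the 72% reported
        print(round(accuracy, 2))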

  18. A precise, efficient radiometric assay for bacterial growth

    International Nuclear Information System (INIS)

    Boonkitticharoen, V.; Ehrhardt, C.; Kirchner, P.T.

    1984-01-01

    The two-compartment radiometric assay for bacterial growth promised major advantages over systems in clinical use, but poor reproducibility and counting efficiency limited its application. In this method, ¹⁴CO₂ produced by bacterial metabolism of C-14-glucose is trapped and counted on filter paper impregnated with NaOH and fluors. The authors sought to improve assay efficiency and precision through a systematic study of relevant physical and chemical factors. Improvements in efficiency (88% vs. 10%) and in precision (relative S.D. 5% vs. 40%) were produced by a) reversing growth medium and scintillator chambers to permit vigorous agitation, b) increasing NaOH quantity and using a supersaturated PPO solution and c) adding detergent to improve uniformity of the NaOH-PPO mixture. Inoculum size, substrate concentration and O₂ transfer rate affected assay sensitivity but not bacterial growth rate. The authors' assay reliably detects bacterial growth for inocula of 10,000 organisms in 1 hour and for 25 organisms within 4 1/2 hours, thus surpassing other existing clinical and research methods

  19. International network for comparison of HIV neutralization assays: the NeutNet report.

    Science.gov (United States)

    Fenyö, Eva Maria; Heath, Alan; Dispinseri, Stefania; Holmes, Harvey; Lusso, Paolo; Zolla-Pazner, Susan; Donners, Helen; Heyndrickx, Leo; Alcami, Jose; Bongertz, Vera; Jassoy, Christian; Malnati, Mauro; Montefiori, David; Moog, Christiane; Morris, Lynn; Osmanov, Saladin; Polonis, Victoria; Sattentau, Quentin; Schuitemaker, Hanneke; Sutthent, Ruengpung; Wrin, Terri; Scarlatti, Gabriella

    2009-01-01

    Neutralizing antibody assessments play a central role in human immunodeficiency virus type-1 (HIV-1) vaccine development but it is unclear which assay, or combination of assays, will provide reliable measures of correlates of protection. To address this, an international collaboration (NeutNet) involving 18 independent participants was organized to compare different assays. Each laboratory evaluated four neutralizing reagents (TriMab, 447-52D, 4E10, sCD4) at a given range of concentrations against a panel of 11 viruses representing a wide range of genetic subtypes and phenotypes. A total of 16 different assays were compared. The assays utilized either uncloned virus produced in peripheral blood mononuclear cells (PBMCs) (virus infectivity assays, VI assays), or their Env-pseudotyped (gp160) derivatives produced in 293T cells (PSV assays) from molecular clones or uncloned virus. Target cells included PBMC and genetically-engineered cell lines in either a single- or multiple-cycle infection format. Infection was quantified by using a range of assay read-outs that included extracellular or intracellular p24 antigen detection, RNA quantification and luciferase and beta-galactosidase reporter gene expression. PSV assays were generally more sensitive than VI assays, but there were important differences according to the virus and inhibitor used. For example, for TriMab, the mean IC50 was always lower in PSV than in VI assays. However, with 4E10 or sCD4 some viruses were neutralized with a lower IC50 in VI assays than in the PSV assays. Inter-laboratory concordance was slightly better for PSV than for VI assays with some viruses, but for other viruses agreement between laboratories was limited and depended on both the virus and the neutralizing reagent. The NeutNet project demonstrated clear differences in assay sensitivity that were dependent on both the neutralizing reagent and the virus. No single assay was capable of detecting the entire spectrum of neutralizing
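    Most of the assays compared here summarize a neutralization curve by its IC50, typically obtained from a four-parameter logistic fit; the sketch below shows that calculation on made-up data (the concentrations and percent-neutralization values are hypothetical, not NeutNet results).

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(conc, bottom, top, ic50, hill):
            """Four-parameter logistic: % neutralization as a function of reagent concentration."""
            return bottom + (top - bottom) / (1.0 + (ic50 / conc) ** hill)

        conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])    # µg/mL, hypothetical dilution series
        neut = np.array([2.0, 8.0, 22.0, 45.0, 70.0, 88.0, 96.0])  # % neutralization, hypothetical

        params, _ = curve_fit(four_pl, conc, neut, p0=[0.0, 100.0, 0.5, 1.0], maxfev=10000)
        bottom, top, ic50, hill = params
        print("IC50 (µg/mL):", round(ic50, 3))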

  20. International network for comparison of HIV neutralization assays: the NeutNet report.

    Directory of Open Access Journals (Sweden)

    Eva Maria Fenyö

    Full Text Available Neutralizing antibody assessments play a central role in human immunodeficiency virus type-1 (HIV-1) vaccine development but it is unclear which assay, or combination of assays, will provide reliable measures of correlates of protection. To address this, an international collaboration (NeutNet) involving 18 independent participants was organized to compare different assays. Each laboratory evaluated four neutralizing reagents (TriMab, 447-52D, 4E10, sCD4) at a given range of concentrations against a panel of 11 viruses representing a wide range of genetic subtypes and phenotypes. A total of 16 different assays were compared. The assays utilized either uncloned virus produced in peripheral blood mononuclear cells (PBMCs) (virus infectivity assays, VI assays), or their Env-pseudotyped (gp160) derivatives produced in 293T cells (PSV assays) from molecular clones or uncloned virus. Target cells included PBMC and genetically-engineered cell lines in either a single- or multiple-cycle infection format. Infection was quantified by using a range of assay read-outs that included extracellular or intracellular p24 antigen detection, RNA quantification and luciferase and beta-galactosidase reporter gene expression. PSV assays were generally more sensitive than VI assays, but there were important differences according to the virus and inhibitor used. For example, for TriMab, the mean IC50 was always lower in PSV than in VI assays. However, with 4E10 or sCD4 some viruses were neutralized with a lower IC50 in VI assays than in the PSV assays. Inter-laboratory concordance was slightly better for PSV than for VI assays with some viruses, but for other viruses agreement between laboratories was limited and depended on both the virus and the neutralizing reagent. The NeutNet project demonstrated clear differences in assay sensitivity that were dependent on both the neutralizing reagent and the virus. No single assay was capable of detecting the entire spectrum of