Sample records for reliably detected future

  1. Sampling for assurance of future reliability

    Klauenberg, Katy; Elster, Clemens


    Ensuring measurement trueness, compliance with regulations and conformity with standards are key tasks in metrology which are often considered at the time of an inspection. Current practice does not always verify quality after or between inspections, calibrations, laboratory comparisons, conformity assessments, etc. Statistical models describing behavior over time may ensure reliability, i.e. they may give the probability of functioning, compliance or survival until some future point in time. It may not always be possible or economic to inspect a whole population of measuring devices or other units. Selecting a subset of the population according to statistical sampling plans and inspecting only this subset allows conclusions about the quality of the whole population with a certain confidence. Combining these issues of sampling and aging raises questions such as: How many devices need to be inspected, and at least how many of them must conform, so that one can be sure that more than 100p% of the population will comply until the next inspection? This research aims to raise awareness of, and to offer a simple answer to, such time- and sample-based quality statements in metrology and beyond. Reliability demonstration methods, such as the prevailing Weibull binomial model, quantify the confidence in future reliability on the basis of a sample. We adapt the binomial model to be applicable to sampling without replacement and simplify the Weibull model so that sampling plans may be determined on the basis of existing ISO standards. Provided the model is suitable, no additional information and no software are needed; and yet, the consumer is protected against future failure. We establish new sampling plans for utility meter surveillance, which are required by a recent modification of German law. These sampling plans are given in tables similar to the previous ones, which demonstrates their suitability for everyday use.
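    The paper's central question reduces to an acceptance-sampling computation. Below is a minimal sketch of that calculation, assuming a plain hypergeometric consumer's-risk criterion for sampling without replacement; the function name and the example lot are illustrative, and the paper's Weibull-based reliability extension and alignment with existing ISO tables are not reproduced.

```python
# Minimal sketch: smallest sample size n (with acceptance number c) such that,
# if at most c nonconforming units are found, one can state with confidence
# >= conf that more than 100p% of the population conforms.
from scipy.stats import hypergeom

def sampling_plan(N_pop: int, p: float, conf: float = 0.95, c: int = 0) -> int:
    D_bad = int(N_pop * (1.0 - p)) + 1   # smallest defect count violating the claim
    for n in range(c + 1, N_pop + 1):
        # consumer's risk: P(accept | population just fails the claim)
        if hypergeom.cdf(c, N_pop, D_bad, n) <= 1.0 - conf:
            return n
    return N_pop

# Example: lot of 1000 meters, claim ">95% comply", 95% confidence, accept on 0 failures
print(sampling_plan(1000, 0.95, conf=0.95, c=0))
```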

  2. Fundamental statistical limitations of future dark matter direct detection experiments

    Strege, C.; Trotta, R.; Bertone, G.; Peter, A.H.G.; Scott, P.


    We discuss irreducible statistical limitations of future ton-scale dark matter direct detection experiments. We focus in particular on the coverage of confidence intervals, which quantifies the reliability of the statistical method used to reconstruct the dark matter parameters and the bias of the r

  3. New Multiplexing Tools for Reliable GMO Detection

    Pla, M.; Nadal, A.; Baeten, V.; Bahrdt, C.; Berben, G.; Bertheau, Y.; Coll, A.; Dijk, van J.P.; Dobnik, D.; Fernandez-Pierna, J.A.; Gruden, K.; Hamels, S.; Holck, A.; Holst-Jensen, A.; Janssen, E.; Kok, E.J.; Paz, La J.L.; Laval, V.; Leimanis, S.; Malcevschi, A.; Marmiroli, N.; Morisset, D.; Prins, T.W.; Remacle, J.; Ujhelyi, G.; Wulff, D.


    Among the available methods for GMO detection, enforcement and routine laboratories in practice use PCR, based on the detection of transgenic DNA. The cost required for GMO analysis is constantly increasing due to the progress of GMO commercialization, with the inclusion of a higher diversity of species,

  4. Reliably Detectable Flaw Size for NDE Methods that Use Calibration

    Koshti, Ajay M.


    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have artificial flaws of known size, such as electro-discharge machined (EDM) notches and flat bottom hole (FBH) reflectors, which are used to set instrument sensitivity for detection of real flaws. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe life analysis of fracture critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from artificial flaws used in the calibration process to determine the reliably detectable flaw size.
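    The â-versus-a regression at the heart of this kind of POD analysis can be sketched briefly. The following is a simplified illustration with synthetic data and a fixed decision threshold, assuming the standard log-linear signal-response model; the mh1823 software additionally handles censored responses and confidence bounds such as a90/95.

```python
# Simplified a-hat vs. a POD sketch (synthetic data; no censoring, no
# confidence bounds). Model: ln(a_hat) = b0 + b1*ln(a) + N(0, tau^2),
# so POD(a) = Phi((ln a - mu)/sigma) once a decision threshold y_dec is set.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
a = np.geomspace(0.1, 2.0, 60)                                    # flaw size (e.g. mm)
y = np.exp(0.8 + 1.1 * np.log(a) + rng.normal(0, 0.25, a.size))   # response a-hat

y_dec = 1.0                                       # signal level counted as a "hit"
b1, b0 = np.polyfit(np.log(a), np.log(y), 1)      # log-linear regression
tau = np.std(np.log(y) - (b0 + b1 * np.log(a)), ddof=2)

mu, sigma = (np.log(y_dec) - b0) / b1, tau / b1
a50 = np.exp(mu)                                  # 50% POD flaw size
a90 = np.exp(mu + norm.ppf(0.90) * sigma)         # 90% POD flaw size
print(f"a50 = {a50:.3f}, a90 = {a90:.3f}")
```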

  5. 3D face recognition algorithm based on detecting reliable components

    Huang Wenjun; Zhou Xuebing; Niu Xiamu


    Fisherfaces algorithm is a popular method for face recognition. However, there exist some unstable components that degrade recognition performance. In this paper, we propose a method based on detecting reliable components to overcome the problem and introduce it to 3D face recognition. The reliable components are detected within the binary feature vector, which is generated from the Fisherfaces feature vector based on statistical properties, and is used for 3D face recognition as the final feature vector. Experimental results show that the reliable components feature vector is much more effective than the Fisherfaces feature vector for face recognition.

  6. Reliability of magnetic susceptibility weighted imaging in detection of cerebral microbleeds in stroke patients

    Lamiaa G. El-Serougy


    Conclusion: SWI is an important and reliable technique that allows accurate detection of CMBs occurring in association with hemorrhage in acute and chronic stroke. It should be included in stroke assessment protocols to help in the choice of proper treatment and the prediction of future attacks.

  7. Fast wafer-level detection and control of interconnect reliability

    Foley, Sean; Molyneaux, James; Mathewson, Alan


    Many of the technological advances in the semiconductor industry have led to dramatic increases in device density and performance in conjunction with enhanced circuit reliability. As reliability is improved, the time taken to characterize particular failure modes with traditional test methods is getting substantially longer. Furthermore, semiconductor customers expect low product cost and fast time-to-market. The limits of traditional reliability testing philosophies are being reached and new approaches need to be investigated to enable the next generation of highly reliable products to be tested. This is especially true in the area of IC interconnect, where significant challenges are predicted for the next decade. A number of fast, wafer level test methods exist for interconnect reliability evaluation. The relative abilities of four such methods to detect the quality and reliability of IC interconnect over very short test times are evaluated in this work. Four different test structure designs are also evaluated and the results are benchmarked against conventional package level Median Time to Failure results. The Isothermal test method combined with SWEAT-type test structures is shown to be the most suitable combination for defect detection and interconnect reliability control over very short test times.

  8. Revenue Sufficiency and Reliability in a Zero Marginal Cost Future

    Frew, Bethany A.


    Features of existing wholesale electricity markets, such as administrative pricing rules and policy-based reliability standards, can distort market incentives from allowing generators sufficient opportunities to recover both fixed and variable costs. Moreover, these challenges can be amplified by other factors, including (1) inelastic demand resulting from a lack of price signal clarity, (2) low- or near-zero marginal cost generation, particularly arising from low natural gas fuel prices and variable generation (VG), such as wind and solar, and (3) the variability and uncertainty of this VG. As power systems begin to incorporate higher shares of VG, many questions arise about the suitability of the existing marginal-cost-based price formation, primarily within an energy-only market structure, to ensure the economic viability of resources that might be needed to provide system reliability. This article discusses these questions and provides a summary of completed and ongoing modelling-based work at the National Renewable Energy Laboratory to better understand the impacts of evolving power systems on reliability and revenue sufficiency.

  9. Future Trends in Reliability-Based Bridge Management

    Thoft-Christensen, Palle

    Future bridge management systems will be based on simple stochastic models predicting the residual strength of structural elements. The current deterministic management systems are not effective in optimizing e.g. the life cycle cost of a bridge or a system of bridges. A number of important factors...

  10. Probability of detection (POD) is not NDT/E reliability

    Rummel, Ward D.


    The reliability of nondestructive testing procedures has been a primary consideration in the development, application and advancement of nondestructive testing (NDT/E) technology. Indeed, significant advancements have been made in process control of procedures and in the training and qualification of personnel who apply NDT procedures. Recognition of and demand for NDT increased with these successes. NDT was integrated into engineering practices and technologies. The design, operation, maintenance, life cycle and risk analyses changed dramatically with the development, application and incorporation of fracture mechanics in engineering requirements, engineering practices and engineering systems management. Those engineering changes increased demands on NDT and drove a revolution in NDT requirements, practices and technology advancement. The Probability of Detection (POD) metric was developed to provide a quantitative assessment of NDT detection capabilities and was focused on the smallest reliably detectable flaw. Critical needs for application resulted in wide acceptance of POD as a metric to quantify the detection capability of NDT procedures. Unfortunately, POD is often misinterpreted as a primary measure of the reliability of NDT procedures. This paper addresses the nature of POD and the requirements for its assessment and application. Emphasis is on the evolution of a new branch in NDT engineering, technology application and validation.

  11. Mastitis detection: current trends and future perspectives.

    Viguier, Caroline; Arora, Sushrut; Gilmartin, Niamh; Welbeck, Katherine; O'Kennedy, Richard


    Bovine mastitis, the most significant disease of dairy herds, has huge effects on farm economics due to reduction in milk production and treatment costs. Traditionally, methods of detection have included estimation of somatic cell counts, an indication of inflammation, measurement of biomarkers associated with the onset of the disease (e.g. the enzymes N-acetyl-beta-D-glucosaminidase and lactate dehydrogenase) and identification of the causative microorganisms, which often involves culturing methods. These methods have their limitations and there is a need for new rapid, sensitive and reliable assays. Recently, significant advances in the identification of nucleic acid markers and other novel biomarkers and the development of sensor-based platforms have taken place. These novel strategies have shown promise, and their advantages over the conventional tests are discussed.

  12. Reliability analysis for the quench detection in the LHC machine

    Denz, R; Vergara-Fernández, A


    The Large Hadron Collider (LHC) will incorporate a large amount of superconducting elements that require protection in case of a quench. Key elements in the quench protection system are the electronic quench detectors. Their reliability will have an important impact on the downtime as well as on the operational cost of the collider. The expected rates of both false and missed quenches have been computed for several redundant detection schemes. The developed model takes account of the maintainability of the system to optimise the frequency of foreseen checks, and evaluates their influence on the performance of different detection topologies. Given the uncertainty of the failure rate of the components combined with the LHC tunnel environment, the study has been completed with a sensitivity analysis of the results. The chosen detection scheme and the maintainability strategy for each detector family are given.
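    The effect of a redundant detection topology on missed and false quenches can be illustrated with elementary combinatorics. The sketch below assumes independent channels with per-demand miss probability q and spurious-trip probability f; the study's full model, including maintenance intervals and check frequencies, is not reproduced.

```python
# Toy comparison of redundant quench-detection topologies (independent
# channels assumed; q = miss probability per demand, f = spurious-trip
# probability per demand).
def one_out_of_two(q: float, f: float) -> tuple[float, float]:
    missed = q * q                       # both channels must miss
    false = 1 - (1 - f) ** 2             # any channel can trip falsely
    return missed, false

def two_out_of_three(q: float, f: float) -> tuple[float, float]:
    missed = 3 * q**2 * (1 - q) + q**3   # at least two channels miss
    false = 3 * f**2 * (1 - f) + f**3    # at least two trip falsely
    return missed, false

q, f = 1e-4, 1e-3
print("1oo2:", one_out_of_two(q, f))    # fewest missed quenches, more false trips
print("2oo3:", two_out_of_three(q, f))  # balances the two failure modes
```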

  13. Parameter estimation and reliable fault detection of electric motors

    Dusan PROGOVAC; Le Yi WANG; George YIN


    Accurate model identification and fault detection are necessary for reliable motor control. Motor-characterizing parameters experience substantial changes due to aging, motor operating conditions, and faults. Consequently, motor parameters must be estimated accurately and reliably during operation. Based on enhanced model structures of electric motors that accommodate both normal and faulty modes, this paper introduces bias-corrected least-squares (LS) estimation algorithms that incorporate functions for correcting estimation bias, forgetting factors for capturing sudden faults, and recursive structures for efficient real-time implementation. Permanent magnet motors are used as a benchmark type for concrete algorithm development and evaluation. Algorithms are presented, their properties are established, and their accuracy and robustness are evaluated by simulation case studies under both normal operations and inter-turn winding faults. Implementation issues from different motor control schemes are also discussed.
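    The recursive structure with a forgetting factor can be sketched as a standard recursive least-squares (RLS) update. The bias-correction terms that the paper adds for noisy regressors are omitted here, and the motor model and numbers are purely illustrative.

```python
# Minimal RLS sketch with a forgetting factor; lam < 1 discounts old data
# so sudden parameter changes (faults) show up quickly in the estimate.
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    # theta: estimate (n,), P: covariance (n,n), phi: regressor (n,), y: scalar
    Pphi = P @ phi
    k = Pphi / (lam + phi @ Pphi)           # gain vector
    theta = theta + k * (y - phi @ theta)   # innovation correction
    P = (P - np.outer(k, Pphi)) / lam       # covariance update with forgetting
    return theta, P

# Example: track R and L in a crude motor model  y = R*i + L*di/dt
rng = np.random.default_rng(1)
theta, P = np.zeros(2), np.eye(2) * 1e3
R_true, L_true = 1.5, 0.02
for k_step in range(500):
    if k_step == 250:               # simulated sudden fault: resistance jumps
        R_true = 2.5
    phi = rng.normal(size=2)        # stand-in for [i, di/dt] excitation
    y = phi @ np.array([R_true, L_true]) + rng.normal(0, 0.01)
    theta, P = rls_update(theta, P, phi, y)
print(theta)                        # approaches [2.5, 0.02] after the fault
```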

  14. A reliable method for detecting complexed DNA in vitro

    Holladay, C.; Keeney, M.; Newland, B.; Mathew, A.; Wang, W.; Pandit, A.


    Quantification of eluted nucleic acids is a critical parameter in characterizing biomaterial based gene-delivery systems. The most commonly used method is to assay samples with an intercalating fluorescent dye such as PicoGreen®. However, this technique was developed for unbound DNA and the current trend in gene delivery is to condense DNA with transfection reagents, which interfere with intercalation. Here, for the first time, the DNA was permanently labeled with the fluorescent dye Cy5 prior to complexation, an alternative technique hypothesized to allow quantification of both bound and unbound DNA. A comparison of the two methods was performed by quantifying the elution of six different varieties of DNA complexes from a model biomaterial (collagen) scaffold. After seven days of elution, the PicoGreen® assay only allowed detection of three types of complexes (those formed using Lipofectin™ and two synthesised copolymers). However, the Cy5 fluorescent labeling technique enabled detection of all six varieties including those formed via the common transfection agents poly(ethylene imine), poly-l-lysine and SuperFect™. This allowed reliable quantification of the elution of all these complexes from the collagen scaffold. Thus, while intercalating dyes may be effective and reliable for detecting double-stranded, unbound DNA, the technique described in this work allowed reliable quantification of DNA independent of complexation state.

  15. Reliable detection of directional couplings using rank statistics.

    Chicharro, Daniel; Andrzejak, Ralph G


    To detect directional couplings from time series various measures based on distances in reconstructed state spaces were introduced. These measures can, however, be biased by asymmetries in the dynamics' structure, noise color, or noise level, which are ubiquitous in experimental signals. Using theoretical reasoning and results from model systems we identify the various sources of bias and show that most of them can be eliminated by an appropriate normalization. We furthermore diminish the remaining biases by introducing a measure based on ranks of distances. This rank-based measure outperforms existing distance-based measures concerning both sensitivity and specificity for directional couplings. Therefore, our findings are relevant for a reliable detection of directional couplings from experimental signals.
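    A rank-based interdependence measure of this kind can be sketched compactly. The simplified implementation below assumes the inputs are already state-space-embedded trajectories and omits the Theiler correction for temporally close neighbors; it illustrates the idea rather than reproducing the authors' exact estimator. Comparing L_measure(X, Y) with L_measure(Y, X) is what indicates the direction of coupling.

```python
# Simplified rank-based interdependence sketch: how close (in rank) are the
# X-distances to the points that are nearest neighbors in Y? Values near 1
# indicate strong dependence on Y's neighborhood structure, near 0 none.
import numpy as np

def L_measure(X, Y, k=5):
    N = len(X)
    dX = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    dY = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
    np.fill_diagonal(dX, np.inf)                     # exclude self-matches
    np.fill_diagonal(dY, np.inf)
    rX = dX.argsort(axis=1).argsort(axis=1) + 1.0    # rank of each j among i's X-distances
    nnY = dY.argsort(axis=1)[:, :k]                  # indices of Y's k nearest neighbors
    G_cond = np.take_along_axis(rX, nnY, axis=1).mean(axis=1)
    G_all, G_min = N / 2.0, (k + 1) / 2.0            # chance level and best possible
    return np.mean((G_all - G_cond) / (G_all - G_min))

rng = np.random.default_rng(2)
x = rng.normal(size=(300, 2))
y = x + 0.3 * rng.normal(size=(300, 2))              # y depends on x
print(L_measure(x, y), L_measure(x, rng.normal(size=(300, 2))))  # high vs. ~0
```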

  16. Reliable epileptic seizure detection using an improved wavelet neural network

    Zarita Zainuddin


    Background: Electroencephalogram (EEG) signal analysis is indispensable in epilepsy diagnosis as it offers valuable insights for locating the abnormal distortions in the brain wave. However, visual interpretation of the massive amounts of EEG signals is time-consuming, and there is often inconsistent judgment between experts. Aims: This study proposes a novel and reliable seizure detection system, where the statistical features extracted from the discrete wavelet transform are used in conjunction with an improved wavelet neural network (WNN) to identify the occurrence of seizures. Method: Experimental simulations were carried out on a well-known publicly available dataset, which was kindly provided by the Epilepsy Center, University of Bonn, Germany. The normal and epileptic EEG signals were first pre-processed using the discrete wavelet transform. Subsequently, a set of statistical features was extracted to train a WNN-based classifier. Results: The study has two key findings. First, simulation results showed that the proposed improved WNN-based classifier gave excellent predictive ability, where an overall classification accuracy of 98.87% was obtained. Second, by using the 10th and 90th percentiles of the absolute values of the wavelet coefficients, a better set of EEG features can be identified from the data, as the outliers are removed before any further downstream analysis. Conclusion: The obtained high prediction accuracy demonstrated the feasibility of the proposed seizure detection scheme. It suggested the prospective implementation of the proposed method in developing a real-time automated epileptic diagnostic system with fast and accurate response that could assist neurologists in the decision-making process.
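    The percentile-based wavelet features described above are straightforward to compute. A sketch using PyWavelets follows; the db4 wavelet, the decomposition level and the extra mean/std features are assumptions, and the improved WNN classifier itself is not reproduced.

```python
# Sketch of percentile-based wavelet features for one EEG epoch (assumed
# wavelet family and level; the 10th/90th percentiles of the absolute
# coefficients bound the bulk of the subband, making features outlier-robust).
import numpy as np
import pywt

def eeg_features(signal, wavelet="db4", level=4):
    feats = []
    for coeffs in pywt.wavedec(signal, wavelet, level=level):
        abs_c = np.abs(coeffs)
        feats += [np.percentile(abs_c, 10), np.percentile(abs_c, 90),
                  abs_c.mean(), abs_c.std()]
    return np.asarray(feats)

x = np.random.default_rng(3).normal(size=4096)   # stand-in for one EEG epoch
print(eeg_features(x).shape)                     # (level+1) * 4 features
```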

  17. Oil Spill Detection: Past and Future Trends

    Topouzelis, Konstantinos; Singha, Suman


    In the last 15 years, the detection of oil spills by satellite means has moved from experimental to operational. What has really changed is satellite image availability: from the late 1990s, the age of "no data", we have moved forward 15 years to the age of the "Sentinels", with an abundance of data. Whether from large accidents related to offshore oil exploration and production activity or from illegal discharges from tankers, oil on the sea surface can now be regularly monitored over European waters. National and transnational organizations (i.e. the European Maritime Safety Agency's 'CleanSeaNet' service) routinely use SAR imagery to detect oil due to its all-weather, day-and-night imaging capability. However, all these years the scientific methodology for detection has remained relatively constant. From manual analysis to fully automatic detection methodologies, no significant contribution has been published in recent years and certainly none has dramatically changed the rules of the detection. On the contrary, although the overall accuracy of the methodology is questioned, the four main classification steps (dark area detection, feature extraction, statistical database creation, and classification) are continuously improving. In recent years, researchers came up with the use of polarimetric SAR data for oil spill detection and characterization, although utilization of Pol-SAR data for this purpose still remains questionable due to the lack of a verified dataset and the low spatial coverage of Pol-SAR data. The present paper points out the drawbacks of oil spill detection in recent years and focuses on the bottlenecks of oil spill detection methodologies. Also, solutions on the basis of data availability, management and analysis are proposed. Moreover, an ideal detection system is discussed regarding satellite imagery and in situ observations using different scales and sensors.

  18. Automatic Student Plagiarism Detection: Future Perspectives

    Mozgovoy, Maxim; Kakkonen, Tuomo; Cosma, Georgina


    The availability and use of computers in teaching has seen an increase in the rate of plagiarism among students because of the wide availability of electronic texts online. While computer tools that have appeared in recent years are capable of detecting simple forms of plagiarism, such as copy-paste, a number of recent research studies devoted to…

  20. A Reliability-Based Multi-Algorithm Fusion Technique in Detecting Changes in Land Cover

    Jiangping Chen


    Detecting land use or land cover changes is a challenging problem in analyzing images. Change-detection plays a fundamental role in most land use or cover monitoring systems using remote-sensing techniques. The reliability of individual automatic change-detection algorithms is currently below operating requirements when considering the intrinsic uncertainty of a change-detection algorithm and the complexity of detecting changes in remote-sensing images. In particular, most of these algorithms are only suited for a specific image data source, study area and research purpose. Only a number of comprehensive change-detection methods that consider the reliability of the algorithm in different implementation situations have been reported. This study attempts to explore the advantages of combining several typical change-detection algorithms. This combination is specifically designed for a highly reliable change-detection task. Specifically, a fusion approach based on reliability is proposed for an exclusive land use or land cover change-detection. First, the reliability of each candidate algorithm is evaluated. Then, a fuzzy comprehensive evaluation is used to generate a reliable change-detection approach. This evaluation is a transformation between a one-way evaluation matrix and a weight vector computed using the reliability of each candidate algorithm. Experimental results reveal that the advantages of combining these distinct change-detection techniques are evident.
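    A reliability-weighted fusion of change maps can be illustrated in a few lines. The sketch assumes each candidate algorithm outputs a per-pixel change probability plus a scalar reliability score, and reduces the paper's fuzzy comprehensive evaluation to a simple normalized weight vector.

```python
# Toy reliability-weighted fusion of change-detection outputs.
import numpy as np

def fuse_change_maps(prob_maps, reliabilities, threshold=0.5):
    w = np.asarray(reliabilities, dtype=float)
    w /= w.sum()                                            # weights from reliabilities
    fused = np.tensordot(w, np.asarray(prob_maps), axes=1)  # weighted average map
    return fused > threshold                                # final change/no-change mask

maps = np.random.default_rng(4).random((3, 64, 64))  # three algorithms' outputs
print(fuse_change_maps(maps, [0.9, 0.6, 0.75]).sum())
```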

  1. Bubble Radiation Detection: Current and Future Capability

    AJ Peurrung; RA Craig


    Despite a number of noteworthy achievements in other fields, superheated droplet detectors (SDDs) and bubble chambers (BCs) have not been used for nuclear nonproliferation and arms control. This report examines these two radiation-detection technologies in detail and answers the question of how they can be or should be "adapted" for use in national security applications. These technologies involve closely related approaches to radiation detection in which an energetic charged particle deposits sufficient energy to initiate the process of bubble nucleation in a superheated fluid. These detectors offer complete gamma-ray insensitivity when used to detect neutrons. They also provide controllable neutron-energy thresholds and excellent position resolution. SDDs are extraordinarily simple and inexpensive. BCs offer the promise of very high efficiency (approximately 75%). A notable drawback for both technologies is temperature sensitivity. As a result of this problem, the temperature must be controlled whenever high accuracy is required, or harsh environmental conditions are encountered. The primary findings of this work are listed and briefly summarized below: (1) SDDs are ready to function as electronics-free neutron detectors on demand for arms-control applications. The elimination of electronics at the weapon's location greatly eases the negotiability of radiation-detection technologies in general. (2) As a result of their high efficiency and sharp energy threshold, current BCs are almost ready for use in the development of a next-generation active assay system. Development of an instrument based on appropriately safe materials is warranted. (3) Both kinds of bubble detectors are ready for use whenever very high gamma-ray fields must be confronted. Spent fuel MPC&A is a good example where this need presents itself. (4) Both kinds of bubble detectors have the potential to function as low-cost replacements for conventional neutron

  2. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.;


    The paper deals with the prediction of global and localized damage and the future reliability estimation of partly damaged reinforced concrete (RC) structures under seismic excitation. Initially, a global maximum softening damage indicator is considered based on the variation of the eigenfrequency of the first mode due to the stiffness and strength deterioration of the structure. The hysteresis of the first mode is modelled by a Clough and Johnston hysteretic oscillator with a degrading elastic fraction of the restoring force. The linear parameters of the model are assumed to be known, measured before… The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were…

  4. PV Systems Reliability Final Technical Report: Ground Fault Detection

    Lavrova, Olga [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Flicker, Jack David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Johnson, Jay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]


    We have examined ground faults in photovoltaic (PV) arrays and the efficacy of fuses, residual current detection (RCD), current sense monitoring/relays (CSM), isolation/insulation (Riso) monitoring, and Ground Fault Detection and Isolation (GFID), using simulations based on a SPICE (Simulation Program with Integrated Circuit Emphasis) ground fault circuit model, experimental ground faults installed on real arrays, and theoretical equations.

  5. Up to 100,000 reliable strong gravitational lenses in future dark energy experiments

    Serjeant, S. [Department of Physical Sciences, The Open University, Milton Keynes MK7 6AA (United Kingdom)]


    The Euclid space telescope will observe ∼10⁵ strong galaxy-galaxy gravitational lens events in its wide field imaging survey over around half the sky, but identifying the gravitational lenses from their observed morphologies requires solving the difficult problem of reliably separating the lensed sources from contaminant populations, such as tidal tails, as well as presenting challenges for spectroscopic follow-up redshift campaigns. Here I present alternative selection techniques for strong gravitational lenses in both Euclid and the Square Kilometre Array, exploiting the strong magnification bias present in the steep end of the Hα luminosity function and the H I mass function. Around 10³ strong lensing events are detectable with this method in the Euclid wide survey. While only ∼1% of the total haul of Euclid lenses, this sample has ∼100% reliability, known source redshifts, high signal-to-noise, and a magnification-based selection independent of assumptions of lens morphology. With the proposed Square Kilometre Array dark energy survey, the numbers of reliable strong gravitational lenses with source redshifts can reach 10⁵.

  6. A computational method for reliable gait event detection and abnormality detection for feedback in rehabilitation.

    Senanayake, Chathuri; Senanayake, S M N Arosha


    In this paper, a gait event detection algorithm is presented that uses computer intelligence (fuzzy logic) to identify seven gait phases in walking gait. Two inertial measurement units and four force-sensitive resistors were used to obtain knee angle and foot pressure patterns, respectively. Fuzzy logic is used to address the complexity in distinguishing gait phases based on discrete events. A novel application of the seven-dimensional vector analysis method to estimate the amount of abnormalities detected was also investigated based on the two gait parameters. Experiments were carried out to validate the application of the two proposed algorithms to provide accurate feedback in rehabilitation. The algorithm responses were tested for two cases, normal and abnormal gait. The large amount of data required for reliable gait-phase detection necessitates the utilisation of computer methods to store and manage the data. Therefore, a database management system and an interactive graphical user interface were developed for the utilisation of the overall system in a clinical environment.

  7. Improving geo-information reliability by centralized change detection management

    Gorte, B.; Nardinocchi, C.; Thonon, I.; Addink, E.; Beck, R.; Persie, van M.; Kramer, H.


    A consortium called Mutatis Mutandis (MutMut), consisting of three Universities and eight producers and users of geo-information, was established in the Netherlands to streamline change detection on a national level. After preliminary investigations concerning market feasibility, three actions are

  8. Adapting to a Changing Colorado River: Making Future Water Deliveries More Reliable Through Robust Management Strategies

    Groves, D.; Bloom, E.; Fischbach, J. R.; Knopman, D.


    The U.S. Bureau of Reclamation and water management agencies representing the seven Colorado River Basin States initiated the Colorado River Basin Study in January 2010 to evaluate the resiliency of the Colorado River system over the next 50 years and compare different options for ensuring successful management of the river's resources. RAND was asked to join this Basin Study Team in January 2012 to help develop an analytic approach to identify key vulnerabilities in managing the Colorado River basin over the coming decades and to evaluate different options that could reduce this vulnerability. Using a quantitative approach for planning under uncertainty called Robust Decision Making (RDM), the RAND team assisted the Basin Study by: identifying future vulnerable conditions that could lead to imbalances that could cause the basin to be unable to meet its water delivery objectives; developing a computer-based tool to define 'portfolios' of management options reflecting different strategies for reducing basin imbalances; evaluating these portfolios across thousands of future scenarios to determine how much they could improve basin outcomes; and analyzing the results from the system simulations to identify key tradeoffs among the portfolios. This talk describes RAND's contribution to the Basin Study, focusing on the methodologies used to identify vulnerabilities for Upper Basin and Lower Basin water supply reliability and to compare portfolios of options. Several key findings emerged from the study. Future streamflow and climate conditions are key: vulnerable conditions arise in a majority of scenarios where streamflows are lower than historical averages and where drought conditions persist for eight years or more; depending on where the shortages occur, problems will arise for delivery obligations of the upper or lower river basin, and the lower river basin is vulnerable to a broader range of plausible future conditions. Additional investments in

  9. Human Reliability Assessments: Using the Past (Shuttle) to Predict the Future (Orion)

    DeMott, Diana L.; Bigler, Mark A.


    NASA (National Aeronautics and Space Administration) Johnson Space Center (JSC) Safety and Mission Assurance (S&MA) uses two human reliability analysis (HRA) methodologies. The first is a simplified method which is based on how much time is available to complete the action, with consideration included for environmental and personal factors that could influence the human's reliability. This method is expected to provide a conservative value or placeholder as a preliminary estimate. This preliminary estimate or screening value is used to determine which placeholder needs a more detailed assessment. The second methodology is used to develop a more detailed human reliability assessment on the performance of critical human actions. This assessment needs to consider more than the time available, this would include factors such as: the importance of the action, the context, environmental factors, potential human stresses, previous experience, training, physical design interfaces, available procedures/checklists and internal human stresses. The more detailed assessment is expected to be more realistic than that based primarily on time available. When performing an HRA on a system or process that has an operational history, we have information specific to the task based on this history and experience. In the case of a Probabilistic Risk Assessment (PRA) that is based on a new design and has no operational history, providing a "reasonable" assessment of potential crew actions becomes more challenging. To determine what is expected of future operational parameters, the experience from individuals who had relevant experience and were familiar with the system and process previously implemented by NASA was used to provide the "best" available data. Personnel from Flight Operations, Flight Directors, Launch Test Directors, Control Room Console Operators, and Astronauts were all interviewed to provide a comprehensive picture of previous NASA operations. Verification of the

  10. Objective Methods for Reliable Detection of Concealed Depression

    Cynthia Solomon


    Recent research has shown that it is possible to automatically detect clinical depression from audio-visual recordings. Before considering integration in a clinical pathway, a key question that must be asked is whether such systems can be easily fooled. This work explores the potential of acoustic features to detect clinical depression in adults both when acting normally and when asked to conceal their depression. Nine adults diagnosed with mild to moderate depression as per the Beck Depression Inventory (BDI-II) and Patient Health Questionnaire (PHQ-9) were asked a series of questions and to read an excerpt from a novel aloud under two different experimental conditions. In one, participants were asked to act naturally and in the other, to suppress anything that they felt would be indicative of their depression. Acoustic features were then extracted from this data and analysed using paired t-tests to determine any statistically significant differences between healthy and depressed participants. Most features that were found to be significantly different during normal behaviour remained so during concealed behaviour. In leave-one-subject-out automatic classification studies of the 9 depressed subjects and 8 matched healthy controls, an 88% classification accuracy and 89% sensitivity was achieved. Results remained relatively robust during concealed behaviour, with classifiers trained on only non-concealed data achieving 81% detection accuracy and 75% sensitivity when tested on concealed data. These results indicate there is good potential to build deception-proof automatic depression monitoring systems.
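    The leave-one-subject-out (LOSO) protocol used above is easy to misimplement with ordinary cross-validation, so a sketch may help. It assumes a feature matrix, binary labels and a subject id per recording; the data, features and classifier here are synthetic stand-ins, not those of the study.

```python
# LOSO evaluation sketch: every fold holds out ALL recordings of one subject,
# so the classifier is never tested on a person it has seen during training.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import SVC

rng = np.random.default_rng(5)
X = rng.normal(size=(170, 20))            # 17 subjects x 10 recordings x 20 features
y = rng.integers(0, 2, size=170)          # 0 = healthy, 1 = depressed (synthetic)
subjects = np.repeat(np.arange(17), 10)   # subject id per recording

correct = 0
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    clf = SVC(kernel="rbf").fit(X[train_idx], y[train_idx])
    correct += (clf.predict(X[test_idx]) == y[test_idx]).sum()
print("LOSO accuracy:", correct / len(y))  # chance-level here: labels are random
```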

  11. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    Viswanathan, Arun


    This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program from the 19th of August, 2012 to the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection; it operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a significantly poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. Prior research has focused on understanding the impact of this

  12. Bedside ultrasound reliability in locating catheter and detecting complications

    Payman Moharamzadeh


    Introduction: Central venous catheterization is one of the most common medical procedures and is associated with complications such as misplacement and pneumothorax. Chest X-ray is among the good ways to evaluate these complications. However, due to the patient's excessive exposure to radiation, time consumption and the low diagnostic value in detecting pneumothorax in the supine patient, the present study examines the diagnostic value of bedside ultrasound in locating the tip of the catheter and detecting pneumothorax. Materials and methods: In the present cross-sectional study, all referred patients requiring central venous catheterization were examined. Central venous catheterization was performed by a trained emergency medicine specialist, and the location of the catheter and the presence of pneumothorax were examined and compared using two modalities, ultrasound and X-ray (as the reference standard). Sensitivity, specificity, and positive and negative predictive values are reported. Results: A total of 200 non-trauma patients were included in the study (58% men). Cohen's kappa consistency coefficients for catheter location and diagnosis of pneumothorax were 0.49 (95% CI: 0.43-0.55) and 0.89 (P<0.001; 95% CI: 97.8-100), respectively. Also, ultrasound sensitivity and specificity in diagnosing pneumothorax were 75% (95% CI: 35.6-95.5) and 100% (95% CI: 97.6-100), respectively. Conclusion: The present study showed a low diagnostic value of ultrasound in determining catheter location and in detecting pneumothorax; in light of previous studies, the search in this field continues. Keywords: central venous catheterization; complications; bedside ultrasound; radiography
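    The reported agreement statistics come from a 2x2 comparison against the reference standard. A sketch of their computation follows; the counts in the example are hypothetical, not the study's data.

```python
# Sensitivity, specificity and Cohen's kappa from a 2x2 table
# (test = ultrasound, reference = chest X-ray; counts are made up).
def two_by_two_stats(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    po = (tp + tn) / n                                           # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return sens, spec, kappa

print(two_by_two_stats(tp=6, fp=0, fn=2, tn=192))  # hypothetical pneumothorax counts
```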

  13. Reliability and predictors of resistive load detection in children with persistent asthma: a multivariate approach.

    Harver, Andrew; Dyer, Allison; Ersek, Jennifer L; Kotses, Harry; Humphries, C Thomas


    Resistive load detection tasks enable analysis of individual differences in psychophysical outcomes. The purpose of this study was to determine both the reliability and predictors of resistive load detection in children with persistent asthma who completed multiple testing sessions. Both University of North Carolina (UNC) Charlotte and Ohio University institutional review boards approved the research protocol. The detection of inspiratory resistive loads was evaluated in 75 children with asthma between 8 and 15 years of age. Each child participated in four experimental sessions that occurred approximately once every 2 weeks. Multivariate analyses were used to delineate predictors of task performance. Reliability of resistive load detection was determined for each child, and predictors of load detection outcomes were investigated in two groups of children: those who performed reliably in all four sessions (n = 31) and those who performed reliably in three or fewer sessions (n = 44). Three factors (development, symptoms, and compliance) accounted for 66.3% of the variance among variables that predicted 38.7% of the variance in load detection outcomes (Multiple R = 0.62, p = 0.004) and correctly classified performance as reliable or less reliable in 80.6% of the children, χ²(12) = 28.88, p = 0.004. Cognitive and physical development, appraisal of symptom experiences, and adherence-related behaviors (1) account for a significant proportion of the interrelationships among variables that affect perception of airflow obstruction in children with asthma and (2) differentiate between children who perform more or less reliably in a resistive load detection task.

  14. The Past and Future of Light Dark Matter Direct Detection

    Davis, Jonathan H


    We review the status and future of direct searches for light dark matter. We start by answering the question: `Whatever happened to the light dark matter anomalies?' i.e. the fate of the potential dark matter signals observed by the CoGeNT, CRESST-II, CDMS-Si and DAMA/LIBRA experiments. We discuss how the excess events in the first two of these experiments have been explained by previously underestimated backgrounds. For DAMA we summarise the progress and future of mundane explanations for the annual modulation reported in its event rate. Concerning the future of direct detection we focus on the irreducible background from solar neutrinos. We explain broadly how it will affect future searches and summarise efforts to mitigate its effects.




    The paper has three parts. The first presents the theoretical concepts concerning grounding-line faults, the way they are treated, and the solutions implemented for their detection in electric stations. The second presents the results of an operational reliability analysis for the period 2011-2015, and the final part gives the conclusions and the solutions identified for improving the operational reliability of the protection equipment that detects grounding-line faults.

  16. Detection of Yarkovsky acceleration in the context of precovery observations and the future Gaia catalogue

    Desmars, J.


    Context: The Yarkovsky effect is a weak non-gravitational force leading to a small variation of the semi-major axis of an asteroid. Using radar measurements and astrometric observations, it is possible to measure a drift in semi-major axis through orbit determination. Aims: This paper aims to detect a reliable drift in semi-major axis of near-Earth asteroids (NEAs) from ground-based observations and to investigate the impact of precovery observations and the future Gaia...

  17. Reliability of high mobility SiGe channel MOSFETs for future CMOS applications

    Franco, Jacopo; Groeseneken, Guido


    Due to the ever increasing electric fields in scaled CMOS devices, reliability is becoming a showstopper for further scaled technology nodes. Although several groups have already demonstrated functional Si channel devices with aggressively scaled Equivalent Oxide Thickness (EOT) down to 5Å, a 10 year reliable device operation cannot be guaranteed anymore due to severe Negative Bias Temperature Instability. This book focuses on the reliability of the novel (Si)Ge channel quantum well pMOSFET technology. This technology is being considered for possible implementation in next CMOS technology nodes, thanks to its benefit in terms of carrier mobility and device threshold voltage tuning. We observe that it also opens a degree of freedom for device reliability optimization. By properly tuning the device gate stack, sufficiently reliable ultra-thin EOT devices with a 10 years lifetime at operating conditions are demonstrated. The extensive experimental datasets collected on a variety of processed 300mm wafers and pr...

  18. Reliability and minimal detectable difference in multisegment foot kinematics during shod walking and running.

    Milner, Clare E; Brindle, Richard A


    There has been increased interest recently in measuring kinematics within the foot during gait. While several multisegment foot models have appeared in the literature, the Oxford foot model has been used frequently for both walking and running. Several studies have reported the reliability of the Oxford foot model, but most studies to date have reported reliability for barefoot walking. The purpose of this study was to determine between-day (intra-rater) and within-session (inter-trial) reliability of the modified Oxford foot model during shod walking and running and to calculate the minimum detectable difference for common variables of interest. Healthy adult male runners participated. Participants ran and walked in the gait laboratory for five trials of each. Three-dimensional gait analysis was conducted and foot and ankle joint angle time series data were calculated. Participants returned for a second gait analysis at least 5 days later. Intraclass correlation coefficients and minimum detectable difference were determined for walking and for running, to indicate both within-session and between-day reliability. Overall, relative variables were more reliable than absolute variables, and within-session reliability was greater than between-day reliability. Between-day intraclass correlation coefficients were comparable to those reported previously for adults walking barefoot. This study extends the use of the Oxford foot model by incorporating shod conditions while maintaining marker placement directly on the skin for each segment. These reliability data for walking and running will aid in the determination of meaningful differences in studies which use this model during shod gait.
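    Minimum detectable difference values of this kind are typically derived from the ICC via the standard error of measurement. Below is a sketch of the common MDC95 formula, assuming a test-retest design; the study's exact computation may differ in detail.

```python
# Minimal-detectable-change sketch: the smallest change exceeding measurement
# error with 95% confidence, given a between-subject SD and a reliability ICC.
import math

def mdc95(sd_between_subjects: float, icc: float) -> float:
    sem = sd_between_subjects * math.sqrt(1.0 - icc)  # standard error of measurement
    return 1.96 * math.sqrt(2.0) * sem                # 95% MDC for test-retest

# Example: joint-angle SD of 5 deg with between-day ICC of 0.85
print(f"{mdc95(5.0, 0.85):.1f} deg")   # ~5.4 deg must change to exceed error
```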

  19. The future of spectroscopic life detection on exoplanets.

    Seager, Sara


    The discovery and characterization of exoplanets have the potential to offer the world one of the most impactful findings ever in the history of astronomy: the identification of life beyond Earth. Life can be inferred by the presence of atmospheric biosignature gases, i.e. gases produced by life that can accumulate to detectable levels in an exoplanet atmosphere. Detection will be made by remote sensing with sophisticated space telescopes. The conviction that biosignature gases will actually be detected in the future is moderated by lessons learned from the dozens of exoplanet atmospheres studied in the last decade, namely the difficulty in robustly identifying molecules, the possible interference of clouds, and the permanent limitations of a spectrum of spatially unresolved and globally mixed gases without direct surface observations. The vision for the path to assess the presence of life beyond Earth is being established.

  20. Efficient Structural System Reliability Updating with Subspace-Based Damage Detection Information

    Döhler, Michael; Thöns, Sebastian

    Damage detection systems and algorithms (DDS and DDA) provide information of the structural system integrity in contrast to e.g. local information by inspections or non-destructive testing techniques. However, the potential of utilizing DDS information for the structural integrity assessment and prognosis is hardly exploited nor treated in scientific literature up to now. In order to utilize the information provided by DDS for the structural performance, usually high computational efforts for the pre-determination of DDS reliability are required. In this paper, an approach for the DDS performance modelling is introduced building upon the non-destructive testing reliability which applies to structural systems and DDS, containing a strategy to overcome the high computational efforts for the pre-determination of the DDS reliability. This approach takes basis in the subspace-based damage detection method…

  1. Botnet detection techniques:review, future trends, and issues

    Ahmad KARIM; Rosli Bin SALLEH; Muhammad SHIRAZ; Syed Adeel Ali SHAH; Irfan AWAN; Nor Badrul ANUAR


    In recent years, the Internet has enabled access to widespread remote services in the distributed computing environment; however, integrity of data transmission in the distributed computing platform is hindered by a number of security issues. For instance, the botnet phenomenon is a prominent threat to Internet security, including the threat of malicious codes. The botnet phenomenon supports a wide range of criminal activities, including distributed denial of service (DDoS) attacks, click fraud, phishing, malware distribution, spam emails, and building machines for illegitimate exchange of information/materials. Therefore, it is imperative to design and develop a robust mechanism for improving the botnet detection, analysis, and removal process. Currently, botnet detection techniques have been reviewed in different ways; however, such studies are limited in scope and lack discussions on the latest botnet detection techniques. This paper presents a comprehensive review of the latest state-of-the-art techniques for botnet detection and traces the trends of previous and current research. It provides a thematic taxonomy for the classification of botnet detection techniques and highlights the implications and critical aspects by qualitatively analyzing such techniques. Related to our comprehensive review, we highlight future directions for improving the schemes that broadly span the entire botnet detection research field and identify the persistent and prominent research challenges that remain open.

  2. Scenario based approach to structural damage detection and its value in a risk and reliability perspective

    Hovgaard, Mads Knude; Hansen, Jannick Balleby; Brincker, Rune


    A scenario- and vibration-based structural damage detection method is demonstrated through simulation. The method is Finite Element (FE) based. The value of the monitoring is calculated using structural reliability theory. A high cycle fatigue crack propagation model is assumed as the damage mecha...

  3. Reliability and minimal detectable change of the weight-bearing lunge test: A systematic review.

    Powden, Cameron J; Hoch, Johanna M; Hoch, Matthew C


    Ankle dorsiflexion range of motion (DROM) is often a point of emphasis during the rehabilitation of lower extremity pathologies. With the growing popularity of weight-bearing DROM assessments, several versions of the weight-bearing lunge (WBLT) test have been developed and numerous reliability studies have been conducted. The purpose of this systematic review was to critically appraise and synthesize the studies which examined the reliability and responsiveness of the WBLT to assess DROM. A systematic search of PubMed and EBSCO Host databases from inception to September 2014 was conducted to identify studies whose primary aim was assessing the reliability of the WBLT. The Quality Appraisal of Reliability Studies assessment tool was utilized to determine the quality of included studies. Relative reliability was examined through intraclass correlation coefficients (ICC) and responsiveness was evaluated through minimal detectable change (MDC). A total of 12 studies met the eligibility criteria and were included. Nine included studies assessed inter-clinician reliability and 12 included studies assessed intra-clinician reliability. There was strong evidence that inter-clinician reliability (ICC = 0.80-0.99) as well as intra-clinician reliability (ICC = 0.65-0.99) of the WBLT is good. Additionally, average MDC scores of 4.6° or 1.6 cm for inter-clinician and 4.7° or 1.9 cm for intra-clinician were found, indicating the minimal change in DROM needed to be outside the error of the WBLT. This systematic review determined that the WBLT, regardless of method, can be used clinically to assess DROM as it provides consistent results between one or more clinicians and demonstrates reasonable responsiveness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Design for reliability in power electronics in renewable energy systems – status and future

    Wang, Huai; Blaabjerg, Frede; Ma, Ke


    Advances in power electronics enable efficient and flexible interconnection of renewable sources, loads and electric grids. While targets concerning efficiency of power converters are within reach, recent research endeavors to predict and improve their reliability to ensure high availability, low maintenance costs, and therefore low Levelized-Cost-of-Energy (LCOE) of renewable energy systems. This paper presents the prior-art Design for Reliability (DFR) process for power converters and addresses the paradigm shift to the Physics-of-Failure (PoF) approach and mission-profile-based analysis. Moreover, the lifetime prediction of reliability-critical components, IGBT modules, is discussed in a 2.3 MW wind power converter. Finally, the challenges and opportunities to achieve more reliable power electronic converters are discussed.

  5. Reliability of recordings of subgingival calculus detected using an ultrasonic device.

    Corraini, Priscila; López, Rodrigo


    To assess the intra-examiner reliability of recordings of subgingival calculus detected using an ultrasonic device, and to investigate the influence of subject-, tooth- and site-level factors on the reliability of these subgingival calculus recordings. On two occasions, within a 1-week interval, 147 adult periodontitis patients received a full-mouth clinical periodontal examination by a single trained examiner. Duplicate subgingival calculus recordings, in six sites per tooth, were obtained using an ultrasonic device for calculus detection and removal. Agreement was observed in 65% of the 22,584 duplicate subgingival calculus recordings, ranging from 45% to 83% according to subject. Using hierarchical modeling, disagreements in the duplicate subgingival calculus recordings were more likely in all sites other than the mid-buccal, and in sites harboring supragingival calculus. Disagreements were less likely in sites with PD ≥ 4 mm and with furcation involvement ≥ degree 2. Bleeding on probing or suppuration did not influence the reliability of subgingival calculus. At the subject level, disagreements were less likely in patients presenting with the highest and lowest extent categories of the covariate subgingival calculus. The reliability of subgingival calculus recordings using the ultrasound technology is reasonable. The results of the present study suggest that the reliability of subgingival calculus recordings is not influenced by the presence of inflammation. Moreover, subgingival calculus can be more reliably detected using the ultrasound device at sites with higher need for periodontal therapy, i.e., sites presenting with deep pockets and premolars and molars with furcation involvement.

  6. Detection of Yarkovsky acceleration in the context of precovery observations and the future Gaia catalogue

    Desmars, Josselin


    The Yarkovsky effect is a weak non-gravitational force leading to a small variation of the semi-major axis of an asteroid. Using radar measurements and astrometric observations, it is possible to measure a drift in semi-major axis through orbit determination. This paper aims to detect a reliable drift in semi-major axis of near-Earth asteroids (NEAs) from ground-based observations and to investigate the impact of precovery observations and the future Gaia catalogue on the detection of a secular drift in semi-major axis. We have developed a precise dynamical model of an asteroid's motion taking the Yarkovsky acceleration into account and allowing the fitting of the drift in semi-major axis. Using statistical methods, we investigate the quality and the robustness of the detection. By filtering spurious detections with an estimated maximum drift depending on the asteroid's size, we found 46 NEAs with a reliable drift in semi-major axis, in good agreement with previous studies. The measure of the drift leads to a better orbit determination and constrains some physical parameters of these objects.

  7. Detecting Chemical Weapons: Threats, Requirements, Solutions, and Future Challenges

    Boso, Brian


    mobility spectrometry, and amplifying fluorescence polymers. In the future, the requirements for detection equipment will continue to become even more stringent. The continuing increase in the sheer number of threats that will need to be detected, the development of binary agents requiring that even the precursor chemicals be detected, the development of new types of agents unlike any of the current chemistries, and the expansion of the list of toxic industrial chemicals will require new techniques with higher specificity and greater sensitivity.

  8. Detectability and reliability analysis of the local seismic network in Pakistan


    The detectability and reliability analysis for the local seismic network is performed by employing the technique of Bungum and Husebye. The events were relocated using standard computer codes for hypocentral locations. The detectability levels are estimated from twenty-five years of recorded data in terms of 50%, 90% and 100% cumulative detectability thresholds, which were derived from the frequency-magnitude distribution. From this analysis, the 100% level of detectability of the network is ML = 1.7 for events which occur within the network. The accuracy of the network's hypocentral solutions is investigated by considering a fixed real hypocenter within the network. The epicentral errors are found to be less than 4 km when the events occur within the network. Finally, the problems faced during continuous operation of the local network, which affect its detectability, are discussed.
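    A minimal sketch of one way such cumulative detectability thresholds can be derived from a frequency-magnitude distribution, assuming a Gutenberg-Richter extrapolation anchored at the completeness magnitude (this is the generic technique, not code from the study).

```python
import numpy as np

def detectability_thresholds(mags, b=1.0, bin_w=0.1, levels=(0.5, 0.9, 1.0)):
    """Magnitude at which a given fraction of events is detected.

    Per-bin counts are compared with a Gutenberg-Richter extrapolation
    anchored at a crudely estimated completeness magnitude; the
    observed/expected ratio approximates per-bin detection probability.
    """
    counts, edges = np.histogram(mags, bins=np.arange(min(mags), max(mags) + bin_w, bin_w))
    centers = 0.5 * (edges[:-1] + edges[1:])
    mc = centers[np.argmax(counts)]                     # completeness ~ mode of histogram
    i0 = np.argmax(centers >= mc)
    a = np.log10(max(counts[i0], 1)) + b * centers[i0]  # anchor the G-R line at Mc
    frac = np.clip(counts / 10 ** (a - b * centers), 0.0, 1.0)
    return {lv: float(centers[np.argmax(frac >= lv)]) for lv in levels}

# e.g. detectability_thresholds(catalog_magnitudes)[1.0] -> magnitude of full detection
```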

  9. Reliability and minimum detectable change of the gait profile score for post-stroke patients.

    Devetak, Gisele Francini; Martello, Suzane Ketlyn; de Almeida, Juliana Carla; Correa, Katren Pedroso; Iucksch, Dielise Debona; Manffra, Elisangela Ferretti


    The objectives of this work were (i) to determine the Gait Profile Score (GPS) for hemiparetic stroke patients, (ii) to evaluate its reliability within and between sessions, and (iii) to establish its minimal detectable change (MDC). Seventeen hemiparetic patients (mean age 54.9±10.5 years; 9 men and 8 women; 6 hemiparetic on the left side and 11 on the right side; mean time after stroke 6.1±3.5 months) participated in 2 gait assessment sessions within an interval of 2-7 days. Intra-session reliability was obtained from the intraclass correlation coefficient (ICC) between the three strides of each session. Inter-session reliability was estimated by the ICC from the averages of those three strides. The GPS value of the non-paretic lower limb (NPLL) (13.9±2.4°) was greater than that of the paretic lower limb (PLL) (12.0±2.8°), and the overall GPS (GPS_O) was 13.7±2.5°. The Gait Variable Scores (GVS), GPS and GPS_O exhibited intra-session ICC values between 0.70 and 0.99, suggesting high intra-day stability. Most of the GVS exhibited excellent inter-session reliability (ICC between 0.81 and 0.93). Only hip rotation and hip abduction of the PLL exhibited moderate reliability, with ICC/MDC values of 0.57/10.0° and 0.71/3.1°, respectively. ICC/MDC values of the GPS were 0.92/2.3° and 0.93/1.9° for the PLL and NPLL, respectively. GPS_O exhibited excellent test-retest reliability (ICC=0.95) and an MDC of 1.7°. Given its reliability, the GPS has proven to be a suitable tool for therapeutic assessment of hemiparetic patients after stroke.

  10. Automated Energy Distribution and Reliability System: Validation Integration - Results of Future Architecture Implementation

    Buche, D. L.


    This report describes Northern Indiana Public Service Co.'s efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects. This report is the second in a series detailing this effort.

  11. Reliability of a computer-aided detection system in detecting lung metastases compared to manual palpation during surgery.

    Schramm, Alexandra; Wormanns, Dag; Leschber, Gunda; Merk, Johannes


    For resection of lung metastases, computed tomography (CT) is needed to determine the operative strategy. A computer-aided detection (CAD) system, a software tool for automated detection of lung nodules, analyses the CT scans in addition to the radiologist and clearly marks lesions. The aim of this feasibility study was to evaluate the reliability of CAD in detecting lung metastases. Preoperative CT scans of 18 patients who underwent surgery for suspected lung metastases were analysed with CAD (September-December 2009). During surgery, all suspected lesions were traced and resected. Histological examination was performed and the results compared to the radiologically suspicious nodules. Radiological analysis assisted by CAD detected 64 nodules (mean 3.6, range 1-7). During surgery, 91 nodules (mean 5.0, range 1-11) were resected, resulting in 27 additionally palpated nodules. Histologically, all these additional nodules were benign. In contrast, all 30 nodules shown to be metastases by histological studies were correctly described by CAD. The CAD system is a sensitive and useful tool for finding pulmonary lesions. It detects more and smaller lesions than conventional radiological analysis. In this feasibility study we were able to show a greater reliability of the CAD analysis. A further, prospective study to confirm these data is ongoing.
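    The headline claim can be checked directly from the counts reported in the abstract; a small worked example.

```python
# Counts taken from the abstract
tp = 30   # metastases confirmed by histology, all marked by CAD
fn = 0    # metastases missed by CAD
sensitivity = tp / (tp + fn)
print(f"CAD sensitivity for metastases: {sensitivity:.0%}")  # 100%

# The 27 extra palpated nodules were benign, so missing them at CT cost no
# metastasis; specificity cannot be computed without a count of benign
# nodules visible on CT, which the abstract does not give.
```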

  12. Reliability of Physical Systems: Detection of Malicious Subcircuits (Trojan Circuits) in Sequential Circuits

    Matrosova, A. Yu.; Kirienko, I. E.; Tomkov, V. V.; Miryutov, A. A.


    Reliability of physical systems is provided by the reliability of their parts, including logical ones. Insertion of malicious subcircuits that can destroy a logical circuit or cause leakage of confidential information from a system necessitates the detection of such subcircuits, followed by their masking if possible. We suggest a method of finding a set of sequential circuit nodes in which Trojan Circuits can be inserted. The method is based on random estimations of controllability and observability of combinational nodes, calculated using a description of the sequential circuit working area and evidence of the existence of a transfer sequence for the proper set of internal states without finding the sequence itself. The method reduces computation by using operations on Reduced Ordered Binary Decision Diagrams (ROBDDs) that can depend only on the state variables of the circuit. The approach, unlike traditional ones, does not require preliminary sequential circuit simulation but can use its results. It can be used when malicious circuits cannot be detected during sequential circuit verification.
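    The paper's estimations are ROBDD-based; purely for intuition, 1-controllability can also be estimated by random simulation, as sketched below (the example node is hypothetical).

```python
import random

def controllability(gate_eval, n_inputs: int, trials: int = 10_000) -> float:
    """Random estimate of a node's 1-controllability: the fraction of
    random input vectors that drive the node to logic 1."""
    hits = sum(gate_eval([random.randint(0, 1) for _ in range(n_inputs)])
               for _ in range(trials))
    return hits / trials

# Hypothetical node: AND-OR cone y = (x0 & x1) | x2
node = lambda x: (x[0] & x[1]) | x[2]
print(controllability(node, n_inputs=3))  # ~0.625; rarely controllable nodes are Trojan candidates
```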

  13. Photometry's bright future: Detecting Solar System analogues with future space telescopes

    Hippke, Michael


    Time-series transit photometry from the Kepler space telescope has allowed for the discovery of thousands of exoplanets. We explore the potential of yet improved future missions such as PLATO 2.0 in detecting solar system analogues. We use real-world solar data and end-to-end simulations to explore the stellar and instrumental noise properties. By injecting and retrieving planets, rings and moons of our own solar system, we show that the discovery of Venus- and Earth-analogues transiting G-dwarfs like our Sun is feasible at high S/N after collecting 6 years of data, but Mars and Mercury will be difficult to detect due to stellar noise. In the best cases, Saturn's rings and Jupiter's moons will be detectable even in single transit observations. Through the high number (>1 billion) of stars observed by PLATO 2.0, it will become possible to detect thousands of single-transit events by cold gas giants, analogous to our Jupiter, Saturn, Uranus and Neptune. Our own solar system aside, we also show, through signal injection a...

  14. Future prospects for the detection and characterization of extrasolar planets

    Lunine J.I.


    Several distinctly different techniques have detected almost 500 planets orbiting main-sequence stars, including 45 multiple-planet systems, and a number of extrasolar planets have been the subject of direct study. Hundreds of other “candidate” planets detected by the Kepler spacecraft await confirmation of their existence. Planets are thus common phenomena around stars, and the prospects seem good in the next few years for establishing statistics on the occurrence of Earth-sized planets. Extension of the most successful technique, Doppler spectroscopy, to the sensitivity needed to detect Earth-mass planets around Sun-like stars will be limited by the noise generated by the stellar photospheres themselves. The James Webb Space Telescope will have the capability to measure atmospheric abundances of certain gases and of liquid water on extrasolar planets, including “superEarths” within a factor of two of the radius of the Earth. The ultimate goal of measuring the atmospheric composition of an Earth-sized planet orbiting at 1 AU around a star like the Sun remains a daunting challenge that is perhaps twenty years in the future.

  15. Application of Petri nets to reliability prediction of occupant safety systems with partial detection and repair

    Kleyner, Andre, E-mail: [Delphi Corporation, Electronics and Safety Division, P.O. Box 9005, M.S. CTC 2E, Kokomo, IN 46904 (United States); Volovoi, Vitali, E-mail: vitali.volovoi@ae.gatech.ed [School of Aerospace Engineering, Georgia Institute of Technology, Atlanta, GA 30332 (United States)


    This paper presents an application of stochastic Petri nets (SPN) to calculate the availability of safety-critical on-demand systems. Traditional methods of estimating system reliability include standards-based or field-return-based reliability prediction methods. These methods do not take into account the effect of fault-detection capability and penalize the addition of detection circuitry due to the higher parts count. Therefore, calculating system availability, which can be linked to the system's probability of failure on demand (P_fd), can be a better alternative to reliability prediction. The process of estimating the P_fd of a safety system can be further complicated by the presence of system imperfections such as partial fault detection by users and untimely or uncompleted repairs. Additionally, most system failures cannot be represented by Poisson-process Markov chain methods, which are commonly utilized for the purposes of estimating P_fd, as these methods are not well-suited for the analysis of non-Poisson failures. This paper suggests a methodology and presents a case study of SPN modeling adequately handling most of the above problems. The model is illustrated with a case study of an automotive electronics airbag controller as an example of a safety-critical on-demand system.
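    To make the availability argument concrete, here is a minimal Monte Carlo sketch of probability of failure on demand with partial detection and repair; all rates and probabilities are illustrative assumptions, not values from the paper, and the model is far simpler than an SPN.

```python
import random

def p_fail_on_demand(lam=1e-4, p_detect=0.7, mttr_h=48.0,
                     mission_h=10_000.0, trials=50_000) -> float:
    """Fraction of random demand instants at which a latent fault is present.

    Faults arrive as a Poisson process; a detected fault (prob. p_detect)
    is repaired after mttr_h hours, an undetected one stays latent until
    the end of the mission. One random demand per trial.
    """
    unavailable = 0
    for _ in range(trials):
        t, down = 0.0, []
        while True:
            t += random.expovariate(lam)
            if t >= mission_h:
                break
            end = t + mttr_h if random.random() < p_detect else mission_h
            down.append((t, min(end, mission_h)))
        demand = random.uniform(0.0, mission_h)
        unavailable += any(a <= demand < b for a, b in down)
    return unavailable / trials

print(p_fail_on_demand())
```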


    I.M. Braun


    This paper presents and analyzes two algorithms for the detection of hail zones in clouds and precipitation: a parametric algorithm and an adaptive non-parametric algorithm. The reliability of detecting radar signals from hailstones is investigated by statistical simulation, using experimental research data as initial input. The results demonstrate the limits of both algorithms as well as the greater robustness of the non-parametric algorithm. Polarimetric algorithms are suitable for implementation in ground-based and airborne weather radars.

  17. Revenue Sufficiency and Reliability in a Zero Marginal Cost Future: Preprint

    Frew, Bethany A.; Milligan, Michael; Brinkman, Greg; Bloom, Aaron; Clark, Kara; Denholm, Paul


    Features of existing wholesale electricity markets, such as administrative pricing rules and policy-based reliability standards, can distort market incentives and prevent generators from having sufficient opportunity to recover both fixed and variable costs. Moreover, these challenges can be amplified by other factors, including (1) inelastic demand resulting from a lack of price-signal clarity, (2) low- or near-zero-marginal-cost generation, particularly arising from low natural gas fuel prices and variable generation (VG), such as wind and solar, and (3) the variability and uncertainty of this VG. As power systems begin to incorporate higher shares of VG, many questions arise about the suitability of the existing marginal-cost-based price formation, primarily within an energy-only market structure, to ensure the economic viability of resources that might be needed to provide system reliability. This article discusses these questions and provides a summary of completed and ongoing modelling-based work at the National Renewable Energy Laboratory to better understand the impacts of evolving power systems on reliability and revenue sufficiency.

  18. Is sequential cranial ultrasound reliable for detection of white matter injury in very preterm infants?

    Leijser, Lara M.; Steggerda, Sylke J.; Walther, Frans J.; Wezel-Meijler, Gerda van [Leiden University Medical Center, Department of Pediatrics, Division of Neonatology, Leiden (Netherlands); Bruine, Francisca T. de; Grond, Jeroen van der [Leiden University Medical Center, Department of Radiology, Division of Neuroradiology, Leiden (Netherlands)


    Cranial ultrasound (cUS) may not be reliable for detection of diffuse white matter (WM) injury. Our aim was to assess, in very preterm infants, the reliability of a classification system for WM injury on sequential cUS throughout the neonatal period, using magnetic resonance imaging (MRI) as the reference standard. In 110 very preterm infants (gestational age <32 weeks), serial cUS during admission (median 8, range 4-22), a repeat cUS around term equivalent age (TEA), and a single MRI around TEA were performed. cUS scans during admission were assessed for the presence of WM changes; contemporaneous cUS and MRI around TEA were additionally assessed for abnormality of the lateral ventricles. Sequential cUS (from birth up to TEA) and MRI were classified as normal/mildly abnormal, moderately abnormal, or severely abnormal, based on a combination of findings in the WM and lateral ventricles. Predictive values of the cUS classification were calculated. Sequential cUS was classified as normal/mildly abnormal, moderately abnormal, and severely abnormal in, respectively, 22%, 65%, and 13% of infants, and MRI in, respectively, 30%, 52%, and 18%. The positive predictive value of the cUS classification for the MRI classification was high for severely abnormal WM (0.79) but lower for normal/mildly abnormal (0.67) and moderately abnormal (0.64) WM. Sequential cUS during the neonatal period detects severely abnormal WM in very preterm infants but is less reliable for mildly and moderately abnormal WM. MRI around TEA seems needed to reliably detect WM injury in very preterm infants.

  19. Role of genetic detection in peritoneal washes with gastric carcinoma: The past, present and future

    Hyun-Dong Chae


    The most frequent cause of treatment failure following surgery for gastric cancer is peritoneal dissemination, mainly caused by the seeding of free cancer cells from the primary gastric cancer, which is the most common type of spread. Unfortunately, there is as yet no standard modality for detecting intraperitoneal free cancer cells to predict peritoneal metastasis. A review of the English literature in PubMed was performed using the MeSH terms for gastric cancer, peritoneal wash, and reverse transcriptase polymerase chain reaction. All the articles were reviewed and core information was tabulated for reference. After a comprehensive review of all articles, the data were evaluated for the clinical implications and predictive value of each marker for peritoneal recurrence. There are still many limitations to overcome before genetic detection of free cancer cells can be considered a routine assay. To make it a reliable diagnostic tool for detecting free cancer cells, the process and method of genetic detection with peritoneal washes should be standardized, and the development of simple diagnostic devices and easily available kits is necessary. Herein, we review the past, present and future perspectives of peritoneal lavage for the detection of intraperitoneal free cancer cells in patients with gastric cancer.

  20. Reliability of a Tissue Microarray in Detecting Thyroid Transcription Factor-1 Protein in Lung Carcinomas

    Xiaoyan Bai; Hong Shen


    OBJECTIVE To compare the expression of thyroid transcription factor-1 (TTF-1) in human normal adult type II alveolar epithelial cells, embryonic pneumocytes, and cancer cells of lung carcinoma and metastatic lymph nodes using a tissue microarray (TMA) along with paired conventional full sections, and to investigate the reliability of tissue microarrays in detecting protein expression in lung carcinoma. METHODS A lung carcinoma TMA including 765 cores was constructed. TTF-1 protein expression in both the TMA and paired conventional full sections was detected by the immunohistochemical SP method using a monoclonal antibody to TTF-1. The Positive Unit (PU) of TTF-1 protein was assessed quantitatively by the Leica Q500MC image analysis system, with results from the paired conventional full sections as controls. RESULTS There was no significant difference between the TMA and paired conventional full sections in TTF-1 expression in the different nuclei of the lung tissue. CONCLUSION TTF-1 protein expression in lung carcinoma detected by TMA was highly concordant with that of paired full sections. TMA is a reliable method for detecting protein expression.

  1. Detection of Yarkovsky acceleration in the context of precovery observations and the future Gaia catalogue

    Desmars, J.


    Context. The Yarkovsky effect is a weak non-gravitational force leading to a small variation of the semi-major axis of an asteroid. Using radar measurements and astrometric observations, it is possible to measure a drift in semi-major axis through orbit determination. Aims: This paper aims to detect a reliable drift in semi-major axis of near-Earth asteroids (NEAs) from ground-based observations and to investigate the impact of precovery observations and the future Gaia catalogue on the detection of a secular drift in semi-major axis. Methods: We have developed a precise dynamical model of an asteroid's motion taking the Yarkovsky acceleration into account and allowing the fitting of the drift in semi-major axis. Using statistical methods, we investigate the quality and the robustness of the detection. Results: By filtering spurious detections with an estimated maximum drift depending on the asteroid's size, we found 46 NEAs with a reliable drift in semi-major axis, in good agreement with previous studies. The measure of the drift leads to a better orbit determination and constrains some physical parameters of these objects. Our results are in good agreement with the 1/D dependence of the drift and with the expected ratio of prograde and retrograde NEAs. We show that the uncertainty of the drift mainly depends on the length of the orbital arc, and in this way we highlight the importance of precovery observations and data mining in the detection of a consistent drift. Finally, we discuss the impact of the Gaia catalogue on the determination of the drift in semi-major axis.
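    For context, the quantities behind the detection criterion can be summarized as follows (notation assumed here, not quoted from the paper):

```latex
% \dot a is the fitted drift in semi-major axis, D the asteroid diameter.
\[
  \mathrm{SNR} \;=\; \frac{|\dot a|}{\sigma_{\dot a}},
  \qquad
  \dot a \;\propto\; \frac{1}{D},
  \qquad
  \text{flag as spurious if } |\dot a| > \dot a_{\max}(D),
\]
% where \dot a_{\max}(D) is the estimated maximum Yarkovsky drift for size D.
```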

  2. Effective confidence interval estimation of fault-detection process of software reliability growth models

    Fang, Chih-Chiang; Yeh, Chun-Wu


    The quantitative evaluation of a software reliability growth model is frequently accompanied by a confidence interval for fault detection. It provides helpful information to software developers and testers when undertaking software development and software quality control. However, previous studies have not been transparent about variance estimation for software fault detection, which affects the derivation of confidence intervals for the mean value function; the current study addresses this. Software engineers in such a case cannot evaluate the potential hazard based on the stochasticity of the mean value function, which reduces the practicality of the estimation. Hence, stochastic differential equations are utilised for confidence interval estimation of the software fault-detection process. The proposed model is estimated and validated using real data sets to show its flexibility.
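    For intuition about confidence intervals on a fault-detection process, a minimal sketch using the common Goel-Okumoto mean value function with a Poisson normal approximation; this is a generic SRGM illustration, not the stochastic-differential-equation model the paper proposes, and the parameters are illustrative.

```python
import math

def goel_okumoto_mvf(t: float, a: float, b: float) -> float:
    """Expected cumulative faults detected by time t: m(t) = a(1 - e^{-bt})."""
    return a * (1.0 - math.exp(-b * t))

def poisson_ci(m: float, z: float = 1.96) -> tuple[float, float]:
    """Normal approximation to a Poisson count: m +/- z*sqrt(m)."""
    half = z * math.sqrt(m)
    return max(m - half, 0.0), m + half

# Illustrative parameters: a = 120 total faults, detection rate b = 0.05 per week
for week in (4, 12, 26):
    m = goel_okumoto_mvf(week, a=120.0, b=0.05)
    lo, hi = poisson_ci(m)
    print(f"week {week:2d}: m(t) = {m:6.1f}, 95% CI ({lo:.1f}, {hi:.1f})")
```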

  3. Are the Projections of Future Climate Change Reliable in the IPCC Reports?

    Zongci Zhao


    As we know, the projections of future climate change, including impacts and strategies, in the IPCC Assessment Reports were based on global climate models with scenarios for various human activities. Global climate model simulations provide key inputs for climate change assessments. In this study, the main objective is to analyze whether the projections of future climate change by global climate models are reliable. Several workshops have been held on this issue, such as the IPCC expert meeting on assessing and combining multi-model climate projections in January 2010 (presided over by the co-chairs of the IPCC WGI and WGII AR5), and the workshop of the combined global climate model group held by NCAR in June 2010.

  4. Circuit design for reliability

    Cao, Yu; Wirth, Gilson


    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective in enhancing the reliability of various circuit units. The authors provide readers with techniques for state-of-the-art and future technologies, ranging from technology modeling, fault detection and analysis, and circuit hardening to reliability management. Provides a comprehensive review of various reliability mechanisms at sub-45nm nodes; describes practical modeling and characterization techniques for reliability; includes a thorough presentation of robust design techniques for major VLSI design units; promotes physical understanding with first-principle simulations.

  5. Reliability and Minimum Detectable Change of Temporal-Spatial, Kinematic, and Dynamic Stability Measures during Perturbed Gait.

    Christopher A Rábago

    Temporal-spatial, kinematic variability, and dynamic stability measures collected during perturbation-based assessment paradigms are often used to identify dysfunction associated with gait instability. However, it remains unclear which measures are most reliable for detecting and tracking responses to perturbations. This study systematically determined the between-session reliability and minimum detectable change values of temporal-spatial, kinematic variability, and dynamic stability measures during three types of perturbed gait. Twenty young healthy adults completed two identical testing sessions two weeks apart, comprised of an unperturbed and three perturbed (cognitive, physical, and visual) walking conditions in a virtual reality environment. Within each session, perturbation responses were compared to unperturbed walking using paired t-tests. Between-session reliability and minimum detectable change values were also calculated for each measure and condition. All temporal-spatial, kinematic variability and dynamic stability measures demonstrated fair to excellent between-session reliability. Minimum detectable change values, normalized to mean values, ranged from 1% to 50%. Step width mean and variability measures demonstrated the greatest response to perturbations, with excellent between-session reliability and low minimum detectable change values. Orbital stability measures demonstrated specificity to perturbation direction and sensitivity, with excellent between-session reliability and low minimum detectable change values. We observed substantially greater between-session reliability and lower minimum detectable change values for local stability measures than previously described, which may be the result of averaging across trials within a session and using velocity versus acceleration data for reconstruction of state spaces. Across all perturbation types, temporal-spatial, orbital and local measures were the most reliable measures with the

  6. Reliability of quantitative real-time PCR for bacterial detection in cystic fibrosis airway specimens.

    Edith T Zemanick

    The cystic fibrosis (CF) airway microbiome is complex; polymicrobial infections are common, and the presence of fastidious bacteria, including anaerobes, makes culture-based diagnosis challenging. Quantitative real-time PCR (qPCR) offers a culture-independent method for bacterial quantification that may improve diagnosis of CF airway infections; however, the reliability of qPCR applied to CF airway specimens is unknown. We sought to determine the reliability of nine specific bacterial qPCR assays (total bacteria, three typical CF pathogens, and five anaerobes) applied to CF airway specimens. Airway and salivary specimens from clinically stable pediatric CF subjects were collected. Quantitative PCR assay repeatability was determined using triplicate reactions. Split-sample measurements were performed to measure variability introduced by DNA extraction. Results from qPCR were compared to standard microbial culture for Pseudomonas aeruginosa, Staphylococcus aureus, and Haemophilus influenzae, common pathogens in CF. We obtained 84 sputum, 47 oropharyngeal and 27 salivary specimens from 16 pediatric subjects with CF. Quantitative PCR detected bacterial DNA in over 97% of specimens. All qPCR assays were highly reproducible at quantities ≥10^2 rRNA gene copies/reaction, with a coefficient of variation less than 20% for over 99% of samples. There was also excellent agreement between samples processed in duplicate. Anaerobic bacteria were highly prevalent and were detected in mean quantities similar to those of typical CF pathogens. Compared to a composite gold standard, qPCR and culture had variable sensitivities for detection of P. aeruginosa, S. aureus and H. influenzae from CF airway samples. By reliably quantifying fastidious airway bacteria, qPCR may improve our understanding of polymicrobial CF lung infections and progression of lung disease, and ultimately improve antimicrobial treatments.
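    Repeatability above is summarized by the coefficient of variation across triplicate reactions; a minimal sketch of that computation (sample values are illustrative).

```python
import statistics

def coefficient_of_variation(replicates: list[float]) -> float:
    """CV (%) of replicate qPCR quantities: 100 * SD / mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Illustrative triplicate quantities (rRNA gene copies/reaction)
triplicate = [3.1e4, 2.8e4, 3.3e4]
print(f"CV = {coefficient_of_variation(triplicate):.1f}%")  # below the 20% threshold
```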

  7. Detecting future performance of the reservoirs under the changing climate

    Biglarbeigi, Pardis; Strong, W. Alan; Griffiths, Philip


    Climate change is expected to affect the hydrological cycle, resulting in changes in rainfall patterns and seasonal variations as well as flooding and drought. Changes in the hydrologic regime of rivers are another anticipated effect of climate change. This climatic variability puts pressure on renewable water resources, increasing them in some regions, decreasing them in others, and bringing high uncertainty to every region. As a result of the pressure of climate change on water resources, the operation of reservoirs and dams is expected to face uncertainties in different respects, such as supplying water and controlling floods. In this study, we model two hypothetical dams on different streamflows, based on the water needs of 20'000 and 100'000 people. The UK, a country that has suffered several flooding events during the past years, and Iran, a country with severe water scarcity, are chosen as the nations under study. For this study, the hypothetical dam is modeled on three streamflows in each nation. The mass-balance model of the system is then optimised over 25 years of historical data, considering two objectives: 1) minimisation of the water deficit in different sectors (agricultural, domestic and industrial), and 2) minimisation of flooding around the reservoir catchment. The optimised policies are simulated in the model again under different climate change and demographic scenarios to obtain the resilience, reliability and vulnerability (RRV) indices of the system. To this end, two different sets of scenarios are introduced; the first set comprises the scenarios introduced in the IPCC Special Report on Emissions Scenarios (SRES). The second set is a Monte Carlo simulation of demographic and temperature scenarios. Demographic scenarios are defined as the UN's estimates of population based on age, sex, fertility, mortality and migration rates with a 2-year frequency. Temperature scenarios, on the other

  8. Validity and reliability of an IMU-based method to detect APAs prior to gait initiation.

    Mancini, Martina; Chiari, Lorenzo; Holmstrom, Lars; Salarian, Arash; Horak, Fay B


    Anticipatory postural adjustments (APAs) prior to gait initiation have been largely studied in traditional, laboratory settings using force plates under the feet to characterize the displacement of the center of pressure. However clinical trials and clinical practice would benefit from a portable, inexpensive method for characterizing APAs. Therefore, the main objectives of this study were (1) to develop a novel, automatic IMU-based method to detect and characterize APAs during gait initiation and (2) to measure its test-retest reliability. Experiment I was carried out in the laboratory to determine the validity of the IMU-based method in 10 subjects with PD (OFF medication) and 12 control subjects. Experiment II was carried out in the clinic, to determine test-retest reliability of the IMU-based method in a different set of 17 early-to-moderate, treated subjects with PD (tested ON medication) and 17 age-matched control subjects. Results showed that gait initiation characteristics (both APAs and 1st step) detected with our novel method were significantly correlated to the characteristics calculated with a force plate and motion analysis system. The size of APAs measured with either inertial sensors or force plate was significantly smaller in subjects with PD than in control subjects (p<0.05). Test-retest reliability for the gait initiation characteristics measured with inertial sensors was moderate-to-excellent (0.56

  9. The simulation of cutoff lows in a regional climate model: reliability and future trends

    Grose, Michael R. [University of Tasmania, Antarctic Climate and Ecosystems Cooperative Research Centre (ACE CRC), Private Bag 80, Hobart, TAS (Australia); Pook, Michael J.; McIntosh, Peter C.; Risbey, James S. [CSIRO Marine and Atmospheric Research, Centre for Australian Weather and Climate Research (CAWCR), Hobart, TAS (Australia); Bindoff, Nathaniel L. [University of Tasmania, Antarctic Climate and Ecosystems Cooperative Research Centre (ACE CRC), Private Bag 80, Hobart, TAS (Australia); CSIRO Marine and Atmospheric Research, Centre for Australian Weather and Climate Research (CAWCR), Hobart, TAS (Australia); University of Tasmania, Institute of Marine and Antarctic Studies (IMAS), Private Bag 129, Hobart, TAS (Australia)


    Cutoff lows are an important source of rainfall in the mid-latitudes that climate models need to simulate accurately to give confidence in climate projections for rainfall. Coarse-scale general circulation models used for climate studies show some notable biases and deficiencies in the simulation of cutoff lows in the Australian region and in important aspects of the broader circulation, such as atmospheric blocking and the split jet structure observed over Australia. The regional climate model, the conformal cubic atmospheric model (CCAM), gives an improvement in some aspects of the simulation of cutoffs in the Australian region, including a reduction in the underestimate of the frequency of cutoff days by more than 15% compared to a typical GCM. This improvement is due at least in part to substantially higher resolution. However, biases in the simulation of the broader circulation, blocking and the split jet structure are still present. In particular, a northward bias in the central latitude of cutoff lows creates a substantial underestimate of the associated rainfall over Tasmania from April to October. Also, the regional climate model produces a significant north-south distortion of the vertical profile of cutoff lows, with the largest distortion occurring in the cooler months, which was not apparent in GCM simulations. The remaining biases and the presence of new biases demonstrate that increased horizontal resolution is not the only requirement for the reliable simulation of cutoff lows in climate models. Notwithstanding the biases in their simulation, the regional climate model projections show some responses to climate warming that are noteworthy. The projections indicate a marked closing of the split jet in winter. This change is associated with changes to atmospheric blocking in the Tasman Sea, which decreases from June to November (by up to 7.9 m s^-1), and increases from December to May. The projections also show a reduction in the number of annual cutoff days by 67

  10. Highly Sensitive and Reliable Detection of EGFR Exon 19 Deletions by Droplet Digital Polymerase Chain Reaction.

    Oskina, Natalya; Oscorbin, Igor; Khrapov, Evgeniy; Boyarskikh, Ulyana; Subbotin, Dmitriy; Demidova, Irina; Imyanitov, Evgeny; Filipenko, Maxim


    Analysis of EGFR mutations is becoming a routine clinical practice but the optimal EGFR mutation testing method is still to be determined. We determined the nucleotide sequence of deletions located in exon 19 of the EGFR gene in lung tumor samples of patients residing in different regions of Russia (153 tumor DNA specimens), using Sanger sequencing. We developed a droplet digital polymerase chain reaction assay capable of detecting all common EGFR deletions in exon 19. We also compared the therascreen amplification refractory mutation system assay with a droplet digital polymerase chain reaction assay for the detection of all the deletions in our study. The droplet digital polymerase chain reaction assay demonstrated 100% sensitivity against polymerase chain reaction fragment length analysis and detected all possible types of deletions revealed in our study (22 types). At the same time, the therascreen EGFR RGQ PCR Kit was not able to detect deletions c.2252-2276>A and c.2253-2276 and showed low performance for another long deletion. Thus, we can conclude that the extraordinary length of deletions and their atypical locations (shift at the 3'-region compared to known deletions) could be problematic for the therascreen EGFR RGQ PCR Kit and should be taken into account during targeted mutation test development. However, droplet digital polymerase chain reaction is a promising and reliable assay that can be used as a diagnostic tool to genotype formalin-fixed paraffin-embedded cancer samples for EGFR or another clinically relevant somatic mutation.
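    Droplet digital PCR converts positive-droplet counts into absolute concentrations through Poisson statistics; a minimal sketch of the standard calculation (droplet volume and counts below are illustrative, not from this study).

```python
import math

def ddpcr_copies_per_ul(positive: int, total: int, droplet_nl: float = 0.85) -> float:
    """Absolute target concentration from droplet counts.

    With targets Poisson-distributed across droplets, the mean copies
    per droplet is lambda = -ln(1 - p), where p is the fraction of
    positive droplets.
    """
    p = positive / total
    lam = -math.log(1.0 - p)
    return lam / (droplet_nl * 1e-3)  # copies per microliter of reaction

# Illustrative counts: 1200 positive droplets out of 15000 accepted
print(f"{ddpcr_copies_per_ul(1200, 15000):.0f} copies/uL")
```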

  11. Analysis of muscle fiber conduction velocity enables reliable detection of surface EMG crosstalk during detection of nociceptive withdrawal reflexes.

    Jensen, Michael Brun; Manresa, José Alberto Biurrun; Frahm, Ken Steffen; Andersen, Ole Kæseler


    .96) for the tibialis anterior and soleus muscles. This study investigated the negative effect of electrical crosstalk during reflex detection and revealed that the use of a previously validated scoring criterion may result in poor specificity due to crosstalk. The excellent performance of the developed methodology in the presence of crosstalk shows that assessment of muscle fiber conduction velocity allows reliable detection of EMG crosstalk during reflex detection.

  12. Visual acuity measures do not reliably detect childhood refractive error--an epidemiological study.

    Lisa O'Donoghue

    PURPOSE: To investigate the utility of uncorrected visual acuity measures in screening for refractive error in white school children aged 6-7 years and 12-13 years. METHODS: The Northern Ireland Childhood Errors of Refraction (NICER) study used a stratified random cluster design to recruit children from schools in Northern Ireland. Detailed eye examinations included assessment of logMAR visual acuity and cycloplegic autorefraction. Spherical equivalent refractive data from the right eye were used to classify significant refractive error as myopia of at least 1DS, hyperopia greater than +3.50DS, or astigmatism greater than 1.50DC, whether it occurred in isolation or in association with myopia or hyperopia. RESULTS: Results are presented for 661 white 12-13-year-old and 392 white 6-7-year-old school children. Using a cut-off of uncorrected visual acuity poorer than 0.20 logMAR to detect significant refractive error gave a sensitivity of 50% and specificity of 92% in 6-7-year-olds, and 73% and 93%, respectively, in 12-13-year-olds. In 12-13-year-old children, a cut-off of poorer than 0.20 logMAR had a sensitivity of 92% and a specificity of 91% in detecting myopia, and a sensitivity of 41% and a specificity of 84% in detecting hyperopia. CONCLUSIONS: Vision screening using logMAR acuity can reliably detect myopia, but not hyperopia or astigmatism, in school-age children. Providers of vision screening programs should be cognisant that where detection of uncorrected hyperopic and/or astigmatic refractive error is an aspiration, current UK protocols will not effectively deliver it.
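    The screening statistics quoted above follow directly from a 2x2 confusion table; a minimal sketch (the counts are hypothetical, chosen only to reproduce the 6-7-year-old figures).

```python
def screen_stats(tp: int, fp: int, fn: int, tn: int) -> tuple[float, float]:
    """Sensitivity and specificity of a screening cut-off."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: acuity poorer than 0.20 logMAR vs. cycloplegic refraction
sens, spec = screen_stats(tp=45, fp=50, fn=45, tn=575)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # 50%, 92%
```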


    P.V. Srihari


    Fault diagnosis of gearboxes plays an important role in increasing the availability of machinery in condition monitoring. An effort has been made in this work to develop an artificial neural network (ANN)-based fault detection system to increase reliability. Two prominent fault conditions in gears, worn-out and broken teeth, are simulated, and five feature parameters are extracted from the vibration signals to serve as inputs to the ANN-based fault detection system developed in MATLAB: a three-layer feed-forward network using a back-propagation algorithm. This ANN system was trained with 30 sets of data and tested with 10 sets of data. The learning rate and the number of hidden-layer neurons were varied individually, and the optimal training parameters were found based on the number of epochs. Among the five learning rates used, 0.15 was found to be the optimal one, and at that learning rate nine hidden-layer neurons proved optimal out of the three values considered. Then, keeping the training parameters fixed, the number of hidden layers was varied by comparing the performance of the networks; the results show that two and three hidden layers give the best detection accuracy.
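    A minimal sketch of such a classifier in scikit-learn; the five vibration features and labels are random placeholders, and the original work used MATLAB with its own feature set.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Placeholder data: 30 training rows of 5 vibration features,
# labels 0 = healthy, 1 = worn-out teeth, 2 = broken tooth
X_train = rng.normal(size=(30, 5))
y_train = rng.integers(0, 3, size=30)

# Single hidden layer of 9 neurons, learning rate 0.15, as in the abstract
clf = MLPClassifier(hidden_layer_sizes=(9,), solver="sgd",
                    learning_rate_init=0.15, max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

X_test = rng.normal(size=(10, 5))
print(clf.predict(X_test))  # predicted fault class per test signature
```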

  14. A simple and reliable methodology to detect egg white in art samples

    Michela Gambino; Francesca Cappitelli; Cristina Cattò; Aristodemo Carpen; Pamela Principi; Lisa Ghezzi; Ilaria Bonaduce; Eugenio Galano; Pietro Pucci; Leila Birolo; Federica Villa; Fabio Forlani


    A protocol for a simple and reliable dot-blot immunoassay was developed and optimized to test works of art for the presence of specific proteinaceous material (i.e. ovalbumin-based). The analytical protocol has been extensively set up with respect, among other things, to protein extraction conditions, densitometric analysis, and the colorimetric reaction conditions. A feasibility evaluation demonstrated that a commercial scanner and free image analysis software can be used for data acquisition and elaboration, thus facilitating the application of the proposed protocol to commonly equipped laboratories and to laboratories of museums and conservation centres. The introduction of the method of standard additions in the analysis of fresh and artificially aged laboratory-prepared samples, containing egg white and various pigments, allowed us to evaluate the matrix effect and the effect of sample aging, and to generate threshold density values useful for the detection of ovalbumin in samples from ancient works of art. The efficacy of the developed dot-blot immunoassay was proven by testing microsamples from 13th–16th century mural paintings of the Saint Francesco Church in Lodi (Italy). Despite the aging, the altered conditions of conservation, the complex matrix, and the micro-size of the samples, the presence of ovalbumin was detected in all those mural painting samples where mass-spectrometry-based proteomic analysis unambiguously detected ovalbumin peptides.
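    The method of standard additions mentioned above extrapolates native analyte content from a spiked dilution series; a minimal sketch of that computation (signal values are illustrative).

```python
import numpy as np

def standard_additions(added: np.ndarray, signal: np.ndarray) -> float:
    """Native analyte amount by the method of standard additions.

    Fit signal = m*added + b; the zero-signal intercept is at
    added = -b/m, so the native content is its magnitude b/m."""
    m, b = np.polyfit(added, signal, 1)
    return b / m

# Illustrative spikes (ng ovalbumin added) and densitometric signals
added = np.array([0.0, 10.0, 20.0, 40.0])
signal = np.array([0.32, 0.55, 0.79, 1.25])
print(f"estimated native ovalbumin ~ {standard_additions(added, signal):.1f} ng")
```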

  15. Reliable dual-redundant sensor failure detection and identification for the NASA F-8 DFBW aircraft

    Deckert, J. C.; Desai, M. N.; Deyst, J. J., Jr.; Willsky, A. S.


    A technique was developed which provides reliable failure detection and identification (FDI) for a dual-redundant subset of the flight control sensors onboard the NASA F-8 digital fly-by-wire (DFBW) aircraft. The technique was successfully applied to simulated sensor failures on the real-time F-8 digital simulator and to sensor failures injected on telemetry data from a test flight of the F-8 DFBW aircraft. For failure identification, the technique utilized the analytic redundancy which exists as functional and kinematic relationships among the various quantities measured by the different control sensor types. The technique can be used not only in a dual-redundant sensor system, but also in a more highly redundant system after FDI by conventional voting techniques has reduced the number of unfailed sensors of a particular type to two. In addition, the technique can easily be extended to the case in which only one sensor of a particular type is available.
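    Analytic redundancy, as used above, compares a sensor against a value reconstructed from relationships among other sensors; a minimal sketch of one such residual test (the kinematic relationship, small-angle assumption, and threshold are illustrative, not the F-8 implementation).

```python
import numpy as np

def fdi_residual(measured: np.ndarray, reconstructed: np.ndarray,
                 threshold: float) -> bool:
    """Flag a failure when the analytic-redundancy residual persists
    above threshold (windowed mean rejects transient noise)."""
    residual = np.abs(measured - reconstructed)
    return residual.mean() > threshold

# Illustrative kinematic check: roll rate p vs. derivative of roll angle
# (valid for small angles, where p ~ d(phi)/dt)
t = np.linspace(0.0, 1.0, 101)
phi = 0.2 * np.sin(2 * np.pi * t)              # roll angle (rad)
p_reconstructed = np.gradient(phi, t)          # expected roll rate (rad/s)
p_measured = p_reconstructed + 0.5             # biased (failed) rate gyro
print(fdi_residual(p_measured, p_reconstructed, threshold=0.2))  # True -> failure
```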

  16. Link Reliability based Detection and Predecessor base Route Establishment for Prevention of Wormhole Attack

    Nansi Jain


    A Mobile Ad hoc Network (MANET) consists of mobile hosts or sensor nodes capable of functioning in the absence of infrastructure. Such networks should be capable of self-forming, self-organizing, self-managing, and self-recovering, and be able to operate under dynamic conditions. Multi-hop communication is used to send information to the receiver: each mobile node depends on its neighbors or in-range nodes to forward data packets to the destination. Most previous studies on MANETs have implicitly assumed that nodes are cooperative, so node cooperation becomes a very important issue. An attacker in a dynamic network easily degrades routing performance: the data-receiving ratio drops relative to normal network performance and data dropping increases. A degraded packet-delivery percentage is the confirmation of attacker misbehavior. A wormhole attack is characterized by creating a tunnel, replying with a positive acknowledgement on behalf of the destination at route-request time, and dropping all data delivered through the tunnel. The attacker is identified from past and current data receiving and forwarding in the MANET. The proposed IPS (Intrusion Detection and Prevention System) provides security on the basis of link reliability. In this work, we propose a new link-reliability-based security scheme using predecessor-based route establishment to detect and prevent the routing misbehavior of wormhole attacks in MANETs. The attacker is blocked through a broadcasting scheme, used by the proposed prevention scheme, that announces its actual identity to neighbors. The security-provider nodes block the attacker's communication and provide secure communication among the mobile nodes. The performance of the proposed scheme is evaluated through metrics such as PDR and throughput.

  17. Reliability assessment of null allele detection: inconsistencies between and within different methods.

    Dąbrowski, M J; Pilot, M; Kruczyk, M; Żmihorski, M; Umer, H M; Gliwicz, J


    Microsatellite loci are widely used in population genetic studies, but the presence of null alleles may lead to biased results. Here, we assessed five methods that indirectly detect null alleles and found large inconsistencies among them. Our analysis was based on 20 microsatellite loci genotyped in a natural population of Microtus oeconomus sampled during 8 years, together with 1200 simulated populations without null alleles, but experiencing bottlenecks of varying duration and intensity, and 120 simulated populations with known null alleles. In the natural population, 29% of positive results were consistent between the methods in pairwise comparisons, and in the simulated data set, this proportion was 14%. The positive results were also inconsistent between different years in the natural population. In the null-allele-free simulated data set, the number of false positives increased with increased bottleneck intensity and duration. We also found a low concordance in null allele detection between the original simulated populations and their 20% random subsets. In the populations simulated to include null alleles, between 22% and 42% of true null alleles remained undetected, which highlighted that detection errors are not restricted to false positives. None of the evaluated methods clearly outperformed the others when both false-positive and false-negative rates were considered. Accepting only the positive results consistent between at least two methods should considerably reduce the false-positive rate, but this approach may increase the false-negative rate. Our study demonstrates the need for novel null allele detection methods that could be reliably applied to natural populations.
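    The abstract does not name the five methods; for context, one of the simplest indirect approaches estimates null allele frequency from the heterozygote deficit, commonly attributed to Brookfield (1996). A minimal sketch.

```python
def null_allele_freq_estimate(h_expected: float, h_observed: float) -> float:
    """Heterozygote-deficit estimate of null allele frequency:
    r = (He - Ho) / (1 + He), positive values suggesting a null allele."""
    return (h_expected - h_observed) / (1.0 + h_expected)

# Illustrative locus: expected heterozygosity 0.80, observed 0.65
print(f"r = {null_allele_freq_estimate(0.80, 0.65):.3f}")  # ~0.083
```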

  18. Sensitive and reliable detection of grapevine fanleaf virus in a single Xiphinema index nematode vector.

    Demangeat, Gérard; Komar, Véronique; Cornuet, Pascal; Esmenjaud, Daniel; Fuchs, Marc


    Grapevine fanleaf virus (GFLV) is specifically transmitted from plant to plant by the ectoparasitic nematode Xiphinema index. A sensitive and reliable procedure was developed to readily detect GFLV in a single viruliferous X. index, regardless of the nematode's origin, i.e. greenhouse rearings or vineyard soils. The assay is based on bead milling to disrupt nematodes extracted from soil samples, solid-phase extraction of total nematode RNAs, and amplification of a 555 bp fragment of the coat protein (CP) gene by reverse transcription-polymerase chain reaction with two primers designed from conserved sequences. This procedure is sensitive, since the CP gene fragment is amplified from an artificial sample consisting of one viruliferous nematode mixed with 3000 aviruliferous individuals. In addition, StyI RFLP analysis of the CP amplicon enables the GFLV isolate carried by a single viruliferous X. index to be characterized. This GFLV detection assay opens new avenues for epidemiological studies and for molecular investigations of the mechanism of X. index-mediated GFLV transmission.

  19. Turbine Reliability and Operability Optimization through the use of Direct Detection Lidar Final Technical Report

    Johnson, David K; Lewis, Matthew J; Pavlich, Jane C; Wright, Alan D; Johnson, Kathryn E; Pace, Andrew M


    The goal of this Department of Energy (DOE) project is to increase wind turbine efficiency and reliability with the use of a Light Detection and Ranging (LIDAR) system. The LIDAR provides wind speed and direction data that can be used to help mitigate the fatigue stress on the turbine blades and internal components caused by wind gusts, sub-optimal pointing and reactionary speed or RPM changes. This effort will have a significant impact on the operation and maintenance costs of turbines across the industry. During the course of the project, Michigan Aerospace Corporation (MAC) modified and tested a prototype direct detection wind LIDAR instrument; the resulting LIDAR design considered all aspects of wind turbine LIDAR operation from mounting, assembly, and environmental operating conditions to laser safety. Additionally, in co-operation with our partners, the National Renewable Energy Lab and the Colorado School of Mines, progress was made in LIDAR performance modeling as well as LIDAR feed forward control system modeling and simulation. The results of this investigation showed that using LIDAR measurements to change between baseline and extreme event controllers in a switching architecture can reduce damage equivalent loads on blades and tower, and produce higher mean power output due to fewer overspeed events. This DOE project has led to continued venture capital investment and engagement with leading turbine OEMs, wind farm developers, and wind farm owner/operators.

  20. Digital array gas radiometer (DAGR): a sensitive and reliable trace gas detection concept

    Gordley, Larry L.; McHugh, Martin J.; Marshall, B. T.; Thompson, Earl


    The Digital Array Gas Radiometer (DAGR) concept is based on traditional and reliable Gas Filter Correlation Radiometry (GFCR) for remote trace gas detection and monitoring. GFCR sensors have been successful in many infrared remote sensing applications. Historically, however, solar backscatter measurements have not been as successful because instrument designs have been susceptible to natural variations in surface albedo, which induce clutter and degrade the sensitivity. DAGR overcomes this limitation with several key innovations. First, a pupil imaging system scrambles the received light, removing nearly all spatial clutter and permitting a small calibration source to be easily inserted. Then, by using focal plane arrays rather than single detectors to collect the light, dramatic advances in dynamic range can be achieved. Finally, when used with the calibration source, data processing approaches can further mitigate detector non-uniformity effects. DAGR sensors can be made as small as digital cameras and are well suited for downlooking detection of gases in the boundary layer, where solar backscatter measurements are needed to overcome the lack of thermal contrast in the IR. Easily integrated into a satellite platform, a space-based DAGR would provide near-global sensing of climatically important species such as CO, CH4, and N2O. Aircraft and UAV measurements with a DAGR could be used to monitor agricultural and industrial emissions. Ground-based or portable DAGRs could augment early warning systems for chemical weapons or toxic materials. Finally, planetary science applications include detection and mapping of biomarkers such as CH4 in the Martian atmosphere.

  1. Diffusion-weighted MR imaging in postoperative follow-up: Reliability for detection of recurrent cholesteatoma

    Cimsit, Nuri Cagatay [Marmara University Hospital, Department of Radiology, Istanbul (Turkey); Engin Sitesi Peker Sokak No:1 D:13, 34330 Levent, Istanbul (Turkey)]; Cimsit, Canan [Goztepe Education and Research Hospital, Department of Radiology, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, Radyoloji Klinigi, Goztepe, Istanbul (Turkey)]; Baysal, Begumhan [Goztepe Education and Research Hospital, Department of Radiology, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, Radyoloji Klinigi, Goztepe, Istanbul (Turkey)]; Ruhi, Ilteris Cagatay [Goztepe Education and Research Hospital, Department of ENT, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, KBB Klinigi, Goztepe, Istanbul (Turkey)]; Ozbilgen, Suha [Goztepe Education and Research Hospital, Department of ENT, Istanbul (Turkey); Istanbul Goztepe Egitim ve Arastirma Hastanesi, KBB Klinigi, Goztepe, Istanbul (Turkey)]; Aksoy, Elif Ayanoglu [Acibadem Bakirkoy Hospital, Department of ENT, Istanbul (Turkey); Acibadem Hastanesi, KBB Boeluemue, Bakirkoey, Istanbul (Turkey)]


    Introduction: Cholesteatoma is a progressively growing process that destroys the neighboring bony structures, and its treatment is surgical removal. Follow-up is important in the postoperative period, since further surgery is necessary if recurrence is present, but not if granulation tissue is detected. This study evaluates whether diffusion-weighted MR imaging alone can be a reliable alternative to CT, without the use of a contrast agent, for follow-up of postoperative patients in detecting recurrent cholesteatoma. Materials and methods: 26 consecutive patients with mastoidectomy reporting for routine follow-up CT were included in the study if there was loss of middle ear aeration on the CT examination. MR images were evaluated for loss of aeration and signal intensity changes on diffusion-weighted sequences. Surgical results were compared with imaging findings. Results: Interpretation of the MR images was parallel to the loss of aeration detected on CT for all 26 patients. Of the 26 patients examined, 14 were not evaluated as recurrent cholesteatoma, which was verified at surgery (NPV: 100%). Twelve patients were diagnosed as recurrent cholesteatoma, and 11 were surgically confirmed as recurrent cholesteatoma (PPV: 91.7%). Four of these 11 patients had a loss-of-aeration area greater than the high-signal-intensity area on DWI, which was surgically confirmed as granulation tissue or fibrosis accompanying recurrent cholesteatoma. Conclusion: Diffusion-weighted MR for suspected recurrent cholesteatoma is a valuable tool to cut costs and prevent unnecessary second-look surgeries. It has the potential to become the MR sequence of choice to differentiate recurrent cholesteatoma from other causes of loss of aeration in patients with mastoidectomy.
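    The predictive values above follow from the reported counts; a small worked check using only the numbers in the abstract.

```python
# Counts from the abstract: 12 DWI-positive (11 true recurrences),
# 14 DWI-negative (all surgically confirmed free of recurrence)
tp, fp = 11, 1
tn, fn = 14, 0
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"PPV = {ppv:.1%}, NPV = {npv:.0%}")  # 91.7%, 100%
```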

  2. Prospects of Gravitational Wave Detection Using Pulsar Timing Array for Chinese Future Telescopes

    Lee, K. J.


    In this paper, we estimate the sensitivity of gravitational wave (GW) detection for future Chinese pulsar timing array (PTA) projects. The calculation of sensitivity is based on the well-known Cramér-Rao bound. Red noise and dispersion measure (DM) variation noise have been included in the modeling. We demonstrate that future Chinese telescopes can be very valuable for future PTA experiments and GW detection efforts.
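    For context on the sensitivity estimate (standard estimation theory, not the paper's derivation): the Cramér-Rao bound lower-limits the variance of any unbiased estimator by the inverse Fisher information.

```latex
% Cramér-Rao bound for an unbiased estimator \hat{\theta} of a GW parameter \theta:
\[
  \operatorname{Var}(\hat{\theta}) \;\ge\; \big[\mathcal{I}(\theta)\big]^{-1},
  \qquad
  \mathcal{I}(\theta) = -\,\mathbb{E}\!\left[\frac{\partial^2 \ln L}{\partial \theta^2}\right],
\]
% so the detectable amplitude scales with the timing-noise covariance
% (red noise and DM variations enter through the likelihood L).
```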

  3. Consortium for Electric Reliability Technology Solutions Grid of the Future White Paper on Review of Recent Reliability Issues and Systems Events

    Hauer, John F.; Dagle, Jeffery E.


    This report is one of six reports developed under the U.S. Department of Energy (DOE) program in Power System Integration and Reliability (PSIR). The objective of this report is to review, analyze, and evaluate critical reliability issues demonstrated by recent disturbance events in the North American power system. Eleven major disturbances are examined, most occurring in this decade. The strategic challenge is that the pattern of technical need has persisted for a long period of time. For more than a decade, anticipation of market deregulation has been a major disincentive to new investment in system capacity. It has also inspired reduced maintenance of existing assets. A massive infusion of better technology is emerging as the final option for continuing reliable electrical service. If investment in better technology is not made in a timely manner, then North America should plan its adjustment to a very different level of electrical service. It is apparent that technical operations staff among the utilities can be very effective at marshaling their forces in the immediate aftermath of a system emergency, and that serious disturbances often lead to improved mechanisms for coordinated operation. It is not at all apparent that such efforts can be sustained through voluntary reliability organizations in which utility personnel external to those organizations do most of the technical work. The eastern interconnection shows several situations in which much of the technical support has migrated from the utilities to the Independent System Operator (ISO), and the ISO staffs, or shares staff with, the regional reliability council. This process may be a natural and very positive consequence of utility restructuring. If so, the process should be expedited in regions where it is less advanced.

  4. Reliability of Cobas Amplicor PCR test in detection of Mycobacterium tuberculosis in respiratory and nonrespiratory specimens

    Lepšanović Zorica


    Background/Aim. Traditional methods for the detection of mycobacteria, such as microscopic examination for the presence of acid-fast bacilli and isolation of the organism by culture, either have low sensitivity and/or specificity or take weeks before a definite result is available. Molecular methods, especially those based on nucleic acid amplification, are rapid diagnostic methods that combine high sensitivity and high specificity. The aim of this study was to determine the usefulness of the Cobas Amplicor Mycobacterium tuberculosis polymerase chain reaction (CA-PCR) assay in detecting the cause of tuberculosis in respiratory and nonrespiratory specimens, compared with culture. Methods. Specimens were decontaminated by the N-acetyl-L-cysteine-NaOH method. A 500 μL aliquot of the processed specimen was used for inoculation of Löwenstein-Jensen (L-J) slants, a drop for acid-fast staining, and 100 μL for PCR. The Cobas Amplicor PCR was performed according to the manufacturer's instructions. Results. A total of 110 respiratory and 355 nonrespiratory specimens were investigated. After resolving discrepancies by reviewing medical history, the overall sensitivity, specificity, and positive and negative predictive values for the CA-PCR assay compared with culture were 83%, 100%, 100%, and 96.8%, respectively, for respiratory specimens. In comparison, they were 50%, 99.7%, 87.5%, and 98%, respectively, for nonrespiratory specimens. The inhibition rate was 2.8% for respiratory and 7.6% for nonrespiratory specimens. Conclusion. CA-PCR is a reliable assay that enables specialists to start treatment promptly on a positive test result. The lower specificity in the group of nonrespiratory specimens is a consequence of the extremely small number of mycobacteria in some of them.

  5. OCT4 and SOX2 are reliable markers in detecting stem cells in odontogenic lesions

    Abhishek Banerjee


    Context (Background): Stem cells are a unique subpopulation of cells in the human body with the capacity to initiate differentiation into various cell lines. Tumor stem cells (TSCs) are a unique subpopulation of cells that possess the ability to initiate a neoplasm and sustain self-renewal. Epithelial stem cell (ESC) markers such as octamer-binding transcription factor 4 (OCT4) and sex-determining region Y (SRY)-box 2 (SOX2) are capable of identifying these stem cells, which are expressed during the early stages of tooth development. Aims: To detect the expression of the stem cell markers OCT4 and SOX2 in normal odontogenic tissues and in odontogenic cysts and tumors. Materials and Methods: Paraffin sections of follicular tissue, radicular cyst, dentigerous cyst, odontogenic keratocyst, ameloblastoma, adenomatoid odontogenic tumor, and ameloblastic carcinoma were obtained from the archives. The sections were subjected to immunohistochemical assay using mouse monoclonal antibodies to OCT4 and SOX2. Statistical Analysis: The results were evaluated by descriptive analysis. Results: The results show the presence of stem cells in the normal and lesional tissues with these stem-cell-identifying markers. SOX2 was found to be more consistent and reliable in the detection of stem cells. Conclusion: Stem cell expression is maintained in the tumor transformation of tissue, probably suggesting that there is no phenotypic change of stem cells in the progression from the normal embryonic state to the tumor component. The quantification and localization reveal interesting trends that indicate the probable role of these cells in the pathogenesis of the lesions.

  6. The Threat of Uncertainty: Why Using Traditional Approaches for Evaluating Spacecraft Reliability are Insufficient for Future Human Mars Missions

    Stromgren, Chel; Goodliff, Kandyce; Cirillo, William; Owens, Andrew


    Through the Evolvable Mars Campaign (EMC) study, the National Aeronautics and Space Administration (NASA) continues to evaluate potential approaches for sending humans beyond low Earth orbit (LEO). A key aspect of these missions is the strategy employed to maintain and repair the spacecraft systems, ensuring that they continue to function and support the crew. Long-duration missions beyond LEO present unique and severe maintainability challenges due to a variety of factors, including: limited to no opportunities for resupply, the distance from Earth, mass and volume constraints of the spacecraft, the high sensitivity of transportation element designs to variation in mass, the lack of abort opportunities to Earth, limited hardware heritage information, and the operation of human-rated systems in a radiation environment with little to no experience. The current approach to maintainability, as implemented on the ISS, which includes a large number of spares pre-positioned on the ISS, a larger supply sitting on Earth waiting to be flown to the ISS, and on-demand delivery of logistics from Earth, is not feasible for future deep space human missions. For missions beyond LEO, significant modifications to the maintainability approach will be required. Through the EMC evaluations, several key findings related to the reliability and safety of the Mars spacecraft have been made. The nature of random and induced failures presents significant issues for deep space missions. Because spare parts cannot be flown as needed for Mars missions, all required spares must be flown with the mission or pre-positioned. These spares must cover all anticipated failure modes and provide a level of overall reliability and safety that is satisfactory for human missions. This will require a large amount of mass and volume to be dedicated to the storage and transport of spares for the mission. Further, there is, and will continue to be, a significant amount of uncertainty regarding failure rates for spacecraft

  7. Reliable detection mechanism based on ring structure

    李俊锋; 杨英杰; 张国强


    To solve the single-point-of-failure problem of the detection point in hierarchical failure detection models, and to supply the detection system with a reliable detection point composed of multiple nodes, a reliable detection mechanism based on a ring structure (BR_RD) is presented. A ring detection algorithm quickly discovers suspect failure nodes, a random semi-confirmation algorithm then locates the failed nodes within the ring, and an election algorithm preserves the integrity of the ring structure by electing new reliable nodes to replace the failed ones. The validity and reliability of the mechanism are verified by failure-injection experiments.
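
    As a rough illustration of the detect/confirm/elect flow described above, here is a minimal sketch; the class, method names, and probe logic are hypothetical, since the record does not reproduce the algorithms:

        import random

        class RingMonitor:
            """Toy ring-based failure detector (hypothetical API)."""
            def __init__(self, nodes, confirmers=2):
                self.nodes = list(nodes)           # node ids ordered around the ring
                self.alive = {n: True for n in nodes}
                self.confirmers = confirmers

            def heartbeat(self, node):
                # Stand-in for a real liveness probe; here we read simulated state.
                return self.alive[node]

            def suspects(self):
                # Each node probes its successor around the ring ("ring detection").
                out = []
                for i, n in enumerate(self.nodes):
                    succ = self.nodes[(i + 1) % len(self.nodes)]
                    if not self.heartbeat(succ):
                        out.append(succ)
                return out

            def confirm(self, suspect):
                # Randomly chosen peers re-probe the suspect ("random semi-confirmation").
                peers = random.sample([n for n in self.nodes if n != suspect],
                                      min(self.confirmers, len(self.nodes) - 1))
                return all(not self.heartbeat(suspect) for _ in peers)

            def elect_replacement(self, failed, spares):
                # Simple election: the lowest-id spare replaces the failed node,
                # preserving ring integrity.
                new = min(spares)
                self.nodes[self.nodes.index(failed)] = new
                self.alive[new] = True
                del self.alive[failed]
                return new

        ring = RingMonitor(["n1", "n2", "n3", "n4"])
        ring.alive["n3"] = False                   # inject a failure
        for s in ring.suspects():
            if ring.confirm(s):
                print("replacing", s, "with", ring.elect_replacement(s, spares={"n5"}))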

  8. Reliable Detection and Smart Deletion of Malassez Counting Chamber Grid in Microscopic White Light Images for Microbiological Applications.

    Denimal, Emmanuel; Marin, Ambroise; Guyot, Stéphane; Journaux, Ludovic; Molin, Paul


    In biology, hemocytometers such as Malassez slides are widely used and are effective tools for counting cells manually. In a previous work, a robust algorithm was developed for grid extraction in Malassez slide images. This algorithm was evaluated on a set of 135 images and grids were accurately detected in most cases, but there remained failures for the most difficult images. In this work, we present an optimization of this algorithm that allows for 100% grid detection and a 25% improvement in grid positioning accuracy. These improvements make the algorithm fully reliable for grid detection. This optimization also allows complete erasing of the grid without altering the cells, which eases their segmentation.

  9. How often should we monitor for reliable detection of atrial fibrillation recurrence? Efficiency considerations and implications for study design.

    Efstratios I Charitos

    OBJECTIVE: Although atrial fibrillation (AF) recurrence is unpredictable in terms of onset and duration, current intermittent rhythm monitoring (IRM) diagnostic modalities are short-term and discontinuous. The aim of the present study was to investigate the IRM frequency required to reliably detect recurrence across various AF recurrence patterns. METHODS: The rhythm histories of 647 patients (mean AF burden: 12 ± 22% of monitored time; 687 patient-years) with implantable continuous monitoring devices were reconstructed and analyzed. Using computationally intensive simulation, we evaluated the IRM frequency required to reliably detect AF recurrence of various AF phenotypes using IRMs of various durations. RESULTS: The IRM frequency required for reliable AF detection depends on the amount and temporal aggregation of the AF recurrence (p<0.0001). Detection of AF recurrence with >95% sensitivity required higher IRM frequencies (>12 24-hour, >6 7-day, >4 14-day, or >3 30-day IRMs per year; p<0.0001) than currently recommended. Lower IRM frequencies will under-detect AF recurrence and introduce significant bias into the evaluation of therapeutic interventions. More frequent but shorter IRMs (24-hour) are significantly more time-effective (sensitivity per monitored time) than fewer IRMs of longer duration (p<0.0001). CONCLUSIONS: Reliable AF recurrence detection requires higher IRM frequencies than currently recommended. Current IRM frequency recommendations will fail to diagnose a significant proportion of patients. Shorter-duration but more frequent IRM strategies are significantly more efficient than longer IRM durations. CLINICAL TRIAL REGISTRATION: Unique identifier: NCT00806689.
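
    The record does not reproduce the simulation itself; the toy Monte Carlo below (all parameters hypothetical) illustrates the kind of experiment described: draw intermittent monitoring windows from a reconstructed rhythm history and score whether any window overlaps a recurrence.

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_history(days=365, burden=0.12, episodes=20):
            """Toy AF history: 'episodes' random runs of AF days giving
            roughly the target burden."""
            history = np.zeros(days, bool)
            run = max(1, int(days * burden / episodes))
            for s in rng.integers(0, days - run, episodes):
                history[s:s + run] = True
            return history

        def detected(history, n_irm, dur):
            """True if any of n_irm monitoring windows of 'dur' days overlaps AF."""
            starts = rng.integers(0, len(history) - dur, n_irm)
            return any(history[s:s + dur].any() for s in starts)

        # Sensitivity of a strategy = fraction of simulated patients in whom
        # recurrence is picked up by the intermittent monitoring.
        histories = [simulate_history() for _ in range(2000)]
        for n_irm, dur in [(12, 1), (6, 7), (4, 14), (3, 30)]:
            sens = np.mean([detected(h, n_irm, dur) for h in histories if h.any()])
            print(f"{n_irm} x {dur}-day IRM/yr: sensitivity ~ {sens:.2f}")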

  10. LAMP using a disposable pocket warmer for anthrax detection, a highly mobile and reliable method for anti-bioterrorism.

    Hatano, Ben; Maki, Takayuki; Obara, Takeyuki; Fukumoto, Hitomi; Hagisawa, Kohsuke; Matsushita, Yoshitaro; Okutani, Akiko; Bazartseren, Boldbaastar; Inoue, Satoshi; Sata, Tetsutaro; Katano, Harutaka


    A quick, reliable detection system is necessary to deal with bioterrorism. Loop-mediated isothermal amplification (LAMP) is a DNA amplification method that can amplify specific DNA fragments in isothermal conditions. We developed a new highly mobile and practical LAMP anthrax detection system that uses a disposable pocket warmer without the need for electricity (pocket-warmer LAMP). In our tests, the detection limit of the pocket-warmer LAMP was 1,000 copies of Bacillus anthracis pag and capB gene fragments per tube. The pocket-warmer LAMP also detected B. anthracis genes from DNA extracted from 0.1 volume of a B. anthracis colony. The lower detection limit of the pocket-warmer LAMP was not significantly different from that of a conventional LAMP using a heat block, and was not changed under cold (4 degrees C) or warm (37 degrees C) conditions in a Styrofoam box. The pocket-warmer LAMP could be useful against bioterrorism, and as a sensitive, reliable detection tool in areas with undependable electricity infrastructures.

  11. Reliability of the grip strength coefficient of variation for detecting sincerity in normal and blocked median nerve in healthy adults.

    Wachter, N J; Mentzel, M; Hütz, R; Gülke, J


    In the assessment of hand and upper limb function, detecting sincerity of effort (SOE) for grip strength is of major importance in identifying feigned loss of strength. Measuring maximal grip strength with a dynamometer is very common, often combined with calculating the coefficient of variation (CV), a measure of the variation over three grip strength trials. Little data is available about the relevance of these measurements in patients with median nerve impairment, due to the heterogeneity of patient groups. This study examined the reliability of grip strength tests, as well as of the CV, in detecting SOE in healthy subjects. The power distribution of the individual fingers and the thenar was taken into account. To assess reliability, the measurements were also performed in subjects with a median nerve block to simulate a nerve injury. The ability of 21 healthy volunteers to exert maximal grip force, and to deliberately exert half-maximal force to simulate reduced SOE in a power grip, was examined using the Jamar(®) dynamometer. Each subject was tested both with and without a median nerve block. The force at the fingertips of digits 2-5 and at the thenar eminence was measured with a sensor glove with integrated pressure receptors. For each measurement, three trials were recorded, and the mean and CV were calculated. When exerting submaximal force, the subjects reached 50-62% of maximal force, regardless of the median nerve block. The sensor glove revealed a significant reduction of force when exerting submaximal force (at one or more sensors), both with (P<0.032) and without a median nerve block (P<0.017). An increase in CV at submaximal force was found, although it was not significant. SOE can be detected with the CV at the little finger using a 10% cut-off (sensitivity 0.84 and 0.92 without and with median nerve block, respectively). These findings suggest low reliability of the power grip measurement with the Jamar(®) dynamometer, as
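
    For reference, the CV used for such sincerity cut-offs is simply the trial-to-trial standard deviation expressed relative to the mean; a minimal sketch with hypothetical readings:

        def coefficient_of_variation(trials):
            """CV of repeated grip-strength trials, in percent (population SD)."""
            mean = sum(trials) / len(trials)
            var = sum((t - mean) ** 2 for t in trials) / len(trials)
            return 100.0 * var ** 0.5 / mean

        # Hypothetical three-trial readings (kg) and a 10% sincerity cut-off:
        maximal = [42.0, 41.5, 43.1]
        feigned = [25.0, 31.0, 20.5]
        for label, trials in [("maximal", maximal), ("feigned", feigned)]:
            cv = coefficient_of_variation(trials)
            print(label, "CV = %.1f%% ->" % cv, "suspect" if cv > 10 else "consistent")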

  12. Futurism.

    Foy, Jane Loring

    The objectives of this research report are to gain insight into the main problems of the future and to ascertain the attitudes that the general population has toward the treatment of these problems. In the first section of this report the future is explored socially, psychologically, and environmentally. The second section describes the techniques…

  13. Using Linkage Analysis to Detect Gene-Gene Interactions. 2. Improved Reliability and Extension to More-Complex Models.

    Susan E Hodge

    Detecting gene-gene interaction in complex diseases has become an important priority for common disease genetics, but most current approaches to detecting interaction start with disease-marker associations. These approaches are based on population allele frequency correlations, not genetic inheritance, and therefore cannot exploit the rich information about inheritance contained within families. They are also hampered by issues of rigorous phenotype definition, multiple-test correction, and allelic and locus heterogeneity. We recently developed, tested, and published a powerful gene-gene interaction detection strategy based on conditioning family data on a known disease-causing allele or a disease-associated marker allele. We successfully applied the method to disease data and used computer simulation to exhaustively test the method for some epistatic models. We knew that the statistic we developed to indicate interaction was less reliable when applied to more-complex interaction models. Here, we improve the statistic and expand the testing procedure. We computer-simulated multipoint linkage data for a disease caused by two interacting loci. We examined epistatic as well as additive models and compared them with heterogeneity models. In all our models, the at-risk genotypes are "major" in the sense that, among affected individuals, a substantial proportion has a disease-related genotype. One of the loci (A) has a known disease-related allele (as would have been determined from a previous analysis). We removed (pruned) family members who did not carry this allele; the resultant dataset is referred to as "stratified." This elimination step has the effect of raising the "penetrance" and detectability at the second locus (B). We used the lod scores for the stratified and unstratified data sets to calculate a statistic that either indicated the presence of interaction or indicated that no interaction was detectable. We show that the new method is robust

  14. The future role of genetic screening to detect newborns at risk of childhood-onset hearing loss


    Objective: To explore the future potential of genetic screening to detect newborns at risk of childhood-onset hearing loss. Design: An expert-led discussion of current and future developments in genetic technology and in the knowledge base of genetic hearing loss, to determine the viability of genetic screening and the implications for screening policy. Results and Discussion: Despite increasing pressure to adopt genetic technologies, a major barrier to genetic screening for hearing loss is the uncertain clinical significance of the identified mutations and their interactions. Only when a reliable estimate of the future risk of hearing loss can be made at a reasonable cost will genetic screening become viable. Given the speed of technological advancement, this may be within the next 10 years. Decision-makers should start to consider how genetic screening could augment current screening programmes, as well as the associated data processing and storage requirements. Conclusion: In the interim, we suggest that decision-makers consider the benefits of (1) genetically testing all newborns and children with hearing loss, to determine aetiology and to increase knowledge of the genetic causes of hearing loss, and (2) screening pregnant women for the m.1555A>G mutation to reduce the risk of aminoglycoside-antibiotic-associated hearing loss. PMID:23131088

  15. Capillary electrophoresis with laser-induced fluorescence detection for fast and reliable apolipoprotein E genotyping

    Somsen, GW; Welten, HTME; Mulder, FP; Swart, CW; Kema, IP; de Jong, GJ


    The use of capillary electrophoresis (CE) with laser-induced fluorescence (LIF) detection for the rapid determination of apolipoprotein E (apoE) genotypes was studied. High resolution and sensitive detection of the concerned DNA restriction fragments was achieved using CE buffers with hydroxypropylm

  16. Sentinel node detection after preoperative short-course radiotherapy in rectal carcinoma is not reliable

    Braat, AE; Moll, FCP; de Vries, JE; Wiggers, T


    Background: Sentinel node (SN) detection may be used in patients with colonic carcinoma. However, its use in patients with rectal carcinoma may be unreliable. To address this, SN detection was evaluated in patients with rectal carcinoma after short-course preoperative radiotherapy. Methods: Patent Bl

  17. Experimental Research of Reliability of Plant Stress State Detection by Laser-Induced Fluorescence Method

    Yury Fedotov


    Experimental laboratory investigations of the laser-induced fluorescence spectra of watercress and lawn grass were conducted. The fluorescence spectra were excited by a YAG:Nd laser emitting at 532 nm. It was established that the influence of stress caused by mechanical damage, overwatering, and soil pollution is manifested in changes of the spectral shapes. The mean values and confidence intervals for the ratio of the two fluorescence maxima near 685 and 740 nm were estimated. The results indicate that this fluorescence ratio can be considered a reliable characteristic of the plant stress state.
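
    A minimal sketch of the ratio indicator described above (hypothetical spectra and scan values; the normal-approximation interval is our choice, not necessarily the authors'):

        import numpy as np

        def fluorescence_ratio(spectrum, wavelengths):
            """Ratio of the two chlorophyll fluorescence maxima near 685 and 740 nm."""
            w = np.asarray(wavelengths)
            f685 = spectrum[np.abs(w - 685).argmin()]
            f740 = spectrum[np.abs(w - 740).argmin()]
            return f685 / f740

        # Synthetic two-peak spectrum, just to exercise the lookup:
        wl = np.linspace(650, 800, 301)
        spec = np.exp(-((wl - 685) / 10) ** 2) + 0.8 * np.exp(-((wl - 740) / 15) ** 2)
        print("single-scan ratio:", round(fluorescence_ratio(spec, wl), 2))

        # Mean and a normal-approximation 95% confidence interval over repeated scans:
        ratios = np.array([1.21, 1.35, 1.18, 1.29, 1.29])   # hypothetical values
        m, s = ratios.mean(), ratios.std(ddof=1)
        half = 1.96 * s / np.sqrt(len(ratios))
        print(f"F685/F740 = {m:.2f} +/- {half:.2f}")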

  18. [Autism Spectrum Disorder in DSM-5 - concept, validity, and reliability, impact on clinical care and future research].

    Freitag, Christine M


    Autism Spectrum Disorder (ASD) in DSM-5 comprises the former DSM-IV-TR diagnoses of Autistic Disorder, Asperger's Disorder and PDD-nos. The criteria for ASD in DSM-5 were considerably revised from those of ICD-10 and DSM-IV-TR. The present article compares the diagnostic criteria, presents studies on the validity and reliability of ASD, and discusses open questions. It ends with a clinical and research perspective.

  19. Autism detection in early childhood (ADEC): reliability and validity data for a Level 2 screening tool for autistic disorder.

    Nah, Yong-Hwee; Young, Robyn L; Brewer, Neil; Berlingeri, Genna


    The Autism Detection in Early Childhood (ADEC; Young, 2007) was developed as a Level 2 clinician-administered autistic disorder (AD) screening tool that was time-efficient, suitable for children under 3 years, easy to administer, and suitable for persons with minimal training and experience with AD. A best estimate clinical Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; DSM-IV-TR; American Psychiatric Association, 2000) diagnosis of AD was made for 70 children using all available information and assessment results, except for the ADEC data. A screening study compared these children on the ADEC with 57 children with other developmental disorders and 64 typically developing children. Results indicated high internal consistency (α = .91). Interrater reliability and test-retest reliability of the ADEC were also adequate. ADEC scores reliably discriminated different diagnostic groups after controlling for nonverbal IQ and Vineland Adaptive Behavior Composite scores. Construct validity (using exploratory factor analysis) and concurrent validity using performance on the Autism Diagnostic Observation Schedule (Lord et al., 2000), the Autism Diagnostic Interview-Revised (Le Couteur, Lord, & Rutter, 2003), and DSM-IV-TR criteria were also demonstrated. Signal detection analysis identified the optimal ADEC cutoff score, with the ADEC identifying all children who had an AD (N = 70, sensitivity = 1.0) but overincluding children with other disabilities (N = 13, specificity ranging from .74 to .90). Together, the reliability and validity data indicate that the ADEC has potential to be established as a suitable and efficient screening tool for infants with AD.

  20. Reliability of ultrasonography in detecting shoulder disease in patients with rheumatoid arthritis.

    Bruyn, G A W


    To assess the intra- and interobserver reproducibility of musculoskeletal ultrasonography (US) among rheumatologists in detecting destructive and inflammatory shoulder abnormalities in patients with rheumatoid arthritis (RA) and to determine the overall agreement between US and MRI.

  1. Echolocation detections and digital video surveys provide reliable estimates of the relative density of harbour porpoises

    Williamson, Laura D; Brookes, Kate L; Scott, Beth E; Graham, Isla M; Bradbury, Gareth; Hammond, Philip S; Thompson, Paul M; McPherson, Jana


    ...‐based visual surveys. Surveys of cetaceans using acoustic loggers or digital cameras provide alternative methods to estimate relative density that have the potential to reduce cost and provide a verifiable record of all detections...

  2. Reliability of ultrasonography in detecting shoulder disease in patients with rheumatoid arthritis

    Bruyn, G. A. W.; Naredo, E.; Moeller, I.; Moragues, C.; Garrido, J.; de Bock, G. H.; d'Agostino, M-A; Filippucci, E.; Iagnocco, A.; Backhaus, M.; Swen, W. A. A.; Balint, P.; Pineda, C.; Milutinovic, S.; Kane, D.; Kaeley, G.; Narvaez, F. J.; Wakefield, R. J.; Narvaez, J. A.; de Augustin, J.; Schmidt, W. A.; Moller, I.; Swen, N.; de Agustin, J.

    Objective: To assess the intra- and interobserver reproducibility of musculoskeletal ultrasonography (US) among rheumatologists in detecting destructive and inflammatory shoulder abnormalities in patients with rheumatoid arthritis (RA) and to determine the overall agreement between US and MRI.

  3. Implanted cardiac devices are reliably detected by commercially available metal detectors

    Holm, Katja Fiedler; Hjortshøj, Søren; Pehrson, Steen;


    Explosions of Cardiovascular Implantable Electronic Devices (CIEDs) (pacemakers, defibrillators, and loop recorders) are a well-recognized problem during cremation, due to their lithium-iodine batteries. In addition, burial of the deceased with a CIED can present a potential risk for environmental contamination. Therefore, detection of CIEDs in the deceased would be of value. This study evaluated a commercially available metal detector for detecting CIEDs.

  4. Multiplex qPCR for reliable detection and differentiation of Burkholderia mallei and Burkholderia pseudomallei

    Janse Ingmar


    Background: Burkholderia mallei and B. pseudomallei are two closely related species of highly virulent bacteria that can be difficult to detect. Pathogenic Burkholderia are endemic in many regions worldwide, and cases of infection, sometimes brought by travelers from unsuspected regions, also occur elsewhere. Rapid, sensitive methods for the identification of B. mallei and B. pseudomallei are urgently needed in the interests of patient treatment and epidemiological surveillance. Methods: Signature sequences for sensitive, specific detection of pathogenic Burkholderia were identified from published genomes, and a qPCR assay was designed and validated. Results: A single-reaction quadruplex qPCR assay for the detection of pathogenic Burkholderia, which includes a marker for internal control of DNA extraction and amplification, was developed. The assay permits differentiation of B. mallei and B. pseudomallei strains, and probit analysis showed a very low detection limit. Use of a multicopy signature sequence permits detection of less than 1 genome equivalent per reaction. Conclusions: The new assay permits rapid detection of pathogenic Burkholderia and combines enhanced sensitivity, species differentiation, and the inclusion of an internal control for both DNA extraction and PCR amplification.

  5. Technical Note: The single particle soot photometer fails to reliably detect PALAS soot nanoparticles

    M. Gysel


    The single particle soot photometer (SP2) uses laser-induced incandescence (LII) for the measurement of atmospheric black carbon (BC) particles. The BC mass concentration is obtained by combining quantitative detection of BC mass in single particles with a counting efficiency of 100% above its lower detection limit. It is commonly accepted that a particle must contain at least several tenths of a femtogram of BC in order to be detected by the SP2.

    Here we show that most BC particles from a PALAS spark discharge soot generator remain undetected by the SP2, even if their BC mass, as independently determined with an aerosol particle mass analyser (APM), is clearly above the typical lower detection limit of the SP2. Comparison of counting efficiency and effective density data of PALAS soot with flame-generated soot (combustion aerosol standard burner, CAST), fullerene soot, and carbon black particles (Cabot Regal 400R) reveals that particle morphology can affect the SP2's lower detection limit. PALAS soot particles are fractal-like agglomerates of very small primary particles with a low fractal dimension, resulting in a very low effective density. Such loosely packed particles behave like "the sum of individual primary particles" in the SP2's laser. Accordingly, most PALAS soot particles remain undetected, as the SP2's laser intensity is insufficient to heat the primary particles to their vaporisation temperature because of their small size (Dpp ≈ 5–10 nm). Previous knowledge from pulsed laser-induced incandescence indicated that particle morphology might have an effect on the SP2's lower detection limit; however, an increase of the lower detection limit by a factor of ∼5–10, as reported here for PALAS soot, was not expected.

    In conclusion, the SP2's lower detection limit at a certain laser power depends primarily on the total BC mass per particle for compact particles with sufficiently high effective

  6. Histological-cytological report correlation and reliability of the Papanicolaou test for the detection of malignant changes in the cervix

    Vitković L.


    The incidence rate of cervical cancer in Serbia is among the highest in Europe, at 23.8 per 100,000. The Papanicolaou test, colposcopy, and the pathohistology report are the basic methods of secondary prevention of cervical cancer. The aim of the study was to examine the correlation between histological and cytological findings and the reliability of the Papanicolaou test in the detection of cervical lesions. We analyzed cervical smears (Papanicolaou test) in 3868 women. Among them, 190 women had a suspect finding and therefore underwent cervical biopsy. We detected premalignant or malignant changes of the cervix in 77 women. LSIL was found in 43 (22.6%), HSIL in 25 (13.2%), and carcinoma planocellulare in 9 (4.7%) women. There is a statistically significant positive correlation (Spearman=0.829; p<0.001) between the histological and cytological findings of the respondents. The best estimates of the diagnostic performance of the Papanicolaou test in discriminating LSIL, HSIL, and carcinoma planocellulare from cervicitis are for the cytological finding of ASCH (PA IIIa) (Sp=90.6%, with Sn=100% for carcinoma planocellulare, Sn=96% for HSIL, and Sn=86% for LSIL). In discriminating HSIL from LSIL, the best discrimination is achieved by the finding of LSIL (PA IIIb) on the Papanicolaou test (Sn=72.0%, Sp=67.4%), and in discriminating carcinoma planocellulare from LSIL, the best discrimination is achieved by the finding of HSIL (PA IIIb/IV) on the Papanicolaou test (Sn=77.8%, Sp=97.7%). Based on our results, we can conclude that there is a positive histological-cytological correlation and that the Papanicolaou test is more reliable in detecting severe premalignant lesions. Cytological diagnoses of ASCH (PA IIIa) and LSIL (PA IIIb) can reliably indicate the presence of premalignant cervical lesions, and patients with these findings must be more closely monitored and treated.
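
    As a reminder of how the sensitivity/specificity and rank-correlation figures above are obtained, here is a small sketch with hypothetical counts and grades (not the study's data):

        from scipy.stats import spearmanr

        def sn_sp(tp, fn, tn, fp):
            """Sensitivity and specificity from a 2x2 diagnostic table."""
            return tp / (tp + fn), tn / (tn + fp)

        # Hypothetical ordinal grades (0 = benign ... 3 = carcinoma) per patient:
        cytology  = [0, 1, 1, 2, 2, 3, 0, 2, 1, 3]
        histology = [0, 1, 2, 2, 2, 3, 0, 1, 1, 3]
        rho, p = spearmanr(cytology, histology)

        # Hypothetical 2x2 counts for one cytology cut-off against histology:
        sn, sp = sn_sp(tp=18, fn=7, tn=58, fp=6)
        print(f"Spearman rho = {rho:.2f} (p = {p:.3f}); Sn = {sn:.1%}, Sp = {sp:.1%}")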

  7. A home-brew real-time PCR assay for reliable detection and quantification of mature miR-122.

    Naderi, Mahmood; Abdul Tehrani, Hossein; Soleimani, Masoud; Shabani, Iman; Hashemi, Seyed Mahmoud


    miR-122 is a liver-specific miRNA that shows significant gene expression alterations in response to specific pathophysiological circumstances of the liver, such as drug-induced liver injury, hepatocellular carcinoma, and hepatitis B and C virus infections. Therefore, accurate and precise quantification of miR-122 is very important for clinical diagnostics. Given the lack of in vitro diagnostic assays for miR-122 detection and quantification, the existence of an open-source assay could provide external evaluation by other researchers and the chance of promoting the assay when required. The aim of this study was to develop a Taqman real-time polymerase chain reaction assay capable of robust and reliable quantification of miR-122 in different sample types. We used stem-loop methodology to design a specific Taqman real-time polymerase chain reaction assay for miR-122. This technique enabled us to reliably and reproducibly quantify short-length oligonucleotides such as miR-122. The specificity, sensitivity, interassay and intra-assay variability, and the dynamic range of the assay were experimentally determined by their respective methodologies. The assay had a linear dynamic range of 3E to 4.8E miR-122 copies/reaction, and the limit of detection was determined to be between 960 and 192 copies/reaction with a 95% confidence interval. The assay gave a low coefficient of variation for the Ct values; given that miR-122 is present at >50,000 copies per hepatocyte, this assay is able to satisfy the need for reliable detection and quantification of this miRNA. Therefore, this study can be considered a starting point for standardizing miR-122 quantification.

  8. Reliable fault detection and diagnosis of photovoltaic systems based on statistical monitoring approaches

    Harrou, Fouzi


    This study reports the development of an innovative fault detection and diagnosis scheme to monitor the direct current (DC) side of photovoltaic (PV) systems. Towards this end, we propose a statistical approach that exploits the advantages of the one-diode model and those of the univariate and multivariate exponentially weighted moving average (EWMA) charts to better detect faults. Specifically, we generate array residuals of current, voltage, and power using measured temperature and irradiance. These residuals capture the difference between the measurements and the maximum power point (MPP) predictions of current, voltage, and power from the one-diode model, and we use them as fault indicators. Then, we apply the multivariate EWMA (MEWMA) monitoring chart to the residuals to detect faults. However, a MEWMA scheme cannot identify the type of fault. Once a fault is detected on the MEWMA chart, univariate EWMA charts based on the current and voltage indicators are used to identify the type of fault (e.g., short-circuit, open-circuit, and shading faults). We applied this strategy to real data from the grid-connected PV system installed at the Renewable Energy Development Center, Algeria. Results show the capacity of the proposed strategy to monitor the DC side of PV systems and detect partial shading.
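
    As an illustration of the univariate chart referred to above, here is a minimal EWMA sketch; the smoothing constant, control-limit width, and training-window convention are standard textbook choices, not necessarily the authors':

        import numpy as np

        def ewma_chart(residuals, lam=0.2, L=3.0):
            """Univariate EWMA monitoring chart: returns the EWMA series and its
            control limits; points outside the limits flag a fault."""
            r = np.asarray(residuals, float)
            mu, sigma = r[:50].mean(), r[:50].std(ddof=1)   # fault-free training window
            z = np.empty_like(r)
            z[0] = mu
            for t in range(1, len(r)):
                z[t] = lam * r[t] + (1 - lam) * z[t - 1]
            # Steady-state control limits of the EWMA statistic
            half = L * sigma * np.sqrt(lam / (2 - lam))
            return z, mu - half, mu + half

        # Hypothetical current residuals: in-control noise, then a shading-like drop
        rng = np.random.default_rng(1)
        res = np.concatenate([rng.normal(0, 0.1, 100), rng.normal(-0.5, 0.1, 30)])
        z, lo, hi = ewma_chart(res)
        print("first alarm at sample", int(np.argmax((z < lo) | (z > hi))))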

  9. The future point-of-care detection of disease and its data capture and handling.

    Lopez-Barbosa, Natalia; Gamarra, Jorge D; Osma, Johann F


    Point-of-care detection is a widely studied area that attracts effort and interest from a large number of fields and companies. However, there is also increased interest from the general public in this type of device, which has driven enormous changes in the design and conception of these developments and the way data is handled. Therefore, future point-of-care detection has to include communication with front-end technology, such as smartphones and networks, automation of manufacture, and the incorporation of concepts like the Internet of Things (IoT) and cloud computing. Three key examples, based on different sensing technology, are analyzed in detail on the basis of these items to highlight a route for the future design and development of point-of-care detection devices and their data capture and handling.

  10. Fast and reliable obstacle detection and segmentation for cross-country navigation

    Talukder, A.; Manduchi, R.; Rankin, A.; Matthies, L.


    Obstacle detection is one of the main components of the control system of autonomous vehicles. In the case of indoor/urban navigation, obstacles are typically defined as surface points that are higher than the ground plane. This characterization, however, cannot be used in cross-country and unstructured environments, where the notion of ground plane is often not meaningful.

  11. A reliable cluster detection technique using photometric redshifts: introducing the 2TecX algorithm

    van Breukelen, Caroline


    We present a new cluster detection algorithm designed for finding high-redshift clusters using optical/infrared imaging data. The algorithm has two main characteristics. First, it utilises each galaxy's full redshift probability function, instead of an estimate of the photometric redshift based on the peak of the probability function and an associated Gaussian error. Second, it identifies cluster candidates through cross-checking the results of two substantially different selection techniques (the name 2TecX representing the cross-check of the two techniques). These are adaptations of the Voronoi Tesselations and Friends-Of-Friends methods. Monte-Carlo simulations of mock catalogues show that cross-checking the cluster candidates found by the two techniques significantly reduces the detection of spurious sources. Furthermore, we examine the selection effects and relative strengths and weaknesses of either method. The simulations also allow us to fine-tune the algorithm's parameters, and define completeness an...

  12. Autopiquer - a Robust and Reliable Peak Detection Algorithm for Mass Spectrometry

    Kilgour, David P. A.; Hughes, Sam; Kilgour, Samantha L.; Mackay, C. Logan; Palmblad, Magnus; Tran, Bao Quoc; Goo, Young Ah; Ernst, Robert K.; Clarke, David J.; Goodlett, David R.


    We present a simple algorithm for robust and unsupervised peak detection by determining a noise threshold in isotopically resolved mass spectrometry data. Solving this problem will greatly reduce the subjective and time-consuming manual picking of mass spectral peaks and so will prove beneficial in many research applications. The Autopiquer approach uses autocorrelation to test for the presence of (isotopic) structure in overlapping windows across the spectrum. Within each window, a noise threshold is optimized to remove the most unstructured data, whilst keeping as much of the (isotopic) structure as possible. This algorithm has been successfully demonstrated for both peak detection and spectral compression on data from many different classes of mass spectrometer and for different sample types, and this approach should also be extendible to other types of data that contain regularly spaced discrete peaks.
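
    The core idea, testing windows for autocorrelation at the expected peak spacing, is easy to sketch; the parameters below are hypothetical, and the published Autopiquer additionally optimizes a per-window noise threshold:

        import numpy as np

        def autocorrelation_score(window, lag):
            """Normalized autocorrelation at a given lag; values near 1 indicate
            regularly spaced (e.g., isotopic) structure in the window."""
            w = window - window.mean()
            denom = (w * w).sum()
            return (w[:-lag] * w[lag:]).sum() / denom if denom else 0.0

        def structured_windows(spectrum, width=200, lag=20, threshold=0.3):
            """Slide overlapping windows across the spectrum and keep those whose
            autocorrelation at the expected peak spacing exceeds the threshold."""
            keep = []
            for start in range(0, len(spectrum) - width, width // 2):
                win = spectrum[start:start + width]
                if autocorrelation_score(win, lag) > threshold:
                    keep.append((start, start + width))
            return keep

        # Hypothetical spectrum: noise plus a comb of peaks every 20 samples
        rng = np.random.default_rng(2)
        spec = rng.normal(0, 1, 4000)
        spec[1000:2000:20] += 25.0
        print(structured_windows(spec))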

  13. Linear SVM-Based Android Malware Detection for Reliable IoT Services

    Hyo-Sik Ham


    Currently, many Internet of Things (IoT) services are monitored and controlled through smartphone applications. By combining IoT with smartphones, many convenient IoT services have been provided to users. However, there are adverse underlying effects in such services, including invasion of privacy and information leakage. In most cases, mobile devices have become cluttered with important personal user information as various services and contents are provided through them. Accordingly, attackers are expanding the scope of their attacks beyond the existing PC and Internet environment into mobile devices. In this paper, we apply a linear support vector machine (SVM) to detect Android malware and compare the malware detection performance of the SVM with that of other machine learning classifiers. Through experimental validation, we show that the SVM outperforms the other machine learning classifiers.
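
    For illustration, a linear SVM of the kind described can be trained and evaluated in a few lines; the feature matrix below is synthetic and purely hypothetical:

        import numpy as np
        from sklearn.svm import LinearSVC
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import classification_report

        # Hypothetical features: rows = apps, columns = monitored resource counters
        # (e.g., CPU, memory, network usage); labels 1 = malware, 0 = benign.
        rng = np.random.default_rng(3)
        X = np.vstack([rng.normal(0.0, 1.0, (200, 8)),    # benign apps
                       rng.normal(1.0, 1.0, (200, 8))])   # malware apps
        y = np.array([0] * 200 + [1] * 200)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        clf = LinearSVC(C=1.0).fit(X_tr, y_tr)
        print(classification_report(y_te, clf.predict(X_te)))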

  14. Reliable detection of human papillomavirus in recurrent laryngeal papillomatosis and associated carcinoma of archival tissue.

    Weiss, Daniel; Heinkele, Thomas; Rudack, Claudia


    Recurrent laryngeal papillomatosis (RLP) is, although benign, a challenging disease for both the patient and the treating physician. Maximum disease control with minimum intervention is considered the gold standard; however, patients have to undergo repeated surgical interventions. Human papillomavirus (HPV), mainly the so-called low-risk types, is thought to be responsible for the development of RLP. But there is still some controversy over the true prevalence of HPV and the virus-specific molecular diagnostic of choice. Therefore, archival tissue samples from 44 patients with RLP at a laryngeal site, of whom eight developed laryngeal cancer, were screened for the presence of HPV through various molecular approaches. Results from these different methodologies were compared with each other and with patients' characteristics. The overall detection rates of HPV with the various methods used in this study were: HPV16 E6/E7 PCR: 0%; GP5+/6+ PCR: 4.5%; CDKN2A/p16 immunohistochemistry: 6.8%; in-situ hybridization for low- and high-risk HPV types: 52.3%; HPV6/11 L1 PCR: 72.7%; and HPV6/11 E6 PCR: 79.5%. Disease progression showed no apparent dependence on the detected HPV type or on clinical variables such as age at diagnosis, sex, or additional drug application (Cidofovir and Bevacizumab). In conclusion, the broad-spectrum PCRs alone or in combination with immunohistochemistry of CDKN2A/p16 and in-situ hybridization are unsuitable for HPV detection in RLP. Based on the findings presented in this study, the type-specific PCRs targeting the E6 open reading frame are clearly superior for the detection of HPV in this tumor entity.

  15. Validating a standardised test battery for synesthesia: Does the Synesthesia Battery reliably detect synesthesia?

    Carmichael, D. A.; Down, M.P.; Shillcock, R. C.; Eagleman, D.M.; Simner, J.


    Synesthesia is a neurological condition that gives rise to unusual secondary sensations (e.g., reading letters might trigger the experience of colour). Testing the consistency of these sensations over long time intervals is the behavioural gold standard assessment for detecting synesthesia (e.g., Simner, Mulvenna et al., 2006). In 2007 however, Eagleman and colleagues presented an online 'Synesthesia Battery' of tests aimed at identifying synesthesia by assessing consistency but within a sing...

  16. Is Side-Channel Analysis really reliable for detecting Hardware Trojans?

    Di Natale, Giorgio; Dupuis, Sophie; Rouzeyre, Bruno


    Hardware Trojans are malicious alterations to a circuit, inserted either during the design phase or during the fabrication process. Due to the diversity of Trojans, detecting and/or locating them is a challenging task. Numerous approaches have been proposed to address this problem, whether logic-testing-based or side-channel-analysis-based techniques. In this paper, we focus on side-channel analysis, and try to underline the fact that no published technique until now has...

  17. A novel method for rapid and reliable detection of complex vertebral malformation and bovine leukocyte adhesion deficiency in Holstein cattle

    Zhang Yi


    Background: Complex vertebral malformation (CVM) and bovine leukocyte adhesion deficiency (BLAD) are two autosomal recessive lethal genetic defects frequently occurring in Holstein cattle, identifiable by single nucleotide polymorphisms. The objective of this study was to develop a rapid and reliable genotyping assay to screen active Holstein sires and determine the carrier frequency of CVM and BLAD in the Chinese dairy cattle population. Results: We developed real-time PCR-based assays for the discrimination of wild-type and defective alleles, so that carriers can be detected. Only one step was required after DNA extraction from the sample, and time consumption was about 2 hours. A total of 587 Chinese Holstein bulls were assayed, and fifty-six CVM carriers and eight BLAD carriers were identified, corresponding to heterozygote carrier frequencies of 9.54% and 1.36%, respectively. The pedigree analysis showed that most of the carriers could be traced back to common ancestors, Osborndale Ivanhoe for BLAD and Pennstate Ivanhoe Star for CVM. Conclusions: These results demonstrate that real-time PCR is a simple, rapid, and reliable assay for BLAD and CVM defective allele detection. The high frequency of the CVM allele suggests that implementing a routine testing system is necessary to gradually eradicate the deleterious gene from the Chinese Holstein population.

  18. Reliable Grid Condition Detection and Control of Single-Phase Distributed Power Generation Systems

    Ciobotaru, Mihai

    The constant growth of Distributed Power Generation Systems (DPGS) presents an efficient and economic way of generating electricity closer to the load(s). The DPGS can contribute to an efficient and renewable electricity future by potentially: increasing the use of renewable sources of energy; improving the efficiency of the electricity system by reducing transmission and distribution losses; improving the security of the electricity supply through increased diversity of supply and reduced vulnerability to simultaneous system failures. However, the new trend of using DPGS comes also with a suite of new challenges. One of the challenges is the interaction between the DPGS and the utility grid. As a consequence, grid interconnection requirements applying to distributed generation are continuously updated in order to maintain the quality and the stability of the utility grid. The new upcoming...

  19. Can magnetic resonance imaging at 3.0-Tesla reliably detect patients with endometriosis? Initial results.

    Thomeer, Maarten G; Steensma, Anneke B; van Santbrink, Evert J; Willemssen, Francois E; Wielopolski, Piotr A; Hunink, Myriam G; Spronk, Sandra; Laven, Joop S; Krestin, Gabriel P


    The aim of this study was to determine whether an optimized 3.0-Tesla magnetic resonance imaging (MRI) protocol is sensitive and specific enough to detect patients with endometriosis. This was a prospective cohort study of consecutive patients. Forty consecutive patients with clinical suspicion of endometriosis underwent 3.0-Tesla MRI, including a T2-weighted high-resolution fast spin echo sequence (spatial resolution = 0.75 × 1.2 × 1.5 mm³) and a 3D T1-weighted high-resolution gradient echo sequence (spatial resolution = 0.75 × 1.2 × 2.0 mm³). Two radiologists reviewed the dataset with consensus reading. During laparoscopy, which was used as the reference standard, all lesions were characterized according to the revised criteria of the American Fertility Society. Patient-level and region-level sensitivities and specificities and lesion-level sensitivities were calculated. Patient-level sensitivity was 42% for stage I (5/12) and 100% for stages II, III, and IV (25/25). Patient-level specificity for all stages was 100% (3/3). The region-level sensitivity and specificity were 63% and 97%, respectively. The sensitivity per lesion was 61% (90% for deep lesions, 48% for superficial lesions, and 100% for endometriomata). The detection rate of obliteration of the cul-de-sac was 100% (10/10), with no false positive findings. The interreader agreement was substantial to perfect (kappa = 1 per patient, 0.65 per lesion, and 0.71 for obliteration of the cul-de-sac). An optimized 3.0-Tesla MRI protocol is accurate in detecting stage II to stage IV endometriosis.

  1. Plasma abnormal prothrombin (PIVKA-II): a new and reliable marker for the detection of hepatocellular carcinoma.

    Takikawa, Y; Suzuki, K; Yamazaki, K; Goto, T; Madarame, T; Miura, Y; Yoshida, T; Kashiwabara, T; Sato, S


    We evaluated the clinical usefulness of protein induced by vitamin K absence or antagonist-II (PIVKA-II) in detecting hepatocellular carcinoma (HCC), specifically in patients with liver cirrhosis, and the possible correlation between levels of PIVKA-II and pathological features of HCC. Plasma levels of PIVKA-II and alpha-fetoprotein (AFP) were measured in 628 patients with various diseases, including 253 with liver cirrhosis and 116 with HCC. PIVKA-II was detected (greater than or equal to 0.1 arbitrary unit/mL) in 54.3% of HCC cases, and the concentration showed a positive correlation with tumour size. As a screening test for the detection of HCC, PIVKA-II produced values comparable with those of AFP, with a sensitivity, specificity, and validity of 52.8%, 98.8%, and 51.6%, respectively. Sixteen of 45 patients (37%) with HCC who had low AFP levels (less than 100 ng/mL) were positive for PIVKA-II. No apparent relationship, however, could be found between the levels of PIVKA-II and the aetiology or pathological findings of HCC. These results suggest that PIVKA-II can be a reliable marker for detecting HCC in patients with liver cirrhosis.

  2. Simulations of GRB detections with the ECLAIRs telescope onboard the future SVOM mission

    Antier, S; Cordier, B; Gros, A; Götz, D; Lachaud, C


    The soft gamma-ray telescope ECLAIRs, with its Scientific Trigger Unit, is in charge of detecting Gamma-Ray Bursts (GRBs) on board the future SVOM satellite. Using the "scientific software model" (SSM), we study the efficiency of both implemented trigger algorithms: the Count-Rate Trigger for time-scales below 20 s and the Image Trigger for larger ones. The SSM provides a simulation of ECLAIRs with photon projection through the coded mask onto the detection plane. We developed an input GRB database for the SSM based on GRB light curves detected by the Fermi GBM instrument. We extrapolated the GRB spectra into the ECLAIRs band (4-120 keV) and projected them onto the detection plane, superimposed with cosmic extragalactic background (CXB) photons. Several simulations were performed by varying the GRB properties (fluxes and positions in the field of view). We present the first results of this study in this paper.
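
    As a rough illustration of what a count-rate trigger does, the sketch below flags sliding-window count excesses over the expected background; the bin size, window, and threshold are hypothetical, and the real ECLAIRs trigger scans multiple timescales and adds an imaging step:

        import numpy as np

        def count_rate_trigger(counts, window, bkg_rate, sigma=6.5):
            """Flag positions where counts summed over 'window' bins exceed the
            expected background by 'sigma' Gaussian-equivalent deviations."""
            summed = np.convolve(counts, np.ones(window), mode="valid")
            expected = bkg_rate * window
            snr = (summed - expected) / np.sqrt(expected)
            return np.flatnonzero(snr > sigma)

        # Hypothetical light curve: CXB-dominated Poisson background plus a burst
        rng = np.random.default_rng(4)
        lc = rng.poisson(100.0, 3000)            # e.g., 100 counts per time bin
        lc[1500:1520] += rng.poisson(40.0, 20)   # GRB-like excess
        print(count_rate_trigger(lc, window=8, bkg_rate=100.0)[:5])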

  3. Lipase-nanoporous gold biocomposite modified electrode for reliable detection of triglycerides.

    Wu, Chao; Liu, Xueying; Li, Yufei; Du, Xiaoyu; Wang, Xia; Xu, Ping


    For triglyceride biosensor design, protein immobilization is necessary to create the interface between the enzyme and the electrode. In this study, a glassy carbon electrode (GCE) was modified with a lipase-nanoporous gold (NPG) biocomposite (denoted as lipase/NPG/GCE). Due to its highly conductive, porous, and biocompatible three-dimensional structure, NPG is suitable for enzyme immobilization. In cyclic voltammetry experiments, the lipase/NPG/GCE bioelectrode displayed a surface-confined reaction in a phosphate buffer solution. Linear responses were obtained for tributyrin concentrations ranging from 50 to 250 mg dl(-1) and olive oil concentrations ranging from 10 to 200 mg dl(-1). The apparent Michaelis-Menten constant for tributyrin was 10.67 mg dl(-1), and the detection limit was 2.68 mg dl(-1). Further, the lipase/NPG/GCE bioelectrode had strong anti-interference ability against urea, glucose, cholesterol, and uric acid, as well as a long shelf-life. For the detection of triglycerides in human serum, the values given by the lipase/NPG/GCE bioelectrode were in good agreement with those of an automatic biochemical analyzer. These properties make the lipase/NPG/GCE bioelectrode an excellent choice for the construction of a triglyceride biosensor.

  4. Definition for Rheumatoid Arthritis Erosions Imaged with High Resolution Peripheral Quantitative Computed Tomography and Interreader Reliability for Detection and Measurement.

    Barnabe, Cheryl; Toepfer, Dominique; Marotte, Hubert; Hauge, Ellen-Margrethe; Scharmga, Andrea; Kocijan, Roland; Kraus, Sebastian; Boutroy, Stephanie; Schett, Georg; Keller, Kresten Krarup; de Jong, Joost; Stok, Kathryn S; Finzel, Stephanie


    High-resolution peripheral quantitative computed tomography (HR-pQCT) sensitively detects erosions in rheumatoid arthritis (RA); however, nonpathological cortical bone disruptions are potentially misclassified as erosive. Our objectives were to set and test a definition for pathologic cortical bone disruptions in RA and to standardize reference landmarks for measuring erosion size. HR-pQCT images of metacarpophalangeal joints of RA and control subjects were used in an iterative process to achieve consensus on the definition and reference landmarks. Independent readers (n = 11) applied the definition to score 58 joints and measure pathologic erosions in 2 perpendicular multiplanar reformations for their maximum width and depth. Interreader reliability for erosion detection and variability in measurements between readers [root mean square coefficient of variation (RMSCV), intraclass correlation (ICC)] were calculated. Pathologic erosions were defined as cortical breaks extending over a minimum of 2 consecutive slices in perpendicular planes, with underlying trabecular bone loss and a nonlinear shape. Interreader agreement for classifying pathologic erosions was 90.2%, whereas variability for width and depth erosion assessment was observed (RMSCV perpendicular width 12.3%, axial width 20.6%, perpendicular depth 24.0%, axial depth 22.2%; ICC perpendicular width 0.206, axial width 0.665, axial depth 0.871, perpendicular depth 0.783). Mean erosion width was 1.84 mm (range 0.16-8.90) and mean depth was 1.86 mm (range 0.30-8.00). We propose a new definition for erosions visualized with HR-pQCT imaging. Interreader reliability for erosion detection is good, but further refinement of selection of landmarks for erosion size measurement, or automated volumetric methods, will be pursued.
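
    For context, the root-mean-square coefficient of variation (RMSCV) quoted above is commonly computed from repeated reader measurements as the quadratic mean of the per-joint CVs; a minimal sketch under that assumption, with hypothetical numbers:

        import numpy as np

        def rmscv(measurements):
            """Root-mean-square CV across joints; 'measurements' has shape
            (n_joints, n_readers)."""
            m = np.asarray(measurements, float)
            cv = m.std(axis=1, ddof=1) / m.mean(axis=1)
            return 100.0 * np.sqrt(np.mean(cv ** 2))

        # Hypothetical erosion widths (mm) from three readers on four joints:
        widths = [[1.8, 2.0, 1.7], [0.9, 1.1, 1.0], [3.2, 2.8, 3.0], [1.5, 1.4, 1.7]]
        print("RMSCV = %.1f%%" % rmscv(widths))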

  5. A rapid and reliable determination of doxycycline hyclate by HPLC with UV detection in pharmaceutical samples



    An accurate, sensitive, and reproducible high performance liquid chromatographic (HPLC) method for the quantification of doxycycline hyclate in pharmaceutical samples has been developed and validated. The drug and the standard were eluted from a Lichrosorb RP-8 column (250 mm × 4.6 mm, 10 μm particle size) at 20 °C with a mobile phase consisting of methanol, acetonitrile, and a 0.010 M aqueous solution of oxalic acid (2:3:5, v/v/v). The flow rate was 1.25 ml min-1. A UV detector set at 350 nm was used to monitor the effluent. Each analysis required no longer than 4 min. The limits of detection and quantification were 1.15 and 3.84 μg ml-1, respectively. Recoveries for different concentrations ranged from 99.58 to 101.93%.

  6. Reliability of Using Retinal Vascular Fractal Dimension as a Biomarker in the Diabetic Retinopathy Detection

    Zhang, Jiong; Bekkers, Erik; Abbasi-Sureshjani, Samaneh


    The retinal fractal dimension (FD) is a measure of vasculature branching pattern complexity. FD has been considered a potential biomarker for the detection of several diseases such as diabetes and hypertension. However, conflicting findings have been reported in the literature regarding the association between this biomarker and disease. In this paper, we examine the stability of the FD measurement with respect to (1) different vessel annotations obtained from human observers, (2) automatic segmentation methods, (3) various regions of interest, (4) the accuracy of vessel segmentation methods, and (5) different imaging modalities. Our results demonstrate that the relative errors in the measurement of FD are significant and that FD varies considerably according to the image quality, modality, and the technique used for measuring it. Automated and semiautomated methods for the measurement of FD are not stable enough, which makes FD a deceptive biomarker in quantitative clinical applications.
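
    The record does not say how FD is computed; a common choice in the retinal literature is box counting on a binary vessel mask. A minimal sketch (synthetic mask; a square power-of-two image is assumed):

        import numpy as np

        def box_counting_fd(mask):
            """Box-counting fractal dimension of a binary mask (2^k x 2^k image):
            slope of log N(s) versus log(1/s) over box sizes s."""
            n = mask.shape[0]
            sizes, counts = [], []
            s = n
            while s > 2:
                s //= 2
                # Count boxes of side s containing at least one vessel pixel
                grid = mask.reshape(n // s, s, n // s, s).any(axis=(1, 3))
                sizes.append(s)
                counts.append(grid.sum())
            slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
            return slope

        # Hypothetical 256x256 mask: a rough diagonal "vessel"
        mask = np.zeros((256, 256), bool)
        idx = np.arange(256)
        mask[idx, np.clip(idx + (10 * np.sin(idx / 9)).astype(int), 0, 255)] = True
        print("FD ~ %.2f" % box_counting_fd(mask))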

  7. Ultrasensitive detection in optically dense physiological media: applications to fast reliable biological assays

    Matveeva, Evgenia G.; Gryczynski, Ignacy; Berndt, Klaus W.; Lakowicz, Joseph R.; Goldys, Ewa; Gryczynski, Zygmunt


    We present a novel approach for performing fluorescence immunoassays in serum and whole blood using fluorescently labeled anti-rabbit IgG. This approach, which is based on Surface Plasmon-Coupled Emission (SPCE), provides increased sensitivity and substantial background reduction due to the exclusive selection of the signal from the fluorophores located near a bio-affinity surface. The effective coupling range for SPCE is only a couple of hundred nanometers from the metallic surface. Excited fluorophores outside the coupling layer do not contribute to SPCE, and their free-space emission is not transmitted through the opaque metallic film into the glass substrate. An antigen (rabbit IgG) was adsorbed to a slide covered with a thin silver metal layer, and the SPCE signal from the fluorophore-labeled anti-rabbit antibody binding to the immobilized antigen was detected. The effect of the sample matrix (buffer, human serum, or human whole blood) on the end-point immunoassay SPCE signal is discussed. The kinetics of binding could be monitored directly in whole blood or serum. The results showed that human serum and human whole blood attenuate the SPCE end-point signal and the immunoassay kinetic signal only approximately 2- and 3-fold, respectively (compared to buffer), resulting in signals that are easily detectable even in whole blood. The high optical absorption of hemoglobin can be tolerated because only fluorophores within a couple of hundred nanometers of the metallic film contribute to SPCE. Both glass and plastic slides can be used for SPCE-based assays. We believe that SPCE has the potential of becoming a powerful approach for performing immunoassays based on surface-bound analytes or antibodies for many biomarkers directly in dense samples such as whole blood, without any need for washing steps.

  8. Can a future mission detect a habitable ecosystem on Europa, or Ganymede?

    Chela Flores, Julian


    orbital probes in the future exploration of Jupiter's System (Gowen et al., 2009). There are alternative views on the effect of space weather on the radiation-induced S-cycles produced on the surficial molecules; but S is common to both interpretations (Carlson et al., 1999; McCord et al., 1999). The largest known S-fractionations are due to microbial reduction, and not to thermochemical processes. Besides, sulphate abiotic reductions are generally not as large as the biogenic ones (Kiyosu and Krouse, 1990). From experience with a natural population, this type of biota is able to fractionate efficiently the S-isotopes up to delta 34S of -70 per mil (Wortmann et al., 2001). Dissimilatory sulphate reducers are ubiquitous on Earth, producing the largest fractionations in the sulphur stable isotopes. These microbes are widely distributed in terrestrial anoxic environments.Consequently, sulphate reducers are the most evident candidates for the microorganisms populating a habitable Europan ecosystem. Microbial fractionation of stable S-isotopes argue in favour of penetrators for surveying the surface of not only Europa, but also of Ganymede, where surficial sulphur has been detected (McCord et al., 1997). The Europa-Jupiter System Mission (EJSM) intends to explore in the 2020s both of these satellites (Grasset et al., 2009). According to our hypothesis we predict that penetrators (supplied with mass spectrometry) should yield different results for fractionated sulphur. The icy patches on Europa should give substantial depletions of delta 34S, while measurements on Ganymede should give significantly lower values for the depletion of delta 34S. (Since the largest of the Galilean satellites lacks an ocean-core interface, according to our hypothesis it would not support life.) These diverging results—a large minus delta 34S for the Europan sulphur patches, and a small minus delta 34S for the Ganymede surficial sulphur—would provide a clear test for the hypothesis that a
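
    For reference, the delta notation used in this record is the standard per-mil deviation of the sample's sulphur isotope ratio from the Vienna Canyon Diablo Troilite (VCDT) standard (a textbook definition, not specific to this abstract):

        \delta^{34}\mathrm{S} \;=\; \left(
        \frac{\left({}^{34}\mathrm{S}/{}^{32}\mathrm{S}\right)_{\mathrm{sample}}}
             {\left({}^{34}\mathrm{S}/{}^{32}\mathrm{S}\right)_{\mathrm{VCDT}}} - 1
        \right) \times 1000,

    so the biogenic fractionation of delta 34S = -70 per mil cited above corresponds to a sample isotope ratio 7% below the standard.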

  9. Do tests devised to detect recent HIV-1 infection provide reliable estimates of incidence in Africa?

    Sakarovitch, Charlotte; Rouet, Francois; Murphy, Gary; Minga, Albert K; Alioum, Ahmadou; Dabis, Francois; Costagliola, Dominique; Salamon, Roger; Parry, John V; Barin, Francis


    The objective of this study was to assess the performance of 4 biologic tests designed to detect recent HIV-1 infections (BED, Vironostika, Avidity, and IDE-V3) in estimating incidence in West Africa. These tests were assessed on a panel of 135 samples from 79 HIV-1-positive regular blood donors from Abidjan, Côte d'Ivoire, whose date of seroconversion was known (Agence Nationale de Recherches sur le SIDA et les Hépatites Virales 1220 cohort). The 135 samples included 26 from recently infected patients (<180 days) and 15 from patients with clinical AIDS. The performance of each assay in estimating HIV incidence was assessed through simulations. The modified commercial assays gave the best results for sensitivity (100% for both), and the IDE-V3 technique gave the best result for specificity (96.3%). In a context like Abidjan, with a 10% HIV-1 prevalence associated with a 1% annual incidence, the estimated test-specific annual incidence rates would be 1.2% (IDE-V3), 5.5% (Vironostika), 6.2% (BED), and 11.2% (Avidity). Most of the specimens falsely classified as incident cases were from patients infected for >180 days but <1 year. The authors conclude that none of the 4 methods could currently be used to estimate HIV-1 incidence routinely in Côte d'Ivoire, but that further adaptations might enhance their accuracy.
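
    As background to the simulation-based figures above, a minimal sketch of the cross-sectional estimator such recency assays feed into; it omits the false-recency correction used in practice, and all numbers are hypothetical:

        # Simplified cross-sectional incidence estimator for a recency assay.
        # I ~ R / (S * omega): R recent-classified positives, S susceptibles,
        # omega = mean window period in years. Numbers are hypothetical.

        def incidence_estimate(n_recent, n_negative, window_days):
            """Annual incidence from a recency assay (no false-recency adjustment)."""
            person_years_at_risk = n_negative * (window_days / 365.0)
            return n_recent / person_years_at_risk

        # Example: survey of 10,000 adults, 10% prevalence -> 9,000 HIV-negative;
        # 18 positives classified "recent" by an assay with a 180-day window.
        print(f"{100 * incidence_estimate(18, 9000, 180):.1f}% per year")  # -> 0.4%

    The study's point is visible in this structure: misclassifying long-infected patients as "recent" inflates n_recent and hence the incidence estimate.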

  10. Can the comet assay be used reliably to detect nanoparticle-induced genotoxicity?

    Karlsson, Hanna L; Di Bucchianico, Sebastiano; Collins, Andrew R; Dusinska, Maria


    The comet assay is a sensitive method to detect DNA strand breaks as well as oxidatively damaged DNA at the level of single cells. Today the assay is commonly used in nano-genotoxicology. In this review we critically discuss possible interactions between nanoparticles (NPs) and the comet assay. Concerns for such interactions have arisen from the occasional observation of NPs in the "comet head", which implies that NPs may be present while the assay is being performed. This could give rise to false positive or false negative results, depending on the type of comet assay endpoint and NP. For most NPs, an interaction that substantially impacts the comet assay results is unlikely. For photocatalytically active NPs such as TiO2, on the other hand, exposure to light containing UV can lead to increased DNA damage. Samples should therefore not be exposed to such light. By comparing studies in which both the comet assay and the micronucleus assay have been used, a good consistency between the assays was found in general (69%); consistency was even higher when excluding studies on TiO2 NPs (81%). The strong consistency between the comet and micronucleus assays for a range of different NPs, even though the two tests measure different endpoints, implies that both can be trusted in assessing the genotoxicity of NPs, and that both could be useful in a standard battery of test methods.

  11. Validating a standardised test battery for synesthesia: Does the Synesthesia Battery reliably detect synesthesia?

    Carmichael, D A; Down, M P; Shillcock, R C; Eagleman, D M; Simner, J


    Synesthesia is a neurological condition that gives rise to unusual secondary sensations (e.g., reading letters might trigger the experience of colour). Testing the consistency of these sensations over long time intervals is the behavioural gold standard assessment for detecting synesthesia (e.g., Simner, Mulvenna et al., 2006). In 2007, however, Eagleman and colleagues presented an online 'Synesthesia Battery' of tests aimed at identifying synesthesia by assessing consistency within a single test session. This battery has been widely used but had never previously been validated against conventional long-term retesting with a randomly recruited sample from the general population. We recruited 2847 participants to complete the Synesthesia Battery and found the prevalence of grapheme-colour synesthesia in the general population to be 1.2%. This prevalence was in line with previous estimates based on conventional long-term testing (e.g., Simner, Mulvenna et al., 2006). This reproduction of similar prevalence rates suggests that the Synesthesia Battery is indeed a valid methodology for assessing synesthesia.
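
    The battery's core computation is a colour-consistency score over repeated presentations of each grapheme. The sketch below is a plausible minimal form of such a score, not the battery's exact code; the 1.0 cutoff is the commonly cited classification threshold, and the colour picks are invented:

        # Grapheme-colour consistency score: each grapheme is shown three times,
        # the participant picks a colour, and summed pairwise RGB distances are
        # averaged over graphemes. Lower = more consistent = synesthete-like.
        from itertools import combinations
        import math

        def consistency_score(trials):
            """trials: {grapheme: [(r,g,b), (r,g,b), (r,g,b)]}, RGB scaled to 0-1."""
            per_grapheme = []
            for picks in trials.values():
                d = sum(math.dist(a, b) for a, b in combinations(picks, 2))
                per_grapheme.append(d)
            return sum(per_grapheme) / len(per_grapheme)

        picks = {"A": [(1.0, 0.1, 0.1), (0.9, 0.15, 0.1), (0.95, 0.1, 0.2)]}
        score = consistency_score(picks)
        print("synesthete-like" if score < 1.0 else "non-synesthete", round(score, 2))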

  12. Detecting inflammation in the unprepared pediatric colon - how reliable is magnetic resonance enterography?

    Barber, Joy L.; Watson, Tom A. [Great Ormond Street Hospital for Children NHS Foundation Trust, Department of Radiology, London (United Kingdom); Lozinsky, Adriana Chebar; Kiparissi, Fevronia; Shah, Neil [Great Ormond Street Hospital for Children NHS Foundation Trust, Department of Gastroenterology, London (United Kingdom)


    Pediatric inflammatory bowel disease frequently affects the colon. MR enterography is used to assess the small bowel but it also depicts the colon. To compare the accuracy of MR enterography and direct visualization at endoscopy in assessing the colon in pediatric inflammatory bowel disease, we included children with inflammatory bowel disease who had undergone both MR enterography and endoscopy, and we retrospectively assessed the imaging and endoscopic findings. We scored the colonic appearance at MR using a total colon score. We then compared scores for the whole colon and for its individual segments with endoscopy and histology. We included 15 children. An elevated MR colonic segmental score predicted the presence of active inflammation on biopsy with a specificity of 90% (95% confidence interval [CI] 79.5-96.2%) and a sensitivity of 60% (95% CI 40.6-77.3%); this compares reasonably with the predictive values for findings at colonoscopy: specificity 85% (95% CI 73.4-92.9%) and sensitivity 53.3% (95% CI 34.3-71.6%). Accuracy did not change significantly with increasing bowel distension. MR-derived scores had comparable accuracy to those derived during visualization at colonoscopy for detecting biopsy-proven inflammation in our patient group. MR enterography might prove useful in guiding biopsy or monitoring treatment response. Collapse of a colonic segment did not impair assessment of inflammation.

  13. Evaluation of loop-mediated isothermal amplification for the rapid, reliable, and robust detection of Salmonella in produce.

    Yang, Qianru; Wang, Fei; Jones, Kelly L; Meng, Jianghong; Prinyawiwatkul, Witoon; Ge, Beilei


    Rapid, reliable, and robust detection of Salmonella in produce remains a challenge. In this study, loop-mediated isothermal amplification (LAMP) was comprehensively evaluated against real-time quantitative PCR (qPCR) for detecting diverse Salmonella serovars in various produce items (cantaloupe, pepper, and several varieties of lettuce, sprouts, and tomato). To mimic real-world contamination events, produce samples were surface-inoculated with low concentrations (1.1-2.9 CFU/25 g) of individual Salmonella strains representing ten serovars and tested after aging at 4 °C for 48 h. Four DNA extraction methods were also compared using produce enrichment broths. False-positive or false-negative results were not observed among 178 strains (151 Salmonella and 27 non-Salmonella) used to evaluate assay specificity. The detection limits for LAMP were 1.8-4 CFU per reaction in pure culture and 10^4-10^6 CFU per 25 g (i.e., 10^2-10^4 CFU per g) in produce without enrichment, comparable to those obtained by qPCR. After 6-8 h of enrichment, both LAMP and qPCR consistently detected these low concentrations of Salmonella of diverse serovars in all produce items except sprouts. The PrepMan Ultra sample preparation reagent yielded the best results among the four DNA extraction methods. Upon further validation, LAMP may be a valuable tool for routine Salmonella testing in produce. The difficulty of detecting Salmonella in sprouts, whether using LAMP or qPCR, warrants further study.

  14. Simultaneous amplification of two bacterial genes: more reliable method of Helicobacter pylori detection in microbial rich dental plaque samples.

    Chaudhry, Saima; Idrees, Muhammad; Izhar, Mateen; Butt, Arshad Kamal; Khan, Ayyaz Ali


    Polymerase chain reaction (PCR) assay is considered superior to other methods for detection of Helicobacter pylori (H. pylori) in the oral cavity; however, it also has limitations when the sample under study is microbial-rich dental plaque. The type of gene targeted and the number of primers used for bacterial detection in dental plaque samples can have a significant effect on the results obtained, as there are a number of closely related bacterial species residing in the plaque biofilm. Also, due to the high recombination rate of H. pylori, some of the genes might be downregulated or absent. The present study was conducted to determine the frequency of H. pylori colonization of dental plaque by simultaneously amplifying two genes of the bacterium. One hundred dental plaque specimens were collected from dyspeptic patients before their upper gastrointestinal endoscopy, and the presence of H. pylori was determined through PCR assay using primers targeting two different genes of the bacterium. Eighty-nine of the 100 samples were included in the final analysis. With simultaneous amplification of two bacterial genes, 51.6% of the dental plaque samples were positive for H. pylori, while this prevalence increased to 73% when only one gene amplification was used for bacterial identification. Detection of H. pylori in dental plaque samples is more reliable when two genes of the bacterium are simultaneously amplified as compared to one gene amplification only.
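
    A rough probability sketch of why the two-gene criterion is more conservative, assuming (unrealistically) independent cross-reactions and a hypothetical per-gene false-positive rate:

        # If a single-gene PCR cross-reacts with non-target plaque flora with
        # probability p, calling a sample positive only when BOTH targets
        # amplify cuts the false-positive rate to roughly p**2 (independence
        # assumed; rates below are hypothetical for illustration).
        p_single = 0.20          # assumed per-gene false-positive probability
        p_both = p_single ** 2   # both genes must amplify
        print(f"one-gene assay : {p_single:.1%} false positives")
        print(f"two-gene assay : {p_both:.1%} false positives")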

  15. Spectro-Fluor™ Technology for Reliable Detection of Proteins and Biomarkers of Disease: A Pioneered Research Study

    Farid Menaa


    Quantitative and qualitative characterization of fluorinated molecules represents an important task. Fluorine-based medicinal chemistry is a fast-growing research area due to the positive impact of fluorine in drug discovery, and in clinical and molecular imaging (e.g., magnetic resonance imaging, positron emission tomography). Common detection methods include fluorine-based labelling using radioactive isotopes or fluorescent dyes. Nevertheless, these molecular imaging methods can be harmful for health due to the potential instability of fluorochromes and the cytotoxicity of radioisotopes. Therefore, these methods often require expensive precautionary measures. In this context, we have developed, validated and patented carbon-fluorine spectroscopy (CFS™), recently renamed Spectro-Fluor™ technology, which, among a non-competitive family of in-house-made devices called PLIRFA™ (Pulsed Laser Isochronic Raman and Fluorescence Apparatus™), allows reliable detection of carbon-fluorine (C-F) bonds. C-F bonds are known to be stable and safe labels once incorporated into any type of molecules, cells, compounds or (nano-)materials. In this pioneering research study, we used Spectro-Fluor™ to assess biomarkers. As a proof-of-principle experiment, we established a three-step protocol intended for rapid protein detection, which simply consisted of: (i) incorporating a sufficient concentration of an aromatic amino acid (fluorinated versus non-fluorinated) into cultured cells; (ii) simultaneously isolating the fluorinated protein of interest and the non-fluorinated form of the protein (control) by immunoprecipitation; (iii) comparatively analyzing the respective spectra obtained for the two protein forms by Spectro-Fluor™. Thereby, we were able to differentiate, in colon cancer cells HCT-116, the fluorinated and non-fluorinated forms of p21, a key transcriptional factor and downstream target of p53, the so-called "guardian of the genome". Taken together

  16. The ALMA high speed optical communication link is here: an essential component for reliable present and future operations

    Filippi, G.; Ibsen, J.; Jaque, S.; Liello, F.; Ovando, N.; Astudillo, A.; Parra, J.; Saldias, Christian


    Announced in 2012, started in 2013 and completed in 2015, the ALMA high-bandwidth communication system has become a key factor in achieving the operational and scientific goals of ALMA. This paper summarizes the technical, organizational, and operational goals of the ALMA Optical Link Project, focused on the creation and operation of an effective and sustainable communication infrastructure to connect the ALMA Operations Support Facility and Array Operations Site, both located in the Atacama Desert in the northern region of Chile, with the point of presence of REUNA in Antofagasta, about 400 km away, and from there to the Santiago Central Office in the Chilean capital through the optical infrastructure created by the EC-funded EVALSO project, now an integral part of the REUNA backbone. This new infrastructure, completed in 2014 and now operated on behalf of ALMA by REUNA, the Chilean National Research and Education Network, uses state-of-the-art technologies, such as dark fiber from newly built cables and DWDM transmission, extending the reach of high-capacity communication to the remote region where the Observatory is located. The paper also reports on the results obtained during the first year and a half of testing and operation, in which different operational setups were tried for data transfer, remote collaboration, etc. Finally, the authors present a forward look at its impact on both the future scientific development of the Chajnantor Plateau, where many installations are (and will be) located, and the potential long-term development of the Chilean scientific backbone.

  17. Detection and forecasting of oyster norovirus outbreaks: recent advances and future perspectives.

    Wang, Jiao; Deng, Zhiqiang


    Norovirus is a highly infectious pathogen that is commonly found in oysters growing in fecally contaminated waters. Norovirus outbreaks can cause the closure of oyster harvesting waters and acute gastroenteritis in humans associated with consumption of contaminated raw oysters. Extensive efforts have been made, and much progress achieved, in the detection and forecasting of oyster norovirus outbreaks over the past decades. The main objective of this paper is to provide a literature review of methods and techniques for detecting and forecasting oyster norovirus outbreaks and thereby to identify future directions for improving their detection and forecasting. It is found that (1) norovirus outbreaks display strong seasonality, with the outbreak peak occurring commonly in December-March in the U.S. and April-May in Europe; (2) norovirus outbreaks are affected by multiple environmental factors, including but not limited to precipitation, temperature, solar radiation, wind, and salinity; (3) various modeling approaches may be employed to forecast norovirus outbreaks, including Bayesian models, regression models, Artificial Neural Networks, and process-based models; and (4) diverse techniques are available for near real-time detection of norovirus outbreaks, including multiplex PCR, seminested PCR, real-time PCR, quantitative PCR, and satellite remote sensing. The findings are important to the management of oyster growing waters and to future investigations into norovirus outbreaks. It is recommended that a combined approach of sensor-assisted real-time monitoring and modeling-based forecasting be utilized for efficient and effective detection and forecasting of norovirus outbreaks caused by consumption of contaminated oysters.
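
    As an illustration of the regression-type forecasting approaches listed in (3), a minimal sketch with synthetic data; the covariates, coefficients and decision point are invented, not taken from any cited model:

        # Toy logistic-regression outbreak forecast from environmental drivers.
        # All data, variable choices and coefficients are hypothetical.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 200
        X = np.column_stack([
            rng.normal(15, 8, n),    # water temperature (deg C)
            rng.normal(50, 30, n),   # 7-day antecedent rainfall (mm)
            rng.normal(20, 5, n),    # salinity (ppt)
        ])
        # Synthetic rule: cold water plus heavy rainfall raises outbreak risk.
        logit = -0.15 * X[:, 0] + 0.03 * X[:, 1] - 0.05 * X[:, 2] + 1.0
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        model = LogisticRegression(max_iter=1000).fit(X, y)
        print("P(outbreak):", model.predict_proba([[8.0, 120.0, 15.0]])[0, 1])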

  18. A reliable and inexpensive method of nucleic acid extraction for the PCR-based detection of diverse plant pathogens.

    Li, R; Mock, R; Huang, Q; Abad, J; Hartung, J; Kinard, G


    A reliable extraction method is described for the preparation of total nucleic acids from at least ten plant genera for subsequent detection of plant pathogens by PCR-based techniques. The method combines a modified CTAB (cetyltrimethylammonium bromide) extraction protocol with a semi-automatic homogenizer (FastPrep instrument) for rapid sample processing and a low potential for cross contamination. The method was applied to sample preparation for PCR-based detection of 28 different RNA and DNA viruses, six viroids, two phytoplasmas and two bacterial pathogens from a range of infected host plants including sweet potato, small fruits and fruit trees. The procedure is cost-effective and the quality of the nucleic acid preparations is comparable to those prepared by commonly used commercial kits. The efficiency of the procedure permits processing of numerous samples and the use of a single nucleic acid preparation for testing both RNA and DNA genomes by PCR, making this an appealing method for testing multiple pathogens in certification and quarantine programs.

  19. Optimization of a reliable, fast, cheap and sensitive silver staining method to detect SSR markers in polyacrylamide gels

    Mergeai G.


    A reliable, fast, cheap and sensitive silver staining method to detect nucleic acids in polyacrylamide gels was developed from two standard staining procedures. The main differences between the three methods concerned (i) the pre-exposure with formaldehyde during silver nitrate impregnation, (ii) the addition of sodium thiosulfate and sodium carbonate instead of sodium hydroxide during development, (iii) the removal of the stop reaction or the inclusion of absolute ethanol with acetic acid in the stop solution, and (iv) the duration of the different reaction steps. All methods allowed the detection of similar polymorphisms for simple sequence repeats with different cotton genotypes, but important differences regarding the contrast, background and conservation duration of the gels were observed. Two methods gave superior sensitivity. The improved method was sensitive, fast (20 min), gave lower backgrounds, produced gels with long-term conservation ability, and allowed reutilization of all the solutions used in the staining procedure five to seven times, making it quite cheap.

  20. Christodoulou Memory of GW150914 - Prospects of Detection in LIGO and Future Detectors

    Johnson, Aaron; Kapadia, Shasvath; Kennefick, Daniel


    The event GW150914 produced strains of the order of 10^-21 in the two instruments comprising the Laser Interferometer Gravitational-Wave Observatory (LIGO). The event has been interpreted as originating in a coalescing black hole binary, with individual components of about 30 solar masses each. A striking aspect of the coalescence deduced from the signal is the emission of 3 solar masses of energy in the oscillating gravitational wave. Theory predicts a DC component of the gravitational signal associated with the emission of such large amounts of gravitational wave energy, known as the Christodoulou memory. The memory, as a non-linear component of the signal, is expected to be an order of magnitude smaller than the amplitude of the primary AC component of the gravitational waves. We discuss the prospects of detecting the Christodoulou memory in similar future signals, both with LIGO and with other detectors, including future space-based instruments.
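
    A back-of-envelope scale for the memory offset, assuming the quoted 3 solar masses radiated and the published luminosity distance of roughly 410 Mpc; the angle-dependent prefactor (below unity) is omitted, so this is only an order-of-magnitude sketch:

        # Scale of the nonlinear (Christodoulou) memory:
        # h_mem ~ (G * E_rad) / (c**4 * r), up to an angular factor < 1.
        G = 6.674e-11          # m^3 kg^-1 s^-2
        c = 2.998e8            # m s^-1
        M_sun = 1.989e30       # kg
        Mpc = 3.086e22         # m

        E_rad = 3.0 * M_sun * c**2   # ~3 solar masses radiated (from the event)
        r = 410 * Mpc                # approximate luminosity distance
        h_mem = G * E_rad / (c**4 * r)
        print(f"h_mem ~ {h_mem:.1e}")   # ~3.5e-22, below the ~1e-21 peak strain

    The result sits roughly an order of magnitude below the peak oscillatory strain, consistent with the expectation stated in the abstract.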

  1. Field Effect Sensors for Nucleic Acid Detection: Recent Advances and Future Perspectives

    Bruno Veigas


    In the last decade the use of field-effect-based devices has become a basic structural element in a new generation of biosensors that allow label-free DNA analysis. In particular, ion-sensitive field effect transistors (FETs) are the basis for the development of radically new approaches for the specific detection and characterization of DNA due to FETs' greater signal-to-noise ratio, fast measurement capabilities, and possibility of being included in portable instrumentation. Reliable molecular characterization of DNA and/or RNA is vital for disease diagnostics and for following up alterations in gene expression profiles. FET biosensors may become a relevant tool for molecular diagnostics and at the point of care. The development of these devices and strategies should be carefully designed, as biomolecular recognition and detection events must occur within the Debye length. This limitation is sometimes considered to be fundamental for FET devices and considerable efforts have been made to develop better architectures. Herein we review the use of field effect sensors for nucleic acid detection strategies—from production and functionalization to integration in molecular diagnostics platforms, with special focus on those that have made their way into the diagnostics lab.
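
    The Debye-length constraint mentioned above can be made concrete with the standard screening-length formula for a 1:1 electrolyte; the buffer values below are illustrative:

        # Debye screening length in an aqueous electrolyte, the distance within
        # which FET biosensors must capture binding events:
        # lambda_D = sqrt(eps_r * eps_0 * kB * T / (2 * NA * e**2 * I))
        import math

        eps0 = 8.854e-12   # F/m
        kB = 1.381e-23     # J/K
        NA = 6.022e23      # 1/mol
        e = 1.602e-19      # C

        def debye_length_nm(ionic_strength_mol_per_L, T=298.0, eps_r=78.5):
            I = ionic_strength_mol_per_L * 1e3          # mol/m^3
            lam = math.sqrt(eps_r * eps0 * kB * T / (2 * NA * e**2 * I))
            return lam * 1e9

        print(f"PBS (~150 mM): {debye_length_nm(0.150):.2f} nm")   # ~0.8 nm
        print(f"0.1x PBS     : {debye_length_nm(0.015):.2f} nm")   # ~2.5 nm

    At physiological ionic strength the screening length is under a nanometer, which is why probe design and buffer dilution matter so much for these sensors.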

  2. Testing keV sterile neutrino dark matter in future direct detection experiments

    Campos, Miguel D


    We determine constraints on sterile neutrino warm dark matter from direct detection experiments, taking XENON100 and its future stages as an example. If keV-scale sterile neutrinos scatter inelastically with bound electrons of the target material, an electron recoil signal is generated. This can be used to set limits on the sterile neutrino mass and its mixing with the active sector. While not competitive with astrophysical constraints from X-ray data, these constraints are the first direct laboratory bounds on sterile neutrino warm dark matter, and in some parts of parameter space will be the strongest limits on keV-scale neutrinos.

  3. Accuracy of unenhanced MR imaging in the detection of axillary lymph node metastasis: study of reproducibility and reliability.

    Scaranelo, Anabel M; Eiada, Riham; Jacks, Lindsay M; Kulkarni, Supriya R; Crystal, Pavel


    To investigate the accuracy, reproducibility, and reliability of unenhanced magnetic resonance (MR) imaging techniques for detecting metastatic axillary lymph nodes in patients with newly diagnosed breast carcinoma. Institutional review board approval and informed consent were obtained. Seventy-four consecutive women with invasive breast carcinoma were recruited to undergo preoperative breast MR imaging. Thirteen patients were excluded, 2 because they were undergoing preoperative chemotherapy and 11 because of the presence of movement or susceptibility artifacts on images. Thus, 61 patients (mean age, 53 years; range, 33-78 years) were included in this study. Axial T1-weighted MR images without fat saturation and diffusion-weighted (DW) MR images were analyzed by two experienced radiologists, who were blinded to the histopathologic findings. Visual and quantitative analyses of unenhanced MR images were performed. Sensitivity, specificity, and accuracy were calculated. To assess intraobserver agreement, a second reading was performed. Statistical analysis was conducted on a patient-by-affected-side basis. The sensitivity, specificity, and accuracy were 88%, 82%, and 85%, respectively, for axial T1-weighted MR imaging and 84%, 77%, and 80% for DW imaging. Apparent diffusion coefficients (ADCs) were significantly lower in the malignant group (P < .05 for all four readings), with the average of the four readings ranging from 0.333×10^-3 mm^2/sec to 2.843×10^-3 mm^2/sec. The mean Lin coefficient comparing the mean ADC reading for each observer was 0.959 (95% confidence interval: 0.935, 0.975), suggesting very high interobserver agreement in terms of reproducibility of ADCs. The Bland-Altman plot showed good inter- and intraobserver agreement. Unenhanced MR imaging techniques showed high accuracy in the preoperative evaluation of axillary status in patients with invasive breast cancer. Results indicate reliable and reproducible assessment of axillary nodal status.
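
    For context, ADC values of this kind are derived from the diffusion-weighted signal decay via the standard two-point formula; the signal values and b-value below are illustrative:

        # Apparent diffusion coefficient from a two-b-value DW acquisition:
        # ADC = ln(S0 / Sb) / b. Signal values are made up for illustration.
        import math

        def adc(s0, sb, b=800.0):         # b in s/mm^2
            return math.log(s0 / sb) / b  # result in mm^2/s

        print(f"{adc(1000.0, 450.0):.3e} mm^2/s")   # ~1.0e-3 mm^2/s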

  4. Curriculum-based measurement of oral reading: A preliminary investigation of confidence interval overlap to detect reliable growth.

    Van Norman, Ethan R


    Curriculum-based measurement of oral reading (CBM-R) progress monitoring data is used to measure student response to instruction. Federal legislation permits educators to use CBM-R progress monitoring data as a basis for determining the presence of specific learning disabilities. However, decision-making frameworks originally developed for CBM-R progress monitoring data were not intended for such high-stakes assessments. Numerous documented issues with trend line estimation undermine the validity of using slope estimates to infer progress. One proposed recommendation is to use confidence interval overlap as a means of judging reliable growth. This project explored the degree to which confidence interval overlap was related to true growth magnitude using simulation methodology. True and observed CBM-R scores were generated across 7 durations of data collection (range 6-18 weeks), 3 levels of dataset quality or residual variance (5, 10, and 15 words read correct per minute) and 2 types of data collection schedules. Descriptive and inferential analyses were conducted to explore interactions between overlap status, progress monitoring scenarios, and true growth magnitude. A small but statistically significant interaction was observed between overlap status, duration, and dataset quality, b = -0.004, t(20992) = -7.96, p < .001. In general, confidence interval overlap does not appear to meaningfully account for variance in true growth across many progress monitoring conditions. Implications for research and practice are discussed. Limitations and directions for future research are addressed.
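
    A toy version of the decision rule under study: fit slopes to two simulated monitoring phases and check whether their 95% confidence intervals overlap. All data are simulated, not the study's:

        # Confidence-interval-overlap check on two fitted CBM-R slopes.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        def slope_ci(weeks, scores, level=0.95):
            res = stats.linregress(weeks, scores)
            t = stats.t.ppf(0.5 + level / 2, len(weeks) - 2)
            return res.slope - t * res.stderr, res.slope + t * res.stderr

        weeks = np.arange(12)
        baseline = 100 + 0.5 * weeks + rng.normal(0, 10, 12)   # slow true growth
        response = 100 + 2.5 * weeks + rng.normal(0, 10, 12)   # fast true growth

        lo1, hi1 = slope_ci(weeks, baseline)
        lo2, hi2 = slope_ci(weeks, response)
        overlap = (lo1 <= hi2) and (lo2 <= hi1)
        print("reliable growth change" if not overlap else "CIs overlap: inconclusive")

    With the residual variances the study examines (5-15 words correct per minute), the intervals are wide, which is exactly why overlap proved a weak indicator of true growth.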

  5. Reliability and minimal detectable change of three functional tests: forward-lunge, step-up-over and sit-to-stand.

    Luque-Siles, Carmen; Gallego-Izquierdo, Tomas; Jímenez-Rejano, Jose Jesus; de-la-Orden, Susana Granados; Plaza-Manzano, Gustavo; López-Illescas-Ruiz, Africa; Ferragut-Garcías, Alejandro; Romero-Franco, Natalia; Martín-Casas, Patricia; Pecos-Martín, Daniel


    [Purpose] To examine the intrasession, intersession and absolute reliability of three functional dynamic tests, the forward-lunge, step-up-over and sit-to-stand tests, using computerized dynamic posturography. [Subjects and Methods] An intra-test and test-retest repeated-measures study was designed. Forty-five healthy subjects carried out the forward-lunge, step-up-over and sit-to-stand tests twice, on two days one week apart. The intrasession and intersession reliabilities, as judged by the intraclass correlation coefficient (ICC), and the minimal detectable change of the three functional tests were calculated. [Results] Excellent to very good intrasession reliability was found for the forward-lunge test (ICC range 0.9-0.8), very good to good intrasession reliability for the step-up-over test (ICC range 0.9-0.5), and very good intrasession reliability for the sit-to-stand test (ICC range 0.8-0.7). The minimal detectable change at the 95% confidence level was lower than 30% for most measures. [Conclusion] The forward-lunge, step-up-over and sit-to-stand tests are reliable measurement tools.
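
    The minimal detectable change reported here is conventionally derived from the ICC via the standard error of measurement; a sketch with hypothetical numbers:

        # Standard formulas behind the reliability indices reported above:
        # SEM = SD * sqrt(1 - ICC); MDC95 = 1.96 * sqrt(2) * SEM.
        # Values below are illustrative, not the study's data.
        import math

        def sem(sd, icc):
            return sd * math.sqrt(1 - icc)

        def mdc95(sd, icc):
            return 1.96 * math.sqrt(2) * sem(sd, icc)

        sd, icc = 4.0, 0.85          # hypothetical between-subject SD and ICC
        print(f"SEM   = {sem(sd, icc):.2f}")
        print(f"MDC95 = {mdc95(sd, icc):.2f}")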

  6. Future non-invasive imaging to detect vascular plaque instability and subclinical non-obstructive atherosclerosis

    Arnon Blum; Menachem Nahir


    Atherosclerosis underlies the major causes of death in the Western world. Our main goal is to detect early changes of atherosclerosis and to identify subjects at the highest cardiovascular risk, which may aid in the development of prevention approaches and better management that will decrease cardiovascular morbidity and mortality. The methods of interest include advanced vascular ultrasound methods, infrared and near-infrared imaging techniques, the EndoPAT device that reflects peripheral arterial tone, electron beam computed tomography, magnetic resonance imaging, and molecular imaging techniques. In this review we focus on the future of advanced imaging techniques being developed to detect early (pre-clinical) development of atherosclerosis.

  7. Reliable allele detection using SNP-based PCR primers containing Locked Nucleic Acid: application in genetic mapping

    Trognitz Friederike


    Background: The diploid Solanum caripense, a wild relative of potato and tomato, possesses valuable resistance to potato late blight and we are interested in the genetic basis of this resistance. Due to extremely low levels of genetic variation within the S. caripense genome it proved impossible to generate a dense genetic map and to assign individual Solanum chromosomes through the use of conventional chromosome-specific SSR, RFLP and AFLP markers, as well as gene- or locus-specific markers. The ease of detection of DNA polymorphisms depends on both the frequency and the form of sequence variation. The narrow genetic background of close relatives and inbreds complicates the detection of persisting, reduced polymorphism and is a challenge to the development of reliable molecular markers. Nonetheless, monomorphic DNA fragments representing not directly usable conventional markers can contain considerable variation at the level of single nucleotide polymorphisms (SNPs). This can be used for the design of allele-specific molecular markers. The reproducible detection of allele-specific markers based on SNPs has been a technical challenge. Results: We present a fast and cost-effective protocol for the detection of allele-specific SNPs by applying Sequence Polymorphism-Derived (SPD) markers. These markers proved highly efficient for fingerprinting of individuals possessing a homogeneous genetic background. SPD markers are obtained from within non-informative, conventional molecular marker fragments that are screened for SNPs to design allele-specific PCR primers. The method makes use of primers containing a single, 3'-terminal Locked Nucleic Acid (LNA) base. We demonstrate the applicability of the technique by successful genetic mapping of allele-specific SNP markers derived from monomorphic Conserved Ortholog Set II (COSII) markers mapped to Solanum chromosomes, in S. caripense. By using SPD markers it was possible for the first time to map the S. caripense alleles

  8. Ash pollen allergy: reliable detection of sensitization on the basis of IgE to Ole e 1.

    Imhof, Konrad; Probst, Elisabeth; Seifert, Burkhardt; Regenass, Stephan; Schmid-Grendelmeier, Peter

    Background: Alongside hazel, alder and birch pollen allergies, ash pollen allergy is a relevant cause of hay fever during spring in the European region. For some considerable time, ash pollen allergy was not routinely investigated and its clinical relevance may well have been underestimated, particularly since ash and birch tree pollination times largely coincide. Ash pollen extracts are not yet well standardized and diagnosis is therefore sometimes unreliable. Olive pollen, on the other hand, is strongly cross-reactive with ash pollen and is apparently better standardized. Therefore, the main allergen of olive pollen, Ole e 1, has been postulated as a reliable alternative for the detection of ash pollen sensitization. Methods: To determine to what extent specific IgE against Ole e 1 is relevant in patients with ash pollen allergy, we included in a retrospective study 183 subjects with ash pollen allergy displaying typical symptoms in March/April, a positive skin prick test, and specific IgE against Ole e 1 (t224), ash pollen (t25) and various birch allergens (Bet v 1, Bet v 2/v 4). Results: A significant correlation was seen between specific IgE against Ole e 1 and ash pollen, and also, to a slightly lesser extent, between IgE against Ole e 1 and the skin prick test with ash pollen, the latter being even higher than that between IgE and skin prick test both with ash pollen. No relevant correlation was found with birch pollen allergens, demonstrating the very limited cross-reactivity between ash and birch pollen. Conclusion: It appears appropriate to determine specific IgE against Ole e 1 instead of IgE against ash pollen to detect persons with ash pollen allergy. Our findings may also support the idea of using possibly better standardized or more widely available olive pollen extracts instead of ash pollen extract for allergen-specific immunotherapy.

  9. Practical and reliable enzyme test for the detection of mucopolysaccharidosis IVA (Morquio Syndrome type A) in dried blood samples.

    Camelier, Marli V; Burin, Maira G; De Mari, Jurema; Vieira, Taiane A; Marasca, Giórgia; Giugliani, Roberto


    Mucopolysaccharidosis IVA (MPS IVA), or Morquio syndrome type A, is an autosomal recessive disease caused by deficiency of the lysosomal enzyme N-acetylgalactosamine-6-sulfatase (GALNS), resulting in excessive lysosomal storage of keratan sulfate in many tissues and organs. This accumulation causes a severe skeletal dysplasia with short stature, and affects the eye, heart and other organs, with many signs and symptoms. Morquio A syndrome is estimated to occur in 1 in 200,000 to 300,000 live births. Clinical trials with enzyme replacement therapy for this disease are in progress, and it is probable that the treatment, when available, will be more effective if started early. We describe an innovative fluorometric method for the assay of GALNS in dried blood spots (DBS). We used DBS as the enzyme source and compared them with leukocyte samples, having studied 25 MPS IVA patients and 54 healthy controls. We optimized the assay conditions, including incubation time and stability of DBS samples. To Eppendorf-type tubes containing a 3-mm diameter blood spot we added elution liquid and substrate solution. After 2 different incubations at 37°C, the amount of hydrolyzed product was compared with a calibrator to allow quantification of the enzyme activity. Results in DBS were compared to those obtained in leukocytes using the standard technique. The fluorometric methodology was validated in our laboratory and the assay was found to be sensitive and specific, allowing reliable detection of MPS IVA patients. The use of DBS simplifies the collection and transport steps, and is especially useful for testing patients from more remote areas of large countries, and when samples need to cross country borders. This assay could easily be incorporated into the protocol of reference laboratories and play a role in screening for MPS IVA, contributing to earlier detection of affected patients.
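
    A typical activity calculation for a calibrator-based fluorometric DBS assay of this general kind; the constants and fluorescence readings below are hypothetical, not the study's protocol values:

        # Enzyme activity from fluorescence, referenced to a calibrator and
        # scaled by incubation time and blood volume per punch. All numbers
        # are illustrative.
        def activity_nmol_per_h_per_mL(f_sample, f_blank, f_calibrator,
                                       calibrator_nmol, hours, blood_uL):
            product_nmol = ((f_sample - f_blank) / (f_calibrator - f_blank)
                            * calibrator_nmol)
            return product_nmol / hours / (blood_uL / 1000.0)

        # Hypothetical punch: 3.2 uL blood, 20 h incubation, 2 nmol calibrator.
        print(activity_nmol_per_h_per_mL(5200, 400, 9600, 2.0, 20.0, 3.2))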

  10. Carina as a useful and reliable radiological landmark for detection of accidental arterial placement of central venous catheters.

    Umesh, Goneppanavar; Ranjan, Shetty; Jasvinder, Kaur; Nanda, Shetty


    Central venous catheters are commonly used in the management of critically ill patients. Their insertion can be challenging in hemodynamically unstable patients and in those with altered thoracic anatomy. Although ultrasound-guided insertion can reduce this problem, the facility may not be available in all locations and institutions. Accidental arterial puncture is one of the most serious complications that can occur during central venous catheter insertion. It is usually detected clinically by the bright color and projectile/pulsatile flow of the returning blood. However, such means are known to be misleading, especially in hypoxic and hemodynamically unstable patients. Other recognized measures used to identify arterial puncture are blood gas analysis of the returning blood and use of a pressure transducer to identify the waveform pattern and pressures. In this article, we propose that the trachea and carina can be used as a reliable radiological landmark to identify accidental arterial placement of central venous catheters. We further conclude that this information could be useful especially when dealing with post-resuscitation victims and hemodynamically unstable critically ill patients.

  11. Detecting reliable non interacting proteins (NIPs) significantly enhancing the computational prediction of protein-protein interactions using machine learning methods.

    Srivastava, A; Mazzocco, G; Kel, A; Wyrwicz, L S; Plewczynski, D


    Protein-protein interactions (PPIs) play a vital role in most biological processes. Hence their comprehension can promote a better understanding of the mechanisms underlying living systems. However, besides the cost and time limitations involved in the detection of experimentally validated PPIs, noise in the data is still an important issue to overcome. In the last decade several in silico PPI prediction methods using both structural and genomic information were developed for this purpose. Here we introduce a unique validation approach aimed at collecting reliable non-interacting proteins (NIPs). Thereafter, the most relevant protein/protein-pair-related features were selected. Finally, the prepared dataset was used for PPI classification, leveraging the prediction capabilities of well-established machine learning methods. Our best classification procedure displayed specificity and sensitivity values of 96.33% and 98.02%, respectively, surpassing the prediction capabilities of other methods, including those trained on gold-standard datasets. We showed that the PPI/NIP predictive performance can be considerably improved by focusing on data preparation.
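
    A generic sketch of the final classification step described above: training a standard classifier on protein-pair feature vectors labelled PPI (1) versus NIP (0). The features and labels here are synthetic stand-ins for the curated datasets the authors used:

        # Train/evaluate a classifier on synthetic protein-pair features.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(7)
        n = 1000
        X = rng.normal(size=(n, 20))     # e.g. coexpression, domain, GO features
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n)) > 0  # synthetic labels

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

    The abstract's central point maps onto the data-preparation step here: the quality of the negative (NIP) labels, not the learner, is what drives the reported gains.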

  12. The validity and reliability of the System for Early Detection of Developmental Disorders: 3-36 months

    Francisco Alcantud Marín


    This article introduces the System for Early Detection of Developmental Disorders (referred to as SDPTD, its abbreviation in Spanish), a system developed in previous papers. The SDPTD is a developmental screening test that includes seven questionnaires, one for each cutoff age (3, 6, 9, 12, 18, 24 and 36 months). These questionnaires have been designed to be answered by parents. To study its validity, the SDPTD was administered to a sample of 728 children (approximately 100 children in each of the seven cutoff age groups). A development scale known as the Merrill-Palmer-Revised (MP-R) was used as the criterion test. The developmental state of the children was tested again one year later. The results show a high level of agreement between parents and professionals. The concurrent validity is high, although it varies by cutoff age. Regarding the diagnostic validity a year after the original evaluation, levels of sensitivity and specificity are high enough to consider the system reliable, valid and suitable for screening purposes.

  13. Detection of eating disorders in patients: validity and reliability of the French version of the SCOFF questionnaire.

    Garcia, Frederico Duarte; Grigioni, Sébastien; Allais, Elodie; Houy-Durand, Emmanuelle; Thibaut, Florence; Déchelotte, Pierre


    Although the prevalence of eating disorders is increasing, they are often underdiagnosed in cases of unspecific signs of malnutrition. Screening scales may allow earlier diagnosis and nutritional intervention. This study aimed to evaluate the validity of the French version (SCOFF-F) of the SCOFF questionnaire for the detection of eating disorders in a female patient population referred to a clinical nutrition unit. After answering the 5 dichotomous questions of the paper version of the SCOFF-F, patients were evaluated by an eating disorders specialist blinded to the questionnaire results, using the MINI and the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) criteria as a gold standard. Patients with anorexia nervosa (n = 67) and with bulimia nervosa (n = 45) were assessed. Age-matched healthy female students (n = 114) served as the control group. At a cut-off of two positive responses, the sensitivity, specificity and area under the curve of SCOFF-F were 94.6%, 94.7% and 97.9%, respectively. Cohen's kappa coefficient between SCOFF-F and MINI results was 89%. The results of this study confirm the reliability of SCOFF-F as a screening and diagnosis-facilitating test for eating disorders in a French-speaking female patient population. SCOFF-F should help professionals in clinical nutrition achieve earlier diagnosis and care of eating disorder patients.

  14. Detection of Historical and Future Precipitation Variations and Extremes Over the Continental United States

    Anderson, Bruce T. [Boston Univ., MA (United States)]


    through a change in the underlying climate. As such, this method is capable of detecting “hot spot” regions—as well as “flare ups” within the hot spot regions—that have experienced interannual to multi-decadal scale variations and trends in seasonal-mean precipitation and extreme events. Further by applying the same methods to numerical climate models we can discern the fidelity of the current-generation climate models in representing detectability within the observed climate system. In this way, we can objectively determine the utility of these model systems for performing detection studies of historical and future climate change.

  15. Is air-displacement plethysmography a reliable method of detecting ongoing changes in percent body fat within obese children involved in a weight management program?

    Ewane, Cecile; McConkey, Stacy A; Kreiter, Clarence D; Fuller, Mathew; Tabor, Ann; Bosch, Joni; Mews, Jayme; Baldwin, Kris; Van Dyke, Don C


    The prevalence of childhood obesity in the US has increased considerably over the last few decades and continues to increase. To monitor the progress of patients enrolled in weight management programs, clinicians need accurate methods of detecting changes in body composition (percent body fat) over time. The gold standard method, hydrodensitometry, has severe limitations for the pediatric population. This study examines the reliability of air-displacement plethysmography (ADP) in detecting percent body fat changes within obese children over time. Percent body fat by ADP, weight, and body mass index (BMI) were measured for eight obese children aged 5-12 years enrolled in a weight management program over a 12-month period. These measurements were taken at initial evaluation, 1.5 months, 3 months, 6 months, and 12 months to monitor the progress of the subjects and detect any changes in these measures over time. Statistical analysis was used to determine the reliability of the data collected. The reliability estimate for percent body fat by ADP was 0.78. This was much lower than the reliability of BMI, 0.98, and weight measurements, 0.99. The low reliability estimate of ADP indicates a large standard error of measurement by this method. The measurement error of ADP is large, and in our study, ADP measured changes in percent body fat that far exceeded levels of true change that would have been clinically useful and important to detect. Hence, this method yielded change measures that did not allow meaningful clinical interpretations and often did not reflect true differences in status across time. ADP is not a reliable method for detecting changes in percent body fat over the time intervals employed within this study of obese children.

  16. The reliability and minimal detectable change of Timed Up and Go test in individuals with grade 1-3 knee osteoarthritis.

    Alghadir, Ahmad; Anwer, Shahnawaz; Brismée, Jean-Michel


    The Timed Up and Go (TUG) test is a quick and easy test to assess patients' functional mobility. However, its reliability in individuals with knee osteoarthritis (OA) has not been well established. The aims of this study were to determine the reliability and minimal detectable change of the TUG test in individuals with doubtful to moderate (Grade 1-3) knee OA. Sixty-five subjects (25 male, 40 female), aged 45-70 years, with knee OA participated. Inter-rater reliability was assessed using two observers at different times of the same day in an alternating order. Intra-rater reliability was assessed on two consecutive visits with a 2-day interval. The standard error of measurement (SEM) and the minimal detectable change (MDC) were calculated to determine statistically meaningful changes. Intra-rater and inter-rater reliability were 0.97 (95% confidence interval [CI], 0.95-0.98) and 0.96 (95% CI, 0.94-0.97), respectively. The MDC, based on measurements by a single rater and between raters, was 1.10 and 1.14 seconds, respectively. The TUG is a reliable test with an adequate MDC for clinical use in individuals with doubtful to moderate knee OA.

  17. A simple, rapid and reliable enzyme-linked immunosorbent assay for the detection of bovine virus diarrhoea virus (BVDV) specific antibodies in cattle serum, plasma and bulk milk

    Kramps, J.A.; Maanen, van C.; Wetering, van de G.; Stienstra, G.; Quak, S.; Brinkhof, J.; Ronsholt, L.; Nylin, B.


    To detect Bovine Virus Diarrhoea Virus (BVDV)-specific antibodies in cattle serum, plasma and bulk milk, a simple, reliable and rapid blocking ELISA ("Ceditest") has been developed using two monoclonal antibodies ("WB112" and "WB103") directed to different highly conserved epitopes on the non-structural protein of the virus.

  18. Improvement of Matrix Converter Drive Reliability by Online Fault Detection and a Fault-Tolerant Switching Strategy

    Nguyen-Duy, Khiem; Liu, Tian-Hua; Chen, Der-Fa


    The matrix converter system is becoming a very promising candidate to replace the conventional two-stage ac/dc/ac converter, but system reliability remains an open issue. The most common reliability problem is that a bidirectional switch has an open-switch fault during operation. In this paper, an online fault detection method and a fault-tolerant switching strategy are proposed to improve the reliability of matrix converter drives.

  19. Audio gunshot detection and localization systems: History, basic design, and future possibilities

    Graves, Jordan R.

    For decades, law enforcement organizations have increasingly utilized audio detection and localization systems to identify potential gunshot incidents and to respond accordingly. These systems have grown from simple microphone configurations used to estimate location into complex arrays that seem to pinpoint gunfire to within mere feet of its actual occurrence. Such technology comes from a long and dynamic history of developing equipment dating back to the First World War. Additionally, though basic designs require little in terms of programming or engineering experience, the mere presence of this tool invokes a firestorm of debate amongst economists, law enforcement groups, and the general public, which leads to questions about future possibilities for its use. The following pages will retell the history of these systems from theoretical conception to current capabilities. This work will also dissect these systems to reveal fundamental elements of their inner workings, in order to build a basic demonstrative system. Finally, this work will discuss some legal and moral points of dissension, and will explore these systems’ roles in society now and in the future, in additional applications as well.
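
    The localization step in arrays of this kind is commonly posed as a time-difference-of-arrival (TDOA) fit; a minimal simulated sketch (geometry and timings invented, not from any deployed system):

        # Least-squares TDOA localizer: fit a source position to arrival times
        # at several microphones. Differencing against microphone 0 removes
        # the unknown emission time. All values are simulated.
        import numpy as np
        from scipy.optimize import least_squares

        C = 343.0  # speed of sound, m/s
        mics = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 120.0]])
        source = np.array([37.0, 62.0])

        arrivals = np.linalg.norm(mics - source, axis=1) / C  # noiseless times

        def residuals(p):
            pred = np.linalg.norm(mics - p, axis=1) / C
            return (pred - pred[0]) - (arrivals - arrivals[0])

        fit = least_squares(residuals, x0=np.array([50.0, 50.0]))
        print("estimated source:", fit.x)   # ~[37, 62]

    Real systems add noisy timings, outlier rejection and many more sensors, but the geometry of the fit is the same.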

  20. Reliability of measuring regional callosal atrophy in neurodegenerative diseases

    Jeroen Van Schependom, MSc Eng, PhD


    In summary, we have constructed an algorithm that reliably detects the CC in 3D T1 images in a fully automated way in healthy controls and in different neurodegenerative diseases. Although the CC area and the circularity are the most reliable features (ICC > 0.97), the reliability of the thickness profile (ICC > 0.90, excluding the tip) is sufficient to warrant its inclusion in future clinical studies.

  1. Relevant Literary Space about Service Quality Evaluation: Countries where the Studies are Performed, Analytical Methods, Reliability Assessments, Hypothesis and Future Research

    Pérez-Rave Jorge Iván


    This paper aims to explore current thinking about the study of service quality, covering five variables. We used the methodology of Systematic Literature Review in Engineering (RSLI; steps: identifying, describing, deepening and publishing). After the first selection of documents based on the delimitation map, the final quality control consisted of checking 257 documents (90.3% exceeded the inclusion criteria). According to the representativeness analysis of the Relevant Literary Space (ELR), the top 50 documents were chosen, representing 4.2% of the population (1,019 documents) and consolidating 44.7% of the citations issued on the subject. China is the country with the largest presence in the ELR, as are Structural Equation Models among the analytical methods; Cronbach's alpha is the most used reliability index, with a mean value of 0.87, higher than the traditional acceptance value (0.7). The most accepted hypotheses concern proven relationships among the variables/constructs quality, satisfaction, perceived value, and behavioural intentions. Six categories of future research are identified, two of which are to expand the scope of research to other contexts and to deepen relations between variables/constructs.
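
    For reference, Cronbach's alpha, the reliability index the review tabulates, computed with the standard formula on made-up item scores:

        # Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/var(total)).
        # Item data below are invented for illustration.
        import numpy as np

        def cronbach_alpha(items):
            """items: (respondents x items) matrix of scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        scores = np.array([[4, 5, 4], [2, 2, 3], [5, 4, 5], [3, 3, 2], [4, 4, 4]])
        print(round(cronbach_alpha(scores), 2))   # ~0.9, above the 0.7 threshold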

  2. Strongly lensed neutral hydrogen emission: detection predictions with current and future radio interferometers

    Deane, R P; Heywood, I


    Strong gravitational lensing provides some of the deepest views of the Universe, enabling studies of high-redshift galaxies that, without the lensing phenomenon, would only be possible with next-generation facilities. To date, 21 cm radio emission from neutral hydrogen has only been detected directly out to z~0.2, limited by the sensitivity and instantaneous bandwidth of current radio telescopes. We discuss how current and future radio interferometers such as the Square Kilometre Array (SKA) will detect lensed HI emission in individual galaxies at high redshift. Our calculations rely on a semi-analytic galaxy simulation with realistic HI disks (by size, density profile and rotation), in a cosmological context, combined with general relativistic ray tracing. Wide-field, blind HI surveys with the SKA are predicted to be efficient at discovering lensed HI systems, increasingly so at z > 2. This will be enabled by the combination of the magnification boosts, the steepness of the HI luminosity function at the high-mass end, and t...

  3. Application of Geologic Mapping Techniques and Autonomous Feature Detection to Future Exploration of Europa

    Bunte, M. K.; Tanaka, K. L.; Doggett, T.; Figueredo, P. H.; Lin, Y.; Greeley, R.; Saripalli, S.; Bell, J. F.


    disrupted surface morphologies. Areas of high interest include lineaments and chaos margins. The limitations on detecting activity at these locations are approximated by studying similar observed conditions on other bodies. By adapting machine learning and data mining techniques to signatures of plumes and morphology, I have demonstrated autonomous rule-based detection of known features using edge-detection and supervised classification methods. These methods successfully detect up to 94% of known volcanic plumes or jets at Io, Enceladus, and comets. They also allow recognition of multiple feature types. Applying these results to conditions expected for Europa enables a prediction of the potential for detection of similar features and supports recommendations for mission concepts to increase the science return and efficiency of future missions to observe Europa. This post-Galileo view of Europa provides a synthesis of the overall history of this unique icy satellite and will be a useful frame of reference for future exploration of the jovian system and other potentially active outer solar system bodies.
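
    A minimal sketch of the edge-detection step in a rule-based pipeline of this kind; the test image and flagging threshold are stand-ins, not the study's tuned values:

        # Edge-density rule for flagging lineament-like tiles for downlink.
        import numpy as np
        from skimage import feature, data

        tile = data.camera()  # stand-in for an icy-surface image tile
        edges = feature.canny(tile / 255.0, sigma=2.0)

        edge_fraction = edges.mean()
        print(f"edge pixels: {edge_fraction:.1%}")
        # Simple rule: flag tiles whose edge density exceeds a learned threshold.
        print("flag for downlink" if edge_fraction > 0.05 else "skip")

    Onboard, a rule like this acts as a cheap pre-filter; the supervised classifier then decides among feature types on the flagged tiles.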

  4. Detectability of rotation-powered pulsars in future hard X-ray surveys

    Wei Wang


    Recent INTEGRAL/IBIS hard X-ray surveys have detected about 10 young pulsars. We show the hard X-ray properties of these 10 young pulsars, which have luminosities of 10^33-10^37 erg s^-1 and photon indices of 1.6-2.1 in the energy range of 20-100 keV. The correlation between X-ray luminosity and spin-down power, L_X ∝ L_sd^1.31, suggests that the hard X-ray emission in rotation-powered pulsars is dominated by the pulsar wind nebula (PWN) component. Assuming spectral properties are similar in 20-100 keV and 2-10 keV for both the pulsar and PWN components, the hard X-ray luminosities and fluxes of 39 known young X-ray pulsars and 8 millisecond pulsars are obtained, and a correlation of L_X ∝ L_sd^1.5 is derived. About 20 known young X-ray pulsars and 1 millisecond pulsar could be detected with future INTEGRAL and HXMT surveys. We also carry out Monte Carlo simulations of hard X-ray pulsars in the Galaxy and the Gould Belt, assuming values for the pulsar birth rate, initial position, proper motion velocity, period, and magnetic field distribution and evolution based on observational statistics, and the L_X - L_sd relations L_X ∝ L_sd^1.31 and L_X ∝ L_sd^1.5. More than 40 young pulsars (mostly in the Galactic plane) could be detected after ten years of INTEGRAL surveys and the launch of HXMT. The young pulsars would thus be a significant part of the hard X-ray source population in the sky, and will contribute to unidentified hard X-ray sources in present and future hard X-ray surveys by INTEGRAL and HXMT.
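
    The quoted scaling can be written directly as a predictor; only the exponent comes from the abstract, and the normalization below is a placeholder that a real analysis would calibrate against the survey:

        # L_X ∝ L_sd^1.31, as a function. A is a placeholder normalization.
        def predicted_lx(l_sd_erg_s, alpha=1.31, A=1e-15):
            return A * l_sd_erg_s ** alpha

        print(f"L_X ~ {predicted_lx(1e36):.1e} erg/s")  # with the placeholder A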

  5. Investment Choosing for a Reliable Future: Description of the Return Program

    Hiegel M.


    The general capital budgeting program developed by the Department of Economics of the Institut Français du Pétrole is designed to compare different investment projects for a reliable future. This highly flexible program, which does not attempt to make any contribution on the theoretical level, enables various pitfalls to be avoided in the present value method, whose apparent simplicity is sometimes a source of confusion. Most investment projects, whatever their sector of activity, can be analyzed with this program. Conventional decision criteria are determined in the light of the financing structure, the fiscal system and currency depreciation. Sensitivity analyses can be used to test the value of these economic criteria under a great many hypotheses, and constitute an initial approach to problems linked to an uncertain future.

  6. Ornithine decarboxylase antizyme finder (OAF): Fast and reliable detection of antizymes with frameshifts in mRNAs

    Atkins John F


    Background: Ornithine decarboxylase antizymes are proteins which negatively regulate cellular polyamine levels via their effects on polyamine synthesis and cellular uptake. In virtually all organisms from yeast to mammals, antizymes are encoded by two partially overlapping open reading frames (ORFs). A +1 frameshift between frames is required for the synthesis of antizyme. Ribosomes change translation phase at the end of the first ORF in response to stimulatory signals embedded in the mRNA. Since standard sequence analysis pipelines are currently unable to recognise sites of programmed ribosomal frameshifting, proper detection of full-length antizyme coding sequences (CDS) requires conscientious manual evaluation by a human expert. The rapid growth of sequence information demands less laborious and more cost-efficient solutions for this problem. This manuscript describes a rapid and accurate computer tool for antizyme CDS detection that requires minimal human involvement. Results: We have developed a computer tool, OAF (ODC antizyme finder), for identifying antizyme-encoding sequences in spliced or intronless nucleic acid sequences. OAF utilizes a combination of profile hidden Markov models (HMMs) built separately for the products of each open reading frame constituting the entire antizyme coding sequence. The profile HMMs are based on a set of 218 manually assembled antizyme sequences. To distinguish between antizyme paralogs and orthologs from major phyla, antizyme sequences were clustered into twelve groups and specific combinations of profile HMMs were designed for each group. OAF has been tested on the current version of dbEST, where it identified over six thousand Expressed Sequence Tag (EST) sequences encoding antizyme proteins (over two thousand antizyme CDS in these ESTs are non-redundant). Conclusion: OAF performs well on raw EST sequences and mRNA sequences derived from genomic annotations. OAF will be used for future updates of the RECODE database.
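
    A toy illustration of the gene geometry OAF searches for: an ORF1 whose stop codon is followed by a long open stretch in the +1 frame. OAF itself uses profile HMMs, so this string scan only shows the structure, not the method:

        # Flag ORF1 stops followed by a long stop-free stretch in the +1 frame.
        import re

        STOPS = {"TAA", "TAG", "TGA"}

        def orf_len_in_frame(seq, start):
            """Count codons until the first stop, reading from `start`."""
            n = 0
            for i in range(start, len(seq) - 2, 3):
                if seq[i:i+3] in STOPS:
                    return n
                n += 1
            return n

        def plus1_candidates(seq, min_orf2_codons=50):
            hits = []
            orf1 = r"ATG(?:(?!TAA|TAG|TGA)[ACGT]{3})*(?:TAA|TAG|TGA)"
            for m in re.finditer(orf1, seq):
                shifted = m.end() - 3 + 1   # +1 frame, entered at the stop codon
                if orf_len_in_frame(seq, shifted) >= min_orf2_codons:
                    hits.append((m.start(), m.end()))
            return hits

        seq = "ATGAAATGA" + "A" * 200       # synthetic two-frame toy sequence
        print(plus1_candidates(seq))        # -> [(0, 9)]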

  7. Power electronics reliability analysis.

    Smith, Mark A.; Atcitty, Stanley


    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
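
    The second approach, deriving system reliability from component reliability via a fault tree, reduces to evaluating AND/OR gates over component failure probabilities. A minimal sketch, with an invented gate structure and invented failure probabilities for a generic power-electronics unit:

```python
# Fault-tree evaluation sketch (hypothetical numbers, independent failures).

def or_gate(*p):   # top event occurs if ANY input fails (series structure)
    q = 1.0
    for pi in p:
        q *= 1.0 - pi
    return 1.0 - q

def and_gate(*p):  # top event occurs only if ALL inputs fail (redundancy)
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Invented annual failure probabilities, for illustration only
p_igbt, p_cap, p_fan1, p_fan2, p_ctrl = 0.02, 0.03, 0.10, 0.10, 0.01

p_cooling = and_gate(p_fan1, p_fan2)                  # both fans must fail
p_system = or_gate(p_igbt, p_cap, p_cooling, p_ctrl)  # any branch fails
print(f"system failure probability: {p_system:.4f}")  # ~0.0683
```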

  8. The Epidemic of Zika Virus-Related Microcephaly in Brazil: Detection, Control, Etiology, and Future Scenarios.

    Teixeira, Maria G; Costa, Maria da Conceição N; de Oliveira, Wanderson K; Nunes, Marilia Lavocat; Rodrigues, Laura C


    We describe the epidemic of microcephaly in Brazil, its detection and attempts to control it, the suspected causal link with Zika virus infection during pregnancy, and possible scenarios for the future. In October 2015, in Pernambuco, Brazil, an increase in the number of newborns with microcephaly was reported. Mothers of the affected newborns reported rashes during pregnancy and no exposure to other potentially teratogenic agents. Women delivering in October would have been in the first trimester of pregnancy during the peak of a Zika epidemic in March. By the end of 2015, 4180 cases of suspected microcephaly had been reported. Zika spread to other American countries and, in February 2016, the World Health Organization declared the Zika epidemic a public health emergency of international concern. This unprecedented situation underscores the urgent need to establish the evidence of congenital infection risk by gestational week and accrue knowledge. There is an urgent call for a Zika vaccine, better diagnostic tests, effective treatment, and improved mosquito-control methods.

  9. Reliable Maintenance of Wireless Sensor Networks for Event-detection Applications

    胡四泉; 杨金阳; 王俊峰


    The reliability maintenance of a wireless sensor network is key to ensuring that alarm messages are delivered reliably and on time to the monitoring center in an event-detection application. Based on the unreliable links in wireless sensor networks and the network characteristics of event-detection applications, MPRRM, a multiple-path redundant reliability maintenance algorithm, is proposed in this paper. Both analytical and simulation results show that the MPRRM algorithm is superior to previously published solutions in the metrics of reliability, false positive rate, latency and message overhead.
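
    The redundancy argument generalizes to a back-of-the-envelope model (my sketch, not the paper's algorithm): if an alarm message is copied over k independent paths, each delivering with probability p, reliability approaches 1 geometrically while message overhead grows only linearly in k.

```python
# Back-of-the-envelope view of multi-path redundancy (illustration only,
# not the MPRRM algorithm): an alarm is delivered if at least one of k
# independent paths, each succeeding with probability p, gets through.

def delivery_reliability(p: float, k: int) -> float:
    return 1.0 - (1.0 - p) ** k

for k in (1, 2, 3, 4):
    print(f"{k} path(s): reliability {delivery_reliability(0.7, k):.3f}, "
          f"message overhead x{k}")
# 0.700, 0.910, 0.973, 0.992 -- reliability rises geometrically while the
# message overhead grows only linearly, which is the trade-off such
# schemes have to tune.
```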

  10. Calibrating vascular plant abundance for detecting future climate changes in Oregon and Washington, USA

    Timothy J. Brady; Vicente J. Monleon; Andrew N. Gray


    We propose using future vascular plant abundances as indicators of future climate in a way analogous to the reconstruction of past environments by many palaeoecologists. To begin monitoring future short-term climate changes in the forests of Oregon and Washington, USA, we developed a set of transfer functions for a present-day calibration set consisting of climate...

  11. Reliability solutions for a smart digital factory using: (1) RFID based CEP; (2) Image processing based error detection; (3) RFID based HCI

    Badr, Eid


    New technologies have a great influence on the production process in modern factories. Introducing new techniques and methods is crucial to optimize and enhance the working of factories. However, ensuring a reliable and correct integration requires complete evaluation and assessment. In this thesis I utilize RFID systems and image processing to develop and implement real time solutions to enhance and optimize the production and assembly processes. Solutions include: RFID based CEP to detect p...

  12. Targeted Next Generation Sequencing as a Reliable Diagnostic Assay for the Detection of Somatic Mutations in Tumours Using Minimal DNA Amounts from Formalin Fixed Paraffin Embedded Material.

    Wendy W J de Leng

    Full Text Available Targeted Next Generation Sequencing (NGS) offers a way to implement testing of multiple genetic aberrations in diagnostic pathology practice, which is necessary for personalized cancer treatment. However, no standards regarding input material have been defined. This study therefore aimed to determine the effect of the type of input material (e.g. formalin-fixed paraffin-embedded (FFPE) versus fresh frozen (FF) tissue) on NGS-derived results. Moreover, this study aimed to explore a standardized analysis pipeline to support consistent clinical decision-making. We used the Ion Torrent PGM sequencing platform in combination with the Ion AmpliSeq Cancer Hotspot Panel v2 to sequence frequently mutated regions in 50 cancer-related genes, and validated the NGS-detected variants in 250 FFPE samples using standard diagnostic assays. Next, 386 tumour samples were sequenced to explore the effect of input material on variant detection variables. For variant calling, Ion Torrent analysis software was supplemented with additional variant annotation and filtering. Both FFPE and FF tissue could be sequenced reliably, with a sensitivity of 99.1%. Validation showed a 98.5% concordance between NGS and conventional sequencing techniques, where NGS provided both the advantage of low input DNA concentration and the detection of low-frequency variants. The reliability of mutation analysis could be further improved with manual inspection of sequence data. Targeted NGS can be reliably implemented in cancer diagnostics using both FFPE and FF tissue when using appropriate analysis settings, even with low input DNA.

  13. Stakeholders' opinions on a future in-vehicle alcohol detection system for prevention of drunk driving.

    Anund, Anna; Antonson, Hans; Ihlström, Jonas


    There is a common understanding that driving under the influence of alcohol is associated with a higher risk of being involved in crashes with injuries and possible fatalities as the outcome. Various countermeasures have therefore been taken by the authorities from time to time to prevent drunk driving. One of them has been the alcohol interlock. Up to now, interlocks have mainly been used by previously convicted drunk drivers and in the commercial road transport sector, but not in private cars. Technology has now reached a level where broader implementation might be possible. To our knowledge, however, little is known about different stakeholders' opinions on a broader implementation of such systems. In order to increase that knowledge, we conducted a focus group study to collect in-depth thoughts from different stakeholders on this topic. Eight focus groups representing a broad societal span were recruited and conducted for the purpose. The results show that most stakeholders thought that an integrated system for alcohol detection in vehicles might be beneficial in lowering the number of drunk-driving crashes. They said that the system would probably mainly prevent driving by people who unintentionally and unknowingly drive under the influence of alcohol. However, the groups did not regard the system as a final solution to the drunk-driving problem, and believed that certain groups, such as criminals and alcoholics, would most likely find a way around the system. Concerns were raised about the risk of increased sleepy driving and driving just under the legal blood alcohol concentration (BAC) limit. The results also indicate that stakeholders preferred a system that provides information on the BAC up to the legal limit, but not for levels above the limit; for those, the system should simply prevent the car from starting. Acceptance of the system depended on the reliability of the system, on its ability to perform fast sampling, and on the analytical process

  14. Web-based tools can be used reliably to detect patients with major depressive disorder and subsyndromal depressive symptoms

    Tsai Shih-Jen


    Full Text Available Abstract Background Although depression has been regarded as a major public health problem, many individuals with depression still remain undetected or untreated. Despite the potential for Internet-based tools to greatly improve the success rate of screening for depression, their reliability and validity have not been well studied. Therefore the aim of this study was to evaluate the test-retest reliability and criterion validity of a Web-based system, the Internet-based Self-assessment Program for Depression (ISP-D). Methods The ISP-D, which screens for major depressive disorder (MDD), minor depressive disorder (MinD), and subsyndromal depressive symptoms (SSD), was developed in traditional Chinese. Volunteers, 18 years and older, were recruited via the Internet and then assessed twice on the online ISP-D system to investigate the test-retest reliability of the test. They were subsequently prompted to schedule face-to-face interviews. The interviews were performed by the research psychiatrists using the Mini-International Neuropsychiatric Interview, and the diagnoses made according to DSM-IV diagnostic criteria were used for the statistics of criterion validity. Kappa (κ) values were calculated to assess test-retest reliability. Results A total of 579 volunteer subjects were administered the test. Most of the subjects were young (mean age: 26.2 ± 6.6 years), female (77.7%), single (81.6%), and well educated (61.9% college or higher). The distributions of MDD, MinD, SSD and no depression specified were 30.9%, 7.4%, 15.2%, and 46.5%, respectively. The mean time to complete the ISP-D was 8.89 ± 6.77 min. One hundred and eighty-four of the respondents completed the retest (response rate: 31.8%). Our analysis revealed that the 2-week test-retest reliability for the ISP-D was excellent (weighted κ = 0.801). Fifty-five participants completed the face-to-face interview for the validity study. The sensitivity, specificity, positive, and negative predictive values for major
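
    The test-retest statistic reported above is a weighted kappa over two administrations of a four-category ordinal screen. A minimal sketch with fabricated ratings (quadratic weights are assumed, since the abstract does not state the weighting scheme):

```python
# Test-retest agreement sketch for a 4-category ordinal screen; ratings
# below are fabricated for illustration.
from sklearn.metrics import cohen_kappa_score

# categories: 0 = no depression, 1 = SSD, 2 = MinD, 3 = MDD
test   = [3, 1, 0, 2, 3, 0, 1, 0, 2, 3]
retest = [3, 1, 0, 1, 3, 0, 0, 0, 2, 3]

kappa = cohen_kappa_score(test, retest, weights="quadratic")
print(f"weighted kappa = {kappa:.3f}")
```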

  15. The Modified painDETECT Questionnaire for Patients with Hip or Knee Osteoarthritis: Translation into Dutch, Cross-Cultural Adaptation and Reliability Assessment.

    Rienstra, Wietske; Blikman, Tim; Mensink, Frans B; van Raay, Jos J A M; Dijkstra, Baukje; Bulstra, Sjoerd K; Stevens, Martin; van den Akker-Scheek, Inge


    There is a growing amount of evidence that alterations in pain processing by the peripheral and central nervous systems play a role in osteoarthritis pain, leading to neuropathic-like symptoms. It is essential to identify knee and hip osteoarthritis patients with a neuropathic pain profile in order to offer such patients education and additional treatment options besides conventional pain treatment. The painDETECT Questionnaire is a self-report questionnaire developed to discriminate between nociceptive and neuropathic pain. This questionnaire was modified to fit patients suffering from knee osteoarthritis. The aim of this study was to translate and cross-culturally adapt the modified painDETECT Questionnaire to the Dutch language and to provide a modified version to fit patients with hip osteoarthritis. Reliability for internal consistency, repeatability and floor and ceiling effects were subsequently assessed. A total of 278 patients were included in the reliability study and 123 patients in the repeatability analysis. The Dutch modified painDETECT Questionnaire shows good internal consistency and small relative measurement errors, represented by a good intraclass correlation coefficient. Absolute measurement error, represented by the Standard Error of Measurement, was acceptable. However, a measurement bias might be present when it comes to repeatability. To our knowledge, this study is the first to provide a Dutch modified painDETECT Questionnaire to fit hip and knee osteoarthritis patients and to assess internal consistency, reliability and agreement. International guidelines were followed in the translation process, and this study has an ample sample size with an adequate time interval for repeatability. Based on this study, the Dutch modified painDETECT Questionnaire seems fit as a discriminative tool to identify knee and hip osteoarthritis patients with a neuropathic pain profile. Whether it is also suitable as an evaluative tool to record changes over time

  16. The Modified painDETECT Questionnaire for Patients with Hip or Knee Osteoarthritis: Translation into Dutch, Cross-Cultural Adaptation and Reliability Assessment.

    Wietske Rienstra

    Full Text Available There is a growing amount of evidence that alterations in pain processing by the peripheral and central nervous systems play a role in osteoarthritis pain, leading to neuropathic-like symptoms. It is essential to identify knee and hip osteoarthritis patients with a neuropathic pain profile in order to offer such patients education and additional treatment options besides conventional pain treatment. The painDETECT Questionnaire is a self-report questionnaire developed to discriminate between nociceptive and neuropathic pain. This questionnaire was modified to fit patients suffering from knee osteoarthritis. The aim of this study was to translate and cross-culturally adapt the modified painDETECT Questionnaire to the Dutch language and to provide a modified version to fit patients with hip osteoarthritis. Reliability for internal consistency, repeatability and floor and ceiling effects were subsequently assessed. A total of 278 patients were included in the reliability study and 123 patients in the repeatability analysis. The Dutch modified painDETECT Questionnaire shows good internal consistency and small relative measurement errors, represented by a good intraclass correlation coefficient. Absolute measurement error, represented by the Standard Error of Measurement, was acceptable. However, a measurement bias might be present when it comes to repeatability. To our knowledge, this study is the first to provide a Dutch modified painDETECT Questionnaire to fit hip and knee osteoarthritis patients and to assess internal consistency, reliability and agreement. International guidelines were followed in the translation process, and this study has an ample sample size with an adequate time interval for repeatability. Based on this study, the Dutch modified painDETECT Questionnaire seems fit as a discriminative tool to identify knee and hip osteoarthritis patients with a neuropathic pain profile. Whether it is also suitable as an evaluative tool to

  17. Adaptation of the ToxRTool to Assess the Reliability of Toxicology Studies Conducted with Genetically Modified Crops and Implications for Future Safety Testing.

    Koch, Michael S; DeSesso, John M; Williams, Amy Lavin; Michalek, Suzanne; Hammond, Bruce


    To determine the reliability of food safety studies carried out in rodents with genetically modified (GM) crops, a Food Safety Study Reliability Tool (FSSRTool) was adapted from the European Centre for the Validation of Alternative Methods' (ECVAM) ToxRTool. Reliability was defined as the inherent quality of the study with regard to use of standardized testing methodology, full documentation of experimental procedures and results, and the plausibility of the findings. Codex guidelines for GM crop safety evaluations indicate toxicology studies are not needed when comparability of the GM crop to its conventional counterpart has been demonstrated. This guidance notwithstanding, animal feeding studies have routinely been conducted with GM crops, but their conclusions on safety are not always consistent. To accurately evaluate potential risks from GM crops, risk assessors need clearly interpretable results from reliable studies. The development of the FSSRTool, which provides the user with a means of assessing the reliability of a toxicology study to inform risk assessment, is discussed. Its application to the body of literature on GM crop food safety studies demonstrates that reliable studies report no toxicologically relevant differences between rodents fed GM crops or their non-GM comparators.

  18. Test-retest reliability and agreement of the SPI-Questionnaire to detect symptoms of digital ischemia in elite volleyball players.

    van de Pol, Daan; Zacharian, Tigran; Maas, Mario; Kuijer, P Paul F M


    The Shoulder posterior circumflex humeral artery Pathology and digital Ischemia - questionnaire (SPI-Q) has been developed to enable periodic surveillance of elite volleyball players, who are at risk for digital ischemia. Prior to implementation, assessing reliability is mandatory. Therefore, the test-retest reliability and agreement of the SPI-Q were evaluated among the population at risk. A questionnaire survey was performed with a 2-week interval among 65 elite male volleyball players assessing symptoms of cold, pale and blue digits in the dominant hand during or after practice or competition using a 4-point Likert scale (never, sometimes, often and always). Kappa (κ) and percentage of agreement (POA) were calculated for individual symptoms, and to distinguish symptomatic and asymptomatic players. For the individual symptoms, κ ranged from "poor" (0.25) to "good" (0.63), and POA ranged from "moderate" (78%) to "good" (97%). To classify symptomatic players, the SPI-Q showed "good" reliability (κ = 0.83; 95%CI 0.69-0.97) and "good" agreement (POA = 92%). The current study has proven the SPI-Q to be reliable for detecting elite male indoor volleyball players with symptoms of digital ischemia.

  19. Numerical and structural genomic aberrations are reliably detectable in tissue microarrays of formalin-fixed paraffin-embedded tumor samples by fluorescence in-situ hybridization.

    Heike Horn

    Full Text Available Few data are available regarding the reliability of fluorescence in-situ hybridization (FISH), especially for chromosomal deletions, in high-throughput settings using tissue microarrays (TMAs). We performed a comprehensive FISH study for the detection of chromosomal translocations and deletions in formalin-fixed and paraffin-embedded (FFPE) tumor specimens arranged in TMA format. We analyzed 46 B-cell lymphoma (B-NHL) specimens with known karyotypes for translocations of IGH-, BCL2-, BCL6- and MYC-genes. Locus-specific DNA probes were used for the detection of deletions in chromosome bands 6q21 and 9p21 in 62 follicular lymphomas (FL) and six malignant mesothelioma (MM) samples, respectively. To test for aberrant signals generated by truncation of nuclei following sectioning of FFPE tissue samples, cell line dilutions with 9p21-deletions were embedded into paraffin blocks. The overall TMA hybridization efficiency was 94%. FISH results regarding translocations matched karyotyping data in 93%. As for chromosomal deletions, sectioning artefacts occurred in 17% to 25% of cells, suggesting that the proportion of cells showing deletions should exceed 25% to be reliably detectable. In conclusion, FISH represents a robust tool for the detection of structural as well as numerical aberrations in FFPE tissue samples in a TMA-based high-throughput setting, when rigorous cut-off values and appropriate controls are maintained, and, of note, was superior to quantitative PCR approaches.

  20. Numerical and structural genomic aberrations are reliably detectable in tissue microarrays of formalin-fixed paraffin-embedded tumor samples by fluorescence in-situ hybridization.

    Horn, Heike; Bausinger, Julia; Staiger, Annette M; Sohn, Maximilian; Schmelter, Christopher; Gruber, Kim; Kalla, Claudia; Ott, M Michaela; Rosenwald, Andreas; Ott, German


    Few data are available regarding the reliability of fluorescence in-situ hybridization (FISH), especially for chromosomal deletions, in high-throughput settings using tissue microarrays (TMAs). We performed a comprehensive FISH study for the detection of chromosomal translocations and deletions in formalin-fixed and paraffin-embedded (FFPE) tumor specimens arranged in TMA format. We analyzed 46 B-cell lymphoma (B-NHL) specimens with known karyotypes for translocations of IGH-, BCL2-, BCL6- and MYC-genes. Locus-specific DNA probes were used for the detection of deletions in chromosome bands 6q21 and 9p21 in 62 follicular lymphomas (FL) and six malignant mesothelioma (MM) samples, respectively. To test for aberrant signals generated by truncation of nuclei following sectioning of FFPE tissue samples, cell line dilutions with 9p21-deletions were embedded into paraffin blocks. The overall TMA hybridization efficiency was 94%. FISH results regarding translocations matched karyotyping data in 93%. As for chromosomal deletions, sectioning artefacts occurred in 17% to 25% of cells, suggesting that the proportion of cells showing deletions should exceed 25% to be reliably detectable. In conclusion, FISH represents a robust tool for the detection of structural as well as numerical aberrations in FFPE tissue samples in a TMA-based high-throughput setting, when rigorous cut-off values and appropriate controls are maintained, and, of note, was superior to quantitative PCR approaches.

  1. On the reliability of fire detection and alarm systems. Exploration and analysis of data from nuclear and non-nuclear installations

    Nyyssoenen, T.; Rajakko, J.; Keski-Rahkonen, O. [VTT Building and Transport, Espoo (Finland)


    A literature review of reliability data for fire detection and alarm systems was made, resulting in rough estimates of some failure frequencies. No theoretical or technical articles on the structure of reliability models for these installations were found. Inspection records of fire detection and alarm system installations by SPEK were studied and transferred into an electronic database, classifying observed failures into failure modes (59) and severity categories (3) guided by the freely written records in the original data. The results of that work are presented largely without comment in tabular form in this paper. A small sample of installations was collected, and the number of components in them was counted, to derive distributions for determining the national populations of various components based on the known total number of installations. From NPPs (Loviisa, Olkiluoto and Barsebaeck), failure reports were analysed, and observed failures of fire detection and alarm systems were classified by severity and detection mode. They are presented here in tabular form for the original and the new addressable systems. Populations were counted individually, but the documents needed were not available for all installations. Therefore, the presented failure frequencies are only first estimates, which will be refined later. (orig.)

  2. The reliability of serogroup determination in the detection of Escherichia coli as a causative agent of sporadic and epidemic occurrence of enterocolitis

    Stojanović Valentina


    Full Text Available The purpose of this study was to determine the presence of virulence factors (heat-labile and heat-stable enterotoxin, verotoxin, invasiveness, localized, aggregative and diffuse adherence) among E. coli strains isolated from sporadic cases and outbreaks of enterocolitis, which belonged to serogroups characteristic of enteropathogenic E. coli. A serogroup was determined in 57.2% of 622 strains isolated from sporadic cases, and among them virulence factors were detected in 23.6%. A serogroup was also determined in 73.3% of the 90 outbreak isolates tested, and virulence factors were detected in 97% of them. The detection rate of virulence factors rarely exceeded 50% among strains belonging to any of the serogroups determined. The obtained data suggest that the identification of E. coli as a causative agent of enterocolitis by serogroup determination is a reliable method in outbreaks, but not in sporadic cases of this disease.

  3. Reliable LC3 and p62 autophagy marker detection in formalin fixed paraffin embedded human tissue by immunohistochemistry.

    Schläfli, A M; Berezowska, S; Adams, O; Langer, R; Tschan, M P


    Autophagy assures cellular homeostasis, and gains increasing importance in cancer, where it impacts on carcinogenesis, propagation of the malignant phenotype and development of resistance. To date, its tissue-based analysis by immunohistochemistry remains poorly standardized. Here we show the feasibility of specifically and reliably assessing the autophagy markers LC3B and p62 (SQSTM1) in formalin fixed and paraffin embedded human tissue by immunohistochemistry. Preceding functional experiments consisted of depleting LC3B and p62 in H1299 lung cancer cells with subsequent induction of autophagy. Western blot and immunofluorescence validated antibody specificity, knockdown efficiency and autophagy induction prior to fixation in formalin and embedding in paraffin. LC3B and p62 antibodies were validated on formalin fixed and paraffin embedded cell pellets of treated and control cells and finally applied on a tissue microarray with 80 human malignant and non-neoplastic lung and stomach formalin fixed and paraffin embedded tissue samples. Dot-like staining of various degrees was observed in cell pellets and 18/40 (LC3B) and 22/40 (p62) tumors, respectively. Seventeen tumors were double positive for LC3B and p62. P62 displayed additional significant cytoplasmic and nuclear staining of unknown significance. Interobserver-agreement for grading of staining intensities and patterns was substantial to excellent (kappa values 0.60 - 0.83). In summary, we present a specific and reliable IHC staining of LC3B and p62 on formalin fixed and paraffin embedded human tissue. Our presented protocol is designed to aid reliable investigation of dysregulated autophagy in solid tumors and may be used on large tissue collectives.

  4. The reliability, minimal detectable change and concurrent validity of a gravity-based bubble inclinometer and iphone application for measuring standing lumbar lordosis.

    Salamh, Paul A; Kolber, Morey


    To investigate the reliability, minimal detectable change (MDC90) and concurrent validity of a gravity-based bubble inclinometer (inclinometer) and an iPhone® application for measuring standing lumbar lordosis. Two investigators used both an inclinometer and an iPhone® with an inclinometer application to measure the lumbar lordosis of 30 asymptomatic participants. ICC models 3,k and 2,k were used for the intrarater and interrater analyses, respectively. Good interrater and intrarater reliability was present for the inclinometer, with Intraclass Correlation Coefficients (ICC) of 0.90 and 0.85, respectively, and for the iPhone® application, with ICC values of 0.96 and 0.81. The minimal detectable change (MDC90) indicates that a change greater than or equal to 7° and 6° is needed to exceed the threshold of error using the iPhone® and inclinometer, respectively. The concurrent validity between the two instruments was good, with a Pearson product-moment coefficient of correlation (r) of 0.86 for both raters. Ninety-five percent limits of agreement identified differences ranging from 9° higher for the iPhone® to 8° lower for the inclinometer. Both the inclinometer and the iPhone® application possess good interrater reliability, intrarater reliability and concurrent validity for measuring standing lumbar lordosis. This investigation provides preliminary evidence to suggest that smartphone applications may offer clinical utility comparable to inclinometry for quantifying standing lumbar lordosis. Clinicians should recognize potential individual differences when using these devices interchangeably.
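
    The reliability coefficients reported above are intraclass correlations under models 2,k and 3,k. A minimal sketch with fabricated long-format data, assuming the pingouin package for the ICC table:

```python
# ICC sketch for two raters measuring lumbar lordosis; data are fabricated
# and the pingouin package is assumed (pip install pingouin).
import pandas as pd
import pingouin as pg

df = pd.DataFrame({
    "subject": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "rater": ["A", "B"] * 6,
    "lordosis_deg": [32, 34, 45, 44, 28, 30, 51, 49, 38, 40, 25, 27],
})

icc = pg.intraclass_corr(data=df, targets="subject",
                         raters="rater", ratings="lordosis_deg")
# ICC2k corresponds to model 2,k (average measures); ICC3k to model 3,k
print(icc[icc["Type"].isin(["ICC2k", "ICC3k"])][["Type", "ICC"]])
```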

  5. EMA-real-time PCR as a reliable method for detection of viable Salmonella in chicken and eggs.

    Wang, Luxin; Mustapha, Azlin


    Culture-based Salmonella detection takes at least 4 d to complete. The use of TaqMan probes allows the real-time PCR technique to be a rapid and sensitive way to detect foodborne pathogens. However, unlike RNA-based PCR, DNA-based PCR techniques cannot differentiate between DNA from live and dead cells. Ethidium bromide monoazide (EMA) is a dye that can bind to DNA of dead cells and prevent its amplification by PCR. An EMA staining step prior to PCR allows for the effective inhibition of false positive results from DNA contamination by dead cells. The aim of this study was to design an accurate detection method that can detect only viable Salmonella cells from poultry products. The sensitivity of EMA staining coupled with real-time PCR was compared to that of an RNA-based reverse transcription (RT)-real-time PCR. To prevent false negative results, an internal amplification control was added to the same reaction mixture as the target Salmonella sequences. With an optimized EMA staining step, the detection range of a subsequent real-time PCR was determined to be 10³ to 10⁹ CFU/mL for pure cultures and 10⁵ to 10⁹ CFU/mL for food samples, which was a wider detection range than for RT-real-time PCR. After a 12-h enrichment step, EMA staining combined with real-time PCR could detect as low as 10 CFU/mL Salmonella from chicken rinses and egg broth. The use of EMA with a DNA-based real-time PCR can successfully prevent false positive results and represents a simple, yet accurate detection tool for enhancing the safety of food.

  6. Fast and simultaneous detection of heavy metals using a simple and reliable microchip-electrochemistry route: An alternative approach to food analysis.

    Chailapakul, Orawon; Korsrisakul, Sarawadee; Siangproh, Weena; Grudpan, Kate


    This paper reports, for the first time, the fast and simultaneous detection of prominent heavy metals, including lead, cadmium and copper, using microchip CE with electrochemical detection. The direct amperometric detection mode for microchip CE was successfully applied to these heavy metal ions. The influences of separation voltage, detection potential, and the concentration and pH value of the running buffer on the response of the detector were carefully assayed and optimized. The results clearly show that reliable electrophoretic separation and analysis of lead, cadmium and copper can be achieved in less than 3 min using a MES buffer (pH 7.0, 25 mM) with L-histidine, a 1.2 kV separation voltage and a -0.8 V detection potential. The detection limits for Pb²⁺, Cd²⁺ and Cu²⁺ were 1.74, 0.73 and 0.13 μM (S/N = 3), respectively. The %R.S.D. values of the peak currents indicated good reproducibility, supporting the method's applicability to food analysis.

  7. MEMS reliability

    Hartzell, Allyson L; Shea, Herbert R


    This book focuses on the reliability and manufacturability of MEMS at a fundamental level. It demonstrates how to design MEMS for reliability and provides detailed information on the different types of failure modes and how to avoid them.

  8. Reliability considerations of NDT by probability of detection (POD). Determination using ultrasound phased array. Results from a project in frame of the German nuclear safety research program

    Kurz, Jochen H. [Fraunhofer-Institut fuer Zerstoerungsfreie Pruefverfahren (IZEP), Saarbruecken (Germany); Dugan, Sandra; Juengert, Anne [Stuttgart Univ. (Germany). Materialpruefungsanstalt (MPA)


    Reliable assessment procedures are an important aspect of maintenance concepts. Non-destructive testing (NDT) methods are an essential part of a variety of maintenance plans. Fracture mechanical assessments require knowledge of flaw dimensions, loads and material parameters. NDT methods are able to acquire information on all of these areas. However, it has to be considered that the level of detail of this information depends on the case investigated and therefore on the applicable methods. Reliability aspects of NDT methods are of importance if quantitative information is required. Different design concepts, e.g. the damage tolerance approach in aerospace, already include reliability criteria for the NDT methods applied in maintenance plans. NDT is also an essential part of construction and maintenance of nuclear power plants. In Germany, the type and extent of inspection are specified in the Safety Standards of the Nuclear Safety Standards Commission (KTA). Only certified inspections are allowed in the nuclear industry. The qualification of NDT is carried out in the form of performance demonstrations by the inspection teams and the equipment, witnessed by an authorized inspector. The results of these tests are mainly statements regarding the detection capabilities of certain artificial flaws. In other countries, e.g. the U.S., additional blind tests on test blocks with hidden and unknown flaws may be required, in which a certain percentage of these flaws has to be detected. The probability of detection (POD) curves of specific flaws under specific testing conditions are often not known. This paper shows the results of a research project designed for POD determination of ultrasound phased array inspections of real and artificial cracks. A further objective of this project was to generate quantitative POD results. The distribution of the crack sizes of the specimens and the inspection planning is discussed, and results of the ultrasound inspections are presented.
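
    Quantitative POD results of this kind are commonly obtained by fitting a hit/miss model in the spirit of MIL-HDBK-1823: logistic regression of the detection outcome on log flaw size, from which characteristic sizes such as a90 are read off. A minimal sketch with fabricated inspection data:

```python
# Hit/miss POD sketch: logistic regression of detection vs. log crack size.
# The inspection data below are fabricated for illustration.
import numpy as np
import statsmodels.api as sm

size = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0])  # mm
hit  = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1])

X = sm.add_constant(np.log(size))
fit = sm.Logit(hit, X).fit(disp=0)
b0, b1 = fit.params

# flaw size with POD(a) = 0.90; a confidence-bounded a90/95 would further
# need the covariance of the parameter estimates
a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)
print(f"a90 = {a90:.2f} mm")
```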

  9. The reliability and accuracy of two methods for proximal caries detection and depth on directly visible proximal surfaces: an in vitro study

    Ekstrand, K R; Alloza, Alvaro Luna; Promisiero, L


    This study aimed to determine the reliability and accuracy of the ICDAS and radiographs in detecting and estimating the depth of proximal lesions on extracted teeth. The lesions were visible to the naked eye. Three trained examiners scored a total of 132 sound/carious proximal surfaces from 106 primary teeth and 160 sound/carious proximal surfaces from 140 permanent teeth. The selected surfaces were first scored visually, using the 7 classes in the ICDAS. They were then assessed on radiographs using a 5-point classification system. Reexaminations were conducted with both scoring systems. Teeth were then sectioned and the selected surfaces histologically classified using a stereomicroscope (×5).

  10. Software reliability

    Bendell, A


    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion of failure rate in software reliability growth models.

  11. A Simple and Reliable Assay for Detecting Specific Nucleotide Sequences in Plants Using Optical Thin-film Biosensor Chips

    S. Bai; X. Zhong; L. Ma; W. Zheng; L. Fan; N. Wei; X.W. Deng


    Here we report the adaptation and optimization of an efficient, accurate and inexpensive assay that employs custom-designed silicon-based optical thin-film biosensor chips to detect unique transgenes in genetically modified (GM) crops and SNP markers in model plant genomes.

  12. Reliability of fluid systems

    Kopáček Jaroslav


    Full Text Available This paper focuses on the importance of assessing reliability, especially in complex fluid systems for demanding production technology. The initial criterion for assessing reliability is the failure of an object (element), which is treated as a random variable whose data (values) can be processed using the mathematical methods of probability theory and statistics. The basic indicators of reliability are defined, along with their application in calculations for serial, parallel and backed-up systems. For illustration, calculation examples of reliability indicators are given for various elements of the system and for a selected pneumatic circuit.
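
    Under the exponential failure law usually assumed in such examples, the basic indicators and the serial/parallel/backed-up configurations reduce to closed-form expressions. A minimal sketch with invented numbers (a two-unit cold standby with an ideal switch is assumed for the backed-up case):

```python
# Worked example of basic reliability indicators under an exponential
# failure law (constant failure rate); all numbers are invented.
import math

lam = 2.0e-5   # failure rate [1/h]
t = 10_000.0   # mission time [h]

R = math.exp(-lam * t)                    # survival probability R(t) = e^(-lam*t)
mtbf = 1.0 / lam                          # mean time between failures
R_series = R ** 2                         # two elements in series
R_parallel = 1.0 - (1.0 - R) ** 2         # two elements in parallel
R_standby = math.exp(-lam * t) * (1.0 + lam * t)  # 2-unit cold standby, ideal switch

print(f"R(t)={R:.4f}  MTBF={mtbf:.0f} h  series={R_series:.4f}  "
      f"parallel={R_parallel:.4f}  standby={R_standby:.4f}")
```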

  13. Fusion of multi-sensory NDT data for reliable detection of surface cracks: Signal-level vs. decision-level

    Heideklang, René; Shokouhi, Parisa


    We present and compare two different approaches for NDT multi-sensor data fusion at the signal (low) and decision (high) levels. Signal-level fusion is achieved by applying simple algebraic rules to strategically post-processed images, either in the original domain or in the domain of a suitable signal transform. The importance of signal normalization for low-level fusion applications is emphasized with regard to heterogeneous NDT data sets. For fusion at the decision level, we develop a procedure based on assembling joint kernel density estimates (KDE). The procedure involves calculating KDEs for individual sensor detections and aggregating them by applying certain combination rules. The underlying idea is that if the detections from more than one sensor fall spatially close to one another, they are likely to result from the presence of a defect. On the other hand, single-sensor detections are more likely to be structural noise or false alarm indications. To this end, we design the KDE combination rules such that they prevent single-sensor domination and allow data-driven scaling to account for the influence of individual sensors. We apply both fusion rules to a three-sensor dataset consisting of ET, MFL/GMR and TT data collected on a specimen with built-in surface discontinuities. The performance of the fusion rules in defect detection is quantitatively evaluated and compared against those of the individual sensors. Both classes of data fusion rules result in a fused image with fewer false alarms and thus improved defect detection. Finally, we discuss the advantages and disadvantages of low-level and high-level NDT data fusion with reference to our experimental results.
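
    The decision-level rule can be illustrated with a small sketch (an illustrative rule, not the authors' exact formulation): estimate a kernel density over each sensor's detection coordinates, normalize per sensor so no single sensor dominates, and keep locations where the summed density shows multi-sensor support. The detection positions below are fabricated:

```python
# Decision-level fusion sketch via per-sensor KDEs over 1-D detection
# positions, normalized, summed, and thresholded for multi-sensor support.
import numpy as np
from scipy.stats import gaussian_kde

et  = np.array([10.1, 10.3, 25.0, 40.2])  # fabricated ET detections [mm]
mfl = np.array([10.2, 40.0, 40.4, 70.5])  # fabricated MFL/GMR detections
tt  = np.array([9.9, 40.1, 55.0])         # fabricated TT detections

x = np.linspace(0.0, 80.0, 801)
densities = [gaussian_kde(d, bw_method=0.15)(x) for d in (et, mfl, tt)]
score = sum(d / d.max() for d in densities)  # per-sensor normalization

mask = score > 1.5  # location must be supported by more than one sensor
idx = np.flatnonzero(mask)
if idx.size:
    clusters = np.split(idx, np.flatnonzero(np.diff(idx) > 1) + 1)
    print("fused defect candidates near:",
          [round(float(x[c].mean()), 1) for c in clusters])  # ~10 mm, ~40 mm
```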

  14. Assessment of Interpersonal Motivation in Transcripts (AIMIT): an inter- and intra-rater reliability study of a new method of detection of interpersonal motivational systems in psychotherapy.

    Fassone, G; Valcella, F; Pallini, S; Scarcella, F; Tombolini, L; Ivaldi, A; Prunetti, E; Manaresi, F; Liotti, G


    Assessing Interpersonal Motivations in Transcripts (AIMIT) is a coding system aimed at systematically detecting the activity of interpersonal motivational systems (IMS) in the therapeutic dialogue. An inter- and intra-rater reliability study was conducted. Sixteen video-recorded psychotherapy sessions were selected and transcribed according to the AIMIT criteria. The sessions relate to 16 patients with an Axis II diagnosis, with a mean Global Assessment of Functioning score of 51. For the intra-rater reliability evaluation, five sessions were selected and assigned to five independent coders, who were asked to make a first evaluation and then a second, independent one 14 days later. For the inter-rater reliability study, the sessions coded by the therapist-coder were jointly revised with another coder and finally classified as the gold standard. The 16 standard sessions were sent to other evaluators for independent coding. Agreement (κ) was estimated for the following parameters for each coding unit: evaluation units supported by the 'codable' activation of one or more IMS; motivational interaction with reference to the ongoing relation between patient and therapist; an interaction between the patient and another person reported/narrated by the patient; and detection of specific IMS: attachment (At), caregiving (CG), rank (Ra), sexuality (Se) and peer cooperation (PC); transitions from one IMS to another were also scored. The intra-rater agreement was evaluated through the parameters 'cod', 'At', 'CG', 'Ra', 'Se' and 'PC' described above. A total of 2443 coding units were analysed. Of the nine parameters on which agreement was calculated, eight ['coded (Cod)', 'ongoing relation (Rel)', 'narrated relation (Nar)', 'At', 'CG', 'Ra', 'Se' and 'PC'] had κ values between 0.62 (CG) and 0.81 (Cod) and were therefore satisfactory. The scoring of 'transitions' showed agreement values slightly below the desired cut-off (0.56). Intra-rater reliability was

  15. Assessing Minimal Detectable Changes and Test-Retest Reliability of the Timed Up and Go Test and the 2-Minute Walk Test in Patients With Total Knee Arthroplasty.

    Yuksel, Ertugrul; Kalkan, Serpil; Cekmece, Senol; Unver, Bayram; Karatosun, Vasfi


    The 2-minute walk test (2MWT) and the Timed Up and Go test (TUG) are simple, quick tests that can be applied as part of a routine medical examination, and they have been shown to be reliable and valid in many patient groups. The aims of the present study were: (1) to determine the test-retest reliability of data for the TUG and 2MWT and (2) to determine minimal detectable change (MDC) scores for the TUG and 2MWT in patients with TKA. Forty-eight patients with total knee arthroplasty (TKA), operated on by the same surgeon, were included in this study. Patients performed trials for the TUG and 2MWT twice on the same day. Between the first and second trials, patients waited for an hour in a sitting position to prevent fatigue. The TUG and 2MWT showed excellent test-retest reliability in this study. The intraclass correlation coefficients [ICC(2,1)] for the TUG and 2MWT were 0.98 and 0.97, respectively. The standard error of measurement and MDC95 for the TUG were 0.82 and 2.27, respectively. The standard error of measurement and MDC95 for the 2MWT were 5.40 and 14.96, respectively. The TUG and 2MWT have excellent test-retest reliability in patients with TKA. Clinicians and researchers can be confident that changes in TUG time above 2.27 seconds and changes in 2MWT distance above 14.96 meters represent a "real" clinical change in an individual patient with TKA. We therefore recommend the use of these two tests as complementary outcome measures for functional evaluation in patients with TKA.
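
    The MDC95 values quoted above are consistent with the standard relation between the MDC and the standard error of measurement (the abstract does not spell out the formula, so this is a reconstruction):

```latex
% MDC at the 95% confidence level from the standard error of measurement
\mathrm{MDC}_{95} = 1.96 \, \sqrt{2} \, \mathrm{SEM}
% TUG:  1.96 \cdot \sqrt{2} \cdot 0.82 \,\mathrm{s} \approx 2.27 \,\mathrm{s}
% 2MWT: 1.96 \cdot \sqrt{2} \cdot 5.40 \,\mathrm{m} \approx 14.97 \,\mathrm{m}
%       (reported as 14.96, consistent up to rounding)
```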

  16. [Government actions for the early detection of breast cancer in Latin America. Future challenges].

    González-Robledo, Luz María; González-Robledo, María Cecilia; Nigenda, Gustavo; López-Carrillo, Lizbeth


    Documentary research carried out in 2009 aimed to document the regulatory framework and existing programs for the early detection of breast cancer in Latin America and the Caribbean, in order to establish the most important challenges for the containment of the epidemic in the region. The governments of the region have developed diverse efforts and initiatives to confront the rise in mortality from this cause, including early detection, treatment and research strategies. Despite advances in the early detection of breast cancer, the challenge remains to link efforts to ensure continuity of care (diagnostic confirmation, treatment and monitoring) in order to achieve greater efficiency, effectiveness and benefit for women with this disease.

  17. Optimization of diagnostic RT-PCR protocols and sampling procedures for the reliable and cost-effective detection of Cassava brown streak virus.

    Abarshi, M M; Mohammed, I U; Wasswa, P; Hillocks, R J; Holt, J; Legg, J P; Seal, S E; Maruthi, M N


    Sampling procedures and diagnostic protocols were optimized for accurate diagnosis of Cassava brown streak virus (CBSV) (genus Ipomovirus, family Potyviridae). A cetyl trimethyl ammonium bromide (CTAB) method was optimized for sample preparation from infected cassava plants and compared with the RNeasy plant mini kit (Qiagen) for sensitivity, reproducibility and cost. CBSV was readily detectable in total RNA extracted using either method. The major difference between the two methods was the cost of consumables, with CTAB ten times cheaper (£0.53 = US$0.80 per sample) than the RNeasy method (£5.91 = US$8.86 per sample). A two-step RT-PCR (£1.34 = US$2.01 per sample), although less sensitive, was at least three times cheaper than a one-step RT-PCR (£4.48 = US$6.72). The two RT-PCR tests consistently revealed the presence of CBSV in both symptomatic and asymptomatic leaves, indicating that asymptomatic leaves can be used reliably for virus diagnosis. Depending on the accuracy required, sampling 100-400 plants per field is an appropriate recommendation for CBSD diagnosis, giving a 99.9% probability of detecting a disease incidence of 6.7-1.7%, respectively. CBSV was detected at 10⁻⁴-fold dilutions in composite sampling, indicating that the most efficient way to index many samples for CBSV will be to screen pooled samples. The diagnostic protocols described are reliable and currently the most cost-effective methods available for detecting CBSV.
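
    The sampling recommendation can be checked with a simple binomial model (assuming independent random sampling of plants): the probability of catching at least one infected plant among n sampled at incidence p is 1-(1-p)^n.

```python
# Check of the sampling recommendation above with a simple binomial model,
# assuming independent random sampling of plants within a field.
import math

def p_detect(n: int, incidence: float) -> float:
    """Probability that at least one of n sampled plants is infected."""
    return 1.0 - (1.0 - incidence) ** n

print(f"{p_detect(100, 0.067):.4f}")  # ~0.999 at 6.7% incidence
print(f"{p_detect(400, 0.017):.4f}")  # ~0.999 at 1.7% incidence

# sample size needed for a 99.9% detection probability at incidence p
p = 0.017
n = math.ceil(math.log(1.0 - 0.999) / math.log(1.0 - p))
print(n)  # ~403 plants, matching the ~400 recommended above
```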

  18. AST Critical Propulsion and Noise Reduction Technologies for Future Commercial Subsonic Engines Area of Interest 1.0: Reliable and Affordable Control Systems

    Myers, William; Winter, Steve


    The General Electric Reliable and Affordable Controls effort under the NASA Advanced Subsonic Technology (AST) Program has designed, fabricated, and tested advanced controls hardware and software to reduce emissions and improve engine safety and reliability. The original effort consisted of four elements: 1) a Hydraulic Multiplexer; 2) Active Combustor Control; 3) a Variable Displacement Vane Pump (VDVP); and 4) Intelligent Engine Control. The VDVP and Intelligent Engine Control elements were cancelled due to funding constraints and are reported here only up to the state to which they progressed. The Hydraulic Multiplexing element developed and tested a prototype which improves reliability by combining the functionality of up to 16 solenoids and servo-valves into one component with a single electrically powered force motor. The Active Combustor Control element developed intelligent staging and control strategies for low-emission combustors. This included development and tests of a Controlled Pressure Fuel Nozzle for fuel sequencing, a Fuel Multiplexer for individual fuel cup metering, and model-based control logic. Both the Hydraulic Multiplexer and the Controlled Pressure Fuel Nozzle system were cleared for engine test. The Fuel Multiplexer was cleared for combustor rig test, which must be followed by an engine test to achieve full maturation.

  19. Analogies Among Current and Future Life Detection Missions and the Pharmaceutical/Biomedical Industries

    Wainwright, N. R.; Steele, A.; Monaco, L.; Fries, M.


    Life detection goals and technologies are remarkably similar between several types of NASA missions and the pharmaceutical and biotechnology industries. Shared needs for sensitivity, specificity and speed have driven techniques and equipment toward common ends.

  20. A Stereo-Vision Based Hazard-Detection Algorithm for Future Planetary Landers

    Woicke, S.; Mooij, E.


    A hazard detection algorithm based on the stereo-vision principle is presented. A sensitivity analysis concerning the minimum baseline and the maximum altitude is discussed, based on which the limitations of this algorithm are investigated.

  1. DCP's Early Detection Research Guides Future Science | Division of Cancer Prevention

    Early detection research funded by the NCI's Division of Cancer Prevention has positively steered both public health and clinical outcomes, and set the stage for findings in the next generation of research.

  2. Validation of non-fluorescent methods to reliably detect acrosomal and plasma membrane integrity of common marmoset (Callithrix jacchus) sperm.

    Valle, R R; Valle, C M R; Nichi, M; Muniz, J A P C; Nayudu, P L; Guimarães, M A B V


    Simple, rapid and stable sperm evaluation methods which have been optimized for common marmoset (Callithrix jacchus) are critical for studies involving collection and evaluation of sperm in the field. This is particularly important for new species groups such as Callitrichidae where the sperm have been little studied. Of this family, C. jacchus is the best known, and has been chosen as a model species for other members of the genus Callithrix. The fundamental evaluation parameters for sperm of any species are viability and acrosomal status. Semen samples were collected by penile vibratory stimulation. To evaluate sperm plasma membrane integrity, Eosin-Nigrosin was tested here for the common marmoset sperm to be used under field conditions. Further, a non-fluorescent stain for acrosome, the "Simple" stain, developed for domestic and wild cats, was tested on common marmoset sperm. This was compared with a fluorescent staining, Fluorescein isothiocyanate-Pisum sativum agglutinin (FITC-PSA), routinely used and validated for common marmoset at the German Primate Centre to evaluate acrosomal integrity. Results obtained with the "Simple" stain showed a marked differentiation between sperm with intact and non-intact acrosome both with and without ionophore treatment and closely correlated with results obtained with FITC-PSA. Temperature had no effect on the results with the "Simple" stain and the complete processing is simple enough to be carried out under field conditions. These findings indicated that the "Simple" stain and Eosin-Nigrosin provide rapid and accurate results for C. jacchus sperm and that those methods can be reliably used as field tools for sperm evaluation for this species.

  3. Fast detection of leakages. Ultrasonic sensors: Fast and reliable detection of leakages in compressed-air networks

    Muench, H.J.; Streuber, W. [Sonotec Ultraschallsensorik GmbH, Halle/Saale (Germany)


    Leakage points can be detected easily at their point of origin, even at high production noise levels. A structure-borne noise probe used instead of an airborne noise probe offers a wider range of applications.

  4. Validity and reliability of methods for the detection of secondary caries around amalgam restorations in primary teeth

    Mariana Minatel Braga


    Full Text Available Secondary caries has been reported as the main reason for restoration replacement. The aim of this in vitro study was to evaluate the performance of different methods - visual inspection, laser fluorescence (DIAGNOdent), radiography and tactile examination - for secondary caries detection in primary molars restored with amalgam. Fifty-four primary molars were photographed and 73 suspect sites adjacent to amalgam restorations were selected. Two examiners evaluated these sites independently using all methods. Agreement between examiners was assessed by the Kappa test. To validate the methods, a caries-detector dye was used after restoration removal. The best cut-off points for the sample were found by a Receiver Operating Characteristic (ROC) analysis, and the area under the ROC curve (Az) and the sensitivity, specificity and accuracy of the methods were calculated for the enamel (D2) and dentine (D3) thresholds. These parameters were found for each method and then compared by the McNemar test. The tactile examination and visual inspection presented the highest inter-examiner agreement for the D2 and D3 thresholds, respectively. The visual inspection also showed better performance than the other methods for both thresholds (Az = 0.861 and Az = 0.841, respectively). In conclusion, visual inspection presented the best performance for detecting enamel and dentin secondary caries in primary teeth restored with amalgam.
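
    The validation quantities used above (Az, and sensitivity/specificity at a cut-off) can be reproduced with standard tooling. A minimal sketch against a dye-based gold standard, with fabricated scores, assuming scikit-learn:

```python
# ROC sketch: Az plus sensitivity/specificity at a chosen cut-off,
# computed against a (fabricated) dye-based gold standard.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

truth  = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1, 0])         # 1 = caries by dye
scores = np.array([5, 12, 8, 20, 35, 28, 40, 22, 50, 30])  # e.g. LF readings

az = roc_auc_score(truth, scores)  # area under the ROC curve

# choose the cut-off whose ROC point lies closest to the top-left corner
fpr, tpr, thr = roc_curve(truth, scores)
best = np.argmin(np.hypot(fpr, 1.0 - tpr))
print(f"Az={az:.2f}  cut-off={thr[best]:.0f}  "
      f"sensitivity={tpr[best]:.2f}  specificity={1.0 - fpr[best]:.2f}")
```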

  5. Photovoltaic system reliability

    Maish, A.B.; Atcitty, C. [Sandia National Labs., NM (United States)]; Greenberg, D. [Ascension Technology, Inc., Lincoln Center, MA (United States)]; and others


    This paper discusses the reliability of several photovoltaic projects including SMUD's PV Pioneer project, various projects monitored by Ascension Technology, and the Colorado Parks project. System times-to-failure range from 1 to 16 years, and maintenance costs range from 1 to 16 cents per kilowatt-hour. Factors contributing to the reliability of these systems are discussed, and practices are recommended that can be applied to future projects. This paper also discusses the methodology used to collect and analyze PV system reliability data.

  6. The reliability and accuracy of two methods for proximal caries detection and depth on directly visible proximal surfaces: an in vitro study.

    Ekstrand, K R; Luna, L E; Promisiero, L; Cortes, A; Cuevas, S; Reyes, J F; Torres, C E; Martignon, S


    This study aimed to determine the reliability and accuracy of the ICDAS and radiographs in detecting and estimating the depth of proximal lesions on extracted teeth. The lesions were visible to the naked eye. Three trained examiners scored a total of 132 sound/carious proximal surfaces from 106 primary teeth and 160 sound/carious proximal surfaces from 140 permanent teeth. The selected surfaces were first scored visually, using the 7 classes in the ICDAS. They were then assessed on radiographs using a 5-point classification system. Reexaminations were conducted with both scoring systems. Teeth were then sectioned and the selected surfaces histologically classified using a stereomicroscope (×5). Intrareproducibility values (weighted kappa statistics) for the ICDAS for both primary and permanent teeth were >0.9, and for the radiographs between 0.6 and 0.8. Interreproducibility values for the ICDAS were >0.85, for the radiographs >0.6. For both primary and permanent teeth, the accuracy of each examiner (Spearman's correlation coefficient) for the ICDAS was ≥0.85, and for the radiographs ≥0.45. Corresponding data were achieved when using pooled data from the 3 examiners for both the ICDAS and the radiographs. The associations between the 2 detection methods were measured to be moderate. In particular, the ICDAS was accurate in predicting lesion depth (histologically) confined to the enamel/outer third of the dentine versus deeper lesions. This study shows that when proximal lesions are open for inspection, the ICDAS is a more reliable and accurate method than the radiograph for detecting and estimating the depth of the lesion in both primary and permanent teeth.

  7. Development of a reliable assay protocol for identification of diseases (RAPID)-bioactive amplification with probing (BAP) for detection of Newcastle disease virus.

    Wang, Chi-Young; Hsu, Chia-Jen; Chen, Heng-Ju; Chulu, Julius L C; Liu, Hung-Jen


    Due to the appearance of new genotypes of Newcastle disease virus (NDV) with no cross-protection from vaccine strains, outbreaks have been reported in Taiwan that caused significant damage to the poultry industry. A reliable assay protocol, (RAPID)-bioactive amplification with probing (BAP), for detection of NDV, which uses a nested PCR and a magnetic bead-based probe to increase sensitivity and specificity, was developed. Primers and probes were designed based on the conserved region of the F protein-encoding gene sequences of all NDV Taiwan isolates. The optimal annealing temperature for nested reverse transcription-polymerase chain reaction (RT-PCR) to amplify the gene was 61 °C, and optimal hybridization occurred with 1x SSC and 0.5% SDS buffer at 50 °C. The sensitivity of RAPID-BAP was 1 copy/μl for standard plasmids and 10 copies/μl for the transcribed F protein-encoding gene of NDV, with comparable linearity (R² = 0.984 versus R² = 0.99). This sensitivity was superior to that of other techniques currently used. The assay was also highly specific, because the negative controls, including classical swine fever virus, avian influenza virus, avian reovirus, and infectious bursal disease virus, could not be detected. Thirty-four field samples were tested using conventional RT-PCR, nested RT-PCR, real-time quantitative RT-PCR, and the RAPID-BAP assay, and the positive rates were 24%, 30%, 41%, and 53%, respectively. The developed assay allows for rapid, correct, and sensitive detection of NDV and fulfils all of the key requirements for clinical applicability. It can reliably rule out false negative results from antibody-based assays and also facilitates rapid diagnosis in the early phase of the disease for emergency quarantine, which may help prevent large-scale outbreaks.

  8. How to reliably detect molecular clusters and nucleation mode particles with Neutral cluster and Air Ion Spectrometer (NAIS)

    Manninen, Hanna E.; Mirme, Sander; Mirme, Aadu; Petäjä, Tuukka; Kulmala, Markku


    To understand the very first steps of atmospheric particle formation and growth processes, information on the sizes at which atmospheric nucleation and cluster activation occur is crucially needed. The current understanding of the concentrations and dynamics of charged and neutral clusters and particles is based on theoretical predictions and experimental observations. This paper gives a standard operating procedure (SOP) for Neutral cluster and Air Ion Spectrometer (NAIS) measurements and data processing. With the NAIS data, we have improved the scientific understanding by (1) directly detecting freshly formed atmospheric clusters and particles, (2) linking experimental observations and the theoretical framework to understand the formation and growth mechanisms of aerosol particles, and (3) parameterizing formation and growth mechanisms for atmospheric models. The SOP provides tools to harmonize the world-wide measurements of small clusters and nucleation mode particles and to verify that NAIS users obtain consistent results. The work is based on discussions and interactions between the NAIS users and the NAIS manufacturer.

  9. In Situ Biological Contamination Studies of the Moon: Implications for Future Planetary Protection and Life Detection Missions

    Glavin, Daniel P.; Dworkin, Jason P.; Lupisella, Mark; Kminek, Gerhard; Rummel, John D.


    NASA and ESA have outlined visions for solar system exploration that will include a series of lunar robotic precursor missions to prepare for, and support a human return to the Moon, and future human exploration of Mars and other destinations. One of the guiding principles for exploration is to pursue compelling scientific questions about the origin and evolution of life. The search for life on objects such as Mars will require that all spacecraft and instrumentation be sufficiently cleaned and sterilized prior to launch to ensure that the scientific integrity of extraterrestrial samples is not jeopardized by terrestrial organic contamination. Under the Committee on Space Research's (COSPAR's) current planetary protection policy for the Moon, no sterilization procedures are required for outbound lunar spacecraft, nor is there yet a planetary protection category for human missions. Future in situ investigations of a variety of locations on the Moon by highly sensitive instruments designed to search for biologically derived organic compounds would help assess the contamination of the Moon by lunar spacecraft. These studies could also provide valuable "ground truth" data for Mars sample return missions and help define planetary protection requirements for future Mars bound spacecraft carrying life detection experiments. In addition, studies of the impact of terrestrial contamination of the lunar surface by the Apollo astronauts could provide valuable data to help refine future Mars surface exploration plans for a human mission to Mars.

  10. Reliability of cortical lesion detection on double inversion recovery MRI applying the MAGNIMS-Criteria in multiple sclerosis patients within a 16-months period

    Thaler, Christian; Ceyrowski, Tim; Broocks, Gabriel; Treffler, Natascha; Sedlacik, Jan; Stürner, Klarissa; Stellmann, Jan-Patrick; Heesen, Christoph; Fiehler, Jens; Siemonsen, Susanne


    Purpose In patients with multiple sclerosis (MS), Double Inversion Recovery (DIR) magnetic resonance imaging (MRI) can be used to identify cortical lesions (CL). We sought to evaluate the reliability of CL detection on DIR longitudinally at multiple subsequent time-points, applying the MAGNIMS scoring criteria for CLs. Methods 26 MS patients received a 3T-MRI (Siemens, Skyra) with DIR at 12 time-points (TP) within a 16-month period. Scans were assessed in random order by two different raters. Both raters separately marked all CLs on each scan, and total lesion numbers were obtained for each scan-TP and patient. After a retrospective re-evaluation, the number of consensus CLs (conL) was defined as the total number of CLs which both raters finally agreed on. CL volumes, relative signal intensities and CL localizations were determined. Both ratings (conL vs. non-consensus scoring) were compared for further analysis. Results A total number of n = 334 CLs were identified by both raters in 26 MS patients, with an initial agreement of both raters on 160 out of the 334 CLs found (κ = 0.48). After the retrospective re-evaluation, consensus agreement increased to 233 out of 334 CLs (κ = 0.69). 93.8% of conL were visible at 2 or more consecutive TP; 74.7% of conL were visible at all 12 consecutive TP. ConL had greater mean lesion volumes and higher mean signal intensities than lesions that were only detected by one of the raters (p<0.05). A higher number of CLs in the frontal, parietal, temporal and occipital lobes were identified by both raters than by one rater only (p<0.05). Conclusions After the first assessment, slightly less than half of the CLs were considered reliably detectable on longitudinal DIR images. A retrospective re-evaluation notably increased the consensus agreement. However, this finding is qualified by the fact that retrospective evaluation steps might not be practicable in clinical routine.

  11. Reliability of Periapical Radiographs and Orthopantomograms in Detection of Tooth Root Protrusion in the Maxillary Sinus: Correlation Results with Cone Beam Computed Tomography

    Bassam A. Hassan


    Objectives: The purpose of the present study was to investigate the reliability of both periapical radiographs and orthopantomograms for exact detection of tooth root protrusion in the maxillary sinus by correlating the results with cone beam computed tomography. Material and methods: A database of 1400 patients scanned with cone beam computed tomography (CBCT) was searched for matching periapical (PA) radiographs and orthopantomogram (OPG) images of maxillary premolars and molars. Matching OPG image datasets of 101 patients with 628 teeth and PA radiograph datasets of 93 patients with 359 teeth were identified. Four observers assessed the relationship between the apex of the tooth root and the maxillary sinus per tooth on PA radiographs, OPG and CBCT images using the following classification: root tip is in the sinus (class 1), root tip is against the sinus wall (class 2) and root tip is not in the sinus (class 3). Results: Overall correlation between OPG and CBCT image scores was 50%, 26% and 56.1% for class 1, class 2 and class 3, respectively (Cohen's kappa [weighted] = 0.1). Overall correlation between PA radiographs and CBCT images was 75.8%, 15.8% and 56.9% for class 1, class 2 and class 3, respectively (Cohen's kappa [weighted] = 0.24). In both the OPG image and the PA radiograph datasets, class 1 correlation was most frequently observed with the first and second molars. Conclusions: The results demonstrated that both periapical radiographs and orthopantomograms are not reliable for determining the exact relationship between the apex of the tooth root and the maxillary sinus floor. Periapical radiography is slightly more reliable than orthopantomography in determining this relationship.

  12. Reliability of periapical radiographs and orthopantomograms in detection of tooth root protrusion in the maxillary sinus: correlation results with cone beam computed tomography.

    Hassan, Bassam A


    The purpose of the present study was to investigate the reliability of both periapical radiographs and orthopantomograms for exact detection of tooth root protrusion in the maxillary sinus by correlating the results with cone beam computed tomography. A database of 1400 patients scanned with cone beam computed tomography (CBCT) was searched for matching periapical (PA) radiographs and orthopantomogram (OPG) images of maxillary premolars and molars. Matching OPG image datasets of 101 patients with 628 teeth and PA radiograph datasets of 93 patients with 359 teeth were identified. Four observers assessed the relationship between the apex of the tooth root and the maxillary sinus per tooth on PA radiographs, OPG and CBCT images using the following classification: root tip is in the sinus (class 1), root tip is against the sinus wall (class 2) and root tip is not in the sinus (class 3). Overall correlation between OPG and CBCT image scores was 50%, 26% and 56.1% for class 1, class 2 and class 3, respectively (Cohen's kappa [weighted] = 0.1). Overall correlation between PA radiographs and CBCT images was 75.8%, 15.8% and 56.9% for class 1, class 2 and class 3, respectively (Cohen's kappa [weighted] = 0.24). In both the OPG image and the PA radiograph datasets, class 1 correlation was most frequently observed with the first and second molars. The results demonstrated that both periapical radiographs and orthopantomograms are not reliable for determining the exact relationship between the apex of the tooth root and the maxillary sinus floor. Periapical radiography is slightly more reliable than orthopantomography in determining this relationship.

  13. Characterization of exoplanet atmospheres using future space-based infrared telescopes: challenges in detecting biomarkers

    Enya, Keigo


    Characterization of exoplanet atmospheres with space-based infrared telescopes is important for detecting biomarkers. A promising method is temporal differential observation. For this method, designs of a wideband infrared spectral disperser are presented; a design using a CdTe prism simultaneously covers λ = 1-30 μm. Designs of binary pupil masks for segmented pupils, to be used in spatially resolved observations, are also shown for another observational method.

  14. Lake ice records used to detect historical and future climatic changes

    Robertson, Dale M.; Ragotzkie, R.A.; Magnuson, John J.


    Historical ice records, such as freeze and breakup dates and the total duration of ice cover, can be used as a quantitative indicator of climatic change if long homogeneous records exist and if the records can be calibrated in terms of climatic changes. Lake Mendota, Wisconsin, has the longest uninterrupted ice records available for any lake in North America, dating back to 1855. These records extend back prior to any reliable air temperature data in the midwestern region of the U.S. and demonstrate significant warming of approximately 1.5 °C in fall and early winter temperatures and 2.5 °C in winter and spring temperatures during the past 135 years. These changes are not completely monotonic, but rather appear as two shorter periods of climatic change in the longer record. The first change was between 1875 and 1890, when fall, winter, and spring air temperatures increased by approximately 1.5 °C. The second change, earlier ice breakup dates since 1979, was caused by a significant increase in winter and early spring air temperatures of approximately 1.3 °C. This change may be indicative of shifts in regional climatic patterns associated with global warming, possibly linked to the 'Greenhouse Effect'.
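
    The quantitative step this record relies on, turning a dated ice record into a climate trend, is essentially a calibrated regression. A minimal sketch on a synthetic breakup-date series (the trend magnitude below is an assumption for illustration):

```python
# Trend estimation for a lake-ice breakup record: regress breakup
# day-of-year on year. The series below is synthetic, not Lake Mendota data.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1855, 1990)
# Synthetic breakup dates: slow advance of ~0.1 day/yr plus interannual noise.
breakup_doy = 100 - 0.1 * (years - years[0]) + rng.normal(0, 7, years.size)

slope, intercept = np.polyfit(years, breakup_doy, 1)
print(f"breakup trend: {slope * 10:.1f} days per decade")
# A calibration step (regressing breakup date on observed air temperature in
# an overlapping period) is then needed to express the trend in degrees C.
```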

  15. Early detection of diabetic kidney disease: Present limitations and future perspectives

    Chih-Hung; Lin; Yi-Cheng; Chang; Lee-Ming; Chuang


    Diabetic kidney disease (DKD) is one of the most common diabetic complications, as well as the leading cause of chronic kidney disease and end-stage renal disease around the world. To prevent these dreadful consequences, the development of new assays for the diagnosis of DKD has always been a priority in the research field of diabetic complications. At present, the urinary albumin-to-creatinine ratio and the estimated glomerular filtration rate (eGFR) are the standard methods for assessing glomerular damage and changes in renal function in clinical practice. However, due to diverse tissue involvement in different individuals, so-called "non-albuminuric renal impairment" is not uncommon, especially in patients with type 2 diabetes. On the other hand, the precision of creatinine-based GFR estimates is limited in hyperfiltration status. These facts make albuminuria and eGFR less reliable indicators for early-stage DKD. In recent years, considerable progress has been made in understanding the pathogenesis of DKD, along with the elucidation of its genetic profiles and the phenotypic expression of different molecules. With the help of ever-evolving technologies, it has gradually become plausible to apply this thriving information in clinical practice. The strengths and weaknesses of several novel biomarkers and genomic, proteomic and metabolomic signatures in assisting the early diagnosis of DKD are discussed in this article.
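
    For context on the creatinine-based estimates the record criticizes, a sketch of one widely used equation, CKD-EPI 2009 (coefficients as published; purely illustrative code, not for clinical use):

```python
# CKD-EPI 2009 creatinine-based eGFR, in mL/min/1.73 m^2.
# Illustrative only; real implementations should be validated clinically.
def egfr_ckd_epi_2009(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
    kappa = 0.7 if female else 0.9     # sex-specific creatinine breakpoint
    alpha = -0.329 if female else -0.411
    egfr = (141.0
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

print(f"{egfr_ckd_epi_2009(1.0, 55, female=True):.0f} mL/min/1.73 m^2")
```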

  16. Development of a sensitive and reliable high performance liquid chromatography method with fluorescence detection for high-throughput analysis of multi-class mycotoxins in Coix seed.

    Kong, Wei-Jun; Li, Jun-Yuan; Qiu, Feng; Wei, Jian-He; Xiao, Xiao-He; Zheng, Yuguo; Yang, Mei-Hua


    As an edible and medicinal plant, Coix seed is readily contaminated by more than one group of mycotoxins resulting in potential risk to human health. A reliable and sensitive method has been developed to determine seven mycotoxins (aflatoxins B1, B2, G1, G2, zearalenone, α-zearalenol, and β-zearalenol) simultaneously in 10 batches of Coix seed marketed in China. The method is based on a rapid ultrasound-assisted solid-liquid extraction (USLE) using methanol/water (80/20) followed by immunoaffinity column (IAC) clean-up, on-line photochemical derivatization (PCD), and high performance liquid chromatography coupled with fluorescence detection (HPLC-FLD). Careful optimization of extraction, clean-up, separation and detection conditions was accomplished to increase sample throughput and to attain rapid separation and sensitive detection. Method validation was performed by analyzing samples spiked at three different concentrations for the seven mycotoxins. Recoveries were from 73.5% to 107.3%, with relative standard deviations (RSDs) lower than 7.7%. The intra- and inter-day precisions, expressed as RSDs, were lower than 4% for all studied analytes. Limits of detection and quantification ranged from 0.01 to 50.2 μg/kg, and from 0.04 to 125.5 μg/kg, respectively, which were below the tolerance levels for mycotoxins set by the European Union. Samples that tested positive were further analyzed by HPLC tandem electrospray ionization mass spectrometry for confirmatory purposes. This is the first application of USLE-IAC-HPLC-PCD-FLD for detecting the occurrence of multi-class mycotoxins in Coix seed.
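
    Limits of detection and quantification of this kind are conventionally derived from the calibration curve as 3.3σ/S and 10σ/S (σ: residual standard deviation, S: slope). A sketch with hypothetical calibration points:

```python
# Calibration-based LOD/LOQ estimates. Calibration points are hypothetical.
import numpy as np

conc = np.array([1, 5, 10, 25, 50, 100.0])               # spiked concentration, ug/kg
peak = np.array([2.1, 10.4, 20.8, 51.5, 103.9, 208.0])   # FLD peak area (illustrative)

slope, intercept = np.polyfit(conc, peak, 1)
residuals = peak - (slope * conc + intercept)
sigma = residuals.std(ddof=2)          # ddof=2: two fitted parameters

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.2f} ug/kg, LOQ = {loq:.2f} ug/kg")
```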

  17. Validity and reliability of a structured interview for early detection and risk assessment of parenting and developmental problems in young children: a cross-sectional study

    van Stel Henk F


    Abstract Background Preventive child health care is well suited for the early detection of parenting and developmental problems. However, as far as the younger age group is concerned, there are no validated early detection instruments which cover both the child and its environment. Therefore, we have developed a broad-scope structured interview which assesses parents' concerns and their need for support, using both the parental perspective and the experience of the child health care nurse: the Structured Problem Analysis of Raising Kids (SPARK). This study reports the psychometric characteristics of the SPARK. Method A cross-sectional study of 2012 18-month-old children living in Zeeland, a province of the Netherlands. Inter-rater reliability was assessed in 67 children. Convergent validity was assessed by comparing SPARK domains with domains in self-report questionnaires on child development and parenting stress. Discriminative validity was assessed by comparing different outcomes of the SPARK between groups with different levels of socio-economic status and by performing an extreme-groups comparison. The user experience of both parents and nurses was assessed with the aid of an online survey. Results The response rate was 92.1% for the SPARK. Self-report questionnaires were returned for 66.9% of the remaining 1721 children. There was selective non-reporting: 33.1% of the questionnaires were not returned, covering 65.2% of the children with a high-risk label according to the SPARK (p ...). Conclusion The SPARK discriminates between children with a high, increased and low risk of parenting and developmental problems. It does so in a reliable way, but more research is needed on aspects of validity and in other populations.

  18. Laying the Groundwork for Future Alma Direct Magnetic Field Detection in Protostellar Environments

    Cox, Erin Guilfoil; Harris, Robert J.; Looney, Leslie; Segura-Cox, Dominique M.; Crutcher, Richard; Li, Zhi-Yun; Tobin, John; Stephens, Ian; Novak, Giles; Fernandez-Lopez, Manuel


    Magnetic fields are a crucial element of the star formation process on many scales: they control jet and outflow formation on large scales, determine the structure of any protostellar disk, and modulate the accretion rate onto the central protostar. Both the three-dimensional structure and the field strength are important in determining the outcome of star formation. Unfortunately, the method most commonly used to infer magnetic field structure - linearly polarized dust continuum emission - is limited to the plane-of-sky field structure and gives no reliable information on field strength. Alternatively, observations of the Zeeman effect in transitions of paramagnetic molecules, especially CN, are one of the best prospects for making such measurements due to the molecules' high Zeeman coefficients. In particular, these observations have been used to determine field strengths on cloud-size scales. However, CN and other paramagnetic molecules have, to our knowledge, never been observed in the envelopes/disks of Class 0 protostars at ~arcsecond resolution, due both to sensitivity and resolution limits of previous generations of millimeter-wave interferometers. Because field strengths near the protostar are so important for understanding the star formation process, we have conducted a snapshot ALMA Band 3 (3 mm / 113 GHz) survey of the 10 brightest Class 0 protostars in the Perseus, Taurus, and ρ Ophiuchus molecular clouds in the regions surrounding five transitions of four paramagnetic species: CN, SO, C2S, and C4H. We present this survey - the principal goal of which was to assess the brightness of the lines within ~1000 AU of the protostar - and assess the likelihood of using ALMA observations of the Zeeman effect to determine protostellar magnetic field strength.
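
    For orientation, the field-strength measurement this survey lays groundwork for rests on a simple relation: the Zeeman frequency splitting between opposite circular polarizations is proportional to the line-of-sight field, Δν ≈ Z·B_los. A sketch with an assumed CN Zeeman factor and a hypothetical measured splitting (both are illustrative numbers, not survey results):

```python
# Back-of-the-envelope line-of-sight field estimate from a Zeeman splitting.
# Z is the Zeeman factor of the transition; the value below is approximately
# that of a strong CN N=1-0 hyperfine component. The splitting is hypothetical.
Z_HZ_PER_MICROGAUSS = 2.18   # Hz per microgauss (assumed Zeeman factor)
delta_nu_hz = 2.5e3          # hypothetical measured Stokes-V splitting, Hz

b_los_microgauss = delta_nu_hz / Z_HZ_PER_MICROGAUSS
print(f"B_los ~ {b_los_microgauss / 1000:.1f} mG")
```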

  19. Reliability Engineering

    Lazzaroni, Massimo


    This book gives a practical guide for designers and users in the Information and Communication Technology context. In particular, in the first section, definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing laboratory tests, puts in evidence the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be...

  20. Detecting peptidic drugs, drug candidates and analogs in sports doping: current status and future directions.

    Thevis, Mario; Thomas, Andreas; Schänzer, Wilhelm


    With the growing availability of mature systems and strategies in biotechnology and the continuously expanding knowledge of cellular processes and involved biomolecules, human sports drug testing has become a considerably complex field in the arena of analytical chemistry. Proving the exogenous origin of peptidic drugs and respective analogs at lowest concentration levels in biological specimens (commonly blood, serum and urine) of rather limited volume is required to pursue an action against cheating athletes. Therefore, approaches employing chromatographic-mass spectrometric, electrophoretic, immunological and combined test methods have been required and developed. These allow detecting the misuse of peptidic compounds of lower (such as growth hormone-releasing peptides, ARA-290, TB-500, AOD-9604, CJC-1295, desmopressin, luteinizing hormone-releasing hormones, synacthen, etc.), intermediate (e.g., insulins, IGF-1 and analogs, 'full-length' mechano growth factor, growth hormone, chorionic gonadotropin, erythropoietin, etc.) and higher (e.g., stamulumab) molecular mass with desired specificity and sensitivity. A gap between the technically possible detection and the day-to-day analytical practice, however, still needs to be closed.

  1. IGF-I abuse in sport: current knowledge and future prospects for detection.

    Guha, Nishan; Sönksen, Peter H; Holt, Richard I G


    As the tests for detecting growth hormone (GH) abuse develop further, it is likely that athletes will turn to doping with insulin-like growth factor-I (IGF-I). IGF-I mediates many of the anabolic actions of growth hormone. It stimulates muscle protein synthesis, promotes glycogen storage and enhances lipolysis, all of which make IGF-I attractive as a potential performance-enhancing agent. Pharmaceutical companies have developed commercial preparations of recombinant human IGF-I (rhIGF-I) for use in disorders of growth. The increased availability of rhIGF-I increases the opportunity for athletes to acquire supplies of the drug on the black market. The long-term effects of IGF-I administration are currently unknown but it is likely that these will be similar to the adverse effects of chronic GH abuse. The detection of IGF-I abuse is a challenge for anti-doping organisations. Research has commenced into the development of a test for IGF-I abuse based on the measurement of markers of GH action. Simultaneously, the effects of rhIGF-I on physical fitness, body composition and substrate utilisation in healthy volunteers are being investigated.

  2. Evaluation of a novel assay for detection of the fetal marker RASSF1A: facilitating improved diagnostic reliability of noninvasive prenatal diagnosis.

    Helen E White

    BACKGROUND: Analysis of cell-free fetal (cff) DNA in maternal plasma is used routinely for non-invasive prenatal diagnosis (NIPD) of fetal sex determination, fetal rhesus D status and some single gene disorders. True positive results rely on detection of the fetal target being analysed. No amplification of the target may be interpreted either as a true negative result or as a false negative result due to the absence or very low levels of cffDNA. The hypermethylated RASSF1A promoter has been reported as a universal fetal marker to confirm the presence of cffDNA. Using methylation-sensitive restriction enzymes, hypomethylated maternal sequences are digested, leaving hypermethylated fetal sequences detectable. Complete digestion of maternal sequences is required to eliminate false positive results. METHODS: cfDNA was extracted from maternal plasma (n = 90) and digested with methylation-sensitive and -insensitive restriction enzymes. Analysis of RASSF1A, SRY and DYS14 was performed by real-time PCR. RESULTS: Hypermethylated RASSF1A was amplified for 79 samples (88%), indicating the presence of cffDNA. SRY real-time PCR results and fetal sex at delivery were 100% accurate. Eleven samples (12%) had no detectable hypermethylated RASSF1A, and 10 of these (91%) had gestational ages of less than 7 weeks 2 days. Six of these samples were male at delivery, five had inconclusive results for SRY analysis and one sample had no amplifiable SRY. CONCLUSION: Use of this assay for the detection of hypermethylated RASSF1A as a universal fetal marker has the potential to improve the diagnostic reliability of NIPD for fetal sex determination and single gene disorders.

  3. The cognitive disorders examination (Codex) is a reliable 3-minute test for detection of dementia in the elderly (validation study on 323 subjects).

    Belmin, Joël; Pariel-Madjlessi, Sylvie; Surun, Philomène; Bentot, Caroline; Feteanu, Dorin; Lefebvre des Noettes, Véronique; Onen, Fannie; Drunat, Olivier; Trivalle, Christophe; Chassagne, Philippe; Golmard, Jean-Louis


    Dementia often remains undiagnosed until it has reached moderate or severe stages, thereby preventing patients and their families from obtaining optimal care. Tools that are easy to use in primary care might facilitate earlier detection of dementia. Our objective was to develop and validate a very brief test for the detection of dementia. In the derivation study, we recorded educational level, Mini Mental State Examination (MMSE) scores and subscores, and the results of a simplified clock-drawing test (sCDT) for consecutive patients attending a single memory clinic over a two-year period. Dementia was diagnosed according to DSM-IV criteria. The independent variables related to dementia were determined by a multivariable logistic model (MLM) and used to develop a decision tree to predict this diagnosis. In the validation study, the decision tree was applied to consecutive patients of six memory clinics for whom dementia status had previously been determined with DSM-IV criteria. The decision tree, MLM, and MMSE were applied to detect dementia in these patients. The sensitivity and specificity of each diagnostic tool were estimated and compared. Among the 242 patients in the derivation study, the following independent variables were correlated with dementia: sex, sCDT, and two MMSE subscores - the 3-word recall test and spatial orientation. We used Bayesian statistics to develop a brief 2-step decision analysis tree (2-3 min), which we named Codex (cognitive disorders examination). The validation study applied Codex to 323 patients. Sensitivity was 93% and specificity 85%. The corresponding values were 88% and 87% for the MLM, and 94% and 67% or 91% and 70% for the MMSE, depending on the MMSE cutoff score. The sensitivity of Codex was significantly higher than that of the MLM, and its specificity was significantly greater than that of the MMSE. Codex is a simple, brief, and reliable test for detecting dementia and requires three minutes or less to administer. Its simplicity and brevity make it appropriate...
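
    To make the shape of such a 2-step decision tree concrete, here is a purely hypothetical sketch built from the variables the abstract names (3-word recall, clock drawing, spatial orientation); the branch order and thresholds below are illustrative assumptions, not the published Codex rules:

```python
# Hypothetical two-step screening rule of the kind described in the record.
# NOT the published Codex specification; thresholds are invented for illustration.
def two_step_screen(recall_3word: int, clock_ok: bool, orientation: int) -> str:
    """recall_3word: 0-3 words recalled; orientation: 0-5 correct answers."""
    # Step 1: concordant easy cases are classified immediately.
    if recall_3word == 3 and clock_ok:
        return "low probability of dementia"
    if recall_3word <= 1 and not clock_ok:
        return "high probability of dementia"
    # Step 2: discordant cases are resolved with the spatial-orientation subscore.
    return ("high probability of dementia" if orientation <= 3
            else "low probability of dementia")

print(two_step_screen(2, clock_ok=False, orientation=3))
```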

  4. Using Open Data to Detect Organized Crime Threats—Factors Driving Future Crime

    This work provides an innovative look at the use of open data for extracting information to detect and prevent crime, and also explores the link between terrorism and organized crime. In counter-terrorism and other forms of crime prevention, foresight about potential threats is vitally important. Such threats, for example communication between organized crime networks and radicalization towards terrorism, are driven by a combination of political, economic, social, technological, legal and environmental factors. The contributions to this volume represent a major step by researchers to systematically collect, filter and interpret open data. The book will be of interest to researchers in counter-terrorism and crime science, as well as to those in related fields such as applications of computer science and data mining, public policy, and business intelligence.

  5. The High Energy cosmic-Radiation Detection (HERD) Facility onboard China's Future Space Station

    Zhang, S N


    The High Energy cosmic-Radiation Detection (HERD) facility is one of several space astronomy payloads of the cosmic lighthouse program onboard China's Space Station, which is planned for operation starting around 2020 for about 10 years. The main scientific objectives of HERD are indirect dark matter search, precise cosmic ray spectrum and composition measurements up to the knee energy, and high energy gamma-ray monitoring and survey. HERD is composed of a 3-D cubic calorimeter (CALO) surrounded by microstrip silicon trackers (STKs) on five sides, all except the bottom. CALO is made of about 10^4 cubes of LYSO crystals, corresponding to about 55 radiation lengths and 3 nuclear interaction lengths, respectively. The top STK microstrips of seven X-Y layers are sandwiched with tungsten converters to make precise directional measurements of incoming electrons and gamma-rays. In the baseline design, each of the four side STKs is made of only three layers of microstrips. All STKs will also be used for measuring the cha...

  6. Reliability, Agreement and Minimal Detectable Change of the Timed Up & Go and the 10-Meter Walk Tests in Older Patients with COPD.

    Marques, Alda; Cruz, Joana; Quina, Sara; Regêncio, Maria; Jácome, Cristina


    This study aimed to determine the interrater and intrarater reliability and agreement and the minimal detectable change (MDC) of the Timed Up & Go (TUG) test and the 10-Meter Walk Test (10MWT) in older patients with Chronic Obstructive Pulmonary Disease (COPD). Patients (≥ 60 years old) living in the community were asked to attend 2 sessions with a 48-72-hour interval. In session 1, participants completed the TUG and 10MWT twice (2 trials) and were assessed by 2 raters. In session 2, they repeated the tests twice and were assessed by 1 rater. Interrater and intrarater reliability were calculated for the exact scores (using data from trial 1) and mean scores (mean of 2 trials) using Intraclass Correlation Coefficients (ICC(2,1) and ICC(2,2), respectively). Interrater and intrarater agreement were explored with the Bland & Altman method. The MDC95 was calculated from the standard error of measurement. Sixty participants (72.43 ± 6.90 years old) completed session 1 and 41 participants completed session 2. Excellent ICC values were found for the TUG test (interrater: ICC(2,1) = 0.997, ICC(2,2) = 0.999; intrarater: ICC(2,1) = 0.921, ICC(2,2) = 0.964) and the 10MWT (interrater: ICC(2,1) = 0.992, ICC(2,2) = 0.997; intrarater: ICC(2,1) = 0.903, ICC(2,2) = 0.946). Good interrater and intrarater agreement was also found for both tests. The MDC95 was 2.68 s and 1.84 s for the TUG and 0.40 m/s and 0.30 m/s for the 10MWT, considering the exact and mean scores, respectively. Findings suggest that the TUG test and the 10MWT are reliable and have acceptable measurement error. Therefore, these measures may be used to assess functional balance (TUG) and gait (10MWT) deficits in older patients with COPD.
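
    The MDC95 values reported here follow from the standard error of measurement: SEM = SD·√(1−ICC) and MDC95 = 1.96·√2·SEM. A minimal sketch with hypothetical TUG times and an assumed ICC:

```python
# Standard error of measurement and MDC95 from test-retest data.
# The TUG times and the ICC below are hypothetical.
import numpy as np

tug_session1 = np.array([9.8, 12.4, 15.1, 11.0, 13.7])  # seconds, illustrative
tug_session2 = np.array([10.1, 12.0, 14.6, 11.4, 13.2])

icc = 0.96  # e.g., an ICC(2,1) obtained from a dedicated routine
pooled_sd = np.concatenate([tug_session1, tug_session2]).std(ddof=1)

sem = pooled_sd * np.sqrt(1.0 - icc)
mdc95 = 1.96 * np.sqrt(2.0) * sem   # smallest change exceeding measurement error
print(f"SEM = {sem:.2f} s, MDC95 = {mdc95:.2f} s")
```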

  7. Arctic sea ice in the PlioMIP ensemble: is model performance for modern climates a reliable guide to performance for the past or the future?

    F. W. Howell


    Eight general circulation models have simulated the mid-Pliocene Warm Period (mPWP, 3.264 to 3.025 Ma) as part of the Pliocene Modelling Intercomparison Project (PlioMIP). Here, we analyse and compare their simulations of Arctic sea ice for both the pre-industrial period and the mid-Pliocene. Mid-Pliocene sea ice thickness and extent are reduced and display greater variability within the ensemble compared to the pre-industrial. This variability is highest in the summer months, when the model spread in the mid-Pliocene is more than three times larger than during the rest of the year. Correlations between mid-Pliocene Arctic temperatures and sea ice extents are almost twice as strong as the equivalent correlations for the pre-industrial simulations. It is suggested that the weaker relationship between pre-industrial Arctic sea ice and temperatures is likely due to the tuning of climate models to achieve an optimal pre-industrial sea ice cover, which may also affect future predictions of Arctic sea ice. Model tuning for the pre-industrial does not appear to be best suited for simulating the different climate state of the mid-Pliocene. This highlights the importance of evaluating climate models through simulation of past climates, and the urgent need for more proxy evidence of sea ice during the Pliocene.

  8. Detecting changes in future precipitation extremes over eight river basins in China using RegCM4 downscaling

    Qin, Peihua; Xie, Zhenghui


    To detect changes in the frequency and intensity of precipitation extremes in China by the middle of the 21st century, simulations were conducted with the regional climate model RegCM4 forced by the global climate model GFDL_ESM2M under the middle emission scenario (RCP4.5). Compared with observed precipitation extremes for the reference period 1982 to 2001, RegCM4 generally performed better than GFDL_ESM2M in most river basins of China. In the future period 2032-2051, more wet extremes will occur relative to the present period in most study areas, especially in southeast China, while significantly fewer dry extremes will occur in arid and semiarid areas in northwest China. In contrast, areas in northwest China showed an increasing trend in dry extremes (CDD) and a decreasing trend in wet extremes (R95p and Rx5day), which might result in more drought in the future. Finally, we discuss in detail the possible reasons for these changes, such as zonal wind, vertical wind, and water vapor. In the Huaihe river basin (HU), reduced south winds in summer (June-August) and a decrease in the upward vertical p-velocity cause less future precipitation and might lead to changes in extreme events. We also performed correlation analysis between the precipitation extreme indices and the climate factors and found that the precipitation extremes were more sensitive to the annual and seasonal mean precipitation, total water vapor, and upward vertical wind than to the geopotential height and 2 m temperature over most river basins in China. The changes in some wet extremes can be partly verified through changes in annual precipitation, owing to their high consistency.
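
    Two of the indices named here, CDD and Rx5day, are simple functions of a daily precipitation series; a sketch on synthetic data (the 1 mm dry-day threshold follows the usual ETCCDI definitions):

```python
# CDD (longest run of dry days, < 1 mm) and Rx5day (max consecutive 5-day
# total) computed from a synthetic daily precipitation series.
import itertools
import numpy as np

rng = np.random.default_rng(1)
pr = rng.gamma(shape=0.4, scale=8.0, size=365)  # mm/day, illustrative year
pr[rng.random(365) < 0.55] = 0.0                # impose dry days

dry = pr < 1.0
# CDD: length of the longest consecutive stretch of dry days.
cdd = max((sum(1 for _ in g) for flag, g in itertools.groupby(dry) if flag), default=0)

# Rx5day: maximum 5-day running precipitation total.
rx5day = max(pr[i:i + 5].sum() for i in range(len(pr) - 4))

print(f"CDD = {cdd} days, Rx5day = {rx5day:.1f} mm")
```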

  9. Mathematical reliability an expository perspective

    Mazzuchi, Thomas; Singpurwalla, Nozer


    In this volume consideration was given to more advanced theoretical approaches and novel applications of reliability to ensure that topics having a futuristic impact were specifically included. Topics like finance, forensics, information, and orthopedics, as well as the more traditional reliability topics were purposefully undertaken to make this collection different from the existing books in reliability. The entries have been categorized into seven parts, each emphasizing a theme that seems poised for the future development of reliability as an academic discipline with relevance. The seven parts are networks and systems; recurrent events; information and design; failure rate function and burn-in; software reliability and random environments; reliability in composites and orthopedics, and reliability in finance and forensics. Embedded within the above are some of the other currently active topics such as causality, cascading, exchangeability, expert testimony, hierarchical modeling, optimization and survival...

  10. Good interrater reliability of a new grading system in detecting traumatic bone marrow lesions in the knee by dual energy CT virtual non-calcium images

    Cao, Jian-xin; Wang, Yi-min [Department of Radiology, Wuhan 161th Hospital, 68 Huangpu Road, Wuhan 430010 (China); Kong, Xiang-quan, E-mail: [Department of Radiology, Union Hospital Affiliated to Tongji Medical College, Huazhong University of Science and Technology, 1277 Jiefang Avenue, Wuhan 430022 (China); Yang, Cheng; Wang, Peng [Department of Radiology, Wuhan 161th Hospital, 68 Huangpu Road, Wuhan 430010 (China)


    ...upper end of the tibia were 91.0%, 100.0%, 100.0%, and 95.4%, respectively. The CT values of bone marrow were (−52.5 ± 31.3) HU in positive areas and (−91.2 ± 16.9) HU in negative areas for the lower end of the femur, and (−51.3 ± 30.2) HU in positive areas and (−104.7 ± 17.5) HU in negative areas for the upper end of the tibia (all p values < 0.0001). The areas under the ROC curves of VNCa images for detection of BMLs were 0.875 for the lower end of the femur and 0.939 for the upper end of the tibia. Conclusion: Good interrater reliability of this new grading system for detecting traumatic BMLs in the knee on VNCa images of DECT can be obtained, with good diagnostic predictive values.
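
    The reported AUC can be reproduced approximately from the summary statistics alone by sampling the two HU distributions; a sketch (synthetic samples, normal-distribution assumption; only the means and SDs come from the record's tibia values):

```python
# Approximate ROC AUC from the quoted HU summary statistics for the tibia.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
positive = rng.normal(-51.3, 30.2, 200)   # HU, BML-positive areas (tibia)
negative = rng.normal(-104.7, 17.5, 200)  # HU, BML-negative areas (tibia)

scores = np.concatenate([positive, negative])
labels = np.concatenate([np.ones(200), np.zeros(200)])
print(f"AUC ~ {roc_auc_score(labels, scores):.3f}")  # record reports 0.939
```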

  11. Microelectronics Reliability


    Presents accelerated testing for reliability prediction of devices exhibiting multiple failure mechanisms, together with an integrated accelerating and measuring approach (a T, V, F test matrix versus measured FIT).

  12. Point-Counterpoint: Can Newly Developed, Rapid Immunochromatographic Antigen Detection Tests Be Reliably Used for the Laboratory Diagnosis of Influenza Virus Infections?


    Five years ago, the Point-Counterpoint series was launched. The initial article asked about the role of rapid immunochromatographic antigen testing in the diagnosis of influenza A virus 2009 H1N1 infection (D. F. Welch and C. C. Ginocchio, J Clin Microbiol 48:22–25, 2010). Since that article, not only have major changes been made in immunochromatographic antigen detection (IAD) testing for the influenza viruses, but there has also been rapid development of commercially available nucleic acid amplification tests (NAATs) for influenza virus detection. Further, a novel variant of influenza A, H7N9, has emerged in Asia, and H5N1 is also reemergent. In that initial article, the editor of this series, Peter Gilligan, identified two issues that required further consideration. One was how well IAD tests worked in clinical settings, especially in times of antigen drift and shift. The other was the role of future iterations of influenza NAATs and whether this testing would be available in a community hospital setting. James Dunn, who is Director of Medical Microbiology and Virology at Texas Children's Hospital, has extensive experience using IAD tests for diagnosing influenza. He will discuss the application and value of these tests in influenza diagnosis. Christine Ginocchio, who recently retired as the Senior Medical Director, Division of Infectious Disease Diagnostics, North Shore-LIJ Health System, and now is Vice President for Global Microbiology Affairs at bioMérieux, Durham, NC, wrote the initial counterpoint in this series, where she advocated the use of NAATs for influenza diagnosis. She will update us on the commercially available NAAT systems and explain what their role should be in the diagnosis of influenza infection. PMID:25274999

  13. Detection of heavy charged Higgs bosons in $e^+e^- \to t\bar{b}H^-$ production at future Linear Colliders

    Moretti, S


    Heavy charged Higgs bosons ($H^\pm$) of a Type II 2-Higgs Doublet Model (2HDM) can be detected at future electron-positron Linear Colliders (LCs) even when their mass is larger than half the collider energy. The single Higgs mode $e^+e^- \to t\bar{b}H^- + \mathrm{c.c.} \to 4b + \mathrm{jj} + \ell + p_T^{\mathrm{miss}}$ (where j represents a jet and with $\ell = e, \mu$) contributes to extend the discovery reach of $H^\pm$ states into the mass region $M_{H^\pm} \gtrsim \sqrt{s}/2$, where the well studied pair production channel $e^+e^- \to H^-H^+$ is no longer available. With a technique that allows one to reconstruct the neutrino four-momentum in the decay $t \to bW^+ \to b\ell^+\nu$, one can suppress the main irreducible background due to $e^+e^- \to t\bar{t}b\bar{b}$ (via a gluon splitting into $b\bar{b}$ pairs) to a negligible level. We prove that one can establish a statistically significant $H^\pm$ signal over a region of several tens of GeV beyond $M_{H^\pm} \approx \sqrt{s}/2$, as long as $\tan\beta \gtrsim 30$.

  14. Making statistical inferences about software reliability

    Miller, Douglas R.


    Failure times of software undergoing random debugging can be modelled as order statistics of independent but nonidentically distributed exponential random variables. Using this model inferences can be made about current reliability and, if debugging continues, future reliability. This model also shows the difficulty inherent in statistical verification of very highly reliable software such as that used by digital avionics in commercial aircraft.
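
    A minimal simulation of the model described, with each fault's detection time drawn from its own exponential distribution (the rates are hypothetical; equal rates recover a Jelinski-Moranda-style model):

```python
# Failure times as order statistics of independent, non-identically
# distributed exponentials. Per-fault rates below are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
rates = np.array([2.0, 1.5, 1.0, 0.7, 0.4, 0.2])  # per-fault detection rates (1/day)

# Each fault's detection time is exponential; sorting gives the failure epochs.
detection_times = rng.exponential(1.0 / rates)
failure_epochs = np.sort(detection_times)
print("observed failure times (days):", np.round(failure_epochs, 2))

# Current reliability after debugging to time t: only undetected faults remain.
t = 1.0
remaining = rates[detection_times > t]
print(f"residual failure rate at t={t}: {remaining.sum():.2f}/day")
```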

  15. The Future of Futures

    Frankel, Christian; Ossandón, José


    Review of Elena Esposito: The Future of Futures. The Time of Money in Financing and Society. Cheltenham: Edward Elgar, 2011.

  16. Grid reliability

    Saiz, P; Rocha, R; Andreeva, J


    We are offering a system to track the efficiency of different components of the GRID. We can study the performance of both the WMS and the data transfers. At the moment, we have set up different parts of the system for ALICE, ATLAS, CMS and LHCb. None of the components that we have developed are VO specific, therefore it would be very easy to deploy them for any other VO. Our main goal is basically to improve the reliability of the GRID. The main idea is to discover as soon as possible the different problems that have happened, and inform the responsible parties. Since we study the jobs and transfers issued by real users, we see the same problems that users see. As a matter of fact, we see even more problems than the end user does, since we are also interested in following up the errors that GRID components can overcome by themselves (for instance, in case of a job failure, resubmitting the job to a different site). This kind of information is very useful to site and VO administrators. They can find out the efficien...

  17. Accelerator Availability and Reliability Issues

    Steve Suhring


    Maintaining reliable machine operations for existing machines as well as planning for future machines' operability present significant challenges to those responsible for system performance and improvement. Changes to machine requirements and beam specifications often reduce overall machine availability in an effort to meet user needs. Accelerator reliability issues from around the world will be presented, followed by a discussion of the major factors influencing machine availability.

  18. Software Reliability Experimentation and Control

    Kai-Yuan Cai


    This paper classifies software research as theoretical research, experimental research, and engineering research, and is mainly concerned with experimental research, with a focus on software reliability experimentation and control. The state of the art of experimental or empirical studies is reviewed. A new experimentation methodology is proposed, which is largely oriented toward theory discovery. Several unexpected results of experimental studies are presented to justify the importance of software reliability experimentation and control. Finally, a few topics that deserve future investigation are identified.

  19. Grid reliability

    Saiz, P.; Andreeva, J.; Cirstoiu, C.; Gaidioz, B.; Herrala, J.; Maguire, E. J.; Maier, G.; Rocha, R.


    Thanks to the Grid, users have access to computing resources distributed all over the world. The Grid hides the complexity and the differences of its heterogeneous components. In such a distributed system, it is clearly very important that errors are detected as soon as possible, and that the procedure to solve them is well established. We focused on two of its main elements: the workload and the data management systems. We developed an application to investigate the efficiency of the different centres. Furthermore, our system can be used to categorize the most common error messages, and control their time evolution.


    B.Anni Princy


    A software reliability growth model treats failures as the outcome of a random process driven by two factors: emerging faults and initial state values. The prevailing approach uses a logistic testing-effort function for building efficient software models on real-time datasets. The shortcomings of the logistic approach are effectively overcome by the Pareto distribution. The proposed framework provides a systematic technique for analyzing candidate distributions and selecting the best fit for a software reliability growth model; its parameters are estimated to evaluate the reliability of a software system. The resulting process permits software reliability estimates that can be used both as a quality indicator and for planning and controlling resources and development times, enabling efficient computation and reliable measurement of a software system.

  1. Validity and reliability of 3D US for the detection of erosions in patients with rheumatoid arthritis using MRI as the gold standard

    Ellegaard, K; Bliddal, H; Møller Døhn, U


    PURPOSE: To test the reliability and validity of a 3D US erosion score in RA using MRI as the gold standard. MATERIALS AND METHODS: RA patients were examined with 3D US and 3 T MRI over the 2nd and 3rd metacarpophalangeal joints. 3D blocks were evaluated by two investigators. The erosions were...

  2. Power electronics reliability.

    Kaplar, Robert James; Brock, Reinhard C.; Marinella, Matthew; King, Michael Patrick; Stanley, James K.; Smith, Mark A.; Atcitty, Stanley


    The project's goals are: (1) use experiments and modeling to investigate and characterize stress-related failure modes of post-silicon power electronic (PE) devices such as silicon carbide (SiC) and gallium nitride (GaN) switches; and (2) seek opportunities for condition monitoring (CM) and prognostics and health management (PHM) to further enhance the reliability of power electronics devices and equipment. CM - detect anomalies and diagnose problems that require maintenance. PHM - track damage growth, predict time to failure, and manage subsequent maintenance and operations in such a way to optimize overall system utility against cost. The benefits of CM/PHM are: (1) operate power conversion systems in ways that will preclude predicted failures; (2) reduce unscheduled downtime and thereby reduce costs; and (3) pioneering reliability in SiC and GaN.

  3. Frontiers of reliability

    Basu, Asit P; Basu, Sujit K


    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibull...

  4. Computed tomography for the detection of distal radioulnar joint instability: normal variation and reliability of four CT scoring systems in 46 patients

    Wijffels, Mathieu; Krijnen, Pieta; Schipper, Inger [Leiden University Medical Center, Department of Surgery-Trauma Surgery, P.O. Box 9600, Leiden (Netherlands); Stomp, Wouter; Reijnierse, Monique [Leiden University Medical Center, Department of Radiology, P.O. Box 9600, Leiden (Netherlands)


    The diagnosis of distal radioulnar joint (DRUJ) instability is clinically challenging. Computed tomography (CT) may aid in the diagnosis, but the reliability and normal variation for DRUJ translation on CT have not been established in detail. The aim of this study was to evaluate inter- and intraobserver agreement and normal ranges of CT scoring methods for determination of DRUJ translation in both posttraumatic and uninjured wrists. Patients with a conservatively treated, unilateral distal radius fracture were included. CT scans of both wrists were evaluated independently, by two readers using the radioulnar line method, subluxation ratio method, epicenter method and radioulnar ratio method. The inter- and intraobserver agreement was assessed and normal values were determined based on the uninjured wrists. Ninety-two wrist CTs (mean age: 56.5 years, SD: 17.0, mean follow-up 4.2 years, SD: 0.5) were evaluated. Interobserver agreement was best for the epicenter method [ICC = 0.73, 95 % confidence interval (CI) 0.65-0.79]. Intraobserver agreement was almost perfect for the radioulnar line method (ICC = 0.82, 95 % CI 0.77-0.87). Each method showed a wide normal range for normal DRUJ translation. Normal range for the epicenter method is -0.35 to -0.06 in pronation and -0.11 to 0.19 in supination. DRUJ translation on CT in pro- and supination can be reliably evaluated in both normal and posttraumatic wrists, however with large normal variation. The epicenter method seems the most reliable. Scanning of both wrists might be helpful to prevent the radiological overdiagnosis of instability. (orig.)

  5. Delta-Reliability

    Eugster, P.; Guerraoui, R.; Kouznetsov, P.


    This paper presents a new, non-binary measure of the reliability of broadcast algorithms, called Delta-Reliability. This measure quantifies the reliability of practical broadcast algorithms that, on the one hand, were devised with some form of reliability in mind, but, on the other hand, are not considered reliable according to the ``traditional'' notion of broadcast reliability [HT94]. Our specification of Delta-Reliability suggests a further step towards bridging the gap between theory and...

  6. Software Reliability through Theorem Proving

    S.G.K. Murthy


    Improving the software reliability of mission-critical systems is widely recognised as one of the major challenges. Early detection of errors in software requirements, designs and implementations needs rigorous verification and validation techniques. Several techniques comprising static and dynamic testing approaches are used to improve the reliability of mission-critical software; however, it is hard to balance development time and budget with software reliability. Particularly when using dynamic testing techniques, it is hard to ensure software reliability, as exhaustive testing is not possible. On the other hand, formal verification techniques utilise mathematical logic to prove the correctness of the software based on given specifications, which in turn improves its reliability. Theorem proving is a powerful formal verification technique that enhances software reliability for mission-critical aerospace applications. This paper discusses the issues related to software reliability and the use of theorem proving to enhance it through formal verification, based on experiences with the STeP tool, using the conventional and internationally accepted methodologies, models and theorem-proving techniques available in the tool, without proposing a new model. Defence Science Journal, 2009, 59(3), pp. 314-317, DOI:

  7. Reliability computation from reliability block diagrams

    Chelson, P. O.; Eckstein, E. Y.


    Computer program computes system reliability for very general class of reliability block diagrams. Four factors are considered in calculating probability of system success: active block redundancy, standby block redundancy, partial redundancy, and presence of equivalent blocks in the diagram.
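
    Of the four factors listed, active redundancy is the easiest to make concrete: series blocks multiply reliabilities, parallel (redundant) blocks multiply unreliabilities. A minimal sketch with hypothetical block reliabilities (standby and partial redundancy need additional machinery):

```python
# Series/parallel reduction for a simple reliability block diagram.
# Block reliabilities are hypothetical.
def series(*r: float) -> float:
    """System works only if every block works."""
    out = 1.0
    for x in r:
        out *= x
    return out

def parallel(*r: float) -> float:
    """System works if at least one of the redundant blocks works."""
    out = 1.0
    for x in r:
        out *= (1.0 - x)
    return 1.0 - out

# Example diagram: A in series with (B parallel C) in series with D.
r_sys = series(0.99, parallel(0.90, 0.90), 0.98)
print(f"system reliability = {r_sys:.4f}")  # 0.99 * 0.99 * 0.98 ~ 0.9605
```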

  8. Reliability of immunohistochemical demonstration of oestrogen receptors in routine practice: interlaboratory variance in the sensitivity of detection and evaluation of scoring systems

    RHODES A.; Jasani, B; Barnes, D; Bobrow, L; Miller, K


    Aims—To investigate interlaboratory variance in the immunohistochemical (IHC) detection of oestrogen receptors so as to determine the rate of false negatives, which could adversely influence the decision to give adjuvant tamoxifen treatment.

  9. Numerical and Structural Genomic Aberrations Are Reliably Detectable in Tissue Microarrays of Formalin-Fixed Paraffin-Embedded Tumor Samples by Fluorescence In-Situ Hybridization: e95047

    Heike Horn; Julia Bausinger; Annette M Staiger; Maximilian Sohn; Christopher Schmelter; Kim Gruber; Claudia Kalla; M Michaela Ott; Andreas Rosenwald; German Ott


    ...), especially for chromosomal deletions, in high-throughput settings using tissue microarrays (TMAs). We performed a comprehensive FISH study for the detection of chromosomal translocations and deletions in formalin-fixed and paraffin-embedded...

  10. Numerical and structural genomic aberrations are reliably detectable in tissue microarrays of formalin-fixed paraffin-embedded tumor samples by fluorescence in-situ hybridization

    Horn, Heike; Bausinger, Julia; Staiger, Annette M; Sohn, Maximilian; Schmelter, Christopher; Gruber, Kim; Kalla, Claudia; Ott, M Michaela; Rosenwald, Andreas; Ott, German


    ...), especially for chromosomal deletions, in high-throughput settings using tissue microarrays (TMAs). We performed a comprehensive FISH study for the detection of chromosomal translocations and deletions in formalin-fixed and paraffin-embedded...

  11. Non-invasive aneuploidy detection using free fetal DNA and RNA in maternal plasma: recent progress and future possibilities.

    Go, A.T.; Vugt, J.M.G. van; Oudejans, C.B.


    BACKGROUND: Cell-free fetal DNA (cff DNA) and RNA can be detected in maternal plasma and used for non-invasive prenatal diagnostics. Recent technical advances have led to a drastic change in the clinical applicability and potential uses of free fetal DNA and RNA. This review summarizes the latest cl

  12. Solid State Lighting Reliability Components to Systems

    Fan, XJ


    Solid State Lighting Reliability: Components to Systems begins with an explanation of the major benefits of solid state lighting (SSL) when compared to conventional lighting systems including but not limited to long useful lifetimes of 50,000 (or more) hours and high efficacy. When designing effective devices that take advantage of SSL capabilities the reliability of internal components (optics, drive electronics, controls, thermal design) take on critical importance. As such a detailed discussion of reliability from performance at the device level to sub components is included as well as the integrated systems of SSL modules, lamps and luminaires including various failure modes, reliability testing and reliability performance. This book also: Covers the essential reliability theories and practices for current and future development of Solid State Lighting components and systems Provides a systematic overview for not only the state-of-the-art, but also future roadmap and perspectives of Solid State Lighting r...

  13. VLSI Reliability in Europe

    Verweij, Jan F.


    Several issues regarding VLSI reliability research in Europe are discussed. Organizations involved in stimulating the activities on reliability by exchanging information or supporting research programs are described. Within one such program, ESPRIT, a technical interest group on IC reliability was...

  14. Complete validation of a unique digestion assay to detect Trichinella larvae in horsemeat demonstrates its reliability for meeting food safety and trade requirements.

    A tissue digestion assay using a double separatory funnel (DSF) procedure for the detection of Trichinella larvae in horsemeat was validated for application in food safety programs and trade. It consisted of a pepsin-HCl digestion step to release larvae from muscle tissue followed by two sequential ...

  15. Reliability of nucleic acid amplification methods for detection of Chlamydia trachomatis in urine: results of the first international collaborative quality control study among 96 laboratories

    R.P.A.J. Verkooyen (Roel); G.T. Noordhoek; P.E. Klapper; J. Reid; J. Schirm; G.M. Cleator; M. Ieven; G. Hoddevik


    textabstractThe first European Quality Control Concerted Action study was organized to assess the ability of laboratories to detect Chlamydia trachomatis in a panel of urine samples by nucleic acid amplification tests (NATs). The panel consisted of lyophilized urine samples, includ

  16. Section on prospects for dark matter detection of the white paper on the status and future of ground-based TeV gamma-ray astronomy.

    Byrum, K.; Horan, D.; Tait, T.; Wanger, R.; Zaharijas, G.; Buckley , J.; Baltz, E. A.; Bertone, G.; Dingus, B.; Fegan, S.; Ferrer, F.; Gondolo, P.; Hall, J.; Hooper, D.; Horan, D.; Koushiappas, S.; Krawczynksi, H.; LeBohec, S.; Pohl, M.; Profumo, S.; Silk , J; Vassilev, V.; Wood , M.; Wakely, S.; High Energy Physics; FNAL; Univ. of St. Louis; Stanford Univ.; Insti. d' Astrophysique; LANL; Univ. of California; Washington Univ.; Univ. of Utah; Brown Univ.; Oxford Univ.; Iowa State Univ.; Univ. of Chicago


    This is a report on the findings of the dark matter science working group for the white paper on the status and future of TeV gamma-ray astronomy. The white paper was commissioned by the American Physical Society, and the full white paper can be found on astro-ph (arXiv:0810.0444). This detailed section discusses the prospects for dark matter detection with future gamma-ray experiments, and the complementarity of gamma-ray measurements with other indirect, direct or accelerator-based searches. We conclude that any comprehensive search for dark matter should include gamma-ray observations, both to identify the dark matter particle (through the characteristics of the gamma-ray spectrum) and to measure the distribution of dark matter in galactic halos.

  17. Reliability engineering in solar energy: workshop proceedings

    Gross, G.


    A workshop to reveal the scope of reliability-related activities in solar energy conversion projects and in nonsolar segments of industry is described. Two reliability programs, one in heating and cooling and one in photovoltaics, are explicated. This document also presents general suggestions for the establishment of a unified program for reliability, durability, maintainability, and safety (RDM and S) in present and future solar projects.

  19. Reliability assessment of wave Energy devices

    Ambühl, Simon; Kramer, Morten; Kofoed, Jens Peter


    Energy from waves may play a key role in sustainable electricity production in the future. Optimal reliability levels for components used for Wave Energy Devices (WEDs) need to be defined to be able to decrease their cost of electricity. Optimal reliability levels can be found using probabilistic...

  20. Leprosy New Case Detection Trends and the Future Effect of Preventive Interventions in Pará State, Brazil: A Modelling Study.

    de Matos, Haroldo José; Blok, David J; de Vlas, Sake J; Richardus, Jan Hendrik


    Leprosy remains a public health problem in Brazil. Although the overall number of new cases is declining, there are still areas with a high disease burden, such as Pará State in the north of the country. We aim to predict future trends in new case detection rate (NCDR) and explore the potential impact of contact tracing and chemoprophylaxis on NCDR in Pará State. We used SIMCOLEP, an existing individual-based model for the transmission and control of M. leprae, in a population structured by households. The model was quantified to simulate the population and observed NCDR of leprosy in Pará State for the period 1990 to 2014. The baseline scenario was the current control program, consisting of multidrug therapy, passive case detection, and active case detection from 2003 onwards. Future projections of the NCDR were made until 2050 given the continuation of the current control program (i.e. baseline). We further investigated the potential impact of two scenarios for future control of leprosy: 1) discontinuation of contact tracing; and 2) continuation of current control in combination with chemoprophylaxis. Both scenarios started in 2015 and were projected until 2050. The modelled NCDR in Pará State after 2014 shows a continuous downward trend, reaching the official elimination target of 10 cases per 100,000 population by 2030. The cessation of systematic contact tracing would not result in a higher NCDR in the long run. Systematic contact tracing in combination with chemoprophylaxis for contacts would reduce the NCDR by 40% and bring attainment of the elimination target two years forward to 2028. The NCDR of leprosy continues to decrease in Pará State. Elimination of leprosy as a public health problem could possibly be achieved around 2030, if the current control program is maintained. Providing chemoprophylaxis would decrease the NCDR further and would bring elimination forward by two years.
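
    The projections above come from SIMCOLEP, an individual-based transmission model; as a far cruder illustration of trend extrapolation only, the following Python sketch projects an exponentially declining NCDR. Both the starting rate and the decline rate are invented, tuned merely so the assumed series reaches the 10 per 100,000 target around 2030.

```python
import math

def project_ncdr(ncdr_start, annual_decline, years_elapsed):
    """Exponential extrapolation of the new case detection rate (NCDR).

    A crude stand-in for the SIMCOLEP model; ncdr_start and
    annual_decline are assumed, illustrative values."""
    return ncdr_start * math.exp(-annual_decline * years_elapsed)

# Assumed: ~40 cases per 100,000 in 2014, declining ~8.75% per year
for year in (2020, 2030, 2040, 2050):
    ncdr = project_ncdr(40.0, 0.0875, year - 2014)
    print(year, round(ncdr, 1))  # reaches ~10/100,000 around 2030
```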

  1. C. Diff Quik Chek complete enzyme immunoassay provides a reliable first-line method for detection of Clostridium difficile in stool specimens.

    Quinn, Criziel D; Sefers, Susan E; Babiker, Wisal; He, Ying; Alcabasa, Romina; Stratton, Charles W; Carroll, Karen C; Tang, Yi-Wei


    We evaluated a single membrane device assay for simultaneously detecting both Clostridium difficile glutamate dehydrogenase (GDH) and toxin A/B antigens against a standard that combines two PCR assays and cytotoxigenic culture. Results showing dual GDH and toxin A/B antigen positives and negatives can be reported immediately as true positives and negatives, respectively. Specimens with discrepant results for GDH and toxins A/B, which comprised 13.2% of the specimens, need to be retested.

  2. Panel-based next generation sequencing as a reliable and efficient technique to detect mutations in unselected patients with retinal dystrophies

    Glöckle, Nicola; Kohl, Susanne; Mohr, Julia; Scheurenbrand, Tim; Sprecher, Andrea; Weisschuh, Nicole; Bernd, Antje; Rudolph, Günther; Schubach, Max; Poloschek, Charlotte; Zrenner, Eberhart; Biskup, Saskia; Berger, Wolfgang; Wissinger, Bernd; Neidhardt, John


    Hereditary retinal dystrophies (RD) constitute a group of blinding diseases that are characterized by clinical variability and pronounced genetic heterogeneity. The different forms of RD can be caused by mutations in >100 genes, including >1600 exons. Consequently, next generation sequencing (NGS) technologies are among the most promising approaches to identify mutations in RD. So far, NGS is not routinely used in gene diagnostics. We developed a diagnostic NGS pipeline to identify mutations in 170 genetically and clinically unselected RD patients. NGS was applied to 105 RD-associated genes. Underrepresented regions were examined by Sanger sequencing. The NGS approach was successfully established using cases with known sequence alterations. Depending on the initial clinical diagnosis, we identified likely causative mutations in 55% of retinitis pigmentosa and 80% of Bardet–Biedl or Usher syndrome cases. Seventy-one novel mutations in 40 genes were newly associated with RD. The genes USH2A, EYS, ABCA4, and RHO were more frequently affected than others. Occasionally, cases carried mutations in more than one RD-associated gene. In addition, we found possible dominant de-novo mutations in cases with sporadic RD, which implies consequences for counseling of patients and families. NGS-based mutation analyses are reliable and cost-efficient approaches in gene diagnostics of genetically heterogeneous diseases like RD. PMID:23591405

  3. Split-bolus single-phase cardiac multidetector computed tomography for reliable detection of left atrial thrombus. Comparison to transesophageal echocardiography

    Staab, W.; Zwaka, P.A.; Sohns, J.M.; Schwarz, A.; Lotz, J. [University Medical Center Goettingen Univ. (Germany). Inst. for Diagnostic and Interventional Radiology; Sohns, C.; Vollmann, D.; Zabel, M.; Hasenfuss, G. [Goettingen Univ. (Germany). Dept. of Cardiology and Pneumology; Schneider, S. [Goettingen Univ. (Germany). Dept. of Medical Statistics


    Evaluation of a new cardiac MDCT protocol using a split-bolus contrast injection and a single MDCT scan for reliable diagnosis of LA/LAA thrombi in comparison to TEE, optimizing radiation exposure and contrast agent use. A total of 182 consecutive patients with drug-refractory AF scheduled for PVI (62.6% male, mean age: 64.1 ± 10.2 years) underwent routine diagnostic workup, including TEE and cardiac MDCT, for the evaluation of LA/LAA anatomy and thrombus formation between November 2010 and March 2012. Contrast media injection was split into a pre-bolus of 30 ml and a main bolus of 70 ml of iodinated contrast agent separated by a short time delay. In this study, split-bolus cardiac MDCT identified 14 of 182 patients with filling defects of the LA/LAA. In all 14 of these patients, abnormalities were found on TEE. All 5 of the 14 patients with thrombus formation on cardiac MDCT were confirmed by TEE. MDCT was 100% accurate for thrombus, with strong but not perfect overall agreement for the SEC equivalent on MDCT.

  4. Using a thermoluminescent dosimeter to evaluate the location reliability of the highest-skin dose area detected by treatment planning in radiotherapy for breast cancer.

    Sun, Li-Min; Huang, Chih-Jen; Chen, Hsiao-Yun; Meng, Fan-Yun; Lu, Tsung-Hsien; Tsao, Min-Jen


    Acute skin reaction during adjuvant radiotherapy for breast cancer is an inevitable process, and its severity is related to the skin dose. A high-skin-dose area can be inferred from the isodose distribution shown on a treatment plan. To determine whether treatment planning can reflect the high-skin-dose location, 80 patients were enrolled and their skin doses in different areas were measured using a thermoluminescent dosimeter to locate the highest-skin-dose area in each patient. We determined whether the measured skin dose was consistent with the highest-dose area estimated from the treatment plan of the same patient. The χ² and Fisher exact tests revealed that these two methods yielded more consistent results when the highest-dose spots were located in the axillary and breast areas but not in the inframammary area. We suggest that skin doses shown on the treatment plan might be a reliable and simple alternative method for estimating the highest skin doses in some areas. Copyright © 2014 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.

  5. Reliable SERS detection of nitrite based on pH and laser irradiance-dependent diazotization through a convenient sampling micro-chamber.

    Gao, Mengyue; Fang, Wei; Ren, Jiaqiang; Shen, Aiguo; Hu, Jiming


    Nitrites (NO2- ions) in food and drink play an important role in human health but require complicated sample preparation before detection. Herein, we present a rationally designed SERS-enabled micro-chamber comprising a drawn glass capillary with a tiny orifice (∼50 μm) at the distal tip, in which gold nanoparticles (Au NPs) are compactly coated on the inner wall surface. In this chamber, nitrites specifically trigger a pH- and laser irradiance-dependent diazotization starting from p-aminothiophenol (PATP) adsorbed on the surface of the Au NPs to form p,p'-dimercaptoazobenzene (DMAB), so that the presence of NO2- ions above 30.7 μM (1.38 ppm) in the siphoned liquid sample can be identified from the intensity of the SERS peak (1141 cm-1) of the emerging azo moiety. Besides pH, laser irradiance is an important but easily neglected factor in previous studies; controlling it prevents the errors that arise when detection sensitivity is pursued by increasing the laser power. Several real samples (rather than simple water samples), including honey, pickled vegetable and fermented bean curd, were detected accurately through this convenient sampling micro-chamber. The SERS-enabled device could readily be incorporated with portable Raman instruments for food inspection in rapid, in-field analysis of NO2- ions.

  6. Lifetime and Spectral Evolution of a Magma Ocean with a Steam Atmosphere: Its Detectability by Future Direct Imaging

    Hamano, Keiko; Kawahara, Hajime; Abe, Yutaka [Department of Earth and Planetary Science, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Onishi, Masanori [Department of Earth and Planetary Sciences, Kobe University, 1-1 Rokkodai-cho, Nada, Kobe 657-8501 (Japan); Hashimoto, George L. [Department of Earth Sciences, Okayama University, 3-1-1 Tsushima-Naka, Kita, Okayama, 700-8530 (Japan)]


    We present the thermal evolution and emergent spectra of solidifying terrestrial planets along with the formation of steam atmospheres. The lifetime of a magma ocean and its spectra through a steam atmosphere depend on the orbital distance of the planet from the host star. For a Type I planet, which is formed beyond a certain critical distance from the host star, the thermal emission declines on a timescale shorter than approximately 10^6 years. Therefore, young stars should be targets when searching for molten planets in this orbital region. In contrast, a Type II planet, which is formed inside the critical distance, will emit significant thermal radiation from near-infrared atmospheric windows during the entire lifetime of the magma ocean. The Ks and L bands will be favorable for future direct imaging because the planet-to-star contrasts in these bands are higher than approximately 10^-7 to 10^-8. Our model predicts that, in the Type II orbital region, molten planets would be present over the main sequence of the G-type host star if the initial bulk content of water exceeds approximately 1 wt%. In visible atmospheric windows, the contrasts of the thermal emission drop below 10^-10 in less than 10^5 years, whereas those of the reflected light remain at 10^-10 for both types of planets. Since this contrast level is comparable to that of reflected light from Earth-sized planets in the habitable zone, the visible reflected light from molten planets also provides a promising target for direct imaging with future ground- and space-based telescopes.

  7. Software reliability models for critical applications

    Pham, H.; Pham, M.


    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc., Software Reliability Research Program. The program studies existing software reliability models and proposes a state-of-the-art software reliability model relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of the proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  9. Bead-based immunoassay allows sub-picogram detection of histidine-rich protein 2 from Plasmodium falciparum and estimates reliability of malaria rapid diagnostic tests

    Rogier, Eric; Plucinski, Mateusz; Lucchi, Naomi; Mace, Kimberly; Chang, Michelle; Lemoine, Jean Frantz; Candrinho, Baltazar; Colborn, James; Dimbu, Rafael; Fortes, Filomeno; Udhayakumar, Venkatachalam; Barnwell, John


    Detection of histidine-rich protein 2 (HRP2) from the malaria parasite Plasmodium falciparum provides evidence for active or recent infection, and is utilized for both diagnostic and surveillance purposes, but current laboratory immunoassays for HRP2 are hindered by low sensitivities and high costs. Here we present a new HRP2 immunoassay based on antigen capture through a bead-based system capable of detecting HRP2 at sub-picogram levels. The assay is highly specific and cost-effective, allowing fast processing and screening of large numbers of samples. We utilized the assay to assess results of HRP2-based rapid diagnostic tests (RDTs) in different P. falciparum transmission settings, generating estimates for true performance in the field. Through this method of external validation, HRP2 RDTs were found to perform well in the high-endemic areas of Mozambique and Angola with 86.4% and 73.9% of persons with HRP2 in their blood testing positive by RDTs, respectively, and false-positive rates of 4.3% and 0.5%. However, in the low-endemic setting of Haiti, only 14.5% of persons found to be HRP2 positive by the bead assay were RDT positive. Additionally, 62.5% of Haitians showing a positive RDT test had no detectable HRP2 by the bead assay, likely indicating that these were false positive tests. In addition to RDT validation, HRP2 biomass was assessed for the populations in these different settings, and may provide an additional metric by which to estimate P. falciparum transmission intensity and measure the impact of interventions. PMID:28192523

  10. Reliability Generalization: "Lapsus Linguae"

    Smith, Julie M.


    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  11. Reliability Characteristics of Power Plants

    Zbynek Martinek


    This paper describes the reliability of power plants. It explains the terms connected with this topic, as their proper understanding is important for understanding the relations and equations that model possible real situations. The reliability phenomenon is analysed using both the exponential distribution and the Weibull distribution. The results of our analysis are specific equations giving information about the characteristics of the power plants, the mean time of operation and the probability of failure-free operation. Equations solved for the Weibull distribution take into account the failures as well as the actual operating hours. Thanks to our results, we are able to create a model of dynamic reliability for predicting future states. It can be useful for improving the current situation of the unit as well as for creating an optimal maintenance plan, and thus have an impact on the overall economics of the operation of these power plants.
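
    As a rough illustration of the two lifetime models named above, this Python sketch, with invented parameters, evaluates the exponential and Weibull probability of failure-free operation and the Weibull mean time to failure; none of the values are taken from the paper.

```python
import math

def reliability_exponential(t, failure_rate):
    """Probability of failure-free operation up to time t, constant hazard."""
    return math.exp(-failure_rate * t)

def reliability_weibull(t, shape, scale):
    """Weibull survival function R(t) = exp(-(t/scale)**shape)."""
    return math.exp(-(t / scale) ** shape)

def mttf_weibull(shape, scale):
    """Mean time to failure of a Weibull-distributed lifetime."""
    return scale * math.gamma(1.0 + 1.0 / shape)

# Illustrative (assumed) parameters for a generating unit:
t = 8760.0  # one year of operating hours
print(reliability_exponential(t, failure_rate=2e-5))   # ~0.84
print(reliability_weibull(t, shape=1.8, scale=40000))  # ~0.94
print(mttf_weibull(shape=1.8, scale=40000))            # ~35,600 h
```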

  12. Making the most of RNA-seq: Pre-processing sequencing data with Opossum for reliable SNP variant detection [version 2; referees: 2 approved, 1 approved with reservations

    Laura Oikkonen


    Identifying variants from RNA-seq (transcriptome sequencing) data is a cost-effective and versatile complement to whole-exome (WES) and whole-genome sequencing (WGS) analysis. RNA-seq is primarily considered a method of gene expression analysis, but it can also be used to detect DNA variants in expressed regions of the genome. However, current variant callers do not generally behave well with RNA-seq data due to reads encompassing intronic regions. We have developed a software programme called Opossum to address this problem. Opossum pre-processes RNA-seq reads prior to variant calling, and although it has been designed to work specifically with Platypus, it can be used equally well with other variant callers such as GATK HaplotypeCaller. In this work, we show that using Opossum in conjunction with either Platypus or GATK HaplotypeCaller maintains precision and improves the sensitivity for SNP detection compared to the GATK Best Practices pipeline. In addition, using it in combination with Platypus offers a substantial reduction in run times compared to the GATK pipeline, so it is ideal when only limited time or computational resources are available.

  13. Lifetime and Spectral Evolution of a Magma Ocean with a Steam Atmosphere: Its Detectability by Future Direct Imaging

    Hamano, Keiko; Abe, Yutaka; Onishi, Masanori; Hashimoto, George L


    We present the thermal evolution and emergent spectra of solidifying terrestrial planets along with the formation of steam atmospheres. The lifetime of a magma ocean and its spectra through a steam atmosphere depends on the orbital distance of the planet from the host star. For a type-I planet, which is formed beyond a certain critical distance from the host star, the thermal emission declines on a timescale shorter than approximately $10^6$ years. Therefore, young stars should be targets when searching for molten planets in this orbital region. In contrast, a type-II planet, which is formed inside the critical distance, will emit significant thermal radiation from near-infrared atmospheric windows during the entire lifetime of the magma ocean. The Ks and L bands will be favorable for future direct imaging because the planet-to-star contrasts of these bands are higher than approximately 10$^{-7}$-10$^{-8}$. Our model predicts that, in the type-II orbital region, molten planets would be present over the main s...

  14. Development of an acoustic sensor for the future IceCube-Gen2 detector for neutrino detection and position calibration

    Wickmann, Stefan; Eliseev, Dmitry; Heinen, Dirk; Linder, Peter; Rongen, Martin; Scholz, Franziska; Weinstock, Lars Steffen; Wiebusch, Christopher; Zierke, Simon


    For the planned high-energy extension of the IceCube Neutrino Observatory in the glacial ice at the South Pole, the spacing of detector modules will be increased with respect to IceCube. Because of these larger distances, the quality of the geometry calibration based on pulsed light sources is expected to deteriorate. To counter this, an independent acoustic geometry calibration system based on trilateration is introduced. Such an acoustic positioning system (APS) has already been developed for the Enceladus Explorer (EnEx) project, initiated by the DLR Space Administration. In order to integrate APS sensors into the IceCube detector, the power consumption needs to be minimized. In addition, the frequency response of the front-end electronics is optimized for positioning as well as for the acoustic detection of neutrinos. The new design of the acoustic sensor and results of test measurements with an IceCube detector module are presented.

  15. Temperature Detection Method of Blast Furnace Burden Surface Based on the Reliability of Multi-source Information

    An Jianqi; Wu Min; He Yong; Cao Weihua


    Focusing on the difficulty of accurately detecting blast furnace (BF) burden surface temperature, a novel online temperature detection method based on the reliability of multi-source information is proposed. First, the burden surface temperature is estimated separately according to the individual features of three kinds of heterogeneous measurement information; then the BF burden surface temperature is calculated by fusing the three single-source estimates using reliability (credibility) theory. Application to a 2,200 m3 BF at a steel enterprise shows that the proposed method can detect the burden surface temperature accurately in real time, providing a new solution for status monitoring of complex metallurgical processes.
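
    The abstract does not reproduce the credibility calculus itself, so the following Python sketch only illustrates the general idea of reliability-weighted fusion of heterogeneous estimates; the temperature values and credibility scores are invented.

```python
def fuse_estimates(estimates, reliabilities):
    """Fuse single-source estimates into one value, weighting each
    source by its (assumed) normalized reliability score."""
    total = sum(reliabilities)
    return sum(r / total * x for r, x in zip(reliabilities, estimates))

# Three heterogeneous burden-surface temperature estimates (invented):
temps = [455.0, 470.0, 462.0]   # deg C, one estimate per information source
scores = [0.90, 0.60, 0.75]     # credibility assigned to each source
print(fuse_estimates(temps, scores))  # reliability-weighted temperature
```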

  16. Does functional MRI detect activation in white matter? A review of emerging evidence, issues, and future directions

    Jodie Reanna Gawryluk


    Functional magnetic resonance imaging (fMRI) is a non-invasive technique that allows for visualization of activated brain regions. Until recently, fMRI studies have focused on gray matter. There are two main reasons white matter fMRI remains controversial: (1) the blood oxygen level dependent (BOLD) fMRI signal depends on cerebral blood flow and volume, which are lower in white matter than gray matter, and (2) the fMRI signal has been associated with post-synaptic potentials (mainly localized in gray matter) as opposed to action potentials (the primary type of neural activity in white matter). Despite these observations, there is no direct evidence against measuring fMRI activation in white matter, and reports of fMRI activation in white matter continue to increase. The questions underlying white matter fMRI activation are important. White matter fMRI activation has the potential to greatly expand the breadth of brain connectivity research, as well as improve the assessment and diagnosis of white matter and connectivity disorders. The current review provides an overview of the motivation to investigate white matter fMRI activation, as well as the published evidence of this phenomenon. We speculate on possible neurophysiologic bases of white matter fMRI signals, and discuss potential explanations for why reports of white matter fMRI activation are relatively scarce. We end with a discussion of future basic and clinical research directions in the study of white matter fMRI.

  17. Single Molecule Fluorescence Detection and Tracking in Mammalian Cells: The State-of-the-Art and Future Perspectives

    David T. Clarke


    Insights from single-molecule tracking in mammalian cells have the potential to greatly contribute to our understanding of the dynamic behavior of many protein families and networks that are key therapeutic targets of the pharmaceutical industry. This is particularly so at the plasma membrane, where the method has begun to elucidate the mechanisms governing the molecular interactions that underpin many fundamental processes within the cell, including signal transduction, receptor recognition and cell-cell adhesion. However, despite much progress, single-molecule tracking faces challenges in mammalian samples that hinder its general application in the biomedical sciences. Much work has recently focused on improving the methods for fluorescent tagging of target molecules, detection and localization of tagged molecules, which appear as diffraction-limited spots in charge-coupled device (CCD) images, and objectively establishing the correspondence between moving particles in a sequence of image frames to follow their diffusive behavior. In this review we outline the state of the art in the field and discuss the advantages and limitations of the available methods in the context of specific applications, aiming to help researchers unfamiliar with single-molecule methods plan their experiments.

  18. Assessing the reliability of self-reported weight for the management of heart failure: application of fraud detection methods to a randomised trial of telemonitoring.

    Steventon, Adam; Chaudhry, Sarwat I; Lin, Zhenqiu; Mattera, Jennifer A; Krumholz, Harlan M


    Since clinical management of heart failure relies on weights that are self-reported by the patient, errors in reporting will negatively impact the ability of health care professionals to offer timely and effective preventive care. Errors might often result from rounding, or more generally from individual preferences for numbers ending in certain digits, such as 0 or 5. We apply fraud detection methods to assess preferences for numbers ending in these digits in order to inform medical decision making. The Telemonitoring to Improve Heart Failure Outcomes trial tested an approach to telemonitoring that used existing technology; intervention patients (n = 826) were asked to measure their weight daily using a digital scale and to relay measurements using their telephone keypads. First, we estimated the number of weights subject to end-digit preference by dividing the weights by five and comparing the resultant distribution with the uniform distribution. Then, we assessed the characteristics of patients reporting an excess number of weights ending in 0 or 5, adjusting for chance reporting of these values. Of the 114,867 weight readings reported during the trial, 18.6% were affected by end-digit preference, and the likelihood of these errors occurring increased with the number of days that had elapsed since trial enrolment (odds ratio per day: 1.002, p telemonitoring system over the six-month trial period (95% CI, 2.3 to 3.5), compared with 2.3 for other patients (95% CI, 2.2 to 2.5). As well as overshadowing clinically meaningful changes in weight, end-digit preference can lead to false alerts to telemonitoring systems, which may be associated with unnecessary treatment and alert fatigue. In this trial, end-digit preference was common and became increasingly so over time. By applying fraud detection methods to electronic medical data, it is possible to produce clinically significant information that can inform the design of initiatives to improve the accuracy of
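
    A minimal Python sketch of the remainder-distribution test described above, assuming integer-pound readings and a uniform null distribution of remainders modulo five; the synthetic data and the 20% rounding share are illustrative only, not values from the trial.

```python
import numpy as np
from scipy.stats import chisquare

def end_digit_preference(weights):
    """Estimate end-digit preference for 0/5 in reported weights.

    Weights are reduced modulo 5; with no digit preference the five
    remainder classes are uniform, so an excess in class 0 signals
    rounding to values ending in 0 or 5."""
    remainders = np.rint(weights).astype(int) % 5
    observed = np.bincount(remainders, minlength=5)
    stat, p = chisquare(observed)               # H0: uniform remainders
    excess = observed[0] - observed[1:].mean()  # estimated affected count
    return observed, excess, p

# Synthetic example: 20% of reporters round to the nearest 5 lb
rng = np.random.default_rng(0)
true = rng.normal(180, 25, size=5000)
rounded = np.where(rng.random(5000) < 0.2, 5 * np.round(true / 5), np.round(true))
print(end_digit_preference(rounded))
```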

  19. Making the most of RNA-seq: Pre-processing sequencing data with Opossum for reliable SNP variant detection [version 1; referees: 2 approved, 1 approved with reservations

    Laura Oikkonen


    Identifying variants from RNA-seq (transcriptome sequencing) data is a cost-effective and versatile alternative to whole-genome sequencing. However, current variant callers do not generally behave well with RNA-seq data due to reads encompassing intronic regions. We have developed a software programme called Opossum to address this problem. Opossum pre-processes RNA-seq reads prior to variant calling, and although it has been designed to work specifically with Platypus, it can be used equally well with other variant callers such as GATK HaplotypeCaller. In this work, we show that using Opossum in conjunction with either Platypus or GATK HaplotypeCaller maintains precision and improves the sensitivity for SNP detection compared to the GATK Best Practices pipeline. In addition, using it in combination with Platypus offers a substantial reduction in run times compared to the GATK pipeline, so it is ideal when only limited time or computational resources are available.

  20. Assuring reliability program effectiveness.

    Ball, L. W.


    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.
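
    As one concrete reading of the first tasks listed (organizing part failure rates and predicting new-product reliability), here is a minimal Python sketch of the standard parts-count series model, plus the effect of functional redundancy; all failure rates are assumed values, not taken from the paper.

```python
import math

def series_reliability(failure_rates_per_hour, mission_hours):
    """Parts-count series model: R(t) = exp(-t * sum(lambda_i))."""
    return math.exp(-mission_hours * sum(failure_rates_per_hour))

def parallel_redundancy(r_unit):
    """Functional redundancy: two independent units in parallel."""
    return 1.0 - (1.0 - r_unit) ** 2

# Assumed per-part failure rates (failures per hour):
parts = [2e-7, 5e-7, 1e-6, 3e-7]
r = series_reliability(parts, mission_hours=10000)
print(r)                        # single-string reliability
print(parallel_redundancy(r))   # improvement from redundancy
```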

  1. The Accelerator Reliability Forum

    Lüdeke, Andreas; Giachino, R


    A high reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution describes the forum and advertises its usage in the community.

  2. Human- and computer-accessible 2D correlation data for a more reliable structure determination of organic compounds. Future roles of researchers, software developers, spectrometer managers, journal editors, reviewers, publisher and database managers toward artificial-intelligence analysis of NMR spectra.

    Jeannerat, Damien


    The introduction of a universal data format to report the correlation data of 2D NMR spectra such as COSY, HSQC and HMBC spectra will have a large impact on the reliability of structure determination of small organic molecules. These lists of assigned cross peaks will bridge the signals found in 1D and 2D NMR spectra and the assigned chemical structure. The record could be very compact and human- and computer-readable, so that it can be included in the supplementary material of publications and easily transferred into databases of scientific literature and chemical compounds. The records will allow authors, reviewers and future users to test the consistency and, in favorable situations, the uniqueness of the assignment of the correlation data to the associated chemical structures. Ideally, the data format of the correlation data should include direct links to the NMR spectra to make it possible to validate their reliability and allow direct comparison of spectra. In order to take full benefit of their potential, the correlation data and the NMR spectra should therefore accompany any manuscript in the review process and be stored in an open-access database after publication. Keeping all NMR spectra, correlation data and assigned structures together at all times will allow the future development of validation tools, increasing the reliability of past and future NMR data. This will facilitate the development of artificial-intelligence analysis of NMR spectra by providing a source of data that can be used efficiently because they have been validated or can be validated by future users. Copyright © 2016 John Wiley & Sons, Ltd.
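
    The paper argues for such a format without prescribing one here; the following Python sketch shows one hypothetical JSON-style encoding of assigned HSQC/HMBC cross peaks. All field names and values are invented for illustration and do not represent a published standard.

```python
import json

# Hypothetical, minimal record of assigned 2D correlation data.
record = {
    "structure": {"smiles": "CCO", "name": "ethanol"},
    "spectra": {"hsqc": "spectra/hsqc_001.jdx"},  # link back to raw data
    "correlations": [
        # experiment, F2 shift (1H, ppm), F1 shift (13C, ppm), assignment
        {"expt": "HSQC", "f2": 3.66, "f1": 58.3, "atoms": ["H2", "C2"]},
        {"expt": "HSQC", "f2": 1.17, "f1": 18.2, "atoms": ["H1", "C1"]},
        {"expt": "HMBC", "f2": 1.17, "f1": 58.3, "atoms": ["H1", "C2"]},
    ],
}
print(json.dumps(record, indent=2))  # compact, human- and computer-readable
```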

  3. Revisiting the STEC testing approach: using espK and espV to make enterohemorrhagic Escherichia coli (EHEC) detection more reliable in beef

    Sabine eDelannoy


    Current methods for screening enterohemorrhagic Escherichia coli (EHEC) O157 and non-O157 in beef enrichments typically rely on the molecular detection of stx, eae, and serogroup-specific wzx or wzy gene fragments. As these genetic markers can also be found in some non-EHEC strains, a number of 'false positive' results are obtained. Here, we explore the suitability of five novel molecular markers, espK, espV, ureD, Z2098, and CRISPRO26:H11, as candidates for a more accurate screening of EHEC strains of greater clinical significance in industrialized countries. Of the 1,739 beef enrichments tested, 180 were positive for both stx and eae genes. Ninety (50%) of these tested negative for espK, espV, ureD, and Z2098, but twelve of these negative samples were positive for the CRISPRO26:H11 marker specific for a newly emerging virulent EHEC O26:H11 French clone. We show that screening for stx, eae, espK, and espV, in association with the CRISPRO26:H11 marker, is a better approach to narrow down the EHEC screening step in beef enrichments. The number of potentially positive samples was reduced by 48.88% by means of this alternative strategy compared to the European and American reference methods, thus substantially improving the discriminatory power of EHEC screening systems. This approach is in line with the EFSA (European Food Safety Authority) opinion on pathogenic STEC published in 2013.

  4. Validation of sensitivity and reliability of GPR and microgravity detection of underground cavities in complex urban settings: Test case of a cellar

    Chromčák, Jakub; Grinč, Michal; Pánisová, Jaroslava; Vajda, Peter; Kubová, Anna


    We test here the feasibility of ground-penetrating radar (GPR) and microgravity methods in identifying underground voids, such as cellars, tunnels and abandoned mine workings, in complex urban conditions. For this purpose, we selected a cellar located under a private lot in a residential quarter of the town of Senec in Western Slovakia, which was discovered by chance when a small sinkhole developed in the yard just two meters away from the house. The size of our survey area was limited 1) by the presence of a technical room built at the back of the yard with a staircase leading to the garden, and 2) by the small width of the lot. Therefore the geophysical survey was carried out only in the backyard of the lot, as we were not permitted to measure on neighbouring estates. The results of the GPR measurements, obtained with the GSSI SIR-3000 system and a 400 MHz antenna, were visualized in the form of 2D radargrams with the corresponding transformed velocity model of the studied cross-sections. Only the profiles running over the pavement next to the house yielded interpretable data, because the local geological situation and the regular watering of the lawn covering mainly the backyard caused significant attenuation of the emitted GPR signal. The Bouguer gravity map is dominated by a distinctive negative anomaly indicating the presence of a shallow underground void. Quantitative interpretation by means of Euler deconvolution was used to validate the depth of the center and the location of the cellar. Comparison with the gravitational effect of the cellar model calculated in the in-house program Polygrav shows a quite good correlation between the modelled and observed fields. Only part of the areal extent of the anomaly could be traced by the geophysical methods used, due to accessibility issues. Nevertheless, the test cellar was successfully detected and interpreted by both methods, thus confirming their applicability in similar environmental and geotechnical
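
    As a sense check of the kind of signal involved, the following Python sketch forward-models the vertical gravity anomaly of a buried spherical void, a textbook first-order stand-in for the cellar. The depth, radius and density contrast are assumed values, and this is not the polygon-based Polygrav model used in the study.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_gravity_anomaly(x, depth, radius, density_contrast):
    """Vertical gravity effect (microGal) of a buried sphere at offsets x (m)."""
    mass = density_contrast * (4.0 / 3.0) * np.pi * radius**3
    gz = G * mass * depth / (x**2 + depth**2) ** 1.5
    return gz * 1e8  # convert m/s^2 to microGal

# Assumed cellar-like parameters: air-filled void, 1.5 m radius, 3 m deep
x = np.linspace(-10.0, 10.0, 11)
print(sphere_gravity_anomaly(x, depth=3.0, radius=1.5, density_contrast=-1800.0))
# Peak anomaly of roughly -19 microGal directly above the void
```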

  5. Raman spectroscopy for the detection of explosives and their precursors on clothing in fingerprint concentration: a reliable technique for security and counterterrorism issues

    Almaviva, S.; Botti, S.; Cantarini, L.; Palucci, A.; Puiu, A.; Schnuerer, F.; Schweikert, W.; Romolo, F. S.


    In this work we report the results of Raman spectroscopy (RS) measurements on some common military explosives and some of the most common explosive precursors deposited on clothing fabrics, both synthetic and natural, such as polyester, leather and denim cotton, at concentrations comparable to those obtained from a single fingerprint. RS spectra were obtained using an integrated portable Raman system equipped with an optical microscope, focusing the light of a solid-state GaAlAs laser emitting at 785 nm. A maximum exposure time of 10 s was used, focusing the beam in a 45 μm diameter spot on the sample. The substances were deposited starting from commercial solutions with a Micropipetting Nano-Plotter, ideal for generating high-quality spots by non-contact dispensing of sub-nanoliter volumes of liquids, in order to simulate a homogeneous stain on the fabric surface. Images acquired with a confocal laser scanning microscope provided further details of the deposition process, showing single particles of micrometric volume trapped in or deposited on the underlying fabric. The spectral features of each substance were clearly identified and discriminated from those belonging to the substrate fabric or from the surrounding fluorescence. Our results show that the application of RS using a microscope-based apparatus can provide interpretable Raman spectra in a fast, in-situ analysis, directly from explosive particles of a few μm3, such as those that could be found in a single fingerprint, despite the contribution of the substrate, leaving the sample completely unaltered for further, more specific laboratory analysis. The same approach can be envisaged for the detection of other illicit substances such as drugs.

  6. Status and Future of a Real-time Global Flood Detection and Forecasting System Using Satellite Rainfall Information

    Adler, R. F.; Wu, H.; Hong, Y.; Policelli, F.; Pierce, H.


    Over the last several years a Global Flood Monitoring System (GFMS) has been running in real time to detect the occurrence of floods (see and click on "Floods and Landslides"). The system uses 3-hr resolution composite rainfall analyses (TRMM Multi-satellite Precipitation Analysis [TMPA]) as input to a hydrological model that calculates water depth at each grid point (at 0.25 degree latitude-longitude) over the tropics and mid-latitudes. These calculations can provide information useful to national and international agencies in understanding the location, intensity, timeline and impact on populations of these significant hazard events. The status of these flood calculations is shown through case study examples and a statistical comparison against a global flood event database. The validation study indicates that results improve with longer-duration (> 3 days) floods and that the statistics are affected by the presence of dams, which are not accounted for in the model calculations. Limitations in the flood calculations related to the satellite rainfall estimates include space and time resolution limits and underestimation of shallow orographic and monsoon system rainfall. The current quality of these flood estimates is at the level of being useful, but there is potential for significant improvement, mainly through improved and more timely satellite precipitation information and improvement in the hydrological models being used. NASA's Global Precipitation Measurement (GPM) program should lead to better precipitation analyses utilizing space-time interpolations that maintain accurate intensity distributions, along with methods to disaggregate the rain information. Research should lead to improved rain estimation for shallow, orographic rainfall systems and some types of monsoon rainfall, a current problem area for satellite rainfall. Higher resolution flood models with accurate routing and regional calibration, and the use of satellite

  7. Enlightenment on Computer Network Reliability From Transportation Network Reliability

    Hu Wenjun; Zhou Xizhao


    Referring to the transportation network reliability problem, five new computer network reliability definitions are proposed and discussed: computer network connectivity reliability, computer network time reliability, computer network capacity reliability, computer network behavior reliability and computer network potential reliability. Finally, strategies are suggested to enhance network reliability.
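
    The definitions above are qualitative; as one concrete reading of "connectivity reliability", the following Python sketch estimates two-terminal reliability by Monte Carlo simulation on an invented topology, assuming each link is up independently with a fixed probability.

```python
import random

def reaches(nodes, up_edges, s, t):
    """Depth-first check that s reaches t over the surviving edges."""
    adj = {n: [] for n in nodes}
    for u, v in up_edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = {s}, [s]
    while stack:
        n = stack.pop()
        if n == t:
            return True
        for m in adj[n]:
            if m not in seen:
                seen.add(m)
                stack.append(m)
    return False

def connectivity_reliability(nodes, edges, p_up, s, t, trials=20000):
    """Monte Carlo estimate of the probability that s and t stay connected."""
    hits = sum(
        reaches(nodes, [e for e in edges if random.random() < p_up], s, t)
        for _ in range(trials)
    )
    return hits / trials

# Invented 4-node ring topology with one chord:
nodes = [0, 1, 2, 3]
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(connectivity_reliability(nodes, edges, p_up=0.95, s=1, t=3))
```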

  8. Human Reliability Program Overview

    Bodin, Michael


    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  9. Meeting the future metro network challenges and requirements by adopting programmable S-BVT with direct-detection and PDM functionality

    Nadal, Laia; Svaluto Moreolo, Michela; Fàbrega, Josep M.; Vílchez, F. Javier


    In this paper, we propose an advanced programmable sliceable-bandwidth variable transceiver (S-BVT) with polarization division multiplexing (PDM) capability as a key enabler to fulfill the requirements for future 5G networks. Thanks to its cost-effective optoelectronic front-end based on orthogonal frequency division multiplexing (OFDM) technology and direct-detection (DD), the proposed S-BVT becomes suitable for next generation highly flexible and scalable metro networks. Polarization beam splitters (PBSs) and controllers (PCs), available on-demand, are included at the transceivers and at the network nodes, further enhancing the system flexibility and promoting an efficient use of the spectrum. 40G-100G PDM transmission has been experimentally demonstrated, within a 4-node photonic mesh network (ADRENALINE testbed), implementing a simplified equalization process.

  10. Reliable Design Versus Trust

    Berg, Melanie; LaBel, Kenneth A.


    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests FPGA internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  11. New generation of monolithic active pixel sensors for charged particle detection (Development of a new-generation sensor and its integrated electronics for future colliders)

    Deptuch, G


    Vertex detectors are of great importance in particle physics experiments, as knowledge of the event flavour is becoming an issue for the physics programme at future linear colliders. Monolithic Active Pixel Sensors (MAPS) based on a novel detector structure have been proposed. Their fabrication is compatible with a standard CMOS process. The sensor is inseparable from the readout electronics, since both are integrated on the same low-resistivity silicon wafer. The basic pixel configuration comprises only three MOS transistors and a diode collecting the charge through thermal diffusion. The charge is generated in the thin non-depleted epitaxial layer underneath the readout electronics. This approach provides, at low cost, a high-resolution and thin device with the whole area sensitive to radiation. Device simulations using the ISE-TCAD package have been carried out to study the charge collection mechanism. In order to demonstrate the viability of the technique, four prototype chips have been fabricated using different submicrometer CMOS processes. The pixel gain has been calibrated using a 55Fe source and the Poisson sequence method. The prototypes have been exposed to high-energy particle beams at CERN. The tests proved excellent detection performance, expressed in a single-track spatial resolution of 1.5 μm and a detection efficiency close to 100%, resulting from an SNR of more than 30. Irradiation tests showed immunity of MAPS to a level of a few times 10^12 n/cm^2 and a few hundred krad of ionising radiation. Ideas for future work, including on-pixel signal amplification, double sampling operation and current-mode pixel design, are presented as well. (author)

  12. Viking Lander reliability program

    Pilny, M. J.


    The Viking Lander reliability program is reviewed with attention given to the development of the reliability program requirements, reliability program management, documents evaluation, failure modes evaluation, production variation control, failure reporting and correction, and the parts program. Lander hardware failures which have occurred during the mission are listed.

  13. Chapter 15: Reliability of Wind Turbines

    Sheng, Shuangwen; O'Connor, Ryan


    The global wind industry has witnessed exciting developments in recent years. The future will be even brighter with further reductions in capital and operation and maintenance costs, which can be accomplished with improved turbine reliability, especially when turbines are installed offshore. One opportunity for the industry to improve wind turbine reliability is through the exploration of reliability engineering life data analysis based on readily available data or maintenance records collected at typical wind plants. If adopted and conducted appropriately, these analyses can quickly save operation and maintenance costs in a potentially impactful manner. This chapter discusses wind turbine reliability by highlighting the methodology of reliability engineering life data analysis. It first briefly discusses fundamentals for wind turbine reliability and the current industry status. Then, the reliability engineering method for life analysis, including data collection, model development, and forecasting, is presented in detail and illustrated through two case studies. The chapter concludes with some remarks on potential opportunities to improve wind turbine reliability. An owner and operator's perspective is taken and mechanical components are used to exemplify the potential benefits of reliability engineering analysis to improve wind turbine reliability and availability.
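
    A minimal Python sketch of the life data analysis step highlighted above: fitting a two-parameter Weibull to failure ages with scipy. The ages are invented, and a real analysis must also handle right-censored (still-running) units, which this sketch ignores.

```python
import numpy as np
from scipy.stats import weibull_min

# Assumed failure ages (hours) for one component class, e.g. pitch motors:
ages = np.array([8200.0, 11500.0, 15300.0, 9800.0, 13100.0, 17600.0, 12400.0])

# Two-parameter Weibull fit (location fixed at zero), the usual life
# data model in reliability engineering. Censoring is ignored here.
shape, loc, scale = weibull_min.fit(ages, floc=0)
print(f"beta = {shape:.2f}, eta = {scale:.0f} h")

# Forecast: expected fraction of this component failed by 15,000 h
print(weibull_min.cdf(15000.0, shape, loc=0, scale=scale))
```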

  14. Present and future research directions in the detection of asbestos fibers

    Xu Xiaoming; Gao Yuan; Li Yanqiu; Qi Jialin; Gao Mei; Cai Fa


    Asbestos fibers have good temperature resistance and stable chemical properties, making them an excellent industrial material, and they are therefore widely used in the chemical industry and many other fields. However, asbestos fibers can pollute the atmosphere, food and water, and endanger human health if inhaled. It is therefore necessary to investigate and study detection methods for asbestos fibers. This article summarizes the present status and future research directions of asbestos fiber detection and provides a basis for establishing rapid and effective detection methods.

  15. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon


    as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment and in the conduct of clinical studies. Several reliability studies are conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine...... to reliability estimates and different study designs and statistical analysis is given for future studies in Ayurveda....

  16. Reliability-based condition assessment of steel containment and liners

    Ellingwood, B.; Bhattacharya, B.; Zheng, R. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Civil Engineering


    Steel containments and liners in nuclear power plants may be exposed to aggressive environments that may cause their strength and stiffness to decrease during the plant service life. Among the factors recognized as having the potential to cause structural deterioration are uniform, pitting or crevice corrosion; fatigue, including crack initiation and propagation to fracture; elevated temperature; and irradiation. The evaluation of steel containments and liners for continued service must provide assurance that they are able to withstand future extreme loads during the service period with a level of reliability that is sufficient for public safety. Rational methodologies to provide such assurances can be developed using modern structural reliability analysis principles that take uncertainties in loading, strength, and degradation resulting from environmental factors into account. The research described in this report is in support of the Steel Containments and Liners Program being conducted for the US Nuclear Regulatory Commission by the Oak Ridge National Laboratory. The research demonstrates the feasibility of using reliability analysis as a tool for performing condition assessments and service life predictions of steel containments and liners. Mathematical models that describe time-dependent changes in steel due to aggressive environmental factors are identified, and statistical data supporting the use of these models in time-dependent reliability analysis are summarized. The analysis of steel containment fragility is described, and simple illustrations of the impact on reliability of structural degradation are provided. The role of nondestructive evaluation in time-dependent reliability analysis, both in terms of defect detection and sizing, is examined. A Markov model provides a tool for accounting for time-dependent changes in damage condition of a structural component or system. 151 refs.
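
    The report points to a Markov model for tracking time-dependent damage condition; the following Python sketch propagates an assumed three-state transition matrix to obtain the probability of failure over time. All states and transition probabilities are illustrative, not taken from the report.

```python
import numpy as np

# Hypothetical damage states: 0 = intact, 1 = degraded, 2 = failed.
# Assumed one-year transition probabilities; row i gives P(i -> j).
P = np.array([
    [0.95, 0.05, 0.00],
    [0.00, 0.90, 0.10],
    [0.00, 0.00, 1.00],  # failure is absorbing
])

state = np.array([1.0, 0.0, 0.0])  # component starts intact
for year in range(1, 41):
    state = state @ P  # propagate the damage-state distribution one year
    if year % 10 == 0:
        print(f"year {year}: P(failed) = {state[2]:.3f}")
```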

  17. Ultrasound measures of supraspinatus tendon thickness and acromiohumeral distance in rotator cuff tendinopathy are reliable.

    McCreesh, Karen M; Anjum, Shakeel; Crotty, James M; Lewis, Jeremy S


    Rotator cuff (RC) tendinopathy has been widely ascribed to impingement of the supraspinatus tendon (SsT) in the subacromial space, measured as the acromiohumeral distance (AHD). Ultrasound (US) is suitable for measuring AHD and SsT thickness, but few reliability studies have been carried out in symptomatic populations, and interrater reliability is unconfirmed. This study aimed to examine the intrarater and interrater reliability of US measurements of AHD and SsT thickness in asymptomatic control subjects and patients with RC tendinopathy. Seventy participants were recruited and grouped as healthy controls (n = 25) and RC tendinopathy (n = 45). Repeated US measurements of AHD and SsT thickness were obtained by one rater in both groups and by two raters in the RC tendinopathy group. Intrarater and interrater reliability coefficients were excellent for both measurements (intraclass correlation > 0.92), but the intrarater reliability was superior. The minimal detectable change values in the symptomatic group were 0.7 mm for AHD and 0.6 mm for SsT thickness for a single experienced examiner; the values rose to 1.2 mm and 1.3 mm, respectively, for the pair of examiners. The results support the reliability of US for the measurement of AHD and SsT thickness in patients with symptomatic RC tendinopathy and provide minimal detectable change values for use in future research studies. © 2015 Wiley Periodicals, Inc.
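
    The minimal detectable change values quoted above follow from the usual SEM-based formulation (SEM = SD * sqrt(1 - ICC), MDC = 1.96 * sqrt(2) * SEM); this Python sketch shows the computation, with the between-subject SD assumed for illustration rather than taken from the paper.

```python
import math

def minimal_detectable_change(sd, icc, z=1.96):
    """MDC at 95% confidence from a test-retest reliability coefficient."""
    sem = sd * math.sqrt(1.0 - icc)  # standard error of measurement
    return z * math.sqrt(2.0) * sem

# Assumed between-subject SD of 1.3 mm and the reported ICC of 0.92:
print(f"MDC = {minimal_detectable_change(1.3, 0.92):.1f} mm")
```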

  18. Recent advances in mycotoxins detection.

    Chauhan, Ruchika; Singh, Jay; Sachdev, Tushar; Basu, T; Malhotra, B D


    Mycotoxin contamination in both food and feed is inevitable. Mycotoxin toxicity in foodstuffs can occur at very low concentrations, necessitating early availability of sensitive and reliable methods for their detection. The present research thrust is towards the development of a user-friendly biosensor for mycotoxin detection at both academic and industrial levels to replace conventional, expensive chromatographic and ELISA techniques. This review critically analyzes recent research trends towards the construction of immunosensors, aptasensors, enzymatic sensors and others for mycotoxin detection, with reference to label-based and label-free methods, the synthesis of new materials including nanomaterials, and transducing techniques. Technological aspects in the development of biosensors for mycotoxin detection, current challenges and future prospects are also included to provide an overview and suggestions for future research directions.

  19. Early and reliable detection of herpes simplex virus type 1 and varicella zoster virus DNAs in oral fluid of patients with idiopathic peripheral facial nerve palsy: Decision support regarding antiviral treatment?

    Lackner, Andreas; Kessler, Harald H; Walch, Christian; Quasthoff, Stefan; Raggam, Reinhard B


    Idiopathic peripheral facial nerve palsy has been associated with the reactivation of herpes simplex virus type 1 (HSV-1) or varicella zoster virus (VZV). In recent studies, detection rates were found to vary strongly, which may be caused by the use of different oral fluid collection devices in combination with molecular assays lacking standardization. In this single-center pilot study, liquid phase-based and absorption-based oral fluid collection were compared. Samples were collected with both systems from 10 patients with acute idiopathic peripheral facial nerve palsy, 10 with herpes labialis or Ramsay Hunt syndrome, and 10 healthy controls. Commercially available IVD/CE-labeled molecular assays based on fully automated DNA extraction and real-time PCR were employed. With the liquid phase-based oral fluid collection system, three patients with idiopathic peripheral facial nerve palsy tested positive for HSV-1 DNA and another two tested positive for VZV DNA. All patients with herpes labialis tested positive for HSV-1 DNA, and all patients with Ramsay Hunt syndrome tested positive for VZV DNA. With the absorption-based oral fluid collection system, detection rates and viral loads were found to be significantly lower than those obtained with the liquid phase-based collection system. Collection of oral fluid with a liquid phase-based system and the use of automated and standardized molecular methods allow early and reliable detection of HSV-1 and VZV DNA in patients with acute idiopathic peripheral facial nerve palsy and may provide valuable decision support regarding the start of antiviral treatment at the first clinical visit.

  20. Is quantitative electromyography reliable?

    Cecere, F; Ruf, S; Pancherz, H


    The reliability of quantitative electromyography (EMG) of the masticatory muscles was investigated in 14 subjects without any signs or symptoms of temporomandibular disorders. Integrated EMG activity from the anterior temporalis and masseter muscles was recorded bilaterally by means of bipolar surface electrodes during chewing and biting activities. In the first experiment, the influence of electrode relocation was investigated. No influence of electrode relocation on the recorded EMG signal could be detected. In a second experiment, three sessions of EMG recordings during five different chewing and biting activities were performed in the morning (I); 1 hour later without intermediate removal of the electrodes (II); and in the afternoon, using new electrodes (III). The method errors for different time intervals (I-II and I-III errors) for each muscle and each function were calculated. Depending on the time interval between the EMG recordings, the muscles considered, and the function performed, the individual errors ranged from 5% to 63%. The method error increased significantly (P masseter (mean 27.2%) was higher than for the temporalis (mean 20.0%). The largest function error was found during maximal biting in intercuspal position (mean 23.1%). Based on the findings, quantitative electromyography of the masticatory muscles seems to have a limited value in diagnostics and in the evaluation of individual treatment results.
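
    The abstract does not state how the method errors were computed; assuming Dahlberg's formula, which is conventional for duplicate recordings of this kind, a minimal Python sketch with invented EMG values:

```python
import math

def dahlberg_method_error(first, second):
    """Dahlberg's method error for paired recordings:
    ME = sqrt(sum(d_i**2) / (2 * n))."""
    diffs = [a - b for a, b in zip(first, second)]
    return math.sqrt(sum(d * d for d in diffs) / (2 * len(diffs)))

# Assumed integrated EMG activity (arbitrary units) from sessions I and II:
session1 = [102.0, 98.5, 110.2, 95.3, 101.7]
session2 = [99.1, 101.0, 104.8, 97.6, 106.2]
print(dahlberg_method_error(session1, session2))
```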

  1. Reliability and safety engineering

    Verma, Ajit Kumar; Karanki, Durga Rao


    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz., electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...
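
    Among the uncertainty propagation methods the book lists, Monte Carlo simulation is the most direct to sketch. Below is a minimal, hypothetical example: uncertain component failure rates are sampled and pushed through a simple series-system reliability model. All names and numbers are assumptions for illustration, not values from the book.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 100_000  # Monte Carlo samples

# Uncertain component failure rates (per hour), modeled as lognormal
# around hypothetical nominal values.
lam_pump = rng.lognormal(mean=np.log(2e-5), sigma=0.3, size=n)
lam_valve = rng.lognormal(mean=np.log(5e-6), sigma=0.3, size=n)

t = 8760.0  # one year of operation, in hours
# Series system with exponential lifetimes: both components must survive.
r_system = np.exp(-lam_pump * t) * np.exp(-lam_valve * t)

print(f"mean R(t)    = {r_system.mean():.4f}")
print(f"90% interval = [{np.percentile(r_system, 5):.4f}, "
      f"{np.percentile(r_system, 95):.4f}]")
```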

  2. Measurement System Reliability Assessment

    Kłos Ryszard


    Decision-making in problem situations is based on up-to-date and reliable information. A great deal of information is subject to rapid change; hence it may be outdated or manipulated, leading to erroneous decisions. It is crucial to be able to assess the information obtained, and the best way to ensure its reliability is to obtain it with one's own measurement process. In such a case, assessing the reliability of the measurement system becomes crucial. The article describes a general approach to assessing the reliability of measurement systems.

  3. Reliable knowledge discovery

    Dai, Honghua; Smirnov, Evgueni


    Reliable Knowledge Discovery focuses on theory, methods, and techniques for RKDD, a new sub-field of KDD. It studies the theory and methods to assure the reliability and trustworthiness of discovered knowledge and to maintain the stability and consistency of knowledge discovery processes. RKDD has a broad spectrum of applications, especially in critical domains like medicine, finance, and the military. Reliable Knowledge Discovery also presents methods and techniques for designing robust knowledge-discovery processes. Approaches to assessing the reliability of the discovered knowledge are introduced...

  4. Reliability Oriented Circuit Design For Power Electronics Applications

    Sintamarean, Nicolae Cristian


    Highly reliable components are required in order to minimize downtime during the lifetime of the converter and, implicitly, the maintenance costs. Therefore, the design of highly reliable converters under reliability and cost constraints is a great challenge to be overcome in the future. The temperature variation of the semiconductor devices plays a key role in the robustness design and reliability of power electronics converters. This factor has a major impact on the power converters used in...

  5. LED system reliability

    Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.


    This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. A SSL system is composed of a LED engine with micro-electronic driver(s) that supplies power to the optic design. Knowledge of system level reliability is not only a challenging scientific ex

  6. Principles of Bridge Reliability

    Thoft-Christensen, Palle; Nowak, Andrzej S.

    The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated...

  7. Improving machinery reliability

    Bloch, Heinz P


    This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.

  8. Hawaii Electric System Reliability

    Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]


    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  9. Hawaii electric system reliability.

    Silva Monroy, Cesar Augusto; Loose, Verne William


    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  10. Chapter 9: Reliability

    Algora, Carlos; Espinet-Gonzalez, Pilar; Vazquez, Manuel; Bosco, Nick; Miller, David; Kurtz, Sarah; Rubio, Francisca; McConnell, Robert


    This chapter describes the accumulated knowledge on CPV reliability with its fundamentals and qualification. It explains the reliability of solar cells, modules (including optics) and plants. The chapter discusses the relevant statistical distributions, namely exponential, normal and Weibull. On solar cell reliability, it covers issues in accelerated aging tests of CPV solar cells, types of failure, and failures in real-time operation. The chapter explores accelerated life tests, namely qualitative life tests (mainly HALT) and quantitative accelerated life tests (QALT). It examines other well-proven PV cells and/or semiconductor devices which share similar semiconductor materials, manufacturing techniques or operating conditions, namely III-V space solar cells and light-emitting diodes (LEDs). It addresses each of the identified reliability issues and presents the current state-of-the-art knowledge for their testing and evaluation. Finally, the chapter summarizes the CPV qualification and reliability standards.

  11. Designing for Reliability and Robustness

    Svetlik, Randall G.; Moore, Cherice; Williams, Antony


    Long-duration spaceflight has a negative effect on the human body, and exercise countermeasures are used on board the International Space Station (ISS) to minimize bone and muscle loss and combat these effects. Given the importance of these hardware systems to the health of the crew, this equipment must continue to be readily available. Designing spaceflight exercise hardware to meet high reliability and availability standards has proven challenging throughout the time crewmembers have been living on the ISS, beginning in 2000. Furthermore, restoring operational capability after a failure is clearly time-critical, but can be problematic given the challenges of troubleshooting the problem from 220 miles away. Several best practices have been leveraged in seeking to maximize availability of these exercise systems, including designing for robustness, implementing diagnostic instrumentation, relying on user feedback, and providing ample maintenance and sparing. These factors have enhanced the reliability of the hardware systems, and therefore have contributed to keeping the crewmembers healthy upon return to Earth. This paper will review the failure history for three spaceflight exercise countermeasure systems, identifying lessons learned that can help improve future systems. Specifically, the Treadmill with Vibration Isolation and Stabilization System (TVIS), Cycle Ergometer with Vibration Isolation and Stabilization System (CEVIS), and the Advanced Resistive Exercise Device (ARED) will be reviewed, analyzed, and conclusions identified so as to provide guidance for improving future exercise hardware designs. These lessons learned, paired with thorough testing, offer a path towards reduced system downtime.

  12. Structural Reliability Methods

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature...

  13. Space Vehicle Reliability Modeling in DIORAMA

    Tornga, Shawn Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    When modeling the system performance of space-based detection systems it is important to consider spacecraft reliability. As space vehicles age, their components become prone to failure for a variety of reasons, such as radiation damage. Additionally, some vehicles may lose the ability to maneuver once they exhaust their fuel supplies. Typically, failure is divided into two categories: engineering mistakes and technology surprise. This document reports on a method of simulating space vehicle reliability in the DIORAMA framework.
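
    The report does not detail the DIORAMA implementation, but a plausible minimal sketch of the general approach is to sample Weibull failure times for each critical component and count the fraction of simulated vehicles still operational at a given mission age. The component names and parameters below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
n_trials = 50_000
mission_years = 10.0

# Hypothetical critical components: (Weibull shape k, scale in years).
# Shape > 1 gives the wear-out behavior typical of aging hardware.
components = {"sensor": (1.8, 15.0), "bus": (1.5, 25.0), "thruster": (2.2, 12.0)}

# A simulated vehicle survives only if every critical component survives.
alive = np.ones(n_trials, dtype=bool)
for shape, scale in components.values():
    time_to_failure = scale * rng.weibull(shape, size=n_trials)
    alive &= time_to_failure > mission_years

print(f"P(vehicle operational at {mission_years:.0f} y) = {alive.mean():.3f}")
```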

  14. Reliable Electronic Equipment

    N. A. Nayak


    The reliability aspects of electronic equipment are discussed. To obtain optimum results, close cooperation between the components engineer, the design engineer and the production engineer is suggested.

  15. Reliability prediction techniques

    Whittaker, B.; Worthington, B.; Lord, J.F.; Pinkard, D.


    The paper demonstrates the feasibility of applying reliability assessment techniques to mining equipment. A number of techniques are identified and described, and examples of their use in assessing mining equipment are given. These techniques include reliability prediction, failure analysis, design audit, maintainability, availability and life cycle costing. Specific conclusions regarding the usefulness of each technique are outlined. The choice of techniques depends upon both the type of equipment being assessed and its stage of development, with numerical prediction best suited to electronic equipment, and fault analysis and design audit suited to mechanical equipment. Reliability assessments involve much detailed and time-consuming work, but it has been demonstrated that the resulting reliability improvements lead to savings in service costs which more than offset the cost of the evaluation.

  16. The rating reliability calculator

    Solomon David J


    Background: Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods: The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open-source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program will upload them to the server to calculate the reliability and other statistics describing the ratings. Results: When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion: This simple web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to provide complete rating data. I would welcome other researchers revising and enhancing the program.
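
    The Spearman-Brown prophecy step mentioned in the results is simple enough to show directly; the sketch below applies the standard formula r_k = k r / (1 + (k - 1) r) for the reliability of an average of k ratings. Ebel's incomplete-data algorithm itself is omitted, and the numbers are illustrative.

```python
def spearman_brown(single_rating_reliability: float, k: int) -> float:
    """Reliability of the mean of k ratings, given the reliability of one."""
    r = single_rating_reliability
    return k * r / (1 + (k - 1) * r)

# If a single judge's rating has reliability 0.40, averaging five judges gives:
print(f"{spearman_brown(0.40, 5):.3f}")  # ~0.769
```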

  17. Reliability of power connections

    BRAUNOVIC Milenko


    Despite the use of various preventive maintenance measures, there are still a number of problem areas that can adversely affect system reliability. Also, economic constraints have pushed the designs of power connections closer to the limits allowed by the existing standards. The major parameters influencing the reliability and life of Al-Al and Al-Cu connections are identified. The effectiveness of various palliative measures is determined, and misconceptions about their effectiveness are dealt with in detail.

  18. Multidisciplinary System Reliability Analysis

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)


    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer and fluid flow disciplines.

  19. Sensitivity Analysis of Component Reliability



    In a system, every component has a unique position within the system and unique failure characteristics, so changes in the reliability of different components do not have equal effects on system reliability. Component reliability sensitivity is a measure of the effect on system reliability when a component's reliability is changed. In this paper, the definition and relative matrix of component reliability sensitivity are proposed, and some of their characteristics are analyzed. All of this helps in analysing and improving system reliability.
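
    The point that equal reliability changes in different components have unequal effects on system reliability is commonly formalized as the Birnbaum importance, dR_sys/dR_i. A minimal sketch for an assumed series-parallel system follows; all reliabilities are hypothetical.

```python
# Birnbaum importance I_i = dR_sys/dR_i for a small series-parallel system:
# component 1 in series with the parallel pair (2, 3).
def r_sys(r1, r2, r3):
    return r1 * (1 - (1 - r2) * (1 - r3))

def birnbaum(i, r, eps=1e-6):
    """Central finite-difference estimate of dR_sys/dR_i."""
    up, dn = list(r), list(r)
    up[i] += eps
    dn[i] -= eps
    return (r_sys(*up) - r_sys(*dn)) / (2 * eps)

r = [0.95, 0.80, 0.70]  # hypothetical component reliabilities
for i in range(3):
    print(f"component {i + 1}: sensitivity = {birnbaum(i, r):.4f}")
```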

  20. Evaluation of the reliability of quantum resonance spectrometer application in depression symptom detection

    师建国; 刘飞虎; 张燕; 孙丽莎; 张海涛; 岳晓斌; 杜向农; 袁晶; 徐堂辉


    Objective: To evaluate the reliability of quantum resonance spectrometer (QRS) detection of depression-related symptoms and its value in psychiatric practice. Methods: For 97 subjects, the psychiatric symptoms identified by psychiatrists through mental status examination were compared with those detected by QRS; the order of examination and testing was randomized according to admission and examination sequence. Results: The sensitivity and negative predictive value of QRS for eating disorders were 100%; the specificity and positive predictive value for depressed mood, weakened volition and sleep disorders were 100%. Kappa values exceeded 0.8 for symptoms such as slow thinking, depressed mood and weakened volition, and the area under the ROC curve exceeded 0.9 for 11 symptoms including slow thinking, depressed mood and weakened volition. Conclusion: QRS can serve as an auxiliary diagnostic tool for depression and provides a new basis for clinical diagnosis.
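
    The reported statistics (sensitivity, specificity, predictive values, kappa) all derive from a 2x2 table of QRS results against the psychiatrists' reference examination. A minimal sketch with hypothetical counts:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, predictive values and Cohen's kappa
    from a 2x2 table of test results against a reference diagnosis."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    p_obs = (tp + tn) / n                       # observed agreement
    p_exp = ((tp + fp) * (tp + fn)
             + (fn + tn) * (fp + tn)) / n**2    # chance agreement
    kappa = (p_obs - p_exp) / (1 - p_exp)
    return sens, spec, ppv, npv, kappa

# Hypothetical counts for one symptom, QRS versus psychiatrist examination.
sens, spec, ppv, npv, kappa = diagnostic_metrics(tp=30, fp=0, fn=5, tn=62)
print(f"sens={sens:.2f} spec={spec:.2f} ppv={ppv:.2f} npv={npv:.2f} "
      f"kappa={kappa:.2f}")
```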

  1. Optimal Reliability-Based Planning of Experiments for POD Curves

    Sørensen, John Dalsgaard; Faber, M. H.; Kroon, I. B.

    Optimal planning of the crack detection test is considered. The tests are used to update the information on the reliability of the inspection techniques modelled by probability of detection (P.O.D.) curves. It is shown how cost-optimal and reliability-based test plans can be obtained using First...

  2. Reliability of steam generator tubing

    Kadokami, E. [Mitsubishi Heavy Industries Ltd., Hyogo-ku (Japan)]


    The author presents results of studies made of the reliability of steam generator (SG) tubing. The basis for this work is that in Japan the issue of defects in SG tubing is addressed by the approach that any detected defect should be repaired, either by plugging the tube or by sleeving it. However, this leaves open the issue that there is a detection limit in practice, and the question of the effect of nondetectable cracks on the performance of tubing. These studies were commissioned to look at the safety issues involved in degraded SG tubing. The program has looked at a number of different issues. First was an assessment of the penetration and opening behavior of tube flaws due to internal pressure in the tubing. They have studied: penetration behavior of the tube flaws; primary water leakage from through-wall flaws; and opening behavior of through-wall flaws. In addition, they have looked at the question of the reliability of tubing with flaws during normal plant operation. Studies have also been done on the consequences of tube rupture accidents for the integrity of neighboring tubes.


    Tamargazin, O. A.; National Aviation University; Vlasenko, P. O.; National Aviation University


    The airline's operational structure for Reliability Program implementation — engineering division, reliability division, reliability control division, aircraft maintenance division, quality assurance division — was considered. The structure of the airline's Reliability Program is shown, and the use of the Reliability Program for reducing aircraft maintenance costs is proposed.

  4. Ultra reliability at NASA

    Shapiro, Andrew A.


    Ultra-reliable systems are critical to NASA, particularly as consideration is being given to extended lunar missions and manned missions to Mars. NASA has formulated a program designed to improve the reliability of NASA systems. The long-term goal of the NASA ultra-reliability effort is to ultimately improve NASA systems by an order of magnitude. The approach outlined in this presentation involves the steps used in developing a strategic plan to achieve the long-term objective of ultra-reliability. Consideration is given to: complex systems, hardware (including aircraft, aerospace craft and launch vehicles), software, human interactions, long-life missions, infrastructure development, and cross-cutting technologies. Several NASA-wide workshops have been held, identifying issues for reliability improvement and providing mitigation strategies for these issues. In addition to representation from all of the NASA centers, experts from government (NASA and non-NASA), universities and industry participated. Highlights of a strategic plan, which is being developed using the results from these workshops, will be presented.

  5. Photovoltaic module reliability workshop

    Mrig, L. (ed.)


    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986--1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if the PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, substantial research and testing are still required to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  6. Reliability Centered Maintenance - Methodologies

    Kammerer, Catherine C.


    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  7. Gearbox Reliability Collaborative Update (Presentation)

    Sheng, S.; Keller, J.; Glinsky, C.


    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  8. Future Contingents

    Øhrstrøm, Peter; Hasle, Per F. V.


    Statements such as “There will be a sea-battle tomorrow” could serve as standard examples. What could be called the problem of future contingents concerns how to ascribe truth-values to such statements. If there are several possible decisions out of which one is going to be made freely tomorrow, can there be a truth now about which one will be made? Besides statements, ‘future contingents’ could also refer to future contingent objects. A statement like “The first astronaut to go to Mars will have a unique experience” could be analyzed as referring to an object not yet existing, supposing that one day in the distant future some person will indeed travel to Mars, but that person has not yet been born. The notion of ‘future contingent objects’ involves important philosophical questions, for instance the issue of ethical obligations towards future generations, quantification over ‘future contingent objects’, etc. However, this entry is confined to the study of future...

  10. Future accelerators (?)

    John Womersley


    I describe the future accelerator facilities that are currently foreseen for electroweak scale physics, neutrino physics, and nuclear structure. I will explore the physics justification for these machines, and suggest how the case for future accelerators can be made.

  11. System Reliability Analysis: Foundations.


    Performance formulas for systems subject to preventive maintenance are given. Network reliability is treated in terms of the probability h(p) that a source s can communicate with a terminal t. For undirected networks, the basic reference is A. Satyanarayana and Kevin Wood (1982); for directed networks, the basic reference is Avinash...

  12. Havens: Explicit Reliable Memory Regions for HPC Applications

    Hukerikar, Saurabh [ORNL]; Engelmann, Christian [ORNL]


    Supporting error resilience in future exascale-class supercomputing systems is a critical challenge. Due to transistor scaling trends and increasing memory density, scientific simulations are expected to experience more interruptions caused by transient errors in the system memory. Existing hardware-based detection and recovery techniques will be inadequate to manage the presence of high memory fault rates. In this paper we propose a partial memory protection scheme based on region-based memory management. We define the concept of regions called havens that provide fault protection for program objects. We provide reliability for the regions through a software-based parity protection mechanism. Our approach enables critical program objects to be placed in these havens. The fault coverage provided by our approach is application agnostic, unlike algorithm-based fault tolerance techniques.
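
    The haven mechanism itself is a runtime-level facility for HPC codes, but the underlying parity idea is easy to illustrate: an XOR parity word over the blocks of a region lets any single lost block be rebuilt from the survivors. Below is a language-neutral sketch in Python with hypothetical block contents, not the paper's implementation.

```python
import numpy as np

def xor_parity(blocks):
    """XOR parity word over equal-sized data blocks."""
    parity = np.zeros_like(blocks[0])
    for block in blocks:
        parity ^= block
    return parity

rng = np.random.default_rng(0)
blocks = [rng.integers(0, 256, size=8, dtype=np.uint8) for _ in range(4)]
parity = xor_parity(blocks)

# Pretend block 2 is corrupted by a memory fault and its index is known:
# XOR-ing the parity with the surviving blocks reconstructs the lost one.
lost = 2
survivors = [b for i, b in enumerate(blocks) if i != lost]
rebuilt = xor_parity(survivors + [parity])
assert np.array_equal(rebuilt, blocks[lost])
print("lost block recovered via XOR parity")
```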

  13. Reliability assessment of wave Energy devices

    Ambühl, Simon; Kramer, Morten; Kofoed, Jens Peter


    Energy from waves may play a key role in sustainable electricity production in the future. Optimal reliability levels for components used in Wave Energy Devices (WEDs) need to be defined in order to decrease their cost of electricity. Optimal reliability levels can be found using probabilistic methods. Extreme loads during normal operation, but also extreme loads simultaneous with failure of mechanical and electrical components as well as the control system, are of importance for WEDs. Furthermore, fatigue loading needs to be assessed. This paper focuses on the Wavestar prototype, which is located...

  14. The 747 primary flight control systems reliability and maintenance study


    The major operational characteristics of the 747 Primary Flight Control Systems (PFCS) are described. Results of reliability analysis for separate control functions are presented. The analysis makes use of a NASA computer program which calculates the reliability of redundant systems. Costs for maintaining the 747 PFCS in airline service are assessed. The reliabilities and costs will provide a baseline for use in trade studies of future flight control system designs.

  15. a Reliability Evaluation System of Association Rules

    Chen, Jiangping; Feng, Wanshu; Luo, Minghai


    In mining association rules, the evaluation of the rules is highly important because it directly affects the usability and applicability of the mining output. In this paper, the concept of reliability was imported into association rule evaluation. The reliability of association rules was defined as the degree to which the rules accord with the mined data set. This degree contains three levels of measurement, namely the accuracy, completeness, and consistency of rules. To show its effectiveness, the "accuracy-completeness-consistency" reliability evaluation system was applied to two extremely different data sets: a basket simulation data set and a multi-source lightning data fusion problem. Results show that the reliability evaluation system works well on both the simulation data set and the actual problem. The three-dimensional reliability evaluation can effectively detect useless rules to be screened out and add missing rules, thereby improving the reliability of mining results. Furthermore, the proposed reliability evaluation system is applicable to many research fields; using the system in analysis can facilitate obtaining more accurate, complete, and consistent association rules.

  16. Expert system aids reliability

    Johnson, A.T. [Tennessee Gas Pipeline, Houston, TX (United States)]


    Quality and reliability are key requirements in the energy transmission industry. Tennessee Gas Co., a division of El Paso Energy, has applied Gensym's G2 object-oriented expert system programming language as a standard tool for maintaining and improving quality and reliability in pipeline operation. Tennessee created a small team of gas controllers and engineers to develop a Proactive Controller's Assistant (ProCA) that provides recommendations for operating the pipeline more efficiently, reliably and safely. The controllers' pipeline operating knowledge is recreated in G2 in the form of rules and procedures in ProCA. Two G2 programmers supporting the Gas Control Room add information to the ProCA knowledge base daily. The result is a dynamic, constantly improving system that supports not only the pipeline controllers in their operations but also the measurement and communications departments' requests for special studies. The Proactive Controller's Assistant development focuses on the following areas: alarm management; pipeline efficiency; reliability; fuel efficiency; and controller development.

  17. Reliability based structural design

    Vrouwenvelder, A.C.W.M.


    According to ISO 2394, structures shall be designed, constructed and maintained in such a way that they are suited for their use during the design working life in an economic way. To fulfil this requirement one needs insight into the risk and reliability under expected and non-expected actions. A ke

  19. The value of reliability

    Fosgerau, Mogens; Karlström, Anders


    We derive the value of reliability in the scheduling of an activity of random duration, such as travel under congested conditions. Using a simple formulation of scheduling utility, we show that the maximal expected utility is linear in the mean and standard deviation of trip duration, regardless...

  20. Parametric Mass Reliability Study

    Holt, James P.


    The International Space Station (ISS) systems are designed around redundancy, with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. The components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, are typically the most reliable, while components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs to reliability and the mass of ORU subcomponents to reliability.

  1. Avionics Design for Reliability



  2. Wind Energy - How Reliable.


    The reliability of a wind energy system depends on the size of the propeller and the size of the back-up energy storage. Design of the optimum system... speed incidents which generate a significant part of the wind energy. A nomogram is presented, based on some continuous wind speed measurements...

  3. The reliability horizon

    Visser, M


    The "reliability horizon" for semi-classical quantum gravity quantifies the extent to which we should trust semi-classical quantum gravity, and gives a handle on just where the "Planck regime" resides. The key obstruction to pushing semi-classical quantum gravity into the Planck regime is often the existence of large metric fluctuations, rather than a large back-reaction.

  4. Reliability of semiology description.

    Heo, Jae-Hyeok; Kim, Dong Wook; Lee, Seo-Young; Cho, Jinwhan; Lee, Sang-Kun; Nam, Hyunwoo


    Seizure semiology is important for classifying patients' epilepsy. Physicians usually get most of the seizure information from observers though there have been few reports on the reliability of the observers' description. This study aims at determining the reliability of observers' description of the semiology. We included 92 patients who had their habitual seizures recorded during video-EEG monitoring. We compared the semiology described by the observers with that recorded on the videotape, and reviewed which characteristics of the observers affected the reliability of their reported data. The classification of seizures and the individual components of the semiology based only on the observer-description was somewhat discordant compared with the findings from the videotape (correct classification, 85%). The descriptions of some ictal behaviors such as oroalimentary automatism, tonic/dystonic limb posturing, and head versions were relatively accurate, but those of motionless staring and hand automatism were less accurate. The specified directions by the observers were relatively correct. The accuracy of the description was related to the educational level of the observers. Much of the information described by well-educated observers is reliable. However, every physician should keep in mind the limitations of this information and use this information cautiously.

  5. High reliability organizations

    Gallis, R.; Zwetsloot, G.I.J.M.


    High Reliability Organizations (HROs) are organizations that constantly face serious and complex (safety) risks yet succeed in realising excellent safety performance. In such situations acceptable levels of safety cannot be achieved by traditional safety management alone. HROs manage safety...

  6. Snow: a reliable indicator for global warming in the future?

    Jacobi, H.-W.


    The cryosphere consists of water in the solid form at the Earth's surface and includes, among others, snow, sea ice, glaciers and ice sheets. Since the 1990s the cryosphere and its components have often been considered as indicators of global warming because rising temperatures can enhance the melting of solid water (e.g. Barry et al 1993, Goodison and Walker 1993, Armstrong and Brun 2008). Changes in the cryosphere are often easier to recognize than a global temperature rise of a couple of degrees: many locals and tourists have hands-on experience of changes in the extent of glaciers or the duration of winter snow cover on the Eurasian and North American continents. On a more scientific basis, the last IPCC report left no doubt: the amount of snow and ice on Earth is decreasing (Lemke et al 2007). Available data showed clearly decreasing trends in the sea ice and frozen ground extent of the Northern Hemisphere (NH) and the global glacier mass balance. However, the trend in the snow cover extent (SCE) of the NH was much more ambiguous; a result that has since been confirmed by the up-to-date online analysis of the SCE performed by the Rutgers University Global Snow Lab. The behavior of snow is not the result of a simple cause-and-effect relationship between air temperature and snow. It is instead related to a rather complex interplay between external meteorological parameters and internal processes in the snowpack. While air temperature is of course a crucial parameter for snow and its melting, precipitation and radiation are also important. Further physical properties like snow grain size and the amount of absorbing impurities in the snow determine the fraction of absorbed radiation. While all these parameters affect the energy budget of the snowpack, each of these variables can dominate depending on the season or, more generally, on environmental conditions. As a result, the reduction in SCE in spring and summer in the NH was attributed to faster melting because of higher air temperatures, while the winter months (December to February) saw an increase in the SCE due to increased precipitation (Lemke et al 2007). Cohen et al (2012) confirmed these opposing effects in the SCE and showed that on the Eurasian continent the average SCE in October has increased by approximately 3 × 10^6 km^2 in the last two decades; a growth of almost 40%, corresponding to roughly 1.5 times the area of Greenland. For the same period, Cohen et al (2012) found a negligible trend in the average temperatures above the continents of the NH for the winter months despite a significant increase in the annual mean temperature for the same regions. Cohen et al (2012) propose the following link between temperatures and snow: the reduced sea ice cover of the Arctic Ocean and the enhanced air temperatures in fall cause higher evaporation from the Arctic Ocean, leading to increased tropospheric moisture in the Arctic. More moisture results in more snowfall over the Eurasian continent, increasing the SCE. The increased snow cover strengthens the Siberian High, a strong anticyclonic system generally persistent between October and April. This system is strong enough to affect weather patterns in large parts of the NH, resulting in changes in the large-scale circulation of the NH (Panagiotopoulos et al 2005). As a result, outbreaks of cold Arctic air masses into the mid-latitudes are more frequent, leading to low temperatures over the eastern part of North America and Northern Eurasia.
According to Cohen et al (2012), these are exactly the same regions that have experienced a cooling trend in the winter temperature over the past twenty years. While this chain of events is plausible (and some are confirmed by observations), existing climate models are not yet capable of reproducing these processes. On the contrary, Cohen et al (2012) showed that they predict a slightly decreasing SCE in October for Eurasia and an increase in winter temperatures over the continents in the NH. This is not surprising because the simu

  7. Reliability in the utility computing era: Towards reliable Fog computing

    Madsen, Henrik; Burtschy, Bernard; Albeanu, G.


    This paper considers current paradigms in computing and outlines their most important reliability aspects. The Fog computing paradigm, as a non-trivial extension of the Cloud, is considered, and the reliability of networks of smart devices is discussed. Combining the reliability requirements of the grid and cloud paradigms with the reliability requirements of networks of sensors and actuators, it follows that designing a reliable Fog computing platform is feasible.

  8. Optimal Reliability-Based Planning of Experiments for POD Curves

    Sørensen, John Dalsgaard; Faber, M. H.; Kroon, I. B.

    Optimal planning of the crack detection test is considered. The tests are used to update the information on the reliability of the inspection techniques modelled by probability of detection (P.O.D.) curves. It is shown how cost-optimal and reliability-based test plans can be obtained using First Order Reliability Methods in combination with life-cycle cost-optimal inspection and maintenance planning. The methodology is based on preposterior analyses from Bayesian decision theory. An illustrative example is shown.
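
    As background on what a P.O.D. curve is, the sketch below fits one common model, a logistic curve in log flaw size, to hypothetical hit/miss inspection data and extracts a90, the flaw size detected with 90% probability. This is illustration only, not the test-planning method of the paper; all data are invented.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical hit/miss inspection data: flaw sizes (mm) and outcomes (1/0).
size = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5, 3.0, 4.0,
                 0.6, 0.9, 1.1, 1.4, 1.6, 2.2, 2.8, 3.5, 0.7, 1.3])
hit = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1,
                0, 1, 0, 1, 1, 1, 1, 1, 0, 0])

# Logistic model in log flaw size: POD(a) = 1 / (1 + exp(-(b0 + b1*ln a))).
X = sm.add_constant(np.log(size))
fit = sm.Logit(hit, X).fit(disp=0)
b0, b1 = fit.params

# a90: flaw size detected with 90% probability (confidence bounds such as
# a90/95 would additionally need the parameter covariance matrix).
a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)
print(f"a90 = {a90:.2f} mm")
```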

  9. Reliability of a rating procedure to monitor industry self-regulation codes governing alcohol advertising content.

    Babor, Thomas F; Xuan, Ziming; Proctor, Dwayne


    The purposes of this study were to develop reliable procedures to monitor the content of alcohol advertisements broadcast on television and in other media, and to detect violations of the content guidelines of the alcohol industry's self-regulation codes. A set of rating-scale items was developed to measure the content guidelines of the 1997 version of the U.S. Beer Institute Code. Six focus groups were conducted with 60 college students to evaluate the face validity of the items and the feasibility of the procedure. A test-retest reliability study was then conducted with 74 participants, who rated five alcohol advertisements on two occasions separated by 1 week. Average correlations across all advertisements using three reliability statistics (r, rho, and kappa) were almost all statistically significant and the kappas were good for most items, which indicated high test-retest agreement. We also found high interrater reliabilities (intraclass correlations) among raters for item-level and guideline-level violations, indicating that regardless of the specific item, raters were consistent in their general evaluations of the advertisements. Naïve (untrained) raters can provide consistent (reliable) ratings of the main content guidelines proposed in the U.S. Beer Institute Code. The rating procedure may have future applications for monitoring compliance with industry self-regulation codes and for conducting research on the ways in which alcohol advertisements are perceived by young adults and other vulnerable populations.
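
    The three test-retest statistics used in the study (r, rho, and kappa) can be computed directly from paired ratings; the sketch below uses hypothetical ratings of one advertisement by the same raters one week apart.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings of one advertisement on a 1-5 guideline item,
# by the same ten raters one week apart.
week1 = np.array([1, 2, 2, 3, 5, 4, 2, 1, 3, 4])
week2 = np.array([1, 2, 3, 3, 5, 4, 2, 2, 3, 4])

print(f"Pearson r    = {pearsonr(week1, week2)[0]:.2f}")
print(f"Spearman rho = {spearmanr(week1, week2)[0]:.2f}")
print(f"Cohen kappa  = {cohen_kappa_score(week1, week2):.2f}")
```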

  10. Human Reliability Program Workshop

    Landers, John; Rogers, Erin; Gerke, Gretchen


    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  11. Accelerator reliability workshop

    Hardy, L.; Duru, Ph.; Koch, J.M.; Revol, J.L.; Van Vaerenbergh, P.; Volpe, A.M.; Clugnet, K.; Dely, A.; Goodhew, D


    About 80 experts attended this workshop, which brought together all accelerator communities: accelerator-driven systems, X-ray sources, medical and industrial accelerators, spallation source projects (American and European), nuclear physics, etc. With newly proposed accelerator applications such as nuclear waste transmutation and the replacement of nuclear power plants, reliability has become a number-one priority for accelerator designers. Every part of an accelerator facility, from cryogenic systems to data storage via RF systems, is concerned by reliability. This aspect is now taken into account in the design/budget phase, especially for projects whose goal is to reach no more than 10 interruptions per year. This document gathers the slides but not the proceedings of the workshop.

  12. Reliability and construction control

    Sherif S. AbdelSalam


    The goal of this study was to determine the most reliable and efficient combination of design and construction methods required for vibro piles. For a wide range of static and dynamic formulas, the reliability-based resistance factors were calculated using the EGYPT database, which houses load test results for 318 piles. The analysis was extended to introduce a construction control factor that determines the variation between the pile nominal capacities calculated using static versus dynamic formulae. Among the major outcomes, the lowest coefficient of variation is associated with Davisson’s criterion, and the resistance factors calculated for the AASHTO method are relatively high compared with other methods. Additionally, the CPT-Nottingham and Schmertmann method provided the most economic design. Recommendations related to a pile construction control factor were also presented, and it was found that utilizing the factor can significantly reduce variations between calculated and actual capacities.

  13. Improving Power Converter Reliability

    Ghimire, Pramod; de Vega, Angel Ruiz; Beczkowski, Szymon


    The real-time junction temperature monitoring of a high-power insulated-gate bipolar transistor (IGBT) module is important for increasing the overall reliability of power converters for industrial applications. This article proposes a new method to measure the on-state collector-emitter voltage of a high-power IGBT module during converter operation, which may play a vital role in improving the reliability of power converters. The measured voltage is used to estimate the module average junction temperature of the high- and low-voltage sides of a half-bridge IGBT separately in every fundamental... is measured in a wind power converter at a low fundamental frequency. To illustrate more, the test method as well as the performance of the measurement circuit are also presented. This measurement is also useful to indicate failure mechanisms such as bond wire lift-off and solder layer degradation...

  14. ATLAS reliability analysis

    Bartsch, R.R.


    Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability which is to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with increasing probability of failure with the number of shots. For transmission line insulation, a minimum thickness is obtained and for the railgaps, a method for obtaining a maintenance interval from forthcoming life tests is suggested.
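
    Combining individual failure probabilities into a per-shot system reliability can be sketched as follows, assuming independent subsystems whose failure probability grows with shot count under Weibull wear-out, in the spirit of the analysis described above. All parameters are hypothetical, not the ATLAS values.

```python
import numpy as np

# Hypothetical shot-life parameters (Weibull scale in shots, shape) for
# bank subsystems; failure probability grows with accumulated shots.
parts = {"capacitors": (2000.0, 1.4),
         "railgaps": (800.0, 2.0),
         "transmission_line": (5000.0, 1.2)}

def system_reliability(n_shots):
    """Probability that every (independent) subsystem survives n_shots."""
    r = 1.0
    for eta, beta in parts.values():
        r *= np.exp(-((n_shots / eta) ** beta))
    return r

for n in (50, 100, 200):
    # Conditional reliability of the n-th shot, given survival so far.
    per_shot = system_reliability(n) / system_reliability(n - 1)
    print(f"shot {n}: P(survive this shot) = {per_shot:.4f}")
```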

  15. Reliability of Circumplex Axes

    Micha Strack


    We present a confirmatory factor analysis (CFA) procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing interpersonal and motivational constructs—Interpersonal Adjective List (IAL, Interpersonal Adjective Scales (revised; IAS-R, Inventory of Interpersonal Problems (IIP, Impact Messages Inventory (IMI, Circumplex Scales of Interpersonal Values (CSIV, Support Action Scale Circumplex (SAS-C, Interaction Problems With Animals (IPI-A, Team Role Circle (TRC, Competing Values Leadership Instrument (CV-LI, Love Styles, Organizational Culture Assessment Instrument (OCAI, Customer Orientation Circle (COC, and System for Multi-Level Observation of Groups (behavioral adjectives; SYMLOG—in 17 German-speaking samples (29 subsamples, grouped by self-report, other report, and metaperception assessments. The general factor accounted for a proportion ranging from 1% to 48% of the item variance, the axes component for 2% to 30%; and scale specificity for 1% to 28%, respectively. Reliability estimates varied considerably from .13 to .92. An application of the Nunnally and Bernstein formula proposed by Markey, Markey, and Tinsley overestimated axes reliabilities in cases of large-scale specificities but otherwise works effectively. Contemporary circumplex evaluations such as Tracey’s RANDALL are sensitive to the ratio of the axes and scale-specificity components. In contrast, the proposed model isolates both components.

  16. Cooperative Communications. Link Reliability and Power Efficiency

    Ahsin, Tafzeel ur Rehman


    Demand for high data rates is increasing rapidly for future wireless generations, due to the requirement of ubiquitous coverage for wireless broadband services. More base stations are needed to deliver these services, in order to cope with the increased capacity demand and the inherently unreliable nature of the wireless medium. However, this directly corresponds to high infrastructure cost and energy consumption in cellular networks. Nowadays, high power consumption in the network is becoming a matter of concern for operators, both from an environmental and an economic point of view. Cooperative communications, which can be regarded as a virtual multi-input-multi-output (MIMO) channel, can be very efficient in combating fading multi-path channels and can improve coverage with low complexity and cost. With its distributed structure, cooperative communications can also contribute to the energy efficiency of wireless systems and the green radio communications of the future. Using network coding on top of cooperative communication utilizes the network resources more efficiently. Here we look at the large-scale use of low-cost relays as a way of making the links reliable, which directly corresponds to a reduction in transmission power at the nodes. A lot of research has focused on highlighting the gains achieved by using network coding in cooperative transmissions. However, certain areas are not fully explored yet. For instance, the kind of detection scheme used at the receiver and its impact on link performance has not been addressed. The thesis looks at the performance comparison of different detection schemes and also proposes how to group users at the relay to ensure mutual benefit for the cooperating users. Using constellation selection at the nodes, the augmented space formed at the receiver is exploited to make the links more reliable. The network and channel coding schemes are represented as a single product code, which allows us to...

  17. Process control using reliability based control charts

    J.K. Jacob


    Purpose: The paper presents a method to monitor the mean time between failures (MTBF) and detect any change in the intensity parameter. A control chart procedure is presented for process reliability monitoring. Control charts based on different distributions are also considered and used in decision making. Results and discussion are presented based on case studies at different industries. Design/methodology/approach: The failure occurrence process can be modeled by different distributions, such as the homogeneous Poisson process or the Weibull model. In each case the aim is to monitor the mean time between failures (MTBF) and detect any change in the intensity parameter. When the process can be described by a Poisson process, the times between failures are exponential and can be used for reliability monitoring. Findings: In this paper, a new procedure based on monitoring the time to observe r failures is also proposed, and it can be more appropriate for reliability monitoring. Practical implications: This procedure is useful and more sensitive when compared with the λ-chart, although it must wait until r failures for a decision. These charts can be regarded as powerful tools for reliability monitoring; the λr-chart gives more accurate results than the λ-chart. Originality/value: Applying these measures to a system of equipment can increase the reliability and availability of the system and result in economic gain. A homogeneous Poisson process is usually used to model the failure occurrence process with a certain intensity.
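
    Under a homogeneous Poisson process the time to observe r failures follows a gamma (Erlang) distribution, so control limits for a time-to-r-failures chart can be taken from gamma quantiles. The sketch below illustrates that standard construction with a hypothetical in-control MTBF; it is not necessarily the paper's exact procedure.

```python
from scipy import stats

mtbf_target = 500.0   # in-control mean time between failures (hours)
r = 3                 # chart the time needed to observe r failures
alpha = 0.0027        # false-alarm rate comparable to 3-sigma limits

# Under a Poisson process the time to the r-th failure is Gamma(r, MTBF).
lcl = stats.gamma.ppf(alpha / 2, a=r, scale=mtbf_target)
ucl = stats.gamma.ppf(1 - alpha / 2, a=r, scale=mtbf_target)
print(f"Signal if the time to {r} failures falls below {lcl:.0f} h "
      f"(deterioration) or above {ucl:.0f} h (improvement).")
```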

  18. Estimating a municipal water supply reliability

    O.G. Okeola


    The availability and adequacy of water in a river basin determine the design of water resources projects such as water supply. There is a further need to regularly appraise the availability of such resources for a municipality at a distant future date, to help in articulating a contingency plan to handle its vulnerability. This paper attempts to empirically determine the reliability of the water resource for a municipal water supply. An approach was first developed to estimate the water demand of a municipality that lacks socioeconometric data, using a purpose-specific model. A hydrological assessment of the river Oyun basin was then carried out using a Markov model and sequent peak analysis to determine the extent of reliability for the future demand. The two models were applied to the Offa municipality in Kwara State, Nigeria. The findings revealed the reliability and adequacy of the resource up to the year 2020. The need to start exploring a well-coordinated conjunctive use of resources is recommended. The study can serve as an organized baseline for future work that will consider the physiographic characteristics of the basin and climatic dynamics. The findings can be a vital input into the demand management process for the long-term sustainable water supply of the town and, by extension, of urban townships with similar characteristics.
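
    Sequent peak analysis itself is a short recursion over the inflow and demand series: the running deficit K is updated as K = max(0, K + demand - inflow), and its maximum over the record is the storage needed to meet demand without failure. A sketch with hypothetical monthly data, not the Oyun basin figures:

```python
def sequent_peak_storage(inflows, demands):
    """Sequent peak analysis: the running deficit K is updated as
    K = max(0, K + demand - inflow); its maximum over the record is the
    storage capacity needed to meet demand without failure."""
    k, k_max = 0.0, 0.0
    for inflow, demand in zip(inflows, demands):
        k = max(0.0, k + demand - inflow)
        k_max = max(k_max, k)
    return k_max

# Hypothetical monthly river inflows and municipal demands (10^6 m^3).
inflow = [8, 6, 4, 2, 1, 1, 2, 3, 5, 7, 9, 10]
demand = [4] * 12
print(f"required storage = {sequent_peak_storage(inflow, demand):.1f} Mm^3")
```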

  19. Future Textiles

    Hansen, Anne-Louise Degn; Jensen, Hanne Troels Fusvad; Hansen, Martin


    The magazine Future Textiles gathers the results of the Future Textiles project, which promotes the field of intelligent textiles. In the magazine, one can read about trends, driving forces and challenges, and find ideas for new products within intelligent textiles. Areas such as sustainability and customization...

  20. Technique for Measuring Hybrid Electronic Component Reliability

    Green, C.C.; Hernandez, C.L.; Hosking, F.M.; Robinson, D.; Rutherford, B.; Uribe, F.


    Materials compatibility studies of aged, engineered materials and hardware are critical to understanding and predicting component reliability, particularly for systems with extended stockpile life requirements. Nondestructive testing capabilities for component reliability would significantly enhance lifetime predictions. For example, if the detection of crack propagation through a solder joint can be demonstrated, this technique could be used to develop baseline information to statistically determine solder joint life lengths. This report will investigate high frequency signal response techniques for nondestructively evaluating the electrical behavior of thick film hybrid transmission lines.

  1. The future of postgraduate training

    Walsh, Kieran


    Improvements to postgraduate training have included newly designed postgraduate curricula, new forms of delivery of learning, more valid and reliable assessments, and more rigorous evaluation of training programmes. All these changes have been necessary and have now started to settle in. Now, therefore, is an appropriate time to look to the future of postgraduate training. Predicting the future is difficult in any course of life; however, an examination of recent trends is often a good place to s...

  2. Design and Analysis of Salmonid Tagging Studies in the Columbia Basin, Volume XVI; Alternative Designs for Future Adult PIT-Tag Detection Studies, 2000 Technical Report.

    Perez-Comas, Jose A.; Skalski, John R. (University of Washington, School of Fisheries, Seattle, WA)


    With the advent of the installation of a PIT-tag interrogation system in the Cascades Island fish ladder at Bonneville Dam (BON), and at other CRB dams, this overview describes in general terms what can and cannot be estimated under seven different scenarios of adult PIT-tag detection capabilities in the CRB. Moreover, this overview attempts to identify the minimal adult PIT-tag detection configurations required by the ten threatened Columbia River Basin (CRB) chinook and steelhead ESUs. A minimal adult PIT-tag detection configuration will require the installation of adult PIT-tag detection facilities at Bonneville Dam and another dam above BON. Thus, the Snake River spring/summer and fall chinook salmon, and the Snake River steelhead, will require a minimum of three dams with adult PIT-tag detection capabilities to guarantee estimates of "ocean survival" and of at least one independent, in-river returning adult survival (e.g., adult PIT-tag detection facilities at BON and LGR dams and at any other intermediary dam such as IHR). The Upper Columbia River spring chinook salmon and steelhead will also require a minimum of three dams with adult PIT-tag detection capabilities: BON and two other dams on the BON-WEL reach. The current CRB dam system configuration and BPA's and COE's commitment to install adult PIT-tag detectors only in major CRB projects will not allow the estimation of "ocean survival" or of any in-river adult survival for the Lower Columbia River chinook salmon and steelhead. The Middle Columbia River steelhead ESU will require a minimum of two dams with adult PIT-tag detection capabilities: BON and another upstream dam on the BON-McN reach. Finally, in spite of their importance in terms of releases, PIT-tag survival studies for the Upper Willamette chinook and Upper Willamette steelhead ESUs cannot be performed with the current CRB dam system configuration and PIT-tag detection capabilities.

  3. CR reliability testing

    Honeyman-Buck, Janice C.; Rill, Lynn; Frost, Meryll M.; Staab, Edward V.


    The purpose of this work was to develop a method for systematically testing the reliability of a CR system under realistic daily loads in a non-clinical environment prior to its clinical adoption. Once digital imaging replaces film, it will be very difficult to revert back should the digital system become unreliable. Prior to the beginning of the test, a formal evaluation was performed to set the benchmarks for performance and functionality. A formal protocol was established that included all the 62 imaging plates in the inventory for each 24-hour period in the study. Imaging plates were exposed using different combinations of collimation, orientation, and SID. Anthropomorphic phantoms were used to acquire images of different sizes. Each combination was chosen randomly to simulate the differences that could occur in clinical practice. The tests were performed over a wide range of times with batches of plates processed to simulate the temporal constraints required by the nature of portable radiographs taken in the Intensive Care Unit (ICU). Current patient demographics were used for the test studies so automatic routing algorithms could be tested. During the test, only three minor reliability problems occurred, two of which were not directly related to the CR unit. One plate was discovered to cause a segmentation error that essentially reduced the image to only black and white with no gray levels. This plate was removed from the inventory to be replaced. Another problem was a PACS routing problem that occurred when the DICOM server with which the CR was communicating had a problem with disk space. The final problem was a network printing failure to the laser cameras. Although the units passed the reliability test, problems with interfacing to workstations were discovered. The two issues that were identified were the interpretation of what constitutes a study for CR and the construction of the look-up table for a proper gray scale display.

  4. Ultimately Reliable Pyrotechnic Systems

    Scott, John H.; Hinkel, Todd


    This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high reliability operation in extreme environments and illustrates the potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the more successful in the history of Human Spaceflight. No pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably in temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, with never a failure. The recent landing on Mars of the Curiosity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror.' Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing

  5. Ferrite logic reliability study

    Baer, J. A.; Clark, C. B.


    Development and use of digital circuits called all-magnetic logic are reported. In these circuits the magnetic elements and their windings comprise the active circuit devices in the logic portion of a system. The ferrite logic device belongs to the all-magnetic class of logic circuits. The FLO device is novel in that it makes use of a dual or bimaterial ferrite composition in one physical ceramic body. This bimaterial feature, coupled with its potential for relatively high speed operation, makes it attractive for high reliability applications. (Maximum speed of operation approximately 50 kHz.)

  6. Blade reliability collaborative

    Ashwill, Thomas D.; Ogilvie, Alistair B.; Paquette, Joshua A.


    The Blade Reliability Collaborative (BRC) was started by the Wind Energy Technologies Department of Sandia National Laboratories and DOE in 2010 with the goal of gaining insight into planned and unplanned O&M issues associated with wind turbine blades. A significant part of BRC is the Blade Defect, Damage and Repair Survey task, which will gather data from blade manufacturers, service companies, operators and prior studies to determine details about the largest sources of blade unreliability. This report summarizes the initial findings from this work.

  7. Test-retest reliability of lower limb isokinetic endurance in COPD: a comparison of angular velocities

    Ribeiro F


    Full Text Available Fernanda Ribeiro,* Pierre-Alexis Lépine,* Corine Garceau-Bolduc, Valérie Coats, Étienne Allard, François Maltais, Didier Saey Centre de recherche de l’Institut Universitaire de cardiologie et de pneumologie de Québec, Université Laval, Québec, Canada *These authors contributed equally to this work Background: The purpose of this study was to determine and compare the test-retest reliability of quadriceps isokinetic endurance testing at two knee angular velocities in patients with chronic obstructive pulmonary disease (COPD). Methods: After one familiarization session, 14 patients with moderate to severe COPD (mean age 65±4 years; forced expiratory volume in 1 second (FEV1) 55%±18% predicted) performed two quadriceps isokinetic endurance tests on two separate occasions within a 5–7-day interval. Quadriceps isokinetic endurance tests consisted of 30 maximal knee extensions at angular velocities of 90° and 180° per second, performed in random order. Test-retest reliability was assessed for peak torque, muscle endurance, work slope, work fatigue index, and changes in scores for dyspnea and leg fatigue from rest to the end of the test. The intraclass correlation coefficient, minimal detectable change, and limits of agreement were calculated. Results: High test-retest reliability was identified for peak torque and muscle total work at both velocities. Work fatigue index was considered reliable at 90° per second but not at 180° per second. A lower reliability was identified for dyspnea and leg fatigue scores at both angular velocities. Conclusion: Despite a limited sample size, our findings support the use of a 30-maximal-repetition isokinetic muscle testing procedure at angular velocities of 90° and 180° per second in patients with moderate to severe COPD. Endurance measurement (total isokinetic work) at 90° per second was highly reliable, with a minimal detectable change at the 95% confidence level of 10%. Peak torque and fatigue index
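
    As a hedged sketch of how the quoted minimal detectable change could be computed: the study's exact formula is not given here, so this assumes the common definition SEM = SD*sqrt(1-ICC) and MDC95 = 1.96*SEM*sqrt(2), with invented data.

```python
# Hedged sketch: minimal detectable change (MDC) from test-retest data.
# The computation below is an assumed, common convention, not necessarily
# the one used in the study.
import numpy as np

def mdc95(scores_session1, scores_session2, icc):
    """MDC at the 95% confidence level from pooled test-retest scores."""
    pooled = np.concatenate([scores_session1, scores_session2])
    sem = pooled.std(ddof=1) * np.sqrt(1 - icc)  # standard error of measurement
    return 1.96 * sem * np.sqrt(2)

# Illustrative data: total isokinetic work (J) for 14 patients, two sessions.
rng = np.random.default_rng(0)
s1 = rng.normal(2000, 300, 14)
s2 = s1 + rng.normal(0, 80, 14)
print(f"MDC95 ~ {mdc95(s1, s2, icc=0.95):.0f} J")
```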

  8. Load Control System Reliability

    Trudnowski, Daniel [Montana Tech of the Univ. of Montana, Butte, MT (United States)


    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions to the project scope occurred in August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also included matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc.; and MSE. Research focused on two areas: real-time power-system load control methodologies, and power-system measurement-based stability-assessment operation and control tools. The majority of the effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  9. Supply chain reliability modelling

    Eugen Zaitsev


    Full Text Available Background: Today it is virtually impossible to operate alone on the international level in the logistics business. This promotes the establishment and development of new integrated business entities - logistic operators. However, such cooperation within a supply chain also creates many problems related to supply chain reliability as well as to the optimization of supply planning. The aim of this paper was to develop and formulate a mathematical model and algorithms to find the optimum plan of supplies, using an economic criterion and a model for evaluating the probability of failure-free operation of the supply chain. Methods: The mathematical model and algorithms to find the optimum plan of supplies were developed and formulated using an economic criterion and a model for evaluating the probability of failure-free operation of the supply chain. Results and conclusions: The problem of ensuring failure-free performance of a goods supply channel analyzed in the paper is characteristic of distributed network systems that make active use of business process outsourcing technologies. The complex planning problem occurring in such systems, which requires taking into account the consumer's requirements for failure-free performance in terms of supply volumes and correctness, can be reduced to a relatively simple linear programming problem through logical analysis of the structures. The sequence of operations which should be taken into account during supply planning with the supplier's functional reliability was presented.
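
    A toy linear program in the spirit of the reduction described above; the channels, costs, capacities, and demand are invented, and the reliability constraint is omitted for brevity.

```python
# Hedged sketch: a minimal supply-planning LP (illustrative data, not the
# paper's formulation): minimize total cost subject to meeting demand,
# with per-channel capacity limits.
from scipy.optimize import linprog

cost = [4.0, 5.5, 6.0]          # unit cost per supply channel
demand = 100.0                   # total volume required
capacity = [60.0, 50.0, 80.0]    # per-channel capacity

res = linprog(
    c=cost,
    A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand],   # volumes must sum to demand
    bounds=[(0, cap) for cap in capacity],
)
print(res.x, res.fun)  # optimal allocation and total cost
```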

  10. Future directions.

    Raffa, Robert B; Tallarida, Ronald J


    The chapters of this book summarize much of what has been done and reported regarding cancer chemotherapy-related cognitive impairment. In this chapter, we point out some future directions for investigation.

  11. Sustainable Futures

    Sustainable Futures is a voluntary program that encourages industry to use predictive models to screen new chemicals early in the development process and offers incentives to companies subject to TSCA section 5.

  12. Robot Futures

    Christoffersen, Anja; Grindsted Nielsen, Sally; Jochum, Elizabeth Ann

    Robots are increasingly used in health care settings, e.g., as homecare assistants and personal companions. One challenge for personal robots in the home is acceptance. We describe an innovative approach to influencing the acceptance of care robots using theatrical performance. Live performance is a useful testbed for developing and evaluating what makes robots expressive; it is also a useful platform for designing robot behaviors and dialogue that result in believable characters. Therefore theatre is a valuable testbed for studying human-robot interaction (HRI). We investigate how audiences perceive social robots interacting with humans in a future care scenario through a scripted performance. We discuss our methods and initial findings, and outline future work.

  13. OSS reliability measurement and assessment

    Yamada, Shigeru


    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.

  14. Reliability and validity in research.

    Roberts, Paula; Priest, Helena

    This article examines reliability and validity as ways to demonstrate the rigour and trustworthiness of quantitative and qualitative research. The authors discuss the basic principles of reliability and validity for readers who are new to research.

  15. Reliability and Its Quantitative Measures

    Alexandru ISAIC-MANIU


    Full Text Available In this article, an opening is made for software reliability issues through wide-ranging statistical indicators, which are designed based on information collected from operation or testing (samples). The reliability issues are also developed for the case of the main reliability laws (exponential, normal, Weibull), which, once validated for a particular system, allow the calculation of reliability indicators with a higher degree of accuracy and trustworthiness.
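
    As a hedged sketch of the indicators discussed above for the Weibull law: reliability function, hazard rate, and MTBF from assumed shape and scale parameters.

```python
# Hedged sketch: reliability indicators under a Weibull law, one of the
# "main reliability laws" named in the abstract. Parameters are illustrative.
import numpy as np
from scipy.special import gamma as gamma_fn

beta, eta = 1.8, 1000.0   # shape, scale (hours) -- assumed values

def reliability(t):
    """R(t) = exp(-(t/eta)^beta)"""
    return np.exp(-(t / eta) ** beta)

def hazard(t):
    """h(t) = (beta/eta) * (t/eta)^(beta-1)"""
    return (beta / eta) * (t / eta) ** (beta - 1)

mtbf = eta * gamma_fn(1 + 1 / beta)   # mean time between failures
print(f"R(500 h) = {reliability(500):.3f}, MTBF = {mtbf:.0f} h")
```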

  16. Detection and quantification of soil-transmitted helminths in environmental samples: A review of current state-of-the-art and future perspectives.

    Amoah, Isaac Dennis; Singh, Gulshan; Stenström, Thor Axel; Reddy, Poovendhree


    It is estimated that over a billion people are infected with soil-transmitted helminths (STHs) globally, with the majority occurring in tropical and subtropical regions of the world. The roundworm (Ascaris lumbricoides), whipworm (Trichuris trichiura), and hookworms (Ancylostoma duodenale and Necator americanus) are the main species infecting people. These infections are mostly acquired through exposure to faecally contaminated water, soil or contaminated food, with an increase in the risk of infection due to wastewater and sludge reuse in agriculture. Different methods have been developed for the detection and quantification of STH eggs in environmental samples. However, there is no universally accepted technique, which creates a challenge for comparative assessments of helminth egg concentrations both across different sample matrices and between locations. This review presents a comparison of reported methodologies for the detection of STH eggs, an assessment of the relative performance of available detection methods and a discussion of new emerging techniques that could be applied for detection and quantification. It is based on a literature search using PubMed and Science Direct considering all geographical locations. Original research articles were selected based on their methodology and results sections. Methods reported in these articles were grouped into conventional, molecular and emerging techniques, and the main steps in each method were then compared and discussed. The inclusion of a dissociation step aimed at detaching helminth eggs from particulate matter was found to improve the recovery of eggs. Additionally, the selection and application of flotation solutions that take into account the relative densities of the eggs of different STH species also results in higher egg recovery. Generally, the use of conventional methods was shown to be laborious, time consuming and prone to human error. The alternate use of nucleic acid

  17. 2017 NREL Photovoltaic Reliability Workshop

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)


    NREL's Photovoltaic (PV) Reliability Workshop (PVRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology -- both critical goals for moving PV technologies deeper into the electricity marketplace.

  18. Testing for PV Reliability (Presentation)

    Kurtz, S.; Bansal, S.


    The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.

  19. Reliable Quantum Computers

    Preskill, J


    The new field of quantum error correction has developed spectacularly since its origin less than two years ago. Encoded quantum information can be protected from errors that arise due to uncontrolled interactions with the environment. Recovery from errors can work effectively even if occasional mistakes occur during the recovery procedure. Furthermore, encoded quantum information can be processed without serious propagation of errors. Hence, an arbitrarily long quantum computation can be performed reliably, provided that the average probability of error per quantum gate is less than a certain critical value, the accuracy threshold. A quantum computer storing about 10^6 qubits, with a probability of error per quantum gate of order 10^{-6}, would be a formidable factoring engine. Even a smaller, less accurate quantum computer would be able to perform many useful tasks. (This paper is based on a talk presented at the ITP Conference on Quantum Coherence and Decoherence, 15-18 December 1996.)

  20. MRI quantification of rheumatoid arthritis: Current knowledge and future perspectives

    Boesen, Mikael [Parker Institute, Frederiksberg University Hospital, Copenhagen (Denmark)]; Ostergaard, Mikkel [Department of Rheumatology, Hvidovre and Herlev University Hospitals, Copenhagen (Denmark); Cimmino, Marco A. [Department of Rheumatology, University of Genoa, Genoa (Italy); Kubassova, Olga [Image Analysis LTD, Leeds (United Kingdom); Jensen, Karl Erik [Department of Radiology, MR section, Rigshospitalet, Copenhagen (Denmark); Bliddal, Henning [Parker Institute, Frederiksberg University Hospital, Copenhagen (Denmark)


    The international consensus on treatment of rheumatoid arthritis (RA) involves early initiation of disease modifying anti-rheumatic drugs (DMARDs), for which a reliable identification of early disease is mandatory. Conventional radiography of the joints is considered the standard method for detecting and quantifying joint damage in RA. However, radiographs only show late disease manifestations such as joint space narrowing and bone erosions, whereas they cannot detect synovitis and bone marrow oedema, i.e., inflammation in the synovium or the bone, which may be visualized by magnetic resonance imaging (MRI) months to years before erosions develop. Furthermore, MRI allows earlier visualization of bone erosions than radiography. In order to allow early treatment initiation and optimal guidance of the therapeutic strategy, there is a need for methods which are capable of early detection of inflammatory joint changes. In this review, we will discuss available data, advantages, limitations and the potential future of MRI in RA.

  1. Software Reliability, Measurement, and Testing. Volume 2. Guidebook for Software Reliability Measurement and Testing


    ... test experiments. Of the three static techniques (code review, error/anomaly detection, structure analysis) ... An anomaly is an unforeseen event, which may not be detected by error-protection mechanisms in time to prevent system failure. The existence of extensive ... event, the more difficult it is to make a meaningful prediction. As an example, it can be seen that the reliability of an electronic equipment is known


    Waterman, Brian; Sutter, Robert; Burroughs, Thomas; Dunagan, W Claiborne


    When evaluating physician performance measures, physician leaders are faced with the quandary of determining whether departures from expected physician performance measurements represent a true signal or random error. This uncertainty impedes the physician leader's ability and confidence to take appropriate performance improvement actions based on physician performance measurements. Incorporating reliability adjustment into physician performance measurement is a valuable way of reducing the impact of random error in the measurements, such as those caused by small sample sizes. Consequently, the physician executive has more confidence that the results represent true performance and is positioned to make better physician performance improvement decisions. Applying reliability adjustment to physician-level performance data is relatively new. As others have noted previously, it's important to keep in mind that reliability adjustment adds significant complexity to the production, interpretation and utilization of results. Furthermore, the methods explored in this case study only scratch the surface of the range of available Bayesian methods that can be used for reliability adjustment; further study is needed to test and compare these methods in practice and to examine important extensions for handling specialty-specific concerns (e.g., average case volumes, which have been shown to be important in cardiac surgery outcomes). Moreover, it's important to note that the provider group average as a basis for shrinkage is one of several possible choices that could be employed in practice and deserves further exploration in future research. With these caveats, our results demonstrate that incorporating reliability adjustment into physician performance measurements is feasible and can notably reduce the incidence of "real" signals relative to what one would expect to see using more traditional approaches. A physician leader who is interested in catalyzing performance improvement

  3. Reliability of Transcallosal Inhibition in Healthy Adults

    Fleming, Melanie K.; Newham, Di J.


    Transcallosal inhibition (TCI), assessed using transcranial magnetic stimulation, can provide insight into the neurophysiology of aging and of neurological disorders such as stroke. However, the reliability of TCI using the ipsilateral silent period (iSP) has not been formally assessed, despite its use in longitudinal studies. This study aimed to determine the reliability of iSP onset latency, duration and depth in healthy young and older adults. A sample of 18 younger (mean age 27.7 years, range: 19–42) and 13 older healthy adults (mean age 68.1 years, range: 58–79) attended four sessions whereby the iSP was measured from the first dorsal interosseous (FDI) muscle of each hand. 20 single pulse stimuli were delivered to each primary motor cortex at 80% maximum stimulator output while the participant maintained an isometric contraction of the ipsilateral FDI. The average onset latency, duration of the iSP, and depth of inhibition relative to baseline electromyography activity was calculated for each hand in each session. Intraclass correlation coefficients (ICCs) were calculated for all four sessions, or the first two sessions only. For iSP onset latency the reliability ranged from poor to good. For iSP duration there was moderate to good reliability (ICC > 0.6). Depth of inhibition demonstrated variation in reproducibility depending on which hand was assessed and whether two or four sessions were compared. Bland and Altman analyses showed wide limits of agreement between the first two sessions, particularly for iSP depth. However, there was no systematic pattern to the variability. These results indicate that although iSP duration is reliable in healthy adults, changes in longitudinal studies should be interpreted with caution, particularly for iSP depth. Future studies are needed to determine reliability in clinical populations. PMID:28119588
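
    A minimal sketch of the Bland and Altman analysis used above, computing the bias and 95% limits of agreement between two sessions; the iSP depth data are simulated, not the study's.

```python
# Hedged sketch: Bland-Altman limits of agreement between two sessions.
import numpy as np

def bland_altman(session1, session2):
    """Return mean difference (bias) and 95% limits of agreement."""
    diff = np.asarray(session1) - np.asarray(session2)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Example: simulated iSP depth (% inhibition) for one hand in two sessions.
rng = np.random.default_rng(1)
s1 = rng.normal(40, 10, 18)
s2 = s1 + rng.normal(0, 8, 18)
bias, loa = bland_altman(s1, s2)
print(f"bias = {bias:.1f}, LoA = ({loa[0]:.1f}, {loa[1]:.1f})")
```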

  4. Reliability of Maximal Strength Testing in Novice Weightlifters

    Loehr, James A.; Lee, Stuart M. C.; Feiveson, Alan H.; Ploutz-Snyder, Lori L.


    The one repetition maximum (1RM) is a criterion measure of muscle strength. However, the reliability of 1RM testing in novice subjects has received little attention. Understanding this information is crucial to accurately interpret changes in muscle strength. To evaluate the test-retest reliability of a squat (SQ), heel raise (HR), and deadlift (DL) 1RM in novice subjects. Twenty healthy males (31±5 y, 179.1±6.1 cm, 81.4±10.6 kg) with no weight training experience in the previous six months participated in four 1RM testing sessions, with each session separated by 5-7 days. SQ and HR 1RM were conducted using a Smith machine; DL 1RM was assessed using free weights. Session 1 was considered a familiarization and was not included in the statistical analyses. Repeated measures analysis of variance with Tukey's post-hoc tests were used to detect between-session differences in 1RM (p≤0.05). Test-retest reliability was evaluated by intraclass correlation coefficients (ICC). During Session 2, the SQ and DL 1RM (SQ: 90.2±4.3, DL: 75.9±3.3 kg) were less than Session 3 (SQ: 95.3±4.1, DL: 81.5±3.5 kg) and Session 4 (SQ: 96.6±4.0, DL: 82.4±3.9 kg), but there were no differences between Session 3 and Session 4. HR 1RM measured during Session 2 (150.1±3.7 kg) and Session 3 (152.5±3.9 kg) were not different from one another, but both were less than Session 4 (157.5±3.8 kg). The reliability (ICC) of 1RM measures for Sessions 2-4 were 0.88, 0.83, and 0.87, for SQ, HR, and DL, respectively. When considering only Sessions 3 and 4, the reliability was 0.93, 0.91, and 0.86 for SQ, HR, and DL, respectively. One familiarization session and 2 test sessions (for SQ and DL) were required to obtain excellent reliability (ICC ≥ 0.90) in 1RM values with novice subjects. We were unable to attain this level of reliability following 3 HR testing sessions, therefore additional sessions may be required to obtain an
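
    A hedged sketch of a test-retest ICC; the abstract does not state which ICC form was used, so a single-measures two-way model, ICC(3,1), is assumed, with simulated squat 1RM data.

```python
# Hedged sketch: ICC(3,1) -- single-measures, two-way mixed-effects,
# consistency -- an assumed (not confirmed) choice for test-retest reliability.
import numpy as np

def icc_3_1(X):
    """X: (n subjects x k sessions) array. Returns ICC(3,1)."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    grand = X.mean()
    ss_rows = k * ((X.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((X.mean(axis=0) - grand) ** 2).sum()   # between sessions
    ss_err = ((X - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Example: simulated squat 1RM (kg) for 20 subjects across two sessions.
rng = np.random.default_rng(2)
base = rng.normal(95, 18, 20)
data = np.column_stack([base, base + rng.normal(0, 5, 20)])
print(f"ICC(3,1) = {icc_3_1(data):.2f}")
```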

  5. Electronics reliability calculation and design

    Dummer, Geoffrey W A; Hiller, N


    Electronics Reliability-Calculation and Design provides an introduction to the fundamental concepts of reliability. The increasing complexity of electronic equipment has made problems in designing and manufacturing a reliable product more and more difficult. Specific techniques have been developed that enable designers to integrate reliability into their products, and reliability has become a science in its own right. The book begins with a discussion of basic mathematical and statistical concepts, including arithmetic mean, frequency distribution, median and mode, scatter or dispersion of mea

  6. Validity of Ultrasonography and Measures of Adult Shoulder Function and Reliability of Ultrasonography in Detecting Shoulder Synovitis in Patients With Rheumatoid Arthritis Using Magnetic Resonance Imaging as a Gold Standard

    Bruyn, G. A. W.; Pineda, C.; Hernandez-Diaz, C.; Ventura-Rios, L.; Moya, C.; Garrido, J.; Groen, H.; Pena, A.; Espinosa, R.; Moeller, I.; Filippucci, E.; Iagnocco, A.; Balint, P. V.; Kane, D.; D'Agostino, M-A; Angulo, M.; Ponte, R.; Fernandez-Gallardo, J. M.; Naredo, E.; Moller, I.

    Objective. To assess the intra- and interobserver reproducibility of musculoskeletal ultrasonography (US) in detecting inflammatory shoulder changes in patients with rheumatoid arthritis, and to determine the agreement between US and the Shoulder Pain and Disability Index (SPADI) and the Disabilities of the Arm, Shoulder, and Hand (DASH) questionnaire, using magnetic resonance imaging (MRI) as a gold standard.

  8. Hybrid reliability model for fatigue reliability analysis of steel bridges

    曹珊珊; 雷俊卿


    A kind of hybrid reliability model is presented to solve the fatigue reliability problems of steel bridges. The cumulative damage model is one of the models used in fatigue reliability analysis. The parameter characteristics of the model can be described as probabilistic and interval. The two-stage hybrid reliability model is given with a theoretical foundation and a solving algorithm to solve the hybrid reliability problems. The theoretical foundation is established by the consistency relationships of the interval reliability model and the probability reliability model with normally distributed variables in theory. The solving process combines the definition of the interval reliability index with the probabilistic algorithm. With consideration of the parameter characteristics of the S−N curve, the cumulative damage model with hybrid variables is given based on the standards from different countries. Lastly, a case of steel structure in the Neville Island Bridge is analyzed to verify the applicability of the hybrid reliability model in fatigue reliability analysis based on the AASHTO standard.
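
    As a hedged illustration of the cumulative damage model underlying the hybrid approach: Miner's rule with an S-N curve N = C/S^m; the curve constants and stress spectrum below are invented.

```python
# Hedged sketch: cumulative fatigue damage via Miner's rule with an S-N curve
# N = C / S^m, the kind of cumulative damage model the abstract builds on.
def miner_damage(spectrum, C=2.0e12, m=3.0):
    """spectrum: list of (stress_range_MPa, cycles). Failure when damage >= 1."""
    return sum(n / (C / s ** m) for s, n in spectrum)

# Illustrative stress spectrum: (stress range, applied cycles).
spectrum = [(80.0, 2.0e6), (120.0, 4.0e5), (160.0, 5.0e4)]
D = miner_damage(spectrum)
print(f"damage = {D:.2f} -> {'failed' if D >= 1 else 'safe'}")
```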

  9. Synthesis of Reliable Telecommunication Networks

    Dusan Trstensky


    Full Text Available In many applications, the network designer may need to know how to synthesise a reliable telecommunication network. Assume that a network, denoted Gn,e, has n nodes and e edges, and that the operational probability of each edge is known. The system reliability of the network is defined to be the probability that every pair of nodes can communicate with each other. The network synthesis problem considered in this paper is to find a network G*n,e that maximises system reliability over the class of all networks, for the classes of networks Gn,n-1, Gn,n and Gn,n+1 respectively. In addition, an upper bound on the maximum reliability for networks with n nodes and e edges (e > n+2) is derived in terms of n. Computational experiments for the reliability upper bound are also presented. The results show that the proposed reliability upper bound is effective.
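
    A minimal Monte Carlo sketch of the system (all-terminal) reliability defined above, for a small ring network with an assumed common edge operational probability; the paper's synthesis procedure itself is not reproduced.

```python
# Hedged sketch: Monte Carlo estimate of all-terminal reliability, i.e. the
# probability that every pair of nodes can still communicate.
import networkx as nx
import numpy as np

def all_terminal_reliability(G, p_edge, trials=20_000, seed=6):
    rng = np.random.default_rng(seed)
    ok = 0
    for _ in range(trials):
        H = nx.Graph()
        H.add_nodes_from(G.nodes)
        # Each edge survives independently with its operational probability.
        H.add_edges_from(e for e in G.edges if rng.random() < p_edge[e])
        ok += nx.is_connected(H)
    return ok / trials

G = nx.cycle_graph(5)                 # a ring of 5 nodes (class G_{n,n})
p_edge = {e: 0.9 for e in G.edges}    # assumed common edge probability
print(f"R ~ {all_terminal_reliability(G, p_edge):.3f}")
```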

  10. Energy Futures

    Davies, Sarah Rachael; Selin, Cynthia


    foresight and public and stakeholder engagement are used to reflect on, and direct, the impacts of new technology. In this essay we draw on our experience of anticipatory governance, in the shape of the 'NanoFutures' project on energy futures, to present a reflexive analysis of engagement and deliberation. We draw out five tensions of the practice of deliberation on energy technologies. Through tracing the lineages of these dilemmas, we discuss some of the implications of these tensions for the practice of civic engagement and deliberation in a set of questions for this community of practitioner-scholars.

  12. Future Contingents

    Øhrstrøm, Peter; Hasle., Per F. V.


    Statements such as “There will be a sea-battle tomorrow” could serve as standard examples. What could be called the problem of future contingents concerns how to ascribe truth-values to such statements. If there are several possible decisions out of which one is going to be made freely tomorrow, can there be a truth now about which one ... about the future. Finally, it should be mentioned that temporal logic has found a remarkable application in computer science and applied mathematics. In the late 1970s the first computer scientists realised the relevance of temporal logic for the purposes of computer science (see Hasle and Øhrstrøm 2004).

  13. Fault tolerant highly reliable inertial navigation system

    Jeerage, Mahesh; Boettcher, Kevin

    This paper describes the development of failure detection and isolation (FDI) strategies for highly reliable inertial navigation systems. FDI strategies are developed based on the generalized likelihood ratio test (GLRT). A relationship between the detection threshold and the false alarm rate is developed in terms of the sensor parameters. A new method for correct isolation of failed sensors is presented. Evaluations of FDI performance parameters, such as false alarm rate, wrong isolation probability, and correct isolation probability, are presented. Finally, a fault recovery scheme capable of correcting false isolation of good sensors is presented.
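
    As a hedged sketch of the threshold/false-alarm relationship mentioned above: if the GLRT statistic is chi-square distributed under the no-failure hypothesis (as for a linear-Gaussian parity residual), the threshold follows from a chi-square quantile; the degrees of freedom and false alarm rate below are assumptions.

```python
# Hedged sketch: GLRT detection threshold from a desired false alarm rate,
# assuming a chi-square distributed test statistic under "no failure".
from scipy.stats import chi2

def glrt_threshold(p_false_alarm: float, dof: int) -> float:
    """Threshold such that P(statistic > threshold | no failure) = Pfa."""
    return chi2.ppf(1.0 - p_false_alarm, dof)

# Example: redundant IMU with an assumed 2-dimensional parity residual.
print(f"threshold = {glrt_threshold(1e-4, dof=2):.2f}")
# Raising the threshold lowers the false alarm rate but delays detection.
```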

  14. Futur "simple" et futur "proche" ("Simple" Future and "Immediate" Future).

    Franckel, Jean-Jacques


    An analysis of the use of simple and immediate future tenses in French shows that the expression of time is controlled more by context and modals than by specifically temporal cues. The role of negation in this situation is discussed. (MSE)

  15. Creative Futures

    Feasey, Rosemary


    In 1999 the National Committee for Creativity and Culture in Education (NACCCE) produced a report called "All our futures." In many respects it was and still is a seminal report; it raises issues about creativity in education and offers serious messages for Government, schools and the inspection process. The author's research into teacher…

  16. Assessment of Lower Limb Muscle Strength and Power Using Hand-Held and Fixed Dynamometry: A Reliability and Validity Study.

    Mentiplay, Benjamin F; Perraton, Luke G; Bower, Kelly J; Adair, Brooke; Pua, Yong-Hao; Williams, Gavin P; McGaw, Rebekah; Clark, Ross A


    Hand-held dynamometry (HHD) has never previously been used to examine isometric muscle power. Rate of force development (RFD) is often used for muscle power assessment, however no consensus currently exists on the most appropriate method of calculation. The aim of this study was to examine the reliability of different algorithms for RFD calculation and to examine the intra-rater, inter-rater, and inter-device reliability of HHD as well as the concurrent validity of HHD for the assessment of isometric lower limb muscle strength and power. 30 healthy young adults (age: 23±5 yrs, male: 15) were assessed on two sessions. Isometric muscle strength and power were measured using peak force and RFD respectively using two HHDs (Lafayette Model-01165 and Hoggan microFET2) and a criterion-reference KinCom dynamometer. Statistical analysis of reliability and validity comprised intraclass correlation coefficients (ICC), Pearson correlations, concordance correlations, standard error of measurement, and minimal detectable change. Comparison of RFD methods revealed that a peak 200 ms moving window algorithm provided optimal reliability results. Intra-rater, inter-rater, and inter-device reliability analysis of peak force and RFD revealed mostly good to excellent reliability (coefficients ≥ 0.70) for all muscle groups. Concurrent validity analysis showed moderate to excellent relationships between HHD and fixed dynamometry for the hip and knee (ICCs ≥ 0.70) for both peak force and RFD, with mostly poor to good results shown for the ankle muscles (ICCs = 0.31-0.79). Hand-held dynamometry has good to excellent reliability and validity for most measures of isometric lower limb strength and power in a healthy population, particularly for proximal muscle groups. To aid implementation we have created freely available software to extract these variables from data stored on the Lafayette device. Future research should examine the reliability and validity of these variables in clinical
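
    A minimal sketch of the peak 200 ms moving-window RFD computation found most reliable above; the sampling rate and simulated force signal are assumptions, not the authors' implementation.

```python
# Hedged sketch: rate of force development (RFD) as the peak slope over any
# 200 ms window of a force-time signal.
import numpy as np

def peak_rfd_200ms(force: np.ndarray, fs: float) -> float:
    """Peak slope (N/s) over any 200 ms window of the force-time signal."""
    window = int(round(0.2 * fs))             # samples in 200 ms
    delta = force[window:] - force[:-window]  # force change across each window
    return delta.max() / 0.2                  # N per second

# Example: simulated 1 kHz force ramp reaching ~400 N with noise.
fs = 1000.0
t = np.arange(0, 2, 1 / fs)
force = 400 / (1 + np.exp(-8 * (t - 0.8))) + np.random.default_rng(3).normal(0, 2, t.size)
print(f"peak RFD = {peak_rfd_200ms(force, fs):.0f} N/s")
```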

  18. ECLSS Reliability for Long Duration Missions Beyond Lower Earth Orbit

    Sargusingh, Miriam J.; Nelson, Jason


    Reliability has been highlighted by NASA as critical to future human space exploration particularly in the area of environmental controls and life support systems. The Advanced Exploration Systems (AES) projects have been encouraged to pursue higher reliability components and systems as part of technology development plans. However, there is no consensus on what is meant by improving on reliability; nor on how to assess reliability within the AES projects. This became apparent when trying to assess reliability as one of several figures of merit for a regenerable water architecture trade study. In the Spring of 2013, the AES Water Recovery Project (WRP) hosted a series of events at the NASA Johnson Space Center (JSC) with the intended goal of establishing a common language and understanding of our reliability goals and equipping the projects with acceptable means of assessing our respective systems. This campaign included an educational series in which experts from across the agency and academia provided information on terminology, tools and techniques associated with evaluating and designing for system reliability. The campaign culminated in a workshop at JSC with members of the ECLSS and AES communities with the goal of developing a consensus on what reliability means to AES and identifying methods for assessing our low to mid-technology readiness level (TRL) technologies for reliability. This paper details the results of the workshop.

  19. Reliability Assessment Of Wind Turbines

    Sørensen, John Dalsgaard


    Reduction of the cost of energy for wind turbines is very important in order to make wind energy competitive compared to other energy sources. Therefore the turbine components should be designed to have sufficient reliability, but also not be too costly (and too safe). This paper presents models for uncertainty modeling and reliability assessment of especially the structural components such as tower, blades, substructure and foundation. But since the function of a wind turbine is highly dependent on many electrical and mechanical components as well as a control system, reliability aspects of these components are also discussed, and it is described how their reliability influences the reliability of the structural components. Two illustrative examples are presented considering uncertainty modeling, reliability assessment and calibration of partial safety factors for structural wind turbine components exposed...

  20. Nuclear weapon reliability evaluation methodology

    Wright, D.L. [Sandia National Labs., Albuquerque, NM (United States)


    This document provides an overview of those activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon's stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of opportunities for review that occur throughout the entire process, which assure a consistent approach and appropriate use of the data for reliability evaluation purposes.

  1. Reliability engineering theory and practice

    Birolini, Alessandro


    This book shows how to build in, evaluate, and demonstrate reliability and availability of components, equipment, and systems. It presents the state-of-the-art of reliability engineering, both in theory and practice, and is based on the author's more than 30 years of experience in this field, half in industry and half as Professor of Reliability Engineering at the ETH, Zurich. The structure of the book allows rapid access to practical results. This final edition extends and replaces all previous editions. New are, in particular, a strategy to mitigate incomplete coverage, a comprehensive introduction to human reliability with design guidelines and new models, and a refinement of reliability allocation, design guidelines for maintainability, and concepts related to regenerative stochastic processes. The set of problems for homework has been extended. Methods and tools are given in a way that they can be tailored to cover different reliability requirement levels and be used for safety analysis. Because of the Appendice...

  2. Validity of ultrasonography and measures of adult shoulder function and reliability of ultrasonography in detecting shoulder synovitis in patients with rheumatoid arthritis using magnetic resonance imaging as a gold standard.

    Bruyn, G A W


    To assess the intra- and interobserver reproducibility of musculoskeletal ultrasonography (US) in detecting inflammatory shoulder changes in patients with rheumatoid arthritis, and to determine the agreement between US and the Shoulder Pain and Disability Index (SPADI) and the Disabilities of the Arm, Shoulder, and Hand (DASH) questionnaire, using magnetic resonance imaging (MRI) as a gold standard.

  3. Lithium battery safety and reliability

    Levy, Samuel C.

    Lithium batteries have been used in a variety of applications for a number of years. As their use continues to grow, particularly in the consumer market, a greater emphasis needs to be placed on safety and reliability. There is a useful technique which can help to design cells and batteries having a greater degree of safety and higher reliability. This technique, known as fault tree analysis, can also be useful in determining the cause of unsafe behavior and poor reliability in existing designs.
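
    As a hedged illustration of fault tree analysis as described above: top-event probability from independent basic events combined through OR/AND gates; the events and probabilities below are invented.

```python
# Hedged sketch: a toy fault tree evaluation. Independence of basic events
# is assumed; events and probabilities are illustrative only.
def p_or(*ps):   # output event occurs if any input event occurs
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):  # output event occurs only if all input events occur
    out = 1.0
    for p in ps:
        out *= p
    return out

# "Cell vents" if (overcharge AND protection-circuit failure) OR internal short.
p_vent = p_or(p_and(1e-2, 1e-3), 1e-5)
print(f"P(vent) = {p_vent:.2e}")
```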

  4. Automated Detection and Evaluation of Swallowing Using a Combined EMG/Bioimpedance Measurement System

    Corinna Schultheiss


    Full Text Available Introduction. Developing an automated diagnostic and therapeutic instrument for treating swallowing disorders requires procedures able to reliably detect and evaluate a swallow. We tested a two-stage detection procedure based on a combined electromyography/bioimpedance (EMBI) measurement system. EMBI is able to detect swallows and distinguish them from similar movements in healthy test subjects. Study Design. The study was planned and conducted as a case-control study (EA 1/019/10, EA 1/160/09, EA 1/161/09). Method. The study looked at differences in swallowing parameters in general and in the event of penetration during swallows in healthy subjects and in patients with an oropharyngeal swallowing disorder. A two-stage automated swallow detection procedure which used electromyography (EMG) and bioimpedance (BI) to reliably detect swallows was developed. Results. Statistically significant differences between healthy subjects and patients with a swallowing disorder were found in swallowing parameters previously used to distinguish between swallowing and head movements. Our two-stage algorithm was able to reliably detect swallows (sensitivity = 96.1%, specificity = 97.1%) on the basis of these differences. Discussion. Using a two-stage detection procedure, the EMBI measurement procedure is able to detect and evaluate swallows automatically and reliably. The two procedures (EMBI + swallow detection) could in future form the basis for automated diagnosis and treatment (stimulation) of swallowing disorders.
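
    A trivial sketch of how the quoted sensitivity and specificity follow from a confusion matrix of detected versus annotated swallows; the counts below are invented to reproduce the reported rates.

```python
# Hedged sketch: sensitivity and specificity from confusion-matrix counts.
# The counts are illustrative, chosen to match the quoted 96.1% / 97.1%.
def sens_spec(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sens_spec(tp=244, fn=10, tn=268, fp=8)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```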

  6. My Future


    My fellow classmates, Everyone of us is thinking about the future. What is mine? I have decided to become a middle school teacher. Does it sound surprising? I have had this dream since I was only a child. I love children. I don't think dealing with them all year round is just wasting time. On the contrary, to me it would mean happiness. As we all can see, teachers are badly needed in our country,

  7. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz


    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power.
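
    As a hedged sketch of the a priori power analysis described above: sample size for a two-sample t-test to detect a 10% group difference at power 0.8 and alpha 0.05; the mean and between-subject coefficient of variation are assumptions that set the effect size.

```python
# Hedged sketch: a priori sample size for detecting a 10% group difference
# with a two-sample t-test. The assumed coefficient of variation drives
# the standardized effect size (Cohen's d).
from statsmodels.stats.power import TTestIndPower

mean, cv = 2.5, 0.12                        # e.g. cortical thickness (mm), 12% CV -- assumed
effect_size = (0.10 * mean) / (cv * mean)   # d for a 10% difference
n = TTestIndPower().solve_power(effect_size=effect_size, power=0.8, alpha=0.05)
print(f"d = {effect_size:.2f}, n per group = {n:.0f}")
```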

  8. Future detection and monitoring of diabetes may entail analysis of both β-cell function and volume: How markers of β-cell loss may assist

    Neutzsky-Wulff Anita V


    Full Text Available Abstract Disease heterogeneity is a major issue in Type II Diabetes Mellitus (T2DM), and this patient inter-variability might not be sufficiently reflected by measurements of glycated haemoglobin (HbA1c). β-cell dysfunction and β-cell death are initiating factors in the development of T2DM. In fact, β-cells are known to vanish prior to the development of T2DM, and autopsies of overt T2DM patients have shown a 60% reduction in β-cell mass. As the decline in β-cell function and mass has been proven to be a pathological trait in T2DM, methods for evaluating β-cell loss are becoming of more interest. However, evaluation of β-cell death or loss is currently invasive and unattainable for the vast majority of diabetes patients. Serological markers reflecting β-cell loss would be advantageous for detecting and monitoring the progression of T2DM. Biomarkers with such capacities could be neo-epitopes of proteins with high β-cell specificity containing post-translational modifications. Such tools may segregate T2DM patients into more appropriate treatment groups, based on their β-cell status, which is currently not possible. Presently, individuals presenting with adequately elevated levels of both insulin and glucose are classified as T2DM patients, while an important subdivision of those is pending, namely between those patients with sufficient β-cell capacity and those without. This may warrant two very different treatment options and patient care paths. Serological biomarkers reflecting β-cell health status may also assist the development of new drugs for T2DM and aid physicians in better characterization of individual patients, allowing them to tailor individual treatments and patient care protocols.

  9. Reliability testing of tendon disease using two different scanning methods in patients with rheumatoid arthritis

    Bruyn, George A W; Möller, Ingrid; Garrido, Jesus


    To assess the intra- and interobserver reliability of musculoskeletal ultrasonography (US) in detecting inflammatory and destructive tendon abnormalities in patients with RA using two different scanning methods....

  10. Reliability engineering theory and practice

    Birolini, Alessandro


    Presenting a solid overview of reliability engineering, this volume enables readers to build and evaluate the reliability of various components, equipment and systems. Current applications are presented, and the text itself is based on the author's 30 years of experience in the field.

  11. The Validity of Reliability Measures.

    Seddon, G. M.


    Demonstrates that some commonly used indices can be misleading in their quantification of reliability. The effects are most pronounced on gain or difference scores. Proposals are made to avoid sources of invalidity by using a procedure to assess reliability in terms of upper and lower limits for the true scores of each examinee. (Author/JDH)

  12. Reliability engineering in RF CMOS


    In this thesis new developments are presented for reliability engineering in RF CMOS. Given the increase in use of CMOS technology in applications for mobile communication, also the reliability of CMOS for such applications becomes increasingly important. When applied in these applications, CMOS is typically referred to as RF CMOS, where RF stands for radio frequencies.

  13. Reliability in automotive ethernet networks

    Soares, Fabio L.; Campelo, Divanilson R.; Yan, Ying;


    This paper provides an overview of in-vehicle communication networks and addresses the challenges of providing reliability in automotive Ethernet in particular.

  14. Estimation of Bridge Reliability Distributions

    Thoft-Christensen, Palle

    In this paper it is shown how the so-called reliability distributions can be estimated using crude Monte Carlo simulation. The main purpose is to demonstrate the methodology; therefore, very exact data concerning reliability and deterioration are not needed. However, it is intended in the paper to ...

  15. Reliability estimation using kriging metamodel

    Cho, Tae Min; Ju, Byeong Hyeon; Lee, Byung Chai [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Jung, Do Hyun [Korea Automotive Technology Institute, Chonan (Korea, Republic of)


    In this study, a new method for reliability estimation using a kriging metamodel is proposed. The kriging metamodel can be determined by an appropriate sampling range and number of sampling points because there are no random errors in the Design and Analysis of Computer Experiments (DACE) model. The first kriging metamodel is built from widely ranged sampling points. The Advanced First Order Reliability Method (AFORM) is applied to the first kriging metamodel to estimate the reliability approximately. Then, a second kriging metamodel is constructed using additional sampling points within an updated sampling range. Monte Carlo Simulation (MCS) is applied to the second kriging metamodel to evaluate the reliability. The proposed method is applied to numerical examples, and the results are almost equal to the reference reliability.
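
    To make the two-stage idea concrete, here is a minimal Python sketch, not the authors' code: a scikit-learn Gaussian process stands in for the DACE kriging model, the limit state g is a hypothetical stand-in for the paper's numerical examples, and the first-stage AFORM step is replaced, for brevity, by simply adding training points where the surrogate predicts g near zero.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    def g(x):
        """Hypothetical limit state; failure when g(x) < 0 (beta = 3.5)."""
        return 3.5 - (x[:, 0] + x[:, 1]) / np.sqrt(2.0)

    # First metamodel: widely ranged sampling points (no random error, DACE view).
    X = rng.uniform(-5.0, 5.0, size=(40, 2))
    gp = GaussianProcessRegressor(RBF(length_scale=2.0), normalize_y=True).fit(X, g(X))

    # Second metamodel: add points where the surrogate predicts g close to zero,
    # i.e. an updated sampling range around the limit state.
    cand = rng.uniform(-5.0, 5.0, size=(2000, 2))
    X = np.vstack([X, cand[np.argsort(np.abs(gp.predict(cand)))[:20]]])
    gp = GaussianProcessRegressor(RBF(length_scale=2.0), normalize_y=True).fit(X, g(X))

    # Monte Carlo simulation on the refined surrogate, inputs ~ N(0, I).
    u = rng.standard_normal((200_000, 2))
    print("estimated failure probability:", np.mean(gp.predict(u) < 0.0))
    ```

    Refining near g = 0 mirrors the paper's updated sampling range: surrogate accuracy matters most on the failure boundary, where the Monte Carlo indicator function switches.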

  16. Reliability-Based Code Calibration

    Faber, M.H.; Sørensen, John Dalsgaard


    The present paper addresses fundamental concepts of reliability based code calibration. First basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values....... Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore suggested values for acceptable annual failure probabilities are given for ultimate...... and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes....

  17. Photovoltaic performance and reliability workshop

    Mrig, L. [ed.]


    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986--1993. PV performance and PV reliability are at least as important as PV cost, if not more. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange technical knowledge and field experience related to current developments in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop, held in September 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  18. System Reliability for Offshore Wind Turbines

    Marquez-Dominguez, Sergio; Sørensen, John Dalsgaard


    ... In consequence, a rational treatment of uncertainties is done in order to assess the reliability of critical details in OWTs. Limit state equations are formulated for fatigue critical details which are not influenced by wake effects generated in offshore wind farms. Furthermore, typical bi-linear S-N curves are considered for reliability verification according to international design standards of OWTs. System effects become important for each substructure with many potential fatigue hot spots. Therefore, in this paper a framework for system effects is presented. This information can be, e.g., no detection of cracks in inspections or measurements from condition monitoring systems. Finally, an example is established to illustrate the practical application of this framework for a jacket type wind turbine substructure considering system effects....

  19. Reliability of Rapid Detection of Fresh Milk Microorganisms by an Instrument Method

    尚新彬; 李国恩


    The rapid detection of microorganisms in fresh milk was investigated by measuring microbial enzyme activity with a rapid microbial-activity detector for dairy products, so that changes in microbial counts can be detected quickly and the quality of fresh milk judged. Experimental results showed that the instrument method can complete the detection of the microbial content of one fresh milk sample within 30 minutes. Colony counts of Escherichia coli, Bacillus cereus, Proteus, Aspergillus glaucus, Aspergillus niger, Torulopsis, Staphylococcus aureus, Salmonella typhi, Shigella and other major organisms harmful to fresh milk quality were determined with both the rapid detection instrument and the national-standard plate count method; the correlation coefficients between the two methods were above 0.98, indicating a good linear correlation between them. When the two methods were used to monitor microbial activity during the storage of fresh milk, the results were likewise significantly correlated (R2 = 0.991).

  20. Reliability, return periods, and risk under nonstationarity

    Read, Laura K.; Vogel, Richard M.


    Water resources design has widely used the average return period as a concept to inform management and communication of the risk of experiencing an exceedance event within a planning horizon. Even though nonstationarity is often apparent, in practice hydrologic design often mistakenly assumes that the probability of exceedance, p, is constant from year to year, which leads to an average return period T_0 equal to 1/p; this expression is far more complex under nonstationarity. Even for stationary processes, the common application of an average return period is problematic: it does not account for planning horizon, is an average value that may not be representative of the time to the next flood, and is generally not applied in other areas of water planning. We combine existing theoretical and empirical results from the literature to provide the first general, comprehensive description of the probabilistic behavior of the return period and reliability under nonstationarity. We show that under nonstationarity, the underlying distribution of the return period exhibits a more complex shape than the exponential distribution under stationary conditions. Using a nonstationary lognormal model, we document the increased complexity and challenges associated with planning for future flood events over a planning horizon. We compare application of the average return period with the more common concept of reliability and recommend replacing the average return period with reliability as a more practical way to communicate event likelihood in both stationary and nonstationary contexts.
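
    The added complexity is easy to see numerically. The expected time to the first exceedance is E[T] = Σ t·p_t·Π_{s<t}(1 − p_s), which collapses to 1/p only when p_t is constant. A short sketch, with an assumed 2%-per-year upward trend in exceedance probability chosen purely for illustration:

    ```python
    import numpy as np

    def expected_return_period(p):
        """E[T] of the first exceedance given per-year probabilities p[0], p[1], ...
        (horizon long enough that the survival probability is negligible at the end)."""
        survive_before = np.cumprod(np.concatenate(([1.0], 1.0 - p[:-1])))
        years = np.arange(1, len(p) + 1)
        return float(np.sum(years * p * survive_before))

    p_const = np.full(5000, 0.01)                              # stationary case
    p_trend = np.minimum(0.01 * 1.02 ** np.arange(5000), 1.0)  # assumed 2%/yr growth

    print(expected_return_period(p_const))   # ~100 years, i.e. 1/p
    print(expected_return_period(p_trend))   # substantially shorter than 100
    ```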

  1. Stochastic models in reliability and maintenance


    Our daily lives are sustained by high-technology systems; computer systems are typical examples, and we enjoy our modern lives by using many of them. Much more importantly, we have to maintain such systems without failure, yet we cannot predict when they will fail or how to fix them without delay. A stochastic process is a set of outcomes of a random experiment indexed by time, and is one of the key tools needed to analyze future behavior quantitatively. Reliability and maintainability technologies are of great interest and importance for the maintenance of such systems. Many mathematical models have been and will be proposed to describe reliability and maintainability systems using stochastic processes. The theme of this book is "Stochastic Models in Reliability and Maintainability." This book consists of 12 chapters on the theme above from different viewpoints of stochastic modeling. Chapter 1 is devoted to "Renewal Processes," under which cla...

  2. Detection of pandemic strain of influenza virus (A/H1N1/pdm09) in pigs, West Africa: implications and considerations for prevention of future influenza pandemics at the source

    Oluwagbenga A. Adeola


    Full Text Available Background: Human and animal influenza are inextricably linked. In particular, the pig is uniquely important as a mixing vessel for genetic reassortment of influenza viruses, leading to the emergence of novel strains which may cause human pandemics. Significant reduction in transmission of influenza viruses from humans, and other animals, to swine may therefore be crucial for preventing future influenza pandemics. This study investigated the presence of the 2009 pandemic influenza A/H1N1 virus, A(H1N1)pdm09, in Nigerian and Ghanaian pigs, and also determined levels of acceptance of preventive measures which could significantly reduce the transmission of this virus from humans to pigs. Methods: Nasal swab specimens from 125 pigs in Ibadan, Nigeria, and Kumasi, Ghana, were tested for the presence of influenza A/California/04/2009 (H1N1) by quantitative antigen-detection ELISA. A semi-structured questionnaire was also administered to pig handlers in the two study areas and responses were analyzed to evaluate their compliance with seven measures for preventing human-to-swine transmission of influenza viruses. Results: The virus was detected among pigs in the two cities, with prevalence of 8% in Ibadan and 10% in Kumasi. Levels of compliance of pig handlers with relevant preventive measures were also found to be mostly below 25 and 40% in Ibadan and Kumasi, respectively. Conclusion: Detection of influenza A(H1N1)pdm09 among pigs tested suggests the possibility of human-to-swine transmission, which may proceed even more rapidly, considering the very poor acceptance of basic preventive measures observed in this study. This is also the first report on detection of influenza A(H1N1)pdm09 in Ghanaian pigs. We recommend improvement in personal hygiene among pig handlers, enforcement of sick leave particularly during the first few days of influenza-like illnesses, and training of pig handlers on recognition of influenza-like signs in humans and pigs. These could be

  3. Future Summary

    Wilczek, Frank


    We are emerging from a period of consolidation in particle physics. Its great, historic achievement was to establish the Theory of Matter. This Theory will serve as our description of ordinary matter under ordinary conditions -- allowing for an extremely liberal definition of "ordinary" -- for the foreseeable future. Yet there are many indications, ranging from the numerical to the semi-mystical, that a new fertile period lies before us. We will discover compelling evidence for the unification of fundamental forces and for new quantum dimensions (low-energy supersymmetry). We will identify new forms of matter, which dominate the mass density of the Universe. We will achieve much better fundamental understanding of the behavior of matter in extreme astrophysical and cosmological environments. Lying beyond these expectations, we can identify deep questions that seem to call for ideas outside our present grasp. And there's still plenty of room for surprises.

  4. Future Summary

    Wilczek, Frank

    We are emerging from a period of consolidation in particle physics. Its great, historic achievement was to establish the Theory of Matter. This Theory will serve as our description of ordinary matter under ordinary conditions - allowing for an extremely liberal definition of ``ordinary'' - for the foreseeable future. Yet there are many indications, ranging from the numerical to the semimystical, that a new fertile period lies before us. We will discover compelling evidence for the unification of fundamental forces and for new quantum dimensions (low-energy supersymmetry). We will identify new forms of matter, which dominate the mass density of the Universe. We will achieve much better fundamental understanding of the behavior of matter in extreme astrophysical and cosmological environments. Lying beyond these expectations, we can identify deep questions that seem to call for ideas outside our present grasp. And there is still plenty of room for surprises.

  5. Future Talks,

    Catherine Defeyt


    Full Text Available The conservation of modern materials and the difficulties that characterize it were the subject of the international symposium Future Talks, organized by Die Neue Sammlung, The International Design Museum, on 22 and 23 October 2009 in Munich. Specialized conservator-restorers, representatives of the most prestigious museum institutions of Europe and across the Atlantic, as well as researchers in the applied sciences presented their work and research there. In the field of design and modern art ...

  6. Does low-field dedicated extremity MRI (E-MRI) reliably detect bone erosions in rheumatoid arthritis? A comparison of two different E-MRI units and conventional radiography with high-resolution CT scanning

    Duer-Jensen, A; Ejbjerg, B; Albrecht-Beste, E


    OBJECTIVES: To compare the ability of two different E-MRI units and conventional radiography (CR) to identify bone erosions in rheumatoid arthritis (RA) metacarpophalangeal (MCP) and wrist joints with CT scanning as the standard reference method. METHODS: 20 patients with RA and 5 controls...... underwent CR, CT and two E-MRI examinations (Esaote Biomedica Artoscan and MagneVu MV1000) of one hand during a 2-week period. In all modalities, each bone of the wrist and MCP joints was blindly evaluated for erosions. MagneVu images were also assessed for the proportion of each bone being visualised...... were visualised entirely and 37.9% of bones were 67-99% visualised. In MCP joints, 84.2% of bones were visualised entirely and 15.8% of bones were 67-99% visualised. CONCLUSION: With CT as the reference method for detecting erosions in RA hands, the Artoscan showed higher sensitivity than the MagneVu...

  7. Electronic parts reliability data 1997

    Denson, William; Jaworski, Paul; Mahar, David


    This document contains reliability data on both commercial and military electronic components for use in reliability analyses. It contains failure rate data on integrated circuits, discrete semiconductors (diodes, transistors, optoelectronic devices), resistors, capacitors, and inductors/transformers, all of which were obtained from the field usage of electronic components. At 2,000 pages, the format of this document is the same as RIAC's popular NPRD document which contains reliability data on nonelectronic component and electronic assembly types. Data includes part descriptions, quality level, application environments, point estimates of failure rate, data sources, number of failures, total operating hours, miles, or cycles, and detailed part characteristics.

  8. Mass and Reliability System (MaRS)

    Barnes, Sarah


    ...: operation hours, random/nonrandom failures, software/hardware failures, quantity, orbital replaceable units (ORU), date of placement, unit weight, frequency of part, etc. The motivation for creating such a database is the development of a mass/reliability parametric model to estimate the mass required for replacement parts. Once complete, engineers working on future space flight missions will have access to mean-time-to-failure data on parts along with their mass, which will be used to make proper decisions for long-duration space flight missions

  9. Are Specialist Certification Examinations a Reliable Measure of Physician Competence?

    Burch, V. C.; Norman, G. R.; Schmidt, H. G.; van der Vleuten, C. P. M.


    High stakes postgraduate specialist certification examinations have considerable implications for the future careers of examinees. Medical colleges and professional boards have a social and professional responsibility to ensure their fitness for purpose. To date there is a paucity of published data about the reliability of specialist certification…

  10. Quantitative metal magnetic memory reliability modeling for welded joints

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng


    Metal magnetic memory (MMM) testing has been widely used to detect welded joints. However, load levels, environmental magnetic fields, and measurement noise make MMM data dispersive and bring difficulty to quantitative evaluation. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens are tested along the longitudinal and horizontal lines by a TSC-2M-8 instrument in tensile fatigue experiments. X-ray testing is carried out synchronously to verify the MMM results. It is found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs is investigated, which shows that K_vs obeys a Gaussian distribution. So K_vs is a suitable MMM parameter to establish a reliability model of welded joints. At last, an original quantitative MMM reliability model is first presented based on the improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases with the decreasing residual life ratio T, and the maximal error between the predicted reliability degree R_1 and the verification reliability degree R_2 is 9.15%. This presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
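
    For reference, the stress-strength interference idea underlying such models reduces, in its classical normal-normal form, to R = Φ((μ_strength − μ_stress)/√(σ_strength² + σ_stress²)). A minimal sketch with illustrative numbers (not the paper's Q235 data):

    ```python
    from math import sqrt
    from scipy.stats import norm

    def reliability_normal(mu_strength, sd_strength, mu_stress, sd_stress):
        """Stress-strength interference with independent normal variables:
        R = P(strength > stress) = Phi(beta)."""
        beta = (mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
        return norm.cdf(beta)

    # Illustrative weld numbers (MPa), not the paper's data: beta = 3, R ~ 0.9987.
    print(reliability_normal(450.0, 30.0, 300.0, 40.0))
    ```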

  11. Reliability Modeling of Wind Turbines

    Kostandyan, Erik

    ... components. Thus, models of reliability should be developed and applied in order to quantify the residual life of the components. Damage models based on physics of failure combined with stochastic models describing the uncertain parameters are imperative for development of cost-optimal decision tools for Operation & Maintenance planning. Concentrating efforts on development of such models, this research is focused on reliability modeling of Wind Turbine critical subsystems (especially the power converter system). For reliability assessment of these components, structural reliability methods are applied and uncertainties are quantified. Further, estimation of annual failure probability for structural components taking into account possible faults in electrical or mechanical systems is considered. For a representative structural failure mode, a probabilistic model is developed that incorporates grid loss failures...

  12. Reliability Analysis of Wind Turbines

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard


    In order to minimise the total expected life-cycle costs of a wind turbine it is important to estimate the reliability level for all components in the wind turbine. This paper deals with reliability analysis for the tower and blades of onshore wind turbines placed in a wind farm. The limit states considered are, in the ultimate limit state (ULS), extreme conditions in the standstill position and extreme conditions during operation. For wind turbines, where the magnitude of the loads is influenced by the control system, the ultimate limit state can occur in both cases. In the fatigue limit state (FLS......) the reliability level for a wind turbine placed in a wind farm is considered, and wake effects from neighbouring wind turbines are taken into account. An illustrative example with calculation of the reliability for mudline bending of the tower is considered. In the example the design is determined according...

  13. Reliability analysis in intelligent machines

    Mcinroy, John E.; Saridis, George N.


    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques for finding the reliability corresponding to alternative subsets of control and sensing strategies are proposed such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.

  14. Reliability Assessment of Wind Turbines

    Sørensen, John Dalsgaard


    ... but manufactured in series production based on many component tests, some prototype tests and zeroseries wind turbines. These characteristics influence the reliability assessment where focus in this paper is on the structural components. Levelized Cost Of Energy is very important for wind energy, especially when comparing to other energy sources. Therefore much focus is on cost reductions and improved reliability both for offshore and onshore wind turbines. The wind turbine components should be designed to have sufficient reliability level with respect to both extreme and fatigue loads but also not be too costly (and safe). In probabilistic design the single components are designed to a level of reliability, which accounts for an optimal balance between failure consequences, cost of operation & maintenance, material costs and the probability of failure. Furthermore, using a probabilistic design basis...

  15. On Bayesian System Reliability Analysis

    Soerensen Ringi, M.


    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model incorporates the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.
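
    The kind of updating argued for here can be illustrated with the simplest conjugate case: a Beta prior on a per-demand survival probability updated by Binomial evidence. This is a generic sketch of the Bayesian mechanism, not the thesis' dependent-environment model:

    ```python
    from scipy.stats import beta

    a, b = 1.0, 1.0                 # Beta(1, 1): a uniform prior state of knowledge

    # New relevant information: 48 successes in 50 demands on similar components.
    successes, failures = 48, 2
    posterior = beta(a + successes, b + failures)

    print("posterior mean reliability:", posterior.mean())
    print("95% credible interval:", posterior.interval(0.95))
    ```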

  16. VCSEL reliability: a user's perspective

    McElfresh, David K.; Lopez, Leoncio D.; Melanson, Robert; Vacar, Dan


    VCSEL arrays are being considered for use in interconnect applications that require high speed, high bandwidth, high density, and high reliability. In order to better understand the reliability of VCSEL arrays, we initiated an internal project at SUN Microsystems, Inc. In this paper, we present preliminary results of an ongoing accelerated temperature-humidity-bias stress test on VCSEL arrays from several manufacturers. This test revealed no significant differences between the reliability of AlGaAs, oxide confined VCSEL arrays constructed with a trench oxide and mesa for isolation. This test did find that the reliability of arrays needs to be measured on arrays and not be estimated with the data from singulated VCSELs as is a common practice.

  17. Innovations in power systems reliability

    Santora, Albert H; Vaccaro, Alfredo


    Electrical grids are among the world's most reliable systems, yet they still face a host of issues, from aging infrastructure to questions of resource distribution. Here is a comprehensive and systematic approach to tackling these contemporary challenges.

  18. MEMS reliability: coming of age

    Douglass, Michael R.


    In today's high-volume semiconductor world, one could easily take reliability for granted. As the MOEMS/MEMS industry continues to establish itself as a viable alternative to conventional manufacturing in the macro world, reliability can be of high concern. Currently, there are several emerging market opportunities in which MOEMS/MEMS is gaining a foothold. Markets such as mobile media, consumer electronics, biomedical devices, and homeland security are all showing great interest in microfabricated products. At the same time, these markets are among the most demanding when it comes to reliability assurance. To be successful, each company developing a MOEMS/MEMS device must consider reliability on an equal footing with cost, performance and manufacturability. What can this maturing industry learn from the successful development of DLP technology, air bag accelerometers and inkjet printheads? This paper discusses some basic reliability principles which any MOEMS/MEMS device development must use. Examples from the commercially successful and highly reliable Digital Micromirror Device complement the discussion.

  19. Reliability of the hip examination in osteoarthritis: effect of standardization.

    Cibere, Jolanda; Thorne, Anona; Bellamy, Nicholas; Greidanus, Nelson; Chalmers, Andrew; Mahomed, Nizar; Shojania, Kam; Kopec, Jacek; Esdaile, John M


    To assess the reliability of the physical examination of the hip in osteoarthritis (OA) among rheumatologists and orthopedic surgeons, and to evaluate the benefits of standardization. Thirty-five physical signs and techniques were evaluated using a 6 x 6 Latin square design. Subjects with mild to severe hip OA, based on physical and radiographic signs, were examined in random order prior to and following standardization of physical examination techniques. For dichotomous signs, agreement was calculated as the prevalence-adjusted bias-adjusted kappa (PABAK), whereas for continuous and ordinal signs a reliability coefficient was calculated using analysis of variance. A PABAK >0.60 and a reliability coefficient >0.80 were considered to indicate adequate reliability. Adequate post-standardization reliability was achieved for 25 (71%) of 35 signs. The most highly reliable signs included true and apparent leg length discrepancy ≥1.5 cm; hip flexion, abduction, adduction, and extension strength; log roll test for hip pain; internal rotation and flexion range of motion; and Thomas test for flexion contracture. The standardization process was associated with substantial improvements in reliability for a number of physical signs, although minimal or no change was noted for some. Only 1 sign, Trendelenburg's sign, was highly unreliable post-standardization. With the exception of gait, a comprehensive hip examination can be performed with adequate reliability. Post-standardization reliability is improved compared with pre-standardization reliability for some physical signs. The application of these findings to future OA studies will contribute to improved outcome assessments in OA.
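
    For a dichotomous sign rated by two examiners, PABAK has a closed form: it is simply twice the observed agreement minus one. A minimal sketch with hypothetical counts:

    ```python
    def pabak(n11, n10, n01, n00):
        """Prevalence-adjusted bias-adjusted kappa for two raters and a
        dichotomous sign: PABAK = 2 * p_observed - 1."""
        p_obs = (n11 + n00) / (n11 + n10 + n01 + n00)
        return 2.0 * p_obs - 1.0

    # Hypothetical counts for one sign (both present / disagreements / both absent):
    print(pabak(n11=20, n10=3, n01=2, n00=25))   # 0.8, above the 0.60 threshold
    ```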

  20. Supernova detection

    Nakahata, Masayuki [Kamioka Observatory, Institute for Cosmic Ray Research, University of Tokyo, Higashi-Mozumi, Kamioka-cho, Hida-shi, Gifu, Japan, 506-1205 (Japan)]


    The detection of supernova neutrinos is reviewed, focusing on the current status of experiments to detect supernova burst neutrinos and supernova relic neutrinos. The capabilities of each detector currently operating and in development are assessed and the likely neutrino yield for a future supernova is estimated. It is expected that much more information will be obtained if a supernova burst were to occur in our Galaxy than was obtained for supernova SN1987A. The detection of supernova relic neutrinos is considered and it is concluded that a large volume detector with a neutron tagging technique is necessary.

  1. Robust Design of Reliability Test Plans Using Degradation Measures.

    Lane, Jonathan Wesley; Crowder, Stephen V.


    With short production development times, there is an increased need to demonstrate product reliability relatively quickly with minimal testing. In such cases there may be few if any observed failures. Thus, it may be difficult to assess reliability using the traditional reliability test plans that measure only time (or cycles) to failure. For many components, degradation measures will contain important information about performance and reliability. These measures can be used to design a minimal test plan, in terms of number of units placed on test and duration of the test, necessary to demonstrate a reliability goal. Generally, the assumption is made that the error associated with a degradation measure follows a known distribution, usually normal, although in practice cases may arise where that assumption is not valid. In this paper, we examine such degradation measures, both simulated and real, and present non-parametric methods to demonstrate reliability and to develop reliability test plans for the future production of components with this form of degradation.
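
    One way to realize such a plan is to convert each unit's degradation path into a pseudo failure time (the time its fitted path crosses a failure threshold) and then bound reliability at the mission time nonparametrically with a bootstrap. The sketch below assumes hypothetical linear-drift data and a threshold of 1.0; it illustrates the general idea, not the authors' specific procedure.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical linear-drift degradation data: 30 units, 10 inspections,
    # failure declared when degradation crosses the threshold 1.0.
    t = np.arange(1.0, 11.0)
    true_slopes = rng.normal(0.05, 0.015, 30)
    paths = true_slopes[:, None] * t + rng.normal(0.0, 0.01, (30, 10))

    # Per-unit least-squares slope through the origin -> pseudo failure time.
    slopes_hat = paths @ t / (t @ t)
    pseudo_fail_time = 1.0 / slopes_hat

    mission = 15.0
    r_hat = np.mean(pseudo_fail_time > mission)

    # Nonparametric bootstrap lower confidence bound on R(mission).
    boot = [np.mean(rng.choice(pseudo_fail_time, size=30) > mission)
            for _ in range(5000)]
    print("point estimate:", r_hat)
    print("95% lower bound:", np.percentile(boot, 5))
    ```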

  2. Recent advances in computational structural reliability analysis methods

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.


    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonate vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
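
    Much of the single-mode machinery referred to above rests on FORM: locate the most probable failure point in standard normal space and read off the reliability index β. A minimal sketch of the Hasofer-Lind/Rackwitz-Fiessler iteration for a hypothetical limit state (assumed already transformed to standard normal variables):

    ```python
    import numpy as np
    from scipy.stats import norm

    def g(u):
        """Hypothetical limit state in standard normal space; failure when g < 0."""
        return 3.0 - u[0] - 0.5 * u[1] + 0.1 * u[0] ** 2

    def grad(u, h=1e-6):
        # Central-difference gradient of g.
        return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h)
                         for e in np.eye(len(u))])

    u = np.zeros(2)                      # start at the origin
    for _ in range(100):                 # Hasofer-Lind / Rackwitz-Fiessler steps
        dg = grad(u)
        u_next = (dg @ u - g(u)) * dg / (dg @ dg)
        if np.linalg.norm(u_next - u) < 1e-9:
            u = u_next
            break
        u = u_next

    beta = np.linalg.norm(u)
    print("reliability index beta:", beta)
    print("FORM failure probability:", norm.cdf(-beta))
    ```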

  3. Future food.

    Wahlqvist, Mark L


    Food systems have changed markedly with human settlement and agriculture, industrialisation, trade, migration and now the digital age. Throughout these transitions, there has been a progressive population explosion and net ecosystem loss and degradation. Climate change now gathers pace, exacerbated by ecological dysfunction. Our health status has been challenged by a developing people-environment mismatch. We have regarded ecological conquest and innovative technology as solutions, but have not understood how ecologically dependent and integrated we are. We are ecological creatures interfaced by our sensoriness, microbiomes, shared regulatory (endocrine) mechanisms, immune system, biorhythms and nutritional pathways. Many of us are 'nature-deprived'. We now suffer what might be termed ecological health disorders (EHD). If there were less of us, nature's resilience might cope, but more than 9 billion people by 2050 is probably an intolerable demand on the planet. Future food must increasingly take into account the pressures on ecosystem-dependent food systems, with foods probably less biodiverse, although eating in this way allows optimal health; energy dysequilibrium with less physical activity and foods inappropriately energy dense; and less socially-conducive food habits. 'Personalised Nutrition', with extensive and resource-demanding nutrigenomic, metabolomic and microbiomic data may provide partial health solutions in clinical settings, but not be justified for ethical, risk management or sustainability reasons in public health. The globally prevalent multidimensional malnutritional problems of food insecurity, quality and equity require local, regional and global action to prevent further ecosystem degradation as well as to educate, provide sustainable livelihoods and encourage respectful social discourse and practice about the role of food.

  4. Reliability of plantar pressure platforms.

    Hafer, Jocelyn F; Lenhoff, Mark W; Song, Jinsup; Jordan, Joanne M; Hannan, Marian T; Hillstrom, Howard J


    Plantar pressure measurement is common practice in many research and clinical protocols. While the accuracy of some plantar pressure measuring devices and methods for ensuring consistency in data collection on plantar pressure measuring devices have been reported, the reliability of different devices when testing the same individuals is not known. This study calculated intra-mat, intra-manufacturer, and inter-manufacturer reliability of plantar pressure parameters as well as the number of plantar pressure trials needed to reach a stable estimate of the mean for an individual. Twenty-two healthy adults completed ten walking trials across each of two Novel emed-x(®) and two Tekscan MatScan(®) plantar pressure measuring devices in a single visit. Intraclass correlation (ICC) was used to describe the agreement between values measured by different devices. All intra-platform reliability correlations were greater than 0.70. All inter-emed-x(®) reliability correlations were greater than 0.70. Inter-MatScan(®) reliability correlations were greater than 0.70 in 31 and 52 of 56 parameters when looking at a 10-trial average and a 5-trial average, respectively. Inter-manufacturer reliability including all four devices was greater than 0.70 for 52 and 56 of 56 parameters when looking at a 10-trial average and a 5-trial average, respectively. All parameters reached a value within 90% of an unbiased estimate of the mean within five trials. Overall, reliability results are encouraging for investigators and clinicians who may have plantar pressure data sets that include data collected on different devices.

  5. Reliability and Validity of Dual-Task Mobility Assessments in People with Chronic Stroke.

    Lei Yang

    Full Text Available The ability to perform a cognitive task while walking simultaneously (dual-tasking) is important in real life. However, the psychometric properties of dual-task walking tests have not been well established in stroke. To assess the test-retest reliability, concurrent and known-groups validity of various dual-task walking tests in people with chronic stroke. Observational measurement study with a test-retest design. Eighty-eight individuals with chronic stroke participated. The testing protocol involved four walking tasks (walking forward at self-selected and maximal speed, walking backward at self-selected speed, and crossing over obstacles) performed simultaneously with each of three attention-demanding tasks (verbal fluency, serial 3 subtractions, or carrying a cup of water). For each dual-task condition, the time taken to complete the walking task, the correct response rate (CRR) of the cognitive task, and the dual-task effect (DTE) for the walking time and CRR were calculated. Forty-six of the participants were tested twice within 3-4 days to establish test-retest reliability. The walking time in various dual-task assessments demonstrated good to excellent reliability [intraclass correlation coefficient ICC(2,1) = 0.70-0.93; relative minimal detectable change at the 95% confidence level (MDC95%) = 29%-45%]. The reliability of the CRR (ICC(2,1) = 0.58-0.81) and of the DTE in walking time (ICC(2,1) = 0.11-0.80) was more varied. The reliability of the DTE in CRR (ICC(2,1) = -0.31 to 0.40) was poor to fair. The walking time and CRR obtained in various dual-task walking tests were moderately to strongly correlated with those of the dual-task Timed-up-and-Go test, thus demonstrating good concurrent validity. None of the tests could discriminate fallers (those who had sustained at least one fall in the past year) from non-fallers. The results are generalizable to community-dwelling individuals with chronic stroke only. The walking time derived from the various dual
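
    The two reliability statistics quoted here are mechanical to compute. Below is a minimal sketch of ICC(2,1) from the two-way ANOVA mean squares, plus one common formulation of the minimal detectable change, using hypothetical test-retest walking times:

    ```python
    import numpy as np

    def icc_2_1(x):
        """ICC(2,1): two-way random effects, absolute agreement, single measure.
        x has shape (subjects, sessions)."""
        n, k = x.shape
        grand = x.mean()
        ms_rows = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)
        ms_cols = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)
        resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0, keepdims=True) + grand
        ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                     + k * (ms_cols - ms_err) / n)

    # Hypothetical test-retest walking times (s) for six subjects:
    x = np.array([[10.2, 10.5], [12.1, 11.8], [9.4, 9.6],
                  [14.0, 13.5], [11.2, 11.4], [8.9, 9.1]])
    icc = icc_2_1(x)
    sem = x.std(ddof=1) * np.sqrt(1.0 - icc)   # standard error of measurement
    mdc95 = 1.96 * np.sqrt(2.0) * sem          # minimal detectable change (absolute)
    print(icc, mdc95)
    ```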

  6. Reliability with imperfect diagnostics. [flight-maintenance sequence

    White, A. L.


    A reliability estimation method for systems that continually accumulate faults because of imperfect diagnostics is developed and an application for redundant digital avionics is presented. The present method assumes that if a fault does not appear in a short period of time, it will remain hidden until a majority of components are faulty and the system fails. A certain proportion of a component's faults are detected in a short period of time, and a description of their detection is included in the reliability model. A Markov model of failure during flight for a nonreconfigurable five-plex is presented for a sequence of one-hour flights followed by maintenance.
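
    The flight-maintenance sequence can be sketched as a small discrete-time Markov model: faults accumulate during each flight, the system fails once a majority (3 of 5) of components is faulty, and maintenance catches each latent fault only with some probability. The fault rate and detection fraction below are assumed for illustration, not taken from the paper.

    ```python
    import numpy as np
    from math import comb, exp

    lam = 1e-4    # per-component fault rate per flight hour (assumed)
    det = 0.9     # probability a latent fault is caught at maintenance (assumed)
    n_flights, t_flight = 1000, 1.0

    q = 1.0 - exp(-lam * t_flight)   # P(one healthy component faults in a flight)

    # p[i]: probability of starting a flight with i latent faults (i = 0..2);
    # 3 or more faulty components defeat majority voting, i.e. system failure.
    p = np.array([1.0, 0.0, 0.0])
    p_fail = 0.0

    for _ in range(n_flights):
        flown = np.zeros(3)
        for i in range(3):
            healthy = 5 - i
            for j in range(3 - i):   # j new faults, total still below 3
                flown[i + j] += p[i] * comb(healthy, j) * q**j * (1 - q)**(healthy - j)
        p_fail += p.sum() - flown.sum()       # mass that reached 3+ faults in flight
        # Imperfect diagnostics at maintenance: each latent fault found w.p. det.
        p = np.zeros(3)
        for i in range(3):
            for r in range(i + 1):            # r of the i faults repaired
                p[i - r] += flown[i] * comb(i, r) * det**r * (1 - det)**(i - r)

    print(f"P(system failure over {n_flights} flights) ~ {p_fail:.3e}")
    ```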

  7. 78 FR 38311 - Reliability Technical Conference Agenda


    ..., Fix, Track, and Report program enhanced reliability? b. What is the status of the NERC Reliability... Federal Energy Regulatory Commission: Reliability Technical Conference Agenda. Reliability Technical Conference, Docket No. AD13-6-000; North American Electric Reliability Corporation, Docket No. RC11-6-004....

  8. Reliability in sealing of canister for spent nuclear fuel

    Ronneteg, Ulf [Bodycote Materials Testing AB, Nykoeping (Sweden); Cederqvist, Lars; Ryden, Haakan [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Oeberg, Tomas [Tomas Oeberg Konsult AB, Karlskrona (Sweden); Mueller, Christina [Federal Inst. for Materials Research and Testing, Berlin (Germany)


    The reliability of the system for sealing the canister and inspecting the weld that has been developed for the Encapsulation plant was investigated. In the investigation the occurrence of discontinuities that can be formed in the welds was determined both qualitatively and quantitatively. The probability that these discontinuities can be detected by nondestructive testing (NDT) was also studied. The friction stir welding (FSW) process was verified in several steps. The variables in the welding process that determine weld quality were identified during the development work. In order to establish the limits within which they can be allowed to vary, a screening experiment was performed where the different process settings were tested according to a given design. In the next step the optimal process setting was determined by means of a response surface experiment, whereby the sensitivity of the process to different variable changes was studied. Based on the optimal process setting, the process window was defined, i.e. the limits within which the welding variables must lie in order for the process to produce the desired result. Finally, the process was evaluated during a demonstration series of 20 sealing welds which were carried out under production-like conditions. Conditions for the formation of discontinuities in welding were investigated. The investigations show that the occurrence of discontinuities is dependent on the welding variables. Discontinuities that can arise were classified and described with respect to characteristics, occurrence, cause and preventive measures. To ensure that testing of the welds has been done with sufficient reliability, the probability of detection (POD) of discontinuities by NDT and the accuracy of size determination by NDT were determined. In the evaluation of the demonstration series, which comprised 20 welds, a statistical method based on the generalized extreme value distribution was fitted to the size estimate of the indications
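
    A standard way to estimate such POD curves from hit/miss inspection outcomes is a logistic regression on log flaw size. A minimal sketch on synthetic data (the sizes, true POD curve and sample counts are all invented for illustration); a real qualification would also report confidence bounds such as a90/95:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)

    # Synthetic hit/miss inspection data: flaw sizes (mm) and detect outcomes.
    size = rng.uniform(0.5, 8.0, 200)
    pod_true = 1.0 / (1.0 + np.exp(-(np.log(size) - np.log(2.0)) / 0.25))
    hit = (rng.random(200) < pod_true).astype(int)

    # Classical log-size logistic POD model (large C ~ unpenalized fit).
    model = LogisticRegression(C=1e6).fit(np.log(size)[:, None], hit)
    b0, b1 = model.intercept_[0], model.coef_[0, 0]

    # a90: the flaw size detected with 90% probability, from logit(0.9) = ln 9.
    a90 = np.exp((np.log(9.0) - b0) / b1)
    print(f"a90 ~ {a90:.2f} mm")
    ```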

  9. Future Events

    Adli Tıp Uzmanları Derneği ATUD


    Full Text Available 1. 9th Joint Meeting with the Forensic Science Service: Crimes of the Millennium, 2-5 November 2000, Stakis Hotel, Bromsgrove, Birmingham. Convener: Mike Loveland. Details from Forensic Science Society Office, Tel: 01423 506068. 2. Crime Scene Technology Workshop 2: A Crime Scene Practicum, November 6-10, 2000, Evanston, IL. Contact: Registrar, Northwestern University Traffic Institute, 405 Church St., Evanston, IL 60201, (800) 323-4011. 3. APIS Curriculum, November 13-15, 2000, Youngsville, NC. Instructors: Marty Ludas, Johnny Leonard. Contact: Clara Carroll, Sirchie Fingerprint Labs, Inc., 100 Hunter Place, Youngsville, NC 27596, (800) 356-7311. 4. Crime Scene Technology Workshop 3: Advanced Techniques, November 13-17, 2000, Evanston, IL. Contact: Registrar, Northwestern University Traffic Institute, 405 Church St., Evanston, IL 60201, (800) 323-4011. 5. Investigative Photography Workshop 1: Comprehensive Photographic Techniques, December 4-8, 2000, Evanston, IL. Contact: Registrar, Northwestern University Traffic Institute, 405 Church St., Evanston, IL 60201, (800) 323-4011. 6. Investigative Photography Workshop 2: Advanced Techniques, December 11-15, 2000, Evanston, IL. Contact: Registrar, Northwestern University Traffic Institute, 405 Church St., Evanston, IL 60201, (800) 323-4011. 7. Bloodstain Evidence Workshop 1: The Significance of Bloodstain Evidence in Death Scene Investigations, December 11-15, 2000, Evanston, IL. Contact: Registrar, Northwestern University Traffic Institute, 405 Church St., Evanston, IL 60201, (800) 323-4011. 8. One Hundred Years of Fingerprint Detection and Identification, June 16-30, 2001, London, Great Britain. Contact: Maurice Garvie or David Smith, New Scotland Yard, The Broadway, London, SW1H 0BG, Great Britain. 9. International Assoc. for Identification Annual Conference, July 22-28, 2001, Dorai Resort, Miami, FL. Contact: Candy Murray, 20601 Netherland St., Orlando, FL 32833, (407) 568-7436.

  10. Reliability Based Ship Structural Design

    Dogliani, M.; Østergaard, C.; Parmentier, G.;


    This paper deals with the development of different methods that allow the reliability-based design of ship structures to be transferred from the area of research to systematic application in current design. It summarises the achievements of a three-year collaborative research project dealing with developments of models of load effects and of structural collapse adopted in reliability formulations which aim at calibrating partial safety factors for ship structural design. New probabilistic models of still-water load effects are developed both for tankers and for containerships. New results are presented... structure of several tankers and containerships. The results of the reliability analysis were the basis for the definition of a target safety level which was used to assess the partial safety factors suitable for a new design rules format to be adopted in modern ship structural design. Finally...

  11. Reliability Modeling of Wind Turbines

    Kostandyan, Erik

    Cost reductions for offshore wind turbines are a substantial requirement in order to make offshore wind energy more competitive compared to other energy supply methods. During the 20 – 25 years of wind turbines useful life, Operation & Maintenance costs are typically estimated to be a quarter...... the actions should be made and the type of actions requires knowledge on the accumulated damage or degradation state of the wind turbine components. For offshore wind turbines, the action times could be extended due to weather restrictions and result in damage or degradation increase of the remaining...... for Operation & Maintenance planning. Concentrating efforts on development of such models, this research is focused on reliability modeling of Wind Turbine critical subsystems (especially the power converter system). For reliability assessment of these components, structural reliability methods are applied...

  12. Reliability Assessment of Wind Turbines

    Sørensen, John Dalsgaard


    Wind turbines can be considered as structures that are in between civil engineering structures and machines since they consist of structural components and many electrical and machine components together with a control system. Further, a wind turbine is not a one-of-a-kind structure...... but manufactured in series production based on many component tests, some prototype tests and zeroseries wind turbines. These characteristics influence the reliability assessment where focus in this paper is on the structural components. Levelized Cost Of Energy is very important for wind energy, especially when...... comparing to other energy sources. Therefore much focus is on cost reductions and improved reliability both for offshore and onshore wind turbines. The wind turbine components should be designed to have sufficient reliability level with respect to both extreme and fatigue loads but also not be too costly...

  13. Reliability assessment of Wind turbines

    Sørensen, John Dalsgaard


    Wind turbines can be considered as structures that are in between civil engineering structures and machines since they consist of structural components and many electrical and machine components together with a control system. Further, a wind turbine is not a one-of-a-kind structure...... but manufactured in series production based on many component tests, some prototype tests and zeroseries wind turbines. These characteristics influence the reliability assessment where focus in this paper is on the structural components. Levelized Cost Of Energy is very important for wind energy, especially when...... comparing to other energy sources. Therefore much focus is on cost reductions and improved reliability both for offshore and onshore wind turbines. The wind turbine components should be designed to have sufficient reliability level with respect to both extreme and fatigue loads but also not be too costly...

  14. Reliability of Wave Energy Converters

    Ambühl, Simon

    There are many different working principles for wave energy converters (WECs) which are used to produce electricity from waves. In order for WECs to become successful and more competitive with other renewable electricity sources, the consideration of the structural reliability of WECs is essential. Structural reliability considerations and optimizations impact operation and maintenance (O&M) costs as well as the initial investment costs. Furthermore, there is a control system for WEC applications which defines the harvested energy but also the loads onto the structure. Therefore, extreme loads but also... WEPTOS. Calibration of safety factors is performed for welded structures at the Wavestar device including different control systems for harvesting energy from waves. In addition, a case study of different O&M strategies for WECs is discussed, and an example of reliability-based structural optimization...

  15. Component reliability for electronic systems

    Bajenescu, Titu-Marius I


    The main reason for the premature breakdown of today's electronic products (computers, cars, tools, appliances, etc.) is the failure of the components used to build these products. Today professionals are looking for effective ways to minimize the degradation of electronic components to help ensure longer-lasting, more technically sound products and systems. This practical book offers engineers specific guidance on how to design more reliable components and build more reliable electronic systems. Professionals learn how to optimize a virtual component prototype, accurately monitor product reliability during the entire production process, and add the burn-in and selection procedures that are the most appropriate for the intended applications. Moreover, the book helps system designers ensure that all components are correctly applied, margins are adequate, wear-out failure modes are prevented during the expected duration of life, and system interfaces cannot lead to failure.

  16. New Approaches to Reliability Assessment

    Ma, Ke; Wang, Huai; Blaabjerg, Frede


    Power electronics are facing continuous pressure to be cheaper and smaller, have a higher power density, and, in some cases, also operate at higher temperatures. At the same time, power electronics products are expected to have reduced failures because it is essential for reducing the cost of energy. New approaches for reliability assessment are being taken in the design phase of power electronics systems based on the physics-of-failure in components. In this approach, many new methods, such as multidisciplinary simulation tools, strength testing of components, translation of mission profiles, and statistical analysis, are involved to enable better prediction and design of reliability for products. This article gives an overview of the new design flow in the reliability engineering of power electronics from the system-level point of view and discusses some of the emerging needs for the technology...

  17. Structural Optimization with Reliability Constraints

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle


    During the last 25 years considerable progress has been made in the fields of structural optimization and structural reliability theory. In classical deterministic structural optimization all variables are assumed to be deterministic. Due to the unpredictability of loads and strengths of actual structures it is now widely accepted that structural problems are non-deterministic. Therefore, some of the variables have to be modelled as random variables/processes and a reliability-based design philosophy should be used; see Cornell [1], Moses [2], Ditlevsen [3] and Thoft-Christensen & Baker [4]...... In this paper we consider only structures which can be modelled as systems of elasto-plastic elements, e.g. frame and truss structures. In section 2 a method to evaluate the reliability of such structural systems is presented. Based on a probabilistic point of view a modern structural optimization problem...

  18. Nanoscale deformation measurements for reliability assessment of material interfaces

    Keller, Jürgen; Gollhardt, Astrid; Vogel, Dietmar; Michel, Bernd


    With the development and application of micro/nano electronic mechanical systems (MEMS, NEMS) for a variety of market segments, new reliability issues will arise. The understanding of material interfaces is the key to a successful design for reliability of MEMS/NEMS and sensor systems. Furthermore, in the field of BioMEMS, newly developed advanced materials and well-known engineering materials are combined without fully developed reliability concepts for such devices and components. In addition, the increasing interface-to-volume ratio in highly integrated systems and nanoparticle-filled materials are challenges for experimental reliability evaluation. New strategies for reliability assessment on the submicron scale are essential to fulfil the needs of future devices. In this paper a nanoscale-resolution experimental method for the measurement of thermo-mechanical deformation at material interfaces is introduced. The determination of displacement fields is based on scanning probe microscopy (SPM) data. In-situ SPM scans of the analyzed object (i.e. material interface) are carried out at different thermo-mechanical load states. The obtained images are compared by grayscale cross correlation algorithms. This allows the tracking of local image patterns of the analyzed surface structure. The measurement results are full-field displacement fields with nanometer resolution. With the obtained data the mixed-mode type of loading at material interfaces can be analyzed with highest resolution for future needs in microsystem and nanotechnology.

  19. Report on Wind Turbine Subsystem Reliability - A Survey of Various Databases (Presentation)

    Sheng, S.


    The wind industry has been challenged by premature subsystem/component failures. Various reliability data collection efforts have demonstrated their value in supporting wind turbine reliability and availability research & development and industrial activities. However, most information on these data collection efforts is scattered and not in a centralized place. With the objective of obtaining updated reliability statistics of wind turbines and/or subsystems so as to benefit future wind reliability and availability activities, this report was put together based on a survey of various reliability databases that are accessible directly or indirectly by NREL. For each database, whenever feasible, a brief description summarizing database population, life span, and data collected is given along with its features & status. Then selected results deemed beneficial to the industry and generated from the database are highlighted. This report concludes with several observations obtained throughout the survey and several reliability data collection opportunities for the future.

  20. Reliability-based optimization of engineering structures

    Sørensen, John Dalsgaard


    The theoretical basis for reliability-based structural optimization within the framework of Bayesian statistical decision theory is briefly described. Reliability-based cost benefit problems are formulated and exemplified with structural optimization. The basic reliability-based optimization prob...

  1. Design of Reliable Adaptive Filter with Fault Tolerance Using DSP

    Ryoo, D. W.; Lee, J. W. [Electronics and Telecommunications Research Institute, Taejon (Korea); Seo, B. H. [Kyungbok National University, Taegu (Korea)


    The LMS (least mean squares) algorithm has been used for plant identification and noise cancellation, and has been researched for enhancing filtering performance. The design and development of reliable systems has become a key issue in industry because the reliability of a system is considered an important factor in performing the system's function successfully. Computing with reliability and fault tolerance is especially important in aviation, communication systems, and nuclear plants. This paper presents the design of a reliable adaptive filter with fault tolerance. Generally, redundancy is used for reliability; in that case, additional computation or circuitry is needed for a voting mechanism, fault detection, or switching. The presented filter does not require computation for a voting mechanism or fault detection; therefore it has simple computation and is practical for application. The reliability of the adaptive filter is also analyzed. The effectiveness of the proposed adaptive filter is demonstrated in case studies of plant identification and noise cancellation using a DSP. (author). 9 refs., 18 figs.
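
    For orientation, the core LMS recursion that such a filter builds on is only a few lines: predict with the current weights, compute the error against the desired signal, and nudge the weights along the input. A minimal noise-cancellation sketch, with an invented reference-noise path (not the paper's DSP implementation):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def lms(x, d, n_taps=8, mu=0.005):
        """LMS adaptive filter: w is adapted so that w . x_window tracks d.
        Returns the error signal e = d - y (here, the cleaned output)."""
        w = np.zeros(n_taps)
        e = np.zeros(len(x))
        for n in range(n_taps, len(x)):
            window = x[n - n_taps:n][::-1]   # most recent sample first
            e[n] = d[n] - w @ window
            w += 2.0 * mu * e[n] * window    # stochastic-gradient weight update
        return e

    # Toy noise cancellation: a tone buried in filtered reference noise.
    t = np.arange(4000)
    tone = np.sin(2 * np.pi * 0.01 * t)
    noise = rng.standard_normal(4000)                         # reference input x
    noise_path = np.convolve(noise, [0.8, -0.4, 0.2])[:4000]  # unknown causal path
    d = tone + noise_path                                     # primary input

    e = lms(noise, d)   # e converges toward the clean tone
    print("residual power:", np.mean((e[1000:] - tone[1000:]) ** 2))
    ```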

  2. Metrological Reliability of Medical Devices

    Costa Monteiro, E.; Leon, L. F.


    The prominent development of health technologies of the 20th century triggered demands for metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of the biomeasurements results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with the analysis of their contributions to guarantee the innovative health technologies compliance with the main ethical pillars of Bioethics.

  3. Reliability Assessment of Concrete Bridges

    Thoft-Christensen, Palle; Middleton, C. R.

    This paper is partly based on research performed for the Highways Agency, London, UK under the project DPU/9/44 "Revision of Bridge Assessment Rules Based on Whole Life Performance: concrete bridges". It contains the details of a methodology which can be used to generate Whole Life (WL) reliability profiles. These WL reliability profiles may be used to establish revised rules for concrete bridges. This paper is to some extent based on Thoft-Christensen et al. [1996] and Thoft-Christensen [1996].

  4. Reliability Management for Information System

    李睿; 俞涛; 刘明伦


    An integrated intelligent management approach is presented to help organizations manage the many heterogeneous resources in their information systems. A general architecture for information system reliability management is proposed, and the architecture is described from two aspects: a process model and a hierarchical model. Data mining techniques are used in data analysis. A data analysis system applicable to real-time data analysis is developed through improved data mining on the critical processes. The framework of integrated management for information system reliability based on real-time data mining is illustrated, and the development of integrated, intelligent management of information systems is discussed.

  5. Analog IC reliability in nanometer CMOS

    Maricau, Elie


    This book focuses on modeling, simulation and analysis of analog circuit aging. First, all important nanometer CMOS physical effects resulting in circuit unreliability are reviewed. Then, transistor aging compact models for circuit simulation are discussed and several methods for efficient circuit reliability simulation are explained and compared. Ultimately, the impact of transistor aging on analog circuits is studied. Aging-resilient and aging-immune circuits are identified and the impact of technology scaling is discussed. The models and simulation techniques described in the book are intended as an aid for device engineers, circuit designers and the EDA community to understand and to mitigate the impact of aging effects on nanometer CMOS ICs. The book: enables readers to understand the long-term reliability of an integrated circuit; reviews CMOS unreliability effects, with focus on those that will emerge in future CMOS nodes; and provides an overview of models for...

  6. Reliability Modeling Development and Its Applications for Ceramic Capacitors with Base-Metal Electrodes (BMEs)

    Liu, Donhang


    This presentation includes a summary of NEPP-funded deliverables for the Base-Metal Electrodes (BMEs) capacitor task, development of a general reliability model for BME capacitors, and a summary and future work.

  7. The study on reliability and validity of the Montreal Cognitive Assessment (Changsha Version) and preliminary exploration of its optimal cutoff score for detecting vascular cognitive impairment

    涂秋云; 靳慧; 丁斌蓉; 杨霞; 雷曾辉; 白松; 唐湘祁


    Objective: To examine the reliability and validity of the Montreal Cognitive Assessment, Changsha Version (MoCA-CS) in patients with ischemic cerebrovascular disease in Hunan province, and to explore a preliminary optimal cutoff score for detecting vascular cognitive impairment (VCI). Methods: 159 patients with ischemic cerebrovascular disease aged 40 years or older in the Changsha area were assessed with the MoCA-CS, the Mini-Mental State Examination (MMSE), a battery of cognitive tests (calculation, similarities, digit span and block design from the Chinese revision of the Wechsler Adult Intelligence Scale; information and orientation, logical memory and visual recognition from the Chinese revision of the Wechsler Memory Scale; and the Stroop test) and other related scales. A randomly selected subsample of 30 patients was re-assessed with the MoCA-CS 3-5 weeks after the first evaluation. Internal consistency, test-retest reliability, inter-rater reliability and parallel validity of the MoCA-CS were computed, and the optimal cutoff score for screening VCI was explored with receiver operating characteristic (ROC) curve analysis. Results: Cronbach's alpha was 0.846, test-retest reliability 0.974 and inter-rater reliability 0.969. Pearson correlations between the MoCA-CS and the MMSE and a short-form IQ measure were 0.879 and 0.799, respectively. Adding one point for subjects with 6 or fewer years of education and using a cutoff of 26/27 (scores of 26 or below indicating VCI) yielded the best sensitivity (90.0%) and specificity (70.9%). The kappa coefficient of agreement between diagnoses based on this cutoff and the cognitive diagnoses of a clinical expert panel was 0.610. Conclusion: The MoCA-CS shows good reliability and validity and is suitable for VCI screening in patients with ischemic cerebrovascular disease in Hunan province; as a Chinese version of the MoCA suited to the mainland Chinese population, it has potential for wider national use.
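
    As an illustration of how a cutoff such as 26/27 translates into the reported sensitivity and specificity, a minimal sketch (a generic computation, not the study's analysis code; variable names are hypothetical):

        import numpy as np

        def sens_spec(scores, has_vci, cutoff=26):
            # Rule from the abstract: a score <= cutoff indicates VCI.
            scores = np.asarray(scores)
            has_vci = np.asarray(has_vci, dtype=bool)
            flagged = scores <= cutoff
            sensitivity = (flagged & has_vci).sum() / has_vci.sum()
            specificity = (~flagged & ~has_vci).sum() / (~has_vci).sum()
            return sensitivity, specificity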


    Bowerman, P. N.


    reliability calculations and infinite repair resources for availability calculations. No more than 967 items or groups can be modeled by RELAV. If larger problems can be broken into subsystems of 967 items or less, the subsystem results can be used as item inputs to a system problem. The calculated availabilities are steady-state values. Group results are presented in the order in which they were calculated (from the most embedded level out to the system level). This provides a good mechanism to perform trade studies. Starting from the system result and working backwards, the granularity gets finer; therefore, system elements that contribute most to system degradation are detected quickly. RELAV is a C-language program originally developed under the UNIX operating system on a MASSCOMP MC500 computer. It has been modified, as necessary, and ported to an IBM PC compatible with a math coprocessor. The current version of the program runs in the DOS environment and requires a Turbo C vers. 2.0 compiler. RELAV has a memory requirement of 103 KB and was developed in 1989. RELAV is a copyrighted work with all copyright vested in NASA.
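
    RELAV's internals are not shown here, but the steady-state availabilities and the series/parallel grouping it reports can be sketched with the standard composition rules (assumed textbook formulas, not RELAV source code; the MTBF/MTTR numbers are illustrative):

        def availability(mtbf, mttr):
            # steady-state availability of a repairable item
            return mtbf / (mtbf + mttr)

        def series(*a):
            # a group that is up only if every member is up
            out = 1.0
            for ai in a:
                out *= ai
            return out

        def parallel(*a):
            # a redundant group that is up if any member is up
            down = 1.0
            for ai in a:
                down *= 1.0 - ai
            return 1.0 - down

        # illustrative numbers: two redundant pumps feeding one controller
        pump = availability(1000.0, 10.0)
        print(series(parallel(pump, pump), availability(5000.0, 24.0)))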



  10. Test-Retest Reliability of Dual-Task Outcome Measures in People With Parkinson Disease

    Strouwen, C.; Molenaar, E.A.; Keus, S.H.; Munks, L.; Bloem, B.R.; Nieuwboer, A.


    BACKGROUND: Dual-task (DT) training is gaining ground as a physical therapy intervention in people with Parkinson disease (PD). Future studies evaluating the effect of such interventions need reliable outcome measures. To date, the test-retest reliability of DT measures in patients with PD remains l

  11. Reliability of pre- and intraoperative tests for biliary lithiasis

    Escallon, A. Jr.; Rosales, W.; Aldrete, J.S.


    The records of 242 patients, operated consecutively for biliary lithiasis, were analyzed to determine the reliability of oral cholecystography (OCG), ultrasonography (US), and HIDA in detecting biliary calculi. Preoperative interpretations were correlated to operative findings. OCG obtained in 138 patients was accurate in 92%. US obtained in 150 was correct in 89%. The accuracy of HIDA was 92% in acute and 78% in chronic cholecystitis. Intraoperative cholangiography (IOC) done in 173 patients indicated the need for exploratory choledochotomy in 24; 21 had choledocholithiasis. These observations suggest that OCG and US are very accurate, but not infallible, in detecting cholelithiasis. US should be done first; when doubt persists, the addition of OCG allows the preoperative diagnosis of gallstones in 97% of the cases. HIDA is highly accurate but not infallible in detecting acute calculous cholecystitis. IOC is very reliable in detecting choledocholithiasis; thus, its routine is justifiable.

  13. Reliability of quantitative content analyses

    Enschot-van Dijk, R. van


    Reliable coding of stimuli is a daunting task that often yields unsatisfactory results. This paper discusses a case study in which tropes (e.g., metaphors, puns) in TV commercials were analyzed, as well as the extent and location of verbal and visual anchoring (i.e., explanation) of these tropes. After

  14. Wind turbine reliability database update.

    Peters, Valerie A.; Hill, Roger Ray; Stinebaugh, Jennifer A.; Veers, Paul S.


    This report documents the status of the Sandia National Laboratories' Wind Plant Reliability Database. Included in this report are updates on the form and contents of the Database, which stems from a fivestep process of data partnerships, data definition and transfer, data formatting and normalization, analysis, and reporting. Selected observations are also reported.

  15. Photovoltaic performance and reliability workshop

    Kroposki, B


    This proceedings is the compilation of papers presented at the ninth PV Performance and Reliability Workshop held at the Sheraton Denver West Hotel on September 4-6, 1996. This year's workshop included presentations from 25 speakers and had over 100 attendees. All of the presentations given are included in this proceedings. Topics of the papers included: defining service lifetime and developing models for PV module lifetime; examining and determining failure and degradation mechanisms in PV modules; combining IEEE/IEC/UL testing procedures; AC module performance and reliability testing; inverter reliability/qualification testing; standardization of utility interconnect requirements for PV systems; activities needed to separate variables by testing individual components of PV systems (e.g., cells, modules, batteries, inverters, charge controllers) for individual reliability and then testing them in actual system configurations; more results reported from field experience on modules, inverters, batteries, and charge controllers from field-deployed PV systems; and system certification and standardized testing for stand-alone and grid-tied systems.


    Dr Obe

    Evaluation of the reliability of a primary cell took place in three stages: 192 cells went through a ... CCV - Closed Circuit Voltage, the voltage at the terminals of a battery when it is under an electrical ... Cylindrical spirally wound cells have the.

  17. Finding Reliable Health Information Online

    As Internet users quickly discover, an enormous amount of health information ...

  18. Reliability of subjective wound assessment

    M.C.T. Bloemen; P.P.M. van Zuijlen; E. Middelkoop


    Introduction: Assessment of the take of split-skin graft and the rate of epithelialisation are important parameters in burn surgery. Such parameters are normally estimated by the clinician in a bedside procedure. This study investigates whether this subjective assessment is reliable for graft take a

  19. Fatigue Reliability under Random Loads

    Talreja, R.


    , with the application of random loads, the initial homogeneous distribution of strength changes to a two-component distribution, reflecting the two-stage fatigue damage. In the crack initiation stage, the strength increases initially and then decreases, while an abrupt decrease of strength is seen in the crack propagation stage. The consequences of this behaviour on the fatigue reliability are discussed.

  20. Reliability Analysis of Money Habitudes

    Delgadillo, Lucy M.; Bushman, Brittani S.


    Use of the Money Habitudes exercise has gained popularity among various financial professionals. This article reports on the reliability of this resource. A survey administered to young adults at a western state university was conducted, and each Habitude or "domain" was analyzed using Cronbach's alpha procedures. Results showed all six…
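
    A minimal sketch of the Cronbach's alpha computation used for such domain analyses (the generic formula, not the authors' code; the input layout is an assumption):

        import numpy as np

        def cronbach_alpha(items):
            # items: (n_respondents, k_items) matrix of scored responses
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)      # per-item variances
            total_var = items.sum(axis=1).var(ddof=1)  # variance of totals
            return k / (k - 1.0) * (1.0 - item_vars.sum() / total_var)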

  1. The Reliability of College Grades

    Beatty, Adam S.; Walmsley, Philip T.; Sackett, Paul R.; Kuncel, Nathan R.; Koch, Amanda J.


    Little is known about the reliability of college grades relative to how prominently they are used in educational research, and the results to date tend to be based on small sample studies or are decades old. This study uses two large databases (N > 800,000) from over 200 educational institutions spanning 13 years and finds that both first-year…

  3. Inflection points for network reliability

    Brown, J.I.; Koç, Y.; Kooij, R.E.


    Given a finite, undirected graph G (possibly with multiple edges), we assume that the vertices are operational, but the edges are each independently operational with probability p. The (all-terminal) reliability, Rel(G,p), of G is the probability that the spanning subgraph of operational edges is co
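
    For small graphs, Rel(G,p) as defined above can be computed exactly by enumerating subsets of operational edges; a brute-force sketch (exponential in the number of edges, for illustration only):

        from itertools import combinations

        def all_terminal_reliability(n, edges, p):
            # Probability that the operational edges span all n vertices.
            def connects_all(sub):
                parent = list(range(n))        # union-find over vertices
                def find(a):
                    while parent[a] != a:
                        parent[a] = parent[parent[a]]
                        a = parent[a]
                    return a
                for u, v in sub:
                    parent[find(u)] = find(v)
                return len({find(v) for v in range(n)}) == 1

            m = len(edges)
            rel = 0.0
            for k in range(m + 1):
                for sub in combinations(edges, k):
                    if connects_all(sub):
                        rel += p ** k * (1 - p) ** (m - k)
            return rel

        # triangle: Rel = 3p^2 - 2p^3, so 0.972 at p = 0.9
        print(all_terminal_reliability(3, [(0, 1), (1, 2), (0, 2)], 0.9))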

  4. Becoming a high reliability organization.

    Christianson, Marlys K; Sutcliffe, Kathleen M; Miller, Melissa A; Iwashyna, Theodore J


    Aircraft carriers, electrical power grids, and wildland firefighting, though seemingly different, are exemplars of high reliability organizations (HROs)--organizations that have the potential for catastrophic failure yet engage in nearly error-free performance. HROs commit to safety at the highest level and adopt a special approach to its pursuit. High reliability organizing has been studied and discussed for some time in other industries and is receiving increasing attention in health care, particularly in high-risk settings like the intensive care unit (ICU). The essence of high reliability organizing is a set of principles that enable organizations to focus attention on emergent problems and to deploy the right set of resources to address those problems. HROs behave in ways that sometimes seem counterintuitive--they do not try to hide failures but rather celebrate them as windows into the health of the system, they seek out problems, they avoid focusing on just one aspect of work and are able to see how all the parts of work fit together, they expect unexpected events and develop the capability to manage them, and they defer decision making to local frontline experts who are empowered to solve problems. Given the complexity of patient care in the ICU, the potential for medical error, and the particular sensitivity of critically ill patients to harm, high reliability organizing principles hold promise for improving ICU patient care.

  5. Space solar array reliability: A study and recommendations

    Brandhorst, Henry W., Jr.; Rodiek, Julie A.


    Providing reliable power over the anticipated mission life is critical to all satellites; therefore solar arrays are one of the most vital links to satellite mission success. Furthermore, solar arrays are exposed to the harshest environment of virtually any satellite component. In the past 10 years 117 satellite solar array anomalies have been recorded with 12 resulting in total satellite failure. Through an in-depth analysis of satellite anomalies listed in the Airclaim's Ascend SpaceTrak database, it is clear that solar array reliability is a serious, industry-wide issue. Solar array reliability directly affects the cost of future satellites through increased insurance premiums and a lack of confidence by investors. Recommendations for improving reliability through careful ground testing, standardization of testing procedures such as the emerging AIAA standards, and data sharing across the industry will be discussed. The benefits of creating a certified module and array testing facility that would certify in-space reliability will also be briefly examined. Solar array reliability is an issue that must be addressed to both reduce costs and ensure continued viability of the commercial and government assets on orbit.

  6. Multi-mode reliability-based design of horizontal curves.

    Essa, Mohamed; Sayed, Tarek; Hussein, Mohamed


    Recently, reliability analysis has been advocated as an effective approach to account for uncertainty in the geometric design process and to evaluate the risk associated with a particular design. In this approach, a risk measure (e.g. probability of noncompliance) is calculated to represent the probability that a specific design would not meet standard requirements. The majority of previous applications of reliability analysis in geometric design focused on evaluating the probability of noncompliance for only one mode of noncompliance such as insufficient sight distance. However, in many design situations, more than one mode of noncompliance may be present (e.g. insufficient sight distance and vehicle skidding at horizontal curves). In these situations, utilizing a multi-mode reliability approach that considers more than one failure (noncompliance) mode is required. The main objective of this paper is to demonstrate the application of multi-mode (system) reliability analysis to the design of horizontal curves. The process is demonstrated by a case study of Sea-to-Sky Highway located between Vancouver and Whistler, in southern British Columbia, Canada. Two noncompliance modes were considered: insufficient sight distance and vehicle skidding. The results show the importance of accounting for several noncompliance modes in the reliability model. The system reliability concept could be used in future studies to calibrate the design of various design elements in order to achieve consistent safety levels based on all possible modes of noncompliance.
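
    A toy sketch of why several noncompliance modes matter: with two modes, the system probability of noncompliance exceeds either single-mode value (the probabilities below are purely illustrative, not the case-study results):

        # p1: insufficient sight distance, p2: vehicle skidding
        p1, p2 = 0.04, 0.02

        # series-system bounds for the probability of noncompliance
        lower = max(p1, p2)            # perfectly dependent modes
        upper = min(1.0, p1 + p2)      # mutually exclusive modes

        # exact value if the two modes are statistically independent
        p_system = 1.0 - (1.0 - p1) * (1.0 - p2)
        print(lower, p_system, upper)  # -> 0.04, ~0.0592, 0.06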

  7. Power Quality and Reliability Project

    Attia, John O.


    One area where universities and industry can link is power systems reliability and quality - key concepts in the commercial, industrial and public-sector engineering environments. Prairie View A&M University (PVAMU) has established a collaborative relationship with the University of Texas at Arlington (UTA), NASA/Johnson Space Center (JSC), and EP&C Engineering and Technology Group (EP&C), a small disadvantaged business that specializes in power quality and engineering services. The primary goal of this collaboration is to facilitate the development and implementation of a Strategic Integrated Power/Systems Reliability and Curriculum Enhancement Program. The objectives of the first phase of this work are: (a) to develop a course in power quality and reliability, (b) to use the campus of Prairie View A&M University as a laboratory for the study of systems reliability and quality issues, and (c) to provide students with NASA/EP&C shadowing and internship experience. In this work, a course titled "Reliability Analysis of Electrical Facilities" was developed and taught for two semesters; about thirty-seven students have benefited directly from this course. A laboratory accompanying the course was also developed, and four facilities at Prairie View A&M University were surveyed. Tests performed include: (i) earth-ground testing; (ii) voltage, amperage and harmonics measurements on various panels in the buildings; (iii) checking that the wire sizes were right for the loads they were carrying; (iv) vibration tests to assess the status of the engines, chillers and water pumps; and (v) infrared testing to detect arcing or misfiring in electrical or mechanical systems.

  8. Reliability of Arctic offshore installations

    Bercha, F.G. (Bercha Group, Calgary, AB, Canada); Gudmestad, O.T. (Stavanger Univ., Stavanger, Norway; Statoil, Stavanger, Norway; Norwegian Univ. of Technology, Stavanger, Norway); Foschi, R. (British Columbia Univ., Vancouver, BC, Canada, Dept. of Civil Engineering); Sliggers, F. (Shell International Exploration and Production, Rijswijk, Netherlands); Nikitina, N. (VNIIG, St. Petersburg, Russian Federation); Nevel, D.


    Life threatening and fatal failures of offshore structures can be attributed to a broad range of causes such as fires and explosions, buoyancy losses, and structural overloads. This paper addressed the different severities of failure types, categorized as catastrophic failure, local failure or serviceability failure. Offshore tragedies were also highlighted, namely the failures of P-36, the Ocean Ranger, the Piper Alpha, and the Alexander Kieland which all resulted in losses of human life. P-36 and the Ocean Ranger both failed ultimately due to a loss of buoyancy. The Piper Alpha was destroyed by a natural gas fire, while the Alexander Kieland failed due to fatigue induced structural failure. The mode of failure was described as being the specific way in which a failure occurs from a given cause. Current reliability measures in the context of offshore installations only consider the limited number of causes such as environmental loads. However, it was emphasized that a realistic value of the catastrophic failure probability should consider all credible causes of failure. This paper presented a general method for evaluating all credible causes of failure of an installation. The approach to calculating integrated reliability involves the use of network methods such as fault trees to combine the probabilities of all factors that can cause a catastrophic failure, as well as those which can cause a local failure with the potential to escalate to a catastrophic failure. This paper also proposed a protocol for setting credible reliability targets such as the consideration of life safety targets and escape, evacuation, and rescue (EER) success probabilities. A set of realistic reliability targets for both catastrophic and local failures for representative safety and consequence categories associated with offshore installations was also presented. The reliability targets were expressed as maximum average annual failure probabilities. The method for converting these annual
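
    A minimal sketch of the fault-tree combination the paper describes, for independent basic events (the gate formulas are standard; the event probabilities are purely illustrative):

        def or_gate(*p):
            # top event occurs if any input event occurs
            q = 1.0
            for pi in p:
                q *= 1.0 - pi
            return 1.0 - q

        def and_gate(*p):
            # top event occurs only if all input events occur
            q = 1.0
            for pi in p:
                q *= pi
            return q

        # illustrative causes: fire, buoyancy loss, and a structural
        # overload needing both an extreme load and a weakened member
        p_catastrophic = or_gate(1e-4, 5e-5, and_gate(1e-2, 1e-3))
        print(p_catastrophic)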

  9. Test-Retest Reliability of 10 Hz Conditioning Electrical Stimulation Inducing Long-Term Potentiation (LTP)-Like Pain Amplification in Humans

    Xia, Weiwei; Mørch, Carsten Dahl; Andersen, Ole Kæseler


    Background: 10 Hz conditioning electrical stimulation (CES) has been shown to induce long-term potentiation (LTP)-like pain amplification similar to traditional 100 Hz CES in healthy humans. The aim of this study was to assess the test-retest reliability and to estimate sample sizes required for future crossover and parallel study designs. Methods: The 10 Hz paradigm (500 rectangular pulses lasting 50 s) was repeated on two separate days with a one-week interval in twenty volunteers. Perceptual intensities to single electrical stimulation (SES) at the conditioned skin site and to mechanical stimuli (pinprick and light stroking) in the immediate vicinity of the conditioned skin site were recorded. Superficial blood flow (SBF) was assessed as an indicator of neurogenic inflammation. All outcome measures were assessed at 10 min intervals, three times before and six times after the CES. The coefficient of variation and intra-class correlation coefficient were calculated within and between sessions. Sample sizes were estimated for future crossover (Ncr) and parallel (Np) drug-testing studies expected to detect a 30% decrease in the individual outcome measure following 10 Hz CES. Results: Perceptual intensity ratings to light stroking (Ncr = 2, Np = 33) and pinprick stimulation (491 mN) (Ncr = 6, Np = 54) increased after CES and showed better reliability in crossover than in parallel designs. The SBF increased after CES and then declined until reaching a plateau 20 minutes post-CES. SBF showed acceptable reliability in both crossover and parallel designs (Ncr = 3, Np = 13). Pain ratings to SES were reliable, but with large estimated sample sizes (Ncr = 634, Np = 11310) due to the minor pain amplification. Conclusions: In terms of sample sizes for future crossover study designs, the reliability of 10 Hz CES in inducing LTP-like effects was acceptable for assessments of superficial blood flow, heterotopic mechanical hyperalgesia, and dysesthesia. PMID:27529175
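
    A rough sketch of how such parallel-design sample sizes can be approximated from the observed variability (a generic two-sample normal approximation, not the authors' exact method; the CV value is illustrative):

        from math import ceil
        from statistics import NormalDist

        def n_per_group(cv, rel_change=0.30, alpha=0.05, power=0.80):
            # cv: between-subject coefficient of variation of the outcome
            z = NormalDist().inv_cdf
            za, zb = z(1 - alpha / 2), z(power)
            return ceil(2 * ((za + zb) * cv / rel_change) ** 2)

        print(n_per_group(0.5))   # ~44 per group for a 50% CV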

  10. The reliability of the new Chrom ID ESBL chromogenic medium for detection of extended-spectrum β-lactamases in Enterobacteriaceae



    AIM: To evaluate the reliability of the Chrom ID ESBL chromogenic medium for the detection of extended-spectrum β-lactamase (ESBL)-producing Enterobacteriaceae. METHODS: Two sets of Enterobacteriaceae strains were used to evaluate the medium. The first set comprised 982 clinical isolates collected from the hospital's wards and outpatient clinics between April 2010 and March 2013; the second set comprised 20 strains from the region, isolated, cultured and identified in previous years, with confirmed ESBL-production characteristics. RESULTS: The Chrom ID ESBL medium screened for ESBL producers quickly and effectively. CONCLUSION: The Chrom ID ESBL chromogenic medium is a fast and reliable medium for screening ESBL-producing Enterobacteriaceae.

  11. Exact reliability quantification of highly reliable systems with maintenance

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)


    When a system is composed of highly reliable elements, exact reliability quantification may be problematic because computer accuracy is limited. Inaccuracy can arise in different ways: for example, an error may be made when subtracting two numbers that are very close to each other, or in the process of summing many very different numbers. The basic objective of this paper is to find a procedure which eliminates errors made by a PC when calculations close to an error limit are executed. A highly reliable system is represented by a directed acyclic graph composed of terminal nodes (i.e. highly reliable input elements), internal nodes representing subsystems, and edges that bind all of these nodes. Three admissible unavailability models of terminal nodes are introduced, including both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes is based on the merits of MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure, applied to a graph structure that considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires summation of many very different non-negative numbers, which may be a source of inaccuracy. That is why another algorithm, for the exact summation of such numbers, is designed in the paper. The summation procedure uses the benefits of a special number system with base 2^32. The computational efficiency of the new methodology is compared with advanced simulation software, and various calculations on systems from the references are performed to emphasize its merits.
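
    The paper builds its exact summation on a custom number system with base 2^32; the same effect can be sketched in Python with exact rational arithmetic (an alternative illustration under that substitution, not the paper's algorithm):

        from fractions import Fraction

        def exact_sum(values):
            # no floating-point rounding occurs during accumulation
            total = Fraction(0)
            for v in values:
                total += Fraction(v)
            return total

        # many very different non-negative magnitudes, as in the paper
        vals = [Fraction(1, 10 ** k) for k in range(0, 30, 3)]
        print(float(exact_sum(vals)))   # exact until the final conversion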

  12. Reliability of Children's Testimony in the Era of Developmental Reversals

    Brainerd, C. J.; Reyna, V. F.


    A hoary assumption of the law is that children are more prone to false-memory reports than adults, and hence, their testimony is less reliable than adults'. Since the 1980s, that assumption has been buttressed by numerous studies that detected declines in false memory between early childhood and young adulthood under controlled conditions.…

  13. On the reliability of Quake-Catcher Network earthquake detections

    Yildirim, Battalgazi; Cochran, Elizabeth S.; Chung, Angela I.; Christensen, Carl M.; Lawrence, Jesse F.


    Over the past two decades, there have been several initiatives to create volunteer-based seismic networks. The Personal Seismic Network, proposed around 1990, used a short-period seismograph to record earthquake waveforms using existing phone lines (Cranswick and Banfill, 1990; Cranswick et al., 1993). NetQuakes (Luetgert et al., 2010) deploys triaxial Micro-Electromechanical Systems (MEMS) sensors in private homes, businesses, and public buildings where there is an Internet connection. Other seismic networks using a dense array of low-cost MEMS sensors are the Community Seismic Network (Clayton et al., 2012; Kohler et al., 2013) and the Home Seismometer Network (Horiuchi et al., 2009). One main advantage of combining low-cost MEMS sensors with existing Internet connections in public and private buildings, compared with traditional networks, is the reduction in installation and maintenance costs (Koide et al., 2006). In doing so, it is possible to create a dense seismic network for a fraction of the cost of traditional seismic networks (D'Alessandro and D'Anna, 2013; D'Alessandro, 2014; D'Alessandro et al., 2014).

  14. Reliability of genetic bottleneck tests for detecting recent population declines

    Peery, M. Zachariah; Kirby, Rebecca; Reid, Brendan N.; Stoelting, Ricka; Doucet-Beer, Elena; Robinson, Stacie; Vasquez-Carrillo, Catalina; Pauli, Jonathan N.; Palsboll, Per J.


    The identification of population bottlenecks is critical in conservation because populations that have experienced significant reductions in abundance are subject to a variety of genetic and demographic processes that can hasten extinction. Genetic bottleneck tests constitute an appealing and popula

  15. Designing the Future

    Friso de Zeeuw


    The Netherlands has a tradition in public spatial planning and design. In the past 20 years, we have seen an increasing role for the market in this field and, more recently, growing attention to sustainability. Sustainability has become an economic factor, not only at the building level but also at the level of large-scale area development projects. More and more local governments have high ambitions for sustainable development, and increasingly, during project development, buildings are developed on a sustainable basis. Most of the time, the focus in this approach is on energy. However, sustainability also comprises social aspects. Energy measures have a direct relation to an economic factor such as investment costs, and payback time can be calculated; the economic aspects of social sustainability are more complex. Therefore, for all sustainability development projects, especially large-scale projects planned over a longer period, it is necessary to make presumptions, which become less reliable as the planning period is extended. For future larger-scale developments, experience in the Netherlands points to two design approaches: 'backcasting' or using a growth model (or a combination of these two). The power of design is the ability to imagine possible scenarios for the future. The layer approach helps to integrate sustainability into public spatial planning, and more specifically, Urban Design Management (UDM) supports an integrative and collaborative approach at the operational level of a project, in which public and market partners work together. This article outlines how design based on these approaches can contribute to sustainable development on the 'new playing field', where spatial problems should be solved in networks. Dutch projects in Almere, Benoordenhout and Rijswijk are used to illustrate this approach.

  16. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine: An overview.

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon; Prasad, Ramjee


    Recently, a need to develop supportive new scientific evidence for contemporary Ayurveda has emerged. One research objective is the assessment of the reliability of diagnoses and treatment. Reliability is a quantitative measure of consistency. It is a crucial issue in classification (such as prakriti classification), in method development (pulse diagnosis), in quality assurance for diagnosis and treatment, and in the conduct of clinical studies. Several reliability studies have been conducted in Western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine diagnoses is in the formative stage, while reliability studies in Ayurveda are still preliminary. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implications for practice, education, and training. An introduction to reliability estimates, different study designs, and statistical analysis is given for future studies in Ayurveda.
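
    As an illustration of one common reliability estimate such studies report for categorical diagnoses, a minimal sketch of Cohen's kappa for two raters (the generic formula, illustrative only; not taken from the paper):

        import numpy as np

        def cohens_kappa(rater1, rater2):
            r1, r2 = np.asarray(rater1), np.asarray(rater2)
            categories = np.union1d(r1, r2)
            p_obs = (r1 == r2).mean()          # observed agreement
            p_exp = sum((r1 == c).mean() * (r2 == c).mean()
                        for c in categories)   # chance agreement
            return (p_obs - p_exp) / (1.0 - p_exp)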

  17. Assessment and Improving Methods of Reliability Indices in Bakhtar Regional Electricity Company

    Saeed Shahrezaei


    Full Text Available Reliability of a system is the ability of a system to do prospected duties in future and the probability of desirable operation for doing predetermined duties. Power system elements failures data are the main data of reliability assessment in the network. Determining antiseptic parameters is the goal of reliability assessment by using system history data. These parameters help to recognize week points of the system. In other words, the goal of reliability assessment is operation improving and decreasing of the failures and power outages. This paper is developed to assess reliability indices of Bakhtar Regional Electricity Company up to 1393 and the improving methods and their effects on the reliability indices in this network. DIgSILENT Power Factory software is employed for simulation. Simulation results show the positive effect of improving methods in reliability indices of Bakhtar Regional Electricity Company.

  18. Assessing volume of accelerometry data for reliability in preschool children.

    Hinkley, Trina; O'Connell, Eoin; Okely, Anthony D; Crawford, David; Hesketh, Kylie; Salmon, Jo


    This study examines what volume of accelerometry data (hours per day and number of days) is required to reliably estimate preschool children's physical activity, and whether it is necessary to include both weekday and weekend data. Accelerometry data from 493 to 799 (depending on wear time) preschool children from the Melbourne-based Healthy Active Preschool Years study were used. The percentage of wear time each child spent in total (light to vigorous) physical activity was the main outcome. Hourly increments of daily data were analyzed. t-tests, controlling for age and clustering by center of recruitment, assessed the differences between weekday and weekend physical activity. Intraclass correlation coefficients estimated reliability for an individual day. The Spearman-Brown prophecy formula estimated the number of days required to reach reliability estimates of 0.7, 0.8, and 0.9. The children spent a significantly greater percentage of time being physically active on weekends compared with weekdays, regardless of the minimum number of hours included (t = 12.49-16.76, P < 0.001). Approximately 8 d of data were required to reach a reliability estimate of 0.7 with 10 or more hours of data per day; 3.3-3.4 d were required to meet the same reliability estimate for days with 7 h of data. Future studies should ensure they include the minimum amount of data (hours per day and number of days) identified in this study to meet at least a 0.7 reliability level, and should report the level of reliability for their study. In addition to weekdays, at least one weekend day should be included in analyses to reliably estimate physical activity levels for preschool children.
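
    A minimal sketch of the Spearman-Brown prophecy step described above, solved for the number of days needed to reach a target reliability (the single-day ICC value is illustrative, not a result from the study):

        def days_needed(icc_one_day, target=0.7):
            # days to average so the composite reaches target reliability
            return ((target / (1.0 - target))
                    * ((1.0 - icc_one_day) / icc_one_day))

        print(days_needed(0.45))   # about 2.9 days for a one-day ICC of 0.45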

  19. Reliability of four experimental mechanical pain tests in children

    Soee AL


    Ann-Britt L Soee (1), Lise L Thomsen (2), Birte Tornoe (1,3), Liselotte Skov (1). (1) Department of Pediatrics, Children's Headache Clinic, Copenhagen University Hospital Herlev, Copenhagen, Denmark; (2) Department of Neuropediatrics, Juliane Marie Centre, Copenhagen University Hospital Rigshospitalet, København Ø, Denmark; (3) Department of Physiotherapy, Medical Department O, Copenhagen University Hospital Herlev, Herlev, Denmark. Purpose: In order to study pain in children, it is necessary to determine whether pain measurement tools used in adults are reliable measurements in children. The aim of this study was to explore the intrasession reliability of pressure pain thresholds (PPT) in healthy children, and to study the intersession reliability of the following four tests: (1) Total Tenderness Score; (2) PPT; (3) Visual Analog Scale score at supra-pressure-pain threshold; and (4) area under the curve (stimulus-response functions for pressure versus pain). Participants and methods: Twenty-five healthy school children, 8-14 years of age, participated. Test 2, PPT, was repeated three times at 2-minute intervals on the same day to estimate PPT intrasession reliability using Cronbach's alpha. Tests 1-4 were repeated after a median of 21 days (interquartile range 10.5-22), and Pearson's correlation coefficient was used to describe the intersession reliability. Results: The PPT test was precise and reliable (Cronbach's alpha ≥ 0.92). All tests showed a good to excellent correlation between days (intersession r = 0.66-0.81). There were no indications of significant systematic differences in any of the four tests between days. Conclusion: All tests seemed to be reliable measurements for pain evaluation in healthy children aged 8-14 years. Given the small sample size, this conclusion needs to be confirmed in future studies. Keywords: repeatability, intraindividual reliability, pressure pain threshold, pain measurement, algometer

  20. High-reliability computing for the smarter planet

    Quinn, Heather M [Los Alamos National Laboratory]; Graham, Paul [Los Alamos National Laboratory]; Manuzzato, Andrea [Univ. of Padova]; Dehon, Andre [Univ. of Pennsylvania]; Carter, Nicholas [Intel Corporation]


    The geometric rate of improvement of transistor size and integrated circuit performance, known as Moore's Law, has been an engine of growth for our economy, enabling new products and services, creating new value and wealth, increasing safety, and removing menial tasks from our daily lives. Affordable, highly integrated components have enabled both life-saving technologies and rich entertainment applications. Anti-lock brakes, insulin monitors, and GPS-enabled emergency response systems save lives. Cell phones, internet appliances, virtual worlds, realistic video games, and mp3 players enrich our lives and connect us together. Over the past 40 years of silicon scaling, the increasing capabilities of inexpensive computation have transformed our society through automation and ubiquitous communications. In this paper, we will present the concept of the smarter planet, how reliability failures affect current systems, and methods that can be used to increase the reliable adoption of new automation in the future. We will illustrate these issues using a number of different electronic devices in a couple of different scenarios. Recently IBM has been presenting the idea of a 'smarter planet.' In smarter planet documents, IBM discusses increased computer automation of roadways, banking, healthcare, and infrastructure, as automation could create more efficient systems. A necessary component of the smarter planet concept is to ensure that these new systems have very high reliability. Even extremely rare reliability problems can easily escalate to problematic scenarios when implemented at very large scales. For life-critical systems, such as automobiles, infrastructure, medical implantables, and avionic systems, unmitigated failures could be dangerous. As more automation moves into these types of critical systems, reliability failures will need to be managed. As computer automation continues to increase in our society, the need for greater radiation reliability is