WorldWideScience

Sample records for reliable computational identification

  1. Reliability in the utility computing era: Towards reliable Fog computing

    DEFF Research Database (Denmark)

    Madsen, Henrik; Burtschy, Bernard; Albeanu, G.

    2013-01-01

    This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm, a non-trivial extension of the Cloud, is considered, and the reliability of networks of smart devices is discussed. Combining the reliability requirements of the grid and cloud paradigms with the reliability requirements of networks of sensors and actuators, it follows that designing a reliable Fog computing platform is feasible.

  2. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  3. Evaluation of the reliability concerning the identification of human factors as contributing factors by a computer supported event analysis (CEA)

    International Nuclear Information System (INIS)

    Wilpert, B.; Maimer, H.; Loroff, C.

    2000-01-01

    The project's objective was to evaluate the reliability of identifying Human Factors as contributing factors with a computer-supported event analysis (CEA). CEA is a computer version of SOL (Safety through Organizational Learning). The first step included interviews with experts from the nuclear power industry and an evaluation of existing computer-supported event analysis methods. This information was combined into a requirement profile for the CEA software. The next step was the implementation of the software in an iterative process of evaluation. The project was completed by testing the CEA software. The testing demonstrated that contributing factors can be validly identified with CEA. In addition, CEA received very positive feedback from the experts. (orig.)

  4. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well-known class of problems that almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored.
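
    The exponential blow-up described here can be made concrete with a toy computation. The sketch below (illustrative, not from the paper) computes exact two-terminal reliability of a small network by enumerating all 2^m component states, which is feasible only for tiny systems; decomposition helps precisely by shrinking the m that enters this exponent.

        from itertools import product

        def two_terminal_reliability(edges, p, s, t):
            """Exact s-t reliability by enumerating all 2^m edge states."""
            total = 0.0
            for state in product([0, 1], repeat=len(edges)):
                prob = 1.0
                for works, e in zip(state, edges):
                    prob *= p[e] if works else (1.0 - p[e])
                alive = [e for w, e in zip(state, edges) if w]
                seen, stack = {s}, [s]
                while stack:                      # DFS over working edges only
                    u = stack.pop()
                    for a, b in alive:
                        for x, y in ((a, b), (b, a)):
                            if x == u and y not in seen:
                                seen.add(y)
                                stack.append(y)
                if t in seen:
                    total += prob
            return total

        # 5-edge "bridge" network, every component working with probability 0.9
        edges = [("s", "a"), ("a", "t"), ("s", "b"), ("b", "t"), ("a", "b")]
        p = {e: 0.9 for e in edges}
        print(two_terminal_reliability(edges, p, "s", "t"))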

  5. The reliability of tablet computers in depicting maxillofacial radiographic landmarks

    Energy Technology Data Exchange (ETDEWEB)

    Tadinada, Aditya; Mahdian, Mina; Sheth, Sonam; Chandhoke, Taranpreet K.; Gopalakrishna, Aadarsh; Potluri, Anitha; Yadav, Sumit [University of Connecticut School of Dental Medicine, Farmington (United States)

    2015-09-15

    This study was performed to evaluate the reliability of the identification of anatomical landmarks in panoramic and lateral cephalometric radiographs on a standard medical-grade picture archiving and communication system (PACS) monitor and a tablet computer (iPad 5). A total of 1000 radiographs, including 500 panoramic and 500 lateral cephalometric radiographs, were retrieved from the de-identified dataset of the archive of the Section of Oral and Maxillofacial Radiology of the University of Connecticut School of Dental Medicine. Major radiographic anatomical landmarks were independently reviewed by two examiners on both displays. The examiners initially reviewed ten panoramic and ten lateral cephalometric radiographs using each imaging system in order to verify interoperator agreement in landmark identification. The images were scored on a four-point scale reflecting the diagnostic image quality and exposure level of the images. Statistical analysis showed no significant difference between the two displays regarding the visibility and clarity of the landmarks in either the panoramic or the cephalometric radiographs. Tablet computers can reliably show anatomical landmarks in panoramic and lateral cephalometric radiographs.

  6. Reliability of System Identification Techniques to Assess Standing Balance in Healthy Elderly.

    Directory of Open Access Journals (Sweden)

    Jantsje H Pasma

    Full Text Available System identification techniques have the potential to assess the contribution of the underlying systems involved in standing balance by applying well-known disturbances. We investigated the reliability of standing balance parameters obtained with multivariate closed-loop system identification techniques. In twelve healthy elderly, balance tests were performed twice a day during three days. Body sway was measured during two minutes of standing with eyes closed, and the Balance test Room (BalRoom) was used to apply four disturbances simultaneously: two sensory disturbances, to the proprioceptive and the visual system, and two mechanical disturbances applied at the leg and trunk segment. Using system identification techniques, sensitivity functions of the sensory disturbances and the neuromuscular controller were estimated. Based on generalizability theory (G theory), systematic errors and sources of variability were assessed using linear mixed models, and reliability was assessed by computing indexes of dependability (ID), standard error of measurement (SEM) and minimal detectable change (MDC). A systematic error was found between the first and second trial in the sensitivity functions. No systematic error was found in the neuromuscular controller and body sway. The reliability of 15 of 25 parameters and body sway was moderate to excellent when the results of two trials on three days were averaged. To reach an excellent reliability on one day in 7 out of 25 parameters, it was predicted that at least seven trials must be averaged. This study shows that system identification techniques are a promising method to assess the underlying systems involved in standing balance in elderly. However, most of the parameters do not appear to be reliable unless a large number of trials are collected across multiple days. To reach an excellent reliability in one third of the parameters, a training session for participants is needed and at least seven trials of two...
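
    For context, the SEM and MDC referred to here are conventionally derived from a reliability coefficient such as the ICC: SEM = SD * sqrt(1 - ICC) and MDC95 = 1.96 * sqrt(2) * SEM. A minimal sketch with hypothetical values:

        import math

        def sem(sd, icc):
            # standard error of measurement from between-subject SD and reliability
            return sd * math.sqrt(1.0 - icc)

        def mdc95(sem_value):
            # minimal detectable change, 95% confidence, test-retest design
            return 1.96 * math.sqrt(2.0) * sem_value

        s = sem(sd=10.0, icc=0.85)        # hypothetical balance-parameter values
        print(f"SEM = {s:.2f}, MDC95 = {mdc95(s):.2f}")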

  7. Reliable computation from contextual correlations

    Science.gov (United States)

    Oestereich, André L.; Galvão, Ernesto F.

    2017-12-01

    An operational approach to the study of computation based on correlations considers black boxes with one-bit inputs and outputs, controlled by a limited classical computer capable only of performing sums modulo two. In this setting, it was shown that noncontextual correlations do not provide any extra computational power, while contextual correlations were found to be necessary for the deterministic evaluation of nonlinear Boolean functions. Here we investigate the requirements for reliable computation in this setting; that is, the evaluation of any Boolean function with success probability bounded away from 1/2. We show that bipartite CHSH quantum correlations suffice for reliable computation. We also prove that an arbitrarily small violation of a multipartite Greenberger-Horne-Zeilinger noncontextuality inequality also suffices for reliable computation.
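
    As a hedged illustration of why a success probability bounded away from 1/2 is enough: independent repetitions plus majority voting drive the error down exponentially. The value p = cos^2(pi/8) below is the optimal quantum CHSH winning probability, used here only as an example of such a bias; this is not the paper's construction.

        import math
        from math import comb

        p = math.cos(math.pi / 8) ** 2    # ~0.854, optimal CHSH winning probability

        def majority_success(p, n):
            # probability that a majority of n independent evaluations succeed (n odd)
            return sum(comb(n, k) * p**k * (1 - p)**(n - k)
                       for k in range(n // 2 + 1, n + 1))

        for n in (1, 3, 5, 11):
            print(n, round(majority_success(p, n), 6))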

  8. BUILDING MODEL ANALYSIS APPLICATIONS WITH THE JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY (JUPITER) API

    Science.gov (United States)

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input ...

  9. Reliability and Availability of Cloud Computing

    CERN Document Server

    Bauer, Eric

    2012-01-01

    A holistic approach to service reliability and availability of cloud computing. Reliability and Availability of Cloud Computing provides IS/IT system and solution architects, developers, and engineers with the knowledge needed to assess the impact of virtualization and cloud computing on service reliability and availability. It reveals how to select the most appropriate design for reliability diligence to assure that user expectations are met. Organized in three parts (basics, risk analysis, and recommendations), this resource is accessible to readers of diverse backgrounds and experience le

  10. Reliability Generalization of the Alcohol Use Disorder Identification Test.

    Science.gov (United States)

    Shields, Alan L.; Caruso, John C.

    2002-01-01

    Evaluated the reliability of scores from the Alcohol Use Disorders Identification Test (AUDIT; J. Saunders and others, 1993) in a reliability generalization study based on 17 empirical journal articles. Results show AUDIT scores to be generally reliable for basic assessment. (SLD)

  11. Identification of computer graphics objects

    Directory of Open Access Journals (Sweden)

    Rossinskyi Yu.M.

    2016-04-01

    Full Text Available The article is devoted to the use of computer graphics methods in problems of creating drawings, charts, drafting, etc. The widespread use of these methods requires the development of efficient algorithms for identifying the objects of a drawing. The article analyzes existing algorithms for this problem and considers the possibility of reducing processing time by using graphics editing operations. Editing operations such as copying, moving and deleting act on the images of specified objects, and therefore require a reliable method for identifying object images. The information describing an object's image should include, along with its identity and color, its spatial location and other characteristics (the thickness and style of contour lines, fill style, and so on). To enable pixel-level image analysis of this structured information, the object identifier is encoded into the image's color at creation time. The article shows the results of implementing this identifier-encoding algorithm. To simplify the construction of drawings of any kind and to reduce its labour intensity, a method of drawing-object identification is proposed that uses the object's color as its identifying information.
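
    A minimal sketch of the colour-as-identifier idea described above (the packing scheme is illustrative, not the article's exact encoding): pack an integer object ID into an RGB triple, so reading any pixel the object painted recovers which object drew it.

        def id_to_rgb(obj_id):
            # pack a 24-bit object identifier into an (R, G, B) triple
            return ((obj_id >> 16) & 0xFF, (obj_id >> 8) & 0xFF, obj_id & 0xFF)

        def rgb_to_id(r, g, b):
            # recover the identifier from any pixel the object painted
            return (r << 16) | (g << 8) | b

        assert rgb_to_id(*id_to_rgb(123456)) == 123456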

  12. Reliability and protection against failure in computer systems

    International Nuclear Information System (INIS)

    Daniels, B.K.

    1979-01-01

    Computers are being increasingly integrated into the control and safety systems of large and potentially hazardous industrial processes. This development introduces problems which are particular to computer systems and opens the way to new techniques of solving conventional reliability and availability problems. References to the developing fields of software reliability, human factors and software design are given, and these subjects are related, where possible, to the quantified assessment of reliability. Original material is presented in the areas of reliability growth and computer hardware failure data. The report draws on the experience of the National Centre of Systems Reliability in assessing the capability and reliability of computer systems both within the nuclear industry, and from the work carried out in other industries by the Systems Reliability Service. (author)

  13. Effects of image enhancement on reliability of landmark identification in digital cephalometry

    Directory of Open Access Journals (Sweden)

    M Oshagh

    2013-01-01

    Full Text Available Introduction: Although digital cephalometric radiography is gaining popularity in orthodontic practice, the most important source of error in its tracing is uncertainty in landmark identification. Therefore, efforts to improve accuracy in landmark identification have been directed primarily toward improving image quality. One of the more useful techniques is digital image enhancement, which can increase the overall visual quality of the image, but this does not necessarily mean better identification of landmarks. The purpose of this study was to evaluate the effectiveness of digital image enhancement on the reliability of landmark identification. Materials and Methods: Fifteen common landmarks, including 10 skeletal and 5 soft-tissue landmarks, were selected on the cephalograms of 20 randomly selected patients, prepared in Natural Head Position (NHP). Two observers (orthodontists) identified the landmarks on the 20 original photostimulable phosphor (PSP) digital cephalogram images and on the 20 enhanced digital images, twice, with an intervening time interval of at least 4 weeks. The x and y coordinates were further analyzed to evaluate the pattern of recording differences in the horizontal and vertical directions. Reliability of landmark identification was analyzed by paired t test. Results: There was a significant difference between original and enhanced digital images in the reliability of points Ar and N in the vertical and horizontal dimensions, with enhanced images significantly more reliable than original images. Identification of A point, Pogonion and Pronasale points in the vertical dimension of enhanced images was significantly more reliable than in original ones. Reliability of Menton point identification in the horizontal dimension was significantly higher in enhanced images than in original ones. Conclusion: Direct digital image enhancement by altering brightness and contrast can increase the reliability of some landmark identifications, and this may lead to more...
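
    The enhancement described, altering brightness and contrast, is commonly realized as a linear pixel transform, out = alpha * pixel + beta, clipped to the valid 8-bit range. A hedged sketch with illustrative values (the study's actual enhancement settings are not given in the abstract):

        import numpy as np

        def enhance(image, alpha=1.4, beta=15.0):
            """image: uint8 grayscale cephalogram; alpha: contrast; beta: brightness."""
            out = alpha * image.astype(np.float32) + beta
            return np.clip(out, 0, 255).astype(np.uint8)

        cephalogram = np.random.randint(0, 256, (512, 512), dtype=np.uint8)  # stand-in data
        print(enhance(cephalogram).mean())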

  14. Sigma: computer vision in the service of safety and reliability in the inspection services

    International Nuclear Information System (INIS)

    Pineiro, P. J.; Mendez, M.; Garcia, A.; Cabrera, E.; Regidor, J. J.

    2012-01-01

    Vision computing has been growing very fast in the last decade, with very efficient tools and algorithms becoming available. This allows the development of new applications in the nuclear field, providing more efficient equipment and tasks: redundant systems, vision-guided mobile robots, automated visual defect recognition, measurement, etc. In this paper Tecnatom describes a detailed example of a vision computing application developed to provide secure, redundant identification of the thousands of tubes existing in a power plant steam generator. Some other ongoing or planned vision computing projects by Tecnatom are also introduced. New possibilities of application appear in the inspection systems for nuclear components, where the main objective is to maximize their reliability. (Author) 6 refs.

  15. High-reliability computing for the smarter planet

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Graham, Paul; Manuzzato, Andrea; Dehon, Andre

    2010-01-01

    The geometric rate of improvement of transistor size and integrated circuit performance, known as Moore's Law, has been an engine of growth for our economy, enabling new products and services, creating new value and wealth, increasing safety, and removing menial tasks from our daily lives. Affordable, highly integrated components have enabled both life-saving technologies and rich entertainment applications. Anti-lock brakes, insulin monitors, and GPS-enabled emergency response systems save lives. Cell phones, internet appliances, virtual worlds, realistic video games, and mp3 players enrich our lives and connect us together. Over the past 40 years of silicon scaling, the increasing capabilities of inexpensive computation have transformed our society through automation and ubiquitous communications. In this paper, we will present the concept of the smarter planet, how reliability failures affect current systems, and methods that can be used to increase the reliable adoption of new automation in the future. We will illustrate these issues using a number of different electronic devices in a couple of different scenarios. Recently, IBM has been presenting the idea of a 'smarter planet.' In smarter planet documents, IBM discusses increased computer automation of roadways, banking, healthcare, and infrastructure, as automation could create more efficient systems. A necessary component of the smarter planet concept is to ensure that these new systems have very high reliability. Even extremely rare reliability problems can easily escalate to problematic scenarios when implemented at very large scales. For life-critical systems, such as automobiles, infrastructure, medical implantables, and avionic systems, unmitigated failures could be dangerous. As more automation moves into these types of critical systems, reliability failures will need to be managed. As computer automation continues to increase in our society, the need for greater radiation reliability will continue to grow.

  16. High-reliability computing for the smarter planet

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, Heather M [Los Alamos National Laboratory; Graham, Paul [Los Alamos National Laboratory; Manuzzato, Andrea [UNIV OF PADOVA; Dehon, Andre [UNIV OF PENN; Carter, Nicholas [INTEL CORPORATION

    2010-01-01

    The geometric rate of improvement of transistor size and integrated circuit performance, known as Moore's Law, has been an engine of growth for our economy, enabling new products and services, creating new value and wealth, increasing safety, and removing menial tasks from our daily lives. Affordable, highly integrated components have enabled both life-saving technologies and rich entertainment applications. Anti-lock brakes, insulin monitors, and GPS-enabled emergency response systems save lives. Cell phones, internet appliances, virtual worlds, realistic video games, and mp3 players enrich our lives and connect us together. Over the past 40 years of silicon scaling, the increasing capabilities of inexpensive computation have transformed our society through automation and ubiquitous communications. In this paper, we will present the concept of the smarter planet, how reliability failures affect current systems, and methods that can be used to increase the reliable adoption of new automation in the future. We will illustrate these issues using a number of different electronic devices in a couple of different scenarios. Recently IBM has been presenting the idea of a 'smarter planet.' In smarter planet documents, IBM discusses increased computer automation of roadways, banking, healthcare, and infrastructure, as automation could create more efficient systems. A necessary component of the smarter planet concept is to ensure that these new systems have very high reliability. Even extremely rare reliability problems can easily escalate to problematic scenarios when implemented at very large scales. For life-critical systems, such as automobiles, infrastructure, medical implantables, and avionic systems, unmitigated failures could be dangerous. As more automation moves into these types of critical systems, reliability failures will need to be managed. As computer automation continues to increase in our society, the need for greater radiation reliability is

  17. Reliability and validity of a talent identification test battery for seated and standing Paralympic throws.

    Science.gov (United States)

    Spathis, Jemima Grace; Connick, Mark James; Beckman, Emma Maree; Newcombe, Peter Anthony; Tweedy, Sean Michael

    2015-01-01

    Paralympic throwing events for athletes with physical impairments comprise seated and standing javelin, shot put, discus and seated club throwing. Identification of talented throwers would enable prediction of future success and promote participation; however, a valid and reliable talent identification battery for Paralympic throwing has not been reported. This study evaluates the reliability and validity of a talent identification battery for Paralympic throws. Participants were non-disabled so that impairment would not confound analyses, and results would provide an indication of normative performance. Twenty-eight non-disabled participants (13 M; 15 F) aged 23.6 years (±5.44) performed five kinematically distinct criterion throws (three seated, two standing) and nine talent identification tests (three anthropometric, six motor); 23 were tested a second time to evaluate test-retest reliability. Talent identification test-retest reliability was evaluated using Intra-class Correlation Coefficient (ICC) and Bland-Altman plots (Limits of Agreement). Spearman's correlation assessed strength of association between criterion throws and talent identification tests. Reliability was generally acceptable (mean ICC = 0.89), but two seated talent identification tests require more extensive familiarisation. Correlation strength (mean rs = 0.76) indicated that the talent identification tests can be used to validly identify individuals with competitively advantageous attributes for each of the five kinematically distinct throwing activities. Results facilitate further research in this understudied area.

  18. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a computer-aided reliability analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  19. Planning is not sufficient - Reliable computers need good requirements specifications

    International Nuclear Information System (INIS)

    Matras, J.R.

    1992-01-01

    Computer system reliability is the assurance that a computer system will perform its functions when required to do so. To ensure such reliability, it is important to plan the activities needed for computer system development. These development activities, in turn, require a Computer Quality Assurance Plan (CQAP) that provides the following: a Configuration Management Plan, a Verification and Validation (V and V) Plan, documentation requirements, a defined life cycle, review requirements, and organizational responsibilities. These items are necessary for system reliability; ultimately, however, they are not enough. Development of a reliable system is dependent on the requirements specification. This paper discusses how to use existing industry standards to develop a CQAP. In particular, the paper emphasizes the importance of the requirements specification and of methods for establishing reliability goals. The paper also describes how the revision of ANSI/IEEE-ANS-7-4.3.2, Application Criteria for Digital Computer Systems of Nuclear Power Generating Stations, has addressed these issues

  20. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    Science.gov (United States)

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been developed for predicting software reliability, but each is restricted to particular methodologies and a restricted number of parameters, even though a wide range of techniques and methodologies may be used for reliability prediction. Parameter selection therefore deserves attention when estimating reliability: the estimated reliability of a system may increase or decrease depending on the parameters selected, so the factors that most heavily affect system reliability must be identified. Nowadays, reusability is exploited in many areas of research; it is the basis of Component-Based Systems (CBS), and cost, time and human effort can be saved using Component-Based Software Engineering (CBSE) concepts. CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities exist for applying soft computing techniques to problems in medicine: clinical medicine uses fuzzy logic and neural network methodology significantly, while basic medical science most frequently and preferably uses neural-network and genetic-algorithm hybrids, and medical scientists have shown unavoidable interest in applying soft computing methodologies in the genetics, physiology, radiology, cardiology and neurology disciplines. CBSE encourages users to reuse past and existing software in making new products, providing quality with a saving of time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). This paper presents the working of soft computing

  1. Reliable computer systems design and evaluation

    CERN Document Server

    Siewiorek, Daniel

    2014-01-01

    Enhance your hardware/software reliability. Enhancement of system reliability has been a major concern of computer users and designers, and this major revision of the 1982 classic meets users' continuing need for practical information on this pressing topic. Included are case studies of reliable systems from manufacturers such as Tandem, Stratus, IBM, and Digital, as well as coverage of special systems such as the Galileo Orbiter fault protection system and AT&T telephone switching processors.

  2. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
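
    For flavour, the kind of closed-form redundancy equations such a repository holds can be sketched as follows; the schemes shown (simplex, parallel, triple modular redundancy) are standard textbook cases, not necessarily CARE's exact catalogue.

        import math

        def simplex(lam, t):
            return math.exp(-lam * t)          # single unit, R = exp(-lambda * t)

        def parallel(r, n):
            return 1.0 - (1.0 - r) ** n        # survives if at least one of n works

        def tmr(r):
            return 3 * r**2 - 2 * r**3         # 2-out-of-3 majority voting

        r = simplex(lam=1e-4, t=1000.0)
        print(f"simplex={r:.4f}  parallel(2)={parallel(r, 2):.4f}  TMR={tmr(r):.4f}")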

  3. Comparative reliability of cheiloscopy and palatoscopy in human identification

    Directory of Open Access Journals (Sweden)

    Sharma Preeti

    2009-01-01

    Full Text Available Background: Establishing a person's identity in postmortem scenarios can be a very difficult process. Dental records, fingerprint and DNA comparisons are probably the most common techniques used in this context, allowing fast and reliable identification processes. However, under certain circumstances they cannot always be used; sometimes it is necessary to apply different and less well-known techniques. In forensic identification, lip prints and palatal rugae patterns can lead us to important information and help in a person's identification. This study aims to ascertain the use of lip prints and palatal rugae patterns in identification and sex differentiation. Materials and Methods: A total of 100 subjects, 50 males and 50 females, were selected from among the students of Subharti Dental College, Meerut. The materials used to record lip prints were lipstick, bond paper, cellophane tape, a brush for applying the lipstick, and a magnifying lens. To study palatal rugae, alginate impressions were taken and the dental casts analyzed for their various patterns. Results: Statistical analysis (applying the Z-test for proportions) showed significant differences for type I, I', IV and V lip patterns (P < 0.05) in males and females, while no significant difference was observed for the palatal rugae patterns (P > 0.05). Conclusion: This study not only showed that palatal rugae and lip prints are unique to an individual, but also that lip prints are more reliable for recognition of the sex of an individual.

  4. Reliability of Computer Analysis of Electrocardiograms (ECG) of ...

    African Journals Online (AJOL)

    Background: Computer programmes have been introduced to electrocardiography (ECG) with most physicians in Africa depending on computer interpretation of ECG. This study was undertaken to evaluate the reliability of computer interpretation of the 12-Lead ECG in the Black race. Methodology: Using the SCHILLER ...

  5. CADRIGS--computer aided design reliability interactive graphics system

    International Nuclear Information System (INIS)

    Kwik, R.J.; Polizzi, L.M.; Sticco, S.; Gerrard, P.B.; Yeater, M.L.; Hockenbury, R.W.; Phillips, M.A.

    1982-01-01

    An integrated reliability analysis program combining graphic representation of fault trees, automated data base loadings and reference, and automated construction of reliability code input files was developed. The functional specifications for CADRIGS, the computer aided design reliability interactive graphics system, are presented. Previously developed fault tree segments used in auxiliary feedwater system safety analysis were constructed on CADRIGS and, when combined, yielded results identical to those resulting from manual input to the same reliability codes

  6. JUPITER PROJECT - JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY

    Science.gov (United States)

    The JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) project builds on the technology of two widely used codes for sensitivity analysis, data assessment, calibration, and uncertainty analysis of environmental models: PEST and UCODE.

  7. Towards higher reliability of CMS computing facilities

    International Nuclear Information System (INIS)

    Bagliesi, G; Bloom, K; Brew, C; Flix, J; Kreuzer, P; Sciabà, A

    2012-01-01

    The CMS experiment has adopted a computing system where resources are distributed worldwide in more than 50 sites. The operation of the system requires a stable and reliable behaviour of the underlying infrastructure. CMS has established procedures to extensively test all relevant aspects of a site and its capability to sustain the various CMS computing workflows at the required scale. The Site Readiness monitoring infrastructure has been instrumental in understanding how the system as a whole was improving towards LHC operations, measuring the reliability of sites when running CMS activities, and providing sites with the information they need to troubleshoot any problem. This contribution reviews the complete automation of the Site Readiness program, with the description of monitoring tools and their inclusion into the Site Status Board (SSB), the performance checks, the use of tools like HammerCloud, and the impact in improving the overall reliability of the Grid from the point of view of the CMS computing system. These results are used by CMS to select good sites to conduct workflows, in order to maximize workflow efficiencies. Site performance against these tests during the first years of LHC running is also reviewed.

  8. Automatic Identification of the Repolarization Endpoint by Computing the Dominant T-wave on a Reduced Number of Leads.

    Science.gov (United States)

    Giuliani, C; Agostinelli, A; Di Nardo, F; Fioretti, S; Burattini, L

    2016-01-01

    Electrocardiographic (ECG) T-wave endpoint (Tend) identification suffers a lack of reliability due to the presence of noise and variability among leads. Tend identification can be improved by using global repolarization waveforms obtained by combining several leads. The dominant T-wave (DTW) is a global repolarization waveform that has been shown to improve Tend identification when computed using the 15 leads (I to III, aVr, aVl, aVf, V1 to V6, X, Y, Z) usually available in clinics, of which only 8 (I, II, V1 to V6) are independent. The aim of the present study was to evaluate whether the 8 independent leads are sufficient to obtain a DTW which allows a reliable Tend identification. To this aim, Tend measures automatically identified from 15-dependent-lead DTWs of 46 control healthy subjects (CHS) and 103 acute myocardial infarction patients (AMIP) were compared with those obtained from 8-independent-lead DTWs. Results indicate that the Tend distributions have median values that are not statistically different (CHS: 340 ms vs. 340 ms, respectively; AMIP: 325 ms vs. 320 ms, respectively), besides being strongly correlated (CHS: ρ=0.97; AMIP: ρ=0.88). For automatic Tend identification from the DTW, the 8 independent leads can therefore be used without a statistically significant loss of accuracy but with a significant decrease in computational effort. The lead dependence of 7 out of 15 leads does not introduce a significant bias in the Tend determination from 15-dependent-lead DTWs.
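
    One common way to build a global repolarization waveform from several leads is to take the first principal component of the repolarization segments; the paper's exact DTW definition may differ, so the sketch below is illustrative only.

        import numpy as np

        def dominant_wave(leads):
            """leads: (n_leads, n_samples) array of repolarization segments."""
            centered = leads - leads.mean(axis=1, keepdims=True)
            u, s, vt = np.linalg.svd(centered, full_matrices=False)
            return s[0] * vt[0]                # dominant temporal component

        eight_leads = np.random.randn(8, 400)  # stand-in for leads I, II, V1-V6
        print(dominant_wave(eight_leads).shape)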

  9. Identification of Brucella by MALDI-TOF mass spectrometry. Fast and reliable identification from agar plates and blood cultures.

    Directory of Open Access Journals (Sweden)

    Laura Ferreira

    Full Text Available BACKGROUND: MALDI-TOF mass spectrometry (MS) is a reliable method for bacteria identification. Some databases used for this purpose lack reference profiles for Brucella species, which is still an important pathogen in wide areas around the world. We report the creation of profiles for the MALDI-TOF Biotyper 2.0 database (Bruker Daltonics, Germany) and their usefulness for identifying brucellae from culture plates and blood cultures. METHODOLOGY/PRINCIPAL FINDINGS: We created MALDI Biotyper 2.0 profiles for type strains belonging to B. melitensis biotypes 1, 2 and 3; B. abortus biotypes 1, 2, 5 and 9; B. suis, B. canis, B. ceti and B. pinnipedialis. Then, 131 clinical isolates grown on plate cultures were used in triplicate to check identification. Identification at the genus level was always correct, although in most cases the three replicates reported different identifications at the species level. Simulated blood cultures were performed with type strains belonging to the main human pathogenic species (B. melitensis, B. abortus, B. suis and B. canis) and studied by MALDI-TOF MS in triplicate. Identification at the genus level was always correct. CONCLUSIONS/SIGNIFICANCE: MALDI-TOF MS is reliable for Brucella identification to the genus level from culture plates and directly from blood culture bottles.

  10. The reliable solution and computation time of variable parameters Logistic model

    OpenAIRE

    Pengfei, Wang; Xinnong, Pan

    2016-01-01

    The reliable computation time (RCT, marked as Tc) when applying a double precision computation of a variable parameters logistic map (VPLM) is studied. First, using the method proposed, the reliable solutions for the logistic map are obtained. Second, for a time-dependent non-stationary parameters VPLM, 10000 samples of reliable experiments are constructed, and the mean Tc is then computed. The results indicate that for each different initial value, the Tcs of the VPLM are generally different...
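
    The RCT idea can be reproduced in miniature by comparing a double-precision trajectory against a high-precision reference and recording the first step at which they disagree. A sketch using the mpmath library, with illustrative parameters rather than the paper's:

        from mpmath import mp, mpf

        mp.dps = 100                            # 100-digit reference trajectory
        r, tol, nmax = 4.0, 1e-6, 200
        x_double, x_ref = 0.1, mpf("0.1")

        for n in range(1, nmax + 1):
            x_double = r * x_double * (1.0 - x_double)
            x_ref = mpf(r) * x_ref * (1 - x_ref)
            if abs(x_double - float(x_ref)) > tol:
                print(f"double precision unreliable after step {n}")
                break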

  11. Highly reliable computer network for real time system

    International Nuclear Information System (INIS)

    Mohammed, F.A.; Omar, A.A.; Ayad, N.M.A.; Madkour, M.A.I.; Ibrahim, M.K.

    1988-01-01

    Many computer networks have been studied, with different trends regarding network architecture and the various protocols that govern data transfers and guarantee reliable communication among all nodes. A hierarchical network structure has been proposed to provide a simple and inexpensive way to realize a reliable real-time computer network. In such an architecture, all computers in the same level are connected to a common serial channel through intelligent nodes that collectively control data transfers over the serial channel. This level of the network can be considered a local area computer network (LACN) that can be used in a nuclear power plant control system, since such a system has geographically dispersed subsystems. Network expansion is straightforward: the common channel is simply extended for each added host computer. All the nodes are designed around a microprocessor chip to provide the required intelligence. Each node can be divided into two sections, namely a common section that interfaces with the serial data channel and a private section that interfaces with the host computer. The private section naturally tends to show some variation in hardware details to match the requirements of the individual host computer.

  12. Reliability-Centric Analysis of Offloaded Computation in Cooperative Wearable Applications

    Directory of Open Access Journals (Sweden)

    Aleksandr Ometov

    2017-01-01

    Full Text Available Motivated by the unprecedented penetration of mobile communications technology, this work carefully brings into perspective the challenges related to heterogeneous communications and offloaded computation in cases of fault-tolerant communication, computing, and caching. We specifically focus on the emerging augmented reality applications that require reliable delegation of the computing and caching functionality to proximate resource-rich devices. The mathematical model proposed in this work is of value for assessing system-level reliability in cases where one or more nearby collaborating nodes become temporarily unavailable. Our analytical and simulation results corroborate the asymptotic insensitivity of the stationary reliability of the system in question (under "fast" recovery of its elements) to the type of the "repair" time distribution, thus supporting fault-tolerant system operation.

  13. Building fast, reliable, and adaptive software for computational science

    International Nuclear Information System (INIS)

    Rendell, A P; Antony, J; Armstrong, W; Janes, P; Yang, R

    2008-01-01

    Building fast, reliable, and adaptive software is a constant challenge for computational science, especially given recent developments in computer architecture. This paper outlines some of our efforts to address these three issues in the context of computational chemistry. First, a simple linear performance model that can be used to model and predict the performance of Hartree-Fock calculations is discussed. Second, the use of interval arithmetic to assess the numerical reliability of the sort of integrals used in electronic structure methods is presented. Third, the use of dynamic code modification as part of a framework to support adaptive software is outlined

  14. Selection of suitable hand gestures for reliable myoelectric human computer interface.

    Science.gov (United States)

    Castro, Maria Claudia F; Arjunan, Sridhar P; Kumar, Dinesh K

    2015-04-09

    Myoelectric-controlled prosthetic hands require machine-based identification of hand gestures using the surface electromyogram (sEMG) recorded from the forearm muscles. This study observed that a subset of the hand gestures has to be selected for accurate automated hand gesture recognition, and reports a method to select these gestures to maximize sensitivity and specificity. Experiments were conducted where sEMG was recorded from the muscles of the forearm while subjects performed hand gestures; the recordings were then classified off-line. The performances of ten gestures were ranked using the proposed Positive-Negative Performance Measurement Index (PNM), generated by a series of confusion matrices. When using all ten gestures, the sensitivity and specificity were 80.0% and 97.8%. After ranking the gestures using the PNM, six gestures were selected that gave sensitivity and specificity greater than 95% (96.5% and 99.3%): hand open, hand close, little finger flexion, ring finger flexion, middle finger flexion and thumb flexion. This work has shown that reliable myoelectric-based human computer interface systems require careful selection of the gestures that have to be recognized; without such selection, the reliability is poor.
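
    A simplified stand-in for the ranking step (the paper's PNM index is more elaborate): compute per-gesture sensitivity and specificity from a confusion matrix and rank gestures by their sum.

        import numpy as np

        def per_gesture_stats(cm):
            # returns (sensitivity, specificity) for each class of confusion matrix cm
            stats, total = [], cm.sum()
            for i in range(cm.shape[0]):
                tp = cm[i, i]
                fn = cm[i].sum() - tp
                fp = cm[:, i].sum() - tp
                tn = total - tp - fn - fp
                stats.append((tp / (tp + fn), tn / (tn + fp)))
            return stats

        cm = np.array([[48, 2], [5, 45]])       # toy 2-gesture confusion matrix
        ranked = sorted(enumerate(per_gesture_stats(cm)),
                        key=lambda item: -(item[1][0] + item[1][1]))
        print(ranked)                           # best-performing gesture first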

  15. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
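
    Architecture-based reliability prediction of this kind is often illustrated with a Cheung-style absorbing Markov chain: scale the control-flow transition matrix by per-component reliabilities and solve a linear system. A sketch with invented numbers (the paper's Markov/COSMIC-FFP construction will differ):

        import numpy as np

        P = np.array([[0.0, 0.7, 0.3],           # control flow between 3 components
                      [0.0, 0.0, 1.0],
                      [0.0, 0.0, 0.0]])          # component 3 ends the run
        R = np.array([0.99, 0.98, 0.995])        # per-component reliabilities

        Q = R[:, None] * P                       # survive component i, then transfer
        N = np.linalg.inv(np.eye(3) - Q)         # expected-visits matrix
        print(f"system reliability ~ {N[0, -1] * R[-1]:.4f}")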

  16. Systems reliability analysis: applications of the SPARCS System-Reliability Assessment Computer Program

    International Nuclear Information System (INIS)

    Locks, M.O.

    1978-01-01

    SPARCS-2 (Simulation Program for Assessing the Reliabilities of Complex Systems, Version 2) is a PL/1 computer program for assessing (establishing interval estimates for) the reliability and the MTBF of a large and complex s-coherent system of any modular configuration. The system can consist of a complex logical assembly of independently failing attribute (binomial-Bernoulli) and time-to-failure (Poisson-exponential) components, without regard to their placement. Alternatively, it can be a configuration of independently failing modules, where each module has either or both attribute and time-to-failure components. SPARCS-2 also has an improved super-modularity feature. Modules with minimal-cut unreliability calculations can be mixed with those having minimal-path reliability calculations. All output has been standardized to system reliability or probability of success, regardless of the form in which the input data is presented, and whatever the configuration of modules or elements within modules
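
    The minimal-cut unreliability calculation mentioned here is typically a union (rare-event) bound: system unreliability is at most the sum, over minimal cut sets, of the product of the component unreliabilities in each cut. A sketch with hypothetical components:

        q = {"A": 0.01, "B": 0.02, "C": 0.01, "D": 0.005}   # component unreliabilities
        min_cuts = [{"A", "B"}, {"C"}, {"B", "D"}]           # hypothetical minimal cuts

        def unreliability_bound(min_cuts, q):
            bound = 0.0
            for cut in min_cuts:                 # P(any cut fails) <= sum of products
                prod = 1.0
                for comp in cut:
                    prod *= q[comp]
                bound += prod
            return bound

        print(f"system unreliability <= {unreliability_bound(min_cuts, q):.6f}")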

  17. Computational methods for protein identification from mass spectrometry data.

    Directory of Open Access Journals (Sweden)

    Leo McHugh

    2008-02-01

    Full Text Available Protein identification using mass spectrometry is an indispensable computational tool in the life sciences. A dramatic increase in the use of proteomic strategies to understand the biology of living systems generates an ongoing need for more effective, efficient, and accurate computational methods for protein identification. A wide range of computational methods, each with various implementations, are available to complement different proteomic approaches. A solid knowledge of the range of algorithms available and, more critically, the accuracy and effectiveness of these techniques is essential to ensure as many of the proteins as possible, within any particular experiment, are correctly identified. Here, we undertake a systematic review of the currently available methods and algorithms for interpreting, managing, and analyzing biological data associated with protein identification. We summarize the advances in computational solutions as they have responded to corresponding advances in mass spectrometry hardware. The evolution of scoring algorithms and metrics for automated protein identification are also discussed with a focus on the relative performance of different techniques. We also consider the relative advantages and limitations of different techniques in particular biological contexts. Finally, we present our perspective on future developments in the area of computational protein identification by considering the most recent literature on new and promising approaches to the problem as well as identifying areas yet to be explored and the potential application of methods from other areas of computational biology.

  18. Estimating the reliability of eyewitness identifications from police lineups.

    Science.gov (United States)

    Wixted, John T; Mickes, Laura; Dunn, John C; Clark, Steven E; Wells, William

    2016-01-12

    Laboratory-based mock crime studies have often been interpreted to mean that (i) eyewitness confidence in an identification made from a lineup is a weak indicator of accuracy and (ii) sequential lineups are diagnostically superior to traditional simultaneous lineups. Largely as a result, juries are increasingly encouraged to disregard eyewitness confidence, and up to 30% of law enforcement agencies in the United States have adopted the sequential procedure. We conducted a field study of actual eyewitnesses who were assigned to simultaneous or sequential photo lineups in the Houston Police Department over a 1-y period. Identifications were made using a three-point confidence scale, and a signal detection model was used to analyze and interpret the results. Our findings suggest that (i) confidence in an eyewitness identification from a fair lineup is a highly reliable indicator of accuracy and (ii) if there is any difference in diagnostic accuracy between the two lineup formats, it likely favors the simultaneous procedure.

  19. Reliability and Identification of Aortic Valve Prolapse in the Horse

    Directory of Open Access Journals (Sweden)

    Hallowell Gayle D

    2013-01-01

    Full Text Available Background: The objectives were to determine and assess the reliability of criteria for identification of aortic valve prolapse (AVP) using echocardiography in the horse. Results: Opinion of equine cardiologists indicated that a long-axis view of the aortic valve (AoV) was most commonly used for identification of AVP (46%; n=13). There was consensus that AVP could be mimicked by ultrasound probe malalignment. This was confirmed in 7 healthy horses, where the appearance of AVP could be induced by malalignment. In a study of a further 8 healthy horses (5 with AVP) examined daily for 5 days by two echocardiographers, standardized imaging guidelines gave good to excellent agreement for the assessment of AVP (kappa>0.80) and good agreement between days and observers (kappa>0.6). The technique allowed for assessment of the degree of prolapse and measurement of the prolapse distance, which provided excellent agreement between echocardiographers, days and observers (kappa/ICC>0.8). Assessments made using real-time zoomed images provided similar measurements to the standard views (ICC=0.9), with agreement for the identification of AVP (kappa>0.8). Short-axis views of the AoV were used for identification of AVP by fewer respondents (23%); however, they provided less agreement for the identification of AVP (kappa>0.6) and only adequate agreement with observations made in long axis (kappa>0.5), with AVP being identified more often in short axis (92%) than in long axis (76%). Orthogonal views were used by 31% of respondents to identify the presence of AVP, and by 85% to identify the cusp. Identification on both views on 4 days was used to categorise horses as having AVP, providing a positive predictive value of 79% and a negative predictive value of 18%. Only the non-coronary cusp (NCC) of the AoV was observed to prolapse in these studies. Prolapse of the NCC was confirmed during the optimisation study using four-dimensional echocardiography, which concurred with the findings...

  20. Growth characteristics of liquid cultures increase the reliability of presumptive identification of Mycobacterium tuberculosis complex.

    Science.gov (United States)

    Pinhata, Juliana Maira Watanabe; Felippe, Isis Moreira; Gallo, Juliana Failde; Chimara, Erica; Ferrazoli, Lucilaine; de Oliveira, Rosangela Siqueira

    2018-04-23

    We evaluated the microscopic and macroscopic characteristics of mycobacteria growth indicator tube (MGIT) cultures for the presumptive identification of the Mycobacterium tuberculosis complex (MTBC) and assessed the reliability of this strategy for correctly directing isolates to drug susceptibility testing (DST) or species identification. A total of 1526 isolates of mycobacteria received at the Instituto Adolfo Lutz were prospectively subjected to presumptive identification by the observation of growth characteristics along with cord formation detection via microscopy. The presumptive identification showed a sensitivity, specificity and accuracy of 98.8, 92.5 and 97.9 %, respectively. Macroscopic analysis of MTBC isolates that would have been erroneously classified as non-tuberculous mycobacteria based solely on microscopic morphology enabled us to direct them rapidly to DST, representing a substantial gain to patients. In conclusion, the growth characteristics of mycobacteria in MGIT, when considered along with cord formation, increased the reliability of the presumptive identification, which has a great impact on the laboratory budget and turnaround times.

  1. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes integrated Markovian and back-propagation neural network approaches to compute the reliability of a system. Since the states in which failures occur are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks shown by the Markovian model for steady-state reliability computations, and by the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implication purposes, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implication is shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
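
    As a minimal Markovian building block of the kind such models rest on, the steady-state availability of a single repairable unit with failure rate lambda and repair rate mu is A = mu / (lambda + mu); the AGV figures below are hypothetical.

        lam, mu = 0.002, 0.1                     # failure and repair rates per hour
        availability = mu / (lam + mu)           # steady state of the 2-state chain
        print(f"steady-state availability = {availability:.4f}")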

  2. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  3. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  4. HuRECA: Human Reliability Evaluator for Computer-based Control Room Actions

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Lee, Seung Jun; Jang, Seung Cheol

    2011-01-01

    As computer-based design features such as computer-based procedures (CBPs), soft controls (SCs), and integrated information systems are being adopted in the main control rooms (MCRs) of nuclear power plants, a human reliability analysis (HRA) method capable of dealing with the effects of these design features on human reliability is needed. From observations of human factors engineering verification and validation experiments, we have identified some important characteristics of operator behaviors and design-related influencing factors (DIFs) from the perspective of human reliability. Firstly, there are new DIFs that should be considered in developing an HRA method for computer-based control rooms, especially the CBP and the SCs. In the case of the computer-based procedure, unlike the paper-based procedure, the structural and managerial elements should be considered as important performance shaping factors (PSFs) in addition to the procedural contents. In the case of the soft controls, the so-called interface management tasks (or secondary tasks) should be reflected in the assessment of human error probability. Secondly, computer-based control rooms can provide more effective error recovery features than conventional control rooms. Major error recovery features for computer-based control rooms include the automatic logic checking function of the computer-based procedure and the information sharing feature of general computer-based designs.

  5. Reliability of system identification techniques to assess standing balance in healthy elderly

    NARCIS (Netherlands)

    Pasma, Jantsje H.; Engelhart, Denise; Maier, Andrea B.; Aarts, Ronald G.K.M.; Van Gerven, Joop M.A.; Arendzen, J. Hans; Schouten, Alfred C.; Meskers, Carel G.M.; Van Kooij, Herman Der

    2016-01-01

    Objectives System identification techniques have the potential to assess the contribution of the underlying systems involved in standing balance by applying well-known disturbances. We investigated the reliability of standing balance parameters obtained with multivariate closed loop system

  6. Reliability of System Identification Techniques to Assess Standing Balance in Healthy Elderly

    NARCIS (Netherlands)

    Pasma, J.H.; Engelhart, D.; Maier, A.B.; Aarts, R.G.K.M.; Van Gerven, J.M.A.; Arendzen, J.H.; Schouten, A.C.; Meskers, C.G.M.; Van der Kooij, H.

    2016-01-01

    Objectives System identification techniques have the potential to assess the contribution of the underlying systems involved in standing balance by applying well-known disturbances. We investigated the reliability of standing balance parameters obtained with multivariate closed loop system

  7. Precision of lumbar intervertebral measurements: does a computer-assisted technique improve reliability?

    Science.gov (United States)

    Pearson, Adam M; Spratt, Kevin F; Genuario, James; McGough, William; Kosman, Katherine; Lurie, Jon; Sengupta, Dilip K

    2011-04-01

    Comparison of intra- and interobserver reliability of digitized manual and computer-assisted intervertebral motion measurements and classification of "instability." To determine if computer-assisted measurement of lumbar intervertebral motion on flexion-extension radiographs improves reliability compared with digitized manual measurements. Many studies have questioned the reliability of manual intervertebral measurements, although few have compared the reliability of computer-assisted and manual measurements on lumbar flexion-extension radiographs. Intervertebral rotation, anterior-posterior (AP) translation, and change in anterior and posterior disc height were measured with a digitized manual technique by three physicians and by three other observers using computer-assisted quantitative motion analysis (QMA) software. Each observer measured 30 sets of digital flexion-extension radiographs (L1-S1) twice. Shrout-Fleiss intraclass correlation coefficients for intra- and interobserver reliabilities were computed. The stability of each level was also classified (instability defined as >4 mm AP translation or >10° rotation), and the intra- and interobserver reliabilities of the two methods were compared using adjusted percent agreement (APA). Intraobserver reliability intraclass correlation coefficients were substantially higher for the QMA technique than for the digitized manual technique across all measurements: rotation 0.997 versus 0.870, AP translation 0.959 versus 0.557, change in anterior disc height 0.962 versus 0.770, and change in posterior disc height 0.951 versus 0.283. The same pattern was observed for interobserver reliability (rotation 0.962 vs. 0.693, AP translation 0.862 vs. 0.151, change in anterior disc height 0.862 vs. 0.373, and change in posterior disc height 0.730 vs. 0.300). The QMA technique was also more reliable for the classification of "instability." Intraobserver APAs ranged from 87 to 97% for QMA versus 60% to 73% for digitized manual
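
    As an illustration of the statistic used above, the sketch below computes a Shrout-Fleiss intraclass correlation coefficient, here ICC(3,1), the two-way mixed-effects, single-measurement form, from a toy rating matrix via the standard ANOVA decomposition. The rating data are invented, and the exact ICC variant used in the study may differ.

```python
import numpy as np

def icc_3_1(ratings: np.ndarray) -> float:
    """Shrout-Fleiss ICC(3,1): two-way mixed effects, single measurement.

    ratings: (n_subjects, k_raters) matrix of scores.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)      # per-subject means
    col_means = ratings.mean(axis=0)      # per-rater means
    # Two-way ANOVA sums of squares
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((ratings - grand) ** 2)
    ss_err = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical example: 6 motion measurements, each rated twice
ratings = np.array([[9.1, 9.0], [4.2, 4.5], [6.8, 6.9],
                    [2.1, 2.4], [7.7, 7.5], [5.0, 5.2]])
print(f"ICC(3,1) = {icc_3_1(ratings):.3f}")
```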

  8. To the problem of reliability standardization in computer-aided manufacturing at NPP units

    International Nuclear Information System (INIS)

    Yastrebenetskij, M.A.; Shvyryaev, Yu.V.; Spektor, L.I.; Nikonenko, I.V.

    1989-01-01

    The problems of reliability standardization in computer-aided manufacturing at NPP units are analyzed considering the following approaches: computer-aided manufacturing of NPP units as a part of an automated technological complex, and computer-aided manufacturing of NPP units as a multi-functional system. The selection of the composition of reliability indices for computer-aided manufacturing of NPP units under each of the approaches considered is substantiated.

  9. Assessment of physical server reliability in multi cloud computing system

    Science.gov (United States)

    Kalyani, B. J. D.; Rao, Kolasani Ramchand H.

    2018-04-01

    Business organizations nowadays function with more than one cloud provider. By spreading cloud deployment across multiple service providers, they create space for competitive prices that minimize the burden on enterprises' spending budgets. To assess the software reliability of a multi-cloud application, a layered software reliability assessment paradigm is considered, with three levels of abstraction: the application layer, the virtualization layer, and the server layer. The reliability of each layer is assessed separately and combined to get the reliability of the multi-cloud computing application. In this paper, we focus on how to assess the reliability of the server layer with the required algorithms and explore the steps in the assessment of server reliability.
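
    A minimal sketch of the layered composition described above, under the simplifying assumption that the application, virtualization and server layers act in series (the application works only if every layer works), with hypothetical numbers; a second provider is then treated as a parallel redundant deployment:

```python
from math import prod

# Hypothetical per-layer reliabilities of one cloud deployment
layers = {"application": 0.999, "virtualization": 0.995, "server": 0.990}

# Series composition: the application is up only if all layers are up
r_single_cloud = prod(layers.values())

# With two independent providers hosting redundant replicas, the
# service fails only if both deployments fail (parallel composition)
r_multi_cloud = 1 - (1 - r_single_cloud) ** 2

print(f"single cloud: {r_single_cloud:.6f}, two clouds: {r_multi_cloud:.6f}")
```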

  10. Evaluation of Network Reliability for Computer Networks with Multiple Sources

    Directory of Open Access Journals (Sweden)

    Yi-Kuei Lin

    2012-01-01

    Full Text Available Evaluating the reliability of a network with multiple sources to multiple sinks is a critical issue from the perspective of quality management. Due to the unrealistic definition of paths of network models in previous literature, existing models are not appropriate for real-world computer networks such as the Taiwan Advanced Research and Education Network (TWAREN). This paper proposes a modified stochastic-flow network model to evaluate the network reliability of a practical computer network with multiple sources where data is transmitted through several light paths (LPs). Network reliability is defined as the probability of delivering a specified amount of data from the sources to the sink. It is taken as a performance index to measure the service level of TWAREN. This paper studies the network reliability of the international portion of TWAREN from two sources (Taipei and Hsinchu) to one sink (New York) that goes through submarine and land-surface cables between Taiwan and the United States.
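
    As a rough illustration of the quantity being evaluated, the sketch below estimates by Monte Carlo simulation the probability that at least one source can still reach the sink when links fail independently. This is a simplified connectivity version of the problem; the paper's stochastic-flow model additionally tracks link capacities and a demanded amount of data, and the topology and link reliabilities here are invented.

```python
import random
import networkx as nx

# Hypothetical topology: two sources, one sink, per-link availability
links = [("Taipei", "Hsinchu", 0.99), ("Taipei", "NY", 0.95),
         ("Hsinchu", "NY", 0.95), ("Taipei", "LA", 0.97),
         ("LA", "NY", 0.98)]
sources, sink = ["Taipei", "Hsinchu"], "NY"

def connected_sample() -> bool:
    g = nx.Graph()
    g.add_nodes_from({u for u, v, _ in links} | {v for _, v, _ in links})
    for u, v, p in links:
        if random.random() < p:        # link survives with probability p
            g.add_edge(u, v)
    return any(nx.has_path(g, s, sink) for s in sources)

n = 100_000
est = sum(connected_sample() for _ in range(n)) / n
print(f"estimated source-to-sink reliability: {est:.4f}")
```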

  11. Computational botany methods for automated species identification

    CERN Document Server

    Remagnino, Paolo; Wilkin, Paul; Cope, James; Kirkup, Don

    2017-01-01

    This book discusses innovative methods for mining information from images of plants, especially leaves, and highlights the diagnostic features that can be implemented in fully automatic systems for identifying plant species. Adopting a multidisciplinary approach, it explores the problem of plant species identification, covering both the concepts of taxonomy and morphology. It then provides an overview of morphometrics, including the historical background and the main steps in the morphometric analysis of leaves together with a number of applications. The core of the book focuses on novel diagnostic methods for plant species identification developed from a computer scientist’s perspective. It then concludes with a chapter on the characterization of botanists' visions, which highlights important cognitive aspects that can be implemented in a computer system to more accurately replicate the human expert’s fixation process. The book not only represents an authoritative guide to advanced computational tools fo...

  12. Polynomial-time computability of the edge-reliability of graphs using Gilbert's formula

    Directory of Open Access Journals (Sweden)

    Marlowe Thomas J.

    1998-01-01

    Full Text Available Reliability is an important consideration in analyzing computer and other communication networks, but current techniques are extremely limited in the classes of graphs which can be analyzed efficiently. While Gilbert's formula establishes a theoretically elegant recursive relationship between the edge reliability of a graph and the reliability of its subgraphs, naive evaluation requires consideration of all sequences of deletions of individual vertices, and for many graphs has time complexity essentially Θ(N!). We discuss a general approach which significantly reduces complexity, encoding subgraph isomorphism in a finer partition by invariants, and recursing through the set of invariants. We illustrate this approach using threshold graphs, and show that any computation of reliability using Gilbert's formula will be polynomial-time if and only if the number of invariants considered is polynomial; we then show families of graphs with polynomial-time and non-polynomial-time reliability computation, and show that these encompass most previously known results. We then codify our approach to indicate how it can be used for other classes of graphs, and suggest several classes to which the technique can be applied.
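
    For readers unfamiliar with Gilbert's formula, the sketch below shows its classical form for a complete graph whose edges operate independently with probability p: the probability P(n) that the surviving subgraph is connected is obtained by subtracting, over each possible size k < n of the component containing a fixed vertex, the probability that this component is isolated from the rest. This naive evaluation is what the paper's invariant-based approach improves upon for general graphs.

```python
from functools import lru_cache
from math import comb

def connectivity_probability(n_vertices: int, p: float) -> float:
    """Gilbert's recursion for the edge-reliability of a complete graph:
    P(n) = 1 - sum_{k=1}^{n-1} C(n-1, k-1) * P(k) * q^(k*(n-k)),
    where q = 1 - p and k is the size of a fixed vertex's component."""
    q = 1.0 - p

    @lru_cache(maxsize=None)
    def P(n: int) -> float:
        if n == 1:
            return 1.0
        return 1.0 - sum(comb(n - 1, k - 1) * P(k) * q ** (k * (n - k))
                         for k in range(1, n))

    return P(n_vertices)

print(connectivity_probability(6, 0.9))   # reliability of K6 with p = 0.9
```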

  13. A universal and reliable assay for molecular sex identification of three-spined sticklebacks (Gasterosteus aculeatus).

    Science.gov (United States)

    Toli, E-A; Calboli, F C F; Shikano, T; Merilä, J

    2016-11-01

    In heterogametic species, biological differences between the two sexes are ubiquitous, and hence, errors in sex identification can be a significant source of noise and bias in studies where sex-related sources of variation are of interest or need to be controlled for. We developed and validated a universal multimarker assay for reliable sex identification of three-spined sticklebacks (Gasterosteus aculeatus). The assay makes use of genotype scores from three sex-linked loci and utilizes Bayesian probabilistic inference to identify sex of the genotyped individuals. The results, validated with 286 phenotypically sexed individuals from six populations of sticklebacks representing all major genetic lineages (cf. Pacific, Atlantic and Japan Sea), indicate that in contrast to commonly used single-marker-based sex identification assays, the developed multimarker assay should be 100% accurate. As the markers in the assay can be scored from agarose gels, it provides a quick and cost-efficient tool for universal sex identification of three-spined sticklebacks. The general principle of combining information from multiple markers to improve the reliability of sex identification is transferable and can be utilized to develop and validate similar assays for other species. © 2016 John Wiley & Sons Ltd.
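
    The 'Bayesian probabilistic inference' step can be illustrated with a toy posterior computation. The per-locus likelihoods below are invented placeholders (the assay's actual genotype scoring is in the paper), but the structure, per-locus likelihoods multiplied under an independence assumption and normalized against a 50:50 prior, follows the general principle described:

```python
from math import prod

# Hypothetical P(male-like marker score | sex) for three sex-linked loci
LIKELIHOOD = {  # locus: (P(score | male), P(score | female))
    "locus1": (0.98, 0.02),
    "locus2": (0.95, 0.05),
    "locus3": (0.99, 0.01),
}

def posterior_male(scores: dict[str, bool], prior_male: float = 0.5) -> float:
    """Posterior P(male | observed scores), assuming loci are independent."""
    lm = prod(LIKELIHOOD[l][0] if s else 1 - LIKELIHOOD[l][0]
              for l, s in scores.items())
    lf = prod(LIKELIHOOD[l][1] if s else 1 - LIKELIHOOD[l][1]
              for l, s in scores.items())
    return lm * prior_male / (lm * prior_male + lf * (1 - prior_male))

# Two loci scored male-like, one ambiguous call scored female-like
print(posterior_male({"locus1": True, "locus2": True, "locus3": False}))
```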

  14. Distributed Information and Control system reliability enhancement by fog-computing concept application

    Science.gov (United States)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-03-01

    The paper focuses on the information and control system reliability issue. The authors of the current paper propose a new complex approach to information and control system reliability enhancement by application of fog-computing concept elements. The proposed approach consists of a complex of optimization problems to be solved. These problems are: estimation of the computational complexity which can be shifted to the edge of the network and the fog layer, distribution of computations among the data processing elements, and distribution of computations among the sensors. The problems, as well as some simulation results and discussion, are formulated and presented within this paper.

  15. Fog-computing concept usage as means to enhance information and control system reliability

    Science.gov (United States)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-05-01

    This paper focuses on the reliability issue of information and control systems (ICS). The authors propose using elements of the fog-computing concept to enhance the reliability function. The key idea of fog computing is to shift computations to the fog layer of the network, and thus to decrease the workload of the communication environment and data processing components. As for ICS, workload can also be distributed among sensors, actuators and network infrastructure facilities near the sources of data. The authors simulated typical workload distribution situations for the “traditional” ICS architecture and for one using fog-computing concept elements. The paper contains some models, selected simulation results and a conclusion about the prospects of fog computing as a means to enhance ICS reliability.

  16. Comparison of reliability of lateral cephalogram and computed ...

    African Journals Online (AJOL)

    of malocclusion and airway space using lateral cephalogram and computed tomography (CT) and to compare its reliability. To obtain important information on the morphology of the soft palate on lateral cephalogram and to determine its etiopathogenesis in obstructive sleep apnea (OSA). Materials and Methods: Lateral ...

  17. Reliability of a computer and Internet survey (Computer User Profile) used by adults with and without traumatic brain injury (TBI).

    Science.gov (United States)

    Kilov, Andrea M; Togher, Leanne; Power, Emma

    2015-01-01

    To determine the test-retest reliability of the 'Computer User Profile' (CUP) in people with and without TBI. The CUP was administered on two occasions to people with and without TBI. The CUP investigated the nature and frequency of participants' computer and Internet use. Intra-class correlation coefficients and kappa coefficients were calculated to measure the reliability of individual CUP items. Descriptive statistics were used to summarize the content of responses. Sixteen adults with TBI and 40 adults without TBI were included in the study. All participants were reliable in reporting demographic information, frequency of social communication and leisure activities, and computer/Internet habits and usage. Adults with TBI were reliable in 77% of their responses to survey items. Adults without TBI were reliable in 88% of their responses to survey items. The CUP was practical and valuable in capturing information about the social, leisure, communication and computer/Internet habits of people with and without TBI. Adults without TBI scored more items with satisfactory reliability overall in their surveys. Future studies may include larger samples and could also include an exploration of how people with/without TBI use other digital communication technologies. This may provide further information on determining technology readiness for people with TBI in therapy programmes.

  18. Rater reliability and concurrent validity of the Keyboard Personal Computer Style instrument (K-PeCS).

    Science.gov (United States)

    Baker, Nancy A; Cook, James R; Redfern, Mark S

    2009-01-01

    This paper describes the inter-rater and intra-rater reliability, and the concurrent validity, of an observational instrument, the Keyboard Personal Computer Style instrument (K-PeCS), which assesses stereotypical postures and movements associated with computer keyboard use. Three trained raters independently rated the video clips of 45 computer keyboard users to ascertain inter-rater reliability, and then re-rated a sub-sample of 15 video clips to ascertain intra-rater reliability. Concurrent validity was assessed by comparing the ratings obtained using the K-PeCS to scores developed from a 3D motion analysis system. The overall K-PeCS had excellent reliability [inter-rater: intra-class correlation coefficient (ICC) = .90; intra-rater: ICC = .92]. Most individual items on the K-PeCS had good to excellent reliability, although six items fell below ICC = .75. Those K-PeCS items that were assessed for concurrent validity compared favorably to the motion analysis data for all but two items. These results suggest that most items on the K-PeCS can be used to reliably document computer keyboarding style.

  19. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus here is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
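
    To make the contrast with safety factors concrete, a minimal failure-probability computation is sketched below for a single limit state g(X) = R - S (capacity minus load) with normally distributed variables; the distributions are hypothetical. Methods such as FORM/SORM approximate the same quantity without brute-force sampling.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu_r, sd_r = 200.0, 20.0     # hypothetical capacity (resistance)
mu_s, sd_s = 140.0, 25.0     # hypothetical load (stress)

# Monte Carlo estimate of P(failure) = P(g = R - S < 0)
n = 1_000_000
g = rng.normal(mu_r, sd_r, n) - rng.normal(mu_s, sd_s, n)
pf_mc = np.mean(g < 0)

# Exact value for this linear-normal case, via the reliability index beta
beta = (mu_r - mu_s) / np.hypot(sd_r, sd_s)
pf_exact = norm.cdf(-beta)
print(f"MC: {pf_mc:.5f}  exact: {pf_exact:.5f}  beta = {beta:.2f}")
```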

  20. Improving the reliability of nuclear reprocessing by application of computers and mathematical modelling

    International Nuclear Information System (INIS)

    Gabowitsch, E.; Trauboth, H.

    1982-01-01

    After a brief survey of the present and expected future state of nuclear energy utilization, which should demonstrate the significance of nuclear reprocessing, safety and reliability aspects of nuclear reprocessing plants (NRP) are considered. Then, the principal possibilities of modern computer technology, including computer systems architecture and application-oriented software, for improving reliability and availability are outlined. In this context, two information systems being developed at the Nuclear Research Center Karlsruhe (KfK) are briefly described. For the design evaluation of certain areas of a large NRP, mathematical methods and computer-aided tools developed, used or being designed by KfK are discussed. In conclusion, future research to be pursued in information processing and applied mathematics in support of the reliable operation of NRPs is proposed. (Auth.)

  1. Conceptual transitions in methods of skull-photo superimposition that impact the reliability of identification: a review.

    Science.gov (United States)

    Jayaprakash, Paul T

    2015-01-01

    Establishing identification during skull-photo superimposition relies on correlating the salient morphological features of an unidentified skull with those of a face-image of a suspected dead individual using image overlay processes. Technical progression in the process of overlay has included the incorporation of video cameras, image-mixing devices and software that enables real-time vision-mixing. Conceptual transitions occur in the superimposition methods that involve 'life-size' images, that achieve orientation of the skull to the posture of the face in the photograph and that assess the extent of match. A recent report on the reliability of identification using the superimposition method adopted the currently prevalent methods and suggested an increased rate of failures when skulls were compared with related and unrelated face images. The reported reduction in the reliability of the superimposition method prompted a review of the transition in the concepts that are involved in skull-photo superimposition. The prevalent popular methods for visualizing the superimposed images at less than 'life-size', overlaying skull-face images by relying on the cranial and facial landmarks in the frontal plane when orienting the skull for matching and evaluating the match on a morphological basis by relying on mix-mode alone are the major departures in the methodology that may have reduced the identification reliability. The need to reassess the reliability of the method that incorporates the concepts which have been considered appropriate by the practitioners is stressed. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. [Evaluation of mass spectrometry: MALDI-TOF MS for fast and reliable yeast identification].

    Science.gov (United States)

    Relloso, María S; Nievas, Jimena; Fares Taie, Santiago; Farquharson, Victoria; Mujica, María T; Romano, Vanesa; Zarate, Mariela S; Smayevsky, Jorgelina

    2015-01-01

    The matrix-assisted laser desorption/ionization time-of-flight mass spectrometry technique known as MALDI-TOF MS is a tool used for the identification of clinical pathogens by generating a protein spectrum that is unique for a given species. In this study we assessed the identification of clinical yeast isolates by MALDI-TOF MS at a university hospital in Argentina and compared two procedures for protein extraction: a rapid method and a procedure based on the manufacturer's recommendations. A short protein extraction procedure was applied to 100 isolates, and the rate of correct identification at the genus and species level was 98.0%. In addition, we analyzed 201 isolates, previously identified by conventional methods, using the methodology recommended by the manufacturer, and there was 95.38% agreement in the identification at the species level. MALDI-TOF MS proved to be a fast, simple and reliable tool for yeast identification. Copyright © 2014 Asociación Argentina de Microbiología. Publicado por Elsevier España, S.L.U. All rights reserved.

  3. High reliability - low noise radionuclide signature identification algorithms for border security applications

    Science.gov (United States)

    Lee, Sangkyu

    Illicit trafficking and smuggling of radioactive materials and special nuclear materials (SNM) are considered one of the most important recent global nuclear threats. Monitoring the transport and safety of radioisotopes and SNM is challenging due to their weak signals and easy shielding. Great efforts worldwide are focused on developing and improving detection technologies and algorithms for accurate and reliable detection of radioisotopes of interest, thus better securing the borders against nuclear threats. In general, radiation portal monitors enable the detection of gamma- and neutron-emitting radioisotopes. Passive and active interrogation techniques, present and/or under development, are all aimed at increasing accuracy and reliability, and at shortening the time of interrogation as well as reducing the cost of the equipment. Equally important efforts are aimed at advancing algorithms to process the imaging data in an efficient manner, providing reliable "readings" of the interiors of the examined volumes of various sizes, ranging from cargos to suitcases. The main objective of this thesis is to develop two synergistic algorithms with the goal of providing highly reliable, low-noise identification of radioisotope signatures. These algorithms combine analysis of a passive radioactive detection technique with active interrogation imaging techniques such as gamma radiography or muon tomography. One algorithm consists of gamma spectroscopy and cosmic muon tomography, and the other algorithm is based on gamma spectroscopy and gamma radiography. The purpose of fusing two detection methodologies per algorithm is to find both heavy-Z radioisotopes and shielding materials, since radionuclides can be identified with gamma spectroscopy, and shielding materials can be detected using muon tomography or gamma radiography. These combined algorithms are created and analyzed based on numerically generated images of various cargo sizes and materials. In summary, the three detection

  4. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Science.gov (United States)

    2010-10-01

    ... operation of the software to display a restrictive rights legend or other license notice; and (2) Requires a... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and...

  5. Identification of Classified Information in Unclassified DoD Systems During the Audit of Internal Controls and Data Reliability in the Deployable Disbursing System

    Science.gov (United States)

    2009-02-17

    Identification of Classified Information in Unclassified DoD Systems During the Audit of Internal Controls and Data Reliability in the Deployable Disbursing System (Report No. D-2009-054).

  6. The reliable solution and computation time of variable parameters logistic model

    Science.gov (United States)

    Wang, Pengfei; Pan, Xinnong

    2018-05-01

    The study investigates the reliable computation time (RCT, termed Tc) by applying a double-precision computation of a variable-parameters logistic map (VPLM). Firstly, by using the proposed method, we obtain the reliable solutions for the logistic map. Secondly, we construct 10,000 samples of reliable experiments from a time-dependent, non-stationary-parameters VPLM and then calculate the mean Tc. The results indicate that, for each different initial value, the Tc values of the VPLM are generally different. However, the mean Tc tends to a constant value when the sample number is large enough. The maximum, minimum, and probability distribution functions of Tc are also obtained, which can help us to identify the robustness of applying nonlinear time series theory to forecasting with the VPLM output. In addition, the Tc of the fixed-parameter experiments of the logistic map is obtained, and the results suggest that this Tc matches the value predicted by the theoretical formula.
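
    A rough sketch of the underlying idea: iterate the same variable-parameter logistic map at two floating-point precisions and record the first step at which the trajectories disagree beyond a tolerance. The parameter schedule, initial values and tolerance below are invented, and the paper obtains its reliable reference solutions differently, so this is only an illustration of why Tc is finite and depends on the initial value.

```python
import numpy as np

def reliable_computation_time(x0: float, steps: int = 10_000,
                              tol: float = 1e-4) -> int:
    """First iteration where float32 and float64 trajectories of a
    variable-parameter logistic map x_{n+1} = r_n x_n (1 - x_n) diverge."""
    x32 = np.float32(x0)
    x64 = np.float64(x0)
    for n in range(steps):
        r = 3.99 - 0.005 * np.sin(2 * np.pi * n / 500)  # hypothetical r_n
        x32 = np.float32(r) * x32 * (np.float32(1) - x32)
        x64 = r * x64 * (1.0 - x64)
        if abs(float(x32) - float(x64)) > tol:
            return n
    return steps

tcs = [reliable_computation_time(x0) for x0 in np.linspace(0.1, 0.9, 9)]
print("Tc per initial value:", tcs, " mean:", np.mean(tcs))
```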

  7. Algorithmic mechanisms for reliable crowdsourcing computation under collusion.

    Science.gov (United States)

    Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A; Pareja, Daniel

    2015-01-01

    We consider a computing system where a master processor assigns a task for execution to worker processors that may collude. We model the workers' decision of whether to comply (compute the task) or not (return a bogus result to save the computation cost) as a game among workers. That is, we assume that workers are rational in a game-theoretic sense. We identify analytically the parameter conditions for a unique Nash Equilibrium where the master obtains the correct result. We also evaluate experimentally mixed equilibria aiming to attain better reliability-profit trade-offs. For a wide range of parameter values that may be used in practice, our simulations show that, in fact, both master and workers are better off using a pure equilibrium where no worker cheats, even under collusion, and even for colluding behaviors that involve deviating from the game.
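
    A toy payoff comparison conveys the flavor of the equilibrium condition: with an invented computing cost, reward, penalty and verification probability, a rational worker complies exactly when the expected utility of computing exceeds that of returning a bogus result. The single-worker view below ignores the collusion dynamics analyzed in the paper.

```python
def expected_utility(comply: bool, reward: float, cost: float,
                     penalty: float, audit_prob: float) -> float:
    """Single worker facing an auditing master (hypothetical payoffs)."""
    if comply:
        return reward - cost                      # always paid, pays cost
    # Cheating: caught with probability audit_prob and fined, else paid
    return audit_prob * (-penalty) + (1 - audit_prob) * reward

reward, cost, penalty = 10.0, 3.0, 50.0
for q in (0.01, 0.05, 0.20):
    u_c = expected_utility(True, reward, cost, penalty, q)
    u_d = expected_utility(False, reward, cost, penalty, q)
    print(f"audit_prob={q:.2f}: comply={u_c:.2f} cheat={u_d:.2f} "
          f"-> worker {'complies' if u_c >= u_d else 'cheats'}")
```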

  8. Computer-assisted photo identification outperforms visible implant elastomers in an endangered salamander, Eurycea tonkawae.

    Directory of Open Access Journals (Sweden)

    Nathan F Bendik

    Full Text Available Despite recognition that nearly one-third of the 6300 amphibian species are threatened with extinction, our understanding of the general ecology and population status of many amphibians is relatively poor. A widely-used method for monitoring amphibians involves injecting captured individuals with unique combinations of colored visible implant elastomer (VIE). We compared VIE identification to a less-invasive method - computer-assisted photographic identification (photoID) - in endangered Jollyville Plateau salamanders (Eurycea tonkawae), a species with a known range limited to eight stream drainages in central Texas. We based photoID on the unique pigmentation patterns on the dorsal head region of 1215 individual salamanders using the identification software Wild-ID. We compared the performance of photoID methods to VIEs using both 'high-quality' and 'low-quality' images, which were taken using two different camera types and technologies. For high-quality images, the photoID method had a false rejection rate of 0.76% compared to 1.90% for VIEs. Using a comparable dataset of lower-quality images, the false rejection rate was much higher (15.9%). Photo matching scores were negatively correlated with time between captures, suggesting that evolving natural marks could increase misidentification rates in longer-term capture-recapture studies. Our study demonstrates the utility of large-scale capture-recapture using photo identification methods for Eurycea and other species with stable natural marks that can be reliably photographed.

  9. A new efficient algorithm for computing the imprecise reliability of monotone systems

    International Nuclear Information System (INIS)

    Utkin, Lev V.

    2004-01-01

    Reliability analysis of complex systems under partial information about the reliability of components, and under different conditions of independence of components, may be carried out by means of the imprecise probability theory, which provides a unified framework (natural extension, lower and upper previsions) for computing the system reliability. However, the application of imprecise probabilities to reliability analysis runs into the complexity of the optimization problems which have to be solved to obtain the system reliability measures. Therefore, an efficient simplified algorithm to solve and decompose the optimization problems is proposed in the paper. This algorithm allows us to practically implement reliability analysis of monotone systems under partial and heterogeneous information about the reliability of components, and under conditions of component independence or a lack of information about independence. A numerical example illustrates the algorithm
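
    The monotonicity the algorithm exploits can be shown in a few lines: for a monotone (coherent) structure, system reliability is nondecreasing in each component reliability, so interval bounds on the components map directly to interval bounds on the system by evaluating the structure function at the lower and upper endpoints. The series-parallel example below illustrates this property only; it is not the paper's general algorithm.

```python
Interval = tuple[float, float]  # (lower, upper) bounds on a reliability

def series(components: list[Interval]) -> Interval:
    lo, hi = 1.0, 1.0
    for l, u in components:
        lo *= l
        hi *= u
    return lo, hi

def parallel(components: list[Interval]) -> Interval:
    lo, hi = 1.0, 1.0
    for l, u in components:
        lo *= (1 - l)    # worst case: every component at its lower bound
        hi *= (1 - u)    # best case: every component at its upper bound
    return 1 - lo, 1 - hi

# Hypothetical partial knowledge about three components
a, b, c = (0.90, 0.99), (0.80, 0.95), (0.70, 0.90)
print(series([parallel([a, b]), c]))   # (a OR b) AND c
```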

  10. Ensemble of different approaches for a reliable person re-identification system

    Directory of Open Access Journals (Sweden)

    Loris Nanni

    2016-07-01

    Full Text Available An ensemble of approaches for reliable person re-identification is proposed in this paper. The proposed ensemble is built by combining widely used person re-identification systems using different color spaces and some variants of state-of-the-art approaches that are proposed in this paper. Different descriptors are tested, and both texture and color features are extracted from the images; then the different descriptors are compared using different distance measures (e.g., the Euclidean distance, angle, and the Jeffrey distance). To improve performance, a method based on skeleton detection, extracted from the depth map, is also applied when the depth map is available. The proposed ensemble is validated on three widely used datasets (CAVIAR4REID, IAS, and VIPeR), keeping the same parameter set of each approach constant across all tests to avoid overfitting and to demonstrate that the proposed system can be considered a general-purpose person re-identification system. Our experimental results show that the proposed system offers significant improvements over baseline approaches. The source code used for the approaches tested in this paper will be available at https://www.dei.unipd.it/node/2357 and http://robotics.dei.unipd.it/reid/.

  11. Reliable multicast for the Grid: a case study in experimental computer science.

    Science.gov (United States)

    Nekovee, Maziar; Barcellos, Marinho P; Daw, Michael

    2005-08-15

    In its simplest form, multicast communication is the process of sending data packets from a source to multiple destinations in the same logical multicast group. IP multicast allows the efficient transport of data through wide-area networks, and its potentially great value for the Grid has been highlighted recently by a number of research groups. In this paper, we focus on the use of IP multicast in Grid applications, which require high-throughput reliable multicast. These include Grid-enabled computational steering and collaborative visualization applications, and wide-area distributed computing. We describe the results of our extensive evaluation studies of state-of-the-art reliable-multicast protocols, which were performed on the UK's high-speed academic networks. Based on these studies, we examine the ability of current reliable multicast technology to meet the Grid's requirements and discuss future directions.

  12. A New Method of Reliability Evaluation Based on Wavelet Information Entropy for Equipment Condition Identification

    International Nuclear Information System (INIS)

    He, Z J; Zhang, X L; Chen, X F

    2012-01-01

    Aiming at the reliability evaluation of condition identification of mechanical equipment, it is necessary to analyze condition monitoring information. A new method of reliability evaluation based on wavelet information entropy extracted from vibration signals of mechanical equipment is proposed. The method is quite different from traditional reliability evaluation models that depend on probability statistics analysis of large amounts of sample data. The vibration signals of mechanical equipment were analyzed by means of the second-generation wavelet packet (SGWP). We take the relative energy in each frequency band of the decomposed signal, i.e., its percentage of the whole signal energy, as a probability. A normalized information entropy (IE) is obtained based on the relative energy to describe the uncertainty of a system instead of a probability. The reliability degree is then derived from the normalized wavelet information entropy. A successful application has been achieved in evaluating the assembled quality reliability for a kind of dismountable disk-drum aero-engine. The reliability degree indicates the assembled quality satisfactorily.
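
    A minimal sketch of the entropy computation described, using the PyWavelets library: decompose a vibration signal with a wavelet packet transform, take each terminal band's share of the total energy as a probability, and normalize the Shannon entropy by its maximum. The wavelet, depth, test signal and the mapping from entropy to a reliability degree (here simply 1 - H) are all assumptions.

```python
import numpy as np
import pywt

def normalized_wavelet_packet_entropy(signal: np.ndarray,
                                      wavelet: str = "db4",
                                      level: int = 4) -> float:
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                            mode="symmetric", maxlevel=level)
    bands = wp.get_level(level, order="natural")
    energies = np.array([np.sum(node.data ** 2) for node in bands])
    p = energies / energies.sum()              # relative energy as probability
    p = p[p > 0]
    entropy = -np.sum(p * np.log(p))
    return float(entropy / np.log(len(bands)))  # normalize to [0, 1]

# Hypothetical vibration signal: a tone plus broadband noise
t = np.linspace(0, 1, 4096)
x = np.sin(2 * np.pi * 120 * t) \
    + 0.3 * np.random.default_rng(1).normal(size=t.size)

H = normalized_wavelet_packet_entropy(x)
print(f"normalized entropy: {H:.3f}  reliability degree (assumed 1-H): {1-H:.3f}")
```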

  13. Reliability of the fuel identification procedure used by COGEMA during cask loading for shipment to LA HAGUE

    International Nuclear Information System (INIS)

    Pretesacque, P.; Eid, M.; Zachar, M.

    1993-01-01

    This study was carried out to demonstrate the reliability of the spent fuel identification system used by COGEMA and NTL prior to shipment to the reprocessing plant at La Hague. This was a prerequisite for the French competent authority to accept the 'burnup credit' assumption in the criticality assessment of spent fuel packages. The probability of loading a non-irradiated and non-specified fuel assembly was considered acceptable if our identification and irradiation status measurement procedures were used. Furthermore, the task analysis enabled us to improve the working conditions at reactor sites and the quality of the working documentation, and consequently to improve the reliability of the system. The NTL experience of transporting to La Hague, as consignor, more than 10,000 fuel assemblies since the implementation of our system in 1984, without any non-conformance in fuel identification, validated the formalism of this study as well as our assumptions on basic event probabilities. (J.P.N.)

  14. Soft computing approach for reliability optimization: State-of-the-art survey

    International Nuclear Information System (INIS)

    Gen, Mitsuo; Yun, Young Su

    2006-01-01

    In the broadest sense, reliability is a measure of the performance of systems. As systems have grown more complex, the consequences of their unreliable behavior have become severe in terms of cost, effort, lives, etc., and the interest in assessing system reliability and the need for improving the reliability of products and systems have become very important. Most solution methods for reliability optimization assume that systems have redundant components in series and/or parallel configurations and that alternative designs are available. Reliability optimization problems concentrate on the optimal allocation of redundant components and the optimal selection of alternative designs to meet system requirements. In the past two decades, numerous reliability optimization techniques have been proposed. Generally, these techniques can be classified as linear programming, dynamic programming, integer programming, geometric programming, heuristic methods, Lagrangean multiplier methods and so on. A Genetic Algorithm (GA), as a soft computing approach, is a powerful tool for solving various reliability optimization problems. In this paper, we briefly survey GA-based approaches to various reliability optimization problems, such as reliability optimization of redundant systems, reliability optimization with alternative designs, reliability optimization with time-dependent reliability, reliability optimization with interval coefficients, bicriteria reliability optimization, and reliability optimization with fuzzy goals. We also introduce hybrid approaches for combining GAs with fuzzy logic, neural networks and other conventional search techniques. Finally, we present some experiments with examples of various reliability optimization problems using the hybrid GA approach

  15. Identification of natural images and computer-generated graphics based on statistical and textural features.

    Science.gov (United States)

    Peng, Fei; Li, Jiao-ting; Long, Min

    2015-03-01

    To discriminate the acquisition pipelines of digital images, a novel scheme for the identification of natural images and computer-generated graphics is proposed based on statistical and textural features. First, the differences between them are investigated from the viewpoints of statistics and texture, and a 31-dimensional feature vector is acquired for identification. Then, LIBSVM is used for the classification. Finally, the experimental results are presented. The results show that the scheme can achieve an identification accuracy of 97.89% for computer-generated graphics and 97.75% for natural images. The analyses also demonstrate that the proposed method has excellent performance compared with some existing methods based only on statistical features or other features. The method has great potential to be implemented for the identification of natural images and computer-generated graphics. © 2014 American Academy of Forensic Sciences.
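
    A skeletal version of such a classification pipeline is sketched below with scikit-learn's libsvm-backed SVC. The two toy features (standard deviation and mean absolute gradient) merely stand in for the paper's 31 statistical and textural dimensions, and the training images are synthetic.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def extract_features(img: np.ndarray) -> np.ndarray:
    """Toy stand-ins for the paper's 31 statistical/textural features."""
    gy, gx = np.gradient(img.astype(float))
    return np.array([img.std(), np.abs(gx).mean() + np.abs(gy).mean()])

rng = np.random.default_rng(0)
# Synthetic 'natural' images (noisy) vs 'computer-generated' (smooth ramps)
naturals = [rng.normal(128, 40, (64, 64)) for _ in range(50)]
cg = [np.outer(np.linspace(0, 255, 64), np.ones(64))
      + rng.normal(0, 2, (64, 64)) for _ in range(50)]

X = np.array([extract_features(i) for i in naturals + cg])
y = np.array([0] * 50 + [1] * 50)         # 0 = natural, 1 = CG

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```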

  16. Radiologic identification of disaster victims: A simple and reliable method using CT of the paranasal sinuses

    International Nuclear Information System (INIS)

    Ruder, Thomas D.; Kraehenbuehl, Markus; Gotsmy, Walther F.; Mathier, Sandra; Ebert, Lars C.; Thali, Michael J.; Hatch, Gary M.

    2012-01-01

    Objective: To assess the reliability of radiologic identification using visual comparison of ante mortem and post mortem paranasal sinus computed tomography (CT). Subjects and methods: The study was approved by the responsible justice department and university ethics committee. Four blinded readers with varying radiological experience separately compared 100 post mortem to 25 ante mortem head CTs with the goal of identifying as many matching pairs as possible (out of 23 possible matches). Sensitivity, specificity, positive and negative predictive values were calculated for all readers. The chi-square test was applied to establish whether there was a significant difference in sensitivity between radiologists and non-radiologists. Results: For all readers, sensitivity was 83.7%, specificity was 100.0%, negative predictive value (NPV) was 95.4%, positive predictive value (PPV) was 100.0%, and accuracy was 96.3%. For radiologists, sensitivity was 97.8%, NPV was 99.4%, and accuracy was 99.5%. For non-radiologists, average sensitivity was 69.6%, negative predictive value (NPV) was 91.7%, and accuracy was 93.0%. Radiologists achieved a significantly higher sensitivity (p < 0.01) than non-radiologists. Conclusions: Visual comparison of ante mortem and post mortem CT of the head is a robust and reliable method for identifying unknown decedents, particularly in regard to positive matches. The sensitivity and NPV of the method depend on the reader's experience.

  17. Identification of Nasal Bone Fractures on Conventional Radiography and Facial CT: Comparison of the Diagnostic Accuracy in Different Imaging Modalities and Analysis of Interobserver Reliability

    International Nuclear Information System (INIS)

    Baek, Hye Jin; Kim, Dong Wook; Ryu, Ji Hwa; Lee, Yoo Jin

    2013-01-01

    There has been no study comparing the diagnostic accuracy of an experienced radiologist with that of a trainee in nasal bone fracture. To compare the diagnostic accuracy of conventional radiography and computed tomography (CT) for the identification of nasal bone fractures, and to evaluate the interobserver reliability between a staff radiologist and a trainee. A total of 108 patients who underwent conventional radiography and CT after acute nasal trauma were included in this retrospective study. Two readers, a staff radiologist and a second-year resident, independently assessed the results of the imaging studies. Of the 108 patients, the presence of a nasal bone fracture was confirmed in 88 (81.5%). The number of non-depressed fractures was higher than the number of depressed fractures. In nine (10.2%) patients, nasal bone fractures were only identified on conventional radiography, including three depressed and six non-depressed fractures. CT was more accurate than conventional radiography for the identification of nasal bone fractures as determined by both readers (P < 0.05); all diagnostic indices of the experienced radiologist were similar to or higher than those of the trainee, and κ statistics showed moderate agreement between the two diagnostic tools for both readers. There was no statistical difference in interobserver reliability for either imaging modality in the identification of nasal bone fractures. For the identification of nasal bone fractures, CT was significantly superior to conventional radiography. Although the staff radiologist showed better values in the identification of nasal bone fractures and the differentiation between depressed and non-depressed fractures than the trainee, there was no statistically significant difference in the interpretation of conventional radiography and CT between a radiologist and a trainee

  18. Reliability of lower limb alignment measures using an established landmark-based method with a customized computer software program

    Science.gov (United States)

    Sled, Elizabeth A.; Sheehy, Lisa M.; Felson, David T.; Costigan, Patrick A.; Lam, Miu; Cooke, T. Derek V.

    2010-01-01

    The objective of the study was to evaluate the reliability of frontal plane lower limb alignment measures using a landmark-based method by (1) comparing inter- and intra-reader reliability between measurements of alignment obtained manually with those using a computer program, and (2) determining inter- and intra-reader reliability of computer-assisted alignment measures from full-limb radiographs. An established method for measuring alignment was used, involving selection of 10 femoral and tibial bone landmarks. 1) To compare manual and computer methods, we used digital images and matching paper copies of five alignment patterns simulating healthy and malaligned limbs drawn using AutoCAD. Seven readers were trained in each system. Paper copies were measured manually and repeat measurements were performed daily for 3 days, followed by a similar routine with the digital images using the computer. 2) To examine the reliability of computer-assisted measures from full-limb radiographs, 100 images (200 limbs) were selected as a random sample from 1,500 full-limb digital radiographs which were part of the Multicenter Osteoarthritis (MOST) Study. Three trained readers used the software program to measure alignment twice from the batch of 100 images, with two or more weeks between batch handling. Manual and computer measures of alignment showed excellent agreement (intraclass correlations [ICCs] 0.977 – 0.999 for computer analysis; 0.820 – 0.995 for manual measures). The computer program applied to full-limb radiographs produced alignment measurements with high inter- and intra-reader reliability (ICCs 0.839 – 0.998). In conclusion, alignment measures using a bone landmark-based approach and a computer program were highly reliable between multiple readers. PMID:19882339

  19. Test-retest reliability of computer-based video analysis of general movements in healthy term-born infants.

    Science.gov (United States)

    Valle, Susanne Collier; Støen, Ragnhild; Sæther, Rannei; Jensenius, Alexander Refsum; Adde, Lars

    2015-10-01

    A computer-based video analysis has recently been presented for quantitative assessment of general movements (GMs). This method's test-retest reliability, however, has not yet been evaluated. The aim of the current study was to evaluate the test-retest reliability of computer-based video analysis of GMs, and to explore the association between computer-based video analysis and the temporal organization of fidgety movements (FMs). Test-retest reliability study. 75 healthy, term-born infants were recorded twice the same day during the FMs period using a standardized video set-up. The computer-based movement variables "quantity of motion mean" (Qmean), "quantity of motion standard deviation" (QSD) and "centroid of motion standard deviation" (CSD) were analyzed, reflecting the amount of motion and the variability of the spatial center of motion of the infant, respectively. In addition, the association between the variable CSD and the temporal organization of FMs was explored. Intraclass correlation coefficients (ICC 1.1 and ICC 3.1) were calculated to assess test-retest reliability. The ICC values for the variables CSD, Qmean and QSD were 0.80, 0.80 and 0.86 for ICC (1.1), respectively; and 0.80, 0.86 and 0.90 for ICC (3.1), respectively. There were significantly lower CSD values in the recordings with continual FMs compared to the recordings with intermittent FMs (p < 0.05). The study showed high test-retest reliability of computer-based video analysis of GMs, and a significant association between our computer-based video analysis and the temporal organization of FMs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
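
    The motion variables named above can be sketched from first principles with frame differencing: the quantity of motion is the fraction of pixels that changed between consecutive frames, and the centroid of motion is the mean position of those changed pixels; Qmean, QSD and CSD are then summary statistics over the recording. The threshold and synthetic frames below are assumptions, not the published method's actual parameters.

```python
import numpy as np

def motion_variables(frames: np.ndarray, thresh: float = 15.0):
    """frames: (T, H, W) grayscale video. Returns (Qmean, QSD, CSD)."""
    q, centroids = [], []
    for prev, curr in zip(frames[:-1], frames[1:]):
        moved = np.abs(curr.astype(float) - prev.astype(float)) > thresh
        q.append(moved.mean())                       # quantity of motion
        ys, xs = np.nonzero(moved)
        if len(xs):
            centroids.append((xs.mean(), ys.mean())) # centroid of motion
    c = np.array(centroids)
    csd = float(np.mean(c.std(axis=0))) if len(c) else 0.0
    return float(np.mean(q)), float(np.std(q)), csd

# Synthetic clip: a bright square drifting across a noisy background
rng = np.random.default_rng(2)
frames = rng.normal(20, 3, (30, 120, 160))
for t in range(30):
    frames[t, 40:60, 10 + 4 * t:30 + 4 * t] += 100

print(motion_variables(frames))
```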

  20. Reliability analysis framework for computer-assisted medical decision systems

    International Nuclear Information System (INIS)

    Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.

    2007-01-01

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
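
    The neighborhood-based reliability estimate described can be sketched in a few lines: for each query, retrieve its k nearest known cases in feature space and report the classifier's accuracy on just those neighbors as the query-specific reliability. The classifier, synthetic features and choice of k below are placeholders.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X_known = rng.normal(size=(500, 8))            # 8 morphological features
y_known = (X_known[:, 0] + X_known[:, 1] > 0).astype(int)  # synthetic truth

# A back-propagation neural network stands in for the CAD system
cad = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
cad.fit(X_known, y_known)

nn = NearestNeighbors(n_neighbors=25).fit(X_known)

def decision_reliability(query: np.ndarray) -> float:
    """Local accuracy of the CAD model on known cases similar to the query."""
    _, idx = nn.kneighbors(query.reshape(1, -1))
    neighbors = idx[0]
    return float(np.mean(cad.predict(X_known[neighbors]) == y_known[neighbors]))

query = rng.normal(size=8)
print("CAD output:", cad.predict(query.reshape(1, -1))[0],
      " reliability:", round(decision_reliability(query), 3))
```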

  1. Reliability issues related to the usage of Cloud Computing in Critical Infrastructures

    OpenAIRE

    Diez Gonzalez, Oscar Manuel; Silva Vazquez, Andrés

    2011-01-01

    The use of cloud computing is extending to all kinds of systems, including those that are part of Critical Infrastructures, and measuring their reliability is becoming more difficult. Computing is becoming the 5th utility, in part thanks to the use of cloud services. Cloud computing is used now by all types of systems and organizations, including critical infrastructure, creating hidden inter-dependencies on both public and private cloud models. This paper investigates the use of cloud co...

  2. Polynomial-time computability of the edge-reliability of graphs using Gilbert's formula

    Directory of Open Access Journals (Sweden)

    Thomas J. Marlowe

    1998-01-01

    Full Text Available Reliability is an important consideration in analyzing computer and other communication networks, but current techniques are extremely limited in the classes of graphs which can be analyzed efficiently. While Gilbert's formula establishes a theoretically elegant recursive relationship between the edge reliability of a graph and the reliability of its subgraphs, naive evaluation requires consideration of all sequences of deletions of individual vertices, and for many graphs has time complexity essentially Θ(N!). We discuss a general approach which significantly reduces complexity, encoding subgraph isomorphism in a finer partition by invariants, and recursing through the set of invariants.

  3. Diagnostic reliability of MMPI-2 computer-based test interpretations.

    Science.gov (United States)

    Pant, Hina; McCabe, Brian J; Deskovitz, Mark A; Weed, Nathan C; Williams, John E

    2014-09-01

    Reflecting the common use of the MMPI-2 to provide diagnostic considerations, computer-based test interpretations (CBTIs) also typically offer diagnostic suggestions. However, these diagnostic suggestions can sometimes be shown to vary widely across different CBTI programs even for identical MMPI-2 profiles. The present study evaluated the diagnostic reliability of 6 commercially available CBTIs using a 20-item Q-sort task developed for this study. Four raters each sorted diagnostic classifications based on these 6 CBTI reports for 20 MMPI-2 profiles. Two questions were addressed. First, do users of CBTIs understand the diagnostic information contained within the reports similarly? Overall, diagnostic sorts of the CBTIs showed moderate inter-interpreter diagnostic reliability (mean r = .56), with sorts for the 1/2/3 profile showing the highest inter-interpreter diagnostic reliability (mean r = .67). Second, do different CBTIs programs vary with respect to diagnostic suggestions? It was found that diagnostic sorts of the CBTIs had a mean inter-CBTI diagnostic reliability of r = .56, indicating moderate but not strong agreement across CBTIs in terms of diagnostic suggestions. The strongest inter-CBTI diagnostic agreement was found for sorts of the 1/2/3 profile CBTIs (mean r = .71). Limitations and future directions are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  4. Adult Sex Identification Using Three-Dimensional Computed Tomography (3D-CT) of the Pelvis: A Study Among a Sample of the Egyptian Population

    Directory of Open Access Journals (Sweden)

    Enas M. A. Mostafa

    2016-06-01

    Full Text Available Sex identification of unknown human skeletal remains is of great importance in establishing identity and individuality. In adults, the hip bone is the most reliable sex indicator because of its sexual dimorphism. Each population should have its own specific standards of identification. The objective of this study is to develop a logistic regression formula for adult sex identification using three-dimensional computed tomography (3D-CT) of the pelvis and to assess its validity in sex determination among a sample of the Egyptian population in the Suez Canal region. 141 pelvic-abdominal CT images (free of any pelvic orthopaedic disorder) were included; they were reconstructed to produce 3D-CT pelvic images, which were divided into a calibration group (47 male and 47 female) and a test group (47 CT images whose sex was unknown to the observers). Twenty radiometric variables were measured for the calibration group. A logit response formula for sex prediction was developed and applied to the test group for sex prediction. The logit response formula for the test sample showed sensitivity, specificity, and an overall accuracy of 100%. The proposed method represents a quick and reliable metric method of establishing sex from a CT image of the pelvic bone.
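
    The 'logit response formula' is an ordinary logistic regression over the radiometric variables; a skeletal version with scikit-learn is shown below. The variable names and synthetic measurements are invented, as the study's formula uses its own selected pelvic variables and fitted coefficients.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Synthetic calibration group: 47 male + 47 female, three hypothetical
# radiometric variables (stand-ins for the study's pelvic measurements)
males = rng.normal([70, 28, 95], [5, 3, 6], size=(47, 3))
females = rng.normal([90, 36, 90], [5, 3, 6], size=(47, 3))
X = np.vstack([males, females])
y = np.array([0] * 47 + [1] * 47)       # 0 = male, 1 = female

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_[0], "intercept:", model.intercept_[0])

# Sex prediction for an unknown test image's measurements
unknown = np.array([[85, 34, 91]])
print("P(female):", model.predict_proba(unknown)[0, 1])
```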

  5. Optimal reliability allocation for large software projects through soft computing techniques

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albeanu, Grigore; Popentiu-Vladicescu, Florin

    2012-01-01

    or maximizing the system reliability subject to budget constraints. These kinds of optimization problems have been considered in both deterministic and stochastic frameworks in the literature. Recently, the intuitionistic-fuzzy optimization approach was considered a successful soft computing modelling approach.... Firstly, a review of existing soft computing approaches to optimization is given. The main section extends the results, considering self-organizing migrating algorithms for solving intuitionistic-fuzzy optimization problems attached to complex fault-tolerant software architectures, which proved

  6. Reliable identification at the species level of Brucella isolates with MALDI-TOF-MS

    Directory of Open Access Journals (Sweden)

    Lista Florigio

    2011-12-01

    Full Text Available Abstract. Background: The genus Brucella contains highly infectious species that are classified as biological threat agents. The timely detection and identification of the microorganism involved is essential for an effective response, not only to biological warfare attacks but also to natural outbreaks. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a rapid method for the analysis of biological samples. The advantages of this method, compared to conventional techniques, are rapidity, cost-effectiveness, accuracy and suitability for the high-throughput identification of bacteria. Discrepancies between taxonomy and genetic relatedness at the species and biovar level complicate the development of detection and identification assays. Results: In this study, the accurate identification of Brucella species using MALDI-TOF-MS was achieved by constructing a Brucella reference library based on multilocus variable-number tandem repeat analysis (MLVA) data. By comparing MS spectra from Brucella species against a custom-made MALDI-TOF-MS reference library, MALDI-TOF-MS could be used as a rapid identification method for Brucella species. In this way, 99.3% of the 152 isolates tested were identified at the species level, and B. suis biovar 1 and 2 were identified at the level of their biovar. This result demonstrates that, for Brucella, even minimal genomic differences between these serovars translate to specific proteomic differences. Conclusions: MALDI-TOF-MS can be developed into a fast and reliable identification method for genetically highly related species when potential taxonomic and genetic inconsistencies are taken into consideration during the generation of the reference library.
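
    Reference-library matching of the kind described reduces, at its simplest, to comparing a query mass spectrum against stored species spectra with a similarity score. The sketch below bins spectra onto a common m/z grid and ranks library entries by cosine similarity; the binning, score and toy library are assumptions, not the actual Brucella library or a commercial scoring scheme.

```python
import numpy as np

def bin_spectrum(mz: np.ndarray, intensity: np.ndarray,
                 grid: np.ndarray) -> np.ndarray:
    """Sum intensities into fixed m/z bins and L2-normalize."""
    binned, _ = np.histogram(mz, bins=grid, weights=intensity)
    norm = np.linalg.norm(binned)
    return binned / norm if norm else binned

def identify(query, library, grid):
    q = bin_spectrum(*query, grid)
    scores = {species: float(q @ bin_spectrum(mz, it, grid))
              for species, (mz, it) in library.items()}
    return max(scores.items(), key=lambda kv: kv[1])

grid = np.arange(2000, 20000, 10.0)        # m/z bins (Da), hypothetical
rng = np.random.default_rng(0)
ref_mz = rng.uniform(2000, 20000, size=(3, 80))
library = {f"Brucella_sp_{i}": (ref_mz[i], rng.uniform(0.1, 1, 80))
           for i in range(3)}
# Query: species 1's peaks with perturbed intensities
query = (ref_mz[1], library["Brucella_sp_1"][1] + rng.normal(0, 0.05, 80))
print(identify(query, library, grid))
```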

  7. The reliability of flexible nasolaryngoscopy in the identification of vocal fold movement impairment in young infants.

    Science.gov (United States)

    Liu, Yi-Chun Carol; McElwee, Tyler; Musso, Mary; Rosenberg, Tara L; Ongkasuwan, Julina

    2017-09-01

    Flexible nasolaryngoscopy (FNL) is considered the gold standard for evaluation of vocal fold mobility but there has been no data on the reliability of interpretation in the infant population. Visualization may be limited by excessive movement, secretions, or floppy supraglottic structures that prevent accurate diagnosis of vocal fold movement impairment (VFMI). We sought to evaluate the inter- and intra-rater reliability of FNL for the evaluation of VFMI in young infants. Case-control. Twenty infants were identified: 10 with VFMI and 10 normal as seen on FNL. Three pediatric otolaryngologists reviewed the video without sound and rated the presence and/or degree of vocal fold mobility. Twelve videos were repeated to assess intra-rater reliability. There was substantial agreement between the reviewers regarding the identification of normal vs. any type of VFMI (kappa = 0.67) but only moderate agreement regarding the degree of vocal fold movement (kappa = 0.49). Intra-rater reliability ranged from moderate to perfect agreement (kappa = 0.48-1). FNL in infants is an extremely challenging procedure. Clinically, physicians frequently use the quality of the cry and the past medical and surgical history to help make a judgment of vocal fold movement when the view is suboptimal. These other factors, however, may bias the interpretation of the FNL. Without sound, there is only moderate inter-rater and variable intra-rater reliability for the identification of degree of movement on FNL. Otolaryngologists must be cognizant of the limitations of FNL when using it as a clinical tool or as a "gold standard" against which other modalities are measured. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Description of the TREBIL, CRESSEX and STREUSL computer programs, that belongs to RALLY computer code pack for the analysis of reliability systems

    International Nuclear Information System (INIS)

    Fernandes Filho, T.L.

    1982-11-01

    The RALLY computer code pack (RALLY pack) is a set of computer codes intended for the reliability analysis of complex systems, aiming at risk analysis. Three of the six codes are described, presenting their purpose, input description, calculation methods and the results obtained with each of them. The computer codes are: TREBIL, to obtain the fault tree logical equivalent; CRESSEX, to obtain the minimal cut sets and the point values of the non-reliability and non-availability of the system; and STREUSL, for the calculation of the dispersion of those values around the mean. Although CRESSEX, in its version available at CNEN, uses a somewhat lengthy method to obtain the minimal cut sets in an HB-CNEN system, the three computer programs show good results, mainly STREUSL, which permits the simulation of various components. (E.G.) [pt

  9. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1980-01-01

    A fault tree analysis package is described that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage, and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The computations are standard: identification of minimal cut-sets, estimation of reliability parameters, and ranking of the effect of the individual component failure modes and system failure modes on these parameters. The user can vary the fault trees and data on-line, and print selected data for preferred systems in a form suitable for inclusion in safety reports. A case history is given - that of the HIFAR containment isolation system. (author)
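
    As an illustration of the standard computations named above, the sketch below evaluates system unavailability from known minimal cut sets of independent components, both exactly (inclusion-exclusion) and with the common rare-event approximation; the component names and failure probabilities are invented:

      from itertools import combinations

      q = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4, "power": 2e-4}
      cut_sets = [{"pump_A", "pump_B"}, {"valve"}, {"power"}]

      def cut_prob(components):
          p = 1.0
          for c in components:
              p *= q[c]
          return p

      # Exact P(union of cut-set events) by inclusion-exclusion.
      exact = 0.0
      for k in range(1, len(cut_sets) + 1):
          for combo in combinations(cut_sets, k):
              exact += (-1) ** (k + 1) * cut_prob(set().union(*combo))

      # Rare-event approximation: sum of the individual cut-set probabilities.
      rare = sum(cut_prob(cs) for cs in cut_sets)
      print(f"unavailability: exact={exact:.3e}, rare-event={rare:.3e}")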

  10. Sigma: computer vision in the service of safety and reliability in the inspection services; Sigma: la vision computacional al servicio de la seguridad y fiabilidad en los servicios de inspeccion

    Energy Technology Data Exchange (ETDEWEB)

    Pineiro, P. J.; Mendez, M.; Garcia, A.; Cabrera, E.; Regidor, J. J.

    2012-11-01

    Computer vision has been growing very fast in the last decade, with very efficient tools and algorithms. This allows the development of new applications in the nuclear field, providing more efficient equipment and tasks: redundant systems, vision-guided mobile robots, automated visual defect recognition, measurement, etc. In this paper Tecnatom describes a detailed example of a computer vision application developed to provide secure, redundant identification of the thousands of tubes existing in a power plant steam generator. Some other on-going or planned computer vision projects by Tecnatom are also introduced. New possibilities of application appear in the inspection systems for nuclear components, where the main objective is to maximize their reliability. (Author) 6 refs.

  11. Reliability Assessment of Cloud Computing Platform Based on Semiquantitative Information and Evidential Reasoning

    Directory of Open Access Journals (Sweden)

    Hang Wei

    2016-01-01

    Full Text Available A reliability assessment method based on the evidential reasoning (ER) rule and semiquantitative information is proposed in this paper, where a new reliability assessment architecture including four aspects with both quantitative data and qualitative knowledge is established. The assessment architecture is more objective in describing a complex dynamic cloud computing environment than that of traditional methods. In addition, the ER rule, which has good performance for multiple attribute decision making problems, is employed to integrate the different types of attributes in the assessment architecture, which can obtain more accurate assessment results. The assessment results of the case study on an actual cloud computing platform verify the effectiveness and the advantage of the proposed method.
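
    For orientation, the sketch below implements classical Dempster-Shafer combination of two bodies of evidence, the building block that the ER rule generalizes with attribute weights and reliabilities; the grades and mass assignments are invented:

      from itertools import product

      def combine(m1, m2):
          """Dempster's rule: combine two mass functions over grade sets."""
          combined, conflict = {}, 0.0
          for (a, pa), (b, pb) in product(m1.items(), m2.items()):
              inter = tuple(sorted(set(a) & set(b)))
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + pa * pb
              else:
                  conflict += pa * pb                  # mass lost to conflict
          return {k: v / (1.0 - conflict) for k, v in combined.items()}

      frame = ("average", "good", "poor")              # frame of discernment
      m_monitor = {("good",): 0.6, ("average", "good"): 0.3, frame: 0.1}
      m_expert = {("good",): 0.5, ("average",): 0.2, frame: 0.3}
      print(combine(m_monitor, m_expert))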

  12. Computational area measurement of orbital floor fractures: Reliability, accuracy and rapidity

    International Nuclear Information System (INIS)

    Schouman, Thomas; Courvoisier, Delphine S.; Imholz, Benoit; Van Issum, Christopher; Scolozzi, Paolo

    2012-01-01

    Objective: To evaluate the reliability, accuracy and rapidity of a specific computational method for assessing the orbital floor fracture area on a CT scan. Method: A computer assessment of the area of the fracture, as well as that of the total orbital floor, was determined on CT scans taken from ten patients. The ratio of the fracture's area to the orbital floor area was also calculated. The test–retest precision of measurement calculations was estimated using the Intraclass Correlation Coefficient (ICC) and Dahlberg's formula to assess the agreement across observers and across measures. The time needed for the complete assessment was also evaluated. Results: The Intraclass Correlation Coefficient across observers was 0.92 [0.85;0.96], and the precision of the measures across observers was 4.9%, according to Dahlberg's formula. The mean time needed to make one measurement was 2 min and 39 s (range, 1 min and 32 s to 4 min and 37 s). Conclusion: This study demonstrated that (1) the area of the orbital floor fracture can be rapidly and reliably assessed by using a specific computer system directly on CT scan images; (2) this method has the potential of being routinely used to standardize the post-traumatic evaluation of orbital fractures
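
    For reference, Dahlberg's double-determination error used above takes the standard form (a background note, not taken from the record itself):

      \[ s_D = \sqrt{\frac{\sum_{i=1}^{n} d_i^2}{2n}} \]

    where d_i is the difference between the first and second measurement of case i and n is the number of double-measured cases; expressed relative to the mean measurement, it yields a percentage such as the 4.9% reported above.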

  13. Time-variant reliability assessment through equivalent stochastic process transformation

    International Nuclear Information System (INIS)

    Wang, Zequn; Chen, Wei

    2016-01-01

    Time-variant reliability measures the probability that an engineering system successfully performs intended functions over a certain period of time under various sources of uncertainty. In practice, it is computationally prohibitive to propagate uncertainty in time-variant reliability assessment based on expensive or complex numerical models. This paper presents an equivalent stochastic process transformation approach for cost-effective prediction of reliability deterioration over the life cycle of an engineering system. To reduce the high dimensionality, a time-independent reliability model is developed by translating random processes and time parameters into random parameters in order to equivalently cover all potential failures that may occur during the time interval of interest. With the time-independent reliability model, an instantaneous failure surface is attained by using a Kriging-based surrogate model to identify all potential failure events. To enhance the efficacy of failure surface identification, a maximum confidence enhancement method is utilized to update the Kriging model sequentially. Then, the time-variant reliability is approximated using Monte Carlo simulations of the Kriging model where system failures over a time interval are predicted by the instantaneous failure surface. The results of two case studies demonstrate that the proposed approach is able to accurately predict the time evolution of system reliability while requiring much less computational efforts compared with the existing analytical approach. - Highlights: • Developed a new approach for time-variant reliability analysis. • Proposed a novel stochastic process transformation procedure to reduce the dimensionality. • Employed Kriging models with confidence-based adaptive sampling scheme to enhance computational efficiency. • The approach is effective for handling random process in time-variant reliability analysis. • Two case studies are used to demonstrate the efficacy
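
    A minimal sketch of the surrogate-plus-Monte-Carlo loop described above, with a cheap analytic limit state standing in for the expensive numerical model and omitting the paper's sequential maximum confidence enhancement; failure is taken as g(x) < 0:

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def g(x):                                  # stand-in for the expensive model
          return 5.0 - x[:, 0] ** 2 - x[:, 1]

      rng = np.random.default_rng(1)
      X_train = 1.5 * rng.normal(size=(40, 2))   # small design of experiments
      surrogate = GaussianProcessRegressor(kernel=RBF(1.0)).fit(X_train, g(X_train))

      X_mc = rng.normal(size=(100_000, 2))       # cheap Monte Carlo on the Kriging model
      pf = np.mean(surrogate.predict(X_mc) < 0.0)
      print(f"estimated failure probability: {pf:.3e}")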

  14. Automatic Identification and Reconstruction of the Right Phrenic Nerve on Computed Tomography

    OpenAIRE

    Bamps, Kobe; Cuypers, Céline; Polmans, Pieter; Claesen, Luc; Koopman, Pieter

    2016-01-01

    An automatic computer algorithm was successfully constructed, enabling identification and reconstruction of the right phrenic nerve on high resolution coronary computed tomography scans. This could lead to a substantial reduction in the incidence of phrenic nerve paralysis during pulmonary vein isolation using balloon techniques.

  15. Computing Inter-Rater Reliability for Observational Data: An Overview and Tutorial

    Directory of Open Access Journals (Sweden)

    Kevin A. Hallgren

    2012-02-01

    Full Text Available Many research designs require the assessment of inter-rater reliability (IRR) to demonstrate consistency among observational ratings provided by multiple coders. However, many studies use incorrect statistical procedures, fail to fully report the information necessary to interpret their results, or do not address how IRR affects the power of their subsequent analyses for hypothesis testing. This paper provides an overview of methodological issues related to the assessment of IRR with a focus on study design, selection of appropriate statistics, and the computation, interpretation, and reporting of some commonly-used IRR statistics. Computational examples include SPSS and R syntax for computing Cohen’s kappa and intra-class correlations to assess IRR.
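
    To make the computation concrete, here is Cohen's kappa for two raters computed from first principles in Python (the tutorial itself gives SPSS and R syntax; the ratings below are invented):

      import numpy as np

      rater1 = np.array([0, 1, 1, 2, 2, 0, 1, 2, 0, 1])
      rater2 = np.array([0, 1, 2, 2, 2, 0, 1, 1, 0, 1])

      categories = np.unique(np.concatenate([rater1, rater2]))
      p_o = np.mean(rater1 == rater2)            # observed agreement
      p_e = sum(np.mean(rater1 == c) * np.mean(rater2 == c) for c in categories)
      kappa = (p_o - p_e) / (1.0 - p_e)          # chance-corrected agreement
      print(f"observed={p_o:.2f}, expected={p_e:.2f}, kappa={kappa:.2f}")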

  16. Interactive reliability assessment using an integrated reliability data bank

    International Nuclear Information System (INIS)

    Allan, R.N.; Whitehead, A.M.

    1986-01-01

    The logical structure, techniques and practical application of a computer-aided technique based on a microcomputer using floppy disc Random Access Files are described. This interactive computational technique is efficient if the reliability prediction program is coupled directly to a relevant source of data to create an integrated reliability assessment/reliability data bank system. (DG)

  17. [Reliability and validity of the Chinese version on Alcohol Use Disorders Identification Test].

    Science.gov (United States)

    Zhang, C; Yang, G P; Li, Z; Li, X N; Li, Y; Hu, J; Zhang, F Y; Zhang, X J

    2017-08-10

    Objective: To assess the reliability and validity of the Chinese version of the Alcohol Use Disorders Identification Test (AUDIT) among medical students in China and to provide a correct way of applying the recommended scales. Methods: An e-questionnaire was developed and sent to medical students in five different colleges. All students took the test as active volunteers. Cronbach's α and split-half reliability were calculated to evaluate the reliability of the AUDIT, while content, construct, discriminant and convergent validity were assessed to measure the validity of the scales. Results: The overall Cronbach's α of the AUDIT was 0.782 and the split-half reliability was 0.711. The domain Cronbach's α and split-half reliability were 0.796 and 0.794 for hazardous alcohol use, 0.561 and 0.623 for dependence symptoms, and 0.647 and 0.640 for harmful alcohol use. The item-level content validity indices (I-CVI) ranged from 0.83 to 1.00, the scale-level content validity index (S-CVI/UA) was 0.90, the average scale-level content validity index (S-CVI/Ave) was 0.99, and the content validity ratios (CVR) ranged from 0.80 to 1.00. The simplified version of the AUDIT supported a presupposed three-factor structure which explained 61.175% of the total variance, as revealed through exploratory factor analysis. The AUDIT seemed to have good convergent and discriminant validity, with a calibration experiment success rate of 100%. Conclusion: The AUDIT showed good reliability and validity among medical students in China and is thus worth promoting.
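
    A minimal sketch of the Cronbach's α computation reported above, with simulated item scores standing in for the AUDIT responses:

      import numpy as np

      def cronbach_alpha(items):
          """items: (n_subjects, k_items) score matrix."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1.0) * (1.0 - item_vars / total_var)

      rng = np.random.default_rng(2)
      latent = rng.normal(size=(300, 1))                      # common trait
      scores = latent + rng.normal(scale=0.8, size=(300, 10)) # 10-item scale
      print(f"alpha = {cronbach_alpha(scores):.3f}")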

  18. A Quantitative Risk Analysis Framework for Evaluating and Monitoring Operational Reliability of Cloud Computing

    Science.gov (United States)

    Islam, Muhammad Faysal

    2013-01-01

    Cloud computing offers the advantage of on-demand, reliable and cost efficient computing solutions without the capital investment and management resources to build and maintain in-house data centers and network infrastructures. The scalability of cloud solutions enables consumers to upgrade or downsize their services as needed. In a cloud environment,…

  19. Identification of critical parameters for PEMFC stack performance characterization and control strategies for reliable and comparable stack benchmarking

    DEFF Research Database (Denmark)

    Mitzel, Jens; Gülzow, Erich; Kabza, Alexander

    2016-01-01

    This paper is focused on the identification of critical parameters and on the development of reliable methodologies to achieve comparable benchmark results. Possibilities for control sensor positioning and for parameter variation in sensitivity tests are discussed and recommended options for the ...

  20. Reliability of voxel gray values in cone beam computed tomography for preoperative implant planning assessment

    NARCIS (Netherlands)

    Parsa, A.; Ibrahim, N.; Hassan, B.; Motroni, A.; van der Stelt, P.; Wismeijer, D.

    2012-01-01

    Purpose: To assess the reliability of cone beam computed tomography (CBCT) voxel gray value measurements using Hounsfield units (HU) derived from multislice computed tomography (MSCT) as a clinical reference (gold standard). Materials and Methods: Ten partially edentulous human mandibular cadavers

  1. Computer Model to Estimate Reliability Engineering for Air Conditioning Systems

    International Nuclear Information System (INIS)

    Afrah Al-Bossly, A.; El-Berry, A.; El-Berry, A.

    2012-01-01

    Reliability engineering is used to predict the performance and optimize the design and maintenance of air conditioning systems. Air conditioning systems are exposed to a number of failures. Failures of an air conditioner, such as failure to turn on, loss of cooling capacity, reduced output temperatures, loss of cool air supply and loss of air flow entirely, can be due to a variety of problems with one or more components of an air conditioner or air conditioning system. Forecasting system failure rates is very important for maintenance. This paper focused on the reliability of air conditioning systems, using two statistical distributions commonly applied in reliability settings: the standard (2-parameter) Weibull and Gamma distributions. After the distribution parameters had been estimated, reliability estimations and predictions were used for evaluation. To evaluate good operating condition in a building, the reliability of the air conditioning system that supplies conditioned air to the company's several departments was studied. This air conditioning system is divided into two parts, namely the main chilled water system and the ten air handling systems that serve the ten departments. In a chilled-water system the air conditioner cools water down to 40-45 degree F (4-7 degree C). The chilled water is distributed throughout the building in a piping system and connected to air conditioning units wherever needed. Data analysis was done with the support of computer-aided reliability software; the Weibull and Gamma distributions indicated that the reliabilities of the systems were 86.012% and 77.7%, respectively. A comparison between the two important families of distribution functions, namely the Weibull and Gamma families, was made, and it was found that the Weibull method performed better for decision making.
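
    A minimal sketch of the Weibull part of such an analysis, assuming illustrative failure times rather than the study's data:

      import numpy as np
      from scipy import stats

      t_fail = np.array([410., 820., 1150., 1410., 1680., 2050., 2600., 3100.])

      # floc=0 pins the location parameter, giving the standard 2-parameter form.
      beta, loc, eta = stats.weibull_min.fit(t_fail, floc=0)

      t = 1000.0
      reliability = np.exp(-(t / eta) ** beta)   # R(t) = exp(-(t/eta)^beta)
      print(f"shape beta={beta:.2f}, scale eta={eta:.0f} h, R({t:.0f} h)={reliability:.3f}")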

  2. Reliability of Lyapunov characteristic exponents computed by the two-particle method

    Science.gov (United States)

    Mei, Lijie; Huang, Li

    2018-03-01

    For highly complex problems, such as the post-Newtonian formulation of compact binaries, the two-particle method may be a better, or even the only, choice to compute the Lyapunov characteristic exponent (LCE). This method avoids the complex calculations of variational equations compared with the variational method. However, the two-particle method sometimes provides spurious estimates to LCEs. In this paper, we first analyze the equivalence in the definition of LCE between the variational and two-particle methods for Hamiltonian systems. Then, we develop a criterion to determine the reliability of LCEs computed by the two-particle method by considering the magnitude of the initial tangent (or separation) vector ξ0 (or δ0), renormalization time interval τ, machine precision ε, and global truncation error ɛT. The reliable Lyapunov characteristic indicators estimated by the two-particle method form a V-shaped region, which is restricted by δ0, ε, and ɛT. Finally, the numerical experiments with the Hénon-Heiles system, the spinning compact binaries, and the post-Newtonian circular restricted three-body problem strongly support the theoretical results.
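
    A minimal sketch of the two-particle method itself, applied to the Hénon-Heiles system mentioned above; the step size, renormalization interval τ and initial separation are illustrative choices:

      import numpy as np

      def f(s):                        # Henon-Heiles equations of motion
          x, y, px, py = s
          return np.array([px, py, -x - 2 * x * y, -y - x * x + y * y])

      def rk4(s, h):
          k1 = f(s); k2 = f(s + 0.5 * h * k1)
          k3 = f(s + 0.5 * h * k2); k4 = f(s + h * k3)
          return s + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

      d0, tau, h, n_renorm = 1e-8, 1.0, 0.01, 2000
      ref = np.array([0.0, 0.1, 0.5, 0.0])
      shadow = ref + np.array([d0, 0.0, 0.0, 0.0])

      log_sum = 0.0
      for _ in range(n_renorm):
          for _ in range(int(tau / h)):
              ref, shadow = rk4(ref, h), rk4(shadow, h)
          d = np.linalg.norm(shadow - ref)
          log_sum += np.log(d / d0)
          shadow = ref + (shadow - ref) * (d0 / d)   # renormalize the separation
      print("LCE estimate:", log_sum / (n_renorm * tau))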

  3. Transparent reliability model for fault-tolerant safety systems

    International Nuclear Information System (INIS)

    Bodsberg, Lars; Hokstad, Per

    1997-01-01

    A reliability model is presented which may serve as a tool for identification of cost-effective configurations and operating philosophies of computer-based process safety systems. The main merit of the model is the explicit relationship in the mathematical formulas between failure cause and the means used to improve system reliability such as self-test, redundancy, preventive maintenance and corrective maintenance. A component failure taxonomy has been developed which allows the analyst to treat hardware failures, human failures, and software failures of automatic systems in an integrated manner. Furthermore, the taxonomy distinguishes between failures due to excessive environmental stresses and failures initiated by humans during engineering and operation. Attention has been given to develop a transparent model which provides predictions which are in good agreement with observed system performance, and which is applicable for non-experts in the field of reliability

  4. Reliable methods for computer simulation error control and a posteriori estimates

    CERN Document Server

    Neittaanmäki, P

    2004-01-01

    Recent decades have seen a very rapid success in developing numerical methods based on explicit control over approximation errors. It may be said that nowadays a new direction is forming in numerical analysis, the main goal of which is to develop methods of reliable computations. In general, a reliable numerical method must solve two basic problems: (a) generate a sequence of approximations that converges to a solution and (b) verify the accuracy of these approximations. A computer code for such a method must consist of two respective blocks: solver and checker. In this book, we are chie

  5. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

    Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where...... the leading failure mechanism(s) is described by physics of failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena...... identification. Application of the proposed method can be found in many real world systems....

  6. Non-binary decomposition trees - a method of reliability computation for systems with known minimal paths/cuts

    International Nuclear Information System (INIS)

    Malinowski, Jacek

    2004-01-01

    A coherent system with independent components and known minimal paths (cuts) is considered. In order to compute its reliability, a tree structure T is constructed whose nodes contain the modified minimal paths (cuts) and numerical values. The value of a non-leaf node is a function of its child nodes' values. The values of leaf nodes are calculated from a simple formula. The value of the root node is the system's failure probability (reliability). Subsequently, an algorithm computing the system's failure probability (reliability) is constructed. The algorithm scans all nodes of T using a stack structure for this purpose. The nodes of T are alternately put on and removed from the stack, their data being modified in the process. Once the algorithm has terminated, the stack contains only the final modification of the root node of T, and its value is equal to the system's failure probability (reliability)
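
    For scale, the quantity such a tree delivers can be written down directly for a small system: the sketch below computes reliability from minimal paths of independent components by inclusion-exclusion, whose exponential cost in the number of paths is what decomposition methods of this kind aim to tame (the component data are invented):

      from itertools import combinations

      p = {1: 0.95, 2: 0.95, 3: 0.90, 4: 0.90}   # component reliabilities
      min_paths = [{1, 3}, {2, 4}, {1, 4}]       # minimal path sets

      def system_reliability(paths, rel):
          total = 0.0
          for k in range(1, len(paths) + 1):
              for combo in combinations(paths, k):
                  comps = set().union(*combo)    # components that must all work
                  prob = 1.0
                  for c in comps:
                      prob *= rel[c]
                  total += (-1) ** (k + 1) * prob
          return total

      print(f"R_system = {system_reliability(min_paths, p):.5f}")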

  7. RSAM: An enhanced architecture for achieving web services reliability in mobile cloud computing

    Directory of Open Access Journals (Sweden)

    Amr S. Abdelfattah

    2018-04-01

    Full Text Available The evolution of the mobile landscape is coupled with the ubiquitous nature of the internet, with its intermittent wireless connectivity, and the web services. Achieving web service reliability results in low communication overhead and retrieval of the appropriate response. The middleware approach (MA) is widely favoured for achieving web service reliability. This paper proposes a Reliable Service Architecture using Middleware (RSAM) that achieves reliable web service consumption. The enhanced architecture focuses on ensuring and tracking request execution under communication limitations and temporal service unavailability. It considers the main measurement factors, including request size, response size, and consuming time. We conducted experiments to compare the enhanced architecture with the traditional one. In these experiments, we covered several cases to prove the achievement of reliability. Results also show that the request size was found to be constant, the response size is identical to the traditional architecture, and the increase in consuming time was less than 5% of the transaction time for the different response sizes. Keywords: Reliable web service, Middleware architecture, Mobile cloud computing

  8. A computable phenotype for asthma case identification in adult and pediatric patients: External validation in the Chicago Area Patient-Outcomes Research Network (CAPriCORN).

    Science.gov (United States)

    Afshar, Majid; Press, Valerie G; Robison, Rachel G; Kho, Abel N; Bandi, Sindhura; Biswas, Ashvini; Avila, Pedro C; Kumar, Harsha Vardhan Madan; Yu, Byung; Naureckas, Edward T; Nyenhuis, Sharmilee M; Codispoti, Christopher D

    2017-10-13

    Comprehensive, rapid, and accurate identification of patients with asthma for clinical care and engagement in research efforts is needed. The original development and validation of a computable phenotype for asthma case identification occurred at a single institution in Chicago and demonstrated excellent test characteristics. However, its application in a diverse payer mix, across different health systems and multiple electronic health record vendors, and in both children and adults was not examined. The objective of this study is to externally validate the computable phenotype across diverse Chicago institutions to accurately identify pediatric and adult patients with asthma. A cohort of 900 asthma and control patients was identified from the electronic health record between January 1, 2012 and November 30, 2014. Two physicians at each site independently reviewed the patient charts to annotate cases. The inter-observer reliability between the physician reviewers had a κ-coefficient of 0.95 (95% CI 0.93-0.97). The accuracy, sensitivity, specificity, negative predictive value, and positive predictive value of the computable phenotype were all above 94% in the full cohort. The excellent positive and negative predictive values in this multi-center external validation study establish a useful tool to identify asthma cases in the electronic health record for research and care. This computable phenotype could be used in large-scale comparative-effectiveness trials.

  9. An analytical model for computation of reliability of waste management facilities with intermediate storages

    International Nuclear Information System (INIS)

    Kallweit, A.; Schumacher, F.

    1977-01-01

    High reliability is required of waste management facilities within the fuel cycle of nuclear power stations; this can be achieved by providing intermediate storage facilities and reserve capacities. In this report a model based on the theory of Markov processes is described which allows computation of the reliability characteristics of waste management facilities containing intermediate storage. The application of the model is demonstrated by an example. (orig.) [de
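
    A minimal sketch of a Markov model in this spirit: an invented three-state facility (processing stage running; stage down while the intermediate store fills; store full) solved for its steady-state probabilities:

      import numpy as np

      lam, mu, fill = 0.1, 0.5, 1.0   # failure, repair and buffer-filling rates (per day)

      # States: 0 = running, 1 = down / buffer filling, 2 = down / buffer full.
      Q = np.array([[-lam,          lam,  0.0],
                    [  mu, -(mu + fill), fill],
                    [  mu,          0.0,  -mu]])

      # Steady state: pi @ Q = 0 with sum(pi) = 1, solved as a least-squares system.
      A = np.vstack([Q.T, np.ones(3)])
      b = np.array([0.0, 0.0, 0.0, 1.0])
      pi, *_ = np.linalg.lstsq(A, b, rcond=None)
      print("steady-state probabilities:", pi, "P(buffer full) =", pi[2])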

  10. Probability of extreme interference levels computed from reliability approaches: application to transmission lines with uncertain parameters

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper deals with the risk analysis of an EMC default using a statistical approach. It is based on reliability methods from probabilistic engineering mechanics. A computation of the probability of failure (i.e. the probability of exceeding a threshold) of an induced current caused by crosstalk is established by taking into account uncertainties in input parameters influencing the levels of interference in the context of transmission lines. The study has allowed us to evaluate the probability of failure of the induced current by using reliability methods having a relatively low computational cost compared to Monte Carlo simulation. (authors)

  11. Reliability analysis of Airbus A-330 computer flight management system

    OpenAIRE

    Fajmut, Metod

    2010-01-01

    Diploma thesis deals with the digitized, computerized »fly-by-wire« flight control system and security aspects of the computer system of the Airbus A330 aircraft. As in space and military aircraft structures, a large share of the financial investment in commercial airplanes is devoted to reliability. Conventional aircraft control systems relied, and some still rely, on mechanical and hydraulic connections between the controls operated by the pilot and the control surfaces. But newer a...

  12. The Reliability of EMU Fiscal Indicators: Risks and Safeguards

    OpenAIRE

    Fabrizio Balassone; Daniele Franco; Stefania Zotteri

    2007-01-01

    The reliability of EMU's fiscal indicators has been questioned by recent episodes of large upward deficit revisions. This paper discusses the causes of such revisions in order to identify ways to improve monitoring. The computation of EMU's deficit indicator involves the assessment of accrued revenue and expenditure and the identification of transactions in financial assets. Both can open margins for opportunistic accounting. However, crosschecks between deficit and changes in gross nomin...

  13. Non-binary decomposition trees - a method of reliability computation for systems with known minimal paths/cuts

    Energy Technology Data Exchange (ETDEWEB)

    Malinowski, Jacek

    2004-05-01

    A coherent system with independent components and known minimal paths (cuts) is considered. In order to compute its reliability, a tree structure T is constructed whose nodes contain the modified minimal paths (cuts) and numerical values. The value of a non-leaf node is a function of its child nodes' values. The values of leaf nodes are calculated from a simple formula. The value of the root node is the system's failure probability (reliability). Subsequently, an algorithm computing the system's failure probability (reliability) is constructed. The algorithm scans all nodes of T using a stack structure for this purpose. The nodes of T are alternately put on and removed from the stack, their data being modified in the process. Once the algorithm has terminated, the stack contains only the final modification of the root node of T, and its value is equal to the system's failure probability (reliability)

  14. Cross-cultural adaptation, reliability, and validation of the Korean version of the identification functional ankle instability (IdFAI).

    Science.gov (United States)

    Ko, Jupil; Rosen, Adam B; Brown, Cathleen N

    2017-09-12

    To cross-culturally adapt the Identification of Functional Ankle Instability (IdFAI) for use with Korean-speaking participants. The English version of the IdFAI was cross-culturally adapted into Korean based on the guidelines. The psychometric properties of the Korean version of the IdFAI were measured for test-retest reliability, internal consistency, criterion-related validity, discriminative validity, and measurement error in 181 native Korean speakers. The intra-class correlation coefficient (ICC(2,1)) between the English and Korean versions of the IdFAI for test-retest reliability was 0.98 (standard error of measurement = 1.41). The Cronbach's alpha coefficient was 0.89 for the Korean version of the IdFAI. The Korean version of the IdFAI had a strong correlation with the SF-36 (rs = -0.69), and a score greater than 10 was the optimal cutoff score to distinguish between the group memberships. The minimally detectable change of the Korean version of the IdFAI score was 3.91. The Korean version of the IdFAI has been shown to be an excellent, reliable, and valid instrument, and it can be utilized to assess the presence of Chronic Ankle Instability by researchers and clinicians working among Korean-speaking populations. Implications for rehabilitation: The high recurrence rate of sprains may result in Chronic Ankle Instability (CAI). The Identification of Functional Ankle Instability tool (IdFAI) has been validated and recommended to identify patients with Chronic Ankle Instability (CAI). The Korean version of the Identification of Functional Ankle Instability tool (IdFAI) may also be recommended to researchers and clinicians for assessing the presence of Chronic Ankle Instability (CAI) in Korean-speaking populations.

  15. RGCA: A Reliable GPU Cluster Architecture for Large-Scale Internet of Things Computing Based on Effective Performance-Energy Optimization.

    Science.gov (United States)

    Fang, Yuling; Chen, Qingkui; Xiong, Neal N; Zhao, Deyu; Wang, Jingjuan

    2017-08-04

    This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamically coupling Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes' diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with TLPOM, and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services.

  16. Reliability of an interactive computer program for advance care planning.

    Science.gov (United States)

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20]=0.83-0.95, and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time.

  17. Reliability of an Interactive Computer Program for Advance Care Planning

    Science.gov (United States)

    Levi, Benjamin H.; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-01-01

    Abstract Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20]=0.83–0.95, and 0.86–0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830

  18. Accident identification system with automatic detection of abnormal condition using quantum computation

    International Nuclear Information System (INIS)

    Nicolau, Andressa dos Santos; Schirru, Roberto; Lima, Alan Miranda Monteiro de

    2011-01-01

    Transient identification systems have been proposed in order to keep the plant operating in safe conditions and to help operators make decisions in emergencies, within a short time interval and with maximum associated certainty. This article presents a system, time independent and without the use of an event that could serve as a starting point for t = 0 (reactor scram, for instance), for transient/accident identification in a pressurized water reactor (PWR). The model was developed to recognize the normal condition and three accidents from the design basis list of the Angra 2 Nuclear Power Plant, postulated in the Final Safety Analysis Report (FSAR). Several sets of process variables were used in order to establish a minimum set of variables considered necessary and sufficient. The optimization step of the identification algorithm is based upon the paradigm of quantum computing; the optimization metaheuristic Quantum Inspired Evolutionary Algorithm (QEA) was implemented and works as a data mining tool. The results obtained with the QEA without the time variable are comparable to the techniques in the reference literature for the transient identification problem, with less computational effort (number of evaluations). The system robustly approximates the ideal solution, the Voronoi vectors, with only one partition per accident class. (author)
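
    A minimal sketch of the classification stage implied above: each class is represented by a prototype (Voronoi) vector over normalized process variables, and a plant state is assigned to the nearest prototype; in the paper the QEA optimizes the prototype positions, which are invented here:

      import numpy as np

      prototypes = {                          # illustrative, normalized variables
          "normal":     np.array([0.50, 0.50, 0.50]),
          "accident_A": np.array([0.20, 0.80, 0.40]),
          "accident_B": np.array([0.70, 0.30, 0.90]),
          "accident_C": np.array([0.40, 0.60, 0.10]),
      }

      def identify(state):
          """Return the class whose prototype is nearest to the plant state."""
          return min(prototypes, key=lambda k: np.linalg.norm(state - prototypes[k]))

      print(identify(np.array([0.25, 0.75, 0.45])))   # -> "accident_A"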

  19. Exploration of the (Interrater) Reliability and Latent Factor Structure of the Alcohol Use Disorders Identification Test (AUDIT) and the Drug Use Disorders Identification Test (DUDIT) in a Sample of Dutch Probationers.

    Science.gov (United States)

    Hildebrand, Martin; Noteborn, Mirthe G C

    2015-01-01

    The use of brief, reliable, valid, and practical measures of substance use is critical for conducting individual (risk and need) assessments in probation practice. In this exploratory study, the basic psychometric properties of the Alcohol Use Disorders Identification Test (AUDIT) and the Drug Use Disorders Identification Test (DUDIT) are evaluated. The instruments were administered as an oral interview instead of a self-report questionnaire. The sample comprised 383 offenders (339 men, 44 women). A subset of 56 offenders (49 men, 7 women) participated in the interrater reliability study. Data collection took place between September 2011 and November 2012. Overall, both instruments have acceptable levels of interrater reliability for total scores and acceptable to good interrater reliabilities for most of the individual items. Confirmatory factor analyses (CFA) indicated that the a priori one-, two- and three-factor solutions for the AUDIT did not fit the observed data very well. Principal axis factoring (PAF) supported a two-factor solution for the AUDIT that included a level of alcohol consumption/consequences factor (Factor 1) and a dependence factor (Factor 2), with both factors explaining substantial variance in AUDIT scores. For the DUDIT, CFA and PAF suggest that a one-factor solution is the preferred model (accounting for 62.61% of total variance). The Dutch language versions of the AUDIT and the DUDIT are reliable screening instruments for use with probationers and both instruments can be reliably administered by probation officers in probation practice. However, future research on concurrent and predictive validity is warranted.

  20. Contours identification of elements in a cone beam computed tomography for investigating maxillary cysts

    Science.gov (United States)

    Chioran, Doina; Nicoarǎ, Adrian; Roşu, Şerban; Cǎrligeriu, Virgil; Ianeş, Emilia

    2013-10-01

    Digital processing of two-dimensional cone beam computed tomography slices starts with the identification of the contours of the elements within. This paper deals with the collective work of specialists in medicine and in applied mathematics and computer science on the elaboration and implementation of algorithms for dental 2D imagery.
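
    A minimal sketch of the first step named above, contour identification on a single slice, assuming OpenCV and an illustrative 8-bit grayscale image file:

      import cv2

      slice_img = cv2.imread("ct_slice.png", cv2.IMREAD_GRAYSCALE)  # illustrative path
      blurred = cv2.GaussianBlur(slice_img, (5, 5), 0)

      # Otsu thresholding separates radiolucent regions (e.g., cysts) from bone.
      _, binary = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

      contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
      large = [c for c in contours if cv2.contourArea(c) > 100.0]   # drop speckle noise
      print(f"{len(large)} candidate element contours found")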

  1. UPTF test instrumentation. Measurement system identification, engineering units and computed parameters

    International Nuclear Information System (INIS)

    Sarkar, J.; Liebert, J.; Laeufer, R.

    1992-11-01

    This updated version of the previous report /1/ contains, besides the additional instrumentation needed for the 2D/3D Programme, the supplementary instrumentation in the inlet plenum of the SG simulator, the hot and cold legs of the broken loop, the cold legs of the intact loops, and the upper plenum to meet the requirements (Test Phase A) of the UPTF Programme TRAM, sponsored by the Federal Minister of Research and Technology (BMFT) of the Federal Republic of Germany. For clarity, the derivation and description of the identification codes for the entire conventional and advanced measurement systems, classifying the function and the equipment unit key as adopted in conventional power plants, have been included. Amendments have also been made to the appendices. In particular, the list of measurement systems covering the measurement identification code, instrument, measured quantity, measuring range, bandwidth, uncertainty and sensor location has been updated and extended to include the supplementary instrumentation. Beyond these amendments, the uncertainties of the measurements have been precisely specified. The measurement identification codes, which also stand for the identification of the corresponding measured quantities in engineering units, and the identification codes derived therefrom for the computed parameters, have been adequately detailed. (orig.)

  2. Interaction Entropy: A New Paradigm for Highly Efficient and Reliable Computation of Protein-Ligand Binding Free Energy.

    Science.gov (United States)

    Duan, Lili; Liu, Xiao; Zhang, John Z H

    2016-05-04

    Efficient and reliable calculation of protein-ligand binding free energy is a grand challenge in computational biology and is of critical importance in drug design and many other molecular recognition problems. The main challenge lies in the calculation of entropic contribution to protein-ligand binding or interaction systems. In this report, we present a new interaction entropy method which is theoretically rigorous, computationally efficient, and numerically reliable for calculating entropic contribution to free energy in protein-ligand binding and other interaction processes. Drastically different from the widely employed but extremely expensive normal mode method for calculating entropy change in protein-ligand binding, the new method calculates the entropic component (interaction entropy or -TΔS) of the binding free energy directly from molecular dynamics simulation without any extra computational cost. Extensive study of over a dozen randomly selected protein-ligand binding systems demonstrated that this interaction entropy method is both computationally efficient and numerically reliable and is vastly superior to the standard normal mode approach. This interaction entropy paradigm introduces a novel and intuitive conceptual understanding of the entropic effect in protein-ligand binding and other general interaction systems as well as a practical method for highly efficient calculation of this effect.
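
    The core identity of the interaction entropy method, as we understand it from the description above, expresses the entropic term directly as an ensemble average over molecular dynamics snapshots:

      \[ -T\Delta S = k_B T \,\ln \left\langle e^{\beta \Delta E_{\mathrm{int}}} \right\rangle, \qquad \Delta E_{\mathrm{int}} = E_{\mathrm{int}} - \langle E_{\mathrm{int}} \rangle, \qquad \beta = \frac{1}{k_B T} \]

    Because the interaction energy E_int is already evaluated at every snapshot of the simulation, the average adds essentially no cost on top of the molecular dynamics run, which is the efficiency claim made above.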

  3. Review of the reliability of Bruce 'B' RRS dual computer system

    International Nuclear Information System (INIS)

    Arsenault, J.E.; Manship, R.A.; Levan, D.G.

    1995-07-01

    The review presents an analysis of the Bruce 'B' Reactor Regulating System (RRS) Digital Control Computer (DCC) system, based on system documentation, significant event reports (SERs), question sets, and a site visit. The intent is to evaluate the reliability of the RRS DCC and to identify the possible scenarios that could lead to a serious process failure. The evaluation is based on three relatively independent analyses, which are integrated and presented in the form of Conclusions and Recommendations

  4. Reliability and validity of the Korean standard pattern identification for stroke (K-SPI-Stroke) questionnaire

    Directory of Open Access Journals (Sweden)

    Kang Byoung-Kab

    2012-04-01

    Full Text Available Abstract Background The present study was conducted to examine the reliability and validity of the 'Korean Standard Pattern Identification for Stroke' (K-SPI-Stroke), which was developed and evaluated within the context of traditional Korean medicine (TKM). Methods Between September 2006 and December 2010, 2,905 patients from 11 Korean medical hospitals were asked to complete the K-SPI-Stroke questionnaire as a part of the project 'Fundamental study for the standardization and objectification of pattern identification in traditional Korean medicine for stroke' (SOPI-Stroke). Each patient was independently diagnosed by two TKM physicians from the same site according to one of four patterns, as suggested by the Korea Institute of Oriental Medicine: (1) a Qi deficiency pattern, (2) a Dampness-phlegm pattern, (3) a Yin deficiency pattern, or (4) a Fire-heat pattern. We estimated the internal consistency using Cronbach's α coefficient, the discriminant validity using the mean scores of patterns, and the predictive validity using the classification accuracy of the K-SPI-Stroke questionnaire. Results The K-SPI-Stroke questionnaire had satisfactory internal consistency (α = 0.700) and validity, with significant differences in the mean scores among the four patterns. The overall classification accuracy of this questionnaire was 65.2%. Conclusion These results suggest that the K-SPI-Stroke questionnaire is a reliable and valid instrument for estimating the severity of the four patterns.

  5. Computer-aided reliability and risk assessment

    International Nuclear Information System (INIS)

    Leicht, R.; Wingender, H.J.

    1989-01-01

    Activities in the fields of reliability and risk analyses have led to the development of particular software tools which now are combined in the PC-based integrated CARARA system. The options available in this system cover a wide range of reliability-oriented tasks, like organizing raw failure data in the component/event data bank FDB, performing statistical analysis of those data with the program FDA, managing the resulting parameters in the reliability data bank RDB, and performing fault tree analysis with the fault tree code FTL or evaluating the risk of toxic or radioactive material release with the STAR code. (orig.)

  6. Reliability analysis of microcomputer boards and computer based systems important to safety of nuclear plants

    International Nuclear Information System (INIS)

    Shrikhande, S.V.; Patil, V.K.; Ganesh, G.; Biswas, B.; Patil, R.K.

    2010-01-01

    Computer Based Systems (CBS) are employed in Indian nuclear plants for protection, control and monitoring purposes. For forthcoming CBS, the Reactor Control Division has designed and developed a new standardized family of microcomputer boards qualified to the stringent requirements of the nuclear industry. These boards form the basic building blocks of CBS. Reliability analysis of these boards is being carried out using an analysis package based on the MIL-STD-217Plus methodology. The estimated failure rate values of these standardized microcomputer boards will be useful for the reliability assessment of these systems. The paper presents the reliability analysis of the microcomputer boards and a case study of a CBS built using these boards. (author)

  7. Fault tolerance in computational grids: perspectives, challenges, and issues.

    Science.gov (United States)

    Haider, Sajjad; Nazir, Babar

    2016-01-01

    Computational grids are established with the intention of providing shared access to hardware and software based resources with special reference to increased computational capabilities. Fault tolerance is one of the most important issues faced by computational grids. The main contribution of this survey is the creation of an extended classification of problems that occur in computational grid environments. The proposed classification will help researchers, developers, and maintainers of grids to understand the types of issues to be anticipated. Moreover, different types of problems, such as omission, interaction, and timing-related failures, have been identified that need to be handled on various layers of the computational grid. In this survey, an analysis and examination is also performed pertaining to fault tolerance and fault detection mechanisms. Our conclusion is that a dependable and reliable grid can only be established when more emphasis is placed on fault identification. Moreover, our survey reveals that adaptive and intelligent fault identification and tolerance techniques can improve the dependability of grid working environments.

  8. Test-retest reliability and comparability of paper and computer questionnaires for the Finnish version of the Tampa Scale of Kinesiophobia.

    Science.gov (United States)

    Koho, P; Aho, S; Kautiainen, H; Pohjolainen, T; Hurri, H

    2014-12-01

    To estimate the internal consistency, test-retest reliability and comparability of paper and computer versions of the Finnish version of the Tampa Scale of Kinesiophobia (TSK-FIN) among patients with chronic pain. In addition, patients' personal experiences of completing both versions of the TSK-FIN and preferences between these two methods of data collection were studied. Test-retest reliability study. Paper and computer versions of the TSK-FIN were completed twice on two consecutive days. The sample comprised 94 consecutive patients with chronic musculoskeletal pain participating in a pain management or individual rehabilitation programme. The group rehabilitation design consisted of physical and functional exercises, evaluation of the social situation, psychological assessment of pain-related stress factors, and personal pain management training in order to regain overall function and mitigate the inconvenience of pain and fear-avoidance behaviour. The mean TSK-FIN score was 37.1 [standard deviation (SD) 8.1] for the computer version and 35.3 (SD 7.9) for the paper version. The mean difference between the two versions was 1.9 (95% confidence interval 0.8 to 2.9). Test-retest reliability was 0.89 for the paper version and 0.88 for the computer version. Internal consistency was considered to be good for both versions. The intraclass correlation coefficient for comparability was 0.77 (95% confidence interval 0.66 to 0.85), indicating substantial reliability between the two methods. Both versions of the TSK-FIN demonstrated substantial intertest reliability, good test-retest reliability, good internal consistency and acceptable limits of agreement, suggesting their suitability for clinical use. However, subjects tended to score higher when using the computer version. As such, in an ideal situation, data should be collected in a similar manner throughout the course of rehabilitation or clinical research. Copyright © 2014 Chartered Society of Physiotherapy. Published

  9. The computer vision in the service of safety and reliability in steam generators inspection services

    International Nuclear Information System (INIS)

    Pineiro Fernandez, P.; Garcia Bueno, A.; Cabrera Jordan, E.

    2012-01-01

    Computer vision has matured very quickly in the last ten years, facilitating new developments in various areas of nuclear application and making it possible to automate and simplify processes and tasks, in place of or in collaboration with people and equipment, efficiently. Current computer vision (a more appropriate term than 'artificial vision') also offers great possibilities for improving the reliability and safety of NPP inspection systems.

  10. Exploration of the (interrater) reliability and latent factor structure of the Alcohol Use Disorders Identification Test (AUDIT) and the Drug Use Disorders Identification Test (DUDIT) in a sample of Dutch probationers

    NARCIS (Netherlands)

    Noteborn, M.G.C.; Hildebrand, M.

    2015-01-01

    Background: The use of brief, reliable, valid, and practical measures of substance use is critical for conducting individual (risk and need) assessments in probation practice. In this exploratory study, the basic psychometric properties of the Alcohol Use Disorders Identification Test (AUDIT) and

  11. Reliability analysis and computation of computer-based safety instrumentation and control used in German nuclear power plant. Final report

    International Nuclear Information System (INIS)

    Ding, Yongjian; Krause, Ulrich; Gu, Chunlei

    2014-01-01

    The trend of technological advancement in the field of safety instrumentation and control (I and C) leads to increasingly frequent use of computer-based (digital) control systems, which consist of distributed computers connected by bus communications whose functionality is freely programmable by qualified software. The advantages of the new I and C systems over the old hard-wired I and C technology lie e.g. in higher flexibility, cost-effective procurement of spare parts, and higher hardware reliability (through higher integration density, intelligent self-monitoring mechanisms, etc.). On the other hand, skeptics see in the new computer-based I and C technology a higher potential for common cause failures (CCF) and easier manipulation by sabotage (IT security). In this joint research project funded by the Federal Ministry for Economic Affairs and Energy (BMWi) (2011-2014, FJZ 1501405), the Otto-von-Guericke-University Magdeburg and the Magdeburg-Stendal University of Applied Sciences are therefore trying to develop suitable methods for demonstrating the reliability of the new instrumentation and control systems, with a focus on the investigation of CCF. The expertise of both institutions shall thereby be extended to this area, and a scientific contribution made to sound reliability judgments on digital safety I and C in domestic and foreign nuclear power plants. First, the state of science and technology is worked out through the study of national and international standards in the field of functional safety of electrical and I and C systems and the accompanying literature. On the basis of the existing nuclear standards, the deterministic requirements on the structure of the new digital I and C systems are determined. The possible methods of reliability modeling are analyzed and compared. A suitable method called multi class binomial failure rate (MCFBR), which was successfully used in safety valve applications, will be

  12. How to effectively compute the reliability of a thermal-hydraulic nuclear passive system

    International Nuclear Information System (INIS)

    Zio, E.; Pedroni, N.

    2011-01-01

    Research highlights: → Optimized LS is the preferred choice for failure probability estimation. → Two alternative options are suggested for uncertainty and sensitivity analyses. → SS for simulation codes requiring seconds or minutes to run. → Regression models (e.g., ANNs) for simulation codes requiring hours or days to run. - Abstract: The computation of the reliability of a thermal-hydraulic (T-H) passive system of a nuclear power plant can be obtained by (i) Monte Carlo (MC) sampling the uncertainties of the system model and parameters, (ii) computing, for each sample, the system response by a mechanistic T-H code and (iii) comparing the system response with pre-established safety thresholds, which define the success or failure of the safety function. The computational effort involved can be prohibitive because of the large number of (typically long) T-H code simulations that must be performed (one for each sample) for the statistical estimation of the probability of success or failure. The objective of this work is to provide operative guidelines to effectively handle the computation of the reliability of a nuclear passive system. Two directions of computation efficiency are considered: from one side, efficient Monte Carlo Simulation (MCS) techniques are indicated as a means to performing robust estimations with a limited number of samples: in particular, the Subset Simulation (SS) and Line Sampling (LS) methods are identified as most valuable; from the other side, fast-running, surrogate regression models (also called response surfaces or meta-models) are indicated as a valid replacement of the long-running T-H model codes: in particular, the use of bootstrapped Artificial Neural Networks (ANNs) is shown to have interesting potentials, including for uncertainty propagation. The recommendations drawn are supported by the results obtained in an illustrative application of literature.
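
    As a minimal illustration of steps (i)-(iii) above, the sketch below estimates a failure probability by crude Monte Carlo in Python. The placeholder model, parameter distributions, and safety threshold are all invented for illustration; a real analysis would replace th_model with the long-running T-H code or a bootstrapped ANN surrogate, and crude sampling with SS or LS when the failure probability is small.

      import numpy as np

      rng = np.random.default_rng(42)

      def th_model(power, htc):
          # Stand-in for a long-running thermal-hydraulic code: returns a
          # peak cladding temperature (K) for the sampled inputs.
          return 600.0 + 0.9 * power / htc

      T_THRESHOLD = 920.0  # hypothetical safety threshold (K)
      N = 100_000

      # (i) Monte Carlo sampling of the uncertain model parameters
      power = rng.normal(300.0, 30.0, N)   # decay power (kW), assumed
      htc = rng.lognormal(0.0, 0.25, N)    # heat-transfer multiplier, assumed

      # (ii) system response per sample; (iii) comparison with the threshold
      temps = th_model(power, htc)
      p_fail = (temps > T_THRESHOLD).mean()
      std_err = np.sqrt(p_fail * (1.0 - p_fail) / N)
      print(f"P(failure) ~= {p_fail:.2e} +/- {std_err:.1e}")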

  13. Reference Computational Meshing Strategy for Computational Fluid Dynamics Simulation of Departure from Nucleate Boiling

    Energy Technology Data Exchange (ETDEWEB)

    Pointer, William David [ORNL]

    2017-08-01

    The objective of this effort is to establish a strategy and process for generation of suitable computational meshes for computational fluid dynamics simulations of departure from nucleate boiling (DNB) in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface-averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current-generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
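
    For context, the Grid Convergence Index mentioned above is conventionally computed from solutions on three systematically refined meshes via Richardson extrapolation. The Python sketch below follows the standard Roache/Celik formulation with an assumed constant refinement ratio and safety factor; it is a generic illustration, not code from the report.

      import math

      def grid_convergence_index(f_fine, f_med, f_coarse, r=2.0, fs=1.25):
          """GCI of the fine-mesh solution from three mesh levels with
          constant refinement ratio r and safety factor fs (both assumed)."""
          # Observed order of convergence from the three solutions
          p = math.log(abs(f_coarse - f_med) / abs(f_med - f_fine)) / math.log(r)
          # Relative error between the two finest solutions
          e21 = abs((f_med - f_fine) / f_fine)
          return fs * e21 / (r**p - 1.0), p

      gci, p = grid_convergence_index(2.012, 2.060, 2.197)  # made-up solutions
      print(f"observed order p = {p:.2f}, GCI_fine = {100 * gci:.2f}%")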

  14. New algorithm to reduce the number of computing steps in reliability formula of Weighted-k-out-of-n system

    Directory of Open Access Journals (Sweden)

    Tatsunari Ohkura

    2007-02-01

    Full Text Available In the disjoint products version of reliability analysis of weighted-k-out-of-n systems, it is necessary to determine the order in which the weights of the components are to be considered. The k-out-of-n:G(F) system consists of n components; each component has its own probability and positive integer weight such that the system is operational (failed) if and only if the total weight of some operational (failed) components is at least k. This paper designs a method to compute the reliability in O(nk) computing time and O(nk) memory space. The proposed method expresses the system reliability in fewer product terms than those already published.
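
    The O(nk) complexity can also be reached with a simple dynamic-programming recursion over the accumulated (capped) working weight; the sketch below is a generic formulation of that idea, not the paper's disjoint-products algorithm.

      def weighted_k_out_of_n_reliability(p, weights, k):
          """Reliability of a weighted-k-out-of-n:G system in O(n*k) time.
          dist[w] = P(total weight of working components so far == w),
          with all weight >= k collapsed into the single absorbing state k."""
          dist = [0.0] * (k + 1)
          dist[0] = 1.0
          for pi, wi in zip(p, weights):
              new = [0.0] * (k + 1)
              for w, prob in enumerate(dist):
                  if prob:
                      new[min(w + wi, k)] += prob * pi   # component works
                      new[w] += prob * (1.0 - pi)        # component fails
              dist = new
          return dist[k]  # P(total working weight >= k)

      # Example: 4 components with weights 3, 2, 2, 1 and threshold k = 4
      print(weighted_k_out_of_n_reliability([0.9, 0.8, 0.85, 0.95], [3, 2, 2, 1], 4))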

  15. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    This paper describes work being undertaken to develop a Reliability Description Language (RDL) that will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way. Component and system features can be stated in a formal manner and subsequently used, along with control statements, to form a structured program. The program can be compiled and executed on a general-purpose computer or a special-purpose simulator. (DG)

  16. Apps for Angiosperms: The Usability of Mobile Computers and Printed Field Guides for UK Wild Flower and Winter Tree Identification

    Science.gov (United States)

    Stagg, Bethan C.; Donkin, Maria E.

    2017-01-01

    We investigated the usability of mobile computers and field guide books with adult botanical novices for the identification of wildflowers and deciduous trees in winter. Identification accuracy was significantly higher for wildflowers using a mobile computer app than with field guide books, but significantly lower for deciduous trees. User preference…

  17. Automated bony region identification using artificial neural networks: reliability and validation measurements

    International Nuclear Information System (INIS)

    Gassman, Esther E.; Kallemeyn, Nicole A.; DeVries, Nicole A.; Shivanna, Kiran H.; Powell, Stephanie M.; Magnotta, Vincent A.; Ramme, Austin J.; Adams, Brian D.; Grosland, Nicole M.

    2008-01-01

    The objective was to develop tools for automating the identification of bony structures, to assess the reliability of this technique against manual raters, and to validate the resulting regions of interest against physical surface scans obtained from the same specimen. Artificial intelligence-based algorithms have been used for image segmentation, specifically artificial neural networks (ANNs). For this study, an ANN was created and trained to identify the phalanges of the human hand. The relative overlap between the ANN and a manual tracer was 0.87, 0.82, and 0.76 for the proximal, middle, and distal index phalanx bones, respectively. Compared with the physical surface scans, the ANN-generated surface representations differed on average by 0.35 mm, 0.29 mm, and 0.40 mm for the proximal, middle, and distal phalanges, respectively. Furthermore, the ANN proved to segment the structures in less than one-tenth of the time required by a manual rater. The ANN has proven to be a reliable and valid means of segmenting the phalanx bones from CT images. Employing automated methods such as the ANN for segmentation eliminates the likelihood of rater drift and inter-rater variability. Automated methods also decrease the amount of time and manual effort required to extract the data of interest, thereby making the feasibility of patient-specific modeling a reality. (orig.)
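
    The "relative overlap" reported above is a volume-overlap ratio between the ANN segmentation and the manual tracing; the paper's exact formula is not given here, so the sketch below assumes a Jaccard-style intersection-over-union on binary masks.

      import numpy as np

      def relative_overlap(mask_a, mask_b):
          """Jaccard-style overlap of two binary segmentation masks
          (assumed metric)."""
          a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
          union = np.logical_or(a, b).sum()
          return np.logical_and(a, b).sum() / union if union else 1.0

      ann = np.zeros((64, 64), bool); ann[10:40, 10:40] = True
      manual = np.zeros((64, 64), bool); manual[12:42, 12:42] = True
      print(f"relative overlap = {relative_overlap(ann, manual):.2f}")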

  18. Automated bony region identification using artificial neural networks: reliability and validation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Gassman, Esther E.; Kallemeyn, Nicole A.; DeVries, Nicole A.; Shivanna, Kiran H. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); The University of Iowa, Center for Computer-Aided Design, Iowa City, IA (United States); Powell, Stephanie M. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); University of Iowa Hospitals and Clinics, The University of Iowa, Department of Radiology, Iowa City, IA (United States); Magnotta, Vincent A. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); The University of Iowa, Center for Computer-Aided Design, Iowa City, IA (United States); University of Iowa Hospitals and Clinics, The University of Iowa, Department of Radiology, Iowa City, IA (United States); Ramme, Austin J. [University of Iowa Hospitals and Clinics, The University of Iowa, Department of Radiology, Iowa City, IA (United States); Adams, Brian D. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); University of Iowa Hospitals and Clinics, The University of Iowa, Department of Orthopaedics and Rehabilitation, Iowa City, IA (United States); Grosland, Nicole M. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); University of Iowa Hospitals and Clinics, The University of Iowa, Department of Orthopaedics and Rehabilitation, Iowa City, IA (United States); The University of Iowa, Center for Computer-Aided Design, Iowa City, IA (United States)

    2008-04-15

    The objective was to develop tools for automating the identification of bony structures, to assess the reliability of this technique against manual raters, and to validate the resulting regions of interest against physical surface scans obtained from the same specimen. Artificial intelligence-based algorithms have been used for image segmentation, specifically artificial neural networks (ANNs). For this study, an ANN was created and trained to identify the phalanges of the human hand. The relative overlap between the ANN and a manual tracer was 0.87, 0.82, and 0.76 for the proximal, middle, and distal index phalanx bones, respectively. Compared with the physical surface scans, the ANN-generated surface representations differed on average by 0.35 mm, 0.29 mm, and 0.40 mm for the proximal, middle, and distal phalanges, respectively. Furthermore, the ANN proved to segment the structures in less than one-tenth of the time required by a manual rater. The ANN has proven to be a reliable and valid means of segmenting the phalanx bones from CT images. Employing automated methods such as the ANN for segmentation eliminates the likelihood of rater drift and inter-rater variability. Automated methods also decrease the amount of time and manual effort required to extract the data of interest, thereby making the feasibility of patient-specific modeling a reality. (orig.)

  19. Intraobserver and intermethod reliability for using two different computer programs in preoperative lower limb alignment analysis

    Directory of Open Access Journals (Sweden)

    Mohamed Kenawey

    2016-12-01

    Conclusion: Computer-assisted lower limb alignment analysis is reliable whether using a graphics editing program or specialized planning software. However, slightly higher variability can be expected for angles measured away from the knee joint.

  20. A Program for the Identification of the Enterobacteriaceae for Use in Teaching the Principles of Computer Identification of Bacteria.

    Science.gov (United States)

    Hammonds, S. J.

    1990-01-01

    A technique for the numerical identification of bacteria using normalized likelihoods calculated from a probabilistic database is described, and the principles of the technique are explained. The listing of the computer program is included. Specimen results from the program, and examples of how they should be interpreted, are given. (KR)
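
    The normalized-likelihood technique taught by the program can be condensed to a few lines: each taxon's score is the product of the probabilities of the observed test results given that taxon, normalized across all taxa. The database entries below are invented for illustration; this is a reconstruction of the principle, not Hammonds' original listing.

      # Hypothetical probabilistic database: P(test positive | genus)
      DATABASE = {
          "Escherichia": {"indole": 0.98, "citrate": 0.01, "urease": 0.01},
          "Klebsiella":  {"indole": 0.10, "citrate": 0.95, "urease": 0.90},
          "Proteus":     {"indole": 0.50, "citrate": 0.15, "urease": 0.98},
      }

      def identify(results):
          """Normalized likelihoods for observed results (test -> True/False)."""
          likelihoods = {}
          for genus, probs in DATABASE.items():
              score = 1.0
              for test, positive in results.items():
                  score *= probs[test] if positive else 1.0 - probs[test]
              likelihoods[genus] = score
          total = sum(likelihoods.values())
          return {g: s / total for g, s in likelihoods.items()}

      print(identify({"indole": True, "citrate": False, "urease": False}))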

  1. Computational Acoustic Beamforming for Noise Source Identification for Small Wind Turbines.

    Science.gov (United States)

    Ma, Ping; Lien, Fue-Sang; Yee, Eugene

    2017-01-01

    This paper develops a computational acoustic beamforming (CAB) methodology for identification of sources of small wind turbine noise. This methodology is validated using the case of the NACA 0012 airfoil trailing edge noise. For this validation case, the predicted acoustic maps were in excellent conformance with the results of the measurements obtained from the acoustic beamforming experiment. Following this validation study, the CAB methodology was applied to the identification of noise sources generated by a commercial small wind turbine. The simulated acoustic maps revealed that the blade tower interaction and the wind turbine nacelle were the two primary mechanisms for sound generation for this small wind turbine at frequencies between 100 and 630 Hz.
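
    At its core, acoustic beamforming of this kind is delay-and-sum: microphone signals are phase-shifted according to the propagation delay from a candidate source point and summed, so that a true source at that point adds coherently. The frequency-domain sketch below is a generic textbook formulation, not the paper's CAB solver; the array geometry and sampling rate are left to the caller.

      import numpy as np

      C = 343.0  # speed of sound (m/s)

      def delay_and_sum_power(signals, mic_pos, grid_pts, fs):
          """Beamformer output power over candidate source points.
          signals: (n_mics, n_samples); mic_pos: (n_mics, 3); grid_pts: (n_pts, 3)."""
          n_mics, n_samples = signals.shape
          spectra = np.fft.rfft(signals, axis=1)
          freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
          power = np.zeros(len(grid_pts))
          for i, pt in enumerate(grid_pts):
              dists = np.linalg.norm(mic_pos - pt, axis=1)  # mic-to-point ranges
              # Undo the propagation delay so signals from pt align in phase
              steer = np.exp(2j * np.pi * freqs[None, :] * dists[:, None] / C)
              summed = (spectra * steer).sum(axis=0) / n_mics
              power[i] = np.mean(np.abs(summed) ** 2)
          return power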

  2. Reliability-based design of wind turbine blades

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    2011-01-01

    Reliability-based design of wind turbine blades requires identification of the important failure modes/limit states along with stochastic models for the uncertainties and methods for estimating the reliability. In the present paper it is described how reliability-based design can be applied to wi...

  3. Reliability of the Identification of Functional Ankle Instability (IdFAI) Scale Across Different Age Groups in Adults.

    Science.gov (United States)

    Gurav, Reshma S; Ganu, Sneha S; Panhale, Vrushali P

    2014-10-01

    Functional ankle instability (FAI) is the tendency of the foot to 'give way'. The Identification of Functional Ankle Instability (IdFAI) questionnaire is a newly developed instrument to detect whether individuals meet the minimum criteria necessary for inclusion in an FAI population. However, its reliability had been studied only in a restricted age group. The purpose of this investigation was to examine the reliability of the IdFAI across different age groups in adults. One hundred and twenty participants aged 20-60 years, with 30 individuals in each age group, were asked to complete the IdFAI on two occasions. Test-retest reliability was evaluated by the intraclass correlation coefficient (ICC(2,1)). The study revealed that the IdFAI has excellent test-retest reliability across different age groups: the ICC(2,1) in the age groups 20-30, 30-40, 40-50 and 50-60 years was 0.978, 0.975, 0.961 and 0.922, respectively, with Cronbach's alpha >0.9 in all age groups. The IdFAI can accurately predict whether an individual meets the minimum criterion for FAI across different age groups in adults. Thus, the questionnaire can be applied across different age groups in clinical and research settings.
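
    ICC(2,1), the two-way random-effects, absolute-agreement, single-measurement intraclass correlation used here, can be computed from the mean squares of a two-way ANOVA. The sketch below follows the standard Shrout-Fleiss (1979) formulation with made-up ratings; it is not the authors' code.

      import numpy as np

      def icc_2_1(ratings):
          """ICC(2,1) for an (n_subjects, k_raters) matrix of scores."""
          x = np.asarray(ratings, float)
          n, k = x.shape
          grand = x.mean()
          ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
          ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
          sse = ((x - x.mean(axis=1, keepdims=True)
                    - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
          ms_e = sse / ((n - 1) * (k - 1))                            # residual
          return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

      scores = np.array([[9, 2, 5, 8], [6, 1, 3, 2], [8, 4, 6, 8],
                         [7, 1, 2, 6], [10, 5, 6, 9], [6, 2, 4, 7]])
      print(f"ICC(2,1) = {icc_2_1(scores):.3f}")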

  4. An efficient Neuro-Fuzzy approach to nuclear power plant transient identification

    Energy Technology Data Exchange (ETDEWEB)

    Gomes da Costa, Rafael [Instituto de Engenharia Nuclear - CNEN, Programa de Pos-Graduacao em Ciencia e Tecnologia Nucleares, Via Cinco, s/no, Cidade Universitaria, Rua Helio de Almeida, 75, Postal Box 68550, Zip Code 21941-906 Rio de Janeiro (Brazil); Abreu Mol, Antonio Carlos de, E-mail: mol@ien.gov.br [Instituto de Engenharia Nuclear - CNEN, Programa de Pos-Graduacao em Ciencia e Tecnologia Nucleares, Via Cinco, s/no, Cidade Universitaria, Rua Helio de Almeida, 75, Postal Box 68550, Zip Code 21941-906 Rio de Janeiro (Brazil); Instituto Nacional de C and T de Reatores Nucleares Inovadores (Brazil); Carvalho, Paulo Victor R. de, E-mail: paulov@ien.gov.br [Instituto de Engenharia Nuclear - CNEN, Programa de Pos-Graduacao em Ciencia e Tecnologia Nucleares, Via Cinco, s/no, Cidade Universitaria, Rua Helio de Almeida, 75, Postal Box 68550, Zip Code 21941-906 Rio de Janeiro (Brazil); Lapa, Celso Marcelo Franklin, E-mail: lapa@ien.gov.br [Instituto de Engenharia Nuclear - CNEN, Programa de Pos-Graduacao em Ciencia e Tecnologia Nucleares, Via Cinco, s/no, Cidade Universitaria, Rua Helio de Almeida, 75, Postal Box 68550, Zip Code 21941-906 Rio de Janeiro (Brazil); Instituto Nacional de C and T de Reatores Nucleares Inovadores (Brazil)

    2011-06-15

    Highlights: > We investigate the use of a Neuro-Fuzzy modeling tool for reliable transient identification. > The preliminary transient type identification is done by an artificial neural network. > Afterwards, a fuzzy-logic system analyzes the results, emitting a degree of reliability for the identification. > The research was supported by experiments in a PWR simulator at the Brazilian Nuclear Engineering Institute. > The results show the potential to help operators' decisions in a nuclear power plant. - Abstract: Transient identification in nuclear power plants (NPP) is often a computationally very hard task and may involve a great amount of human cognition. The early identification of unexpected departures from steady-state behavior is an essential step for operation, control and accident management in NPPs. The basis for transient identification relies on the evidence that different system faults and anomalies lead to different pattern evolutions in the involved process variables. During an abnormal event, the operator must monitor a great amount of information from the instruments that represent a specific type of event. Recently, several works have been developed for transient identification. These works frequently present an unreliable response, using 'don't know' as the system output. In this work, we investigate the possibility of using a Neuro-Fuzzy modeling tool for efficient transient identification, aiming to help the operator crew take decisions on the procedure to be followed in situations of accidents/transients at NPPs. The proposed system uses artificial neural networks (ANN) as a first-level transient diagnostic. After the ANN has done the preliminary transient type identification, a fuzzy-logic system analyzes the results, emitting a degree of reliability for the identification. A validation of this identification system was made at the three-loop Pressurized Water Reactor (PWR) simulator of the Human-System Interface Laboratory (LABIHS) of the Nuclear Engineering Institute
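
    The two-stage architecture (an ANN proposing a transient class, then a fuzzy stage grading how much to trust it) can be sketched as follows; the class names, thresholds and membership shapes are invented for illustration and are not the paper's rule base.

      import numpy as np

      TRANSIENTS = ["LOCA", "SGTR", "steam-line break", "normal"]

      def fuzzy_reliability(confidence, margin):
          """Map ANN confidence and winner/runner-up margin to a reliability
          degree in [0, 1] with triangular memberships (shapes assumed)."""
          high_conf = np.clip((confidence - 0.5) / 0.4, 0.0, 1.0)
          clear_margin = np.clip(margin / 0.3, 0.0, 1.0)
          return float(min(high_conf, clear_margin))  # fuzzy AND (min t-norm)

      softmax = np.array([0.72, 0.18, 0.06, 0.04])  # hypothetical ANN output
      cls = int(np.argmax(softmax))
      margin = softmax[cls] - np.sort(softmax)[-2]
      rel = fuzzy_reliability(softmax[cls], margin)
      label = TRANSIENTS[cls] if rel > 0.5 else "don't know"
      print(f"class = {label}, reliability degree = {rel:.2f}")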

  5. An efficient Neuro-Fuzzy approach to nuclear power plant transient identification

    International Nuclear Information System (INIS)

    Gomes da Costa, Rafael; Abreu Mol, Antonio Carlos de; Carvalho, Paulo Victor R. de; Lapa, Celso Marcelo Franklin

    2011-01-01

    Highlights: → We investigate the use of a Neuro-Fuzzy modeling tool for reliable transient identification. → The preliminary transient type identification is done by an artificial neural network. → Afterwards, a fuzzy-logic system analyzes the results, emitting a degree of reliability for the identification. → The research was supported by experiments in a PWR simulator at the Brazilian Nuclear Engineering Institute. → The results show the potential to help operators' decisions in a nuclear power plant. - Abstract: Transient identification in nuclear power plants (NPP) is often a computationally very hard task and may involve a great amount of human cognition. The early identification of unexpected departures from steady-state behavior is an essential step for operation, control and accident management in NPPs. The basis for transient identification relies on the evidence that different system faults and anomalies lead to different pattern evolutions in the involved process variables. During an abnormal event, the operator must monitor a great amount of information from the instruments that represent a specific type of event. Recently, several works have been developed for transient identification. These works frequently present an unreliable response, using 'don't know' as the system output. In this work, we investigate the possibility of using a Neuro-Fuzzy modeling tool for efficient transient identification, aiming to help the operator crew take decisions on the procedure to be followed in situations of accidents/transients at NPPs. The proposed system uses artificial neural networks (ANN) as a first-level transient diagnostic. After the ANN has done the preliminary transient type identification, a fuzzy-logic system analyzes the results, emitting a degree of reliability for the identification. A validation of this identification system was made at the three-loop Pressurized Water Reactor (PWR) simulator of the Human-System Interface Laboratory (LABIHS) of the Nuclear Engineering Institute (IEN

  6. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

    Roughly ~50% of the human genome contains noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well studied category of regulatory elements is the category of enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an interesting area of research and up to now several approaches have been proposed. However, the current state-of-the-art methods face limitations since, although the function of enhancers is clarified, their mechanism of function is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks that we present in different chapters. First, since many of the enhancers' functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and we survey comprehensively over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze advantages and disadvantages of existing solutions and we report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell-lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancers' content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational

  7. Developing a personal computer based expert system for radionuclide identification

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Hakulinen, T.T.

    1990-01-01

    Several expert system development tools are available for personal computers today. We have used one of the LISP-based high-end tools for nearly two years in developing an expert system for the identification of gamma sources. The system contains a radionuclide database of 2055 nuclides and 48000 gamma transitions, with a knowledge base of about sixty rules. This application combines a LISP-based inference engine with database management and relatively heavy numerical calculations performed in the C language. The most important feature needed has been the possibility to use LISP and C together with the more advanced object-oriented features of the development tool. The main difficulties have been long response times and the large amount (10-16 MB) of computer memory required
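
    The core inference step of such a system, matching measured gamma-ray energies against a nuclide line database within a tolerance and scoring the candidates, can be illustrated compactly (here in Python rather than the LISP/C mix described, with a toy three-nuclide database and a heuristic score of our own):

      # Toy line database: nuclide -> [(energy_keV, emission_probability), ...]
      LINES = {
          "Co-60":  [(1173.2, 0.999), (1332.5, 1.000)],
          "Cs-137": [(661.7, 0.851)],
          "Na-22":  [(511.0, 1.807), (1274.5, 0.999)],
      }

      def identify(peaks_kev, tol=1.5):
          """Rank nuclides by summed emission probability of matched lines,
          weighted by the fraction of their lines found (illustrative rule)."""
          scores = {}
          for nuclide, lines in LINES.items():
              matched = [p for e, p in lines
                         if any(abs(e - pk) <= tol for pk in peaks_kev)]
              if matched:
                  scores[nuclide] = sum(matched) * len(matched) / len(lines)
          return sorted(scores.items(), key=lambda kv: -kv[1])

      print(identify([661.5, 1173.0, 1332.8]))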

  8. On-line validation of safety parameters and fault identification

    International Nuclear Information System (INIS)

    Tzanos, C.P.

    1985-01-01

    In many safety-significant off-normal events, the reliability of failure identification and corrective operator actions is limited greatly by the large amount of data that has to be processed and analyzed mentally in a very short time and in a high-stress environment. A data-validation and fault-identification system, that uses computers for continuous plant-information processing and analysis, can enhance plant safety and also improve plant availability. A methodology has been developed that provides validation of safety-significant plant parameter measurements, plant state verification, and fault identification in the presence of many instrumentation failures (including multiple common-cause failures). This paper presents this methodology and some results of its application to a reference LMFBR plant. The basic features of this methodology and the results of its application are summarized

  9. Virtualization of Legacy Instrumentation Control Computers for Improved Reliability, Operational Life, and Management.

    Science.gov (United States)

    Katz, Jonathan E

    2017-01-01

    Laboratories tend to be amenable environments for long-term reliable operation of scientific measurement equipment. Indeed, it is not uncommon to find equipment 5, 10, or even 20+ years old still being routinely used in labs. Unfortunately, the Achilles heel for many of these devices is the control/data acquisition computer. Often these computers run older operating systems (e.g., Windows XP) and, while they might only use standard network, USB or serial ports, they require proprietary software to be installed. Even if the original installation disks can be found, reinstallation is a burdensome process fraught with "gotchas" that can derail it: lost license keys, incompatible hardware, forgotten configuration settings, etc. If you have legacy instrumentation running, the computer is the ticking time bomb waiting to put a halt to your operation. In this chapter, I describe how to virtualize your currently running control computer. This virtualized computer "image" is easy to maintain, easy to back up and easy to redeploy. I have used this multiple times in my own lab to greatly improve the robustness of my legacy devices. After completing the steps in this chapter, you will have your original control computer as well as a virtual instance of that computer with all the software installed, ready to control your hardware should your original computer ever be decommissioned.

  10. Strength and Reliability of Wood for the Components of Low-cost Wind Turbines: Computational and Experimental Analysis and Applications

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Freere, Peter; Sharma, Ranjan

    2009-01-01

    This paper reports the latest results of the comprehensive program of experimental and computational analysis of strength and reliability of wooden parts of low cost wind turbines. The possibilities of prediction of strength and reliability of different types of wood are studied in the series of experiments and computational investigations. Low cost testing machines have been designed, and employed for the systematic analysis of different sorts of Nepali wood, to be used for the wind turbine construction. At the same time, computational micromechanical models of deformation and strength of wood are developed, which should provide the basis for microstructure-based correlating of observable and service properties of wood. Some correlations between microstructure, strength and service properties of wood have been established.

  11. Reliability of construction materials

    International Nuclear Information System (INIS)

    Merz, H.

    1976-01-01

    One can also speak of reliability with respect to materials. While for the reliability of components the MTBF (mean time between failures) is regarded as the main criterion, this is replaced with regard to materials by possible failure mechanisms like physical/chemical reaction mechanisms, disturbances of physical or chemical equilibrium, or other interactions or changes of system. The main tasks of the reliability analysis of materials are therefore the prediction of the various causes of failure, the identification of interactions, and the development of nondestructive testing methods. (RW) [de

  12. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  13. Quantitative software-reliability analysis of computer codes relevant to nuclear safety

    International Nuclear Information System (INIS)

    Mueller, C.J.

    1981-12-01

    This report presents the results of the first year of an ongoing research program to determine the probability of failure characteristics of computer codes relevant to nuclear safety. An introduction to both qualitative and quantitative aspects of nuclear software is given. A mathematical framework is presented which will enable the a priori prediction of the probability of failure characteristics of a code given the proper specification of its properties. The framework consists of four parts: (1) a classification system for software errors and code failures; (2) probabilistic modeling for selected reliability characteristics; (3) multivariate regression analyses to establish predictive relationships among reliability characteristics and generic code property and development parameters; and (4) the associated information base. Preliminary data of the type needed to support the modeling and the predictions of this program are described. Illustrations of the use of the modeling are given but the results so obtained, as well as all results of code failure probabilities presented herein, are based on data which at this point are preliminary, incomplete, and possibly non-representative of codes relevant to nuclear safety

  14. Reliability of a structured interview for admission to an emergency medicine residency program.

    Science.gov (United States)

    Blouin, Danielle

    2010-10-01

    Interviews are most important in resident selection. Structured interviews are more reliable than unstructured ones. We sought to measure the interrater reliability of a newly designed structured interview during the selection process to an Emergency Medicine residency program. The critical incident technique was used to extract the desired dimensions of performance. The interview tool consisted of 7 clinical scenarios and 1 global rating. Three trained interviewers marked each candidate on all scenarios without discussing candidates' responses. Interitem consistency and estimates of variance were computed. Twenty-eight candidates were interviewed. The generalizability coefficient was 0.67. Removing the central tendency ratings increased the coefficient to 0.74. Coefficients of interitem consistency ranged from 0.64 to 0.74. The structured interview tool provided good although suboptimal interrater reliability. Increasing the number of scenarios improves reliability as does applying differential weights to the rating scale anchors. The latter would also facilitate the identification of those candidates with extreme ratings.
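
    Interitem consistency coefficients of the kind reported (0.64-0.74) are typically Cronbach's alpha over the scenario ratings; assuming that is the statistic used, the computation is brief:

      import numpy as np

      def cronbach_alpha(scores):
          """Cronbach's alpha for an (n_candidates, n_items) score matrix."""
          x = np.asarray(scores, float)
          k = x.shape[1]
          return k / (k - 1) * (1.0 - x.var(axis=0, ddof=1).sum()
                                / x.sum(axis=1).var(ddof=1))

      # Hypothetical ratings: 6 candidates x 4 scenarios on a 7-point scale
      ratings = np.array([[5, 6, 5, 6], [3, 4, 3, 3], [6, 6, 7, 6],
                          [4, 4, 5, 4], [2, 3, 2, 3], [5, 5, 6, 5]])
      print(f"alpha = {cronbach_alpha(ratings):.2f}")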

  15. Diagnostic reliability of the cervical vertebral maturation method and standing height in the identification of the mandibular growth spurt.

    Science.gov (United States)

    Perinetti, Giuseppe; Contardo, Luca; Castaldo, Attilio; McNamara, James A; Franchi, Lorenzo

    2016-07-01

    To evaluate the capability of both cervical vertebral maturation (CVM) stages 3 and 4 (CS3-4 interval) and the peak in standing height to identify the mandibular growth spurt through diagnostic reliability analysis. A previous longitudinal data set derived from 24 untreated growing subjects (15 females and nine males), detailed elsewhere, was reanalyzed. Mandibular growth was defined as annual increments in Condylion (Co)-Gnathion (Gn) (total mandibular length) and Co-Gonion Intersection (Goi) (ramus height) and their arithmetic mean (mean mandibular growth [mMG]). Subsequently, individual annual increments in standing height, Co-Gn, Co-Goi, and mMG were arranged according to annual age intervals, with the first and last intervals defined as 7-8 years and 15-16 years, respectively. An analysis was performed to establish the diagnostic reliability of the CS3-4 interval or the peak in standing height in the identification of the maximum individual increments of each Co-Gn, Co-Goi, and mMG measurement at each annual age interval. CS3-4 and the standing height peak show similar but variable accuracy across annual age intervals, registering values between 0.61 (standing height peak, Co-Gn) and 0.95 (standing height peak and CS3-4, mMG). Generally, satisfactory diagnostic reliability was seen when the mandibular growth spurt was identified on the basis of the Co-Goi and mMG increments. Both the CVM interval CS3-4 and the peak in standing height may be used in routine clinical practice to enhance the efficiency of treatments requiring identification of the mandibular growth spurt.

  16. Online Identification with Reliability Criterion and State of Charge Estimation Based on a Fuzzy Adaptive Extended Kalman Filter for Lithium-Ion Batteries

    Directory of Open Access Journals (Sweden)

    Zhongwei Deng

    2016-06-01

    Full Text Available In the field of state of charge (SOC) estimation, the Kalman filter has been widely used for many years, although its performance strongly depends on the accuracy of the battery model as well as the noise covariance. The Kalman gain determines the confidence coefficient of the battery model by adjusting the weight of the open circuit voltage (OCV) correction, and has a strong correlation with the measurement noise covariance (R). In this paper, an online identification method is applied to acquire the real model parameters under different operating conditions. A criterion based on the OCV error is proposed to evaluate the reliability of the online parameters. Besides, the equivalent circuit model produces an intrinsic model error which is dependent on the load current, and it can be observed that a high battery current or a large current change induces a large model error. Based on this prior knowledge, a fuzzy model is established to compensate for the model error by updating R. Combining the positive strategy (i.e., online identification) and the negative strategy (i.e., the fuzzy model), a more reliable and robust SOC estimation algorithm is proposed. The experimental results verify the proposed reliability criterion and SOC estimation method under various conditions for LiFePO4 batteries.
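
    In its simplest scalar form, the mechanism described, trusting the model less when its error is expected to be large, amounts to inflating R before the Kalman gain is computed. The sketch below uses a crude linear OCV curve and a quadratic current-dependent R as stand-ins for the paper's equivalent-circuit and fuzzy models; every constant is assumed.

      import numpy as np

      def ekf_soc_step(soc, P, i_amps, v_meas, dt, Q=1e-7, R_base=1e-3,
                       capacity_As=3600.0 * 2.3):
          """One scalar EKF step for SOC (discharge current positive)."""
          soc_pred = soc - i_amps * dt / capacity_As   # coulomb counting
          P_pred = P + Q
          ocv = 3.0 + 0.4 * soc_pred                   # crude OCV curve (assumed)
          H = 0.4                                      # d(OCV)/d(SOC)
          # Stand-in for the fuzzy model: trust the model less at high current
          R = R_base * (1.0 + (abs(i_amps) / 10.0) ** 2)
          K = P_pred * H / (H * P_pred * H + R)        # Kalman gain
          soc_new = soc_pred + K * (v_meas - ocv)      # OCV correction
          P_new = (1.0 - K * H) * P_pred
          return float(np.clip(soc_new, 0.0, 1.0)), P_new

      soc, P = 0.90, 1e-4
      soc, P = ekf_soc_step(soc, P, i_amps=2.3, v_meas=3.34, dt=1.0)
      print(f"SOC estimate: {soc:.4f}")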

  17. Reliability of the spent fuel identification for flask loading procedure used by COGEMA for fuel transport to La Hague

    International Nuclear Information System (INIS)

    Eid, M.; Zachar, M.; Pretesacque, P.

    1991-01-01

    The Spent Fuel Identification for Flask Loading (SFIFL) procedure designed by COGEMA is analysed and its reliability calculated. The reliability of the procedure is defined as the probability of transporting only approved fuel elements for a given number of shipments. The procedure describes a non-coherent system. A non-coherent system is one in which two successive failures could result in a success, from the system mission point of view. A technique that describes the system with the help of its maximal cuts (states) is used for the calculations. A maximal cut that contains more than one failure can split into two cuts (sub-states). Cut splitting enables us to analyse, in a systematic way, non-coherent systems with independent basic components. (author)
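
    Non-coherence means the structure function is not monotone: an additional failure can restore mission success (the compensating-error case). With independent basic events the mission probability can still be obtained by enumerating component states, as in this toy sketch whose structure function and failure probabilities are invented, not COGEMA's procedure:

      from itertools import product

      # Per-shipment failure probabilities of three independent checks (assumed)
      P_FAIL = [1e-3, 5e-4, 2e-3]

      def mission_ok(f1, f2, f3):
          """Toy non-coherent structure function: a failure of check 1 is
          masked when check 2 also fails (two failures yield a success)."""
          if f1 and f2:
              return not f3
          return not (f1 or f3)

      def per_shipment_reliability():
          r = 0.0
          for state in product([False, True], repeat=3):
              p = 1.0
              for failed, pf in zip(state, P_FAIL):
                  p *= pf if failed else 1.0 - pf
              if mission_ok(*state):
                  r += p
          return r

      r, n = per_shipment_reliability(), 200
      print(f"per-shipment reliability {r:.6f}; over {n} shipments {r**n:.4f}")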

  18. Reliability of the spent fuel identification for flask loading procedure used by COGEMA for fuel transport to La Hague

    International Nuclear Information System (INIS)

    Eid, M.; Zachar, M.; Pretesacque, P.

    1990-01-01

    The Spent Fuel Identification for Flask Loading (SFIFL) procedure designed by COGEMA is analysed and its reliability is calculated. The reliability of the procedure is defined as the probability of transporting only approved fuel elements for a given number of shipments. The procedure describes a non-coherent system. A non-coherent system is one in which two successive failures could result in a success, from the system mission point of view. A technique that describes the system with the help of its maximal cuts (states) is used for the calculations. A maximal cut that contains more than one failure can split into two cuts (sub-states). Cut splitting enables us to analyse, in a systematic way, non-coherent systems with independent basic components. (author)

  19. Risk-based Optimization and Reliability Levels of Coastal Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    Identification of optimum reliability levels for coastal structures is considered. A class of breakwaters is considered where no human injuries can be expected in cases of failure. The optimum reliability level is identified by minimizing the total costs over the service life of the structure. The influence on the minimum-cost reliability levels is investigated for different values of the real rate of interest, the service lifetime, the downtime costs due to malfunction and the decommission costs.

  20. Risk-based Optimization and Reliability Levels of Coastal Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, Hans F.

    2005-01-01

    Identification of optimum reliability levels for coastal structures is considered. A class of breakwaters is considered where no human injuries can be expected in cases of failure. The optimum reliability level is identified by minimizing the total costs over the service life of the structure. The influence on the minimum-cost reliability levels is investigated for different values of the real rate of interest, the service lifetime, the downtime costs due to malfunction and the decommission costs.

  1. Improved Targeting Through Collaborative Decision-Making and Brain Computer Interfaces

    Science.gov (United States)

    Stoica, Adrian; Barrero, David F.; McDonald-Maier, Klaus

    2013-01-01

    This paper reports a first step toward a brain-computer interface (BCI) for collaborative targeting. Specifically, we explore, from a broad perspective, how the collaboration of a group of people can increase performance on a simple target identification task. To this end, we asked a group of people to identify the location and color of a sequence of targets appearing on the screen and measured the time and accuracy of the response. The individual results are compared to a collective identification result determined by simple majority voting, with a random choice in case of a draw. The results are promising, as the identification becomes significantly more reliable even with this simple voting scheme and a small number of people (whether odd or even) involved in the decision. In addition, the paper briefly analyzes the role of brain-computer interfaces in collaborative targeting, extending the targeting task by using a BCI instead of a mechanical response.
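
    Why simple majority voting helps is classic Condorcet-jury reasoning: if each of n independent observers is correct with probability p > 0.5, the probability that the majority is correct grows with n. A quick binomial computation (illustrative, not the paper's analysis):

      from math import comb

      def majority_accuracy(p, n):
          """P(majority of n independent voters is correct); ties on even n
          are broken by a random choice, as in the paper's voting rule."""
          acc = 0.0
          for k in range(n + 1):
              pk = comb(n, k) * p**k * (1 - p) ** (n - k)
              if 2 * k > n:
                  acc += pk
              elif 2 * k == n:
                  acc += 0.5 * pk
          return acc

      for n in (1, 2, 3, 5, 9):
          print(f"n={n}: {majority_accuracy(0.75, n):.3f}")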

  2. Pore sub-features reproducibility in direct microscopic and Livescan images--their reliability in personal identification.

    Science.gov (United States)

    Gupta, Abhishek; Sutton, Raul

    2010-07-01

    Third level features have been reported to have discriminatory power equal to that of second level details in establishing personal identification. Pore area, as an extended-set third level sub-feature, has been studied while minimizing possible factors that could affect pore size. The reproducibility of pore surface area has been studied using direct microscopic and 500 ppi Livescan images. Direct microscopic pore area measurements indicated that the day on which the pore area was measured had a significant impact on the measured pore area. Pore area was shown to be difficult to estimate in 500 ppi Livescan images owing to lack of resolution. It is therefore not possible to reliably use pore area as an identifying feature in fingerprint examination.

  3. Review of advances in human reliability analysis of errors of commission, Part 1: EOC identification

    International Nuclear Information System (INIS)

    Reer, Bernhard

    2008-01-01

    In close connection with examples relevant to contemporary probabilistic safety assessment (PSA), a review of advances in human reliability analysis (HRA) of post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions, has been carried out. The review comprises both EOC identification (part 1) and quantification (part 2); part 1 is presented in this article. Emerging HRA methods addressing the problem of EOC identification are: A Technique for Human Event Analysis (ATHEANA), the EOC HRA method developed by Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), the Misdiagnosis Tree Analysis (MDTA) method, and the Commission Errors Search and Assessment (CESA) method. Most of the EOCs referred to in predictive studies comprise the stop of running or the inhibition of anticipated functions; a few comprise the start of a function. The CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios and uses procedures and importance measures as key sources of input information, provides a formalized way of identifying relatively important scenarios with EOC opportunities. In the implementation, however, attention should be paid to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions.

  4. Advanced and intelligent computations in diagnosis and control

    CERN Document Server

    2016-01-01

    This book is devoted to the demands of research and industrial centers for diagnostics, monitoring and decision making systems that result from the increasing complexity of automation and systems, the need to ensure the highest level of reliability and safety, and continuing research and the development of innovative approaches to fault diagnosis. The contributions combine domains of engineering knowledge for diagnosis, including detection, isolation, localization, identification, reconfiguration and fault-tolerant control. The book is divided into six parts:  (I) Fault Detection and Isolation; (II) Estimation and Identification; (III) Robust and Fault Tolerant Control; (IV) Industrial and Medical Diagnostics; (V) Artificial Intelligence; (VI) Expert and Computer Systems.

  5. Reliability of trajectory identification for cosmic heavy ions and cytogenetic effects of their passage through plant seeds

    International Nuclear Information System (INIS)

    Facius, R.; Reitz, G.; Buecker, H.; Nevzgodina, L.V.; Maximova, E.N.; Kaminskaya, E.V.; Virkov, A.I.; Marenny, A.M.; Akatov, Yu.A.

    1990-01-01

    The potentially specific importance of the study of heavy ions from galactic cosmic rays for the understanding of radiation protection in manned spaceflight continues to stimulate spaceflight experiments in order to investigate the radiobiological properties of these ions. Chromosome aberrations as an expression of a direct assault on the genome are of particular interest in view of carcinogenesis as the primary radiation risk for man in space. An essential technical ingredient of such spaceflight experiments is the visual nuclear track detector which permits identification of those biological test organisms which have been affected by cosmic heavy ions. We describe such a technique and report on an analysis of the qualitative and quantitative reliability of this identification of particle trajectories in layers of biological test organisms. The incidence of chromosome aberrations in cells of lettuce seeds, Lactuca sativa, exposed during the Kosmos 1887 mission, was determined for seeds hit by cosmic heavy ions. In those seeds the incidence of both single and multiple chromosome aberrations was enhanced. (author)

  6. Reliability of trajectory identification for cosmic heavy ions and cytogenetic effects of their passage through plant seeds

    Energy Technology Data Exchange (ETDEWEB)

    Facius, R.; Reitz, G.; Buecker, H. (Deutsche Forschungsanstalt fuer Luft- und Raumfahrt e.V. (DLR), Koeln (Germany, F.R.)); Nevzgodina, L.V.; Maximova, E.N.; Kaminskaya, E.V.; Virkov, A.I.; Marenny, A.M.; Akatov, Yu.A. (Ministry of Public Health, Moscow (USSR). Inst. of Biomedical Problems)

    1990-01-01

    The potentially specific importance of the study of heavy ions from galactic cosmic rays for the understanding of radiation protection in manned spaceflight continues to stimulate spaceflight experiments in order to investigate the radiobiological properties of these ions. Chromosome aberrations as an expression of a direct assault on the genome are of particular interest in view of carcinogenesis as the primary radiation risk for man in space. An essential technical ingredient of such spaceflight experiments is the visual nuclear track detector which permits identification of those biological test organisms which have been affected by cosmic heavy ions. We describe such a technique and report on an analysis of the qualitative and quantitative reliability of this identification of particle trajectories in layers of biological test organisms. The incidence of chromosome aberrations in cells of lettuce seeds, Lactuca sativa, exposed during the Kosmos 1887 mission, was determined for seeds hit by cosmic heavy ions. In those seeds the incidence of both single and multiple chromosome aberrations was enhanced. (author).

  7. System reliability analysis using dominant failure modes identified by selective searching technique

    International Nuclear Information System (INIS)

    Kim, Dong-Seok; Ok, Seung-Yong; Song, Junho; Koh, Hyun-Moo

    2013-01-01

    The failure of a redundant structural system is often described by innumerable system failure modes such as combinations or sequences of local failures. An efficient approach is proposed to identify dominant failure modes in the space of random variables, and then perform system reliability analysis to compute the system failure probability. To identify dominant failure modes in the decreasing order of their contributions to the system failure probability, a new simulation-based selective searching technique is developed using a genetic algorithm. The system failure probability is computed by a multi-scale matrix-based system reliability (MSR) method. Lower-scale MSR analyses evaluate the probabilities of the identified failure modes and their statistical dependence. A higher-scale MSR analysis evaluates the system failure probability based on the results of the lower-scale analyses. Three illustrative examples demonstrate the efficiency and accuracy of the approach through comparison with existing methods and Monte Carlo simulations. The results show that the proposed method skillfully identifies the dominant failure modes, including those neglected by existing approaches. The multi-scale MSR method accurately evaluates the system failure probability with statistical dependence fully considered. The decoupling between the failure mode identification and the system reliability evaluation allows for effective applications to larger structural systems

  8. Cloud Computing and Risk: A look at the EU and the application of the Data Protection Directive to cloud computing

    OpenAIRE

    Victoria Ostrzenski

    2013-01-01

    The use of cloud services for the management of records presents many challenges, both in terms of the particulars of data security as well as the need to sustain and ensure the greater reliability, authenticity, and accuracy of records. Properly grappling with these concerns requires the development of more specifically applicable and effective binding legislation; an important first step is the examination and identification of the risks specific to cloud computing, coupled with an evaluation...

  9. Integrating reliability analysis and design

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1980-10-01

    This report describes the Interactive Reliability Analysis Project and demonstrates the advantages of using computer-aided design systems (CADS) in reliability analysis. Common cause failure problems require presentations of systems, analysis of fault trees, and evaluation of solutions to these. Results have to be communicated between the reliability analyst and the system designer. Using a computer-aided design system saves time and money in the analysis of design. Computer-aided design systems lend themselves to cable routing, valve and switch lists, pipe routing, and other component studies. At EG and G Idaho, Inc., the Applicon CADS is being applied to the study of water reactor safety systems

  10. Chest computed tomography-based scoring of thoracic sarcoidosis: Inter-rater reliability of CT abnormalities

    Energy Technology Data Exchange (ETDEWEB)

    Heuvel, D.A.V. den; Es, H.W. van; Heesewijk, J.P. van; Spee, M. [St. Antonius Hospital Nieuwegein, Department of Radiology, Nieuwegein (Netherlands); Jong, P.A. de [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Zanen, P.; Grutters, J.C. [University Medical Center Utrecht, Division Heart and Lungs, Utrecht (Netherlands); St. Antonius Hospital Nieuwegein, Center of Interstitial Lung Diseases, Department of Pulmonology, Nieuwegein (Netherlands)

    2015-09-15

    To determine inter-rater reliability of sarcoidosis-related computed tomography (CT) findings that can be used for scoring of thoracic sarcoidosis. CT images of 51 patients with sarcoidosis were scored by five chest radiologists for various abnormal CT findings (22 in total) encountered in thoracic sarcoidosis. Using intra-class correlation coefficient (ICC) analysis, inter-rater reliability was analysed and reported according to the Guidelines for Reporting Reliability and Agreement Studies (GRRAS) criteria. A pre-specified sub-analysis was performed to investigate the effect of training. Scoring was trained in a distinct set of 15 scans in which all abnormal CT findings were represented. Median age of the 51 patients (36 men, 70 %) was 43 years (range 26 - 64 years). All radiographic stages were present in this group. ICC ranged from 0.91 for honeycombing to 0.11 for nodular margin (sharp versus ill-defined). The ICC was above 0.60 in 13 of the 22 abnormal findings. Sub-analysis for the best-trained observers demonstrated an ICC improvement for all abnormal findings and values above 0.60 for 16 of the 22 abnormalities. In our cohort, reliability between raters was acceptable for 16 thoracic sarcoidosis-related abnormal CT findings. (orig.)

  11. Chest computed tomography-based scoring of thoracic sarcoidosis: Inter-rater reliability of CT abnormalities

    International Nuclear Information System (INIS)

    Heuvel, D.A.V. den; Es, H.W. van; Heesewijk, J.P. van; Spee, M.; Jong, P.A. de; Zanen, P.; Grutters, J.C.

    2015-01-01

    To determine inter-rater reliability of sarcoidosis-related computed tomography (CT) findings that can be used for scoring of thoracic sarcoidosis. CT images of 51 patients with sarcoidosis were scored by five chest radiologists for various abnormal CT findings (22 in total) encountered in thoracic sarcoidosis. Using intra-class correlation coefficient (ICC) analysis, inter-rater reliability was analysed and reported according to the Guidelines for Reporting Reliability and Agreement Studies (GRRAS) criteria. A pre-specified sub-analysis was performed to investigate the effect of training. Scoring was trained in a distinct set of 15 scans in which all abnormal CT findings were represented. Median age of the 51 patients (36 men, 70 %) was 43 years (range 26 - 64 years). All radiographic stages were present in this group. ICC ranged from 0.91 for honeycombing to 0.11 for nodular margin (sharp versus ill-defined). The ICC was above 0.60 in 13 of the 22 abnormal findings. Sub-analysis for the best-trained observers demonstrated an ICC improvement for all abnormal findings and values above 0.60 for 16 of the 22 abnormalities. In our cohort, reliability between raters was acceptable for 16 thoracic sarcoidosis-related abnormal CT findings. (orig.)

  12. Rapid and reliable detection and identification of GM events using multiplex PCR coupled with oligonucleotide microarray.

    Science.gov (United States)

    Xu, Xiaodan; Li, Yingcong; Zhao, Heng; Wen, Si-yuan; Wang, Sheng-qi; Huang, Jian; Huang, Kun-lun; Luo, Yun-bo

    2005-05-18

    To devise a rapid and reliable method for the detection and identification of genetically modified (GM) events, we developed a multiplex polymerase chain reaction (PCR) coupled with a DNA microarray system simultaneously aiming at many targets in a single reaction. The system included probes for screening gene, species reference gene, specific gene, construct-specific gene, event-specific gene, and internal and negative control genes. 18S rRNA was combined with species reference genes as internal controls to assess the efficiency of all reactions and to eliminate false negatives. Two sets of the multiplex PCR system were used to amplify four and five targets, respectively. Eight different structure genes could be detected and identified simultaneously for Roundup Ready soybean in a single microarray. The microarray specificity was validated by its ability to discriminate two GM maizes Bt176 and Bt11. The advantages of this method are its high specificity and greatly reduced false-positives and -negatives. The multiplex PCR coupled with microarray technology presented here is a rapid and reliable tool for the simultaneous detection of GM organism ingredients.

  13. Development of the method of aggregation to determine the current storage area using computer vision and radiofrequency identification

    Science.gov (United States)

    Astafiev, A.; Orlov, A.; Privezencev, D.

    2018-01-01

    The article is devoted to the development of technology and software for the construction of positioning and control systems in industrial plants, based on aggregation of computer vision and radio-frequency identification to determine the current storage area. It describes the design of the hardware for an industrial-product positioning system on the territory of a plant on the basis of a radio-frequency grid, as well as the design of the hardware for a positioning system based on computer vision methods. It then describes the development of the aggregation method that combines computer vision and radio-frequency identification to determine the current storage area. Experimental studies in laboratory and production conditions have been conducted and are described in the article.

  14. A structural approach to constructing perspective efficient and reliable human-computer interfaces

    International Nuclear Information System (INIS)

    Balint, L.

    1989-01-01

    The principles of human-computer interface (HCI) realization are investigated with the aim of getting closer to a general framework and thus to a more or less solid basis for constructing perspective, efficient, reliable and cost-effective human-computer interfaces. On the basis of characterizing and classifying the different HCI solutions, the fundamental problems of interface construction are pointed out, especially with respect to the possibilities of human error. The evolution of HCI realizations is illustrated by summarizing the main properties of past, present and foreseeable future interface generations. HCI modeling is pointed out to be a crucial problem in theoretical and practical investigations. Suggestions are presented concerning HCI structure (hierarchy and modularity), HCI functional dynamics (mapping from input to output information), minimization of system failures caused by human error (error tolerance, error recovery and error correction), as well as cost-effective HCI design and realization methodology (universal and application-oriented vs. application-specific solutions). The concept of RISC-based and SCAMP-type HCI components is introduced with the aim of having a reduced interaction scheme in communication and a well-defined architecture in the internal structure of HCI components. HCI efficiency and reliability are dealt with by taking into account complexity and flexibility. The application of fast computerized prototyping is also briefly investigated as an experimental means of achieving simple, parametrized, invariant HCI models. Finally, a concise outline of an approach to constructing ideal HCIs is suggested, emphasizing the open questions and the need for future work related to the proposals. (author). 14 refs, 6 figs

  15. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic, because computer accuracy is limited. Inaccuracy can arise in different ways: for example, an error may be made when subtracting two numbers that are very close to each other, or when summing many numbers of very different magnitudes. The basic objective of this paper is to find a procedure which eliminates errors made by a PC when calculations close to an error limit are executed. A highly reliable system is represented by a directed acyclic graph composed of terminal nodes (the highly reliable input elements), internal nodes representing subsystems, and edges that bind all of these nodes. Three admissible unavailability models of terminal nodes are introduced, including both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes builds on the merits of MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure applied to a graph structure, which considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires summation of many very different non-negative numbers, which may be a source of inaccuracy. That is why another algorithm, for exact summation of such numbers, is designed in the paper. The summation procedure uses the benefits of a special number system whose base is the value 2^32. Computational efficiency of the new computing methodology is compared with advanced simulation software. Various calculations on systems from references are performed to emphasize the merits of the methodology.
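
    The summation step is the part most readily illustrated in code. The paper's algorithm works in a positional number system with base 2^32; as a minimal sketch of the same idea (not the paper's MATLAB implementation), Python's exact rational arithmetic can accumulate many numbers of wildly different magnitude without round-off. The values below are made up for illustration:

      from fractions import Fraction

      def exact_sum(values):
          # Each binary float converts to Fraction losslessly, every
          # addition is exact, and only the final result is rounded
          # back to a float.
          total = Fraction(0)
          for v in values:
              total += Fraction(v)
          return float(total)

      vals = [1.0] + [1e-19] * 10_000   # naive summation loses the small terms
      print(sum(vals))                  # 1.0 -- the 1e-19 contributions vanish
      print(exact_sum(vals))            # 1.000000000000001 -- exact to 1 ulp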

  16. Appraisal of the PREP, KITT, and SAMPLE computer codes for the evaluation of the reliability characteristics of engineered systems

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, P; White, R F

    1976-01-01

    For the probabilistic approach to reactor safety assessment by the use of event tree and fault tree techniques it is essential to be able to estimate the probabilities of failure of the various engineered safety features provided to mitigate the effects of postulated accident sequences. The PREP, KITT and SAMPLE computer codes, which incorporate Kinetic Tree Theory, perform these calculations and have been used extensively to evaluate the reliability characteristics of engineered safety features of American nuclear reactors. Working versions of these computer codes are now available in SRD, and this report explains the merits, capabilities and ease of application of the PREP, KITT, and SAMPLE programs for the solution of system reliability problems.

  17. Time domain system identification of longitudinal dynamics of single rotor model helicopter using sidpac

    International Nuclear Information System (INIS)

    Khaizer, A.N.; Hussain, I.

    2015-01-01

    This paper presents a time-domain approach for identification of the longitudinal dynamics of a single-rotor model helicopter. A frequency-sweep excitation input signal, widely used for state-space linearized models, is applied in the hover flight mode. A fully automated, programmed flight test method provides high-quality flight data for system identification using the computer-controlled flight simulator X-Plane. The flight test data were recorded, analyzed and reduced using the SIDPAC (System Identification Programs for Aircraft) toolbox for MATLAB, resulting in an aerodynamic model of the single-rotor helicopter. Finally, the identified model is validated on a Raptor 30-class model helicopter at hover, showing the reliability of the proposed approach. (author)

  18. Secure Cooperative Spectrum Sensing for the Cognitive Radio Network Using Nonuniform Reliability

    Directory of Open Access Journals (Sweden)

    Muhammad Usman

    2014-01-01

    Both reliable detection of the primary signal in a noisy and fading environment and nullifying the effect of unauthorized users are important tasks in cognitive radio networks. To address these issues, we consider a cooperative spectrum sensing approach where each user is assigned nonuniform reliability based on the sensing performance. Users with poor channel or faulty sensor are assigned low reliability. The nonuniform reliabilities serve as identification tags and are used to isolate users with malicious behavior. We consider a link layer attack similar to the Byzantine attack, which falsifies the spectrum sensing data. Three different strategies are presented in this paper to ignore unreliable and malicious users in the network. Considering only reliable users for global decision improves sensing time and decreases collisions in the control channel. The fusion center uses the degree of reliability as a weighting factor to determine the global decision in scheme I. Schemes II and III consider the unreliability of users, which makes the computations even simpler. The proposed schemes reduce the number of sensing reports and increase the inference accuracy. The advantages of our proposed schemes over conventional cooperative spectrum sensing and the Chair-Varshney optimum rule are demonstrated through simulations.
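
    As a rough sketch of the scheme I idea described above (the fusion center weighting each user's binary local decision by its reliability tag), the following illustrates reliability-weighted fusion. The function name, reliability values and the 0.5 threshold are illustrative assumptions, not taken from the paper:

      import numpy as np

      def weighted_global_decision(local_decisions, reliabilities, threshold=0.5):
          # Each user's binary local decision (1 = primary user present)
          # is weighted by its reliability tag, so unreliable or
          # malicious users barely influence the global decision.
          d = np.asarray(local_decisions, dtype=float)
          w = np.asarray(reliabilities, dtype=float)
          score = np.dot(w, d) / w.sum()   # reliability-weighted vote in [0, 1]
          return int(score >= threshold)

      # Five users; the last two report falsified data but carry low reliability.
      decisions     = [1, 1, 1, 0, 0]
      reliabilities = [0.9, 0.8, 0.7, 0.1, 0.1]
      print(weighted_global_decision(decisions, reliabilities))  # -> 1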

  19. Reliability of real-time computing with radiation data feedback at accidental release

    International Nuclear Information System (INIS)

    Deme, S.; Feher, I.; Lang, E.

    1989-07-01

    At present, the computing method normalized for the telemetric data represents the primary information for deciding on any necessary countermeasures in case of a nuclear reactor accident. The reliability of the results, however, is influenced by the choice of certain parameters that cannot be determined by direct methods. Improperly chosen diffusion parameters would distort the determination of environmental radiation parameters normalized on the basis of the measurements (¹³¹I activity concentration, gamma dose rate) at points lying at a given distance from the measuring stations. Numerical examples for the uncertainties due to the above factors are analyzed. (author) 4 refs.; 14 figs

  20. Structural reliability assessment capability in NESSUS

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessment capability is described. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  1. Reliability parameters of distribution networks components

    Energy Technology Data Exchange (ETDEWEB)

    Gono, R.; Kratky, M.; Rusek, S.; Kral, V. [Technical Univ. of Ostrava (Czech Republic)

    2009-03-11

    This paper presented a framework for the retrieval of parameters from various heterogeneous power system databases. The framework was designed to transform the heterogeneous outage data into a common relational scheme. It was used to retrieve outage data parameters from the Czech and Slovak republics in order to demonstrate its scalability. The reliability computation proceeded in 2 phases: retrieval of component reliability parameters and the reliability computation itself. Reliability rates were determined using component reliability and global reliability indices. Input data for the reliability analysis were retrieved from data on equipment operating under similar conditions, while the probability of failure-free operation was evaluated by determining component status. Anomalies in distribution outage data were described as scheme, attribute, and term differences. Input types consisted of input relations, transformation programs, codebooks, and translation tables. The system was used to successfully retrieve data from 7 distributors in the Czech Republic and Slovak Republic between 2000-2007. The database included 301,555 records. Data were queried using SQL. 29 refs., 2 tabs., 2 figs.

  2. Reliability of a computer software angle tool for measuring spine and pelvic flexibility during the sit-and-reach test.

    Science.gov (United States)

    Mier, Constance M; Shapiro, Belinda S

    2013-02-01

    The purpose of this study was to determine the reliability of a computer software angle tool that measures thoracic (T), lumbar (L), and pelvic (P) angles as a means of evaluating spine and pelvic flexibility during the sit-and-reach (SR) test. Thirty adults performed the SR twice on separate days. The SR test was captured on video and later analyzed for T, L, and P angles using the computer software angle tool. During the test, 3 markers were placed over T1, T12, and L5 vertebrae to identify T, L, and P angles. Intraclass correlation coefficient (ICC) indicated a very high internal consistency (between trials) for T, L, and P angles (0.95-0.99); thus, the average of trials was used for test-retest (between days) reliability. Mean (±SD) values did not differ between days for T (51.0 ± 14.3 vs. 52.3 ± 16.2°), L (23.9 ± 7.1 vs. 23.0 ± 6.9°), or P (98.4 ± 15.6 vs. 98.3 ± 14.7°) angles. Test-retest reliability (ICC) was high for T (0.96) and P (0.97) angles and moderate for L angle (0.84). Both intrarater and interrater reliabilities were high for T (0.95, 0.94) and P (0.97, 0.97) angles and moderate for L angle (0.87, 0.82). Thus, the computer software angle tool is a highly objective method for assessing spine and pelvic flexibility during a video-captured SR test.
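
    The measurement itself reduces to computing an angle from digitized marker coordinates. A minimal sketch of that geometry is shown below; the helper function and the pixel coordinates are hypothetical, and the software tool's actual T, L and P angle definitions may differ:

      import numpy as np

      def angle_deg(a, b, c):
          # Angle at vertex b (degrees) formed by points a-b-c, from 2D
          # marker coordinates digitized off a video frame.
          u, v = np.asarray(a) - b, np.asarray(c) - b
          cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
          return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

      # Markers over T1, T12 and L5 (made-up pixel coordinates): an angle
      # taken at the T12 marker between T1 and L5.
      t1, t12, l5 = (320, 110), (355, 260), (330, 390)
      print(round(angle_deg(t1, t12, l5), 1))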

  3. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs, variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested.

  4. Reliability of Semiautomated Computational Methods for Estimating Tibiofemoral Contact Stress in the Multicenter Osteoarthritis Study

    Directory of Open Access Journals (Sweden)

    Donald D. Anderson

    2012-01-01

    Recent findings suggest that contact stress is a potent predictor of subsequent symptomatic osteoarthritis development in the knee. However, much larger numbers of knees (likely on the order of hundreds, if not thousands) need to be reliably analyzed to achieve the statistical power necessary to clarify this relationship. This study assessed the reliability of new semiautomated computational methods for estimating contact stress in knees from large population-based cohorts. Ten knees of subjects from the Multicenter Osteoarthritis Study were included. Bone surfaces were manually segmented from sequential 1.0 Tesla magnetic resonance imaging slices by three individuals on two nonconsecutive days. Four individuals then registered the resulting bone surfaces to corresponding bone edges on weight-bearing radiographs, using a semiautomated algorithm. Discrete element analysis methods were used to estimate contact stress distributions for each knee. Segmentation and registration reliabilities (day-to-day and interrater) for peak and mean medial and lateral tibiofemoral contact stress were assessed with Shrout-Fleiss intraclass correlation coefficients (ICCs). The segmentation and registration steps of the modeling approach were found to have excellent day-to-day (ICC 0.93–0.99) and good inter-rater reliability (0.84–0.97). This approach for estimating compartment-specific tibiofemoral contact stress appears to be sufficiently reliable for use in large population-based cohorts.

  5. RELIABILITY OF POSITRON EMISSION TOMOGRAPHY-COMPUTED TOMOGRAPHY IN EVALUATION OF TESTICULAR CARCINOMA PATIENTS.

    Science.gov (United States)

    Nikoletić, Katarina; Mihailović, Jasna; Matovina, Emil; Žeravica, Radmila; Srbovan, Dolores

    2015-01-01

    The study was aimed at assessing the reliability of 18F-fluorodeoxyglucose positron emission tomography-computed tomography scanning in the evaluation of testicular carcinoma patients. The study sample consisted of 26 scans performed in 23 patients with testicular carcinoma. According to the pathohistological finding, 14 patients had seminomas, 7 had nonseminomas and 2 patients had a mixed histological type. In 17 patients, the initial treatment was orchiectomy+chemotherapy, 2 patients had orchiectomy+chemotherapy+retroperitoneal lymph node dissection, 3 patients had orchiectomy only and one patient was treated with chemotherapy only. Abnormal computed tomography was the main cause for the oncologist to refer the patient to positron emission tomography-computed tomography scanning (in 19 scans), magnetic resonance imaging abnormalities in 1 scan, a high level of tumor markers in 3, and 3 scans were performed for follow-up. Positron emission tomography-computed tomography imaging results were compared with histological results, other imaging modalities or the clinical follow-up of the patients. Positron emission tomography-computed tomography scans were positive in 6 and negative in 20 patients. In two patients, positron emission tomography-computed tomography was false positive. There were 20 negative positron emission tomography-computed tomography scans performed in 18 patients; one patient was lost for data analysis. Clinically stable disease was confirmed in 18 follow-up scans performed in 16 patients. The values of sensitivity, specificity, accuracy, and positive and negative predictive value were 60%, 95%, 75%, 88% and 90.5%, respectively. The high negative predictive value obtained in our study (90.5%) suggests that there is a small possibility for a patient to have a future relapse after a normal positron emission tomography-computed tomography study. However, since the sensitivity and positive predictive value of the study are rather low, there are limitations of positive

  6. Genome-wide identification of the regulatory targets of a transcription factor using biochemical characterization and computational genomic analysis

    Directory of Open Access Journals (Sweden)

    Jolly Emmitt R

    2005-11-01

    Background: A major challenge in computational genomics is the development of methodologies that allow accurate genome-wide prediction of the regulatory targets of a transcription factor. We present a method for target identification that combines experimental characterization of binding requirements with computational genomic analysis. Results: Our method identified potential target genes of the transcription factor Ndt80, a key transcriptional regulator involved in yeast sporulation, using the combined information of binding affinity, positional distribution, and conservation of the binding sites across multiple species. We have also developed a mathematical approach to compute the false positive rate and the total number of targets in the genome based on the multiple selection criteria. Conclusion: We have shown that combining biochemical characterization and computational genomic analysis leads to accurate identification of the genome-wide targets of a transcription factor. The method can be extended to other transcription factors and can complement other genomic approaches to transcriptional regulation.

  7. Metamodel-based inverse method for parameter identification: elastic-plastic damage model

    Science.gov (United States)

    Huang, Changwu; El Hami, Abdelkhalak; Radi, Bouchaïb

    2017-04-01

    This article proposes a metamodel-based inverse method for material parameter identification and applies it to the identification of elastic-plastic damage model parameters. An elastic-plastic damage model is presented and implemented in numerical simulation. The metamodel-based inverse method is proposed in order to overcome the computational cost of the conventional inverse method. In the metamodel-based inverse method, a Kriging metamodel is constructed from an experimental design in order to model the relationship between the material parameters and the objective function values of the inverse problem; the optimization procedure is then executed on the metamodel. The application of the presented material model and the proposed parameter identification method to the standard A 2017-T4 tensile test proves that the presented elastic-plastic damage model is adequate to describe the material's mechanical behaviour, and that the proposed metamodel-based inverse method not only enhances the efficiency of parameter identification but also gives reliable results.
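
    A minimal sketch of the metamodel-based inverse loop is given below, with scikit-learn's Gaussian process regressor standing in for the Kriging metamodel and a cheap quadratic standing in for the expensive simulation. The parameter values, bounds and design size are illustrative assumptions, not the paper's:

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF
      from scipy.optimize import minimize

      # Toy stand-in for the expensive simulation: the "objective" is the
      # discrepancy between simulated and measured response for parameters p.
      p_true = np.array([2.0, 0.5])
      def discrepancy(p):
          return np.sum((p - p_true) ** 2)

      rng = np.random.default_rng(0)
      X = rng.uniform([0, 0], [4, 1], size=(40, 2))    # experimental design
      y = np.array([discrepancy(p) for p in X])        # expensive evaluations

      gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
      gp.fit(X, y)                                     # Kriging-style metamodel

      # Optimize on the cheap metamodel instead of the simulator.
      res = minimize(lambda p: gp.predict(p.reshape(1, -1))[0],
                     x0=np.array([1.0, 0.2]), bounds=[(0, 4), (0, 1)])
      print(res.x)   # close to p_true = [2.0, 0.5]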

  8. Use of Soft Computing Technologies for a Qualitative and Reliable Engine Control System for Propulsion Systems

    Science.gov (United States)

    Trevino, Luis; Brown, Terry; Crumbley, R. T. (Technical Monitor)

    2001-01-01

    This paper explores how Soft Computing Technologies (SCT) could be employed to improve overall vehicle system safety, reliability, and rocket engine performance through the development of a qualitative and reliable engine control system (QRECS). Specifically, this is addressed by enhancing rocket engine control using SCT, innovative data mining tools, and the sound software engineering practices used in Marshall's Flight Software Group (FSG). The principal goals in addressing the issue of quality are to improve software management, software development time, software maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control methodologies, but to provide alternative design choices for control, implementation, performance, and sustaining engineering, all relative to addressing the issue of reliability. The approaches outlined in this paper require knowledge in the fields of rocket engine propulsion (system level), software engineering for embedded flight software systems, and soft computing technologies (i.e., neural networks, fuzzy logic, data mining, and Bayesian belief networks), some of which are briefed in this paper. For this effort, the targeted demonstration rocket engine testbed is the MC-1 engine (formerly FASTRAC), which is simulated with hardware and software in the Marshall Avionics & Software Testbed (MAST) laboratory that currently resides at NASA's Marshall Space Flight Center, building 4476, and is managed by the Avionics Department. A brief plan of action for the design, development, implementation, and testing of a Phase One effort for QRECS is given, along with expected results. Phase One will focus on development of a Smart Start Engine Module and a Mainstage Engine Module for proper engine start and mainstage engine operations. The overall intent is to demonstrate that by

  9. The analytic hierarchy process as a systematic approach to the identification of important parameters for the reliability assessment of passive systems

    International Nuclear Information System (INIS)

    Zio, E.; Cantarella, M.; Cammi, A.

    2003-01-01

    Passive systems play a crucial role in the development of future solutions for nuclear plant technology. A fundamental issue still to be resolved is the quantification of the reliability of such systems. In this paper, we first illustrate a systematic methodology to guide the definition of the failure criteria of a passive system and the evaluation of its probability of occurrence, through the identification of the relevant system parameters and the propagation of their associated uncertainties. Within this methodology, we propose the use of the analytic hierarchy process as a structured and reproducible tool for the decomposition of the problem and the identification of the dominant system parameters. An example of its application to a real passive system is illustrated in detail
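
    The computational core of the analytic hierarchy process is standard: priority weights are obtained as the normalized principal eigenvector of a pairwise comparison matrix (Saaty's method). A minimal sketch follows; the 3x3 matrix entries are made up for illustration and do not come from the passive-system application:

      import numpy as np

      def ahp_weights(A):
          # Priority weights from an AHP pairwise-comparison matrix:
          # the normalized principal right eigenvector.
          eigvals, eigvecs = np.linalg.eig(A)
          k = np.argmax(eigvals.real)            # principal eigenvalue
          w = np.abs(eigvecs[:, k].real)
          return w / w.sum(), eigvals[k].real

      # Illustrative 3x3 comparison of candidate system parameters.
      A = np.array([[1,   3,   5],
                    [1/3, 1,   2],
                    [1/5, 1/2, 1]])
      w, lam = ahp_weights(A)
      ci = (lam - len(A)) / (len(A) - 1)         # consistency index
      print(np.round(w, 3), round(ci, 3))        # weights ~[0.65, 0.23, 0.12]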

  10. Design and implementation of a reliable and cost-effective cloud computing infrastructure: the INFN Napoli experience

    International Nuclear Information System (INIS)

    Capone, V; Esposito, R; Pardi, S; Taurino, F; Tortone, G

    2012-01-01

    Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario a cost effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at INFN-Naples and ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of 2 main subsystems: a clustered storage solution, built on top of disk servers running GlusterFS file system, and a virtual machines execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, providing this way live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.

  11. Design and implementation of a reliable and cost-effective cloud computing infrastructure: the INFN Napoli experience

    Science.gov (United States)

    Capone, V.; Esposito, R.; Pardi, S.; Taurino, F.; Tortone, G.

    2012-12-01

    Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario a cost effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at INFN-Naples and ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of 2 main subsystems: a clustered storage solution, built on top of disk servers running GlusterFS file system, and a virtual machines execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, providing this way live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.

  12. Establishment of a protein frequency library and its application in the reliable identification of specific protein interaction partners.

    Science.gov (United States)

    Boulon, Séverine; Ahmad, Yasmeen; Trinkle-Mulcahy, Laura; Verheggen, Céline; Cobley, Andy; Gregor, Peter; Bertrand, Edouard; Whitehorn, Mark; Lamond, Angus I

    2010-05-01

    The reliable identification of protein interaction partners and how such interactions change in response to physiological or pathological perturbations is a key goal in most areas of cell biology. Stable isotope labeling with amino acids in cell culture (SILAC)-based mass spectrometry has been shown to provide a powerful strategy for characterizing protein complexes and identifying specific interactions. Here, we show how SILAC can be combined with computational methods drawn from the business intelligence field for multidimensional data analysis to improve the discrimination between specific and nonspecific protein associations and to analyze dynamic protein complexes. A strategy is shown for developing a protein frequency library (PFL) that improves on previous use of static "bead proteomes." The PFL annotates the frequency of detection in co-immunoprecipitation and pulldown experiments for all proteins in the human proteome. It can provide a flexible and objective filter for discriminating between contaminants and specifically bound proteins and can be used to normalize data values and facilitate comparisons between data obtained in separate experiments. The PFL is a dynamic tool that can be filtered for specific experimental parameters to generate a customized library. It will be continuously updated as data from each new experiment are added to the library, thereby progressively enhancing its utility. The application of the PFL to pulldown experiments is especially helpful in identifying either lower abundance or less tightly bound specific components of protein complexes that are otherwise lost among the large, nonspecific background.

  13. Reference gene identification for reliable normalisation of quantitative RT-PCR data in Setaria viridis.

    Science.gov (United States)

    Nguyen, Duc Quan; Eamens, Andrew L; Grof, Christopher P L

    2018-01-01

    Quantitative real-time polymerase chain reaction (RT-qPCR) is the key platform for the quantitative analysis of gene expression in a wide range of experimental systems and conditions. However, the accuracy and reproducibility of gene expression quantification via RT-qPCR is entirely dependent on the identification of reliable reference genes for data normalisation. Green foxtail (Setaria viridis) has recently been proposed as a potential experimental model for the study of C4 photosynthesis and is closely related to many economically important crop species of the Panicoideae subfamily of grasses, including Zea mays (maize), Sorghum bicolor (sorghum) and Saccharum officinarum (sugarcane). Setaria viridis (Accession 10) possesses a number of key traits as an experimental model, namely: (i) a small, sequenced and well annotated genome; (ii) short stature and generation time; (iii) prolific seed production; and (iv) amenability to Agrobacterium tumefaciens-mediated transformation. There is currently, however, a lack of reference gene expression information for Setaria viridis (S. viridis). We therefore aimed to identify a cohort of suitable S. viridis reference genes for accurate and reliable normalisation of S. viridis RT-qPCR expression data. Eleven putative candidate reference genes were identified and examined across thirteen different S. viridis tissues. Of these, the geNorm and NormFinder analysis software identified SERINE/THREONINE-PROTEIN PHOSPHATASE 2A (PP2A), 5'-ADENYLYLSULFATE REDUCTASE 6 (ASPR6) and DUAL SPECIFICITY PHOSPHATASE (DUSP) as the most suitable combination of reference genes for the accurate and reliable normalisation of S. viridis RT-qPCR expression data. To demonstrate the suitability of the three selected reference genes, PP2A, ASPR6 and DUSP were used to normalise the expression of CINNAMYL ALCOHOL DEHYDROGENASE (CAD) genes across the same tissues. This approach readily demonstrated the suitability of the three

  14. Learning Support Assessment Study of a Computer Simulation for the Development of Microbial Identification Strategies

    Directory of Open Access Journals (Sweden)

    Tristan E. Johnson

    2009-12-01

    This paper describes a study that examined how microbiology students construct knowledge of bacterial identification while using a computer simulation. The purpose of this study was to understand how the simulation affects the cognitive processing of students during thinking, problem solving, and learning about bacterial identification, and to determine how the simulation facilitates the learning of a domain-specific problem-solving strategy. As part of an upper-division microbiology course, five students participated in several simulation assignments. The data were collected using think-aloud protocol and video action logs as the students used the simulation. The analysis revealed two major themes that determined the performance of the students: Simulation Usage (how the students used the software features) and Problem-Solving Strategy Development (the strategy level students started with and the skill level they achieved when they completed their use of the simulation). Several conclusions emerged from the analysis of the data: (i) The simulation affects various aspects of cognitive processing by creating an environment that makes it possible to practice the application of a problem-solving strategy. The simulation was used as an environment that allowed students to practice the cognitive skills required to solve an unknown. (ii) Identibacter (the computer simulation) may be considered a cognitive tool to facilitate the learning of a bacterial identification problem-solving strategy. (iii) The simulation characteristics did support student learning of a problem-solving strategy. (iv) Students demonstrated problem-solving strategy development specific to bacterial identification. (v) Participants demonstrated an improved performance from their repeated use of the simulation.

  15. Optimal design methods for a digital human-computer interface based on human reliability in a nuclear power plant

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Zhang, Li; Xie, Tian; Wu, Daqing; Li, Min; Wang, Yiqun; Peng, Yuyuan; Peng, Jie; Zhang, Mengjia; Li, Peiyao; Ma, Congmin; Wu, Xing

    2017-01-01

    Highlights: • A complete optimization process is established for digital human-computer interfaces of NPPs. • A quick-convergence search method is proposed. • The authors propose an affinity error probability mapping function to test human reliability. - Abstract: This is the second in a series of papers describing optimal design methods for the digital human-computer interface of a nuclear power plant (NPP) from three different perspectives based on human reliability. The purpose of this series is to explore different optimization methods from varying viewpoints. The present paper mainly discusses the optimal design method for the quantity of components of the same factor. In the monitoring process, the quantity of components places a heavy burden on operators, and human errors are thus easily triggered. To solve this problem, the authors propose an optimization process, a quick-convergence search method and an affinity error probability mapping function. Two balanceable parameter values of the affinity error probability function are obtained by experiments. The experimental results show that the affinity error probability mapping function for the human-computer interface has very good sensitivity and stability, and that the quick-convergence search method for fuzzy segments divided by component quantity performs better than a general algorithm.

  16. Safety and reliability of automatization software

    Energy Technology Data Exchange (ETDEWEB)

    Kapp, K; Daum, R [Karlsruhe Univ. (TH) (Germany, F.R.). Lehrstuhl fuer Angewandte Informatik, Transport- und Verkehrssysteme

    1979-02-01

    Automated technical systems have to meet very high requirements concerning safety, security and reliability. Today, modern computers, especially microcomputers, are used as integral parts of those systems. In consequence, computer programs must work in a safe and reliable manner. Methods are discussed which allow the construction of safe and reliable software for automatic systems, such as reactor protection systems, and which allow proof that the safety requirements are met. As a result, it is shown that only the method of total software diversification can satisfy all safety requirements at tolerable cost. In order to achieve a high degree of reliability, structured and modular programming in combination with high-level programming languages are recommended.

  17. Reliability of real-time computing with radiation data feedback at accidental release

    International Nuclear Information System (INIS)

    Deme, S.; Feher, I.; Lang, E.

    1990-01-01

    At the first workshop in 1985 we reported on the real-time dose computing method used at the Paks Nuclear Power Plant and on the telemetric system developed for the normalization of the computed data. At present, the computing method normalized for the telemetric data represents the primary information for deciding on any necessary countermeasures in case of a nuclear reactor accident. In this connection we analyzed the reliability of the results obtained in this manner. The points of the analysis were: how the results are influenced by the choice of certain parameters that cannot be determined by direct methods, and how improperly chosen diffusion parameters would distort the determination of environmental radiation parameters normalized on the basis of the measurements (¹³¹I activity concentration, gamma dose rate) at points lying at a given distance from the measuring stations. A further source of errors may be that, when determining the level of gamma radiation, the radionuclide doses in the cloud and on the ground surface are measured together by the environmental monitoring stations, whereas these doses appear separately in the computations. At the Paks NPP it is the time integral of the airborne activity concentration of vapour-form ¹³¹I which is determined. This quantity includes neither the other physical and chemical forms of ¹³¹I nor the other isotopes of radioiodine. We gave numerical examples for the uncertainties due to the above factors. As a result, we arrived at the conclusions that there is a need to decide on accident-related measures based on the computing method, and that the dose uncertainties may reach one order of magnitude for points lying far from the monitoring stations. Different measures are discussed to make the uncertainties significantly lower

  18. Reliability of the MicroScan WalkAway PC21 panel in identifying and detecting oxacillin resistance in clinical coagulase-negative staphylococci strains.

    Science.gov (United States)

    Olendzki, A N; Barros, E M; Laport, M S; Dos Santos, K R N; Giambiagi-Demarval, M

    2014-01-01

    The purpose of this study was to determine the reliability of the MicroScan WalkAway PosCombo21 (PC21) system for the identification of coagulase-negative staphylococci (CNS) strains and the detection of oxacillin resistance. Using molecular and phenotypic methods, 196 clinical strains were evaluated. The automated system demonstrated 100 % reliability for the identification of the clinical strains Staphylococcus haemolyticus, Staphylococcus hominis and Staphylococcus cohnii; 98.03 % reliability for the identification of Staphylococcus epidermidis; 70 % reliability for the identification of Staphylococcus lugdunensis; 40 % reliability for the identification of Staphylococcus warneri; and 28.57 % reliability for the identification of Staphylococcus capitis, but no reliability for the identification of Staphylococcus auricularis, Staphylococcus simulans and Staphylococcus xylosus. We concluded that the automated system provides accurate results for the more common CNS species but often fails to accurately identify less prevalent species. For the detection of oxacillin resistance, the automated system showed 100 % specificity and 90.22 % sensitivity. Thus, the PC21 panel detects oxacillin-resistant strains, but is limited by the heteroresistance that is observed when using most phenotypic methods.

  19. Identification of double-yolked duck egg using computer vision.

    Directory of Open Access Journals (Sweden)

    Long Ma

    The double-yolked (DY) egg is quite popular in some Asian countries because it is considered a sign of good luck; however, the double yolk is one of the reasons why these eggs fail to hatch. The use of automatic methods for identifying DY eggs can increase efficiency in the poultry industry by decreasing egg loss during incubation or improving sale proceeds. In this study, two methods for DY duck egg identification were developed using computer vision technology. Transmittance images of DY and single-yolked (SY) duck eggs were acquired by a CCD camera in order to identify them according to their shape features. A Fisher's linear discriminant (FLD) model equipped with a set of normalized Fourier descriptors (NFDs) extracted from the acquired images, and a convolutional neural network (CNN) model using primary preprocessed images, were built to recognize duck egg yolk types. The classification accuracies of the FLD model for SY and DY eggs were 100% and 93.2% respectively, while the classification accuracies of the CNN model for SY and DY eggs were 98% and 98.8% respectively. The CNN-based algorithm took about 0.12 s to recognize one sample image, which was slightly faster than the FLD-based one (about 0.20 s). Finally, this work compared the two classification methods and identified the better method for DY egg identification.
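
    As a rough sketch of the FLD branch of the method (normalized Fourier descriptors of the egg contour fed to Fisher's linear discriminant), the following uses noisy ellipses as stand-in contours. The elongation ranges, noise level and descriptor count are illustrative assumptions, not the paper's data:

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(1)

      def normalized_fourier_descriptors(contour, k=8):
          # Treat contour points as complex numbers, take the FFT, keep the
          # magnitudes of low positive and negative harmonics, and divide
          # by |c_1| for scale/rotation/starting-point invariance.
          z = contour[:, 0] + 1j * contour[:, 1]
          c = np.fft.fft(z)
          mags = np.abs(np.r_[c[2:k + 1], c[-k:]])   # skip c_0 (position) and c_1
          return mags / np.abs(c[1])

      # Hypothetical stand-in contours: DY eggs tend to be more elongated
      # than SY eggs. Real inputs would be contours segmented from the
      # transmittance images; labels 0 = SY, 1 = DY.
      def contour(elong, n=256):
          t = np.linspace(0, 2 * np.pi, n, endpoint=False)
          xy = np.column_stack([elong * np.cos(t), np.sin(t)])
          return xy + rng.normal(0, 0.01, xy.shape)

      X = np.array([normalized_fourier_descriptors(contour(e))
                    for e in np.r_[rng.uniform(1.2, 1.4, 30),     # SY-like
                                   rng.uniform(1.7, 2.0, 30)]])   # DY-like
      y = np.r_[np.zeros(30), np.ones(30)]

      fld = LinearDiscriminantAnalysis().fit(X, y)   # Fisher's linear discriminant
      print(fld.predict([normalized_fourier_descriptors(contour(1.9))]))  # -> [1.]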

  20. Simple and reliable identification of the human round spermatid by inverted phase-contrast microscopy.

    Science.gov (United States)

    Verheyen, G; Crabbé, E; Joris, H; Van Steirteghem, A

    1998-06-01

    Based on the results of animal studies, round spermatid injection (ROSI) has been introduced into the clinical practice of several in-vitro fertilization (IVF) centres. The efficiency of this procedure in terms of fertilization rates and pregnancy rates, however, remains very poor. An essential aspect which does not receive enough attention is the correct identification of this type of round cell within a heterogeneous population of testicular cells. A Nikon inverted microscope equipped with phase-contrast optics (DLL) provided a clear image which allowed reliable recognition of round spermatids in cell suspensions smeared at the glass bottom of the dish. Fluorescent in-situ hybridization confirmed the haploid status of the selected cells. However, exploration of several biopsies from patients with non-obstructive azoospermia showing no spermatozoa after extensive search did not reveal any round spermatids. This observation questions whether enough effort is spent on searching for mature spermatozoa or late spermatids. Experimental investigations should precede the introduction of ROSI into the clinical practice of any IVF centre.

  1. Reliability data banks at Electricite de France (EDF)

    International Nuclear Information System (INIS)

    Procaccia, H.

    1991-01-01

    When Electricite de France opted for a policy of rapid development of PWR nuclear power plants, only foreign data on plant operation was available to start a computerized data bank. Since 1978, however, a specific French data bank has been built up. The collected data are recorded on a central computer near Paris. This article describes the component reliability bank. First, a history of the bank's development, which has taken place in four steps, is described. Then the data bank itself, known as SRDF, is explained. It allows computation of the component failure rates in operation, on standby or on demand, their development with time, the corresponding unavailability and repair time, the modes and causes of failure, the affected subcomponents and the consequences of the failure on the plant. To achieve this, SRDF has three subfiles: the identification file, the operating file and the failure file. The components monitored are listed. Data processing and data retrieval are explained, and some examples are given of studies performed using SRDF. (UK)
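
    The quantities such a bank supports are simple to compute once the operating and failure subfiles are joined per component family: an operating failure rate, an on-demand failure probability and a mean repair time. A minimal sketch with hypothetical field names and made-up records follows (not SRDF's actual schema):

      # Hypothetical component records pooled over one component family.
      records = [
          {"op_hours": 52_000, "demands": 410, "op_failures": 2,
           "demand_failures": 1, "repair_hours": 18.0},
          {"op_hours": 61_500, "demands": 395, "op_failures": 1,
           "demand_failures": 0, "repair_hours": 7.5},
      ]
      hours   = sum(r["op_hours"] for r in records)
      demands = sum(r["demands"] for r in records)
      lam  = sum(r["op_failures"] for r in records) / hours        # per hour
      p_d  = sum(r["demand_failures"] for r in records) / demands  # per demand
      mttr = sum(r["repair_hours"] for r in records) / (
             sum(r["op_failures"] + r["demand_failures"] for r in records))
      print(f"lambda={lam:.2e}/h  p_demand={p_d:.2e}  MTTR={mttr:.1f} h")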

  2. A hyperspectral X-ray computed tomography system for enhanced material identification

    Science.gov (United States)

    Wu, Xiaomei; Wang, Qian; Ma, Jinlei; Zhang, Wei; Li, Po; Fang, Zheng

    2017-08-01

    X-ray computed tomography (CT) can distinguish different materials according to their absorption characteristics. The hyperspectral X-ray CT (HXCT) system proposed in the present work reconstructs each voxel according to its X-ray absorption spectral characteristics. In contrast to a dual-energy or multi-energy CT system, HXCT employs cadmium telluride (CdTe) as the X-ray detector, which provides higher spectral resolution and separate spectral lines owing to its photon-counting working principle. In this paper, a specimen containing ten different polymer materials, randomly arranged, was adopted for material identification by HXCT. The filtered back-projection algorithm was applied for image and spectral reconstruction. The first step was to sort the individual material components of the specimen according to their cross-sectional image intensity. The second step was to classify materials with similar intensities according to their reconstructed spectral characteristics. The results demonstrated the feasibility of the proposed material identification process and indicated that the proposed HXCT system has good prospects for a wide range of biomedical and industrial nondestructive testing applications.
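
    The reconstruction step can be sketched per energy bin: apply filtered back-projection to each bin's sinogram and treat the stack of per-bin images as a spectral signature for every voxel. The toy phantom, the two-bin setup and the argmax classification below are illustrative assumptions (using scikit-image's radon/iradon), not the HXCT implementation:

      import numpy as np
      from skimage.transform import radon, iradon

      # Toy phantom with two "materials" whose attenuation differs across
      # two energy bins; a real CdTe detector gives many narrow bins.
      phantom = np.zeros((2, 128, 128))          # (energy bin, y, x)
      phantom[0, 30:60, 30:60] = 1.0             # material A, strong in bin 0
      phantom[1, 70:100, 70:100] = 1.0           # material B, strong in bin 1

      theta = np.linspace(0.0, 180.0, 180, endpoint=False)
      recon = np.empty_like(phantom)
      for b in range(phantom.shape[0]):
          sino = radon(phantom[b], theta=theta)  # projections for this bin
          recon[b] = iradon(sino, theta=theta)   # FBP (default ramp filter)

      # Per-voxel spectral signature = vector of per-bin reconstructions;
      # classify each pixel by the bin of maximum response.
      material_map = np.argmax(recon, axis=0)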

  3. Improved object optimal synthetic description, modeling, learning, and discrimination by GEOGINE computational kernel

    Science.gov (United States)

    Fiorini, Rodolfo A.; Dacquino, Gianfranco

    2005-03-01

    GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-Dimensional shape/texture optimal synthetic representation, description and learning, was presented in previous conferences elsewhere recently. Improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space and a demo application is presented. Progressive model automatic generation is discussed. GEOGINE can be used as an efficient computational kernel for fast reliable application development and delivery in advanced biomedical engineering, biometric, intelligent computing, target recognition, content image retrieval, data mining technological areas mainly. Ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of the world object. According to this approach, "n-D Tensor Calculus" can be considered a "Formal Language" to reliably compute optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-Dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape invariant characteristics. As a matter of fact, any highly sophisticated application needing effective, robust object geometric/colour invariant attribute capture and parameterization features, for reliable automated object learning and discrimination can deeply benefit from GEOGINE progressive automated model generation computational kernel performance. Main operational advantages over previous

  4. [Feasibility and acceptance of computer-based assessment for the identification of psychosocially distressed patients in routine clinical care].

    Science.gov (United States)

    Sehlen, Susanne; Ott, Martin; Marten-Mittag, Birgitt; Haimerl, Wolfgang; Dinkel, Andreas; Duehmke, Eckhart; Klein, Christian; Schaefer, Christof; Herschbach, Peter

    2012-07-01

    This study investigated feasibility and acceptance of computer-based assessment for the identification of psychosocial distress in routine radiotherapy care. 155 cancer patients were assessed using QSC-R10, PO-Bado-SF and Mach-9. The congruence between computerized tablet PC and conventional paper assessment was analysed in 50 patients. The agreement between the 2 modes was high (ICC 0.869-0.980). Acceptance of computer-based assessment was very high (>95%). Sex, age, education, distress and Karnofsky performance status (KPS) did not influence acceptance. Computerized assessment was rated more difficult by older patients (p = 0.039) and patients with low KPS (p = 0.020). 75.5% of the respondents supported referral for psycho-social intervention for distressed patients. The prevalence of distress was 27.1% (QSC-R10). Computer-based assessment allows easy identification of distressed patients. Level of staff involvement is low, and the results are quickly available for care providers. © Georg Thieme Verlag KG Stuttgart · New York.

  5. Reliability assessment of restructured power systems using reliability network equivalent and pseudo-sequential simulation techniques

    International Nuclear Information System (INIS)

    Ding, Yi; Wang, Peng; Goel, Lalit; Billinton, Roy; Karki, Rajesh

    2007-01-01

    This paper presents a technique to evaluate reliability of a restructured power system with a bilateral market. The proposed technique is based on the combination of the reliability network equivalent and pseudo-sequential simulation approaches. The reliability network equivalent techniques have been implemented in the Monte Carlo simulation procedure to reduce the computational burden of the analysis. Pseudo-sequential simulation has been used to increase the computational efficiency of the non-sequential simulation method and to model the chronological aspects of market trading and system operation. Multi-state Markov models for generation and transmission systems are proposed and implemented in the simulation. A new load shedding scheme is proposed during generation inadequacy and network congestion to minimize the load curtailment. The IEEE reliability test system (RTS) is used to illustrate the technique. (author)

  6. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  7. Software reliability and safety in nuclear reactor protection systems

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1993-11-01

    Planning the development, use and regulation of computer systems in nuclear reactor protection systems in such a way as to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Computer Safety and Reliability Group, Lawrence Livermore National Laboratory, that investigates different aspects of computer software in reactor protection systems. There are two central themes in the report. First, software considerations cannot be fully understood in isolation from computer hardware and application considerations. Second, the process of engineering reliability and safety into a computer system requires activities to be carried out throughout the software life cycle. The report discusses the many activities that can be carried out during the software life cycle to improve the safety and reliability of the resulting product. The viewpoint is primarily that of the assessor, or auditor.

  8. Software reliability and safety in nuclear reactor protection systems

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1993-11-01

    Planning the development, use and regulation of computer systems in nuclear reactor protection systems in such a way as to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Computer Safety and Reliability Group, Lawrence Livermore National Laboratory, that investigates different aspects of computer software in reactor protection systems. There are two central themes in the report. First, software considerations cannot be fully understood in isolation from computer hardware and application considerations. Second, the process of engineering reliability and safety into a computer system requires activities to be carried out throughout the software life cycle. The report discusses the many activities that can be carried out during the software life cycle to improve the safety and reliability of the resulting product. The viewpoint is primarily that of the assessor, or auditor

  9. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)
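
    As a minimal sketch of the importance-sampling idea mentioned above, component failures can be sampled from a biased distribution in which failure is frequent, with each sample reweighted by the likelihood ratio so the estimate remains unbiased. The 2-out-of-3 system and the probabilities are made-up illustrations, not from the report:

      import numpy as np

      rng = np.random.default_rng(0)
      p = 1e-4          # true component failure probability (rare event)
      q = 0.05          # biased sampling probability (importance density)
      n = 100_000

      def system_fails(x):
          # Example system: fails if at least 2 of 3 components fail.
          return x.sum(axis=1) >= 2

      # Sample component states from the biased distribution...
      x = rng.random((n, 3)) < q
      # ...and reweight each sample by the likelihood ratio: p/q per
      # failed component and (1-p)/(1-q) per surviving component.
      k = x.sum(axis=1)
      w = (p / q) ** k * ((1 - p) / (1 - q)) ** (3 - k)
      estimate = np.mean(system_fails(x) * w)

      exact = 3 * p**2 * (1 - p) + p**3
      print(estimate, exact)   # both ~3e-8; crude sampling would see ~0 failures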

  10. The Alcohol Use Disorders Identification Test (AUDIT: reliability and validity of the Greek version

    Directory of Open Access Journals (Sweden)

    Bratis Dimitris

    2009-05-01

    Background: Problems associated with alcohol abuse are recognised by the World Health Organization as a major health issue, which according to most recent estimations is responsible for 1.4% of the total world burden of morbidity and has been proven to increase mortality risk by 50%. Because of the size and severity of the problem, early detection is very important. This requires easy-to-use and specific tools. One of these is the Alcohol Use Disorders Identification Test (AUDIT). Aim: This study aims to standardise the questionnaire in a Greek population. Methods: AUDIT was translated and back-translated from its original language by two English-speaking psychiatrists. The tool contains 10 questions. A score ≥ 11 is an indication of serious abuse/dependence. In the study, 218 subjects took part: 128 were males and 90 females. The average age was 40.71 years (± 11.34). From the 218 individuals, 109 (75 male, 34 female) fulfilled the criteria for alcohol dependence according to the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV), and presented requesting admission; 109 subjects (53 male, 56 female) were healthy controls. Results: Internal reliability (Cronbach α) was 0.80 for the controls and 0.80 for the alcohol-dependent individuals. Controls had significantly lower average scores (t test, P < 0.001). The sensitivity of AUDIT at a cut-off score of 8 was 0.98 and its specificity was 0.94 for the same score. For the alcohol-dependent sample 3% scored as false negatives and from the control group 1.8% scored false positives. In the alcohol-dependent sample there was no difference between males and females in their average scores (t test, P > 0.05). Conclusion: The Greek version of AUDIT has increased internal reliability and validity. It detects 97% of the alcohol-dependent individuals and has a high sensitivity and specificity. AUDIT is easy to use, quick and reliable and can be very useful in detecting alcohol problems in sensitive populations.

  11. The Alcohol Use Disorders Identification Test (AUDIT): reliability and validity of the Greek version.

    Science.gov (United States)

    Moussas, George; Dadouti, Georgia; Douzenis, Athanassios; Poulis, Evangelos; Tzelembis, Athanassios; Bratis, Dimitris; Christodoulou, Christos; Lykouras, Lefteris

    2009-05-14

    Problems associated with alcohol abuse are recognised by the World Health Organization as a major health issue, which according to most recent estimations is responsible for 1.4% of the total world burden of morbidity and has been proven to increase mortality risk by 50%. Because of the size and severity of the problem, early detection is very important. This requires easy-to-use and specific tools. One of these is the Alcohol Use Disorders Identification Test (AUDIT). This study aims to standardise the questionnaire in a Greek population. AUDIT was translated and back-translated from its original language by two English-speaking psychiatrists. The tool contains 10 questions. A score >= 11 is an indication of serious abuse/dependence. In the study, 218 subjects took part: 128 were males and 90 females. The average age was 40.71 years (+/- 11.34). From the 218 individuals, 109 (75 male, 34 female) fulfilled the criteria for alcohol dependence according to the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV), and presented requesting admission; 109 subjects (53 male, 56 female) were healthy controls. Internal reliability (Cronbach alpha) was 0.80 for the controls and 0.80 for the alcohol-dependent individuals. Controls had significantly lower average scores (t test, P < 0.05). The sensitivity of AUDIT for a score > 8 was 0.98 and its specificity was 0.94 for the same score. For the alcohol-dependent sample 3% scored as false negatives and from the control group 1.8% scored false positives. In the alcohol-dependent sample there was no difference between males and females in their average scores (t test, P > 0.05). The Greek version of AUDIT has increased internal reliability and validity. It detects 97% of the alcohol-dependent individuals and has a high sensitivity and specificity. AUDIT is easy to use, quick and reliable and can be very useful in detecting alcohol problems in sensitive populations.

  12. Reliability of computed tomography measurements in assessment of thigh muscle cross-sectional area and attenuation

    International Nuclear Information System (INIS)

    Strandberg, Sören; Wretling, Marie-Louise; Wredmark, Torsten; Shalabi, Adel

    2010-01-01

    Advancement in computed tomography (CT) technology and the introduction of new medical imaging software enable easy and rapid assessment of muscle cross-sectional area (CSA) and attenuation. Before using these techniques in clinical studies there is a need to evaluate the reliability of the measurements. The purpose of the study was to evaluate the inter- and intra-observer reliability of ImageJ in measuring thigh muscle CSA and attenuation by CT in patients with anterior cruciate ligament (ACL) injury. Thirty-one patients from an ongoing study of rehabilitation and muscle atrophy after ACL reconstruction were included in the study. Axial CT images with a slice thickness of 10 mm at the level of 150 mm above the knee joint were analyzed by two investigators independently, twice each, with a minimum of 3 weeks between the two readings, using NIH ImageJ. CSA and the mean attenuation of individual thigh muscles were analyzed for both legs. Mean CSA and mean attenuation values were in good agreement both when comparing the two observers and the two replicates. The intraclass correlation coefficients (ICC) were generally very high, with values from 0.98 to 1.00 for all comparisons except for the area of semimembranosus. All the ICC values were significant (p < 0.001). Pearson correlation coefficients were also generally very high, with values from 0.98 to 1.00 for all comparisons except for the area of semimembranosus (0.95 for intraobserver and 0.92 for interobserver). This study has presented ImageJ as a method to monitor and evaluate CSA and attenuation of different muscles in the thigh using CT imaging. The method shows an overall excellent reliability with respect to both observer and replicate.
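
    The intraclass correlations reported above can be computed directly from a two-observer measurement matrix. A minimal sketch of ICC(2,1) (two-way random effects, absolute agreement, single measurement) from its ANOVA decomposition follows; the patient data are invented for illustration.

```python
import numpy as np

def icc_2_1(Y):
    """Two-way random effects, absolute agreement, single rater: ICC(2,1).
    Y: (n_targets, k_raters) matrix of measurements."""
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)
    col_means = Y.mean(axis=0)
    MSR = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between targets
    MSC = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between raters
    SSE = np.sum((Y - row_means[:, None] - col_means[None, :] + grand) ** 2)
    MSE = SSE / ((n - 1) * (k - 1))                        # residual
    return (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)

# Hypothetical CSA readings (cm^2) for 5 patients by 2 observers.
Y = np.array([[58.1, 58.4], [61.0, 60.7], [49.8, 50.1],
              [55.2, 55.0], [63.3, 63.9]])
print(f"ICC(2,1) = {icc_2_1(Y):.3f}")
```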

  13. A computational Bayesian approach to dependency assessment in system reliability

    International Nuclear Information System (INIS)

    Yontay, Petek; Pan, Rong

    2016-01-01

    Due to the increasing complexity of engineered products, it is of great importance to develop a tool to assess reliability dependencies among components and systems under the uncertainty of system reliability structure. In this paper, a Bayesian network approach is proposed for evaluating the conditional probability of failure within a complex system, using a multilevel system configuration. Coupling with Bayesian inference, the posterior distributions of these conditional probabilities can be estimated by combining failure information and expert opinions at both system and component levels. Three data scenarios are considered in this study, and they demonstrate that, with the quantification of the stochastic relationship of reliability within a system, the dependency structure in system reliability can be gradually revealed by the data collected at different system levels. - Highlights: • A Bayesian network representation of system reliability is presented. • Bayesian inference methods for assessing dependencies in system reliability are developed. • Complete and incomplete data scenarios are discussed. • The proposed approach is able to integrate reliability information from multiple sources at multiple levels of the system.
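
    As a much simplified stand-in for the paper's multilevel Bayesian network inference, the sketch below shows the conjugate Beta-Binomial update that combines an expert-elicited prior with component-level failure data; all priors and counts are assumed, not taken from the paper.

```python
from scipy import stats

# Component-level prior elicited from expert opinion: failure probability
# p ~ Beta(a, b) with mean a / (a + b) = 0.05.
a, b = 1.0, 19.0

# Observed component test data: 2 failures in 40 demands (hypothetical).
failures, demands = 2, 40

# Conjugate update: posterior is Beta(a + failures, b + demands - failures).
post = stats.beta(a + failures, b + demands - failures)
print(f"posterior mean p_fail = {post.mean():.4f}")
print(f"95% credible interval = {post.interval(0.95)}")

# For a two-component series system with independent components p1, p2,
# system reliability would be (1 - p1)(1 - p2); the paper's Bayesian network
# instead lets system-level data inform such dependency structures.
```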

  14. An Innovative Fuzzy-Logic-Based Methodology for Trend Identification

    International Nuclear Information System (INIS)

    Wang Xin; Tsoukalas, Lefteri H.; Wei, Thomas Y.C.; Reifman, Jaques

    2001-01-01

    A new fuzzy-logic-based methodology for on-line signal trend identification is introduced. The methodology may be used for detecting the onset of nuclear power plant (NPP) transients at the earliest possible time and could be of great benefit to diagnostic, maintenance, and performance-monitoring programs. Although signal trend identification is complicated by the presence of noise, fuzzy methods can help capture important features of on-line signals, integrate the information included in these features, and classify incoming NPP signals into increasing, decreasing, and steady-state trend categories. A computer program named PROTREN is developed and tested for the purpose of verifying this methodology using NPP and simulation data. The results indicate that the new fuzzy-logic-based methodology is capable of detecting transients accurately; it identifies trends reliably and does not misinterpret a steady-state signal as a transient one.
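
    PROTREN itself is not available in the record, but the core idea — mapping a windowed slope estimate to fuzzy memberships in increasing/decreasing/steady-state classes — can be sketched as follows; the triangular membership shape and the steady-band width are assumptions.

```python
import numpy as np

def trend_memberships(signal, dt=1.0, steady_band=0.02):
    """Fuzzy memberships (decreasing, steady, increasing) for a noisy window.
    The slope is taken from a least-squares line fit over the window."""
    t = np.arange(len(signal)) * dt
    slope = np.polyfit(t, signal, 1)[0]
    s = slope / steady_band                    # normalize by the steady band
    mu_steady = max(0.0, 1.0 - abs(s))         # triangular membership at ~0
    mu_incr = min(1.0, max(0.0, s))            # ramps up for positive slope
    mu_decr = min(1.0, max(0.0, -s))           # ramps up for negative slope
    return {"decreasing": mu_decr, "steady": mu_steady, "increasing": mu_incr}

rng = np.random.default_rng(1)
window = 0.05 * np.arange(30) + rng.normal(0, 0.3, 30)   # slow upward drift
mships = trend_memberships(window, steady_band=0.02)
print(max(mships, key=mships.get), mships)
```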

  15. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    Science.gov (United States)

    Bavuso, S. J.

    1984-01-01

    A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Discussed are the numerous factors that potentially have a degrading effect on system reliability, and the ways in which these factors, peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  16. Travel reliability inventory for Chicago.

    Science.gov (United States)

    2013-04-01

    The overarching goal of this research project is to enable state DOTs to document and monitor the reliability performance of their highway networks. To this end, a computer tool, TRIC, was developed to produce travel reliability inventories from ...

  17. Development in structural systems reliability theory

    Energy Technology Data Exchange (ETDEWEB)

    Murotsu, Y

    1986-07-01

    This paper is concerned with two topics in structural systems reliability theory. One covers automatic generation of failure mode equations, identification of stochastically dominant failure modes, and reliability assessment of redundant structures. Reduced stiffness matrices and equivalent nodal forces representing the failed elements are introduced for expressing the safety of the elements, using a matrix method. Dominant failure modes are systematically selected by a branch-and-bound technique and heuristic operations. The other discusses various optimum design problems based on the reliability concept. Those problems are interpreted through a solution to a multi-objective optimization problem.

  18. Development in structural systems reliability theory

    International Nuclear Information System (INIS)

    Murotsu, Y.

    1986-01-01

    This paper is concerned with two topics in structural systems reliability theory. One covers automatic generation of failure mode equations, identification of stochastically dominant failure modes, and reliability assessment of redundant structures. Reduced stiffness matrices and equivalent nodal forces representing the failed elements are introduced for expressing the safety of the elements, using a matrix method. Dominant failure modes are systematically selected by a branch-and-bound technique and heuristic operations. The other discusses various optimum design problems based on the reliability concept. Those problems are interpreted through a solution to a multi-objective optimization problem. (orig.)

  19. Design and reliability, availability, maintainability, and safety analysis of a high availability quadruple vital computer system

    Institute of Scientific and Technical Information of China (English)

    Ping TAN; Wei-ting HE; Jia LIN; Hong-ming ZHAO; Jian CHU

    2011-01-01

    With the development of high-speed railways in China, more than 2000 high-speed trains will be put into use. Safety and efficiency of railway transportation is increasingly important. We have designed a high availability quadruple vital computer (HAQVC) system based on the analysis of the architecture of the traditional double 2-out-of-2 system and 2-out-of-3 system. The HAQVC system is a system with high availability and safety, with prominent characteristics such as a brand-new internal architecture, high efficiency, a reliable data interaction mechanism, and an operation state change mechanism. The hardware of the vital CPU is based on ARM7 with the real-time embedded safe operation system (ES-OS). The Markov modeling method is designed to evaluate the reliability, availability, maintainability, and safety (RAMS) of the system. In this paper, we demonstrate that the HAQVC system is more reliable than the all voting triple modular redundancy (AVTMR) system and the double 2-out-of-2 system. Thus, the design can be used for a specific application system, such as an airplane or high-speed railway system.
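
    The Markov modeling step can be illustrated on a minimal redundancy example: a duplicated module with assumed failure and repair rates, whose steady-state availability follows from the generator matrix. This is a sketch of the general technique, not the HAQVC model from the paper.

```python
import numpy as np

# Continuous-time Markov chain for a duplicated module: states are the
# number of healthy units (2, 1, 0). One shared repair crew is assumed.
lam, mu = 1e-4, 1e-1          # per-hour failure and repair rates (assumed)

Q = np.array([
    [-2 * lam,      2 * lam,  0.0],   # both up -> one failed
    [      mu, -(mu + lam),   lam],   # one failed -> repaired / both failed
    [     0.0,          mu,  -mu ],   # both failed -> one repaired
])

# Steady-state distribution pi solves pi Q = 0 with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
availability = pi[0] + pi[1]          # system works with >= 1 healthy unit
print(f"steady-state availability = {availability:.10f}")
```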

  20. Reliability of structural systems subject to fatigue

    International Nuclear Information System (INIS)

    Rackwitz, R.

    1984-01-01

    Concepts and computational procedures for the reliability calculation of structural systems subject to fatigue are outlined. Systems are dealt with by approximately computing componential times to first failure. So-called first-order reliability methods are then used to formulate dependencies between componential failures and to evaluate the system failure probability. (Author)
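
    For a linear limit state with independent normal variables, the first-order reliability method referred to above reduces to a closed form for the reliability index. A minimal sketch, with invented resistance and load parameters:

```python
from math import sqrt
from scipy.stats import norm

# Limit state g = R - S with independent normals R ~ N(muR, sR), S ~ N(muS, sS).
muR, sR = 300.0, 30.0     # resistance (assumed units and values)
muS, sS = 200.0, 40.0     # load effect

# For a linear g, the Hasofer-Lind reliability index is exact:
beta = (muR - muS) / sqrt(sR**2 + sS**2)
pf = norm.cdf(-beta)      # first-order failure probability
print(f"reliability index beta = {beta:.3f}, Pf = {pf:.3e}")
```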

  1. Fingerprint: A Unique and Reliable Method for Identification

    Directory of Open Access Journals (Sweden)

    Palash Kumar Bose

    2017-01-01

    Full Text Available Fingerprints have been the gold standard for personal identification within the forensic community for more than one hundred years. They remain universal in spite of the discovery of DNA fingerprinting. The science of fingerprint identification has evolved over time, from the early use of fingerprints to mark business transactions in ancient Babylonia to their use today as core technology in biometric security devices and as scientific evidence in courts of law throughout the world. The science of fingerprints, dactylography or dermatoglyphics, has long been widely accepted, acclaimed, and reputed as a panacea for individualization, particularly in forensic investigations. Human fingerprints are detailed, unique, difficult to alter, and durable over the life of an individual, making them suitable as lifelong markers of human identity. Fingerprints can be readily used by police or other authorities to identify individuals who wish to conceal their identity, or to identify people who are incapacitated or deceased, as in the aftermath of a natural disaster.

  2. Identification of High-Risk Plaques Destined to Cause Acute Coronary Syndrome Using Coronary Computed Tomographic Angiography and Computational Fluid Dynamics.

    Science.gov (United States)

    Lee, Joo Myung; Choi, Gilwoo; Koo, Bon-Kwon; Hwang, Doyeon; Park, Jonghanne; Zhang, Jinlong; Kim, Kyung-Jin; Tong, Yaliang; Kim, Hyun Jin; Grady, Leo; Doh, Joon-Hyung; Nam, Chang-Wook; Shin, Eun-Seok; Cho, Young-Seok; Choi, Su-Yeon; Chun, Eun Ju; Choi, Jin-Ho; Nørgaard, Bjarne L; Christiansen, Evald H; Niemen, Koen; Otake, Hiromasa; Penicka, Martin; de Bruyne, Bernard; Kubo, Takashi; Akasaka, Takashi; Narula, Jagat; Douglas, Pamela S; Taylor, Charles A; Kim, Hyo-Soo

    2018-03-14

    We investigated the utility of noninvasive hemodynamic assessment in the identification of high-risk plaques that caused subsequent acute coronary syndrome (ACS). ACS is a critical event that impacts the prognosis of patients with coronary artery disease. However, the role of hemodynamic factors in the development of ACS is not well-known. Seventy-two patients with clearly documented ACS and available coronary computed tomographic angiography (CTA) acquired between 1 month and 2 years before the development of ACS were included. In 66 culprit and 150 nonculprit lesions, as a case-control design, the presence of adverse plaque characteristics (APC) was assessed and hemodynamic parameters (fractional flow reserve derived by coronary computed tomographic angiography [FFRCT], change in FFRCT across the lesion [ΔFFRCT], wall shear stress [WSS], and axial plaque stress) were analyzed using computational fluid dynamics. The best cut-off values for FFRCT, ΔFFRCT, WSS, and axial plaque stress were used to define the presence of adverse hemodynamic characteristics (AHC). The incremental discriminant and reclassification abilities for ACS prediction were compared among 3 models (model 1: percent diameter stenosis [%DS] and lesion length, model 2: model 1 + APC, and model 3: model 2 + AHC). The culprit lesions showed higher %DS (55.5 ± 15.4% vs. 43.1 ± 15.0%; p < 0.05) and more adverse hemodynamic parameters, including WSS and axial plaque stress, than nonculprit lesions (all p values < 0.05). Model 3 showed significantly better discrimination (c-statistic [c-index] 0.789 vs. 0.747; p = 0.014) and reclassification abilities (category-free net reclassification index 0.287; p = 0.047; relative integrated discrimination improvement 0.368; p < 0.001) than model 2. Lesions with both APC and AHC showed significantly higher risk of being the culprit for subsequent ACS than those with no APC/AHC (hazard ratio: 11.75; 95% confidence interval: 2.85 to 48.51; p = 0.001) and with either APC or AHC (hazard ratio: 3.22; 95% confidence interval: 1.86 to 5.55; p < 0.001). Noninvasive hemodynamic assessment enhanced the identification of high-risk plaques that caused subsequent ACS.

  3. A Newly Developed Method for Computing Reliability Measures in a Water Supply Network

    Directory of Open Access Journals (Sweden)

    Jacek Malinowski

    2016-01-01

    Full Text Available A reliability model of a water supply network has been examined. Its main features are: a topology that can be decomposed by the so-called state factorization into a (relatively small) number of derivative networks, each having a series-parallel structure (1); binary-state components (either operative or failed) with given flow capacities (2); a multi-state character of the whole network and its sub-networks, a network state being defined as the maximal flow between a source (or sources) and a sink (or sinks) (3); and integer values for all capacities (component, network, and sub-network) (4). As the network operates, its state changes due to component failures, repairs, and replacements. A newly developed method of computing the inter-state transition intensities is presented. It is based on the so-called state factorization and series-parallel aggregation. The analysis of these intensities shows that the failure-repair process of the considered system is an asymptotically homogeneous Markov process. It is also demonstrated how certain reliability parameters useful for network maintenance planning can be determined on the basis of the asymptotic intensities. For better understanding of the presented method, an illustrative example is given. (original abstract)
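
    The series-parallel structure exploited by the method lends itself to a simple recursive capacity aggregation: series connections pass the minimum capacity, parallel connections the sum. A sketch on an invented three-pipe network (not the paper's case study):

```python
# Series-parallel capacity aggregation: a network state is the maximal
# source-to-sink flow, so series edges combine by min() and parallel
# branches by sum(). The tiny network below is illustrative only.

def capacity(node):
    kind = node[0]
    if kind == "pipe":                 # leaf: (kind, capacity, operative?)
        _, cap, up = node
        return cap if up else 0
    children = [capacity(c) for c in node[1:]]
    return min(children) if kind == "series" else sum(children)

# source --[A]--+--[B]--+--[D]--> sink   (B, C in parallel; A, D in series)
#               +--[C]--+
net = ("series",
       ("pipe", 8, True),
       ("parallel", ("pipe", 5, True), ("pipe", 4, True)),
       ("pipe", 7, True))
print(capacity(net))                   # min(8, 5 + 4, 7) = 7

# Failing pipe C reduces the parallel branch and hence the network state:
net_failed = ("series",
              ("pipe", 8, True),
              ("parallel", ("pipe", 5, True), ("pipe", 4, False)),
              ("pipe", 7, True))
print(capacity(net_failed))            # min(8, 5, 7) = 5
```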

  4. Analysis of information security reliability: A tutorial

    International Nuclear Information System (INIS)

    Kondakci, Suleyman

    2015-01-01

    This article presents a concise reliability analysis of network security abstracted from stochastic modeling, reliability, and queuing theories. Network security analysis is composed of threats, their impacts, and recovery of the failed systems. A unique framework with a collection of the key reliability models is presented here to guide the determination of the system reliability based on the strength of malicious acts and performance of the recovery processes. A unique model, called the Attack-obstacle model, is also proposed here for analyzing systems with immunity growth features. Most computer science curricula do not contain courses in reliability modeling applicable to different areas of computer engineering. Hence, the topic of reliability analysis is often too diffuse for most computer engineers and researchers dealing with network security. This work is thus aimed at shedding some light on this issue, which can be useful in identifying models, their assumptions and practical parameters for estimating the reliability of threatened systems and for assessing the performance of recovery facilities. It can also be useful for the classification of processes and states regarding the reliability of information systems. Systems with stochastic behaviors undergoing queue operations and random state transitions can also benefit from the approaches presented here. - Highlights: • A concise survey and tutorial in model-based reliability analysis applicable to information security. • A framework of key modeling approaches for assessing reliability of networked systems. • The framework facilitates quantitative risk assessment tasks guided by stochastic modeling and queuing theory. • Evaluation of approaches and models for modeling threats, failures, impacts, and recovery analysis of information systems

  5. Modeling Message Queueing Services with Reliability Guarantee in Cloud Computing Environment Using Colored Petri Nets

    Directory of Open Access Journals (Sweden)

    Jing Li

    2015-01-01

    Full Text Available Motivated by the need for loosely coupled and asynchronous dissemination of information, message queues are widely used in large-scale application areas. With the advent of virtualization technology, cloud-based message queueing services (CMQSs) with distributed computing and storage are widely adopted to improve availability, scalability, and reliability; however, a critical issue is their performance and quality of service (QoS). While numerous approaches for evaluating system performance are available, there is no modeling approach for estimating and analyzing the performance of CMQSs. In this paper, we employ both analytical and simulation modeling to address the performance of CMQSs with reliability guarantee. We present a visibility-based modeling approach (VMA) for simulation modeling using colored Petri nets (CPN). Our model incorporates the important features of message queueing services in the cloud such as replication, message consistency, resource virtualization, and especially the mechanism named visibility timeout, which is adopted in the services to guarantee system reliability. Finally, we evaluate our model through different experiments under varied scenarios to obtain important performance metrics such as total message delivery time, waiting number, and component utilization. Our results reveal considerable insights into resource scheduling and system configuration for service providers to estimate and gain performance optimization.

  6. Simple, Reliable, and Cost-Effective Yeast Identification Scheme for the Clinical Laboratory

    OpenAIRE

    Koehler, Ann P.; Chu, Kai-Cheong; Houang, Elizabeth T. S.; Cheng, Augustine F. B.

    1999-01-01

    The appearance of colonies on the chromogenic medium CHROMagar Candida combined with observation of morphology on corn meal–Tween 80 agar was used for the identification of 353 clinical yeast isolates. The results were compared with those obtained with API yeast identification kits. The accuracy of identification and the turnaround time were equivalent for each method, and our cultural method was less expensive.

  7. The utility of including pathology reports in improving the computational identification of patients

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2016-01-01

    Full Text Available Background: Celiac disease (CD) is a common autoimmune disorder. Efficient identification of patients may improve chronic management of the disease. Prior studies have shown that searching International Classification of Diseases-9 (ICD-9) codes alone is inaccurate for identifying patients with CD. In this study, we developed automated classification algorithms leveraging pathology reports and other clinical data in Electronic Health Records (EHRs) to refine the subset population preselected using the ICD-9 code (579.0). Materials and Methods: EHRs were searched for the established ICD-9 code (579.0) suggesting CD, based on which an initial identification of cases was obtained. In addition, laboratory results for tissue transglutaminase were extracted. Using natural language processing we analyzed pathology reports from upper endoscopy. Twelve machine learning classifiers using different combinations of variables related to ICD-9 CD status, laboratory result status, and pathology reports were evaluated to find the best possible CD classifier. Ten-fold cross-validation was used to assess the results. Results: A total of 1498 patient records were used, including 363 confirmed cases and 1135 false positive cases that served as controls. A logistic model based on both clinical and pathology report features produced the best results: Kappa of 0.78, F1 of 0.92, and area under the curve (AUC) of 0.94, whereas in contrast using ICD-9 alone generated poor results: Kappa of 0.28, F1 of 0.75, and AUC of 0.63. Conclusion: Our automated classification system presented an efficient and reliable way to improve the performance of CD patient identification.
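
    The study's best model was a logistic classifier over ICD-9, laboratory, and pathology-report features, assessed by 10-fold cross-validation. A sketch of that evaluation pattern with scikit-learn follows; the feature columns are synthetic stand-ins, not the study's actual variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(42)
n = 1498                                        # cohort size from the abstract
y = (rng.random(n) < 363 / 1498).astype(int)    # ~363 confirmed cases

# Synthetic stand-ins for the study's features: ICD-9 flag, tTG lab status,
# and a pathology-report NLP score (all hypothetical values).
X = np.column_stack([
    np.ones(n),                          # ICD-9 579.0 present (preselection)
    rng.random(n) + 0.8 * y,             # tTG positivity signal
    rng.normal(0, 1, n) + 1.5 * y,       # NLP score from pathology reports
])

clf = LogisticRegression(max_iter=1000)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
print(f"10-fold AUC = {auc.mean():.3f} +/- {auc.std():.3f}")
```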

  8. The Development of DNA Based Methods for the Reliable and Efficient Identification of Nicotiana tabacum in Tobacco and Its Derived Products

    Directory of Open Access Journals (Sweden)

    Sukumar Biswas

    2016-01-01

    Full Text Available Reliable methods are needed to detect the presence of tobacco components in tobacco products, both to effectively control smuggling and to classify tariffs and excises in the tobacco industry, thereby controlling illegal tobacco trade. In this study, two sensitive and specific DNA-based methods, one a quantitative real-time PCR (qPCR) assay and the other a loop-mediated isothermal amplification (LAMP) assay, were developed for the reliable and efficient detection of the presence of tobacco (Nicotiana tabacum) in various tobacco samples and commodities. Both assays targeted the same sequence of the uridine 5′-monophosphate synthase (UMPS), and their specificities and sensitivities were determined with various plant materials. Both qPCR and LAMP methods were reliable and accurate in the rapid detection of tobacco components in various practical samples, including customs samples, reconstituted tobacco samples, and locally purchased cigarettes, showing high potential for their application in tobacco identification, particularly in the special cases where the morphology or chemical compositions of tobacco have been disrupted. Therefore, combining both methods would facilitate not only tobacco smuggling control, but also tariff classification and excise determination.

  9. Software reliability prediction using SPN

    African Journals Online (AJOL)

    Software reliability prediction using SPN. ... In this research, for the computation of software reliability, a component reliability model based on SPN is proposed. An isomorphic Markov ...

  10. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.

  11. Design for reliability information and computer-based systems

    CERN Document Server

    Bauer, Eric

    2010-01-01

    "System reliability, availability and robustness are often not well understood by system architects, engineers and developers. They often don't understand what drives customer's availability expectations, how to frame verifiable availability/robustness requirements, how to manage and budget availability/robustness, how to methodically architect and design systems that meet robustness requirements, and so on. The book takes a very pragmatic approach of framing reliability and robustness as a functional aspect of a system so that architects, designers, developers and testers can address it as a concrete, functional attribute of a system, rather than an abstract, non-functional notion"--Provided by publisher.

  12. Identification of Black Spots Based on Reliability Approach

    Directory of Open Access Journals (Sweden)

    Ahmadreza Ghaffari

    2013-12-01

    Full Text Available Identifying crash “black spots”, “hot spots” or “high-risk” locations is one of the most important and prevalent concerns in traffic safety, and various methods have been devised and presented for solving this issue until now. In this paper, a new method based on reliability analysis is presented to identify black spots. Reliability analysis has an ordered framework for considering the probabilistic nature of engineering problems, so crashes, with their probabilistic nature, can be treated within it. In this study, the application of this new method was compared with the commonly implemented Frequency and Empirical Bayesian methods using simulated data. The results indicated that the traditional methods can lead to inconsistent predictions, due to their failure to consider the variance of the number of crashes at each site and their dependence on the mean of the data.

  13. Neural-net based unstable machine identification using individual energy functions. [Transient disturbances in power systems

    Energy Technology Data Exchange (ETDEWEB)

    Djukanovic, M [Institut Nikola Tesla, Belgrade (Yugoslavia); Sobajic, D J; Pao, Yohhan [Case Western Reserve Univ., Cleveland, OH (United States)

    1991-10-01

    The identification of the mode of instability plays an essential role in generating principal energy boundary hypersurfaces. We present a new method for unstable machine identification based on the use of supervised learning neural-net technology and the adaptive pattern recognition concept. It is shown that, using individual energy functions as pattern features, appropriately trained neural-nets can retrieve a reliable characterization of the transient process, including the critical clearing time parameter, mode of instability, and energy margins. Generalization capabilities of the neural-net processing allow these assessments to be made independently of load levels. The results obtained from computer simulations are presented using the New England power system as an example. (author).

  14. Multi-Disciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song

    1997-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, electrical circuits, etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  15. CERPI and CEREL, two computer codes for the automatic identification and determination of gamma emitters in thermal-neutron-activated samples

    International Nuclear Information System (INIS)

    Giannini, M.; Oliva, P.R.; Ramorino, M.C.

    1979-01-01

    A computer code that automatically analyzes gamma-ray spectra obtained with Ge(Li) detectors is described. The program contains such features as automatic peak location and fitting, determination of peak energies and intensities, nuclide identification, and calculation of masses and errors. Finally, the results obtained with this computer code for a lunar sample are reported and briefly discussed
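
    CERPI and CEREL themselves are not reproduced in the record, but the automatic peak-location step they describe can be sketched on a synthetic Ge(Li) spectrum with a standard peak finder; the spectrum shape and prominence threshold below are invented.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(7)
chan = np.arange(2048)

# Synthetic Ge(Li) spectrum: smooth background + two Gaussian photopeaks.
background = 500 * np.exp(-chan / 900)
def photopeak(c0, area, sigma):
    return area * np.exp(-0.5 * ((chan - c0) / sigma) ** 2)
counts = rng.poisson(background + photopeak(610, 400, 3.0)
                     + photopeak(1332, 250, 3.5))

# Automatic peak location: require prominence well above counting noise.
noise_scale = np.sqrt(np.maximum(counts, 1)).mean()
idx, props = find_peaks(counts, prominence=5 * noise_scale, width=2)
print("peak channels:", idx)           # expect channels near 610 and 1332

# A full code like CERPI would go on to fit each region, convert channel to
# energy with a calibration, and match energies against a nuclide library.
```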

  16. Reliable and reproducible method for rapid identification of Nocardia species by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    Science.gov (United States)

    Toyokawa, Masahiro; Kimura, Keigo; Nishi, Isao; Sunada, Atsuko; Ueda, Akiko; Sakata, Tomomi; Asari, Seishi

    2013-01-01

    Recently, matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) has been applied to the identification of Nocardia species. However, the standard ethanol-formic acid extraction alone is insufficient to allow the membrane proteins of Nocardia species to be ionized by the matrix. We therefore aimed to establish a new extraction method for the MALDI-TOF MS-based identification of Nocardia species isolates. Our modified extraction procedure proceeds through dissociation in 0.5% Tween-20, followed by bacterial heat-inactivation, mechanical breaking of the cell wall by acid-washed glass beads, and protein extraction with formic acid and acetonitrile. As reference methods for species identification, full-length 16S rRNA gene sequencing and some phenotypical tests were used. In a first step, we built our own Nocardia database by analyzing 13 strains (13 different species including N. elegans, N. otitidiscaviarum, N. asiatica, N. abscessus, N. brasiliensis, N. thailandica, N. farcinica, N. nova, N. mikamii, N. cyriacigeorgica, N. asteroides, Nocardiopsis alba, and Micromonospora sp.) and registered them in the MALDI BioTyper database to establish our reference database. The analysis of 12 challenge strains using our database gave 100% correct identification, including 8 strains identified to the species level and 4 strains to the genus level (N. elegans, N. nova, N. farcinica, Micromonospora sp.), according to the manufacturer's log-score specifications. In the assessment of the reproducibility of our method on 4 strains, both within-run and between-run reproducibility were excellent. These data indicate that our method for the rapid identification of Nocardia species is reliable, reproducible, and cost-effective.

  17. Identification of control targets in Boolean molecular network models via computational algebra.

    Science.gov (United States)

    Murrugarra, David; Veliz-Cuba, Alan; Aguilar, Boris; Laubenbacher, Reinhard

    2016-09-23

    Many problems in biomedicine and other areas of the life sciences can be characterized as control problems, with the goal of finding strategies to change a disease or otherwise undesirable state of a biological system into another, more desirable, state through an intervention, such as a drug or other therapeutic treatment. The identification of such strategies is typically based on a mathematical model of the process to be altered through targeted control inputs. This paper focuses on processes at the molecular level that determine the state of an individual cell, involving signaling or gene regulation. The mathematical model type considered is that of Boolean networks. The potential control targets can be represented by a set of nodes and edges that can be manipulated to produce a desired effect on the system. This paper presents a method for the identification of potential intervention targets in Boolean molecular network models using algebraic techniques. The approach exploits an algebraic representation of Boolean networks to encode the control candidates in the network wiring diagram as the solutions of a system of polynomial equations, and then uses computational algebra techniques to find such controllers. The control methods in this paper are validated through the identification of combinatorial interventions in the signaling pathways of previously reported control targets in two well-studied systems, a p53-mdm2 network and a blood T cell lymphocyte granular leukemia survival signaling network. Supplementary data is available online and our code in Macaulay2 and Matlab is available via http://www.ms.uky.edu/~dmu228/ControlAlg . This paper presents a novel method for the identification of intervention targets in Boolean network models. The results in this paper show that the proposed methods are useful and efficient for moderately large networks.
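
    The encoding idea can be illustrated without the paper's Macaulay2 code: over GF(2), AND becomes multiplication, OR becomes x + y + xy, and NOT becomes 1 + x, so a Boolean network is a polynomial dynamical system whose control candidates can be searched. The toy network and target state below are invented, not the p53-mdm2 or T-LGL models, and the brute-force search stands in for the paper's algebraic solver.

```python
from itertools import product

# Toy 3-node Boolean network (hypothetical wiring).
# Over GF(2): AND -> x*y, OR -> x + y + x*y, NOT -> 1 + x (all mod 2).
def step(state, pin=None):
    x, y, z = state
    nxt = [(y * z) % 2,            # x' = y AND z
           (x + z + x * z) % 2,    # y' = x OR z
           (1 + x) % 2]            # z' = NOT x
    if pin is not None:            # control: hold one node at a constant
        node, value = pin
        nxt[node] = value
    return tuple(nxt)

def attractor(state, pin=None):
    seen = []
    while state not in seen:       # iterate until the trajectory repeats
        seen.append(state)
        state = step(state, pin)
    return state                   # first revisited state (cycle entry)

desired = (0, 1, 1)                # the "healthy" fixed point we want
for node, value in product(range(3), (0, 1)):
    pin = (node, value)
    # require desired to be a fixed point under the pin, and reachable
    # from every one of the 2^3 initial states
    if step(desired, pin) == desired and all(
            attractor(s, pin) == desired for s in product((0, 1), repeat=3)):
        print(f"pinning node {node} to {value} drives all states to {desired}")
```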

  18. Logistic regression model for identification of right ventricular dysfunction in patients with acute pulmonary embolism by means of computed tomography

    International Nuclear Information System (INIS)

    Staskiewicz, Grzegorz; Czekajska-Chehab, Elżbieta; Uhlig, Sebastian; Przegalinski, Jerzy; Maciejewski, Ryszard; Drop, Andrzej

    2013-01-01

    Purpose: Diagnosis of right ventricular dysfunction in patients with acute pulmonary embolism (PE) is known to be associated with increased risk of mortality. The aim of the study was to calculate a logistic regression model for reliable identification of right ventricular dysfunction (RVD) in patients diagnosed with computed tomography pulmonary angiography. Material and methods: Ninety-seven consecutive patients with acute pulmonary embolism were divided into groups with and without RVD based upon echocardiographic measurement of pulmonary artery systolic pressure (PASP). PE severity was graded with the pulmonary obstruction score. CT measurements of heart chambers and mediastinal vessels were performed; the position of the interventricular septum and the presence of contrast reflux into the inferior vena cava were also recorded. The logistic regression model was prepared by means of stepwise logistic regression. Results: Among the parameters used, the final model consisted of the pulmonary obstruction score, the short-axis diameter of the right ventricle, and the diameter of the inferior vena cava. The calculated model is characterized by 79% sensitivity and 81% specificity, and its performance was significantly better than single CT-based measurements. Conclusion: The logistic regression model identifies RVD significantly better than single CT-based measurements.

  19. Emergency diesel generator reliability program

    International Nuclear Information System (INIS)

    Serkiz, A.W.

    1989-01-01

    The need for an emergency diesel generator (EDG) reliability program has been established by 10 CFR Part 50, Section 50.63, Loss of All Alternating Current Power, which requires that utilities assess their station blackout duration and recovery capability. EDGs are the principal emergency ac power sources for coping with a station blackout. Regulatory Guide 1.155, Station Blackout, identifies a need for (1) an EDG reliability equal to or greater than 0.95, and (2) an EDG reliability program to monitor and maintain the required levels. The resolution of Generic Safety Issue (GSI) B-56 embodies the identification of a suitable EDG reliability program structure, revision of pertinent regulatory guides and Tech Specs, and development of an Inspection Module. Resolution of B-56 is coupled to the resolution of Unresolved Safety Issue (USI) A-44, Station Blackout, which resulted in the station blackout rule, 10 CFR 50.63 and Regulatory Guide 1.155, Station Blackout. This paper discusses the principal elements of an EDG reliability program developed for resolving GSI B-56 and related matters

  20. System reliability with correlated components: Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, A.C.W.M.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing

  1. System reliability with correlated components : Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, T.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing

  2. Fuel reliability experience in Finland

    International Nuclear Information System (INIS)

    Kekkonen, L.

    2015-01-01

    Four nuclear reactors have now operated in Finland for 35-38 years. The two VVER-440 units at the Loviisa Nuclear Power Plant are operated by Fortum and the two BWRs in Olkiluoto are operated by Teollisuuden Voima Oyj (TVO). The fuel reliability experience of the four reactors currently operating in Finland has been very good and fuel failure rates have been very low. Systematic inspection of spent fuel assemblies, and especially of all failed assemblies, is a good practice that is employed in Finland in order to improve fuel reliability and operational safety. Investigation of the root cause of fuel failures is important in developing ways to prevent similar failures in the future. The operational and fuel reliability experience at the Loviisa Nuclear Power Plant has also been reported earlier in the international seminars on WWER Fuel Performance, Modelling and Experimental Support. In this paper the information on fuel reliability experience at Loviisa NPP is updated and a short summary of the fuel reliability experience at Olkiluoto NPP is also given. Keywords: VVER-440, fuel reliability, operational experience, poolside inspections, fuel failure identification. (author)

  3. Rapid and reliable identification of Gram-negative bacteria and Gram-positive cocci by deposition of bacteria harvested from blood cultures onto the MALDI-TOF plate.

    Science.gov (United States)

    Barnini, Simona; Ghelardi, Emilia; Brucculeri, Veronica; Morici, Paola; Lupetti, Antonella

    2015-06-18

    Rapid identification of the causative agent(s) of bloodstream infections using the matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) methodology can increase the appropriateness of empirical antimicrobial therapy. Herein, we aimed at establishing an easier and simpler method, further referred to as the direct method, using bacteria harvested by serum separator tubes from positive blood cultures and placed onto the polished steel target plate for rapid identification by MALDI-TOF. The results by the direct method were compared with those obtained by MALDI-TOF on bacteria isolated on solid media. Identification of Gram-negative bacilli was 100% concordant using the direct method or MALDI-TOF on isolated bacteria (96% with score > 2.0). These two methods were 90% concordant on Gram-positive cocci (32% with score > 2.0). Identification by the SepsiTyper method of Gram-positive cocci gave concordant results with MALDI-TOF on isolated bacteria in 87% of cases (37% with score > 2.0). The direct method herein developed allows rapid identification (within 30 min) of Gram-negative bacteria and Gram-positive cocci from positive blood cultures and can be used to rapidly report reliable and accurate results, without requiring skilled personnel or the use of expensive kits.

  4. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    Science.gov (United States)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application for RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluation and Hessian calculation of probabilistic constraints. In this article, a new improved stability transformation method is proposed to search the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to the existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.

  5. JUPITER: Joint Universal Parameter IdenTification and Evaluation of Reliability - An Application Programming Interface (API) for Model Analysis

    Science.gov (United States)

    Banta, Edward R.; Poeter, Eileen P.; Doherty, John E.; Hill, Mary C.

    2006-01-01

    The Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER API) improves the computer programming resources available to those developing applications (computer programs) for model analysis. The JUPITER API consists of eleven Fortran-90 modules that provide for encapsulation of data and operations on that data. Each module contains one or more entities: data, data types, subroutines, functions, and generic interfaces. The modules do not constitute computer programs themselves; instead, they are used to construct computer programs. Such computer programs are called applications of the API. The API provides common modeling operations for use by a variety of computer applications. The models being analyzed are referred to here as process models, and may, for example, represent the physics, chemistry, and(or) biology of a field or laboratory system. Process models commonly are constructed using published models such as MODFLOW (Harbaugh et al., 2000; Harbaugh, 2005), MT3DMS (Zheng and Wang, 1996), HSPF (Bicknell et al., 1997), PRMS (Leavesley and Stannard, 1995), and many others. The process model may be accessed by a JUPITER API application as an external program, or it may be implemented as a subroutine within a JUPITER API application. In either case, execution of the model takes place in a framework designed by the application programmer. This framework can be designed to take advantage of any parallel-processing capabilities possessed by the process model, as well as the parallel-processing capabilities of the JUPITER API. Model analyses for which the JUPITER API could be useful include, for example: comparing model results to observed values to determine how well the model reproduces system processes and characteristics; using sensitivity analysis to determine the information provided by observations to parameters and predictions of interest; and determining the additional data needed to improve selected model

  6. Measurement of transplanted pancreatic volume using computed tomography: reliability by intra- and inter-observer variability

    International Nuclear Information System (INIS)

    Lundqvist, Eva; Segelsjoe, Monica; Magnusson, Anders; Andersson, Anna; Biglarnia, Ali-Reza

    2012-01-01

    Background Unlike other solid organ transplants, pancreas allografts can undergo a substantial decrease in baseline volume after transplantation. This phenomenon has not been well characterized, as there are insufficient data on reliable and reproducible volume assessments. We hypothesized that characterization of pancreatic volume by means of computed tomography (CT) could be a useful method for clinical follow-up in pancreas transplant patients. Purpose To evaluate the feasibility and reliability of pancreatic volume assessment using CT scan in transplanted patients. Material and Methods CT examinations were performed on 21 consecutive patients undergoing pancreas transplantation. Volume measurements were carried out by two observers tracing the pancreatic contours in all slices. The observers performed the measurements twice for each patient. Differences in volume measurement were used to evaluate intra- and inter-observer variability. Results The intra-observer variability for the pancreatic volume measurements of Observers 1 and 2 was found to be in almost perfect agreement, with an intraclass correlation coefficient (ICC) of 0.90 (0.77-0.96) and 0.99 (0.98-1.0), respectively. Regarding inter-observer validity, the ICCs for the first and second measurements were 0.90 (range, 0.77-0.96) and 0.95 (range, 0.85-0.98), respectively. Conclusion CT volumetry is a reliable and reproducible method for measurement of transplanted pancreatic volume

  7. Measurement of transplanted pancreatic volume using computed tomography: reliability by intra- and inter-observer variability

    Energy Technology Data Exchange (ETDEWEB)

    Lundqvist, Eva; Segelsjoe, Monica; Magnusson, Anders [Uppsala Univ., Dept. of Radiology, Oncology and Radiation Science, Section of Radiology, Uppsala (Sweden)], E-mail: eva.lundqvist.8954@student.uu.se; Andersson, Anna; Biglarnia, Ali-Reza [Dept. of Surgical Sciences, Section of Transplantation Surgery, Uppsala Univ. Hospital, Uppsala (Sweden)

    2012-11-15

    Background Unlike other solid organ transplants, pancreas allografts can undergo a substantial decrease in baseline volume after transplantation. This phenomenon has not been well characterized, as there are insufficient data on reliable and reproducible volume assessments. We hypothesized that characterization of pancreatic volume by means of computed tomography (CT) could be a useful method for clinical follow-up in pancreas transplant patients. Purpose To evaluate the feasibility and reliability of pancreatic volume assessment using CT scan in transplanted patients. Material and Methods CT examinations were performed on 21 consecutive patients undergoing pancreas transplantation. Volume measurements were carried out by two observers tracing the pancreatic contours in all slices. The observers performed the measurements twice for each patient. Differences in volume measurement were used to evaluate intra- and inter-observer variability. Results The intra-observer variability for the pancreatic volume measurements of Observers 1 and 2 was found to be in almost perfect agreement, with an intraclass correlation coefficient (ICC) of 0.90 (0.77-0.96) and 0.99 (0.98-1.0), respectively. Regarding inter-observer validity, the ICCs for the first and second measurements were 0.90 (range, 0.77-0.96) and 0.95 (range, 0.85-0.98), respectively. Conclusion CT volumetry is a reliable and reproducible method for measurement of transplanted pancreatic volume.

  8. Development of a multilocus-based approach for sponge (phylum Porifera) identification: refinement and limitations.

    Science.gov (United States)

    Yang, Qi; Franco, Christopher M M; Sorokin, Shirley J; Zhang, Wei

    2017-02-02

    For sponges (phylum Porifera), there is no reliable molecular protocol available for species identification. To address this gap, we developed a multilocus-based Sponge Identification Protocol (SIP) validated by a sample of 37 sponge species belonging to 10 orders from South Australia. The universal barcode COI mtDNA, 28S rRNA gene (D3-D5), and the nuclear ITS1-5.8S-ITS2 region were evaluated for their suitability and capacity for sponge identification. The highest Bit Score was applied to infer the identity. The reliability of SIP was validated by phylogenetic analysis. The 28S rRNA gene and COI mtDNA performed better than the ITS region in classifying sponges at various taxonomic levels. A major limitation is that the databases are not well populated and possess low diversity, making it difficult to conduct the molecular identification protocol. The identification is also impacted by the accuracy of the morphological classification of the sponges whose sequences have been submitted to the database. Re-examination of the morphological identification further demonstrated and improved the reliability of sponge identification by SIP. Integrated with morphological identification, the multilocus-based SIP offers an improved protocol for more reliable and effective sponge identification, by coupling the accuracy of different DNA markers.

  9. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. The uncertainty could be due to variations in the geometry of the part or in material properties, or due to a lack of knowledge about the phenomenon being modeled itself. Deterministic design optimization does not take uncertainty into account, and worst-case-scenario assumptions lead to vastly over-conservative designs. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions under the presence of uncertainty in the design process. Probabilistic design optimization often involves a double-loop procedure of optimization and iterative probabilistic assessment, which results in a high computational demand. This demand can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing the Sequential Optimization and Reliability Assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment; the deterministic optimization and the reliability assessment are decoupled in each cycle. This leads to quick improvement of the design from one cycle to the next and an increase in computational efficiency. This paper demonstrates the effectiveness of the SORA method when applied to designing a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive finite element simulations.
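
    The SORA decoupling can be sketched on a toy problem: each cycle runs a deterministic optimization with the probabilistic constraint shifted by the previous cycle's most-probable-point offset, then re-assesses reliability. The limit state, cost function, and reliability target below are invented; the flanging simulation is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Toy RBDO: choose means d of X ~ N(d, sigma^2 I) to minimize cost d1^2 + d2^2
# subject to P(X1 + X2 - 5 < 0) <= Phi(-beta_t). All values are illustrative.
sigma, beta_t = 0.3, 2.0

def g(x):                      # limit state: failure when g < 0
    return x[0] + x[1] - 5.0

d = np.array([3.0, 3.0])       # initial design
shift = np.zeros(2)
for cycle in range(5):
    # 1) Deterministic optimization with the constraint shifted to the MPP
    #    found in the previous cycle (the SORA single-loop idea).
    res = minimize(lambda d_: d_ @ d_, d,
                   constraints={"type": "ineq",
                                "fun": lambda d_: g(d_ - shift)})
    d = res.x
    # 2) Reliability assessment: for this linear g with independent normals,
    #    the MPP sits beta_t standard deviations into the failure direction.
    alpha = np.array([1.0, 1.0]) / np.sqrt(2.0)
    shift = beta_t * sigma * alpha          # offset d - x_mpp, reused next cycle
    beta = g(d) / (sigma * np.sqrt(2.0))    # actual reliability index
    print(f"cycle {cycle}: d = {d.round(4)}, beta = {beta:.3f}")

pf = norm.cdf(-g(d) / (sigma * np.sqrt(2.0)))
print(f"final failure probability ~ {pf:.2e}, target {norm.cdf(-beta_t):.2e}")
```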

  10. CERPI and CEREL, two computer codes for the automatic identification and determination of gamma emitters in thermal neutron activated samples

    International Nuclear Information System (INIS)

    Giannini, M.; Oliva, P.R.; Ramorino, C.

    1978-01-01

    A description is given of a computer code which automatically analyses gamma-ray spectra obtained with Ge(Li) detectors. The program contains features such as automatic peak location and fitting, determination of peak energies and intensities, nuclide identification, and calculation of masses and errors. Finally, the results obtained with our computer code for a lunar sample are reported and briefly discussed.

  11. Optical identifications of radio sources in the 5C 7 survey

    International Nuclear Information System (INIS)

    Perryman, M.A.C.

    1979-01-01

    An identification procedure developed for the deep radio survey 5C 6 has been refined and applied to the 5C 7 survey. Positions and finding charts are presented for candidate identifications from deep plates taken with the Palomar 48-inch Schmidt telescope. The identification statistics are in good agreement with the 5C 6 results, the accurate radio positions obtained at 1407 MHz defining a reasonably reliable and complete sample of associations with an identification rate of about 40 per cent. At 408 MHz the positional uncertainties are larger and the identifications are thus of lower reliability; the identification rate is about 20 per cent. The results are in good agreement with the assumptions that the optical identifications are coincident with the radio centroids, and that the identifications are not preferentially associated with faint clusters. (author)

  12. Reliability and validity of the revised Gibson Test of Cognitive Skills, a computer-based test battery for assessing cognition across the lifespan.

    Science.gov (United States)

    Moore, Amy Lawson; Miller, Terissa M

    2018-01-01

    The purpose of the current study is to evaluate the validity and reliability of the revised Gibson Test of Cognitive Skills, a computer-based battery of tests measuring short-term memory, long-term memory, processing speed, logic and reasoning, visual processing, as well as auditory processing and word attack skills. This study included 2,737 participants aged 5-85 years. A series of studies was conducted to examine the validity and reliability using the test performance of the entire norming group and several subgroups. The evaluation of the technical properties of the test battery included content validation by subject matter experts, item analysis and coefficient alpha, test-retest reliability, split-half reliability, and analysis of concurrent validity with the Woodcock Johnson III Tests of Cognitive Abilities and Tests of Achievement. Results indicated strong sources of evidence of validity and reliability for the test, including internal consistency reliability coefficients ranging from 0.87 to 0.98, test-retest reliability coefficients ranging from 0.69 to 0.91, split-half reliability coefficients ranging from 0.87 to 0.91, and concurrent validity coefficients ranging from 0.53 to 0.93. The Gibson Test of Cognitive Skills-2 is a reliable and valid tool for assessing cognition in the general population across the lifespan.
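
    The internal-consistency statistics reported above are straightforward to compute from item-level scores. A minimal sketch of coefficient alpha and Spearman-Brown-corrected split-half reliability on simulated data (not the norming sample):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_subjects, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def split_half(items):
    """Odd-even split with Spearman-Brown correction to full length."""
    odd, even = items[:, 0::2].sum(axis=1), items[:, 1::2].sum(axis=1)
    r = np.corrcoef(odd, even)[0, 1]
    return 2 * r / (1 + r)

rng = np.random.default_rng(3)
ability = rng.normal(0, 1, size=(200, 1))                 # latent trait
scores = ability + rng.normal(0, 0.8, size=(200, 10))     # 10 noisy items
print(f"alpha = {cronbach_alpha(scores):.2f}, "
      f"split-half = {split_half(scores):.2f}")
```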

  13. Photo-identification as a technique for recognition of individual fish: a test with the freshwater armored catfish Rineloricaria aequalicuspis Reis & Cardoso, 2001 (Siluriformes: Loricariidae)

    Directory of Open Access Journals (Sweden)

    Renato B. Dala-Corte

    Full Text Available Photo-identification allows individual recognition of animal species based on natural marks, being an alternative to other, more stressful artificial tagging/marking techniques. An increasing number of studies with different animal groups have shown that photo-identification can be used successfully in several situations, but its feasibility for studying freshwater fishes has yet to be explored. We demonstrate the potential use of photo-identification for intraspecific recognition of individuals in the stream-dwelling loricariid Rineloricaria aequalicuspis. We tested photo-identification in laboratory and field conditions based on the interindividual variability in abdominal bony plates. Our test yielded high rates of correct matches in both laboratory (100%) and field (>97%) conditions, comparable to other reliable techniques and to studies that successfully used photo-identification in other animals. In field conditions, the number of correct matches did not differ statistically between computer-assisted and naked-eye identification. However, the average time expended to conclude computer-assisted photo evaluations was about half that expended on naked-eye evaluations. This advantage may be amplified when using databases with large numbers of images. Our results indicate that photo-identification can be a feasible alternative technique for studying freshwater fish species, allowing for a wider use of mark-recapture in ecological and behavioral studies.

  14. Improving reliability of state estimation programming and computing suite based on analyzing a fault tree

    Directory of Open Access Journals (Sweden)

    Kolosok Irina

    2017-01-01

    Full Text Available Reliable information on the current state parameters, obtained by processing SCADA and WAMS measurements with state estimation (SE) methods, is a precondition for successful management of an electric power system (EPS). SCADA and WAMS systems, like any technical systems, are subject to failures and faults that lead to distortion and loss of information. The SE procedure detects erroneous measurements and thus acts as a barrier preventing distorted information from propagating into control problems. At the same time, the programming and computing suite (PCS) implementing the SE functions may itself produce a wrong decision owing to imperfect algorithms and software errors. In this study, we propose to use a fault tree to analyze the consequences of failures and faults in SCADA, WAMS, and the SE procedure itself. Based on the analysis of the obtained measurement information and the SE results, we determine the fault tolerance level of the state estimation PCS, characterizing its reliability.
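
    To make the fault-tree idea concrete, here is a minimal sketch of top-event evaluation with AND/OR gates over independent basic events. The gate structure and all probabilities below are invented for illustration and are not the tree analyzed in this paper.

        # Gate formulas for independent inputs.
        def p_and(*ps):
            # AND gate: the gate fails only if all inputs fail.
            out = 1.0
            for p in ps:
                out *= p
            return out

        def p_or(*ps):
            # OR gate: the gate fails if any input fails.
            out = 1.0
            for p in ps:
                out *= 1.0 - p
            return 1.0 - out

        # Illustrative basic-event probabilities.
        p_scada = 1e-3       # SCADA telemetry channel fault
        p_wams = 5e-4        # WAMS phasor measurement fault
        p_se_bug = 1e-4      # software error in the SE suite
        p_missed = 0.05      # bad data evades the SE bad-data detection

        # Top event: the SE delivers a wrong state estimate.
        p_top = p_or(p_and(p_or(p_scada, p_wams), p_missed), p_se_bug)
        print(f"P(top event) = {p_top:.3e}")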

  15. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and the BOUNDS codes. Two reference study cases were executed by each code. The results obtained from the logic/probabilistic analysis, as well as the computation times, are compared

  16. Reliability Lessons Learned From GPU Experience With The Titan Supercomputer at Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gallarno, George [Christian Brothers University; Rogers, James H [ORNL; Maxwell, Don E [ORNL

    2015-01-01

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large-scale. The world's second fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large-scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  17. ESCAF - Failure simulation and reliability calculation device

    International Nuclear Information System (INIS)

    Laviron, A.; Berard, C.; Quenee, R.

    1979-01-01

    Reliability studies of nuclear power plant safety functions have, up to now, required the use of large computers. Being general-purpose machines, these big computers are not well adapted to dealing with reliability problems at low cost. ESCAF has been developed as a substitute for large computers, in order to save time and money. ESCAF is a small electronic device which can be used in connection with a minicomputer. It allows complex system reliability analyses (qualitative and quantitative) to be performed and the influence of critical elements, such as common cause failures, to be studied. In this paper, the device is described and its features and abilities are outlined: easy to implement, swift running, low operating cost. Its application range covers all cases where good reliability is needed

  18. Translation, Validation, and Reliability of the Dutch Late-Life Function and Disability Instrument Computer Adaptive Test.

    Science.gov (United States)

    Arensman, Remco M; Pisters, Martijn F; de Man-van Ginkel, Janneke M; Schuurmans, Marieke J; Jette, Alan M; de Bie, Rob A

    2016-09-01

    Adequate and user-friendly instruments for assessing physical function and disability in older adults are vital for estimating and predicting health care needs in clinical practice. The Late-Life Function and Disability Instrument Computer Adaptive Test (LLFDI-CAT) is a promising instrument for assessing physical function and disability in gerontology research and clinical practice. The aims of this study were: (1) to translate the LLFDI-CAT to the Dutch language and (2) to investigate its validity and reliability in a sample of older adults who spoke Dutch and dwelled in the community. For the assessment of validity of the LLFDI-CAT, a cross-sectional design was used. To assess reliability, measurement of the LLFDI-CAT was repeated in the same sample. The item bank of the LLFDI-CAT was translated with a forward-backward procedure. A sample of 54 older adults completed the LLFDI-CAT, World Health Organization Disability Assessment Schedule 2.0, RAND 36-Item Short-Form Health Survey physical functioning scale (10 items), and 10-Meter Walk Test. The LLFDI-CAT was repeated in 2 to 8 days (mean=4.5 days). Pearson's r and the intraclass correlation coefficient (ICC) (2,1) were calculated to assess validity, group-level reliability, and participant-level reliability. A correlation of .74 for the LLFDI-CAT function scale and the RAND 36-Item Short-Form Health Survey physical functioning scale (10 items) was found. The correlations of the LLFDI-CAT disability scale with the World Health Organization Disability Assessment Schedule 2.0 and the 10-Meter Walk Test were -.57 and -.53, respectively. The ICC (2,1) of the LLFDI-CAT function scale was .84, with a group-level reliability score of .85. The ICC (2,1) of the LLFDI-CAT disability scale was .76, with a group-level reliability score of .81. The high percentage of women in the study and the exclusion of older adults with recent joint replacement or hospitalization limit the generalizability of the results. The Dutch LLFDI

  19. Electronic structure of BN-aromatics: Choice of reliable computational tools

    Science.gov (United States)

    Mazière, Audrey; Chrostowska, Anna; Darrigan, Clovis; Dargelos, Alain; Graciaa, Alain; Chermette, Henry

    2017-10-01

    The importance of having reliable calculation tools to interpret and predict the electronic properties of BN-aromatics is directly linked to the growing interest for these very promising new systems in the field of materials science, biomedical research, or energy sustainability. Ionization energy (IE) is one of the most important parameters to approach the electronic structure of molecules. It can be theoretically estimated, but in order to evaluate the pertinence of such estimates and propose the most reliable tools for the evaluation of different electronic properties of existent or only imagined BN-containing compounds, we took as reference experimental values of ionization energies provided by ultra-violet photoelectron spectroscopy (UV-PES) in gas phase—the only technique giving access to the energy levels of filled molecular orbitals. Thus, a set of 21 aromatic molecules containing B-N bonds and B-N-B patterns has been assembled for a comparison between experimental IEs obtained by UV-PES and various theoretical approaches for their estimation. Time-Dependent Density Functional Theory (TD-DFT) methods using B3LYP and long-range corrected CAM-B3LYP functionals are used, combined with the Δ SCF approach, and compared with electron propagator theory such as outer valence Green's function (OVGF, P3) and symmetry adapted cluster-configuration interaction ab initio methods. Direct Kohn-Sham estimation and "corrected" Kohn-Sham estimation are also given. The deviation between experimental and theoretical values is computed for each molecule, and a statistical study is performed over the average and the root mean square for the whole set and sub-sets of molecules. It is shown that (i) Δ SCF+TDDFT(CAM-B3LYP), OVGF, and P3 are the most efficient approaches for good agreement with UV-PES values, (ii) a CAM-B3LYP range-separated hybrid functional is significantly better than B3LYP for the purpose, especially for extended conjugated systems, and (iii) the "corrected" Kohn-Sham result is a

  20. Dynamical Models for Computer Viruses Propagation

    Directory of Open Access Journals (Sweden)

    José R. C. Piqueira

    2008-01-01

    Full Text Available Nowadays, digital computer systems and networks are the main engineering tools, being used in the planning, design, operation, and control of buildings, transportation, machinery, business, and life-maintaining devices of all sizes. Consequently, computer viruses have become one of the most important sources of uncertainty, contributing to a decrease in the reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work much like vaccines against diseases and are not able to prevent new infections based on the network state. Here, a trial on modeling computer virus propagation dynamics relates it to other notable events occurring in the network, permitting preventive policies to be established in network management. Data from three different viruses were collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using data collected from other viruses that formerly infected the network.
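
    A small sketch of the autoregressive identification step may clarify how such forecasting works. The infection counts below are synthetic and the model order is arbitrary; the paper's virus data and exact fitting procedure are not reproduced.

        import numpy as np

        def fit_ar(y, p):
            # Least-squares AR(p): y[t] ~ a1*y[t-1] + ... + ap*y[t-p].
            rows = [y[t - p:t][::-1] for t in range(p, len(y))]
            coef, *_ = np.linalg.lstsq(np.array(rows), y[p:], rcond=None)
            return coef

        def forecast(y, coef, steps):
            # Iterate the fitted recursion forward from the end of the data.
            hist, p = list(y), len(coef)
            for _ in range(steps):
                hist.append(float(np.dot(coef, hist[-1:-p - 1:-1])))
            return hist[len(y):]

        rng = np.random.default_rng(1)
        t = np.arange(200)
        infections = (50 + 30 * np.exp(-0.02 * t) * np.sin(0.3 * t)
                      + rng.normal(0, 1, t.size))

        coef = fit_ar(infections, p=4)
        print("AR(4) coefficients:", np.round(coef, 3))
        print("5-step forecast:", np.round(forecast(infections, coef, 5), 1))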

  1. Reliability-based optimal structural design by the decoupling approach

    International Nuclear Information System (INIS)

    Royset, J.O.; Der Kiureghian, A.; Polak, E.

    2001-01-01

    A decoupling approach for solving optimal structural design problems involving reliability terms in the objective function, the constraint set or both is discussed and extended. The approach employs a reformulation of each problem, in which reliability terms are replaced by deterministic functions. The reformulated problems can be solved by existing semi-infinite optimization algorithms and computational reliability methods. It is shown that the reformulated problems produce solutions that are identical to those of the original problems when the limit-state functions defining the reliability problem are affine. For nonaffine limit-state functions, approximate solutions are obtained by solving series of reformulated problems. An important advantage of the approach is that the required reliability and optimization calculations are completely decoupled, thus allowing flexibility in the choice of the optimization algorithm and the reliability computation method
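
    The reformulation idea can be stated compactly. The notation below is ours (a hedged sketch, not necessarily the authors' exact formulation): a failure-probability constraint is replaced by a deterministic, semi-infinite constraint posed in standard normal space, which is why semi-infinite optimization algorithms apply.

        % Original reliability-constrained problem:
        \min_{d}\; c(d) \quad \text{s.t.} \quad
            P\bigl[g(d, X) \le 0\bigr] \le \bar{p}
        % Reformulated (first-order) deterministic problem:
        \min_{d}\; c(d) \quad \text{s.t.} \quad
            g(d, x) > 0 \;\; \forall\, x \in
            \bigl\{\, x : \lVert T(x) \rVert \le \bar{\beta} \,\bigr\}
        % where T maps X into standard normal space and
        % \bar{\beta} = -\Phi^{-1}(\bar{p}) is the target reliability index.
        % For affine limit-state functions g the two problems coincide,
        % matching the exactness result stated in the abstract.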

  2. Common-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CROSSER, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), used independently of one another. Point of equality between reliability of system and common reliability of components found. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
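
    The crossover computation is simple to sketch. Below is our own Python reimplementation of the idea for a k-of-n system (CROSSER itself is written in C, and its exact interface is not documented here); the system structure and numbers are illustrative.

        from math import comb

        def system_reliability(p, n, k):
            # k-of-n:G system: works if at least k of the n components work.
            return sum(comb(n, i) * p**i * (1 - p)**(n - i)
                       for i in range(k, n + 1))

        def crossover(n, k, tol=1e-12):
            # Bisection on f(p) = R_sys(p) - p over the open interval (0, 1).
            f = lambda p: system_reliability(p, n, k) - p
            lo, hi = 1e-9, 1 - 1e-9
            if f(lo) * f(hi) > 0:
                return None                  # no interior crossing
            while hi - lo > tol:
                mid = (lo + hi) / 2
                lo, hi = (mid, hi) if f(lo) * f(mid) > 0 else (lo, mid)
            return (lo + hi) / 2

        # A 4-of-5 system: below ~0.87 the system is less reliable than a
        # single component; above it, more reliable.
        print(f"crossover reliability: {crossover(5, 4):.4f}")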

  3. A reliability-risk modelling of nuclear rad-waste facilities

    International Nuclear Information System (INIS)

    Lehmann, P.H.; El-Bassioni, A.A.

    1975-01-01

    Rad-waste disposal systems of nuclear power sites are designed and operated to collect, delay, contain, and concentrate radioactive wastes from reactor plant processes such that on-site and off-site exposures to radiation are well below permissible limits. To assist the designer in achieving minimum release/exposure goals, a computerized reliability-risk model has been developed to simulate the rad-waste system. The objectives of the model are to furnish a practical tool for quantifying the effects of changes in system configuration, operation, and equipment, and for the identification of weak segments in the system design. Primarily, the model comprises a marriage of system analysis, reliability analysis, and release-risk assessment. Provisions have been included in the model to permit the optimization of the system design subject to constraints on cost and rad-releases. The system analysis phase involves the preparation of a physical and functional description of the rad-waste facility accompanied by the formation of a system tree diagram. The reliability analysis phase embodies the formulation of appropriate reliability models and the collection of model parameters. Release-risk assessment constitutes the analytical basis whereupon further system and reliability analyses may be warranted. Release-risk represents the potential for release of radioactivity and is defined as the product of an element's unreliability at time t and the radioactivity available for release in time interval Δt. A computer code (RARISK) has been written to simulate the tree diagram of the rad-waste system. Reliability and release-risk results have been generated for cases which examined the process flow paths of typical rad-waste systems, the effects of repair and standby, variations of equipment failure and repair rates, and changes in system configurations. The essential feature of this model is that a complex system like the rad-waste facility can be easily decomposed into its
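
    The release-risk measure defined above lends itself to a short numeric sketch. The failure model and all parameter values below are invented for illustration; they are not taken from the report.

        import numpy as np

        def unreliability(t, rate):
            # Exponential failure model: Q(t) = 1 - exp(-lambda * t).
            return 1.0 - np.exp(-rate * t)

        def release_risk(t, dt, rate, activity_rate):
            # Release-risk = unreliability at time t multiplied by the
            # radioactivity available for release in the interval dt.
            return unreliability(t, rate) * activity_rate * dt

        lam = 1e-4      # failures per hour (illustrative)
        a_rate = 2.0    # Ci of activity accumulating per hour (illustrative)
        for t in (100.0, 1000.0, 5000.0):
            risk = release_risk(t, 24.0, lam, a_rate)
            print(f"t = {t:6.0f} h   release-risk = {risk:8.3f} Ci")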

  4. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times, are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)

  5. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times, are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  6. Online Reliable Peak Charge/Discharge Power Estimation of Series-Connected Lithium-Ion Battery Packs

    Directory of Open Access Journals (Sweden)

    Bo Jiang

    2017-03-01

    Full Text Available The accurate peak power estimation of a battery pack is essential to the power-train control of electric vehicles (EVs). It helps to evaluate the maximum charge and discharge capability of the battery system, and thus to optimally control the power-train system to meet the requirements of acceleration, gradient climbing and regenerative braking while achieving high energy efficiency. A novel online peak power estimation method for series-connected lithium-ion battery packs is proposed, which considers the influence of cell difference on the peak power of the battery packs. A new parameter identification algorithm based on adaptive ratio vectors is designed to identify online the parameters of each individual cell in a series-connected battery pack. The ratio vectors reflecting cell difference are deduced strictly based on the analysis of battery characteristics. Based on the online parameter identification, the peak power estimation considering cell difference is further developed. Some validation experiments in different battery aging conditions and with different current profiles have been implemented to verify the proposed method. The results indicate that the ratio vector-based identification algorithm can achieve the same accuracy as the repetitive RLS (recursive least squares)-based identification while evidently reducing the computation cost, and the proposed peak power estimation method is more effective and reliable for series-connected battery packs due to the consideration of cell difference.
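
    For readers unfamiliar with the RLS baseline the authors compare against, here is a generic recursive-least-squares sketch identifying a deliberately simple cell model. The regressor, plant, and noise levels are invented; the paper's ratio-vector scheme is not reproduced.

        import numpy as np

        def rls_update(theta, P, phi, y, lam=0.99):
            # One RLS step with forgetting factor lam.
            K = P @ phi / (lam + phi @ P @ phi)
            theta = theta + K * (y - phi @ theta)
            P = (P - np.outer(K, phi @ P)) / lam
            return theta, P

        # Synthetic cell: v = OCV - R0 * i (plus noise); estimate [OCV, R0].
        rng = np.random.default_rng(2)
        true_ocv, true_r0 = 3.7, 0.05
        theta, P = np.zeros(2), np.eye(2) * 1e3
        for _ in range(500):
            i = rng.uniform(-5.0, 5.0)              # load current (A)
            v = true_ocv - true_r0 * i + rng.normal(0, 1e-3)
            phi = np.array([1.0, -i])               # regressor
            theta, P = rls_update(theta, P, phi, v)
        print("estimated [OCV, R0]:", np.round(theta, 4))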

  7. Validity and Reliability of Orthodontic Loops between Mechanical Testing and Computer Simulation: An Finite Element Method Study

    Directory of Open Access Journals (Sweden)

    Gaurav Sepolia

    2014-01-01

    Full Text Available The magnitude and direction of orthodontic force are among the essential concerns of orthodontic tooth movement. Excessive force may cause root resorption and mobility of the tooth, whereas a low force level may result in prolonged treatment. The addition of loops allows the clinician to achieve the desired results more accurately. Aims and objectives: The purpose of the study was to evaluate the validity and reliability of orthodontic loops between mechanical testing and computer simulation. Materials and methods: Different types of loops were taken and divided into four groups: the Teardrop loop, Opus loop, L loop and T loop. These were artificially activated for multiple lengths and studied using the FEM. Results: The Teardrop loop showed the highest force level, and there was no significant difference between mechanical testing and computer simulation.

  8. Computer game as a tool for training the identification of phonemic length.

    Science.gov (United States)

    Pennala, Riitta; Richardson, Ulla; Ylinen, Sari; Lyytinen, Heikki; Martin, Maisa

    2014-12-01

    Computer-assisted training of Finnish phonemic length was conducted with 7-year-old Russian-speaking second-language learners of Finnish. Phonemic length plays a different role in these two languages. The training included game activities with two- and three-syllable word and pseudo-word minimal pairs with prototypical vowel durations. The lowest accuracy scores were recorded for two-syllable words. Accuracy scores were higher for the minimal pairs with larger rather than smaller differences in duration. Accuracy scores were lower for long duration than for short duration. The ability to identify quantity degree was generalized to stimuli used in the identification test in two of the children. Ideas for improving the game are introduced.

  9. Identification of DNA methylation biomarkers from Infinium arrays

    Directory of Open Access Journals (Sweden)

    Richard D Emes

    2012-08-01

    Full Text Available Epigenetic modifications of DNA, such as cytosine methylation, are differentially abundant in diseases such as cancer. A goal for clinical research is finding sites that are differentially methylated between groups of samples, to act as potential biomarkers for disease outcome. However, clinical samples are often limited in availability, represent a heterogeneous collection of cells, or are of uncertain clinical class. Array-based methods for identification of methylation provide a cost-effective way to survey a proportion of the methylome at single-base resolution. The Illumina Infinium array has become a popular and reliable high-throughput method in this field and is proving useful in the identification of biomarkers for disease. Here, we compare a commonly used statistical test with a new intuitive and flexible computational approach to quickly detect differentially methylated sites. The method rapidly identifies and ranks candidate lists with the greatest inter-group variability whilst controlling for intra-group variability. Intuitive and biologically relevant filters can be imposed to quickly identify sites and genes of interest.
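
    The ranking idea — maximize inter-group difference while controlling intra-group variability — can be sketched in a few lines. The statistic below is our own illustration, not necessarily the authors' exact method, and the beta values are simulated.

        import numpy as np

        rng = np.random.default_rng(3)
        n_sites, n_a, n_b = 1000, 12, 12
        beta_a = rng.beta(2, 5, size=(n_sites, n_a))     # group A beta values
        beta_b = rng.beta(2, 5, size=(n_sites, n_b))     # group B beta values
        beta_b[:10] = np.clip(beta_b[:10] + 0.35, 0, 1)  # 10 "true" DM sites

        delta = beta_b.mean(axis=1) - beta_a.mean(axis=1)        # inter-group
        spread = beta_a.std(axis=1) + beta_b.std(axis=1) + 1e-6  # intra-group
        score = np.abs(delta) / spread

        top = np.sort(np.argsort(score)[::-1][:10])
        print("top-ranked sites:", top)    # mostly the 10 spiked sites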

  10. A digital squarer system for positive mass identification on the ARL ion microprobe mass analyser

    International Nuclear Information System (INIS)

    Woods, K.N.; Grant, L.D.V.; Rawsthorne, E.D.; Strydom, H.J.; Gries, W.H.

    1984-01-01

    The original analogue squarer for mass-scale linearisation in the Ion Microprobe Mass Analyser (IMMA) has been replaced by a programmable digital squarer system which permits reliable mass number identification throughout the tested range 1 to 240. The digital squarer provides signals both to a digital direct-reading mass number display and to an X-Y recorder, where it provides a linear mass scale correct to within 0.3 mass units. An additional output to a computer can provide binary or BCD mass number data

  11. Uncertainty calculation for modal parameters used with stochastic subspace identification: an application to a bridge structure

    Science.gov (United States)

    Hsu, Wei-Ting; Loh, Chin-Hsiung; Chao, Shu-Hsien

    2015-03-01

    The stochastic subspace identification method (SSI) has been proven to be an efficient algorithm for the identification of linear time-invariant systems using multivariate measurements. Generally, the modal parameters estimated through SSI may be afflicted with statistical uncertainty, e.g. from undefined measurement noises, non-stationary excitation, a finite number of data samples, etc. Therefore, the identified results are subject to variance errors. Accordingly, the concept of the stabilization diagram can help users to identify the correct model, i.e. through removing the spurious modes. Modal parameters are estimated at successive model orders, where the physical modes of the system are extracted and separated from the spurious modes. Besides, an uncertainty computation scheme was derived for the calculation of uncertainty bounds for modal parameters at some given model order. The uncertainty bounds of damping ratios are particularly interesting, as damping ratios are difficult to estimate. In this paper, an automated stochastic subspace identification algorithm is addressed. First, the identification of modal parameters through covariance-driven stochastic subspace identification from output-only measurements is discussed. A systematic investigation of the criteria for the stabilization diagram is presented. Secondly, an automated algorithm for post-processing the stabilization diagram is demonstrated. Finally, the computation of uncertainty bounds for each mode at all model orders in the stabilization diagram is utilized to determine system natural frequencies and damping ratios. A demonstration of this study on the system identification of a three-span steel bridge under operational conditions is presented. It is shown that the proposed operation procedure for automated covariance-driven stochastic subspace identification can enhance the robustness and reliability of structural health monitoring.
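
    A compact sketch of the covariance-driven SSI core may help; these are generic textbook steps (block matrix of output covariances, SVD, shift invariance), not the authors' automated procedure, and the test signal is a synthetic single-mode decay.

        import numpy as np

        def cov_ssi(y, i, order, dt):
            # y: (channels, samples) output-only data; i: block rows.
            l, N = y.shape
            R = [y[:, k:] @ y[:, :N - k].T / (N - k) for k in range(2 * i)]
            H = np.block([[R[p + q + 1] for q in range(i)] for p in range(i)])
            U, s, _ = np.linalg.svd(H)
            O = U[:, :order] * np.sqrt(s[:order])   # observability matrix
            A = np.linalg.pinv(O[:-l]) @ O[l:]      # shift invariance
            mu = np.log(np.linalg.eigvals(A)) / dt  # continuous-time poles
            return np.abs(mu) / (2 * np.pi), -mu.real / np.abs(mu)

        # Single mode at 2 Hz with 2% damping, plus light measurement noise.
        dt, n = 0.01, 20000
        t = np.arange(n) * dt
        w = 2 * np.pi * 2.0
        rng = np.random.default_rng(4)
        y = (np.exp(-0.02 * w * t) * np.sin(w * t)
             + 0.01 * rng.normal(size=n))[None, :]
        freqs, damps = cov_ssi(y, i=20, order=2, dt=dt)
        print(np.round(freqs, 3), np.round(damps, 3))  # ~[2, 2], ~[0.02, 0.02]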

  12. Modeling and Analysis of Surgery Patient Identification Using RFID

    OpenAIRE

    Byungho Jeong; Chen-Yang Cheng; Vittal Prabhu

    2009-01-01

    This article proposes a workflow and reliability model for surgery patient identification using RFID (Radio Frequency Identification). Certain types of mistakes may be prevented by automatically identifying the patient before surgery. The proposed workflow is designed to ensure that both the correct site and patient are engaged in the surgical process. The reliability model can be used to assess improvements in patients’ safety during this process. A proof-of-concept system is developed to ...

  13. MAPPS (Maintenance Personnel Performance Simulation): a computer simulation model for human reliability analysis

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.

    1985-01-01

    A computer model capable of generating reliable estimates of human performance measures in the nuclear power plant (NPP) maintenance context has been developed, sensitivity tested, and evaluated. The model, entitled MAPPS (Maintenance Personnel Performance Simulation), is of the simulation type and is task-oriented. It addresses a number of person-machine, person-environment, and person-person variables and is capable of providing the user with a rich spectrum of important performance measures, including the mean time for successful task performance by a maintenance team and the maintenance team's probability of task success. These two measures are particularly important as input to probabilistic risk assessment (PRA) studies, which were the primary impetus for the development of MAPPS. The simulation nature of the model, along with its generous input parameters and output variables, allows its usefulness to extend beyond its input to PRA

  14. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.

    2015-03-01

    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque(R) drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  15. Construction, implementation and testing of an image identification system using computer vision methods for fruit flies with economic importance (Diptera: Tephritidae).

    Science.gov (United States)

    Wang, Jiang-Ning; Chen, Xiao-Lin; Hou, Xin-Wen; Zhou, Li-Bing; Zhu, Chao-Dong; Ji, Li-Qiang

    2017-07-01

    Many species of Tephritidae are damaging to fruit, which might negatively impact international fruit trade. Automatic or semi-automatic identification of fruit flies is greatly needed for diagnosing causes of damage and for quarantine protocols for economically relevant insects. A fruit fly image identification system named AFIS1.0 has been developed using 74 species belonging to six genera, which include the majority of pests in the Tephritidae. The system combines automated image identification and manual verification, balancing operability and accuracy. AFIS1.0 integrates image analysis and an expert system into a content-based image retrieval framework. In the automatic identification module, AFIS1.0 gives candidate identification results. Afterwards, users can make a manual selection by comparing unidentified images with a subset of images corresponding to the automatic identification result. The system uses Gabor surface features in automated identification and yielded an overall classification success rate of 87% to the species level in an independent multi-part image automatic identification test. The system is useful for users with or without specific expertise on Tephritidae in the task of rapid and effective identification of fruit flies. It brings the application of computer vision technology to fruit fly recognition much closer to production level. © 2016 Society of Chemical Industry.
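
    A rough sketch of Gabor-feature extraction for content-based retrieval is given below. It is a generic recipe — filter-bank responses summarized by mean and standard deviation — and does not reproduce AFIS1.0's specific "Gabor surface features"; the images here are random stand-ins.

        import numpy as np
        from scipy.signal import fftconvolve

        def gabor_kernel(ksize, theta, wavelength, sigma, gamma=0.5):
            # Real part of a Gabor filter oriented at angle theta.
            half = ksize // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)
            yr = -x * np.sin(theta) + y * np.cos(theta)
            return (np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
                    * np.cos(2 * np.pi * xr / wavelength))

        def gabor_features(img, n_thetas=4, wavelengths=(4, 8, 16)):
            # Mean and std of each filter response form the descriptor.
            feats = []
            for k in range(n_thetas):
                for lam in wavelengths:
                    kern = gabor_kernel(31, k * np.pi / n_thetas, lam, lam / 2)
                    resp = fftconvolve(img, kern, mode='same')
                    feats += [resp.mean(), resp.std()]
            return np.array(feats)

        # Retrieval: rank gallery images by distance in feature space.
        rng = np.random.default_rng(5)
        query = rng.random((128, 128))
        gallery = [rng.random((128, 128)) for _ in range(5)]
        q = gabor_features(query)
        dists = [np.linalg.norm(q - gabor_features(g)) for g in gallery]
        print("closest gallery image:", int(np.argmin(dists)))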

  16. Reliability Assessment for Low-cost Unmanned Aerial Vehicles

    Science.gov (United States)

    Freeman, Paul Michael

    Existing low-cost unmanned aerospace systems are unreliable, and engineers must blend reliability analysis with fault-tolerant control in novel ways. This dissertation introduces the University of Minnesota unmanned aerial vehicle flight research platform, a comprehensive simulation and flight test facility for reliability and fault-tolerance research. An industry-standard reliability assessment technique, the failure modes and effects analysis, is performed for an unmanned aircraft. Particular attention is afforded to the control surface and servo-actuation subsystem. Maintaining effector health is essential for safe flight; failures may lead to loss of control incidents. Failure likelihood, severity, and risk are qualitatively assessed for several effector failure modes. Design changes are recommended to improve aircraft reliability based on this analysis. Most notably, the control surfaces are split, providing independent actuation and dual-redundancy. The simulation models for control surface aerodynamic effects are updated to reflect the split surfaces using a first-principles geometric analysis. The failure modes and effects analysis is extended by using a high-fidelity nonlinear aircraft simulation. A trim state discovery is performed to identify the achievable steady, wings-level flight envelope of the healthy and damaged vehicle. Tolerance of elevator actuator failures is studied using familiar tools from linear systems analysis. This analysis reveals significant inherent performance limitations for candidate adaptive/reconfigurable control algorithms used for the vehicle. Moreover, it demonstrates how these tools can be applied in a design feedback loop to make safety-critical unmanned systems more reliable. Control surface impairments that do occur must be quickly and accurately detected. This dissertation also considers fault detection and identification for an unmanned aerial vehicle using model-based and model-free approaches and applies those

  17. Comparison of reliability of lateral cephalogram and computed ...

    African Journals Online (AJOL)

    2014-05-07

    Measurements acquired from both the modalities are reliable and reproducible, but ... paint on all the slice of the image stack in the axial plane of ... between body mass index, age and upper airway measurements in snorers.

  18. Osteochondritis dissecans of the humeral capitellum: reliability of four classification systems using radiographs and computed tomography.

    Science.gov (United States)

    Claessen, Femke M A P; van den Ende, Kimberly I M; Doornberg, Job N; Guitton, Thierry G; Eygendaal, Denise; van den Bekerom, Michel P J

    2015-10-01

    The radiographic appearance of osteochondritis dissecans (OCD) of the humeral capitellum varies according to the stage of the lesion. It is important to evaluate the stage of an OCD lesion carefully to guide treatment. We compared the interobserver reliability of currently used classification systems for OCD of the humeral capitellum to identify the most reliable classification system. Thirty-two musculoskeletal radiologists and orthopaedic surgeons specialized in elbow surgery from several countries evaluated anteroposterior and lateral radiographs and corresponding computed tomography (CT) scans of 22 patients to classify the stage of OCD of the humeral capitellum according to the classification systems developed by (1) Minami, (2) Berndt and Harty, (3) Ferkel and Sgaglione, and (4) Anderson on a Web-based study platform including a Digital Imaging and Communications in Medicine viewer. Magnetic resonance imaging was not evaluated as part of this study. We measured agreement among observers using the Siegel and Castellan multirater κ. All OCD classification systems, except for Berndt and Harty, which had poor agreement among observers (κ = 0.20), had fair interobserver agreement: κ was 0.27 for the Minami, 0.23 for the Anderson, and 0.22 for the Ferkel and Sgaglione classifications. The Minami Classification was significantly more reliable than the other classifications, making it the most reliable for classifying different stages of OCD of the humeral capitellum. However, it is unclear whether radiographic evidence of OCD of the humeral capitellum, as categorized by the Minami Classification, guides treatment in clinical practice, given this only fair agreement. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  19. Geometric computations with interval and new robust methods applications in computer graphics, GIS and computational geometry

    CERN Document Server

    Ratschek, H

    2003-01-01

    This undergraduate and postgraduate text will familiarise readers with interval arithmetic and related tools to gain reliable and validated results and logically correct decisions for a variety of geometric computations, plus the means for alleviating the effects of the errors. It also considers computations on geometric point-sets, which are neither robust nor reliable when processed with standard methods. The authors provide two effective tools for obtaining correct results: (a) interval arithmetic, and (b) ESSA, the new powerful algorithm which improves many geometric computations and makes th

  20. Improvement of level-1 PSA computer code package - Modeling and analysis for dynamic reliability of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hoon; Baek, Sang Yeup; Shin, In Sup; Moon, Shin Myung; Moon, Jae Phil; Koo, Hoon Young; Kim, Ju Shin [Seoul National University, Seoul (Korea, Republic of); Hong, Jung Sik [Seoul National Polytechnology University, Seoul (Korea, Republic of); Lim, Tae Jin [Soongsil University, Seoul (Korea, Republic of)

    1996-08-01

    The objective of this project is to develop a methodology for the dynamic reliability analysis of NPPs. The first year's research was focused on developing a procedure for analyzing failure data of running components and a simulator for estimating the reliability of series-parallel structures. The second year's research was concentrated on estimating the lifetime distribution and PM effect of a component from its failure data in various cases, and the lifetime distribution of a system with a particular structure. Computer codes for performing these jobs were also developed. The objectives of the third year's research are to develop models for analyzing special failure types (CCFs, standby redundant structures) that were not considered in the first two years, and to complete a methodology of the dynamic reliability analysis for nuclear power plants. The analysis of component failure data and related research supporting the simulator must come first, to provide proper input to the simulator. Thus this research is divided into three major parts. 1. Analysis of the time-dependent life distribution and the PM effect. 2. Development of a simulator for system reliability analysis. 3. Related research supporting the simulator: accelerated simulation, an analytic approach using PH-type distributions, and analysis of dynamic repair effects. 154 refs., 5 tabs., 87 figs. (author)

  1. Advances in reliability and system engineering

    CERN Document Server

    Davim, J

    2017-01-01

    This book presents original studies describing the latest research and developments in the area of reliability and systems engineering. It helps the reader identify gaps in the current knowledge and presents fruitful areas for further research in the field. Among others, this book covers reliability measures, reliability assessment of multi-state systems, optimization of multi-state systems, continuous multi-state systems, new computational techniques applied to multi-state systems, and probabilistic and non-probabilistic safety assessment.

  2. User's guide to the Reliability Estimation System Testbed (REST)

    Science.gov (United States)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.

  3. Reliability analysis in intelligent machines

    Science.gov (United States)

    Mcinroy, John E.; Saridis, George N.

    1990-01-01

    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques for finding the reliability corresponding to alternative subsets of control and sensing strategies are proposed such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.

  4. Validity, reliability, and reproducibility of linear measurements on digital models obtained from intraoral and cone-beam computed tomography scans of alginate impressions

    NARCIS (Netherlands)

    Wiranto, Matthew G.; Engelbrecht, W. Petrie; Nolthenius, Heleen E. Tutein; van der Meer, W. Joerd; Ren, Yijin

    INTRODUCTION: Digital 3-dimensional models are widely used for orthodontic diagnosis. The aim of this study was to assess the validity, reliability, and reproducibility of digital models obtained from the Lava Chairside Oral scanner (3M ESPE, Seefeld, Germany) and cone-beam computed tomography scans

  5. Reliability and validity of the revised Gibson Test of Cognitive Skills, a computer-based test battery for assessing cognition across the lifespan

    Directory of Open Access Journals (Sweden)

    Moore AL

    2018-02-01

    Full Text Available Amy Lawson Moore, Terissa M Miller Gibson Institute of Cognitive Research, Colorado Springs, CO, USA Purpose: The purpose of the current study is to evaluate the validity and reliability of the revised Gibson Test of Cognitive Skills, a computer-based battery of tests measuring short-term memory, long-term memory, processing speed, logic and reasoning, visual processing, as well as auditory processing and word attack skills. Methods: This study included 2,737 participants aged 5–85 years. A series of studies was conducted to examine the validity and reliability using the test performance of the entire norming group and several subgroups. The evaluation of the technical properties of the test battery included content validation by subject matter experts, item analysis and coefficient alpha, test–retest reliability, split-half reliability, and analysis of concurrent validity with the Woodcock Johnson III Tests of Cognitive Abilities and Tests of Achievement. Results: Results indicated strong sources of evidence of validity and reliability for the test, including internal consistency reliability coefficients ranging from 0.87 to 0.98, test–retest reliability coefficients ranging from 0.69 to 0.91, split-half reliability coefficients ranging from 0.87 to 0.91, and concurrent validity coefficients ranging from 0.53 to 0.93. Conclusion: The Gibson Test of Cognitive Skills-2 is a reliable and valid tool for assessing cognition in the general population across the lifespan. Keywords: testing, cognitive skills, memory, processing speed, visual processing, auditory processing

  6. Reliability model for common mode failures in redundant safety systems

    International Nuclear Information System (INIS)

    Fleming, K.N.

    1974-12-01

    A method is presented for computing the reliability of redundant safety systems, considering both independent and common mode type failures. The model developed for the computation is a simple extension of classical reliability theory. The feasibility of the method is demonstrated with the use of an example. The probability of failure of a typical diesel-generator emergency power system is computed based on data obtained from U. S. diesel-generator operating experience. The results are compared with reliability predictions based on the assumption that all failures are independent. The comparison shows a significant increase in the probability of redundant system failure, when common failure modes are considered. (U.S.)
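
    The arithmetic of adding a common-mode term is easy to show with a beta-factor sketch — a standard textbook common-cause model, used here with invented numbers rather than the report's diesel-generator data.

        def redundant_unavailability(q_total, n, beta):
            # Split each unit's failure probability into an independent part
            # and a common-cause part shared by all n units (beta-factor).
            q_ind = (1 - beta) * q_total
            q_ccf = beta * q_total
            return q_ind**n + q_ccf       # first-order approximation

        q, n = 2e-2, 2                    # per-demand failure prob., 2 diesels
        for beta in (0.0, 0.05, 0.10):
            p_sys = redundant_unavailability(q, n, beta)
            print(f"beta = {beta:4.2f}   P(system fails) = {p_sys:.2e}")
        # beta = 0 reproduces the independent result q**2 = 4e-4; even a
        # modest common-cause fraction raises it severalfold, consistent
        # with the significant increase the abstract reports.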

  7. Models of Information Security of Highly Reliable Computing Systems

    Directory of Open Access Journals (Sweden)

    Vsevolod Ozirisovich Chukanov

    2016-03-01

    Full Text Available Methods of combined redundancy are considered. Models of system reliability that account for the restoration and preventive-maintenance parameters of system blocks are described. Expressions are given for the average number of preventive-maintenance actions and for the availability factor of system blocks.

  8. Design and construction of a device for identification of nitriding plasma process parameters using a personal computer based on serial communication

    International Nuclear Information System (INIS)

    Frida Iswinning Diah; Slamet Santosa

    2012-01-01

    The design and construction of a device for the identification of process parameters using a personal computer, based on serial communication with an M-series PLC, has been completed. The function of this device is to identify the process parameters of a system (plan), which are then analyzed so that follow-up action on the plan can be taken by the user. The main components of this device are the M-series T100MD1616 PLC and a personal computer (PC). In this device, plan parameter data are obtained from the corresponding sensor outputs in the form of voltage or current, and the analog parameter data are matched to the ADC analog input of the PLC using a signal-conditioning system. The parameters are then processed by the PLC and sent to a PC via RS232 to be displayed in the form of graphs or tables and stored in a database. The database software is written in Visual Basic 6. The device operation test was performed by measuring the temperature and vacuum level on the plasma nitriding machine. The results indicate that the device functions as an identification device for the process parameters of the plasma nitriding machine. (author)

  9. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  10. Accuracy and reliability of linear cephalometric measurements from cone-beam computed tomography scans of a dry human skull.

    Science.gov (United States)

    Berco, Mauricio; Rigali, Paul H; Miner, R Matthew; DeLuca, Stephelynn; Anderson, Nina K; Will, Leslie A

    2009-07-01

    The purpose of this study was to determine the accuracy and reliability of 3-dimensional craniofacial measurements obtained from cone-beam computed tomography (CBCT) scans of a dry human skull. Seventeen landmarks were identified on the skull. CBCT scans were then obtained, with 2 skull orientations during scanning. Twenty-nine interlandmark linear measurements were made directly on the skull and compared with the same measurements made on the CBCT scans. All measurements were made by 2 operators on 4 separate occasions. The method errors were 0.19, 0.21, and 0.19 mm in the x-, y- and z-axes, respectively. Repeated measures analysis of variance (ANOVA) showed no significant intraoperator or interoperator differences. The mean measurement error was -0.01 mm (SD, 0.129 mm). Five measurement errors were found to be statistically significantly different; however, all measurement errors were below the known voxel size and clinically insignificant. No differences were found in the measurements from the 2 CBCT scan orientations of the skull. CBCT allows for clinically accurate and reliable 3-dimensional linear measurements of the craniofacial complex. Moreover, skull orientation during CBCT scanning does not affect the accuracy or the reliability of these measurements.
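
    Repeated-measurement studies of this kind commonly quantify method error with Dahlberg's formula; the short sketch below shows the computation on invented paired measurements (not the study's data).

        import numpy as np

        def dahlberg(first, second):
            # Dahlberg's method error: ME = sqrt( sum(d_i^2) / (2n) )
            # over n paired repeated measurements.
            d = np.asarray(first) - np.asarray(second)
            return np.sqrt(np.sum(d**2) / (2 * d.size))

        # Two sessions measuring the same six interlandmark distances (mm).
        session1 = [45.2, 38.7, 52.1, 29.9, 61.4, 33.0]
        session2 = [45.0, 38.9, 52.4, 29.7, 61.2, 33.1]
        print(f"method error = {dahlberg(session1, session2):.2f} mm")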

  11. Reliability databases: State-of-the-art and perspectives

    DEFF Research Database (Denmark)

    Akhmedjanov, Farit

    2001-01-01

    The report gives a history of development and an overview of the existing reliability databases. This overview also describes some other sources of reliability and failure information (besides computer databases), e.g. reliability handbooks, but the main attention is paid to standard models and software packages containing the data mentioned. The standards covering collection and exchange of reliability data are reviewed too. Finally, promising directions in the development of such data sources are shown.

  12. Reliability of computer designed surgical guides in six implant rehabilitations with two years follow-up.

    Science.gov (United States)

    Giordano, Mauro; Ausiello, Pietro; Martorelli, Massimo; Sorrentino, Roberto

    2012-09-01

    To evaluate the reliability and accuracy of computer-designed surgical guides in osseointegrated oral implant rehabilitation. Six implant rehabilitations, with a total of 17 implants, were completed with computer-designed surgical guides, performed with the master model developed by muco-compressive and muco-static impressions. In the first case, the surgical guide had exclusively mucosal support; in the second case, exclusively dental support. For all six cases, computer-aided surgical planning was performed by virtual analyses with 3D models obtained from dental-scan DICOM data. The accuracy and stability of implant osseointegration over two years post surgery were then evaluated with clinical and radiographic examinations. Radiographic examination, performed with digital acquisitions (RVG - Radio Video graph) and parallel techniques, allowed two-dimensional feedback with a margin of linear error of 10%. Implant osseointegration was recorded for all the examined rehabilitations. During the clinical and radiographic post-surgical assessments over the following two years, the peri-implant bone level was found to be stable and without the appearance of any complications. The margin of error recorded between pre-operative positions assigned by virtual analysis and the post-surgical digital radiographic observations was as low as 0.2 mm. Computer-guided implant surgery can be very effective in oral rehabilitations, providing an opportunity for the surgeon: (a) to avoid the necessity of muco-periosteal detachments and (b) to perform minimally invasive interventions, whenever appropriate, with a flapless approach. Copyright © 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  13. The photon identification loophole in EPRB experiments: computer models with single-wing selection

    Directory of Open Access Journals (Sweden)

    De Raedt Hans

    2017-11-01

    Full Text Available Recent Einstein-Podolsky-Rosen-Bohm experiments [M. Giustina et al. Phys. Rev. Lett. 115, 250401 (2015); L. K. Shalm et al. Phys. Rev. Lett. 115, 250402 (2015)] that claim to be loophole free are scrutinized. The combination of a digital computer and discrete-event simulation is used to construct a minimal but faithful model of the most perfected realization of these laboratory experiments. In contrast to prior simulations, all photon selections are strictly made, as they are in the actual experiments, at the local station and no other “post-selection” is involved. The simulation results demonstrate that a manifestly non-quantum model that identifies photons in the same local manner as in these experiments can produce correlations that are in excellent agreement with those of the quantum theoretical description of the corresponding thought experiment, in conflict with Bell’s theorem which states that this is impossible. The failure of Bell’s theorem is possible because of our recognition of the photon identification loophole. Such identification measurement-procedures are necessarily included in all actual experiments but are not included in the theory of Bell and his followers.

  14. The photon identification loophole in EPRB experiments: computer models with single-wing selection

    Science.gov (United States)

    De Raedt, Hans; Michielsen, Kristel; Hess, Karl

    2017-11-01

    Recent Einstein-Podolsky-Rosen-Bohm experiments [M. Giustina et al. Phys. Rev. Lett. 115, 250401 (2015); L. K. Shalm et al. Phys. Rev. Lett. 115, 250402 (2015)] that claim to be loophole free are scrutinized. The combination of a digital computer and discrete-event simulation is used to construct a minimal but faithful model of the most perfected realization of these laboratory experiments. In contrast to prior simulations, all photon selections are strictly made, as they are in the actual experiments, at the local station and no other "post-selection" is involved. The simulation results demonstrate that a manifestly non-quantum model that identifies photons in the same local manner as in these experiments can produce correlations that are in excellent agreement with those of the quantum theoretical description of the corresponding thought experiment, in conflict with Bell's theorem which states that this is impossible. The failure of Bell's theorem is possible because of our recognition of the photon identification loophole. Such identification measurement-procedures are necessarily included in all actual experiments but are not included in the theory of Bell and his followers.

  15. The hierarchical expert tuning of PID controllers using tools of soft computing.

    Science.gov (United States)

    Karray, F; Gueaieb, W; Al-Sharhan, S

    2002-01-01

    We present soft computing-based results pertaining to the hierarchical tuning process of PID controllers located within the control loop of a class of nonlinear systems. The results are compared with PID controllers implemented either in a stand-alone scheme or as part of a conventional gain-scheduling structure. This work is motivated by the increasing need in industry to design highly reliable and efficient controllers for dealing with the regulation and tracking capabilities of complex processes characterized by nonlinearities and possibly time-varying parameters. The soft computing-based controllers proposed are hybrid in nature in that they integrate, within a well-defined hierarchical structure, the benefits of hard algorithmic controllers with those having supervisory capabilities. The controllers proposed also have the distinct features of learning and auto-tuning without the need for tedious and computationally expensive online systems identification schemes.
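
    A toy version of the hierarchical idea — a supervisor retuning a hard algorithmic PID loop — is sketched below. The two-rule supervisor, plant, and gains are all invented; this is not the authors' controller.

        import numpy as np

        class PID:
            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral, self.prev_e = 0.0, 0.0

            def step(self, e):
                self.integral += e * self.dt
                deriv = (e - self.prev_e) / self.dt
                self.prev_e = e
                return self.kp * e + self.ki * self.integral + self.kd * deriv

        def supervisor(e):
            # Crude fuzzy-style rule: large error -> aggressive gains,
            # small error -> gentle gains (linear blend between the two).
            w = min(abs(e), 1.0)
            return 2.0 * w + 0.8 * (1 - w), 0.5 * w + 0.1 * (1 - w)

        dt, x, setpoint = 0.01, 0.0, 1.0
        pid = PID(1.0, 0.1, 0.05, dt)
        for _ in range(1000):
            e = setpoint - x
            pid.kp, pid.ki = supervisor(e)   # hierarchical retuning each step
            u = pid.step(e)
            x += dt * (-x + u)               # first-order plant dx/dt = -x + u
        print(f"output after 10 s: {x:.3f}")  # settles near the setpoint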

  16. A reliable and valid questionnaire was developed to measure computer vision syndrome at the workplace.

    Science.gov (United States)

    Seguí, María del Mar; Cabrero-García, Julio; Crespo, Ana; Verdú, José; Ronda, Elena

    2015-06-01

    To design and validate a questionnaire to measure visual symptoms related to exposure to computers in the workplace. Our computer vision syndrome questionnaire (CVS-Q) was based on a literature review and validated through discussion with experts and performance of a pretest, pilot test, and retest. Content validity was evaluated by occupational health, optometry, and ophthalmology experts. Rasch analysis was used in the psychometric evaluation of the questionnaire. Criterion validity was determined by calculating the sensitivity and specificity, receiver operating characteristic curve, and cutoff point. Test-retest repeatability was tested using the intraclass correlation coefficient (ICC) and concordance by Cohen's kappa (κ). The CVS-Q was developed with wide consensus among experts and was well accepted by the target group. It assesses the frequency and intensity of 16 symptoms using a single rating scale (symptom severity) that fits the Rasch rating scale model well. The questionnaire has sensitivity and specificity over 70% and achieved good test-retest repeatability both for the scores obtained [ICC = 0.802; 95% confidence interval (CI): 0.673, 0.884] and CVS classification (κ = 0.612; 95% CI: 0.384, 0.839). The CVS-Q has acceptable psychometric properties, making it a valid and reliable tool to control the visual health of computer workers, and can potentially be used in clinical trials and outcome research. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. An adaptive neuro fuzzy model for estimating the reliability of component-based software systems

    Directory of Open Access Journals (Sweden)

    Kirti Tyagi

    2014-01-01

    Full Text Available Although many algorithms and techniques have been developed for estimating the reliability of component-based software systems (CBSSs), much more research is needed. Accurate estimation of the reliability of a CBSS is difficult because it depends on two factors: component reliability and glue code reliability. Moreover, reliability is a real-world phenomenon with many associated real-time problems. Soft computing techniques can help to solve problems whose solutions are uncertain or unpredictable. A number of soft computing approaches for estimating CBSS reliability have been proposed. These techniques learn from the past and capture existing patterns in data. The two basic elements of soft computing are neural networks and fuzzy logic. In this paper, we propose a model for estimating CBSS reliability, known as an adaptive neuro fuzzy inference system (ANFIS), that is based on these two basic elements of soft computing, and we compare its performance with that of a plain FIS (fuzzy inference system) for different data sets.

  18. Comparison between MALDI-TOF MS and FilmArray Blood Culture Identification panel for rapid identification of yeast from positive blood culture.

    Science.gov (United States)

    Paolucci, M; Foschi, C; Tamburini, M V; Ambretti, S; Lazzarotto, T; Landini, M P

    2014-09-01

    In this study we evaluated the MALDI-TOF MS and FilmArray methods for the rapid identification of yeast from positive blood cultures. FilmArray correctly identified 20 of 22 yeast species, while MALDI-TOF MS identified 9 of 22. FilmArray is a reliable and rapid system for the direct identification of yeasts from positive blood cultures. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Herberger, Sarah Elizabeth Marie [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  20. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    International Nuclear Information System (INIS)

    Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie; Mandelli, Diego; Smith, Curtis Lee

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS 'pathways,' or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  1. Reliability analysis and computation of computer-based safety instrumentation and control used in German nuclear power plant. Final report; Zuverlaessigkeitsuntersuchung und -berechnung rechnerbasierter Sicherheitsleittechnik zum Einsatz in deutschen Kernkraftwerken. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Yongjian [Hochschule Magdeburg-Stendal, Magdeburg (Germany). Inst. fuer Elektrotechnik; Krause, Ulrich [Magdeburg Univ. (Germany). Inst. fuer Apparate- und Umwelttechnik; Gu, Chunlei

    2014-08-21

    The trend of technological advancement in the field of safety instrumentation and control (I and C) leads to increasingly frequent use of computer-based (digital) control systems, which consist of distributed computers connected by bus communications whose functionality is freely programmable by qualified software. The advantages of the new I and C systems over the old hard-wired technology lie, for example, in higher flexibility, cost-effective procurement of spare parts, and higher hardware reliability (through higher integration density, intelligent self-monitoring mechanisms, etc.). On the other hand, skeptics see in the new computer-based I and C a higher potential for common cause failures (CCF) and easier manipulation by sabotage (IT security). In this joint research project funded by the Federal Ministry for Economic Affairs and Energy (BMWi) (2011-2014, FJZ 1501405), the Otto-von-Guericke-University Magdeburg and the Magdeburg-Stendal University of Applied Sciences are therefore trying to develop suitable methods for demonstrating the reliability of the new I and C systems, with a focus on the investigation of CCF. The expertise of both institutions shall thereby be extended to this area, making a scientific contribution to sound reliability judgments of digital safety I and C in domestic and foreign nuclear power plants. First, the state of science and technology will be worked out through the study of national and international standards in the field of functional safety of electrical and I and C systems and the accompanying literature. On the basis of the existing nuclear standards, the deterministic requirements on the structure of the new digital I and C system will be determined. The possible methods of reliability modeling will be analyzed and compared. A suitable method, called multi-class binomial failure rate (MCBFR), which was successfully used in safety valve applications, will be
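
    For orientation, the classical binomial failure rate (BFR) model that such multi-class variants generalize fits in a few lines of Python: common-cause shocks hit all n redundant components at some rate, and each component fails in a shock independently with probability p. The parameter values are hypothetical, and the multi-class extension itself is not reproduced here.

      from math import comb

      def bfr_rates(n, shock_rate, p):
          """Rate of shocks that fail exactly k of the n components, k = 0..n."""
          return {k: shock_rate * comb(n, k) * p**k * (1 - p)**(n - k)
                  for k in range(n + 1)}

      # e.g. 4 redundant channels, 0.1 shocks/yr, per-shock failure probability 0.2
      print(bfr_rates(4, 0.1, 0.2))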

  2. A hybrid approach to quantify software reliability in nuclear safety systems

    International Nuclear Information System (INIS)

    Arun Babu, P.; Senthil Kumar, C.; Murali, N.

    2012-01-01

    Highlights: ► A novel method to quantify software reliability using software verification and mutation testing in nuclear safety systems. ► Contributing factors that influence the software reliability estimate. ► An approach to help regulators verify the reliability of safety critical software systems during the software licensing process. -- Abstract: Technological advancements have led to the use of computer-based systems in safety critical applications. As computer-based systems are being introduced in nuclear power plants, effective and efficient methods are needed to ensure dependability and compliance with the high reliability requirements of systems important to safety. Even after several years of research, quantification of software reliability remains a controversial and unresolved issue. Moreover, existing approaches have assumptions and limitations that are not acceptable for safety applications. This paper proposes a theoretical approach combining software verification and mutation testing to quantify software reliability in nuclear safety systems. The theoretical results obtained suggest that software reliability depends on three factors: the test adequacy, the amount of software verification carried out, and the reusability of verified code in the software. The proposed approach may help regulators in licensing computer-based safety systems in nuclear reactors.

  3. Identification and Endodontic Management of Middle Mesial Canal in Mandibular Second Molar Using Cone Beam Computed Tomography

    Directory of Open Access Journals (Sweden)

    Bonny Paul

    2015-01-01

    Endodontic treatments are routinely done with the help of radiographs. However, radiographs represent only a two-dimensional image of an object. Failure to identify aberrant anatomy can lead to endodontic failure. This case report presents the use of three-dimensional imaging with cone beam computed tomography (CBCT) as an adjunct to digital radiography in the identification and management of a mandibular second molar with three mesial canals.

  4. Design of a modular digital computer system, DRL 4. [for meeting future requirements of spaceborne computers

    Science.gov (United States)

    1972-01-01

    The design is reported of an advanced modular computer system designated the Automatically Reconfigurable Modular Multiprocessor System, which anticipates requirements for higher computing capacity and reliability for future spaceborne computers. Subjects discussed include: an overview of the architecture, mission analysis, synchronous and nonsynchronous scheduling control, reliability, and data transmission.

  5. Recommendations for certification or measurement of reliability for reliable digital archival repositories with emphasis on access

    Directory of Open Access Journals (Sweden)

    Paula Regina Ventura Amorim Gonçalez

    2017-04-01

    Introduction: Considering the guidelines of ISO 16363:2012 (Space data and information transfer systems -- Audit and certification of trustworthy digital repositories) and the text of CONARQ Resolution 39 for certification of a Reliable Digital Archival Repository (RDC-Arq), this study verifies which technical recommendations should be used as the basis for a digital archival repository to be considered reliable. Objective: To identify requirements for the creation of Reliable Digital Archival Repositories, with emphasis on access to information, from ISO 16363:2012 and CONARQ Resolution 39. Methodology: The methodology consisted of an exploratory, descriptive, and documentary theoretical investigation, since it is based on ISO 16363:2012 and CONARQ Resolution 39. From the perspective of the problem approach, the study is qualitative and quantitative, since the data were collected, tabulated, and analyzed from the interpretation of their contents. Results: A set of checklist recommendations for reliability measurement and/or certification of an RDC-Arq is presented, focused on the identification of requirements with emphasis on access to information. Conclusions: The right to information, as well as access to reliable information, is a premise for Digital Archival Repositories, so the set of recommendations is directed to archivists who work in digital repositories and wish to verify the requirements necessary to evaluate the reliability of the repository, or to guide the information professional in collecting requirements for repository reliability certification.

  6. Reliability analysis of neutron transport simulation using Monte Carlo method

    International Nuclear Information System (INIS)

    Souza, Bismarck A. de; Borges, Jose C.

    1995-01-01

    This work presents a statistical and reliability analysis covering data obtained by computer simulation of the neutron transport process, using the Monte Carlo method. A general description of the method and its applications is presented. Several simulations, corresponding to slowing-down and shielding problems, have been accomplished. The influence of the physical dimensions of the materials and of the sample size on the reliability level of the results was investigated. The objective was to optimize the sample size in order to obtain reliable results while minimizing computation time. (author). 5 refs, 8 figs
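
    A toy Python sketch of the sample-size trade-off studied above: the relative standard error of a Monte Carlo tally shrinks roughly as 1/sqrt(N), so the sample size can be grown until a target precision is reached. The exponential "score" is a placeholder, not a neutron transport model.

      import numpy as np

      rng = np.random.default_rng(1)

      def mc_estimate(sample_size):
          scores = rng.exponential(scale=1.0, size=sample_size)  # placeholder physics
          mean = scores.mean()
          rel_err = scores.std(ddof=1) / np.sqrt(sample_size) / mean
          return mean, rel_err

      # grow the sample until the estimate is "reliable enough"
      for n in (10**3, 10**4, 10**5):
          print(n, mc_estimate(n))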

  7. Stochastic models in reliability and maintenance

    CERN Document Server

    2002-01-01

    Our daily lives are sustained by high-technology systems; computer systems are typical examples. More importantly, we have to maintain such systems without failure, but we cannot predict when they will fail or how to fix them without delay. A stochastic process is a set of outcomes of a random experiment indexed by time, and is one of the key tools needed to analyze future behavior quantitatively. Reliability and maintainability technologies are of great interest and importance to the maintenance of such systems. Many mathematical models have been and will be proposed to describe reliability and maintainability systems by using stochastic processes. The theme of this book is "Stochastic Models in Reliability and Maintainability." This book consists of 12 chapters on the theme above from different viewpoints of stochastic modeling. Chapter 1 is devoted to "Renewal Processes," under which cla...

  8. Computational identification of strain-, species- and genus-specific proteins

    Directory of Open Access Journals (Sweden)

    Thiagarajan Rathi

    2005-11-01

    Background: The identification of unique proteins at different taxonomic levels has both scientific and practical value. Strain-, species- and genus-specific proteins can provide insight into the criteria that define an organism and its relationship with close relatives. Such proteins can also serve as taxon-specific diagnostic targets. Description: A pipeline using a combination of computational and manual analyses of BLAST results was developed to identify strain-, species-, and genus-specific proteins and to catalog the closest sequenced relative for each protein in a proteome. Proteins encoded by a given strain are preliminarily considered to be unique if BLAST, using a comprehensive protein database, fails to retrieve (with an e-value better than 0.001) any protein not encoded by the query strain, species or genus (for strain-, species- and genus-specific proteins, respectively), or if BLAST, using the best hit as the query (reverse BLAST), does not retrieve the initial query protein. Results are manually inspected for homology if the initial query is retrieved in the reverse BLAST but is not the best hit. Sequences unlikely to retrieve homologs using the default BLOSUM62 matrix (usually short sequences) are re-tested using the PAM30 matrix, thereby increasing the number of retrieved homologs and increasing the stringency of the search for unique proteins. The above protocol was used to examine several food- and water-borne pathogens. We find that the reverse BLAST step filters out about 22% of proteins with homologs that would otherwise be considered unique at the genus and species levels. Analysis of the annotations of unique proteins reveals that many are remnants of prophage proteins, or may be involved in virulence. The data generated from this study can be accessed and further evaluated from the CUPID (Core and Unique Protein Identification) system web site (updated semi-annually) at http://pir.georgetown.edu/cupid. Conclusion: CUPID
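
    The uniqueness test described above can be paraphrased in Python as follows. The data structures and the reverse_hits callback are hypothetical stand-ins for real BLAST runs; only the 0.001 e-value threshold and the reverse-BLAST filter are taken from the record.

      def classify(query, forward_hits, taxon_of, query_taxon, reverse_hits,
                   e_cutoff=1e-3):
          # forward_hits: list of (subject_id, e_value) from BLASTing `query`
          # reverse_hits(subject): ordered ids retrieved when `subject` is BLASTed
          outside = [(s, e) for s, e in forward_hits
                     if e < e_cutoff and taxon_of[s] != query_taxon]
          if not outside:
              return "unique"                   # nothing significant outside the taxon
          best = min(outside, key=lambda se: se[1])[0]
          hits_back = reverse_hits(best)
          if query not in hits_back:
              return "unique"                   # reverse BLAST fails to retrieve query
          if hits_back[0] == query:
              return "not unique"               # reciprocal best hit: homolog found
          return "manual homology inspection"   # retrieved, but not as the best hit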

  9. Learning reliable manipulation strategies without initial physical models

    Science.gov (United States)

    Christiansen, Alan D.; Mason, Matthew T.; Mitchell, Tom M.

    1990-01-01

    A description is given of a robot, possessing limited sensory and effectory capabilities but no initial model of the effects of its actions on the world, that acquires such a model through exploration, practice, and observation. By acquiring an increasingly correct model of its actions, it generates increasingly successful plans to achieve its goals. In an apparently nondeterministic world, achieving reliability requires the identification of reliable actions and a preference for using such actions. Furthermore, by selecting its training actions carefully, the robot can significantly improve its learning rate.

  10. Computer architecture fundamentals and principles of computer design

    CERN Document Server

    Dumas II, Joseph D

    2005-01-01

    Introduction to Computer Architecture; What is Computer Architecture?; Architecture vs. Implementation; Brief History of Computer Systems; The First Generation; The Second Generation; The Third Generation; The Fourth Generation; Modern Computers - The Fifth Generation; Types of Computer Systems; Single Processor Systems; Parallel Processing Systems; Special Architectures; Quality of Computer Systems; Generality and Applicability; Ease of Use; Expandability; Compatibility; Reliability; Success and Failure of Computer Architectures and Implementations; Quality and the Perception of Quality; Cost Issues; Architectural Openness, Market Timi

  11. Adjoint sensitivity analysis of dynamic reliability models based on Markov chains - II: Application to IFMIF reliability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Cacuci, D. G. [Commiss Energy Atom, Direct Energy Nucl, Saclay, (France); Cacuci, D. G.; Balan, I. [Univ Karlsruhe, Inst Nucl Technol and Reactor Safetly, Karlsruhe, (Germany); Ionescu-Bujor, M. [Forschungszentrum Karlsruhe, Fus Program, D-76021 Karlsruhe, (Germany)

    2008-07-01

    In Part II of this work, the adjoint sensitivity analysis procedure developed in Part I is applied to perform sensitivity analysis of several dynamic reliability models of systems of increasing complexity, culminating with the consideration of the International Fusion Materials Irradiation Facility (IFMIF) accelerator system. Section II presents the main steps of a procedure for the automated generation of Markov chains for reliability analysis, including the abstraction of the physical system, construction of the Markov chain, and the generation and solution of the ensuing set of differential equations; all of these steps have been implemented in a stand-alone computer code system called QUEFT/MARKOMAG-S/MCADJSEN. This code system has been applied to sensitivity analysis of dynamic reliability measures for a paradigm '2-out-of-3' system comprising five components and also to a comprehensive dynamic reliability analysis of the IFMIF accelerator system facilities for the average availability and, respectively, the system's availability at the final mission time. The QUEFT/MARKOMAG-S/MCADJSEN has been used to efficiently compute sensitivities to 186 failure and repair rates characterizing components and subsystems of the first-level fault tree of the IFMIF accelerator system. (authors)
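
    To make the "generation and solution of the ensuing set of differential equations" concrete, here is a minimal Python/SciPy sketch for the paradigm 2-out-of-3 system: a four-state Markov chain (state = number of failed components, one repair crew) whose Kolmogorov forward equations are integrated numerically. The rates are invented, and this is not the QUEFT/MARKOMAG-S/MCADJSEN code.

      import numpy as np
      from scipy.integrate import solve_ivp

      lam, mu = 1e-3, 1e-1                     # hypothetical failure/repair rates
      Q = np.zeros((4, 4))
      for k in range(3):
          Q[k, k + 1] = (3 - k) * lam          # one more component fails
      for k in range(1, 4):
          Q[k, k - 1] = mu                     # one component repaired
      np.fill_diagonal(Q, -Q.sum(axis=1))      # generator rows sum to zero

      def rhs(t, p):                           # Kolmogorov forward equations
          return p @ Q

      sol = solve_ivp(rhs, (0.0, 1e4), np.array([1.0, 0, 0, 0]))
      availability = sol.y[0] + sol.y[1]       # up while at most 1 of 3 is failed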

  12. Adjoint sensitivity analysis of dynamic reliability models based on Markov chains - II: Application to IFMIF reliability assessment

    International Nuclear Information System (INIS)

    Cacuci, D. G.; Cacuci, D. G.; Balan, I.; Ionescu-Bujor, M.

    2008-01-01

    In Part II of this work, the adjoint sensitivity analysis procedure developed in Part I is applied to perform sensitivity analysis of several dynamic reliability models of systems of increasing complexity, culminating with the consideration of the International Fusion Materials Irradiation Facility (IFMIF) accelerator system. Section II presents the main steps of a procedure for the automated generation of Markov chains for reliability analysis, including the abstraction of the physical system, construction of the Markov chain, and the generation and solution of the ensuing set of differential equations; all of these steps have been implemented in a stand-alone computer code system called QUEFT/MARKOMAG-S/MCADJSEN. This code system has been applied to sensitivity analysis of dynamic reliability measures for a paradigm '2-out-of-3' system comprising five components and also to a comprehensive dynamic reliability analysis of the IFMIF accelerator system facilities for the average availability and, respectively, the system's availability at the final mission time. The QUEFT/MARKOMAG-S/MCADJSEN has been used to efficiently compute sensitivities to 186 failure and repair rates characterizing components and subsystems of the first-level fault tree of the IFMIF accelerator system. (authors)

  13. Reliability and diagnostic of modular systems

    Directory of Open Access Journals (Sweden)

    J. Kohlas

    2014-01-01

    Reliability and diagnostics are, in general, two problems discussed separately. Yet the two problems are in fact closely related to each other. Here, this relation is considered in the simple case of modular systems. We show how the computation of reliability and the diagnostic can efficiently be done within the same Bayesian network induced by the modularity of the structure function of the system.
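
    The reliability/diagnostic duality is easiest to see on the simplest modular structure, a series system of independent modules, where the Bayesian update collapses to one line because a failed module implies a failed system. The module reliabilities below are hypothetical, and the paper's general Bayesian-network construction is not reproduced.

      import numpy as np

      def series_reliability(r):
          return float(np.prod(r))             # all modules must work

      def failure_posteriors(r):
          # P(module i down | system down) = P(module i down) / P(system down)
          p_sys_down = 1.0 - np.prod(r)
          return (1.0 - np.asarray(r)) / p_sys_down

      r = [0.99, 0.95, 0.90]                   # hypothetical module reliabilities
      print(series_reliability(r), failure_posteriors(r))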

  14. The collection, storage and use of equipment performance data for the safety and reliability assessment of nuclear power plants

    International Nuclear Information System (INIS)

    Fothergill, C.D.H.

    1975-01-01

    It has been characteristic of the Nuclear Industry that it should grow up in an atmosphere where reliability and operational safety considerations have been of vital importance. Consequently all aspects of Nuclear Power Reactor design, construction and operation (in the U.K.A.E.A.) are subjected to rigorous reliability assessments, beginning with the automatic protective devices and the safety shut-down systems. This has resulted in the setting up of large and small private data stores to support this upsurge of Safety and Reliability assessment work. Unfortunately, much of the information being stored and published falls short of the minimum requirements of Safety Assessors and Reliability Analysts who need to make use of it. That there is still an urgent need for more work to be done in the Reliability Data field is universally acknowledged. The characteristics which make up good quality reliability data must be defined and achievable minimum standards must be set for its identification, collection, storage and retrieval. To this end the United Kingdom Atomic Energy Authority have set up the Systems Reliability Service Data Bank. This includes a computerized storage facility comprised of two principal data stores: (i) Reliability Data Store, (ii) Event Data Store. The figures available in the Reliability Data Store range from those relating to the lifetimes of minute components to those obtained from the assessment of whole plants and complete assemblies. These data have been accumulated from many reliable sources both inside and outside the Nuclear Industry, including the transfer of 'live' data generated from the results of reliability surveillance exercises associated with Event Data collection. Computer techniques developed specifically for the Reliability Data Store enable further 'processing' of these data to be carried out. The Event Data Store consists of three discrete computerized data stores, each one providing the necessary storage, retrieval and

  15. Interrater reliability of a Pilates movement-based classification system.

    Science.gov (United States)

    Yu, Kwan Kenny; Tulloch, Evelyn; Hendrick, Paul

    2015-01-01

    To determine the interrater reliability for identification of a specific movement pattern using a Pilates classification system. Videos of 5 subjects performing specific movement tasks were sent to raters trained in the DMA-CP classification system. Ninety-six raters completed the survey. Interrater reliability for the detection of a directional bias was excellent (Pi = 0.92 and κ(free) = 0.89). Interrater reliability for classifying an individual into a specific subgroup was moderate (Pi = 0.64, κ(free) = 0.55); however, raters who had completed levels 1-4 of the DMA-CP training and reported using the assessment daily demonstrated excellent reliability (Pi = 0.89 and κ(free) = 0.87). The classification system demonstrated almost perfect agreement in determining the existence of a specific movement pattern and classifying into a subgroup for experienced raters. There was a trend for greater reliability associated with increased levels of training and experience of the raters. Copyright © 2014 Elsevier Ltd. All rights reserved.
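
    The κ(free) statistic quoted above is commonly computed as Randolph's free-marginal multirater kappa, sketched below in Python; the rating counts are left as a hypothetical input, and the formula assumes the same number of raters per subject.

      import numpy as np

      def free_marginal_kappa(counts):
          """counts[i, j] = number of raters assigning subject i to category j."""
          counts = np.asarray(counts, dtype=float)
          n, q = counts.shape
          r = counts[0].sum()                          # raters per subject
          p_obs = (((counts**2).sum(axis=1) - r) / (r * (r - 1))).mean()
          p_chance = 1.0 / q                           # free-marginal chance agreement
          return (p_obs - p_chance) / (1.0 - p_chance)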

  16. Strategy for continuous improvement in IC manufacturability, yield, and reliability

    Science.gov (United States)

    Dreier, Dean J.; Berry, Mark; Schani, Phil; Phillips, Michael; Steinberg, Joe; DePinto, Gary

    1993-01-01

    Continual improvements in yield, reliability and manufacturability measure a fab and ultimately result in Total Customer Satisfaction. A new organizational and technical methodology for continuous defect reduction has been established in a formal feedback loop, which relies on yield and reliability, failed bit map analysis, analytical tools, inline monitoring, cross functional teams and a defect engineering group. The strategy requires the fastest detection, identification and implementation of possible corrective actions. Feedback cycle time is minimized at all points to improve yield and reliability and reduce costs, essential for competitiveness in the memory business. Payoff was a 9.4X reduction in defectivity and a 6.2X improvement in reliability of 256 K fast SRAMs over 20 months.

  17. Plant Identification Based on Leaf Midrib Cross-Section Images Using Fractal Descriptors.

    Directory of Open Access Journals (Sweden)

    Núbia Rosa da Silva

    The correct identification of plants is a common necessity not only for researchers but also for the lay public. Recently, computational methods have been employed to facilitate this task; however, there are few studies addressing the wide diversity of plants occurring in the world. This study proposes to analyse images obtained from cross-sections of the leaf midrib using fractal descriptors. These descriptors are obtained from the fractal dimension of the object computed at a range of scales. In this way, they provide rich information regarding the spatial distribution of the analysed structure and, as a consequence, they measure the multiscale morphology of the object of interest. In biology, such morphology is of great importance because it is related to evolutionary aspects and is successfully employed to characterize and discriminate among different biological structures. Here, the fractal descriptors are used to identify the species of plants based on the image of their leaves. A large number of samples are examined, comprising 606 leaf samples of 50 species from the Brazilian flora. The results are compared to other imaging methods in the literature and demonstrate that fractal descriptors are precise and reliable in the taxonomic process of plant species identification.
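
    A minimal sketch of the underlying computation: box-counting the binary structure at several scales yields both a descriptor vector (the per-scale counts) and a fractal-dimension estimate from the log-log slope. This is a generic box-counting routine, not the authors' exact multiscale descriptor.

      import numpy as np

      def box_counting_descriptors(img, scales=(1, 2, 4, 8, 16)):
          img = np.asarray(img, dtype=bool)
          counts = []
          for s in scales:
              h, w = img.shape[0] // s * s, img.shape[1] // s * s
              blocks = img[:h, :w].reshape(h // s, s, w // s, s)
              counts.append(int(blocks.any(axis=(1, 3)).sum()))   # occupied boxes
          slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
          return np.array(counts), -slope    # descriptor vector, fractal dimension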

  18. Reliability and reproducibility analysis of the Cobb angle and assessing sagittal plane by computer-assisted and manual measurement tools.

    Science.gov (United States)

    Wu, Weifei; Liang, Jie; Du, Yuanli; Tan, Xiaoyi; Xiang, Xuanping; Wang, Wanhong; Ru, Neng; Le, Jinbo

    2014-02-06

    Although many studies on the reliability and reproducibility of measurement have been performed for the coronal Cobb angle, few results are reported for sagittal alignment measurement including the pelvis. We usually use SurgimapSpine software to measure the Cobb angle in our studies; however, there are no reports to date on its reliability and reproducibility. Sixty-eight standard standing posteroanterior whole-spine radiographs were reviewed. Three examiners carried out the measurements independently using manual measurement on X-ray radiographs and SurgimapSpine software on the computer. Parameters measured included pelvic incidence, sacral slope, pelvic tilt, lumbar lordosis (LL), thoracic kyphosis, and the coronal Cobb angle. SPSS 16.0 software was used for statistical analyses. The means, standard deviations, intraclass and interclass correlation coefficients (ICC), and 95% confidence intervals (CI) were calculated. There was no notable difference between the two tools (P = 0.21) for the coronal Cobb angle. For the sagittal plane parameters, the ICC of intraobserver reliability for the manual measures varied from 0.65 (T2-T5 angle) to 0.95 (LL angle). For the SurgimapSpine tool, the ICC ranged from 0.75 to 0.98. No significant difference in intraobserver reliability was found between the two measurements (P > 0.05). As for the interobserver reliability, measurements with the SurgimapSpine tool had better ICC (0.71 to 0.98 vs 0.59 to 0.96) and Pearson's coefficient (0.76 to 0.99 vs 0.60 to 0.97). The reliability of the SurgimapSpine measures was significantly higher in all parameters except for the coronal Cobb angle, where the difference was not significant (P > 0.05). Although the differences between the two methods are very small, the results of this study indicate that the SurgimapSpine measurement is an equivalent measuring tool to the traditional manual method for the coronal Cobb angle, but is advantageous in spino
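
    For reference, intra- and interobserver ICCs of this kind are often the two-way random, absolute-agreement, single-rater form ICC(2,1) of Shrout and Fleiss; the exact variant used in the study is not stated in this record, so the Python sketch below is an assumption.

      import numpy as np

      def icc_2_1(Y):
          """Y: (n_subjects x k_raters) array of measurements."""
          Y = np.asarray(Y, dtype=float)
          n, k = Y.shape
          grand = Y.mean()
          ms_r = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
          ms_c = n * ((Y.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
          resid = Y - Y.mean(axis=1, keepdims=True) - Y.mean(axis=0) + grand
          ms_e = (resid ** 2).sum() / ((n - 1) * (k - 1))
          return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)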

  19. Reliability centered Maintenance (RCM) program for Chashma NPP (CHASNUPP)

    International Nuclear Information System (INIS)

    Khalid, S.; Khan, S.A.

    2000-01-01

    This paper describes the proposed Reliability Centered Maintenance (RCM) program for the Chashma Nuclear Power Plant (CHASNUPP). The major steps are the identification of risk critical components and the implementation of RCM procedures. Identification of risk critical components is based upon the CHASNUPP level 1 PSA results (performed under IAEA TC Project PAK/9/019), which is near completion. Other requirements for implementation of the RCM program are the qualitative analysis to be performed for identifying the dominant potential failure modes of each risk critical component, and the determination of the necessary maintenance activities required to ensure reliable operation of the identified risk critical components. Implementation of the RCM program for these components will lead to improvement in plant availability and safety, together with a reduction in maintenance cost. Development and implementation of the RCM program at this stage will help the CHASNUPP Maintenance department, which is now developing the maintenance program and procedures for CHASNUPP. (author)

  20. Test rig overview for validation and reliability testing of shutdown system software

    International Nuclear Information System (INIS)

    Zhao, M.; McDonald, A.; Dick, P.

    2007-01-01

    The test rig for Validation and Reliability Testing of shutdown system software has been upgraded from the AECL Windows-based test rig previously used for CANDU6 stations. It includes a Virtual Trip Computer, which is a software simulation of the functional specification of the trip computer, and a real-time trip computer simulator in a separate chassis, which is used during the preparation of trip computer test cases before the actual trip computers are available. This allows preparation work for Validation and Reliability Testing to be performed in advance of delivery of actual trip computers to maintain a project schedule. (author)

  1. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    Science.gov (United States)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information to assess impacts of climate change at regional and global scales, and statistical downscaling methods have been applied to prepare climate model data for applications such as hydrologic and ecologic modelling at a watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data depend mainly on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for key climate variables, which are the main input data to regional modelling systems. However, inconsistencies in these climate products -- for example, different combinations of climate variables, varying data domains and data lengths, and data accuracy varying with the physiographic characteristics of the landscape -- have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling. Employing various observation-based daily gridded climate products available in the public domain, i.e. thin plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis), and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparing with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available climate products corresponding to the elevations of stations discretized into several classes. According to the rank of climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. A web-based system
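
    The ranking step can be pictured with a small Python sketch that scores each gridded product against station observations by RMSE and sorts; the actual study stratifies stations by elevation class and uses more than a single skill measure, so this is only the skeleton of the idea.

      import numpy as np

      def rank_products(products, station_obs):
          # products: name -> array of product values at station locations/times
          scores = {name: float(np.sqrt(np.mean((vals - station_obs) ** 2)))
                    for name, vals in products.items()}
          return sorted(scores.items(), key=lambda kv: kv[1])   # lowest RMSE first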

  2. Computerized nipple identification for multiple image analysis in computer-aided diagnosis

    International Nuclear Information System (INIS)

    Zhou, Chuan; Chan, Heang-Ping; Paramagul, Chintana; Roubidoux, Marilyn A.; Sahiner, Berkman; Hadjiiski, Lubomir M.; Petrick, Nicholas

    2004-01-01

    Correlation of information from multiple-view mammograms (e.g., MLO and CC views, bilateral views, or current and prior mammograms) can improve the performance of breast cancer diagnosis by radiologists or by computer. The nipple is a reliable and stable landmark on mammograms for the registration of multiple mammograms. However, accurate identification of nipple location on mammograms is challenging because of the variations in image quality and in the nipple projections, resulting in some nipples being nearly invisible on the mammograms. In this study, we developed a computerized method to automatically identify the nipple location on digitized mammograms. First, the breast boundary was obtained using a gradient-based boundary tracking algorithm, and then the gray level profiles along the inside and outside of the boundary were identified. A geometric convergence analysis was used to limit the nipple search to a region of the breast boundary. A two-stage nipple detection method was developed to identify the nipple location using the gray level information around the nipple, the geometric characteristics of nipple shapes, and the texture features of glandular tissue or ducts which converge toward the nipple. At the first stage, a rule-based method was designed to identify the nipple location by detecting significant changes of intensity along the gray level profiles inside and outside the breast boundary and the changes in the boundary direction. At the second stage, a texture orientation-field analysis was developed to estimate the nipple location based on the convergence of the texture pattern of glandular tissue or ducts towards the nipple. The nipple location was finally determined from the detected nipple candidates by a rule-based confidence analysis. In this study, 377 and 367 randomly selected digitized mammograms were used for training and testing the nipple detection algorithm, respectively. Two experienced radiologists identified the nipple locations

  3. Accuracy and reliability of stitched cone-beam computed tomography images

    Energy Technology Data Exchange (ETDEWEB)

    Egbert, Nicholas [Private Practice, Reconstructive Dental Specialists of Utah, Salt Lake (United States); Cagna, David R.; Ahuja, Swati; Wicks, Russell A. [Dept. of rosthodontics, University of Tennessee Health Science Center College of Dentistry, Memphis (United States)

    2015-03-15

    This study was performed to evaluate the linear distance accuracy and reliability of stitched small field of view (FOV) cone-beam computed tomography (CBCT) reconstructed images for the fabrication of implant surgical guides. Three gutta percha points were fixed on the inferior border of a cadaveric mandible to serve as control reference points. Ten additional gutta percha points, representing fiduciary markers, were scattered on the buccal and lingual cortices at the level of the proposed complete denture flange. A digital caliper was used to measure the distance between the reference points and fiduciary markers, which represented the anatomic linear dimension. The mandible was scanned using small FOV CBCT, and the images were then reconstructed and stitched using the manufacturer's imaging software. The same measurements were then taken with the CBCT software. The anatomic linear dimension measurements and stitched small FOV CBCT measurements were statistically evaluated for linear accuracy. The mean difference between the anatomic linear dimension measurements and the stitched small FOV CBCT measurements was found to be 0.34 mm with a 95% confidence interval of +0.24 to +0.44 mm and a mean standard deviation of 0.30 mm. The difference between the control and the stitched small FOV CBCT measurements was insignificant within the parameters defined by this study. The proven accuracy of stitched small FOV CBCT data sets may allow image-guided fabrication of implant surgical stents from such data sets.

  4. Accuracy and reliability of stitched cone-beam computed tomography images

    International Nuclear Information System (INIS)

    Egbert, Nicholas; Cagna, David R.; Ahuja, Swati; Wicks, Russell A.

    2015-01-01

    This study was performed to evaluate the linear distance accuracy and reliability of stitched small field of view (FOV) cone-beam computed tomography (CBCT) reconstructed images for the fabrication of implant surgical guides. Three gutta percha points were fixed on the inferior border of a cadaveric mandible to serve as control reference points. Ten additional gutta percha points, representing fiduciary markers, were scattered on the buccal and lingual cortices at the level of the proposed complete denture flange. A digital caliper was used to measure the distance between the reference points and fiduciary markers, which represented the anatomic linear dimension. The mandible was scanned using small FOV CBCT, and the images were then reconstructed and stitched using the manufacturer's imaging software. The same measurements were then taken with the CBCT software. The anatomic linear dimension measurements and stitched small FOV CBCT measurements were statistically evaluated for linear accuracy. The mean difference between the anatomic linear dimension measurements and the stitched small FOV CBCT measurements was found to be 0.34 mm with a 95% confidence interval of +0.24 to +0.44 mm and a mean standard deviation of 0.30 mm. The difference between the control and the stitched small FOV CBCT measurements was insignificant within the parameters defined by this study. The proven accuracy of stitched small FOV CBCT data sets may allow image-guided fabrication of implant surgical stents from such data sets.

  5. Accuracy and reliability of stitched cone-beam computed tomography images.

    Science.gov (United States)

    Egbert, Nicholas; Cagna, David R; Ahuja, Swati; Wicks, Russell A

    2015-03-01

    This study was performed to evaluate the linear distance accuracy and reliability of stitched small field of view (FOV) cone-beam computed tomography (CBCT) reconstructed images for the fabrication of implant surgical guides. Three gutta percha points were fixed on the inferior border of a cadaveric mandible to serve as control reference points. Ten additional gutta percha points, representing fiduciary markers, were scattered on the buccal and lingual cortices at the level of the proposed complete denture flange. A digital caliper was used to measure the distance between the reference points and fiduciary markers, which represented the anatomic linear dimension. The mandible was scanned using small FOV CBCT, and the images were then reconstructed and stitched using the manufacturer's imaging software. The same measurements were then taken with the CBCT software. The anatomic linear dimension measurements and stitched small FOV CBCT measurements were statistically evaluated for linear accuracy. The mean difference between the anatomic linear dimension measurements and the stitched small FOV CBCT measurements was found to be 0.34 mm with a 95% confidence interval of +0.24 to +0.44 mm and a mean standard deviation of 0.30 mm. The difference between the control and the stitched small FOV CBCT measurements was insignificant within the parameters defined by this study. The proven accuracy of stitched small FOV CBCT data sets may allow image-guided fabrication of implant surgical stents from such data sets.

  6. Accuracy and reliability in sex determination from skulls: a comparison of Fordisc® 3.0 and the discriminant function analysis.

    Science.gov (United States)

    Guyomarc'h, Pierre; Bruzek, Jaroslav

    2011-05-20

    Identification in forensic anthropology and the definition of a biological profile in bioarchaeology are essential to each of those fields and use the same methodologies. Sex, age, stature and ancestry can be conclusive or dispensable, depending on the field. The Fordisc(®) 3.0 computer program was developed to aid in the identification of the sex, stature and ancestry of skeletal remains by exploiting the Forensic Data Bank (FDB) and computing discriminant function analyses (DFAs). Although widely used, this tool has been recently criticised, principally when used to determine ancestry. Two sub-samples of individuals of known sex were drawn from French (n=50) and Thai (n=91) osteological collections and used to assess the reliability of sex determination using Fordisc(®) 3.0 with 12 cranial measurements. Comparisons were made using the whole FDB as well as using select groups, taking into account the posterior and typicality probabilities. The results of Fordisc(®) 3.0 vary between 52.2% and 77.8% depending on the options and groups selected. Tests of published discriminant functions and the computation of specific DFA were performed in order to discuss the applicability of this software and, overall, to question the pertinence of the use of DFA and linear distances in sex determination, in light of the huge cranial morphological variability. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
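
    The statistical core of Fordisc-style sex estimation is a linear discriminant analysis that reports posterior probabilities; the scikit-learn sketch below shows only the shape of the workflow, on randomly generated placeholder "measurements" rather than Forensic Data Bank data.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      X = rng.normal(size=(141, 12))        # placeholder: 12 cranial measurements
      y = rng.integers(0, 2, size=141)      # placeholder: known sex labels

      lda = LinearDiscriminantAnalysis().fit(X, y)
      posterior = lda.predict_proba(X[:1])  # posterior probability for each sex
      # Typicality-style checks (e.g. Mahalanobis distance to each group centroid)
      # can additionally flag cases that fit no reference group well.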

  7. Computational Exposure Science: An Emerging Discipline to ...

    Science.gov (United States)

    Background: Computational exposure science represents a frontier of environmental science that is emerging and quickly evolving. Objectives: In this commentary, we define this burgeoning discipline, describe a framework for implementation, and review some key ongoing research elements that are advancing the science with respect to exposure to chemicals in consumer products. Discussion: The fundamental elements of computational exposure science include the development of reliable, computationally efficient predictive exposure models; the identification, acquisition, and application of data to support and evaluate these models; and generation of improved methods for extrapolating across chemicals. We describe our efforts in each of these areas and provide examples that demonstrate both progress and potential. Conclusions: Computational exposure science, linked with comparable efforts in toxicology, is ushering in a new era of risk assessment that greatly expands our ability to evaluate chemical safety and sustainability and to protect public health. The National Exposure Research Laboratory’s (NERL’s) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA’s mission to protect human health and the environment. HEASD’s research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA’s strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source

  8. Exact reliability formula and bounds for general k-out-of-n systems

    International Nuclear Information System (INIS)

    Koucky, Miroslav

    2003-01-01

    The paper deals with the reliability of general k-out-of-n systems whose component failures need not be independent and identically distributed. The result is an exact closed-form reliability formula based on Feller's result. The formula is efficient and easy to use for manual and computer computations. Approximations for the system reliability are also given and are useful when dealing with large systems. Two examples illustrate the use of the results.
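
    The classical i.i.d. special case of k-out-of-n reliability, shown below in Python, is a useful baseline; the paper's contribution is precisely an exact formula that drops the independence and identical-distribution assumptions, which this sketch does not capture.

      from math import comb

      def k_out_of_n_reliability(k, n, p):
          """At least k of n i.i.d. components (each of reliability p) must work."""
          return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

      print(k_out_of_n_reliability(2, 3, 0.9))   # 2-out-of-3 example -> 0.972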

  9. Advances towards reliable identification and concentration determination of rare cells in peripheral blood

    Science.gov (United States)

    Alemany Server, R.; Martens, D.; Jans, K.; Bienstman, P.; Hill, D.

    2016-03-01

    Through further development, integration and validation of micro-nano-bio and biophotonics systems, FP7 CanDo is developing an instrument that will permit highly reproducible and reliable identification and concentration determination of rare cells in peripheral blood for two key societal challenges: early and low-cost anti-cancer drug efficacy determination, and cancer diagnosis/monitoring. A cellular link between the primary malignant tumour and the peripheral metastases, responsible for 90% of cancer-related deaths, has been established in the form of circulating tumour cells (CTCs) in peripheral blood. Furthermore, the relatively short survival time of CTCs in peripheral blood means that their detection is indicative of tumour progression, thereby providing, in addition to a prognostic value, an evaluation of therapeutic efficacy and early recognition of tumour progression in theranostics. In cancer patients, however, blood concentrations are very low (≤1 CTC per 1E9 cells) and current detection strategies are too insensitive, limiting use to the prognosis of only those with advanced metastatic cancer. Similarly, problems occur in therapeutics, with anti-cancer drug development leading to lengthy and costly trials, often preventing access to market. The novel cell separation/Raman analysis technologies plus nucleic acid based molecular characterization of the CanDo platform will provide an accurate CTC count with high throughput and high yield, meeting both key societal challenges. Being beyond the state of the art, it will lead to substantial share gains not just in the high-end markets of drug discovery and cancer diagnostics but, due to modular technologies, also in others. Here we present preliminary DNA hybridization sensing results.

  10. The reliability of identifying the Omega sign using axial T2-weighted magnetic resonance imaging.

    Science.gov (United States)

    Zakaria, Hesham Mostafa; Massa, Peter Joseph; Smith, Richard L; Moharram, Tarek Hazem; Corrigan, John; Lee, Ian; Schultz, Lonni; Hu, Jianhui; Patel, Suresh; Griffith, Brent

    2018-01-01

    Preoperative identification of the eloquent brain is important for neurosurgical planning. One common method of finding the motor cortex is by localizing "the Omega sign." No studies have tested the reliability of imaging to identify the Omega sign. We identified 40 recent and consecutive patients who had undergone preoperative functional magnetic resonance imaging for identification of the hand motor area prior to tumor resection. We recruited 11 neurosurgical residents of various levels of training and one board-certified neurosurgeon to identify the hand motor cortex Omega. Raters were given axial T2-weighted MR images and placed marks where they expected to find the Omega. Two board-certified radiologists graded and quantified the localization attempts. Inter-rater reliability was assessed using the kappa statistic, and Rao-Scott chi-square tests were used to examine the relationship between clinical factors and rater experience with correct identification of the Omega sign. The overall correct identification rate was 69.9% (95% CI = 63.4-75.7), ranging from 36.6% to 92.7% among all raters for the tumor side and from 46.2% to 97.4% for the non-tumor side. Anatomic distortion greatly affected correct identification, and senior residents were more likely to correctly identify the Omega than junior residents. Inter-rater reliability for the Omega sign is poor, with a Fleiss kappa of 0.23. We concluded that correct identification of the Omega sign is affected by tumor distortion and experience but overall is not reliable. This underscores the limitations of anatomic landmarks and the importance of utilizing multiple scanning planes and preoperative fMRI for appropriate localization.

  11. Reliability of lifeline networks under seismic hazard

    International Nuclear Information System (INIS)

    Selcuk, A. Sevtap; Yuecemen, M. Semih

    1999-01-01

    Lifelines, such as pipelines, transportation, communication and power transmission systems, are networks which extend spatially over large geographical regions. The quantification of the reliability (survival probability) of a lifeline under seismic threat requires attention, as the proper functioning of these systems during or after a destructive earthquake is vital. In this study, a lifeline is idealized as an equivalent network with the capacity of its elements being random and spatially correlated, and a comprehensive probabilistic model for the assessment of the reliability of lifelines under earthquake loads is developed. The seismic hazard that the network is exposed to is described by a probability distribution derived by using past earthquake occurrence data. The seismic hazard analysis is based on the 'classical' seismic hazard analysis model with some modifications. An efficient algorithm developed by Yoo and Deo (Yoo YB, Deo N. A comparison of algorithms for terminal pair reliability. IEEE Transactions on Reliability 1988; 37: 210-215) is utilized for the evaluation of the network reliability. This algorithm eliminates the CPU time and memory capacity problems for large networks. A comprehensive computer program, called LIFEPACK, is coded in Fortran to carry out the numerical computations. Two detailed case studies are presented to show the implementation of the proposed model.
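
    As a stand-in for the exact Yoo-Deo terminal-pair algorithm, a crude Monte Carlo estimate of s-t connectivity conveys what "network reliability" means here; the three-node network and the edge survival probabilities are hypothetical.

      import numpy as np

      def two_terminal_reliability_mc(nodes, edges, s, t, trials=20_000, seed=0):
          rng = np.random.default_rng(seed)
          up = 0
          for _ in range(trials):
              adj = {n: [] for n in nodes}
              for u, v, p in edges:                 # sample surviving edges
                  if rng.random() < p:
                      adj[u].append(v); adj[v].append(u)
              seen, stack = {s}, [s]                # depth-first search from s
              while stack:
                  for w in adj[stack.pop()]:
                      if w not in seen:
                          seen.add(w); stack.append(w)
              up += t in seen
          return up / trials

      edges = [(0, 1, 0.9), (1, 2, 0.9), (0, 2, 0.8)]   # hypothetical links
      print(two_terminal_reliability_mc([0, 1, 2], edges, 0, 2))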

  12. Reliability-guided digital image correlation for image deformation measurement

    International Nuclear Information System (INIS)

    Pan Bing

    2009-01-01

    A universally applicable reliability-guided digital image correlation (DIC) method is proposed for reliable image deformation measurement. The zero-mean normalized cross-correlation (ZNCC) coefficient is used to identify the reliability of each computed point. The correlation calculation begins with a seed point and is then guided by the ZNCC coefficient: the neighbors of the point with the highest ZNCC coefficient in the queue of computed points are processed first. Thus the calculation path always follows the most reliable direction, and the possible error propagation of the conventional DIC method can be avoided. The proposed DIC method is universally applicable to images with shadows, discontinuous areas, and deformation discontinuity. Two image pairs were used to evaluate the performance of the proposed technique, and the successful results clearly demonstrate its robustness and effectiveness
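
    The reliability-guided scan order can be sketched with a priority queue: always pop the highest-ZNCC point computed so far and push its unprocessed neighbors. Here zncc_of and neighbors are hypothetical callbacks standing in for the actual subset correlation at a pixel.

      import heapq

      def reliability_guided_order(seed, zncc_of, neighbors):
          done = {seed}
          heap = [(-zncc_of(seed), seed)]      # max-heap via negated ZNCC
          order = []
          while heap:
              _, p = heapq.heappop(heap)
              order.append(p)                  # process most reliable point first
              for q in neighbors(p):
                  if q not in done:
                      done.add(q)
                      heapq.heappush(heap, (-zncc_of(q), q))
          return order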

  13. Contribute to quantitative identification of casting defects based on computer analysis of X-ray images

    Directory of Open Access Journals (Sweden)

    Z. Ignaszak

    2007-12-01

    The forecast of the structure and properties of a casting is based on the results of computer simulation of the physical processes occurring during the casting process. For effective use of a simulation system it is necessary to validate the mathematical-physical models describing the formation of the casting and the creation of local discontinuities, which determine the casting properties. In the paper, a proposition for quantitative validation of a VP system using solidification casting defects, based on information sources of group II (NDT methods), is introduced. It was named VP/RT validation (virtual prototyping/radiographic testing validation). Nowadays, identification of casting defects visible on X-ray images is based on comparison of the X-ray image of the casting with reference images according to ASTM. The results of this comparison are often inconclusive because they rely on the operator's subjective assessment. In the paper, a system for quantitative identification of iron casting defects on X-ray images and classification of these defects into ASTM classes is presented. Methods of pattern recognition and machine learning were applied.

  14. Reliability Growth in Space Life Support Systems

    Science.gov (United States)

    Jones, Harry W.

    2014-01-01

    A hardware system's failure rate often increases over time due to wear and aging, but not always. Some systems instead show reliability growth, a decreasing failure rate with time, due to effective failure analysis and remedial hardware upgrades. Reliability grows when failure causes are removed by improved design. A mathematical reliability growth model allows the reliability growth rate to be computed from the failure data. The space shuttle was extensively maintained, refurbished, and upgraded after each flight and it experienced significant reliability growth during its operational life. In contrast, the International Space Station (ISS) is much more difficult to maintain and upgrade and its failure rate has been constant over time. The ISS Carbon Dioxide Removal Assembly (CDRA) reliability has slightly decreased. Failures on ISS and with the ISS CDRA continue to be a challenge.
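
    One common way to make "the reliability growth rate computed from the failure data" concrete is the Crow-AMSAA model, N(t) = lam * t**beta, where beta < 1 indicates reliability growth; the abstract does not name its model, so this choice, the least-squares fit, and the failure times below are all assumptions.

      import numpy as np

      def crow_amsaa_fit(failure_times):
          """Fit N(t) = lam * t**beta by least squares on the log-log curve."""
          t = np.sort(np.asarray(failure_times, dtype=float))
          n = np.arange(1, len(t) + 1)                 # cumulative failure count
          beta, log_lam = np.polyfit(np.log(t), np.log(n), 1)
          return np.exp(log_lam), beta

      # hypothetical cumulative operating hours at each failure
      lam, beta = crow_amsaa_fit([50, 180, 400, 900, 2000, 4500])
      print(lam, beta)                                 # beta < 1 => growth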

  15. Do strict rules and moving images increase the reliability of sequential identification procedures?.

    OpenAIRE

    Valentine, Tim; Darling, Stephen; Memon, Amina

    2007-01-01

    Live identification procedures in England and Wales have been replaced by use of video, which provides a sequential presentation of facial images. Sequential presentation of photographs provides some protection to innocent suspects from mistaken identification when used with strict instructions designed to prevent relative judgements (Lindsay, Lea & Fulford, 1991). However, the current procedure in England and Wales is incompatible with these strict instructions. The reported research investi...

  16. Compound analysis of gallstones using dual energy computed tomography-Results in a phantom model

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, Ralf W. [Department of Diagnostic and Interventional Radiology, Clinic of the Goethe University Frankfurt, Theodor-Stern-Kai 7, 60596 Frankfurt (Germany)]; Schulz, Julian R. [Department of Diagnostic and Interventional Radiology, Clinic of the Goethe University Frankfurt, Theodor-Stern-Kai 7, 60596 Frankfurt (Germany)]; Zedler, Barbara [Department of Forensic Medicine, Clinic of the Goethe University Frankfurt, Kennedyallee 104, 60596 Frankfurt (Germany)]; Graf, Thomas G. [Siemens AG Healthcare Sector, Computed Tomography, Physics and Applications, Siemensstrasse 1, 91313 Forchheim (Germany)]; Vogl, Thomas J. [Department of Diagnostic and Interventional Radiology, Clinic of the Goethe University Frankfurt, Theodor-Stern-Kai 7, 60596 Frankfurt (Germany)]

    2010-07-15

    Purpose: The potential of dual energy computed tomography (DECT) for the analysis of gallstone compounds was investigated. The main goal was to find parameters that can reliably define high percentage (>70%) cholesterol stones without calcium components. Materials and methods: 35 gallstones were analyzed with DECT using a phantom model. Stone samples were put into specimen containers filled with formalin. Containers were put into a water-filled cylindrical acrylic glass phantom. DECT scans were performed using a tube voltage/current of 140 kV/83 mAs (tube A) and 80 kV/340 mAs (tube B). ROI-measurements to determine CT attenuation of each sector of the stones that had different appearance on the CT images were performed. Finally, semi-quantitative infrared spectroscopy (FTIR) of these sectors was performed for chemical analysis. Results: ROI-measurements were performed in 45 different sectors in 35 gallstones. Sectors containing >70% of cholesterol and no calcium component (n = 20) on FTIR could be identified with 95% sensitivity and 100% specificity on DECT. These sectors showed typical attenuation of -8 ± 4 HU at 80 kV and +22 ± 3 HU at 140 kV. Even the presence of a small calcium component (<10%) hindered the reliable identification of cholesterol components as such. Conclusion: Dual energy CT allows for reliable identification of gallstones containing a high percentage of cholesterol and no calcium component in this pre-clinical phantom model. Results from in vivo or anthropomorphic phantom trials will have to confirm these results. This may enable the identification of patients eligible for non-surgical treatment options in the future.
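
    The reported attenuation values suggest a simple two-energy decision rule. A toy sketch, with windows set at roughly three standard deviations around the reported means; these thresholds are our assumption for illustration, not a clinically validated cut-off:

    ```python
    def classify_gallstone(hu_80kv, hu_140kv):
        """Flag sectors whose dual-energy attenuation matches the pattern reported
        for >70% cholesterol stones: about -8 +/- 4 HU at 80 kV and +22 +/- 3 HU
        at 140 kV (windows below are assumed ~3-sigma bands, not validated)."""
        if -20.0 <= hu_80kv <= 4.0 and 13.0 <= hu_140kv <= 31.0:
            return "consistent with high-cholesterol, calcium-free stone"
        return "other / indeterminate composition"

    print(classify_gallstone(-7.0, 23.0))
    ```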

  17. Compound analysis of gallstones using dual energy computed tomography-Results in a phantom model

    International Nuclear Information System (INIS)

    Bauer, Ralf W.; Schulz, Julian R.; Zedler, Barbara; Graf, Thomas G.; Vogl, Thomas J.

    2010-01-01

    Purpose: The potential of dual energy computed tomography (DECT) for the analysis of gallstone compounds was investigated. The main goal was to find parameters that can reliably define high percentage (>70%) cholesterol stones without calcium components. Materials and methods: 35 gallstones were analyzed with DECT using a phantom model. Stone samples were put into specimen containers filled with formalin. Containers were put into a water-filled cylindrical acrylic glass phantom. DECT scans were performed using a tube voltage/current of 140 kV/83 mAs (tube A) and 80 kV/340 mAs (tube B). ROI-measurements to determine CT attenuation of each sector of the stones that had different appearance on the CT images were performed. Finally, semi-quantitative infrared spectroscopy (FTIR) of these sectors was performed for chemical analysis. Results: ROI-measurements were performed in 45 different sectors in 35 gallstones. Sectors containing >70% of cholesterol and no calcium component (n = 20) on FTIR could be identified with 95% sensitivity and 100% specificity on DECT. These sectors showed typical attenuation of -8 ± 4 HU at 80 kV and +22 ± 3 HU at 140 kV. Even the presence of a small calcium component (<10%) hindered the reliable identification of cholesterol components as such. Conclusion: Dual energy CT allows for reliable identification of gallstones containing a high percentage of cholesterol and no calcium component in this pre-clinical phantom model. Results from in vivo or anthropomorphic phantom trials will have to confirm these results. This may enable the identification of patients eligible for non-surgical treatment options in the future.

  18. Reliability methods in nuclear power plant ageing management

    International Nuclear Information System (INIS)

    Simola, K.

    1999-01-01

    The aim of nuclear power plant ageing management is to maintain an adequate safety level throughout the lifetime of the plant. In ageing studies, the reliability of components, systems and structures is evaluated taking into account the possible time-dependent degradation. The phases of ageing analyses are generally the identification of critical components, identification and evaluation of ageing effects, and development of mitigation methods. This thesis focuses on the use of reliability methods and analyses of plant-specific operating experience in nuclear power plant ageing studies. The presented applications and method development have been related to nuclear power plants, but many of the approaches can also be applied outside the nuclear industry. The thesis consists of a summary and seven publications. The summary provides an overview of ageing management and discusses the role of reliability methods in ageing analyses. In the publications, practical applications and method development are described in more detail. The application areas at component and system level are motor-operated valves and protection automation systems, for which experience-based ageing analyses have been demonstrated. Furthermore, Bayesian ageing models for repairable components have been developed, and the management of ageing by improving maintenance practices is discussed. Recommendations for improvement of plant information management in order to facilitate ageing analyses are also given. The evaluation and mitigation of ageing effects on structural components is addressed by promoting the use of probabilistic modelling of crack growth, and developing models for evaluation of the reliability of inspection results. (orig.)

  19. Reliability methods in nuclear power plant ageing management

    Energy Technology Data Exchange (ETDEWEB)

    Simola, K. [VTT Automation, Espoo (Finland). Industrial Automation

    1999-07-01

    The aim of nuclear power plant ageing management is to maintain an adequate safety level throughout the lifetime of the plant. In ageing studies, the reliability of components, systems and structures is evaluated taking into account the possible time-dependent degradation. The phases of ageing analyses are generally the identification of critical components, identification and evaluation of ageing effects, and development of mitigation methods. This thesis focuses on the use of reliability methods and analyses of plant-specific operating experience in nuclear power plant ageing studies. The presented applications and method development have been related to nuclear power plants, but many of the approaches can also be applied outside the nuclear industry. The thesis consists of a summary and seven publications. The summary provides an overview of ageing management and discusses the role of reliability methods in ageing analyses. In the publications, practical applications and method development are described in more detail. The application areas at component and system level are motor-operated valves and protection automation systems, for which experience-based ageing analyses have been demonstrated. Furthermore, Bayesian ageing models for repairable components have been developed, and the management of ageing by improving maintenance practices is discussed. Recommendations for improvement of plant information management in order to facilitate ageing analyses are also given. The evaluation and mitigation of ageing effects on structural components is addressed by promoting the use of probabilistic modelling of crack growth, and developing models for evaluation of the reliability of inspection results. (orig.)

  20. Uncertainty propagation and sensitivity analysis in system reliability assessment via unscented transformation

    International Nuclear Information System (INIS)

    Rocco Sanseverino, Claudio M.; Ramirez-Marquez, José Emmanuel

    2014-01-01

    The reliability of a system, notwithstanding its intended function, can be significantly affected by the uncertainty in the reliability estimates of the components that define the system. This paper implements the Unscented Transformation to quantify the effects of the uncertainty of component reliability through two approaches. The first approach is based on the concept of uncertainty propagation, which is the assessment of the effect that the variability of the component reliabilities produces on the variance of the system reliability. This UT-based assessment has been previously considered in the literature, but only for systems represented through series/parallel configurations. In this paper the assessment is extended to systems whose reliability cannot be represented through analytical expressions and that require, for example, Monte Carlo simulation. The second approach consists of the evaluation of the importance of components, i.e., the identification of the components that contribute most to the variance of the system reliability. An extension of the UT is proposed to evaluate the so-called “main effects” of each component, as well as to assess higher-order component interactions. Several examples with excellent results illustrate the proposed approach. - Highlights: • Simulation based approach for computing reliability estimates. • Computation of reliability variance via 2n+1 points. • Immediate computation of component importance. • Application to network systems
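
    The 2n+1-point mechanism named in the highlights can be sketched directly. A minimal illustration for independent component reliabilities (diagonal covariance) propagated through a user-supplied system reliability function; the system structure and numbers below are invented, not the paper's examples:

    ```python
    import numpy as np

    def unscented_transform(mu, var, g, kappa=1.0):
        """Propagate independent input uncertainty through g with 2n+1 sigma points.
        mu, var: means and variances of the component reliabilities (independence
        assumed, so the covariance is diagonal); g: system reliability function."""
        mu = np.asarray(mu, float)
        n = len(mu)
        offset = np.sqrt((n + kappa) * np.asarray(var, float))
        pts = [mu]
        for i in range(n):
            e = np.zeros(n)
            e[i] = offset[i]
            pts += [mu + e, mu - e]
        w = np.r_[kappa / (n + kappa), np.full(2 * n, 0.5 / (n + kappa))]
        y = np.array([g(p) for p in pts])
        mean = w @ y
        return mean, w @ (y - mean) ** 2

    # Invented system: components 1,2 in series with a parallel pair (3,4).
    g = lambda r: r[0] * r[1] * (1.0 - (1.0 - r[2]) * (1.0 - r[3]))
    mean, variance = unscented_transform([0.95, 0.90, 0.80, 0.85],
                                         [1e-4, 4e-4, 9e-4, 4e-4], g)
    print(mean, variance)
    ```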

  1. The reliability and validity of the Alcohol Use Disorders Identification Test (AUDIT) in a German general practice population sample.

    Science.gov (United States)

    Dybek, Inga; Bischof, Gallus; Grothues, Janina; Reinhardt, Susa; Meyer, Christian; Hapke, Ulfert; John, Ulrich; Broocks, Andreas; Hohagen, Fritz; Rumpf, Hans-Jürgen

    2006-05-01

    Our goal was to analyze the retest reliability and validity of the Alcohol Use Disorders Identification Test (AUDIT) in a primary-care setting and recommend a cut-off value for the different alcohol-related diagnoses. Participants recruited from general practices (GPs) in two northern German cities received the AUDIT, which was embedded in a health-risk questionnaire. In total, 10,803 screenings were conducted. The retest reliability was tested on a subsample of 99 patients, with an intertest interval of 30 days. Sensitivity and specificity at a number of different cut-off values were estimated for the sample of alcohol consumers (n=8237). For this study, 1109 screen-positive patients received a diagnostic interview. Individuals who scored less than five points in the AUDIT and also tested negative in a second alcohol-related screen were defined as "negative" (n=6003). This definition was supported by diagnostic interviews of 99 screen-negative patients from which no false negatives could be detected. As the gold standard for detection of an alcohol-use disorder (AUD), we used the Munich-Composite International Diagnostic Interview (MCIDI), which is based on Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, criteria. On the item level, the reliability, measured by the intraclass correlation coefficient (ICC), ranged between .39 (Item 9) and .98 (Item 10). For the total score, the ICC was .95. For cut-off values of eight points and five points, 87.5% and 88.9%, respectively, of the AUDIT-positives, and 98.9% and 95.1%, respectively, of the AUDIT-negatives were identically identified at retest, with kappa = .86 and kappa = .81. At the cut-off value of five points, we determined good combinations of sensitivity and specificity for the following diagnoses: alcohol dependence (sensitivity and specificity of .97 and .88, respectively), AUD (.97 and .92), and AUD and/or at-risk consumption (.97 and .91). Embedded in a health-risk questionnaire in
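
    The cut-off evaluation described above reduces to counting screen decisions against the diagnostic gold standard. A minimal sketch with made-up scores and diagnoses (a real analysis would use the MCIDI outcomes):

    ```python
    import numpy as np

    def screen_performance(scores, has_disorder, cutoff):
        """Sensitivity and specificity of a screen where score >= cutoff is positive."""
        pos = np.asarray(scores) >= cutoff
        truth = np.asarray(has_disorder, bool)
        sensitivity = pos[truth].mean()       # positives among true cases
        specificity = (~pos)[~truth].mean()   # negatives among true non-cases
        return sensitivity, specificity

    # Fabricated example: six respondents, binary diagnoses, cut-off of five points.
    print(screen_performance([3, 5, 9, 12, 2, 7], [0, 1, 1, 1, 0, 0], cutoff=5))
    ```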

  2. Computer-assisted radiographic calculation of spinal curvature in brachycephalic "screw-tailed" dog breeds with congenital thoracic vertebral malformations: reliability and clinical evaluation.

    Directory of Open Access Journals (Sweden)

    Julien Guevar

    Full Text Available The objectives of this study were to investigate computer-assisted digital radiographic measurement of Cobb angles in dogs with congenital thoracic vertebral malformations, to determine its intra- and inter-observer reliability, and to determine its association with the presence of neurological deficits. Medical records were reviewed (2009-2013) to identify brachycephalic screw-tailed dog breeds with radiographic studies of the thoracic vertebral column and with at least one vertebral malformation present. Twenty-eight dogs were included in the study. The end vertebrae were defined as the cranial end plate of the vertebra cranial to the malformed vertebra and the caudal end plate of the vertebra caudal to the malformed vertebra. Three observers performed the measurements twice. Intraclass correlation coefficients were used to calculate the intra- and inter-observer reliabilities. The intraclass correlation coefficient was excellent for all intra- and inter-observer measurements using this method. There was a significant difference in the kyphotic Cobb angle between dogs with and without associated neurological deficits. The majority of dogs with neurological deficits had a kyphotic Cobb angle higher than 35°. No significant difference in the scoliotic Cobb angle was observed. We concluded that the computer-assisted digital radiographic measurement of the Cobb angle for kyphosis and scoliosis is a valid, reproducible and reliable method to quantify the degree of spinal curvature in brachycephalic screw-tailed dog breeds with congenital thoracic vertebral malformations.
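
    The underlying geometry is simply the angle between the two end-plate lines. A minimal sketch of that computation from four digitised landmark points (the coordinates in the example are hypothetical):

    ```python
    import numpy as np

    def cobb_angle(p1, p2, q1, q2):
        """Cobb angle (degrees) between two end-plate lines, each defined
        by two landmark points digitised on the radiograph."""
        u = np.subtract(p2, p1)
        v = np.subtract(q2, q1)
        cos_ang = abs(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0)))

    # Hypothetical pixel coordinates for cranial and caudal end-plate lines:
    print(f"{cobb_angle((0, 0), (10, 1), (0, 5), (10, -4)):.1f} deg")
    ```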

  3. Assessment of the Maximal Split-Half Coefficient to Estimate Reliability

    Science.gov (United States)

    Thompson, Barry L.; Green, Samuel B.; Yang, Yanyun

    2010-01-01

    The maximal split-half coefficient is computed by calculating all possible split-half reliability estimates for a scale and then choosing the maximal value as the reliability estimate. Osburn compared the maximal split-half coefficient with 10 other internal consistency estimates of reliability and concluded that it yielded the most consistently…
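
    The brute-force definition translates directly into code: enumerate every half-split of the items, apply the Spearman-Brown correction to each split-half correlation, and keep the maximum. A minimal sketch for an even number of items, feasible only for small scales since the number of splits grows combinatorially:

    ```python
    import numpy as np
    from itertools import combinations

    def max_split_half(data):
        """Maximal split-half coefficient for a (respondents x items) matrix
        with an even number of items: best Spearman-Brown corrected
        reliability over all half-splits."""
        X = np.asarray(data, float)
        k = X.shape[1]
        items = set(range(k))
        best = -1.0
        for half in combinations(range(k), k // 2):
            if 0 not in half:          # count each partition only once
                continue
            a = X[:, list(half)].sum(axis=1)
            b = X[:, sorted(items - set(half))].sum(axis=1)
            r = np.corrcoef(a, b)[0, 1]
            best = max(best, 2 * r / (1 + r))   # Spearman-Brown step-up
        return best

    # Fabricated 5-respondent, 4-item example:
    print(round(max_split_half([[3, 4, 3, 4], [2, 2, 3, 2], [4, 5, 5, 4],
                                [1, 2, 1, 2], [3, 3, 4, 3]]), 3))
    ```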

  4. An artificial intelligence system for reliability studies

    International Nuclear Information System (INIS)

    Llory, M.; Ancelin, C.; Bannelier, M.; Bouhadana, H.; Bouissou, M.; Lucas, J.Y.; Magne, L.; Villate, N.

    1990-01-01

    The EDF (French Electricity Company) software developed for computer-aided reliability studies is considered. Such software tools were applied in the study of the safety requirements of the Paluel nuclear power plant. The reliability models, based on IF-THEN type rules, and the generation of models by the expert system are described. The models are then processed by applying algorithmic structures. [fr]

  5. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    The know-how in reliability and safety design and analysis techniques at VTT has been established over several years of analyzing the reliability of the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and developed for use in the process industry, conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety is ongoing in a number of research and development projects

  6. Structural Time Domain Identification (STDI) Toolbox for Use with MATLAB

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.; Brincker, Rune

    1997-01-01

    The Structural Time Domain Identification (STDI) toolbox for use with MATLAB™ is developed at Aalborg University, Denmark, based on the system identification research performed during recent years. By now, a reliable set of functions offers a wide spectrum of services for all the important steps...

  7. Structural Time Domain Identification (STDI) Toolbox for Use with MATLAB

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.; Brincker, Rune

    The Structural Time Domain Identification (STDI) toolbox for use with MATLAB™ is developed at Aalborg University, Denmark, based on the system identification research performed during recent years. By now, a reliable set of functions offers a wide spectrum of services for all the important steps...

  8. DIRAC: reliable data management for LHCb

    International Nuclear Information System (INIS)

    Smith, A C; Tsaregorodtsev, A

    2008-01-01

    DIRAC, LHCb's Grid Workload and Data Management System, utilizes WLCG resources and middleware components to perform distributed computing tasks satisfying LHCb's Computing Model. The Data Management System (DMS) handles data transfer and data access within LHCb. Its scope ranges from the output of the LHCb Online system to Grid-enabled storage for all data types. It supports metadata for these files in replica and bookkeeping catalogues, allowing dataset selection and localization. The DMS controls the movement of files in a redundant fashion whilst providing utilities for accessing all metadata. To do these tasks effectively the DMS requires complete self integrity between its components and external physical storage. The DMS provides highly redundant management of all LHCb data to leverage available storage resources and to manage transient errors in underlying services. It provides data driven and reliable distribution of files as well as reliable job output upload, utilizing VO Boxes at LHCb Tier1 sites to prevent data loss. This paper presents several examples of mechanisms implemented in the DMS to increase reliability, availability and integrity, highlighting successful design choices and limitations discovered

  9. TrueAllele casework on Virginia DNA mixture evidence: computer and manual interpretation in 72 reported criminal cases.

    Directory of Open Access Journals (Sweden)

    Mark W Perlin

    Full Text Available Mixtures are a commonly encountered form of biological evidence that contain DNA from two or more contributors. Laboratory analysis of mixtures produces data signals that usually cannot be separated into distinct contributor genotypes. Computer modeling can resolve the genotypes up to probability, reflecting the uncertainty inherent in the data. Human analysts address the problem by simplifying the quantitative data in a threshold process that discards considerable identification information. Elevated stochastic threshold levels potentially discard more information. This study examines three different mixture interpretation methods. In 72 criminal cases, 111 genotype comparisons were made between 92 mixture items and relevant reference samples. TrueAllele computer modeling was done on all the evidence samples, and documented in DNA match reports that were provided as evidence for each case. Threshold-based Combined Probability of Inclusion (CPI) and stochastically modified CPI (mCPI) analyses were performed as well. TrueAllele's identification information in 101 positive matches was used to assess the reliability of its modeling approach. Comparison was made with 81 CPI and 53 mCPI DNA match statistics that were manually derived from the same data. There were statistically significant differences between the DNA interpretation methods. TrueAllele gave an average match statistic of 113 billion, CPI averaged 6.68 million, and mCPI averaged 140. The computer was highly specific, with a false positive rate under 0.005%. The modeling approach was precise, having a factor of two within-group standard deviation. TrueAllele accuracy was indicated by having uniformly distributed match statistics over the data set. The computer could make genotype comparisons that were impossible or impractical using manual methods. TrueAllele computer interpretation of DNA mixture evidence is sensitive, specific, precise, accurate and more informative than manual methods.
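
    Of the three statistics compared, the CPI is simple enough to state in a few lines: at each locus the inclusion probability is the squared sum of the frequencies of the alleles observed in the mixture, and loci multiply; the match statistic is its reciprocal. A minimal sketch with invented allele frequencies:

    ```python
    from math import prod

    def combined_probability_of_inclusion(loci_allele_freqs):
        """CPI across loci: at each locus, the inclusion probability is the
        squared sum of the frequencies of the alleles seen in the mixture."""
        return prod(sum(freqs) ** 2 for freqs in loci_allele_freqs)

    # Invented allele frequencies at two loci:
    cpi = combined_probability_of_inclusion([[0.12, 0.08, 0.20], [0.10, 0.05]])
    print(f"CPI = {cpi:.3e}, match statistic = {1 / cpi:,.0f}")
    ```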

  10. Reliability data collection and use in risk and availability assessment

    International Nuclear Information System (INIS)

    Colombari, V.

    1989-01-01

    For EuReDatA it is a prevailing objective to initiate and support contact between experts, companies and institutions active in reliability engineering and research. Main topics of this 6th EuReDatA Conference are: Reliability data banks; incidents data banks; common cause data; source and propagation of uncertainties; computer aided risk analysis; reliability and incidents data acquisition and processing; human reliability; probabilistic safety and availability assessment; feedback of reliability into system design; data fusion; reliability modeling and techniques; structural and mechanical reliability; consequence modeling; software and electronic reliability; reliability tests. Some conference papers are separately indexed in the database. (HP)

  11. NDE reliability and advanced NDE technology validation

    International Nuclear Information System (INIS)

    Doctor, S.R.; Deffenbaugh, J.D.; Good, M.S.; Green, E.R.; Heasler, P.G.; Hutton, P.H.; Reid, L.D.; Simonen, F.A.; Spanner, J.C.; Vo, T.V.

    1989-01-01

    This paper reports on progress for three programs: (1) evaluation and improvement in nondestructive examination reliability for inservice inspection of light water reactors (LWR) (NDE Reliability Program), (2) field validation acceptance, and training for advanced NDE technology, and (3) evaluation of computer-based NDE techniques and regional support of inspection activities. The NDE Reliability Program objectives are to quantify the reliability of inservice inspection techniques for LWR primary system components through independent research and establish means for obtaining improvements in the reliability of inservice inspections. The areas of significant progress will be described concerning ASME Code activities, re-analysis of the PISC-II data, the equipment interaction matrix study, new inspection criteria, and PISC-III. The objectives of the second program are to develop field procedures for the AE and SAFT-UT techniques, perform field validation testing of these techniques, provide training in the techniques for NRC headquarters and regional staff, and work with the ASME Code for the use of these advanced technologies. The final program's objective is to evaluate the reliability and accuracy of interpretation of results from computer-based ultrasonic inservice inspection systems, and to develop guidelines for NRC staff to monitor and evaluate the effectiveness of inservice inspections conducted on nuclear power reactors. This program started in the last quarter of FY89, and the extent of the program was to prepare a work plan for presentation to and approval from a technical advisory group of NRC staff

  12. Modeling cognition dynamics and its application to human reliability analysis

    International Nuclear Information System (INIS)

    Mosleh, A.; Smidts, C.; Shen, S.H.

    1996-01-01

    For the past two decades, a number of approaches have been proposed for the identification and estimation of the likelihood of human errors, particularly for use in the risk and reliability studies of nuclear power plants. Despite the wide-spread use of the most popular among these methods, their fundamental weaknesses are widely recognized, and the treatment of human reliability has been considered as one of the soft spots of risk studies of large technological systems. To alleviate the situation, new efforts have focused on the development of human reliability models based on a more fundamental understanding of operator response and its cognitive aspects

  13. ARCHITECTURE AND RELIABILITY OF OPERATING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Stanislav V. Nazarov

    2018-03-01

    Full Text Available Progress in microprocessor production technology has significantly increased the reliability and performance of computer systems hardware. The same cannot be said of the corresponding characteristics of software and of its basis, the operating system (OS): the achievements of software engineering in this field are more modest. Both directions of OS improvement (increasing productivity and increasing reliability) are connected with the development of effective structures for these systems. The functional complexity of the OS leads to the complexity of its architecture, which is further increased by the specialization of the operating system depending on the computer system's application area (complex scientific calculations, real-time, information retrieval systems, automated and automatic control systems, etc.). This fact has led to the variety of modern OSs. The reliability of different OS structures can be estimated only from long-term field experiments or from simulation modeling, which is most often unacceptable because of the time and funds such studies require. This survey attempts to evaluate the reliability of two main OS architectures: a large multi-layered modular kernel and a multiserver (client-server) system. Models of these systems, represented by continuous-time Markov chains, are developed and explored in the stationary mode by passing from the Kolmogorov systems of differential equations to systems of linear algebraic equations.
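
    The stationary analysis mentioned at the end is a standard computation: at steady state the Kolmogorov forward equations reduce to the linear system pi @ Q = 0 with the probabilities summing to one. A minimal sketch with an invented three-state generator (running, degraded, failed), not a model from the paper:

    ```python
    import numpy as np

    def stationary_distribution(Q):
        """Stationary distribution pi of a CTMC: solve pi @ Q = 0, sum(pi) = 1,
        i.e. the steady-state form of the Kolmogorov forward equations."""
        n = Q.shape[0]
        A = np.vstack([Q.T, np.ones(n)])      # constraints: Q^T pi = 0, 1^T pi = 1
        b = np.r_[np.zeros(n), 1.0]
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pi

    # Invented 3-state OS model: running -> degraded -> failed -> (restart) -> running.
    Q = np.array([[-0.02,  0.02,  0.00],
                  [ 0.00, -0.10,  0.10],
                  [ 0.50,  0.00, -0.50]])
    print(stationary_distribution(Q))         # long-run fraction of time per state
    ```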

  14. The commissioning of CMS sites: Improving the site reliability

    International Nuclear Information System (INIS)

    Belforte, S; Fisk, I; Flix, J; Hernandez, J M; Klem, J; Letts, J; Magini, N; Saiz, P; Sciaba, A

    2010-01-01

    The computing system of the CMS experiment works using distributed resources from more than 60 computing centres worldwide. These centres, located in Europe, America and Asia are interconnected by the Worldwide LHC Computing Grid. The operation of the system requires a stable and reliable behaviour of the underlying infrastructure. CMS has established a procedure to extensively test all relevant aspects of a Grid site, such as the ability to efficiently use their network to transfer data, the functionality of all the site services relevant for CMS and the capability to sustain the various CMS computing workflows at the required scale. This contribution describes in detail the procedure to rate CMS sites depending on their performance, including the complete automation of the program, the description of monitoring tools, and its impact in improving the overall reliability of the Grid from the point of view of the CMS computing system.

  15. Structural system reliability calculation using a probabilistic fault tree analysis method

    Science.gov (United States)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computation-intensive calculations. A computer program has been developed to implement the PFTA.
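
    For orientation, the quantity being estimated can be sketched with plain Monte Carlo over the basic events of a small fault tree; the paper's adaptive importance sampling is far more efficient for rare events, and the tree structure and probabilities here are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative fault tree: TOP = (B1 AND B2) OR B3.
    def top_event(b):
        return (b[:, 0] & b[:, 1]) | b[:, 2]

    p = np.array([0.01, 0.02, 0.001])          # basic-event probabilities (invented)
    draws = rng.random((1_000_000, 3)) < p     # Bernoulli sample of basic events
    print("P(top event) ~", top_event(draws).mean())   # exact value: ~1.2e-3
    ```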

  16. Intraobserver and interobserver reliability of radial torsion angle measurements by a new and alternative method with computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Freitas, Luiz Fernando Pinheiro de; Barbieri, Claudio Henrique; Mazzer, Nilton; Zatiti, Salomao Chade Assan; Bellucci, Angela Delete [Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). School of Medicine. Dept. of Biomechanics, Medicine and Rehabilitation; Nogueira-Barbosa, Marcello Henrique, E-mail: marcello@fmrp.usp.b [Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). School of Medicine. Radiology Div.

    2010-07-01

    Objective: to evaluate the intraobserver and interobserver reliability of radial torsion angle measurement using computed tomography. Methods: twelve pairs of cadaver radii and 116 forearms from 58 healthy volunteers were evaluated using axial computed tomography sections measured at the level of the bicipital tuberosity and the subchondral region of the radius. During digital imaging, the angle was formed by two lines, one diametrically perpendicular to the radial tubercle and the other tangential to the volar rim of the distal joint surface. Measurements were performed twice each by three observers. Results: in cadaveric bones, the mean radial torsion angle was 1.48° (range: -6° to 9°) on the right and 1.62° (-6° to 8°) on the left, with a mean difference between the right and left sides of 1.61° (0° to 8°). In volunteers, the mean radial torsion angle was 3.00° (-17° to 17°) on the right and 2.91° (-16° to 15°) on the left, with a mean difference between the sides of 1.58° (0° to 7°). There was no significant difference between the two sides. The interobserver correlation coefficient for the cadaver radii measurements was 0.88 (0.72 - 0.96) and 0.81 (0.58 - 0.93) for the right and left radius, respectively, while for the volunteers the coefficient was 0.84 (0.77 - 0.90) and 0.83 (0.75 - 0.89), respectively. Intraobserver reliability was high. Conclusion: the described method is reproducible and applicable even when the radial tubercle has a rounded contour. (author)

  17. Parametric Mass Reliability Study

    Science.gov (United States)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. The components that make up the most mass (such as computer housings, pump casings, and the silicon boards of PCBs) are typically the most reliable, while the components that tend to fail earliest (such as seals or gaskets) typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs and the mass of ORU subcomponents to reliability.

  18. Accuracy and Reliability of Cone-Beam Computed Tomography for Linear and Volumetric Mandibular Condyle Measurements. A Human Cadaver Study.

    Science.gov (United States)

    García-Sanz, Verónica; Bellot-Arcís, Carlos; Hernández, Virginia; Serrano-Sánchez, Pedro; Guarinos, Juan; Paredes-Gallardo, Vanessa

    2017-09-20

    The accuracy of Cone-Beam Computed Tomography (CBCT) on linear and volumetric measurements on condyles has only been assessed on dry skulls. The aim of this study was to evaluate the reliability and accuracy of linear and volumetric measurements of mandibular condyles in the presence of soft tissues using CBCT. Six embalmed cadaver heads were used. CBCT scans were taken, followed by the extraction of the condyles. The water displacement technique was used to calculate the volumes of the condyles and three linear measurements were made using a digital caliper, these measurements serving as the gold standard. Surface models of the condyles were obtained using a 3D scanner, and superimposed onto the CBCT images. Condyles were isolated on the CBCT render volume using the surface models as reference and volumes were measured. Linear measurements were made on CBCT slices. The CBCT method was found to be reliable for both volumetric and linear measurements (CV  0.90). Highly accurate values were obtained for the three linear measurements and volume. CBCT is a reliable and accurate method for taking volumetric and linear measurements on mandibular condyles in the presence of soft tissue, and so a valid tool for clinical diagnosis.

  19. The problem of software reliability

    International Nuclear Information System (INIS)

    Ballard, G.M.

    1989-01-01

    The state of the art in safety and reliability assessment of the software of industrial computer systems is reviewed, and likely progress over the next few years is identified and compared with the perceived needs of the user. Some of the current projects contributing to the development of new techniques for assessing software reliability are described. One is the software test and evaluation method, which looked at the faults within and between two manufacturers' specifications, faults in the codes, and inconsistencies between the codes and specifications. The results are given. (author)

  20. Achieving High Reliability with People, Processes, and Technology.

    Science.gov (United States)

    Saunders, Candice L; Brennan, John A

    2017-01-01

    High reliability as a corporate value in healthcare can be achieved by meeting the "Quadruple Aim" of improving population health, reducing per capita costs, enhancing the patient experience, and improving provider wellness. This drive starts with the board of trustees, CEO, and other senior leaders who ingrain high reliability throughout the organization. At WellStar Health System, the board developed an ambitious goal to become a top-decile health system in safety and quality metrics. To achieve this goal, WellStar has embarked on a journey toward high reliability and has committed to Lean management practices consistent with the Institute for Healthcare Improvement's definition of a high-reliability organization (HRO): one that is committed to the prevention of failure, early identification and mitigation of failure, and redesign of processes based on identifiable failures. In the end, a successful HRO can provide safe, effective, patient- and family-centered, timely, efficient, and equitable care through a convergence of people, processes, and technology.

  1. Progress and challenges in bioinformatics approaches for enhancer identification

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2017-02-03

    Enhancers are cis-acting DNA elements that play critical roles in distal regulation of gene expression. Identifying enhancers is an important step for understanding distinct gene expression programs that may reflect normal and pathogenic cellular conditions. Experimental identification of enhancers is constrained by the set of conditions used in the experiment. This requires multiple experiments to identify enhancers, as they can be active under specific cellular conditions but not in different cell types/tissues or cellular states. This has opened prospects for computational prediction methods that can be used for high-throughput identification of putative enhancers to complement experimental approaches. Potential functions and properties of predicted enhancers have been catalogued and summarized in several enhancer-oriented databases. Because the current methods for the computational prediction of enhancers produce significantly different enhancer predictions, it will be beneficial for the research community to have an overview of the strategies and solutions developed in this field. In this review, we focus on the identification and analysis of enhancers by bioinformatics approaches. First, we describe a general framework for computational identification of enhancers, present relevant data types and discuss possible computational solutions. Next, we cover over 30 existing computational enhancer identification methods that were developed since 2000. Our review highlights advantages, limitations and potentials, while suggesting pragmatic guidelines for development of more efficient computational enhancer prediction methods. Finally, we discuss challenges and open problems of this topic, which require further consideration.

  2. Progress and challenges in bioinformatics approaches for enhancer identification

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Kalnis, Panos; Bajic, Vladimir B.

    2017-01-01

    Enhancers are cis-acting DNA elements that play critical roles in distal regulation of gene expression. Identifying enhancers is an important step for understanding distinct gene expression programs that may reflect normal and pathogenic cellular conditions. Experimental identification of enhancers is constrained by the set of conditions used in the experiment. This requires multiple experiments to identify enhancers, as they can be active under specific cellular conditions but not in different cell types/tissues or cellular states. This has opened prospects for computational prediction methods that can be used for high-throughput identification of putative enhancers to complement experimental approaches. Potential functions and properties of predicted enhancers have been catalogued and summarized in several enhancer-oriented databases. Because the current methods for the computational prediction of enhancers produce significantly different enhancer predictions, it will be beneficial for the research community to have an overview of the strategies and solutions developed in this field. In this review, we focus on the identification and analysis of enhancers by bioinformatics approaches. First, we describe a general framework for computational identification of enhancers, present relevant data types and discuss possible computational solutions. Next, we cover over 30 existing computational enhancer identification methods that were developed since 2000. Our review highlights advantages, limitations and potentials, while suggesting pragmatic guidelines for development of more efficient computational enhancer prediction methods. Finally, we discuss challenges and open problems of this topic, which require further consideration.

  3. On-line signal trend identification

    International Nuclear Information System (INIS)

    Tambouratzis, T.; Antonopoulos-Domis, M.

    2004-01-01

    An artificial neural network, based on the self-organizing map, is proposed for on-line signal trend identification. Trends are categorized at each incoming signal sample as steady-state, increasing or decreasing, and are further classified according to characteristics such as signal shape and rate of change. Tests with model-generated signals illustrate the ability of the self-organizing map to accurately and reliably perform on-line trend identification in terms of both detection and classification. The proposed methodology has been found robust to the presence of white noise
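
    To make the idea concrete, here is a minimal 1-D self-organizing map trained on fixed-length signal windows, with each window then labelled by the slope of its best-matching prototype. This is a generic SOM sketch under our own assumptions, not the paper's network or training schedule; all parameters and the synthetic data are illustrative:

    ```python
    import numpy as np

    def train_som(windows, n_nodes=9, iters=3000, lr0=0.5, sigma0=2.0, seed=0):
        """Train a minimal 1-D self-organizing map on fixed-length signal windows."""
        rng = np.random.default_rng(seed)
        W = 0.1 * rng.standard_normal((n_nodes, windows.shape[1]))
        for t in range(iters):
            x = windows[rng.integers(len(windows))]
            bmu = int(np.argmin(((W - x) ** 2).sum(axis=1)))   # best-matching unit
            frac = 1.0 - t / iters                             # decaying schedules
            h = np.exp(-(np.arange(n_nodes) - bmu) ** 2
                       / (2.0 * (sigma0 * frac + 0.1) ** 2))   # neighbourhood
            W += (lr0 * frac) * h[:, None] * (x - W)
        return W

    def trend_of(W, window, eps=0.01):
        """Label a window by the slope of its best-matching prototype."""
        bmu = int(np.argmin(((W - window) ** 2).sum(axis=1)))
        slope = np.polyfit(np.arange(W.shape[1]), W[bmu], 1)[0]
        return "increasing" if slope > eps else "decreasing" if slope < -eps else "steady"

    # Demo on synthetic zero-mean windows: ramps up, ramps down, and flat noise.
    k = 16
    base = np.arange(k) - (k - 1) / 2.0
    rng = np.random.default_rng(1)
    windows = np.vstack([s * 0.05 * base + 0.02 * rng.standard_normal(k)
                         for s in (1, -1, 0) for _ in range(30)])
    W = train_som(windows)
    print(trend_of(W, 0.05 * base))   # typically classified as "increasing"
    ```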

  4. Computer proficiency questionnaire: assessing low and high computer proficient seniors.

    Science.gov (United States)

    Boot, Walter R; Charness, Neil; Czaja, Sara J; Sharit, Joseph; Rogers, Wendy A; Fisk, Arthur D; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran

    2015-06-01

    Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies from noncomputer users to extremely skilled users. To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. The CPQ demonstrated excellent reliability (Cronbach's α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among low computer proficient older adults.
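
    The internal-consistency figure quoted above, Cronbach's α, is a one-formula computation over an item-score matrix. A minimal sketch with fabricated responses:

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for a (respondents x items) score matrix:
        alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
        X = np.asarray(scores, float)
        k = X.shape[1]
        return k / (k - 1) * (1 - X.var(axis=0, ddof=1).sum()
                              / X.sum(axis=1).var(ddof=1))

    # Fabricated 5-respondent, 4-item example:
    print(round(cronbach_alpha([[3, 4, 3, 4], [2, 2, 3, 2], [4, 5, 5, 4],
                                [1, 2, 1, 2], [3, 3, 4, 3]]), 3))
    ```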

  5. Use of computers at nuclear power plants

    International Nuclear Information System (INIS)

    Sen'kin, V.I.; Ozhigano, Yu.V.

    1974-01-01

    Applications of information and control computers in reactor control systems in Great Britain, the Federal Republic of Germany, France, Canada, and the USA are surveyed. For the purpose of increasing computer reliability, effective means were designed for emergency operation and automatic computerized control, and highly reliable micromodule modifications were developed. Numerical data units were handled along with the development of methods and diagrams for converting analog values to numerical values, in accordance with modern requirements. Some data are presented on computer reliability in operating nuclear power plants, both proposed and under construction. It is concluded that informational and computational computers are finding increasingly wide distribution in foreign nuclear power stations. Rapid action, the possibility of monitoring a large number of parameters, and increasingly reliable operation are speeding up the process of introducing computers into nuclear power and broadening their functions. (V.P.)

  6. Drought Risk Identification: Early Warning System of Seasonal Agrometeorological Drought

    Science.gov (United States)

    Dalecios, Nicolas; Spyropoulos, Nicos V.; Tarquis, Ana M.

    2014-05-01

    By considering drought as a hazard, drought types are classified into three categories, namely meteorological (or climatological), agrometeorological (or agricultural), and hydrological drought; the socioeconomic impacts can be considered a fourth class. This paper addresses agrometeorological drought affecting agriculture within the risk management framework. Risk management consists of risk assessment, as well as feedback on the adopted risk reduction measures. Risk assessment comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. This paper deals with the quantification and monitoring of agrometeorological drought, which constitute part of risk identification. For the quantitative assessment of agrometeorological or agricultural drought, as well as the computation of spatiotemporal features, one of the most reliable and widely used indices is applied, namely the Vegetation Health Index (VHI). The computation of VHI is based on satellite data of temperature and the Normalized Difference Vegetation Index (NDVI). The spatiotemporal features of drought extracted from VHI are: areal extent, onset and end time, duration and severity. In this paper, a 20-year (1981-2001) time series of NOAA/AVHRR satellite data is used, from which monthly images of VHI are extracted. Application is implemented in Thessaly, the major agricultural region of Greece, characterized by vulnerable and drought-prone agriculture. The results show that every year there is a seasonal agrometeorological drought with a gradual increase in areal extent and severity, with peaks usually appearing during the summer. Drought monitoring is conducted with monthly remotely sensed VHI images. Drought early warning is developed using empirical relationships of severity and areal extent. In particular, two second-order polynomials are fitted, one for low and the other for high severity drought, respectively. The two fitted curves offer a seasonal
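
    VHI is conventionally computed as a weighted blend of the vegetation condition index (from NDVI) and the thermal condition index (from brightness temperature), each scaled by multi-year per-pixel extremes. A minimal sketch of that standard formulation; the weight a = 0.5 and the drought threshold of 40 are common conventions in the VHI literature, not values taken from this paper:

    ```python
    def vegetation_health_index(ndvi, bt, ndvi_min, ndvi_max, bt_min, bt_max, a=0.5):
        """VHI = a*VCI + (1-a)*TCI, where VCI scales NDVI and TCI scales
        brightness temperature (BT) between their multi-year per-pixel extremes."""
        vci = 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)
        tci = 100.0 * (bt_max - bt) / (bt_max - bt_min)
        return a * vci + (1.0 - a) * tci

    # Hypothetical pixel values and historical extremes:
    vhi = vegetation_health_index(ndvi=0.31, bt=306.0, ndvi_min=0.18, ndvi_max=0.62,
                                  bt_min=278.0, bt_max=315.0)
    print(f"VHI = {vhi:.1f} -> {'drought' if vhi < 40 else 'no drought'}")
    ```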

  7. Computing with memory for energy-efficient robust systems

    CERN Document Server

    Paul, Somnath

    2013-01-01

    This book analyzes energy and reliability as major challenges faced by designers of computing frameworks in the nanometer technology regime. The authors describe the existing solutions to address these challenges and then reveal a new reconfigurable computing platform, which leverages high-density nanoscale memory for both data storage and computation to maximize energy-efficiency and reliability. The energy and reliability benefits of this new paradigm are illustrated and the design challenges are discussed. Various hardware and software aspects of this exciting computing paradigm are described

  8. Quantitative reliability assessment for safety critical system software

    International Nuclear Information System (INIS)

    Chung, Dae Won; Kwon, Soon Man

    2005-01-01

    An essential issue in the replacement of old analogue I&C systems with computer-based digital systems in nuclear power plants is the quantitative software reliability assessment. Software reliability models have been successfully applied to many industrial applications, but have the unfortunate drawback of requiring data from which one can formulate a model. Software developed for safety critical applications is frequently unable to produce such data, for at least two reasons. First, the software is frequently one-of-a-kind, and second, it rarely fails. Safety critical software is normally expected to pass every unit test, producing precious little failure data. The basic premise of the rare events approach is that well-tested software does not fail under normal routine and input signals, which means that failures must be triggered by unusual input data and computer states. The failure data found under reasonable testing cases and testing times for these conditions should be considered for the quantitative reliability assessment. We will present the quantitative reliability assessment methodology of safety critical software for rare failure cases in this paper
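
    For context, when well-tested software shows zero failures over its test exposure, a classical frequentist summary is the one-sided upper confidence bound on the failure rate. A minimal sketch of that standard chi-square bound, offered as related background rather than the paper's own methodology:

    ```python
    from scipy.stats import chi2

    def failure_rate_upper_bound(test_hours, failures=0, confidence=0.95):
        """One-sided upper confidence bound on a constant failure rate under
        Poisson testing: chi2.ppf(C, 2*(failures + 1)) / (2 * T).
        With zero failures this reduces to -ln(1 - C) / T."""
        return chi2.ppf(confidence, 2 * (failures + 1)) / (2.0 * test_hours)

    # E.g. 10,000 failure-free test hours -> lambda <= ~3.0e-4 per hour at 95%:
    print(f"{failure_rate_upper_bound(10_000):.2e} failures/hour")
    ```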

  9. Poverty identification for a pro-poor health insurance scheme in Tanzania: reliability and multi-level stakeholder perceptions.

    Science.gov (United States)

    Kuwawenaruwa, August; Baraka, Jitihada; Ramsey, Kate; Manzi, Fatuma; Bellows, Ben; Borghi, Josephine

    2015-12-01

    Many low income countries have policies to exempt the poor from user charges in public facilities. Reliably identifying the poor is a challenge when implementing such policies. In Tanzania, a scorecard system was established in 2011, within a programme providing free national health insurance fund (NHIF) cards, to identify poor pregnant women and their families, based on eight components. Using a series of reliability tests on a 2012 dataset of 2,621 households in two districts, this study compares household poverty levels using the scorecard, a wealth index, and monthly consumption expenditures. We compared the distributions of the three wealth measures, and the consistency of household poverty classification using cross-tabulations and the Kappa statistic. We measured errors of inclusion and exclusion of the scorecard relative to the other methods. We also gathered perceptions of the scorecard criteria through qualitative interviews with stakeholders at multiple levels of the health system. The distribution of the scorecard was less skewed than the other wealth measures and not truncated, but demonstrated clumping. There was a higher level of agreement between the scorecard and the wealth index than with consumption expenditure. The scorecard identified a similar number of poor households as the "basic needs" poverty line based on monthly consumption expenditure, with only 45% errors of inclusion. However, it failed to pick up half of those living below the "basic needs" poverty line as being poor. Stakeholders supported the inclusion of water sources, income, food security and disability measures but had reservations about other items on the scorecard. In choosing poverty identification strategies for programmes seeking to enhance health equity, it is necessary to balance community acceptability, local relevance and the need for such a strategy. It is important to ensure the strategy is efficient and less costly than alternatives in order to effectively reduce
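
    The agreement statistic used above, Cohen's kappa, corrects observed agreement for agreement expected by chance. A minimal sketch for two binary poverty classifications of the same households; the labels below are fabricated:

    ```python
    import numpy as np

    def cohens_kappa(a, b):
        """Cohen's kappa for two binary classifications of the same units."""
        a = np.asarray(a, bool)
        b = np.asarray(b, bool)
        po = np.mean(a == b)                                        # observed agreement
        pe = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())  # chance agreement
        return (po - pe) / (1 - pe)

    scorecard   = [1, 0, 1, 1, 0, 0, 1, 0]  # poor / non-poor by scorecard
    expenditure = [1, 0, 0, 1, 0, 0, 1, 1]  # poor / non-poor by consumption poverty line
    print(round(cohens_kappa(scorecard, expenditure), 2))
    ```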

  10. A Reliable Measure of Information Security Awareness and the Identification of Bias in Responses

    Directory of Open Access Journals (Sweden)

    Agata McCormac

    2017-11-01

    Full Text Available The Human Aspects of Information Security Questionnaire (HAIS-Q) is designed to measure Information Security Awareness. More specifically, the tool measures an individual’s knowledge, attitude, and self-reported behaviour relating to information security in the workplace. This paper reports on the reliability of the HAIS-Q, including test-retest reliability and internal consistency. The paper also assesses the reliability of three preliminary over-claiming items, designed specifically to complement the HAIS-Q, and identify those individuals who provide socially desirable responses. A total of 197 working Australians completed two iterations of the HAIS-Q and the over-claiming items, approximately 4 weeks apart. Results of the analysis showed that the HAIS-Q was externally reliable and internally consistent. Therefore, the HAIS-Q can be used to reliably measure information security awareness. Reliability testing on the preliminary over-claiming items was not as robust and further development is required and recommended. The implications of these findings mean that organisations can confidently use the HAIS-Q to not only measure the current state of employee information security awareness within their organisation, but they can also measure the effectiveness and impacts of training interventions, information security awareness programs and campaigns. The influence of cultural changes and the effect of security incidents can also be assessed.

  11. Impact of PECS tablet computer app on receptive identification of pictures given a verbal stimulus.

    Science.gov (United States)

    Ganz, Jennifer B; Hong, Ee Rea; Goodwyn, Fara; Kite, Elizabeth; Gilliland, Whitney

    2015-04-01

    The purpose of this brief report was to determine the effect on receptive identification of photos of a tablet computer-based augmentative and alternative communication (AAC) system with voice output. A multiple baseline single-case experimental design across vocabulary words was implemented. One participant, a preschool-aged boy with autism and little intelligible verbal language, was included in the study. Although a functional relation between the intervention and the dependent variable was not established, the intervention did appear to result in mild improvement for two of the three vocabulary words selected. The authors recommend further investigations of the collateral impacts of AAC on skills other than expressive language.

  12. Modal and Wave Load Identification by ARMA Calibration

    DEFF Research Database (Denmark)

    Jensen, Jens Kristian Jehrbo; Kirkegaard, Poul Henning; Brincker, Rune

    1992-01-01

    In this note, modal parameter and wave load identification by calibration of ARMA models are considered for a simple offshore structure. The theory of identification by ARMA calibration is introduced as an identification technique in the time domain, which can be applied to white-noise-excited structures, and is illustrated by an experimental example of a monopile model excited by random waves. The identification results show that the approach is able to give very reliable estimates of the modal parameters. Furthermore, a comparison of the identified wave load process and the calculated load process based on the Morison equation shows...
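
    The modal-parameter half of such a calibration can be sketched with the AR part alone: fit AR coefficients by least squares, convert the discrete-time poles to continuous-time eigenvalues, and read off natural frequencies and damping ratios. A generic sketch under these assumptions, not the note's exact estimator:

    ```python
    import numpy as np

    def ar_modal_parameters(y, order, dt):
        """Natural frequencies (Hz) and damping ratios from the poles of a
        least-squares-fitted AR model: y[k] = a1*y[k-1] + ... + ap*y[k-p]."""
        y = np.asarray(y, float)
        A = np.column_stack([y[order - i - 1:len(y) - i - 1] for i in range(order)])
        a, *_ = np.linalg.lstsq(A, y[order:], rcond=None)
        poles = np.roots(np.r_[1.0, -a]).astype(complex)   # z-plane poles
        s = np.log(poles) / dt                             # continuous-time eigenvalues
        keep = s.imag > 0                                  # one of each conjugate pair
        freqs = np.abs(s[keep]) / (2.0 * np.pi)
        zetas = -s[keep].real / np.abs(s[keep])
        return freqs, zetas

    # Demo: damped 1 Hz oscillation sampled at 20 Hz, recovered by an AR(2) fit.
    dt = 0.05
    t = np.arange(0, 20, dt)
    y = np.exp(-0.1 * t) * np.cos(2 * np.pi * 1.0 * t)
    print(ar_modal_parameters(y, order=2, dt=dt))
    ```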

  13. Direct unavailability computation of a maintained highly reliable system

    Czech Academy of Sciences Publication Activity Database

    Briš, R.; Byczanski, Petr

    2010-01-01

    Vol. 224, No. 3 (2010), pp. 159-170 ISSN 1748-0078 Grant - others: GA Mšk(CZ) MSM6198910007 Institutional research plan: CEZ:AV0Z30860518 Keywords: high reliability * availability * directed acyclic graph Subject RIV: BA - General Mathematics http://journals.pepublishing.com/content/rtp3178l17923m46/

  14. A reliable and real-time aggregation aware data dissemination in a chain-based wireless sensor network

    NARCIS (Netherlands)

    Taghikhaki, Zahra; Meratnia, Nirvana; Havinga, Paul J.M.

    2012-01-01

    Time-critical applications of Wireless Sensor Networks (WSNs) demand timely data delivery for fast identification of out-of-ordinary situations and fast and reliable delivery of notification and warning messages. Due to the unreliable links in WSNs, achieving real-time guarantees and providing

  15. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the validity of a reliability computation model using the concept of Bayesian hypothesis testing, by comparing model predictions and experimental observations, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. When statistical uncertainty exists in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified by treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model
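
    As a flavour of the underlying computation, the Bayes factor for a point-null failure probability against a Beta-distributed alternative has a closed form, since the binomial coefficients cancel between the two marginal likelihoods. A toy sketch; the point-null versus Beta(a, b) setup is our illustration, not the paper's exact formulation:

    ```python
    import numpy as np
    from scipy.special import betaln

    def bayes_factor(x, n, p0, a=1.0, b=1.0):
        """Bayes factor for H0: failure probability equals the model prediction p0,
        against H1: p ~ Beta(a, b), given x failures in n trials.
        BF > 1 favours accepting the model prediction."""
        log_m0 = x * np.log(p0) + (n - x) * np.log(1.0 - p0)
        log_m1 = betaln(a + x, b + n - x) - betaln(a, b)
        return np.exp(log_m0 - log_m1)

    # Invented data: 2 failures in 100 trials vs a predicted p0 of 0.03.
    print(round(bayes_factor(2, 100, 0.03), 1))   # BF > 1 favours the model
    ```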

  16. Fault-tolerant search algorithms reliable computation with unreliable information

    CERN Document Server

    Cicalese, Ferdinando

    2013-01-01

    Why a book on fault-tolerant search algorithms? Searching is one of the fundamental problems in computer science. Time and again algorithmic and combinatorial issues originally studied in the context of search find application in the most diverse areas of computer science and discrete mathematics. On the other hand, fault-tolerance is a necessary ingredient of computing. Due to their inherent complexity, information systems are naturally prone to errors, which may appear at any level - as imprecisions in the data, bugs in the software, or transient or permanent hardware failures. This book pr

  17. Reliability in Warehouse-Scale Computing: Why Low Latency Matters

    DEFF Research Database (Denmark)

    Nannarelli, Alberto

    2015-01-01

    Warehouse-sized buildings are nowadays hosting several types of large computing systems: from supercomputers to the large clusters of servers that provide the infrastructure for the cloud. Although the main target, especially for high-performance computing, is still to achieve high throughput, the limiting factor of these warehouse-scale data centers is power dissipation. Power is dissipated not only in the computation itself, but also in heat removal (fans, air conditioning, etc.) to keep the temperature of the devices within their operating ranges. The need to keep the temperature low within...

  18. Computational Identification of Novel Genes: Current and Future Perspectives.

    Science.gov (United States)

    Klasberg, Steffen; Bitard-Feildel, Tristan; Mallet, Ludovic

    2016-01-01

    While it has long been thought that all genomic novelties are derived from existing material, many genes lacking homology to known genes were found in recent genome projects. Some of these novel genes were proposed to have evolved de novo, i.e., out of noncoding sequences, whereas some have been shown to follow a duplication and divergence process. Their discovery called for an extension of the historical hypotheses about gene origination. Besides the theoretical breakthrough, increasing evidence has accumulated that novel genes play important roles in evolutionary processes, including adaptation and speciation events. Different techniques are available to identify genes and classify them as novel. Their classification as novel is usually based on their similarity to known genes, or lack thereof, detected by comparative genomics or by searches against databases. Computational approaches are further prime methods, which can be based on existing models or on leveraging biological evidence from experiments. Identification of novel genes remains, however, a challenging task. With constant software and technology updates, no gold standard, and no available benchmark, the evaluation and characterization of genomic novelty is a vibrant field. In this review, the classical and state-of-the-art tools for gene prediction are introduced. The current methods for novel gene detection are presented; the methodological strategies and their limits are discussed along with perspective approaches for further studies.

  19. Reliability in individual monitoring service.

    Science.gov (United States)

    Mod Ali, N

    2011-03-01

    As a laboratory certified to ISO 9001:2008 and accredited to ISO/IEC 17025, the Secondary Standard Dosimetry Laboratory (SSDL)-Nuclear Malaysia has incorporated an overall comprehensive system for technical and quality management in promoting a reliable individual monitoring service (IMS). Faster identification and resolution of issues regarding dosemeter preparation and issuing of reports, personnel enhancement, improved customer satisfaction and overall efficiency of laboratory activities are all results of the implementation of an effective quality system. Review of these measures and responses to observed trends provide continuous improvement of the system. Through these mechanisms, the reliability of the IMS can be assured, promoting safe behaviour at all levels of the workforce utilising ionising radiation facilities. Upgrading the reporting program to a web-based e-SSDL marks a major improvement in Nuclear Malaysia's IMS reliability as a whole. The system is a vital step in providing a user-friendly and effective occupational exposure evaluation program in the country. It provides a higher level of confidence in the results generated for occupational dose monitoring by the IMS, and thus enhances the status of the radiation protection framework of the country.

  20. FRAC (failure rate analysis code): a computer program for analysis of variance of failure rates. An application user's guide

    International Nuclear Information System (INIS)

    Martz, H.F.; Beckman, R.J.; McInteer, C.R.

    1982-03-01

    Probabilistic risk assessments (PRAs) require estimates of the failure rates of various components whose failure modes appear in the event and fault trees used to quantify accident sequences. Several reliability data bases have been designed for use in providing the necessary reliability data to be used in constructing these estimates. In the nuclear industry, the Nuclear Plant Reliability Data System (NPRDS) and the In-Plant Reliability Data System (IRPDS), among others, were designed for this purpose. An important characteristic of such data bases is the selection and identification of numerous factors used to classify each component that is reported and the subsequent failures of each component. However, the presence of such factors often complicates the analysis of reliability data in the sense that it is inappropriate to group (that is, pool) data for those combinations of factors that yield significantly different failure rate values. These types of data can be analyzed by analysis of variance. FRAC (Failure Rate Analysis Code) is a computer code that performs an analysis of variance of failure rates. In addition, FRAC provides failure rate estimates
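
    As a hedged illustration of the kind of test FRAC automates, the sketch below runs a one-way analysis of variance on made-up log failure-rate estimates grouped by one classification factor; a small p-value argues against pooling the data across that factor.

```python
from scipy.stats import f_oneway

# Hypothetical log failure-rate estimates for one component class,
# grouped by a classification factor (e.g., plant type).
group_a = [-6.1, -5.9, -6.3, -6.0]
group_b = [-5.2, -5.4, -5.1]
group_c = [-6.0, -6.2, -5.8, -6.1]

stat, p = f_oneway(group_a, group_b, group_c)
print(f"F = {stat:.2f}, p = {p:.4f}")  # small p: rates differ, do not pool
```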

  1. Rapid and accurate identification of Streptococcus equi subspecies by MALDI-TOF MS

    DEFF Research Database (Denmark)

    Kudirkiene, Egle; Welker, Martin; Knudsen, Nanna Reumert

    2015-01-01

    …phenotypic and sequence similarity between the three subspecies, their discrimination remains difficult. In this study, we aimed to design and validate a novel, Superspectra-based, MALDI-TOF MS approach for reliable, rapid and cost-effective identification of SEE and SEZ, the most frequent S. equi subspecies. …3±7.5%). This result may be attributed to the highly clonal population structure of SEE, as opposed to the diversity of SEZ seen in horses. Importantly, strains with atypical colony appearance both within SEE and SEZ did not affect correct identification of the strains by MALDI-TOF MS. Atypical colony variants are often associated with a higher persistence or virulence of S. equi; thus their correct identification using the current method strengthens its potential use in routine clinical diagnostics. In conclusion, reliable identification of S. equi subspecies was achieved by combining a MALDI-TOF MS method…

  2. Liquid identification by Hilbert spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Lyatti, M; Divin, Y; Poppe, U; Urban, K, E-mail: M.Lyatti@fz-juelich.d, E-mail: Y.Divin@fz-juelich.d [Forschungszentrum Juelich, 52425 Juelich (Germany)

    2009-11-15

    Fast and reliable identification of liquids is of great importance in, for example, security, biology and the beverage industry. An unambiguous identification of liquids can be made by electromagnetic measurements of their dielectric functions in the frequency range of their main dispersions, but this frequency range, from a few GHz to a few THz, is not covered by any conventional spectroscopy. We have developed a concept of liquid identification based on our new Hilbert spectroscopy and high-Tc Josephson junctions, which can operate at the intermediate range from microwaves to THz frequencies. A demonstration setup has been developed consisting of a polychromatic radiation source and a compact Hilbert spectrometer integrated in a Stirling cryocooler. Reflection polychromatic spectra of various bottled liquids have been measured at the spectral range of 15-300 GHz with total scanning time down to 0.2 s and identification of liquids has been demonstrated.

  3. Liquid identification by Hilbert spectroscopy

    Science.gov (United States)

    Lyatti, M.; Divin, Y.; Poppe, U.; Urban, K.

    2009-11-01

    Fast and reliable identification of liquids is of great importance in, for example, security, biology and the beverage industry. An unambiguous identification of liquids can be made by electromagnetic measurements of their dielectric functions in the frequency range of their main dispersions, but this frequency range, from a few GHz to a few THz, is not covered by any conventional spectroscopy. We have developed a concept of liquid identification based on our new Hilbert spectroscopy and high-Tc Josephson junctions, which can operate at the intermediate range from microwaves to THz frequencies. A demonstration setup has been developed consisting of a polychromatic radiation source and a compact Hilbert spectrometer integrated in a Stirling cryocooler. Reflection polychromatic spectra of various bottled liquids have been measured at the spectral range of 15-300 GHz with total scanning time down to 0.2 s and identification of liquids has been demonstrated.

  4. Liquid identification by Hilbert spectroscopy

    International Nuclear Information System (INIS)

    Lyatti, M; Divin, Y; Poppe, U; Urban, K

    2009-01-01

    Fast and reliable identification of liquids is of great importance in, for example, security, biology and the beverage industry. An unambiguous identification of liquids can be made by electromagnetic measurements of their dielectric functions in the frequency range of their main dispersions, but this frequency range, from a few GHz to a few THz, is not covered by any conventional spectroscopy. We have developed a concept of liquid identification based on our new Hilbert spectroscopy and high-Tc Josephson junctions, which can operate at the intermediate range from microwaves to THz frequencies. A demonstration setup has been developed consisting of a polychromatic radiation source and a compact Hilbert spectrometer integrated in a Stirling cryocooler. Reflection polychromatic spectra of various bottled liquids have been measured at the spectral range of 15-300 GHz with total scanning time down to 0.2 s and identification of liquids has been demonstrated.

  5. Hardware-efficient robust biometric identification from 0.58 second template and 12 features of limb (Lead I) ECG signal using logistic regression classifier.

    Science.gov (United States)

    Sahadat, Md Nazmus; Jacobs, Eddie L; Morshed, Bashir I

    2014-01-01

    The electrocardiogram (ECG), widely known as a cardiac diagnostic signal, has recently been proposed for the biometric identification of individuals; however, its reliability and reproducibility are of research interest. In this paper, we propose a template matching technique with 12 features using a logistic regression classifier that achieved high reliability and identification accuracy. Non-invasive ECG signals were captured using our custom-built ambulatory EEG/ECG embedded device (NeuroMonitor). ECG data were collected from 10 healthy subjects, aged 25-35 years, for 10 seconds per trial, with 10 trials per subject. From each trial, only 0.58 seconds of Lead I ECG data were used as the template. A hardware-efficient fiducial-point detection technique was implemented for feature extraction. To obtain repeated random sub-sampling validation, data were randomly separated into training and testing sets at a ratio of 80:20. Test data were used to find the classification accuracy. ECG template data with 12 extracted features provided the best performance in terms of accuracy (up to 100%) and processing complexity (computation time of 1.2 ms). This work shows that a single-limb (Lead I) ECG can robustly identify an individual quickly and reliably with minimal contact and data processing using the proposed algorithm.
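
    A rough sketch of this classification setup with synthetic features (not the NeuroMonitor data): a logistic regression classifier over 12-dimensional feature vectors, validated with an 80:20 random split as described above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Hypothetical data: 10 subjects x 10 trials, 12 fiducial features per trial.
X = rng.normal(size=(100, 12)) + np.repeat(rng.normal(size=(10, 12)), 10, axis=0)
y = np.repeat(np.arange(10), 10)       # subject labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("identification accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```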

  6. Cervical vertebral maturation method and mandibular growth peak: a longitudinal study of diagnostic reliability.

    Science.gov (United States)

    Perinetti, Giuseppe; Primozic, Jasmina; Sharma, Bhavna; Cioffi, Iacopo; Contardo, Luca

    2018-03-28

    The capability of the cervical vertebral maturation (CVM) method to identify the mandibular growth peak on an individual basis remains undetermined. The diagnostic reliability of the six-stage CVM method in the identification of the mandibular growth peak was thus investigated. From the files of the Oregon and Burlington Growth Studies (data obtained between the early 1950s and the middle 1970s), 50 subjects (26 females, 24 males) with at least seven annual lateral cephalograms taken from 9 to 16 years were identified. Cervical vertebral maturation was assessed according to the CVM code staging system, and mandibular growth was defined as annual increments in the Co-Gn distance. A diagnostic reliability analysis was carried out to establish the capability of the circumpubertal CVM stages 2, 3, and 4 to identify the imminent mandibular growth peak. Variable durations of each of the CVM stages 2, 3, and 4 were seen. The overall diagnostic accuracy values for the CVM stages 2, 3, and 4 were 0.70, 0.76, and 0.77, respectively. These low values appeared to be due to false positive cases, possibly reflecting secular trends in conjunction with the use of a discrete staging system. In most of the Burlington Growth Study sample, the lateral head film at age 15 was missing. None of the CVM stages 2, 3, and 4 reached a satisfactory diagnostic reliability in the identification of the imminent mandibular growth peak.

  7. IRT-Estimated Reliability for Tests Containing Mixed Item Formats

    Science.gov (United States)

    Shu, Lianghua; Schwarz, Richard D.

    2014-01-01

    As a global measure of precision, item response theory (IRT) estimated reliability is derived for four coefficients (Cronbach's α, Feldt-Raju, stratified α, and marginal reliability). Models with different underlying assumptions concerning test-part similarity are discussed. A detailed computational example is presented for the targeted…
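
    For background on the first coefficient named above, a minimal sketch (my illustration, not the article's IRT derivation) of classical Cronbach's α from a persons-by-items score matrix:

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_persons, n_items) array of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var_sum = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var_sum / total_var)

scores = np.array([[3, 4, 3], [2, 2, 1], [4, 5, 5], [3, 3, 4]])  # toy data
print(cronbach_alpha(scores))
```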

  8. Simplified Multimodal Biometric Identification

    Directory of Open Access Journals (Sweden)

    Abhijit Shete

    2014-03-01

    Multibiometric systems are expected to be more reliable than unimodal biometric systems for personal identification due to the presence of multiple, fairly independent pieces of evidence, e.g. the Unique Identification Project "Aadhaar" of the Government of India. In this paper, we present a novel wavelet-based technique to perform fusion at the feature level and the score level by considering two biometric modalities, face and fingerprint. The results indicate that the proposed technique can lead to substantial improvement in multimodal matching performance. The proposed technique is simple because it requires no preprocessing of the raw biometric traits and no feature or score normalization.

  9. Reliability analysis of grid connected small wind turbine power electronics

    International Nuclear Information System (INIS)

    Arifujjaman, Md.; Iqbal, M.T.; Quaicoe, J.E.

    2009-01-01

    Grid connection of small permanent magnet generator (PMG) based wind turbines requires a power conditioning system comprising a bridge rectifier, a dc-dc converter and a grid-tie inverter. This work presents a reliability analysis and an identification of the least reliable component of the power conditioning system in such grid connection arrangements. Reliability of the configuration is analyzed for the worst-case scenario of maximum conversion losses at a particular wind speed. The analysis reveals that the reliability of the power conditioning system of such PMG-based wind turbines is fairly low, falling to 84% of its initial value within one year. The investigation is further enhanced by identifying the least reliable component within the power conditioning system: the inverter has the dominant effect on the system reliability, while the dc-dc converter has the least significant effect. The reliability analysis demonstrates that a permanent magnet generator based wind energy conversion system is not the best option from the point of view of power conditioning system reliability. The analysis also reveals that new research is required to determine a robust power electronics configuration for small wind turbine conversion systems.
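
    Because the rectifier, dc-dc converter and inverter form a series chain, a back-of-the-envelope version of such an analysis can be sketched as below; the constant failure rates are invented placeholders, not the paper's data.

```python
import numpy as np

# Assumed constant failure rates (failures per hour), so R(t) = exp(-lambda*t).
failure_rates = {
    "rectifier": 2e-6,
    "dc_dc_converter": 1e-6,
    "inverter": 1.5e-5,        # typically the dominant contributor
}
t = 8760.0                     # one year of operation, in hours
R = {name: np.exp(-lam * t) for name, lam in failure_rates.items()}
R_system = np.prod(list(R.values()))   # series system: every stage must survive
print(R, f"system reliability after one year: {R_system:.3f}")
```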

  10. Parallelized Genetic Identification of the Thermal-Electrochemical Model for Lithium-Ion Battery

    Directory of Open Access Journals (Sweden)

    Liqiang Zhang

    2013-01-01

    The parameters of a well-predicting model can be used as health characteristics for a Lithium-ion battery. This article reports a parallelized parameter identification of the thermal-electrochemical model, which significantly reduces the time consumption of parameter identification. Since the P2D model has the most predictability, it is chosen for further research and extended to the thermal-electrochemical model by coupling the thermal effect and temperature-dependent parameters. A Genetic Algorithm is then used for parameter identification, but it takes too much time because of the long simulation time of the model. For this reason, a computer cluster was built from surplus computing resources in our laboratory based on the Parallel Computing Toolbox and Distributed Computing Server in MATLAB. The performance of two parallelized methods, namely Single Program Multiple Data (SPMD) and the parallel FOR loop (PARFOR), is investigated, and a parallelized GA identification is then proposed. With this method, model simulations run in parallel and the parameter identification can be sped up more than a dozen times, and the identification result is better than that from serial GA. This conclusion is validated by model parameter identification of a real LiFePO4 battery.
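
    The article parallelizes in MATLAB (SPMD/PARFOR); as a hedged Python analogue, the expensive and mutually independent fitness evaluations of a GA population can be distributed across processes like this, with the fitness function standing in for the coupled battery-model simulation:

```python
from multiprocessing import Pool
import numpy as np

def fitness(params):
    # Placeholder for an expensive thermal-electrochemical simulation followed
    # by comparison against measured voltage/temperature curves.
    return float(np.sum((params - 0.5) ** 2))

if __name__ == "__main__":
    population = np.random.rand(64, 8)           # 64 candidates, 8 parameters
    with Pool() as pool:                         # model runs are independent,
        scores = pool.map(fitness, population)   # so they parallelize cleanly
    best = population[int(np.argmin(scores))]
    print("best candidate:", best)
```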

  11. Fundamentals of reliability engineering applications in multistage interconnection networks

    CERN Document Server

    Gunawan, Indra

    2014-01-01

    This book presents the fundamentals of reliability engineering with applications to evaluating the reliability of multistage interconnection networks. The first part of the book introduces the concept of reliability engineering, elements of probability theory, probability distributions, availability and data analysis. The second part of the book provides an overview of parallel/distributed computing, network design considerations, and more. The book covers comprehensive reliability engineering methods and their practical aspects in interconnection network systems. Students, engineers, researchers and managers will find this book a valuable reference source.

  12. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 1: HARP introduction and user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of reliable fault-tolerant system architectures; it is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.

  13. Identification of sex using lateral cephalogram: Role of cephalofacial parameters

    Directory of Open Access Journals (Sweden)

    Almas Binnal

    2012-01-01

    Introduction: Recognition of sex is an important aspect of the identification of an individual. Apart from the pelvis, the skull exhibits the highest sexual dimorphism in the human body. Lateral cephalograms are an invaluable tool in the identification of sex as they reveal architectural and morphological details of the skull on a single radiograph. The equipment required for lateral cephalometry is readily available, and the technique is cost-effective, easy to perform, offers quick results, is reproducible, and can be implemented without any special training for the forensic examiner. The present study was undertaken to evaluate the role of lateral cephalograms and nine cephalometric variables in the identification of sex, and also to derive a discriminant function equation for the identification of sex. Materials and methods: A total of 100 lateral cephalograms were taken of 50 male and 50 female subjects aged between 25 and 54 years belonging to a South Indian population. The nine derived cephalometric parameters were used to arrive at a discriminant function equation, which was further assessed for its reliability among the study subjects. Results: Among the nine cephalometric parameters used, seven were reliable in the identification of sex. The derived discriminant function equation accurately identified 88% of the male study subjects as males and 84% of the female subjects as females. Conclusion: Lateral cephalograms and the nine cephalometric variables employed in the study are simple and reliable tools of sexual discrimination. The derived discriminant function equation can be used to accurately identify the sex of an individual belonging to a South Indian population.
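
    A hypothetical sketch of how such a discriminant function can be derived, with random placeholder measurements in place of the study's cephalometric data:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
X_m = rng.normal(loc=1.0, size=(50, 9))   # 9 cephalometric parameters, males
X_f = rng.normal(loc=0.0, size=(50, 9))   # the same parameters, females
X = np.vstack([X_m, X_f])
y = np.array([1] * 50 + [0] * 50)         # 1 = male, 0 = female

lda = LinearDiscriminantAnalysis().fit(X, y)
print("discriminant coefficients:", lda.coef_.ravel())
print("classification accuracy:", lda.score(X, y))  # cf. the 88%/84% above
```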

  14. Optimal identification of semi-rigid domains in macromolecules from molecular dynamics simulation.

    Directory of Open Access Journals (Sweden)

    Stefan Bernhard

    Biological function relies on the fact that biomolecules can switch between different conformations and aggregation states. Such transitions involve a rearrangement of parts of the biomolecules involved that act as dynamic domains. The reliable identification of such domains is thus a key problem in biophysics. In this work we present a method to identify semi-rigid domains based on dynamical data that can be obtained from molecular dynamics simulations or experiments. To this end, the average inter-atomic distance deviations are computed. The resulting matrix is then clustered by solving a constrained quadratic optimization problem. The reliability and performance of the method are demonstrated for two artificial peptides. Furthermore, we correlate the mechanical properties with biological malfunction in three variants of the amyloidogenic transthyretin protein, where the method reveals that a pathological mutation destabilizes the natural dimer structure of the protein. Finally, the method is used to identify functional domains of the GroEL-GroES chaperone, thus illustrating the efficiency of the method for large biomolecular machines.
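
    A minimal sketch of the pipeline described above, with a toy trajectory in place of MD data and hierarchical clustering standing in for the paper's constrained quadratic program:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

traj = np.random.rand(200, 50, 3)   # (frames, atoms, xyz) -- toy trajectory
# Pairwise inter-atomic distances per frame, then their deviation over time:
d = np.linalg.norm(traj[:, :, None, :] - traj[:, None, :, :], axis=-1)
dev = d.std(axis=0)                 # small within a semi-rigid domain

# Cluster the deviation matrix into a fixed number of candidate domains.
Z = linkage(squareform(dev, checks=False), method="average")
labels = fcluster(Z, t=3, criterion="maxclust")
print(labels)
```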

  15. Reliability of operating WWER monitoring systems

    International Nuclear Information System (INIS)

    Yastrebenetsky, M.A.; Goldrin, V.M.; Garagulya, A.V.

    1996-01-01

    The elaboration of reliability measures for WWER monitoring systems is described in this paper. The evaluation is based on statistical data about failures that have been collected at the operating Ukrainian nuclear power plants (NPPs). The main attention is devoted to the radiation safety monitoring system and the unit information computer system, which collects information from the different sensors and systems of the unit. Reliability measures were used to solve problems connected with the life extension of the instruments, and for other purposes. (author). 6 refs, 6 figs

  16. Reliability of operating WWER monitoring systems

    Energy Technology Data Exchange (ETDEWEB)

    Yastrebenetsky, M A; Goldrin, V M; Garagulya, A V [Ukrainian State Scientific Technical Center of Nuclear and Radiation Safety, Kharkov (Ukraine). Instrumentation and Control Systems Dept.

    1997-12-31

    The elaboration of reliability measures for WWER monitoring systems is described in this paper. The evaluation is based on statistical data about failures that have been collected at the operating Ukrainian nuclear power plants (NPPs). The main attention is devoted to the radiation safety monitoring system and the unit information computer system, which collects information from the different sensors and systems of the unit. Reliability measures were used to solve problems connected with the life extension of the instruments, and for other purposes. (author). 6 refs, 6 figs.

  17. Reliability-Based Topology Optimization Using Stochastic Response Surface Method with Sparse Grid Design

    Directory of Open Access Journals (Sweden)

    Qinghai Zhao

    2015-01-01

    A mathematical framework is developed which integrates the reliability concept into topology optimization to solve reliability-based topology optimization (RBTO) problems under uncertainty. Two typical methodologies have been presented and implemented: the performance measure approach (PMA) and the sequential optimization and reliability assessment (SORA). To enhance the computational efficiency of the reliability analysis, the stochastic response surface method (SRSM) is applied to approximate the true limit state function with respect to the normalized random variables, combined with a reasonable design of experiments generated by sparse grid design (SGD), which has proven to be an effective discretization technique. Uncertainties such as material properties and external loads are considered in three numerical examples: a cantilever beam, a loaded knee structure, and a heat conduction problem. Monte-Carlo simulations are also performed to verify the accuracy of the failure probabilities computed by the proposed approach. Based on the results, it is demonstrated that applying SRSM with SGD produces an efficient reliability analysis in RBTO and enables a more reliable design than that obtained by deterministic topology optimization (DTO). It is also found that, at identical accuracy, SORA is superior to PMA in terms of computational efficiency.
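
    The Monte-Carlo verification step can be sketched as follows for a generic limit state g = strength − load; the input distributions are assumptions for illustration, not those of the paper's examples.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
load = rng.normal(100.0, 15.0, n)                    # assumed N(100, 15^2)
strength = rng.lognormal(np.log(150.0), 0.1, n)      # assumed lognormal capacity

g = strength - load                                  # failure when g <= 0
pf = np.mean(g <= 0.0)
print(f"Monte-Carlo failure probability: {pf:.2e}")
```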

  18. Reliability Analysis Of Fire System On The Industry Facility By Use Fameca Method

    International Nuclear Information System (INIS)

    Sony T, D.T.; Situmorang, Johnny; Ismu W, Puradwi; Demon H; Mulyanto, Dwijo; Kusmono, Slamet; Santa, Sigit Asmara

    2000-01-01

    FAMECA is one of the analysis methods used to determine system reliability in industrial facilities. The analysis follows a procedure consisting of the identification of component functions and the determination of failure modes, severity levels, and the effects of each failure. The reliability value is determined by combining three factors: severity level, component failure value, and component criticality. A reliability analysis has been performed for a fire system in an industrial facility using the FAMECA method. The critical components identified are the pump, air release valve, check valve, manual test valve, isolation valve, control system, etc.

  19. The rating reliability calculator

    Directory of Open Access Journals (Sweden)

    Solomon David J

    2004-04-01

    Background: Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods: The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open-source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, which the program uploads to the server for calculating the reliability and other statistics describing the ratings. Results: When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion: This simple web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to obtain complete rating data. I would welcome other researchers revising and enhancing the program.
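
    For reference, the Spearman-Brown prophecy formula mentioned in the results predicts the reliability r_k of an average of k ratings from the single-rating reliability r:

```latex
r_k = \frac{k\,r}{1 + (k - 1)\,r}
```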

  20. An automated method for estimating reliability of grid systems using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Emmanuel Ramirez-Marquez, Jose

    2012-01-01

    Grid computing has become relevant due to its applications in large-scale resource sharing, wide-area information transfer, and multi-institutional collaboration. In general, in grid computing a service requests the use of a set of resources, available in a grid, to complete certain tasks. Although analysis tools and techniques for these types of systems have been studied, grid reliability analysis is generally computation-intensive due to the complexity of the system. Moreover, conventional reliability models make some common assumptions that cannot be applied to grid systems. Therefore, new analytical methods are needed for effective and accurate assessment of grid reliability. This study presents a new method for estimating grid service reliability which, unlike previous studies, does not require prior knowledge about the grid system structure. Moreover, the proposed method does not rely on any assumptions about link and node failure rates. The approach is based on a data-mining algorithm, K2, that discovers the grid system structure from raw historical system data and finds minimum resource spanning trees (MRSTs) within the grid; Bayesian networks (BNs) are then used to model the MRSTs and estimate grid service reliability.

  1. Component reliability for electronic systems

    CERN Document Server

    Bajenescu, Titu-Marius I

    2010-01-01

    The main reason for the premature breakdown of today's electronic products (computers, cars, tools, appliances, etc.) is the failure of the components used to build these products. Today professionals are looking for effective ways to minimize the degradation of electronic components to help ensure longer-lasting, more technically sound products and systems. This practical book offers engineers specific guidance on how to design more reliable components and build more reliable electronic systems. Professionals learn how to optimize a virtual component prototype, accurately monitor product reliability during the entire production process, and add the burn-in and selection procedures that are the most appropriate for the intended applications. Moreover, the book helps system designers ensure that all components are correctly applied, margins are adequate, wear-out failure modes are prevented during the expected duration of life, and system interfaces cannot lead to failure.

  2. Stochastic reliability and maintenance modeling essays in honor of Professor Shunji Osaki on his 70th birthday

    CERN Document Server

    Nakagawa, Toshio

    2013-01-01

    In honor of the work of Professor Shunji Osaki, Stochastic Reliability and Maintenance Modeling provides a comprehensive study of the legacy of, and ongoing research in, stochastic reliability and maintenance modeling. Covering associated application areas such as dependable computing, performance evaluation, software engineering and communication engineering, distinguished researchers review and build on the contributions made over the last four decades by Professor Shunji Osaki. Fundamental yet significant research results are presented and discussed clearly alongside new ideas and topics on stochastic reliability and maintenance modeling to inspire future research. Across 15 chapters readers gain the knowledge and understanding to apply reliability and maintenance theory to computer and communication systems. Stochastic Reliability and Maintenance Modeling is ideal for graduate students and researchers in reliability engineering, and workers, managers and engineers engaged in computer, maintenance and management wo...

  3. Accurate Identification of Cancerlectins through Hybrid Machine Learning Technology.

    Science.gov (United States)

    Zhang, Jieru; Ju, Ying; Lu, Huijuan; Xuan, Ping; Zou, Quan

    2016-01-01

    Cancerlectins are cancer-related proteins that function as lectins. They have been identified through computational identification techniques, but these techniques have sometimes failed to identify proteins because of sequence diversity among the cancerlectins. Advanced machine learning identification methods, such as support vector machine and basic sequence features (n-gram), have also been used to identify cancerlectins. In this study, various protein fingerprint features and advanced classifiers, including ensemble learning techniques, were utilized to identify this group of proteins. We improved the prediction accuracy of the original feature extraction methods and classification algorithms by more than 10% on average. Our work provides a basis for the computational identification of cancerlectins and reveals the power of hybrid machine learning techniques in computational proteomics.

  4. Signal trend identification with fuzzy methods

    International Nuclear Information System (INIS)

    Reifman, J.; Tsoukalas, L. H.; Wang, X.; Wei, T. Y. C.

    1999-01-01

    A fuzzy-logic-based methodology for on-line signal trend identification is introduced. Although signal trend identification is complicated by the presence of noise, fuzzy logic can help capture important features of on-line signals and classify incoming power plant signals into increasing, decreasing and steady-state trend categories. In order to verify the methodology, a code named PROTREN is developed and tested using plant data. The results indicate that the code is capable of detecting transients accurately, identifying trends reliably, and not misinterpreting a steady-state signal as a transient one
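
    A toy sketch of fuzzy trend classification in the spirit of the abstract; the membership functions and band width are my own illustration, not PROTREN's:

```python
import numpy as np

def memberships(slope, band=0.05):
    dec = max(0.0, min(1.0, -slope / band))        # ramp for negative slopes
    inc = max(0.0, min(1.0, slope / band))         # ramp for positive slopes
    steady = max(0.0, 1.0 - abs(slope) / band)     # triangle around zero
    return {"decreasing": dec, "steady": steady, "increasing": inc}

t = np.linspace(0.0, 10.0, 200)
signal = 0.02 * t + 0.01 * np.random.default_rng(3).normal(size=t.size)
slope = np.polyfit(t, signal, 1)[0]                # least-squares trend estimate
mu = memberships(slope)
print(max(mu, key=mu.get), mu)                     # most plausible trend category
```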

  5. Survey of industry methods for producing highly reliable software

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Persons, W.L.

    1994-11-01

    The Nuclear Reactor Regulation Office of the US Nuclear Regulatory Commission is charged with assessing the safety of new instrument and control designs for nuclear power plants which may use computer-based reactor protection systems. Lawrence Livermore National Laboratory has evaluated the latest techniques in software reliability for measurement, estimation, error detection, and prediction that can be used during the software life cycle as a means of risk assessment for reactor protection systems. One aspect of this task has been a survey of the software industry to collect information to help identify the design factors used to improve the reliability and safety of software. The intent was to discover what practices really work in industry and what design factors are used by industry to achieve highly reliable software. The results of the survey are documented in this report. Three companies participated in the survey: Computer Sciences Corporation, International Business Machines (Federal Systems Company), and TRW. Discussions were also held with NASA Software Engineering Lab/University of Maryland/CSC, and the AIAA Software Reliability Project

  6. Evaluation of the reliability and accuracy of using cone-beam computed tomography for diagnosing periapical cysts from granulomas.

    Science.gov (United States)

    Guo, Jing; Simon, James H; Sedghizadeh, Parish; Soliman, Osman N; Chapman, Travis; Enciso, Reyes

    2013-12-01

    The purpose of this study was to evaluate the reliability and accuracy of cone-beam computed tomographic (CBCT) imaging against the histopathologic diagnosis for the differential diagnosis of periapical cysts (cavitated lesions) from (solid) granulomas. Thirty-six periapical lesions were imaged using CBCT scans. Apicoectomy surgeries were conducted for histopathological examination. Evaluator 1 examined each CBCT scan for the presence of 6 radiologic characteristics of a cyst (i.e., location, periphery, shape, internal structure, effects on surrounding structure, and perforation of the cortical plate). Not every cyst shows all radiologic features (e.g., not all cysts perforate the cortical plate). For the purpose of finding the minimum number of diagnostic criteria that must be present in a scan to diagnose a lesion as a cyst, we conducted 6 receiver operating characteristic curve analyses comparing CBCT diagnoses with the histopathologic diagnosis. Two other independent evaluators examined the CBCT lesions. Statistical tests were conducted to examine the accuracy, inter-rater reliability, and intra-rater reliability of CBCT images. Findings showed that a score of ≥4 positive findings was the optimal scoring system. The accuracies of the differential diagnoses of the 3 evaluators were moderate (area under the curve = 0.76, 0.70, and 0.69 for evaluators 1, 2, and 3, respectively). The inter-rater agreement of the 3 evaluators was excellent (α = 0.87). The intra-rater agreement was good to excellent (κ = 0.71, 0.76, and 0.77). CBCT images can provide a moderately accurate differential diagnosis between cysts and granulomas. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  7. Nuclear power plant transient identification using a neuro-fuzzy inference system

    International Nuclear Information System (INIS)

    Mol, Antonio Carlos de Abreu; Oliveira, Mauro Vitor de; Santos, Isaac Jose Antonio Luchetti dos; Carvalho, Paulo Victor Rodrigues de; Grecco, Claudio Henrique dos Santos; Auguto, Silas Cordeiro

    2005-01-01

    Transient identification in a Nuclear Power Plant (NPP) is often a very hard task and may involve a great amount of human cognition. The early identification of unexpected departures from steady-state behavior is an essential step for the operation, control and accident management in nuclear power plants. The basis for identifying a change in the system is that different system faults and anomalies lead to different patterns of evolution of the involved process variables. During an abnormal event, the operator must monitor a great amount of information from the instruments that represents a specific type of event. In this work, an approach for the identification of transients is presented, aiming at helping the operator to decide on the procedure to be followed in accident/transient situations at nuclear power plants. The diagnostic strategy is based on the hierarchical use of artificial neural networks (ANN) for a first-level transient diagnosis. After the ANN has made a preliminary identification of the transient type, a fuzzy-logic system analyzes the results and emits a degree of reliability for it. In order to validate the method, a nuclear power plant transient identification problem comprising postulated accidents is proposed. Noisy data were used to evaluate the method's robustness. The results obtained reveal the ability of the method to deal with dynamic identification of transients and to report its degree of reliability. (author)

  8. [Animal experimentation, computer simulation and surgical research].

    Science.gov (United States)

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During the research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  9. Efficient Identification Using a Prime-Feature-Based Technique

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar; Haq, Shaiq A.; Valente, Andrea

    2011-01-01

    Identification of authorized train drivers through biometrics is a growing area of interest in locomotive radio remote control systems. The existing technique of password authentication is not very reliable, and potentially unauthorized personnel may also operate the system on behalf of the operator. A fingerprint identification system, implemented on PC/104-based real-time systems, can accurately identify the operator. Traditionally, the uniqueness of a fingerprint is determined by the overall pattern of ridges and valleys as well as the local ridge anomalies, e.g. a ridge bifurcation or a ridge ending, which are called minutiae points. Designing a reliable automatic fingerprint matching algorithm for a minimal platform is quite challenging. In real-time systems, the efficiency of the matching algorithm is of utmost importance. To achieve this goal, a prime-feature-based indexing algorithm is proposed…

  10. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Science.gov (United States)

    2010-10-01

    ...) Restricted rights in computer software, limited rights in technical data, or government purpose license... necessary to perfect a license or licenses in the deliverable software or documentation of the appropriate... the license rights obtained. (e) Identification and delivery of computer software and computer...

  11. Spatiotemporal Dynamics and Reliable Computations in Recurrent Spiking Neural Networks

    Science.gov (United States)

    Pyle, Ryan; Rosenbaum, Robert

    2017-01-01

    Randomly connected networks of excitatory and inhibitory spiking neurons provide a parsimonious model of neural variability, but are notoriously unreliable for performing computations. We show that this difficulty is overcome by incorporating the well-documented dependence of connection probability on distance. Spatially extended spiking networks exhibit symmetry-breaking bifurcations and generate spatiotemporal patterns that can be trained to perform dynamical computations under a reservoir computing framework.
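
    The key ingredient named above, connection probability that decays with distance, can be sketched as follows; the Gaussian kernel, its width and the peak probability are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
pos = rng.uniform(0.0, 1.0, size=(n, 2))      # neurons on a unit square

# Pairwise distances with periodic boundary conditions.
diff = np.abs(pos[:, None, :] - pos[None, :, :])
diff = np.minimum(diff, 1.0 - diff)
dist = np.linalg.norm(diff, axis=-1)

p = 0.5 * np.exp(-(dist / 0.1) ** 2)          # probability falls off with distance
adj = rng.random((n, n)) < p                  # sample the random connectivity
np.fill_diagonal(adj, False)                  # no self-connections
print("mean out-degree:", adj.sum(axis=1).mean())
```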

  12. Spatiotemporal Dynamics and Reliable Computations in Recurrent Spiking Neural Networks.

    Science.gov (United States)

    Pyle, Ryan; Rosenbaum, Robert

    2017-01-06

    Randomly connected networks of excitatory and inhibitory spiking neurons provide a parsimonious model of neural variability, but are notoriously unreliable for performing computations. We show that this difficulty is overcome by incorporating the well-documented dependence of connection probability on distance. Spatially extended spiking networks exhibit symmetry-breaking bifurcations and generate spatiotemporal patterns that can be trained to perform dynamical computations under a reservoir computing framework.

  13. Reliability in the design phase

    International Nuclear Information System (INIS)

    Siahpush, A.S.; Hills, S.W.; Pham, H.; Majumdar, D.

    1991-12-01

    A study was performed to determine the common methods and tools that are available to calculate or predict a system's reliability. A literature review and a software survey are included. The desired product of this developmental work is a tool for the system designer to use in the early design phase so that the final design will achieve the desired system reliability without lengthy testing and rework. Three computer programs were written which provide a first attempt at fulfilling this need. The programs are described and a case study is presented for each one. This is a continuing effort which will be furthered in FY-1992. 10 refs

  14. Methods to compute reliabilities for genomic predictions of feed intake

    Science.gov (United States)

    For new traits without historical reference data, cross-validation is often the preferred method to validate reliability (REL). Time truncation is less useful because few animals gain substantial REL after the truncation point. Accurate cross-validation requires separating genomic gain from pedigree...

  15. Three dimensional subsurface elemental identification of minerals using confocal micro-X-ray fluorescence and micro-X-ray computed tomography

    International Nuclear Information System (INIS)

    Cordes, Nikolaus L.; Seshadri, Srivatsan; Havrilla, George J.; Yuan, Xiaoli; Feser, Michael; Patterson, Brian M.

    2015-01-01

    Current non-destructive elemental characterization methods, such as scanning electron microscopy-based energy dispersive spectroscopy (SEM–EDS) and micro-X-ray fluorescence spectroscopy (MXRF), are limited to either elemental identification at the surface (SEM–EDS) or suffer from an inability to discriminate between surface or depth information (MXRF). Thus, a non-destructive elemental characterization of individual embedded particles beneath the surface is impossible with either of these techniques. This limitation can be overcome by using laboratory-based 3D confocal micro-X-ray fluorescence spectroscopy (confocal MXRF). This technique utilizes focusing optics on the X-ray source and detector which allows for spatial discrimination in all three dimensions. However, the voxel-by-voxel serial acquisition of a 3D elemental scan can be very time-intensive (~ 1 to 4 weeks) if it is necessary to locate individual embedded particles of interest. As an example, if each point takes a 5 s measurement time, a small volume of 50 × 50 × 50 pixels leads to an acquisition time of approximately 174 h, not including sample stage movement time. Initially screening the samples for particles of interest using micro-X-ray computed tomography (micro-CT) can significantly reduce the time required to spatially locate these particles. Once located, these individual particles can be elementally characterized with confocal MXRF. Herein, we report the elemental identification of high atomic number surface and subsurface particles embedded in a mineralogical matrix by coupling micro-CT and confocal MXRF. Synergistically, these two X-ray based techniques first rapidly locate and then elementally identify individual subsurface particles. - Highlights: • Coupling of confocal X-ray fluorescence spectroscopy and X-ray computed tomography • Qualitative elemental identification of surface and subsurface mineral particles • Non-destructive particle size measurements • Utilization of

  16. Three dimensional subsurface elemental identification of minerals using confocal micro-X-ray fluorescence and micro-X-ray computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Cordes, Nikolaus L., E-mail: ncordes@lanl.gov [Polymers and Coatings Group, Material Science and Technology Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Seshadri, Srivatsan, E-mail: srivatsan.seshadri@zeiss.com [Carl Zeiss X-ray Microscopy, Inc., Pleasanton, CA 94588 (United States); Havrilla, George J. [Chemical Diagnostics and Engineering, Chemistry Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Yuan, Xiaoli [Julius Kruttschnitt Mineral Research Centre, University of Queensland, Indooroopilly, Brisbane, QLD 4068 (Australia); Feser, Michael [Carl Zeiss X-ray Microscopy, Inc., Pleasanton, CA 94588 (United States); Patterson, Brian M. [Polymers and Coatings Group, Material Science and Technology Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States)

    2015-01-01

    Current non-destructive elemental characterization methods, such as scanning electron microscopy-based energy dispersive spectroscopy (SEM–EDS) and micro-X-ray fluorescence spectroscopy (MXRF), are limited to either elemental identification at the surface (SEM–EDS) or suffer from an inability to discriminate between surface or depth information (MXRF). Thus, a non-destructive elemental characterization of individual embedded particles beneath the surface is impossible with either of these techniques. This limitation can be overcome by using laboratory-based 3D confocal micro-X-ray fluorescence spectroscopy (confocal MXRF). This technique utilizes focusing optics on the X-ray source and detector which allows for spatial discrimination in all three dimensions. However, the voxel-by-voxel serial acquisition of a 3D elemental scan can be very time-intensive (~ 1 to 4 weeks) if it is necessary to locate individual embedded particles of interest. As an example, if each point takes a 5 s measurement time, a small volume of 50 × 50 × 50 pixels leads to an acquisition time of approximately 174 h, not including sample stage movement time. Initially screening the samples for particles of interest using micro-X-ray computed tomography (micro-CT) can significantly reduce the time required to spatially locate these particles. Once located, these individual particles can be elementally characterized with confocal MXRF. Herein, we report the elemental identification of high atomic number surface and subsurface particles embedded in a mineralogical matrix by coupling micro-CT and confocal MXRF. Synergistically, these two X-ray based techniques first rapidly locate and then elementally identify individual subsurface particles. - Highlights: • Coupling of confocal X-ray fluorescence spectroscopy and X-ray computed tomography • Qualitative elemental identification of surface and subsurface mineral particles • Non-destructive particle size measurements • Utilization of

  17. A field study of the accuracy and reliability of a biometric iris recognition system.

    Science.gov (United States)

    Latman, Neal S; Herb, Emily

    2013-06-01

    The iris of the eye appears to satisfy the criteria for a good anatomical characteristic for use in a biometric system. The purpose of this study was to evaluate a biometric iris recognition system: Mobile-Eyes™. The enrollment, verification, and identification applications were evaluated in a field study for accuracy and reliability using both irises of 277 subjects. Independent variables included a wide range of subject demographics, ambient light, and ambient temperature. A sub-set of 35 subjects had alcohol-induced nystagmus. There were 2710 identification and verification attempts, which resulted in 1,501,340 and 5540 iris comparisons respectively. In this study, the system successfully enrolled all subjects on the first attempt. All 277 subjects were successfully verified and identified on the first day of enrollment. None of the current or prior eye conditions prevented enrollment, verification, or identification. All 35 subjects with alcohol-induced nystagmus were successfully verified and identified. There were no false verifications or false identifications. Two conditions were identified that potentially could circumvent the use of iris recognitions systems in general. The Mobile-Eyes™ iris recognition system exhibited accurate and reliable enrollment, verification, and identification applications in this study. It may have special applications in subjects with nystagmus. Copyright © 2012 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  18. Software For Computer-Security Audits

    Science.gov (United States)

    Arndt, Kate; Lonsford, Emily

    1994-01-01

    Information relevant to potential breaches of security gathered efficiently. Automated Auditing Tools for VAX/VMS program includes following automated software tools performing noted tasks: Privileged ID Identification, program identifies users and their privileges to circumvent existing computer security measures; Critical File Protection, critical files not properly protected identified; Inactive ID Identification, identifications of users no longer in use found; Password Lifetime Review, maximum lifetimes of passwords of all identifications determined; and Password Length Review, minimum allowed length of passwords of all identifications determined. Written in DEC VAX DCL language.

  19. Cost Optimal System Identification Experiment Design

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning

    A structural system identification experiment design method is formulated in the light of decision theory, structural reliability theory and optimization theory. The experiment design is based on a preposterior analysis, well known from classical decision theory, i.e. the decisions concerning the experiment design are not based on obtained experimental data. Instead the decisions are based on the expected experimental data assumed to be obtained from the measurements, estimated based on prior information and engineering judgement. The design method provides a system identification experiment design reflecting the cost of the experiment and the value of the obtained additional information. An example concerning the design of an experiment for parametric identification of a single-degree-of-freedom structural system shows the applicability of the experiment design method.

  20. Attendance fingerprint identification system using arduino and single board computer

    Science.gov (United States)

    Muchtar, M. A.; Seniman; Arisandi, D.; Hasanah, S.

    2018-03-01

    A fingerprint is one of the most unique parts of the human body: it distinguishes one person from others and is easily accessed. This uniqueness is exploited by technology that can automatically identify or recognize a person, called a fingerprint sensor. Yet, an existing fingerprint sensor can only perform fingerprint identification on one machine. For this reason, a method is needed to recognize each user across different fingerprint sensors. The purpose of this research is to build a fingerprint sensor system in which fingerprint data management is centralized, so that identification can be done at each fingerprint sensor. The result of this research shows that by using Arduino and Raspberry Pi, data processing can be centralized so that fingerprint identification can be done at each fingerprint sensor with a 98.5% success rate for centralized server recording.

  1. A Fast Optimization Method for Reliability and Performance of Cloud Services Composition Application

    Directory of Open Access Journals (Sweden)

    Zhao Wu

    2013-01-01

    At present, cloud computing is one of the newest trends in distributed computation, and it is propelling another important revolution in the software industry. Cloud services composition is one of the key techniques in software development. The optimization of the reliability and performance of cloud services composition applications, which is a typical stochastic optimization problem, is confronted with severe challenges due to its randomness and long transactions, as well as characteristics of cloud computing resources such as openness and dynamics. Traditional reliability and performance optimization techniques, for example Markov models and state space analysis, have defects such as being too time-consuming, easily causing state-space explosion, and relying on assumptions of component execution independence that are not satisfied. To overcome these defects, we propose a fast optimization method for the reliability and performance of cloud services composition applications based on the universal generating function and a genetic algorithm. First, a reliability and performance model for cloud services composition applications based on multi-state system theory is presented. Then a reliability and performance definition based on the universal generating function is proposed. Based on this, a fast reliability and performance optimization algorithm is presented. In the end, illustrative examples are given.
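
    A minimal sketch of the universal generating function idea underlying the method (my toy encoding, not the paper's algorithm): each component is a distribution over performance levels, and composition operators combine them.

```python
from itertools import product

def compose(u, v, op):
    """Combine two components' {performance: probability} maps."""
    out = {}
    for (g1, p1), (g2, p2) in product(u.items(), v.items()):
        g = op(g1, g2)
        out[g] = out.get(g, 0.0) + p1 * p2
    return out

a = {100: 0.9, 0: 0.1}     # service node: full capacity or failed (assumed)
b = {60: 0.95, 0: 0.05}    # a slower second node
series = compose(a, b, min)                 # both needed: min throughput rules
demand = 50
reliability = sum(p for g, p in series.items() if g >= demand)
print(series, reliability)
```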

  2. Inter-rater reliability of an observation-based ergonomics assessment checklist for office workers.

    Science.gov (United States)

    Pereira, Michelle Jessica; Straker, Leon Melville; Comans, Tracy Anne; Johnston, Venerina

    2016-12-01

    To establish the inter-rater reliability of an observation-based ergonomics assessment checklist for computer workers. A 37-item (38-item if a laptop was part of the workstation) comprehensive observational ergonomics assessment checklist, comparable to government guidelines and up to date with empirical evidence, was developed. Two trained practitioners assessed full-time office workers performing their usual computer-based work and evaluated the suitability of the workstations used. Practitioners assessed each participant consecutively. The order of assessors was randomised, and the second assessor was blinded to the findings of the first. Unadjusted kappa coefficients between the raters were obtained for the overall checklist and for subsections formed from question items relevant to specific workstation equipment. Twenty-seven office workers were recruited. Inter-rater agreement between the two trained practitioners was moderate to good for all except one checklist component. This checklist has mostly moderate to good reliability between two trained practitioners. Practitioner Summary: This reliable ergonomics assessment checklist for computer workers was designed using accessible government guidelines and supplemented with up-to-date evidence. Employers in Queensland (Australia) can fulfil legislative requirements by using this reliable checklist to identify and subsequently address potential risk factors for work-related injury to provide a safe working environment.
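
    For concreteness, an unadjusted Cohen's kappa between two raters' binary checklist judgements can be computed as below; the ratings are made up.

```python
from sklearn.metrics import cohen_kappa_score

rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0]   # hypothetical item judgements
rater2 = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 1, 0]
print("kappa:", cohen_kappa_score(rater1, rater2))
```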

  3. Reliability-based sensitivity of mechanical components with arbitrary distribution parameters

    International Nuclear Information System (INIS)

    Zhang, Yi Min; Yang, Zhou; Wen, Bang Chun; He, Xiang Dong; Liu, Qiaoling

    2010-01-01

    This paper presents a reliability-based sensitivity method for mechanical components with arbitrary distribution parameters. Techniques from the perturbation method, the Edgeworth series, reliability-based design theory, and sensitivity analysis were employed directly to calculate the reliability-based sensitivity of mechanical components on the condition that the first four moments of the original random variables are known. The reliability-based sensitivity information of mechanical components can be obtained accurately and quickly using a practical computer program. The effects of the design parameters on the reliability of mechanical components were studied. The method presented in this paper provides the theoretical basis for the reliability-based design of mechanical components.

  4. Development of reliability-based safety enhancement technology

    International Nuclear Information System (INIS)

    Kim, Kil Yoo; Han, Sang Hoon; Jang, Seung Cherl

    2002-04-01

    This project aims to develop critical technologies and the necessary reliability database for maximizing the economics of NPP operation while maintaining safety, using risk (or reliability) information. To this end, firstly, four critical technologies for risk-informed regulation and applications (Risk-Informed Tech. Spec. Optimization, Risk-Informed Inservice Testing, On-line Maintenance, Maintenance Rule) have been developed. Secondly, KIND (Korea Information System for Nuclear Reliability Data) has been developed. Using KIND, component reliability databases for YGN 3,4 and UCN 3,4 have been established. A reactor trip history database for all NPPs in Korea has also been developed and analyzed. Finally, a detailed reliability analysis of the RPS/ESFAS for the KSNP has been performed, and with its results a sensitivity analysis has been carried out to optimize the AOT/STI of the technical specifications. A statistical analysis procedure and computer code have also been developed for set point drift analysis.

  5. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)

  6. A taxonomy for human reliability analysis

    International Nuclear Information System (INIS)

    Beattie, J.D.; Iwasa-Madge, K.M.

    1984-01-01

    A human interaction taxonomy (classification scheme) was developed to facilitate human reliability analysis in a probabilistic safety evaluation of a nuclear power plant, being performed at Ontario Hydro. A human interaction occurs, by definition, when operators or maintainers manipulate, or respond to indication from, a plant component or system. The taxonomy aids the fault tree analyst by acting as a heuristic device. It helps define the range and type of human errors to be identified in the construction of fault trees, while keeping the identification by different analysts consistent. It decreases the workload associated with preliminary quantification of the large number of identified interactions by including a category called 'simple interactions'. Fault tree analysts quantify these according to a procedure developed by a team of human reliability specialists. The interactions which do not fit into this category are called 'complex' and are quantified by the human reliability team. The taxonomy is currently being used in fault tree construction in a probabilistic safety evaluation. As far as can be determined at this early stage, the potential benefits of consistency and completeness in identifying human interactions and streamlining the initial quantification are being realized

  7. Bypassing BDD Construction for Reliability Analysis

    DEFF Research Database (Denmark)

    Williams, Poul Frederick; Nikolskaia, Macha; Rauzy, Antoine

    2000-01-01

    In this note, we propose a Boolean Expression Diagram (BED)-based algorithm to compute the minimal p-cuts of Boolean reliability models such as fault trees. BEDs make it possible to bypass the Binary Decision Diagram (BDD) construction, which is the main cost of fault tree assessment....
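
    As background, once the minimal cut sets of a fault tree are available (whether obtained via BDDs, BEDs, or otherwise), the top-event probability follows by inclusion-exclusion over the cut sets, assuming independent basic events. A small Python illustration with invented basic-event probabilities:

```python
# Top-event probability from minimal cut sets via inclusion-exclusion.
# Cut sets and basic-event probabilities below are illustrative only.

from itertools import combinations

def cutset_prob(events, p):
    prob = 1.0
    for e in events:
        prob *= p[e]
    return prob

def top_event_probability(cutsets, p):
    """Exact P(union of cut sets) assuming independent basic events."""
    total = 0.0
    for r in range(1, len(cutsets) + 1):
        for combo in combinations(cutsets, r):
            union = set().union(*combo)          # events in this intersection term
            total += (-1) ** (r + 1) * cutset_prob(union, p)
    return total

p = {"pump": 1e-3, "valve": 5e-4, "sensor": 2e-3}
cutsets = [{"pump", "valve"}, {"sensor"}]
print(top_event_probability(cutsets, p))   # ~2.0e-3 (the sensor cut set dominates)
```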

  8. System-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, NEWTONP, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557) are used independently of one another. Program finds probability required to yield given system reliability. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
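
    The record does not show NEWTONP's interface, but the computation it names, finding the per-unit probability that yields a required system reliability for a k-out-of-n configuration, can be sketched as a bisection on the binomial tail. The parameters below are illustrative; this is not the NEWTONP code itself.

```python
# Find per-unit success probability p such that a k-out-of-n system
# meets a target reliability: P(X >= k), X ~ Binomial(n, p).

from math import comb

def system_reliability(n, k, p):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def required_unit_probability(n, k, target, tol=1e-12):
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if system_reliability(n, k, mid) < target:
            lo = mid          # need more reliable units
        else:
            hi = mid
    return hi

# Per-unit reliability needed so that at least 3 of 5 units work with probability 0.999:
print(required_unit_probability(n=5, k=3, target=0.999))
```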

  9. An Intuitionistic Fuzzy Methodology for Component-Based Software Reliability Optimization

    DEFF Research Database (Denmark)

    Madsen, Henrik; Grigore, Albeanu; Popenţiuvlǎdicescu, Florin

    2012-01-01

    Component-based software development is the current methodology facilitating agility in project management, software reuse in design and implementation, promoting quality and productivity, and increasing the reliability and performability. This paper illustrates the usage of intuitionistic fuzzy...... degree approach in modelling the quality of entities in imprecise software reliability computing in order to optimize management results. Intuitionistic fuzzy optimization algorithms are proposed to be used for complex software systems reliability optimization under various constraints....

  10. Telecommunications system reliability engineering theory and practice

    CERN Document Server

    Ayers, Mark L

    2012-01-01

    "Increasing system complexity require new, more sophisticated tools for system modeling and metric calculation. Bringing the field up to date, this book provides telecommunications engineers with practical tools for analyzing, calculating, and reporting availability, reliability, and maintainability metrics. It gives the background in system reliability theory and covers in-depth applications in fiber optic networks, microwave networks, satellite networks, power systems, and facilities management. Computer programming tools for simulating the approaches presented, using the Matlab software suite, are also provided"

  11. Plant and control system reliability and risk model

    International Nuclear Information System (INIS)

    Niemelae, I.M.

    1986-01-01

    A new reliability modelling technique for control systems and plants is demonstrated. It is based on modified Boolean algebra and has been automated in an efficient computer code called RELVEC. The code is useful for obtaining an overall view of the reliability parameters or for an in-depth reliability analysis, which is essential in risk analysis, where the model must be capable of answering specific questions such as: 'What is the probability that this temperature limiter produces a false alarm?' or 'What is the probability that the air pressure in this subsystem drops below the lower limit?'. (orig./DG)

  12. DIRAC reliable data management for LHCb

    CERN Document Server

    Smith, A C

    2008-01-01

    DIRAC, LHCb's Grid Workload and Data Management System, utilizes WLCG resources and middleware components to perform distributed computing tasks satisfying LHCb's Computing Model. The Data Management System (DMS) handles data transfer and data access within LHCb. Its scope ranges from the output of the LHCb Online system to Grid-enabled storage for all data types. It supports metadata for these files in replica and bookkeeping catalogues, allowing dataset selection and localization. The DMS controls the movement of files in a redundant fashion whilst providing utilities for accessing all metadata. To perform these tasks effectively, the DMS requires complete integrity between its components and external physical storage. The DMS provides highly redundant management of all LHCb data to leverage available storage resources and to manage transient errors in underlying services. It provides data-driven and reliable distribution of files as well as reliable job output upload, utilizing VO Boxes at LHCb Tier1 sites ...

  13. The Role of Human Error in Design, Construction, and Reliability of Marine Structures.

    Science.gov (United States)

    1994-10-01

    Lack of recognition of human and organizational error (HOE) is the fundamental reason for the disparities between computed or notional reliabilities and the actuarial reliabilities of marine structures.

  14. Data reliability in complex directed networks

    Science.gov (United States)

    Sanz, Joaquín; Cozzo, Emanuele; Moreno, Yamir

    2013-12-01

    The availability of data from many different sources and fields of science has made it possible to map out an increasing number of networks of contacts and interactions. However, quantifying how reliable these data are remains an open problem. From Biology to Sociology and Economics, the identification of false and missing positives has become a problem that calls for a solution. In this work we extend one of the newest, best performing models—due to Guimerá and Sales-Pardo in 2009—to directed networks. The new methodology is able to identify missing and spurious directed interactions with more precision than previous approaches, which renders it particularly useful for analyzing data reliability in systems like trophic webs, gene regulatory networks, communication patterns and several social systems. We also show, using real-world networks, how the method can be employed to help search for new interactions in an efficient way.

  15. Data reliability in complex directed networks

    International Nuclear Information System (INIS)

    Sanz, Joaquín; Cozzo, Emanuele; Moreno, Yamir

    2013-01-01

    The availability of data from many different sources and fields of science has made it possible to map out an increasing number of networks of contacts and interactions. However, quantifying how reliable these data are remains an open problem. From Biology to Sociology and Economics, the identification of false and missing positives has become a problem that calls for a solution. In this work we extend one of the newest, best performing models—due to Guimerá and Sales-Pardo in 2009—to directed networks. The new methodology is able to identify missing and spurious directed interactions with more precision than previous approaches, which renders it particularly useful for analyzing data reliability in systems like trophic webs, gene regulatory networks, communication patterns and several social systems. We also show, using real-world networks, how the method can be employed to help search for new interactions in an efficient way. (paper)

  16. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2018-05-01

    Full Text Available Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most of the large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based, high-performance computing method using the OpenACC application was adopted to parallelize the shallow water model. An unstructured data management method was presented to control the data transportation between the GPU and the Central Processing Unit (CPU) with minimum overhead, and then both computation and data were offloaded from the CPU to the GPU, which exploited the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and, thus, has a bright application prospect for dynamic inundation risk identification and disaster assessment.

  17. Reliability and factor structure of the audit among male and female ...

    African Journals Online (AJOL)

    We assessed the reliability and dimensional structure of the Alcohol Use Disorders Identification Test (AUDIT) among bar patrons in a rural area of South Africa. In total, 406 bar patrons completed a questionnaire containing the AUDIT, and demographic and psychosocial measures. The participants consisted of 314 (77.3%) ...

  18. A Reliability Based Model for Wind Turbine Selection

    Directory of Open Access Journals (Sweden)

    A.K. Rajeevan

    2013-06-01

    Full Text Available A wind turbine generator's output at a specific site depends on many factors, particularly the cut-in, rated and cut-out wind speed parameters. Hence power output varies from turbine to turbine. The objective of this paper is to develop a mathematical relationship between reliability and wind power generation. The analytical computation of monthly wind power is obtained from a Weibull statistical model using the cubic mean cube root of wind speed. The reliability calculation is based on failure probability analysis. There are many different types of wind turbines commercially available in the market. From a reliability point of view, to get optimum reliability in power generation, it is desirable to select a wind turbine generator which is best suited for a site. The mathematical relationship developed in this paper can be used for site-matching turbine selection from a reliability point of view.
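
    The "cubic mean cube root" of wind speed is the cube root of the third moment of the wind-speed distribution, which for a Weibull model with scale c and shape k equals (c³ Γ(1 + 3/k))^(1/3); it matters because captured power scales with the cube of speed. A Python sketch of the resulting mean-power estimate, using assumed site and turbine parameters:

```python
# Mean wind power from a Weibull wind-speed model using the cubic mean
# cube root of wind speed. Site and turbine parameters are assumed values.

from math import gamma, pi

def cubic_mean_cube_root(c, k):
    """(E[v^3])^(1/3) for Weibull(scale=c, shape=k) wind speeds."""
    return (c**3 * gamma(1 + 3 / k)) ** (1 / 3)

def mean_wind_power(c, k, rotor_diameter, air_density=1.225, cp=0.4):
    """Average power in watts captured by the rotor (simplified model)."""
    area = pi * (rotor_diameter / 2) ** 2
    v_cmc = cubic_mean_cube_root(c, k)
    return 0.5 * air_density * area * cp * v_cmc**3

# Assumed monthly Weibull fit: scale 7.5 m/s, shape 2.0; 40 m rotor diameter.
p_avg = mean_wind_power(c=7.5, k=2.0, rotor_diameter=40.0)
print(f"mean power ~ {p_avg/1e3:.0f} kW, monthly energy ~ {p_avg*730/1e6:.0f} MWh")
```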

  19. Identification of tasks of maintenance centered in the reliability

    International Nuclear Information System (INIS)

    Torres V, A.; Rivero O, J.J.

    2004-01-01

    The methodology of Reliability Centered Maintenance (RCM) has become, following the discovery of its advantages, an objective of many industrial facilities seeking to optimize their maintenance. However, diverse subjective factors affect the determination of the parameters that characterize RCM tasks (the predictive techniques to apply and the times between interventions). A method is presented to determine condition-monitoring tasks and the most suitable intervals for time-based monitoring and failure-finding tasks, with a system focus. This methodology has been computerized in the code MOSEG Win Ver 1.0, which has been applied successfully to the determination of RCM tasks in industrial facilities. (Author)

  20. Cardiac valve calcifications on low-dose unenhanced ungated chest computed tomography: inter-observer and inter-examination reliability, agreement and variability

    International Nuclear Information System (INIS)

    Hamersvelt, Robbert W. van; Willemink, Martin J.; Takx, Richard A.P.; Eikendal, Anouk L.M.; Budde, Ricardo P.J.; Leiner, Tim; Jong, Pim A. de; Mol, Christian P.; Isgum, Ivana

    2014-01-01

    To determine inter-observer and inter-examination variability for aortic valve calcification (AVC) and mitral valve and annulus calcification (MC) in low-dose unenhanced ungated lung cancer screening chest computed tomography (CT). We included 578 lung cancer screening trial participants who were examined by CT twice within 3 months to follow indeterminate pulmonary nodules. On these CTs, AVC and MC were measured in cubic millimetres. One hundred CTs were examined by five observers to determine the inter-observer variability. Reliability was assessed by kappa statistics (κ) and intra-class correlation coefficients (ICCs). Variability was expressed as the mean difference ± standard deviation (SD). Inter-examination reliability was excellent for AVC (κ = 0.94, ICC = 0.96) and MC (κ = 0.95, ICC = 0.90). Inter-examination variability was 12.7 ± 118.2 mm³ for AVC and 31.5 ± 219.2 mm³ for MC. Inter-observer reliability ranged from κ = 0.68 to κ = 0.92 for AVC and from κ = 0.20 to κ = 0.66 for MC. Inter-observer ICC was 0.94 for AVC and ranged from 0.56 to 0.97 for MC. Inter-observer variability ranged from -30.5 ± 252.0 mm³ to 84.0 ± 240.5 mm³ for AVC and from -95.2 ± 210.0 mm³ to 303.7 ± 501.6 mm³ for MC. AVC can be quantified with excellent reliability on ungated unenhanced low-dose chest CT, but manual detection of MC can be subject to substantial inter-observer variability. Lung cancer screening CT may be used for detection and quantification of cardiac valve calcifications. (orig.)
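
    For reference, the κ statistics quoted above measure agreement beyond chance. A minimal computation of Cohen's kappa for two raters' binary calls (for example, calcification present or absent) is shown below; the example labels are invented:

```python
# Cohen's kappa for two raters' binary judgements (e.g. calcification
# present/absent). The example labels are invented for illustration.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal rate of "positive" calls.
    pa = sum(rater_a) / n
    pb = sum(rater_b) / n
    expected = pa * pb + (1 - pa) * (1 - pb)
    return (observed - expected) / (1 - expected)

a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
b = [1, 1, 0, 0, 0, 0, 1, 1, 1, 1]
print(cohens_kappa(a, b))   # ~0.58: substantial but imperfect agreement
```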

  1. The Reliability and Stability of an Inferred Phylogenetic Tree from Empirical Data.

    Science.gov (United States)

    Katsura, Yukako; Stanley, Craig E; Kumar, Sudhir; Nei, Masatoshi

    2017-03-01

    The reliability of a phylogenetic tree obtained from empirical data is usually measured by the bootstrap probability (Pb) of the interior branches of the tree. If the bootstrap probability is high for most branches, the tree is considered reliable. If some interior branches show relatively low bootstrap probabilities, we are not sure that the inferred tree is really reliable. Here, we propose another quantity measuring the reliability of the tree, called the stability of a subtree. This quantity refers to the probability (Ps) of obtaining a given subtree of the inferred tree. We then show that if the tree is to be reliable, both Pb and Ps must be high. We also show that Ps is given by the bootstrap probability of the subtree with the closest outgroup sequence, and a computer program, RESTA, for computing the Pb and Ps values is presented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  2. Efficient surrogate models for reliability analysis of systems with multiple failure modes

    International Nuclear Information System (INIS)

    Bichon, Barron J.; McFarland, John M.; Mahadevan, Sankaran

    2011-01-01

    Despite many advances in the field of computational reliability analysis, the efficient estimation of the reliability of a system with multiple failure modes remains a persistent challenge. Various sampling and analytical methods are available, but they typically require accepting a tradeoff between accuracy and computational efficiency. In this work, a surrogate-based approach is presented that simultaneously addresses the issues of accuracy, efficiency, and unimportant failure modes. The method is based on the creation of Gaussian process surrogate models that are required to be locally accurate only in the regions of the component limit states that contribute to system failure. This approach to constructing surrogate models is demonstrated to be both an efficient and accurate method for system-level reliability analysis. - Highlights: → Extends efficient global reliability analysis to systems with multiple failure modes. → Constructs locally accurate Gaussian process models of each response. → Highly efficient and accurate method for assessing system reliability. → Effectiveness is demonstrated on several test problems from the literature.

  3. Evaluation of tecnological reliability of wind turbine facility Gibara 2

    International Nuclear Information System (INIS)

    Torres Valle, Antonio; Martínez Martín, Erich

    2016-01-01

    Renewable energy, particularly wind, will occupy an important place in the coming decades, marked by the depletion of fossil fuel sources. In Cuba, significant growth in the use of these energy sources is forecast. For this reason, the creation of reliable technology to support that future mission is important. The central objective of this paper is the reliability analysis of Wind Farm Gibara 2, starting from its representation using the fault tree methodology, and the recommendation of some possible applications of the results. An essential step in the research is the determination of the components participating in the fault tree and the processing of the reliability database available at Wind Farm Gibara 2. The document essentially helps in the identification of the main contributors to the unavailability of the facility and in optimizing its maintenance policy. (author)

  4. A comparative study of computed radiographic cephalometry and conventional cephalometry in reliability of head film measurements

    International Nuclear Information System (INIS)

    Kim, Hyung Done; Kim, Kee Deog; Park, Chang Seo

    1997-01-01

    The purpose of this study was to compare and determine the variability of head film measurements (landmark identification) between Fuji computed radiographic (FCR) cephalometry and conventional cephalometry. 28 Korean adults were selected. A lateral cephalometric FCR film and a conventional cephalometric film were taken of each subject. Four investigators identified 24 cephalometric landmarks on the lateral cephalometric FCR films and conventional cephalometric films, and the results were statistically analysed. The results were as follows: 1. For the FCR films and conventional films, the coefficient of variation (C.V.) of the 24 landmarks was obtained horizontally and vertically. 2. In the comparison of significant differences in landmark variability between FCR film and conventional film, the horizontal values of the coefficient of variation showed significant differences for four of the twenty-four landmarks, whereas the vertical values showed significant differences for sixteen of the twenty-four landmarks. FCR film showed significantly less variability than conventional film in 17 of the 20 (4+16) landmarks that showed significant differences.

  5. Inclusion of task dependence in human reliability analysis

    International Nuclear Information System (INIS)

    Su, Xiaoyan; Mahadevan, Sankaran; Xu, Peida; Deng, Yong

    2014-01-01

    Dependence assessment among human errors in human reliability analysis (HRA) is an important issue, which includes the evaluation of the dependence among human tasks and the effect of the dependence on the final human error probability (HEP). This paper presents a computational model to handle dependence in human reliability analysis. The aim of the study is to automatically provide conclusions on the overall degree of dependence and calculate the conditional human error probability (CHEP) once the judgments of the input factors are given. The dependence influencing factors are first identified by the experts and the priorities of these factors are also taken into consideration. Anchors and qualitative labels are provided as guidance for the HRA analyst's judgment of the input factors. The overall degree of dependence between human failure events is calculated based on the input values and the weights of the input factors. Finally, the CHEP is obtained according to a computing formula derived from the technique for human error rate prediction (THERP). The proposed method is able to quantify the subjective judgment from the experts and improve the transparency in the HEP evaluation process. Two examples are illustrated to show the effectiveness and the flexibility of the proposed method. - Highlights: • We propose a computational model to handle dependence in human reliability analysis. • The priorities of the dependence influencing factors are taken into consideration. • The overall dependence degree is determined by input judgments and the weights of factors. • The CHEP is obtained according to a computing formula derived from THERP
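
    The THERP-derived formulas referenced in this abstract map a judged dependence level to a conditional error probability. The standard THERP expressions are simple to state in code; the nominal HEP value below is illustrative:

```python
# Conditional human error probability (CHEP) under THERP dependence levels.
# Standard THERP formulas; the example HEP value is illustrative.

THERP_CHEP = {
    "zero":     lambda hep: hep,
    "low":      lambda hep: (1 + 19 * hep) / 20,
    "moderate": lambda hep: (1 + 6 * hep) / 7,
    "high":     lambda hep: (1 + hep) / 2,
    "complete": lambda hep: 1.0,
}

def chep(hep, level):
    return THERP_CHEP[level](hep)

hep = 1e-3   # nominal error probability of the dependent task
for level in THERP_CHEP:
    print(f"{level:>8}: CHEP = {chep(hep, level):.4f}")
```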

  6. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft. Therefore, it is very important to analyze and improve the reliability performance of SpaceWire networks. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. According to the functional division of the distributed network, a task-based reliability analysis method is proposed: the reliability analysis of every task leads to a system reliability matrix, and the reliability of the network system is deduced by integrating all the reliability indexes in the matrix. With this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and for multi-path task reliability are also implemented. Using this tool, we analyze several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. Finally, this reliability analysis tool will have a direct influence on both task division and topology selection in the design phase of SpaceWire network systems.

  7. Computer Identification of Symptomatic Deep Venous Thrombosis Associated with Peripherally Inserted Central Catheters

    Science.gov (United States)

    Evans, R. Scott; Linford, Lorraine H.; Sharp, Jamie H.; White, Gayle; Lloyd, James F.; Weaver, Lindell K.

    2007-01-01

    Peripherally inserted central catheters (PICCs) are considered a safe method to provide long-term antibiotic therapy, chemotherapy and nutrition support. Deep venous thrombosis (DVT) is a complication that requires early PICC removal, may extend hospitalization and can result in pulmonary embolism. PICC insertion teams strive to understand risk factors and develop methods to prevent DVTs. However, they can only manage what they can measure. At LDS Hospital, identification of PICC associated DVTs was dependent on verbal notification or manual surveillance of more than a thousand free-text vascular reports. Accurate DVT rates were not known which hindered prevention. We describe the development of a computer application (PICC-DVT monitor) to identify PICC associated DVTs each day. A one-year evaluation of the monitor by the PICC team and a review of 445 random vascular reports found a positive predictive value of 98%, sensitivity of 94%, specificity of 100% and a PICC team associated DVT rate of 2.8%. PMID:18693831

  8. Computer identification of symptomatic deep venous thrombosis associated with peripherally inserted venous catheters.

    Science.gov (United States)

    Evans, R Scott; Linford, Lorraine H; Sharp, Jamie H; White, Gayle; Lloyd, James F; Weaver, Lindell K

    2007-10-11

    Peripherally inserted central catheters (PICCs) are considered a safe method to provide long-term antibiotic therapy, chemotherapy and nutrition support. Deep venous thrombosis (DVT) is a complication that requires early PICC removal, may extend hospitalization and can result in pulmonary embolism. PICC insertion teams strive to understand risk factors and develop methods to prevent DVTs. However, they can only manage what they can measure. At LDS Hospital, identification of PICC associated DVTs was dependent on verbal notification or manual surveillance of more than a thousand free-text vascular reports. Accurate DVT rates were not known which hindered prevention. We describe the development of a computer application (PICC-DVT monitor) to identify PICC associated DVTs each day. A one-year evaluation of the monitor by the PICC team and a review of 445 random vascular reports found a positive predictive value of 98%, sensitivity of 94%, specificity of 100% and a PICC team associated DVT rate of 2.8%.

  9. A Closed-Form Technique for the Reliability and Risk Assessment of Wind Turbine Systems

    Directory of Open Access Journals (Sweden)

    Leonardo Dueñas-Osorio

    2012-06-01

    Full Text Available This paper proposes a closed-form method to evaluate wind turbine system reliability and associated failure consequences. Monte Carlo simulation, a widely used approach for system reliability assessment, usually requires large numbers of computational experiments, while existing analytical methods are limited to simple system event configurations with a focus on average values of reliability metrics. By analyzing a wind turbine system and its components in a combinatorial yet computationally efficient form, the proposed approach provides an entire probability distribution of system failure that contains all possible configurations of component failure and survival events. The approach is also capable of handling unique component attributes such as downtime and repair cost needed for risk estimations, and enables sensitivity analysis for quantifying the criticality of individual components to wind turbine system reliability. Applications of the technique are illustrated by assessing the reliability of a 12-subassembly turbine system. In addition, component downtimes and repair costs are embedded in the formulation to compute expected annual wind turbine unavailability and repair cost probabilities, and component importance metrics useful for maintenance planning and research prioritization. Furthermore, this paper introduces a recursive solution to the closed-form method and applies it to a 45-component turbine system. The proposed approach proves to be computationally efficient and yields vital reliability information that could be readily used by wind farm stakeholders for decision making and risk management.
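
    The combinatorial idea, enumerating every component failure/survival configuration to obtain the full probability distribution of consequences, can be shown directly for a small subassembly set. The component names, failure probabilities, and repair costs below are assumptions for illustration, not the paper's data:

```python
# Full probability distribution of annual repair cost by enumerating all
# component failure/survival configurations (illustrative data only).

from itertools import product
from collections import defaultdict

# (annual failure probability, repair cost in k$) per subassembly -- assumed.
components = {
    "gearbox":   (0.10, 230.0),
    "generator": (0.08, 60.0),
    "blades":    (0.05, 90.0),
    "converter": (0.12, 20.0),
}

cost_dist = defaultdict(float)
for states in product([0, 1], repeat=len(components)):   # 1 = failed
    prob, cost = 1.0, 0.0
    for failed, (pf, c) in zip(states, components.values()):
        prob *= pf if failed else (1 - pf)
        cost += c if failed else 0.0
    cost_dist[cost] += prob

expected_cost = sum(c * p for c, p in cost_dist.items())
print(f"E[annual repair cost] = {expected_cost:.1f} k$")
print(f"P(no repairs) = {cost_dist[0.0]:.3f}")
```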

  10. RIO: a program to determine reliability importance and allocate optimal reliability goals

    International Nuclear Information System (INIS)

    Poloski, J.P.

    1978-09-01

    The designer of a nuclear plant must know the plant's associated risk limitations so that he can design the plant accordingly. To design a safety system, he must understand its importance and how it relates to overall plant risk. The computer program RIO can aid the designer in understanding a system's contribution to the plant's overall risk. The methodology developed and presented here was sponsored by the Nuclear Research Applications Division of the Department of Energy for use in the Gas Cooled Fast Breeder Reactor (GCFR) Program. The principal motivation behind its development was the need to translate nuclear plant safety goals into reliability goals for the systems which make up the plant. The method described herein makes use of the GCFR Accident Initiation and Progression Analyses (AIPA) event trees and other models in order to determine these reliability goals.

  11. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Full Text Available Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.

  12. ERP application of real-time vdc-enabled last planner system for planning reliability improvement

    DEFF Research Database (Denmark)

    Cho, S.; Sørensen, Kristian Birch; Fischer, M.

    2009-01-01

    The Last Planner System (LPS) has since its introduction in 1994 become a widely used method of AEC practitioners for improvement of planning reliability and tracking and monitoring of project progress. However, the observations presented in this paper indicate that the last planners...... and coordinators are in need of a new system that integrates the existing LPS with Virtual Design and Construction (VDC), Enterprise Resource Planning (ERP) systems, and automatic object identification by means of Radio Frequency Identification (RFID) technology. This is because current practice of the LPS...... implementations is guesswork-driven, textual report-generated, hand-updated, and even interpersonal trust-oriented, resulting in less accurate and reliable plans. This research introduces a prototype development of the VREL (VDC + RFID + ERP + LPS) integration to generate a real-time updated cost + physical...

  13. Evaluating North American Electric Grid Reliability Using the Barabasi-Albert Network Model

    OpenAIRE

    Chassin, David P.; Posse, Christian

    2004-01-01

    The reliability of electric transmission systems is examined using a scale-free model of network structure and failure propagation. The topologies of the North American eastern and western electric networks are analyzed to estimate their reliability based on the Barabasi-Albert network model. A commonly used power system reliability index is computed using a simple failure propagation model. The results are compared to the values of power system reliability indices previously obtained using s...
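
    A minimal illustration of the modelling idea, generating a Barabási–Albert topology and watching connectivity degrade as nodes fail, can be written with networkx. The network size, failure fractions, and largest-connected-component metric are illustrative stand-ins, not the power system reliability index used in the paper:

```python
# Degradation of a scale-free (Barabasi-Albert) network under random node
# failures, using the largest-connected-component fraction as a crude
# connectivity proxy (not the power-system index computed in the paper).

import random
import networkx as nx

def surviving_fraction(g, failed_nodes):
    h = g.copy()
    h.remove_nodes_from(failed_nodes)
    if h.number_of_nodes() == 0:
        return 0.0
    return len(max(nx.connected_components(h), key=len)) / g.number_of_nodes()

random.seed(0)
g = nx.barabasi_albert_graph(n=500, m=2, seed=0)
for frac in (0.05, 0.10, 0.20):
    failed = random.sample(list(g.nodes), int(frac * g.number_of_nodes()))
    print(f"{frac:.0%} nodes failed -> {surviving_fraction(g, failed):.2f} still connected")
```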

  14. Evaluation of the reliability of Levine method of wound swab for ...

    African Journals Online (AJOL)

    The aim of this paper is to evaluate the reliability of Levine swab in accurate identification of microorganisms present in a wound and identify the necessity for further studies in this regard. Methods: A semi structured questionnaire was administered and physical examination was performed on patients with chronic wounds ...

  15. Fast Metabolite Identification in Nuclear Magnetic Resonance Metabolomic Studies: Statistical Peak Sorting and Peak Overlap Detection for More Reliable Database Queries.

    Science.gov (United States)

    Hoijemberg, Pablo A; Pelczer, István

    2018-01-05

    A lot of time is spent by researchers in the identification of metabolites in NMR-based metabolomic studies. The usual metabolite identification starts employing public or commercial databases to match chemical shifts thought to belong to a given compound. Statistical total correlation spectroscopy (STOCSY), in use for more than a decade, speeds the process by finding statistical correlations among peaks, being able to create a better peak list as input for the database query. However, the (normally not automated) analysis becomes challenging due to the intrinsic issue of peak overlap, where correlations of more than one compound appear in the STOCSY trace. Here we present a fully automated methodology that analyzes all STOCSY traces at once (every peak is chosen as driver peak) and overcomes the peak overlap obstacle. Peak overlap detection by clustering analysis and sorting of traces (POD-CAST) first creates an overlap matrix from the STOCSY traces, then clusters the overlap traces based on their similarity and finally calculates a cumulative overlap index (COI) to account for both strong and intermediate correlations. This information is gathered in one plot to help the user identify the groups of peaks that would belong to a single molecule and perform a more reliable database query. The simultaneous examination of all traces reduces the time of analysis, compared to viewing STOCSY traces by pairs or small groups, and condenses the redundant information in the 2D STOCSY matrix into bands containing similar traces. The COI helps in the detection of overlapping peaks, which can be added to the peak list from another cross-correlated band. POD-CAST overcomes the generally overlooked and underestimated presence of overlapping peaks and it detects them to include them in the search of all compounds contributing to the peak overlap, enabling the user to accelerate the metabolite identification process with more successful database queries and searching all tentative

  16. OPTICAL correlation identification technology applied in underwater laser imaging target identification

    Science.gov (United States)

    Yao, Guang-tao; Zhang, Xiao-hui; Ge, Wei-long

    2012-01-01

    Underwater laser imaging detection is an effective method of detecting short-distance targets underwater and an important complement to sonar detection. With the development of underwater laser imaging technology and underwater vehicle technology, underwater automatic target identification has received more and more attention and remains a research difficulty in the area of underwater optical imaging information processing. Today, underwater automatic target identification based on optical imaging is usually realized with digital circuits and software programming. The algorithm realization and control of this method are very flexible. However, optical imaging information consists of 2D or even 3D images, and the amount of image processing information is large, so electronic hardware running purely digital algorithms needs a long identification time and can hardly meet the demands of real-time identification. Adopting parallel computer processing can improve identification speed, but it increases complexity, size and power consumption. This paper attempts to apply optical correlation identification technology to realize underwater automatic target identification. Optical correlation identification utilizes the Fourier transform property of a Fourier lens, which can accomplish the Fourier transform of image information at the nanosecond level; optical spatial interconnection computation has the features of parallelism, high speed, large capacity and high resolution, and it can be combined with the computational and control flexibility of digital circuits to realize an optoelectronic hybrid identification mode. We derive the theoretical formulation of correlation identification, analyze the principle of optical correlation identification, and write a MATLAB simulation program. We adopt single-frame images obtained by underwater range-gated laser imaging for identification, and by identifying and locating targets at different positions we can improve

  17. Using Pareto points for model identification in predictive toxicology

    Science.gov (United States)

    2013-01-01

    Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration, but the management and automated identification of relevant models from available collections of models is still an open problem. Currently, the decision on which model should be used for a new chemical compound is left to the user. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology. PMID:23517649
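
    The core operation, keeping only models that are Pareto-optimal across several quality criteria for the query compound, is compact to express. In the sketch below, each candidate is scored on two objectives to maximize (say, predictive accuracy and the compound's similarity to the model's training domain); the model names and scores are invented:

```python
# Pareto-optimal model selection for a query compound (illustrative scores).
# Both objectives are to be maximized: (accuracy, domain similarity).

def pareto_front(candidates):
    """Return candidates not dominated by any other on all objectives."""
    front = []
    for name, scores in candidates:
        dominated = any(
            all(o >= s for o, s in zip(other, scores)) and other != scores
            for _, other in candidates
        )
        if not dominated:
            front.append((name, scores))
    return front

models = [
    ("model_A", (0.81, 0.40)),
    ("model_B", (0.74, 0.90)),
    ("model_C", (0.70, 0.60)),   # dominated by model_B
    ("model_D", (0.85, 0.30)),
]
print(pareto_front(models))      # -> model_A, model_B, model_D
```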

  18. Biometric identification based on feature fusion with PCA and SVM

    Science.gov (United States)

    Lefkovits, László; Lefkovits, Szidónia; Emerich, Simina

    2018-04-01

    Biometric identification is gaining ground compared to traditional identification methods. Many biometric measurements may be used for secure human identification. The most reliable among them is the iris pattern because of its uniqueness, stability, unforgeability and inalterability over time. The approach presented in this paper is a fusion of different feature descriptor methods such as HOG, LIOP, LBP, used for extracting iris texture information. The classifiers obtained through the SVM and PCA methods demonstrate the effectiveness of our system applied to one and both irises. The performances measured are highly accurate and foreshadow a fusion system with a rate of identification approaching 100% on the UPOL database.
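
    The described pipeline, fused texture descriptors reduced by PCA and classified by an SVM, has a standard scikit-learn form. In this sketch the feature matrix is random stand-in data rather than real HOG/LIOP/LBP iris descriptors, so the printed accuracy is only a placeholder:

```python
# PCA + SVM identification pipeline, sketched with scikit-learn.
# X stands in for fused iris descriptors (HOG/LIOP/LBP concatenated);
# random data is used here purely to make the example self-contained.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 512))          # 200 samples, 512-dim fused features
y = rng.integers(0, 10, size=200)        # 10 enrolled identities

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = make_pipeline(StandardScaler(), PCA(n_components=50), SVC(kernel="rbf", C=10.0))
clf.fit(X_tr, y_tr)
print(f"identification accuracy: {clf.score(X_te, y_te):.2f}")  # ~chance on random data
```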

  19. TIGER reliability analysis in the DSN

    Science.gov (United States)

    Gunn, J. M.

    1982-01-01

    The TIGER algorithm, the inputs to the program and the output are described. TIGER is a computer program designed to simulate a system over a period of time to evaluate system reliability and availability. Results can be used in the Deep Space Network for initial spares provisioning and system evaluation.

  20. Resilient computer system design

    CERN Document Server

    Castano, Victor

    2015-01-01

    This book presents a paradigm for designing new generation resilient and evolving computer systems, including their key concepts, elements of supportive theory, methods of analysis and synthesis of ICT with new properties of evolving functioning, as well as implementation schemes and their prototyping. The book explains why new ICT applications require a complete redesign of computer systems to address challenges of extreme reliability, high performance, and power efficiency. The authors present a comprehensive treatment for designing the next generation of computers, especially addressing safety-critical, autonomous, real time, military, banking, and wearable health care systems.   §  Describes design solutions for new computer system - evolving reconfigurable architecture (ERA) that is free from drawbacks inherent in current ICT and related engineering models §  Pursues simplicity, reliability, scalability principles of design implemented through redundancy and re-configurability; targeted for energy-,...

  1. Optimal design method for a digital human–computer interface based on human reliability in a nuclear power plant. Part 3: Optimization method for interface task layout

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Wang, Yiqun; Zhang, Li; Xie, Tian; Li, Min; Peng, Yuyuan; Wu, Daqing; Li, Peiyao; Ma, Congmin; Shen, Mengxu; Wu, Xing; Weng, Mengyun; Wang, Shiwei; Xie, Cen

    2016-01-01

    Highlights: • The authors present an optimization algorithm for interface task layout. • The execution process of the proposed algorithm is depicted. • The performance evaluation adopts a neural network method. • The optimized layouts of an event's interface tasks were obtained by experiments. - Abstract: This is the last in a series of papers describing the optimal design of a digital human–computer interface for a nuclear power plant (NPP) from three different points of view based on human reliability. The purpose of this series is to propose different optimization methods from varying perspectives to decrease human factor events that arise from defects of the human–computer interface. The present paper addresses how to effectively lay out interface tasks across different screens, with the aim of decreasing human errors by reducing the distance that an operator moves among different screens in each operation. To resolve the problem, the authors propose an optimization process for interface task layout for the digital human–computer interface of an NPP. To automatically lay out each interface task onto one of the screens in each operation, the paper presents a shortest-moving-path optimization algorithm with a dynamic flag based on human reliability. To test the algorithm's performance, the evaluation method uses a neural network based on human reliability: the lower the human error probabilities, the better the layout of interface tasks among the different screens. Thus, by analyzing the performance of each interface task layout, the optimization result is obtained. Finally, the optimized layouts of the interface tasks for a spurious safety injection event at the NPP were obtained by experiment; the proposed method shows good accuracy and stability.

  2. Reliability analysis and assessment of structural systems

    International Nuclear Information System (INIS)

    Yao, J.T.P.; Anderson, C.A.

    1977-01-01

    The study of structural reliability deals with the probability of having satisfactory performance of the structure under consideration within any specific time period. To pursue this study, it is necessary to apply available knowledge and methodology in structural analysis (including dynamics) and design, behavior of materials and structures, experimental mechanics, and the theory of probability and statistics. In addition, various severe loading phenomena such as strong motion earthquakes and wind storms are important considerations. For three decades now, much work has been done on reliability analysis of structures, and during this past decade, certain so-called 'Level I' reliability-based design codes have been proposed and are in various stages of implementation. These contributions will be critically reviewed and summarized in this paper. Because of the undesirable consequences resulting from the failure of nuclear structures, it is important and desirable to consider the structural reliability in the analysis and design of these structures. Moreover, after these nuclear structures are constructed, it is desirable for engineers to be able to assess the structural reliability periodically as well as immediately following the occurrence of severe loading conditions such as a strong-motion earthquake. During this past decade, increasing use has been made of techniques of system identification in structural engineering. On the basis of non-destructive test results, various methods have been developed to obtain an adequate mathematical model (such as the equations of motion with more realistic parameters) to represent the structural system

  3. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  4. Mapping the Most Significant Computer Hacking Events to a Temporal Computer Attack Model

    OpenAIRE

    Heerden , Renier ,; Pieterse , Heloise; Irwin , Barry

    2012-01-01

    Part 4: Section 3: ICT for Peace and War; International audience; This paper presents eight of the most significant computer hacking events (also known as computer attacks). These events were selected because of their unique impact, methodology, or other properties. A temporal computer attack model is presented that can be used to model computer based attacks. This model consists of the following stages: Target Identification, Reconnaissance, Attack, and Post-Attack Reconnaissance. The...

  5. A new kind high-reliability digital reactivity meter

    International Nuclear Information System (INIS)

    Shen Feng; Jiang Zongbing

    2001-01-01

    The paper introduces a new kind of high-reliability Digital Reactivity Meter developed by the DRM development group in the design department of the Nuclear Power Institute of China. The meter has two independent measurement channels, which can be configured either in a master-slave structure or to work independently. This structure ensures that the meter can continue to fulfill its online measurement task under single-failure conditions. It offers a solution to the conflict between a nuclear power station's stringent demands on DRM reliability and the instability of commercial computer software platforms. The instrument achieves both sophistication and reliability while covering many kinds of complex data processing and display functions.

  6. Engineering high reliability, low-jitter Marx generators

    International Nuclear Information System (INIS)

    Schneider, L.X.; Lockwood, G.J.

    1985-01-01

    Multimodule pulsed power accelerators typically require high module reliability and nanosecond regime simultaneity between modules. Energy storage using bipolar Marx generators can meet these requirements. Experience gained from computer simulations and the development of the DEMON II Marx generator has led to a fundamental understanding of the operation of these multistage devices. As a result of this research, significant improvements in erection time jitter and reliability have been realized in multistage, bipolar Marx generators. Erection time jitter has been measured as low as 2.5 nanoseconds for the 3.2MV, 16-stage PBFA I Marx and 3.5 nanoseconds for the 6.0MV, 30-stage PBFA II (DEMON II) Marx, while maintaining exceptionally low prefire rates. Performance data are presented from the DEMON II Marx research program, as well as discussions on the use of computer simulations in designing low-jitter Marx generators

  7. Preliminary Analysis of LORAN-C System Reliability for Civil Aviation.

    Science.gov (United States)

    1981-09-01

    overview of the analysis technique. Section 3 describes the computerized LORAN-C coverage model which is used extensively in the reliability analysis. ... Xth Plenary Assembly, Geneva, 1963, published by the International Telecommunications Union. Braff, R., "Computer Program to Calculate a Markov Chain Reliability Model," unpublished work, MITRE Corporation.

  8. Text-independent writer identification and verification using textural and allographic features.

    Science.gov (United States)

    Bulacu, Marius; Schomaker, Lambert

    2007-04-01

    The identification of a person on the basis of scanned images of handwriting is a useful biometric modality with application in forensic and historic document analysis and constitutes an exemplary study area within the research field of behavioral biometrics. We developed new and very effective techniques for automatic writer identification and verification that use probability distribution functions (PDFs) extracted from the handwriting images to characterize writer individuality. A defining property of our methods is that they are designed to be independent of the textual content of the handwritten samples. Our methods operate at two levels of analysis: the texture level and the character-shape (allograph) level. At the texture level, we use contour-based joint directional PDFs that encode orientation and curvature information to give an intimate characterization of individual handwriting style. In our analysis at the allograph level, the writer is considered to be characterized by a stochastic pattern generator of ink-trace fragments, or graphemes. The PDF of these simple shapes in a given handwriting sample is characteristic for the writer and is computed using a common shape codebook obtained by grapheme clustering. Combining multiple features (directional, grapheme, and run-length PDFs) yields increased writer identification and verification performance. The proposed methods are applicable to free-style handwriting (both cursive and isolated) and have practical feasibility, under the assumption that a few text lines of handwritten material are available in order to obtain reliable probability estimates.

  9. A general software reliability process simulation technique

    Science.gov (United States)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.

  10. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  11. Reliability of video-based identification of footstrike pattern and video time frame at initial contact in recreational runners

    DEFF Research Database (Denmark)

    Damsted, Camma; Larsen, L H; Nielsen, R.O.

    2015-01-01

    and video time frame at initial contact during treadmill running using two-dimensional (2D) video recordings. METHODS: Thirty-one recreational runners were recorded twice, 1 week apart, with a high-speed video camera. Two blinded raters evaluated each video twice with an interval of at least 14 days....... RESULTS: Kappa values for within-day identification of footstrike pattern revealed intra-rater agreement of 0.83-0.88 and inter-rater agreement of 0.50-0.63. Corresponding figures for between-day identification of footstrike pattern were 0.63-0.69 and 0.41-0.53, respectively. Identification of video time...... in 36% of the identifications (kappa=0.41). The 95% limits of agreement for identification of video time frame at initial contact may, at times, allow for different identification of footstrike pattern. Clinicians should, therefore, be encouraged to continue using clinical 2D video setups for intra...

  12. Estimating spatial travel times using automatic vehicle identification data

    Science.gov (United States)

    2001-01-01

    Prepared ca. 2001. The paper describes an algorithm that was developed for estimating reliable and accurate average roadway link travel times using Automatic Vehicle Identification (AVI) data. The algorithm presented is unique in two aspects. First, ...

  13. Higher-order techniques in computational electromagnetics

    CERN Document Server

    Graglia, Roberto D

    2016-01-01

    Higher-Order Techniques in Computational Electromagnetics explains higher-order techniques that can significantly improve the accuracy and reliability of computational methods for high-frequency electromagnetics, while reducing their computational cost, in applications such as antennas, microwave devices and radar scattering.

  14. Computer vision based room interior design

    Science.gov (United States)

    Ahmad, Nasir; Hussain, Saddam; Ahmad, Kashif; Conci, Nicola

    2015-12-01

    This paper introduces a new application of computer vision. To the best of the authors' knowledge, it is the first attempt to incorporate computer vision techniques into room interior design. Computer vision based interior design is achieved in two steps: object identification and color assignment. An image segmentation approach is used to identify the objects in the room, and different color schemes are then applied to these objects. The proposed approach is applied to both simple and complex images from online sources. It not only accelerates the interior design process but also makes it more efficient by offering multiple design alternatives.
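    The abstract does not specify the segmentation algorithm, so the sketch below stands in with simple colour clustering: segment the image into rough regions with k-means, then map each region to a colour from a candidate scheme. Function names and the palette are illustrative:

```python
import numpy as np
from sklearn.cluster import KMeans

def recolor_room(image, palette, n_regions=4, seed=0):
    """Segment an RGB image into rough regions by colour clustering, then
    assign each region a colour from a candidate scheme.

    `image`: (H, W, 3) float array in [0, 1]; `palette`: list of RGB tuples.
    A stand-in for the paper's (unspecified) segmentation step.
    """
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3)
    labels = KMeans(n_clusters=n_regions, random_state=seed,
                    n_init=10).fit_predict(pixels)
    out = np.empty_like(pixels)
    for region in range(n_regions):
        out[labels == region] = palette[region % len(palette)]
    return out.reshape(h, w, 3)

# Toy usage with a random "room" image and a warm colour scheme.
rng = np.random.default_rng(0)
img = rng.random((64, 64, 3))
scheme = [(0.9, 0.85, 0.7), (0.7, 0.5, 0.3), (0.4, 0.3, 0.25), (0.95, 0.95, 0.9)]
print(recolor_room(img, scheme).shape)
```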

  15. Cardiac valve calcifications on low-dose unenhanced ungated chest computed tomography: inter-observer and inter-examination reliability, agreement and variability

    Energy Technology Data Exchange (ETDEWEB)

    Hamersvelt, Robbert W. van; Willemink, Martin J.; Takx, Richard A.P.; Eikendal, Anouk L.M.; Budde, Ricardo P.J.; Leiner, Tim; Jong, Pim A. de [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Mol, Christian P.; Isgum, Ivana [University Medical Center Utrecht, Image Sciences Institute, Utrecht (Netherlands)

    2014-07-15

    To determine inter-observer and inter-examination variability for aortic valve calcification (AVC) and mitral valve and annulus calcification (MC) in low-dose unenhanced ungated lung cancer screening chest computed tomography (CT). We included 578 lung cancer screening trial participants who were examined by CT twice within 3 months to follow indeterminate pulmonary nodules. On these CTs, AVC and MC were measured in cubic millimetres. One hundred CTs were examined by five observers to determine the inter-observer variability. Reliability was assessed by kappa statistics (κ) and intra-class correlation coefficients (ICCs). Variability was expressed as the mean difference ± standard deviation (SD). Inter-examination reliability was excellent for AVC (κ = 0.94, ICC = 0.96) and MC (κ = 0.95, ICC = 0.90). Inter-examination variability was 12.7 ± 118.2 mm³ for AVC and 31.5 ± 219.2 mm³ for MC. Inter-observer reliability ranged from κ = 0.68 to κ = 0.92 for AVC and from κ = 0.20 to κ = 0.66 for MC. Inter-observer ICC was 0.94 for AVC and ranged from 0.56 to 0.97 for MC. Inter-observer variability ranged from -30.5 ± 252.0 mm³ to 84.0 ± 240.5 mm³ for AVC and from -95.2 ± 210.0 mm³ to 303.7 ± 501.6 mm³ for MC. AVC can be quantified with excellent reliability on ungated unenhanced low-dose chest CT, but manual detection of MC can be subject to substantial inter-observer variability. Lung cancer screening CT may be used for detection and quantification of cardiac valve calcifications. (orig.)
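    The inter-examination variability reported as mean difference ± SD, together with Bland-Altman 95% limits of agreement, can be computed from paired volume scores as follows (toy numbers, not study data):

```python
import numpy as np

def agreement_stats(exam1, exam2):
    """Variability as mean difference ± SD of paired scores, plus the
    Bland-Altman 95% limits of agreement (mean ± 1.96 SD)."""
    d = np.asarray(exam1, float) - np.asarray(exam2, float)
    mean_d, sd_d = d.mean(), d.std(ddof=1)
    loa = (mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d)
    return mean_d, sd_d, loa

# Toy paired AVC volumes (mm^3) from two CT examinations of the same subjects.
avc1 = [0, 120, 540, 80, 2300, 15, 0, 410]
avc2 = [0, 135, 500, 60, 2380, 10, 5, 395]
mean_d, sd_d, (lo, hi) = agreement_stats(avc1, avc2)
print(f"{mean_d:.1f} ± {sd_d:.1f} mm^3, 95% LoA [{lo:.1f}, {hi:.1f}]")
```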

  16. Reliability analysis of reactor inspection robot(RIROB)

    International Nuclear Information System (INIS)

    Eom, H. S.; Kim, J. H.; Lee, J. C.; Choi, Y. R.; Moon, S. S.

    2002-05-01

    This report describes the method and the results of the reliability analysis of RIROB, developed at the Korea Atomic Energy Research Institute. There are many classic techniques and models for reliability analysis. These techniques and models have been widely used and approved in other industries such as aviation and the nuclear industry. Although they have been proven in real fields, they are still insufficient for complicated systems such as RIROB, which are composed of computers, networks, electronic parts, mechanical parts, and software. In particular, the application of these analysis techniques to the digital and software parts of complicated systems is still immature, so expert judgement plays an important role in evaluating the reliability of such systems. In this report we propose a method which combines diverse evidence relevant to reliability in order to evaluate the reliability of complicated systems such as RIROB. The proposed method combines diverse evidence and performs inference in a formal and quantitative way by using the benefits of Bayesian Belief Nets (BBN).
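    As a toy illustration of combining diverse evidence in a Bayesian net, the following enumerates a two-evidence binary network by hand; the structure and all probabilities are invented for the sketch and are not the RIROB model:

```python
import numpy as np

# Minimal Bayesian evidence combination for a binary "system reliable?" node R,
# with two conditionally independent evidence sources: a field test outcome T
# and an expert judgement J. All probabilities are illustrative.
p_r = np.array([0.9, 0.1])                 # P(R): [reliable, unreliable]
p_t_given_r = np.array([[0.95, 0.05],      # P(T | R=reliable):   [pass, fail]
                        [0.30, 0.70]])     # P(T | R=unreliable)
p_j_given_r = np.array([[0.80, 0.20],      # P(J | R=reliable):   [favourable, not]
                        [0.40, 0.60]])     # P(J | R=unreliable)

def posterior_reliable(t_obs, j_obs):
    """P(R = reliable | T = t_obs, J = j_obs) by direct enumeration."""
    joint = p_r * p_t_given_r[:, t_obs] * p_j_given_r[:, j_obs]
    return joint[0] / joint.sum()

print(posterior_reliable(t_obs=0, j_obs=0))  # both sources favourable
print(posterior_reliable(t_obs=1, j_obs=0))  # test failed, expert favourable
```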

  17. Multi-level damage identification with response reconstruction

    Science.gov (United States)

    Zhang, Chao-Dong; Xu, You-Lin

    2017-10-01

    Damage identification through finite element (FE) model updating usually forms an inverse problem. Solving the inverse identification problem for complex civil structures is very challenging since the dimension of potential damage parameters in a complex civil structure is often very large. Aside from the enormous computational effort needed in iterative updating, the ill-conditioning and non-global identifiability of the inverse problem can hinder the realization of model-updating-based damage identification for large civil structures. Following a divide-and-conquer strategy, a multi-level damage identification method is proposed in this paper. The entire structure is decomposed into several manageable substructures and each substructure is further condensed into a macro element using the component mode synthesis (CMS) technique. The damage identification is performed at two levels: the first at the macro element level to locate the potentially damaged region, and the second over the suspicious substructures to further locate and quantify the damage severity. In each level's identification, the damage searching space over which model updating is performed is notably narrowed down, not only reducing the computational effort but also increasing the damage identifiability. In addition, Kalman filter-based response reconstruction is performed at the second level to reconstruct the response of the suspicious substructure for exact damage quantification. Numerical studies and laboratory tests are both conducted on a simply supported overhanging steel beam for conceptual verification. The results demonstrate that the proposed multi-level damage identification via response reconstruction considerably improves the accuracy of damage localization and quantification.
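    The response reconstruction step relies on a standard linear Kalman filter; a generic sketch (with placeholder system matrices, not the paper's condensed substructure model) is:

```python
import numpy as np

def kalman_reconstruct(A, C, Q, R, x0, P0, measurements):
    """Generic linear Kalman filter: reconstruct the full state (and hence
    unmeasured responses) from a partial measurement sequence.

    Model: x_{k+1} = A x_k + w_k, y_k = C x_k + v_k, w ~ N(0,Q), v ~ N(0,R).
    The matrices here are placeholders for a condensed substructure model."""
    x, P = x0.copy(), P0.copy()
    states = []
    for y in measurements:
        # Predict
        x = A @ x
        P = A @ P @ A.T + Q
        # Update
        S = C @ P @ C.T + R
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ (y - C @ x)
        P = (np.eye(len(x)) - K @ C) @ P
        states.append(x.copy())
    return np.array(states)

# Toy 2-state system with one sensor: reconstruct both states from one channel.
A = np.array([[0.99, 0.05], [-0.05, 0.99]])
C = np.array([[1.0, 0.0]])
Q, R = 1e-4 * np.eye(2), np.array([[1e-2]])
ys = [np.array([np.sin(0.1 * k)]) for k in range(50)]
xs = kalman_reconstruct(A, C, Q, R, np.zeros(2), np.eye(2), ys)
print(xs[-1])
```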

  18. Application of subset simulation in reliability estimation of underground pipelines

    International Nuclear Information System (INIS)

    Tee, Kong Fah; Khan, Lutfor Rahman; Li, Hongshuang

    2014-01-01

    This paper presents a computational framework for implementing an advanced Monte Carlo simulation method, called Subset Simulation (SS), for time-dependent reliability prediction of underground flexible pipelines. SS provides better resolution for the low failure probabilities of rare failure events, which are commonly encountered in pipeline engineering applications. Random samples of the statistical variables are generated efficiently and used for computing the probabilistic reliability model. SS gains its efficiency by expressing a small probability event as a product of a sequence of intermediate events with larger conditional probabilities. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment in comparison with the direct Monte Carlo simulation (MCS) method. The reliability of a buried flexible steel pipe with time-dependent failure modes, namely corrosion-induced deflection, buckling, wall thrust and bending stress, has been assessed in this study. The analysis indicates that corrosion-induced excessive deflection is the most critical failure event, whereas buckling is the least likely during the whole service life of the pipe. The study also shows that SS is a robust method for estimating the reliability of buried pipelines and that it is more efficient than MCS, especially for small failure probability prediction.
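    A textbook version of Subset Simulation with componentwise Metropolis resampling can be written compactly; the implementation below is a generic sketch for standard-normal inputs, not the paper's code:

```python
import numpy as np

def subset_simulation(g, dim, n=1000, p0=0.1, seed=0):
    """Estimate P[g(X) < 0] for standard-normal inputs X by Subset Simulation
    with componentwise Metropolis resampling (a textbook sketch)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n, dim))
    gx = np.apply_along_axis(g, 1, x)
    prob, n_seeds = 1.0, int(p0 * n)
    for _ in range(20):                       # cap on the number of levels
        order = np.argsort(gx)
        b = gx[order[n_seeds - 1]]            # intermediate threshold
        if b <= 0:                            # reached the failure event
            return prob * np.mean(gx < 0)
        prob *= p0
        seeds, gseeds = x[order[:n_seeds]], gx[order[:n_seeds]]
        xs, gs = [], []
        # Grow each seed into a short Markov chain conditional on g < b.
        for xi, gi in zip(seeds, gseeds):
            for _ in range(n // n_seeds):
                cand = xi + rng.standard_normal(dim)
                # Componentwise accept by the standard-normal density ratio.
                accept = rng.random(dim) < np.exp(0.5 * (xi**2 - cand**2))
                cand = np.where(accept, cand, xi)
                gc = g(cand)
                if gc < b:                    # stay inside the current event
                    xi, gi = cand, gc
                xs.append(xi.copy())
                gs.append(gi)
        x, gx = np.array(xs), np.array(gs)
    return prob * np.mean(gx < 0)

# Toy limit state: failure when the sum of 10 standard normals exceeds 9.
gfun = lambda u: 9.0 - u.sum()
print(subset_simulation(gfun, dim=10))   # exact: Phi(-9/sqrt(10)) ≈ 2.2e-3
```

    With n = 1000 samples per level and p0 = 0.1, events with probabilities around 10⁻³ to 10⁻⁶ are reached with far fewer limit-state evaluations than direct MCS would need.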

  19. Dynamic decision-making for reliability and maintenance analysis of manufacturing systems based on failure effects

    Science.gov (United States)

    Zhang, Ding; Zhang, Yingjie

    2017-09-01

    A framework for reliability and maintenance analysis of job shop manufacturing systems is proposed in this paper. An efficient preventive maintenance (PM) policy based on failure effects analysis (FEA) is proposed. Subsequently, reliability evaluation and component importance measurement based on FEA are performed under the PM policy. A job shop manufacturing system is used to validate the reliability evaluation and the dynamic maintenance policy. The obtained results are compared with existing methods and the effectiveness is validated. Several issues that are often only vaguely understood in manufacturing system reliability analysis, such as network modelling, vulnerability identification, evaluation criteria for repairable systems, and the PM policy, are elaborated. This framework can support reliability optimisation and the rational allocation of maintenance resources in job shop manufacturing systems.

  20. Identification of Learning Processes by Means of Computer Graphics.

    Science.gov (United States)

    Sorensen, Birgitte Holm

    1993-01-01

    Describes a development project for the use of computer graphics and video in connection with an inservice training course for primary education teachers in Denmark. Topics addressed include research approaches to computers; computer graphics in learning processes; activities relating to computer graphics; the role of the teacher; and student…

  1. An Energy-Based Limit State Function for Estimation of Structural Reliability in Shock Environments

    Directory of Open Access Journals (Sweden)

    Michael A. Guthrie

    2013-01-01

    Full Text Available A limit state function is developed for the estimation of structural reliability in shock environments. This limit state function uses peak modal strain energies to characterize environmental severity and modal strain energies at failure to characterize the structural capacity. The Hasofer-Lind reliability index is briefly reviewed and its computation for the energy-based limit state function is discussed. Applications to two-degree-of-freedom mass-spring systems and to a simple finite element model are considered. For these examples, computation of the reliability index requires little effort beyond a modal analysis, but still accounts for relevant uncertainties in both the structure and the environment. For both examples, the reliability index is observed to agree well with the results of Monte Carlo analysis. In situations where fast, qualitative comparison of several candidate designs is required, the reliability index based on the proposed limit state function provides an attractive metric which can be used to compare and control reliability.
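    For a linear limit state in independent normal variables, the Hasofer-Lind index has a closed form, which makes for a compact worked example (the capacity and demand numbers below are toy values, not the paper's):

```python
from math import erfc, sqrt

def linear_reliability_index(mu_cap, sd_cap, mu_dem, sd_dem):
    """Hasofer-Lind index for the linear limit state g = capacity - demand
    with independent normal variables; exact in this special case."""
    beta = (mu_cap - mu_dem) / sqrt(sd_cap**2 + sd_dem**2)
    pf = 0.5 * erfc(beta / sqrt(2))          # P(g < 0) = Phi(-beta)
    return beta, pf

# Toy numbers: modal strain energy at failure vs. peak environmental demand.
beta, pf = linear_reliability_index(mu_cap=100.0, sd_cap=10.0,
                                    mu_dem=60.0, sd_dem=15.0)
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")   # beta ≈ 2.22, Pf ≈ 1.3e-2
```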

  2. Design and implementation of component reliability database management system for NPP

    International Nuclear Information System (INIS)

    Kim, S. H.; Jung, J. K.; Choi, S. Y.; Lee, Y. H.; Han, S. H.

    1999-01-01

    KAERI is constructing a component reliability database for Korean nuclear power plants. This paper describes the development of the data management tool that runs on this component reliability database. It runs in an intranet environment and is used to analyze failure modes and failure severities in order to compute component failure rates. Additional modules are under development to manage operation and test histories, together with algorithms for calculating component failure history and reliability.
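    The core computation such a database supports is a pooled failure-rate estimate and the resulting reliability; a minimal sketch (toy record counts, assuming the usual constant-rate exponential model) is:

```python
from math import exp

def component_failure_rate(n_failures, total_operating_hours):
    """Point estimate of a constant failure rate from pooled plant records."""
    return n_failures / total_operating_hours

def reliability(failure_rate, mission_hours):
    """Exponential reliability model R(t) = exp(-lambda * t)."""
    return exp(-failure_rate * mission_hours)

# Toy record: 3 failures over 250,000 pooled operating hours; 1-year mission.
lam = component_failure_rate(3, 250_000)
print(f"lambda = {lam:.2e}/h, R(8760 h) = {reliability(lam, 8760):.3f}")
```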

  3. A computational technique to identify the optimal stiffness matrix for a discrete nuclear fuel assembly model

    International Nuclear Information System (INIS)

    Park, Nam-Gyu; Kim, Kyoung-Joo; Kim, Kyoung-Hong; Suh, Jung-Min

    2013-01-01

    Highlights: ► An identification method of the optimal stiffness matrix for a fuel assembly structure is discussed. ► The least squares optimization method is introduced, and a closed form solution of the problem is derived. ► The method can be extended to systems with a limited number of modes. ► Identification error due to the perturbed mode shape matrix is analyzed. ► Verification examples show that the proposed procedure leads to a reliable solution. -- Abstract: A reactor core structural model which is used to evaluate the structural integrity of the core contains nuclear fuel assembly models. Since the reactor core consists of many nuclear fuel assemblies, the use of a refined fuel assembly model leads to a considerable amount of computing time for performing nonlinear analyses such as the prediction of seismically induced vibration behaviors. The computational time could be reduced by replacing the detailed fuel assembly model with a simplified model that has fewer degrees of freedom, but the dynamic characteristics of the detailed model must be maintained in the simplified model. Such a simplified model, based on an optimal design method, is proposed in this paper. That is, when a mass matrix and a mode shape matrix are given, the optimal stiffness matrix of a discrete fuel assembly model can be estimated by applying the least squares minimization method. The verification of the method is completed by comparing test results and simulation results. This paper shows that the simplified model's dynamic behaviors are quite similar to experimental results and that the suggested method is suitable for identifying a reliable mathematical model for fuel assemblies.
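    With a mass-normalized mode shape matrix (Φᵀ M Φ = I), the identification has the closed form K = M Φ Λ Φᵀ M; the snippet below sketches this with a toy 2-DOF system and is an illustration of the least-squares idea, not the paper's code:

```python
import numpy as np
from scipy.linalg import eigh

def stiffness_from_modes(M, Phi, omega2):
    """Closed-form stiffness reconstruction from a mass matrix and
    mass-normalized mode shapes (Phi^T M Phi = I): K = M Phi Lambda Phi^T M.
    Exact when all modes are kept; a truncated Phi gives the least-squares fit."""
    return M @ Phi @ np.diag(omega2) @ Phi.T @ M

# Toy 2-DOF check: recover a known K from its own modal data.
M = np.diag([2.0, 1.0])
K_true = np.array([[400.0, -200.0], [-200.0, 200.0]])
omega2, Phi = eigh(K_true, M)        # eigh mass-normalizes: Phi.T @ M @ Phi = I
K_est = stiffness_from_modes(M, Phi, omega2)
print(np.allclose(K_est, K_true))    # True
```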

  4. Major influence of interobserver reliability on polytrauma identification with the Injury Severity Score (ISS): Time for a centralised coding in trauma registries?

    Science.gov (United States)

    Maduz, Roman; Kugelmeier, Patrick; Meili, Severin; Döring, Robert; Meier, Christoph; Wahl, Peter

    2017-04-01

    The Abbreviated Injury Scale (AIS) and the Injury Severity Score (ISS) are increasingly widely used to assess trauma burden and to perform interhospital benchmarking through trauma registries. Since 2015, public resource allocation in Switzerland is even to be derived from such data. As every trauma centre is responsible for its own coding and data input, this study evaluates the interobserver reliability of AIS and ISS coding. Interobserver reliability of the AIS and ISS was analysed in a cohort of 50 consecutive severely injured patients treated in 2012 at our institution, coded retrospectively by 3 independent and specifically trained observers. Considering a cutoff of ISS≥16, only 38/50 patients (76%) were uniformly identified as polytraumatised or not. Increasing the cutoff to ≥20, this rose to 41/50 patients (82%). A difference in the AIS of ≥1 was present in 261 (16%) of possible codes. Excluding the vast majority of uninjured body regions, uniformly identical AIS severity values were attributed in 67/193 (35%) body regions, or 318/579 (55%) possible observer pairings. Injury severity all too often is neither identified correctly nor consistently when using the AIS. This leads to incorrect identification of severely injured patients using the ISS. Improving the consistency of coding through centralisation is recommended before scores based on the AIS are used for interhospital benchmarking and resource allocation in the treatment of severely injured patients.
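    For reference, the ISS is derived mechanically from the AIS codes: the sum of squares of the three highest AIS severities, each from a different ISS body region, with the ISS fixed at 75 if any region scores AIS 6. A generic implementation:

```python
def injury_severity_score(ais_by_region):
    """ISS: sum of squares of the three highest AIS severities, each from a
    different ISS body region; any AIS of 6 sets the ISS to the maximum of 75.

    `ais_by_region` maps an ISS body region to its highest AIS severity, e.g.
    {"head_neck": 4, "chest": 3, "extremities": 2}."""
    severities = list(ais_by_region.values())
    if any(s == 6 for s in severities):
        return 75
    top3 = sorted(severities, reverse=True)[:3]
    return sum(s * s for s in top3)

# ISS >= 16 is the polytrauma cutoff discussed above.
print(injury_severity_score({"head_neck": 4, "chest": 3, "extremities": 2}))  # 29
print(injury_severity_score({"head_neck": 3, "chest": 3}))                    # 18
```

    The interobserver problem reported above arises upstream of this arithmetic, in assigning the AIS severities themselves.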

  5. Maintenance management of railway infrastructures based on reliability analysis

    International Nuclear Information System (INIS)

    Macchi, Marco; Garetti, Marco; Centrone, Domenico; Fumagalli, Luca; Piero Pavirani, Gian

    2012-01-01

    Railway infrastructure maintenance plays a crucial role for rail transport. It aims at guaranteeing the safety of operations and the availability of railway tracks and related equipment for traffic regulation. Moreover, it represents a major cost of rail transport operations. The increased competition in the traffic market therefore calls for maintenance improvement, aiming at reducing maintenance expenditure while preserving the safety of operations. This issue is addressed by the methodology presented in the paper. The first step of the methodology consists of a family-based approach for equipment reliability analysis; its purpose is the identification of families of railway items which can be given the same reliability targets. The second step builds the reliability model of the railway system to identify the most critical items, given a required service level for the transportation system. The two methods have been implemented and tested in practical case studies, in the context of Rete Ferroviaria Italiana, the Italian public limited company for railway transportation.

  6. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...

  7. New methodology for a person identification system

    Indian Academy of Sciences (India)

    Abstract. Reliable person identification is a key factor for any safety measure. Unlike other biometrics such as the palm, retina, gait, face and fingerprints, the characteristics of the iris are stable over a person's lifetime. Iris patterns are chaotically distributed and well suited for recognizing persons throughout their lifetime with…

  8. MSblender: A probabilistic approach for integrating peptide identifications from multiple database search engines.

    Science.gov (United States)

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I; Marcotte, Edward M

    2011-07-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers from limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high-scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for every possible PSM and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. The increased identifications increment spectral counts for most proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that the enhanced quantification contributes to improved sensitivity in differential expression analyses.
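    Once every PSM carries a posterior probability, the false discovery rate at a given acceptance threshold can be estimated directly; the sketch below shows the generic posterior-based estimate, not MSblender's actual code:

```python
def fdr_at_threshold(posteriors, threshold):
    """Estimate the false discovery rate among PSMs accepted at or above a
    posterior-probability threshold: the mean of (1 - posterior) over the
    accepted set. A standard posterior-based FDR estimate."""
    accepted = [p for p in posteriors if p >= threshold]
    if not accepted:
        return 0.0, 0
    fdr = sum(1.0 - p for p in accepted) / len(accepted)
    return fdr, len(accepted)

# Toy combined PSM probabilities, as if pooled from several search engines.
probs = [0.99, 0.97, 0.95, 0.90, 0.80, 0.60, 0.40, 0.20]
for t in (0.9, 0.6):
    fdr, n = fdr_at_threshold(probs, t)
    print(f"threshold {t}: {n} PSMs at estimated FDR {fdr:.3f}")
```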

  9. Superior model for fault tolerance computation in designing nano-sized circuit systems

    Energy Technology Data Exchange (ETDEWEB)

    Singh, N. S. S., E-mail: narinderjit@petronas.com.my; Muthuvalu, M. S., E-mail: msmuthuvalu@gmail.com [Fundamental and Applied Sciences Department, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, Perak (Malaysia); Asirvadam, V. S., E-mail: vijanth-sagayan@petronas.com.my [Electrical and Electronics Engineering Department, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, Perak (Malaysia)

    2014-10-24

    As CMOS technology scales into the nanometre regime, reliability becomes a decisive subject in the design methodology of nano-sized circuit systems. As a result, several computational approaches have been developed to compute and evaluate the reliability of desired nano-electronic circuits. The process of computing reliability becomes very troublesome and time consuming as the computational complexity builds up with the desired circuit size. Therefore, being able to measure reliability quickly and accurately is fast becoming necessary in designing modern logic integrated circuits. For this purpose, the paper first describes the development of an automated reliability evaluation tool based on the generalization of the Probabilistic Gate Model (PGM) and the Boolean Difference-based Error Calculator (BDEC) model. The Matlab-based tool allows users to significantly speed up the task of reliability analysis for a very large number of nano-electronic circuits. Second, using the developed automated tool, the paper presents a comparative study of reliability computation and evaluation by the PGM and BDEC models for different implementations of same-functionality circuits. Based on the reliability analysis, BDEC gives exact and transparent reliability measures, but as the complexity of the same-functionality circuits with respect to gate error increases, the reliability measure by BDEC tends to be lower than that by PGM. The lower reliability measure by BDEC is explained in this paper using the distribution of different input signal patterns over time for same-functionality circuits. Simulation results conclude that the reliability measure by BDEC depends not only on faulty gates but also on circuit topology, the probability of input signals being one or zero, and the probability of error on signal lines.
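    The flavour of PGM-style reliability computation can be reproduced by exhaustively enumerating gate-error configurations for a tiny netlist, as below; the netlist format and the 5% gate error are assumptions for the sketch, not the paper's Matlab tool:

```python
from itertools import product

def circuit_reliability(inputs, gates, eps):
    """Probability that a small circuit's output equals its fault-free value
    when each gate independently inverts its output with probability eps.

    `gates`: list of (name, op, in1, in2); the last gate drives the output.
    A toy PGM-style calculator by exhaustive enumeration."""
    ops = {"AND": lambda a, b: a & b, "OR": lambda a, b: a | b,
           "NAND": lambda a, b: 1 - (a & b)}

    def evaluate(faults):
        sig = dict(inputs)
        for (name, op, a, b), flip in zip(gates, faults):
            sig[name] = ops[op](sig[a], sig[b]) ^ flip
        return sig[gates[-1][0]]

    golden = evaluate((0,) * len(gates))     # fault-free output
    rel = 0.0
    for faults in product((0, 1), repeat=len(gates)):
        p = 1.0
        for f in faults:
            p *= eps if f else (1 - eps)
        if evaluate(faults) == golden:
            rel += p
    return rel

# Toy 2-gate circuit: y = NAND(AND(a, b), c), with a=b=c=1 and 5% gate error.
netlist = [("n1", "AND", "a", "b"), ("y", "NAND", "n1", "c")]
print(circuit_reliability({"a": 1, "b": 1, "c": 1}, netlist, eps=0.05))  # 0.905
```

    The toy run already shows error compensation: when both gates fail, the output flips twice and is correct again, which is why reliability depends on topology and input patterns, not just on the number of faulty gates.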

  10. Superior model for fault tolerance computation in designing nano-sized circuit systems

    International Nuclear Information System (INIS)

    Singh, N. S. S.; Muthuvalu, M. S.; Asirvadam, V. S.

    2014-01-01

    As CMOS technology scales into the nanometre regime, reliability becomes a decisive subject in the design methodology of nano-sized circuit systems. As a result, several computational approaches have been developed to compute and evaluate the reliability of desired nano-electronic circuits. The process of computing reliability becomes very troublesome and time consuming as the computational complexity builds up with the desired circuit size. Therefore, being able to measure reliability quickly and accurately is fast becoming necessary in designing modern logic integrated circuits. For this purpose, the paper first describes the development of an automated reliability evaluation tool based on the generalization of the Probabilistic Gate Model (PGM) and the Boolean Difference-based Error Calculator (BDEC) model. The Matlab-based tool allows users to significantly speed up the task of reliability analysis for a very large number of nano-electronic circuits. Second, using the developed automated tool, the paper presents a comparative study of reliability computation and evaluation by the PGM and BDEC models for different implementations of same-functionality circuits. Based on the reliability analysis, BDEC gives exact and transparent reliability measures, but as the complexity of the same-functionality circuits with respect to gate error increases, the reliability measure by BDEC tends to be lower than that by PGM. The lower reliability measure by BDEC is explained in this paper using the distribution of different input signal patterns over time for same-functionality circuits. Simulation results conclude that the reliability measure by BDEC depends not only on faulty gates but also on circuit topology, the probability of input signals being one or zero, and the probability of error on signal lines.

  11. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

    Full Text Available Eurocode describes the 'index of reliability' as a measure of structural reliability, related to the 'probability of failure'. This paper is focused on the assessment of this index for a reinforced concrete bridge pier. Reliability concepts are rarely used explicitly in the design of structures, but they provide a deeper understanding of structural engineering problems. Some of the main methods for the estimation of the probability of failure are exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo Simulation is used in this paper because it offers a very good tool for the estimation of probability in multivariate functions: complicated probability and statistics problems are solved through computer-aided simulation of a large number of tests. The procedure of structural reliability assessment for the bridge pier and the comparison with the partial factor method of the Eurocodes are demonstrated in this paper.
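    A crude Monte Carlo estimate of the failure probability and the corresponding index of reliability for a simple pier margin might look like this (all numbers are illustrative, not the paper's):

```python
import numpy as np
from scipy.stats import norm

# Crude Monte Carlo for a toy pier margin g = R - S (numbers illustrative).
rng = np.random.default_rng(0)
n = 200_000
resistance = rng.normal(5000, 600, n)        # capacity, kNm
load_effect = rng.normal(3000, 500, n)       # demand, kNm
pf = np.mean(resistance - load_effect < 0)   # P[g < 0]
beta = -norm.ppf(pf)                         # index of reliability
print(f"P_f ≈ {pf:.4f}, beta ≈ {beta:.2f}")  # exact: P_f ≈ 0.0052, beta ≈ 2.56
```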

  12. Development of rapid phenotypic system for the identification

    Indian Academy of Sciences (India)

    Rapid and accurate identification of bacterial pathogens is a fundamental goal of clinical microbiology. The diagnosis and surveillance of diseases depend, to a great extent, on laboratory services, which cannot function without effective, reliable reagents and diagnostics. Despite the advancement in microbiology …

  13. System Identification with Quantized Observations

    CERN Document Server

    Wang, Le Yi; Zhang, Jifeng; Zhao, Yanlong

    2010-01-01

    This book presents recently developed methodologies that utilize quantized information in system identification and explores their potential in extending control capabilities for systems with limited sensor information or networked systems. The results of these methodologies can be applied to signal processing and control design of communication and computer networks, sensor networks, mobile agents, coordinated data fusion, remote sensing, telemedicine, and other fields in which noise-corrupted quantized data need to be processed. Providing a comprehensive coverage of quantized identification…

  14. Impact of staffing parameters on operational reliability

    International Nuclear Information System (INIS)

    Hahn, H.A.; Houghton, F.K.

    1993-01-01

    This paper reports on a project related to human resource management of the Department of Energy's (DOE's) High-Level Waste (HLW) Tank program. Safety and reliability of waste tank operations is impacted by several issues, including not only the design of the tanks themselves, but also how operations and operational personnel are managed. As demonstrated by management assessments performed by the Tiger Teams, DOE believes that the effective use of human resources impacts environment, safety, and health concerns. For the purposes of the current paper, human resource management activities are identified as 'Staffing' and include the process of developing the functional responsibilities and qualifications of technical and administrative personnel. This paper discusses the importance of staff plans and management in the overall view of safety and reliability, the work activities and procedures associated with the project, and a review of the results of these activities, including a summary of the literature and a preliminary analysis of the data. We conclude that although identification of staffing issues and the development of staffing plans contributes to the overall reliability and safety of the HLW tanks, the relationship is not well understood and is in need of further development.

  15. Impact of staffing parameters on operational reliability

    International Nuclear Information System (INIS)

    Hahn, H.A.; Houghton, F.K.

    1993-01-01

    This paper reports on a project related to human resource management of the Department of Energy (DOEs) High-Level Waste (HLW) Tank program. Safety and reliability of waste tank operations is impacted by several issues, including not only the design of the tanks themselves, but also how operations and operational personnel are managed. As demonstrated by management assessments performed by the Tiger Teams, DOE believes that the effective use of human resources impacts environment, safety, and health concerns. For the purposes of the current paper, human resource management activities are identified as 'Staffing' and include the process of developing the functional responsibilities and qualifications of technical and administrative personnel. This paper discusses the importance of staff plans and management in the overall view of safety and reliability, the work activities and procedures associated with the project, a review of the results of these activities, including a summary of the literature and a preliminary analysis of the data. We conclude that, although identification of staffing issues and the development of staffing plans contributes to the overall reliability and safety of the HLW tanks, the relationship is not well understood and is in need of further development

  16. Root cause analysis in support of reliability enhancement of engineering components

    International Nuclear Information System (INIS)

    Kumar, Sachin; Mishra, Vivek; Joshi, N.S.; Varde, P.V.

    2014-01-01

    Reliability based methods have been widely used for the safety assessment of plant systems, structures and components. These methods provide a quantitative estimate of system reliability but do not give insight into the failure mechanism. Understanding the failure mechanism is a must to avoid the recurrence of events and to enhance system reliability. Root cause analysis provides a tool for gaining detailed insight into the causes of component failure, with particular attention to the identification of faults in component design, operation, surveillance, maintenance, training, procedures and policies which must be improved to prevent the repetition of incidents. Root cause analysis also helps in developing Probabilistic Safety Analysis models. A probabilistic precursor study provides a complement to the root cause analysis approach in event analysis by focusing on how an event might have developed adversely. This paper discusses root cause analysis methodologies and their application in specific case studies for the enhancement of system reliability. (author)

  17. Hardware and software maintenance strategies for upgrading vintage computers

    International Nuclear Information System (INIS)

    Wang, B.C.; Buijs, W.J.; Banting, R.D.

    1992-01-01

    The paper focuses on the maintenance of computer hardware and software for digital control computers (DCC). Specific designs and problems related to various maintenance strategies are reviewed. A foundation was required for a reliable computer maintenance and upgrading program to provide operation of the DCC with high availability and reliability for 40 years. This involved a carefully planned and executed maintenance and upgrading program with complementary hardware and software strategies. The computer system was designed on a modular basis, with large sections easily replaceable, to facilitate maintenance and improve the availability of the system. Advances in computer hardware have made it possible to replace DCC peripheral devices with reliable, inexpensive, and widely available components from PC-based systems (PC = personal computer). By providing a high speed link from the DCC to a PC, it is now possible to use many commercial software packages to process data from the plant. 1 fig

  18. A double-loop adaptive sampling approach for sensitivity-free dynamic reliability analysis

    International Nuclear Information System (INIS)

    Wang, Zequn; Wang, Pingfeng

    2015-01-01

    Dynamic reliability measures the reliability of an engineered system considering time-variant operating conditions and component deterioration. Due to high computational costs, conducting dynamic reliability analysis at an early system design stage remains challenging. This paper presents a confidence-based meta-modeling approach, referred to as double-loop adaptive sampling (DLAS), for efficient sensitivity-free dynamic reliability analysis. DLAS builds a Gaussian process (GP) model sequentially to approximate extreme system responses over time, so that Monte Carlo simulation (MCS) can be employed directly to estimate dynamic reliability. A generic confidence measure is developed to evaluate the accuracy of dynamic reliability estimation in the MCS approach based on the developed GP models. A double-loop adaptive sampling scheme is developed to efficiently update the GP model in a sequential manner, by considering system input variables and time concurrently in two sampling loops. The model updating process using the developed sampling scheme can be terminated once the user-defined confidence target is satisfied. The developed DLAS approach eliminates the computationally expensive sensitivity analysis process, thus substantially improving the efficiency of dynamic reliability analysis. Three case studies are used to demonstrate the efficacy of DLAS for dynamic reliability analysis. - Highlights: • Developed a novel adaptive sampling approach for dynamic reliability analysis. • Developed a new metric to quantify the accuracy of dynamic reliability estimation. • Developed a new sequential sampling scheme to efficiently update surrogate models. • Three case studies were used to demonstrate the efficacy of the new approach. • Case study results showed substantially enhanced efficiency with high accuracy
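    The paper's double-loop scheme is more elaborate, but the core surrogate idea, fit a GP to the limit state, add samples where the sign of the response is most ambiguous, then run MCS on the surrogate, can be sketched in a generic single-loop form; the U learning function below is a common stand-in from the surrogate-reliability literature, not DLAS's confidence measure:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def adaptive_gp_reliability(g, lo, hi, n_init=8, n_add=15, n_mc=50_000, seed=0):
    """Fit a GP surrogate to the limit state g on [lo, hi], repeatedly add the
    candidate with the smallest U = |mean|/std (most ambiguous sign), then
    estimate P[g < 0] by MCS on the surrogate. Generic single-loop sketch."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, size=(n_init, 1))
    y = np.array([g(x[0]) for x in X])
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    cand = rng.uniform(lo, hi, size=(2000, 1))
    for _ in range(n_add):
        gp.fit(X, y)
        mu, sd = gp.predict(cand, return_std=True)
        xstar = cand[np.argmin(np.abs(mu) / np.maximum(sd, 1e-9))]
        X = np.vstack([X, xstar])
        y = np.append(y, g(xstar[0]))
    gp.fit(X, y)                                   # refit on the final design
    mc = rng.uniform(lo, hi, size=(n_mc, 1))
    return np.mean(gp.predict(mc) < 0)

# Toy limit state in one uniform variable on [0, 10]: failure when x > 9.
print(adaptive_gp_reliability(lambda x: 9.0 - x, 0.0, 10.0))   # ≈ 0.10
```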

  19. Reliable identification at the species level of Brucella isolates with MALDI-TOF-MS

    NARCIS (Netherlands)

    Lista, F.; Reubsaet, F.A.G.; Santis, R. de; Parchen, R.R.; Jong, A.L. de; Kieboom, J.; Laaken, A.L. van der; Voskamp-Visser, I.A.I.; Fillo, S.; Jansen, H.J. de; Plas, J. van der; Paauw, A.

    2011-01-01

    Background: The genus Brucella contains highly infectious species that are classified as biological threat agents. The timely detection and identification of the microorganism involved is essential for an effective response not only to biological warfare attacks but also to natural outbreaks.

  20. Reliability-oriented energy storage sizing in wind power systems

    DEFF Research Database (Denmark)

    Qin, Zian; Liserre, Marco; Blaabjerg, Frede

    2014-01-01

    Energy storage can be used to suppress the power fluctuations in wind power systems, and thereby reduce the thermal excursion and improve the reliability. Since the cost of energy storage in large-power applications is high, it is crucial to have a better understanding of the relationship between the size of the energy storage and the reliability benefit it can generate. Therefore, a reliability-oriented energy storage sizing approach is proposed for wind power systems, where the power, energy, cost and control strategy of the energy storage are all taken into account. With the proposed approach, the computational effort is reduced and the impact of the energy storage system on the reliability of the wind power converter can be quantified.