WorldWideScience

Sample records for reliable computational identification

  1. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  2. Evaluation of the reliability concerning the identification of human factors as contributing factors by a computer supported event analysis (CEA)

    International Nuclear Information System (INIS)

    Wilpert, B.; Maimer, H.; Loroff, C.

    2000-01-01

    The project's objective was to evaluate the reliability with which Human Factors can be identified as contributing factors by a computer-supported event analysis (CEA). CEA is a computer version of SOL (Safety through Organizational Learning). The first step included interviews with experts from the nuclear power industry and an evaluation of existing computer-supported event analysis methods. This information was combined into a requirements profile for the CEA software. The next step was the implementation of the software in an iterative process of evaluation. The project was completed by testing the CEA software. The testing demonstrated that contributing factors can be identified validly with CEA. In addition, CEA received very positive feedback from the experts. (orig.)

  3. Reliability in the utility computing era: Towards reliable Fog computing

    DEFF Research Database (Denmark)

    Madsen, Henrik; Burtschy, Bernard; Albeanu, G.

    2013-01-01

    This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm, a non-trivial extension of the Cloud, is considered, and the reliability of networks of smart devices is discussed. Combining the reliability requirements of the grid and cloud paradigms with the reliability requirements of networks of sensors and actuators, it follows that designing a reliable Fog computing platform is feasible.

  4. Reliable computation from contextual correlations

    Science.gov (United States)

    Oestereich, André L.; Galvão, Ernesto F.

    2017-12-01

    An operational approach to the study of computation based on correlations considers black boxes with one-bit inputs and outputs, controlled by a limited classical computer capable only of performing sums modulo two. In this setting, it was shown that noncontextual correlations do not provide any extra computational power, while contextual correlations were found to be necessary for the deterministic evaluation of nonlinear Boolean functions. Here we investigate the requirements for reliable computation in this setting; that is, the evaluation of any Boolean function with success probability bounded away from 1/2. We show that bipartite CHSH quantum correlations suffice for reliable computation. We also prove that an arbitrarily small violation of a multipartite Greenberger-Horne-Zeilinger noncontextuality inequality also suffices for reliable computation.
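
    A minimal formalization of the notion of reliable computation used above, in notation that is mine rather than the paper's: a correlation resource enables reliable computation if a single margin above chance works for every Boolean function and input.

    ```latex
    % Reliable computation (notation assumed): some fixed \delta > 0 works uniformly,
    % i.e. the output y agrees with f(x) with probability bounded away from 1/2.
    \exists\, \delta > 0 \;\; \forall f : \{0,1\}^n \to \{0,1\} \;\; \forall x \in \{0,1\}^n :
    \qquad \Pr\big[\, y = f(x) \,\big] \;\ge\; \tfrac{1}{2} + \delta
    ```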

  5. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well-known class of problems which almost certainly has no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored.
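
    The decomposition techniques mentioned above exploit structure, such as series and parallel subsystems, whose exact reliability is cheap to compute. A minimal sketch, assuming independent components and an illustrative series-parallel layout that is not taken from the paper:

    ```python
    # Minimal sketch: reliability of a series-parallel system with independent
    # components, evaluated by decomposing it into series and parallel blocks.
    # Component reliabilities and the layout are illustrative assumptions.

    def series(reliabilities):
        """All subsystems must work: R = product of R_i."""
        r = 1.0
        for ri in reliabilities:
            r *= ri
        return r

    def parallel(reliabilities):
        """At least one subsystem must work: R = 1 - product of (1 - R_i)."""
        q = 1.0
        for ri in reliabilities:
            q *= (1.0 - ri)
        return 1.0 - q

    # Example: two redundant pumps (0.9 each) in series with a single valve (0.99).
    system_reliability = series([parallel([0.9, 0.9]), 0.99])
    print(f"System reliability: {system_reliability:.4f}")  # 0.9801
    ```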

  6. Identification of computer graphics objects

    Directory of Open Access Journals (Sweden)

    Rossinskyi Yu.M.

    2016-04-01

    The article is devoted to the use of computer graphics methods in problems of creating drawings, charts, drafting, etc. The widespread use of these methods requires the development of efficient algorithms for identifying the objects in drawings. The article analyzes existing algorithms for this problem and considers the possibility of reducing processing time by using graphics editing operations. Editing involves such operations as copying, moving and deleting specified objects in images, and these operations allow reliable methods for identifying the images of objects to be used. Information on the composition of an object's image, along with information about its identity and color, should include information about the spatial location and other characteristics of the object (the thickness and style of contour lines, fill style, and so on). To enable pixel-level image analysis to structure this information, the identifiers of image objects must be encoded in their initial color. The article shows the results of implementing the algorithm for encoding object identifiers. To simplify the process of building drawings of any kind and to reduce the effort involved, a method of identifying drawing objects is proposed based on using the object's color as its identifying information.
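
    The closing idea above, using an object's color as its identifier, can be illustrated with a small sketch. The 24-bit RGB packing scheme below is an assumption made for illustration, not the article's exact encoding algorithm:

    ```python
    # Hypothetical sketch: pack an integer object ID into a unique 24-bit RGB color
    # and recover it from a sampled pixel, so drawing objects can be identified
    # directly from the rendered image.

    def id_to_rgb(object_id: int) -> tuple:
        """Pack a 24-bit object ID into an (R, G, B) triple."""
        if not 0 <= object_id < 2**24:
            raise ValueError("object ID must fit in 24 bits")
        return ((object_id >> 16) & 0xFF, (object_id >> 8) & 0xFF, object_id & 0xFF)

    def rgb_to_id(rgb: tuple) -> int:
        """Recover the object ID from a pixel color."""
        r, g, b = rgb
        return (r << 16) | (g << 8) | b

    assert rgb_to_id(id_to_rgb(123456)) == 123456
    ```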

  7. Reliability and Availability of Cloud Computing

    CERN Document Server

    Bauer, Eric

    2012-01-01

    A holistic approach to service reliability and availability of cloud computing. Reliability and Availability of Cloud Computing provides IS/IT system and solution architects, developers, and engineers with the knowledge needed to assess the impact of virtualization and cloud computing on service reliability and availability. It reveals how to select the most appropriate design for reliability diligence to assure that user expectations are met. Organized in three parts (basics, risk analysis, and recommendations), this resource is accessible to readers of diverse backgrounds and experience levels.

  8. BUILDING MODEL ANALYSIS APPLICATIONS WITH THE JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY (JUPITER) API

    Science.gov (United States)

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input ...

  9. Reliable computer systems design and evaluation

    CERN Document Server

    Siewiorek, Daniel

    2014-01-01

    Enhance your hardware/software reliability. Enhancement of system reliability has been a major concern of computer users and designers, and this major revision of the 1982 classic meets users' continuing need for practical information on this pressing topic. Included are case studies of reliable systems from manufacturers such as Tandem, Stratus, IBM, and Digital, as well as coverage of special systems such as the Galileo Orbiter fault protection system and AT&T telephone switching processors.

  10. Reliability Generalization of the Alcohol Use Disorder Identification Test.

    Science.gov (United States)

    Shields, Alan L.; Caruso, John C.

    2002-01-01

    Evaluated the reliability of scores from the Alcohol Use Disorders Identification Test (AUDIT; J. Saunders and others, 1993) in a reliability generalization study based on 17 empirical journal articles. Results show AUDIT scores to be generally reliable for basic assessment. (SLD)

  11. Towards higher reliability of CMS computing facilities

    International Nuclear Information System (INIS)

    Bagliesi, G; Bloom, K; Brew, C; Flix, J; Kreuzer, P; Sciabà, A

    2012-01-01

    The CMS experiment has adopted a computing system where resources are distributed worldwide in more than 50 sites. The operation of the system requires a stable and reliable behaviour of the underlying infrastructure. CMS has established procedures to extensively test all relevant aspects of a site and their capability to sustain the various CMS computing workflows at the required scale. The Site Readiness monitoring infrastructure has been instrumental in understanding how the system as a whole was improving towards LHC operations, measuring the reliability of sites when running CMS activities, and providing sites with the information they need to troubleshoot any problem. This contribution reviews the complete automation of the Site Readiness program, with the description of monitoring tools and their inclusion into the Site Status Board (SSB), the performance checks, the use of tools like HammerCloud, and the impact in improving the overall reliability of the Grid from the point of view of the CMS computing system. These results are used by CMS to select good sites to conduct workflows, in order to maximize workflow efficiency. The performance of the sites against these tests during the first years of LHC running is also reviewed.

  12. JUPITER PROJECT - JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY

    Science.gov (United States)

    The JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) project builds on the technology of two widely used codes for sensitivity analysis, data assessment, calibration, and uncertainty analysis of environmental models: PEST and UCODE.

  13. The reliability of tablet computers in depicting maxillofacial radiographic landmarks

    Energy Technology Data Exchange (ETDEWEB)

    Tadinada, Aditya; Mahdian, Mina; Sheth, Sonam; Chandhoke, Taranpreet K.; Gopalakrishna, Aadarsh; Potluri, Anitha; Yadav, Sumit [University of Connecticut School of Dental Medicine, Farmington (United States)

    2015-09-15

    This study was performed to evaluate the reliability of the identification of anatomical landmarks in panoramic and lateral cephalometric radiographs on a standard medical grade picture archiving communication system (PACS) monitor and a tablet computer (iPad 5). A total of 1000 radiographs, including 500 panoramic and 500 lateral cephalometric radiographs, were retrieved from the de-identified dataset of the archive of the Section of Oral and Maxillofacial Radiology of the University of Connecticut School of Dental Medicine. Major radiographic anatomical landmarks were independently reviewed by two examiners on both displays. The examiners initially reviewed ten panoramic and ten lateral cephalometric radiographs using each imaging system, in order to verify interoperator agreement in landmark identification. The images were scored on a four-point scale reflecting their diagnostic image quality and exposure level. Statistical analysis showed no significant difference between the two displays regarding the visibility and clarity of the landmarks in either the panoramic or cephalometric radiographs. Tablet computers can reliably show anatomical landmarks in panoramic and lateral cephalometric radiographs.
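
    Agreement between examiners (and between displays) in studies of this kind is often summarized with Cohen's kappa; the abstract does not state which agreement statistic was used, so the sketch below is generic and the ratings are invented:

    ```python
    # Generic Cohen's kappa for two raters scoring the same radiographs on a
    # 4-point scale (the ratings are invented, not the study's data).
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        n = len(rater_a)
        categories = set(rater_a) | set(rater_b)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
        return (observed - expected) / (1 - expected)

    print(round(cohens_kappa([4, 3, 4, 2, 4, 3], [4, 3, 3, 2, 4, 4]), 3))
    ```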

  14. Reliability and protection against failure in computer systems

    International Nuclear Information System (INIS)

    Daniels, B.K.

    1979-01-01

    Computers are being increasingly integrated into the control and safety systems of large and potentially hazardous industrial processes. This development introduces problems which are particular to computer systems and opens the way to new techniques of solving conventional reliability and availability problems. References to the developing fields of software reliability, human factors and software design are given, and these subjects are related, where possible, to the quantified assessment of reliability. Original material is presented in the areas of reliability growth and computer hardware failure data. The report draws on the experience of the National Centre of Systems Reliability in assessing the capability and reliability of computer systems both within the nuclear industry, and from the work carried out in other industries by the Systems Reliability Service. (author)

  15. Computer-aided reliability and risk assessment

    International Nuclear Information System (INIS)

    Leicht, R.; Wingender, H.J.

    1989-01-01

    Activities in the fields of reliability and risk analyses have led to the development of particular software tools which now are combined in the PC-based integrated CARARA system. The options available in this system cover a wide range of reliability-oriented tasks, like organizing raw failure data in the component/event data bank FDB, performing statistical analysis of those data with the program FDA, managing the resulting parameters in the reliability data bank RDB, and performing fault tree analysis with the fault tree code FTL or evaluating the risk of toxic or radioactive material release with the STAR code. (orig.)

  16. Planning is not sufficient - Reliable computers need good requirements specifications

    International Nuclear Information System (INIS)

    Matras, J.R.

    1992-01-01

    Computer system reliability is the assurance that a computer system will perform its functions when required to do so. To ensure such reliability, it is important to plan the activities needed for computer system development. These development activities, in turn, require a Computer Quality Assurance Plan (CQAP) that provides the following: a Configuration Management Plan, a Verification and Validation (V and V) Plan, documentation requirements, a defined life cycle, review requirements, and organizational responsibilities. These items are necessary for system reliability; ultimately, however, they are not enough. Development of a reliable system is dependent on the requirements specification. This paper discusses how to use existing industry standards to develop a CQAP. In particular, the paper emphasizes the importance of the requirements specification and of methods for establishing reliability goals. The paper also describes how the revision of ANSI/IEEE-ANS-7-4.3.2, Application Criteria for Digital Computer Systems of Nuclear Power Generating Stations, has addressed these issues.

  17. CADRIGS--computer aided design reliability interactive graphics system

    International Nuclear Information System (INIS)

    Kwik, R.J.; Polizzi, L.M.; Sticco, S.; Gerrard, P.B.; Yeater, M.L.; Hockenbury, R.W.; Phillips, M.A.

    1982-01-01

    An integrated reliability analysis program combining graphic representation of fault trees, automated data base loadings and reference, and automated construction of reliability code input files was developed. The functional specifications for CADRIGS, the computer aided design reliability interactive graphics system, are presented. Previously developed fault tree segments used in auxiliary feedwater system safety analysis were constructed on CADRIGS and, when combined, yielded results identical to those resulting from manual input to the same reliability codes

  18. Assessment of physical server reliability in multi cloud computing system

    Science.gov (United States)

    Kalyani, B. J. D.; Rao, Kolasani Ramchand H.

    2018-04-01

    Business organizations nowadays work with more than one cloud provider. By spreading cloud deployment across multiple service providers, they create space for competitive prices that minimize the burden on enterprise spending budgets. To assess the software reliability of a multi-cloud application, a layered software reliability assessment paradigm is considered with three levels of abstraction: the application layer, the virtualization layer, and the server layer. The reliability of each layer is assessed separately and combined to obtain the reliability of the multi-cloud computing application. In this paper, we focus on how to assess the reliability of the server layer, with the required algorithms, and explore the steps in the assessment of server reliability.
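
    A common way to combine per-layer reliabilities when every layer must function is a simple series product; treating the application, virtualization and server layers this way is an assumption for illustration, not necessarily the paper's exact model:

    ```python
    # Hedged sketch: combine per-layer reliabilities of a multi-cloud stack as a
    # series system (all layers must work). Layer values are invented examples.

    def stack_reliability(layer_reliabilities):
        r = 1.0
        for layer in layer_reliabilities:
            r *= layer
        return r

    layers = {"application": 0.995, "virtualization": 0.999, "server": 0.990}
    print(f"Multi-cloud application reliability: {stack_reliability(layers.values()):.4f}")
    ```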

  19. Estimating the reliability of eyewitness identifications from police lineups.

    Science.gov (United States)

    Wixted, John T; Mickes, Laura; Dunn, John C; Clark, Steven E; Wells, William

    2016-01-12

    Laboratory-based mock crime studies have often been interpreted to mean that (i) eyewitness confidence in an identification made from a lineup is a weak indicator of accuracy and (ii) sequential lineups are diagnostically superior to traditional simultaneous lineups. Largely as a result, juries are increasingly encouraged to disregard eyewitness confidence, and up to 30% of law enforcement agencies in the United States have adopted the sequential procedure. We conducted a field study of actual eyewitnesses who were assigned to simultaneous or sequential photo lineups in the Houston Police Department over a 1-y period. Identifications were made using a three-point confidence scale, and a signal detection model was used to analyze and interpret the results. Our findings suggest that (i) confidence in an eyewitness identification from a fair lineup is a highly reliable indicator of accuracy and (ii) if there is any difference in diagnostic accuracy between the two lineup formats, it likely favors the simultaneous procedure.
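
    As a minimal illustration of the signal-detection family of models used to analyse such data, discriminability (d') can be computed from a hit rate and a false-alarm rate; the numbers below are invented and are not the study's data:

    ```python
    # Illustrative signal-detection computation: d' from a hit rate and a
    # false-alarm rate (values are invented, not the field-study results).
    from statistics import NormalDist

    def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
        z = NormalDist().inv_cdf
        return z(hit_rate) - z(false_alarm_rate)

    print(f"d' = {d_prime(0.80, 0.20):.2f}")  # larger d' = better discrimination
    ```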

  20. Comparative reliability of cheiloscopy and palatoscopy in human identification

    Directory of Open Access Journals (Sweden)

    Sharma Preeti

    2009-01-01

    Background: Establishing a person's identity in postmortem scenarios can be a very difficult process. Dental records, fingerprint and DNA comparisons are probably the most common techniques used in this context, allowing fast and reliable identification processes. However, under certain circumstances they cannot always be used; sometimes it is necessary to apply different and less well-known techniques. In forensic identification, lip prints and palatal rugae patterns can lead us to important information and help in a person's identification. This study aims to ascertain the use of lip prints and the palatal rugae pattern in identification and sex differentiation. Materials and Methods: A total of 100 subjects, 50 males and 50 females, were selected from among the students of Subharti Dental College, Meerut. The materials used to record lip prints were lipstick, bond paper, cellophane tape, a brush for applying the lipstick, and a magnifying lens. To study palatal rugae, alginate impressions were taken and the dental casts analyzed for their various patterns. Results: Statistical analysis (applying the Z-test for proportions) showed a significant difference for type I, I', IV and V lip patterns (P < 0.05) in males and females, while no significant difference was observed for the palatal rugae patterns (P > 0.05). Conclusion: This study not only showed that palatal rugae and lip prints are unique to an individual, but also that lip prints are more reliable for recognition of the sex of an individual.
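
    The Z-test for proportions used in the analysis compares the frequency of a pattern between two groups; a generic two-proportion z-test is sketched below with hypothetical counts rather than the study's data:

    ```python
    # Generic two-proportion z-test (counts are hypothetical, not the study's data).
    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z(successes_1, n1, successes_2, n2):
        p1, p2 = successes_1 / n1, successes_2 / n2
        pooled = (successes_1 + successes_2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return z, p_value

    # e.g. a lip-print type seen in 18 of 50 males versus 7 of 50 females
    print(two_proportion_z(18, 50, 7, 50))
    ```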

  1. Reliability of Computer Analysis of Electrocardiograms (ECG) of ...

    African Journals Online (AJOL)

    Background: Computer programmes have been introduced to electrocardiography (ECG) with most physicians in Africa depending on computer interpretation of ECG. This study was undertaken to evaluate the reliability of computer interpretation of the 12-Lead ECG in the Black race. Methodology: Using the SCHILLER ...

  2. Reliability and Identification of Aortic Valve Prolapse in the Horse

    Directory of Open Access Journals (Sweden)

    Hallowell Gayle D

    2013-01-01

    Background: The objectives were to determine and assess the reliability of criteria for identification of aortic valve prolapse (AVP) using echocardiography in the horse. Results: Opinion of equine cardiologists indicated that a long-axis view of the aortic valve (AoV) was most commonly used for identification of AVP (46%; n=13). There was consensus that AVP could be mimicked by ultrasound probe malalignment. This was confirmed in 7 healthy horses, where the appearance of AVP could be induced by malalignment. In a study of a further 8 healthy horses (5 with AVP) examined daily for 5 days by two echocardiographers, standardized imaging guidelines gave good to excellent agreement for the assessment of AVP (kappa > 0.80) and good agreement between days and observers (kappa > 0.6). The technique allowed for assessment of the degree of prolapse and measurement of the prolapse distance, which provided excellent agreement between echocardiographers, days and observers (kappa/ICC > 0.8). Assessments made using real-time zoomed images provided similar measurements to the standard views (ICC = 0.9), with agreement for the identification of AVP (kappa > 0.8). Short-axis views of the AoV were used for identification of AVP by fewer respondents (23%); however, they provided less agreement for the identification of AVP (kappa > 0.6) and only adequate agreement with observations made in long axis (kappa > 0.5), with AVP being identified more often in short axis (92%) than in long axis (76%). Orthogonal views were used by 31% of respondents to identify the presence of AVP, and by 85% to identify the cusp. Its identification on both views on 4 days was used to categorise horses as having AVP, providing a positive predictive value of 79% and a negative predictive value of 18%. Only the non-coronary cusp (NCC) of the AoV was observed to prolapse in these studies. Prolapse of the NCC was confirmed during the optimisation study using four-dimensional echocardiography, which concurred with the findings

  3. Reliability of System Identification Techniques to Assess Standing Balance in Healthy Elderly.

    Directory of Open Access Journals (Sweden)

    Jantsje H Pasma

    System identification techniques have the potential to assess the contribution of the underlying systems involved in standing balance by applying well-known disturbances. We investigated the reliability of standing balance parameters obtained with multivariate closed-loop system identification techniques. In twelve healthy elderly participants, balance tests were performed twice a day during three days. Body sway was measured during two minutes of standing with eyes closed, and the Balance test Room (BalRoom) was used to apply four disturbances simultaneously: two sensory disturbances, to the proprioceptive and the visual system, and two mechanical disturbances applied at the leg and trunk segment. Using system identification techniques, sensitivity functions of the sensory disturbances and the neuromuscular controller were estimated. Based on generalizability theory (G theory), systematic errors and sources of variability were assessed using linear mixed models, and reliability was assessed by computing indexes of dependability (ID), standard error of measurement (SEM) and minimal detectable change (MDC). A systematic error was found between the first and second trial in the sensitivity functions. No systematic error was found in the neuromuscular controller and body sway. The reliability of 15 of 25 parameters and body sway was moderate to excellent when the results of two trials on three days were averaged. To reach an excellent reliability on one day in 7 out of 25 parameters, it was predicted that at least seven trials must be averaged. This study shows that system identification techniques are a promising method to assess the underlying systems involved in standing balance in the elderly. However, most of the parameters do not appear to be reliable unless a large number of trials are collected across multiple days. To reach an excellent reliability in one third of the parameters, a training session for participants is needed and at least seven trials of two
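
    Under a classical formulation, the standard error of measurement and the minimal detectable change follow directly from a reliability coefficient and the between-subject spread; the closed forms below are an assumption for illustration, since the G-theory analysis may work from variance components directly:

    ```python
    # Classical formulas for SEM and MDC95 from a reliability coefficient;
    # the input values are invented for illustration.
    from math import sqrt

    def sem(sd_between_subjects: float, reliability: float) -> float:
        return sd_between_subjects * sqrt(1.0 - reliability)

    def mdc95(sem_value: float) -> float:
        return 1.96 * sqrt(2.0) * sem_value

    s = sem(sd_between_subjects=2.5, reliability=0.80)
    print(f"SEM = {s:.2f}, MDC95 = {mdc95(s):.2f}")
    ```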

  4. Building fast, reliable, and adaptive software for computational science

    International Nuclear Information System (INIS)

    Rendell, A P; Antony, J; Armstrong, W; Janes, P; Yang, R

    2008-01-01

    Building fast, reliable, and adaptive software is a constant challenge for computational science, especially given recent developments in computer architecture. This paper outlines some of our efforts to address these three issues in the context of computational chemistry. First, a simple linear performance model that can be used to model and predict the performance of Hartree-Fock calculations is discussed. Second, the use of interval arithmetic to assess the numerical reliability of the sort of integrals used in electronic structure methods is presented. Third, the use of dynamic code modification as part of a framework to support adaptive software is outlined.
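
    The interval-arithmetic idea mentioned above, carrying a lower and an upper bound for every value so that accumulated numerical error stays visible, can be sketched with a toy interval type; this illustrates the concept and is not the paper's implementation (it also ignores outward rounding):

    ```python
    # Toy interval arithmetic: every value carries bounds, so error introduced by
    # approximate inputs can be tracked through sums and products.
    from dataclasses import dataclass

    @dataclass
    class Interval:
        lo: float
        hi: float

        def __add__(self, other):
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def __mul__(self, other):
            products = [self.lo * other.lo, self.lo * other.hi,
                        self.hi * other.lo, self.hi * other.hi]
            return Interval(min(products), max(products))

    x = Interval(0.999, 1.001) * Interval(2.0, 2.0) + Interval(-0.001, 0.001)
    print(x)  # the true result is guaranteed to lie within [x.lo, x.hi]
    ```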

  5. Computational botany methods for automated species identification

    CERN Document Server

    Remagnino, Paolo; Wilkin, Paul; Cope, James; Kirkup, Don

    2017-01-01

    This book discusses innovative methods for mining information from images of plants, especially leaves, and highlights the diagnostic features that can be implemented in fully automatic systems for identifying plant species. Adopting a multidisciplinary approach, it explores the problem of plant species identification, covering both the concepts of taxonomy and morphology. It then provides an overview of morphometrics, including the historical background and the main steps in the morphometric analysis of leaves together with a number of applications. The core of the book focuses on novel diagnostic methods for plant species identification developed from a computer scientist’s perspective. It then concludes with a chapter on the characterization of botanists' visions, which highlights important cognitive aspects that can be implemented in a computer system to more accurately replicate the human expert’s fixation process. The book not only represents an authoritative guide to advanced computational tools fo...

  6. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    Science.gov (United States)

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been developed for predicting software reliability. Existing reliability models are restricted to particular methodologies and a limited number of parameters, although a number of techniques and methodologies may be used for reliability prediction. There is a need to focus on which parameters are considered when estimating reliability. The reliability of a system may increase or decrease depending on the parameters selected, so there is a need to identify the factors that most heavily affect the reliability of the system. Nowadays, reusability is widely used in various areas of research; reusability is the basis of Component-Based Systems (CBS). Cost, time and human effort can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities exist for applying soft computing techniques to problems in medicine: clinical medicine uses fuzzy logic and neural network methodologies significantly, while basic medical science uses neural network-genetic algorithm combinations most frequently and preferably. Medical scientists have shown strong interest in applying soft computing methodologies in the genetics, physiology, radiology, cardiology and neurology disciplines. CBSE encourages users to reuse past and existing software when making new products, providing quality with savings in time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Networks (NN), Fuzzy Logic, Support Vector Machines (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and the Artificial Bee Colony (ABC). This paper presents the working of soft computing

  7. High-reliability computing for the smarter planet

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Graham, Paul; Manuzzato, Andrea; Dehon, Andre

    2010-01-01

    The geometric rate of improvement of transistor size and integrated circuit performance, known as Moore's Law, has been an engine of growth for our economy, enabling new products and services, creating new value and wealth, increasing safety, and removing menial tasks from our daily lives. Affordable, highly integrated components have enabled both life-saving technologies and rich entertainment applications. Anti-lock brakes, insulin monitors, and GPS-enabled emergency response systems save lives. Cell phones, internet appliances, virtual worlds, realistic video games, and mp3 players enrich our lives and connect us together. Over the past 40 years of silicon scaling, the increasing capabilities of inexpensive computation have transformed our society through automation and ubiquitous communications. In this paper, we will present the concept of the smarter planet, how reliability failures affect current systems, and methods that can be used to increase the reliable adoption of new automation in the future. We will illustrate these issues using a number of different electronic devices in a couple of different scenarios. Recently IBM has been presenting the idea of a 'smarter planet.' In smarter planet documents, IBM discusses increased computer automation of roadways, banking, healthcare, and infrastructure, as automation could create more efficient systems. A necessary component of the smarter planet concept is to ensure that these new systems have very high reliability. Even extremely rare reliability problems can easily escalate to problematic scenarios when implemented at very large scales. For life-critical systems, such as automobiles, infrastructure, medical implantables, and avionic systems, unmitigated failures could be dangerous. As more automation moves into these types of critical systems, reliability failures will need to be managed. As computer automation continues to increase in our society, greater radiation reliability becomes necessary.

  8. High-reliability computing for the smarter planet

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, Heather M [Los Alamos National Laboratory; Graham, Paul [Los Alamos National Laboratory; Manuzzato, Andrea [UNIV OF PADOVA; Dehon, Andre [UNIV OF PENN; Carter, Nicholas [INTEL CORPORATION

    2010-01-01

    The geometric rate of improvement of transistor size and integrated circuit performance, known as Moore's Law, has been an engine of growth for our economy, enabling new products and services, creating new value and wealth, increasing safety, and removing menial tasks from our daily lives. Affordable, highly integrated components have enabled both life-saving technologies and rich entertainment applications. Anti-lock brakes, insulin monitors, and GPS-enabled emergency response systems save lives. Cell phones, internet appliances, virtual worlds, realistic video games, and mp3 players enrich our lives and connect us together. Over the past 40 years of silicon scaling, the increasing capabilities of inexpensive computation have transformed our society through automation and ubiquitous communications. In this paper, we will present the concept of the smarter planet, how reliability failures affect current systems, and methods that can be used to increase the reliable adoption of new automation in the future. We will illustrate these issues using a number of different electronic devices in a couple of different scenarios. Recently IBM has been presenting the idea of a 'smarter planet.' In smarter planet documents, IBM discusses increased computer automation of roadways, banking, healthcare, and infrastructure, as automation could create more efficient systems. A necessary component of the smarter planet concept is to ensure that these new systems have very high reliability. Even extremely rare reliability problems can easily escalate to problematic scenarios when implemented at very large scales. For life-critical systems, such as automobiles, infrastructure, medical implantables, and avionic systems, unmitigated failures could be dangerous. As more automation moves into these types of critical systems, reliability failures will need to be managed. As computer automation continues to increase in our society, the need for greater radiation reliability is

  9. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
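
    A small analogue of evaluating one of the stored redundancy equations with ground instances of its variables is the textbook expression for triple modular redundancy with a perfect voter, R_TMR = 3R^2 - 2R^3; the module reliabilities below are example values, not CARE output:

    ```python
    # Example: evaluate the standard TMR redundancy equation for several module
    # reliabilities (illustrative values; not produced by CARE).

    def tmr_reliability(module_reliability: float) -> float:
        r = module_reliability
        return 3 * r**2 - 2 * r**3

    for r in (0.90, 0.95, 0.99):
        print(f"R = {r:.2f} -> R_TMR = {tmr_reliability(r):.4f}")
    ```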

  10. Highly reliable computer network for real time system

    International Nuclear Information System (INIS)

    Mohammed, F.A.; Omar, A.A.; Ayad, N.M.A.; Madkour, M.A.I.; Ibrahim, M.K.

    1988-01-01

    Many computer networks have been studied, with different trends regarding the network architecture and the various protocols that govern data transfers and guarantee reliable communication among all hosts. A hierarchical network structure has been proposed to provide a simple and inexpensive way to realize a reliable real-time computer network. In such an architecture, all computers at the same level are connected to a common serial channel through intelligent nodes that collectively control data transfers over the serial channel. This level of computer network can be considered a local area computer network (LACN) that can be used in a nuclear power plant control system, since such a system has geographically dispersed subsystems. Network expansion is straightforward: each added computer (HOST) is simply attached to the common channel. All nodes are designed around a microprocessor chip to provide the required intelligence. Each node can be divided into two sections, namely a common section that interfaces with the serial data channel and a private section that interfaces with the host computer. The private section naturally tends to have some variations in hardware details to match the requirements of individual host computers.

  11. Comparison of reliability of lateral cephalogram and computed ...

    African Journals Online (AJOL)

    of malocclusion and airway space using lateral cephalogram and computed tomography (CT) and to compare its reliability. To obtain important information on the morphology of the soft palate on lateral cephalogram and to determine its etiopathogenesis in obstructive sleep apnea (OSA). Materials and Methods: Lateral ...

  12. Systems reliability analysis: applications of the SPARCS System-Reliability Assessment Computer Program

    International Nuclear Information System (INIS)

    Locks, M.O.

    1978-01-01

    SPARCS-2 (Simulation Program for Assessing the Reliabilities of Complex Systems, Version 2) is a PL/1 computer program for assessing (establishing interval estimates for) the reliability and the MTBF of a large and complex s-coherent system of any modular configuration. The system can consist of a complex logical assembly of independently failing attribute (binomial-Bernoulli) and time-to-failure (Poisson-exponential) components, without regard to their placement. Alternatively, it can be a configuration of independently failing modules, where each module has either or both attribute and time-to-failure components. SPARCS-2 also has an improved super modularity feature. Modules with minimal-cut unreliability calculations can be mixed with those having minimal-path reliability calculations. All output has been standardized to system reliability or probability of success, regardless of the form in which the input data is presented, and whatever the configuration of modules or elements within modules.

  13. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
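
    The quantity these methods target is the probability that a limit-state function g(X) falls below zero; a crude Monte Carlo estimator with an invented resistance-minus-load limit state illustrates it, while the advanced methods surveyed in the paper approximate the same probability far more efficiently:

    ```python
    # Crude Monte Carlo estimate of a failure probability P[g(X) < 0] for an
    # invented limit state g = resistance - load (all parameters are assumptions).
    import random

    def failure_probability(n_samples: int = 200_000, seed: int = 1) -> float:
        rng = random.Random(seed)
        failures = 0
        for _ in range(n_samples):
            resistance = rng.gauss(10.0, 1.0)   # mean 10, std 1
            load = rng.gauss(7.0, 1.5)          # mean 7, std 1.5
            if resistance - load < 0.0:
                failures += 1
        return failures / n_samples

    print(f"Estimated P_f = {failure_probability():.4f}")
    ```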

  14. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  15. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  16. Reliability analysis framework for computer-assisted medical decision systems

    International Nuclear Information System (INIS)

    Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.

    2007-01-01

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
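
    A minimal sketch of the query-specific accuracy idea: estimate the CAD system's accuracy on the known cases nearest to the query in feature space. The Euclidean metric and the value of k below are illustrative assumptions, not necessarily the paper's neighborhood-selection procedure:

    ```python
    # Local (query-specific) accuracy: the CAD system's accuracy on the k known
    # cases nearest to the query in feature space (metric and k are assumptions).
    import math

    def local_accuracy(query, known_features, known_correct, k=5):
        """known_correct[i] is True if the CAD decision on known case i was correct."""
        ranked = sorted(range(len(known_features)),
                        key=lambda i: math.dist(query, known_features[i]))
        neighbors = ranked[:k]
        return sum(known_correct[i] for i in neighbors) / len(neighbors)

    # Hypothetical usage with 8 morphological features per ROI:
    # reliability = local_accuracy(query_features, db_features, db_cad_was_correct)
    ```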

  17. Evaluation of Network Reliability for Computer Networks with Multiple Sources

    Directory of Open Access Journals (Sweden)

    Yi-Kuei Lin

    2012-01-01

    Evaluating the reliability of a network with multiple sources to multiple sinks is a critical issue from the perspective of quality management. Due to the unrealistic definition of paths of network models in previous literature, existing models are not appropriate for real-world computer networks such as the Taiwan Advanced Research and Education Network (TWAREN). This paper proposes a modified stochastic-flow network model to evaluate the network reliability of a practical computer network with multiple sources where data is transmitted through several light paths (LPs). Network reliability is defined as the probability of delivering a specified amount of data from the sources to the sink. It is taken as a performance index to measure the service level of TWAREN. This paper studies the network reliability of the international portion of TWAREN from two sources (Taipei and Hsinchu) to one sink (New York) that goes through submarine and land surface cables between Taiwan and the United States.

  18. Reliability analysis of Airbus A-330 computer flight management system

    OpenAIRE

    Fajmut, Metod

    2010-01-01

    This diploma thesis deals with the digitized, computerized »Fly-by-wire« flight control system and the safety aspects of the computer system of the Airbus A330 aircraft. As with space and military aircraft structures, in commercial airplanes much of the financial contribution is devoted to reliability. Conventional aircraft control systems have relied, and some still rely, on mechanical and hydraulic connections between the pilot-operated controls and the control surfaces. But newer a...

  19. Computer Model to Estimate Reliability Engineering for Air Conditioning Systems

    International Nuclear Information System (INIS)

    Afrah Al-Bossly, A.; El-Berry, A.; El-Berry, A.

    2012-01-01

    Reliability engineering is used to predict the performance and optimize the design and maintenance of air conditioning systems. Air conditioning systems are exposed to a number of failures. Failures such as failure to turn on, loss of cooling capacity, reduced output temperatures, loss of cool air supply and complete loss of air flow can be due to a variety of problems with one or more components of an air conditioner or air conditioning system. Forecasting system failure rates is very important for maintenance. This paper focuses on the reliability of air conditioning systems. Statistical distributions commonly applied in reliability settings were used: the standard (two-parameter) Weibull and Gamma distributions. After the distribution parameters had been estimated, reliability estimates and predictions were used for the evaluations. To evaluate good operating condition in a building, the reliability of the air conditioning system that supplies conditioned air to the company's several departments was assessed. This air conditioning system is divided into two parts, namely the main chilled water system and the ten air handling systems that serve the ten departments. In a chilled-water system the air conditioner cools water down to 40-45 degrees F (4-7 degrees C). The chilled water is distributed throughout the building in a piping system and connected to air conditioning cooling units wherever needed. Data analysis was performed with the support of computer-aided reliability software; the Weibull and Gamma distributions indicated that the reliabilities for the systems were 86.012% and 77.7%, respectively. A comparison between the two important families of distribution functions, namely the Weibull and Gamma families, was made. It was found that the Weibull method performed better for decision making.
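
    For the fitted two-parameter Weibull distribution, the reliability at time t is R(t) = exp(-(t/eta)^beta); the shape and scale values below are invented for illustration and are not the study's estimates:

    ```python
    # Two-parameter Weibull reliability R(t) = exp(-(t/eta)**beta);
    # beta (shape) and eta (scale) below are invented, not the fitted values.
    from math import exp

    def weibull_reliability(t: float, beta: float, eta: float) -> float:
        return exp(-((t / eta) ** beta))

    for t in (1000, 5000, 10000):  # operating hours
        print(t, round(weibull_reliability(t, beta=1.3, eta=20_000), 4))
    ```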

  20. Algorithmic mechanisms for reliable crowdsourcing computation under collusion.

    Science.gov (United States)

    Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A; Pareja, Daniel

    2015-01-01

    We consider a computing system where a master processor assigns a task for execution to worker processors that may collude. We model the workers' decision of whether to comply (compute the task) or not (return a bogus result to save the computation cost) as a game among workers. That is, we assume that workers are rational in a game-theoretic sense. We identify analytically the parameter conditions for a unique Nash Equilibrium where the master obtains the correct result. We also evaluate experimentally mixed equilibria aiming to attain better reliability-profit trade-offs. For a wide range of parameter values that may be used in practice, our simulations show that, in fact, both master and workers are better off using a pure equilibrium where no worker cheats, even under collusion, and even for colluding behaviors that involve deviating from the game.

  1. Diagnostic reliability of MMPI-2 computer-based test interpretations.

    Science.gov (United States)

    Pant, Hina; McCabe, Brian J; Deskovitz, Mark A; Weed, Nathan C; Williams, John E

    2014-09-01

    Reflecting the common use of the MMPI-2 to provide diagnostic considerations, computer-based test interpretations (CBTIs) also typically offer diagnostic suggestions. However, these diagnostic suggestions can sometimes be shown to vary widely across different CBTI programs even for identical MMPI-2 profiles. The present study evaluated the diagnostic reliability of 6 commercially available CBTIs using a 20-item Q-sort task developed for this study. Four raters each sorted diagnostic classifications based on these 6 CBTI reports for 20 MMPI-2 profiles. Two questions were addressed. First, do users of CBTIs understand the diagnostic information contained within the reports similarly? Overall, diagnostic sorts of the CBTIs showed moderate inter-interpreter diagnostic reliability (mean r = .56), with sorts for the 1/2/3 profile showing the highest inter-interpreter diagnostic reliability (mean r = .67). Second, do different CBTIs programs vary with respect to diagnostic suggestions? It was found that diagnostic sorts of the CBTIs had a mean inter-CBTI diagnostic reliability of r = .56, indicating moderate but not strong agreement across CBTIs in terms of diagnostic suggestions. The strongest inter-CBTI diagnostic agreement was found for sorts of the 1/2/3 profile CBTIs (mean r = .71). Limitations and future directions are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  2. Reliability and validity of a talent identification test battery for seated and standing Paralympic throws.

    Science.gov (United States)

    Spathis, Jemima Grace; Connick, Mark James; Beckman, Emma Maree; Newcombe, Peter Anthony; Tweedy, Sean Michael

    2015-01-01

    Paralympic throwing events for athletes with physical impairments comprise seated and standing javelin, shot put, discus and seated club throwing. Identification of talented throwers would enable prediction of future success and promote participation; however, a valid and reliable talent identification battery for Paralympic throwing has not been reported. This study evaluates the reliability and validity of a talent identification battery for Paralympic throws. Participants were non-disabled so that impairment would not confound analyses, and results would provide an indication of normative performance. Twenty-eight non-disabled participants (13 M; 15 F) aged 23.6 years (±5.44) performed five kinematically distinct criterion throws (three seated, two standing) and nine talent identification tests (three anthropometric, six motor); 23 were tested a second time to evaluate test-retest reliability. Talent identification test-retest reliability was evaluated using Intra-class Correlation Coefficient (ICC) and Bland-Altman plots (Limits of Agreement). Spearman's correlation assessed strength of association between criterion throws and talent identification tests. Reliability was generally acceptable (mean ICC = 0.89), but two seated talent identification tests require more extensive familiarisation. Correlation strength (mean rs = 0.76) indicated that the talent identification tests can be used to validly identify individuals with competitively advantageous attributes for each of the five kinematically distinct throwing activities. Results facilitate further research in this understudied area.
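
    Bland-Altman limits of agreement, one of the test-retest indices used above, are the mean difference between sessions plus or minus 1.96 standard deviations of the differences; the paired scores below are hypothetical, not the study's measurements:

    ```python
    # Bland-Altman 95% limits of agreement for test-retest data
    # (the paired scores are hypothetical).
    from statistics import mean, stdev

    def limits_of_agreement(test, retest):
        diffs = [a - b for a, b in zip(test, retest)]
        bias = mean(diffs)
        sd = stdev(diffs)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    test   = [12.1, 10.4, 14.8, 11.5, 13.0, 9.9]
    retest = [11.8, 10.9, 14.2, 11.9, 12.6, 10.3]
    print(limits_of_agreement(test, retest))
    ```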

  3. Sigma: computer vision in the service of safety and reliability in the inspection services

    International Nuclear Information System (INIS)

    Pineiro, P. J.; Mendez, M.; Garcia, A.; Cabrera, E.; Regidor, J. J.

    2012-01-01

    Computer vision has grown very fast in the last decade, with very efficient tools and algorithms. This allows the development of new applications in the nuclear field, providing more efficient equipment and tasks: redundant systems, vision-guided mobile robots, automated visual defect recognition, measurement, etc. In this paper Tecnatom describes a detailed example of a visual computing application developed to provide secure, redundant identification of the thousands of tubes in a power plant steam generator. Some other ongoing or planned visual computing projects by Tecnatom are also introduced. New possibilities of application in inspection systems for nuclear components appear where the main objective is to maximize their reliability. (Author) 6 refs.

  4. Reliability of system identification techniques to assess standing balance in healthy elderly

    NARCIS (Netherlands)

    Pasma, Jantsje H.; Engelhart, Denise; Maier, Andrea B.; Aarts, Ronald G.K.M.; Van Gerven, Joop M.A.; Arendzen, J. Hans; Schouten, Alfred C.; Meskers, Carel G.M.; Van der Kooij, Herman

    2016-01-01

    Objectives System identification techniques have the potential to assess the contribution of the underlying systems involved in standing balance by applying well-known disturbances. We investigated the reliability of standing balance parameters obtained with multivariate closed loop system

  5. Reliability of System Identification Techniques to Assess Standing Balance in Healthy Elderly

    NARCIS (Netherlands)

    Pasma, J.H.; Engelhart, D.; Maier, A.B.; Aarts, R.G.K.M.; Van Gerven, J.M.A.; Arendzen, J.H.; Schouten, A.C.; Meskers, C.G.M.; Van der Kooij, H.

    2016-01-01

    Objectives System identification techniques have the potential to assess the contribution of the underlying systems involved in standing balance by applying well-known disturbances. We investigated the reliability of standing balance parameters obtained with multivariate closed loop system

  6. Reliability of an interactive computer program for advance care planning.

    Science.gov (United States)

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20]=0.83-0.95, and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time.
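
    The internal-consistency index reported above, Kuder-Richardson formula 20, applies to dichotomous items: KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / var(total)). The response matrix below is hypothetical, not MYWK data:

    ```python
    # Kuder-Richardson formula 20 for dichotomous (0/1) items
    # (the response matrix is hypothetical).
    from statistics import pvariance

    def kr20(responses):
        """responses: one list of 0/1 item scores per respondent."""
        k = len(responses[0])
        totals = [sum(person) for person in responses]
        var_total = pvariance(totals)
        item_pq = 0.0
        for item in range(k):
            p = sum(person[item] for person in responses) / len(responses)
            item_pq += p * (1 - p)
        return (k / (k - 1)) * (1 - item_pq / var_total)

    data = [[1, 1, 0, 1], [1, 0, 0, 1], [0, 0, 0, 0], [1, 1, 1, 1], [1, 1, 0, 0]]
    print(round(kr20(data), 3))
    ```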

  7. Reliability of an Interactive Computer Program for Advance Care Planning

    Science.gov (United States)

    Levi, Benjamin H.; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-01-01

    Abstract Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explains health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20]=0.83–0.95, and 0.86–0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830

  8. Effects of image enhancement on reliability of landmark identification in digital cephalometry

    Directory of Open Access Journals (Sweden)

    M Oshagh

    2013-01-01

    Full Text Available Introduction: Although digital cephalometric radiography is gaining popularity in orthodontic practice, the most important source of error in its tracing is uncertainty in landmark identification. Therefore, efforts to improve accuracy in landmark identification have been directed primarily toward improving image quality. One of the more useful techniques in this regard is digital image enhancement, which can increase the overall visual quality of the image, but this does not necessarily mean better identification of landmarks. The purpose of this study was to evaluate the effect of digital image enhancement on the reliability of landmark identification. Materials and Methods: Fifteen common landmarks, including 10 skeletal and 5 soft tissue landmarks, were selected on the cephalograms of 20 randomly selected patients, prepared in Natural Head Position (NHP). Two observers (orthodontists) identified the landmarks on the 20 original photostimulable phosphor (PSP) digital cephalogram images and on the 20 enhanced digital images twice, with an intervening time interval of at least 4 weeks. The x and y coordinates were further analyzed to evaluate the pattern of recording differences in the horizontal and vertical directions. Reliability of landmark identification was analyzed by paired t-test. Results: There was a significant difference between original and enhanced digital images in terms of the reliability of points Ar and N in the vertical and horizontal dimensions, and enhanced images were significantly more reliable than original images. Identification of the A point, Pogonion and Pronasal points in the vertical dimension of enhanced images was significantly more reliable than in the original ones. Reliability of Menton point identification in the horizontal dimension was significantly higher in enhanced images than in original ones. Conclusion: Direct digital image enhancement by altering brightness and contrast can increase the reliability of some landmark identification and this may lead to more
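
    As a rough illustration of the statistical comparison used in this study, the following sketch runs a paired t-test on hypothetical vertical identification errors (in mm) for a single landmark on original versus enhanced images; the values are invented, not the study's.

```python
from scipy import stats

# Hypothetical per-case identification error (mm) for one landmark
original = [0.9, 1.2, 0.8, 1.5, 1.1, 1.3, 0.7, 1.0]
enhanced = [0.6, 0.9, 0.7, 1.1, 0.8, 1.0, 0.6, 0.8]

t, p = stats.ttest_rel(original, enhanced)   # paired comparison, as in the study design
print(f"t = {t:.2f}, p = {p:.4f}")           # p < 0.05 would indicate a reliability difference
```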

  9. Radiologic identification of disaster victims: A simple and reliable method using CT of the paranasal sinuses

    International Nuclear Information System (INIS)

    Ruder, Thomas D.; Kraehenbuehl, Markus; Gotsmy, Walther F.; Mathier, Sandra; Ebert, Lars C.; Thali, Michael J.; Hatch, Gary M.

    2012-01-01

    Objective: To assess the reliability of radiologic identification using visual comparison of ante and post mortem paranasal sinus computed tomography (CT). Subjects and methods: The study was approved by the responsible justice department and university ethics committee. Four blinded readers with varying radiological experience separately compared 100 post mortem to 25 ante mortem head CTs with the goal to identify as many matching pairs as possible (out of 23 possible matches). Sensitivity, specificity, positive and negative predictive values were calculated for all readers. The chi-square test was applied to establish if there was significant difference in sensitivity between radiologists and non-radiologists. Results: For all readers, sensitivity was 83.7%, specificity was 100.0%, negative predictive value (NPV) was 95.4%, positive predictive value (PPV) was 100.0%, and accuracy was 96.3%. For radiologists, sensitivity was 97.8%, NPV was 99.4%, and accuracy was 99.5%. For non-radiologists, average sensitivity was 69.6%, negative predictive value (NPV) was 91.7%, and accuracy was 93.0%. Radiologists achieved a significantly higher sensitivity (p < 0.01) than non-radiologists. Conclusions: Visual comparison of ante mortem and post mortem CT of the head is a robust and reliable method for identifying unknown decedents, particularly in regard to positive matches. The sensitivity and NPV of the method depend on the reader's experience.
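
    The accuracy figures reported above follow directly from a 2x2 confusion matrix. A minimal sketch with invented reader counts (not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV and accuracy from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, ppv, npv, accuracy

# e.g. a reader who finds 20 of 23 true matches and declares no false matches
sens, spec, ppv, npv, acc = diagnostic_metrics(tp=20, fp=0, fn=3, tn=77)
print(f"sensitivity={sens:.1%}  specificity={spec:.1%}  PPV={ppv:.1%}  NPV={npv:.1%}  accuracy={acc:.1%}")
```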

  10. Fingerprint: A Unique and Reliable Method for Identification

    Directory of Open Access Journals (Sweden)

    Palash Kumar Bose

    2017-01-01

    Full Text Available Fingerprints have been the gold standard for personal identification within the forensic community for more than one hundred years. They remain universal in spite of the discovery of DNA fingerprinting. The science of fingerprint identification has evolved over time, from the early use of fingerprints to mark business transactions in ancient Babylonia to their use today as a core technology in biometric security devices and as scientific evidence in courts of law throughout the world. The science of fingerprints, dactylography or dermatoglyphics, has long been widely accepted, acclaimed and reputed as a panacea for individualization, particularly in forensic investigations. Human fingerprints are detailed, unique, difficult to alter, and durable over the life of an individual, making them suitable as lifelong markers of human identity. Fingerprints can be readily used by police or other authorities to identify individuals who wish to conceal their identity, or to identify people who are incapacitated or deceased, as in the aftermath of a natural disaster.

  11. Identification of tasks of maintenance centered in the reliability

    International Nuclear Information System (INIS)

    Torres V, A.; Rivero O, J.J.

    2004-01-01

    The Reliability Centered Maintenance (RCM) methodology has become, once its advantages were recognized, an objective of many industrial facilities seeking to optimize their maintenance. However, diverse subjective factors affect the determination of the parameters (the predictive techniques to apply and the times between interventions) that characterize RCM tasks. A method is presented to determine condition-monitoring tasks and the most suitable intervals for time-based monitoring and failure finding, with a system-level focus. This methodology has been computerized in the code MOSEG Win Ver 1.0, which has been applied successfully to the determination of RCM tasks in industrial facilities. (Author)

  12. Identification of Black Spots Based on Reliability Approach

    Directory of Open Access Journals (Sweden)

    Ahmadreza Ghaffari

    2013-12-01

    Full Text Available Identifying crash “black-spots”, “hot-spots” or “high-risk” locations is one of the most important and prevalent concerns in traffic safety, and various methods have been devised and presented for solving this issue. In this paper, a new method based on reliability analysis is presented to identify black-spots. Reliability analysis offers an ordered framework for considering the probabilistic nature of engineering problems, so crashes, with their probabilistic nature, can be handled within it. In this study, the application of this new method was compared with the commonly implemented Frequency and Empirical Bayesian methods using simulated data. The results indicated that the traditional methods can lead to inconsistent predictions because they do not consider the variance of the number of crashes at each site and depend only on the mean of the data.
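
    For readers unfamiliar with the Empirical Bayesian baseline mentioned above, the sketch below shows a Hauer-style EB blend of a site's observed crash count with a safety performance function (SPF) prediction. The SPF values and the overdispersion parameter are hypothetical, and the snippet is only an illustration of the baseline, not the paper's reliability-based method.

```python
def empirical_bayes(observed, mu_spf, k):
    """EB-expected crash count: weighted blend of the SPF prediction and the observed count.
    mu_spf: SPF-predicted crashes for similar sites; k: negative-binomial overdispersion."""
    w = 1.0 / (1.0 + mu_spf / k)          # weight given to the SPF prediction
    return w * mu_spf + (1.0 - w) * observed

sites = {"A": (12, 6.0), "B": (3, 5.5), "C": (9, 4.0)}   # site: (observed, SPF prediction)
eb = {name: empirical_bayes(obs, mu, k=2.0) for name, (obs, mu) in sites.items()}
ranking = sorted(eb, key=eb.get, reverse=True)           # candidate black-spot ordering
print(eb, ranking)
```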

  13. Identification of Brucella by MALDI-TOF mass spectrometry. Fast and reliable identification from agar plates and blood cultures.

    Directory of Open Access Journals (Sweden)

    Laura Ferreira

    Full Text Available BACKGROUND: MALDI-TOF mass spectrometry (MS) is a reliable method for bacteria identification. Some databases used for this purpose lack reference profiles for Brucella species, which is still an important pathogen in wide areas around the world. We report the creation of profiles for the MALDI-TOF Biotyper 2.0 database (Bruker Daltonics, Germany) and their usefulness for identifying brucellae from culture plates and blood cultures. METHODOLOGY/PRINCIPAL FINDINGS: We created MALDI Biotyper 2.0 profiles for type strains belonging to B. melitensis biotypes 1, 2 and 3; B. abortus biotypes 1, 2, 5 and 9; B. suis, B. canis, B. ceti and B. pinnipedialis. Then, 131 clinical isolates grown on plate cultures were used in triplicate to check identification. Identification at genus level was always correct, although in most cases the three replicates reported different identifications at species level. Simulated blood cultures were performed with type strains belonging to the main human pathogenic species (B. melitensis, B. abortus, B. suis and B. canis), and studied by MALDI-TOF MS in triplicate. Identification at genus level was always correct. CONCLUSIONS/SIGNIFICANCE: MALDI-TOF MS is reliable for Brucella identification to the genus level from culture plates and directly from blood culture bottles.

  14. A New Method of Reliability Evaluation Based on Wavelet Information Entropy for Equipment Condition Identification

    International Nuclear Information System (INIS)

    He, Z J; Zhang, X L; Chen, X F

    2012-01-01

    Aiming at reliability evaluation of condition identification of mechanical equipment, it is necessary to analyze condition monitoring information. A new method of reliability evaluation based on wavelet information entropy extracted from vibration signals of mechanical equipment is proposed. The method is quite different from traditional reliability evaluation models that depend on probability statistics analysis of large samples of data. The vibration signals of mechanical equipment were analyzed by means of the second generation wavelet package (SGWP). We take the relative energy in each frequency band of the decomposed signal, which equals a percentage of the whole signal energy, as a probability. A normalized information entropy (IE) is obtained from these relative energies to describe the uncertainty of a system instead of a probability. The reliability degree is then obtained by transforming the normalized wavelet information entropy. A successful application has been achieved in evaluating the assembly quality reliability for a kind of dismountable disk-drum aero-engine. The reliability degree indicates the assembled quality satisfactorily.
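
    The core computation described above — band energies treated as probabilities and turned into a normalized entropy — can be sketched with an ordinary wavelet packet transform. The paper uses a second-generation wavelet packet; pywt's standard transform serves only as a stand-in here, and the mapping of entropy to a reliability degree is purely illustrative.

```python
import numpy as np
import pywt

def wavelet_energy_entropy(signal, wavelet="db4", level=3):
    """Normalized wavelet-packet energy entropy of a vibration signal."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="freq")               # leaf frequency bands
    energies = np.array([np.sum(np.square(n.data)) for n in nodes])
    p = energies / energies.sum()                            # relative band energy as "probability"
    entropy = -np.sum(p * np.log(p + 1e-12))                 # Shannon entropy of band energies
    return entropy / np.log(len(p))                          # normalize to [0, 1]

# Hypothetical vibration snippets: a clean tone vs. the same tone with broadband noise
t = np.linspace(0, 1, 2048, endpoint=False)
healthy = np.sin(2 * np.pi * 50 * t)
degraded = healthy + 0.8 * np.random.default_rng(0).standard_normal(t.size)
for name, sig in [("healthy", healthy), ("degraded", degraded)]:
    h = wavelet_energy_entropy(sig)
    print(f"{name}: entropy={h:.3f}  illustrative reliability degree={1 - h:.3f}")
```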

  15. The reliable solution and computation time of variable parameters Logistic model

    OpenAIRE

    Pengfei, Wang; Xinnong, Pan

    2016-01-01

    The reliable computation time (RCT, marked as Tc) when applying a double precision computation of a variable parameters logistic map (VPLM) is studied. First, using the method proposed, the reliable solutions for the logistic map are obtained. Second, for a time-dependent non-stationary parameters VPLM, 10000 samples of reliable experiments are constructed, and the mean Tc is then computed. The results indicate that for each different initial value, the Tcs of the VPLM are generally different...
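
    The notion of a reliable computation time can be illustrated by iterating the logistic map in double precision alongside a high-precision reference and recording the first step at which the two disagree beyond a tolerance. This is only a sketch of the idea, not the paper's procedure, and all parameter values are invented.

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n)
from decimal import Decimal, getcontext

def reliable_steps(r=4.0, x0=0.1, tol=1e-6, max_steps=200):
    getcontext().prec = 50                      # ~50 significant digits as the "reference"
    x_double = x0
    x_ref = Decimal(repr(x0))
    r_ref = Decimal(repr(r))
    for n in range(1, max_steps + 1):
        x_double = r * x_double * (1.0 - x_double)
        x_ref = r_ref * x_ref * (Decimal(1) - x_ref)
        if abs(x_double - float(x_ref)) > tol:
            return n                            # first step where double precision is unreliable
    return max_steps

print("double precision is reliable for", reliable_steps(), "iterations")
```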

  16. Computational methods for protein identification from mass spectrometry data.

    Directory of Open Access Journals (Sweden)

    Leo McHugh

    2008-02-01

    Full Text Available Protein identification using mass spectrometry is an indispensable computational tool in the life sciences. A dramatic increase in the use of proteomic strategies to understand the biology of living systems generates an ongoing need for more effective, efficient, and accurate computational methods for protein identification. A wide range of computational methods, each with various implementations, are available to complement different proteomic approaches. A solid knowledge of the range of algorithms available and, more critically, the accuracy and effectiveness of these techniques is essential to ensure as many of the proteins as possible, within any particular experiment, are correctly identified. Here, we undertake a systematic review of the currently available methods and algorithms for interpreting, managing, and analyzing biological data associated with protein identification. We summarize the advances in computational solutions as they have responded to corresponding advances in mass spectrometry hardware. The evolution of scoring algorithms and metrics for automated protein identification are also discussed with a focus on the relative performance of different techniques. We also consider the relative advantages and limitations of different techniques in particular biological contexts. Finally, we present our perspective on future developments in the area of computational protein identification by considering the most recent literature on new and promising approaches to the problem as well as identifying areas yet to be explored and the potential application of methods from other areas of computational biology.

  17. Reliable and Persistent Identification of Linked Data Elements

    Science.gov (United States)

    Wood, David

    Linked Data techniques rely upon common terminology in a manner similar to a relational database's reliance on a schema. Linked Data terminology anchors metadata descriptions and facilitates navigation of information. Common vocabularies ease the human, social tasks of understanding datasets sufficiently to construct queries and help to relate otherwise disparate datasets. Vocabulary terms must, when using the Resource Description Framework, be grounded in URIs. A current best practice on the World Wide Web is to serve vocabulary terms as Uniform Resource Locators (URLs) and present both human-readable and machine-readable representations to the public. Linked Data terminology published to the World Wide Web may be used by others without reference or notification to the publishing party. That presents a problem: vocabulary publishers take on an implicit responsibility to maintain and publish their terms via the URLs originally assigned, regardless of the inconvenience such a responsibility may cause. Over the course of years, people change jobs, publishing organizations change Internet domain names, computers change IP addresses, systems administrators publish old material in new ways. Clearly, a mechanism is required to manage Web-based vocabularies over a long term. This chapter places Linked Data vocabularies in context with the wider concepts of metadata in general and specifically metadata on the Web. Persistent identifier mechanisms are reviewed, with a particular emphasis on Persistent URLs, or PURLs. PURLs and PURL services are discussed in the context of Linked Data. Finally, historic weaknesses of PURLs are resolved by the introduction of a federation of PURL services to address needs specific to Linked Data.

  18. Reliability of voxel gray values in cone beam computed tomography for preoperative implant planning assessment

    NARCIS (Netherlands)

    Parsa, A.; Ibrahim, N.; Hassan, B.; Motroni, A.; van der Stelt, P.; Wismeijer, D.

    2012-01-01

    Purpose: To assess the reliability of cone beam computed tomography (CBCT) voxel gray value measurements using Hounsfield units (HU) derived from multislice computed tomography (MSCT) as a clinical reference (gold standard). Materials and Methods: Ten partially edentulous human mandibular cadavers

  19. Computational identification of mutually homologous Zika virus ...

    African Journals Online (AJOL)

    Ewen McLean

    2017-04-07

    (Only a fragmentary search-result snippet is available for this record: the work originates from the Department of Computer Science & Mathematics, Claflin University, Orangeburg, SC, USA, and notes that it is still unclear how, and at what time of gestation, ZIKV infects fetal brain cells.)

  20. To the problem of reliability standardization in computer-aided manufacturing at NPP units

    International Nuclear Information System (INIS)

    Yastrebenetskij, M.A.; Shvyryaev, Yu.V.; Spektor, L.I.; Nikonenko, I.V.

    1989-01-01

    The problems of reliability standardization in computer-aided manufacturing at NPP units are analyzed considering the following approaches: computer-aided manufacturing of NPP units as part of an automated technological complex, and computer-aided manufacturing of NPP units as a multi-functional system. The selection of the composition of reliability indices for computer-aided manufacturing of NPP units under each of the approaches considered is substantiated.

  1. A computational Bayesian approach to dependency assessment in system reliability

    International Nuclear Information System (INIS)

    Yontay, Petek; Pan, Rong

    2016-01-01

    Due to the increasing complexity of engineered products, it is of great importance to develop a tool to assess reliability dependencies among components and systems under the uncertainty of system reliability structure. In this paper, a Bayesian network approach is proposed for evaluating the conditional probability of failure within a complex system, using a multilevel system configuration. Coupling with Bayesian inference, the posterior distributions of these conditional probabilities can be estimated by combining failure information and expert opinions at both system and component levels. Three data scenarios are considered in this study, and they demonstrate that, with the quantification of the stochastic relationship of reliability within a system, the dependency structure in system reliability can be gradually revealed by the data collected at different system levels. - Highlights: • A Bayesian network representation of system reliability is presented. • Bayesian inference methods for assessing dependencies in system reliability are developed. • Complete and incomplete data scenarios are discussed. • The proposed approach is able to integrate reliability information from multiple sources at multiple levels of the system.
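
    The kind of conditional failure probability the abstract refers to can be illustrated with a toy three-node network: two components feeding a system node. The marginals and the conditional probability table below are invented, and the paper's multilevel model with Bayesian updating is considerably richer than this enumeration.

```python
from itertools import product

p_c1, p_c2 = 0.05, 0.10                       # assumed marginal component failure probabilities
p_s_given = {                                 # assumed CPT: P(system fails | c1, c2)
    (0, 0): 0.001, (0, 1): 0.30, (1, 0): 0.40, (1, 1): 0.95,
}

joint_s1 = 0.0        # P(S = 1)
joint_c1_s1 = 0.0     # P(C1 = 1, S = 1)
for c1, c2 in product((0, 1), repeat=2):
    p = (p_c1 if c1 else 1 - p_c1) * (p_c2 if c2 else 1 - p_c2) * p_s_given[(c1, c2)]
    joint_s1 += p
    if c1:
        joint_c1_s1 += p

print("P(C1 failed | system failed) =", round(joint_c1_s1 / joint_s1, 4))
```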

  2. Comparison of reliability of lateral cephalogram and computed ...

    African Journals Online (AJOL)

    2014-05-07

    (Only a fragmentary search-result snippet is available for this record: it notes that measurements acquired from both modalities are reliable and reproducible, and refers to upper airway measurements in snorers.)

  3. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and the BOUNDS codes. Two reference study cases were executed by each code. The results obtained from the logic/probabilistic analysis, as well as the computation times, are compared
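
    As a reminder of the kind of logic/probabilistic computation such fault-tree codes perform, here is a tiny top-event calculation for TOP = OR(AND(A, B), C) with independent basic events and invented probabilities; it is a sketch of the principle, not of any of the codes named above.

```python
def p_and(*ps):
    """Probability that all independent events occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    """Probability that at least one independent event occurs (exact, no rare-event approximation)."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

p_a, p_b, p_c = 1e-3, 2e-3, 5e-4               # hypothetical basic-event probabilities
print("P(TOP) =", p_or(p_and(p_a, p_b), p_c))  # ~5.02e-4
```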

  4. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back-propagation neural network approach to compute the reliability of a system. Since the states in which failures occur are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks shown by the Markovian model for steady-state reliability computations and by the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implication purposes, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implication is shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
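
    The Markovian half of such an approach typically reduces to solving for the stationary distribution of a state-transition model. A minimal sketch with an invented three-state (up / degraded / down) generator matrix, not the paper's AGV model:

```python
import numpy as np

# Continuous-time generator matrix Q (rows sum to zero); states: [up, degraded, down]
lam1, lam2, mu1, mu2 = 0.01, 0.05, 0.5, 0.2     # assumed failure / repair rates per hour
Q = np.array([
    [-lam1,          lam1,        0.0],
    [  mu1, -(mu1 + lam2),       lam2],
    [  0.0,           mu2,       -mu2],
])

# Solve pi Q = 0 together with sum(pi) = 1
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady-state availability (up + degraded):", round(pi[0] + pi[1], 4))
```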

  5. Selection of suitable hand gestures for reliable myoelectric human computer interface.

    Science.gov (United States)

    Castro, Maria Claudia F; Arjunan, Sridhar P; Kumar, Dinesh K

    2015-04-09

    A myoelectric-controlled prosthetic hand requires machine-based identification of hand gestures using the surface electromyogram (sEMG) recorded from the forearm muscles. This study observed that a sub-set of the hand gestures has to be selected for accurate automated hand gesture recognition, and reports a method to select these gestures to maximize the sensitivity and specificity. Experiments were conducted where sEMG was recorded from the muscles of the forearm while subjects performed hand gestures, and the recordings were then classified off-line. The performances of ten gestures were ranked using the proposed Positive-Negative Performance Measurement Index (PNM), generated by a series of confusion matrices. When using all ten gestures, the sensitivity and specificity were 80.0% and 97.8%. After ranking the gestures using the PNM, six gestures were selected and these gave sensitivity and specificity greater than 95% (96.5% and 99.3%): Hand open, Hand close, Little finger flexion, Ring finger flexion, Middle finger flexion and Thumb flexion. This work has shown that reliable myoelectric-based human computer interface systems require careful selection of the gestures to be recognized, and without such selection the reliability is poor.

  6. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.
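
    One of the standard internal-consistency measures discussed in this literature, Cronbach's alpha, can be computed directly from an item-score matrix with open source tools; the data below are invented and serve only to show the calculation.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha from an item-score matrix (rows = respondents, columns = items)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)        # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Hypothetical 5-respondent, 4-item Likert-type data
data = [[4, 5, 4, 4], [3, 3, 4, 3], [2, 2, 3, 2], [5, 5, 5, 4], [1, 2, 1, 2]]
print("alpha =", round(cronbach_alpha(data), 3))
```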

  7. A universal and reliable assay for molecular sex identification of three-spined sticklebacks (Gasterosteus aculeatus).

    Science.gov (United States)

    Toli, E-A; Calboli, F C F; Shikano, T; Merilä, J

    2016-11-01

    In heterogametic species, biological differences between the two sexes are ubiquitous, and hence, errors in sex identification can be a significant source of noise and bias in studies where sex-related sources of variation are of interest or need to be controlled for. We developed and validated a universal multimarker assay for reliable sex identification of three-spined sticklebacks (Gasterosteus aculeatus). The assay makes use of genotype scores from three sex-linked loci and utilizes Bayesian probabilistic inference to identify sex of the genotyped individuals. The results, validated with 286 phenotypically sexed individuals from six populations of sticklebacks representing all major genetic lineages (cf. Pacific, Atlantic and Japan Sea), indicate that in contrast to commonly used single-marker-based sex identification assays, the developed multimarker assay should be 100% accurate. As the markers in the assay can be scored from agarose gels, it provides a quick and cost-efficient tool for universal sex identification of three-spined sticklebacks. The general principle of combining information from multiple markers to improve the reliability of sex identification is transferable and can be utilized to develop and validate similar assays for other species. © 2016 John Wiley & Sons Ltd.
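
    The multimarker idea — combining per-locus evidence through Bayes' rule into a posterior probability of sex — can be sketched as follows. The likelihood values are invented for illustration; the published assay estimates them from reference genotypes of phenotypically sexed fish.

```python
import numpy as np

# P(observed genotype score at locus | sex); one row per locus: (P(.|male), P(.|female))
likelihoods = np.array([
    [0.98, 0.03],
    [0.95, 0.05],
    [0.90, 0.10],
])
prior = np.array([0.5, 0.5])                       # uninformative prior over (male, female)

posterior = prior * likelihoods.prod(axis=0)       # loci assumed independent given sex
posterior /= posterior.sum()
print(f"P(male | genotypes) = {posterior[0]:.4f}")
```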

  8. Growth characteristics of liquid cultures increase the reliability of presumptive identification of Mycobacterium tuberculosis complex.

    Science.gov (United States)

    Pinhata, Juliana Maira Watanabe; Felippe, Isis Moreira; Gallo, Juliana Failde; Chimara, Erica; Ferrazoli, Lucilaine; de Oliveira, Rosangela Siqueira

    2018-04-23

    We evaluated the microscopic and macroscopic characteristics of mycobacteria growth indicator tube (MGIT) cultures for the presumptive identification of the Mycobacterium tuberculosis complex (MTBC) and assessed the reliability of this strategy for correctly directing isolates to drug susceptibility testing (DST) or species identification. A total of 1526 isolates of mycobacteria received at the Instituto Adolfo Lutz were prospectively subjected to presumptive identification by the observation of growth characteristics along with cord formation detection via microscopy. The presumptive identification showed a sensitivity, specificity and accuracy of 98.8, 92.5 and 97.9 %, respectively. Macroscopic analysis of MTBC isolates that would have been erroneously classified as non-tuberculous mycobacteria based solely on microscopic morphology enabled us to direct them rapidly to DST, representing a substantial gain to patients. In conclusion, the growth characteristics of mycobacteria in MGIT, when considered along with cord formation, increased the reliability of the presumptive identification, which has a great impact on the laboratory budget and turnaround times.

  9. Soft computing approach for reliability optimization: State-of-the-art survey

    International Nuclear Information System (INIS)

    Gen, Mitsuo; Yun, Young Su

    2006-01-01

    In the broadest sense, reliability is a measure of performance of systems. As systems have grown more complex, the consequences of their unreliable behavior have become severe in terms of cost, effort, lives, etc., and the interest in assessing system reliability and the need for improving the reliability of products and systems have become very important. Most solution methods for reliability optimization assume that systems have redundant components in series and/or parallel configurations and that alternative designs are available. Reliability optimization problems concentrate on optimal allocation of redundant components and optimal selection of alternative designs to meet system requirements. In the past two decades, numerous reliability optimization techniques have been proposed. Generally, these techniques can be classified as linear programming, dynamic programming, integer programming, geometric programming, heuristic methods, the Lagrangian multiplier method and so on. A Genetic Algorithm (GA), as a soft computing approach, is a powerful tool for solving various reliability optimization problems. In this paper, we briefly survey GA-based approaches for various reliability optimization problems, such as reliability optimization of redundant systems, reliability optimization with alternative designs, reliability optimization with time-dependent reliability, reliability optimization with interval coefficients, bicriteria reliability optimization, and reliability optimization with fuzzy goals. We also introduce hybrid approaches for combining GA with fuzzy logic, neural networks and other conventional search techniques. Finally, we present some experiments with examples of various reliability optimization problems using the hybrid GA approach.
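
    To make the GA-based formulation concrete, here is a toy redundancy-allocation example: a genetic algorithm chooses how many parallel components to place at each series stage to maximize system reliability under a cost budget. It is a deliberately simple sketch with invented reliabilities, costs and GA settings, not one of the surveyed algorithms.

```python
import random

random.seed(1)
p = [0.90, 0.85, 0.95]          # assumed component reliability per stage
c = [2, 3, 1]                   # assumed component cost per stage
BUDGET, MAX_RED = 15, 4

def reliability(x):             # series system of parallel groups
    r = 1.0
    for pi, ni in zip(p, x):
        r *= 1.0 - (1.0 - pi) ** ni
    return r

def fitness(x):
    cost = sum(ci * ni for ci, ni in zip(c, x))
    return reliability(x) if cost <= BUDGET else 0.0   # infeasible allocations score zero

pop = [[random.randint(1, MAX_RED) for _ in p] for _ in range(30)]
for _ in range(50):                                     # generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                                  # elitist selection
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)
        cut = random.randint(1, len(p) - 1)
        child = a[:cut] + b[cut:]                       # one-point crossover
        i = random.randrange(len(p))                    # single-gene mutation
        child[i] = random.randint(1, MAX_RED)
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print("best allocation:", best, "reliability:", round(reliability(best), 4))
```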

  10. Spatiotemporal Dynamics and Reliable Computations in Recurrent Spiking Neural Networks

    Science.gov (United States)

    Pyle, Ryan; Rosenbaum, Robert

    2017-01-01

    Randomly connected networks of excitatory and inhibitory spiking neurons provide a parsimonious model of neural variability, but are notoriously unreliable for performing computations. We show that this difficulty is overcome by incorporating the well-documented dependence of connection probability on distance. Spatially extended spiking networks exhibit symmetry-breaking bifurcations and generate spatiotemporal patterns that can be trained to perform dynamical computations under a reservoir computing framework.

  11. Spatiotemporal Dynamics and Reliable Computations in Recurrent Spiking Neural Networks.

    Science.gov (United States)

    Pyle, Ryan; Rosenbaum, Robert

    2017-01-06

    Randomly connected networks of excitatory and inhibitory spiking neurons provide a parsimonious model of neural variability, but are notoriously unreliable for performing computations. We show that this difficulty is overcome by incorporating the well-documented dependence of connection probability on distance. Spatially extended spiking networks exhibit symmetry-breaking bifurcations and generate spatiotemporal patterns that can be trained to perform dynamical computations under a reservoir computing framework.

  12. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times, are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)

  13. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times, are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  14. Design for reliability information and computer-based systems

    CERN Document Server

    Bauer, Eric

    2010-01-01

    "System reliability, availability and robustness are often not well understood by system architects, engineers and developers. They often don't understand what drives customer's availability expectations, how to frame verifiable availability/robustness requirements, how to manage and budget availability/robustness, how to methodically architect and design systems that meet robustness requirements, and so on. The book takes a very pragmatic approach of framing reliability and robustness as a functional aspect of a system so that architects, designers, developers and testers can address it as a concrete, functional attribute of a system, rather than an abstract, non-functional notion"--Provided by publisher.

  15. Methods to compute reliabilities for genomic predictions of feed intake

    Science.gov (United States)

    For new traits without historical reference data, cross-validation is often the preferred method to validate reliability (REL). Time truncation is less useful because few animals gain substantial REL after the truncation point. Accurate cross-validation requires separating genomic gain from pedigree...

  16. Direct unavailability computation of a maintained highly reliable system

    Czech Academy of Sciences Publication Activity Database

    Briš, R.; Byczanski, Petr

    2010-01-01

    Roč. 224, č. 3 (2010), s. 159-170 ISSN 1748-0078 Grant - others:GA Mšk(CZ) MSM6198910007 Institutional research plan: CEZ:AV0Z30860518 Keywords: high reliability * availability * directed acyclic graph Subject RIV: BA - General Mathematics http://journals.pepublishing.com/content/rtp3178l17923m46/

  17. Models of Information Security Highly Reliable Computing Systems

    Directory of Open Access Journals (Sweden)

    Vsevolod Ozirisovich Chukanov

    2016-03-01

    Full Text Available Methods of combined redundancy are considered. Models of system reliability that take into account the restoration and preventive-maintenance parameters of the system blocks are described. Relations for the average number of preventive actions and for the availability coefficient of the system blocks are given.

  18. Fault-tolerant search algorithms reliable computation with unreliable information

    CERN Document Server

    Cicalese, Ferdinando

    2013-01-01

    Why a book on fault-tolerant search algorithms? Searching is one of the fundamental problems in computer science. Time and again algorithmic and combinatorial issues originally studied in the context of search find application in the most diverse areas of computer science and discrete mathematics. On the other hand, fault-tolerance is a necessary ingredient of computing. Due to their inherent complexity, information systems are naturally prone to errors, which may appear at any level - as imprecisions in the data, bugs in the software, or transient or permanent hardware failures. This book pr

  19. Reliability in Warehouse-Scale Computing: Why Low Latency Matters

    DEFF Research Database (Denmark)

    Nannarelli, Alberto

    2015-01-01

    Warehouse-sized buildings nowadays host several types of large computing systems: from supercomputers to the large clusters of servers that provide the infrastructure for the cloud. Although the main target, especially for high-performance computing, is still to achieve high throughput, the limiting factor of these warehouse-scale data centers is power dissipation. Power is dissipated not only in the computation itself, but also in heat removal (fans, air conditioning, etc.) to keep the temperature of the devices within their operating ranges. The need to keep the temperature low within...

  20. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
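
    An architecture-based reliability computation of the kind described — per-module reliabilities combined through a discrete-time Markov model of control flow — can be sketched in a few lines. The module reliabilities and transition probabilities below are hypothetical, and the sketch ignores the COSMIC-FFP sizing step.

```python
import numpy as np

R = np.array([0.999, 0.995, 0.990])        # assumed per-module reliabilities
P = np.array([                             # assumed control-flow transition probabilities
    [0.0, 0.7, 0.3],
    [0.0, 0.0, 1.0],
    [0.0, 0.0, 0.0],                       # module 2 is terminal
])

Qhat = R[:, None] * P                      # prob. of surviving module i and transferring to j
S = np.linalg.inv(np.eye(3) - Qhat)        # expected counts of successful visits
system_reliability = S[0, 2] * R[2]        # start in module 0, succeed through terminal module 2
print("system reliability:", round(system_reliability, 5))
```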

  1. Effective computing algorithm for maintenance optimization of highly reliable systems

    Czech Academy of Sciences Publication Activity Database

    Briš, R.; Byczanski, Petr

    2013-01-01

    Roč. 109, č. 1 (2013), s. 77-85 ISSN 0951-8320 R&D Projects: GA MŠk(CZ) ED1.1.00/02.0070 Institutional support: RVO:68145535 Keywords : exact computing * maintenance * optimization * unavailability Subject RIV: BA - General Mathematics Impact factor: 2.048, year: 2013 http://www.sciencedirect.com/science/article/pii/S0951832012001639

  2. [Evaluation of mass spectrometry: MALDI-TOF MS for fast and reliable yeast identification].

    Science.gov (United States)

    Relloso, María S; Nievas, Jimena; Fares Taie, Santiago; Farquharson, Victoria; Mujica, María T; Romano, Vanesa; Zarate, Mariela S; Smayevsky, Jorgelina

    2015-01-01

    The matrix-assisted laser desorption/ionization time-of-flight mass spectrometry technique known as MALDI-TOF MS is a tool used for the identification of clinical pathogens by generating a protein spectrum that is unique for a given species. In this study we assessed the identification of clinical yeast isolates by MALDI-TOF MS in a university hospital in Argentina and compared two procedures for protein extraction: a rapid method and a procedure based on the manufacturer's recommendations. The short protein extraction procedure was applied to 100 isolates and the rate of correct identification at genus and species level was 98.0%. In addition, we analyzed 201 isolates, previously identified by conventional methods, using the methodology recommended by the manufacturer, and there was 95.38% agreement in the identification at species level. MALDI-TOF MS proved to be a fast, simple and reliable tool for yeast identification. Copyright © 2014 Asociación Argentina de Microbiología. Publicado por Elsevier España, S.L.U. All rights reserved.

  3. Intraobserver and intermethod reliability for using two different computer programs in preoperative lower limb alignment analysis

    Directory of Open Access Journals (Sweden)

    Mohamed Kenawey

    2016-12-01

    Conclusion: Computer-assisted lower limb alignment analysis is reliable whether using a graphics editing program or specialized planning software. However, slightly higher variability can be expected for angles located away from the knee joint.

  4. Distributed Information and Control system reliability enhancement by fog-computing concept application

    Science.gov (United States)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-03-01

    The paper focuses on the information and control system reliability issue. The authors propose a new complex approach to information and control system reliability enhancement based on elements of the fog-computing concept. The proposed approach consists of a set of optimization problems to be solved: estimation of the computational load that can be shifted to the edge of the network and the fog layer, distribution of computations among the data-processing elements, and distribution of computations among the sensors. These problems, together with selected simulation results and a discussion, are formulated and presented within this paper.

  5. The reliability of flexible nasolaryngoscopy in the identification of vocal fold movement impairment in young infants.

    Science.gov (United States)

    Liu, Yi-Chun Carol; McElwee, Tyler; Musso, Mary; Rosenberg, Tara L; Ongkasuwan, Julina

    2017-09-01

    Flexible nasolaryngoscopy (FNL) is considered the gold standard for evaluation of vocal fold mobility, but there have been no data on the reliability of its interpretation in the infant population. Visualization may be limited by excessive movement, secretions, or floppy supraglottic structures that prevent accurate diagnosis of vocal fold movement impairment (VFMI). We sought to evaluate the inter- and intra-rater reliability of FNL for the evaluation of VFMI in young infants. Case-control. Twenty infants were identified: 10 with VFMI and 10 normal as seen on FNL. Three pediatric otolaryngologists reviewed the video without sound and rated the presence and/or degree of vocal fold mobility. Twelve videos were repeated to assess intra-rater reliability. There was substantial agreement between the reviewers regarding the identification of normal vs. any type of VFMI (kappa = 0.67) but only moderate agreement regarding the degree of vocal fold movement (kappa = 0.49). Intra-rater reliability ranged from moderate to perfect agreement (kappa = 0.48-1). FNL in infants is an extremely challenging procedure. Clinically, physicians frequently use the quality of the cry and the past medical and surgical history to help make a judgment of vocal fold movement when the view is suboptimal. These other factors, however, may bias the interpretation of the FNL. Without sound, there is only moderate inter-rater and variable intra-rater reliability for the identification of the degree of movement on FNL. Otolaryngologists must be cognizant of the limitations of FNL when using it as a clinical tool or as a "gold standard" against which other modalities are measured. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. A new efficient algorithm for computing the imprecise reliability of monotone systems

    International Nuclear Information System (INIS)

    Utkin, Lev V.

    2004-01-01

    Reliability analysis of complex systems under partial information about the reliability of components and under different conditions of component independence may be carried out by means of imprecise probability theory, which provides a unified framework (natural extension, lower and upper previsions) for computing the system reliability. However, the application of imprecise probabilities to reliability analysis runs into the complexity of the optimization problems that have to be solved to obtain the system reliability measures. Therefore, an efficient simplified algorithm to solve and decompose these optimization problems is proposed in the paper. This algorithm allows us to practically implement reliability analysis of monotone systems under partial and heterogeneous information about the reliability of components and under conditions of component independence or a lack of information about independence. A numerical example illustrates the algorithm.
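
    The flavour of the quantities such algorithms compute can be shown for the simplest case: interval-valued component reliabilities combined through monotone series and parallel structures under an independence assumption. The paper also treats the harder case where independence is unknown; this sketch does not.

```python
def series_bounds(intervals):
    """Bounds on series-system reliability from component reliability intervals [low, high]."""
    lo = hi = 1.0
    for a, b in intervals:
        lo *= a
        hi *= b
    return lo, hi

def parallel_bounds(intervals):
    """Bounds on parallel-system reliability from component reliability intervals [low, high]."""
    unrel_hi = unrel_lo = 1.0
    for a, b in intervals:
        unrel_hi *= (1.0 - a)     # worst case: every component at its lower reliability
        unrel_lo *= (1.0 - b)     # best case: every component at its upper reliability
    return 1.0 - unrel_hi, 1.0 - unrel_lo

# Two components known only up to intervals
comps = [(0.90, 0.95), (0.80, 0.99)]
print("series reliability bounds:  ", series_bounds(comps))
print("parallel reliability bounds:", parallel_bounds(comps))
```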

  7. Reliability of a computer and Internet survey (Computer User Profile) used by adults with and without traumatic brain injury (TBI).

    Science.gov (United States)

    Kilov, Andrea M; Togher, Leanne; Power, Emma

    2015-01-01

    To determine the test-retest reliability of the 'Computer User Profile' (CUP) in people with and without TBI. The CUP was administered on two occasions to people with and without TBI. The CUP investigated the nature and frequency of participants' computer and Internet use. Intra-class correlation coefficients and kappa coefficients were calculated to measure the reliability of individual CUP items. Descriptive statistics were used to summarize the content of responses. Sixteen adults with TBI and 40 adults without TBI were included in the study. All participants were reliable in reporting demographic information, frequency of social communication and leisure activities and computer/Internet habits and usage. Adults with TBI were reliable in 77% of their responses to survey items. Adults without TBI were reliable in 88% of their responses to survey items. The CUP was practical and valuable in capturing information about social, leisure, communication and computer/Internet habits of people with and without TBI. Adults without TBI scored more items with satisfactory reliability overall in their surveys. Future studies may include larger samples and could also include an exploration of how people with/without TBI use other digital communication technologies. This may provide further information on determining technology readiness for people with TBI in therapy programmes.

  8. Reliable identification at the species level of Brucella isolates with MALDI-TOF-MS

    Directory of Open Access Journals (Sweden)

    Lista Florigio

    2011-12-01

    Full Text Available Abstract Background The genus Brucella contains highly infectious species that are classified as biological threat agents. The timely detection and identification of the microorganism involved is essential for an effective response not only to biological warfare attacks but also to natural outbreaks. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a rapid method for the analysis of biological samples. The advantages of this method, compared to conventional techniques, are rapidity, cost-effectiveness, accuracy and suitability for the high-throughput identification of bacteria. Discrepancies between taxonomy and genetic relatedness on the species and biovar level complicate the development of detection and identification assays. Results In this study, the accurate identification of Brucella species using MALDI-TOF-MS was achieved by constructing a Brucella reference library based on multilocus variable-number tandem repeat analysis (MLVA) data. By comparing MS-spectra from Brucella species against a custom-made MALDI-TOF-MS reference library, MALDI-TOF-MS could be used as a rapid identification method for Brucella species. In this way, 99.3% of the 152 isolates tested were identified at the species level, and B. suis biovar 1 and 2 were identified at the level of their biovar. This result demonstrates that for Brucella, even minimal genomic differences between these serovars translate to specific proteomic differences. Conclusions MALDI-TOF-MS can be developed into a fast and reliable identification method for genetically highly related species when potential taxonomic and genetic inconsistencies are taken into consideration during the generation of the reference library.

  9. [Reliability and validity of the Chinese version on Alcohol Use Disorders Identification Test].

    Science.gov (United States)

    Zhang, C; Yang, G P; Li, Z; Li, X N; Li, Y; Hu, J; Zhang, F Y; Zhang, X J

    2017-08-10

    Objective: To assess the reliability and validity of the Chinese version of the Alcohol Use Disorders Identification Test (AUDIT) among medical students in China and to provide guidance on the correct application of the recommended scale. Methods: An E-questionnaire was developed and sent to medical students in five different colleges. All students volunteered to take the test. Cronbach's α and split-half reliability were calculated to evaluate the reliability of the AUDIT, while content, construct, discriminant and convergent validity were assessed to measure the validity of the scale. Results: The overall Cronbach's α of the AUDIT was 0.782 and the split-half reliability was 0.711. Data showed that the domain Cronbach's α and split-half reliability were 0.796 and 0.794 for hazardous alcohol use, 0.561 and 0.623 for dependence symptoms, and 0.647 and 0.640 for harmful alcohol use. Results also showed that the item-level content validity indices (I-CVI) ranged from 0.83 to 1.00, the scale-level content validity index (S-CVI/UA) was 0.90, the average scale-level content validity index (S-CVI/Ave) was 0.99, and the content validity ratios (CVR) ranged from 0.80 to 1.00. The simplified version of the AUDIT supported a presupposed three-factor structure which could explain 61.175% of the total variance revealed through exploratory factor analysis. The AUDIT seemed to have good convergent and discriminant validity, with the success rate of the calibration experiment being 100%. Conclusion: The AUDIT showed good reliability and validity among medical students in China and is thus worth promoting for use.

  10. A Quantitative Risk Analysis Framework for Evaluating and Monitoring Operational Reliability of Cloud Computing

    Science.gov (United States)

    Islam, Muhammad Faysal

    2013-01-01

    Cloud computing offers the advantage of on-demand, reliable and cost efficient computing solutions without the capital investment and management resources to build and maintain in-house data centers and network infrastructures. Scalability of cloud solutions enable consumers to upgrade or downsize their services as needed. In a cloud environment,…

  11. Fog-computing concept usage as means to enhance information and control system reliability

    Science.gov (United States)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-05-01

    This paper focuses on the reliability issue of information and control systems (ICS). The authors propose using elements of the fog-computing concept to enhance the reliability function. The key idea of fog-computing is to shift computations to the fog layer of the network, and thus to decrease the workload of the communication environment and the data processing components. In an ICS, the workload can also be distributed among sensors, actuators and network infrastructure facilities near the sources of data. The authors simulated typical workload distribution situations for the “traditional” ICS architecture and for one using elements of the fog-computing concept. The paper contains some models, selected simulation results and conclusions about the prospects of fog-computing as a means to enhance ICS reliability.

  12. Reliability-Centric Analysis of Offloaded Computation in Cooperative Wearable Applications

    Directory of Open Access Journals (Sweden)

    Aleksandr Ometov

    2017-01-01

    Full Text Available Motivated by the unprecedented penetration of mobile communications technology, this work carefully brings into perspective the challenges related to heterogeneous communications and offloaded computation operating in cases of fault-tolerant computation, computing, and caching. We specifically focus on the emerging augmented reality applications that require reliable delegation of the computing and caching functionality to proximate resource-rich devices. The corresponding mathematical model proposed in this work becomes of value to assess system-level reliability in cases where one or more nearby collaborating nodes become temporarily unavailable. Our produced analytical and simulation results corroborate the asymptotic insensitivity of the stationary reliability of the system in question (under the “fast” recovery of its elements) to the type of the “repair” time distribution, thus supporting the fault-tolerant system operation.

  13. High reliability - low noise radionuclide signature identification algorithms for border security applications

    Science.gov (United States)

    Lee, Sangkyu

    Illicit trafficking and smuggling of radioactive materials and special nuclear materials (SNM) are considered to be among the most important recent global nuclear threats. Monitoring the transport and safety of radioisotopes and SNM is challenging due to their weak signals and easy shielding. Great efforts worldwide are focused on developing and improving detection technologies and algorithms for accurate and reliable detection of radioisotopes of interest, thus better securing the borders against nuclear threats. In general, radiation portal monitors enable detection of gamma and neutron emitting radioisotopes. Passive or active interrogation techniques, present and/or under development, are all aimed at increasing accuracy and reliability, at shortening the time of interrogation, and at reducing the cost of the equipment. Equally important efforts are aimed at advancing algorithms to process the imaging data in an efficient manner, providing reliable "readings" of the interiors of the examined volumes of various sizes, ranging from cargoes to suitcases. The main objective of this thesis is to develop two synergistic algorithms with the goal of providing highly reliable, low-noise identification of radioisotope signatures. These algorithms combine analysis of a passive radioactive detection technique with active interrogation imaging techniques such as gamma radiography or muon tomography. One algorithm consists of gamma spectroscopy and cosmic muon tomography, and the other algorithm is based on gamma spectroscopy and gamma radiography. The purpose of fusing two detection methodologies per algorithm is to find both heavy-Z radioisotopes and shielding materials, since radionuclides can be identified with gamma spectroscopy, and shielding materials can be detected using muon tomography or gamma radiography. These combined algorithms are created and analyzed based on numerically generated images of various cargo sizes and materials. In summary, the three detection

  14. Reliability issues related to the usage of Cloud Computing in Critical Infrastructures

    OpenAIRE

    Diez Gonzalez, Oscar Manuel; Silva Vazquez, Andrés

    2011-01-01

    The use of cloud computing is extending to all kinds of systems, including those that are part of Critical Infrastructures, and measuring their reliability is becoming more difficult. Computing is becoming the 5th utility, in part thanks to the use of cloud services. Cloud computing is now used by all types of systems and organizations, including critical infrastructure, creating hidden inter-dependencies on both public and private cloud models. This paper investigates the use of cloud co...

  15. Rater reliability and concurrent validity of the Keyboard Personal Computer Style instrument (K-PeCS).

    Science.gov (United States)

    Baker, Nancy A; Cook, James R; Redfern, Mark S

    2009-01-01

    This paper describes the inter-rater and intra-rater reliability, and the concurrent validity of an observational instrument, the Keyboard Personal Computer Style instrument (K-PeCS), which assesses stereotypical postures and movements associated with computer keyboard use. Three trained raters independently rated the video clips of 45 computer keyboard users to ascertain inter-rater reliability, and then re-rated a sub-sample of 15 video clips to ascertain intra-rater reliability. Concurrent validity was assessed by comparing the ratings obtained using the K-PeCS to scores developed from a 3D motion analysis system. The overall K-PeCS had excellent reliability [inter-rater: intra-class correlation coefficients (ICC)=.90; intra-rater: ICC=.92]. Most individual items on the K-PeCS had from good to excellent reliability, although six items fell below ICC=.75. Those K-PeCS items that were assessed for concurrent validity compared favorably to the motion analysis data for all but two items. These results suggest that most items on the K-PeCS can be used to reliably document computer keyboarding style.

  16. Reliability and validity of the Korean standard pattern identification for stroke (K-SPI-Stroke questionnaire

    Directory of Open Access Journals (Sweden)

    Kang Byoung-Kab

    2012-04-01

    Full Text Available Abstract Background The present study was conducted to examine the reliability and validity of the 'Korean Standard Pattern Identification for Stroke (K-SPI-Stroke)' questionnaire, which was developed and evaluated within the context of traditional Korean medicine (TKM). Methods Between September 2006 and December 2010, 2,905 patients from 11 Korean medical hospitals were asked to complete the K-SPI-Stroke questionnaire as a part of the project 'Fundamental study for the standardization and objectification of pattern identification in traditional Korean medicine for stroke (SOPI-Stroke)'. Each patient was independently diagnosed by two TKM physicians from the same site according to one of four patterns, as suggested by the Korea Institute of Oriental Medicine: (1) a Qi deficiency pattern, (2) a Dampness-phlegm pattern, (3) a Yin deficiency pattern, or (4) a Fire-heat pattern. We estimated the internal consistency using Cronbach's α coefficient, the discriminant validity using the mean scores of the patterns, and the predictive validity using the classification accuracy of the K-SPI-Stroke questionnaire. Results The K-SPI-Stroke questionnaire had satisfactory internal consistency (α = 0.700) and validity, with significant differences in the mean scores among the four patterns. The overall classification accuracy of this questionnaire was 65.2%. Conclusion These results suggest that the K-SPI-Stroke questionnaire is a reliable and valid instrument for estimating the severity of the four patterns.

  17. Ensemble of different approaches for a reliable person re-identification system

    Directory of Open Access Journals (Sweden)

    Loris Nanni

    2016-07-01

    Full Text Available An ensemble of approaches for reliable person re-identification is proposed in this paper. The proposed ensemble is built combining widely used person re-identification systems using different color spaces and some variants of state-of-the-art approaches that are proposed in this paper. Different descriptors are tested, and both texture and color features are extracted from the images; then the different descriptors are compared using different distance measures (e.g., the Euclidean distance, angle, and the Jeffrey distance). To improve performance, a method based on skeleton detection, extracted from the depth map, is also applied when the depth map is available. The proposed ensemble is validated on three widely used datasets (CAVIAR4REID, IAS, and VIPeR), keeping the same parameter set of each approach constant across all tests to avoid overfitting and to demonstrate that the proposed system can be considered a general-purpose person re-identification system. Our experimental results show that the proposed system offers significant improvements over baseline approaches. The source code used for the approaches tested in this paper will be available at https://www.dei.unipd.it/node/2357 and http://robotics.dei.unipd.it/reid/.
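    Matching two person descriptors comes down to evaluating a distance between feature vectors such as colour or texture histograms. The sketch below illustrates the three kinds of measures named above on hypothetical histograms; the Jeffrey distance is written here in its symmetric Kullback-Leibler form, which is one common formulation and may differ in detail from the one used by the authors.

        import numpy as np

        def euclidean(p, q):
            return float(np.linalg.norm(p - q))

        def angle(p, q):
            # angle between the two descriptors, in radians
            cos = np.dot(p, q) / (np.linalg.norm(p) * np.linalg.norm(q))
            return float(np.arccos(np.clip(cos, -1.0, 1.0)))

        def jeffrey(p, q, eps=1e-12):
            # symmetric Kullback-Leibler (Jeffrey) divergence between histograms
            p = p / p.sum() + eps
            q = q / q.sum() + eps
            return float(np.sum((p - q) * np.log(p / q)))

        h1 = np.array([0.20, 0.50, 0.30])   # hypothetical normalised histograms
        h2 = np.array([0.25, 0.45, 0.30])
        print(euclidean(h1, h2), angle(h1, h2), jeffrey(h1, h2))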

  18. HuRECA: Human Reliability Evaluator for Computer-based Control Room Actions

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Lee, Seung Jun; Jang, Seung Cheol

    2011-01-01

    As computer-based design features such as computer-based procedures (CBP), soft controls (SCs), and integrated information systems are being adopted in main control rooms (MCR) of nuclear power plants, a human reliability analysis (HRA) method capable of dealing with the effects of these design features on human reliability is needed. From the observations of human factors engineering verification and validation experiments, we have drawn some important characteristics of operator behaviors and design-related influencing factors (DIFs) from the perspective of human reliability. Firstly, there are new DIFs that should be considered in developing an HRA method for computer-based control rooms, especially CBP and SCs. In the case of the computer-based procedure rather than the paper-based procedure, the structural and managerial elements should be considered as important PSFs in addition to the procedural contents. In the case of the soft controllers, the so-called interface management tasks (or secondary tasks) should be reflected in the assessment of human error probability. Secondly, computer-based control rooms can provide more effective error recovery features than conventional control rooms. Major error recovery features for computer-based control rooms include the automatic logic checking function of the computer-based procedure and the information sharing feature of general computer-based designs.

  19. An analytical model for computation of reliability of waste management facilities with intermediate storages

    International Nuclear Information System (INIS)

    Kallweit, A.; Schumacher, F.

    1977-01-01

    High reliability is required of waste management facilities within the fuel cycle of nuclear power stations; this requirement can be fulfilled by providing intermediate storage facilities and reserve capacities. In this report a model based on the theory of Markov processes is described which allows computation of reliability characteristics of waste management facilities containing intermediate storage facilities. The application of the model is demonstrated by an example. (orig.) [de
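    A Markov model of this kind encodes the fill level of the intermediate store as the state of a continuous-time chain, with transition rates for batches arriving from the upstream plant and being processed downstream; reliability characteristics then follow from the stationary distribution. The sketch below solves a tiny hypothetical chain; the states and rates are purely illustrative and are not taken from the report.

        import numpy as np

        # States: 0, 1 or 2 batches held in an intermediate store of capacity 2
        arrival = 1.0   # batch arrival rate from the upstream facility (per day)
        service = 1.5   # batch processing rate of the downstream facility (per day)

        Q = np.array([[-arrival,              arrival,      0.0],
                      [ service, -(arrival + service),  arrival],
                      [     0.0,              service, -service]])

        # Stationary distribution: solve pi Q = 0 together with sum(pi) = 1
        A = np.vstack([Q.T, np.ones(3)])
        b = np.array([0.0, 0.0, 0.0, 1.0])
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)

        print("stationary distribution:", np.round(pi, 4))
        print("probability the store is full (arrivals blocked):", round(pi[2], 4))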

  20. Improving the reliability of nuclear reprocessing by application of computers and mathematical modelling

    International Nuclear Information System (INIS)

    Gabowitsch, E.; Trauboth, H.

    1982-01-01

    After a brief survey of the present and expected future state of nuclear energy utilization, which should demonstrate the significance of nuclear reprocessing, safety and reliability aspects of nuclear reprocessing plants (NRP) are considered. Then, the principal possibilities of modern computer technology, including computer systems architecture and application-oriented software, for improving the reliability and availability are outlined. In this context, two information systems being developed at the Nuclear Research Center Karlsruhe (KfK) are briefly described. For the design evaluation of certain areas of a large NRP, mathematical methods and computer-aided tools developed, used, or being designed by KfK are discussed. In conclusion, future research to be pursued in information processing and applied mathematics in support of reliable operation of NRPs is proposed. (Auth.)

  1. Computer-assisted photo identification outperforms visible implant elastomers in an endangered salamander, Eurycea tonkawae.

    Directory of Open Access Journals (Sweden)

    Nathan F Bendik

    Full Text Available Despite recognition that nearly one-third of the 6300 amphibian species are threatened with extinction, our understanding of the general ecology and population status of many amphibians is relatively poor. A widely-used method for monitoring amphibians involves injecting captured individuals with unique combinations of colored visible implant elastomer (VIE). We compared VIE identification to a less-invasive method - computer-assisted photographic identification (photoID) - in endangered Jollyville Plateau salamanders (Eurycea tonkawae), a species with a known range limited to eight stream drainages in central Texas. We based photoID on the unique pigmentation patterns on the dorsal head region of 1215 individual salamanders using identification software Wild-ID. We compared the performance of photoID methods to VIEs using both 'high-quality' and 'low-quality' images, which were taken using two different camera types and technologies. For high-quality images, the photoID method had a false rejection rate of 0.76% compared to 1.90% for VIEs. Using a comparable dataset of lower-quality images, the false rejection rate was much higher (15.9%). Photo matching scores were negatively correlated with time between captures, suggesting that evolving natural marks could increase misidentification rates in longer term capture-recapture studies. Our study demonstrates the utility of large-scale capture-recapture using photo identification methods for Eurycea and other species with stable natural marks that can be reliably photographed.

  2. Polynomial-time computability of the edge-reliability of graphs using Gilbert's formula

    Directory of Open Access Journals (Sweden)

    Thomas J. Marlowe

    1998-01-01

    Full Text Available Reliability is an important consideration in analyzing computer and other communication networks, but current techniques are extremely limited in the classes of graphs which can be analyzed efficiently. While Gilbert's formula establishes a theoretically elegant recursive relationship between the edge reliability of a graph and the reliability of its subgraphs, naive evaluation requires consideration of all sequences of deletions of individual vertices, and for many graphs has time complexity essentially Θ(N!). We discuss a general approach which significantly reduces complexity, encoding subgraph isomorphism in a finer partition by invariants, and recursing through the set of invariants.

  3. Reliability analysis of microcomputer boards and computer based systems important to safety of nuclear plants

    International Nuclear Information System (INIS)

    Shrikhande, S.V.; Patil, V.K.; Ganesh, G.; Biswas, B.; Patil, R.K.

    2010-01-01

    Computer Based Systems (CBS) are employed in Indian nuclear plants for protection, control and monitoring purpose. For forthcoming CBS, Reactor Control Division has designed and developed a new standardized family of microcomputer boards qualified to stringent requirements of nuclear industry. These boards form the basic building blocks of CBS. Reliability analysis of these boards is being carried out using analysis package based on MIL-STD-217Plus methodology. The estimated failure rate values of these standardized microcomputer boards will be useful for reliability assessment of these systems. The paper presents reliability analysis of microcomputer boards and case study of a CBS system built using these boards. (author)
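    When a board is treated as a series system of its parts, a parts-count style of estimate simply adds the individual failure rates, from which the MTBF and mission reliability follow. The sketch below is a generic illustration with hypothetical failure rates; it does not reproduce the MIL-STD-217Plus models or values used in the paper.

        import math

        # Hypothetical part failure rates, in failures per 10^6 hours
        part_lambdas = {"CPU": 0.25, "SRAM": 0.10, "FPGA": 0.30,
                        "DC/DC converter": 0.15, "connectors": 0.05}

        lam_board = sum(part_lambdas.values())     # series model: rates add
        mtbf_hours = 1e6 / lam_board               # convert from per-10^6 h
        mission_hours = 8760.0                     # one year of continuous operation
        reliability = math.exp(-lam_board * 1e-6 * mission_hours)

        print(f"board failure rate: {lam_board:.2f} per 10^6 h")
        print(f"MTBF: {mtbf_hours:,.0f} h, R(1 year) = {reliability:.4f}")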

  4. Probability of extreme interference levels computed from reliability approaches: application to transmission lines with uncertain parameters

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper deals with the risk analysis of an EMC failure using a statistical approach. It is based on reliability methods from probabilistic engineering mechanics. A computation of the probability of failure (i.e. the probability of exceeding a threshold) for a current induced by crosstalk is established by taking into account uncertainties in the input parameters influencing interference levels in the context of transmission lines. The study has allowed us to evaluate the probability of failure of the induced current by using reliability methods with a relatively low computational cost compared to Monte Carlo simulation. (authors)
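    The quantity of interest is the probability that an induced current exceeds a threshold under uncertain input parameters. As a baseline against which such reliability methods are usually compared, a crude Monte Carlo estimate takes only a few lines; the input distributions, the response function and the threshold below are purely illustrative assumptions, not the model of the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000

        # Hypothetical uncertain parameters of the transmission-line setup
        height = rng.normal(1.0, 0.1, n)       # conductor height (m)
        length = rng.normal(10.0, 1.0, n)      # coupled length (m)
        load_z = rng.lognormal(4.0, 0.2, n)    # load impedance (ohm)

        # Hypothetical response: crosstalk-induced current amplitude (mA)
        i_induced = 50.0 * length / (height * np.sqrt(load_z))

        threshold = 120.0                          # failure criterion (mA)
        p_fail = np.mean(i_induced > threshold)
        se = np.sqrt(p_fail * (1.0 - p_fail) / n)  # standard error of the estimate
        print(f"P(exceeding {threshold} mA) ~ {p_fail:.5f} +/- {se:.5f}")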

  5. Developing a personal computer based expert system for radionuclide identification

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Hakulinen, T.T.

    1990-01-01

    Several expert system development tools are available for personal computers today. We have used one of the LISP-based high-end tools for nearly two years in developing an expert system for the identification of gamma sources. The system contains a radionuclide database of 2055 nuclides and 48000 gamma transitions with a knowledge base of about sixty rules. This application combines a LISP-based inference engine with database management and relatively heavy numerical calculations performed using the C language. The most important feature has been the ability to use LISP and C together with the more advanced object-oriented features of the development tool. The main difficulties have been long response times and the large amount (10-16 MB) of computer memory required.
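    The numerical core of such an identification system is the matching of measured gamma-ray peaks against a nuclide line library within an energy tolerance. The sketch below shows a deliberately tiny, rule-of-thumb version of that matching step; the mini-library, tolerance and peak energies are illustrative and are not taken from the 2055-nuclide database described above.

        # Hypothetical mini-library: nuclide -> prominent gamma energies (keV)
        LIBRARY = {
            "Cs-137": [661.7],
            "Co-60": [1173.2, 1332.5],
            "Na-22": [511.0, 1274.5],
        }

        def score_candidates(measured_kev, tol=2.0):
            # score = fraction of a nuclide's library lines found in the measurement
            scores = {}
            for nuclide, lines in LIBRARY.items():
                hits = sum(any(abs(m - e) <= tol for m in measured_kev) for e in lines)
                scores[nuclide] = hits / len(lines)
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        peaks = [661.5, 1173.0, 1332.8]   # hypothetical output of a peak search
        for nuclide, score in score_candidates(peaks):
            print(f"{nuclide}: {score:.2f}")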

  6. Precision of lumbar intervertebral measurements: does a computer-assisted technique improve reliability?

    Science.gov (United States)

    Pearson, Adam M; Spratt, Kevin F; Genuario, James; McGough, William; Kosman, Katherine; Lurie, Jon; Sengupta, Dilip K

    2011-04-01

    Comparison of intra- and interobserver reliability of digitized manual and computer-assisted intervertebral motion measurements and classification of "instability." To determine if computer-assisted measurement of lumbar intervertebral motion on flexion-extension radiographs improves reliability compared with digitized manual measurements. Many studies have questioned the reliability of manual intervertebral measurements, although few have compared the reliability of computer-assisted and manual measurements on lumbar flexion-extension radiographs. Intervertebral rotation, anterior-posterior (AP) translation, and change in anterior and posterior disc height were measured with a digitized manual technique by three physicians and by three other observers using computer-assisted quantitative motion analysis (QMA) software. Each observer measured 30 sets of digital flexion-extension radiographs (L1-S1) twice. Shrout-Fleiss intraclass correlation coefficients for intra- and interobserver reliabilities were computed. The stability of each level was also classified (instability defined as >4 mm AP translation or 10° rotation), and the intra- and interobserver reliabilities of the two methods were compared using adjusted percent agreement (APA). Intraobserver reliability intraclass correlation coefficients were substantially higher for the QMA technique than the digitized manual technique across all measurements: rotation 0.997 versus 0.870, AP translation 0.959 versus 0.557, change in anterior disc height 0.962 versus 0.770, and change in posterior disc height 0.951 versus 0.283. The same pattern was observed for interobserver reliability (rotation 0.962 vs. 0.693, AP translation 0.862 vs. 0.151, change in anterior disc height 0.862 vs. 0.373, and change in posterior disc height 0.730 vs. 0.300). The QMA technique was also more reliable for the classification of "instability." Intraobserver APAs ranged from 87 to 97% for QMA versus 60% to 73% for digitized manual
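    The instability rule quoted above (more than 4 mm of AP translation or 10° of rotation) and a between-session agreement check are easy to express explicitly. The sketch below applies the rule to hypothetical measurements and reports plain percent agreement; the paper itself used an adjusted percent agreement, which is not reproduced here.

        def unstable(ap_translation_mm, rotation_deg):
            # instability criterion as stated above; the strictness of the
            # rotation cut-off (> vs >=) is an assumption
            return ap_translation_mm > 4.0 or rotation_deg > 10.0

        # Hypothetical (AP translation mm, rotation deg) per level, two sessions
        session_1 = [(2.1, 4.0), (5.2, 3.0), (1.0, 11.0), (3.9, 8.0)]
        session_2 = [(2.4, 5.0), (4.6, 2.5), (0.8, 12.5), (4.3, 9.0)]

        labels_1 = [unstable(t, r) for t, r in session_1]
        labels_2 = [unstable(t, r) for t, r in session_2]
        agreement = sum(a == b for a, b in zip(labels_1, labels_2)) / len(labels_1)
        print(f"percent agreement on 'instability': {100 * agreement:.0f}%")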

  7. Automatic Identification of the Repolarization Endpoint by Computing the Dominant T-wave on a Reduced Number of Leads.

    Science.gov (United States)

    Giuliani, C; Agostinelli, A; Di Nardo, F; Fioretti, S; Burattini, L

    2016-01-01

    Electrocardiographic (ECG) T-wave endpoint (Tend) identification suffers from a lack of reliability due to the presence of noise and variability among leads. Tend identification can be improved by using global repolarization waveforms obtained by combining several leads. The dominant T-wave (DTW) is a global repolarization waveform that proved to improve Tend identification when computed using the 15 (I to III, aVr, aVl, aVf, V1 to V6, X, Y, Z) leads usually available in clinics, of which only 8 (I, II, V1 to V6) are independent. The aim of the present study was to evaluate if the 8 independent leads are sufficient to obtain a DTW which allows a reliable Tend identification. To this aim Tend measures automatically identified from 15-dependent-lead DTWs of 46 control healthy subjects (CHS) and 103 acute myocardial infarction patients (AMIP) were compared with those obtained from 8-independent-lead DTWs. Results indicate that the Tend distributions do not have statistically different median values (CHS: 340 ms vs. 340 ms, respectively; AMIP: 325 ms vs. 320 ms, respectively) and are strongly correlated (CHS: ρ=0.97; AMIP: ρ=0.88). Thus, for automatic Tend identification from the DTW, the 8 independent leads can be used without a statistically significant loss of accuracy but with a significant decrement of computational effort. The lead dependence of 7 out of the 15 leads does not introduce a significant bias in the Tend determination from 15-dependent-lead DTWs.
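    A global repolarization waveform of this kind can be built by decomposing the multi-lead T-wave segment and keeping only its dominant component. The sketch below constructs a surrogate dominant waveform as the leading singular component of an 8-lead matrix of synthetic T-waves; it is a simplified stand-in meant only to illustrate the idea of combining leads, not the authors' exact DTW algorithm, and all signals are synthetic.

        import numpy as np

        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 0.4, 400)                   # 400 ms repolarization window
        template = np.exp(-((t - 0.25) ** 2) / 0.004)    # idealized T-wave shape

        # Synthetic 8 independent leads: lead-specific gain plus noise
        gains = rng.uniform(0.3, 1.2, size=8)
        leads = np.outer(gains, template) + 0.02 * rng.standard_normal((8, t.size))

        # Leading component across leads via SVD (rows = leads, columns = samples)
        u, s, vt = np.linalg.svd(leads, full_matrices=False)
        dominant_wave = s[0] * vt[0]                     # surrogate "dominant T-wave"

        energy_fraction = float(s[0] ** 2 / np.sum(s ** 2))
        print("energy captured by the leading component:", round(energy_fraction, 3))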

  8. The computer vision in the service of safety and reliability in steam generators inspection services

    International Nuclear Information System (INIS)

    Pineiro Fernandez, P.; Garcia Bueno, A.; Cabrera Jordan, E.

    2012-01-01

    Computer vision has matured very quickly over the last ten years, facilitating new developments in various areas of nuclear application and making it possible to automate and simplify processes and tasks efficiently, either in place of or in collaboration with people and equipment. Current computer vision (a more appropriate term than artificial vision) also offers great possibilities for improving the reliability and safety of NPP inspection systems.

  9. Optimal reliability allocation for large software projects through soft computing techniques

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albeanu, Grigore; Popentiu-Vladicescu, Florin

    2012-01-01

    or maximizing the system reliability subject to budget constraints. These kinds of optimization problems were considered both in deterministic and stochastic frameworks in literature. Recently, the intuitionistic-fuzzy optimization approach has been considered a successful soft computing modelling approach.... Firstly, a review on existing soft computing approaches to optimization is given. The main section extends the results considering self-organizing migrating algorithms for solving intuitionistic-fuzzy optimization problems attached to complex fault-tolerant software architectures which proved

  10. Polynomial-time computability of the edge-reliability of graphs using Gilbert's formula

    Directory of Open Access Journals (Sweden)

    Marlowe Thomas J.

    1998-01-01

    Full Text Available Reliability is an important consideration in analyzing computer and other communication networks, but current techniques are extremely limited in the classes of graphs which can be analyzed efficiently. While Gilbert's formula establishes a theoretically elegant recursive relationship between the edge reliability of a graph and the reliability of its subgraphs, naive evaluation requires consideration of all sequences of deletions of individual vertices, and for many graphs has time complexity essentially Θ(N!). We discuss a general approach which significantly reduces complexity, encoding subgraph isomorphism in a finer partition by invariants, and recursing through the set of invariants. We illustrate this approach using threshold graphs, and show that any computation of reliability using Gilbert's formula will be polynomial-time if and only if the number of invariants considered is polynomial; we then show families of graphs with polynomial-time and with non-polynomial-time reliability computation, and show that these encompass most previously known results. We then codify our approach to indicate how it can be used for other classes of graphs, and suggest several classes to which the technique can be applied.
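    For very small graphs, edge reliability can also be obtained by brute force, enumerating all 2^m states of the m edges; this is exponential but gives a handy correctness check for faster schemes such as the invariant-based recursion described above. The sketch below computes all-terminal reliability (the probability that the surviving edges keep the graph connected) for a hypothetical 4-node network; it is a reference implementation only, not Gilbert's recursion.

        from itertools import combinations

        def connected(nodes, edges):
            # depth-first search over the surviving edges
            if not nodes:
                return True
            seen, stack = set(), [next(iter(nodes))]
            while stack:
                v = stack.pop()
                if v in seen:
                    continue
                seen.add(v)
                stack.extend(w for a, b in edges for w in (a, b)
                             if v in (a, b) and w not in seen)
            return seen == set(nodes)

        def all_terminal_reliability(nodes, edges, p):
            # p: probability that each edge works, independently of the others
            m = len(edges)
            rel = 0.0
            for k in range(m + 1):
                for surviving in combinations(edges, k):
                    if connected(nodes, surviving):
                        rel += p ** k * (1.0 - p) ** (m - k)
            return rel

        nodes = {0, 1, 2, 3}
        edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # hypothetical network
        print(round(all_terminal_reliability(nodes, edges, 0.9), 5))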

  11. Review of advances in human reliability analysis of errors of commission, Part 1: EOC identification

    International Nuclear Information System (INIS)

    Reer, Bernhard

    2008-01-01

    In close connection with examples relevant to contemporary probabilistic safety assessment (PSA), a review of advances in human reliability analysis (HRA) of post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions, has been carried out. The review comprises both EOC identification (part 1) and quantification (part 2); part 1 is presented in this article. Emerging HRA methods addressing the problem of EOC identification are: A Technique for Human Event Analysis (ATHEANA), the EOC HRA method developed by Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), the Misdiagnosis Tree Analysis (MDTA) method, and the Commission Errors Search and Assessment (CESA) method. Most of the EOCs referred to in predictive studies comprise the stop of running or the inhibition of anticipated functions; a few comprise the start of a function. The CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios and uses procedures and importance measures as key sources of input information, provides a formalized way of identifying relatively important scenarios with EOC opportunities. In the implementation, however, attention should be paid to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions.

  12. Reliable multicast for the Grid: a case study in experimental computer science.

    Science.gov (United States)

    Nekovee, Maziar; Barcellos, Marinho P; Daw, Michael

    2005-08-15

    In its simplest form, multicast communication is the process of sending data packets from a source to multiple destinations in the same logical multicast group. IP multicast allows the efficient transport of data through wide-area networks, and its potentially great value for the Grid has been highlighted recently by a number of research groups. In this paper, we focus on the use of IP multicast in Grid applications, which require high-throughput reliable multicast. These include Grid-enabled computational steering and collaborative visualization applications, and wide-area distributed computing. We describe the results of our extensive evaluation studies of state-of-the-art reliable-multicast protocols, which were performed on the UK's high-speed academic networks. Based on these studies, we examine the ability of current reliable multicast technology to meet the Grid's requirements and discuss future directions.

  13. Rapid and reliable detection and identification of GM events using multiplex PCR coupled with oligonucleotide microarray.

    Science.gov (United States)

    Xu, Xiaodan; Li, Yingcong; Zhao, Heng; Wen, Si-yuan; Wang, Sheng-qi; Huang, Jian; Huang, Kun-lun; Luo, Yun-bo

    2005-05-18

    To devise a rapid and reliable method for the detection and identification of genetically modified (GM) events, we developed a multiplex polymerase chain reaction (PCR) coupled with a DNA microarray system simultaneously aiming at many targets in a single reaction. The system included probes for screening gene, species reference gene, specific gene, construct-specific gene, event-specific gene, and internal and negative control genes. 18S rRNA was combined with species reference genes as internal controls to assess the efficiency of all reactions and to eliminate false negatives. Two sets of the multiplex PCR system were used to amplify four and five targets, respectively. Eight different structure genes could be detected and identified simultaneously for Roundup Ready soybean in a single microarray. The microarray specificity was validated by its ability to discriminate two GM maizes Bt176 and Bt11. The advantages of this method are its high specificity and greatly reduced false-positives and -negatives. The multiplex PCR coupled with microarray technology presented here is a rapid and reliable tool for the simultaneous detection of GM organism ingredients.

  14. Automated bony region identification using artificial neural networks: reliability and validation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Gassman, Esther E.; Kallemeyn, Nicole A.; DeVries, Nicole A.; Shivanna, Kiran H. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); The University of Iowa, Center for Computer-Aided Design, Iowa City, IA (United States); Powell, Stephanie M. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); University of Iowa Hospitals and Clinics, The University of Iowa, Department of Radiology, Iowa City, IA (United States); Magnotta, Vincent A. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); The University of Iowa, Center for Computer-Aided Design, Iowa City, IA (United States); University of Iowa Hospitals and Clinics, The University of Iowa, Department of Radiology, Iowa City, IA (United States); Ramme, Austin J. [University of Iowa Hospitals and Clinics, The University of Iowa, Department of Radiology, Iowa City, IA (United States); Adams, Brian D. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); University of Iowa Hospitals and Clinics, The University of Iowa, Department of Orthopaedics and Rehabilitation, Iowa City, IA (United States); Grosland, Nicole M. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); University of Iowa Hospitals and Clinics, The University of Iowa, Department of Orthopaedics and Rehabilitation, Iowa City, IA (United States); The University of Iowa, Center for Computer-Aided Design, Iowa City, IA (United States)

    2008-04-15

    The objective was to develop tools for automating the identification of bony structures, to assess the reliability of this technique against manual raters, and to validate the resulting regions of interest against physical surface scans obtained from the same specimen. Artificial intelligence-based algorithms have been used for image segmentation, specifically artificial neural networks (ANNs). For this study, an ANN was created and trained to identify the phalanges of the human hand. The relative overlap between the ANN and a manual tracer was 0.87, 0.82, and 0.76 for the proximal, middle, and distal index phalanx bones respectively. Compared with the physical surface scans, the ANN-generated surface representations differed on average by 0.35 mm, 0.29 mm, and 0.40 mm for the proximal, middle, and distal phalanges respectively. Furthermore, the ANN proved to segment the structures in less than one-tenth of the time required by a manual rater. The ANN has proven to be a reliable and valid means of segmenting the phalanx bones from CT images. Employing automated methods such as the ANN for segmentation eliminates the likelihood of rater drift and inter-rater variability. Automated methods also decrease the amount of time and manual effort required to extract the data of interest, thereby making the feasibility of patient-specific modeling a reality. (orig.)

  15. Automated bony region identification using artificial neural networks: reliability and validation measurements

    International Nuclear Information System (INIS)

    Gassman, Esther E.; Kallemeyn, Nicole A.; DeVries, Nicole A.; Shivanna, Kiran H.; Powell, Stephanie M.; Magnotta, Vincent A.; Ramme, Austin J.; Adams, Brian D.; Grosland, Nicole M.

    2008-01-01

    The objective was to develop tools for automating the identification of bony structures, to assess the reliability of this technique against manual raters, and to validate the resulting regions of interest against physical surface scans obtained from the same specimen. Artificial intelligence-based algorithms have been used for image segmentation, specifically artificial neural networks (ANNs). For this study, an ANN was created and trained to identify the phalanges of the human hand. The relative overlap between the ANN and a manual tracer was 0.87, 0.82, and 0.76 for the proximal, middle, and distal index phalanx bones respectively. Compared with the physical surface scans, the ANN-generated surface representations differed on average by 0.35 mm, 0.29 mm, and 0.40 mm for the proximal, middle, and distal phalanges respectively. Furthermore, the ANN proved to segment the structures in less than one-tenth of the time required by a manual rater. The ANN has proven to be a reliable and valid means of segmenting the phalanx bones from CT images. Employing automated methods such as the ANN for segmentation eliminates the likelihood of rater drift and inter-rater variability. Automated methods also decrease the amount of time and manual effort required to extract the data of interest, thereby making the feasibility of patient-specific modeling a reality. (orig.)
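    The relative overlap values quoted above compare a pair of binary segmentations voxel by voxel. A minimal sketch of one standard overlap measure, the Dice coefficient, is shown below on tiny hypothetical masks; treating the paper's "relative overlap" as a Dice-style measure is an assumption, since the abstract does not restate the exact formula.

        import numpy as np

        def dice_overlap(mask_a, mask_b):
            # 2|A n B| / (|A| + |B|) for two boolean segmentation masks
            a = np.asarray(mask_a, dtype=bool)
            b = np.asarray(mask_b, dtype=bool)
            denom = a.sum() + b.sum()
            return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom

        ann_mask = np.array([[0, 1, 1, 0],      # hypothetical ANN segmentation
                             [0, 1, 1, 1],
                             [0, 0, 1, 1]])
        manual_mask = np.array([[0, 1, 1, 0],   # hypothetical manual tracing
                                [0, 1, 1, 0],
                                [0, 0, 1, 1]])
        print(round(dice_overlap(ann_mask, manual_mask), 3))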

  16. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Science.gov (United States)

    2010-10-01

    ... operation of the software to display a restrictive rights legend or other license notice; and (2) Requires a... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and...

  17. The Alcohol Use Disorders Identification Test (AUDIT): reliability and validity of the Greek version.

    Science.gov (United States)

    Moussas, George; Dadouti, Georgia; Douzenis, Athanassios; Poulis, Evangelos; Tzelembis, Athanassios; Bratis, Dimitris; Christodoulou, Christos; Lykouras, Lefteris

    2009-05-14

    Problems associated with alcohol abuse are recognised by the World Health Organization as a major health issue, which according to most recent estimations is responsible for 1.4% of the total world burden of morbidity and has been proven to increase mortality risk by 50%. Because of the size and severity of the problem, early detection is very important. This requires easy to use and specific tools. One of these is the Alcohol Use Disorders Identification Test (AUDIT). This study aims to standardise the questionnaire in a Greek population. AUDIT was translated and back-translated from its original language by two English-speaking psychiatrists. The tool contains 10 questions. A score ≥ 11 is an indication of serious abuse/dependence. In the study, 218 subjects took part: 128 were males and 90 females. The average age was 40.71 years (+/- 11.34). From the 218 individuals, 109 (75 male, 34 female) fulfilled the criteria for alcohol dependence according to the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV), and presented requesting admission; 109 subjects (53 male, 56 female) were healthy controls. Internal reliability (Cronbach alpha) was 0.80 for the controls and 0.80 for the alcohol-dependent individuals. Controls had significantly lower average scores (t test). The sensitivity of the tool at a cut-off score of 8 was 0.98 and its specificity was 0.94 for the same score. For the alcohol-dependent sample 3% scored as false negatives and from the control group 1.8% scored false positives. In the alcohol-dependent sample there was no difference between males and females in their average scores (t test P > 0.05). The Greek version of AUDIT has increased internal reliability and validity. It detects 97% of the alcohol-dependent individuals and has a high sensitivity and specificity. AUDIT is easy to use, quick and reliable and can be very useful in detecting alcohol problems in sensitive populations.

  18. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

    Roughly ~50% of the human genome contains noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well studied category of regulatory elements is the category of enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an interesting area of research and up to now several approaches have been proposed. However, the current state-of-the-art methods face limitations since the function of enhancers has been clarified but their mechanism of action is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks that we present in different chapters. First, since many of the enhancers' functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and we survey comprehensively over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate more on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze advantages and disadvantages of existing solutions and we report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell-lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancers' content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational

  19. Reference gene identification for reliable normalisation of quantitative RT-PCR data in Setaria viridis.

    Science.gov (United States)

    Nguyen, Duc Quan; Eamens, Andrew L; Grof, Christopher P L

    2018-01-01

    Quantitative real-time polymerase chain reaction (RT-qPCR) is the key platform for the quantitative analysis of gene expression in a wide range of experimental systems and conditions. However, the accuracy and reproducibility of gene expression quantification via RT-qPCR is entirely dependent on the identification of reliable reference genes for data normalisation. Green foxtail (Setaria viridis) has recently been proposed as a potential experimental model for the study of C4 photosynthesis and is closely related to many economically important crop species of the Panicoideae subfamily of grasses, including Zea mays (maize), Sorghum bicolor (sorghum) and Saccharum officinarum (sugarcane). Setaria viridis (Accession 10) possesses a number of key traits as an experimental model, namely: (i) a small, sequenced and well-annotated genome; (ii) short stature and generation time; (iii) prolific seed production; and (iv) amenability to Agrobacterium tumefaciens-mediated transformation. There is currently, however, a lack of reference gene expression information for Setaria viridis (S. viridis). We therefore aimed to identify a cohort of suitable S. viridis reference genes for accurate and reliable normalisation of S. viridis RT-qPCR expression data. Eleven putative candidate reference genes were identified and examined across thirteen different S. viridis tissues. Of these, the geNorm and NormFinder analysis software identified SERINE/THREONINE-PROTEIN PHOSPHATASE 2A (PP2A), 5'-ADENYLYLSULFATE REDUCTASE 6 (ASPR6) and DUAL SPECIFICITY PHOSPHATASE (DUSP) as the most suitable combination of reference genes for the accurate and reliable normalisation of S. viridis RT-qPCR expression data. To demonstrate the suitability of the three selected reference genes, PP2A, ASPR6 and DUSP were used to normalise the expression of CINNAMYL ALCOHOL DEHYDROGENASE (CAD) genes across the same tissues. This approach readily demonstrated the suitability of the three

  20. Identification of double-yolked duck egg using computer vision.

    Directory of Open Access Journals (Sweden)

    Long Ma

    Full Text Available The double-yolked (DY) egg is quite popular in some Asian countries because it is considered as a sign of good luck; however, the double yolk is one of the reasons why these eggs fail to hatch. The usage of automatic methods for identifying DY eggs can increase the efficiency in the poultry industry by decreasing egg loss during incubation or improving sale proceeds. In this study, two methods for DY duck egg identification were developed by using computer vision technology. Transmittance images of DY and single-yolked (SY) duck eggs were acquired by a CCD camera to identify them according to their shape features. The Fisher's linear discriminant (FLD) model equipped with a set of normalized Fourier descriptors (NFDs) extracted from the acquired images and the convolutional neural network (CNN) model using primary preprocessed images were built to recognize duck egg yolk types. The classification accuracies of the FLD model for SY and DY eggs were 100% and 93.2% respectively, while the classification accuracies of the CNN model for SY and DY eggs were 98% and 98.8% respectively. The CNN-based algorithm took about 0.12 s to recognize one sample image, which was slightly faster than the FLD-based (about 0.20 s). Finally, this work compared the two classification methods and provided the better method for DY egg identification.
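    Normalized Fourier descriptors of an egg contour can be computed from the complex-valued boundary coordinates; zeroing the DC term and dividing by the first harmonic makes them invariant to translation and scale. The sketch below is a generic NFD computation on a synthetic egg-like contour; it is an assumption-level illustration, not the exact descriptor set fed to the FLD model above.

        import numpy as np

        def normalized_fourier_descriptors(x, y, n_desc=8):
            # translation- and scale-invariant descriptors of a closed contour
            z = np.asarray(x, dtype=float) + 1j * np.asarray(y, dtype=float)
            coeffs = np.fft.fft(z)
            coeffs[0] = 0.0                        # remove translation (DC term)
            coeffs = coeffs / np.abs(coeffs[1])    # remove scale
            return np.abs(coeffs[1:n_desc + 1])    # magnitudes are rotation-invariant

        theta = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
        x = 1.4 * np.cos(theta)                    # hypothetical egg-like contour
        y = 1.0 * np.sin(theta) + 0.1 * np.sin(2.0 * theta)
        print(np.round(normalized_fourier_descriptors(x, y), 4))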

  1. Establishment of a protein frequency library and its application in the reliable identification of specific protein interaction partners.

    Science.gov (United States)

    Boulon, Séverine; Ahmad, Yasmeen; Trinkle-Mulcahy, Laura; Verheggen, Céline; Cobley, Andy; Gregor, Peter; Bertrand, Edouard; Whitehorn, Mark; Lamond, Angus I

    2010-05-01

    The reliable identification of protein interaction partners and how such interactions change in response to physiological or pathological perturbations is a key goal in most areas of cell biology. Stable isotope labeling with amino acids in cell culture (SILAC)-based mass spectrometry has been shown to provide a powerful strategy for characterizing protein complexes and identifying specific interactions. Here, we show how SILAC can be combined with computational methods drawn from the business intelligence field for multidimensional data analysis to improve the discrimination between specific and nonspecific protein associations and to analyze dynamic protein complexes. A strategy is shown for developing a protein frequency library (PFL) that improves on previous use of static "bead proteomes." The PFL annotates the frequency of detection in co-immunoprecipitation and pulldown experiments for all proteins in the human proteome. It can provide a flexible and objective filter for discriminating between contaminants and specifically bound proteins and can be used to normalize data values and facilitate comparisons between data obtained in separate experiments. The PFL is a dynamic tool that can be filtered for specific experimental parameters to generate a customized library. It will be continuously updated as data from each new experiment are added to the library, thereby progressively enhancing its utility. The application of the PFL to pulldown experiments is especially helpful in identifying either lower abundance or less tightly bound specific components of protein complexes that are otherwise lost among the large, nonspecific background.
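    The filtering idea described above can be stated very simply: a protein detected in a pulldown is treated as a likely contaminant if it appears in a large fraction of the experiments recorded in the frequency library. The sketch below uses a hypothetical in-memory library and cut-off; the real PFL is a continuously updated, multidimensional database rather than a dictionary, and the protein names and frequencies here are invented for illustration.

        # Hypothetical frequency library: protein -> fraction of pulldowns containing it
        PFL = {"TUBB": 0.92, "ACTB": 0.95, "HSPA8": 0.70,
               "COIL": 0.04, "NOLC1": 0.06, "SMN1": 0.02}

        def filter_specific(hits, max_frequency=0.25):
            # keep hits that are rare across the library (candidate specific partners)
            return [p for p in hits if PFL.get(p, 0.0) <= max_frequency]

        pulldown_hits = ["ACTB", "COIL", "NOLC1", "HSPA8", "SMN1"]
        print(filter_specific(pulldown_hits))   # expected: ['COIL', 'NOLC1', 'SMN1']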

  2. RSAM: An enhanced architecture for achieving web services reliability in mobile cloud computing

    Directory of Open Access Journals (Sweden)

    Amr S. Abdelfattah

    2018-04-01

    Full Text Available The evolution of the mobile landscape is coupled with the ubiquitous nature of the internet, with its intermittent wireless connectivity, and with web services. Achieving web service reliability results in low communication overhead and retrieval of the appropriate response. The middleware approach (MA) is widely used to achieve web service reliability. This paper proposes a Reliable Service Architecture using Middleware (RSAM) that achieves reliable web service consumption. The enhanced architecture focuses on ensuring and tracking the request execution under communication limitations and temporary service unavailability. It considers the main measurement factors, including: request size, response size, and consuming time. We conducted experiments to compare the enhanced architecture with the traditional one. In these experiments, we covered several cases to prove the achievement of reliability. Results also show that the request size was found to be constant, the response size is identical to the traditional architecture, and the increase in the consuming time was less than 5% of the transaction time with the different response sizes. Keywords: Reliable web service, Middleware architecture, Mobile cloud computing

  3. Identification of critical parameters for PEMFC stack performance characterization and control strategies for reliable and comparable stack benchmarking

    DEFF Research Database (Denmark)

    Mitzel, Jens; Gülzow, Erich; Kabza, Alexander

    2016-01-01

    This paper is focused on the identification of critical parameters and on the development of reliable methodologies to achieve comparable benchmark results. Possibilities for control sensor positioning and for parameter variation in sensitivity tests are discussed and recommended options for the ...

  4. The Alcohol Use Disorders Identification Test (AUDIT: reliability and validity of the Greek version

    Directory of Open Access Journals (Sweden)

    Bratis Dimitris

    2009-05-01

    Full Text Available Abstract Background Problems associated with alcohol abuse are recognised by the World Health Organization as a major health issue, which according to most recent estimations is responsible for 1.4% of the total world burden of morbidity and has been proven to increase mortality risk by 50%. Because of the size and severity of the problem, early detection is very important. This requires easy to use and specific tools. One of these is the Alcohol Use Disorders Identification Test (AUDIT). Aim This study aims to standardise the questionnaire in a Greek population. Methods AUDIT was translated and back-translated from its original language by two English-speaking psychiatrists. The tool contains 10 questions. A score ≥ 11 is an indication of serious abuse/dependence. In the study, 218 subjects took part: 128 were males and 90 females. The average age was 40.71 years (± 11.34). From the 218 individuals, 109 (75 male, 34 female) fulfilled the criteria for alcohol dependence according to the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV), and presented requesting admission; 109 subjects (53 male, 56 female) were healthy controls. Results Internal reliability (Cronbach α) was 0.80 for the controls and 0.80 for the alcohol-dependent individuals. Controls had significantly lower average scores (t test). The sensitivity of the tool at a cut-off score of 8 was 0.98 and its specificity was 0.94 for the same score. For the alcohol-dependent sample 3% scored as false negatives and from the control group 1.8% scored false positives. In the alcohol-dependent sample there was no difference between males and females in their average scores (t test P > 0.05). Conclusion The Greek version of AUDIT has increased internal reliability and validity. It detects 97% of the alcohol-dependent individuals and has a high sensitivity and specificity. AUDIT is easy to use, quick and reliable and can be very useful in detecting alcohol problems in sensitive populations.

  5. Reliability Assessment of Cloud Computing Platform Based on Semiquantitative Information and Evidential Reasoning

    Directory of Open Access Journals (Sweden)

    Hang Wei

    2016-01-01

    Full Text Available A reliability assessment method based on the evidential reasoning (ER) rule and semiquantitative information is proposed in this paper, where a new reliability assessment architecture covering four aspects with both quantitative data and qualitative knowledge is established. The assessment architecture is more objective in describing a complex, dynamic cloud computing environment than that of traditional methods. In addition, the ER rule, which performs well on multiple attribute decision making problems, is employed to integrate the different types of attributes in the assessment architecture, which can yield more accurate assessment results. The assessment results of the case study on an actual cloud computing platform verify the effectiveness and the advantage of the proposed method.

  6. Virtualization of Legacy Instrumentation Control Computers for Improved Reliability, Operational Life, and Management.

    Science.gov (United States)

    Katz, Jonathan E

    2017-01-01

    Laboratories tend to be amenable environments for long-term reliable operation of scientific measurement equipment. Indeed, it is not uncommon to find equipment 5, 10, or even 20+ years old still being routinely used in labs. Unfortunately, the Achilles heel for many of these devices is the control/data acquisition computer. Often these computers run older operating systems (e.g., Windows XP) and, while they might only use standard network, USB or serial ports, they require proprietary software to be installed. Even if the original installation disks can be found, it is a burdensome process to reinstall and is fraught with "gotchas" that can derail the process: lost license keys, incompatible hardware, forgotten configuration settings, etc. If you have legacy instrumentation running, the computer is the ticking time bomb waiting to put a halt to your operation. In this chapter, I describe how to virtualize your currently running control computer. This virtualized computer "image" is easy to maintain, easy to back up and easy to redeploy. I have used this multiple times in my own lab to greatly improve the robustness of my legacy devices. After completing the steps in this chapter, you will have your original control computer as well as a virtual instance of that computer with all the software installed, ready to control your hardware should your original computer ever be decommissioned.

  7. Reliable methods for computer simulation error control and a posteriori estimates

    CERN Document Server

    Neittaanmäki, P

    2004-01-01

    Recent decades have seen a very rapid success in developing numerical methods based on explicit control over approximation errors. It may be said that nowadays a new direction is forming in numerical analysis, the main goal of which is to develop methods of reliable computations. In general, a reliable numerical method must solve two basic problems: (a) generate a sequence of approximations that converges to a solution and (b) verify the accuracy of these approximations. A computer code for such a method must consist of two respective blocks: solver and checker. In this book, we are chie

  8. Simple and reliable identification of the human round spermatid by inverted phase-contrast microscopy.

    Science.gov (United States)

    Verheyen, G; Crabbé, E; Joris, H; Van Steirteghem, A

    1998-06-01

    Based on the results of animal studies, round spermatid injection (ROSI) has been introduced into the clinical practice of several in-vitro fertilization (IVF) centres. The efficiency of this procedure in terms of fertilization rates and pregnancy rates, however, remains very poor. An essential aspect which does not receive enough attention is the correct identification of this type of round cell within a heterogeneous population of testicular cells. A Nikon inverted microscope equipped with phase-contrast optics (DLL) provided a clear image which allowed reliable recognition of round spermatids in cell suspensions smeared at the glass bottom of the dish. Fluorescent in-situ hybridization confirmed the haploid status of the selected cells. However, exploration of several biopsies from patients with non-obstructive azoospermia showing no spermatozoa after extensive search did not reveal any round spermatids. This observation questions whether enough effort is spent on searching for mature spermatozoa or late spermatids. Experimental investigations should precede the introduction of ROSI into the clinical practice of any IVF centre.

  9. Review of the reliability of Bruce 'B' RRS dual computer system

    International Nuclear Information System (INIS)

    Arsenault, J.E.; Manship, R.A.; Levan, D.G.

    1995-07-01

    The review presents an analysis of the Bruce 'B' Reactor Regulating System (RRS) Digital Control Computer (DCC) system, based on system documentation, significant event reports (SERs), question sets, and a site visit. The intent is to evaluate the reliability of the RRS DCC and to identify the possible scenarios that could lead to a serious process failure. The evaluation is based on three relatively independent analyses, which are integrated and presented in the form of Conclusions and Recommendations

  10. Identification of Nasal Bone Fractures on Conventional Radiography and Facial CT: Comparison of the Diagnostic Accuracy in Different Imaging Modalities and Analysis of Interobserver Reliability

    International Nuclear Information System (INIS)

    Baek, Hye Jin; Kim, Dong Wook; Ryu, Ji Hwa; Lee, Yoo Jin

    2013-01-01

    There has been no study to compare the diagnostic accuracy of an experienced radiologist with a trainee in nasal bone fractures. To compare the diagnostic accuracy between conventional radiography and computed tomography (CT) for the identification of nasal bone fractures and to evaluate the interobserver reliability between a staff radiologist and a trainee. A total of 108 patients who underwent conventional radiography and CT after acute nasal trauma were included in this retrospective study. Two readers, a staff radiologist and a second-year resident, independently assessed the results of the imaging studies. Of the 108 patients, the presence of a nasal bone fracture was confirmed in 88 (81.5%) patients. The number of non-depressed fractures was higher than the number of depressed fractures. In nine (10.2%) patients, nasal bone fractures were only identified on conventional radiography, including three depressed and six non-depressed fractures. CT was more accurate as compared to conventional radiography for the identification of nasal bone fractures as determined by both readers (P < 0.05); all diagnostic indices of an experienced radiologist were similar to or higher than those of a trainee, and κ statistics showed moderate agreement between the two diagnostic tools for both readers. There was no statistical difference in the assessment of interobserver reliability for both imaging modalities in the identification of nasal bone fractures. For the identification of nasal bone fractures, CT was significantly superior to conventional radiography. Although a staff radiologist showed better values in the identification of nasal bone fractures and differentiation between depressed and non-depressed fractures than a trainee, there was no statistically significant difference in the interpretation of conventional radiography and CT between a radiologist and a trainee.

  11. Reliability of Semiautomated Computational Methods for Estimating Tibiofemoral Contact Stress in the Multicenter Osteoarthritis Study

    Directory of Open Access Journals (Sweden)

    Donald D. Anderson

    2012-01-01

    Full Text Available Recent findings suggest that contact stress is a potent predictor of subsequent symptomatic osteoarthritis development in the knee. However, much larger numbers of knees (likely on the order of hundreds, if not thousands) need to be reliably analyzed to achieve the statistical power necessary to clarify this relationship. This study assessed the reliability of new semiautomated computational methods for estimating contact stress in knees from large population-based cohorts. Ten knees of subjects from the Multicenter Osteoarthritis Study were included. Bone surfaces were manually segmented from sequential 1.0 Tesla magnetic resonance imaging slices by three individuals on two nonconsecutive days. Four individuals then registered the resulting bone surfaces to corresponding bone edges on weight-bearing radiographs, using a semi-automated algorithm. Discrete element analysis methods were used to estimate contact stress distributions for each knee. Segmentation and registration reliabilities (day-to-day and interrater) for peak and mean medial and lateral tibiofemoral contact stress were assessed with Shrout-Fleiss intraclass correlation coefficients (ICCs). The segmentation and registration steps of the modeling approach were found to have excellent day-to-day (ICC 0.93–0.99) and good inter-rater reliability (0.84–0.97). This approach for estimating compartment-specific tibiofemoral contact stress appears to be sufficiently reliable for use in large population-based cohorts.

  12. Reliability analysis and computation of computer-based safety instrumentation and control used in German nuclear power plant. Final report

    International Nuclear Information System (INIS)

    Ding, Yongjian; Krause, Ulrich; Gu, Chunlei

    2014-01-01

    The trend of technological advancement in the field of safety instrumentation and control (I and C) leads to increasingly frequent use of computer-based (digital) control systems, which consist of distributed computers connected by bus communications and whose functionality is freely programmable by qualified software. The advantages of the new I and C systems over the old hard-wired I and C technology lie, for example, in higher flexibility, cost-effective procurement of spare parts, and higher hardware reliability (through higher integration density, intelligent self-monitoring mechanisms, etc.). On the other hand, skeptics see in the computer-based I and C a higher potential for common cause failures (CCF) and easier manipulation by sabotage (IT security). In this joint research project funded by the Federal Ministry for Economic Affairs and Energy (BMWi) (2011-2014, FJZ 1501405), the Otto-von-Guericke-University Magdeburg and Magdeburg-Stendal University of Applied Sciences are therefore trying to develop suitable methods for demonstrating the reliability of the new instrumentation and control systems, with a focus on the investigation of CCF. The expertise of both institutions shall thereby be extended to this area, making a scientific contribution to sound reliability judgments of the digital safety I and C in domestic and foreign nuclear power plants. First, the state of science and technology will be worked out through the study of national and international standards in the field of functional safety of electrical and I and C systems and the accompanying literature. On the basis of the existing nuclear standards, the deterministic requirements on the structure of the new digital I and C system will be determined. The possible methods of reliability modeling will be analyzed and compared. A suitable method called multi class binomial failure rate (MCFBR), which was successfully used in safety valve applications, will be

  13. Computing Inter-Rater Reliability for Observational Data: An Overview and Tutorial

    Directory of Open Access Journals (Sweden)

    Kevin A. Hallgren

    2012-02-01

    Full Text Available Many research designs require the assessment of inter-rater reliability (IRR to demonstrate consistency among observational ratings provided by multiple coders. However, many studies use incorrect statistical procedures, fail to fully report the information necessary to interpret their results, or do not address how IRR affects the power of their subsequent analyses for hypothesis testing. This paper provides an overview of methodological issues related to the assessment of IRR with a focus on study design, selection of appropriate statistics, and the computation, interpretation, and reporting of some commonly-used IRR statistics. Computational examples include SPSS and R syntax for computing Cohen’s kappa and intra-class correlations to assess IRR.
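    The tutorial above provides SPSS and R syntax; the same statistics can also be computed by hand. The following is a minimal Python sketch of Cohen's kappa for two raters assigning categorical codes, with hypothetical ratings; it is a supplement to, not a substitute for, the syntax given in the paper.

        from collections import Counter

        def cohens_kappa(rater_a, rater_b):
            # kappa = (observed agreement - chance agreement) / (1 - chance agreement)
            n = len(rater_a)
            p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            freq_a, freq_b = Counter(rater_a), Counter(rater_b)
            categories = set(rater_a) | set(rater_b)
            p_expected = sum(freq_a[c] * freq_b[c] for c in categories) / n ** 2
            return (p_observed - p_expected) / (1.0 - p_expected)

        coder_1 = ["yes", "no", "yes", "yes", "no", "no", "yes", "no"]
        coder_2 = ["yes", "no", "yes", "no", "no", "yes", "yes", "no"]
        print(round(cohens_kappa(coder_1, coder_2), 3))   # 0.5 for these toy data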

  14. Computational Identification of Novel Genes: Current and Future Perspectives.

    Science.gov (United States)

    Klasberg, Steffen; Bitard-Feildel, Tristan; Mallet, Ludovic

    2016-01-01

    While it has long been thought that all genomic novelties are derived from existing material, many genes lacking homology to known genes were found in recent genome projects. Some of these novel genes were proposed to have evolved de novo, i.e., out of noncoding sequences, whereas some have been shown to follow a duplication and divergence process. Their discovery called for an extension of the historical hypotheses about gene origination. Besides the theoretical breakthrough, increasing evidence accumulated that novel genes play important roles in evolutionary processes, including adaptation and speciation events. Different techniques are available to identify genes and classify them as novel. Their classification as novel is usually based on their similarity to known genes, or lack thereof, detected by comparative genomics or against databases. Computational approaches are also prime methods; they can be based on existing models or can leverage biological evidence from experiments. Identification of novel genes remains, however, a challenging task. With constant software and technology updates, no gold standard, and no available benchmark, evaluation and characterization of genomic novelty is a vibrant field. In this review, the classical and state-of-the-art tools for gene prediction are introduced. The current methods for novel gene detection are presented; the methodological strategies and their limits are discussed along with perspective approaches for further studies.

  15. Simple, Reliable, and Cost-Effective Yeast Identification Scheme for the Clinical Laboratory

    OpenAIRE

    Koehler, Ann P.; Chu, Kai-Cheong; Houang, Elizabeth T. S.; Cheng, Augustine F. B.

    1999-01-01

    The appearance of colonies on the chromogenic medium CHROMagar Candida combined with observation of morphology on corn meal–Tween 80 agar was used for the identification of 353 clinical yeast isolates. The results were compared with those obtained with API yeast identification kits. The accuracy of identification and the turnaround time were equivalent for each method, and our cultural method was less expensive.

  16. Advances towards reliable identification and concentration determination of rare cells in peripheral blood

    Science.gov (United States)

    Alemany Server, R.; Martens, D.; Jans, K.; Bienstman, P.; Hill, D.

    2016-03-01

    Through further development, integration and validation of micro-nano-bio and biophotonics systems, FP7 CanDo is developing an instrument that will permit highly reproducible and reliable identification and concentration determination of rare cells in peripheral blood for two key societal challenges: early and low-cost anti-cancer drug efficacy determination and cancer diagnosis/monitoring. A cellular link between the primary malignant tumour and the peripheral metastases, responsible for 90% of cancer-related deaths, has been established in the form of circulating tumour cells (CTCs) in peripheral blood. Furthermore, the relatively short survival time of CTCs in peripheral blood means that their detection is indicative of tumour progression, thereby providing, in addition to prognostic value, an evaluation of therapeutic efficacy and early recognition of tumour progression in theranostics. In cancer patients, however, blood concentrations are very low (on the order of 1 CTC per 1E9 cells) and current detection strategies are too insensitive, limiting use to prognosis of only those with advanced metastatic cancer. Similarly, problems occur in therapeutics with anti-cancer drug development, leading to lengthy and costly trials and often preventing access to market. The novel cell separation/Raman analysis technologies plus nucleic acid based molecular characterization of the CanDo platform will provide an accurate CTC count with high throughput and high yield, meeting both key societal challenges. Being beyond the state of the art, it will lead to substantial share gains not just in the high-end markets of drug discovery and cancer diagnostics but, due to modular technologies, also in others. Here we present preliminary DNA hybridization sensing results.

  17. Identification of Classified Information in Unclassified DoD Systems During the Audit of Internal Controls and Data Reliability in the Deployable Disbursing System

    Science.gov (United States)

    2009-02-17

    Identification of Classified Information in Unclassified DoD Systems During the Audit of Internal Controls and Data Reliability in the Deployable Disbursing System (Report No. D-2009-054).

  18. RELIABILITY OF POSITRON EMISSION TOMOGRAPHY-COMPUTED TOMOGRAPHY IN EVALUATION OF TESTICULAR CARCINOMA PATIENTS.

    Science.gov (United States)

    Nikoletić, Katarina; Mihailović, Jasna; Matovina, Emil; Žeravica, Radmila; Srbovan, Dolores

    2015-01-01

    The study was aimed at assessing the reliability of 18F-fluorodeoxyglucose positron emission tomography-computed tomography scans in the evaluation of testicular carcinoma patients. The study sample consisted of 26 scans performed in 23 patients with testicular carcinoma. According to the pathohistological finding, 14 patients had seminomas, 7 had nonseminomas and 2 patients had a mixed histological type. In 17 patients, the initial treatment was orchiectomy+chemotherapy, 2 patients had orchiectomy+chemotherapy+retroperitoneal lymph node dissection, 3 patients had orchiectomy only and one patient was treated with chemotherapy only. Abnormal computed tomography was the main reason for the oncologist to refer the patient to positron emission tomography-computed tomography (in 19 scans), magnetic resonance imaging abnormalities in 1 scan, and high levels of tumor markers in 3; 3 scans were performed for follow-up. Positron emission tomography-computed tomography imaging results were compared with histological results, other imaging modalities or the clinical follow-up of the patients. Positron emission tomography-computed tomography scans were positive in 6 and negative in 20 patients. In two patients, positron emission tomography-computed tomography was false positive. There were 20 negative positron emission tomography-computed tomography scans performed in 18 patients; one patient was lost for data analysis. Clinically stable disease was confirmed in 18 follow-up scans performed in 16 patients. The values of sensitivity, specificity, accuracy, and positive and negative predictive value were 60%, 95%, 75%, 88% and 90.5%, respectively. The high negative predictive value obtained in our study (90.5%) suggests that there is a small possibility for a patient to have a future relapse after a normal positron emission tomography-computed tomography study. However, since the sensitivity and positive predictive value of the study are rather low, there are limitations of positive

  19. The reliable solution and computation time of variable parameters logistic model

    Science.gov (United States)

    Wang, Pengfei; Pan, Xinnong

    2018-05-01

    The study investigates the reliable computation time (RCT, termed Tc) by applying a double-precision computation of a variable parameters logistic map (VPLM). Firstly, by using the proposed method, we obtain the reliable solutions for the logistic map. Secondly, we construct 10,000 samples of reliable experiments from a time-dependent non-stationary parameters VPLM and then calculate the mean Tc. The results indicate that, for each different initial value, the Tc values of the VPLM are generally different. However, the mean Tc tends to a constant value when the sample number is large enough. The maximum, minimum, and probable distribution functions of Tc are also obtained, which can help us to identify the robustness of applying a nonlinear time series theory to forecasting by using the VPLM output. In addition, the Tc of the fixed parameter experiments of the logistic map is obtained, and the results suggest that this Tc matches the theoretical formula-predicted value.
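
    The record above describes, in essence, comparing a double-precision trajectory of the logistic map with a reliable reference solution and recording how long the two agree. Below is a minimal sketch of that idea, assuming a fixed-parameter map and using mpmath as the high-precision reference; the parameter, tolerance, and precision values are illustrative, not those of the study.

        # Estimate a "reliable computation time" (Tc) for the logistic map by comparing
        # float64 iterates against a high-precision reference trajectory (illustrative values).
        from mpmath import mp, mpf

        def reliable_computation_time(x0=0.3, r=3.9, tol=1e-3, max_steps=500, digits=100):
            mp.dps = digits                              # decimal digits of the reference computation
            x_double, x_ref = float(x0), mpf(x0)         # both start from exactly the same number
            for step in range(1, max_steps + 1):
                x_double = r * x_double * (1.0 - x_double)   # double-precision iterate
                x_ref = mpf(r) * x_ref * (1 - x_ref)         # high-precision iterate
                if abs(x_double - float(x_ref)) > tol:       # divergence exceeds the tolerance
                    return step                              # Tc: step at which the double result stops being reliable
            return max_steps

        print("estimated Tc:", reliable_computation_time())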

  20. Chest computed tomography-based scoring of thoracic sarcoidosis: Inter-rater reliability of CT abnormalities

    Energy Technology Data Exchange (ETDEWEB)

    Heuvel, D.A.V. den; Es, H.W. van; Heesewijk, J.P. van; Spee, M. [St. Antonius Hospital Nieuwegein, Department of Radiology, Nieuwegein (Netherlands); Jong, P.A. de [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Zanen, P.; Grutters, J.C. [University Medical Center Utrecht, Division Heart and Lungs, Utrecht (Netherlands); St. Antonius Hospital Nieuwegein, Center of Interstitial Lung Diseases, Department of Pulmonology, Nieuwegein (Netherlands)

    2015-09-15

    To determine inter-rater reliability of sarcoidosis-related computed tomography (CT) findings that can be used for scoring of thoracic sarcoidosis. CT images of 51 patients with sarcoidosis were scored by five chest radiologists for various abnormal CT findings (22 in total) encountered in thoracic sarcoidosis. Using intra-class correlation coefficient (ICC) analysis, inter-rater reliability was analysed and reported according to the Guidelines for Reporting Reliability and Agreement Studies (GRRAS) criteria. A pre-specified sub-analysis was performed to investigate the effect of training. Scoring was trained in a distinct set of 15 scans in which all abnormal CT findings were represented. Median age of the 51 patients (36 men, 70 %) was 43 years (range 26 - 64 years). All radiographic stages were present in this group. ICC ranged from 0.91 for honeycombing to 0.11 for nodular margin (sharp versus ill-defined). The ICC was above 0.60 in 13 of the 22 abnormal findings. Sub-analysis for the best-trained observers demonstrated an ICC improvement for all abnormal findings and values above 0.60 for 16 of the 22 abnormalities. In our cohort, reliability between raters was acceptable for 16 thoracic sarcoidosis-related abnormal CT findings. (orig.)
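
    The intraclass correlation coefficients reported above can in principle be reproduced with a short two-way ANOVA decomposition. Below is a minimal numpy sketch of ICC(2,1) (two-way random effects, absolute agreement, single rater), a common choice for inter-rater reliability; the rating matrix is toy data, not the study's scores.

        # ICC(2,1): two-way random effects, absolute agreement, single rater.
        # The ratings matrix below is toy data (subjects x raters), not from the study.
        import numpy as np

        def icc_2_1(ratings):
            ratings = np.asarray(ratings, dtype=float)
            n, k = ratings.shape                       # n subjects, k raters
            grand = ratings.mean()
            row_means = ratings.mean(axis=1)
            col_means = ratings.mean(axis=0)
            ss_rows = k * ((row_means - grand) ** 2).sum()
            ss_cols = n * ((col_means - grand) ** 2).sum()
            ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
            msr = ss_rows / (n - 1)                    # mean square for subjects
            msc = ss_cols / (k - 1)                    # mean square for raters
            mse = ss_err / ((n - 1) * (k - 1))         # residual mean square
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        scores = [[2, 3, 2], [4, 5, 4], [1, 1, 2], [3, 4, 3], [5, 5, 4]]
        print(round(icc_2_1(scores), 3))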

  1. Chest computed tomography-based scoring of thoracic sarcoidosis: Inter-rater reliability of CT abnormalities

    International Nuclear Information System (INIS)

    Heuvel, D.A.V. den; Es, H.W. van; Heesewijk, J.P. van; Spee, M.; Jong, P.A. de; Zanen, P.; Grutters, J.C.

    2015-01-01

    To determine inter-rater reliability of sarcoidosis-related computed tomography (CT) findings that can be used for scoring of thoracic sarcoidosis. CT images of 51 patients with sarcoidosis were scored by five chest radiologists for various abnormal CT findings (22 in total) encountered in thoracic sarcoidosis. Using intra-class correlation coefficient (ICC) analysis, inter-rater reliability was analysed and reported according to the Guidelines for Reporting Reliability and Agreement Studies (GRRAS) criteria. A pre-specified sub-analysis was performed to investigate the effect of training. Scoring was trained in a distinct set of 15 scans in which all abnormal CT findings were represented. Median age of the 51 patients (36 men, 70 %) was 43 years (range 26 - 64 years). All radiographic stages were present in this group. ICC ranged from 0.91 for honeycombing to 0.11 for nodular margin (sharp versus ill-defined). The ICC was above 0.60 in 13 of the 22 abnormal findings. Sub-analysis for the best-trained observers demonstrated an ICC improvement for all abnormal findings and values above 0.60 for 16 of the 22 abnormalities. In our cohort, reliability between raters was acceptable for 16 thoracic sarcoidosis-related abnormal CT findings. (orig.)

  2. JUPITER: Joint Universal Parameter IdenTification and Evaluation of Reliability - An Application Programming Interface (API) for Model Analysis

    Science.gov (United States)

    Banta, Edward R.; Poeter, Eileen P.; Doherty, John E.; Hill, Mary C.

    2006-01-01

    The Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER API) improves the computer programming resources available to those developing applications (computer programs) for model analysis. The JUPITER API consists of eleven Fortran-90 modules that provide for encapsulation of data and operations on that data. Each module contains one or more entities: data, data types, subroutines, functions, and generic interfaces. The modules do not constitute computer programs themselves; instead, they are used to construct computer programs. Such computer programs are called applications of the API. The API provides common modeling operations for use by a variety of computer applications. The models being analyzed are referred to here as process models, and may, for example, represent the physics, chemistry, and(or) biology of a field or laboratory system. Process models commonly are constructed using published models such as MODFLOW (Harbaugh et al., 2000; Harbaugh, 2005), MT3DMS (Zheng and Wang, 1996), HSPF (Bicknell et al., 1997), PRMS (Leavesley and Stannard, 1995), and many others. The process model may be accessed by a JUPITER API application as an external program, or it may be implemented as a subroutine within a JUPITER API application. In either case, execution of the model takes place in a framework designed by the application programmer. This framework can be designed to take advantage of any parallel processing capabilities possessed by the process model, as well as the parallel-processing capabilities of the JUPITER API. Model analyses for which the JUPITER API could be useful include, for example: compare model results to observed values to determine how well the model reproduces system processes and characteristics; use sensitivity analysis to determine the information provided by observations to parameters and predictions of interest; determine the additional data needed to improve selected model

  3. Computational identification of strain-, species- and genus-specific proteins

    Directory of Open Access Journals (Sweden)

    Thiagarajan Rathi

    2005-11-01

    Full Text Available Abstract Background The identification of unique proteins at different taxonomic levels has both scientific and practical value. Strain-, species- and genus-specific proteins can provide insight into the criteria that define an organism and its relationship with close relatives. Such proteins can also serve as taxon-specific diagnostic targets. Description A pipeline using a combination of computational and manual analyses of BLAST results was developed to identify strain-, species-, and genus-specific proteins and to catalog the closest sequenced relative for each protein in a proteome. Proteins encoded by a given strain are preliminarily considered to be unique if BLAST, using a comprehensive protein database, fails to retrieve (with an e-value better than 0.001) any protein not encoded by the query strain, species or genus (for strain-, species- and genus-specific proteins, respectively), or if BLAST, using the best hit as the query (reverse BLAST), does not retrieve the initial query protein. Results are manually inspected for homology if the initial query is retrieved in the reverse BLAST but is not the best hit. Sequences unlikely to retrieve homologs using the default BLOSUM62 matrix (usually short sequences) are re-tested using the PAM30 matrix, thereby increasing the number of retrieved homologs and increasing the stringency of the search for unique proteins. The above protocol was used to examine several food- and water-borne pathogens. We find that the reverse BLAST step filters out about 22% of proteins with homologs that would otherwise be considered unique at the genus and species levels. Analysis of the annotations of unique proteins reveals that many are remnants of prophage proteins, or may be involved in virulence. The data generated from this study can be accessed and further evaluated from the CUPID (Core and Unique Protein Identification) system web site (updated semi-annually) at http://pir.georgetown.edu/cupid. Conclusion CUPID
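
    The decision rule described above (a protein is provisionally unique if forward BLAST retrieves no significant hit outside the query taxon, or if a reverse BLAST with the best hit fails to retrieve the original query) can be sketched over already-parsed hit lists. The hit dictionaries and fields below are hypothetical stand-ins, not the actual CUPID data structures.

        # Sketch of the forward/reverse-BLAST uniqueness filter described above.
        # Each "hit" is a hypothetical parsed record: {"subject": id, "taxon": name, "evalue": float}.
        E_CUTOFF = 0.001

        def is_unique(query_id, query_taxon, forward_hits, reverse_hits_of_best):
            """Return True if the query protein looks taxon-specific."""
            foreign = [h for h in forward_hits
                       if h["evalue"] < E_CUTOFF and h["taxon"] != query_taxon]
            if not foreign:
                return True                      # no significant hit outside the query taxon
            # Reverse BLAST with the best foreign hit: if it does not retrieve the
            # original query, the forward hit is treated as spurious (still unique).
            retrieved = any(h["subject"] == query_id and h["evalue"] < E_CUTOFF
                            for h in reverse_hits_of_best)
            return not retrieved                 # borderline cases go to manual inspection in CUPID

        toy_forward = [{"subject": "XP_1", "taxon": "Salmonella", "evalue": 5e-4}]
        toy_reverse = [{"subject": "query_001", "taxon": "Escherichia", "evalue": 1e-6}]
        print(is_unique("query_001", "Escherichia", toy_forward, toy_reverse))   # False: reverse BLAST recovers the query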

  4. Computational area measurement of orbital floor fractures: Reliability, accuracy and rapidity

    International Nuclear Information System (INIS)

    Schouman, Thomas; Courvoisier, Delphine S.; Imholz, Benoit; Van Issum, Christopher; Scolozzi, Paolo

    2012-01-01

    Objective: To evaluate the reliability, accuracy and rapidity of a specific computational method for assessing the orbital floor fracture area on a CT scan. Method: A computer assessment of the area of the fracture, as well as that of the total orbital floor, was determined on CT scans taken from ten patients. The ratio of the fracture's area to the orbital floor area was also calculated. The test–retest precision of measurement calculations was estimated using the Intraclass Correlation Coefficient (ICC) and Dahlberg's formula to assess the agreement across observers and across measures. The time needed for the complete assessment was also evaluated. Results: The Intraclass Correlation Coefficient across observers was 0.92 [0.85; 0.96], and the precision of the measures across observers was 4.9%, according to Dahlberg's formula. The mean time needed to make one measurement was 2 min and 39 s (range, 1 min and 32 s to 4 min and 37 s). Conclusion: This study demonstrated that (1) the area of the orbital floor fracture can be rapidly and reliably assessed by using a specific computer system directly on CT scan images; (2) this method has the potential of being routinely used to standardize the post-traumatic evaluation of orbital fractures.
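
    Dahlberg's formula used above expresses measurement precision as the square root of the sum of squared differences between paired measurements divided by twice the number of pairs, d = sqrt(Σ di² / 2n). A small sketch with made-up duplicate measurements:

        # Dahlberg's error: d = sqrt(sum(di^2) / (2n)), di = difference between paired measurements.
        # The paired measurements below are illustrative only, not the study data.
        import math

        def dahlberg(first, second):
            diffs = [a - b for a, b in zip(first, second)]
            return math.sqrt(sum(d * d for d in diffs) / (2 * len(diffs)))

        obs1 = [1.20, 0.85, 1.02, 0.77]   # e.g. fracture/floor area ratios, observer 1
        obs2 = [1.15, 0.90, 1.00, 0.80]   # same cases, observer 2
        print(round(dahlberg(obs1, obs2), 4))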

  5. How to effectively compute the reliability of a thermal-hydraulic nuclear passive system

    International Nuclear Information System (INIS)

    Zio, E.; Pedroni, N.

    2011-01-01

    Research highlights: → Optimized LS is the preferred choice for failure probability estimation. → Two alternative options are suggested for uncertainty and sensitivity analyses. → SS for simulation codes requiring seconds or minutes to run. → Regression models (e.g., ANNs) for simulation codes requiring hours or days to run. - Abstract: The computation of the reliability of a thermal-hydraulic (T-H) passive system of a nuclear power plant can be obtained by (i) Monte Carlo (MC) sampling the uncertainties of the system model and parameters, (ii) computing, for each sample, the system response by a mechanistic T-H code and (iii) comparing the system response with pre-established safety thresholds, which define the success or failure of the safety function. The computational effort involved can be prohibitive because of the large number of (typically long) T-H code simulations that must be performed (one for each sample) for the statistical estimation of the probability of success or failure. The objective of this work is to provide operative guidelines to effectively handle the computation of the reliability of a nuclear passive system. Two directions of computational efficiency are considered: on one side, efficient Monte Carlo Simulation (MCS) techniques are indicated as a means of performing robust estimations with a limited number of samples; in particular, the Subset Simulation (SS) and Line Sampling (LS) methods are identified as most valuable. On the other side, fast-running, surrogate regression models (also called response surfaces or meta-models) are indicated as a valid replacement of the long-running T-H model codes; in particular, the use of bootstrapped Artificial Neural Networks (ANNs) is shown to have interesting potential, including for uncertainty propagation. The recommendations drawn are supported by the results obtained in an illustrative application from the literature.
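
    The workflow summarized above (sample the uncertain inputs, evaluate the T-H code or a fast surrogate for each sample, and compare the response with a safety threshold) reduces, in its simplest form, to the Monte Carlo sketch below. A cheap analytic function stands in for the T-H code or its trained ANN surrogate, and the distributions and threshold are illustrative only.

        # Plain Monte Carlo estimate of a failure probability P(response > threshold).
        # The "model" is a stand-in for a T-H code or its trained surrogate (e.g. an ANN).
        import numpy as np

        rng = np.random.default_rng(0)

        def surrogate_model(power, flow):
            # Hypothetical response: peak temperature rises with power, drops with coolant flow.
            return 600.0 + 0.8 * power - 1.5 * flow

        N, THRESHOLD = 100_000, 920.0
        power = rng.normal(350.0, 25.0, N)        # uncertain input 1 (illustrative)
        flow = rng.normal(12.0, 2.0, N)           # uncertain input 2 (illustrative)
        failures = surrogate_model(power, flow) > THRESHOLD
        p_fail = failures.mean()
        std_err = np.sqrt(p_fail * (1 - p_fail) / N)
        print(f"P_fail ~ {p_fail:.4e} +/- {std_err:.1e}")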

  6. Reliability of real-time computing with radiation data feedback at accidental release

    International Nuclear Information System (INIS)

    Deme, S.; Feher, I.; Lang, E.

    1989-07-01

    At present, the computing method normalized for the telemetric data represents the primary information for deciding on any necessary countermeasures in case of a nuclear reactor accident. The reliability of the results, however, is influenced by the choice of certain parameters that cannot be determined by direct methods. Improperly chosen diffusion parameters would distort the determination of environmental radiation parameters normalized on the basis of the measurements (131I activity concentration, gamma dose rate) at points lying at a given distance from the measuring stations. Numerical examples for the uncertainties due to the above factors are analyzed. (author) 4 refs.; 14 figs

  7. Reliability Lessons Learned From GPU Experience With The Titan Supercomputer at Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gallarno, George [Christian Brothers University; Rogers, James H [ORNL; Maxwell, Don E [ORNL

    2015-01-01

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large scale. The world's second-fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  8. Modeling Message Queueing Services with Reliability Guarantee in Cloud Computing Environment Using Colored Petri Nets

    Directory of Open Access Journals (Sweden)

    Jing Li

    2015-01-01

    Full Text Available Motivated by the need for loosely coupled and asynchronous dissemination of information, message queues are widely used in large-scale application areas. With the advent of virtualization technology, cloud-based message queueing services (CMQSs) with distributed computing and storage are widely adopted to improve availability, scalability, and reliability; however, a critical issue is their performance and quality of service (QoS). While numerous approaches for evaluating system performance are available, there is no modeling approach for estimating and analyzing the performance of CMQSs. In this paper, we employ both analytical and simulation modeling to address the performance of CMQSs with reliability guarantee. We present a visibility-based modeling approach (VMA) for the simulation model using colored Petri nets (CPN). Our model incorporates the important features of message queueing services in the cloud such as replication, message consistency, resource virtualization, and especially the mechanism named visibility timeout which is adopted in the services to guarantee system reliability. Finally, we evaluate our model through different experiments under varied scenarios to obtain important performance metrics such as total message delivery time, waiting number, and component utilization. Our results reveal considerable insights into resource scheduling and system configuration for service providers to estimate and gain performance optimization.
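
    The visibility-timeout mechanism modeled above keeps a delivered message invisible to other consumers for a fixed interval; if the consumer does not delete (acknowledge) it in time, the message becomes visible again, giving at-least-once delivery. A toy in-memory sketch of that mechanism follows; it illustrates the semantics only and is not the colored-Petri-net model of the paper.

        # Toy in-memory queue with a visibility timeout (at-least-once delivery semantics).
        import time, uuid

        class VisibilityQueue:
            def __init__(self, visibility_timeout=5.0):
                self.visibility_timeout = visibility_timeout
                self.messages = {}                  # id -> (body, visible_at_timestamp)

            def send(self, body):
                self.messages[str(uuid.uuid4())] = (body, 0.0)

            def receive(self):
                now = time.time()
                for mid, (body, visible_at) in self.messages.items():
                    if visible_at <= now:           # message currently visible
                        # hide it until the visibility timeout expires
                        self.messages[mid] = (body, now + self.visibility_timeout)
                        return mid, body
                return None

            def delete(self, mid):                  # consumer acknowledges successful processing
                self.messages.pop(mid, None)

        q = VisibilityQueue(visibility_timeout=2.0)
        q.send("job-1")
        mid, body = q.receive()                     # delivered and hidden
        assert q.receive() is None                  # invisible to other consumers for 2 s
        q.delete(mid)                               # ack; otherwise it reappears after the timeout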

  9. Design and reliability, availability, maintainability, and safety analysis of a high availability quadruple vital computer system

    Institute of Scientific and Technical Information of China (English)

    Ping TAN; Wei-ting HE; Jia LIN; Hong-ming ZHAO; Jian CHU

    2011-01-01

    With the development of high-speed railways in China, more than 2000 high-speed trains will be put into use. Safety and efficiency of railway transportation is increasingly important. We have designed a high availability quadruple vital computer (HAQVC) system based on the analysis of the architecture of the traditional double 2-out-of-2 system and 2-out-of-3 system. The HAQVC system is a system with high availability and safety, with prominent characteristics such as a novel internal architecture, high efficiency, a reliable data interaction mechanism, and an operation state change mechanism. The hardware of the vital CPU is based on ARM7 with a real-time embedded safe operation system (ES-OS). The Markov modeling method is designed to evaluate the reliability, availability, maintainability, and safety (RAMS) of the system. In this paper, we demonstrate that the HAQVC system is more reliable than the all voting triple modular redundancy (AVTMR) system and the double 2-out-of-2 system. Thus, the design can be used for a specific application system, such as an airplane or high-speed railway system.
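
    For a Markov RAMS evaluation of the kind referenced above, steady-state availability follows from the transition-rate (generator) matrix between operational and failed states. A minimal sketch for a single repairable unit with failure rate λ and repair rate μ (rates are illustrative), solved numerically and checked against the closed form A = μ/(λ+μ):

        # Steady-state availability of a repairable unit from its Markov generator matrix.
        # States: 0 = operational, 1 = failed. Rates below are illustrative, not from the paper.
        import numpy as np

        lam, mu = 1e-4, 1e-2          # failure rate, repair rate (per hour)
        Q = np.array([[-lam,  lam],   # generator matrix: rows sum to zero
                      [  mu,  -mu]])

        # Solve pi @ Q = 0 with sum(pi) = 1 (replace one balance equation by normalization).
        A_sys = np.vstack([Q.T[:-1], np.ones(2)])
        b = np.array([0.0, 1.0])
        pi = np.linalg.solve(A_sys, b)

        print("availability (numeric):", pi[0])
        print("availability (closed form):", mu / (lam + mu))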

  10. Measurement of transplanted pancreatic volume using computed tomography: reliability by intra- and inter-observer variability

    International Nuclear Information System (INIS)

    Lundqvist, Eva; Segelsjoe, Monica; Magnusson, Anders; Andersson, Anna; Biglarnia, Ali-Reza

    2012-01-01

    Background Unlike other solid organ transplants, pancreas allografts can undergo a substantial decrease in baseline volume after transplantation. This phenomenon has not been well characterized, as there are insufficient data on reliable and reproducible volume assessments. We hypothesized that characterization of pancreatic volume by means of computed tomography (CT) could be a useful method for clinical follow-up in pancreas transplant patients. Purpose To evaluate the feasibility and reliability of pancreatic volume assessment using CT scan in transplanted patients. Material and Methods CT examinations were performed on 21 consecutive patients undergoing pancreas transplantation. Volume measurements were carried out by two observers tracing the pancreatic contours in all slices. The observers performed the measurements twice for each patient. Differences in volume measurement were used to evaluate intra- and inter-observer variability. Results The intra-observer variability for the pancreatic volume measurements of Observers 1 and 2 was found to be in almost perfect agreement, with an intraclass correlation coefficient (ICC) of 0.90 (0.77-0.96) and 0.99 (0.98-1.0), respectively. Regarding inter-observer validity, the ICCs for the first and second measurements were 0.90 (range, 0.77-0.96) and 0.95 (range, 0.85-0.98), respectively. Conclusion CT volumetry is a reliable and reproducible method for measurement of transplanted pancreatic volume

  11. Measurement of transplanted pancreatic volume using computed tomography: reliability by intra- and inter-observer variability

    Energy Technology Data Exchange (ETDEWEB)

    Lundqvist, Eva; Segelsjoe, Monica; Magnusson, Anders [Uppsala Univ., Dept. of Radiology, Oncology and Radiation Science, Section of Radiology, Uppsala (Sweden)], E-mail: eva.lundqvist.8954@student.uu.se; Andersson, Anna; Biglarnia, Ali-Reza [Dept. of Surgical Sciences, Section of Transplantation Surgery, Uppsala Univ. Hospital, Uppsala (Sweden)

    2012-11-15

    Background Unlike other solid organ transplants, pancreas allografts can undergo a substantial decrease in baseline volume after transplantation. This phenomenon has not been well characterized, as there are insufficient data on reliable and reproducible volume assessments. We hypothesized that characterization of pancreatic volume by means of computed tomography (CT) could be a useful method for clinical follow-up in pancreas transplant patients. Purpose To evaluate the feasibility and reliability of pancreatic volume assessment using CT scan in transplanted patients. Material and Methods CT examinations were performed on 21 consecutive patients undergoing pancreas transplantation. Volume measurements were carried out by two observers tracing the pancreatic contours in all slices. The observers performed the measurements twice for each patient. Differences in volume measurement were used to evaluate intra- and inter-observer variability. Results The intra-observer variability for the pancreatic volume measurements of Observers 1 and 2 was found to be in almost perfect agreement, with an intraclass correlation coefficient (ICC) of 0.90 (0.77-0.96) and 0.99 (0.98-1.0), respectively. Regarding inter-observer validity, the ICCs for the first and second measurements were 0.90 (range, 0.77-0.96) and 0.95 (range, 0.85-0.98), respectively. Conclusion CT volumetry is a reliable and reproducible method for measurement of transplanted pancreatic volume.

  12. Identification of Learning Processes by Means of Computer Graphics.

    Science.gov (United States)

    Sorensen, Birgitte Holm

    1993-01-01

    Describes a development project for the use of computer graphics and video in connection with an inservice training course for primary education teachers in Denmark. Topics addressed include research approaches to computers; computer graphics in learning processes; activities relating to computer graphics; the role of the teacher; and student…

  13. Use of Soft Computing Technologies for a Qualitative and Reliable Engine Control System for Propulsion Systems

    Science.gov (United States)

    Trevino, Luis; Brown, Terry; Crumbley, R. T. (Technical Monitor)

    2001-01-01

    The problem to be addressed in this paper is to explore how the use of Soft Computing Technologies (SCT) could be employed to improve overall vehicle system safety, reliability, and rocket engine performance by development of a qualitative and reliable engine control system (QRECS). Specifically, this will be addressed by enhancing rocket engine control using SCT, innovative data mining tools, and sound software engineering practices used in Marshall's Flight Software Group (FSG). The principle goals for addressing the issue of quality are to improve software management, software development time, software maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control methodologies, but to provide alternative design choices for control, implementation, performance, and sustaining engineering, all relative to addressing the issue of reliability. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion (system level), software engineering for embedded flight software systems, and soft computing technologies (i.e., neural networks, fuzzy logic, data mining, and Bayesian belief networks); some of which are briefed in this paper. For this effort, the targeted demonstration rocket engine testbed is the MC-1 engine (formerly FASTRAC) which is simulated with hardware and software in the Marshall Avionics & Software Testbed (MAST) laboratory that currently resides at NASA's Marshall Space Flight Center, building 4476, and is managed by the Avionics Department. A brief plan of action for design, development, implementation, and testing a Phase One effort for QRECS is given, along with expected results. Phase One will focus on development of a Smart Start Engine Module and a Mainstage Engine Module for proper engine start and mainstage engine operations. The overall intent is to demonstrate that by

  14. Identification of natural images and computer-generated graphics based on statistical and textural features.

    Science.gov (United States)

    Peng, Fei; Li, Jiao-ting; Long, Min

    2015-03-01

    To discriminate the acquisition pipelines of digital images, a novel scheme for the identification of natural images and computer-generated graphics is proposed based on statistical and textural features. First, the differences between them are investigated from the viewpoint of statistics and texture, and a 31-dimensional feature vector is acquired for identification. Then, LIBSVM is used for the classification. Finally, the experimental results are presented. The results show that the method can achieve an identification accuracy of 97.89% for computer-generated graphics, and an identification accuracy of 97.75% for natural images. The analyses also demonstrate that the proposed method has excellent performance compared with some existing methods based only on statistical features or other features. The method has great potential to be implemented for the identification of natural images and computer-generated graphics. © 2014 American Academy of Forensic Sciences.
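
    The final stage of the scheme above feeds a 31-dimensional statistical/textural feature vector to an SVM (LIBSVM in the paper). Below is a minimal scikit-learn sketch of that classification stage; the feature matrix here is synthetic random data standing in for real extracted features.

        # Sketch of the SVM classification stage: 31-dimensional features -> natural vs CG label.
        # Features are synthetic stand-ins; real use would plug in the statistical/textural features.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(42)
        n_per_class, n_features = 200, 31
        X_natural = rng.normal(0.0, 1.0, (n_per_class, n_features))
        X_cg = rng.normal(0.6, 1.0, (n_per_class, n_features))       # shifted toy distribution
        X = np.vstack([X_natural, X_cg])
        y = np.array([0] * n_per_class + [1] * n_per_class)          # 0 = natural, 1 = computer-generated

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        clf.fit(X_train, y_train)
        print("test accuracy:", clf.score(X_test, y_test))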

  15. Reliability of Lyapunov characteristic exponents computed by the two-particle method

    Science.gov (United States)

    Mei, Lijie; Huang, Li

    2018-03-01

    For highly complex problems, such as the post-Newtonian formulation of compact binaries, the two-particle method may be a better, or even the only, choice to compute the Lyapunov characteristic exponent (LCE). This method avoids the complex calculation of variational equations required by the variational method. However, the two-particle method sometimes provides spurious estimates of LCEs. In this paper, we first analyze the equivalence in the definition of LCE between the variational and two-particle methods for Hamiltonian systems. Then, we develop a criterion to determine the reliability of LCEs computed by the two-particle method by considering the magnitude of the initial tangent (or separation) vector ξ0 (or δ0), the renormalization time interval τ, the machine precision ε, and the global truncation error ɛT. The reliable Lyapunov characteristic indicators estimated by the two-particle method form a V-shaped region, which is restricted by δ0, ε, and ɛT. Finally, numerical experiments with the Hénon-Heiles system, spinning compact binaries, and the post-Newtonian circular restricted three-body problem strongly support the theoretical results.
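
    The two-particle method referenced above estimates the largest LCE by evolving a fiducial trajectory and a neighbour separated by a small δ0, renormalizing the separation every interval τ and averaging the logarithmic growth. Below is a sketch for a simple discrete system (the chaotic logistic map, whose exact exponent is ln 2); δ0 and τ are illustrative choices, and the Hamiltonian systems of the paper would of course require a proper integrator.

        # Two-particle estimate of the largest Lyapunov exponent for the logistic map.
        # delta0 (initial separation) and tau (renormalization interval) are illustrative choices.
        import math

        def two_particle_lce(r=4.0, x0=0.3, delta0=1e-8, tau=1, n_renorm=20000):
            f = lambda x: r * x * (1.0 - x)
            x, y = x0, x0 + delta0
            log_sum = 0.0
            for _ in range(n_renorm):
                for _ in range(tau):                 # advance both particles for tau steps
                    x, y = f(x), f(y)
                d = abs(y - x)
                log_sum += math.log(d / delta0)      # accumulate logarithmic growth of the separation
                y = x + delta0 * (y - x) / d         # renormalize the separation back to delta0
            return log_sum / (n_renorm * tau)

        print(two_particle_lce())                    # theory for r=4: ln 2 ~= 0.693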

  16. Reliability of computed tomography measurements in assessment of thigh muscle cross-sectional area and attenuation

    International Nuclear Information System (INIS)

    Strandberg, Sören; Wretling, Marie-Louise; Wredmark, Torsten; Shalabi, Adel

    2010-01-01

    Advancements in computed tomography (CT) technology and the introduction of new medical imaging software enable easy and rapid assessment of muscle cross-sectional area (CSA) and attenuation. Before using these techniques in clinical studies there is a need to evaluate the reliability of the measurements. The purpose of the study was to evaluate the inter- and intra-observer reliability of ImageJ in measuring thigh muscle CSA and attenuation by computed tomography in patients with anterior cruciate ligament (ACL) injury. 31 patients from an ongoing study of rehabilitation and muscle atrophy after ACL reconstruction were included in the study. Axial CT images with a slice thickness of 10 mm at the level of 150 mm above the knee joint were analyzed by two investigators independently at two times, with a minimum of 3 weeks between the two readings, using NIH ImageJ. The CSA and mean attenuation of individual thigh muscles were analyzed for both legs. Mean CSA and mean attenuation values were in good agreement both when comparing the two observers and the two replicates. The inter- and intraclass correlation coefficients (ICC) were generally very high, with values from 0.98 to 1.00 for all comparisons except for the area of the semimembranosus. All the ICC values were significant (p < 0.001). Pearson correlation coefficients were also generally very high, with values from 0.98 to 1.00 for all comparisons except for the area of the semimembranosus (0.95 for intra-observer and 0.92 for inter-observer). This study has presented ImageJ as a method to monitor and evaluate CSA and attenuation of different muscles in the thigh using CT imaging. The method shows overall excellent reliability with respect to both observer and replicate.

  17. Osteochondritis dissecans of the humeral capitellum: reliability of four classification systems using radiographs and computed tomography.

    Science.gov (United States)

    Claessen, Femke M A P; van den Ende, Kimberly I M; Doornberg, Job N; Guitton, Thierry G; Eygendaal, Denise; van den Bekerom, Michel P J

    2015-10-01

    The radiographic appearance of osteochondritis dissecans (OCD) of the humeral capitellum varies according to the stage of the lesion. It is important to evaluate the stage of the OCD lesion carefully to guide treatment. We compared the interobserver reliability of currently used classification systems for OCD of the humeral capitellum to identify the most reliable classification system. Thirty-two musculoskeletal radiologists and orthopaedic surgeons specialized in elbow surgery from several countries evaluated anteroposterior and lateral radiographs and corresponding computed tomography (CT) scans of 22 patients to classify the stage of OCD of the humeral capitellum according to the classification systems developed by (1) Minami, (2) Berndt and Harty, (3) Ferkel and Sgaglione, and (4) Anderson on a Web-based study platform including a Digital Imaging and Communications in Medicine viewer. Magnetic resonance imaging was not evaluated as part of this study. We measured agreement among observers using the Siegel and Castellan multirater κ. All OCD classification systems, except for Berndt and Harty, which had poor agreement among observers (κ = 0.20), had fair interobserver agreement: κ was 0.27 for the Minami, 0.23 for the Anderson, and 0.22 for the Ferkel and Sgaglione classifications. The Minami Classification was significantly more reliable than the other classifications, making it the most reliable for classifying different stages of OCD of the humeral capitellum. However, it is unclear whether radiographic evidence of OCD of the humeral capitellum, as categorized by the Minami Classification, guides treatment in clinical practice as a result of this fair agreement. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
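
    The multirater κ used above summarizes agreement among many observers by comparing observed pairwise agreement with the agreement expected from the category proportions; with a fixed number of raters per case it can be computed in the style of Fleiss' kappa, a close relative of the Siegel and Castellan statistic. A small sketch with a toy cases-by-categories count matrix (not the study data):

        # Fleiss-style multirater kappa from a cases-by-categories count matrix.
        # Toy counts: 5 cases rated by 4 raters into 3 OCD stages (illustrative data only).
        import numpy as np

        def fleiss_kappa(counts):
            counts = np.asarray(counts, dtype=float)
            n_cases, _ = counts.shape
            n_raters = counts.sum(axis=1)[0]                       # assumes the same rater count per case
            p_cat = counts.sum(axis=0) / (n_cases * n_raters)      # overall category proportions
            P_i = ((counts ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
            P_bar, P_e = P_i.mean(), (p_cat ** 2).sum()
            return (P_bar - P_e) / (1 - P_e)

        counts = [[4, 0, 0], [0, 3, 1], [1, 1, 2], [0, 4, 0], [2, 2, 0]]
        print(round(fleiss_kappa(counts), 3))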

  18. Conceptual transitions in methods of skull-photo superimposition that impact the reliability of identification: a review.

    Science.gov (United States)

    Jayaprakash, Paul T

    2015-01-01

    Establishing identification during skull-photo superimposition relies on correlating the salient morphological features of an unidentified skull with those of a face-image of a suspected dead individual using image overlay processes. Technical progression in the process of overlay has included the incorporation of video cameras, image-mixing devices and software that enables real-time vision-mixing. Conceptual transitions occur in the superimposition methods that involve 'life-size' images, that achieve orientation of the skull to the posture of the face in the photograph and that assess the extent of match. A recent report on the reliability of identification using the superimposition method adopted the currently prevalent methods and suggested an increased rate of failures when skulls were compared with related and unrelated face images. The reported reduction in the reliability of the superimposition method prompted a review of the transition in the concepts that are involved in skull-photo superimposition. The prevalent popular methods for visualizing the superimposed images at less than 'life-size', overlaying skull-face images by relying on the cranial and facial landmarks in the frontal plane when orienting the skull for matching and evaluating the match on a morphological basis by relying on mix-mode alone are the major departures in the methodology that may have reduced the identification reliability. The need to reassess the reliability of the method that incorporates the concepts which have been considered appropriate by the practitioners is stressed. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  19. A Program for the Identification of the Enterobacteriaceae for Use in Teaching the Principles of Computer Identification of Bacteria.

    Science.gov (United States)

    Hammonds, S. J.

    1990-01-01

    A technique for the numerical identification of bacteria using normalized likelihoods calculated from a probabilistic database is described, and the principles of the technique are explained. The listing of the computer program is included. Specimen results from the program, and examples of how they should be interpreted, are given. (KR)
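
    Numerical identification with normalized likelihoods, as described above, multiplies for each taxon the database probability of every observed test result and divides by the sum over all taxa. A compact sketch with a made-up probabilistic database of three taxa and four biochemical tests (the organisms, tests, and probabilities are illustrative, not the program's database):

        # Normalized-likelihood identification: P(taxon | results) is proportional to the
        # product over tests of p (if positive) or 1-p (if negative). Database is made up.
        DATABASE = {
            "Escherichia coli":      {"indole": 0.98, "citrate": 0.01, "urease": 0.01, "lactose": 0.95},
            "Klebsiella pneumoniae": {"indole": 0.01, "citrate": 0.95, "urease": 0.90, "lactose": 0.98},
            "Proteus mirabilis":     {"indole": 0.02, "citrate": 0.60, "urease": 0.98, "lactose": 0.02},
        }

        def normalized_likelihoods(results, database):
            likelihoods = {}
            for taxon, probs in database.items():
                score = 1.0
                for test, positive in results.items():
                    p = probs[test]
                    score *= p if positive else (1.0 - p)
                likelihoods[taxon] = score
            total = sum(likelihoods.values())
            return {taxon: score / total for taxon, score in likelihoods.items()}

        unknown = {"indole": True, "citrate": False, "urease": False, "lactose": True}
        for taxon, nl in sorted(normalized_likelihoods(unknown, DATABASE).items(), key=lambda kv: -kv[1]):
            print(f"{taxon}: {nl:.4f}")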

  20. Reliability of the fuel identification procedure used by COGEMA during cask loading for shipment to LA HAGUE

    International Nuclear Information System (INIS)

    Pretesacque, P.; Eid, M.; Zachar, M.

    1993-01-01

    This study has been carried out to demonstrate the reliability of the system of spent fuel identification used by COGEMA and NTL prior to shipment to the reprocessing plant at La Hague. This was a prerequisite for the French competent authority to accept the 'burnup credit' assumption in the criticality assessment of spent fuel packages. The probability of loading a non-irradiated and non-specified fuel assembly was considered acceptable if our identification and irradiation status measurement procedures were used. Furthermore, the task analysis enabled us to improve the working conditions at reactor sites and the quality of the working documentation, and consequently to improve the reliability of the system. The NTL experience of transporting to La Hague, as consignor, more than 10,000 fuel assemblies since the implementation of our system in 1984, without any non-conformance in fuel identification, validated the formalism of this study as well as our assumptions on basic event probabilities. (J.P.N.)

  1. Reliability of real-time computing with radiation data feedback at accidental release

    International Nuclear Information System (INIS)

    Deme, S.; Feher, I.; Lang, E.

    1990-01-01

    At the first workshop in 1985 we reported on the real-time dose computing method used at the Paks Nuclear Power Plant and on the telemetric system developed for the normalization of the computed data. At present, the computing method normalized to the telemetric data represents the primary information for deciding on any necessary countermeasures in case of a nuclear reactor accident. In this connection we analyzed the reliability of the results obtained in this manner. The points of the analysis were: how the results are influenced by the choice of certain parameters that cannot be determined by direct methods, and how improperly chosen diffusion parameters would distort the determination of environmental radiation parameters normalized on the basis of the measurements (131I activity concentration, gamma dose rate) at points lying at a given distance from the measuring stations. A further source of errors may be that, when determining the level of gamma radiation, the radionuclide doses in the cloud and on the ground surface are measured together by the environmental monitoring stations, whereas these doses appear separately in the computations. At the Paks NPP it is the time integral of the airborne activity concentration of vapour-form 131I which is determined. This quantity includes neither the other physical and chemical forms of 131I nor the other isotopes of radioiodine. We gave numerical examples for the uncertainties due to the above factors. As a result, we arrived at the conclusion that, when accident-related measures have to be decided on the basis of the computing method, the dose uncertainties may reach one order of magnitude for points lying far from the monitoring stations. Different measures are discussed to make the uncertainties significantly lower

  2. Do strict rules and moving images increase the reliability of sequential identification procedures?.

    OpenAIRE

    Valentine, Tim; Darling, Stephen; Memon, Amina

    2007-01-01

    Live identification procedures in England and Wales have been replaced by use of video, which provides a sequential presentation of facial images. Sequential presentation of photographs provides some protection to innocent suspects from mistaken identification when used with strict instructions designed to prevent relative judgements (Lindsay, Lea & Fulford, 1991). However, the current procedure in England and Wales is incompatible with these strict instructions. The reported research investi...

  3. Reliability

    African Journals Online (AJOL)

    eobe

    Abstract fragment: reliability of concrete members based on utility theory, with checks given by the code of practice and an optimization procedure over the failure domain F.

  4. A reliable and valid questionnaire was developed to measure computer vision syndrome at the workplace.

    Science.gov (United States)

    Seguí, María del Mar; Cabrero-García, Julio; Crespo, Ana; Verdú, José; Ronda, Elena

    2015-06-01

    To design and validate a questionnaire to measure visual symptoms related to exposure to computers in the workplace. Our computer vision syndrome questionnaire (CVS-Q) was based on a literature review and validated through discussion with experts and performance of a pretest, pilot test, and retest. Content validity was evaluated by occupational health, optometry, and ophthalmology experts. Rasch analysis was used in the psychometric evaluation of the questionnaire. Criterion validity was determined by calculating the sensitivity and specificity, receiver operating characteristic curve, and cutoff point. Test-retest repeatability was tested using the intraclass correlation coefficient (ICC) and concordance by Cohen's kappa (κ). The CVS-Q was developed with wide consensus among experts and was well accepted by the target group. It assesses the frequency and intensity of 16 symptoms using a single rating scale (symptom severity) that fits the Rasch rating scale model well. The questionnaire has sensitivity and specificity over 70% and achieved good test-retest repeatability both for the scores obtained [ICC = 0.802; 95% confidence interval (CI): 0.673, 0.884] and CVS classification (κ = 0.612; 95% CI: 0.384, 0.839). The CVS-Q has acceptable psychometric properties, making it a valid and reliable tool to monitor the visual health of computer workers, and can potentially be used in clinical trials and outcomes research. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Reliability of lower limb alignment measures using an established landmark-based method with a customized computer software program

    Science.gov (United States)

    Sled, Elizabeth A.; Sheehy, Lisa M.; Felson, David T.; Costigan, Patrick A.; Lam, Miu; Cooke, T. Derek V.

    2010-01-01

    The objective of the study was to evaluate the reliability of frontal plane lower limb alignment measures using a landmark-based method by (1) comparing inter- and intra-reader reliability between measurements of alignment obtained manually with those using a computer program, and (2) determining inter- and intra-reader reliability of computer-assisted alignment measures from full-limb radiographs. An established method for measuring alignment was used, involving selection of 10 femoral and tibial bone landmarks. 1) To compare manual and computer methods, we used digital images and matching paper copies of five alignment patterns simulating healthy and malaligned limbs drawn using AutoCAD. Seven readers were trained in each system. Paper copies were measured manually and repeat measurements were performed daily for 3 days, followed by a similar routine with the digital images using the computer. 2) To examine the reliability of computer-assisted measures from full-limb radiographs, 100 images (200 limbs) were selected as a random sample from 1,500 full-limb digital radiographs which were part of the Multicenter Osteoarthritis (MOST) Study. Three trained readers used the software program to measure alignment twice from the batch of 100 images, with two or more weeks between batch handling. Manual and computer measures of alignment showed excellent agreement (intraclass correlations [ICCs] 0.977 – 0.999 for computer analysis; 0.820 – 0.995 for manual measures). The computer program applied to full-limb radiographs produced alignment measurements with high inter- and intra-reader reliability (ICCs 0.839 – 0.998). In conclusion, alignment measures using a bone landmark-based approach and a computer program were highly reliable between multiple readers. PMID:19882339

  6. Attendance fingerprint identification system using arduino and single board computer

    Science.gov (United States)

    Muchtar, M. A.; Seniman; Arisandi, D.; Hasanah, S.

    2018-03-01

    The fingerprint is one of the most unique parts of the human body, distinguishing one person from others, and it is easily accessed. This uniqueness is exploited by technology that can automatically identify or recognize a person, called a fingerprint sensor. Yet, existing fingerprint sensors can only perform fingerprint identification on one machine. For this reason, we need a method to be able to recognize each user on a different fingerprint sensor. The purpose of this research is to build a fingerprint sensor system in which fingerprint data management is centralized so that identification can be done at each fingerprint sensor. The result of this research shows that by using Arduino and Raspberry Pi, data processing can be centralized so that fingerprint identification can be done at each fingerprint sensor, with a 98.5% success rate for centralized server recording.

  7. A Reliable Measure of Information Security Awareness and the Identification of Bias in Responses

    Directory of Open Access Journals (Sweden)

    Agata McCormac

    2017-11-01

    Full Text Available The Human Aspects of Information Security Questionnaire (HAIS-Q is designed to measure Information Security Awareness. More specifically, the tool measures an individual’s knowledge, attitude, and self-reported behaviour relating to information security in the workplace. This paper reports on the reliability of the HAIS-Q, including test-retest reliability and internal consistency. The paper also assesses the reliability of three preliminary over-claiming items, designed specifically to complement the HAIS-Q, and identify those individuals who provide socially desirable responses. A total of 197 working Australians completed two iterations of the HAIS-Q and the over-claiming items, approximately 4 weeks apart. Results of the analysis showed that the HAIS-Q was externally reliable and internally consistent. Therefore, the HAIS-Q can be used to reliably measure information security awareness. Reliability testing on the preliminary over-claiming items was not as robust and further development is required and recommended. The implications of these findings mean that organisations can confidently use the HAIS-Q to not only measure the current state of employee information security awareness within their organisation, but they can also measure the effectiveness and impacts of training interventions, information security awareness programs and campaigns. The influence of cultural changes and the effect of security incidents can also be assessed.

  8. MAPPS (Maintenance Personnel Performance Simulation): a computer simulation model for human reliability analysis

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.

    1985-01-01

    A computer model capable of generating reliable estimates of human performance measures in the nuclear power plant (NPP) maintenance context has been developed, sensitivity tested, and evaluated. The model, entitled MAPPS (Maintenance Personnel Performance Simulation), is of the simulation type and is task-oriented. It addresses a number of person-machine, person-environment, and person-person variables and is capable of providing the user with a rich spectrum of important performance measures, including the mean time for successful task performance by a maintenance team and the maintenance team's probability of task success. These two measures are particularly important as input to probabilistic risk assessment (PRA) studies, which were the primary impetus for the development of MAPPS. The simulation nature of the model, along with its generous input parameters and output variables, allows its usefulness to extend beyond its input to PRA.

  9. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.

    2015-03-01

    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque(R) drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  10. Cross-cultural adaptation, reliability, and validation of the Korean version of the identification functional ankle instability (IdFAI).

    Science.gov (United States)

    Ko, Jupil; Rosen, Adam B; Brown, Cathleen N

    2017-09-12

    To cross-culturally adapt the Identification of Functional Ankle Instability (IdFAI) for use with Korean-speaking participants. The English version of the IdFAI was cross-culturally adapted into Korean based on the guidelines. The psychometric properties of the Korean version of the IdFAI were measured for test-retest reliability, internal consistency, criterion-related validity, discriminative validity, and measurement error in 181 native Korean speakers. The intra-class correlation coefficient (ICC(2,1)) between the English and Korean versions of the IdFAI for test-retest reliability was 0.98 (standard error of measurement = 1.41). The Cronbach's alpha coefficient was 0.89 for the Korean version of the IdFAI. The Korean version of the IdFAI had a strong correlation with the SF-36 (rs = -0.69), and a score above 10 was the optimal cutoff to distinguish between the group memberships. The minimally detectable change of the Korean version of the IdFAI score was 3.91. The Korean version of the IdFAI has been shown to be an excellent, reliable, and valid instrument. The Korean version of the IdFAI can be utilized to assess the presence of Chronic Ankle Instability by researchers and clinicians working among Korean-speaking populations. Implications for rehabilitation The high recurrence rate of sprains may result in Chronic Ankle Instability (CAI). The Identification of Functional Ankle Instability Tool (IdFAI) has been validated and recommended to identify patients with Chronic Ankle Instability (CAI). The Korean version of the Identification of Functional Ankle Instability Tool (IdFAI) may also be recommended to researchers and clinicians for assessing the presence of Chronic Ankle Instability (CAI) in Korean-speaking populations.

  11. Quantitative software-reliability analysis of computer codes relevant to nuclear safety

    International Nuclear Information System (INIS)

    Mueller, C.J.

    1981-12-01

    This report presents the results of the first year of an ongoing research program to determine the probability of failure characteristics of computer codes relevant to nuclear safety. An introduction to both qualitative and quantitative aspects of nuclear software is given. A mathematical framework is presented which will enable the a priori prediction of the probability of failure characteristics of a code given the proper specification of its properties. The framework consists of four parts: (1) a classification system for software errors and code failures; (2) probabilistic modeling for selected reliability characteristics; (3) multivariate regression analyses to establish predictive relationships among reliability characteristics and generic code property and development parameters; and (4) the associated information base. Preliminary data of the type needed to support the modeling and the predictions of this program are described. Illustrations of the use of the modeling are given but the results so obtained, as well as all results of code failure probabilities presented herein, are based on data which at this point are preliminary, incomplete, and possibly non-representative of codes relevant to nuclear safety

  12. Improving reliability of state estimation programming and computing suite based on analyzing a fault tree

    Directory of Open Access Journals (Sweden)

    Kolosok Irina

    2017-01-01

    Full Text Available Reliable information on the current state parameters, obtained by processing measurements from the SCADA and WAMS data acquisition systems through state estimation (SE) methods, is a precondition for successfully managing an electric power system (EPS). SCADA and WAMS systems themselves, like any technical systems, are subject to failures and faults that lead to distortion and loss of information. The SE procedure is able to detect erroneous measurements and therefore acts as a barrier preventing distorted information from penetrating into control problems. At the same time, the programming and computing suite (PCS) implementing the SE functions may itself produce a wrong decision due to imperfections in the software algorithms and errors. In this study, we propose to use a fault tree to analyze the consequences of failures and faults in SCADA and WAMS and in the SE procedure itself. Based on the analysis of the obtained measurement information and on the SE results, we determine the fault tolerance level of the state estimation PCS, which characterizes its reliability.

  13. A Newly Developed Method for Computing Reliability Measures in a Water Supply Network

    Directory of Open Access Journals (Sweden)

    Jacek Malinowski

    2016-01-01

    Full Text Available A reliability model of a water supply network has been examined. Its main features are: (1) a topology that can be decomposed by so-called state factorization into a relatively small number of derivative networks, each having a series-parallel structure; (2) binary-state components (either operative or failed) with given flow capacities; (3) a multi-state character of the whole network and its sub-networks, where a network state is defined as the maximal flow between a source (or sources) and a sink (or sinks); (4) integer values for all capacities (component, network, and sub-network). As the network operates, its state changes due to component failures, repairs, and replacements. A newly developed method of computing the inter-state transition intensities is presented. It is based on the so-called state factorization and series-parallel aggregation. The analysis of these intensities shows that the failure-repair process of the considered system is an asymptotically homogeneous Markov process. It is also demonstrated how certain reliability parameters useful for network maintenance planning can be determined on the basis of the asymptotic intensities. For better understanding of the presented method, an illustrative example is given. (original abstract)
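
    In the series-parallel aggregation described above, the maximal flow of a series arrangement is the minimum of its members' flows while a parallel arrangement sums them; enumerating the binary component states then yields the probability distribution of the network state. A tiny sketch for a fixed series-parallel expression with independent binary components (the structure, capacities, and probabilities are illustrative):

        # Probability distribution of the maximal flow of a small series-parallel network
        # with independent binary (working/failed) components. Structure and values are illustrative.
        from itertools import product
        from collections import defaultdict

        # component: (capacity when working, probability of working)
        components = {"a": (5, 0.95), "b": (3, 0.90), "c": (4, 0.98)}

        def network_flow(cap):
            # series( a , parallel(b, c) ): series takes the minimum, parallel sums capacities
            return min(cap["a"], cap["b"] + cap["c"])

        dist = defaultdict(float)
        for states in product([0, 1], repeat=len(components)):
            prob, caps = 1.0, {}
            for (name, (capacity, p_up)), up in zip(components.items(), states):
                prob *= p_up if up else (1.0 - p_up)
                caps[name] = capacity if up else 0
            dist[network_flow(caps)] += prob

        for flow in sorted(dist):
            print(f"flow {flow}: probability {dist[flow]:.4f}")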

  14. Reliability of computer designed surgical guides in six implant rehabilitations with two years follow-up.

    Science.gov (United States)

    Giordano, Mauro; Ausiello, Pietro; Martorelli, Massimo; Sorrentino, Roberto

    2012-09-01

    To evaluate the reliability and accuracy of computer-designed surgical guides in osseointegrated oral implant rehabilitation. Six implant rehabilitations, with a total of 17 implants, were completed with computer-designed surgical guides, performed with the master model developed from muco-compressive and muco-static impressions. In the first case, the surgical guide had exclusively mucosal support; in the second case, exclusively dental support. For all six cases computer-aided surgical planning was performed by virtual analyses with 3D models obtained from dental scan DICOM data. The accuracy and stability of implant osseointegration over two years post surgery was then evaluated with clinical and radiographic examinations. Radiographic examination, performed with digital acquisitions (RVG, radiovisiography) and parallel techniques, allowed two-dimensional feedback with a margin of linear error of 10%. Implant osseointegration was recorded for all the examined rehabilitations. During the clinical and radiographic post-surgical assessments over the following two years, the peri-implant bone level was found to be stable and without any complications. The margin of error recorded between pre-operative positions assigned by virtual analysis and the post-surgical digital radiographic observations was as low as 0.2 mm. Computer-guided implant surgery can be very effective in oral rehabilitations, providing an opportunity for the surgeon (a) to avoid the necessity of muco-periosteal detachments and thereby (b) to perform minimally invasive interventions, whenever appropriate, with a flapless approach. Copyright © 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  15. A structural approach to constructing perspective efficient and reliable human-computer interfaces

    International Nuclear Information System (INIS)

    Balint, L.

    1989-01-01

    The principles of human-computer interface (HCI) realizations are investigated with the aim of getting closer to a general framework and thus, to a more or less solid background of constructing perspective efficient, reliable and cost-effective human-computer interfaces. On the basis of characterizing and classifying the different HCI solutions, the fundamental problems of interface construction are pointed out especially with respect to human error occurrence possibilities. The evolution of HCI realizations is illustrated by summarizing the main properties of past, present and foreseeable future interface generations. HCI modeling is pointed out to be a crucial problem in theoretical and practical investigations. Suggestions concerning HCI structure (hierarchy and modularity), HCI functional dynamics (mapping from input to output information), minimization of human error caused system failures (error-tolerance, error-recovery and error-correcting) as well as cost-effective HCI design and realization methodology (universal and application-oriented vs. application-specific solutions) are presented. The concept of RISC-based and SCAMP-type HCI components is introduced with the aim of having a reduced interaction scheme in communication and a well defined architecture in HCI components' internal structure. HCI efficiency and reliability are dealt with, by taking into account complexity and flexibility. The application of fast computerized prototyping is also briefly investigated as an experimental device of achieving simple, parametrized, invariant HCI models. Finally, a concise outline of an approach of how to construct ideal HCI's is also suggested by emphasizing the open questions and the need of future work related to the proposals, as well. (author). 14 refs, 6 figs

  16. Reliability of the spent fuel identification for flask loading procedure used by COGEMA for fuel transport to La Hague

    International Nuclear Information System (INIS)

    Eid, M.; Zachar, M.; Pretesacque, P.

    1991-01-01

    The Spent Fuel Identification for Flask Loading (SFIFL) procedure designed by COGEMA is analysed and its reliability calculated. The reliability of the procedure is defined as the probability of transporting only approved fuel elements for a given number of shipments. The procedure describes a non-coherent system. A non-coherent system is one in which two successive failures could result in a success, from the system mission point of view. A technique that describes the system with the help of its maximal cuts (states) is used for the calculations. A maximal cut that contains more than one failure can be split into two cuts (sub-states). Cut splitting enables us to analyse, in a systematic way, non-coherent systems with independent basic components. (author)

  17. Reliability of the spent fuel identification for flask loading procedure used by COGEMA for fuel transport to La Hague

    International Nuclear Information System (INIS)

    Eid, M.; Zachar, M.; Pretesacque, P.

    1990-01-01

    The Spent Fuel Identification for Flask Loading, SFIFL, procedure designed by COGEMA is analysed and its reliability is calculated. The reliability of the procedure is defined as the probability of transporting only approved fuel elements for a given number of shipments. The procedure describes a non-coherent system. A non-coherent system is one in which two successive failures could result in a success, from the system mission point of view. A technique that describes the system with the help of its maximal cuts (states) is used for the calculations. A maximal cut that contains more than one failure can be split into two cuts (sub-states). Cut splitting enables us to analyse, in a systematic way, non-coherent systems with independent basic components. (author)

  18. Reliability of the Identification of Functional Ankle Instability (IdFAI) Scale Across Different Age Groups in Adults.

    Science.gov (United States)

    Gurav, Reshma S; Ganu, Sneha S; Panhale, Vrushali P

    2014-10-01

    Functional ankle instability (FAI) is the tendency of the foot to 'give way'. The Identification of Functional Ankle Instability (IdFAI) questionnaire is a newly developed instrument to detect whether individuals meet the minimum criteria necessary for inclusion in an FAI population. However, the reliability of the questionnaire had been studied only in a restricted age group. The purpose of this investigation was to examine the reliability of the IdFAI across different age groups in adults. One hundred and twenty participants aged 20-60 years, with 30 individuals in each age group, were asked to complete the IdFAI on two occasions. Test-retest reliability was evaluated by the intraclass correlation coefficient (ICC2,1). The study revealed that the IdFAI has excellent test-retest reliability across different age groups. The ICC2,1 in the age groups 20-30 years, 30-40 years, 40-50 years and 50-60 years was 0.978, 0.975, 0.961 and 0.922, respectively, with Cronbach's alpha >0.9 in all the age groups. The IdFAI can accurately predict whether an individual meets the minimum criterion for FAI across different age groups in adults. Thus, the questionnaire can be applied across different age groups in clinical and research settings.

  19. Contours identification of elements in a cone beam computed tomography for investigating maxillary cysts

    Science.gov (United States)

    Chioran, Doina; Nicoarǎ, Adrian; Roşu, Şerban; Cǎrligeriu, Virgil; Ianeş, Emilia

    2013-10-01

    Digital processing of two-dimensional cone beam computed tomography slices starts with identification of the contours of the elements within them. This paper deals with the collaborative work of specialists in medicine and in applied mathematics and computer science on the elaboration and implementation of algorithms for dental 2D imagery.

  20. Automatic Identification and Reconstruction of the Right Phrenic Nerve on Computed Tomography

    OpenAIRE

    Bamps, Kobe; Cuypers, Céline; Polmans, Pieter; Claesen, Luc; Koopman, Pieter

    2016-01-01

    An automatic computer algorithm was successfully constructed, enabling identification and reconstruction of the right phrenic nerve on high-resolution coronary computed tomography scans. This could lead to a substantial reduction in the incidence of phrenic nerve paralysis during pulmonary vein isolation using balloon techniques.

  1. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    Science.gov (United States)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information to assess impacts of climate change at regional and global scales. However, statistical downscaling methods have been applied to prepare climate model data for various applications, such as hydrologic and ecologic modelling at a watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data depend mainly on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for key climate variables, which are the main input data to regional modelling systems. However, inconsistencies in these climate products - for example, different combinations of climate variables, varying data domains and data lengths, and data accuracy varying with physiographic characteristics of the landscape - have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling. Employing various observation-based daily gridded climate products available in the public domain, i.e. thin plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis) and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparing them with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available climate products corresponding to the elevations of stations discretized into several classes. According to the rank of climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. A web-based system

  2. Reliable identification at the species level of Brucella isolates with MALDI-TOF-MS

    NARCIS (Netherlands)

    Lista, F.; Reubsaet, F.A.G.; Santis, R. de; Parchen, R.R.; Jong, A.L. de; Kieboom, J.; Laaken, A.L. van der; Voskamp-Visser, I.A.I.; Fillo, S.; Jansen, H.J. de; Plas, J. van der; Paauw, A.

    2011-01-01

    Background: The genus Brucella contains highly infectious species that are classified as biological threat agents. The timely detection and identification of the microorganism involved is essential for an effective response not only to biological warfare attacks but also to natural outbreaks.

  3. Non-binary decomposition trees - a method of reliability computation for systems with known minimal paths/cuts

    Energy Technology Data Exchange (ETDEWEB)

    Malinowski, Jacek

    2004-05-01

    A coherent system with independent components and known minimal paths (cuts) is considered. In order to compute its reliability, a tree structure T is constructed whose nodes contain the modified minimal paths (cuts) and numerical values. The value of a non-leaf node is a function of its child nodes' values. The values of leaf nodes are calculated from a simple formula. The value of the root node is the system's failure probability (reliability). Subsequently, an algorithm computing the system's failure probability (reliability) is constructed. The algorithm scans all nodes of T using a stack structure for this purpose. The nodes of T are alternately put on and removed from the stack, their data being modified in the process. Once the algorithm has terminated, the stack contains only the final modification of the root node of T, and its value is equal to the system's failure probability (reliability).

  4. Non-binary decomposition trees - a method of reliability computation for systems with known minimal paths/cuts

    International Nuclear Information System (INIS)

    Malinowski, Jacek

    2004-01-01

    A coherent system with independent components and known minimal paths (cuts) is considered. In order to compute its reliability, a tree structure T is constructed whose nodes contain the modified minimal paths (cuts) and numerical values. The value of a non-leaf node is a function of its child nodes' values. The values of leaf nodes are calculated from a simple formula. The value of the root node is the system's failure probability (reliability). Subsequently, an algorithm computing the system's failure probability (reliability) is constructed. The algorithm scans all nodes of T using a stack structure for this purpose. The nodes of T are alternately put on and removed from the stack, their data being modified in the process. Once the algorithm has terminated, the stack contains only the final modification of the root node of T, and its value is equal to the system's failure probability (reliability).
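    The stack-driven, bottom-up evaluation described in the two records above can be illustrated generically (this is not the paper's exact node formula or data structure; it is a hedged sketch in which leaves carry component reliabilities and internal nodes combine their children in series or in parallel, evaluated with an explicit stack rather than recursion):

    ```python
    # Generic illustration of evaluating a decomposition tree with an explicit stack:
    # each leaf carries a component reliability, each internal node combines its
    # children ("series" or "parallel"), and the root value is the system reliability.

    def evaluate(node):
        values = {}                     # node id -> computed value
        stack = [(node, False)]
        while stack:
            current, expanded = stack.pop()
            kind = current[0]
            if kind == "leaf":
                values[id(current)] = current[1]
            elif not expanded:
                stack.append((current, True))      # revisit after the children
                for child in current[1]:
                    stack.append((child, False))
            else:
                child_vals = [values[id(c)] for c in current[1]]
                if kind == "series":
                    v = 1.0
                    for r in child_vals:
                        v *= r
                else:                              # parallel
                    v = 1.0
                    for r in child_vals:
                        v *= (1.0 - r)
                    v = 1.0 - v
                values[id(current)] = v
        return values[id(node)]

    # ("series", [children]) / ("parallel", [children]) / ("leaf", reliability)
    tree = ("series", [("leaf", 0.99),
                       ("parallel", [("leaf", 0.95), ("leaf", 0.90)])])
    print(evaluate(tree))   # system reliability
    ```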

  5. Data identification for improving gene network inference using computational algebra.

    Science.gov (United States)

    Dimitrova, Elena; Stigler, Brandilyn

    2014-11-01

    Identification of models of gene regulatory networks is sensitive to the amount of data used as input. Considering the substantial costs in conducting experiments, it is of value to have an estimate of the amount of data required to infer the network structure. To minimize wasted resources, it is also beneficial to know which data are necessary to identify the network. Knowledge of the data and knowledge of the terms in polynomial models are often required a priori in model identification. In applications, it is unlikely that the structure of a polynomial model will be known, which may force data sets to be unnecessarily large in order to identify a model. Furthermore, none of the known results provides any strategy for constructing data sets to uniquely identify a model. We provide a specialization of an existing criterion for deciding when a set of data points identifies a minimal polynomial model when its monomial terms have been specified. Then, we relax the requirement of the knowledge of the monomials and present results for model identification given only the data. Finally, we present a method for constructing data sets that identify minimal polynomial models.

  6. Description of the TREBIL, CRESSEX and STREUSL computer programs, that belongs to RALLY computer code pack for the analysis of reliability systems

    International Nuclear Information System (INIS)

    Fernandes Filho, T.L.

    1982-11-01

    The RALLY computer code pack (RALLY pack) is a set of computer codes intended for the reliability analysis of complex systems, aiming at risk analysis. Three of the six codes are commented on, presenting their purpose, input description, calculation methods and the results obtained with each of them. The computer codes are: TREBIL, to obtain the logical equivalent of the fault tree; CRESSEX, to obtain the minimal cuts and the point values of the unreliability and unavailability of the system; and STREUSL, for the calculation of the dispersion of those values around the mean. Although CRESSEX, in the version available at CNEN, uses a rather lengthy method to obtain the minimal cuts in an HB-CNEN system, the three computer programs show good results, especially STREUSL, which permits the simulation of various components. (E.G.) [pt
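    As a hedged illustration of the kind of point estimate a minimal-cut-set code such as CRESSEX produces (the cut sets, component names and failure probabilities below are invented for the example and do not come from the RALLY documentation), the min-cut upper bound on system unavailability for independent components can be computed as follows:

    ```python
    # Min-cut upper bound on system unavailability for independent components:
    # Q_sys <= 1 - prod_over_cuts(1 - prod_over_components_in_cut(q_i)).
    # Component names and probabilities are illustrative only.

    q = {"pump_A": 1e-2, "pump_B": 1e-2, "valve": 5e-4, "power": 1e-3}

    minimal_cuts = [
        {"pump_A", "pump_B"},   # both redundant pumps unavailable
        {"valve"},              # single-point failure
        {"power"},              # single-point failure
    ]

    bound = 1.0
    for cut in minimal_cuts:
        q_cut = 1.0
        for comp in cut:
            q_cut *= q[comp]
        bound *= (1.0 - q_cut)
    bound = 1.0 - bound

    print(f"system unavailability (min-cut upper bound): {bound:.3e}")
    ```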

  7. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-01-01

    Finally, we take a step further by developing a novel feature selection method suitable for defining a computational framework capable of analyzing the genomic content of enhancers and reporting cell-line specific predictive signatures.

  8. Identification of risk factors of computer information technologies in education

    Directory of Open Access Journals (Sweden)

    Hrebniak M.P.

    2014-03-01

    Full Text Available The basic direction of development of secondary school and vocational training is computer training of schoolchildren and students, including distance forms of education and widespread usage of world information systems. The purpose of this work is to determine the risk factors for schoolchildren and students when using modern information and computer technologies. The results of the research made it possible to establish the dynamics of the formation of skills in using computer information technologies in education and the characteristics of mental ability among schoolchildren and students during their studies. Common risk factors when working with CIT are: intensification and formalization of intellectual activity, adverse ergonomic parameters, an unfavourable working posture, and exceedance of hygiene standards for chemical and physical characteristics. The priority preventive directions in applying computer information technology in education are: optimization of the visual parameters of activity, rationalization of ergonomic parameters, minimization of the adverse effects of chemical and physical conditions, and rationalization of work and rest schedules.

  9. Diagnostic reliability of the cervical vertebral maturation method and standing height in the identification of the mandibular growth spurt.

    Science.gov (United States)

    Perinetti, Giuseppe; Contardo, Luca; Castaldo, Attilio; McNamara, James A; Franchi, Lorenzo

    2016-07-01

    To evaluate the capability of both the cervical vertebral maturation (CVM) stages 3 and 4 (CS3-4 interval) and the peak in standing height to identify the mandibular growth spurt through diagnostic reliability analysis. A previous longitudinal data set derived from 24 untreated growing subjects (15 females and nine males), detailed elsewhere, was reanalyzed. Mandibular growth was defined as annual increments in Condylion (Co)-Gnathion (Gn) (total mandibular length) and Co-Gonion Intersection (Goi) (ramus height) and their arithmetic mean (mean mandibular growth [mMG]). Subsequently, individual annual increments in standing height, Co-Gn, Co-Goi, and mMG were arranged according to annual age intervals, with the first and last intervals defined as 7-8 years and 15-16 years, respectively. An analysis was performed to establish the diagnostic reliability of the CS3-4 interval and of the peak in standing height in the identification of the maximum individual increments of each Co-Gn, Co-Goi, and mMG measurement at each annual age interval. CS3-4 and the standing height peak show similar but variable accuracy across annual age intervals, registering values between 0.61 (standing height peak, Co-Gn) and 0.95 (standing height peak and CS3-4, mMG). Generally, satisfactory diagnostic reliability was seen when the mandibular growth spurt was identified on the basis of the Co-Goi and mMG increments. Both the CVM interval CS3-4 and the peak in standing height may be used in routine clinical practice to enhance the efficiency of treatments requiring identification of the mandibular growth spurt.

  10. Reliable identification of deep sulcal pits: the effects of scan session, scanner, and surface extraction tool.

    Directory of Open Access Journals (Sweden)

    Kiho Im

    Full Text Available Sulcal pit analysis has been providing novel insights into brain function and development. The purpose of this study was to evaluate the reliability of sulcal pit extraction with respect to the effects of scan session, scanner, and surface extraction tool. Five subjects were scanned 4 times at 3 MRI centers and another 5 subjects were scanned 3 times at 2 MRI centers, including 1 test-retest session. Sulcal pits were extracted on the white matter surfaces reconstructed with both the Montreal Neurological Institute and FreeSurfer pipelines. We estimated the similarity of the presence of sulcal pits, having a maximum value of 1, and their spatial difference within the same subject. The tests showed high similarity of sulcal pit presence and low spatial difference. The similarity was more than 0.90 and the spatial difference was less than 1.7 mm in most cases according to different scan sessions or scanners, and more than 0.85 and about 2.0 mm across surface extraction tools. The reliability of sulcal pit extraction was more affected by the image processing-related factors than by the scan session or scanner factors. Moreover, the similarity of sulcal pit distribution appeared to be largely influenced by the presence or absence of sulcal pits on the shallow and small folds. We suggest that our sulcal pit extraction from MRI is highly reliable and could be useful for clinical applications as an imaging biomarker.

  11. Reliable identification of deep sulcal pits: the effects of scan session, scanner, and surface extraction tool.

    Science.gov (United States)

    Im, Kiho; Lee, Jong-Min; Jeon, Seun; Kim, Jong-Heon; Seo, Sang Won; Na, Duk L; Grant, P Ellen

    2013-01-01

    Sulcal pit analysis has been providing novel insights into brain function and development. The purpose of this study was to evaluate the reliability of sulcal pit extraction with respect to the effects of scan session, scanner, and surface extraction tool. Five subjects were scanned 4 times at 3 MRI centers and another 5 subjects were scanned 3 times at 2 MRI centers, including 1 test-retest session. Sulcal pits were extracted on the white matter surfaces reconstructed with both the Montreal Neurological Institute and FreeSurfer pipelines. We estimated the similarity of the presence of sulcal pits, having a maximum value of 1, and their spatial difference within the same subject. The tests showed high similarity of sulcal pit presence and low spatial difference. The similarity was more than 0.90 and the spatial difference was less than 1.7 mm in most cases according to different scan sessions or scanners, and more than 0.85 and about 2.0 mm across surface extraction tools. The reliability of sulcal pit extraction was more affected by the image processing-related factors than by the scan session or scanner factors. Moreover, the similarity of sulcal pit distribution appeared to be largely influenced by the presence or absence of sulcal pits on the shallow and small folds. We suggest that our sulcal pit extraction from MRI is highly reliable and could be useful for clinical applications as an imaging biomarker.

  12. A comparative study of computed radiographic cephalometry and conventional cephalometry in reliability of head film measurements

    International Nuclear Information System (INIS)

    Kim, Hyung Done; Kim, Kee Deog; Park, Chang Seo

    1997-01-01

    The purpose of this study was to compare the variability of head film measurements (landmark identification) between Fuji computed radiographic (FCR) cephalometry and conventional cephalometry. 28 Korean adults were selected. A lateral cephalometric FCR film and a conventional cephalometric film were taken of each subject. Four investigators identified 24 cephalometric landmarks on the lateral cephalometric FCR films and the conventional cephalometric films, and the measurements were statistically analysed. The results were as follows: 1. For the FCR films and the conventional films, the coefficient of variation (C.V.) of the 24 landmarks was obtained horizontally and vertically. 2. In the comparison of landmark variability between the FCR films and the conventional films, the horizontal coefficient of variation showed significant differences for four of the twenty-four landmarks, whereas the vertical coefficient of variation showed significant differences for sixteen of the twenty-four landmarks. The FCR films showed significantly less variability than the conventional films for 17 of the 20 (4+16) landmarks that showed a significant difference.

  13. Computational identification of putative cytochrome P450 genes in ...

    African Journals Online (AJOL)

    In this work, a computational study of expressed sequence tags (ESTs) of soybean was performed by data mining methods and bio-informatics tools and as a result 78 putative P450 genes were identified, including 57 new ones. These genes were classified into five clans and 20 families by sequence similarities and among ...

  14. Accuracy and reliability of stitched cone-beam computed tomography images

    Energy Technology Data Exchange (ETDEWEB)

    Egbert, Nicholas [Private Practice, Reconstructive Dental Specialists of Utah, Salt Lake (United States); Cagna, David R.; Ahuja, Swati; Wicks, Russell A. [Dept. of Prosthodontics, University of Tennessee Health Science Center College of Dentistry, Memphis (United States)

    2015-03-15

    This study was performed to evaluate the linear distance accuracy and reliability of stitched small field of view (FOV) cone-beam computed tomography (CBCT) reconstructed images for the fabrication of implant surgical guides. Three gutta percha points were fixed on the inferior border of a cadaveric mandible to serve as control reference points. Ten additional gutta percha points, representing fiduciary markers, were scattered on the buccal and lingual cortices at the level of the proposed complete denture flange. A digital caliper was used to measure the distance between the reference points and fiduciary markers, which represented the anatomic linear dimension. The mandible was scanned using small FOV CBCT, and the images were then reconstructed and stitched using the manufacturer's imaging software. The same measurements were then taken with the CBCT software. The anatomic linear dimension measurements and stitched small FOV CBCT measurements were statistically evaluated for linear accuracy. The mean difference between the anatomic linear dimension measurements and the stitched small FOV CBCT measurements was found to be 0.34 mm with a 95% confidence interval of +0.24 - +0.44 mm and a mean standard deviation of 0.30 mm. The difference between the control and the stitched small FOV CBCT measurements was insignificant within the parameters defined by this study. The proven accuracy of stitched small FOV CBCT data sets may allow image-guided fabrication of implant surgical stents from such data sets.

  15. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of error occurrence in quantitative and qualitative manners, is gradually increasing because of the effects of human errors on system safety. HRA needs a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analysts. This problem makes the results of the task analysis inconsistent and unreliable. To address this problem, KAERI developed the structural information analysis (SIA) method, which helps to analyze task structures and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of CASIA. CASIA is expected to help HRA analysts perform the analysis more easily and consistently. If more analyses are performed and more data are accumulated in the CASIA database, HRA analysts will be able to share and spread their analysis experience freely, and thereby the quality of the HRA will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  16. Accuracy and reliability of stitched cone-beam computed tomography images

    International Nuclear Information System (INIS)

    Egbert, Nicholas; Cagna, David R.; Ahuja, Swati; Wicks, Russell A.

    2015-01-01

    This study was performed to evaluate the linear distance accuracy and reliability of stitched small field of view (FOV) cone-beam computed tomography (CBCT) reconstructed images for the fabrication of implant surgical guides. Three gutta percha points were fixed on the inferior border of a cadaveric mandible to serve as control reference points. Ten additional gutta percha points, representing fiduciary markers, were scattered on the buccal and lingual cortices at the level of the proposed complete denture flange. A digital caliper was used to measure the distance between the reference points and fiduciary markers, which represented the anatomic linear dimension. The mandible was scanned using small FOV CBCT, and the images were then reconstructed and stitched using the manufacturer's imaging software. The same measurements were then taken with the CBCT software. The anatomic linear dimension measurements and stitched small FOV CBCT measurements were statistically evaluated for linear accuracy. The mean difference between the anatomic linear dimension measurements and the stitched small FOV CBCT measurements was found to be 0.34 mm with a 95% confidence interval of +0.24 - +0.44 mm and a mean standard deviation of 0.30 mm. The difference between the control and the stitched small FOV CBCT measurements was insignificant within the parameters defined by this study. The proven accuracy of stitched small FOV CBCT data sets may allow image-guided fabrication of implant surgical stents from such data sets.

  17. Accuracy and reliability of stitched cone-beam computed tomography images.

    Science.gov (United States)

    Egbert, Nicholas; Cagna, David R; Ahuja, Swati; Wicks, Russell A

    2015-03-01

    This study was performed to evaluate the linear distance accuracy and reliability of stitched small field of view (FOV) cone-beam computed tomography (CBCT) reconstructed images for the fabrication of implant surgical guides. Three gutta percha points were fixed on the inferior border of a cadaveric mandible to serve as control reference points. Ten additional gutta percha points, representing fiduciary markers, were scattered on the buccal and lingual cortices at the level of the proposed complete denture flange. A digital caliper was used to measure the distance between the reference points and fiduciary markers, which represented the anatomic linear dimension. The mandible was scanned using small FOV CBCT, and the images were then reconstructed and stitched using the manufacturer's imaging software. The same measurements were then taken with the CBCT software. The anatomic linear dimension measurements and stitched small FOV CBCT measurements were statistically evaluated for linear accuracy. The mean difference between the anatomic linear dimension measurements and the stitched small FOV CBCT measurements was found to be 0.34 mm with a 95% confidence interval of +0.24 - +0.44 mm and a mean standard deviation of 0.30 mm. The difference between the control and the stitched small FOV CBCT measurements was insignificant within the parameters defined by this study. The proven accuracy of stitched small FOV CBCT data sets may allow image-guided fabrication of implant surgical stents from such data sets.

  18. Computational identification of candidate nucleotide cyclases in higher plants

    KAUST Repository

    Wong, Aloysius Tze

    2013-09-03

    In higher plants guanylyl cyclases (GCs) and adenylyl cyclases (ACs) cannot be identified using BLAST homology searches based on annotated cyclic nucleotide cyclases (CNCs) of prokaryotes, lower eukaryotes, or animals. The reason is that CNCs are often part of complex multifunctional proteins with different domain organizations and biological functions that are not conserved in higher plants. For this reason, we have developed CNC search strategies based on functionally conserved amino acids in the catalytic center of annotated and/or experimentally confirmed CNCs. Here we detail this method which has led to the identification of >25 novel candidate CNCs in Arabidopsis thaliana, several of which have been experimentally confirmed in vitro and in vivo. We foresee that the application of this method can be used to identify many more members of the growing family of CNCs in higher plants. © Springer Science+Business Media New York 2013.
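    A minimal sketch of the search strategy described above, assuming one encodes the functionally conserved catalytic-centre residues as a pattern and scans candidate protein sequences; the regular expression and the sequences below are placeholders, not the published GC/AC motif or real plant proteins:

    ```python
    # Sketch of a catalytic-centre motif scan over candidate protein sequences.
    # The regular expression is a placeholder pattern, NOT the published search
    # motif; it only illustrates encoding conserved residues (character classes)
    # and variable spacers (wildcards with length ranges) and scanning sequences.
    import re

    MOTIF = re.compile(r"[RK].{1,3}[DE].{2,6}C")   # hypothetical pattern

    proteins = {
        "candidate_1": "MAAKGGDLIVNQCWRTE",        # toy sequence containing the pattern
        "candidate_2": "MSSPQLLTTHHIKAAAG",        # toy sequence without it
    }

    for name, seq in proteins.items():
        for match in MOTIF.finditer(seq):
            print(f"{name}: putative catalytic centre at {match.start()}-{match.end()} "
                  f"({match.group()})")
    ```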

  19. Test-retest reliability of computer-based video analysis of general movements in healthy term-born infants.

    Science.gov (United States)

    Valle, Susanne Collier; Støen, Ragnhild; Sæther, Rannei; Jensenius, Alexander Refsum; Adde, Lars

    2015-10-01

    A computer-based video analysis has recently been presented for quantitative assessment of general movements (GMs). This method's test-retest reliability, however, has not yet been evaluated. The aim of the current study was to evaluate the test-retest reliability of computer-based video analysis of GMs, and to explore the association between computer-based video analysis and the temporal organization of fidgety movements (FMs). Test-retest reliability study. 75 healthy, term-born infants were recorded twice on the same day during the FMs period using a standardized video set-up. The computer-based movement variables "quantity of motion mean" (Qmean), "quantity of motion standard deviation" (QSD) and "centroid of motion standard deviation" (CSD) were analyzed, reflecting the amount of motion and the variability of the spatial center of motion of the infant, respectively. In addition, the association between the variable CSD and the temporal organization of FMs was explored. Intraclass correlation coefficients (ICC 1.1 and ICC 3.1) were calculated to assess test-retest reliability. The ICC values for the variables CSD, Qmean and QSD were 0.80, 0.80 and 0.86 for ICC (1.1), respectively, and 0.80, 0.86 and 0.90 for ICC (3.1), respectively. There were significantly lower CSD values in the recordings with continual FMs compared to the recordings with intermittent FMs. The study showed good test-retest reliability of computer-based video analysis of GMs, and a significant association between the computer-based video analysis and the temporal organization of FMs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
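    The two ICC forms quoted above can be reproduced with a short calculation; the sketch below uses the one-way ICC(1,1) and two-way consistency ICC(3,1) formulas built from standard ANOVA mean squares, applied to a small made-up subjects-by-sessions matrix (the numbers are not the study's data):

    ```python
    # ICC(1,1) and ICC(3,1) (Shrout & Fleiss) from an n-subjects x k-sessions matrix.
    import numpy as np

    def icc_1_1_and_3_1(x):
        n, k = x.shape
        grand = x.mean()
        row_means = x.mean(axis=1)
        col_means = x.mean(axis=0)

        ss_total = ((x - grand) ** 2).sum()
        ss_rows = k * ((row_means - grand) ** 2).sum()   # between subjects
        ss_cols = n * ((col_means - grand) ** 2).sum()   # between sessions
        ss_error = ss_total - ss_rows - ss_cols

        msb = ss_rows / (n - 1)                          # between-subjects mean square
        msw = (ss_total - ss_rows) / (n * (k - 1))       # within-subject mean square
        mse = ss_error / ((n - 1) * (k - 1))             # residual mean square

        icc11 = (msb - msw) / (msb + (k - 1) * msw)
        icc31 = (msb - mse) / (msb + (k - 1) * mse)
        return icc11, icc31

    ratings = np.array([[12.1, 12.4],   # one row per infant, one column per recording
                        [15.3, 15.0],
                        [9.8, 10.1],
                        [20.2, 19.7],
                        [13.4, 13.9]])
    print(icc_1_1_and_3_1(ratings))
    ```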

  20. Novel Methods to Enhance Precision and Reliability in Muscle Synergy Identification during Walking

    Science.gov (United States)

    Kim, Yushin; Bulea, Thomas C.; Damiano, Diane L.

    2016-01-01

    Muscle synergies are hypothesized to reflect modular control of muscle groups via descending commands sent through multiple neural pathways. Recently, the number of synergies has been reported as a functionally relevant indicator of motor control complexity in individuals with neurological movement disorders. Yet the number of synergies extracted during a given activity, e.g., gait, varies within and across studies, even for unimpaired individuals. With no standardized methods for precise determination, this variability remains unexplained making comparisons across studies and cohorts difficult. Here, we utilize k-means clustering and intra-class and between-level correlation coefficients to precisely discriminate reliable from unreliable synergies. Electromyography (EMG) was recorded bilaterally from eight leg muscles during treadmill walking at self-selected speed. Muscle synergies were extracted from 20 consecutive gait cycles using non-negative matrix factorization. We demonstrate that the number of synergies is highly dependent on the threshold when using the variance accounted for by reconstructed EMG. Beyond use of threshold, our method utilized a quantitative metric to reliably identify four or five synergies underpinning walking in unimpaired adults and revealed synergies having poor reproducibility that should not be considered as true synergies. We show that robust and unreliable synergies emerge similarly, emphasizing the need for careful analysis in those with pathology. PMID:27695403
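    A minimal sketch of the extraction step described above, assuming rectified EMG envelopes arranged as a muscles-by-samples matrix and using non-negative matrix factorization with a variance-accounted-for (VAF) check (the data here are random placeholders, and this is not the authors' clustering-based reliability procedure):

    ```python
    # EMG (muscles x samples) ~= W (muscles x synergies) @ H (synergies x samples),
    # with VAF inspected for an increasing number of synergies.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    emg = np.abs(rng.normal(size=(8, 2000)))      # 8 muscles, rectified envelopes

    for n_syn in range(1, 6):
        model = NMF(n_components=n_syn, init="nndsvda", max_iter=1000, random_state=0)
        w = model.fit_transform(emg)              # muscle weightings
        h = model.components_                     # activation time courses
        recon = w @ h
        vaf = 1.0 - np.sum((emg - recon) ** 2) / np.sum(emg ** 2)
        print(f"{n_syn} synergies: VAF = {vaf:.3f}")
    ```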

  1. Identification of a practical and reliable method for the evaluation of litter moisture in turkey production.

    Science.gov (United States)

    Vinco, L J; Giacomelli, S; Campana, L; Chiari, M; Vitale, N; Lombardi, G; Veldkamp, T; Hocking, P M

    2018-02-01

    1. An experiment was conducted to compare 5 different methods for the evaluation of litter moisture. 2. For litter collection and assessment, 55 farms were selected; one shed from each farm was inspected and 9 points were identified within each shed. 3. For each device used for the evaluation of litter moisture, the mean and standard deviation of the wetness measures per collection point were assessed. 4. The reliability and overall consistency between the 5 instruments used to measure wetness were high (α = 0.72). 5. Measurements at three of the 9 collection points were sufficient to provide a reliable assessment of litter moisture throughout the shed. 6. Based on the direct correlation between litter moisture and footpad lesions, litter moisture measurement can be used as a resource-based on-farm animal welfare indicator. 7. Among the 5 methods analysed, visual scoring is the simplest and most practical, and therefore the best candidate for on-farm animal welfare assessment.
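    The consistency statistic quoted in point 4 above (Cronbach's alpha) can be computed directly from a points-by-methods score matrix; the scores below are fabricated for illustration only:

    ```python
    # Cronbach's alpha across methods scoring the same collection points.
    import numpy as np

    def cronbach_alpha(scores):
        """scores: collection points x methods matrix."""
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)          # variance of each method
        total_var = scores.sum(axis=1).var(ddof=1)      # variance of the summed score
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    scores = np.array([[3, 3, 2, 3, 3],
                       [1, 1, 1, 2, 1],
                       [4, 5, 4, 4, 5],
                       [2, 2, 3, 2, 2],
                       [5, 4, 5, 5, 4]], dtype=float)
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```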

  2. COMPUTER-AIDED IDENTIFICATION OF ENVIRONMENTAL IMPACTS OF BUSINESSES

    Directory of Open Access Journals (Sweden)

    Halina Marczak

    2015-11-01

    Full Text Available The elements of the computer databases used in the “input-output” analysis method to identify an enterprise's impact on the environment are discussed. The “input-output” database parameters essential for a company performing road transport of finished products are presented. A method is presented for determining two of these parameters: the energy consumed by road transport and the amount of emissions released with exhaust fumes from vehicles. The parameters of the computer databases needed to calculate the fees for the use of the environment are also discussed. These data and the fees for environmental use may be useful in assessing the scale of businesses' impact on the environment. A method for determining the fee for the gases and dust emitted into the air from the combustion of fuels in internal combustion engines is described.
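    A simple arithmetic sketch of the kind of calculation described above, going from fuel burned by the delivery fleet to exhaust emissions to an environmental fee; all emission factors and fee rates below are hypothetical placeholders, not legal values:

    ```python
    # Fuel burned -> pollutant mass -> fee, with invented factors and rates.
    fuel_litres = 12_500.0                 # diesel used for product transport in a period

    emission_factors = {                   # kg of pollutant per litre of diesel (assumed)
        "CO2": 2.68,
        "NOx": 0.045,
        "PM":  0.002,
    }
    fee_rates = {"CO2": 0.0003, "NOx": 0.60, "PM": 1.20}   # currency units per kg (assumed)

    total_fee = 0.0
    for pollutant, factor in emission_factors.items():
        emitted_kg = fuel_litres * factor
        fee = emitted_kg * fee_rates[pollutant]
        total_fee += fee
        print(f"{pollutant}: {emitted_kg:,.1f} kg -> fee {fee:,.2f}")
    print(f"total fee: {total_fee:,.2f}")
    ```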

  3. New algorithm to reduce the number of computing steps in reliability formula of Weighted-k-out-of-n system

    Directory of Open Access Journals (Sweden)

    Tatsunari Ohkura

    2007-02-01

    Full Text Available In the disjoint products version of reliability analysis of weighted-k-out-of-n systems, it is necessary to determine the order in which the weights of the components are to be considered. The k-out-of-n:G (F) system consists of n components; each component has its own probability and positive integer weight such that the system is operational (failed) if and only if the total weight of some operational (failed) components is at least k. This paper designs a method to compute the reliability in O(nk) computing time and O(nk) memory space. The proposed method expresses the system reliability in fewer product terms than those already published.
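    The O(nk) bound mentioned above can be illustrated with a straightforward dynamic program over components (this sketch is a generic formulation of weighted-k-out-of-n:G reliability, not the paper's disjoint-products method; weights and probabilities are made up):

    ```python
    # O(nk) dynamic program: the system works iff the total weight of the
    # operational components is at least k.

    def weighted_k_out_of_n_reliability(components, k):
        """components: list of (weight, probability that the component is operational)."""
        # dist[w] = probability that the accumulated operational weight is w (capped at k)
        dist = [0.0] * (k + 1)
        dist[0] = 1.0
        for weight, p in components:
            new = [0.0] * (k + 1)
            for w, prob in enumerate(dist):
                if prob == 0.0:
                    continue
                new[w] += prob * (1.0 - p)              # component failed
                new[min(k, w + weight)] += prob * p     # component operational
            dist = new
        return dist[k]   # probability that the weight threshold k is reached

    components = [(3, 0.95), (2, 0.90), (2, 0.90), (1, 0.85), (4, 0.99)]
    print(weighted_k_out_of_n_reliability(components, k=6))
    ```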

  4. Electronic structure of BN-aromatics: Choice of reliable computational tools

    Science.gov (United States)

    Mazière, Audrey; Chrostowska, Anna; Darrigan, Clovis; Dargelos, Alain; Graciaa, Alain; Chermette, Henry

    2017-10-01

    The importance of having reliable calculation tools to interpret and predict the electronic properties of BN-aromatics is directly linked to the growing interest for these very promising new systems in the field of materials science, biomedical research, or energy sustainability. Ionization energy (IE) is one of the most important parameters to approach the electronic structure of molecules. It can be theoretically estimated, but in order to evaluate their persistence and propose the most reliable tools for the evaluation of different electronic properties of existent or only imagined BN-containing compounds, we took as reference experimental values of ionization energies provided by ultra-violet photoelectron spectroscopy (UV-PES) in gas phase—the only technique giving access to the energy levels of filled molecular orbitals. Thus, a set of 21 aromatic molecules containing B-N bonds and B-N-B patterns has been merged for a comparison between experimental IEs obtained by UV-PES and various theoretical approaches for their estimation. Time-Dependent Density Functional Theory (TD-DFT) methods using B3LYP and long-range corrected CAM-B3LYP functionals are used, combined with the Δ SCF approach, and compared with electron propagator theory such as outer valence Green's function (OVGF, P3) and symmetry adapted cluster-configuration interaction ab initio methods. Direct Kohn-Sham estimation and "corrected" Kohn-Sham estimation are also given. The deviation between experimental and theoretical values is computed for each molecule, and a statistical study is performed over the average and the root mean square for the whole set and sub-sets of molecules. It is shown that (i) Δ SCF+TDDFT(CAM-B3LYP), OVGF, and P3 are the most efficient way for a good agreement with UV-PES values, (ii) a CAM-B3LYP range-separated hybrid functional is significantly better than B3LYP for the purpose, especially for extended conjugated systems, and (iii) the "corrected" Kohn-Sham result is a

  5. Pore sub-features reproducibility in direct microscopic and Livescan images--their reliability in personal identification.

    Science.gov (United States)

    Gupta, Abhishek; Sutton, Raul

    2010-07-01

    Third level features have been reported to have discriminatory power equal to that of second level details in establishing personal identification. Pore area, as an extended-set third level sub-feature, has been studied by minimizing possible factors that could affect pore size. The reproducibility of pore surface area has been studied using direct microscopic and 500 ppi Livescan images. Direct microscopic pore area measurements indicated that the day on which the pore area was measured had a significant impact on the measured pore area. Pore area was shown to be difficult to estimate in 500 ppi Livescan images owing to the lack of resolution. It is therefore not possible to reliably use pore area as an identifying feature in fingerprint examination.

  6. Computational identification of MoRFs in protein sequences.

    Science.gov (United States)

    Malhis, Nawar; Gsponer, Jörg

    2015-06-01

    Intrinsically disordered regions of proteins play an essential role in the regulation of various biological processes. Key to their regulatory function is the binding of molecular recognition features (MoRFs) to globular protein domains in a process known as a disorder-to-order transition. Predicting the location of MoRFs in protein sequences with high accuracy remains an important computational challenge. In this study, we introduce MoRFCHiBi, a new computational approach for fast and accurate prediction of MoRFs in protein sequences. MoRFCHiBi combines the outcomes of two support vector machine (SVM) models that take advantage of two different kernels with high noise tolerance. The first, SVMS, is designed to extract maximal information from the general contrast in amino acid compositions between MoRFs, their surrounding regions (Flanks), and the remainders of the sequences. The second, SVMT, is used to identify similarities between regions in a query sequence and MoRFs of the training set. We evaluated the performance of our predictor by comparing its results with those of two currently available MoRF predictors, MoRFpred and ANCHOR. Using three test sets that have previously been collected and used to evaluate MoRFpred and ANCHOR, we demonstrate that MoRFCHiBi outperforms the other predictors with respect to different evaluation metrics. In addition, MoRFCHiBi is downloadable and fast, which makes it useful as a component in other computational prediction tools. http://www.chibi.ubc.ca/morf/. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
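    A generic illustration of the idea of combining the scores of two SVMs trained on different feature views of the same residues (this is not the MoRFCHiBi implementation; the features, labels, kernels and the averaging rule are placeholder assumptions):

    ```python
    # Two SVMs on different feature views, combined into one per-residue propensity.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    X_composition = rng.normal(size=(300, 20))   # e.g. window amino-acid composition
    X_similarity = rng.normal(size=(300, 10))    # e.g. similarity-to-known-MoRF features
    y = rng.integers(0, 2, size=300)             # 1 = MoRF residue, 0 = non-MoRF (fake labels)

    svm_s = SVC(kernel="rbf", probability=True, random_state=0).fit(X_composition, y)
    svm_t = SVC(kernel="rbf", probability=True, random_state=0).fit(X_similarity, y)

    # Combine the two per-residue propensities (simple average as a placeholder rule).
    p_combined = 0.5 * (svm_s.predict_proba(X_composition)[:, 1]
                        + svm_t.predict_proba(X_similarity)[:, 1])
    print(p_combined[:5])
    ```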

  7. Cloud identification using genetic algorithms and massively parallel computation

    Science.gov (United States)

    Buckles, Bill P.; Petry, Frederick E.

    1996-01-01

    As a Guest Computational Investigator under the NASA-administered component of the High Performance Computing and Communication Program, we implemented a massively parallel genetic algorithm on the MasPar SIMD computer. Experiments were conducted using Earth Science data in the domains of meteorology and oceanography. Results obtained in these domains are competitive with, and in most cases better than, similar problems solved using other methods. In the meteorological domain, we chose to identify clouds using AVHRR spectral data. Four cloud speciations were used although most researchers settle for three. Results were remarkably consistent across all tests (91% accuracy). Refinements of this method may lead to more timely and complete information for Global Circulation Models (GCMs) that are prevalent in weather forecasting and global environment studies. In the oceanographic domain, we chose to identify ocean currents from a spectrometer having similar characteristics to AVHRR. Here the results were mixed (60% to 80% accuracy). Given that one is willing to run the experiment several times (say 10), it is acceptable to claim the higher accuracy rating. This problem has never been successfully automated. Therefore, these results are encouraging even though less impressive than the cloud experiment. Successful conclusion of an automated ocean current detection system would impact coastal fishing, naval tactics, and the study of micro-climates. Finally, we contributed to the basic knowledge of GA (genetic algorithm) behavior in parallel environments. We developed better knowledge of the use of subpopulations in the context of shared breeding pools and the migration of individuals. Rigorous experiments were conducted based on quantifiable performance criteria. While much of the work confirmed current wisdom, for the first time we were able to submit conclusive evidence. The software developed under this grant was placed in the public domain. An extensive user

  8. Computational identification of antigen-binding antibody fragments.

    Science.gov (United States)

    Burkovitz, Anat; Leiderman, Olga; Sela-Culang, Inbal; Byk, Gerardo; Ofran, Yanay

    2013-03-01

    Determining which parts of the Ab are essential for Ag recognition and binding is crucial for understanding B cell-mediated immunity. Identification of fragments of Abs that maintain specificity to the Ag will also allow for the development of improved Ab-based therapy and diagnostics. In this article, we show that structural analysis of Ab-Ag complexes reveals which fragments of the Ab may bind the Ag on their own. In particular, it is possible to predict whether a given CDR is likely to bind the Ag as a peptide by analyzing the energetic contribution of each CDR to Ag binding and by assessing to what extent the interaction between that CDR and the Ag depends on other CDRs. To demonstrate this, we analyzed five Ab-Ag complexes and predicted for each of them which of the CDRs may bind the Ag on its own as a peptide. We then show that these predictions are in agreement with our experimental analysis and with previously published experimental results. These findings promote our understanding of the modular nature of Ab-Ag interactions and lay the foundation for the rational design of active CDR-derived peptides.

  9. Reliability of trajectory identification for cosmic heavy ions and cytogenetic effects of their passage through plant seeds

    International Nuclear Information System (INIS)

    Facius, R.; Reitz, G.; Buecker, H.; Nevzgodina, L.V.; Maximova, E.N.; Kaminskaya, E.V.; Virkov, A.I.; Marenny, A.M.; Akatov, Yu.A.

    1990-01-01

    The potentially specific importance of the study of heavy ions from galactic cosmic rays for the understanding of radiation protection in manned spaceflight continues to stimulate spaceflight experiments in order to investigate the radiobiological properties of these ions. Chromosome aberrations as an expression of a direct assault on the genome are of particular interest in view of carcinogenesis as the primary radiation risk for man in space. An essential technical ingredient of such spaceflight experiments is the visual nuclear track detector which permits identification of those biological test organisms which have been affected by cosmic heavy ions. We describe such a technique and report on an analysis of the qualitative and quantitative reliability of this identification of particle trajectories in layers of biological test organisms. The incidence of chromosome aberrations in cells of lettuce seeds, Lactuca sativa, exposed during the Kosmos 1887 mission, was determined for seeds hit by cosmic heavy ions. In those seeds the incidence of both single and multiple chromosome aberrations was enhanced. (author)

  10. Reliability of trajectory identification for cosmic heavy ions and cytogenetic effects of their passage through plant seeds

    Energy Technology Data Exchange (ETDEWEB)

    Facius, R.; Reitz, G.; Buecker, H. (Deutsche Forschungsanstalt fuer Luft- und Raumfahrt e.V. (DLR), Koeln (Germany, F.R.)); Nevzgodina, L.V.; Maximova, E.N.; Kaminskaya, E.V.; Virkov, A.I.; Marenny, A.M.; Akatov, Yu.A. (Ministry of Public Health, Moscow (USSR). Inst. of Biomedical Problems)

    1990-01-01

    The potentially specific importance of the study of heavy ions from galactic cosmic rays for the understanding of radiation protection in manned spaceflight continues to stimulate spaceflight experiments in order to investigate the radiobiological properties of these ions. Chromosome aberrations as an expression of a direct assault on the genome are of particular interest in view of carcinogenesis as the primary radiation risk for man in space. An essential technical ingredient of such spaceflight experiments is the visual nuclear track detector which permits identification of those biological test organisms which have been affected by cosmic heavy ions. We describe such a technique and report on an analysis of the qualitative and quantitative reliability of this identification of particle trajectories in layers of biological test organisms. The incidence of chromosome aberrations in cells of lettuce seeds, Lactuca sativa, exposed during the Kosmos 1887 mission, was determined for seeds hit by cosmic heavy ions. In those seeds the incidence of both single and multiple chromosome aberrations was enhanced. (author).

  11. Strength and Reliability of Wood for the Components of Low-cost Wind Turbines: Computational and Experimental Analysis and Applications

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Freere, Peter; Sharma, Ranjan

    2009-01-01

    This paper reports the latest results of the comprehensive program of experimental and computational analysis of strength and reliability of wooden parts of low-cost wind turbines. The possibilities of prediction of strength and reliability of different types of wood are studied in the series of experiments and computational investigations. Low-cost testing machines have been designed and employed for the systematic analysis of different sorts of Nepali wood to be used for the wind turbine construction. At the same time, computational micromechanical models of deformation and strength of wood are developed, which should provide the basis for microstructure-based correlating of observable and service properties of wood. Some correlations between microstructure, strength and service properties of wood have been established.

  12. Computer-Aided Identification and Validation of Privacy Requirements

    Directory of Open Access Journals (Sweden)

    Rene Meis

    2016-05-01

    Full Text Available Privacy is a software quality that is closely related to security. The main difference is that security properties aim at the protection of assets that are crucial for the considered system, and privacy aims at the protection of personal data that are processed by the system. The identification of privacy protection needs in complex systems is a hard and error prone task. Stakeholders whose personal data are processed might be overlooked, or the sensitivity and the need of protection of the personal data might be underestimated. The later personal data and the needs to protect them are identified during the development process, the more expensive it is to fix these issues, because the needed changes of the system-to-be often affect many functionalities. In this paper, we present a systematic method to identify the privacy needs of a software system based on a set of functional requirements by extending the problem-based privacy analysis (ProPAn method. Our method is tool-supported and automated where possible to reduce the effort that has to be spent for the privacy analysis, which is especially important when considering complex systems. The contribution of this paper is a semi-automatic method to identify the relevant privacy requirements for a software-to-be based on its functional requirements. The considered privacy requirements address all dimensions of privacy that are relevant for software development. As our method is solely based on the functional requirements of the system to be, we enable users of our method to identify the privacy protection needs that have to be addressed by the software-to-be at an early stage of the development. As initial evaluation of our method, we show its applicability on a small electronic health system scenario.

  13. Reliability of lip prints in personal identification: An inter-racial pilot study.

    Science.gov (United States)

    Kumar, Laliytha Bijai; Jayaraman, Venkatesh; Mathew, Philips; Ramasamy, S; Austin, Ravi David

    2016-01-01

    Forensic science is a branch of science that deals with the application of science and technology to solving crime, and this requires a multidisciplinary team effort. The word "forensic" is derived from the Latin word "forensis", which means pertaining to the public. Dental professionals should develop an interest in contributing to legal issues. To study the lip prints among people of different races. Descriptive study. The present study comprised ninety subjects, of which Group A consisted of Africans, Group B of Dravidians, and Group C of subjects of Mongoloid race. Each group was then further divided into 15 males and 15 females, for whom the lip prints were recorded and evaluated. ANOVA test. ANOVA statistical analysis was used to compare the African, Dravidian, and Mongoloid races. The observed data for males and females were found to be significant with P = 0.000492. The present study showed a significant difference in lip pattern among the three races. Perhaps future studies with a larger sample size and comparisons between other races may be done for better personal identification.

  14. Apps for Angiosperms: The Usability of Mobile Computers and Printed Field Guides for UK Wild Flower and Winter Tree Identification

    Science.gov (United States)

    Stagg, Bethan C.; Donkin, Maria E.

    2017-01-01

    We investigated usability of mobile computers and field guide books with adult botanical novices, for the identification of wildflowers and deciduous trees in winter. Identification accuracy was significantly higher for wildflowers using a mobile computer app than field guide books but significantly lower for deciduous trees. User preference…

  15. Identification and Evaluation of Reliable Reference Genes in the Medicinal Fungus Shiraia bambusicola.

    Science.gov (United States)

    Song, Liang; Li, Tong; Fan, Li; Shen, Xiao-Ye; Hou, Cheng-Lin

    2016-04-01

    The stability of reference genes plays a vital role in real-time quantitative reverse transcription polymerase chain reaction (qRT-PCR) analysis, which is generally regarded as a convenient and sensitive tool for the analysis of gene expression. A well-known medicinal fungus, Shiraia bambusicola, has great potential in the pharmaceutical, agricultural and food industries, but its suitable reference genes have not yet been determined. In the present study, 11 candidate reference genes in S. bambusicola were first evaluated and validated comprehensively. To identify suitable reference genes for qRT-PCR analysis, three software-based algorithms, geNorm, NormFinder and BestKeeper, were applied to rank the tested genes. RNA samples were collected from seven fermentation stages using different media (potato dextrose or Czapek medium) and under different light conditions (12-h light/12-h dark and all-dark). The three most appropriate reference genes, ubi, tfc and ags, were able to normalize the qRT-PCR results under the 12-h light/12-h dark culturing conditions, whereas three other genes, vac, gke and acyl, performed better under all-dark growth. Therefore, under different light conditions, at least two reference genes (ubi and vac) could be employed to assure the reliability of qRT-PCR results. For both the natural culture medium (most appropriate genes of this group: ubi, tfc and ags) and the chemically defined synthetic medium (most stable genes of this group: tfc, vac and ef), the tfc gene remained the best gene for normalizing the gene expression measured with qRT-PCR. It is anticipated that these results will improve the selection of suitable reference genes for qRT-PCR assays and lay the foundation for an accurate analysis of gene expression in S. bambusicola.

  16. Interaction Entropy: A New Paradigm for Highly Efficient and Reliable Computation of Protein-Ligand Binding Free Energy.

    Science.gov (United States)

    Duan, Lili; Liu, Xiao; Zhang, John Z H

    2016-05-04

    Efficient and reliable calculation of protein-ligand binding free energy is a grand challenge in computational biology and is of critical importance in drug design and many other molecular recognition problems. The main challenge lies in the calculation of entropic contribution to protein-ligand binding or interaction systems. In this report, we present a new interaction entropy method which is theoretically rigorous, computationally efficient, and numerically reliable for calculating entropic contribution to free energy in protein-ligand binding and other interaction processes. Drastically different from the widely employed but extremely expensive normal mode method for calculating entropy change in protein-ligand binding, the new method calculates the entropic component (interaction entropy or -TΔS) of the binding free energy directly from molecular dynamics simulation without any extra computational cost. Extensive study of over a dozen randomly selected protein-ligand binding systems demonstrated that this interaction entropy method is both computationally efficient and numerically reliable and is vastly superior to the standard normal mode approach. This interaction entropy paradigm introduces a novel and intuitive conceptual understanding of the entropic effect in protein-ligand binding and other general interaction systems as well as a practical method for highly efficient calculation of this effect.
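
    The interaction entropy is obtained from fluctuations of the protein-ligand interaction energy sampled along the molecular dynamics trajectory, as -TΔS = kT ln⟨exp(βΔE_int)⟩, where ΔE_int is the deviation of the interaction energy from its mean. A minimal sketch of that post-processing step, assuming a hypothetical array of sampled interaction energies in kcal/mol:

        import numpy as np

        KB = 0.0019872041  # Boltzmann constant in kcal/(mol*K)

        def interaction_entropy(e_int, temperature=300.0):
            """-T*dS from interaction-energy fluctuations (interaction entropy method).

            e_int: interaction energies (kcal/mol) sampled along an MD trajectory.
            Returns -T*dS = kT * ln< exp(dE/kT) >, with dE = e_int - <e_int>.
            """
            kt = KB * temperature
            de = e_int - np.mean(e_int)
            return kt * np.log(np.mean(np.exp(de / kt)))

        # Hypothetical trajectory of interaction energies (kcal/mol)
        rng = np.random.default_rng(1)
        e_int = -45.0 + rng.normal(0.0, 2.0, size=5000)
        print(f"-T*dS = {interaction_entropy(e_int):.2f} kcal/mol")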

  17. Computational identification and analysis of novel sugarcane microRNAs

    Directory of Open Access Journals (Sweden)

    Thiebaut Flávia

    2012-07-01

    Background: MicroRNA regulation of gene expression plays a key role in development and in the response to biotic and abiotic stresses. Deep sequencing analyses accelerate the process of small RNA discovery in many plants and expand our understanding of miRNA-regulated processes. We therefore undertook small RNA sequencing of sugarcane miRNAs in order to understand their complexity and to explore their role in sugarcane biology. Results: A bioinformatics search was carried out to discover novel miRNAs that can be regulated in sugarcane plants subjected to drought and salt stresses, and under pathogen infection. Based on the presence of miRNA precursors in the related sorghum genome, we identified 623 candidate new mature miRNAs in sugarcane. Of these, 44 were classified as high-confidence miRNAs. The biological function of the new miRNA candidates was assessed by analyzing their putative targets. The set of bona fide sugarcane miRNAs includes those likely targeting serine/threonine kinases, Myb and zinc finger proteins. Additionally, a MADS-box transcription factor and an RPP2B protein, which act in development and disease resistance processes, could be regulated by cleavage (21-nt species) and DNA methylation (24-nt species), respectively. Conclusions: A large-scale investigation of sRNAs in sugarcane using a computational approach has identified a substantial number of new miRNAs and provides detailed genotype-tissue-culture miRNA expression profiles. Comparative analysis between monocots was valuable to clarify aspects of the conservation of miRNAs and their targets in a plant whose genome has not yet been sequenced. Our findings contribute to knowledge of miRNA roles in regulatory pathways in the complex, polyploid sugarcane genome.

  18. The reliability and validity of the Alcohol Use Disorders Identification Test (AUDIT) in a German general practice population sample.

    Science.gov (United States)

    Dybek, Inga; Bischof, Gallus; Grothues, Janina; Reinhardt, Susa; Meyer, Christian; Hapke, Ulfert; John, Ulrich; Broocks, Andreas; Hohagen, Fritz; Rumpf, Hans-Jürgen

    2006-05-01

    Our goal was to analyze the retest reliability and validity of the Alcohol Use Disorders Identification Test (AUDIT) in a primary-care setting and recommend a cut-off value for the different alcohol-related diagnoses. Participants recruited from general practices (GPs) in two northern German cities received the AUDIT, which was embedded in a health-risk questionnaire. In total, 10,803 screenings were conducted. The retest reliability was tested on a subsample of 99 patients, with an intertest interval of 30 days. Sensitivity and specificity at a number of different cut-off values were estimated for the sample of alcohol consumers (n=8237). For this study, 1109 screen-positive patients received a diagnostic interview. Individuals who scored less than five points in the AUDIT and also tested negative in a second alcohol-related screen were defined as "negative" (n=6003). This definition was supported by diagnostic interviews of 99 screen-negative patients from which no false negatives could be detected. As the gold standard for detection of an alcohol-use disorder (AUD), we used the Munich-Composite International Diagnostic Interview (MCIDI), which is based on Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, criteria. On the item level, the reliability, measured by the intraclass correlation coefficient (ICC), ranged between .39 (Item 9) and .98 (Item 10). For the total score, the ICC was .95. For cut-off values of eight points and five points, 87.5% and 88.9%, respectively, of the AUDIT-positives, and 98.9% and 95.1%, respectively, of the AUDIT-negatives were identically identified at retest, with kappa = .86 and kappa = .81. At the cut-off value of five points, we determined good combinations of sensitivity and specificity for the following diagnoses: alcohol dependence (sensitivity and specificity of .97 and .88, respectively), AUD (.97 and .92), and AUD and/or at-risk consumption (.97 and .91). Embedded in a health-risk questionnaire in

  19. Appraisal of the PREP, KITT, and SAMPLE computer codes for the evaluation of the reliability characteristics of engineered systems

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, P; White, R F

    1976-01-01

    For the probabilistic approach to reactor safety assessment by the use of event tree and fault tree techniques it is essential to be able to estimate the probabilities of failure of the various engineered safety features provided to mitigate the effects of postulated accident sequences. The PREP, KITT and SAMPLE computer codes, which incorporate Kinetic Tree Theory, perform these calculations and have been used extensively to evaluate the reliability characteristics of engineered safety features of American nuclear reactors. Working versions of these computer codes are now available in SRD, and this report explains the merits, capabilities and ease of application of the PREP, KITT, and SAMPLE programs for the solution of system reliability problems.

  20. Poverty identification for a pro-poor health insurance scheme in Tanzania: reliability and multi-level stakeholder perceptions.

    Science.gov (United States)

    Kuwawenaruwa, August; Baraka, Jitihada; Ramsey, Kate; Manzi, Fatuma; Bellows, Ben; Borghi, Josephine

    2015-12-01

    Many low income countries have policies to exempt the poor from user charges in public facilities. Reliably identifying the poor is a challenge when implementing such policies. In Tanzania, a scorecard system was established in 2011, within a programme providing free national health insurance fund (NHIF) cards, to identify poor pregnant women and their families, based on eight components. Using a series of reliability tests on a 2012 dataset of 2,621 households in two districts, this study compares household poverty levels using the scorecard, a wealth index, and monthly consumption expenditures. We compared the distributions of the three wealth measures, and the consistency of household poverty classification using cross-tabulations and the Kappa statistic. We measured errors of inclusion and exclusion of the scorecard relative to the other methods. We also gathered perceptions of the scorecard criteria through qualitative interviews with stakeholders at multiple levels of the health system. The distribution of the scorecard was less skewed than other wealth measures and not truncated, but demonstrated clumping. There was a higher level of agreement between the scorecard and the wealth index than consumption expenditure. The scorecard identified a similar number of poor households as the "basic needs" poverty line based on monthly consumption expenditure, with only 45 % errors of inclusion. However, it failed to pick up half of those living below the "basic needs" poverty line as being poor. Stakeholders supported the inclusion of water sources, income, food security and disability measures but had reservations about other items on the scorecard. In choosing poverty identification strategies for programmes seeking to enhance health equity it's necessary to balance between community acceptability, local relevance and the need for such a strategy. It is important to ensure the strategy is efficient and less costly than alternatives in order to effectively reduce
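
    Agreement between the scorecard and the other wealth measures is summarized above with cross-tabulations and the Kappa statistic. A minimal sketch of Cohen's kappa for two binary poverty classifications, using hypothetical labels rather than the study data:

        import numpy as np

        def cohens_kappa(a, b):
            """Cohen's kappa for agreement between two classifications."""
            a, b = np.asarray(a), np.asarray(b)
            p_obs = np.mean(a == b)
            # Expected agreement if the two methods classified independently
            classes = np.unique(np.concatenate([a, b]))
            p_exp = sum(np.mean(a == c) * np.mean(b == c) for c in classes)
            return (p_obs - p_exp) / (1.0 - p_exp)

        # Hypothetical classifications (1 = poor): scorecard vs. consumption poverty line
        scorecard   = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]
        consumption = [1, 0, 1, 1, 0, 0, 0, 0, 1, 0]
        print(round(cohens_kappa(scorecard, consumption), 2))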

  1. The engine maintenance scheduling by using reliability centered maintenance method and the identification of 5S application in PT. XYZ

    Science.gov (United States)

    Sembiring, N.; Panjaitan, N.; Saragih, A. F.

    2018-02-01

    PT. XYZ is a manufacturing company that processes fresh fruit bunches (FFB) into Crude Palm Oil (CPO) and Palm Kernel Oil (PKO). PT. XYZ consists of six work stations: the receipt, sterilizing, threshing, pressing, clarification, and kernel stations. So far, the company still applies a corrective maintenance system for its production machines, in which repairs are made only after damage occurs. The problem at PT. XYZ is the absence of planned machine maintenance scheduling, so machines are frequently damaged, which disrupts the smooth running of production. Another problem addressed in this research is the kernel station environment, which has become inconvenient for operators: machines and equipment that are not used remain in the production area, the floor is slippery and muddy, fibers are scattered, the use of PPE is incomplete, and employee discipline is lacking. The most frequently damaged machine is the cake breaker conveyor at the seed processing (kernel) station. The proposed solution is a maintenance schedule developed with the reliability centered maintenance method, together with the application of 5S. Applying the reliability centered maintenance method yielded four components that must be maintained on a schedule (time directed): 37 days for the bearing, 97 days for the gearbox, 35 days for the CBC pen, and 32 days for the conveyor pedal. The 5S assessment produced proposed improvements to the work environment in line with 5S principles: unused goods are to be moved out of the production area, goods grouped according to their use, a cleaning procedure defined for the production area, inspections conducted on the use of PPE, and 5S slogans posted.

  2. Accident identification system with automatic detection of abnormal condition using quantum computation

    International Nuclear Information System (INIS)

    Nicolau, Andressa dos Santos; Schirru, Roberto; Lima, Alan Miranda Monteiro de

    2011-01-01

    Transient identification systems have been proposed in order to keep the plant operating in safe conditions and to help operators make decisions within the short time interval of an emergency with the maximum associated certainty. This article presents a time-independent system for transient/accident identification in a pressurized water reactor (PWR) that does not rely on an event (a reactor scram, for instance) to serve as the starting point t = 0. The model was developed to recognize the normal condition and three accidents from the design-basis list of the Angra 2 Nuclear Power Plant, postulated in the Final Safety Analysis Report (FSAR). Several sets of process variables were used in order to establish a minimum set of variables considered necessary and sufficient. The optimization step of the identification algorithm is based on the paradigm of quantum computing: the optimization metaheuristic Quantum Inspired Evolutionary Algorithm (QEA) was implemented and works as a data mining tool. The results obtained with the QEA without the time variable are compatible with the techniques in the reference literature for the transient identification problem, with less computational effort (fewer evaluations). The system robustly approximates the ideal solution, the Voronoi vectors, with only one partition for the accident classes. (author)
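
    Once the Voronoi vectors have been optimized (by the QEA in the study), classifying a plant state reduces to assigning it to the nearest prototype. A minimal sketch of that final classification step with hypothetical prototypes and normalized process variables; the quantum-inspired optimization itself is not shown:

        import numpy as np

        def classify_state(x, prototypes, labels):
            """Assign a plant-state vector to the class of the nearest Voronoi prototype.

            x:          1D array of (normalized) process variables.
            prototypes: 2D array, one prototype (Voronoi vector) per row.
            labels:     class label of each prototype (e.g. 'normal', accident types).
            """
            distances = np.linalg.norm(prototypes - x, axis=1)
            return labels[int(np.argmin(distances))]

        # Hypothetical prototypes for normal operation and three design-basis accidents
        prototypes = np.array([[0.50, 0.50, 0.50],
                               [0.90, 0.20, 0.40],
                               [0.10, 0.80, 0.60],
                               [0.30, 0.30, 0.95]])
        labels = ["normal", "accident_1", "accident_2", "accident_3"]
        print(classify_state(np.array([0.52, 0.48, 0.55]), prototypes, labels))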

  3. Reliability of a computer software angle tool for measuring spine and pelvic flexibility during the sit-and-reach test.

    Science.gov (United States)

    Mier, Constance M; Shapiro, Belinda S

    2013-02-01

    The purpose of this study was to determine the reliability of a computer software angle tool that measures thoracic (T), lumbar (L), and pelvic (P) angles as a means of evaluating spine and pelvic flexibility during the sit-and-reach (SR) test. Thirty adults performed the SR twice on separate days. The SR test was captured on video and later analyzed for T, L, and P angles using the computer software angle tool. During the test, 3 markers were placed over T1, T12, and L5 vertebrae to identify T, L, and P angles. Intraclass correlation coefficient (ICC) indicated a very high internal consistency (between trials) for T, L, and P angles (0.95-0.99); thus, the average of trials was used for test-retest (between days) reliability. Mean (±SD) values did not differ between days for T (51.0 ± 14.3 vs. 52.3 ± 16.2°), L (23.9 ± 7.1 vs. 23.0 ± 6.9°), or P (98.4 ± 15.6 vs. 98.3 ± 14.7°) angles. Test-retest reliability (ICC) was high for T (0.96) and P (0.97) angles and moderate for L angle (0.84). Both intrarater and interrater reliabilities were high for T (0.95, 0.94) and P (0.97, 0.97) angles and moderate for L angle (0.87, 0.82). Thus, the computer software angle tool is a highly objective method for assessing spine and pelvic flexibility during a video-captured SR test.

  4. Exploration of the (interrater) reliability and latent factor structure of the Alcohol Use Disorders Identification Test (AUDIT) and the Drug Use Disorders Identification Test (DUDIT) in a sample of Dutch probationers

    NARCIS (Netherlands)

    Noteborn, M.G.C.; Hildebrand, M.

    2015-01-01

    Background: The use of brief, reliable, valid, and practical measures of substance use is critical for conducting individual (risk and need) assessments in probation practice. In this exploratory study, the basic psychometric properties of the Alcohol Use Disorders Identification Test (AUDIT) and

  5. Accuracy and reliability of facial soft tissue depth measurements using cone beam computer tomography

    NARCIS (Netherlands)

    Fourie, Zacharias; Damstra, Janalt; Gerrits, Pieter; Ren, Yijin

    2010-01-01

    It is important to have accurate and reliable measurements of soft tissue thickness for specific landmarks of the face and scalp when producing a facial reconstruction. In the past several methods have been created to measure facial soft tissue thickness (FSTT) in cadavers and in the living. The

  6. Computational intelligence methods for the efficient reliability analysis of complex flood defence structures

    NARCIS (Netherlands)

    Kingston, Greer B.; Rajabali Nejad, Mohammadreza; Gouldby, Ben P.; van Gelder, Pieter H.A.J.M.

    2011-01-01

    With the continual rise of sea levels and deterioration of flood defence structures over time, it is no longer appropriate to define a design level of flood protection, but rather, it is necessary to estimate the reliability of flood defences under varying and uncertain conditions. For complex

  7. UPTF test instrumentation. Measurement system identification, engineering units and computed parameters

    International Nuclear Information System (INIS)

    Sarkar, J.; Liebert, J.; Laeufer, R.

    1992-11-01

    This updated version of the previous report /1/ contains, besides additional instrumentation needed for 2D/3D Programme, the supplementary instrumentation in the inlet plenum of SG simulator and hot and cold leg of broken loop, the cold leg of intact loops and the upper plenum to meet the requirements (Test Phase A) of the UPTF Programme, TRAM, sponsored by the Federal Minister of Research and Technology (BMFT) of the Federal Republic of Germany. For understanding, the derivation and the description of the identification codes for the entire conventional and advanced measurement systems classifying the function, and the equipment unit, key, as adopted in the conventional power plants, have been included. Amendments have also been made to the appendices. In particular, the list of measurement systems covering the measurement identification code, instrument, measured quantity, measuring range, band width, uncertainty and sensor location has been updated and extended to include the supplementary instrumentation. Beyond these amendments, the uncertainties of measurements have been precisely specified. The measurement identification codes which also stand for the identification of the corresponding measured quantities in engineering units and the identification codes derived therefrom for the computed parameters have been adequately detailed. (orig.)

  8. Improvement of level-1 PSA computer code package - Modeling and analysis for dynamic reliability of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hoon; Baek, Sang Yeup; Shin, In Sup; Moon, Shin Myung; Moon, Jae Phil; Koo, Hoon Young; Kim, Ju Shin [Seoul National University, Seoul (Korea, Republic of); Hong, Jung Sik [Seoul National Polytechnology University, Seoul (Korea, Republic of); Lim, Tae Jin [Soongsil University, Seoul (Korea, Republic of)

    1996-08-01

    The objective of this project is to develop a methodology of dynamic reliability analysis for NPPs. The first year's research was focused on developing a procedure for analyzing failure data of running components and a simulator for estimating the reliability of series-parallel structures. The second year's research was concentrated on estimating the lifetime distribution and PM effect of a component from its failure data in various cases, and the lifetime distribution of a system with a particular structure. Computer codes for performing these jobs were also developed. The objective of the third year's research is to develop models for analyzing special failure types (CCFs, standby redundant structures) that were not considered in the first two years, and to complete a methodology of dynamic reliability analysis for nuclear power plants. The analysis of failure data of components and related research for supporting the simulator must come first in order to provide proper input to the simulator. Thus this research is divided into three major parts. 1. Analysis of the time dependent life distribution and the PM effect. 2. Development of a simulator for system reliability analysis. 3. Related research for supporting the simulator: accelerated simulation analytic approach using PH-type distribution, analysis for dynamic repair effects. 154 refs., 5 tabs., 87 figs. (author)

  9. Optimal design methods for a digital human-computer interface based on human reliability in a nuclear power plant

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Zhang, Li; Xie, Tian; Wu, Daqing; Li, Min; Wang, Yiqun; Peng, Yuyuan; Peng, Jie; Zhang, Mengjia; Li, Peiyao; Ma, Congmin; Wu, Xing

    2017-01-01

    Highlights: • A complete optimization process is established for digital human-computer interfaces of NPPs. • A quick convergence search method is proposed. • The authors propose an affinity error probability mapping function to test human reliability. - Abstract: This is the second in a series of papers describing optimal design methods for a digital human-computer interface of a nuclear power plant (NPP) from three different points of view based on human reliability. The purpose of this series is to explore different optimization methods from varying perspectives. The present paper mainly discusses the optimal design method for the quantity of components of the same factor. In the monitoring process, the quantity of components places a heavy burden on operators, and human errors are therefore easily triggered. To solve this problem, the authors propose an optimization process, a quick convergence search method and an affinity error probability mapping function. Two balanceable parameter values of the affinity error probability function are obtained by experiments. The experimental results show that the affinity error probability mapping function for the human-computer interface has very good sensitivity and stability, and that the quick convergence search method for fuzzy segments divided by component quantity performs better than a general algorithm.

  10. Exploration of the (Interrater) Reliability and Latent Factor Structure of the Alcohol Use Disorders Identification Test (AUDIT) and the Drug Use Disorders Identification Test (DUDIT) in a Sample of Dutch Probationers.

    Science.gov (United States)

    Hildebrand, Martin; Noteborn, Mirthe G C

    2015-01-01

    The use of brief, reliable, valid, and practical measures of substance use is critical for conducting individual (risk and need) assessments in probation practice. In this exploratory study, the basic psychometric properties of the Alcohol Use Disorders Identification Test (AUDIT) and the Drug Use Disorders Identification Test (DUDIT) are evaluated. The instruments were administered as an oral interview instead of a self-report questionnaire. The sample comprised 383 offenders (339 men, 44 women). A subset of 56 offenders (49 men, 7 women) participated in the interrater reliability study. Data collection took place between September 2011 and November 2012. Overall, both instruments have acceptable levels of interrater reliability for total scores and acceptable to good interrater reliabilities for most of the individual items. Confirmatory factor analyses (CFA) indicated that the a priori one-, two- and three-factor solutions for the AUDIT did not fit the observed data very well. Principal axis factoring (PAF) supported a two-factor solution for the AUDIT that included a level of alcohol consumption/consequences factor (Factor 1) and a dependence factor (Factor 2), with both factors explaining substantial variance in AUDIT scores. For the DUDIT, CFA and PAF suggest that a one-factor solution is the preferred model (accounting for 62.61% of total variance). The Dutch language versions of the AUDIT and the DUDIT are reliable screening instruments for use with probationers and both instruments can be reliably administered by probation officers in probation practice. However, future research on concurrent and predictive validity is warranted.

  11. Design and implementation of a reliable and cost-effective cloud computing infrastructure: the INFN Napoli experience

    International Nuclear Information System (INIS)

    Capone, V; Esposito, R; Pardi, S; Taurino, F; Tortone, G

    2012-01-01

    Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario a cost effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at INFN-Naples and ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of 2 main subsystems: a clustered storage solution, built on top of disk servers running GlusterFS file system, and a virtual machines execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, providing this way live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.

  12. Design and implementation of a reliable and cost-effective cloud computing infrastructure: the INFN Napoli experience

    Science.gov (United States)

    Capone, V.; Esposito, R.; Pardi, S.; Taurino, F.; Tortone, G.

    2012-12-01

    Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario a cost effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at INFN-Naples and ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of 2 main subsystems: a clustered storage solution, built on top of disk servers running GlusterFS file system, and a virtual machines execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, providing this way live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.

  13. Computerized nipple identification for multiple image analysis in computer-aided diagnosis

    International Nuclear Information System (INIS)

    Zhou Chuan; Chan Heangping; Paramagul, Chintana; Roubidoux, Marilyn A.; Sahiner, Berkman; Hadjiiski, Labomir M.; Petrick, Nicholas

    2004-01-01

    Correlation of information from multiple-view mammograms (e.g., MLO and CC views, bilateral views, or current and prior mammograms) can improve the performance of breast cancer diagnosis by radiologists or by computer. The nipple is a reliable and stable landmark on mammograms for the registration of multiple mammograms. However, accurate identification of nipple location on mammograms is challenging because of the variations in image quality and in the nipple projections, resulting in some nipples being nearly invisible on the mammograms. In this study, we developed a computerized method to automatically identify the nipple location on digitized mammograms. First, the breast boundary was obtained using a gradient-based boundary tracking algorithm, and then the gray level profiles along the inside and outside of the boundary were identified. A geometric convergence analysis was used to limit the nipple search to a region of the breast boundary. A two-stage nipple detection method was developed to identify the nipple location using the gray level information around the nipple, the geometric characteristics of nipple shapes, and the texture features of glandular tissue or ducts which converge toward the nipple. At the first stage, a rule-based method was designed to identify the nipple location by detecting significant changes of intensity along the gray level profiles inside and outside the breast boundary and the changes in the boundary direction. At the second stage, a texture orientation-field analysis was developed to estimate the nipple location based on the convergence of the texture pattern of glandular tissue or ducts towards the nipple. The nipple location was finally determined from the detected nipple candidates by a rule-based confidence analysis. In this study, 377 and 367 randomly selected digitized mammograms were used for training and testing the nipple detection algorithm, respectively. Two experienced radiologists identified the nipple locations

  14. Adult Sex Identification Using Three-Dimensional Computed Tomography (3D-CT) of the Pelvis: A Study Among a Sample of the Egyptian Population

    Directory of Open Access Journals (Sweden)

    Enas M. A. Mostafa

    2016-06-01

    Sex identification of unknown human skeletal remains is of great importance in establishing identity and individuality. In adults, the hip bone is the most reliable sex indicator because of its sexual dimorphism. Each population should have its own specific standards of identification. The objective of this study is to develop a logistic regression formula for adult sex identification using three-dimensional computed tomography (3D-CT) of the pelvis and to perform an assessment of its validity in sex determination among a sample of the Egyptian population in the Suez Canal region. 141 pelvic-abdominal CT images (free of any pelvic orthopaedic disorder) were included; they were reconstructed to produce 3D-CT pelvic images which were divided into a calibration group (47 male and 47 female) and a test group (47 CT images the sex of which was unknown to the observers). Twenty radiometric variables were measured for the calibration group. A logit response formula for sex prediction was developed and applied on the test group for sex prediction. The logit response formula for the test sample showed sensitivity, specificity, and an overall accuracy of 100%. The proposed method represents a quick and reliable metric method in establishing sex from a CT image of the pelvis bone.

  15. Beyond redundancy how geographic redundancy can improve service availability and reliability of computer-based systems

    CERN Document Server

    Bauer, Eric; Eustace, Dan

    2012-01-01

    "While geographic redundancy can obviously be a huge benefit for disaster recovery, it is far less obvious what benefit is feasible and likely for more typical non-catastrophic hardware, software, and human failures. Georedundancy and Service Availability provides both a theoretical and practical treatment of the feasible and likely benefits of geographic redundancy for both service availability and service reliability. The text provides network/system planners, IS/IT operations folks, system architects, system engineers, developers, testers, and other industry practitioners with a general discussion about the capital expense/operating expense tradeoff that frames system redundancy and georedundancy"--

  16. SALP (Sensitivity Analysis by List Processing), a computer assisted technique for binary systems reliability analysis

    International Nuclear Information System (INIS)

    Astolfi, M.; Mancini, G.; Volta, G.; Van Den Muyzenberg, C.L.; Contini, S.; Garribba, S.

    1978-01-01

    A computerized technique which allows the modelling by AND, OR, NOT binary trees, of various complex situations encountered in safety and reliability assessment, is described. By the use of list-processing, numerical and non-numerical types of information are used together. By proper marking of gates and primary events, stand-by systems, common cause failure and multiphase systems can be analyzed. The basic algorithms used in this technique are shown in detail. Application to a stand-by and multiphase system is then illustrated
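
    For the AND/OR/NOT tree modelling described above, the top-event probability of a small tree with independent basic events can be evaluated recursively. A minimal sketch with a hypothetical two-train example; it does not reproduce SALP's list-processing, marking or common-cause features:

        def gate_prob(node, basic):
            """Top-event probability of an AND/OR/NOT tree with independent basic events.

            node:  a basic-event name, or a tuple (gate, [children]) with
                   gate in {"AND", "OR", "NOT"}.
            basic: dict mapping basic-event names to failure probabilities.
            """
            if isinstance(node, str):
                return basic[node]
            gate, children = node
            probs = [gate_prob(child, basic) for child in children]
            if gate == "AND":
                result = 1.0
                for p in probs:
                    result *= p
                return result
            if gate == "OR":
                none_fail = 1.0
                for p in probs:
                    none_fail *= (1.0 - p)
                return 1.0 - none_fail
            if gate == "NOT":
                return 1.0 - probs[0]
            raise ValueError(f"unknown gate: {gate}")

        # Hypothetical 2-train system: top event needs both trains failed,
        # and a train fails if its pump OR its valve fails.
        basic = {"pump_A": 1e-3, "valve_A": 5e-4, "pump_B": 1e-3, "valve_B": 5e-4}
        tree = ("AND", [("OR", ["pump_A", "valve_A"]), ("OR", ["pump_B", "valve_B"])])
        print(gate_prob(tree, basic))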

  17. Sigma: computer vision in the service of safety and reliability in the inspection services; Sigma: la vision computacional al servicio de la seguridad y fiabilidad en los servicios de inspeccion

    Energy Technology Data Exchange (ETDEWEB)

    Pineiro, P. J.; Mendez, M.; Garcia, A.; Cabrera, E.; Regidor, J. J.

    2012-11-01

    Computer vision has grown very fast in the last decade, with very efficient tools and algorithms. This allows the development of new applications in the nuclear field, providing more efficient equipment and tasks: redundant systems, vision-guided mobile robots, automated visual defect recognition, measurement, etc. In this paper, Tecnatom describes a detailed example of a computer vision application developed to provide secure, redundant identification of the thousands of tubes in a power plant steam generator. Some other on-going or planned computer vision projects by Tecnatom are also introduced. New possibilities of application appear in the inspection systems for nuclear components, where the main objective is to maximize their reliability. (Author) 6 refs.

  18. The reliability of computer analysis of ultrasonographic prostate images: the influence of inconsistent histopathology

    NARCIS (Netherlands)

    Giesen, R. J.; Huynen, A. L.; de la Rosette, J. J.; Schaafsma, H. E.; van Iersel, M. P.; Aarnink, R. G.; Debruyne, F. M.; Wijkstra, H.

    1994-01-01

    This article describes a method to investigate the influence of inconsistent histopathology during the development of tissue discrimination algorithms. Review of the pathology is performed on the biopsies used as training set of a computer system for cancer detection in ultrasonographic prostate

  19. Computational approaches to standard-compliant biofilm data for reliable analysis and integration

    Directory of Open Access Journals (Sweden)

    Sousa Ana Margarida

    2012-12-01

    The study of microorganism consortia, also known as biofilms, is associated with a number of applications in the biotechnology, ecotechnology and clinical domains. Nowadays, biofilm studies are heterogeneous and data-intensive, encompassing different levels of analysis. Computational modelling of biofilm studies has thus become a requirement to make sense of these vast and ever-expanding biofilm data volumes.

  20. Interobserver reliability of coronoid fracture classification: two-dimensional versus three-dimensional computed tomography

    NARCIS (Netherlands)

    Lindenhovius, Anneluuk; Karanicolas, Paul Jack; Bhandari, Mohit; van Dijk, Niek; Ring, David; Allan, Christopher; Anglen, Jeffrey; Axelrod, Terry; Baratz, Mark; Beingessner, Daphne; Brink, Peter; Cassidy, Charles; Coles, Chad; Conflitti, Joe; Crist, Brett; Della Rocca, Gregory; Dijkstra, Sander; Elmans, L. H. G. J.; Feibel, Roger; Flores, Luis; Frihagen, Frede; Gosens, Taco; Goslings, J. C.; Greenberg, Jeffrey; Grosso, Elena; Harness, Neil; van der Heide, Huub; Jeray, Kyle; Kalainov, David; van Kampen, Albert; Kawamura, Sumito; Kloen, Peter; McKee, Michael; Nork, Sean; Page, Richard; Pesantez, Rodrigo; Peters, Anil; Poolman, Rudolf; Prayson, Michael; Richardson, Martin; Seiler, John; Swiontkowski, Marc; Thomas, George; Trumble, Tom; van Vugt, Arie; Wright, Thomas; Zalavras, Charalampos; Zura, Robert

    2009-01-01

    This study tests the hypothesis that 3-dimensional computed tomography (CT) reconstructions improve interobserver agreement on classification and treatment of coronoid fractures compared with 2-dimensional CT. A total of 29 orthopedic surgeons evaluated 10 coronoid fractures on 2 occasions (first

  1. Improving the capacity, reliability & life of mobile devices with cloud computing

    CSIR Research Space (South Africa)

    Nkosi, MT

    2011-05-01

    Full Text Available devices. The approach in this paper is to model the mobile cloud computing process in a 3GPP IMS software development and emulator environment. And show that multimedia and security operations can be performed in the cloud, allowing mobile service...

  2. Reliability and availability of redundant systems: Computational program and the use of nomograms

    International Nuclear Information System (INIS)

    Signoret, J.P.

    1975-01-01

    A rigorous mathematical approach to determining the reliability and availability of repairable actively redundant systems - (r/m) systems - is considered for the case where the m units comprising the system are identical and the failure and repair rates, lambda and μ respectively, are constant. The method used involves the Markov processes, operator calculus and matrix calculus. All the results of the study are handled by the FIDIAS program, which is a practical tool for calculating with a high degree of precision the reliability or availability of such (r/m) systems whatever the values of m and r. In the FIDIAS-TC version of FIDIAS it is possible to plot curves with a Benson plotter, so that nomograms are produced for rapid and simple determination of the probabilities of failure or non-availability of the (r/m) systems considered. The practical application of nomograms is of interest because (2/3) and (2/4) actively redundant systems are very often used in the control circuits of power reactors. It is shown how easily one can compare these two systems using nomograms and how one can determine lambda or μ as a function of the anticipated result
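
    For the simpler non-repairable special case (repair rate μ = 0), the reliability of an actively redundant (r/m) system of identical units with constant failure rate λ reduces to a binomial sum over the unit survival probability; the repairable systems handled by FIDIAS require the Markov treatment described above. A minimal sketch of the non-repairable formula:

        from math import comb, exp

        def r_out_of_m_reliability(r, m, lam, t):
            """Reliability of an r-out-of-m active redundancy with non-repairable,
            identical units, constant failure rate lam (per hour), mission time t (hours).

            The system works as long as at least r of the m units work.
            (Repairable systems, as treated by FIDIAS, need a Markov model instead.)
            """
            p = exp(-lam * t)  # survival probability of a single unit over the mission
            return sum(comb(m, i) * p**i * (1 - p)**(m - i) for i in range(r, m + 1))

        # Example: compare (2/3) and (2/4) logics, lambda = 1e-4 per hour, t = 1000 hours
        for r, m in [(2, 3), (2, 4)]:
            print(f"R({r}/{m}) = {r_out_of_m_reliability(r, m, 1e-4, 1000):.6f}")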

  3. User's manual of SECOM2: a computer code for seismic system reliability analysis

    International Nuclear Information System (INIS)

    Uchiyama, Tomoaki; Oikawa, Tetsukuni; Kondo, Masaaki; Tamura, Kazuo

    2002-03-01

    This report is the user's manual of seismic system reliability analysis code SECOM2 (Seismic Core Melt Frequency Evaluation Code Ver.2) developed at the Japan Atomic Energy Research Institute for systems reliability analysis, which is one of the tasks of seismic probabilistic safety assessment (PSA) of nuclear power plants (NPPs). The SECOM2 code has many functions such as: Calculation of component failure probabilities based on the response factor method, Extraction of minimal cut sets (MCSs), Calculation of conditional system failure probabilities for given seismic motion levels at the site of an NPP, Calculation of accident sequence frequencies and the core damage frequency (CDF) with use of the seismic hazard curve, Importance analysis using various indicators, Uncertainty analysis, Calculation of the CDF taking into account the effect of the correlations of responses and capacities of components, and Efficient sensitivity analysis by changing parameters on responses and capacities of components. These analyses require the fault tree (FT) representing the occurrence condition of the system failures and core damage, information about response and capacity of components and seismic hazard curve for the NPP site as inputs. This report presents the models and methods applied in the SECOM2 code and how to use those functions. (author)

  4. Translation, Validation, and Reliability of the Dutch Late-Life Function and Disability Instrument Computer Adaptive Test.

    Science.gov (United States)

    Arensman, Remco M; Pisters, Martijn F; de Man-van Ginkel, Janneke M; Schuurmans, Marieke J; Jette, Alan M; de Bie, Rob A

    2016-09-01

    Adequate and user-friendly instruments for assessing physical function and disability in older adults are vital for estimating and predicting health care needs in clinical practice. The Late-Life Function and Disability Instrument Computer Adaptive Test (LLFDI-CAT) is a promising instrument for assessing physical function and disability in gerontology research and clinical practice. The aims of this study were: (1) to translate the LLFDI-CAT to the Dutch language and (2) to investigate its validity and reliability in a sample of older adults who spoke Dutch and dwelled in the community. For the assessment of validity of the LLFDI-CAT, a cross-sectional design was used. To assess reliability, measurement of the LLFDI-CAT was repeated in the same sample. The item bank of the LLFDI-CAT was translated with a forward-backward procedure. A sample of 54 older adults completed the LLFDI-CAT, World Health Organization Disability Assessment Schedule 2.0, RAND 36-Item Short-Form Health Survey physical functioning scale (10 items), and 10-Meter Walk Test. The LLFDI-CAT was repeated in 2 to 8 days (mean=4.5 days). Pearson's r and the intraclass correlation coefficient (ICC) (2,1) were calculated to assess validity, group-level reliability, and participant-level reliability. A correlation of .74 for the LLFDI-CAT function scale and the RAND 36-Item Short-Form Health Survey physical functioning scale (10 items) was found. The correlations of the LLFDI-CAT disability scale with the World Health Organization Disability Assessment Schedule 2.0 and the 10-Meter Walk Test were -.57 and -.53, respectively. The ICC (2,1) of the LLFDI-CAT function scale was .84, with a group-level reliability score of .85. The ICC (2,1) of the LLFDI-CAT disability scale was .76, with a group-level reliability score of .81. The high percentage of women in the study and the exclusion of older adults with recent joint replacement or hospitalization limit the generalizability of the results. The Dutch LLFDI
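
    The ICC (2,1) values reported above correspond to a two-way random-effects, absolute-agreement, single-measurement model. A minimal sketch of its computation from a subjects-by-measurements table, using hypothetical test-retest scores rather than the study data:

        import numpy as np

        def icc_2_1(data):
            """ICC(2,1): two-way random effects, absolute agreement, single measurement.

            data: 2D array (n subjects x k repeated measurements or raters).
            """
            data = np.asarray(data, dtype=float)
            n, k = data.shape
            grand = data.mean()
            row_means = data.mean(axis=1)   # per-subject means
            col_means = data.mean(axis=0)   # per-measurement means
            msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
            msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between measurements
            sse = np.sum((data - row_means[:, None] - col_means[None, :] + grand) ** 2)
            mse = sse / ((n - 1) * (k - 1))                        # residual
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        # Hypothetical test-retest scores for 6 participants measured on 2 occasions
        scores = [[55, 57], [48, 50], [62, 60], [40, 44], [51, 52], [58, 57]]
        print(round(icc_2_1(scores), 2))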

  5. Reliability and reproducibility analysis of the Cobb angle and assessing sagittal plane by computer-assisted and manual measurement tools.

    Science.gov (United States)

    Wu, Weifei; Liang, Jie; Du, Yuanli; Tan, Xiaoyi; Xiang, Xuanping; Wang, Wanhong; Ru, Neng; Le, Jinbo

    2014-02-06

    Although many studies on reliability and reproducibility of measurement have been performed on coronal Cobb angle, few results about reliability and reproducibility are reported on sagittal alignment measurement including the pelvis. We usually use SurgimapSpine software to measure the Cobb angle in our studies; however, there are no reports till date on its reliability and reproducible measurements. Sixty-eight standard standing posteroanterior whole-spine radiographs were reviewed. Three examiners carried out the measurements independently under the settings of manual measurement on X-ray radiographies and SurgimapSpine software on the computer. Parameters measured included pelvic incidence, sacral slope, pelvic tilt, Lumbar lordosis (LL), thoracic kyphosis, and coronal Cobb angle. SPSS 16.0 software was used for statistical analyses. The means, standard deviations, intraclass and interclass correlation coefficient (ICC), and 95% confidence intervals (CI) were calculated. There was no notable difference between the two tools (P = 0.21) for the coronal Cobb angle. In the sagittal plane parameters, the ICC of intraobserver reliability for the manual measures varied from 0.65 (T2-T5 angle) to 0.95 (LL angle). Further, for SurgimapSpine tool, the ICC ranged from 0.75 to 0.98. No significant difference in intraobserver reliability was found between the two measurements (P > 0.05). As for the interobserver reliability, measurements with SurgimapSpine tool had better ICC (0.71 to 0.98 vs 0.59 to 0.96) and Pearson's coefficient (0.76 to 0.99 vs 0.60 to 0.97). The reliability of SurgimapSpine measures was significantly higher in all parameters except for the coronal Cobb angle where the difference was not significant (P > 0.05). Although the differences between the two methods are very small, the results of this study indicate that the SurgimapSpine measurement is an equivalent measuring tool to the traditional manual in coronal Cobb angle, but is advantageous in spino

  6. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved

  7. A computable phenotype for asthma case identification in adult and pediatric patients: External validation in the Chicago Area Patient-Outcomes Research Network (CAPriCORN).

    Science.gov (United States)

    Afshar, Majid; Press, Valerie G; Robison, Rachel G; Kho, Abel N; Bandi, Sindhura; Biswas, Ashvini; Avila, Pedro C; Kumar, Harsha Vardhan Madan; Yu, Byung; Naureckas, Edward T; Nyenhuis, Sharmilee M; Codispoti, Christopher D

    2017-10-13

    Comprehensive, rapid, and accurate identification of patients with asthma for clinical care and engagement in research efforts is needed. The original development and validation of a computable phenotype for asthma case identification occurred at a single institution in Chicago and demonstrated excellent test characteristics. However, its application in a diverse payer mix, across different health systems and multiple electronic health record vendors, and in both children and adults was not examined. The objective of this study is to externally validate the computable phenotype across diverse Chicago institutions to accurately identify pediatric and adult patients with asthma. A cohort of 900 asthma and control patients was identified from the electronic health record between January 1, 2012 and November 30, 2014. Two physicians at each site independently reviewed the patient chart to annotate cases. The inter-observer reliability between the physician reviewers had a κ-coefficient of 0.95 (95% CI 0.93-0.97). The accuracy, sensitivity, specificity, negative predictive value, and positive predictive value of the computable phenotype were all above 94% in the full cohort. The excellent positive and negative predictive values in this multi-center external validation study establish a useful tool to identify asthma cases in the electronic health record for research and care. This computable phenotype could be used in large-scale comparative-effectiveness trials.
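
    The test characteristics reported above come from a 2x2 comparison of the computable phenotype against physician chart review. A minimal sketch of those metrics, using hypothetical counts rather than the study's data:

        def phenotype_metrics(tp, fp, fn, tn):
            """Accuracy, sensitivity, specificity, PPV and NPV from a 2x2 confusion
            matrix comparing an automated case definition with chart review."""
            return {
                "accuracy":    (tp + tn) / (tp + fp + fn + tn),
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv":         tp / (tp + fp),
                "npv":         tn / (tn + fn),
            }

        # Hypothetical counts (not the study's data)
        print(phenotype_metrics(tp=430, fp=12, fn=20, tn=438))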

  8. Validity and Reliability of Orthodontic Loops between Mechanical Testing and Computer Simulation: A Finite Element Method Study

    Directory of Open Access Journals (Sweden)

    Gaurav Sepolia

    2014-01-01

    The magnitude and direction of orthodontic force are among the essential concerns of orthodontic tooth movement. Excessive force may cause root resorption and mobility of the tooth, whereas a low force level may result in prolonged treatment. The addition of loops allows the clinician to achieve the desired results more accurately. Aims and objectives: The purpose of the study was to evaluate the validity and reliability of orthodontic loops between mechanical testing and computer simulation. Materials and methods: Different types of loops were taken and divided into four groups: the Teardrop loop, Opus loop, L loop and T loop. These were artificially activated for multiple lengths and studied using the FEM. Results: The Teardrop loop showed the highest force level, and there was no significant difference between mechanical testing and computer simulation.

  9. Computational Acoustic Beamforming for Noise Source Identification for Small Wind Turbines.

    Science.gov (United States)

    Ma, Ping; Lien, Fue-Sang; Yee, Eugene

    2017-01-01

    This paper develops a computational acoustic beamforming (CAB) methodology for identification of sources of small wind turbine noise. This methodology is validated using the case of the NACA 0012 airfoil trailing edge noise. For this validation case, the predicted acoustic maps were in excellent conformance with the results of the measurements obtained from the acoustic beamforming experiment. Following this validation study, the CAB methodology was applied to the identification of noise sources generated by a commercial small wind turbine. The simulated acoustic maps revealed that the blade tower interaction and the wind turbine nacelle were the two primary mechanisms for sound generation for this small wind turbine at frequencies between 100 and 630 Hz.
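
    At the core of computing an acoustic source map is a conventional (delay-and-sum) beamformer that steers the microphone array to each scan-grid point and evaluates the output power from the cross-spectral matrix of the microphone signals. The sketch below is a generic frequency-domain version with a simulated monopole source; the array geometry, grid and frequency are hypothetical, and it does not reproduce the authors' CAB pipeline:

        import numpy as np

        C = 343.0  # speed of sound (m/s)

        def conventional_beamform(csm, mic_pos, grid_pos, freq):
            """Conventional frequency-domain beamforming power map.

            csm:      (M x M) cross-spectral matrix of the microphone signals at freq.
            mic_pos:  (M x 3) microphone coordinates.
            grid_pos: (G x 3) scan-grid coordinates.
            Returns the beamformer output power at each grid point.
            """
            powers = np.empty(len(grid_pos))
            for g, point in enumerate(grid_pos):
                dist = np.linalg.norm(mic_pos - point, axis=1)
                steer = np.exp(-2j * np.pi * freq * dist / C)  # phase delays to the grid point
                w = steer / len(mic_pos)
                powers[g] = np.real(w.conj() @ csm @ w)
            return powers

        # Hypothetical example: monopole source at (0.5, 0, 1.5) m, 7-mic line array, 500 Hz
        mics = np.column_stack([np.linspace(-0.3, 0.3, 7), np.zeros(7), np.zeros(7)])
        src = np.array([0.5, 0.0, 1.5])
        d = np.linalg.norm(mics - src, axis=1)
        p = np.exp(-2j * np.pi * 500.0 * d / C) / d   # same phase convention as the steering vector
        csm = np.outer(p, p.conj())
        grid = np.column_stack([np.linspace(-1, 1, 21), np.zeros(21), np.full(21, 1.5)])
        powers = conventional_beamform(csm, mics, grid, 500.0)
        print(grid[np.argmax(powers), 0])  # the acoustic map peaks at the source x-position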

  10. Learning Support Assessment Study of a Computer Simulation for the Development of Microbial Identification Strategies

    Directory of Open Access Journals (Sweden)

    Tristan E. Johnson

    2009-12-01

    This paper describes a study that examined how microbiology students construct knowledge of bacterial identification while using a computer simulation. The purpose of this study was to understand how the simulation affects the cognitive processing of students during thinking, problem solving, and learning about bacterial identification and to determine how the simulation facilitates the learning of a domain-specific problem-solving strategy. As part of an upper-division microbiology course, five students participated in several simulation assignments. The data were collected using think-aloud protocol and video action logs as the students used the simulation. The analysis revealed two major themes that determined the performance of the students: Simulation Usage—how the students used the software features and Problem-Solving Strategy Development—the strategy level students started with and the skill level they achieved when they completed their use of the simulation. Several conclusions emerged from the analysis of the data: (i) The simulation affects various aspects of cognitive processing by creating an environment that makes it possible to practice the application of a problem-solving strategy. The simulation was used as an environment that allowed students to practice the cognitive skills required to solve an unknown. (ii) Identibacter (the computer simulation) may be considered to be a cognitive tool to facilitate the learning of a bacterial identification problem-solving strategy. (iii) The simulation characteristics did support student learning of a problem-solving strategy. (iv) Students demonstrated problem-solving strategy development specific to bacterial identification. (v) Participants demonstrated an improved performance from their repeated use of the simulation.

  11. Reliability of video-based identification of footstrike pattern and video time frame at initial contact in recreational runners

    DEFF Research Database (Denmark)

    Damsted, Camma; Larsen, L H; Nielsen, R.O.

    2015-01-01

    and video time frame at initial contact during treadmill running using two-dimensional (2D) video recordings. METHODS: Thirty-one recreational runners were recorded twice, 1 week apart, with a high-speed video camera. Two blinded raters evaluated each video twice with an interval of at least 14 days....... RESULTS: Kappa values for within-day identification of footstrike pattern revealed intra-rater agreement of 0.83-0.88 and inter-rater agreement of 0.50-0.63. Corresponding figures for between-day identification of footstrike pattern were 0.63-0.69 and 0.41-0.53, respectively. Identification of video time...... in 36% of the identifications (kappa=0.41). The 95% limits of agreement for identification of video time frame at initial contact may, at times, allow for different identification of footstrike pattern. Clinicians should, therefore, be encouraged to continue using clinical 2D video setups for intra...

  12. Accuracy and reliability of linear cephalometric measurements from cone-beam computed tomography scans of a dry human skull.

    Science.gov (United States)

    Berco, Mauricio; Rigali, Paul H; Miner, R Matthew; DeLuca, Stephelynn; Anderson, Nina K; Will, Leslie A

    2009-07-01

    The purpose of this study was to determine the accuracy and reliability of 3-dimensional craniofacial measurements obtained from cone-beam computed tomography (CBCT) scans of a dry human skull. Seventeen landmarks were identified on the skull. CBCT scans were then obtained, with 2 skull orientations during scanning. Twenty-nine interlandmark linear measurements were made directly on the skull and compared with the same measurements made on the CBCT scans. All measurements were made by 2 operators on 4 separate occasions. The method errors were 0.19, 0.21, and 0.19 mm in the x-, y- and z-axes, respectively. Repeated measures analysis of variance (ANOVA) showed no significant intraoperator or interoperator differences. The mean measurement error was -0.01 mm (SD, 0.129 mm). Five measurement errors were found to be statistically significantly different; however, all measurement errors were below the known voxel size and clinically insignificant. No differences were found in the measurements from the 2 CBCT scan orientations of the skull. CBCT allows for clinically accurate and reliable 3-dimensional linear measurements of the craniofacial complex. Moreover, skull orientation during CBCT scanning does not affect the accuracy or the reliability of these measurements.

  13. Accuracy and Reliability of Cone-Beam Computed Tomography for Linear and Volumetric Mandibular Condyle Measurements. A Human Cadaver Study.

    Science.gov (United States)

    García-Sanz, Verónica; Bellot-Arcís, Carlos; Hernández, Virginia; Serrano-Sánchez, Pedro; Guarinos, Juan; Paredes-Gallardo, Vanessa

    2017-09-20

    The accuracy of Cone-Beam Computed Tomography (CBCT) on linear and volumetric measurements on condyles has only been assessed on dry skulls. The aim of this study was to evaluate the reliability and accuracy of linear and volumetric measurements of mandibular condyles in the presence of soft tissues using CBCT. Six embalmed cadaver heads were used. CBCT scans were taken, followed by the extraction of the condyles. The water displacement technique was used to calculate the volumes of the condyles and three linear measurements were made using a digital caliper, these measurements serving as the gold standard. Surface models of the condyles were obtained using a 3D scanner, and superimposed onto the CBCT images. Condyles were isolated on the CBCT render volume using the surface models as reference and volumes were measured. Linear measurements were made on CBCT slices. The CBCT method was found to be reliable for both volumetric and linear measurements (CV  0.90). Highly accurate values were obtained for the three linear measurements and volume. CBCT is a reliable and accurate method for taking volumetric and linear measurements on mandibular condyles in the presence of soft tissue, and so a valid tool for clinical diagnosis.

  14. Computing interval-valued reliability measures: application of optimal control methods

    DEFF Research Database (Denmark)

    Kozin, Igor; Krymsky, Victor

    2017-01-01

    The paper describes an approach to deriving interval-valued reliability measures given partial statistical information on the occurrence of failures. We apply methods of optimal control theory, in particular, Pontryagin’s principle of maximum to solve the non-linear optimisation problem and derive...... the probabilistic interval-valued quantities of interest. It is proven that the optimisation problem can be translated into another problem statement that can be solved on the class of piecewise continuous probability density functions (pdfs). This class often consists of piecewise exponential pdfs which appear...... as soon as among the constraints there are bounds on a failure rate of a component under consideration. Finding the number of switching points of the piecewise continuous pdfs and their values becomes the focus of the approach described in the paper. Examples are provided....

  15. Computing interval-valued statistical characteristics: What is the stumbling block for reliability applications?

    DEFF Research Database (Denmark)

    Kozine, Igor; Krymsky, V.G.

    2009-01-01

    The application of interval-valued statistical models is often hindered by the rapid growth in imprecision that occurs when intervals are propagated through models. Is this deficiency inherent in the models? If so, what is the underlying cause of imprecision in mathematical terms? What kind...... of additional information can be incorporated to make the bounds tighter? The present paper gives an account of the source of this imprecision that prevents interval-valued statistical models from being widely applied. Firstly, the mathematical approach to building interval-valued models (discrete...... and continuous) is delineated. Secondly, a degree of imprecision is demonstrated on some simple reliability models. Thirdly, the root mathematical cause of sizeable imprecision is elucidated and, finally, a method of making the intervals tighter is described. A number of examples are given throughout the paper....
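
    A small, hypothetical Python sketch of the imprecision growth discussed above: naive interval propagation of component reliabilities through a series system makes the output interval rapidly wider as the number of components grows.

      import numpy as np

      # Hypothetical interval-valued reliabilities of identical components in series.
      p_lo, p_hi = 0.95, 0.99

      for n in (1, 5, 10, 20):
          # Naive interval propagation for a series system: multiply the bounds.
          sys_lo, sys_hi = p_lo ** n, p_hi ** n
          width = sys_hi - sys_lo
          print("n = %2d   system reliability in [%.4f, %.4f]   width = %.4f"
                % (n, sys_lo, sys_hi, width))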

  16. Statistical test data selection for reliability evaluation of process computer software
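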

    International Nuclear Information System (INIS)

    Volkmann, K.P.; Hoermann, H.; Ehrenberger, W.

    1976-01-01

    The paper presents a concept for converting knowledge about the characteristics of process states into practicable procedures for the statistical selection of test cases in testing process computer software. Process states are defined as vectors whose components consist of values of input variables lying in discrete positions or within given limits. Two approaches for test data selection, based on knowledge about cases of demand, are outlined referring to a purely probabilistic method and to the mathematics of stratified sampling. (orig.) [de
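
    A hypothetical sketch of stratified test-case selection of the kind outlined above: process states are grouped into demand classes (strata), and test vectors are drawn from each stratum in proportion to its estimated frequency of occurrence. The strata, variable ranges and weights below are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical strata of process states (demand classes) with their relative
      # frequencies of occurrence and value ranges for two input variables.
      strata = {
          "normal operation": {"weight": 0.80, "temp": (280, 300), "pressure": (60, 70)},
          "load following":   {"weight": 0.15, "temp": (260, 280), "pressure": (50, 60)},
          "rare transients":  {"weight": 0.05, "temp": (240, 260), "pressure": (40, 50)},
      }

      n_tests = 200
      for name, s in strata.items():
          # Proportional allocation: sample size proportional to the stratum weight.
          n_k = int(round(n_tests * s["weight"]))
          temps = rng.uniform(*s["temp"], size=n_k)
          pressures = rng.uniform(*s["pressure"], size=n_k)
          print("%-18s %3d test vectors, e.g. (T=%.1f, p=%.1f)"
                % (name, n_k, temps[0], pressures[0]))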

  17. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    International Nuclear Information System (INIS)

    Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie; Mandelli, Diego; Smith, Curtis Lee

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS 'pathways,' or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  18. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Herberger, Sarah Elizabeth Marie [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  19. Impact of PECS tablet computer app on receptive identification of pictures given a verbal stimulus.

    Science.gov (United States)

    Ganz, Jennifer B; Hong, Ee Rea; Goodwyn, Fara; Kite, Elizabeth; Gilliland, Whitney

    2015-04-01

    The purpose of this brief report was to determine the effect on receptive identification of photos of a tablet computer-based augmentative and alternative communication (AAC) system with voice output. A multiple baseline single-case experimental design across vocabulary words was implemented. One participant, a preschool-aged boy with autism and little intelligible verbal language, was included in the study. Although a functional relation between the intervention and the dependent variable was not established, the intervention did appear to result in mild improvement for two of the three vocabulary words selected. The authors recommend further investigations of the collateral impacts of AAC on skills other than expressive language.

  20. Environmental risk assessment of biocidal products: identification of relevant components and reliability of a component-based mixture assessment.

    Science.gov (United States)

    Coors, Anja; Vollmar, Pia; Heim, Jennifer; Sacher, Frank; Kehrer, Anja

    2018-01-01

    study developed criteria for the identification of CBA-relevant components in a biocidal product. These criteria are based on existing criteria stated in the regulation for classification, labelling and packaging of substances. The CBA was found sufficiently protective and reliable for the tested products when applying the here recommended criteria. The lack of available aquatic toxicity data for some of the identified relevant components was the main reason for underestimation of product toxicity.

  1. Fast and reliable method for computing free-bound emission coefficients for hydrogenic ions

    Energy Technology Data Exchange (ETDEWEB)

    Sarmiento, A; Canto, J

    1985-12-01

    An approximate formula for the computation of the free-bound emission coefficient for hydrogenic ions is presented. The approximation is obtained through a manipulation of the (free-bound) Gaunt factor which intentionally distinguishes the dependence on frequency from the dependence on temperature and ionic composition. Numerical tests indicate that the derived formula is very precise, fast and easy to use, making the calculation of the free-bound contribution from an ionized region of varying temperature and ionic composition a very simple and time-saving task.

  2. A fast and reliable method for computing free-bound emission coefficients for hydrogenic ions

    International Nuclear Information System (INIS)

    Sarmiento, A.; Canto, J.

    1985-01-01

    An approximate formula for the computation of the free-bound emission coefficient for hydrogenic ions is presented. The approximation is obtained through a manipulation of the (free-bound) Gaunt factor which intentionally distinguishes the dependence on frequency from the dependence on temperature and ionic composition. Numerical tests indicate that the derived formula is very precise, fast and easy to use, making the calculation of the free-bound contribution from an ionized region of varying temperature and ionic composition a very simple and time-saving task. (author)

  3. Logistic regression model for identification of right ventricular dysfunction in patients with acute pulmonary embolism by means of computed tomography

    International Nuclear Information System (INIS)

    Staskiewicz, Grzegorz; Czekajska-Chehab, Elżbieta; Uhlig, Sebastian; Przegalinski, Jerzy; Maciejewski, Ryszard; Drop, Andrzej

    2013-01-01

    Purpose: Diagnosis of right ventricular dysfunction in patients with acute pulmonary embolism (PE) is known to be associated with increased risk of mortality. The aim of the study was to calculate a logistic regression model for reliable identification of right ventricular dysfunction (RVD) in patients diagnosed with computed tomography pulmonary angiography. Material and methods: Ninety-seven consecutive patients with acute pulmonary embolism were divided into groups with and without RVD based upon echocardiographic measurement of pulmonary artery systolic pressure (PASP). PE severity was graded with the pulmonary obstruction score. CT measurements of heart chambers and mediastinal vessels were performed; position of interventricular septum and presence of contrast reflux into the inferior vena cava were also recorded. The logistic regression model was prepared by means of stepwise logistic regression. Results: Among the used parameters, the final model consisted of pulmonary obstruction score, short axis diameter of right ventricle and diameter of inferior vena cava. The calculated model is characterized by 79% sensitivity and 81% specificity, and its performance was significantly better than that of single CT-based measurements. Conclusion: The logistic regression model identifies RVD significantly better than single CT-based measurements.
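
    As an illustration only (the study's actual data are not reproduced), the following Python sketch fits a logistic regression on three synthetic predictors standing in for the obstruction score, right-ventricle short-axis diameter and inferior vena cava diameter, and derives sensitivity and specificity from the confusion matrix.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import confusion_matrix

      rng = np.random.default_rng(1)
      n = 97

      # Synthetic stand-ins for the three retained predictors.
      rvd = rng.integers(0, 2, size=n)                  # 1 = right ventricular dysfunction
      X = np.column_stack([
          rng.normal(40 + 15 * rvd, 10),                # pulmonary obstruction score
          rng.normal(35 + 6 * rvd, 4),                  # RV short-axis diameter (mm)
          rng.normal(20 + 3 * rvd, 3),                  # IVC diameter (mm)
      ])

      model = LogisticRegression(max_iter=1000).fit(X, rvd)
      pred = model.predict(X)

      tn, fp, fn, tp = confusion_matrix(rvd, pred).ravel()
      print("sensitivity = %.2f, specificity = %.2f" % (tp / (tp + fn), tn / (tn + fp)))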

  4. A computational model for reliability calculation of steam generators from defects in their tubes

    International Nuclear Information System (INIS)

    Rivero, Paulo C.M.; Melo, P.F. Frutuoso e

    2000-01-01

    Nowadays, probability approaches are employed for calculating the reliability of steam generators as a function of defects in their tubes without any deterministic association with warranty assurance. Unfortunately, probability models produce large failure values, as opposed to the recommendation of the U.S. Code of Federal Regulations, that is, failure probabilities must be as small as possible. In this paper, we propose the association of the deterministic methodology with the probabilistic one. At first, the failure probability evaluation of steam generators follows a probabilistic methodology: to find the failure probability, critical cracks - obtained from Monte Carlo simulations - are limited to have lengths in the interval defined by their lower value and the plugging limit, so as to obtain a failure probability of at most 1%. The distribution employed for modeling the observed (measured) cracks considers the same interval. Any length outside the mentioned interval is not considered for the probability evaluation: it is approached by the deterministic model. The deterministic approach is to plug the tube when any anomalous crack is detected in it. Such a crack is an observed one placed in the third region on the plot of the logarithmic time derivative of crack lengths versus the mode I stress intensity factor, while for normal cracks the plugging of tubes occurs in the second region of that plot - if they are dangerous, of course, considering their random evolution. A methodology for identifying anomalous cracks is also presented. (author)
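
    A hypothetical Monte Carlo sketch in Python of the kind of failure-probability evaluation described above: crack lengths and critical lengths are sampled from assumed distributions, failure is counted when a crack reaches its critical length, and cracks beyond a plugging limit are screened out for deterministic treatment. All distributions and limits are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(42)
      n_sim = 1_000_000

      # Hypothetical distributions (mm): observed crack lengths in a tube and the
      # critical crack length at which rupture is assumed to occur.
      crack = rng.lognormal(mean=np.log(4.0), sigma=0.35, size=n_sim)
      critical = rng.normal(loc=12.0, scale=1.5, size=n_sim)

      # A tube "fails" in a trial when the sampled crack exceeds the sampled critical length.
      p_fail = np.mean(crack >= critical)
      print("estimated failure probability: %.2e" % p_fail)

      # Cracks longer than a plugging limit would be handled deterministically (tube plugged),
      # so they can be excluded from the probabilistic evaluation.
      plug_limit = 8.0
      p_fail_screened = np.mean((crack >= critical) & (crack < plug_limit))
      print("failure probability restricted to cracks below the plugging limit: %.2e"
            % p_fail_screened)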

  5. Computed tomography and angiography do not reliably discriminate malignant meningiomas from benign ones

    International Nuclear Information System (INIS)

    Servo, A.; Porras, M.; Jaeaeskelaeinen, J.; Paetau, A.; Haltia, M.

    1990-01-01

    Histological anaplasia, found in up to 10% of meningiomas, is an important prognostic sign as it is associated with increased recurrence rate and volume growth rate. We studied in retrospect a series of 230 primary intracranial meningiomas to discover whether histological anaplasia can be reliably foreseen in CT scans and angiograms. 205 meningiomas were histologically benign, and 25 meningiomas were classified as malignant (atypical or anaplastic), with either incipient (20) or overt (5) signs of anaplasia. Of ten CT parameters tested, three were associated significantly more often with malignant meningiomas: Nodular contour (58.3% vs 26.7%); cysts (20.0% vs 4.4%) and absence of calcifications (92% vs 65.3%); none of these parameters was an absolute sign of anaplasia. 'Mushrooming', previously regarded as a definite sign of malignancy, was seen in 9% of benign meningiomas and in 21% of malignant ones. In angiography, no apparent differences between benign and malignant meningiomas were seen. The conclusion is that it is not possible to distinguish malignant meningiomas from benign ones with CT or angiography. (orig.)

  6. CERPI and CEREL, two computer codes for the automatic identification and determination of gamma emitters in thermal neutron activated samples

    International Nuclear Information System (INIS)

    Giannini, M.; Oliva, P.R.; Ramorino, C.

    1978-01-01

    A description is given of a computer code which automatically analyses gamma-ray spectra obtained with Ge(Li) detectors. The program contains features such as automatic peak location and fitting, determination of peak energies and intensities, nuclide identification and calculation of masses and errors. Finally, the results obtained with our computer code for a lunar sample are reported and briefly discussed.
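
    Purely as an illustration of the peak-fitting step that such codes automate (not the CERPI/CEREL code itself), the sketch below fits a Gaussian photopeak on a linear background to a synthetic spectrum segment with scipy; the peak parameters are hypothetical.

      import numpy as np
      from scipy.optimize import curve_fit

      def peak(ch, area, centroid, sigma, bg0, bg1):
          """Gaussian photopeak on a linear background (counts per channel)."""
          gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((ch - centroid) / sigma) ** 2)
          return gauss + bg0 + bg1 * ch

      # Synthetic spectrum segment around a hypothetical photopeak near channel 1332.
      rng = np.random.default_rng(3)
      channels = np.arange(1300, 1365, dtype=float)
      true_counts = peak(channels, area=5000, centroid=1332.0, sigma=1.8, bg0=40, bg1=-0.01)
      counts = rng.poisson(true_counts).astype(float)

      popt, pcov = curve_fit(peak, channels, counts, p0=[4000, 1331, 2.0, 30, 0.0])
      area, centroid, sigma = popt[:3]
      print("centroid = %.2f channels, net area = %.0f +/- %.0f counts"
            % (centroid, area, np.sqrt(pcov[0, 0])))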

  7. Rapid and reliable identification of Gram-negative bacteria and Gram-positive cocci by deposition of bacteria harvested from blood cultures onto the MALDI-TOF plate.

    Science.gov (United States)

    Barnini, Simona; Ghelardi, Emilia; Brucculeri, Veronica; Morici, Paola; Lupetti, Antonella

    2015-06-18

    Rapid identification of the causative agent(s) of bloodstream infections using the matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) methodology can lead to increased empirical antimicrobial therapy appropriateness. Herein, we aimed at establishing an easier and simpler method, further referred to as the direct method, using bacteria harvested by serum separator tubes from positive blood cultures and placed onto the polished steel target plate for rapid identification by MALDI-TOF. The results by the direct method were compared with those obtained by MALDI-TOF on bacteria isolated on solid media. Identification of Gram-negative bacilli was 100 % concordant using the direct method or MALDI-TOF on isolated bacteria (96 % with score > 2.0). These two methods were 90 % concordant on Gram-positive cocci (32 % with score > 2.0). Identification by the SepsiTyper method of Gram-positive cocci gave concordant results with MALDI-TOF on isolated bacteria in 87 % of cases (37 % with score > 2.0). The direct method herein developed allows rapid identification (within 30 min) of Gram-negative bacteria and Gram-positive cocci from positive blood cultures and can be used to rapidly report reliable and accurate results, without requiring skilled personnel or the use of expensive kits.

  8. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety...... and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic...... approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very...
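
    A minimal, hypothetical example of the structural reliability calculation mentioned above: Monte Carlo sampling of a normally distributed resistance and load, with the closed-form result for two normals used as a cross-check.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(7)
      n_sim = 2_000_000

      # Hypothetical normally distributed resistance R and load S of a component (MPa).
      R = rng.normal(loc=300.0, scale=30.0, size=n_sim)
      S = rng.normal(loc=200.0, scale=25.0, size=n_sim)

      pf_mc = np.mean(R < S)                            # Monte Carlo failure probability

      # For two normals the result is available in closed form as a cross-check.
      beta = (300.0 - 200.0) / np.sqrt(30.0 ** 2 + 25.0 ** 2)
      pf_exact = norm.cdf(-beta)
      print("Monte Carlo Pf = %.2e, analytical Pf = %.2e" % (pf_mc, pf_exact))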

  9. The utility of including pathology reports in improving the computational identification of patients

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2016-01-01

    Full Text Available Background: Celiac disease (CD) is a common autoimmune disorder. Efficient identification of patients may improve chronic management of the disease. Prior studies have shown searching International Classification of Diseases-9 (ICD-9) codes alone is inaccurate for identifying patients with CD. In this study, we developed automated classification algorithms leveraging pathology reports and other clinical data in Electronic Health Records (EHRs) to refine the subset population preselected using ICD-9 code (579.0). Materials and Methods: EHRs were searched for the established ICD-9 code (579.0) suggesting CD, based on which an initial identification of cases was obtained. In addition, laboratory results for tissue transglutaminase were extracted. Using natural language processing we analyzed pathology reports from upper endoscopy. Twelve machine learning classifiers using different combinations of variables related to ICD-9 CD status, laboratory result status, and pathology reports were evaluated to find the best possible CD classifier. Ten-fold cross-validation was used to assess the results. Results: A total of 1498 patient records were used including 363 confirmed cases and 1135 false positive cases that served as controls. A logistic model based on both clinical and pathology report features produced the best results: Kappa of 0.78, F1 of 0.92, and area under the curve (AUC) of 0.94, whereas in contrast using ICD-9 only generated poor results: Kappa of 0.28, F1 of 0.75, and AUC of 0.63. Conclusion: Our automated classification system presented an efficient and reliable way to improve the performance of CD patient identification.
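
    As an editorial illustration of the evaluation procedure described (not the study's own code or data), the sketch below runs 10-fold cross-validation of a logistic classifier on synthetic binary features standing in for ICD-9, serology and pathology-report indicators, and reports kappa, F1 and AUC.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import StratifiedKFold, cross_val_predict
      from sklearn.metrics import cohen_kappa_score, f1_score, roc_auc_score

      rng = np.random.default_rng(5)
      n = 1498
      y = (rng.random(n) < 363 / 1498).astype(int)          # confirmed CD vs. control

      # Synthetic binary features standing in for ICD-9 status, positive serology and a
      # pathology-report flag extracted by NLP (informative but noisy).
      X = np.column_stack([
          (rng.random(n) < np.where(y == 1, 0.95, 0.60)),   # ICD-9 code present
          (rng.random(n) < np.where(y == 1, 0.85, 0.15)),   # serology positive
          (rng.random(n) < np.where(y == 1, 0.90, 0.10)),   # pathology consistent with CD
      ]).astype(float)

      clf = LogisticRegression()
      cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
      pred = cross_val_predict(clf, X, y, cv=cv)
      prob = cross_val_predict(clf, X, y, cv=cv, method="predict_proba")[:, 1]

      print("kappa = %.2f  F1 = %.2f  AUC = %.2f"
            % (cohen_kappa_score(y, pred), f1_score(y, pred), roc_auc_score(y, prob)))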

  10. Identification of control targets in Boolean molecular network models via computational algebra.

    Science.gov (United States)

    Murrugarra, David; Veliz-Cuba, Alan; Aguilar, Boris; Laubenbacher, Reinhard

    2016-09-23

    Many problems in biomedicine and other areas of the life sciences can be characterized as control problems, with the goal of finding strategies to change a disease or otherwise undesirable state of a biological system into another, more desirable, state through an intervention, such as a drug or other therapeutic treatment. The identification of such strategies is typically based on a mathematical model of the process to be altered through targeted control inputs. This paper focuses on processes at the molecular level that determine the state of an individual cell, involving signaling or gene regulation. The mathematical model type considered is that of Boolean networks. The potential control targets can be represented by a set of nodes and edges that can be manipulated to produce a desired effect on the system. This paper presents a method for the identification of potential intervention targets in Boolean molecular network models using algebraic techniques. The approach exploits an algebraic representation of Boolean networks to encode the control candidates in the network wiring diagram as the solutions of a system of polynomial equations, and then uses computational algebra techniques to find such controllers. The control methods in this paper are validated through the identification of combinatorial interventions in the signaling pathways of previously reported control targets in two well-studied systems, a p53-mdm2 network and a blood T cell lymphocyte granular leukemia survival signaling network. Supplementary data are available online and our code in Macaulay2 and Matlab is available via http://www.ms.uky.edu/~dmu228/ControlAlg . This paper presents a novel method for the identification of intervention targets in Boolean network models. The results in this paper show that the proposed methods are useful and efficient for moderately large networks.
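
    The record's algebraic encoding is not reproduced here, but the underlying idea of searching for node interventions can be illustrated on a toy Boolean network with a brute-force sweep over single-node constant interventions; the wiring rules and the desired state below are hypothetical.

      from itertools import product

      # Toy 3-node Boolean network (hypothetical wiring), updated synchronously.
      def update(state, pinned):
          x, y, z = state
          nxt = (y and not z,          # x' = y AND NOT z
                 x or z,               # y' = x OR z
                 not x)                # z' = NOT x
          # A pinned node is held constant by the intervention, overriding its update.
          return tuple(pinned.get(i, v) for i, v in enumerate(nxt))

      def attractor(start, pinned):
          """Iterate the deterministic dynamics until a state repeats."""
          seen, state = {}, tuple(pinned.get(i, v) for i, v in enumerate(start))
          while state not in seen:
              seen[state] = len(seen)
              state = update(state, pinned)
          return state   # first revisited state, i.e. a state on the attractor

      desired = (False, True, True)    # desired "healthy" steady state

      # Brute-force search: which single-node constant interventions send every
      # initial state into the desired attractor?
      for node, value in product(range(3), (False, True)):
          pinned = {node: value}
          if all(attractor(s, pinned) == desired for s in product((False, True), repeat=3)):
              print("pin node %d to %s -> all trajectories reach %s" % (node, value, desired))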

  11. Computational approaches to standard-compliant biofilm data for reliable analysis and integration.

    Science.gov (United States)

    Sousa, Ana Margarida; Ferreira, Andreia; Azevedo, Nuno F; Pereira, Maria Olivia; Lourenço, Anália

    2012-12-01

    The study of microorganism consortia, also known as biofilms, is associated with a number of applications in biotechnology, ecotechnology and clinical domains. Nowadays, biofilm studies are heterogeneous and data-intensive, encompassing different levels of analysis. Computational modelling of biofilm studies has thus become a requirement to make sense of these vast and ever-expanding biofilm data volumes. The rationale of the present work is a machine-readable format for representing biofilm studies and supporting biofilm data interchange and data integration. This format is supported by the Biofilm Science Ontology (BSO), the first ontology on biofilms information. The ontology is decomposed into a number of areas of interest, namely: the Experimental Procedure Ontology (EPO), which describes biofilm experimental procedures; the Colony Morphology Ontology (CMO), which characterises microorganism colonies morphologically; and other modules concerning biofilm phenotype, antimicrobial susceptibility and virulence traits. The overall objective behind BSO is to develop semantic resources to capture, represent and share data on biofilms and related experiments in a regularized manner. Furthermore, the present work also introduces a framework to assist biofilm data interchange and analysis - BiofOmics (http://biofomics.org) - and a public repository on colony morphology signatures - MorphoCol (http://stardust.deb.uminho.pt/morphocol).

  12. Evaluation of the reliability and accuracy of using cone-beam computed tomography for diagnosing periapical cysts from granulomas.

    Science.gov (United States)

    Guo, Jing; Simon, James H; Sedghizadeh, Parish; Soliman, Osman N; Chapman, Travis; Enciso, Reyes

    2013-12-01

    The purpose of this study was to evaluate the reliability and accuracy of cone-beam computed tomographic (CBCT) imaging against the histopathologic diagnosis for the differential diagnosis of periapical cysts (cavitated lesions) from (solid) granulomas. Thirty-six periapical lesions were imaged using CBCT scans. Apicoectomy surgeries were conducted for histopathological examination. Evaluator 1 examined each CBCT scan for the presence of 6 radiologic characteristics of a cyst (ie, location, periphery, shape, internal structure, effects on surrounding structure, and perforation of the cortical plate). Not every cyst showed all radiologic features (eg, not all cysts perforate the cortical plate). For the purpose of finding the minimum number of diagnostic criteria present in a scan to diagnose a lesion as a cyst, we conducted 6 receiver operating characteristic curve analyses comparing CBCT diagnoses with the histopathologic diagnosis. Two other independent evaluators examined the CBCT lesions. Statistical tests were conducted to examine the accuracy, inter-rater reliability, and intrarater reliability of CBCT images. Findings showed that a score of ≥4 positive findings was the optimal scoring system. The accuracies of differential diagnoses of 3 evaluators were moderate (area under the curve = 0.76, 0.70, and 0.69 for evaluators 1, 2, and 3, respectively). The inter-rater agreement of the 3 evaluators was excellent (α = 0.87). The intrarater agreement was good to excellent (κ = 0.71, 0.76, and 0.77). CBCT images can provide a moderately accurate diagnosis between cysts and granulomas. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
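
    As a hypothetical illustration of how such a cutoff can be derived, the sketch below runs an ROC analysis of the number of positive radiologic criteria against a histopathologic reference and picks the threshold by Youden's J; the lesion data are synthetic.

      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(11)
      n = 36

      # Hypothetical data: histopathologic truth (1 = cyst) and the number of the six
      # radiologic cyst criteria scored positive on the CBCT scan for each lesion.
      is_cyst = rng.integers(0, 2, size=n)
      n_criteria = np.clip(rng.poisson(np.where(is_cyst == 1, 4.5, 2.0)), 0, 6)

      fpr, tpr, thresholds = roc_curve(is_cyst, n_criteria)
      print("AUC = %.2f" % roc_auc_score(is_cyst, n_criteria))

      # Youden's J picks the score cutoff that best separates cysts from granulomas;
      # lesions with at least that many positive criteria would be called cysts.
      j = tpr - fpr
      best = thresholds[np.argmax(j)]
      print("call 'cyst' when positive criteria >= %.0f" % best)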

  13. Online Identification with Reliability Criterion and State of Charge Estimation Based on a Fuzzy Adaptive Extended Kalman Filter for Lithium-Ion Batteries

    Directory of Open Access Journals (Sweden)

    Zhongwei Deng

    2016-06-01

    Full Text Available In the field of state of charge (SOC) estimation, the Kalman filter has been widely used for many years, although its performance strongly depends on the accuracy of the battery model as well as the noise covariance. The Kalman gain determines the confidence coefficient of the battery model by adjusting the weight of open circuit voltage (OCV) correction, and has a strong correlation with the measurement noise covariance (R). In this paper, the online identification method is applied to acquire the real model parameters under different operation conditions. A criterion based on the OCV error is proposed to evaluate the reliability of the online parameters. Besides, the equivalent circuit model produces an intrinsic model error which is dependent on the load current, and the property that a high battery current or a large current change induces a large model error can be observed. Based on the above prior knowledge, a fuzzy model is established to compensate the model error through updating R. Combining the positive strategy (i.e., online identification) and the negative strategy (i.e., fuzzy model), a more reliable and robust SOC estimation algorithm is proposed. The experimental results verify the proposed reliability criterion and SOC estimation method under various conditions for LiFePO4 batteries.

  14. Intraobserver and interobserver reliability of radial torsion angle measurements by a new and alternative method with computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Freitas, Luiz Fernando Pinheiro de; Barbieri, Claudio Henrique; Mazzer, Nilton; Zatiti, Salomao Chade Assan; Bellucci, Angela Delete [Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). School of Medicine. Dept. of Biomechanics, Medicine and Rehabilitation; Nogueira-Barbosa, Marcello Henrique, E-mail: marcello@fmrp.usp.b [Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). School of Medicine. Radiology Div.

    2010-07-01

    Objective: to evaluate the intraobserver and interobserver reliability of radial torsion angle measurement using computed tomography. Methods: twelve pairs of cadaver radii and 116 forearms from 58 healthy volunteers were evaluated using axial computed tomography sections measured at the level of the bicipital tuberosity and the subchondral region of the radius. During digital imaging, the angle was formed by two lines, one diametrically perpendicular to the radial tubercle and the other tangential to the volar rim of the distal joint surface. Measurements were performed twice each by three observers. Results: in cadaveric bones, the mean radial torsion angle was 1.48 deg (-6 deg - 9 deg) on the right and 1.62 deg (-6 deg - 8 deg) on the left, with a mean difference between the right and left sides of 1.61 deg (0 deg - 8 deg). In volunteers, the mean radial torsion angle was 3.00 deg (-17 deg - 17 deg) on the right and 2.91 deg (-16 deg - 15 deg) on the left, with a mean difference between the sides of 1.58 deg (0 deg - 7 deg). There was no significant difference between the two sides. The interobserver correlation coefficient for the cadaver radii measurements was 0.88 (0.72 - 0.96) and 0.81 (0.58 - 0.93) for the right and left radius, respectively, while for the volunteers the coefficients were 0.84 (0.77 - 0.90) and 0.83 (0.75 - 0.89), respectively. Intraobserver reliability was high. Conclusion: the described method is reproducible and applicable even when the radial tubercle has a rounded contour. (author)

  15. A reliable computational workflow for the selection of optimal screening libraries.

    Science.gov (United States)

    Gilad, Yocheved; Nadassy, Katalin; Senderowitz, Hanoch

    2015-01-01

    components, it can be easily adapted and reproduced by computational groups interested in rational selection of screening libraries. Furthermore, the workflow could be readily modified to include additional components. This workflow has been routinely used in our laboratory for the selection of libraries in multiple projects and consistently selects libraries which are well balanced across multiple parameters.

  16. The analytic hierarchy process as a systematic approach to the identification of important parameters for the reliability assessment of passive systems

    International Nuclear Information System (INIS)

    Zio, E.; Cantarella, M.; Cammi, A.

    2003-01-01

    Passive systems play a crucial role in the development of future solutions for nuclear plant technology. A fundamental issue still to be resolved is the quantification of the reliability of such systems. In this paper, we firstly illustrate a systematic methodology to guide the definition of the failure criteria of a passive system and the evaluation of its probability of occurrence, through the identification of the relevant system parameters and the propagation of their associated uncertainties. Within this methodology, we propose the use of the analytic hierarchy process as a structured and reproducible tool for the decomposition of the problem and the identification of the dominant system parameters. An example of its application to a real passive system is illustrated in detail.
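
    A small, hypothetical sketch of the analytic hierarchy process step described above: a pairwise-comparison matrix over candidate parameters is reduced to priority weights via its principal eigenvector, and the consistency ratio checks the coherence of the judgments. The matrix entries and parameter names are invented.

      import numpy as np

      # Hypothetical pairwise comparisons (Saaty 1-9 scale) of four candidate parameters,
      # e.g. heat-exchange coefficient, non-condensable fraction, initial pressure, level.
      A = np.array([
          [1,   3,   5,   7],
          [1/3, 1,   3,   5],
          [1/5, 1/3, 1,   3],
          [1/7, 1/5, 1/3, 1],
      ], dtype=float)

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                       # priority weights of the parameters
      print("weights:", np.round(w, 3))

      # Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI (RI = 0.90 for n = 4)
      n = A.shape[0]
      lambda_max = eigvals.real[k]
      ci = (lambda_max - n) / (n - 1)
      cr = ci / 0.90
      print("lambda_max = %.3f, CR = %.3f (acceptable if < 0.10)" % (lambda_max, cr))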

  17. The Phoneme Identification Test for Assessment of Spectral and Temporal Discrimination Skills in Children: Development, Normative Data, and Test-Retest Reliability Studies.

    Science.gov (United States)

    Cameron, Sharon; Chong-White, Nicky; Mealings, Kiri; Beechey, Tim; Dillon, Harvey; Young, Taegan

    2018-02-01

    Previous research suggests that a proportion of children experiencing reading and listening difficulties may have an underlying primary deficit in the way that the central auditory nervous system analyses the perceptually important, rapidly varying, formant frequency components of speech. The Phoneme Identification Test (PIT) was developed to investigate the ability of children to use spectro-temporal cues to perceptually categorize speech sounds based on their rapidly changing formant frequencies. The PIT uses an adaptive two-alternative forced-choice procedure whereby the participant identifies a synthesized consonant-vowel (CV) (/ba/ or /da/) syllable. CV syllables differed only in the second formant (F2) frequency along an 11-step continuum (between 0% and 100%-representing an ideal /ba/ and /da/, respectively). The CV syllables were presented in either quiet (PIT Q) or noise at a 0 dB signal-to-noise ratio (PIT N). Development of the PIT stimuli and test protocols, and collection of normative and test-retest reliability data. Twelve adults (aged 23 yr 10 mo to 50 yr 9 mo, mean 32 yr 5 mo) and 137 typically developing, primary-school children (aged 6 yr 0 mo to 12 yr 4 mo, mean 9 yr 3 mo). There were 73 males and 76 females. Data were collected using a touchscreen computer. Psychometric functions were automatically fit to individual data by the PIT software. Performance was determined by the width of the continuum for which responses were neither clearly /ba/ nor /da/ (referred to as the uncertainty region [UR]). A shallower psychometric function slope reflected greater uncertainty. Age effects were determined based on raw scores. Z scores were calculated to account for the effect of age on performance. Outliers, and individual data for which the confidence interval of the UR exceeded a maximum allowable value, were removed. Nonparametric tests were used as the data were skewed toward negative performance. Across participants, the median value of the F2 range
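
    Purely as an illustration of the psychometric-function analysis described (the PIT software itself is not reproduced), the sketch below fits a logistic function to synthetic /ba/-/da/ identification proportions along the F2 continuum and reports the width of an uncertainty region taken here as the span between the 25% and 75% points.

      import numpy as np
      from scipy.optimize import curve_fit

      def psychometric(x, x0, slope):
          """Probability of a /da/ response as a function of F2 continuum step (0-100%)."""
          return 1.0 / (1.0 + np.exp(-slope * (x - x0)))

      # Hypothetical identification data: 11 continuum steps, 20 trials each.
      steps = np.linspace(0, 100, 11)
      rng = np.random.default_rng(8)
      p_true = psychometric(steps, x0=50.0, slope=0.15)
      p_da = rng.binomial(20, p_true) / 20.0

      popt, _ = curve_fit(psychometric, steps, p_da, p0=[50.0, 0.1])
      x0, slope = popt

      # Uncertainty region: continuum range where responses are neither clearly /ba/ nor /da/
      # (here taken as the 25%-75% points); a shallow slope gives a wide region.
      lo = x0 - np.log(3) / slope
      hi = x0 + np.log(3) / slope
      print("boundary at %.1f%%, uncertainty region width = %.1f%% of the continuum"
            % (x0, hi - lo))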

  18. Computer identification of symptomatic deep venous thrombosis associated with peripherally inserted venous catheters.

    Science.gov (United States)

    Evans, R Scott; Linford, Lorraine H; Sharp, Jamie H; White, Gayle; Lloyd, James F; Weaver, Lindell K

    2007-10-11

    Peripherally inserted central catheters (PICCs) are considered a safe method to provide long-term antibiotic therapy, chemotherapy and nutrition support. Deep venous thrombosis (DVT) is a complication that requires early PICC removal, may extend hospitalization and can result in pulmonary embolism. PICC insertion teams strive to understand risk factors and develop methods to prevent DVTs. However, they can only manage what they can measure. At LDS Hospital, identification of PICC associated DVTs was dependent on verbal notification or manual surveillance of more than a thousand free-text vascular reports. Accurate DVT rates were not known which hindered prevention. We describe the development of a computer application (PICC-DVT monitor) to identify PICC associated DVTs each day. A one-year evaluation of the monitor by the PICC team and a review of 445 random vascular reports found a positive predictive value of 98%, sensitivity of 94%, specificity of 100% and a PICC team associated DVT rate of 2.8%.

  19. Computer game as a tool for training the identification of phonemic length.

    Science.gov (United States)

    Pennala, Riitta; Richardson, Ulla; Ylinen, Sari; Lyytinen, Heikki; Martin, Maisa

    2014-12-01

    Computer-assisted training of Finnish phonemic length was conducted with 7-year-old Russian-speaking second-language learners of Finnish. Phonemic length plays a different role in these two languages. The training included game activities with two- and three-syllable word and pseudo-word minimal pairs with prototypical vowel durations. The lowest accuracy scores were recorded for two-syllable words. Accuracy scores were higher for the minimal pairs with larger rather than smaller differences in duration. Accuracy scores were lower for long duration than for short duration. The ability to identify quantity degree was generalized to stimuli used in the identification test in two of the children. Ideas for improving the game are introduced.

  20. The photon identification loophole in EPRB experiments: computer models with single-wing selection

    Science.gov (United States)

    De Raedt, Hans; Michielsen, Kristel; Hess, Karl

    2017-11-01

    Recent Einstein-Podolsky-Rosen-Bohm experiments [M. Giustina et al. Phys. Rev. Lett. 115, 250401 (2015); L. K. Shalm et al. Phys. Rev. Lett. 115, 250402 (2015)] that claim to be loophole free are scrutinized. The combination of a digital computer and discrete-event simulation is used to construct a minimal but faithful model of the most perfected realization of these laboratory experiments. In contrast to prior simulations, all photon selections are strictly made, as they are in the actual experiments, at the local station and no other "post-selection" is involved. The simulation results demonstrate that a manifestly non-quantum model that identifies photons in the same local manner as in these experiments can produce correlations that are in excellent agreement with those of the quantum theoretical description of the corresponding thought experiment, in conflict with Bell's theorem which states that this is impossible. The failure of Bell's theorem is possible because of our recognition of the photon identification loophole. Such identification measurement-procedures are necessarily included in all actual experiments but are not included in the theory of Bell and his followers.

  1. The photon identification loophole in EPRB experiments: computer models with single-wing selection

    Directory of Open Access Journals (Sweden)

    De Raedt Hans

    2017-11-01

    Full Text Available Recent Einstein-Podolsky-Rosen-Bohm experiments [M. Giustina et al. Phys. Rev. Lett. 115, 250401 (2015); L. K. Shalm et al. Phys. Rev. Lett. 115, 250402 (2015)] that claim to be loophole free are scrutinized. The combination of a digital computer and discrete-event simulation is used to construct a minimal but faithful model of the most perfected realization of these laboratory experiments. In contrast to prior simulations, all photon selections are strictly made, as they are in the actual experiments, at the local station and no other “post-selection” is involved. The simulation results demonstrate that a manifestly non-quantum model that identifies photons in the same local manner as in these experiments can produce correlations that are in excellent agreement with those of the quantum theoretical description of the corresponding thought experiment, in conflict with Bell’s theorem which states that this is impossible. The failure of Bell’s theorem is possible because of our recognition of the photon identification loophole. Such identification measurement-procedures are necessarily included in all actual experiments but are not included in the theory of Bell and his followers.

  2. A hyperspectral X-ray computed tomography system for enhanced material identification

    Science.gov (United States)

    Wu, Xiaomei; Wang, Qian; Ma, Jinlei; Zhang, Wei; Li, Po; Fang, Zheng

    2017-08-01

    X-ray computed tomography (CT) can distinguish different materials according to their absorption characteristics. The hyperspectral X-ray CT (HXCT) system proposed in the present work reconstructs each voxel according to its X-ray absorption spectral characteristics. In contrast to a dual-energy or multi-energy CT system, HXCT employs cadmium telluride (CdTe) as the x-ray detector, which, owing to its photon-counting working principle, provides higher spectral resolution and separates the spectral signatures of different materials. In this paper, a specimen containing ten different polymer materials randomly arranged was adopted for material identification by HXCT. The filtered back-projection algorithm was applied for image and spectral reconstruction. The first step was to sort the individual material components of the specimen according to their cross-sectional image intensity. The second step was to classify materials with similar intensities according to their reconstructed spectral characteristics. The results demonstrated the feasibility of the proposed material identification process and indicated that the proposed HXCT system has good prospects for a wide range of biomedical and industrial nondestructive testing applications.

  3. Contribute to quantitative identification of casting defects based on computer analysis of X-ray images

    Directory of Open Access Journals (Sweden)

    Z. Ignaszak

    2007-12-01

    Full Text Available The forecast of the structure and properties of a casting is based on the results of computer simulation of the physical processes taking place during casting. For effective use of a simulation system it is necessary to validate the mathematical-physical models that describe the formation of the casting and the creation of local discontinuities, which determine the casting properties. In the paper a proposition for quantitative validation of a VP system using solidification casting defects, drawing on information sources of group II (NDT methods), is introduced; it was named VP/RT validation (virtual prototyping/radiographic testing validation). Nowadays the identification of casting defects noticeable on X-ray images is based on comparison of the X-ray image of the casting with reference images related to the ASTM standards. The results of this comparison are often not conclusive because they rely on the operator's subjective assessment. In the paper a system for quantitative identification of iron casting defects on X-ray images, and for classification of these defects into ASTM classes, is presented. The methods of pattern recognition and machine learning were applied.

  4. RGCA: A Reliable GPU Cluster Architecture for Large-Scale Internet of Things Computing Based on Effective Performance-Energy Optimization.

    Science.gov (United States)

    Fang, Yuling; Chen, Qingkui; Xiong, Neal N; Zhao, Deyu; Wang, Jingjuan

    2017-08-04

    This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamically coupling Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes' diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased, by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with the TLPOM, and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services.

  5. Reliability and validity of the revised Gibson Test of Cognitive Skills, a computer-based test battery for assessing cognition across the lifespan.

    Science.gov (United States)

    Moore, Amy Lawson; Miller, Terissa M

    2018-01-01

    The purpose of the current study is to evaluate the validity and reliability of the revised Gibson Test of Cognitive Skills, a computer-based battery of tests measuring short-term memory, long-term memory, processing speed, logic and reasoning, visual processing, as well as auditory processing and word attack skills. This study included 2,737 participants aged 5-85 years. A series of studies was conducted to examine the validity and reliability using the test performance of the entire norming group and several subgroups. The evaluation of the technical properties of the test battery included content validation by subject matter experts, item analysis and coefficient alpha, test-retest reliability, split-half reliability, and analysis of concurrent validity with the Woodcock Johnson III Tests of Cognitive Abilities and Tests of Achievement. Results indicated strong sources of evidence of validity and reliability for the test, including internal consistency reliability coefficients ranging from 0.87 to 0.98, test-retest reliability coefficients ranging from 0.69 to 0.91, split-half reliability coefficients ranging from 0.87 to 0.91, and concurrent validity coefficients ranging from 0.53 to 0.93. The Gibson Test of Cognitive Skills-2 is a reliable and valid tool for assessing cognition in the general population across the lifespan.
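
    As an editorial illustration of two of the reliability statistics reported above, the sketch below computes Cronbach's alpha and a Spearman-Brown corrected split-half coefficient on a synthetic item-response matrix; the data and item counts are hypothetical.

      import numpy as np

      rng = np.random.default_rng(13)
      n_subjects, n_items = 200, 20

      # Synthetic item scores driven by a common ability factor plus item noise.
      ability = rng.normal(size=(n_subjects, 1))
      scores = ability + rng.normal(scale=0.8, size=(n_subjects, n_items))

      # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)
      k = n_items
      item_var = scores.var(axis=0, ddof=1).sum()
      total_var = scores.sum(axis=1).var(ddof=1)
      alpha = k / (k - 1) * (1 - item_var / total_var)

      # Split-half reliability: correlate odd- and even-item halves, then apply
      # the Spearman-Brown correction to estimate full-length reliability.
      odd = scores[:, ::2].sum(axis=1)
      even = scores[:, 1::2].sum(axis=1)
      r_half = np.corrcoef(odd, even)[0, 1]
      split_half = 2 * r_half / (1 + r_half)

      print("alpha = %.2f, split-half (Spearman-Brown) = %.2f" % (alpha, split_half))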

  6. Maximizing the sensitivity and reliability of peptide identification in large-scale proteomic experiments by harnessing multiple search engines.

    Science.gov (United States)

    Yu, Wen; Taylor, J Alex; Davis, Michael T; Bonilla, Leo E; Lee, Kimberly A; Auger, Paul L; Farnsworth, Chris C; Welcher, Andrew A; Patterson, Scott D

    2010-03-01

    Despite recent advances in qualitative proteomics, the automatic identification of peptides with optimal sensitivity and accuracy remains a difficult goal. To address this deficiency, a novel algorithm, Multiple Search Engines, Normalization and Consensus, is described. The method employs six search engines and a re-scoring engine to search MS/MS spectra against protein and decoy sequences. After the peptide hits from each engine are normalized to error rates estimated from the decoy hits, peptide assignments are then deduced using a minimum consensus model. These assignments are produced in a series of progressively relaxed false-discovery rates, thus enabling a comprehensive interpretation of the data set. Additionally, the estimated false-discovery rate was found to have good concordance with the observed false-positive rate calculated from known identities. Benchmarking against standard protein data sets (ISBv1, sPRG2006) and their published analysis demonstrated that the Multiple Search Engines, Normalization and Consensus algorithm consistently achieved significantly higher sensitivity in peptide identifications, which led to increased or more robust protein identifications in all data sets compared with prior methods. The sensitivity and the false-positive rate of peptide identification exhibit an inverse-proportional and linear relationship with the number of participating search engines.
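
    The consensus model itself is not reproduced here, but the decoy-based normalization step can be illustrated with a short sketch: peptide-spectrum match scores are sorted, the false-discovery rate at each threshold is estimated as the ratio of accumulated decoy to target hits, and q-values are derived. The score distributions below are synthetic.

      import numpy as np

      rng = np.random.default_rng(17)

      # Synthetic peptide-spectrum match scores from one search engine: target hits tend
      # to score higher than decoy (reversed-sequence) hits.
      target = rng.normal(loc=3.0, scale=1.0, size=800)
      decoy = rng.normal(loc=1.5, scale=1.0, size=800)

      scores = np.concatenate([target, decoy])
      is_decoy = np.concatenate([np.zeros(800, bool), np.ones(800, bool)])

      # Sort by decreasing score and estimate FDR at each threshold as #decoy / #target.
      order = np.argsort(-scores)
      dec_cum = np.cumsum(is_decoy[order])
      tgt_cum = np.cumsum(~is_decoy[order])
      fdr = dec_cum / np.maximum(tgt_cum, 1)

      # q-value: the minimum FDR at which a given hit would still be accepted.
      qval = np.minimum.accumulate(fdr[::-1])[::-1]

      for level in (0.01, 0.05):
          n_accepted = int(np.sum((qval <= level) & ~is_decoy[order]))
          print("target PSMs accepted at %.0f%% FDR: %d" % (100 * level, n_accepted))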

  7. Improved method for reliable HMW-GS identification by RP-HPLC and SDS-PAGE in common wheat cultivars

    Science.gov (United States)

    The accurate identification of alleles for high-molecular weight glutenins (HMW-GS) is critical for wheat breeding programs targeting end-use quality. RP-HPLC methods were optimized for separation of HMW-GS, resulting in enhanced resolution of 1By and 1Dx subunits. Statistically significant differe...

  8. Reliable and reproducible method for rapid identification of Nocardia species by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    Science.gov (United States)

    Toyokawa, Masahiro; Kimura, Keigo; Nishi, Isao; Sunada, Atsuko; Ueda, Akiko; Sakata, Tomomi; Asari, Seishi

    2013-01-01

    Recently, matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) has been applied to the identification of Nocardia species. However, the standard ethanol-formic acid extraction alone is insufficient to allow the membrane proteins of Nocardia species to be ionized by the matrix. We therefore aimed to establish a new extraction method for the MALDI-TOF MS-based identification of Nocardia species isolates. Our modified extraction procedure consists of dissociation in 0.5% Tween-20, followed by bacterial heat-inactivation, mechanical breaking of the cell wall with acid-washed glass beads, and protein extraction with formic acid and acetonitrile. As reference methods for species identification, full-length 16S rRNA gene sequencing and some phenotypical tests were used. In a first step, we made our own Nocardia database by analyzing 13 strains (13 different species including N. elegans, N. otitidiscaviarum, N. asiatica, N. abscessus, N. brasiliensis, N. thailandica, N. farcinica, N. nova, N. mikamii, N. cyriacigeorgica, N. asteroides, Nocardiopsis alba, and Micromonospora sp.) and registered them in the MALDI BioTyper database, thereby establishing our database. The analysis of 12 challenge strains using our database gave 100% correct identification, including 8 strains identified to the species level and 4 strains to the genus level (N. elegans, N. nova, N. farcinica, Micromonospora sp.) according to the manufacturer's log score specifications. In the assessment of the reproducibility of our method on 4 strains, both within-run and between-run reproducibility were excellent. These data indicate that our method for rapid identification of Nocardia species is reliable, reproducible and cost-effective.

  9. Genome-wide identification of the regulatory targets of a transcription factor using biochemical characterization and computational genomic analysis

    Directory of Open Access Journals (Sweden)

    Jolly Emmitt R

    2005-11-01

    Full Text Available Background: A major challenge in computational genomics is the development of methodologies that allow accurate genome-wide prediction of the regulatory targets of a transcription factor. We present a method for target identification that combines experimental characterization of binding requirements with computational genomic analysis. Results: Our method identified potential target genes of the transcription factor Ndt80, a key transcriptional regulator involved in yeast sporulation, using the combined information of binding affinity, positional distribution, and conservation of the binding sites across multiple species. We have also developed a mathematical approach to compute the false positive rate and the total number of targets in the genome based on the multiple selection criteria. Conclusion: We have shown that combining biochemical characterization and computational genomic analysis leads to accurate identification of the genome-wide targets of a transcription factor. The method can be extended to other transcription factors and can complement other genomic approaches to transcriptional regulation.

  10. Identification of rounded atelectasis in workers exposed to asbestos by contrast helical computed tomography

    International Nuclear Information System (INIS)

    Terra-Filho, M.; Kavakama, J.; Bagatin, E.; Capelozzi, V.L.; Nery, L.E.; Tavares, R.

    2003-01-01

    Rounded atelectasis (RA) is a benign and unusual form of subpleural lung collapse that has been described mostly in asbestos-exposed workers. This form of atelectasis manifests as a lung nodule and can be confused with bronchogenic carcinoma upon conventional radiologic examination. The objective of the present study was to evaluate the variation in contrast uptake in computed tomography for the identification of asbestos-related RA in Brazil. Between January 1998 and December 2000, high-resolution computed tomography (HRCT) was performed in 1658 asbestos-exposed workers. The diagnosis was made in nine patients based on a history of prior asbestos exposure, the presence of characteristic HRCT findings and lesions unchanged in size over 2 years or more. In three of them the diagnosis was confirmed during surgery. The dynamic contrast enhancement study was modified to evaluate nodules and pulmonary masses. All nine patients with RA received iodide contrast according to weight. After infusion of iodide contrast, the average enhancement, reported in Hounsfield units (HU), increased from 62.5±9.7 to 125.4±20.7 (P < 0.05), with a mean enhancement of 62.5±19.7 (range 40 to 89) and with uniform dense opacification. In conclusion, in this study all patients with RA showed contrast enhancement with uniform dense opacification. The main clinical implication of this finding is that this procedure does not permit differentiation between RA and malignant pulmonary neoplasm. (author)

  11. Computational identification of binding energy hot spots in protein-RNA complexes using an ensemble approach.

    Science.gov (United States)

    Pan, Yuliang; Wang, Zixiang; Zhan, Weihua; Deng, Lei

    2018-05-01

    Identifying RNA-binding residues, especially energetically favored hot spots, can provide valuable clues for understanding the mechanisms and functional importance of protein-RNA interactions. Yet, the limited availability of experimentally recognized energy hot spots in protein-RNA crystal structures leads to difficulties in developing empirical identification approaches. Computational prediction of RNA-binding hot spot residues is still in its infancy. Here, we describe a computational method, PrabHot (Prediction of protein-RNA binding hot spots), that can effectively detect hot spot residues on protein-RNA binding interfaces using an ensemble of conceptually different machine learning classifiers. Residue interaction network features and new solvent exposure characteristics are combined and selected for classification with the Boruta algorithm. In particular, two new reference datasets (benchmark and independent) have been generated containing 107 hot spots from 47 known protein-RNA complex structures. In 10-fold cross-validation on the training dataset, PrabHot achieves promising performances with an AUC score of 0.86 and a sensitivity of 0.78, which are significantly better than those of the pioneer RNA-binding hot spot prediction method HotSPRing. We also demonstrate the capability of our proposed method on the independent test dataset and gain a competitive advantage as a result. The PrabHot webserver is freely available at http://denglab.org/PrabHot/. Supplementary data are available at Bioinformatics online.

  12. Accuracy and reliability of a novel method for fusion of digital dental casts and Cone Beam Computed Tomography scans.

    Directory of Open Access Journals (Sweden)

    Frits A Rangel

    Full Text Available Several methods have been proposed to integrate digital models into Cone Beam Computed Tomography scans. Since all these methods have some drawbacks such as radiation exposure, soft tissue deformation and time-consuming digital handling processes, we propose a new method to integrate digital dental casts into Cone Beam Computed Tomography scans. Plaster casts of 10 patients were randomly selected and 5 titanium markers were glued to the upper and lower plaster cast. The plaster models were scanned, impressions were taken from the plaster models and the impressions were also scanned. Linear measurements were performed on all three models, to assess accuracy and reproducibility. Besides that, matching of the scanned plaster models and scanned impressions was done, to assess the accuracy of the matching procedure. Results show that all measurement errors are smaller than 0.2 mm, and that 81% is smaller than 0.1 mm. Matching of the scanned plaster casts and scanned impressions show a mean error between the two surfaces of the upper arch of 0.14 mm and for the lower arch of 0.18 mm. The time needed for reconstructing the CBCT scans to a digital patient, where the impressions are integrated into the CBCT scan of the patient takes about 15 minutes, with little variance between patients. In conclusion, we can state that this new method is a reliable method to integrate digital dental casts into CBCT scans. As far as radiation exposure, soft tissue deformation and digital handling processes are concerned, it is a significant improvement compared to the previously published methods.

  13. Accuracy and Reliability of a Novel Method for Fusion of Digital Dental Casts and Cone Beam Computed Tomography Scans

    Science.gov (United States)

    Rangel, Frits A.; Maal, Thomas J. J.; Bronkhorst, Ewald M.; Breuning, K. Hero; Schols, Jan G. J. H.; Bergé, Stefaan J.; Kuijpers-Jagtman, Anne Marie

    2013-01-01

    Several methods have been proposed to integrate digital models into Cone Beam Computed Tomography scans. Since all these methods have some drawbacks such as radiation exposure, soft tissue deformation and time-consuming digital handling processes, we propose a new method to integrate digital dental casts into Cone Beam Computed Tomography scans. Plaster casts of 10 patients were randomly selected and 5 titanium markers were glued to the upper and lower plaster cast. The plaster models were scanned, impressions were taken from the plaster models and the impressions were also scanned. Linear measurements were performed on all three models, to assess accuracy and reproducibility. Besides that, matching of the scanned plaster models and scanned impressions was done, to assess the accuracy of the matching procedure. Results show that all measurement errors are smaller than 0.2 mm, and that 81% is smaller than 0.1 mm. Matching of the scanned plaster casts and scanned impressions show a mean error between the two surfaces of the upper arch of 0.14 mm and for the lower arch of 0.18 mm. The time needed for reconstructing the CBCT scans to a digital patient, where the impressions are integrated into the CBCT scan of the patient takes about 15 minutes, with little variance between patients. In conclusion, we can state that this new method is a reliable method to integrate digital dental casts into CBCT scans. As far as radiation exposure, soft tissue deformation and digital handling processes are concerned, it is a significant improvement compared to the previously published methods. PMID:23527111

  14. Reliability of iris recognition as a means of identity verification and future impact on transportation worker identification credential

    OpenAIRE

    McLaren, Simon R.

    2008-01-01

    The Department of Homeland Security is deploying the Transportation Worker Identification Credential (TWIC) to U.S. ports to help ensure only authorized individuals having undergone background checks have access to secure areas. Congress mandated the TWIC have a biometric authenticator; DHS chose fingerprints. This thesis argues iris scanning is a better choice because of the nature of the maritime environment and because iris scanning is a more accurate biometric. This thesis also argues th...

  15. Test-retest reliability and comparability of paper and computer questionnaires for the Finnish version of the Tampa Scale of Kinesiophobia.

    Science.gov (United States)

    Koho, P; Aho, S; Kautiainen, H; Pohjolainen, T; Hurri, H

    2014-12-01

    To estimate the internal consistency, test-retest reliability and comparability of paper and computer versions of the Finnish version of the Tampa Scale of Kinesiophobia (TSK-FIN) among patients with chronic pain. In addition, patients' personal experiences of completing both versions of the TSK-FIN and preferences between these two methods of data collection were studied. Test-retest reliability study. Paper and computer versions of the TSK-FIN were completed twice on two consecutive days. The sample comprised 94 consecutive patients with chronic musculoskeletal pain participating in a pain management or individual rehabilitation programme. The group rehabilitation design consisted of physical and functional exercises, evaluation of the social situation, psychological assessment of pain-related stress factors, and personal pain management training in order to regain overall function and mitigate the inconvenience of pain and fear-avoidance behaviour. The mean TSK-FIN score was 37.1 [standard deviation (SD) 8.1] for the computer version and 35.3 (SD 7.9) for the paper version. The mean difference between the two versions was 1.9 (95% confidence interval 0.8 to 2.9). Test-retest reliability was 0.89 for the paper version and 0.88 for the computer version. Internal consistency was considered to be good for both versions. The intraclass correlation coefficient for comparability was 0.77 (95% confidence interval 0.66 to 0.85), indicating substantial reliability between the two methods. Both versions of the TSK-FIN demonstrated substantial intertest reliability, good test-retest reliability, good internal consistency and acceptable limits of agreement, suggesting their suitability for clinical use. However, subjects tended to score higher when using the computer version. As such, in an ideal situation, data should be collected in a similar manner throughout the course of rehabilitation or clinical research. Copyright © 2014 Chartered Society of Physiotherapy. Published

  16. The sensitivity of computed tomography (CT) scans in detecting trauma: are CT scans reliable enough for courtroom testimony?

    Science.gov (United States)

    Molina, D Kimberley; Nichols, Joanna J; Dimaio, Vincent J M

    2007-09-01

    Rapid and accurate recognition of traumatic injuries is extremely important in emergency room and surgical settings. Emergency departments depend on computed tomography (CT) scans to provide rapid, accurate injury assessment. We conducted an analysis of all traumatic deaths autopsied at the Bexar County Medical Examiner's Office in which perimortem medical imaging (CT scan) was performed to assess the reliability of the CT scan in detecting trauma with sufficient accuracy for courtroom testimony. Cases were included in the study if an autopsy was conducted, a CT scan was performed within 24 hours before death, and there was no surgical intervention. Analysis was performed to assess the correlation between the autopsy and CT scan results. Sensitivity, specificity, positive predictive value, and negative predictive value were defined for the CT scan based on the autopsy results. The sensitivity of the CT scan ranged from 0% for cerebral lacerations, cervical vertebral body fractures, cardiac injury, and hollow viscus injury to 75% for liver injury. This study reveals that CT scans are an inadequate detection tool for forensic pathologists, where a definitive diagnosis is required, because they have a low level of accuracy in detecting traumatic injuries. CT scans may be adequate for clinicians in the emergency room setting, but are inadequate for courtroom testimony. If the evidence of trauma is based solely on CT scan reports, there is a high possibility of erroneous accusations, indictments, and convictions.

  17. Prediction of Global Damage and Reliability Based Upon Sequential Identification and Updating of RC Structures Subject to Earthquakes

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Skjærbæk, P. S.; Köylüoglu, H. U.

    The paper deals with the prediction of global damage and future structural reliability with special emphasis on sensitivity, bias and uncertainty of these predictions dependent on the statistically equivalent realizations of the future earthquake. The predictions are based on a modified Clough......-Johnston single-degree-of-freedom (SDOF) oscillator with three parameters which are calibrated to fit the displacement response and the damage development in the past earthquake....

  18. Identification of fall risk predictors in daily life measurements: gait characteristics' reliability and association with self-reported fall history.

    Science.gov (United States)

    Rispens, Sietse M; van Schooten, Kimberley S; Pijnappels, Mirjam; Daffertshofer, Andreas; Beek, Peter J; van Dieën, Jaap H

    2015-01-01

    Background. Gait characteristics extracted from trunk accelerations during daily life locomotion are complementary to questionnaire- or laboratory-based gait and balance assessments and may help to improve fall risk prediction. Objective. The aim of this study was to identify gait characteristics that are associated with self-reported fall history and that can be reliably assessed based on ambulatory data collected during a single week. Methods. We analyzed 2 weeks of trunk acceleration data (DynaPort MoveMonitor, McRoberts) collected among 113 older adults (age range, 65-97 years). During episodes of locomotion, various gait characteristics were determined, including local dynamic stability, interstride variability, and several spectral features. For each characteristic, we performed a negative binomial regression analysis with the participants' self-reported number of falls in the preceding year as outcome. Reliability of gait characteristics was assessed in terms of intraclass correlations between both measurement weeks. Results. The percentages of spectral power below 0.7 Hz along the vertical and anteroposterior axes and below 10 Hz along the mediolateral axis, as well as local dynamic stability, local dynamic stability per stride, gait smoothness, and the amplitude and slope of the dominant frequency along the vertical axis, were associated with the number of falls in the preceding year and could be reliably assessed (all P 0.75). Conclusions. Daily life gait characteristics are associated with fall history in older adults and can be reliably estimated from a week of ambulatory trunk acceleration measurements. © The Author(s) 2014.

  19. The Development of DNA Based Methods for the Reliable and Efficient Identification of Nicotiana tabacum in Tobacco and Its Derived Products

    Directory of Open Access Journals (Sweden)

    Sukumar Biswas

    2016-01-01

    Full Text Available Reliable methods are needed to detect the presence of tobacco components in tobacco products to effectively control smuggling and classify tariff and excise in tobacco industry to control illegal tobacco trade. In this study, two sensitive and specific DNA based methods, one quantitative real-time PCR (qPCR assay and the other loop-mediated isothermal amplification (LAMP assay, were developed for the reliable and efficient detection of the presence of tobacco (Nicotiana tabacum in various tobacco samples and commodities. Both assays targeted the same sequence of the uridine 5′-monophosphate synthase (UMPS, and their specificities and sensitivities were determined with various plant materials. Both qPCR and LAMP methods were reliable and accurate in the rapid detection of tobacco components in various practical samples, including customs samples, reconstituted tobacco samples, and locally purchased cigarettes, showing high potential for their application in tobacco identification, particularly in the special cases where the morphology or chemical compositions of tobacco have been disrupted. Therefore, combining both methods would facilitate not only the detection of tobacco smuggling control, but also the detection of tariff classification and of excise.

  20. Validity, reliability, and reproducibility of linear measurements on digital models obtained from intraoral and cone-beam computed tomography scans of alginate impressions

    NARCIS (Netherlands)

    Wiranto, Matthew G.; Engelbrecht, W. Petrie; Nolthenius, Heleen E. Tutein; van der Meer, W. Joerd; Ren, Yijin

    INTRODUCTION: Digital 3-dimensional models are widely used for orthodontic diagnosis. The aim of this study was to assess the validity, reliability, and reproducibility of digital models obtained from the Lava Chairside Oral scanner (3M ESPE, Seefeld, Germany) and cone-beam computed tomography scans

  1. Development of the method of aggregation to determine the current storage area using computer vision and radiofrequency identification

    Science.gov (United States)

    Astafiev, A.; Orlov, A.; Privezencev, D.

    2018-01-01

    The article is devoted to the development of technology and software for the construction of positioning and control systems in industrial plants based on aggregation to determine the current storage area using computer vision and radiofrequency identification. It describes the developed of the project of hardware for industrial products positioning system in the territory of a plant on the basis of radio-frequency grid. It describes the development of the project of hardware for industrial products positioning system in the plant on the basis of computer vision methods. It describes the development of the method of aggregation to determine the current storage area using computer vision and radiofrequency identification. Experimental studies in laboratory and production conditions have been conducted and described in the article.

  2. Rapid and reliable MALDI-TOF mass spectrometry identification of Candida non-albicans isolates from bloodstream infections.

    Science.gov (United States)

    Pulcrano, Giovanna; Iula, Dora Vita; Vollaro, Antonio; Tucci, Alessandra; Cerullo, Monica; Esposito, Matilde; Rossano, Fabio; Catania, Maria Rosaria

    2013-09-01

    Matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS) fingerprinting has recently become an effective instrument for rapid microbiological diagnostics and in particular for identification of micro-organisms directly in a positive blood culture. The aim of the study was to evaluate a collection of 82 stored yeast isolates from bloodstream infection, by MALDI-TOF MS; 21 isolates were identified also directly from positive blood cultures and in the presence of other co-infecting micro-organisms. Of the 82 isolates grown on plates, 64 (76%) were correctly identified by the Vitek II system and 82 (100%) by MALDI-TOF MS; when the two methods gave different results, the isolate was identified by PCR. MALDI-TOF MS was unreliable in identifying two isolates (Candida glabrata and Candida parapsilosis) directly from blood culture; however, direct analysis from positive blood culture samples was fast and effective for the identification of yeast, which is of great importance for early and adequate treatment. © 2013. Published by Elsevier B.V. All rights reserved.

  3. Zoonotic onchocerciasis in Hiroshima, Japan, and molecular analysis of a paraffin section of the agent for a reliable identification

    Directory of Open Access Journals (Sweden)

    Fukuda M.

    2011-05-01

    Full Text Available Japan is a country of high specific diversity of Onchocerca with eight species, the adults of two not yet known. Onchocerca dewittei japonica, a common filarial parasite of wild boar, had been proved to be the agent of five zoonotic onchocerciasis in Kyushu island with morphological and molecular studies. The sixth case, at Hiroshima in the main island, was identified to the same Onchocerca species, based on adult characters observed on histological sections. To consolidate the identification, mitochondrial cytochrome c oxidase subunit 1 (CO1 gene analysis was attempted with the formalin-fixed, paraffin-embedded parasite specimen. The sequence (196 bp of a CO1 gene fragment of the parasite successfully PCR-amplified agreed well with those of O. dewittei japonica registered in GenBank, confirming the morphological identification. Moreover a comparison with the CO1 gene sequences of six other Onchocerca species in GenBank excluded the possibility that Onchocerca sp. from wild boar and Onchocerca sp. type A from cattle in Japan, were the causative agents in this case. Mitochondrial DNA analysis proved to be a valuable tool to support the morphological method for the discrimination of zoonotic Onchocerca species in a histological specimen.

  4. Identification of conductive hearing loss using air conduction tests alone: reliability and validity of an automatic test battery.

    Science.gov (United States)

    Convery, Elizabeth; Keidser, Gitte; Seeto, Mark; Freeston, Katrina; Zhou, Dan; Dillon, Harvey

    2014-01-01

    The primary objective of this study was to determine whether a combination of automatically administered pure-tone audiometry and a tone-in-noise detection task, both delivered via an air conduction (AC) pathway, could reliably and validly predict the presence of a conductive component to the hearing loss. The authors hypothesized that performance on the battery of tests would vary according to hearing loss type. A secondary objective was to evaluate the reliability and validity of a novel automatic audiometry algorithm to assess its suitability for inclusion in the test battery. Participants underwent a series of hearing assessments that were conducted in a randomized order: manual pure-tone air conduction audiometry and bone conduction audiometry; automatic pure-tone air conduction audiometry; and an automatic tone-in-noise detection task. The automatic tests were each administered twice. The ability of the automatic test battery to: (a) predict the presence of an air-bone gap (ABG); and (b) accurately measure AC hearing thresholds was assessed against the results of manual audiometry. Test-retest conditions were compared to determine the reliability of each component of the automatic test battery. Data were collected on 120 ears from normal-hearing and conductive, sensorineural, and mixed hearing-loss subgroups. Performance differences between different types of hearing loss were observed. Ears with a conductive component (conductive and mixed ears) tended to have normal signal to noise ratios (SNR) despite impaired thresholds in quiet, while ears without a conductive component (normal and sensorineural ears) demonstrated, on average, an increasing relationship between their thresholds in quiet and their achieved SNR. Using the relationship between these two measures among ears with no conductive component as a benchmark, the likelihood that an ear has a conductive component can be estimated based on the deviation from this benchmark. The sensitivity and

  5. Identification of tasks of maintenance centered in the reliability; Identificacion de tareas de mantenimiento centrado en la confiabilidad

    Energy Technology Data Exchange (ETDEWEB)

    Torres V, A.; Rivero O, J.J. [Dpto. Ingenieria Nuclear, Instituto Superior de Tecnologias y Ciencias Aplicadas, Ave. Salvador Allende y Luaces, Quinta de los Molinos, Plaza, Ciudad Habana (Cuba)]. e-mail: atorres@fctn.isctn.edu.cu

    2004-07-01

    The methodology of Reliability Centered Maintenance (RCM) it has become, after the discovery of their advantages, an objective of many industrial facilities to optimize their maintenance. However, diverse subjective factors affect the determination of the parameters (technical of predictive to apply and times among interventions) that characterize the tasks of RCM. A method to determine the monitoring tasks at condition and the times more recommended for to apply the monitoring by time and the search of faults, with focus in system. This methodology has been computerized inside the code MOSEG Win Ver 1.0. The same has been applied with success to the determination of tasks of RCM in industrial objectives. (Author)

  6. Computational analyses of spectral trees from electrospray multi-stage mass spectrometry to aid metabolite identification.

    Science.gov (United States)

    Cao, Mingshu; Fraser, Karl; Rasmussen, Susanne

    2013-10-31

    Mass spectrometry coupled with chromatography has become the major technical platform in metabolomics. Aided by peak detection algorithms, the detected signals are characterized by mass-over-charge ratio (m/z) and retention time. Chemical identities often remain elusive for the majority of the signals. Multi-stage mass spectrometry based on electrospray ionization (ESI) allows collision-induced dissociation (CID) fragmentation of selected precursor ions. These fragment ions can assist in structural inference for metabolites of low molecular weight. Computational investigations of fragmentation spectra have increasingly received attention in metabolomics and various public databases house such data. We have developed an R package "iontree" that can capture, store and analyze MS2 and MS3 mass spectral data from high throughput metabolomics experiments. The package includes functions for ion tree construction, an algorithm (distMS2) for MS2 spectral comparison, and tools for building platform-independent ion tree (MS2/MS3) libraries. We have demonstrated the utilization of the package for the systematic analysis and annotation of fragmentation spectra collected in various metabolomics platforms, including direct infusion mass spectrometry, and liquid chromatography coupled with either low resolution or high resolution mass spectrometry. Assisted by the developed computational tools, we have demonstrated that spectral trees can provide informative evidence complementary to retention time and accurate mass to aid with annotating unknown peaks. These experimental spectral trees once subjected to a quality control process, can be used for querying public MS2 databases or de novo interpretation. The putatively annotated spectral trees can be readily incorporated into reference libraries for routine identification of metabolites.

  7. SIMON. A computer program for reliability and statistical analysis using Monte Carlo simulation. Program description and manual

    International Nuclear Information System (INIS)

    Kongsoe, H.E.; Lauridsen, K.

    1993-09-01

    SIMON is a program for calculation of reliability and statistical analysis. The program is of the Monte Carlo type, and it is designed with high flexibility, and has a large potential for application to complex problems like reliability analyses of very large systems and of systems, where complex modelling or knowledge of special details are required. Examples of application of the program, including input and output, for reliability and statistical analysis are presented. (au) (3 tabs., 3 ills., 5 refs.)

  8. Computed tomographic angiography criteria in the diagnosis of brain death - comparison of sensitivity and interobserver reliability of different evaluation scales

    International Nuclear Information System (INIS)

    Sawicki, Marcin; Walecka, A.; Bohatyrewicz, R.; Solek-Pastuszka, J.; Safranow, K.; Walecki, J.; Rowinski, O.; Czajkowski, Z.; Guzinski, M.; Burzynska, M.; Wojczal, J.

    2014-01-01

    The standardized diagnostic criteria for computed tomographic angiography (CTA) in diagnosis of brain death (BD) are not yet established. The aim of the study was to compare the sensitivity and interobserver agreement of the three previously used scales of CTA for the diagnosis of BD. Eighty-two clinically brain-dead patients underwent CTA with a delay of 40 s after contrast injection. Catheter angiography was used as the reference standard. CTA results were assessed by two radiologists, and the diagnosis of BD was established according to 10-, 7-, and 4-point scales. Catheter angiography confirmed the diagnosis of BD in all cases. Opacification of certain cerebral vessels as indicator of BD was highly sensitive: cortical segments of the middle cerebral artery (96.3 %), the internal cerebral vein (98.8 %), and the great cerebral vein (98.8 %). Other vessels were less sensitive: the pericallosal artery (74.4 %), cortical segments of the posterior cerebral artery (79.3 %), and the basilar artery (82.9 %). The sensitivities of the 10-, 7-, and 4-point scales were 67.1, 74.4, and 96.3 %, respectively (p < 0.001). Percentage interobserver agreement in diagnosis of BD reached 93 % for the 10-point scale, 89 % for the 7-point scale, and 95 % for the 4-point scale (p = 0.37). In the application of CTA to the diagnosis of BD, reducing the assessment of vascular opacification scale from a 10- to a 4-point scale significantly increases the sensitivity and maintains high interobserver reliability. (orig.)

  9. Computed tomographic angiography criteria in the diagnosis of brain death - comparison of sensitivity and interobserver reliability of different evaluation scales

    Energy Technology Data Exchange (ETDEWEB)

    Sawicki, Marcin; Walecka, A. [Pomeranian Medical University, Department of Diagnostic Imaging and Interventional Radiology, Szczecin (Poland); Bohatyrewicz, R.; Solek-Pastuszka, J. [Pomeranian Medical University, Clinic of Anesthesiology and Intensive Care, Szczecin (Poland); Safranow, K. [Pomeranian Medical University, Department of Biochemistry and Medical Chemistry, Szczecin (Poland); Walecki, J. [The Centre of Postgraduate Medical Education, Warsaw (Poland); Rowinski, O. [Medical University of Warsaw, 2nd Department of Clinical Radiology, Warsaw (Poland); Czajkowski, Z. [Regional Joint Hospital, Szczecin (Poland); Guzinski, M. [Wroclaw Medical University, Department of General Radiology, Interventional Radiology and Neuroradiology, Wroclaw (Poland); Burzynska, M. [Wroclaw Medical University, Department of Anesthesiology and Intensive Therapy, Wroclaw (Poland); Wojczal, J. [Medical University of Lublin, Department of Neurology, Lublin (Poland)

    2014-08-15

    The standardized diagnostic criteria for computed tomographic angiography (CTA) in diagnosis of brain death (BD) are not yet established. The aim of the study was to compare the sensitivity and interobserver agreement of the three previously used scales of CTA for the diagnosis of BD. Eighty-two clinically brain-dead patients underwent CTA with a delay of 40 s after contrast injection. Catheter angiography was used as the reference standard. CTA results were assessed by two radiologists, and the diagnosis of BD was established according to 10-, 7-, and 4-point scales. Catheter angiography confirmed the diagnosis of BD in all cases. Opacification of certain cerebral vessels as indicator of BD was highly sensitive: cortical segments of the middle cerebral artery (96.3 %), the internal cerebral vein (98.8 %), and the great cerebral vein (98.8 %). Other vessels were less sensitive: the pericallosal artery (74.4 %), cortical segments of the posterior cerebral artery (79.3 %), and the basilar artery (82.9 %). The sensitivities of the 10-, 7-, and 4-point scales were 67.1, 74.4, and 96.3 %, respectively (p < 0.001). Percentage interobserver agreement in diagnosis of BD reached 93 % for the 10-point scale, 89 % for the 7-point scale, and 95 % for the 4-point scale (p = 0.37). In the application of CTA to the diagnosis of BD, reducing the assessment of vascular opacification scale from a 10- to a 4-point scale significantly increases the sensitivity and maintains high interobserver reliability. (orig.)

  10. Reliability of implant placement with stereolithographic surgical guides generated from computed tomography: clinical data from 94 implants.

    Science.gov (United States)

    Ersoy, Ahmet Ersan; Turkyilmaz, Ilser; Ozan, Oguz; McGlumphy, Edwin A

    2008-08-01

    Dental implant placement requires precise planning with regard to anatomic limitations and restorative goals. The aim of this study was to evaluate the match between the positions and axes of the planned and placed implants using stereolithographic (SLA) surgical guides. Ninety-four implants were placed using SLA surgical guides generated from computed tomography (CT) between 2005 and 2006. Radiographic templates were used for all subjects during CT imaging. After obtaining three-dimensional CT images, each implant was virtually placed on the CT images. SLA surgical guides, fabricated using an SLA machine with a laser beam to polymerize the liquid photo-polymerized resin, were used during implant placement. A new CT scan was taken for each subject following implant placement. Special software was used to fuse the images of the planned and placed implants, and the locations and axes were compared. Compared to the planned implants, the placed implants showed angular deviation of 4.9 degrees+/-2.36 degrees, whereas the mean linear deviation was 1.22+/-0.85 mm at the implant neck and 1.51+/-1 mm at the implant apex. Compared to the implant planning, the angular deviation and linear deviation at the neck and apex of the placed maxillary implants were 5.31 degrees+/-0.36 degrees, 1.04+/-0.56 mm, and 1.57+/-0.97 mm, respectively, whereas corresponding figures for placed mandibular implants were 4.44 degrees+/-0.31 degrees, 1.42+/-1.05 mm, and 1.44+/-1.03 mm, respectively. SLA surgical guides using CT data may be reliable in implant placement and make flapless implant placement possible.

  11. Computer Identification of Symptomatic Deep Venous Thrombosis Associated with Peripherally Inserted Central Catheters

    Science.gov (United States)

    Evans, R. Scott; Linford, Lorraine H.; Sharp, Jamie H.; White, Gayle; Lloyd, James F.; Weaver, Lindell K.

    2007-01-01

    Peripherally inserted central catheters (PICCs) are considered a safe method to provide long-term antibiotic therapy, chemotherapy and nutrition support. Deep venous thrombosis (DVT) is a complication that requires early PICC removal, may extend hospitalization and can result in pulmonary embolism. PICC insertion teams strive to understand risk factors and develop methods to prevent DVTs. However, they can only manage what they can measure. At LDS Hospital, identification of PICC associated DVTs was dependent on verbal notification or manual surveillance of more than a thousand free-text vascular reports. Accurate DVT rates were not known which hindered prevention. We describe the development of a computer application (PICC-DVT monitor) to identify PICC associated DVTs each day. A one-year evaluation of the monitor by the PICC team and a review of 445 random vascular reports found a positive predictive value of 98%, sensitivity of 94%, specificity of 100% and a PICC team associated DVT rate of 2.8%. PMID:18693831

  12. Computer-aided identification of polymorphism sets diagnostic for groups of bacterial and viral genetic variants

    Directory of Open Access Journals (Sweden)

    Huygens Flavia

    2007-08-01

    Full Text Available Abstract Background Single nucleotide polymorphisms (SNPs and genes that exhibit presence/absence variation have provided informative marker sets for bacterial and viral genotyping. Identification of marker sets optimised for these purposes has been based on maximal generalized discriminatory power as measured by Simpson's Index of Diversity, or on the ability to identify specific variants. Here we describe the Not-N algorithm, which is designed to identify small sets of genetic markers diagnostic for user-specified subsets of known genetic variants. The algorithm does not treat the user-specified subset and the remaining genetic variants equally. Rather Not-N analysis is designed to underpin assays that provide 0% false negatives, which is very important for e.g. diagnostic procedures for clinically significant subgroups within microbial species. Results The Not-N algorithm has been incorporated into the "Minimum SNPs" computer program and used to derive genetic markers diagnostic for multilocus sequence typing-defined clonal complexes, hepatitis C virus (HCV subtypes, and phylogenetic clades defined by comparative genome hybridization (CGH data for Campylobacter jejuni, Yersinia enterocolitica and Clostridium difficile. Conclusion Not-N analysis is effective for identifying small sets of genetic markers diagnostic for microbial sub-groups. The best results to date have been obtained with CGH data from several bacterial species, and HCV sequence data.

  13. Validation of DNA-based identification software by computation of pedigree likelihood ratios.

    Science.gov (United States)

    Slooten, K

    2011-08-01

    Disaster victim identification (DVI) can be aided by DNA-evidence, by comparing the DNA-profiles of unidentified individuals with those of surviving relatives. The DNA-evidence is used optimally when such a comparison is done by calculating the appropriate likelihood ratios. Though conceptually simple, the calculations can be quite involved, especially with large pedigrees, precise mutation models etc. In this article we describe a series of test cases designed to check if software designed to calculate such likelihood ratios computes them correctly. The cases include both simple and more complicated pedigrees, among which inbred ones. We show how to calculate the likelihood ratio numerically and algebraically, including a general mutation model and possibility of allelic dropout. In Appendix A we show how to derive such algebraic expressions mathematically. We have set up these cases to validate new software, called Bonaparte, which performs pedigree likelihood ratio calculations in a DVI context. Bonaparte has been developed by SNN Nijmegen (The Netherlands) for the Netherlands Forensic Institute (NFI). It is available free of charge for non-commercial purposes (see www.dnadvi.nl for details). Commercial licenses can also be obtained. The software uses Bayesian networks and the junction tree algorithm to perform its calculations. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  14. Early detection and identification of anomalies in chemical regime based on computational intelligence techniques

    International Nuclear Information System (INIS)

    Figedy, Stefan; Smiesko, Ivan

    2012-01-01

    This article provides brief information about the fundamental features of a newly-developed diagnostic system for early detection and identification of anomalies being generated in water chemistry regime of the primary and secondary circuit of the VVER-440 reactor. This system, which is called SACHER (System of Analysis of CHEmical Regime), was installed within the major modernization project at the NPP-V2 Bohunice in the Slovak Republic. The SACHER system has been fully developed on MATLAB environment. It is based on computational intelligence techniques and inserts various elements of intelligent data processing modules for clustering, diagnosing, future prediction, signal validation, etc, into the overall chemical information system. The application of SACHER would essentially assist chemists to identify the current situation regarding anomalies being generated in the primary and secondary circuit water chemistry. This system is to be used for diagnostics and data handling, however it is not intended to fully replace the presence of experienced chemists to decide upon corrective actions. (author)

  15. Computational Identification of Potential Multi-drug Combinations for Reduction of Microglial Inflammation in Alzheimer Disease

    Directory of Open Access Journals (Sweden)

    Thomas J. Anastasio

    2015-06-01

    Full Text Available Like other neurodegenerative diseases, Alzheimer Disease (AD has a prominent inflammatory component mediated by brain microglia. Reducing microglial inflammation could potentially halt or at least slow the neurodegenerative process. A major challenge in the development of treatments targeting brain inflammation is the sheer complexity of the molecular mechanisms that determine whether microglia become inflammatory or take on a more neuroprotective phenotype. The process is highly multifactorial, raising the possibility that a multi-target/multi-drug strategy could be more effective than conventional monotherapy. This study takes a computational approach in finding combinations of approved drugs that are potentially more effective than single drugs in reducing microglial inflammation in AD. This novel approach exploits the distinct advantages of two different computer programming languages, one imperative and the other declarative. Existing programs written in both languages implement the same model of microglial behavior, and the input/output relationships of both programs agree with each other and with data on microglia over an extensive test battery. Here the imperative program is used efficiently to screen the model for the most efficacious combinations of 10 drugs, while the declarative program is used to analyze in detail the mechanisms of action of the most efficacious combinations. Of the 1024 possible drug combinations, the simulated screen identifies only 7 that are able to move simulated microglia at least 50% of the way from a neurotoxic to a neuroprotective phenotype. Subsequent analysis shows that of the 7 most efficacious combinations, 2 stand out as superior both in strength and reliability. The model offers many experimentally testable and therapeutically relevant predictions concerning effective drug combinations and their mechanisms of action.

  16. Computational identification of potential multi-drug combinations for reduction of microglial inflammation in Alzheimer disease.

    Science.gov (United States)

    Anastasio, Thomas J

    2015-01-01

    Like other neurodegenerative diseases, Alzheimer Disease (AD) has a prominent inflammatory component mediated by brain microglia. Reducing microglial inflammation could potentially halt or at least slow the neurodegenerative process. A major challenge in the development of treatments targeting brain inflammation is the sheer complexity of the molecular mechanisms that determine whether microglia become inflammatory or take on a more neuroprotective phenotype. The process is highly multifactorial, raising the possibility that a multi-target/multi-drug strategy could be more effective than conventional monotherapy. This study takes a computational approach in finding combinations of approved drugs that are potentially more effective than single drugs in reducing microglial inflammation in AD. This novel approach exploits the distinct advantages of two different computer programming languages, one imperative and the other declarative. Existing programs written in both languages implement the same model of microglial behavior, and the input/output relationships of both programs agree with each other and with data on microglia over an extensive test battery. Here the imperative program is used efficiently to screen the model for the most efficacious combinations of 10 drugs, while the declarative program is used to analyze in detail the mechanisms of action of the most efficacious combinations. Of the 1024 possible drug combinations, the simulated screen identifies only 7 that are able to move simulated microglia at least 50% of the way from a neurotoxic to a neuroprotective phenotype. Subsequent analysis shows that of the 7 most efficacious combinations, 2 stand out as superior both in strength and reliability. The model offers many experimentally testable and therapeutically relevant predictions concerning effective drug combinations and their mechanisms of action.

  17. Identification of Reliable Reference Genes for Quantification of MicroRNAs in Serum Samples of Sulfur Mustard-Exposed Veterans.

    Science.gov (United States)

    Gharbi, Sedigheh; Shamsara, Mehdi; Khateri, Shahriar; Soroush, Mohammad Reza; Ghorbanmehr, Nassim; Tavallaei, Mahmood; Nourani, Mohammad Reza; Mowla, Seyed Javad

    2015-01-01

    In spite of accumulating information about pathological aspects of sulfur mustard (SM), the precise mechanism responsible for its effects is not well understood. Circulating microRNAs (miRNAs) are promising biomarkers for disease diagnosis and prognosis. Accurate normalization using appropriate reference genes, is a critical step in miRNA expression studies. In this study, we aimed to identify appropriate reference gene for microRNA quantification in serum samples of SM victims. In this case and control experimental study, using quantitative real-time polymerase chain reaction (qRT-PCR), we evaluated the suitability of a panel of small RNAs including SNORD38B, SNORD49A, U6, 5S rRNA, miR-423-3p, miR-191, miR-16 and miR-103 in sera of 28 SM-exposed veterans of Iran-Iraq war (1980-1988) and 15 matched control volunteers. Different statistical algorithms including geNorm, Normfinder, best-keeper and comparative delta-quantification cycle (Cq) method were employed to find the least variable reference gene. miR-423-3p was identified as the most stably expressed reference gene, and miR- 103 and miR-16 ranked after that. We demonstrate that non-miRNA reference genes have the least stabil- ity in serum samples and that some house-keeping miRNAs may be used as more reliable reference genes for miRNAs in serum. In addition, using the geometric mean of two reference genes could increase the reliability of the normalizers.

  18. Reliability and validity of the revised Gibson Test of Cognitive Skills, a computer-based test battery for assessing cognition across the lifespan

    Directory of Open Access Journals (Sweden)

    Moore AL

    2018-02-01

    Full Text Available Amy Lawson Moore, Terissa M Miller Gibson Institute of Cognitive Research, Colorado Springs, CO, USA Purpose: The purpose of the current study is to evaluate the validity and reliability of the revised Gibson Test of Cognitive Skills, a computer-based battery of tests measuring short-term memory, long-term memory, processing speed, logic and reasoning, visual processing, as well as auditory processing and word attack skills.Methods: This study included 2,737 participants aged 5–85 years. A series of studies was conducted to examine the validity and reliability using the test performance of the entire norming group and several subgroups. The evaluation of the technical properties of the test battery included content validation by subject matter experts, item analysis and coefficient alpha, test–retest reliability, split-half reliability, and analysis of concurrent validity with the Woodcock Johnson III Tests of Cognitive Abilities and Tests of Achievement.Results: Results indicated strong sources of evidence of validity and reliability for the test, including internal consistency reliability coefficients ranging from 0.87 to 0.98, test–retest reliability coefficients ranging from 0.69 to 0.91, split-half reliability coefficients ranging from 0.87 to 0.91, and concurrent validity coefficients ranging from 0.53 to 0.93.Conclusion: The Gibson Test of Cognitive Skills-2 is a reliable and valid tool for assessing cognition in the general population across the lifespan. Keywords: testing, cognitive skills, memory, processing speed, visual processing, auditory processing

  19. Computational Prediction of Human Salivary Proteins from Blood Circulation and Application to Diagnostic Biomarker Identification

    Science.gov (United States)

    Wang, Jiaxin; Liang, Yanchun; Wang, Yan; Cui, Juan; Liu, Ming; Du, Wei; Xu, Ying

    2013-01-01

    Proteins can move from blood circulation into salivary glands through active transportation, passive diffusion or ultrafiltration, some of which are then released into saliva and hence can potentially serve as biomarkers for diseases if accurately identified. We present a novel computational method for predicting salivary proteins that come from circulation. The basis for the prediction is a set of physiochemical and sequence features we found to be discerning between human proteins known to be movable from circulation to saliva and proteins deemed to be not in saliva. A classifier was trained based on these features using a support-vector machine to predict protein secretion into saliva. The classifier achieved 88.56% average recall and 90.76% average precision in 10-fold cross-validation on the training data, indicating that the selected features are informative. Considering the possibility that our negative training data may not be highly reliable (i.e., proteins predicted to be not in saliva), we have also trained a ranking method, aiming to rank the known salivary proteins from circulation as the highest among the proteins in the general background, based on the same features. This prediction capability can be used to predict potential biomarker proteins for specific human diseases when coupled with the information of differentially expressed proteins in diseased versus healthy control tissues and a prediction capability for blood-secretory proteins. Using such integrated information, we predicted 31 candidate biomarker proteins in saliva for breast cancer. PMID:24324552

  20. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)

  1. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book start what is reliability? such as origin of reliability problems, definition of reliability and reliability and use of reliability. It also deals with probability and calculation of reliability, reliability function and failure rate, probability distribution of reliability, assumption of MTBF, process of probability distribution, down time, maintainability and availability, break down maintenance and preventive maintenance design of reliability, design of reliability for prediction and statistics, reliability test, reliability data and design and management of reliability.

  2. Fast Metabolite Identification in Nuclear Magnetic Resonance Metabolomic Studies: Statistical Peak Sorting and Peak Overlap Detection for More Reliable Database Queries.

    Science.gov (United States)

    Hoijemberg, Pablo A; Pelczer, István

    2018-01-05

    A lot of time is spent by researchers in the identification of metabolites in NMR-based metabolomic studies. The usual metabolite identification starts employing public or commercial databases to match chemical shifts thought to belong to a given compound. Statistical total correlation spectroscopy (STOCSY), in use for more than a decade, speeds the process by finding statistical correlations among peaks, being able to create a better peak list as input for the database query. However, the (normally not automated) analysis becomes challenging due to the intrinsic issue of peak overlap, where correlations of more than one compound appear in the STOCSY trace. Here we present a fully automated methodology that analyzes all STOCSY traces at once (every peak is chosen as driver peak) and overcomes the peak overlap obstacle. Peak overlap detection by clustering analysis and sorting of traces (POD-CAST) first creates an overlap matrix from the STOCSY traces, then clusters the overlap traces based on their similarity and finally calculates a cumulative overlap index (COI) to account for both strong and intermediate correlations. This information is gathered in one plot to help the user identify the groups of peaks that would belong to a single molecule and perform a more reliable database query. The simultaneous examination of all traces reduces the time of analysis, compared to viewing STOCSY traces by pairs or small groups, and condenses the redundant information in the 2D STOCSY matrix into bands containing similar traces. The COI helps in the detection of overlapping peaks, which can be added to the peak list from another cross-correlated band. POD-CAST overcomes the generally overlooked and underestimated presence of overlapping peaks and it detects them to include them in the search of all compounds contributing to the peak overlap, enabling the user to accelerate the metabolite identification process with more successful database queries and searching all tentative

  3. Major influence of interobserver reliability on polytrauma identification with the Injury Severity Score (ISS): Time for a centralised coding in trauma registries?

    Science.gov (United States)

    Maduz, Roman; Kugelmeier, Patrick; Meili, Severin; Döring, Robert; Meier, Christoph; Wahl, Peter

    2017-04-01

    The Abbreviated Injury Scale (AIS) and the Injury Severity Score (ISS) find increasingly widespread use to assess trauma burden and to perform interhospital benchmarking through trauma registries. Since 2015, public resource allocation in Switzerland shall even be derived from such data. As every trauma centre is responsible for its own coding and data input, this study aims at evaluating interobserver reliability of AIS and ISS coding. Interobserver reliability of the AIS and ISS is analysed from a cohort of 50 consecutive severely injured patients treated in 2012 at our institution, coded retrospectively by 3 independent and specifically trained observers. Considering a cutoff ISS≥16, only 38/50 patients (76%) were uniformly identified as polytraumatised or not. Increasing the cut off to ≥20, this increased to 41/50 patients (82%). A difference in the AIS of ≥ 1 was present in 261 (16%) of possible codes. Excluding the vast majority of uninjured body regions, uniformly identical AIS severity values were attributed in 67/193 (35%) body regions, or 318/579 (55%) possible observer pairings. Injury severity all too often is neither identified correctly nor consistently when using the AIS. This leads to wrong identification of severely injured patients using the ISS. Improving consistency of coding through centralisation is recommended before scores based on the AIS are to be used for interhospital benchmarking and resource allocation in the treatment of severely injured patients. Copyright © 2017. Published by Elsevier Ltd.

  4. RELIABLE IDENTIFICATIONS OF ACTIVE GALACTIC NUCLEI FROM THE WISE, 2MASS, AND ROSAT ALL-SKY SURVEYS

    Energy Technology Data Exchange (ETDEWEB)

    Edelson, R. [Department of Astronomy, University of Maryland, College Park, MD 20742-2421 (United States); Malkan, M., E-mail: rickedelson@gmail.com [Department of Physics and Astronomy, University of California Los Angeles, Los Angeles, CA 90095-1547 (United States)

    2012-05-20

    We have developed the ''S{sub IX}'' statistic to identify bright, highly likely active galactic nucleus (AGN) candidates solely on the basis of Wide-field Infrared Survey Explorer (WISE), Two Micron All-Sky Survey (2MASS), and ROSAT all-sky survey (RASS) data. This statistic was optimized with data from the preliminary WISE survey and the Sloan Digital Sky Survey, and tested with Lick 3 m Kast spectroscopy. We find that sources with S{sub IX} < 0 have a {approx}>95% likelihood of being an AGN (defined in this paper as a Seyfert 1, quasar, or blazar). This statistic was then applied to the full WISE/2MASS/RASS dataset, including the final WISE data release, to yield the ''W2R'' sample of 4316 sources with S{sub IX} < 0. Only 2209 of these sources are currently in the Veron-Cetty and Veron (VCV) catalog of spectroscopically confirmed AGNs, indicating that the W2R sample contains nearly 2000 new, relatively bright (J {approx}< 16) AGNs. We utilize the W2R sample to quantify biases and incompleteness in the VCV catalog. We find that it is highly complete for bright (J < 14), northern AGNs, but the completeness drops below 50% for fainter, southern samples and for sources near the Galactic plane. This approach also led to the spectroscopic identification of 10 new AGNs in the Kepler field, more than doubling the number of AGNs being monitored by Kepler. The W2R sample contains better than 1 bright AGN every 10 deg{sup 2}, permitting construction of AGN samples in any sufficiently large region of sky.

  5. Identification and Endodontic Management of Middle Mesial Canal in Mandibular Second Molar Using Cone Beam Computed Tomography

    Directory of Open Access Journals (Sweden)

    Bonny Paul

    2015-01-01

    Full Text Available Endodontic treatments are routinely done with the help of radiographs. However, radiographs represent only a two-dimensional image of an object. Failure to identify aberrant anatomy can lead to endodontic failure. This case report presents the use of three-dimensional imaging with cone beam computed tomography (CBCT as an adjunct to digital radiography in identification and management of mandibular second molar with three mesial canals.

  6. Low-bandwidth and non-compute intensive remote identification of microbes from raw sequencing reads.

    Directory of Open Access Journals (Sweden)

    Laurent Gautier

    Full Text Available Cheap DNA sequencing may soon become routine not only for human genomes but also for practically anything requiring the identification of living organisms from their DNA: tracking of infectious agents, control of food products, bioreactors, or environmental samples. We propose a novel general approach to the analysis of sequencing data where a reference genome does not have to be specified. Using a distributed architecture we are able to query a remote server for hints about what the reference might be, transferring a relatively small amount of data. Our system consists of a server with known reference DNA indexed, and a client with raw sequencing reads. The client sends a sample of unidentified reads, and in return receives a list of matching references. Sequences for the references can be retrieved and used for exhaustive computation on the reads, such as alignment. To demonstrate this approach we have implemented a web server, indexing tens of thousands of publicly available genomes and genomic regions from various organisms and returning lists of matching hits from query sequencing reads. We have also implemented two clients: one running in a web browser, and one as a python script. Both are able to handle a large number of sequencing reads and from portable devices (the browser-based running on a tablet, perform its task within seconds, and consume an amount of bandwidth compatible with mobile broadband networks. Such client-server approaches could develop in the future, allowing a fully automated processing of sequencing data and routine instant quality check of sequencing runs from desktop sequencers. A web access is available at http://tapir.cbs.dtu.dk. The source code for a python command-line client, a server, and supplementary data are available at http://bit.ly/1aURxkc.

  7. Low-Bandwidth and Non-Compute Intensive Remote Identification of Microbes from Raw Sequencing Reads

    Science.gov (United States)

    Gautier, Laurent; Lund, Ole

    2013-01-01

    Cheap DNA sequencing may soon become routine not only for human genomes but also for practically anything requiring the identification of living organisms from their DNA: tracking of infectious agents, control of food products, bioreactors, or environmental samples. We propose a novel general approach to the analysis of sequencing data where a reference genome does not have to be specified. Using a distributed architecture we are able to query a remote server for hints about what the reference might be, transferring a relatively small amount of data. Our system consists of a server with known reference DNA indexed, and a client with raw sequencing reads. The client sends a sample of unidentified reads, and in return receives a list of matching references. Sequences for the references can be retrieved and used for exhaustive computation on the reads, such as alignment. To demonstrate this approach we have implemented a web server, indexing tens of thousands of publicly available genomes and genomic regions from various organisms and returning lists of matching hits from query sequencing reads. We have also implemented two clients: one running in a web browser, and one as a python script. Both are able to handle a large number of sequencing reads and from portable devices (the browser-based running on a tablet), perform its task within seconds, and consume an amount of bandwidth compatible with mobile broadband networks. Such client-server approaches could develop in the future, allowing a fully automated processing of sequencing data and routine instant quality check of sequencing runs from desktop sequencers. A web access is available at http://tapir.cbs.dtu.dk. The source code for a python command-line client, a server, and supplementary data are available at http://bit.ly/1aURxkc. PMID:24391826

  8. Genome-wide Studies of Mycolic Acid Bacteria: Computational Identification and Analysis of a Minimal Genome

    KAUST Repository

    Kamanu, Frederick Kinyua

    2012-12-01

    The mycolic acid bacteria are a distinct suprageneric group of asporogenous Grampositive, high GC-content bacteria, distinguished by the presence of mycolic acids in their cell envelope. They exhibit great diversity in their cell and morphology; although primarily non-pathogens, this group contains three major pathogens Mycobacterium leprae, Mycobacterium tuberculosis complex, and Corynebacterium diphtheria. Although the mycolic acid bacteria are a clearly defined group of bacteria, the taxonomic relationships between its constituent genera and species are less well defined. Two approaches were tested for their suitability in describing the taxonomy of the group. First, a Multilocus Sequence Typing (MLST) experiment was assessed and found to be superior to monophyletic (16S small ribosomal subunit) in delineating a total of 52 mycolic acid bacterial species. Phylogenetic inference was performed using the neighbor-joining method. To further refine phylogenetic analysis and to take advantage of the widespread availability of bacterial genome data, a computational framework that simulates DNA-DNA hybridisation was developed and validated using multiscale bootstrap resampling. The tool classifies microbial genomes based on whole genome DNA, and was deployed as a web-application using PHP and Javascript. It is accessible online at http://cbrc.kaust.edu.sa/dna_hybridization/ A third study was a computational and statistical methods in the identification and analysis of a putative minimal mycolic acid bacterial genome so as to better understand (1) the genomic requirements to encode a mycolic acid bacterial cell and (2) the role and type of genes and genetic elements that lead to the massive increase in genome size in environmental mycolic acid bacteria. Using a reciprocal comparison approach, a total of 690 orthologous gene clusters forming a putative minimal genome were identified across 24 mycolic acid bacterial species. In order to identify new potential drug

  9. Using standardized video cases for assessment of medical communication skills: reliability of an objective structured video examination by computer

    NARCIS (Netherlands)

    Hulsman, R. L.; Mollema, E. D.; Oort, F. J.; Hoos, A. M.; de Haes, J. C. J. M.

    2006-01-01

    OBJECTIVE: Using standardized video cases in a computerized objective structured video examination (OSVE) aims to measure cognitive scripts underlying overt communication behavior by questions on knowledge, understanding and performance. In this study the reliability of the OSVE assessment is

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  11. Technical Note: "Mitochondrial and nuclear DNA approaches for reliable identification of Lucilia (Diptera, Calliphoridae) species of forensic interest from Southern Europe".

    Science.gov (United States)

    GilArriortua, Maite; Saloña-Bordas, Marta I; Cainé, Laura M; Pinheiro, Fátima; M de Pancorbo, Marian

    2015-12-01

    In forensic entomology, rapid and unambiguous identification of blowfly species is a critical prerequisite for accurately estimating the post-mortem interval (PMI). The conventional diagnosis of cadaveric entomofauna based on external characters is hampered by the morphological similarities between species, especially in immature stages. Genetic analysis has been shown to allow precise and reliable diagnosis and delimitation of insect species. Nevertheless, the taxonomy of some species remains unresolved. This study was focused on improving the effectiveness and accuracy of analysis based on the widely used cytochrome c oxidase subunit I barcode region (COI barcode, 658 bp), complemented by other mitochondrial and nuclear regions, such as cytochrome b (Cyt-b, 307 bp) and the second internal transcribed spacer (ITS2, 310-331 bp), for the identification of Southern European blowflies. We analyzed a total of 209 specimens, collected from 38 human corpses, belonging to three Calliphoridae genera and seven species: Chrysomya (Ch. albiceps), Calliphora (C. vicina and C. vomitoria), and Lucilia (L. sericata, L. ampullacea, L. caesar and L. illustris). These species are the most common PMI indicators in Portugal. The results revealed that unambiguous separation of species of the Lucilia genus requires different loci from the barcode region. Furthermore, we conclude that the ITS2 (310-331 bp) molecular marker is a promising diagnostic tool because its inter-specific discriminatory power enables unequivocal and consistent distinctions to be made, even between closely related species (L. caesar-L. illustris). This work also contributes new genetic data that may be of interest in performing species diagnosis for Southern European blowflies. Notably, to the best of our knowledge, we provide the first records of the Cyt-b (307 bp) locus for L. illustris and the ITS2 (310-331 bp) region for Iberian Peninsula Lucilia species. Copyright © 2015 Elsevier Ireland Ltd. All rights

  12. Genome-wide identification of specific oligonucleotides using artificial neural network and computational genomic analysis

    Directory of Open Access Journals (Sweden)

    Chen Jiun-Ching

    2007-05-01

    Full Text Available Abstract Background Genome-wide identification of specific oligonucleotides (oligos) is a computationally-intensive task and is a requirement for designing microarray probes, primers, and siRNAs. An artificial neural network (ANN) is a machine learning technique that can effectively process complex and high-noise data. Here, ANNs are applied to process the unique subsequence distribution for prediction of specific oligos. Results We present a novel and efficient algorithm, named the integration of ANN and BLAST (IAB) algorithm, to identify specific oligos. We establish the unique marker database for human and rat gene index databases using the hash table algorithm. We then create the input vectors, via the unique marker database, to train and test the ANN. The trained ANN predicted the specific oligos with high efficiency, and these oligos were subsequently verified by BLAST. To improve the prediction performance, the ANN over-fitting issue was avoided by early stopping with the best observed error and a k-fold validation was also applied. The IAB algorithm was about 5.2, 7.1, and 6.7 times faster than a BLAST search without the ANN for experimental results of 70-mer, 50-mer, and 25-mer specific oligos, respectively. In addition, the results of polymerase chain reactions showed that the primers predicted by the IAB algorithm could specifically amplify the corresponding genes. The IAB algorithm has been integrated into a previously published comprehensive web server to support microarray analysis and genome-wide iterative enrichment analysis, through which users can identify a group of desired genes and then discover the specific oligos of these genes. Conclusion The IAB algorithm has been developed to construct SpecificDB, a web server that provides a specific and valid oligo database of the probe, siRNA, and primer design for the human genome. We also demonstrate the ability of the IAB algorithm to predict specific oligos through
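
    A simplified sketch of the kind of unique-marker bookkeeping described above, assuming the "unique subsequence distribution" of a candidate oligo can be summarized as the fraction of its k-mers that occur exactly once in the indexed gene set; the actual IAB feature encoding, the ANN architecture and training, and the BLAST verification step are not reproduced here:

        # Sketch: build a hash table of k-mer counts over a gene index and turn a
        # candidate oligo into a simple "uniqueness" feature vector. In the published
        # IAB pipeline such a vector would be fed to an ANN and the predicted oligos
        # verified with BLAST; both of those steps are omitted in this sketch.
        from collections import Counter

        def kmer_counts(sequences, k=15):
            """Count every k-mer over all indexed gene sequences (the 'marker database')."""
            counts = Counter()
            for seq in sequences:
                for i in range(len(seq) - k + 1):
                    counts[seq[i:i + k]] += 1
            return counts

        def uniqueness_profile(oligo, counts, k=15, bins=5):
            """Fraction of unique k-mers in successive windows along the oligo."""
            kmers = [oligo[i:i + k] for i in range(len(oligo) - k + 1)]
            per_bin = max(1, len(kmers) // bins)
            profile = []
            for b in range(0, len(kmers), per_bin):
                window = kmers[b:b + per_bin]
                unique = sum(1 for km in window if counts.get(km, 0) == 1)
                profile.append(unique / len(window))
            return profile   # input vector for a downstream classifier

        genes = ["ACGTACGTTAGCGGATCCATGCATGCAAATTTCCGG",
                 "TTGCAACGTACGTTAGCGGATCCTTACGATCGATCG"]
        index = kmer_counts(genes)
        print(uniqueness_profile("ACGTACGTTAGCGGATCCATGCATG", index))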

  13. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    Science.gov (United States)

    2017-08-08

    communicate their subjective opinions. Keywords: Usability Analysis; CAVE™ (Cave Automatic Virtual Environments); Human Computer Interface (HCI...the differences in interaction when compared with traditional human computer interfaces. This paper provides analysis via usability study methods

  14. Man-Computer Symbiosis Through Interactive Graphics: A Survey and Identification of Critical Research Areas.

    Science.gov (United States)

    Knoop, Patricia A.

    The purpose of this report was to determine the research areas that appear most critical to achieving man-computer symbiosis. An operational definition of man-computer symbiosis was developed by: (1) reviewing and summarizing what others have said about it, and (2) attempting to distinguish it from other types of man-computer relationships. From…

  15. Reshaping Computer Literacy Teaching in Higher Education: Identification of Critical Success Factors

    Science.gov (United States)

    Taylor, Estelle; Goede, Roelien; Steyn, Tjaart

    2011-01-01

    Purpose: Acquiring computer skills is more important today than ever before, especially in a developing country. Teaching of computer skills, however, has to adapt to new technology. This paper aims to model factors influencing the success of the learning of computer literacy by means of an e-learning environment. The research question for this…

  16. Computer-assisted radiographic calculation of spinal curvature in brachycephalic "screw-tailed" dog breeds with congenital thoracic vertebral malformations: reliability and clinical evaluation.

    Directory of Open Access Journals (Sweden)

    Julien Guevar

    Full Text Available The objectives of this study were: to investigate computer-assisted digital radiographic measurement of Cobb angles in dogs with congenital thoracic vertebral malformations, to determine its intra- and inter-observer reliability and its association with the presence of neurological deficits. Medical records were reviewed (2009-2013) to identify brachycephalic screw-tailed dog breeds with radiographic studies of the thoracic vertebral column and with at least one vertebral malformation present. Twenty-eight dogs were included in the study. The end vertebrae were defined as the cranial end plate of the vertebra cranial to the malformed vertebra and the caudal end plate of the vertebra caudal to the malformed vertebra. Three observers performed the measurements twice. Intraclass correlation coefficients were used to calculate the intra- and inter-observer reliabilities. The intraclass correlation coefficient was excellent for all intra- and inter-observer measurements using this method. There was a significant difference in the kyphotic Cobb angle between dogs with and without associated neurological deficits. The majority of dogs with neurological deficits had a kyphotic Cobb angle higher than 35°. No significant difference in the scoliotic Cobb angle was observed. We concluded that computer-assisted digital radiographic measurement of the Cobb angle for kyphosis and scoliosis is a valid, reproducible and reliable method to quantify the degree of spinal curvature in brachycephalic screw-tailed dog breeds with congenital thoracic vertebral malformations.
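
    As a rough illustration of the measurement itself (not of the software used in the study), the Cobb angle can be computed as the angle between the two digitized end-plate lines; the landmark coordinates below are made-up values:

        # Sketch: Cobb angle from two digitized end-plate lines.
        # Each end plate is given by two (x, y) points clicked on the radiograph;
        # the Cobb angle is the angle between the two lines (equivalently, between
        # their perpendiculars). Coordinates here are arbitrary example values.
        import math

        def cobb_angle(cranial_endplate, caudal_endplate):
            (x1, y1), (x2, y2) = cranial_endplate
            (x3, y3), (x4, y4) = caudal_endplate
            a1 = math.atan2(y2 - y1, x2 - x1)
            a2 = math.atan2(y4 - y3, x4 - x3)
            angle = abs(math.degrees(a1 - a2)) % 180.0
            return min(angle, 180.0 - angle)     # acute angle between the lines

        # Example: cranial end plate of the vertebra above the malformation and
        # caudal end plate of the vertebra below it (invented coordinates).
        print(round(cobb_angle(((10, 40), (60, 55)), ((12, 120), (62, 90))), 1))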

  17. [Feasibility and acceptance of computer-based assessment for the identification of psychosocially distressed patients in routine clinical care].

    Science.gov (United States)

    Sehlen, Susanne; Ott, Martin; Marten-Mittag, Birgitt; Haimerl, Wolfgang; Dinkel, Andreas; Duehmke, Eckhart; Klein, Christian; Schaefer, Christof; Herschbach, Peter

    2012-07-01

    This study investigated feasibility and acceptance of computer-based assessment for the identification of psychosocial distress in routine radiotherapy care. 155 cancer patients were assessed using QSC-R10, PO-Bado-SF and Mach-9. The congruence between computerized tablet PC and conventional paper assessment was analysed in 50 patients. The agreement between the 2 modes was high (ICC 0.869-0.980). Acceptance of computer-based assessment was very high (>95%). Sex, age, education, distress and Karnofsky performance status (KPS) did not influence acceptance. Computerized assessment was rated more difficult by older patients (p = 0.039) and patients with low KPS (p = 0.020). 75.5% of the respondents supported referral for psycho-social intervention for distressed patients. The prevalence of distress was 27.1% (QSC-R10). Computer-based assessment allows easy identification of distressed patients. Level of staff involvement is low, and the results are quickly available for care providers. © Georg Thieme Verlag KG Stuttgart · New York.

  18. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL), which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way, are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements, to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  19. Identification of High-Risk Plaques Destined to Cause Acute Coronary Syndrome Using Coronary Computed Tomographic Angiography and Computational Fluid Dynamics.

    Science.gov (United States)

    Lee, Joo Myung; Choi, Gilwoo; Koo, Bon-Kwon; Hwang, Doyeon; Park, Jonghanne; Zhang, Jinlong; Kim, Kyung-Jin; Tong, Yaliang; Kim, Hyun Jin; Grady, Leo; Doh, Joon-Hyung; Nam, Chang-Wook; Shin, Eun-Seok; Cho, Young-Seok; Choi, Su-Yeon; Chun, Eun Ju; Choi, Jin-Ho; Nørgaard, Bjarne L; Christiansen, Evald H; Niemen, Koen; Otake, Hiromasa; Penicka, Martin; de Bruyne, Bernard; Kubo, Takashi; Akasaka, Takashi; Narula, Jagat; Douglas, Pamela S; Taylor, Charles A; Kim, Hyo-Soo

    2018-03-14

    We investigated the utility of noninvasive hemodynamic assessment in the identification of high-risk plaques that caused subsequent acute coronary syndrome (ACS). ACS is a critical event that impacts the prognosis of patients with coronary artery disease. However, the role of hemodynamic factors in the development of ACS is not well-known. Seventy-two patients with clearly documented ACS and available coronary computed tomographic angiography (CTA) acquired between 1 month and 2 years before the development of ACS were included. In 66 culprit and 150 nonculprit lesions as a case-control design, the presence of adverse plaque characteristics (APC) was assessed and hemodynamic parameters (fractional flow reserve derived by coronary computed tomographic angiography [FFRCT], change in FFRCT across the lesion [ΔFFRCT], wall shear stress [WSS], and axial plaque stress) were analyzed using computational fluid dynamics. The best cut-off values for FFRCT, ΔFFRCT, WSS, and axial plaque stress were used to define the presence of adverse hemodynamic characteristics (AHC). The incremental discriminant and reclassification abilities for ACS prediction were compared among 3 models (model 1: percent diameter stenosis [%DS] and lesion length, model 2: model 1 + APC, and model 3: model 2 + AHC). The culprit lesions showed higher %DS (55.5 ± 15.4% vs. 43.1 ± 15.0%) and more adverse hemodynamic parameters, including higher axial plaque stress, than nonculprit lesions (all p values significant). Model 3 showed better discrimination (c-statistic [c-index] 0.789 vs. 0.747; p = 0.014) and reclassification abilities (category-free net reclassification index 0.287; p = 0.047; relative integrated discrimination improvement 0.368; p < 0.001) than model 2. Lesions with both APC and AHC showed significantly higher risk of being the culprit for subsequent ACS than those with no APC/AHC (hazard ratio: 11.75; 95% confidence interval: 2.85 to 48.51; p = 0.001) and than those with either APC or AHC (hazard ratio: 3.22; 95% confidence interval: 1.86 to 5.55; p < 0.001). Noninvasive hemodynamic assessment enhanced

  20. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  2. The Use of Computer-Assisted Identification of ARIMA Time-Series.

    Science.gov (United States)

    Brown, Roger L.

    This study was conducted to determine the effects of using various levels of tutorial statistical software for the tentative identification of nonseasonal ARIMA models, a statistical technique proposed by Box and Jenkins for the interpretation of time-series data. The Box-Jenkins approach is an iterative process encompassing several stages of…

  3. Validation of DNA-based identification software by computation of pedigree likelihood ratios

    NARCIS (Netherlands)

    Slooten, K.

    Disaster victim identification (DVI) can be aided by DNA-evidence, by comparing the DNA-profiles of unidentified individuals with those of surviving relatives. The DNA-evidence is used optimally when such a comparison is done by calculating the appropriate likelihood ratios. Though conceptually

  4. Optimal design method for a digital human–computer interface based on human reliability in a nuclear power plant. Part 3: Optimization method for interface task layout

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Wang, Yiqun; Zhang, Li; Xie, Tian; Li, Min; Peng, Yuyuan; Wu, Daqing; Li, Peiyao; Ma, Congmin; Shen, Mengxu; Wu, Xing; Weng, Mengyun; Wang, Shiwei; Xie, Cen

    2016-01-01

    Highlights: • The authors present an optimization algorithm for interface task layout. • The execution process of the proposed algorithm is depicted. • The performance evaluation adopted a neural network method. • The optimization layouts of an event's interface tasks were obtained by experiments. - Abstract: This is the last in a series of papers describing the optimal design for a digital human–computer interface of a nuclear power plant (NPP) from three different points of view based on human reliability. The purpose of this series is to propose different optimization methods from varying perspectives to decrease human factor events that arise from the defects of a human–computer interface. The present paper mainly addresses the optimization method for effectively laying out interface tasks on different screens. The purpose of this paper is to decrease human errors by reducing the distance that an operator moves among different screens in each operation. In order to resolve the problem, the authors propose an optimization process of interface task layout for the digital human–computer interface of a NPP. To automatically lay out each interface task on one of the screens in each operation, the paper presents a shortest-moving-path optimization algorithm with a dynamic flag based on human reliability. To test the algorithm's performance, the evaluation method uses a neural network based on human reliability. The lower the human error probabilities are, the better the interface task layouts among different screens are. Thus, by analyzing the performance of each interface task layout, the optimization result is obtained. Finally, the optimization layouts of spurious safety injection event interface tasks of the NPP are obtained by an experiment; the proposed method has good accuracy and stability.

  5. Interactions among biotic and abiotic factors affect the reliability of tungsten microneedles puncturing in vitro and in vivo peripheral nerves: A hybrid computational approach

    Energy Technology Data Exchange (ETDEWEB)

    Sergi, Pier Nicola, E-mail: p.sergi@sssup.it [Translational Neural Engineering Laboratory, The Biorobotics Institute, Scuola Superiore Sant' Anna, Viale Rinaldo Piaggio 34, Pontedera, 56025 (Italy); Jensen, Winnie [Department of Health Science and Technology, Fredrik Bajers Vej 7, 9220 Aalborg (Denmark); Yoshida, Ken [Department of Biomedical Engineering, Indiana University - Purdue University Indianapolis, 723 W. Michigan St., SL220, Indianapolis, IN 46202 (United States)

    2016-02-01

    Tungsten is an elective material for producing slender and stiff microneedles able to enter soft tissues and minimize puncture wounds. In particular, tungsten microneedles are used to puncture peripheral nerves and insert neural interfaces, bridging the gap between the nervous system and robotic devices (e.g., hand prostheses). Unfortunately, microneedles fail during the puncture process and this failure is not dependent on the stiffness or fracture toughness of the constituent material. In addition, the microneedles' performance decreases during in vivo trials with respect to in vitro ones. This further effect is independent of internal biotic effects, while it seems to be related to external biotic causes. Since the exact synergy of phenomena decreasing the in vivo reliability is still not known, this work explored the connection between in vitro and in vivo behavior of tungsten microneedles through the study of interactions between biotic and abiotic factors. A hybrid computational approach, simultaneously using theoretical relationships and in silico models of nerves, was implemented to model the change in reliability with varying microneedle diameter, and to predict in vivo performance by using in vitro reliability and local differences between the in vivo and in vitro mechanical response of nerves. - Highlights: • We provide phenomenological Finite Element (FE) models of peripheral nerves to study the interactions with W microneedles • We provide a general interaction-based approach to model the reliability of slender microneedles • We evaluate the reliability of W microneedles to puncture in vivo nerves • We provide a novel synergistic hybrid approach (theory + simulations) involving interactions among biotic and abiotic factors • We validate the hybrid approach by using experimental data from literature.

  6. Interactions among biotic and abiotic factors affect the reliability of tungsten microneedles puncturing in vitro and in vivo peripheral nerves: A hybrid computational approach

    International Nuclear Information System (INIS)

    Sergi, Pier Nicola; Jensen, Winnie; Yoshida, Ken

    2016-01-01

    Tungsten is an elective material for producing slender and stiff microneedles able to enter soft tissues and minimize puncture wounds. In particular, tungsten microneedles are used to puncture peripheral nerves and insert neural interfaces, bridging the gap between the nervous system and robotic devices (e.g., hand prostheses). Unfortunately, microneedles fail during the puncture process and this failure is not dependent on the stiffness or fracture toughness of the constituent material. In addition, the microneedles' performance decreases during in vivo trials with respect to in vitro ones. This further effect is independent of internal biotic effects, while it seems to be related to external biotic causes. Since the exact synergy of phenomena decreasing the in vivo reliability is still not known, this work explored the connection between in vitro and in vivo behavior of tungsten microneedles through the study of interactions between biotic and abiotic factors. A hybrid computational approach, simultaneously using theoretical relationships and in silico models of nerves, was implemented to model the change in reliability with varying microneedle diameter, and to predict in vivo performance by using in vitro reliability and local differences between the in vivo and in vitro mechanical response of nerves. - Highlights: • We provide phenomenological Finite Element (FE) models of peripheral nerves to study the interactions with W microneedles • We provide a general interaction-based approach to model the reliability of slender microneedles • We evaluate the reliability of W microneedles to puncture in vivo nerves • We provide a novel synergistic hybrid approach (theory + simulations) involving interactions among biotic and abiotic factors • We validate the hybrid approach by using experimental data from literature.

  7. CERPI and CEREL, two computer codes for the automatic identification and determination of gamma emitters in thermal-neutron-activated samples

    International Nuclear Information System (INIS)

    Giannini, M.; Oliva, P.R.; Ramorino, M.C.

    1979-01-01

    A computer code that automatically analyzes gamma-ray spectra obtained with Ge(Li) detectors is described. The program contains such features as automatic peak location and fitting, determination of peak energies and intensities, nuclide identification, and calculation of masses and errors. Finally, the results obtained with this computer code for a lunar sample are reported and briefly discussed

  8. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It covers the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; CFR and the exponential distribution; IFR and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and failure analysis by FTA.

  9. Accuracy and reliability of a novel method for fusion of digital dental casts and cone beam computed tomography scans

    NARCIS (Netherlands)

    Rangel, F.A.; Maal, T.J.J.; Bronkhorst, E.M.; Breuning, K.H.; Schols, J.G.J.H.; Berge, S.J.; Kuijpers-Jagtman, A.M.

    2013-01-01

    Several methods have been proposed to integrate digital models into Cone Beam Computed Tomography scans. Since all these methods have some drawbacks such as radiation exposure, soft tissue deformation and time-consuming digital handling processes, we propose a new method to integrate digital dental

  10. Reliability Analysis Based on a Jump Diffusion Model with Two Wiener Processes for Cloud Computing with Big Data

    Directory of Open Access Journals (Sweden)

    Yoshinobu Tamura

    2015-06-01

    Full Text Available At present, many cloud services are managed by using open source software, such as OpenStack and Eucalyptus, because of the unified management of data, cost reduction, quick delivery and work savings. The operation phase of cloud computing has unique features, such as the provisioning processes, the network-based operation and the diversity of data, because the operation phase of cloud computing changes depending on many external factors. We propose a jump diffusion model with two-dimensional Wiener processes in order to consider the interesting aspects of the network traffic and big data on cloud computing. In particular, we assess the stability of cloud software by using the sample paths obtained from the jump diffusion model with two-dimensional Wiener processes. Moreover, we discuss the optimal maintenance problem based on the proposed jump diffusion model. Furthermore, we analyze actual data to show numerical examples of dependability optimization based on the software maintenance cost considering big data on cloud computing.
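
    A minimal sketch of how a sample path of such a model can be generated by Euler-Maruyama simulation; the drift, the two diffusion terms and the jump specification below are generic placeholders, not the model form or parameter values estimated in the paper:

        # Sketch: simulate one sample path of a jump-diffusion process driven by
        # two independent Wiener processes plus compound-Poisson jumps,
        #   dX = mu*X dt + sigma1*X dW1 + sigma2*X dW2 + X dJ,
        # using Euler-Maruyama. All parameter values are illustrative only.
        import numpy as np

        def jump_diffusion_path(x0=1.0, mu=0.05, sigma1=0.2, sigma2=0.1,
                                jump_rate=0.5, jump_scale=0.1,
                                t_end=10.0, n_steps=1000, seed=0):
            rng = np.random.default_rng(seed)
            dt = t_end / n_steps
            x = np.empty(n_steps + 1)
            x[0] = x0
            for i in range(n_steps):
                dw1 = rng.normal(0.0, np.sqrt(dt))           # first Wiener increment
                dw2 = rng.normal(0.0, np.sqrt(dt))           # second Wiener increment
                n_jumps = rng.poisson(jump_rate * dt)        # number of jumps in this step
                jump = rng.normal(0.0, jump_scale, size=n_jumps).sum()
                x[i + 1] = x[i] + mu * x[i] * dt + sigma1 * x[i] * dw1 \
                           + sigma2 * x[i] * dw2 + x[i] * jump
            return x

        path = jump_diffusion_path()
        print(path[:5], path[-1])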

  11. The Reliability of Classifications of Proximal Femoral Fractures with 3-Dimensional Computed Tomography: The New Concept of Comprehensive Classification

    Directory of Open Access Journals (Sweden)

    Hiroaki Kijima

    2014-01-01

    Full Text Available The reliability of proximal femoral fracture classifications using 3DCT was evaluated, and a comprehensive “area classification” was developed. Eleven orthopedists (5–26 years from graduation) classified 27 proximal femoral fractures at one hospital from June 2013 to July 2014 based on preoperative images. Various classifications were compared to “area classification.” In “area classification,” the proximal femur is divided into 4 areas with 3 boundary lines: Line-1 is the center of the neck, Line-2 is the border between the neck and the trochanteric zone, and Line-3 links the inferior borders of the greater and lesser trochanters. A fracture only in the first area was classified as a pure first area fracture; one in the first and second area was classified as a 1-2 type fracture. In the same way, fractures were classified as pure 2, 3-4, 1-2-3, and so on. “Area classification” reliability was highest when orthopedists with varying experience classified proximal femoral fractures using 3DCT. Other classifications cannot classify proximal femoral fractures if they exceed each classification’s particular zones. However, fractures that exceed the target zones are “dangerous” fractures. “Area classification” can classify such fractures, and it is therefore useful for selecting osteosynthesis methods.

  12. Design and construction the identification of nitriding plasma process parameters using personal computer based on serial communication

    International Nuclear Information System (INIS)

    Frida Iswinning Diah; Slamet Santosa

    2012-01-01

    The design and construction of a device for the identification of process parameters using a personal computer based on serial communication with an M-series PLC has been carried out. The function of this device is to identify the process parameters of a system (plant), which can then be analyzed and acted upon by the user. The main components of this device are the M-Series T100MD1616 PLC and a personal computer (PC). In this device the plant parameter data are obtained from the corresponding sensor outputs in the form of voltage or current, and the analog parameter data are matched to the ADC analog input of the PLC using a signal conditioning system. The parameters are then processed by the PLC and sent to a PC via RS232 to be displayed in the form of graphs or tables and stored in a database. The database software was created using Visual Basic V-6. The device operation test was performed by measuring the temperature parameter and vacuum level on the plasma nitriding machine. The results indicate that the device functions as a process parameter identification device for the plasma nitriding machine. (author)

  13. Identification and ranking of the risk factors of cloud computing in State-Owned organizations

    Directory of Open Access Journals (Sweden)

    Noor Mohammad Yaghoubi

    2015-05-01

    Full Text Available Rapid development of processing and storage technologies and the success of the Internet have made computing resources cheaper, more powerful and more available than before. This technological trend has enabled the realization of a new computing model called cloud computing. Recently, State-Owned organizations have begun to utilize cloud computing architectures, platforms, and applications to deliver services and meet constituents’ needs. Despite all of the advantages and opportunities of cloud computing technology, there are many risks that State-Owned organizations need to know about before migrating to a cloud environment. The purpose of this study is to identify and rank the risk factors of cloud computing in State-Owned organizations by making use of IT experts’ opinion. Firstly, by reviewing key articles, a comprehensive list of risk factors was extracted and classified into two categories: tangible and intangible. Then, six experts were interviewed about these risks and their classifications, and 10 risks were identified. After that, the risks were ranked with the help of 52 experts using the fuzzy analytic hierarchy process. The results show that experts have identified intangible risks as the most important risks in cloud computing usage by State-Owned organizations. As the results indicate, the "data confidentiality" risk ranks highest among them.

  14. Identification of unknown contaminants in surface water : combination of analytical and computer-based approaches

    OpenAIRE

    Hu, Meng

    2017-01-01

    Thousands of different chemicals are used in our daily life for household, industry, agriculture and medical purpose, and many of them are discharged into water bodies by direct or indirect ways. Thus, monitoring and identification of organic pollutants in aquatic ecosystem is one of the most essential concerns with respects to human health and aquatic life. Althrough liquid chromatography coupled to high resolution mass spectrometry (LC-HRMS) has made huge advancements in recent years, allow...

  15. Reliability of a coordinate system based on anatomical landmarks of the maxillofacial skeleton. An evaluation method for three-dimensional images obtained by cone-beam computed tomography

    International Nuclear Information System (INIS)

    Kimura, Momoko; Nawa, Hiroyuki; Yoshida, Kazuhito; Muramatsu, Atsushi; Fuyamada, Mariko; Goto, Shigemi; Ariji, Eiichiro; Tokumori, Kenji; Katsumata, Akitoshi

    2009-01-01

    We propose a method for evaluating the reliability of a coordinate system based on maxillofacial skeletal landmarks and use it to assess two coordinate systems. Scatter plots and 95% confidence ellipses of an objective landmark were defined as an index for demonstrating the stability of the coordinate system. A head phantom was positioned horizontally in reference to the Frankfurt horizontal and occlusal planes and subsequently scanned once in each position using cone-beam computed tomography. On the three-dimensional images created with a volume-rendering procedure, six dentists twice set two different coordinate systems: coordinate system 1 was defined by the nasion, sella, and basion, and coordinate system 2 was based on the left orbitale, bilateral porions, and basion. The menton was assigned as an objective landmark. The scatter plot and 95% ellipse of the menton indicated the high-level reliability of coordinate system 2. The patterns with the two coordinate systems were similar between data obtained in different head positions. The method presented here may be effective for evaluating the reliability (reproducibility) of coordinate systems based on skeletal landmarks. (author)
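
    For illustration, an orthonormal coordinate frame can be constructed from three skeletal landmarks by Gram-Schmidt orthogonalization and a cross product; the conventions below (origin at the sella, x-axis toward the nasion, basion fixing the reference plane, roughly in the spirit of coordinate system 1) are assumptions and may differ from the exact definitions used in the study:

        # Sketch: build a right-handed orthonormal coordinate frame from three
        # 3-D landmarks (here: nasion, sella, basion) and express a further
        # landmark (e.g. the menton) in that frame. Axis conventions are assumed.
        import numpy as np

        def landmark_frame(nasion, sella, basion):
            o = np.asarray(sella, float)                 # origin at the sella (assumption)
            x = np.asarray(nasion, float) - o            # x-axis toward the nasion
            x /= np.linalg.norm(x)
            v = np.asarray(basion, float) - o            # basion fixes the x-z plane
            z = v - np.dot(v, x) * x                     # Gram-Schmidt orthogonalization
            z /= np.linalg.norm(z)
            y = np.cross(z, x)                           # completes a right-handed frame
            return o, np.vstack([x, y, z])               # origin and 3x3 rotation matrix

        def to_local(point, origin, axes):
            """Coordinates of a landmark in the constructed frame."""
            return axes @ (np.asarray(point, float) - origin)

        origin, axes = landmark_frame(nasion=(0, 70, 30), sella=(0, 0, 0), basion=(0, -30, -20))
        print(to_local((5, 40, -90), origin, axes))      # e.g. the menton, toy coordinates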

  16. On the analysis of glow curves with the general order kinetics: Reliability of the computed trap parameters

    Energy Technology Data Exchange (ETDEWEB)

    Ortega, F. [Facultad de Ingeniería (UNCPBA) and CIFICEN (UNCPBA – CICPBA – CONICET), Av. del Valle 5737, 7400 Olavarría (Argentina); Santiago, M.; Martinez, N.; Marcazzó, J.; Molina, P.; Caselli, E. [Instituto de Física Arroyo Seco (UNCPBA) and CIFICEN (UNCPBA – CICPBA – CONICET), Pinto 399, 7000 Tandil (Argentina)

    2017-04-15

    Nowadays the most employed kinetics for analyzing glow curves is the general order kinetics (GO) proposed by C. E. May and J. A. Partridge. As shown in many articles this kinetics might yield wrong parameters characterizing trap and recombination centers. In this article this kinetics is compared with the modified general order kinetics put forward by M. S. Rasheedy by analyzing synthetic glow curves. The results show that the modified kinetics gives parameters, which are more accurate than that yield by the original general order kinetics. A criterion is reported to evaluate the accuracy of the trap parameters found by deconvolving glow curves. This criterion was employed to assess the reliability of the trap parameters of the YVO{sub 4}: Eu{sup 3+} compounds.
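
    For reference, the May-Partridge general-order (GO) glow-peak expression that is typically fitted during deconvolution can be evaluated numerically as sketched below; the parameter values are arbitrary, a kinetic order b > 1 is assumed, and the modified GO kinetics of Rasheedy discussed in the article is not reproduced:

        # Sketch: numerically evaluate a general-order (May-Partridge) glow peak
        #   I(T) = s*n0*exp(-E/kT) * [1 + (b-1)*(s/beta)*Int_T0^T exp(-E/kT') dT']^(-b/(b-1))
        # for a linear heating rate beta (K/s). Parameter values are illustrative only.
        import numpy as np

        K_B = 8.617e-5            # Boltzmann constant, eV/K

        def go_glow_peak(T, E=1.0, s=1e12, b=1.5, n0=1.0, beta=1.0):
            """Intensity of one GO peak on the temperature grid T (K); requires b > 1."""
            boltz = np.exp(-E / (K_B * T))
            # cumulative trapezoidal integral of exp(-E/kT') from T[0] to each T
            integral = np.concatenate(([0.0],
                        np.cumsum(0.5 * (boltz[1:] + boltz[:-1]) * np.diff(T))))
            return s * n0 * boltz * (1.0 + (b - 1.0) * (s / beta) * integral) ** (-b / (b - 1.0))

        T = np.linspace(300.0, 550.0, 1000)
        I = go_glow_peak(T)
        print(float(T[np.argmax(I)]))   # temperature of maximum intensity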

  17. Comments and Criticism: Comment on "Identification of Student Misconceptions in Genetics Problem Solving via Computer Program."

    Science.gov (United States)

    Smith, Mike U.

    1991-01-01

    Criticizes an article by Browning and Lehman (1988) for (1) using "gene" instead of allele, (2) misusing the word "misconception," and (3) the possible influences of the computer environment on the results of the study. (PR)

  18. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided...... methods introduce. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task....... The methodology has been implemented into a computer-aided modeling framework, which combines expert skills, tools, and database connections that are required for the different steps of the model development work-flow with the goal to increase the efficiency of the modeling process. The framework has two main...

  19. Identification and Evaluation of Reliable Reference Genes for Quantitative Real-Time PCR Analysis in Tea Plant (Camellia sinensis (L.) O. Kuntze)

    Science.gov (United States)

    Hao, Xinyuan; Horvath, David P.; Chao, Wun S.; Yang, Yajun; Wang, Xinchao; Xiao, Bin

    2014-01-01

    Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a crucial step in qRT-PCR normalization. To date, only a few housekeeping genes have been identified and used as reference genes in tea plant. The validity of those reference genes is not clear since their expression stabilities have not been rigorously examined. To identify more appropriate reference genes for qRT-PCR studies on tea plant, we examined the expression stability of 11 candidate reference genes from three different sources: the orthologs of Arabidopsis traditional reference genes and stably expressed genes identified from whole-genome GeneChip studies, together with three housekeeping genes commonly used in tea plant research. We evaluated the transcript levels of these genes in 94 experimental samples. The expression stabilities of these 11 genes were ranked using four different computational approaches, including geNorm, Normfinder, BestKeeper, and the comparative ∆CT method. Results showed that the three commonly used housekeeping genes CsTUBULIN1, CsACINT1 and Cs18S rRNA1, together with CsUBQ1, were the most unstable genes in all sample ranking orders. However, CsPTB1, CsEF1, CsSAND1, CsCLATHRIN1 and CsUBC1 were the top five appropriate reference genes for qRT-PCR analysis in complex experimental conditions. PMID:25474086
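
    The comparative ∆CT approach mentioned among the four evaluation methods can be sketched roughly as follows: for every pair of candidate genes the ∆Ct is computed across all samples, and a gene's stability score is the mean standard deviation of its pairwise ∆Ct values (lower meaning more stable). The Ct matrix below is made-up toy data, not the study's measurements:

        # Sketch of the comparative delta-Ct method for reference-gene stability:
        # a gene is stable if its Ct difference to every other gene varies little
        # across samples. Ct values below are invented toy numbers.
        import numpy as np

        def delta_ct_stability(ct, genes):
            """ct: samples x genes matrix of Ct values. Returns {gene: mean SD of pairwise dCt}."""
            scores = {}
            for i, g in enumerate(genes):
                sds = [np.std(ct[:, i] - ct[:, j]) for j in range(ct.shape[1]) if j != i]
                scores[g] = float(np.mean(sds))
            return scores

        genes = ["CsPTB1", "CsEF1", "CsSAND1", "CsTUBULIN1"]
        ct = np.array([[24.1, 22.3, 25.0, 20.1],
                       [24.4, 22.7, 25.2, 23.5],
                       [23.9, 22.1, 24.8, 18.9],
                       [24.2, 22.5, 25.1, 22.8]])
        ranking = sorted(delta_ct_stability(ct, genes).items(), key=lambda kv: kv[1])
        print(ranking)   # most stable gene first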

  20. Cardiac valve calcifications on low-dose unenhanced ungated chest computed tomography: inter-observer and inter-examination reliability, agreement and variability

    Energy Technology Data Exchange (ETDEWEB)

    Hamersvelt, Robbert W. van; Willemink, Martin J.; Takx, Richard A.P.; Eikendal, Anouk L.M.; Budde, Ricardo P.J.; Leiner, Tim; Jong, Pim A. de [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Mol, Christian P.; Isgum, Ivana [University Medical Center Utrecht, Image Sciences Institute, Utrecht (Netherlands)

    2014-07-15

    To determine inter-observer and inter-examination variability for aortic valve calcification (AVC) and mitral valve and annulus calcification (MC) in low-dose unenhanced ungated lung cancer screening chest computed tomography (CT). We included 578 lung cancer screening trial participants who were examined by CT twice within 3 months to follow indeterminate pulmonary nodules. On these CTs, AVC and MC were measured in cubic millimetres. One hundred CTs were examined by five observers to determine the inter-observer variability. Reliability was assessed by kappa statistics (κ) and intra-class correlation coefficients (ICCs). Variability was expressed as the mean difference ± standard deviation (SD). Inter-examination reliability was excellent for AVC (κ = 0.94, ICC = 0.96) and MC (κ = 0.95, ICC = 0.90). Inter-examination variability was 12.7 ± 118.2 mm³ for AVC and 31.5 ± 219.2 mm³ for MC. Inter-observer reliability ranged from κ = 0.68 to κ = 0.92 for AVC and from κ = 0.20 to κ = 0.66 for MC. Inter-observer ICC was 0.94 for AVC and ranged from 0.56 to 0.97 for MC. Inter-observer variability ranged from -30.5 ± 252.0 mm³ to 84.0 ± 240.5 mm³ for AVC and from -95.2 ± 210.0 mm³ to 303.7 ± 501.6 mm³ for MC. AVC can be quantified with excellent reliability on ungated unenhanced low-dose chest CT, but manual detection of MC can be subject to substantial inter-observer variability. Lung cancer screening CT may be used for detection and quantification of cardiac valve calcifications. (orig.)

  1. Cardiac valve calcifications on low-dose unenhanced ungated chest computed tomography: inter-observer and inter-examination reliability, agreement and variability

    International Nuclear Information System (INIS)

    Hamersvelt, Robbert W. van; Willemink, Martin J.; Takx, Richard A.P.; Eikendal, Anouk L.M.; Budde, Ricardo P.J.; Leiner, Tim; Jong, Pim A. de; Mol, Christian P.; Isgum, Ivana

    2014-01-01

    To determine inter-observer and inter-examination variability for aortic valve calcification (AVC) and mitral valve and annulus calcification (MC) in low-dose unenhanced ungated lung cancer screening chest computed tomography (CT). We included 578 lung cancer screening trial participants who were examined by CT twice within 3 months to follow indeterminate pulmonary nodules. On these CTs, AVC and MC were measured in cubic millimetres. One hundred CTs were examined by five observers to determine the inter-observer variability. Reliability was assessed by kappa statistics (κ) and intra-class correlation coefficients (ICCs). Variability was expressed as the mean difference ± standard deviation (SD). Inter-examination reliability was excellent for AVC (κ = 0.94, ICC = 0.96) and MC (κ = 0.95, ICC = 0.90). Inter-examination variability was 12.7 ± 118.2 mm³ for AVC and 31.5 ± 219.2 mm³ for MC. Inter-observer reliability ranged from κ = 0.68 to κ = 0.92 for AVC and from κ = 0.20 to κ = 0.66 for MC. Inter-observer ICC was 0.94 for AVC and ranged from 0.56 to 0.97 for MC. Inter-observer variability ranged from -30.5 ± 252.0 mm³ to 84.0 ± 240.5 mm³ for AVC and from -95.2 ± 210.0 mm³ to 303.7 ± 501.6 mm³ for MC. AVC can be quantified with excellent reliability on ungated unenhanced low-dose chest CT, but manual detection of MC can be subject to substantial inter-observer variability. Lung cancer screening CT may be used for detection and quantification of cardiac valve calcifications. (orig.)

  2. Reliable computation of roots in analytical waveguide modeling using an interval-Newton approach and algorithmic differentiation.

    Science.gov (United States)

    Bause, Fabian; Walther, Andrea; Rautenberg, Jens; Henning, Bernd

    2013-12-01

    For the modeling and simulation of wave propagation in geometrically simple waveguides such as plates or rods, one may employ the analytical global matrix method. That is, a certain (global) matrix depending on the two parameters wavenumber and frequency is built. Subsequently, one must calculate all parameter pairs within the domain of interest where the global matrix becomes singular. For this purpose, one could compute all roots of the determinant of the global matrix when the two parameters vary in the given intervals. This requirement to calculate all roots is actually the method's most concerning restriction. Previous approaches are based on so-called mode-tracers, which use the physical phenomenon that solutions, i.e., roots of the determinant of the global matrix, appear in a certain pattern, the waveguide modes, to limit the root-finding algorithm's search space with respect to consecutive solutions. In some cases, these reductions of the search space yield only an incomplete set of solutions, because some roots may be missed as a result of uncertain predictions. Therefore, we propose replacement of the mode-tracer approach with a suitable version of an interval- Newton method. To apply this interval-based method, we extended the interval and derivative computation provided by a numerical computing environment such that corresponding information is also available for Bessel functions used in circular models of acoustic waveguides. We present numerical results for two different scenarios. First, a polymeric cylindrical waveguide is simulated, and second, we show simulation results of a one-sided fluid-loaded plate. For both scenarios, we compare results obtained with the proposed interval-Newton algorithm and commercial software.
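
    A toy sketch of a single interval-Newton contraction step on a scalar function, far simpler than the dispersion-relation determinants handled in the paper and using hand-coded interval arithmetic instead of algorithmic differentiation; the derivative enclosure is assumed not to contain zero, so no interval splitting is needed:

        # Sketch: one interval-Newton step N(X) = m - f(m)/F'(X) intersected with X,
        # shown for f(x) = x^2 - 2 on X = [1, 2]. The derivative enclosure F'(X) = 2*X
        # is written down by hand; real implementations obtain it by algorithmic
        # differentiation and must handle zero-containing denominators by splitting.
        def interval_newton_step(lo, hi):
            m = 0.5 * (lo + hi)                       # midpoint of X
            fm = m * m - 2.0                          # f(m)
            dlo, dhi = 2.0 * lo, 2.0 * hi             # derivative enclosure F'(X), 0 not inside
            # interval quotient fm / [dlo, dhi] (both derivative endpoints positive here)
            q = sorted((fm / dlo, fm / dhi))
            n_lo, n_hi = m - q[1], m - q[0]           # N(X)
            return max(lo, n_lo), min(hi, n_hi)       # intersect with X

        X = (1.0, 2.0)
        for _ in range(5):
            X = interval_newton_step(*X)
            print(X)                                  # contracts toward sqrt(2)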

  3. Interactive reliability assessment using an integrated reliability data bank

    International Nuclear Information System (INIS)

    Allan, R.N.; Whitehead, A.M.

    1986-01-01

    The logical structure, techniques and practical application of a computer-aided technique based on a microcomputer using floppy disc Random Access Files are described. This interactive computational technique is efficient if the reliability prediction program is coupled directly to a relevant source of data to create an integrated reliability assessment/reliability data bank system. (DG)

  4. Construction, implementation and testing of an image identification system using computer vision methods for fruit flies with economic importance (Diptera: Tephritidae).

    Science.gov (United States)

    Wang, Jiang-Ning; Chen, Xiao-Lin; Hou, Xin-Wen; Zhou, Li-Bing; Zhu, Chao-Dong; Ji, Li-Qiang

    2017-07-01

    Many species of Tephritidae are damaging to fruit, which might negatively impact international fruit trade. Automatic or semi-automatic identification of fruit flies is greatly needed for diagnosing causes of damage and quarantine protocols for economically relevant insects. A fruit fly image identification system named AFIS1.0 has been developed using 74 species belonging to six genera, which include the majority of pests in the Tephritidae. The system combines automated image identification and manual verification, balancing operability and accuracy. AFIS1.0 integrates image analysis and an expert system into a content-based image retrieval framework. In the automatic identification module, AFIS1.0 gives candidate identification results. Afterwards, users can perform manual selection by comparing unidentified images with a subset of images corresponding to the automatic identification result. The system uses Gabor surface features in automated identification and yielded an overall classification success rate of 87% to the species level in the Independent Multi-part Image Automatic Identification Test. The system is useful for users with or without specific expertise on Tephritidae in the task of rapid and effective identification of fruit flies. It makes the application of computer vision technology to fruit fly recognition much closer to production level. © 2016 Society of Chemical Industry.

  5. [The Computer Book of the Internal Medicine resident: validity and reliability of a questionnaire for self-assessment of competences in internal medicine residents].

    Science.gov (United States)

    Oristrell, J; Casanovas, A; Jordana, R; Comet, R; Gil, M; Oliva, J C

    2012-12-01

    There are no simple and validated instruments for evaluating the training of specialists. To analyze the reliability and validity of a computerized self-assessment method to quantify the acquisition of medical competences during the Internal Medicine residency program. All residents of our department participated in the study during a period of 28 months. Twenty-two questionnaires specific for each rotation (the Computer-Book of the Internal Medicine Resident) were constructed with items (questions) corresponding to three competence domains: clinical skills competence, communication skills and teamwork. Reliability was analyzed by measuring the internal consistency of items in each competence domain using Cronbach's alpha index. Validation was performed by comparing mean scores in each competence domain between senior and junior residents. Cut-off levels of competence scores were established in order to identify the strengths and weaknesses of our training program. Finally, self-assessment values were correlated with the evaluations of the medical staff. There was a high internal consistency of the items of clinical skills competences, communication skills and teamwork. Senior residents scored higher than junior residents in clinical skills competence and communication skills, but not in teamwork. The Computer-Book of the Internal Medicine Resident identified the strengths and weaknesses of our training program. We did not observe any correlation between the results of the self-evaluations and the evaluations made by staff physicians. The items of the Computer-Book of the Internal Medicine Resident showed high internal consistency and made it possible to measure the acquisition of medical competences in a team of Internal Medicine residents. This self-assessment method should be complemented with other evaluation methods in order to assess the acquisition of medical competences by an individual resident. Copyright © 2012 Elsevier España.
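
    Internal consistency of this kind is usually quantified with Cronbach's alpha, which can be computed from an item-score matrix as sketched below; the scores are invented toy data, not the study's:

        # Sketch: Cronbach's alpha for a block of questionnaire items,
        #   alpha = k/(k-1) * (1 - sum(item variances) / variance of total score).
        # The score matrix (respondents x items) below is invented toy data.
        import numpy as np

        def cronbach_alpha(scores):
            scores = np.asarray(scores, float)
            k = scores.shape[1]                         # number of items
            item_var = scores.var(axis=0, ddof=1).sum() # sum of item variances
            total_var = scores.sum(axis=1).var(ddof=1)  # variance of the total score
            return (k / (k - 1)) * (1.0 - item_var / total_var)

        items = [[4, 5, 4, 3],
                 [3, 4, 3, 3],
                 [5, 5, 4, 4],
                 [2, 3, 2, 2],
                 [4, 4, 4, 3]]
        print(round(cronbach_alpha(items), 3))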

  6. The pathological basis of dementia in the aged and reliability of computed tomograms in the diagnosis of dementia

    International Nuclear Information System (INIS)

    Tohgi, Hideo

    1981-01-01

    Pathological findings of demented (89 cases) and non-demented, control subjects (74 cases) in the aged were compared. The reliability of CT in the diagnosis was also studied. 1) Brain weight and the degree of ventricular dilatation were related to dementia, but the degree of convolutional atrophy showed no correlation with dementia. 2) Among various types of cerebrovascular lesions, only diffuse white matter lesions can be the cause of dementia. 3) Cases with dementia were classified into 4 groups according to which of cerebrovascular lesions and senile plaques was more prominent histologically. 4) CT evaluations coincided with pathological findings in only 17.9% in the degree of ventricular dilatation and 57.1% in the degree of convolutional atrophy. Ninety-three percent of cases without periventricular lucency did not show diffuse white matter lesions at autopsy, while only 50% of cases with periventricular lucency were confirmed to have diffuse white matter lesions. 5) The degree of ventricular dilatation, convolutional atrophy, periventricular lucency, and subarachnoid free space in the cerebral convexity were studied in relation to dementia. The sum of the evaluations of these indices had a significant correlation with dementia. (J.P.N.)

  7. Computational identification of 18 micrornas and their targets in three species of rose

    International Nuclear Information System (INIS)

    Baloch, I.A.; Barozai, M.Y.K.; Achakzai, A.K.K.

    2015-01-01

    MicroRNAs (miRNAs) are non-protein coding, small endogenous RNAs. Their length ranges from 18 to 26 nucleotides (nt). The conservation of miRNAs across species makes homology search a rational approach for the hunt for novel miRNAs in other organisms. As presently very few miRNAs have been reported for rose species, this study deals with the identification of miRNAs in different species of rose. Consequently, 18 miRNAs belonging to 17 miRNA families were identified in 3 species of rose (Rosa hybrid, Rosa chinensis and Rosa virginiana). All of the identified miRNA families (miR156, 160, 164, 166, 398, 482, 831, 837, 838, 841, 847, 3436, 3627, 6135, 6285, 6287 and 6288) are being reported for the first time in rose. Precursors of the identified miRNAs form stable minimum free energy (MFE) stem-loop structures and the mature miRNAs are found in the stem portions of their corresponding precursors. 11 putative targets of the miRNAs have also been identified. The identified targets are various proteins including transcription factors. The identification of these 18 miRNAs will support the exploration of gene regulation in various species of roses and contribute to understanding post-transcriptional gene regulation at various stages of the rose life cycle. (author)

  8. TPASS: a gamma-ray spectrum analysis and isotope identification computer code

    International Nuclear Information System (INIS)

    Dickens, J.K.

    1981-03-01

    The gamma-ray spectral data-reduction and analysis computer code TPASS is described. This computer code is used to analyze complex Ge(Li) gamma-ray spectra to obtain peak areas corrected for detector efficiencies, from which are determined gamma-ray yields. These yields are compared with an isotope gamma-ray data file to determine the contributions to the observed spectrum from decay of specific radionuclides. A complete FORTRAN listing of the code and a complex test case are given

  9. Reliability analysis and computation of computer-based safety instrumentation and control used in German nuclear power plant. Final report; Zuverlaessigkeitsuntersuchung und -berechnung rechnerbasierter Sicherheitsleittechnik zum Einsatz in deutschen Kernkraftwerken. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Yongjian [Hochschule Magdeburg-Stendal, Magdeburg (Germany). Inst. fuer Elektrotechnik; Krause, Ulrich [Magdeburg Univ. (Germany). Inst. fuer Apparate- und Umwelttechnik; Gu, Chunlei

    2014-08-21

    The trend of technological advancement in the field of safety instrumentation and control (I and C) leads to increasingly frequent use of computer-based (digital) control systems, which consist of distributed computers connected by bus communications and whose functionality is freely programmable by qualified software. The advantages of the new I and C systems over the old hard-wired I and C technology lie, e.g., in higher flexibility, cost-effective procurement of spare parts, and higher hardware reliability (through higher integration density, intelligent self-monitoring mechanisms, etc.). On the other hand, skeptics see in the new computer-based I and C technology a higher potential for common cause failures (CCF) and easier manipulation by sabotage (IT security). In this joint research project funded by the Federal Ministry for Economic Affairs and Energy (BMWi) (2011-2014, FJZ 1501405), the Otto-von-Guericke-University Magdeburg and the Magdeburg-Stendal University of Applied Sciences are therefore trying to develop suitable methods for demonstrating the reliability of the new instrumentation and control systems, with a focus on the investigation of CCF. The expertise of both institutions shall thereby be extended to this area and a scientific contribution made to sound reliability judgments of digital safety I and C in domestic and foreign nuclear power plants. First, the state of science and technology will be worked out through the study of national and international standards in the field of functional safety of electrical and I and C systems and the accompanying literature. On the basis of the existing nuclear standards, the deterministic requirements on the structure of the new digital I and C system will be determined. The possible methods of reliability modeling will be analyzed and compared. A suitable method called multi class binomial failure rate (MCFBR) which was successfully used in safety valve applications will be

  10. Virtual non-contrast in second-generation, dual-energy computed tomography: Reliability of attenuation values

    International Nuclear Information System (INIS)

    Toepker, Michael; Moritz, Thomas; Krauss, Bernhard; Weber, Michael; Euller, Gordon; Mang, Thomas; Wolf, Florian; Herold, Christian J.; Ringl, Helmut

    2012-01-01

    Purpose: To evaluate the reliability of attenuation values in virtual non-contrast images (VNC) reconstructed from contrast-enhanced, dual-energy scans performed on a second-generation dual-energy CT scanner, compared to single-energy, non-contrast images (TNC). Materials and methods: Sixteen phantoms containing a mixture of contrast agent and water at different attenuations (0–1400 HU) were investigated on a Definition Flash-CT scanner using a single-energy scan at 120 kV and a DE-CT protocol (100 kV/SN140 kV). For clinical assessment, 86 patients who received a dual-phase CT, containing an unenhanced single-energy scan at 120 kV and a contrast enhanced (110 ml Iomeron 400 mg/ml; 4 ml/s) DE-CT (100 kV/SN140 kV) in an arterial (n = 43) or a venous phase, were retrospectively analyzed. Mean attenuation was measured within regions of interest of the phantoms and in different tissue types of the patients within the corresponding VNC and TNC images. Paired t-tests and Pearson correlation were used for statistical analysis. Results: For all phantoms, mean attenuation in VNC was 5.3 ± 18.4 HU, with respect to water. In 86 patients overall, 2637 regions were measured in TNC and VNC images, with a mean difference between TNC and VNC of −3.6 ± 8.3 HU. In 91.5% (n = 2412) of all cases, absolute differences between TNC and VNC were under 15 HU, and, in 75.3% (n = 1986), differences were under 10 HU. Conclusions: Second-generation dual-energy CT based VNC images provide attenuation values close to those of TNC. To avoid possible outliers multiple measurements are recommended especially for measurements in the spleen, the mesenteric fat, and the aorta.

  11. Virtual non-contrast in second-generation, dual-energy computed tomography: reliability of attenuation values.

    Science.gov (United States)

    Toepker, Michael; Moritz, Thomas; Krauss, Bernhard; Weber, Michael; Euller, Gordon; Mang, Thomas; Wolf, Florian; Herold, Christian J; Ringl, Helmut

    2012-03-01

    To evaluate the reliability of attenuation values in virtual non-contrast images (VNC) reconstructed from contrast-enhanced, dual-energy scans performed on a second-generation dual-energy CT scanner, compared to single-energy, non-contrast images (TNC). Sixteen phantoms containing a mixture of contrast agent and water at different attenuations (0-1400 HU) were investigated on a Definition Flash-CT scanner using a single-energy scan at 120 kV and a DE-CT protocol (100 kV/SN140 kV). For clinical assessment, 86 patients who received a dual-phase CT, containing an unenhanced single-energy scan at 120 kV and a contrast enhanced (110 ml Iomeron 400 mg/ml; 4 ml/s) DE-CT (100 kV/SN140 kV) in an arterial (n=43) or a venous phase, were retrospectively analyzed. Mean attenuation was measured within regions of interest of the phantoms and in different tissue types of the patients within the corresponding VNC and TNC images. Paired t-tests and Pearson correlation were used for statistical analysis. For all phantoms, mean attenuation in VNC was 5.3±18.4 HU, with respect to water. In 86 patients overall, 2637 regions were measured in TNC and VNC images, with a mean difference between TNC and VNC of -3.6±8.3 HU. In 91.5% (n=2412) of all cases, absolute differences between TNC and VNC were under 15 HU, and, in 75.3% (n=1986), differences were under 10 HU. Second-generation dual-energy CT based VNC images provide attenuation values close to those of TNC. To avoid possible outliers, multiple measurements are recommended, especially in the spleen, the mesenteric fat, and the aorta. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  12. Identification of Requirements for Computer-Supported Matching of Food Consumption Data with Food Composition Data.

    NARCIS (Netherlands)

    Koroušić Seljak, Barbara; Korošec, Peter; Eftimov, Tome; Ocke, Marga; van der Laan, Jan; Roe, Mark; Berry, Rachel; Crispim, Sandra Patricia; Turrini, Aida; Krems, Carolin; Slimani, Nadia; Finglas, Paul

    2018-01-01

    This paper identifies the requirements for computer-supported food matching, in order to address current national, European and international needs, and represents an integrated research contribution of the FP7 EuroDISH project. The available classification and coding systems

  13. A computer-aided framework for development, identification and management of physiologically-based pharmacokinetic models

    DEFF Research Database (Denmark)

    Heitzig, Martina; Linninger, Andreas; Sin, Gürkan

    2014-01-01

    The objective of this work is the development of a generic computer-aided modelling framework to support the development of physiologically-based pharmacokinetic models thereby increasing the efficiency and quality of the modelling process. In particular, the framework systematizes the modelling...

  14. Single source dual-energy computed tomography in the diagnosis of gout: Diagnostic reliability in comparison to digital radiography and conventional computed tomography of the feet

    Energy Technology Data Exchange (ETDEWEB)

    Kiefer, Tobias; Diekhoff, Torsten [Department of Radiology, Charité—Universitätsmedizin Berlin, Campus Mitte, Humboldt-Universität zu Berlin, Freie Universität, Berlin, Charitéplatz 1, 10117 Berlin (Germany); Hermann, Sandra [Department of Rheumatology and Clinical Immunology, Charité—Universitätsmedizin Berlin Campus Mitte, Humboldt-Universität zu Berlin, Freie Universität Berlin, Charitéplatz 1, 10117 Berlin (Germany); Stroux, Andrea [Department of Medical Informatics, Biometry and Epidemiology, Freie Universität Berlin, Berlin (Germany); Mews, Jürgen; Blobel, Jörg [Toshiba Medical Systems Europe, BV, Zilverstraat 1, 2701 RP Zoetermeer (Netherlands); Hamm, Bernd [Department of Radiology, Charité—Universitätsmedizin Berlin, Campus Mitte, Humboldt-Universität zu Berlin, Freie Universität, Berlin, Charitéplatz 1, 10117 Berlin (Germany); Hermann, Kay-Geert A., E-mail: kghermann@gmail.com [Department of Radiology, Charité—Universitätsmedizin Berlin, Campus Mitte, Humboldt-Universität zu Berlin, Freie Universität, Berlin, Charitéplatz 1, 10117 Berlin (Germany)

    2016-10-15

    Objectives: To investigate the diagnostic value of single-source dual-energy computed tomography (SDECT) in gouty arthritis and to compare its capability to detect urate depositions with digital radiography (DR) and conventional computed tomography (CT). Methods: Forty-four patients who underwent SDECT volume scans of the feet for suspected gouty arthritis were retrospectively analyzed. SDECT, CT (both n = 44) and DR (n = 36) were scored by three blinded readers for the presence of osteoarthritis, erosions, and tophi. A diagnosis was made for each imaging modality. Results were compared to the clinical diagnosis using the American College of Rheumatology (ACR) classification criteria. Results: The patient population was divided into a gout (n = 21) and control (n = 23) group based on final clinical diagnosis. Osteoarthritis was evident in 15 joints using CT and 30 joints using DR (p = 0.165). There were 134 erosions detected by CT compared to 38 erosions detected by DR (p < 0.001). In total 119 tophi were detected by SDECT, compared to 85 tophi by CT (p = 0.182) and 25 tophi by DR (p < 0.001). SDECT had the best diagnostic value for the diagnosis of gout compared to DR and conventional CT (sensitivity and specificity for SDECT: 71.4% and 95.7%, CT: 71.4% and 91.3% and DR: 44.4% and 83.3%, respectively). For all three readers, Cohen's kappa for DR and conventional CT was substantial for all scoring items and ranged from 0.75 to 0.77 and 0.72–0.76, respectively. For SDECT, Cohen's kappa was good to almost perfect with 0.77–0.84. Conclusions: SDECT is capable of detecting uric acid depositions with good sensitivity and high specificity in the feet, thereby improving diagnostic confidence. Using SDECT, inter-reader variance can be markedly reduced for the detection of gouty tophi.

  15. Single source dual-energy computed tomography in the diagnosis of gout: Diagnostic reliability in comparison to digital radiography and conventional computed tomography of the feet

    International Nuclear Information System (INIS)

    Kiefer, Tobias; Diekhoff, Torsten; Hermann, Sandra; Stroux, Andrea; Mews, Jürgen; Blobel, Jörg; Hamm, Bernd; Hermann, Kay-Geert A.

    2016-01-01

    Objectives: To investigate the diagnostic value of single-source dual-energy computed tomography (SDECT) in gouty arthritis and to compare its capability to detect urate depositions with digital radiography (DR) and conventional computed tomography (CT). Methods: Forty-four patients who underwent SDECT volume scans of the feet for suspected gouty arthritis were retrospectively analyzed. SDECT, CT (both n = 44) and DR (n = 36) were scored by three blinded readers for the presence of osteoarthritis, erosions, and tophi. A diagnosis was made for each imaging modality. Results were compared to the clinical diagnosis using the American College of Rheumatology (ACR) classification criteria. Results: The patient population was divided into a gout (n = 21) and control (n = 23) group based on final clinical diagnosis. Osteoarthritis was evident in 15 joints using CT and 30 joints using DR (p = 0.165). There were 134 erosions detected by CT compared to 38 erosions detected by DR (p < 0.001). In total 119 tophi were detected by SDECT, compared to 85 tophi by CT (p = 0.182) and 25 tophi by DR (p < 0.001). SDECT had the best diagnostic value for the diagnosis of gout compared to DR and conventional CT (sensitivity and specificity for SDECT: 71.4% and 95.7%, CT: 71.4% and 91.3% and DR: 44.4% and 83.3%, respectively). For all three readers, Cohen's kappa for DR and conventional CT was substantial for all scoring items and ranged from 0.75 to 0.77 and 0.72–0.76, respectively. For SDECT, Cohen's kappa was good to almost perfect with 0.77–0.84. Conclusions: SDECT is capable of detecting uric acid depositions with good sensitivity and high specificity in the feet, thereby improving diagnostic confidence. Using SDECT, inter-reader variance can be markedly reduced for the detection of gouty tophi.

  16. Computational Identification of MicroRNAs and Their Targets from Finger Millet (Eleusine coracana).

    Science.gov (United States)

    Usha, S; Jyothi, M N; Suchithra, B; Dixit, Rekha; Rai, D V; Nagesh Babu, R

    2017-03-01

    MicroRNAs are endogenous small RNAs regulating intrinsic normal growth and development of plants. Discovering miRNAs and their targets and further inferring their functions has become a routine process for comprehending the normal biological roles of miRNAs in plant development. In this study, we used homology-based analysis with available expressed sequence tags of finger millet (Eleusine coracana) to predict conserved miRNAs. Three potent miRNAs targeting 88 genes were identified. The newly identified miRNAs were found to be homologous to miR166 and miR1310. The targets recognized were transcription factors and enzymes, and GO analysis showed these miRNAs play varied roles in gene regulation. The identification of miRNAs and their targets is anticipated to hasten the discovery of key epigenetic regulators in plant development.

  17. Modeling and identification of ARMA models for stochastic processes: application to on-line computation of the power spectral density

    International Nuclear Information System (INIS)

    Zwingelstein, Gilles; Thabet, Gabriel.

    1977-01-01

    Control algorithms for components of nuclear power plants are currently based on external diagnostic methods. Modeling and identification techniques for autoregressive moving average (ARMA) models of stochastic processes are described. The identified models provide a means of estimating the power spectral density with improved accuracy and computer time compared with the classical methods. They are particularly well suited for on-line estimation of the power spectral density. The observable stochastic process y(t) is modeled assuming that it is the output of a linear filter driven by Gaussian white noise w(t). Two identification schemes were tested to find the orders m and n of the ARMA(m,n) models and to estimate the parameters of the recursion equation relating the input and output signals. The first scheme consists in transforming the ARMA model to an autoregressive model. The parameters of this AR model are obtained using least-squares estimation techniques. The second scheme consists in finding the parameters of the ARMA model by nonlinear programming techniques. The power spectral density of y(t) is instantaneously deduced from these ARMA models [fr
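    A minimal sketch of the first identification scheme mentioned above (AR fit by least squares, power spectral density deduced from the coefficients); the model order, the simulated AR(2) signal and all numerical values are illustrative assumptions, not figures from the report:

```python
import numpy as np

def fit_ar_least_squares(y, order):
    """Fit y[t] = a1*y[t-1] + ... + ap*y[t-p] + w[t] by ordinary least squares
    (the AR form to which the ARMA model is transformed in the first scheme)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Regression matrix of lagged samples: column k holds y[t-1-k].
    X = np.column_stack([y[order - k - 1:n - k - 1] for k in range(order)])
    target = y[order:]
    coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)
    noise_var = (target - X @ coeffs).var()
    return coeffs, noise_var

def ar_psd(coeffs, noise_var, freqs, fs=1.0):
    """Power spectral density of the fitted AR process."""
    omega = 2.0 * np.pi * np.asarray(freqs) / fs
    k = np.arange(1, len(coeffs) + 1)
    denom = 1.0 - np.exp(-1j * np.outer(omega, k)) @ coeffs
    return noise_var / (fs * np.abs(denom) ** 2)

# Simulated stationary AR(2) process with a low-frequency spectral peak.
rng = np.random.default_rng(0)
y = np.zeros(4096)
for t in range(2, len(y)):
    y[t] = 1.6 * y[t - 1] - 0.8 * y[t - 2] + rng.standard_normal()

coeffs, noise_var = fit_ar_least_squares(y, order=2)
freqs = np.linspace(0.0, 0.5, 256)
psd = ar_psd(coeffs, noise_var, freqs)
print("estimated AR coefficients:", coeffs)
```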

  18. Comparison between ultrasound and noncontrast helical computed tomography for identification of acute ureterolithiasis in a teaching hospital setting

    Directory of Open Access Journals (Sweden)

    Luís Ronan Marquez Ferreira de Souza

    2007-03-01

    Full Text Available CONTEXT AND OBJECTIVE: Recent studies have shown noncontrast computed tomography (NCT) to be more effective than ultrasound (US) for imaging acute ureterolithiasis. However, to our knowledge, there are few studies directly comparing these techniques in an emergency teaching hospital setting. The objectives of this study were to compare the diagnostic accuracy of US and NCT performed by senior radiology residents for diagnosing acute ureterolithiasis; and to assess interobserver agreement on tomography interpretations by residents and experienced abdominal radiologists. DESIGN AND SETTING: Prospective study of 52 consecutive patients, who underwent both US and NCT within an interval of eight hours, at Hospital São Paulo. METHODS: US scans were performed by senior residents and read by experienced radiologists. NCT scan images were read by senior residents, and subsequently by three abdominal radiologists. The interobserver variability was assessed using the kappa statistic. RESULTS: Ureteral calculi were found in 40 out of 52 patients (77%). US presented sensitivity of 22% and specificity of 100%. When collecting system dilatation was associated, US demonstrated 73% sensitivity, 82% specificity. The interobserver agreement in NCT analysis was very high with regard to identification of calculi, collecting system dilatation and stranding of perinephric fat. CONCLUSIONS: US has limited value for identifying ureteral calculi in comparison with NCT, even when collecting system dilatation is present. Residents and abdominal radiologists demonstrated excellent agreement rates for ureteral calculi, identification of collecting system dilatation and stranding of perinephric fat on NCT.

  19. Image Guided Virtual Autopsy: An Adjunct with Radiographic and Computed Tomography Modalities - An Important Tool in Forensic Identification

    Directory of Open Access Journals (Sweden)

    Shalu Rai

    2017-01-01

    Full Text Available The forensic examination of dead bodies is very helpful in identifying the person, the cause of death and the gender, and in solving mysterious cases. It includes a number of techniques, of which autopsy is the primary investigation performed in every medicolegal case. Because it involves mutilation of the body, the traditional autopsy technique is disturbing with respect to the emotions and rituals of relatives. The use of radiology in forensic science comprises the performance, interpretation, and reporting of radiographs, which helps in detecting changes that are not clinically visible. Forensic radiology plays an important role in the identification of humans in mass disasters, criminal investigations, and the evaluation of the cause of death. The introduction of radiological modalities into autopsy techniques is a complementary tool for forensic identification and is known as virtual autopsy. Advanced imaging techniques such as computed tomography (CT) and magnetic resonance imaging (MRI) are used in virtual autopsy in order to visualize and reconstruct the internal organs and determine the site, type, and depth of injury. This review elaborates the role of maxillofacial imaging in image-guided virtual autopsy.

  20. An eye movement study for identification of suitable font characters for presentation on a computer screen.

    Science.gov (United States)

    Banerjee, Jayeeta; Majumdar, Dhurjati; Majumdar, Deepti; Pal, Madhu Sudan

    2010-06-01

    We are experiencing a shift of media: from the printed paper to the computer screen. This transition is modifying how we read and understand text. It is very difficult to draw conclusions on the suitability of font characters based on subjective evaluation methods only. The present study evaluates the effect of font type on human cognitive workload during perception of individual alphabets on a computer screen. Twenty-six young subjects volunteered for this study. Subjects were shown individual characters of different font types and their eye movements were recorded using a binocular eye movement recorder. The results showed that eye movement parameters such as pupil diameter, number of fixations, and fixation duration were lower for the font type Verdana. The present study recommends the use of the font type Verdana for presentation of individual alphabets on electronic displays in order to reduce cognitive workload.

  1. Identification of a unique cause of ring artifact seen in computed tomography trans-axial images

    International Nuclear Information System (INIS)

    Jha, Ashish Kumar; Purandare, Nilendu C; Shah, Sneha; Agrawal, Archi; Puranik, Ameya D; Rangarajan, Venkatesh

    2013-01-01

    Artifacts present in computed tomography (CT) images often degrade the image quality and, ultimately, the diagnostic outcome. A ring artifact in the trans-axial image is caused by either a miscalibrated or a defective detector element in the detector row and is therefore usually categorized as a scanner-based artifact. A ring artifact detected on a trans-axial CT image of a positron emission tomography/computed tomography (PET/CT) examination was instead caused by contamination of the CT tube aperture by a droplet of injectable contrast medium. This artifact was corrected by removal of the contrast droplet from the CT tube aperture. The ring artifact is very common and frequently cited in the literature; our case puts forward an uncommon cause of this artifact and its method of correction, which has not been described in the existing literature.

  2. Identification and analysis of unsatisfactory psychosocial work situations: a participatory approach employing video-computer interaction.

    Science.gov (United States)

    Hanse, J J; Forsman, M

    2001-02-01

    A method for psychosocial evaluation of potentially stressful or unsatisfactory situations in manual work was developed. It focuses on subjective responses regarding specific situations and is based on interactive worker assessment when viewing video recordings of oneself. The worker is first video-recorded during work. The video is then displayed on the computer terminal, and the filmed worker clicks on virtual controls on the screen whenever an unsatisfactory psychosocial situation appears; a window of questions regarding psychological demands, mental strain and job control is then opened. A library with pictorial information and comments on the selected situations is formed in the computer. The evaluation system, called PSIDAR, was applied in two case studies, one of manual materials handling in an automotive workshop and one of a group of workers producing and testing instrument panels. The findings indicate that PSIDAR can provide data that are useful in a participatory ergonomic process of change.

  3. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1980-01-01

    A fault tree analysis package is described that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. A later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The computations are standard: identification of minimal cut-sets, estimation of reliability parameters, and ranking of the effect of the individual component failure modes and system failure modes on these parameters. The user can vary the fault trees and data on-line, and print selected data for preferred systems in a form suitable for inclusion in safety reports. A case history is given - that of the HIFAR containment isolation system. (author)
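    As a rough illustration of the standard computations the package automates, the sketch below evaluates a top-event probability from minimal cut-sets; the component names and failure probabilities are invented and do not refer to the HIFAR analysis:

```python
from itertools import combinations

def cut_set_probability(components, probs):
    """Probability that every basic event in a (cut) set occurs,
    assuming independent components."""
    p = 1.0
    for c in components:
        p *= probs[c]
    return p

def top_event_probability(min_cut_sets, probs):
    """Exact top-event probability by inclusion-exclusion over the
    minimal cut sets (fine for small systems like this one)."""
    total = 0.0
    for r in range(1, len(min_cut_sets) + 1):
        for combo in combinations(min_cut_sets, r):
            term = cut_set_probability(set().union(*combo), probs)
            total += term if r % 2 == 1 else -term
    return total

# Hypothetical data: two redundant isolation valves V1, V2 and a shared
# actuation signal S (values invented for illustration).
probs = {"V1": 1e-3, "V2": 1e-3, "S": 1e-4}
min_cut_sets = [{"V1", "V2"}, {"S"}]

print("exact:", top_event_probability(min_cut_sets, probs))
print("rare-event approx.:",
      sum(cut_set_probability(cs, probs) for cs in min_cut_sets))
```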

  4. Identification of an Adaptable Computer Program Design for Analyzing a Modular Organizational Assessment Instrument.

    Science.gov (United States)

    1981-09-01

    Keywords: survey-guided development; organizational effectiveness; computer program; organizational diagnosis; management.

  5. Conventional multi-slice computed tomography (CT) and cone-beam CT (CBCT) for computer-aided implant placement. Part II: reliability of mucosa-supported stereolithographic guides.

    Science.gov (United States)

    Arisan, Volkan; Karabuda, Zihni Cüneyt; Pişkin, Bülent; Özdemir, Tayfun

    2013-12-01

    Deviations of implants that were placed by conventional computed tomography (CT)- or cone beam CT (CBCT)-derived mucosa-supported stereolithographic (SLA) surgical guides were analyzed in this study. Eleven patients were randomly scanned by a multi-slice CT (CT group) or a CBCT scanner (CBCT group). A total of 108 implants were planned on the software and placed using SLA guides. A new CT or CBCT scan was obtained and merged with the planning data to identify the deviations between the planned and placed implants. Results were analyzed by Mann-Whitney U test and multiple regressions (p < .05). Mean angular and linear deviations in the CT group were 3.30° (SD 0.36), and 0.75 (SD 0.32) and 0.80 mm (SD 0.35) at the implant shoulder and tip, respectively. In the CBCT group, mean angular and linear deviations were 3.47° (SD 0.37), and 0.81 (SD 0.32) and 0.87 mm (SD 0.32) at the implant shoulder and tip, respectively. No statistically significant differences were detected between the CT and CBCT groups (p = .169 and p = .551, p = .113 for angular and linear deviations, respectively). Implant placement via CT- or CBCT-derived mucosa-supported SLA guides yielded similar deviation values. Results should be confirmed on alternative CBCT scanners. © 2012 Wiley Periodicals, Inc.

  6. Computed tomography for the detection of distal radioulnar joint instability: normal variation and reliability of four CT scoring systems in 46 patients

    Energy Technology Data Exchange (ETDEWEB)

    Wijffels, Mathieu; Krijnen, Pieta; Schipper, Inger [Leiden University Medical Center, Department of Surgery-Trauma Surgery, P.O. Box 9600, Leiden (Netherlands); Stomp, Wouter; Reijnierse, Monique [Leiden University Medical Center, Department of Radiology, P.O. Box 9600, Leiden (Netherlands)

    2016-11-15

    The diagnosis of distal radioulnar joint (DRUJ) instability is clinically challenging. Computed tomography (CT) may aid in the diagnosis, but the reliability and normal variation for DRUJ translation on CT have not been established in detail. The aim of this study was to evaluate inter- and intraobserver agreement and normal ranges of CT scoring methods for determination of DRUJ translation in both posttraumatic and uninjured wrists. Patients with a conservatively treated, unilateral distal radius fracture were included. CT scans of both wrists were evaluated independently, by two readers using the radioulnar line method, subluxation ratio method, epicenter method and radioulnar ratio method. The inter- and intraobserver agreement was assessed and normal values were determined based on the uninjured wrists. Ninety-two wrist CTs (mean age: 56.5 years, SD: 17.0, mean follow-up 4.2 years, SD: 0.5) were evaluated. Interobserver agreement was best for the epicenter method [ICC = 0.73, 95 % confidence interval (CI) 0.65-0.79]. Intraobserver agreement was almost perfect for the radioulnar line method (ICC = 0.82, 95 % CI 0.77-0.87). Each method showed a wide normal range for normal DRUJ translation. Normal range for the epicenter method is -0.35 to -0.06 in pronation and -0.11 to 0.19 in supination. DRUJ translation on CT in pro- and supination can be reliably evaluated in both normal and posttraumatic wrists, however with large normal variation. The epicenter method seems the most reliable. Scanning of both wrists might be helpful to prevent the radiological overdiagnosis of instability. (orig.)

  7. Reliability measures of a computer system with priority to PM over the H/W repair activities subject to MOT and MRT

    Directory of Open Access Journals (Sweden)

    Ashish Kumar

    2015-01-01

    Full Text Available This paper concentrates on the evaluation of reliability measures of a computer system of two identical units with independent failures of h/w and s/w components. Initially one unit is operative and the other is kept as a spare in cold standby. There is a single server who visits the system immediately whenever needed. The server conducts preventive maintenance of the unit after a maximum operation time. If the server is unable to repair the h/w components within the maximum repair time, the components in the unit are immediately replaced by new ones. However, the s/w components are only replaced upon their failure. Priority is given to preventive maintenance over the repair activities of the h/w. The time to failure of the components follows a negative exponential distribution, whereas the distributions of preventive maintenance, repair and replacement times are taken as arbitrary. The expressions for some important reliability measures of system effectiveness have been derived using a semi-Markov process and the regenerative point technique. The graphical behavior of the results has also been shown for a particular case.
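    A simplified Monte Carlo sketch of the underlying two-unit cold-standby model (exponential failure and repair, single repair server, no preventive maintenance, replacement or priority rules, so most features analysed in the paper are omitted); all rates are hypothetical:

```python
import random
import statistics

def time_to_system_failure(fail_rate, repair_rate, rng):
    """One history of a two-unit cold-standby system with a single repair
    server: the system fails when the operating unit fails while the other
    unit is still under repair."""
    t = 0.0
    repair_remaining = 0.0                     # 0.0 means the spare unit is ready
    while True:
        run = rng.expovariate(fail_rate)       # operating time until failure
        t += run
        if repair_remaining > run:
            return t                           # spare not ready -> system down
        repair_remaining = rng.expovariate(repair_rate)  # failed unit enters repair

rng = random.Random(42)
samples = [time_to_system_failure(fail_rate=1e-3, repair_rate=1e-1, rng=rng)
           for _ in range(10_000)]
print("estimated mean time to system failure:", statistics.mean(samples))
```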

  8. Reliability of cone beam computed tomography as a biopsy-independent tool in differential diagnosis of periapical cysts and granulomas: An In vivo Study.

    Science.gov (United States)

    Chanani, Ankit; Adhikari, Haridas Das

    2017-01-01

    Differential diagnosis of periapical cysts and granulomas is required as their treatment modalities are different. The aim of this study was to evaluate the efficacy of cone beam computed tomography (CBCT) in the differential diagnosis of periapical cysts from granulomas. A single-centered observational study was carried out in the Department of Conservative Dentistry and Endodontics, Dr. R. Ahmed Dental College and Hospital, using CBCT and a dental operating microscope. Forty-five lesions were analyzed using CBCT scans. One evaluator analyzed each CBCT scan for the presence of the following six characteristic radiological features: cyst-like location, shape, periphery, internal structure, effect on the surrounding structures, and cortical plate perforation. Another independent evaluator analyzed the CBCT scans. This process was repeated after 6 months, and inter- and intrarater reliability of CBCT diagnoses was evaluated. Periapical surgeries were performed and tissue samples were obtained for histopathological analysis. To evaluate the efficacy, CBCT diagnoses were compared with histopathological diagnoses, and six receiver operating characteristic (ROC) curve analyses were conducted. The ROC curve, Cronbach's alpha (α) test, and Cohen's kappa (κ) test were used for statistical analysis. Both inter- and intrarater reliability were excellent (α = 0.94, κ = 0.75 and 0.77, respectively). The ROC curve with regard to ≥4 positive findings revealed the highest area under the curve (0.66). CBCT is moderately accurate in the differential diagnosis of periapical cysts and granulomas.

  9. Computation of reliable textural indices from multimodal brain MRI: suggestions based on a study of patients with diffuse intrinsic pontine glioma

    Science.gov (United States)

    Goya-Outi, Jessica; Orlhac, Fanny; Calmon, Raphael; Alentorn, Agusti; Nioche, Christophe; Philippe, Cathy; Puget, Stéphanie; Boddaert, Nathalie; Buvat, Irène; Grill, Jacques; Frouin, Vincent; Frouin, Frederique

    2018-05-01

    Few methodological studies regarding widely used textural indices robustness in MRI have been reported. In this context, this study aims to propose some rules to compute reliable textural indices from multimodal 3D brain MRI. Diagnosis and post-biopsy MR scans including T1, post-contrast T1, T2 and FLAIR images from thirty children with diffuse intrinsic pontine glioma (DIPG) were considered. The hybrid white stripe method was adapted to standardize MR intensities. Sixty textural indices were then computed for each modality in different regions of interest (ROI), including tumor and white matter (WM). Three types of intensity binning were compared : constant bin width and relative bounds; constant number of bins and relative bounds; constant number of bins and absolute bounds. The impact of the volume of the region was also tested within the WM. First, the mean Hellinger distance between patient-based intensity distributions decreased by a factor greater than 10 in WM and greater than 2.5 in gray matter after standardization. Regarding the binning strategy, the ranking of patients was highly correlated for 188/240 features when comparing with , but for only 20 when comparing with , and nine when comparing with . Furthermore, when using or texture indices reflected tumor heterogeneity as assessed visually by experts. Last, 41 features presented statistically significant differences between contralateral WM regions when ROI size slightly varies across patients, and none when using ROI of the same size. For regions with similar size, 224 features were significantly different between WM and tumor. Valuable information from texture indices can be biased by methodological choices. Recommendations are to standardize intensities in MR brain volumes, to use intensity binning with constant bin width, and to define regions with the same volumes to get reliable textural indices.
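    The recommended binning strategy (constant bin width with bounds relative to the region) is easy to sketch; the bin width, bin count and simulated ROI below are arbitrary placeholders rather than the study's settings:

```python
import numpy as np

def bin_constant_width(intensities, bin_width=25.0):
    """Constant bin width with bounds relative to the region
    (the strategy recommended by the study)."""
    x = np.asarray(intensities, dtype=float)
    return np.floor((x - x.min()) / bin_width).astype(int) + 1

def bin_constant_number(intensities, n_bins=64):
    """Constant number of bins with bounds relative to the region
    (shown for comparison; rankings of texture indices may change)."""
    x = np.asarray(intensities, dtype=float)
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    return np.clip(np.digitize(x, edges), 1, n_bins)

# Standardized intensities from a hypothetical tumour ROI.
rng = np.random.default_rng(1)
roi = rng.normal(loc=300.0, scale=40.0, size=5000)
print(bin_constant_width(roi).max(), "bins with constant width,",
      bin_constant_number(roi).max(), "bins with a fixed count")
```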

  10. Travel reliability inventory for Chicago.

    Science.gov (United States)

    2013-04-01

    The overarching goal of this research project is to enable state DOTs to document and monitor the reliability performance : of their highway networks. To this end, a computer tool, TRIC, was developed to produce travel reliability inventories from : ...

  11. In silico fragmentation for computer assisted identification of metabolite mass spectra

    Directory of Open Access Journals (Sweden)

    Müller-Hannemann Matthias

    2010-03-01

    Full Text Available Background: Mass spectrometry has become the analytical method of choice in metabolomics research. The identification of unknown compounds is the main bottleneck. In addition to the precursor mass, tandem MS spectra carry informative fragment peaks, but spectral libraries of measured reference compounds are far from covering the complete chemical space. Compound libraries such as PubChem or KEGG describe a larger number of compounds, whose in silico fragmentation can be compared with spectra of unknown metabolites. Results: We created the MetFrag suite to obtain a candidate list from compound libraries based on the precursor mass, subsequently ranked by the agreement between measured and in silico fragments. In the evaluation, MetFrag was able to rank most of the correct compounds within the top 3 candidates returned by an exact mass query in KEGG. Compared to a previously published study, MetFrag obtained better results than the commercial MassFrontier software. Especially for large compound libraries, the candidates with a good score show a high structural similarity or differ only in stereochemistry; a subsequent clustering based on chemical distances reduces this redundancy. The in silico fragmentation requires less than a second to process a molecule, and MetFrag performs a search in KEGG or PubChem on average within 30 to 300 seconds, respectively, on an average desktop PC. Conclusions: We presented a method that is able to identify small molecules from tandem MS measurements, even without spectral reference data or a large set of fragmentation rules. With today's massive general purpose compound libraries we obtain dozens of very similar candidates, which still allows a confident estimate of the correct compound class. Our tool MetFrag improves the identification of unknown substances from tandem MS spectra and delivers better results than comparable commercial software. MetFrag is available through a web
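    A toy sketch of the candidate-ranking idea (score each library candidate by how much measured fragment intensity its in silico fragment masses explain); the peak list, candidate names and scoring are simplified stand-ins, since MetFrag additionally weights fragments by bond dissociation cost:

```python
def score_candidate(measured_peaks, fragment_masses, tol_mz=0.01):
    """Fraction of measured fragment intensity explained by at least one
    in silico fragment mass of the candidate (within tol_mz)."""
    matched = sum(intensity for mz, intensity in measured_peaks
                  if any(abs(mz - f) <= tol_mz for f in fragment_masses))
    total = sum(intensity for _, intensity in measured_peaks) or 1.0
    return matched / total

def rank_candidates(measured_peaks, candidates, tol_mz=0.01):
    """Sort candidates by descending explained intensity."""
    return sorted(((score_candidate(measured_peaks, frags, tol_mz), name)
                   for name, frags in candidates.items()), reverse=True)

# Hypothetical tandem-MS peak list (m/z, relative intensity) and two
# candidates returned by a precursor-mass query.
peaks = [(91.054, 100.0), (119.049, 45.0), (147.044, 20.0)]
candidates = {
    "candidate_A": [91.0542, 119.0491, 147.0441],
    "candidate_B": [77.0386, 105.0335],
}
print(rank_candidates(peaks, candidates))
```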

  12. Reliability of construction materials

    International Nuclear Information System (INIS)

    Merz, H.

    1976-01-01

    One can also speak of reliability with respect to materials. While for the reliability of components the MTBF (mean time between failures) is regarded as the main criterion, for materials this is replaced by possible failure mechanisms such as physical/chemical reaction mechanisms, disturbances of physical or chemical equilibrium, or other interactions or changes of the system. The main tasks of the reliability analysis of materials are therefore the prediction of the various failure causes, the identification of interactions, and the development of nondestructive testing methods. (RW) [de

  13. Computational Identification of Genomic Features That Influence 3D Chromatin Domain Formation.

    Science.gov (United States)

    Mourad, Raphaël; Cuvier, Olivier

    2016-05-01

    Recent advances in long-range Hi-C contact mapping have revealed the importance of the 3D structure of chromosomes in gene expression. A current challenge is to identify the key molecular drivers of this 3D structure. Several genomic features, such as architectural proteins and functional elements, were shown to be enriched at topological domain borders using classical enrichment tests. Here we propose multiple logistic regression to identify those genomic features that positively or negatively influence domain border establishment or maintenance. The model is flexible, and can account for statistical interactions among multiple genomic features. Using both simulated and real data, we show that our model outperforms enrichment test and non-parametric models, such as random forests, for the identification of genomic features that influence domain borders. Using Drosophila Hi-C data at a very high resolution of 1 kb, our model suggests that, among architectural proteins, BEAF-32 and CP190 are the main positive drivers of 3D domain borders. In humans, our model identifies well-known architectural proteins CTCF and cohesin, as well as ZNF143 and Polycomb group proteins as positive drivers of domain borders. The model also reveals the existence of several negative drivers that counteract the presence of domain borders including P300, RXRA, BCL11A and ELK1.
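    A minimal sketch of the modelling approach (multiple logistic regression of border status on genomic features); the feature set, simulated signal and coefficients are hypothetical and only illustrate how positive and negative drivers would be read off the fitted model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical design matrix: one row per genomic bin, one column per
# feature (e.g. binding signal of an architectural protein), and a binary
# label marking topological domain borders. Everything below is simulated.
rng = np.random.default_rng(0)
features = ["CTCF", "cohesin", "ZNF143", "P300"]
X = rng.normal(size=(5000, len(features)))
logit = 1.5 * X[:, 0] + 1.0 * X[:, 1] + 0.5 * X[:, 2] - 0.8 * X[:, 3] - 2.0
y = rng.random(len(X)) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
for name, coef in zip(features, model.coef_[0]):
    sign = "positive" if coef > 0 else "negative"
    print(f"{name}: {coef:+.2f} ({sign} association with borders)")
```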

  14. TAREAN: a computational tool for identification and characterization of satellite DNA from unassembled short reads

    Czech Academy of Sciences Publication Activity Database

    Novák, Petr; Ávila Robledillo, Laura; Koblížková, Andrea; Vrbová, Iva; Neumann, Pavel; Macas, Jiří

    2017-01-01

    Vol. 45, No. 12 (2017), article No. e111. ISSN 0305-1048. R&D Projects: GA ČR GBP501/12/G090; GA MŠk(CZ) LM2015047. Institutional support: RVO:60077344. Keywords: in-situ hybridization * repetitive sequences * tandem repeats * vicia-faba. Subject RIV: EB - Genetics; Molecular Biology. OECD field: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8). Impact factor: 10.162, year: 2016

  15. Cutting edge: identification of novel T cell epitopes in Lol p5a by computational prediction.

    Science.gov (United States)

    de Lalla, C; Sturniolo, T; Abbruzzese, L; Hammer, J; Sidoli, A; Sinigaglia, F; Panina-Bordignon, P

    1999-08-15

    Although atopic allergy affects Lol p5a allergen from rye grass. In vitro binding studies confirmed the promiscuous binding characteristics of these peptides. Moreover, most of the predicted ligands were novel T cell epitopes that were able to stimulate T cells from atopic patients. We generated a panel of Lol p5a-specific T cell clones, the majority of which recognized the peptides in a cross-reactive fashion. The computational prediction of DR ligands might thus allow the design of T cell epitopes with potential useful application in novel immunotherapy strategies.

  16. The identification of spinal pathology in chronic low back pain using single photon emission computed tomography

    International Nuclear Information System (INIS)

    Ryan, R.J.; Gibson, T.; Fogelman, I.

    1992-01-01

    Single photon emission computed tomography (SPECT) findings were investigated in 80 consecutive patients (aged 18-70 years, median 44) referred to a rheumatology outpatient clinic with low back pain persisting for more than 3 months. Lesions of the lumbar spine were demonstrated in 60% of patients using SPECT but in only 35% with planar imaging. Fifty-one per cent of all lesions were only detected by SPECT, and lesions visualized on SPECT could be precisely localized to the vertebral body, or different parts of the posterior elements. Fifty per cent of lesions involved the facetal joints of which almost 60% were identified on SPECT alone. X-rays of the lumbar spine, with posterior oblique views, failed to demonstrate abnormalities corresponding to almost all SPECT posterior element lesions although it identified abnormalities corresponding to over 60% of anterior SPECT lesions. Computed tomography (CT) was performed in 30 patients with a SPECT lesion and sites of facetal joint activity corresponded to facetal osteoarthritis in 82%. (author)

  17. TAREAN: a computational tool for identification and characterization of satellite DNA from unassembled short reads.

    Science.gov (United States)

    Novák, Petr; Ávila Robledillo, Laura; Koblížková, Andrea; Vrbová, Iva; Neumann, Pavel; Macas, Jirí

    2017-07-07

    Satellite DNA is one of the major classes of repetitive DNA, characterized by tandemly arranged repeat copies that form contiguous arrays up to megabases in length. This type of genomic organization makes satellite DNA difficult to assemble, which hampers characterization of satellite sequences by computational analysis of genomic contigs. Here, we present tandem repeat analyzer (TAREAN), a novel computational pipeline that circumvents this problem by detecting satellite repeats directly from unassembled short reads. The pipeline first employs graph-based sequence clustering to identify groups of reads that represent repetitive elements. Putative satellite repeats are subsequently detected by the presence of circular structures in their cluster graphs. Consensus sequences of repeat monomers are then reconstructed from the most frequent k-mers obtained by decomposing read sequences from corresponding clusters. The pipeline performance was successfully validated by analyzing low-pass genome sequencing data from five plant species where satellite DNA was previously experimentally characterized. Moreover, novel satellite repeats were predicted for the genome of Vicia faba and three of these repeats were verified by detecting their sequences on metaphase chromosomes using fluorescence in situ hybridization. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
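    A toy sketch of the k-mer idea behind monomer reconstruction (the most frequent k-mer is extended greedily until the walk closes a circle); it ignores the graph-based read clustering that TAREAN performs first, and the monomer and reads are simulated:

```python
from collections import Counter

def kmer_counts(reads, k):
    """Count all k-mers across the read set."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def greedy_monomer(reads, k=7, max_steps=200):
    """Start from the most frequent k-mer and repeatedly append the base
    with the most frequent (k-1)-overlap extension; when the walk returns
    to the starting k-mer, the bases visited spell one repeat period."""
    counts = kmer_counts(reads, k)
    start = max(counts, key=counts.get)
    consensus, current, steps = start, start, 0
    while steps < max_steps:
        prefix = current[1:]
        best = max("ACGT", key=lambda b: counts.get(prefix + b, 0))
        current = prefix + best
        steps += 1
        if current == start:           # closed the circle: one full period
            return consensus[:steps]
        consensus += best
    return consensus                    # no circular structure found

# Simulated low-pass reads from a tandem array of the monomer ACGTTGACC.
array = "ACGTTGACC" * 20
reads = [array[i:i + 30] for i in range(0, len(array) - 30, 7)]
print(greedy_monomer(reads))            # expected to recover a 9-bp monomer
```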

  18. Computer system for identification of tool wear model in hot forging

    Directory of Open Access Journals (Sweden)

    Wilkus Marek

    2016-01-01

    Full Text Available The aim of the research was to create a methodology that enables effective and reliable prediction of tool wear. The idea of a hybrid model, which accounts for various mechanisms of tool material deterioration, is proposed in the paper. The mechanisms considered include abrasive wear, adhesive wear, thermal fatigue, mechanical fatigue, oxidation and plastic deformation. Individual models of various complexity were used for the separate phenomena, and a strategy for combining these models into one hybrid system was developed to account for the synergy of the various mechanisms. The complex hybrid model was built on the basis of these individual models for the various wear mechanisms. The individual models range from phenomenological ones for abrasive wear to multi-scale methods for modelling micro-crack initiation and propagation utilizing virtual representations of granular microstructures. The latter have been intensively developed recently and form a potentially powerful tool that allows modelling of thermal and mechanical fatigue, accounting explicitly for the tool material microstructure.

  19. Rapid and reliable identification of Gram-negative bacteria and Gram-positive cocci by deposition of bacteria harvested from blood cultures onto the MALDI-TOF plate.

    OpenAIRE

    Barnini, S; Ghelardi, Emilia; Brucculeri, V; Morici, Paola; Lupetti, Antonella

    2015-01-01

    Background Rapid identification of the causative agent(s) of bloodstream infections using the matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) methodology can lead to increased empirical antimicrobial therapy appropriateness. Herein, we aimed at establishing an easier and simpler method, further referred to as the direct method, using bacteria harvested by serum separator tubes from positive blood cultures and placed onto the polished steel target plate for rapid identif...

  20. Jimena: efficient computing and system state identification for genetic regulatory networks.

    Science.gov (United States)

    Karl, Stefan; Dandekar, Thomas

    2013-10-11

    Boolean networks capture switching behavior of many naturally occurring regulatory networks. For semi-quantitative modeling, interpolation between ON and OFF states is necessary. The high degree polynomial interpolation of Boolean genetic regulatory networks (GRNs) in cellular processes such as apoptosis or proliferation allows for the modeling of a wider range of node interactions than continuous activator-inhibitor models, but suffers from scaling problems for networks which contain nodes with more than ~10 inputs. Many GRNs from literature or new gene expression experiments exceed those limitations and a new approach was developed. (i) As a part of our new GRN simulation framework Jimena we introduce and setup Boolean-tree-based data structures; (ii) corresponding algorithms greatly expedite the calculation of the polynomial interpolation in almost all cases, thereby expanding the range of networks which can be simulated by this model in reasonable time. (iii) Stable states for discrete models are efficiently counted and identified using binary decision diagrams. As application example, we show how system states can now be sampled efficiently in small up to large scale hormone disease networks (Arabidopsis thaliana development and immunity, pathogen Pseudomonas syringae and modulation by cytokinins and plant hormones). Jimena simulates currently available GRNs about 10-100 times faster than the previous implementation of the polynomial interpolation model and even greater gains are achieved for large scale-free networks. This speed-up also facilitates a much more thorough sampling of continuous state spaces which may lead to the identification of new stable states. Mutants of large networks can be constructed and analyzed very quickly enabling new insights into network robustness and behavior.
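    A brute-force sketch of stable-state identification for a tiny Boolean GRN; Jimena itself relies on Boolean-tree data structures, polynomial interpolation and binary decision diagrams to handle large networks, and the three-node network below is hypothetical:

```python
from itertools import product

# A toy three-node gene regulatory network (interactions invented for
# illustration): A sustains itself or is activated by C, B needs A and the
# absence of C, and B represses C.
nodes = ["A", "B", "C"]
rules = {
    "A": lambda s: s["A"] or s["C"],
    "B": lambda s: s["A"] and not s["C"],
    "C": lambda s: not s["B"],
}

def stable_states(nodes, rules):
    """Enumerate fixed points: states mapped onto themselves by a
    synchronous update of every node."""
    fixed = []
    for bits in product([False, True], repeat=len(nodes)):
        state = dict(zip(nodes, bits))
        if {n: rules[n](state) for n in nodes} == state:
            fixed.append(state)
    return fixed

for s in stable_states(nodes, rules):
    print({n: int(v) for n, v in s.items()})
```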

  1. Computer-aided structure analysis. Structure identification by infrared and ¹³C NMR measurements

    Energy Technology Data Exchange (ETDEWEB)

    Szalontai, G; Simon, Z; Csapo, Z; Farkas, M; Pfeifer, Gy [Nehezvegyipari Kutato Intezet, Veszprem (Hungary)

    1980-01-01

    The results obtained from the computer-aided interpretation of ¹³C NMR and IR spectra using the artificial intelligence approach are presented. In its present state the output of the system is a list of functional groups which are reasonable candidates for the final structural isomers. The input requires the empirical formula, ¹³C NMR data (including off-resonance data) and IR spectral data. The confirmation of the presence of a functional group is based on comparison of the experimental data with the spectral properties of functional groups stored in a property matrix. If the molecular weight of the compounds studied is less than or equal to 500, the output usually contains 1.5-2.5 times more groups than are really present, in most cases without the loss of the real ones.

  2. Identification and red blood cell automated counting from blood smear images using computer-aided system.

    Science.gov (United States)

    Acharya, Vasundhara; Kumar, Preetham

    2018-03-01

    Red blood cell count plays a vital role in assessing the overall health of the patient. Hospitals use the hemocytometer to count blood cells. The conventional method of placing the smear under a microscope and counting the cells manually leads to erroneous results, and medical laboratory technicians are put under stress. A computer-aided system helps to attain precise results in less time. This research work proposes an image-processing technique for counting the number of red blood cells. It aims to examine and process the blood smear image in order to support the counting of red blood cells and identify the number of normal and abnormal cells in the image automatically. The K-medoids algorithm, which is robust to external noise, is used to extract the WBCs from the image. Granulometric analysis is used to separate the red blood cells from the white blood cells. The red blood cells obtained are counted using the labeling algorithm and the circular Hough transform. The radius range for the circle-drawing algorithm is estimated by computing the distance of the pixels from the boundary, which automates the entire algorithm. A comparison is made between the counts obtained using the labeling algorithm and the circular Hough transform. Results of the work showed that the circular Hough transform was more accurate in counting the red blood cells than the labeling algorithm, as it was successful in identifying even the overlapping cells. The work also compares the results of cell counts obtained using the proposed methodology and the manual approach. It is designed to address the drawbacks of previous research work and can be extended to extract various texture and shape features of the abnormal cells identified, so that diseases like anemia of inflammation and chronic disease can be detected at the earliest stage.
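    A small sketch of the counting step using the circular Hough transform (scikit-image); the radius range and thresholds are placeholders for the boundary-distance estimate described above, and the WBC-removal and granulometry steps are assumed to have been applied beforehand:

```python
import numpy as np
from skimage import color, feature, transform

def count_red_cells(rgb_image, radius_range=(18, 30), max_cells=500):
    """Detect roughly circular cells with the circular Hough transform."""
    gray = color.rgb2gray(rgb_image)
    edges = feature.canny(gray, sigma=2.0)
    radii = np.arange(radius_range[0], radius_range[1] + 1)
    hough = transform.hough_circle(edges, radii)
    _, cx, cy, found_radii = transform.hough_circle_peaks(
        hough, radii,
        min_xdistance=radius_range[0], min_ydistance=radius_range[0],
        total_num_peaks=max_cells, threshold=0.4 * hough.max())
    return len(cx), list(zip(cx, cy, found_radii))

# Usage, assuming a smear image file (file name is a placeholder):
# from skimage import io
# n_cells, detections = count_red_cells(io.imread("smear.png"))
# print(n_cells, "cells detected")
```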

  3. Rapid identification of pearl powder from Hyriopsis cumingii by Tri-step infrared spectroscopy combined with computer vision technology

    Science.gov (United States)

    Liu, Siqi; Wei, Wei; Bai, Zhiyi; Wang, Xichang; Li, Xiaohong; Wang, Chuanxian; Liu, Xia; Liu, Yuan; Xu, Changhua

    2018-01-01

    Pearl powder, an important raw material in cosmetics and Chinese patent medicines, is commonly uneven in quality and frequently adulterated with low-cost shell powder in the market. The aim of this study is to establish an adequate approach based on Tri-step infrared spectroscopy with increasing resolution, combined with chemometrics, for qualitative identification of pearl powder originating from three different quality grades of pearls and quantitative prediction of the proportions of shell powder adulterated in pearl powder. Additionally, computer vision technology (E-eyes) can investigate the color difference among different pearl powders and make it traceable to the pearl quality trait of visual color categories. Though the different grades of pearl powder and adulterated pearl powder have almost identical IR spectra, the SD-IR peak intensity at about 861 cm⁻¹ (v2 band) exhibited regular enhancement with increasing quality grade of pearls, while the 1082 cm⁻¹ (v1 band), 712 cm⁻¹ and 699 cm⁻¹ (v4 band) showed the reverse trend. In contrast, only the peak intensity at 862 cm⁻¹ was enhanced regularly with increasing concentration of shell powder. Thus, the bands in the ranges of 1550-1350 cm⁻¹ and 730-680 cm⁻¹, and of 830-880 cm⁻¹ and 690-725 cm⁻¹, could serve as exclusive ranges to discriminate the three distinct pearl powders and to identify adulteration, respectively. For massive sample analysis, a qualitative classification model and a quantitative prediction model based on the IR spectra were established by principal component analysis (PCA) and partial least squares (PLS), respectively. The developed method demonstrated great potential for pearl powder quality control and authenticity identification in a direct, holistic manner.
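    A compact sketch of the chemometric step (PCA for qualitative grouping, PLS for quantitative prediction of the shell-powder fraction); the spectra are simulated and the band position, sample count and number of components are arbitrary choices:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

# Simulated spectra: rows are samples, columns are wavenumber points in the
# diagnostic window; y is the spiked shell-powder fraction (all values are
# placeholders, not the study's data).
rng = np.random.default_rng(0)
n_samples, n_points = 60, 120
adulteration = rng.uniform(0.0, 0.5, size=n_samples)
band = np.exp(-0.5 * ((np.arange(n_points) - 40) / 6.0) ** 2)
spectra = np.outer(adulteration, band) + 0.02 * rng.normal(size=(n_samples, n_points))

scores = PCA(n_components=2).fit_transform(spectra)             # qualitative grouping
pls = PLSRegression(n_components=3).fit(spectra, adulteration)  # quantitative model
print("first PCA scores:", scores[:3, 0])
print("PLS R^2 on the training set:", pls.score(spectra, adulteration))
```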

  4. MULTIDETECTOR COMPUTED TOMOGRAPHY FOR IDENTIFICATION OF INSTABILITY OF AORTIC ANEURYSM WALL

    Directory of Open Access Journals (Sweden)

    M. V. Vishnyakova Jr.

    2015-01-01

    Full Text Available Background: Aortic aneurysm is characterized by high incidence, polymorphic clinical features and sudden onset of severe complications. Aim: To develop a standard multidetector computed tomography (MDCT) protocol for aortic aneurysm examination and image analysis for detection of the signs of aortic wall instability. Materials and methods: The data of 279 patients with aortic aneurysm who underwent MDCT examination during 2009–2014 were analyzed to identify aortic wall instability signs. Results: A complicated course of aortic aneurysm was observed in 100 cases (36%). The most common sign of aortic wall instability was aortic dissection. According to our results, a new definition of aortic aneurysm complications was elaborated. It included signs of aortic wall instability with incomplete and/or complete disruption of aortic wall layers. A scheme of the most common patterns of aortic wall abnormalities was proposed, allowing a radiologist to reach high accuracy in characterizing this pathology. Conclusion: A dedicated MDCT protocol for aortic aneurysm detection and image analysis can increase the quality of radiologic assessment of the aneurysm wall, allowing it to approach the level of histological accuracy.

  5. Seismocardiography-Based Cardiac Computed Tomography Gating Using Patient-Specific Template Identification and Detection.

    Science.gov (United States)

    Yao, Jingting; Tridandapani, Srini; Wick, Carson A; Bhatti, Pamela T

    2017-01-01

    To more accurately trigger cardiac computed tomography angiography (CTA) than electrocardiography (ECG) alone, a sub-system is proposed as an intermediate step toward fusing ECG with seismocardiography (SCG). Accurate prediction of quiescent phases is crucial to prospectively gating CTA, which is susceptible to cardiac motion and, thus, can affect the diagnostic quality of images. The key innovation of this sub-system is that it identifies the SCG waveform corresponding to heart sounds and determines their phases within the cardiac cycles. Furthermore, this relationship is modeled as a linear function with respect to heart rate. For this paper, B-mode echocardiography is used as the gold standard for identifying the quiescent phases. We analyzed synchronous ECG, SCG, and echocardiography data acquired from seven healthy subjects (mean age: 31; age range: 22-48; males: 4) and 11 cardiac patients (mean age: 56; age range: 31-78; males: 6). On average, the proposed algorithm was able to successfully identify 79% of the SCG waveforms in systole and 68% in diastole. The simulated results show that SCG-based prediction produced less average phase error than that of ECG. It was found that the accuracy of ECG-based gating is more susceptible to increases in heart rate variability, while SCG-based gating is susceptible to high cycle to cycle variability in morphology. This pilot work of prediction using SCG waveforms enriches the framework of a comprehensive system with multiple modalities that could potentially, in real time, improve the image quality of CTA.

  6. Computational identification of potent inhibitors for Streptomycin 3″-adenylyltransferase of Serratia marcescens.

    Science.gov (United States)

    Prabhu, Dhamodharan; Vidhyavathi, Ramasamy; Jeyakanthan, Jeyaraman

    2017-02-01

    Serratia marcescens is an opportunistic pathogen responsible for respiratory and urinary tract infections in humans. The antibiotic resistance mechanism of S. marcescens is mediated through an aminoglycoside-modifying enzyme that transfers an adenyl group from the substrate to the antibiotic through regiospecific transfer, inactivating the antibiotic. Streptomycin 3″-adenylyltransferase acts on the 3' position of the antibiotic and is considered a novel drug target to overcome bacterial antibiotic resistance. To date, there is no experimentally solved crystal structure of Streptomycin 3″-adenylyltransferase in S. marcescens. Hence, the present study was initiated to construct the three-dimensional structure of Streptomycin 3″-adenylyltransferase in order to understand the binding mechanism. The modeled structure was subjected to structure-based virtual screening to identify potent compounds from five chemical structure databases. Furthermore, different computational methods such as molecular docking, molecular dynamics simulations, ADME toxicity assessment, free energy and density functional theory calculations predicted the structural, binding and pharmacokinetic properties of the best five compounds. Overall, the results suggested that the stable binding conformation of the five potent compounds was mediated through hydrophobic, π-π stacking, salt bridge and hydrogen bond interactions. The identified compounds could pave the way for the development of anti-pathogenic agents as potential drug entities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Identification and simulation of the power quality problems using computer models

    International Nuclear Information System (INIS)

    Abro, M.R.; Memon, A.P.; Memon, Z.A.

    2005-01-01

    Power quality has become a major factor in our lives. If the quality of power delivered over the electrical power network is degraded, serious problems can arise within the modern social structure and its conveniences. The nonlinear characteristics of various office and industrial equipment connected to the power grid can cause electrical disturbances leading to poor power quality. In many cases the electric power consumed is first converted to a different form, and such conversion processes introduce harmonic pollution into the grid. These electrical disturbances can destroy certain sensitive equipment connected to the grid or, in some cases, cause it to malfunction. In a huge power network, identifying the source of such disturbances without causing interruption to the supply is a big problem. This paper attempts to study the power quality problems caused by typical loads using computer models, paving the way to identify the source of the problem. The PSB (Power System Blockset) Toolbox of MATLAB is used for this paper, which is designed to provide a modern tool that rapidly and easily builds models and simulates the power system. The blockset uses the Simulink environment, allowing a model to be built using simple click-and-drag procedures. (author)

  8. Three-dimensional maximum principal strain using cardiac computed tomography for identification of myocardial infarction

    Energy Technology Data Exchange (ETDEWEB)

    Tanabe, Yuki; Kido, Teruhito; Kurata, Akira; Sawada, Shun; Suekuni, Hiroshi; Kido, Tomoyuki; Yokoi, Takahiro; Miyagawa, Masao; Mochizuki, Teruhito [Ehime University Graduate School of Medicine, Department of Radiology, Toon City, Ehime (Japan); Uetani, Teruyoshi; Inoue, Katsuji [Ehime University Graduate School of Medicine, Department of Cardiology, Pulmonology, Hypertension and Nephrology, Toon City, Ehime (Japan)

    2017-04-15

    To evaluate the feasibility of three-dimensional (3D) maximum principal strain (MP-strain) derived from cardiac computed tomography (CT) for detecting myocardial infarction (MI). Forty-three patients who underwent cardiac CT and magnetic resonance imaging (MRI) were retrospectively selected. Using the voxel tracking of motion coherence algorithm, the peak CT MP-strain was measured using the 16-segment model. With the trans-mural extent of late gadolinium enhancement (LGE) and the distance from MI, all segments were classified into four groups (infarcted, border, adjacent, and remote segments); infarcted and border segments were defined as MI with LGE positive. Diagnostic performance of MP-strain for detecting MI was compared with per cent systolic wall thickening (%SWT) assessed by MRI using receiver-operating characteristic curve analysis at a segment level. Of 672 segments excluding16 segments influenced by artefacts, 193 were diagnosed as MI. Sensitivity and specificity of peak MP-strain to identify MI were 81 % [95 % confidence interval (95 % CI): 74-88 %] and 86 % (81-92 %) compared with %SWT: 76 % (60-95 %) and 68 % (48-84 %), respectively. The area under the curve of peak MP-strain was superior to %SWT [0.90 (0.87-0.93) vs. 0.80 (0.76-0.83), p < 0.05]. CT MP-strain has a potential to provide incremental value to coronary CT angiography for detecting MI. (orig.)
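    Once a strain tensor is available per voxel, the maximum principal strain reduces to an eigenvalue computation; a minimal sketch under that assumption follows (the displacement gradients are invented, and the vendor's voxel-tracking step is not reproduced):

```python
import numpy as np

def maximum_principal_strain(strain_tensor):
    """Maximum principal strain of a symmetric 3x3 strain tensor:
    its largest eigenvalue."""
    return np.linalg.eigvalsh(strain_tensor)[-1]   # eigenvalues in ascending order

# Hypothetical displacement gradient at one myocardial voxel.
grad_u = np.array([[0.02, 0.01, 0.00],
                   [0.00, -0.03, 0.02],
                   [0.01, 0.00, 0.05]])
E = 0.5 * (grad_u + grad_u.T + grad_u.T @ grad_u)  # Green-Lagrange strain tensor
print("MP-strain at this voxel:", maximum_principal_strain(E))
```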

  9. The Reliability of EMU Fiscal Indicators: Risks and Safeguards

    OpenAIRE

    Fabrizio Balassone; Daniele Franco; Stefania Zotteri

    2007-01-01

    The reliability of EMU's fiscal indicators has been questioned by recent episodes of large upward deficit revisions. This paper discusses the causes of such revisions in order to identify ways to improve monitoring. The computation of EMU's deficit indicator involves the assessment of accrued revenue and expenditure and the identification of transactions in financial assets. Both can open margins for opportunistic accounting. However, crosschecks between deficit and changes in gross nomin...

  10. Accuracy and reliability of different cone beam computed tomography (CBCT) devices for structural analysis of alveolar bone in comparison with multislice CT and micro-CT.

    Science.gov (United States)

    Van Dessel, Jeroen; Nicolielo, Laura Ferreira Pinheiro; Huang, Yan; Coudyzer, Walter; Salmon, Benjamin; Lambrichts, Ivo; Jacobs, Reinhilde

    The aim of this study was to assess whether cone beam computed tomography (CBCT) may be used for clinically reliable alveolar bone quality assessment in comparison to its clinical alternatives, multislice computed tomography and the gold standard (micro-CT). Six dentate mandibular bone samples were scanned with seven CBCT devices (ProMax 3D Max, NewTom GiANO, Cranex 3D, 3D Accuitomo 170, Carestream 9300, Scanora 3D, I-CAT Next generation), one micro-CT scanner (SkyScan 1174) and one MSCT machine (Somatom Definition Flash) using two protocols (standard and high-resolution). MSCT and CBCT images were automatically spatially aligned on the micro-CT scan of the corresponding sample. A volume of interest was manually delineated on the micro-CT image and overlaid on the other scanning devices. Alveolar bone structures were automatically extracted using the adaptive thresholding algorithm. Based on the resulting binary images, an automatic 3D morphometric quantification was performed in a CT-Analyser (Bruker, Kontich, Belgium). The reliability and measurement errors were calculated for each modality compared to the gold standard micro-CT. Both MSCT and CBCT were associated with a clinically and statistically (P max, bone surface density -0.47 mm-1 min to 0.16 mm-1 max and trabecular thickness 0.15 mm min to 0.31 mm max) were significantly (P max and fractal dimension 0.08 min to 0.17 max) in all scanners compared to micro-CT. However, the structural pattern of the alveolar bone remained similar compared to that of the micro-CT for the ProMax 3D Max, NewTom GiANO, Cranex 3D, 3D Accuitomo 170 and Carestream 9300. On the other hand, the Scanora 3D, i-CAT Next Generation, standard and high-resolution MSCT displayed an overrated bone quantity and aberrant structural pattern compared to other scanning devices. The calculation of morphometric indices had an overall high reliability (intraclass correlation coefficient [ICC] 0.62 min to 0.99 max), except
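
    A simplified 2-D analogue of the adaptive thresholding and morphometric quantification steps is sketched below: a local-mean threshold binarises a grey-level image and the bone volume fraction (BV/TV) is taken as the fraction of foreground pixels inside the region of interest. The synthetic image and window size are assumptions; the actual study used CT-Analyser's 3-D algorithms on registered scans.

```python
# Illustrative sketch: local-mean adaptive thresholding of a grey-level slice
# followed by a bone-volume-fraction (BV/TV) estimate.
# Synthetic data and window size are assumptions for demonstration.
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(1)
slice_ = rng.normal(100, 20, size=(256, 256))   # fake grey-level slice
slice_[64:192, 64:192] += 60                    # brighter "trabecular" region

window = 31                                     # assumed local window (pixels)
local_mean = uniform_filter(slice_, size=window)
bone_mask = slice_ > local_mean                 # adaptive threshold

voi = np.zeros_like(bone_mask, dtype=bool)      # (volume) region of interest
voi[32:224, 32:224] = True

bv_tv = bone_mask[voi].mean()                   # bone volume fraction within VOI
print(f"BV/TV = {bv_tv:.2f}")
```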

  11. The reliability of cone-beam computed tomography to assess bone density at dental implant recipient sites: a histomorphometric analysis by micro-CT.

    Science.gov (United States)

    González-García, Raúl; Monje, Florencio

    2013-08-01

    The aim of this study was to objectively assess the reliability of the cone-beam computed tomography (CBCT) as a tool to pre-operatively determine radiographic bone density (RBD) by the density values provided by the system, analyzing its relationship with histomorphometric bone density expressed as bone volumetric fraction (BV/TV) assessed by micro-CT of bone biopsies at the site of insertion of dental implants in the maxillary bones. Thirty-nine bone biopsies of the maxillary bones at the sites of 39 dental implants from 31 edentulous healthy patients were analyzed. The NobelGuide™ software was used for implant planning, which also allowed fabrication of individual stereolithographic surgical guides. The analysis of CBCT images allowed pre-operative determination of mean density values of implant recipient sites along the major axis of the planned implants (axial RBD). Stereolithographic surgical guides were used to guide implant insertion and also to extract cylindrical bone biopsies from the core of the exact implant site. Further analysis of several osseous micro-structural variables including BV/TV was performed by micro-CT of the extracted bone biopsies. Mean axial RBD was 478 ± 212 (range: 144-953). A statistically significant difference (P = 0.02) was observed among density values of the cortical bone of the upper maxilla and mandible. A high positive Pearson's correlation coefficient (r = 0.858, P micro-CT at the site of dental implants in the maxillary bones. Pre-operative estimation of density values by CBCT is a reliable tool to objectively determine bone density. © 2012 John Wiley & Sons A/S.
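
    The association reported between CBCT density values and micro-CT bone volume fraction is the kind of relationship a Pearson correlation quantifies. The sketch below computes r and its p-value on invented paired measurements, not the study's data.

```python
# Illustrative sketch: Pearson correlation between CBCT radiographic bone
# density (RBD) and micro-CT bone volume fraction (BV/TV).
# The paired values below are invented placeholders.
import numpy as np
from scipy.stats import pearsonr

rbd   = np.array([180, 250, 340, 420, 510, 600, 720, 810])            # CBCT values
bv_tv = np.array([0.12, 0.18, 0.24, 0.30, 0.37, 0.44, 0.52, 0.58])    # micro-CT

r, p = pearsonr(rbd, bv_tv)
print(f"r = {r:.3f}, p = {p:.4f}")
```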

  12. Identification and analysis of potential targets in Streptococcus sanguinis using computer aided protein data analysis.

    Science.gov (United States)

    Chowdhury, Md Rabiul Hossain; Bhuiyan, Md IqbalKaiser; Saha, Ayan; Mosleh, Ivan Mhai; Mondol, Sobuj; Ahmed, C M Sabbir

    2014-01-01

    Streptococcus sanguinis is a Gram-positive, facultative aerobic bacterium that is a member of the viridans streptococcus group. It is found in human mouths in dental plaque, which accounts for both dental cavities and bacterial endocarditis, and which entails a mortality rate of 25%. Although a range of remedial mediators have been found to control this organism, the effectiveness of agents such as penicillin, amoxicillin, trimethoprim-sulfamethoxazole, and erythromycin, was observed. The emphasis of this investigation was on finding substitute and efficient remedial approaches for the total destruction of this bacterium. In this computational study, various databases and online software were used to ascertain some specific targets of S. sanguinis. Particularly, the Kyoto Encyclopedia of Genes and Genomes databases were applied to determine human nonhomologous proteins, as well as the metabolic pathways involved with those proteins. Different software such as Phyre2, CastP, DoGSiteScorer, the Protein Function Predictor server, and STRING were utilized to evaluate the probable active drug binding site with its known function and protein-protein interaction. In this study, among 218 essential proteins of this pathogenic bacterium, 81 nonhomologous proteins were accrued, and 15 proteins that are unique in several metabolic pathways of S. sanguinis were isolated through metabolic pathway analysis. Furthermore, four essentially membrane-bound unique proteins that are involved in distinct metabolic pathways were revealed by this research. Active sites and druggable pockets of these selected proteins were investigated with bioinformatic techniques. In addition, this study also mentions the activity of those proteins, as well as their interactions with the other proteins. Our findings helped to identify the type of protein to be considered as an efficient drug target. This study will pave the way for researchers to develop and discover more effective and specific
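
    The target-shortlisting logic described above (essential proteins, minus human homologues, restricted to pathogen-unique pathways) can be sketched as a simple set-filtering step. The protein records, e-value cut-off and pathway sets below are hypothetical and do not reproduce the paper's KEGG/BLAST workflow.

```python
# Illustrative sketch: shortlist drug-target candidates as essential proteins
# with no human homologue that sit in pathogen-unique pathways.
# Records, cut-offs and pathway sets are hypothetical placeholders.

essential_proteins = [
    {"id": "SSA_0100", "best_human_evalue": 1e-40, "pathways": {"glycolysis"}},
    {"id": "SSA_0231", "best_human_evalue": 2.0,   "pathways": {"peptidoglycan_biosynthesis"}},
    {"id": "SSA_0777", "best_human_evalue": 0.5,   "pathways": {"two_component_system"}},
]

pathogen_unique_pathways = {"peptidoglycan_biosynthesis", "two_component_system"}
EVALUE_CUTOFF = 1e-3   # hits below this are treated as human homologues

nonhomologous = [p for p in essential_proteins
                 if p["best_human_evalue"] > EVALUE_CUTOFF]
candidates = [p for p in nonhomologous
              if p["pathways"] & pathogen_unique_pathways]

for p in candidates:
    print(p["id"], sorted(p["pathways"]))
```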

  13. Identification and analysis of potential targets in Streptococcus sanguinis using computer aided protein data analysis

    Science.gov (United States)

    Chowdhury, Md Rabiul Hossain; Bhuiyan, Md IqbalKaiser; Saha, Ayan; Mosleh, Ivan MHAI; Mondol, Sobuj; Ahmed, C M Sabbir

    2014-01-01

    Purpose Streptococcus sanguinis is a Gram-positive, facultative aerobic bacterium that is a member of the viridans streptococcus group. It is found in human mouths in dental plaque, which accounts for both dental cavities and bacterial endocarditis, and which entails a mortality rate of 25%. Although a range of remedial mediators have been found to control this organism, the effectiveness of agents such as penicillin, amoxicillin, trimethoprim–sulfamethoxazole, and erythromycin, was observed. The emphasis of this investigation was on finding substitute and efficient remedial approaches for the total destruction of this bacterium. Materials and methods In this computational study, various databases and online software were used to ascertain some specific targets of S. sanguinis. Particularly, the Kyoto Encyclopedia of Genes and Genomes databases were applied to determine human nonhomologous proteins, as well as the metabolic pathways involved with those proteins. Different software such as Phyre2, CastP, DoGSiteScorer, the Protein Function Predictor server, and STRING were utilized to evaluate the probable active drug binding site with its known function and protein–protein interaction. Results In this study, among 218 essential proteins of this pathogenic bacterium, 81 nonhomologous proteins were accrued, and 15 proteins that are unique in several metabolic pathways of S. sanguinis were isolated through metabolic pathway analysis. Furthermore, four essentially membrane-bound unique proteins that are involved in distinct metabolic pathways were revealed by this research. Active sites and druggable pockets of these selected proteins were investigated with bioinformatic techniques. In addition, this study also mentions the activity of those proteins, as well as their interactions with the other proteins. Conclusion Our findings helped to identify the type of protein to be considered as an efficient drug target. This study will pave the way for researchers to

  14. Identification and analysis of potential targets in Streptococcus sanguinis using computer aided protein data analysis

    Directory of Open Access Journals (Sweden)

    Chowdhury MRH

    2014-11-01

    Full Text Available Md Rabiul Hossain Chowdhury,1 Md IqbalKaiser Bhuiyan,2 Ayan Saha,2 Ivan MHAI Mosleh,2 Sobuj Mondol,2 C M Sabbir Ahmed3 1Department of Pharmacy, University of Science and Technology Chittagong, Chittagong, Bangladesh; 2Department of Genetic Engineering and Biotechnology, University of Chittagong, Chittagong, Bangladesh; 3Biotechnology and Genetic Engineering Discipline, Khulna University, Khulna, Bangladesh Purpose: Streptococcus sanguinis is a Gram-positive, facultative aerobic bacterium that is a member of the viridans streptococcus group. It is found in human mouths in dental plaque, which accounts for both dental cavities and bacterial endocarditis, and which entails a mortality rate of 25%. Although a range of remedial mediators have been found to control this organism, the effectiveness of agents such as penicillin, amoxicillin, trimethoprim–sulfamethoxazole, and erythromycin, was observed. The emphasis of this investigation was on finding substitute and efficient remedial approaches for the total destruction of this bacterium. Materials and methods: In this computational study, various databases and online software were used to ascertain some specific targets of S. sanguinis. Particularly, the Kyoto Encyclopedia of Genes and Genomes databases were applied to determine human nonhomologous proteins, as well as the metabolic pathways involved with those proteins. Different software such as Phyre2, CastP, DoGSiteScorer, the Protein Function Predictor server, and STRING were utilized to evaluate the probable active drug binding site with its known function and protein–protein interaction. Results: In this study, among 218 essential proteins of this pathogenic bacterium, 81 nonhomologous proteins were accrued, and 15 proteins that are unique in several metabolic pathways of S. sanguinis were isolated through metabolic pathway analysis. Furthermore, four essentially membrane-bound unique proteins that are involved in distinct metabolic

  15. Identification of Requirements for Computer-Supported Matching of Food Consumption Data with Food Composition Data

    Directory of Open Access Journals (Sweden)

    Barbara Koroušić Seljak

    2018-03-01

    Full Text Available This paper identifies the requirements for computer-supported food matching, in order to address not only national and European but also international current related needs and represents an integrated research contribution of the FP7 EuroDISH project. The available classification and coding systems and the specific problems of food matching are summarized and a new concept for food matching based on optimization methods and machine-based learning is proposed. To illustrate and test this concept, a study has been conducted in four European countries (i.e., Germany, The Netherlands, Italy and the UK using different classification and coding systems. This real case study enabled us to evaluate the new food matching concept and provide further recommendations for future work. In the first stage of the study, we prepared subsets of food consumption data described and classified using different systems, that had already been manually matched with national food composition data. Once the food matching algorithm was trained using this data, testing was performed on another subset of food consumption data. Experts from different countries validated food matching between consumption and composition data by selecting best matches from the options given by the matching algorithm without seeing the result of the previously made manual match. The evaluation of study results stressed the importance of the role and quality of the food composition database as compared to the selected classification and/or coding systems and the need to continue compiling national food composition data as eating habits and national dishes still vary between countries. Although some countries managed to collect extensive sets of food consumption data, these cannot be easily matched with food composition data if either food consumption or food composition data are not properly classified and described using any classification and coding systems. The study also showed that the
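
    One simple, generic way to propose candidate matches between a food-consumption item and food-composition entries is string-similarity ranking, as sketched below. The item names and the use of difflib are illustrative assumptions only and do not represent the optimisation and machine-learning approach evaluated in the study.

```python
# Illustrative sketch: propose candidate food-composition matches for a
# consumption item by simple string similarity (names are invented examples).
from difflib import SequenceMatcher

composition_db = ["Bread, wheat, wholemeal", "Bread, white, toasted",
                  "Milk, semi-skimmed", "Cheese, Gouda", "Apple, raw"]

def top_matches(consumed_item, db, n=3):
    scored = [(SequenceMatcher(None, consumed_item.lower(), entry.lower()).ratio(),
               entry) for entry in db]
    return sorted(scored, reverse=True)[:n]

for score, entry in top_matches("wholemeal bread", composition_db):
    print(f"{score:.2f}  {entry}")
```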

  16. Identification of Requirements for Computer-Supported Matching of Food Consumption Data with Food Composition Data

    Science.gov (United States)

    Korošec, Peter; Eftimov, Tome; Ocke, Marga; van der Laan, Jan; Roe, Mark; Berry, Rachel; Turrini, Aida; Krems, Carolin; Slimani, Nadia; Finglas, Paul

    2018-01-01

    This paper identifies the requirements for computer-supported food matching, in order to address not only national and European but also international current related needs and represents an integrated research contribution of the FP7 EuroDISH project. The available classification and coding systems and the specific problems of food matching are summarized and a new concept for food matching based on optimization methods and machine-based learning is proposed. To illustrate and test this concept, a study has been conducted in four European countries (i.e., Germany, The Netherlands, Italy and the UK) using different classification and coding systems. This real case study enabled us to evaluate the new food matching concept and provide further recommendations for future work. In the first stage of the study, we prepared subsets of food consumption data described and classified using different systems, that had already been manually matched with national food composition data. Once the food matching algorithm was trained using this data, testing was performed on another subset of food consumption data. Experts from different countries validated food matching between consumption and composition data by selecting best matches from the options given by the matching algorithm without seeing the result of the previously made manual match. The evaluation of study results stressed the importance of the role and quality of the food composition database as compared to the selected classification and/or coding systems and the need to continue compiling national food composition data as eating habits and national dishes still vary between countries. Although some countries managed to collect extensive sets of food consumption data, these cannot be easily matched with food composition data if either food consumption or food composition data are not properly classified and described using any classification and coding systems. The study also showed that the level of human

  17. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  18. A Soft Computing Approach to Crack Detection and Impact Source Identification with Field-Programmable Gate Array Implementation

    Directory of Open Access Journals (Sweden)

    Arati M. Dixit

    2013-01-01

    Full Text Available The real-time nondestructive testing (NDT) for crack detection and impact source identification (CDISI) has attracted researchers from diverse areas. This is apparent from the current work in the literature. CDISI has usually been performed by visual assessment of waveforms generated by a standard data acquisition system. In this paper we suggest an automation of CDISI for metal armor plates using a soft computing approach by developing a fuzzy inference system to effectively deal with this problem. It is also advantageous to develop a chip that can contribute towards real-time CDISI. The objective of this paper is to report on efforts to develop an automated CDISI procedure and to formulate a technique such that the proposed method can be easily implemented on a chip. The CDISI fuzzy inference system is developed using MATLAB's fuzzy logic toolbox. A VLSI circuit for CDISI is developed on the basis of the fuzzy logic model using Verilog, a hardware description language (HDL). The Xilinx ISE WebPACK 9.1i is used for design, synthesis, implementation, and verification. The CDISI field-programmable gate array (FPGA) implementation is done using Xilinx's Spartan 3 FPGA. SynaptiCAD's Verilog simulators, VeriLogger PRO and ModelSim, are used as the software simulation and debug environment.
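
    A very small fuzzy-inference example in the spirit of the CDISI system is sketched below: triangular membership functions on a single sensor feature drive two rules whose weighted outputs give a crack-severity estimate. The feature, membership breakpoints and rule outputs are invented; the actual system was built with MATLAB's fuzzy logic toolbox and implemented in Verilog on an FPGA.

```python
# Illustrative sketch: a two-rule fuzzy estimate of crack severity from one
# (hypothetical) waveform feature, defuzzified by the weighted-average method.
# Breakpoints and rule outputs are invented.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def crack_severity(peak_amplitude):
    low  = tri(peak_amplitude, 0.0, 0.0, 0.5)   # membership in "low amplitude"
    high = tri(peak_amplitude, 0.3, 1.0, 1.0)   # membership in "high amplitude"
    # Rule 1: low amplitude -> severity 0.1;  Rule 2: high amplitude -> severity 0.9
    weights, outputs = [low, high], [0.1, 0.9]
    return sum(w * o for w, o in zip(weights, outputs)) / (sum(weights) or 1.0)

print(crack_severity(0.2))   # mostly "low"  -> small severity
print(crack_severity(0.8))   # mostly "high" -> large severity
```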

  19. Computed Tomography Image Origin Identification Based on Original Sensor Pattern Noise and 3-D Image Reconstruction Algorithm Footprints.

    Science.gov (United States)

    Duan, Yuping; Bouslimi, Dalel; Yang, Guanyu; Shu, Huazhong; Coatrieux, Gouenou

    2017-07-01

    In this paper, we focus on the "blind" identification of the computed tomography (CT) scanner that has produced a CT image. To do so, we propose a set of noise features derived from the image chain acquisition and which can be used as CT-scanner footprint. Basically, we propose two approaches. The first one aims at identifying a CT scanner based on an original sensor pattern noise (OSPN) that is intrinsic to the X-ray detectors. The second one identifies an acquisition system based on the way this noise is modified by its three-dimensional (3-D) image reconstruction algorithm. As these reconstruction algorithms are manufacturer dependent and kept secret, our features are used as input to train a support vector machine (SVM) based classifier to discriminate acquisition systems. Experiments conducted on images issued from 15 different CT-scanner models of 4 distinct manufacturers demonstrate that our system identifies the origin of one CT image with a detection rate of at least 94% and that it achieves better performance than sensor pattern noise (SPN) based strategy proposed for general public camera devices.
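
    The general idea of using a noise residual as a device fingerprint and feeding summary features to an SVM can be sketched as below. The synthetic images, the Gaussian denoiser and the two summary statistics are simplifying assumptions, not the OSPN and 3-D-reconstruction features engineered in the paper.

```python
# Illustrative sketch: classify the source "scanner" of an image from simple
# statistics of its noise residual (image minus a denoised version) with an
# SVM. Synthetic data and features are simplifying assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.svm import SVC

rng = np.random.default_rng(2)

def residual_features(img):
    residual = img - gaussian_filter(img, sigma=2)   # crude noise estimate
    return [residual.std(), np.abs(residual).mean()]

# Two hypothetical "scanners" that differ only in their noise level.
X, y = [], []
for label, noise_sd in [(0, 4.0), (1, 7.0)]:
    for _ in range(50):
        img = rng.normal(0, noise_sd, size=(64, 64)) + 100.0
        X.append(residual_features(img))
        y.append(label)

clf = SVC(kernel="rbf").fit(X, y)
test = rng.normal(0, 7.0, size=(64, 64)) + 100.0
print("predicted scanner:", clf.predict([residual_features(test)])[0])
```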

  20. Identification of the Procedural Accidents During Root Canal Preparation Using Digital Intraoral Radiography and Cone Beam Computed Tomography

    Directory of Open Access Journals (Sweden)

    Csinszka K.-Ivácson A.-

    2016-09-01

    Full Text Available Crown or root perforation, ledge formation and fractured instruments are among the most important accidents that occur during endodontic therapy. Our objective was to evaluate the value of digital intraoral periapical radiographs compared with cone beam computed tomography (CBCT) images for diagnosing such procedural accidents. Material and methods: Eleven extracted molars were used in this study. A total of 18 perforations and 13 ledges were created artificially, and 10 instruments were fractured in the root canals. Digital intraoral periapical radiographs from two angles and CBCT scans were made with the teeth fixed in position. The images were evaluated and the number of detected accidents was expressed as percentages. Statistical analysis was performed using the chi-square test. Results: On digital periapical radiographs the evaluators identified 12 (66.66%) perforations, 10 (100%) separated instruments and 10 (76.9%) created ledges. The CBCT scans made possible the recognition of 17 (94.66%) perforations, 9 (90%) separated instruments and 13 (100%) ledges. The total number of recognized procedural accidents differed significantly between the two groups (p<0.05). Conclusion: Digital periapical radiographs are the most common imaging modality used during endodontic treatments. However, CBCT allows a better identification of procedural accidents.
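
    The comparison of detection counts between the two imaging modalities is a standard chi-square contingency-table test. The sketch below runs it on a 2x2 table assembled from the counts quoted in the abstract (32 of 41 accidents detected on periapical radiographs versus 39 of 41 on CBCT); the grouping into detected/missed is an illustration, not the authors' exact analysis.

```python
# Illustrative sketch: chi-square test on the detection counts quoted in the
# abstract (detected vs. missed procedural accidents, PR vs. CBCT).
from scipy.stats import chi2_contingency

#                 detected  missed
table = [[32, 9],            # digital periapical radiographs (12+10+10 of 41)
         [39, 2]]            # CBCT                            (17+9+13 of 41)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```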

  1. A computer program for automatic gamma-ray spectra analysis with isotope identification for the purpose of activation analysis

    International Nuclear Information System (INIS)

    Weigel, H.; Dauk, J.

    1974-01-01

    A FORTRAN IV program for a PDP-9 computer with 16K storage capacity was developed to perform automatic analysis of complex gamma-ray spectra taken with Ge(Li) detectors. It searches for full-energy peaks and evaluates the peak areas. The program features automatic isotope identification. It is written in such a flexible manner that, after reactor irradiation, spectra from samples of any composition can be evaluated for activation analysis. The peak search routine is based on the following criteria: the counting rate has to increase for two successive channels, and the amplitude of the corresponding maximum has to be greater than or equal to F1 times the statistical error of the counting rate in the valley just before the maximum. In order to detect superimposed peaks, it is assumed that the dependence of the FWHM on channel number is roughly approximated by a linear function, and the actual and "theoretical" FWHM values are compared. To determine the net peak area, a Gaussian-based function is fitted to each peak. The isotope identification is based on the procedure developed by Adams and Dams. (T.G.)
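
    The two peak-search criteria quoted above translate almost directly into code. The sketch below flags a channel as a candidate peak when the counting rate has risen over two successive channels and the excess over the preceding valley is at least F1 times the statistical error (square root) of the valley counts; the synthetic spectrum and the value of F1 are placeholders, and the original FORTRAN implementation is not reproduced here.

```python
# Illustrative sketch of the quoted peak-search criteria: counting rate rising
# over two successive channels, and peak amplitude exceeding the valley by at
# least F1 times the valley's statistical error. Spectrum and F1 are examples.
import numpy as np

def find_peaks(counts, f1=3.0):
    counts = np.asarray(counts, dtype=float)
    peaks = []
    valley = 0                      # index of the most recent local minimum
    for i in range(2, len(counts) - 1):
        if counts[i] < counts[valley]:
            valley = i
        rising = counts[i - 1] > counts[i - 2] and counts[i] > counts[i - 1]
        at_top = counts[i + 1] <= counts[i]
        if rising and at_top:
            sigma = max(np.sqrt(counts[valley]), 1.0)
            if counts[i] - counts[valley] >= f1 * sigma:
                peaks.append(i)
    return peaks

# Synthetic spectrum: flat background with a Gaussian-like peak near channel 50.
chan = np.arange(100)
spectrum = 50 + 200 * np.exp(-0.5 * ((chan - 50) / 2.5) ** 2)
print(find_peaks(spectrum))   # expected to report a channel close to 50
```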

  2. Automatic identification of watercourses in flat and engineered landscapes by computing the skeleton of a LiDAR point cloud

    Science.gov (United States)

    Broersen, Tom; Peters, Ravi; Ledoux, Hugo

    2017-09-01

    Drainage networks play a crucial role in protecting land against floods. It is therefore important to have an accurate map of the watercourses that form the drainage network. Previous work on the automatic identification of watercourses was typically based on grids, focused on natural landscapes, and used mostly the slope and curvature of the terrain. In this paper we focus on areas characterised by low-lying, flat, and engineered landscapes, such as those typical of the Netherlands. We propose a new methodology to identify watercourses automatically from elevation data; it uses solely a raw classified LiDAR point cloud as input. We show that by computing a skeleton of the point cloud twice (once in 2D and once in 3D) and by using the properties of the skeletons, we can identify most of the watercourses. We have implemented our methodology and tested it for three different soil types around Utrecht, the Netherlands. We were able to detect 98% of the watercourses for one soil type, and around 75% in the worst case, when compared to a reference dataset that was obtained semi-automatically.
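
    As a much-simplified raster analogue of the skeleton idea, the sketch below skeletonises a binary mask of low-lying cells to obtain centrelines of elongated depressions. The synthetic elevation grid, the threshold and the raster approach itself are assumptions; the paper computes 2D and 3D skeletons directly on the LiDAR point cloud rather than on a raster.

```python
# Much-simplified raster analogue: skeletonise low-lying cells of a synthetic
# elevation grid to obtain centrelines of ditch-like depressions.
import numpy as np
from skimage.morphology import skeletonize

elevation = np.ones((100, 100))          # flat terrain at 1 m
elevation[48:53, 10:90] = 0.2            # a ditch-like depression

watercourse_mask = elevation < 0.5       # "low-lying" cells
centreline = skeletonize(watercourse_mask)
print("centreline cells:", int(centreline.sum()))
```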

  3. The Reliability of Panoramic Radiography Versus Cone Beam Computed Tomography when Evaluating the Distance to the Alveolar Nerve in the Site of Lateral Teeth.

    Science.gov (United States)

    Česaitienė, Gabrielė; Česaitis, Kęstutis; Junevičius, Jonas; Venskutonis, Tadas

    2017-07-04

    BACKGROUND The aim of this study was to compare the reliability of panoramic radiography (PR) and cone beam computed tomography (CBCT) in the evaluation of the distance of the roots of lateral teeth to the inferior alveolar nerve canal (IANC). MATERIAL AND METHODS 100 PR and 100 CBCT images that met the selection criteria were selected from the database. In PR images, the distances were measured using an electronic caliper with 0.01 mm accuracy and white light x-ray film reviewer. Actual values of the measurements were calculated taking into consideration the magnification used in PR images (130%). Measurements on CBCT images were performed using i-CAT Vision software. Statistical data analysis was performed using R software and applying Welch's t-test and the Wilcoxon test. RESULTS There was no statistically significant difference in the mean distance from the root of the second premolar and the mesial and distal roots of the first molar to the IANC between PR and CBCT images. The difference in the mean distance from the mesial and distal roots of the second and the third molars to the IANC measured in PR and CBCT images was statistically significant. CONCLUSIONS PR may be uninformative or misleading when measuring the distance from the mesial and distal roots of the second and the third molars to the IANC.
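
    Two small computations underlie the comparison: correcting distances measured on panoramic films for the stated 130% magnification, and testing mean differences with Welch's t-test. The sketch below shows both on invented measurement vectors, not the study's measurements.

```python
# Illustrative sketch: correct panoramic-radiograph measurements for the
# stated 130% magnification, then compare means with Welch's t-test.
# The measurement vectors below are invented placeholders.
import numpy as np
from scipy.stats import ttest_ind

measured_on_pr_mm = np.array([5.2, 4.8, 6.1, 5.5, 4.9])   # as read on the film
actual_pr_mm = measured_on_pr_mm / 1.30                    # undo 130% magnification

cbct_mm = np.array([3.9, 3.6, 4.8, 4.3, 3.7])              # CBCT measurements

t, p = ttest_ind(actual_pr_mm, cbct_mm, equal_var=False)   # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3f}")
```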

  4. Identification and evaluation of reliable reference genes for quantitative real-time PCR analysis in tea plant (Camellia sinensis (L.) O. Kuntze)

    Science.gov (United States)

    Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...

  5. The computer vision in the service of safety and reliability in steam generators inspection services; La vision computacional al servicio de la seguridad y fiabilidad en los servicios de inspeccion en generadores de vapor

    Energy Technology Data Exchange (ETDEWEB)

    Pineiro Fernandez, P.; Garcia Bueno, A.; Cabrera Jordan, E.

    2012-07-01

    Computer vision has matured very quickly over the last ten years, facilitating new developments in various areas of nuclear application and making it possible to automate and simplify processes and tasks efficiently, either in place of or in collaboration with people and equipment. Current computer vision (a more appropriate term than artificial vision) also offers great possibilities for improving the reliability and safety of NPP inspection systems.

  6. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement-based approaches, holistic techniques and decision-analytic approaches. (UK)

  7. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
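
    A minimal example of the Bayesian reliability reasoning covered in such a course is the Beta-Binomial update of a failure probability from test data. The prior parameters and test outcome below are arbitrary assumptions, not figures from the proceedings.

```python
# Minimal Bayesian reliability sketch: Beta prior on a failure probability
# updated with binomial test data. Prior and data are arbitrary assumptions.
from scipy.stats import beta

a0, b0 = 1, 19          # prior Beta(1, 19): prior mean failure probability = 0.05
n, failures = 40, 1     # test outcome: 1 failure in 40 demands

a, b = a0 + failures, b0 + (n - failures)   # posterior Beta(a, b)
posterior = beta(a, b)

print(f"posterior mean failure probability = {posterior.mean():.3f}")
print(f"95% credible interval = {posterior.ppf(0.025):.4f} .. {posterior.ppf(0.975):.4f}")
```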

  8. Time-variant reliability assessment through equivalent stochastic process transformation

    International Nuclear Information System (INIS)

    Wang, Zequn; Chen, Wei

    2016-01-01

    Time-variant reliability measures the probability that an engineering system successfully performs intended functions over a certain period of time under various sources of uncertainty. In practice, it is computationally prohibitive to propagate uncertainty in time-variant reliability assessment based on expensive or complex numerical models. This paper presents an equivalent stochastic process transformation approach for cost-effective prediction of reliability deterioration over the life cycle of an engineering system. To reduce the high dimensionality, a time-independent reliability model is developed by translating random processes and time parameters into random parameters in order to equivalently cover all potential failures that may occur during the time interval of interest. With the time-independent reliability model, an instantaneous failure surface is attained by using a Kriging-based surrogate model to identify all potential failure events. To enhance the efficacy of failure surface identification, a maximum confidence enhancement method is utilized to update the Kriging model sequentially. Then, the time-variant reliability is approximated using Monte Carlo simulations of the Kriging model where system failures over a time interval are predicted by the instantaneous failure surface. The results of two case studies demonstrate that the proposed approach is able to accurately predict the time evolution of system reliability while requiring much less computational efforts compared with the existing analytical approach. - Highlights: • Developed a new approach for time-variant reliability analysis. • Proposed a novel stochastic process transformation procedure to reduce the dimensionality. • Employed Kriging models with confidence-based adaptive sampling scheme to enhance computational efficiency. • The approach is effective for handling random process in time-variant reliability analysis. • Two case studies are used to demonstrate the efficacy
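
    The final step of such an approach, estimating reliability by Monte Carlo sampling of a limit-state function, can be illustrated with a toy example. The limit state g = R - S with normally distributed capacity and demand is an assumption standing in for the Kriging surrogate and random processes used in the paper.

```python
# Toy Monte Carlo reliability estimate: failure when demand S exceeds
# capacity R. Distributions and parameters are assumptions standing in for
# the paper's Kriging surrogate of the limit-state function.
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

R = rng.normal(10.0, 1.0, n)    # capacity
S = rng.normal(7.0, 1.5, n)     # demand
g = R - S                       # limit-state function: failure when g < 0

pf = np.mean(g < 0)             # probability of failure
print(f"Pf ~ {pf:.4f}, reliability ~ {1 - pf:.4f}")
```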

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  10. Computational identification of conserved microRNAs and their targets from expression sequence tags of blueberry (Vaccinium corybosum).

    Science.gov (United States)

    Li, Xuyan; Hou, Yanming; Zhang, Li; Zhang, Wenhao; Quan, Chen; Cui, Yuhai; Bian, Shaomin

    2014-01-01

    MicroRNAs (miRNAs) are a class of endogenous, approximately 21nt in length, non-coding RNA, which mediate the expression of target genes primarily at post-transcriptional levels. miRNAs play critical roles in almost all plant cellular and metabolic processes. Although numerous miRNAs have been identified in the plant kingdom, the miRNAs in blueberry, which is an economically important small fruit crop, still remain totally unknown. In this study, we reported a computational identification of miRNAs and their targets in blueberry. By conducting an EST-based comparative genomics approach, 9 potential vco-miRNAs were discovered from 22,402 blueberry ESTs according to a series of filtering criteria, designated as vco-miR156-5p, vco-miR156-3p, vco-miR1436, vco-miR1522, vco-miR4495, vco-miR5120, vco-miR5658, vco-miR5783, and vco-miR5986. Based on sequence complementarity between miRNA and its target transcript, 34 target ESTs from blueberry and 70 targets from other species were identified for the vco-miRNAs. The targets were found to be involved in transcription, RNA splicing and binding, DNA duplication, signal transduction, transport and trafficking, stress response, as well as synthesis and metabolic process. These findings will greatly contribute to future research in regard to functions and regulatory mechanisms of blueberry miRNAs.
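
    The target-prediction step rests on assessing complementarity between a miRNA and a candidate EST site. A bare-bones version of that check, counting mismatches against the site's reverse complement, is sketched below with made-up sequences and an assumed mismatch threshold; it is far cruder than the criteria actually used in the study.

```python
# Bare-bones sketch: score complementarity between a miRNA and a candidate
# target site by counting mismatches against the site's reverse complement.
# Sequences and the mismatch threshold are made-up examples.

COMPLEMENT = str.maketrans("AUGC", "UACG")

def mismatches(mirna, target_site):
    """Count positions where the miRNA fails to pair with the target site."""
    site_rc = target_site.translate(COMPLEMENT)[::-1]   # reverse complement
    return sum(a != b for a, b in zip(mirna, site_rc))

mirna = "UGACAGAAGAGAGUGAGCAC"            # made-up 20-nt miRNA
site  = "GUGCUCACUCUCUUCUGUCA"            # candidate site on an EST (5'->3')

m = mismatches(mirna, site)
print(f"{m} mismatches -> {'candidate target' if m <= 3 else 'rejected'}")
```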

  11. Intra-tumour 18F-FDG uptake heterogeneity decreases the reliability on target volume definition with positron emission tomography/computed tomography imaging.

    Science.gov (United States)

    Dong, Xinzhe; Wu, Peipei; Sun, Xiaorong; Li, Wenwu; Wan, Honglin; Yu, Jinming; Xing, Ligang

    2015-06-01

    This study aims to explore whether the intra-tumour (18) F-fluorodeoxyglucose (FDG) uptake heterogeneity affects the reliability of target volume definition with FDG positron emission tomography/computed tomography (PET/CT) imaging for nonsmall cell lung cancer (NSCLC) and squamous cell oesophageal cancer (SCEC). Patients with NSCLC (n = 50) or SCEC (n = 50) who received (18)F-FDG PET/CT scanning before treatments were included in this retrospective study. Intra-tumour FDG uptake heterogeneity was assessed by visual scoring, the coefficient of variation (COV) of the standardised uptake value (SUV) and the image texture feature (entropy). Tumour volumes (gross tumour volume (GTV)) were delineated on the CT images (GTV(CT)), the fused PET/CT images (GTV(PET-CT)) and the PET images, using a threshold at 40% SUV(max) (GTV(PET40%)) or the SUV cut-off value of 2.5 (GTV(PET2.5)). The correlation between the FDG uptake heterogeneity parameters and the differences in tumour volumes among GTV(CT), GTV(PET-CT), GTV(PET40%) and GTV(PET2.5) was analysed. For both NSCLC and SCEC, obvious correlations were found between uptake heterogeneity, SUV or tumour volumes. Three types of heterogeneity parameters were consistent and closely related to each other. Substantial differences between the four methods of GTV definition were found. The differences between the GTV correlated significantly with PET heterogeneity defined with the visual score, the COV or the textural feature-entropy for NSCLC and SCEC. In tumours with a high FDG uptake heterogeneity, a larger GTV delineation difference was found. Advance image segmentation algorithms dealing with tracer uptake heterogeneity should be incorporated into the treatment planning system. © 2015 The Royal Australian and New Zealand College of Radiologists.
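
    Two of the heterogeneity measures used, the coefficient of variation of SUV and a histogram entropy, are straightforward to compute. The sketch below does so for an invented vector of intra-tumour SUV values and an assumed number of histogram bins; it does not reproduce the texture-analysis settings of the study.

```python
# Illustrative sketch: coefficient of variation and histogram entropy of
# intra-tumour SUV values. The SUV vector and bin count are invented.
import numpy as np

rng = np.random.default_rng(4)
suv = rng.gamma(shape=4.0, scale=1.5, size=500)      # fake intra-tumour SUVs

cov = suv.std() / suv.mean()                         # coefficient of variation

hist, _ = np.histogram(suv, bins=32)                 # assumed 32 bins
p = hist / hist.sum()
p = p[p > 0]
entropy = -np.sum(p * np.log2(p))                    # first-order entropy (bits)

print(f"COV = {cov:.2f}, entropy = {entropy:.2f} bits")
```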

  12. Intra-tumour 18F-FDG uptake heterogeneity decreases the reliability on target volume definition with positron emission tomography/computed tomography imaging

    International Nuclear Information System (INIS)

    Dong, Xinzhe; Wu, Peipei; Yu, Jinming; Xing, Ligang; Sun, Xiaorong; Li, Wenwu; Wan, Honglin

    2015-01-01

    This study aims to explore whether the intra-tumour 18F-fluorodeoxyglucose (FDG) uptake heterogeneity affects the reliability of target volume definition with FDG positron emission tomography/computed tomography (PET/CT) imaging for nonsmall cell lung cancer (NSCLC) and squamous cell oesophageal cancer (SCEC). Patients with NSCLC (n = 50) or SCEC (n = 50) who received 18F-FDG PET/CT scanning before treatments were included in this retrospective study. Intra-tumour FDG uptake heterogeneity was assessed by visual scoring, the coefficient of variation (COV) of the standardised uptake value (SUV) and the image texture feature (entropy). Tumour volumes (gross tumour volume (GTV)) were delineated on the CT images (GTV(CT)), the fused PET/CT images (GTV(PET-CT)) and the PET images, using a threshold at 40% SUV(max) (GTV(PET40%)) or the SUV cut-off value of 2.5 (GTV(PET2.5)). The correlation between the FDG uptake heterogeneity parameters and the differences in tumour volumes among GTV(CT), GTV(PET-CT), GTV(PET40%) and GTV(PET2.5) was analysed. For both NSCLC and SCEC, obvious correlations were found between uptake heterogeneity, SUV or tumour volumes. Three types of heterogeneity parameters were consistent and closely related to each other. Substantial differences between the four methods of GTV definition were found. The differences between the GTV correlated significantly with PET heterogeneity defined with the visual score, the COV or the textural feature-entropy for NSCLC and SCEC. In tumours with a high FDG uptake heterogeneity, a larger GTV delineation difference was found. Advance image segmentation algorithms dealing with tracer uptake heterogeneity should be incorporated into the treatment planning system.

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  14. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  16. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

    Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where...... the leading failure mechanism(s) is described by physics of failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena...... identification. Application of the proposed method can be found in many real world systems....

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not only by using opportunistic resources like the San Diego Supercomputer Center which was accessible, to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  18. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in Information and Communication Technology context. In particular, in the first Section, the definition of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing the laboratory tests, puts in evidence the reliability concept from the experimental point of view. In ICT context, the failure rate for a given system can be

  19. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  20. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP'09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  3. Individual selection of X-ray tube settings in computed tomography coronary angiography: Reliability of an automated software algorithm to maintain constant image quality.

    Science.gov (United States)

    Durmus, Tahir; Luhur, Reny; Daqqaq, Tareef; Schwenke, Carsten; Knobloch, Gesine; Huppertz, Alexander; Hamm, Bernd; Lembcke, Alexander

    2016-05-01

    To evaluate a software tool that claims to maintain a constant contrast-to-noise ratio (CNR) in high-pitch dual-source computed tomography coronary angiography (CTCA) by automatically selecting both X-ray tube voltage and current. A total of 302 patients (171 males; age 61±12 years; body weight 82±17 kg; body mass index 27.3±4.6 kg/m²) underwent CTCA with a topogram-based, automatic selection of both tube voltage and current using dedicated software with quality reference values of 100 kV and 250 mAs/rotation (i.e., standard values for an average adult weighing 75 kg) and an injected iodine load of 222 mg/kg. The average radiation dose was estimated to be 1.02±0.64 mSv. All data sets had adequate contrast enhancement. Average CNR in the aortic root, left ventricle, and left and right coronary artery was 15.7±4.5, 8.3±2.9, 16.1±4.3 and 15.3±3.9 respectively. Individual CNR values were independent of patients' body size and radiation dose. However, individual CNR values may vary considerably between subjects as reflected by interquartile ranges of 12.6-18.6, 6.2-9.9, 12.8-18.9 and 12.5-17.9 respectively. Moreover, average CNR values were significantly lower in males than females (15.1±4.1 vs. 16.6±11.7 and 7.9±2.7 vs. 8.9±3.0, 15.5±3.9 vs. 16.9±4.6 and 14.7±3.6 vs. 16.0±4.1 respectively). A topogram-based automatic selection of X-ray tube settings in CTCA provides diagnostic image quality independent of patients' body size. Nevertheless, considerable variation of individual CNR values between patients and significant differences of CNR values between males and females occur, which questions the reliability of this approach. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
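
    The CNR figures quoted above follow the usual definition of contrast over noise. A minimal sketch of that computation on invented ROI statistics is given below; the choice of background region is an assumption, since the study's exact ROI definition is not quoted here.

```python
# Minimal sketch: contrast-to-noise ratio from ROI statistics.
# The ROI means and noise value are invented; the background-ROI choice is an
# assumption, since the exact definition used in the study is not quoted here.
def cnr(roi_mean, background_mean, noise_sd):
    return (roi_mean - background_mean) / noise_sd

aorta_hu, muscle_hu, noise = 450.0, 60.0, 25.0     # hypothetical values (HU)
print(f"CNR = {cnr(aorta_hu, muscle_hu, noise):.1f}")
```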

  4. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology whereas this has yet to be fully achieved for large scale structures. Structural loading variants over the half-time of the plant are considered to be more difficult to analyse than for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions are considered which enter this problem. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)

  5. Error free physically unclonable function with programmed resistive random access memory using reliable resistance states by specific identification-generation method

    Science.gov (United States)

    Tseng, Po-Hao; Hsu, Kai-Chieh; Lin, Yu-Yu; Lee, Feng-Min; Lee, Ming-Hsiu; Lung, Hsiang-Lan; Hsieh, Kuang-Yeu; Chung Wang, Keh; Lu, Chih-Yuan

    2018-04-01

    A high-performance physically unclonable function (PUF) implemented with WO3 resistive random access memory (ReRAM) is presented in this paper. This robust ReRAM-PUF eliminates the bit-flipping problem at very high temperatures (up to 250 °C) thanks to the plentiful read margin obtained by using the initial resistance state and the set resistance state. Ten-year retention at 210 °C is also promised. These two stable resistance states enable stable operation in automotive environments from -40 to 125 °C without the need for a temperature compensation circuit. High uniqueness of the PUF can be achieved by implementing the proposed identification (ID)-generation method. The optimized forming condition moves 50% of the cells to the low resistance state while the remaining 50% stay in the initial high resistance state. Inter- and intra-PUF evaluations with unlimited separation of the Hamming distance (HD) distributions are successfully demonstrated even under corner conditions. The number of reproductions was measured to exceed 10⁷ times with 0% bit error rate (BER) at read voltages from 0.4 to 0.7 V.
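
    Uniqueness and reproducibility of a PUF are usually summarised by inter- and intra-device Hamming distances. The sketch below computes both for made-up response bit-strings; it is only a generic illustration of the metric mentioned in the abstract, not the evaluation performed on the ReRAM device.

```python
# Generic sketch: inter- and intra-PUF Hamming distances on made-up
# response bit-strings (uniqueness vs. reproducibility).
import numpy as np

rng = np.random.default_rng(5)
n_bits = 256

def hamming(a, b):
    return int(np.sum(a != b))

chip_a = rng.integers(0, 2, n_bits)                 # response of "chip A"
chip_b = rng.integers(0, 2, n_bits)                 # response of "chip B"

# Re-read of chip A with a few assumed noisy bit flips.
reread_a = chip_a.copy()
flip = rng.choice(n_bits, size=3, replace=False)
reread_a[flip] ^= 1

print("inter-PUF HD (A vs B):", hamming(chip_a, chip_b))            # ideally ~n_bits/2
print("intra-PUF HD (A vs A re-read):", hamming(chip_a, reread_a))  # ideally near 0
```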

  6. Integrating reliability analysis and design

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1980-10-01

    This report describes the Interactive Reliability Analysis Project and demonstrates the advantages of using computer-aided design systems (CADS) in reliability analysis. Common cause failure problems require presentations of systems, analysis of fault trees, and evaluation of solutions to these. Results have to be communicated between the reliability analyst and the system designer. Using a computer-aided design system saves time and money in the analysis of design. Computer-aided design systems lend themselves to cable routing, valve and switch lists, pipe routing, and other component studies. At EG and G Idaho, Inc., the Applicon CADS is being applied to the study of water reactor safety systems

  7. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files at a high writing speed to tape.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested in remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid-year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2s are associated with Physics Groups. Such associations are decided twice per ye...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  14. Transparent reliability model for fault-tolerant safety systems

    International Nuclear Information System (INIS)

    Bodsberg, Lars; Hokstad, Per

    1997-01-01

    A reliability model is presented which may serve as a tool for identification of cost-effective configurations and operating philosophies of computer-based process safety systems. The main merit of the model is the explicit relationship in the mathematical formulas between failure cause and the means used to improve system reliability such as self-test, redundancy, preventive maintenance and corrective maintenance. A component failure taxonomy has been developed which allows the analyst to treat hardware failures, human failures, and software failures of automatic systems in an integrated manner. Furthermore, the taxonomy distinguishes between failures due to excessive environmental stresses and failures initiated by humans during engineering and operation. Attention has been given to developing a transparent model that provides predictions in good agreement with observed system performance and that is applicable for non-experts in the field of reliability.

  15. Structural reliability assessment capability in NESSUS

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  16. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2 - 'Human Reliability'. This group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology - GIS. It is composed of representatives of industry, representatives of research institutes, of technical control boards and universities, whose job it is to study how man fits into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the part of ergonomics dealing with human reliability in the use of technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de

  17. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    Figure captions (extraction residue): inverters connected in a chain; typical graph showing frequency versus square root of ... The report describes developing an experimental reliability-estimating methodology that could both illuminate the lifetime reliability of advanced devices and circuits and ... the FIT of the device. In other words, an accurate estimate of the device lifetime was found, and thus the reliability that can be conveniently

  18. Quantum computation and swarm intelligence applied in the optimization of identification of accidents in a PWR nuclear power plant

    International Nuclear Information System (INIS)

    Nicolau, Andressa; Schirru, Roberto; Medeiros, Jose A.C.C.

    2009-01-01

    This work presents the results of a performance evaluation study of the quantum-based algorithms QEA (Quantum Inspired Evolutionary Algorithm) and QSE (Quantum Swarm Evolutionary) when applied to the transient identification optimization problem of a nuclear power station operating at 100% of full power. For the evaluation of the algorithms, three benchmark functions were used. When compared to other similar optimization methods, QEA showed that it can be an efficient optimization tool, not only for combinatorial problems but also for numerical problems, particularly for complex problems such as the identification of transients in a nuclear power station. (author)

  19. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  20. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  1. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. GlideInWMS and its components are now also deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  4. Objectively Determining the Educational Potential of Computer and Video-Based Courseware; or, Producing Reliable Evaluations Despite the Dog and Pony Show.

    Science.gov (United States)

    Barrett, Andrew J.; And Others

    The Center for Interactive Technology, Applications, and Research at the College of Engineering of the University of South Florida (Tampa) has developed objective and descriptive evaluation models to assist in determining the educational potential of computer and video courseware. The computer-based courseware evaluation model and the video-based…

  5. A distributed computational search strategy for the identification of diagnostics targets: application to finding aptamer targets for methicillin-resistant staphylococci.

    Science.gov (United States)

    Flanagan, Keith; Cockell, Simon; Harwood, Colin; Hallinan, Jennifer; Nakjang, Sirintra; Lawry, Beth; Wipat, Anil

    2014-06-30

    The rapid and cost-effective identification of bacterial species is crucial, especially for clinical diagnosis and treatment. Peptide aptamers have been shown to be valuable for use as a component of novel, direct detection methods. These small peptides have a number of advantages over antibodies, including greater specificity and longer shelf life. These properties facilitate their use as the detector components of biosensor devices. However, the identification of suitable aptamer targets for particular groups of organisms is challenging. We present a semi-automated processing pipeline for the identification of candidate aptamer targets from whole bacterial genome sequences. The pipeline can be configured to search for protein sequence fragments that uniquely identify a set of strains of interest. The system is also capable of identifying additional organisms that may be of interest due to their possession of protein fragments in common with the initial set. Through the use of Cloud computing technology and distributed databases, our system is capable of scaling with the rapidly growing genome repositories, and consequently of keeping the resulting data sets up-to-date. The system described is also more generically applicable to the discovery of specific targets for other diagnostic approaches such as DNA probes, PCR primers and antibodies.

  6. A distributed computational search strategy for the identification of diagnostics targets: Application to finding aptamer targets for methicillin-resistant staphylococci

    Directory of Open Access Journals (Sweden)

    Flanagan Keith

    2014-06-01

    Full Text Available The rapid and cost-effective identification of bacterial species is crucial, especially for clinical diagnosis and treatment. Peptide aptamers have been shown to be valuable for use as a component of novel, direct detection methods. These small peptides have a number of advantages over antibodies, including greater specificity and longer shelf life. These properties facilitate their use as the detector components of biosensor devices. However, the identification of suitable aptamer targets for particular groups of organisms is challenging. We present a semi-automated processing pipeline for the identification of candidate aptamer targets from whole bacterial genome sequences. The pipeline can be configured to search for protein sequence fragments that uniquely identify a set of strains of interest. The system is also capable of identifying additional organisms that may be of interest due to their possession of protein fragments in common with the initial set. Through the use of Cloud computing technology and distributed databases, our system is capable of scaling with the rapidly growing genome repositories, and consequently of keeping the resulting data sets up-to-date. The system described is also more generically applicable to the discovery of specific targets for other diagnostic approaches such as DNA probes, PCR primers and antibodies.
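    The core idea of the pipeline described above, searching for protein sequence fragments that occur in every strain of a target set but in none of a background set, can be illustrated with a short sketch. This is not the authors' distributed, Cloud-based implementation; the function names, the fragment length k and the toy sequences below are assumptions for illustration only.

```python
from typing import Iterable, Set

def kmers(sequence: str, k: int) -> Set[str]:
    """Return the set of length-k fragments contained in a protein sequence."""
    return {sequence[i:i + k] for i in range(len(sequence) - k + 1)}

def unique_fragments(targets: Iterable[str], background: Iterable[str], k: int = 9) -> Set[str]:
    """Fragments present in every target sequence and absent from all background sequences."""
    target_sets = [kmers(seq, k) for seq in targets]
    shared = set.intersection(*target_sets) if target_sets else set()
    for seq in background:
        shared -= kmers(seq, k)
    return shared

# Toy example with made-up sequences; a real pipeline would scan whole proteomes.
targets = ["MKTAYIAKQRQISFVK", "MKTAYIAKQRQISAVK"]
background = ["MSTAYLAKQAQISFVN"]
print(sorted(unique_fragments(targets, background, k=5)))
```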

  7. The rating reliability calculator

    Directory of Open Access Journals (Sweden)

    Solomon David J

    2004-04-01

    Full Text Available Abstract Background Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods The program is available for public use on our server and the source code is freely available under GNU General Public License. The utility is written in PHP, a common open source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program will upload it to the server for calculating the reliability and other statistics describing the ratings. Results When the program is run it displays the reliability, number of subjects rated, harmonic mean number of judges rating each subject, the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion This simple web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to provide complete rating data. I would welcome other researchers revising and enhancing the program.
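    The abstract above mentions the Spearman-Brown prophecy formula, which the utility uses to estimate the reliability of an average of several ratings. A minimal sketch of that single step is shown below (in Python rather than the utility's PHP); the input reliability value is an assumed example, and this is not the full Ebel algorithm for incomplete rating data.

```python
def spearman_brown(single_rating_reliability: float, n_ratings: float) -> float:
    """Spearman-Brown prophecy: reliability of the mean of n parallel ratings."""
    r = single_rating_reliability
    return (n_ratings * r) / (1.0 + (n_ratings - 1.0) * r)

# Example: if a single judge's ratings have reliability 0.40, the average of
# 4 judges is predicted to have reliability of about 0.73.
print(round(spearman_brown(0.40, 4), 3))
```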

  8. Redefining reliability

    International Nuclear Information System (INIS)

    Paulson, S.L.

    1995-01-01

    Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options about the kind of natural gas service they need--and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and the new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas.

  9. Reliability and validity of a dual-probe personal computer-based muscle viewer for measuring the pennation angle of the medial gastrocnemius muscle in patients who have had a stroke.

    Science.gov (United States)

    Cho, Ji-Eun; Cho, Ki Hun; Yoo, Jun Sang; Lee, Su Jin; Lee, Wan-Hee

    2018-01-01

    Background A dual-probe personal computer-based muscle viewer (DPC-BMW) is advantageous in that it is relatively lightweight and easy to apply. Objective To investigate the reliability and validity of the DPC-BMW in comparison with those of a portable ultrasonography (P-US) device for measuring the pennation angle of the medial gastrocnemius (MG) muscle at rest and during contraction. Methods Twenty-four patients who had a stroke (18 men and 6 women) participated in this study. Using the DPC-BMW and P-US device, the pennation angle of the MG muscle on the affected side was randomly measured. Two examiners randomly obtained the images of all the participants in two separate test sessions, 7 days apart. Intraclass correlation coefficient (ICC), confidence interval, standard error of measurement, Bland-Altman plot, and Pearson correlation coefficient were used to estimate their reliability and validity. Results The ICC for the intrarater reliability of the MG muscle pennation angle measured using the DPC-BMW was > 0.916, indicating excellent reliability, and that for the interrater reliability ranged from 0.964 to 0.994. The P-US device also exhibited good reliability. A high correlation was found between the measurements of MG muscle pennation angle obtained using the DPC-BMW and that obtained using the P-US device (p < 0.01). Conclusion The DPC-BMW can provide clear images for accurate measurements, including measurements using dual probes. It has the advantage of rehabilitative US imaging for individuals who have had a stroke. More research studies are needed to evaluate the usefulness of the DPC-BMW in rehabilitation.

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been instrumental in site commissioning, increasing the number of sites that are available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  11. Three dimensional subsurface elemental identification of minerals using confocal micro-X-ray fluorescence and micro-X-ray computed tomography

    International Nuclear Information System (INIS)

    Cordes, Nikolaus L.; Seshadri, Srivatsan; Havrilla, George J.; Yuan, Xiaoli; Feser, Michael; Patterson, Brian M.

    2015-01-01

    Current non-destructive elemental characterization methods, such as scanning electron microscopy-based energy dispersive spectroscopy (SEM–EDS) and micro-X-ray fluorescence spectroscopy (MXRF), are limited to either elemental identification at the surface (SEM–EDS) or suffer from an inability to discriminate between surface or depth information (MXRF). Thus, a non-destructive elemental characterization of individual embedded particles beneath the surface is impossible with either of these techniques. This limitation can be overcome by using laboratory-based 3D confocal micro-X-ray fluorescence spectroscopy (confocal MXRF). This technique utilizes focusing optics on the X-ray source and detector which allows for spatial discrimination in all three dimensions. However, the voxel-by-voxel serial acquisition of a 3D elemental scan can be very time-intensive (~ 1 to 4 weeks) if it is necessary to locate individual embedded particles of interest. As an example, if each point takes a 5 s measurement time, a small volume of 50 × 50 × 50 pixels leads to an acquisition time of approximately 174 h, not including sample stage movement time. Initially screening the samples for particles of interest using micro-X-ray computed tomography (micro-CT) can significantly reduce the time required to spatially locate these particles. Once located, these individual particles can be elementally characterized with confocal MXRF. Herein, we report the elemental identification of high atomic number surface and subsurface particles embedded in a mineralogical matrix by coupling micro-CT and confocal MXRF. Synergistically, these two X-ray based techniques first rapidly locate and then elementally identify individual subsurface particles. - Highlights: • Coupling of confocal X-ray fluorescence spectroscopy and X-ray computed tomography • Qualitative elemental identification of surface and subsurface mineral particles • Non-destructive particle size measurements • Utilization of

  12. Three dimensional subsurface elemental identification of minerals using confocal micro-X-ray fluorescence and micro-X-ray computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Cordes, Nikolaus L., E-mail: ncordes@lanl.gov [Polymers and Coatings Group, Material Science and Technology Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Seshadri, Srivatsan, E-mail: srivatsan.seshadri@zeiss.com [Carl Zeiss X-ray Microscopy, Inc., Pleasanton, CA 94588 (United States); Havrilla, George J. [Chemical Diagnostics and Engineering, Chemistry Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Yuan, Xiaoli [Julius Kruttschnitt Mineral Research Centre, University of Queensland, Indooroopilly, Brisbane, QLD 4068 (Australia); Feser, Michael [Carl Zeiss X-ray Microscopy, Inc., Pleasanton, CA 94588 (United States); Patterson, Brian M. [Polymers and Coatings Group, Material Science and Technology Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States)

    2015-01-01

    Current non-destructive elemental characterization methods, such as scanning electron microscopy-based energy dispersive spectroscopy (SEM–EDS) and micro-X-ray fluorescence spectroscopy (MXRF), are limited to either elemental identification at the surface (SEM–EDS) or suffer from an inability to discriminate between surface or depth information (MXRF). Thus, a non-destructive elemental characterization of individual embedded particles beneath the surface is impossible with either of these techniques. This limitation can be overcome by using laboratory-based 3D confocal micro-X-ray fluorescence spectroscopy (confocal MXRF). This technique utilizes focusing optics on the X-ray source and detector which allows for spatial discrimination in all three dimensions. However, the voxel-by-voxel serial acquisition of a 3D elemental scan can be very time-intensive (~ 1 to 4 weeks) if it is necessary to locate individual embedded particles of interest. As an example, if each point takes a 5 s measurement time, a small volume of 50 × 50 × 50 pixels leads to an acquisition time of approximately 174 h, not including sample stage movement time. Initially screening the samples for particles of interest using micro-X-ray computed tomography (micro-CT) can significantly reduce the time required to spatially locate these particles. Once located, these individual particles can be elementally characterized with confocal MXRF. Herein, we report the elemental identification of high atomic number surface and subsurface particles embedded in a mineralogical matrix by coupling micro-CT and confocal MXRF. Synergistically, these two X-ray based techniques first rapidly locate and then elementally identify individual subsurface particles. - Highlights: • Coupling of confocal X-ray fluorescence spectroscopy and X-ray computed tomography • Qualitative elemental identification of surface and subsurface mineral particles • Non-destructive particle size measurements • Utilization of
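    As a quick check of the acquisition-time estimate quoted above (5 s per point over a 50 × 50 × 50 voxel volume, stage-movement time excluded), the arithmetic can be written out directly; the numbers below simply restate the abstract's figures.

```python
# Serial confocal MXRF acquisition for a 50 x 50 x 50 voxel volume at 5 s per point,
# excluding stage-movement time (reproduces the ~174 h figure quoted above).
points = 50 ** 3                 # 125,000 measurement points
total_seconds = points * 5       # 625,000 s
print(total_seconds / 3600)      # about 173.6 hours
```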

  13. MULTIPRED2: A computational system for large-scale identification of peptides predicted to bind to HLA supertypes and alleles

    DEFF Research Database (Denmark)

    Zhang, Guang Lan; DeLuca, David S.; Keskin, Derin B.

    2011-01-01

    binding peptides and immunological hotspots in an intuitive manner and also to provide a global view of results as heat maps. Another function of MULTIPRED2, which has direct relevance to vaccine design, is the calculation of population coverage. Currently it calculates population coverage in five major...... groups in North America. MULTIPRED2 is an important tool to complement wet-lab experimental methods for identification of T-cell epitopes. It is available at http://cvc.dfci.harvard.edu/multipred2/....

  14. Reliability studies by fast electronic simulation. Realization of an apparatus of configuration made by computer in function of the model to study

    International Nuclear Information System (INIS)

    Jurvillier, I.

    1991-01-01

    This reliability study concerns nuclear facilities; starting from the operating system scheme, the studies led to the development of an apparatus called ESCAF, now commercialized, which allows the analysis of even the most complex systems. These studies were carried out at the request of the safety evaluation services of the CEA, which must provide technical advice to the authorities empowered to authorize the construction and operation of nuclear power plants.

  15. Response to Dr. Smith's Comments and Criticisms Concerning "Identification of Student Misconceptions in Genetics Problem Solving via Computer Program."

    Science.gov (United States)

    Browning, Mark; Lehman, James D.

    1991-01-01

    Authors respond to criticisms by Smith in the same issue and defend their use of the terms "gene" and "misconception." Authors indicate that they did not believe that the use of computers significantly skewed their data concerning student errors. (PR)

  16. Reliability and accuracy of three imaging software packages used for 3D analysis of the upper airway on cone beam computed tomography images.

    Science.gov (United States)

    Chen, Hui; van Eijnatten, Maureen; Wolff, Jan; de Lange, Jan; van der Stelt, Paul F; Lobbezoo, Frank; Aarab, Ghizlane

    2017-08-01

    The aim of this study was to assess the reliability and accuracy of three different imaging software packages for three-dimensional analysis of the upper airway using CBCT images. To assess the reliability of the software packages, 15 NewTom 5G® (QR Systems, Verona, Italy) CBCT data sets were randomly and retrospectively selected. Two observers measured the volume, minimum cross-sectional area and the length of the upper airway using Amira® (Visage Imaging Inc., Carlsbad, CA), 3Diagnosys® (3diemme, Cantu, Italy) and OnDemand3D® (CyberMed, Seoul, Republic of Korea) software packages. The intra- and inter-observer reliability of the upper airway measurements were determined using intraclass correlation coefficients and Bland & Altman agreement tests. To assess the accuracy of the software packages, one NewTom 5G® CBCT data set was used to print a three-dimensional anthropomorphic phantom with known dimensions to be used as the "gold standard". This phantom was subsequently scanned using a NewTom 5G® scanner. Based on the CBCT data set of the phantom, one observer measured the volume, minimum cross-sectional area, and length of the upper airway using Amira®, 3Diagnosys®, and OnDemand3D®, and compared these measurements with the gold standard. The intra- and inter-observer reliability of the measurements of the upper airway using the different software packages were excellent (intraclass correlation coefficient ≥0.75). There was excellent agreement between all three software packages in volume, minimum cross-sectional area and length measurements. All software packages underestimated the upper airway volume by -8.8% to -12.3%, the minimum cross-sectional area by -6.2% to -14.6%, and the length by -1.6% to -2.9%. All three software packages offered reliable volume, minimum cross-sectional area and length measurements of the upper airway. The length measurements of the upper airway were the most accurate results in all software packages. All

  17. MO-AB-BRA-04: Correct Identification of Low-Attenuation Intracranial Hemorrhage and Calcification Using Dual-Energy Computed Tomography in a Phantom System

    Energy Technology Data Exchange (ETDEWEB)

    Nute, J; Jacobsen, M; Popnoe, D [UT MD Anderson Cancer Center, Department of Imaging Physics, Houston, TX (United States); UT Graduate School of Biomedical Sciences at Houston, Houston, TX (United States); Wei, W [UT MD Anderson Cancer Center, Department of Biostatistics, Houston, TX (United States); Baiu, C [Gammex Inc., Middleton, WI (United States); Schellingerhout, D [MD Anderson Cancer Center, Department of Diagnostic Radiology, Houston, TX (United States); Cody, D [UT MD Anderson Cancer Center, Department of Imaging Physics, Houston, TX (United States)

    2015-06-15

    Purpose: Intracranial hemorrhage and calcification with Single-Energy CT (SECT) attenuation below 100HU cannot be reliably identified using currently clinically available means. Calcification is typically benign but hemorrhage can carry a risk of intracranial bleeding and contraindicate use of anticoagulant therapies. A biologically-relevant phantom was used to investigate identification of unknown intracranial lesions using dual-energy CT (DECT) as a verification of prior lesion differentiation results. Methods: Prior phantom work investigating calcification and hemorrhage differentiation resulted in 3D-DECT raw data (water density, calcium density, 68keV) for a range of DECT protocol variations: image thicknesses (1.25, 2.5, 3.75, 5mm), CTDIvol (36.7 to 132.6mGy) and reconstruction algorithms (Soft, Standard, Detail). Acquisition-specific raw data were used to create a plane of optimal differentiation based on the geometric bisector of 3D-linear regression of the two lesion distributions. Verification hemorrhage and calcification lesions, ranging in size from 0.5 to 1.5cm, were created at varying attenuation from 50 to 100HU. Lesions were inserted into a biologically-relevant brain phantom and scanned using SECT (3.75mm images, Standard, 67mGy) and a range of DECT protocols (3.75mm images, Standard, [67, 105.6, 132.6mGy]). 3D-DECT data were collected and blinded for analysis. The 3D-DECT distribution of the lesion was then compared to the acquisition-matched geometric bisector plane and the mean lesion value’s position relative to the plane, indicating lesion identity, and the percentage of voxels on the identified side of the plane, indicating identification confidence, were derived. Results: 98% of the 120 lesions investigated were identified correctly as hemorrhage or calcification. 74% were identified with greater than 80% confidence. Increases in CTDIvol and lesion diameter were associated with increased identification confidence. Conclusion: Intracranial
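    A minimal sketch of the decision rule described above, classifying a lesion by which side of an acquisition-matched bisector plane its mean DECT value falls on and reporting the fraction of voxels on that side as a confidence, is given below. The plane parameters, the feature values and the mapping of plane side to lesion type are illustrative assumptions, not the published calibration.

```python
import numpy as np

def classify_lesion(voxels, plane_normal, plane_point):
    """Classify a lesion from its DECT voxel triplets (water density, calcium density, 68 keV).

    The side of the separating plane on which the mean voxel value falls gives the label;
    the fraction of voxels on that same side is reported as an identification confidence.
    """
    signed = (np.asarray(voxels) - plane_point) @ plane_normal   # signed distance per voxel
    mean_side = np.sign(signed.mean())
    label = "calcification" if mean_side > 0 else "hemorrhage"   # side-to-label mapping is illustrative
    confidence = float(np.mean(np.sign(signed) == mean_side))
    return label, confidence

# Toy inputs: 200 voxels in the 3D DECT feature space and an assumed bisector plane.
rng = np.random.default_rng(0)
voxels = rng.normal(loc=[1.00, 0.05, 80.0], scale=[0.02, 0.02, 5.0], size=(200, 3))
plane_normal = np.array([0.0, 1.0, 0.0])
plane_point = np.array([1.00, 0.03, 80.0])
print(classify_lesion(voxels, plane_normal, plane_point))
```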

  18. Acceleration of Cherenkov angle reconstruction with the new Intel Xeon/FPGA compute platform for the particle identification in the LHCb Upgrade

    Science.gov (United States)

    Faerber, Christian

    2017-10-01

    The LHCb experiment at the LHC will upgrade its detector by 2018/2019 to a ‘triggerless’ readout scheme, where all the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the Event Filter farm to 40 TBit/s, which also has to be processed to select the interesting proton-proton collisions for later storage. Designing the architecture of a computing farm that can process this amount of data as efficiently as possible is a challenging task, and several compute accelerator technologies are being considered for use inside the new Event Filter farm. In the high-performance computing sector, more and more FPGA compute accelerators are used to improve the compute performance and reduce the power consumption (e.g. in the Microsoft Catapult project and Bing search engine). For the LHCb upgrade, the use of an experimental FPGA-accelerated computing platform in the Event Building or in the Event Filter farm is also being considered and therefore tested. This platform from Intel hosts a general-purpose CPU and a high-performance FPGA linked via a high-speed link, which for this platform is a QPI link. On the FPGA an accelerator is implemented. The system used is a two-socket platform from Intel with a Xeon CPU and an FPGA. The FPGA has cache-coherent memory access to the main memory of the server and can collaborate with the CPU. As a first step, a computing-intensive algorithm to reconstruct Cherenkov angles for the LHCb RICH particle identification was successfully ported in Verilog to the Intel Xeon/FPGA platform and accelerated by a factor of 35. The same algorithm was ported to the Intel Xeon/FPGA platform with OpenCL. The implementation work and the performance will be compared. Another FPGA accelerator, the Nallatech 385 PCIe accelerator with the same Stratix V FPGA, was also tested for performance. The results show that the Intel

  19. Portable Brain-Computer Interface for the Intensive Care Unit Patient Communication Using Subject-Dependent SSVEP Identification.

    Science.gov (United States)

    Dehzangi, Omid; Farooq, Muhamed

    2018-01-01

    A major predicament for Intensive Care Unit (ICU) patients is the lack of consistent and effective means of communication. Patients rated most communication sessions as difficult and unsuccessful. This, in turn, can cause distress, unrecognized pain, anxiety, and fear. As such, we designed a portable BCI system for ICU communications (BCI4ICU) optimized to operate effectively in an ICU environment. The system utilizes a wearable EEG cap coupled with an Android app on a mobile device that serves as the visual-stimulus and data-processing module. Furthermore, to overcome the challenges that BCI systems face today in real-world scenarios, we propose a novel subject-specific Gaussian Mixture Model- (GMM-) based training and adaptation algorithm. First, we incorporate subject-specific information in the training phase of the SSVEP identification model using GMM-based training and adaptation. We evaluate subject-specific models against other subjects. Subsequently, from the GMM discriminative scores, we generate the transformed vectors, which are passed to our predictive model. Finally, the adapted mixture mean scores of the subject-specific GMMs are utilized to generate the high-dimensional supervectors. Our experimental results demonstrate that the proposed system achieved 98.7% average identification accuracy, which is promising for providing effective and consistent communication for patients in intensive care.
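    The subject-specific GMM scoring step described above can be sketched with scikit-learn as follows. The feature dimensions, component count and random data below are assumptions; EEG feature extraction and the supervector/predictive-model stages of the proposed system are omitted.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Assumed inputs: per-trial SSVEP feature vectors for the target subject and for other
# subjects; random placeholders stand in for features extracted from the EEG cap.
rng = np.random.default_rng(1)
subject_feats = rng.normal(0.0, 1.0, size=(200, 8))
other_feats = rng.normal(0.5, 1.2, size=(200, 8))

# Subject-specific GMM trained only on the target subject's data.
subject_gmm = GaussianMixture(n_components=4, random_state=0).fit(subject_feats)

# Discriminative score for new trials: higher log-likelihood under the subject's model
# suggests the trial matches that subject's SSVEP response.
print(subject_gmm.score_samples(subject_feats[:3]))   # own trials: higher scores expected
print(subject_gmm.score_samples(other_feats[:3]))     # other subject: lower scores expected
```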

  20. Computer vision applied to herbarium specimens of German trees: testing the future utility of the millions of herbarium specimen images for automated identification.

    Science.gov (United States)

    Unger, Jakob; Merhof, Dorit; Renner, Susanne

    2016-11-16

    Global Plants, a collaboration between JSTOR and some 300 herbaria, now contains about 2.48 million high-resolution images of plant specimens, a number that continues to grow, and collections that are digitizing their specimens at high resolution are allocating considerable resources to the maintenance of computer hardware (e.g., servers) and to acquiring digital storage space. We here apply machine learning, specifically the training of a Support-Vector-Machine, to classify specimen images into categories, ideally at the species level, using the 26 most common tree species in Germany as a test case. We designed an analysis pipeline and classification system consisting of segmentation, normalization, feature extraction, and classification steps and evaluated the system in two test sets, one with 26 species, the other with 17, in each case using 10 images per species of plants collected between 1820 and 1995, which simulates the empirical situation that most named species are represented in herbaria and databases, such as JSTOR, by few specimens. We achieved 73.21% accuracy of species assignments in the larger test set, and 84.88% in the smaller test set. The results of this first application of a computer vision algorithm trained on images of herbarium specimens show that despite the problem of overlapping leaves, leaf-architectural features can be used to categorize specimens to species with good accuracy. Computer vision is poised to play a significant role in future rapid identification at least for frequently collected genera or species in the European flora.
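    A hedged sketch of the classification stage (a Support Vector Machine trained on per-specimen leaf features) is given below; the segmentation, normalization and feature-extraction steps of the published pipeline are assumed to have already produced the feature matrix, and random placeholders stand in for real descriptors.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Assumed inputs: one feature vector per herbarium sheet (e.g. leaf-architecture
# descriptors) and the species label of each sheet.
rng = np.random.default_rng(2)
n_species, sheets_per_species, n_features = 26, 10, 40
X = rng.normal(size=(n_species * sheets_per_species, n_features))
y = np.repeat(np.arange(n_species), sheets_per_species)

# Normalisation followed by a support-vector classifier, evaluated by cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
print(cross_val_score(clf, X, y, cv=5).mean())
```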

  1. Improving the Reliability of Student Scores from Speeded Assessments: An Illustration of Conditional Item Response Theory Using a Computer-Administered Measure of Vocabulary

    Science.gov (United States)

    Petscher, Yaacov; Mitchell, Alison M.; Foorman, Barbara R.

    2015-01-01

    A growing body of literature suggests that response latency, the amount of time it takes an individual to respond to an item, may be an important factor to consider when using assessment data to estimate the ability of an individual. Considering that tests of passage and list fluency are being adapted to a computer administration format, it is…

  2. INTRA- AND INTER-OBSERVER RELIABILITY IN SELECTION OF THE HEART RATE DEFLECTION POINT DURING INCREMENTAL EXERCISE: COMPARISON TO A COMPUTER-GENERATED DEFLECTION POINT

    Directory of Open Access Journals (Sweden)

    Bridget A. Duoos

    2002-12-01

    Full Text Available This study was designed to 1) determine the relative frequency of occurrence of a heart rate deflection point (HRDP), when compared to a linear relationship, during progressive exercise, 2) measure the reproducibility of a visual assessment of a heart rate deflection point (HRDP), both within and between observers, and 3) compare visual and computer-assessed deflection points. Subjects consisted of 73 competitive male cyclists with mean age of 31.4 ± 6.3 years, mean height 178.3 ± 4.8 cm and weight 74.0 ± 4.4 kg. Tests were conducted on an electrically-braked cycle ergometer beginning at 25 watts and progressing 25 watts per minute to fatigue. Heart rates were recorded the last 10 seconds of each stage and at fatigue. Scatter plots of heart rate versus watts were computer-generated and given to 3 observers on two different occasions. A computer program was developed to assess if data points were best represented by a single line or two lines. The HRDP represented the intersection of the two lines. Results of this study showed that 1) computer-assessed HRDP showed that 44 of 73 subjects (60.3%) had scatter plots best represented by a straight line with no HRDP, 2) in those subjects having HRDP, all 3 observers showed significant differences (p = 0.048, p = 0.007, p = 0.001) in reproducibility of their HRDP selection. Differences in HRDP selection were significant for two of the three comparisons between observers (p = 0.002, p = 0.305, p = 0.0003). Computer-generated HRDP was significantly different than visual HRDP for 2 of 3 observers (p = 0.0016, p = 0.513, p = 0.0001). It is concluded that 1) HRDP occurs in a minority of subjects, 2) significant differences exist, both within and between observers, in selection of HRDP and 3) differences in agreement between visual and computer-generated HRDP would indicate that, when HRDP exists, it should be computer-assessed.
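    A minimal sketch of the computer assessment described above, comparing a single-line fit with the best two-line fit of heart rate versus workload and reporting the intersection of the two lines as the HRDP, is shown below. The residual-improvement threshold used to decide that two lines are warranted is an assumption for illustration, not the published criterion.

```python
import numpy as np

def deflection_point(watts, hr, min_improvement=0.30):
    """Return (breakpoint_watts, hr_at_breakpoint), or None if a single line fits as well."""
    watts, hr = np.asarray(watts, float), np.asarray(hr, float)
    one_line = np.polyfit(watts, hr, 1)
    sse_one = np.sum((hr - np.polyval(one_line, watts)) ** 2)
    best = None
    for i in range(3, len(watts) - 3):            # candidate breakpoints, >= 3 points per segment
        left = np.polyfit(watts[:i], hr[:i], 1)
        right = np.polyfit(watts[i:], hr[i:], 1)
        sse = (np.sum((hr[:i] - np.polyval(left, watts[:i])) ** 2)
               + np.sum((hr[i:] - np.polyval(right, watts[i:])) ** 2))
        if best is None or sse < best[0]:
            best = (sse, left, right)
    sse_two, left, right = best
    if sse_two > (1.0 - min_improvement) * sse_one:
        return None                                # one straight line is adequate: no HRDP
    x = (right[1] - left[1]) / (left[0] - right[0])   # intersection of the two fitted lines
    return float(x), float(np.polyval(left, x))

# Synthetic test: heart rate rises at 0.30 bpm/W and flattens to 0.15 bpm/W above 300 W.
watts = np.arange(25, 426, 25)
hr = np.where(watts < 300, 80 + 0.30 * watts, 170 + 0.15 * (watts - 300))
hr = hr + np.random.default_rng(3).normal(0.0, 1.0, watts.size)
print(deflection_point(watts, hr))
```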

  3. Software reliability prediction using SPN | Abbasabadee | Journal of ...

    African Journals Online (AJOL)

    Software reliability prediction using SPN. ... In this research for computation of software reliability, component reliability model based on SPN would be proposed. An isomorphic markov ...

  4. Synthesis of radiolabelled aryl azides from diazonium salts: experimental and computational results permit the identification of the preferred mechanism.

    Science.gov (United States)

    Joshi, Sameer M; de Cózar, Abel; Gómez-Vallejo, Vanessa; Koziorowski, Jacek; Llop, Jordi; Cossío, Fernando P

    2015-05-28

    Experimental and computational studies on the formation of aryl azides from the corresponding diazonium salts support a stepwise mechanism via acyclic zwitterionic intermediates. The low energy barriers associated with both transition structures are compatible with very fast and efficient processes, thus making this method suitable for the chemical synthesis of radiolabelled aryl azides.

  5. Identification of discrete vascular lesions in the extremities using post-mortem computed tomography angiography – Case reports

    NARCIS (Netherlands)

    Haakma, Wieke; Rohde, Marianne; Uhrenholt, Lars; Pedersen, Michael; Boel, Lene Warner Thorup

    2017-01-01

    In this case report, we introduced post-mortem computed tomography angiography (PMCTA) in three cases suffering from vascular lesions in the upper extremities. In each subject, the third part of the axillary arteries and veins were used to catheterize the arms. The vessels were filled with a barium

  6. Developing an Educational Computer Game for Migratory Bird Identification Based on a Two-Tier Test Approach

    Science.gov (United States)

    Chu, Hui-Chun; Chang, Shao-Chen

    2014-01-01

    Although educational computer games have been recognized as being a promising approach, previous studies have indicated that, without supportive models, students might only show temporary interest during the game-based learning process, and their learning performance is often not as good as expected. Therefore, in this paper, a two-tier test…

  7. Computational identification of potential multitarget treatments for ameliorating the adverse effects of amyloid-β on synaptic plasticity.

    Science.gov (United States)

    Anastasio, Thomas J

    2014-01-01

    The leading hypothesis on Alzheimer Disease (AD) is that it is caused by buildup of the peptide amyloid-β (Aβ), which initially causes dysregulation of synaptic plasticity and eventually causes destruction of synapses and neurons. Pharmacological efforts to limit Aβ buildup have proven ineffective, and this raises the twin challenges of understanding the adverse effects of Aβ on synapses and of suggesting pharmacological means to prevent them. The purpose of this paper is to initiate a computational approach to understanding the dysregulation by Aβ of synaptic plasticity and to offer suggestions whereby combinations of various chemical compounds could be arrayed against it. This data-driven approach confronts the complexity of synaptic plasticity by representing findings from the literature in a coarse-grained manner, and focuses on understanding the aggregate behavior of many molecular interactions. The same set of interactions is modeled by two different computer programs, each written using a different programming modality: one imperative, the other declarative. Both programs compute the same results over an extensive test battery, providing an essential crosscheck. Then the imperative program is used for the computationally intensive purpose of determining the effects on the model of every combination of ten different compounds, while the declarative program is used to analyze model behavior using temporal logic. Together these two model implementations offer new insights into the mechanisms by which Aβ dysregulates synaptic plasticity and suggest many drug combinations that potentially may reduce or prevent it.

  8. Computational identification of potential multitarget treatments for ameliorating the adverse effects of amyloid-beta on synaptic plasticity

    Directory of Open Access Journals (Sweden)

    Thomas J. Anastasio

    2014-05-01

    Full Text Available The leading hypothesis on Alzheimer Disease (AD) is that it is caused by buildup of the peptide amyloid-beta (Abeta), which initially causes dysregulation of synaptic plasticity and eventually causes destruction of synapses and neurons. Pharmacological efforts to limit Abeta buildup have proven ineffective, and this raises the twin challenges of understanding the adverse effects of Abeta on synapses and of suggesting pharmacological means to prevent it. The purpose of this paper is to initiate a computational approach to understanding the dysregulation by Abeta of synaptic plasticity and to offer suggestions whereby combinations of various chemical compounds could be arrayed against it. This data-driven approach confronts the complexity of synaptic plasticity by representing findings from the literature in a coarse-grained manner, and focuses on understanding the aggregate behavior of many molecular interactions. The same set of interactions is modeled by two different computer programs, each written using a different programming modality: one imperative, the other declarative. Both programs compute the same results over an extensive test battery, providing an essential crosscheck. Then the imperative program is used for the computationally intensive purpose of determining the effects on the model of every combination of ten different compounds, while the declarative program is used to analyze model behavior using temporal logic. Together these two model implementations offer new insights into the mechanisms by which Abeta dysregulates synaptic plasticity and suggest many drug combinations that potentially may reduce or prevent it.
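    The exhaustive screen described above, evaluating the model under every combination of ten compounds, amounts to enumerating 2^10 = 1024 subsets. A minimal sketch of that enumeration is given below; the compound names and the scoring function are placeholders, not the authors' model.

```python
from itertools import combinations

compounds = [f"compound_{i}" for i in range(1, 11)]   # ten hypothetical compounds

def evaluate(combo):
    """Placeholder for running the synaptic-plasticity model under one drug combination."""
    return len(combo)   # stand-in score; the real model returns a plasticity readout

results = {}
for size in range(len(compounds) + 1):
    for combo in combinations(compounds, size):
        results[combo] = evaluate(combo)

print(len(results))   # 1024 combinations, including the no-compound baseline
```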

  9. Computational Prediction, Target Identification and Experimental Validation of miRNAs from Expressed Sequence Tags in Cannabis sativa L

    Czech Academy of Sciences Publication Activity Database

    Duraisamy, Ganesh Selvaraj; Mishra, Ajay Kumar; Jakše, J.; Matoušek, Jaroslav

    2015-01-01

    Roč. 4, č. 2 (2015), s. 32-42 ISSN 2320-0189 R&D Projects: GA ČR GA13-03037S Institutional support: RVO:60077344 Keywords: Cannabis sativa * microRNAs * Cis-regulating elements * Computational approach Subject RIV: EB - Genetics; Molecular Biology

  10. A study on the value of computer-assisted assessment for SPECT/CT-scans in sentinel lymph node diagnostics of penile cancer as well as clinical reliability and morbidity of this procedure.

    Science.gov (United States)

    Lützen, Ulf; Naumann, Carsten Maik; Marx, Marlies; Zhao, Yi; Jüptner, Michael; Baumann, René; Papp, László; Zsótér, Norbert; Aksenov, Alexey; Jünemann, Klaus-Peter; Zuhayra, Maaz

    2016-09-07

    Because of the increasing importance of computer-assisted post-processing of image data in modern medical diagnostics, we studied the value of an algorithm for the assessment of single photon emission computed tomography/computed tomography (SPECT/CT) data, which has been used for the first time for lymph node staging in penile cancer with non-palpable inguinal lymph nodes. In the guidelines of the relevant international expert societies, sentinel lymph node biopsy (SLNB) is recommended as the diagnostic method of choice. The aim of this study is to evaluate the value of the aforementioned algorithm and, in the clinical context, the reliability and associated morbidity of this procedure. Between 2008 and 2015, 25 patients with invasive penile cancer and inconspicuous inguinal lymph node status underwent SLNB after application of the radiotracer Tc-99m labelled nanocolloid. In a prospective approach, we recorded the reliability and the complication rate of the procedure. In addition, we evaluated the results of an algorithm for SPECT/CT-data assessment of these patients. SLNB was carried out in 44 groins of 25 patients. In three patients, inguinal lymph node metastases were detected via SLNB. In one patient, bilateral lymph node recurrence of the groins occurred after negative SLNB. There was a false-negative rate of 4 % in relation to the number of patients (1/25) and 4.5 % in relation to the number of groins (2/44). Morbidity was 4 % in relation to the number of patients (1/25) and 2.3 % in relation to the number of groins (1/44). The results of computer-assisted assessment of SPECT/CT data for sentinel lymph node (SLN) diagnostics demonstrated high sensitivity of 88.8 % and specificity of 86.7 %. SLNB is a very reliable method, associated with low morbidity. Computer-assisted assessment of SPECT/CT data for SLN diagnostics shows high sensitivity and specificity. While it cannot replace the assessment by medical experts, it can still provide substantial

  11. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability, covering the definition of reliability, reliability requirements, the system life cycle and reliability, and reliability and failure rate (overview, reliability characteristics, chance failures, failure rates that change over time, failure modes, and replacement). It also covers reliability in engineering design, reliability testing under failure-rate assumptions, plotting of reliability data, prediction of system reliability, system conservation, failure (overview and failure relay), and analysis of system safety.

  12. Bone marrow edema pattern identification in patients with lytic bone lesions using digital subtraction angiography-like bone subtraction on large-area detector computed tomography.

    Science.gov (United States)

    Gondim Teixeira, Pedro Augusto; Hossu, Gabriela; Lecocq, Sophie; Razeto, Marco; Louis, Matthias; Blum, Alain

    2014-03-01

    The objective of this study was to evaluate the performance of digital subtraction angiography (DSA)-like bone subtraction with 2 different registration methods for the identification of bone marrow edema pattern (BMEP) in patients with lytic bone lesions, using magnetic resonance imaging as the criterion standard. Fifty-five patients with a lytic bone lesion were included in this prospective study with approval from the ethics committee. All patients underwent magnetic resonance imaging and low-dose computed tomographic (CT) perfusion after signing an informed consent. Two CT volumes were used for bone subtraction, which was performed with 2 different algorithms (rigid and nonrigid). Enhancement at the nonlytic bone marrow was considered as a sign of BMEP. Two readers evaluated the images blindly. The presence of BMEP on bone-subtracted CT images was evaluated subjectively and quantitatively. Image quality was assessed. Magnetic resonance imaging was used as the criterion standard. Using a rigid registration method, the sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of CT with DSA-like bone subtraction for BMEP detection were 77%, 100%, 100%, 68%, and 85%, respectively. The interobserver agreement was good (κ, 0.782). Image quality was better using a nonrigid registration. With this algorithm, artifacts interfered with image interpretation in only 5% of cases. However, there was a noticeable drop in sensitivity and negative predictive value when a nonrigid algorithm was used: 56% and 52%, respectively. The interobserver agreement was average with a nonrigid subtraction algorithm. Computed tomography with DSA-like bone subtraction is sensitive and highly specific for the identification of BMEP associated with lytic bone lesions. Rigid registration should be preferred, but nonrigid algorithms can be used as a second option when artifacts interfere with image interpretation.

  13. Reconstruction and identification of electrons in the Atlas experiment. Setup of a Tier 2 of the computing grid

    International Nuclear Information System (INIS)

    Derue, F.

    2008-03-01

    The origin of the mass of elementary particles is linked to the electroweak symmetry breaking mechanism. Its study will be one of the main efforts of the Atlas experiment at the Large Hadron Collider at CERN, starting in 2008. In most cases, studies will be limited by our knowledge of the detector performance, such as the precision of the energy reconstruction or the efficiency of particle identification. This manuscript presents work dedicated to the reconstruction of electrons in the Atlas experiment, using simulated data and data taken during the combined test beam of 2004. The analysis of Atlas data requires a huge amount of computing and storage resources, which led to the development of a worldwide computing grid. (author)

  14. A computational method for identification of vaccine targets from protein regions of conserved human leukocyte antigen binding

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Simon, Christian; Kudahl, Ulrich J.

    2015-01-01

    Background: Computational methods for T cell-based vaccine target discovery focus on selection of highly conserved peptides identified across pathogen variants, followed by prediction of their binding of human leukocyte antigen molecules. However, experimental studies have shown that T cells often ... or proteome using human leukocyte antigen binding predictions, and made a web-accessible software implementation freely available at http://met-hilab.cbs.dtu.dk/blockcons/.
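
    The first step described above, selecting peptide regions conserved across pathogen variants before HLA binding prediction, can be illustrated with a toy sketch: find 9-mer windows identical across aligned variant sequences. This is not the published blockcons algorithm; the sequences are invented, and the HLA binding-prediction step is deliberately left out.

```python
# Toy sketch of the conserved-region step: find 9-mer windows that are
# identical across all aligned variant sequences. The real method couples
# such blocks with HLA binding predictions; that step is omitted here, and
# the sequences are invented for illustration.

def conserved_kmers(aligned_seqs: list[str], k: int = 9) -> list[tuple[int, str]]:
    length = min(len(s) for s in aligned_seqs)
    hits = []
    for i in range(length - k + 1):
        window = aligned_seqs[0][i:i + k]
        if "-" in window:  # skip alignment gaps
            continue
        if all(s[i:i + k] == window for s in aligned_seqs[1:]):
            hits.append((i, window))
    return hits

variants = [
    "MSLLTEVETPIRNEWGCRCNDSSDP",
    "MSLLTEVETPIRNEWGCRCSDSSDP",
    "MSLLTEVETPIRNEWGCRCNDSSDP",
]
for pos, peptide in conserved_kmers(variants):
    print(pos, peptide)  # candidate peptides to pass to an HLA binding predictor
```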

  15. Identification of the Procedural Accidents During Root Canal Preparation Using Digital Intraoral Radiography and Cone Beam Computed Tomography

    OpenAIRE

    Csinszka K.-Ivácson A.-; Maria Monea Adriana; Monica Monea; Mihai Pop; Angela Borda

    2016-01-01

    Crown or root perforation, ledge formation, fractured instruments and perforation of the roots are the most important accidents which appear during endodontic therapy. Our objective was to evaluate the value of digital intraoral periapical radiographs compared with cone beam computed tomography (CBCT) images used to diagnose some procedural accidents. Material and methods: Eleven extracted molars were used in this study. A total of 18 perforations and 13 ledges were created artificially and 10 i...

  16. Parametric Mass Reliability Study

    Science.gov (United States)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, typically are the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates the mass of ORUs to reliability, as well as the mass of ORU subcomponents to reliability.
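
    A parametric model of the kind described, relating component mass to reliability, could take many forms. As one hedged illustration (not the actual NASA model), the sketch below fits a power-law relation between mass and failure rate by least squares in log-log space; the data points are invented.

```python
import math

# Illustrative sketch of a parametric mass-vs-reliability relation:
# fit failure_rate ~ a * mass**b by least squares in log-log space.
# The (mass, failure rate) pairs below are invented.

data = [  # (mass in kg, failures per million hours)
    (0.05, 40.0), (0.2, 22.0), (1.0, 9.0), (5.0, 4.5), (20.0, 2.0),
]

xs = [math.log(m) for m, _ in data]
ys = [math.log(fr) for _, fr in data]
n = len(data)
x_bar = sum(xs) / n
y_bar = sum(ys) / n
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
    sum((x - x_bar) ** 2 for x in xs)
a = math.exp(y_bar - b * x_bar)
print(f"failure_rate ~ {a:.2f} * mass^{b:.2f}")
```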

  17. Reliability of a structured interview for admission to an emergency medicine residency program.

    Science.gov (United States)

    Blouin, Danielle

    2010-10-01

    Interviews are among the most important elements of resident selection. Structured interviews are more reliable than unstructured ones. We sought to measure the interrater reliability of a newly designed structured interview during the selection process for an Emergency Medicine residency program. The critical incident technique was used to extract the desired dimensions of performance. The interview tool consisted of 7 clinical scenarios and 1 global rating. Three trained interviewers marked each candidate on all scenarios without discussing the candidates' responses. Interitem consistency and estimates of variance were computed. Twenty-eight candidates were interviewed. The generalizability coefficient was 0.67. Removing the central tendency ratings increased the coefficient to 0.74. Coefficients of interitem consistency ranged from 0.64 to 0.74. The structured interview tool provided good although suboptimal interrater reliability. Increasing the number of scenarios improves reliability, as does applying differential weights to the rating scale anchors. The latter would also facilitate the identification of candidates with extreme ratings.
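
    The interitem consistency coefficients reported above are of the Cronbach's alpha family. The sketch below shows how such a coefficient is computed from a candidates-by-scenarios matrix of ratings; the scores are hypothetical, and the study's generalizability analysis is more elaborate than this.

```python
# Minimal sketch: Cronbach's alpha as a measure of interitem consistency,
# computed over a candidates x scenarios matrix of ratings. The scores below
# are hypothetical (7 scenario ratings per candidate, as in the tool above).

def variance(values: list[float]) -> float:
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / (len(values) - 1)

def cronbach_alpha(scores: list[list[float]]) -> float:
    """scores[c][i] = rating of candidate c on item (scenario) i."""
    k = len(scores[0])
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

ratings = [
    [4, 5, 4, 4, 3, 5, 4],
    [3, 3, 4, 3, 3, 4, 3],
    [5, 5, 5, 4, 4, 5, 5],
    [2, 3, 2, 3, 2, 3, 2],
]
print(round(cronbach_alpha(ratings), 3))
```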

  18. Reliability of structural systems subject to fatigue

    International Nuclear Information System (INIS)

    Rackwitz, R.

    1984-01-01

    Concepts and computational procedures for the reliability calculation of structural systems subject to fatigue are outlined. Systems are dealt with by approximately computing componential times to first failure. So-called first-order reliability methods are then used to formulate dependencies between componential failures and to evaluate the system failure probability. (Author) [pt
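
    As a hedged sketch of the final step outlined above: once componential first-order reliability indices β_i are available, a series-system failure probability can be approximated as P_f ≈ 1 - ∏ Φ(β_i) under the simplifying assumption of independent failure modes (the paper itself accounts for dependencies between componential failures); the β values below are invented.

```python
from math import erf, sqrt

# Hedged sketch: combining componential first-order reliability indices into
# a series-system failure probability, assuming independent failure modes.
# The beta values are invented for illustration.

def phi(x: float) -> float:
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def series_failure_probability(betas: list[float]) -> float:
    p_survive = 1.0
    for beta in betas:
        p_survive *= phi(beta)  # component survival probability
    return 1.0 - p_survive

betas = [3.2, 2.8, 3.5]  # componential reliability indices
print(f"P_f ~ {series_failure_probability(betas):.2e}")
```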

  19. Reliability and validity of the Chinese version of the Alcohol Use Disorders Identification Test

    Institute of Scientific and Technical Information of China (English)

    张聪; 杨国平; 李圳; 李小宁; 李洋; 胡洁; 张凤云; 张徐军

    2017-01-01

    Objective: To assess the reliability and validity of the Chinese version of the Alcohol Use Disorders Identification Test (AUDIT) among medical students in China, and to provide a sound basis for applying the recommended scale. Methods: An E-questionnaire was developed and sent to medical students at five different colleges; all students participated voluntarily. Cronbach's α and split-half reliability were calculated to evaluate the reliability of the AUDIT, while content, construct, discriminant and convergent validity were assessed to measure the validity of the scale. Results: The overall Cronbach's α of the AUDIT was 0.782 and the split-half reliability was 0.711. The domain-level Cronbach's α and split-half reliability were 0.796 and 0.794 for hazardous alcohol use, 0.561 and 0.623 for dependence symptoms, and 0.647 and 0.640 for harmful alcohol use. The item-level content validity indices (I-CVI) ranged from 0.83 to 1.00, the scale-level content validity index based on universal agreement (S-CVI/UA) was 0.90, the average scale-level content validity index (S-CVI/Ave) was 0.99, and the content validity ratios (CVR) ranged from 0.80 to 1.00. Exploratory factor analysis of the simplified version of the AUDIT supported the presupposed three-factor structure, which explained 61.175% of the total variance. The AUDIT showed good convergent and discriminant validity, with a 100% success rate in the calibration experiment. Conclusion: The AUDIT showed good reliability and validity among medical students in China and is therefore worth promoting for wider use.
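
    The split-half coefficients reported above are typically obtained by splitting the items into two halves, correlating the half scores, and applying the Spearman-Brown correction. The sketch below illustrates this for a respondents-by-items AUDIT score matrix; the scores are invented and do not reproduce the study data.

```python
# Hedged sketch: split-half reliability for a respondents x items score
# matrix, using an odd/even item split and the Spearman-Brown correction.
# The AUDIT scores below are invented.

def pearson_r(x: list[float], y: list[float]) -> float:
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(scores: list[list[int]]) -> float:
    odd = [sum(row[0::2]) for row in scores]
    even = [sum(row[1::2]) for row in scores]
    r = pearson_r(odd, even)
    return 2 * r / (1 + r)  # Spearman-Brown correction

audit_scores = [  # 10 AUDIT items per respondent
    [1, 2, 0, 1, 0, 0, 1, 0, 0, 0],
    [3, 3, 2, 2, 1, 1, 2, 1, 0, 1],
    [0, 1, 0, 0, 0, 0, 0, 0, 0, 0],
    [2, 2, 1, 1, 1, 0, 1, 1, 0, 0],
]
print(round(split_half_reliability(audit_scores), 3))
```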

  20. A new computational scheme on quantitative inner pipe boundary identification based on the estimation of effective thermal conductivity

    International Nuclear Information System (INIS)

    Fan Chunli; Sun Fengrui; Yang Li

    2008-01-01

    In this paper, the irregular configuration of the inner pipe boundary is identified based on the estimation of the circumferential distribution of the effective thermal conductivity of the pipe wall. In order to simulate true temperature measurements in the numerical examples, the finite element method is used to calculate the temperature distribution at the outer pipe surface based on the irregularly shaped inner pipe boundary to be determined. Based on this simulated temperature distribution, the inverse identification is then conducted by employing the modified one-dimensional correction method, along with the finite volume method, to estimate the circumferential distribution of the effective thermal conductivity of the pipe wall. Thereafter, the inner pipe boundary shape is calculated from the conductivity estimation result. A series of numerical experiments with different temperature measurement errors and different thermal conductivities of the pipe wall confirmed the effectiveness of the method. The results show that the method is a simple, fast and accurate approach to this inverse heat conduction problem.
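
    The core idea, re-interpreting a circumferential distribution of effective thermal conductivity as a local wall thickness and hence an inner boundary position, can be illustrated with a much simpler 1-D toy model than the finite element/finite volume scheme used in the paper. All values and names below are invented.

```python
# Toy sketch of the idea behind the scheme above (not the paper's algorithm):
# at each circumferential location, treat the wall as 1-D conduction, estimate
# an "effective" conductivity from the measured outer-surface temperature
# assuming a nominal wall thickness, then convert it back to a local wall
# thickness, i.e. the inner boundary position. All values are invented.

K_TRUE = 50.0      # true wall conductivity, W/(m*K)
Q_FLUX = 2.0e4     # radial heat flux, W/m^2
T_INNER = 400.0    # inner-surface temperature, K
L_NOMINAL = 0.02   # nominal (assumed) wall thickness, m
R_OUTER = 0.10     # outer radius, m

def effective_conductivity(t_outer_measured: float) -> float:
    """Conductivity that reproduces the measurement for the nominal thickness."""
    return Q_FLUX * L_NOMINAL / (T_INNER - t_outer_measured)

def local_thickness(k_eff: float) -> float:
    """Re-interpret the effective conductivity as a local wall thickness."""
    return L_NOMINAL * K_TRUE / k_eff

# Synthetic "measurements" at three circumferential locations with different
# true wall thicknesses, generated by the same 1-D forward model.
for L in [0.015, 0.020, 0.025]:
    t_out = T_INNER - Q_FLUX * L / K_TRUE
    k_eff = effective_conductivity(t_out)
    L_hat = local_thickness(k_eff)
    print(f"true L={L:.3f} m -> recovered L={L_hat:.3f} m, "
          f"inner radius ~ {R_OUTER - L_hat:.3f} m")
```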