WorldWideScience

Sample records for reliability processing presentation

  1. Knowledge modelling and reliability processing: presentation of the Figaro language and associated tools

    International Nuclear Information System (INIS)

    Bouissou, M.; Villatte, N.; Bouhadana, H.; Bannelier, M.

    1991-12-01

    For several years, EDF has been developing an integrated set of knowledge-based and algorithmic tools to automate the reliability assessment of complex (especially sequential) systems. In this environment, the reliability expert has at his disposal powerful software tools for qualitative and quantitative processing, as well as various means to generate the inputs for these tools automatically through the acquisition of graphical data. The development of these tools is based on FIGARO, a specific language built to provide homogeneous system modelling. Various compilers and interpreters translate a FIGARO model into conventional models such as fault trees, Markov chains and Petri nets. In this report, we introduce the basics of the FIGARO language, illustrating them with examples.

  2. Present status of processing method

    Energy Technology Data Exchange (ETDEWEB)

    Kosako, Kazuaki [Sumitomo Atomic Energy Industries Ltd., Tokyo (Japan)]

    1998-11-01

    The present status of processing methods for a high-energy nuclear data file was examined. The NJOY94 code is the only one available for this processing. In Japan, present processing with NJOY94 is oriented toward the production of a traditional cross section library, because a high-energy transport code that would use a high-energy cross section library has not yet taken clear shape. (author)

  3. Equipment Reliability Process in Krsko NPP

    International Nuclear Information System (INIS)

    Gluhak, M.

    2016-01-01

    To ensure long-term safe and reliable plant operation, equipment operability and availability must be ensured by a group of processes established within the nuclear power plant. The equipment reliability process represents the integration and coordination of important equipment reliability activities into one process, which enables equipment performance and condition monitoring, development, implementation and optimization of preventive maintenance activities, continuous improvement of the processes, and long-term planning. The initiative for introducing a systematic approach to assuring equipment reliability came from the US nuclear industry, guided by INPO (Institute of Nuclear Power Operations) with the participation of several US nuclear utilities. As a result of the initiative, the first edition of INPO document AP-913, 'Equipment Reliability Process Description', was issued and became a basic document for implementation of the equipment reliability process throughout the nuclear industry. The scope of the equipment reliability process in Krsko NPP consists of the following programs: equipment criticality classification, the preventive maintenance program, the corrective action program, system health reports and the long-term investment plan. By implementing, supervising and continuously improving those programs, guided by more than thirty years of operating experience, Krsko NPP will continue on a track of safe and reliable operation until the end of its extended lifetime. (author).

  4. Improving Reliability of a Residency Interview Process

    Science.gov (United States)

    Serres, Michelle L.; Gundrum, Todd E.

    2013-01-01

    Objective. To improve the reliability and discrimination of a pharmacy resident interview evaluation form, and thereby improve the reliability of the interview process. Methods. In phase 1 of the study, the authors used a Many-Facet Rasch Measurement model to optimize an existing evaluation form for reliability and discrimination. In phase 2, interviewer pairs used the modified evaluation form within 4 separate interview stations. In phase 3, 8 interviewers individually evaluated each candidate in one-on-one interviews. Results. In phase 1, the evaluation form had a reliability of 0.98 with person separation of 6.56; reproducibly, the form separated applicants into 6 distinct groups. Using that form in phases 2 and 3, the largest variation source was candidates, while content specificity was the next largest variation source. The phase 2 g-coefficient was 0.787, while the confirmatory phase 3 g-coefficient was 0.922. Process reliability improved with more stations despite fewer interviewers per station; the impact of content specificity was greatly reduced with more interview stations. Conclusion. A more reliable, discriminating evaluation form was developed to evaluate candidates during resident interviews, and a process was designed that reduced the impact of content specificity. PMID:24159209

  5. Improving reliability of a residency interview process.

    Science.gov (United States)

    Peeters, Michael J; Serres, Michelle L; Gundrum, Todd E

    2013-10-14

    To improve the reliability and discrimination of a pharmacy resident interview evaluation form, and thereby improve the reliability of the interview process. In phase 1 of the study, the authors used a Many-Facet Rasch Measurement model to optimize an existing evaluation form for reliability and discrimination. In phase 2, interviewer pairs used the modified evaluation form within 4 separate interview stations. In phase 3, 8 interviewers individually evaluated each candidate in one-on-one interviews. In phase 1, the evaluation form had a reliability of 0.98 with person separation of 6.56; reproducibly, the form separated applicants into 6 distinct groups. Using that form in phases 2 and 3, the largest variation source was candidates, while content specificity was the next largest variation source. The phase 2 g-coefficient was 0.787, while the confirmatory phase 3 g-coefficient was 0.922. Process reliability improved with more stations despite fewer interviewers per station; the impact of content specificity was greatly reduced with more interview stations. A more reliable, discriminating evaluation form was developed to evaluate candidates during resident interviews, and a process was designed that reduced the impact of content specificity.

  6. A general software reliability process simulation technique

    Science.gov (United States)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process, are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful for comparison with actual timeline data, for validating the project input parameters, and for providing data to researchers in reliability prediction modeling.
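    A minimal sketch of the kind of status timeline such a simulator produces, assuming invented weekly productivity, anomaly-injection, detection, and repair rates (placeholders, not parameters from Tausworthe's model):

    ```python
    import random

    def simulate_reliability_process(weeks=52, productivity=100, inject_rate=0.02,
                                     detect_prob=0.3, repair_prob=0.8, seed=1):
        """Toy simulation of a software reliability process timeline.

        Each week, new work is produced, anomalies are injected in proportion to
        that work, testing detects a fraction of the open anomalies, and a
        fraction of the detected anomalies is repaired.
        """
        random.seed(seed)
        produced = injected = detected = repaired = 0
        timeline = []
        for week in range(1, weeks + 1):
            work = random.gauss(productivity, productivity * 0.1)
            produced += max(work, 0.0)
            injected += sum(random.random() < inject_rate for _ in range(int(work)))
            detected += sum(random.random() < detect_prob for _ in range(injected - detected))
            repaired += sum(random.random() < repair_prob for _ in range(detected - repaired))
            timeline.append((week, round(produced), injected, detected, repaired))
        return timeline

    for row in simulate_reliability_process()[::13]:
        print("week %3d  produced %6d  injected %3d  detected %3d  repaired %3d" % row)
    ```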

  7. DEVELOPING VISUAL PRESENTATION ATTITUDE RUBRIC: VALIDITY AND RELIABILITY STUDY

    OpenAIRE

    ATEŞ, Hatice KADIOĞLU; ADA, Sefer; BAYSAL, Z. Nurdan

    2015-01-01

    Abstract The aim of this study is to develop a visual presentation attitude rubric that is valid and reliable for 4th grade students. 218 students from Engin Can Güre, located in Esenler, Istanbul, took part in the study. While preparing this assessment tool with 34 criteria, the views of 6 university lecturers who are experts in their field were taken. The answer key sheet has 4 Likert-type options. The rubric has been first tested by Kaiser-Meyer-Olkin and Bartlett's tests an...

  8. Reliability Estimation of the Pultrusion Process Using the First-Order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem Celal; Hattel, Jesper Henri

    2013-01-01

    In the present study the reliability estimation of the pultrusion process of a flat plate is analyzed by using the first order reliability method (FORM). The implementation of the numerical process model is validated by comparing the deterministic temperature and cure degree profiles...... with corresponding analyses in the literature. The centerline degree of cure at the exit (CDOCE) being less than a critical value and the maximum composite temperature (Tmax) during the process being greater than a critical temperature are selected as the limit state functions (LSFs) for the FORM. The cumulative...
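    As a rough illustration of the FORM idea only (not the authors' pultrusion model), the sketch below evaluates a linear limit state with independent normal variables, a special case in which FORM is exact; all numbers are invented.

    ```python
    from math import sqrt
    from scipy.stats import norm

    # Hypothetical limit state g = T_crit - T_max, failure when g < 0,
    # with both temperatures treated as independent normal variables.
    mu_Tmax, sigma_Tmax = 195.0, 4.0      # invented peak process temperature (deg C)
    mu_Tcrit, sigma_Tcrit = 210.0, 3.0    # invented critical temperature (deg C)

    # For a linear limit state with independent normals, FORM is exact:
    mu_g = mu_Tcrit - mu_Tmax
    sigma_g = sqrt(sigma_Tcrit**2 + sigma_Tmax**2)
    beta = mu_g / sigma_g                 # Hasofer-Lind reliability index
    pf = norm.cdf(-beta)                  # probability of failure
    print(f"beta = {beta:.2f}, P(failure) = {pf:.2e}")
    ```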

  9. PROVIDING RELIABILITY OF HUMAN RESOURCES IN PRODUCTION MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Anna MAZUR

    2014-07-01

    People are the most valuable asset of an organization, and a company's results mostly depend on them. The human factor can also be a weak link in the company and a cause of high risk for many of its processes. The reliability of the human factor in the manufacturing process depends on many factors. The author includes aspects of human error, safety culture, knowledge, communication skills, teamwork and the leadership role in the developed model of human resource reliability in the management of the production process. Based on a case study and the results of research and observation, the author presents risk areas defined in a specific manufacturing process and the results of an evaluation of the reliability of human resources in that process.

  10. Reliable processing of graphene using metal etchmasks

    Directory of Open Access Journals (Sweden)

    Peltekis Nikos

    2011-01-01

    Abstract Graphene exhibits exciting properties which make it an appealing candidate for use in electronic devices. Reliable processes for device fabrication are crucial prerequisites for this. We developed large-area CVD synthesis and transfer of graphene films. By patterning these graphene layers using standard photoresist masks, we are able to produce arrays of gated graphene devices with four-point contacts. The etching and lift-off process poses problems because of delamination and contamination due to polymer residues when using standard resists. We introduce a metal etch mask which minimises these problems. The high quality of the graphene is shown by Raman and XPS spectroscopy as well as electrical measurements. The process is of high value for applications, as it improves the processability of graphene using high-throughput lithography and etching techniques.

  11. Photovoltaic Reliability Group activities in USA and Brazil (Presentation Recording)

    Science.gov (United States)

    Dhere, Neelkanth G.; Cruz, Leila R. O.

    2015-09-01

    Recently, prices of photovoltaic (PV) systems have been reduced considerably and may continue to fall, making them attractive. If these systems provide electricity over the stipulated warranty period, it would be possible to attain socket parity within the next few years. Current photovoltaic module qualification tests help in minimizing infant mortality but do not guarantee useful lifetime over the warranty period. The PV Module Quality Assurance Task Force (PVQAT) is trying to formulate accelerated tests that will be useful towards achieving the ultimate goal of assuring useful lifetime over the warranty period as well as assuring manufacturing quality. Unfortunately, assuring manufacturing quality may require a 24/7 presence. Alternatively, collecting data on the performance of fielded systems would assist in assuring manufacturing quality. Here, PV systems installed by home-owners and small businesses constitute an important untapped source of data. The volunteer group PV - Reliable, Safe and Sustainable Quality! (PVRessQ!) is providing valuable service to small PV system owners. The Photovoltaic Reliability Group (PVRG) is initiating activities in the USA and Brazil to assist home owners and small businesses in monitoring photovoltaic (PV) module performance and enforcing warranties. It will work in collaboration with small PV system owners and consumer protection agencies. Brazil is endowed with excellent solar irradiance, making it attractive for installation of PV systems. Participating owners of small PV systems would instruct inverter manufacturers to copy daily e-mails to the PVRG and, as necessary, will authorize the PVRG to carry out reviews of their PV systems. The presentation will cover the overall activities of the PVRG in the USA and Brazil.

  12. Product reliability and the reliability of its emanating operational processes.

    NARCIS (Netherlands)

    Sonnemans, P.J.M.; Geudens, W.H.J.M.

    1999-01-01

    This paper addresses the problem of proper reliability management in business operations today, facing increasing demands on essential business drivers such as time to market, quality and financial profit. In this paper a general method is described of how to achieve product quality in a highly

  13. Time-variant reliability assessment through equivalent stochastic process transformation

    International Nuclear Information System (INIS)

    Wang, Zequn; Chen, Wei

    2016-01-01

    Time-variant reliability measures the probability that an engineering system successfully performs intended functions over a certain period of time under various sources of uncertainty. In practice, it is computationally prohibitive to propagate uncertainty in time-variant reliability assessment based on expensive or complex numerical models. This paper presents an equivalent stochastic process transformation approach for cost-effective prediction of reliability deterioration over the life cycle of an engineering system. To reduce the high dimensionality, a time-independent reliability model is developed by translating random processes and time parameters into random parameters in order to equivalently cover all potential failures that may occur during the time interval of interest. With the time-independent reliability model, an instantaneous failure surface is attained by using a Kriging-based surrogate model to identify all potential failure events. To enhance the efficacy of failure surface identification, a maximum confidence enhancement method is utilized to update the Kriging model sequentially. Then, the time-variant reliability is approximated using Monte Carlo simulations of the Kriging model where system failures over a time interval are predicted by the instantaneous failure surface. The results of two case studies demonstrate that the proposed approach is able to accurately predict the time evolution of system reliability while requiring much less computational effort compared with the existing analytical approach. - Highlights: • Developed a new approach for time-variant reliability analysis. • Proposed a novel stochastic process transformation procedure to reduce the dimensionality. • Employed Kriging models with a confidence-based adaptive sampling scheme to enhance computational efficiency. • The approach is effective for handling random processes in time-variant reliability analysis. • Two case studies are used to demonstrate the efficacy.
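    For contrast with the surrogate-based approach summarized above, the brute-force Monte Carlo estimate that such methods try to avoid can be sketched in a few lines; the limit-state function and input distributions below are invented placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def g(x1, x2, t):
        """Hypothetical time-dependent limit state: failure when g < 0."""
        return 5.0 + x1 - 0.2 * t - 0.05 * x2 * t

    n_samples, t_grid = 20_000, np.linspace(0.0, 10.0, 101)
    x1 = rng.normal(0.0, 1.0, n_samples)
    x2 = rng.normal(1.0, 0.3, n_samples)

    # A sample fails over [0, T] if the limit state dips below zero at any time,
    # so time-variant reliability is the fraction of samples whose minimum stays positive.
    g_min = np.min(g(x1[:, None], x2[:, None], t_grid[None, :]), axis=1)
    print("time-variant reliability over [0, 10]:", np.mean(g_min > 0.0))
    ```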

  14. NPTool: Towards Scalability and Reliability of Business Process Management

    Science.gov (United States)

    Braghetto, Kelly Rosa; Ferreira, João Eduardo; Pu, Calton

    Currently, one important challenge in business process management is to provide scalability and reliability of business process executions at the same time. This difficulty becomes more accentuated when the execution control involves countless complex business processes. This work presents NavigationPlanTool (NPTool), a tool to control the execution of business processes. NPTool is supported by the Navigation Plan Definition Language (NPDL), a language for business process specification that uses process algebra as its formal foundation. NPTool implements the NPDL language as a SQL extension. The main contribution of this paper is a description of NPTool showing how process algebra features combined with a relational database model can be used to provide scalable and reliable control of the execution of business processes. The next steps for NPTool include reuse of control-flow patterns and support for data flow management.

  15. DUAL-PROCESS, a highly reliable process control system

    International Nuclear Information System (INIS)

    Buerger, L.; Gossanyi, A.; Parkanyi, T.; Szabo, G.; Vegh, E.

    1983-02-01

    A multiprocessor process control system is described. During its development, reliability was the most important aspect, because the system is used in the computerized control of a 5 MW research reactor. DUAL-PROCESS is fully compatible with the earlier single-processor control system PROCESS-24K. The paper deals in detail with the communication, synchronization, error detection and error recovery problems of the operating system. (author)

  16. Reliability: How much is it worth? Beyond its estimation or prediction, the (net) present value of reliability

    International Nuclear Information System (INIS)

    Saleh, J.H.; Marais, K.

    2006-01-01

    In this article, we link an engineering concept, reliability, to a financial and managerial concept, net present value, by exploring the impact of a system's reliability on its revenue generation capability. The framework here developed for non-repairable systems quantitatively captures the value of reliability from a financial standpoint. We show that traditional present value calculations of engineering systems do not account for system reliability, and thus over-estimate a system's worth and can therefore lead to flawed investment decisions. It is therefore important to involve reliability engineers upfront before investment decisions are made in technical systems. In addition, the analyses here developed help designers identify the optimal level of reliability that maximizes a system's net present value: the financial value reliability provides to the system minus the cost to achieve this level of reliability. Although we recognize that there are numerous considerations driving the specification of an engineering system's reliability, we contend that the financial analysis of reliability here developed should be made available to decision-makers to support in part, or at least be factored into, the system reliability specification.
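    The core argument can be made concrete with a small discounted-cash-flow calculation in which each period's revenue is weighted by the survival probability; the exponential reliability model and all figures below are assumptions for illustration only.

    ```python
    import numpy as np

    def npv_of_reliability(failure_rate, revenue_per_year=10.0, years=15,
                           discount_rate=0.08, reliability_cost=0.0):
        """Expected NPV of a non-repairable system with R(t) = exp(-lambda * t).

        Revenue only accrues while the system is still operating, so each
        year's cash flow is weighted by the survival probability R(t).
        """
        t = np.arange(1, years + 1)
        survival = np.exp(-failure_rate * t)
        discounted = revenue_per_year * survival / (1.0 + discount_rate) ** t
        return discounted.sum() - reliability_cost

    # Ignoring reliability (lambda = 0) over-estimates the system's worth:
    print("naive PV :", round(npv_of_reliability(0.0), 2))
    print("with R(t):", round(npv_of_reliability(0.10), 2))
    ```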

  17. Reliability and energy efficiency of zero energy homes (Conference Presentation)

    Science.gov (United States)

    Dhere, Neelkanth G.

    2016-09-01

    Photovoltaic (PV) modules and systems are being installed increasingly on residential homes to increase the proportion of renewable energy in the energy mix. The ultimate goal is to attain sustainability without subsidy. The prices of PV modules and systems have declined substantially during recent years and will be reduced further to reach grid parity. Additionally, the total consumed energy must be reduced by making homes more energy efficient. FSEC/UCF researchers have carried out research on the development of PV cells and systems and on reducing energy consumption in homes and small businesses. Additionally, they have provided guidance on PV module and system installation and on making homes energy efficient. The produced energy is fed into the utility grid and the consumed energy is obtained from the utility grid; the grid thus assists in the storage. Currently, the State of Florida permits net metering, leading to equal rates for produced and consumed electricity. This paper describes the installation of a 5.29 kW crystalline silicon PV system, facing south at approximately latitude tilt, on a single-story, three-bedroom house. It also describes the computer program on building energy efficiency and the processes that were employed for reducing the energy consumption of the house by improving the insulation, air circulation and windows, etc. Finally, it describes the actual consumption and production of electricity and the installation of additional crystalline silicon PV modules and balance of system to make it a zero energy home.

  18. Can Reliability of Multiple Component Measuring Instruments Depend on Response Option Presentation Mode?

    Science.gov (United States)

    Menold, Natalja; Raykov, Tenko

    2016-01-01

    This article examines the possible dependency of composite reliability on presentation format of the elements of a multi-item measuring instrument. Using empirical data and a recent method for interval estimation of group differences in reliability, we demonstrate that the reliability of an instrument need not be the same when polarity of the…

  19. Processing of visually presented clock times.

    Science.gov (United States)

    Goolkasian, P; Park, D C

    1980-11-01

    The encoding and representation of visually presented clock times was investigated in three experiments utilizing a comparative judgment task. Experiment 1 explored the effects of comparing times presented in different formats (clock face, digit, or word), and Experiment 2 examined angular distance effects created by varying positions of the hands on clock faces. In Experiment 3, encoding and processing differences between clock faces and digitally presented times were directly measured. Same/different reactions to digitally presented times were faster than to times presented on a clock face, and this format effect was found to be a result of differences in processing that occurred after encoding. Angular separation also had a limited effect on processing. The findings are interpreted within the framework of theories that refer to the importance of representational codes. The applicability to the data of Bank's semantic-coding theory, Paivio's dual-coding theory, and the levels-of-processing view of memory are discussed.

  20. Process plant equipment operation, control, and reliability

    CERN Document Server

    Holloway, Michael D; Onyewuenyi, Oliver A

    2012-01-01

    "Process Plant Equipment Book is another great publication from Wiley as a reference book for final year students as well as those who will work or are working in chemical production plants and refinery…" -Associate Prof. Dr. Ramli Mat, Deputy Dean (Academic), Faculty of Chemical Engineering, Universiti Teknologi Malaysia "…give[s] readers access to both fundamental information on process plant equipment and to practical ideas, best practices and experiences of highly successful engineers from around the world… The book is illustrated throughout with numerous black & white p

  1. Process related contaminations causing climatic reliability issues

    DEFF Research Database (Denmark)

    Jellesen, Morten Stendahl; Dutta, Mondira; Verdingovas, Vadimas

    2012-01-01

    contaminants during the wave and re-flow soldering process; however variation in temperature on the PCBA surface during soldering can result in considerable amounts of active residues being left locally. Typical no-clean flux systems used today consist of weak organic acids (WOA) and active residues left...

  2. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    (sampling variances) can be reduced greatly however, and sampling biases can be eliminated completely, by respecting a simple set of rules and guidelines provided by TOS. A systematic approach for description of process heterogeneity furnishes in-depth knowledge about the specific variability of any 1-D lot...

  3. Present status of radiation processing in Japan

    International Nuclear Information System (INIS)

    Tabata, Y.

    1984-01-01

    Crosslinking of insulating materials including cables, tubes, sheets, pipes and polymer foam, curing of coating, surface treatment and other processes developed recently will be presented. Future prospects in this field will be discussed. (Author) [pt

  4. New process for weld metal reliability

    International Nuclear Information System (INIS)

    Hebel, A.G.

    1985-01-01

    The industry-wide nature of weld cracking alerts one to the possibility that there is a fundamental law being overlooked, and that in overlooking this law, industry is unable to counteract it. That law mandates that restraint during welding causes internal stress, and internal stress causes weld metal to crack. Component restraint during welding, according to the welding standard, is the major cause of weld metal failures. When the metal working industry accepts this fact and begins to counter the effects of restraint, the number of weld failures experienced will fall dramatically. Bonal Technologies, Inc., of Detroit, has developed the first consistently effective non-thermal process to relieve stress caused by restraint during welding. Bonal's patented Meta-Lax sub-resonant stress relief acts as a restraint neutralizer when used during welding. Meta-Lax weld conditioning produces a finer, more uniform weld grain structure. A finer, more uniform grain structure is a clear metallurgical indication of improved mechanical weld properties. Other benefits, such as less internal stress and less warpage, are also achieved.

  5. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetimes, accelerated degradation testing (ADT) may be adopted during the product development phase to verify whether reliability satisfies the predetermined level within a feasible test duration. Actual degradation in engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, a method for reliability demonstration by ADT with a monotonic degradation process has not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply the Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method that converts the required reliability level into an allowable cumulative degradation in ADT and compares the actual accumulated degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration by minimizing the asymptotic variance of the decision variable in reliability demonstration under constraints on sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with an example on reliability demonstration of an alloy product, and is applied to demonstrate the wear reliability of a spherical plain bearing over a long service duration in the end. - Highlights: • We present a reliability demonstration method by ADT for products with a monotonic degradation process, which may be applied to verify reliability with long service life within a feasible test duration. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from existing optimal ADT designs aimed at more accurate reliability estimation in its objective function and constraints. • The methods are applied to demonstrate the wear reliability within long service duration of
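    A simple way to see the gamma-process ingredient is to simulate monotone degradation paths and compare the accumulated degradation with an allowable level at the end of the test; the shape rate, scale, and threshold below are arbitrary stand-ins, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    shape_per_hour = 0.05   # assumed gamma-process shape rate (1/h)
    scale = 2.0             # assumed scale parameter
    allowable = 60.0        # assumed allowable cumulative degradation at test end
    test_hours, n_units = 500, 1000

    # Gamma-process increments are independent Gamma(shape*dt, scale) variables,
    # so simulated degradation paths are non-negative and strictly increasing.
    increments = rng.gamma(shape_per_hour * 1.0, scale, size=(n_units, test_hours))
    cumulative = increments.cumsum(axis=1)

    share_within_limit = np.mean(cumulative[:, -1] <= allowable)
    print(f"fraction of units within the allowable degradation: {share_within_limit:.3f}")
    ```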

  6. PRESENTATION POTENTIAL USING IN PEDAGOGICAL INTERACTION PROCESS

    Directory of Open Access Journals (Sweden)

    Olga V. Ershova

    2016-01-01

    This article considers the potential of multimedia presentations and their influence on strengthening classroom teacher-student interaction. On the one hand, the article points out the importance of using this kind of activity in the study process in connection with state educational policy. On the other hand, the skills students gain from working with presentations meet employers' demands in both the domestic and global labour markets and give candidates a competitive advantage. Scientific novelty and results. Multimedia presentation is considered as a specific complex of classroom activities, with students oriented towards self-analysis and assessment of presentations. It is shown that a well-organized process of peer assessment by students helps to solve didactic and methodological problems simultaneously. For this purpose, a system of assessment criteria should be developed; it has to be clear to students to make assessment feasible and time-saving. An example of a possible criteria system is described; the quality of presentations prepared by students can be judged against such criteria. The author also analyzed software products for the three main platforms (Windows, Linux, MacOS), which have different tools and address users' needs for creating presentations. The article includes a comparative table of the two most popular software tools, Microsoft PowerPoint and the web service Prezi, to assess the relevance of their use in the study process. The practical significance of the article lies in the author's recommendations for using the potential of presentations as a tool for improving the process of pedagogical interaction with contemporary students.

  7. Reliability analysis of common hazardous waste treatment processes

    International Nuclear Information System (INIS)

    Waters, R.D.

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption

  8. Reliability analysis of common hazardous waste treatment processes

    Energy Technology Data Exchange (ETDEWEB)

    Waters, Robert D. [Vanderbilt Univ., Nashville, TN (United States)

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.

  9. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. The uncertainty could be due to variations in the geometry of the part or in material properties, or due to lack of knowledge about the phenomena being modeled. Deterministic design optimization does not take uncertainty into account, and worst-case scenario assumptions lead to vastly over-conservative designs. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions in the presence of uncertainty in the design process. Probabilistic design optimization often involves a double-loop procedure of optimization and iterative probabilistic assessment, which results in high computational demand. The computational demand can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing the Sequential Optimization and Reliability Assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment. The deterministic optimization and the reliability assessment are decoupled in each cycle. This leads to quick improvement of the design from one cycle to the next and increases computational efficiency. This paper demonstrates the effectiveness of the SORA method when applied to designing a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive finite element simulations.
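    To make the cycle structure concrete, here is a toy SORA iteration on a one-dimensional design problem with a linear limit state and normal variables (so the inverse most-probable-point has a closed form); the cost function, limit state, and target reliability index are invented, and the real method operates on the sheet-metal-forming surrogates described above.

    ```python
    import numpy as np

    # Toy problem (all numbers invented): choose a design scalar d to minimize
    # cost(d) = d subject to a reliability requirement on g(d, X1, X2) = d*X1 - X2,
    # where X1 ~ N(mu1, s1) is a capacity factor and X2 ~ N(mu2, s2) is a load.
    mu1, s1 = 10.0, 1.0
    mu2, s2 = 50.0, 8.0
    beta_t = 3.0                     # target reliability index

    def inverse_mpp(d):
        """Closed-form inverse MPP for this linear limit state: the point on the
        beta_t-sphere (in standard normal space) that minimizes g."""
        a = np.array([d * s1, -s2])          # gradient of g w.r.t. (u1, u2)
        u = -beta_t * a / np.linalg.norm(a)
        return mu1 + s1 * u[0], mu2 + s2 * u[1]

    # SORA: alternate a deterministic optimization (constraint evaluated at the
    # shifted / inverse-MPP point) with a reliability assessment step.
    x1_star, x2_star = mu1, mu2              # start from the mean point
    d = 0.0
    for cycle in range(10):
        d_new = x2_star / x1_star            # deterministic optimum: d*x1 - x2 >= 0
        x1_star, x2_star = inverse_mpp(d_new)  # reliability assessment step
        if abs(d_new - d) < 1e-6:
            break
        d = d_new

    print(f"converged after {cycle + 1} cycles, d = {d_new:.3f}")
    ```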

  10. Modelling and estimating degradation processes with application in structural reliability

    International Nuclear Information System (INIS)

    Chiquet, J.

    2007-06-01

    The characteristic level of degradation of a given structure is modeled through a stochastic process called the degradation process. The random evolution of the degradation process is governed by a differential system with a Markovian environment. We set up the associated reliability framework by considering failure of the structure once the degradation process reaches a critical threshold. A closed-form solution of the reliability function is obtained thanks to Markov renewal theory. We then build an estimation methodology for the parameters of the stochastic processes involved. The estimation methods and theoretical results, as well as the associated numerical algorithms, are validated on simulated data sets. Our method is applied to the modelling of a real degradation mechanism, known as crack growth, for which an experimental data set is considered. (authors)

  11. Numerical Model based Reliability Estimation of Selective Laser Melting Process

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2014-01-01

    Selective laser melting is developing into a standard manufacturing technology with applications in various sectors. However, the process is still far from being at par with conventional processes such as welding and casting, the primary reason of which is the unreliability of the process. While...... of the selective laser melting process. A validated 3D finite-volume alternating-direction-implicit numerical technique is used to model the selective laser melting process, and is calibrated against results from single track formation experiments. Correlation coefficients are determined for process input...... parameters such as laser power, speed, beam profile, etc. Subsequently, uncertainties in the processing parameters are utilized to predict a range for the various outputs, using a Monte Carlo method based uncertainty analysis methodology, and the reliability of the process is established....

  12. Statistical reliability analyses of two wood plastic composite extrusion processes

    International Nuclear Information System (INIS)

    Crookston, Kevin A.; Mark Young, Timothy; Harper, David; Guess, Frank M.

    2011-01-01

    Estimates of the reliability of wood plastic composites (WPC) are explored for two industrial extrusion lines. The goal of the paper is to use parametric and non-parametric analyses to examine potential differences in the WPC metrics of reliability for the two extrusion lines that may be helpful for use by the practitioner. A parametric analysis of the extrusion lines reveals some similarities and disparities in the best models; however, a non-parametric analysis reveals unique and insightful differences between Kaplan-Meier survival curves for the modulus of elasticity (MOE) and modulus of rupture (MOR) of the WPC industrial data. The distinctive non-parametric comparisons indicate the source of the differences in strength between the 10.2% and 48.0% fractiles [3,183-3,517 MPa] for MOE and for MOR between the 2.0% and 95.1% fractiles [18.9-25.7 MPa]. Distribution fitting as related to selection of the proper statistical methods is discussed with relevance to estimating the reliability of WPC. The ability to detect statistical differences in the product reliability of WPC between extrusion processes may benefit WPC producers in improving product reliability and safety of this widely used house-decking product. The approach can be applied to many other safety and complex system lifetime comparisons.
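    A Kaplan-Meier estimate of the kind compared in the paper can be computed without specialist libraries; the modulus-of-rupture values below are fabricated placeholders and are treated as fully observed (no censoring).

    ```python
    def kaplan_meier(values, events):
        """Kaplan-Meier estimator. `values` are observed quantities (e.g. MOR in MPa),
        `events` flags whether each value is an actual failure (True) or censored."""
        data = sorted(zip(values, events))
        n_at_risk, survival, curve = len(data), 1.0, []
        i = 0
        while i < len(data):
            t = data[i][0]
            deaths = sum(1 for tt, e in data if tt == t and e)
            ties = sum(1 for tt, _ in data if tt == t)
            survival *= 1.0 - deaths / n_at_risk   # update before removing the tied group
            n_at_risk -= ties
            curve.append((t, survival))
            i += ties
        return curve

    # Fabricated modulus-of-rupture values (MPa) for two hypothetical extrusion lines.
    line_a = [19.1, 20.4, 21.0, 21.8, 22.5, 23.0, 23.9, 24.6]
    line_b = [18.9, 19.5, 20.1, 20.8, 21.2, 22.0, 22.7, 25.7]
    for name, values in [("line A", line_a), ("line B", line_b)]:
        km = kaplan_meier(values, [True] * len(values))
        print(name, [(t, round(s, 2)) for t, s in km])
    ```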

  13. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.
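    In the spirit of the open-source advice mentioned here, one of the simplest internal-consistency estimates, Cronbach's alpha, can be computed in a few lines; the item-response matrix below is made up, and this is only one of the many approaches to reliability discussed above.

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for an (n_persons x n_items) score matrix."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)
        total_var = scores.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Made-up responses: 6 persons x 4 items on a 5-point scale.
    responses = [[4, 5, 4, 4],
                 [2, 3, 2, 3],
                 [5, 5, 4, 5],
                 [3, 3, 3, 2],
                 [1, 2, 2, 1],
                 [4, 4, 5, 4]]
    print("Cronbach's alpha =", round(cronbach_alpha(responses), 3))
    ```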

  14. Achieving High Reliability with People, Processes, and Technology.

    Science.gov (United States)

    Saunders, Candice L; Brennan, John A

    2017-01-01

    High reliability as a corporate value in healthcare can be achieved by meeting the "Quadruple Aim" of improving population health, reducing per capita costs, enhancing the patient experience, and improving provider wellness. This drive starts with the board of trustees, CEO, and other senior leaders who ingrain high reliability throughout the organization. At WellStar Health System, the board developed an ambitious goal to become a top-decile health system in safety and quality metrics. To achieve this goal, WellStar has embarked on a journey toward high reliability and has committed to Lean management practices consistent with the Institute for Healthcare Improvement's definition of a high-reliability organization (HRO): one that is committed to the prevention of failure, early identification and mitigation of failure, and redesign of processes based on identifiable failures. In the end, a successful HRO can provide safe, effective, patient- and family-centered, timely, efficient, and equitable care through a convergence of people, processes, and technology.

  15. Wind Farm Reliability Modelling Using Bayesian Networks and Semi-Markov Processes

    Directory of Open Access Journals (Sweden)

    Robert Adam Sobolewski

    2015-09-01

    Technical reliability plays an important role among the factors affecting the power output of a wind farm. The reliability is determined by the internal collection grid topology and the reliability of its electrical components, e.g. generators, transformers, cables, switch breakers, protective relays, and busbars. A quantitative measure of wind farm reliability can be the probability distribution of combinations of operating and failed states of the farm's wind turbines. The operating state of a wind turbine is its ability to generate power and to transfer it to an external power grid, which means the availability of the wind turbine and of the other equipment necessary for power transfer to the external grid. This measure can be used for quantitative analysis of the impact of various wind farm topologies and of the reliability of individual farm components on farm reliability, and for determining the expected farm output power with consideration of the reliability. This knowledge may be useful in analyses of power generation reliability in power systems. The paper presents probabilistic models that quantify wind farm reliability taking into account the above-mentioned technical factors. To formulate the reliability models, Bayesian networks and semi-Markov processes were used. Bayesian networks were used to map the wind farm structural reliability as well as the quantitative characteristics describing equipment reliability; semi-Markov processes were used to determine those characteristics. The paper presents an example calculation of (i) the probability distribution of combinations of operating and failed states of the four wind turbines included in the wind farm, and (ii) the expected wind farm output power with consideration of its reliability.
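    The kind of quantitative measure described above can be illustrated by enumerating turbine operating/failed state combinations under an independence assumption; the availabilities and rated power are invented, and the paper's Bayesian-network model additionally captures the collection-grid topology, which this sketch ignores.

    ```python
    from itertools import product

    # Assumed per-turbine availabilities and rated power (MW) for a 4-turbine farm.
    availability = [0.95, 0.93, 0.96, 0.94]
    rated_mw = 2.0

    expected_power, distribution = 0.0, {}
    for states in product([1, 0], repeat=len(availability)):   # 1 = operating, 0 = failed
        p = 1.0
        for up, a in zip(states, availability):
            p *= a if up else (1.0 - a)
        distribution[states] = p
        expected_power += p * rated_mw * sum(states)

    print("P(all four turbines operating) =", round(distribution[(1, 1, 1, 1)], 4))
    print("expected available farm power  =", round(expected_power, 3), "MW")
    ```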

  16. Gamma processes and peaks-over-threshold distributions for time-dependent reliability

    International Nuclear Information System (INIS)

    Noortwijk, J.M. van; Weide, J.A.M. van der; Kallen, M.J.; Pandey, M.D.

    2007-01-01

    In the evaluation of structural reliability, a failure is defined as the event in which stress exceeds a resistance that is liable to deterioration. This paper presents a method to combine the two stochastic processes of deteriorating resistance and fluctuating load for computing the time-dependent reliability of a structural component. The deterioration process is modelled as a gamma process, which is a stochastic process with independent non-negative increments having a gamma distribution with identical scale parameter. The stochastic process of loads is generated by a Poisson process. The variability of the random loads is modelled by a peaks-over-threshold distribution (such as the generalised Pareto distribution). These stochastic processes of deterioration and load are combined to evaluate the time-dependent reliability
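    A Monte Carlo version of the combined model is easy to sketch: resistance decays as a gamma process while Poisson-arriving loads are drawn from a generalized Pareto distribution above a threshold; every parameter below is an arbitrary example value, not one from the paper.

    ```python
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(7)

    years, n_sim = 50, 2_000
    r0 = 100.0                       # initial resistance
    det_shape, det_scale = 0.4, 1.0  # gamma deterioration: ~0.4 units lost per year
    load_rate = 2.0                  # Poisson rate of load events per year
    threshold, gp_shape, gp_scale = 60.0, 0.1, 8.0  # peaks-over-threshold load model

    failed_by = np.zeros(years)
    for _ in range(n_sim):
        resistance = r0
        for year in range(years):
            resistance -= rng.gamma(det_shape, det_scale)   # yearly deterioration increment
            n_loads = rng.poisson(load_rate)
            if n_loads == 0:
                continue
            exceedances = genpareto.rvs(gp_shape, scale=gp_scale,
                                        size=n_loads, random_state=rng)
            if threshold + exceedances.max() > resistance:  # stress exceeds resistance
                failed_by[year:] += 1
                break

    print("P(failure within 25 years) ~", failed_by[24] / n_sim)
    print("P(failure within 50 years) ~", failed_by[49] / n_sim)
    ```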

  17. Reliability of high power electron accelerators for radiation processing

    International Nuclear Information System (INIS)

    Zimek, Z.

    2011-01-01

    Accelerators applied for radiation processing are installed in industrial facilities where the accelerator availability coefficient should be at the level of 95% to fulfil industry standards. Usually the operation of an electron accelerator reveals a number of short failures and a few long-lasting ones. Some technical shortcomings can be overcome by practical implementation of the experience gained in accelerator technology development by different accelerator manufacturers. The reliability/availability of high power accelerators for application in the flue gas treatment process must be dramatically improved to meet industrial standards. Support for accelerator technology dedicated to environmental protection should be provided by governmental and international institutions to overcome the accelerator reliability/availability problem and the high risk and low direct profit in this particular application. (author)

  18. Reliability of high power electron accelerators for radiation processing

    Energy Technology Data Exchange (ETDEWEB)

    Zimek, Z. [Department of Radiation Chemistry and Technology, Institute of Nuclear Chemistry and Technology, Warsaw (Poland)

    2011-07-01

    Accelerators applied for radiation processing are installed in industrial facilities where the accelerator availability coefficient should be at the level of 95% to fulfil industry standards. Usually the operation of an electron accelerator reveals a number of short failures and a few long-lasting ones. Some technical shortcomings can be overcome by practical implementation of the experience gained in accelerator technology development by different accelerator manufacturers. The reliability/availability of high power accelerators for application in the flue gas treatment process must be dramatically improved to meet industrial standards. Support for accelerator technology dedicated to environmental protection should be provided by governmental and international institutions to overcome the accelerator reliability/availability problem and the high risk and low direct profit in this particular application. (author)

  19. Operational present status and reliability analysis of the upgraded EAST cryogenic system

    Science.gov (United States)

    Zhou, Z. W.; Y Zhang, Q.; Lu, X. F.; Hu, L. B.; Zhu, P.

    2017-12-01

    Since its first commissioning in 2005, the cryogenic system for EAST (Experimental Advanced Superconducting Tokamak) has been cooled down and warmed up for thirteen experimental campaigns. In order to improve refrigeration efficiency and reliability, the EAST cryogenic system was gradually upgraded from 2012 to 2015 with new helium screw compressors and new dynamic gas bearing helium turbine expanders with eddy current brakes, to remedy the originally poor mechanical and operational performance. The fully upgraded cryogenic system was put into operation in the eleventh cool-down experiment and has been operated for the last several experimental campaigns. The upgraded system has successfully coped with the various normal operational modes during cool-down and 4.5 K steady-state operation under pulsed heat load from the tokamak, as well as abnormal fault modes including turbine protection stops. In this paper, the upgraded EAST cryogenic system, including its functional analysis and new cryogenic control networks, is presented in detail. Its present operational status in the latest cool-down experiments is also presented and the system reliability is analyzed, showing high reliability and a low fault rate after the upgrade. Finally, the work needed to meet the higher reliability requirements of future uninterrupted long-term experimental operation is proposed.

  20. Experiments on data presentation to process operators in diagnostic tasks

    DEFF Research Database (Denmark)

    Rasmussen, Jens; Goodstein, L. P.

    1972-01-01

    Safety and reliability considerations in modern power plants have prompted our interest in man as an information receiver - especially in diagnostic tasks where the growing complexity of process plants and hence the amount of data involved make it imperative to give the staff proper support....... The great flexibility and capacity of the process computer for data reduction and presentation and for storing information on plant structure and functions give the system designer great freedom in the layout of information display for the staff, but the problem for the designer is how to make proper use...... of this freedom to support the operators efficiently. This is especially important in connection with unique, high-risk, and generally improbable abnormalities in plant functioning. Operator tasks and mental models and the need for matching the encoded information about the plant to these models are treated...

  1. Equipment reliability process improvement and preventive maintenance optimization

    International Nuclear Information System (INIS)

    Darragi, M.; Georges, A.; Vaillancourt, R.; Komljenovic, D.; Croteau, M.

    2004-01-01

    The Gentilly-2 Nuclear Power Plant wants to optimize its preventive maintenance program through an integrated equipment reliability process. All equipment-reliability-related activities should be reviewed and optimized in a systematic approach, especially for aging plants such as G2. This new approach has to be founded on best-practice methods, with the purpose of rationalizing the preventive maintenance program and monitoring the performance of on-site systems, structures and components (SSC). A rational preventive maintenance strategy is based on optimized task scopes and frequencies depending on their applicability, critical effects on system safety and plant availability, as well as cost-effectiveness. The efficiency of the preventive maintenance strategy is systematically monitored through degradation indicators. (author)

  2. Reliability Engineering for ATLAS Petascale Data Processing on the Grid

    CERN Document Server

    Golubkov, D V; The ATLAS collaboration; Vaniachine, A V

    2012-01-01

    The ATLAS detector is in its third year of continuous LHC running taking data for physics analysis. A starting point for ATLAS physics analysis is reconstruction of the raw data. First-pass processing takes place shortly after data taking, followed later by reprocessing of the raw data with updated software and calibrations to improve the quality of the reconstructed data for physics analysis. Data reprocessing involves a significant commitment of computing resources and is conducted on the Grid. The reconstruction of one petabyte of ATLAS data with 1B collision events from the LHC takes about three million core-hours. Petascale data processing on the Grid involves millions of data processing jobs. At such scales, the reprocessing must handle a continuous stream of failures. Automatic job resubmission recovers transient failures at the cost of CPU time used by the failed jobs. Orchestrating ATLAS data processing applications to ensure efficient usage of tens of thousands of CPU-cores, reliability engineering ...
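    The trade-off noted above, resubmission recovering transient failures at the cost of CPU spent on failed attempts, can be quantified with a simple expectation; the per-attempt core-hours and failure probability below are hypothetical, not ATLAS figures.

    ```python
    def expected_core_hours(per_attempt_hours, p_fail, max_retries=5):
        """Expected CPU cost per successfully completed job when each attempt
        fails independently with probability p_fail and is resubmitted."""
        expected, p_reach = 0.0, 1.0
        for _ in range(max_retries + 1):
            expected += p_reach * per_attempt_hours      # cost of this attempt, if reached
            p_reach *= p_fail                            # probability another attempt is needed
        p_success = 1.0 - p_fail ** (max_retries + 1)
        return expected / p_success

    # Hypothetical numbers: a 12 core-hour reconstruction job with a 5% transient
    # failure rate costs roughly 5% extra CPU on average.
    print(round(expected_core_hours(12.0, 0.05), 2), "core-hours per good job")
    ```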

  3. Toddlers favor communicatively presented information over statistical reliability in learning about artifacts.

    Directory of Open Access Journals (Sweden)

    Hanna Marno

    Observed associations between events can be validated by statistical information about their reliability or by the testimony of communicative sources. We tested whether toddlers learn from their own observation of efficiency, assessed by statistical information on the reliability of interventions, or from communicatively presented demonstration, when these two potential types of evidence about the validity of interventions on a novel artifact are contrasted with each other. Eighteen-month-old infants observed two adults, one operating the artifact by a method that was more efficient (2/3 probability of success) than that of the other (1/3 probability of success). Compared to the Baseline condition, in which communicative signals were not employed, infants tended to choose the less reliable method to operate the artifact when this method was demonstrated in a communicative manner in the Experimental condition. This finding demonstrates that, in certain circumstances, communicative sanctioning of reliability may override statistical evidence for young learners. Such a bias can serve fast and efficient transmission of knowledge between generations.

  4. Radiation processing dosimetry - past, present and future

    International Nuclear Information System (INIS)

    McLaughlin, W.L.

    1999-01-01

    Since the two United Nations Conferences were held in Geneva in 1955 and 1958 on the Peaceful Uses of Atomic Energy and the concurrent foundation of the International Atomic Energy Agency in 1957, the IAEA has fostered high-dose dosimetry and its applications. This field is represented in industrial radiation processing, agricultural programmes, and therapeutic and preventative medicine. Such dosimetry is needed specifically for pest and quarantine control and in the processing of medical products, pharmaceuticals, blood products, foodstuffs, solid, liquid and gaseous wastes, and a variety of useful commodities, e.g. polymers, composites, natural rubber and elastomers, packaging, electronic, and automotive components, as well as in radiotherapy. Improvements and innovations of dosimetry materials and analytical systems and software continue to be important goals for these applications. Some of the recent advances in high-dose dosimetry include tetrazolium salts and substituted polydiacetylene as radiochromic media, on-line real-time as well as integrating semiconductor and diamond-detector monitors, quantitative label dosimeters, photofluorescent sensors for broad dose range applications, and improved and simplified parametric and computational codes for imaging and simulating 3D radiation dose distributions in model products. The use of certain solid-state devices, e.g. optical quality LiF, at low (down to 4K) and high (up to 500 K) temperatures, is of interest for materials testing. There have also been notable developments in experimental dose mapping procedures, e.g. 2D and 3D dose distribution analyses by flat-bed optical scanners and software applied to radiochromic and photofluorescent images. In addition, less expensive EPR spectrometers and new EPR dosimetry materials and high-resolution semiconductor diode arrays, charge injection devices, and photostimulated storage phosphors have been introduced. (author)

  5. Power Electronic Packaging Design, Assembly Process, Reliability and Modeling

    CERN Document Server

    Liu, Yong

    2012-01-01

    Power Electronic Packaging presents an in-depth overview of power electronic packaging design, assembly, reliability and modeling. Since there is a drastic difference between IC fabrication and power electronic packaging, the book systematically introduces typical power electronic packaging design, assembly, reliability and failure analysis and material selection so readers can clearly understand each task's unique characteristics. Power electronic packaging is one of the fastest growing segments in the power electronic industry, due to the rapid growth of power integrated circuit (IC) fabrication, especially for applications like portable, consumer, home, computing and automotive electronics. This book also covers how advances in both semiconductor content and power advanced package design have helped cause advances in power device capability in recent years. The author extrapolates the most recent trends in the book's areas of focus to highlight where further improvement in materials and techniques can d

  6. Adhesives technology for electronic applications materials, processing, reliability

    CERN Document Server

    Licari, James J

    2011-01-01

    Adhesives are widely used in the manufacture and assembly of electronic circuits and products. Generally, electronics design engineers and manufacturing engineers are not well versed in adhesives, while adhesion chemists have a limited knowledge of electronics. This book bridges these knowledge gaps and is useful to both groups. The book includes chapters covering types of adhesive, the chemistry on which they are based, and their properties, applications, processes, specifications, and reliability. Coverage of toxicity, environmental impacts and the regulatory framework make this book par

  7. Guide to the collection and presentation of electrical, electronic, and sensing component reliability data for nuclear-power generating stations

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    This Guide is intended to establish a method of collecting and presenting reliability data for quantitative systematic reliability analysis in nuclear power generating stations, as outlined in IEEE Std 351-1975. Appendix D, which is not a part of IEEE Std 500-1977 but which comprises the bulk of this publication, presents tables of reliability data for nuclear power generating stations, intended for use by nuclear systems reliability analysts and design engineers.

  8. Peer-review for selection of oral presentations for conferences: Are we reliable?

    Science.gov (United States)

    Deveugele, Myriam; Silverman, Jonathan

    2017-11-01

    Although peer-review for journal submissions, grant applications and conference submissions has been called 'a cornerstone of science', and even 'the gold standard for evaluating scientific merit', publications on this topic remain scarce. Research that has investigated peer-review reveals several issues and criticisms concerning bias, poor quality review, unreliability and inefficiency. The most important weakness of the peer review process is the inconsistency between reviewers, leading to inadequate inter-rater reliability. To report the reliability of ratings for a large international conference and to suggest possible solutions to overcome the problem. In 2016, during the International Conference on Communication in Healthcare, organized by EACH: International Association for Communication in Healthcare, a calibration exercise was proposed and feedback was reported back to the participants of the exercise. Most abstracts, as well as most peer-reviewers, receive and give scores around the median. Contrary to the general assumption that there are high and low scorers, in this group only 3 peer-reviewers could be identified with a high mean score, while 7 had a low mean score. Only 2 reviewers gave only high ratings (4 and 5). Of the eight abstracts included in this exercise, only one received a high mean score and one a low mean score. Nevertheless, both of these abstracts received both low and high scores; all other abstracts received all possible scores. Peer-review of submissions for conferences is, in accordance with the literature, unreliable. New and creative methods will be needed to give the participants of a conference what they really deserve: a more reliable selection of the best abstracts. More raters per abstract improves inter-rater reliability; training of reviewers could be helpful; providing feedback to reviewers can lead to less inter-rater disagreement; fostering negative peer-review (rejecting inappropriate submissions) rather than a

  9. Process evaluation of the human reliability data bank

    International Nuclear Information System (INIS)

    Miller, D.P.; Comer, K.

    1985-01-01

    The US Nuclear Regulatory Commission and Sandia National Laboratories have been developing a plan for a human reliability data bank since August 1981. This research is in response to the data need of the nuclear power industry's probabilistic risk assessment community. The three phases of the program are to: (a) develop the data bank concept, (b) develop an implementation plan and conduct a process evaluation, and (c) assist a sponsor in implementing the data bank. The program is now in Phase B. This paper describes the methods used and the results of the process evaluation. Decisions to be made in the future regarding full-scale implementation will be based, in part, on the outcome of this study

  10. Process evaluation of the human reliability data bank

    International Nuclear Information System (INIS)

    Miller, D.P.; Comer, K.

    1984-01-01

    The US Nuclear Regulatory Commission and Sandia National Laboratories have been developing a plan for a human reliability data bank since August 1981. This research is in response to the data needs of the nuclear power industry's probabilistic risk assessment community. The three phases of the program are to: (A) develop the data bank concept, (B) develop an implementation plan and conduct a process evaluation, and (C) assist a sponsor in implementing the data bank. The program is now in Phase B. This paper describes the methods used and the results of the process evaluation. Decisions to be made in the future regarding full-scale implementation will be based in part on the outcome of this study

  11. Stochastic process corrosion growth models for pipeline reliability

    International Nuclear Information System (INIS)

    Bazán, Felipe Alexander Vargas; Beck, André Teófilo

    2013-01-01

    Highlights: •Novel non-linear stochastic process corrosion growth model is proposed. •Corrosion rate modeled as random Poisson pulses. •Time to corrosion initiation and inherent time-variability properly represented. •Continuous corrosion growth histories obtained. •Model is shown to precisely fit actual corrosion data at two time points. -- Abstract: Linear random variable corrosion models are extensively employed in reliability analysis of pipelines. However, linear models grossly neglect well-known characteristics of the corrosion process. Herein, a non-linear model is proposed, where corrosion rate is represented as a Poisson square wave process. The resulting model represents inherent time-variability of corrosion growth, produces continuous growth and leads to mean growth at less-than-one power of time. Different corrosion models are adjusted to the same set of actual corrosion data for two inspections. The proposed non-linear random process corrosion growth model leads to the best fit to the data, while better representing problem physics
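
    As a rough illustration of the idea (not the authors' exact model or parameters), a corrosion rate represented as a Poisson square wave can be simulated by drawing Poisson-distributed change times and holding the rate constant between them; the exponential rate distribution and the pulse_rate, mean_rate and t_init values below are assumptions.

        import numpy as np

        def corrosion_history(t_end, pulse_rate, mean_rate, t_init=0.0, seed=None):
            # One corrosion-depth history d(t) where the corrosion rate is a Poisson
            # square wave: it jumps to a new random level at Poisson arrival times.
            #   pulse_rate : expected number of rate changes per year
            #   mean_rate  : mean corrosion rate (mm/year), exponentially distributed here
            #   t_init     : time to corrosion initiation (no growth before it)
            rng = np.random.default_rng(seed)
            times, depths = [0.0, t_init], [0.0, 0.0]
            t, depth = t_init, 0.0
            rate = rng.exponential(mean_rate)
            while t < t_end:
                dt = min(rng.exponential(1.0 / pulse_rate), t_end - t)  # time to next rate change
                depth += rate * dt                                      # constant rate between pulses
                t += dt
                times.append(t)
                depths.append(depth)
                rate = rng.exponential(mean_rate)                       # new random rate level
            return np.array(times), np.array(depths)

        t, d = corrosion_history(t_end=30.0, pulse_rate=0.5, mean_rate=0.1, t_init=2.0, seed=1)
        print(f"simulated defect depth after 30 years: {d[-1]:.2f} mm")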

  12. Reliability modeling of degradation of products with multiple performance characteristics based on gamma processes

    International Nuclear Information System (INIS)

    Pan Zhengqiang; Balakrishnan, Narayanaswamy

    2011-01-01

    Many highly reliable products usually have complex structure, with their reliability being evaluated by two or more performance characteristics. In certain physical situations, the degradation of these performance characteristics would be always positive and strictly increasing. In such a case, the gamma process is usually considered as a degradation process due to its independent and non-negative increments properties. In this paper, we suppose that a product has two dependent performance characteristics and that their degradation can be modeled by gamma processes. For such a bivariate degradation involving two performance characteristics, we propose to use a bivariate Birnbaum-Saunders distribution and its marginal distributions to approximate the reliability function. Inferential method for the corresponding model parameters is then developed. Finally, for an illustration of the proposed model and method, a numerical example about fatigue cracks is discussed and some computational results are presented.
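
    A minimal single-characteristic sketch of the gamma-process idea (assumed parameter values; the paper's bivariate Birnbaum-Saunders approximation is not reproduced): if degradation X(t) follows a gamma distribution with shape alpha*t and scale beta, reliability is the probability of staying below a failure threshold D.

        import numpy as np
        from scipy.stats import gamma

        alpha, beta, D = 0.8, 1.5, 40.0   # assumed shape rate per unit time, scale, failure threshold

        def reliability(t):
            # R(t) = P(X(t) < D) when degradation X(t) ~ Gamma(shape=alpha*t, scale=beta).
            return gamma.cdf(D, a=alpha * t, scale=beta)

        for t in (10, 20, 30, 40):
            print(f"t = {t:2d}   R(t) = {reliability(t):.3f}")

        # Monte Carlo cross-check: simulate yearly gamma increments and accumulate them.
        rng = np.random.default_rng(0)
        paths = rng.gamma(shape=alpha, scale=beta, size=(20000, 40)).cumsum(axis=1)
        print("simulated R(30) ~", round((paths[:, 29] < D).mean(), 3))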

  13. AMSAA Reliability Growth Guide

    National Research Council Canada - National Science Library

    Broemm, William

    2000-01-01

    ... has developed reliability growth methodology for all phases of the process, from planning to tracking to projection. The report presents this methodology and associated reliability growth concepts.
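
    A brief sketch of the Crow/AMSAA power-law tracking model that underlies this kind of reliability growth methodology, fitted to hypothetical cumulative failure times from a time-terminated test (the failure times and test length below are invented):

        import numpy as np

        def crow_amsaa(failure_times, T):
            # MLEs for the AMSAA/Crow power-law model N(t) = lam * t**beta
            # (time-terminated test ending at T) and the instantaneous MTBF at T.
            t = np.asarray(failure_times, dtype=float)
            n = t.size
            beta = n / np.log(T / t).sum()
            lam = n / T ** beta
            mtbf_inst = 1.0 / (lam * beta * T ** (beta - 1.0))
            return beta, lam, mtbf_inst

        # Hypothetical cumulative failure times (hours) from a 1000-hour growth test.
        times = [42, 110, 197, 260, 415, 550, 720, 880]
        beta, lam, mtbf = crow_amsaa(times, T=1000.0)
        print(f"beta = {beta:.2f} (beta < 1 indicates growth), instantaneous MTBF = {mtbf:.0f} h")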

  14. A novel approach for reliable detection of cathepsin S activities in mouse antigen presenting cells.

    Science.gov (United States)

    Steimle, Alex; Kalbacher, Hubert; Maurer, Andreas; Beifuss, Brigitte; Bender, Annika; Schäfer, Andrea; Müller, Ricarda; Autenrieth, Ingo B; Frick, Julia-Stefanie

    2016-05-01

    Cathepsin S (CTSS) is a eukaryotic protease mostly expressed in professional antigen presenting cells (APCs). Since CTSS activity regulation plays a role in the pathogenesis of various autoimmune diseases like multiple sclerosis, atherosclerosis, Sjögren's syndrome and psoriasis as well as in cancer progression, there is an ongoing interest in the reliable detection of cathepsin S activity. Various applications have been invented for specific detection of this enzyme. However, most of them have only been shown to be suitable for human samples, do not deliver quantitative results or the experimental procedure requires technical equipment that is not commonly available in a standard laboratory. We have tested a fluorogen substrate, Mca-GRWPPMGLPWE-Lys(Dnp)-DArg-NH2, that has been described to specifically detect CTSS activities in human APCs for its potential use for mouse samples. We have modified the protocol and thereby offer a cheap, easy, reproducible and quick activity assay to detect CTSS activities in mouse APCs. Since most of basic research on CTSS is performed in mice, this method closes a gap and offers a possibility for reliable and quantitative CTSS activity detection that can be performed in almost every laboratory. Copyright © 2016. Published by Elsevier B.V.

  15. The PedsQL™ Present Functioning Visual Analogue Scales: preliminary reliability and validity

    Directory of Open Access Journals (Sweden)

    Varni James W

    2006-10-01

    Full Text Available Abstract Background The PedsQL™ Present Functioning Visual Analogue Scales (PedsQL™ VAS) were designed as an ecological momentary assessment (EMA) instrument to rapidly measure present or at-the-moment functioning in children and adolescents. The PedsQL™ VAS assess child self-report and parent-proxy report of anxiety, sadness, anger, worry, fatigue, and pain utilizing six developmentally appropriate visual analogue scales based on the well-established Varni/Thompson Pediatric Pain Questionnaire (PPQ) Pain Intensity VAS format. Methods The six-item PedsQL™ VAS was administered to 70 pediatric patients ages 5–17 and their parents upon admittance to the hospital environment (Time 1: T1) and again two hours later (Time 2: T2). It was hypothesized that the PedsQL™ VAS Emotional Distress Summary Score (anxiety, sadness, anger, worry) and the fatigue VAS would demonstrate moderate to large effect size correlations with the PPQ Pain Intensity VAS, and that patient-parent concordance would increase over time. Results Test-retest reliability was demonstrated from T1 to T2 in the large effect size range. Internal consistency reliability was demonstrated for the PedsQL™ VAS Total Symptom Score (patient self-report: T1 alpha = .72, T2 alpha = .80; parent proxy-report: T1 alpha = .80, T2 alpha = .84) and Emotional Distress Summary Score (patient self-report: T1 alpha = .74, T2 alpha = .73; parent proxy-report: T1 alpha = .76, T2 alpha = .81). As hypothesized, the Emotional Distress Summary Score and Fatigue VAS were significantly correlated with the PPQ Pain VAS in the medium to large effect size range, and patient and parent concordance increased from T1 to T2. Conclusion The results demonstrate preliminary test-retest and internal consistency reliability and construct validity of the PedsQL™ Present Functioning VAS instrument for both pediatric patient self-report and parent proxy-report. Further field testing is required to extend these initial
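
    For illustration only (hypothetical data, not the PedsQL dataset), the two reliability statistics reported above, internal consistency (Cronbach's alpha) and the test-retest correlation of T1 and T2 total scores, can be computed as follows; the simulated latent construct and noise level are assumptions.

        import numpy as np

        def cronbach_alpha(items):
            # items: (n_respondents, k_items) matrix of scale scores.
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var_sum = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_var_sum / total_var)

        rng = np.random.default_rng(3)
        latent = rng.normal(size=(70, 1))                    # 70 hypothetical respondents
        t1 = latent + 0.5 * rng.normal(size=(70, 6))         # six VAS items at T1
        t2 = latent + 0.5 * rng.normal(size=(70, 6))         # the same items two hours later

        print(f"internal consistency, alpha(T1) = {cronbach_alpha(t1):.2f}")
        r = np.corrcoef(t1.sum(axis=1), t2.sum(axis=1))[0, 1]
        print(f"test-retest r of the total score = {r:.2f}")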

  16. Development of equipment reliability process using predictive technologies at Hamaoka Nuclear Power Station

    International Nuclear Information System (INIS)

    Taniguchi, Yuji; Sakuragi, Futoshi; Hamada, Seiichi

    2014-01-01

    The development of an equipment reliability (ER) process at Hamaoka Nuclear Power Station, specifically a condition-based maintenance (CBM) process that integrates predictive maintenance (PdM) technologies, is introduced in this paper. Integration of predictive maintenance technologies such as vibration analysis, oil analysis and thermal monitoring is essential to establish strong maintenance strategies and to direct specific technical development. In addition, a practical example of CBM is also presented to support the advantages of the approach. (author)

  17. The reliability analysis of cutting tools in the HSM processes

    OpenAIRE

    W.S. Lin

    2008-01-01

    Purpose: This article mainly describes the reliability of cutting tools in high speed turning by a normal distribution model. Design/methodology/approach: A series of experimental tests have been done to evaluate the reliability variation of the cutting tools. From experimental results, the tool wear distribution and the tool life are determined, and the tool life distribution and the reliability function of cutting tools are derived. Further, the reliability of cutting tools at any time for h...
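
    A minimal sketch of the normal-distribution reliability model named in the abstract: if tool life is normally distributed with mean mu and standard deviation sigma, then R(t) = 1 - Phi((t - mu)/sigma); the parameter values below are assumed, not taken from the experiments.

        from scipy.stats import norm

        mu, sigma = 25.0, 5.0              # assumed mean and std of tool life (minutes of cutting)

        def tool_reliability(t):
            # Probability that the cutting tool survives beyond time t.
            return norm.sf(t, loc=mu, scale=sigma)          # 1 - CDF of the normal life model

        for t in (15, 20, 25, 30):
            print(f"R({t} min) = {tool_reliability(t):.3f}")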

  18. An Impact of Thermodynamic Processes in Human Bodies on Performance Reliability of Individuals

    Directory of Open Access Journals (Sweden)

    Smalko Zbigniew

    2015-01-01

    Full Text Available The article presents the problem of the influence of thermodynamic factors on human fallibility in different zones of thermal discomfort. It describes the energy processes in the human body and gives a formal description of the energy balance of human body thermoregulation. It points to human reactions to temperature changes of the internal and external environment, including reactions associated with exercise. A methodology is given to estimate and determine reliability indicators of basic human performance in different zones of thermal discomfort. A significant effect of thermodynamic factors on the reliability and security of a person is shown.

  19. Processes and Procedures for Estimating Score Reliability and Precision

    Science.gov (United States)

    Bardhoshi, Gerta; Erford, Bradley T.

    2017-01-01

    Precision is a key facet of test development, with score reliability determined primarily according to the types of error one wants to approximate and demonstrate. This article identifies and discusses several primary forms of reliability estimation: internal consistency (i.e., split-half, KR-20, alpha), test-retest, alternate forms, interscorer, and…

  20. Processing the Order of Symbolic Numbers: A Reliable and Unique Predictor of Arithmetic Fluency

    Directory of Open Access Journals (Sweden)

    Stephan E. Vogel

    2017-12-01

    Full Text Available A small but growing body of evidence suggests a link between individual differences in processing the order of numerical symbols (e.g., deciding whether a set of digits is arranged in ascending/descending order or not) and arithmetic achievement. However, the reliability of behavioral correlates measuring symbolic and non-symbolic numerical order processing and their relationship to arithmetic abilities remain poorly understood. The present study aims to fill this knowledge gap by examining the behavioral correlates of numerical and non-numerical order processing and their unique associations with arithmetic fluency at two different time points within the same sample of individuals. Thirty-two right-handed adults performed three order judgment tasks consisting of symbolic numbers (i.e., digits), non-symbolic numbers (i.e., dots), and letters of the alphabet. Specifically, participants had to judge as accurately and as quickly as possible whether stimuli were ordered correctly (in ascending/descending order, e.g., 2-3-4; ●●●●-●●●-●●; B-C-D) or not (e.g., 4-5-3; ●●●●-●●●●●-●●●; D-E-C). Results of this study demonstrate that numerical order judgments are reliable measurements (i.e., high test-retest reliability), and that the observed relationship between symbolic number processing and arithmetic fluency accounts for a unique and reliable portion of variance over and above the non-symbolic number and the letter conditions. The differential association of symbolic and non-symbolic numbers with arithmetic supports the view that processing the order of symbolic and non-symbolic numbers engages different cognitive mechanisms, and that the ability to process ordinal relationships of symbolic numbers is a reliable and unique predictor of arithmetic fluency.

  1. Materials and processes for spacecraft and high reliability applications

    CERN Document Server

    D Dunn, Barrie

    2016-01-01

    The objective of this book is to assist scientists and engineers select the ideal material or manufacturing process for particular applications; these could cover a wide range of fields, from light-weight structures to electronic hardware. The book will help in problem solving as it also presents more than 100 case studies and failure investigations from the space sector that can, by analogy, be applied to other industries. Difficult-to-find material data is included for reference. The sciences of metallic (primarily) and organic materials presented throughout the book demonstrate how they can be applied as an integral part of spacecraft product assurance schemes, which involve quality, material and processes evaluations, and the selection of mechanical and component parts. In this successor edition, which has been revised and updated, engineering problems associated with critical spacecraft hardware and the space environment are highlighted by over 500 illustrations including micrographs and fractographs. Sp...

  2. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book starts with the question of what reliability is, covering the origin of reliability problems, the definition of reliability, and the use of reliability. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions in reliability, estimation of MTBF, processes of probability distributions, downtime, maintainability and availability, breakdown maintenance and preventive maintenance, design for reliability, reliability prediction and statistics, reliability testing, reliability data, and the design and management of reliability.

  3. Designing the database for a reliability aware Model-Based System Engineering process

    International Nuclear Information System (INIS)

    Cressent, Robin; David, Pierre; Idasiak, Vincent; Kratz, Frederic

    2013-01-01

    This article outlines the need for a reliability database to implement model-based description of component failure modes and dysfunctional behaviors. We detail the requirements such a database should honor and describe our own solution: the Dysfunctional Behavior Database (DBD). Through the description of its meta-model, the benefits of integrating the DBD in the system design process are highlighted. The main advantages depicted are the possibility to manage feedback knowledge at various granularity and semantic levels and to drastically ease the interactions between system engineering activities and reliability studies. The compliance of the DBD with other reliability databases such as FIDES is presented and illustrated. - Highlights: ► Model-Based System Engineering is increasingly used in industry. ► It results in a need for a reliability database able to deal with model-based description of dysfunctional behavior. ► The Dysfunctional Behavior Database aims to fulfill that need. ► It helps deal with feedback management thanks to its structured meta-model. ► The DBD can profit from other reliability databases such as FIDES.

  4. Techniques, processes, and measures for software safety and reliability

    International Nuclear Information System (INIS)

    Sparkman, D.

    1992-01-01

    The purpose of this report is to provide a detailed survey of current recommended practices and measurement techniques for the development of reliable and safe software-based systems. This report is intended to assist the US Nuclear Regulatory Commission's Office of Nuclear Reactor Regulation (NRR) in determining the importance and maturity of the available techniques and in assessing the relevance of individual standards for application to instrumentation and control systems in nuclear power generating stations. Lawrence Livermore National Laboratory (LLNL) provides technical support for the Instrumentation and Control System Branch (ICSB) of NRR in advanced instrumentation and control systems, distributed digital systems, software reliability, and the application of verification and validation for the development of software

  5. The importance of reliability to the SunShot Initiative (Presentation Recording)

    Science.gov (United States)

    Jones-Albertus, Rebecca

    2015-09-01

    The U.S. Department of Energy's SunShot Initiative was launched in 2011 to make subsidy-free solar electricity cost competitive with conventional energy sources by the end of the decade. Research in reliability can play a major role in realizing the SunShot goal of $0.06/kWh. By improving photovoltaic module lifetime and reducing degradation rates, a system's lifetime energy output is increased. Increasing confidence in photovoltaic performance prediction can lower perceived investment risk and thus the cost of capital. Accordingly, in 2015, SunShot expects to award more than $40 million through its SunShot National Laboratory Multiyear Partnership (SuNLaMP) and Physics of Reliability: Evaluating Design Insights for Component Technologies in Solar (PREDICTS) 2 funding programs, for research into reliability topics such as determining acceleration factors, modeling degradation rates and failure mechanisms, improving predictive performance models, and developing new test methods and instrumentation.
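
    A back-of-the-envelope sketch (all numbers assumed, not SunShot figures) of why degradation rate and module lifetime drive lifetime energy output and hence levelized cost:

        def lifetime_energy(first_year_kwh, degradation_rate, years):
            # Total energy if output falls by a fixed fraction of the previous year's output each year.
            return sum(first_year_kwh * (1.0 - degradation_rate) ** y for y in range(years))

        base = lifetime_energy(1600.0, 0.010, 25)       # 1.0 %/yr degradation over a 25-year life
        improved = lifetime_energy(1600.0, 0.005, 30)   # 0.5 %/yr degradation over a 30-year life
        print(f"lifetime energy gain from better reliability: {improved / base - 1.0:+.1%}")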

  6. Validity and Reliability of Revised Inventory of Learning Processes.

    Science.gov (United States)

    Gadzella, B. M.; And Others

    The Inventory of Learning Processes (ILP) was developed by Schmeck, Ribich, and Ramanaiah in 1977 as a self-report inventory to assess learning style through a behavioral-oriented approach. The ILP was revised by Schmeck in 1983. The Revised ILP contains six scales: (1) Deep Processing; (2) Elaborative Processing; (3) Shallow Processing; (4)…

  7. Performance and Reliability of Bonded Interfaces for High-Temperature Packaging (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Devoto, D.

    2014-11-01

    The thermal performance and reliability of sintered-silver is being evaluated for power electronics packaging applications. This will be experimentally accomplished by the synthesis of large-area bonded interfaces between metalized substrates that will be subsequently subjected to thermal cycles. A finite element model of crack initiation and propagation in these bonded interfaces will allow for the interpretation of degradation rates by a crack-velocity (V)-stress intensity factor (K) analysis. The experiment is outlined, and the modeling approach is discussed.

  8. Dissecting antigen processing and presentation routes in dermal vaccination strategies

    NARCIS (Netherlands)

    Platteel, Anouk C M; Henri, Sandrine; Zaiss, Dietmar M; Sijts, Alice J A M

    2017-01-01

    The skin is an attractive site for vaccination due to its accessibility and presence of immune cells surveilling this barrier. However, knowledge of antigen processing and presentation upon dermal vaccination is sparse. In this study we determined antigen processing routes that lead to CD8(+) T cell

  9. Present status and expected progress in radiation processing dosimetry

    DEFF Research Database (Denmark)

    Kovács, A.; Miller, A.

    2004-01-01

    The paper describes the present status of radiation processing dosimetry, including the methods used most widely in gamma- and electron processing as well as the new methods under development or introduction. The recent trends with respect to calibration of routine dosimetry systems as well...

  10. Suncor maintenance and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Little, S. [Suncor Energy, Calgary, AB (Canada)

    2006-07-01

    Fleet maintenance and reliability at Suncor Energy was discussed in this presentation, with reference to Suncor Energy's primary and support equipment fleets. This paper also discussed Suncor Energy's maintenance and reliability standard involving people, processes and technology. An organizational maturity chart that graphed organizational learning against organizational performance was illustrated. The presentation also reviewed the maintenance and reliability framework; maintenance reliability model; the process overview of the maintenance and reliability standard; a process flow chart of maintenance strategies and programs; and an asset reliability improvement process flow chart. An example of an improvement initiative was included, with reference to a shovel reliability review; a dipper trip reliability investigation; bucket related failures by type and frequency; root cause analysis of the reliability process; and additional actions taken. Last, the presentation provided a graph of the results of the improvement initiative and presented the key lessons learned. tabs., figs.

  11. Insights on the poster preparation and presentation process.

    Science.gov (United States)

    Moore, L W; Augspurger, P; King, M O; Proffitt, C

    2001-05-01

    Dissemination of research findings and effective clinical innovations is key to the growth and development of the nursing profession. Several avenues exist for the dissemination of information. One forum for communication that has gained increased recognition over the past decade is the poster presentation. Poster presentations are often a significant part of regional, national, and international nursing conferences. Although posters are frequently used to disseminate information to the nursing community, little is reported about actual poster presenters' experiences with preparation and presentation of their posters. The purpose of this article is to present insights derived from information shared by poster presenters regarding the poster preparation and presentation process. Such insights derived from the personal experiences of poster presenters may assist others to efficiently and effectively prepare and present scholarly posters that disseminate information to the nursing community. Copyright 2001 by W.B. Saunders Company

  12. A Structural Reliability Business Process Modelling with System Dynamics Simulation

    OpenAIRE

    Lam, C. Y.; Chan, S. L.; Ip, W. H.

    2010-01-01

    Business activity flow analysis enables organizations to manage structured business processes, and can thus help them to improve performance. The six types of business activities identified here (i.e., SOA, SEA, MEA, SPA, MSA and FIA) are correlated and interact with one another, and the decisions from any business activity form feedback loops with previous and succeeding activities, thus allowing the business process to be modelled and simulated. For instance, for any company that is eager t...

  13. Silicon analog components device design, process integration, characterization, and reliability

    CERN Document Server

    El-Kareh, Badih

    2015-01-01

    This book covers modern analog components, their characteristics, and interactions with process parameters. It serves as a comprehensive guide, addressing both the theoretical and practical aspects of modern silicon devices and the relationship between their electrical properties and processing conditions. Based on the authors’ extensive experience in the development of analog devices, this book is intended for engineers and scientists in semiconductor research, development and manufacturing. The problems at the end of each chapter and the numerous charts, figures and tables also make it appropriate for use as a text in graduate and advanced undergraduate courses in electrical engineering and materials science.

  14. FE modeling of Cu wire bond process and reliability

    NARCIS (Netherlands)

    Yuan, C.A.; Weltevreden, E.R.; Akker, P. van den; Kregting, R.; Vreugd, J. de; Zhang, G.Q.

    2011-01-01

    Copper based wire bonding technology is widely accepted by electronic packaging industry due to the world-wide cost reduction actions (compared to gold wire bond). However, the mechanical characterization of copper wire differs from the gold wire; hence the new wire bond process setting and new bond

  15. Prediction of thermo-mechanical reliability of wafer backend processes

    NARCIS (Netherlands)

    Gonda, V.; Toonder, den J.M.J.; Beijer, J.G.J.; Zhang, G.Q.; van Driel, W.D.; Hoofman, R.J.O.M.; Ernst, L.J.

    2004-01-01

    More than 65% of IC failures are related to thermal and mechanical problems. For wafer backend processes, thermo-mechanical failure is one of the major bottlenecks. The ongoing technological trends like miniaturization, introduction of new materials, and function/product integration will increase

  16. Dynamic analysis and reliability assessment of structures with uncertain-but-bounded parameters under stochastic process excitations

    International Nuclear Information System (INIS)

    Do, Duy Minh; Gao, Wei; Song, Chongmin; Tangaramvong, Sawekchai

    2014-01-01

    This paper presents the non-deterministic dynamic analysis and reliability assessment of structures with uncertain-but-bounded parameters under stochastic process excitations. Random ground acceleration from earthquake motion is adopted to illustrate the stochastic process force. The exact change ranges of natural frequencies, random vibration displacement and stress responses of structures are investigated under the interval analysis framework. Formulations for structural reliability are developed considering the safe boundary and structural random vibration responses as interval parameters. An improved particle swarm optimization algorithm, namely randomised lower sequence initialized high-order nonlinear particle swarm optimization algorithm, is employed to capture the better bounds of structural dynamic characteristics, random vibration responses and reliability. Three numerical examples are used to demonstrate the presented method for interval random vibration analysis and reliability assessment of structures. The accuracy of the results obtained by the presented method is verified by the randomised Quasi-Monte Carlo simulation method (QMCSM) and direct Monte Carlo simulation method (MCSM). - Highlights: • Interval uncertainty is introduced into structural random vibration responses. • Interval dynamic reliability assessments of structures are implemented. • Boundaries of structural dynamic response and reliability are achieved

  17. Cellular scanning strategy for selective laser melting: Generating reliable, optimized scanning paths and processing parameters

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2015-01-01

    method based uncertainty and reliability analysis. The reliability of the scanning paths are established using cumulative probability distribution functions for process output criteria such as sample density, thermal homogeneity, etc. A customized genetic algorithm is used along with the simulation model...

  18. Supporting change processes in design: Complexity, prediction and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Eckert, Claudia M. [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: cme26@cam.ac.uk; Keller, Rene [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: rk313@cam.ac.uk; Earl, Chris [Open University, Department of Design and Innovation, Walton Hall, Milton Keynes MK7 6AA (United Kingdom)]. E-mail: C.F.Earl@open.ac.uk; Clarkson, P. John [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: pjc10@cam.ac.uk

    2006-12-15

    Change to existing products is fundamental to design processes. New products are often designed through change or modification to existing products. Specific parts or subsystems are changed to similar ones whilst others are directly reused. Design by modification applies particularly to safety critical products where the reuse of existing working parts and subsystems can reduce cost and risk. However change is rarely a matter of just reusing or modifying parts. Changing one part can propagate through the entire design leading to costly rework or jeopardising the integrity of the whole product. This paper characterises product change based on studies in the aerospace and automotive industry and introduces tools to aid designers in understanding the potential effects of change. Two ways of supporting designers are described: probabilistic prediction of the effects of change and visualisation of change propagation through product connectivities. Change propagation has uncertainties which are amplified by the choices designers make in practice as they implement change. Change prediction and visualisation is discussed with reference to complexity in three areas of product development: the structural backcloth of connectivities in the existing product (and its processes), the descriptions of the product used in design and the actions taken to carry out changes.

  19. Wind Energy Deployment Process and Siting Tools (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Tegen, S.

    2015-02-01

    Regardless of cost and performance, some wind projects cannot proceed to completion as a result of competing multiple uses or siting considerations. Wind energy siting issues must be better understood and quantified. DOE tasked NREL researchers with depicting the wind energy deployment process and researching development considerations. This presentation provides an overview of these findings and wind siting tools.

  20. Medical image processing on the GPU - past, present and future.

    Science.gov (United States)

    Eklund, Anders; Dufort, Paul; Forsberg, Daniel; LaConte, Stephen M

    2013-12-01

    Graphics processing units (GPUs) are used today in a wide range of applications, mainly because they can dramatically accelerate parallel computing, are affordable and energy efficient. In the field of medical imaging, GPUs are in some cases crucial for enabling practical use of computationally demanding algorithms. This review presents the past and present work on GPU accelerated medical image processing, and is meant to serve as an overview and introduction to existing GPU implementations. The review covers GPU acceleration of basic image processing operations (filtering, interpolation, histogram estimation and distance transforms), the most commonly used algorithms in medical imaging (image registration, image segmentation and image denoising) and algorithms that are specific to individual modalities (CT, PET, SPECT, MRI, fMRI, DTI, ultrasound, optical imaging and microscopy). The review ends by highlighting some future possibilities and challenges. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Electron backscatter diffraction: Strategies for reliable data acquisition and processing

    International Nuclear Information System (INIS)

    Randle, Valerie

    2009-01-01

    In electron backscatter diffraction (EBSD) software packages there are many user choices both in data acquisition and in data processing and display. In order to extract maximum scientific value from an inquiry, it is helpful to have some guidelines for best practice in conducting an EBSD investigation. The purpose of this article therefore is to address selected topics of EBSD practice, in a tutorial manner. The topics covered are a brief summary on the principles of EBSD, specimen preparation, calibration of an EBSD system, experiment design, speed of data acquisition, data clean-up, microstructure characterisation (including grain size) and grain boundary characterisation. This list is not meant to cover exhaustively all areas where EBSD is used, but rather to provide a resource consisting of some useful strategies for novice EBSD users.

  2. Improving the reliability of seasonal climate forecasts through empirical downscaling and multi-model considerations; presentation

    CSIR Research Space (South Africa)

    Landman, WA

    2012-11-01

    Full Text Available , discrimination and sharpness. We present seasonal prediction verification for the equatorial Pacific Ocean (where El Niño and La Niña events occur) sea-surface temperatures. The verification is done over a recent multi-decadal period for which hindcasts (re...

  3. Testing effort dependent software reliability model for imperfect debugging process considering both detection and correction

    International Nuclear Information System (INIS)

    Peng, R.; Li, Y.F.; Zhang, W.J.; Hu, Q.P.

    2014-01-01

    This paper studies the fault detection process (FDP) and fault correction process (FCP) with the incorporation of testing effort function and imperfect debugging. In order to ensure high reliability, it is essential for software to undergo a testing phase, during which faults can be detected and corrected by debuggers. The testing resource allocation during this phase, which is usually depicted by the testing effort function, considerably influences not only the fault detection rate but also the time to correct a detected fault. In addition, testing is usually far from perfect such that new faults may be introduced. In this paper, we first show how to incorporate testing effort function and fault introduction into FDP and then develop FCP as delayed FDP with a correction effort. Various specific paired FDP and FCP models are obtained based on different assumptions of fault introduction and correction effort. An illustrative example is presented. The optimal release policy under different criteria is also discussed
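
    A rough numerical sketch of this kind of paired detection/correction model (not the paper's exact formulation): cumulative testing effort W(t) follows a logistic curve, expected detections follow m_d(t) = a(1 - exp(-b*W(t))), and corrections lag detections by a fixed delay; every parameter value below is invented.

        import numpy as np

        a, b = 120.0, 0.015                   # total fault content and detection efficiency (assumed)
        W_max, k, t_mid = 400.0, 0.12, 50.0   # logistic testing-effort parameters (assumed)
        delay = 5.0                           # mean lag from detection to correction (assumed)

        def W(t):
            # Cumulative testing effort consumed by time t (logistic testing-effort function).
            return W_max / (1.0 + np.exp(-k * (t - t_mid)))

        def m_detected(t):
            # Expected cumulative number of detected faults.
            return a * (1.0 - np.exp(-b * W(t)))

        def m_corrected(t):
            # Fault correction treated as delayed fault detection.
            return m_detected(np.maximum(t - delay, 0.0))

        for t in (20, 40, 60, 80, 100):
            print(f"t = {t:3d}   detected = {m_detected(t):6.1f}   corrected = {m_corrected(t):6.1f}")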

  4. Virtual Reality in Presentation of the Underground Mine Technological Process

    Directory of Open Access Journals (Sweden)

    Kodym Oldřich

    2003-09-01

    Full Text Available Virtual Reality in Presentation of the Underground Mine Technological Process focuses on methods of presenting underground mine technologies over intranet technology. It shows the use of a platform-independent VRML client for presenting static and dynamic information about the technological process, and addresses bi-directional interaction between the client and the process information database. Based on an analysis of the technological process of an underground mine, a database structure was designed. It is a skeleton for storing all information about any underground mine and can be modified in any direction. Data in this "static model" of the underground mine can be used for visualization in a VRML environment, making it possible to simplify and unify the user's front-end for all kinds of tasks. All designed scenes can be displayed interactively in full view or in any detail view, so that a user is able to recognize every important part of the installed equipment, its state, technical parameters and other information. If manufacturers of mining equipment supply VRML models of their real products, anyone will be able to place them into the VRML scene and learn everything about them. This work explores and tries to clarify some of the areas and available approaches compliant with the VRML 97 specification for modifying a static scene from its browser. Concepts of the animation pipeline, scripting inside and outside the displayed scene, and authoring of VRML-targeted geometry are discussed, including database connectivity.

  5. Incorporating travel-time reliability into the congestion management process : a primer.

    Science.gov (United States)

    2015-02-01

    This primer explains the value of incorporating travel-time reliability into the Congestion Management Process (CMP) : and identifies the most current tools available to assist with this effort. It draws from applied research and best practices : fro...

  6. Utilizing clad piping to improve process plant piping integrity, reliability, and operations

    International Nuclear Information System (INIS)

    Chakravarti, B.

    1996-01-01

    During the past four years carbon steel piping clad with type 304L (UNS S30403) stainless steel has been used to solve the flow accelerated corrosion (FAC) problem in nuclear power plants with exceptional success. The product is designed to allow ''like for like'' replacement of damaged carbon steel components where the carbon steel remains the pressure boundary and type 304L (UNS S30403) stainless steel the corrosion allowance. More than 3000 feet of piping and 500 fittings in sizes from 6 to 36-in. NPS have been installed in the extraction steam and other lines of these power plants to improve reliability, eliminate inspection program, reduce O and M costs and provide operational benefits. This concept of utilizing clad piping in solving various corrosion problems in industrial and process plants by conservatively selecting a high alloy material as cladding can provide similar, significant benefits in controlling corrosion problems, minimizing maintenance cost, improving operation and reliability to control performance and risks in a highly cost effective manner. This paper will present various material combinations and applications that appear ideally suited for use of the clad piping components in process plants

  7. Challenges and opportunities for informational societies from the present to become reliable learning and knowledge societies

    Directory of Open Access Journals (Sweden)

    Eduardo ROMERO SÁNCHEZ

    2013-12-01

    Full Text Available This article aims to describe the principal social trends and cultural features that prevail today, to examine the philosophical foundations of thinking, feeling and living, and to give an educational response suited to the present axiological and cultural reality. In modern western societies great paradoxes and contradictions coexist: economic growth, technological development and greater dimensions of freedom, but also great consumption, cultural deterioration, technological dependence and unique thought. Given this, we talk about the great possibilities and, at the same time, the terrible threats that exist in modern information societies. In order to become acquainted with this reality, we have focused the analysis on three key aspects: the impact of the digital revolution, the condition of culture in contemporary society, and the need for a "new education".

  8. reliability reliability

    African Journals Online (AJOL)

    eobe

    [Only fragments of this record's abstract survive; they concern the reliability of concrete members based on utility theory, with checks against values given by the code of practice and an optimization procedure over the failure domain F.]

  9. Human Reliability Program Overview

    Energy Technology Data Exchange (ETDEWEB)

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  10. Equipment reliability improvement process; implementation in Almaraz NPP and Trillo NPP

    International Nuclear Information System (INIS)

    Risquez Bailon, Aranzazu; Gutierrez Fernandez, Eduardo

    2010-01-01

    The Equipment Reliability Improvement Process (INPO AP-913) is a non-regulatory process developed by the US nuclear industry for improving plant availability. This process integrates and coordinates a broad range of equipment reliability activities into one process, performed by the plant in a non-centralized way. The integration and coordination of these activities will allow plant personnel to evaluate the trends of important station equipment, develop and implement long-term equipment health plans, monitor equipment performance and condition, make adjustments to preventive maintenance tasks and frequencies based on equipment operating experience and, where necessary, arbitrate operational and design improvements, with the goal of failure-free operation. This paper describes the methodology of the Equipment Reliability Improvement Process, focusing on the main aspects of the implementation process, relating to the scope and establishment of an Equipment Reliability Monitoring Plan, which should include and complement the existing mechanisms and organizations in the plant to monitor the condition and performance of the equipment, with the common aim of achieving an operation free of failures. The paper will describe the tools that Iberdrola Ingenieria has developed to support the implementation and monitoring of the Equipment Reliability Improvement Process, as well as the results and lessons learned from its implementation in Almaraz NPP and Trillo NPP. (authors)

  11. Present status of libraries processed from JENDL-3.2

    International Nuclear Information System (INIS)

    Yamano, Naoki

    1996-01-01

    Data libraries processed from JENDL-3.2 were produced by JAERI for use in typical applications in the areas of nuclear reactors and shielding. These libraries had the distinction of being free of restrictions on distribution. Other data libraries for decay, activation, dosimetry, nuclide transmutation, burn-up, PKA, KERMA and DPA calculations are now being accumulated. The current status of the development of these libraries is described. A discussion is presented about how to disseminate and share the products of JNDC activities. (author)

  12. Personality and persona: personality processes in self-presentation.

    Science.gov (United States)

    Leary, Mark R; Allen, Ashley Batts

    2011-12-01

    This article examines the role that personality variables and processes play in people's efforts to manage their public images. Although most research on self-presentation has focused on situational influences, people differ greatly in the degree to which they care about others' impressions of them, the types of impressions they try to convey, and their evaluations of their self-presentational effectiveness. Personality constructs such as public self-consciousness, approval motivation, and fear of negative evaluation are associated with the motive to manage one's impressions, and people who differ in self-disclosure and desire for privacy differentially reveal information about themselves to others. Other variables relating to people's self-concepts, interpersonal goals, and traits influence the construction of specific images. Finally, the extent to which people believe they are capable of making desired impressions influences their impression management strategies and how they respond to other people's evaluations. © 2010 The Authors. Journal of Personality © 2011, Wiley Periodicals, Inc.

  13. Reliability data collection and processing for Romanian TRIGA-SSR 14MW

    International Nuclear Information System (INIS)

    Mladin, Daniela; Mladin, Mirea; Cristea, Dumitru

    2002-01-01

    The use of site-specific reliability data for PSA is highly recommended because it enhances the accuracy and credibility of the risk analysis. In order to obtain the database statistics for the reactor components it is necessary to: develop a brief reactor operation history; identify the components which can be monitored and their specific failure modes; clearly define the component boundary; select and run through the sources of recorded information related to failures and operational details; and process the data. The paper presents how these steps are completed for obtaining failure rates and confidence interval limits (95% and 5%) for a series of Romanian TRIGA components such as: pumps, motors, valves, compressors, fans, etc. The identification of component boundaries and failure modes is performed according to the IAEA guides for the research reactor database. (author)
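
    For illustration (made-up counts, not TRIGA data), a constant failure rate and its two-sided confidence bounds for time-terminated data can be obtained from the classical chi-square limits:

        from scipy.stats import chi2

        def failure_rate_bounds(n_failures, hours, lower_q=0.05, upper_q=0.95):
            # Point estimate and two-sided chi-square confidence bounds for a constant
            # failure rate estimated from time-terminated data.
            lam_hat = n_failures / hours
            lower = chi2.ppf(lower_q, 2 * n_failures) / (2.0 * hours)
            upper = chi2.ppf(upper_q, 2 * (n_failures + 1)) / (2.0 * hours)
            return lam_hat, lower, upper

        lam, lo, hi = failure_rate_bounds(n_failures=4, hours=2.0e5)   # hypothetical pump data
        print(f"lambda = {lam:.1e}/h   (5% bound: {lo:.1e}/h, 95% bound: {hi:.1e}/h)")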

  14. A Hybrid Approach for Reliability Analysis Based on Analytic Hierarchy Process and Bayesian Network

    International Nuclear Information System (INIS)

    Zubair, Muhammad

    2014-01-01

    By using the analytic hierarchy process (AHP) and Bayesian networks (BN), the present research examines the technical and non-technical issues of nuclear accidents. The study showed that technical faults were one major cause of these accidents. From another point of view, it becomes clear that human behavior such as dishonesty, insufficient training and selfishness also plays a key role in causing these accidents. In this study, a hybrid approach for reliability analysis based on AHP and BN to increase nuclear power plant (NPP) safety has been developed. Using AHP, the best alternatives for improving safety, design and operation, and for allocating budget to all technical and non-technical factors related to nuclear safety, have been investigated. We use a special structure of BN based on the AHP method. The graphs of the BN and the probabilities associated with its nodes are designed to translate the knowledge of experts on the selection of the best alternative. The results show that improvement in regulatory authorities will decrease failure probabilities and increase safety and reliability in the industrial area.
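
    A compact sketch of the AHP step only (the BN part is not reproduced): priority weights come from the principal eigenvector of a pairwise comparison matrix, with a consistency ratio as a sanity check; the three alternatives and the comparison values below are hypothetical.

        import numpy as np

        def ahp_weights(A):
            # Principal-eigenvector priority weights and consistency ratio for a
            # positive reciprocal pairwise-comparison matrix A.
            A = np.asarray(A, dtype=float)
            n = A.shape[0]
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            ci = (eigvals[k].real - n) / (n - 1)                      # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}.get(n, 1.0)     # Saaty's random index
            return w, ci / ri

        # Hypothetical comparison of three alternatives: operator training,
        # regulatory oversight, and design upgrade.
        A = [[1.0, 3.0, 5.0],
             [1/3, 1.0, 2.0],
             [1/5, 1/2, 1.0]]
        w, cr = ahp_weights(A)
        print("priority weights:", np.round(w, 3), "  consistency ratio:", round(cr, 3))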

  15. The reliability process as an integral part of the product creation process. A contribution to assure the maturity level; Der Zuverlaessigkeitsprozess als integraler Bestandteil des Produktentstehungsprozesses. Ein Beitrag zur Reifegradabsicherung

    Energy Technology Data Exchange (ETDEWEB)

    Savic, R.; Kusenic, D. [ZF Friedrichshafen AG (Germany)

    2007-07-01

    The reliability process in the automotive and supplier industry covers all phases of the product creation process. The objective of the main tasks of the reliability process is to meet the requirements of the customer, authorities and law. The reliability process therefore assures a high readiness for a stable and undisturbed start of production, in line with the product creation process. The paper presents a reliability-based model for assuring the maturity level, in the form of the reliability growth methodology and its verification, i.e. monitoring of progress during the product creation process. As a consequence, the performance of the product creation process is required to be monitored and reported on. In this case the monitoring is based on the reliability parameters of reliability growth, in the form of the achieved MTBF or MTTF, from which the maturity level is derived during the product creation process. This makes it possible for the reliability management to track the reliability targets and the efficiency of corrective actions, and the product creation process itself, for all phases of the process. (orig.)

  16. Present and future trends of laser materials processing in Japan

    Science.gov (United States)

    Matsunawa, Akira

    1991-10-01

    Lasers quickly penetrated into Japanese industries in the mid-80s. The paper reviews the present situation of industrial lasers and their applications in Japanese industries for materials removal, joining, and some surface modification technologies as well as their economical evaluation compared with competitive technologies. Laser cutting of metallic and nonmetallic thin sheets is widely prevalent even in small scale industries as a flexible manufacturing tool. As for the laser welding is concerned, industrial applications are rather limited in mass production lines. This mainly comes from the fact that the present laser technologies have not employed the adaptive control because of the lack of sensors, monitoring, and control systems which can tolerate the high-precision and high-speed processing. In spite of this situation, laser welding is rapidly increasing in recent years in industries such as automotive, machinery, electric/electronic, steel, heavy industries, etc. Laser surface modification technologies have attracted significant interest from industrial people, but actual application is very limited today. However, the number of R&D papers is increasing year by year. The paper also reviews these new technology trends in Japan.

  17. Comparative reliability of structured versus unstructured interviews in the admission process of a residency program.

    Science.gov (United States)

    Blouin, Danielle; Day, Andrew G; Pavlov, Andrey

    2011-12-01

    Although never directly compared, structured interviews are reported as being more reliable than unstructured interviews. This study compared the reliability of both types of interview when applied to a common pool of applicants for positions in an emergency medicine residency program. In 2008, one structured interview was added to the two unstructured interviews traditionally used in our resident selection process. A formal job analysis using the critical incident technique guided the development of the structured interview tool. This tool consisted of 7 scenarios assessing 4 of the domains deemed essential for success as a resident in this program. The traditional interview tool assessed 5 general criteria. In addition to these criteria, the unstructured panel members were asked to rate each candidate on the same 4 essential domains rated by the structured panel members. All 3 panels interviewed all candidates. Main outcomes were the overall, interitem, and interrater reliabilities, the correlations between interview panels, and the dimensionality of each interview tool. Thirty candidates were interviewed. The overall reliability reached 0.43 for the structured interview, and 0.81 and 0.71 for the unstructured interviews. Analyses of the variance components showed a high interrater, low interitem reliability for the structured interview, and a high interrater, high interitem reliability for the unstructured interviews. The summary measures from the 2 unstructured interviews were significantly correlated, but neither was correlated with the structured interview. Only the structured interview was multidimensional. A structured interview did not yield a higher overall reliability than both unstructured interviews. The lower reliability is explained by a lower interitem reliability, which in turn is due to the multidimensionality of the interview tool. Both unstructured panels consistently rated a single dimension, even when prompted to assess the 4 specific domains

  18. Mapping innovation processes: Visual techniques for opening and presenting the black box of service innovation processes

    DEFF Research Database (Denmark)

    Olesen, Anne Rørbæk

    2017-01-01

    This chapter argues for the usefulness of visual mapping techniques for performing qualitative analysis of complex service innovation processes. Different mapping formats are presented, namely, matrices, networks, process maps, situational analysis maps and temporal situational analysis maps. For the purpose of researching service innovation processes, the three latter formats are argued to be particularly interesting. Process maps can give an overview of different periods and milestones in a process in one carefully organized location. Situational analysis maps and temporal situational analysis maps can open up complexities of service innovation processes, as well as close them down for presentational purposes. The mapping formats presented are illustrated by displaying maps from an exemplary research project, and the chapter is concluded with a brief discussion of the limitations and pitfalls...

  19. Reliability Omnipotent Analysis For First Stage Separator On The Separation Process Of Gas, Oil And Water

    International Nuclear Information System (INIS)

    Sony Tjahyani, D. T.; Ismu W, Puradwi; Asmara Santa, Sigit

    2001-01-01

    The reliability of an industrial plant can be evaluated from two aspects, risk and economics, from which an optimum value can be determined. The risks of the oil refinery process are fire and explosion, so an assessment of this system must be done. One system of the oil refinery process is the first stage separator, which is used to separate gas, oil and water. An evaluation of the reliability of the first stage separator system has been done with the FAMECA and HAZOP methods. From the analysis results, the probabilities of fire and explosion are 1.1x10-23/hour and 1.2x10-11/hour, respectively. The reliability value of the system is high because each undesired event is anticipated by a safety system or safety component

  20. An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.

    Science.gov (United States)

    Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes

    2017-10-01

    This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.
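
    A simplified sketch of the basic-CREAM step that such approaches build on: the nine common performance conditions (CPCs) are rated, their balance selects a control mode, and the mode maps to a human-error-probability interval. The interval values follow the standard basic-CREAM table, but the CPC ratings and the control-mode boundaries below are crude illustrative stand-ins, and the paper's evidential-reasoning weighting is not reproduced.

        import math

        # Basic-CREAM control modes and their human-error-probability intervals (Hollnagel).
        HEP_INTERVALS = {
            "strategic":     (0.5e-5, 1e-2),
            "tactical":      (1e-3,   1e-1),
            "opportunistic": (1e-2,   0.5),
            "scrambled":     (1e-1,   1.0),
        }

        def nominal_hep(mode):
            # Geometric midpoint of the interval, a common point estimate.
            lo, hi = HEP_INTERVALS[mode]
            return math.sqrt(lo * hi)

        # Hypothetical assessment of the nine CPCs for one phase of an accident sequence.
        cpc_effects = ["reducing", "insignificant", "reducing", "improving", "insignificant",
                       "reducing", "insignificant", "insignificant", "reducing"]
        balance = cpc_effects.count("improving") - cpc_effects.count("reducing")

        # Crude stand-in for Hollnagel's control-mode diagram (illustration only).
        mode = ("scrambled" if balance <= -6 else
                "opportunistic" if balance <= -3 else
                "tactical" if balance < 4 else "strategic")
        print(f"control mode: {mode},  nominal HEP ~ {nominal_hep(mode):.2e}")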

  1. Software reliability evaluation of digital plant protection system development process using V and V

    International Nuclear Information System (INIS)

    Lee, Na Young; Hwang, Il Soon; Seong, Seung Hwan; Oh, Seung Rok

    2001-01-01

    In the nuclear power industry, digital technology has recently been introduced for the Instrumentation and Control (I and C) of reactor systems. For its application to safety-critical systems such as the Reactor Protection System (RPS), a reliability assessment is indispensable. Unlike traditional reliability models, software reliability is hard to evaluate and should be evaluated throughout the development lifecycle. In the development process of the Digital Plant Protection System (DPPS), the concept of verification and validation (V and V) was introduced to assure the quality of the product. Testing should also be performed to assure reliability. The verification procedure with model checking is relatively well defined; however, testing is labor intensive and not well organized. In this paper, we developed a methodological process combining verification with validation test case generation. For this, we used PVS for the table specification and for the theorem proving. As a result, we could not only save time in designing test cases but also obtain a more effective and complete verification-related test case set. In addition, we could extract some meaningful factors useful for reliability evaluation both from the V and V and from the combined verification tests

  2. Managing the uncertainty aspect of reliability in an iterative product development process

    NARCIS (Netherlands)

    Ganesh, N.

    2009-01-01

    This study identifies the design criteria for a method that can be used to manage the risk and uncertainty aspects of product reliability of Really New Innovations (RNI) in an Iterative Product Development Process (IPDP). It is based on 7 years of longitudinal research exploring more than 10

  3. Reliable and Efficient Parallel Processing Algorithms and Architectures for Modern Signal Processing. Ph.D. Thesis

    Science.gov (United States)

    Liu, Kuojuey Ray

    1990-01-01

    Least-squares (LS) estimations and spectral decomposition algorithms constitute the heart of modern signal processing and communication problems. Implementations of recursive LS and spectral decomposition algorithms onto parallel processing architectures such as systolic arrays with efficient fault-tolerant schemes are the major concerns of this dissertation. There are four major results in this dissertation. First, we propose the systolic block Householder transformation with application to the recursive least-squares minimization. It is successfully implemented on a systolic array with a two-level pipelined implementation at the vector level as well as at the word level. Second, a real-time algorithm-based concurrent error detection scheme based on the residual method is proposed for the QRD RLS systolic array. The fault diagnosis, order degraded reconfiguration, and performance analysis are also considered. Third, the dynamic range, stability, error detection capability under finite-precision implementation, order degraded performance, and residual estimation under faulty situations for the QRD RLS systolic array are studied in detail. Finally, we propose the use of multi-phase systolic algorithms for spectral decomposition based on the QR algorithm. Two systolic architectures, one based on triangular array and another based on rectangular array, are presented for the multiphase operations with fault-tolerant considerations. Eigenvectors and singular vectors can be easily obtained by using the multi-phase operations. Performance issues are also considered.
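
    A compact numerical sketch (plain NumPy, not a systolic or fault-tolerant implementation) of the QR-decomposition-based recursive least-squares idea: each new data row is absorbed by re-triangularizing a small stacked matrix, after which the weights follow from back-substitution; the forgetting factor, dimensions and test signal below are arbitrary.

        import numpy as np

        def qrd_rls_update(R, z, x, d, lam=0.99):
            # One QRD-RLS step: fold the new regressor row x and desired sample d into
            # the triangular factor R and rotated vector z, with forgetting factor lam.
            p = R.shape[0]
            s = np.sqrt(lam)
            stacked = np.vstack([np.hstack([s * R, s * z[:, None]]),
                                 np.hstack([x[None, :], [[d]]])])
            r_full = np.linalg.qr(stacked, mode='r')          # re-triangularize (p+1) x (p+1)
            return r_full[:p, :p], r_full[:p, p]

        rng = np.random.default_rng(0)
        p = 3
        w_true = np.array([0.5, -1.0, 2.0])
        R, z = 1e-3 * np.eye(p), np.zeros(p)                  # small regularizing initialization
        for _ in range(200):
            x = rng.normal(size=p)
            d = x @ w_true + 0.01 * rng.normal()
            R, z = qrd_rls_update(R, z, x, d)
        w_hat = np.linalg.solve(R, z)                         # back-substitution on R w = z
        print("estimated weights:", np.round(w_hat, 3))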

  4. Presentation

    Directory of Open Access Journals (Sweden)

    Eduardo Vicente

    2013-06-01

    Full Text Available In the present edition of Significação – Scientific Journal for Audiovisual Culture, and in the ones to follow, something new is brought in: thematic dossiers organized by invited scholars. The subject appointed for the very first of them was radio, and the invited scholar was Eduardo Vicente, professor in the Graduate Course in Audiovisual and in the Postgraduate Program in Audiovisual Media and Processes of the School of Communication and Arts of the University of São Paulo (ECA-USP). Entitled Radio Beyond Borders, the dossier gathers six articles, with the intention of bringing together works on the perspectives of use of this medium as well as on the new possibilities of aesthetic experimentation being built up for it, especially considering the new digital technologies and technological convergences. It also intends to present works with original theoretical approaches and original reflections able to reset the way we look at what is today already a centennial medium. Having broadened the meaning of “beyond borders”, four foreign authors were invited to join the dossier. This is the first time they are being published in this country, and so, in all cases, the articles were either written in or translated into Portuguese. The dossier begins with “Radio is dead… Long live to the sound”, the transcription of a thought-provoking lecture given by Armand Balsebre (Autonomous University of Barcelona), one of the most influential authors in the world in the field of radio studies. It addresses the challenges the medium has to face so that it can become “a new sound media, in the context of a new soundscape or sound-sphere, for the new listeners”. Andrew Dubber (Birmingham City University), regarding the challenges posed by the digital era, argues for a theoretical approach in radio studies which can consider a media ecology. The author understands the form and discourse of radio as a negotiation of affordances and

  5. Presentations

    International Nuclear Information System (INIS)

    2007-01-01

    The presented materials consist of the presentations from an international workshop held in Warsaw from 4 to 5 October 2007. The main subject of the meeting was progress in manufacturing, as well as the development of the research programme, for a neutron detector which is planned to be installed at the GANIL laboratory and used in nuclear spectroscopy research

  6. The Process of Poster Presentation: A Valuable Learning Experience.

    Science.gov (United States)

    Bracher, Lee; Cantrell, Jane; Wilkie, Kay

    1998-01-01

    Describes the formative use of poster presentations in a nursing-education program. Discusses the use of poster presentation as a successful assessment strategy and a motivating experience for students and teachers. (Author/WRM)

  7. An approach for the condensed presentation of intuitive citation impact metrics which remain reliable with very few publications

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, D.; Tippett, Ch.; Côté, G.; Roberge, G.; Archambault, E.

    2016-07-01

    An approach is presented for presenting citation data in a condensed and intuitive manner that allows for reliable interpretation by policy analysts even in cases where the number of peer-reviewed publications produced by a given entity remains small. The approach is described using country-level data in Agronomy & Agriculture (2004–2013), an area of specialisation for many developing countries with a small output size. Four citation impact metrics, and a synthesis graph that we call the distributional micro-charts of relative citation counts, are considered in building our “preferred” presentation layout. These metrics include two indicators that have long been used by Science-Metrix in its bibliometric reports, the Average of Relative Citations (ARC) and the percentage of publications in the 10% most cited publications in the database (HCP), as well as two newer metrics, the Median of Relative Citations (MRC) and the Relative Integration Score (RIS). The findings reveal that the proposed approach, combining the MRC and HCP with the distributional micro-charts, allows analysts to better qualify the citation impact of entities in terms of central location, density of the upper citation tail and overall distribution than Science-Metrix's former approach based on the ARC and HCP. This is especially true of cases with small population sizes, where a strong presence of outliers (denoted by strong HCP scores) can have a significant effect on the central location of the citation data when estimated with an average. (Author)
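
    As a minimal sketch (not the authors' code), the snippet below computes the ARC, MRC and HCP from a list of field-normalised (relative) citation scores; the sample scores and the world 90th-percentile threshold are invented, and the RIS is omitted because its definition is specific to the paper.

      import numpy as np

      def citation_metrics(relative_citations, world_p90):
          """ARC, MRC and HCP from field-normalised citation scores.
          world_p90 is the database-wide 90th percentile of relative citations."""
          rc = np.asarray(relative_citations, dtype=float)
          arc = rc.mean()                          # Average of Relative Citations
          mrc = np.median(rc)                      # Median of Relative Citations
          hcp = np.mean(rc >= world_p90) * 100.0   # % of papers among the 10% most cited
          return arc, mrc, hcp

      # Invented scores for a small country-level publication set.
      print(citation_metrics([0.2, 0.8, 1.1, 5.4, 0.0, 0.9], world_p90=2.5))

    The contrast between the ARC and the MRC on such a small, outlier-heavy sample is exactly the effect the paper's preferred layout is designed to expose.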

  8. Past, Present and Future of the Innovation Process

    Directory of Open Access Journals (Sweden)

    Ondřej Žižlavský

    2013-09-01

    management control of innovation performance under the postdoc research project “Innovation Process Performance Assessment: a Management Control System Approach in the Czech Small and Medium-sized Enterprises” No. 13-20123P of the Czech Science Foundation.

  9. How to use an optimization-based method capable of balancing safety, reliability, and weight in an aircraft design process

    Energy Technology Data Exchange (ETDEWEB)

    Johansson, Cristina [Mendeley, Broderna Ugglasgatan, Linkoping (Sweden); Derelov, Micael; Olvander, Johan [Linkoping University, IEI, Dept. of Machine Design, Linkoping (Sweden)

    2017-03-15

    In order to help decision-makers in the early design phase to improve and make more cost-efficient system safety and reliability baselines of aircraft design concepts, a method (Multi-objective Optimization for Safety and Reliability Trade-off) that is able to handle trade-offs such as system safety, system reliability, and other characteristics, for instance weight and cost, is used. Multi-objective Optimization for Safety and Reliability Trade-off has been developed and implemented at SAAB Aeronautics. The aim of this paper is to demonstrate how the implemented method might work to aid the selection of optimal design alternatives. The method is a three-step method: step 1 involves the modelling of each considered target, step 2 is optimization, and step 3 is the visualization and selection of results (results processing). The analysis is performed within Architecture Design and Preliminary Design steps, according to the company's Product Development Process. The lessons learned regarding the use of the implemented trade-off method in the three cases are presented. The results are a handful of solutions, a basis to aid in the selection of a design alternative. While the implementation of the trade-off method is performed for companies, there is nothing to prevent adapting this method, with minimal modifications, for use in other industrial applications.
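
    Purely as an illustration of the kind of trade-off screening carried out in step 3 (results processing), and not of SAAB's implementation, the sketch below reduces a set of candidate designs scored on failure probability, unreliability and weight to its Pareto-optimal subset; all names and numbers are invented.

      # Each candidate: (name, system failure probability, unreliability, weight [kg])
      candidates = [
          ("A", 1.0e-7, 0.020, 410.0),
          ("B", 5.0e-8, 0.025, 455.0),
          ("C", 2.0e-7, 0.015, 430.0),
          ("D", 2.0e-7, 0.030, 440.0),
      ]

      def dominates(x, y):
          """True if design x is at least as good as y on every objective
          (all objectives are minimised) and strictly better on at least one."""
          return all(a <= b for a, b in zip(x, y)) and any(a < b for a, b in zip(x, y))

      pareto = [c for c in candidates
                if not any(dominates(o[1:], c[1:]) for o in candidates if o is not c)]
      print("Pareto-optimal designs:", [c[0] for c in pareto])

    The surviving designs form the "handful of solutions" handed to the decision-maker; the final choice among them still requires engineering judgement.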

  10. How to use an optimization-based method capable of balancing safety, reliability, and weight in an aircraft design process

    International Nuclear Information System (INIS)

    Johansson, Cristina; Derelov, Micael; Olvander, Johan

    2017-01-01

    In order to help decision-makers in the early design phase to improve and make more cost-efficient system safety and reliability baselines of aircraft design concepts, a method (Multi-objective Optimization for Safety and Reliability Trade-off) that is able to handle trade-offs such as system safety, system reliability, and other characteristics, for instance weight and cost, is used. Multi-objective Optimization for Safety and Reliability Trade-off has been developed and implemented at SAAB Aeronautics. The aim of this paper is to demonstrate how the implemented method might work to aid the selection of optimal design alternatives. The method is a three-step method: step 1 involves the modelling of each considered target, step 2 is optimization, and step 3 is the visualization and selection of results (results processing). The analysis is performed within Architecture Design and Preliminary Design steps, according to the company's Product Development Process. The lessons learned regarding the use of the implemented trade-off method in the three cases are presented. The results are a handful of solutions, a basis to aid in the selection of a design alternative. While the implementation of the trade-off method is performed for companies, there is nothing to prevent adapting this method, with minimal modifications, for use in other industrial applications

  11. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective
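
    For readers unfamiliar with the evidence-theory machinery the method builds on, here is a small sketch of Dempster's rule of combination applied to two basic probability assignments over dependence levels; the frame of discernment and the mass values are invented for illustration and do not reproduce the paper's computations.

      from itertools import product

      def dempster_combine(m1, m2):
          """Combine two basic probability assignments (dicts mapping
          frozenset hypotheses to masses) with Dempster's rule."""
          combined, conflict = {}, 0.0
          for (a, ma), (b, mb) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + ma * mb
              else:
                  conflict += ma * mb
          norm = 1.0 - conflict
          return {h: m / norm for h, m in combined.items()}

      # Two analysts' opinions on the dependence level between successive tasks.
      ZD, LD, MD = frozenset({"zero"}), frozenset({"low"}), frozenset({"moderate"})
      m_analyst1 = {ZD: 0.5, LD: 0.3, ZD | LD | MD: 0.2}
      m_analyst2 = {LD: 0.6, ZD | LD: 0.4}
      print(dempster_combine(m_analyst1, m_analyst2))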

  12. Evidential Analytic Hierarchy Process Dependence Assessment Methodology in Human Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2017-02-01

    Full Text Available In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster–Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  13. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong [School of Computer and Information Science, Southwest University, Chongqing (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville (United States)

    2017-02-15

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  14. Trends in Process Analytical Technology: Present State in Bioprocessing.

    Science.gov (United States)

    Jenzsch, Marco; Bell, Christian; Buziol, Stefan; Kepert, Felix; Wegele, Harald; Hakemeyer, Christian

    2017-08-04

    Process analytical technology (PAT), the regulatory initiative for incorporating quality into pharmaceutical manufacturing, is an area of intense research and interest. If PAT is effectively applied to bioprocesses, it can increase process understanding and control and mitigate the risk from substandard drug products to both manufacturer and patient. To optimize the benefits of PAT, the entire PAT framework must be considered and each element of PAT must be carefully selected, including sensor and analytical technology, data analysis techniques, control strategies and algorithms, and process optimization routines. This chapter discusses the current state of PAT in the biopharmaceutical industry, including several case studies demonstrating the degree of maturity of various PAT tools. Graphical Abstract: Hierarchy of QbD components.

  15. Radiation processing. Present situation of the applications in Europe

    International Nuclear Information System (INIS)

    Laizier, J.

    1977-01-01

    A review is given of radiation processing applications in Europe: sterilization, food irradiation, sewage treatment, cross-linking of polyethylene, polyvinyl chloride, rubbers and other polymers, electron beam drying of coatings on wood, plastics and paper, production of wood-plastic composites, and polymerization of ethylene and vinyl monomers [fr

  16. Physics-based process modeling, reliability prediction, and design guidelines for flip-chip devices

    Science.gov (United States)

    Michaelides, Stylianos

    Flip Chip on Board (FCOB) and Chip-Scale Packages (CSPs) are relatively new technologies that are being increasingly used in the electronic packaging industry. Compared to the more widely used face-up wirebonding and TAB technologies, flip-chips and most CSPs provide the shortest possible leads, lower inductance, higher frequency, better noise control, higher density, greater input/output (I/O), smaller device footprint and lower profile. However, due to the short history and due to the introduction of several new electronic materials, designs, and processing conditions, very limited work has been done to understand the role of material, geometry, and processing parameters on the reliability of flip-chip devices. Also, with the ever-increasing complexity of semiconductor packages and with the continued reduction in time to market, it is too costly to wait until the later stages of design and testing to discover that the reliability is not satisfactory. The objective of the research is to develop integrated process-reliability models that will take into consideration the mechanics of assembly processes to be able to determine the reliability of face-down devices under thermal cycling and long-term temperature dwelling. The models incorporate the time and temperature-dependent constitutive behavior of various materials in the assembly to be able to predict failure modes such as die cracking and solder cracking. In addition, the models account for process-induced defects and macro-micro features of the assembly. Creep-fatigue and continuum-damage mechanics models for the solder interconnects and fracture-mechanics models for the die have been used to determine the reliability of the devices. The results predicted by the models have been successfully validated against experimental data. The validated models have been used to develop qualification and test procedures for implantable medical devices. In addition, the research has helped develop innovative face

  17. Harmonization process and reliability assessment of anthropometric measurements in the elderly EXERNET multi-centre study.

    Directory of Open Access Journals (Sweden)

    Alba Gómez-Cabello

    Full Text Available BACKGROUND: The elderly EXERNET multi-centre study aims to collect normative anthropometric data for old functionally independent adults living in Spain. PURPOSE: To describe the standardization process and reliability of the anthropometric measurements carried out in the pilot study and during the final workshop, examining both intra- and inter-rater errors for measurements. MATERIALS AND METHODS: A total of 98 elderly from five different regions participated in the intra-rater error assessment, and 10 different seniors living in the city of Toledo (Spain) participated in the inter-rater assessment. We examined both intra- and inter-rater errors for heights and circumferences. RESULTS: For height, intra-rater technical errors of measurement (TEMs) were smaller than 0.25 cm. For circumferences and knee height, TEMs were smaller than 1 cm, except for waist circumference in the city of Cáceres. Reliability for heights and circumferences was greater than 98% in all cases. Inter-rater TEMs were 0.61 cm for height, 0.75 cm for knee height and ranged between 2.70 and 3.09 cm for the circumferences measured. Inter-rater reliabilities for anthropometric measurements were always higher than 90%. CONCLUSION: The harmonization process, including the workshop and pilot study, guarantees the quality of the anthropometric measurements in the elderly EXERNET multi-centre study. High reliability and low TEMs may be expected when assessing anthropometry in elderly populations.
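
    A hedged sketch of the standard intra-rater technical error of measurement (TEM) and reliability coefficient formulas commonly used in anthropometric harmonization work of this kind (the exact computations of the EXERNET team are not reproduced here); the duplicate measurements are invented.

      import math

      def intra_rater_tem(first, second):
          """TEM = sqrt(sum(d_i^2) / (2 n)) for duplicate measurements."""
          d2 = [(a - b) ** 2 for a, b in zip(first, second)]
          return math.sqrt(sum(d2) / (2 * len(d2)))

      def reliability(first, second):
          """R = 1 - TEM^2 / SD^2, with SD^2 the inter-subject variance."""
          tem = intra_rater_tem(first, second)
          pooled = first + second
          mean = sum(pooled) / len(pooled)
          var = sum((x - mean) ** 2 for x in pooled) / (len(pooled) - 1)
          return 1.0 - tem ** 2 / var

      # Duplicate height measurements (cm) of five subjects, invented values.
      m1 = [158.2, 162.5, 149.8, 171.0, 155.4]
      m2 = [158.4, 162.3, 150.0, 170.8, 155.5]
      print("TEM:", round(intra_rater_tem(m1, m2), 3), "cm")
      print("Reliability:", round(reliability(m1, m2), 4))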

  18. Reliability of an fMRI Paradigm for Emotional Processing in a Multisite Longitudinal Study

    Science.gov (United States)

    Gee, Dylan G.; McEwen, Sarah C.; Forsyth, Jennifer K.; Haut, Kristen M.; Bearden, Carrie E.; Addington, Jean; Goodyear, Bradley; Cadenhead, Kristin S.; Mirzakhanian, Heline; Cornblatt, Barbara A.; Olvet, Doreen; Mathalon, Daniel H.; McGlashan, Thomas H.; Perkins, Diana O.; Belger, Aysenil; Seidman, Larry J.; Thermenos, Heidi; Tsuang, Ming T.; van Erp, Theo G.M.; Walker, Elaine F.; Hamann, Stephan; Woods, Scott W.; Constable, Todd; Cannon, Tyrone D.

    2015-01-01

    Multisite neuroimaging studies can facilitate the investigation of brain-related changes in many contexts, including patient groups that are relatively rare in the general population. Though multisite studies have characterized the reliability of brain activation during working memory and motor functional magnetic resonance imaging tasks, emotion processing tasks, pertinent to many clinical populations, remain less explored. A traveling participants study was conducted with eight healthy volunteers scanned twice on consecutive days at each of the eight North American Longitudinal Prodrome Study sites. Tests derived from generalizability theory showed excellent reliability in the amygdala (Eρ² = 0.82), inferior frontal gyrus (IFG; Eρ² = 0.83), anterior cingulate cortex (ACC; Eρ² = 0.76), insula (Eρ² = 0.85), and fusiform gyrus (Eρ² = 0.91) for maximum activation and fair to excellent reliability in the amygdala (Eρ² = 0.44), IFG (Eρ² = 0.48), ACC (Eρ² = 0.55), insula (Eρ² = 0.42), and fusiform gyrus (Eρ² = 0.83) for mean activation across sites and test days. For the amygdala, habituation (Eρ² = 0.71) was more stable than mean activation. In a second investigation, data from 111 healthy individuals across sites were aggregated in a voxelwise, quantitative meta-analysis. When compared with a mixed effects model controlling for site, both approaches identified robust activation in regions consistent with expected results based on prior single-site research. Overall, regions central to emotion processing showed strong reliability in the traveling participants study and robust activation in the aggregation study. These results support the reliability of blood oxygen level-dependent signal in emotion processing areas across different sites and scanners and may inform future efforts to increase efficiency and enhance knowledge of rare conditions in the population through multisite neuroimaging paradigms. PMID:25821147

  19. IEEE guide to the collection and presentation of electrical, electronic, and sensing component reliability data for nuclear-power generating stations

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    Guidelines are given for the purpose of establishing standardized methods for collecting and presenting reliability data for quantitative systematic analysis in nuclear power plants. This guide may also be used for reliability analysis in other segments of the power industry. The data considered include failure rates, failure modes and the environmental impact on component behavior

  20. Presentations

    International Nuclear Information System (INIS)

    2007-01-01

    The PARIS meeting was held in Cracow, Poland, from 14 to 15 May 2007. The main subject discussed during this meeting was the status of the international project dedicated to gamma spectroscopy research. The scientific research programme includes investigations of the giant dipole resonance, the probing of hot nuclei produced in heavy-ion reactions, Jacobi shape transitions, isospin mixing and nuclear multifragmentation. The programme requires R and D developments such as new scintillation materials (lanthanum chlorides and bromides) as well as new photodetection sensors (avalanche photodiodes), and these subjects were also discussed. Additionally, results of computer simulations of scintillation detector properties by means of the GEANT4 code are presented

  1. Notes on human factors problems in process plant reliability and safety prediction

    International Nuclear Information System (INIS)

    Rasmussen, J.; Taylor, J.R.

    1976-09-01

    The basis for plant operator reliability evaluation is described. Principles for plant design, necessary to permit reliability evaluation, are outlined. Five approaches to the plant operator reliability problem are described. Case stories, illustrating operator reliability problems, are given. (author)

  2. Modeling Parameters of Reliability of Technological Processes of Hydrocarbon Pipeline Transportation

    Directory of Open Access Journals (Sweden)

    Shalay Viktor

    2016-01-01

    Full Text Available On the basis of methods of system analysis and parametric reliability theory, mathematical modeling of the operation of oil and gas equipment under reliability monitoring was conducted according to dispatching data. To check the goodness of fit of the empirical distributions, an algorithm and mathematical methods of analysis were worked out for on-line use under changing operating conditions. An analysis of the physical cause-and-effect mechanism relating the key factors to the changing parameters of the technical systems of oil and gas facilities is made, and the basic types of distribution of the technical parameters are defined. Evaluation of the adequacy of the assumed distribution type for the analyzed parameters is provided by using the Kolmogorov criterion, as the most universal, accurate and adequate means of verifying the distribution of continuous processes in complex multiple-element technical systems. Calculation methods are provided for supervision by independent bodies for risk assessment and facility safety.
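
    As an illustrative example of the distribution-adequacy check described above (a Kolmogorov goodness-of-fit test), the following sketch fits a Weibull law to a sample of times between failures and tests the fit with SciPy; the data are synthetic and the choice of distribution is an assumption. Note that estimating the parameters from the same sample makes the nominal p-value optimistic.

      import numpy as np
      from scipy import stats

      # Synthetic times between failures (hours); in practice these would come
      # from dispatching data of the pipeline transport equipment.
      rng = np.random.default_rng(42)
      tbf = rng.weibull(1.5, size=60) * 800.0

      # Fit a Weibull distribution (location fixed at zero) and run the
      # Kolmogorov-Smirnov test of the fitted model against the sample.
      shape, loc, scale = stats.weibull_min.fit(tbf, floc=0.0)
      ks_stat, p_value = stats.kstest(tbf, "weibull_min", args=(shape, loc, scale))

      print(f"shape={shape:.2f}, scale={scale:.1f}")
      print(f"KS statistic={ks_stat:.3f}, p-value={p_value:.3f}")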

  3. An enhanced reliability-oriented workforce planning model for process industry using combined fuzzy goal programming and differential evolution approach

    Science.gov (United States)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2018-03-01

    This paper draws on the "human reliability" concept as a structure for gaining insight into maintenance workforce assessment in a process industry. Human reliability hinges on developing the reliability of humans to a threshold that guides the maintenance workforce to execute accurate decisions within the limits of resource and time allocations. This concept offers a worthwhile point of departure for three adjustments to the literature model, in terms of maintenance time, workforce performance and return on workforce investments, which together frame the contribution of this work. The presented structure breaks new ground in maintenance workforce theory and practice from a number of perspectives. First, we have successfully implemented fuzzy goal programming (FGP) and differential evolution (DE) techniques for the solution of an optimisation problem in the maintenance of a process plant for the first time. The results obtained in this work showed better solution quality from the DE algorithm compared with those of the genetic algorithm and the particle swarm optimisation algorithm, thus demonstrating the superiority of the proposed procedure over them. Second, the analytical discourse, which was framed on stochastic theory and focuses on a specific application to a process plant in Nigeria, is a novelty. The work provides more insight into maintenance workforce planning during overhaul rework and overtime maintenance activities in manufacturing systems and demonstrates the capacity to generate substantially helpful information for practice.
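
    A minimal sketch of how a differential evolution solver can search a workforce-sizing trade-off of the general type described above; the objective function, coefficients and bounds are invented stand-ins, not the authors' fuzzy goal programming formulation.

      import numpy as np
      from scipy.optimize import differential_evolution

      def objective(x):
          """x = (crew size, weekly overtime hours). Penalise low workforce
          reliability and high labour cost; coefficients are illustrative."""
          crew, overtime = x
          unreliability = np.exp(-0.15 * crew - 0.01 * overtime)   # proxy term
          cost = 0.002 * (crew * 40 + 1.5 * overtime)              # normalised cost
          return 5.0 * unreliability + cost

      bounds = [(4, 40), (0, 120)]          # crew size, overtime hours per week
      result = differential_evolution(objective, bounds, seed=1, tol=1e-8)
      print("best crew/overtime:", np.round(result.x, 1),
            "objective:", round(result.fun, 4))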

  4. Reliability analysis of nuclear component cooling water system using semi-Markov process model

    International Nuclear Information System (INIS)

    Veeramany, Arun; Pandey, Mahesh D.

    2011-01-01

    Research highlights: → A semi-Markov process (SMP) model is used to evaluate the system failure probability of the nuclear component cooling water (NCCW) system. → SMP is used because it can solve a reliability block diagram with a mixture of redundant repairable and non-repairable components. → The primary objective is to demonstrate that SMP can consider a Weibull failure time distribution for components, while a Markov model cannot. → Result: the variability in component failure time is directly proportional to the NCCW system failure probability. → The result can be utilized as an initiating event probability in probabilistic safety assessment projects. - Abstract: A reliability analysis of the nuclear component cooling water (NCCW) system is carried out. A semi-Markov process model is used in the analysis because it has the potential to solve a reliability block diagram with a mixture of repairable and non-repairable components. With Markov models it is only possible to assume an exponential profile for component failure times. An advantage of the proposed model is the ability to assume a Weibull distribution for the failure times of components. In an attempt to reduce the number of states in the model, it is shown that use of the poly-Weibull distribution arises. The objective of the paper is to determine the system failure probability under these assumptions. Monte Carlo simulation is used to validate the model result. This result can be utilized as an initiating event probability in probabilistic safety assessment projects.
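
    To make the role of the Weibull assumption concrete, here is an illustrative Monte Carlo sketch (not the authors' semi-Markov solution) estimating the mission failure probability of a 2-out-of-3 redundant train block with Weibull failure times; repair is ignored and all parameter values are invented.

      import numpy as np

      rng = np.random.default_rng(7)

      def mission_failure_prob(n_trains=3, k_required=2, shape=1.8,
                               scale=4.0e4, mission=720.0, n_sim=200_000):
          """Probability that fewer than k_required of n_trains survive the
          mission time, with Weibull(shape, scale) failure times in hours."""
          t_fail = scale * rng.weibull(shape, size=(n_sim, n_trains))
          survivors = (t_fail > mission).sum(axis=1)
          return (survivors < k_required).mean()

      print("estimated mission failure probability:", mission_failure_prob())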

  5. THE RELIABILITY AND ACCURACY OF THE TRIPLE MEASUREMENTS OF ANALOG PROCESS VARIABLES

    Directory of Open Access Journals (Sweden)

    V. A. Anishchenko

    2017-01-01

    Full Text Available The increase in the unit capacity of electrical equipment, as well as the growing complexity of technological processes and of the devices that control and manage them in power plants and substations, demonstrates the need to improve the reliability and accuracy of the measurement information characterizing the state of the managed objects. This objective is particularly important for nuclear power plants, where the price of inaccurate measurement of critical process variables is particularly high and an error might lead to irreparable consequences. Improving the reliability and accuracy of measurements, along with improvement of the element base, is provided by methods of operational validation. These methods are based on the use of information redundancy (structural, topological, temporal). In particular, information redundancy can be achieved by the simultaneous measurement of one analog variable by two devices (duplication) or three devices (triplication, i.e., triple redundancy). The problem of operational control of a triple-redundant system for measuring electrical analog variables (currents, voltages, active and reactive power and energy) is considered as a special case of signal processing by ordered sampling on the basis of majority transformation and transformations close to the majority one. The difficulties in monitoring the reliability of measurements are associated with two tasks. First, one needs to justify the degree of truncation of the distributions of random measurement errors and the allowable residuals of the pairwise differences of the measurement results. The second task consists in forming the algorithm for the joint processing of the set of individual measurements determined to be valid. The quality of control is characterized by the reliability (here taken as a synonym of validity) and the accuracy of the measuring system. Taken separately, these indicators might lead to opposite results. A compromise solution is therefore proposed
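
    A simplified sketch of the two tasks described above for a triplicated analog measurement: pairwise residual checks against an allowable threshold to discard a faulty channel, followed by joint processing (here, the median) of the readings judged valid; the threshold and readings are invented.

      from statistics import median

      def validate_and_fuse(readings, max_residual):
          """readings: three simultaneous measurements of one analog variable.
          A channel is kept only if it agrees with at least one other channel
          within max_residual; the fused value is the median of the kept ones."""
          valid = []
          for i, x in enumerate(readings):
              agree = any(abs(x - y) <= max_residual
                          for j, y in enumerate(readings) if j != i)
              if agree:
                  valid.append(x)
          if not valid:
              raise ValueError("no consistent channels - measurement unreliable")
          return median(valid), valid

      # Example: channel 3 has drifted; allowable pairwise residual is 0.5 MW.
      fused, kept = validate_and_fuse([101.2, 101.4, 97.8], max_residual=0.5)
      print("fused value:", fused, "channels kept:", kept)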

  6. Reliability of neural encoding

    DEFF Research Database (Denmark)

    Alstrøm, Preben; Beierholm, Ulrik; Nielsen, Carsten Dahl

    2002-01-01

    The reliability with which a neuron is able to create the same firing pattern when presented with the same stimulus is of critical importance to the understanding of neuronal information processing. We show that reliability is closely related to the process of phaselocking. Experimental results f...

  7. Research of radioecological processes by methods of the theory of reliability

    International Nuclear Information System (INIS)

    Kutlakhmedov, Yu.A.; Salivon, A.G.; Pchelovskaya, S.A.; Rodina, V.V.; Bevza, A.G.; Matveeva, I.V.

    2012-01-01

    The theory and models of ecosystem radiocapacity, used together with the theory and models of reliability, have made it possible to adequately describe the laws of migration and distribution of radionuclides for different types of reservoir and land ecosystems. The theory and models of radiocapacity make it possible to strictly define the critical elements of an ecosystem where temporary or final depositing of radionuclides is to be expected. The approach based on biogenic tracers allows, within the framework of the theory and models of radiocapacity and reliability, the simultaneous estimation of radionuclide migration processes, the determination of dose loads on the biota of ecosystems, and the establishment of the fundamental parameters of the redistribution rates of radionuclides and other pollutants in different types of ecosystems.

  8. LED Lighting System Reliability Modeling and Inference via Random Effects Gamma Process and Copula Function

    Directory of Open Access Journals (Sweden)

    Huibing Hao

    2015-01-01

    Full Text Available Light emitting diode (LED) lamps have attracted increasing interest in the field of lighting systems due to their low energy consumption and long lifetime. For different functions (i.e., illumination and color), a lamp may have two or more performance characteristics. When the multiple performance characteristics are dependent, accurately analyzing the system reliability becomes a challenging problem. In this paper, we assume that the system has two performance characteristics, and that each performance characteristic is governed by a random effects Gamma process, where the random effects capture the unit-to-unit differences. The dependency of the performance characteristics is described by a Frank copula function. Via the copula function, a reliability assessment model is proposed. Considering that the model is complicated and analytically intractable, the Markov chain Monte Carlo (MCMC) method is used to estimate the unknown parameters. A numerical example based on actual LED lamp data is given to demonstrate the usefulness and validity of the proposed model and method.
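
    An illustrative Monte Carlo sketch of the reliability assessment idea: two gamma-process degradation paths whose marginals are linked by a Frank copula, with reliability taken as the probability that both stay below their thresholds. The random effects are omitted for brevity, and all parameter values, thresholds and the copula parameter are invented; this is not the authors' model or their MCMC estimation procedure.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)

      def frank_pair(theta, size):
          """Sample (u, v) from a Frank copula by conditional inversion."""
          u = rng.uniform(size=size)
          w = rng.uniform(size=size)
          v = -np.log1p(w * (np.exp(-theta) - 1.0) /
                        (np.exp(-theta * u) * (1.0 - w) + w)) / theta
          return u, v

      def lamp_reliability(t, n_sim=200_000):
          """P(lumen loss < 30% and colour shift < 8 units at time t, in kh).
          Marginals: gamma processes Ga(shape=a_k*t, scale=b_k); dependence via
          a Frank copula with theta = 4. All parameter values are invented."""
          u, v = frank_pair(4.0, n_sim)
          lumen = stats.gamma.ppf(u, a=0.9 * t, scale=1.2)
          colour = stats.gamma.ppf(v, a=0.5 * t, scale=0.8)
          return np.mean((lumen < 30.0) & (colour < 8.0))

      for t in (5, 10, 20):
          print(f"t = {t} kh, estimated R(t) = {lamp_reliability(t):.3f}")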

  9. Research on reliability measures of the main transformer and GIS equipment manufacturing process

    International Nuclear Information System (INIS)

    Wu Honglong

    2014-01-01

    Based on accidents involving main transformer and GIS equipment and accidents involving high-voltage switchgear, combined with maintenance experience on main transformer and switchgear equipment and with electrical theory, the reliability measures applied to main transformer and GIS equipment during the manufacturing stage are studied and improved. Six successful reliability measures are identified: 1) design properly and check the ability of the transformer to withstand short circuits; 2) choose mature and reliable main transformer HV bushings; 3) choose GIS switch operating mechanisms of high quality and reliability; 4) ensure the insulation margin through piece-by-piece withstand voltage and partial discharge tests of the GIS equipment insulation; 5) take test measures, such as witnessing the polishing of GIS conductors and shells and performing full-wave lightning impulse tests, to find and eliminate defects causing abnormal electric field distribution; 6) consider anti-VFTO design for main transformers connected to 500 kV GIS, and check their ability to meet safe-operation requirements under VFTO. This paper proposes two new measures: 1) the quality standard for the main transformer insulation material is determined not only by a high dielectric strength but, more importantly, by a homogeneous dielectric strength, so insulating materials with a high and uniform dielectric strength should be chosen; 2) during the silver-coating stage of the GIS equipment conductors, QC group activities should be organized to ensure the quality of the plating layer, and DC resistance measurements of the current-carrying lap surfaces should be supervised and witnessed to ensure the quality of the conductor contact surfaces. These measures were verified during the manufacturing of the main transformer and GIS equipment for the Fuqing project, and their effectiveness was proven. (author)

  10. Reliability of the MODS assay decentralisation process in three health regions in Peru

    Science.gov (United States)

    Mendoza, A.; Castillo, E.; Gamarra, N.; Huamán, T.; Perea, M.; Monroi, Y.; Salazar, R.; Coronel, J.; Acurio, M.; Obregón, G.; Roper, M.; Bonilla, C.; Asencios, L.; Moore, D. A. J.

    2011-01-01

    OBJECTIVE To deliver rapid isoniazid (INH) and rifampicin (RMP) drug susceptibility testing (DST) close to the patient, we designed a decentralisation process for the microscopic observation drug susceptibility (MODS) assay in Peru and evaluated its reliability. METHODS After 2 weeks of training, laboratory staff processed ≥120 consecutive sputum samples each in three regional laboratories. Samples were processed in parallel with MODS testing at an expert laboratory. Blinded paired results were independently analysed by the Instituto Nacional de Salud (INS) according to predetermined criteria: concordance for culture, DST against INH and RMP and diagnosis of multidrug-resistant tuberculosis (MDR-TB) ≥ 95%, McNemar's P > 0.05, kappa index (κ) ≥ 0.75 and contamination 1–4%. Sensitivity and specificity for MDR-TB were calculated. RESULTS The accreditation process for Callao (126 samples, 79.4% smear-positive), Lima Sur (n = 130, 84%) and Arequipa (n = 126, 80%) took respectively 94, 97 and 173 days. Pre-determined criteria in all regional laboratories were above expected values. The sensitivity and specificity for detecting MDR-TB in regional laboratories were >95%, except for sensitivity in Lima Sur, which was 91.7%. Contamination was 1.0–2.3%. Mean delay to positive MODS results was 9.9–12.9 days. CONCLUSION Technology transfer of MODS was reliable, effective and fast, enabling the INS to accredit regional laboratories swiftly. PMID:21219684
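
    A sketch of the two agreement statistics named in the predetermined criteria, computed from a paired 2x2 table of regional-versus-expert MODS results; the counts are invented and the exact test variants used by the INS may differ (a continuity-corrected McNemar chi-square is shown here).

      def kappa_and_mcnemar(a, b, c, d):
          """Paired 2x2 table: a = both resistant, b = regional only,
          c = expert only, d = both susceptible. Returns (kappa, McNemar chi2)."""
          n = a + b + c + d
          p_obs = (a + d) / n
          p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
          kappa = (p_obs - p_exp) / (1.0 - p_exp)
          chi2 = (abs(b - c) - 1) ** 2 / (b + c) if (b + c) else 0.0
          return kappa, chi2

      kappa, chi2 = kappa_and_mcnemar(a=18, b=1, c=2, d=105)
      print(f"kappa = {kappa:.3f}, McNemar chi-square = {chi2:.3f}")
      # chi2 < 3.84 corresponds to McNemar's P > 0.05 at the 5% level.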

  11. Reflow Process Parameters Analysis and Reliability Prediction Considering Multiple Characteristic Values

    Directory of Open Access Journals (Sweden)

    Guo Yu

    2016-01-01

    Full Text Available As a major step in surface mount technology, the reflow process is a key factor affecting the quality of the final product. The setting parameters and the characteristic values of the temperature curve show a nonlinear relationship, so the impact of the parameters on the characteristic values is analyzed and a parameter adjustment process based on an orthogonal experiment is proposed in the paper. First, the setting parameters are determined and the orthogonal test is designed according to production conditions. Then each characteristic value of the temperature profile is calculated. Further, the multi-index orthogonal experiment is analyzed to identify the setting parameters with the greater impact on PCBA product quality. Finally, reliability prediction is carried out considering the main influencing parameters, providing a theoretical basis for parameter adjustment and product quality evaluation in the engineering process.

  12. Graph theoretical calculation of systems reliability with semi-Markov processes

    International Nuclear Information System (INIS)

    Widmer, U.

    1984-06-01

    The determination of the state probabilities and related quantities of a system characterized by an SMP (or a homogeneous MP) can be performed by means of graph-theoretical methods. The calculation procedures for semi-Markov processes based on signal flow graphs are reviewed. Some methods from electrotechnics are adapted in order to obtain a representation of the state probabilities by means of trees. From this some formulas are derived for the asymptotic state probabilities and for the mean life-time in reliability considerations. (Auth.)
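
    As a compact numerical illustration of the asymptotic result such graph methods lead to (the limiting probability of a state is proportional to its embedded-chain stationary probability times its mean sojourn time), here is a sketch for a three-state repairable component; the transition matrix and sojourn times are invented.

      import numpy as np

      # Embedded Markov chain of a semi-Markov process with states
      # 0 = operating, 1 = degraded, 2 = under repair.
      P = np.array([[0.0, 0.7, 0.3],
                    [0.2, 0.0, 0.8],
                    [1.0, 0.0, 0.0]])
      mean_sojourn = np.array([500.0, 80.0, 24.0])   # hours spent in each state

      # Stationary distribution pi of the embedded chain: pi = pi P, sum(pi) = 1.
      A = np.vstack([P.T - np.eye(3), np.ones(3)])
      b = np.array([0.0, 0.0, 0.0, 1.0])
      pi, *_ = np.linalg.lstsq(A, b, rcond=None)

      # Asymptotic SMP state probabilities: weight pi by the mean sojourn times.
      p_inf = pi * mean_sojourn / (pi @ mean_sojourn)
      print("limiting state probabilities:", np.round(p_inf, 4))
      print("long-run probability of the operating state:", round(p_inf[0], 4))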

  13. SALP (Sensitivity Analysis by List Processing), a computer assisted technique for binary systems reliability analysis

    International Nuclear Information System (INIS)

    Astolfi, M.; Mancini, G.; Volta, G.; Van Den Muyzenberg, C.L.; Contini, S.; Garribba, S.

    1978-01-01

    A computerized technique is described which allows various complex situations encountered in safety and reliability assessment to be modelled by AND, OR, NOT binary trees. By the use of list processing, numerical and non-numerical types of information are used together. By proper marking of gates and primary events, stand-by systems, common cause failures and multiphase systems can be analyzed. The basic algorithms used in this technique are shown in detail. An application to a stand-by and multiphase system is then illustrated
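
    A toy evaluator for coherent AND/OR trees in the spirit of the tree models handled by SALP (NOT gates, common-cause failures and the multiphase features are omitted); the gate structure and basic-event probabilities are invented, and basic events are assumed independent.

      # A gate is ("AND" | "OR", [children]); a leaf is a basic-event name.
      tree = ("OR", [
          ("AND", ["pump_A_fails", "pump_B_fails"]),
          "control_logic_fails",
      ])
      prob = {"pump_A_fails": 2e-3, "pump_B_fails": 2e-3, "control_logic_fails": 1e-4}

      def top_event_probability(node):
          """Recursive evaluation assuming independent basic events."""
          if isinstance(node, str):
              return prob[node]
          kind, children = node
          p_children = [top_event_probability(c) for c in children]
          result = 1.0
          if kind == "AND":
              for p in p_children:
                  result *= p
              return result
          for p in p_children:          # OR gate: 1 - product of complements
              result *= (1.0 - p)
          return 1.0 - result

      print(f"top event probability: {top_event_probability(tree):.2e}")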

  14. On the application of nonhomogeneous Poisson process to the reliability analysis of service water pumps of nuclear power plants

    International Nuclear Information System (INIS)

    Cruz Saldanha, Pedro Luiz da.

    1995-12-01

    The purpose of this study is to evaluate the nonhomogeneous Poisson process as a model for the rate of occurrence of failures when it is not constant and the times between failures are neither independent nor identically distributed. For this evaluation, a reliability analysis of the service water pumps of a typical nuclear power plant is made using this model, since the pumps are effectively repairable components. Standard statistical techniques, such as maximum likelihood and linear regression, are applied to estimate the parameters of the nonhomogeneous Poisson process model. As a conclusion of the study, the nonhomogeneous Poisson process is adequate to model rates of occurrence of failures that are functions of time, and can be used where aging mechanisms are present in the operation of repairable systems. (author). 72 refs., 45 figs., 21 tabs
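
    For the power-law form of the NHPP intensity, lambda(t) = (beta/theta) * (t/theta)**(beta - 1), the time-truncated maximum likelihood estimates have a closed form; the sketch below applies it to invented pump failure times and is not the thesis' data or code.

      import math

      def power_law_nhpp_mle(failure_times, t_end):
          """Time-truncated MLE for the power-law NHPP (Crow-AMSAA model)."""
          n = len(failure_times)
          beta = n / sum(math.log(t_end / t) for t in failure_times)
          theta = t_end / n ** (1.0 / beta)
          return beta, theta

      # Invented cumulative operating hours at failure, observed up to 8000 h.
      times = [2100.0, 3800.0, 5100.0, 6200.0, 6900.0, 7500.0, 7900.0]
      beta, theta = power_law_nhpp_mle(times, t_end=8000.0)
      rocof_now = (beta / theta) * (8000.0 / theta) ** (beta - 1.0)
      print(f"beta = {beta:.2f} (beta > 1 suggests aging), theta = {theta:.0f} h")
      print(f"current rate of occurrence of failures: {rocof_now:.2e} per hour")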

  15. Human reliability analysis as an evaluation tool of the emergency evacuation process on industrial installation

    International Nuclear Information System (INIS)

    Santos, Isaac J.A.L. dos; Grecco, Claudio H.S.; Mol, Antonio C.A.; Carvalho, Paulo V.R.; Oliveira, Mauro V.; Botelho, Felipe Mury

    2007-01-01

    Human reliability is the probability that a person correctly performs an activity required by the system in a required time period and performs no extraneous activity that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. The concept of human error must not carry a connotation of guilt and punishment; it has to be treated as a natural consequence that emerges from the mismatch between human capacity and system demand. The majority of human errors are a consequence of the work situation and not of a lack of responsibility on the part of the worker. The anticipation and control of potentially adverse impacts of human actions, or of interactions between humans and the system, are integral parts of process safety, in which the factors that influence human performance must be recognized and managed. The aim of this paper is to propose a methodology to evaluate the emergency evacuation process in industrial installations, including SLIM-MAUD, a first-generation HRA method, and using virtual reality and simulation software to build and simulate the chosen emergency scenes. (author)
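
    As a minimal illustration of the SLIM-MAUD quantification step referred to above (not the authors' implementation), the sketch below converts weighted performance-shaping-factor ratings into a success likelihood index and calibrates it to a human error probability with two anchor tasks; all weights, ratings and anchor values are invented.

      import math

      def sli(weights, ratings):
          """Success likelihood index: weighted sum of PSF ratings on a 0-1 scale."""
          return sum(w * r for w, r in zip(weights, ratings))

      def calibrate(sli1, hep1, sli2, hep2):
          """Solve log10(HEP) = a * SLI + b from two anchor tasks of known HEP."""
          a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
          b = math.log10(hep1) - a * sli1
          return a, b

      weights = [0.35, 0.25, 0.20, 0.20]       # e.g. time pressure, training, interface, stress
      anchor_good = ([0.9, 0.8, 0.9, 0.8], 1e-4)   # well-designed, well-trained task
      anchor_poor = ([0.2, 0.3, 0.4, 0.2], 1e-1)   # poor conditions
      a, b = calibrate(sli(weights, anchor_good[0]), anchor_good[1],
                       sli(weights, anchor_poor[0]), anchor_poor[1])

      evacuation_task = [0.5, 0.6, 0.7, 0.4]   # ratings for the task of interest
      hep = 10 ** (a * sli(weights, evacuation_task) + b)
      print(f"estimated HEP for the evacuation task: {hep:.2e}")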

  16. Human reliability analysis as an evaluation tool of the emergency evacuation process on industrial installation

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Isaac J.A.L. dos; Grecco, Claudio H.S.; Mol, Antonio C.A.; Carvalho, Paulo V.R.; Oliveira, Mauro V.; Botelho, Felipe Mury [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mail: luquetti@ien.gov.br; grecco@ien.gov.br; mol@ien.gov.br; paulov@ien.gov.br; mvitor@ien.gov.br; felipemury@superig.com.br

    2007-07-01

    Human reliability is the probability that a person correctly performs an activity required by the system in a required time period and performs no extraneous activity that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. The concept of human error must not carry a connotation of guilt and punishment; it has to be treated as a natural consequence that emerges from the mismatch between human capacity and system demand. The majority of human errors are a consequence of the work situation and not of a lack of responsibility on the part of the worker. The anticipation and control of potentially adverse impacts of human actions, or of interactions between humans and the system, are integral parts of process safety, in which the factors that influence human performance must be recognized and managed. The aim of this paper is to propose a methodology to evaluate the emergency evacuation process in industrial installations, including SLIM-MAUD, a first-generation HRA method, and using virtual reality and simulation software to build and simulate the chosen emergency scenes. (author)

  17. Simple and reliable procedure for the evaluation of short-term dynamic processes in power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D P

    1986-10-01

    An efficient approach to the solution of the short-term dynamics model of power systems is presented. It consists of an adequate algebraic treatment of the original system of nonlinear differential equations, using linearization, decomposition and Cauchy's formula. The simple difference equations obtained in this way are incorporated into a model of the electrical network, which is of low order compared to the ones usually used. Newton's method is applied to the model formed in this way, which leads to a simple and reliable iterative procedure. The characteristics of the procedure developed are demonstrated on examples of transient stability analysis of real power systems. 12 refs.

  18. Reliability evaluation of hard disk drive failures based on counting processes

    International Nuclear Information System (INIS)

    Ye, Zhi-Sheng; Xie, Min; Tang, Loon-Ching

    2013-01-01

    Reliability assessment for hard disk drives (HDDs) is important yet difficult for manufacturers. Motivated by the fact that particle accumulation in HDDs, which accounts for most HDD catastrophic failures, is contributed by both internal and external sources, a counting process with two arrival sources is proposed to model the particle accumulation process in HDDs. This model successfully explains the breakdown of traditional ALT approaches for accelerated life test data. Parameter estimation and hypothesis tests for the model are developed and illustrated with real data from an HDD test. A simulation study is conducted to examine the accuracy of the large-sample normal approximations that are used to test the existence of the internal and external sources.

  19. An application of modulated poisson processes to the reliability analysis of repairable systems

    Energy Technology Data Exchange (ETDEWEB)

    Saldanha, Pedro L.C. [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil). Coordenacao de Reatores]. E-mail: saldanha@cnen.gov.br; Melo, P.F. Frutuoso e [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear]. E-mail: frutuoso@con.ufrj.br; Noriega, Hector C. [Universidad Austral de Chile (UACh), Valdivia (Chile). Faculdad de Ciencias de la Ingeniaria]. E-mail: hnoriega@uach.cl

    2005-07-01

    This paper discusses the application of the modulated power law process (MPLP) model to the rate of occurrence of failures of active repairable systems in reliability engineering. Traditionally, two ways of modeling repairable systems with respect to maintenance policies are a pessimistic approach (the non-homogeneous Poisson process, NHPP) and a very optimistic approach (the renewal process, RP). It is important to build a generalized model that considers the characteristics and properties of both the NHPP and the RP models as particular cases. In practice, considering the pattern of times between failures, the MPLP appears to be more realistic for representing the occurrence of failures of repairable systems and for deciding whether they can be modeled by a homogeneous or a non-homogeneous process. The study has shown that the model can be used to make decisions concerning the evaluation of the qualified life of plant equipment. By controlling and monitoring two of the three parameters of the MPLP model during equipment operation, it is possible to check whether and how the equipment is following the basis of its qualification process, and so identify how the effects of time, degradation and operation modes are influencing equipment performance. The discussion is illustrated by an application to the service water pumps of a typical PWR plant. (author)

  20. Waste container weighing data processing to create reliable information of household waste generation.

    Science.gov (United States)

    Korhonen, Pirjo; Kaila, Juha

    2015-05-01

    Household mixed waste container weighing data were processed by knowledge discovery and data mining techniques to create reliable information on household waste generation. The final data set included 27,865 weight measurements covering the whole year 2013 and was selected from a database of the Helsinki Region Environmental Services Authority, Finland. The data set contains mixed household waste arising in 6 m³ containers, and it was processed by identifying missing values and inconsistently low and high values as errors. The share of missing values and errors in the data set was 0.6%. This provides evidence that the waste weighing data give reliable information on mixed waste generation at the collection point level. Characteristic of mixed household waste arising at the waste collection point level is a wide variation between pickups. The seasonal variation pattern, resulting from collective similarities in the behaviour of households, was clearly detected by smoothed medians of the waste weight time series. The evaluation of the collection time series against the defined distribution range of pickup weights at the waste collection point level shows that 65% of the pickups were from collection points with optimally dimensioned container capacity, while collection points with over- and under-dimensioned container capacities accounted for 9.5% and 3.4% of all pickups, respectively. Occasional extra waste in containers occurred in 21.2% of the pickups, indicating the irregular behaviour of individual households. The results of this analysis show that processing waste weighing data using knowledge discovery and data mining techniques provides trustworthy information on household waste generation and its variations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Reliability and maintenance in European nuclear power plants: A structural analysis of a controlled stochastic process

    International Nuclear Information System (INIS)

    Sturm, R.

    1991-01-01

    Two aspects of performance are of main concern: plant availability and plant reliability (defined as the conditional probability of an unplanned shutdown). The goal of the research is a unified framework that combines behavioral models of optimizing agents with models of complex technical systems that take into account the dynamic and stochastic features of the system. In order to achieve this synthesis, two lines of work are necessary. One line requires a deeper understanding of complex production systems and the type of data they give rise to; the other line involves the specification and estimation of a rigorously specified behavioral model. Plant operations are modeled as a controlled stochastic process, and the sequence of up- and downtime spells is analyzed using failure time and point process models. Similar to work on rational expectations and structural econometric models, the behavioral model of how the plant process is controlled is formulated at the level of basic processes, i.e., the objective function of the plant manager, technical constraints, and stochastic disturbances

  2. Assessment of Sensory Processing and Executive Functions in Childhood: Development, Reliability, and Validity of the EPYFEI

    Directory of Open Access Journals (Sweden)

    Dulce Romero-Ayuso

    2018-03-01

    Full Text Available The aim of this study was to determine the psychometric properties of the “Assessment of Sensory Processing and Executive Functions in Childhood” (EPYFEI), a questionnaire designed to assess the sensory processing and executive functions of children aged between 3 and 11 years. The EPYFEI was completed by a sample of 1,732 parents of children aged between 3 and 11 years who lived in Spain. An exploratory factor analysis was conducted and showed five main factors: (1) executive attention, working memory, and initiation of actions; (2) general sensory processing; (3) emotional and behavioral self-regulation; (4) supervision, correction of actions, and problem solving; and (5) inhibitory. The reliability of the analysis was high, both for the whole questionnaire and for the factors it is composed of. Results provide evidence of the potential usefulness of the EPYFEI in clinical contexts for the early detection of neurodevelopmental disorders in which there may be a deficit of executive functions and sensory processing.

  3. Statistical test data selection for reliability evaluation of process computer software

    International Nuclear Information System (INIS)

    Volkmann, K.P.; Hoermann, H.; Ehrenberger, W.

    1976-01-01

    The paper presents a concept for converting knowledge about the characteristics of process states into practicable procedures for the statistical selection of test cases in testing process computer software. Process states are defined as vectors whose components consist of values of input variables lying in discrete positions or within given limits. Two approaches for test data selection, based on knowledge about cases of demand, are outlined referring to a purely probabilistic method and to the mathematics of stratified sampling. (orig.) [de
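
    A toy sketch of the stratified-sampling idea: process states (vectors of input variables) are grouped into demand strata with assumed occurrence probabilities, and test cases are drawn from each stratum in proportion to it; the strata, probabilities and variable ranges are invented.

      import random

      random.seed(2)

      # Strata of demand cases with assumed occurrence probabilities and
      # ranges (discrete positions or limits) for two input variables.
      strata = {
          "normal operation": (0.90, {"pressure": (60.0, 75.0),  "valve": [0]}),
          "minor transient":  (0.09, {"pressure": (75.0, 90.0),  "valve": [0, 1]}),
          "demand condition": (0.01, {"pressure": (90.0, 110.0), "valve": [1]}),
      }

      def draw_test_cases(n_total):
          cases = []
          for name, (p, ranges) in strata.items():
              n = max(1, round(p * n_total))            # proportional allocation
              for _ in range(n):
                  lo, hi = ranges["pressure"]
                  cases.append({"stratum": name,
                                "pressure": round(random.uniform(lo, hi), 1),
                                "valve": random.choice(ranges["valve"])})
          return cases

      for case in draw_test_cases(20)[:5]:
          print(case)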

  4. Integration of human reliability analysis into the probabilistic risk assessment process: phase 1

    International Nuclear Information System (INIS)

    Bell, B.J.; Vickroy, S.C.

    1985-01-01

    The US Nuclear Regulatory Commission and Pacific Northwest Laboratory initiated a research program in 1984 to develop a testable set of analytical procedures for integrating human reliability analysis (HRA) into the probabilistic risk assessment (PRA) process to more adequately assess the overall impact of human performance on risk. In this three phase program, stand-alone HRA/PRA analytic procedures will be developed and field evaluated to provide improved methods, techniques, and models for applying quantitative and qualitative human error data which systematically integrate HRA principles, techniques, and analyses throughout the entire PRA process. Phase 1 of the program involved analysis of state-of-the-art PRAs to define the structures and processes currently in use in the industry. Phase 2 research will involve developing a new or revised PRA methodology which will enable more efficient regulation of the industry using quantitative or qualitative results of the PRA. Finally, Phase 3 will be to field test those procedures to assure that the results generated by the new methodologies will be usable and acceptable to the NRC. This paper briefly describes the first phase of the program and outlines the second

  5. Integration of human reliability analysis into the probabilistic risk assessment process: Phase 1

    International Nuclear Information System (INIS)

    Bell, B.J.; Vickroy, S.C.

    1984-10-01

    A research program was initiated to develop a testable set of analytical procedures for integrating human reliability analysis (HRA) into the probabilistic risk assessment (PRA) process to more adequately assess the overall impact of human performance on risk. In this three-phase program, stand-alone HRA/PRA analytic procedures will be developed and field evaluated to provide improved methods, techniques, and models for applying quantitative and qualitative human error data which systematically integrate HRA principles, techniques, and analyses throughout the entire PRA process. Phase 1 of the program involved analysis of state-of-the-art PRAs to define the structures and processes currently in use in the industry. Phase 2 research will involve developing a new or revised PRA methodology which will enable more efficient regulation of the industry using quantitative or qualitative results of the PRA. Finally, Phase 3 will be to field test those procedures to assure that the results generated by the new methodologies will be usable and acceptable to the NRC. This paper briefly describes the first phase of the program and outlines the second

  6. The solution of the reliability problem in the repair process of the plates of the silica bricks press boxes

    Directory of Open Access Journals (Sweden)

    Nochvai V.М.

    2017-05-01

    Full Text Available The research analyzes the recommendations existing in different information sources for the choice of methods of strengthening and reconditioning worn machine parts. These methods include: electric arc deposition, chemical-thermal treatment, gas-powder deposition, gas-powder and plasma spraying, and electric arc metallization. As a result of studies of the wear of the working surfaces of the plates of silicate brick press boxes, it was found that the plates wear out unevenly and the thickness of the worn layer varies between 0.3 and 2 mm. A technological method is chosen as the means of enhancing and maintaining plate reliability. One of the main technological stages of reliability formation is the strengthening of machine parts using strengthening technologies, namely electric arc metallization. Wires of the types Нп-65Г, ФМИ-2 and Нп-40Х13 are used to develop wear-resistant coatings with the desired properties. The technological process of plate repair consists of the following basic operations: plate preparation, wire preparation, plate coating, plate grinding and final checking. Single and complex reliability indicators are determined by testing a set of plates and registering all the indicators (operating time, failures, faults). The value of the economic reliability index of the plate Kе equals 0.10. Higher plate reliability is achieved at the expense of extra cost for plate strengthening using the Нп-40Х13 wire, and the price Bн of plate reliability is 104.83 UAH. Complex reliability indicators of the reconditioned plate of the silica brick press boxes are used for a more complete reliability assessment. The availability coefficient Kг equals 0.995 and characterizes two different properties simultaneously: reliability and maintainability. The coefficient of technical use Kт.в. equals 0.974 and most fully characterizes the reliability of the plates because it considers time in the process of maintenance, repair and

  7. A study on hybrid split-spectrum processing technique for enhanced reliability in ultrasonic signal analysis

    International Nuclear Information System (INIS)

    Huh, Hyung; Koo, Kil Mo; Cheong, Yong Moo; Kim, G. J.

    1995-01-01

    Many signal-processing techniques have been found useful in ultrasonic nondestructive evaluation. Among the most popular are signal averaging, spatial compounding, matched filters, and homomorphic processing. One of the significant newer processes is split-spectrum processing (SSP), which can be equally useful for signal-to-noise ratio (SNR) improvement and grain characterization in several engineering materials. The purpose of this paper is to explore the utility of SSP in ultrasonic NDE. A wide variety of engineering problems are reviewed and suggestions for implementation of the technique are provided. SSP exploits the frequency-dependent response of the interfering coherent noise produced by unresolvable scatterers in the resolution range cell of a transducer. It is implemented by splitting the frequency spectrum of the received signal using Gaussian bandpass filters. The theoretical basis for the potential of SSP for grain characterization in SUS 304 material is discussed, and some experimental evidence for the feasibility of the approach is presented, including results of SNR enhancement in signals obtained from four real samples of SUS 304. The influence of various processing parameters on the performance of the technique is also discussed. The minimization algorithm, which provides excellent SNR enhancement when used either in conjunction with other SSP algorithms such as polarity check or by itself, is also presented.
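
    The following is a minimal sketch of the split-spectrum idea named above, under illustrative assumptions: a synthetic A-scan, hypothetical band centres, bandwidth and sampling rate (not the paper's settings). The spectrum is split with Gaussian bandpass filters and the sub-band signals are recombined with the minimization rule.

```python
# Split-spectrum processing (SSP) sketch with minimization recombination.
# Band centres, bandwidth and the test signal are illustrative assumptions.
import numpy as np

def split_spectrum_minimization(signal, fs, f_lo=2e6, f_hi=8e6,
                                n_bands=8, rel_bandwidth=0.15):
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectrum = np.fft.rfft(signal)

    centres = np.linspace(f_lo, f_hi, n_bands)
    sigma = rel_bandwidth * (f_hi - f_lo)            # Gaussian filter width

    band_signals = []
    for fc in centres:
        gauss = np.exp(-0.5 * ((freqs - fc) / sigma) ** 2)   # Gaussian bandpass
        band = np.fft.irfft(spectrum * gauss, n)
        band /= (np.abs(band).max() + 1e-12)                 # normalise each band
        band_signals.append(band)

    bands = np.vstack(band_signals)
    # Minimization: keep, at every sample, the smallest magnitude across bands.
    # Coherent target echoes persist across bands, grain noise does not.
    return np.min(np.abs(bands), axis=0)

if __name__ == "__main__":
    fs = 100e6
    t = np.arange(2048) / fs
    rng = np.random.default_rng(0)
    echo = np.exp(-((t - 5e-6) * 4e6) ** 2) * np.sin(2 * np.pi * 5e6 * t)
    noisy = echo + 0.5 * rng.standard_normal(t.size)
    out = split_spectrum_minimization(noisy, fs)
```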

  8. A Study on Hybrid Split-Spectrum Processing Technique for Enhanced Reliability in Ultrasonic Signal Analysis

    International Nuclear Information System (INIS)

    Huh, H.; Koo, K. M.; Kim, G. J.

    1996-01-01

    Many signal-processing techniques have been found useful in ultrasonic nondestructive evaluation. Among the most popular are signal averaging, spatial compounding, matched filters and homomorphic processing. One of the significant newer processes is split-spectrum processing (SSP), which can be equally useful for signal-to-noise ratio (SNR) improvement and grain characterization in several specimens. The purpose of this paper is to explore the utility of SSP in ultrasonic NDE. A wide variety of engineering problems are reviewed, and suggestions for implementation of the technique are provided. SSP exploits the frequency-dependent response of the interfering coherent noise produced by unresolvable scatterers in the resolution range cell of a transducer. It is implemented by splitting the frequency spectrum of the received signal using Gaussian bandpass filters. The theoretical basis for the potential of SSP for grain characterization in SUS 304 material is discussed, and some experimental evidence for the feasibility of the approach is presented, including results of SNR enhancement in signals obtained from four real samples of SUS 304. The influence of various processing parameters on the performance of the technique is also discussed. The minimization algorithm, which provides excellent SNR enhancement when used either in conjunction with other SSP algorithms such as polarity check or by itself, is also presented.

  9. Lucky Belief in Science Education - Gettier Cases and the Value of Reliable Belief-Forming Processes

    Science.gov (United States)

    Brock, Richard

    2018-05-01

    The conceptualisation of knowledge as justified true belief has been shown to be, at the very least, an incomplete account. One challenge to the justified true belief model arises from the proposition of situations in which a person possesses a belief that is both justified and true which some philosophers intuit should not be classified as knowledge. Though situations of this type have been imagined by a number of writers, they have come to be labelled Gettier cases. Gettier cases arise when a fallible justification happens to lead to a true belief in one context, a case of `lucky belief'. In this article, it is argued that students studying science may make claims that resemble Gettier cases. In some contexts, a student may make a claim that is both justified and true but which arises from an alternative conception of a scientific concept. A number of instances of lucky belief in topics in science education are considered leading to an examination of the criteria teachers use to assess students' claims in different contexts. The possibility of lucky belief leads to the proposal that, in addition to the acquisition of justified true beliefs, the development of reliable belief-forming processes is a significant goal of science education. The pedagogic value of various kinds of claims is considered and, it is argued, the criteria used to judge claims may be adjusted to suit the context of assessment. It is suggested that teachers should be alert to instances of lucky belief that mask alternative conceptions.

  10. Characterizing reliability in a product/process design-assurance program

    Energy Technology Data Exchange (ETDEWEB)

    Kerscher, W.J. III [Delphi Energy and Engine Management Systems, Flint, MI (United States); Booker, J.M.; Bement, T.R.; Meyer, M.A. [Los Alamos National Lab., NM (United States)

    1997-10-01

    Over the years many advancing techniques in the area of reliability engineering have surfaced in the military sphere of influence, and one of these techniques is Reliability Growth Testing (RGT). Private industry has reviewed RGT as part of the solution to its reliability concerns, but many practical considerations have slowed its implementation. Its objective is to demonstrate the reliability requirement of a new product with a specified confidence. This paper speaks directly to that objective but discusses a somewhat different approach to achieving it. Rather than conducting testing as a continuum and developing statistical confidence bands around the results, this Bayesian updating approach starts with a reliability estimate characterized by large uncertainty and then proceeds to reduce the uncertainty by folding in fresh information in a Bayesian framework.
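
    A minimal sketch of the Bayesian-updating idea, assuming a conjugate Beta prior on the product's success probability and illustrative pass/fail test data (not figures from the paper): the diffuse prior is narrowed as each test campaign is folded in.

```python
# Conjugate Beta-Binomial updating of a reliability estimate.
# Prior parameters and test counts are illustrative assumptions.
from scipy import stats

def update_reliability(prior_a, prior_b, successes, failures):
    """Fold pass/fail test results into a Beta(a, b) belief about reliability."""
    return prior_a + successes, prior_b + failures

a, b = 1.0, 1.0                      # diffuse prior: reliability anywhere in [0, 1]
for n_pass, n_fail in [(20, 1), (35, 0), (48, 2)]:   # successive test campaigns
    a, b = update_reliability(a, b, n_pass, n_fail)
    lo, hi = stats.beta.ppf([0.05, 0.95], a, b)
    print(f"posterior mean R = {a / (a + b):.3f}, 90% interval = ({lo:.3f}, {hi:.3f})")
```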

  11. A comparative study of the probabilistic fracture mechanics and the stochastic Markovian process approaches for structural reliability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Stavrakakis, G.; Lucia, A.C.; Solomos, G. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1990-01-01

    The two computer codes COVASTOL and RELIEF, developed for the modeling of cumulative damage processes in the framework of probabilistic structural reliability, are compared. They are based respectively on the randomisation of a differential crack growth law and on the theory of discrete Markov processes. The codes are applied to fatigue crack growth predictions using two sets of crack propagation data from specimens. The results are critically analyzed and an extensive discussion follows on the merits and limitations of each code. Their transferability to the reliability assessment of real structures is investigated. (author).

  12. The ALMA high speed optical communication link is here: an essential component for reliable present and future operations

    Science.gov (United States)

    Filippi, G.; Ibsen, J.; Jaque, S.; Liello, F.; Ovando, N.; Astudillo, A.; Parra, J.; Saldias, Christian

    2016-07-01

    Announced in 2012, started in 2013 and completed in 2015, the ALMA high bandwidth communication system has become a key factor in achieving the operational and scientific goals of ALMA. This paper summarizes the technical, organizational, and operational goals of the ALMA Optical Link Project, focused on the creation and operation of an effective and sustainable communication infrastructure to connect the ALMA Operations Support Facility and Array Operations Site, both located in the Atacama Desert in the northern region of Chile, with the point of presence of REUNA in Antofagasta, about 400 km away, and from there to the Santiago Central Office in the Chilean capital through the optical infrastructure created by the EC-funded EVALSO project, now an integral part of the REUNA backbone. This new infrastructure, completed in 2014 and now operated on behalf of ALMA by REUNA, the Chilean National Research and Education Network, uses state-of-the-art technologies, such as dark fiber from newly built cables and DWDM transmission, extending the reach of high-capacity communication to the remote region where the Observatory is located. The paper also reports on the results obtained during the first year and a half of testing and operation, during which different operational setups were tried for data transfer, remote collaboration, etc. Finally, the authors present a forward look at its impact on the future scientific development of the Chajnantor Plateau, where many installations are (and will be) located, as well as on the potential long-term development of the Chilean scientific backbone.

  13. Surveying the impact of satisfaction and e-reliability on customers' loyalty in e-purchase process: a case in Pars Khodro co

    Directory of Open Access Journals (Sweden)

    Vahid Qaemi

    2012-10-01

    Full Text Available Today, customer return in the e-purchase process is considered an important topic in companies' marketing and managerial decision making. In this paper, we present an empirical study on measuring the impact of e-loyalty for an Iranian automotive company, Pars Khodro co. The proposed study measures reliability, responsiveness, design and security/privacy as independent variables, e-confidence and e-satisfaction as mediator variables, and e-loyalty as the dependent variable. The preliminary results show that the effects of e-satisfaction and e-confidence on loyalty, and of e-confidence on e-satisfaction, are high. Reliability/fulfillment and security have significant impacts on e-confidence, and the effects of reliability/fulfillment, responsiveness and website design on e-satisfaction are high. The results indicate no significant relationship between responsiveness and e-confidence.

  14. [The effect of encoding on false memory: examination on levels of processing and list presentation format].

    Science.gov (United States)

    Hamajima, Hideki

    2004-04-01

    Using the Deese/Roediger-McDermott paradigm, the effects of list presentation format (blocked/random) and levels of processing on critical nonpresented lures were examined. A levels-of-processing effect was not observed for lures in the blocked presentation order. Rates of false recognition and remember judgments for lures under shallow processing were significantly lower than under deep processing when items from various themes were intermixed rather than blocked. The results showed an interaction between levels of processing and list presentation format. It is thus concluded that encoding of each word and of the whole list should both be considered in understanding false memory.

  15. Optimized work control process to improve safety and reliability in a risk-based and deregulated environment

    International Nuclear Information System (INIS)

    Anderson, Jon G.; Jeffries, Jeffrey D. E.; Mairs, Todd P.; Rahn, Frank J.

    1999-01-01

    This paper provides an overview of strategic models to assist power generating plants in improving their work control processes. These models include mechanisms to continually keep the process up to date. Included in the work control process are elements for system cost/performance analysis, life-cycle maintenance planning, on-line scheduling and look-ahead techniques, and schedule implementation to conduct work on the asset. The paper also discusses how risk management associated with work control issues that affect safety and reliability, as well as O and M costs, is integrated into this strategy. The work control process is a pervasive and critical element in the successful implementation of operations and work management programs. While providing a method to implement maintenance activities in a cost-effective manner, the work control process improves plant safety and system reliability.

  16. Multi-state reliability for pump group in system based on UGF and semi-Markov process

    International Nuclear Information System (INIS)

    Shang Yanlong; Cai Qi; Zhao Xinwen; Chen Ling

    2012-01-01

    In this paper, the multi-state reliability of a pump group in a nuclear power system is obtained by combining the universal generating function (UGF) and a semi-Markov process. A UGF arithmetic model of multi-state system reliability is studied, and the performance-state probability expression of a multi-state component is derived using semi-Markov theory. A quantitative model is defined to express the performance rate of the system and its components. Availability results from the multi-state and binary-state analysis methods are compared under the condition that the performance rate must satisfy a demanded value, and the mean instantaneous output performance of the system is also obtained. The combination method is shown to be effective and feasible, quantifying the effect of partial failures on system reliability; the multi-state result indicates that the binary-state analysis yields a conservative reliability value. (authors)
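
    A minimal sketch of the UGF composition step described above, with illustrative states, performance rates and probabilities (not data from the paper): two pump u-functions are combined with a structure function, and the multi-state availability is the probability that the combined performance meets a demanded rate.

```python
# Universal generating function (UGF) composition of two multi-state pumps.
# Performance rates (flow in %) and probabilities are illustrative assumptions.
from collections import defaultdict
from itertools import product

def compose_ugf(u1, u2, combine):
    """u1, u2: dicts {performance_rate: probability}; combine: structure function."""
    out = defaultdict(float)
    for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
        out[combine(g1, g2)] += p1 * p2
    return dict(out)

pump_a = {0.0: 0.05, 50.0: 0.15, 100.0: 0.80}
pump_b = {0.0: 0.10, 100.0: 0.90}
group = compose_ugf(pump_a, pump_b, combine=lambda a, b: a + b)  # parallel: flows add

demand = 100.0
availability = sum(p for g, p in group.items() if g >= demand)   # P(performance >= demand)
print(group, availability)
```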

  17. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose of improving the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used, especially in the analysis of very complex systems.

  18. Reliability Prediction Of System And Component Of Process System Of RSG-GAS Reactor

    International Nuclear Information System (INIS)

    Sitorus Pane, Jupiter

    2001-01-01

    The older the reactor, the higher the probability that its systems and components suffer loss of function or degradation. This phenomenon occurs because of wear, corrosion, and fatigue. Studies of component reliability are generally performed deterministically and statistically. This paper describes an analysis using a statistical method, Cox regression, to predict the reliability of the components and the factors of environmental influence. The results show that dynamic, non-safety-related and mechanical components have a higher risk of failure, whereas static, safety-related and electrical components have a lower risk of failure. The relative risk values for the variables component dynamics, quality, dummy 1 and dummy 2 are 1.54, 1.59, 1.50 and 0.83, respectively, compared to other component types for each variable. Components with higher risk have lower reliability than those with lower risk.

  19. Assessing Reliability of Cellulose Hydrolysis Models to Support Biofuel Process Design – Identifiability and Uncertainty Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist

    2010-01-01

    The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done in the ori...

  20. Reliability of fully automated versus visually controlled pre- and post-processing of resting-state EEG.

    Science.gov (United States)

    Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P

    2015-02-01

    To compare the reliability of a newly developed Matlab® toolbox for the fully automated pre- and post-processing of resting-state EEG (automated analysis, AA) with that of analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (age: median 38.2 years, range 20-49; 82% female) underwent three consecutive 256-channel resting-state EEGs at one-year intervals. Results of frequency analysis with AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07; the mean ICC was 0.83±0.05 for AA and 0.84±0.07 for VA. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
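
    A minimal sketch of the kind of test-retest statistic used above, a two-way random-effects, absolute-agreement ICC(2,1), computed on a toy subjects-by-sessions matrix (illustrative data, not the study's recordings):

```python
# ICC(2,1): two-way random effects, absolute agreement, single measurement.
# The toy data (34 subjects x 3 yearly sessions) is an illustrative assumption.
import numpy as np

def icc_2_1(data):
    """data: (n_subjects, k_sessions) array of one EEG measure per recording."""
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)
    col_means = data.mean(axis=0)

    ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)      # between subjects
    ms_cols = n * np.sum((col_means - grand) ** 2) / (k - 1)      # between sessions
    resid = data - row_means[:, None] - col_means[None, :] + grand
    ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))             # residual

    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

rng = np.random.default_rng(1)
true_scores = rng.normal(10, 2, size=34)                           # 34 subjects
sessions = true_scores[:, None] + rng.normal(0, 1, size=(34, 3))   # 3 yearly EEGs
print(f"ICC(2,1) = {icc_2_1(sessions):.2f}")
```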

  1. Validation of DWI pre-processing procedures for reliable differentiation between human brain gliomas.

    Science.gov (United States)

    Vellmer, Sebastian; Tonoyan, Aram S; Suter, Dieter; Pronin, Igor N; Maximov, Ivan I

    2018-02-01

    Diffusion magnetic resonance imaging (dMRI) is a powerful tool in clinical applications, in particular, in oncology screening. dMRI demonstrated its benefit and efficiency in the localisation and detection of different types of human brain tumours. Clinical dMRI data suffer from multiple artefacts such as motion and eddy-current distortions, contamination by noise, outliers etc. In order to increase the image quality of the derived diffusion scalar metrics and the accuracy of the subsequent data analysis, various pre-processing approaches are actively developed and used. In the present work we assess the effect of different pre-processing procedures such as a noise correction, different smoothing algorithms and spatial interpolation of raw diffusion data, with respect to the accuracy of brain glioma differentiation. As a set of sensitive biomarkers of the glioma malignancy grades we chose the derived scalar metrics from diffusion and kurtosis tensor imaging as well as the neurite orientation dispersion and density imaging (NODDI) biophysical model. Our results show that the application of noise correction, anisotropic diffusion filtering, and cubic-order spline interpolation resulted in the highest sensitivity and specificity for glioma malignancy grading. Thus, these pre-processing steps are recommended for the statistical analysis in brain tumour studies. Copyright © 2017. Published by Elsevier GmbH.

  2. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.

  3. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiments, and mixtures of Weibull distributions.

  4. Application of nonhomogeneous Poisson process to reliability analysis of repairable systems of a nuclear power plant with rates of occurrence of failures time-dependent

    International Nuclear Information System (INIS)

    Saldanha, Pedro L.C.; Simone, Elaine A. de; Melo, Paulo Fernando F.F. e

    1996-01-01

    Aging refers to the continuous process by which the physical characteristics of a system, structure or equipment change with time or use. Its effects are increases in the failure probabilities of the system, structure or equipment, and these are calculated using time-dependent failure rate models. The purpose of this paper is to present an application of the nonhomogeneous Poisson process as a model for studying rates of occurrence of failures when they are time-dependent. In this application, a reliability analysis of the service water pumps of a typical nuclear power plant is performed, since the pumps are effectively repaired components. (author)
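
    As a rough illustration of a nonhomogeneous Poisson process with a time-dependent rate of occurrence of failures, the sketch below fits the common power-law (Crow-AMSAA) intensity to a set of hypothetical failure times of a repairable pump; the data and the choice of the power-law form are assumptions, not the paper's.

```python
# NHPP with power-law ROCOF w(t) = (beta/eta) * (t/eta)**(beta - 1).
# Standard maximum-likelihood estimates for a failure-truncated data set;
# the failure times below are illustrative assumptions.
import numpy as np

failure_times = np.array([400.0, 950.0, 1600.0, 2100.0, 2450.0, 2700.0])  # hours
T = failure_times[-1]          # observation ends at the last failure
n = len(failure_times)

beta_hat = n / np.sum(np.log(T / failure_times[:-1]))   # shape (trend) parameter
eta_hat = T / n ** (1.0 / beta_hat)                      # scale parameter

def rocof(t):
    return (beta_hat / eta_hat) * (t / eta_hat) ** (beta_hat - 1.0)

def expected_failures(t):
    return (t / eta_hat) ** beta_hat                     # mean value function

print(f"beta = {beta_hat:.2f}  (>1 suggests the pump is deteriorating)")
print(f"ROCOF at {T:.0f} h: {rocof(T):.2e} failures/h, "
      f"expected failures by then: {expected_failures(T):.1f}")
```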

  5. Supporting the personnel reliability decision-making process with artificial intelligence

    International Nuclear Information System (INIS)

    Harte, D.C.

    1991-01-01

    Recent legislation concerning personnel security has vastly increased the responsibility and accountability of the security manager. Access authorization, fitness for duty, and personnel security access programs require decisions regarding an individual's trustworthiness and reliability based on the findings of a background investigation. While these guidelines provide significant data and are useful as a tool, limited resources are available to the adjudicator of derogatory information on what is and is not acceptable in terms of granting access to sensitive areas of nuclear plants. The reason why one individual is deemed unacceptable and the next acceptable may be questioned and lead to accusations of discrimination. This paper is a continuation of the discussion on workforce reliability, focusing on the use of artificial intelligence to support the decisions of a security manager. With this support, the benefit of previous decisions helps ensure consistent adjudication of background investigations.

  6. ANALYSIS OF RELIABILITY OF RESERVED AUTOMATIC CONTROL SYSTEMS OF INDUSTRIAL POWER PROCESSES

    Directory of Open Access Journals (Sweden)

    V. A. Anishchenko

    2014-01-01

    Full Text Available This paper presents a comparative analysis of the main structural schemes for redundant automatic control and regulation devices of important power supply objects with increased reliability requirements. Schemes of passive and active duplication with a control device, passive and active triplication, combined redundancy, and majority-voting redundancy ("two out of three" and "three out of five") were analyzed. Based on the calculations performed, these schemes were compared assuming ideal built-in control devices and ideal majority elements. Preference scales for the systems were built according to the criteria of maximum mean time to failure and mean probability of failure-free operation. These scales are variable in character, depending on the intervals of the parameter obtained by multiplying the failure rate by time. The order of preference changes with each system's failures and at the crossing points of the curves of mean probability of failure-free operation. The analysis shows the reliability advantage of triplication and combined redundancy systems, which is achieved at the cost of considerable expense for their creation. Under certain conditions the reliability of a passive triplication system is higher than that of an active duplication system. Majority schemes allow detecting not only complete but also single (metrological) failures. A boundary value for the unreliability of the built-in control device is determined, which allows a well-founded choice between systems with active and passive redundancy.
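
    A minimal sketch comparing the redundancy schemes named above for identical components with a constant failure rate, in the idealized case of perfect built-in control and majority elements; the failure rate and mission time are illustrative assumptions.

```python
# k-out-of-n reliability for identical, independent units with rate lam.
# Numbers are illustrative assumptions, not values from the paper.
import math

def r_component(lam, t):
    return math.exp(-lam * t)

def k_out_of_n(r, k, n):
    """Probability that at least k of n identical, independent units work."""
    return sum(math.comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

lam, t = 1e-4, 5000.0           # failures per hour, mission time in hours
r = r_component(lam, t)

print(f"single unit          : {r:.4f}")
print(f"active duplication   : {k_out_of_n(r, 1, 2):.4f}")   # 1-out-of-2
print(f"majority 2-out-of-3  : {k_out_of_n(r, 2, 3):.4f}")
print(f"majority 3-out-of-5  : {k_out_of_n(r, 3, 5):.4f}")
```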

  7. Crossing rate of labelled Poisson cluster processes and their application in the reliability theory

    International Nuclear Information System (INIS)

    Schrupp, K.

    1986-01-01

    A load process is modelled within a given interdependency system and the failure probability of a structure is estimated using the crossing rate method. The term 'labelled cluster process' is formally introduced. An approximation is given by the expected value of the point process of the crossings from the safe range to the failure range. This expected value is explicitly calculated for the non-stationary cluster process, the stationary borderline process, and for various types of superpositions (clustering) of such processes. (DG)

  8. Effects of Multimodal Presentation and Stimulus Familiarity on Auditory and Visual Processing

    Science.gov (United States)

    Robinson, Christopher W.; Sloutsky, Vladimir M.

    2010-01-01

    Two experiments examined the effects of multimodal presentation and stimulus familiarity on auditory and visual processing. In Experiment 1, 10-month-olds were habituated to either an auditory stimulus, a visual stimulus, or an auditory-visual multimodal stimulus. Processing time was assessed during the habituation phase, and discrimination of…

  9. Modeling of the thermal physical process and study on the reliability of linear energy density for selective laser melting

    Science.gov (United States)

    Xiang, Zhaowei; Yin, Ming; Dong, Guanhua; Mei, Xiaoqin; Yin, Guofu

    2018-06-01

    A finite element model of the powder layer in selective laser melting (SLM) is established that accounts for volume shrinkage during the powder-to-dense transition. A comparison between models that do and do not consider volume shrinkage or the powder-to-dense process is carried out. Further, a parametric analysis of laser power and scan speed is conducted, and the reliability of linear energy density as a design parameter is investigated. The results show that the established model is effective and more accurately captures the temperature distribution and the length and depth of the molten pool. The maximum temperature is more sensitive to laser power than to scan speed. The maximum heating and cooling rates increase with increasing scan speed at constant laser power, and with increasing laser power at constant scan speed. The simulation results and the experimental result reveal that linear energy density is not always reliable as a design parameter in SLM.
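
    The following is a minimal sketch of the process parameter discussed above, linear energy density E_l = P / v (with volumetric energy density shown for context); the parameter values are illustrative assumptions, not the paper's settings, and simply show why one E_l value can hide different power/speed pairs.

```python
# Linear and volumetric energy density for SLM process parameters.
# Power, speed, hatch spacing and layer thickness are illustrative assumptions.
def linear_energy_density(power_w, scan_speed_mm_s):
    return power_w / scan_speed_mm_s                              # J/mm

def volumetric_energy_density(power_w, scan_speed_mm_s, hatch_mm, layer_mm):
    return power_w / (scan_speed_mm_s * hatch_mm * layer_mm)      # J/mm^3

# Two parameter sets with identical linear energy density ...
for P, v in [(200.0, 800.0), (400.0, 1600.0)]:
    print(f"P={P:.0f} W, v={v:.0f} mm/s -> "
          f"E_l={linear_energy_density(P, v):.3f} J/mm, "
          f"E_v={volumetric_energy_density(P, v, 0.10, 0.03):.1f} J/mm^3")
# ... can still give different melt-pool temperatures, which is the sense in
# which E_l alone is "not always reliable" as a design parameter.
```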

  10. Test-retest reliability of fMRI-based graph theoretical properties during working memory, emotion processing, and resting state.

    Science.gov (United States)

    Cao, Hengyi; Plichta, Michael M; Schäfer, Axel; Haddad, Leila; Grimm, Oliver; Schneider, Michael; Esslinger, Christine; Kirsch, Peter; Meyer-Lindenberg, Andreas; Tost, Heike

    2014-01-01

    The investigation of the brain connectome with functional magnetic resonance imaging (fMRI) and graph theory analyses has recently gained much popularity, but little is known about the robustness of these properties, in particular those derived from active fMRI tasks. Here, we studied the test-retest reliability of brain graphs calculated from 26 healthy participants with three established fMRI experiments (n-back working memory, emotional face-matching, resting state) and two parcellation schemes for node definition (AAL atlas, functional atlas proposed by Power et al.). We compared the intra-class correlation coefficients (ICCs) of five different data processing strategies and demonstrated a superior reliability of task-regression methods with condition-specific regressors. The between-task comparison revealed significantly higher ICCs for resting state relative to the active tasks, and a superiority of the n-back task relative to the face-matching task for global and local network properties. While the mean ICCs were typically lower for the active tasks, overall fair to good reliabilities were detected for global and local connectivity properties and, for the n-back task with both atlases, for smallworldness. For all three tasks and atlases, low mean ICCs were seen for the local network properties. However, node-specific good reliabilities were detected for node degree in regions known to be critical for the challenged functions (resting state: default-mode network nodes, n-back: fronto-parietal nodes, face-matching: limbic nodes). The between-atlas comparison demonstrated significantly higher reliabilities for the functional parcellations for global and local network properties. Our findings can inform the choice of processing strategies, brain atlases and outcome properties for fMRI studies using active tasks, graph theory methods, and within-subject designs, in particular future pharmaco-fMRI studies. © 2013 Elsevier Inc. All rights reserved.

  11. Water chemistry data acquisition, processing, evaluation and diagnostic systems in Light Water Reactors: Future improvement of plant reliability and safety

    International Nuclear Information System (INIS)

    Uchida, S.; Takiguchi, H.; Ishigure, K.

    2006-01-01

    Data acquisition, processing and evaluation systems have been applied in major Japanese PWRs and BWRs to provide (1) reliable and quick data acquisition with manpower savings in plant chemical laboratories and (2) smooth and reliable information transfer among chemists, plant operators, and supervisors. Data acquisition systems in plants consist of automatic and semi-automatic instruments for chemical analyses, e.g., X-ray fluorescence analysis and ion chromatography, while data processing systems consist of PC-based sub-systems, e.g., data storage, reliability evaluation, clear display, and document preparation, for understanding the plant's own water chemistry trends. Precise and reliable evaluations of water chemistry data are required in order to improve plant reliability and safety. For this, quality assurance of the water chemistry data acquisition system is needed. At the same time, theoretical models are being applied to bridge the gaps between measured water chemistry data and the information needed to understand the interaction of materials and cooling water in plants. Major models which have already been applied for plant evaluation are: (1) water radiolysis models for BWRs and PWRs; (2) a crevice radiolysis model for SCC in BWRs; and (3) a crevice pH model for SG tubing in PWRs. High-temperature water chemistry sensors and automatic plant diagnostic systems have been applied in only restricted areas. ECP sensors are gaining popularity as tools to determine the effects of hydrogen injection in BWR systems. Automatic plant diagnostic systems based on artificial intelligence will become more common once sufficient experience with off-line diagnostic systems has been gained. (author)

  12. Learning process for performing and analyzing 3D/4D transperineal ultrasound imaging and interobserver reliability study.

    Science.gov (United States)

    Siafarikas, F; Staer-Jensen, J; Braekken, I H; Bø, K; Engh, M Ellström

    2013-03-01

    To evaluate the learning process for acquiring three- and four-dimensional (3D/4D) transperineal ultrasound volumes of the levator hiatus (LH) dimensions at rest, during pelvic floor muscle (PFM) contraction and on Valsalva maneuver, and for analyzing the ultrasound volumes, as well as to perform an interobserver reliability study between two independent ultrasound examiners. This was a prospective study including 22 women. We monitored the learning process of an inexperienced examiner (IE) performing 3D/4D transperineal ultrasonography and analyzing the volumes. The examination included acquiring volumes during three PFM contractions and three Valsalva maneuvers. LH dimensions were determined in the axial plane. The learning process was documented by estimating agreement between the IE and an experienced examiner (E) using the intraclass correlation coefficient. Agreement was calculated in blocks of 10 ultrasound examinations and analyzed volumes. After the learning process was complete the interobserver reliability for the technique was calculated between these two independent examiners. For offline analysis of the first 10 ultrasound volumes obtained by E, good to very good agreement between E and IE was achieved for all LH measurements except for the left and right levator-urethra gap and pubic arc. For the next 10 analyzed volumes, agreement improved for all LH measurements. Volumes that had been obtained by IE and E were then re-evaluated by IE, and good to very good agreement was found for all LH measurements indicating consistency in volume acquisition. The interobserver reliability study showed excellent ICC values (ICC, 0.81-0.97) for all LH measurements except the pubic arc (ICC = 0.67). 3D/4D transperineal ultrasound is a reliable technique that can be learned in a short period of time. Copyright © 2012 ISUOG. Published by John Wiley & Sons, Ltd.

  13. Application of a methodology for the development and validation of reliable process control software

    International Nuclear Information System (INIS)

    Ramamoorthy, C.V.; Mok, Y.R.; Bastani, F.B.; Chin, G.

    1980-01-01

    The necessity of a good methodology for the development of reliable software, especially with respect to the final software validation and testing activities, is discussed. A formal specification development and validation methodology is proposed. This methodology has been applied to the development and validation of pilot software incorporating typical features of critical software for nuclear power plant safety protection. The main features of the approach include the use of a formal specification language and the independent development of two sets of specifications. 1 ref

  14. System Reliability Engineering

    International Nuclear Information System (INIS)

    Lim, Tae Jin

    2005-02-01

    This book covers reliability engineering, including quality and reliability, reliability data, the importance of reliability engineering, reliability measures, the Poisson process (goodness-of-fit tests and the Poisson arrival model), reliability estimation (e.g., for the exponential distribution), reliability of systems, availability, preventive maintenance (replacement policies, minimal repair policy, shock models, spares, group maintenance and periodic inspection), analysis of common cause failures, and analysis models of repair effect.

  15. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology (ICT) context. In particular, in the first section, definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing laboratory tests, presents the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  16. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose of improving the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used, especially in the analysis of very complex systems. In order to increase the applicability of the programs, variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)
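
    A minimal sketch of the two ideas mentioned above, crude Monte Carlo estimation of a small failure probability and importance sampling as a variance reduction technique; the limit-state function and distributions are illustrative assumptions, not taken from the report.

```python
# Crude Monte Carlo vs. importance sampling for a small failure probability.
# Load/strength distributions and the shifted sampling density are assumptions.
import numpy as np

rng = np.random.default_rng(42)
N = 200_000

def failed(load, strength):
    return load > strength          # failure when load exceeds strength

# Crude Monte Carlo: sample from the true distributions
load = rng.normal(50.0, 5.0, N)
strength = rng.normal(80.0, 6.0, N)
p_crude = failed(load, strength).mean()

# Importance sampling: draw loads from a density shifted toward the failure
# region, then re-weight each sample by the likelihood ratio f(x)/g(x).
shift = 75.0
load_is = rng.normal(shift, 5.0, N)
log_w = ((load_is - shift) ** 2 - (load_is - 50.0) ** 2) / (2 * 5.0 ** 2)
weights = np.exp(log_w)
p_is = np.mean(failed(load_is, strength) * weights)

print(f"crude MC estimate   : {p_crude:.2e}")
print(f"importance sampling : {p_is:.2e}")
```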

  17. Present status of radiation processing and its future development by using electron accelerator in Vietnam

    International Nuclear Information System (INIS)

    Tran Khac An; Tran Tich Canh; Doan Binh; Nguyen Quoc Hien

    2003-01-01

    In Vietnam, studies on Radiation Processing have been carried out since 1983. Some results are applicable in the fields of agriculture, health and foodstuffs; some research has been developed to commercial scale; and other work has high potential for development using an electron accelerator. The paper presents the current status of radiation processing and also describes the growing tendency toward using electron accelerators in the future. (author)

  18. Multimedia presentation as a form of E-learning resources in the educational process

    Directory of Open Access Journals (Sweden)

    Bizyaev АА

    2017-06-01

    Full Text Available The article describes the features of using multimedia presentations as an electronic learning resource in the educational process, reflecting resource requirements and the pedagogical goals that may be achieved. Currently, one of the main directions in the educational process is the effective use of teaching computers. A pressing issue in the implementation of information and communication technologies in education is the development of educational resources with the aim of increasing the level and quality of education.

  19. Present status of radiation processing and its future development by using electron accelerator in Vietnam

    Energy Technology Data Exchange (ETDEWEB)

    Tran Khac An; Tran Tich Canh; Doan Binh [Research and Development Center for Radiation Technology (VINAGAMMA), Ho Chi Minh (Viet Nam); Nguyen Quoc Hien [Nuclear Research Institute (NRI), Dalat (Viet Nam)

    2003-02-01

    In Vietnam, studies on Radiation Processing have been carried out since 1983. Some results are applicable in the fields of agriculture, health and foodstuffs; some research has been developed to commercial scale; and other work has high potential for development using an electron accelerator. The paper presents the current status of radiation processing and also describes the growing tendency toward using electron accelerators in the future. (author)

  20. Process-aware information system development for the healthcare domain : consistency, reliability and effectiveness

    NARCIS (Netherlands)

    Mans, R.S.; Aalst, van der W.M.P.; Russell, N.C.; Bakker, P.J.M.; Moleman, A.J.; Rinderle-Ma, S.; Sadiq, S.; Leymann, F.

    2010-01-01

    Optimal support for complex healthcare processes cannot be provided by a single out-of-the-box Process-Aware Information System and necessitates the construction of customized applications based on these systems. In order to allow for the seamless integration of the new technology into the existing

  1. Development of a Tablet-based symbol digit modalities test for reliably assessing information processing speed in patients with stroke.

    Science.gov (United States)

    Tung, Li-Chen; Yu, Wan-Hui; Lin, Gong-Hong; Yu, Tzu-Ying; Wu, Chien-Te; Tsai, Chia-Yin; Chou, Willy; Chen, Mei-Hsiang; Hsieh, Ching-Lin

    2016-09-01

    To develop a Tablet-based Symbol Digit Modalities Test (T-SDMT) and to examine the test-retest reliability and concurrent validity of the T-SDMT in patients with stroke. The study had two phases. In the first phase, six experts, nine college students and five outpatients participated in the development and testing of the T-SDMT. In the second phase, 52 outpatients were evaluated twice (2 weeks apart) with the T-SDMT and SDMT to examine the test-retest reliability and concurrent validity of the T-SDMT. The T-SDMT was developed via expert input and college student/patient feedback. Regarding test-retest reliability, the practise effects of the T-SDMT and SDMT were both trivial (d=0.12) but significant (p≤0.015). The improvement in the T-SDMT (4.7%) was smaller than that in the SDMT (5.6%). The minimal detectable changes (MDC%) of the T-SDMT and SDMT were 6.7 (22.8%) and 10.3 (32.8%), respectively. The T-SDMT and SDMT were highly correlated with each other at the two time points (Pearson's r=0.90-0.91). The T-SDMT demonstrated good concurrent validity with the SDMT. Because the T-SDMT had a smaller practise effect and less random measurement error (superior test-retest reliability), it is recommended over the SDMT for assessing information processing speed in patients with stroke. Implications for Rehabilitation: The Symbol Digit Modalities Test (SDMT), a common measure of information processing speed, showed a substantial practise effect and considerable random measurement error in patients with stroke. The Tablet-based SDMT (T-SDMT) has been developed to reduce the practise effect and random measurement error of the SDMT in patients with stroke. The T-SDMT had a smaller practise effect and less random measurement error than the SDMT and can provide more reliable assessments of information processing speed.
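
    A minimal sketch of how a minimal detectable change such as the MDC% values quoted above is typically derived from test-retest data (SEM = SD·sqrt(1−ICC), MDC95 = 1.96·sqrt(2)·SEM, MDC% = MDC/mean·100); the ICC, SD and mean below are hypothetical, not the study's intermediate values.

```python
# Minimal detectable change from a test-retest ICC.
# The ICC, SD and baseline mean are hypothetical illustrative values.
import math

def mdc95(sd, icc):
    sem = sd * math.sqrt(1.0 - icc)         # standard error of measurement
    return 1.96 * math.sqrt(2.0) * sem      # 95% minimal detectable change

baseline_mean = 29.4     # hypothetical mean test score
sd, icc = 8.0, 0.91      # hypothetical pooled SD and test-retest ICC
change = mdc95(sd, icc)
print(f"MDC95 = {change:.1f} points, MDC% = {100 * change / baseline_mean:.1f}%")
```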

  2. Antigen processing and remodeling of the endosomal pathway: requirements for antigen cross-presentation.

    Science.gov (United States)

    Compeer, Ewoud Bernardus; Flinsenberg, Thijs Willem Hendrik; van der Grein, Susanna Geertje; Boes, Marianne

    2012-01-01

    Cross-presentation of endocytosed antigen as peptide/class I major histocompatibility complex complexes plays a central role in the elicitation of CD8(+) T cell clones that mediate anti-viral and anti-tumor immune responses. While it has been clear that there are specific subsets of professional antigen presenting cells capable of antigen cross-presentation, identification of mechanisms involved is still ongoing. Especially amongst dendritic cells (DC), there are specialized subsets that are highly proficient at antigen cross-presentation. We here present a focused survey on the cell biological processes in the endosomal pathway that support antigen cross-presentation. This review highlights DC-intrinsic mechanisms that facilitate the cross-presentation of endocytosed antigen, including receptor-mediated uptake, maturation-induced endosomal sorting of membrane proteins, dynamic remodeling of endosomal structures and cell surface-directed endosomal trafficking. We will conclude with the description of pathogen-induced deviation of endosomal processing, and discuss how immune evasion strategies pertaining endosomal trafficking may preclude antigen cross-presentation.

  3. Antigen processing and remodeling of the endosomal pathway: requirements for antigen cross-presentation.

    Directory of Open Access Journals (Sweden)

    Ewoud Bernardus Compeer

    2012-03-01

    Full Text Available The cross-presentation of endocytosed antigen as peptide/class I MHC complexes plays a central role in the elicitation of CD8+ T cell clones that mediate anti-viral and anti-tumor immune responses. While it has been clear that there are specific subsets of professional antigen presenting cells (APC) capable of antigen cross-presentation, description of the mechanisms involved is still ongoing. Especially amongst dendritic cells (DC), there are specialized subsets that are highly proficient at antigen cross-presentation. We here present a focused survey on the cell biological processes in the endosomal pathway that support antigen cross-presentation. This review highlights DC-intrinsic mechanisms that facilitate the cross-presentation of endocytosed antigen, including receptor-mediated uptake, recycling and maturation including the sorting of membrane proteins, dynamic remodeling of endosomal structures and cell surface-directed endosomal trafficking. We will conclude with a description of pathogen-induced deviation of endosomal processing, and discuss how immune evasion strategies pertaining to endosomal trafficking may preclude antigen cross-presentation.

  4. Unveiling the fungal mycobiota present throughout the cork stopper manufacturing process

    NARCIS (Netherlands)

    Barreto, M.C.; Houbraken, J.; Samson, R.A.; Brito, D.; Gadanho, M.; San Romão, M.V.

    2012-01-01

    A particular fungal population is present in the main stages of the manufacturing process of cork discs. Its diversity was studied using both culture-dependent (isolation) and culture-independent methods (denaturing gradient gel electrophoresis and cloning of the ITS1-5.8S-ITS2 region). The mycobiota in the

  5. Risks and reliability of manufacturing processes as related to composite materials for spacecraft structures

    Science.gov (United States)

    Bao, Han P.

    1995-01-01

    Fabricating primary aircraft and spacecraft structures from advanced composite materials entails both benefits and risks. The benefits come from much improved strength-to-weight and stiffness-to-weight ratios, potential for lower part count, the ability to tailor properties, chemical and solvent resistance, and superior thermal properties. On the other hand, the risks involved include high material costs, lack of processing experience, expensive labor, poor reproducibility, high toxicity for some composites, and a variety of space-induced risks. The purpose of this project is to generate a manufacturing database for a selected number of materials with potential for space applications, and to rely on this database to develop quantitative approaches to screen candidate materials and processes for space applications on the basis of their manufacturing risks, including costs. So far, the following materials have been included in the database: epoxies, polycyanates, bismaleimides, PMR-15, polyphenylene sulfides, polyetherimides, polyetheretherketone, and aluminum-lithium. The first four materials are thermoset composites, the next three are thermoplastic composites, and the last one is a metal. The emphasis of this database is on factors affecting manufacturing such as cost of raw material; handling aspects, which include working life and shelf life of resins; process temperature; chemical/solvent resistance; moisture resistance; damage tolerance; toxicity; outgassing; thermal cycling; void content; nature or type of process; associated tooling; and in-process quality assurance. Based on industry experience and published literature, a relative ranking was established for each of the factors affecting manufacturing listed above. Potential applications of this database include the determination of a delta cost factor for specific structures with a given process plan and a general methodology to screen materials and processes for incorporation into the current

  6. The Adaptation, Validation, Reliability Process of the Turkish Version Orientations to Happiness Scale

    Directory of Open Access Journals (Sweden)

    Hakan Saricam

    2015-12-01

    Full Text Available The purpose of this research is to adapt the Orientations to Happiness Scale, developed by Peterson, Park, and Seligman (2005), into Turkish and to examine the psychometric properties of the scale. The participants of the research consist of 489 students. The psychometric properties of the scale were examined with the following methods: linguistic equivalence, descriptive factor analysis, confirmatory factor analysis, criterion-related validity, internal consistency, and test-retest reliability. For criterion-related (concurrent) validity, the Oxford Happiness Questionnaire-Short Form was used. Items from the descriptive factor analysis conducted for the structural validity of the scale grouped into three factors (life of meaning, life of pleasure, life of engagement) in accordance with the original form. Confirmatory factor analysis of the 18 items yielded the following three-factor fit indices: χ2/df=1.94, RMSEA=.059, CFI=.96, GFI=.95, IFI=.95, NFI=.96, RFI=.95 and SRMR=.044. Factor loadings of the scale range from .36 to .59. In the criterion-validity analysis, strong positive relationships were observed between the Orientations to Happiness Scale and the Oxford Happiness Questionnaire (p<.01). The Cronbach alpha internal consistency coefficient was .88 for the life of meaning sub-scale, .84 for the life of pleasure sub-scale, and .81 for the life of engagement sub-scale. In addition, corrected item-total correlations range from .39 to .61. According to these results, it can be said that the scale is a valid and reliable assessment instrument for positive psychology, educational psychology, and other fields.
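
    A minimal sketch of the internal-consistency statistic reported above, Cronbach's alpha = k/(k−1)·(1 − Σ item variances / variance of the total score), computed on a toy respondents-by-items matrix (illustrative data, not the study's responses):

```python
# Cronbach's alpha for one sub-scale; the response matrix is an illustrative assumption.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) matrix of Likert-type scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(7)
latent = rng.normal(0, 1, size=(489, 1))                    # one trait, 489 students
responses = latent + rng.normal(0, 0.8, size=(489, 6))      # 6 items of a sub-scale
print(f"alpha = {cronbach_alpha(responses):.2f}")
```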

  7. Modeling of the thermal physical process and study on the reliability of linear energy density for selective laser melting

    Directory of Open Access Journals (Sweden)

    Zhaowei Xiang

    2018-06-01

    Full Text Available A finite element model of the powder layer in selective laser melting (SLM) is established that accounts for volume shrinkage during the powder-to-dense transition. A comparison between models that do and do not consider volume shrinkage or the powder-to-dense process is carried out. Further, a parametric analysis of laser power and scan speed is conducted, and the reliability of linear energy density as a design parameter is investigated. The results show that the established model is effective and more accurately captures the temperature distribution and the length and depth of the molten pool. The maximum temperature is more sensitive to laser power than to scan speed. The maximum heating and cooling rates increase with increasing scan speed at constant laser power, and with increasing laser power at constant scan speed. The simulation results and the experimental result reveal that linear energy density is not always reliable as a design parameter in SLM. Keywords: Selective laser melting, Volume shrinkage, Powder-to-dense process, Numerical modeling, Thermal analysis, Linear energy density

  8. Recycle attuned catalytic exchange (RACE) for reliable and low inventory processing of highly tritiated water

    International Nuclear Information System (INIS)

    Iseli, M.; Schaub, M.; Ulrich, D.

    1992-01-01

    The detritiation of highly tritiated water by liquid phase catalytic exchange requires dilution of the feed with water, both to bring tritium concentrations within limits suitable for the catalyst and for safety rules and to assure flow rates large enough to wet the catalyst. Dilution by recycling detritiated water from within the exchange process has three advantages: the amount and concentration of the dilution water are controlled within the exchange process; there is no additional water load on processes located downstream of RACE; and the ratio of gas to liquid flow rates in the exchange column can be adjusted by using several recycles differing in amount and concentration, avoiding an excessively large number of theoretical separation stages. In this paper, the flexibility of recycle attuned catalytic exchange (RACE) and its effect on the cryogenic distillation are demonstrated for the detritiation of the highly tritiated water from a tritium breeding blanket.
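
    The bookkeeping behind such a recycle dilution can be sketched as a simple steady-state mixing balance; the flow rates and concentrations below are illustrative assumptions, not RACE design values.

```python
# Steady-state mixing balance: recycle flow needed to dilute a highly tritiated
# feed to a target column-inlet concentration. All numbers are assumptions.
def recycle_flow(feed_flow, feed_conc, recycle_conc, target_conc):
    """Return the recycle flow (same units as feed_flow) so that the mixed
    stream (feed + recycle) enters the column at target_conc."""
    if not recycle_conc < target_conc < feed_conc:
        raise ValueError("target must lie between recycle and feed concentrations")
    # (feed*c_f + recycle*c_r) / (feed + recycle) = c_target
    return feed_flow * (feed_conc - target_conc) / (target_conc - recycle_conc)

feed = 1.0            # kg/h of highly tritiated water
c_feed = 1.0e4        # arbitrary tritium activity units
c_recycle = 1.0       # detritiated recycle water
c_target = 50.0       # concentration acceptable for the catalyst
print(f"required recycle flow: {recycle_flow(feed, c_feed, c_recycle, c_target):.1f} kg/h")
```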

  9. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Guide to data processing and revision: Part 2, Human error probability data entry and revision procedures

    International Nuclear Information System (INIS)

    Gilmore, W.E.; Gertman, D.I.; Gilbert, B.G.; Reece, W.J.

    1988-11-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) is an automated data base management system for processing and storing human error probability (HEP) and hardware component failure data (HCFD). The NUCLARR system software resides on an IBM (or compatible) personal micro-computer. Users can perform data base searches to furnish HEP estimates and HCFD rates. In this manner, the NUCLARR system can be used to support a variety of risk assessment activities. This volume, Volume 3 of a 5-volume series, presents the procedures used to process HEP and HCFD for entry in NUCLARR and describes how to modify the existing NUCLARR taxonomy in order to add either equipment types or action verbs. Volume 3 also specifies the various roles of the administrative staff on assignment to the NUCLARR Clearinghouse who are tasked with maintaining the data base, dealing with user requests, and processing NUCLARR data. 5 refs., 34 figs., 3 tabs

  10. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Guide to data processing and revision: Part 3, Hardware component failure data entry and revision procedures

    International Nuclear Information System (INIS)

    Gilmore, W.E.; Gertman, D.I.; Gilbert, B.G.; Reece, W.J.

    1988-11-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) is an automated data base management system for processing and storing human error probability (HEP) and hardware component failure data (HCFD). The NUCLARR system software resides on an IBM (or compatible) personal micro-computer. Users can perform data base searches to furnish HEP estimates and HCFD rates. In this manner, the NUCLARR system can be used to support a variety of risk assessment activities. This volume, Volume 3 of a 5-volume series, presents the procedures used to process HEP and HCFD for entry in NUCLARR and describes how to modify the existing NUCLARR taxonomy in order to add either equipment types or action verbs. Volume 3 also specifies the various roles of the administrative staff on assignment to the NUCLARR Clearinghouse who are tasked with maintaining the data base, dealing with user requests, and processing NUCLARR data

  11. The Association of Social Work Boards' Licensure Examinations: A Review of Reliability and Validity Processes

    Science.gov (United States)

    Marson, Stephen M.; DeAngelis, Donna; Mittal, Nisha

    2010-01-01

    Objectives: The purpose of this article is to create transparency for the psychometric methods employed for the development of the Association of Social Work Boards' (ASWB) exams. Results: The article includes an assessment of the macro (political) and micro (statistical) environments of testing social work competence. The seven-step process used…

  12. Reliability of the primary triage process after the volendam fire disaster

    NARCIS (Netherlands)

    Welling, Lieke; van Harten, Sabine M.; Henny, C. Pieter; Mackie, Dave P.; Ubbink, Dirk T.; Kreis, Robert W.; Trouwborst, Ad

    2008-01-01

    In a major incident, correct triage is crucial to emergency treatment and transportation priority. The aim of this study was to evaluate the triage process pursued at the site of the fire disaster in Volendam, the Netherlands on January 1, 2001. On-site (OS) and Emergency Department (ED) data

  13. Multi-mission space science data processing systems - Past, present, and future

    Science.gov (United States)

    Stallings, William H.

    1990-01-01

    Packetized telemetry that is consistent with the international Consultative Committee for Space Data Systems (CCSDS) has been baselined for future NASA missions such as Space Station Freedom. Some experiences from past and present multimission systems are examined, including current experiences in implementing a CCSDS standard packetized data processing system, relative to the effectiveness of the multimission approach in lowering life cycle cost and the complexity of meeting new mission needs. It is shown that the continued effort toward standardization of telemetry and processing support will permit the development of multimission systems needed to meet the increased requirements of future NASA missions.

  14. Overview on collision processes of highly charged ions with atoms present status and problems

    International Nuclear Information System (INIS)

    Janev, R.K.

    1983-05-01

    This paper provides a brief discussion on the present status of the collision physics of highly charged ions with atoms. The emphasis is on the main achievements in understanding and describing the most important collision processes, such as charge transfer, ionization and Auger-type processes, and even more on those open problems which, due either to their scientific or practical importance, represent challenges to current research in this field. The paper concentrates on general ideas and problems whose development and solutions have advanced or will advance our basic understanding of the collision dynamics of multiply charged ions with atoms

  15. Knowing when you're wrong: Building fast and reliable approximate query processing systems

    OpenAIRE

    Agarwal, Sameer; Milner, Henry; Kleiner, Ariel; Talwalkar, Ameet; Jordan, Michael; Mozafari, Barzan; Stoica, Ion; Madden, Samuel R.

    2014-01-01

    Modern data analytics applications typically process massive amounts of data on clusters of tens, hundreds, or thousands of machines to support near-real-time decisions.The quantity of data and limitations of disk and memory bandwidth often make it infeasible to deliver answers at interactive speeds. However, it has been widely observed that many applications can tolerate some degree of inaccuracy. This is especially true for exploratory queries on data, where users are satisfied with "close-...

  16. Light Video Game Play is Associated with Enhanced Visual Processing of Rapid Serial Visual Presentation Targets.

    Science.gov (United States)

    Howard, Christina J; Wilding, Robert; Guest, Duncan

    2017-02-01

    There is mixed evidence that video game players (VGPs) may demonstrate better performance in perceptual and attentional tasks than non-VGPs (NVGPs). The rapid serial visual presentation task is one such case, where observers respond to two successive targets embedded within a stream of serially presented items. We tested light VGPs (LVGPs) and NVGPs on this task. LVGPs were better at correct identification of second targets whether or not they were also attempting to respond to the first target. This performance benefit seen for LVGPs suggests enhanced visual processing for briefly presented stimuli even with only very moderate game play. Observers were less accurate at discriminating the orientation of a second target within the stream if it occurred shortly after presentation of the first target, that is to say, they were subject to the attentional blink (AB). We find no evidence for any reduction in AB in LVGPs compared with NVGPs.

  17. Mitigation of release of volatile iodine species during severe reactor accidents - a novel reliable process of safety technology

    International Nuclear Information System (INIS)

    Guentay, S.; Bruchertseifer, H.

    2010-01-01

    In severe accidents, a significant risk for public health may be generated as a result of release of gaseous iodine species into the environment through containment leaks or through containment venting filter systems with low retention efficiency. Elemental iodine and volatile organic iodides are the main gaseous iodine species in the containment. Potential release of large quantities of gaseous elemental iodine from the reactor coolant system, or its radiolytic generation in the containment sump, constitutes the key source of gaseous elemental iodine in the containment atmosphere. Iodine-paint reactions as well as the reaction of iodine with organic residuals in sump water are the main mechanisms for the generation of highly volatile organic iodides in the containment. Although a solution was very much desired, the significant research activities conducted in the 1970s unfortunately did not produce any technically feasible means to mitigate iodine release into the environment under the prevailing conditions. Development of a process providing fast, comprehensive and reliable retention of volatile iodine species in aqueous solution, with the aim of implementing it for severe accident management applications, has been the subject of a research project in recent years at the Paul Scherrer Institut. The process developed relies on the simultaneous use of two customary technical chemical additives in an aqueous solution. The results of the experimental program have demonstrated fast and reliable destruction of highly volatile organic iodine species, fast reduction of elemental iodine to iodide ions in aqueous solutions, and efficient mitigation of the re-formation of gaseous iodine from iodide ions. Investigations covered a broad range of anticipated severe accident conditions in the containment. The project additionally focused on possible application of the process to existing containment venting filter systems, specifically as a passive add-on for back-fitting. This paper describes the process

  18. Manufacturing process modeling for composite materials and structures, Sandia blade reliability collaborative

    Energy Technology Data Exchange (ETDEWEB)

    Guest, Daniel A.; Cairns, Douglas S.

    2014-02-01

    The increased use and interest in wind energy over the last few years has necessitated an increase in the manufacturing of wind turbine blades. This increase in manufacturing has in many ways outstripped the current understanding of not only the materials used but also the manufacturing methods used to construct composite laminates. The goal of this study is to develop a list of process parameters which influence the quality of composite laminates manufactured using vacuum assisted resin transfer molding and to evaluate how they influence laminate quality. Known to be primary factors for the manufacturing process are resin flow rate and vacuum pressure. An incorrect balance of these parameters will often cause porosity or voids in laminates that ultimately degrade the strength of the composite. Fiber waviness has also been seen as a major contributor to failures in wind turbine blades and is often the effect of mishandling during the lay-up process. Based on laboratory tests conducted, a relationship between these parameters and laminate quality has been established which will be a valuable tool in developing best practices and standard procedures for the manufacture of wind turbine blade composites.

  19. The role of cathepsin E in the antigen processing and presentation pathway.

    OpenAIRE

    Free, P. F.

    2006-01-01

    Although much has been unravelled with regards to the mechanisms of proteolysis of exogenously derived antigen for presentation via histocompatibility class-II (MHC-II), key questions remain unresolved. The exact role of each proteolytic enzyme in this process is not understood. The aspartic proteinase cathepsin E is hypothesised to play an important role. The aim of this study is to examine this by the use of novel aspartic proteinase inhibitors based upon the aspartic proteinase inhibitor p...

  20. The Impact of the Delivery of Prepared Power Point Presentations on the Learning Process

    Directory of Open Access Journals (Sweden)

    Auksė Marmienė

    2011-04-01

    This article describes the process of the preparation and delivery of Power Point presentations and how it can be used by teachers as a resource for classroom teaching. The advantages of this classroom activity are also outlined, together with some of the problems and a few suggestions for dealing with those difficulties. The major objective of the present paper is to investigate the students' ability to choose the material and the content of Power Point presentations on professional topics via the Internet as well as the ability to prepare and deliver the presentation in front of an audience. The factors which determine the choice of the presentation subject are also analysed in this paper. After the delivery, students were requested to self- and peer-assess the difficulties they faced in the preparation and performance of the presentations by writing reports. Learners' attitudes to the choice of the topic of Power Point presentations were surveyed by administering a self-assessment questionnaire.

  1. Automatic processing of unattended lexical information in visual oddball presentation: neurophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yury eShtyrov

    2013-08-01

    Previous electrophysiological studies of automatic language processing revealed early (100-200 ms) reflections of access to lexical characteristics of the speech signal using the so-called mismatch negativity (MMN), a negative ERP deflection elicited by infrequent irregularities in unattended repetitive auditory stimulation. In those studies, lexical processing of spoken stimuli became manifest as an enhanced ERP in response to unattended real words as opposed to phonologically matched but meaningless pseudoword stimuli. This lexical ERP enhancement was explained by automatic activation of word memory traces realised as distributed, strongly intra-connected neuronal circuits, whose robustness guarantees memory trace activation even in the absence of attention on spoken input. Such an account would predict the automatic activation of these memory traces upon any presentation of linguistic information, irrespective of the presentation modality. As previous lexical MMN studies exclusively used auditory stimulation, we here adapted the lexical MMN paradigm to investigate early automatic lexical effects in the visual modality. In a visual oddball sequence, matched short word and pseudoword stimuli were presented tachistoscopically in the perifoveal area outside the visual focus of attention, as the subjects' attention was concentrated on a concurrent non-linguistic visual dual task in the centre of the screen. Using EEG, we found a visual analogue of the lexical ERP enhancement effect, with unattended written words producing larger brain response amplitudes than matched pseudowords, starting at ~100 ms. Furthermore, we also found a significant visual MMN, reported here for the first time for unattended lexical stimuli presented perifoveally. The data suggest early automatic lexical processing of visually presented language outside the focus of attention.

  2. A reliable technique for transfer of radioactivity filled vial from transport container to the processing station

    International Nuclear Information System (INIS)

    Kothalkar, Chetan; Dey, A.C.

    2005-01-01

    In the Technetium Column Generator Production Facility (TCGPF project) of BRIT, a facility has been developed for unloading the vial containing radioactive liquid sodium molybdate (99Mo) solution from the transport cask into the processing station and unsealing the vial to transfer the liquid to a storage bottle. This is specifically conceptualized for safe handling of radioactivity and for minimizing the radiation dose exposure to the personnel working at the time of transferring the radioactivity from the transport cask to a place for further processing. The facility, designed to handle around 1850 GBq activity, has two cells enclosed in a 102 mm thick lead wall and connected by a gravity actuated trolley conveyor. The first cell handles the transport cask carrying the vial containing radioactivity, and houses two types of vial lifting gadgets assisted by manually operable tongs. The gadgets use compressed air. In an experiment, it was found that the HDPE vial lifting gadget using a suction cup continues to function for up to 30-40 minutes after a power failure. Experience shows that the gadget using a 3-point radial gripper to lift the glass vial will remain in the grab position even if the compressed air supply stops. In this facility the dose received by the operator while handling radioactivity is likely to be negligibly small (approx. 3.15 × 10⁻⁴ mSv per year at the rate of four glass vials per week, and 2.25 × 10⁻⁴ mSv per year at the rate of one HDPE vial per week). (author)

  3. Determining optimal selling price and lot size with process reliability and partial backlogging considerations

    Science.gov (United States)

    Hsieh, Tsu-Pang; Cheng, Mei-Chuan; Dye, Chung-Yuan; Ouyang, Liang-Yuh

    2011-01-01

    In this article, we extend the classical economic production quantity (EPQ) model by incorporating imperfect production processes and a quality-dependent unit production cost. The demand rate is described by any convex decreasing function of the selling price. In addition, we allow for shortages and a time-proportional backlogging rate. For any given selling price, we first prove that the optimal production schedule not only exists but also is unique. Next, we show that the total profit per unit time is a concave function of price when the production schedule is given. We then provide a simple algorithm to find the optimal selling price and production schedule for the proposed model. Finally, we use a couple of numerical examples to illustrate the algorithm and conclude this article with suggestions for possible future research.
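
    The full model in this entry also handles imperfect production, a quality-dependent unit cost and partial backlogging; as a much simpler illustration of the nested structure it describes (an inner lot-size optimum for each fixed price, then a one-dimensional search over the price), the Python sketch below uses a plain EPQ cost structure with a hypothetical linear demand curve. All parameter names and values are invented for the example and are not the authors' model or data.

        import math

        # Hypothetical parameters (not from the paper): linear demand D(p) = a - b*p,
        # unit production cost c, setup cost K, holding cost h per unit per unit time.
        a, b = 200.0, 4.0
        c, K, h = 10.0, 150.0, 2.0

        def demand(p):
            return max(a - b * p, 0.0)

        def optimal_lot_size(p):
            """Inner problem: for a fixed price, the best schedule in this toy
            version is just the classical EPQ lot size for the induced demand."""
            return math.sqrt(2.0 * K * demand(p) / h)

        def profit_per_unit_time(p):
            """Revenue minus production, setup and holding costs at the EPQ lot size."""
            d = demand(p)
            if d <= 0.0:
                return float("-inf")
            return (p - c) * d - math.sqrt(2.0 * K * h * d)

        # Outer problem: one-dimensional search over price.  The paper proves the
        # profit is concave in price for its model, which allows faster searches;
        # a fine grid is enough for this illustration.
        prices = [c + i * (a / b - c) / 5000 for i in range(5001)]
        best_p = max(prices, key=profit_per_unit_time)
        print(f"price ~ {best_p:.2f}, lot size ~ {optimal_lot_size(best_p):.1f}, "
              f"profit per unit time ~ {profit_per_unit_time(best_p):.1f}")

    The same two-level structure (inner schedule optimum for a fixed price, outer search over a well-behaved profit curve) is what keeps the algorithm in the abstract simple to implement.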

  4. Reliability engineering theory and practice

    CERN Document Server

    Birolini, Alessandro

    2014-01-01

    This book shows how to build in, evaluate, and demonstrate reliability and availability of components, equipment, and systems. It presents the state-of-the-art of reliability engineering, both in theory and practice, and is based on the author's more than 30 years' experience in this field, half in industry and half as Professor of Reliability Engineering at the ETH, Zurich. The structure of the book allows rapid access to practical results. This final edition extends and replaces all previous editions. New in particular are a strategy to mitigate incomplete coverage, a comprehensive introduction to human reliability with design guidelines and new models, and a refinement of reliability allocation, design guidelines for maintainability, and concepts related to regenerative stochastic processes. The set of problems for homework has been extended. Methods & tools are given in a way that they can be tailored to cover different reliability requirement levels and be used for safety analysis. Because of the Appendice...

  5. Development and reliability testing of a Health Action Process Approach inventory for physical activity participation among individuals with schizophrenia

    Directory of Open Access Journals (Sweden)

    Kelly eArbour-Nicitopoulos

    2014-06-01

    Individuals with schizophrenia tend to have high levels of cardiovascular disease and lower physical activity (PA) levels than the general population. Research is urgently required in developing evidence-based behavioral interventions for increasing PA in this population. One model that has been increasingly used to understand the mechanisms underlying PA is the Health Action Process Approach (HAPA). The purpose of this study was to adapt and pilot-test a HAPA-based inventory that reliably captures salient, modifiable PA determinants for individuals with schizophrenia. Initially, twelve outpatients with schizophrenia reviewed the inventory and provided verbal feedback regarding comprehension, item relevance, and potential new content. A content analysis framework was used to inform modifications to the inventory. The resultant inventory underwent a quantitative assessment of internal consistency and test-retest reliability. Twenty-five outpatients (Mage = 41.5 ± 13.5 years; 64% male) completed the inventory on two separate occasions, one week apart. All but two scales showed good internal consistency (Cronbach's α = 0.62–0.98) and test-retest correlations (rs = .21–.96). Preliminary assessment of criterion validity of the HAPA inventory showed significant, large-sized correlations between behavioural intentions and both affective outcome expectancies and task self-efficacy, and small-to-moderate correlations between self-reported minutes of moderate-to-vigorous PA and the volitional constructs of the HAPA model. These findings provide preliminary support for the reliability and validity of the first-ever inventory for examining theory-based predictors of moderate-to-vigorous PA intentions and behavior among individuals with schizophrenia. Further validation research with this inventory using an objective measure of PA behavior will provide additional support for its psychometric properties within the schizophrenia population.
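
    For readers unfamiliar with the two reliability statistics quoted above, the short sketch below computes Cronbach's alpha for a multi-item scale and a test-retest correlation for the summed scale score. The item responses are fabricated for illustration; they are not the study's data.

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, k_items) matrix of scores for one scale."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

        def test_retest_r(score_t1, score_t2):
            """Pearson correlation between scale scores from two administrations."""
            return np.corrcoef(np.asarray(score_t1, float),
                               np.asarray(score_t2, float))[0, 1]

        # Fabricated 5-item Likert scale (1-5) answered twice by 8 respondents.
        rng = np.random.default_rng(1)
        base = rng.integers(1, 6, size=(8, 1))                  # each person's latent level
        scale_t1 = np.clip(base + rng.integers(-1, 2, (8, 5)), 1, 5)
        scale_t2 = np.clip(scale_t1 + rng.integers(-1, 2, (8, 5)), 1, 5)

        print("Cronbach's alpha:", round(cronbach_alpha(scale_t1), 2))
        print("test-retest r:", round(test_retest_r(scale_t1.sum(1), scale_t2.sum(1)), 2))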

  6. Reliability issues in PACS

    Science.gov (United States)

    Taira, Ricky K.; Chan, Kelby K.; Stewart, Brent K.; Weinberg, Wolfram S.

    1991-07-01

    Reliability is an increasing concern when moving PACS from the experimental laboratory to the clinical environment. Any system downtime may seriously affect patient care. The authors report on several classes of errors encountered during the pre-clinical release of the PACS over the past several months and present the solutions implemented to handle them. The reliability issues discussed include: (1) environmental precautions, (2) database backups, (3) monitoring routines for critical resources and processes, (4) hardware redundancy (networks, archives), and (5) development of a PACS quality control program.

  7. Combining Generalized Renewal Processes with Non-Extensive Entropy-Based q-Distributions for Reliability Applications

    Directory of Open Access Journals (Sweden)

    Isis Didier Lins

    2018-03-01

    The Generalized Renewal Process (GRP) is a probabilistic model for repairable systems that can represent the usual states of a system after a repair: as new, as old, or in a condition between new and old. It is often coupled with the Weibull distribution, widely used in the reliability context. In this paper, we develop novel GRP models based on probability distributions that stem from Tsallis' non-extensive entropy, namely the q-Exponential and the q-Weibull distributions. The q-Exponential and Weibull distributions can model decreasing, constant or increasing failure intensity functions. However, the power law behavior of the q-Exponential probability density function for specific parameter values is an advantage over the Weibull distribution when adjusting data containing extreme values. The q-Weibull probability distribution, in turn, can also fit data with bathtub-shaped or unimodal failure intensities in addition to the behaviors already mentioned. Therefore, the q-Exponential-GRP is an alternative to the Weibull-GRP model and the q-Weibull-GRP generalizes both. The method of maximum likelihood is used for parameter estimation by means of a particle swarm optimization algorithm, and Monte Carlo simulations are performed for the sake of validation. The proposed models and algorithms are applied to examples involving reliability-related data of complex systems and the obtained results suggest that GRP plus q-distributions are promising techniques for the analysis of repairable systems.
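
    For orientation, the sketch below implements the q-Weibull survival and density functions in the parameterization commonly used in the q-Weibull reliability literature, R(t) = [1 - (1 - q)(t/eta)^beta]^((2-q)/(1-q)), which collapses to the ordinary Weibull exp(-(t/eta)^beta) as q approaches 1. Parameter values are illustrative only and are not taken from the paper.

        import numpy as np

        def q_weibull_reliability(t, q, beta, eta):
            """Survival function of the q-Weibull distribution; for q -> 1 it
            reduces to the ordinary Weibull exp(-(t/eta)**beta)."""
            t = np.asarray(t, dtype=float)
            if abs(q - 1.0) < 1e-9:
                return np.exp(-(t / eta) ** beta)
            base = np.maximum(1.0 - (1.0 - q) * (t / eta) ** beta, 0.0)
            return base ** ((2.0 - q) / (1.0 - q))

        def q_weibull_pdf(t, q, beta, eta):
            """f(t) = (2-q)*(beta/eta)*(t/eta)**(beta-1)*[1-(1-q)*(t/eta)**beta]**(1/(1-q))."""
            t = np.asarray(t, dtype=float)
            if abs(q - 1.0) < 1e-9:
                return (beta / eta) * (t / eta) ** (beta - 1) * np.exp(-(t / eta) ** beta)
            base = np.maximum(1.0 - (1.0 - q) * (t / eta) ** beta, 0.0)
            return (2.0 - q) * (beta / eta) * (t / eta) ** (beta - 1) * base ** (1.0 / (1.0 - q))

        t = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
        print(q_weibull_reliability(t, q=1.3, beta=1.8, eta=2.0))  # heavier tail than Weibull
        print(q_weibull_reliability(t, q=1.0, beta=1.8, eta=2.0))  # plain Weibull limit

    The failure intensity h(t) = f(t)/R(t) built from these two functions can take the decreasing, constant, increasing, unimodal or bathtub shapes mentioned in the abstract, depending on the choice of q and beta.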

  8. [DESCRIPTION AND PRESENTATION OF THE RESULTS OF ELECTROENCEPHALOGRAM PROCESSING USING AN INFORMATION MODEL].

    Science.gov (United States)

    Myznikov, I L; Nabokov, N L; Rogovanov, D Yu; Khankevich, Yu R

    2016-01-01

    The paper proposes to apply the informational modeling of correlation matrices developed by I.L. Myznikov in the early 1990s to neurophysiological investigations, such as electroencephalogram recording and analysis and the coherence description of signals from electrodes on the head surface. The authors demonstrate information models built using the data from studies of inert gas inhalation by healthy human subjects. In the opinion of the authors, information models provide an opportunity to describe physiological processes with a high level of generalization. The procedure of presenting the EEG results holds great promise for broad application.

  9. The Healthcare Improvement Scotland evidence note rapid review process: providing timely, reliable evidence to inform imperative decisions on healthcare.

    Science.gov (United States)

    McIntosh, Heather M; Calvert, Julie; Macpherson, Karen J; Thompson, Lorna

    2016-06-01

    Rapid review has become widely adopted by health technology assessment agencies in response to demand for evidence-based information to support imperative decisions. Concern about the credibility of rapid reviews and the reliability of their findings has prompted a call for wider publication of their methods. In publishing this overview of the accredited rapid review process developed by Healthcare Improvement Scotland, we aim to raise awareness of our methods and advance the discourse on best practice. Healthcare Improvement Scotland produces rapid reviews called evidence notes using a process that has achieved external accreditation through the National Institute for Health and Care Excellence. Key components include a structured approach to topic selection, initial scoping, considered stakeholder involvement, streamlined systematic review, internal quality assurance, external peer review and updating. The process was introduced in 2010 and continues to be refined over time in response to user feedback and operational experience. Decision-makers value the responsiveness of the process and perceive it as being a credible source of unbiased evidence-based information supporting advice for NHSScotland. Many agencies undertaking rapid reviews are striving to balance efficiency with methodological rigour. We agree that there is a need for methodological guidance and that it should be informed by better understanding of current approaches and the consequences of different approaches to streamlining systematic review methods. Greater transparency in the reporting of rapid review methods is essential to enable that to happen.

  10. Dendritic Cells and Their Role in Allergy: Uptake, Proteolytic Processing and Presentation of Allergens

    Directory of Open Access Journals (Sweden)

    Piotr Humeniuk

    2017-07-01

    Dendritic cells (DCs) are the most important antigen presenting cells to activate naïve T cells, which results, in the case of Type 1 allergies, in a Type 2 helper T cell (Th2)-driven specific immune response towards allergens. So far, a number of different subsets of specialized DCs in different organs have been identified. In the recent past, methods to study the interaction of DCs with allergenic proteins, their different uptake and processing mechanisms followed by the presentation to T cells were developed. The following review aims to summarize the most important characteristics of DC subsets in the context of allergic diseases, and highlights the recent findings. These detailed studies can contribute to a better understanding of the pathomechanisms of allergic diseases and contribute to the identification of key factors to be addressed for therapeutic interventions.

  11. [Allocation of attentional resource and monitoring processes under rapid serial visual presentation].

    Science.gov (United States)

    Nishiura, K

    1998-08-01

    With the use of rapid serial visual presentation (RSVP), the present study investigated the cause of target intrusion errors and the functioning of monitoring processes. Eighteen students participated in Experiment 1, and 24 in Experiment 2. In Experiment 1, different target intrusion errors were found depending on the kind of letters used: romaji, hiragana, and kanji. In Experiment 2, stimulus set size and context information were manipulated in an attempt to explore the cause of post-target intrusion errors. Results showed that as stimulus set size increased, the post-target intrusion errors also increased, but contextual information did not affect the errors. Results concerning mean report probability indicated that increased allocation of attentional resource to the response-defining dimension was the cause of the errors. In addition, results concerning confidence ratings showed that monitoring of temporal and contextual information was extremely accurate, but this was not so for stimulus information. These results suggest that attentional resource is different from monitoring resource.

  12. A Review and Comparison of the Reliabilities of the MMPI-2, MCMI-III, and PAI Presented in Their Respective Test Manuals

    Science.gov (United States)

    Wise, Edward A.; Streiner, David L.; Walfish, Steven

    2010-01-01

    This article provides a review of the literature to determine the most frequently used personality tests. Based on this review, internal consistency and test-retest reliability coefficients from the test manuals for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2), Millon Clinical Multiaxial Inventory-III (MCMI-III), and Personality…

  13. An integral equation approach to the interval reliability of systems modelled by finite semi-Markov processes

    International Nuclear Information System (INIS)

    Csenki, A.

    1995-01-01

    The interval reliability for a repairable system which alternates between working and repair periods is defined as the probability of the system being functional throughout a given time interval. In this paper, a set of integral equations is derived for this dependability measure, under the assumption that the system is modelled by an irreducible finite semi-Markov process. The result is applied to the semi-Markov model of a two-unit system with sequential preventive maintenance. The method used for the numerical solution of the resulting system of integral equations is a two-point trapezoidal rule. The implementation uses the matrix computation package MATLAB on an Apple Macintosh SE/30. The numerical results are discussed and compared with those from simulation
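
    The paper derives a coupled system of integral equations for the semi-Markov interval reliability; as a generic illustration of the numerical idea only (a composite trapezoidal rule marching forward in time), the sketch below solves a single renewal-type Volterra equation y(t) = g(t) + the integral from 0 to t of k(t - s) y(s) ds. The exponential test case has the exact solution lambda*t, which gives a quick accuracy check; none of this is the paper's actual equation system.

        import math

        def solve_volterra(g, k, t_max, n):
            """Solve y(t) = g(t) + int_0^t k(t-s) y(s) ds on [0, t_max] using the
            composite trapezoidal rule on n+1 equally spaced grid points."""
            h = t_max / n
            t = [i * h for i in range(n + 1)]
            y = [g(t[0])]
            for i in range(1, n + 1):
                acc = 0.5 * k(t[i]) * y[0]                  # weight 1/2 at s = 0
                for j in range(1, i):
                    acc += k(t[i] - t[j]) * y[j]            # interior weights 1
                y.append((g(t[i]) + h * acc) / (1.0 - 0.5 * h * k(0.0)))
            return t, y

        # Renewal-equation test: M(t) = F(t) + int_0^t k(t-s) M(s) ds with
        # exponential(lam) inter-event times; the exact answer is M(t) = lam * t.
        lam = 0.5
        F = lambda u: 1.0 - math.exp(-lam * u)              # CDF as the forcing term
        f = lambda u: lam * math.exp(-lam * u)              # pdf as the kernel
        t, M = solve_volterra(F, f, t_max=10.0, n=200)
        print(f"trapezoidal M(10) = {M[-1]:.4f}  vs exact {lam * 10.0:.4f}")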

  14. Making and Unmaking the Endangered in India (1880-Present): Understanding Animal-Criminal Processes

    Directory of Open Access Journals (Sweden)

    Varun Sharma

    2015-01-01

    The concerns of the present paper emerge from the single basic question of whether the available histories of the tiger are comprehensive enough to enable an understanding of how this nodular species comprises/contests the power dynamics of the present. Starting with this basic premise, this paper retells a series of events which go to clarify that a nuanced understanding of the manner in which a species serves certain political purposes is not possible by tracking the animal alone. A discourse on endangerment has beginnings in the body and being of species that are remarkably cut off from the tiger (the elephant, birds, and the rhino, and man if we might add) and develops with serious implications for power, resource appropriation, and criminality, over a period of time, before more directly recruiting the tiger itself. If we can refer to this as the intermittent making and unmaking of the endangered, it is by turning to the enunciations of Michel Foucault that we try to canvas a series of events that can be described as animal-criminal processes. The role of such processes in the construction of endangerment, the structuring of space, and shared ideas of man-animal relations is further discussed in this paper.

  15. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It covers the definition and importance of reliability and the development of reliability engineering; the failure rate and the failure probability density function and their types; CFR and the exponential distribution; IFR and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and functional failure analysis by FTA.
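
    As a small taste of the quantities such a text builds on, the sketch below evaluates the Weibull reliability function, failure rate (hazard) and MTTF; beta = 1 gives the constant-failure-rate (exponential) case and beta > 1 an increasing failure rate. The numerical values are arbitrary illustrative assumptions.

        import math

        def reliability(t, beta, eta):
            """Weibull survival function R(t) = exp(-(t/eta)**beta)."""
            return math.exp(-((t / eta) ** beta))

        def failure_rate(t, beta, eta):
            """Weibull hazard h(t) = (beta/eta)*(t/eta)**(beta-1); constant for beta = 1."""
            return (beta / eta) * (t / eta) ** (beta - 1)

        def mttf(beta, eta):
            """Mean time to failure: eta * Gamma(1 + 1/beta)."""
            return eta * math.gamma(1.0 + 1.0 / beta)

        # Illustrative comparison of a CFR component and a wear-out (IFR) component.
        for beta in (1.0, 2.5):
            print(f"beta={beta}:  R(1000 h)={reliability(1000, beta, 2000):.3f}  "
                  f"h(1000 h)={failure_rate(1000, beta, 2000):.2e} per h  "
                  f"MTTF={mttf(beta, 2000):.0f} h")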

  16. Secure Reliable Processing Systems

    Science.gov (United States)

    1984-02-21

    be attainable in principle, the more difficult goal is to meet all of the above while still maintaining good performance within the framework of a well...managing the network, the user sees a conceptually simpler storage facility, composed merely of files, without machine boundaries, replicated copies

  17. MEMS Reliability Assurance Activities at JPL

    Science.gov (United States)

    Kayali, S.; Lawton, R.; Stark, B.

    2000-01-01

    An overview of Microelectromechanical Systems (MEMS) reliability assurance and qualification activities at JPL is presented, along with a discussion of the characterization of MEMS structures implemented on single crystal silicon, polycrystalline silicon, CMOS, and LIGA processes. Additionally, common failure modes and mechanisms affecting MEMS structures, including radiation effects, are discussed. Common reliability and qualification practices contained in the MEMS Reliability Assurance Guideline are also presented.

  18. Classification of hyperspectral imagery using MapReduce on a NVIDIA graphics processing unit (Conference Presentation)

    Science.gov (United States)

    Ramirez, Andres; Rahnemoonfar, Maryam

    2017-04-01

    A hyperspectral image provides a multidimensional figure rich in data, consisting of hundreds of spectral dimensions. Analyzing the spectral and spatial information of such an image with linear and non-linear algorithms will result in high computational time. In order to overcome this problem, this research presents a system using a MapReduce-Graphics Processing Unit (GPU) model that can help analyze a hyperspectral image through the usage of parallel hardware and a parallel programming model, which is simpler to handle compared to other low-level parallel programming models. Additionally, Hadoop was used as an open-source version of the MapReduce parallel programming model. This research compared classification accuracy and timing results between the Hadoop and GPU systems and tested them against the following test cases: a combined CPU and GPU test case, a CPU-only test case, and a test case where no dimensionality reduction was applied.

  19. Suicidality in pediatric bipolar disorder: predictor or outcome of family processes and mixed mood presentation?

    Science.gov (United States)

    Algorta, Guillermo Pérez; Youngstrom, Eric A; Frazier, Thomas W; Freeman, Andrew J; Youngstrom, Jennifer Kogos; Findling, Robert L

    2011-02-01

    Pediatric bipolar disorder (PBD) involves a potent combination of mood dysregulation and interpersonal processes, placing these youth at significantly greater risk of suicide. We examined the relationship between suicidal behavior, mood symptom presentation, family functioning, and quality of life (QoL) in youth with PBD. Participants were 138 youths aged 5-18 years presenting to outpatient clinics with DSM-IV diagnoses of bipolar I disorder (n=27), bipolar II disorder (n=18), cyclothymic disorder (n=48), and bipolar disorder not otherwise specified (n=45). Twenty PBD patients had lifetime suicide attempts, 63 had past or current suicide ideation, and 55 were free of suicide ideation and attempts. Attempters were older than nonattempters. Suicide ideation and attempts were linked to higher depressive symptoms, and rates were even higher in youths meeting criteria for the mixed specifier proposed for DSM-5. Both suicide ideation and attempts were associated with lower youth QoL and poorer family functioning. Parent effects (with suicidality treated as outcome) and child effects (where suicide was the predictor of poor family functioning) showed equally strong evidence in regression models, even after adjusting for demographics. These findings underscore the strong association between mixed features and suicidality in PBD, as well as the association between QoL, family functioning, and suicidality. It is possible that youths are not just a passive recipient of family processes, and their illness may play an active role in disrupting family functioning. Replication with longitudinal data and qualitative methods should investigate both child and parent effect models. © 2011 John Wiley and Sons A/S.

  20. Three-day dendritic cells for vaccine development: Antigen uptake, processing and presentation

    Directory of Open Access Journals (Sweden)

    Schendel Dolores J

    2010-09-01

    Background: Antigen-loaded dendritic cells (DC) are capable of priming naïve T cells and therefore represent an attractive adjuvant for vaccine development in anti-tumor immunotherapy. Numerous protocols have been described to date using different maturation cocktails and time periods for the induction of mature DC (mDC) in vitro. For clinical application, the use of mDC that can be generated in only three days saves on the costs of cytokines needed for large scale vaccine cell production and provides a method to produce cells within a standard work-week schedule in a GMP facility. Methods: In this study, we addressed the properties of antigen uptake, processing and presentation by monocyte-derived DC prepared in three days (3d mDC) compared with conventional DC prepared in seven days (7d mDC), which represent the most common form of DC used for vaccines to date. Results: Although they showed a reduced capacity for spontaneous antigen uptake, 3d mDC displayed a higher capacity for stimulation of T cells after loading with an extended synthetic peptide that requires processing for MHC binding, indicating they were more efficient at antigen processing than 7d DC. We found, however, that 3d DC were less efficient at expressing protein after introduction of in vitro transcribed RNA (ivtRNA) by electroporation, based on published procedures. This deficit was overcome by altering electroporation parameters, which led to improved protein expression and capacity for T cell stimulation using low amounts of ivtRNA. Conclusions: This new procedure allows 3d mDC to replace 7d mDC for use in DC-based vaccines that utilize long peptides, proteins or ivtRNA as sources of specific antigen.

  1. Emotional noun processing: an ERP study with rapid serial visual presentation.

    Directory of Open Access Journals (Sweden)

    Shengnan Yi

    Reading is an important part of our daily life, and rapid responses to emotional words have received a great deal of research interest. Our study employed rapid serial visual presentation to detect the time course of emotional noun processing using event-related potentials. We performed a dual-task experiment, where subjects were required to judge whether a given number was odd or even, and the category into which each emotional noun fit. In terms of P1, we found that there was no negativity bias for emotional nouns. However, emotional nouns elicited larger amplitudes in the N170 component in the left hemisphere than did neutral nouns. This finding indicated that in later processing stages, emotional words can be discriminated from neutral words. Furthermore, positive, negative, and neutral words were different from each other in the late positive complex, indicating that in the third stage, even different emotions can be discerned. Thus, our results indicate that in a three-stage model the latter two stages are more stable and universal.

  2. Emotional noun processing: an ERP study with rapid serial visual presentation.

    Science.gov (United States)

    Yi, Shengnan; He, Weiqi; Zhan, Lei; Qi, Zhengyang; Zhu, Chuanlin; Luo, Wenbo; Li, Hong

    2015-01-01

    Reading is an important part of our daily life, and rapid responses to emotional words have received a great deal of research interest. Our study employed rapid serial visual presentation to detect the time course of emotional noun processing using event-related potentials. We performed a dual-task experiment, where subjects were required to judge whether a given number was odd or even, and the category into which each emotional noun fit. In terms of P1, we found that there was no negativity bias for emotional nouns. However, emotional nouns elicited larger amplitudes in the N170 component in the left hemisphere than did neutral nouns. This finding indicated that in later processing stages, emotional words can be discriminated from neutral words. Furthermore, positive, negative, and neutral words were different from each other in the late positive complex, indicating that in the third stage, even different emotions can be discerned. Thus, our results indicate that in a three-stage model the latter two stages are more stable and universal.

  3. Human reliability in complex systems: an overview

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1976-07-01

    A detailed analysis is presented of the main conceptual background underlying the areas of human reliability and human error. The concept of error is examined and generalized to that of human reliability, and some of the practical and methodological difficulties of reconciling the different standpoints of the human factors specialist and the engineer are discussed. Following a survey of general reviews available on human reliability, quantitative techniques for the prediction of human reliability are considered. An in-depth critical analysis of the various quantitative methods is then presented, together with the data bank requirements for human reliability prediction. Reliability considerations in process control and nuclear plants, and also the areas of design, maintenance, testing and emergency situations, are discussed. The effects of stress on human reliability are analysed and methods of minimizing these effects discussed. Finally, a summary is presented and proposals for further research are set out. (author)

  4. Reliability through markets in Ontario : submission by the Independent Electricity Market Operator to the Minister of Energy's Consultation Process

    International Nuclear Information System (INIS)

    2003-01-01

    For the past five years, Ontario has invested $1 billion to restructure and open its electricity market to competition. In recent months, and in response to residential consumers' pricing concerns, the Independent Electricity Market Operator (IMO) transferred credits to local distribution companies allowing them to issue $75 rebates to all affected customers as of December 1, 2002, and to bill low-volume and other designated customers at a rate of 4.3 cents per kilowatt hour for the commodity portion of their bills. This report addresses the concern that price responsiveness will be lost for those parts of the market with fixed prices. It was noted that the reliability of the power system could be placed at risk if the range of customers with fixed prices is broadened. Fixed prices would also jeopardize the province's ability to attract new supply and enhance competition in the electricity sector. The IMO believes that price responsiveness in the wholesale market is crucial to the reliability of the electricity system and recommends that a plan for any additional fixed pricing should include a clearly defined phase-out over the period ending in 2006 as new supply comes on-line. The IMO emphasizes that a lack of price responsiveness in the market, particularly in peak energy demand periods, is equivalent to adding hundreds of MW to the load. The report presents lessons learned in other jurisdictions and highlights noteworthy considerations such as the market power mitigation agreement, improving competition, a phased-in approach, and demand side initiatives

  5. Use of RMPS to assess the reliability of Passive Safety Systems in CAREM-like reactor, past and present experiences. Second progress report

    International Nuclear Information System (INIS)

    Giménez, M; Mezio, F.; Zanocco, P.; Lorenzo, G.

    2011-01-01

    Conclusions: • RMPS is being used successfully to assess the fulfillment of design criteria from a probabilistic point of view, in the case of LOHS and LOCA, considering uncertainties in the reactor, in the passive safety systems and in the models as well. • It allows quantifying the probability of Event Tree headers related to systems whose demand depends on the accidental sequence evolution (i.e. the probability of demanding a safety valve in case of a LOHS with success of the PRHRS, but working under deteriorated conditions). • Functional reliability quantification is not yet used in the CAREM PSA (in Fault Trees or in Event Trees?)

  6. The Requirement for Acquisition and Logistics Integration: An Examination of Reliability Management Within the Marine Corps Acquisition Process

    National Research Council Canada - National Science Library

    Norcross, Marvin

    2002-01-01

    Combat system reliability is central to creating combat power, determining logistics supportability requirements and determining systems' total ownership costs, yet the Marine Corps typically monitors...

  7. Afterglow luminescence in sol-gel/Pechini grown oxide materials: persistence or phosphorescence process? (Conference Presentation)

    Science.gov (United States)

    Sontakke, Atul; Ferrier, Alban; Viana, Bruno

    2017-03-01

    Persistent luminescence and phosphorescence both yield afterglow luminescence, but they are completely different mechanisms. Persistent luminescence involves a slow thermal release of trapped electrons stored in defect states, whereas phosphorescence is caused by a triplet-to-singlet transition [1,2]. Many persistent luminescence phosphors are based on oxide inorganic hosts and exhibit long afterglow luminescence after ceasing the excitation. We observed intense and long afterglow luminescence in sol-gel/Pechini grown inorganic oxides, which as a first interpretation was thought to be due to a persistence mechanism. However, some of these materials do not exhibit defect trap centers, and a detailed investigation suggested that the afterglow is due to phosphorescence, not persistence. Phosphorescence is not common in inorganic solids, especially at room temperature, and is therefore usually misinterpreted as persistent luminescence [3]. Here we present a detailed methodology to distinguish phosphorescence from persistent luminescence in inorganic solids, and the process to harvest highly efficient long phosphorescence afterglow at room temperature. 1. Jian Xu, Setsuhisa Tanabe, Atul D. Sontakke, Jumpei Ueda, Appl. Phys. Lett. 107, 081903 (2015) 2. Sebastian Reineke, Marc A. Baldo, Scientific Reports, 4, 3797 (2014) 3. Pengchong Xue, Panpan Wang, Peng Chen, Boqi Yao, Peng Gong, Jiabao Sun, Zhenqi Zhang, Ran Lu, Chem. Sci. (2016) DOI: 10.1039/C5SC03739E

  8. Present Status and Future Prospects in Bulk Processing of HIGH-Tc Superconductors

    Science.gov (United States)

    Jin, S.; Chu, C. W.

    The following sections are included: * INTRODUCTION * HIGH SUPERCONDUCTING TRANSITION TEMPERATURE * HIGH CRITICAL CURRENT DENSITY * Grain Boundary Weak Links * Nature of Weak Links * Possible Processing Approaches for Weak Link Problem * Processing Techniques for Texture Formation * Flux Creep in HTSC * Desirable Pinning Defects * Processing for Flux Pinning Enhancement * PROSPECTS FOR BULK APPLICATIONS * Magnetic Field Generation * Energy Storage * Magnetic Shielding * Other Applications * CONCLUDING REMARKS * ACKNOWLEDGMENT * REFERENCES

  9. Long-Term Reliability of a Hard-Switched Boost Power Processing Unit Utilizing SiC Power MOSFETs

    Science.gov (United States)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Iannello, Christopher J.; Del Castillo, Linda Y.; Fitzpatrick, Fred D.; Mojarradi, Mohammad M.

    2016-01-01

    Silicon carbide (SiC) power devices have demonstrated many performance advantages over their silicon (Si) counterparts. As the inherent material limitations of Si devices are being swiftly realized, wide-band-gap (WBG) materials such as SiC have become increasingly attractive for high power applications. In particular, SiC power metal oxide semiconductor field effect transistors' (MOSFETs) high breakdown field tolerance, superior thermal conductivity and low-resistivity drift regions make these devices an excellent candidate for power dense, low loss, high frequency switching applications in extreme environment conditions. In this paper, a novel power processing unit (PPU) architecture is proposed utilizing commercially available 4H-SiC power MOSFETs from CREE Inc. A multiphase straight boost converter topology is implemented to supply up to 10 kilowatts full-scale. High Temperature Gate Bias (HTGB) and High Temperature Reverse Bias (HTRB) characterization is performed to evaluate the long-term reliability of both the gate oxide and the body diode of the SiC components. Finally, the susceptibility of the CREE SiC MOSFETs to damaging effects from heavy-ion radiation representative of the on-orbit galactic cosmic ray environment is explored. The results provide the baseline performance metrics of operation as well as demonstrate the feasibility of a hard-switched PPU in harsh environments.
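
    As background to the converter topology mentioned above, the sketch below evaluates the ideal continuous-conduction-mode boost relation V_out = V_in / (1 - D) and splits the input current across interleaved phases. The 100 V / 300 V / 10 kW / three-phase / 97% numbers are purely illustrative assumptions, not the operating point of the hardware described in the paper.

        def boost_duty_cycle(v_in, v_out):
            """Ideal CCM boost converter: V_out = V_in / (1 - D)  =>  D = 1 - V_in/V_out."""
            return 1.0 - v_in / v_out

        def input_currents(p_out, v_in, n_phases, efficiency=1.0):
            """Average total and per-phase input (inductor) current for an
            interleaved boost stage, derated by an assumed efficiency."""
            i_total = (p_out / efficiency) / v_in
            return i_total, i_total / n_phases

        d = boost_duty_cycle(100.0, 300.0)
        i_total, i_phase = input_currents(10e3, 100.0, n_phases=3, efficiency=0.97)
        print(f"duty cycle D = {d:.2f}, input current = {i_total:.0f} A, "
              f"per-phase current = {i_phase:.1f} A")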

  10. Reliable Design Versus Trust

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests FPGA internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  11. Multilevel processes and cultural adaptation: Examples from past and present small-scale societies

    OpenAIRE

    Reyes-García, V.; Balbo, A. L.; Gomez-Baggethun, E.; Gueze, M.; Mesoudi, A.; Richerson, P.; Rubio-Campillo, X.; Ruiz-Mallén, I.; Shennan, S.

    2016-01-01

    Cultural adaptation has become central in the context of accelerated global change with authors increasingly acknowledging the importance of understanding multilevel processes that operate as adaptation takes place. We explore the importance of multilevel processes in explaining cultural adaptation by describing how processes leading to cultural (mis)adaptation are linked through a complex nested hierarchy, where the lower levels combine into new units with new organizations, functions, and e...

  12. Simulation of adsorption process of benzene present in effluent of the petrochemical industry; Simulacao do processo de adsorcao do benzeno presente em efluentes da industria petroquimica

    Energy Technology Data Exchange (ETDEWEB)

    Luz, Adriana D. da; Mello, Josiane M.M. de; Souza, Antonio Augusto Ulson de; Souza, Selene M.A. Guelli Ulson de [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil); Silva, Adriano da [Universidade Comunitaria Regional de Chapeco (UNOCHAPECO), SC (Brazil)

    2008-07-01

    Adsorption processes have proven quite efficient for the removal of pollutants from liquid effluents, especially hydrocarbons that are difficult to remove, such as benzene. This work presents a phenomenological model that describes the process of benzene removal through adsorption in a fixed bed column, using activated carbon as the adsorbent. The model considers the internal and external mass transfer resistances of the adsorbent particle. The Finite Volume method is used in the discretization of the equations. The numerical results obtained through the simulation presented good correlation when compared with experimental data found in the literature, demonstrating that the developed computational code, together with the mathematical modeling, represents an important tool for the design of adsorption columns. (author)
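
    To make this kind of model concrete, the sketch below integrates a deliberately simplified fixed-bed breakthrough model: first-order upwind finite volumes in space, explicit Euler in time, a single lumped linear-driving-force mass-transfer coefficient and a linear isotherm. The model in the paper resolves internal and external particle resistances separately, and every parameter value here is a made-up, unit-consistent placeholder rather than data for benzene on activated carbon.

        import numpy as np

        # Made-up, unit-consistent parameters (placeholders only):
        L, nz = 0.1, 50                  # bed length [m], number of finite volumes
        dz = L / nz
        eps, rho_p = 0.4, 500.0          # bed porosity [-], particle density [kg/m3]
        u = 1.0e-3                       # interstitial fluid velocity [m/s]
        k_ldf, K = 0.01, 0.002           # lumped LDF coefficient [1/s], linear isotherm [m3/kg]
        c_in, dt, n_steps = 1.0, 0.5, 4000   # feed concentration [g/m3], time step [s], steps

        c = np.zeros(nz)                 # fluid-phase concentration in each cell [g/m3]
        q = np.zeros(nz)                 # adsorbed loading in each cell [g/kg]

        for step in range(1, n_steps + 1):
            upstream = np.concatenate(([c_in], c[:-1]))   # first-order upwind face values
            dqdt = k_ldf * (K * c - q)                    # linear-driving-force uptake rate
            c = c + dt * (-u * (c - upstream) / dz
                          - (1.0 - eps) / eps * rho_p * dqdt)
            q = q + dt * dqdt
            if step % 1000 == 0:                          # crude breakthrough curve readout
                print(f"t = {step * dt:6.0f} s   c_out/c_in = {c[-1] / c_in:.3f}")

    An implicit or higher-order finite-volume scheme, as typically used in work like this, would reduce the numerical dispersion that the first-order scheme above introduces.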

  13. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on reliability (what it is, why it is needed, and how it is achieved and measured), the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks are mentioned for different industries in the next chapter. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the next chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and the IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European reliability data system, ERDS, and the development of a large data bank come next. The last three chapters look at 'Reliability data banks - friend, foe or a waste of time?' and future developments. (UK)

  14. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
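
    A minimal version of the fault-tree approach mentioned in the report can be sketched as follows: independent components with assumed constant failure rates are combined through AND/OR gates to give the probability of the top event over a mission time. The failure rates and the tree structure below are hypothetical and are not taken from the report.

        import math
        from functools import reduce

        def failure_prob(lam, t_hours):
            """P(component fails by t) for a constant failure rate: 1 - exp(-lam*t)."""
            return 1.0 - math.exp(-lam * t_hours)

        def or_gate(*probs):
            """Top event occurs if ANY independent input event occurs."""
            return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

        def and_gate(*probs):
            """Top event occurs only if ALL independent input events occur."""
            return reduce(lambda acc, p: acc * p, probs, 1.0)

        # Hypothetical converter fault tree, failure rates in failures per hour.
        t = 8760.0                                   # one-year mission time
        switch_module = failure_prob(2e-6, t)
        gate_driver   = failure_prob(1e-6, t)
        dc_link_cap   = failure_prob(5e-7, t)
        fan_a = fan_b = failure_prob(1e-5, t)

        cooling_loss = and_gate(fan_a, fan_b)        # redundant fans: both must fail
        system_fail  = or_gate(switch_module, gate_driver, dc_link_cap, cooling_loss)
        print(f"P(system failure within one year) ~ {system_fail:.4f}")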

  15. Present state and progress of industrial electron processing systems in Japan

    International Nuclear Information System (INIS)

    Sakamoto, I.; Mizusawa, K.

    1983-01-01

    A summary is given of the state of utilisation of electron processing systems in Japan, mainly for (1) cross-linking of wire and cable insulator, (2) heat shrinkable tube and sheet, (3) foamed polyethylene, and (4) curing of paint coats. Details are given of some of the electron processing systems. (U.K.)

  16. Problematics of Reliability of Road Rollers

    Science.gov (United States)

    Stawowiak, Michał; Kuczaj, Mariusz

    2018-06-01

    This article addresses the reliability of road rollers used by a selected roadworks company. Information is presented on the method of road roller servicing and on how this servicing affects the reliability of the rollers. Attention was paid to the implementation of the maintenance plan with regard to the machines' operational time. The reliability of the road rollers was analyzed by determining and interpreting readiness coefficients.

  17. Some present - day considerations about the reading processes, comprehension and text construction

    Directory of Open Access Journals (Sweden)

    Noraima García Valdés

    2013-09-01

    This article deals with how reading constitutes a challenge in secondary school today, how to make the reading processes more efficient, and how the processes of text comprehension and text construction are essential to developing reading skills. The better this component is developed in the lesson, the greater the success in speaking and writing; in this way the educator will contribute to developing communicative competence in the oral and written forms of the language.

  18. Response process and test–retest reliability of the Context Assessment for Community Health tool in Vietnam

    Directory of Open Access Journals (Sweden)

    Duong M. Duc

    2016-06-01

    Background: The recently developed Context Assessment for Community Health (COACH) tool aims to measure aspects of the local healthcare context perceived to influence knowledge translation in low- and middle-income countries. The tool measures eight dimensions (organizational resources, community engagement, monitoring services for action, sources of knowledge, commitment to work, work culture, leadership, and informal payment) through 49 items. Objective: The study aimed to explore the understanding and stability of the COACH tool among health providers in Vietnam. Design: To investigate the response process, think-aloud interviews were undertaken with five community health workers, six nurses and midwives, and five physicians. Identified problems were classified according to Conrad and Blair's taxonomy and grouped according to an estimation of the magnitude of the problem's effect on the response data. Further, the stability of the tool was examined using a test–retest survey among 77 respondents. The reliability was analyzed for items (intraclass correlation coefficient (ICC) and percent agreement) and dimensions (ICC and Bland–Altman plots). Results: In general, the think-aloud interviews revealed that the COACH tool was perceived as clear, well organized, and easy to answer. Most items were understood as intended. However, seven prominent problems in the items were identified and the content of three dimensions was perceived to be of a sensitive nature. In the test–retest survey, two-thirds of the items and seven of eight dimensions were found to have an ICC agreement ranging from moderate to substantial (0.5–0.7), demonstrating that the instrument has an acceptable level of stability. Conclusions: This study provides evidence that the Vietnamese translation of the COACH tool is generally perceived to be clear and easy to understand and has acceptable stability. There is, however, a need to rephrase and add generic examples to clarify some items and to further review items with low ICC.

  19. Response process and test-retest reliability of the Context Assessment for Community Health tool in Vietnam.

    Science.gov (United States)

    Duc, Duong M; Bergström, Anna; Eriksson, Leif; Selling, Katarina; Thi Thu Ha, Bui; Wallin, Lars

    2016-01-01

    The recently developed Context Assessment for Community Health (COACH) tool aims to measure aspects of the local healthcare context perceived to influence knowledge translation in low- and middle-income countries. The tool measures eight dimensions (organizational resources, community engagement, monitoring services for action, sources of knowledge, commitment to work, work culture, leadership, and informal payment) through 49 items. The study aimed to explore the understanding and stability of the COACH tool among health providers in Vietnam. To investigate the response process, think-aloud interviews were undertaken with five community health workers, six nurses and midwives, and five physicians. Identified problems were classified according to Conrad and Blair's taxonomy and grouped according to an estimation of the magnitude of the problem's effect on the response data. Further, the stability of the tool was examined using a test-retest survey among 77 respondents. The reliability was analyzed for items (intraclass correlation coefficient (ICC) and percent agreement) and dimensions (ICC and Bland-Altman plots). In general, the think-aloud interviews revealed that the COACH tool was perceived as clear, well organized, and easy to answer. Most items were understood as intended. However, seven prominent problems in the items were identified and the content of three dimensions was perceived to be of a sensitive nature. In the test-retest survey, two-thirds of the items and seven of eight dimensions were found to have an ICC agreement ranging from moderate to substantial (0.5-0.7), demonstrating that the instrument has an acceptable level of stability. This study provides evidence that the Vietnamese translation of the COACH tool is generally perceived to be clear and easy to understand and has acceptable stability. There is, however, a need to rephrase and add generic examples to clarify some items and to further review items with low ICC.
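
    The agreement statistics named in these two entries are easy to reproduce for a single item and a single dimension score; the sketch below computes percent agreement for an item and the Bland-Altman bias with 95% limits of agreement for a summed dimension score (the ICC itself would come from a two-way ANOVA, which is omitted here). All responses are fabricated for illustration and are not the study's data.

        import numpy as np

        def percent_agreement(test, retest):
            """Share of respondents giving the identical item response on both occasions."""
            return float(np.mean(np.asarray(test) == np.asarray(retest)))

        def bland_altman(test, retest):
            """Bias and 95% limits of agreement for a score measured twice."""
            d = np.asarray(retest, float) - np.asarray(test, float)
            bias, sd = d.mean(), d.std(ddof=1)
            return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

        # Fabricated 5-point item and summed dimension score for 10 respondents.
        item_t1 = [4, 3, 5, 2, 4, 4, 3, 5, 2, 4]
        item_t2 = [4, 3, 4, 2, 4, 5, 3, 5, 2, 4]
        dim_t1  = [18, 15, 22, 12, 19, 20, 14, 23, 11, 18]
        dim_t2  = [17, 16, 22, 13, 18, 21, 14, 22, 12, 19]

        print("item percent agreement:", percent_agreement(item_t1, item_t2))
        print("dimension bias and limits of agreement:", bland_altman(dim_t1, dim_t2))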

  20. Laser induced photochemical and photophysical processes in fuel reprocessing: present scenario and future prospects

    International Nuclear Information System (INIS)

    Bhowmick, G.K.; Sarkar, S.K.; Ramanujam, A.

    2001-01-01

    State-of-the-art lasers can meet the very stringent requirements of nuclear technology and hence find application in varied areas of the nuclear fuel cycle. Here, we discuss two specific applications in nuclear fuel reprocessing, namely (a) add-on photochemical modifications of the PUREX process, where photochemical reactors replace the chemical reactors, and (b) fast, matrix-independent, sensitive laser analytical techniques. The photochemical modifications based on laser induced valency adjustment offer efficient separation, easy maintenance and an overall reduction in the volume of radioactive waste. The analytical technique of time resolved laser induced fluorescence (TRLIF) has several attractive features such as excellent sensitivity, element selectivity, and the capability of on-line remote process monitoring. For optically opaque solutions, optical excitation is detected by its conversion into thermal energy by non-radiative relaxation processes using photo-thermal spectroscopic techniques. (author)

  1. Present status of theoretical understanding of charge changing processes at low beam energies

    OpenAIRE

    Swami, D. K.; Nandi, T.

    2017-01-01

    A model for the evaluation of charge-state distributions of fast heavy ions in solid targets has been developed since the late eighties in the form of the ETACHA code. It has been updated from time to time to handle a larger number of electrons and non-perturbative processes. The calculation approach of the most recent version, which is formulated to handle the non-perturbative processes better, differs from that of the earlier ones. However, the experimental results for projectiles with up to 28 electrons can be compa...

  2. Proof tests on reliability

    International Nuclear Information System (INIS)

    Mishima, Yoshitsugu

    1983-01-01

    In order to obtain public understanding of nuclear power plants, tests should be carried out to prove the reliability and safety of present LWR plants. For example, the aseismicity of nuclear power plants must be verified by using a large-scale earthquake simulator. Reliability testing began in fiscal 1975, and the proof tests on steam generators and on PWR support and flexure pins against stress corrosion cracking have already been completed; the results have been highly appreciated internationally. The capacity factor of nuclear power plant operation in Japan rose to 80% in the summer of 1983, and considering the period of regular inspection, this means operation at almost full capacity. Japanese LWR technology has now risen to the top place in the world after having overcome its defects. The significance of the reliability tests is to secure functioning until the age limit is reached, to confirm the correct forecast of the deterioration process, to confirm the effectiveness of remedies to defects, and to confirm the accuracy of predicting the behavior of facilities. The reliability of nuclear valves, fuel assemblies, heat-affected zones in welding, reactor cooling pumps, and electric instruments has been tested or is being tested. (Kako, I.)

  3. Chemical contamination of groundwater at gas processing plants - the past, the present and the future

    International Nuclear Information System (INIS)

    Wrubleski, R.M.; Drury, C.R.

    1997-01-01

    The chemicals used to remove the sour gas components (primarily H2S) from raw gas in the sour gas sweetening processes were discussed. The chemicals, mainly amines and physical absorbents, have been found as contaminants in soil and groundwater at several sites. Studies have been conducted to evaluate the behaviour of some of these chemicals. In particular, the contamination by sulfolane and diisopropanolamine (DIPA), which originate from the Sulfinol(R) sweetening process, was discussed. Prior to the mid 1970s, wastes from these processes were disposed of on site in landfills that were not engineered for groundwater protection. By the mid 1970s the landfills were closed by capping. Many of the gas plant sites were located on elevated terrain where a hydraulic gradient was available for downward movement of groundwater and any chemicals contained within. Contaminant movement in fractured bedrock has also affected drinking water. Groundwater monitoring began in the mid 1980s to address environmental concerns, focusing on monitoring for potability, metals and organics. It was discovered that most of the plants using the Sulfinol process had groundwater contaminated with sulfolane at levels ranging from 1 ppm to over 800 ppm. A research project was developed to determine the soil interaction parameters and biodegradation behaviour of pure sulfolane and DIPA to provide data in order to predict plume migration. Ecotoxicity tests were also performed to verify the toxicity effects of sulfolane, DIPA, reclaimer bottoms and observed biodegradation metabolites on bio-organisms and aquatic life in aquatic receptors. 3 refs., 1 tab., 1 fig

  4. How Do Turkish Middle School Science Coursebooks Present the Science Process Skills?

    Science.gov (United States)

    Aslan, Oktay

    2015-01-01

    An important objective in science education is the acquisition of science process skills (SPS) by the students. Therefore, science coursebooks, among the main resources of elementary science curricula, are to convey accurate SPS. This study is a qualitative study based on the content analysis of the science coursebooks used at middle schools. In…

  5. Using Processing Instruction for the Acquisition of English Present Perfect of Filipinos

    Science.gov (United States)

    Erfe, Jonathan P.; Lintao, Rachelle B.

    2012-01-01

    This is an experimental study on the relative effects of Van Patten's Processing Instruction (PI) (1996, 2002), a "psycholinguistically-motivated" intervention in teaching second-language (L2) grammar, on young-adult Filipino learners of English. A growing body of research on this methodological alternative, which establishes…

  6. Study types and reliability of Real World Evidence compared with experimental evidence used in Polish reimbursement decision-making processes.

    Science.gov (United States)

    Wilk, N; Wierzbicka, N; Skrzekowska-Baran, I; Moćko, P; Tomassy, J; Kloc, K

    2017-04-01

    The aim of this study was to identify the relationship between Real World Evidence (RWE) and experimental evidence (EE), and the impact of each, in Polish decision-making processes for drugs from selected Anatomical Therapeutic Chemical (ATC) groups. Descriptive study. A detailed analysis was performed for 58 processes from five ATC code groups in which RWE for effectiveness, or effectiveness and safety, was cited in documents of the Agency for Health Technology Assessment and Tariff System (AOTMiT) published between January 2012 and September 2015: Verification Analysis of AOTMiT, Statement of the Transparency Council of AOTMiT, and Recommendation of the President of AOTMiT. In 62% of the cases, RWE supported the EE and confirmed its main conclusions. The majority of studies in the EE group were RCTs (97%), and the RWE group included mainly cohort studies (89%). There were more studies without a control group within RWE than within the EE group (10% vs 1%). Our results showed that EE is more often assessed by AOTMiT using the Jadad, NICE or NOS scale than RWE (93% vs 48%). When the best evidence within a given decision-making process is analysed, half of the RWE and two-thirds of the EE are considered high-quality evidence. RWE plays an important role in the decision-making processes on public funding of drugs in Poland, contributing nearly half (45%) of all the evidence considered. There are processes in which the proportion of RWE is dominant, with one process showing RWE as the only evidence presented. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  7. Present state and prospects of authorized radiation processing of food in Hungary

    International Nuclear Information System (INIS)

    Farkas, J.

    1974-01-01

    The results yielded by the radiation processing of stored potato, onion and champignon are given. The irradiations were performed with a laboratory 60Co γ-source of 15 kCi in the experimental plant of the Central Research Institute of Food Industry (Budapest). The irradiated products were readily saleable. The profitability of the applied methods is treated fully. A model program is given for the irradiation process. The production costs of a plant with a 60Co source of 200 kCi activity have been calculated, and the economic results obtainable with the above program have also been estimated. If a capacity utilization of 30% is considered and 22,000 t of potato and 5,520 t of onion are treated under the given program, the index of return is 5.1 years. (K.A.)

  8. Past and Present of the s Process: a Nuclear Physicist's View

    International Nuclear Information System (INIS)

    Kaeppeler, F.

    2008-01-01

    The history of nuclear physics data for s-process calculations is briefly reviewed, with emphasis on the current status of the neutron capture cross sections. The remaining challenges are illustrated and discussed in the light of new or optimized methods and state-of-the-art facilities, indicating the potential for accurate measurements and the possibility of studying cross sections of radioactive isotopes. These opportunities will be considerably enriched by the enormous improvements provided by upcoming new facilities

  9. Chemical contamination of groundwater at gas processing plants - the past, the present and the future

    Energy Technology Data Exchange (ETDEWEB)

    Wrubleski, R.M.; Drury, C.R. [Shell Canada Ltd., Calgary, AB (Canada). Calgary Research Centre; Sevigny, J.H. [Komex Consultants Ltd., Calgary, AB (Canada)

    1997-12-31

    The chemicals used to remove the sour gas components (primarily H2S) from raw gas in the sour gas sweetening processes were discussed. The chemicals, mainly amines and physical absorbents, have been found as contaminants in soil and groundwater at several sites. Studies have been conducted to evaluate the behaviour of some of these chemicals. In particular, the contamination by sulfolane and diisopropanolamine (DIPA), which originate from the Sulfinol(R) sweetening process, was discussed. Prior to the mid 1970s, wastes from these processes were disposed of on site in landfills that were not engineered for groundwater protection. By the mid 1970s the landfills were closed by capping. Many of the gas plant sites were located on elevated terrain where a hydraulic gradient was available for downward movement of groundwater and any chemicals contained within. Contaminant movement in fractured bedrock has also affected drinking water. Groundwater monitoring began in the mid 1980s to address environmental concerns, focusing on monitoring for potability, metals and organics. It was discovered that most of the plants using the Sulfinol process had groundwater contaminated with sulfolane at levels ranging from 1 ppm to over 800 ppm. A research project was developed to determine the soil interaction parameters and biodegradation behaviour of pure sulfolane and DIPA to provide data in order to predict plume migration. Ecotoxicity tests were also performed to verify the toxicity effects of sulfolane, DIPA, reclaimer bottoms and observed biodegradation metabolites on bio-organisms and aquatic life in aquatic receptors. 3 refs., 1 tab., 1 fig.

  10. Ionic-Liquid-Mediated Extraction and Separation Processes for Bioactive Compounds: Past, Present, and Future Trends.

    Science.gov (United States)

    Ventura, Sónia P M; E Silva, Francisca A; Quental, Maria V; Mondal, Dibyendu; Freire, Mara G; Coutinho, João A P

    2017-05-24

    Ionic liquids (ILs) have been proposed as promising media for the extraction and separation of bioactive compounds from the most diverse origins. This critical review offers a compilation of the main results achieved by the use of ionic-liquid-based processes in the extraction and separation/purification of a large range of bioactive compounds (including small organic extractable compounds from biomass, lipids and other hydrophobic compounds, proteins, amino acids, nucleic acids, and pharmaceuticals). ILs have been studied as solvents, cosolvents, cosurfactants, electrolytes, and adjuvants, as well as used in the creation of IL-supported materials for separation purposes. The IL-based processes hitherto reported, such as IL-based solid-liquid extractions, IL-based liquid-liquid extractions, IL-modified materials, and IL-based crystallization approaches, are reviewed here and compared in terms of extraction and separation performance. The key accomplishments and future challenges of the field are discussed, with particular emphasis on the major gaps found within the IL community dedicated to separation processes, and some steps to overcome the current limitations are suggested.

  11. Predictive information processing is a fundamental learning mechanism present in early development: evidence from infants.

    Science.gov (United States)

    Trainor, Laurel J

    2012-02-01

    Evidence is presented that predictive coding is fundamental to brain function and present in early infancy. Indeed, mismatch responses to unexpected auditory stimuli are among the earliest robust cortical event-related potential responses, and have been measured in young infants in response to many types of deviation, including in pitch, timing, and melodic pattern. Furthermore, mismatch responses change quickly with specific experience, suggesting that predictive coding reflects a powerful, early-developing learning mechanism. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Reliability concepts applied to cutting tool change time

    Energy Technology Data Exchange (ETDEWEB)

    Patino Rodriguez, Carmen Elena, E-mail: cpatino@udea.edu.c [Department of Industrial Engineering, University of Antioquia, Medellin (Colombia); Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil); Francisco Martha de Souza, Gilberto [Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil)

    2010-08-15

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.

  13. Reliability concepts applied to cutting tool change time

    International Nuclear Information System (INIS)

    Patino Rodriguez, Carmen Elena; Francisco Martha de Souza, Gilberto

    2010-01-01

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.
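    Both records above describe the same series-configuration model: the reliability of producing a part is the product of the reliabilities of the sequential operations, each evaluated at the cumulative cutting time of its tool, and the tool is changed when that product falls below a target. The sketch below is an illustrative reconstruction of that logic under assumed Weibull tool-life parameters; it is not the authors' algorithm or data.

```python
import numpy as np

def weibull_reliability(t, beta, eta):
    """Weibull survival function R(t) = exp(-(t/eta)^beta)."""
    return np.exp(-(t / eta) ** beta)

# assumed tool-life parameters (shape beta, scale eta in minutes) and cutting time per part
operations = {
    "turning":  {"beta": 2.2, "eta": 90.0, "t_per_part": 1.8},
    "drilling": {"beta": 1.8, "eta": 60.0, "t_per_part": 0.9},
}
R_MIN = 0.90                                    # minimum acceptable reliability per part
cum_time = {op: 0.0 for op in operations}       # accumulated cutting time of each tool

for part in range(1, 201):
    # conditional reliability of each tool surviving this part's cut, given survival so far
    r_ops = []
    for op, p in operations.items():
        t0, t1 = cum_time[op], cum_time[op] + p["t_per_part"]
        r_ops.append(weibull_reliability(t1, p["beta"], p["eta"]) /
                     weibull_reliability(t0, p["beta"], p["eta"]))
    r_system = float(np.prod(r_ops))            # series configuration
    if r_system < R_MIN:
        print(f"part {part}: system reliability {r_system:.3f} < {R_MIN} -> change tools first")
        cum_time = {op: 0.0 for op in operations}
    for op, p in operations.items():
        cum_time[op] += p["t_per_part"]
```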

  14. Electronic Slideshow Presentations in the Higher Education Teaching and Learning Process

    Science.gov (United States)

    Ferreira, Carlos Miguel; Santos, Ana Isabel; Serpa, Sandro

    2018-01-01

    The use of electronic slide presentations (ESP), usually through PowerPoint or Prezi software, has become widespread in higher education and is part of the expectations and perceptions of both teachers and students of how a successful and quality class should be. Is this dissemination of ESP use justified by the pedagogical quality fostered in…

  15. 23 CFR 636.111 - Can oral presentations be used during the procurement process?

    Science.gov (United States)

    2010-04-01

    ... making the source selection decision. You may decide the appropriate method and level of detail for the record (e.g., videotaping, audio tape recording, written record, contracting agency notes, copies of offeror briefing slides or presentation notes). A copy of the record should be placed in the contract file...

  16. A Time-Variant Reliability Model for Copper Bending Pipe under Seawater-Active Corrosion Based on the Stochastic Degradation Process

    Directory of Open Access Journals (Sweden)

    Bo Sun

    2018-03-01

    Full Text Available In the degradation process, the randomness and multiplicity of variables are difficult to describe with mathematical models. However, they are common in engineering and cannot be neglected, so it is necessary to study this issue in depth. In this paper, the copper bending pipe in seawater piping systems is taken as the object of analysis, and the time-variant reliability is calculated by solving the interference between limit strength and maximum stress. We performed degradation and tensile experiments on the copper material and obtained the limit strength at each time point. In addition, degradation experiments on the copper bending pipe were carried out and the thickness at each time point was obtained; the response of the maximum stress was then calculated by simulation. Further, with the help of a Monte Carlo method that we propose, the time-variant reliability of the copper bending pipe was calculated based on the stochastic degradation process and interference theory. Compared with traditional methods and verified against maintenance records, the results show that the time-variant reliability model based on the stochastic degradation process proposed in this paper has better applicability in reliability analysis, and it allows more convenient and accurate prediction of the replacement cycle of copper bending pipe under seawater-active corrosion.
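    The core computation here is stress-strength interference: at each time point, reliability is the probability that the (degrading) limit strength still exceeds the (growing) maximum stress. The sketch below illustrates that idea with a simple Monte Carlo estimate under assumed, hypothetical distributions for the initial values and degradation rates; it is not the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
years = np.arange(0, 21, 5)

# hypothetical distributions (MPa, MPa/yr) -- placeholders, not the paper's fitted values
S0 = rng.normal(300.0, 15.0, N)               # initial limit strength
k_deg = rng.lognormal(np.log(2.0), 0.3, N)    # strength degradation rate
L0 = rng.normal(180.0, 10.0, N)               # initial maximum stress
k_str = rng.lognormal(np.log(1.5), 0.3, N)    # stress growth rate as the wall thins

for t in years:
    strength = S0 - k_deg * t                 # degraded limit strength at time t
    stress = L0 + k_str * t                   # maximum stress response at time t
    reliability = np.mean(strength > stress)  # P(strength > stress) by Monte Carlo
    print(f"t = {t:2d} yr   R(t) ~ {reliability:.4f}")
```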

  17. Study of resonant processes in plasmonic nanostructures for sensor applications (Conference Presentation)

    Science.gov (United States)

    Pirunčík, Jiří; Kwiecien, Pavel; Fiala, Jan; Richter, Ivan

    2017-05-01

    This contribution focuses on numerical studies of resonant processes in individual plasmonic nanostructures, with attention given in particular to rectangular nanoparticles and the concomitant localized surface plasmon resonance processes. Relevant models for the description and analysis of localized surface plasmon resonance are introduced, in particular the quasistatic approximation, Mie theory and, especially, a generalized (quasi)analytical approach for treating rectangularly shaped nanostructures. The parameters influencing the resonant behavior of nanoparticles are analyzed, with special interest in morphology and sensor applications. Results acquired with the Lumerical FDTD Solutions software, using the finite-difference time-domain simulation method, are shown and discussed. Simulations were mostly performed for selected nanostructures composed of finite rectangular nanowires with square cross-sections. A systematic analysis is made for single nanowires of varying length, parallel pairs of nanowires with varying gap (cut-wires), and selected dolmen structures with a varying gap between one nanowire located transversely with respect to a parallel pair of nanowires (in both in-plane and out-of-plane arrangements). The dependence of the resonant peaks of the cross-section spectra (absorption, scattering, extinction) and their tunability via suitable structuring and morphology changes are the primary subjects of investigation. These studies are then followed by an analysis of the effect of periodic arrangements. The results may be useful for possible sensor applications.

  18. Comparing Proteolytic Fingerprints of Antigen-Presenting Cells during Allergen Processing.

    Science.gov (United States)

    Hofer, Heidi; Weidinger, Tamara; Briza, Peter; Asam, Claudia; Wolf, Martin; Twaroch, Teresa E; Stolz, Frank; Neubauer, Angela; Dall, Elfriede; Hammerl, Peter; Jacquet, Alain; Wallner, Michael

    2017-06-08

    Endolysosomal processing has a critical influence on immunogenicity as well as immune polarization of protein antigens. In industrialized countries, allergies affect around 25% of the population. For the rational design of protein-based allergy therapeutics for immunotherapy, a good knowledge of T cell-reactive regions on allergens is required. Thus, we sought to analyze endolysosomal degradation patterns of inhalant allergens. Four major allergens from ragweed, birch, as well as house dust mites were produced as recombinant proteins. Endolysosomal proteases were purified by differential centrifugation from dendritic cells, macrophages, and B cells, and combined with allergens for proteolytic processing. Thereafter, endolysosomal proteolysis was monitored by protein gel electrophoresis and mass spectrometry. We found that the overall proteolytic activity of specific endolysosomal fractions differed substantially, whereas the degradation patterns of the four model allergens obtained with the different proteases were extremely similar. Moreover, previously identified T cell epitopes were assigned to endolysosomal peptides and indeed showed a good overlap with known T cell epitopes for all four candidate allergens. Thus, we propose that the degradome assay can be used as a predictor to determine antigenic peptides as potential T cell epitopes, which will help in the rational design of protein-based allergy vaccine candidates.

  19. Private Property Rights and Compulsory Acquisition Process in Nigeria: the Past, Present and Future

    Directory of Open Access Journals (Sweden)

    Akintunde OTUBU

    2012-11-01

    Full Text Available Objectives: A property right is the exclusive authority to determine how a resource is used, whether that resource is owned by government or by individuals. In the context of land, it is the authority of the landowner to determine its use or otherwise. On the other hand, compulsory acquisition is the process by which government obtains land from private owners for development purposes in the best interest of the community. These diametrically opposed concepts of property rights and compulsory acquisition are reconciled through the payment of compensation for the extinguishment of private property rights. Implications: In Nigeria, these two concepts have a history of mutual conflict, resulting in congruous resolutions most of the time, until the introduction of the Land Use Act 1978. With the coming of the Act, the pendulum has tilted in favor of compulsory acquisition to the detriment of private property rights, as compensation fails to assuage the loss occasioned by expropriation. Value: The paper explores the dichotomy between private property rights and compulsory acquisition in Nigeria over the last 50 years and submits that the process under the Land Use Act changed the equilibrium that existed between these two concepts and produced a skewed and unfavorable result, to the detriment of private property rights and the national economy. It finally proposes a new, equitable arrangement to resolve the quagmire.

  20. Analysis of Present Day Election Processes vis-à-vis Elections Through Blockchain Technology

    OpenAIRE

    Hegadekatti, Kartik

    2017-01-01

    Currently, democracy is realised through representatives elected by the people. These elections are periodic activities. They involve the expenditure of large amounts of manpower, money, time and other resources. It is important to note that during an election, the administration and day-to-day lives of people are affected as election activities take centre stage. Present-day elections are susceptible to influence, where voters can possibly be intimidated into voting against their will. In many instances, ...

  1. Calculating system reliability with SRFYDO

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, Jerome [Los Alamos National Laboratory; Anderson - Cook, Christine M [Los Alamos National Laboratory; Klamann, Richard M [Los Alamos National Laboratory

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
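    SRFYDO's full model (age and usage covariates, multiple system versions, Bayesian combination of flight, component and expert data) is not reproduced here, but the basic step of propagating component-level uncertainty to a series-system reliability estimate can be sketched as follows. The snippet draws from Beta posteriors based on hypothetical pass/fail component test counts; all names and numbers are illustrative assumptions, not SRFYDO output.

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical component test data: (successes, failures)
components = {"valve": (48, 2), "pump": (95, 5), "controller": (29, 1)}

n_draws = 50_000
system = np.ones(n_draws)
for name, (s, f) in components.items():
    # Beta(1 + s, 1 + f) posterior for each component under a uniform prior
    system *= rng.beta(1 + s, 1 + f, n_draws)   # series system: product of reliabilities

lo, hi = np.percentile(system, [2.5, 97.5])
print(f"series-system reliability: mean = {system.mean():.3f}, 95% interval = ({lo:.3f}, {hi:.3f})")
```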

  2. Reliability Analysis Based on a Jump Diffusion Model with Two Wiener Processes for Cloud Computing with Big Data

    Directory of Open Access Journals (Sweden)

    Yoshinobu Tamura

    2015-06-01

    Full Text Available At present, many cloud services are managed using open source software, such as OpenStack and Eucalyptus, because of the unified management of data, cost reduction, quick delivery and work savings. The operation phase of cloud computing has unique features, such as the provisioning processes, the network-based operation and the diversity of data, because the operation phase of cloud computing changes depending on many external factors. We propose a jump diffusion model with two-dimensional Wiener processes in order to capture these aspects of network traffic and big data on cloud computing. In particular, we assess the stability of cloud software by using the sample paths obtained from the jump diffusion model with two-dimensional Wiener processes. Moreover, we discuss the optimal maintenance problem based on the proposed jump diffusion model. Furthermore, we analyze actual data to show numerical examples of dependability optimization based on the software maintenance cost, considering big data on cloud computing.
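    To make the modelling idea concrete, the sketch below simulates a sample path of a generic jump diffusion driven by two independent Wiener processes plus a compound-Poisson jump term, using a simple Euler scheme. The drift, diffusion and jump parameters are arbitrary placeholders, not the values estimated in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
T, n_steps = 100.0, 10_000
dt = T / n_steps

# placeholder parameters (assumed for illustration only)
mu = 0.05                        # drift
sigma1, sigma2 = 0.20, 0.10      # coefficients of the two Wiener processes
lam = 0.10                       # jump intensity (jumps per unit time)
jump_mu, jump_sd = -0.50, 0.20   # jump-size distribution

x = np.empty(n_steps + 1)
x[0] = 1.0
for i in range(n_steps):
    dW1, dW2 = rng.normal(0.0, np.sqrt(dt), 2)      # increments of the two Wiener processes
    n_jumps = rng.poisson(lam * dt)                 # number of jumps in this step
    jump = rng.normal(jump_mu, jump_sd, n_jumps).sum() if n_jumps else 0.0
    x[i + 1] = x[i] + mu * dt + sigma1 * dW1 + sigma2 * dW2 + jump

print("final value of the sample path:", round(x[-1], 3))
```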

  3. Present state of works on development of electron accelerators for energy consuming processes at Efremov Research Institute

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, A. S.; Maznev, V. P.; Ovchinnikov, V. P.; Svinin, M. P.; Tolstun, N. G. [Efremov Research Institute of Electrophysical Apparatus, Saint-Petersburg (Russian Federation)

    2011-07-01

    The need to decrease anthropogenic environmental pollution poses the task of developing HV accelerators for the introduction of environmental protection technologies on a commercial scale. High efficiency and operational reliability are required in a power range noticeably higher than the level already mastered. In the design of the accelerators' basic units, namely HV generators, accelerating structures, electron-beam irradiation field forming systems and extraction devices, solutions that have already demonstrated their operational capacity in machines of lower power may be used. On the other hand, the experience gained with powerful full-scale installations already built shows that a number of problems remain unsolved, which puts obstacles in the way of wide implementation of exhaust gas irradiation processing. Attempts to build an accelerator meeting all requirements within the framework of specific contracts, although they have already shown noticeable progress in terms of power growth and the acquisition of very valuable experience, carry some risk caused by insufficient study of the problems connected with power increase and by the lack of time and means for thorough research work. It therefore looks reasonable to suggest the creation of a full-scale pilot installation with an HV accelerator of the required power (1 MW, for example), not bound to a specific commercial contract, where research and studies of the accelerator's main systems, their optimization and longevity tests can be carried out, thus supporting the development of the accelerator into a really reliable and effective tool for environmental tasks.

  4. Research in Social Work: the future in the present. Reflections on the portuguese knowledge building process

    Directory of Open Access Journals (Sweden)

    Raquel Marta

    2016-06-01

    Full Text Available The debate surrounding the construction of scientific knowledge within social work is discussed. The social work profession seeks new foundations that allow, within a context of structural change, the strengthening of professional identity and a challenge to the vestiges of intellectual segregation that historical constraints have left. This paper seeks to outline a research strategy for the reconciliation and coordination of intellectual and professional work in order to give visibility to new and different domains of interpretation and action, while arguing that considering plural perspectives enhances the knowledge-transformation process. Underlining this confluence of elements of complex thinking, the article incorporates the space-time dimension and discusses and recognizes the unavoidable circularity as a way to interrogate knowledge that is compartmentalized and fragmented, placing emphasis both on knowledge itself and on the interrelationship between knowing, doing, being and relating. In addition, it examines the recognition of the nature of the relationships among the various disciplines and perspectives.

  5. Quantitative comparison of clustered microcalcifications in for-presentation and for-processing mammograms in full-field digital mammography.

    Science.gov (United States)

    Wang, Juan; Nishikawa, Robert M; Yang, Yongyi

    2017-07-01

    Mammograms acquired with full-field digital mammography (FFDM) systems are provided in both "for-processing" and "for-presentation" image formats. For-presentation images are traditionally intended for visual assessment by the radiologists. In this study, we investigate the feasibility of using for-presentation images in computerized analysis and diagnosis of microcalcification (MC) lesions. We make use of a set of 188 matched mammogram image pairs of MC lesions from 95 cases (biopsy proven), in which both for-presentation and for-processing images are provided for each lesion. We then analyze and characterize the MC lesions from for-presentation images and compare them with their counterparts in for-processing images. Specifically, we consider three important aspects in computer-aided diagnosis (CAD) of MC lesions. First, we quantify each MC lesion with a set of 10 image features of clustered MCs and 12 textural features of the lesion area. Second, we assess the detectability of individual MCs in each lesion from the for-presentation images by a commonly used difference-of-Gaussians (DoG) detector. Finally, we study the diagnostic accuracy in discriminating between benign and malignant MC lesions from the for-presentation images by a pretrained support vector machine (SVM) classifier. To accommodate the underlying background suppression and image enhancement in for-presentation images, a normalization procedure is applied. The quantitative image features of MC lesions from for-presentation images are highly consistent with those from for-processing images. The values of Pearson's correlation coefficient between features from the two formats range from 0.824 to 0.961 for the 10 MC image features, and from 0.871 to 0.963 for the 12 textural features. In detection of individual MCs, the FROC curve from for-presentation is similar to that from for-processing. In particular, at a sensitivity level of 80%, the average number of false-positives (FPs) per image region is 9
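    The difference-of-Gaussians detector mentioned above enhances blob-like structures (such as individual microcalcifications) by subtracting a coarse Gaussian blur from a fine one and thresholding the result. The snippet below is a generic illustration of that idea with SciPy; the sigma values, threshold rule and synthetic demo image are assumptions, not the study's tuned detector or data.

```python
import numpy as np
from scipy import ndimage

def dog_detect(image, sigma_fine=0.8, sigma_coarse=2.0, k_thresh=3.0):
    """Difference-of-Gaussians blob detector (illustrative parameters only)."""
    img = image.astype(float)
    dog = ndimage.gaussian_filter(img, sigma_fine) - ndimage.gaussian_filter(img, sigma_coarse)
    thresh = dog.mean() + k_thresh * dog.std()         # simple global threshold on the DoG response
    labels, n_found = ndimage.label(dog > thresh)      # connected bright blobs
    centers = ndimage.center_of_mass(dog, labels, range(1, n_found + 1))
    return n_found, centers

# synthetic demo: noisy background with a few bright spots standing in for MCs
rng = np.random.default_rng(0)
img = rng.normal(100.0, 5.0, (256, 256))
for r, c in [(50, 60), (120, 200), (200, 80)]:
    img[r - 1:r + 2, c - 1:c + 2] += 40.0
n, centers = dog_detect(img)
print(n, [tuple(round(v, 1) for v in ctr) for ctr in centers])
```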

  6. Bee venom processes human skin lipids for presentation by CD1a.

    Science.gov (United States)

    Bourgeois, Elvire A; Subramaniam, Sumithra; Cheng, Tan-Yun; De Jong, Annemieke; Layre, Emilie; Ly, Dalam; Salimi, Maryam; Legaspi, Annaliza; Modlin, Robert L; Salio, Mariolina; Cerundolo, Vincenzo; Moody, D Branch; Ogg, Graham

    2015-02-09

    Venoms frequently co-opt host immune responses, so study of their mode of action can provide insight into novel inflammatory pathways. Using bee and wasp venom responses as a model system, we investigated whether venoms contain CD1-presented antigens. Here, we show that venoms activate human T cells via CD1a proteins. Whereas CD1 proteins typically present lipids, chromatographic separation of venoms unexpectedly showed that stimulatory factors partition into protein-containing fractions. This finding was explained by demonstrating that bee venom-derived phospholipase A2 (PLA2) activates T cells through generation of small neoantigens, such as free fatty acids and lysophospholipids, from common phosphodiacylglycerides. Patient studies showed that injected PLA2 generates lysophospholipids within human skin in vivo, and polyclonal T cell responses are dependent on CD1a protein and PLA2. These findings support a previously unknown skin immune response based on T cell recognition of CD1a proteins and lipid neoantigen generated in vivo by phospholipases. The findings have implications for skin barrier sensing by T cells and mechanisms underlying phospholipase-dependent inflammatory skin disease. © 2015 Bourgeois et al.

  7. Isotopic mass balance of Manzala Lake as indicators of present and past hydrogeological processes in Egypt

    International Nuclear Information System (INIS)

    Salem, M.W.M.

    2006-01-01

    Lakes are a very important part of the aquatic ecosystem and represent about 15% of the total commercial fishing areas in Egypt. Manzala lake is considered one of the largest lakes in Egypt. It is located at the north-eastern edge of the Nile delta and suffers from industrial and agricultural pollution. The most serious source of pollution may be the Port Said and Damietta wastes, which are dumped regularly into the lake. The main objective of the present study is to investigate the hydrochemical and isotopic features of the lake waters and to compare the parameters deduced in the present and previous investigations in order to improve the current knowledge of the dynamic change during this time. The stable isotope (oxygen-18) component mass balance approach was used to determine the evaporation rate and the seepage from the groundwater to the lake. The data showed that the seepage rate from the groundwater to the lake was 305.54 x 10^6 m^3/y (about 2% higher than in the previous study), since the amounts of drainage water became higher. The evaporation rate was 2185.844 x 10^6 m^3/y (about 5% less than in the previous study). This is due to the reduction in the lake size. Although these rates are relatively small, they raise an alarm about pollution propagation around the lake, which would increase with time

  8. Algorithmic processing of intrinsic signals in affixed transmission speckle analysis (ATSA) (Conference Presentation)

    Science.gov (United States)

    Ghijsen, Michael T.; Tromberg, Bruce J.

    2017-03-01

    Affixed Transmission Speckle Analysis (ATSA) is a recently developed method for measuring blood flow, based on laser speckle imaging miniaturized into a clip-on form factor the size of a pulse oximeter. Measuring at a rate of 250 Hz, ATSA is capable of obtaining the cardiac waveform in blood flow data, referred to as the Speckle-Plethysmogram (SPG). ATSA is also capable of simultaneously measuring the Photoplethysmogram (PPG), a more conventional signal related to light intensity. In this work we present several novel algorithms for extracting physiologically relevant information from the combined SPG-PPG waveform data. First, we show that there is a slight time delay between the SPG and PPG that can be extracted computationally. Second, we present a set of frequency-domain algorithms that measure harmonic content on a pulse-by-pulse basis for both the SPG and PPG. Finally, we apply these algorithms to data obtained from a set of subjects including healthy controls and individuals with heightened cardiovascular risk. We hypothesize that the time delay and frequency content are correlated with cardiovascular health, specifically with vascular stiffening.
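    One straightforward way to extract an SPG-PPG time delay computationally is to take the lag at which the normalized cross-correlation between the two waveforms peaks. The sketch below shows that approach under the assumption of synchronized 250 Hz recordings; it is illustrative only and not the authors' algorithm.

```python
import numpy as np

def estimate_delay(spg, ppg, fs=250.0):
    """Estimate the SPG-PPG delay (seconds) from the peak of the cross-correlation."""
    spg = (spg - spg.mean()) / spg.std()
    ppg = (ppg - ppg.mean()) / ppg.std()
    xcorr = np.correlate(spg, ppg, mode="full")
    lag_samples = int(np.argmax(xcorr)) - (len(ppg) - 1)
    return lag_samples / fs

# synthetic demo: a surrogate ~72 bpm pulse and a copy shifted by 5 samples (20 ms)
fs = 250.0
t = np.arange(0, 10, 1 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t)
spg = np.roll(ppg, 5)
print(f"estimated delay: {estimate_delay(spg, ppg, fs) * 1000:.1f} ms")
```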

  9. [Assessment of the validity and reliability of the processes of change scale based on the transtheoretical model of vegetable consumption behavior in Japanese male workers].

    Science.gov (United States)

    Kushida, Osamu; Murayama, Nobuko

    2012-12-01

    A core construct of the Transtheoretical model is that the processes and stages of change are strongly related to observable behavioral changes. We created the Processes of Change Scale of vegetable consumption behavior and examined the validity and reliability of this scale. In September 2009, a self-administered questionnaire was administered to male Japanese employees, aged 20-59 years, working at 20 worksites in Niigata City, Japan. The stages of change (precontemplation, contemplation, preparation, action, and maintenance) were measured using 2 items that assessed participants' current implementation of the target behavior (eating 5 or more servings of vegetables per day) and their readiness to change their habits. The Processes of Change Scale of vegetable consumption behavior comprised 10 items assessing 5 cognitive processes (consciousness raising, emotional arousal, environmental reevaluation, self-reevaluation, and social liberation) and 5 behavioral processes (commitment, rewards, helping relationships, countering, and environment control). Each item was selected from an existing scale. Decisional balance (pros [2 items] and cons [2 items]) and self-efficacy (3 items) were also assessed, because these constructs were considered to be relevant to the processes of change. The internal consistency reliability of the scale was examined using Cronbach's alpha. Its construct validity was examined using a factor analysis of the processes of change, decisional balance, and self-efficacy variables, while its criterion-related validity was determined by assessing the association between the scale scores and the stages of change. The data of 527 (out of 600) participants (mean age, 41.1 years) were analyzed. The results indicated that the Processes of Change Scale had sufficient internal consistency reliability (Cronbach's alpha: cognitive processes = 0.722, behavioral processes = 0.803). The processes of change were divided into 2 factors: "consciousness raising
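    Internal consistency here is summarized with Cronbach's alpha, which compares the sum of the item variances with the variance of the total score. Below is a minimal, generic implementation with hypothetical Likert-type data; it is not the authors' analysis script.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a 2-D array: rows = respondents, columns = scale items."""
    x = np.asarray(items, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = x.sum(axis=1).var(ddof=1)       # variance of the total score
    return k / (k - 1) * (1 - item_var / total_var)

# hypothetical 5-item subscale answered by 527 respondents on a 1-5 scale
rng = np.random.default_rng(1)
latent = rng.normal(0, 1, (527, 1))
data = np.clip(np.rint(3 + latent + rng.normal(0, 0.8, (527, 5))), 1, 5)
print("Cronbach's alpha:", round(cronbach_alpha(data), 3))
```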

  10. Environmental process for elimination of phenolic water present in refinery gasoline tanks; Processo ambiental para eliminacao de agua fenolica presente em tanques de gasolina de refinarias de petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Correa Junior, Bentaci; Pedroso, Osmar V.; Furlan, Luis T. [PETROBRAS, SP (Brazil). Refinaria de Paulinia

    2004-07-01

    Gasoline production in petroleum refineries usually results in water with a high phenol content being carried to the treatment systems. Phenols are powerful bactericides and, therefore, harmful to the microorganisms present in wastewater treatment plants and in rivers. For this reason, controlled drainage of phenolic water is usually performed, enabling gasoline quality improvement without jeopardizing the biological treatment. An increase of the phenolic content in the effluent, due to operational upsets during the drainage of gasoline tanks, may cause inhibition or even mortality of the microorganisms in the wastewater treatment plants. Aiming to change the traditional 'end-of-pipe' logic for treating environmental demands, sending the phenolic water to the sour water treatment system was proposed and implemented; this water is in turn reused in the crude desalination of the Distillation Units, where the phenols are reincorporated into the crude oil, preventing negative consequences for the wastewater treatment plant. The implemented process has demonstrated that the premises were correct, making it possible to handle process flows considerably higher than the drainage flows, which has meant productivity gains and environmental improvement. (author)

  11. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth models

  12. Processing and memory of information presented in narrative or expository texts.

    Science.gov (United States)

    Wolfe, Michael B W; Woodwyk, Joshua M

    2010-09-01

    Previous research suggests that narrative and expository texts differ in the extent to which they prompt students to integrate to-be-learned content with relevant prior knowledge during comprehension. We expand on previous research by examining on-line processing and representation in memory of to-be-learned content that is embedded in narrative or expository texts. We are particularly interested in how differences in the use of relevant prior knowledge lead to differences in levels of discourse representation (textbase vs. situation model). A total of 61 university undergraduates participated in Expt 1, and 160 in Expt 2. In Expt 1, subjects thought out loud while comprehending circulatory system content embedded in a narrative or expository text, followed by free recall of the text content. In Expt 2, subjects read silently and completed a sentence recognition task to assess memory. In Expt 1, subjects made more associations to prior knowledge while reading the expository text, and recalled more content. Content recall was also correlated with the amount of relevant prior knowledge for subjects who read the expository text but not the narrative text. In Expt 2, subjects reading the expository text (compared to the narrative text) had a weaker textbase representation of the to-be-learned content, but a marginally stronger situation model. Results suggest that, in terms of to-be-learned content, expository texts trigger students to utilize relevant prior knowledge more than narrative texts.

  13. Present status and future of hydrogel dressings processed by low energy EB

    International Nuclear Information System (INIS)

    Lugao, A.B.; Rogero, S.O.; Catalani, L.H.; Malmonge, S.M.

    2001-01-01

    Full text: The first hydrogel wound dressing processed by radiation left the laboratories in Poland in 1986, created by its inventor Janusz M. Rosiak, and soon after formal tests it arrived on the local market (1992). It was a technological breakthrough due to its product characteristics as a pain reliever with enhanced healing properties, besides its clever production process combining sterilization and crosslinking in a single simultaneous operation (Rosiak a, 1989; Rosiak b, 1995). The IAEA invited Professor Rosiak to support the transfer of his technology to many laboratories around the world. The laboratories of developing countries, which face all kinds of restrictions, were attracted by the simplicity of the process and the low cost of its raw materials. This was the seed of the flourishing activities in hydrogel dressings in Brazil and other developing countries. The so-called 'Rosiak membrane' has been applied in many European countries for the healing of burn wounds, ulcers, cosmetology purposes and so on. In the opinion of some physicians, their main therapeutic features are their ability to stop pain and their capacity to absorb exudates, besides many other properties such as transparency, oxygen permeation, the cooling effect of water evaporation and so on. The pain relief property is controlled by a delicate balance of softness and proper consistency for use as dressings, as previously shown (Lugao a, 2000; Miranda a, 2000). The network features control the amount of exudates absorbed by the dressing and the permeability of drugs through the net. Although its excellent healing properties have been confirmed by clinical use, its handling properties are still difficult (Lugao b, 1998; Miranda b, 1999). The methodology of membrane preparation is as follows: all polymers were medical grade: poly(N-vinyl-2-pyrrolidone)/PVP-K90, from PLASDONE; agar/Agar Technical No 3, from OXOID; poly(ethylene glycol)/ATPEG-300, from OXITENO. PVP concentrations were 2, 5, 8, 10, 12

  14. Absolute gravity change in Taiwan: Present result of geodynamic process investigation

    Directory of Open Access Journals (Sweden)

    Ricky Kao

    2017-01-01

    Full Text Available Gravity values at 24 sites, measured with absolute gravimeters over 2004 - 2016, are used to study geodynamic processes in Taiwan. We model rain-induced gravity effects and other temporal effects of non-geodynamic origin to obtain residual gravity, which cannot be fully explained by GPS-derived vertical displacements. We explain the gravity changes associated with deposited debris, an earthquake, volcanism and Moho deepening. Gravity changes of 53.37 and 23.38 μGal near the Sinwulyu and Laonong Rivers were caused by typhoon Morakot, leading to estimated volumes of 6.0 × 10^5 and 3.6 × 10^5 m^3 of deposited debris. The observed co-seismic gravity change near the epicenter of the M 6.9 Pingtung earthquake (26 December 2006) is 3.12 ± 0.99 μGal, consistent with a dislocation-based gravity change at the μGal level, thereby supplying a gravity constraint on the modeled fault parameters. The AG record at the Tatun Volcano Group is the longest, but large temporal gravity effects here have led to a current gravity signal-to-noise ratio of less than one, which cannot confirm a sinking magma chamber but does supply an error bound for gravity detection of long-term or transient magma movements. The gravity values at Ludao and Lanyu decline steadily at rates of -2.20 and -0.50 μGal/yr, consistent with the expected magma states of the two extinct volcanoes. The gravity rates at an uplifting site in central Taiwan and at three subsiding sites in eastern Taiwan are negative, and are potentially caused by Moho deepening at a rate of -3.34 cm/yr and by a combination of Moho deepening and plate subduction at rates of -0.18, -2.03, and -1.34 cm/yr.

  15. Anti-deception: reliable EEG-based biometrics with real-time capability from the neural response of face rapid serial visual presentation.

    Science.gov (United States)

    Wu, Qunjian; Yan, Bin; Zeng, Ying; Zhang, Chi; Tong, Li

    2018-05-03

    The electroencephalogram (EEG) signal represents a subject's specific brain activity patterns and is considered an ideal biometric given its superior invisibility, non-clonality, and non-coercion. In order to enhance its applicability to identity authentication, a novel EEG-based identity authentication method is proposed based on self- or non-self-face rapid serial visual presentation. In contrast to previous studies that extracted EEG features from the resting state or motor imagery, the designed paradigm can obtain a distinct and stable biometric trait with a lower time cost. Channel selection was applied to select specific channels for each user to enhance system portability and improve discriminability between users and imposters. Two different imposter scenarios were designed to test system security, which demonstrate the capability of anti-deception. Fifteen users and thirty imposters participated in the experiment. The mean authentication accuracy values for the two scenarios were 91.31 and 91.61%, with a 6 s time cost, which illustrates the precision and real-time capability of the system. Furthermore, in order to estimate the repeatability and stability of our paradigm, another data acquisition session was conducted for each user. Using the classification models generated from the previous sessions, a mean false rejection rate of 7.27% was achieved, which demonstrates the robustness of our paradigm. Experimental results reveal that the proposed paradigm and methods are effective for EEG-based identity authentication.

  16. Presenting and processing information in background noise: A combined speaker-listener perspective.

    Science.gov (United States)

    Bockstael, Annelies; Samyn, Laurie; Corthals, Paul; Botteldooren, Dick

    2018-01-01

    Transferring information orally in background noise is challenging, for both speaker and listener. Successful transfer depends on complex interaction between characteristics related to listener, speaker, task, background noise, and context. To fully assess the underlying real-life mechanisms, experimental design has to mimic this complex reality. In the current study, the effects of different types of background noise have been studied in an ecologically valid test design. Documentary-style information had to be presented by the speaker and simultaneously acquired by the listener in four conditions: quiet, unintelligible multitalker babble, fluctuating city street noise, and little varying highway noise. For both speaker and listener, the primary task was to focus on the content that had to be transferred. In addition, for the speakers, the occurrence of hesitation phenomena was assessed. The listener had to perform an additional secondary task to address listening effort. For the listener the condition with the most eventful background noise, i.e., fluctuating city street noise, appeared to be the most difficult with markedly longer duration of the secondary task. In the same fluctuating background noise, speech appeared to be less disfluent, suggesting a higher level of concentration from the speaker's side.

  17. Present-day Opening of the Natron Rift: Tectonic and Magmatic Processes at Work

    Science.gov (United States)

    Calais, E.; Dalaison, M.; Saria, E.; Doubre, C.; Masson, F.

    2017-12-01

    The young Natron basin, part of the East African Rift system, is an important locale for studying the initial stage of continental rifting. It was the locus of a rarely observed tectono-magmatic event in July 2007, with slow slip on an intra-basin normal fault followed by a 10 km-long dike intrusion underneath the Gelai shield volcano. Here we report on a series of GPS observations over a 20-site network spanning the basin, measured repeatedly since 2013. We observe a long-wavelength (about 200 km wide) extension with a horizontal rate of about 2 mm/yr, consistent with recently published regional kinematic models, and a velocity gradient centered on the west-bounding fault of the Natron basin. Initial models show that the data are best fit by a normal fault dipping 60 degrees to the east and slipping at a rate of 6 mm/yr. Superimposed on this long-wavelength extension, we observe a smaller-scale (about 30 km wide) extensional signal in the middle of the basin, roughly coincident with the location of the Gelai volcano, which was the locale of the 2007 seismic-magmatic crisis. We investigate the relative importance of tectonic faulting, post-diking relaxation following the 2007 intrusion (as observed, for instance, in Afar or Iceland after similar events), and melt recharge of the intra-basin magmatic system in present-day extension across this young segment of the East African Rift.

  18. Constraints on Mars Hydrogen loss from MAVEN: processes and present-day rates

    Science.gov (United States)

    Chaffin, M.; Deighan, J.; Stewart, I. F.; Schneider, N. M.; Chaufray, J. Y.; Jain, S.; Thiemann, E.; Mayyasi, M.; Clarke, J. T.; Crismani, M. M. J.; Stiepen, A.; Montmessin, F.; Epavier, F.; McClintock, B.; Holsclaw, G.; Jakosky, B. M.

    2017-12-01

    The surface of Mars today is desiccated and oxidized, despite a large body of evidence indicating that the planet was wet and redox neutral early in its history. H escape has the potential to explain both conditions, but until recently there was no long-term monitoring of H loss at Mars. The presence of MAVEN at Mars since late 2014 has established a seasonal record of H escape via airglow measurements of coronal hydrogen and the flux of energetic particles whose production is mediated by the H corona. H escape appears from multiple proxies to be more than an order of magnitude larger in Southern summer than Northern summer, potentially as a consequence of enhanced water transport to the middle atmosphere. This newly described escape channel potentially dominates water loss from Mars today and over its history, and may be the most important control on the chemistry of the atmosphere and surface. I will present an overview of MAVEN measurements of H loss, focusing on contributions made by the Imaging Ultraviolet Spectrograph, and discuss how photochemical models of the atmosphere may need to be adjusted to incorporate new mechanisms for H loss.

  19. Action-Specific Influences on Perception and Post-Perceptual Processes: Present Controversies and Future Directions

    Science.gov (United States)

    Philbeck, John W.; Witt, Jessica K.

    2015-01-01

    The action-specific perception account holds that people perceive the environment in terms of their ability to act in it. In this view, for example, decreased ability to climb a hill due to fatigue makes the hill visually appear to be steeper. Though influential, this account has not been universally accepted, and in fact a heated controversy has emerged. The opposing view holds that action capability has little or no influence on perception. Heretofore, the debate has been quite polarized, with efforts largely being focused on supporting one view and dismantling the other. We argue here that polarized debate can impede scientific progress and that the search for similarities between two sides of a debate can sharpen the theoretical focus of both sides and illuminate important avenues for future research. In this paper, we present a synthetic review of this debate, drawing from the literatures of both approaches, to clarify both the surprising similarities and the core differences between them. We critically evaluate existing evidence, discuss possible mechanisms of action-specific effects, and make recommendations for future research. A primary focus of future work will involve not only the development of methods that guard against action-specific post-perceptual effects, but also development of concrete, well-constrained underlying mechanisms. The criteria for what constitutes acceptable control of post-perceptual effects and what constitutes an appropriately specific mechanism vary between approaches, and bridging this gap is a central challenge for future research. PMID:26501227

  20. Effect of Pd Surface Roughness on the Bonding Process and High Temperature Reliability of Au Ball Bonds

    Science.gov (United States)

    Huang, Y.; Kim, H. J.; McCracken, M.; Viswanathan, G.; Pon, F.; Mayer, M.; Zhou, Y. N.

    2011-06-01

    A 0.3-μm-thick electrolytic Pd layer was plated on 1 μm of electroless Ni on 1 mm-thick polished and roughened Cu substrates with roughness values (Ra) of 0.08 μm and 0.5 μm, respectively. The rough substrates were produced by sand-blasting. Au wire bonding on the Ni/Pd surface was optimized, and the electrical reliability was investigated under a high temperature storage test (HTST) for 800 h at 250°C by measuring the ball bond contact resistance, Rc. The average value of Rc of optimized ball bonds on the rough substrate was 1.96 mΩ, which was about 40.0% higher than that on the smooth substrate. The initial bondability increased for the rougher surface, so that only half of the original ultrasonic level was required, but the reliability was not affected by surface roughness. For both substrate types, HTST caused bond healing, reducing the average Rc by about 21% and 27%, respectively. Au diffusion into the Pd layer was observed in scanning transmission electron microscopy/energy dispersive spectroscopy (STEM-EDS) line-scan analysis after HTST. It is considered that diffusion of Au or interdiffusion between Au and Pd can provide chemically strong bonding during HTST. This is supported by the decrease in Rc measured as the aging time increased. Cu migration was indicated in the STEM-EDS analysis, but its effect on reliability can be ignored. Au and Pd tend to form a complete solid solution at the interface and can provide reliable interconnection for high temperature (250°C) applications.

  1. How to use the ENEA data bank for the classification and reliability processing of fast reactor component event data

    International Nuclear Information System (INIS)

    Righini, R.

    1987-01-01

    This report describes the input and inquiry procedures for the Data Bank set up by ENEA for reliability studies on fast reactors. For the structure and the codes to be applied in data entry and inquiry, see Report (2) in the references. The data contained in the Bank are strictly confidential. The input and inquiry procedures described in this report may be applied only by users who have previously specified the appropriate password

  2. Realist Ontology and Natural Processes: A Semantic Tool to Analyze the Presentation of the Osmosis Concept in Science Texts

    Science.gov (United States)

    Spinelli Barria, Michele; Morales, Cecilia; Merino, Cristian; Quiroz, Waldo

    2016-01-01

    In this work, we developed an ontological tool, based on the scientific realism of Mario Bunge, for the analysis of the presentation of natural processes in science textbooks. This tool was applied to analyze the presentation of the concept of osmosis in 16 chemistry and biology books at different educational levels. The results showed that more…

  3. A new generation of fast cycling superconducting magnets for the accelerator system of FAIR - R and D process and present test status

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Egbert; Schnizer, Pierre [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany); Mierau, Anna [Technische Universitaet Darmstadt (DE). Institut fuer Theorie Elektromagnetischer Felder (TEMF)]

    2010-07-01

    The SIS100 is the core component of the FAIR accelerator complex. It will be the largest fast-ramped synchrotron for heavy ion research using superconducting magnets. Starting from the design of its ancestor, the Nuclotron at JINR Dubna, we carried out an intensive R and D process to develop magnets fulfilling the ambitious requirements for SIS100 operation concerning field quality, cycling frequency, cryogenic losses and reliability. In addition, the beam pipe has to operate as a cryopump to reach extremely low vacuum pressures. We describe the different design modifications required to minimise the AC losses as well as to obtain a better field quality. We show the current vacuum chamber design and present the measurement results obtained for the first prototype dipole.

  4. Design for Reliability of Power Electronic Systems

    DEFF Research Database (Denmark)

    Wang, Huai; Ma, Ke; Blaabjerg, Frede

    2012-01-01

    Advances in power electronics enable efficient and flexible processing of electric power in the application of renewable energy sources, electric vehicles, adjustable-speed drives, etc. More and more efforts are devoted to better power electronic systems in terms of reliability to ensure high......). A collection of methodologies based on Physics-of-Failure (PoF) approach and mission profile analysis are presented in this paper to perform reliability-oriented design of power electronic systems. The corresponding design procedures and reliability prediction models are provided. Further on, a case study...... on a 2.3 MW wind power converter is discussed with emphasis on the reliability critical components IGBTs. Different aspects of improving the reliability of the power converter are mapped. Finally, the challenges and opportunities to achieve more reliable power electronic systems are addressed....
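
    A much-simplified sketch of the mission-profile idea described above: thermal cycles seen by an IGBT module are binned and fed into a Coffin-Manson-type lifetime model with Miner's damage accumulation. The model coefficients and the annual cycle histogram are placeholder assumptions for illustration, not values from the paper.

      # Simplified Physics-of-Failure sketch: IGBT lifetime from a mission profile using a
      # Coffin-Manson-type model Nf = A * dT**(-n) and Miner's rule. All numbers are
      # placeholder assumptions, not data from the paper.
      A, n = 3.0e14, 5.0  # illustrative Coffin-Manson coefficients

      # Hypothetical annual histogram of junction-temperature swings (K) vs. cycle counts.
      cycles_per_year = {10: 2.0e6, 20: 4.0e5, 40: 5.0e4, 60: 6.0e3}

      # Miner's rule: sum the damage fraction contributed by each cycle class.
      annual_damage = sum(count / (A * dT ** -n) for dT, count in cycles_per_year.items())
      print(f"Estimated module lifetime = {1.0 / annual_damage:.1f} years")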

  5. Development of improved processing and evaluation methods for high reliability structural ceramics for advanced heat engine applications Phase II. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Pujari, V.J.; Tracey, D.M.; Foley, M.R. [and others]

    1996-02-01

    The research program had as goals the development and demonstration of significant improvements in processing methods, process controls, and nondestructive evaluation (NDE) which can be commercially implemented to produce high reliability silicon nitride components for advanced heat engine applications at temperatures to 1370°C. In Phase I of the program a process was developed that resulted in a silicon nitride - 4 w% yttria HIP'ed material (NCX 5102) that displayed unprecedented strength and reliability. An average tensile strength of 1 GPa and a strength distribution following a 3-parameter Weibull distribution were demonstrated by testing several hundred buttonhead tensile specimens. The Phase II program focused on the development of methodology for colloidal consolidation producing green microstructure which minimizes downstream process problems such as drying, shrinkage, cracking, and part distortion during densification. Furthermore, the program focused on the extension of the process to gas pressure sinterable (GPS) compositions. Excellent results were obtained for the HIP composition processed for minimal density gradients, both with respect to room-temperature strength and high-temperature creep resistance. Complex component fabricability of this material was demonstrated by producing engine-vane prototypes. Strength data for the GPS material (NCX-5400) suggest that it ranks very high relative to other silicon nitride materials in terms of tensile/flexure strength ratio, a measure of volume quality. This high quality was derived from the closed-loop colloidal process employed in the program.
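
    As a hedged illustration of how a strength distribution of this kind is typically characterized, the sketch below fits Weibull models to tensile-strength values with SciPy; the strength data and the survival-stress query are invented for demonstration and are not taken from the report.

      # Illustrative sketch: Weibull characterization of ceramic tensile-strength data.
      # The strength values below are synthetic, not measurements from the program.
      import numpy as np
      from scipy import stats

      strengths_mpa = np.array([820, 905, 950, 980, 1010, 1030, 1060, 1090, 1120, 1180])

      # Two-parameter fit (location fixed at zero), the common form for strength data.
      m, loc, s0 = stats.weibull_min.fit(strengths_mpa, floc=0)
      print(f"Weibull modulus m = {m:.1f}, characteristic strength = {s0:.0f} MPa")

      # Three-parameter fit: also estimate a threshold (location) stress.
      m3, loc3, s3 = stats.weibull_min.fit(strengths_mpa)
      print(f"3-parameter fit: m = {m3:.1f}, threshold = {loc3:.0f} MPa, scale = {s3:.0f} MPa")

      # Probability that a specimen survives a 900 MPa applied stress (2-parameter fit).
      print("P(survive 900 MPa) =", 1 - stats.weibull_min.cdf(900, m, loc, s0))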

  6. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement-based approaches, holistic techniques and decision analytic approaches. (UK)

  7. Validity and reliability of an application review process using dedicated reviewers in one stage of a multi-stage admissions model.

    Science.gov (United States)

    Zeeman, Jacqueline M; McLaughlin, Jacqueline E; Cox, Wendy C

    2017-11-01

    With increased emphasis placed on non-academic skills in the workplace, a need exists to identify an admissions process that evaluates these skills. This study assessed the validity and reliability of an application review process involving three dedicated application reviewers in a multi-stage admissions model. A multi-stage admissions model was utilized during the 2014-2015 admissions cycle. After advancing through the academic review, each application was independently reviewed by two dedicated application reviewers utilizing a six-construct rubric (written communication, extracurricular and community service activities, leadership experience, pharmacy career appreciation, research experience, and resiliency). Rubric scores were extrapolated to a three-tier ranking to select candidates for on-site interviews. Kappa statistics were used to assess interrater reliability. A three-facet Many-Facet Rasch Model (MFRM) determined reviewer severity, candidate suitability, and rubric construct difficulty. The kappa statistic for candidates' tier rank score (n = 388 candidates) was 0.692 with a perfect agreement frequency of 84.3%. There was substantial interrater reliability between reviewers for the tier ranking (kappa: 0.654-0.710). Highest construct agreement occurred in written communication (kappa: 0.924-0.984). A three-facet MFRM analysis explained 36.9% of variance in the ratings, with 0.06% reflecting application reviewer scoring patterns (i.e., severity or leniency), 22.8% reflecting candidate suitability, and 14.1% reflecting construct difficulty. Utilization of dedicated application reviewers and a defined tiered rubric provided a valid and reliable method to effectively evaluate candidates during the application review process. These analyses provide insight into opportunities for improving the application review process among schools and colleges of pharmacy. Copyright © 2017 Elsevier Inc. All rights reserved.
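
    For readers unfamiliar with the interrater statistic reported here, the short sketch below computes Cohen's kappa for two reviewers assigning candidates to a three-tier ranking; the rating vectors are fabricated for illustration and do not reproduce the study data.

      # Illustrative sketch: interrater reliability (Cohen's kappa) for two reviewers
      # assigning candidates to tiers 1-3. The ratings below are invented examples.
      from sklearn.metrics import cohen_kappa_score

      reviewer_a = [1, 2, 2, 3, 1, 2, 3, 3, 1, 2, 2, 1]
      reviewer_b = [1, 2, 3, 3, 1, 2, 3, 2, 1, 2, 2, 1]

      kappa = cohen_kappa_score(reviewer_a, reviewer_b)
      # For ordinal tiers a weighted kappa may be more appropriate:
      weighted = cohen_kappa_score(reviewer_a, reviewer_b, weights="quadratic")
      agreement = sum(a == b for a, b in zip(reviewer_a, reviewer_b)) / len(reviewer_a)
      print(f"kappa = {kappa:.3f}, weighted kappa = {weighted:.3f}, agreement = {agreement:.1%}")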

  8. JMP Applications in Photovoltaic Reliability (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, D.; Gotwalt, C.

    2011-09-01

    The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted into power and, secondly, how this relationship develops over time. Accurate knowledge of power decline over time, also known as the degradation rate, is essential and important to all stakeholders: utility companies, integrators, investors, and scientists alike. Outdoor testing plays a vital part in quantifying degradation rates of different technologies in various climates. Because of seasonal changes, however, several complete cycles (typically 3-5 years) traditionally need to be completed to obtain reasonably accurate degradation rates. In a rapidly evolving industry such a time span is often unacceptable and the need exists to determine degradation rates more accurately in a shorter period of time. Advanced time series modeling such as ARIMA (Autoregressive Integrated Moving Average) modeling can be utilized to decrease the required time span and is compared with some non-linear modeling. In addition, it is demonstrated how the JMP 9 map feature was used to reveal important technological trends by climate.
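
    The paper's approach is ARIMA-based; as a simpler stand-in that shows the underlying idea of separating seasonality from the long-term trend, the sketch below applies an STL decomposition to a synthetic monthly performance-ratio series and reads off a degradation rate. The data and period choices are assumptions for demonstration only.

      # Illustrative sketch (not the paper's ARIMA method): estimate a PV degradation rate
      # by removing seasonality with STL and fitting a line to the remaining trend.
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.seasonal import STL

      rng = np.random.default_rng(0)
      t = np.arange(60)                                  # five years of monthly data
      perf = 0.95 - 0.0004 * t + 0.02 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.005, 60)
      series = pd.Series(perf, index=pd.date_range("2015-01-01", periods=60, freq="MS"))

      trend = STL(series, period=12).fit().trend         # deseasonalized long-term component
      slope_per_month = np.polyfit(t, trend.values, 1)[0]
      annual_rate_pct = slope_per_month * 12 / trend.mean() * 100
      print(f"Estimated degradation rate: {annual_rate_pct:.2f} %/year (negative = decline)")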

  9. Proceeding of the Scientific Meeting and Presentation on Basic Research of Nuclear Science and Technology: Book II. Nuclear Chemistry, Process Technology, and Radioactive Waste Processing and Environment

    International Nuclear Information System (INIS)

    1996-06-01

    The proceedings contain papers presented at the Scientific Meeting and Presentation on Basic Research of Nuclear Science and Technology, held in Yogyakarta, 25-27 April 1995. This volume is the second of two books published for the meeting and contains papers on nuclear chemistry, process technology, and radioactive waste management and the environment. There are 62 papers indexed individually. (ID)

  10. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective to enhance the reliability of various circuit units.  The authors provide readers with techniques for state of the art and future technologies, ranging from technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides comprehensive review on various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  11. Design reliability engineering

    International Nuclear Information System (INIS)

    Buden, D.; Hunt, R.N.M.

    1989-01-01

    Improved design techniques are needed to achieve high reliability at minimum cost. This is especially true of space systems where lifetimes of many years without maintenance are needed and severe mass limitations exist. Reliability must be designed into these systems from the start. Techniques are now being explored to structure a formal design process that will be more complete and less expensive. The intent is to integrate the best features of design, reliability analysis, and expert systems to design highly reliable systems to meet stressing needs. Taken into account are the large uncertainties that exist in materials, design models, and fabrication techniques. Expert systems are a convenient method to integrate into the design process a complete definition of all elements that should be considered and an opportunity to integrate the design process with reliability, safety, test engineering, maintenance and operator training. 1 fig

  12. Improving machinery reliability

    CERN Document Server

    Bloch, Heinz P

    1998-01-01

    This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.

  13. Vocal reaction times to unilaterally presented concrete and abstract words: towards a theory of differential right hemispheric semantic processing.

    Science.gov (United States)

    Rastatter, M; Dell, C W; McGuire, R A; Loren, C

    1987-03-01

    Previous studies investigating hemispheric organization for processing concrete and abstract nouns have provided conflicting results. Using manual reaction time tasks some studies have shown that the right hemisphere is capable of analyzing concrete words but not abstract. Others, however, have inferred that the left hemisphere is the sole analyzer of both types of lexicon. The present study tested these issues further by measuring vocal reaction times of normal subjects to unilaterally presented concrete and abstract items. Results were consistent with a model of functional localization which suggests that the minor hemisphere is capable of differentially processing both types of lexicon in the presence of a dominant left hemisphere.

  14. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz., electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  15. Reliability Approach of a Compressor System using Reliability Block ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... This paper presents a reliability analysis of a compressor system using reliability block diagrams (RBD). Keywords: compressor system, reliability, reliability block diagram, RBD. The same structure has been kept with the three subsystems: air flow, oil flow and ...
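
    A minimal sketch of the reliability-block-diagram calculation referred to in this record: three subsystems combined in series, with the oil-flow subsystem modelled as a redundant parallel pair. All reliability values and the redundancy assumption are illustrative, not data from the paper.

      # Minimal reliability-block-diagram sketch; all numbers are illustrative.
      def series(*rs):
          """Series blocks: the system works only if every block works."""
          out = 1.0
          for r in rs:
              out *= r
          return out

      def parallel(*rs):
          """Parallel (redundant) blocks: the system fails only if every block fails."""
          out = 1.0
          for r in rs:
              out *= (1.0 - r)
          return 1.0 - out

      r_air = 0.97
      r_oil = parallel(0.90, 0.90)   # two redundant pumps (assumed)
      r_ctrl = 0.99
      print(f"Compressor system reliability = {series(r_air, r_oil, r_ctrl):.4f}")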

  16. Operational safety reliability research

    International Nuclear Information System (INIS)

    Hall, R.E.; Boccio, J.L.

    1986-01-01

    Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime of the plant

  17. Reliability-Based Optimization in Structural Engineering

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Sørensen, John Dalsgaard

    1994-01-01

    In this paper reliability-based optimization problems in structural engineering are formulated on the basis of the classical decision theory. Several formulations are presented: Reliability-based optimal design of structural systems with component or systems reliability constraints, reliability...

  18. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory. Draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. Provides a history of human reliability analysis, and includes examples of the application of the systems approach

  19. [Processing acoustically presented time intervals of seconds duration: an expression of the phonological loop of the working memory?].

    Science.gov (United States)

    Grube, D

    1996-01-01

    Working memory has been proposed to contribute to the processing of time, rhythm and music; the question which component of working memory is involved is under discussion. The present study tests the hypothesis that the phonological loop component (Baddeley, 1986) is involved in the processing of auditorily presented time intervals of a few seconds' duration. Typical effects well known with short-term retention of verbal material could be replicated with short-term retention of temporal intervals: The immediate reproduction of time intervals was impaired under conditions of background music and articulatory suppression. Neither the accuracy nor the speed of responses in a (non-phonological) mental rotation task were diminished under these conditions. Processing of auditorily presented time intervals seems to be constrained by the capacity of the phonological loop: The immediate serial recall of sequences of time intervals was shown to be related to the immediate serial recall of words (memory span). The results confirm the notion that working memory resources, and especially the phonological loop component, underlie the processing of auditorily presented temporal information with a duration of a few seconds.

  20. Congenital defects of C1 arches and odontoid process in a child with Down's syndrome: A case presentation

    Directory of Open Access Journals (Sweden)

    Catherine Hatzantonis

    2016-01-01

    Full Text Available We present the case of a 2-year-old child with Down's syndrome who presented to our unit with torticollis. Imaging studies revealed the rare occurrence of anterior and posterior C1 arch defects, absent odontoid process, and atlantoaxial subluxation. We managed her conservatively for 3 years without neurological deficits or worsening of atlantoaxial subluxation. We discuss the rare occurrences of anterior and posterior arch defects of the atlas, the radiological presentations of axis defects in patients, and the occurrence of atlantoaxial instability in patients with Down's syndrome. Management options with consideration to surgery in asymptomatic and symptomatic patients are also discussed.

  1. Leaching study of heavy and radioactive elements present in wastes discarded by a uranium extraction and processing facility

    International Nuclear Information System (INIS)

    Pihlak, A.; Lippmaa, E.; Maremaee, E.; Sirk, A.; Uustalu, E.

    1995-08-01

    The present report provides a systematic leaching study of the waste depository at the Sillamaee metallurgical plant 'Silmet' (former uranium extraction and processing facility), its construction and environmental impact. The following data are presented: γ-activity data of the depository and two drill cores, chemical composition and physical properties of depository material and leaching waters, results of γ- and α-spectrometric studies, leaching (with demineralized and sea water) intensities of loparite and uranium ore processing waste components. Environmental danger presented by the Sillamaee waste dump to the Gulf of Finland and the surrounding environment in Estonia is mainly due to uranium leaching and the presence of a large array of chemically poisonous substances

  2. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  3. Adenocarcinoma of the uncinate process of the pancreas: MDCT patterns of local invasion and clinical features at presentation

    Energy Technology Data Exchange (ETDEWEB)

    Padilla-Thornton, Amie E.; Willmann, Juergen K.; Jeffrey, R.B. [Stanford University School of Medicine, Department of Radiology, Stanford, CA (United States)

    2012-05-15

    To compare the multidetector CT (MDCT) patterns of local invasion and clinical findings at presentation in patients with adenocarcinoma of the uncinate process of the pancreas to patients with adenocarcinomas in the non-uncinate head of the pancreas. We evaluated the two cohorts for common duct and pancreatic duct dilatation, mesenteric vascular encasement, root of mesentery invasion, perineural invasion and duodenal invasion. In addition, we compared the clinical findings at presentation in both groups. Common duct (P < 0.001) and pancreatic duct dilatation (P = 0.001) were significantly less common in uncinate process adenocarcinomas than in the non-uncinate head of the pancreas. Clinical findings of jaundice (P = 0.01) and pruritus (P = 0.004) were significantly more common in patients with lesions in the non-uncinate head of the pancreas. Superior mesenteric artery encasement (P = 0.02) and perineural invasion (P = 0.001) were significantly more common with uncinate process adenocarcinomas. Owing to its unique anatomic location, adenocarcinomas within the uncinate process of the pancreas have significantly different patterns of both local invasion and clinical presentation compared to patients with carcinomas in the non-uncinate head of the pancreas. (orig.)

  4. Adenocarcinoma of the uncinate process of the pancreas: MDCT patterns of local invasion and clinical features at presentation

    International Nuclear Information System (INIS)

    Padilla-Thornton, Amie E.; Willmann, Juergen K.; Jeffrey, R.B.

    2012-01-01

    To compare the multidetector CT (MDCT) patterns of local invasion and clinical findings at presentation in patients with adenocarcinoma of the uncinate process of the pancreas to patients with adenocarcinomas in the non-uncinate head of the pancreas. We evaluated the two cohorts for common duct and pancreatic duct dilatation, mesenteric vascular encasement, root of mesentery invasion, perineural invasion and duodenal invasion. In addition, we compared the clinical findings at presentation in both groups. Common duct (P < 0.001) and pancreatic duct dilatation (P = 0.001) were significantly less common in uncinate process adenocarcinomas than in the non-uncinate head of the pancreas. Clinical findings of jaundice (P = 0.01) and pruritus (P = 0.004) were significantly more common in patients with lesions in the non-uncinate head of the pancreas. Superior mesenteric artery encasement (P = 0.02) and perineural invasion (P = 0.001) were significantly more common with uncinate process adenocarcinomas. Owing to its unique anatomic location, adenocarcinomas within the uncinate process of the pancreas have significantly different patterns of both local invasion and clinical presentation compared to patients with carcinomas in the non-uncinate head of the pancreas. (orig.)

  5. LED system reliability

    NARCIS (Netherlands)

    Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.

    2011-01-01

    This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. A SSL system is composed of a LED engine with micro-electronic driver(s) that supplies power to the optic design. Knowledge of system level reliability is not only a challenging scientific

  6. Supply chain reliability modelling

    Directory of Open Access Journals (Sweden)

    Eugen Zaitsev

    2012-03-01

    Full Text Available Background: Today it is virtually impossible to operate alone at the international level in the logistics business. This promotes the establishment and development of new integrated business entities - logistic operators. However, such cooperation within a supply chain also creates many problems related to supply chain reliability as well as the optimization of supply planning. The aim of this paper was to develop and formulate a mathematical model and algorithms to find the optimum supply plan using an economic criterion and a model for evaluating the probability of failure-free operation of the supply chain. Methods: The mathematical model and algorithms to find the optimum supply plan were developed and formulated using an economic criterion and a model for evaluating the probability of failure-free operation of the supply chain. Results and conclusions: The problem of ensuring failure-free performance of a goods supply channel analyzed in the paper is characteristic of distributed network systems that make active use of business process outsourcing technologies. The complex planning problem occurring in such systems, which requires taking into account the consumer's requirements for failure-free performance in terms of supply volumes and correctness, can be reduced to a relatively simple linear programming problem through logical analysis of the structures. The sequence of operations to be taken into account when planning supplies with respect to the supplier's functional reliability is presented.
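
    As a hedged sketch of the reduction to linear programming described above, the example below chooses supply volumes from two suppliers so as to minimize cost while meeting total demand and a minimum share from the more reliable supplier; the costs, demand, and 40% reliability constraint are invented for illustration.

      # Illustrative supply-planning sketch as a linear program (invented numbers).
      from scipy.optimize import linprog

      cost = [5.0, 4.2]            # unit costs: supplier 1 (more reliable), supplier 2
      demand = 1000.0

      A_eq = [[1.0, 1.0]]          # x1 + x2 == demand
      b_eq = [demand]
      A_ub = [[-0.6, 0.4]]         # x1 >= 0.4*(x1 + x2)  <=>  -0.6*x1 + 0.4*x2 <= 0
      b_ub = [0.0]

      res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                    bounds=[(0, None)] * 2)
      print("Optimal volumes:", res.x, "total cost:", res.fun)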

  7. Neural processes underlying the "same"-"different" judgment of two simultaneously presented objects--an EEG study.

    Directory of Open Access Journals (Sweden)

    Ruiling Zhang

    Full Text Available The present study investigated the neural processes underlying "same" and "different" judgments for two simultaneously presented objects that varied on one or both of two dimensions: color and shape. Participants judged whether or not the two objects were "same" or "different" on either the color dimension (color task) or the shape dimension (shape task). The unattended irrelevant dimension of the objects was either congruent (same-same; different-different) or incongruent (same-different). ERP data showed a main effect of color congruency in the time window 190-260 ms post-stimulus presentation and a main effect of shape congruency in the time window 220-280 ms post-stimulus presentation in both color and shape tasks. The interaction between color and shape congruency in the ERP data occurred in a later time window than the two main effects, indicating that mismatches in task-relevant and task-irrelevant dimensions were processed automatically and independently before a response was selected. The fact that the interference of the task-irrelevant dimension occurred after mismatch detection supports a confluence model of processing.

  8. Variable processing and cross-presentation of HIV by dendritic cells and macrophages shapes CTL immunodominance and immune escape.

    Directory of Open Access Journals (Sweden)

    Jens Dinter

    2015-03-01

    Full Text Available Dendritic cells (DCs) and macrophages (Møs) internalize and process exogenous HIV-derived antigens for cross-presentation by MHC-I to cytotoxic CD8⁺ T cells (CTL). However, how degradation patterns of HIV antigens in the cross-presentation pathways affect immunodominance and immune escape is poorly defined. Here, we studied the processing and cross-presentation of dominant and subdominant HIV-1 Gag-derived epitopes and HLA-restricted mutants by monocyte-derived DCs and Møs. The cross-presentation of HIV proteins by both DCs and Møs led to higher CTL responses specific for immunodominant epitopes. The low CTL responses to subdominant epitopes were increased by pretreatment of target cells with peptidase inhibitors, suggestive of higher intracellular degradation of the corresponding peptides. Using DC and Mø cell extracts as a source of cytosolic, endosomal or lysosomal proteases to degrade long HIV peptides, we identified by mass spectrometry cell-specific and compartment-specific degradation patterns, which favored the production of peptides containing immunodominant epitopes in all compartments. The intracellular stability of optimal HIV-1 epitopes prior to loading onto MHC was highly variable and sequence-dependent in all compartments, and followed CTL hierarchy with immunodominant epitopes presenting higher stability rates. Common HLA-associated mutations in a dominant epitope appearing during acute HIV infection modified the degradation patterns of long HIV peptides, reduced intracellular stability and epitope production in cross-presentation-competent cell compartments, showing that impaired epitope production in the cross-presentation pathway contributes to immune escape. These findings highlight the contribution of degradation patterns in the cross-presentation pathway to HIV immunodominance and provide the first demonstration of immune escape affecting epitope cross-presentation.

  9. Analysis of information security reliability: A tutorial

    International Nuclear Information System (INIS)

    Kondakci, Suleyman

    2015-01-01

    This article presents a concise reliability analysis of network security abstracted from stochastic modeling, reliability, and queuing theories. Network security analysis is composed of threats, their impacts, and recovery of the failed systems. A unique framework with a collection of the key reliability models is presented here to guide the determination of the system reliability based on the strength of malicious acts and performance of the recovery processes. A unique model, called Attack-obstacle model, is also proposed here for analyzing systems with immunity growth features. Most computer science curricula do not contain courses in reliability modeling applicable to different areas of computer engineering. Hence, the topic of reliability analysis is often too diffuse to most computer engineers and researchers dealing with network security. This work is thus aimed at shedding some light on this issue, which can be useful in identifying models, their assumptions and practical parameters for estimating the reliability of threatened systems and for assessing the performance of recovery facilities. It can also be useful for the classification of processes and states regarding the reliability of information systems. Systems with stochastic behaviors undergoing queue operations and random state transitions can also benefit from the approaches presented here. - Highlights: • A concise survey and tutorial in model-based reliability analysis applicable to information security. • A framework of key modeling approaches for assessing reliability of networked systems. • The framework facilitates quantitative risk assessment tasks guided by stochastic modeling and queuing theory. • Evaluation of approaches and models for modeling threats, failures, impacts, and recovery analysis of information systems
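
    As a small, hedged illustration of the failure-and-recovery view taken in the tutorial, the sketch below computes steady-state availability for a system subject to exponentially distributed times to compromise and exponential recovery; the rates are invented and the two-state model is far simpler than the framework in the article.

      # Two-state (operational/compromised) availability sketch with illustrative rates.
      attack_rate = 1 / 400.0    # one successful attack every 400 hours on average
      recovery_rate = 1 / 8.0    # mean restore time of 8 hours

      mttf = 1 / attack_rate     # mean time to (security) failure
      mttr = 1 / recovery_rate   # mean time to recover
      availability = mttf / (mttf + mttr)
      print(f"Steady-state availability = {availability:.4f}")   # about 0.9804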

  10. Reliability of functioning and reserves of system, controlling movements with different coordination structure of special health group girl students in physical education process

    Directory of Open Access Journals (Sweden)

    A.A. Pryimakov

    2017-04-01

    Full Text Available Purpose: to study the reliability of functioning and the reserves of the movement-control system for movements of different coordination structure in special health group girl students (low health level) in the physical education process. Material: 136 special health group girl students aged 17-19 participated in the research. They were divided into two groups, control and experimental. A program directed at increasing the reliability and reserves of the movement-control system was implemented. It was based on physical exercises of complicated coordination with novelty elements, performed to musical accompaniment. The research lasted one academic year. Results: girl students with health problems showed higher differential thresholds when reproducing local movements under complicated conditions. They used visual and auditory feedback channels to inform the brain's programming areas about mistakes made. They were less trainable in learning accurate movements. These girl students have less pronounced compensation reserves under the impact of hindering factors and interference, which can be interpreted as a non-specific negative cross-effect on the motor functional system in the case of health problems. All of this reduces the reserve potential of the motor control system. Conclusions: the main criteria of the reserve potential of control over movements of different coordination structure are: the quickness of switching to the program mechanism of fine movement regulation under stable conditions; the power and effectiveness of compensatory reactions ensuring the interference immunity of the movement-control system under interfering factors; the reliability of maintaining the qualitative parameters of movements in the optimal range under interfering factors; and the reduction of sensory interconnections under stable functioning conditions.

  11. Reliable solution processed planar perovskite hybrid solar cells with large-area uniformity by chloroform soaking and spin rinsing induced surface precipitation

    Directory of Open Access Journals (Sweden)

    Yann-Cherng Chern

    2015-08-01

    Full Text Available A solvent soaking and rinsing method, in which the solvent was allowed to soak all over the surface followed by a spinning for solvent draining, was found to produce perovskite layers with high uniformity on a centimeter scale and with much improved reliability. Besides the enhanced crystallinity and surface morphology due to the rinsing-induced surface precipitation that constrains the grain growth underneath in the precursor films, large-area uniformity with film thickness determined exclusively by the rotational speed of the rinsing spinning for solvent draining was observed. With chloroform as rinsing solvent, highly uniform and mirror-like perovskite layers of area as large as 8 cm × 8 cm were produced and highly uniform planar perovskite solar cells with power conversion efficiency of 10.6 ± 0.2% as well as much prolonged lifetime were obtained. The high uniformity and reliability observed with this solvent soaking and rinsing method were ascribed to the low viscosity of chloroform as well as its feasibility of mixing with the solvent used in the precursor solution. Moreover, since the surface precipitation forms before the solvent draining, this solvent soaking and rinsing method may be adapted to a spinless process and be compatible with large-area and continuous production. With the large-area uniformity and reliability of the resultant perovskite layers, this chloroform soaking and rinsing approach may thus be promising for the mass production and commercialization of large-area perovskite solar cells.

  12. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology whereas this has yet to be fully achieved for large scale structures. Structural loading variants over the half-time of the plant are considered to be more difficult to analyse than for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions are considered which enter this problem. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)

  13. Presentation of the project Thermal processes with energy recovery for sludge and special or hazardous waste disposal

    International Nuclear Information System (INIS)

    Mininni, G.; Passino, R.

    2001-01-01

    Main results obtained in the framework of the project Thermal processes with energy recovery for sludge and special waste (also hazardous) disposal, funded with 2.16 M Euro by the Italian Ministry of Education through Structural Funds, are reported. This project is subdivided into the following four main sub-projects: combustion of hazardous sludges in a demonstration plant; combustion of sludges in fluidized bed reactors at pilot and laboratory scale; set-up of a flue gas sampling device at high temperature; fluid dynamic modelling of combustion. Original and interesting results were produced, mainly connected with the following aspects: combined incineration of sewage and hazardous sludges appeared to be reliable, considering that gaseous emissions did not show any deterioration following the addition of chlorinated hydrocarbons to sewage sludge. Gaseous emissions were found not to conform with the standards only in very few cases, generally not linked with critical conditions: polynuclear aromatic hydrocarbons (PAHs) were found to be a reliable parameter for emissions monitoring considering that they are produced in abundance following upset conditions due to a temperature decline or an insufficient oxygen supply; the very stringent emission limit of 0.1 ng/m3 for dioxins and furans (TE) can be respected in most of the cases but this monitoring is very complex, time consuming and expensive. Sampling and extraction errors might considerably impact the measured value, which is much more dependent on the conditions inside the recovery boiler than on the combustion efficiency; metals are enriched in the fly ashes more in the tests carried out with the rotating drum furnace than in the tests with the fluidized bed furnace; the after-burning chamber mode of operation did not display a direct influence on the emissions; incineration tests carried out at reduced pilot scale have shown a total absence of fragmentation during volatilisation while particulate

  14. Introduction to quality and reliability engineering

    CERN Document Server

    Jiang, Renyan

    2015-01-01

    This book presents the state-of-the-art in quality and reliability engineering from a product life cycle standpoint. Topics in reliability include reliability models, life data analysis and modeling, design for reliability and accelerated life testing, while topics in quality include design for quality, acceptance sampling and supplier selection, statistical process control, production tests such as screening and burn-in, warranty and maintenance. The book provides comprehensive insights into two closely related subjects, and includes a wealth of examples and problems to enhance reader comprehension and link theory and practice. All numerical examples can be easily solved using Microsoft Excel. The book is intended for senior undergraduate and post-graduate students in related engineering and management programs such as mechanical engineering, manufacturing engineering, industrial engineering and engineering management programs, as well as for researchers and engineers in the quality and reliability fields. D...

  15. Increasing reliability of defect characterization on sg tubings using a combination of signal processing and expert system

    International Nuclear Information System (INIS)

    Benoist, B.; David, B.; Pigeon, M.

    1989-01-01

    An expert system has been developed for the automatic analysis of eddy current signals provided by the multifrequency inspection of steam generator tubing. This article describes, on the one hand, the aim and the results of the elimination of pilgrim noise and, on the other hand, the expert system, which uses signal analysis and signal processing in unison

  16. An automated process for building reliable and optimal in vitro/in vivo correlation models based on Monte Carlo simulations.

    Science.gov (United States)

    Sutton, Steven C; Hu, Mingxiu

    2006-05-05

    Many mathematical models have been proposed for establishing an in vitro/in vivo correlation (IVIVC). The traditional IVIVC model building process consists of 5 steps: deconvolution, model fitting, convolution, prediction error evaluation, and cross-validation. This is a time-consuming process and typically a few models at most are tested for any given data set. The objectives of this work were to (1) propose a statistical tool to screen models for further development of an IVIVC, (2) evaluate the performance of each model under different circumstances, and (3) investigate the effectiveness of common statistical model selection criteria for choosing IVIVC models. A computer program was developed to explore which model(s) would be most likely to work well with a random variation from the original formulation. The process used Monte Carlo simulation techniques to build IVIVC models. Data-based model selection criteria (Akaike Information Criteria [AIC], R2) and the probability of passing the Food and Drug Administration "prediction error" requirement was calculated. To illustrate this approach, several real data sets representing a broad range of release profiles are used to illustrate the process and to demonstrate the advantages of this automated process over the traditional approach. The Hixson-Crowell and Weibull models were often preferred over the linear. When evaluating whether a Level A IVIVC model was possible, the model selection criteria AIC generally selected the best model. We believe that the approach we proposed may be a rapid tool to determine which IVIVC model (if any) is the most applicable.
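
    To make the model-screening step concrete, the sketch below fits two candidate dissolution models (first-order and Weibull) to an in vitro release profile and compares them by AIC under a Gaussian error assumption; the data points and the restriction to two models are illustrative assumptions and omit the convolution and prediction-error steps of the full process.

      # Illustrative sketch: screening dissolution models by AIC (synthetic data).
      import numpy as np
      from scipy.optimize import curve_fit

      t = np.array([0.5, 1, 2, 4, 6, 8, 12])                # hours
      released = np.array([18, 32, 53, 76, 87, 93, 98])     # percent dissolved

      def first_order(t, k):
          return 100.0 * (1.0 - np.exp(-k * t))

      def weibull(t, td, b):
          return 100.0 * (1.0 - np.exp(-((t / td) ** b)))

      def aic(model, params):
          rss = np.sum((released - model(t, *params)) ** 2)
          n, k = len(t), len(params)
          return n * np.log(rss / n) + 2 * k                # AIC up to a constant

      p_fo, _ = curve_fit(first_order, t, released, p0=[0.3])
      p_wb, _ = curve_fit(weibull, t, released, p0=[2.0, 1.0])
      print("AIC first-order:", round(aic(first_order, p_fo), 1))
      print("AIC Weibull:    ", round(aic(weibull, p_wb), 1))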

  17. OSS reliability measurement and assessment

    CERN Document Server

    Yamada, Shigeru

    2016-01-01

    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.

  18. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2 - 'Human Reliability'. This group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology - GIS. It is composed of representatives of industry, representatives of research institutes, of technical control boards and universities, whose job it is to study how man fits into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the part of ergonomy dealing with human reliability in using technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de]

  19. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    [Fragment of the report's figure list: inverters connected in a chain; Figure 3, typical graph showing frequency versus square root of ...] The report describes developing an experimental reliability estimating methodology that could both illuminate the lifetime reliability of advanced devices, circuits and ... or FIT of the device. In other words, an accurate estimate of the device lifetime was found and thus the reliability that can be conveniently ...

  20. Vibrotactile Identification of Signal-Processed Sounds from Environmental Events Presented by a Portable Vibrator: A Laboratory Study

    Directory of Open Access Journals (Sweden)

    Parivash Ranjbar

    2008-09-01

    Full Text Available Objectives: To evaluate different signal-processing algorithms for tactile identification of environmental sounds in a monitoring aid for the deafblind. Subjects: Two men and three women, sensorineurally deaf or profoundly hearing impaired with experience of vibratory experiments, aged 22-36 years. Methods: A closed set of 45 representative environmental sounds was processed using two transposing (TRHA, TR1/3) and three modulating algorithms (AM, AMFM, AMMC) and presented as tactile stimuli using a portable vibrator in three experiments. The algorithms TRHA, TR1/3, AMFM and AMMC had two alternatives (with and without adaptation to vibratory thresholds). In Exp. 1, the sounds were preprocessed and directly fed to the vibrator. In Exp. 2 and 3, the sounds were presented in an acoustic test room, without or with background noise (SNR=+5 dB), and processed in real time. Results: In Exp. 1, Algorithms AMFM and AMFM(A) consistently had the lowest identification scores, and were thus excluded in Exp. 2 and 3. TRHA, AM, AMMC, and AMMC(A) showed comparable identification scores (30%-42%) and the addition of noise did not deteriorate the performance. Discussion: Algorithms TRHA, AM, AMMC, and AMMC(A) showed good performance in all three experiments and were robust in noise; they can therefore be used in further testing in real environments.

  1. Supervisor/Peer Involvement in Evaluation Transfer of Training Process and Results Reliability: A Research in an Italian Public Body

    Science.gov (United States)

    Capaldo, Guido; Depolo, Marco; Rippa, Pierluigi; Schiattone, Domenico

    2017-01-01

    Purpose: The aim of this paper is to present a study performed in conjunction with a branch of the Italian Public Administration, the ISSP (Istituto Superiore di Studi Penitenziari--the Higher Institute of Penitentiary Studies). The study aimed to develop a Transfer of Training (ToT) evaluation methodology that would be both scientifically…

  2. Corantes alimentares presentes em alimentos ultraprocessados consumidos por universitários / Food dyes present in ultra-processed foods consumed by university students

    OpenAIRE

    Dayana Nolasco Gama; Maria Lúcia Teixeira Polônio

    2018-01-01

    Objective: To describe the food dyes present in the ultra-processed foods consumed by 273 undergraduates of a public university in Rio de Janeiro. Methods: The sociodemographic and health profile was characterized using a semi-structured questionnaire. Consumption of ultra-processed foods was obtained through the Food Frequency Questionnaire (gelatins, filled biscuits, candies and chewing gum, soft drinks, powdered drink mixes, industrialized juices, seasoning...

  3. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    International Nuclear Information System (INIS)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao; Carlson, Reed B.; Yoo, Tae-Sic

    2017-01-01

    Highlights: • Process monitoring can strengthen nuclear safeguards and material accountancy. • Assessment is conducted at a system-centric level to improve safeguards effectiveness. • Anomaly detection is improved by integrating process and operation relationships. • Decision making benefits from using sensor and event sequence information. • Formal framework enables optimization of sensor and data processing resources. - Abstract: In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit from not only known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and results are discussed.
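
    A toy sketch of the system-centric idea of combining a time-driven check (signal residuals against a model prediction) with an event-driven check (whether observed operations respect an expected order). The signals, threshold, and event sequences are invented; the paper's framework is far richer.

      # Toy sketch: combine a time-series residual check with an event-order check.
      import numpy as np

      def time_driven_alarm(measured, predicted, threshold=4.0):
          """Flag samples whose residual exceeds `threshold` standard deviations."""
          residual = measured - predicted
          z = (residual - residual.mean()) / residual.std()
          return bool((np.abs(z) > threshold).any())

      def event_driven_alarm(observed, expected_order):
          """Flag if the observed events do not appear in the expected relative order."""
          it = iter(expected_order)
          return not all(ev in it for ev in observed)

      measured = np.random.default_rng(1).normal(100.0, 0.5, 500)   # synthetic signal
      predicted = np.full(500, 100.0)                               # model prediction
      observed_events = ["load", "transfer", "heat", "unload"]      # out of order on purpose

      anomaly = time_driven_alarm(measured, predicted) or \
                event_driven_alarm(observed_events, ["load", "heat", "stir", "transfer", "unload"])
      print("Anomaly declared:", anomaly)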

  4. Development of improved processing and evaluation methods for high reliability structural ceramics for advanced heat engine applications, Phase 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Pujari, V.K.; Tracey, D.M.; Foley, M.R.; Paille, N.I.; Pelletier, P.J.; Sales, L.C.; Wilkens, C.A.; Yeckley, R.L. [Norton Co., Northboro, MA (United States)]

    1993-08-01

    The program goals were to develop and demonstrate significant improvements in processing methods, process controls and non-destructive evaluation (NDE) which can be commercially implemented to produce high reliability silicon nitride components for advanced heat engine applications at temperatures to 1,370°C. The program focused on a Si₃N₄-4% Y₂O₃ high temperature ceramic composition and hot-isostatic-pressing as the method of densification. Stage I had as major objectives: (1) comparing injection molding and colloidal consolidation process routes, and selecting one route for subsequent optimization, (2) comparing the performance of water milled and alcohol milled powder and selecting one on the basis of performance data, and (3) adapting several NDE methods to the needs of ceramic processing. The NDE methods considered were microfocus X-ray radiography, computed tomography, ultrasonics, NMR imaging, NMR spectroscopy, fluorescent liquid dye penetrant and X-ray diffraction residual stress analysis. The colloidal consolidation process route was selected and approved as the forming technique for the remainder of the program. The material produced by the final Stage II optimized process has been given the designation NCX 5102 silicon nitride. According to plan, a large number of specimens were produced and tested during Stage III to establish a statistically robust room temperature tensile strength database for this material. Highlights of the Stage III process demonstration and resultant database are included in the main text of the report, along with a synopsis of the NCX-5102 aqueous based colloidal process. The R and D accomplishments for Stage I are discussed in Appendices 1--4, while the tensile strength-fractography database for the Stage III NCX-5102 process demonstration is provided in Appendix 5. 4 refs., 108 figs., 23 tabs.

  5. Development Of 12 Head GAMMA Detection And Graphical Presentation Software Suitable For Industrial Process Investigation By Radiotracer Technique

    International Nuclear Information System (INIS)

    Saengchantr, Dhanaj; Chueinta, Siripone

    2009-07-01

    Full text: Data-logging software with prompt graphical presentation, accommodating gamma radiation signals from 12 scintillation detectors through a standard RS-232 interface, has been developed. Laboratory testing by detection of an injected and mixed radioactive tracer in a fluid flowing inside a pipe was conducted. The radioactive fluid passed through the detectors located at several points along the pipe, and the generated signals corresponding to the mass flow inside the pipe were recorded. Up to 10,000 data points at a fast (20 millisecond) dwell time could be accumulated. Graphical presentation allowed fast interpretation, while the output data were suitable for more accurate evaluation with standard software, e.g. Residence Time Distribution (RTD) and Computed Tomography Visualization. Further utilization in industry, in conjunction with radiotracer techniques, for troubleshooting and process optimization will be carried out

  6. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of the software failure/correction processes. Later on, MERF was approximated by a much simpler exponential reliability function (EARF), which keeps most of MERF's mathematical properties, so the two functions together make up a single reliability model. The reliability model MERF/EARF considers the software failure process as a non-homogeneous Poisson process (NHPP), and the repair (correction) process, a multinomial distribution. The model supposes that both processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties and its application to software reliability. Nevertheless, model applications to inspection and maintenance of physical systems are also foreseen. The paper includes a complete numerical example of the model's application to a software reliability analysis
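
    The sketch below is not the MERF/EARF model itself but a minimal NHPP example in the same spirit: the classical Goel-Okumoto mean value function m(t) = a(1 - exp(-b t)) is fitted to cumulative failure counts by least squares, and a one-week reliability is derived from the fitted mean value function. The failure data and initial guesses are invented.

      # Minimal NHPP sketch (Goel-Okumoto mean value function); synthetic failure counts.
      import numpy as np
      from scipy.optimize import curve_fit

      weeks = np.arange(1, 13)
      cum_failures = np.array([12, 21, 29, 35, 40, 44, 47, 50, 52, 53, 54, 55])

      def m(t, a, b):
          """Expected cumulative failures m(t) = a * (1 - exp(-b t))."""
          return a * (1.0 - np.exp(-b * t))

      (a, b), _ = curve_fit(m, weeks, cum_failures, p0=[60.0, 0.2])
      # For an NHPP, P(no failure in (t, t+x]) = exp(-(m(t+x) - m(t))).
      r_next_week = np.exp(-(m(weeks[-1] + 1, a, b) - m(weeks[-1], a, b)))
      print(f"Estimated total faults a = {a:.1f}, detection rate b = {b:.2f}")
      print(f"P(no failure in the next week) = {r_next_week:.2f}")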

  7. The processing of infrequently-presented low-intensity stimuli during natural sleep: An event-related potential study

    Directory of Open Access Journals (Sweden)

    Alexandra Muller-Gass

    2010-01-01

    Full Text Available Event-related potentials (ERPs) provide an exquisite means to measure the extent of processing of external stimuli during the sleep period. This study examines ERPs elicited by stimuli with physical characteristics akin to environmental noise encountered during sleep. Brief duration 40, 60 or 80 dB sound pressure level (SPL) tones were presented either rapidly (on average every two seconds) or slowly (on average every 10 seconds). The rates of presentation and intensity of the stimuli were similar to those observed in environmental studies of noise. ERPs were recorded from nine young adults during sleep and wakefulness. During wakefulness, the amplitude of an early negative ERP, N1, systematically increased as intensity level increased. A later positivity, the P3a, was apparent following the loudest 80 dB stimulus regardless of the rate of stimulus presentation; it was also apparent following the 60 dB stimulus when stimuli were presented slowly. The appearance of the N1-P3a deflections suggests that the operations of the central executive controlling ongoing cognitive activity were interrupted, forcing subjects to become aware of the obtrusive task-irrelevant stimuli. The auditory stimuli elicited very different ERP patterns during sleep. During non-rapid eye movement (NREM) sleep, the ERP was characterized by an enhanced (relative to wakefulness) early positivity, P2, followed by a very prominent negativity, the N350. Both deflections systematically varied in amplitude with stimulus intensity level; in addition, N350 was much larger when stimuli were presented at slow rates. The N350, a sleep-specific ERP, is thought to reflect the inhibition of processing of potentially sleep-disrupting stimulus input. During rapid eye movement (REM) sleep, a small amplitude N1 was apparent in the ERP, but only for the loudest, 80 dB stimulus. A small (nonsignificant) P3a-like deflection was also visible following the 80 dB stimulus, but only when stimuli were presented

  8. Corantes alimentares presentes em alimentos ultraprocessados consumidos por universitários / Food dyes present in ultra-processed foods consumed by university students

    Directory of Open Access Journals (Sweden)

    Dayana Nolasco Gama

    2018-04-01

    Full Text Available Objective: To describe the food dyes present in the ultra-processed foods consumed by 273 undergraduate students of a public university in Rio de Janeiro. Methods: The sociodemographic and health profile was characterized using a semi-structured questionnaire. Consumption of ultra-processed foods was obtained through a Food Frequency Questionnaire (gelatins, filled biscuits, candies and chewing gum, soft drinks, powdered drink mixes, industrialized juices, ready-made seasonings and instant noodles), and the dyes were identified on the product labels. Results: The consumption of dye-containing products such as gelatins, candies and chewing gum, soft drinks and industrialized juices was above 80%. Candies and chewing gum and ready-made seasonings were consumed almost daily, by 56.9% and 54.1% respectively. Conclusion: Fourteen dyes were identified on the labels of the ultra-processed industrialized products. The artificial dyes caramel III and IV, bordeaux S, sunset yellow and tartrazine, and the natural dyes annatto and carmine stood out. Descriptors: Food dyes, eating habits, risk.

  9. Hawaii Electric System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
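
    The cost-integration argument in this record can be sketched with a toy calculation: pick the reserve capacity that minimises the sum of capacity cost and expected outage cost valued at the customers' value of lost load (VOLL). Every number below is invented for illustration and none is taken from the Hawaii data; the report develops the real argument with utility-specific figures.

```python
import numpy as np

# Toy trade-off between the cost of carrying reserve capacity and the expected
# cost of outages valued at customers' value of lost load (VOLL).
# All figures are illustrative assumptions, not Hawaii data.
VOLL = 15.0         # $/kWh unserved
OUTAGE_HOURS = 4.0  # assumed average annual hours of shortfall events
CAP_COST = 20.0     # $/kW-year of reserve capacity

# Monte Carlo draws of annual peak capacity shortfall before reserves (kW).
shortfall_kw = np.random.default_rng(1).normal(50_000.0, 40_000.0, 10_000)

def expected_outage_cost(reserve_kw):
    unserved_kw = np.clip(shortfall_kw - reserve_kw, 0.0, None)
    return VOLL * OUTAGE_HOURS * unserved_kw.mean()

reserves = np.arange(0, 150_001, 5_000)
total_cost = [CAP_COST * r + expected_outage_cost(r) for r in reserves]
best = reserves[int(np.argmin(total_cost))]
print(f"cost-minimising reserve is roughly {best / 1000:.0f} MW under these assumptions")
```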

  10. Hawaii electric system reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  11. Reliability and Probabilistic Risk Assessment - How They Play Together

    Science.gov (United States)

    Safie, Fayssal M.; Stutts, Richard; Huang, Zhaofeng

    2015-01-01

    The objective of this presentation is to discuss the PRA process and the reliability engineering discipline, their differences and similarities, and how they are used as complementary analyses to support design and flight decisions.

  12. Mathematical Methods in Survival Analysis, Reliability and Quality of Life

    CERN Document Server

    Huber, Catherine; Mesbah, Mounir

    2008-01-01

    Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics and stochastic processes) that are usually covered separately in spite of the similarity of the involved mathematical theory. This title aims to redress this situation: it includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.

  13. Intraclass reliability for assessing how well Taiwan constrained hospital-provided medical services using statistical process control chart techniques.

    Science.gov (United States)

    Chien, Tsair-Wei; Chou, Ming-Ting; Wang, Wen-Chung; Tsai, Li-Shu; Lin, Weir-Sen

    2012-05-15

    Few studies discuss the indicators used to assess the effect on cost containment in healthcare across hospitals in a single-payer national healthcare system with constrained medical resources. We present the intraclass correlation coefficient (ICC) to assess how well Taiwan constrained hospital-provided medical services in such a system. A custom Excel-VBA routine to record the distances of standard deviations (SDs) from the central line (the mean over the previous 12 months) of a control chart was used to construct and scale annual medical expenditures sequentially from 2000 to 2009 for 421 hospitals in Taiwan to generate the ICC. The ICC was then used to evaluate Taiwan's year-based convergent power to remain unchanged in hospital-provided constrained medical services. A bubble chart of SDs for a specific month was generated to present the effects of using control charts in a national healthcare system. ICCs were generated for Taiwan's year-based convergent power to constrain its medical services from 2000 to 2009. All hospital groups showed a gradually better-controlled supply of services, with the ICC decreasing from 0.772 to 0.415. The bubble chart identified outlier hospitals that required investigation of possible excessive reimbursements in a specific time period. We recommend using the ICC to annually assess a nation's year-based convergent power to constrain medical services across hospitals. Using sequential control charts to regularly monitor hospital reimbursements is required to achieve financial control in a single-payer nationwide healthcare system.
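
    A one-way random-effects ICC of the kind used here can be computed with a few lines of code. The sketch below applies the generic ANOVA-based ICC(1,1) formula to a made-up hospital-by-year matrix; it is not the Excel-VBA routine from the study, and the scores are placeholders for the scaled control-chart SD values described above.

```python
import numpy as np

def icc_one_way(data):
    """One-way random-effects ICC(1,1).

    data: 2-D array, rows = subjects (e.g. hospitals), columns = repeated
    measurements (e.g. yearly control-chart SD scores). Generic formula,
    not the exact routine used in the study.
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)
    ss_between = k * ((row_means - grand) ** 2).sum()
    ss_within = ((data - row_means[:, None]) ** 2).sum()
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Toy example: 5 hospitals, 4 yearly scores each (values are illustrative only).
scores = [[1.2, 1.1, 0.9, 1.0],
          [2.3, 2.1, 2.2, 2.4],
          [0.5, 0.7, 0.6, 0.4],
          [1.8, 1.9, 1.7, 1.6],
          [3.0, 2.8, 2.9, 3.1]]
print(f"ICC(1,1) = {icc_one_way(scores):.3f}")
```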

  14. Application of advanced oxidation process by electron beam irradiation in the organic compounds degradation present in industrial effluents

    International Nuclear Information System (INIS)

    Duarte, Celina Lopes

    1999-01-01

    The inefficacy of conventional methods for destroying the toxic organic compounds present in industrial effluents has prompted the search for new treatment technologies. Water irradiation is an efficient process for generating radicals that mineralise these compounds. A study was carried out to evaluate the Advanced Oxidation Process by electron beam irradiation for treating industrial effluents with high concentrations of toxic organic compounds. Experiments were conducted using a Radiation Dynamics electron beam accelerator with an energy of 1.5 MeV and a power of 37. The effluent samples from a large industrial complex were irradiated using IPEN's Liquid Effluent Irradiation Pilot Plant, and the effluent samples from five steps of a governmental wastewater treatment plant, SABESP - ETE Suzano (Industrial Receiver Unit, Coarse Bar Screens, Medium Bar Screens, Primary Sedimentation and Final Effluent), were irradiated in a batch system. Electron beam irradiation proved efficient in destroying the organic compounds present in these effluents, mainly chloroform, dichloroethane, methyl isobutyl ketone, benzene, toluene, xylene and phenol, and in decolorizing the dyes present in some samples. To remove 90% of most organic compounds, a dose of 20 kGy was necessary for the industry's ETE, 20 kGy for the IRU, CBS and MBS, and 10 kGy to 20 kGy for the PS and FE. (author)

  15. Pre-Brazed Casting and Hot Radial Pressing: A Reliable Process for the Manufacturing of CFC and W Monoblock Mockups

    International Nuclear Information System (INIS)

    Visca, E.; Libera, S.; Mancini, A.; Mazzone, G.; Pizzuto, A.; Testani, C.

    2006-01-01

    The ENEA association is involved in the European International Thermonuclear Experimental Reactor (ITER) R-and-D activities, in particular the manufacturing of high heat flux plasma-facing components (HHFC), such as the divertor targets, the baffles and the limiters. During the last years ENEA has manufactured actively cooled mock-ups using different technologies, namely brazing, diffusion bonding and hot isostatic pressing (HIPping). A new manufacturing process has been set up and tested, and it was successfully applied to the manufacturing of W-armoured monoblock mock-ups. This technique is Hot Radial Pressing (HRP), based on performing radial diffusion bonding between the cooling tube and the armour tile by pressurizing only the internal tube while keeping the joining zone under vacuum and at the required bonding temperature. The heating is obtained by a standard air furnace. The next step was to apply the HRP technique to the manufacturing of CFC-armoured monoblock components. For this purpose some issues had to be solved, such as the low CFC tensile strength, the pure copper interlayer between the heat sink and the armour necessary to mitigate the stress at the joint interface, and the low wettability of pure copper on the CFC matrix. This paper reports the research path followed to manufacture a medium-scale vertical target CFC- and W-armoured mock-up by HRP. An ad hoc rig able to keep the CFC in a constant compressive condition was also designed and tested. The casting of a soft copper interlayer between the tube and the tile was performed by a new technique: Pre-Brazed Casting (PBC, ENEA patent). Some mock-ups with three NB31 CFC tiles were successfully manufactured and tested to thermal fatigue using electron beam facilities. They all reached at least 1000 cycles at 20 MW/m² without suffering any damage. The manufactured medium-scale vertical target mock-up is now under testing at the FE2000 (France) facility. (author)

  16. A review of culturally adapted versions of the Oswestry Disability Index: the adaptation process, construct validity, test-retest reliability and internal consistency.

    Science.gov (United States)

    Sheahan, Peter J; Nelson-Wong, Erika J; Fischer, Steven L

    2015-01-01

    The Oswestry Disability Index (ODI) is a self-report-based outcome measure used to quantify the extent of disability related to low back pain (LBP), a substantial contributor to workplace absenteeism. The ODI tool has been adapted for use by patients in several non-English speaking nations. It is unclear, however, if these adapted versions of the ODI are as credible as the original ODI developed for English-speaking nations. The objective of this study was to conduct a review of the literature to identify culturally adapted versions of the ODI and to report on the adaptation process, construct validity, test-retest reliability and internal consistency of these ODIs. Following a pragmatic review process, data were extracted from each study with regard to these four outcomes. While most studies applied adaptation processes in accordance with best-practice guidelines, there were some deviations. However, all studies reported high-quality psychometric properties: group mean construct validity was 0.734 ± 0.094 (indicated via a correlation coefficient), test-retest reliability was 0.937 ± 0.032 (indicated via an intraclass correlation coefficient) and internal consistency was 0.876 ± 0.047 (indicated via Cronbach's alpha). Researchers can be confident when using any of these culturally adapted ODIs, or when comparing and contrasting results between cultures where these versions were employed. Implications for Rehabilitation Low back pain is the second leading cause of disability in the world, behind only cancer. The Oswestry Disability Index (ODI) has been developed as a self-report outcome measure of low back pain for administration to patients. An understanding of the various cross-cultural adaptations of the ODI is important for more concerted multi-national research efforts. This review examines 16 cross-cultural adaptations of the ODI and should inform the work of health care and rehabilitation professionals.

  17. Redefining reliability

    International Nuclear Information System (INIS)

    Paulson, S.L.

    1995-01-01

    Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options about the kind of natural gas service they need--and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and the new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas.

  18. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature of the uncertainties and their interplay is then developed, step-by-step. The concepts presented are illustrated by numerous examples throughout the text.

  19. RTE - 2013 Reliability Report

    International Nuclear Information System (INIS)

    Denis, Anne-Marie

    2014-01-01

    RTE publishes a yearly reliability report based on a standard model to facilitate comparisons and highlight long-term trends. The 2013 report not only states the facts of the Significant System Events (ESS), but also underlines the main elements dealing with the reliability of the electrical power system. It highlights the various elements which contribute to present and future reliability and provides an overview of the interaction between the various stakeholders of the Electrical Power System on the scale of the European Interconnected Network. (author)

  20. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

    Reliability is an important issue affecting each stage of the life cycle, ranging from birth to death of a product or a system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation; these data may be in the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis: it provides the information needed to prevent, detect and correct defects in the reliability design. Reliability data analysis proceeds along the various stages of the product life cycle and the associated reliability activities. Reliability data of Systems, Structures and Components (SSCs) in nuclear power plants is a key factor in probabilistic safety assessment (PSA), reliability-centered maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering, for example to represent manufacturing and delivery times. It is commonly used to model time to fail, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of SSCs in nuclear power plants, and an example is given to present the result of the new method. The Weibull distribution has a very strong ability to fit reliability data of mechanical equipment in nuclear power plants and is a widely used mathematical model for reliability analysis. The commonly used forms are the two-parameter and three-parameter Weibull distributions. Through comparison and analysis, the three-parameter Weibull distribution fits the data better: it reflects the reliability characteristics of the equipment and is closer to the actual situation. (author)
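
    For readers who want to reproduce the two- versus three-parameter comparison on their own data, the sketch below fits both forms with scipy and compares log-likelihoods. The failure times are invented for illustration; the record does not publish the plant data it used.

```python
import numpy as np
from scipy import stats

# Hypothetical time-to-failure data (hours) for a pump; values are illustrative.
ttf = np.array([812., 1290., 1940., 2430., 3010., 3620., 4480., 5230., 6100., 7420.])

# Two-parameter fit (location fixed at 0) versus three-parameter fit
# (location free), following the comparison described in the record.
shape2, loc2, scale2 = stats.weibull_min.fit(ttf, floc=0.0)
shape3, loc3, scale3 = stats.weibull_min.fit(ttf)

for name, (c, loc, scale) in [("2-parameter", (shape2, loc2, scale2)),
                              ("3-parameter", (shape3, loc3, scale3))]:
    loglik = stats.weibull_min.logpdf(ttf, c, loc=loc, scale=scale).sum()
    print(f"{name}: shape={c:.2f} loc={loc:.1f} scale={scale:.1f} logL={loglik:.2f}")

# Reliability at 2000 h under the three-parameter fit:
# R(t) = exp(-((t - loc) / scale) ** shape), i.e. the survival function.
print("R(2000 h) =", stats.weibull_min.sf(2000.0, shape3, loc=loc3, scale=scale3))
```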

  1. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  2. Reliability data book

    International Nuclear Information System (INIS)

    Bento, J.P.; Boerje, S.; Ericsson, G.; Hasler, A.; Lyden, C.O.; Wallin, L.; Poern, K.; Aakerlund, O.

    1985-01-01

    The main objective for the report is to improve failure data for reliability calculations as parts of safety analyses for Swedish nuclear power plants. The work is based primarily on evaluations of failure reports as well as information provided by the operation and maintenance staff of each plant. In the report are presented charts of reliability data for: pumps, valves, control rods/rod drives, electrical components, and instruments. (L.E.)

  3. The analytic hierarchy process as a systematic approach to the identification of important parameters for the reliability assessment of passive systems

    International Nuclear Information System (INIS)

    Zio, E.; Cantarella, M.; Cammi, A.

    2003-01-01

    Passive systems play a crucial role in the development of future solutions for nuclear plant technology. A fundamental issue still to be resolved is the quantification of the reliability of such systems. In this paper, we first illustrate a systematic methodology to guide the definition of the failure criteria of a passive system and the evaluation of its probability of occurrence, through the identification of the relevant system parameters and the propagation of their associated uncertainties. Within this methodology, we propose the use of the analytic hierarchy process as a structured and reproducible tool for the decomposition of the problem and the identification of the dominant system parameters. An example of its application to a real passive system is illustrated in detail
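
    The analytic hierarchy process step can be illustrated with a short routine that turns a pairwise-comparison matrix into priority weights (Saaty's principal-eigenvector method) and checks the consistency ratio. The judgement matrix below is a made-up example, not the expert judgements elicited for the passive system in the paper.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Principal-eigenvector priorities and consistency ratio for an AHP
    pairwise-comparison matrix (Saaty's method)."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                      # normalised priority weights
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)                      # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]      # Saaty's random indices (3-6 criteria)
    return w, ci / ri

# Example: ranking 4 parameters by their judged influence on passive-system failure.
A = [[1,   3,   5,   7],
     [1/3, 1,   3,   5],
     [1/5, 1/3, 1,   3],
     [1/7, 1/5, 1/3, 1]]
weights, cr = ahp_priorities(A)
print("priority weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3), "(< 0.10 is usually considered acceptable)")
```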

  4. Teflon/SiO2 Bilayer Passivation for Improving the Electrical Reliability of Oxide TFTs Fabricated Using a New Two-Photomask Self-Alignment Process

    Science.gov (United States)

    Fan, Ching-Lin; Shang, Ming-Chi; Li, Bo-Jyun; Lin, Yu-Zuo; Wang, Shea-Jue; Lee, Win-Der; Hung, Bohr-Ran

    2015-01-01

    This study proposes a two-photomask process for fabricating amorphous indium–gallium–zinc oxide (a-IGZO) thin-film transistors (TFTs) that exhibit a self-aligned structure. The fabricated TFTs, which lack etching-stop (ES) layers, have undamaged a-IGZO active layers that facilitate superior performance. In addition, we demonstrate a bilayer passivation method that uses a polytetrafluoroethylene (Teflon) and SiO2 combination layer for improving the electrical reliability of the fabricated TFTs. Teflon was deposited as a buffer layer through thermal evaporation. The Teflon layer exhibited favorable compatibility with the underlying IGZO channel layer and effectively protected the a-IGZO TFTs from plasma damage during SiO2 deposition, resulting in a negligible initial performance drop in the a-IGZO TFTs. Compared with passivation-free a-IGZO TFTs, passivated TFTs exhibited superior stability even after 168 h of aging under ambient air at 95% relative humidity. PMID:28788026

  5. Teflon/SiO₂ Bilayer Passivation for Improving the Electrical Reliability of Oxide TFTs Fabricated Using a New Two-Photomask Self-Alignment Process.

    Science.gov (United States)

    Fan, Ching-Lin; Shang, Ming-Chi; Li, Bo-Jyun; Lin, Yu-Zuo; Wang, Shea-Jue; Lee, Win-Der; Hung, Bohr-Ran

    2015-04-13

    This study proposes a two-photomask process for fabricating amorphous indium-gallium-zinc oxide (a-IGZO) thin-film transistors (TFTs) that exhibit a self-aligned structure. The fabricated TFTs, which lack etching-stop (ES) layers, have undamaged a-IGZO active layers that facilitate superior performance. In addition, we demonstrate a bilayer passivation method that uses a polytetrafluoroethylene (Teflon) and SiO₂ combination layer for improving the electrical reliability of the fabricated TFTs. Teflon was deposited as a buffer layer through thermal evaporation. The Teflon layer exhibited favorable compatibility with the underlying IGZO channel layer and effectively protected the a-IGZO TFTs from plasma damage during SiO₂ deposition, resulting in a negligible initial performance drop in the a-IGZO TFTs. Compared with passivation-free a-IGZO TFTs, passivated TFTs exhibited superior stability even after 168 h of aging under ambient air at 95% relative humidity.

  6. Teflon/SiO2 Bilayer Passivation for Improving the Electrical Reliability of Oxide TFTs Fabricated Using a New Two-Photomask Self-Alignment Process

    Directory of Open Access Journals (Sweden)

    Ching-Lin Fan

    2015-04-01

    Full Text Available This study proposes a two-photomask process for fabricating amorphous indium–gallium–zinc oxide (a-IGZO) thin-film transistors (TFTs) that exhibit a self-aligned structure. The fabricated TFTs, which lack etching-stop (ES) layers, have undamaged a-IGZO active layers that facilitate superior performance. In addition, we demonstrate a bilayer passivation method that uses a polytetrafluoroethylene (Teflon) and SiO2 combination layer for improving the electrical reliability of the fabricated TFTs. Teflon was deposited as a buffer layer through thermal evaporation. The Teflon layer exhibited favorable compatibility with the underlying IGZO channel layer and effectively protected the a-IGZO TFTs from plasma damage during SiO2 deposition, resulting in a negligible initial performance drop in the a-IGZO TFTs. Compared with passivation-free a-IGZO TFTs, passivated TFTs exhibited superior stability even after 168 h of aging under ambient air at 95% relative humidity.

  7. Enhancement of reliability of PLT-safety devices by utilization of process control components; Steigerung der Verfuegbarkeit von PLT-Schutzeinrichtungen durch Mitbenutzung von Komponenten des Prozessleitsystems

    Energy Technology Data Exchange (ETDEWEB)

    Gabriel, T.; Litz, L. [Technische Univ. Kaiserslautern (Germany); Schroers, B. [Material-Science AG, Leverkusen (Germany)

    2008-01-15

    According to the standard IEC 61511 each safety-related loop is assigned to one of the four Safety Integrity Levels (SILs). For every safety-related loop a SIL-specific Probability of Failure on Demand (PFD) must be proven. Usually, the PFD calculation is performed based upon the failure rates of each loop component aided by commercial software tools. However, this bottom-up approach suffers from many uncertainties. Especially, a lack of reliable failure rate data causes many problems. Reference data collected in different environments are available to solve this situation. However, this pragmatism leads to a PFD bandwidth, not to a single PFD value as desired. In order to make a decision for a numerical value appropriate for the chemical and pharmaceutical process industry a data ascertainment has been initiated by the European NAMUR. Its results display large deficiencies for the bottom-up approach. The error sources leading to this situation are located and analyzed. (GL)
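
    The bottom-up PFD calculation that the article critiques looks, in its simplest IEC 61508-style form, like the sketch below: dangerous-undetected failure rates per subsystem are turned into average PFDs and summed over the loop. The rates, proof-test interval and voting assumptions are illustrative placeholders, which is precisely the kind of uncertainty the NAMUR data ascertainment addresses.

```python
# Simplified IEC 61508-style average PFD for a safety-related loop, showing how
# the bottom-up calculation propagates component dangerous-undetected (DU)
# failure rates. All numbers are assumptions for illustration only.
TI = 8760.0   # proof-test interval, hours (1 year)

def pfd_1oo1(lambda_du):
    """Average PFD of a single (1oo1) channel between proof tests."""
    return lambda_du * TI / 2.0

def pfd_1oo2(lambda_du, beta=0.1):
    """Simplified 1oo2 voting: independent failures plus common-cause share."""
    independent = ((1.0 - beta) * lambda_du * TI) ** 2 / 3.0
    common_cause = beta * lambda_du * TI / 2.0
    return independent + common_cause

# Loop = sensor + logic solver + redundant final elements (series: PFDs add).
sensor, logic, valve = 1e-6, 1e-7, 5e-6    # DU failures per hour (assumed)
loop_pfd = pfd_1oo1(sensor) + pfd_1oo1(logic) + pfd_1oo2(valve)
print(f"loop PFDavg = {loop_pfd:.2e}  (the SIL 2 band is 1e-3 to 1e-2)")
```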

  8. Reliability and protection against failure in computer systems

    International Nuclear Information System (INIS)

    Daniels, B.K.

    1979-01-01

    Computers are being increasingly integrated into the control and safety systems of large and potentially hazardous industrial processes. This development introduces problems which are particular to computer systems and opens the way to new techniques of solving conventional reliability and availability problems. References to the developing fields of software reliability, human factors and software design are given, and these subjects are related, where possible, to the quantified assessment of reliability. Original material is presented in the areas of reliability growth and computer hardware failure data. The report draws on the experience of the National Centre of Systems Reliability in assessing the capability and reliability of computer systems both within the nuclear industry, and from the work carried out in other industries by the Systems Reliability Service. (author)

  9. Process design kit and circuits at a 2 µm technology node for flexible wearable electronics applications (Conference Presentation)

    Science.gov (United States)

    Torres-Miranda, Miguel; Petritz, Andreas; Gold, Herbert; Stadlober, Barbara

    2016-09-01

    In this work we present our most advanced technology node of organic thin film transistors (OTFTs) manufactured with a channel length as short as 2 μm by contact photolithography and a self-alignment process directly on a plastic substrate. Our process design kit (PDK) is described with P-type transistors, capacitors and 3 metal layers for connections of complex circuits. The OTFTs are composed of a double dielectric layer with a photopatternable ultra thin polymer (PNDPE) and alumina, with a thickness on the order of 100 nm. The organic semiconductor is either Pentacene or DNTT, which have a stable average mobility up to 0.1 cm2/Vs. A polymer (e.g. Parylene-C) is then used as a passivation layer. We also describe our design rules for the placement of standard circuit cells. A "plastic wafer" is fabricated containing 49 dies. Each die of 1 cm2 has between 25 and 50 devices, proving larger scale integration in such a small space, unique in organic technologies. Finally, we present the design (by simulations using a Spice model for OTFTs) and the test of analog and digital basic circuits: amplifiers with DC gains of about 20 dB, comparators, inverters and logic gates working in the frequency range of 1-10 kHz. These standard circuit cells could be used for signal conditioning and integrated as active matrices for flexible sensors from 3rd party institutions, thus opening our fab to new ideas and sophisticated pre-industrial low cost applications for the emerging fields of biomedical devices and wearable electronics for virtual/augmented reality.

  10. The effect of modularity representation and presentation medium on the understandability of business process models in BPMN

    NARCIS (Netherlands)

    Turetken, Oktay; Rompen, Tessa; Vanderfeesten, Irene; Dikici, Ahmet; van Moll, Jan; La Rosa, M.; Loos, P.; Pastor, O.

    2016-01-01

    Many factors influence the creation of understandable business process models for an appropriate audience. Understandability of process models becomes critical particularly when a process is complex and its model is large in structure. Using modularization to represent such models hierarchically

  11. System reliability of corroding pipelines

    International Nuclear Information System (INIS)

    Zhou Wenxing

    2010-01-01

    A methodology is presented in this paper to evaluate the time-dependent system reliability of a pipeline segment that contains multiple active corrosion defects and is subjected to stochastic internal pressure loading. The pipeline segment is modeled as a series system with three distinctive failure modes due to corrosion, namely small leak, large leak and rupture. The internal pressure is characterized as a simple discrete stochastic process that consists of a sequence of independent and identically distributed random variables each acting over a period of one year. The magnitude of a given sequence follows the annual maximum pressure distribution. The methodology is illustrated through a hypothetical example. Furthermore, the impact of the spatial variability of the pressure loading and pipe resistances associated with different defects on the system reliability is investigated. The analysis results suggest that the spatial variability of pipe properties has a negligible impact on the system reliability. On the other hand, the spatial variability of the internal pressure, initial defect sizes and defect growth rates can have a significant impact on the system reliability.
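
    A stripped-down version of the described analysis can be mocked up as a Monte Carlo over defect growth and annual maximum pressures, with the segment failing when any defect fails (series system). The failure criteria, burst-capacity formula and all parameter values below are simplifications invented for illustration and are not the corrosion models used in the paper.

```python
import random

# Minimal Monte Carlo sketch of time-dependent failure probability for a pipe
# segment with several growing corrosion defects, treated as a series system.
WT = 10.0          # wall thickness, mm (assumed)
D = 600.0          # pipe diameter, mm (assumed)
SMTS = 550.0       # tensile strength, MPa (assumed)

def burst_pressure(depth):
    """Crude burst capacity of a defect, decreasing linearly with depth."""
    if depth >= WT:
        return 0.0
    return 2.0 * WT * SMTS / D * (1.0 - depth / WT)

def simulate_segment(horizon_years, n_defects=3, trials=20000, seed=7):
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        depths = [rng.uniform(1.0, 3.0) for _ in range(n_defects)]   # initial depths, mm
        growth = [rng.uniform(0.1, 0.4) for _ in range(n_defects)]   # growth rates, mm/year
        failed = False
        for year in range(1, horizon_years + 1):
            p_max = rng.gauss(8.0, 0.8)   # annual maximum operating pressure, MPa
            for d0, g in zip(depths, growth):
                depth = d0 + g * year
                if depth > 0.8 * WT or p_max > burst_pressure(depth):
                    failed = True          # small leak OR burst -> series-system failure
                    break
            if failed:
                break
        failures += failed
    return failures / trials

for yrs in (10, 20, 30):
    print(f"P(failure within {yrs} y) ~ {simulate_segment(yrs):.4f}")
```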

  12. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)

  13. Assessing reliability in energy supply systems

    International Nuclear Information System (INIS)

    McCarthy, Ryan W.; Ogden, Joan M.; Sperling, Daniel

    2007-01-01

    Reliability has always been a concern in the energy sector, but concerns are escalating as energy demand increases and the political stability of many energy supply regions becomes more questionable. But how does one define and measure reliability? We introduce a method to assess reliability in energy supply systems in terms of adequacy and security. It derives from reliability assessment frameworks developed for the electricity sector, which are extended to include qualitative considerations and to be applicable to new energy systems by incorporating decision-making processes based on expert opinion and multi-attribute utility theory. The method presented here is flexible and can be applied to any energy system. To illustrate its use, we apply the method to two hydrogen pathways: (1) centralized steam reforming of imported liquefied natural gas with pipeline distribution of hydrogen, and (2) on-site electrolysis of water using renewable electricity produced independently from the electricity grid

  14. A Method of Nuclear Software Reliability Estimation

    International Nuclear Information System (INIS)

    Park, Gee Yong; Eom, Heung Seop; Cheon, Se Woo; Jang, Seung Cheol

    2011-01-01

    A method for estimating software reliability for nuclear safety software is proposed. This method is based on the software reliability growth model (SRGM), where the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Several modeling schemes are presented in order to estimate and predict more precisely the number of software defects based on a small amount of software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating the software test cases into the model. It is identified that this method is capable of accurately estimating the remaining number of software defects of the on-demand type that directly affect safety trip functions. The software reliability can be estimated from a model equation, and one method of obtaining the software reliability is proposed
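
    To make the SRGM idea concrete, the sketch below fits a Goel-Okumoto NHPP to a handful of failure times by maximum likelihood and reports the estimated number of remaining defects and a conditional reliability. The paper itself uses Bayesian inference over several modelling schemes; the model form, data and estimator here are stand-ins.

```python
import numpy as np
from scipy.optimize import minimize

# Goel-Okumoto NHPP as a simple stand-in SRGM: m(t) = a * (1 - exp(-b*t)).
# Failure times (hours of testing) are made up for illustration.
failure_times = np.array([12., 25., 37., 60., 91., 140., 210., 300., 410., 560.])
T = 600.0   # total accumulated test time

def neg_log_likelihood(params):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    lam = a * b * np.exp(-b * failure_times)   # intensity at each observed failure
    m_T = a * (1.0 - np.exp(-b * T))           # expected number of failures by T
    return -(np.sum(np.log(lam)) - m_T)

res = minimize(neg_log_likelihood, x0=[20.0, 0.01], method="Nelder-Mead")
a_hat, b_hat = res.x
remaining = a_hat - len(failure_times)         # expected defects still in the code

def m(t):
    return a_hat * (1.0 - np.exp(-b_hat * t))

rel_10 = np.exp(-(m(T + 10.0) - m(T)))         # P(no failure in next 10 h of testing)
print(f"a = {a_hat:.1f}, b = {b_hat:.4f}, estimated remaining defects ~ {remaining:.1f}")
print(f"R(10 | T=600) = {rel_10:.3f}")
```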

  15. Simultaneous identification of DNA and RNA viruses present in pig faeces using process-controlled deep sequencing.

    Directory of Open Access Journals (Sweden)

    Jana Sachsenröder

    Full Text Available BACKGROUND: Animal faeces comprise a community of many different microorganisms including bacteria and viruses. Only scarce information is available about the diversity of viruses present in the faeces of pigs. Here we describe a protocol, which was optimized for the purification of the total fraction of viral particles from pig faeces. The genomes of the purified DNA and RNA viruses were simultaneously amplified by PCR and subjected to deep sequencing followed by bioinformatic analyses. The efficiency of the method was monitored using a process control consisting of three bacteriophages (T4, M13 and MS2) with different morphology and genome types. Defined amounts of the bacteriophages were added to the sample and their abundance was assessed by quantitative PCR during the preparation procedure. RESULTS: The procedure was applied to a pooled faecal sample of five pigs. From this sample, 69,613 sequence reads were generated. All of the added bacteriophages were identified by sequence analysis of the reads. In total, 7.7% of the reads showed significant sequence identities with published viral sequences. They mainly originated from bacteriophages (73.9% and mammalian viruses (23.9%; 0.8% of the sequences showed identities to plant viruses. The most abundant detected porcine viruses were kobuvirus, rotavirus C, astrovirus, enterovirus B, sapovirus and picobirnavirus. In addition, sequences with identities to the chimpanzee stool-associated circular ssDNA virus were identified. Whole genome analysis indicates that this virus, tentatively designated as pig stool-associated circular ssDNA virus (PigSCV), represents a novel pig virus. CONCLUSION: The established protocol enables the simultaneous detection of DNA and RNA viruses in pig faeces including the identification of so far unknown viruses. It may be applied in studies investigating aetiology, epidemiology and ecology of diseases. The implemented process control serves as quality control, ensures

  16. Optimal design of water supply networks for enhancing seismic reliability

    International Nuclear Information System (INIS)

    Yoo, Do Guen; Kang, Doosun; Kim, Joong Hoon

    2016-01-01

    The goal of the present study is to construct a reliability evaluation model of a water supply system taking seismic hazards into consideration and to present techniques for enhancing the hydraulic reliability of the design. To maximize seismic reliability with limited budgets, an optimal design model is developed using an optimization technique called harmony search (HS). The model is applied to actual water supply systems to determine pipe diameters that can maximize seismic reliability. The reliabilities of the optimal design and existing designs were compared and analyzed. The optimal design would enhance reliability by approximately 8.9% while having a construction cost approximately 1.3% lower than the current pipe construction cost. In addition, reinforcing the durability of individual pipes without considering the system produced ineffective results in terms of both cost and reliability. Therefore, to increase the supply capacity of the entire system, optimized pipe diameter combinations should be derived. A system configured through the optimal design could maximally secure hydraulic stability under normal conditions and available demand under abnormal conditions. - Highlights: • We construct a seismic reliability evaluation model of a water supply system. • We present a technique to enhance hydraulic reliability at the design stage. • The harmony search algorithm is applied in the optimal design process. • The proposed optimal design improves reliability by about 9%. • Optimized pipe diameter combinations should be derived indispensably.
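
    The optimisation step can be sketched with a bare-bones harmony search over discrete pipe diameters, as below. The objective (pipe cost plus a penalty when a crude capacity proxy falls short of demand) is a placeholder for the hydraulic and seismic reliability simulation that the study actually couples to the algorithm; all parameter values are arbitrary.

```python
import random

# Bare-bones harmony search (HS) over discrete pipe diameters.
DIAMETERS = [100, 150, 200, 250, 300, 350, 400]   # mm, candidate sizes (assumed)
N_PIPES = 8
DEMAND = 900.0                                     # toy demand target

def cost(sol):
    return sum(0.01 * d ** 1.5 for d in sol)        # toy cost, grows with diameter

def capacity(sol):
    return sum(d ** 1.2 for d in sol) / 10.0        # toy hydraulic-capacity proxy

def objective(sol):
    shortfall = max(0.0, DEMAND - capacity(sol))
    return cost(sol) + 50.0 * shortfall             # penalised cost to be minimised

def harmony_search(iters=5000, hms=20, hmcr=0.9, par=0.3, seed=3):
    rng = random.Random(seed)
    memory = [[rng.choice(DIAMETERS) for _ in range(N_PIPES)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for j in range(N_PIPES):
            if rng.random() < hmcr:                 # draw from harmony memory
                val = rng.choice(memory)[j]
                if rng.random() < par:              # pitch adjustment: neighbouring size
                    i = DIAMETERS.index(val)
                    val = DIAMETERS[max(0, min(len(DIAMETERS) - 1, i + rng.choice((-1, 1))))]
            else:                                   # random re-initialisation
                val = rng.choice(DIAMETERS)
            new.append(val)
        worst = max(range(hms), key=lambda i: objective(memory[i]))
        if objective(new) < objective(memory[worst]):
            memory[worst] = new                     # replace the worst harmony
    return min(memory, key=objective)

best = harmony_search()
print("best diameters:", best, " cost =", round(cost(best), 1),
      " capacity =", round(capacity(best), 1))
```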

  17. Reliability of Power Units in Poland and the World

    Directory of Open Access Journals (Sweden)

    Józef Paska

    2015-09-01

    Full Text Available One of a power system’s subsystems is the generation subsystem consisting of power units, the reliability of which to a large extent determines the reliability of the power system and electricity supply to consumers. This paper presents definitions of the basic indices of power unit reliability used in Poland and in the world. They are compared and analysed on the basis of data published by the Energy Market Agency (Poland, NERC (North American Electric Reliability Corporation – USA, and WEC (World Energy Council. Deficiencies and the lack of a unified national system for collecting and processing electric power equipment unavailability data are also indicated.

  18. Trends in Control Area of PLC Reliability and Safety Parameters

    Directory of Open Access Journals (Sweden)

    Juraj Zdansky

    2008-01-01

    Full Text Available Extending the range of possible PLC applications is closely related to increasing their reliability and safety parameters. If the reliability and safety parameters are suitable, PLCs can be implemented in specific applications such as the control of safety-related processes. The goal of this article is to show how producers approach increasing the reliability and safety parameters of PLCs. The second goal is to analyze these parameters for the range of PLCs currently available and to describe how the reliability and safety parameters can be affected.

  19. Dynamic networks: Presentation held at the Workshop "Beyond Workflow Management: Supporting Dynamic Organizational Processes", CSCW 2000. 2. Dezember 2000, Philadelphia

    OpenAIRE

    Fuchs-Kittowski, F.

    2000-01-01

    Complex, dynamic organizational processes, especially problem-solving processes, require that the design and control of the cooperative work process is left to the cooperating persons. In reality of social organizations, a permanent change between extraneous- and self-organization is taking place. By integrating communication tools with application sharing, synchronous CSCW systems can support dynamic workflows without restraining the self-organizing social processes of the people involved.

  20. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability, covering the definition of reliability, reliability requirements, the system life cycle and reliability, and reliability and failure rate (overview, reliability characteristics, chance failures, failure rates that change over time, failure modes, and replacement). It also covers reliability in engineering design, reliability testing under failure-rate assumptions, the plotting of reliability data, prediction of system reliability, conservation of systems, and failure (overview, failure relay, and analysis of system safety).

  1. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
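
    The idea of treating limit states from several disciplines as one series system can be mocked up with crude Monte Carlo, as below. The three limit-state functions (structural margin, wall temperature, pressure drop) and their distributions are invented in the spirit of the heat-exchanger example and are not taken from NESSUS.

```python
import numpy as np

# Crude Monte Carlo of a series system with limit states from different
# disciplines; all distributions and limits are illustrative assumptions.
rng = np.random.default_rng(0)
N = 200_000

stress   = rng.normal(300.0, 30.0, N)     # MPa, structural demand
strength = rng.normal(420.0, 25.0, N)     # MPa, structural capacity
t_wall   = rng.normal(520.0, 20.0, N)     # K, computed tube-wall temperature
t_limit  = 585.0                          # K, allowable temperature
dp       = rng.lognormal(mean=np.log(40.0), sigma=0.15, size=N)   # kPa pressure drop
dp_limit = 65.0                           # kPa, allowable pressure drop

g_struct  = strength - stress             # g > 0 means the mode is safe
g_thermal = t_limit - t_wall
g_fluid   = dp_limit - dp

system_fail = (g_struct < 0) | (g_thermal < 0) | (g_fluid < 0)   # series system
print("component P_f:",
      (g_struct < 0).mean(), (g_thermal < 0).mean(), (g_fluid < 0).mean())
print("series-system P_f:", system_fail.mean())
```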

  2. Fluorine incorporation in solution-processed poly-siloxane passivation for highly reliable a-InGaZnO thin-film transistors

    Science.gov (United States)

    Yoshida, Naofumi; Bermundo, Juan Paolo; Ishikawa, Yasuaki; Nonaka, Toshiaki; Taniguchi, Katsuto; Uraoka, Yukiharu

    2018-03-01

    We investigated a fluorine-containing polysiloxane (Poly-SX) passivation layer fabricated by solution process for amorphous InGaZnO (a-IGZO) thin-film transistors (TFTs). This passivation layer greatly improved the stability of the a-IGZO device even after being subjected to positive bias stress (PBS) and negative bias stress (NBS). The mobility (µ) of TFTs passivated by fluorine-containing Poly-SX increased by 31%-56% (10.50-12.54 cm² V⁻¹ s⁻¹) compared with TFTs passivated by non-fluorinated Poly-SX (8.04 cm² V⁻¹ s⁻¹). Increasing the amount of fluorine additives led to a higher µ in passivated TFTs. Aside from enhancing the performance, these passivation layers could increase the reliability of a-IGZO TFTs under PBS and NBS with a minimal threshold voltage shift (ΔVth) of up to +0.2 V and -0.1 V, respectively. Additionally, all TFTs passivated by the fluorinated passivation materials did not exhibit a hump effect after NBS. We also showed that fluorinated photosensitive Poly-SX, which can be fabricated without any dry etching process, had an effective passivation property. In this report, we demonstrated the photolithography of Poly-SX and the electrical properties of Poly-SX passivated TFTs, and analyzed the state of the a-IGZO layer to show the large potential of Poly-SX as an effective solution-processed passivation material.

  3. Carbon Dust Filtration in Three Different Nuclear Process Environments: A comparison the challenges Carbon Dust Filtration Presents Under Different Process Conditions

    International Nuclear Information System (INIS)

    Chadwick, Chris

    2014-01-01

    In its thirty-five years of activity as an engineering company in the nuclear filtration sector, the Porvair Filtration Group has faced several demands to remove carbon/graphite dust from nuclear gas streams. Of particular interest among those applications, and those reported upon in this paper, are: • High-temperature, high-pressure, high-DP-resistant (high-strength) filters operating in the CO2 environment of the UK fleet of AGRs (Advanced Gas-Cooled Reactors) • Removing gross quantities of carbon dust from the exhaust stream of a radioactive, nuclear organics decomposition waste process • High-pressure helium filtration to remove carbon dust from a gas flow associated with the Fuel Handling System in the High Temperature Reactor programme. Each process is different from the others and presents its own unique problems. The paper will present to this conference the very different properties carbon dust appears to exhibit in each of these very different applications, and will discuss the effects those significant differences had, and have, on Porvair's responses to each application. An interesting comparison will be made between the performance of the UK AGR filters and those used in the US for the removal of decomposed organics, and the significantly different properties the carbon appears to exhibit in each unique set of conditions. The two UK AGR stations described are taken out of service when their bypass blowdown filters reach an operating DP of about 700 mB (starting at a clean DP of around 100 mB) to enable their replacement. The used filter assemblies are lifted from their housings and placed in an active storage area. Analysis of the used filter assemblies has shown that, where they are observable, they appear to be pristine with no apparent surface discolouration. It is only when examined under magnification that it becomes obvious that the filter medium, under the outer layer of fibres, is coated in

  4. Review of nuclear and non-nuclear applications of membrane processes - present problems and future R and D work

    International Nuclear Information System (INIS)

    Gutman, R.G.; Knibbs, R.H.

    1989-01-01

    This paper describes membrane processes that are of industrial significance in the fluid phase separations. The review covers pressure driven, cross-flow processes (reverse osmosis, ultrafiltration and microfiltration) and electrically driven membrane processes (electro-dialysis and electro-osmosis). A brief description of the mechanism of each of the different types of membrane process is given. The most common types of module design, spiral wound, hollow fibre and tubular are illustrated and compared and the operating limitations of temperature, pressure and pH are discussed. A review of membrane processes already finding large scale industrial applications is given and the paper concludes with a brief discussion of possible avenues of future R and D that might help to alleviate the problems of concentration polarisation and fouling of membranes. (author)

  5. Dynamic reliability of digital-based transmitters

    Energy Technology Data Exchange (ETDEWEB)

    Brissaud, Florent, E-mail: florent.brissaud.2007@utt.f [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France) and Universite de Technologie de Troyes - UTT, Institut Charles Delaunay - ICD and UMR CNRS 6279 STMR, 12 rue Marie Curie, BP 2060, 10010 Troyes Cedex (France); Smidts, Carol [Ohio State University (OSU), Nuclear Engineering Program, Department of Mechanical Engineering, Scott Laboratory, 201 W 19th Ave, Columbus OH 43210 (United States); Barros, Anne; Berenguer, Christophe [Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and UMR CNRS 6279 STMR, 12 rue Marie Curie, BP 2060, 10010 Troyes Cedex (France)

    2011-07-15

    Dynamic reliability explicitly handles the interactions between the stochastic behaviour of system components and the deterministic behaviour of process variables. While dynamic reliability provides a more efficient and realistic way to perform probabilistic risk assessment than 'static' approaches, its industrial level applications are still limited. Factors contributing to this situation are the inherent complexity of the theory and the lack of a generic platform. More recently the increased use of digital-based systems has also introduced additional modelling challenges related to specific interactions between system components. Typical examples are the 'intelligent transmitters' which are able to exchange information, and to perform internal data processing and advanced functionalities. To make a contribution to solving these challenges, the mathematical framework of dynamic reliability is extended to handle the data and information which are processed and exchanged between systems components. Stochastic deviations that may affect system properties are also introduced to enhance the modelling of failures. A formalized Petri net approach is then presented to perform the corresponding reliability analyses using numerical methods. Following this formalism, a versatile model for the dynamic reliability modelling of digital-based transmitters is proposed. Finally the framework's flexibility and effectiveness is demonstrated on a substantial case study involving a simplified model of a nuclear fast reactor.

  6. The consultation and relational empathy (CARE) measure: development and preliminary validation and reliability of an empathy-based consultation process measure.

    Science.gov (United States)

    Mercer, Stewart W; Maxwell, Margaret; Heaney, David; Watt, Graham Cm

    2004-12-01

    Empathy is a key aspect of the clinical encounter but there is a lack of patient-assessed measures suitable for general clinical settings. Our aim was to develop a consultation process measure based on a broad definition of empathy, which is meaningful to patients irrespective of their socio-economic background. Qualitative and quantitative approaches were used to develop and validate the new measure, which we have called the consultation and relational empathy (CARE) measure. Concurrent validity was assessed by correlational analysis against other validated measures in a series of three pilot studies in general practice (in areas of high or low socio-economic deprivation). Face and content validity was investigated by 43 interviews with patients from both types of areas, and by feedback from GPs and expert researchers in the field. The initial version of the new measure (pilot 1; high deprivation practice) correlated strongly (r = 0.85) with the Reynolds empathy measure (RES) and the Barrett-Lennard empathy subscale (BLESS) (r = 0.63), but had a highly skewed distribution (skew -1.879, kurtosis 3.563). Statistical analysis, and feedback from the 20 patients interviewed, the GPs and the expert researchers, led to a number of modifications. The revised, second version of the CARE measure, tested in an area of low deprivation (pilot 2) also correlated strongly with the established empathy measures (r = 0.84 versus RES and r = 0.77 versus BLESS) but had a less skewed distribution (skew -0.634, kurtosis -0.067). Internal reliability of the revised version was high (Cronbach's alpha 0.92). Patient feedback at interview (n = 13) led to only minor modification. The final version of the CARE measure, tested in pilot 3 (high deprivation practice) confirmed the validation with the other empathy measures (r = 0.85 versus RES and r = 0.84 versus BLESS) and the face validity (feedback from 10 patients). These preliminary results support the validity and reliability of the CARE

  7. Reliability engineering theory and practice

    CERN Document Server

    Birolini, Alessandro

    2010-01-01

    Presenting a solid overview of reliability engineering, this volume enables readers to build and evaluate the reliability of various components, equipment and systems. Current applications are presented, and the text itself is based on the author's 30 years of experience in the field.

  8. Wind turbine reliability : a database and analysis approach.

    Energy Technology Data Exchange (ETDEWEB)

    Linsday, James (ARES Corporation); Briand, Daniel; Hill, Roger Ray; Stinebaugh, Jennifer A.; Benjamin, Allan S. (ARES Corporation)

    2008-02-01

    The US wind industry has experienced remarkable growth since the turn of the century. At the same time, the physical size and electrical generation capabilities of wind turbines have also grown remarkably. As the market continues to expand, and as wind generation continues to gain a significant share of the generation portfolio, the reliability of wind turbine technology becomes increasingly important. This report addresses how operations and maintenance costs are related to unreliability - that is, to the failures experienced by systems and components. Reliability tools are demonstrated, the data needed to understand and catalog failure events are described, and practical wind turbine reliability models are illustrated, including preliminary results. This report also presents a continuing process for controlling industry requirements, needs, and expectations related to Reliability, Availability, Maintainability, and Safety. A simply stated goal of this process is to better understand and to improve the operable reliability of wind turbine installations.

  9. Materials processing strategies for colloidal quantum dot solar cells: advances, present-day limitations, and pathways to improvement

    KAUST Repository

    Carey, Graham H.

    2013-05-13

    Colloidal quantum dot photovoltaic devices have improved from initial, sub-1% solar power conversion efficiency to current record performance of over 7%. Rapid advances in materials processing and device physics have driven this impressive performance progress. The highest-efficiency approaches rely on a fabrication process that starts with nanocrystals in solution, initially capped with long organic molecules. This solution is deposited and the resultant film is treated using a solution containing a second, shorter capping ligand, leading to a cross-linked, non-redispersible, and dense layer. This procedure is repeated, leading to the widely employed layer-by-layer solid-state ligand exchange. We will review the properties and features of this process, and will also discuss innovative pathways to creating even higher-performing films and photovoltaic devices.

  10. Materials processing strategies for colloidal quantum dot solar cells: advances, present-day limitations, and pathways to improvement

    KAUST Repository

    Carey, Graham H.; Chou, Kang Wei; Yan, Buyi; Kirmani, Ahmad R.; Amassian, Aram; Sargent, Edward H.

    2013-01-01

    Colloidal quantum dot photovoltaic devices have improved from initial, sub-1% solar power conversion efficiency to current record performance of over 7%. Rapid advances in materials processing and device physics have driven this impressive performance progress. The highest-efficiency approaches rely on a fabrication process that starts with nanocrystals in solution, initially capped with long organic molecules. This solution is deposited and the resultant film is treated using a solution containing a second, shorter capping ligand, leading to a cross-linked, non-redispersible, and dense layer. This procedure is repeated, leading to the widely employed layer-by-layer solid-state ligand exchange. We will review the properties and features of this process, and will also discuss innovative pathways to creating even higher-performing films and photovoltaic devices.

  11. Reliability Assessment Of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2014-01-01

    Reduction of the cost of energy for wind turbines is very important in order to make wind energy competitive compared to other energy sources. Therefore the turbine components should be designed to have sufficient reliability but also not be too costly (and safe). This paper presents models for uncertainty modeling and reliability assessment of especially the structural components such as tower, blades, substructure and foundation. But since the function of a wind turbine is highly dependent on many electrical and mechanical components as well as a control system, reliability aspects of these components are also discussed and it is described how their reliability influences the reliability of the structural components. Two illustrative examples are presented considering uncertainty modeling, reliability assessment and calibration of partial safety factors for structural wind turbine components exposed...

  12. Validation of the design of small diameter pulsed columns for the process line DRA. Tests reliability compared with the industrial scale

    International Nuclear Information System (INIS)

    Leybros, J.

    2000-01-01

    As part of the Spin program related to the management of nuclear wastes, studies have been undertaken to develop partitioning processes like the Diamex process. The process line CCBP/DRA in the Atalante facility is one of the main pieces of equipment devoted to these studies. On this line, industrial apparatus are used, but some, like pulsed columns, need to be adapted because of the specificities of the installation: limited amounts of nuclear matter, gaseous waste minimization, safety, limited amounts of new extractants, etc. This article presents a comparison of two air-pulsed columns, one with a standard diameter of 25 (DN25), the other with a reduced diameter of 15 (DN15). This comparison is based on three main criteria: pulsation capability, superficial throughput and mass transfer efficiency. The overall comparison shows that a DN15 pulsed column can be considered a representative research and development tool. In particular, the study demonstrates the possibility of scaling up the results

  13. Effects of Different Multimedia Presentations on Viewers' Information-Processing Activities Measured by Eye-Tracking Technology

    Science.gov (United States)

    Chuang, Hsueh-Hua; Liu, Han-Chin

    2012-01-01

    This study implemented eye-tracking technology to understand the impact of different multimedia instructional materials, i.e., five successive pages versus a single page with the same amount of information, on information-processing activities in 21 non-science-major college students. The findings showed that students demonstrated the same number…

  14. Formation of Various Competencies in the Process of Training the Future Music Teachers at the Present Stage

    Science.gov (United States)

    Kovalev, Dmitry A.; Khussainova, Gulzada A.; Balagazova, Svetlana T.; Zhankul, Tamarasar

    2016-01-01

    The article is devoted to professional training of future music teachers. Based on the analysis of domestic and foreign studies, the authors proved the importance of studying this problem and focusing on different pedagogical aspects. The study of this topic in general shows that the process of training the future music teachers has its own…

  15. Reliability analysis of neutron transport simulation using Monte Carlo method

    International Nuclear Information System (INIS)

    Souza, Bismarck A. de; Borges, Jose C.

    1995-01-01

    This work presents a statistical and reliability analysis covering data obtained by computer simulation of the neutron transport process, using the Monte Carlo method. A general description of the method and its applications is presented. Several simulations, corresponding to slowing-down and shielding problems, have been accomplished. The influence of the physical dimensions of the materials and of the sample size on the reliability level of the results was investigated. The objective was to optimize the sample size, in order to obtain reliable results while minimizing computation time. (author). 5 refs, 8 figs
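
    As a point of reference for the sample-size discussion above, the statistical uncertainty of a Monte Carlo tally scales with the inverse square root of the number of histories; the relationship below is the standard textbook result, not an expression taken from the cited work.

    $$\hat{\sigma}_{\bar{x}} = \frac{s}{\sqrt{N}}, \qquad R = \frac{\hat{\sigma}_{\bar{x}}}{\bar{x}} \propto \frac{1}{\sqrt{N}},$$

    where $\bar{x}$ is the tally mean over $N$ histories, $s$ the sample standard deviation, and $R$ the relative error. Halving $R$ therefore requires roughly four times as many histories, and four times the computation time, which is why the sample size has to be optimized rather than simply maximized.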

  16. Reliability of Wireless Sensor Networks

    Science.gov (United States)

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (thereby increasing the network lifetime) and to increase the reliability of the network (thereby improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (multipath strategy), which is important for reliability but significantly increases the WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs considering the battery level as a key factor. Moreover, this model is based on routing algorithms used by WSNs. In order to evaluate the proposed models, three scenarios were considered to show the impact of the power consumption on the reliability of WSNs. PMID:25157553
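
    To make the reliability-power trade-off concrete, the following minimal Python sketch compares single-path and multipath packet delivery. It is an illustration only, not the authors' battery-aware model; the per-path delivery probability and per-path energy cost are assumed values.

```python
# Illustrative sketch of the multipath reliability vs. power trade-off in a WSN.
# The numbers below are assumptions for illustration, not values from the cited paper.

def delivery_reliability(p_path: float, n_paths: int) -> float:
    """Probability that at least one of n independent path copies is delivered."""
    return 1.0 - (1.0 - p_path) ** n_paths

def energy_cost(e_per_path: float, n_paths: int) -> float:
    """Total transmission energy when the same packet is sent over n paths."""
    return e_per_path * n_paths

p_path = 0.90      # assumed per-path delivery probability
e_per_path = 1.0   # assumed energy units consumed per path transmission

for n in (1, 2, 3):
    print(f"{n} path(s): reliability = {delivery_reliability(p_path, n):.4f}, "
          f"energy = {energy_cost(e_per_path, n):.1f} units")

# Reliability rises from 0.9000 to 0.9990 while energy triples,
# which is the conflict the paper's battery-aware model addresses.
```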

  17. The cost of reliability

    International Nuclear Information System (INIS)

    Ilic, M.

    1998-01-01

    In this article the restructuring process under way in the US power industry is revisited from the point of view of transmission system provision and reliability. While in the past the cost of reliability was rolled into the average cost of electricity to all customers, it is not so obvious how this cost is managed in the new industry. A new MIT approach to transmission pricing is suggested here as a possible solution

  18. Dangerous poverty. Analysis of the criminalization process of poverty and youth in Uruguay and of the challenges that this process presents to the community psychology

    Directory of Open Access Journals (Sweden)

    Agustín Cano Menoni

    2014-02-01

    Full Text Available In this paper I analyze the components and main effects of what I characterize as a process of criminalization of youth in poverty in the case of Uruguay. I argue that this process occurs through a series of discursive operations (at different levels: police, judicial, political and technical-scientific), which stigmatize the social reference group, placing them as a threat to society. To investigate this process, I analyze journalistic texts, testimonials and an advertising campaign, covering the following actors: a member of the national Parliament, the editorialist of the highest-circulation newspaper in Uruguay, the director of the largest psychiatric hospital in the country, and the Uruguayan Interior Ministry (police force). I conclude that a stigmatization process has started up in Uruguay which places youth in poverty as a threat to society, and that this process involves the deepening of police approaches to security problems, obscuring the conditions of social injustice behind them and consecrating fear as the main principle of social relationships. This situation also challenges the social sciences, and in particular the psychological disciplines, by posing the challenge of finding new answers, both theoretical and methodological, as alternatives to stigmatization and police-based security approaches.

  19. Effects of elevated CO2 and trace ethylene present throughout the storage season on the processing colour of stored potatoes

    NARCIS (Netherlands)

    Daniels-Lake, B.J.

    2012-01-01

    Previous short-term trials (9-week duration) have shown that the fry colour of stored potatoes (Solanum tuberosum L.) can be negatively affected by simultaneous exposure to elevated CO2 plus a trace concentration of ethylene gas. In the present study, trials were conducted during each of two storage

  20. Types of organic materials present in CEGB waste streams and possible encapsulation processes for organic ion-exchange materials

    International Nuclear Information System (INIS)

    Haighton, A.P.

    1988-01-01

    The organic composition of low and intermediate-level radioactive wastes is discussed. Work underway in the development of immobilising binders for organic ion exchange resins found in radioactive wastes and in the encapsulation of these ion exchangers is presented. (U.K.)

  1. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  2. Linking Parental Socialization to Interpersonal Protective Processes, Academic Self-Presentation, and Expectations among Rural African American Youth

    Science.gov (United States)

    Murry, Velma McBride; Berkel, Cady; Brody, Gene H.; Miller, Shannon J.; Chen, Yi-fu

    2008-01-01

    Data obtained from two waves of a longitudinal study of 671 rural African American families, with an 11-year-old preadolescent, were examined to test pathways through which racial and ethnic socialization influence youth's self-presentation and academic expectation and anticipation through the enhancement of youth self-pride. Structural equation modeling analyses indicated that racial and ethnic socialization was linked with youth's expectation and anticipation for academic success, through youth self-pride, including racial identity and self-esteem, and academic self-presentation. The results highlight the need to disaggregate racial and ethnic socialization in order to better understand how these parenting domains uniquely forecast youth self-pride, as well as their orientation to education and academic success. PMID:19209975

  3. Presentation of a method for consequence modeling and quantitative risk assessment of fire and explosion in process industry (Case study: Hydrogen Production Process

    Directory of Open Access Journals (Sweden)

    M J Jafari

    2013-05-01

    Conclusion: Since the proposed method is applicable in all phases of process or system design and estimates the risk of fire and explosion through a quantitative, comprehensive and mathematically based approach, it can be used as an alternative to qualitative and semi-quantitative methods.

  4. RTE - 2015 Reliability Report. Summary

    International Nuclear Information System (INIS)

    2016-01-01

    Every year, RTE produces a reliability report for the past year. This report includes a number of results from previous years so that year-to-year comparisons can be drawn and long-term trends analysed. The 2015 report underlines the major factors that have impacted on the reliability of the electrical power system, without focusing exclusively on Significant System Events (ESS). It describes various factors which contribute to present and future reliability and the numerous actions implemented by RTE to ensure reliability today and in the future, as well as the ways in which the various parties involved in the electrical power system interact across the whole European interconnected network

  5. Byers Auto Group: A Case Study Into The Economics, Zoning, and Overall Process of Installing Small Wind Turbines at Two Automotive Dealerships in Ohio (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Sinclair, K.; Oteri, F.

    2011-05-01

    This presentation provides the talking points about a case study on the installation of a $600,000 small wind project, the installation process, estimated annual energy production and percentage of energy needs met by the turbines.

  6. Webinar Presentation: Exposures to Polycyclic Aromatic Hydrocarbons and Childhood Growth Trajectories and Body Composition: Linkages to Disrupted Self-Regulatory Processes

    Science.gov (United States)

    This presentation, Exposures to Polycyclic Aromatic Hydrocarbons and Childhood Growth Trajectories and Body Composition: Linkages to Disrupted Self-Regulatory Processes, was given at the NIEHS/EPA Children's Centers 2016 Webinar Series: Childhood Obesity

  7. Process of converting actinide ions present in the solid residues of a sulphating process for radioactive solid waste containing actinides into a useful state

    International Nuclear Information System (INIS)

    Wieczorek, H.; Oser, B.

    1985-01-01

    Stages of the process: a) The residue is dissolved in water or 1 to 2 molar nitric acid, whereby the greater part dissolves. b) The solution formed is separated from the insoluble part of the residue and is heated to a temperature below its boiling point. c) An aqueous barium nitrate solution is added to the hot solution in a quantity which slightly exceeds that required for the stoichiometrically complete precipitation of the sulphate ions. The solution is kept at the selected temperature for a period of 0.5 to 2 hours. d) After subsequent cooling to room temperature, the precipitated barium sulphate is separated, and e) the actinide-nitrate solution is fed into an extractive reprocessing process. (orig./PW)

  8. Rats Born to Mothers Treated with Dexamethasone 15 cH Present Changes in Modulation of Inflammatory Process

    Directory of Open Access Journals (Sweden)

    Leoni V. Bonamin

    2012-01-01

    Full Text Available As little information about the effect of ultra-high dilutions of glucocorticoid on reproduction is available in the literature, pregnant female Wistar rats (N=12) were treated blindly and subcutaneously during the entire gestational and lactation period with: dexamethasone 4 mg/kg diluted into dexamethasone 15 cH (mixed); or dexamethasone 4 mg/kg diluted in water; or dexamethasone 15 cH; or vehicle. The parental generation had body weight and food and water consumption monitored. The F1 generation was monitored with regard to newborn development. No births occurred in either group treated with dexamethasone 4 mg/kg. Sixty days after birth, 12 male F1 rats were randomly selected from each remaining group and inoculated subcutaneously with 1% carrageenan into the footpad for evaluation of inflammatory performance. Edema and histopathology of the footpad were evaluated using specific staining methods, immunohistochemistry and digital histomorphometry. Mothers treated with mixed dexamethasone presented reduced water consumption. F1 rats born to females treated with dexamethasone 15 cH presented a significant increase in mast cell degranulation, a decrease in monocyte percentage, an increase in CD18+ PMN cells, and early expression of ED2 protein relative to controls. The results show that exposure of the parental generation to highly diluted dexamethasone interferes with inflammation modulation in the F1 generation.

  9. Nonparametric predictive inference in reliability

    International Nuclear Information System (INIS)

    Coolen, F.P.A.; Coolen-Schrijner, P.; Yan, K.J.

    2002-01-01

    We introduce a recently developed statistical approach, called nonparametric predictive inference (NPI), to reliability. Bounds for the survival function for a future observation are presented. We illustrate how NPI can deal with right-censored data, and discuss aspects of competing risks. We present possible applications of NPI for Bernoulli data, and we briefly outline applications of NPI for replacement decisions. The emphasis is on introduction and illustration of NPI in reliability contexts; detailed mathematical justifications are presented elsewhere
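
    As an illustration of the kind of bounds NPI yields, the expressions below are the standard NPI survival bounds based on Hill's A(n) assumption for fully observed data; they are assumed here for orientation and are not necessarily the exact censored-data expressions developed in the paper. Given $n$ ordered failure times $t_{(1)} < \dots < t_{(n)}$, the survival function of the next observation is bounded by

    $$\underline{S}(t) = \frac{n-j}{n+1}, \qquad \overline{S}(t) = \frac{n+1-j}{n+1}, \qquad t \in \left(t_{(j)},\, t_{(j+1)}\right),$$

    with $t_{(0)} = 0$ and $t_{(n+1)} = \infty$. Right-censored observations and competing risks modify these bounds along the lines discussed in the paper.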

  10. Optics based signal processing methods for intraoperative blood vessel detection and quantification in real time (Conference Presentation)

    Science.gov (United States)

    Chaturvedi, Amal; Shukair, Shetha A.; Le Rolland, Paul; Vijayvergia, Mayank; Subramanian, Hariharan; Gunn, Jonathan W.

    2016-03-01

    Minimally invasive operations require surgeons to make difficult cuts to blood vessels and other tissues with impaired tactile and visual feedback. This leads to inadvertent cuts to blood vessels hidden beneath tissue, causing serious health risks to patients and a non-reimbursable financial burden to hospitals. Intraoperative imaging technologies have been developed, but these expensive systems can be cumbersome and provide only a high-level view of blood vessel networks. In this research, we propose a lean reflectance-based system, comprised of a dual wavelength LED, photodiode, and novel signal processing algorithms for rapid vessel characterization. Since this system takes advantage of the inherent pulsatile light absorption characteristics of blood vessels, no contrast agent is required for its ability to detect the presence of a blood vessel buried deep inside any tissue type (up to a cm) in real time. Once a vessel is detected, the system is able to estimate the distance of the vessel from the probe and the diameter size of the vessel (with a resolution of ~2mm), as well as delineate the type of tissue surrounding the vessel. The system is low-cost, functions in real-time, and could be mounted on already existing surgical tools, such as Kittner dissectors or laparoscopic suction irrigation cannulae. Having been successfully validated ex vivo, this technology will next be tested in a live porcine study and eventually in clinical trials.

  11. Nonlinear optical and multiphoton processes for in situ manipulation and conversion of photons: applications to energy and healthcare (Conference Presentation)

    Science.gov (United States)

    Prasad, Paras N.

    2017-02-01

    Chiral control of nonlinear optical functions holds great promise for a wide range of applications including optical signal processing, bio-sensing and chiral bio-imaging. In chiral polyfluorene thin films, we demonstrated extremely large chiral nonlinearity. The physics of manipulating excitation dynamics for photon transformation will be discussed, along with nanochemistry control of upconversion in hierarchically built organic chromophore coupled core-multiple shell nanostructures, which enable the introduction of new organic-inorganic energy transfer routes for broadband light harvesting and increased upconversion efficiency via multistep cascaded energy transfer. We are pursuing the applications of photon conversion technology in IR harvesting for photovoltaics, high-contrast bioimaging, photoacoustic imaging, photodynamic therapy, and optogenetics. An important application is in brain research and neurophotonics for functional mapping and modulation of brain activities. Another new direction pursued is magnetic field control of light in a chiral polymer nanocomposite to achieve a large magneto-optic coefficient, which can enable sensing of the extremely weak magnetic fields due to brain waves. Finally, we will consider the thought-provoking concept of utilizing photons to quantify, through magneto-optics, and augment, through nanoptogenetics, cognitive states, thus paving the pathway to a quantified-human paradigm.

  12. Localized surface plasmons modulated nonlinear optical processes in metal film-coupled and upconversion nanocrystals-coated nanoparticles (Conference Presentation)

    Science.gov (United States)

    Lei, Dangyuan

    2016-09-01

    In the first part of this talk, I will show our experimental investigation of the linear and nonlinear optical properties of metal film-coupled nanosphere monomers and dimers, both with nanometric gaps. We have developed a new methodology - polarization-resolved spectral decomposition and color decoding - to visualize unambiguously the spectral and radiation properties of the complex plasmonic gap modes in these hybrid nanostructures. Single-particle spectroscopic measurements indicate that these hybrid nanostructures can simultaneously enhance several nonlinear optical processes, such as second harmonic generation, two-photon absorption induced luminescence, and hyper-Raman scattering. In the second part, I will show how the polarization state of the emissions from sub-10 nm upconversion nanocrystals (UCNCs) can be modulated when they form a hybrid complex with a gold nanorod (GNR). Our single-particle scattering experiments expose how an interplay between excitation polarization and GNR orientation gives rise to an extraordinarily polarized nature of the upconversion emissions from an individual hybrid nanostructure. We support our results by numerical simulations and, using Förster resonance energy transfer theory, we uncover how an overlap between the UCNC emission and GNR extinction bands as well as the mutual orientation between emission and plasmonic dipoles jointly determine the polarization state of the UC emissions.

  13. Reliability estimation of semi-Markov systems: a case study

    International Nuclear Information System (INIS)

    Ouhbi, Brahim; Limnios, Nikolaos

    1997-01-01

    In this article, we are concerned with the estimation of the reliability and the availability of a turbo-generator rotor using a set of data observed in a real engineering situation, provided by Electricite De France (EDF). The rotor is modeled by a semi-Markov process, which is used to estimate the rotor's reliability and availability. To do this, we present a method for estimating the semi-Markov kernel from censored data
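
    For readers unfamiliar with the object being estimated, a standard empirical estimator of the semi-Markov kernel for fully observed data has the form below; this is textbook material and not necessarily the censored-data estimator developed in the paper.

    $$\hat{Q}_{ij}(t) = \frac{1}{N_i(M)} \sum_{k=1}^{M} \mathbf{1}\{J_{k-1} = i,\; J_k = j,\; X_k \le t\},$$

    where $J_k$ is the state after the $k$-th jump, $X_k$ the corresponding sojourn time, $N_i(M)$ the number of visits to state $i$ among the first $M$ transitions, and $\mathbf{1}\{\cdot\}$ the indicator function. Reliability and availability are then obtained from the Markov renewal equations driven by $\hat{Q}$.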

  14. The development of all-polymer-based piezoelectrically active photocurable resin for 3D printing process (Conference Presentation)

    Science.gov (United States)

    Baker, Evan; Chu, Weishen; Ware, Henry Oliver T.; Farsheed, Adam C.; Sun, Cheng

    2017-02-01

    We present in this work the development and experimental validation of a new piezoelectric material (V-Ink) designed for compatibility with projection stereolithography additive manufacturing techniques. Piezoelectric materials generate a voltage output when a stress is applied to the material, and can also be actuated by using an external voltage and power source. This new material opens up new opportunities for functional devices to be developed and rapidly produced at low cost using emerging 3D printing techniques. The new piezoelectric material was able to generate 115 mV under 1 N of strain after being poled at 80°C for 40 minutes, and the optimal results had a piezoelectric coefficient of 105x10^(-3) V.m/N. The current iteration of the material is a suspension, although further work is ongoing to make the resin a true solution. The nature of the suspension was characterized by time-lapse monitoring and through viscosity testing. The potential exists to further increase the piezoelectric properties of this material by integrating a mechanical-to-electrical enhancer such as carbon nanotubes or barium titanate into the material. Such materials need to be functionalized to be integrated within the material, which is currently being explored. Printing with this material on a "continuous SLA" printer that we have developed will reduce build times by an order of magnitude to allow for mass manufacturing. Pairing those two advancements will enable faster printing and enhanced piezoelectric properties.

  15. Human α-amylase present in lower-genital-tract mucosal fluid processes glycogen to support vaginal colonization by Lactobacillus.

    Science.gov (United States)

    Spear, Gregory T; French, Audrey L; Gilbert, Douglas; Zariffard, M Reza; Mirmonsef, Paria; Sullivan, Thomas H; Spear, William W; Landay, Alan; Micci, Sandra; Lee, Byung-Hoo; Hamaker, Bruce R

    2014-10-01

    Lactobacillus colonization of the lower female genital tract provides protection from the acquisition of sexually transmitted diseases, including human immunodeficiency virus, and from adverse pregnancy outcomes. While glycogen in vaginal epithelium is thought to support Lactobacillus colonization in vivo, many Lactobacillus isolates cannot utilize glycogen in vitro. This study investigated how glycogen could be utilized by vaginal lactobacilli in the genital tract. Several Lactobacillus isolates were confirmed to not grow in glycogen, but did grow in glycogen-breakdown products, including maltose, maltotriose, maltopentaose, maltodextrins, and glycogen treated with salivary α-amylase. A temperature-dependent glycogen-degrading activity was detected in genital fluids that correlated with levels of α-amylase. Treatment of glycogen with genital fluids resulted in production of maltose, maltotriose, and maltotetraose, the major products of α-amylase digestion. These studies show that human α-amylase is present in the female lower genital tract and elucidates how epithelial glycogen can support Lactobacillus colonization in the genital tract. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. Recent Advances in System Reliability Signatures, Multi-state Systems and Statistical Inference

    CERN Document Server

    Frenkel, Ilia

    2012-01-01

    Recent Advances in System Reliability discusses developments in modern reliability theory such as signatures, multi-state systems and statistical inference. It describes the latest achievements in these fields, and covers the application of these achievements to reliability engineering practice. The chapters cover a wide range of new theoretical subjects and have been written by leading experts in reliability theory and its applications.  The topics include: concepts and different definitions of signatures (D-spectra),  their  properties and applications  to  reliability of coherent systems and network-type structures; Lz-transform of Markov stochastic process and its application to multi-state system reliability analysis; methods for cost-reliability and cost-availability analysis of multi-state systems; optimal replacement and protection strategy; and statistical inference. Recent Advances in System Reliability presents many examples to illustrate the theoretical results. Real world multi-state systems...

  17. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability based code calibration. First basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values....... Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore suggested values for acceptable annual failure probabilities are given for ultimate...... and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes....
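
    To illustrate the FORM-to-partial-safety-factor link mentioned in the abstract, the relation below is a standard simplified form used in reliability-based code calibration; it is stated here under simplifying assumptions rather than as the paper's exact procedure. For a resistance variable $R$ with distribution function $F_R$, target reliability index $\beta$, and FORM sensitivity factor $\alpha_R$, the design value and material partial safety factor follow from

    $$r_d = F_R^{-1}\!\left(\Phi(-\alpha_R\,\beta)\right), \qquad \gamma_m = \frac{r_k}{r_d},$$

    where $r_k$ is the characteristic value (e.g., a 5% fractile) and $\Phi$ the standard normal distribution function. An analogous expression with the load sensitivity factor $\alpha_E$ gives the load partial factor.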

  18. Trial application of reliability technology to emergency diesel generators at the Trojan Nuclear Power Plant

    International Nuclear Information System (INIS)

    Wong, S.M.; Boccio, J.L.; Karimian, S.; Azarm, M.A.; Carbonaro, J.; DeMoss, G.

    1986-01-01

    In this paper, a trial application of reliability technology to the emergency diesel generator system at the Trojan Nuclear Power Plant is presented. An approach for formulating a reliability program plan for this system is being developed. The trial application has shown that a reliability program process, using risk- and reliability-based techniques, can be interwoven into current plant operational activities to help in controlling, analyzing, and predicting faults that can challenge safety systems. With the cooperation of the utility, Portland General Electric Co., this reliability program can eventually be implemented at Trojan to track its effectiveness

  19. Reliability analysis and utilization of PEMs in space application

    Science.gov (United States)

    Jiang, Xiujie; Wang, Zhihua; Sun, Huixian; Chen, Xiaomin; Zhao, Tianlin; Yu, Guanghua; Zhou, Changyi

    2009-11-01

    More and more plastic encapsulated microcircuits (PEMs) are used in space missions to achieve high performance. Since PEMs are designed for use in terrestrial operating conditions, the successful usage of PEMs in the harsh space environment is closely related to reliability issues, which should be considered first. However, there is no ready-made methodology for PEMs in space applications. This paper discusses the reliability of the usage of PEMs in space. This reliability analysis can be divided into five categories: radiation test, radiation hardness, screening test, reliability calculation and reliability assessment. One case study is also presented to illuminate the details of the process, in which a PEM part is used in the Double-Star Project, a joint space program between the European Space Agency (ESA) and China. The influence of environmental constraints including radiation, humidity, temperature and mechanics on the PEM part has been considered. Both Double-Star Project satellites are still running well in space now.

  20. Design for ASIC reliability for low-temperature applications

    Science.gov (United States)

    Chen, Yuan; Mojaradi, Mohammad; Westergard, Lynett; Billman, Curtis; Cozy, Scott; Burke, Gary; Kolawa, Elizabeth

    2005-01-01

    In this paper, we present a methodology to design for reliability for low temperature applications without requiring process improvement. The developed hot carrier aging lifetime projection model takes into account both the transistor substrate current profile and temperature profile to determine the minimum transistor size needed in order to meet reliability requirements. The methodology is applicable for automotive, military, and space applications, where there can be varying temperature ranges. A case study utilizing this methodology is given to design for reliability into a custom application-specific integrated circuit (ASIC) for a Mars exploration mission.

  1. Reliability modeling of Clinch River breeder reactor electrical shutdown systems

    International Nuclear Information System (INIS)

    Schatz, R.A.; Duetsch, K.L.

    1974-01-01

    The initial simulation of the probabilistic properties of the Clinch River Breeder Reactor Plant (CRBRP) electrical shutdown systems is described. A model of the reliability (and availability) of the systems is presented utilizing Success State and continuous-time, discrete-state Markov modeling techniques as significant elements of an overall reliability assessment process capable of demonstrating the achievement of program goals. This model is examined for its sensitivity to safe/unsafe failure rates, subsystem redundant configurations, test and repair intervals, monitoring by reactor operators, and the control exercised over system reliability by design modifications and the selection of system operating characteristics. (U.S.)
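
    As background for the continuous-time Markov modelling mentioned above, the simplest example is a single repairable channel with constant failure rate $\lambda$ and repair rate $\mu$; the expressions below are textbook results, not the CRBRP model itself.

    $$A(t) = \frac{\mu}{\lambda+\mu} + \frac{\lambda}{\lambda+\mu}\,e^{-(\lambda+\mu)t}, \qquad A_\infty = \lim_{t\to\infty} A(t) = \frac{\mu}{\lambda+\mu},$$

    so longer effective test and repair intervals (a smaller effective $\mu$) directly reduce the availability that the full multi-state model quantifies for the redundant shutdown configurations.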

  2. Technology’s present situation and the development prospects of energy efficiency monitoring as well as performance testing & analysis for process flow compressors

    Science.gov (United States)

    Li, L.; Zhao, Y.; Wang, L.; Yang, Q.; Liu, G.; Tang, B.; Xiao, J.

    2017-08-01

    In this paper, the background of performance testing of in-service process flow compressors installed in user facilities is introduced, the main technical barriers faced in field testing are summarized, and the factors that result in the real efficiencies of most process flow compressors being lower than those guaranteed by the manufacturer are analysed. The authors investigated the present operational situation of process flow compressors in China and found that the low-efficiency operation of flow compressors arises because the compressed gas is generally forced to flow back into the inlet pipe to adapt to variations in the process parameters. For example, the anti-surge valve is always open for centrifugal compressors. To improve the operating efficiency of process compressors, energy efficiency monitoring technology is reviewed and some suggestions are proposed in the paper, which form the basis of research on energy efficiency evaluation and/or labelling of process compressors.

  3. Some remarks on software reliability

    International Nuclear Information System (INIS)

    Gonzalez Hernando, J.; Sanchez Izquierdo, J.

    1978-01-01

    The trend in modern NPPCI is toward broad use of programmable elements. Some aspects concerning the present status of programmable digital systems reliability are reported. Basic differences between software and hardware concepts require a specific approach to all reliability topics concerning software systems. Software reliability theory was initially developed upon analogies with hardware models. At present this approach is changing and specific models are being developed. The growing use of programmable systems necessitates emphasizing the importance of more adequate regulatory requirements to include this technology in NPPCI. (author)

  4. Reliability and construction control

    Directory of Open Access Journals (Sweden)

    Sherif S. AbdelSalam

    2016-06-01

    Full Text Available The goal of this study was to determine the most reliable and efficient combination of design and construction methods required for vibro piles. For a wide range of static and dynamic formulas, the reliability-based resistance factors were calculated using EGYPT database, which houses load test results for 318 piles. The analysis was extended to introduce a construction control factor that determines the variation between the pile nominal capacities calculated using static versus dynamic formulae. From the major outcomes, the lowest coefficient of variation is associated with Davisson’s criterion, and the resistance factors calculated for the AASHTO method are relatively high compared with other methods. Additionally, the CPT-Nottingham and Schmertmann method provided the most economic design. Recommendations related to a pile construction control factor were also presented, and it was found that utilizing the factor can significantly reduce variations between calculated and actual capacities.

  5. Improving Power Converter Reliability

    DEFF Research Database (Denmark)

    Ghimire, Pramod; de Vega, Angel Ruiz; Beczkowski, Szymon

    2014-01-01

    The real-time junction temperature monitoring of a high-power insulated-gate bipolar transistor (IGBT) module is important to increase the overall reliability of power converters for industrial applications. This article proposes a new method to measure the on-state collector-emitter voltage of a high-power IGBT module during converter operation, which may play a vital role in improving the reliability of the power converters. The measured voltage is used to estimate the module average junction temperature of the high- and low-voltage sides of a half-bridge IGBT separately in every fundamental...... is measured in a wind power converter at a low fundamental frequency. To illustrate more, the test method as well as the performance of the measurement circuit are also presented. This measurement is also useful to indicate failure mechanisms such as bond wire lift-off and solder layer degradation...

  6. Accelerator Availability and Reliability Issues

    Energy Technology Data Exchange (ETDEWEB)

    Steve Suhring

    2003-05-01

    Maintaining reliable machine operations for existing machines as well as planning for future machines' operability present significant challenges to those responsible for system performance and improvement. Changes to machine requirements and beam specifications often reduce overall machine availability in an effort to meet user needs. Accelerator reliability issues from around the world will be presented, followed by a discussion of the major factors influencing machine availability.

  7. Power peaking nuclear reliability factors

    International Nuclear Information System (INIS)

    Hassan, H.A.; Pegram, J.W.; Mays, C.W.; Romano, J.J.; Woods, J.J.; Warren, H.D.

    1977-11-01

    The Calculational Nuclear Reliability Factor (CNRF) assigned to the limiting power density calculated in reactor design has been determined. The CNRF is presented as a function of the relative power density of the fuel assembly and its radial location. In addition, the Measurement Nuclear Reliability Factor (MNRF) for the measured peak hot pellet power in the core has been evaluated. This MNRF is also presented as a function of the relative power density and radial location within the fuel assembly

  8. Reliability issues at the LHC

    CERN Multimedia

    CERN. Geneva. Audiovisual Unit; Gillies, James D

    2002-01-01

    The Lectures on reliability issues at the LHC will be focused on five main Modules on five days. Module 1: Basic Elements in Reliability Engineering Some basic terms, definitions and methods, from components up to the system and the plant, common cause failures and human factor issues. Module 2: Interrelations of Reliability & Safety (R&S) Reliability and risk informed approach, living models, risk monitoring. Module 3: The ideal R&S Process for Large Scale Systems From R&S goals via the implementation into the system to the proof of the compliance. Module 4: Some Applications of R&S on LHC Master logic, anatomy of risk, cause - consequence diagram, decomposition and aggregation of the system. Module 5: Lessons learned from R&S Application in various Technologies Success stories, pitfalls, constrains in data and methods, limitations per se, experienced in aviation, space, process, nuclear, offshore and transport systems and plants. The Lectures will reflect in summary the compromise in...

  9. Reliability analysis and operator modelling

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    1996-01-01

    The paper considers the state of operator modelling in reliability analysis. Operator models are needed in reliability analysis because operators are needed in process control systems. HRA methods must therefore be able to account both for human performance variability and for the dynamics of the interaction. A selected set of first generation HRA approaches is briefly described in terms of the operator model they use, their classification principle, and the actual method they propose. In addition, two examples of second generation methods are also considered. It is concluded that first generation HRA methods generally have very simplistic operator models, either referring to the time-reliability relationship or to elementary information processing concepts. It is argued that second generation HRA methods must recognise that cognition is embedded in a context, and be able to account for that in the way human reliability is analysed and assessed

  10. System evaluations by means of reliability analyses

    International Nuclear Information System (INIS)

    Breiling, G.

    1976-01-01

    The objective of this study is to show which analysis requirements are associated with the claim that a reliability analysis, as practised at present, can provide a quantitative risk assessment in absolute terms. The question arises of whether this claim can be substantiated without direct access to the specialist technical departments of a manufacturer and to the multifarious detail information available in these departments. The individual problems arising in the course of such an analysis are discussed on the example of a reliability analysis of a core flooding system. The questions discussed relate to analysis organisation, sequence analysis, fault-tree analysis, and the treatment of operational processes superimposed on the failure and repair processes. (orig.) [de

  11. Photovoltaic performance and reliability workshop

    Energy Technology Data Exchange (ETDEWEB)

    Mrig, L. [ed.]

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986-1993. PV performance and PV reliability are at least as important as PV cost, if not more so. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange technical knowledge and field experience related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop, held in September 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  12. A study of operational and testing reliability in software reliability analysis

    International Nuclear Information System (INIS)

    Yang, B.; Xie, M.

    2000-01-01

    Software reliability is an important aspect of any complex equipment today. Software reliability is usually estimated based on reliability models such as nonhomogeneous Poisson process (NHPP) models. Software systems improve during the testing phase, while they normally do not change during the operational phase. Depending on whether the reliability is to be predicted for the testing phase or the operational phase, different measures should be used. In this paper, two different reliability concepts, namely the operational reliability and the testing reliability, are clarified and studied in detail. These concepts have been mixed up or even misused in some existing literature. Using different reliability concepts will lead to different reliability values and, further, to different reliability-based decisions. The difference between the estimated reliabilities is studied and the effect on the optimal release time is investigated
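
    A concrete way to see the distinction is to write out both measures for a specific NHPP model; the Goel-Okumoto model is used below as an illustrative choice, not necessarily the model analysed in the paper.

    $$m(t) = a\left(1 - e^{-bt}\right), \qquad R_{\text{test}}(x \mid t) = \exp\!\left[-\bigl(m(t+x) - m(t)\bigr)\right], \qquad R_{\text{op}}(x \mid t) = \exp\!\left[-\lambda(t)\,x\right],$$

    where $m(t)$ is the expected number of faults detected by testing time $t$, $a$ the expected total fault content, $b$ the fault detection rate, and $\lambda(t) = m'(t) = ab\,e^{-bt}$. The testing reliability accounts for continued fault removal after $t$, whereas the operational reliability freezes the failure intensity at its value when the software is released.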

  13. Melanoma cells present high levels of HLA-A2-tyrosinase in association with instability and aberrant intracellular processing of tyrosinase.

    Science.gov (United States)

    Michaeli, Yael; Sinik, Keren; Haus-Cohen, Maya; Reiter, Yoram

    2012-04-01

    Short-lived protein translation products are proposed to be a major source of substrates for major histocompatibility complex (MHC) class I antigen processing and presentation; however, a direct link between protein stability and the presentation level of MHC class I-peptide complexes has not been made. We have recently discovered that the peptide Tyr((369-377)), derived from the tyrosinase protein, is highly presented by HLA-A2 on the surface of melanoma cells. To examine the molecular mechanisms responsible for this presentation, we compared characteristics of tyrosinase in melanoma cell lines that present high or low levels of HLA-A2-Tyr((369-377)) complexes. We found no correlation between mRNA levels and the levels of HLA-A2-Tyr((369-377)) presentation. Co-localization experiments revealed that, in cell lines presenting low levels of HLA-A2-Tyr((369-377)) complexes, tyrosinase co-localizes with LAMP-1, a melanosome marker, whereas in cell lines presenting high HLA-A2-Tyr((369-377)) levels, tyrosinase localizes to the endoplasmic reticulum. We also observed differences in tyrosinase molecular weight and glycosylation composition as well as major differences in protein stability (t(1/2)). By stabilizing the tyrosinase protein, we observed a dramatic decrease in HLA-A2-tyrosinase presentation. Our findings suggest that aberrant processing and instability of tyrosinase are responsible for the high presentation of HLA-A2-Tyr((369-377)) complexes and thus shed new light on the relationship between intracellular processing, stability of proteins, and MHC-restricted peptide presentation. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Proceedings of the Scientific Meeting and Presentation on Basic Research in Nuclear of the Science and Technology part II : Nuclear Chemistry and Process Technology

    International Nuclear Information System (INIS)

    Kamsul Abraha; Yateman Arryanto; Sri Jauhari S; Agus Taftazani; Kris Tri Basuki; Djoko Sardjono, Ign.; Sukarsono, R.; Samin; Syarip; Suryadi, MS; Sardjono, Y.; Tri Mardji Atmono; Dwiretnani Sudjoko; Tjipto Sujitno, BA.

    2007-08-01

    The Scientific Meeting and Presentation on Basic Research in Nuclear Science and Technology is a routine activity held by the Centre for Accelerator Technology and Material Process, National Nuclear Energy Agency, for monitoring the research activities achieved in the National Nuclear Energy Agency. The Meeting was held in Yogyakarta on July 10, 2007. The proceedings contain papers presented at the meeting on Nuclear Chemistry and Process Technology; there are 47 papers, each with a separate index. The proceedings are the second of three parts published in the series. (PPIN)

  15. Space transportation main engine reliability and safety

    Science.gov (United States)

    Monk, Jan C.

    1991-01-01

    Viewgraphs are used to illustrate the reliability engineering and aerospace safety of the Space Transportation Main Engine (STME). The methodology employed is called Total Quality Management (TQM). The goal is to develop a robust design. Reducing process variability produces a product with improved reliability and safety. Some engine system design characteristics that improve reliability are identified.

  16. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  17. Structural reliability analysis and seismic risk assessment

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Shinozuka, M.

    1984-01-01

    This paper presents a reliability analysis method for the safety evaluation of nuclear structures. By utilizing this method, it is possible to estimate the limit state probability over the lifetime of structures and to generate analytically the fragility curves for PRA studies. The earthquake ground acceleration, in this approach, is represented by a segment of a stationary Gaussian process with zero mean and a Kanai-Tajimi spectrum. All possible seismic hazards at a site, represented by a hazard curve, are also taken into consideration. Furthermore, the limit state of a structure is analytically defined and the corresponding limit state surface is then established. Finally, the fragility curve is generated and the limit state probability is evaluated. In this paper, using a realistic reinforced concrete containment as an example, results of the reliability analysis of the containment subjected to dead load, live load and ground earthquake acceleration are presented, and a fragility curve for PRA studies is also constructed
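
    For reference, the Kanai-Tajimi power spectral density commonly used for such ground acceleration models has the standard form below; this is the textbook expression, and the parameter values used in the paper are not reproduced here.

    $$S(\omega) = S_0\,\frac{\omega_g^4 + 4\zeta_g^2\,\omega_g^2\,\omega^2}{\left(\omega_g^2 - \omega^2\right)^2 + 4\zeta_g^2\,\omega_g^2\,\omega^2},$$

    where $S_0$ is the intensity of the underlying white-noise bedrock excitation and $\omega_g$, $\zeta_g$ are the filter frequency and damping representing the soil layer.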

  18. 76 FR 71011 - Reliability Technical Conference Agenda

    Science.gov (United States)

    2011-11-16

    ... process? Do you support the exemption process changes identified by the RTOs or other entities in comments... reliability plans? b. Do you support the exemption process changes identified by the RTOs or other entities in...

  19. Reliable 6 PEP LTPS device for AMOLED's

    Science.gov (United States)

    Chou, Cheng-Wei; Wang, Pei-Yun; Hu, Chin-Wei; Chang, York; Chuang, Ching-Sang; Lin, Yusin

    2013-09-01

    This study presents a TFT structure which requires fewer photolithography steps and offers higher cost competitiveness in AMOLED display markets. A novel LTPS-based 6-mask TFT structure for bottom-emission AMOLED displays is demonstrated in this paper. High field-effect mobility (PMOS < 80 cm2/Vs) and high reliability (PBTS △Vth < 0.02 V @ 50°C, VG = 15 V, 10 ks) were accomplished without the high-temperature and rapid thermal annealing (RTA) activation process. Furthermore, a 14-inch AMOLED TV was achieved on the proposed 6-PEP TFT backplane using the Gen. 3.5 mass production factory.

  20. Proceeding on the scientific meeting and presentation on basic research of nuclear science and technology (book II): chemical, waste processing technology and environment

    International Nuclear Information System (INIS)

    Prayitno; Syarip; Samin; Darsono; Agus Taftazani; Sudjatmoko; Tri Mardji Atmono; Dwi Biyantoro; Gede Sutresna W; Tjipto Sujitno; Slamet Santosa; Herry Poernomo; Bambang Siswanto; Eko Edy Karmanto; Endro Kismolo; Budi Setiawan; Prajitno; Jumari; Wahini Nurhayati

    2015-06-01

    The Scientific Meeting and Presentation on Basic Research in Nuclear Science and Technology is an annual activity held by the Centre for Accelerator Science and Technology, National Nuclear Energy Agency, in Yogyakarta, for monitoring research activities achieved by the Agency. The papers presented at the meeting were collected into proceedings, which were divided into two groups: chemistry, and environmental and waste treatment process technology. The proceedings consist of three articles from keynote speakers and 24 articles from BATAN and other participants. (PPIKSN)

  1. Proceedings of the Scientific Meeting and Presentation on Basic Research in Nuclear Science and Technology part II: Nuclear Chemistry, Process Technology, Radioactive Waste Management and Environment

    International Nuclear Information System (INIS)

    Sukarsono, R.; Karmanto, Eko-Edy; Suradjijo, Ganang

    2000-01-01

    The Scientific Meeting and Presentation on Basic Research in Nuclear Science and Technology is an annual activity held by the Centre for Research and Development of Advanced Technology, National Nuclear Energy Agency, for monitoring research activities achieved by the Agency. The papers presented at the meeting were collected into proceedings. This is the second part of the proceedings and contains 71 articles in the fields of nuclear chemistry, process technology, radioactive waste management, and environment (PPIN).

  2. Reliability Evaluation of Bridges Based on Nonprobabilistic Response Surface Limit Method

    Directory of Open Access Journals (Sweden)

    Xuyong Chen

    2017-01-01

    Full Text Available Due to many uncertainties in the nonprobabilistic reliability assessment of bridges, the limit state function is generally unknown. The traditional nonprobabilistic response surface method involves a lengthy and oscillating iteration process and makes it difficult to solve for the nonprobabilistic reliability index. This article proposes a nonprobabilistic response surface limit method based on the interval model. The intention of this method is to solve for the upper and lower limits of the nonprobabilistic reliability index and to narrow its range. If the range of the reliability index reduces to an acceptable accuracy, the solution is considered convergent and the nonprobabilistic reliability index is obtained. The case study indicates that using the proposed method can avoid an oscillating iteration process, make the iteration process stable and convergent, reduce the number of iteration steps significantly, and improve computational efficiency and precision significantly compared with the traditional nonprobabilistic response surface method. Finally, the nonprobabilistic reliability evaluation process for bridges is built by evaluating the reliability of one PC continuous rigid-frame bridge with three spans using the proposed method, which appears to be simpler and more reliable when samples and parameters are lacking in bridge nonprobabilistic reliability evaluation.
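
    For orientation, one common interval-model definition of the nonprobabilistic reliability index that such methods bracket from above and below is given here; it is a standard definition assumed for illustration rather than quoted from the article.

    $$\eta = \frac{M^c}{M^r}, \qquad M^c = \frac{M^{\max} + M^{\min}}{2}, \qquad M^r = \frac{M^{\max} - M^{\min}}{2},$$

    where $M = g(X_1,\dots,X_n)$ is the limit state function evaluated over the interval variables $X_i$, and $M^{\max}$, $M^{\min}$ are its extreme values; the structure is judged nonprobabilistically reliable when $\eta > 1$. A response surface stands in for $g(\cdot)$ when the limit state function cannot be written explicitly.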

  3. Reliability of Circumplex Axes

    Directory of Open Access Journals (Sweden)

    Micha Strack

    2013-06-01

    Full Text Available We present a confirmatory factor analysis (CFA procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing interpersonal and motivational constructs—Interpersonal Adjective List (IAL, Interpersonal Adjective Scales (revised; IAS-R, Inventory of Interpersonal Problems (IIP, Impact Messages Inventory (IMI, Circumplex Scales of Interpersonal Values (CSIV, Support Action Scale Circumplex (SAS-C, Interaction Problems With Animals (IPI-A, Team Role Circle (TRC, Competing Values Leadership Instrument (CV-LI, Love Styles, Organizational Culture Assessment Instrument (OCAI, Customer Orientation Circle (COC, and System for Multi-Level Observation of Groups (behavioral adjectives; SYMLOG—in 17 German-speaking samples (29 subsamples, grouped by self-report, other report, and metaperception assessments. The general factor accounted for a proportion ranging from 1% to 48% of the item variance, the axes component for 2% to 30%; and scale specificity for 1% to 28%, respectively. Reliability estimates varied considerably from .13 to .92. An application of the Nunnally and Bernstein formula proposed by Markey, Markey, and Tinsley overestimated axes reliabilities in cases of large-scale specificities but otherwise works effectively. Contemporary circumplex evaluations such as Tracey’s RANDALL are sensitive to the ratio of the axes and scale-specificity components. In contrast, the proposed model isolates both components.

  4. Auditory semantic processing in dichotic listening: effects of competing speech, ear of presentation, and sentential bias on N400s to spoken words in context.

    Science.gov (United States)

    Carey, Daniel; Mercure, Evelyne; Pizzioli, Fabrizio; Aydelott, Jennifer

    2014-12-01

    The effects of ear of presentation and competing speech on N400s to spoken words in context were examined in a dichotic sentence priming paradigm. Auditory sentence contexts with a strong or weak semantic bias were presented in isolation to the right or left ear, or with a competing signal presented in the other ear at a SNR of -12 dB. Target words were congruent or incongruent with the sentence meaning. Competing speech attenuated N400s to both congruent and incongruent targets, suggesting that the demand imposed by a competing signal disrupts the engagement of semantic comprehension processes. Bias strength affected N400 amplitudes differentially depending upon ear of presentation: weak contexts presented to the le/RH produced a more negative N400 response to targets than strong contexts, whereas no significant effect of bias strength was observed for sentences presented to the re/LH. The results are consistent with a model of semantic processing in which the RH relies on integrative processing strategies in the interpretation of sentence-level meaning. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Time-dependent reliability analysis of nuclear reactor operators using probabilistic network models

    International Nuclear Information System (INIS)

    Oka, Y.; Miyata, K.; Kodaira, H.; Murakami, S.; Kondo, S.; Togo, Y.

    1987-01-01

    Human factors are very important for the reliability of a nuclear power plant. Human behavior has an essentially time-dependent nature. The details of thinking and decision-making processes are important for detailed analysis of human reliability. They have, however, not been well considered by the conventional methods of human reliability analysis. The present paper describes models for time-dependent and detailed human reliability analysis. Recovery by an operator is taken into account, and two-operator models are also presented

  6. An Integrated Approach to Establish Validity and Reliability of Reading Tests

    Science.gov (United States)

    Razi, Salim

    2012-01-01

    This study presents the process of developing a reading test and establishing its reliability and validity through an integrative approach, since conventional reliability and validity measures reveal the difficulty of a reading test only superficially. In this respect, analysing the vocabulary frequency of the test is regarded as a more suitable way…

  7. Constitutional Law--Procedural Due Process--Student Has Right to Have Attorney Present at University Disciplinary Hearing When Criminal Charges Are Pending.

    Science.gov (United States)

    Vessels, Rodney Jay

    1978-01-01

    In the case of Gabrilowitz v Newman the court used the due process balancing test to conclude that a student has a right to have counsel present at a university disciplinary hearing where the conduct in question is the object of a pending criminal proceeding. Available from J. Reuben Clark Law School, Brigham Young U., Provo, UT 84602. (MSE)

  8. Reliability evaluation of power systems

    CERN Document Server

    Billinton, Roy

    1996-01-01

    The Second Edition of this well-received textbook presents over a decade of new research in power system reliability-while maintaining the general concept, structure, and style of the original volume. This edition features new chapters on the growing areas of Monte Carlo simulation and reliability economics. In addition, chapters cover the latest developments in techniques and their application to real problems. The text also explores the progress occurring in the structure, planning, and operation of real power systems due to changing ownership, regulation, and access. This work serves as a companion volume to Reliability Evaluation of Engineering Systems: Second Edition (1992).

  9. Aerospace reliability applied to biomedicine.

    Science.gov (United States)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

  10. Integrating reliability analysis and design

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1980-10-01

    This report describes the Interactive Reliability Analysis Project and demonstrates the advantages of using computer-aided design systems (CADS) in reliability analysis. Common cause failure problems require presentations of systems, analysis of fault trees, and evaluation of solutions to these. Results have to be communicated between the reliability analyst and the system designer. Using a computer-aided design system saves time and money in the analysis of design. Computer-aided design systems lend themselves to cable routing, valve and switch lists, pipe routing, and other component studies. At EG and G Idaho, Inc., the Applicon CADS is being applied to the study of water reactor safety systems

  11. A viral, transporter associated with antigen processing (TAP)-independent, high affinity ligand with alternative interactions endogenously presented by the nonclassical human leukocyte antigen E class I molecule.

    Science.gov (United States)

    Lorente, Elena; Infantes, Susana; Abia, David; Barnea, Eilon; Beer, Ilan; García, Ruth; Lasala, Fátima; Jiménez, Mercedes; Mir, Carmen; Morreale, Antonio; Admon, Arie; López, Daniel

    2012-10-12

    The transporter associated with antigen processing (TAP) enables the flow of viral peptides generated in the cytosol by the proteasome and other proteases to the endoplasmic reticulum, where they complex with nascent human leukocyte antigen (HLA) class I. Later, these peptide-HLA class I complexes can be recognized by CD8(+) lymphocytes. Cancerous cells and infected cells in which TAP is blocked, as well as individuals with unusable TAP complexes, are able to present peptides on HLA class I by generating them through TAP-independent processing pathways. Here, we identify a physiologically processed HLA-E ligand derived from the D8L protein in TAP-deficient vaccinia virus-infected cells. This natural high affinity HLA-E class I ligand uses alternative interactions to the anchor motifs previously described to be presented on nonclassical HLA class I molecules. This octameric peptide was also presented on HLA-Cw1 with similar binding affinity on both classical and nonclassical class I molecules. In addition, this viral peptide inhibits HLA-E-mediated cytolysis by natural killer cells. Comparison between the amino acid sequences of the presenting HLA-E and HLA-Cw1 alleles revealed a shared structural motif in both HLA class I molecules, which could be related to their observed similar cross-reactivity affinities. This motif consists of several residues located on the floor of the peptide-binding site. These data expand the role of HLA-E as an antigen-presenting molecule.

  12. Technical presentation

    CERN Document Server

    FP Department

    2009-01-01

    07 April 2009 Technical presentation by Leuze Electronics: 14.00 – 15.00, Main Building, Room 61-1-017 (Room A) Photoelectric sensors, data identification and transmission systems, image processing systems. We at Leuze Electronics are "the sensor people": we have been specialising in optoelectronic sensors and safety technology for accident prevention for over 40 years. Our dedicated staff are all highly customer oriented. Customers of Leuze Electronics can always rely on one thing – on us! • Founded in 1963 • 740 employees • 115 MEUR turnover • 20 subsidiaries • 3 production facilities in southern Germany. Product groups: • Photoelectric sensors • Identification and measurements • Safety devices

  13. Optimal, Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2002-01-01

    Reliability based code calibration is considered in this paper. It is described how the results of FORM based reliability analysis may be related to the partial safety factors and characteristic values. The code calibration problem is presented in a decision theoretical form and it is discussed how...... of reliability based code calibration of LRFD based design codes....

  14. Structural reliability assessment capability in NESSUS

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessment capability is demonstrated. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  15. Reliability models for Space Station power system

    Science.gov (United States)

    Singh, C.; Patton, A. D.; Kim, Y.; Wagner, H.

    1987-01-01

    This paper presents a methodology for the reliability evaluation of Space Station power system. The two options considered are the photovoltaic system and the solar dynamic system. Reliability models for both of these options are described along with the methodology for calculating the reliability indices.

  16. Presentation of the process External communications on the nuclear facilities operation of the Adjunct Head Office of Nuclear Safety of Comision Nacional de Seguridad Nuclear y Salvaguardias

    International Nuclear Information System (INIS)

    Espinosa V, J. M.

    2012-10-01

    The Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS), exercising the powers granted to it by the Regulatory Law of constitutional Article 27 in nuclear matters, began developing the process called External communications on the operation of nuclear facilities, with the purpose of managing the evaluation of safety concerns about nuclear facilities received from people outside the CNSNS. The process allows members of the public, as well as workers carrying out activities within the regulatory framework imposed by the CNSNS, to report their safety concerns to the Commission through several channels (for example, directly to the personnel of the assigned office, official and public statements, telephone, electronic mail, etc.). This article presents the legal framework that confers on the CNSNS the authority to develop this process and describes the most important elements that compose it. The term External communication on the operation of nuclear facilities is defined, and the way these communications are received, evaluated and closed by the assigned office is described, together with the objectives the process intends to reach. The purpose of the process is to strengthen the actions the CNSNS carries out, in the execution of its functions, to maintain safety standards in the operation of nuclear facilities in Mexico. (Author)

  17. Research on Control Method Based on Real-Time Operational Reliability Evaluation for Space Manipulator

    Directory of Open Access Journals (Sweden)

    Yifan Wang

    2014-05-01

    Full Text Available A control method based on real-time operational reliability evaluation for a space manipulator is presented for improving the success rate of the manipulator during the execution of a task. In this paper, a method for quantitative analysis of operational reliability while the manipulator is executing a specified task is given; then a control model which can regulate this quantitative operational reliability is built. First, the control process is described using a state space equation. Second, process parameters are estimated in real time using a Bayesian method. Third, the expression for the system's real-time operational reliability is deduced from the state space equation and the process parameters estimated with the Bayesian method. Finally, a control variable regulation strategy which considers the cost of control is given, based on the theory of Statistical Process Control. It is shown via simulations that this method effectively improves the operational reliability of the space manipulator control system.
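
    The four steps above (state-space description, real-time Bayesian parameter estimation, a reliability expression, and cost-aware regulation of the control variable) can be illustrated with a deliberately simplified scalar example. Everything below (the linear model, noise level, specification limit and intervention threshold) is an assumption made for the sketch, not the authors' manipulator model.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)

        # Hypothetical scalar process: x_{k+1} = a*x_k + b*u_k + w_k (all values assumed)
        a_true, b, q = 0.95, 0.5, 0.05      # true dynamics and process-noise std
        spec_limit = 2.0                    # |x| <= spec_limit counts as "successful" operation

        def estimate_a(xs, us):
            """Least-squares style estimate of 'a' with a crude uncertainty (stand-in for the Bayesian step)."""
            X = np.array(xs[:-1]); U = np.array(us); Y = np.array(xs[1:]) - b * U
            a_hat = float(X @ Y / (X @ X))
            resid = Y - a_hat * X
            a_std = float(np.sqrt(resid.var() / (X @ X) + 1e-12))
            return a_hat, a_std

        xs, us, u = [1.5], [], 0.0
        for k in range(50):
            xs.append(a_true * xs[-1] + b * u + rng.normal(0.0, q))   # plant step under current control
            us.append(u)
            if k > 5:
                a_hat, a_std = estimate_a(xs, us)
                # One-step-ahead predictive distribution of the state under the current control
                mean_next = a_hat * xs[-1] + b * u
                std_next = np.sqrt((a_std * xs[-1]) ** 2 + q ** 2)
                # Real-time operational reliability: P(next state stays within spec)
                reliability = norm.cdf(spec_limit, mean_next, std_next) - norm.cdf(-spec_limit, mean_next, std_next)
                # SPC-flavoured regulation: adjust the control only when reliability drifts low,
                # because every adjustment is assumed to carry a cost
                if reliability < 0.99:
                    u = -a_hat * xs[-1] / b

        print(f"final state {xs[-1]:+.3f}, a_hat = {a_hat:.3f}, reliability = {reliability:.4f}")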

  18. Discussion on an informative system set-up for the registration and processing of reliability data on FBR components in view of its application to design and safety studies and plant exploitation improvement

    International Nuclear Information System (INIS)

    Righini, R.; Sola, P.G.; Zappellini, G.

    1990-01-01

    This report describes the set-up and management activities carried out by ENEA-VEL in collaboration with NIER in the development of a reliability data bank on fast reactor components; this data bank consists of an informative system implemented on the IBM 3090 computer of the ENEA centre of Bologna, starting from the software of the CEDB set up by CCR Euratom of Ispra for the registration of reliability data on thermal reactor components. The report contains a detailed description of all the modules (engineering, operating, etc.) provided in the informative system and of the modifications introduced by ENEA in order to adapt them to the peculiarities of fast reactors and to increase the system's flexibility; a short description of the available data processing methods is also included. This is followed by a comparison between the results obtained applying the classical methods and the particular ones set up by ENEA: the comparison is useful to demonstrate the importance of the method applied in order to obtain significant processed reliability data. The report also shows the importance of the data bank in improving component design and plant safety and exploitation, with particular reference to the identification of critical areas and to the definition of the best inspection and maintenance programs

  19. DMD reliability: a MEMS success story

    Science.gov (United States)

    Douglass, Michael

    2003-01-01

    The Digital Micromirror Device (DMD) developed by Texas Instruments (TI) has made tremendous progress in both performance and reliability since it was first invented in 1987. From the first working concept of a bistable mirror, the DMD is now providing high-brightness, high-contrast, and high-reliability in over 1,500,000 projectors using Digital Light Processing technology. In early 2000, TI introduced the first DMD chip with a smaller mirror (14-micron pitch versus 17-micron pitch). This allowed a greater number of high-resolution DMD chips per wafer, thus providing an increased output capacity as well as the flexibility to use existing package designs. By using existing package designs, subsequent DMDs cost less as well as met our customers' demand for faster time to market. In recent years, the DMD achieved the status of being a commercially successful MEMS device. It reached this status by the efforts of hundreds of individuals working toward a common goal over many years. Neither textbooks nor design guidelines existed at the time. There was little infrastructure in place to support such a large endeavor. The knowledge we gained through our characterization and testing was all we had available to us through the first few years of development. Reliability was only a goal in 1992 when production development activity started; a goal that many throughout the industry and even within Texas Instruments doubted the DMD could achieve. The results presented in this paper demonstrate that we succeeded by exceeding the reliability goals.

  20. Safety, reliability and worker satisfaction during organizational change

    NARCIS (Netherlands)

    Zwetsloot, G.I.J.M.; Drupsteen, L.; Vroome, E.M.M. de

    2014-01-01

    The research presented in this paper was carried out in four process industry plants in the Netherlands, to identify factors that have the potential to increase safety and reliability while maintaining or improving job satisfaction. The data used were gathered as part of broader trajectories in

  1. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Full Text Available Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.

  2. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution describes the forum and advertises its usage in the community.

  3. Development of the data logging and graphical presentation for gamma scanning, trouble shooting and process evaluation in the petroleum refinery column

    International Nuclear Information System (INIS)

    Saengchantr, Dhanaj; Chueinta Siripone

    2009-07-01

    Full text: Software for data logging and graphical presentation of gamma scanning for troubleshooting and process evaluation of petroleum refinery columns was developed. By setting the gamma source and gamma detector in opposite orientation alongside the column and recording the radiation transmitted through the column at several elevations, a graphical profile of relative-density gamma intensity versus vertical elevation could be obtained. Compared with the engineering drawing, physical and process abnormalities could be clearly identified during field investigation. The program can also accumulate up to 8 data sets of 1,000 points each, allowing convenient comparison of different operational parameter adjustments during remediation of a problem and/or process optimization. Together with other factors, this development also enhanced the technological capability of the TINT Service Center in serving the petroleum refinery
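
    A minimal sketch of the logging-and-overlay functionality described above (storing several scans of transmitted intensity versus elevation and comparing them graphically) might look as follows. The 8-dataset and 1,000-point limits are taken from the description; the class and field names are hypothetical.

        import matplotlib.pyplot as plt

        MAX_DATASETS, MAX_POINTS = 8, 1000   # limits quoted in the text

        class ScanLogger:
            """Store gamma-scan profiles (elevation vs. transmitted intensity) and overlay them."""
            def __init__(self):
                self.datasets = []   # list of (label, elevations, intensities)

            def add_scan(self, label, elevations, intensities):
                if len(self.datasets) >= MAX_DATASETS:
                    raise RuntimeError("maximum of 8 data sets reached")
                self.datasets.append((label, elevations[:MAX_POINTS], intensities[:MAX_POINTS]))

            def plot(self):
                for label, elev, inten in self.datasets:
                    plt.plot(inten, elev, label=label)   # intensity on x, elevation on y
                plt.xlabel("transmitted gamma intensity (relative)")
                plt.ylabel("elevation along column")
                plt.legend()
                plt.show()

        # Usage with made-up numbers:
        # logger = ScanLogger()
        # logger.add_scan("baseline", [0, 1, 2, 3], [100, 80, 82, 95])
        # logger.add_scan("after adjustment", [0, 1, 2, 3], [100, 85, 87, 96])
        # logger.plot()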

  4. Load Control System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, Daniel [Montana Tech of the Univ. of Montana, Butte, MT (United States)

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions to the project scope occurred August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also consisted of matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies; and, power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and, the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  5. Microprocessor hardware reliability

    Energy Technology Data Exchange (ETDEWEB)

    Wright, R I

    1982-01-01

    Microprocessor-based technology has had an impact in nearly every area of industrial electronics and many applications have important safety implications. Microprocessors are being used for the monitoring and control of hazardous processes in the chemical, oil and power generation industries, for the control and instrumentation of aircraft and other transport systems and for the control of industrial machinery. Even in the field of nuclear reactor protection, where designers are particularly conservative, microprocessors are used to implement certain safety functions and may play increasingly important roles in protection systems in the future. Where microprocessors are simply replacing conventional hard-wired control and instrumentation systems no new hazards are created by their use. In the field of robotics, however, the microprocessor has opened up a totally new technology and with it has created possible new and as yet unknown hazards. The paper discusses some of the design and manufacturing techniques which may be used to enhance the reliability of microprocessor based systems and examines the available reliability data on lsi/vlsi microcircuits. 12 references.

  6. Reliability data collection and use in risk and availability assessment

    International Nuclear Information System (INIS)

    Colombari, V.

    1989-01-01

    For EuReDatA it is a prevailing objective to initiate and support contact between experts, companies and institutions active in reliability engineering and research. Main topics of this 6th EuReDatA Conference are: Reliability data banks; incidents data banks; common cause data; source and propagation of uncertainties; computer aided risk analysis; reliability and incidents data acquisition and processing; human reliability; probabilistic safety and availability assessment; feedback of reliability into system design; data fusion; reliability modeling and techniques; structural and mechanical reliability; consequence modeling; software and electronic reliability; reliability tests. Some conference papers are separately indexed in the database. (HP)

  7. Proceeding of the Scientific Meeting and Presentation on Basic Research in Nuclear of the Scientific and Technology Part II : Nuclear Chemistry; Process Technology and Radioactive Waste Management; Environment

    International Nuclear Information System (INIS)

    Sudjatmoko; Karmanto, Eko Edy; Endang-Supartini

    1996-04-01

    The Scientific Meeting and Presentation on Basic Research in Nuclear Science and Technology is a routine activity held by the Yogyakarta Nuclear Research Centre, National Atomic Energy Agency (BATAN), for monitoring the research achieved in BATAN. The proceedings contain papers on basic research in Nuclear Chemistry, Process Technology, Radioactive Waste Management and Environment. This is the second of two parts published in the series. There are 61 articles, which are indexed separately

  8. Reliability analysis of shutdown system

    International Nuclear Information System (INIS)

    Kumar, C. Senthil; John Arul, A.; Pal Singh, Om; Suryaprakasa Rao, K.

    2005-01-01

    This paper presents the results of a reliability analysis of the Shutdown System (SDS) of the Indian Prototype Fast Breeder Reactor. Reliability analysis carried out using Fault Tree Analysis predicts a value of 3.5 x 10⁻⁸/de for failure of the shutdown function in case of global faults and 4.4 x 10⁻⁸/de for local faults. Based on 20 de/y, the frequency of shutdown function failure is 0.7 x 10⁻⁶/ry, which meets the reliability target set by the Indian Atomic Energy Regulatory Board. The reliability is limited by Common Cause Failure (CCF) of the actuation part of the SDS and, to a lesser extent, CCF of electronic components. The failure frequency of individual systems is of the order of 10⁻³/ry, which also meets the safety criteria. Uncertainty analysis indicates a maximum error factor of 5 for the top event unavailability
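
    As a quick consistency check of the figures quoted above, the per-reactor-year frequency follows directly from the per-demand value and the assumed demand rate; the snippet below only restates the abstract's numbers.

        # Reproduce the per-reactor-year frequency from the per-demand unavailability
        # and the demand rate quoted in the abstract.
        p_fail_per_demand_global = 3.5e-8   # failure of the shutdown function, global faults
        demands_per_year = 20               # demands per reactor-year (de/y)

        freq_global = p_fail_per_demand_global * demands_per_year
        print(f"{freq_global:.1e} per reactor-year")   # 7.0e-07, i.e. the quoted 0.7 x 10^-6/ry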

  9. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs.

  10. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs

  11. As reliable as the sun

    Science.gov (United States)

    Leijtens, J. A. P.

    2017-11-01

    Fortunately there is almost nothing as reliable as the sun, which can consequently be utilized as a very reliable source of spacecraft power. In order to harvest this power, the solar panels have to be pointed towards the sun as accurately and reliably as possible. To this end, sunsensors are available on almost every satellite to support vital sun-pointing capability throughout the mission, even in the deployment and safe mode phases of the satellite's life. Given the criticality of the application, one would expect that after more than 50 years of sun sensor utilisation, such sensors would be fully mature and optimised. In actual fact though, the majority of sunsensors employed are still coarse sunsensors, which have a proven extreme reliability but present major issues regarding albedo sensitivity and pointing accuracy.

  12. Reliability analysis in intelligent machines

    Science.gov (United States)

    Mcinroy, John E.; Saridis, George N.

    1990-01-01

    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques for finding the reliability corresponding to alternative subsets of control and sensing strategies are proposed such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.

  13. Reliability and Maintainability (RAM) Training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Packard, Michael H. (Editor)

    2000-01-01

    The theme of this manual is failure physics: the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost reliable products. In a broader sense the manual should do more. It should underscore the urgent need for mature attitudes toward reliability. Five of the chapters were originally presented as a classroom course to over 1000 Martin Marietta engineers and technicians. Another four chapters and three appendixes have been added. We begin with a view of reliability from the years 1940 to 2000. Chapter 2 starts the training material with a review of mathematics and a description of what elements contribute to product failures. The remaining chapters elucidate basic reliability theory and the disciplines that allow us to control and eliminate failures.

  14. Automated reliability assessment for spectroscopic redshift measurements

    Science.gov (United States)

    Jamal, S.; Le Brun, V.; Le Fèvre, O.; Vibert, D.; Schmitt, A.; Surace, C.; Copin, Y.; Garilli, B.; Moresco, M.; Pozzetti, L.

    2018-03-01

    Context. Future large-scale surveys, such as the ESA Euclid mission, will produce a large set of galaxy redshifts (≥106) that will require fully automated data-processing pipelines to analyze the data, extract crucial information and ensure that all requirements are met. A fundamental element in these pipelines is to associate to each galaxy redshift measurement a quality, or reliability, estimate. Aim. In this work, we introduce a new approach to automate the spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function. Methods: We propose to rephrase the spectroscopic redshift estimation into a Bayesian framework, in order to incorporate all sources of information and uncertainties related to the redshift estimation process and produce a redshift posterior probability density function (PDF). To automate the assessment of a reliability flag, we exploit key features in the redshift posterior PDF and machine learning algorithms. Results: As a working example, public data from the VIMOS VLT Deep Survey is exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification in order to describe different types of redshift PDFs, but due to the subjective definition of these flags (classification accuracy 58%), we soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions (classification accuracy 98%), we projected unlabeled data from preliminary mock simulations for the Euclid space mission into this mapping to predict their redshift reliability labels. Conclusions: Through the development of a methodology in which a system can build its own experience to assess the quality of a parameter, we are able to set a preliminary basis of an automated reliability assessment for
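
    As a toy illustration of the unsupervised step described above (clustering descriptors of the redshift posterior PDF instead of relying on subjectively defined flags), the sketch below builds a few plausible PDF features and groups them with k-means. The simulated posteriors, the chosen features and the cluster count are assumptions, not the descriptors or algorithms actually used in the paper.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        z_grid = np.linspace(0.0, 2.0, 500)

        def pdf_features(pdf):
            """A few hypothetical descriptors of a redshift posterior PDF."""
            pdf = pdf / np.trapz(pdf, z_grid)
            mode = z_grid[np.argmax(pdf)]
            core = np.abs(z_grid - mode) < 0.02
            return [
                pdf.max(),                                        # peak height
                -np.trapz(pdf * np.log(pdf + 1e-12), z_grid),     # differential entropy
                np.trapz(pdf[core], z_grid[core]),                # probability mass near the mode
            ]

        # Simulated posteriors of varying quality: sharp unimodal vs. broad/multimodal
        pdfs = []
        for _ in range(200):
            if rng.random() < 0.5:
                pdfs.append(np.exp(-0.5 * ((z_grid - rng.uniform(0.2, 1.8)) / 0.01) ** 2))
            else:
                pdfs.append(sum(np.exp(-0.5 * ((z_grid - rng.uniform(0.2, 1.8)) / 0.05) ** 2)
                                for _ in range(3)))

        X = np.array([pdf_features(p) for p in pdfs])
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        print(np.bincount(labels))   # sizes of the two candidate "reliability" clusters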

  15. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP) [de

  16. Operational reliability management; Gestao da confiabilidade operacional

    Energy Technology Data Exchange (ETDEWEB)

    Bressan, Edemir [Refinaria Alberto Pasqualini (REFAP), Canoas, RS (Brazil). Setor de Tecnologia de Equipamentos

    2000-07-01

    This paper describes the reliability management of the PETROBRAS Alberto Pasqualini Refinery process plant: strategies, maintenance organizational structure, management processes, predictive and preventive maintenance, condition monitoring techniques and reliability metrics. It points out the need for a close working relationship between the production, maintenance and project engineering functions, with highly qualified and committed in-house teams, in order to reach one of the highest mechanical availabilities among Latin American refineries. (author)

  17. Pocket Handbook on Reliability

    Science.gov (United States)

    1975-09-01

    exponential distributions, Weibull distribution, estimating reliability, confidence intervals, reliability growth, O.C. curves, Bayesian analysis. ...an introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. ...includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future

  18. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

    The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated......, and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown....
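
    For readers meeting the two concepts named here for the first time, a minimal numerical example of the failure probability and reliability index for a simple resistance-minus-load limit state is sketched below; the means and standard deviations are invented for illustration.

        from math import sqrt
        from scipy.stats import norm

        # Hypothetical limit state g = R - S with independent normal resistance R and load S
        mu_R, sigma_R = 10.0, 1.5   # illustrative values only
        mu_S, sigma_S = 6.0, 1.0

        beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)   # reliability index
        p_f = norm.cdf(-beta)                                  # failure probability P(g < 0)
        print(f"reliability index beta = {beta:.2f}, failure probability Pf = {p_f:.2e}")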

  19. Reliability in engineering '87

    International Nuclear Information System (INIS)

    Tuma, M.

    1987-01-01

    The participants heard 51 papers dealing with the reliability of engineering products. Two of the papers were incorporated in INIS, namely ''Reliability comparison of two designs of low pressure regeneration of the 1000 MW unit at the Temelin nuclear power plant'' and ''Use of probability analysis of reliability in designing nuclear power facilities.''(J.B.)

  20. Stochastic reliability analysis using Fokker Planck equations

    International Nuclear Information System (INIS)

    Hari Prasad, M.; Rami Reddy, G.; Srividya, A.; Verma, A.K.

    2011-01-01

    The Fokker-Planck equation describes the time evolution of the probability density function of the velocity of a particle, and can be generalized to other observables as well. It is also known as the Kolmogorov forward equation (diffusion). Hence, for any process which evolves with time, the probability density function as a function of time can be represented with the Fokker-Planck equation. In stochastic reliability analysis one is more interested in finding the reliability or failure probability of components or structures as a function of time rather than instantaneous failure probabilities. In this analysis the variables are represented by random processes instead of random variables. A random process can be either stationary or non-stationary. If the random process is stationary, the failure probability does not change with time, whereas in the case of non-stationary processes the failure probability changes with time. In the present paper Fokker-Planck equations have been used to find the probability density function of non-stationary random processes. A flow chart is provided which describes the step-by-step process for carrying out stochastic reliability analysis using Fokker-Planck equations. As a first step one has to identify the failure function as a function of the random processes. Then one has to solve the Fokker-Planck equation for each random process. In this paper the Fokker-Planck equation has been solved using the finite difference method. As a result one gets the probability density values of the random process over the sample space as well as over time. At each time step an appropriate probability distribution then has to be identified based on the available probability density values. To check the goodness of fit of the data, the Kolmogorov-Smirnov test has been performed. In this way one can find the distribution of the random process at each time step. Once one has the probability distribution
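
    The steps listed above (write down the process, solve the Fokker-Planck equation by finite differences, and read off time-dependent exceedance probabilities) can be illustrated with a toy Ornstein-Uhlenbeck process. The drift, diffusion, threshold and grid below are assumptions chosen only to keep the sketch short and numerically stable; they are not the application treated in the paper.

        import numpy as np

        # Toy Fokker-Planck solve for an Ornstein-Uhlenbeck process  dX = -theta*X dt + sigma dW,
        # whose forward equation is  dp/dt = d/dx(theta*x*p) + 0.5*sigma^2 * d2p/dx2.
        theta, sigma = 1.0, 0.5
        x = np.linspace(-3.0, 3.0, 301)
        dx = x[1] - x[0]
        dt = 0.2 * dx**2 / sigma**2          # small step for explicit-scheme stability
        threshold = 1.0                      # "failure" region X > threshold (assumed)

        # Initial condition: narrow Gaussian around x = 0
        p = np.exp(-0.5 * (x / 0.1) ** 2)
        p /= np.trapz(p, x)

        def step(p):
            d_drift = np.gradient(theta * x * p, dx)
            d2p = np.gradient(np.gradient(p, dx), dx)
            p_new = p + dt * (d_drift + 0.5 * sigma**2 * d2p)
            p_new[p_new < 0] = 0.0
            return p_new / np.trapz(p_new, x)    # renormalize against discretization drift

        t, t_end = 0.0, 2.0
        while t < t_end:
            p = step(p)
            t += dt

        # Time-dependent "failure probability": density mass beyond the threshold
        p_fail = np.trapz(p[x > threshold], x[x > threshold])
        print(f"P(X > {threshold}) at t = {t_end}: {p_fail:.4f}")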

  1. Reliability analysis using network simulation

    International Nuclear Information System (INIS)

    Engi, D.

    1985-01-01

    The models that can be used to provide estimates of the reliability of nuclear power systems operate at many different levels of sophistication. The least-sophisticated models treat failure processes that entail only time-independent phenomena (such as demand failure). More advanced models treat processes that also include time-dependent phenomena such as run failure and possibly repair. However, many of these dynamic models are deficient in some respects, either because they disregard the time-dependent phenomena that cannot be expressed in closed-form analytic terms or because they treat these phenomena in quasi-static terms. The next level of modeling requires a dynamic approach that incorporates not only procedures for treating all significant time-dependent phenomena but also procedures for treating these phenomena when they are conditionally linked or characterized by arbitrarily selected probability distributions. The level of sophistication that is required is provided by a dynamic, Monte Carlo modeling approach. A computer code that uses a dynamic, Monte Carlo modeling approach is Q-GERT (Graphical Evaluation and Review Technique - with Queueing), and the present study has demonstrated the feasibility of using Q-GERT for modeling time-dependent, unconditionally and conditionally linked phenomena that are characterized by arbitrarily selected probability distributions
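
    A dynamic Monte Carlo treatment of time-dependent run failure and repair, of the kind such tools enable, can be sketched in a few lines for a single repairable component. The Weibull failure times, lognormal repair times and mission length are assumptions for illustration; this is not a Q-GERT model.

        import numpy as np

        rng = np.random.default_rng(42)

        # Dynamic Monte Carlo for one repairable component: Weibull time-to-failure and
        # lognormal repair time (both distributions and parameters are assumed).
        def simulate_mission(mission_time=1000.0, shape=1.5, scale=800.0,
                             repair_mu=2.0, repair_sigma=0.5):
            t, uptime, failed_once = 0.0, 0.0, False
            while t < mission_time:
                ttf = scale * rng.weibull(shape)              # next run failure
                run = min(ttf, mission_time - t)
                uptime += run
                t += run
                if t >= mission_time:
                    break
                failed_once = True
                t += rng.lognormal(repair_mu, repair_sigma)   # downtime before restart
            return (not failed_once), uptime / mission_time

        n = 20000
        samples = [simulate_mission() for _ in range(n)]
        reliability = np.mean([no_fail for no_fail, _ in samples])   # P(no failure during mission)
        availability = np.mean([a for _, a in samples])              # mean fraction of time operating
        print(f"mission reliability ~ {reliability:.3f}, mean availability ~ {availability:.3f}")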

  2. Reliability considerations of electronics components for the deep underwater muon and neutrino detection system

    International Nuclear Information System (INIS)

    Leskovar, B.

    1980-02-01

    The reliability of some electronic components for the Deep Underwater Muon and Neutrino Detection (DUMAND) System is discussed. An introductory overview of engineering concepts and techniques for reliability assessment is given. Component reliability is discussed in the context of major factors causing failures, particularly with respect to physical and chemical causes, process technology and testing, and screening procedures. Failure rates are presented for discrete devices and for integrated circuits as well as for basic electronic components. Furthermore, the military reliability specifications and standards for semiconductor devices are reviewed

  3. Use of reliability analysis for the safety evaluation of technical facilities

    International Nuclear Information System (INIS)

    Balfanz, H.P.; Eggert, H.; Lindauer, E.

    1975-01-01

    Using examples from nuclear technology, the following is discussed: how efficient the present practical measures are for increasing reliability, which weak points can be recognized and what appears to be the most promising direction to take for improvements. The following are individually dealt with: 1) determination of the relevant parameters for the safety of a plant; 2) definition and fixing of reliability requirements; 3) process to prove the fulfilment of requirements; 4) measures to guarantee the reliability; 5) data feed-back to check and improve the reliability. (HP/LH) [de

  4. Delivery presentations

    Science.gov (United States)

    Pregnancy - delivery presentation; Labor - delivery presentation; Occiput posterior; Occiput anterior; Brow presentation ... The mother can walk, rock, and try different delivery positions during labor to help encourage the baby ...

  5. Semiconductor measurement technology: reliability technology for cardiac pacemakers 2: a workshop report, 1976

    International Nuclear Information System (INIS)

    Schafft, H.A.

    1977-01-01

    Summaries are presented of 12 invited talks on the following topics: the procurement and assurance of high reliability electronic parts, leak rate and moisture measurements, pacemaker batteries, and pacemaker leads. The workshop, second in a series, was held in response to strong interest expressed by the pacemaker community to address technical questions relevant to the enhancement and assurance of cardiac pacemaker reliability. Discussed at the workshop were a process validation wafer concept for assuring process uniformity in device chips; screen tests for assuring reliable electronic parts; reliability prediction; reliability comparison of semiconductor technologies; mechanisms of short-circuiting dendritic growths; details of helium and radioisotope leak test methods; a study to correlate package leak rates, as measured with test gasses, and actual moisture infusion; battery life prediction; microcalorimetric measurements to nondestructively evaluate batteries for pacemakers; and an engineer's and a physician's view of the present status of pacemaker leads. References are included with most of the reports

  6. Magnitude of Alloresponses to MHC Class I/II Expressing Human Cardiac Myocytes is Limited by their Intrinsic Ability to Process and Present Antigenic Peptides

    Directory of Open Access Journals (Sweden)

    Aftab A. Ansari

    2003-01-01

    Full Text Available In this investigation we have explored the relationship between the weak allogenicity of cardiac myocytes and their capacity to present allo-antigens by examining the ability of a human cardiac myocyte cell line (W-1) to process and present nominal antigens. W-1 cells (HLA-A*0201 and HLA-DR β1*0301) pulsed with the influenza A matrix 1 (58-66) peptide (M1) were able to serve as targets for the HLA-A*0201 restricted CTL line PG, specific for M1-peptide. However, PG-CTLs were unable to lyse W-1 target cells infected with a recombinant vaccinia virus expressing the M1 protein (M1-VAC). Pretreatment of these M1-VAC targets with IFN-γ partially restored their ability to process and present the M1 peptide. However, parallel studies demonstrated that IFN-γ pretreated W-1s could not process tetanus toxin (TT) or present the TT(830-843) peptide to HLA-DR3 restricted TT-primed T cells. Semi-quantitative RT-PCR measurements revealed significantly lower constitutive levels of expression for MHC class I, TAP-1/2, and LMP-2/7 genes in W-1s that could be elevated by pretreatment with IFN-γ to values equal to or greater than those expressed in EBV-PBLs. However, mRNA levels for the genes encoding MHC class II, Ii, CIITA, and DMA/B were markedly lower in both untreated and IFN-γ pretreated W-1s relative to EBV-PBLs. Furthermore, pulse-chase analysis of the corresponding genes revealed significantly lower protein levels and longer half-life expression in W-1s relative to EBV-PBLs. These results suggest that weak allogenicity of cardiac myocytes may be governed by their limited expression of MHC genes and gene products critical for antigen processing and presentation.

  7. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  8. Human factor reliability program

    International Nuclear Information System (INIS)

    Knoblochova, L.

    2017-01-01

    The human factor reliability program was introduced at the Slovenske elektrarne, a.s. (SE) nuclear power plants as one of the components of the Excellent Performance initiative in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to three major areas of improvement: the need to improve results, troubleshooting support, and supporting the achievement of the company's goals. In practice, the human factor reliability program included: tools to prevent human error; managerial observation and coaching; human factor analysis; quick information about events involving a human factor; a human reliability timeline and performance indicators; and basic, periodic and extraordinary training in human factor reliability (authors)

  9. Reliability and optimization of structural systems

    International Nuclear Information System (INIS)

    Thoft-Christensen, P.

    1987-01-01

    The proceedings contain 28 papers presented at the 1st working conference. The working conference was organized by the IFIP Working Group 7.5. The proceedings also include 4 papers which were submitted, but for various reasons not presented at the working conference. The working conference was attended by 50 participants from 18 countries. The conference was the first scientific meeting of the new IFIP Working Group 7.5 on 'Reliability and Optimization of Structural Systems'. The purpose of the Working Group 7.5 is to promote modern structural system optimization and reliability theory, to advance international cooperation in the field of structural system optimization and reliability theory, to stimulate research, development and application of structural system optimization and reliability theory, to further the dissemination and exchange of information on structural system optimization and reliability theory, and to encourage education in structural system optimization and reliability theory. (orig./HP)

  10. Diagnostic overshadowing and other challenges involved in the diagnostic process of patients with mental illness who present in emergency departments with physical symptoms--a qualitative study.

    Science.gov (United States)

    Shefer, Guy; Henderson, Claire; Howard, Louise M; Murray, Joanna; Thornicroft, Graham

    2014-01-01

    We conducted a qualitative study in the Emergency Departments (EDs) of four hospitals in order to investigate the perceived scope and causes of 'diagnostic overshadowing'--the misattribution of physical symptoms to mental illness--and other challenges involved in the diagnostic process of people with mental illness who present in EDs with physical symptoms. Eighteen doctors and twenty-one nurses working in EDs and psychiatric liaison teams in four general hospitals in the UK were interviewed. Interviewees were asked about cases in which mental illness interfered with diagnosis of physical problems and about other aspects of the diagnostic process. Interviews were transcribed and analysed thematically. Interviewees reported various scenarios in which mental illness or factors related to it led to misdiagnosis or delayed treatment with various degrees of seriousness. Direct factors which may lead to misattribution in this regard are complex presentations or aspects related to poor communication or challenging behaviour of the patient. Background factors are the crowded nature of the ED environment, time pressures and targets and stigmatising attitudes held by a minority of staff. The existence of a psychiatric liaison team covering the ED twenty-four hours a day, seven days a week, can help reduce the risk of misdiagnosis of people with mental illness who present with physical symptoms. However, procedures used by emergency and psychiatric liaison staff require fuller operationalization to reduce disagreement over where responsibilities lie.

  11. Reliability issues : a Canadian perspective

    International Nuclear Information System (INIS)

    Konow, H.

    2004-01-01

    A Canadian perspective on power reliability issues was presented. Reliability depends on adequacy of supply and a framework for standards. The challenges facing the electric power industry include new demand, plant replacement and exports. It is expected that demand will be 670 TWh by 2020, with 205 TWh coming from new plants. Canada will require an investment of $150 billion to meet this demand and the need is comparable in the United States. As trade grows, the challenge becomes a continental issue and investment in the bi-national transmission grid will be essential. The 5 point plan of the Canadian Electricity Association is to: (1) establish an investment climate to ensure future electricity supply, (2) move government and industry towards smart and effective regulation, (3) work to ensure a sustainable future for the next generation, (4) foster innovation and accelerate skills development, and (5) build on the strengths of an integrated North American system to maximize opportunity for Canadians. The CEA's 7 measures that enhance North American reliability were listed with emphasis on its support for a self-governing international organization for developing and enforcing mandatory reliability standards. CEA also supports the creation of a binational Electric Reliability Organization (ERO) to identify and solve reliability issues in the context of a bi-national grid. tabs., figs

  12. Extraction and reliable determination of acrylamide from thermally processed foods using ionic liquid-based ultrasound-assisted selective microextraction combined with spectrophotometry.

    Science.gov (United States)

    Altunay, Nail; Elik, Adil; Gürkan, Ramazan

    2018-02-01

    Acrylamide (AAm) is a carcinogenic chemical that can form in thermally processed foods by the Maillard reaction of glucose with asparagine. AAm can easily be formed especially in frequently consumed chips and cereal-based foods depending on processing conditions. Considering these properties of AAm, a new, simple and green method is proposed for the extraction of AAm from thermally processed food samples. In this study, an ionic liquid (1-butyl-3-methylimidazolium tetrafluoroborate, [Bmim][BF₄]) as extractant was used in the presence of a cationic phenazine group dye, 3,7-diamino-5-phenylphenazinium chloride (PSH⁺, phenosafranine) at pH 7.5 for the extraction of AAm as an ion-pair complex from selected samples. Under optimum conditions, the analytical features obtained for the proposed method were as follows: linear working range, the limits of detection (LOD, 3S_b/m) and quantification (LOQ, 10S_b/m), preconcentration factor, sensitivity enhancement factor, sample volume and recovery% were 2.2-350 µg kg⁻¹, 0.7 µg kg⁻¹, 2.3 µg kg⁻¹, 120, 95, 60 mL and 94.1-102.7%, respectively. The validity of the method was tested by analysis of two certified reference materials (CRMs) and intra-day and inter-day precision studies. Finally, the method was successfully applied to the determination of AAm levels in thermally processed foods using the standard addition method.
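
    The detection and quantification limits quoted above follow the usual 3S_b/m and 10S_b/m definitions (blank standard deviation over calibration slope); a short check with assumed blank and slope values, chosen only so the output lands near the reported figures, is shown below.

        # LOD = 3*S_b/m and LOQ = 10*S_b/m, with S_b the standard deviation of blank
        # measurements and m the calibration slope. The numbers are assumed, not the
        # actual calibration data behind the reported 0.7 and 2.3 ug/kg limits.
        s_blank = 0.012   # blank signal standard deviation (assumed units)
        slope = 0.051     # calibration slope, signal per (ug/kg) (assumed)

        lod = 3 * s_blank / slope
        loq = 10 * s_blank / slope
        print(f"LOD = {lod:.2f} ug/kg, LOQ = {loq:.2f} ug/kg")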

  13. Can non-destructive inspection be reliable

    International Nuclear Information System (INIS)

    Silk, M.G.; Stoneham, A.M.; Temple, J.A.G.

    1988-01-01

    The paper on inspection is based on the book ''The reliability of non-destructive inspection: assessing the assessment of structures under stress'' by the present authors (published by Adam Hilger 1987). Emphasis is placed on the reliability of inspection and whether cracks in welds or flaws in components can be detected. The need for non-destructive testing and the historical attitudes to non-destructive testing are outlined, along with the case of failure. Factors influencing reliable inspection are discussed, and defect detection trials involving round robin tests are described. The development of reliable inspection techniques and the costs of reliability and unreliability are also examined. (U.K.)

  14. Business of reliability

    Science.gov (United States)

    Engel, Pierre

    1999-12-01

    The presentation is organized around three themes: (1) The decrease of reception equipment costs allows non-Remote Sensing organizations to access a technology until recently reserved to a scientific elite. What this means is the rise of 'operational' executive agencies considering space-based technology and operations as a viable input to their daily tasks. This is possible thanks to totally dedicated ground receiving entities focusing on one application for themselves, rather than serving a vast community of users. (2) The multiplication of earth observation platforms will form the base for reliable technical and financial solutions. One obstacle to the growth of the earth observation industry is the variety of policies (commercial versus non-commercial) ruling the distribution of the data and value-added products. In particular, the high volume of data sales required for the return on investment does conflict with traditional low-volume data use for most applications. Constant access to data sources supposes monitoring needs as well as technical proficiency. (3) Large-volume use of data coupled with low equipment costs is only possible when the technology has proven reliable, in terms of application results, financial risks and data supply. Each of these factors is reviewed. The expectation is that international cooperation between agencies and private ventures will pave the way for future business models. As an illustration, the presentation proposes to use some recent non-traditional monitoring applications that may lead to significant use of earth observation data, value added products and services: flood monitoring, ship detection, marine oil pollution deterrent systems and rice acreage monitoring.

  15. Advances in reliability and system engineering

    CERN Document Server

    Davim, J

    2017-01-01

    This book presents original studies describing the latest research and developments in the area of reliability and systems engineering. It helps the reader identify gaps in current knowledge and presents fruitful areas for further research in the field. Among others, this book covers reliability measures, reliability assessment of multi-state systems, optimization of multi-state systems, continuous multi-state systems, new computational techniques applied to multi-state systems and probabilistic and non-probabilistic safety assessment.

  16. Reliability tasks from prediction to field use

    International Nuclear Information System (INIS)

    Guyot, Christian.

    1975-01-01

    This tutorial paper is part of a series intended to raise awareness of reliability problems. Reliability, a probabilistic concept, is an important parameter of availability. Reliability prediction is an estimation process for evaluating design progress. It is only by the application of a reliability program that reliability objectives can be attained through the different stages of work: design, fabrication and field use. The user is mainly interested in operational reliability. Indications are given on the support and the treatment of data in the case of electronic equipment at C.E.A. Reliability engineering requires a special state of mind which must be formed and developed in a company, in the same way as may be done, for example, for safety [fr

  17. Reliability analysis of operator's monitoring behavior in digital main control room of nuclear power plants and its application

    International Nuclear Information System (INIS)

    Zhang Li; Hu Hong; Li Pengcheng; Jiang Jianjun; Yi Cannan; Chen Qingqing

    2015-01-01

    In order to build a quantitative model for analyzing operators' monitoring behavior reliability in the digital main control room of nuclear power plants, and based on an analysis of the design characteristics of the digital main control room of a nuclear power plant, the operator's monitoring behavior and the monitoring process, monitoring behavior reliability was divided into three parts: information transfer reliability among screens, inside-screen information sampling reliability and information detection reliability. A quantitative calculation model of information transfer reliability among screens was established based on Senders's monitoring theory; the inside-screen information sampling reliability model was established based on the theory of attention resource allocation; and, considering the causality of performance shaping factors, a fuzzy Bayesian method was presented to quantify information detection reliability, with an example of its application. The results show that the established model of monitoring behavior reliability gives an objective description of the monitoring process, quantifies monitoring reliability and overcomes the shortcomings of traditional methods. It therefore provides theoretical support for the analysis of operators' monitoring behavior reliability in the digital main control room of nuclear power plants and improves the precision of human reliability analysis. (authors)
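
    The abstract splits monitoring reliability into three parts and quantifies the detection part with a fuzzy Bayesian method. The sketch below is a deliberately simplified, non-fuzzy stand-in: a plain Beta-Binomial update for detection reliability combined with the other two parts in series. The series (product) combination and all numbers are assumptions, not the authors' model.

        from scipy.stats import beta

        # Hypothetical reliabilities for the first two parts (assumed values)
        r_transfer = 0.97    # information transfer among screens
        r_sampling = 0.95    # inside-screen information sampling

        # Simplified stand-in for the detection part: Beta prior updated with observed
        # detection successes/failures (a plain Beta-Binomial model, not the fuzzy
        # Bayesian treatment used in the paper).
        a0, b0 = 8, 2                    # prior roughly centred on 0.8
        successes, failures = 45, 3      # hypothetical simulator observations
        post = beta(a0 + successes, b0 + failures)
        r_detection = post.mean()

        # Series (product) combination of the three parts -- an assumption for illustration
        r_monitoring = r_transfer * r_sampling * r_detection
        print(f"detection ~ {r_detection:.3f} (90% interval {post.ppf(0.05):.3f}-{post.ppf(0.95):.3f})")
        print(f"overall monitoring reliability ~ {r_monitoring:.3f}")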

  18. Equipment Reliability Program in NPP Krsko

    International Nuclear Information System (INIS)

    Skaler, F.; Djetelic, N.

    2006-01-01

    Operation that is safe, reliable, effective and acceptable to the public is the common message in the mission statements of commercial nuclear power plants (NPPs). To fulfill these goals, the nuclear industry has to focus, among other areas, on: (1) Human Performance (HU) and (2) Equipment Reliability (EQ). The performance objective of HU is that the behaviors of all personnel result in safe and reliable station operation. While unwanted human behaviors in operations mostly result directly in events, behavior flaws in maintenance or engineering usually cause decreased equipment reliability. Poor human performance has led even the best-designed power plants into significant operating events, well known in the nuclear industry. Equipment reliability is today recognized as the key to success. While human performance at most NPPs has been improving since the start of WANO/INPO/IAEA evaluations, the open energy market has forced nuclear plants to reduce production costs and operate more reliably and effectively. The balance between these two (opposing) goals has made equipment reliability even more important for safe, reliable and efficient production. Insisting on on-line operation while ignoring some principles of safety could nowadays, in a well-developed safety culture and human performance environment, exceed the cost of electricity losses. In the last decade the leading US nuclear companies have put a lot of effort into improving equipment reliability at their NPP stations, primarily based on the INPO Equipment Reliability Program AP-913. The Equipment Reliability Program is the key program not only for safe and reliable operation, but also for Life Cycle Management and Aging Management on the way to nuclear power plant life extension. The purpose of the Equipment Reliability process is to identify, organize, integrate and coordinate equipment reliability activities (preventive and predictive maintenance, maintenance

  19. Calculation of mean outcrossing rates of non-Gaussian processes with stochastic input parameters - Reliability of containers stowed on ships in severe sea

    DEFF Research Database (Denmark)

    Nielsen, Ulrik Dam

    2010-01-01

    values is expected to occur, and the final result, the mean outcrossing rate, is obtained by summation. The derived procedure is illustrated by an example considering the forces in containers stowed on ships and, in particular, results are presented for the so-called racking failure in the containers...

  20. Reliability Modeling of Wind Turbines

    DEFF Research Database (Denmark)

    Kostandyan, Erik

    Cost reductions for offshore wind turbines are a substantial requirement in order to make offshore wind energy more competitive compared to other energy supply methods. During the 20 – 25 years of wind turbines useful life, Operation & Maintenance costs are typically estimated to be a quarter...... for Operation & Maintenance planning. Concentrating efforts on development of such models, this research is focused on reliability modeling of Wind Turbine critical subsystems (especially the power converter system). For reliability assessment of these components, structural reliability methods are applied...... to one third of the total cost of energy. Reduction of Operation & Maintenance costs will result in significant cost savings and result in cheaper electricity production. Operation & Maintenance processes mainly involve actions related to replacements or repair. Identifying the right times when...

  1. Component reliability for electronic systems

    CERN Document Server

    Bajenescu, Titu-Marius I

    2010-01-01

    The main reason for the premature breakdown of today's electronic products (computers, cars, tools, appliances, etc.) is the failure of the components used to build these products. Today professionals are looking for effective ways to minimize the degradation of electronic components to help ensure longer-lasting, more technically sound products and systems. This practical book offers engineers specific guidance on how to design more reliable components and build more reliable electronic systems. Professionals learn how to optimize a virtual component prototype, accurately monitor product reliability during the entire production process, and add the burn-in and selection procedures that are the most appropriate for the intended applications. Moreover, the book helps system designers ensure that all components are correctly applied, margins are adequate, wear-out failure modes are prevented during the expected duration of life, and system interfaces cannot lead to failure.

  2. Augmented Virtuality: A Real-time Process for Presenting Real-world Visual Sensory Information in an Immersive Virtual Environment for Planetary Exploration

    Science.gov (United States)

    McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.

    2017-12-01

    Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen an impressive growth, thanks to the advent of commercial Head Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment researchers are presented with the visual data in a virtual environment, whereas in a purely AR application a virtual object is projected into the real world, with which researchers can interact. There are several limitations to purely VR or AR applications in the context of remote planetary exploration. For example, in a purely VR environment, the contents of the planet surface (e.g. rocks, terrain, or other features) must be created off-line from a multitude of images, using image processing techniques to generate 3D mesh data that will populate the virtual surface of the planet. This process usually takes a tremendous amount of computational resources and cannot be delivered in real time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames lack 3D visual information, i.e. depth. In this paper, we present a technique that utilizes a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique blends the virtual environment with the real world in such a way as to preserve both the depth and the visual information from the real world, while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as the Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video being presented in real time into the virtual environment. Notice the preservation of the object

  3. Demonstrating the Safety and Reliability of a New System or Spacecraft: Incorporating Analyses and Reviews of the Design and Processing in Determining the Number of Tests to be Conducted

    Science.gov (United States)

    Vesely, William E.; Colon, Alfredo E.

    2010-01-01

    Design safety/reliability is associated with the probability that no failure-causing faults exist in a design. Confidence in the non-existence of failure-causing faults is increased by performing tests with no failures. Reliability-growth testing requirements are based on the initial assurance and the fault detection probability. Using binomial tables generally gives far more required tests than the reliability-growth requirements; the latter are based on reliability principles and factors and should be used.
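
    For context, a minimal sketch of the classical zero-failure (success-run) binomial calculation that the abstract contrasts with reliability-growth-based requirements; the numerical values are illustrative:

    import math

    def zero_failure_tests(reliability: float, confidence: float) -> int:
        """Number of consecutive failure-free tests needed to demonstrate
        `reliability` at `confidence` under the classical success-run
        (binomial, zero-failure) criterion: C = 1 - R**n."""
        return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

    # Demonstrating R = 0.99 at 90% confidence requires ~230 failure-free tests,
    # which illustrates why purely binomial test counts grow quickly.
    print(zero_failure_tests(0.99, 0.90))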

  4. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
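
    A minimal sketch of the single-loop idea under simplified, illustrative assumptions (a toy limit state g = R - S with epistemic uncertainty only in the demand mean); this is not the paper's auxiliary-variable formulation, only the sampling structure it enables:

    import numpy as np

    rng = np.random.default_rng(0)

    def single_loop_failure_probability(n_samples: int = 200_000) -> float:
        """Single-loop Monte Carlo: epistemic (parameter) and aleatory
        variables are sampled together in one loop instead of nesting them.

        Illustrative limit state: g = R - S, failure when g < 0.
        - R: capacity, aleatory, Normal(10, 1)
        - S: demand, aleatory Normal(mu_S, 1), with epistemic uncertainty
             in mu_S represented here as Uniform(5, 7).
        """
        mu_s = rng.uniform(5.0, 7.0, n_samples)   # epistemic draw
        s = rng.normal(mu_s, 1.0)                 # aleatory draw given the parameter
        r = rng.normal(10.0, 1.0, n_samples)      # aleatory capacity
        return float(np.mean(r - s < 0.0))

    print(single_loop_failure_probability())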

  5. Energy Efficient and Reliable Target Monitoring in the Tactical Battlefield

    Science.gov (United States)

    Li, Yan-Xiao; Guan, Hua; Zhang, Yue-Ling

    In tactical battlefield target monitoring it is crucial to take energy efficiency and data reliability into account for the purpose of military decision making, especially in large-scale sensor networks. However, the inherent power constraints and the wireless communication medium make this a challenging problem in actual applications. An efficient and reliable data aggregation scheme is proposed to enhance the performance of wireless sensor networks used in target monitoring. First, the energy consumption model for multihop WSNs is presented and analyzed (a common first-order form is sketched below). Then the idea of mobile sinks and an adaptive energy-saving mechanism are introduced, and the concept of multiple-sink cooperation is used to ensure the reliability of the data aggregation. The simulation and associated analysis show the improvement achieved by the presented scheme. Finally, the discussion is broadened to large-scale tactical battlefield applications as a direction for future research.
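
    For concreteness, a minimal sketch of the first-order radio energy model commonly assumed in multihop WSN analyses; the constants and packet parameters below are illustrative and not taken from the paper:

    E_ELEC = 50e-9      # J/bit spent in the transceiver electronics
    EPS_AMP = 100e-12   # J/bit/m^2 spent in the amplifier (free-space model)

    def tx_energy(bits: int, distance_m: float) -> float:
        """Energy to transmit `bits` over `distance_m`."""
        return E_ELEC * bits + EPS_AMP * bits * distance_m ** 2

    def rx_energy(bits: int) -> float:
        """Energy to receive `bits`."""
        return E_ELEC * bits

    # One 2000-bit packet relayed over three 50 m hops; each hop costs one
    # transmission plus one reception.
    hops, d, k = 3, 50.0, 2000
    print(f"{hops * (tx_energy(k, d) + rx_energy(k)):.2e} J")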

  6. Column Grid Array Rework for High Reliability

    Science.gov (United States)

    Mehta, Atul C.; Bodie, Charles C.

    2008-01-01

    Due to requirements for reduced size and weight, the use of grid array packages in space applications has become commonplace. To meet the requirements of high reliability and a high number of I/Os, ceramic column grid array (CCGA) packages were selected for major electronic components used in the next Mars Rover mission (specifically, high-density Field Programmable Gate Arrays). The probability of removal and replacement of these devices on the actual flight printed wiring board assemblies is deemed to be very high because of last-minute discoveries in final test which dictate changes in the firmware. The questions and challenges presented to the manufacturing organizations engaged in the production of high-reliability electronic assemblies are: Is the reliability of the PWBA adversely affected by rework (removal and replacement) of the CGA package? And how many times can the same board be reworked without destroying a pad or degrading the lifetime of the assembly? To answer these questions, the most complex printed wiring board assembly used by the project was chosen as the test vehicle; the PWB was modified to provide a daisy-chain pattern, and a number of bare PWBs were acquired to this modified design. Non-functional 624-pin CGA packages with internal daisy chains matching the pattern on the PWB were procured. The combination of the modified PWB and the daisy-chained packages enables continuity measurements of every soldered contact during subsequent testing and thermal cycling. Several test vehicle boards were assembled, reworked and then thermally cycled to assess the reliability of the solder joints and board material, including pads and traces near the CGA. The details of the rework process and the results of thermal cycling are presented in this paper.

  7. Maturity index on reliability: covering non-technical aspects of IEC61508 reliability certification

    International Nuclear Information System (INIS)

    Brombacher, A.C.

    1999-01-01

    One of the more recent developments in the field of reliability and safety is the realisation that these aspects are not only a function of the product itself, but also of the organisation realising the product. A second development is a trend from an often predominantly qualitative analysis towards a quantitative analysis. In contrast to the (older) DIN 0801, the (more recent) IEC 61508 also requires, at product level, a quantitative analysis and, at organisational level, an assessment of the lifecycle of a product by analysing the (maturity of the) relevant business processes (DIN V VDE 0801, Grundsaetze fuer Rechner in Systemen mit Sicherheitsaufgaben, 1990; DIN V 0801, Grundlegende Sicherheitsbetrachtungen fuer MSR-Schutzeinrichtungen, 1994; DIN V VDE 0801 A1, Grundsaetze fuer Rechner in Systemen mit Sicherheitsaufgaben, Aenderung A1, 1994; IEC 61508, Functional Safety of electrical/electronic/programmable electronic safety-related systems, draft 4.0, 1997). The IEC 61508 standard covers: (i) technical aspects, at both a quantitative and a qualitative level; (ii) organisational aspects, both the maturity of business processes (quantitative) and the definition and application of procedures (qualitative). This paper shows the necessity of analysing all aspects in a safety certification process and presents an overview of the available tools and techniques for the various quadrants. As methods and tools for quadrant C in particular are currently unavailable, the paper proposes a method to assess and improve the maturity of an organisation with respect to reliability management: the maturity index on reliability (MIR)

  8. Characterization of the geochemical processes present in the radionuclides and metals mobilization in the tailing dam at the Uranium Mining and Milling Facilities - Pocos de Caldas, MG, Brazil

    International Nuclear Information System (INIS)

    Pinto, Patricia Freitas

    1995-08-01

    In Brazil, the first step of the nuclear fuel cycle, the mining and milling of uranium ore, is carried out at the Uranium Mining and Milling Facilities of Pocos de Caldas, Minas Gerais state. Waste management is a very important aspect of the process, and understanding the geochemical processes that occur in the tailings dam is a key question in defining a plan of action for the decommissioning strategy of the facility. The objective of the present work was to provide input for the adoption of remedial actions concerning the decommissioning of the facility. It focused on the characterization of the most important geochemical processes regulating the mobilization of radionuclides and heavy metals in the tailings dam. Two cores from the tailings dam (uncovered area) were collected. Seepage and drainage waters were sampled, as was the tailings dam lake. Groundwater from an aquifer below the tailings dam and surface waters from a river that receives the effluents of the dam (Soberbo River) were also sampled. Data from the mining company were used to calculate the inventory of radionuclides and heavy metals deposited in the waste dam. The results showed that pyrite oxidation is the key process in the mobilization of radionuclides and heavy metals from the wastes. Pyrite oxidation is regulated by oxygen diffusion and water; in the studied scenario it could be shown that the process was limited to a one-meter-deep layer in the uncovered part of the waste dam. Because of this, Fe, Al, Mn, Zn, Th and 238U showed higher concentrations in the bottom layers of the cores relative to the upper ones, while 226Ra and 210Pb showed the opposite pattern. Coprecipitation with CaSO4 was the most relevant immobilization mechanism for both of these radionuclides in the wastes. Sulfate was the only chemical species that could be identified as a contaminant in the aquifer below the waste dam. As a conclusion, the target environmental

  9. Reliability of electronic systems

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2001-01-01

    Reliability techniques have been developed in response to the needs of the various engineering disciplines; nevertheless, many consider that a great deal of work was done on reliability before the word itself was used in its current sense. The military, space and nuclear industries were the first to become involved in this topic, yet this small but great revolution, to the benefit of higher product reliability, was not confined to those environments and spread to industry as a whole. Massive production, characteristic of modern industries, led four decades ago to a fall in the reliability of products, on the one hand because of mass production itself and, on the other, because of recently introduced and not yet stabilized industrial techniques. Industry had to change according to these two new requirements, creating products of medium complexity and ensuring a reliability appropriate to production costs and controls. Reliability thus became an integral part of the manufactured product. With this philosophy, the book describes reliability techniques applied to electronic systems and provides a coherent and rigorous framework for these diverse activities, giving a unifying scientific basis for the entire subject. It consists of eight chapters plus numerous statistical tables and an extensive annotated bibliography. The chapters cover the following topics: 1- Introduction to Reliability; 2- Basic Mathematical Concepts; 3- Catastrophic Failure Models; 4- Parametric Failure Models; 5- Systems Reliability; 6- Reliability in Design and Project; 7- Reliability Tests; 8- Software Reliability. The book is in Spanish and is aimed at a potentially diverse audience, as a textbook for courses ranging from academic to industrial. (author)
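
    As a small illustration of the constant-failure-rate model that underlies many catastrophic failure models of the kind such chapters cover, a minimal sketch with purely illustrative numbers:

    import math

    def reliability(t_hours: float, failure_rate_per_hour: float) -> float:
        """Constant-failure-rate (exponential) model: R(t) = exp(-lambda * t)."""
        return math.exp(-failure_rate_per_hour * t_hours)

    # Illustrative component with failure rate lambda = 1e-5 per hour:
    lam = 1e-5
    print(reliability(10_000, lam))   # ~0.905 probability of surviving 10,000 h
    print(1.0 / lam)                  # corresponding MTBF in hours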

  10. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of the proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.
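
    For context, one of the classical software reliability growth models surveyed in such literature reviews is the Goel-Okumoto NHPP model; a minimal sketch with illustrative parameters (not taken from the report):

    import math

    def expected_failures(t: float, a: float, b: float) -> float:
        """Goel-Okumoto NHPP mean value function m(t) = a * (1 - exp(-b*t)),
        where a is the expected total number of faults and b the per-fault
        detection rate."""
        return a * (1.0 - math.exp(-b * t))

    def failure_intensity(t: float, a: float, b: float) -> float:
        """Instantaneous failure intensity lambda(t) = a * b * exp(-b*t)."""
        return a * b * math.exp(-b * t)

    # Illustrative parameters: 120 latent faults, detection rate 0.05 per week.
    a, b = 120.0, 0.05
    print(expected_failures(10, a, b))   # faults expected in the first 10 weeks (~47)
    print(failure_intensity(10, a, b))   # failures per week at t = 10 weeks (~3.6)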

  12. Reliability and radiation effects in compound semiconductors

    CERN Document Server

    Johnston, Allan

    2010-01-01

    This book discusses reliability and radiation effects in compound semiconductors, which have evolved rapidly during the last 15 years. Johnston's perspective in the book focuses on high-reliability applications in space, but his discussion of reliability is applicable to high reliability terrestrial applications as well. The book is important because there are new reliability mechanisms present in compound semiconductors that have produced a great deal of confusion. They are complex, and appear to be major stumbling blocks in the application of these types of devices. Many of the reliability problems that were prominent research topics five to ten years ago have been solved, and the reliability of many of these devices has been improved to the level where they can be used for ten years or more with low failure rates. There is also considerable confusion about the way that space radiation affects compound semiconductors. Some optoelectronic devices are so sensitive to damage in space that they are very difficu...

  13. Metrological Reliability of Medical Devices

    Science.gov (United States)

    Costa Monteiro, E.; Leon, L. F.

    2015-02-01

    The prominent development of health technologies in the 20th century triggered demands for the metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of biomeasurement results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with an analysis of their contributions to guaranteeing innovative health technologies' compliance with the main ethical pillars of Bioethics.

  14. Safety and reliability in Europe

    International Nuclear Information System (INIS)

    Colombo, A.G.

    1985-01-01

    This volume contains the papers presented at the ESRA Pre-Launching Meeting. The meeting was attended by about eighty European reliability and safety experts from industry, research organizations and universities. It dealt with the following subjects: the historical perspective of safety and reliability in Europe and the aims of ESRA; status and trends in research and development; codes, standards and regulations; academic and technical training; and national and international organizations. Twenty-six papers have been analyzed and abstracted for inclusion in the data base.

  15. Reliability in the design phase

    International Nuclear Information System (INIS)

    Siahpush, A.S.; Hills, S.W.; Pham, H.; Majumdar, D.

    1991-12-01

    A study was performed to determine the common methods and tools that are available to calculate or predict a system's reliability. A literature review and a software survey are included. The desired product of this developmental work is a tool for the system designer to use in the early design phase, so that the final design will achieve the desired system reliability without lengthy testing and rework. Three computer programs were written which provide a first attempt at fulfilling this need. The programs are described and a case study is presented for each one. This is a continuing effort which will be furthered in FY-1992. 10 refs
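
    A minimal sketch of the kind of early-phase prediction such tools perform (series/parallel reliability block combination); the component values are illustrative and not taken from the report:

    from functools import reduce

    def series(reliabilities):
        """System works only if every block works: product of reliabilities."""
        return reduce(lambda acc, r: acc * r, reliabilities, 1.0)

    def parallel(reliabilities):
        """System works if at least one redundant block works: 1 - prod(1 - r)."""
        return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), reliabilities, 1.0)

    # Illustrative design: two redundant pumps (0.95 each) in series with a
    # controller (0.99) and a sensor (0.98).
    pumps = parallel([0.95, 0.95])          # ~0.9975
    system = series([pumps, 0.99, 0.98])    # ~0.968
    print(round(system, 4))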

  16. System reliability analysis with natural language and expert's subjectivity

    International Nuclear Information System (INIS)

    Onisawa, T.

    1996-01-01

    This paper introduces natural language expressions and expert subjectivity into system reliability analysis. To this end, it defines a subjective measure of reliability and presents a method of system reliability analysis using this measure. The subjective measure of reliability corresponds to natural language expressions of reliability estimation and is represented by a fuzzy set defined on [0,1]. The presented method deals with dependence among subsystems and employs parametrized operations on subjective measures of reliability which can reflect the expert's subjectivity towards the analyzed system. The analysis results are also expressed in linguistic terms. Finally, the paper gives an example of system reliability analysis by the presented method.
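
    A minimal sketch of one common way (not the paper's parametrized operators) to carry linguistic reliability estimates through series/parallel combination, using triangular fuzzy numbers on [0,1]; all names and values are hypothetical:

    from dataclasses import dataclass

    @dataclass
    class TriFuzzy:
        """Triangular fuzzy number (low, mode, high) on [0, 1], a common way
        to encode linguistic reliability estimates such as 'fairly reliable'."""
        low: float
        mode: float
        high: float

        def series(self, other: "TriFuzzy") -> "TriFuzzy":
            # Approximate series combination (both must work): pointwise product.
            return TriFuzzy(self.low * other.low,
                            self.mode * other.mode,
                            self.high * other.high)

        def parallel(self, other: "TriFuzzy") -> "TriFuzzy":
            # Approximate parallel (redundant) combination: 1 - (1-a)(1-b).
            return TriFuzzy(1 - (1 - self.low) * (1 - other.low),
                            1 - (1 - self.mode) * (1 - other.mode),
                            1 - (1 - self.high) * (1 - other.high))

    # A 'quite reliable' pump combined with a 'very reliable' valve:
    pump = TriFuzzy(0.80, 0.90, 0.95)
    valve = TriFuzzy(0.95, 0.98, 0.99)
    print(pump.series(valve))    # system needs both
    print(pump.parallel(valve))  # system needs at least one

    Note that this simple approximation ignores dependence among subsystems, which is precisely what the paper's parametrized operations are designed to reflect.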

  17. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    OpenAIRE

    Hai An; Ling Zhou; Hui Sun

    2016-01-01

    Aiming to resolve the problem that a variety of uncertain variables coexist in engineering structural reliability analysis, a new hybrid reliability index for evaluating structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article, together with a convergent solving method. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new...

  18. Inverse Reliability Task: Artificial Neural Networks and Reliability-Based Optimization Approaches

    OpenAIRE

    Lehký, David; Slowik, Ondřej; Novák, Drahomír

    2014-01-01

    The paper presents two alternative approaches to solving the inverse reliability task, i.e. determining the design parameters that achieve desired target reliabilities. The first approach is based on artificial neural networks and small-sample Latin hypercube sampling simulation. The second approach treats the inverse reliability task as a reliability-based optimization task using a double-loop method, again with small-sample simulation. Efficie...
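
    A brute-force sketch of what the inverse task means (tune a design parameter until the estimated failure probability meets a target), using plain bisection over a toy limit state; this is a stand-in for, not a reproduction of, the ANN and double-loop approaches in the paper:

    import numpy as np

    rng = np.random.default_rng(1)

    def failure_probability(mean_capacity: float, n: int = 100_000) -> float:
        """Crude Monte Carlo estimate of P(R - S < 0) for a toy limit state
        with capacity R ~ Normal(mean_capacity, 1) and demand S ~ Normal(5, 1)."""
        r = rng.normal(mean_capacity, 1.0, n)
        s = rng.normal(5.0, 1.0, n)
        return float(np.mean(r - s < 0.0))

    def inverse_reliability(target_pf: float, lo: float = 5.0, hi: float = 15.0) -> float:
        """Bisection on the design parameter until the estimated failure
        probability reaches the target (Monte Carlo noise limits precision)."""
        for _ in range(30):
            mid = 0.5 * (lo + hi)
            if failure_probability(mid) > target_pf:
                lo = mid          # not reliable enough yet: raise capacity
            else:
                hi = mid
        return 0.5 * (lo + hi)

    print(inverse_reliability(0.01))   # ~8.3 for this toy problem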

  19. Lecture Presentations

    International Nuclear Information System (INIS)

    2007-01-01

    The Heavy-Ion Collisions in the LHC workshop was held in Cracow from 18 to 18 May 2007. The main subject of the workshop was to present the newest results of research on the CERN LHC collider. In addition, some theoretical models and methods used for the analysis of the presented data were discussed

  20. CATCHY PRESENTATIONS

    DEFF Research Database (Denmark)

    Eriksen, Kaare; Tollestrup, Christian; Ovesen, Nis

    2011-01-01

    An important competence for designers is the ability to communicate and present ideas and proposals to customers, partners, investors and colleagues. The Pecha Kucha principle, developed by Astrid Klein and Mark Dytham, has become a widely used and easy format for the presentation of new concepts...