WorldWideScience

Sample records for reliability processing presentation

  1. Gearbox Reliability Collaborative Update (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Sheng, S.; Keller, J.; Glinsky, C.

    2013-10-01

    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  2. Testing for PV Reliability (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.; Bansal, S.

    2014-09-01

    The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.

  3. Use of a structured functional evaluation process for independent medical evaluations of claimants presenting with disabling mental illness: rationale and design for a multi-center reliability study.

    Science.gov (United States)

    Bachmann, Monica; de Boer, Wout; Schandelmaier, Stefan; Leibold, Andrea; Marelli, Renato; Jeger, Joerg; Hoffmann-Richter, Ulrike; Mager, Ralph; Schaad, Heinz; Zumbrunn, Thomas; Vogel, Nicole; Bänziger, Oskar; Busse, Jason W; Fischer, Katrin; Kunz, Regina

    2016-07-29

    Work capacity evaluations by independent medical experts are widely used to inform insurers whether injured or ill workers are capable of engaging in competitive employment. In many countries, evaluation processes lack a clearly structured approach, standardized instruments, and an explicit focus on claimants' functional abilities. Evaluation of subjective complaints, such as mental illness, presents additional challenges in the determination of work capacity. We have therefore developed a process for functional evaluation of claimants with mental disorders which complements the usual psychiatric evaluation. Here we report the design of a study to measure the reliability of our approach in determining work capacity among patients with mental illness applying for disability benefits. We will conduct a multi-center reliability study, in which 20 psychiatrists trained in our functional evaluation process will assess 30 claimants presenting with mental illness for eligibility to receive disability benefits [Reliability of Functional Evaluation in Psychiatry, RELY-study]. The functional evaluation process entails a five-step structured interview and a reporting instrument (Instrument of Functional Assessment in Psychiatry [IFAP]) to document the severity of work-related functional limitations. We will videotape all evaluations, which will be viewed by three psychiatrists who will independently rate claimants' functional limitations. Our primary outcome measure is the evaluation of the claimant's work capacity as a percentage (0 to 100 %), and our secondary outcomes are the 12 mental functions and 13 functional capacities assessed by the IFAP instrument. Inter-rater reliability of the four psychiatric experts will be explored using multilevel models to estimate the intraclass correlation coefficient (ICC). Additional analyses include subgroups according to mental disorder, the typicality of claimants, and claimant-perceived fairness of the assessment process. We hypothesize that a
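
    The primary analysis above hinges on the intraclass correlation coefficient. As a rough illustration of that statistic (not the authors' multilevel-model estimator), the sketch below computes the simple one-way random-effects form ICC(1) on made-up work-capacity ratings; the rating values and panel sizes are hypothetical.

```python
# Hypothetical illustration of inter-rater reliability via ICC(1).
# The RELY-study fits multilevel models; this only shows the underlying
# variance-decomposition idea on invented work-capacity ratings (0-100 %).

def icc1(ratings):
    """One-way random-effects ICC. ratings: per-subject lists, one value per rater."""
    n = len(ratings)      # subjects (claimants)
    k = len(ratings[0])   # raters
    grand = sum(sum(r) for r in ratings) / (n * k)
    subj_means = [sum(r) / k for r in ratings]
    # between-subject and within-subject mean squares (one-way ANOVA)
    msb = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
    msw = sum((x - m) ** 2
              for r, m in zip(ratings, subj_means) for x in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# three hypothetical raters scoring work capacity for five claimants
ratings = [[80, 75, 85], [40, 45, 40], [60, 55, 65], [20, 25, 20], [90, 85, 95]]
print(round(icc1(ratings), 3))  # → 0.977: raters largely agree on these data
```

    High agreement pushes the between-subject variance far above the within-subject variance, which is what drives the ICC toward 1.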

  4. Gearbox Reliability Collaborative Update: A Brief (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Sheng, S.; Keller, J.; McDade, M.

    2012-01-01

    This presentation is an update on the Gearbox Reliability Collaborative (GRC) for the AWEA Wind Project Operations, Maintenance & Reliability Seminar. GRC accomplishments are: (1) Failure database software deployed - partners see business value for themselves and customers; (2) Designed, built, instrumented, and tested two gearboxes - (a) Generated unprecedented public domain test data from both field testing and dynamometer testing, (b) Different responses from 'identical' gearboxes, (c) Demonstrated importance of non-torque loading and modeling approach; and (3) Active collaborative, with wide industry support, leveraging DOE funding - Modeling round robin and Condition Monitoring round robin.

  5. Reliability Issues for Photovoltaic Modules (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.

    2009-10-01

    Si modules have performed well in the field, but new designs need reliability testing. CdTe and CIGS modules are sensitive to moisture and must be carefully sealed. CPV is in the product development stage and benefits from reliability expertise in other industries.

  6. Process control using reliability based control charts

    Directory of Open Access Journals (Sweden)

    J.K. Jacob

    2008-12-01

    Purpose: The paper presents a method to monitor the mean time between failures (MTBF) and detect any change in the intensity parameter. A control chart procedure is presented for process reliability monitoring. Control charts based on different distributions are also considered and used in decision making. Results and discussions are presented based on case studies at different industries. Design/methodology/approach: The failure occurrence process can be modeled by different distributions such as the homogeneous Poisson process, the Weibull model, etc. In each case the aim is to monitor the MTBF and detect any change in the intensity parameter. When the process can be described by a Poisson process, the times between failures will be exponential and can be used for reliability monitoring. Findings: A new procedure based on monitoring the time to observe r failures is also proposed, which can be more appropriate for reliability monitoring. Practical implications: This procedure is useful and more sensitive when compared with the λ-chart, although it must wait until r failures for a decision. These charts can be regarded as powerful tools for reliability monitoring; the λr chart gives more accurate results than the λ-chart. Originality/value: Applying these measures to a system of equipment can increase the reliability and availability of the system, resulting in economic gain. A homogeneous Poisson process is usually used to model the failure occurrence process with a certain intensity.
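
    The time-to-r-failures chart described in the findings can be sketched as follows. Under an in-control exponential failure model with rate lam0, the time to observe r failures is Erlang(r, lam0), so control limits are Erlang quantiles; the rate, r, and limit probabilities below are illustrative assumptions, not values from the paper.

```python
# Sketch of a time-to-r-failures reliability control chart.
# All parameter values are assumed for illustration.
import math

def erlang_cdf(t, r, lam):
    # P(T_r <= t) = 1 - exp(-lam*t) * sum_{k<r} (lam*t)^k / k!
    s = sum((lam * t) ** k / math.factorial(k) for k in range(r))
    return 1.0 - math.exp(-lam * t) * s

def erlang_quantile(p, r, lam, lo=0.0, hi=1e6):
    # bisection on the monotone CDF
    for _ in range(200):
        mid = (lo + hi) / 2
        if erlang_cdf(mid, r, lam) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

lam0 = 0.01   # assumed in-control failure rate (failures per hour)
r = 4         # wait for r failures before plotting one chart point
lcl = erlang_quantile(0.00135, r, lam0)   # conventional 3-sigma tail probabilities
ucl = erlang_quantile(0.99865, r, lam0)
print(f"LCL = {lcl:.1f} h, UCL = {ucl:.1f} h")
```

    A plotted time-to-4-failures below the LCL signals that failures are arriving faster than the in-control rate, i.e. a reliability deterioration.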

  7. Reliability Estimation of the Pultrusion Process Using the First-Order Reliability Method (FORM)

    NARCIS (Netherlands)

    Baran, Ismet; Tutum, Cem C.; Hattel, Jesper H.

    2013-01-01

    In the present study the reliability estimation of the pultrusion process of a flat plate is analyzed by using the first order reliability method (FORM). The implementation of the numerical process model is validated by comparing the deterministic temperature and cure degree profiles with corresponding analyses in the literature.

  8. Experiences with Two Reliability Data Collection Efforts (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Sheng, S.; Lantz, E.

    2013-08-01

    This presentation, given by NREL at the Wind Reliability Experts Meeting in Albuquerque, New Mexico, outlines the causes of wind plant operational expenditures and gearbox failures and describes NREL's efforts to create a gearbox failure database.

  9. Present status of processing method

    Energy Technology Data Exchange (ETDEWEB)

    Kosako, Kazuaki [Sumitomo Atomic Energy Industries Ltd., Tokyo (Japan)]

    1998-11-01

    The present status of processing methods for high-energy nuclear data files was examined. The NJOY94 code is the only one available for this processing. In Japan, present processing with NJOY94 is oriented toward the production of traditional cross-section libraries, because a high-energy transport code that would use a high-energy cross-section library is not yet clearly defined. (author)

  10. PROVIDING RELIABILITY OF HUMAN RESOURCES IN PRODUCTION MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Anna MAZUR

    2014-07-01

    People are the most valuable asset of an organization, and a company's results depend mostly on them. The human factor can also be a weak link in the company and a source of high risk for many of its processes. The reliability of the human factor in the manufacturing process depends on many factors. The authors include aspects of human error, safety culture, knowledge, communication skills, teamwork, and the role of leadership in the developed model of human resource reliability in production process management. Based on a case study and the results of research and observation, the authors present risk areas identified in a specific manufacturing process and the results of evaluating the reliability of human resources in that process.

  11. Reliable processing of graphene using metal etchmasks

    Directory of Open Access Journals (Sweden)

    Peltekis Nikos

    2011-01-01

    Graphene exhibits exciting properties which make it an appealing candidate for use in electronic devices. Reliable processes for device fabrication are a crucial prerequisite for this. We developed large-area CVD synthesis and transfer of graphene films. By patterning these graphene layers using standard photoresist masks, we are able to produce arrays of gated graphene devices with four-point contacts. The etching and lift-off process poses problems because of delamination and contamination due to polymer residues when using standard resists. We introduce a metal etch mask which minimises these problems. The high quality of the graphene is shown by Raman and XPS spectroscopy as well as electrical measurements. The process is of high value for applications, as it improves the processability of graphene using high-throughput lithography and etching techniques.

  12. Photovoltaic Reliability Group activities in USA and Brazil (Presentation Recording)

    Science.gov (United States)

    Dhere, Neelkanth G.; Cruz, Leila R. O.

    2015-09-01

    Recently, prices of photovoltaic (PV) systems have been reduced considerably and may continue to fall, making them attractive. If these systems provide electricity over the stipulated warranty period, it would be possible to attain socket parity within the next few years. Current photovoltaic module qualification tests help minimize infant mortality but do not guarantee a useful lifetime over the warranty period. The PV Module Quality Assurance Task Force (PVQAT) is working to formulate accelerated tests that will be useful toward achieving the ultimate goal of assuring useful lifetime over the warranty period as well as assuring manufacturing quality. Unfortunately, assuring manufacturing quality may require 24/7 presence. Alternatively, collecting data on the performance of fielded systems would assist in assuring manufacturing quality. Here, PV systems installed by home-owners and small businesses constitute an important untapped source of data. The volunteer group PV - Reliable, Safe and Sustainable Quality! (PVRessQ!) is providing a valuable service to small PV system owners. The Photovoltaic Reliability Group (PVRG) is initiating activities in the USA and Brazil to assist home owners and small businesses in monitoring photovoltaic (PV) module performance and enforcing warranties. It will work in collaboration with small PV system owners and consumer protection agencies. Brazil is endowed with excellent solar irradiance, making it attractive for the installation of PV systems. Participating owners of small PV systems would instruct inverter manufacturers to copy their daily e-mails to the PVRG and, as necessary, will authorize the PVRG to carry out reviews of their PV systems. The presentation will cover the overall activities of the PVRG in the USA and Brazil.

  13. Reliability theory for diffusion processes on interconnected networks

    Science.gov (United States)

    Khorramzadeh, Yasamin; Youssef, Mina; Eubank, Stephen

    2014-03-01

    We present the concept of network reliability as a framework to study diffusion dynamics in interdependent networks. We illustrate how different outcomes of diffusion processes, such as cascading failure, can be studied by estimating the reliability polynomial under different reliability rules. As an example, we investigate the effect of structural properties on diffusion dynamics for a few different topologies of two coupled networks. We evaluate the effect of varying the probability of failure propagating along the edges, both within a single network as well as between the networks. We exhibit the sensitivity of interdependent network reliability and connectivity to edge failures in each topology.
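
    As a hedged illustration of the reliability-polynomial idea (not the authors' estimator), the sketch below uses Monte Carlo to estimate all-terminal reliability, the probability that a graph stays connected when each edge survives independently with probability p. The two-triangle topology and all parameters are invented for the example.

```python
# Monte Carlo estimate of all-terminal network reliability R(p):
# the probability the graph remains connected when every edge
# survives independently with probability p. Toy graph: two triangles
# {0,1,2} and {3,4,5} joined by the single bridge edge (2, 3).
import random

def connected(nodes, edges):
    # depth-first reachability from an arbitrary start node
    if not nodes:
        return True
    adj = {v: [] for v in nodes}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    start = next(iter(nodes))
    seen, stack = {start}, [start]
    while stack:
        u = stack.pop()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == len(nodes)

def reliability(nodes, edges, p, trials=20000, seed=1):
    rng = random.Random(seed)
    hits = sum(
        connected(nodes, [e for e in edges if rng.random() < p])
        for _ in range(trials)
    )
    return hits / trials

nodes = {0, 1, 2, 3, 4, 5}
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
print(reliability(nodes, edges, 0.9))  # analytic value is 0.9 * 0.972**2 ≈ 0.850
```

    The bridge edge dominates: the whole graph survives only if that single coupling edge does, which is the kind of structural sensitivity the abstract describes for interdependent networks.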

  14. Reliability Methods for Shield Design Process

    Science.gov (United States)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phases of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of the two most commonly identified uncertainties in radiation shield design: the shielding properties of the materials used and the understanding of the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response to especially high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.

  15. Monitoring and Improving the Reliability of Plasma Spray Processes

    Science.gov (United States)

    Mauer, Georg; Rauwald, Karl-Heinz; Mücke, Robert; Vaßen, Robert

    2017-06-01

    Monitoring and improving process reliability are prevalent issues in thermal spray technology. They are intended to accomplish specific quality characteristics by controlling the process. For this, implicit approaches are in demand to rapidly draw conclusions about relevant coating properties, i.e., the properties are not directly measured, but it is assumed that the monitored variables are in fact suggestive of them. Such monitoring can be performed in situ (during the running process) instead of measuring coating characteristics explicitly (directly) and ex situ (after the process). Implicit approaches can be based on extrinsic variables (set from outside) as well as on intrinsic parameters (internal, not directly adjustable), each having specific advantages and disadvantages. In this work, the effects of atmospheric plasma spray process variables are systemized in process schemes. On this basis, different approaches to contribute to improved process reliability are described and assessed, paying particular attention to in-flight particle diagnostics. Finally, a new test applying spray bead analysis is introduced and first results are presented.

  16. Reliability Estimation of the Pultrusion Process Using the First-Order Reliability Method (FORM)

    Science.gov (United States)

    Baran, Ismet; Tutum, Cem C.; Hattel, Jesper H.

    2013-08-01

    In the present study the reliability estimation of the pultrusion process of a flat plate is analyzed by using the first order reliability method (FORM). The implementation of the numerical process model is validated by comparing the deterministic temperature and cure degree profiles with corresponding analyses in the literature. The centerline degree of cure at the exit (CDOCE) being less than a critical value and the maximum composite temperature (Tmax) during the process being greater than a critical temperature are selected as the limit state functions (LSFs) for the FORM. The cumulative distribution functions of the CDOCE and Tmax as well as the correlation coefficients are obtained by using the FORM and the results are compared with corresponding Monte-Carlo simulations (MCS). According to the results obtained from the FORM, an increase in the pulling speed yields an increase in the probability of Tmax being greater than the resin degradation temperature. A similar trend is also seen for the probability of the CDOCE being less than 0.8.
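
    For readers unfamiliar with FORM, here is a minimal sketch of the idea on a toy linear limit state, not the pultrusion model itself: with g = R - S for independent normal capacity R and demand S, the reliability index is beta = mu_g / sigma_g and the failure probability is P_f = Phi(-beta). The means and standard deviations below are assumptions chosen loosely to echo the degree-of-cure criterion, and a Monte Carlo check follows.

```python
# Toy FORM example on a linear limit state g = R - S (assumed values).
# For a linear g with independent normal inputs, FORM is exact:
# beta = mu_g / sigma_g and P_f = Phi(-beta).
import math
import random

mu_R, sd_R = 0.85, 0.03   # assumed mean/std of a capacity (e.g., achievable cure)
mu_S, sd_S = 0.80, 0.02   # assumed mean/std of the demand (e.g., required cure)

mu_g = mu_R - mu_S
sd_g = math.hypot(sd_R, sd_S)          # variance of a difference of independents
beta = mu_g / sd_g                     # reliability index
phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))   # standard normal CDF
pf_form = phi(-beta)                   # FORM failure probability

# crude Monte Carlo cross-check of P(g < 0)
rng = random.Random(0)
n = 200_000
pf_mc = sum(rng.gauss(mu_R, sd_R) - rng.gauss(mu_S, sd_S) < 0
            for _ in range(n)) / n
print(beta, pf_form, pf_mc)
```

    For nonlinear limit states like the pultrusion LSFs, FORM instead linearizes g at the most probable failure point, and Monte Carlo serves as the benchmark, as in the abstract.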

  17. Reliability and energy efficiency of zero energy homes (Conference Presentation)

    Science.gov (United States)

    Dhere, Neelkanth G.

    2016-09-01

    Photovoltaic (PV) modules and systems are being installed increasingly on residential homes to increase the proportion of renewable energy in the energy mix. The ultimate goal is to attain sustainability without subsidy. The prices of PV modules and systems have declined substantially during the recent years. They will be reduced further to reach grid parity. Additionally the total consumed energy must be reduced by making the homes more energy efficient. FSEC/UCF Researchers have carried out research on development of PV cells and systems and on reducing the energy consumption in homes and by small businesses. Additionally, they have provided guidance on PV module and system installation and to make the homes energy efficient. The produced energy is fed into the utility grid and the consumed energy is obtained from the utility grid, thus the grid is assisting in the storage. Currently the State of Florida permits net metering leading to equal charge for the produced and consumed electricity. This paper describes the installation of 5.29 KW crystalline silicon PV system on a south-facing tilt at approximately latitude tilt on a single-story, three-bedroom house. It also describes the computer program on Building Energy Efficiency and the processes that were employed for reducing the energy consumption of the house by improving the insulation, air circulation and windows, etc. Finally it describes actual consumption and production of electricity and the installation of additional crystalline silicon PV modules and balance of system to make it a zero energy home.

  18. Accurate, reliable control of process gases by mass flow controllers

    Energy Technology Data Exchange (ETDEWEB)

    Hardy, J.; McKnight, T.

    1997-02-01

    The thermal mass flow controller, or MFC, has become an instrument of choice for the monitoring and controlling of process gas flow throughout the materials processing industry. These MFCs are used on CVD processes, etching tools, and furnaces and, within the semiconductor industry, are used on 70% of the processing tools. Reliability and accuracy are major concerns for the users of the MFCs. Calibration and characterization technologies for the development and implementation of mass flow devices are described. A test facility is available to industry and universities to test and develop gas flow sensors and controllers and evaluate their performance related to environmental effects, reliability, reproducibility, and accuracy. Additional work has been conducted in the area of accuracy. A gravimetric calibrator was invented that allows flow sensors to be calibrated in corrosive, reactive gases to an accuracy of 0.3% of reading, at least an order of magnitude better than previously possible. Although MFCs are typically specified with accuracies of 1% of full scale, MFCs may often be implemented with unwarranted confidence due to the conventional use of surrogate gas factors. Surrogate gas factors are corrections applied to process flow indications when an MFC has been calibrated on a laboratory-safe surrogate gas, but is actually used on a toxic or corrosive process gas. Previous studies have indicated that the use of these factors may cause process flow errors of typically 10%, but possibly as great as 40% of full scale. This paper will present possible sources of error in MFC process gas flow monitoring and control, and will present an overview of corrective measures which may be implemented with MFC use to significantly reduce these sources of error.
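
    The surrogate-gas error mechanism described above amounts to simple multiplication: the indicated flow is scaled by a published conversion factor, so any error in that factor passes straight through to the delivered flow. A toy numeric sketch (all values assumed, not from the paper):

```python
# Illustration of surrogate-gas factor error for an MFC (assumed numbers).
# The MFC is calibrated on a safe surrogate gas (e.g., N2); process flow is
# inferred as indicated_flow * conversion_factor, so factor error = flow error.
indicated_sccm = 100.0     # flow the surrogate-calibrated MFC reports
published_factor = 1.40    # assumed handbook conversion factor for the process gas
true_factor = 1.26         # assumed factor measured by gravimetric calibration
assumed_flow = indicated_sccm * published_factor
actual_flow = indicated_sccm * true_factor
error_pct = 100 * (assumed_flow - actual_flow) / actual_flow
print(round(error_pct, 1))  # → 11.1 (% overestimate of delivered flow)
```

    A factor error of this size sits squarely in the "typically 10%" range the paper cites, which is why gravimetric calibration on the actual process gas matters.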

  19. Performance and Reliability of Interface Materials for Automotive Power Electronics (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Narumanchi, S.; DeVoto, D.; Mihalic, M.; Paret, P.

    2013-07-01

    Thermal management and reliability are important because excessive temperature can degrade the performance, life, and reliability of power electronics and electric motors. Advanced thermal management technologies enable keeping temperature within limits; higher power densities; and lower cost materials, configurations and systems. Thermal interface materials, bonded interface materials and the reliability of bonded interfaces are discussed in this presentation.

  20. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    (sampling variances) can be reduced greatly however, and sampling biases can be eliminated completely, by respecting a simple set of rules and guidelines provided by TOS. A systematic approach for description of process heterogeneity furnishes in-depth knowledge about the specific variability of any 1-D lot...... of any hidden cycle, eliminating the risk of underestimating process variation. A brief description of selected hardware for extraction of samples from 1-D lots is provided in order to illustrate the key issues to consider when installing new, or optimizing existing sampling devices and procedures...

  1. Process plant equipment operation, control, and reliability

    CERN Document Server

    Holloway, Michael D; Onyewuenyi, Oliver A

    2012-01-01

    "Process Plant Equipment Book is another great publication from Wiley as a reference book for final year students as well as those who will work or are working in chemical production plants and refinery…" -Associate Prof. Dr. Ramli Mat, Deputy Dean (Academic), Faculty of Chemical Engineering, Universiti Teknologi Malaysia "…give[s] readers access to both fundamental information on process plant equipment and to practical ideas, best practices and experiences of highly successful engineers from around the world… The book is illustrated throughout with numerous black & white p

  2. Process related contaminations causing climatic reliability issues

    DEFF Research Database (Denmark)

    Jellesen, Morten Stendahl; Dutta, Mondira; Verdingovas, Vadimas

    2012-01-01

    Some level of solder flux residue is inevitably found on electronics no matter whether the Printed Circuit Board Assembly (PCBA) manufacturing is carried out by hand, wave or reflow soldering process. The current use of no-clean flux systems should in principle only leave benign surface contamina...

  3. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    Process sampling of moving streams of particulate matter, fluids and slurries (over time or space) or stationary one-dimensional (1-D) lots is often carried out according to existing tradition or protocol not taking the theory of sampling (TOS) into account. In many situations, sampling errors (s...

  4. Nanowire growth process modeling and reliability models for nanodevices

    Science.gov (United States)

    Fathi Aghdam, Faranak

    . This work is an early attempt that uses a physical-statistical modeling approach to studying selective nanowire growth for the improvement of process yield. In the second research work, the reliability of nano-dielectrics is investigated. As electronic devices get smaller, reliability issues pose new challenges due to unknown underlying physics of failure (i.e., failure mechanisms and modes). This necessitates new reliability analysis approaches related to nano-scale devices. One of the most important nano-devices is the transistor that is subject to various failure mechanisms. Dielectric breakdown is known to be the most critical one and has become a major barrier for reliable circuit design in nano-scale. Due to the need for aggressive downscaling of transistors, dielectric films are being made extremely thin, and this has led to adopting high permittivity (k) dielectrics as an alternative to widely used SiO2 in recent years. Since most time-dependent dielectric breakdown test data on bilayer stacks show significant deviations from a Weibull trend, we have proposed two new approaches to modeling the time to breakdown of bi-layer high-k dielectrics. In the first approach, we have used a marked space-time self-exciting point process to model the defect generation rate. A simulation algorithm is used to generate defects within the dielectric space, and an optimization algorithm is employed to minimize the Kullback-Leibler divergence between the empirical distribution obtained from the real data and the one based on the simulated data to find the best parameter values and to predict the total time to failure. The novelty of the presented approach lies in using a conditional intensity for trap generation in dielectric that is a function of time, space and size of the previous defects. In addition, in the second approach, a k-out-of-n system framework is proposed to estimate the total failure time after the generation of more than one soft breakdown.

  5. Process related contaminations causing climatic reliability issues

    DEFF Research Database (Denmark)

    Jellesen, Morten Stendahl; Dutta, Mondira; Verdingovas, Vadimas

    2012-01-01

    Some level of solder flux residue is inevitably found on electronics no matter whether the Printed Circuit Board Assembly (PCBA) manufacturing is carried out by hand, wave or reflow soldering process. The current use of no-clean flux systems should in principle only leave benign surface contaminants during the wave and re-flow soldering process; however variation in temperature on the PCBA surface during soldering can result in considerable amounts of active residues being left locally. Typical no-clean flux systems used today consist of weak organic acids (WOA) and active residues left ... of WOAs from reflow solder paste (malic, adipic, succinic, and glutaric acid) and its effects on leakage current and corrosion of Sn and Cu. Leakage current due to flux residue was investigated using a localized cleanliness test system C3 (Foresite Inc., USA). The system extracts residue contaminants ...

  6. Tera-Op Reliable Intelligently Adaptive Processing System (TRIPS)

    Science.gov (United States)

    2004-04-01

    AFRL-IF-WP-TR-2004-1514: Tera-Op Reliable Intelligently Adaptive Processing System (TRIPS). Stephen W. Keckler, Doug Burger, Michael Dahlin... Contract F33615-01-C-1892, reporting period through 03/31/2004. ...influence beyond the scope of this project; the influence is expected to increase with the fabrication of the prototype in phase 2.

  7. Reliability analysis of common hazardous waste treatment processes

    Energy Technology Data Exchange (ETDEWEB)

    Waters, R.D. [Vanderbilt Univ., Nashville, TN (United States)

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.

  8. Reliability of resistivity quantification for shallow subsurface water processes

    CERN Document Server

    Rings, Joerg; 10.1016/j.jappgeo.2009.03.008

    2009-01-01

    The reliability of surface-based electrical resistivity tomography (ERT) for quantifying resistivities for shallow subsurface water processes is analysed. A method comprising numerical simulations of water movement in soil and forward-inverse modeling of ERT surveys for two synthetic data sets is presented. Resistivity contrast, e.g. by changing water content, is shown to have large influence on the resistivity quantification. An ensemble and clustering approach is introduced in which ensembles of 50 different inversion models for one data set are created by randomly varying the parameters for a regularisation based inversion routine. The ensemble members are sorted into five clusters of similar models and the mean model for each cluster is computed. Distinguishing persisting features in the mean models from singular artifacts in individual tomograms can improve the interpretation of inversion results. Especially in the presence of large resistivity contrasts in high sensitivity areas, the quantification of r...

  9. The process group approach to reliable distributed computing

    Science.gov (United States)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  10. Monitoring Software Reliability using Statistical Process Control: An MMLE Approach

    Directory of Open Access Journals (Sweden)

    Bandla Sreenivasa Rao

    2011-11-01

    This paper considers an MMLE (Modified Maximum Likelihood Estimation) based scheme to estimate software reliability using the exponential distribution. The MMLE is one of the generalized frameworks of software reliability models of Non-Homogeneous Poisson Processes (NHPPs). The MMLE gives analytical estimators rather than an iterative approximation to estimate the parameters. In this paper we propose an SPC (Statistical Process Control) chart mechanism to determine software quality using inter-failure times data. The control charts can be used to measure whether the software process is statistically under control or not.

  11. Reliability Analysis and Standardization of Spacecraft Command Generation Processes

    Science.gov (United States)

    Meshkat, Leila; Grenander, Sven; Evensen, Ken

    2011-01-01

    In order to reduce commanding errors that are caused by humans, we create an approach and corresponding artifacts for standardizing the command generation process and conducting risk management during the design and assurance of such processes. The literature review conducted during the standardization process revealed that very few atomic-level human activities are associated with even a broad set of missions. Applicable human reliability metrics for performing these atomic-level tasks are available. The process for building a "Periodic Table" of Command and Control Functions as well as Probabilistic Risk Assessment (PRA) models is demonstrated. The PRA models are executed using data from human reliability data banks. The Periodic Table is related to the PRA models via Fault Links.

  12. PRESENTATION POTENTIAL USING IN PEDAGOGICAL INTERACTION PROCESS

    Directory of Open Access Journals (Sweden)

    Olga V. Ershova

    2016-01-01

    Full Text Available The given article considers the potential of multimedia presentations and their influence on strengthening classroom teacher-student interaction. On the one hand, the importance of using this kind of activity in the study process is noted in connection with state educational policy. On the other hand, the skills students gain from working with presentations meet employers' demands in both domestic and world labour markets and bring a competitive advantage to candidates. Scientific novelty and results. A multimedia presentation is considered as a specific complex of classroom activities, with students oriented toward self-analysis and assessment of presentations. It is shown that a well-organized process of peer assessment among students helps to solve didactic and methodical problems simultaneously. To this end, a system of assessment criteria should be developed; it has to be clear to students so that assessment is feasible and time-saving. An example of a possible system of criteria is described, against which the quality of student presentations can be judged. The author also analyzed software products for the three main platforms (Windows, Linux, MacOS), which offer different tools for creating presentations. The article includes a comparative table of the two most popular tools, the program Microsoft PowerPoint and the web service Prezi, assessing the relevance of their use in the study process. The practical significance of the present article lies in the author's recommendations for using the potential of presentations as a tool for improving pedagogical interaction with contemporary students.

  13. Reliably Addressing "What Matters" Through a Quality Improvement Process.

    Science.gov (United States)

    Rutherford, Patricia A

    2016-02-01

    Oncology nurses have a critical role in mitigating the intense vulnerability, loss of control, and fear of the unknown that characterize the experiences of patients with cancer and their family members. Reliably inquiring about the issues that are at the forefront for patients and their loved ones can encourage a deeper dialogue, where nurses can understand and address the issues that are most important to them. A practical quality improvement approach can help to ensure that processes are in place to assist nurses in devoting time to reliably inquire about "what matters" to each patient at every encounter.

  14. Numerical Model based Reliability Estimation of Selective Laser Melting Process

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2014-01-01

    Selective laser melting is developing into a standard manufacturing technology with applications in various sectors. However, the process is still far from being at par with conventional processes such as welding and casting, the primary reason of which is the unreliability of the process. While...... of the selective laser melting process. A validated 3D finite-volume alternating-direction-implicit numerical technique is used to model the selective laser melting process, and is calibrated against results from single track formation experiments. Correlation coefficients are determined for process input...... parameters such as laser power, speed, beam profile, etc. Subsequently, uncertainties in the processing parameters are utilized to predict a range for the various outputs, using a Monte Carlo method based uncertainty analysis methodology, and the reliability of the process is established....
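The uncertainty-propagation step can be illustrated in a few lines. Everything numeric below is an assumption: a toy linear surrogate stands in for the paper's calibrated finite-volume melt model, and the input spreads and spec window are invented for the example.

```python
import random

def monte_carlo_reliability(n_samples=10_000, seed=42):
    """Monte Carlo sketch of process-reliability estimation.

    The paper propagates processing-parameter uncertainties through a
    calibrated 3D finite-volume model; here a made-up linear surrogate
    stands in for that model, so all numbers are illustrative only.
    """
    rng = random.Random(seed)
    within_spec = 0
    for _ in range(n_samples):
        power = rng.gauss(200.0, 10.0)    # laser power in W (assumed spread)
        speed = rng.gauss(1000.0, 50.0)   # scan speed in mm/s (assumed spread)
        output = 0.04 * power - 0.004 * speed + 4.0  # toy surrogate response
        if 7.5 <= output <= 8.5:          # assumed spec window on the output
            within_spec += 1
    return within_spec / n_samples

rel = monte_carlo_reliability()
print(f"estimated process reliability: {rel:.3f}")
```

The fraction of sampled runs that land inside the spec window is the reliability estimate; with the assumed spreads the true value is about 0.74.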

  15. Achieving High Reliability with People, Processes, and Technology.

    Science.gov (United States)

    Saunders, Candice L; Brennan, John A

    2017-01-01

    High reliability as a corporate value in healthcare can be achieved by meeting the "Quadruple Aim" of improving population health, reducing per capita costs, enhancing the patient experience, and improving provider wellness. This drive starts with the board of trustees, CEO, and other senior leaders who ingrain high reliability throughout the organization. At WellStar Health System, the board developed an ambitious goal to become a top-decile health system in safety and quality metrics. To achieve this goal, WellStar has embarked on a journey toward high reliability and has committed to Lean management practices consistent with the Institute for Healthcare Improvement's definition of a high-reliability organization (HRO): one that is committed to the prevention of failure, early identification and mitigation of failure, and redesign of processes based on identifiable failures. In the end, a successful HRO can provide safe, effective, patient- and family-centered, timely, efficient, and equitable care through a convergence of people, processes, and technology.

  16. Wind Farm Reliability Modelling Using Bayesian Networks and Semi-Markov Processes

    Directory of Open Access Journals (Sweden)

    Robert Adam Sobolewski

    2015-09-01

    Full Text Available Technical reliability plays an important role among factors affecting the power output of a wind farm. The reliability is determined by an internal collection grid topology and reliability of its electrical components, e.g. generators, transformers, cables, switch breakers, protective relays, and busbars. A wind farm reliability’s quantitative measure can be the probability distribution of combinations of operating and failed states of the farm’s wind turbines. The operating state of a wind turbine is its ability to generate power and to transfer it to an external power grid, which means the availability of the wind turbine and other equipment necessary for the power transfer to the external grid. This measure can be used for quantitative analysis of the impact of various wind farm topologies and the reliability of individual farm components on the farm reliability, and for determining the expected farm output power with consideration of the reliability. This knowledge may be useful in an analysis of power generation reliability in power systems. The paper presents probabilistic models that quantify the wind farm reliability taking into account the above-mentioned technical factors. To formulate the reliability models Bayesian networks and semi-Markov processes were used. Using Bayesian networks the wind farm structural reliability was mapped, as well as quantitative characteristics describing equipment reliability. To determine the characteristics semi-Markov processes were used. The paper presents an example calculation of: (i) probability distribution of the combination of both operating and failed states of four wind turbines included in the wind farm, and (ii) expected wind farm output power with consideration of its reliability.
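For independent turbines with known availabilities, the probability distribution over operating/failed combinations reduces to a product. A minimal sketch, ignoring the collection-grid topology and the semi-Markov derivation of the availabilities, and assuming hypothetical availabilities and a 2 MW rated power per turbine:

```python
from itertools import product

def farm_state_distribution(availabilities):
    """Probability of every operating(1)/failed(0) combination of turbines,
    assuming independence between turbines (a simplification of the paper,
    which models dependencies via Bayesian networks)."""
    dist = {}
    for states in product((1, 0), repeat=len(availabilities)):
        p = 1.0
        for up, a in zip(states, availabilities):
            p *= a if up else (1.0 - a)
        dist[states] = p
    return dist

avail = [0.97, 0.97, 0.95, 0.95]  # assumed per-turbine availabilities
rated_mw = 2.0                    # assumed rated power per turbine
dist = farm_state_distribution(avail)
expected_mw = sum(p * sum(s) * rated_mw for s, p in dist.items())
print(f"P(all up) = {dist[(1, 1, 1, 1)]:.4f}, expected output = {expected_mw:.2f} MW")
```

The same dictionary supports any topology-dependent refinement: replace the product with probabilities read off a Bayesian network and the expectation formula is unchanged.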

  17. Algorithm for break even availability allocation in process system modification using deterministic valuation model incorporating reliability

    Energy Technology Data Exchange (ETDEWEB)

    Shouri, P.V.; Sreejith, P.S. [Division of Mechanical Engineering, School of Engineering, Cochin University of Science and Technology (CUSAT), Cochin 682 022, Kerala (India)

    2008-06-15

    In the present scenario of energy demand overtaking energy supply, top priority is given for energy conservation programs and policies. As a result, most existing systems are redesigned or modified with a view for improving energy efficiency. Often these modifications can have an impact on process system configuration, thereby affecting process system reliability. The paper presents a model for valuation of process systems incorporating reliability that can be used to determine the change in process system value resulting from system modification. The model also determines the break even system availability and presents an algorithm for allocation of component reliabilities of the modified system based on the break even system availability. The developed equations are applied to a steam power plant to study the effect of various operating parameters on system value. (author)
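The break-even idea can be illustrated with a deliberately simplified linear valuation; the one-line value function and all figures below are invented for the example and omit the paper's discounting and reliability-allocation details.

```python
def system_value(availability, annual_revenue, annual_cost):
    """Toy annual value of a process system incorporating availability."""
    return availability * annual_revenue - annual_cost

def break_even_availability(avail_old, annual_revenue, cost_old, cost_new):
    """Availability at which the modified system's value equals the
    original system's value (solves the linear value equation)."""
    return (system_value(avail_old, annual_revenue, cost_old) + cost_new) / annual_revenue

# Hypothetical numbers: the modification cuts operating cost but may
# change system configuration and hence availability.
a_star = break_even_availability(0.92, 1_000_000.0, 150_000.0, 120_000.0)
print(f"break-even availability of modified system: {a_star:.4f}")
```

Here the cheaper modified system breaks even at a lower availability (0.89 versus 0.92), so the modification tolerates some reliability loss before destroying value; component reliabilities would then be allocated so the modified system meets at least that availability.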

  18. Moving to a Higher Level for PV Reliability through Comprehensive Standards Based on Solid Science (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.

    2014-11-01

    PV reliability is a challenging topic because of the desired long life of PV modules, the diversity of use environments and the pressure on companies to rapidly reduce their costs. This presentation describes the challenges, examples of failure mechanisms that we know or don't know how to test for, and how a scientific approach is being used to establish international standards.

  19. An expert system for ensuring the reliability of the technological process of cold sheet metal forming

    Science.gov (United States)

    Kashapova, L. R.; Pankratov, D. L.; Utyaganov, P. P.

    2016-06-01

    To exclude periodic defects in parts manufactured by cold sheet metal forming, a method for automated estimation of technological process reliability was developed. The technique is based on the analysis of reliability factors: part design; material, mechanical, and physical requirements; hardware settings; tool characteristics; etc. The paper presents an expert system based on the statistical accumulation of the knowledge of the operator (technologist) and the decisions of control algorithms.

  20. Reliability Engineering for ATLAS Petascale Data Processing on the Grid

    CERN Document Server

    Golubkov, D V; The ATLAS collaboration; Vaniachine, A V

    2012-01-01

    The ATLAS detector is in its third year of continuous LHC running taking data for physics analysis. A starting point for ATLAS physics analysis is reconstruction of the raw data. First-pass processing takes place shortly after data taking, followed later by reprocessing of the raw data with updated software and calibrations to improve the quality of the reconstructed data for physics analysis. Data reprocessing involves a significant commitment of computing resources and is conducted on the Grid. The reconstruction of one petabyte of ATLAS data with 1B collision events from the LHC takes about three million core-hours. Petascale data processing on the Grid involves millions of data processing jobs. At such scales, the reprocessing must handle a continuous stream of failures. Automatic job resubmission recovers transient failures at the cost of CPU time used by the failed jobs. Orchestrating ATLAS data processing applications to ensure efficient usage of tens of thousands of CPU-cores, reliability engineering ...

  1. Power Electronic Packaging Design, Assembly Process, Reliability and Modeling

    CERN Document Server

    Liu, Yong

    2012-01-01

    Power Electronic Packaging presents an in-depth overview of power electronic packaging design, assembly, reliability and modeling. Since there is a drastic difference between IC fabrication and power electronic packaging, the book systematically introduces typical power electronic packaging design, assembly, reliability and failure analysis and material selection so readers can clearly understand each task's unique characteristics. Power electronic packaging is one of the fastest growing segments in the power electronic industry, due to the rapid growth of power integrated circuit (IC) fabrication, especially for applications like portable, consumer, home, computing and automotive electronics. This book also covers how advances in both semiconductor content and power advanced package design have helped cause advances in power device capability in recent years. The author extrapolates the most recent trends in the book's areas of focus to highlight where further improvement in materials and techniques can d...

  2. Adhesives technology for electronic applications materials, processing, reliability

    CERN Document Server

    Licari, James J

    2011-01-01

    Adhesives are widely used in the manufacture and assembly of electronic circuits and products. Generally, electronics design engineers and manufacturing engineers are not well versed in adhesives, while adhesion chemists have a limited knowledge of electronics. This book bridges these knowledge gaps and is useful to both groups. The book includes chapters covering types of adhesive, the chemistry on which they are based, and their properties, applications, processes, specifications, and reliability. Coverage of toxicity, environmental impacts and the regulatory framework make this book par

  3. "Extreme events" in STT-MRAM speed retention and reliability (Conference Presentation)

    Science.gov (United States)

    Wang, Xiaobin; Zhang, Jing; Wang, Zihui; Hao, Xiaojie; Zhou, Yuchen; Gan, Huadong; Jun, Dongha; Satoh, Kimihiro; Yen, Bing K.; Huai, Yiming

    2016-10-01

    Fast operation speed, high retention and high reliability are the most attractive features of the spin transfer torque magnetic random access memory (STT-MRAM) based upon the perpendicular magnetic tunneling junction (pMTJ). For state-of-the-art pMTJ STT-MRAM, device performance is fundamentally determined by the physics of material "extreme events". For example, nanosecond write bit error rate is determined by extremely high probability (>(1-10^(-7))) stochastic magnetization switching events, retention is determined by magnetization configurations with extremely low switching probability, and reliability is determined by extremely low probability failure events in MRAM write, read, retention and reliability. Specifically, we will present our model that accurately calculates extremely low write BER for various magnetization configurations. We will review our study of thermal magnetization switching through the dynamic optimal reversal path approach, capable of characterizing extreme thermal magnetization switching events under both low frequency (e.g. static retention) and high frequency (e.g. fast read) excitations. We will also discuss a new MTJ breakdown reliability model that quantifies extreme events uniformly at different failure mode regions.

  4. Integration of human reliability analysis into the high consequence process

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.; Morzinski, J.

    1998-12-01

    When performing a hazards analysis (HA) for a high consequence process, human error often plays a significant role. In order to integrate human error into the hazards analysis, a human reliability analysis (HRA) is performed. Human reliability is the probability that a person will correctly perform a system-required activity in a required time period and will perform no extraneous activity that will affect the correct performance. Even though human error is a very complex subject that can only approximately be addressed in risk assessment, an attempt must be made to estimate the effect of human errors. The HRA provides data that can be incorporated in the hazard analysis event. This paper will discuss the integration of HRA into an HA for the disassembly of a high explosive component. The process was designed to use a retaining fixture to hold the high explosive in place during a rotation of the component. This tool was designed as a redundant safety feature to help prevent a drop of the explosive. This paper will use the retaining fixture to demonstrate the following phases of the HRA methodology. The first phase is to perform a task analysis. The second phase is the identification of the potential human functions, both cognitive and psychomotor, performed by the worker. During the last phase the human errors are quantified. In reality, the HRA process is an iterative process in which the stages overlap and information gathered in one stage may be used to refine a previous stage. The rationale for the decision to use or not use the retaining fixture and the role the HRA played in the decision will be discussed.
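The quantification phase can be sketched as follows, assuming independent step-level human error probabilities (HEPs); all numeric values are hypothetical stand-ins, not figures from a human reliability data bank.

```python
def task_failure_probability(heps):
    """Probability that at least one step of a serial task is performed
    incorrectly, assuming independent step-level HEPs."""
    p_all_correct = 1.0
    for hep in heps:
        p_all_correct *= (1.0 - hep)
    return 1.0 - p_all_correct

# Hypothetical HEPs for three atomic steps of the disassembly task.
p_task_fail = task_failure_probability([0.003, 0.001, 0.01])

# Redundant safety feature: a drop occurs only if the worker omits the
# retaining fixture AND the unrestrained component is then dropped.
p_omit_fixture = 0.01        # hypothetical omission HEP
p_drop_unrestrained = 0.05   # hypothetical conditional drop probability
p_drop = p_omit_fixture * p_drop_unrestrained
print(f"P(task error)={p_task_fail:.5f}, P(drop)={p_drop:.6f}")
```

The product structure makes the value of the redundant fixture explicit: the drop probability falls by the factor of the conditional drop probability, which is the kind of comparison that informs the use/don't-use decision.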

  5. Experiments on data presentation to process operators in diagnostic tasks

    DEFF Research Database (Denmark)

    Rasmussen, Jens; Goodstein, L. P.

    1972-01-01

    Safety and reliability considerations in modern power plants have prompted our interest in man as an information receiver - especially in diagnostic tasks where the growing complexity of process plants and hence the amount of data involved make it imperative to give the staff proper support....... The great flexibility and capacity of the process computer for data reduction and presentation and for storing information on plant structure and functions give the system designer great freedom in the layout of information display for the staff, but the problem for the designer is how to make proper use...... of this freedom to support the operators efficiently. This is especially important in connection with unique, high-risk, and generally improbable abnormalities in plant functioning. Operator tasks and mental models and the need for matching the encoded information about the plant to these models are treated...

  6. Bayesian Reliability-Growth Analysis for Statistical of Diverse Population Based on Non-homogeneous Poisson Process

    Institute of Scientific and Technical Information of China (English)

    MING Zhimao; TAO Junyong; ZHANG Yunan; YI Xiaoshan; CHEN Xun

    2009-01-01

    To improve reliability before mass production begins, new armament systems call for a method of dealing with the multi-stage reliability-growth statistical problems of diverse populations. Aiming at the high-expense, small-sample-size test process in the development of complex systems, specific methods are studied for processing the statistical information of Bayesian reliability growth regarding diverse populations. Firstly, according to the characteristics of reliability growth during product development, the Bayesian method is used to integrate the testing information of multiple stages and the order relations of distribution parameters. Then a Gamma-Beta prior distribution is proposed based on the non-homogeneous Poisson process (NHPP) corresponding to the reliability growth process. The posterior distribution of the reliability parameters is obtained for different stages of the product, and the reliability parameters are evaluated based on the posterior distribution. Finally, the Bayesian approach proposed in this paper for multi-stage reliability growth testing is applied to a small-sample-size test process in the astronautics field. The results of a numerical example show that the presented model can make use of the diverse information synthetically, and pave the way for the application of the Bayesian model to multi-stage reliability growth test evaluation with small sample sizes. The method is useful for evaluating multi-stage system reliability and making reliability growth plans rationally.
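As a point of comparison for the NHPP framework, a simple frequentist power-law NHPP fit (Crow-AMSAA) can be sketched; the paper's actual method is Bayesian with Gamma-Beta priors, and the failure times below are invented.

```python
import math

def crow_amsaa_mle(failure_times, total_time):
    """MLE for a power-law NHPP with mean function m(t) = lam * t**beta
    (Crow-AMSAA); beta < 1 indicates reliability growth."""
    n = len(failure_times)
    beta = n / sum(math.log(total_time / t) for t in failure_times)
    lam = n / total_time ** beta
    return lam, beta

# Invented cumulative failure times (hours) over a 500-hour test
times = [5.0, 22.0, 60.0, 110.0, 200.0, 310.0, 450.0]
lam, beta = crow_amsaa_mle(times, total_time=500.0)
print(f"beta={beta:.3f} (<1 means reliability growth), lam={lam:.4f}")
```

By construction the fitted mean function reproduces the observed failure count at the end of the test, i.e. lam * T**beta equals the number of failures.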

  7. Illustrative presentation of some basic concepts of network internal reliability with comments as regards engineering surveys

    Science.gov (United States)

    Prószyński, Witold

    2016-06-01

    The paper integrates some earlier and the recent findings of the author in the area of network internal reliability and presents a consistent system of concepts in this respect. The concepts of outlier detection and outlier identification linked directly with the global model test and the outlier tests respectively, are shown as a basis for the concepts such as outlier detectability and outlier identifiability. Also, a four level classification of gross errors expressed in a form of a tree-diagram is presented including perceptible and imperceptible errors, detectable and undetectable errors and identifiable and unidentifiable errors. Their properties are given mainly in a descriptive way, deliberately limiting rigorous mathematical formulas to a necessary minimum. Understanding of different types of gross errors is useful in analyzing the results of the outlier detection and identification procedures as well as in designing the networks to make them duly robust to observation gross errors. It is of special importance for engineering surveys where quite often low-redundancy networks are used. Main objective of the paper is to demonstrate a clear and consistent system of basic concepts related to network internal reliability.

  8. Improving Emergency Department Door to Doctor Time and Process Reliability

    Science.gov (United States)

    El Sayed, Mazen J.; El-Eid, Ghada R.; Saliba, Miriam; Jabbour, Rima; Hitti, Eveline A.

    2015-01-01

    Abstract The aim of this study is to determine the effectiveness of using lean management methods on improving emergency department door to doctor times at a tertiary care hospital. We performed a before and after study at an academic urban emergency department with 49,000 annual visits after implementing a series of lean driven interventions over a 20 month period. The primary outcome was mean door to doctor time and the secondary outcome was length of stay of both admitted and discharged patients. A convenience sample from the preintervention phase (February 2012) was compared to another from the postintervention phase (mid-October to mid-November 2013). Individual control charts were used to assess process stability. Postintervention there was a statistically significant decrease in the mean door to doctor time measure (40.0 minutes ± 53.44 vs 25.3 minutes ± 15.93 P < 0.001). The postintervention process was more statistically in control with a drop in the upper control limits from 148.8 to 72.9 minutes. Length of stay of both admitted and discharged patients dropped from 2.6 to 2.0 hours and 9.0 to 5.5 hours, respectively. All other variables including emergency department visit daily volumes, hospital occupancy, and left without being seen rates were comparable. Using lean change management techniques can be effective in reducing door to doctor time in the Emergency Department and improving process reliability. PMID:26496278
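The individuals (XmR) control chart used in the study places limits at the mean plus or minus 2.66 times the average moving range; a sketch with made-up door-to-doctor times:

```python
def individuals_chart_limits(values):
    """Individuals (XmR) chart: limits at mean +/- 2.66 * average moving
    range, where the moving range is |x[i+1] - x[i]|."""
    mean = sum(values) / len(values)
    mr_bar = sum(abs(b - a) for a, b in zip(values, values[1:])) / (len(values) - 1)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

waits = [32, 41, 28, 55, 37, 44, 30, 48]  # hypothetical door-to-doctor minutes
lcl, cl, ucl = individuals_chart_limits(waits)
print(f"LCL={lcl:.1f}  CL={cl:.1f}  UCL={ucl:.1f}")
```

A narrowing of these limits after an intervention, as reported in the study (upper limit dropping from 148.8 to 72.9 minutes), indicates a more stable, statistically in-control process.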

  9. An Impact of Thermodynamic Processes in Human Bodies on Performance Reliability of Individuals

    Directory of Open Access Journals (Sweden)

    Smalko Zbigniew

    2015-01-01

    Full Text Available The article considers the influence of thermodynamic factors on human fallibility in different zones of thermal discomfort. It describes the energy processes in the human body and gives a formal description of the energy balance of human thermoregulation. Human reactions to temperature changes of the internal and external environment are pointed out, including reactions associated with exercise. A methodology is given to estimate and determine basic reliability indicators of human performance in different zones of thermal discomfort. The significant effect of thermodynamic factors on the reliability and safety of a person is shown.

  10. The PedsQL™ Present Functioning Visual Analogue Scales: preliminary reliability and validity

    Directory of Open Access Journals (Sweden)

    Varni James W

    2006-10-01

    Full Text Available Abstract Background The PedsQL™ Present Functioning Visual Analogue Scales (PedsQL™ VAS) were designed as an ecological momentary assessment (EMA) instrument to rapidly measure present or at-the-moment functioning in children and adolescents. The PedsQL™ VAS assess child self-report and parent-proxy report of anxiety, sadness, anger, worry, fatigue, and pain utilizing six developmentally appropriate visual analogue scales based on the well-established Varni/Thompson Pediatric Pain Questionnaire (PPQ) Pain Intensity VAS format. Methods The six-item PedsQL™ VAS was administered to 70 pediatric patients ages 5–17 and their parents upon admittance to the hospital environment (Time 1: T1) and again two hours later (Time 2: T2). It was hypothesized that the PedsQL™ VAS Emotional Distress Summary Score (anxiety, sadness, anger, worry) and the fatigue VAS would demonstrate moderate to large effect size correlations with the PPQ Pain Intensity VAS, and that patient-parent concordance would increase over time. Results Test-retest reliability was demonstrated from T1 to T2 in the large effect size range. Internal consistency reliability was demonstrated for the PedsQL™ VAS Total Symptom Score (patient self-report: T1 alpha = .72, T2 alpha = .80; parent proxy-report: T1 alpha = .80, T2 alpha = .84) and Emotional Distress Summary Score (patient self-report: T1 alpha = .74, T2 alpha = .73; parent proxy-report: T1 alpha = .76, T2 alpha = .81). As hypothesized, the Emotional Distress Summary Score and Fatigue VAS were significantly correlated with the PPQ Pain VAS in the medium to large effect size range, and patient-parent concordance increased from T1 to T2. Conclusion The results demonstrate preliminary test-retest and internal consistency reliability and construct validity of the PedsQL™ Present Functioning VAS instrument for both pediatric patient self-report and parent proxy-report. Further field testing is required to extend these initial

  11. Materials and processes for spacecraft and high reliability applications

    CERN Document Server

    D Dunn, Barrie

    2016-01-01

    The objective of this book is to assist scientists and engineers select the ideal material or manufacturing process for particular applications; these could cover a wide range of fields, from light-weight structures to electronic hardware. The book will help in problem solving as it also presents more than 100 case studies and failure investigations from the space sector that can, by analogy, be applied to other industries. Difficult-to-find material data is included for reference. The sciences of metallic (primarily) and organic materials presented throughout the book demonstrate how they can be applied as an integral part of spacecraft product assurance schemes, which involve quality, material and processes evaluations, and the selection of mechanical and component parts. In this successor edition, which has been revised and updated, engineering problems associated with critical spacecraft hardware and the space environment are highlighted by over 500 illustrations including micrographs and fractographs. Sp...

  12. Report on Wind Turbine Subsystem Reliability - A Survey of Various Databases (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Sheng, S.

    2013-07-01

    The wind industry has been challenged by premature subsystem/component failures. Various reliability data collection efforts have demonstrated their value in supporting wind turbine reliability and availability research & development and industrial activities. However, most information on these data collection efforts is scattered and not in a centralized place. With the objective of obtaining updated reliability statistics for wind turbines and/or subsystems so as to benefit future wind reliability and availability activities, this report was put together based on a survey of various reliability databases that are accessible directly or indirectly by NREL. For each database, whenever feasible, a brief description summarizing database population, life span, and data collected is given along with its features and status. Then selected results deemed beneficial to the industry and generated from the database are highlighted. The report concludes with several observations obtained throughout the survey and several reliability data collection opportunities for the future.

  13. Audience preferences are predicted by temporal reliability of neural processing.

    Science.gov (United States)

    Dmochowski, Jacek P; Bezdek, Matthew A; Abelson, Brian P; Johnson, John S; Schumacher, Eric H; Parra, Lucas C

    2014-07-29

    Naturalistic stimuli evoke highly reliable brain activity across viewers. Here we record neural activity from a group of naive individuals while viewing popular, previously-broadcast television content for which the broad audience response is characterized by social media activity and audience ratings. We find that the level of inter-subject correlation in the evoked encephalographic responses predicts the expressions of interest and preference among thousands. Surprisingly, ratings of the larger audience are predicted with greater accuracy than those of the individuals from whom the neural data is obtained. An additional functional magnetic resonance imaging study employing a separate sample of subjects shows that the level of neural reliability evoked by these stimuli covaries with the amount of blood-oxygenation-level-dependent (BOLD) activation in higher-order visual and auditory regions. Our findings suggest that stimuli which we judge favourably may be those to which our brains respond in a stereotypical manner shared by our peers.
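The core inter-subject correlation measure can be sketched as an average of pairwise Pearson correlations; note the study actually computes correlations in a correlated-component subspace of the EEG, and the viewer time series below are invented.

```python
def inter_subject_correlation(responses):
    """Mean pairwise Pearson correlation across subjects' response
    time series (a simplification of the study's correlated-components
    approach, kept here to show the core idea only)."""
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)
    pairs = [(i, j) for i in range(len(responses))
             for j in range(i + 1, len(responses))]
    return sum(pearson(responses[i], responses[j]) for i, j in pairs) / len(pairs)

# Three hypothetical viewers with similar evoked responses over five time points
viewers = [[1.0, 2.0, 3.0, 4.0, 3.0],
           [1.2, 2.1, 2.9, 4.2, 3.1],
           [0.9, 1.8, 3.2, 3.9, 2.8]]
isc = inter_subject_correlation(viewers)
print(f"ISC = {isc:.3f}")
```

A high ISC means the stimulus drives viewers' responses in lockstep, which is the quantity the study links to audience interest and preference.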

  14. The Process Group Approach to Reliable Distributed Computing

    Science.gov (United States)

    1991-07-01

    under DARPA/NASA grant NAG-2-593, and by grants from IBM, HP, Siemens, GTE and Hitachi. ...system, but could make it harder to administer and less reliable. A theme of the paper will be that one overcomes this intrinsic problem by standardizing

  15. Scalability, Complexity and Reliability in Quantum Information Processing

    Science.gov (United States)

    2007-03-01

    focus at a time. We can reliably determine if there is an atom at a given site in ~20 ms. The image plane can then be changed using a piezo-electric...velocities tuned near subharmonics of the transverse trapping frequencies, resonant transfer of longitudinal energy to transverse excitation can cause a...dramatic reduction in the longitudinal dispersion of an atomic wave packet. Tuned to one of these resonances (the 5th subharmonic), the atomic cloud is

  16. Ultrafast lasers--reliable tools for advanced materials processing

    National Research Council Canada - National Science Library

    Koji Sugioka; Ya Cheng

    2014-01-01

      The unique characteristics of ultrafast lasers, such as picosecond and femtosecond lasers, have opened up new avenues in materials processing that employ ultrashort pulse widths and extremely high peak intensities...

  17. A Structural Reliability Business Process Modelling with System Dynamics Simulation

    OpenAIRE

    Lam, C. Y.; S.L. Chan; Ip, W.H.

    2010-01-01

    Business activity flow analysis enables organizations to manage structured business processes, and can thus help them to improve performance. The six types of business activities identified here (i.e., SOA, SEA, MEA, SPA, MSA and FIA) are correlated and interact with one another, and the decisions from any business activity form feedback loops with previous and succeeding activities, thus allowing the business process to be modelled and simulated. For instance, for any company that is eager t...

  18. The reliability of thermocouples in microwave ceramics processing.

    Science.gov (United States)

    Aguilar, Juan; Valdez, Zarel; Ortiz, Ubaldo

    2004-01-01

    It is not rare to hear arguments against the use of thermocouples for measuring temperature in processes taking place under microwave fields. However, the simplicity of this device makes it attractive to consider its use. One question that arises when thermocouples are employed is whether the electric field perturbs the measurement, and whether the thermocouple affects the processing. The process chosen for this test was the synthesis of spinel (MgAl2O4) using microwaves as the power supply and hematite (Fe2O3) as an additive acting as both a promoter of spinel formation and a susceptor. Alumina-based systems are very important to study because alumina is one of the most common ingredients in refractory materials. There are many claims of process improvement when microwaves are used, but a kinetic comparison cannot be performed if the temperature is unknown, which is why the measurement technique is emphasized. The obtained samples were analyzed by X-ray powder diffraction. The results of this work show that there is no difference between the products obtained with the thermocouple inserted in the system and those processed without it; hence the thermocouple is appropriate for this application.

  19. Performance and Reliability of Bonded Interfaces for High-Temperature Packaging (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Devoto, D.

    2014-11-01

    The thermal performance and reliability of sintered-silver is being evaluated for power electronics packaging applications. This will be experimentally accomplished by the synthesis of large-area bonded interfaces between metalized substrates that will be subsequently subjected to thermal cycles. A finite element model of crack initiation and propagation in these bonded interfaces will allow for the interpretation of degradation rates by a crack-velocity (V)-stress intensity factor (K) analysis. The experiment is outlined, and the modeling approach is discussed.

  20. Silicon analog components device design, process integration, characterization, and reliability

    CERN Document Server

    El-Kareh, Badih

    2015-01-01

    This book covers modern analog components, their characteristics, and interactions with process parameters. It serves as a comprehensive guide, addressing both the theoretical and practical aspects of modern silicon devices and the relationship between their electrical properties and processing conditions. Based on the authors’ extensive experience in the development of analog devices, this book is intended for engineers and scientists in semiconductor research, development and manufacturing. The problems at the end of each chapter and the numerous charts, figures and tables also make it appropriate for use as a text in graduate and advanced undergraduate courses in electrical engineering and materials science.

  1. FE modeling of Cu wire bond process and reliability

    NARCIS (Netherlands)

    Yuan, C.A.; Weltevreden, E.R.; Akker, P. van den; Kregting, R.; Vreugd, J. de; Zhang, G.Q.

    2011-01-01

    Copper-based wire bonding technology is widely accepted by the electronic packaging industry due to worldwide cost-reduction efforts (compared to gold wire bonding). However, the mechanical characteristics of copper wire differ from those of gold wire; hence the new wire bond process settings and new bond

  3. Time-Dependent Reliability Modeling and Analysis Method for Mechanics Based on Convex Process

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2015-01-01

    The objective of the present study is to evaluate the time-dependent reliability of dynamic mechanical systems with insufficient time-varying uncertainty information. In this paper, the nonprobabilistic convex process model, which accounts for autocorrelation and cross-correlation, is first employed for the quantitative assessment of time-variant uncertainty in structural performance characteristics. By combining the set-theory method with a regularization treatment, the time-varying properties of the structural limit state are determined, and a standard convex process with autocorrelation describing the limit state is formulated. By virtue of the classical first-passage method in random process theory, a new nonprobabilistic measure of time-dependent reliability is proposed and its solution strategy is mathematically derived. Furthermore, the Monte Carlo simulation method is also discussed to illustrate the feasibility and accuracy of the developed approach. Three engineering cases clearly demonstrate that the proposed method may provide a more reasonable and efficient way to estimate structural safety than Monte Carlo simulation throughout a product life cycle.
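
    As a probabilistic counterpart to the verification discussed in the abstract, time-dependent reliability can be estimated by Monte Carlo sampling of first-passage events. The sketch below is illustrative only: it assumes a Brownian-motion load process and a fixed limit state, neither of which is taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def first_passage_reliability(n_paths=20000, n_steps=100, T=1.0,
                              limit=3.0, sigma=1.0):
    """Monte Carlo estimate of time-dependent reliability
    P(max_{t <= T} X(t) < limit) for a Brownian-motion load X(t).
    A sample path that ever crosses the limit counts as a failure."""
    dt = T / n_steps
    incs = rng.normal(0.0, sigma * np.sqrt(dt), size=(n_paths, n_steps))
    paths = np.cumsum(incs, axis=1)          # X at each time step
    survived = paths.max(axis=1) < limit     # never crossed the limit
    return survived.mean()

R = first_passage_reliability()
# for these parameters R is close to 1 (the limit sits 3 sigma away at T)
```

    In a real analysis the load process, limit state and correlation structure would come from the convex process model rather than this toy diffusion.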

  4. Supporting change processes in design: Complexity, prediction and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Eckert, Claudia M. [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: cme26@cam.ac.uk; Keller, Rene [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: rk313@cam.ac.uk; Earl, Chris [Open University, Department of Design and Innovation, Walton Hall, Milton Keynes MK7 6AA (United Kingdom)]. E-mail: C.F.Earl@open.ac.uk; Clarkson, P. John [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: pjc10@cam.ac.uk

    2006-12-15

    Change to existing products is fundamental to design processes. New products are often designed through change or modification to existing products. Specific parts or subsystems are changed to similar ones whilst others are directly reused. Design by modification applies particularly to safety critical products where the reuse of existing working parts and subsystems can reduce cost and risk. However change is rarely a matter of just reusing or modifying parts. Changing one part can propagate through the entire design leading to costly rework or jeopardising the integrity of the whole product. This paper characterises product change based on studies in the aerospace and automotive industry and introduces tools to aid designers in understanding the potential effects of change. Two ways of supporting designers are described: probabilistic prediction of the effects of change and visualisation of change propagation through product connectivities. Change propagation has uncertainties which are amplified by the choices designers make in practice as they implement change. Change prediction and visualisation is discussed with reference to complexity in three areas of product development: the structural backcloth of connectivities in the existing product (and its processes), the descriptions of the product used in design and the actions taken to carry out changes.

  5. A structure-based software reliability allocation using fuzzy analytic hierarchy process

    Science.gov (United States)

    Chatterjee, Subhashis; Singh, Jeetendra B.; Roy, Arunava

    2015-02-01

    During the design phase of a software system, it is often required to evaluate the system's reliability. At this stage of development, one crucial question arises: 'how can a target reliability of the software be achieved?' Reliability allocation methods can be used to set reliability goals for individual components. In this paper, a software reliability allocation model is proposed that incorporates the user's view of the various functions of the software. The proposed allocation method attempts to answer the question 'how reliable should the system components be?' The model is useful for determining reliability goals at the planning and design phase of a software project, hence making reliability a singular measure for performance evaluation. It requires a systematic translation of user requirements and preferences into the technical design and reliability of the software. To accomplish this task, a system hierarchy is established which combines the user's view of the system with those of the software manager and the programmer. The fuzzy analytic hierarchy process (FAHP) is used to derive the required model parameters from the hierarchy. A sensitivity analysis is also carried out. Finally, an example illustrates the effectiveness and feasibility of the proposed method.
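
    The eigenvector step at the core of crisp AHP can be sketched as follows; the paper's fuzzy extension (FAHP) replaces these crisp pairwise judgments with fuzzy numbers. The comparison matrix below is a made-up example, not data from the paper.

```python
import numpy as np

# Pairwise comparison matrix on Saaty's 1-9 scale for three hypothetical
# software components; A[i, j] states how much more important component i
# is than component j from the user's point of view (values invented).
A = np.array([
    [1.0, 3.0, 5.0],
    [1.0 / 3.0, 1.0, 2.0],
    [1.0 / 5.0, 1.0 / 2.0, 1.0],
])

# Crisp AHP: the principal eigenvector of A gives the priority weights.
vals, vecs = np.linalg.eig(A)
k = int(np.argmax(vals.real))
w = np.abs(vecs[:, k].real)
w = w / w.sum()                    # normalized priority weights, sum to 1

# Consistency index: lambda_max near n means the judgments are coherent.
CI = (vals.real[k] - A.shape[0]) / (A.shape[0] - 1)
```

    The resulting weights could then be used to apportion a system-level reliability target among the components.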

  6. Reliability Analysis of Repairable Systems Using Stochastic Point Processes

    Institute of Scientific and Technical Information of China (English)

    TAN Fu-rong; JIANG Zhi-bin; BAI Tong-shuo

    2008-01-01

    In order to analyze failure data from repairable systems, the homogeneous Poisson process (HPP) is usually used. In general, the HPP cannot be applied to the entire life cycle of a complex, repairable system because the rate of occurrence of failures (ROCOF) of the system changes over time rather than remaining stable. However, from a practical point of view, it is always preferable to apply the simplest method that yields useful practical results. Therefore, we attempted to use the HPP model to analyze failure data from real repairable systems. A graphic method and the Laplace test were also used in the analysis. Results of numerical applications show that the HPP model may be a useful tool for the entire life cycle of repairable systems.
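
    The Laplace trend test mentioned in the abstract is easy to sketch: under the HPP null hypothesis the statistic below is approximately standard normal, and a large absolute value signals a trend in the ROCOF. The failure times are invented for illustration.

```python
import numpy as np

def laplace_test(failure_times, T):
    """Laplace trend test for a repairable system observed over (0, T].
    Under the HPP null hypothesis the statistic is approximately
    standard normal; a large |u| indicates a trend in the ROCOF."""
    t = np.asarray(failure_times, dtype=float)
    n = len(t)
    return (t.mean() - T / 2.0) / (T * np.sqrt(1.0 / (12.0 * n)))

# Invented failure times (hours) of a system observed to T = 1000 h
times = [55, 166, 205, 341, 488, 567, 731, 845, 917, 988]
u = laplace_test(times, T=1000.0)
# |u| < 1.96: no significant trend at the 5% level, so an HPP is plausible
```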

  7. Human Reliability Program Overview

    Energy Technology Data Exchange (ETDEWEB)

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  8. The PedsQL™ Present Functioning Visual Analogue Scales: preliminary reliability and validity

    OpenAIRE

    Varni James W; Burwinkle Tasha M; Eisen Sarajane; Sherman Sandra A

    2006-01-01

    Background: The PedsQL™ Present Functioning Visual Analogue Scales (PedsQL™ VAS) were designed as an ecological momentary assessment (EMA) instrument to rapidly measure present or at-the-moment functioning in children and adolescents. The PedsQL™ VAS assess child self-report and parent-proxy report of anxiety, sadness, anger, worry, fatigue, and pain utilizing six developmentally appropriate visual analogue scales based on the well-established Varni/Thompson Pediatric Pain Questionnai...

  9. A Role For Mitochondria In Antigen Processing And Presentation.

    Science.gov (United States)

    Bonifaz, Lc; Cervantes-Silva, Mp; Ontiveros-Dotor, E; López-Villegas, Eo; Sánchez-García, Fj

    2014-09-23

    Immune synapse formation is critical for T lymphocyte activation, and mitochondria have a role in this process by localizing close to the immune synapse, regulating intracellular calcium concentration, and providing locally required ATP. The interaction between antigen-presenting cells (APCs) and T lymphocytes is a two-way signaling process. However, the role of mitochondria in antigen-presenting cells during this process remains unknown. For APCs to be able to activate T lymphocytes, they must first engage in antigen uptake, processing, and presentation. Here we show that HEL-loaded B lymphocytes, as a type of APC, undergo a small but significant mitochondrial depolarization by 1-2 h following antigen exposure, suggesting an increase in their metabolic demands. Inhibition of ATP synthase (oligomycin) or of the mitochondrial Ca(2+) uniporter (MCU) (ruthenium red) had no effect on antigen uptake; therefore, antigen processing and antigen presentation were further analyzed. Oligomycin treatment reduced the amount of specific MHC-peptide complexes, but not total MHC II, on the cell membrane of B lymphocytes, which correlated with a decrease in antigen presentation. However, oligomycin also reduced antigen presentation by B lymphocytes that endogenously express HEL and by B lymphocytes loaded with the HEL48-62 peptide, although to a lesser extent. ATP synthase inhibition and MCU inhibition had a clear inhibitory effect on antigen processing (DQ-OVA). Taken together, these results suggest that ATP synthase and MCU are relevant for antigen processing and presentation. Finally, APC mitochondria were found to re-organize towards the APC-T immune synapse.

  10. Cellular scanning strategy for selective laser melting: Generating reliable, optimized scanning paths and processing parameters

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2015-01-01

    … gradients that occur during the process. While process monitoring and control of selective laser melting is an active area of research, establishing the reliability and robustness of the process still remains a challenge. In this paper, a methodology for generating reliable, optimized scanning paths … to generate optimized cellular scanning strategies and processing parameters, with an objective of reducing thermal asymmetries and mechanical deformations. The optimized scanning strategies are used for selective laser melting of the standard samples, and experimental and numerical results are compared.

  11. Challenges and opportunities for informational societies from the present to become reliable learning and knowledge societies

    Directory of Open Access Journals (Sweden)

    Eduardo ROMERO SÁNCHEZ

    2013-12-01

    This article describes the principal social trends and cultural features that prevail today, examines the philosophical foundations of how we think, feel and live, and proposes an educational response suited to the present axiological and cultural reality. In modern Western societies great paradoxes and contradictions coexist: economic growth, technological development and greater freedom, but also rampant consumption, cultural deterioration, technological dependence and one-dimensional thought. Against this background, we discuss both the great possibilities and the serious threats inherent in modern information societies. To come to grips with this reality, we focus the analysis on three key aspects: the impact of the digital revolution, the condition of culture in contemporary society, and the need for a "new education".

  12. Wafer level reliability monitoring strategy of an advanced multi-process CMOS foundry

    NARCIS (Netherlands)

    Scarpa, Andrea; Tao, Guoqiao; Kuper, F.G.

    2000-01-01

    In an advanced multi-process CMOS foundry it is strategically important to make use of an optimum reliability monitoring strategy, in order to be able to run well controlled processes. Philips Semiconductors Business Unit Foundries wafer fab MOS4YOU has developed an end-of-line ultra-fast

  13. Hybrid Adsorption-Membrane Biological Reactors for Improved Performance and Reliability of Perchlorate Removal Processes

    Science.gov (United States)

    2008-12-01

    This study introduces the novel HAMBgR process (Hybrid Adsorption-Membrane Biological Reactor) and … carbon supply for the autotrophic perchlorate-reducing bacteria. The membrane used in the reactor is a hollow-fiber microfiltration membrane made from …

  15. Developing a Science Process Skills Test for Secondary Students: Validity and Reliability Study

    Science.gov (United States)

    Feyzioglu, Burak; Demirdag, Baris; Akyildiz, Murat; Altun, Eralp

    2012-01-01

    Science process skills are claimed to enable individuals to improve their own life visions and provide scientific literacy as a standard of their understanding about the nature of science. The main purpose of this study was to develop a valid, reliable and practical test for measuring Science Process Skills (SPS) in secondary…

  16. Cellular scanning strategy for selective laser melting: Generating reliable, optimized scanning paths and processing parameters

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2015-01-01

    Selective laser melting is yet to become a standardized industrial manufacturing technique. The process continues to suffer from defects such as distortions, residual stresses, localized deformations and warpage caused primarily due to the localized heating, rapid cooling and high temperature...... gradients that occur during the process. While process monitoring and control of selective laser melting is an active area of research, establishing the reliability and robustness of the process still remains a challenge.In this paper, a methodology for generating reliable, optimized scanning paths...... and process parameters for selective laser melting of a standard sample is introduced. The processing of the sample is simulated by sequentially coupling a calibrated 3D pseudo-analytical thermal model with a 3D finite element mechanical model.The optimized processing parameters are subjected to a Monte Carlo...
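
    A minimal sketch of the cellular idea, assuming only that the layer is divided into cells scanned in a randomized order to spread the heat input; the paper's strategies are generated by coupled thermo-mechanical simulation and Monte Carlo analysis, not by this simple shuffle.

```python
import random

def cellular_scan_order(nx, ny, seed=42):
    """Divide the build layer into nx * ny cells and return a randomized
    cell visiting order, so consecutively scanned cells are usually far
    apart and the heat input is spread over the layer."""
    cells = [(i, j) for i in range(nx) for j in range(ny)]
    rng = random.Random(seed)
    rng.shuffle(cells)
    return cells

order = cellular_scan_order(4, 4)
# every cell of the 4 x 4 layer appears exactly once in the scan order
```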

  17. Wind Energy Deployment Process and Siting Tools (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Tegen, S.

    2015-02-01

    Regardless of cost and performance, some wind projects cannot proceed to completion as a result of competing multiple uses or siting considerations. Wind energy siting issues must be better understood and quantified. DOE tasked NREL researchers with depicting the wind energy deployment process and researching development considerations. This presentation provides an overview of these findings and wind siting tools.

  18. THE HISTORY OF THE S PROCESS AND ITS PRESENT STATE

    Directory of Open Access Journals (Sweden)

    Giora Shaviv

    2013-12-01

    We review the history and the present status of the s-process and point to problems in need of clarification. In some cases the difficulty lies in a lack of experimental data; in others, the theory is missing.

  19. Medical image processing on the GPU - past, present and future.

    Science.gov (United States)

    Eklund, Anders; Dufort, Paul; Forsberg, Daniel; LaConte, Stephen M

    2013-12-01

    Graphics processing units (GPUs) are used today in a wide range of applications, mainly because they can dramatically accelerate parallel computing, are affordable and energy efficient. In the field of medical imaging, GPUs are in some cases crucial for enabling practical use of computationally demanding algorithms. This review presents the past and present work on GPU accelerated medical image processing, and is meant to serve as an overview and introduction to existing GPU implementations. The review covers GPU acceleration of basic image processing operations (filtering, interpolation, histogram estimation and distance transforms), the most commonly used algorithms in medical imaging (image registration, image segmentation and image denoising) and algorithms that are specific to individual modalities (CT, PET, SPECT, MRI, fMRI, DTI, ultrasound, optical imaging and microscopy). The review ends by highlighting some future possibilities and challenges.

  20. Dynamic Reliability Analysis Method of Degraded Mechanical Components Based on Process Probability Density Function of Stress

    Directory of Open Access Journals (Sweden)

    Peng Gao

    2014-01-01

    It is necessary to develop dynamic reliability models when considering the strength degradation of mechanical components. Instant probability density function (IPDF) of stress and process probability density function (PPDF) of stress, which are obtained via different statistical methods, are defined, respectively. In practical engineering, the probability density function (PDF) available for the usage of mechanical components is mostly the PPDF, such as the PDF acquired via the rain-flow counting method. For convenience of application, the IPDF is often approximated by the PPDF when using existing dynamic reliability models. However, this approximation may cause errors in the reliability calculation. Therefore, dynamic reliability models based directly on the PPDF of stress are developed in this paper. Furthermore, the proposed models can be used for reliability assessment in the case of a small number of stress process samples by employing fuzzy set theory. In addition, mechanical components in the solar arrays of satellites are chosen as representative examples to illustrate the proposed models. The results show that errors arise from the approximation of the IPDF by the PPDF and that the proposed models are accurate in the reliability computation.
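
    For context, the classical instantaneous stress-strength interference calculation that the paper generalizes can be sketched as follows, assuming normal stress and a linearly degrading normal strength; all parameter values are illustrative, not from the paper.

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def reliability(t, mu_str0=600.0, k=10.0, sd_str=20.0,
                mu_stress=400.0, sd_stress=30.0):
    """Instantaneous reliability P(strength > stress) at time t for a
    component whose mean strength degrades linearly,
    mu_str(t) = mu_str0 - k * t, under normally distributed stress."""
    mu = (mu_str0 - k * t) - mu_stress
    sd = math.hypot(sd_str, sd_stress)
    return phi(mu / sd)

R0, R10 = reliability(0.0), reliability(10.0)
# reliability decreases as the strength margin erodes: R0 > R10
```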

  1. An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.

    Science.gov (United States)

    Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes

    2017-01-09

    This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice.

  2. A Bayesian Framework for Reliability Assessment via Wiener Process and MCMC

    Directory of Open Access Journals (Sweden)

    Huibing Hao

    2014-01-01

    Population and individual reliability assessment are discussed, and a Bayesian framework is proposed to integrate population degradation information with individual degradation data. Unlike fixed-effect Wiener process modeling, the population degradation path is characterized by a random-effect Wiener process, and the model can capture sources of uncertainty including unit-to-unit variation and time-correlated structure. Because the model is complicated and analytically intractable, the Markov chain Monte Carlo (MCMC) method is used to estimate the unknown parameters in the population model. To achieve individual reliability assessment, a Bayesian updating method is exploited, by which the unknown parameters are updated iteratively. Based on the updated results, the residual useful life and reliability evaluation are obtained. A laser data example is given to demonstrate the usefulness and validity of the proposed model and method.
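
    A much-simplified plug-in analogue of this degradation model can be sketched with a fixed-effect Wiener process: estimate drift and diffusion from observed increments, then evaluate the inverse-Gaussian first-passage survival function. The paper's Bayesian random-effect treatment with MCMC is substantially richer; everything below (data and parameters) is invented for illustration.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Simulated degradation increments of one unit: Wiener process with
# drift mu and diffusion sigma over unit time steps (values invented).
mu_true, sigma_true, dt = 0.5, 0.2, 1.0
incs = rng.normal(mu_true * dt, sigma_true * math.sqrt(dt), size=50)

# Plug-in point estimates from the observed increments.
mu_hat = incs.mean() / dt
sigma_hat = incs.std(ddof=1) / math.sqrt(dt)

def phi(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def survival(t, dist, mu, sigma):
    """P(first passage over a threshold `dist` units away occurs after
    t): inverse-Gaussian survival function of a drifted Wiener process."""
    a = (dist - mu * t) / (sigma * math.sqrt(t))
    b = (dist + mu * t) / (sigma * math.sqrt(t))
    return phi(a) - math.exp(2.0 * mu * dist / sigma ** 2) * phi(-b)

# Probability the unit survives 5 more time units before degrading a
# further 5.0 units to its failure threshold.
R5 = survival(5.0, dist=5.0, mu=mu_hat, sigma=sigma_hat)
```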

  3. The Impact of Process Capability on Service Reliability for Critical Infrastructure Providers

    Science.gov (United States)

    Houston, Clemith J., Jr.

    2013-01-01

    This study investigated the relationship between organizational processes that have been identified as promoting resiliency and their impact on service reliability within the scope of critical infrastructure providers. The importance of critical infrastructure to the nation is evident from the body of research and is supported by instances where…

  4. Optimizing the processing and presentation of PPCR imaging

    Science.gov (United States)

    Davies, Andrew G.; Cowen, Arnold R.; Parkin, Geoff J. S.; Bury, Robert F.

    1996-03-01

    Photostimulable phosphor computed radiography (CR) is becoming an increasingly popular image acquisition technology. The acceptability of this technique, diagnostically, ergonomically and economically, is highly influenced by the method by which the image data are presented to the user. Traditional CR systems utilize an 11 in by 14 in film hardcopy format and can place two images per exposure onto this film, which does not correspond to the sizes and presentations provided by conventional techniques. It is also the authors' experience that the image enhancement algorithms provided by traditional CR systems do not provide optimal image presentation. An alternative image enhancement algorithm was therefore developed, along with a number of hardcopy formats designed to match the requirements of the image reporting process. The new image enhancement algorithm, called dynamic range reduction (DRR), is designed to provide a single presentation per exposure, maintaining the appearance of a conventional radiograph while optimizing the rendition of diagnostically relevant features within the image. The algorithm was developed on a Sun SPARCstation, but later ported to a Philips EasyVisionRAD workstation. Print formats were developed on the EasyVision to improve the acceptability of the CR hardcopy. For example, for mammographic examinations, four mammograms (a cranio-caudal and a medio-lateral view of each breast) are taken for each patient, with all images placed onto a single sheet of 14 in by 17 in film. The new composite format provides a more suitable image presentation for reporting and is more economical to produce. It is the use of enhanced image processing and presentation which has enabled all mammography undertaken within the general infirmary to be performed using the CR/EasyVisionRAD DRR/3M 969 combination, without recourse to conventional film/screen mammography.
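
    The published DRR algorithm itself is not reproduced in the abstract, but the general base/detail decomposition idea behind dynamic range reduction can be sketched as follows: extract a low-frequency base layer, compress it, and recombine it with the detail residual. All parameters and the test image are invented.

```python
import numpy as np

def box_blur(img, k):
    """Separable k x k box blur with edge-replicate padding (k odd)."""
    pad = k // 2
    kern = np.ones(k) / k
    def blur1d(a):
        return np.convolve(np.pad(a, pad, mode="edge"), kern, mode="valid")
    tmp = np.apply_along_axis(blur1d, 1, img)   # blur rows
    return np.apply_along_axis(blur1d, 0, tmp)  # then columns

def drr(image, k=15, compression=0.5):
    """Toy dynamic range reduction: split the image into a low-frequency
    base and a detail residual, compress the base, and recombine."""
    img = image.astype(float)
    base = box_blur(img, k)
    return compression * base + (img - base)

# A synthetic wide-range ramp "radiograph" (invented data)
img = np.linspace(0.0, 1000.0, 64)[None, :].repeat(64, axis=0)
out = drr(img)
# the low-frequency dynamic range shrinks while local detail is preserved
```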

  5. Reliable and Efficient Parallel Processing Algorithms and Architectures for Modern Signal Processing. Ph.D. Thesis

    Science.gov (United States)

    Liu, Kuojuey Ray

    1990-01-01

    Least-squares (LS) estimation and spectral decomposition algorithms constitute the heart of modern signal processing and communication problems. Implementations of recursive LS and spectral decomposition algorithms on parallel processing architectures such as systolic arrays, with efficient fault-tolerant schemes, are the major concerns of this dissertation. There are four major results. First, we propose the systolic block Householder transformation with application to recursive least-squares minimization. It is successfully implemented on a systolic array with a two-level pipelined implementation at the vector level as well as at the word level. Second, a real-time algorithm-based concurrent error detection scheme based on the residual method is proposed for the QRD RLS systolic array. Fault diagnosis, order-degraded reconfiguration, and performance analysis are also considered. Third, the dynamic range, stability, error detection capability under finite-precision implementation, order-degraded performance, and residual estimation under faulty situations for the QRD RLS systolic array are studied in detail. Finally, we propose the use of multi-phase systolic algorithms for spectral decomposition based on the QR algorithm. Two systolic architectures, one based on a triangular array and another based on a rectangular array, are presented for the multi-phase operations with fault-tolerant considerations. Eigenvectors and singular vectors can be easily obtained by using the multi-phase operations. Performance issues are also considered.

  6. Personality and persona: personality processes in self-presentation.

    Science.gov (United States)

    Leary, Mark R; Allen, Ashley Batts

    2011-12-01

    This article examines the role that personality variables and processes play in people's efforts to manage their public images. Although most research on self-presentation has focused on situational influences, people differ greatly in the degree to which they care about others' impressions of them, the types of impressions they try to convey, and their evaluations of their self-presentational effectiveness. Personality constructs such as public self-consciousness, approval motivation, and fear of negative evaluation are associated with the motive to manage one's impressions, and people who differ in self-disclosure and desire for privacy differentially reveal information about themselves to others. Other variables relating to people's self-concepts, interpersonal goals, and traits influence the construction of specific images. Finally, the extent to which people believe they are capable of making desired impressions influences their impression management strategies and how they respond to other people's evaluations.

  7. How to use an optimization-based method capable of balancing safety, reliability, and weight in an aircraft design process

    Energy Technology Data Exchange (ETDEWEB)

    Johansson, Cristina [Mendeley, Broderna Ugglasgatan, Linkoping (Sweden); Derelov, Micael; Olvander, Johan [Linkoping University, IEI, Dept. of Machine Design, Linkoping (Sweden)

    2017-03-15

    In order to help decision-makers in the early design phase to improve and make more cost-efficient system safety and reliability baselines of aircraft design concepts, a method (Multi-objective Optimization for Safety and Reliability Trade-off) that is able to handle trade-offs such as system safety, system reliability, and other characteristics, for instance weight and cost, is used. Multi-objective Optimization for Safety and Reliability Trade-off has been developed and implemented at SAAB Aeronautics. The aim of this paper is to demonstrate how the implemented method might work to aid the selection of optimal design alternatives. The method is a three-step method: step 1 involves the modelling of each considered target, step 2 is optimization, and step 3 is the visualization and selection of results (results processing). The analysis is performed within Architecture Design and Preliminary Design steps, according to the company's Product Development Process. The lessons learned regarding the use of the implemented trade-off method in the three cases are presented. The results are a handful of solutions, a basis to aid in the selection of a design alternative. While the implementation of the trade-off method is performed for companies, there is nothing to prevent adapting this method, with minimal modifications, for use in other industrial applications.
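
    Step 3, selecting among optimized alternatives, ultimately amounts to comparing candidate designs across objectives such as failure probability, weight and cost. A minimal non-dominated (Pareto) filter, with made-up candidate values rather than the paper's tooling, might look like:

```python
def dominates(a, b):
    """True if design a is no worse than b in every objective and
    strictly better in at least one (all objectives are minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Keep only the non-dominated design alternatives. `designs` maps
    a name to an objective tuple (failure_prob, weight_kg, cost)."""
    return {
        name: obj for name, obj in designs.items()
        if not any(dominates(other, obj)
                   for other_name, other in designs.items()
                   if other_name != name)
    }

# Invented concept alternatives, not data from the paper:
candidates = {
    "A": (1e-6, 120.0, 3.0),
    "B": (5e-7, 150.0, 3.5),   # safest, but heaviest and most costly
    "C": (2e-6, 130.0, 3.2),   # worse than A in every objective
}
front = pareto_front(candidates)
# only A and B survive; C is dominated by A
```

    The surviving set is the "handful of solutions" from which a decision-maker then picks, weighing safety against weight and cost.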

  8. How to Use an Optimization-Based Method Capable of Balancing Safety, Reliability, and Weight in an Aircraft Design Process

    Directory of Open Access Journals (Sweden)

    Cristina Johansson

    2017-03-01

    In order to help decision-makers in the early design phase to improve and make more cost-efficient system safety and reliability baselines of aircraft design concepts, a method (Multi-objective Optimization for Safety and Reliability Trade-off) that is able to handle trade-offs such as system safety, system reliability, and other characteristics, for instance weight and cost, is used. Multi-objective Optimization for Safety and Reliability Trade-off has been developed and implemented at SAAB Aeronautics. The aim of this paper is to demonstrate how the implemented method might work to aid the selection of optimal design alternatives. The method is a three-step method: step 1 involves the modelling of each considered target, step 2 is optimization, and step 3 is the visualization and selection of results (results processing). The analysis is performed within the Architecture Design and Preliminary Design steps, according to the company's Product Development Process. The lessons learned regarding the use of the implemented trade-off method in the three cases are presented. The results are a handful of solutions, a basis to aid in the selection of a design alternative. While the implementation of the trade-off method is performed for companies, there is nothing to prevent adapting this method, with minimal modifications, for use in other industrial applications.

  9. Harmonization process and reliability assessment of anthropometric measurements in the elderly EXERNET multi-centre study.

    Directory of Open Access Journals (Sweden)

    Alba Gómez-Cabello

    BACKGROUND: The elderly EXERNET multi-centre study aims to collect normative anthropometric data for old functionally independent adults living in Spain. PURPOSE: To describe the standardization process and reliability of the anthropometric measurements carried out in the pilot study and during the final workshop, examining both intra- and inter-rater errors for measurements. MATERIALS AND METHODS: A total of 98 elderly people from five different regions participated in the intra-rater error assessment, and 10 different seniors living in the city of Toledo (Spain) participated in the inter-rater assessment. We examined both intra- and inter-rater errors for heights and circumferences. RESULTS: For height, intra-rater technical errors of measurement (TEMs) were smaller than 0.25 cm. For circumferences and knee height, TEMs were smaller than 1 cm, except for waist circumference in the city of Cáceres. Reliability for heights and circumferences was greater than 98% in all cases. Inter-rater TEMs were 0.61 cm for height, 0.75 cm for knee height, and ranged between 2.70 and 3.09 cm for the circumferences measured. Inter-rater reliabilities for anthropometric measurements were always higher than 90%. CONCLUSION: The harmonization process, including the workshop and pilot study, guarantees the quality of the anthropometric measurements in the elderly EXERNET multi-centre study. High reliability and low TEMs may be expected when assessing anthropometry in elderly populations.
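The intra-rater statistics named in this record follow standard anthropometric quality-assurance formulas: TEM = sqrt(Σd²/2n) over paired repeated measurements, and the coefficient of reliability R = 1 − TEM²/SD². A minimal sketch, using invented example data rather than the EXERNET measurements:

```python
# Technical error of measurement (TEM) and coefficient of reliability for
# duplicate anthropometric measurements. Example pairs are invented data.
import math
from statistics import pvariance

def tem(pairs):
    """TEM = sqrt(sum of squared intra-pair differences / (2 n))."""
    n = len(pairs)
    return math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * n))

def reliability(pairs):
    """R = 1 - TEM^2 / SD^2, with SD^2 the variance of all 2n measurements
    (one common convention; other variants exist)."""
    flat = [x for pair in pairs for x in pair]
    return 1 - tem(pairs) ** 2 / pvariance(flat)

# duplicate height measurements (cm) on three subjects
heights = [(170.0, 170.4), (165.2, 165.0), (158.8, 159.2)]
```

With these numbers the TEM is about 0.24 cm and the reliability above 99%, on the order of the height figures reported in the abstract.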

  10. An approach for the condensed presentation of intuitive citation impact metrics which remain reliable with very few publications

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, D.; Tippett, Ch.; Côté, G.; Roberge, G.; Archambault, E.

    2016-07-01

    An approach is presented for displaying citation data in a condensed and intuitive manner that allows for reliable interpretation by policy analysts even in cases where the number of peer-reviewed publications produced by a given entity remains small. The approach is described using country-level data in Agronomy & Agriculture (2004–2013), an area of specialisation for many developing countries with a small output size. Four citation impact metrics, and a synthesis graph that we call the distributional micro-charts of relative citation counts, are considered in building our “preferred” presentation layout. These metrics include two indicators that have long been used by Science-Metrix in its bibliometric reports, the Average of Relative Citations (ARC) and the percentage of publications among the 10% most cited publications in the database (HCP), as well as two newer metrics, the Median of Relative Citations (MRC) and the Relative Integration Score (RIS). The findings reveal that the proposed approach, combining the MRC and HCP with the distributional micro-charts, characterizes the citation impact of entities more effectively, in terms of central location, density of the upper citation tail and overall distribution, than Science-Metrix's former approach based on the ARC and HCP. This is especially true of cases with small population sizes, where a strong presence of outliers (denoted by strong HCP scores) can have a significant effect on the central location of the citation data when it is estimated with an average. (Author)
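The ARC/MRC/HCP metrics can be sketched directly: each paper's citation count is normalized by the world average for its field and year, then averaged (ARC) or taken as the median (MRC), while HCP is the share of papers at or above the world top-10% threshold. A hedged sketch, with the baseline and counts as invented inputs:

```python
# ARC, MRC and HCP from raw citation counts. The world-average baseline and
# the top-10% threshold are assumed inputs (normally derived per field/year).
from statistics import mean, median

def citation_metrics(paper_cites, world_avg, top10_threshold):
    """ARC/MRC: mean/median of citations normalized by the world average;
    HCP: share of papers at or above the world top-10% citation threshold."""
    rc = [c / world_avg for c in paper_cites]
    arc = mean(rc)
    mrc = median(rc)
    hcp = sum(c >= top10_threshold for c in paper_cites) / len(paper_cites)
    return arc, mrc, hcp

# one highly cited outlier drags the ARC far above the MRC
arc, mrc, hcp = citation_metrics([0, 1, 2, 3, 40], world_avg=4, top10_threshold=12)
```

With these invented counts the outlier pushes the ARC to 2.3 while the MRC stays at 0.5, illustrating why the paper prefers the median for small, skewed samples.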

  11. Modeling Parameters of Reliability of Technological Processes of Hydrocarbon Pipeline Transportation

    Directory of Open Access Journals (Sweden)

    Shalay Viktor

    2016-01-01

    On the basis of system analysis methods and parametric reliability theory, mathematical modelling of the operation of oil and gas equipment was conducted for reliability monitoring, using dispatching data. To check the goodness of fit of the empirical distributions, an algorithm and mathematical methods of analysis were worked out for on-line use under changing operating conditions. The physical cause-and-effect mechanism linking the key factors and the changing parameters of the technical systems of oil and gas facilities is analysed, and the basic types of distribution parameters are defined. The adequacy of the distribution type assumed for the analysed parameters is evaluated using the Kolmogorov criterion, as the most universal, accurate and adequate test for the distributions of continuous processes in complex multi-component technical systems. Calculation methods are provided for supervision by independent bodies responsible for risk assessment and facility safety.
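The Kolmogorov goodness-of-fit check described here compares the empirical CDF of observed data against a candidate distribution. A self-contained sketch (the failure times and the exponential candidate are invented; a real analysis would estimate parameters more carefully and use exact critical values):

```python
# One-sample Kolmogorov statistic D = sup |F_n(x) - F(x)| for checking an
# assumed distribution of times between failures. Data are invented.
import math

def ks_statistic(sample, cdf):
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        fx = cdf(x)
        # empirical CDF jumps from (i-1)/n to i/n at x
        d = max(d, abs(i / n - fx), abs((i - 1) / n - fx))
    return d

times = [12.0, 35.0, 8.0, 51.0, 22.0, 40.0, 15.0, 29.0]  # hours between failures
rate = len(times) / sum(times)                            # exponential MLE
d = ks_statistic(times, lambda x: 1.0 - math.exp(-rate * x))
# compare d against a tabulated critical value (about 0.454 at alpha=0.05, n=8)
```

Since d stays well below the critical value here, the exponential model would not be rejected for this toy sample.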

  12. An enhanced reliability-oriented workforce planning model for process industry using combined fuzzy goal programming and differential evolution approach

    Science.gov (United States)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2017-08-01

    This paper draws on the "human reliability" concept as a structure for gaining insight into maintenance workforce assessment in a process industry. Human reliability hinges on developing the reliability of humans to a threshold that guides the maintenance workforce to execute accurate decisions within the limits of resource and time allocations. This concept offers a worthwhile point of departure for three elegant adjustments to an existing literature model in terms of maintenance time, workforce performance and return on workforce investment. The presented structure breaks new ground in maintenance workforce theory and practice from a number of perspectives. First, we have, for the first time, successfully implemented fuzzy goal programming (FGP) and differential evolution (DE) techniques for the solution of an optimisation problem in the maintenance of a process plant. The results obtained in this work showed better solution quality from the DE algorithm compared with the genetic algorithm and the particle swarm optimisation algorithm, demonstrating the superiority of the proposed procedure. Second, the analytical discourse, framed on stochastic theory and focused on a specific application to a process plant in Nigeria, is a novelty. The work provides more insight into maintenance workforce planning during overhaul rework and overtime maintenance activities in manufacturing systems, and demonstrates a capacity for generating substantially helpful information for practice.
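Differential evolution itself is compact enough to sketch. The following minimal DE/rand/1/bin loop minimizes a toy objective; it is not the paper's FGP formulation, and all parameter values are illustrative defaults:

```python
# Minimal DE/rand/1/bin differential evolution minimizing a toy sphere
# function. Parameter values (F, CR, population size) are common defaults.
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=1):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct individuals other than i
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    lo, hi = bounds[j]
                    trial.append(min(max(a[j] + F * (b[j] - c[j]), lo), hi))
                else:
                    trial.append(pop[i][j])
            if f(trial) <= f(pop[i]):      # greedy selection
                pop[i] = trial
    return min(pop, key=f)

best = differential_evolution(lambda x: sum(v * v for v in x), [(-5, 5)] * 3)
```

In the paper's setting the objective would be the fuzzified workforce goals rather than this sphere function; the mutation/crossover/selection skeleton is the same.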

  13. Integrating Data Sources for Process Sustainability Assessments (presentation)

    Science.gov (United States)

    To perform a chemical process sustainability assessment requires significant data about chemicals, process design specifications, and operating conditions. The required information includes the identity of the chemicals used, the quantities of the chemicals within the context of ...

  14. THE RELIABILITY AND ACCURACY OF THE TRIPLE MEASUREMENTS OF ANALOG PROCESS VARIABLES

    Directory of Open Access Journals (Sweden)

    V. A. Anishchenko

    2017-01-01

    The increase in the unit capacity of electrical equipment, together with the growing complexity of technological processes and of the devices that control and manage them in power plants and substations, demonstrates the need to improve the reliability and accuracy of the measurement information characterizing the state of the managed objects. This objective is particularly important for nuclear power plants, where the price of inaccurate measurement of critical process variables is especially high and an error might lead to irreparable consequences. Along with improvements to the element base, the reliability and accuracy of measurements are improved by methods of operational validation. These methods are based on the use of information redundancy (structural, topological, temporal). In particular, information redundancy can be achieved by the simultaneous measurement of one analog variable by two devices (duplication) or three devices (triplication, i.e. triple redundancy). The problem of operational control of a triple-redundant system for measuring electrical analog variables (currents, voltages, active and reactive power and energy) is considered as a special case of signal processing by ordered sampling on the basis of the majority transformation and transformations close to it. Monitoring the reliability of measurements involves two tasks. First, one needs to justify the degree of truncation of the distributions of random measurement errors and the allowable residuals of the pairwise differences of the measurement results. The second task consists in forming an algorithm for the joint processing of the set of individual measurements determined to be valid. The quality of control is characterized by the reliability (here a synonym of validity) and the accuracy of the measuring system. Taken separately, these indicators might lead to opposite results; a compromise solution is therefore proposed.
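A triple-redundant selection scheme of the kind discussed can be sketched as follows; the pairwise-agreement tolerance and the median-based selection rule are illustrative assumptions, not the authors' exact majority transformation:

```python
# Tolerance-based selection for a triplicated analog measurement: keep the
# channels that agree with at least one other channel within tol, then take
# the median of the survivors. The scheme and tolerance are illustrative.

def vote_triplicated(m1, m2, m3, tol):
    readings = [m1, m2, m3]
    # a reading is valid if it agrees (within tol) with itself and >= 1 peer
    valid = [x for x in readings
             if sum(abs(x - y) <= tol for y in readings) >= 2]
    if not valid:
        raise ValueError("no two channels agree within tolerance")
    return sorted(valid)[len(valid) // 2]
```

With all three channels healthy the middle value is returned; a single drifting channel is excluded from the valid set before selection, which is the essence of the majority-style processing the abstract describes.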

  15. LED Lighting System Reliability Modeling and Inference via Random Effects Gamma Process and Copula Function

    Directory of Open Access Journals (Sweden)

    Huibing Hao

    2015-01-01

    Light emitting diode (LED) lamps have attracted increasing interest in the field of lighting systems due to their low energy consumption and long lifetime. Serving different functions (i.e., illumination and color), a lamp may have two or more performance characteristics. When the multiple performance characteristics are dependent, accurately analyzing the system reliability becomes a challenging problem. In this paper, we assume that the system has two performance characteristics and that each is governed by a random-effects Gamma process, where the random effects capture unit-to-unit differences. The dependency of the performance characteristics is described by a Frank copula function, via which the reliability assessment model is proposed. Since the model is complicated and analytically intractable, the Markov chain Monte Carlo (MCMC) method is used to estimate the unknown parameters. A numerical example based on actual LED lamp data demonstrates the usefulness and validity of the proposed model and method.
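A gamma-process degradation model like the one described can be explored by Monte Carlo: simulate increasing degradation paths from gamma-distributed increments and count the units that stay below a failure threshold. This sketch omits the paper's random effects and Frank copula, and all parameter values are invented:

```python
# Monte Carlo reliability of a simple gamma-process degradation model: each
# unit accumulates Gamma(shape_rate*dt, scale) increments and fails when its
# degradation reaches the threshold. All parameter values are invented.
import random

def reliability_at(t, shape_rate, scale, threshold,
                   n_units=2000, n_steps=50, seed=7):
    rng = random.Random(seed)
    dt = t / n_steps
    survived = 0
    for _ in range(n_units):
        x = 0.0
        for _ in range(n_steps):
            x += rng.gammavariate(shape_rate * dt, scale)
            if x >= threshold:
                break                      # failed before t
        else:
            survived += 1
    return survived / n_units
```

Because a gamma process is monotonically increasing, first passage of the threshold coincides with the terminal value exceeding it, so this count directly estimates R(t); reliability falls as the mean degradation approaches the threshold.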

  16. On the reliability of Shewhart-type control charts for multivariate process variability

    Science.gov (United States)

    Djauhari, Maman A.; Salleh, Rohayu Mohd; Zolkeply, Zunnaaim; Li, Lee Siaw

    2017-05-01

    We show that in the current practice of multivariate process variability monitoring, the reliability of Shewhart-type control charts cannot be measured except when the sub-group size n tends to infinity. However, the requirement of large n is impractical not only in the manufacturing industry, where n is small, but also in the service industry, where n is moderate. In this paper, we introduce a new definition of control limits for the two most appreciated control charts in the literature, i.e., the improved generalized variance chart (IGV-chart) and the vector variance chart (VV-chart). With the new definition of control limits, the reliability of the control charts can be determined. Some important properties of the new control limits are derived and a computational technique for the probability of a false alarm is presented.

  17. Auditory Processing Theories of Language Disorders: Past, Present, and Future

    Science.gov (United States)

    Miller, Carol A.

    2011-01-01

    Purpose: The purpose of this article is to provide information that will assist readers in understanding and interpreting research literature on the role of auditory processing in communication disorders. Method: A narrative review was used to summarize and synthesize the literature on auditory processing deficits in children with auditory…

  18. Past, Present and Future of the Innovation Process

    Directory of Open Access Journals (Sweden)

    Ondřej Žižlavský

    2013-09-01

    management control of innovation performance under the postdoc research project “Innovation Process Performance Assessment: a Management Control System Approach in the Czech Small and Medium-sized Enterprises” No. 13- 20123P of the Czech Science Foundation.

  19. Presentation

    Directory of Open Access Journals (Sweden)

    Eduardo Vicente

    2013-06-01

    In the present edition of Significação – Scientific Journal for Audiovisual Culture, and in the ones to follow, something new is brought: the presence of thematic dossiers organized by invited scholars. The appointed subject for the very first one was Radio, and the invited scholar was Eduardo Vicente, professor at the Graduate Course in Audiovisual and at the Postgraduate Program in Audiovisual Media and Processes of the School of Communication and Arts of the University of São Paulo (ECA-USP). Entitled Radio Beyond Borders, the dossier gathers six articles with the intention of reuniting works on the perspectives of usage of such media as much as on the new possibilities of aesthetic experimentation being built up for it, especially considering the new digital technologies and technological convergences. It also intends to present works with original theoretical approaches and original reflections able to reset the way we look at what is today already a centennial medium. Having broadened the meaning of “beyond borders”, four foreign authors were invited to join the dossier. This is the first time they are being published in this country, and so, in all cases, the articles were either written in or translated into Portuguese. The dossier begins with “Radio is dead… Long live the sound”, the transcription of a thought-provoking lecture given by Armand Balsebre (Autonomous University of Barcelona), one of the most influential authors in the world in the field of radio studies. It addresses the challenges such media must face so that it can become “a new sound media, in the context of a new soundscape or sound-sphere, for the new listeners”. Andrew Dubber (Birmingham City University), regarding the challenges posed by the Digital Era, argues for a theoretical approach in radio studies that considers a Media Ecology. The author understands the form and discourse of radio as a negotiation of affordances and

  20. CIGS Material and Device Stability: A Processing Perspective (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Ramanathan, K.

    2012-03-01

    This is a general overview of CIGS material and device fundamentals. In the first part, the basic features of high-efficiency CIGS absorbers and devices are described. In the second part, some examples from a previous collaboration with Shell Solar on CIGSS graded absorbers and devices are shown to illustrate how process information was used to correct deviations and improve performance and stability.

  1. Reliable knowledge discovery

    CERN Document Server

    Dai, Honghua; Smirnov, Evgueni

    2012-01-01

    Reliable Knowledge Discovery focuses on theory, methods, and techniques for RKDD, a new sub-field of KDD. It studies the theory and methods to assure the reliability and trustworthiness of discovered knowledge and to maintain the stability and consistency of knowledge discovery processes. RKDD has a broad spectrum of applications, especially in critical domains like medicine, finance, and the military. Reliable Knowledge Discovery also presents methods and techniques for designing robust knowledge-discovery processes. Approaches to assessing the reliability of the discovered knowledge are introduced.

  2. A highly reliable, autonomous data communication subsystem for an advanced information processing system

    Science.gov (United States)

    Nagle, Gail; Masotto, Thomas; Alger, Linda

    1990-01-01

    The need to meet the stringent performance and reliability requirements of advanced avionics systems has frequently led to implementations which are tailored to a specific application and are therefore difficult to modify or extend. Furthermore, many integrated flight-critical systems are input/output intensive. By using a design methodology which customizes the input/output mechanism for each new application, the cost of implementing new systems becomes prohibitively expensive. One solution to this dilemma is to design computer systems and input/output subsystems which are general purpose, but which can be easily configured to support the needs of a specific application. The Advanced Information Processing System (AIPS), currently under development, has these characteristics. The design and implementation of the prototype I/O communication system for AIPS are described. AIPS addresses reliability issues related to data communications by the use of reconfigurable I/O networks. When a fault or damage event occurs, communication is restored to the functioning parts of the network and the failed or damaged components are isolated. Performance issues are addressed by using a parallelized computer architecture which decouples input/output (I/O) redundancy management and I/O processing from the computational stream of an application. The autonomous nature of the system derives from the highly automated and independent manner in which I/O transactions are conducted for the application, as well as from the fact that the hardware redundancy management is entirely transparent to the application.

  3. Keynote presentation: Project Management, Technology and Evolving Work Processes

    DEFF Research Database (Denmark)

    Kampf, Constance Elizabeth

    Bridging the classroom and workplace is a challenge in the Project Management classroom because students rarely have the opportunity or the experience needed to head up large projects. So how can instructors present the opportunity to develop skills and gain experience needed to understand project management in a classroom setting? To begin to answer this question, the presentation describes three key strategies used in a Project Management course developed from a communications perspective in the International Bachelor Program in Marketing and Management Communication, Business & Social Sciences… Management regardless of the technology used to support it…

  4. Reliability improvements and cost reductions through the innovative use of materials and processes

    Energy Technology Data Exchange (ETDEWEB)

    Khan, A.; Reid, D.; Chiovelli, S. [Syncrude Canada Ltd., Edmonton, AB (Canada)

    1998-09-01

    Syncrude has an annual maintenance budget of about $350 million. Most repairs are driven by material degradation which in most cases is predictable. One way to reduce repair costs is to address the damage mechanisms which result in the required maintenance. This can be done through emerging technologies such as: (1) bi-metallic and composite castings for wear resistance in double roll crushers and feeder breaker wear components, (2) new manufacturing processes for hydrotransport screens, (3) structural repairs through the use of fiber composite materials, (4) improved pump life through surface modifications and coatings, and (5) a user friendly materials database. A series of case studies are described to show how maintenance costs can be reduced and reliability improved through the innovative use of materials and processes. 3 refs., 4 tabs., 19 figs.

  5. Tunable high-refractive index hybrid for solution-processed light management devices (Conference Presentation)

    Science.gov (United States)

    Bachevillier, Stefan

    2016-10-01

    After the use of highly efficient but expensive inorganic optical materials, solution-processable polymers and hybrids have drawn more and more interest. Our group has recently developed a novel polymer-based hybrid optical material from titanium oxide hydrate exhibiting an outstanding set of optical and material properties. Firstly, its low cost, processability and cross-linked state are particularly attractive for many applications. Moreover, a high refractive index can be repeatedly achieved while optical losses stay considerably low over the entire visible and near-infrared wavelength regime. Indeed, the formation of the inorganic nanoparticles usually present in nanocomposites is avoided by a specific formulation process. Even more remarkably, the refractive index can be tuned by changing the inorganic content, using different titanium precursors, or via a low-temperature curing process. A part of our work is focused on the reliable optical characterization of these properties; in particular, a microscope-based setup allowing in-situ measurement and sample mapping has been developed. Our efforts are also concentrated on various applications of these exceptional properties. This hybrid material is tailored for photonic devices, with a specific emphasis on the production of highly efficient solution-processable Distributed Bragg Reflectors (DBRs) and anti-reflection coatings. Furthermore, waveguides can be fabricated from thin films along with in-coupling and out-coupling structures. These light management structures are particularly suited to organic photovoltaic cells (OPVs) and organic light emitting diodes (OLEDs).

  6. Keynote presentation: Project Management, Technology and Evolving Work Processes

    DEFF Research Database (Denmark)

    Kampf, Constance Elizabeth

    Bridging the classroom and workplace is a challenge in the Project Management classroom because students rarely have the opportunity or the experience needed to head up large projects. So how can instructors present the opportunity to develop skills and gain experience needed to understand project management in a classroom setting? To begin to answer this question, the presentation describes three key strategies used in a Project Management course developed from a communications perspective in the International Bachelor Program in Marketing and Management Communication, Business & Social Sciences, Aarhus University. First, this Project Management course involves continual, ongoing development through working with real clients. To learn about project management from a communications perspective, learners were asked to work in teams for project conception and planning. To communicate with the client…

  7. An application of modulated poisson processes to the reliability analysis of repairable systems

    Energy Technology Data Exchange (ETDEWEB)

    Saldanha, Pedro L.C. [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil). Coordenacao de Reatores]. E-mail: saldanha@cnen.gov.br; Melo, P.F. Frutuoso e [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear]. E-mail: frutuoso@con.ufrj.br; Noriega, Hector C. [Universidad Austral de Chile (UACh), Valdivia (Chile). Faculdad de Ciencias de la Ingeniaria]. E-mail: hnoriega@uach.cl

    2005-07-01

    This paper discusses the application of the modulated power law process (MPLP) model to the rate of occurrence of failures of active repairable systems in reliability engineering. Traditionally, repairable systems are modelled, as far as maintenance policies are concerned, in two ways: a pessimistic approach (the non-homogeneous Poisson process, NHPP) and a very optimistic approach (renewal processes, RP). It is important to build a generalized model that considers the characteristics and properties of both the NHPP and the RP models as particular cases. In practice, considering the pattern of times between failures, the MPLP appears more realistic for representing the occurrence of failures of repairable systems, in order to decide whether they should be modelled by a homogeneous or a non-homogeneous process. The study has shown that the model can be used to make decisions concerning the evaluation of the qualified life of plant equipment. By controlling and monitoring two of the three parameters of the MPLP model during equipment operation, it is possible to check whether and how the equipment is following the basis of its qualification process, and so identify how the effects of time, degradation and operating modes influence equipment performance. The discussion is illustrated by an application to the service water pumps of a typical PWR plant. (author)
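The power-law (NHPP) backbone of the MPLP is easy to simulate by inverting the cumulative intensity Λ(t) = (t/θ)^β. A sketch with invented parameters (the MPLP itself additionally modulates this process with a renewal component, which is omitted here):

```python
# Simulate failure times of a power-law NHPP with cumulative intensity
# (t/theta)**beta by inverting a unit-rate Poisson process. beta > 1 models
# deterioration, beta < 1 improvement. Parameters here are invented.
import random

def power_law_failures(beta, theta, horizon, seed=3):
    rng = random.Random(seed)
    times, e = [], 0.0
    while True:
        e += rng.expovariate(1.0)          # unit-rate Poisson arrivals
        t = theta * e ** (1.0 / beta)      # invert Lambda(t) = (t/theta)**beta
        if t > horizon:
            return times
        times.append(t)

# expected count over [0, 1000] with beta=2, theta=100 is (1000/100)**2 = 100
failures = power_law_failures(beta=2.0, theta=100.0, horizon=1000.0)
```

Comparing simulated inter-failure patterns like this against observed data is one simple way to judge whether a homogeneous or non-homogeneous model is appropriate, the decision the abstract describes.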

  8. Waste container weighing data processing to create reliable information of household waste generation.

    Science.gov (United States)

    Korhonen, Pirjo; Kaila, Juha

    2015-05-01

    Household mixed waste container weighing data was processed with knowledge discovery and data mining techniques to create reliable information on household waste generation. The final data set included 27,865 weight measurements covering the whole year 2013, selected from a database of the Helsinki Region Environmental Services Authority, Finland. The data set contains mixed household waste arising in 6 m³ containers and was processed by identifying missing values and inconsistently low and high values as errors. The share of missing values and errors in the data set was 0.6%, which provides evidence that the waste weighing data gives reliable information on mixed waste generation at the collection-point level. Mixed household waste arising at the collection-point level is characterized by wide variation between pickups. The seasonal variation pattern, a result of collective similarities in household behaviour, was clearly detected by smoothed medians of the waste weight time series. The evaluation of the collection time series against the defined distribution range of pickup weights at the collection-point level shows that 65% of the pickups were from collection points with optimally dimensioned container capacity; collection points with over- and under-dimensioned container capacities were noted in 9.5% and 3.4% of all pickups, respectively. Occasional extra waste in containers occurred in 21.2% of the pickups, indicating the irregular behaviour of individual households. The results of this analysis show that processing waste weighing data using knowledge discovery and data mining techniques provides trustworthy information on household waste generation and its variations.
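The error-screening and median-smoothing steps described can be sketched in a few lines; the plausibility thresholds and window size below are invented for illustration, not the study's values:

```python
# Screening weighing data for errors and smoothing pickup weights with a
# running median. The plausibility limits and window size are invented.

def clean_weights(weights, lo=20.0, hi=600.0):
    """Drop missing values (None) and implausibly low/high pickup weights;
    return the valid values and the share flagged as missing or erroneous."""
    valid = [w for w in weights if w is not None and lo <= w <= hi]
    return valid, 1 - len(valid) / len(weights)

def smoothed_median(series, window=5):
    """Centered running median; reveals seasonal structure despite the wide
    pickup-to-pickup variation."""
    half = window // 2
    out = []
    for i in range(len(series)):
        w = sorted(series[max(0, i - half): i + half + 1])
        out.append(w[len(w) // 2])
    return out

valid, error_share = clean_weights([100.0, None, 5.0, 250.0, 900.0, 180.0])
```

The running median is the natural smoother here because, unlike a moving average, it is barely affected by the occasional extra-heavy pickup the study reports.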

  9. Variables Refinery Process to Ensure Research Unbiasedness (Validity) and Invariance (Reliability) in Agricultural Extension and Education

    Directory of Open Access Journals (Sweden)

    Iraj M. Mohammadi

    2008-01-01

    Involving many variables, usually uncertain and discrete, lowers the level of consistency and certainty in complex human studies like agricultural extension education. This research, part of a study entitled “Side by side situational analysis of current versus future situations of the agricultural sector in manifesting the Ninth Malaysia Plan”, deals primarily with the purpose and process of refining a set of uncertain independent variables to raise research validity and reliability. The investigation is quantitative in nature and applied in kind; design-wise, it is an ex-post-facto analytical survey. Eleven personal and professional characteristics of 224 agricultural experts were taken as independent variables, and 158 attitudinal variables, measured on a dual (side-by-side) Likert scale, as dependent variables. The research population consisted of all agricultural experts, including extension experts, in Malaysia. As a result, 13 items (8.2% of the attitudinal questions out of 157) were identified as impacted, with moderate to very high sensitivity to a few personal characteristics, and were eliminated from further data analysis. Applying this variable refinery method considerably increased the construct and content validity, as well as the reliability, of the research instrument and helped the researchers construct a rather impact-free side-by-side questionnaire.

  10. Presentation

    Directory of Open Access Journals (Sweden)

    Paulo Henrique Freire Vieira

    2013-12-01

    This dossier focuses on one of the essential topics of debate today about the territorial dimension of new development strategies concerned with the worsening global socioecological crisis, namely: the challenges related to the activation and integration into networks of localized agri-food systems. It gathers contributions presented and debated during the VI International Conference on Localized Agri-food Systems - The LAFS facing the opportunities and challenges of the new global context. The event took place in the city of Florianópolis, from May 21st to 25th, 2013, and was promoted by the Federal University of Santa Catarina (UFSC) and by the Center for International Cooperation on Agricultural Research for Development (CIRAD). Besides UFSC and CIRAD, EPAGRI, the State University of Santa Catarina (UDESC), as well as research institutes and universities from other states (UFMG, IEA/SP, UFS, UFRGS) and Mexican and Argentinian partners from the RED SIAL Latino Americana also participated in the organization of lectures, discussion tables and workshops.

  11. Presentation

    Directory of Open Access Journals (Sweden)

    Helmut Renders

    2008-10-01

    We present to our esteemed readers the second edition of our journal for 2008. We have chosen as its special emphasis the theme “The life and work of Prof. Dr. Jürgen Moltmann”. It is our way to pay homage to J. Moltmann in the year the Universidade Metodista de São Paulo awards him an honorary Doctor Honoris Causa degree. Since the seventies, Moltmann and Latin America have been in dialog. In his emblematic work “A Theology of Liberation”, Gustavo Gutiérrez, the Catholic, discussed with Moltmann, the Reformed, the relationship between eschatology and history (GUTIÉRREZ, Gustavo. Teologia da Libertação. 5ª edição. Petrópolis, RJ: Vozes, 1985, p. 27, 137-139). A dialog held in the premises of IMS, which nowadays is called UMESP, produced the little book “Passion for Life” (MOLTMANN, Jürgen. Paixão pela vida. São Paulo, SP: ASTE - Associação de Seminários Teológicos Evangélicos, 1978). In the following years, the wide theological work of J. Moltmann went all the way from debates to congresses and has conquered the classrooms. Most probably, J. Moltmann is nowadays the most widely read European author in Brazilian theological seminaries. This recognition can only be voiced in unison, and the wide response to our request for articles confirms the huge repercussion that Moltmann’s work has had up to today in Brazil. The ecumenical theologian J. Moltmann is ecumenically read. We believe that this way we may be better equipped to answer anyone who asks us the reason for the hope that is in us. We have organized the articles on J. Moltmann’s theology according to the original publication date of the books dealt with in each essay. We also note that some articles originally requested for this edition will be published in the journal Estudos de Religião in May 2009. As is usual with the journal Caminhando, besides this thematic emphasis we have yet other contributions in the areas of

  12. Reliable and reproducible classification system for scoliotic radiograph using image processing techniques.

    Science.gov (United States)

    Anitha, H; Prabhu, G K; Karunakar, A K

    2014-11-01

    Scoliosis classification is useful for guiding treatment and testing clinical outcomes. State-of-the-art classification procedures are inherently unreliable and non-reproducible due to technical and human judgmental error. In the current diagnostic system each examiner has a diagrammatic summary of the classification procedure, the number of scoliosis curves, the apex level, etc. It is very difficult to define the required anatomical parameters in noisy radiographs, so the classification task demands an automatic image understanding system. The proposed automated classification procedure extracts the anatomical features using image processing and applies classification procedures based on computer-assisted algorithms. The reliability and reproducibility of the proposed computerized image understanding system are compared with those of the manual and computer-assisted systems using Kappa values.
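Agreement via Kappa values, as used for the reliability comparison here, can be computed directly from paired ratings. A minimal sketch of Cohen's kappa for two raters (the curve-type labels are invented examples):

```python
# Cohen's kappa: observed agreement between two raters corrected for the
# agreement expected by chance. The ratings below are invented examples.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    pe = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
             for c in categories)
    return (po - pe) / (1 - pe)            # undefined if pe == 1

kappa = cohens_kappa(["1", "1", "2", "2", "3", "3"],
                     ["1", "1", "2", "3", "3", "3"])
```

A kappa near 1 indicates agreement well beyond chance; values for multiple examiners against the automated system are what a study like this would report.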

  13. Effective confidence interval estimation of fault-detection process of software reliability growth models

    Science.gov (United States)

    Fang, Chih-Chiang; Yeh, Chun-Wu

    2016-09-01

    The quantitative evaluation of software reliability growth model is frequently accompanied by its confidence interval of fault detection. It provides helpful information to software developers and testers when undertaking software development and software quality control. However, the explanation of the variance estimation of software fault detection is not transparent in previous studies, and it influences the deduction of confidence interval about the mean value function that the current study addresses. Software engineers in such a case cannot evaluate the potential hazard based on the stochasticity of mean value function, and this might reduce the practicability of the estimation. Hence, stochastic differential equations are utilised for confidence interval estimation of the software fault-detection process. The proposed model is estimated and validated using real data-sets to show its flexibility.
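The record derives its intervals via stochastic differential equations; as a much simpler illustration of a confidence band around a mean value function, the sketch below uses the classical Goel-Okumoto model with a Poisson-variance approximation (Var ≈ m(t) for an NHPP). This approximation is an assumption of the sketch, not the paper's method, and the parameter values in the test are hypothetical.

```python
import math

def go_mean(t, a, b):
    """Goel-Okumoto mean value function: expected faults detected by time t,
    with a = total expected faults and b = detection rate."""
    return a * (1.0 - math.exp(-b * t))

def go_confidence_band(t, a, b, z=1.96):
    """Approximate 95% band: for an NHPP the count at time t is Poisson(m(t)),
    so Var ~ m(t) and the band is m +/- z*sqrt(m), floored at zero."""
    m = go_mean(t, a, b)
    half = z * math.sqrt(m)
    return max(m - half, 0.0), m, m + half
```

Plotting the three curves over t gives the familiar funnel around the fitted growth curve.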

  14. Presentation

    Directory of Open Access Journals (Sweden)

    Nicanor Lopes

    2010-11-01

    Full Text Available The journal Caminhando debuts with a new editorial format: each issue will have a Dossier. In 2010 Christianity celebrated the centenary of Edinburgh. The World Missionary Conference in Edinburgh in 1910 is regarded by many as a missiological watershed in the missionary and ecumenical movement. So the Faculty of Theology of the Methodist Church (FATEO) decided to organize a Wesleyan Week discussing the issue of mission. For an event of this magnitude FATEO invited the Rev. Dr. Wesley Ariarajah, a Methodist pastor and teacher from Sri Lanka with extensive experience in pastoral ministry in local churches and professor of History of Religions and the New Testament at the Theological College of Lanka, maintained by the Protestant Churches in Sri Lanka. In 1981 he was invited to join the World Council of Churches, where he presided for over ten years over the Council of Interreligious Dialogue. From 1992 he served as Deputy General Secretary of the WCC. The following texts are not the speeches of the Rev. Dr. Wesley Ariarajah, for they will be published separately. Nevertheless, the journal dialogs with the celebrations of the centenary of Edinburgh, starting from the intriguing theme: "Mission in the 21st century in Brazil". After all, how is it that mission takes place among us in personal, church, and community activities? Within the Dossier, as is common in the journal, the texts are organized as follows: Bible, Theology/History and Pastoral Care. Other items that do not fit within the Dossier, but do articulate mission, can be found in the sections Declarations and Documents and Book Reviews. The authors of the Dossier offer important considerations for building a contemporary missiological concept that takes Brazilian reality into account. Anderson de Oliveira, in the Bible section, presents a significant exegesis of Matthew 26.6-13. What does it mean when Jesus is quoted with the words: "For the poor always ye have with you, but me ye have not always."
Is this declaration challenging the gospels

  15. Assessment of the reliability of reproducing two-dimensional resistivity models using an image processing technique.

    Science.gov (United States)

    Ishola, Kehinde S; Nawawi, Mohd Nm; Abdullah, Khiruddin; Sabri, Ali Idriss Aboubakar; Adiat, Kola Abdulnafiu

    2014-01-01

    This study attempts to combine the results of geophysical images obtained from three commonly used electrode configurations using an image processing technique in order to assess their capabilities to reproduce two-dimensional (2-D) resistivity models. All the inverse resistivity models were processed using the PCI Geomatica software package commonly used for remote sensing data sets. Preprocessing of the 2-D inverse models was carried out to facilitate further processing and statistical analyses. Four raster layers were created: three of these layers were used for the input images, and the fourth layer was used as the output of the combined images. The data sets were merged using a basic statistical approach. Interpreted results show that all images resolved and reconstructed the essential features of the models. An assessment of the accuracy of the images for the four geologic models was performed using four criteria: the mean absolute error, the mean percentage absolute error, the resistivity values of the reconstructed blocks, and their displacements from the true models. Generally, the blocks of the maximum-approach images give the smallest estimated errors. Also, the displacement of the reconstructed blocks from the true blocks is the smallest, and the reconstructed resistivities of the blocks are closer to the true blocks, than for any other combination used. Thus, it is corroborated that when inverse resistivity models are combined, more reliable and detailed information about the geologic models is obtained than from individual data sets.

  16. Improving Emergency Department Door to Doctor Time and Process Reliability: A Successful Implementation of Lean Methodology.

    Science.gov (United States)

    El Sayed, Mazen J; El-Eid, Ghada R; Saliba, Miriam; Jabbour, Rima; Hitti, Eveline A

    2015-10-01

    The aim of this study is to determine the effectiveness of using lean management methods for improving emergency department door to doctor times at a tertiary care hospital. We performed a before-and-after study at an academic urban emergency department with 49,000 annual visits after implementing a series of lean-driven interventions over a 20-month period. The primary outcome was mean door to doctor time, and the secondary outcome was length of stay of both admitted and discharged patients. A convenience sample from the preintervention phase (February 2012) was compared to another from the postintervention phase (mid-October to mid-November 2013). Individual control charts were used to assess process stability. Postintervention, there was a statistically significant decrease in the mean door to doctor time (40.0 ± 53.44 minutes vs 25.3 ± 15.93 minutes; P < 0.001). The postintervention process was more statistically in control, with a drop in the upper control limit from 148.8 to 72.9 minutes. Length of stay of admitted and discharged patients dropped from 2.6 to 2.0 hours and 9.0 to 5.5 hours, respectively. All other variables, including emergency department daily visit volumes, hospital occupancy, and left-without-being-seen rates, were comparable. Using lean change management techniques can be effective in reducing door to doctor time in the emergency department and improving process reliability.
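The individual control charts mentioned in this record can be sketched with the standard I-chart (individuals chart) limit formula. This is a generic illustration of the method, not the authors' analysis, and the sample values in the test are hypothetical.

```python
def i_chart_limits(samples):
    """Individuals (I-MR) control chart limits from consecutive observations."""
    mean = sum(samples) / len(samples)
    # Average moving range of successive points estimates short-term variation.
    mr_bar = sum(abs(b - a) for a, b in zip(samples, samples[1:])) / (len(samples) - 1)
    # 2.66 = 3 / d2, with the bias-correction constant d2 = 1.128 for
    # moving ranges of size 2 (standard SPC tables).
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar
```

Points outside the limits (as the 148.8-minute upper limit here suggests preintervention) signal an unstable process.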

  17. Characterizing reliability in a product/process design-assurance program

    Energy Technology Data Exchange (ETDEWEB)

    Kerscher, W.J. III [Delphi Energy and Engine Management Systems, Flint, MI (United States); Booker, J.M.; Bement, T.R.; Meyer, M.A. [Los Alamos National Lab., NM (United States)

    1997-10-01

    Over the years many advancing techniques in the area of reliability engineering have surfaced in the military sphere of influence, and one of these techniques is Reliability Growth Testing (RGT). Private industry has reviewed RGT as part of the solution to its reliability concerns, but many practical considerations have slowed its implementation. Its objective is to demonstrate the reliability requirement of a new product with a specified confidence. This paper speaks directly to that objective but discusses a somewhat different approach to achieving it. Rather than conducting testing as a continuum and developing statistical confidence bands around the results, this Bayesian updating approach starts with a reliability estimate characterized by large uncertainty and then proceeds to reduce the uncertainty by folding in fresh information in a Bayesian framework.
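The Bayesian updating idea described above, starting from a diffuse estimate and shrinking uncertainty with fresh data, can be sketched with the conjugate Beta-Binomial model for pass/fail test results. This is a textbook illustration under that assumption, not the paper's specific formulation, and the test counts are hypothetical.

```python
def update_reliability(alpha, beta, successes, failures):
    """Beta-Binomial conjugate update: fold fresh test results into the
    Beta(alpha, beta) prior on the product's success probability."""
    return alpha + successes, beta + failures

def beta_mean_var(alpha, beta):
    """Mean and variance of a Beta(alpha, beta) reliability estimate."""
    mean = alpha / (alpha + beta)
    var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
    return mean, var
```

Starting from the diffuse prior Beta(1, 1) and observing 9 successes and 1 failure yields Beta(10, 2): the posterior mean moves toward the data and the variance shrinks.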

  18. Surveying the impact of satisfaction and e-reliability on customers' loyalty in e-purchase process: a case in Pars Khodro co

    Directory of Open Access Journals (Sweden)

    Vahid Qaemi

    2012-10-01

    Full Text Available Today, the issue of customer return in the e-purchase process is considered an important topic in companies' marketing and managerial decision making. In this paper, we present an empirical study on measuring the impact of e-loyalty for an Iranian auto industry firm called Pars Khodro co. The proposed study measures reliability, responsiveness, design, and security/privacy as independent variables, e-confidence and e-satisfaction as mediator variables, and e-loyalty as the dependent variable. The preliminary results show that the effects of e-satisfaction and e-confidence on loyalty, and of e-confidence on e-satisfaction, are at a high level. The reliability/fulfillment and security variables have significant impacts on e-confidence, and the effects of reliability/fulfillment, responsiveness, and website design on e-satisfaction are high. The results indicate that there is no significant relationship between responsiveness and e-confidence.

  19. Implementation of a process of reliability of teams in ANAV; Implantacion de un proceso de fiabilidad de equipos en ANAV

    Energy Technology Data Exchange (ETDEWEB)

    Tarrasa, F.; Bueno, J. M.; Miralles, F.

    2013-07-01

    The INPO AP-913 document has been the basis for the implementation of the equipment reliability process in ANAV and constitutes one of the strategic objectives established by ANAV to achieve excellent management of the assets that ensure the safe operation of the plants, both throughout their design life and for life extension.

  20. Network Reliability: The effect of local network structure on diffusive processes

    CERN Document Server

    Youssef, Mina; Eubank, Stephen

    2013-01-01

    This paper re-introduces the network reliability polynomial, introduced by Moore and Shannon in 1956, for studying the effect of network structure on the spread of diseases. We exhibit a representation of the polynomial that is well-suited for estimation by distributed simulation. We describe a collection of graphs derived from Erdős-Rényi and scale-free-like random graphs in which we have manipulated assortativity-by-degree and the number of triangles. We evaluate the network reliability for all these graphs under a reliability rule that is related to the expected size of a connected component. Through these extensive simulations, we show that for positively or neutrally assortative graphs, swapping edges to increase the number of triangles does not increase the network reliability. Also, positively assortative graphs are more reliable than neutral or disassortative graphs with the same number of edges. Moreover, we show the combined effect of both assortativity-by-degree and the presence of triangl...
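The simulation-based estimation this record alludes to can be sketched as follows: each edge survives independently with probability p, and a "reliability rule" decides whether the surviving subgraph works. The all-terminal connectivity rule below is one common choice, used here purely for illustration; the graph in the test is a hypothetical triangle, not one of the paper's graphs.

```python
import random

def network_reliability(nodes, edges, p, rule, trials=20000, seed=1):
    """Monte Carlo estimate of R(p): keep each edge independently with
    probability p and count how often `rule` accepts the surviving subgraph."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        up = [e for e in edges if rng.random() < p]
        hits += rule(nodes, up)
    return hits / trials

def connected(nodes, up_edges):
    """Rule: all nodes lie in one connected component (union-find)."""
    parent = {v: v for v in nodes}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for a, b in up_edges:
        parent[find(a)] = find(b)
    return len({find(v) for v in nodes}) == 1
```

For a triangle at p = 0.5 the exact reliability is 3p²(1−p) + p³ = 0.5, which the estimator recovers to within sampling error.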

  1. Design Optimization of ESD (Emergency ShutDown) System for Offshore Process Based on Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Bae Jeong-hoon

    2016-01-01

    Full Text Available Hydrocarbon leaks have major accident potential and could cause significant damage to people, property, and the environment. To prevent these risks from a leak in design aspects, installation of an ESD system is representative. Because the ESD system should operate properly at any time, it needs high reliability and entails much cost. To make an ESD system with high reliability at reasonable cost, a specific design method must be found. In this study, we proposed a multi-objective design optimization method and performed the optimization of the ESD system for the 1st separation system to satisfy high reliability and cost-effectiveness. NSGA-II (Non-dominated Sorting Genetic Algorithm-II) was applied, and two objective functions, the reliability and the cost of the system, were defined. Six design variables were set as variables related to the system configuration. To verify the result of the optimization, the existing design and the optimum design were compared in terms of reliability and cost. With the optimization method proposed in this study, it was possible to derive a reliable and economical design of the ESD system.
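The core of the non-dominated sorting that NSGA-II performs can be sketched as extracting the Pareto front of a reliability/cost trade-off: a design is kept only if no other design is at least as reliable and at most as costly (and strictly better in one). This is a generic illustration of the concept, not the paper's algorithm, and the candidate designs in the test are hypothetical.

```python
def pareto_front(designs):
    """Return the non-dominated designs.
    Each design is a (reliability, cost) pair: maximize reliability,
    minimize cost."""
    def dominates(a, b):
        # a dominates b: no worse on both objectives and not identical.
        return a[0] >= b[0] and a[1] <= b[1] and a != b
    return [d for d in designs if not any(dominates(o, d) for o in designs)]
```

NSGA-II repeats this sorting over successive fronts and adds crowding-distance selection; the front itself is the set of reasonable reliability/cost compromises.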

  2. Improvement of mechanical properties and life extension of high reliability structural components by laser shock processing

    Science.gov (United States)

    Ocaña, J. L.; Morales, M.; Porro, J. A.; Iordachescu, D.; Díaz, M.; Ruiz de Lara, L.; Correa, C.

    2011-05-01

    Profiting from the increasing availability of laser sources delivering intensities above 10^9 W/cm^2 with pulse energies in the range of several joules and pulse widths in the range of nanoseconds, laser shock processing (LSP) is being consolidated as an effective technology for the improvement of surface mechanical and corrosion resistance properties of metals and is being developed as a practical process amenable to production engineering. The main acknowledged advantage of the laser shock processing technique consists in its capability of inducing a relatively deep compressive residual stress field in metallic alloy pieces, allowing improved mechanical behaviour, specifically the life improvement of the treated specimens against wear, crack growth and stress corrosion cracking. Following a short description of the theoretical/computational and experimental methods developed by the authors for the predictive assessment and experimental implementation of LSP treatments, experimental results are presented on the residual stress profiles and associated surface properties modification successfully reached in typical materials (specifically Al and Ti alloys) under different LSP irradiation conditions. In particular, the analysis of the residual stress profiles obtained under different irradiation parameters and the evaluation of the corresponding induced surface properties, such as roughness and wear resistance, are presented.

  3. The ALMA high speed optical communication link is here: an essential component for reliable present and future operations

    Science.gov (United States)

    Filippi, G.; Ibsen, J.; Jaque, S.; Liello, F.; Ovando, N.; Astudillo, A.; Parra, J.; Saldias, Christian

    2016-07-01

    Announced in 2012, started in 2013 and completed in 2015, the ALMA high bandwidth communication system has become a key factor in achieving the operational and scientific goals of ALMA. This paper summarizes the technical, organizational, and operational goals of the ALMA Optical Link Project, focused on the creation and operation of an effective and sustainable communication infrastructure to connect the ALMA Operations Support Facility and Array Operations Site, both located in the Atacama Desert in the northern region of Chile, with the point of presence of REUNA in Antofagasta, about 400 km away, and from there to the Santiago Central Office in the Chilean capital through the optical infrastructure created by the EC-funded EVALSO project, now an integral part of the REUNA backbone. This new infrastructure, completed in 2014 and now operated on behalf of ALMA by REUNA, the Chilean National Research and Education Network, uses state-of-the-art technologies, like dark fiber from newly built cables and DWDM transmission, extending the reach of high capacity communication to the remote region where the Observatory is located. The paper also reports on the results obtained during the first year and a half of testing and operation, during which different operational setups were tried for data transfer, remote collaboration, etc. Finally, the authors present a forward look at the impact of this infrastructure on both the future scientific development of the Chajnantor Plateau, where many installations are (and will be) located, and the potential long-term development of the Chilean scientific backbone.

  4. Techniques, processes, and measures for software safety and reliability. Version 3.0

    Energy Technology Data Exchange (ETDEWEB)

    Sparkman, D

    1992-05-30

    The purpose of this report is to provide a detailed survey of current recommended practices and measurement techniques for the development of reliable and safe software-based systems. This report is intended to assist the United States Office of Nuclear Reactor Regulation (NRR) in determining the importance and maturity of the available techniques and in assessing the relevance of individual standards for application to instrumentation and control systems in nuclear power generating stations. Lawrence Livermore National Laboratory (LLNL) provides technical support for the Instrumentation and Control Systems Branch (ICSB) of NRR in advanced instrumentation and control systems, distributed digital systems, software reliability, and the application of verification and validation for the development of software.

  5. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibull

  6. In situ study of key material and process reliability issues in the chemical vapor deposition of copper

    Science.gov (United States)

    Lou, Ishing

    With the limitations of current aluminum-based metallization schemes used in microelectronics, the development of a manufacturable chemical vapor deposition (CVD) process for copper metallization schemes is crucial to meet the stringent requirements of sub-quarter micron device technology and beyond. The work presented herein focused on investigating key material and process reliability issues pertaining to Cu CVD processing. In particular, a unique combination of in-situ gas phase Fourier transform infrared (FTIR) spectroscopy and quadrupole mass spectrometry (QMS) was employed to study the role of hydrogen in thermal CVD of copper using (tmvs)Cu^I(hfac). These studies showed that hydrogen provides significant enhancement in the deposition rate of copper interconnects. Based on the QMS and FTIR data, this enhancement could be attributed to the role of hydrogen in assisting in the removal of tmvs from (tmvs)Cu^I(hfac), thus enhancing the conversion of Cu^I(hfac) intermediates to Cu^0 and Cu^II(hfac)_2 and providing a wider process window with higher conversion efficiency. In addition, in-situ real-time QMS studies were performed of the gas phase evolution and decomposition pathways of (tmvs)Cu^I(hfac) during thermal CVD of copper. The QMS investigations focused on determining the ionization efficiency curves and appearance potentials of (tmvs)Cu^I(hfac) under real CVD processing conditions. The resulting curves and associated potentials were then employed to identify the most likely precursor decomposition pathways and examine relevant implications for thermal CVD of copper from (tmvs)Cu^I(hfac). Finally, a hydrogen-plasma assisted CVD (PACVD) process was developed for the growth of device quality gold for incorporation as dopant in emerging Cu CVD based metallization interconnects.
In particular, it was demonstrated that the PACVD gold process window identified can maintain very low gold deposition rates (gold is a promising in-situ Cu doping technique

  7. Reliability and validity of a palpation technique for identifying the spinous processes of C7 and L5.

    Science.gov (United States)

    Robinson, Roar; Robinson, Hilde Stendal; Bjørke, Gustav; Kvale, Alice

    2009-08-01

    The objective was to examine the inter-tester reliability and validity of two therapists identifying the spinous processes (SP) of C7 and L5, using one predefined surface palpation procedure for each level. Using a single identification method made it possible to examine the reliability and the validity of the procedure itself. Two manual therapists examined 49 patients (29 women), aged between 26 and 79 years; 18 were cervical and 31 lumbar patients. An invisible marking pen and ultraviolet light were used, and the findings were compared. X-rays were taken as an objective measure of the correct spinal level. Percentage agreement and kappa statistics were used to evaluate reliability and validity. The best inter-therapist agreement was found for the skin marks: percentage agreement within 10 mm and 20 mm was 67% and 85%, respectively. The inter-tester reliability for identifying a radiologically nominated SP by palpation was found to be poor for C7 and moderate for L5, with kappa values of 0.18 and 0.48, respectively. The results indicated acceptable inter-therapist surface palpation agreement, but the chosen procedures did not identify the correct SP, indicating that the procedures are not precise enough. Future reliability studies should test other non-invasive palpation procedures, both individually and in combination, and compare these with radiological investigation.

  8. The validity and reliability process of interpersonal problem solving inventory for adults

    Directory of Open Access Journals (Sweden)

    Songül Tümkaya

    2011-05-01

    Full Text Available The current study investigated the validity and reliability of the Interpersonal Problem Solving Inventory, examining whether it has psychometric properties suitable for measuring adults' approach to interpersonal problems and behavior. The inventory includes five subscales: Approaching problems in a negative way, Constructive problem solving, Lack of self-confidence, Unwillingness to take responsibility, and Insistent-persevering approach. The scale consists of 50 items in a 5-point format. The sample includes 610 adults, 324 females and 286 males, aged between 30 and 73. The Problem Solving Inventory (PSI), administered to 99 participants, and the Trait Anxiety Scale (TAS), administered to 93 participants, were used to establish the convergent and discriminant construct validity of the Interpersonal Problem Solving Inventory. The scale was administered to 43 adults twice at a 4- to 6-week interval, and the test-retest reliabilities were found to be between .62 and .82. The results of the confirmatory factor analysis show that the scale consists of five interpersonal problem solving factors. In addition, there are positive correlations between the TAS and the PSI. The internal consistency (Cronbach alpha) values for the subscales were found to be between .67 and .90. The results of the study indicated that the inventory can be used to measure the problem solving skills of adults aged between 30 and 73 years old. Keywords: Interpersonal Problem Solving Inventory, Validity, Reliability, Adults.
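The Cronbach alpha internal-consistency coefficient reported in this record (and in the next one) can be sketched with its standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). This is a generic implementation, not the authors' analysis, and the item scores in the test are hypothetical.

```python
def cronbach_alpha(item_scores):
    """Internal consistency of a scale.
    item_scores: one list per item, each holding all respondents' scores."""
    k = len(item_scores)
    n = len(item_scores[0])
    def var(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_var_sum = sum(var(item) for item in item_scores)
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return k / (k - 1) * (1 - item_var_sum / var(totals))
```

Perfectly correlated items give alpha = 1; weakly related items pull alpha down, which is why subscale values like .67 to .90 are reported per subscale rather than for the whole inventory.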

  9. Development of interpersonal problem solving inventory for high school students: The validity and reliability process

    Directory of Open Access Journals (Sweden)

    Sabahattin Çam

    2008-07-01

    Full Text Available The current study investigated the reliability and validity of an inventory to measure high school students' problem solving strategies and abilities, the Interpersonal Problem Solving Inventory (IPSI), originally developed to assess college students' problem solving strategies and abilities. The IPSI consists of 50 items in five sub-scales: Approaching problems in a negative way, Constructive problem solving, Lack of self-confidence, Unwillingness to take responsibility, and Insistent-persevering approach. The sample consists of 482 pupils: 48.1% boys (232) and 51.9% girls (250); 34.2% of these students were in 9th grade, 33.8% in 10th grade, and 32% in 11th grade. The results of confirmatory factor analysis (CFA) show that the scale consists of five interpersonal problem solving factors. The relationships between the sub-scale scores of the high school students showed results similar to the college students' sub-scale scores. Confirmatory factor analysis, criterion validity, and convergent and discriminant construct validity of the sub-scales were demonstrated. The internal consistency (Cronbach alpha) values for the sub-scales were found to be between .67 and .89; the test-retest reliability between .67 and .84. The reliability and validity results showed that the five sub-scales can measure interpersonal problem solving ability and behavior for high school students.

  10. Processing and Device Oriented Approach to CIGS Module Reliability; SunShot Initiative, U.S. Department of Energy (DOE)

    Energy Technology Data Exchange (ETDEWEB)

    Ramanathan, K.; Mansfield, L.; Garris, R.; Deline, C.; Silverman, T.

    2015-02-24

    Abstract: A device level understanding of thin film module reliability has been lacking. We propose that device performance and stability issues are strongly coupled and simultaneous attention to both is necessary. Commonly discussed technical issues such as light soaking, metastability, reverse bias breakdown and junction breakdown can be understood by comparing the behaviors of cells made in the laboratory and industry. It will then be possible to attribute the observed effects in terms of processing and cell design. Process connection to stability studies can help identify root causes and a path for mitigating the degradation.

  11. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology (ICT) context. In the first section, the definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing laboratory tests, puts in evidence the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  12. The Use of Concurrency in the Acquisition Process and Its Impact on Reliability and Maintainability.

    Science.gov (United States)

    1987-09-01

    Monaghan, Maj Jeffery C., Deputy for Operations, F-15E OT&E; Schnick, Maj Robert H., AFOTEC F-16 MSIP Test Director; Shearer, Col Richard, Director... Demarchi, Capt Daniel. A Case Study of Reliability and Maintainability of the F-16 APG-66 Fire Control Radar. MS thesis, AFIT/LSSR99-81. School of... Acquisition Program Schedule Compression. MS thesis, AFIT/GLM/LSY/86S-23. School of Systems and Logistics, Air Force Institute of Technology (AU

  13. Hf-based high-k dielectrics process development, performance characterization, and reliability

    CERN Document Server

    Kim, Young-Hee

    2006-01-01

    In this work, the reliability of HfO2 (hafnium oxide) with poly gate and dual metal gate electrodes (Ru-Ta alloy, Ru) was investigated. Hard breakdown and soft breakdown, particularly the Weibull slopes, were studied under constant voltage stress. Dynamic stressing was also used. It was found that the combination of trapping and detrapping contributed to the enhancement of the projected lifetime. The results of the polarity dependence studies showed that substrate injection exhibited a shorter projected lifetime and worse soft breakdown behavior compared to gate injection. The o

  14. Assessing Reliability of Cellulose Hydrolysis Models to Support Biofuel Process Design – Identifiability and Uncertainty Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist

    2010-01-01

    The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done...... are not informative enough (sensitivities of 16 parameters were insignificant). This indicates that the NREL model has severe parameter uncertainty, likely to be the case for other hydrolysis models as well since similar kinetic expressions are used. To overcome this impasse, we have used the Monte Carlo procedure...
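The identifiability finding in this record (parameters with insignificant sensitivities cannot be estimated from the data) can be sketched with a finite-difference sensitivity screen. This is a generic illustration of the idea, not the NREL model or the authors' procedure; the toy model and its inert parameter "c" are hypothetical.

```python
import math

def local_sensitivities(model, params, t_grid, rel_step=1e-4):
    """Max absolute finite-difference sensitivity of model(t, params) to each
    parameter over t_grid. A parameter whose sensitivity is ~0 everywhere
    leaves no trace in the output and is practically unidentifiable."""
    base = [model(t, params) for t in t_grid]
    sens = {}
    for name in params:
        h = rel_step * (abs(params[name]) or rel_step)
        bumped = dict(params, **{name: params[name] + h})
        out = [model(t, bumped) for t in t_grid]
        sens[name] = max(abs(o - b) / h for o, b in zip(out, base))
    return sens

def toy_hydrolysis(t, p):
    """Hypothetical first-order conversion curve; 'c' deliberately unused."""
    return p["a"] * (1.0 - math.exp(-p["b"] * t))
```

In practice one would rank these sensitivities (or use Monte Carlo sampling, as the record mentions) and fix the uninformative parameters before fitting.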

  15. Delta-Reliability

    OpenAIRE

    Eugster, P.; Guerraoui, R.; Kouznetsov, P.

    2001-01-01

    This paper presents a new, non-binary measure of the reliability of broadcast algorithms, called Delta-Reliability. This measure quantifies the reliability of practical broadcast algorithms that, on the one hand, were devised with some form of reliability in mind, but, on the other hand, are not considered reliable according to the "traditional" notion of broadcast reliability [HT94]. Our specification of Delta-Reliability suggests a further step towards bridging the gap between theory and...

  16. All-inkjet-printed thin-film transistors: manufacturing process reliability by root cause analysis

    Science.gov (United States)

    Sowade, Enrico; Ramon, Eloi; Mitra, Kalyan Yoti; Martínez-Domingo, Carme; Pedró, Marta; Pallarès, Jofre; Loffredo, Fausta; Villani, Fulvia; Gomes, Henrique L.; Terés, Lluís; Baumann, Reinhard R.

    2016-01-01

    We report on the detailed electrical investigation of all-inkjet-printed thin-film transistor (TFT) arrays focusing on TFT failures and their origins. The TFT arrays were manufactured on flexible polymer substrates under ambient conditions, without the need for a cleanroom environment or inert atmosphere, and at a maximum temperature of 150 °C. Alternative manufacturing processes for electronic devices, such as inkjet printing, suffer from lower accuracy compared to traditional microelectronic manufacturing methods. Furthermore, printing methods usually do not allow the manufacturing of electronic devices with high yield (a high number of functional devices). In general, the manufacturing yield is much lower compared to established conventional manufacturing methods based on lithography. Thus, the focus of this contribution is a comprehensive analysis of defective TFTs printed by inkjet technology. Based on root cause analysis, we present the defects by developing failure categories and discuss the reasons for the defects. This procedure identifies failure origins and allows optimization of the manufacturing process, finally resulting in a yield improvement. PMID:27649784

  17. All-inkjet-printed thin-film transistors: manufacturing process reliability by root cause analysis.

    Science.gov (United States)

    Sowade, Enrico; Ramon, Eloi; Mitra, Kalyan Yoti; Martínez-Domingo, Carme; Pedró, Marta; Pallarès, Jofre; Loffredo, Fausta; Villani, Fulvia; Gomes, Henrique L; Terés, Lluís; Baumann, Reinhard R

    2016-09-21

    We report on the detailed electrical investigation of all-inkjet-printed thin-film transistor (TFT) arrays focusing on TFT failures and their origins. The TFT arrays were manufactured on flexible polymer substrates under ambient conditions, without the need for a cleanroom environment or inert atmosphere, and at a maximum temperature of 150 °C. Alternative manufacturing processes for electronic devices, such as inkjet printing, suffer from lower accuracy compared to traditional microelectronic manufacturing methods. Furthermore, printing methods usually do not allow the manufacturing of electronic devices with high yield (a high number of functional devices). In general, the manufacturing yield is much lower compared to established conventional manufacturing methods based on lithography. Thus, the focus of this contribution is a comprehensive analysis of defective TFTs printed by inkjet technology. Based on root cause analysis, we present the defects by developing failure categories and discuss the reasons for the defects. This procedure identifies failure origins and allows optimization of the manufacturing process, finally resulting in a yield improvement.

  18. Software Technology for Adaptable, Reliable System (STARS) Program. Reuse Library Process Model.

    Science.gov (United States)

    The Process Model described in this document is part of the Guidebook requirements for the STARS Reuse Library described in Subtask IS40.3 of the S...Increment Task Proposal for STARS. The objective of a Process Model is to formally characterize the various processes that take place in the context of

  19. Reliability of plasma-sprayed coatings: monitoring the plasma spray process and improving the quality of coatings

    Science.gov (United States)

    Fauchais, P.; Vardelle, M.; Vardelle, A.

    2013-06-01

As with every coating technology, the reliability and reproducibility of coatings are essential for the adoption of plasma spraying in industrial manufacturing. They depend mainly on process reliability, equipment and spray-booth maintenance, operator training and certification, implementation of consistent production practices, and standardization of coating testing. This paper deals with the first issue, the monitoring and control of the plasma spray process; it does not address coating characterization and testing methods. It begins with a short history of coating quality improvement under plasma spray conditions over the last few decades, describes the plasma spray torches used in industry, and reviews the development of measurements of in-flight and impacting particle parameters and of the corresponding sensors. It concludes with process maps that describe the interrelations between the operating parameters of the spray process, in-flight particle characteristics, and coating properties, and with the potential of in situ process monitoring by artificial neural networks and fuzzy logic methods.

  20. Electrical characterization of FIB processed metal layers for reliable conductive-AFM on ZnO microstructures

    Energy Technology Data Exchange (ETDEWEB)

    Pea, M. [Istituto di Fotonica e Nanotecnologie - CNR, Roma 00156 (Italy); Maiolo, L. [Istituto per la Microelettronica e i Microsistemi - CNR, Roma 00133 (Italy); Giovine, E. [Istituto di Fotonica e Nanotecnologie - CNR, Roma 00156 (Italy); Rinaldi, A. [University of L’Aquila, International Research Center for Mathematics & Mechanics of Complex System (MEMOCS), 04012, Cisterna di Latina (Italy); ENEA, C.R. Casaccia, Santa Maria di Galeria, 00123 Rome (Italy); Araneo, R. [Sapienza University of Rome, 00185 Rome (Italy); Notargiacomo, A., E-mail: andrea.notargiacomo@ifn.cnr.it [Istituto di Fotonica e Nanotecnologie - CNR, Roma 00156 (Italy)

    2016-05-15

Graphical abstract: - Highlights: • Contact resistance between a conductive AFM tip and different metals is investigated. • FIB-processed Ti and Cr areas have larger resistance than as-deposited films. • Gold displays low, ohmic tip-sample resistance even after FIB processing. • An Au/Ti stack on top of ZnO pillars allows reliable I–V characterization by C-AFM. - Abstract: We report a conductive atomic force microscopy (C-AFM) study of metallic layers aimed at finding the most suitable configuration for the electrical characterization of individual ZnO micro-pillars fabricated by focused ion beam (FIB). The electrical resistance between the probe tip and both as-deposited and FIB-processed metal layers (Cr, Ti, Au and Al) was investigated. Both chromium and titanium showed non-homogeneous, non-ohmic behaviour and non-negligible scanning-probe-induced anodic oxidation during electrical measurements, and after FIB milling they exhibited significantly higher tip-sample resistance. Aluminium generally showed an even more pronounced non-conductive behaviour. Conversely, gold films showed very good tip-sample conduction and were less sensitive to FIB processing than the other investigated metals. We found that reliable C-AFM electrical characterization of FIB-machined ZnO microstructures is feasible using a combination of metal films as the top contact layer. An Au/Ti bilayer on top of the ZnO withstood the FIB fabrication process and formed a suitable ohmic contact to the semiconductor, allowing reliable C-AFM measurements. To validate this approach, we measured the resistance of ZnO micropillars, finding a linear dependence on pillar height, as expected for an ohmic conductor, and evaluated the resistivity of the material. This procedure can potentially be downscaled to nanometer-size structures by a proper choice of metal film type and thickness.
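
The resistivity extraction mentioned above (the linear dependence of pillar resistance on height) can be sketched as a straight-line fit of R = ρ·h/A; the pillar dimensions and resistance values below are illustrative assumptions, not measured data from the paper.

```python
# Sketch: extract resistivity from resistance-vs-height data of ohmic pillars.
# For an ohmic conductor, R = rho * h / A, so the slope of R(h) gives rho / A.
# All numbers here are hypothetical, chosen only to illustrate the fit.
import numpy as np

A = 1.0e-12                                     # assumed cross-section, 1 um^2 in m^2
h = np.array([1.0, 2.0, 3.0, 4.0]) * 1e-6       # pillar heights (m)
R = np.array([0.5, 1.0, 1.5, 2.0]) * 1e6        # measured resistances (ohm)

slope, intercept = np.polyfit(h, R, 1)          # R = (rho/A) * h + R_contact
rho = slope * A                                 # resistivity in ohm * m
```

A near-zero intercept is consistent with a good ohmic top contact; a large intercept would indicate a series contact resistance.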

  1. Present Situation and Developing Trend on Laser Processing Industry in China

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

The present situation and development trends of the laser processing industry in China are reviewed. A market analysis of laser processing systems is given, and some new and rapidly developing laser processing systems are introduced. Finally, the paper presents several new kinds of laser systems and equipment.

  2. AN APPROACH TO TRACE SEMISTRUCTURE FOR PROCESS MINING TOWARDS SOFTWARE RELIABILITY

    Directory of Open Access Journals (Sweden)

    V.PRIYADHARSHINI

    2014-09-01

Full Text Available Process mining analyzes business processes from the event logs recorded by process-aware information systems; information is extracted from the logs using knowledge-discovery techniques. Process mining algorithms can automatically discover models that explain all the events registered in the log traces provided as input. The theory of regions is a valuable tool in process discovery: it aims at learning a formal model (a Petri net) from a set of traces. The primary objective of this paper is to propose a new concept for tracing semi-structure. Experiments were conducted on the standard benchmark HELIX and RALIC datasets, and the performance of the proposed system was better than that of existing methods.
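
A basic building block of most process discovery algorithms (including region-based ones) is the directly-follows relation over log traces, which can be sketched as follows; the event log and activity names are illustrative and not taken from the HELIX or RALIC datasets.

```python
# Sketch: extract the directly-follows relation from event-log traces.
# Discovery algorithms build footprint matrices and models from relations
# like this one.

def directly_follows(traces):
    """Return the set of pairs (a, b) where activity b directly follows
    activity a in at least one trace."""
    pairs = set()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            pairs.add((a, b))
    return pairs

# Hypothetical event log: each trace is an ordered list of activity names.
log = [["register", "check", "approve"],
       ["register", "check", "reject"],
       ["register", "approve"]]

df = directly_follows(log)
```

From such pairs a discovery algorithm infers, for instance, that "check" and "approve" may be causally related while "approve" and "reject" are alternatives.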

  3. A Case Study on Improving Intensive Care Unit (ICU) Services Reliability: By Using Process Failure Mode and Effects Analysis (PFMEA)

    Science.gov (United States)

    Yousefinezhadi, Taraneh; Jannesar Nobari, Farnaz Attar; Goodari, Faranak Behzadi; Arab, Mohammad

    2016-01-01

Introduction: In any complex human system, human error is inevitable and cannot be eliminated by blaming wrongdoers. With the aim of improving the reliability of hospital Intensive Care Units (ICUs), this research identifies and analyzes ICU process failure modes from the standpoint of a systematic approach to errors. Methods: In this descriptive study, data were gathered qualitatively through observations, document reviews, and Focus Group Discussions (FGDs) with the process owners in two selected ICUs in Tehran in 2014. Data analysis was quantitative, based on the failures' Risk Priority Numbers (RPN) under the Failure Modes and Effects Analysis (FMEA) method; some failure causes were further analyzed with the qualitative Eindhoven Classification Model (ECM). Results: Through the FMEA methodology, 378 potential failure modes from 180 ICU activities were identified and evaluated in hospital A, and 184 potential failure modes from 99 ICU activities in hospital B. At 90% reliability (RPN≥100), 18 failures in hospital A and 42 in hospital B were identified as non-acceptable risks, and their causes were analyzed by ECM. Conclusions: Applying modified PFMEA to improve process reliability in two ICUs in two different kinds of hospitals shows that this method empowers staff to identify, evaluate, prioritize and analyze all potential failure modes, and makes them eager to identify causes, recommend corrective actions and participate in process improvement without feeling blamed by top management. Moreover, by combining FMEA and ECM, team members can readily identify failure causes from a health care perspective. PMID:27157162
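
The FMEA prioritization step used in the study can be sketched as follows: Severity (S), Occurrence (O) and Detection (D) scores multiply into the Risk Priority Number, and modes at or above the RPN≥100 threshold are flagged as non-acceptable risks. The failure modes and scores below are illustrative, not taken from the paper.

```python
# Sketch of FMEA risk prioritization: RPN = Severity * Occurrence * Detection,
# each scored on a 1-10 scale; modes with RPN >= 100 are treated as
# non-acceptable risks, mirroring the threshold used in the study.
failure_modes = [
    {"mode": "medication dose error",   "S": 9, "O": 4, "D": 3},
    {"mode": "ventilator alarm missed", "S": 8, "O": 2, "D": 5},
    {"mode": "chart mislabeled",        "S": 4, "O": 3, "D": 2},
]

for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

# Flag the non-acceptable risks for root-cause analysis (e.g. with ECM).
non_acceptable = [fm["mode"] for fm in failure_modes if fm["RPN"] >= 100]
```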

  4. Penetrator reliability investigation and design exploration : from conventional design processes to innovative uncertainty-capturing algorithms.

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Canales, Monica L.; Heaphy, Robert (Sandia National Laboratories, Albuquerque, NM); Gramacy, Robert B. (University of Cambridge); Taddy, Matt (University of California, Santa Cruz, CA); Chiesa, Michael L.; Thomas, Stephen W. (Sandia National Laboratories, Albuquerque, NM); Swiler, Laura Painton (Sandia National Laboratories, Albuquerque, NM); Hough, Patricia Diane; Lee, Herbert K. H. (University of California, Santa Cruz, CA); Trucano, Timothy Guy (Sandia National Laboratories, Albuquerque, NM); Gray, Genetha Anne

    2006-11-01

This project focused on research and algorithmic development for optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking uncertainty into account, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge was to leverage the small local samples already constructed by optimization algorithms to build effective surrogate models. We used Gaussian Process (GP) models for these surrogates and developed two OUU algorithms using 'local' GPs (OUU-LGP) and one using 'global' GPs (OUU-GGP) that appear competitive with or better than current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models; we developed a Multi-Fidelity Bayesian Auto-regressive Process (MF-BAP). The third challenge involved developing tools that are computationally feasible and accessible; we created MATLAB® and initial DAKOTA implementations of our algorithms.
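
The surrogate-modeling idea above — fitting a Gaussian Process to a handful of local samples and predicting the response elsewhere — can be sketched minimally as follows. The kernel, length scale and toy objective are illustrative assumptions; the report's actual local/global GP machinery is more sophisticated.

```python
# Minimal GP surrogate sketch: squared-exponential kernel, posterior mean.
import numpy as np

def sq_exp_kernel(A, B, length=1.0):
    """Squared-exponential covariance between sample sets A (n,d) and B (m,d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

def gp_posterior_mean(X, y, Xstar, noise=1e-8, length=1.0):
    """GP posterior mean at test inputs Xstar given training data (X, y)."""
    K = sq_exp_kernel(X, X, length) + noise * np.eye(len(X))
    Ks = sq_exp_kernel(Xstar, X, length)
    return Ks @ np.linalg.solve(K, y)

# Toy 1-D example: surrogate for f(x) = x^2 built from five "local" samples.
X = np.array([[0.0], [0.5], [1.0], [1.5], [2.0]])
y = X[:, 0] ** 2
mu = gp_posterior_mean(X, y, np.array([[1.25]]))   # predict at an unsampled point
```

An optimizer can then query the cheap surrogate mean (and, in a fuller version, its variance) instead of the expensive simulation.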

  5. Risks and reliability of manufacturing processes as related to composite materials for spacecraft structures

    Science.gov (United States)

    Bao, Han P.

    1995-01-01

Fabricating primary aircraft and spacecraft structures from advanced composite materials entails both benefits and risks. The benefits come from much improved strength-to-weight and stiffness-to-weight ratios, the potential for lower part count, the ability to tailor properties, chemical and solvent resistance, and superior thermal properties. The risks include high material costs, lack of processing experience, expensive labor, poor reproducibility, high toxicity for some composites, and a variety of space-induced risks. The purpose of this project is to generate a manufacturing database for a selected number of materials with potential for space applications, and to use this database to develop quantitative approaches for screening candidate materials and processes on the basis of their manufacturing risks, including costs. So far, the database includes epoxies, polycyanates, bismaleimides, PMR-15, polyphenylene sulfides, polyetherimides, polyetheretherketone, and aluminum-lithium. The first four are thermoset composites, the next three are thermoplastic composites, and the last is a metal. The emphasis of the database is on factors affecting manufacturing, such as raw material cost; handling aspects, including the working life and shelf life of resins; process temperature; chemical/solvent resistance; moisture resistance; damage tolerance; toxicity; outgassing; thermal cycling; void content; nature or type of process; associated tooling; and in-process quality assurance. Based on industry experience and published literature, a relative ranking was established for each of these factors. Potential applications of the database include determining a delta cost factor for specific structures with a given process plan and a general methodology to screen materials and processes for incorporation into the current

  6. The Adaptation, Validation, Reliability Process of the Turkish Version Orientations to Happiness Scale

    Directory of Open Access Journals (Sweden)

    Hakan Saricam

    2015-12-01

Full Text Available The purpose of this research is to adapt the Orientations to Happiness Scale, developed by Peterson, Park, and Seligman (2005), into Turkish and to examine its psychometric properties. The participants were 489 students. The psychometric properties of the scale were examined through linguistic equivalence, exploratory factor analysis, confirmatory factor analysis, criterion-related validity, internal consistency, and test-retest methods. For criterion-related (concurrent) validity, the Oxford Happiness Questionnaire-Short Form was used. In the exploratory factor analysis for structural validity, the items loaded onto three factors (life of meaning, life of pleasure, life of engagement), in accordance with the original form. Confirmatory factor analysis of the 18 items yielded the following fit indices for the three-factor model: χ2/df=1.94, RMSEA=.059, CFI=.96, GFI=.95, IFI=.95, NFI=.96, RFI=.95 and SRMR=.044. Factor loadings ranged from .36 to .59. In the criterion-validity analysis, strong positive correlations were found between the Orientations to Happiness Scale and the Oxford Happiness Questionnaire (p<.01). Cronbach's alpha internal consistency coefficients were .88 for the life of meaning subscale, .84 for the life of pleasure subscale, and .81 for the life of engagement subscale, and corrected item-total correlations ranged from .39 to .61. According to these results, the scale is a valid and reliable assessment instrument for positive psychology, educational psychology, and other fields.

  7. Using a reliability process to reduce uncertainty in predicting crashes at unsignalized intersections.

    Science.gov (United States)

    Haleem, Kirolos; Abdel-Aty, Mohamed; Mackie, Kevin

    2010-03-01

The negative binomial (NB) model has been used extensively by traffic safety analysts as a crash prediction model because it can accommodate the over-dispersion usually exhibited in crash count data. However, the NB model is still a probabilistic model that may benefit from updating the parameters of the covariates to better predict crash frequencies at intersections. The objective of this paper is to examine the effect of updating the covariate parameters of the fitted NB model using a Bayesian updating reliability method to more accurately predict crash frequencies at 3-legged and 4-legged unsignalized intersections. For this purpose, data from 433 unsignalized intersections in Orange County, Florida were collected and used in the analysis. Four Bayesian model structures were examined: (1) a non-informative prior with a log-gamma likelihood function, (2) a non-informative prior with an NB likelihood function, (3) an informative prior with an NB likelihood function, and (4) an informative prior with a log-gamma likelihood function. Standard measures of model effectiveness, such as the Akaike information criterion (AIC), mean absolute deviance (MAD), mean square prediction error (MSPE) and overall prediction accuracy, were used to compare the NB and Bayesian model predictions. Considering only the best estimates of the model parameters (ignoring uncertainty), both the NB and Bayesian models yielded favorable results. However, when the standard errors of the fitted parameters were considered as a surrogate measure of uncertainty, the Bayesian methods yielded more promising results. The full Bayesian updating framework using the log-gamma likelihood function for updating the parameter estimates of the NB probabilistic models resulted in the smallest standard errors.
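
The NB likelihood that underlies such crash-frequency models can be sketched as follows; mu is the predicted mean crash count, alpha the over-dispersion parameter (alpha → 0 recovers the Poisson model), and the numeric values are illustrative, not from the paper.

```python
# Sketch: negative binomial log-probability (NB2 parameterization,
# Var = mu + alpha * mu^2), as used in crash prediction modeling.
from math import lgamma, log

def nb_log_pmf(y, mu, alpha):
    """Log-probability of observing y crashes under NB(mu, alpha)."""
    r = 1.0 / alpha
    return (lgamma(y + r) - lgamma(r) - lgamma(y + 1)
            + r * log(r / (r + mu)) + y * log(mu / (r + mu)))

# An over-dispersed observation (8 crashes at a site with predicted mean 3)
# is far more plausible under a dispersed NB than under a near-Poisson model.
ll_dispersed = nb_log_pmf(8, mu=3.0, alpha=0.5)
ll_near_poisson = nb_log_pmf(8, mu=3.0, alpha=1e-6)
```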

  8. Estimating Heat and Mass Transfer Processes in Green Roof Systems: Current Modeling Capabilities and Limitations (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Tabares Velasco, P. C.

    2011-04-01

    This presentation discusses estimating heat and mass transfer processes in green roof systems: current modeling capabilities and limitations. Green roofs are 'specialized roofing systems that support vegetation growth on rooftops.'

  9. The Association of Social Work Boards' Licensure Examinations: A Review of Reliability and Validity Processes

    Science.gov (United States)

    Marson, Stephen M.; DeAngelis, Donna; Mittal, Nisha

    2010-01-01

    Objectives: The purpose of this article is to create transparency for the psychometric methods employed for the development of the Association of Social Work Boards' (ASWB) exams. Results: The article includes an assessment of the macro (political) and micro (statistical) environments of testing social work competence. The seven-step process used…

  11. Validity, Reliability, and Equity Issues in an Observational Talent Assessment Process in the Performing Arts

    Science.gov (United States)

    Oreck, Barry A.; Owen, Steven V.; Baum, Susan M.

    2003-01-01

    The lack of valid, research-based methods to identify potential artistic talent hampers the inclusion of the arts in programs for the gifted and talented. The Talent Assessment Process in Dance, Music, and Theater (D/M/T TAP) was designed to identify potential performing arts talent in diverse populations, including bilingual and special education…

  12. Commentary: Advances in Research on Sourcing-Source Credibility and Reliable Processes for Producing Knowledge Claims

    Science.gov (United States)

    Chinn, Clark A.; Rinehart, Ronald W.

    2016-01-01

    In our commentary on this excellent set of articles on "Sourcing in the Reading Process," we endeavor to synthesize the findings from the seven articles and discuss future research. We discuss significant contributions related to source memory, source evaluation, use of sources in action and belief, integration of information from…

  13. Uniform presentation of process evaluation results facilitates the evaluation of complex interventions: development of a graph

    NARCIS (Netherlands)

    Bakker, F.C.; Persoon, A.; Schoon, Y.; Olde Rikkert, M.G.M.

    2015-01-01

    RATIONALE, AIMS AND OBJECTIVES: Process evaluation is an essential element of the increasing number of studies of multi-component interventions. Yet researchers are challenged to collect and present appropriate process outcomes in such a way that they are easy and valuable for others to use

  14. Effects of Multimodal Presentation and Stimulus Familiarity on Auditory and Visual Processing

    Science.gov (United States)

    Robinson, Christopher W.; Sloutsky, Vladimir M.

    2010-01-01

    Two experiments examined the effects of multimodal presentation and stimulus familiarity on auditory and visual processing. In Experiment 1, 10-month-olds were habituated to either an auditory stimulus, a visual stimulus, or an auditory-visual multimodal stimulus. Processing time was assessed during the habituation phase, and discrimination of…

  15. Reliability engineering theory and practice

    CERN Document Server

    Birolini, Alessandro

    2014-01-01

This book shows how to build in, evaluate, and demonstrate the reliability and availability of components, equipment, and systems. It presents the state of the art of reliability engineering, in both theory and practice, and is based on the author's more than 30 years of experience in this field, half in industry and half as Professor of Reliability Engineering at ETH Zurich. The structure of the book allows rapid access to practical results. This final edition extends and replaces all previous editions. New are, in particular, a strategy to mitigate incomplete coverage, a comprehensive introduction to human reliability with design guidelines and new models, a refinement of reliability allocation, design guidelines for maintainability, and concepts related to regenerative stochastic processes. The set of homework problems has been extended. Methods and tools are presented so that they can be tailored to different reliability requirement levels and used for safety analysis. Because of the Appendice...

  16. Manufacturing process modeling for composite materials and structures, Sandia blade reliability collaborative

    Energy Technology Data Exchange (ETDEWEB)

    Guest, Daniel A.; Cairns, Douglas S.

    2014-02-01

The increased use of and interest in wind energy over the last few years has necessitated an increase in the manufacturing of wind turbine blades. This increase has in many ways outstripped the current understanding not only of the materials used but also of the manufacturing methods used to construct composite laminates. The goal of this study is to develop a list of process parameters that influence the quality of composite laminates manufactured by vacuum-assisted resin transfer molding and to evaluate how they influence laminate quality. Resin flow rate and vacuum pressure are known to be primary factors in the manufacturing process; an incorrect balance of these parameters often causes porosity or voids in laminates that ultimately degrade the strength of the composite. Fiber waviness has also been seen as a major contributor to failures in wind turbine blades and is often the result of mishandling during the lay-up process. Based on laboratory tests, a relationship between these parameters and laminate quality has been established that will be a valuable tool in developing best practices and standard procedures for the manufacture of wind turbine blade composites.

  17. Hybrid reliability model for fatigue reliability analysis of steel bridges

    Institute of Scientific and Technical Information of China (English)

    曹珊珊; 雷俊卿

    2016-01-01

A hybrid reliability model is presented to solve fatigue reliability problems in steel bridges. The cumulative damage model is one of the models used in fatigue reliability analysis; its parameters can be characterized as both probabilistic and interval-valued. A two-stage hybrid reliability model is given, with a theoretical foundation and a solution algorithm for hybrid reliability problems. The theoretical foundation is established through the consistency relationship between the interval reliability model and the probability reliability model with normally distributed variables; the solution combines the definition of the interval reliability index with a probabilistic algorithm. Taking into account the parameter characteristics of the S–N curve, a cumulative damage model with hybrid variables is given based on standards from different countries. Finally, a steel structure case from the Neville Island Bridge is analyzed to verify the applicability of the hybrid reliability model in fatigue reliability analysis based on AASHTO.
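
The probabilistic side of such a cumulative-damage analysis can be sketched with Monte Carlo simulation of Miner's rule: damage per stress block is n_i/N_i with N_i = C·S_i^(−m) from an S-N curve, and failure occurs when total damage exceeds 1. All parameter values below (S-N intercept scatter, stress spectrum) are illustrative assumptions, not from the paper.

```python
# Sketch: Monte Carlo estimate of fatigue failure probability under
# Miner's rule with a lognormally scattered S-N curve intercept C.
import random

def fatigue_failure_prob(trials=20000, m=3.0, seed=42):
    random.seed(seed)
    failures = 0
    for _ in range(trials):
        C = random.lognormvariate(28.0, 0.3)            # S-N intercept with scatter
        damage = 0.0
        for stress, cycles in [(80.0, 2e6), (120.0, 5e5)]:  # stress blocks (MPa, n_i)
            N = C * stress ** (-m)                      # cycles to failure at this range
            damage += cycles / N                        # Miner's rule increment
        failures += damage > 1.0                        # failure when damage exceeds 1
    return failures / trials

p_f = fatigue_failure_prob()
```

A hybrid (probabilistic plus interval) treatment would instead propagate interval bounds on C and m through the same damage sum.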

  18. Design modification for the modular helium reactor for higher temperature operation and reliability studies for nuclear hydrogen production processes

    Science.gov (United States)

    Reza, S. M. Mohsin

Design options have been evaluated for the Modular Helium Reactor (MHR) for higher-temperature operation. An alternative configuration for the MHR coolant inlet flow path was developed to reduce the peak vessel temperature (PVT): the coolant inlet path is shifted from the annular path between the reactor core barrel and the vessel wall to channels through the permanent side reflector (PSR). The number and dimensions of the coolant holes were varied to optimize the pressure drop, the inlet velocity, and the percentage of graphite removed from the PSR to create this inlet path. With the removal of ~10% of the graphite from the PSR, the PVT is reduced from 541°C to 421°C. A new design for the graphite block core was evaluated and optimized to reduce the inlet coolant temperature, with the aim of further reducing the PVT. The dimensions and number of fuel rods and coolant holes and the triangular pitch were changed and optimized, and different packing fractions were used in the new core design to conserve the number of fuel particles. Thermal properties for the fuel elements were calculated and incorporated into these analyses. The inlet temperature, mass flow and bypass flow were optimized to keep the peak fuel temperature (PFT) within an acceptable range. Using both modifications together, the PVT is reduced to ~350°C while keeping the outlet temperature at 950°C and maintaining the PFT within acceptable limits. The vessel and fuel temperatures during low-pressure and high-pressure conduction cooldown transients are well below the design limits. Reliability and availability studies have also been carried out for coupled nuclear hydrogen production processes based on the sulfur-iodine thermochemical process and the high-temperature electrolysis process. Fault tree models for both processes were developed using information obtained on system configuration, component failure probability, component repair time and system operating modes

  19. PowerPoint Presentations: A Creative Addition to the Research Process.

    Science.gov (United States)

    Perry, Alan E.

    2003-01-01

    Contends that the requirement of a PowerPoint presentation as part of the research process would benefit students in the following ways: learning how to conduct research; starting their research project sooner; honing presentation and public speaking skills; improving cooperative and social skills; and enhancing technology skills. Outlines the…

  1. Materials and manufacturing processes for increased life/reliability. [of turbine wheels

    Science.gov (United States)

    Duttweiler, R. E.

    1977-01-01

Improvements in both the quality and durability of disk raw material for military and commercial engines necessitated an entirely new concept in raw-material process control, which imposes careful selection, screening and sampling of the basic alloy ingredients, followed by careful monitoring of the melting parameters in all phases of the vacuum melting sequence. Special care is taken to preclude solidification conditions that produce adverse levels of segregation. Melt furnaces are routinely cleaned and inspected for contamination, and ingots are likewise cleaned and inspected before entering the final melt step.

  2. Combination of chemical analyses and animal feeding trials as reliable procedures to assess the safety of heat processed soybean seeds.

    Science.gov (United States)

    Vasconcelos, Ilka M; Brasil, Isabel Cristiane F; Oliveira, José Tadeu A; Campello, Cláudio C; Maia, Fernanda Maria M; Campello, Maria Verônica M; Farias, Davi F; Carvalho, Ana Fontenele U

    2009-06-10

This study assessed whether chemical analyses are sufficient to guarantee the safety of heat-processed soybeans (SB) for human/animal consumption. The effects of extrusion and dry-toasting on seed composition and on the performance of broiler chicks were analyzed. Neither process induced appreciable changes in protein content or amino acid composition. Conversely, toasting reduced all antinutritional proteins by over 85%. Despite that, animals fed toasted SB showed low performance (feed efficiency 57.8 g/100 g). Extrusion left higher contents of antinutrients, particularly trypsin inhibitors (27.53 g/kg flour), but animal performance was significantly (p trials, extrusion appears to be the safest method. In conclusion, to evaluate the reliability of any processing method intended to improve nutritional value, the combination of chemical and animal studies is necessary.

  3. Modern-day power plant processes - enhanced cost-efficiency with accustomed reliability

    Energy Technology Data Exchange (ETDEWEB)

    Weirich, P.-H.; Pietzonka, F. [ABB Kraftwerke AG, Mannheim (Germany)

    1995-12-31

Large coal-fired steam power plants operating exclusively on supercritical steam generation are being built or are already in operation in many industrialized countries, as a result of economic considerations. The development of net efficiency in steam power plants is illustrated by examples. The differences between subcritical and supercritical systems with regard to design, operation, and availability are discussed, and the economic advantages of the supercritical version are shown. This is done using a new approach in which the additional power output obtained from improved efficiency is related to the additional expense required. When comparing the two systems, identical fuel flow rates are assumed, so that the components making up the air and flue-gas paths, as well as the coal and ash handling equipment, can be disregarded. 12 figs.

  4. Comparing the electrical characteristics and reliabilities of BJTs and MOSFETs between Pt and Ti contact silicide processes

    Science.gov (United States)

    Liu, Kaiping; Shang, Ling

    1999-08-01

The sub-threshold characteristics and reliability of BJTs using platinum contact silicide (PtSi) or titanium contact silicide (TiSi2) are compared and analyzed. During processing, the TiSi2 process is observed to produce a higher interface state density (Dit) than the PtSi process. The increase in Dit not only leads to a higher base current in the BJTs but also to a lower transconductance in the MOS transistors. The data also show that the impact on NPN and nMOS devices is more severe than on PNP and pMOS devices, respectively. This can be explained by the non-symmetric interface state distribution, the re-activation of boron, and/or substrate trap centers. The amount of interface states produced depends not only on the thickness of the deposited titanium film, but also on the temperature and duration of the titanium silicide process. The electrical data indicate that after all Back-End-Of-The-Line processing steps, which include a forming gas anneal, Dit is still higher on wafers with the TiSi2 process. The NPN transistor's base current increases at different rates between the two processes but eventually levels off to the same final value, whereas the PNP transistor's base current increases at approximately the same rate but levels off at different final values. This indicates that the TiSi2 process may have modified the silicon and oxygen dangling-bond structure during its high-temperature step, in addition to removing hydrogen from the passivated interface states.

  5. Reliability study of Zr and Al incorporated Hf based high-k dielectric deposited by advanced processing

    Science.gov (United States)

    Bhuyian, Md Nasir Uddin

Hafnium-based high-k dielectric materials have been used successfully in industry as a key replacement for SiO2-based gate dielectrics, enabling CMOS device scaling to the 22-nm technology node. Further scaling according to the device roadmap requires oxides with higher k values in order to scale the equivalent oxide thickness (EOT) to 0.7 nm or below while achieving low defect densities. In addition, next-generation devices need to meet challenges such as improved channel mobility, reduced gate leakage current, good threshold voltage control, lower interface state density, and good reliability. To overcome these challenges, improvements in high-k film properties and deposition methods are highly desirable. In this dissertation, a detailed study of Zr- and Al-incorporated HfO2-based high-k dielectrics is conducted to investigate improvements in electrical characteristics and reliability. To meet the requirement of scaling the gate dielectric to sub-0.7 nm, Zr is added to HfO2 to form Hf1-xZrxO2 with x=0, 0.31 and 0.8, where the dielectric film is deposited under various intermediate processing conditions: (i) DADA, intermediate thermal annealing in a cyclical deposition process; (ii) DSDS, a similar cyclical process with exposure to SPA Ar plasma; and (iii) As-Dep, the dielectric deposited without any intermediate step. MOSCAPs are formed with a TiN metal gate, and the reliability of these devices is investigated by subjecting them to constant voltage stress in the gate injection mode. Stress-induced flat-band voltage shift (ΔVFB), stress-induced leakage current (SILC) and stress-induced interface state degradation are observed. DSDS samples demonstrate superior characteristics, whereas the worst degradation is observed for DADA samples. Time-dependent dielectric breakdown (TDDB) shows that DSDS Hf1-xZrxO2 (x=0.8) has superior characteristics with reduced oxygen vacancy, which is affiliated to

  6. Present status of radiation processing and its future development by using electron accelerator in Vietnam

    Energy Technology Data Exchange (ETDEWEB)

    Tran Khac An; Tran Tich Canh; Doan Binh [Research and Development Center for Radiation Technology (VINAGAMMA), Ho Chi Minh (Viet Nam)]; Nguyen Quoc Hien [Nuclear Research Institute (NRI), Dalat (Viet Nam)]

    2003-02-01

    In Vietnam, studies on radiation processing have been carried out since 1983. Some results are applicable in the fields of agriculture, health and foodstuffs; some research has been developed to commercial scale, and other work has high potential for development using electron accelerators. The paper presents the current status of radiation processing and also outlines the growing tendency toward using electron accelerators in the future. (author)

  7. Monitoring Actuarial Present Values of Term Life Insurance By a Statistical Process Control Chart

    Science.gov (United States)

    Hafidz Omar, M.

    2015-06-01

    Tracking the performance of life insurance or similar insurance policies using a standard statistical process control chart is complex because of many factors. In this work, we present the difficulties in doing so. However, with some modifications of the SPC charting framework, the difficulty becomes manageable for actuaries. We therefore propose monitoring a simpler but natural actuarial quantity that is typically found in recursion formulas of reserves, profit testing, as well as present values. We share some simulation results for the monitoring process. Additionally, some advantages of doing so are discussed.
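
As an editorial illustration of the quantity being monitored (not code from the paper; the term-insurance recursion, discount factor, and 3-sigma Shewhart limits are textbook conventions assumed here), a minimal Python sketch:

```python
import numpy as np

def term_apv(q, v, benefit=1.0):
    """Actuarial present value of an n-year term insurance paying
    `benefit` at the end of the year of death.
    q: yearly mortality rates q_{x+k}; v: annual discount factor."""
    apv, surv = 0.0, 1.0
    for k, qk in enumerate(q):
        # discount (k+1) years * survive k years * die in year k+1
        apv += benefit * v ** (k + 1) * surv * qk
        surv *= 1.0 - qk
    return apv

def shewhart_limits(samples, sigmas=3.0):
    """Center line and control limits for a monitored stream of APVs."""
    m, s = np.mean(samples), np.std(samples, ddof=1)
    return m - sigmas * s, m, m + sigmas * s
```

For example, term_apv([0.1, 0.2], v=1.0) gives 0.1 + 0.9*0.2 = 0.28; a stream of such values drifting outside the Shewhart limits would flag an out-of-control policy.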

  8. Three stages of emotional word processing: an ERP study with rapid serial visual presentation.

    Science.gov (United States)

    Zhang, Dandan; He, Weiqi; Wang, Ting; Luo, Wenbo; Zhu, Xiangru; Gu, Ruolei; Li, Hong; Luo, Yue-Jia

    2014-12-01

    Rapid responses to emotional words play a crucial role in social communication. This study employed event-related potentials to examine the time course of neural dynamics involved in emotional word processing. Participants performed a dual-target task in which positive, negative and neutral adjectives were rapidly presented. The early occipital P1 was found to be larger when elicited by negative words, indicating that the first stage of emotional word processing mainly differentiates between non-threatening and potentially threatening information. The N170 and the early posterior negativity were larger for positive and negative words, reflecting the emotional/non-emotional discrimination stage of word processing. The late positive component not only distinguished emotional words from neutral words, but also differentiated between positive and negative words. This represents the third stage of emotional word processing, the emotion separation. The present results indicated that, similar to the three-stage model of facial expression processing, the neural processing of emotional words can also be divided into three stages. These findings prompt us to believe that the nature of emotion can be analyzed by the brain independent of stimulus type, and that the three-stage scheme may be a common model for emotional information processing in the context of limited attentional resources.

  9. MHC Class II Auto-antigen Processing and Presentation is Unconventional

    Directory of Open Access Journals (Sweden)

    Scheherazade Sadegh-Nasseri

    2015-07-01

    Antigen presentation is highly critical in adaptive immunity. Only by interacting with antigens presented by MHC Class II molecules can helper T cells be stimulated to fight infections or diseases. The degradation of a full protein into small peptide fragments bound to Class II molecules is a dynamic, lengthy process consisting of many steps and chaperones. Dysregulation in any step of antigen processing could lead to the development of self-reactive T cells or a defective immune response to pathogens. Indeed, Human Leucocyte Antigen (HLA) Class II genes are the predominant contributors to susceptibility to autoimmune diseases. Conventional antigen processing calls for internalization of extracellular antigens followed by processing and epitope selection within antigen-processing subcellular compartments, enriched with all necessary accessory molecules, processing enzymes, and proper pH and denaturing conditions. However, recent data examining the temporal relationship between antigen uptake, processing and epitope selection revealed unexpected characteristics for autoantigenic epitopes that are not shared with antigenic epitopes from pathogens. This review provides a discussion of the relevance of these findings to the mechanisms of autoimmunity.

  10. Antigen processing and remodeling of the endosomal pathway: requirements for antigen cross-presentation.

    Science.gov (United States)

    Compeer, Ewoud Bernardus; Flinsenberg, Thijs Willem Hendrik; van der Grein, Susanna Geertje; Boes, Marianne

    2012-01-01

    Cross-presentation of endocytosed antigen as peptide/class I major histocompatibility complex complexes plays a central role in the elicitation of CD8(+) T cell clones that mediate anti-viral and anti-tumor immune responses. While it has been clear that there are specific subsets of professional antigen presenting cells capable of antigen cross-presentation, identification of the mechanisms involved is still ongoing. Especially amongst dendritic cells (DC), there are specialized subsets that are highly proficient at antigen cross-presentation. We here present a focused survey of the cell biological processes in the endosomal pathway that support antigen cross-presentation. This review highlights DC-intrinsic mechanisms that facilitate the cross-presentation of endocytosed antigen, including receptor-mediated uptake, maturation-induced endosomal sorting of membrane proteins, dynamic remodeling of endosomal structures and cell surface-directed endosomal trafficking. We conclude with a description of pathogen-induced deviation of endosomal processing, and discuss how immune evasion strategies pertaining to endosomal trafficking may preclude antigen cross-presentation.

  11. Antigen processing and remodeling of the endosomal pathway: requirements for antigen cross-presentation.

    Directory of Open Access Journals (Sweden)

    Ewoud Bernardus Compeer

    2012-03-01

    The cross-presentation of endocytosed antigen as peptide/class I MHC complexes plays a central role in the elicitation of CD8+ T cell clones that mediate anti-viral and anti-tumor immune responses. While it has been clear that there are specific subsets of professional antigen presenting cells (APC) capable of antigen cross-presentation, description of the mechanisms involved is still ongoing. Especially amongst dendritic cells (DC), there are specialized subsets that are highly proficient at antigen cross-presentation. We here present a focused survey of the cell biological processes in the endosomal pathway that support antigen cross-presentation. This review highlights DC-intrinsic mechanisms that facilitate the cross-presentation of endocytosed antigen, including receptor-mediated uptake, recycling and maturation including the sorting of membrane proteins, dynamic remodeling of endosomal structures and cell surface-directed endosomal trafficking. We conclude with a description of pathogen-induced deviation of endosomal processing, and discuss how immune evasion strategies pertaining to endosomal trafficking may preclude antigen cross-presentation.

  12. Unveiling the fungal mycobiota present throughout the cork stopper manufacturing process

    NARCIS (Netherlands)

    Barreto, M.C.; Houbraken, J.; Samson, R.A.; Brito, D.; Gadanho, M.; San Romão, M.V.

    2012-01-01

    A particular fungal population is present in the main stages of the manufacturing process of cork discs. Its diversity was studied using both culture-dependent (isolation) and culture-independent (denaturing gradient gel electrophoresis and cloning of the ITS1-5.8S-ITS2 region) methods. The mycobiota in the

  13. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    This report was cleared for public release and does not convey any rights or permission to manufacture, use, or sell any patented invention that may relate to it. The report covers testing for reliability prediction of devices exhibiting multiple failure mechanisms, and presents an integrated accelerating and measuring methodology (Table 2: T, V, F matrix versus measured FIT).

  14. Image-Processing Techniques for the Creation of Presentation-Quality Astronomical Images

    CERN Document Server

    Rector, T A; Frattare, L M; English, J; Puuohau-Pummill, K

    2004-01-01

    The quality of modern astronomical data, the power of modern computers and the agility of current image-processing software enable the creation of high-quality images in a purely digital form. The combination of these technological advancements has created a new ability to make color astronomical images. And in many ways it has led to a new philosophy towards how to create them. A practical guide is presented on how to generate astronomical images from research data with powerful image-processing programs. These programs use a layering metaphor that allows for an unlimited number of astronomical datasets to be combined in any desired color scheme, creating an immense parameter space to be explored using an iterative approach. Several examples of image creation are presented. A philosophy is also presented on how to use color and composition to create images that simultaneously highlight scientific detail and are aesthetically appealing. This philosophy is necessary because most datasets do not correspond to t...
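
As an editorial sketch of the layering and stretching workflow the abstract describes (not the authors' code; the arcsinh stretch and percentile cuts are common conventions assumed here), in Python/NumPy:

```python
import numpy as np

def asinh_stretch(data, soften=10.0):
    """Percentile-clipped arcsinh stretch, commonly used to compress
    the large dynamic range of astronomical data into [0, 1]."""
    lo, hi = np.percentile(data, [1, 99])
    scaled = np.clip((data - lo) / (hi - lo), 0.0, 1.0)
    return np.arcsinh(soften * scaled) / np.arcsinh(soften)

def compose_rgb(r, g, b):
    """Layer three stretched datasets into one 8-bit color image."""
    return (np.dstack([asinh_stretch(c) for c in (r, g, b)]) * 255).astype(np.uint8)
```

The fixed three-channel stack here is only the simplest case; the layering metaphor in the paper generalizes this to an unlimited number of datasets combined in any color scheme.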

  15. Accelerator mass spectrometry detection of beryllium ions in the antigen processing and presentation pathway.

    Science.gov (United States)

    Tooker, Brian C; Brindley, Stephen M; Chiarappa-Zucca, Marina L; Turteltaub, Kenneth W; Newman, Lee S

    2015-01-01

    Exposure to small amounts of beryllium (Be) can result in beryllium sensitization and progression to Chronic Beryllium Disease (CBD). In CBD, beryllium is presented to Be-responsive T-cells by professional antigen-presenting cells (APC). This presentation drives T-cell proliferation and pro-inflammatory cytokine (IL-2, TNFα, and IFNγ) production and leads to granuloma formation. The mechanism by which beryllium enters an APC and is processed to become part of the beryllium antigen complex has not yet been elucidated. Developing techniques for beryllium detection with enough sensitivity has presented a barrier to further investigation. The objective of this study was to demonstrate that Accelerator Mass Spectrometry (AMS) is sensitive enough to quantify the amount of beryllium presented by APC to stimulate Be-responsive T-cells. To achieve this goal, APC - which may or may not stimulate Be-responsive T-cells - were cultured with Be-ferritin. Then, by utilizing AMS, the amount of beryllium processed for presentation was determined. Further, IFNγ intracellular cytokine assays were performed to demonstrate that Be-ferritin (at levels used in the experiments) could stimulate Be-responsive T-cells when presented by an APC of the correct HLA type (HLA-DP0201). The results indicated that Be-responsive T-cells expressed IFNγ only when APC with the correct HLA type were able to process Be for presentation. Utilizing AMS, it was determined that APC with HLA-DP0201 had membrane fractions containing 0.17-0.59 ng Be and APC with HLA-DP0401 had membrane fractions bearing 0.40-0.45 ng Be. However, HLA-DP0401 APC had 20-times more Be associated with the whole cells (57.68-61.12 ng) than HLA-DP0201 APC (0.90-3.49 ng). As these findings demonstrate, AMS detection of picogram levels of Be processed by APC is possible. Further, regardless of form, Be requires processing by APC to successfully stimulate Be-responsive T-cells to generate IFNγ.

  16. Neglected lateral process of talus fracture presenting as a loose body in tarsal canal

    Institute of Scientific and Technical Information of China (English)

    Kamal Bali; Sharad Prabhakar; Nitesh Gahlot; Mandeep S Dhillon

    2011-01-01

    Lateral process fractures of the talus are rare injuries with a potential to cause significant morbidity if misdiagnosed. The appropriate management of these fractures is still controversial and only a few reports are available on this subject. We present a case of a 37-year-old male with a neglected fracture of the lateral process of the talus which was misdiagnosed at the time of injury. The patient presented 7 months after misdiagnosis with chronic ankle pain. Our case is unique in that it is a rare case of a neglected fracture of the lateral process of the talus which presented as a loose body in the sinus tarsi. Surgery with excision of the loose body gave a satisfactory outcome over 2 years' follow-up. To our knowledge, it is the first such case reported in the English literature. Through this case report, we highlight the importance of a high index of suspicion for such rare bony injuries when evaluating trauma to the lateral side of the ankle, and discuss the principles of management of these fractures.

  17. A Review and Comparison of the Reliabilities of the MMPI-2, MCMI-III, and PAI Presented in Their Respective Test Manuals

    Science.gov (United States)

    Wise, Edward A.; Streiner, David L.; Walfish, Steven

    2010-01-01

    This article provides a review of the literature to determine the most frequently used personality tests. Based on this review, internal consistency and test-retest reliability coefficients from the test manuals for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2), Millon Clinical Multiaxial Inventory-III (MCMI-III), and Personality…

  19. Rapid FLIM: The new and innovative method for ultra-fast imaging of biological processes (Conference Presentation)

    Science.gov (United States)

    Orthaus-Mueller, Sandra; Kraemer, Benedikt; Tannert, Astrid; Roehlicke, Tino; Wahl, Michael; Rahn, Hans-Juergen; Koberling, Felix; Erdmann, Rainer

    2017-02-01

    Over the last two decades, time-resolved fluorescence microscopy has become an essential tool in Life Sciences thanks to measurement procedures such as Fluorescence Lifetime Imaging (FLIM), lifetime based Foerster Resonance Energy Transfer (FRET), and Fluorescence (Lifetime) Correlation Spectroscopy (F(L)CS) down to the single molecule level. Today, complete turn-key systems are available either as stand-alone units or as upgrades for confocal laser scanning microscopes (CLSM). Data acquisition on such systems is typically based on Time-Correlated Single Photon Counting (TCSPC) electronics along with picosecond pulsed diode lasers as excitation sources and highly sensitive, single photon counting detectors. Up to now, TCSPC data acquisition has been considered a somewhat slow process, as a large number of photons per pixel is required for reliable data analysis, making it difficult to use FLIM for following fast FRET processes, such as signal transduction pathways in cells or fast moving sub-cellular structures. We present here a novel and elegant solution to tackle this challenge. Our approach, named rapidFLIM, exploits recent hardware developments such as TCSPC modules with ultra-short dead times and hybrid photomultiplier detector assemblies enabling significantly higher detection count rates. Thanks to these improved components, it is possible to achieve much better photon statistics in significantly shorter time spans while being able to perform FLIM imaging for fast processes in a qualitative manner and with high optical resolution. FLIM imaging can now be performed at up to several frames per second, making it possible to study fast processes such as protein interactions involved in endosome trafficking.

  20. Safety and reliability analysis in a polyvinyl chloride batch process using dynamic simulator-case study: Loss of containment incident.

    Science.gov (United States)

    Rizal, Datu; Tani, Shinichi; Nishiyama, Kimitoshi; Suzuki, Kazuhiko

    2006-10-11

    In this paper, a novel methodology for batch plant safety and reliability analysis is proposed using a dynamic simulator. A batch process involves several safety objects (e.g. sensors, controllers, valves, etc.) that are activated during the operational stage. The performance of the safety objects is evaluated by dynamic simulation and a fault propagation model is generated. Using the fault propagation model, an improved fault tree analysis (FTA) method using switching signal mode (SSM) is developed for estimating the probability of failures. Time-dependent failures can be considered as unavailability of safety objects that can cause accidents in a plant. Finally, the ranking of safety objects is formulated as a performance index (PI) and can be estimated using importance measures. The PI shows the prioritization of safety objects that should be investigated in a safety improvement program in the plants. The output of this method can be used for an optimal policy of safety object improvement and maintenance. The dynamic simulator was constructed using Visual Modeler (VM, the plant simulator developed by Omega Simulation Corp., Japan). A case study focuses on a loss of containment (LOC) incident in a polyvinyl chloride (PVC) batch process, which consumes the hazardous material vinyl chloride monomer (VCM).
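
To make the fault tree arithmetic concrete (an editorial sketch, not the authors' SSM method; the safety objects and probabilities below are hypothetical), a top-event probability can be composed from independent basic-event unavailabilities with the standard AND/OR gate formulas:

```python
from functools import reduce

def and_gate(probs):
    """All inputs must fail (independent events): product of probabilities."""
    return reduce(lambda a, b: a * b, probs)

def or_gate(probs):
    """At least one input fails: 1 minus the product of complements."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical unavailabilities of three safety objects during a batch run
p_sensor, p_valve, p_controller = 0.01, 0.02, 0.05

# Loss of containment if the sensor fails, or both valve and controller fail
p_loc = or_gate([p_sensor, and_gate([p_valve, p_controller])])
```

With the hypothetical numbers above, p_loc = 1 - 0.99 * (1 - 0.001), roughly 0.011; importance measures would then rank the sensor as the dominant contributor.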

  1. A Reliable Turning Process by the Early Use of a Deep Simulation Model at Several Manufacturing Stages

    Directory of Open Access Journals (Sweden)

    Gorka Urbikain

    2017-05-01

    The future of machine tools will be dominated by highly flexible and interconnected systems, in order to achieve the required productivity, accuracy, and reliability. Nowadays, distortion and vibration problems are easily solved in labs for the most common machining operations by using models based on the equations describing the physical laws of the machining processes; however, additional efforts are needed to overcome the gap between scientific research and real manufacturing problems. In fact, there is an increasing interest in developing simulation packages based on “deep-knowledge and models” that aid machine designers, production engineers, or machinists to get the most out of the machine tools. This article proposes a methodology to reduce problems in machining by means of a simulation utility, which uses the main variables of the system and process as input data, and generates results that help in proper decision-making and machining planning. Direct benefits can be found in (a) the optimal design of the fixture/clamping; (b) the machine tool configuration; (c) the definition of chatter-free optimum cutting conditions; and (d) the right programming of cutting toolpaths at the Computer Aided Manufacturing (CAM) stage. The information- and knowledge-based approach showed successful results in several local manufacturing companies, as explained in the paper.

  2. Present state of radiation processing and its use in the USSR

    Energy Technology Data Exchange (ETDEWEB)

    Fedorov, V.V. (V.G. Khlopin Radium Institute, Leningrad (USSR)); Kon'kov, N.G.

    1984-01-01

    There is growing interest in the processes of radiation technology because of their high economic efficiency when they are integrated into the national economy. Also, the recent increase in power and the availability of electrophysical and radionuclide sources of ionizing radiation allow production on a large scale. In the USSR, radiation processes are utilized widely for the production of materials and workpieces with new useful properties, the increase of agricultural productivity and the improvement of foodstuff preservation, the sterilization of medical materials, and the disinfection of municipal and industrial wastes. The present state of development in radiation processing in these areas is described on the basis of the proceedings of the all-union conferences in this field held in 1982 and 1983.

  3. Emergency Physicians' Perceptions and Decision-making Processes Regarding Patients Presenting with Palpitations.

    Science.gov (United States)

    Probst, Marc A; Kanzaria, Hemal K; Hoffman, Jerome R; Mower, William R; Moheimani, Roya S; Sun, Benjamin C; Quigley, Denise D

    2015-08-01

    Palpitations are a common emergency department (ED) complaint, yet relatively little research exists on this topic from an emergency care perspective. We sought to describe the perceptions and clinical decision-making processes of emergency physicians (EPs) surrounding patients with palpitations. We conducted 21 semistructured interviews with a convenience sample of EPs. We recruited participants from academic and community practice settings in four regions of the United States. The transcribed interviews were analyzed using a combination of structural coding and grounded theory approaches with ATLAS.ti, a qualitative data analysis software program (version 7; ATLAS.ti Scientific Software Development GmbH, Berlin, Germany). EPs' clinical approach to palpitations, with regard to testing, treatment, and ED management, can be classified as relating to one or more of the following themes: (1) risk stratification, (2) diagnostic categorization, (3) algorithmic management, and (4) case-specific gestalt. With regard to disposition decisions, four main themes emerged: (1) presence of a serious diagnosis, (2) perceived need for further cardiac testing/monitoring, (3) presence of key associated symptoms, and (4) request of another physician or patient desire. The interrater reliability exercise yielded a Fleiss' kappa of 0.69, indicating substantial agreement between coders. EPs perceive palpitations to be a common but generally benign chief complaint, and rely on one or more of four main clinical approaches to manage these patients. These findings could help guide future efforts at developing risk-stratification tools and clinical algorithms for patients with palpitations.
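
The Fleiss' kappa reported for the coders can be computed from a subjects-by-categories count matrix; a minimal Python sketch of the standard formula (an editorial illustration, not the authors' analysis code):

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for an (n_subjects x n_categories) matrix where each
    row holds how many raters assigned that subject to each category."""
    counts = np.asarray(counts, dtype=float)
    n_subjects, _ = counts.shape
    n_raters = counts[0].sum()
    # overall proportion of assignments per category
    p_cat = counts.sum(axis=0) / (n_subjects * n_raters)
    # per-subject observed agreement
    p_i = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_obs, p_exp = p_i.mean(), np.sum(p_cat ** 2)
    return (p_obs - p_exp) / (1.0 - p_exp)
```

Perfect agreement yields kappa = 1; the 0.69 reported in the abstract falls in the conventional "substantial agreement" band.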

  4. CLIC1 regulates dendritic cell antigen processing and presentation by modulating phagosome acidification and proteolysis

    Directory of Open Access Journals (Sweden)

    Kanin Salao

    2016-05-01

    Intracellular chloride channel protein 1 (CLIC1) participates in inflammatory processes by regulating macrophage phagosomal functions such as pH and proteolysis. Here, we sought to determine if CLIC1 can regulate adaptive immunity by actions on dendritic cells (DCs), the key professional antigen-presenting cells. To do this, we first generated bone marrow-derived DCs (BMDCs) from germline CLIC1 gene-deleted (CLIC1−/−) and wild-type (CLIC1+/+) mice, then studied them in vitro and in vivo. We found phagocytosis triggered cytoplasmic CLIC1 translocation to the phagosomal membrane, where it regulated phagosomal pH and proteolysis. Phagosomes from CLIC1−/− BMDCs displayed impaired acidification and proteolysis, which could be reproduced if CLIC1+/+, but not CLIC1−/−, cells were treated with IAA94, a CLIC-family ion channel blocker. CLIC1−/− BMDCs displayed reduced in vitro antigen processing and presentation of full-length myelin oligodendrocyte glycoprotein (MOG) and reduced MOG-induced experimental autoimmune encephalomyelitis. These data suggest that CLIC1 regulates DC phagosomal pH to ensure optimal processing of antigen for presentation to antigen-specific T-cells. Further, they indicate that CLIC1 is a novel therapeutic target to help reduce the adaptive immune response in autoimmune diseases.

  5. The Impact of the Delivery of Prepared Power Point Presentations on the Learning Process

    Directory of Open Access Journals (Sweden)

    Auksė Marmienė

    2011-04-01

    This article describes the process of the preparation and delivery of PowerPoint presentations and how it can be used by teachers as a resource for classroom teaching. The advantages of this classroom activity are outlined, covering some of the problems and providing a few suggestions for dealing with those difficulties. The major objective of the present paper is to investigate students' ability to choose the material and content of PowerPoint presentations on professional topics via the Internet, as well as their ability to prepare and deliver the presentation in front of an audience. The factors which determine the choice of the presentation subject are also analysed in this paper. After the delivery, students were requested to self- and peer-assess the difficulties they faced in the preparation and performance of the presentations by writing reports. Learners' attitudes to the choice of the topic of PowerPoint presentations were surveyed by administering a self-assessment questionnaire.

  6. Automatic processing of unattended lexical information in visual oddball presentation: neurophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yury Shtyrov

    2013-08-01

    Previous electrophysiological studies of automatic language processing revealed early (100-200 ms) reflections of access to lexical characteristics of the speech signal using the so-called mismatch negativity (MMN), a negative ERP deflection elicited by infrequent irregularities in unattended repetitive auditory stimulation. In those studies, lexical processing of spoken stimuli became manifest as an enhanced ERP in response to unattended real words as opposed to phonologically matched but meaningless pseudoword stimuli. This lexical ERP enhancement was explained by automatic activation of word memory traces realised as distributed, strongly intra-connected neuronal circuits, whose robustness guarantees memory trace activation even in the absence of attention on spoken input. Such an account would predict the automatic activation of these memory traces upon any presentation of linguistic information, irrespective of the presentation modality. As previous lexical MMN studies exclusively used auditory stimulation, we here adapted the lexical MMN paradigm to investigate early automatic lexical effects in the visual modality. In a visual oddball sequence, matched short word and pseudoword stimuli were presented tachistoscopically in the perifoveal area outside the visual focus of attention, while the subjects' attention was concentrated on a concurrent non-linguistic visual dual task in the centre of the screen. Using EEG, we found a visual analogue of the lexical ERP enhancement effect, with unattended written words producing larger brain response amplitudes than matched pseudowords, starting at ~100 ms. Furthermore, we also found a significant visual MMN, reported here for the first time for unattended lexical stimuli presented perifoveally. The data suggest early automatic lexical processing of visually presented language outside the focus of attention.

  7. Canadian Thoracic Society: Presenting a New Process for Clinical Practice Guideline Production

    Directory of Open Access Journals (Sweden)

    Samir Gupta

    2009-01-01

    A key mandate of the Canadian Thoracic Society (CTS) is to promote evidence-based respiratory care through clinical practice guidelines (CPGs). To improve the quality and validity of the production, dissemination and implementation of its CPGs, the CTS has revised its guideline process and has created the Canadian Respiratory Guidelines Committee to oversee this process. The present document outlines the basic methodological tools and principles of the new CTS guideline production process. Important features include standard methods for choosing and formulating optimal questions and for finding, appraising, and summarizing the evidence; use of the Grading of Recommendations Assessment, Development and Evaluation (GRADE) system for rating the quality of evidence and strength of recommendations; use of the Appraisal of Guidelines for Research and Evaluation (AGREE) instrument for quality control during and after guideline development and for appraisal of other guidelines; use of the ADAPTE process for adaptation of existing guidelines to the local context; and use of the GuideLine Implementability Appraisal tool to augment the implementability of guidelines. The CTS has also committed to develop guidelines in new areas, an annual guideline review cycle, and a new formal process for dissemination and implementation. Ultimately, it is anticipated that these changes will have a significant impact on the quality of care and clinical outcomes of individuals suffering from respiratory diseases across Canada.

  8. HIV Protease Inhibitor-Induced Cathepsin Modulation Alters Antigen Processing and Cross-Presentation.

    Science.gov (United States)

    Kourjian, Georgio; Rucevic, Marijana; Berberich, Matthew J; Dinter, Jens; Wambua, Daniel; Boucau, Julie; Le Gall, Sylvie

    2016-05-01

    Immune recognition by T cells relies on the presentation of pathogen-derived peptides by infected cells, but the persistence of chronic infections calls for new approaches to modulate immune recognition. Ag cross-presentation, the process by which pathogen Ags are internalized, degraded, and presented by MHC class I, is crucial to prime CD8 T cell responses. The original degradation of Ags is performed by pH-dependent endolysosomal cathepsins. In this article, we show that HIV protease inhibitors (PIs) prescribed to HIV-infected persons variably modulate cathepsin activities in human APCs, dendritic cells and macrophages, and CD4 T cells, three cell subsets infected by HIV. Two HIV PIs acted in two complementary ways on cathepsin hydrolytic activities: directly on cathepsins and indirectly on their regulators by inhibiting Akt kinase activities, reducing NADPH oxidase 2 activation, and lowering phagolysosomal reactive oxygen species production and pH, which led to enhanced cathepsin activities. HIV PIs modified endolysosomal degradation and epitope production of proteins from HIV and other pathogens in a sequence-dependent manner. They altered cross-presentation of Ags by dendritic cells to epitope-specific T cells and T cell-mediated killing. HIV PI-induced modulation of Ag processing partly changed the MHC self-peptidome displayed by primary human cells. This first identification, to our knowledge, of prescription drugs modifying the regulation of cathepsin activities and the MHC-peptidome may provide an alternate therapeutic approach to modulate immune recognition in immune disease beyond HIV.

  9. Regulation of protein synthesis and autophagy in activated dendritic cells: implications for antigen processing and presentation.

    Science.gov (United States)

    Argüello, Rafael J; Reverendo, Marisa; Gatti, Evelina; Pierre, Philippe

    2016-07-01

    Antigenic peptides presented in the context of major histocompatibility complex (MHC) molecules originate from the degradation of both self and non-self proteins. T cells can therefore recognize, at the surface of surveyed cells, the self-peptidome produced by the cell itself (mostly inducing tolerance) or immunogenic peptides derived from exogenous origins. The initiation of adaptive immune responses by dendritic cells (DCs), through the antigenic priming of naïve T cells, is associated with engagement of microbial pattern-recognition receptors. Activation of DCs by microbial products or inflammatory cytokines initiates multiple processes that maximize DC capacity to present exogenous antigens and stimulate T cells by affecting major metabolic and membrane traffic pathways. These include the modulation of protein synthesis, the regulation of MHC and co-stimulatory molecule transport, as well as the regulation of autophagy, all of which together promote exogenous antigen presentation while limiting the display of self-antigens by MHC molecules.

  10. Reliable Design Versus Trust

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following questions will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges of verifying a reliable design versus a trusted design?

  11. Automatic processing of unattended lexical information in visual oddball presentation: neurophysiological evidence.

    Science.gov (United States)

    Shtyrov, Yury; Goryainova, Galina; Tugin, Sergei; Ossadtchi, Alexey; Shestakova, Anna

    2013-01-01

    Previous electrophysiological studies of automatic language processing revealed early (100-200 ms) reflections of access to lexical characteristics of speech signal using the so-called mismatch negativity (MMN), a negative ERP deflection elicited by infrequent irregularities in unattended repetitive auditory stimulation. In those studies, lexical processing of spoken stimuli became manifest as an enhanced ERP in response to unattended real words, as opposed to phonologically matched but meaningless pseudoword stimuli. This lexical ERP enhancement was explained by automatic activation of word memory traces realized as distributed strongly intra-connected neuronal circuits, whose robustness guarantees memory trace activation even in the absence of attention on spoken input. Such an account would predict the automatic activation of these memory traces upon any presentation of linguistic information, irrespective of the presentation modality. As previous lexical MMN studies exclusively used auditory stimulation, we here adapted the lexical MMN paradigm to investigate early automatic lexical effects in the visual modality. In a visual oddball sequence, matched short word and pseudoword stimuli were presented tachistoscopically in perifoveal area outside the visual focus of attention, as the subjects' attention was concentrated on a concurrent non-linguistic visual dual task in the center of the screen. Using EEG, we found a visual analogue of the lexical ERP enhancement effect, with unattended written words producing larger brain response amplitudes than matched pseudowords, starting at ~100 ms. Furthermore, we also found significant visual MMN, reported here for the first time for unattended perifoveal lexical stimuli. The data suggest early automatic lexical processing of visually presented language which commences rapidly and can take place outside the focus of attention.

  12. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
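The fault-tree approach the report describes, deriving system reliability from component reliability, can be sketched in a few lines. This is a minimal illustration, not the report's model: the converter components, gate structure, and failure probabilities below are invented, and component failures are assumed independent.

```python
# Sketch of deriving system failure probability from component failure
# probabilities via basic fault-tree gates. An AND gate fails only if ALL of
# its inputs fail (redundancy); an OR gate fails if ANY input fails (series).
# Independence of component failures is assumed throughout.

def and_gate(failure_probs):
    """Failure probability of an AND gate: every redundant input must fail."""
    p = 1.0
    for q in failure_probs:
        p *= q
    return p

def or_gate(failure_probs):
    """Failure probability of an OR gate: any single input failure is fatal."""
    survive = 1.0
    for q in failure_probs:
        survive *= (1.0 - q)
    return 1.0 - survive

# Hypothetical power-electronics converter: two redundant cooling fans feed an
# AND gate, whose output joins a controller and a switching stage at the top
# OR gate. All probabilities are illustrative annual failure probabilities.
p_fan, p_controller, p_switch = 0.05, 0.02, 0.01

p_cooling = and_gate([p_fan, p_fan])                    # both fans must fail
p_system = or_gate([p_cooling, p_controller, p_switch]) # any branch fails it
reliability = 1.0 - p_system
print(f"system reliability: {reliability:.4f}")
```

With these numbers the redundant cooling branch contributes only 0.0025 to the top event, so the single-point controller and switching failures dominate, which is exactly the kind of insight the fault-tree decomposition is meant to surface.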

  13. Character Decomposition and Transposition Processes of Chinese Compound Words in Rapid Serial Visual Presentation

    Science.gov (United States)

    Cao, Hong-Wen; Yang, Ke-Yu; Yan, Hong-Mei

    2017-01-01

    Character order information is encoded at the initial stage of Chinese word processing; however, its time course remains underspecified. In this study, we assess the exact time course of the character decomposition and transposition processes of two-character Chinese compound words (canonical, transposed, or reversible words) compared with pseudowords, using dual-target rapid serial visual presentation (RSVP) of stimuli appearing at 30 ms per character with no inter-stimulus interval. The results indicate that Chinese readers can identify words with character transpositions in rapid succession; however, a transposition cost is involved in identifying transposed words compared to canonical words. In RSVP reading, the character order of words is more likely to be reversed during the period from 30 to 180 ms for canonical and reversible words, but from 30 to 240 ms for transposed words. Taken together, the findings demonstrate that the holistic representation of the base word is activated; however, the order of the two constituent characters is not strictly processed during the very early stage of visual word processing.

  14. Geodetic Constraints on the Qinghai-Tibetan Plateau Present-Day Geophysical Processes

    Directory of Open Access Journals (Sweden)

    Kamil Erkan

    2011-01-01

    Full Text Available The Qinghai-Tibetan Plateau is the largest and highest area in the world, with distinct and competing surface and subsurface processes. The entire Plateau has been undergoing crustal deformation and accompanying isostatic uplift as a result of the Cenozoic collision of the Indian and Eurasian continents. Regional secular surface mass changes include the melting of mountain glaciers and ice caps, and permafrost layer degradation due to global warming. There is also a plausible effect of glacial isostatic adjustment due to the removal of a possible Pleistocene ice sheet. In this article, we present an assessment of the sizes and extents of these competing interior and exterior dynamical processes, and their possible detection using contemporary space geodetic techniques. These techniques include, in addition to GPS, satellite radar altimetry over land, and temporal gravity field measurements from the Gravity Recovery and Climate Experiment (GRACE) satellite mission. These techniques are complementary: land satellite altimetry, like GPS, is sensitive only to surface uplift, whereas GRACE is sensitive to both surface uplift and mass changes inside the Earth. Each process may dominate the others in a particular region. Our analysis shows that GRACE data are more sensitive (than GPS or land altimetry) to hydrologic and meteorological signals, some of which are larger than the combined effect of geodynamic processes and permafrost degradation.

  15. Long-Term Reliability of a Hard-Switched Boost Power Processing Unit Utilizing SiC Power MOSFETs

    Science.gov (United States)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Iannello, Christopher J.; Del Castillo, Linda Y.; Fitzpatrick, Fred D.; Mojarradi, Mohammad M.; Chen, Yuan

    2016-01-01

    Silicon carbide (SiC) power devices have demonstrated many performance advantages over their silicon (Si) counterparts. As the inherent material limitations of Si devices are being swiftly realized, wide-band-gap (WBG) materials such as SiC have become increasingly attractive for high power applications. In particular, SiC power metal oxide semiconductor field effect transistors' (MOSFETs) high breakdown field tolerance, superior thermal conductivity, and low-resistivity drift regions make these devices excellent candidates for power dense, low loss, high frequency switching applications in extreme environment conditions. In this paper, a novel power processing unit (PPU) architecture is proposed utilizing commercially available 4H-SiC power MOSFETs from CREE Inc. A multiphase straight boost converter topology is implemented to supply up to 10 kilowatts full-scale. High Temperature Gate Bias (HTGB) and High Temperature Reverse Bias (HTRB) characterization is performed to evaluate the long-term reliability of both the gate oxide and the body diode of the SiC components. Finally, the susceptibility of the CREE SiC MOSFETs to damaging effects from heavy-ion radiation representative of the on-orbit galactic cosmic ray environment is explored. The results provide the baseline performance metrics of operation as well as demonstrate the feasibility of a hard-switched PPU in harsh environments.

  16. Manufacturability and optical functionality of multimode optical interconnections developed with fast processable and reliable polymer waveguide silicones

    Science.gov (United States)

    Liu, Joe; Lee, Allen; Hu, Mike; Chan, Lisa; Huang, Sean; Swatowski, Brandon W.; Weidner, W. Ken; Han, Joseph

    2015-03-01

    We report on the manufacturing, reliability, and optical functionality of multimode optical waveguide devices developed with a fast-processable optical-grade silicone. The materials show proven optical losses that hold up through 2000 hours of 85°C/85% relative humidity testing as well as >4 cycles of wave solder reflow. Fast fabrication on rigid FR4 and flexible polyimide substrates with precise alignment features (cut by dicing saw or ablated by UV laser) is reported. Two out-of-plane coupling techniques were demonstrated in this paper: an MT-connectorized sample with a 45° turning lens, and 45° dielectric mirrors formed on the waveguides by dicing saw. Multiple connections between fiber and polymer waveguides, using MPO connectors and the two out-of-plane coupling techniques in a complete optical link, are demonstrated at 10 Gbps data rates with commercial transceiver modules. Complex waveguide geometries such as turns and crossings are also demonstrated with a QSFP+ transceiver. Eye-diagram analyses show comparable functionality between the silicone waveguide and fiber formats.

  17. BRIDGE: A Model for Modern Software Development Process to Cater the Present Software Crisis

    CERN Document Server

    Mandal, Ardhendu

    2011-01-01

    As hardware components become cheaper and more powerful by the day, the services expected of modern software are growing enormously. Developing such software has become extremely challenging: the real challenge lies not only in the complexity, but also in developing the software within time and budget constraints, with quality and maintainability concerns adding to it. Moreover, client requirements change so frequently that managing these changes has become extremely difficult, and clients are often unhappy with the end product. Large, complex software projects are notoriously late to market, often exhibit quality problems, and don't always deliver on promised functionality. None of the existing models adequately addresses the modern software crisis; a better software development process model is badly needed. This paper suggests a new software development process model, BRIDGE, to tackle pr...

  18. [DESCRIPTION AND PRESENTATION OF THE RESULTS OF ELECTROENCEPHALOGRAM PROCESSING USING AN INFORMATION MODEL].

    Science.gov (United States)

    Myznikov, I L; Nabokov, N L; Rogovanov, D Yu; Khankevich, Yu R

    2016-01-01

    The paper proposes to apply the informational modeling of correlation matrix developed by I.L. Myznikov in early 1990s in neurophysiological investigations, such as electroencephalogram recording and analysis, coherence description of signals from electrodes on the head surface. The authors demonstrate information models built using the data from studies of inert gas inhalation by healthy human subjects. In the opinion of the authors, information models provide an opportunity to describe physiological processes with a high level of generalization. The procedure of presenting the EEG results holds great promise for the broad application.

  19. Photo-induced electron transfer processes in doped conjugated polymer films (Presentation Recording)

    Science.gov (United States)

    Rumbles, Garry; Reid, Obadiah G.; Park, Jaehong; Ramirez, Jessica; Marsh, Hilary; Clikeman, Tyler T.

    2015-08-01

    With increasing knowledge of the role of the different phases in the bulk heterojunction organic solar cell, the primary site for charge generation is now considered to be the mixed phase, and not the clean interface between neat polymer and neat fullerene. To gain a better understanding of the primary charge generating and recombination steps in this region of the system, we focus our studies on the role of the solid-state microstructure of neat polymers and light-doping of these polymers with a variety of electron-accepting dopants at low concentration. This presentation will describe some recent work on the doping of polythiophene and polyfluorene derivatives with fullerenes, phthalocyanines and perylenes, which provide a range of reduction potentials that serve to control the driving force for electron transfer processes. Results from flash photolysis, time-resolved microwave conductivity (fp-TRMC), femtosecond transient absorption spectroscopy (fTA) and photoluminescence spectroscopy will be presented.

  20. Making and Unmaking the Endangered in India (1880-Present): Understanding Animal-Criminal Processes

    Directory of Open Access Journals (Sweden)

    Varun Sharma

    2015-01-01

    Full Text Available The concerns of the present paper emerge from the single basic question of whether the available histories of the tiger are comprehensive enough to enable an understanding of how this nodular species comprises/contests the power dynamics of the present. Starting with this basic premise, this paper retells a series of events which clarify that a nuanced understanding of the manner in which a species serves certain political purposes is not possible by tracking the animal alone. A discourse on endangerment has beginnings in the body and being of species that are remarkably cut off from the tiger (the elephant, birds, and the rhino, and man if we might add) and develops, with serious implications for power, resource appropriation, and criminality, over a period of time, before more directly recruiting the tiger itself. If we can refer to this as the intermittent making and unmaking of the endangered, it is by turning to the enunciations of Michel Foucault that we try to canvass a series of events that can be described as animal-criminal processes. The role of such processes in the construction of endangerment, the structuring of space, and shared ideas of man-animal relations is further discussed in this paper.

  1. Unveiling the fungal mycobiota present throughout the cork stopper manufacturing process.

    Science.gov (United States)

    Barreto, Maria C; Houbraken, Jos; Samson, Robert A; Brito, Dulce; Gadanho, Mário; San Romão, Maria V

    2012-10-01

    A particular fungal population is present in the main stages of the manufacturing process of cork discs. Its diversity was studied using both dependent (isolation) and independent culture methods (denaturing gel gradient electrophoresis and cloning of the ITS1-5.8S-ITS2 region). The mycobiota in the samples taken in the stages before and after the first boiling seems to be distinct from the population in the subsequent manufacturing stages. Most isolated fungi belong to the genera Penicillium, Eurotium and Cladosporium. The presence of uncultivable fungi, Ascomycota and endophytes in raw cork was confirmed by sequencing. The samples taken after the first boiling contained uncultivable fungi, but in a few samples some isolated fungi were also detected. The main taxa present in the following stages were Chrysonilia sitophila, Penicillium glabrum and Penicillium spp. All applied techniques had complementary outcomes. The main factors driving the shift in cork fungal colonization seem to be the high levels of humidity and temperature to which the slabs are subjected during the boiling process.

  2. Cooperation or competition of the two hemispheres in processing characters presented at vertical midline.

    Directory of Open Access Journals (Sweden)

    Rolf Verleger

    Full Text Available Little is known about how the hemispheres interact in the processing of stimuli presented at the vertical midline. Processing might be mutually independent or cooperative. Here we measured target identification and visually evoked EEG potentials while stimulus streams containing two targets, T1 and T2, were presented either at the vertical midline, above and below fixation, or laterally, left and right. With left and right streams, potentials evoked by filler stimuli and by T2 were earlier at the right than the left visual cortex, and T2 was better identified left than right, confirming earlier results and suggesting better capabilities of the right hemisphere in this task. With streams above and below fixation, EEG potentials evoked by filler stimuli and by T2 were likewise earlier at the right than the left hemisphere, and T2 was generally identified as well as, but not better than, left T2, and in one target constellation even worse (T2 in the lower stream preceded by T1 in the upper stream). These results suggest a right-hemisphere preference for this task even with stimuli at the vertical midline, and no added value through hemispheric cooperation. Lacking asymmetry for T1 amidst asymmetries for filler stimuli and for T2 might indicate alternating access of the hemispheres to midline stimuli as one means of hemispheric division of labor.

  3. Applying Failure Modes, Effects, And Criticality Analysis And Human Reliability Analysis Techniques To Improve Safety Design Of Work Process In Singapore Armed Forces

    Science.gov (United States)

    2016-09-01

    interactions from a purely mechanical standpoint. However, SAF does not apply FMECA to work processes that are typical of SAF training and operational...take considerable effort to complete. Therefore, it should be applied to work processes or activities that are generally static in nature...
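FMECA rankings of the kind this thesis applies to work processes are conventionally driven by a Risk Priority Number, RPN = severity × occurrence × detection, each rated on a 1-10 scale. The snippet above does not state the thesis's exact rating scheme, so the failure modes and ratings below are hypothetical illustrations only:

```python
# Illustrative FMECA criticality ranking via Risk Priority Number (RPN).
# RPN = severity * occurrence * detection, each scored 1-10; higher RPN means
# higher priority for corrective action. All entries below are hypothetical.

failure_modes = [
    {"mode": "live round in cleared weapon", "severity": 10, "occurrence": 2, "detection": 4},
    {"mode": "incorrect lifting posture",    "severity": 5,  "occurrence": 6, "detection": 7},
    {"mode": "vehicle marshalling error",    "severity": 8,  "occurrence": 3, "detection": 5},
]

# Compute the RPN for each failure mode.
for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Rank highest-risk modes first for corrective action.
for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(f'{fm["rpn"]:4d}  {fm["mode"]}')
```

Note how a frequent, hard-to-detect hazard can outrank a catastrophic but rare and detectable one; this is the standard criticism of raw RPN ranking, and one reason FMECA is often paired with human reliability analysis as the title suggests.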

  4. Reliability solutions for a smart digital factory using: (1) RFID based CEP; (2) Image processing based error detection; (3) RFID based HCI

    OpenAIRE

    Badr, Eid

    2011-01-01

    New technologies have a great influence on the production process in modern factories. Introducing new techniques and methods is crucial to optimize and enhance the working of factories. However, ensuring a reliable and correct integration requires complete evaluation and assessment. In this thesis I utilize RFID systems and image processing to develop and implement real time solutions to enhance and optimize the production and assembly processes. Solutions include: RFID based CEP to detect p...

  5. A New Metamodeling Approach for Time-dependent Reliability of Dynamic Systems with Random Parameters Excited by Input Random Processes

    Science.gov (United States)

    2014-04-09

    Simulation-based Time-dependent Reliability Analysis for Composite Hydrokinetic Turbine Blades,” Structural and Multidisciplinary Optimization...Genetic Algorithm,” ASME Journal of Mechanical Design, 131(7). 13. Hu, Z., and Du, X., 2012, “Reliability Analysis for Hydrokinetic Turbine Blades...to Seismic Risk Based on Dynamic Analysis,” Journal of Engineering Mechanics, 129, 901- 917. 19. Beck, J. L., and Au, S. K., 2002, “Bayesian Updating
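The time-dependent reliability problem named in the title asks for the probability that a dynamic response stays below a threshold over an entire time interval (a first-passage formulation). As a rough illustration of what the metamodel is built to accelerate, here is a brute-force Monte Carlo sketch; the response function, parameter distributions, and threshold are invented stand-ins for an expensive dynamic simulation:

```python
# Brute-force Monte Carlo estimate of time-dependent failure probability:
# the probability that the response exceeds a threshold at ANY time in [0, T].
# Metamodeling approaches aim to replace the per-sample response evaluation;
# here the "response" is a cheap, purely illustrative stand-in.
import math
import random

random.seed(42)

def response(a, omega, t):
    # Hypothetical dynamic response with random amplitude a and frequency
    # omega, plus a slow drift term.
    return a * math.sin(omega * t) + 0.1 * t

THRESHOLD = 1.5
T, STEPS, SAMPLES = 10.0, 200, 5000

failures = 0
for _ in range(SAMPLES):
    a = random.gauss(1.0, 0.2)       # random system parameter
    omega = random.gauss(2.0, 0.3)   # random system parameter
    # First passage: failure if the threshold is crossed anywhere on the grid.
    if any(response(a, omega, i * T / STEPS) > THRESHOLD for i in range(STEPS + 1)):
        failures += 1

pf = failures / SAMPLES
print(f"estimated time-dependent failure probability: {pf:.3f}")
```

The point of the metamodeling approach in the record is precisely that each of these 5000 trajectory evaluations would otherwise be a full dynamic simulation, which brute-force sampling cannot afford.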

  6. The R and M 2000 Process and Reliability and Maintainability Management: Attitudes of Senior Level Managers in Aeronautical Systems Division

    Science.gov (United States)

    1988-09-01

    maintain an Air Force Center of Excellence for Reliability and Maintainability (CERM). The Center's charter is to develop R&M concepts, theory, and...AFIT's role as the Air Force's CERM and since ASD is located at Wright-Patterson along with AFIT, it seemed the natural choice for keeping the scope of...

  7. Improving model prediction reliability through enhanced representation of wetland soil processes and constrained model auto calibration - A paired watershed study

    Science.gov (United States)

    Sharifi, Amirreza; Lang, Megan W.; McCarty, Gregory W.; Sadeghi, Ali M.; Lee, Sangchul; Yen, Haw; Rabenhorst, Martin C.; Jeong, Jaehak; Yeo, In-Young

    2016-10-01

    Process-based, distributed watershed models possess a large number of parameters that are not directly measured in the field and need to be calibrated, in most cases by matching modeled in-stream fluxes with monitored data. Recently, concern has been raised regarding the reliability of this common calibration practice, because models that are deemed adequately calibrated based on commonly used metrics (e.g., Nash-Sutcliffe efficiency) may not realistically represent intra-watershed responses or fluxes. Such shortcomings stem from the use of evaluation criteria that concern only the global in-stream responses of the model without investigating intra-watershed responses. In this study, we introduce a modification to the Soil and Water Assessment Tool (SWAT) model, and a new calibration technique, that collectively reduce the chance of misrepresenting intra-watershed responses. The SWAT model was modified to better represent NO3 cycling in soils with various degrees of water holding capacity. The new calibration tool has the capacity to calibrate paired watersheds simultaneously within a single framework. It was found that when both proposed methodologies were applied jointly to two paired watersheds on the Delmarva Peninsula, the performance of the models as judged by conventional metrics suffered; however, the intra-watershed responses (e.g., mass of NO3 lost to denitrification) in the two models automatically converged to realistic sums. This approach also demonstrates the capacity to spatially distinguish areas of high denitrification potential, an ability that has implications for improved management of prior-converted wetlands under crop production and for identifying prominent areas for wetland restoration.
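The Nash-Sutcliffe efficiency mentioned above as the conventional calibration metric compares the model's squared errors against the variance of the observations: NSE = 1 means a perfect fit, while NSE ≤ 0 means the model predicts no better than the observed mean. A minimal sketch (the flux values are invented, not data from the study):

```python
# Nash-Sutcliffe efficiency: NSE = 1 - sum((obs - sim)^2) / sum((obs - mean_obs)^2).
# NSE = 1 is a perfect fit; NSE <= 0 means the model is no better than simply
# predicting the observed mean. Values below are illustrative only.

def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

obs = [2.1, 3.4, 5.0, 4.2, 3.3]   # e.g., monitored in-stream flux
sim = [2.0, 3.6, 4.8, 4.5, 3.1]   # modeled flux
print(f"NSE = {nse(obs, sim):.3f}")
```

Because NSE is a single global score on the outlet flux, two parameter sets with very different internal nitrogen budgets can score identically, which is exactly the calibration-reliability concern the study raises.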

  8. Response process and test-retest reliability of the Context Assessment for Community Health tool in Vietnam.

    Science.gov (United States)

    Duc, Duong M; Bergström, Anna; Eriksson, Leif; Selling, Katarina; Thi Thu Ha, Bui; Wallin, Lars

    2016-01-01

    The recently developed Context Assessment for Community Health (COACH) tool aims to measure aspects of the local healthcare context perceived to influence knowledge translation in low- and middle-income countries. The tool measures eight dimensions (organizational resources, community engagement, monitoring services for action, sources of knowledge, commitment to work, work culture, leadership, and informal payment) through 49 items. The study aimed to explore the understanding and stability of the COACH tool among health providers in Vietnam. To investigate the response process, think-aloud interviews were undertaken with five community health workers, six nurses and midwives, and five physicians. Identified problems were classified according to Conrad and Blair's taxonomy and grouped according to an estimation of the magnitude of the problem's effect on the response data. Further, the stability of the tool was examined using a test-retest survey among 77 respondents. The reliability was analyzed for items (intraclass correlation coefficient (ICC) and percent agreement) and dimensions (ICC and Bland-Altman plots). In general, the think-aloud interviews revealed that the COACH tool was perceived as clear, well organized, and easy to answer. Most items were understood as intended. However, seven prominent problems in the items were identified and the content of three dimensions was perceived to be of a sensitive nature. In the test-retest survey, two-thirds of the items and seven of eight dimensions were found to have an ICC agreement ranging from moderate to substantial (0.5-0.7), demonstrating that the instrument has an acceptable level of stability. This study provides evidence that the Vietnamese translation of the COACH tool is generally perceived to be clear and easy to understand and has acceptable stability. There is, however, a need to rephrase and add generic examples to clarify some items and to further review items with low ICC.
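The test-retest ICC agreement reported above can be computed from an n-subjects × k-occasions ratings matrix. The study does not state which ICC form it used, so the sketch below uses one common choice, the two-way random-effects, absolute-agreement, single-measures coefficient (ICC(2,1) in Shrout & Fleiss notation); the ratings are invented, not the study's data:

```python
# ICC(2,1): two-way random-effects, absolute-agreement, single-measures
# intraclass correlation for an n-subjects x k-occasions rating matrix,
# a common coefficient for test-retest agreement. Ratings are illustrative.

def icc_2_1(x):
    n, k = len(x), len(x[0])
    grand = sum(sum(row) for row in x) / (n * k)
    row_means = [sum(row) / k for row in x]
    col_means = [sum(row[j] for row in x) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)       # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)       # between occasions
    ss_total = sum((v - grand) ** 2 for row in x for v in row)
    msr = ss_rows / (n - 1)                                      # subjects mean square
    msc = ss_cols / (k - 1)                                      # occasions mean square
    mse = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))   # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Five subjects scored on one item at test and retest (columns = occasions).
test_retest = [[3, 3], [4, 5], [2, 2], [5, 4], [3, 4]]
print(f"ICC(2,1) = {icc_2_1(test_retest):.2f}")  # 0.76
```

On this toy matrix the coefficient lands at 0.76, i.e., in the "moderate to substantial" band the study uses as its stability criterion.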

  9. Response process and test–retest reliability of the Context Assessment for Community Health tool in Vietnam

    Directory of Open Access Journals (Sweden)

    Duong M. Duc

    2016-06-01

    Full Text Available Background: The recently developed Context Assessment for Community Health (COACH) tool aims to measure aspects of the local healthcare context perceived to influence knowledge translation in low- and middle-income countries. The tool measures eight dimensions (organizational resources, community engagement, monitoring services for action, sources of knowledge, commitment to work, work culture, leadership, and informal payment) through 49 items. Objective: The study aimed to explore the understanding and stability of the COACH tool among health providers in Vietnam. Designs: To investigate the response process, think-aloud interviews were undertaken with five community health workers, six nurses and midwives, and five physicians. Identified problems were classified according to Conrad and Blair's taxonomy and grouped according to an estimation of the magnitude of the problem's effect on the response data. Further, the stability of the tool was examined using a test–retest survey among 77 respondents. The reliability was analyzed for items (intraclass correlation coefficient (ICC) and percent agreement) and dimensions (ICC and Bland–Altman plots). Results: In general, the think-aloud interviews revealed that the COACH tool was perceived as clear, well organized, and easy to answer. Most items were understood as intended. However, seven prominent problems in the items were identified and the content of three dimensions was perceived to be of a sensitive nature. In the test–retest survey, two-thirds of the items and seven of eight dimensions were found to have an ICC agreement ranging from moderate to substantial (0.5–0.7), demonstrating that the instrument has an acceptable level of stability. Conclusions: This study provides evidence that the Vietnamese translation of the COACH tool is generally perceived to be clear and easy to understand and has acceptable stability. There is, however, a need to rephrase and add generic examples to clarify

  10. Reliability Management for Information System

    Institute of Scientific and Technical Information of China (English)

    李睿; 俞涛; 刘明伦

    2005-01-01

    An integrated intelligent management approach is presented to help organizations manage the many heterogeneous resources in their information systems. A general architecture for information system reliability management is proposed, and the architecture is described from two aspects: a process model and a hierarchical model. Data mining techniques are used in data analysis. A data analysis system applicable to real-time data analysis is developed by applying improved data mining to the critical processes. The framework of integrated management for information system reliability based on real-time data mining is illustrated, and the development of integrated, intelligent management of information systems is discussed.

  11. Creating Highly Reliable Accountable Care Organizations.

    Science.gov (United States)

    Vogus, Timothy J; Singer, Sara J

    2016-12-01

    Accountable Care Organizations' (ACOs) pursuit of the triple aim of higher quality, lower cost, and improved population health has met with mixed results. To improve the design and implementation of ACOs we look to organizations that manage similarly complex, dynamic, and tightly coupled conditions while sustaining exceptional performance known as high-reliability organizations. We describe the key processes through which organizations achieve reliability, the leadership and organizational practices that enable it, and the role that professionals can play when charged with enacting it. Specifically, we present concrete practices and processes from health care organizations pursuing high-reliability and from early ACOs to illustrate how the triple aim may be met by cultivating mindful organizing, practicing reliability-enhancing leadership, and identifying and supporting reliability professionals. We conclude by proposing a set of research questions to advance the study of ACOs and high-reliability research.

  12. MATLAB® and Design Recipes for Earth Sciences: How to Collect, Process and Present Geoscientific Information

    Science.gov (United States)

    Trauth, M.; Sillmann, E.

    2012-04-01

    The overall aim of the class was to introduce undergraduate students to the typical course of a project. The project starts with searching the relevant literature, reviewing and ranking the published books and journal articles, and extracting the relevant information as text, data or graphs; continues with searching, processing and visualizing data; and ends with compiling and presenting the results as posters, abstracts and oral presentations. In the first lecture, an unexpectedly large number (ca. 65) of students subscribed to the course, urging us to teach it in a lecture hall with a projector, microphone and speaker system, a table for the teacher's laptop and equipment, the students' private laptops, and wireless Internet. We used a MOODLE eLearning environment to handle the large number of participants in a highly interactive, tutorial-style course environment. Moreover, the students were organized into five GOOGLE groups, not accessed by the course instructor but led by elected student group leaders and their deputies. During the course, the instructor defined three principal topics for each of the groups within the overall theme Past Climate Changes. After sub-themes within the groups had been defined for each student, the course culminated in the presentation of the project work as conference-style posters, 200-word abstracts, and one-hour sessions with 10-15 two-minute presentations, chaired by the project leaders and their deputies. The course inspired a new textbook that will appear later this year, using a similar concept to its sister book MATLAB Recipes for Earth Sciences, 3rd Edition (Trauth, Springer 2010).

  13. Hydrometallurgical-UV process to produce ferrous sulfate from the pyrite present in coal tailings

    Energy Technology Data Exchange (ETDEWEB)

    Viganico, E.M.; Silva, R.A. [South Rio Grande Federal Univ., Porto Alegre (Brazil).Graduate Program in Mining, Metallurgical and Materials Technology Center

    2010-07-01

    The oxidation of pyrite can promote acid mine drainage (AMD). This study developed a hydrometallurgical-UV route for the production of ferrous sulfate. The laboratory study was conducted using a pyrite concentrate obtained from a processed coal tailing. Leaching of the tailing was performed in packed bed columns in an oxidizing environment with an aqueous medium. Recirculation of the liquor produced an Fe{sup 3+} iron rich extract. Ultraviolet irradiation was then used to convert the Fe{sup 3+} to Fe{sup 2+}. Heat provided by the UV lamps caused the ferrous sulfate to crystallize. X-ray diffraction (XRD) studies of the crystals demonstrated that it is possible to produce commercial-grade ferrous sulfate heptahydrate crystals from the pyrite present in coal tailings. The crystals are used to treat anemia in humans and animals, and are also used as reagents for waste and waste water treatment. 7 refs., 2 tabs., 2 figs.

  14. Three-day dendritic cells for vaccine development: Antigen uptake, processing and presentation

    Directory of Open Access Journals (Sweden)

    Schendel Dolores J

    2010-09-01

    Full Text Available Abstract Background Antigen-loaded dendritic cells (DC) are capable of priming naïve T cells and therefore represent an attractive adjuvant for vaccine development in anti-tumor immunotherapy. Numerous protocols have been described to date using different maturation cocktails and time periods for the induction of mature DC (mDC) in vitro. For clinical application, the use of mDC that can be generated in only three days saves on the costs of cytokines needed for large-scale vaccine cell production and provides a method to produce cells within a standard work-week schedule in a GMP facility. Methods In this study, we addressed the properties of antigen uptake, processing and presentation by monocyte-derived DC prepared in three days (3d mDC) compared with conventional DC prepared in seven days (7d mDC), which represent the most common form of DC used for vaccines to date. Results Although they showed a reduced capacity for spontaneous antigen uptake, 3d mDC displayed higher capacity for stimulation of T cells after loading with an extended synthetic peptide that requires processing for MHC binding, indicating they were more efficient at antigen processing than 7d DC. We found, however, that 3d DC were less efficient at expressing protein after introduction of in vitro transcribed (ivtRNA) by electroporation, based on published procedures. This deficit was overcome by altering electroporation parameters, which led to improved protein expression and capacity for T cell stimulation using low amounts of ivtRNA. Conclusions This new procedure allows 3d mDC to replace 7d mDC for use in DC-based vaccines that utilize long peptides, proteins or ivtRNA as sources of specific antigen.

  15. Emotional noun processing: an ERP study with rapid serial visual presentation.

    Directory of Open Access Journals (Sweden)

    Shengnan Yi

    Full Text Available Reading is an important part of our daily life, and rapid responses to emotional words have received a great deal of research interest. Our study employed rapid serial visual presentation to detect the time course of emotional noun processing using event-related potentials. We performed a dual-task experiment, where subjects were required to judge whether a given number was odd or even, and the category into which each emotional noun fit. In terms of P1, we found that there was no negativity bias for emotional nouns. However, emotional nouns elicited larger amplitudes in the N170 component in the left hemisphere than did neutral nouns. This finding indicated that in later processing stages, emotional words can be discriminated from neutral words. Furthermore, positive, negative, and neutral words were different from each other in the late positive complex, indicating that in the third stage, even different emotions can be discerned. Thus, our results indicate that in a three-stage model the latter two stages are more stable and universal.

  16. MEMS reliability

    CERN Document Server

    Hartzell, Allyson L; Shea, Herbert R

    2010-01-01

    This book focuses on the reliability and manufacturability of MEMS at a fundamental level. It demonstrates how to design MEMS for reliability and provides detailed information on the different types of failure modes and how to avoid them.

  17. Uncooled ultrasensitive broad-band solution-processed photodetectors (Conference Presentation)

    Science.gov (United States)

    Gong, Xiong

    2016-09-01

    Sensing from the ultraviolet (UV)-visible to infrared (IR) is critical to environmental monitoring and remote sensing, fibre-optic communication, day- and night-time surveillance, and emerging medical imaging modalities. Today, separate sensors or materials are required for different sub-bands within the UV to IR wavelength range. In general, AlGaN, Si, InGaAs and PbS based photodetectors (PDs) are used for the four important sub-bands: 0.25 μm-0.4 μm (UV), 0.45 μm-0.8 μm (visible), 0.9 μm-1.7 μm (near IR), and 1.5 μm-2.6 μm (middle IR), respectively. To obtain the desired sensitivity, these detectors must be operated at low temperatures (for example, at 4.2 K). Thus, a "breakthrough" technology would be enabled by a new class of PDs -- PDs that do not require cooling to obtain high detectivity; PDs which are fabricated by solution-processing to enable low-cost, multi-color, high quantum efficiency, high sensitivity and high speed response over this broad spectral range. The availability of such PDs for use at room temperature (RT) would offer new and important applications. In this presentation, we would like to share how we approach RT-operated ultrasensitive broad-band solution-processed PDs. - By developing novel low-bandgap semiconducting polymers, we are able to develop RT-operated solution-processed polymer PDs with spectral response from 350 nm to 1450 nm, detectivity over 10^13 Jones and linear dynamic range over 100 dB; and with spectral response from 350 nm to 2500 nm, detectivity over 10^12 Jones; - By using low-bandgap semiconducting polymers mixed with high-electrical-conductivity PbS quantum dots (QDs), inverted polymer hybrid PDs with spectral response from 300 nm to 25000 nm, detectivity over 10^13 Jones and linear dynamic range over 100 dB are realized; - By using novel perovskite hybrid materials incorporated with carbon nanotubes and novel n-type newly developed semiconducting polymers, we are able to realize RT operated solution-processed

  18. Proportional spike-timing precision and firing reliability underlie efficient temporal processing of periodicity and envelope shape cues.

    Science.gov (United States)

    Zheng, Y; Escabí, M A

    2013-08-01

    Temporal sound cues are essential for sound recognition, pitch, rhythm, and timbre perception, yet how auditory neurons encode such cues is the subject of ongoing debate. Rate coding theories propose that temporal sound features are represented by rate-tuned modulation filters. However, overwhelming evidence also suggests that precise spike timing is an essential attribute of the neural code. Here we demonstrate that single neurons in the auditory midbrain employ a proportional code in which spike-timing precision and firing reliability covary with the sound envelope cues to provide an efficient representation of the stimulus. Spike-timing precision varied systematically with the timescale and shape of the sound envelope, yet was largely independent of the sound modulation frequency, a prominent cue for pitch. In contrast, spike-count reliability was strongly affected by the modulation frequency. Spike-timing precision extends from sub-millisecond for brief transient sounds up to tens of milliseconds for sounds with slowly varying envelopes. Information theoretic analysis further confirms that spike-timing precision depends strongly on the sound envelope shape, while firing reliability is strongly affected by the sound modulation frequency. Both the information efficiency and total information were limited by the firing reliability and spike-timing precision in a manner that reflected the sound structure. This result supports a temporal coding strategy in the auditory midbrain in which proportional changes in spike-timing precision and firing reliability can efficiently signal shape and periodicity temporal cues.

  19. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion of failure rate in software reliability growth models.

  20. Insights into the Role of GILT in HLA Class II Antigen Processing and Presentation by Melanoma

    Directory of Open Access Journals (Sweden)

    Duncan L. Norton

    2009-01-01

    Full Text Available Metastatic melanoma is one of the deadliest of skin cancers and is increasing in incidence. Since current treatment regimens are ineffective at controlling and/or curing the disease, novel approaches, such as immunotherapy, for treating this malignant disease are being explored. In this review, we discuss potential melanoma antigens (Ags and their role in utilizing the HLA class II pathway to elicit tumor Ag-specific CD4+ T cell responses in order to effectively induce long-lasting CD8+ antitumor memory. We also discuss the role of endolysosomal cathepsins and Gamma-Interferon-inducible Lysosomal Thiol reductase (GILT in Ag processing and presentation, and at enhancing CD4+ T cell recognition of melanoma cells. This review also summarizes our current knowledge on GILT and highlights a novel mechanism of GILT-mediated immune responses against melanoma cells. At the end, we propose a strategy employing GILT in the development of a potential whole cell vaccine for combating metastatic melanoma.

  1. How do general psychological processes inform FLL pedagogy? Presenting a new instructional framework

    Directory of Open Access Journals (Sweden)

    Michał B. Paradowski

    2008-02-01

    Full Text Available Learning invariably proceeds by relating new facts to what is already familiar and present in the conceptual structure. In the context of FL study the familiar is, of course, the student's mother tongue. Drawing on the learner's L1 (or another mastered tongue) and showing comparisons and contrasts between the languages mirrors, facilitates and accelerates the processes which occur independently in his/her mind. At the same time, when in a new situation, we look for familiar orientation points and similarities owing to our instinctive need for safety. This is also why the target language should literally be taught in the framework of the learner's L1. Instruction in the Language Interface Model (LIM; Gozdawa-Gołębiowski 2003a,b, 2004a,b, 2005) proceeds from an explication of how the relevant rules operate in the students' L1, through an explanation of the corresponding L2 rules and subsequent interface formation, modifying the L1 rule to accommodate L2 data. Practice first expects the learner to apply the FL rules to L1 examples before moving to more traditional exercises, and finally ends with competence expansion: integrating the two competences, leading to the development of multicompetence and allowing the rules governing the structure of the utterance to be obliterated from the learner's conscious mind.

  2. Reliability Analysis Based on a Jump Diffusion Model with Two Wiener Processes for Cloud Computing with Big Data

    Directory of Open Access Journals (Sweden)

    Yoshinobu Tamura

    2015-06-01

    Full Text Available At present, many cloud services are managed using open source software, such as OpenStack and Eucalyptus, because of the unified management of data, cost reduction, quick delivery and work savings. The operation phase of cloud computing has unique features, such as the provisioning processes, network-based operation and the diversity of data, because it changes depending on many external factors. We propose a jump diffusion model with two-dimensional Wiener processes in order to consider these interesting aspects of network traffic and big data on cloud computing. In particular, we assess the stability of cloud software by using the sample paths obtained from the jump diffusion model with two-dimensional Wiener processes. Moreover, we discuss the optimal maintenance problem based on the proposed jump diffusion model. Furthermore, we analyze actual data to show numerical examples of dependability optimization based on the software maintenance cost, considering big data on cloud computing.
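The abstract does not spell out the model's equations, but the general idea of a jump diffusion driven by two Wiener processes can be sketched as drift plus two independent Brownian terms plus compound-Poisson jumps. The sketch below is only an illustration of that class of model, not the authors' formulation; all parameter names and values are hypothetical.

```python
import numpy as np

def jump_diffusion_path(T=1.0, n=1000, mu=0.05, sigma1=0.2, sigma2=0.1,
                        jump_rate=3.0, jump_scale=0.5, seed=0):
    """One sample path of dX = mu dt + sigma1 dW1 + sigma2 dW2 + dJ,
    where W1, W2 are independent Wiener processes and J is a
    compound-Poisson jump term (illustrative parameters only)."""
    rng = np.random.default_rng(seed)
    dt = T / n
    dW1 = rng.normal(0.0, np.sqrt(dt), n)   # increments of first Wiener process
    dW2 = rng.normal(0.0, np.sqrt(dt), n)   # increments of second Wiener process
    jumps = rng.poisson(jump_rate * dt, n) * rng.normal(0.0, jump_scale, n)
    increments = mu * dt + sigma1 * dW1 + sigma2 * dW2 + jumps
    return np.concatenate(([0.0], np.cumsum(increments)))

path = jump_diffusion_path()
print(len(path))  # 1001: initial value plus n increments
```

Stability assessments of the kind the paper describes would then look at the spread of many such sample paths rather than a single one.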

  3. Use of BPPV processes in Emergency Department Dizziness Presentations: A Population-Based Study

    Science.gov (United States)

    Kerber, Kevin A.; Burke, James F.; Skolarus, Lesli E.; Meurer, William J.; Callaghan, Brian C.; Brown, Devin L.; Lisabeth, Lynda D.; McLaughlin, Thomas J.; Fendrick, A. Mark; Morgenstern, Lewis B.

    2013-01-01

    Objective: A common cause of dizziness, benign paroxysmal positional vertigo (BPPV), is effectively diagnosed and cured with the Dix-Hallpike test (DHT) and the canalith repositioning maneuver (CRM). We aimed to describe the use of these processes in Emergency Departments (ED), to assess for trends in use over time, and to determine provider-level variability in use. Design: Prospective population-based surveillance study. Setting: EDs in Nueces County, Texas, January 15, 2008 to January 14, 2011. Subjects and Methods: Adult patients discharged from EDs with dizziness, vertigo, or imbalance documented at triage. Clinical information was abstracted from source documents. A hierarchical logistic regression model adjusting for patient and provider characteristics was used to estimate trends in DHT use and provider-level variability. Results: 3,522 visits for dizziness were identified. A DHT was documented in 137 visits (3.9%). A CRM was documented in 8 visits (0.2%). Among patients diagnosed with BPPV, a DHT was documented in only 21.8% (34 of 156) and a CRM in 3.9% (6 of 156). In the hierarchical model (c statistic = 0.93), DHT was less likely to be used over time (odds ratio, 0.97, 95% CI [0.95, 0.99]) and the provider level explained 50% (ICC, 0.50) of the variance in the probability of DHT use. Conclusion: BPPV is seldom examined for, and when diagnosed, infrequently treated in this ED population. DHT use is decreasing over time, and varies substantially by provider. Implementation research focused on BPPV care may be an opportunity to optimize management in ED dizziness presentations. PMID:23264119
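The reported ICC of 0.50 at the provider level is consistent with the standard latent-scale formula for a random-intercept logistic model, where the residual variance is fixed at π²/3. The study's exact model specification is not given in the abstract; the following is only the textbook formula, with the variance value chosen to reproduce an ICC of 0.5.

```python
from math import pi

def logistic_icc(var_provider):
    """Latent-scale intraclass correlation for a random-intercept
    logistic model: between-provider variance divided by that
    variance plus the logistic residual variance pi^2/3."""
    residual = pi ** 2 / 3
    return var_provider / (var_provider + residual)

# A provider-level variance equal to pi^2/3 yields an ICC of 0.5,
# matching the value reported in the abstract.
print(round(logistic_icc(pi ** 2 / 3), 2))  # 0.5
```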

  4. Gearbox Reliability Collaborative (GRC) Description and Loading

    Energy Technology Data Exchange (ETDEWEB)

    Oyague, F.

    2011-11-01

    This document describes simulated turbine load cases in accordance with the IEC 61400-1 Ed. 3 standard, which is representative of the typical wind turbine design process. The information presented herein is intended to provide a broad understanding of the Gearbox Reliability Collaborative 750kW drivetrain and turbine configuration. In addition, fatigue and ultimate-strength drivetrain loads resulting from simulations are presented. This information provides the basis for the analytical work of the Gearbox Reliability Collaborative effort.

  5. Simulation of adsorption process of benzene present in effluent of the petrochemical industry; Simulacao do processo de adsorcao do benzeno presente em efluentes da industria petroquimica

    Energy Technology Data Exchange (ETDEWEB)

    Luz, Adriana D. da; Mello, Josiane M.M. de; Souza, Antonio Augusto Ulson de; Souza, Selene M.A. Guelli Ulson de [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil)]; Silva, Adriano da [Universidade Comunitaria Regional de Chapeco (UNOCHAPECO), SC (Brazil)]

    2008-07-01

    Adsorption processes have proven quite efficient in the removal of pollutants from liquid effluents, especially hydrocarbons that are difficult to remove, such as benzene. This work presents a phenomenological model that describes the process of benzene removal through adsorption in a fixed-bed column, using activated carbon as the adsorbent. The model considers the internal and external mass-transfer resistances of the adsorbent particle. The Finite Volume method is used in the discretization of the equations. The numerical results obtained through simulation showed good correlation when compared with experimental data from the literature, demonstrating that the developed computational code, together with the mathematical modeling, represents an important tool for the design of adsorption columns. (author)

  6. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective in enhancing the reliability of various circuit units. The authors provide readers with techniques for state-of-the-art and future technologies, ranging from technology modeling, fault detection and analysis, and circuit hardening to reliability management. Provides a comprehensive review of various reliability mechanisms at sub-45nm nodes; describes practical modeling and characterization techniques for reliability; includes a thorough presentation of robust design techniques for major VLSI design units; promotes physical understanding with first-principle simulations.

  7. 23 CFR 636.111 - Can oral presentations be used during the procurement process?

    Science.gov (United States)

    2010-04-01

    ... ENGINEERING AND TRAFFIC OPERATIONS, DESIGN-BUILD CONTRACTING, General, § 636.111: Can oral presentations be used... offeror briefing slides or presentation notes). A copy of the record should be placed in the contract file...

  8. Caesium-137 as Indicator of Present Mass-Movement and Erosion Processes

    Science.gov (United States)

    Supper, R.; Baron, I.; Winkler, E.; Motschka, K.; Jaritz, W.; Moser, G.; Carman, M.

    2012-04-01

    with bare surfaces of active landslides, earthflows, erosion gullies, spring areas and zones of estimated higher superficial water flow. On the other hand, the flat and stable areas have a relatively high content of this isotope. In conclusion, the 137Cs distribution could be used as a parameter for mapping present-day active mass-movement, wash-out, and other superficial erosion processes that have occurred in Europe after the 1986 Chernobyl event. This study was done within the framework of the SafeLand project funded by the European Commission's FP7 Programme.

  9. Improving machinery reliability

    CERN Document Server

    Bloch, Heinz P

    1998-01-01

    This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.

  10. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz.,electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  11. Development of improved processing and evaluation methods for high reliability structural ceramics for advanced heat engine applications Phase II. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Pujari, V.J.; Tracey, D.M.; Foley, M.R. [and others]

    1996-02-01

    The research program had as goals the development and demonstration of significant improvements in processing methods, process controls, and nondestructive evaluation (NDE) which can be commercially implemented to produce high reliability silicon nitride components for advanced heat engine applications at temperatures to 1370{degrees}C. In Phase I of the program a process was developed that resulted in a silicon nitride - 4 w% yttria HIP'ed material (NCX 5102) that displayed unprecedented strength and reliability. An average tensile strength of 1 GPa and a strength distribution following a 3-parameter Weibull distribution were demonstrated by testing several hundred buttonhead tensile specimens. The Phase II program focused on the development of methodology for colloidal consolidation producing green microstructure which minimizes downstream process problems such as drying, shrinkage, cracking, and part distortion during densification. Furthermore, the program focused on the extension of the process to gas pressure sinterable (GPS) compositions. Excellent results were obtained for the HIP composition processed for minimal density gradients, both with respect to room-temperature strength and high-temperature creep resistance. Complex component fabricability of this material was demonstrated by producing engine-vane prototypes. Strength data for the GPS material (NCX-5400) suggest that it ranks very high relative to other silicon nitride materials in terms of tensile/flexure strength ratio, a measure of volume quality. This high quality was derived from the closed-loop colloidal process employed in the program.

  12. Graphic presentation of information of acoustic monitoring of stream grinding process

    Directory of Open Access Journals (Sweden)

    N.S. Pryadko

    2012-04-01

    Full Text Available The theoretical and experimental mechanisms of fine grinding of loose materials are analyzed. A relationship is established between the density function of acoustic-signal amplitudes during grinding and the degree to which the jets are loaded with material.

  13. Fluvial processes and vegetation - Glimpses of the past, the present, and perhaps the future.

    Science.gov (United States)

    Hupp, Cliff R.; Osterkamp, Waite R.

    2010-01-01

    Most research before 1960 into interactions among fluvial processes, resulting landforms, and vegetation was descriptive. Since then, however, research has become more detailed and quantitative, permitting numerical modeling and applications including agricultural-erosion abatement and rehabilitation of altered

  14. Present and perspective of enhanced biological phosphorus removal process; Seibutsugakuteki rin jokyoho no genjo to kadai

    Energy Technology Data Exchange (ETDEWEB)

    Nakamura, K. [National Inst. of Bioscience and Human-Technology, Tsukuba (Japan)]

    1997-02-05

    Biological phosphorus removal processes utilizing anaerobic and aerobic conditions, the mechanism of phosphorus removal, and the microbes involved in phosphorus removal are outlined, and future problems are discussed. By mixing waste water and sludge under anaerobic conditions followed by treatment under aerobic conditions, the phosphorus content of the sludge increases, enabling biological phosphorus removal. The anaerobic/aerobic process favors microbes with high polyphosphate-accumulating performance over other microbes, so that they increase preferentially, giving the process a higher phosphorus-removal capacity than general treatment processes. Microbes involved in phosphorus removal, immobilization of polyphosphate-accumulating microbes, and the phosphorus uptake and release characteristics of immobilized mycelium are discussed. Application of gel-entrapped immobilized mycelium to phosphorus removal and problems in biological phosphorus removal methods are described. 18 refs., 12 figs.

  15. Measurement System Reliability Assessment

    Directory of Open Access Journals (Sweden)

    Kłos Ryszard

    2015-06-01

    Full Text Available Decision-making in problem situations is based on up-to-date and reliable information. A great deal of information is subject to rapid changes, hence it may be outdated or manipulated, leading to erroneous decisions. It is crucial to be able to assess the information obtained. In order to ensure its reliability, it is best to obtain it through one's own measurement process. In such a case, assessing the reliability of the measurement system is crucial. The article describes a general approach to assessing the reliability of measurement systems.

  16. Reliability of fluid systems

    Directory of Open Access Journals (Sweden)

    Kopáček Jaroslav

    2016-01-01

    Full Text Available This paper focuses on the importance of reliability assessment, especially in complex fluid systems for demanding production technology. The initial criterion for assessing reliability is the failure of an object (element), which is treated as a random variable whose data (values) can be processed using the mathematical methods of probability theory and statistics. The basic indicators of reliability are defined, together with their application to calculations for serial, parallel and backed-up systems. For illustration, calculation examples of reliability indicators are given for various elements of the system and for a selected pneumatic circuit.
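The serial and parallel system calculations mentioned above follow from elementary reliability theory: a series system works only if every element works, while a parallel (backed-up) system fails only if every element fails. A minimal sketch, assuming independent elements with known reliabilities:

```python
from math import prod

def series_reliability(rs):
    """Series system: all elements must work, R = prod(R_i)."""
    return prod(rs)

def parallel_reliability(rs):
    """Parallel system: fails only if every element fails,
    R = 1 - prod(1 - R_i)."""
    return 1 - prod(1 - r for r in rs)

# three independent elements, e.g. valves in a pneumatic circuit
rs = [0.95, 0.90, 0.99]
print(series_reliability(rs))    # ~0.84645
print(parallel_reliability(rs))  # ~0.99995
```

Note how redundancy raises system reliability above the best single element, while a series arrangement drags it below the worst one.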

  17. The role of 'know-how' in maintenance activities and reliability in a high-risk process control plant.

    Science.gov (United States)

    Garrigou, A; Carballeda, G; Daniellou, F

    1998-04-01

    Using observation of a maintenance operator's activity and 'the history' that it produced as a basis, we will discuss the role of 'know-how' in maintenance activities, and particularly the problems raised by putting this knowledge into words. Second, we will underline the contribution of this know-how to the operational reliability of the facilities and we will investigate its being taken into account in work instructions. With such issues as a basis, we will conclude with the need for ergonomists to develop modes of interviewing to help people to put know-how into words, so that it is recognized and considered in maintenance activity organization.

  18. Effects of Animation's Speed of Presentation on Perceptual Processing and Learning

    Science.gov (United States)

    Meyer, Katja; Rasch, Thorsten; Schnotz, Wolfgang

    2010-01-01

    Animations presented at different speed are assumed to differentially interact with learners' perception and cognition due to the constraints imposed by learners' limited sensitivity to incoming dynamic information. To investigate the effects of high and low presentation speed of animation, two studies were conducted. In Study 1, participants were…

  20. Processing and Memory of Information Presented in Narrative or Expository Texts

    Science.gov (United States)

    Wolfe, Michael B. W.; Woodwyk, Joshua M.

    2010-01-01

    Background: Previous research suggests that narrative and expository texts differ in the extent to which they prompt students to integrate to-be-learned content with relevant prior knowledge during comprehension. Aims: We expand on previous research by examining on-line processing and representation in memory of to-be-learned content that is…

  1. How Do Turkish Middle School Science Coursebooks Present the Science Process Skills?

    Science.gov (United States)

    Aslan, Oktay

    2015-01-01

    An important objective in science education is the acquisition of science process skills (SPS) by the students. Therefore, science coursebooks, among the main resources of elementary science curricula, are to convey accurate SPS. This study is a qualitative study based on the content analysis of the science coursebooks used at middle schools. In…

  2. Camera image processing for automated crack detection of pressed panel products (Conference Presentation)

    Science.gov (United States)

    Moon, Hoyeon; Jung, Hwee Kwon; Lee, Changwon; Park, Gyuhae

    2017-04-01

    Crack detection on pressed panels during the press forming process is an important step in ensuring the quality of panel products. Traditionally, crack detection has been performed by experienced human inspectors, which is subjective and expensive. Therefore, the implementation of automated and accurate crack detection during the press forming process is necessary. In this study, we performed optimal camera positioning and automated crack detection using two image processing techniques with a multi-view camera system. The first technique is based on evaluation of the panel edge lines extracted from a percolated object image; this technique does not require a reference image for crack detection. The other technique is based on comparison between a reference and a test image using local image amplitude mapping. Before crack detection, multi-view images of a panel product are captured using multiple cameras and 3D shape information is reconstructed. Optimal camera positions are then determined based on the shape information. Afterwards, cracks are automatically detected using the two image-processing-based crack detection techniques. In order to demonstrate the capability of the proposed techniques, experiments were performed in the laboratory and on actual manufacturing lines with real panel products. Experimental results show that the proposed techniques can effectively improve the crack detection rate and speed.
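The reference-based technique, comparing a test image against a reference via local amplitude mapping, can be caricatured as a normalized absolute-difference threshold. The sketch below is only a toy stand-in for the paper's method; the threshold value and the synthetic panel images are invented.

```python
import numpy as np

def detect_crack(reference, test, threshold=0.3):
    """Flag pixels whose normalized absolute amplitude difference from
    the reference image exceeds a threshold; return the boolean crack
    mask and whether any crack-like pixel was found."""
    diff = np.abs(test.astype(float) - reference.astype(float))
    peak = diff.max()
    if peak > 0:
        diff /= peak              # normalize amplitude map to [0, 1]
    mask = diff > threshold
    return mask, bool(mask.any())

# synthetic example: a flat panel image and the same image with a dark
# line simulating a crack
ref = np.full((64, 64), 128, dtype=np.uint8)
tst = ref.copy()
tst[30, 10:50] = 20               # 40-pixel "crack"
mask, found = detect_crack(ref, tst)
print(found, int(mask.sum()))     # True 40
```

A production system would of course add the multi-view registration and edge-line analysis the abstract describes; this only illustrates the comparison step.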

  3. Structural Optimization with Reliability Constraints

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1986-01-01

    During the last 25 years considerable progress has been made in the fields of structural optimization and structural reliability theory. In classical deterministic structural optimization all variables are assumed to be deterministic. Due to the unpredictability of loads and strengths of actual structures it is now widely accepted that structural problems are non-deterministic. Therefore, some of the variables have to be modelled as random variables/processes and a reliability-based design philosophy should be used; see Cornell [1], Moses [2], Ditlevsen [3] and Thoft-Christensen & Baker [4]. In this paper we consider only structures which can be modelled as systems of elasto-plastic elements, e.g. frame and truss structures. In section 2 a method to evaluate the reliability of such structural systems is presented. Based on a probabilistic point of view a modern structural optimization problem
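For a single structural element, the simplest reliability evaluation of the kind referenced above compares a random resistance R with a random load effect S: with independent normal variables, the reliability index is β = (μ_R − μ_S)/√(σ_R² + σ_S²) and the failure probability is P_f = Φ(−β). A minimal numerical sketch (all values hypothetical):

```python
from math import sqrt
from statistics import NormalDist

# R-S model for one element: failure when load effect S exceeds
# resistance R, both modelled as independent normal random variables.
mu_R, sigma_R = 300.0, 30.0   # resistance, e.g. in kN
mu_S, sigma_S = 200.0, 20.0   # load effect

beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)  # reliability index
pf = NormalDist().cdf(-beta)                          # failure probability

print(round(beta, 3))  # 2.774
```

System-level evaluation for elasto-plastic frames, as in the paper, combines many such element margins through failure-mode analysis rather than a single R-S pair.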

  4. Sedimentary Mounds on Mars: Tracing Present-day Formation Processes into the Past

    Science.gov (United States)

    Niles, P. B.; Michalski, J.; Edwards, C. S.

    2014-01-01

    High-resolution photography and spectroscopy of the martian surface from orbit (MOC, HiRISE) have revolutionized our view of Mars, revealing spectacular views of finely layered sedimentary materials throughout the globe [1]. Some of these sedimentary deposits are 'mound' shaped and lie inside craters (Fig 1). Crater mound deposits are found throughout the equatorial region, as are ice-rich deposits found in craters of the north and south polar regions [2-4]. Despite their wide geographical extent and varying volatile content, the 'mound' deposits share a large number of geomorphic and structural similarities that suggest they formed via equivalent processes. Thus, modern depositional processes of ice and dust can serve as an invaluable analog for interpreting the genesis of ancient sedimentary mound deposits.

  5. Invited presentation:'Synchrotron Methods to Reveal Chemical Processes at Interfaces During Material Formation and Transformation'

    OpenAIRE

    Breynaert, Eric

    2016-01-01

    Material chemistry depends on interdisciplinary research targeting design, (trans-) formation, structure analysis and application of materials and minerals. Insight in the chemical processes occurring at interfaces not only enables design and synthesis of new technologically attractive structures, but also provides key information for a score of other research fields with direct impact on our daily lives. X-ray based techniques play a key role as they provide essential information...

  6. Traumatic odontoid process synchondrosis fracture with atlantoaxial instability in a calf: clinical presentation and imaging findings

    OpenAIRE

    Hülsmeyer, Velia-Isabel; Flatz, Katharina; Putschbach, Katrin; Bechter, Martina Ramona; Weiler, Sebastian; Fischer, Andrea; Feist, Melanie

    2015-01-01

    A 6-week-old female Simmental calf was evaluated for acute non-ambulatory tetraparesis. Physical and laboratory examinations revealed no clinically relevant abnormalities. Neurological findings were consistent with acute, progressive and painful cervical myelopathy. Radiographs displayed a fractured odontoid process (dens axis) and vertebral step misalignment at the fracture site. A traumatic origin was suspected. Advanced diagnostic imaging was considered to allow better planning of potentia...

  7. The Joint Polar Satellite System (JPSS) Program's Algorithm Change Process (ACP): Past, Present and Future

    Science.gov (United States)

    Griffin, Ashley

    2017-01-01

    The Joint Polar Satellite System (JPSS) Program Office is the supporting organization for the Suomi National Polar-orbiting Partnership (S-NPP) and JPSS-1 satellites. S-NPP carries the following sensors: VIIRS, CrIS, ATMS, OMPS, and CERES, instruments that ultimately produce over 25 data products covering the Earth's weather, oceans, and atmosphere. A team of scientists and engineers from all over the United States documents, monitors and fixes errors in operational software code or documentation through the algorithm change process (ACP) to ensure the success of the S-NPP and JPSS-1 missions by maintaining the quality and accuracy of the data products the scientific community relies on. This poster will outline the program's algorithm change process (ACP), identify the various users and scientific applications of our operational data products, and highlight changes that have been made to the ACP to accommodate operating system upgrades to the JPSS program's Interface Data Processing Segment (IDPS), so that the program is ready for the transition to the 2017 JPSS-1 satellite mission and beyond.

  8. LED system reliability

    NARCIS (Netherlands)

    Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.

    2011-01-01

    This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. A SSL system is composed of a LED engine with micro-electronic driver(s) that supplies power to the optic design. Knowledge of system level reliability is not only a challenging scientific ex…

  9. Effect of set size, age, and mode of stimulus presentation on information-processing speed.

    Science.gov (United States)

    Norton, J. C.

    1972-01-01

    First, second, and third grade pupils served as subjects in an experiment designed to show the effect of age, mode of stimulus presentation, and information value on recognition time. Stimuli were presented in picture and printed word form and in groups of 2, 4, and 8. The results of the study indicate that first graders are slower than second and third graders who are nearly equal. There is a gross shift in reaction time as a function of mode of stimulus presentation with increase in age. The first graders take much longer to identify words than pictures, while the reverse is true of the older groups. With regard to set size, a slope appears in the pictures condition in the older groups, while for first graders, a large slope occurs in the words condition and only a much smaller one for pictures.

  10. The Role of Imaginal Processing in the Retention of Visually-Presented Sequential Motoric Stimuli.

    Science.gov (United States)

    Housner, Lynn Dale

    1984-01-01

    This study investigated the role of imagery in the short-term retention of complex, visually presented movement sequences. Findings suggest that visual imagery may play a functional role in the free recall of modeled movements; however, there was no indication that imagery was involved in the retention of serial information. (JMK)

  11. Two simultaneous autoimmune processes in a patient presenting with respiratory insufficiency.

    Science.gov (United States)

    Troy, Lauren; Hamor, Paul; Bleasel, Jane; Corte, Tamera

    2014-03-01

    The idiopathic inflammatory myopathies, including dermatomyositis, are uncommon acquired autoimmune diseases, sometimes associated with interstitial lung disease. Myasthenia gravis, a separate autoimmune disorder involving the neuromuscular junction, has some overlapping clinical features but has only rarely been reported to occur simultaneously within the same patient. Here we present the first reported case of concomitant dermatomyositis, myasthenia gravis, and interstitial lung disease.

  12. Supply chain reliability modelling

    Directory of Open Access Journals (Sweden)

    Eugen Zaitsev

    2012-03-01

    Background: Today it is virtually impossible to operate alone at the international level in the logistics business. This promotes the establishment and development of new integrated business entities - logistic operators. However, such cooperation within a supply chain also creates many problems related to supply chain reliability as well as the optimization of supply planning. The aim of this paper was to develop and formulate a mathematical model and algorithms for finding the optimum supply plan, using an economic criterion and a model for evaluating the probability of failure-free operation of the supply chain. Methods: The mathematical model and algorithms for finding the optimum supply plan were developed and formulated using an economic criterion and a model for evaluating the probability of failure-free operation of the supply chain. Results and conclusions: The problem of ensuring failure-free performance of a goods supply channel analyzed in the paper is characteristic of distributed network systems that make active use of business process outsourcing technologies. The complex planning problem occurring in such systems, which requires taking into account the consumer's requirements for failure-free performance in terms of supply volumes and correctness, can be reduced to a relatively simple linear programming problem through logical analysis of the structures. The sequence of operations that should be taken into account during supply planning with the supplier's functional reliability is presented.

  13. Comparing Proteolytic Fingerprints of Antigen-Presenting Cells during Allergen Processing

    Directory of Open Access Journals (Sweden)

    Heidi Hofer

    2017-06-01

    Endolysosomal processing has a critical influence on immunogenicity as well as immune polarization of protein antigens. In industrialized countries, allergies affect around 25% of the population. For the rational design of protein-based allergy therapeutics for immunotherapy, a good knowledge of T cell-reactive regions on allergens is required. Thus, we sought to analyze endolysosomal degradation patterns of inhalant allergens. Four major allergens from ragweed, birch, as well as house dust mites were produced as recombinant proteins. Endolysosomal proteases were purified by differential centrifugation from dendritic cells, macrophages, and B cells, and combined with allergens for proteolytic processing. Thereafter, endolysosomal proteolysis was monitored by protein gel electrophoresis and mass spectrometry. We found that the overall proteolytic activity of specific endolysosomal fractions differed substantially, whereas the degradation patterns of the four model allergens obtained with the different proteases were extremely similar. Moreover, previously identified T cell epitopes were assigned to endolysosomal peptides and indeed showed a good overlap with known T cell epitopes for all four candidate allergens. Thus, we propose that the degradome assay can be used as a predictor to determine antigenic peptides as potential T cell epitopes, which will help in the rational design of protein-based allergy vaccine candidates.

  14. Private Property Rights and Compulsory Acquisition Process in Nigeria: the Past, Present and Future

    Directory of Open Access Journals (Sweden)

    Akintunde OTUBU

    2012-11-01

    Objectives: A property right is the exclusive authority to determine how a resource is used, whether that resource is owned by government or by individuals. In the context of land, it is the authority of the land owner to determine its use or otherwise. On the other hand, compulsory acquisition is the process by which government obtains land from private owners for development purposes in the best interest of the community. These diametrically opposed concepts of property rights and compulsory acquisition are reconciled through the payment of compensation for the extinguishment of private property rights. Implications: In Nigeria, these two concepts have a history of mutual conflicts, mostly reaching congruous resolutions, until the introduction of the Land Use Act 1978. With the coming of the Act, the pendulum has tilted in favor of compulsory acquisition to the detriment of private property rights, as compensation fails to assuage the loss occasioned by expropriation. Value: The paper explored the dichotomy between private property rights and compulsory acquisition in Nigeria over the last 50 years and submitted that the process under the Land Use Act changed the equilibrium that existed between these two concepts, producing a skewed and unfavorable result to the detriment of private property rights and the national economy. It finally proposed a new, equitable arrangement to resolve the quagmire.

  15. Commercialization of Kennedy Space Center Instrumentation Developed to Improve Safety, Reliability, Cost Effectiveness of Space Shuttle Processing, Launch, and Landing

    Science.gov (United States)

    Helms, William R.; Starr, Stanley O.

    1997-01-01

    Priorities and achievements of the Kennedy Space Center (KSC) Instrumentation Laboratories in improving operational safety and decreasing processing costs associated with the Shuttle vehicle are addressed. Technologies that have been or are in the process of technology transfer are reviewed, and routes by which commercial concerns can obtain licenses to other KSC Instrumentation Laboratory technologies are discussed.

  16. Research in Social Work: the future in the present. Reflections on the portuguese knowledge building process

    Directory of Open Access Journals (Sweden)

    Raquel Marta

    2016-06-01

    The debate surrounding the construction of scientific knowledge within social work is discussed. The social work profession seeks new foundations that allow, within the context of structural change, the strengthening of professional identity and a challenge to the vestiges of intellectual segregation that historical constraints have left. This paper seeks to outline a research strategy for the reconciliation and coordination of intellectual and professional work in order to give visibility to new and different domains of interpretation and action, while arguing that considering plural perspectives potentiates the knowledge transformation process. Underlining this confluence of complex thinking elements, the article incorporates the space-time dimension and discusses and recognizes an unavoidable circularity as a way to interrogate knowledge that is compartmentalized and fragmented, placing an emphasis both on knowledge and on the interrelationship between knowing, doing, being and relating. In addition, it examines the recognition of the nature of the relationships among various disciplines and perspectives.

  17. Risks Associated with Present Geomorphologic Processes in the Stemnic (Buda) River Basin

    Directory of Open Access Journals (Sweden)

    Bojoagă Ioan

    2015-10-01

    The paper analyses the main geomorphologic processes in the Stemnic (Buda) river basin, conditioned by the joint action of several factors, among which are the lithological peculiarities and the nature of superficial deposits, morphometric characteristics, climate, vegetation type and structure, properties of the soil cover etc. The Stemnic river basin, with an area of 15662.52 ha, is characterized by its elongated shape (maximum length of 30.5 km, maximum width of 8.5 km), its relative lithological homogeneity, but also by a variety of superficial deposits (eluvium, diluvium, colluvium and proluvium, alluvium) and by a relief energy of significant values, between 136 m and 10 m (mean value of 73 m). Under these conditions, the study area is characterized by a high degree of susceptibility to the occurrence of geomorphologic risk processes. For the morphometric and morphological analysis, we applied the method of the digital terrain model (DTM), with vectorisation of the contour lines on topographic maps at a scale of 1:5,000. In this paper we used indicators that highlight the particular frequency of landslides, especially in the upper and middle sectors, but the rather reduced frequency of deep erosion. Due to the satisfactory coverage of the ground with vegetation, the erosion response is differentiated, as it depends on land use and the concentration of liquid flow on the slopes. Consequently, landslides of different ages, types and forms cover large surfaces in the basin (approx. 8%), while surface erosion affects most areas of the slopes, but with different intensities depending on their use and on agricultural technologies.

  18. Processing of urushiol (poison ivy) hapten by both endogenous and exogenous pathways for presentation to T cells in vitro.

    Science.gov (United States)

    Kalish, R S; Wood, J A; LaPorte, A

    1994-05-01

    The antigen processing requirements for urushiol, the immunogen of poison ivy (Toxicodendron radicans), were tested by presentation of urushiol to cultured human urushiol-responsive T cells. Urushiol was added to antigen-presenting cells (APC) either before or after fixation with paraformaldehyde. Three distinct routes of antigen processing were detected. CD8+ and CD4+ T cells, which were dependent upon processing, proliferated if urushiol was added to APC before fixation, but did not proliferate when urushiol was added to APC after fixation. Processing of urushiol for presentation to CD8+ T cells was inhibited by azide, monensin, and brefeldin A. This suggests that urushiol was processed by the endogenous pathway. In contrast, presentation of urushiol to CD4+ T cells was inhibited by monensin but not by brefeldin A. This was compatible with antigen processing by the endosomal (exogenous) pathway. Finally, certain CD8+ T cells recognized urushiol in the absence of processing. These cells proliferated in response to APC incubated with urushiol after fixation. Classification of contact allergens by antigen processing pathway may predict the relative roles of CD4+ and CD8+ cells in the immunopathogenesis of allergic contact dermatitis.

  19. Introduction to quality and reliability engineering

    CERN Document Server

    Jiang, Renyan

    2015-01-01

    This book presents the state-of-the-art in quality and reliability engineering from a product life cycle standpoint. Topics in reliability include reliability models, life data analysis and modeling, design for reliability and accelerated life testing, while topics in quality include design for quality, acceptance sampling and supplier selection, statistical process control, production tests such as screening and burn-in, warranty and maintenance. The book provides comprehensive insights into two closely related subjects, and includes a wealth of examples and problems to enhance reader comprehension and link theory and practice. All numerical examples can be easily solved using Microsoft Excel. The book is intended for senior undergraduate and post-graduate students in related engineering and management programs such as mechanical engineering, manufacturing engineering, industrial engineering and engineering management programs, as well as for researchers and engineers in the quality and reliability fields. D...

  20. Impact of LED Epoxy Casting Process on Its Reliability

    Institute of Scientific and Technical Information of China (English)

    郑智斌

    2012-01-01

    Based on experience gained from analyzing light-emitting diode (LED) product failures in practical work, the impact of the epoxy resin casting process on LED reliability is discussed. Combining theory with experimental verification, the specific factors through which the epoxy casting process affects LED reliability are summarized, covering the material properties of the epoxy resin, the curing process conditions, and improvements to resin material performance. Performance parameters of the epoxy resin such as the glass transition temperature, thermal expansion coefficient and elastic modulus affect the LED's resistance to soldering heat and light decay; milder curing process conditions reduce internal stress in the LED and prevent hidden chip cracking; adding coupling agents to the epoxy resin improves the air-tightness of LED products, prevents moisture penetration, and improves LED reliability. Finally, it is recommended that appropriate epoxy resins and curing process conditions be chosen according to the reliability requirements of each LED product.

  1. Environmental process for elimination of phenolic water present in refinery gasoline tanks

    Energy Technology Data Exchange (ETDEWEB)

    Correa Junior, Bentaci; Pedroso, Osmar V.; Furlan, Luis T. [PETROBRAS, SP (Brazil). Refinaria de Paulinia

    2004-07-01

    Gasoline production in petroleum refineries usually leaves high phenol contents in water sent to treatment systems. Phenols are powerful bactericides and, therefore, harmful to the microorganisms present in wastewater treatment plants and in rivers. For this reason, controlled drainage of phenolic water is usually performed, enabling gasoline quality improvement without jeopardizing the biological treatment. An increase of phenolic contents in the effluent, due to operational upsets during the drainage of gasoline tanks, may cause inhibition or even mortality of the microorganisms in the wastewater treatment plants. Aiming to change the traditional logic of treating environmental demands at the 'end of pipe', sending the phenolic water to the sour water treatment systems was proposed and implemented; this water is in turn reused in crude desalination in the Distillation Units, where the phenols are reincorporated into the crude oil, preventing negative consequences for the wastewater treatment plant. The implemented process has demonstrated that the premises were correct, enabling process flows considerably higher than drainage flows, which has meant productivity gains and environmental improvement. (author)

  2. Biohydrometallurgical process to produce the coagulant ferric sulfate from the pyrite present in coal tailings

    Energy Technology Data Exchange (ETDEWEB)

    Colling, A.V.; Santos Menezes dos, J.C.S.; Silveira, P.S.; Schneider, I.A.H. [South Rio Grande Federal Univ., Porto Alegre (Brazil). Graduate Program in Mining, Metallurgical and Materials Technology Center

    2010-07-01

    This paper presented details of a biohydrometallurgical study conducted to characterize the production of a ferric sulfate coagulant from pyrite (FeS{sub 2}) contained in coal tailings. Leaching experiments were conducted with coal tailings samples from the Santa Catarina mining site in Brazil. The experiments were conducted for sterile and non-sterile samples, as well as samples inoculated with acidophilic bacteria and acidophilic bacteria with the addition of nutrients. Samples were collected weekly in order to analyze total iron, sulfate, and the amounts of Acidithiobacillus ferrooxidans bacteria. An analysis of the samples showed that the pyrite oxidation, iron sulfate production, and quantities of bacteria were higher in the column inoculated with the bacteria and nutrient additions. The samples produced an aqueous solution that was rich in ferric sulfate. Water treatment tests demonstrated that the resulting coagulant is as efficient as conventionally-produced coagulants. 8 refs., 2 tab., 2 figs.

  3. OSS reliability measurement and assessment

    CERN Document Server

    Yamada, Shigeru

    2016-01-01

    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.
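    To make the NHPP modelling named above concrete, here is a sketch of the Goel-Okumoto mean value function, one standard NHPP software reliability model; the parameter values are purely illustrative and not taken from the book:

    ```python
    import math

    def go_mean_value(t, a, b):
        """Goel-Okumoto NHPP mean value function m(t) = a * (1 - e^{-b t}):
        the expected cumulative number of faults detected by time t."""
        return a * (1.0 - math.exp(-b * t))

    def go_intensity(t, a, b):
        """Failure intensity lambda(t) = a * b * e^{-b t},
        the derivative of the mean value function."""
        return a * b * math.exp(-b * t)

    # Illustrative parameters: a = total expected faults, b = per-fault detection rate.
    a, b = 100.0, 0.05
    expected_by_t30 = go_mean_value(30, a, b)  # expected faults found within 30 time units
    ```

    The model captures the typical reliability-growth shape: m(t) starts at zero, grows at a decreasing rate, and saturates at a as testing continues.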

  4. Electrically controlled nonlinear optical generation and signal processing in plasmonic metamaterials (Conference Presentation)

    Science.gov (United States)

    Cai, Wenshan

    2016-09-01

    Metamaterials have offered not only an unprecedented opportunity to generate unconventional electromagnetic properties that are not found in nature, but also the exciting potential to create customized nonlinear media with tailored high-order effects. Two particularly compelling directions of current interest are active metamaterials, where the optical properties can be purposely manipulated by external stimuli, and nonlinear metamaterials, which enable intensity-dependent frequency conversion of light. By exploring the intersection of these two directions, we leverage the electrical and optical functions simultaneously supported in nanostructured metals and demonstrate electrically-controlled nonlinear processes from photonic metamaterials. We show that a variety of nonlinear optical phenomena, including wave mixing and optical rectification, can be purposely modulated by applied voltage signals. In addition, electrically-induced and voltage-controlled nonlinear effects allowed us to demonstrate backward phase matching in a negative index material, a long-standing prediction in nonlinear metamaterials. Other results to be covered in this talk include the photon-drag effect in plasmonic metamaterials and ion-assisted nonlinear effects from metamaterials in electrolytes. Our results reveal a grand opportunity to exploit optical metamaterials as self-contained, dynamic electro-optic systems with intrinsically embedded electrical functions and optical nonlinearities. References: L. Kang, Y. Cui, S. Lan, S. P. Rodrigues, M. L. Brongersma, and W. Cai, Nature Communications, 5, 4680 (2014). S. P. Rodrigues and W. Cai, Nature Nanotechnology, 10, 387 (2015). S. Lan, L. Kang, D. T. Schoen, S. P. Rodrigues, Y. Cui, M. L. Brongersma, and W. Cai, Nature Materials, 14, 807 (2015).

  5. ISO 2789 and ISO 11620: Short Presentation of Standards as Reference Documents in an Assessment Process

    Directory of Open Access Journals (Sweden)

    Pierre-Yves Renard

    2007-11-01

    The aim of this paper is to show how international standards dealing with library statistics and indicators (ISO 2789, ISO 11620 and other projects still under development) can be used as reference documents and strategic tools in a performance assessment process. The task is not an easy one, because it requires linking up somewhat complex entities: the characteristics of standardization work, the capacity of statistics to account for reality and, lastly, the variety and speed of libraries' advancement. Nevertheless, ISO 2789 (International Library Statistics) and ISO 11620 (Performance Indicators for Libraries), which are based on an international consensus of experts, take into account, as much as possible, recent evolutions in library structures and services. In addition, they are related to classical and shared assessment models. So, although their aim is not to draw up an assessment framework, they prove useful for basic operations within such a framework: to define objects and services, and to classify, count and build appropriate indicators. Moreover, as the issue of quantifying and promoting intangible assets becomes a concern in the public sector, these standards can be seen as a first attempt to define library resources and services as such intangible assets. Finally, the challenge for forthcoming evolutions of these standards is the ability to stay up-to-date in a very quickly evolving context. More precisely, any increase in the usability of these standards must be based on an ongoing search for more consistent data and relevant indicators. The question of improving the general design of the statistics and indicators standards family should also be addressed.

  6. Assessing cognitive-related disability in schizophrenia: Reliability, validity and underlying factors of the evaluation of cognitive processes involved in disability in schizophrenia scale.

    Science.gov (United States)

    Passerieux, Christine; Bulot, Virginie; Hardy-Baylé, Marie-Christine; Roux, Paul

    2017-04-11

    We have developed a new scale that assesses disability caused by cognitive impairments in schizophrenia, in order to evaluate the functional impact of schizophrenia and help the prescription of rehabilitation interventions. The aim of the study was to assess its psychometric properties. Mental healthcare professionals and relatives of individuals with schizophrenia developed and rated the evaluation of cognitive processes involved in disability in schizophrenia scale, which included 13 items. Its construct validity was assessed through a factorial analysis; its concurrent validity was evaluated based on ecological outcomes; its convergent validity was tested against the World Health Organization Disability Assessment Schedule (WHODAS II); and its reliability was estimated based on internal consistency and inter-rater reliability. Overall, 215 patients were included. Our findings supported a two-factor structure which accounted for 46% of the variance. The internal consistency and inter-rater reliability were good. The convergent validity showed a strong correlation with the WHODAS II. The concurrent validity showed strong relationships with work status, independent living, and the level and adequacy of institutional care. The good psychometric properties of the scale suggest a role for this tool in assessing schizophrenia-related disability and evaluating the need for cognitive remediation. Implications for Rehabilitation: Schizophrenia is a chronic disorder leading to a severe psychiatric handicap. The scale showed good psychometric properties in individuals with schizophrenia and severe psychiatric disability. The scale is easy and quick to administer (about 15 min). The scale may help to identify targets for rehabilitation interventions in individuals with schizophrenia.
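    As a minimal illustration of the inter-rater reliability statistic mentioned above, here is a textbook ICC(2,1) computation (two-way random effects, absolute agreement, single rater); the data and the helper name `icc_2_1` are hypothetical and not taken from the study:

    ```python
    def icc_2_1(ratings):
        """ICC(2,1): two-way random effects, absolute agreement, single rater.
        `ratings` is a list of rows (subjects), each a list of scores (one per rater)."""
        n = len(ratings)          # number of subjects
        k = len(ratings[0])       # number of raters
        grand = sum(sum(row) for row in ratings) / (n * k)
        row_means = [sum(row) / k for row in ratings]
        col_means = [sum(row[j] for row in ratings) / n for j in range(k)]

        # Two-way ANOVA decomposition of the sums of squares.
        ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
        ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
        ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between raters
        ss_err = ss_total - ss_rows - ss_cols

        ms_rows = ss_rows / (n - 1)
        ms_cols = ss_cols / (k - 1)
        ms_err = ss_err / ((n - 1) * (k - 1))

        return (ms_rows - ms_err) / (
            ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
        )

    # Two hypothetical raters scoring four subjects; rater 2 scores one point higher.
    icc = icc_2_1([[1, 2], [2, 3], [3, 4], [4, 5]])
    ```

    A constant offset between raters lowers absolute-agreement ICC even though the rank ordering is identical, which is why the (2,1) form is a common choice for inter-rater reliability.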

  7. Reliable solution processed planar perovskite hybrid solar cells with large-area uniformity by chloroform soaking and spin rinsing induced surface precipitation

    Directory of Open Access Journals (Sweden)

    Yann-Cherng Chern

    2015-08-01

    A solvent soaking and rinsing method, in which the solvent is allowed to soak the entire surface before spinning for solvent draining, was found to produce perovskite layers with high uniformity on a centimeter scale and with much improved reliability. Besides the enhanced crystallinity and surface morphology due to the rinsing-induced surface precipitation, which constrains the grain growth underneath in the precursor films, large-area uniformity was observed, with the film thickness determined exclusively by the rotational speed of the rinsing spin used for solvent draining. With chloroform as the rinsing solvent, highly uniform and mirror-like perovskite layers of area as large as 8 cm × 8 cm were produced, and highly uniform planar perovskite solar cells with a power conversion efficiency of 10.6 ± 0.2% as well as much prolonged lifetime were obtained. The high uniformity and reliability observed with this solvent soaking and rinsing method were ascribed to the low viscosity of chloroform as well as its miscibility with the solvent used in the precursor solution. Moreover, since the surface precipitation forms before the solvent draining, this solvent soaking and rinsing method may be adapted to a spinless process and be compatible with large-area and continuous production. Given the large-area uniformity and reliability of the resultant perovskite layers, this chloroform soaking and rinsing approach may thus be promising for the mass production and commercialization of large-area perovskite solar cells.

  8. Realist Ontology and Natural Processes: A Semantic Tool to Analyze the Presentation of the Osmosis Concept in Science Texts

    Science.gov (United States)

    Spinelli Barria, Michele; Morales, Cecilia; Merino, Cristian; Quiroz, Waldo

    2016-01-01

    In this work, we developed an ontological tool, based on the scientific realism of Mario Bunge, for the analysis of the presentation of natural processes in science textbooks. This tool was applied to analyze the presentation of the concept of osmosis in 16 chemistry and biology books at different educational levels. The results showed that more…

  9. Visual Field x Response Hand Interactions and Level Priming in the Processing of Laterally Presented Hierarchical Stimuli

    Science.gov (United States)

    Wendt, Mike; Vietze, Ina; Kluwe, Rainer H.

    2007-01-01

    Hemisphere-specific processing of laterally presented global and local stimulus levels was investigated by (a) examining interactions between the visual field of stimulus presentation and the response hand and (b) comparing intra- with inter-hemispheric effects of level priming (i.e. faster and more accurate performance when the target level…

  10. Action-Specific Influences on Perception and Post-Perceptual Processes: Present Controversies and Future Directions

    Science.gov (United States)

    Philbeck, John W.; Witt, Jessica K.

    2015-01-01

    The action-specific perception account holds that people perceive the environment in terms of their ability to act in it. In this view, for example, decreased ability to climb a hill due to fatigue makes the hill visually appear to be steeper. Though influential, this account has not been universally accepted, and in fact a heated controversy has emerged. The opposing view holds that action capability has little or no influence on perception. Heretofore, the debate has been quite polarized, with efforts largely being focused on supporting one view and dismantling the other. We argue here that polarized debate can impede scientific progress and that the search for similarities between two sides of a debate can sharpen the theoretical focus of both sides and illuminate important avenues for future research. In this paper, we present a synthetic review of this debate, drawing from the literatures of both approaches, to clarify both the surprising similarities and the core differences between them. We critically evaluate existing evidence, discuss possible mechanisms of action-specific effects, and make recommendations for future research. A primary focus of future work will involve not only the development of methods that guard against action-specific post-perceptual effects, but also development of concrete, well-constrained underlying mechanisms. The criteria for what constitutes acceptable control of post-perceptual effects and what constitutes an appropriately specific mechanism vary between approaches, and bridging this gap is a central challenge for future research. PMID:26501227

  11. Takotsubo cardiomyopathy systematic review: Pathophysiologic process, clinical presentation and diagnostic approach to Takotsubo cardiomyopathy.

    Science.gov (United States)

    Ono, Ryohei; Falcão, L Menezes

    2016-04-15

    Takotsubo cardiomyopathy (TTC) is characterized by transient left ventricular apical ballooning in the absence of coronary occlusion, typically occurring in older women after emotional or physical stress. The pathophysiology of TTC is not well established, though several possible causes have been proposed, including catecholamine cardiotoxicity, metabolic disturbance, coronary microvascular impairment and multivessel epicardial coronary artery spasm. A number of diagnostic criteria have been suggested worldwide and have not been unified into a single standard; the most widely accepted are the Mayo Clinic criteria. Since the clinical presentation of TTC usually resembles acute coronary syndrome, differential diagnosis is essential both to exclude other diseases and to guide treatment. Imaging modalities including echocardiography, CT angiography and cardiac MRI, together with laboratory tests for catecholamines, troponin T, creatine kinase MB and B-type natriuretic peptide, can be useful to differentiate TTC from other diseases. Prognosis is generally favorable, and in-hospital mortality ranges from 0% to about 10%. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  12. Full solution process approach for deterministic control of light emission at the nanoscale (Conference Presentation)

    Science.gov (United States)

    Jiménez-Solano, Alberto; Galisteo-López, Juan F.; Míguez, Hernán

    2016-04-01

    Porous nanostructured photonic materials in the shape of periodic multilayers have demonstrated their potential in fields ranging from photovoltaics[1] to sensing,[2] representing an ideal platform for flexible devices. When applications dealing with light absorption or emission are considered, knowledge of how the local density of states (LDOS) is distributed within them is mandatory[3] for a judicious design that maximizes light-matter interaction. Using a combination of spin- and dip-casting, we report a detailed study of how dye-doped polystyrene nanospheres constitute an effective LDOS probe for studying its distribution within nanostructured photonic media.[4] This full solution process approach allows large areas to be covered while retaining the photonic properties. Nanospheres with a diameter of 25 nm are incorporated in nanostructured multilayers (Fig. 1a), which allows them to be placed at several positions of the structured sample (Fig. 1b). A combined use of photoluminescence (PL) spectroscopy and time-resolved measurements is used to optically characterize the samples. While the former shows how, depending on the probe position, the PL intensity can be enhanced or suppressed, the latter allows the LDOS changes within the sample to be probed, monitored via changes in its lifetime. We demonstrate how information on the local photonic environment can be retrieved with a spatial resolution of 25 nm (set by the probe size) and relative changes in the decay rates as small as ca. 1% (Fig. 1c), evidencing the possibility of exerting fine deterministic control over the photonic surroundings of an emitter. References [1] C. López-López, S. Colodrero, M. E. Calvo and H. Míguez, Energy Environ. Sci., 23, 2805 (2013). [2] A. Jiménez-Solano, C. López-López, O. Sánchez-Sobrado, J. M. Luque, M. E. Calvo, C. Fernández-López, A. Sánchez-Iglesias, L. M. Liz-Marzán and H. Míguez. Langmuir, 28, 9161 (2012). [3] N. Danz, R. Waldhäusl, A. Bräuer and R

  13. Sedimentological processes and environmental variability at Lake Ohrid (Macedonia, Albania between 640 ka and present day

    Directory of Open Access Journals (Sweden)

    A. Francke

    2015-09-01

    Full Text Available Lake Ohrid (FYROM, Albania) is thought to be more than 1.2 million years old and hosts more than 200 endemic species. As a target of the International Continental Scientific Drilling Program (ICDP), a successful deep drilling campaign was carried out within the scope of the Scientific Collaboration on Past Speciation Conditions in Lake Ohrid (SCOPSCO) project in 2013. Here, we present lithological, sedimentological, and (bio-)geochemical data from the upper 247.8 m of the overall 569 m long DEEP site sediment succession from the central part of the lake. According to an age model, which is based on nine tephra layers (1st order tie points), and on tuning of biogeochemical proxy data to orbital parameters (2nd order tie points) and to the global benthic isotope stack LR04 (3rd order tie points), respectively, the analyzed sediment sequence covers the last 640 ka. The DEEP site sediment succession consists of hemipelagic sediments, which are interspersed by several tephra layers and infrequent, thin (< 5 cm) mass wasting deposits. The hemipelagic sediments can be classified into three different lithotypes. Lithotype 1 and 2 deposits comprise calcareous and slightly calcareous silty clay and are predominantly attributed to interglacial periods with high primary productivity in the lake during summer and reduced mixing during winter. The data suggest that high ion and nutrient concentrations in the lake water promoted calcite precipitation and diatom growth in the epilimnion during MIS15, 13, and 5. Following a strong primary productivity, highest interglacial temperatures can be reported for MIS11 and 5, whereas MIS15, 13, 9, and 7 were comparably cooler. Lithotype 3 deposits consist of clastic, silty clayey material and predominantly represent glacial periods with low primary productivity during summer and longer and intensified mixing during winter. The data imply that the most severe glacial conditions at Lake Ohrid persisted during MIS16, 12, 10, and 6

  14. Sedimentological processes and environmental variability at Lake Ohrid (Macedonia, Albania) between 637 ka and the present

    Science.gov (United States)

    Francke, Alexander; Wagner, Bernd; Just, Janna; Leicher, Niklas; Gromig, Raphael; Baumgarten, Henrike; Vogel, Hendrik; Lacey, Jack H.; Sadori, Laura; Wonik, Thomas; Leng, Melanie J.; Zanchetta, Giovanni; Sulpizio, Roberto; Giaccio, Biagio

    2016-02-01

    Lake Ohrid (Macedonia, Albania) is thought to be more than 1.2 million years old and hosts more than 300 endemic species. As a target of the International Continental Scientific Drilling Program (ICDP), a successful deep drilling campaign was carried out within the scope of the Scientific Collaboration on Past Speciation Conditions in Lake Ohrid (SCOPSCO) project in 2013. Here, we present lithological, sedimentological, and (bio-)geochemical data from the upper 247.8 m composite depth of the overall 569 m long DEEP site sediment succession from the central part of the lake. According to an age model, which is based on 11 tephra layers (first-order tie points) and on tuning of bio-geochemical proxy data to orbital parameters (second-order tie points), the analyzed sediment sequence covers the last 637 kyr. The DEEP site sediment succession consists of hemipelagic sediments, which are interspersed by several tephra layers and infrequent, thin (< 5 cm) mass wasting deposits. The hemipelagic sediments can be classified into three different lithotypes. Lithotype 1 and 2 deposits comprise calcareous and slightly calcareous silty clay and are predominantly attributed to interglacial periods with high primary productivity in the lake during summer and reduced mixing during winter. The data suggest that high ion and nutrient concentrations in the lake water promoted calcite precipitation and diatom growth in the epilimnion during MIS15, 13, and 5. Following a strong primary productivity, highest interglacial temperatures can be reported for marine isotope stages (MIS) 11 and 5, whereas MIS15, 13, 9, and 7 were comparably cooler. Lithotype 3 deposits consist of clastic, silty clayey material and predominantly represent glacial periods with low primary productivity during summer and longer and intensified mixing during winter. 
The data imply that the most severe glacial conditions at Lake Ohrid persisted during MIS16, 12, 10, and 6, whereas somewhat warmer temperatures can be

  15. Sedimentological processes and environmental variability at Lake Ohrid (Macedonia, Albania) between 640 ka and present day

    Science.gov (United States)

    Francke, A.; Wagner, B.; Just, J.; Leicher, N.; Gromig, R.; Baumgarten, H.; Vogel, H.; Lacey, J. H.; Sadori, L.; Wonik, T.; Leng, M. J.; Zanchetta, G.; Sulpizio, R.; Giaccio, B.

    2015-09-01

    Lake Ohrid (FYROM, Albania) is thought to be more than 1.2 million years old and hosts more than 200 endemic species. As a target of the International Continental Scientific Drilling Program (ICDP), a successful deep drilling campaign was carried out within the scope of the Scientific Collaboration on Past Speciation Conditions in Lake Ohrid (SCOPSCO) project in 2013. Here, we present lithological, sedimentological, and (bio-)geochemical data from the upper 247.8 m of the overall 569 m long DEEP site sediment succession from the central part of the lake. According to an age model, which is based on nine tephra layers (1st order tie points), and on tuning of biogeochemical proxy data to orbital parameters (2nd order tie points) and to the global benthic isotope stack LR04 (3rd order tie points), respectively, the analyzed sediment sequence covers the last 640 ka. The DEEP site sediment succession consists of hemipelagic sediments, which are interspersed by several tephra layers and infrequent, thin (< 5 cm) mass wasting deposits. The hemipelagic sediments can be classified into three different lithotypes. Lithotype 1 and 2 deposits comprise calcareous and slightly calcareous silty clay and are predominantly attributed to interglacial periods with high primary productivity in the lake during summer and reduced mixing during winter. The data suggest that high ion and nutrient concentrations in the lake water promoted calcite precipitation and diatom growth in the epilimnion during MIS15, 13, and 5. Following a strong primary productivity, highest interglacial temperatures can be reported for MIS11 and 5, whereas MIS15, 13, 9, and 7 were comparably cooler. Lithotype 3 deposits consist of clastic, silty clayey material and predominantly represent glacial periods with low primary productivity during summer and longer and intensified mixing during winter. The data imply that most severe glacial conditions at Lake Ohrid persisted during MIS16, 12, 10, and 6 whereas

  16. Ultimately Reliable Pyrotechnic Systems

    Science.gov (United States)

    Scott, John H.; Hinkel, Todd

    2015-01-01

    This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high-reliability operation in extreme environments, and illustrates potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the most successful in the history of human spaceflight. No pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably at temperatures as low as -420 °F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, without a single failure. The recent landing on Mars of the Curiosity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror.' Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. 
Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing

  17. Reliable Quantum Computers

    CERN Document Server

    Preskill, J

    1997-01-01

    The new field of quantum error correction has developed spectacularly since its origin less than two years ago. Encoded quantum information can be protected from errors that arise due to uncontrolled interactions with the environment. Recovery from errors can work effectively even if occasional mistakes occur during the recovery procedure. Furthermore, encoded quantum information can be processed without serious propagation of errors. Hence, an arbitrarily long quantum computation can be performed reliably, provided that the average probability of error per quantum gate is less than a certain critical value, the accuracy threshold. A quantum computer storing about 10^6 qubits, with a probability of error per quantum gate of order 10^{-6}, would be a formidable factoring engine. Even a smaller, less accurate quantum computer would be able to perform many useful tasks. (This paper is based on a talk presented at the ITP Conference on Quantum Coherence and Decoherence, 15-18 December 1996.)
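    The accuracy-threshold argument sketched in the abstract can be illustrated with elementary probability. The following is a minimal sketch, assuming independent gate errors; the function and numbers are illustrative, echoing the abstract's ~10^-6 per-gate error figure, and are not from the paper:

```python
# Sketch of why per-gate error rates matter for long computations,
# assuming independent errors (illustrative numbers only).
def success_probability(n_gates: int, p_error: float) -> float:
    """Probability that all n_gates gates execute without a single error."""
    return (1.0 - p_error) ** n_gates

# Without error correction, a million-gate computation at a 10^-6
# per-gate error rate completes error-free only about 37% of the time,
# which is why fault tolerance below the accuracy threshold matters.
p = success_probability(1_000_000, 1e-6)
```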

  18. Chapter 9: Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Algora, Carlos; Espinet-Gonzalez, Pilar; Vazquez, Manuel; Bosco, Nick; Miller, David; Kurtz, Sarah; Rubio, Francisca; McConnell,Robert

    2016-04-15

    This chapter describes the accumulated knowledge on CPV reliability, covering its fundamentals and qualification. It explains the reliability of solar cells, modules (including optics) and plants. The chapter discusses the relevant statistical distributions, namely the exponential, normal and Weibull distributions. The treatment of solar cell reliability covers issues in accelerated aging tests of CPV solar cells, types of failure, and failures in real-time operation. The chapter explores accelerated life tests, namely qualitative life tests (mainly HALT) and quantitative accelerated life tests (QALT). It examines other well-proven PV cells and/or semiconductor devices that share similar semiconductor materials, manufacturing techniques or operating conditions, namely III-V space solar cells and light-emitting diodes (LEDs). It addresses each of the identified reliability issues and presents the current state-of-the-art knowledge for their testing and evaluation. Finally, the chapter summarizes the CPV qualification and reliability standards.
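    The distributions named in the chapter can be written as reliability (survival) functions R(t) = P(lifetime > t). A minimal sketch with illustrative parameter values (none come from the chapter); the Weibull shape parameter separates infant-mortality, constant-hazard, and wear-out regimes:

```python
import math

def weibull_reliability(t: float, scale: float, shape: float) -> float:
    """R(t) = exp(-(t/scale)^shape); shape < 1 models infant mortality,
    shape = 1 a constant hazard rate, shape > 1 wear-out failures."""
    return math.exp(-((t / scale) ** shape))

def exponential_reliability(t: float, failure_rate: float) -> float:
    """R(t) = exp(-lambda * t), the constant-hazard special case."""
    return math.exp(-failure_rate * t)

# With shape = 1 the Weibull reduces exactly to the exponential.
r_weibull = weibull_reliability(t=10_000, scale=50_000, shape=1.0)
r_expon = exponential_reliability(t=10_000, failure_rate=1 / 50_000)
```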

  19. Characterization of TAP Ambr 250 disposable bioreactors, as a reliable scale-down model for biologics process development.

    Science.gov (United States)

    Xu, Ping; Clark, Colleen; Ryder, Todd; Sparks, Colleen; Zhou, Jiping; Wang, Michelle; Russell, Reb; Scott, Charo

    2017-03-01

    Demand for the development of biological therapies is rapidly increasing, as is the drive to reduce time to patient. To speed up development, the disposable Automated Microscale Bioreactor (Ambr 250) system is gaining increasing interest due to its advantages, including highly automated control, high-throughput capacity, and short turnaround time. Traditional early-stage upstream process development conducted in 2-5 L bench-top bioreactors requires a large footprint and high running costs. Establishing the Ambr 250 as a scale-down model brings many benefits to process development. In this study, a comprehensive characterization of the mass transfer coefficient (kLa) in the Ambr 250 was conducted to define optimal operational conditions. Scale-down approaches, including dimensionless volumetric flow rate (vvm), power per unit volume (P/V) and kLa, have been evaluated using different cell lines. This study demonstrates that the Ambr 250 generated profiles of cell growth and protein production comparable to those seen at the 5-L and 1000-L bioreactor scales when using kLa as the scale-down parameter. In addition to mimicking processes at large scales, the suitability of the Ambr 250 as a tool for clone selection, which is traditionally conducted in bench-top bioreactors, was investigated. Data show that cell growth, productivity, metabolite profiles, and product qualities of material generated using the Ambr 250 were comparable to those from 5-L bioreactors. Therefore, the Ambr 250 can be used for clone selection and process development as a replacement for traditional bench-top bioreactors, minimizing resource utilization during the early stages of development in the biopharmaceutical industry. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:478-489, 2017.
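    The kLa-matched scale-down the study describes can be illustrated with a van't Riet-style power-law correlation, kLa = a·(P/V)^alpha·(v_s)^beta. This is a hedged sketch: the constants a, alpha, beta and every operating value below are hypothetical placeholders, since in practice the constants are fitted per vessel (which is exactly what a characterization study provides):

```python
def kla(p_per_v: float, v_s: float,
        a: float = 0.02, alpha: float = 0.6, beta: float = 0.4) -> float:
    """Volumetric mass-transfer coefficient (1/s) from specific power
    input P/V (W/m^3) and superficial gas velocity v_s (m/s).
    Correlation constants are illustrative, not fitted values."""
    return a * p_per_v ** alpha * v_s ** beta

# Target kLa at the (hypothetical) large scale.
target = kla(p_per_v=30.0, v_s=0.004)

# Scale down: at the small vessel's gas velocity, invert the correlation
# to find the P/V that reproduces the same kLa.
small_v_s = 0.002
p_per_v_small = (target / (0.02 * small_v_s ** 0.4)) ** (1 / 0.6)
```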

  20. Hawaii Electric System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
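    The cost-integration method the report explains, trading the carrying cost of reserve capacity against customers' outage costs, can be sketched in a few lines. All numbers and the unserved-energy curve below are hypothetical placeholders, not values from the report:

```python
def total_cost(reserve_mw: float,
               capacity_cost_per_mw: float = 100_000.0,
               outage_cost_per_mwh: float = 10_000.0) -> float:
    """Annual cost of a reserve level: carrying cost of the capacity plus
    the expected cost of unserved energy (a made-up decaying curve)."""
    expected_unserved_mwh = 5_000.0 * 0.5 ** (reserve_mw / 50.0)
    return (reserve_mw * capacity_cost_per_mw
            + outage_cost_per_mwh * expected_unserved_mwh)

# Optimal resource adequacy: the reserve level minimizing total cost,
# found here by a coarse one-dimensional grid search.
best_reserve = min(range(0, 501, 10), key=total_cost)
```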

  1. Hawaii electric system reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  2. Vocal reaction times to unilaterally presented concrete and abstract words: towards a theory of differential right hemispheric semantic processing.

    Science.gov (United States)

    Rastatter, M; Dell, C W; McGuire, R A; Loren, C

    1987-03-01

    Previous studies investigating hemispheric organization for processing concrete and abstract nouns have provided conflicting results. Using manual reaction-time tasks, some studies have shown that the right hemisphere is capable of analyzing concrete words but not abstract ones. Others, however, have inferred that the left hemisphere is the sole analyzer of both types of lexicon. The present study tested these issues further by measuring vocal reaction times of normal subjects to unilaterally presented concrete and abstract items. Results were consistent with a model of functional localization which suggests that the minor hemisphere is capable of differentially processing both types of lexicon in the presence of a dominant left hemisphere.

  3. Integrated circuit reliability. Citations from the NTIS data base

    Science.gov (United States)

    Reed, W. E.

    1980-06-01

    The bibliography presents research pertinent to design, reliability prediction, failure and malfunction, processing techniques, and radiation damage. This updated bibliography contains 193 abstracts, 17 of which are new entries to the previous edition.

  4. Development of improved processing and evaluation methods for high reliability structural ceramics for advanced heat engine applications, Phase 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Pujari, V.K.; Tracey, D.M.; Foley, M.R.; Paille, N.I.; Pelletier, P.J.; Sales, L.C.; Wilkens, C.A.; Yeckley, R.L. [Norton Co., Northboro, MA (United States)

    1993-08-01

    The program goals were to develop and demonstrate significant improvements in processing methods, process controls and non-destructive evaluation (NDE) which can be commercially implemented to produce high reliability silicon nitride components for advanced heat engine applications at temperatures to 1,370 °C. The program focused on a Si₃N₄-4% Y₂O₃ high temperature ceramic composition and hot-isostatic-pressing as the method of densification. Stage I had as major objectives: (1) comparing injection molding and colloidal consolidation process routes, and selecting one route for subsequent optimization, (2) comparing the performance of water milled and alcohol milled powder and selecting one on the basis of performance data, and (3) adapting several NDE methods to the needs of ceramic processing. The NDE methods considered were microfocus X-ray radiography, computed tomography, ultrasonics, NMR imaging, NMR spectroscopy, fluorescent liquid dye penetrant and X-ray diffraction residual stress analysis. The colloidal consolidation process route was selected and approved as the forming technique for the remainder of the program. The material produced by the final Stage II optimized process has been given the designation NCX 5102 silicon nitride. According to plan, a large number of specimens were produced and tested during Stage III to establish a statistically robust room temperature tensile strength database for this material. Highlights of the Stage III process demonstration and resultant database are included in the main text of the report, along with a synopsis of the NCX-5102 aqueous based colloidal process. The R and D accomplishments for Stage I are discussed in Appendices 1--4, while the tensile strength-fractography database for the Stage III NCX-5102 process demonstration is provided in Appendix 5. 4 refs., 108 figs., 23 tabs.

  5. A Reliability-Oriented Design Method for Power Electronic Converters

    DEFF Research Database (Denmark)

    Wang, Huai; Zhou, Dao; Blaabjerg, Frede

    2013-01-01

    handbook) to the physics-of-failure approach and design for reliability process. A systematic design procedure consisting of various design tools is presented in this paper to design reliability into power electronic converters from the early concept phase onward. The corresponding design procedures...

  6. [Processing acoustically presented time intervals of seconds duration: an expression of the phonological loop of the working memory?].

    Science.gov (United States)

    Grube, D

    1996-01-01

    Working memory has been proposed to contribute to the processing of time, rhythm and music; the question which component of working memory is involved is under discussion. The present study tests the hypothesis that the phonological loop component (Baddeley, 1986) is involved in the processing of auditorily presented time intervals of a few seconds' duration. Typical effects well known with short-term retention of verbal material could be replicated with short-term retention of temporal intervals: The immediate reproduction of time intervals was impaired under conditions of background music and articulatory suppression. Neither the accuracy nor the speed of responses in a (non-phonological) mental rotation task were diminished under these conditions. Processing of auditorily presented time intervals seems to be constrained by the capacity of the phonological loop: The immediate serial recall of sequences of time intervals was shown to be related to the immediate serial recall of words (memory span). The results confirm the notion that working memory resources, and especially the phonological loop component, underlie the processing of auditorily presented temporal information with a duration of a few seconds.

  7. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  8. Congenital defects of C1 arches and odontoid process in a child with Down's syndrome: A case presentation

    Directory of Open Access Journals (Sweden)

    Catherine Hatzantonis

    2016-01-01

    Full Text Available We present the case of a 2-year-old child with Down's syndrome who presented to our unit with torticollis. Imaging studies revealed the rare occurrence of anterior and posterior C1 arch defects, absent odontoid process, and atlantoaxial subluxation. We managed her conservatively for 3 years without neurological deficits or worsening of atlantoaxial subluxation. We discuss the rare occurrences of anterior and posterior arch defects of the atlas, the radiological presentations of axis defects in patients, and the occurrence of atlantoaxial instability in patients with Down's syndrome. Management options with consideration to surgery in asymptomatic and symptomatic patients are also discussed.

  9. A review of culturally adapted versions of the Oswestry Disability Index: the adaptation process, construct validity, test-retest reliability and internal consistency.

    Science.gov (United States)

    Sheahan, Peter J; Nelson-Wong, Erika J; Fischer, Steven L

    2015-01-01

    The Oswestry Disability Index (ODI) is a self-report-based outcome measure used to quantify the extent of disability related to low back pain (LBP), a substantial contributor to workplace absenteeism. The ODI tool has been adapted for use by patients in several non-English speaking nations. It is unclear, however, if these adapted versions of the ODI are as credible as the original ODI developed for English-speaking nations. The objective of this study was to conduct a review of the literature to identify culturally adapted versions of the ODI and to report on the adaptation process, construct validity, test-retest reliability and internal consistency of these ODIs. Following a pragmatic review process, data were extracted from each study with regard to these four outcomes. While most studies applied adaptation processes in accordance with best-practice guidelines, there were some deviations. However, all studies reported high-quality psychometric properties: group mean construct validity was 0.734 ± 0.094 (indicated via a correlation coefficient), test-retest reliability was 0.937 ± 0.032 (indicated via an intraclass correlation coefficient) and internal consistency was 0.876 ± 0.047 (indicated via Cronbach's alpha). Researchers can be confident when using any of these culturally adapted ODIs, or when comparing and contrasting results between cultures where these versions were employed. Implications for Rehabilitation Low back pain is the second leading cause of disability in the world, behind only cancer. The Oswestry Disability Index (ODI) has been developed as a self-report outcome measure of low back pain for administration to patients. An understanding of the various cross-cultural adaptations of the ODI is important for more concerted multi-national research efforts. This review examines 16 cross-cultural adaptations of the ODI and should inform the work of health care and rehabilitation professionals.
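    Of the statistics the review aggregates, internal consistency (Cronbach's alpha) is the simplest to compute directly from raw questionnaire responses. A minimal sketch on a made-up item-response matrix (rows = respondents, columns = ODI-style items; the scores are illustrative only, not data from the review):

```python
def cronbach_alpha(responses):
    """Cronbach's alpha: (k/(k-1)) * (1 - sum(item vars)/var of totals)."""
    k = len(responses[0])              # number of items

    def variance(xs):                  # population variance
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    item_vars = sum(variance([row[i] for row in responses]) for i in range(k))
    total_var = variance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - item_vars / total_var)

scores = [  # four respondents x four items, each scored 1-5
    [3, 4, 3, 5],
    [2, 2, 3, 2],
    [4, 5, 4, 5],
    [1, 2, 1, 2],
]
alpha = cronbach_alpha(scores)  # items move together, so alpha is high
```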

  10. Adenocarcinoma of the uncinate process of the pancreas: MDCT patterns of local invasion and clinical features at presentation

    Energy Technology Data Exchange (ETDEWEB)

    Padilla-Thornton, Amie E.; Willmann, Juergen K.; Jeffrey, R.B. [Stanford University School of Medicine, Department of Radiology, Stanford, CA (United States)

    2012-05-15

    To compare the multidetector CT (MDCT) patterns of local invasion and clinical findings at presentation in patients with adenocarcinoma of the uncinate process of the pancreas to those in patients with adenocarcinomas of the non-uncinate head of the pancreas. We evaluated the two cohorts for common duct and pancreatic duct dilatation, mesenteric vascular encasement, root of mesentery invasion, perineural invasion and duodenal invasion. In addition, we compared the clinical findings at presentation in both groups. Common duct (P < 0.001) and pancreatic duct dilatation (P = 0.001) were significantly less common in uncinate process adenocarcinomas than in those of the non-uncinate head of the pancreas. Clinical findings of jaundice (P = 0.01) and pruritus (P = 0.004) were significantly more common in patients with lesions in the non-uncinate head of the pancreas. Superior mesenteric artery encasement (P = 0.02) and perineural invasion (P = 0.001) were significantly more common with uncinate process adenocarcinomas. Owing to its unique anatomic location, adenocarcinomas within the uncinate process of the pancreas show significantly different patterns of both local invasion and clinical presentation compared to carcinomas of the non-uncinate head of the pancreas. (orig.)

  11. Reliability-based design of wind turbine blades

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    2011-01-01

    Reliability-based design of wind turbine blades requires identification of the important failure modes/limit states along with stochastic models for the uncertainties and methods for estimating the reliability. In the present paper it is described how reliability-based design can be applied to wind turbine blades. For wind turbine blades, tests with the basic composite materials and a few full-scale blades are normally performed during the design process. By adopting a reliability-based design approach, information from these tests can be taken into account in a rational way during the design process. In the present paper, a probabilistic framework for design of wind turbine blades is presented and it is demonstrated how information from tests can be taken into account using the Maximum-Likelihood method and Bayesian statistics. In a numerical example, the reliability is estimated for a wind...
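
The Bayesian step mentioned in the abstract can be illustrated with a minimal conjugate update: a normal prior on mean material strength is combined with a few hypothetical coupon-test results, assuming a known observation variance. This is a sketch of the general idea only; the paper's actual stochastic models and Maximum-Likelihood treatment are more elaborate.

```python
# Sketch: folding test data into a design estimate via conjugate
# Bayesian updating (normal prior on the mean strength, known variance).
# All numbers are illustrative, not from the paper.

def update_normal_mean(prior_mu, prior_var, obs, obs_var):
    """Posterior of the mean after observing samples `obs` whose
    measurement variance `obs_var` is assumed known."""
    n = len(obs)
    xbar = sum(obs) / n
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mu = post_var * (prior_mu / prior_var + n * xbar / obs_var)
    return post_mu, post_var

# Prior belief about mean composite strength (MPa), then three coupon tests.
mu, var = update_normal_mean(prior_mu=600.0, prior_var=50.0 ** 2,
                             obs=[640.0, 655.0, 648.0], obs_var=30.0 ** 2)
```

The posterior mean moves toward the test average while the posterior variance shrinks, which is how test evidence tightens the design model.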

  13. Preservation of histological structure of cells in human skin presenting mummification and corification processes by Sandison's rehydrating solution.

    Science.gov (United States)

    Collini, Federica; Andreola, Salvatore Ambrogio; Gentile, Guendalina; Marchesi, Matteo; Muccino, Enrico; Zoja, Riccardo

    2014-11-01

    To overcome the difficulties of preparing and interpreting microscopic material from corpses presenting mummification and corification processes, a variety of techniques are used. In this study, Sandison's rehydrating solution, generally used in the archaeological field on Egyptian mummies of different ages, was applied to human cadaveric material in an advanced state of decomposition. Nineteen skin specimens were taken from corpses presenting corification and mummification processes, discovered after intervals ranging between one and four months or exhumed after 11 years. Each biological sample was divided into two parts: one fixed directly in 10% buffered formalin; the other pre-treated with Sandison's rehydrating solution and then post-fixed in 10% buffered formalin. All samples then underwent routine histological preparation, and the sections were stained with hematoxylin-eosin and other histochemical stains. Under the microscope, the samples placed directly into formalin showed marked structural changes of the various components, while those previously rehydrated with Sandison's solution allowed clear recognition of the different structures. By significantly preserving the general architecture of skin samples presenting corification and mummification processes, Sandison's rehydrating solution stands as an indispensable procedure in the study of such cases. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. Calculation of mean outcrossing rates of non-Gaussian processes with stochastic input parameters - Reliability of containers stowed on ships in severe sea

    DEFF Research Database (Denmark)

    Nielsen, Ulrik Dam

    2010-01-01

    Mean outcrossing rates can be used as a basis for decision support for ships in severe sea. The article describes a procedure for calculating the mean outcrossing rate of non-Gaussian processes with stochastic input parameters. The procedure is based on the first-order reliability method (FORM......) and stochastic parameters are incorporated by carrying out a number of FORM calculations corresponding to combinations of specific values of the stochastic parameters. Subsequently, the individual FORM calculation is weighted according to the joint probability with which the specific combination of parameter....... The results of the procedure are compared with brute force simulations obtained by Monte Carlo simulation (MCS) and good agreement is observed. Importantly, the procedure requires significantly less CPU time compared to MCS to produce mean outcrossing rates....
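
The weighting scheme described in the abstract can be sketched as follows: an outcrossing rate is obtained for each combination of the stochastic parameters, and the results are combined using the combinations' joint probabilities. Here `rate_given_params` is a hypothetical stand-in for the actual FORM calculation, and the sea-state values and probabilities are invented for illustration.

```python
# Sketch of the weighting step: per-combination outcrossing rates are
# averaged with the joint probabilities of the parameter combinations.

def rate_given_params(hs, tz):
    # Hypothetical surrogate for a FORM outcrossing-rate computation at
    # significant wave height `hs` [m] and zero-crossing period `tz` [s].
    return 1e-4 * hs ** 2 / tz

param_combos = [  # (hs [m], tz [s], joint probability)
    (4.0, 8.0, 0.5),
    (5.0, 9.0, 0.3),
    (6.0, 10.0, 0.2),
]
mean_rate = sum(p * rate_given_params(hs, tz) for hs, tz, p in param_combos)
```

A Monte Carlo estimate would instead sample parameter combinations and simulate responses, which is what makes the weighted-FORM approach much cheaper in CPU time.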

  15. Teflon/SiO2 Bilayer Passivation for Improving the Electrical Reliability of Oxide TFTs Fabricated Using a New Two-Photomask Self-Alignment Process

    Directory of Open Access Journals (Sweden)

    Ching-Lin Fan

    2015-04-01

    Full Text Available This study proposes a two-photomask process for fabricating amorphous indium–gallium–zinc oxide (a-IGZO thin-film transistors (TFTs that exhibit a self-aligned structure. The fabricated TFTs, which lack etching-stop (ES layers, have undamaged a-IGZO active layers that facilitate superior performance. In addition, we demonstrate a bilayer passivation method that uses a polytetrafluoroethylene (Teflon and SiO2 combination layer for improving the electrical reliability of the fabricated TFTs. Teflon was deposited as a buffer layer through thermal evaporation. The Teflon layer exhibited favorable compatibility with the underlying IGZO channel layer and effectively protected the a-IGZO TFTs from plasma damage during SiO2 deposition, resulting in a negligible initial performance drop in the a-IGZO TFTs. Compared with passivation-free a-IGZO TFTs, passivated TFTs exhibited superior stability even after 168 h of aging under ambient air at 95% relative humidity.

  16. Design for Reliability of Power Electronic Systems

    DEFF Research Database (Denmark)

    Wang, Huai; Ma, Ke; Blaabjerg, Frede

    2012-01-01

    Advances in power electronics enable efficient and flexible processing of electric power in the application of renewable energy sources, electric vehicles, adjustable-speed drives, etc. More and more efforts are devoted to better power electronic systems in terms of reliability to ensure high availability, long lifetime, sufficient robustness, low maintenance cost and low cost of energy. However, reliability predictions are still dominantly made according to outdated models and terms, such as the MIL-HDBK-217F handbook models, Mean-Time-To-Failure (MTTF), and Mean-Time-Between-Failures (MTBF). A collection of methodologies based on the Physics-of-Failure (PoF) approach and mission profile analysis is presented in this paper to perform reliability-oriented design of power electronic systems. The corresponding design procedures and reliability prediction models are provided. Further on, a case study...
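
For contrast with the PoF approach the paper advocates, the handbook-style prediction it criticises reduces to a one-line formula: a constant failure rate of 1/MTBF gives mission reliability R(t) = exp(-t/MTBF), regardless of wear-out. The MTBF figure and mission length below are illustrative.

```python
# Sketch: why MTBF-based prediction is a strong assumption. A constant
# failure rate lambda = 1/MTBF implies exponential mission reliability.

import math

def mission_reliability(hours, mtbf_hours):
    """R(t) = exp(-t / MTBF), the constant-failure-rate model."""
    return math.exp(-hours / mtbf_hours)

# A converter with a claimed MTBF of 100,000 h over a 10-year mission:
r_10y = mission_reliability(hours=10 * 8760, mtbf_hours=100_000)
```

The model cannot represent increasing failure rates from thermal cycling or other wear-out mechanisms, which is what mission-profile-based PoF analysis addresses.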

  17. Normal Pressure Hydrocephalus as an Unusual Presentation of Supratentorial Extraventricular Space-Occupying Processes: Report on Two Cases

    Directory of Open Access Journals (Sweden)

    E. Naydenov

    2012-03-01

    Full Text Available Normal pressure hydrocephalus (NPH is a clinical and radiographic syndrome characterized by ventriculomegaly, abnormal gait, urinary incontinence, and dementia. The condition may occur due to a variety of secondary causes but may be idiopathic in approximately 50% of patients. Secondary causes may include head injury, subarachnoid hemorrhage, meningitis, and central nervous system tumor. Here, we describe two extremely rare cases of supratentorial extraventricular space-occupying processes: meningioma and glioblastoma multiforme, which initially presented with NPH.

  18. Effects of Different Multimedia Presentations on Viewers' Information-Processing Activities Measured by Eye-Tracking Technology

    Science.gov (United States)

    Chuang, Hsueh-Hua; Liu, Han-Chin

    2011-05-01

    This study implemented eye-tracking technology to understand the impact of different multimedia instructional materials, i.e., five successive pages versus a single page with the same amount of information, on information-processing activities in 21 non-science-major college students. The findings showed that students demonstrated the same number of eye fixations during information searching and spent the same amount of time on the overall instructional materials regardless of format. However, the total number of eye fixations on the picture areas was significantly greater for the multiple-page than for the single-page presentation. A significant difference was also found in students' eye fixation durations on the picture areas under the two conditions, with students spending more time on the picture area of the multiple-page than of the single-page presentation. Greater pupil size was found when participants viewed the multiple-page presentation, implying that this presentation format was associated with a higher cognitive load. The participants' eye-movement data for specific areas were recorded and analyzed to determine students' information-processing patterns and strategies and for triangulation with the quantitative findings. Discussion of the research findings and suggestions for future research are provided.

  19. Reliability of Power Units in Poland and the World

    Directory of Open Access Journals (Sweden)

    Józef Paska

    2015-09-01

    Full Text Available One of a power system’s subsystems is the generation subsystem consisting of power units, the reliability of which to a large extent determines the reliability of the power system and electricity supply to consumers. This paper presents definitions of the basic indices of power unit reliability used in Poland and in the world. They are compared and analysed on the basis of data published by the Energy Market Agency (Poland, NERC (North American Electric Reliability Corporation – USA, and WEC (World Energy Council. Deficiencies and the lack of a unified national system for collecting and processing electric power equipment unavailability data are also indicated.
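
Two of the basic unit reliability indices on which such comparisons rely can be computed directly from hour counts. The formulas below follow the commonly used definitions of availability factor and forced outage rate (the exact index sets used by the Energy Market Agency, NERC and WEC differ); the hour counts are hypothetical.

```python
# Sketch: two basic power-unit reliability indices from hour counts.

def availability_factor(available_hours, period_hours):
    """Fraction of the period in which the unit was available."""
    return available_hours / period_hours

def forced_outage_rate(forced_outage_hours, service_hours):
    """Forced outage hours relative to forced outage plus service hours."""
    return forced_outage_hours / (forced_outage_hours + service_hours)

# Illustrative year for one unit:
af = availability_factor(available_hours=7900, period_hours=8760)
for_rate = forced_outage_rate(forced_outage_hours=300, service_hours=7500)
```

Consistent, unit-level bookkeeping of these hour categories is exactly what the paper argues a unified national data-collection system should provide.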

  20. Trends in Control Area of PLC Reliability and Safety Parameters

    Directory of Open Access Journals (Sweden)

    Juraj Zdansky

    2008-01-01

    Full Text Available Extension of the possible applications of PLCs is closely related to improvements in their reliability and safety parameters. If the reliability and safety parameters are suitable, a PLC can be implemented in specific applications such as safety-related process control. The goal of this article is to show how producers are approaching the task of increasing PLC reliability and safety parameters. The second goal is to analyze these parameters across the range of currently available devices and to describe how the reliability and safety parameters can be affected.

  1. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  2. Clustered granules present in the hippocampus of aged mice result from a degenerative process affecting astrocytes and their surrounding neuropil.

    Science.gov (United States)

    Manich, Gemma; Cabezón, Itsaso; Camins, Antoni; Pallàs, Mercè; Liberski, Pawel P; Vilaplana, Jordi; Pelegrí, Carme

    2014-01-01

    Clusters of pathological granular structures appear and progressively increase in number with age in the hippocampus of several mouse strains, markedly in the senescence-accelerated mouse prone 8 mice. In the present work, we performed an ultrastructural study of these granules, paying special attention to the first stages of their formation, which had not been previously explored. The analysis of the immature granules led to the conclusion that granules are not simple accumulations of molecular waste but the result of a degenerative process involving principally astrocytic processes, although nearby neuronal structures can also be affected. Granule generation includes destabilization of the plasma membranes and the appearance of abnormal membranous structures that form intracellular bubbles or blebs of variable sizes and irregular shapes. These structures and some organelles degenerate, producing membranous fragments, and an assembly process of the resulting fragments generates the dense-core nucleus of the mature granule. Moreover, we found that the neo-epitope recently described in the mature granules, localised abundantly in the membranous fragments of their dense-core nucleus, emerges in the first stages of granule formation. On the other hand, this study adds to the evidence that each cluster of granules is formed by the granules contained in one astrocyte. A better knowledge of the causes of granule formation and the function of the neo-epitope will help both in interpreting the physiological significance of the granules and in understanding their contribution to degenerative processes in the aging brain.

  3. Research on Processes for Improving Microwave Device Package Reliability

    Institute of Scientific and Technical Information of China (English)

    庞学满; 曹坤; 唐利锋; 夏庆水; 王子良

    2011-01-01

    The dominant causes of microwave package failure are analyzed, and the key processes and technical difficulties in package fabrication are discussed. The key factors affecting package reliability and their mechanisms are examined, including the dimensional precision and flatness of the ceramic parts, the metallization strength, and the brazing and sealing technology. Based on the existing process conditions, corresponding control measures are proposed for each factor, with good results obtained in practice.

  4. Grid reliability

    CERN Document Server

    Saiz, P; Rocha, R; Andreeva, J

    2007-01-01

    We are offering a system to track the efficiency of different components of the GRID. We can study the performance of both the WMS and the data transfers. At the moment, we have set up different parts of the system for ALICE, ATLAS, CMS and LHCb. None of the components that we have developed are VO specific, therefore it would be very easy to deploy them for any other VO. Our main goal is basically to improve the reliability of the GRID. The main idea is to discover as soon as possible the different problems that have happened, and inform the responsible parties. Since we study the jobs and transfers issued by real users, we see the same problems that users see. As a matter of fact, we see even more problems than the end user does, since we are also interested in following up the errors that GRID components can overcome by themselves (for instance, in case of a job failure, resubmitting the job to a different site). This kind of information is very useful to site and VO administrators. They can find out the efficien...

  5. Ultra reliability at NASA

    Science.gov (United States)

    Shapiro, Andrew A.

    2006-01-01

    Ultra reliable systems are critical to NASA, particularly as consideration is being given to extended lunar missions and manned missions to Mars. NASA has formulated a program designed to improve the reliability of NASA systems. The long-term goal for NASA ultra reliability is to ultimately improve NASA systems by an order of magnitude. The approach outlined in this presentation involves the steps used in developing a strategic plan to achieve the long-term objective of ultra reliability. Consideration is given to: complex systems, hardware (including aircraft, aerospace craft and launch vehicles), software, human interactions, long-life missions, infrastructure development, and cross-cutting technologies. Several NASA-wide workshops have been held, identifying issues for reliability improvement and providing mitigation strategies for these issues. In addition to representation from all of the NASA centers, experts from government (NASA and non-NASA), universities and industry participated. Highlights of a strategic plan, which is being developed using the results from these workshops, will be presented.

  6. Photovoltaic module reliability workshop

    Energy Technology Data Exchange (ETDEWEB)

    Mrig, L. (ed.)

    1990-01-01

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986-1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, there is still need for the substantial research and testing required to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  7. Microvesicle Cargo of Tumor-Associated MUC1 to Dendritic Cells Allows Cross-presentation and Specific Carbohydrate Processing

    DEFF Research Database (Denmark)

    Rughetti, Aurelia; Rahimi, Hassan; Belleudi, Francesca

    2014-01-01

    Here, we show that the form of the MUC1 antigen, i.e., soluble or as microvesicle cargo, influences MUC1 processing in dendritic cells. In fact, MUC1 carried by microvesicles translocates from the endolysosomal/HLA-II to the HLA-I compartment and is presented by dendritic cells to MUC1-specific CD8...... by deglycosylation that generates novel MUC1 glycoepitopes. Microvesicle-mediated transfer of tumor-associated glycoproteins to dendritic cells may be a relevant biologic mechanism in vivo contributing to define the type of immunogenicity elicited. Furthermore, these results have important implications...

  8. Dynamic reliability of digital-based transmitters

    Energy Technology Data Exchange (ETDEWEB)

    Brissaud, Florent, E-mail: florent.brissaud.2007@utt.f [Institut National de l'Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France) and Universite de Technologie de Troyes - UTT, Institut Charles Delaunay - ICD and UMR CNRS 6279 STMR, 12 rue Marie Curie, BP 2060, 10010 Troyes Cedex (France); Smidts, Carol [Ohio State University (OSU), Nuclear Engineering Program, Department of Mechanical Engineering, Scott Laboratory, 201 W 19th Ave, Columbus OH 43210 (United States); Barros, Anne; Berenguer, Christophe [Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and UMR CNRS 6279 STMR, 12 rue Marie Curie, BP 2060, 10010 Troyes Cedex (France)

    2011-07-15

    Dynamic reliability explicitly handles the interactions between the stochastic behaviour of system components and the deterministic behaviour of process variables. While dynamic reliability provides a more efficient and realistic way to perform probabilistic risk assessment than 'static' approaches, its industrial level applications are still limited. Factors contributing to this situation are the inherent complexity of the theory and the lack of a generic platform. More recently the increased use of digital-based systems has also introduced additional modelling challenges related to specific interactions between system components. Typical examples are the 'intelligent transmitters' which are able to exchange information, and to perform internal data processing and advanced functionalities. To make a contribution to solving these challenges, the mathematical framework of dynamic reliability is extended to handle the data and information which are processed and exchanged between systems components. Stochastic deviations that may affect system properties are also introduced to enhance the modelling of failures. A formalized Petri net approach is then presented to perform the corresponding reliability analyses using numerical methods. Following this formalism, a versatile model for the dynamic reliability modelling of digital-based transmitters is proposed. Finally the framework's flexibility and effectiveness is demonstrated on a substantial case study involving a simplified model of a nuclear fast reactor.

  9. Role of metalloproteases in vaccinia virus epitope processing for transporter associated with antigen processing (TAP)-independent human leukocyte antigen (HLA)-B7 class I antigen presentation.

    Science.gov (United States)

    Lorente, Elena; García, Ruth; Mir, Carmen; Barriga, Alejandro; Lemonnier, François A; Ramos, Manuel; López, Daniel

    2012-03-23

    The transporter associated with antigen processing (TAP) translocates the viral proteolytic peptides generated by the proteasome and other proteases in the cytosol to the endoplasmic reticulum lumen. There, they complex with nascent human leukocyte antigen (HLA) class I molecules, which are subsequently recognized by the CD8(+) lymphocyte cellular response. However, individuals with nonfunctional TAP complexes or tumor or infected cells with blocked TAP molecules are able to present HLA class I ligands generated by TAP-independent processing pathways. Herein, using a TAP-independent polyclonal vaccinia virus-polyspecific CD8(+) T cell line, two conserved vaccinia-derived TAP-independent HLA-B*0702 epitopes were identified. The presentation of these epitopes in normal cells occurs via complex antigen-processing pathways involving the proteasome and/or different subsets of metalloproteinases (amino-, carboxy-, and endoproteases), which were blocked in infected cells with specific chemical inhibitors. These data support the hypothesis that the abundant cellular proteolytic systems contribute to the supply of peptides recognized by the antiviral cellular immune response, thereby facilitating immunosurveillance. These data may explain why TAP-deficient individuals live normal life spans without any increased susceptibility to viral infections.

  10. Wind Energy - How Reliable.

    Science.gov (United States)

    1980-01-01

    The reliability of a wind energy system depends on the size of the propeller and the size of the back-up energy storage. Design of the optimum system...speed incidents which generate a significant part of the wind energy. A nomogram is presented, based on some continuous wind speed measurements

  11. Legal Regulation Of Citizens Of The Russian Federation Participation In The Electoral Process At The Present Stage

    Directory of Open Access Journals (Sweden)

    Dmitriy O. Ezhevskiy

    2015-03-01

    Full Text Available In the present article, the author analyzes features of the current Russian Federation legislation on elections, including the latest changes. Legal regulation of Russian citizens' participation in the electoral process is considered, as well as the role and importance of the state and its agencies as participants in the electoral process. The legitimacy of free and fair elections at all levels requires that they rely on a solid legal basis created under the democratic rule of law. The ongoing reforms of the political and economic systems in the Russian Federation are designed, through the democratization of all aspects of life, to connect the interests of the individual and society, to put people at the center of social development, and to provide them decent living conditions, freedom, and the opportunity to participate in managing state affairs both directly and through their elected representatives. According to the author, in terms of democratization of state and society, an important role is played by polling relationships based on electoral law and the electoral process in the Russian Federation. In conclusion, the author underlines that the distinguishing feature of the legal status of election commissions, government agencies, local governments, enterprises, organizations, institutions, media, and their officials is that they have the right to participate in electoral legal relations strictly within their authority and cannot go beyond these powers, no matter what reasons motivate the necessity and expediency of actions that go beyond the law.

  12. Reliability engineering theory and practice

    CERN Document Server

    Birolini, Alessandro

    2010-01-01

    Presenting a solid overview of reliability engineering, this volume enables readers to build and evaluate the reliability of various components, equipment and systems. Current applications are presented, and the text itself is based on the author's 30 years of experience in the field.

  13. Synthesis of Reliable Telecommunication Networks

    Directory of Open Access Journals (Sweden)

    Dusan Trstensky

    2005-01-01

    Full Text Available In many applications, the network designer may need to synthesise a reliable telecommunication network. Assume that a network, denoted Gn,e, has n nodes and e edges, and that the operational probability of each edge is known. The system reliability of the network is defined to be the probability that every pair of nodes can communicate with each other. The network synthesis problem considered in this paper is to find a network G*n,e that maximises system reliability over the class of all networks, for the classes of networks Gn,n-1, Gn,m and Gn,n+1 respectively. In addition, an upper bound on the maximum reliability for networks with n nodes and e edges (e > n+2) is derived in terms of the nodes. Computational experiments for the reliability upper bound are also presented. The results show that the proposed reliability upper bound is effective.
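
The system-reliability definition above (every pair of nodes can communicate) can be evaluated exactly for small networks by enumerating edge states, as in this sketch; the synthesis problem in the paper then optimises this quantity over topologies. Enumeration is exponential in the number of edges, so it is only practical for small graphs.

```python
# Sketch: exact all-terminal reliability of a small network by
# enumerating the up/down state of every edge.

from itertools import product

def all_terminal_reliability(n, edges, p):
    """edges: list of (u, v); each edge works independently with prob p."""
    total = 0.0
    for states in product([0, 1], repeat=len(edges)):
        # Union-find over the surviving edges.
        parent = list(range(n))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for (u, v), up in zip(edges, states):
            if up:
                parent[find(u)] = find(v)
        if len({find(i) for i in range(n)}) == 1:   # all nodes connected
            k = sum(states)
            total += p ** k * (1 - p) ** (len(edges) - k)
    return total

# Triangle network (n = 3, e = 3): R = p^3 + 3 p^2 (1 - p).
r = all_terminal_reliability(3, [(0, 1), (1, 2), (0, 2)], 0.9)
```

For the triangle with p = 0.9 this gives 0.972, matching the closed-form expression, which is a useful sanity check before tackling larger classes such as Gn,n-1 or Gn,n+1.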

  14. Reliability based fatigue design and maintenance procedures

    Science.gov (United States)

    Hanagud, S.

    1977-01-01

    A stochastic model has been developed to describe the probability of the fatigue process by assuming a varying hazard rate. This stochastic model can be used to obtain the desired probability of a crack of a certain length at a given location after a certain number of cycles or a certain time. Quantitative estimation of the developed model is also discussed. Application of the model to develop a procedure for reliability-based, cost-effective, fail-safe structural design is presented. This design procedure includes the reliability improvement due to inspection and repair. Methods of obtaining optimum inspection and maintenance schemes are treated.
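
As a toy instance of the varying-hazard-rate idea, the sketch below uses a Weibull hazard, an assumption made here for illustration rather than the paper's actual model: integrating the hazard gives the probability that a crack of the reference length has appeared by N cycles, F(N) = 1 - exp(-(N/eta)^beta).

```python
# Sketch: crack probability under a time-varying (Weibull) hazard rate
# h(t) = (beta/eta) * (t/eta)**(beta - 1). Parameters are illustrative.

import math

def crack_probability(cycles, eta, beta):
    """Probability that a crack of the reference length has appeared
    after `cycles` load cycles, F(N) = 1 - exp(-(N/eta)**beta)."""
    return 1.0 - math.exp(-((cycles / eta) ** beta))

# Characteristic life 1e6 cycles; beta > 1 models a hazard rate that
# grows as fatigue damage accumulates.
p_crack = crack_probability(cycles=5e5, eta=1e6, beta=2.0)
```

An inspection-and-repair step would reset or truncate this distribution, which is how such models feed optimum inspection scheduling.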

  15. Mismatch negativity (MMN) and sensory auditory processing in children aged 9-12 years presenting with putative antecedents of schizophrenia.

    Science.gov (United States)

    Bruggemann, Jason M; Stockill, Helen V; Lenroot, Rhoshel K; Laurens, Kristin R

    2013-09-01

    Identification of markers of abnormal brain function in children at-risk of schizophrenia may inform early intervention and prevention programs. Individuals with schizophrenia are characterised by attenuation of MMN amplitude, which indexes automatic auditory sensory processing. The current aim was to examine whether children who may be at increased risk of schizophrenia due to their presenting multiple putative antecedents of schizophrenia (ASz) are similarly characterised by MMN amplitude reductions, relative to typically developing (TD) children. EEG was recorded from 22 ASz and 24 TD children aged 9 to 12 years (matched on age, sex, and IQ) during a passive auditory oddball task (15% duration deviant). ASz children were those presenting: (1) speech and/or motor development lags/problems; (2) social, emotional, or behavioural problems in the clinical range; and (3) psychotic-like experiences. TD children presented no antecedents, and had no family history of a schizophrenia spectrum disorder. MMN amplitude, but not latency, was significantly greater at frontal sites in the ASz group than in the TD group. Although the MMN exhibited by the children at risk of schizophrenia was unlike that of their typically developing peers, it also differed from the reduced MMN amplitude observed in adults with schizophrenia. This may reflect developmental and disease effects in a pre-prodromal phase of psychosis onset. Longitudinal follow-up is necessary to establish the developmental trajectory of MMN in at-risk children.

  16. Research on Explosive Crushing Process and Safety Reliability

    Institute of Scientific and Technical Information of China (English)

    贺飞; 何毅; 张广平; 孟凡军; 高君; 杨浩

    2013-01-01

    To address the crushing problems encountered in recycling riser explosive, a three-stage differential roller crushing technique was developed to achieve safe, automated crushing of the riser explosive. The crushing technique has very high safety and reliability, reduces the labour intensity of workers and improves their working environment. The explosive crushing equipment developed improves the intrinsic safety of production, guarantees the crushing particle size and efficiency, realizes the safe recycling of explosives and reduces production costs. It has promotion value in both the military and civil explosives industries.

  17. Wind turbine reliability : a database and analysis approach.

    Energy Technology Data Exchange (ETDEWEB)

    Linsday, James (ARES Corporation); Briand, Daniel; Hill, Roger Ray; Stinebaugh, Jennifer A.; Benjamin, Allan S. (ARES Corporation)

    2008-02-01

    The US wind industry has experienced remarkable growth since the turn of the century. At the same time, the physical size and electrical generation capability of wind turbines have also grown remarkably. As the market continues to expand, and as wind generation gains a significant share of the generation portfolio, the reliability of wind turbine technology becomes increasingly important. This report addresses how operations and maintenance costs are related to unreliability, that is, to the failures experienced by systems and components. Reliability tools are demonstrated, the data needed to understand and catalog failure events are described, and practical wind turbine reliability models are illustrated, including preliminary results. The report also outlines a continuing process for capturing industry requirements, needs, and expectations related to Reliability, Availability, Maintainability, and Safety. A simply stated goal of this process is to better understand and improve the operational reliability of wind turbine installations.

  18. Reliability Assessment Of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2014-01-01

    Reduction of the cost of energy for wind turbines is very important in order to make wind energy competitive with other energy sources. The turbine components should therefore be designed to have sufficient reliability while not being too costly (or overly safe). This paper presents models for uncertainty modeling and reliability assessment, especially of structural components such as the tower, blades, substructure, and foundation. Since the function of a wind turbine also depends heavily on many electrical and mechanical components as well as a control system, reliability aspects of these components are discussed, and it is described how their reliability influences the reliability of the structural components. Two illustrative examples are presented covering uncertainty modeling, reliability assessment, and calibration of partial safety factors for structural wind turbine components exposed...

  19. Process design kit and circuits at a 2 µm technology node for flexible wearable electronics applications (Conference Presentation)

    Science.gov (United States)

    Torres-Miranda, Miguel; Petritz, Andreas; Gold, Herbert; Stadlober, Barbara

    2016-09-01

    In this work we present our most advanced technology node of organic thin film transistors (OTFTs), manufactured with a channel length as short as 2 μm by contact photolithography and a self-alignment process directly on a plastic substrate. Our process design kit (PDK) comprises P-type transistors, capacitors, and 3 metal layers for interconnecting complex circuits. The OTFTs are composed of a double dielectric layer with a photopatternable ultra-thin polymer (PNDPE) and alumina, with a thickness on the order of 100 nm. The organic semiconductor is either pentacene or DNTT, which have a stable average mobility of up to 0.1 cm2/Vs. Finally, a polymer (e.g., Parylene-C) is used as a passivation layer. We also describe our design rules for the placement of standard circuit cells. A "plastic wafer" containing 49 dies was fabricated. Each 1 cm2 die holds between 25 and 50 devices, demonstrating larger-scale integration in a small area that is unique among organic technologies. Finally, we present the design (through simulations using a SPICE model for OTFTs) and testing of basic analog and digital circuits: amplifiers with DC gains of about 20 dB, comparators, inverters, and logic gates operating in the 1-10 kHz frequency range. These standard circuit cells could be used for signal conditioning and integrated as active matrices for flexible sensors from third-party institutions, thus opening our fab to new ideas and sophisticated pre-industrial low-cost applications in the emerging fields of biomedical devices and wearable electronics for virtual/augmented reality.

  20. Nuclear weapon reliability evaluation methodology

    Energy Technology Data Exchange (ETDEWEB)

    Wright, D.L. [Sandia National Labs., Albuquerque, NM (United States)

    1993-06-01

    This document provides an overview of the activities normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout its stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of review opportunities that occur throughout the entire process, which assure a consistent approach and appropriate use of the data for reliability evaluation purposes.

  1. Rapid spatial frequency domain inverse problem solutions using look-up tables for real-time processing (Conference Presentation)

    Science.gov (United States)

    Angelo, Joseph P.; Bigio, Irving J.; Gioux, Sylvain

    2016-03-01

    Imaging technologies working in the spatial frequency domain are becoming increasingly popular for generating wide-field optical property maps, enabling further analysis of tissue parameters such as absorption or scattering. While acquisition methods have grown very rapidly and now perform in real time, processing methods remain slow, preventing information from being acquired and displayed in real time. In this work, we present solutions for rapidly solving the inverse problem for optical properties by use of advanced look-up tables. In particular, we present methods and results from a dense, linearized look-up table and from an analytical representation that currently run 100 times faster than the standard method while remaining within 10% in both absorption and scattering. With the resulting computation time in the tens of milliseconds, the proposed techniques enable video-rate feedback for real-time techniques such as snapshot of optical properties (SSOP) imaging, making full video-rate guidance in the clinic possible.
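The look-up-table approach can be illustrated with a minimal sketch: precompute reflectance at two spatial frequencies over a dense grid of optical properties, then invert a measured pair by nearest-neighbour search. The forward model below is a hypothetical stand-in (not the diffusion-based model used in SSOP), and the grid ranges and names `forward`/`invert` are assumptions; only the LUT mechanics are shown.

```python
# LUT inversion sketch for spatial-frequency-domain optical properties.
# forward() is a smooth placeholder model, NOT the real diffuse-reflectance model.
import numpy as np

def forward(mu_a, mu_sp, fx):
    # Hypothetical monotone reflectance model in mu_a (1/mm), mu_s' (1/mm), fx (1/mm)
    return mu_sp / (mu_sp + mu_a + 10.0 * fx)

# Dense LUT over plausible tissue ranges
mu_a_grid = np.linspace(0.001, 0.1, 200)
mu_sp_grid = np.linspace(0.5, 2.5, 200)
A, S = np.meshgrid(mu_a_grid, mu_sp_grid, indexing="ij")
LUT = np.stack([forward(A, S, 0.0), forward(A, S, 0.2)], axis=-1)
flat = LUT.reshape(-1, 2)  # rows align with A.ravel() / S.ravel()

def invert(rd_dc, rd_ac):
    """Nearest-neighbour inverse: measured reflectance pair -> (mu_a, mu_s')."""
    i = np.argmin((flat[:, 0] - rd_dc) ** 2 + (flat[:, 1] - rd_ac) ** 2)
    return A.ravel()[i], S.ravel()[i]

# Round-trip check on a known point
true_a, true_s = 0.02, 1.2
est_a, est_s = invert(forward(true_a, true_s, 0.0), forward(true_a, true_s, 0.2))
```

A production implementation would replace nearest-neighbour search with the dense linearized table or analytical representation described in the abstract to reach millisecond-scale inversion.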

  2. Production of thin glass mirrors by hot slumping for x-ray telescopes: present process and ongoing development

    Science.gov (United States)

    Salmaso, B.; Basso, S.; Brizzolari, C.; Civitani, M.; Ghigo, M.; Pareschi, G.; Spiga, D.; Tagliaferri, G.; Vecchi, G.

    2014-07-01

    Thin glass foils are considered good candidates for building a segmented X-ray telescope with an effective area as large as 2 m2 and an angular resolution better than 5 arcsec. In order to produce thin glass mirror segments, we developed a direct hot slumping technique assisted by pressure, in which the shape of a mould is replicated onto the optical surface of the glass. In this paper we present the results obtained with AF32 (by Schott) and EAGLE XG (by Corning) glass types. The selected mould material is Zerodur K20, as it does not require any anti-sticking layer and matches both glass types well in terms of Coefficient of Thermal Expansion. Our group has already produced a few prototypes, reaching angular resolution near 20 arcsec. In this work, relevant steps forward aimed at attaining a 5 arcsec angular resolution are described, along with the tuning of a few key parameters in the slumping process. The results obtained on a newly procured cylindrical Zerodur K20 mould are presented.

  3. What can thermal infrared remote sensing of terrestrial volcanoes tell us about processes past and present on Mars?

    Science.gov (United States)

    Ramsey, Michael S.; Harris, Andrew J. L.; Crown, David A.

    2016-02-01

    Over the past fifty years, a diverse set of thermal infrared (TIR) remote sensing data has been acquired from the orbits of Earth and Mars, which both have ubiquitous volcanic landforms. These data vary in spatial, spectral and temporal resolution and are critical for investigating an ever-expanding set of science applications including the focus of this review paper: volcanic processes. Volcanic studies using TIR data include active monitoring of flows and plumes on Earth and mapping the compositional and thermophysical diversity on Mars. Furthermore, recent advances in high-resolution, low-cost, ground and laboratory TIR instrumentation now help to augment the orbital data through a synergistic approach to data analysis and validation. Field and laboratory studies also serve as terrestrially-focused analogues that provide important insights to interpret the geologic processes that have operated on other planetary surfaces including Mars. This review expands upon our invited talk of the same title at the 2014 Geological Society of America Meeting to include several case studies designed to give the reader an overview of how TIR data can be applied to volcanic processes on Earth and Mars. These case studies highlight prior work by the authors presented at past meetings, but which have not been published elsewhere. The examples were chosen specifically to identify the TIR data similarities between the two planets, and include analyses of volcanic surfaces to (1) derive composition and texture using TIR spectra (Earth and Mars); (2) analyze mantled flows with thermophysical data (Earth and Mars); (3) estimate lava discharge rate using TIR-derived temperature (Earth with application to Mars); and (4) model flow dynamics based on geomorphic measurements (Mars). Because of our focus on the TIR, we do not attempt to document other remote sensing wavelength regions nor even every volcanic study using TIR data. As TIR instruments have improved over time along similar

  4. The Human Reliability in the Accident Processing of NPP

    Institute of Scientific and Technical Information of China (English)

    谷鹏飞; 张建波; 孙永滨

    2012-01-01

    Human Reliability Analysis (HRA) is becoming increasingly important to the safety of nuclear power plants (NPPs). Since digital technology was adopted in NPP control rooms, computerized operator workstations have brought flexible operating methods, but the large, centralized volume of information also introduces risks for operating tasks. Therefore, as the reliability of NPP equipment continues to rise, human reliability must improve as well to guarantee plant safety. According to the revision of the NUREG-0700 standard published by Brookhaven National Laboratory in the USA, an Advanced Control Room (ACR) is defined as a control room that adopts digital technology. In China, self-determined ACR design began with the Lingao Phase II project; the plant has been in operation since 2010, marking the success of the first self-determined ACR design. This study focuses on the ACR design process. In the analysis, LOCA and SGTR were selected as initiating events, and failures of additional equipment or systems were superimposed on them. The process by which operators handle these accidents was then analyzed against the accident scenarios in order to obtain reasonable human performance data. Collecting human performance data, especially under accident conditions, shows that enhancing human reliability is very important. Analyzing human reliability therefore benefits both safety assessment and design improvements to NPPs, especially the design of the main control room.

  5. SOFTWARE RELIABILITY OF PROFICIENT ENACTMENT

    Directory of Open Access Journals (Sweden)

    B.Anni Princy

    2014-07-01

    Full Text Available A software reliability model treats failures as the outcome of a random process driven by two factors: emerging faults and initial state values. The prevailing approach applies a logistic testing-effort function to fit software reliability models on a real-time dataset; the shortcomings of the logistic testing-effort function are effectively overcome by a Pareto distribution. The estimated framework provides a systematic technique for selecting suitable candidate models and the best fit for a software reliability growth model. Its parameters are estimated to evaluate the reliability of a software system. The resulting process permits software reliability estimates that can be used both as a quality indicator and for planning and controlling resources and development times, yielding efficient computation and reliable measurement of a software system.
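As one concrete reading of such a model, the sketch below implements an inflection S-shaped (logistic) NHPP software reliability growth model, a standard formulation in this literature; the parameter values `a`, `b`, `c` and the function names are hypothetical, not taken from the paper.

```python
# Logistic (inflection S-shaped) NHPP software reliability growth model sketch.
# a: total expected faults, b: fault detection rate, c: inflection parameter.
from math import exp

def mu(t, a=100.0, b=0.15, c=5.0):
    """Expected cumulative number of faults detected by time t (NHPP mean function)."""
    return a / (1.0 + c * exp(-b * t))

def reliability(x, t, **params):
    """Probability of no failure in the interval (t, t+x] under the NHPP model:
    R(x|t) = exp(-(mu(t+x) - mu(t)))."""
    return exp(-(mu(t + x, **params) - mu(t, **params)))

# After 40 time units of testing, probability of surviving 5 more units:
r = reliability(5.0, 40.0)
```

Fitting `a`, `b`, `c` to observed failure counts (e.g., by maximum likelihood) would turn this into the planning and resource-control tool the abstract describes.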

  6. Simultaneous identification of DNA and RNA viruses present in pig faeces using process-controlled deep sequencing.

    Directory of Open Access Journals (Sweden)

    Jana Sachsenröder

    Full Text Available BACKGROUND: Animal faeces comprise a community of many different microorganisms including bacteria and viruses. Only scarce information is available about the diversity of viruses present in the faeces of pigs. Here we describe a protocol, which was optimized for the purification of the total fraction of viral particles from pig faeces. The genomes of the purified DNA and RNA viruses were simultaneously amplified by PCR and subjected to deep sequencing followed by bioinformatic analyses. The efficiency of the method was monitored using a process control consisting of three bacteriophages (T4, M13 and MS2 with different morphology and genome types. Defined amounts of the bacteriophages were added to the sample and their abundance was assessed by quantitative PCR during the preparation procedure. RESULTS: The procedure was applied to a pooled faecal sample of five pigs. From this sample, 69,613 sequence reads were generated. All of the added bacteriophages were identified by sequence analysis of the reads. In total, 7.7% of the reads showed significant sequence identities with published viral sequences. They mainly originated from bacteriophages (73.9% and mammalian viruses (23.9%; 0.8% of the sequences showed identities to plant viruses. The most abundant detected porcine viruses were kobuvirus, rotavirus C, astrovirus, enterovirus B, sapovirus and picobirnavirus. In addition, sequences with identities to the chimpanzee stool-associated circular ssDNA virus were identified. Whole genome analysis indicates that this virus, tentatively designated as pig stool-associated circular ssDNA virus (PigSCV, represents a novel pig virus. CONCLUSION: The established protocol enables the simultaneous detection of DNA and RNA viruses in pig faeces including the identification of so far unknown viruses. It may be applied in studies investigating aetiology, epidemiology and ecology of diseases. The implemented process control serves as quality control, ensures

  7. Assessing the reliability and validity of the Revised Two Factor Study Process Questionnaire (R-SPQ2F) in Ghanaian medical students

    Directory of Open Access Journals (Sweden)

    Victor Mogre

    2014-08-01

    Full Text Available Purpose: We investigated the validity and reliability of the Revised Two Factor Study Process Questionnaire (R-SPQ2F) in preclinical students in Ghana. Methods: The R-SPQ2F was administered to 189 preclinical students of the University for Development Studies, School of Medicine and Health Sciences. Both descriptive and inferential statistics, with Cronbach's alpha and factor analysis, were computed. Results: The mean age of the students was 22.69 ± 0.18 years; 60.8% (n = 115) were males and 42.3% (n = 80) were in their second year of medical training. The students had a higher mean deep approach score (31.23 ± 7.19) than surface approach score (22.62 ± 6.48). Factor analysis of the R-SPQ2F supported a two-factor solution of deep and surface approaches, accounting for 49.80% and 33.57% of the variance, respectively. The deep approach scale (Cronbach's alpha, 0.80) and surface approach scale (Cronbach's alpha, 0.76) and their subscales demonstrated good internal consistency. The factorial validity was comparable to that reported in other studies. Conclusion: Our study confirms the construct validity and internal consistency of the R-SPQ2F for measuring approaches to learning in Ghanaian preclinical students. The deep approach was the most dominant learning approach among the students. The questionnaire can be used to measure students' approaches to learning in Ghana and in other African countries.
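The Cronbach's alpha statistic reported above can be computed with a short sketch; the item scores below are illustrative, not the study's data.

```python
# Cronbach's alpha: internal-consistency estimate for a scale of k items.
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))

def cronbach_alpha(items):
    """items: list of k per-item score lists, each with one entry per respondent."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(it[j] for it in items) for j in range(n)]
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))

# Three hypothetical 5-point Likert items answered by six respondents
scores = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 4, 2, 4, 3, 5],
]
alpha = cronbach_alpha(scores)
```

Values of alpha around 0.8, as in the deep approach scale above, are conventionally read as good internal consistency.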

  8. Reliability and Intervention Management for the LHC

    CERN Document Server

    Foraz, K; Richard Cook, J; Coupard, J; Daudin, B; Baltasar Dos Santos Pedrosa, F; Reguero Fuentes, E; Garino, C; Golikov, K; Grillot, S; Richard Jaekel, M; Sollander, P

    2012-01-01

    Since 2010, CERN has entered a mode of continuous operation of the LHC and its injectors, which implies continuous operation of all the infrastructure and support systems. High reliability of the machines is crucial to meeting the physics goals, and it must be accompanied by a fast restart after programmed stops. Since 2010, an important effort has been made to ease the coordination process during programmed stops and to reinforce the management of interventions (preparation, approval, follow-up, traceability, closure). This paper describes the coordination difficulties encountered in the first year and their impact on operation. The tools developed for managing interventions, their benefits, and their effect on the reliability of the LHC are also presented and discussed.

  9. Reliability engineering in RF CMOS

    OpenAIRE

    2008-01-01

    In this thesis new developments are presented for reliability engineering in RF CMOS. Given the increasing use of CMOS technology in mobile communication applications, the reliability of CMOS for such applications also becomes increasingly important. In these applications, CMOS is typically referred to as RF CMOS, where RF stands for radio frequency.

  10. Rapid visual information processing in schizophrenic patients: the impact of cognitive load and duration of stimulus presentation. A pilot study.

    Science.gov (United States)

    Cattapan-Ludewig, Katja; Hilti, Caroline C; Ludewig, Stephan; Vollenweider, Franz X; Feldon, Joram

    2005-01-01

    The inability to sustain attention has been proposed as a core deficit in schizophrenia. The Continuous Performance Task (AX-CPT) and the Rapid Visual Information Processing Task (RVP) are widely used neuropsychological tasks to measure sustained attention. The RVP displays numbers as stimuli, whereas the AX-CPT uses letters. Ten patients with chronic schizophrenia and 18 healthy control subjects were studied using four different versions of the RVP. The versions differed with regard to stimulus presentation time (600 vs. 1,200 ms) and the number of target sequences to be memorized: either one sequence (low cognitive load) or two sequences (high cognitive load). Schizophrenic patients showed a reduced number of hits only on the task version with 600 ms stimulus duration coupled with high cognitive load. The combination of high cognitive load and short stimulus duration created a critical performance breaking point for schizophrenic patients. This finding supports the hypothesis that patients have an impaired ability to coactivate different cognitive performances; thus the results favor the theory of impaired functional connectivity in schizophrenia.

  11. Present state and perspective of downstream processing of biologically produced 1,3-propanediol and 2,3-butanediol.

    Science.gov (United States)

    Xiu, Zhi-Long; Zeng, An-Ping

    2008-04-01

    1,3-Propanediol and 2,3-butanediol are two promising chemicals which have a wide range of applications and can be produced biologically. The separation of these diols from fermentation broth accounts for more than 50% of the total cost of their microbial production. This review summarizes the present state of methods studied for the recovery and purification of biologically produced diols, with particular emphasis on 1,3-propanediol. Previous studies on the separation of 1,3-propanediol primarily cover evaporation, distillation, membrane filtration, pervaporation, ion exchange chromatography, liquid-liquid extraction, and reactive extraction. The main methods for the recovery of 2,3-butanediol include steam stripping, pervaporation, and solvent extraction. No single method has proved to be simple and efficient, and improvements are especially needed with regard to yield, purity, and energy consumption. Perspectives for improved downstream processing of biologically produced diols, especially 1,3-propanediol, are discussed based on our own experience and recent work. It is argued that separation technologies such as aqueous two-phase extraction with short-chain alcohols, pervaporation, reverse osmosis, and in situ extractive or pervaporative fermentations deserve more attention in the future.

  12. Solution-processed low dimensional nanomaterials with self-assembled polymers for flexible photo-electronic devices (Presentation Recording)

    Science.gov (United States)

    Park, Cheolmin

    2015-09-01

    Self-assembly, driven by complicated but systematic hierarchical interactions, offers a qualified alternative for fabricating functional micron- or nanometer-scale pattern structures that are potentially useful for various organic and nanotechnological devices. Self-assembled nanostructures generated from synthetic polymer systems such as controlled polymer blends, semi-crystalline polymers, and block copolymers have gained great attention, not only because of the variety of nanostructures they can form but also because of the controllability of these structures by external stimuli. In this presentation, various novel photo-electronic materials and devices are introduced based on solution-processed low-dimensional nanomaterials, such as networked carbon nanotubes (CNTs), reduced graphene oxides (rGOs), and two-dimensional transition metal dichalcogenides (TMDs), combined with self-assembled polymers, including field effect transistors, electroluminescent devices, non-volatile memories, and photodetectors. For instance, a nanocomposite of networked CNTs and a fluorescent polymer proved to be an efficient field-induced electroluminescent layer under alternating current (AC), a potential candidate for next-generation displays and lighting. Furthermore, scalable and simple strategies for fabricating rGO and TMD nanohybrid films allowed for high-performance, mechanically flexible non-volatile resistive polymer memory devices and broad-band photodetectors, respectively.

  13. Reliability computation from reliability block diagrams

    Science.gov (United States)

    Chelson, P. O.; Eckstein, E. Y.

    1975-01-01

    Computer program computes system reliability for very general class of reliability block diagrams. Four factors are considered in calculating probability of system success: active block redundancy, standby block redundancy, partial redundancy, and presence of equivalent blocks in the diagram.
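The first three factors can be sketched directly from probability rules (standby redundancy additionally needs a switching model, omitted here); the block reliabilities below are hypothetical illustrative values, not output of the program described.

```python
# Reliability of simple block diagrams: series, active-parallel, and k-of-n blocks.
from math import comb

def series(*rs):
    """Series blocks: the system works only if every block works."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    """Active redundancy: the system fails only if all parallel blocks fail."""
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

def k_of_n(k, n, r):
    """Partial redundancy: at least k of n identical blocks must work."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

# Example: two redundant pumps (0.9 each) in series with a controller (0.99)
# and a 2-of-3 sensor arrangement (0.95 each).
system_r = series(parallel(0.9, 0.9), 0.99, k_of_n(2, 3, 0.95))
```

Larger diagrams reduce to nested calls of these primitives, which is essentially what such a program automates.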

  14. Reliability analysis of wastewater treatment plants.

    Science.gov (United States)

    Oliveira, Sílvia C; Von Sperling, Marcos

    2008-02-01

    This article presents a reliability analysis of 166 full-scale wastewater treatment plants operating in Brazil. Six different processes have been investigated, comprising septic tank+anaerobic filter, facultative pond, anaerobic pond+facultative pond, activated sludge, upflow anaerobic sludge blanket (UASB) reactors alone and UASB reactors followed by post-treatment. A methodology developed by Niku et al. [1979. Performance of activated sludge process and reliability-based design. J. Water Pollut. Control Assoc., 51(12), 2841-2857] is used for determining the coefficients of reliability (COR), in terms of the compliance of effluent biochemical oxygen demand (BOD), chemical oxygen demand (COD), total suspended solids (TSS), total nitrogen (TN), total phosphorus (TP) and fecal or thermotolerant coliforms (FC) with discharge standards. The design concentrations necessary to meet the prevailing discharge standards and the expected compliance percentages have been calculated from the COR obtained. The results showed that few plants, under the observed operating conditions, would be able to present reliable performances considering the compliance with the analyzed standards. The article also discusses the importance of understanding the lognormal behavior of the data in setting up discharge standards, in interpreting monitoring results and compliance with the legislation.
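The coefficient of reliability can be sketched in a few lines, assuming the standard lognormal formulation usually attributed to Niku et al.: COR = sqrt(1 + CV^2) * exp(-Z * sqrt(ln(1 + CV^2))), where CV is the coefficient of variation of effluent concentrations and Z is the standard-normal quantile of the target compliance level. The function names and example numbers are illustrative, not the article's data.

```python
# Coefficient of reliability (COR) sketch for lognormally distributed effluent data.
from math import sqrt, log, exp
from statistics import NormalDist

def cor(cv, compliance=0.95):
    """COR for coefficient of variation cv and target compliance probability."""
    z = NormalDist().inv_cdf(compliance)
    v = log(1.0 + cv * cv)
    return sqrt(1.0 + cv * cv) * exp(-z * sqrt(v))

def design_concentration(standard, cv, compliance=0.95):
    """Mean effluent concentration a plant must target to meet `standard`
    with the given compliance probability (mean = COR * standard)."""
    return cor(cv, compliance) * standard

# e.g. a BOD discharge standard of 30 mg/L, CV = 0.6, 95% compliance
target = design_concentration(30.0, 0.6, 0.95)
```

Because COR < 1 for any compliance level above the median, the plant must be designed for a mean well below the numerical standard, which is the lognormal point the article stresses.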

  15. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability-based code calibration. First, basic principles of structural reliability theory are introduced, and it is shown how the results of FORM-based reliability analysis may be related to partial safety factors and characteristic values. Thereafter, the code calibration problem is presented in its principal decision-theoretical form, and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore, suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally, the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure, CodeCal, for the practical implementation of reliability-based code calibration of LRFD-based design codes.
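The link from FORM results to partial safety factors can be illustrated with a minimal sketch for a lognormal resistance variable, using the design-value format: the characteristic value is taken as the 5% fractile and the design value follows from a target reliability index with an assumed FORM sensitivity factor. The values alpha_R = 0.8, sigma_ln = 0.10, and beta = 4.2 are illustrative assumptions, not the paper's calibration results.

```python
# Partial safety factor sketch for a lognormal resistance (design-value format).
from math import exp

Z_5PCT = 1.645  # standard-normal 5% quantile

def partial_safety_factor(sigma_ln, beta_target, alpha_r=0.8):
    """gamma_m = x_k / x_d, with x_k the 5% fractile and x_d the FORM design
    value, both expressed relative to the median of the lognormal resistance."""
    x_k = exp(-Z_5PCT * sigma_ln)                 # characteristic value / median
    x_d = exp(-alpha_r * beta_target * sigma_ln)  # design value / median
    return x_k / x_d

# e.g. logarithmic standard deviation 0.10, annual target beta = 4.2
gamma_m = partial_safety_factor(0.10, 4.2)
```

A full calibration, as in CodeCal, would instead optimize the factors over a set of representative design cases, but the fractile-versus-design-value mechanics are the same.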

  16. Defining Requirements for Improved Photovoltaic System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Maish, A.B.

    1998-12-21

    Reliable systems are an essential ingredient of any technology progressing toward commercial maturity and large-scale deployment. This paper defines reliability as meeting system functional requirements, and then develops a framework to understand and quantify photovoltaic system reliability based on initial and ongoing costs and system value. The core elements necessary to achieve reliable PV systems are reviewed. These include appropriate system design, satisfactory component reliability, and proper installation and servicing. Reliability status, key issues, and present needs in system reliability are summarized for four application sectors.

  17. Recent Advances in System Reliability Signatures, Multi-state Systems and Statistical Inference

    CERN Document Server

    Frenkel, Ilia

    2012-01-01

    Recent Advances in System Reliability discusses developments in modern reliability theory such as signatures, multi-state systems and statistical inference. It describes the latest achievements in these fields, and covers the application of these achievements to reliability engineering practice. The chapters cover a wide range of new theoretical subjects and have been written by leading experts in reliability theory and its applications.  The topics include: concepts and different definitions of signatures (D-spectra),  their  properties and applications  to  reliability of coherent systems and network-type structures; Lz-transform of Markov stochastic process and its application to multi-state system reliability analysis; methods for cost-reliability and cost-availability analysis of multi-state systems; optimal replacement and protection strategy; and statistical inference. Recent Advances in System Reliability presents many examples to illustrate the theoretical results. Real world multi-state systems...

  18. Reliability of Whole Circulation Process for Pork Product Based on GO Methodology

    Institute of Scientific and Technical Information of China (English)

    田帅辉; 王旭; 常兰; 王振锋

    2013-01-01

    In order to improve the reliability of the whole circulation process for pork products, the GO methodology was applied to its reliability analysis. First, a structural model of the whole circulation process was built, and the influencing factors at each step, including pig breeding, slaughter and processing, storage and transportation, and sales, were diagnosed. Second, the whole circulation process was transformed into a GO chart; the failure probabilities of the influencing factors were determined by frequency statistics and the fuzzy analytic hierarchy process, and the process reliability was calculated precisely. Finally, the reliability of the whole circulation process was analyzed qualitatively and the importance degrees of the minimal cut sets were calculated, identifying the main factors that influence the reliability of the whole circulation process. Comparison of the quantitative and qualitative results demonstrates the effectiveness and superiority of the GO method for reliability research on the whole circulation process for pork products.
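The quantitative step, turning influencing-factor failure probabilities into a process unreliability and cut-set importance degrees, can be sketched as follows; the step names, cut sets, and probabilities are hypothetical stand-ins, not the study's data.

```python
# Minimal-cut-set evaluation sketch: a system fails if every component of
# some minimal cut set fails (first-order, rare-event approximation).

def cut_prob(cut, q):
    """Probability that all components in one minimal cut set fail."""
    p = 1.0
    for c in cut:
        p *= q[c]
    return p

def system_unreliability(cuts, q):
    """First-order approximation: sum of minimal-cut-set probabilities."""
    return sum(cut_prob(c, q) for c in cuts)

def cut_importance(cuts, q):
    """Fussell-Vesely-style importance: each cut's share of total unreliability."""
    total = system_unreliability(cuts, q)
    return {tuple(c): cut_prob(c, q) / total for c in cuts}

# Hypothetical failure probabilities for circulation steps
q = {"breeding": 0.02, "slaughter": 0.01, "storage": 0.03, "sales": 0.015}
cuts = [["storage"], ["breeding", "slaughter"], ["slaughter", "sales"]]
Q_sys = system_unreliability(cuts, q)
imp = cut_importance(cuts, q)
```

Ranking the cuts by importance singles out the dominant contributors, here the single-point storage step, which is how the qualitative analysis identifies the main factors.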

  19. Fluorescent compounds present in food

    OpenAIRE

    Soto Serrano, Axel

    2014-01-01

    Poster. The food industry demands fast, reliable, cheap, and reproducible methods for quality and process control. This bibliographic review investigates fluorescence spectroscopy, a method that could not be used in food until recent technological advances, specifically front-face fluorescence and chemometric tools. This technology presents advantages compared with classical methods such as HPLC or capillary electrophoresis, which require qualified staff and sample preparation and are time-c...

  20. Reliability Modeling and Optimization Strategy for Manufacturing System Based on RQR Chain

    Directory of Open Access Journals (Sweden)

    Yihai He

    2015-01-01

    Full Text Available Accurate and dynamic reliability modeling of a running manufacturing system is a prerequisite for implementing preventive maintenance. However, existing studies cannot output the reliability value in real time because they discard the quality inspection data generated during operation of the manufacturing system. This paper therefore presents an approach to model manufacturing system reliability dynamically from operational process-quality data and product-reliability output data. First, after explaining the importance of quality variation in the manufacturing process as the link between manufacturing system reliability and inherent product reliability, the RQR chain representing the relationships between them is put forward, and the product qualified probability is proposed to further quantify the impact of process quality variation on manufacturing system reliability. Second, the impact of qualified probability on inherent product reliability is expounded, and a modeling approach for manufacturing system reliability based on qualified probability is presented. Third, a preventive maintenance optimization strategy for the manufacturing system, driven by the loss from manufacturing quality variation, is proposed. Finally, the validity of the proposed approach is verified by a reliability analysis and optimization example of an engine cover manufacturing system.

  1. Sub 150 °C processed meso-superstructured perovskite solar cells with enhanced efficiency (presentation video)

    Science.gov (United States)

    Wojciechowski, Konrad; Saliba, Michael; Leijtens, Tomas; Abate, Antonio; Snaith, Henry J.

    2014-10-01

    The ability to process amorphous or polycrystalline solar cells at low temperature (TiO2 compact layers as charge selective contacts. With our optimized formulation we demonstrate full sun solar power conversion efficiencies exceeding 16 % in an all low temperature processed solar cell.

  2. VLSI Reliability in Europe

    NARCIS (Netherlands)

    Verweij, Jan F.

    1993-01-01

    Several issues regarding VLSI reliability research in Europe are discussed. Organizations involved in stimulating reliability activities by exchanging information or supporting research programs are described. Within one such program, ESPRIT, a technical interest group on IC reliability was

  3. Reliability and construction control

    Directory of Open Access Journals (Sweden)

    Sherif S. AbdelSalam

    2016-06-01

    Full Text Available The goal of this study was to determine the most reliable and efficient combination of design and construction methods required for vibro piles. For a wide range of static and dynamic formulas, the reliability-based resistance factors were calculated using EGYPT database, which houses load test results for 318 piles. The analysis was extended to introduce a construction control factor that determines the variation between the pile nominal capacities calculated using static versus dynamic formulae. From the major outcomes, the lowest coefficient of variation is associated with Davisson’s criterion, and the resistance factors calculated for the AASHTO method are relatively high compared with other methods. Additionally, the CPT-Nottingham and Schmertmann method provided the most economic design. Recommendations related to a pile construction control factor were also presented, and it was found that utilizing the factor can significantly reduce variations between calculated and actual capacities.
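
    A hedged sketch of the two quantities the record compares: the coefficient of variation of a design method and a construction control factor relating static and dynamic capacity predictions (the pile capacity values below are invented):

```python
import statistics

def coefficient_of_variation(values):
    """COV = sample standard deviation / mean; a lower COV means a more consistent method."""
    return statistics.stdev(values) / statistics.mean(values)

def construction_control_factor(static_capacity, dynamic_capacity):
    """Illustrative control factor: ratio of the nominal capacities obtained
    from static versus dynamic formulae for the same pile."""
    return static_capacity / dynamic_capacity

# Hypothetical measured-to-predicted capacity ratios for a set of test piles
ratios = [0.95, 1.05, 1.10, 0.90, 1.00]
cov = coefficient_of_variation(ratios)
```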

  4. Improving Power Converter Reliability

    DEFF Research Database (Denmark)

    Ghimire, Pramod; de Vega, Angel Ruiz; Beczkowski, Szymon

    2014-01-01

    The real-time junction temperature monitoring of a high-power insulated-gate bipolar transistor (IGBT) module is important to increase the overall reliability of power converters for industrial applications. This article proposes a new method to measure the on-state collector-emitter voltage … of a high-power IGBT module during converter operation, which may play a vital role in improving the reliability of the power converters. The measured voltage is used to estimate the module average junction temperature of the high and low-voltage side of a half-bridge IGBT separately in every fundamental … is measured in a wind power converter at a low fundamental frequency. To illustrate more, the test method as well as the performance of the measurement circuit are also presented. This measurement is also useful to indicate failure mechanisms such as bond wire lift-off and solder layer degradation …

  5. Accelerator Availability and Reliability Issues

    Energy Technology Data Exchange (ETDEWEB)

    Steve Suhring

    2003-05-01

    Maintaining reliable machine operations for existing machines as well as planning for future machines' operability present significant challenges to those responsible for system performance and improvement. Changes to machine requirements and beam specifications often reduce overall machine availability in an effort to meet user needs. Accelerator reliability issues from around the world will be presented, followed by a discussion of the major factors influencing machine availability.

  6. Treatment of bilateral hyperplasia of the coronoid process of the mandible. Presentation of a case and review of the literature.

    Science.gov (United States)

    Fernández Ferro, Martín; Fernández Sanromán, Jacinto; Sandoval Gutierrez, Jesús; Costas López, Alberto; López de Sánchez, Annahys; Etayo Pérez, Amaya

    2008-09-01

    Bilateral hyperplasia of the coronoid process is infrequent. It consists of an elongation of the coronoid process of the mandible and is, accordingly, a mechanical problem, limiting mouth opening. This article looks at the case of a 28-year-old male with significant limitation on opening his mouth, secondary to bilateral hyperplasia of the coronoid process. We reviewed the literature and analysed the diagnostic and therapeutic procedures used, paying special attention to the surgical approaches to the coronoid process and emphasising the importance of early post-operative rehabilitation, describing our experience with the TheraBite (Atos Medical AB, PO Box 183, 242 22 Hörby, Sweden). The satisfactory result of the procedure is marked by the stable recovery of the mouth opening, achieved by a good combination of surgical and physiotherapeutic techniques.

  7. Presentation of an approach for adapting software production process based ISO/IEC 12207 to ITIL Service

    Directory of Open Access Journals (Sweden)

    Samira Haghighatfar

    2013-05-01

    Full Text Available The standard ISO/IEC 12207 is a software life cycle standard that not only provides a framework of effective methods for software production and development, but also helps ensure that organizational goals are realized properly. In this paper, the ITIL standard is used for better process control and management and to provide a common language and syntax among stakeholders. In addition, the mapping process between these two standards is considered.

  8. Photovoltaic performance and reliability workshop

    Energy Technology Data Exchange (ETDEWEB)

    Mrig, L. [ed.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986--1993. PV performance and PV reliability are at least as important as PV cost, if not more. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in the photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange the technical knowledge and field experience as related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop held in September, 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  9. Materials processing strategies for colloidal quantum dot solar cells: advances, present-day limitations, and pathways to improvement

    KAUST Repository

    Carey, Graham H.

    2013-05-13

    Colloidal quantum dot photovoltaic devices have improved from initial, sub-1% solar power conversion efficiency to current record performance of over 7%. Rapid advances in materials processing and device physics have driven this impressive performance progress. The highest-efficiency approaches rely on a fabrication process that starts with nanocrystals in solution, initially capped with long organic molecules. This solution is deposited and the resultant film is treated using a solution containing a second, shorter capping ligand, leading to a cross-linked, non-redispersible, and dense layer. This procedure is repeated, leading to the widely employed layer-by-layer solid-state ligand exchange. We will review the properties and features of this process, and will also discuss innovative pathways to creating even higher-performing films and photovoltaic devices.

  10. Reliability of Circumplex Axes

    Directory of Open Access Journals (Sweden)

    Micha Strack

    2013-06-01

    Full Text Available We present a confirmatory factor analysis (CFA procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing interpersonal and motivational constructs—Interpersonal Adjective List (IAL, Interpersonal Adjective Scales (revised; IAS-R, Inventory of Interpersonal Problems (IIP, Impact Messages Inventory (IMI, Circumplex Scales of Interpersonal Values (CSIV, Support Action Scale Circumplex (SAS-C, Interaction Problems With Animals (IPI-A, Team Role Circle (TRC, Competing Values Leadership Instrument (CV-LI, Love Styles, Organizational Culture Assessment Instrument (OCAI, Customer Orientation Circle (COC, and System for Multi-Level Observation of Groups (behavioral adjectives; SYMLOG—in 17 German-speaking samples (29 subsamples, grouped by self-report, other report, and metaperception assessments. The general factor accounted for a proportion ranging from 1% to 48% of the item variance, the axes component for 2% to 30%; and scale specificity for 1% to 28%, respectively. Reliability estimates varied considerably from .13 to .92. An application of the Nunnally and Bernstein formula proposed by Markey, Markey, and Tinsley overestimated axes reliabilities in cases of large-scale specificities but otherwise works effectively. Contemporary circumplex evaluations such as Tracey’s RANDALL are sensitive to the ratio of the axes and scale-specificity components. In contrast, the proposed model isolates both components.
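
    Under one plausible reading of the tau-equivalent variance decomposition (a sketch, not the paper's exact estimator), the axes reliability is the share of item variance carried by the axes component; the variance values below are invented:

```python
def axes_reliability(var_general, var_axes, var_scale, var_block, var_item):
    """Illustrative variance-component reliability: proportion of total item
    variance attributable to the circumplex axes component."""
    total = var_general + var_axes + var_scale + var_block + var_item
    return var_axes / total

# Hypothetical decomposition of one instrument's item variance
rel = axes_reliability(var_general=0.2, var_axes=0.3,
                       var_scale=0.1, var_block=0.1, var_item=0.3)
```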

  11. The Past is always Present in the Port. The decision-making process of Maasvlakte II (1993-2008).

    NARCIS (Netherlands)

    D.M. Koppenol (Dirk)

    2014-01-01

    Maasvlakte II is a 2.3 billion euro port expansion of the Port of Rotterdam, the largest port in Europe. During the decision-making process, fierce conflicts arose, not only between the Port Management and the nature preservation and environment pressure groups, but a

  12. Effects of Different Multimedia Presentations on Viewers' Information-Processing Activities Measured by Eye-Tracking Technology

    Science.gov (United States)

    Chuang, Hsueh-Hua; Liu, Han-Chin

    2012-01-01

    This study implemented eye-tracking technology to understand the impact of different multimedia instructional materials, i.e., five successive pages versus a single page with the same amount of information, on information-processing activities in 21 non-science-major college students. The findings showed that students demonstrated the same number…

  13. An Action Research Process on University Tutorial Sessions with Small Groups: Presentational Tutorial Sessions and Online Communication

    Science.gov (United States)

    Alcaraz-Salarirche, Noelia; Gallardo-Gil, Monsalud; Herrera-Pastor, David; Servan-Nunez, Maria Jose

    2011-01-01

    We describe and analyse the action research process carried out by us as teachers in a general didactics course in the University of Malaga (Spain). The course methodology combined lectures to the whole class and small-group work. We were in charge of guiding small-group work. In the small groups, students researched on an educational innovation…

  14. Bayesian reliability analysis for structures based on Gaussian process classification

    Institute of Scientific and Technical Information of China (English)

    曹鸿钧; 朱玉强; 张功

    2012-01-01

    The Bayesian reliability method is one of the efficient approaches for reliability analysis of structures with incomplete probability information. The computational cost of Bayesian reliability estimation is often prohibitive for real applications, so it is necessary to use surrogate models in place of actual models to reduce the computational burden. Commonly used surrogate modeling approaches, which construct approximation models for response functions rather than limit state surfaces, are indirect and cannot easily take model uncertainties into account. Furthermore, these methods are difficult to apply to problems exhibiting discontinuous responses and disjoint failure domains. To handle these difficulties, this paper presents a method to identify the limit state surface using Gaussian process classification. The variances of the distribution parameters of the failure probability due to model uncertainty are derived. An adaptive sampling criterion for updating the surrogate model is proposed. An example is presented to demonstrate the efficiency and effectiveness of the proposed method.
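
    The core idea is to replace the expensive limit-state evaluation with a classifier that learns the failure/safe boundary and is then sampled cheaply. A minimal sketch of that idea, with a k-nearest-neighbour vote standing in for the Gaussian process classifier and an invented limit state function:

```python
import random

random.seed(0)

def limit_state(x, y):
    """Illustrative true limit state g(x, y); the structure fails when g < 0."""
    return 1.5 - x - y

# Step 1: label a small design of experiments with the (expensive) true model
train = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(200)]
labels = [1 if limit_state(x, y) < 0 else 0 for x, y in train]  # 1 = failure

def surrogate_classify(x, y, k=5):
    """k-nearest-neighbour majority vote, standing in for the GP classifier."""
    nearest = sorted(range(len(train)),
                     key=lambda i: (train[i][0] - x) ** 2 + (train[i][1] - y) ** 2)[:k]
    return 1 if sum(labels[i] for i in nearest) * 2 > k else 0

# Step 2: Monte Carlo failure probability estimated on the cheap surrogate
samples = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(2000)]
pf = sum(surrogate_classify(x, y) for x, y in samples) / len(samples)
```

    A real GP classifier would additionally supply the predictive uncertainty that drives the paper's adaptive sampling criterion, which this k-NN stand-in does not.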

  15. Reliability Generalization: "Lapsus Linguae"

    Science.gov (United States)

    Smith, Julie M.

    2011-01-01

    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  16. Reliability models applicable to space telescope solar array assembly system

    Science.gov (United States)

    Patil, S. A.

    1986-01-01

    A complex system may consist of a number of subsystems with several components in series, parallel, or a combination of both series and parallel. In order to predict how well the system will perform, it is necessary to know the reliabilities of the subsystems and the reliability of the whole system. The objective of the present study is to develop mathematical models of reliability which are applicable to complex systems. The models are determined by assuming k failures out of n components in a subsystem. By taking k = 1 and k = n, these models reduce to parallel and series models; hence, the models can be specialized to parallel, series combination systems. The models are developed by assuming the failure rates of the components to be functions of time and, as such, can be applied to processes with or without aging effects. The reliability models are further specialized to the Space Telescope Solar Array (STSA) System. The STSA consists of 20 identical solar panel assemblies (SPA's). The reliabilities of the SPA's are determined by the reliabilities of solar cell strings, interconnects, and diodes. The estimates of the reliability of the system for one to five years are calculated by using the reliability estimates of solar cells and interconnects given in ESA documents. Aging effects in relation to breaks in interconnects are discussed.
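
    The k-failures-out-of-n model described above reduces to series and parallel systems at k = 1 and k = n; for identical components with constant reliability p this can be written directly (time-varying failure rates, as in the study, would replace p with R(t)):

```python
from math import comb

def system_reliability(n, k, p):
    """Reliability of a subsystem of n identical components that fails once
    k components have failed: P(fewer than k failures), component reliability p."""
    return sum(comb(n, j) * (1 - p) ** j * p ** (n - j) for j in range(k))

# k = 1 reduces to a series system, k = n to a parallel system:
series = system_reliability(3, 1, 0.9)    # equals 0.9 ** 3
parallel = system_reliability(3, 3, 0.9)  # equals 1 - 0.1 ** 3
```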

  17. Controversies about interspinous process devices in the treatment of degenerative lumbar spine diseases: past, present, and future.

    Science.gov (United States)

    Gazzeri, Roberto; Galarza, Marcelo; Alfieri, Alex

    2014-01-01

    A large number of interspinous process devices (IPD) have been recently introduced to the lumbar spine market as an alternative to conventional decompressive surgery in managing symptomatic lumbar spinal pathology, especially in the older population. Despite the fact that they are composed of a wide range of different materials including titanium, polyetheretherketone, and elastomeric compounds, the aim of these devices is to unload spine, restoring foraminal height, and stabilize the spine by distracting the spinous processes. Although the initial reports represented the IPD as a safe, effective, and minimally invasive surgical alternative for relief of neurological symptoms in patients with low back degenerative diseases, recent studies have demonstrated less impressive clinical results and higher rate of failure than initially reported. The purpose of this paper is to provide a comprehensive overview on interspinous implants, their mechanisms of action, safety, cost, and effectiveness in the treatment of lumbar stenosis and degenerative disc diseases.

  18. Controversies about Interspinous Process Devices in the Treatment of Degenerative Lumbar Spine Diseases: Past, Present, and Future

    Directory of Open Access Journals (Sweden)

    Roberto Gazzeri

    2014-01-01

    Full Text Available A large number of interspinous process devices (IPD have been recently introduced to the lumbar spine market as an alternative to conventional decompressive surgery in managing symptomatic lumbar spinal pathology, especially in the older population. Despite the fact that they are composed of a wide range of different materials including titanium, polyetheretherketone, and elastomeric compounds, the aim of these devices is to unload spine, restoring foraminal height, and stabilize the spine by distracting the spinous processes. Although the initial reports represented the IPD as a safe, effective, and minimally invasive surgical alternative for relief of neurological symptoms in patients with low back degenerative diseases, recent studies have demonstrated less impressive clinical results and higher rate of failure than initially reported. The purpose of this paper is to provide a comprehensive overview on interspinous implants, their mechanisms of action, safety, cost, and effectiveness in the treatment of lumbar stenosis and degenerative disc diseases.

  19. XIIth international meeting on radiation processing Avignon 25-30 March 2001 (Polymer irradiation: past-present and future)

    Science.gov (United States)

    Chapiro, Adolphe

    2002-03-01

    Radiations are used efficiently and economically for the production of new or modified polymers. The following processes are considered: Radiation curing; Radiation cross-linking; Radiation grafting. These processes are commonly used today in industry and provide a broad range of new potential applications in various fields. The history of their development is briefly reported. The chemical reactions underlying these processes are described. (1) Radiation curing is used commercially on a large scale for the production of improved coatings, lacquers and inks. The process can be conducted at very high speeds. Curing of magnetic formulations leads to particularly stable products, which compete favourably with more conventional materials. (2) Radiation cross-linking is an established technology in the wire and cable industry. It imparts to the modified insulators improved resistance to solvents, to ageing and to elevated temperatures. The resulting cross-linked network also reduces the migration of fillers and thereby stabilizes in time any message imprinted with magnetic or colored pigments dispersed in a polymer. (3) Radiation grafting is a powerful method for modifying more profoundly the properties of a polymer and for creating numerous, entirely new materials. The chemical modification can be applied at will into the bulk of the material or limited to a surface zone of any desired depth. This method can be used, for instance, for introducing polar groups in the bulk or on the surface of non-polar polymers, for increasing or reducing the wettability of a polymer, for imparting a better compatibility of a polymer to a specific coating, and the like. The irradiation of water-soluble polymers in aqueous solutions, with or without the addition of another monomer, gives rise to a variety of cross-linked gels which find useful applications in the biomedical field. Other promising applications will be considered.

  20. Unintended purge during the start-up process of a syringe pump: report of a case presented with vascular collapse.

    Science.gov (United States)

    Farbood, Arash; Kazemi, Asif Parviz; Akbari, Kamal

    2010-12-01

    The case of a 50-year-old woman who developed a sudden decrease in arterial pressure while she was being prepared for surgery for a fractured lumbar spine in the prone position is reported. She was receiving propofol, remifentanil, and sodium nitroprusside via three syringe pumps through an intravenous cannula at the dorsum of her left hand. The cause of the vascular collapse was the purge of the syringe pumps during the self-check process.

  1. Dangerous poverty. Analysis of the criminalization process of poverty and youth in Uruguay and of the challenges that this process presents to the community psychology

    Directory of Open Access Journals (Sweden)

    Agustín Cano Menoni

    2014-02-01

    Full Text Available In this paper I analyze the components and main effects of what I characterize as a process of criminalization of youth in poverty, in the case of Uruguay. I argue that this process occurs through a series of discursive operations at different levels (police, judicial, political, and technical-scientific) which stigmatize the social reference group, casting them as a threat to society. To investigate this process, I analyze journalistic texts, testimonials, and an advertising campaign, covering the following actors: a member of the national Parliament, the editorialist of the highest-circulation newspaper in Uruguay, the director of the largest psychiatric hospice in the country, and the Uruguayan Interior Ministry (police force). I conclude that in Uruguay a stigmatization process has started that places youth in poverty as a threat to society, and that this process involves the deepening of police approaches to security problems, obscuring the conditions of social injustice behind them and consecrating fear as the main principle of social relationships. This situation also challenges the social sciences, and in particular the psychological disciplines, by posing the task of finding new theoretical and methodological answers, alternatives to stigmatization and police-centered security approaches.

  2. RELIABILITY ANALYSIS OF A SYSTEM OF BOILER USED IN READYMADE GARMENT INDUSTRY

    Directory of Open Access Journals (Sweden)

    R.K. Agnihotri

    2008-01-01

    Full Text Available The present paper deals with the reliability analysis of a boiler system used in the garment industry. The system consists of a single boiler unit, which plays an important role in the garment industry. Using the regenerative point technique with a Markov renewal process, various reliability characteristics of interest are obtained.
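
    For a single repairable unit, the Markov renewal machinery reduces to the familiar alternating renewal result for long-run availability; a minimal sketch with invented rates:

```python
def steady_state_availability(failure_rate, repair_rate):
    """Long-run availability of one repairable unit (alternating renewal process):
    A = MTBF / (MTBF + MTTR) = mu / (lambda + mu)."""
    return repair_rate / (failure_rate + repair_rate)

def mean_time_between_failures(failure_rate):
    """MTBF for a constant failure rate lambda."""
    return 1.0 / failure_rate

# Hypothetical per-hour rates for a boiler unit
A = steady_state_availability(failure_rate=0.01, repair_rate=0.5)
```

    The regenerative point technique in the record generalizes this to systems with several states and non-exponential sojourn times.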

  3. Reliability-based robust optimization design for rubber fluid cell forming process

    Institute of Scientific and Technical Information of China (English)

    王淼; 李东升; 李小强; 杨伟俊

    2012-01-01

    The inherent variations in aluminum alloy sheet properties, process parameters, and other random factors affect the forming quality of the rubber fluid cell forming process. Oriented to the precision manufacturing of aircraft, a reliability-based robust optimization design approach for the rubber fluid cell forming process, based on robust optimization design and reasonable reliability constraints, was proposed. The reliability-based robust optimization design solution was compared with the traditional optimization design solution, and typical flanged parts of 2B06-O aluminum alloy sheet formed by rubber fluid cell forming were used to verify the practicability and validity of the approach. The results show that the proposed approach made quality performance indicators less sensitive to uncertainties and improved the robustness and reliability of the rubber forming process.

  4. Software Reliability Experimentation and Control

    Institute of Scientific and Technical Information of China (English)

    Kai-Yuan Cai

    2006-01-01

    This paper classifies software researches as theoretical researches, experimental researches, and engineering researches, and is mainly concerned with the experimental researches with focus on software reliability experimentation and control. The state-of-the-art of experimental or empirical studies is reviewed. A new experimentation methodology is proposed, which is largely theory discovering oriented. Several unexpected results of experimental studies are presented to justify the importance of software reliability experimentation and control. Finally, a few topics that deserve future investigation are identified.

  5. Presentation of a method for consequence modeling and quantitative risk assessment of fire and explosion in process industry (Case study: Hydrogen Production Process

    Directory of Open Access Journals (Sweden)

    M J Jafari

    2013-05-01

    Conclusion: The proposed method is applicable in all phases of process or system design and estimates the risk of fire and explosion through a quantitative, comprehensive, mathematically based approach. It can therefore be used as an alternative to qualitative and semi-quantitative methods.

  6. Webinar Presentation: Exposures to Polycyclic Aromatic Hydrocarbons and Childhood Growth Trajectories and Body Composition: Linkages to Disrupted Self-Regulatory Processes

    Science.gov (United States)

    This presentation, Exposures to Polycyclic Aromatic Hydrocarbons and Childhood Growth Trajectories and Body Composition: Linkages to Disrupted Self-Regulatory Processes, was given at the NIEHS/EPA Children's Centers 2016 Webinar Series: Childhood Obesity

  7. The puerperal gravid process as a risk factor for the presentation of severe clinical pictures of Influenza A (H1N1)

    Directory of Open Access Journals (Sweden)

    Carlos Zerquera

    2011-04-01

    Full Text Available An analysis is carried out of some factors considered to be involved in the normal development of the puerperal gravid process, and that should be kept in mind when considering pregnant and postpartum women as a risk group for severe clinical pictures of Influenza A (H1N1) in the course of the current pandemic. These considerations are the author's points of view, based on a review of the medical literature covering very recent findings on the onset and normal development of pregnancy, and on more than thirty years of experience as head of a medical team caring for severely ill pregnant patients.

  8. CR reliability testing

    Science.gov (United States)

    Honeyman-Buck, Janice C.; Rill, Lynn; Frost, Meryll M.; Staab, Edward V.

    1998-07-01

    The purpose of this work was to develop a method for systematically testing the reliability of a CR system under realistic daily loads in a non-clinical environment prior to its clinical adoption. Once digital imaging replaces film, it will be very difficult to revert back should the digital system become unreliable. Prior to the beginning of the test, a formal evaluation was performed to set the benchmarks for performance and functionality. A formal protocol was established that included all the 62 imaging plates in the inventory for each 24-hour period in the study. Imaging plates were exposed using different combinations of collimation, orientation, and SID. Anthropomorphic phantoms were used to acquire images of different sizes. Each combination was chosen randomly to simulate the differences that could occur in clinical practice. The tests were performed over a wide range of times with batches of plates processed to simulate the temporal constraints required by the nature of portable radiographs taken in the Intensive Care Unit (ICU). Current patient demographics were used for the test studies so automatic routing algorithms could be tested. During the test, only three minor reliability problems occurred, two of which were not directly related to the CR unit. One plate was discovered to cause a segmentation error that essentially reduced the image to only black and white with no gray levels. This plate was removed from the inventory to be replaced. Another problem was a PACS routing problem that occurred when the DICOM server with which the CR was communicating had a problem with disk space. The final problem was a network printing failure to the laser cameras. Although the units passed the reliability test, problems with interfacing to workstations were discovered. The two issues that were identified were the interpretation of what constitutes a study for CR and the construction of the look-up table for a proper gray scale display.

  9. DMD reliability: a MEMS success story

    Science.gov (United States)

    Douglass, Michael

    2003-01-01

    The Digital Micromirror Device (DMD) developed by Texas Instruments (TI) has made tremendous progress in both performance and reliability since it was first invented in 1987. From the first working concept of a bistable mirror, the DMD is now providing high-brightness, high-contrast, and high-reliability in over 1,500,000 projectors using Digital Light Processing technology. In early 2000, TI introduced the first DMD chip with a smaller mirror (14-micron pitch versus 17-micron pitch). This allowed a greater number of high-resolution DMD chips per wafer, thus providing an increased output capacity as well as the flexibility to use existing package designs. By using existing package designs, subsequent DMDs cost less as well as met our customers' demand for faster time to market. In recent years, the DMD achieved the status of being a commercially successful MEMS device. It reached this status by the efforts of hundreds of individuals working toward a common goal over many years. Neither textbooks nor design guidelines existed at the time. There was little infrastructure in place to support such a large endeavor. The knowledge we gained through our characterization and testing was all we had available to us through the first few years of development. Reliability was only a goal in 1992 when production development activity started; a goal that many throughout the industry and even within Texas Instruments doubted the DMD could achieve. The results presented in this paper demonstrate that we succeeded by exceeding the reliability goals.

  10. A Method of Reliability Allocation of a Complicated Large System

    Institute of Scientific and Technical Information of China (English)

    WANG Zhi-sheng; QIN Yuan-yuan; WANG Dao-bo

    2004-01-01

    Aiming at the problem of reliability allocation for a complicated large system, a new approach is proposed. Reliability allocation should be a kind of decision-making behavior; therefore, the more information is used when apportioning a reliability index, the more reasonable the allocation obtained. Reliability allocation for a complicated large system consists of two processes: a reliability information reporting process from bottom to top, and a reliability index apportioning process from top to bottom. A typical example illustrates the concrete process of the reliability allocation algorithms.
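
    The top-down apportioning step can be sketched for a series system; the weighted variant below is an illustrative scheme (equal and feasibility-weighted apportionment), not the paper's algorithm:

```python
import math

def equal_apportionment(system_target, n):
    """Series system: each of n subsystems gets R_i = R_sys ** (1/n)."""
    return system_target ** (1.0 / n)

def weighted_apportionment(system_target, weights):
    """Split the log-reliability budget in proportion to the weights, so that
    harder subsystems (larger weight) receive a larger share of the allowed
    unreliability; the product of the allocations recovers the system target."""
    total = sum(weights)
    log_budget = math.log(system_target)
    return [math.exp(log_budget * w / total) for w in weights]

allocations = weighted_apportionment(0.9, [1, 2, 3])
```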

  11. Research on Control Method Based on Real-Time Operational Reliability Evaluation for Space Manipulator

    Directory of Open Access Journals (Sweden)

    Yifan Wang

    2014-05-01

    Full Text Available A control method based on real-time operational reliability evaluation for space manipulator is presented for improving the success rate of a manipulator during the execution of a task. In this paper, a method for quantitative analysis of operational reliability is given when manipulator is executing a specified task; then a control model which could control the quantitative operational reliability is built. First, the control process is described by using a state space equation. Second, process parameters are estimated in real time using Bayesian method. Third, the expression of the system's real-time operational reliability is deduced based on the state space equation and process parameters which are estimated using Bayesian method. Finally, a control variable regulation strategy which considers the cost of control is given based on the Theory of Statistical Process Control. It is shown via simulations that this method effectively improves the operational reliability of space manipulator control system.
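
    As a simplified stand-in for the paper's Bayesian state-space estimation, a conjugate Beta-Bernoulli update shows how observed step outcomes can refresh an operational reliability estimate in real time (the outcome sequence is invented):

```python
def bayes_update(alpha, beta, success):
    """Conjugate Beta-Bernoulli update of the success probability of one
    operation step, given a single observed outcome."""
    return (alpha + 1, beta) if success else (alpha, beta + 1)

def posterior_mean(alpha, beta):
    """Point estimate of operational reliability from the Beta posterior."""
    return alpha / (alpha + beta)

# Start from a uniform prior and fold in step outcomes as they arrive
a, b = 1.0, 1.0
for outcome in [True, True, True, False, True]:
    a, b = bayes_update(a, b, outcome)
reliability_estimate = posterior_mean(a, b)
```

    A controller could then compare this running estimate against a threshold and regulate the control variables, as in the statistical-process-control strategy the record describes.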

  12. [A rare inborn error of intracellular processing of cobalamine presenting with microcephalus and megaloblastic anemia: a report of 3 children].

    Science.gov (United States)

    Müller, P; Horneff, G; Hennermann, J B

    2007-01-01

    Defects of methionine synthase or methionine synthase reductase result in impaired remethylation of homocysteine to methionine. Patients present with megaloblastic anemia, failure to thrive and various neurological manifestations including mental retardation, cerebral atrophy, muscular hypotonia or hypertonia, ataxia, seizures, nystagmus and visual disturbances. We report on three children (two girls, one boy), aged 3.5-7.5 years, who presented with severe megaloblastic anemia, microcephalus and partly nystagmus (2/3) due to a rare inborn error of remethylation. Methionine synthase reductase deficiency, the cblE type of homocystinuria (OMIM 236270), is a rare autosomal recessive inherited disorder described in only 14 patients worldwide. Metabolic hallmarks of the disease are hyperhomocysteinemia (median 98 micromol/l, normal range megaloblastic anemia. Measurements of homocysteine and methionine in plasma as well as methylmalonic acid in urine are required for confirming the diagnosis. Early treatment improves the outcome, although mental disability may not be prevented. Treatment has a positive impact on megaloblastic anemia but only a slight effect on hyperhomocysteinemia. The long-term cardiovascular risk of hyperhomocysteinemia in cblE-deficient patients is not yet known.

  13. A novel GIS-based tool for estimating present-day ocean reference depth using automatically processed gridded bathymetry data

    Science.gov (United States)

    Jurecka, Mirosława; Niedzielski, Tomasz; Migoń, Piotr

    2016-05-01

    This paper presents a new method for computing the present-day value of the reference depth (dr) which is an essential input information for assessment of past sea-level changes. The method applies a novel automatic geoprocessing tool developed using Python script and ArcGIS, and uses recent data about ocean floor depth, sediment thickness, and age of oceanic crust. The procedure is multi-step and involves creation of a bathymetric dataset corrected for sediment loading and isostasy, delineation of subduction zones, computation of perpendicular sea-floor profiles, and statistical analysis of these profiles versus crust age. The analysis of site-specific situations near the subduction zones all around the world shows a number of instances where the depth of the oceanic crust stabilizes at a certain level before reaching the subduction zone, and this occurs at depths much lower than proposed in previous approaches to the reference depth issue. An analysis of Jurassic and Cretaceous oceanic lithosphere shows that the most probable interval at which the reference depth occurs is 5300-5800 m. This interval is broadly consistent with dr estimates determined using the Global Depth-Heatflow model (GDH1), but is significantly lower than dr estimates calculated on a basis of the Parsons-Sclater Model (PSM).

  14. Rats Born to Mothers Treated with Dexamethasone 15 cH Present Changes in Modulation of Inflammatory Process

    Directory of Open Access Journals (Sweden)

    Leoni V. Bonamin

    2012-01-01

    Full Text Available As little information about the effect of ultra high dilutions of glucocorticoid in reproduction is available in the literature, pregnant female Wistar rats (N=12 were blindly subcutaneously treated during all gestational and lactation period with: dexamethasone 4 mg/kg diluted into dexamethasone 15 cH (mixed; or dexamethasone 4 mg/kg diluted in water; or dexamethasone 15 cH, or vehicle. Parental generation had body weight, food and water consumption monitored. The F1 generation was monitored regarding to newborn development. No birth occurred in both groups treated with dexamethasone 4 mg/kg. After 60 days from birth, 12 male F1 rats were randomly selected from each remaining group and inoculated subcutaneously with 1% carrageenan into the footpad, for evaluation of inflammatory performance. Edema and histopathology of the footpad were evaluated, using specific staining methods, immunohistochemistry and digital histomorphometry. Mothers treated with mixed dexamethasone presented reduced water consumption. F1 rats born to dexamethasone 15 cH treated females presented significant increase in mast cell degranulation, decrease in monocyte percentage, increase in CD18+ PMN cells, and early expression of ED2 protein, in relation to control. The results show that the exposure of parental generation to highly diluted dexamethasone interferes in inflammation modulation in the F1 generation.

  15. Assuring reliability program effectiveness.

    Science.gov (United States)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  16. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution will describe the forum and advertise its usage in the community.

  17. Improving the adequacy and reliability of non-standardized processing of model datasets in studies of educational systems by the method of experiment planning

    Directory of Open Access Journals (Sweden)

    Valery Alexeev

    2014-06-01

    Full Text Available The article addresses improving the accuracy of the statistical mathematical models used to investigate the effectiveness of the educational process. The authors suggest changing the standard approaches to processing large datasets by using the method of experiment planning, which takes into account features of the data sets characteristic of pedagogical processes. These proposals can improve the accuracy of both the model and the processing, and hence the accuracy of the results.

  18. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Full Text Available Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. Reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation scheme to evaluate the reliability of land cover products, comprising two methods: result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes how reliability propagates through the data production process to obtain the reliability of land cover products; fuzzy fault tree analysis is introduced and improved for this purpose. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through analysis of the seven result-reliability indicators, more information on land cover can be obtained for strategic decision-making and planning than with traditional accuracy assessment methods. Process reliability evaluation, which does not require reference data, can facilitate validation and, to some extent, reflect trends in reliability.

  19. Materials Reliability Program: Development of a New Process for Calculating RPV Heat-Up and Cool-Down Curves - Proof of Concept

    Energy Technology Data Exchange (ETDEWEB)

    M. EricksonKirk

    2005-04-30

    A strategy and framework were developed for incorporating best-estimate, fracture toughness models and methodologies into procedures for fracture safety assessment of nuclear RPVs during normal heat-up and cool-down operations. The process included detailed process flow diagramming to identify all details of the current process for obtaining heat-up and cool-down curves.

  20. Localized surface plasmons modulated nonlinear optical processes in metal film-coupled and upconversion nanocrystals-coated nanoparticles (Conference Presentation)

    Science.gov (United States)

    Lei, Dangyuan

    2016-09-01

    In the first part of this talk, I will show our experimental investigation of the linear and nonlinear optical properties of metal film-coupled nanosphere monomers and dimers, both with nanometric gaps. We have developed a new methodology, polarization-resolved spectral decomposition and color decoding, to "visualize" unambiguously the spectral and radiation properties of the complex plasmonic gap modes in these hybrid nanostructures. Single-particle spectroscopic measurements indicate that these hybrid nanostructures can simultaneously enhance several nonlinear optical processes, such as second harmonic generation, two-photon absorption induced luminescence, and hyper-Raman scattering. In the second part, I will show how the polarization state of the emissions from sub-10 nm upconversion nanocrystals (UCNCs) can be modulated when they form a hybrid complex with a gold nanorod (GNR). Our single-particle scattering experiments expose how an interplay between excitation polarization and GNR orientation gives rise to an extraordinary polarized nature of the upconversion emissions from an individual hybrid nanostructure. We support our results by numerical simulations and, using Förster resonance energy transfer theory, we uncover how an overlap between the UCNC emission and GNR extinction bands as well as the mutual orientation between emission and plasmonic dipoles jointly determine the polarization state of the UC emissions.

  1. Nonlinear optical and multiphoton processes for in situ manipulation and conversion of photons: applications to energy and healthcare (Conference Presentation)

    Science.gov (United States)

    Prasad, Paras N.

    2017-02-01

    Chiral control of nonlinear optical functions holds great promise for a wide range of applications including optical signal processing, bio-sensing and chiral bio-imaging. In chiral polyfluorene thin films, we demonstrated extremely large chiral nonlinearity. The physics of manipulating excitation dynamics for photon transformation will be discussed, along with nanochemistry control of upconversion in hierarchically built organic chromophore-coupled core-multiple-shell nanostructures, which introduces new organic-inorganic energy transfer routes for broadband light harvesting and increases upconversion efficiency via multistep cascaded energy transfer. We are pursuing the applications of photon conversion technology in IR harvesting for photovoltaics, high contrast bioimaging, photoacoustic imaging, photodynamic therapy, and optogenetics. An important application is in brain research and neurophotonics for functional mapping and modulation of brain activities. Another new direction pursued is magnetic field control of light in a chiral polymer nanocomposite to achieve a large magneto-optic coefficient, which can enable sensing of the extremely weak magnetic fields due to brain waves. Finally, we will consider the thought-provoking concept of utilizing photons to quantify, through magneto-optics, and augment, through nanoptogenetics, cognitive states, thus paving the pathway to a quantified human paradigm.

  2. Optics based signal processing methods for intraoperative blood vessel detection and quantification in real time (Conference Presentation)

    Science.gov (United States)

    Chaturvedi, Amal; Shukair, Shetha A.; Le Rolland, Paul; Vijayvergia, Mayank; Subramanian, Hariharan; Gunn, Jonathan W.

    2016-03-01

    Minimally invasive operations require surgeons to make difficult cuts to blood vessels and other tissues with impaired tactile and visual feedback. This leads to inadvertent cuts to blood vessels hidden beneath tissue, causing serious health risks to patients and a non-reimbursable financial burden to hospitals. Intraoperative imaging technologies have been developed, but these expensive systems can be cumbersome and provide only a high-level view of blood vessel networks. In this research, we propose a lean reflectance-based system, comprised of a dual wavelength LED, photodiode, and novel signal processing algorithms for rapid vessel characterization. Since this system takes advantage of the inherent pulsatile light absorption characteristics of blood vessels, no contrast agent is required for its ability to detect the presence of a blood vessel buried deep inside any tissue type (up to a cm) in real time. Once a vessel is detected, the system is able to estimate the distance of the vessel from the probe and the diameter size of the vessel (with a resolution of ~2mm), as well as delineate the type of tissue surrounding the vessel. The system is low-cost, functions in real-time, and could be mounted on already existing surgical tools, such as Kittner dissectors or laparoscopic suction irrigation cannulae. Having been successfully validated ex vivo, this technology will next be tested in a live porcine study and eventually in clinical trials.

  3. Enlightenment on Computer Network Reliability From Transportation Network Reliability

    OpenAIRE

    Hu Wenjun; Zhou Xizhao

    2011-01-01

    Drawing on the transportation network reliability problem, five new computer network reliability definitions are proposed and discussed: computer network connectivity reliability, computer network time reliability, computer network capacity reliability, computer network behavior reliability and computer network potential reliability. Finally, strategies are suggested to enhance network reliability.

  4. Load Control System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, Daniel [Montana Tech of the Univ. of Montana, Butte, MT (United States)

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions to the project scope occurred in August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also consisted of matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc.; and MSE. Research focused on two areas: real-time power-system load control methodologies; and power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  5. Safety, reliability and worker satisfaction during organizational change

    NARCIS (Netherlands)

    Zwetsloot, G.I.J.M.; Drupsteen, L.; Vroome, E.M.M. de

    2014-01-01

    The research presented in this paper was carried out in four process industry plants in the Netherlands, to identify factors that have the potential to increase safety and reliability while maintaining or improving job satisfaction. The data used were gathered as part of broader trajectories in thes

  7. Reliability analysis in intelligent machines

    Science.gov (United States)

    Mcinroy, John E.; Saridis, George N.

    1990-01-01

    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques for finding the reliability corresponding to alternative subsets of control and sensing strategies are proposed such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.

  8. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M.

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model incorporates the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.
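The Bayesian stance described, reliability as a state of knowledge revised by data, can be illustrated with the standard conjugate Beta-Binomial update. This is a textbook sketch under simple independence assumptions, not the thesis's dependent-component model; the prior and test counts are hypothetical.

```python
def beta_update(alpha, beta, successes, failures):
    """Conjugate Beta-Binomial update: a Beta(alpha, beta) prior on a
    component's success probability is revised by observed test data."""
    return alpha + successes, beta + failures

def posterior_mean(alpha, beta):
    """Posterior mean, used as the Bayesian point estimate of reliability."""
    return alpha / (alpha + beta)

# Uniform Beta(1, 1) prior; 9 successes and 1 failure observed on test.
a, b = beta_update(1, 1, 9, 1)
estimate = posterior_mean(a, b)   # 10/12, about 0.833
```

Each new observation shifts the posterior, making explicit the thesis's point that reliability is a function of the analyst's current information rather than a fixed physical constant.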

  9. VCSEL reliability: a user's perspective

    Science.gov (United States)

    McElfresh, David K.; Lopez, Leoncio D.; Melanson, Robert; Vacar, Dan

    2005-03-01

    VCSEL arrays are being considered for use in interconnect applications that require high speed, high bandwidth, high density, and high reliability. In order to better understand the reliability of VCSEL arrays, we initiated an internal project at Sun Microsystems, Inc. In this paper, we present preliminary results of an ongoing accelerated temperature-humidity-bias stress test on VCSEL arrays from several manufacturers. This test revealed no significant differences between the reliability of AlGaAs, oxide-confined VCSEL arrays constructed with a trench oxide and those using a mesa for isolation. This test did find that the reliability of arrays needs to be measured on arrays rather than estimated from data on singulated VCSELs, as is common practice.

  10. Human α-amylase present in lower-genital-tract mucosal fluid processes glycogen to support vaginal colonization by Lactobacillus.

    Science.gov (United States)

    Spear, Gregory T; French, Audrey L; Gilbert, Douglas; Zariffard, M Reza; Mirmonsef, Paria; Sullivan, Thomas H; Spear, William W; Landay, Alan; Micci, Sandra; Lee, Byung-Hoo; Hamaker, Bruce R

    2014-10-01

    Lactobacillus colonization of the lower female genital tract provides protection from the acquisition of sexually transmitted diseases, including human immunodeficiency virus, and from adverse pregnancy outcomes. While glycogen in vaginal epithelium is thought to support Lactobacillus colonization in vivo, many Lactobacillus isolates cannot utilize glycogen in vitro. This study investigated how glycogen could be utilized by vaginal lactobacilli in the genital tract. Several Lactobacillus isolates were confirmed to not grow on glycogen, but did grow on glycogen-breakdown products, including maltose, maltotriose, maltopentaose, maltodextrins, and glycogen treated with salivary α-amylase. A temperature-dependent glycogen-degrading activity was detected in genital fluids that correlated with levels of α-amylase. Treatment of glycogen with genital fluids resulted in production of maltose, maltotriose, and maltotetraose, the major products of α-amylase digestion. These studies show that human α-amylase is present in the female lower genital tract and elucidate how epithelial glycogen can support Lactobacillus colonization in the genital tract.

  11. Integration Of Digital Methodologies (Field, Processing, and Presentation) In A Combined Sedimentology/Stratigraphy and Structure Course

    Science.gov (United States)

    Malinconico, L. L., Jr.; Sunderlin, D.; Liew, C. W.

    2015-12-01

    Over the course of the last three years we have designed, developed and refined two Apps for the iPad. GeoFieldBook and StratLogger allow for the real-time display of spatial (structural) and temporal (stratigraphic) field data as well as very easy in-field navigation. Field techniques and methods for data acquisition and mapping in the field have dramatically advanced and simplified how we collect and analyze data while in the field. The Apps are not geologic mapping programs, but rather a way of bypassing the analog field book step to acquire digital data directly that can then be used in various analysis programs (GIS, Google Earth, Stereonet, spreadsheet and drawing programs). We now complete all of our fieldwork digitally. GeoFieldBook can be used to collect structural and other field observations. Each record includes location/date/time information, orientation measurements, formation names, text observations and photos taken with the tablet camera. Records are customizable, so users can add fields of their own choosing. Data are displayed on an image base in real time with oriented structural symbols. The image base is also used for in-field navigation. In StratLogger, the user records bed thickness, lithofacies, biofacies, and contact data in preset and modifiable fields. Each bed/unit record may also be photographed and geo-referenced. As each record is collected, a column diagram of the stratigraphic sequence is built in real time, complete with lithology color, lithology texture, and fossil symbols. The recorded data from any measured stratigraphic sequence can be exported as both the live-drawn column image and as a .csv formatted file for use in spreadsheet or other applications. Common to both Apps is the ability to export the data (via .csv files), photographs and maps or stratigraphic columns (images). Since the data are digital they are easily imported into various processing programs (for example for stereoplot analysis). 
Requiring that all maps

  12. Automatic Transaction Compensation for Reliable Grid Applications

    Institute of Scientific and Technical Information of China (English)

    Fei-Long Tang; Ming-Lu Li; Joshua Zhexue Huang

    2006-01-01

    As grid technology expands from scientific computing to business applications, service-oriented grid computing aims to provide reliable services for users while hiding the complexity of service processes from them. Grid services for coordinating the long-lived transactions that occur in business applications play an important role in reliable grid applications. In this paper, the grid transaction service (GridTS) is proposed for dealing with long-lived business transactions. We present a compensation-based long-lived transaction coordination algorithm that enables users to select results from committed sub-transactions. Unlike other long-lived transaction models that require application programmers to develop corresponding compensating transactions, GridTS can automatically generate compensating transactions on execution of a long-lived grid transaction. Simulation results demonstrate the feasibility of GridTS and the effectiveness of the corresponding algorithm.
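The compensation-based coordination described can be sketched as a minimal saga-style coordinator: committed sub-transactions register compensating actions, which run in reverse order if a later step fails. Note that GridTS generates compensators automatically; this hypothetical sketch only mimics that by having each step supply its compensator explicitly.

```python
class Saga:
    """Minimal compensation-based coordinator in the spirit of GridTS:
    each committed sub-transaction registers a compensating action, and
    the compensations run in reverse order when a later step fails."""

    def __init__(self):
        self._compensations = []

    def run(self, steps):
        """steps: list of (action, compensate) pairs; action() returns a
        result that is passed to compensate(result) on rollback."""
        try:
            for action, compensate in steps:
                result = action()
                # Bind result and compensator now, not at rollback time.
                self._compensations.append(lambda r=result, c=compensate: c(r))
            return True
        except Exception:
            for comp in reversed(self._compensations):
                comp()
            return False

log = []
steps = [
    (lambda: log.append("reserve") or "reserve",   # commits, yields its id
     lambda r: log.append("cancel-" + r)),         # compensator for step 1
    (lambda: 1 / 0,                                # this sub-transaction fails
     lambda r: log.append("never-runs")),
]
ok = Saga().run(steps)
```

After the failing second step, only the committed first step is compensated, leaving the log as `["reserve", "cancel-reserve"]`.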

  13. Reliable aluminum contact formation by electrostatic bonding

    Science.gov (United States)

    Kárpáti, T.; Pap, A. E.; Radnóczi, Gy; Beke, B.; Bársony, I.; Fürjes, P.

    2015-07-01

    The paper presents a detailed study of a reliable method developed for aluminum fusion wafer bonding assisted by the electrostatic force evolving during the anodic bonding process. The IC-compatible procedure described allows the parallel formation of electrical and mechanical contacts, facilitating reliable packaging of electromechanical systems with backside electrical contacts. This fusion bonding method supports the fabrication of complex microelectromechanical systems (MEMS) and micro-opto-electromechanical systems (MOEMS) structures with enhanced temperature stability, which is crucial in mechanical sensor applications such as pressure or force sensors. Due to the applied electrical potential of -1000 V, the Al metal layers are compressed by electrostatic force, and at the bonding temperature of 450 °C intermetallic diffusion causes aluminum ions to migrate between the metal layers.

  14. Structural Reliability Sensitivities under Nonstationary Random Vibrations

    Directory of Open Access Journals (Sweden)

    Rita Greco

    2013-01-01

    Full Text Available Response sensitivity evaluation is an important element in reliability evaluation and design optimization of structural systems. It has been widely studied under static and dynamic forcing conditions with deterministic input data. In this paper, structural response and reliability sensitivities are determined by means of the time domain covariance analysis in both classically and nonclassically damped linear structural systems. A time integration scheme is proposed for covariance sensitivity. A modulated, filtered, white noise input process is adopted to model the stochastic nonstationary loads. The method allows for the evaluation of sensitivity statistics of different quantities of dynamic response with respect to structural parameters. Finally, numerical examples are presented regarding a multistorey shear frame building.
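In the scalar case, time-domain covariance analysis of a white-noise-driven linear system reduces to integrating a Lyapunov equation in time. The forward-Euler sketch below is illustrative only; the parameters a, q and the step size are hypothetical, not values from the paper.

```python
def covariance_history(a, q, p0, dt, steps):
    """Forward-Euler integration of the scalar Lyapunov equation
    dP/dt = -2*a*P + q for a stable linear system (rate a > 0) driven
    by white noise of intensity q: a toy version of time-domain
    covariance analysis for nonstationary response statistics."""
    p, out = p0, [p0]
    for _ in range(steps):
        p += dt * (-2 * a * p + q)
        out.append(p)
    return out

# Response variance grows from 0 toward the steady state q / (2*a).
hist = covariance_history(1.0, 2.0, 0.0, 0.01, 2000)
```

Differentiating this equation with respect to a structural parameter yields an analogous linear equation for the covariance sensitivity, which is the quantity the paper integrates in time.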

  15. Information Presentation

    Science.gov (United States)

    Holden, Kritina L.; Thompson, Shelby G.; Sandor, Aniko; McCann, Robert S.; Kaiser, Mary K.; Adelstein, Barnard D.; Begault, Durand R.; Beutter, Brent R.; Stone, Leland S.; Godfroy, Martine

    2009-01-01

    The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew. In addition to addressing display design issues associated with information formatting, style, layout, and interaction, the Information Presentation DRP is also working toward understanding the effects of extreme environments encountered in space travel on information processing. Work is also in progress to refine human factors-based design tools, such as human performance modeling, that will supplement traditional design techniques and help ensure that optimal information design is accomplished in the most cost-efficient manner. The major areas of work, or subtasks, within the Information Presentation DRP for FY10 are: 1) Displays, 2) Controls, 3) Procedures and Fault Management, and 4) Human Performance Modeling. The poster will highlight completed and planned work for each subtask.

  16. Processing and MHC class II presentation of exogenous soluble antigen involving a proteasome-dependent cytosolic pathway in CD40-activated B cells.

    Science.gov (United States)

    Becker, Hans Jiro; Kondo, Eisei; Shimabukuro-Vornhagen, Alexander; Theurich, Sebastian; von Bergwelt-Baildon, Michael S

    2016-08-01

    Activated B cells have the capacity to present antigen and induce immune responses as potent antigen-presenting cells (APCs). As in other APCs, antigen presentation by B cells involves antigen internalization, antigen processing, and peptide loading onto MHC molecules. However, while the mechanism of antigen processing has been studied extensively in other APCs, this pathway remains elusive in B cells. The aim of this study was to investigate the MHC class II processing pathway in CD40-activated B cells (CD40Bs), as a model for activated, antigen-presenting B cells. Using CMV pp65 as a model antigen, we evaluated processing and presentation of the CD4+ T-cell epitope 509-523 (K509) by human CD40Bs in ELISPOT assays. As expected, stimulation of specific CD4+ T-cell clones was attenuated after pretreatment of CD40Bs with inhibitors of classic class II pathway components. However, proteasome inhibitors such as epoxomicin limited antigen presentation as well. This suggests that the antigen is processed in a non-classical, cytosolic MHC class II pathway. Further experiments with truncated protein variants revealed involvement of the proteasome in processing of the N and C extensions of the epitope. Access to the cytosol was shown to be size dependent. Epoxomicin sensitivity exclusively in CD40B cells, but not in dendritic cells, suggests a novel processing mechanism unique to this APC. Our data suggest that B cells process antigen using a distinct, non-classical class II pathway.

  17. From Pangaea to the present: geochronology, thermochronology and isotopic tracking of tectonic processes along the Northern Andes

    Science.gov (United States)

    Spikings, R.; Cochrane, R.; Van der Lelij, R.; Villagomez, D.

    2013-05-01

    Triassic-Tertiary rocks within the Central Cordillera of Colombia and Eastern Cordillera of Ecuador provide a record of the rift-to-drift phase of the western Tethys Wilson Cycle, Jurassic steady-state active margin magmatism, Early Cretaceous attenuation of the margin and the formation of new continental crust, and the accretion of an extensive oceanic plateau and arc sequence at ~75 Ma, which shielded juvenile continental crust from tectonic erosion during the Tertiary. The margin remained active throughout the Tertiary and was exhumed in response to changing oceanic plate kinematics and the collision of heterogeneous oceanic crust. We present geochronological, thermochronological, geochemical and Hf, Nd and O isotopic data that provide a highly temporally resolved record of the evolution of NW Gondwana from Pangaea to the present. Migmatitic leucosomes and S-type granites were emplaced along the NW South American margin during ~275-225 Ma, and tholeiitic amphibolites intruded during ~240-225 Ma. These sequences formed during continental rifting in a back-arc, leading to the formation of ophiolite sequences and oceanic crust by ~216 Ma. The Maya and Oaxaquia terranes of Central America may represent parts of the conjugate margin. The NW South American margin remained passive until ~183 Ma, when subduction gave rise to calc-alkaline, I-type granitoids until ~143 Ma. Earliest Cretaceous roll-back extended and exhumed the margin, causing the arc axis to migrate oceanward while the magmatic rocks became progressively more isotopically juvenile. Arc migration opened Early Cretaceous intra-arc basins that were floored by lavas and filled with arc detritus. The arc axis stabilized at ~130-115 Ma and fringed the continental margin outboard of the Jurassic arc. Compression at 120-110 Ma closed the intra-arc basins, exhumed the buttressing continental margin and obducted variably metamorphosed rocks of the east-dipping subduction channel onto the continental margin during

  18. Characterization of the Antigen Processing Machinery and Endogenous Peptide Presentation of a Bat MHC Class I Molecule.

    Science.gov (United States)

    Wynne, James W; Woon, Amanda P; Dudek, Nadine L; Croft, Nathan P; Ng, Justin H J; Baker, Michelle L; Wang, Lin-Fa; Purcell, Anthony W

    2016-06-01

    Bats are a major reservoir of emerging and re-emerging infectious diseases, including severe acute respiratory syndrome-like coronaviruses, henipaviruses, and Ebola virus. Although highly pathogenic to their spillover hosts, bats harbor these viruses, and a large number of other viruses, with little or no clinical signs of disease. How bats asymptomatically coexist with these viruses is unknown. In particular, little is known about bat adaptive immunity, and the presence of functional MHC molecules is mostly inferred from recently described genomes. In this study, we used an affinity purification/mass spectrometry approach to demonstrate that a bat MHC class I molecule, Ptal-N*01:01, binds antigenic peptides and associates with peptide-loading complex components. We identified several bat MHC class I-binding partners, including calnexin, calreticulin, protein disulfide isomerase A3, tapasin, TAP1, and TAP2. Additionally, endogenous peptide ligands isolated from Ptal-N*01:01 displayed a relatively broad length distribution and an unusual preference for a C-terminal proline residue. Finally, we demonstrate that this preference for C-terminal proline residues was observed in Hendra virus-derived peptides presented by Ptal-N*01:01 on the surface of infected cells. To our knowledge, this is the first study to identify endogenous and viral MHC class I ligands for any bat species and, as such, provides an important avenue for monitoring and development of vaccines against major bat-borne viruses both in the reservoir and spillover hosts. Additionally, it will provide a foundation to understand the role of adaptive immunity in bat antiviral responses.

  19. Viking Lander reliability program

    Science.gov (United States)

    Pilny, M. J.

    1978-01-01

    The Viking Lander reliability program is reviewed with attention given to the development of the reliability program requirements, reliability program management, documents evaluation, failure modes evaluation, production variation control, failure reporting and correction, and the parts program. Lander hardware failures which have occurred during the mission are listed.

  20. Reliability-Based Optimization in Structural Engineering

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Sørensen, John Dalsgaard

    1994-01-01

    In this paper reliability-based optimization problems in structural engineering are formulated on the basis of the classical decision theory. Several formulations are presented: reliability-based optimal design of structural systems with component or systems reliability constraints, reliability-based optimal inspection planning, and reliability-based experiment planning. It is explained how these optimization problems can be solved by application of similar techniques. The reliability estimation is limited to first-order reliability methods (FORM) for both component and systems reliability evaluation. Further topics include the inclusion of the finite element method as the response evaluation tool and how the size of the problem can be made practicable. Finally, the important task of model evaluation and sensitivity analysis of the optimal solution is treated, including a strategy for model-making with both pre- and post-analysis.

  1. Brief Report: An Observational Measure of Empathy for Autism Spectrum--A Preliminary Study of the Development and Reliability of the Client Emotional Processing Scale

    Science.gov (United States)

    Robinson, Anna; Elliott, Robert

    2016-01-01

    People with autism spectrum disorder (ASD) can have difficulties in emotion processing, including recognising their own and others' emotions, leading to problems in emotion regulation and interpersonal relating. This study reports the development and piloting of the Client Emotional Processing Scale-Autism Spectrum (CEPS-AS), a new observer…

  2. Mission reliability model of spacecraft launch based on bisimulation of continuous-time Markov processes

    Institute of Scientific and Technical Information of China (English)

    董学军; 武小悦; 陈英武

    2012-01-01

    The complex state space, concurrent execution of multiple processes, and repeated iteration of sub-processes make mission reliability difficult to quantify over the whole implementation of a spacecraft launch. Multiple concurrently executing continuous-time Markov chains are constructed to describe the state-transition constraints of spacecraft launch engineering, and the state space of the whole implementation process is simplified using a bisimulation equivalence relation. A mission reliability model covering the whole system and the whole process is then built from the transition probabilities of the continuous-time Markov chains. A numerical example shows that the model can be used for launch-schedule deduction, mission reliability evaluation, and weak-link analysis of spacecraft launch projects.
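The entry above rests on transition probabilities of continuous-time Markov chains. As a minimal sketch (not the paper's model), the following computes the state probabilities of a hypothetical three-state launch chain by uniformization, taking mission reliability as the probability of not having entered the absorbing failed state. The generator rates are invented for illustration.

```python
import numpy as np

def ctmc_state_probs(Q, p0, t, n_terms=200):
    """State probabilities p(t) = p0 @ expm(Q t) of a continuous-time Markov
    chain, computed by uniformization: with q >= max exit rate and
    M = I + Q/q, p(t) = sum_k e^{-qt} (qt)^k / k! * p0 @ M^k."""
    q = max(-np.diag(Q)) * 1.1 + 1e-12  # uniformization rate
    M = np.eye(Q.shape[0]) + Q / q
    p = np.zeros_like(p0, dtype=float)
    term = p0.astype(float)             # p0 @ M^k
    weight = np.exp(-q * t)             # Poisson(k; qt) weight
    for k in range(n_terms):
        p += weight * term
        term = term @ M
        weight *= q * t / (k + 1)
    return p

# Hypothetical 3-state launch-campaign model: 0 = nominal, 1 = degraded,
# 2 = failed (absorbing). Rates are illustrative, not from the paper.
Q = np.array([[-0.30, 0.25, 0.05],
              [0.00, -0.40, 0.40],
              [0.00, 0.00, 0.00]])
p0 = np.array([1.0, 0.0, 0.0])
pt = ctmc_state_probs(Q, p0, t=2.0)
reliability = pt[0] + pt[1]  # mission has not reached the failed state
```

Uniformization avoids a full matrix exponential and stays numerically stable because M is a stochastic matrix.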

  3. Wind turbine reliability database update.

    Energy Technology Data Exchange (ETDEWEB)

    Peters, Valerie A.; Hill, Roger Ray; Stinebaugh, Jennifer A.; Veers, Paul S.

    2009-03-01

    This report documents the status of the Sandia National Laboratories' Wind Plant Reliability Database. Included in this report are updates on the form and contents of the Database, which stems from a five-step process of data partnerships, data definition and transfer, data formatting and normalization, analysis, and reporting. Selected observations are also reported.

  4. Multi-mode reliability-based design of horizontal curves.

    Science.gov (United States)

    Essa, Mohamed; Sayed, Tarek; Hussein, Mohamed

    2016-08-01

    Recently, reliability analysis has been advocated as an effective approach to account for uncertainty in the geometric design process and to evaluate the risk associated with a particular design. In this approach, a risk measure (e.g. probability of noncompliance) is calculated to represent the probability that a specific design would not meet standard requirements. The majority of previous applications of reliability analysis in geometric design focused on evaluating the probability of noncompliance for only one mode of noncompliance such as insufficient sight distance. However, in many design situations, more than one mode of noncompliance may be present (e.g. insufficient sight distance and vehicle skidding at horizontal curves). In these situations, utilizing a multi-mode reliability approach that considers more than one failure (noncompliance) mode is required. The main objective of this paper is to demonstrate the application of multi-mode (system) reliability analysis to the design of horizontal curves. The process is demonstrated by a case study of Sea-to-Sky Highway located between Vancouver and Whistler, in southern British Columbia, Canada. Two noncompliance modes were considered: insufficient sight distance and vehicle skidding. The results show the importance of accounting for several noncompliance modes in the reliability model. The system reliability concept could be used in future studies to calibrate the design of various design elements in order to achieve consistent safety levels based on all possible modes of noncompliance.
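The multi-mode (system) probability of noncompliance described above can be sketched with a simple Monte Carlo simulation. The two limit states and all distribution parameters below are invented placeholders, not the case-study values for the Sea-to-Sky Highway:

```python
import random

def curve_pnc(n=100_000, seed=42):
    """Monte Carlo estimate of the probability of noncompliance (Pnc) of a
    horizontal curve with two noncompliance modes treated as a series system.
    The normal distributions and limit states are illustrative stand-ins."""
    rng = random.Random(seed)
    n1 = n2 = nsys = 0
    for _ in range(n):
        asd = rng.gauss(160.0, 25.0)   # available sight distance (m)
        rsd = rng.gauss(140.0, 15.0)   # required sight distance (m)
        f_dem = rng.gauss(0.16, 0.03)  # side-friction demand
        f_sup = rng.gauss(0.22, 0.03)  # supplied side friction
        m1 = asd < rsd                 # mode 1: insufficient sight distance
        m2 = f_dem > f_sup             # mode 2: vehicle skidding
        n1 += m1
        n2 += m2
        nsys += (m1 or m2)             # system is noncompliant if any mode occurs
    return n1 / n, n2 / n, nsys / n

p1, p2, psys = curve_pnc()
```

The system Pnc is bounded below by the largest single-mode Pnc and above by the sum of the modes, which is exactly why single-mode analyses can understate the risk.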

  5. Reliability evaluation of LCD based on two-phase Wiener degradation process

    Institute of Scientific and Technical Information of China (English)

    鄢伟安; 宋保维; 段桂林; 师义民

    2014-01-01

    Reliability modeling and evaluation for the degradation process of a liquid coupling device (LCD) are studied. The LCD degradation process has two phases: there is a change-point in the degradation, and the LCD follows different degradation processes before and after it. The traditional approach analyzes reliability using only the second phase, ignoring the information in the first. In view of this, a two-phase Wiener degradation process model is established and its reliability function is derived. The change-point between the two phases is estimated using the Schwarz information criterion (SIC). Finally, the method is applied to evaluate the reliability of an LCD and compared with the traditional method. The results show that the two-phase Wiener model depicts the LCD degradation process effectively and yields a more credible evaluation result.
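As a rough illustration of the two-phase idea (not the paper's closed-form reliability function), the following simulates Wiener degradation paths whose drift and diffusion switch at a change-point and estimates R(t) empirically. Every parameter value here is hypothetical:

```python
import random

def two_phase_reliability(n_paths=1000, t_max=20.0, dt=0.05, tau=8.0,
                          mu=(0.05, 0.20), sigma=(0.05, 0.10), D=2.0, seed=1):
    """Monte Carlo reliability R(t) = P(degradation stays below threshold D
    up to t) for a two-phase Wiener process whose drift and diffusion switch
    at the change-point tau. All parameter values are illustrative."""
    rng = random.Random(seed)
    n_steps = int(t_max / dt)
    first_passage = []
    for _ in range(n_paths):
        x, failed_at = 0.0, n_steps  # survives the whole horizon by default
        for k in range(n_steps):
            phase = 0 if k * dt < tau else 1
            x += mu[phase] * dt + sigma[phase] * dt ** 0.5 * rng.gauss(0.0, 1.0)
            if x >= D:
                failed_at = k
                break
        first_passage.append(failed_at)
    # Empirical R(t) on the simulation grid: fraction of paths still below D
    return [sum(f > k for f in first_passage) / n_paths for k in range(n_steps)]

R = two_phase_reliability()
```

Fitting only the steeper second phase to the whole history would bias the drift estimate, which is the information loss the two-phase model avoids.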

  6. Thermodynamic efficiency of present types of internal combustion engines for aircraft

    Science.gov (United States)

    Lucke, Charles E

    1917-01-01

    Report presents requirements of internal combustion engines suitable for aircraft. Topics include: (1) service requirements for aeronautic engines - power versus weight, reliability, and adaptability factors, (2) general characteristics of present aero engines, (3) aero engine processes and functions of parts versus power-weight ratio, reliability, and adaptability factors, and (4) general arrangement, form, proportions, and materials of aero parts - power-weight ratio, reliability, and adaptability.

  7. Bounded Intensity Process and Its Applications in Reliability Assessment of NC Machine Tools

    Institute of Scientific and Technical Information of China (English)

    王智明; 杨建国

    2012-01-01

    A method for selecting the best non-homogeneous Poisson process (NHPP) model for reliability analysis of repairable systems is proposed, based on the Akaike information criterion (AIC), the Bayesian information criterion (BIC), and the root-mean-square error (RMSE) of the fit to the failure data. Point and interval estimates of the parameters and reliability indices of the bounded intensity process model are obtained using the asymptotic lognormal distribution of the maximum likelihood estimator and the Fisher information matrix. Time-truncated failure data from multiple NC machine tools are analyzed. The results show that the bounded intensity process is suitable for reliability assessment of deteriorating machine tools subject to frequent maintenance.
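The bounded intensity process is commonly written with intensity λ(t) = a(1 − e^(−t/b)) (Pulcini's form). Assuming that parameterization, the sketch below computes the time-truncated NHPP log-likelihood and the AIC/BIC values used for model selection, with invented failure times and a crude grid search in place of a real optimizer:

```python
import math

def bip_loglik(a, b, times, T):
    """Time-truncated NHPP log-likelihood for the bounded intensity
    process lambda(t) = a*(1 - exp(-t/b)), whose mean function is
    Lambda(T) = a*(T - b*(1 - exp(-T/b)))."""
    ll = sum(math.log(a * (1.0 - math.exp(-t / b))) for t in times)
    return ll - a * (T - b * (1.0 - math.exp(-T / b)))

def aic(ll, k):
    return 2 * k - 2 * ll

def bic(ll, k, n):
    return k * math.log(n) - 2 * ll

# Illustrative failure times (h) of one machine tool, truncated at T = 3000 h.
times = [120, 340, 510, 820, 1100, 1380, 1710, 2050, 2420, 2800]
T = 3000.0

# Crude grid search for the MLE; a real analysis would use a numerical
# optimizer and also fit competing NHPP models before comparing AIC/BIC.
ll, a_hat, b_hat = max(
    (bip_loglik(a, b, times, T), a, b)
    for a in [k / 1000 for k in range(1, 30)]
    for b in [50, 100, 200, 400, 800])
aic_bip = aic(ll, 2)
bic_bip = bic(ll, 2, len(times))
```

Competing models (e.g. the AMSAA power-law NHPP) would be fitted the same way and the one with the lowest AIC/BIC retained.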

  8. Advances in reliability and system engineering

    CERN Document Server

    Davim, J

    2017-01-01

    This book presents original studies describing the latest research and developments in the area of reliability and systems engineering. It helps the reader identify gaps in the current knowledge and presents fruitful areas for further research in the field. Among others, this book covers reliability measures, reliability assessment of multi-state systems, optimization of multi-state systems, continuous multi-state systems, new computational techniques applied to multi-state systems and probabilistic and non-probabilistic safety assessment.

  9. Reliability engineering in solar energy: workshop proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Gross, G.

    1980-03-01

    A workshop to reveal the scope of reliability-related activities in solar energy conversion projects and in nonsolar segments of industry is described. Two reliability programs, one in heating and cooling and one in photovoltaics, are explicated. This document also presents general suggestions for the establishment of a unified program for reliability, durability, maintainability, and safety (RDM and S) in present and future solar projects.

  11. RESEARCH ON RELIABILITY GROWTH FOR SYNCHRONOUSLY DEVELOPED MULTI-SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    MA Xiao-ning; L(U) Zhen-zhou; YUE Zhu-feng

    2005-01-01

    An advanced reliability growth model, i.e., an exponential model, is presented to estimate the model parameters for multi-systems that are synchronously tested, synchronously censored, and synchronously improved. The presented method takes the data produced during the reliability growth process fully into consideration, including the failure numbers, safety numbers and failure times at each censored time. If the multi-systems are synchronously improved many times, and the reliability growth of each system fits the AMSAA (Army Material Systems Analysis Activity) model, the failure time of each system can reasonably be modeled by an exponential distribution between two adjoining censored times. The nonparametric method is employed to obtain the reliability at each censored time of the synchronous multi-systems. Point estimates of the model parameters, a and b, are given by the least-squares method, and a confidence interval for the parameter b is given as well. An engineering illustration is used to compare the result of the presented method with those of the available models. The result shows that the presented exponential growth model fits the AMSAA-BISE (Army Material Systems Analysis Activity-Beijing Institute of Structure and Environment) model rather well, and both models are suitable to estimate the reliability growth of synchronously developed multi-systems.
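For reference, the time-truncated AMSAA (Crow) model mentioned above has standard closed-form point estimates. The sketch below is the textbook AMSAA estimator, not the entry's exponential model, applied to illustrative failure times:

```python
import math

def amsaa_mle(times, T):
    """Time-truncated AMSAA (Crow) MLEs for the NHPP with intensity
    lambda(t) = lam * beta * t**(beta - 1):
    beta_hat = n / sum(ln(T/t_i)), lam_hat = n / T**beta_hat."""
    n = len(times)
    beta = n / sum(math.log(T / t) for t in times)
    lam = n / T ** beta
    return lam, beta

# Illustrative cumulative failure times (h) showing reliability growth
# (inter-failure times lengthen), truncated at T = 1000 h.
times = [25, 70, 140, 250, 400, 600, 850]
T = 1000.0
lam, beta = amsaa_mle(times, T)
# beta < 1 indicates a decreasing failure intensity, i.e. reliability growth;
# the current instantaneous MTBF is 1 / lambda(T).
mtbf_now = 1.0 / (lam * beta * T ** (beta - 1))
```

At the MLE the expected cumulative failure count Λ(T) = λT^β equals the observed count exactly, a handy sanity check on any implementation.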

  12. Modeling and Analysis of Component Faults and Reliability

    DEFF Research Database (Denmark)

    Le Guilly, Thibaut; Olsen, Petur; Ravn, Anders Peter;

    2016-01-01

    This chapter presents a process to design and validate models of reactive systems in the form of communicating timed automata. The models are extended with faults associated with probabilities of occurrence. This enables a fault tree analysis of the system using minimal cut sets that are automatically generated. The stochastic information on the faults is used to estimate the reliability of the fault-affected system. The reliability is given with respect to properties of the system state space. We illustrate the process on a concrete example using the Uppaal model checker for validating the ideal system model and the fault modeling. Then the statistical version of the tool, UppaalSMC, is used to find reliability estimates.

  13. Component reliability for electronic systems

    CERN Document Server

    Bajenescu, Titu-Marius I

    2010-01-01

    The main reason for the premature breakdown of today's electronic products (computers, cars, tools, appliances, etc.) is the failure of the components used to build these products. Today professionals are looking for effective ways to minimize the degradation of electronic components to help ensure longer-lasting, more technically sound products and systems. This practical book offers engineers specific guidance on how to design more reliable components and build more reliable electronic systems. Professionals learn how to optimize a virtual component prototype, accurately monitor product reliability during the entire production process, and add the burn-in and selection procedures that are the most appropriate for the intended applications. Moreover, the book helps system designers ensure that all components are correctly applied, margins are adequate, wear-out failure modes are prevented during the expected duration of life, and system interfaces cannot lead to failure.

  14. Anodisation of Aluminium Alloys by Micro-Capillary Technique as a Tool for Reliable, Cost-Efficient, and Quick Process Parameter Determination

    Directory of Open Access Journals (Sweden)

    Daniela Nickel

    2016-01-01

    Anodisation is essential for improving surface properties of aluminium alloys and composites regarding wear and corrosion behaviour. Optimisation of the anodising process depends on the microstructural constituents contained in aluminium alloys and represents a key task, consisting of the control of process parameters and electrolyte formulation. We applied the micro-capillary technique known from corrosion studies and modified it to form anodic aluminium oxide films on high-strength aluminium alloys, in comparison to pure aluminium, in sulphuric acid. A glass capillary with an opening of 800 μm in diameter was utilized. Corresponding electrochemical measurements during potentiodynamic and potentiostatic anodisation revealed anodic current responses similar to conventional anodisation. The measurement of film thickness was adapted to the thin anodised spots using ellipsometry and energy dispersive X-ray analysis. Cross sections prepared by focused ion beam milling confirm the thickness results and show the behaviour of intermetallic phases depending on the anodising potential. Consequently, micro-capillary anodising proved to be an effective tool for developing appropriate anodisation conditions for aluminium alloys and composites: it allows quick variation of electrolyte composition because only low electrolyte volumes are needed, rapid film formation due to short process durations over small areas, and more flexible variation of process parameters due to the set-up used.

  15. A Study of Secondary Students' Decision-Making Processes with Respect to Information Use, Particularly Students' Judgements of Relevance and Reliability

    Science.gov (United States)

    Watson, Curtis L.

    2010-01-01

    This report details an ongoing investigation of the decision-making processes of a group of secondary school students in south-eastern Australia undertaking information search tasks. The study is situated in the field of information seeking and use, and, more broadly, in decision making. Research questions focus on students' decisions about the…

  16. Column Grid Array Rework for High Reliability

    Science.gov (United States)

    Mehta, Atul C.; Bodie, Charles C.

    2008-01-01

    Due to requirements for reduced size and weight, the use of grid array packages in space applications has become commonplace. To meet the requirements of high reliability and a high number of I/Os, ceramic column grid array (CCGA) packages were selected for major electronic components used in the next Mars Rover mission (specifically, high-density Field Programmable Gate Arrays). The probability of removal and replacement of these devices on the actual flight printed wiring board assemblies is deemed to be very high, because last-minute discoveries in final test will dictate changes in the firmware. The questions and challenges presented to the manufacturing organizations engaged in the production of high-reliability electronic assemblies are: Is the reliability of the PWBA adversely affected by rework (removal and replacement) of the CGA package? And how many times can the same board be reworked without destroying a pad or degrading the lifetime of the assembly? To answer these questions, the most complex printed wiring board assembly used by the project was chosen as the test vehicle; the PWB was modified to provide a daisy-chain pattern, and a number of bare PWBs were acquired to this modified design. Non-functional 624-pin CGA packages with internal daisy chains matching the pattern on the PWB were procured. The combination of the modified PWB and the daisy-chained packages enables continuity measurements of every soldered contact during subsequent testing and thermal cycling. Several test-vehicle boards were assembled, reworked, and then thermally cycled to assess the reliability of the solder joints and board material, including pads and traces near the CGA. The details of the rework process and results of thermal cycling are presented in this paper.

  17. Calibrating ensemble reliability whilst preserving spatial structure

    Directory of Open Access Journals (Sweden)

    Jonathan Flowerdew

    2014-03-01

    Ensemble forecasts aim to improve decision-making by predicting a set of possible outcomes. Ideally, these would provide probabilities which are both sharp and reliable. In practice, the models, data assimilation and ensemble perturbation systems are all imperfect, leading to deficiencies in the predicted probabilities. This paper presents an ensemble post-processing scheme which directly targets local reliability, calibrating both climatology and ensemble dispersion in one coherent operation. It makes minimal assumptions about the underlying statistical distributions, aiming to extract as much information as possible from the original dynamic forecasts and support statistically awkward variables such as precipitation. The output is a set of ensemble members preserving the spatial, temporal and inter-variable structure from the raw forecasts, which should be beneficial to downstream applications such as hydrological models. The calibration is tested on three leading 15-d ensemble systems, and their aggregation into a simple multimodel ensemble. Results are presented for 12 h, 1° scale over Europe for a range of surface variables, including precipitation. The scheme is very effective at removing unreliability from the raw forecasts, whilst generally preserving or improving statistical resolution. In most cases, these benefits extend to the rarest events at each location within the 2-yr verification period. The reliability and resolution are generally equivalent or superior to those achieved using a Local Quantile-Quantile Transform, an established calibration method which generalises bias correction. The value of preserving spatial structure is demonstrated by the fact that 3×3 averages derived from grid-scale precipitation calibration perform almost as well as direct calibration at 3×3 scale, and much better than a similar test neglecting the spatial relationships. Some remaining issues are discussed regarding the finite size of the output
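The Quantile-Quantile Transform used as a baseline above generalises bias correction. A minimal sketch of plain empirical quantile mapping (a simplification of that method, with toy climatologies invented for illustration):

```python
import bisect

def quantile_map(raw_values, fcst_clim, obs_clim):
    """Basic empirical quantile mapping: each raw forecast value is located
    in the forecast climatology by rank and replaced by the observed-
    climatology value at the same (rank-based) probability."""
    fc = sorted(fcst_clim)
    ob = sorted(obs_clim)
    out = []
    for v in raw_values:
        rank = bisect.bisect_right(fc, v)                # values <= v
        idx = min(rank * len(ob) // len(fc), len(ob) - 1)
        out.append(ob[idx])
    return out

# Toy climatologies in which forecasts are systematically half as large as
# the observations; the mapping corrects the bias quantile by quantile.
fcst_clim = list(range(100))
obs_clim = [2 * x for x in range(100)]
calibrated = quantile_map([10, 50, 90], fcst_clim, obs_clim)
```

Because the mapping is monotone, it corrects each member's marginal distribution while leaving the rank structure of the ensemble intact, which is the property the paper's scheme also aims to preserve spatially.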

  18. A reliability assessment method based on support vector machines for CNC equipment

    Institute of Scientific and Technical Information of China (English)

    WU Jun; DENG Chao; SHAO XinYu; XIE S Q

    2009-01-01

    With the applications of high technology, a catastrophic failure of CNC equipment rarely occurs at normal operation conditions. So it is difficult for traditional reliability assessment methods based on time-to-failure distributions to deduce the reliability level. This paper presents a novel reliability assessment methodology to estimate the reliability level of equipment from machining performance degradation data when only a few samples are available. Least squares support vector machines (LS-SVM) are introduced to analyze the performance degradation process of the equipment. A two-stage parameter optimization and search method is proposed to improve the LS-SVM regression performance, and a reliability assessment model based on the LS-SVM is built. A machining performance degradation experiment was carried out on an OTM650 machine tool to validate the effectiveness of the proposed reliability assessment methodology.
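LS-SVM regression, as used above, replaces the SVM quadratic program with a linear KKT system. A minimal sketch under the usual formulation, with a hypothetical degradation data set (γ, σ and the data are invented, and the paper's two-stage parameter search is omitted):

```python
import numpy as np

def lssvm_fit(X, y, gamma=100.0, sigma=1.0):
    """Least-squares SVM regression with an RBF kernel: solve the KKT system
    [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y] and return a predictor
    f(x) = sum_i alpha_i k(x, x_i) + b."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]

    def predict(Xq):
        d2q = ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2q / (2.0 * sigma ** 2)) @ alpha + b

    return predict

# Hypothetical performance-degradation samples: time -> degradation index.
X = np.linspace(0.0, 10.0, 15).reshape(-1, 1)
y = 0.05 * X.ravel() ** 1.5 + 0.02 * np.sin(X.ravel())
predict = lssvm_fit(X, y)
```

The 1/γ ridge on the kernel diagonal is what makes the method usable with only a few samples, at the cost of losing sparsity in α.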

  20. Response and Reliability Problems of Dynamic Systems

    DEFF Research Database (Denmark)

    Nielsen, Søren R. K.

    The present thesis consists of selected parts of the work performed by the author on stochastic dynamics and reliability theory of dynamically excited structures, primarily during the period 1986-1996.

  1. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying existing software reliability models and proposes a state-of-the-art software reliability model relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  3. Reliability and radiation effects in compound semiconductors

    CERN Document Server

    Johnston, Allan

    2010-01-01

    This book discusses reliability and radiation effects in compound semiconductors, which have evolved rapidly during the last 15 years. Johnston's perspective in the book focuses on high-reliability applications in space, but his discussion of reliability is applicable to high reliability terrestrial applications as well. The book is important because there are new reliability mechanisms present in compound semiconductors that have produced a great deal of confusion. They are complex, and appear to be major stumbling blocks in the application of these types of devices. Many of the reliability problems that were prominent research topics five to ten years ago have been solved, and the reliability of many of these devices has been improved to the level where they can be used for ten years or more with low failure rates. There is also considerable confusion about the way that space radiation affects compound semiconductors. Some optoelectronic devices are so sensitive to damage in space that they are very difficu...

  4. Metrological Reliability of Medical Devices

    Science.gov (United States)

    Costa Monteiro, E.; Leon, L. F.

    2015-02-01

    The prominent development of health technologies of the 20th century triggered demands for metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of the biomeasurements results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with the analysis of their contributions to guarantee the innovative health technologies compliance with the main ethical pillars of Bioethics.

  5. Reliability Based Ship Structural Design

    DEFF Research Database (Denmark)

    Dogliani, M.; Østergaard, C.; Parmentier, G.;

    1996-01-01

    This paper deals with the development of different methods that allow the reliability-based design of ship structures to be transferred from the area of research to systematic application in current design. It summarises the achievements of a three-year collaborative research project dealing with developments of models of load effects and of structural collapse adopted in reliability formulations which aim at calibrating partial safety factors for ship structural design. New probabilistic models of still-water load effects are developed both for tankers and for containerships. New results are presented for the structure of several tankers and containerships. The results of the reliability analysis were the basis for the definition of a target safety level, which was used to assess the partial safety factors suitable for a new design-rules format to be adopted in modern ship structural design. Finally...

  6. Improvement of reliability of welding by in-process sensing and control (development of smart welding machines for girth welding of pipes). Final report

    Energy Technology Data Exchange (ETDEWEB)

    Hardt, D.E.; Masubuchi, K.; Paynter, H.M.; Unkel, W.C.

    1983-04-01

    Closed-loop control of the welding variables represents a promising, cost-effective approach to improving weld quality and therefore reducing the total cost of producing welded structures. The ultimate goal is to place all significant weld variables under direct closed-loop control; this contrasts with preprogrammed machines which place the welding equipment under control. As the first step, an overall strategy has been formulated and an investigation of weld pool geometry control for gas tungsten arc process has been completed. The research activities were divided into the areas of arc phenomena, weld pool phenomena, sensing techniques and control activities.

  7. Photovoltaic performance and reliability workshop

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B

    1996-10-01

    This proceedings is the compilation of papers presented at the ninth PV Performance and Reliability Workshop held at the Sheraton Denver West Hotel on September 4--6, 1996. This year's workshop included presentations from 25 speakers and had over 100 attendees. All of the presentations that were given are included in this proceedings. Topics of the papers included: defining service lifetime and developing models for PV module lifetime; examining and determining failure and degradation mechanisms in PV modules; combining IEEE/IEC/UL testing procedures; AC module performance and reliability testing; inverter reliability/qualification testing; standardization of utility interconnect requirements for PV systems; activities needed to separate variables by testing individual components of PV systems (e.g. cells, modules, batteries, inverters, charge controllers) for individual reliability and then testing them in actual system configurations; further results reported from field experience on modules, inverters, batteries, and charge controllers from field-deployed PV systems; and system certification and standardized testing for stand-alone and grid-tied systems.

  8. Operator adaptation to changes in system reliability under adaptable automation.

    Science.gov (United States)

    Chavaillaz, Alain; Sauer, Juergen

    2016-11-25

    This experiment examined how operators coped with a change in system reliability between training and testing. Forty participants were trained for 3 h on a complex process control simulation modelling six levels of automation (LOA). In training, participants either experienced a high- (100%) or low-reliability system (50%). The impact of training experience on operator behaviour was examined during a 2.5 h testing session, in which participants either experienced a high- (100%) or low-reliability system (60%). The results showed that most operators did not often switch between LOA. Most chose an LOA that relieved them of most tasks but maintained their decision authority. Training experience did not have a strong impact on the outcome measures (e.g. performance, complacency). Low system reliability led to decreased performance and self-confidence. Furthermore, complacency was observed under high system reliability. Overall, the findings suggest benefits of adaptable automation because it accommodates different operator preferences for LOA. Practitioner Summary: The present research shows that operators can adapt to changes in system reliability between training and testing sessions. Furthermore, it provides evidence that each operator has his/her preferred automation level. Since this preference varies strongly between operators, adaptable automation seems to be suitable to accommodate these large differences.

  9. Inhibiting DNA methylation activates cancer testis antigens and expression of the antigen processing and presentation machinery in colon and ovarian cancer cells.

    Science.gov (United States)

    Siebenkäs, Cornelia; Chiappinelli, Katherine B; Guzzetta, Angela A; Sharma, Anup; Jeschke, Jana; Vatapalli, Rajita; Baylin, Stephen B; Ahuja, Nita

    2017-01-01

    Innovative therapies for solid tumors are urgently needed. Recently, therapies that harness the host immune system to fight cancer cells have successfully treated a subset of patients with solid tumors. These responses have been strong and durable but have been observed in only a subset of patients. Work from our group and others has shown that epigenetic therapy, specifically inhibiting the silencing DNA methylation mark, activates immune signaling in tumor cells and can sensitize to immune therapy in murine models. Here we show that colon and ovarian cancer cell lines exhibit lower expression of transcripts involved in antigen processing and presentation to immune cells compared to normal tissues. In addition, treatment with clinically relevant low doses of DNMT inhibitors (which remove DNA methylation) increases expression of both the antigen processing and presentation machinery and Cancer Testis Antigens in these cell lines. We confirm that treatment with DNMT inhibitors upregulates expression of the antigen processing and presentation molecules B2M, CALR, CD58, PSMB8, and PSMB9 at the RNA and protein level in a wider range of colon and ovarian cancer cell lines and treatment time points than had been described previously. In addition, we show that DNMTi treatment upregulates many Cancer Testis Antigens common to both colon and ovarian cancer. This increase of both antigens and antigen presentation by epigenetic therapy may be one mechanism to sensitize patients to immune therapies.

  10. 76 FR 66229 - Transmission Planning Reliability Standards

    Science.gov (United States)

    2011-10-26

    ... planning process. The table includes a footnote regarding planned or controlled interruption of electric supply where a single contingency occurs on a transmission system. North American Electric Reliability...\\ Reliability Standard TPL-002-0a, Requirement R1. Planned or controlled interruption of electric supply to...

  11. Transitioning to Physics-of-Failure as a Reliability Driver in Power Electronics

    DEFF Research Database (Denmark)

    Wang, Huai; Liserre, Marco; Blaabjerg, Frede

    2014-01-01

    Power electronics has progressively gained important status in power generation, distribution and consumption. With more than 70% of electricity processed through power electronics, recent research endeavors to improve the reliability of power electronic systems to comply with more stringent...... constraints on cost, safety and availability in various applications. This paper serves to give an overview of the major aspects of reliability in power electronics and to address the future trends in this multidisciplinary research direction. The ongoing paradigm shift in reliability research is presented...... first. Then the three major aspects of power electronics reliability are discussed, respectively, which cover physics-of-failure analysis of critical power electronic components, state-of-the-art design for reliability process and robustness validation, and intelligent control and condition...

  12. A level set method for reliability-based topology optimization of compliant mechanisms

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Based on the level set model and the reliability theory, a numerical approach of reliability-based topology optimization for compliant mechanisms with multiple inputs and outputs is presented. A multi-objective topology optimal model of compliant mechanisms considering uncertainties of the loads, material properties, and member geometries is developed. The reliability analysis and topology optimization are integrated in the optimal iterative process. The reliabilities of the compliant mechanisms are evaluated by using the first order reliability method. Meanwhile, the problem of structural topology optimization is solved by the level set method which is flexible in handling complex topological changes and concise in describing the boundary shape of the mechanism. Numerical examples show the importance of considering the stochastic nature of the compliant mechanisms in the topology optimization process.
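The first order reliability method mentioned above searches for the design point in standard normal space. For independent normal variables, the classic Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration can be sketched as follows; this is a generic illustration, not the authors' implementation, and the limit state g = R - S in the example is hypothetical.

```python
import math

def form_beta(g, mu, sigma, tol=1e-8, max_iter=100):
    """Hasofer-Lind reliability index via the HL-RF iteration.

    g: limit-state function of the physical variables x (failure when g < 0)
    mu, sigma: means and standard deviations of independent normal variables.
    Returns beta, the distance to the design point in standard normal space.
    """
    n = len(mu)
    u = [0.0] * n                          # start at the mean point
    for _ in range(max_iter):
        x = [mu[i] + sigma[i] * u[i] for i in range(n)]
        # numerical gradient of g w.r.t. u (chain rule through x = mu + sigma*u)
        h = 1e-6
        grad = []
        for i in range(n):
            xp = list(x)
            xp[i] += sigma[i] * h
            grad.append((g(xp) - g(x)) / h)
        norm2 = sum(gi * gi for gi in grad)
        # HL-RF update: project onto the linearized limit state
        factor = (sum(grad[i] * u[i] for i in range(n)) - g(x)) / norm2
        u_new = [factor * grad[i] for i in range(n)]
        if max(abs(u_new[i] - u[i]) for i in range(n)) < tol:
            u = u_new
            break
        u = u_new
    return math.sqrt(sum(ui * ui for ui in u))

# Example: g = R - S with R ~ N(10, 2), S ~ N(5, 1); analytic beta = 5/sqrt(5)
beta = form_beta(lambda x: x[0] - x[1], [10.0, 5.0], [2.0, 1.0])
```

For a linear limit state the iteration converges in one step and reproduces the analytic index exactly, which makes it a convenient sanity check before applying the method inside an optimization loop.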

  14. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

    The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated...
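The two fundamental concepts named here, failure probability and reliability index, are linked by Pf = Φ(-β), where Φ is the standard normal CDF. A minimal sketch using Python's standard library (the β = 3.8 value is an often-quoted code target, used here only as an example):

```python
from statistics import NormalDist

phi = NormalDist()  # standard normal distribution

def failure_probability(beta):
    """Failure probability implied by a reliability index beta: Pf = Phi(-beta)."""
    return phi.cdf(-beta)

def reliability_index(pf):
    """Reliability index implied by a failure probability: beta = -Phi^-1(Pf)."""
    return -phi.inv_cdf(pf)

pf = failure_probability(3.8)   # roughly 7.2e-5
```

The two functions are inverses of each other, so codes can state a target in either form interchangeably.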

  15. Combination of structural reliability and interval analysis

    Institute of Scientific and Technical Information of China (English)

    Zhiping Qiu; Di Yang; Isaac Elishakoff

    2008-01-01

    In engineering applications, probabilistic reliability theory appears to be presently the most important method; however, in many cases precise probabilistic reliability theory cannot be considered an adequate and credible model of the real state of actual affairs. In this paper, we develop a hybrid of probabilistic and non-probabilistic reliability theory, which describes the structural uncertain parameters as interval variables when statistical data are insufficient. By using interval analysis, a new method for calculating the interval of the structural reliability as well as the reliability index is introduced, and the traditional probabilistic theory is incorporated with the interval analysis. Moreover, the new method preserves the useful part of the traditional probabilistic reliability theory, but removes the restriction of its strict requirement on data acquisition. An example is presented to demonstrate the feasibility and validity of the proposed theory.
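As a toy illustration of the interval idea (not the paper's algorithm): when the mean and standard deviation of the safety margin are themselves only known as intervals, the reliability index β = μ/σ is bounded by combining the extremes. The numbers below are hypothetical.

```python
def beta_interval(mu_lo, mu_hi, sigma_lo, sigma_hi):
    """Interval of the reliability index beta = mu/sigma of the safety margin
    when its mean and standard deviation are only known as intervals.
    Assumes mu_lo > 0 and sigma_lo > 0, so beta is smallest for the
    smallest mean with the largest spread, and vice versa."""
    return mu_lo / sigma_hi, mu_hi / sigma_lo

lo, hi = beta_interval(4.0, 6.0, 1.0, 1.5)   # beta lies in [2.67, 6.0]
```

Instead of a single reliability index, the analyst reports a guaranteed range, which is exactly what makes the approach usable when data are too sparse for full probability distributions.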

  16. Lz-transform and inverse Lz-transform application to dynamic reliability assessment for multi-state system

    DEFF Research Database (Denmark)

    Lisnianski, A.; Ding, Y.

    2014-01-01

    The paper presents a new method for reliability assessment of a complex multi-state system. The system and its components can have different performance levels ranging from perfect functioning to complete failure. A straightforward Markov method applied to solve the problem would require building...... such as the reliability function, mean time to failure, etc., the inverse Lz-transform is used, which completely reveals the underlying output process....
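In coefficient form, an Lz-transform is a map from performance levels to state probabilities at a given time, and components are combined with a structure operator (min for series flow, sum for parallel capacity). A small sketch under these assumptions, with made-up component data:

```python
from collections import defaultdict

def combine(u, v, op):
    """Combine two multi-state components given as {performance: probability}
    maps (the coefficient form of their Lz-transforms) with a structure
    operator, e.g. min for series flow or sum for parallel capacity."""
    out = defaultdict(float)
    for g1, p1 in u.items():
        for g2, p2 in v.items():
            out[op(g1, g2)] += p1 * p2
    return dict(out)

def availability(system, demand):
    """Probability that system performance meets the demand level."""
    return sum(p for g, p in system.items() if g >= demand)

# Two components, performance in arbitrary units, probabilities at some time t
c1 = {0: 0.1, 50: 0.3, 100: 0.6}
c2 = {0: 0.05, 80: 0.95}
series = combine(c1, c2, min)      # flow limited by the weaker component
a = availability(series, 50)       # P(system delivers at least 50 units)
```

Because the combination is purely algebraic, the state-space explosion of a straightforward Markov model of the whole system is avoided: each component's probabilities evolve separately and are merged only at the performance level.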

  17. Present Situation of Research and Development of Process Equipment for Seawater Desalination

    Institute of Scientific and Technical Information of China (English)

    王世明; 刘银

    2013-01-01

    Because it is free from variations in rainfall and has wide sources, seawater desalination is a feasible water resource for addressing the shortage of fresh water. A number of desalination technologies have been considered and developed significantly during the last several decades to augment the supply of water all around the world. A seawater desalination process separates seawater into two streams: a fresh water stream and a concentrated brine stream. Although the process equipment and support technologies are mature enough to make the sea a reliable source of fresh water, a significant amount of research and development (R&D) has been carried out in order to constantly improve the technologies and reduce the cost of desalination. This paper reviews the current status, practices, and advances that have been made in the realm of seawater desalination technologies. Additionally, it gives an overview of R&D activities and outlines future prospects for state-of-the-art seawater desalination technologies in combination with the five-year plan, as well as general problems of process equipment.

  18. Design for Reliability of Power Electronic Systems

    DEFF Research Database (Denmark)

    Wang, Huai; Ma, Ke; Blaabjerg, Frede

    2012-01-01

    Advances in power electronics enable efficient and flexible processing of electric power in the application of renewable energy sources, electric vehicles, adjustable-speed drives, etc. More and more efforts are devoted to better power electronic systems in terms of reliability to ensure high...... on a 2.3 MW wind power converter is discussed with emphasis on the reliability critical components IGBTs. Different aspects of improving the reliability of the power converter are mapped. Finally, the challenges and opportunities to achieve more reliable power electronic systems are addressed....

  19. Technical presentation

    CERN Multimedia

    FP Department

    2009-01-01

    07 April 2009 Technical presentation by Leuze Electronics: 14.00 – 15.00, Main Building, Room 61-1-017 (Room A) Photoelectric sensors, data identification and transmission systems, image processing systems. We at Leuze Electronics are "the sensor people": we have been specialising in optoelectronic sensors and safety technology for accident prevention for over 40 years. Our dedicated staff are all highly customer oriented. Customers of Leuze Electronics can always rely on one thing – on us! • Founded in 1963 • 740 employees • 115 MEUR turnover • 20 subsidiaries • 3 production facilities in southern Germany Product groups: • Photoelectric sensors • Identification and measurements • Safety devices

  20. Estimation of the Reliability of Distributed Applications

    OpenAIRE

    Marian Pompiliu CRISTESCU; Laurentiu CIOVICA

    2010-01-01

    In this paper the reliability is presented as an important feature for use in mission-critical distributed applications. Certain aspects of distributed systems make the requested level of reliability more difficult. An obvious benefit of distributed systems is that they serve the global business and social environment in which we live and work. Another benefit is that they can improve the quality of services, in terms of reliability, availability and performance, for the complex systems. The ...
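For intuition, the reliability of a distributed configuration is often approximated by series/parallel composition of component reliabilities: a chain of dependencies multiplies, redundant replicas complement. The figures below are hypothetical, not from the paper.

```python
from functools import reduce

def series_reliability(*r):
    """All components must work (e.g. client -> network -> service chain)."""
    return reduce(lambda acc, x: acc * x, r, 1.0)

def parallel_reliability(*r):
    """At least one redundant replica must work."""
    return 1.0 - reduce(lambda acc, x: acc * (1.0 - x), r, 1.0)

# Hypothetical: a service behind two redundant replicas, each 0.95 reliable,
# reached over a 0.99-reliable network path.
r = series_reliability(0.99, parallel_reliability(0.95, 0.95))   # 0.987525
```

The example shows the quality-of-service point made in the abstract: duplicating the 0.95 replica raises that stage to 0.9975, so the network path becomes the dominant contributor to unreliability.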

  1. Quality and Reliability of Missile System

    Directory of Open Access Journals (Sweden)

    Mr. Prahlada

    2002-01-01

    Full Text Available A missile system is a single-shot weapon system which requires very high quality and reliability. Therefore, quality and reliability have to be built into the system from design through testing and evaluation. In this paper, the technological challenges encountered during development of an operational missile system are presented, together with the factors considered to build quality and reliability into the system through design, manufacture, assembly, and testing, and by sharing knowledge with other aerospace agencies, industries, and institutions.

  2. The Northern HIPASS catalogue - Data presentation, completeness and reliability measures

    CERN Document Server

    Wong, O I; Garcia-Appadoo, D A; Webster, R L; Staveley-Smith, L; Zwaan, M A; Meyer, M J; Barnes, D G; Kilborn, V A; Bhathal, R; De Blok, W J G; Disney, M J; Doyle, M T; Drinkwater, M J; Ekers, R D; Freeman, K C; Gibson, B K; Gurovich, S; Harnett, J I; Henning, P A; Jerjen, H; Kesteven, M J; Knezek, P M; Koribalski, B S; Mader, S; Marquarding, M; Minchin, R F; O'Brien, J; Putman, M E; Ryder, S D; Sadler, E M; Stevens, J; Stewart, I M; Stootman, F; Waugh, M

    2006-01-01

    The Northern HIPASS catalogue (NHICAT) is the northern extension of the HIPASS catalogue, HICAT (Meyer et al. 2004). This extension adds the sky area between the declination range of +2 deg 300 km/s . Sources with -300 km/s .

  3. MDP: Reliable File Transfer for Space Missions

    Science.gov (United States)

    Rash, James; Criscuolo, Ed; Hogie, Keith; Parise, Ron; Hennessy, Joseph F. (Technical Monitor)

    2002-01-01

    This paper presents work being done at NASA/GSFC by the Operating Missions as Nodes on the Internet (OMNI) project to demonstrate the application of the Multicast Dissemination Protocol (MDP) to space missions to reliably transfer files. This work builds on previous work by the OMNI project to apply Internet communication technologies to space communication. The goal of this effort is to provide an inexpensive, reliable, standard, and interoperable mechanism for transferring files in the space communication environment. Limited bandwidth, noise, delay, intermittent connectivity, link asymmetry, and one-way links are all possible issues for space missions. Although these are link-layer issues, they can have a profound effect on the performance of transport and application level protocols. MDP, a UDP-based reliable file transfer protocol, was designed for multicast environments which have to address these same issues, and it has done so successfully. Developed by the Naval Research Lab in the mid-1990s, MDP is now in daily use by both the US Post Office and the DoD. This paper describes the use of MDP to provide automated end-to-end data flow for space missions. It examines the results of a parametric study of MDP in a simulated space link environment and discusses the results in terms of their implications for space missions. Lessons learned are addressed, which suggest minor enhancements to the MDP user interface to add specific features for space mission requirements, such as dynamic control of data rate, and a checkpoint/resume capability. These are features that are provided for in the protocol, but are not implemented in the sample MDP application that was provided. A brief look is also taken at the status of standardization. A version of MDP known as NORM (NACK-Oriented Reliable Multicast) is in the process of becoming an IETF standard.

  4. Fast and reliable DNA extraction protocol for identification of species in raw and processed meat products sold on the commercial market

    Directory of Open Access Journals (Sweden)

    Alvarado Pavel Espinoza

    2017-08-01

    Full Text Available In this work a protocol for the extraction of DNA from the meat of different animals (beef, pork, and horse was established. The protocol utilized TE lysis buffer with varying concentrations of phenol and chloroform as a base reagent. Reactions were carried out for varying time periods and under differing temperatures. All samples analyzed were obtained from commercial grade meat sourced from the local region. Twelve samples were used for methodological optimization with 30 repetitions per sample. Once optimized, purity results for the three species were 1.7 with a concentration (determined spectrophotometrically at 260 nm of 100 μl/ml of DNA. The protocol was tested using 465 different meat samples from different animal species. All meat used was fresh and processed. Results showed a purity of 1.35 ± 0.076 and a DNA concentration of 70 ± 0.31 μl for a time duration of 1.5 hours. These results were tested by polymerase chain reaction (PCR as reported by several authors. The extracts were tested using different PCR reactions using specific primers for horses. Results suggest that there were 39 positive samples. The proposed methodology provides an efficient way to detect DNA concentration and purity, suitable for amplification with PCR.
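The purity and concentration figures quoted are standard spectrophotometric measures. Assuming the conventional conversions (1 A260 unit ≈ 50 μg/mL for double-stranded DNA, purity as the A260/A280 ratio, with ~1.8 indicating protein-free DNA), they can be computed as:

```python
def dna_concentration_ug_per_ml(a260, dilution=1.0):
    """Double-stranded DNA concentration from absorbance at 260 nm,
    using the standard conversion 1 A260 unit ~ 50 ug/mL for dsDNA."""
    return a260 * 50.0 * dilution

def purity_ratio(a260, a280):
    """A260/A280 ratio; ~1.8 indicates protein-free DNA, lower values
    suggest protein or phenol carry-over."""
    return a260 / a280

conc = dna_concentration_ug_per_ml(0.5, dilution=2.0)   # 50 ug/mL
ratio = purity_ratio(0.5, 0.294)                        # ~1.70
```

The 0.5/0.294 absorbance pair is invented to reproduce a purity near the 1.7 reported for the optimized protocol; it is not data from the study.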

  5. Making the most of RNA-seq: Pre-processing sequencing data with Opossum for reliable SNP variant detection [version 1; referees: 2 approved, 1 approved with reservations

    Directory of Open Access Journals (Sweden)

    Laura Oikkonen

    2017-01-01

    Full Text Available Identifying variants from RNA-seq (transcriptome sequencing data is a cost-effective and versatile alternative to whole-genome sequencing. However, current variant callers do not generally behave well with RNA-seq data due to reads encompassing intronic regions. We have developed a software programme called Opossum to address this problem. Opossum pre-processes RNA-seq reads prior to variant calling, and although it has been designed to work specifically with Platypus, it can be used equally well with other variant callers such as GATK HaplotypeCaller. In this work, we show that using Opossum in conjunction with either Platypus or GATK HaplotypeCaller maintains precision and improves the sensitivity for SNP detection compared to the GATK Best Practices pipeline. In addition, using it in combination with Platypus offers a substantial reduction in run times compared to the GATK pipeline so it is ideal when there are only limited time or computational resources available.

  6. Making the most of RNA-seq: Pre-processing sequencing data with Opossum for reliable SNP variant detection [version 2; referees: 2 approved, 1 approved with reservations

    Directory of Open Access Journals (Sweden)

    Laura Oikkonen

    2017-03-01

    Full Text Available Identifying variants from RNA-seq (transcriptome sequencing data is a cost-effective and versatile complement to whole-exome (WES and whole-genome sequencing (WGS analysis. RNA-seq (transcriptome sequencing is primarily considered a method of gene expression analysis but it can also be used to detect DNA variants in expressed regions of the genome. However, current variant callers do not generally behave well with RNA-seq data due to reads encompassing intronic regions. We have developed a software programme called Opossum to address this problem. Opossum pre-processes RNA-seq reads prior to variant calling, and although it has been designed to work specifically with Platypus, it can be used equally well with other variant callers such as GATK HaplotypeCaller. In this work, we show that using Opossum in conjunction with either Platypus or GATK HaplotypeCaller maintains precision and improves the sensitivity for SNP detection compared to the GATK Best Practices pipeline. In addition, using it in combination with Platypus offers a substantial reduction in run times compared to the GATK pipeline so it is ideal when there are only limited time or computational resources available.

  7. Constructing the 'Best' Reliability Data for the Job - Developing Generic Reliability Data from Alternative Sources Early in a Product's Development Phase

    Science.gov (United States)

    Kleinhammer, Roger K.; Graber, Robert R.; DeMott, D. L.

    2016-01-01

    Reliability practitioners advocate getting reliability involved early in a product development process. However, when assigned to estimate or assess the (potential) reliability of a product or system early in the design and development phase, they are faced with a lack of reasonable models or methods for useful reliability estimation. Developing specific data is costly and time consuming. Instead, analysts rely on available data to assess reliability. Finding data relevant to the specific use and environment for any project is difficult, if not impossible. Instead, analysts attempt to develop the "best" or composite analog data to support the assessments. Industries, consortia and vendors across many areas have spent decades collecting, analyzing and tabulating fielded item and component reliability performance in terms of observed failures and operational use. This data resource provides a huge compendium of information for potential use, but can also be compartmented by industry, difficult to find out about, access, or manipulate. One method used incorporates processes for reviewing these existing data sources and identifying the available information based on similar equipment, then using that generic data to derive an analog composite. Dissimilarities in equipment descriptions, environment of intended use, quality and even failure modes impact the "best" data incorporated in an analog composite. Once developed, this composite analog data provides a "better" representation of the reliability of the equipment or component. It can be used to support early risk or reliability trade studies, or analytical models to establish the predicted reliability data points. It also establishes a baseline prior that may be updated based on test data or observed operational constraints and failures, i.e., using Bayesian techniques. This tutorial presents a descriptive compilation of historical data sources across numerous industries and disciplines, along with examples of contents
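The Bayesian update mentioned at the end has a simple conjugate form when failures are modeled as a Poisson process with a Gamma prior on the failure rate. The prior and test data below are invented for illustration; the analog composite data would supply the prior in practice.

```python
def update_failure_rate(prior_alpha, prior_beta, failures, exposure_hours):
    """Conjugate Gamma-Poisson update of a failure-rate estimate.

    Prior: rate ~ Gamma(alpha, beta) with beta in hours, so the prior
    mean rate is alpha/beta failures per hour. Observing `failures`
    events over `exposure_hours` yields the posterior
    Gamma(alpha + failures, beta + exposure_hours).
    """
    alpha = prior_alpha + failures
    beta = prior_beta + exposure_hours
    return alpha, beta, alpha / beta   # posterior parameters and mean rate

# Hypothetical analog prior: mean rate 1e-5 /h (alpha=2, beta=2e5 h),
# updated with 1 failure observed in 5e4 h of test time.
a, b, mean_rate = update_failure_rate(2.0, 2.0e5, 1, 5.0e4)
```

Sparse test evidence shifts the estimate only modestly when the prior carries substantial pseudo-exposure, which is exactly the intended behavior of starting from a composite analog baseline.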

  8. Reliability Analysis of Tubular Joints in Offshore Structures

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Sørensen, John Dalsgaard

    1987-01-01

    Reliability analysis of single tubular joints and offshore platforms with tubular joints is presented. The failure modes considered are yielding, punching, buckling and fatigue failure. Element reliability as well as systems reliability approaches are used and illustrated by several examples....... Finally, optimal design of tubular joints with reliability constraints is discussed and illustrated by an example....

  9. VERMEIL. Methods for knowledge based development of reliable process controll system. Final report; VERMEIL. Verfahren und Methoden zur wissensbasierten Entwicklung zuverlaessiger Leitsysteme. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Goetze, B.; Plessow, M.; Pocher, M.

    1998-12-29

    The goal of the VERMEIL project was to increase the quality of process control systems by using knowledge based methods. One important aspect therein is the quality of the documentation describing the control system. The research done in the subproject of the Society for the Promotion of Applied Computer Science (GFaI) produced a basis for the development of graphical editors of a special kind. These graphical editors can be used to create classes of schematic drawings. There were two major aspects concerning the development of the graphical editors. First, the editors should provide advanced graphical support, including automation of the layout of schematic drawings. Second, the editors should automatically lead to correct structures within the schematics, thereby making certain that the created document is also semantically correct. Apart from the theoretical and conceptual research, several automated layout algorithms were created in the project. A new prototype of an intelligent graphical editor came into existence that proves the results of the VERMEIL project and shows the necessity of such a project. (orig.)

  10. Effects of Gate Stack Structural and Process Defectivity on High-k Dielectric Dependence of NBTI Reliability in 32 nm Technology Node PMOSFETs

    Directory of Open Access Journals (Sweden)

    H. Hussin

    2014-01-01

    Full Text Available We present a simulation study on negative bias temperature instability (NBTI induced hole trapping in E′ center defects, which leads to depassivation of interface trap precursors in different geometrical structures of high-k PMOSFET gate stacks using the two-stage NBTI model. The resulting degradation is characterized based on the time evolution of the interface and hole trap densities, as well as the resulting threshold voltage shift. By varying the physical thicknesses of the interface silicon dioxide (SiO2 and hafnium oxide (HfO2 layers, we investigate how the variation in thickness affects hole trapping/detrapping at different stress temperatures. The results suggest that the degradations are highly dependent on the physical gate stack parameters for a given stress voltage and temperature. The degradation is more pronounced by 5% when the thickness of the HfO2 layer is increased but is reduced by 11% when the SiO2 interface layer thickness is increased during lower stress voltage. However, at higher stress voltage, greater degradation is observed for a thicker SiO2 interface layer. In addition, the existence of different stress temperatures at which the degradation behavior differs implies that the hole trapping/detrapping event is thermally activated.

  11. Photovoltaic system reliability

    Energy Technology Data Exchange (ETDEWEB)

    Maish, A.B.; Atcitty, C. [Sandia National Labs., NM (United States)]; Greenberg, D. [Ascension Technology, Inc., Lincoln Center, MA (United States)] [and others]

    1997-10-01

    This paper discusses the reliability of several photovoltaic projects including SMUD's PV Pioneer project, various projects monitored by Ascension Technology, and the Colorado Parks project. System times-to-failure range from 1 to 16 years, and maintenance costs range from 1 to 16 cents per kilowatt-hour. Factors contributing to the reliability of these systems are discussed, and practices are recommended that can be applied to future projects. This paper also discusses the methodology used to collect and analyze PV system reliability data.

  12. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation...... of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature......

  13. A Method for Reviewing the Accuracy and Reliability of a Five-Level Triage Process (Canadian Triage and Acuity Scale in a Community Emergency Department Setting: Building the Crowding Measurement Infrastructure

    Directory of Open Access Journals (Sweden)

    Michael K. Howlett

    2012-01-01

    Full Text Available Objectives. Triage data are widely used to evaluate patient flow, disease severity, and emergency department (ED) workload, factors used in ED crowding evaluation and management. We defined an indicator-based methodology that can be easily used to review the accuracy of Canadian Triage and Acuity Scale (CTAS) performance. Methods. A trained nurse reviewer (NR) retrospectively triaged two separate months' ED charts relative to a set of clinical indicators based on CTAS Chief Complaints. Interobserver reliability and accuracy were compared using Kappa and comparative statistics. Results. There were 2838 patients in Trial 1 and 3091 in Trial 2. The rate of inconsistent triage was 14% and 16% (Kappa 0.596 and 0.604). Clinical indicators "pain scale, chest pain, musculoskeletal injury, respiratory illness, and headache" captured 68% and 62% of visits. Conclusions. We have demonstrated a system to measure the levels of process accuracy and reliability for triage over time. We identified five key clinical indicators which captured over 60% of visits. A simple method for quality review uses a small set of indicators, capturing a majority of cases. Performance consistency and data collection using indicators may be important areas to direct training efforts.
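The interobserver agreement statistic used here, Cohen's kappa, corrects raw agreement for the agreement expected by chance and can be computed directly from the two raters' labels. The ten-chart example below is hypothetical, not the study data.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels
    (here, triage levels) to the same cases."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_chance = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (p_observed - p_chance) / (1.0 - p_chance)

# Hypothetical re-triage of 10 charts on a 5-level scale
nurse = [1, 2, 3, 3, 2, 4, 5, 2, 3, 1]
audit = [1, 2, 3, 2, 2, 4, 5, 3, 3, 1]
kappa = cohens_kappa(nurse, audit)   # 0.8 raw agreement, kappa ~ 0.74
```

Kappa values around 0.6, as reported in the abstract, are conventionally read as "moderate to substantial" agreement, which is why the raw 14-16% inconsistency rate is reported alongside it.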

  14. On Improving Reliability of SRAM-Based Physically Unclonable Functions

    Directory of Open Access Journals (Sweden)

    Arunkumar Vijayakumar

    2017-01-01

    Full Text Available Physically unclonable functions (PUFs) have been touted for their inherent resistance to invasive attacks and low cost in providing a hardware root of trust for various security applications. SRAM PUFs in particular are popular in industry for key/ID generation. Due to intrinsic process variations, each SRAM cell ideally tends to power up to the same state on every power-on, and SRAM PUFs exploit this start-up behavior. Unfortunately, not all SRAM cells exhibit reliable start-up behavior, owing to noise susceptibility, so design enhancements are needed to improve reliability. Enhancements proposed in the literature include fuzzy extraction, error-correcting codes and voting mechanisms; all involve a trade-off between area/power/performance overhead and PUF reliability. This paper presents a design enhancement technique for reliability that improves upon previous solutions. We present simulation results to quantify the improvement in SRAM PUF reliability and efficiency. The proposed technique is shown to generate a 128-bit key in ≤0.2 μs at an estimated area of 4538 μm² with an error rate as low as 10⁻⁶ for an intrinsic error probability of 15%.
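    Of the enhancements listed, voting is the simplest to sketch. The snippet below is an illustration of generic majority voting over repeated power-ups, not the paper's technique; it shows how the per-bit error rate shrinks with the number of votes:

```python
from math import comb

def majority_vote(readings):
    """Resolve a PUF bit from an odd number of noisy 0/1 readings."""
    return int(sum(readings) > len(readings) / 2)

def voted_error_rate(p_flip, votes):
    """Analytical per-bit error rate after majority voting over
    `votes` independent reads, each flipping with probability p_flip:
    the vote fails when a majority of reads are wrong."""
    k = votes // 2 + 1
    return sum(comb(votes, i) * p_flip**i * (1 - p_flip)**(votes - i)
               for i in range(k, votes + 1))

# With the 15% intrinsic error probability cited above, 9-way voting
# already pushes the per-bit error rate below 1%.
rate = voted_error_rate(0.15, 9)
```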

  15. 2,4-D abatement from groundwater samples by photo-Fenton processes at circumneutral pH using naturally iron present. Effect of inorganic ions.

    Science.gov (United States)

    Gutiérrez-Zapata, Héctor M; Rojas, Karen L; Sanabria, Janeth; Rengifo-Herrera, Julián Andrés

    2017-03-01

    This study evaluated, at laboratory scale, whether using the iron naturally present (0.3 mg L⁻¹) and adding 10 mg L⁻¹ of hydrogen peroxide was effective in removing 24.3 mg L⁻¹ of 2,4-dichlorophenoxyacetic acid (2,4-D) from groundwater samples under simulated solar irradiation (global intensity = 300 W m⁻²). Under these conditions, the degradation of 2,4-D reached 75.2% and the appearance of its main oxidation byproduct, 2,4-dichlorophenol (DCP), was observed. The pH increased from 7.0 to 8.3 during the experiment. Experiments using Milli-Q water at pH 7.0, with iron and H2O2 concentrations of 0.3 and 10 mg L⁻¹, respectively, were carried out to study the effect of ions such as carbonate species, phosphate, and fluoride at concentrations typical of groundwater. Ion concentrations were combined using a 2³ factorial experimental design. Results showed that carbonates and fluoride did not have a detrimental effect on 2,4-D degradation, while phosphate inhibited the process; in these cases the pH also increased from 7.0 to 7.95 and 8.99. The effect of parameters such as pH, iron concentration, and hydrogen peroxide concentration on 2,4-D degradation by the photo-Fenton process in groundwater was likewise evaluated with a 2³ factorial experimental design, and the pH was found to be the main parameter affecting the process. This study shows for the first time that the photo-Fenton process at circumneutral pH using naturally present iron seems to be a promising way to remove pesticides from groundwater.

  16. Chapter 15: Reliability of Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    Sheng, Shuangwen; O' Connor, Ryan

    2017-05-19

    The global wind industry has witnessed exciting developments in recent years. The future will be even brighter with further reductions in capital and operation and maintenance costs, which can be accomplished with improved turbine reliability, especially when turbines are installed offshore. One opportunity for the industry to improve wind turbine reliability is through the exploration of reliability engineering life data analysis based on readily available data or maintenance records collected at typical wind plants. If adopted and conducted appropriately, these analyses can quickly save operation and maintenance costs in a potentially impactful manner. This chapter discusses wind turbine reliability by highlighting the methodology of reliability engineering life data analysis. It first briefly discusses fundamentals for wind turbine reliability and the current industry status. Then, the reliability engineering method for life analysis, including data collection, model development, and forecasting, is presented in detail and illustrated through two case studies. The chapter concludes with some remarks on potential opportunities to improve wind turbine reliability. An owner and operator's perspective is taken and mechanical components are used to exemplify the potential benefits of reliability engineering analysis to improve wind turbine reliability and availability.
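    The life data analysis described above typically begins by fitting a Weibull distribution to component failure times. A minimal median-rank-regression sketch (one common fitting approach; the failure times below are hypothetical, not plant data):

```python
from math import exp, log

def weibull_mrr(failure_times):
    """Fit a 2-parameter Weibull (shape beta, scale eta) to complete
    failure data by median-rank regression."""
    times = sorted(failure_times)
    n = len(times)
    xs, ys = [], []
    for i, t in enumerate(times, start=1):
        f = (i - 0.3) / (n + 0.4)        # Bernard's median-rank approximation
        xs.append(log(t))
        ys.append(log(-log(1.0 - f)))    # Weibull probability-plot transform
    mx, my = sum(xs) / n, sum(ys) / n
    # Least-squares slope is the shape parameter beta.
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    eta = exp(mx - my / beta)            # scale from the fitted intercept
    return beta, eta

# Hypothetical gearbox-bearing failure times in months of operation.
beta, eta = weibull_mrr([28, 41, 55, 63, 70, 94])
```

    A fitted shape parameter above 1 indicates wear-out behavior, which is what motivates time-based maintenance and forecasting of remaining failures.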

  17. Reliability-Based Topology Optimization With Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoungryun [JAEIK Information and Communication Co. Ltd., Seoul (Korea, Republic of); Wang, Semyung [Kwangju Institute of Science and Technology, Kwangju (Korea, Republic of); Choi, Kyung K. [Univ. of Iowa, Iowa (United States)

    2002-11-15

    A probabilistic optimal design modeled with finite elements is presented. A 2-D finite element model is constructed for topology optimization. Young's modulus, thickness and loading are considered as uncertain variables, i.e., variables that vary about a nominal value. To compute the reliability constraints, two methods, the reliability index approach (RIA) and the performance measure approach (PMA), are widely used. To find the reliability index efficiently, the limit state function is linearly approximated at each iteration. This approximation is called the first-order reliability method (FORM), which is widely used in reliability-based design optimization (RBDO).
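    For a linear limit state with independent normal variables, the FORM reliability index has a closed form (and FORM is exact in that case). A small sketch with illustrative resistance/load numbers, not values from the paper:

```python
from math import sqrt, erf

def form_linear(mu_r, sig_r, mu_s, sig_s):
    """Reliability index and failure probability for the linear limit
    state g = R - S with independent normal resistance R and load S."""
    beta = (mu_r - mu_s) / sqrt(sig_r**2 + sig_s**2)
    # Standard normal CDF via the error function.
    phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    return beta, phi(-beta)

# Illustrative numbers: mean resistance 400 (sd 40), mean load 250 (sd 30).
beta, pf = form_linear(mu_r=400.0, sig_r=40.0, mu_s=250.0, sig_s=30.0)
```

    For nonlinear limit states, FORM repeats this idea around a linearization point, which is the iteration the abstract refers to.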

  18. D5.3 Reading reliability report

    DEFF Research Database (Denmark)

    Cetin, Bilge Kartal; Galiotto, Carlo; Cetin, Kamil

    2010-01-01

    This deliverable presents a detailed description of the main causes of reading reliability degradation. Two main groups of impairments are recognized: those at the physical layer (e.g., fading, multipath, electromagnetic interference, shadowing due to obstacles, tag orientation misalignment, tag...... bending, metallic environments, etc.) and those at the medium access control sub-layer (e.g., collisions due to tag-to-tag, reader-to-reader and multiple readers-to-tag interference). The review presented in this deliverable covers previous reliability reports and existing definitions of RFID reading...... reliability. Performance metrics and methodologies for assessing reading reliability are further discussed. This document also presents a review of state-of-the-art RFID reading reliability improvement schemes. The solutions are classified into physical- (PHY), medium access control- (MAC), upper-, and cross...

  19. Reliable Electronic Equipment

    Directory of Open Access Journals (Sweden)

    N. A. Nayak

    1960-05-01

    Full Text Available The reliability aspect of electronic equipment is discussed. To obtain optimum results, close cooperation between the components engineer, the design engineer and the production engineer is suggested.

  20. Reliability prediction techniques

    Energy Technology Data Exchange (ETDEWEB)

    Whittaker, B.; Worthington, B.; Lord, J.F.; Pinkard, D.

    1986-01-01

    The paper demonstrates the feasibility of applying reliability assessment techniques to mining equipment. A number of techniques are identified and described, and examples of their use in assessing mining equipment are given. These techniques include reliability prediction, failure analysis, design audit, maintainability, availability and life cycle costing. Specific conclusions regarding the usefulness of each technique are outlined. The choice of technique depends upon both the type of equipment being assessed and its stage of development, with numerical prediction best suited to electronic equipment and fault analysis and design audit suited to mechanical equipment. Reliability assessments involve much detailed and time-consuming work, but it has been demonstrated that the resulting reliability improvements lead to savings in service costs which more than offset the cost of the evaluation.
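    The numerical prediction technique mentioned for electronic equipment is commonly a parts-count calculation: constant failure rates summed over series components, with reliability following from the exponential model. A minimal sketch with hypothetical rates:

```python
from math import exp

def series_reliability(failure_rates_per_hour, mission_hours):
    """Parts-count style prediction: constant failure rates and series
    logic, so the system rate is the sum of the component rates and
    R(t) = exp(-lambda_sys * t)."""
    lam_sys = sum(failure_rates_per_hour)
    return lam_sys, exp(-lam_sys * mission_hours)

# Hypothetical electronic subsystem: three boards with per-hour rates.
lam, r = series_reliability([2e-6, 5e-6, 3e-6], mission_hours=1000)
```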

  1. Lifetime Reliability Assessment of Concrete Slab Bridges

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    A procedure for lifetime assessment of the reliability of short concrete slab bridges is presented in the paper. Corrosion of the reinforcement is the deterioration mechanism used for estimating the reliability profiles for such bridges. The importance of using sensitivity measures is stressed. Fi...

  2. Enhancing Energy Efficient TCP by Partial Reliability

    NARCIS (Netherlands)

    Donckers, L.; Smit, G.J.M.; Smit, L.T.

    2002-01-01

    We present a study of the effects on a mobile system's energy efficiency of enhancing our energy-efficient TCP variant (E²TCP) with partial reliability (see Donckers, L. et al., Proc. 2nd Asian Int. Mobile Computing Conf. - AMOC2002, p.18-28, 2002). Partial reliability is beneficial for mult

  3. Reliability of Arctic offshore installations

    Energy Technology Data Exchange (ETDEWEB)

    Bercha, F.G. [Bercha Group, Calgary, AB (Canada); Gudmestad, O.T. [Stavanger Univ., Stavanger (Norway)]|[Statoil, Stavanger (Norway)]|[Norwegian Univ. of Technology, Stavanger (Norway); Foschi, R. [British Columbia Univ., Vancouver, BC (Canada). Dept. of Civil Engineering; Sliggers, F. [Shell International Exploration and Production, Rijswijk (Netherlands); Nikitina, N. [VNIIG, St. Petersburg (Russian Federation); Nevel, D.

    2006-11-15

    Life threatening and fatal failures of offshore structures can be attributed to a broad range of causes such as fires and explosions, buoyancy losses, and structural overloads. This paper addressed the different severities of failure types, categorized as catastrophic failure, local failure or serviceability failure. Offshore tragedies were also highlighted, namely the failures of P-36, the Ocean Ranger, the Piper Alpha, and the Alexander Kielland, which all resulted in losses of human life. P-36 and the Ocean Ranger both failed ultimately due to a loss of buoyancy. The Piper Alpha was destroyed by a natural gas fire, while the Alexander Kielland failed due to fatigue-induced structural failure. The mode of failure was described as being the specific way in which a failure occurs from a given cause. Current reliability measures in the context of offshore installations consider only a limited number of causes, such as environmental loads. However, it was emphasized that a realistic value of the catastrophic failure probability should consider all credible causes of failure. This paper presented a general method for evaluating all credible causes of failure of an installation. The approach to calculating integrated reliability involves the use of network methods such as fault trees to combine the probabilities of all factors that can cause a catastrophic failure, as well as those which can cause a local failure with the potential to escalate to a catastrophic failure. This paper also proposed a protocol for setting credible reliability targets, such as the consideration of life safety targets and escape, evacuation, and rescue (EER) success probabilities. A set of realistic reliability targets for both catastrophic and local failures for representative safety and consequence categories associated with offshore installations was also presented. The reliability targets were expressed as maximum average annual failure probabilities. The method for converting these annual
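    For independent basic events, the fault-tree combination of causes described above reduces to AND/OR gate algebra. A minimal sketch (the event probabilities are illustrative, not the paper's targets):

```python
def or_gate(probs):
    """P(at least one event occurs) for independent basic events."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """P(all events occur) for independent basic events."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical top event: catastrophic failure caused by a fire, OR by
# a local structural failure AND a failed escalation barrier.
p_top = or_gate([1e-4, and_gate([5e-3, 2e-2])])
```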

  4. The rating reliability calculator

    Directory of Open Access Journals (Sweden)

    Solomon David J

    2004-04-01

    Full Text Available Abstract Background Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program will upload them to the server for calculating the reliability and other statistics describing the ratings. Results When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion This simple web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to provide complete rating data. I would welcome other researchers revising and enhancing the program.
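    The Spearman-Brown prophecy step mentioned above is a one-line formula; a sketch (the 0.40 single-rating reliability is a made-up input, not from the paper):

```python
def spearman_brown(single_rating_reliability, n_ratings):
    """Predicted reliability of the mean of n ratings, given the
    reliability of a single rating (Spearman-Brown prophecy formula)."""
    r = single_rating_reliability
    return n_ratings * r / (1.0 + (n_ratings - 1) * r)

# If one judge's rating has reliability 0.40, averaging four judges'
# ratings is predicted to raise the reliability to about 0.73.
r4 = spearman_brown(0.40, 4)
```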

  5. Reliability of power connections

    Institute of Scientific and Technical Information of China (English)

    BRAUNOVIC Milenko

    2007-01-01

    Despite the use of various preventive maintenance measures, there are still a number of problem areas that can adversely affect system reliability. Moreover, economic constraints have pushed the designs of power connections closer to the limits allowed by the existing standards. The major parameters influencing the reliability and life of Al-Al and Al-Cu connections are identified. The effectiveness of various palliative measures is determined, and misconceptions about their effectiveness are dealt with in detail.

  6. Diagnostic overshadowing and other challenges involved in the diagnostic process of patients with mental illness who present in emergency departments with physical symptoms--a qualitative study.

    Directory of Open Access Journals (Sweden)

    Guy Shefer

    Full Text Available We conducted a qualitative study in the Emergency Departments (EDs) of four hospitals in order to investigate the perceived scope and causes of 'diagnostic overshadowing'--the misattribution of physical symptoms to mental illness--and other challenges involved in the diagnostic process of people with mental illness who present in EDs with physical symptoms. Eighteen doctors and twenty-one nurses working in EDs and psychiatric liaison teams in four general hospitals in the UK were interviewed. Interviewees were asked about cases in which mental illness interfered with diagnosis of physical problems and about other aspects of the diagnostic process. Interviews were transcribed and analysed thematically. Interviewees reported various scenarios in which mental illness or factors related to it led to misdiagnosis or delayed treatment with varying degrees of seriousness. Direct factors which may lead to misattribution in this regard are complex presentations or aspects related to poor communication or challenging behaviour of the patient. Background factors are the crowded nature of the ED environment, time pressures and targets, and stigmatising attitudes held by a minority of staff. The existence of a psychiatric liaison team covering the ED twenty-four hours a day, seven days a week, can help reduce the risk of misdiagnosis of people with mental illness who present with physical symptoms. However, procedures used by emergency and psychiatric liaison staff require fuller operationalization to reduce disagreement over where responsibilities lie.

  7. Sensitivity Analysis of Component Reliability

    Institute of Scientific and Technical Information of China (English)

    Zhenhua Ge

    2004-01-01

    Every component occupies a unique position within a system and has its own failure characteristics, so when a component's reliability is changed, the effect on system reliability is not the same for every component. Component reliability sensitivity is a measure of the effect on system reliability when a component's reliability is changed. In this paper, the definition and relative matrix of component reliability sensitivity are proposed, and some of their characteristics are analyzed. These results help in analyzing and improving system reliability.
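    A standard way to make this sensitivity concrete is the Birnbaum importance measure: the partial derivative of system reliability with respect to a component's reliability, computed as the difference between the system reliability with that component perfect and with it failed. A sketch for a small hypothetical series-parallel system (not the paper's formulation):

```python
def system_reliability(p):
    """Example structure: component 1 in series with the
    parallel pair (2, 3). p maps component id -> reliability."""
    return p[1] * (1.0 - (1.0 - p[2]) * (1.0 - p[3]))

def birnbaum_sensitivity(rel_fn, p, i):
    """Birnbaum measure: dR_sys/dp_i = R(p_i = 1) - R(p_i = 0)."""
    hi = dict(p); hi[i] = 1.0
    lo = dict(p); lo[i] = 0.0
    return rel_fn(hi) - rel_fn(lo)

p = {1: 0.95, 2: 0.8, 3: 0.8}
s1 = birnbaum_sensitivity(system_reliability, p, 1)
s2 = birnbaum_sensitivity(system_reliability, p, 2)
```

    As expected, the series component dominates: improving it moves system reliability far more than improving either redundant component.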

  8. Fatigue reliability for LNG carrier

    Institute of Scientific and Technical Information of China (English)

    Xiao Taoyun; Zhang Qin; Jin Wulei; Xu Shuai

    2011-01-01

    The procedure for reliability-based fatigue analysis of a membrane-type liquefied natural gas (LNG) carrier under wave loads is presented. The stress responses of the hotspots in regular waves with different wave heading angles and wave lengths are evaluated by a global ship finite element method (FEM). Based on the probabilistic distribution function of the hotspots' short-term stress range from spectral-based analysis, the Weibull distribution is adopted and discussed for fitting the long-term probabilistic distribution of stress range. Based on linear cumulative damage theory, fatigue damage is characterized by an S-N relationship, and the limit state function is established. The structural fatigue damage behavior of several typical hotspots of the LNG carrier's midship section is clarified, and reliability analysis is performed. It is believed that the presented results and conclusions can be of use in calibration for practical design and in initial fatigue safety evaluation for membrane-type LNG carriers.
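    The damage-accumulation step (an S-N curve plus the linear cumulative damage rule) can be sketched compactly. The S-N constants and stress blocks below are illustrative, not the classification-society values a real analysis would use:

```python
def sn_cycles_to_failure(stress_range, a=1e12, m=3.0):
    """Cycles to failure from an S-N relation N = a * S**(-m)
    (a and m are illustrative constants)."""
    return a * stress_range ** (-m)

def miner_damage(stress_blocks):
    """Palmgren-Miner linear damage sum over (stress_range, n_cycles)
    blocks; failure is predicted when the sum reaches 1."""
    return sum(n / sn_cycles_to_failure(s) for s, n in stress_blocks)

# Hypothetical long-term loading collapsed into three stress blocks
# (stress range in MPa, number of cycles).
damage = miner_damage([(100, 2e5), (60, 2e6), (30, 1e7)])
```

    In the reliability setting, the limit state function is then typically written as g = Delta - D, with Delta the (uncertain) critical damage and D this accumulated sum.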

  9. Improved reliability of power modules

    DEFF Research Database (Denmark)

    Baker, Nick; Liserre, Marco; Dupont, Laurent

    2014-01-01

    Power electronic systems play an increasingly important role in providing high-efficiency power conversion for adjustable-speed drives, power-quality correction, renewable-energy systems, energy-storage systems, and electric vehicles. However, they are often presented with demanding operating env...... temperature cycling conditions on the system. On the other hand, safety requirements in the aerospace and automotive industries place rigorous demands on reliability....

  10. NDT Reliability - Final Report. Reliability in non-destructive testing (NDT) of the canister components

    Energy Technology Data Exchange (ETDEWEB)

    Pavlovic, Mato; Takahashi, Kazunori; Mueller, Christina; Boehm, Rainer (BAM, Federal Inst. for Materials Research and Testing, Berlin (Germany)); Ronneteg, Ulf (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden))

    2008-12-15

    This report describes the methodology of the reliability investigation performed on the ultrasonic phased array NDT system, developed by SKB in collaboration with Posiva, for inspection of the canisters for permanent storage of spent nuclear fuel. The canister is composed of a cast iron insert surrounded by a copper shell. The shell consists of the tube and the lid/base, which are welded to the tube after the fuel has been placed in the tube. The manufacturing process of the canister parts and the welding process are described. Possible defects, which might arise in the canister components during manufacturing or in the weld during welding, are identified. The number of real defects in manufactured components has been limited; therefore, the reliability of the NDT system has been determined using a number of test objects with artificial defects. The reliability analysis is based on signal response analysis. The conventional signal response analysis is adopted and further developed before being applied to the modern ultrasonic phased-array NDT system. The concept of multi-parameter a, where the response of the NDT system depends on more than one parameter, is introduced. The weakness of using the peak signal response in the analysis is demonstrated, and integration of the amplitudes in the C-scan is proposed as an alternative. The calculation of the volume POD, when the part is inspected with multiple configurations, is also presented. The reliability analysis is supported by ultrasonic simulation based on the point source synthesis method.

  11. Special Process Validation for Improving the Quality and Reliability of Military Products

    Institute of Scientific and Technical Information of China (English)

    Wang Xianchao; Du Xianghui; Li Wei

    2012-01-01

    Traditional quality assurance for the products of special processes typically relies on indirect methods such as sampling inspection and testing of related performance characteristics. The reliability of these test methods and contents lacks scientific experimental verification, which leaves a latent quality risk. Special process validation can close this gap and improve product reliability: the special processes covered by sampling inspection or indirect performance testing are formally validated, and the boundary conditions of the indirect measurements, as well as the correlation between the substitute measured performance and the performance it stands in for, are verified by experiment.

  12. Exponential order statistic models of software reliability growth

    Science.gov (United States)

    Miller, D. R.

    1986-01-01

    Failure times of a software reliability growth process are modeled as order statistics of independent, nonidentically distributed exponential random variables. The Jelinski-Moranda, Goel-Okumoto, Littlewood, Musa-Okumoto Logarithmic, and Power Law models are all special cases of Exponential Order Statistic Models, but the class contains many other models as well. Various characterizations, properties and examples of this class of models are developed and presented.
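    The class is straightforward to simulate: draw one exponential detection time per latent fault, each with its own rate, and sort the draws. A sketch, with a geometrically decreasing rate profile standing in as an example of unequal rates (the rates are illustrative):

```python
import random

def eos_failure_times(rates, seed=0):
    """One realization of an Exponential Order Statistic model: each
    fault i has an exponential detection time with rate rates[i]; the
    observed failure times are the sorted draws."""
    rng = random.Random(seed)
    draws = [rng.expovariate(lam) for lam in rates]
    return sorted(draws)

# Equal rates would give the Jelinski-Moranda special case; here the
# rates decrease geometrically, so later faults are harder to find.
times = eos_failure_times([0.5 * 0.9**i for i in range(20)], seed=42)
```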

  13. Reliability of Fiber Optic LANs

    Science.gov (United States)

    Coden, Michael; Scholl, Frederick; Hatfield, W. Bryan

    1987-02-01

    Fiber optic Local Area Network Systems are being used to interconnect increasing numbers of nodes. These nodes may include office computer peripherals and terminals, PBX switches, process control equipment and sensors, automated machine tools and robots, and military telemetry and communications equipment. The extensive shared base of capital resources in each system requires that the fiber optic LAN meet stringent reliability and maintainability requirements. These requirements are met by proper system design and by suitable manufacturing and quality procedures at all levels of a vertically integrated manufacturing operation. We will describe the reliability and maintainability of Codenoll's passive star based systems. These include LAN systems compatible with Ethernet (IEEE 802.3) and MAP (IEEE 802.4), and software compatible with IBM Token Ring (IEEE 802.5). No single point of failure exists in this system architecture.

  14. Reliable design of electronic equipment an engineering guide

    CERN Document Server

    Natarajan, Dhanasekharan

    2014-01-01

    This book explains reliability techniques with examples from electronics design for the benefit of engineers. It presents the application of de-rating, FMEA, overstress analyses and reliability improvement tests for designing reliable electronic equipment. Adequate information is provided for designing a computerized reliability database system to support the application of the techniques by designers. Pedantic terms and the associated mathematics of the reliability engineering discipline are excluded for the benefit of comprehension and practical application. This book offers excellent support

  15. Reliable High Performance Processing System (RHPPS) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA's exploration, science, and space operations systems are critically dependent on the hardware technologies used in their implementation. Specifically, the...

  16. The Present Situation and Countermeasures for the Treatment of Arsenic Residues

    Institute of Scientific and Technical Information of China (English)

    Xu Jianbing; Shen Qianghua; Chen Wen; Cao Zhonghua

    2017-01-01

    As high-grade ores are gradually depleted, complex arsenic-bearing ores are increasingly being mined and the output of arsenic-containing residues keeps growing. Because arsenic and its compounds are highly toxic, effective and harmless treatment of these residues has become urgent. The hazards of arsenic and the sources of arsenic-containing residues are introduced, and the present situation and outstanding problems of residue treatment are reviewed. Current treatment methods include resource-recovery routes such as copper sulfate cementation, ferric sulfate leaching and alkali leaching, as well as solidification-stabilization routes such as cement solidification and calcium salt stabilization. All of these methods have their respective shortcomings, so synthesizing scorodite to immobilize arsenic is proposed as a countermeasure for the treatment of arsenic-containing residues.

  17. Reliability-Based Optimization and Optimal Reliability Level of Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Tarp-Johansen, N.J.; Sørensen, John Dalsgaard

    2006-01-01

    Different formulations relevant for the reliability-based optimization of offshore wind turbines are presented, including different reconstruction policies in case of failure. Illustrative examples are presented and, as a part of the results, optimal reliability levels for the different failure m...

  18. Reliability-Based Optimization and Optimal Reliability Level of Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Tarp-Johansen, N.J.

    2005-01-01

    Different formulations relevant for the reliability-based optimization of offshore wind turbines are presented, including different reconstruction policies in case of failure. Illustrative examples are presented and, as a part of the results, optimal reliability levels for the different failure...

  19. THE AIRLINE'S RELIABILITY PROGRAM

    OpenAIRE

    Tamargazin, O. A.; National Aviation University; Vlasenko, P. O.; National Aviation University

    2013-01-01

    The airline's operational structure for Reliability program implementation — engineering division, reliability division, reliability control division, aircraft maintenance division, quality assurance division — was considered. The airline's Reliability program structure is shown, and use of the Reliability program to reduce aircraft maintenance costs is proposed.

  20. Mission Reliability Estimation for Repairable Robot Teams

    Directory of Open Access Journals (Sweden)

    Stephen B. Stancliff

    2008-11-01

    Full Text Available Many of the most promising applications for mobile robots require very high reliability. The current generation of mobile robots is, for the most part, highly unreliable. The few mobile robots that currently demonstrate high reliability achieve this reliability at a high financial cost. In order for mobile robots to be more widely used, it will be necessary to find ways to provide high mission reliability at lower cost. Comparing alternative design paradigms in a principled way requires methods for comparing the reliability of different robot and robot team configurations. In this paper, we present the first principled quantitative method for performing mission reliability estimation for mobile robot teams. We also apply this method to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Using conservative estimates of the cost-reliability relationship, our results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares.

  1. Assessing and updating the reliability of concrete bridges subjected to spatial deterioration - principles and software implementation

    DEFF Research Database (Denmark)

    Schneider, Ronald; Fischer, Johannes; Bügler, Maximilian

    2015-01-01

    . The overall system reliability is computed by means of a probabilistic structural model coupled with the deterioration model. Inspection data are included in the system reliability calculation through Bayesian updating on the basis of the DBN model. As proof of concept, a software prototype is developed...... to implement the method presented here. The software prototype is applied to a typical highway bridge and the influence of inspection information on the system deterioration state and the structural reliability is quantified taking into account the spatial correlation of the corrosion process. This work...... is a step towards developing a software tool that can be used by engineering practitioners to perform reliability assessments of ageing concrete bridges and update their reliability with inspection and monitoring data....
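    The Bayesian updating step can be illustrated for a single zone: condition the prior corrosion probability on an inspection outcome characterized by a probability of detection and a false-alarm probability (all numbers below are illustrative, not from the bridge model):

```python
def update_corrosion_prob(prior, pod, pfa, indication):
    """Bayes update of the probability that a zone is corroded, given
    an inspection with probability of detection `pod` and false-alarm
    probability `pfa`."""
    if indication:
        num = pod * prior
        den = pod * prior + pfa * (1.0 - prior)
    else:
        num = (1.0 - pod) * prior
        den = (1.0 - pod) * prior + (1.0 - pfa) * (1.0 - prior)
    return num / den

# A clean inspection result lowers the corrosion probability.
posterior = update_corrosion_prob(prior=0.10, pod=0.8, pfa=0.05,
                                  indication=False)
```

    A DBN-based model generalizes this single-zone update by propagating the evidence through the spatial correlation structure between zones.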

  2. Reliability and Power Quality Evaluation of High-Voltage Supplied Customer

    Science.gov (United States)

    Yoshino, Jun; Kita, Hiroyuki; Tanaka, Eiichi; Hasegawa, Jun; Kubo, Hiroshi; Yonaga, Shigeru

    Recently, a number of electric consumers have concerned about the reliability of electricity to be served. For example, some consumers need the electricity with a higher reliability by the automation of manufacturing processes. On the other hand, some consumers need the electricity of a cheaper price even if the reliability becomes a little worse. Under such circumstances, it is necessary that power suppliers evaluate the needs of every consumers precisely and propose the most desirable measures for meeting their requirements. This paper develops a tool to analyze the reliability for high-voltage supplied consumers quantitatively. Further, this paper presents a method for evaluating the outage cost of consumers to help them choose the most appropriate measures for maintaining the reliability. The proposed method applies the fuzzy reasoning approach. The validity of the proposed method is ascertained through some numerical simulations.

  3. Reliability estimation for multiunit nuclear and fossil-fired industrial energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, W. G.; Wilson, J. V.; Klepper, O. H.

    1977-06-29

    As petroleum-based fuels grow increasingly scarce and costly, nuclear energy may become an important alternative source of industrial energy. Initial applications would most likely include a mix of fossil-fired and nuclear sources of process energy. A means for determining the overall reliability of these mixed systems is a fundamental aspect of demonstrating their feasibility to potential industrial users. Reliability data from nuclear and fossil-fired plants are presented, and several methods of applying these data for calculating the reliability of reasonably complex industrial energy supply systems are given. Reliability estimates made under a number of simplifying assumptions indicate that multiple nuclear units or a combination of nuclear and fossil-fired plants could provide adequate reliability to meet industrial requirements for continuity of service.

  4. RELAV - RELIABILITY/AVAILABILITY ANALYSIS PROGRAM

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    RELAV (Reliability/Availability Analysis Program) is a comprehensive analytical tool to determine the reliability or availability of any general system which can be modeled as embedded k-out-of-n groups of items (components) and/or subgroups. Both ground and flight systems at NASA's Jet Propulsion Laboratory have utilized this program. RELAV can assess current system performance during the later testing phases of a system design, as well as model candidate designs/architectures or validate and form predictions during the early phases of a design. Systems are commonly modeled as System Block Diagrams (SBDs). RELAV calculates the success probability of each group of items and/or subgroups within the system assuming k-out-of-n operating rules apply for each group. The program operates on a folding basis; i.e. it works its way towards the system level from the most embedded level by folding related groups into single components. The entire folding process involves probabilities; therefore, availability problems are performed in terms of the probability of success, and reliability problems are performed for specific mission lengths. An enhanced cumulative binomial algorithm is used for groups where all probabilities are equal, while a fast algorithm based upon "Computing k-out-of-n System Reliability", Barlow & Heidtmann, IEEE TRANSACTIONS ON RELIABILITY, October 1984, is used for groups with unequal probabilities. Inputs to the program include a description of the system and any one of the following: 1) availabilities of the items, 2) mean time between failures and mean time to repairs for the items from which availabilities are calculated, 3) mean time between failures and mission length(s) from which reliabilities are calculated, or 4) failure rates and mission length(s) from which reliabilities are calculated. The results are probabilities of success of each group and the system in the given configuration. RELAV assumes exponential failure distributions for
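
    The two group-evaluation routines that the abstract names can be sketched briefly: a cumulative binomial for groups of identical items, and a dynamic program over components in the spirit of the Barlow & Heidtmann (1984) algorithm for unequal probabilities. This is a minimal illustration, not RELAV's actual code.

```python
from math import comb

def k_out_of_n_equal(k, n, p):
    """Cumulative binomial: P(at least k of n i.i.d. components work)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def k_out_of_n_unequal(k, probs):
    """Dynamic program over components: dist[j] = P(exactly j working).
    Each component folds into the count distribution in O(n) per step."""
    dist = [1.0]
    for p in probs:
        new = [0.0] * (len(dist) + 1)
        for j, d in enumerate(dist):
            new[j] += d * (1 - p)   # this component failed
            new[j + 1] += d * p     # this component works
        dist = new
    return sum(dist[k:])

# A 2-out-of-3 group with per-item success probability 0.9
r = k_out_of_n_equal(2, 3, 0.9)  # 0.972
```

    With equal probabilities the two routines agree, which is a convenient sanity check when folding groups toward the system level.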

  5. Reliability, availability, and maintainability: a key issue for ELTs

    Science.gov (United States)

    Ansorge, Wolfgang R.

    2006-06-01

    Hundreds of mirror segments, thousands of high-precision actuators, highly complex mechanical, hydraulic, electrical and other technology subsystems, and extremely sophisticated control systems: an ELT consists of millions of individual parts and components, each of which may fail and lead to a partial or complete system breakdown. Component and system reliability does not only influence the acquisition cost of a product; it also defines the necessary maintenance work and the required logistic support. Given the long operational lifetime of an ELT, reliability and maintainability are among the main contributors to the system's life cycle costs. Reliability and maintainability are key characteristics of any product and have to be designed into it from the very beginning; they can neither be tested into a product nor introduced by numerous maintenance activities. This presentation explains the interconnections between Reliability, Availability, Maintainability and Safety (RAMS) and outlines the necessary RAMS and Reliability Centered Maintenance (RCM) processes and activities during the entire life cycle of an ELT and an ELT instrument, from initial planning to the eventual disposal phase.
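
    The link between component MTBF/MTTR and system availability that drives these life-cycle-cost arguments can be illustrated with the standard steady-state formulas; the component counts and failure figures below are purely hypothetical.

```python
def availability(mtbf, mttr):
    """Steady-state availability of a repairable item."""
    return mtbf / (mtbf + mttr)

def series_availability(items):
    """Availability of a series system: every item must be up.
    items is a list of (mtbf, mttr) pairs in consistent units."""
    a = 1.0
    for mtbf, mttr in items:
        a *= availability(mtbf, mttr)
    return a

# Hypothetical: 500 identical actuators, MTBF 1e5 h, MTTR 8 h each.
# Even with ~0.99992 availability per item, the series system drops
# noticeably below that -- the point of designing reliability in early.
a_sys = series_availability([(1e5, 8.0)] * 500)
```

    This series model is deliberately pessimistic (no redundancy); it shows why segment-level redundancy and maintainability matter at ELT scales.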

  6. Long-term reliability of the visual EEG Poffenberger paradigm.

    Science.gov (United States)

    Friedrich, Patrick; Ocklenburg, Sebastian; Mochalski, Lisa; Schlüter, Caroline; Güntürkün, Onur; Genc, Erhan

    2017-07-14

    The Poffenberger paradigm is a simple perception task that is used to estimate the speed of information transfer between the two hemispheres, the so-called interhemispheric transfer time (IHTT). Although the original paradigm is a behavioral task, it can be combined with electroencephalography (EEG) to assess the underlying neurophysiological processes during task execution. While older studies have supported the validity of both versions of the paradigm for investigating interhemispheric interactions, their long-term reliability has not been assessed systematically before. The present study aims to fill this gap by determining both the internal consistency and the long-term test-retest reliability of IHTTs obtained with the two different versions of the Poffenberger paradigm in a sample of 26 healthy subjects. The results show high reliability for the EEG Poffenberger paradigm. In contrast, reliability measures for the behavioral Poffenberger paradigm were low. Hence, our results indicate that electrophysiological measures of interhemispheric transfer are more reliable than behavioral measures; the latter should be used with caution in research investigating inter-individual differences in neurocognitive measures. Copyright © 2017 Elsevier B.V. All rights reserved.
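
    A simple (if coarse) test-retest reliability estimate is the Pearson correlation between the two sessions' IHTT values; the study itself may well use a different coefficient, and the subject data below are invented for illustration.

```python
def pearson_r(x, y):
    """Pearson correlation between two measurement sessions,
    a common first-pass test-retest reliability estimate."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical IHTT estimates (ms) for 5 subjects in two sessions
session1 = [11.2, 9.8, 12.5, 10.1, 13.0]
session2 = [11.0, 10.2, 12.1, 9.9, 13.4]
r = pearson_r(session1, session2)
```

    High r means subjects keep their relative ordering across sessions, which is exactly what inter-individual-differences research requires of a measure.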

  7. Cost-effective solutions to maintaining smart grid reliability

    Science.gov (United States)

    Qin, Qiu

    As aging power systems increasingly operate close to their capacity and thermal limits, maintaining sufficient reliability has been of great concern to government agencies, utility companies and users. This dissertation focuses on improving the reliability of transmission and distribution systems. Based on wide-area measurements, multiple model algorithms are developed to diagnose transmission-line three-phase short-to-ground faults in the presence of protection misoperations. The multiple model algorithms utilize the electric network dynamics to provide prompt and reliable diagnosis outcomes. The computational complexity of the diagnosis algorithm is reduced by using a two-step heuristic. The multiple model algorithm is incorporated into a hybrid simulation framework, which consists of both continuous-state simulation and discrete-event simulation, to study the operation of transmission systems. With hybrid simulation, a line-switching strategy for enhancing tolerance to protection misoperations is studied based on the concept of a security index, which involves the faulted mode probability and stability coverage. Local measurements are used to track the generator state, and faulty mode probabilities are calculated in the multiple model algorithms. FACTS devices are considered as controllers for the transmission system. The placement of FACTS devices into power systems is investigated with the criterion of maintaining a prescribed level of control reconfigurability. Control reconfigurability measures the small-signal combined controllability and observability of a power system with an additional requirement on fault tolerance. For the distribution systems, a hierarchical framework is presented, including a high-level recloser allocation scheme and a low-level recloser placement scheme. The impacts of recloser placement on the reliability indices are analyzed. Evaluation of reliability indices in the placement process is carried out via discrete event

  8. Quality and reliability management and its applications

    CERN Document Server

    2016-01-01

    Integrating development processes, policies, and reliability predictions from the beginning of the product development lifecycle to ensure high levels of product performance and safety, this book helps companies overcome the challenges posed by increasingly complex systems in today's competitive marketplace. Examining both research on and practical aspects of product quality and reliability management, with an emphasis on applications, the book features contributions written by active researchers and experienced practitioners in the field, so as to bridge the gap between theory and practice and address new research challenges in reliability and quality management. Postgraduates, researchers and practitioners in the areas of reliability engineering and management, amongst others, will find the book to offer a state-of-the-art survey of quality and reliability management and practices.

  9. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic, because computer accuracy is limited. Inaccuracy can arise in different ways. For example, an error may be made when subtracting two numbers that are very close to each other, or when summing many numbers of very different magnitude, etc. The basic objective of this paper is to find a procedure which eliminates errors made by a PC when calculations close to an error limit are executed. The highly reliable system is represented by a directed acyclic graph composed of terminal nodes, i.e. highly reliable input elements, internal nodes representing subsystems, and edges that bind all of these nodes. Three admissible unavailability models of terminal nodes are introduced, including both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes is implemented in MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure applied to a graph structure, which considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires the summation of many very different non-negative numbers, which may be a source of inaccuracy. That is why another algorithm, for the exact summation of such numbers, is designed in the paper. The summation procedure uses a special number system with base 2^32. The computational efficiency of the new methodology is compared with advanced simulation software, and various calculations on systems from the references are performed to emphasize its merits.
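
    The summation hazard the paper addresses is easy to demonstrate. Python's `math.fsum` attacks the same problem as the paper's base-2^32 scheme, though by a different route (Shewchuk's exact partial-sums algorithm): naive left-to-right addition silently absorbs a small term, while `fsum` returns the correctly rounded result.

```python
import math

# Numbers of very different magnitude: the 1.0 is lost in naive
# floating-point summation because the ulp at 1e16 is 2.0.
terms = [1e16, 1.0, -1e16]

naive = sum(terms)        # 1e16 + 1.0 rounds back to 1e16, so -> 0.0
exact = math.fsum(terms)  # tracks exact partial sums, so -> 1.0
```

    For the strictly non-negative summands of the combinatorial procedure the failure mode is loss of small contributions rather than cancellation, but the cure is the same: carry the partial sums exactly.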

  10. The Reliability Analyses and Implementation of Polysilicon Resistors in EHV BCD Process

    Institute of Scientific and Technical Information of China (English)

    包飞军; 曹刚; 葛艳辉; 石艳玲; 陈滔

    2015-01-01

    Temperature and current have great effects on polysilicon resistors, so the reliability of polysilicon resistors used in EHV BCD processes must be specially analyzed. Based on tests and analysis of polysilicon resistors with different doping concentrations in a 0.18 μm 700 V BCD process, combined with the theory of Joule heating, electromigration and polysilicon conduction mechanisms, the effects of Joule heating and electromigration on polysilicon resistors have been analyzed. Methods are then proposed for implementing polysilicon resistors with high reliability.

  11. Method for combination of actions based on stochastic processes in analysis of structural reliability

    Institute of Scientific and Technical Information of China (English)

    姚继涛; 解耀魁

    2013-01-01

    The analysis of structural reliability should be based on probabilistic models that directly describe the variation of actions, and on a method for combining actions based on stochastic processes. However, the current method for combining variable actions represented by their frequent or quasi-permanent values is in fact a random-variable method expressed through maximum values, and thus leads to unreasonable results. Based on the stochastic model of a variable action as an equal-time-interval stationary binomial rectangular wave process, the interval samples corresponding to the exceedance rates of the frequent and quasi-permanent values were determined, and the new concepts of frequent sequence values and quasi-permanent sequence values of variable actions were proposed. On this basis, a method for combining actions founded on stochastic processes and expressed through sequence values, together with the corresponding reliability-analysis models, was established. The proposed method is grounded in stochastic-process theory, with clear and reasonable engineering concepts; it accurately reflects the ideas of action combination and yields reasonable reliability-analysis results, essentially overcoming the disadvantages of the current random-variable-based method for combining actions.

  12. System Reliability Analysis: Foundations.

    Science.gov (United States)

    1982-07-01

    Performance formulas for systems subject to preventive maintenance are given. SYSTEM RELIABILITY ANALYSIS: FOUNDATIONS, Richard E... For a network whose edges work independently with probability p, system reliability is the probability h(p) that the source s can communicate with the terminal t [the polynomial expression for h(p) is garbled in this extract]. For undirected networks, the basic reference is A. Satyanarayana and Kevin Wood (1982). For directed networks, the basic reference is Avinash
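
    The two-terminal network reliability h(p) discussed above can be computed for small networks by brute-force enumeration of edge states; this illustrative sketch is exponential in the number of edges and is not one of the report's algorithms.

```python
from itertools import product

def two_terminal_reliability(edges, s, t, p):
    """P(s can reach t) when each undirected edge in `edges` works
    independently with probability p, by exhaustive state enumeration
    (fine only for small illustrative networks)."""
    def connected(up_edges):
        reach, frontier = {s}, [s]
        while frontier:
            u = frontier.pop()
            for a, b in up_edges:
                for x, y in ((a, b), (b, a)):
                    if x == u and y not in reach:
                        reach.add(y)
                        frontier.append(y)
        return t in reach

    rel = 0.0
    for states in product([0, 1], repeat=len(edges)):
        prob = 1.0
        up = []
        for e, st in zip(edges, states):
            prob *= p if st else (1 - p)
            if st:
                up.append(e)
        if connected(up):
            rel += prob
    return rel

# Two parallel s-t links: h(p) = 1 - (1 - p)^2
h = two_terminal_reliability([("s", "t"), ("s", "t")], "s", "t", 0.9)
```

    The factoring and domination methods in the cited references avoid this exponential enumeration, which is why they are the standard starting points.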

  13. Pseudomonas aeruginosa Cif Protein Enhances the Ubiquitination and Proteasomal Degradation of the Transporter Associated with Antigen Processing (TAP) and Reduces Major Histocompatibility Complex (MHC) Class I Antigen Presentation*

    Science.gov (United States)

    Bomberger, Jennifer M.; Ely, Kenneth H.; Bangia, Naveen; Ye, Siying; Green, Kathy A.; Green, William R.; Enelow, Richard I.; Stanton, Bruce A.

    2014-01-01

    Cif (PA2934), a bacterial virulence factor secreted in outer membrane vesicles by Pseudomonas aeruginosa, increases the ubiquitination and lysosomal degradation of some, but not all, plasma membrane ATP-binding cassette transporters (ABC), including the cystic fibrosis transmembrane conductance regulator and P-glycoprotein. The goal of this study was to determine whether Cif enhances the ubiquitination and degradation of the transporter associated with antigen processing (TAP1 and TAP2), members of the ABC transporter family that play an essential role in antigen presentation and intracellular pathogen clearance. Cif selectively increased the amount of ubiquitinated TAP1 and increased its degradation in the proteasome of human airway epithelial cells. This effect of Cif was mediated by reducing USP10 deubiquitinating activity, resulting in increased polyubiquitination and proteasomal degradation of TAP1. The reduction in TAP1 abundance decreased peptide antigen translocation into the endoplasmic reticulum, an effect that resulted in reduced antigen available to MHC class I molecules for presentation at the plasma membrane of airway epithelial cells and recognition by CD8+ T cells. Cif is the first bacterial factor identified that inhibits TAP function and MHC class I antigen presentation. PMID:24247241

  14. Pseudomonas aeruginosa Cif protein enhances the ubiquitination and proteasomal degradation of the transporter associated with antigen processing (TAP) and reduces major histocompatibility complex (MHC) class I antigen presentation.

    Science.gov (United States)

    Bomberger, Jennifer M; Ely, Kenneth H; Bangia, Naveen; Ye, Siying; Green, Kathy A; Green, William R; Enelow, Richard I; Stanton, Bruce A

    2014-01-03

    Cif (PA2934), a bacterial virulence factor secreted in outer membrane vesicles by Pseudomonas aeruginosa, increases the ubiquitination and lysosomal degradation of some, but not all, plasma membrane ATP-binding cassette transporters (ABC), including the cystic fibrosis transmembrane conductance regulator and P-glycoprotein. The goal of this study was to determine whether Cif enhances the ubiquitination and degradation of the transporter associated with antigen processing (TAP1 and TAP2), members of the ABC transporter family that play an essential role in antigen presentation and intracellular pathogen clearance. Cif selectively increased the amount of ubiquitinated TAP1 and increased its degradation in the proteasome of human airway epithelial cells. This effect of Cif was mediated by reducing USP10 deubiquitinating activity, resulting in increased polyubiquitination and proteasomal degradation of TAP1. The reduction in TAP1 abundance decreased peptide antigen translocation into the endoplasmic reticulum, an effect that resulted in reduced antigen available to MHC class I molecules for presentation at the plasma membrane of airway epithelial cells and recognition by CD8(+) T cells. Cif is the first bacterial factor identified that inhibits TAP function and MHC class I antigen presentation.

  15. A Review: Passive System Reliability Analysis – Accomplishments and Unresolved Issues

    Directory of Open Access Journals (Sweden)

    ARUN KUMAR NAYAK

    2014-10-01

    Full Text Available Reliability assessment of passive safety systems is an important issue, since the safety of advanced nuclear reactors relies on several passive features. In this context, a few methodologies, such as Reliability Evaluation of Passive Safety System (REPAS), Reliability Methods for Passive Safety Functions (RMPS) and Analysis of Passive Systems ReliAbility (APSRA), have been developed in the past and used to assess the reliability of various passive safety systems. While these methodologies have certain features in common, they differ in their treatment of certain issues; for example, the handling of model uncertainties and of deviations of geometric and process parameters from their nominal values. This paper presents the state of the art in passive system reliability assessment methodologies, the accomplishments and the remaining issues. In this review, three critical issues pertaining to passive system performance and reliability have been identified. The first is the applicability of best-estimate codes and model uncertainty: phenomenological simulations of natural-convection passive systems based on best-estimate codes can carry significant uncertainties, and these must be incorporated in an appropriate manner into the performance and reliability analysis of such systems. The second issue is the treatment of the dynamic failure characteristics of components of passive systems. REPAS, RMPS and APSRA do not consider dynamic failures of components or processes, which may strongly influence the failure of passive systems. The influence of dynamic component failure characteristics on system failure probability is presented with the help of a dynamic reliability methodology based on Monte Carlo simulation. The analysis of a benchmark hold-up tank problem shows the error in failure probability estimation introduced by neglecting the dynamics of components. It is thus suggested that dynamic reliability
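
    A crude Monte Carlo failure-probability estimate of the kind underlying such dynamic reliability methodologies can be sketched in a few lines. The limit-state margin and distribution parameters below are invented toy numbers, not those of the hold-up tank benchmark.

```python
import random

def mc_failure_probability(limit_state, sample, n=50_000, seed=1):
    """Crude Monte Carlo: draw n random parameter sets and count
    the fraction for which the limit-state margin goes negative."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if limit_state(sample(rng)) < 0.0)
    return fails / n

# Toy margin: capacity minus demand, both normally distributed
# (illustrative numbers only)
sampler = lambda rng: (rng.gauss(10.0, 1.0), rng.gauss(7.0, 1.0))
margin = lambda cd: cd[0] - cd[1]
pf = mc_failure_probability(margin, sampler)
```

    For this toy case the exact answer is Phi(-3/sqrt(2)) ~ 0.017, so the sampled estimate can be checked against it; real passive-system analyses replace the one-line margin with a full best-estimate code run per sample.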

  16. Reliability assessment of numerical control machine tools with whole lifecycle based on bounded bathtub intensity process

    Institute of Scientific and Technical Information of China (English)

    任丽娜; 任帅; 王先芝; 刘宇

    2014-01-01

    This paper presents a reliability assessment method for numerical control (NC) machine tools based on a bounded bathtub intensity process (BBIP) model. The method can assess the whole-lifecycle reliability of NC machine tools under minimal repair. A bounded bathtub intensity function of failure times was built, and a formula for calculating the duration of the early failure period was derived. Maximum likelihood estimates (MLE) of the model parameters and reliability indices are given. The failure process of a single NC machine tool with failure truncation was analyzed. The results show that the early failure period of the NC machine tool lasts about 5 months, which is basically consistent with the actual situation. This demonstrates that the BBIP model can accurately evaluate the duration of the early failure period and can provide a theoretical basis for improving the reliability of NC machine tools.
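
    The abstract does not reproduce the BBIP intensity function itself, but the idea of locating the end of the early failure period can be illustrated with a generic bathtub intensity: a decaying early-failure term plus a constant plus slow wear-out, with the early period taken to end where the intensity stops decreasing. All parameter values are invented.

```python
import math

def bathtub_intensity(t, a=2.0, b=1.5, c=0.05):
    """Illustrative bathtub-shaped failure intensity (not the BBIP
    model): decaying early failures + constant + linear wear-out."""
    return a * math.exp(-b * t) + 0.2 + c * t

def early_failure_end(intensity, t_max=24.0, step=1e-3):
    """First time (in months here) at which the intensity stops
    decreasing, taken as the end of the early failure period."""
    t = step
    while t < t_max:
        if intensity(t + step) >= intensity(t):
            return t
        t += step
    return t_max

t_end = early_failure_end(bathtub_intensity)
```

    For these toy parameters the minimum falls at t = ln(a*b/c)/b, so the numeric search can be checked against the closed form; the paper instead derives its duration formula from the fitted BBIP parameters.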

  17. Reliability of Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon

    There are many different working principles for wave energy converters (WECs), which are used to produce electricity from waves. For WECs to become successful and more competitive with other renewable electricity sources, consideration of the structural reliability of WECs is essential. [...] Structural reliability considerations and optimizations impact operation and maintenance (O&M) costs as well as the initial investment costs. Furthermore, there is a control system for WEC applications which defines the harvested energy but also the loads onto the structure. Therefore, extreme loads but also [...] of the Wavestar foundation is presented. The work performed in this thesis focuses on the Wavestar and WEPTOS WEC devices, which are only two working principles out of a large diversity. Therefore, in order to draw general conclusions and give advice for standards for structural WEC designs, more working principles [...]
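
    The structural reliability calculations behind such a thesis often start from a linear limit state g = R - S (resistance minus load). For independent normal R and S this has the classic closed form via the reliability index beta; the resistance and load parameters below are hypothetical, not Wavestar values.

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def failure_probability(mu_r, sigma_r, mu_s, sigma_s):
    """Failure probability and reliability index for the linear limit
    state g = R - S with independent normal resistance R and load S."""
    beta = (mu_r - mu_s) / sqrt(sigma_r**2 + sigma_s**2)
    return normal_cdf(-beta), beta

# Hypothetical foundation margin: resistance vs. extreme wave load
pf, beta = failure_probability(mu_r=12.0, sigma_r=1.5, mu_s=6.0, sigma_s=2.0)
```

    Real WEC assessments use non-normal load distributions and FORM/SORM or simulation, but beta remains the common currency for comparing designs against target reliability levels.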

  18. RELIABILITY ANALYSIS OF POWER DISTRIBUTION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Popescu V.S.

    2012-04-01

    Full Text Available Power distribution systems are basic parts of power systems, and their reliability is at present a key issue for power engineering development that requires special attention. Operation of distribution systems is accompanied by a number of random factors that produce a large number of unplanned interruptions. Research has shown that the predominant factors with a significant influence on the reliability of distribution systems are: weather conditions (39.7%), defects in equipment (25%) and unknown random factors (20.1%). The article studies the influence of this random behavior and presents reliability estimates for predominantly rural electrical distribution systems.
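
    Distribution reliability of this kind is conventionally summarized with the IEEE 1366 indices SAIFI and SAIDI, which can be computed directly from a year's interruption log; the outage events below are invented for illustration.

```python
def saifi(interruptions, customers_served):
    """System Average Interruption Frequency Index: total customer
    interruptions divided by total customers served."""
    return sum(n for n, _ in interruptions) / customers_served

def saidi(interruptions, customers_served):
    """System Average Interruption Duration Index: total
    customer-minutes of interruption per customer served."""
    return sum(n * minutes for n, minutes in interruptions) / customers_served

# Hypothetical year of feeder outages: (customers affected, minutes)
events = [(1200, 90), (300, 45), (2500, 30)]
f = saifi(events, 10_000)  # interruptions per customer per year
d = saidi(events, 10_000)  # minutes of outage per customer per year
```

    Splitting the same log by cause category (weather, equipment, unknown) is how contribution percentages like those quoted above are obtained.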

  19. Present time

    CERN Document Server

    Romero, Gustavo E

    2014-01-01

    The idea of a moving present or `now' seems to form part of our most basic beliefs about reality. Such a present, however, is not reflected in any of our theories of the physical world. I show in this article that presentism, the doctrine that only what is present exists, is in conflict with modern relativistic cosmology and recent advances in neurosciences. I argue for a tenseless view of time, where what we call `the present' is just an emergent secondary quality arising from the interaction of perceiving self-conscious individuals with their environment. I maintain that there is no flow of time, but just an ordered system of events.

  20. Expert system aids reliability

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, A.T. [Tennessee Gas Pipeline, Houston, TX (United States)

    1997-09-01

    Quality and reliability are key requirements in the energy transmission industry. Tennessee Gas Co., a division of El Paso Energy, has applied Gensym's G2 object-oriented expert system programming language as a standard tool for maintaining and improving quality and reliability in pipeline operation. Tennessee created a small team of gas controllers and engineers to develop a Proactive Controller's Assistant (ProCA) that provides recommendations for operating the pipeline more efficiently, reliably and safely. The controllers' pipeline operating knowledge is recreated in G2 in the form of rules and procedures in ProCA. Two G2 programmers supporting the Gas Control Room add information to the ProCA knowledge base daily. The result is a dynamic, constantly improving system that supports not only the pipeline controllers in their operations but also the measurement and communications departments' requests for special studies. The Proactive Controller's Assistant development focuses on the following areas: Alarm Management; Pipeline Efficiency; Reliability; Fuel Efficiency; and Controller Development.
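
    The rules-and-procedures style of knowledge capture described above can be sketched with a minimal forward-chaining loop. This is nothing like G2's full language, and every rule, fact name and threshold below is invented for illustration.

```python
def evaluate_rules(facts, rules):
    """Fire every rule whose condition holds on the current facts;
    collect the fired rules' recommendations in order."""
    return [advice for condition, advice in rules if condition(facts)]

# Hypothetical operating rules: (condition on facts, recommendation)
rules = [
    (lambda f: f["suction_psi"] < 550, "raise suction pressure setpoint"),
    (lambda f: f["fuel_rate"] > f["fuel_budget"], "rebalance compressor load"),
    (lambda f: f["alarm_count"] > 10, "review alarm flood for root cause"),
]

advice = evaluate_rules(
    {"suction_psi": 530, "fuel_rate": 95.0,
     "fuel_budget": 100.0, "alarm_count": 3},
    rules,
)
```

    A production system like ProCA adds what this sketch omits: continuous data feeds, rule chaining, and daily knowledge-base maintenance by the supporting programmers.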